National Robotics Initiative grant will provide surgical robots with a new level of machine intelligence

Nabil Simaan testing a surgical robot that he designed. (Joe Howell / Vanderbilt)

Providing surgical robots with a new kind of machine intelligence that significantly extends their capabilities and makes them much easier and more intuitive for surgeons to operate is the goal of a major new grant announced as part of the National Robotics Initiative.

The five-year, $3.6 million project, titled Complementary Situational Awareness for Human-Robot Partnerships, is a close collaboration among research teams directed by Nabil Simaan, associate professor of mechanical engineering at Vanderbilt University; Howie Choset, professor of robotics at Carnegie Mellon University; and Russell Taylor, the John C. Malone Professor of Computer Science at Johns Hopkins University.

"Our goal is to establish a new concept called complementary situational awareness," said Simaan. "Complementary situational awareness refers to the robot's ability to gather sensory information as it works and to use this information to guide its actions."

"I am delighted to be working with Nabil Simaan on a medical robotics project," Choset said. "I believe him to be a thought leader in the field." Taylor added, "This project advances our shared vision of human surgeons, computers and robots working together to make surgery safer, less invasive and more effective."

One of the project's objectives is to restore the type of awareness surgeons have during open surgery, where they can directly see and touch internal organs and tissue, which they have lost with the advent of minimally invasive surgery because they must work through small incisions in a patient's skin. Minimally invasive surgery has become increasingly common because patients experience less pain, blood loss and trauma, recover more quickly and get fewer infections, and because it is less expensive than open surgery.

Surgeons have attempted to compensate for the loss of direct sensory feedback through pre-operative imaging, where they use techniques like MRI, X-ray imaging and ultrasound to map the internal structure of the body before they operate. They have employed miniaturized lights and cameras to provide them with visual images of the tissue immediately in front of surgical probes. They have also developed methods that track the position of the probe as they operate and plot its position on pre-operative maps.
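As a rough illustration of the last step, a tracked probe position can be plotted on a pre-operative map by applying a rigid registration transform (a rotation plus a translation). The transform values below are illustrative assumptions, not data from this project:

```python
def to_image_coords(p, R, t):
    """Map a tracker-space point p into image space: R @ p + t."""
    return tuple(sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3))

# Identity rotation plus a 5 mm offset along y: a probe tip at (10, 0, 0)
# in tracker coordinates lands at (10, 5, 0) on the pre-operative map.
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = (0, 5, 0)
probe_on_map = to_image_coords((10, 0, 0), R, t)
```

In practice the rotation and translation come from a registration procedure that aligns landmarks visible both on the patient and in the pre-operative images.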

Carnegie Mellon's Howie Choset (Courtesy of Carnegie Mellon University)

Simaan, Choset and Taylor intend to take these efforts to the next level. They plan to create a system that acquires data from a number of different types of sensors as an operation is underway and integrates them with pre-operative information to produce dynamic, real-time maps that precisely track the position of the robot probe and show how the tissue in its vicinity responds to its movements.

For example, adding pressure sensors to robot probes will provide real-time information on how much force the probe is exerting against the tissue surrounding it. Not only does this make it easier to work without injuring the tissue, but it can also be used to "palpate" tissue to search for hidden tumor edges, arteries and aneurysms. Such sensor data can also feed into computer simulations that predict how various body parts shift in response to the probe's movements.
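As a minimal sketch of how palpation can reveal hidden structure (not the project's actual algorithm), a palpation step might fit a simple linear spring model to force/displacement pairs; a stiff inclusion such as a tumor shows up as a higher fitted stiffness than the surrounding soft tissue:

```python
def estimate_stiffness(displacements, forces):
    """Least-squares fit of a linear spring model f = k * x (slope through origin)."""
    num = sum(x * f for x, f in zip(displacements, forces))
    den = sum(x * x for x in displacements)
    return num / den

# Illustrative numbers: displacements in meters, forces in newtons.
soft = estimate_stiffness([0.001, 0.002, 0.003], [0.05, 0.11, 0.14])
firm = estimate_stiffness([0.001, 0.002, 0.003], [0.30, 0.61, 0.92])
# The "firm" site fits a much larger k, flagging a possible hidden feature.
```

Real tissue is nonlinear and viscoelastic, so an actual system would use far richer models; the point is only that force sensing plus displacement yields a mechanical property the surgeon cannot otherwise feel.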

To acquire sensory data during surgery, the Vanderbilt team led by Simaan will develop methods that allow snake-like surgical robots to explore the shapes and stiffness variations of internal organs and tissues. The team will generate models that estimate the locations of hidden anatomical features such as arteries and tumors and provide them to the Johns Hopkins and Carnegie Mellon teams, which will create adaptive telemanipulation techniques that assist surgeons in carrying out various surgical procedures.

To create these dynamic, three-dimensional maps, the Carnegie Mellon team led by Choset will employ a technique called simultaneous localization and mapping (SLAM) that allows mobile robots to navigate in unexplored areas. This class of algorithms was developed for navigating through rigid environments, such as buildings, landforms and streets, so the researchers must extend the technique so it will work in the flexible environment of the body. These maps will form the foundation of the Complementary Situational Awareness (CSA) framework.
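A much-simplified stand-in for such a map (purely illustrative; the grid layout and update rule are assumptions, not the project's design) is a grid of cells that accumulates point measurements as the probe explores, refining its estimate wherever the probe touches repeatedly:

```python
class StiffnessMap:
    """Toy grid map built up from point measurements taken during exploration."""

    def __init__(self, nx, ny):
        self.total = [[0.0] * ny for _ in range(nx)]
        self.count = [[0] * ny for _ in range(nx)]

    def add_measurement(self, i, j, stiffness):
        # Fold a new point measurement into cell (i, j).
        self.total[i][j] += stiffness
        self.count[i][j] += 1

    def estimate(self, i, j):
        # Running mean for the cell, or None where nothing has been sensed yet.
        c = self.count[i][j]
        return self.total[i][j] / c if c else None

# As the probe explores, repeated touches refine the same cell.
grid = StiffnessMap(8, 8)
grid.add_measurement(2, 3, 40.0)
grid.add_measurement(2, 3, 60.0)
```

A real SLAM system would also estimate the probe's own pose from the same data and handle tissue that deforms between visits, which is precisely the extension the researchers describe.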

Once they can create these maps, the collaborators intend to use them to begin semi-automating various surgical sub-tasks, such as tying off a suture, resecting a tumor or ablating tissue. For example, the resection sub-task would allow a surgeon to instruct the robot to resect tissue from point "a" to "b" to "c" to "d" to a depth of five millimeters, and the robot would then cut out the tissue specified.
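One hypothetical way such an instruction could be turned into a dense tool path (the waypoint format, spacing and depth handling are all illustrative assumptions, not the project's interface) is to interpolate between the surgeon's marked points at the commanded cutting depth:

```python
import math

def resection_path(waypoints, depth_mm, step_mm=1.0):
    """Linearly interpolate between 2-D waypoints at a fixed cutting depth."""
    path = []
    for (x0, y0), (x1, y1) in zip(waypoints, waypoints[1:]):
        dist = math.hypot(x1 - x0, y1 - y0)
        steps = max(1, int(dist // step_mm))
        for s in range(steps):
            t = s / steps
            path.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0), -depth_mm))
    path.append((*waypoints[-1], -depth_mm))  # finish exactly at the last point
    return path

# "Resect from (0, 0) to (10, 0) at a depth of 5 mm."
p = resection_path([(0, 0), (10, 0)], 5.0)
```

The hard part, of course, is not the geometry but executing such a path safely on tissue that moves and deforms, which is where the dynamic maps come in.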

Johns Hopkins' Russell Taylor (Courtesy of Johns Hopkins University)

The researchers also intend to create what they call "virtual fixtures." These are pre-programmed restrictions on the robot's actions. For example, a robot might be instructed not to cut in an area where a major blood vessel has been identified. Not only would this prevent the robot from cutting a blood vessel when operating autonomously, but it would also prevent a surgeon from doing so accidentally when operating the robot manually.
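A minimal sketch of a virtual fixture as a forbidden-region check (the geometry and blocking policy are illustrative assumptions, not the project's controller): a commanded tool motion is rejected if its straight-line path would enter a sphere marked around an identified blood vessel:

```python
import math

def apply_fixture(pos, target, center, radius):
    """Allow the commanded move only if the straight-line path from pos to
    target stays outside a forbidden sphere; otherwise hold position."""
    # Sample the segment; a real controller would solve the segment-sphere
    # intersection analytically and run at servo rate.
    for s in range(101):
        t = s / 100
        p = tuple(a + t * (b - a) for a, b in zip(pos, target))
        if math.dist(p, center) < radius:
            return pos  # blocked by the virtual fixture
    return target

# A move straight through the protected sphere is blocked; a move that
# passes well clear of it goes through unchanged.
blocked = apply_fixture((0, 0, 0), (10, 0, 0), (5, 0, 0), 1.0)
allowed = apply_fixture((0, 0, 0), (10, 0, 0), (5, 5, 0), 1.0)
```

The same check works whether the motion command comes from an autonomous sub-task or from the surgeon's hand on the master controls, which is why one fixture protects against both failure modes.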

"We will design the robot to be aware of what it is touching and then use this information to assist the surgeon in carrying out surgical tasks safely," Simaan said.

The Johns Hopkins team led by Taylor will develop the system infrastructure for the CSA framework, with special emphasis on the interfaces used by the surgeon. The software will be based on an open-source toolkit developed at Johns Hopkins, permitting researchers both within and outside the team to access the results of the research and adapt them for other projects.

The teams will use several different experimental robots during this research, but all the systems will share a common surgeon interface based on mechanical components from early-model surgical robots donated by their manufacturer in Sunnyvale, California, and interfaced to control electronics designed by Johns Hopkins.

Although these prototypes are not intended for use on human patients, the research results could eventually lead to advances in surgical care.

Although the development effort is focused on surgical robots, the CSA modeling and control framework could have a major impact in other applications as well.

According to Simaan, CSA could be used by a bomb-squad robot disarming a bomb, by an operator using a robotic excavator to dig the foundation of a new building without damaging underground pipes, or by rescue robots searching deep tunnels for injured miners.

"In the past we have used robots to augment specific manipulative skills," said Simaan. "This project will be a major change because the robots will become partners not only in manipulation but also in gathering and interpreting sensory information, in creating a sense of robot awareness, and in using this awareness to complement the user's own awareness of the task and the environment."

The project is funded through the National Robotics Initiative.