Technology

Giving Neurosurgeons a Gift of Touch


The procedures used in brain surgery are among the most complex in medicine, and the margin for error is minimal. Yet neurosurgeons have no way to effectively practice for an operation before they cut into a patient. Thenkurussi Kesavadas hopes to change that. A professor of mechanical and aerospace engineering at the State University of New York at Buffalo, Kesavadas has built a virtual-reality tool for brain docs. The system, which runs on a Windows computer, lets physicians explore three-dimensional images showing what a surgery would be like.

More radical still, neurosurgeons wearing a special glove wired to software Kesavadas designed receive tactile feedback that replicates what their fingers and hands would experience in the operating room. "When surgeons virtually cut the skull, they actually feel the tool vibrating in their hands. They can feel smoothness and roughness. And they say, 'Wow, this seems real!'" says Kesavadas, who is developing the brain-surgery simulator with help from neurosurgeons at the University of Miami in Florida. The system should go into service sometime next year.

Kesavadas is a visionary on the cutting edge (for lack of a better phrase) of a medical field that has yet to get a specific name. It comprises disparate disciplines including telemedicine, robotic surgery, and three-dimensional imaging of the human body. These developing technologies, combined with powerful computers and the Internet, are on the cusp of radically altering many facets of medicine by pushing those who live by the Hippocratic oath into the realm of video games.

TISSUE STRENGTH. For Kesavadas, 40, it all started in the 1990s at Penn State University, where he was a doctoral student in industrial engineering. He was working on a project to design a robotic system that would carry out tasks normally requiring lots of human dexterity, including surgery.

His team soon realized that simply building a device to cut and sew involved not just mechanical design but also research into the physical properties of human tissue and how much force machines could or should apply to perform medical tasks without harming patients. "I found it very interesting that one of the issues in using robotics was to evaluate how strong human tissue is, and how you could use that [information] to program robots," says Kesavadas.

At the time, computer models evaluating the physical properties of the human body remained in the very early stages. Lots of imaging efforts were under way, but very few other researchers were looking at such characteristics as resistance, density, elasticity, and hardness. In 1995, Kesavadas left Penn State for Iowa State University, where he wrote software to help design factories. But the idea of creating a system that could replicate key physical attributes of the human body -- such as how strong bones are, how much skin can stretch, and what a liver feels like to someone pushing on the abdomen -- never left his mind.

SAVVY SENSORS. So he moved to SUNY Buffalo in 1996 and became involved in the Virtual Reality Lab, where he started a project that later became the "Virtual Human Model for Medical Applications." Working in conjunction with physicians from the nearby Erie County Medical Center, Kesavadas began testing ways to translate what a doctor feels into computer data.

To do this, he began to combine three-dimensional body images from the National Institutes of Health's Visible Human Project with so-called haptic sensors. These thimble-size devices can record pressure and, to a certain degree, texture when they're moved across or pushed against a surface.

This initial work evolved into a system in which a doctor dons a specially wired glove or finger-fitting that holds haptic sensors. Each sensor consists of a small ball that rolls like a trackball across a surface and also records force feedback, or how much the surface resists when someone pushes against it. The sensors are wired to a laptop.

VIRTUAL POKING. During an initial examination, the sensors record what the doctor feels as he pushes his hand against a patient. By converting feedback from the sensors into bits and bytes, the system can build a 3-D image of the virtual exam. It can also store the information for later playback, when another doctor can put on a haptic glove, move his hand across a 3-D representation on a computer screen, and feel what the first doctor felt.

"When we carry out a virtual abdominal palpatation [a medical term for a hands-on examination] using the playback data, we can poke on the abdomen and feel the same type of tension or stiffness on the surface that was captured by the haptic sensors in the first exam," Kesavadas says.

By 1999, he and a team of graduate students had created mathematical procedures, called algorithms, to describe the tactile information they were collecting. They used these algorithms to break the digitized images of the body into increments of less than six inches. This allowed the software to account for textural changes, a necessary function since tumors and other telltale signs that doctors feel for are sometimes very small. Kesavadas cautions, however, that the system still lacks superfine feedback.
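That spatial breakdown could work roughly like the sketch below, which builds on the samples from the earlier snippet. The patch size follows the article's "less than six inches"; everything else is an assumption:

```python
# Hypothetical sketch: divide the body surface into fixed-size patches so
# a small, hard spot registers in its own patch instead of averaging away.
PATCH_SIZE_CM = 15.0  # just under six inches, per the article

def patch_index(x_cm, y_cm):
    """Map a surface coordinate to the grid patch containing it."""
    return (int(x_cm // PATCH_SIZE_CM), int(y_cm // PATCH_SIZE_CM))

def build_stiffness_map(samples):
    """Average recorded force per patch as a crude local-stiffness value."""
    counts, sums = {}, {}
    for s in samples:                 # HapticSample objects from record_exam
        key = patch_index(s.x, s.y)
        counts[key] = counts.get(key, 0) + 1
        sums[key] = sums.get(key, 0.0) + s.force
    return {key: sums[key] / counts[key] for key in counts}
```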

CYBERTOUCHING. Even in its current form, the system could have numerous uses, Kesavadas says. Doctors could objectively benchmark examination data, so that exams done several days apart, when compared through the lens of the virtual-reality feedback system, might reveal important changes, such as the size of a tumor or the hardness of a stomach wall.
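A benchmark comparison between two such stiffness maps might be as simple as the following sketch; the 20% change threshold is an arbitrary illustration, not a figure from the article:

```python
def flag_changes(map_before, map_after, rel_threshold=0.20):
    """Flag patches whose average stiffness shifted markedly between exams.

    Both arguments are stiffness maps from build_stiffness_map(), taken
    days apart; only patches touched in both exams are compared.
    """
    flagged = []
    for key in map_before.keys() & map_after.keys():
        before, after = map_before[key], map_after[key]
        if before > 0 and abs(after - before) / before > rel_threshold:
            flagged.append((key, before, after))
    return flagged
```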

With the rise of fast Internet connections, Kesavadas sees no reason why a doctor in one part of the country couldn't ask for a haptic consult from a colleague in another, playing back the tactile examination data to get a better idea of what the patient's body felt like. That might help doctors decide when to evacuate patients from remote locales for special treatment -- or distinguish what seems to be severe appendicitis from plain old intestinal cramps.
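The consult itself would amount to shipping the recorded samples to a colleague. A bare-bones version over plain HTTP could look like the sketch below; the article specifies no transport or format, so the endpoint and JSON encoding here are invented for illustration:

```python
# Hypothetical transport for a remote "haptic consult". Reuses the
# HapticSample records and play_back() from the earlier sketches.
import json, urllib.request
from dataclasses import asdict

def send_exam(samples, url):
    """Upload a recorded exam to a colleague's (assumed) server endpoint."""
    body = json.dumps([asdict(s) for s in samples]).encode()
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)

def fetch_exam(url):
    """Download an exam as a list of sample dicts for local playback."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)
```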

The tactile systems he has developed might also be useful attachments to robotic surgery devices. Two companies have already built machines for performing heart surgery, but without tactile feedback, threading sutures has proved very difficult. "Imagine trying to tie your shoelaces with no sense of touch. That's what doctors have to do with robotic surgical systems," says Kesavadas, who hopes to introduce a marketable product based on the technology within the next few years. He puts the price tag as low as $5,000 for the simplest models.

Ultimately, Kesavadas believes this technology could help eliminate the use of cadavers and allow physicians to get up to speed on new procedures faster than they could by working on live surgical cases. Down the road, he sees a day when "...there will be something that digitally defines the physical properties head to toe of human beings of different ages..." to give doctors a tactile reference library spanning the entire human lifespan. No doubt, Kesavadas' work will have a lot to do with making such an ambitious task a reality.

By Alex Salkever

