About Our Laboratory

The Haptics and Medical Robotics (HAMR) Laboratory seeks to extend the current knowledge surrounding the human perception of touch, especially as it relates to applications of human/robot interaction and collaboration. We are particularly interested in medical robotics applications such as minimally invasive surgical robots, upper-limb prosthetic devices, and rehabilitation robots. To solve many of the problems in these areas, we apply techniques from human perception, human motor control, neuromechanics, and control theory. Check out Prof. Brown’s presentation in the IROS Workshop, “Intro to Haptics for XR” and his Hopkins at Home talk, “Engineering the Sense of Touch.”

Research Projects

The study of human-machine interactions is extremely useful for designing and improving the human experience in both real and virtual environments. Rendering a desired haptic experience through a device, however, can be challenging because of the complex physical sensing mechanisms of the human body and the complex cognitive processes involved in haptic perception, compounded by effects from other sensing modalities. This research focuses on the psychophysics of human interaction to better understand how the human brain can be used in the control loop of haptic devices, either to compensate for the inertia of a mechanical system or to simulate a desired inertia and create a lifelike experience.
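
As a concrete illustration of the latter idea, the sketch below shows an admittance-style approach to rendering a chosen virtual inertia: the user's applied force is integrated through a simulated mass, and the device would then track the resulting motion. The numbers, signals, and the rendering scheme itself are illustrative assumptions, not a description of a specific HAMR device.

```python
"""Minimal admittance-style sketch of rendering a virtual inertia.
All values and signals are hypothetical and for illustration only."""
import numpy as np

m_virtual = 0.5          # kg, the inertia we want the user to feel
dt = 0.001               # s, 1 kHz control period
t = np.arange(0.0, 1.0, dt)
user_force = 2.0 * np.sin(2.0 * np.pi * t)   # hypothetical force applied by the user (N)

x = np.zeros_like(t)     # simulated handle position (m)
v = np.zeros_like(t)     # simulated handle velocity (m/s)
for i in range(1, len(t)):
    a = user_force[i] / m_virtual   # the virtual mass accelerates under the applied force
    v[i] = v[i - 1] + a * dt        # integrate acceleration to velocity
    x[i] = x[i - 1] + v[i] * dt     # integrate velocity to the position the device would track

print(f"peak commanded displacement: {x.max() * 1000:.1f} mm")
```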


Minimally invasive surgical robots, such as the Intuitive Surgical da Vinci robot, allow surgeons to remotely operate surgical tools through incisions smaller than the width of a human finger. While these telerobotic laparoscopic systems allow for greater dexterity and improved vision over traditional laparoscopic surgery, they do not allow the surgeon to feel the interactions between the robot and the surgical environment. Experienced robotic surgeons have, therefore, learned to rely heavily on vision to guide their manipulations with the robot and estimate physical properties of tissue. For novice surgeons, this skill can only be developed with repeated practice and training. Still, vision is not always as accurate as the sense of touch, as demonstrated by my work on prosthetics. Thus, there exist a number of surgical procedures in which the surgeon needs to palpate or “feel” the tissue to determine critical information, such as the location of blood-carrying vessels or the boundaries of a cancerous tumor. My work in robotic minimally invasive surgery (RMIS) has two complementary foci: investigating strategies for advanced training platforms for RMIS and investigating strategies for pinching palpation feedback in RMIS.

Advanced training platforms for RMIS

For a novice surgical trainee, learning the technical skills needed to effectively and efficiently operate a minimally invasive surgical robot can be particularly daunting. Many trainees spend considerable time practicing inanimate training tasks with the clinical robot to develop the necessary technical skills. Training with the clinical robot, however, requires expert human graders to evaluate a trainee’s performance. This evaluation process can be subjective, time-consuming, and costly, as most experts are practicing physicians themselves.

To address this challenge, I am investigating the use of advanced training platforms that measure the physical interactions between the surgical robot and the training environment to assess a trainee’s skill at a given inanimate training task. The current system, the Smart Task Board, uses the task completion time, high-frequency accelerations of the two primary robotic tools and the robotic camera, as well as the forces produced on the training task to evaluate skill according to a standardized assessment tool. To accomplish this, machine learning models are trained to recognize patterns between the time, acceleration, and force data and ground-truth labels of skill provided by expert human graders.
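
As a rough illustration of this kind of pipeline, the sketch below trains an off-the-shelf classifier to map per-trial features (time, acceleration, force) to expert-assigned skill labels. The feature set, the synthetic data, and the choice of model are assumptions for illustration only; they are not the Smart Task Board's actual implementation.

```python
"""Illustrative sketch: predicting expert-assigned skill labels from per-trial
Smart Task Board-style features. Data, features, and model are placeholders."""
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials = 60

# Hypothetical per-trial features: completion time (s), RMS high-frequency
# acceleration of the left tool, right tool, and camera, and peak task force (N).
X = np.column_stack([
    rng.uniform(120, 600, n_trials),
    rng.uniform(0.2, 3.0, n_trials),
    rng.uniform(0.2, 3.0, n_trials),
    rng.uniform(0.1, 1.5, n_trials),
    rng.uniform(1.0, 10.0, n_trials),
])

# Ground-truth labels from expert graders using a standardized assessment tool
# (here simply 0 = novice, 1 = proficient, assigned at random for illustration).
y = rng.integers(0, 2, n_trials)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated agreement with expert labels: {scores.mean():.2f}")
```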

A key drawback of robotic surgery platforms is that they do not provide haptic cues. This makes them difficult to learn to operate, as surgeons must rely on their eyesight to characterize cues normally detected by touch, such as force. Substantial research has investigated incorporating haptic feedback into surgery platforms, but far less has examined how haptics can be used to train surgeons for robotic surgery. We predict that providing users with force feedback during training will help them more rapidly learn to operate with only their eyes. Surgeons experienced in robotic surgery often say that they have learned to “feel with their eyes,” and we believe that haptics can teach this skill more effectively.



As we interact with the outside world, we form percepts by integrating many channels of continuously varying haptic information (e.g. grip force, surface slip, surface texture, temperature, etc.). All of this information is critical when we want to perform intricate, dexterous manipulation tasks, such as suturing or tying shoes. Most haptic feedback interfaces for telerobots, such as surgical robots and upper-limb prostheses, have focused on single-modality haptic feedback, which limits the richness of perception. We are currently exploring whether providing multiple channels of continuous haptic feedback (e.g. both grip force and surface slip) through different haptic modalities aids telerobot users in dexterous manipulation tasks.


Motor-neurological conditions such as stroke can disrupt normal muscle activity in any part of the human body, especially the upper extremities. Upper-limb impairment makes it harder for affected people to perform daily routines that involve manipulation (e.g. opening a door, typing on a keyboard, etc.). The selective activation of a muscle group for a particular manipulation is referred to as a muscle synergy. The lab is currently collaborating with the Brain, Learning, and Movement Lab at the Johns Hopkins Medical Institute to understand how well people affected by stroke can apply and suppress forces with their fingers. Specifically, our labs are quantifying and characterizing microforces from the finger synergies of stroke-affected individuals. Our aim is to apply this synergy characterization to inform better treatments with faster rates of recovery.
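
For readers curious what a synergy analysis can look like in practice, the sketch below factors a matrix of finger forces into a small set of non-negative basis patterns using non-negative matrix factorization (NMF), a common approach in the synergy literature. The data are synthetic and the method is shown only as an illustration; it is not necessarily the analysis used in this collaboration.

```python
"""Illustrative synergy extraction from finger-force data via non-negative
matrix factorization (NMF). Synthetic data; not the collaboration's pipeline."""
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)

# Hypothetical recording: rows are time samples, columns are per-finger
# force magnitudes (N) during an isometric pressing task.
forces = np.abs(rng.normal(loc=1.0, scale=0.5, size=(500, 5)))

# Factor the force matrix as forces ≈ activations @ synergies, where each row
# of `synergies` is a fixed pattern across the five fingers and `activations`
# gives its time-varying recruitment.
k = 2
model = NMF(n_components=k, init="nndsvda", max_iter=500, random_state=0)
activations = model.fit_transform(forces)   # shape (samples, k)
synergies = model.components_               # shape (k, fingers)

# Variance accounted for (VAF) is a common criterion for choosing k.
reconstruction = activations @ synergies
vaf = 1.0 - np.sum((forces - reconstruction) ** 2) / np.sum(forces ** 2)
print(f"{k} synergies, VAF = {vaf:.2f}")
```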


The majority of commercially available prostheses offer limited or no haptic feedback to the user. For upper-limb amputees, two current options are body-powered and myoelectrically controlled prostheses. Amputees operate body-powered prostheses by using their opposite shoulder to pull on a cable, controlling the grip aperture of the attached hook end-effector. Because of this direct connection with the end-effector, body-powered prostheses provide amputees with some coarse haptic sensations; however, they typically offer limited degrees of freedom for manipulation. Myoelectric prostheses, controlled by muscle activity detected through electromyography (EMG), allow for higher degrees of freedom for manipulation but eliminate the direct connection with the end-effector. For both options, amputees cannot feel crucial, high-fidelity haptic sensations of the interaction between the prosthesis and the environment and are forced to rely heavily on vision, which can be mentally taxing.

To explore the utility of haptic feedback for anthropomorphically driven prostheses, our group has designed a wearable, anthropomorphically driven prosthesis with a built-in haptic feedback system. We are currently running studies to quantify the effects of the built-in haptic feedback system during a unilateral manual dexterity task, the Box and Blocks Test. This research will inform the design of future prostheses and provide information on the importance of certain forms of haptic feedback.

Previously, our group compared the efficacy of cutaneous and kinesthetic haptic feedback modalities for myoelectric prosthesis users through sensory substitution in a stiffness discrimination task using custom haptic devices. We are currently studying the effect of haptic feedback on task performance and mental effort in a dexterous grasp-and-lift manipulation task.
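
A typical way to quantify performance in such a discrimination task is to fit a psychometric function and read off a just-noticeable difference (JND), as sketched below. The response data and the logistic model here are illustrative assumptions, not results from our studies.

```python
"""Illustrative psychometric-function fit for a stiffness discrimination task.
The response data and model are assumptions for illustration only."""
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical results: stiffness increments (fraction of the reference) and
# the proportion of trials in which the stiffer comparison was identified.
increment = np.array([0.05, 0.10, 0.15, 0.20, 0.30, 0.40])
p_correct = np.array([0.55, 0.62, 0.71, 0.80, 0.91, 0.97])

def psychometric(x, x0, k):
    """Two-alternative forced-choice curve rising from 0.5 to 1.0."""
    return 0.5 + 0.5 / (1.0 + np.exp(-k * (x - x0)))

(x0, k), _ = curve_fit(psychometric, increment, p_correct, p0=[0.15, 10.0])

# One common JND definition: the increment at 75% correct (the curve's midpoint).
print(f"estimated JND: {x0:.2f} of the reference stiffness")
```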


Is force feedback useful in body-powered prostheses?

The inherent haptic force feedback of the body-powered prosthesis is thought to be one of the main reasons it has remained a popular choice for amputees, despite the design remaining relatively unchanged since its development over 60 years ago. This claim, however, has never been experimentally validated, largely because it is next to impossible to remove haptic feedback from the prosthesis without also removing the ability to control the prosthetic gripper. Empirical evidence of the benefit of haptic feedback would nonetheless be useful for researchers considering similar haptic feedback strategies for myoelectric prostheses.

To answer this question, we developed a unique body-powered prosthesis featuring removable haptic feedback. In this prosthesis, a cable-driven elbow exoskeleton controls a cable-driven, voluntary-closing prosthetic gripper through two linear actuators. These actuators are electrically connected and can separately regulate the transmission of the control input from the body to the gripper and of the feedback from the gripper back to the body. Using this prosthesis, we found that the haptic force feedback available in a body-powered prosthetic gripper provides more utility than vision alone, which is all that is currently available in myoelectric prostheses.
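
The sketch below illustrates the basic idea of electrically decoupling the two paths: the forward path always maps the body's cable input to the gripper, while the feedback path can be enabled or suppressed. The sensor and actuator interfaces are hypothetical placeholders, not the device's actual software.

```python
"""Conceptual sketch of removable force feedback in a body-powered prosthesis.
All hardware interfaces below are hypothetical placeholders."""
import time

FEEDBACK_ON = True  # set to False to "remove" haptic feedback without losing control


def read_body_cable_displacement() -> float:
    """Hypothetical sensor: cable excursion commanded via the elbow exoskeleton (mm)."""
    return 0.0


def read_gripper_force() -> float:
    """Hypothetical sensor: grip force measured at the prosthetic gripper (N)."""
    return 0.0


def command_gripper_actuator(displacement_mm: float) -> None:
    """Hypothetical actuator: closes the voluntary-closing gripper in proportion to the input."""


def command_feedback_actuator(force_n: float) -> None:
    """Hypothetical actuator: tensions the body-side cable to reflect grip force."""


while True:
    # Forward path: the body input drives the gripper regardless of the feedback setting.
    command_gripper_actuator(read_body_cable_displacement())

    # Feedback path: reflect the measured grip force back to the body, or suppress it.
    command_feedback_actuator(read_gripper_force() if FEEDBACK_ON else 0.0)

    time.sleep(0.001)  # ~1 kHz control loop
```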

Substantial progress has been made in the control of myoelectrically driven upper-limb prostheses, which has improved the quality of life for many amputees. However, one of the biggest complaints amputees still have about these commercially available prostheses is the lack of tactile feedback. Tactile feedback is crucial not only for highly dexterous tasks but also for basic manipulation of objects in daily life. The somatosensory system conveys a large spectrum of tactile sensations, running the gamut from vibratory to proprioceptive information. With a traditional myoelectric prosthesis, these cutaneous and kinesthetic sensations are not available. In contrast, a body-powered prosthesis provides proprioceptive information through the forces transmitted by the cables and harness as the user operates his or her prosthesis. This is a key reason why many users choose a body-powered prosthesis over a myoelectric one.


ARE YOU READY TO JOIN OUR LAB?