Elvis C. S. Chen
519.931.5777 x24350




Why I Became a Scientist

Early in my academic training, I had the opportunity to work in a research laboratory where I could apply my knowledge of computing to interventional medicine to improve patient care. I was fascinated by the workflow of bench-to-bedside research, in which seemingly abstract concepts are transformed into actionable solutions that make a positive impact on a patient's well-being. Computing is not just about numbers and algorithms; it is a means to action limited only by one's imagination. Working as a scientist at Robarts, I am able to bridge the gap between basic science and interventional medicine and to carry an idea from conception to clinical translation. I am privileged to have academic freedom and the opportunity to work with some of the brightest minds, and honoured by the responsibility to pass down what I have learned to my trainees.

Research Summary

Minimally Invasive Surgeries (MIS) are becoming established alternatives, with curative intent, to many types of open procedures. Because surgical sites are hidden beneath the skin, MIS relies centrally on medical imaging and medical mechatronics for surgical planning, intraoperative guidance, and postprocedural assessment. The ability to perform dexterous surgical maneuvers through small incisions is limited by the lack of direct human vision and of surgical tools with intuitive control. Thus, my research centrally addresses two questions:

  1. How can we further improve patient outcomes in the context of minimally invasive surgery?
  2. How can we make surgical tasks, performed through small incisions in the skin, easier to execute?

The three pillars of my research are i) medical mechatronics, ii) AI-enabled image computing, and iii) mixed-reality systems for 3D visualization.

Research Questions

Can minimally-invasive surgery be performed without ionizing radiation?

Medical imaging modalities based on ionizing radiation, such as CT, fluoroscopy, and 2D X-ray, are commonly used intraoperatively to guide the placement of surgical instruments, but they are harmful to both the patient and the medical practitioners. We are developing ultrasound-based guidance systems for many surgical interventions performed percutaneously.

How to reach a surgical target safely and accurately?

Many surgical interventions, including tumour ablation, deep-brain stimulation, and general anesthesia, rely on the accurate placement of a “needle-like” medical device at a surgical target without causing collateral damage. We are developing a “GPS” for surgery to assist surgeons in performing these delicate tasks.

How to visualize surgical scenes hidden beneath the skin in 3D?

While several imaging modalities, such as CT and MRI, inherently depict patient anatomy in 3D, they are nonetheless presented to surgeons “slice-by-slice” on 2D monitors. We are developing advanced 3D visualization techniques that enhance surgeons’ ability to perceive medical information in three dimensions.


Education

Dr. Chen completed his BSc (Hon), MSc, and PhD in Computer Science at Queen’s University. Under the supervision of Dr. Randy E. Ellis, his theses focused on the forward and inverse kinematics of the knee after total knee arthroplasty.

  • BScH in Computer Science, Queen’s University, Canada
  • MSc in Computer Science, Queen’s University, Canada
  • PhD in Computer Science, Queen’s University, Canada


Postdoctoral Training

Dr. Chen pursued a postdoctoral fellowship at Queen’s University under the joint supervision of Drs. Gabor Fichtinger, Parvin Mousavi, and Purang Abolmaesumi, in the research area of ultrasound-guided spine interventions. In 2009, he joined Dr. Terry M. Peters’ lab at Robarts Research Institute, at the University of Western Ontario.

  • PDF, School of Computing, Queen’s University, Canada
  • Limited Licensee, certified by Professional Engineers Ontario


Awards

  • New Investigator Award, Canadian Society for Biomechanics, 2002


Contact Info

Robarts Research Institute
Western University
519.931.5777 x24350
Email: chene@robarts.ca
Personal Research Website: https://chene.github.io/