Mimic Technologies Blog

Where Is the Best Place To Do Robotic Surgical Simulation & Training?

Simulation is increasingly recognized as an important component of learning to drive a surgical robot. However, there has been very little discussion about where the best place to learn is, and whether the location of a simulator makes a difference in its utilization.

Essentially there are three places where simulation can be used for training:
• Inside the OR using the actual robot console connected to a simulator
• In a dedicated space outside the OR, such as a Sim Center, using a surgeon console emulator
• Wherever you can find an appropriate space using a portable simulator

Let’s look at these options in a little more detail:

Inside the OR

Back in 2010, Mimic helped Intuitive Surgical develop a simulation training product that uses the da Vinci® Surgeon Console. The da Vinci® Skills Simulator can be attached to a robot’s second console or the primary console.

The second console is a device where a second surgeon can sit, watch the operative field, and have the same immersive feel as the primary surgeon. Although not a simulator, it can be an effective training tool: control of the robotic arms can be swapped between the two consoles, so the student can intervene at the appropriate time while the proctor regains control as necessary. As long as space is not an issue, a second console is felt to help the overall efficiency of an OR, hence its popularity; approximately 20% of all da Vinci® Systems are sold with a second console, and the percentage is higher on the newer Xi systems and in the US. In theory, a second console could be moved outside of the OR when not in use, which would allow for training both inside and outside the OR. Practically speaking, the second console is cumbersome to move and rarely leaves the OR, and the primary console almost never does. Even so, 50% to 60% of customers purchasing a single-console robot also purchase a da Vinci® Skills Simulator with their system.

The advantage of the Skills Simulator is that it uses the real surgeon console for input, which means the hardware fidelity is 100%. The downside is that Skills Simulators are almost always in the OR and not always accessible. The more successful the robotic program, the less time there is to use the Skills Simulator in the OR, because case volume dominates OR usage. Some institutions also do not like giving people access to the OR outside of the normal working day. A common concern is that the system may not be shut down properly, which costs OR time the next day, or worse, that the console cable may be damaged as the trainee switches it between the real robot and the simulator.

Outside the OR

Many larger institutions have been fortunate enough to invest in dedicated simulation centers. These vary from small dedicated rooms to large purpose-built buildings. They have been good locations for console emulator products such as Mimic’s dV-Trainer®. The positive is that they are often accessible 24/7 and are often supported by dedicated staff who know the systems and can act as proctors. The downside is that some of these centers are not in a convenient location, which acts as a deterrent to utilization. Some users are also looking for higher fidelity than the 85% to 90% offered by the dV-Trainer®.


Anywhere with a Portable Simulator

It is still too early to say where the best location for a portable simulator will be, but the picture below illustrates just how versatile the FlexVR™ simulator is. It can be set up on nearly any table, including a spare table in the hall outside of the OR! The fidelity of the FlexVR™ has not been independently measured, but without force feedback to help hold one’s hands in the air, it is expected to fall just below that of Mimic’s dV-Trainer®.

The real question is, does location influence usage?

One of Mimic’s high utilization customers has been using simulation for over 5 years. They have systems in the OR, in a separate room in the OR area, and in the sim center located 2 blocks away. We have been able to track the usage of all three systems. The graph below shows the number of sessions carried out on the simulator over the past 5+ years. In total, over 200 users have done 16,000 exercises spending on average around 4 hours each on the simulator.

As we can see, the most simulation activity occurred when the first system was installed. After a dip in 2013, usage has settled into a steady average of around 2,500 sessions per year.

The graph above shows the relative usage between the different simulators. Beginning in 2007, the dV-Trainer® was the only option and all the simulation activity took place outside of the OR. In 2011, the Si Skills Simulator was made available for training with the surgeon console. This led to an initial reduction in the usage of the dV-Trainer®, though over the next 3 years the two systems were used evenly. During 2016, the da Vinci® Xi began to supplant the Si and a new Xi Skills Simulator was required. Initial usage of the da Vinci® Skills Simulator (DVSS) for the Xi was high; however, as time has gone on, usage has decreased and been replaced by the dV-Trainer®, which accommodates both Si and Xi simulation.


It is still too early to say how simulation utilization will be affected by Mimic’s release of the portable FlexVR™ simulator. However, drawing a parallel to training inside the OR versus outside the OR, it is expected that more access will lead to higher utilization. All said, access is an important consideration when deciding where to locate a simulator, and the beauty of the FlexVR™ is that location is no longer a primary concern – its price, size, and portability make it a system that can be used virtually anywhere!


How Long Should a Trainee Practice on a Robotic Simulator?

This is probably the most frequent question asked when people are trying to understand the impact of investing in a simulator for their robotic surgical program. Of course, there is no single answer, as training objectives can vary significantly depending on the trainee’s discipline, level of surgical experience, and the standards set by a hospital’s robotics committee. Regardless of the training objective, however, a trainee should not be judged by “how long” they have trained, but by whether the training data indicates “proficiency”. Rather than boring you with published articles on what it means to be proficient, I will share some generalized data that helps answer this question.

I have been able to analyze the utilization at some of our high-usage centers on both dV-Trainers® and da Vinci® Skills Simulators. While overall usage is higher on the dV-Trainer®, the distribution of utilization is the same across both platforms. These systems represented 705 different users who spent over 2,600 hours on them.

The average utilization of the system is just under 4 hours per user. I therefore chose to split the users into three groups: those who spent more than 10 hours on the system, those who spent between 4 and 10 hours, and those who spent less than 4 hours.

As you can see, the vast majority of people spent less than 4 hours with approximately 1 in 4 spending less than an hour.

However, if you look at the impact on utilization, you will see that the smaller group of dedicated trainees (those who trained for more than 10 hours) dominates the total simulation usage.
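The grouping described above can be sketched in a few lines of Python. The hours below are hypothetical illustrative values, not the actual customer data, but they show how a small group of heavy users can dominate total usage:

```python
# Bucket trainees by total simulator hours and compare each bucket's
# share of the overall usage. The hours list is hypothetical data.
hours = [0.5, 0.8, 2.0, 3.5, 1.2, 6.0, 8.5, 12.0, 25.0, 40.0]

buckets = {"<4 h": 0.0, "4-10 h": 0.0, ">10 h": 0.0}
counts = {"<4 h": 0, "4-10 h": 0, ">10 h": 0}
for h in hours:
    key = "<4 h" if h < 4 else ("4-10 h" if h <= 10 else ">10 h")
    buckets[key] += h
    counts[key] += 1

total = sum(hours)
for key in buckets:
    share = 100 * buckets[key] / total
    print(f"{key}: {counts[key]} users, {share:.0f}% of total hours")
```

Even with most users in the under-4-hour bucket, the handful of over-10-hour trainees account for the large majority of logged hours, mirroring the pattern in the real data.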

How does this help answer the original question? One interpretation is that someone who does not spend more than 4 hours on a simulator cannot be seen as serious about training, while someone who spends over 10 hours is clearly more interested in developing the right skill level. Interestingly, 10 hours is the amount of time the Urology Department at Hartford Hospital felt was the optimum residents should spend on simulation, based on their 2015 study (Read the full study here). The answer could therefore be: “at least 10 hours should be spent on simulation training, or as long as it takes you to become proficient”.

The real answer is that it will vary by individual and will be based partly on their level of interest and their innate ability. In 2014, a paper was published by Andrea Moglia from Pisa on the innate ability that medical students might have for robotics (Read the full study here). The paper showed that 6.6% of the 125-student population scored significantly higher than the median and were as good as expert surgeons the first time they sat down at the console. At the other end, 11% clearly had no aptitude at all. Users with innate ability should be expected to require significantly less training time than the average user, but the time needed will depend on the curricula and the proficiency thresholds set by their institutions. The time an individual will need to spend on a simulator to gain proficiency will therefore be driven by a number of factors, including their own motivation, their innate ability, and the proficiency standard set by their institution.


The Relationship Between Robotic Surgical Technical Skill in RARP and Patient Outcomes – 3 Abstracts From AUA 2017

One of the questions we have been studying and trying to better understand is the relationship between surgical skill and patient outcomes. The rough rule of thumb has always been to ask a surgeon how many of these procedures he or she has done. The logic is that the more surgeries they have performed the better they are likely to be.

Three abstracts were presented that looked at this particular subject at AUA 2017. Two of them leveraged the Michigan Urological Surgery Improvement Collaborative (MUSIC) and one was from a single center study looking at a small cohort of patients from a single surgeon.

High-level conclusion:

All three abstracts concluded that there was a relationship between the technical skill of a surgeon and selected patient outcomes. These varied from improved continence at 3 months to reduced urethral catheter replacement for more technically skilled surgeons. There were also suggestions of improved readmission rates and decreased blood loss, though these were not consistent across all abstracts.

Two of the abstracts were as follows:
Technical Skill Assessment of Surgeons Performing Robot-Assisted Radical Prostatectomy: Relationship Between Crowdsourced Review and Patient Outcomes (Paper# PNFBA-02: Best Abstract – Khurshid R. Ghani et al., Ann Arbor, MI. Read the full study here)

Surgical Skill and Patient Outcomes After Robot-assisted Radical Prostatectomy (Paper# PD58-06: James O. Peabody et al., Detroit, MI. Read the full study here)

Both of these abstracts leveraged the MUSIC database, asked 29 surgeons to provide a video of a case, and used a clip of the vesicourethral anastomosis. Both compared results across 2,256 patients. In both instances, the surgeons were reviewed using the Global Evaluative Assessment of Robotic Skills (GEARS), divided into quartiles, and the results of the top 25% (Most Skilled) were compared with those of the bottom 25% (Least Skilled).
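The quartile comparison used in both MUSIC abstracts can be sketched as follows. The GEARS scores and surgeon IDs here are hypothetical, and the quartile cut is a plain rank split, which may differ from the exact method the authors used:

```python
# Rank surgeons by GEARS score and extract the top 25% ("Most Skilled")
# and bottom 25% ("Least Skilled"). Scores are hypothetical values.
scores = {"S01": 18.2, "S02": 21.5, "S03": 15.9, "S04": 23.1,
          "S05": 19.7, "S06": 24.0, "S07": 17.4, "S08": 20.8}

ranked = sorted(scores, key=scores.get)   # surgeon IDs, low -> high score
q = len(ranked) // 4                      # size of one quartile
least_skilled = ranked[:q]                # bottom 25%
most_skilled = ranked[-q:]                # top 25%

print("Least skilled:", least_skilled)
print("Most skilled:", most_skilled)
```

Outcome measures for the patients of the two groups would then be compared, which is where the continence and catheter-replacement differences in the abstracts come from.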

The difference between the two papers is that one used the C-SATS platform and crowdsourced the evaluation using 285 reviewers, while the other leveraged a group of 56 surgical peers. There is not enough detail in the abstracts to understand the nuances and differences between the reviewers and the scoring achieved.

The results for the crowdsourced study can be seen below:

While the results from the peer reviewed study are seen below:

Given that there is an overlap in a number of the authors in both abstracts, it will be interesting to see if a paper is published that tries to explain these differences.

The third paper was:
Surgical Technical Performance Impacts Patient Outcomes in Robotic-Assisted Radical Prostatectomy (Paper# MP51-15: Mitchell G. Goldenberg, Toronto, Canada; S. Larry Goldenberg, Vancouver, Canada; Teodor P. Grantcharov, Toronto, Canada. Read the full study here)

This looked at 28 case-matched patients from one surgeon over a 7-year period, reviewed by a single surgeon. They also used the GEARS scoring system, as well as the Robotic Anastomosis Competency Evaluation (RACE) and the Generic Error Rating Tool (GERT).

They were able to see a correlation between patients who did not achieve continence at three months and the number of errors that occurred during the bladder neck dissection.

Fundamentally, these abstracts show that surgical skill matters. The more a surgeon can develop their technical skills away from a patient, using validated simulation curricula that drive toward proficiency, the better it will be for the patient as well as the hospital, as fewer readmissions will lead to lower overall costs.

For more articles from the Journal of Urology and AUA, click here. For more validation studies about Mimic products, click here.

Focused on assisting hospitals to better maximize their investment in robotic surgery, Mimic has over 15 years of experience providing tools and support for robotic surgery training and program support. Contact us today to learn more.


Interesting Studies from AUA 2017

In May, Mimic joined top urologists from around the world in attending the AUA Annual Meeting in Boston. Among the rich content and papers presented, here are 3 especially interesting robotic surgery training and simulation takeaways from papers presented at AUA 2017:

1. Trainees take between 15% and 120% longer than expert surgeons when carrying out procedural steps (Paper#MP51-04; Muammer Altok, Mary Achim, Surena Matin, Curtis Pettaway, John Davis, Houston, TX).

2. Novices position their arms in a less ergonomic fashion than expert surgeons (Paper#PD46-06; Kenta Takayasu, Kenji Yoshida, Tadashi Matsuda, Osaka, Japan).

3. Viewing a patient-specific simulated 3D model of a kidney tumor helped novices in identifying tumor locations (Paper#PD46-12; Rai et al.).

1. A Decade of Robot-Assisted Radical Prostatectomy Training: Time-Based Metrics from Fellows and Residents (Paper# MP51-04, Muammer Altok, Mary Achim, Surena Matin, Curtis Pettaway, John Davis, Houston, TX)

A common way to train fellows is to allow them to carry out steps of the procedure and, as they build up confidence, eventually migrate to the complete procedure. This paper looked at the difference in time at various stages of the procedure between expert and novice surgeons and graded them by quartile. Overall, fellows and residents were involved in 1,622 cases. The increase in time to complete the segments varied from 15% (E-PLND) to 120% (dorsal vein complex), depending on the part of the case being carried out, as can be seen in the table below:

A Grade 4 to 5 success rate was achieved in 95% of the cases. Modern training in robot-assisted surgery is evolving towards curriculum-based training that includes didactics, dry-lab exercises, wet-lab operations, surgical assistance, and ultimately console performance under careful supervision. After a decade of training 4 clinical fellows and up to 12 residents per year, this study transformed their step-wise time metrics into a simple benchmarking table. Non-validated qualitative feedback was also recorded. Read the full study here.
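The “percent longer” figures reported in this study are simple relative differences in step time. As a minimal sketch, here the minute values are hypothetical, chosen only to reproduce the reported 15% and 120% extremes:

```python
# Percent increase in step time for trainees relative to experts.
# Step times (minutes) are hypothetical illustrative values.
expert_min = {"E-PLND": 20.0, "dorsal vein complex": 5.0}
trainee_min = {"E-PLND": 23.0, "dorsal vein complex": 11.0}

pct_increase = {
    step: 100 * (trainee_min[step] - t) / t
    for step, t in expert_min.items()
}

for step, pct in pct_increase.items():
    print(f"{step}: +{pct:.0f}%")  # e.g. E-PLND: +15%
```

Tabulating these relative differences per procedural step is what lets a program use the study's table as a benchmark for its own trainees.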

2. Analysis of the Posture Pattern During Robotic Simulator Task Using Optical Motion Capture System (Paper# PD46-06, Kenta Takayasu, Kenji Yoshida, Tadashi Matsuda, Osaka, Japan)

This study essentially looked at whether the relative position and movement of the shoulders, elbows, and wrists differed between novice and expert surgeons carrying out two Mimic simulation exercises. The table below shows there was in fact significant variation:

We have often seen a difference in economy of motion between novice and expert surgeons, and this is another way of looking at the same phenomenon. In addition, there are differences between novices and experts in the positional relationship between the elbow and wrist and in the joint angles of the upper limb, indicating that experts may experience less postural stress. Read the full study here.

3. Virtual Simulation Improves a Novice’s Ability to Localize Renal Tumors in 3D Physical Models – a Multi-institutional Prospective Randomized Controlled Study (Paper#PD46-12, Rai et al.)

This interesting paper evaluates whether bringing patient-specific 3D models into a simulated environment helps in identifying tumor location. One hundred medical students were put through the protocol below: all were shown a CT scan, then half viewed a 3D model on a dV-Trainer® before examining a physical model, while the other half went straight to the physical model.

Those who had also looked at the 3D virtual representation on the dV-Trainer more accurately visualized the tumor location on the physical model. Read the full study here.



Tips to Take Your Robotics Program to the Next Level

Here are some tips that may help you take your robotics program to the next level:

• Encourage your robotics committee to establish minimum credentialing thresholds of simulation performance.
• Create accounts for all trainees (i.e., no training under “guest”). This ensures performance can be tracked.
• Implement Mimic curricula to test for innate ability of residents and fellows, and predict who is most likely to excel in your robotics program.
• Ensure training data is regularly uploaded to the cloud with Mimic’s MScore Portal so it can be easily reviewed by your robotics committee. Allow Mimic to customize a dashboard of the data/analytics.
• Use Mimic data analytics to compare your institution’s performance to other hospitals in your IDN or the rest of the world in terms of quality, efficiency, safety, and risk.
• Increase access to training by providing simulators outside of the operating room with Mimic’s dV-Trainer® (a surgeon console emulator) or FlexVR™ (a portable simulator that trainees can take home).
• Promote team training between the surgeon and the first assist with Mimic’s Xperience Team Trainer. Trainees should prove competence as a first assist before they begin training at the surgeon’s console.
• Introduce trainees to procedures with surgeon-led simulation, Maestro AR (prostatectomy with Drs. Patel & van der Poel, partial nephrectomy with Dr. Gill, hysterectomy with Dr. Advincula, and inguinal hernia with Dr. Low).
• Earn CMEs when attending advanced hands-on robotics training programs with MimicMED and MimicMED partners, such as the Florida Hospital Nicholson Center (home of MimicMED), the UPMC Center for Advanced Robotics Training (CART), and the STAN Institute at Nancy Hospital (France).

(800) 918-1670
