This year’s Association of Medical Illustrators (AMI) Annual Conference brought together 400+ visual communication experts to share tales of cutting-edge medical procedures, newly discovered molecular processes, scientific mysteries revealed, and inspirational journeys that fascinate all of us. Mimic was honored to have three of our talented team members (Steve Rowse, Gordon Nealy, and Emily Shaw) speak at the event, and even to go on to win the prestigious “Charlotte Holt Award of Excellence” for the dV-Trainer, Mimic’s robotic surgery simulator.
The following is a transcript from Mimic’s lead 3D Artist, Steve Rowse, who spoke about “Designing the Original Robotic Surgery Simulator” at AMI 2014. In his speech, he talks about the role of a 3D medical robotic simulation artist and why he passed up a “dream job” as the Lead Technical Environment Artist on Microsoft’s Halo team to work with Mimic Technologies.
“I’m a 3D medical robotic simulation artist. Of course, when I tell people this I typically get a blank stare followed by… ‘wow.. that’s umm, interesting.’ So what do I do?
I am responsible for the 3D content in our simulator software including soft and hard modeling, animation, lighting, rigging, rendering, FX, and the technical art pipeline.
I’d like to share a bit of the inner workings of our software and how we are helping to bridge the gap between medical illustration and real-time interaction through Virtual Reality, Simulation and Augmentation.
In previous jobs, I have created all kinds of content, including 2D and 3D animations, 2D and 3D environments, and vehicles and weapons for Sony PlayStation’s SOCOM series, EA’s Need for Speed franchise on Xbox, and children’s PC games.
In 2012 I passed up an offer for my “dream job” as the Lead Technical Environment Artist on Microsoft’s Halo team and took a position with Mimic, because I became totally intrigued with what they were doing and recognized it as the next big idea – not just in 3D but in technology. Sitting at the corner of Simulation, Robotics, Virtual Reality, and of course Medical Illustration, I knew I just had to become a part of this. After interviewing with Mimic, I saw clearly that medical simulation is here and that the fusion of video games and medical illustration is inevitable. I believe that both careers are very important to our futures, and we as professionals have to be aware of each other.
In other words, it’s going to take a bridge to combine the two. Simulation is for the purpose of education, not entertainment. So, in recognition of this, when I was faced with my first major task at Mimic – redesigning and creating a look and feel for our exercises – I had to keep this in mind. My first reaction was to go photo-realistic. That not only presented a lot of technical and practical problems, but my worry was that surgeons wouldn’t accept a mostly-realistic experience because they would get caught up in the details of it not being exact. I came face to face with the Uncanny Valley – the creepy experience we get when things are almost real, but not quite.
The uncanny valley is a hypothesis in aesthetics which holds that when human features look and move almost, but not exactly, like those of natural human beings, it causes a response of revulsion among some observers. The “valley” refers to the dip in a graph of observers’ comfort level as a subject’s appearance moves toward a healthy, natural human likeness.
My choice then became to make the experience as immersive as possible, but with a distinct message that this is not real, thus not interfering with the user’s experience. This launched me into a medical illustration type of position, not just a medical rendering position. My choice was to follow Pixar’s example of creating an experience rather than duplicating reality. I believe that illustration holds an invaluable contribution to the world of simulation, because after all, aren’t we just bringing a textbook to life by learning in a more experiential way? No matter what the technical advances are in CG, artists and illustration will always be an important and crucial piece of creating experiences, and to enter this forum we must be both illustrators and game developers – a pretty high demand. In order to do this we use hand-painted textures and simplified geometry. This also works to our advantage when it comes to software development issues like tiling textures and real-time deformation.
So what else does the future hold in simulation?
One of the areas we are now working in is Augmented Reality, using real video as a backdrop to answer questions about a surgery in an interactive way. Because this is actual video we avoid the uncanny valley, but we also lose the totally interactive experience. In addition, we are inventing new ways to develop software in perceived 3D. What this means is that our development space is in 3D – not just our end product. While this is not artistically exciting at this point, what’s exciting is the technology behind it, which will, I believe, be the next bridge to a hybrid of illustration, CT-scanned 3D data, and real-time simulation – giving us the ability to offer virtual surgeries of real patients before the actual surgery is performed.
The thought I most want to share is simple and straightforward. Despite the ever-present force of the technology push and the desire for more “realistic” looks, illustration and art are not going to be replaced – they will just look slightly different. Illustration, for the purpose of education and communication, will be essential in Virtual Reality and Simulation.”
“At the heart of our profession are compelling stories” – AMI 2014
Mimic’s Steve Rowse, Gordon Nealy and Emily Shaw will speak about “Designing the Original Robotic Surgery Simulator” at this year’s Association of Medical Illustrators (AMI) Annual Conference at Mayo Clinic in Rochester, MN, July 23-26, 2014.
Since 1945, AMI Annual Conference has brought together artists, illustrators, animators, and visual communicators from around the world, to celebrate innovations and explore advancements, inspirations, and ideas in the dynamic fields of biomedical science and healthcare.
Speakers will highlight topics ranging from sculpting, anatomical body painting, écorché modeling, mobile technology, virtual 3D simulation, and paper quilling to environmental physiology, regenerative medicine, 3D medical imaging, patient engagement strategies, and the separation of conjoined twins.
Meet the Mimic team speaking at this year’s AMI!
Mimic is honored to have three of our inspirational team members sharing their stories of bringing to life the Original Robotic Surgery Simulator.
Steve Rowse started his art career in computer gaming at Humongous Entertainment, a 2D children’s gaming company, as an animation lead and environment artist after graduating from the Art Institute of Seattle. Steve was the Lead Vehicle Artist for what would eventually become the very successful Forza Motorsport series on Xbox, and a vehicle artist for Electronic Arts (EA) in Canada on the Need for Speed franchise; his title credits include Underground 2, Most Wanted, and Carbon. Steve has also worked for Sony as a weapons and vehicle artist at Zipper Interactive, on such titles as M.A.G. and the SOCOM series of games. Steve now works for Mimic Technologies as the software team’s 3D artist, responsible for creating all of the art content and visual technologies for Mimic’s software.
Gordon Nealy, MS, is a graduate of the Medical College of Georgia. He began his medical illustration career on staff at the Cleveland Clinic in Ohio and Tufts New England Medical Center. His assiduous pursuit of video and 3D graphics brought him to Seattle, where he joined an organization that designed curricula and visual databases for science education. As Gordon developed his 3D skills, Microsoft was entering the video game business. This was an opportune time, so he jumped to the burgeoning game industry, where he participated in shipping four successful games on PC and Xbox. Today, he is the Art Director at Mimic Technologies.
Emily Shaw, MA, CMI, EMT-B, has almost a decade of experience in the field of clinical simulation with Laerdal DC and Laerdal Medical, Clinical Simulation Center–Baltimore, SiTEL, MedStar, and MedStar Health. She currently conducts robotic surgery simulator sales as the NE Territory Manager for Mimic Technologies. Emily also holds a BFA in painting from the Maryland Institute College of Art and an MA in medical illustration from The Johns Hopkins University School of Medicine’s Department of Art as Applied to Medicine.
More about the speakers here: http://www.ami.org/meetings/2014/?page_id=661
Lehigh Valley Health Network is hosting the third annual Robotic Simulation Olympics this weekend! Participants are competing for the title “America’s Next Top Doctor” by trying their skills on Mimic’s simulation platform for the da Vinci Surgical System®, the same training system used by LVHN robotic surgeons. The final round is June 28 at Coca-Cola Park in Allentown. Finals begin at 5 p.m., followed by the awards presentation and the closing ceremonies.
Can’t make the event in person? The finals will be streamed live from Coca-Cola Park tomorrow night between 5 and 6:30 p.m. EST at SimulationOlympics.com
This special event is spearheaded by Martin Martino, MD, a board-certified gynecologic oncologist with Lehigh Valley Health Network. He is also medical director of the robotic surgery program and a founding member of the Robotic Training Network.
“I’m passionate about finding the state-of-the-art treatments, whether it’s robotic surgery or new chemotherapies that have been identified, to help our patients get better,” he says. Learn more about him at http://www.lvhn.org/find_a_doctor/profile-2365
With the development of augmented reality operating platforms, the way surgeons use imaging as a real-time adjunct to surgical technique is changing.
A recent survey distributed to the European Robotic Urological Society (ERUS) mailing list included a questionnaire with three themes: surgeon demographics, current use of imaging, and potential uses of an augmented reality operating environment in robotic urological surgery.
According to the June 2014 study published in The International Journal of Medical Robotics and Computer Assisted Surgery, 87% of the ERUS survey respondents felt there was a role for augmented reality as a navigation tool in robotic surgery.
According to the abstract, “This survey has revealed the contemporary robotic surgeon to be comfortable in the use of imaging for intraoperative planning . . . it also suggests that there is a desire for augmented reality platforms within the urological community.”
“Augmented reality really is the ‘way of the future’ for surgical training,” explained Jan Ostman, VP Marketing & International Sales for Mimic Technologies. “That’s why we’ve been focused on creating robotic surgery training simulation solutions with this type of imaging technology.”
With Maestro AR™ (augmented reality), Mimic introduced the first robotic surgery simulation technology that provides 3D virtual instruments for interaction with anatomy in a 3D video environment. It addresses demand from the robotic community for truly interactive procedure-specific simulation.
“The parallel is that in order to use navigation, surgeons must be prepared, demonstrating familiarity with the procedure, which is where Maestro AR comes in,” explains Todd Larson, Executive Director of Mimic Medical Education & Development (MimicMED). “We are using augmented reality to teach the surgical steps, anatomy and help trainees predict regions for dissection.”
With Maestro AR on the dV-Trainer®, surgeons manipulate virtual robotic instruments to interact with anatomical regions within augmented 3D case video footage. Learn more and watch a preview video of the first module: Maestro AR for Partial Nephrectomy
The following blog post is a transcript of a speech Andrew J. Hung, MD (Director, Surgical Simulation & Education at USC Institute of Urology) gave during this year’s American Urological Association Annual Meeting in Orlando, Florida.
“Existing robotic virtual reality simulators have been focused on basic surgical skills. This has been a very appropriate entry point for surgeons who want to adopt robotic surgery into their practice. Procedure specific simulation is the next frontier.
In developing Procedure-Specific Simulations we have learned the following:
First, the collaboration between the surgeon and the software developer is absolutely critical. A surgeon brings to the table an intricate understanding of the anatomy as well as the procedure. The software developer brings engineering, and very specific technical capabilities, such as the 3D artist who brings our ideas into a 3D environment that’s interactive and engaging.
Second, our whole process starts with the storyboard. In our case, we scripted the robotic Partial Nephrectomy from beginning to end, skin-to-skin, not leaving out a single detail. And we record the operating room procedure. The High Definition Stereo 3D camera is a very sophisticated recording device, capable of capturing the procedure at the absolute highest graphic definition.
Third, we then take the whole procedure, edit it, introduce anatomy, and ask users to point out the main parts as a baseline test before they are allowed to move on to the next step in the operation. Each procedure step leads to another cognitive-type question. We ask users, ‘What is the next step coming up?’ because it is important for the surgeon to know what to anticipate.
Finally, because this is now called ‘Procedure-Specific,’ we’re teaching advanced skills such as tissue retraction. We ask the surgeon, ‘Where do you grab the tissue?’ and ‘which direction do you pull it?’ and ‘with what amount of force should it be pulled?’ The intent is to create tissue manipulation that appears to be real.
In the past, procedure simulation was designed by engineers and then validated in retrospect by surgeons. But now we have flipped this process around: during design and development, the surgeons and engineers collaborate and really put the design through user studies. We invited faculty, expert surgeons, and novice surgeons, and had them perform the basic validation steps. We ask the experts, ‘Is this realistic? Is this going to be a useful training tool?’ And, importantly, we compare novice performance against expert performance in very close detail, because we want to see that the tool is able to distinguish between the two.
We not only look at which questions showed a clear difference between a novice user and an expert user, but also focus on the questions that failed to meet validation. We look at those very visual questions or exercises and think about how we could improve each particular question. As a result, we now have the ability to test augmented reality experiences, and this is where more and more procedure-specific simulation is going.
The emphasis here is that validation begins during development. The more advanced levels of validation – for example, correlating performance on this platform with performance on real tissue – happen immediately as part of the development process. The next steps we want to take include integrating global assessment; bringing sophisticated assessment into a procedure-specific environment; and full virtual reality integration.
Today everybody wants to manipulate that tissue. They want it to be real. And finally, the ultimate goal of what we do, and I think all of the folks at simulation want to do, is go beyond Procedure-Specific and into Patient-Specific rehearsal.”