American Journal of Neuroradiology


Research Article: Neurointervention

Teleproctoring for Neurovascular Procedures: Demonstration of Concept Using Optical See-Through Head-Mounted Display, Interactive Mixed Reality, and Virtual Space Sharing—A Critical Need Highlighted by the COVID-19 Pandemic

A.T. Rai, G. Deib, D. Smith and S. Boo
American Journal of Neuroradiology June 2021, 42 (6) 1109-1115; DOI: https://doi.org/10.3174/ajnr.A7066
A.T. Rai, G. Deib, and S. Boo: Department of Interventional Neuroradiology, Rockefeller Neuroscience Institute, West Virginia University School of Medicine, Morgantown, West Virginia.
D. Smith: West Virginia University Reed College of Media, Morgantown, West Virginia.

Abstract

BACKGROUND AND PURPOSE: Physician training and onsite proctoring are critical for safely introducing new biomedical devices, a process that has been disrupted by the pandemic. A teleproctoring concept using optical see-through head-mounted displays with a proctor's ability to see and, more important, virtually interact in the operator's visual field is presented.

MATERIALS AND METHODS: Test conditions were created for simulated proctoring using a bifurcation aneurysm flow model for WEB device deployment. The operator in the angiography suite wore a Magic Leap-1 optical see-through head-mounted display to livestream his or her FOV to a proctor's computer in an adjacent building. A Web-based application (Spatial) was used for the proctor to virtually interact in the operator's visual space. Tested elements included the quality of the livestream, communication, and the proctor's ability to interact in the operator's environment using mixed reality. A hotspot and a Wi-Fi-based network were tested.

RESULTS: The operator successfully livestreamed the angiography room environment and his FOV of the monitor to the remotely located proctor. The proctor communicated and guided the operator through the procedure over the optical see-through head-mounted displays, a process that was repeated several times. The proctor used mixed reality and virtual space sharing to successfully project images, annotations, and data in the operator's FOV for highlighting any device or procedural aspects. The livestream latency was 0.71 (SD, 0.03) seconds for Wi-Fi and 0.86 (SD, 0.3) seconds for the hotspot (P = .02). The livestream quality was subjectively better over the Wi-Fi.

CONCLUSIONS: New technologies using head-mounted displays and virtual space sharing could offer solutions applicable to remote proctoring in the neurointerventional space.

ABBREVIATIONS:

COVID-19 = coronavirus disease 2019
OST-HMD = optical see-through head-mounted display

Proctoring is a key component for the safe introduction of new devices, not only in the neuroendovascular space but in all procedural/surgical fields that depend on comprehensive physician training for device familiarity and use. For neuroendovascular devices, typical training programs have included didactic elements, hands-on training on flow models, and on-site proctoring of the initial cases. Interest in developing remote capabilities for physician training and procedural oversight is not new, and the concept of teleproctoring for surgical procedures has been explored for almost 2 decades.1-3 As technology evolves, specifically in the fields of near-eye optics, miniaturization, wearable tech with virtual spaces, and high-speed networking, the ability to remotely project expertise may become more mainstream.

The travel restrictions and social isolation imposed by the coronavirus disease 2019 (COVID-19) pandemic have hampered onsite training, bringing the need for reliable remote proctoring solutions to the forefront. There has been a renewed interest and surge in telehealth solutions and online collaboration. Many platforms, services, and companies have emerged focusing on virtual alternatives for in-person interactions, leading to a digital transformation of practice-related clinical medicine, education, and research. A remotely proctored cardiovascular procedure was recently demonstrated using video conferencing and an intraoperative telemonitoring robot.4 Other proprietary outfits offering similar capabilities have emerged on the market as well, but none are portable. These generally require hardware capital investment and a continued subscription model.

Previous studies have tested the potential of wearable technology for remote mentoring to project expertise and training across the world.5 A recent review covered the role of augmented reality in surgical education6 as a supplement to traditional training before proceeding to real cases. The natural evolution could be to disseminate this training remotely. Studies have evaluated optical see-through head-mounted displays (OST-HMDs) such as GLASS by Google (https://www.google.com/glass/start/) and HoloLens 2 by Microsoft (https://www.microsoft.com/en-us/hololens) for medical and surgical applications, including intraoperative use.7-11 Augmented and mixed-reality integration within Smartglasses (https://uploadvr.com/waveguides-smartglasses/) connected through networks and cloud-based servers has opened the possibility of using these wearable devices to project training across geographic boundaries.12-17 The portability of these devices makes them an attractive tool for teleproctoring applications; however, important challenges such as image stabilization need to be addressed before they can be used for mainstream remote proctoring. The pandemic has highlighted the need for easy-to-use tools and is spawning research and development to make them user-friendly. The goal of this investigation was to demonstrate a proof of concept that uses an optical headset for remote proctoring.

The objectives were to use a currently available device and network to test the concept of teleproctoring. The key requirement was the ability of the proctor to see the operator's FOV and virtually interact in that visual field by either pointing out key elements or displaying images, without disrupting it and with seamless bidirectional communication.

MATERIALS AND METHODS

Institutional review board approval was not required for this project, which did not involve human subjects.

Test Environment

A test environment was created using a flow model (Vascular Simulations) to mimic proctoring of a neurovascular bifurcation aneurysm case using a Woven EndoBridge intrasaccular device (WEB; MicroVention). The same biplane angiography suite used for actual cases was the setting of virtual proctoring. The “operator” is defined as the person doing the procedure, and the “proctor” as the one guiding the procedure. The proctor had extensive experience in WEB proctoring, and the operator had never used the device before. The operator was in sterile garb in the angiography suite, and the proctor was remotely located in an adjacent building just over 400 feet away in direct line (Fig 1). The ceiling- and wall-mounted cameras of the angiography suite captured the room environment (Fig 2). The fluoroscopy video feed from the C-arm captured the procedure.

FIG 1.

The setup. The left panel is an aerial view of the location of the experiment. The angiography suite (labeled) is in the hospital, which is separate from the proctor's location in the adjacent building (labeled). The right panel shows the operator's environment in the angiography suite (lower panel) with the operator wearing the OST-HMD and the spatial computer. The hotspot connection is demonstrated by the orange link between the OST-HMD and the laptop in the angiography suite and between the laptop and the computer in the proctor's office via the Zoom link. The direct Wi-Fi connection is demonstrated by the blue link between the OST-HMD worn by the operator with direct streaming to the computer in the proctor's office.

FIG 2.

Operator's environment: The upper panel shows the operator in the angiography suite as viewed from the ceiling- (A1) and the wall-mounted (A2) cameras. The flow model can be observed on the angiography table. The operator is wearing the Magic Leap-1 and performing the procedure while listening to any instructions from the proctor. The angiography monitor (A3) shows a roadmap image of the basilar apex aneurysm with the WEB device ready for deployment. Proctor's environment: The lower panel shows the proctor's perspective. The same image as on the angiography monitor (A3) is livestreamed through the Magic Leap-1 and displayed on the proctor's computer (B1). The proctor's desktop also shows the participants on the Zoom bridge; from top to bottom, these include the media engineer, the backup neurointerventionalist in the angiography suite, and the proctor (B2). The proctor can also see the operator's hands (B3) when he looks down and can advise on hand positioning for device deployment. The operator's avatar in the virtual room on the Spatial app is also displayed on the proctor's screen (B4) and follows the head movements of the operator, as in this case, when the operator is looking down at his hands.

Hardware

The primary hardware used for this experiment was the Magic Leap-1 (Magic Leap; https://www.magicleap.com/en-us). The device comprises a small wearable computer and an optical headset worn by the operator (Figs 1 and 2). For mixed-reality applications, the eyewear provides a 50-degree-wide FOV from 14.6 inches in front of the face to infinity, with 1.3 megapixels per eye at a refresh rate of 120 hertz, supporting almost 17 million colors. The eyewear is a fixed multifocal headset with 2 focal planes, and the near clipping plane for the device is 37.084 cm. Due to this dual focal plane design, the 2 optimal distances for viewing objects in the wearer's FOV are 50 cm and 1.5 m (https://developer.magicleap.com/en-us/learn/guides/design-comfort). The Magic Leap-1 is produced in 2 sizes based on the user's interpupillary distance: size 1, which was used for this procedure, is designed for wearers with an interpupillary distance of <65 mm, and size 2 for wearers with an interpupillary distance of >65 mm (https://www.magicleap.care/hc/en-us/articles/360008834511-Sizing). The device is also equipped with surround audio for communication. The headset allows the wearer to see clearly through the lenses and projects any augmented reality images directly onto the user's eyes using waveguide18 technology, a tool used in near-eye optics (Smartglasses). The third component of the system is a hand controller with haptic feedback, which could be used to move or zoom into mixed-reality images.

The operator livestreamed his FOV to a standard laptop (MacBook Pro; Apple) using the Magic Leap “device stream” function, which is a beta application available to developers only. A Zoom bridge (https://zoom.us) was established so that the livestream could be viewed from anywhere. For the purpose of this experiment, this was viewed by 2 other people, the proctor and a moderator. The moderator had expertise in media and information technology and acted as an observer and recorder of the experiment. The proctor used a standard desktop (Apple iMac) to view the livestream of the operator's FOV and communicate with the operator.

Networks

The experiment was tested with 2 networks. For the first (Fig 1), a hotspot was created in the angiography suite for the operator's OST-HMD to livestream to the laptop in the angiography control room. The hotspot used Wi-Fi Protected Access 2 (WPA2) to establish a connection between the OST-HMD and the laptop. The remote Zoom connection was established over a standard Wi-Fi network at the 2 locations, ie, the operator's angiography environment and the proctor's office in a separate-but-adjacent building. The Zoom connection supports Health Insurance Portability and Accountability Act compliance with end-to-end encryption (https://zoom.us/docs/doc/Zoom-hipaa.pdf).

For the second (Fig 1), a secure Wi-Fi network operating in both locations (operator and proctor) was used. A requirement for operating over Wi-Fi is that both the OST-HMD streaming the operator's FOV and the proctor's computer to which it is streaming be on the same network. The Wi-Fi network assigned static Internet Protocol addresses to the OST-HMD and the proctor's computer. The difference from the hotspot was that the operator could livestream directly to the proctor's computer without using a Zoom bridge.
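The same-network requirement described above can be sanity-checked programmatically before a session. A minimal sketch, assuming illustrative static IPv4 addresses and a /24 subnet mask (hypothetical values, not those used in the experiment):

```python
import ipaddress

def same_subnet(ip_a: str, ip_b: str, prefix: int = 24) -> bool:
    """Return True when two statically assigned IPv4 addresses fall in the
    same subnet -- the condition needed for the OST-HMD to stream directly
    to the proctor's computer over a shared Wi-Fi network."""
    net_a = ipaddress.ip_interface(f"{ip_a}/{prefix}").network
    net_b = ipaddress.ip_interface(f"{ip_b}/{prefix}").network
    return net_a == net_b

# Hypothetical addresses for the headset and the proctor's desktop.
print(same_subnet("10.20.30.41", "10.20.30.87"))  # → True (same /24)
print(same_subnet("10.20.30.41", "10.20.31.87"))  # → False (different /24)
```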

Virtual Space Sharing and Mixed Reality

Both the operator and the proctor securely logged into a virtual space-sharing platform app, Spatial (https://spatial.io). This third-party, free application enables multiple users to join a room from any device, including a mixed-reality head-mounted display such as the HoloLens or Magic Leap, a virtual reality head-mounted display such as the Quest (Oculus), or a Web browser on a desktop. The operator logged into Spatial through the app installed on the OST-HMD (Fig 3), and the proctor did so through the office desktop. The Spatial software is not Health Insurance Portability and Accountability Act–compliant. A virtual “room” was created that both the operator and the proctor could access. The purpose of space sharing was for the proctor to “drop” images in the operator's FOV. For example, these images could be screenshots of the angiography monitor annotating procedural aspects, a 3D model of the aneurysm highlighting a specific aspect, or data regarding the neurovascular device, such as measurements (Fig 3). The images could be placed anywhere in the operator's FOV so as not to hinder the primary view of the angiography monitor. Typically, if the angiography monitor was in the operator's 12 o'clock view (directly ahead when the operator is performing and facing the monitor), the virtual images were placed at 10 o'clock or 2 o'clock (Fig 3), and the operator could review them with slight head turning. The app created an avatar of the operator that followed the operator's head position and was displayed in the virtual room on the proctor's computer. The avatar was used to evaluate the full scope of possibilities for virtual space sharing; however, it was not critical for conducting the remote proctoring. The proctor could thus follow the operator's head movements via the avatar displayed in the virtual room (Fig 3).

FIG 3.

Virtual space sharing and mixed reality. The images in this figure are screenshots from the proctor's computer livestreamed from the operator in the angiography suite. The upper panel shows a view through the operator's headset as displayed on the proctor's computer. It shows the operator logging into the Spatial app (A1). The proctor has displayed an image regarding aneurysm size and morphology in the operator's visual field at about 10 o'clock (A2) using the Spatial app, and the same image with the operator's avatar looking at it in the virtual room created on the app. The middle panel shows a WEB-sizing chart (B1), a 3D image of the aneurysm flow model (B2), and an angiography image annotated by the proctor defining the WEB device (B3). The lower panel shows a 3D model of another aneurysm that the operator can manipulate and anchor in his FOV as a reference to be used when necessary.

Communication

The OST-HMD contains a speaker and a microphone through which the operator could communicate with the proctor. The proctor used his desktop microphone. The communication was conducted via the virtual room using the space-sharing app (Spatial).

Latency

The latency between the operator visualizing an object through the OST-HMD and its livestream to the proctor's computer was tested using a digital stopwatch displayed on the computer. The operator observed the stopwatch using the OST-HMD, which streamed it back to the computer that was displaying the running stopwatch. The live and streamed stopwatches were displayed side by side, and the computer screen was recorded for 20 seconds. The recording could then be viewed and paused to observe the duration displayed on the live and streamed stopwatches, allowing calculation of the time difference. Five measurements were made at 1, 5, 10, 15, and 20 seconds, and this was repeated 3 times each for both the hotspot and Wi-Fi networks, yielding 15 latency calculations each (Fig 4).
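Each latency value is simply the live stopwatch reading minus the streamed reading at the moment the recording is paused. A minimal sketch of that arithmetic, using made-up readings rather than the values actually recorded:

```python
import statistics

# Hypothetical (live, streamed) stopwatch readings in seconds at the
# 1, 5, 10, 15, and 20 s pause points of one recording; the values are
# illustrative only, not the measurements from the experiment.
paired_readings = [
    (1.00, 0.29), (5.00, 4.28), (10.00, 9.30), (15.00, 14.27), (20.00, 19.29),
]

# Latency at each pause point = live value - streamed value.
latencies = [live - streamed for live, streamed in paired_readings]

mean_latency = statistics.mean(latencies)
sd_latency = statistics.stdev(latencies)
print(f"mean latency: {mean_latency:.2f} s (SD, {sd_latency:.2f})")
```

In the experiment, 15 such values per network (3 recordings of 5 pause points each) were pooled to give the mean and SD reported in the Results.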

FIG 4.

Latency test. Screenshots from the latency test recordings for the Wi-Fi and hotspot networks at 1, 5, 10, 15, and 20 seconds are displayed. Three recordings were made for each network, yielding 15 measurements each. The left panel demonstrates livestreaming directly over the Wi-Fi network. These show a side-by-side display of the stopwatch running on the computer and a livestreamed image of the stopwatch through the OST-HMD (Magic Leap-1) next to it. The right panel shows the same display when using the hotspot. The difference in time between the computer stopwatch and its livestreamed image constitutes the latency in milliseconds. The graphic scheme next to the screenshots shows the different components in the stream that can impact the latency.

RESULTS

A flow model of a basilar apex aneurysm was used for the experiment. On the basis of the aneurysm measurements, an appropriately sized WEB device was selected. A Via-21 microcatheter (MicroVention) was placed in the aneurysm. The operator, who had never used the WEB device before, wore the OST-HMD. Although the eyewear device-calibration process includes a user-specific procedure to improve comfort and eye-tracking performance, this procedure was not deemed necessary because the applications used do not enable eye-tracking. The “spatial computer” was strapped over the shoulder (Fig 1). Apart from the operator, a vendor representative to manage the flow model and another neurointerventional physician were present in the angiography suite to monitor progress and act as a backup. A secure connection was established between the OST-HMD and the laptop in the angiography suite for device-streaming. Calibration of the OST-HMD using the angiography screen was performed to center the image in its FOV while the operator was wearing it and streaming to the laptop. Once the image was centered, the operator secured the OST-HMD in place using the adjustments on the headset. The Zoom link was successfully activated using the hotspot to stream the operator's FOV to the proctor's computer in the adjacent building and also to a media engineer in another city. The operator logged into the virtual space-sharing application (Spatial) through the OST-HMD, and the proctor and the media engineer did the same on their respective computers. The proctor could view the operator's avatar in the virtual room (Fig 3). The proctor communicated with the operator via the headset and proceeded to instruct the WEB deployment. The proctor could view the operator's FOV whether the operator was looking at the angiography monitor or the table (Fig 3). The proctor could ask the operator to look at his hands and advise on appropriate hand positioning and technique for device insertion and advancement.
During the procedure, the proctor also displayed a WEB-sizing chart, aneurysm morphology, and annotated angiography images in the operator's FOV in predesignated spaces (Fig 3). If required, the operator could manipulate the dropped images, ie, move or magnify using the hand controller. The operator deployed the WEB device successfully without detaching. The device was resheathed and redeployed several times with the proctor observing and advising.

This experiment was then repeated using the Wi-Fi network with the operator livestreaming directly from the headset to the proctor's computer. For this experiment, the virtual space-sharing application was not used, and the proctor instructed the operator on the basis of livestream communication through the OST-HMD.

The latency between visualizing an image through the headset and its display on the computer was 0.71 (SD, 0.03) seconds over the Wi-Fi and 0.86 (SD, 0.3) seconds over the hotspot (P = .02). Subjectively, the quality of the livestream was also smoother over the Wi-Fi compared with the hotspot, which had infrequent-but-noticeable pixelation of the images.
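For readers who want to examine the comparison from the summary statistics alone, an unequal-variance (Welch) t statistic can be computed from the reported means and SDs. The article does not state which test produced its P value, and a paired per-timepoint analysis would give a different result, so this is only an illustrative sketch:

```python
import math

def welch_t(mean1: float, sd1: float, n1: int,
            mean2: float, sd2: float, n2: int) -> tuple[float, float]:
    """Welch's t statistic and Welch-Satterthwaite degrees of freedom,
    computed from summary statistics of two independent samples."""
    se1, se2 = sd1 ** 2 / n1, sd2 ** 2 / n2
    t = (mean2 - mean1) / math.sqrt(se1 + se2)
    df = (se1 + se2) ** 2 / (se1 ** 2 / (n1 - 1) + se2 ** 2 / (n2 - 1))
    return t, df

# Wi-Fi: 0.71 (SD, 0.03) s; hotspot: 0.86 (SD, 0.3) s; 15 measurements each.
t, df = welch_t(0.71, 0.03, 15, 0.86, 0.3, 15)
print(f"t = {t:.2f}, df = {df:.1f}")
```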

DISCUSSION

The overarching goal of the experiment was to create a teleproctoring environment that mimicked reality. A proctor typically stands behind or next to the operator in the angiography suite. Guidance is in the form of verbal cues and occasionally pointing out elements on the monitor or the operating table. The operator's hands are occupied manipulating the catheters and devices while the eyes are focused on the screen and the ears tuned to the proctor. We wanted to maintain these relationships and define these concepts before choosing the setup. An OST-HMD seemed to satisfy these requirements by allowing the proctor to see what the operator was seeing, by allowing clear communication, and, most important, by using mixed reality to display images or annotations in the operator's FOV without obstructing it, while the operator maintained his posture, ie, hands on the device and eyes on the screen. The system as tested, using currently available off-the-shelf components, met these objectives. The only tool not commercially available was the livestreaming functionality of the Magic Leap, which is currently accessible only as a developer beta application. Livestreaming an environment, or even a first-person view, is by itself not novel, but the ability to interact in that visual space is the differentiating feature of this study. Even though we used the Magic Leap for this experiment, the capability of virtually interacting in the wearer's visual field is not limited to it, and other optical platforms allow similar interactions.

We identified several aspects that are suboptimal in the current configuration and constitute substantial areas for improvement. The first is image stabilization of the livestream. If the operator moves his or her head suddenly, the imaging stream becomes jerky, because the cameras and sensors on the head-mounted display are fixed to, and move with, the wearer's head. Some of this issue can be offset by practice and awareness of head movements, but the more reliable remedy is technologic. A discourse on the technical methodologies to improve stability of the livestream is beyond the scope of the current study, and the reader is directed to the US Patent and Trademark Office Web site, where considerable interest in this direction is evident (https://www.usa.gov/federal-agencies/u-s-patent-and-trademark-office).

The headset has to be light and comfortable to avoid strain and operator fatigue. The Magic Leap-1 solves this requirement by separating the computing platform from the OST-HMD to make the headset lighter. Because it is snug on the face, it cannot be worn by an operator wearing glasses and requires prescription inserts that click into the headset. There is extensive ongoing research and development into making these devices more portable and realistic. All of the leading tech companies have programs focused on mixed-reality, wearable, visual devices with iterations geared toward ease of use, seamless blending of augmented reality, smooth eye-tracking, and superfast image-processing.

The second key requirement for this concept to work is a smooth, continuous, jitter-free livestream from the operator to the proctor and an equally swift and seamless display of the proctor's mixed-reality interactions in the operator's visual field. We used a commercial hotspot and a standard internal Wi-Fi network for the experiment because our goal was to test the most easily available and least complex methods. This also sets the lowest bar for a network, and anything beyond it would be an improvement. The latency was lower over the Wi-Fi, as could be expected, but it needs to be much lower for mainstream applications. Different hospital systems have different networks, and it is generally cumbersome to interface with them. A medium that resides outside these networks may have better acceptability and standardization than one that has to interface with an institution's informatics outfit. Our latency over the Wi-Fi, from OST-HMD to the computer, was about 700 ms; interactive games have latencies that vary between 100 and 1000 ms.19 While we evaluated total latency, ie, from object visualization via the OST-HMD to its display on the remote computer, the latency pipeline comprises several elements (Fig 4), including the capacity of the hardware to process and transmit images; these individual contributions were not tested, which is a limitation of the experiment. Another limitation is that we did not use network packet measurements to test latency, which would be important to incorporate in future testing. Lowering latency has been an area of active research and development. Latencies up to 500 ms were considered adequate in 1 article evaluating the impact of latency on completing surgical tasks using remote robotic-assisted surgery.20 Our latency was around 700 ms over the Wi-Fi network, which can certainly be improved, but we were testing only the latency for streaming and not remote robotic operations. For teleproctoring applications, ultra-reliable and low-latency communications,21 content delivery networks,22 and many other technical innovations can improve performance and reliability.

We did not test security but used devices with end-to-end encryption; with all the advances in telehealth,23,24 we do not consider this a major limiting step. Widespread adoption, however, will require assessing and addressing data and privacy concerns for the regulatory authorities. The current interest in robotic neurovascular interventions25,26 offers the tantalizing possibility of merging remote proctoring with robotics to project expertise across distances. Other advances in hologram technologies27,28 can even put the proctor virtually in the operating room with an operator. Similarly, wearable skin-stretch hand or finger devices incorporating haptic feedback,29-31 with the ability to stream that feel to a proctor, add another dimension to what is possible by bringing technology together to build the concept of remote proctoring.

CONCLUSIONS

The investigation in this article explores and demonstrates the concept of remote proctoring using commercially available tools. The proctor could visualize the operator's visual field and, more critically, could virtually interact in that field by annotating images and dropping content without disrupting the workflow. Portability of equipment was a key requirement of the experiment, highlighted using an optical see-through display. We identified certain areas that require improvement and reviewed the literature showing that there is active work in key technologic areas that can enhance this concept. In the future, further incorporation of technologies aimed at distance-immersive interactions will make these experiences very realistic. Necessity is the mother of invention, and the current pandemic has exposed the need to conduct all aspects of our lives remotely, spawning industries and ventures geared toward solving that need.

Footnotes

  • Disclosures: Ansaar T. Rai—RELATED: Consulting Fee or Honorarium: Stryker Neurovascular, MicroVention, Cerenovus; UNRELATED: Consultancy: Stryker Neurovascular, MicroVention, Cerenovus; Payment for Lectures Including Service on Speakers Bureaus: MicroVention, Stryker Neurovascular; Payment for Development of Educational Presentations: Stryker Neurovascular, MicroVention. Gerard Deib—UNRELATED: Employment: West Virginia University. SoHyun Boo—UNRELATED: Consultancy: MicroVention, Comments: proctor for WEB embolization device providing consultation for new device users as mandated by the FDA; invited to educational speaking engagements.

References

  1. Ballantyne GH. Robotic surgery, telerobotic surgery, telepresence, and telementoring: review of early clinical results. Surg Endosc 2002;16:1389–402 doi:10.1007/s00464-001-8283-7 pmid:12140630
  2. Rassweiler J, Frede T. Robotics, telesurgery and telementoring–their position in modern urological laparoscopy. Arch Esp Urol 2002;55:610–28 pmid:12224160
  3. Smith CD, Skandalakis JE. Remote presence proctoring by using a wireless remote-control videoconferencing system. Surg Innov 2005;12:139–43 doi:10.1177/155335060501200212 pmid:16034503
  4. Goel SS, Greenbaum AB, Patel A, et al. Role of teleproctoring in challenging and innovative structural interventions amid the COVID-19 pandemic and beyond. JACC Cardiovasc Interv 2020;13:1945–48 doi:10.1016/j.jcin.2020.04.013 pmid:32819483
  5. Datta N, MacQueen IT, Schroeder AD, et al. Wearable technology for global surgical teleproctoring. J Surg Educ 2015;72:1290–95 doi:10.1016/j.jsurg.2015.07.004 pmid:26276303
  6. Williams MA, McVeigh J, Handa AI, et al. Augmented reality in surgical training: a systematic review. Postgrad Med J 2020;96:537–42 doi:10.1136/postgradmedj-2020-137600 pmid:32229513
  7. Hiranaka T, Nakanishi Y, Fujishiro T, et al. The use of Smart Glasses for surgical video streaming. Surg Innov 2017;24:151–54 doi:10.1177/1553350616685431 pmid:28068887
  8. García-Cruz E, Bretonnet A, Alcaraz A. Testing Smart Glasses in urology: clinical and surgical potential applications. Actas Urol Esp 2018;42:207–11 doi:10.1016/j.acuro.2017.06.007 pmid:29037757
  9. Carrera JF, Wang CC, Clark W, et al. A systematic review of the use of Google Glass in graduate medical education. J Grad Med Educ 2019;11:637–48 doi:10.4300/JGME-D-19-00148.1 pmid:31871562
  10. Kulak O, Drobysheva A, Wick N, et al. Smart Glasses as a surgical pathology grossing tool. Arch Pathol Lab Med 2020 Jul 27. [Epub ahead of print] doi:10.5858/arpa.2020-0090-OA pmid:32823276
  11. Mitrasinovic S, Camacho E, Trivedi N, et al. Clinical and surgical applications of Smart Glasses. Technol Health Care 2015;23:381–401 doi:10.3233/THC-150910 pmid:26409906
  12. Deib G, Johnson A, Unberath M, et al. Image guided percutaneous spine procedures using an optical see-through head mounted display: proof of concept and rationale. J Neurointerv Surg 2018;10:1187–91 doi:10.1136/neurintsurg-2017-013649 pmid:29848559
  13. García-Vázquez V, von Haxthausen F, Jäckle S, et al. Navigation and visualisation with HoloLens in endovascular aortic repair. Innov Surg Sci 2018;3:167–77 doi:10.1515/iss-2018-2001 pmid:31579781
  14. Hanna MG, Ahmed I, Nine J, et al. Augmented reality technology using Microsoft HoloLens in anatomic pathology. Arch Pathol Lab Med 2018;142:638–44 doi:10.5858/arpa.2017-0189-OA pmid:29384690
  15. Amini S, Kersten-Oertel M. Augmented reality mastectomy surgical planning prototype using the HoloLens template for healthcare technology letters. Healthc Technol Lett 2019;6:261–65 doi:10.1049/htl.2019.0091 pmid:32038868
  16. Mitsuno D, Ueda K, Hirota Y, et al. Effective application of mixed reality device HoloLens: simple manual alignment of surgical field and holograms. Plast Reconstr Surg 2019;143:647–51 doi:10.1097/PRS.0000000000005215 pmid:30688914
  17. Al Janabi HF, Aydin A, Palaneer S, et al. Effectiveness of the HoloLens mixed-reality headset in minimally invasive surgery: a simulation-based feasibility study. Surg Endosc 2020;34:1143–49 doi:10.1007/s00464-019-06862-3 pmid:31214807
  18. Xia H, Chen T, Hu C, et al. Recent advances of the polymer micro/nanofiber fluorescence waveguide. Polymers 2018;10:1086 doi:10.3390/polym10101086 pmid:30961011
  19. Kay R. Pragmatic Network Latency Engineering Fundamental Facts and Analysis. cPacket Networks, White Paper. 2009;1–31. https://docplayer.net/3464921-Pragmatic-network-latency-engineering-fundamental-facts-and-analysis.html. Accessed December 23, 2020
  20. Anvari M, Broderick T, Stein H, et al. The impact of latency on surgical precision and task completion during robotic-assisted remote telepresence surgery. Comput Aided Surg 2005;10:93–99 doi:10.3109/10929080500228654 pmid:16298920
  21. She C, Chen Z, Yang C, et al. Improving network availability of ultra-reliable and low-latency communications with multi-connectivity. IEEE Trans Commun 2018;66:5482–96 doi:10.1109/TCOMM.2018.2851244
  22. Farber DA, Greer RE, Swart AD, et al. Internet content delivery network. Google Patents; 2003. https://patents.google.com/patent/US6654807B2/en. Accessed December 23, 2020
  23. Parisien LR, Shin M, Constant M, et al. Telehealth utilization in response to the novel coronavirus (COVID-19) pandemic in orthopaedic surgery. J Am Acad Orthop Surg 2020;28:e487–92 doi:10.5435/JAAOS-D-20-00339 pmid:32459409
  24. Ben-Zeev D. The digital mental health genie is out of the bottle. Psychiatr Serv 2020;71:1212–13 doi:10.1176/appi.ps.202000306 pmid:32576123
  25. Nogueira RG, Sachdeva R, Al-Bayati AR, et al. Robotic assisted carotid artery stenting for the treatment of symptomatic carotid disease: technical feasibility and preliminary results. J Neurointerv Surg 2020;12:341–44 doi:10.1136/neurintsurg-2019-015754 pmid:32115435
  26. Mendes Pereira V, Cancelliere NM, Nicholson P, et al. First-in-human, robotic-assisted neuroendovascular intervention. J Neurointerv Surg 2020;12:338–40 doi:10.1136/neurintsurg-2019-015671.rep pmid:32132138
  27. Neves CA, Vaisbuch Y, Leuze C, et al. Application of holographic augmented reality for external approaches to the frontal sinus. Int Forum Allergy Rhinol 2020;10:920–25 doi:10.1002/alr.22546 pmid:32362076
  28. Moro C, Phelps C, Jones D, et al. Using holograms to enhance learning in health sciences and medicine. Med Sci Educ 2020;30:1351–52 doi:10.1007/s40670-020-01051-7 pmid:32382451
  29. Chossat JB, Chen DKY, Park YL, et al. Soft wearable skin-stretch device for haptic feedback using twisted and coiled polymer actuators. IEEE Trans Haptics 2019;12:521–32 doi:10.1109/TOH.2019.2943154 pmid:31562105
  30. Maisto M, Pacchierotti C, Chinello F, et al. Evaluation of wearable haptic systems for the fingers in augmented reality applications. IEEE Trans Haptics 2017;10:511–22 doi:10.1109/TOH.2017.2691328 pmid:28391207
  31. Cosco F, Garre C, Bruno F, et al. Visuo-haptic mixed reality with unobstructed tool-hand integration. IEEE Trans Vis Comput Graph 2013;19:159–72 doi:10.1109/TVCG.2012.107 pmid:22508901
  • Received September 15, 2020.
  • Accepted after revision January 11, 2021.
  • © 2021 by American Journal of Neuroradiology
Cite this article
A.T. Rai, G. Deib, D. Smith, S. Boo
Teleproctoring for Neurovascular Procedures: Demonstration of Concept Using Optical See-Through Head-Mounted Display, Interactive Mixed Reality, and Virtual Space Sharing—A Critical Need Highlighted by the COVID-19 Pandemic
American Journal of Neuroradiology Jun 2021, 42 (6) 1109-1115; DOI: 10.3174/ajnr.A7066
