
Abstract

Objectives

To date, no automated system can anticipate or detect a surgeon's request for an instrument during a procedure without requiring the surgeon to alter his or her behavior. We address this gap by developing a system that passes the correct surgical instruments as required by the main surgeon. The study uses a manipulator robot that automatically detects and analyzes explicit and implicit requests during surgery, emulating a human nurse handling surgical equipment. This work constitutes an important step in a larger research project that addresses further challenges related to operative efficiency and safety.

At the 2016 QF Annual Research Forum Conference, we would like to present our preliminary results from the execution of this project: first, a description of the methodology used to acquire surgical team interactions during several cardiothoracic procedures observed at the HMC Heart Hospital, followed by an analysis of the data acquired; second, experimental results of human-robot interaction tests in which the robot emulates the behavior of a human nurse.

Methods

To study the interactions in the operating room during surgical procedures, a model of analysis was structured and executed for several cardiothoracic operations captured with MS Kinect V2 sensors. The data obtained were meticulously studied, and the relevant observations were stored in a database to facilitate the analysis and comparison of events representing the different interactions among the surgical team.

Surgical Annotations

A sequence of manipulation is identified in time by two or three consecutive events. For the purpose of developing a structure of annotations, each record in the database can be divided into: information containing the time of occurrence of the event, counted from the beginning of the procedure; information describing how the manipulation event occurs; information related to the position of the instrument in the space around the patient; and a final, optional component with brief additional information that may help to explain the event and its relation to the flow of the surgical procedure.
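To make this structure concrete, the following is a minimal sketch in Python of how one such annotation record could be represented; the field names are illustrative assumptions, since the exact database schema is not published here.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AnnotationRecord:
    # 1.1 Time of occurrence (seconds from the start of the procedure)
    timestamp: float
    event_count: int              # running counter of events as they occur
    # 1.2 How the manipulation event occurs
    category: str                 # "Command", "Prediction" or "Action"
    event: str                    # e.g. "Start", "Ongoing", "End"
    role: str                     # surgical-team member involved
    # 1.3 Instrument and its position in the space around the patient
    instrument: str               # e.g. "Retractor", "Big Retractor"
    area: str                     # location label, as in Table 2
    hands: Optional[str] = None   # hand-holding detail, if applicable
    # 1.4 Optional brief additional information
    notes: Optional[str] = None
```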

Figure 1: Operating room at HMC Heart Hospital. (a) Kinect sensor location. (b) Surgical team and instrument locations for a cardiothoracic procedure as viewed from the sensor

1.1. Information containing the time of occurrence of the sequence

The timing information of a sequence is described by time stamps corresponding to the occurrence of its initial and final events. Some special sequences include an additional intermediate event that we call 'Ongoing'. Additionally, all events are counted as they occur, and the status of this counter is included as a field in the time-occurrence group.
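As a sketch of this timing group, again assuming Python and hypothetical field names, a sequence can be reduced to its initial and final time stamps, an optional 'Ongoing' event, and the counter status:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SequenceTiming:
    start: float                     # time stamp of the initial event
    end: float                       # time stamp of the final event
    ongoing: Optional[float] = None  # intermediate 'Ongoing' event, if any
    event_counter: int = 0           # status of the running event count

    @property
    def duration(self) -> float:
        """Elapsed time between the initial and final events."""
        return self.end - self.start
```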

1.2. Information describing how the manipulation sequence occurs

Careful observation of the routines performed in the operating room allowed us to identify different sequences of events, which can be classified into three general categories describing how the manipulation event occurs (a minimal encoding of these categories is sketched after the list):

Commands, which correspond to requests for instruments or operations addressed to the supporting staff. These requests can be discriminated as verbal, non-verbal, or a combination of both. Commands are not made exclusively by surgeons; sometimes the nurse handling instruments also requests actions from the circulating staff.

Predictions, made by the supporting staff when selecting and preparing instruments in advance in order to hand them to the surgeon (Fig. 3). These predictions can be classified as right or wrong depending on the surgeon's decision to accept or reject the instrument when it is offered. Sometimes an instrument whose use was predicted incorrectly at a given time is required by the surgeon shortly afterwards; we classify this kind of event as a partially wrong prediction (Fig. 4).

Actions, which correspond to independent or coupled sequences necessary for the flow of the surgical procedure; for instance, as illustrated by the start and end events of Fig. 2, the surgeon himself picks up an instrument from the Mayo tray. Detailed observation of all relevant actions is essential to understand how commands are delivered, which intermediate events are triggered in response, and how instruments are handled in space and time between their original and final locations.
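A minimal encoding of these three categories and their sub-classifications, assuming Python, could look as follows; the enum names are assumptions that simply mirror the labels used in the text:

```python
from enum import Enum

class Category(Enum):
    COMMAND = "Command"        # request addressed to the supporting staff
    PREDICTION = "Prediction"  # instrument prepared in advance by the nurse
    ACTION = "Action"          # independent or coupled procedural sequence

class CommandMode(Enum):
    VERBAL = "Verbal"
    NONVERBAL = "Nonverbal"
    BOTH = "Verbal+Nonverbal"

class PredictionOutcome(Enum):
    RIGHT = "Right"                      # instrument accepted by the surgeon
    WRONG = "Wrong"                      # instrument rejected
    PARTIALLY_WRONG = "Partially wrong"  # rejected now, required shortly after
```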

Table 1 summarizes the most common sequences of events found in the surgical procedures analyzed. The table also shows how the roles of the surgical team are distributed in relation to the events considered.

1.3. Information related to the instrument and its position in the space around the patient

The instrument is identified simply by its name. Several instances of the same instrument are used during surgery, but for annotation purposes we refer to all of them as if only one were available. In cases where some physical characteristic, such as size, differentiates an instrument from others of the same kind, a different instrument name is selectable; in Table 2, for example, a 'Retractor' is differentiated from a 'Big Retractor'. An instrument can be located at any of the positions listed under the label 'Area' in Table 2, as can be verified from Fig. 1. In case one of the members of the surgical team holds the instrument in one or both hands, the exact situation can be specified by selecting one of the options under the label 'Hands' in Table 2.
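As an illustration of how these location fields might be validated during annotation, the sketch below (in Python) checks a record against the 'Area' and 'Hands' options; the option values shown are placeholders, since Table 2 is not reproduced here:

```python
from typing import Optional

# Placeholder labels standing in for the actual options of Table 2.
VALID_AREAS = {"Mayo tray", "Back table", "Surgeon", "Nurse", "Patient"}
VALID_HANDS = {"Left", "Right", "Both"}

def validate_location(area: str, hands: Optional[str]) -> None:
    """Raise if a record's location fields fall outside the allowed options."""
    if area not in VALID_AREAS:
        raise ValueError(f"Unknown area: {area!r}")
    if hands is not None and hands not in VALID_HANDS:
        raise ValueError(f"Unknown hands option: {hands!r}")

validate_location("Mayo tray", "Right")  # passes silently
```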

1.4. Additional information

Any remarkable information related to the event can be included in this unlimited free-text field. For example, at some point the nurse may offer two instruments simultaneously to the surgeon; this is a rare situation, since the exchange is usually performed instrument by instrument.

Figure 2: Example of an action: the surgeon picks up an instrument directly

Figure 3: The nurse anticipates the use of one instrument

Figure 4: A partially wrong prediction: one of two instruments is accepted

Table 1: Description of events and relations to the roles of surgical staff

Table 2: Information about the location of the instrument

2. Annotation Software Tool

Based on libraries and information provided by Microsoft, we wrote code to use MS Kinect Studio to annotate the surgical procedures. Kinect Studio has several advantages over the other tools we evaluated, such as high precision in identifying the length and timing of a sequence and efficiency in the analysis of simultaneous streams of information. Figure 5 shows the screen presented by Kinect Studio when annotations are being made on the color stream of the surgical recording used as the example in the same illustration. The color stream is rendered at 30 fps, which means that on average a frame is available to annotate approximately every 0.033 s. The blue markers on the timeline are located at events of interest. On the left side of the screen, a set of fields corresponding to the information of interest is displayed to be filled in for each event of interest.
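Since the stream runs at a fixed 30 fps, the time stamp of any frame follows directly from its index, as the trivial sketch below (in Python) shows:

```python
FPS = 30.0  # rendering rate of the Kinect color stream

def frame_to_seconds(frame_index: int) -> float:
    """Time stamp, in seconds from the start of the recording, of a frame."""
    return frame_index / FPS

print(frame_to_seconds(900))  # frame 900 -> 30.0 s into the recording
```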

Figure 5: Annotations in Kinect Studio are introduced as metadata fields for the Timeline Marker

The collection of entries describing the interaction is written to an output text file that can be processed with conventional database software tools. The set of annotations obtained within MS Kinect Studio is exported as a text table whose record structure, presented in Fig. 6, contains the events and their relations to the roles of the surgical staff listed in Table 1, as well as the instrument information fields presented in Table 2.
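As a sketch of this post-processing step, the exported table can be loaded with conventional tooling; the snippet below assumes Python and a tab-separated file, although the actual delimiter and file name are assumptions:

```python
import csv

def load_annotations(path):
    """Read the exported text table into a list of record dictionaries."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f, delimiter="\t"))

records = load_annotations("surgery_annotations.txt")  # hypothetical file name
print(len(records), "records loaded")
```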

Figure 6: Structure of the annotation record obtained from the metadata associated with the timeline markers in Kinect Studio

Figure 7: Annotations database as processed in Excel for a 30-minute surgery

The annotations database obtained within Kinect Studio for the surgical procedure¹ used as the example in this report was exported to MS Excel for analysis. A partial view of this database is presented in Fig. 7, showing some of the first sequences stored. The colors in the third column differentiate events that belong to the same sequence; these colors are chosen arbitrarily, and once the final event of a sequence is identified, the same color becomes available to mark a new sequence. In total, the database of this example contains 259 records for a period of 30 minutes. Queries performed using database functionalities generate the results for predictions and commands illustrated in Fig. 8 and Fig. 9.
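The queries themselves are simple aggregations. The sketch below shows the kind of query behind Fig. 8(a) and Fig. 9(a), assuming Python/pandas and hypothetical column names ('Category', 'Outcome', 'Mode'); the actual analysis was performed in MS Excel:

```python
import pandas as pd

df = pd.read_csv("surgery_annotations.txt", sep="\t")  # hypothetical file name

predictions = df[df["Category"] == "Prediction"]
print(predictions["Outcome"].value_counts())  # right / wrong / partially wrong

commands = df[df["Category"] == "Command"]
print(commands["Mode"].value_counts())        # verbal / nonverbal / both
```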

Figure 8: Predictions. (a) Discrimination as right, wrong or partially wrong. (b) Instruments received by the surgeon. (c) Nurse hand holding instruments. (d) Instruments rejected by the surgeon. (e) Time elapsed while the prediction is performed.

Figure 9: Commands. (a) Discrimination as verbal, nonverbal or a combination of both. (b) Instruments requested. (c) Time elapsed in verbal commands. (d) Time elapsed in nonverbal commands. (e) Time elapsed while the instrument is requested.

Experimental Setup

As a preliminary step toward operating a manipulator robot as a robotic nurse, surgical personnel at the HMC Heart Hospital were asked to tie a mock knot, as illustrated in Fig. 10, on a synthetic model. During the performance of this task, a Kinect sensor captures body position and hand gestures as well as voice commands. This information is processed by a workstation running Windows-compatible software that controls the robot, which reacts by passing the requested surgical instrument to the subject so that the task can be completed.
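At a very high level, the control loop of this experiment can be sketched as below; the helper functions and objects (detect_request, sensor, robot.hand_over) are entirely hypothetical, since the actual Kinect processing and robot-control software are not described at this level of detail:

```python
import time
from typing import Optional

def detect_request(frame) -> Optional[str]:
    """Placeholder: fuse gesture and voice cues into an instrument name."""
    return None  # real gesture/speech recognition would go here

def run_trial(sensor, robot, tray):
    """Poll the sensor and hand over the requested instrument when detected."""
    while True:
        frame = sensor.read()                  # body, hand and audio streams
        instrument = detect_request(frame)
        if instrument is not None and instrument in tray:
            robot.hand_over(tray[instrument])  # pass it to the subject
        time.sleep(1 / 30)                     # poll at the sensor frame rate
```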

Figure 10: (a) Mock knot used as a preliminary test of interaction. (b) Robotic setup for experiments at the HMC Heart Hospital

The robot used is a Barrett robotic manipulator with seven degrees of freedom, as shown in Fig. 10. This FDA-approved robot is one of the most advanced robotic systems considered safe to operate around human subjects, since it has force-sensing capabilities that are used to avoid potential impacts.

Summary

As part of the development of an NPRP project that studies the feasibility of having robotic nurses in the operating room that can recognize verbal and nonverbal commands to deliver instruments from the tray to the hand of the surgeon, we have studied the interaction activities of surgical teams performing cardiothoracic procedures at the Heart Hospital in Doha. Using state-of-the-art sensor devices, we captured a wealth of information that has been carefully analyzed and annotated into databases. At the 2016 QF Annual Research Forum Conference, we would like to present our current findings as well as the results of human-robot interaction tests with a manipulator robot acting as a robotic nurse in the execution of a task that involves gesture/verbal recognition, recognition of the instrument, and safe delivery to the surgeon.

¹ Wedge lung resection: in this procedure the surgeon removes a small wedge-shaped piece of lung that contains the cancer, together with a margin of healthy tissue around it.
