Graphical User Interface For A Robotic Surgical System

DOCUMENT ID: US 11504191 B2
DATE PUBLISHED: 2022-11-22

INVENTOR INFORMATION
NAME                    CITY        STATE  COUNTRY
Mccloud; Jefferson C.   Providence  RI     US
Bacher; Daniel          Providence  RI     US

APPLICANT INFORMATION
NAME: Titan Medical Inc.
CITY: Toronto
COUNTRY: CA
TYPE: assignee

ASSIGNEE INFORMATION
NAME: TITAN MEDICAL INC.
CITY: Toronto
COUNTRY: CA
TYPE CODE: 03

APPLICATION NO: 15/780593
DATE FILED: 2017-01-19

DOMESTIC PRIORITY (CONTINUITY DATA)
US provisional application 62/280,334, filed 2016-01-19

US CLASS CURRENT: 1/1

CPC CURRENT
TYPE  CODE              DATE
CPCI  A61B 34/20        2016-02-01
CPCI  A61B 34/30        2016-02-01
CPCI  A61B 34/25        2016-02-01
CPCI  A61B 34/76        2016-02-01
CPCA  A61B 2017/00323   2013-01-01
CPCA  A61B 2017/00973   2013-01-01
CPCA  A61B 2017/00314   2013-01-01
CPCA  A61B 2034/742     2016-02-01
CPCA  A61B 2090/371     2016-02-01
Abstract
A method, apparatus and computer readable medium for schematically representing a spatial position of an instrument used in a robotic surgery system are disclosed. The instrument includes an end effector coupled to a positioning device for spatially positioning the end effector in a surgical workspace in response to input signals generated by movement of a hand controller of an input device in an input device workspace. The method involves causing a processor circuit to calculate a current three-dimensional spatial position of the instrument within the surgical workspace for current input signals received from the input device. The method also involves causing the processor circuit to generate display signals for displaying a graphical depiction of the surgical workspace on a display in communication with the processor circuit, the graphical depiction including a planar representation that includes an instrument movement region having a boundary indicating limitations to transverse movement of the instrument within the surgical workspace, and a two-dimensional projection of the current spatial position of the positioning device and the end effector onto the planar representation.
Background/Summary
CROSS-REFERENCE TO RELATED APPLICATION
(1) This application is a U.S. National Phase Application under 35 U.S.C. 371 of International Application No. PCT/CA2017/000011 filed on Jan. 19, 2017 and published as WO 2017/124177 A1 on Jul. 27, 2017. This application claims priority to U.S. Provisional Application No. 62/280,334, filed Jan. 19, 2016. The entire disclosures of all of the above applications are incorporated herein by reference.
BACKGROUND
1. Field
(1) This disclosure relates to surgical robotic systems and more particularly to schematically representing a spatial position of an instrument used in a robotic surgery system.
2. Description of Related Art
(2) In robotic surgery systems, a graphical user interface is generally used to provide alerts and notifications that provide the surgeon with sufficient information to perform surgical tasks. It is common to provide an image of the surgical site within a patient's body cavity that shows both the area where the surgical tasks are being performed and often a portion of the surgical instruments that are deployed to perform the tasks.
SUMMARY
(3) In accordance with one disclosed aspect there is provided a method for schematically representing a spatial position of an instrument used in a robotic surgery system, the instrument including an end effector coupled to a positioning device for spatially positioning the end effector in a surgical workspace in response to input signals generated by movement of a hand controller of an input device in an input device workspace. The method involves causing a processor circuit to calculate a current three-dimensional spatial position of the instrument within the surgical workspace for current input signals received from the input device. The method also involves causing the processor circuit to generate display signals for displaying a graphical depiction of the surgical workspace on a display in communication with the processor circuit, the graphical depiction including a planar representation that includes an instrument movement region having a boundary indicating limitations to transverse movement of the instrument within the surgical workspace, and a two-dimensional projection of the current spatial position of the positioning device and the end effector onto the planar representation.
(4) The end effector may be represented by an indicator and the positioning device may be represented by an area corresponding to two dimensional projected extents of at least a portion of the positioning device.
(5) The method may involve generating the boundary by defining a three-dimensional boundary within the surgical workspace, and generating a two-dimensional projection of the three-dimensional boundary onto the planar representation.
(6) The boundary of the instrument movement region may further include at least one keep-out zone identifying a further limitation to movement of the instrument within the surgical workspace.
(7) The keep-out zone may be defined based on at least one of input received from an operator at an input device and patient imaging data received at the processor circuit.
(8) The method may further involve, in response to a determination that the instrument is proximate the boundary of the instrument movement region, causing the processor circuit to display an active constraint indication at the boundary.
(9) The robotic surgery system may include a plurality of instruments within the surgical workspace and displaying the graphical depiction may involve displaying a graphical depiction for each of the plurality of instruments.
(10) Displaying the graphical depiction may include displaying the graphical depiction at a peripheral region of the display.
(11) The graphical depiction may further include an instrument depth range indicating limitations to axial movement of the instrument into the surgical workspace, an indicator representing a current depth of the end effector within the instrument depth range, and an input device depth range representing a portion of the instrument depth range that is accessible for a current mapping between the input device workspace and the surgical workspace.
(12) The method may further involve, in response to a determination that the end effector is proximate an end of the input device depth range, causing the processor circuit to display an active constraint indication.
(13) The method may involve receiving an enablement signal at the processor circuit, the enablement signal having an active state and an inactive state, the active state permitting movement of the instrument in response to the input signals and the inactive state inhibiting movement of the instrument to facilitate repositioning of the hand controller within the input device workspace. The method may further involve, in response to the enablement signal transitioning from the active to the inactive state, causing the processor circuit to generate display signals for displaying a current hand controller position indicator on the graphical depiction as offset from the two-dimensional projection of the current spatial position of the end effector, and in response to the enablement signal transitioning from the inactive to the active state, discontinuing display of the current hand controller position indicator.
(14) The input signals produced by the input device may include a rotation signal defining a current rotation of the hand controller, the rotation signal being operable to cause rotation of the end effector in the surgical workspace and the graphical depiction may include an instrument rotation range indicating limitations on rotational movement of the instrument, an indicator representing a current rotation of the end effector, and an input device rotation range representing a portion of the instrument rotation range that is accessible for a current mapping between the input device workspace and the surgical workspace.
(15) The method may involve receiving an enablement signal at the processor circuit, the enablement signal having an active state and an inactive state, the active state permitting movement of the instrument in response to the input signals and the inactive state inhibiting movement of the instrument to facilitate repositioning of the hand controller within the input device workspace. The method may further involve, in response to the enablement signal transitioning from the active to the inactive state, causing the processor circuit to generate display signals for displaying a current hand controller rotation indicator on the graphical depiction as offset from the indicator representing a current rotation of the end effector, and in response to the enablement signal transitioning from the inactive to the active state, discontinuing display of the current hand controller rotation indicator.
(16) In accordance with another disclosed aspect there is provided an apparatus for schematically representing a spatial position of an instrument used in a robotic surgery system, the instrument including an end effector coupled to a positioning device for spatially positioning the end effector in a surgical workspace in response to input signals generated by movement of a hand controller of an input device in an input device workspace. The apparatus includes a processor circuit operably configured to calculate a current three-dimensional spatial position of the instrument within the surgical workspace for current input signals received from the input device, and to generate display signals for displaying a graphical depiction of the surgical workspace on a display in communication with the processor circuit. The graphical depiction includes a planar representation including an instrument movement region having a boundary indicating limitations to transverse movement of the instrument within the surgical workspace, and a two-dimensional projection of the current spatial position of the positioning device and the end effector onto the planar representation.
(17) The processor circuit may be operably configured to display an active constraint indication at the boundary in response to a determination that the instrument is proximate the boundary of the instrument movement region.
(18) The graphical depiction may further include an instrument depth range indicating limitations to axial movement of the instrument into the surgical workspace, an indicator representing a current depth of the end effector within the instrument depth range, and an input device depth range representing a portion of the instrument depth range that is accessible for a current mapping between the input device workspace and the surgical workspace.
(19) The processor circuit may be operably configured to display an active constraint indication in response to a determination that the end effector is proximate an end of the input device depth range.
(20) The processor circuit may be operably configured to receive an enablement signal at the processor circuit, the enablement signal having an active state and an inactive state, the active state permitting movement of the instrument in response to the input signals and the inactive state inhibiting movement of the instrument to facilitate repositioning of the hand controller within the input device workspace. The processor circuit may be operably configured to, in response to the enablement signal transitioning from the active to the inactive state, generate display signals for displaying a current hand controller position indicator on the graphical depiction as offset from the two-dimensional projection of the current spatial position of the end effector, and in response to the enablement signal transitioning from the inactive to the active state, discontinue display of the current hand controller position indicator.
(21) The input signals produced by the input device include a rotation signal defining a current rotation of the hand controller, the rotation signal being operable to cause rotation of the end effector in the surgical workspace and the graphical depiction may include an instrument rotation range indicating limitations on rotational movement of the instrument, an indicator representing a current rotation of the end effector, and an input device rotation range representing a portion of the instrument rotation range that is accessible for a current mapping between the input device workspace and the surgical workspace.
(22) The processor circuit may be operably configured to receive an enablement signal at the processor circuit, the enablement signal having an active state and an inactive state, the active state permitting movement of the instrument in response to the input signals and the inactive state inhibiting movement of the instrument to facilitate repositioning of the hand controller within the input device workspace. The processor circuit may be operably configured to, in response to the enablement signal transitioning from the active to the inactive state, generate display signals for displaying a current hand controller rotation indicator on the graphical depiction as offset from the indicator representing a current rotation of the end effector, and in response to the enablement signal transitioning from the inactive to the active state, discontinue display of the current hand controller rotation indicator.
(23) In accordance with another disclosed aspect there is provided a computer readable medium encoded with codes for directing a processor circuit of a robotic surgery system to represent a spatial position of an instrument used in a robotic surgery system, the instrument including an end effector coupled to a positioning device for spatially positioning the end effector in a surgical workspace in response to input signals generated by movement of a hand controller of an input device in an input device workspace. The codes direct the processor circuit to calculate a current three-dimensional spatial position of the instrument within the surgical workspace for current input signals received from the input device, and to generate display signals for displaying a graphical depiction of the surgical workspace on a display in communication with the processor circuit, the graphical depiction including a planar representation including an instrument movement region having a boundary indicating limitations to transverse movement of the instrument within the surgical workspace, and a two-dimensional projection of the current spatial position of the positioning device and the end effector onto the planar representation.
(24) Other aspects and features will become apparent to those ordinarily skilled in the art upon review of the following description of specific disclosed embodiments in conjunction with the accompanying figures.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) In drawings which illustrate disclosed embodiments,
(2) FIG. 1 is a perspective view of a robotic surgery system;
(3) FIG. 2 is a perspective view of an instrument mount of the robotic surgery system shown in FIG. 1;
(4) FIG. 3 is a block diagram of the processor circuit elements of the robotic surgery system shown in FIG. 1;
(5) FIG. 4 is a flowchart depicting blocks of code for directing a workstation processor circuit shown in FIG. 3 to display a representation of a spatial position of an instrument;
(6) FIG. 5 is a schematic view of graphical depictions generated by the workstation processor circuit shown in FIG. 3;
(7) FIG. 6 is a schematic representation of a surgical workspace and an input device workspace for a right side instrument of the robotic surgery system shown in FIG. 1;
(8) FIG. 7 is a perspective view of a right input device of the robotic surgery system shown in FIG. 1;
(9) FIG. 8 is a perspective view of the right side instrument of the robotic surgery system shown in FIG. 1;
(10) FIG. 9 is a flowchart depicting blocks of code for directing the workstation processor circuit shown in FIG. 3 to execute a base setting process;
(11) FIG. 10 is a flowchart depicting blocks of code for directing the workstation processor circuit shown in FIG. 3 to execute a process for calculating the 3D spatial position of the instrument;
(12) FIG. 11 is a further perspective view of the right side instrument of the robotic surgery system shown in FIG. 1 in a bent condition;
(13) FIG. 12 is a perspective view of the left and right side instruments of the robotic surgery system shown in FIG. 1;
(14) FIG. 13 is a flowchart depicting blocks of code for directing the workstation processor circuit shown in FIG. 3 to execute a process for generating display signals for displaying the graphical depictions shown in FIG. 5; and
(15) FIG. 14 is a series of examples of graphical depictions for positions of the left and right instruments.
DETAILED DESCRIPTION
(16) Referring to FIG. 1, a robotic surgery system is shown generally at 100. The system 100 includes a workstation 102 and an instrument cart 104. The instrument cart 104 includes at least one instrument 106 mounted on a moveable instrument mount 108 that houses an instrument drive for manipulating the instrument. The workstation 102 includes an input device 110 for use by a surgeon for controlling the instrument 106 via the instrument drive to perform surgical operations on a patient. The input device 110 may be implemented using a haptic interface available from Force Dimension, of Switzerland, for example.
(17) The instrument 106 and instrument mount 108 are shown in more detail in FIG. 2. Referring to FIG. 2 the instrument 106 includes an insertion tube 202 that is inserted through an incision in a wall of the patient's abdomen or other body cavity to provide access to a surgical workspace within the body cavity. Once inserted into the surgical workspace, the instrument 106 is deployed as shown in the insert 206 in FIG. 2. In this embodiment the instrument 106 includes a right side instrument 208 comprising a positioning device 209 and an end effector 210 and a left side instrument 212 comprising a positioning device 213 and an end effector 214.
(18) In the embodiment shown the end effector 210 is a pair of forceps having opposing moveable gripper jaws 216 controlled by the instrument drive for grasping tissue, while the end effector 214 is a pair of curved dissecting forceps. The instrument 106 also includes a camera 218 deployed on an articulated arm 220 that is able to pan and tilt the camera. The camera 218 includes a pair of spaced apart image sensors 222 and 224 for producing stereoscopic views of the surgical workspace. The instruments 208 and 212 and the camera 218 are initially positioned in-line with the insertion tube 202 prior to insertion through the incision and then deployed as shown at 206.
(19) Referring back to FIG. 1, the input device 110 includes a right input device 116 and a left input device 118. The right input device 116 includes a right hand controller 112 and the left input device 118 includes a left hand controller 114, the hand controllers being mechanically coupled to the respective input devices. The workstation 102 also includes a workstation processor circuit 120, which is in communication with the input devices 116 and 118 and the hand controllers 112 and 114 for receiving input from a surgeon. The instrument cart 104 also includes an instrument processor circuit 130 for controlling the instrument 106. In this embodiment the instrument processor circuit 130 is in communication with the workstation processor circuit 120 via an interface cable 132 for transmitting signals between the workstation processor circuit 120 and the instrument processor circuit 130. In other embodiments communication between the workstation processor circuit 120 and the instrument processor circuit 130 may be wireless or via a computer network, and the workstation 102 may even be located remotely from the instrument cart 104.
(20) The workstation 102 also includes a display 122 in communication with the workstation processor circuit 120 for displaying real time images and/or other graphical depictions of the surgical workspace. In this embodiment where the camera 218 includes the pair of spaced apart image sensors 222 and 224, the display 122 is configured to provide separate 2D stereoscopic views of the surgical workspace that provide a 3D depth effect when viewed through suitable stereoscopic spectacles worn by the surgeon.
(21) The workstation 102 also includes a footswitch 134, which is actuable by the surgeon to provide an enablement signal to the workstation processor circuit 120. The enablement signal has an active state and an inactive state and in this embodiment depressing the footswitch 134 causes the enablement signal to change from the active state to the inactive state. The active state of the enablement signal permits movement of the instrument 106 in response to the input signals produced by the input device 110 while the inactive state inhibits movement of the instrument.
(22) The input signals are generated by the right and left input devices 116 and 118 in response to movement of the hand controllers 112 and 114 by a surgeon within an input device workspace. The positioning devices 209 and 213 associated with the instruments 208 and 212 spatially position the respective end effectors 210 and 214 in the surgical workspace in response to the input signals.
(23) A block diagram of the processor circuit elements of the system 100 is shown in FIG. 3. Referring to FIG. 3 the workstation processor circuit 120 includes a microprocessor 250. The workstation processor circuit 120 also includes a workstation memory 252, a USB interface 254, an input/output 256 and a motion control interface 258, all of which are in communication with the microprocessor 250. The input/output 256 includes an input for receiving an enablement signal from the footswitch 134 and an output for producing display signals for driving the display 122.
(24) In this embodiment the input device 110 communicates using a USB protocol and the USB interface 254 receives input signals produced by the input device in response to movements of the hand controllers 112 and 114. The microprocessor 250 processes the input signals based on a current mapping between the input device workspace and the surgical workspace and causes the motion control interface 258 to transmit control signals, which are conveyed to the instrument processor circuit 130 via the interface cable 132. The mapping may include a scale factor that scales movements in the input device workspace to produce scaled movements in the surgical workspace. For example, a 100 mm translation in the input device workspace may be scaled by a scale factor of 0.5 to produce a 50 mm movement in the surgical workspace for fine movement.
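The scaled mapping described above can be sketched as a simple elementwise scaling of hand-controller displacements. This is an illustrative sketch only; the function name is hypothetical, and the 0.5 scale factor is taken from the example in the text.

```python
def map_translation(hand_delta_mm, scale_factor):
    """Scale a hand-controller displacement (x, y, z, in mm) from the
    input device workspace into the surgical workspace.

    A scale factor below 1.0 gives fine motion: large hand movements
    produce proportionally smaller instrument movements."""
    return tuple(scale_factor * d for d in hand_delta_mm)

# The example from the text: a 100 mm translation scaled by 0.5
# yields a 50 mm movement in the surgical workspace.
print(map_translation((100.0, 0.0, 0.0), 0.5))  # (50.0, 0.0, 0.0)
```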
(25) The enablement signal produced by the footswitch 134 is received at the input/output 256. The workstation memory 252 includes a current buffer 320 and a previous buffer 340 including a plurality of stores for storing values associated with the control signals, as described later herein.
(26) The instrument processor circuit 130 includes a microprocessor 280, a memory 282, a communications interface 284, and a drive control interface 286, all of which are in communication with the microprocessor. The microprocessor 280 receives the control signals at the communications interface 284. The microprocessor 280 processes the control signals and causes the drive control interface 286 to produce drive signals for moving the instruments 208 and 212.
(27) The workstation processor circuit 120 thus acts as a master subsystem for receiving user input, while the instrument processor circuit 130 and instruments 208 and 212 act as a slave subsystem in responding to the user input.
(28) Referring to FIG. 4, a flowchart depicting blocks of code for directing the workstation processor circuit 120 to display a representation of a spatial position of the instrument 106 is shown generally at 300. The blocks generally represent codes that direct the microprocessor 250 to perform various functions. The actual code to implement each block may be written in any suitable program language, such as C, C++, C#, Java, OpenGL, and/or assembly code, for example.
(29) The process 300 begins at block 302, which directs the microprocessor 250 to determine whether the enablement signal is active. If the footswitch 134 is not currently being depressed then the instruments 208 and 212 are under control of the input device 110 and block 302 directs the microprocessor 250 to block 306. If the footswitch 134 is currently depressed then movement of the instrument 106 is inhibited and block 302 directs the microprocessor 250 to block 304 to execute a base setting process, which will be described later herein. Following the base setting process at block 304, the microprocessor 250 is directed to block 306.
(30) Block 306 directs the microprocessor 250 to calculate a current three-dimensional (3D) spatial position of the instruments 208 and 212 within the surgical workspace for current input signals received from the input device 110. Referring back to FIG. 2, the right side positioning device 209 and left side positioning device 213 of the instruments 208 and 212 are shown actuated to each assume a posture in accordance with the control signals received at the instrument processor circuit 130. Similarly the end effectors 210 and 214 are disposed in a posture in accordance with the control signals received at the instrument processor circuit 130. The 3D spatial position of the instruments 208 and 212 herein refers to 3D positions of each portion of instruments including the positioning devices 209 and 213 and the end effectors 210 and 214. Details of the calculation of these 3D positions in surgical workspace are described later herein.
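The position calculation at block 306 amounts to a kinematic chain computation: from the commanded joint values, locate every portion of the positioning device and the end effector. A toy planar two-link sketch illustrates the idea; the real positioning device has more degrees of freedom, and the link lengths and angles here are purely hypothetical.

```python
import math

def forward_kinematics(link_lengths, joint_angles):
    """Toy planar forward-kinematics sketch: given link lengths (mm)
    and joint angles (radians), return the (x, y) position of the base,
    each joint, and the tip. Block 306 performs an analogous chain
    computation in 3D to locate each portion of the instrument."""
    x = y = 0.0
    heading = 0.0
    points = [(x, y)]
    for length, angle in zip(link_lengths, joint_angles):
        heading += angle                 # joint angles accumulate along the chain
        x += length * math.cos(heading)
        y += length * math.sin(heading)
        points.append((x, y))
    return points

# Two 10 mm links with both joints straight: the tip lies at (20, 0).
print(forward_kinematics([10.0, 10.0], [0.0, 0.0])[-1])  # (20.0, 0.0)
```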
(31) Block 308 then directs the microprocessor 250 to generate display signals for displaying a graphical depiction of the surgical workspace on the display 122. Referring back to FIG. 1, a right graphical depiction 136 is displayed on the display 122 for the right side instrument 208. Similarly, a left graphical depiction 138 is displayed for the left side instrument 212. The graphical depictions 136 and 138 are displayed at a peripheral region of the display 122 to prevent obscuring a live view 140 of the surgical workspace also displayed on the display.
(32) Block 308 then directs the microprocessor 250 back to block 302 and the process 300 is repeated. In one embodiment the process 300 is repeated at a frequency of about 1 kHz.
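One pass of the FIG. 4 loop can be sketched as follows. The function and the stand-in callables are hypothetical; they mirror blocks 302-308 of the flowchart, with the callables standing in for the base setting process, the position calculation, and the display-signal generation.

```python
def process_300_iteration(enablement_active, base_setting, calc_position, render):
    """One iteration of the display-update process of FIG. 4 (a sketch).
    In the system this loop repeats at roughly 1 kHz."""
    if not enablement_active:        # block 302: is the enablement signal active?
        base_setting()               # block 304: base setting process
    position = calc_position()       # block 306: current 3D spatial position
    return render(position)          # block 308: generate display signals

# Minimal demonstration with stand-in callables: footswitch depressed,
# so the base setting process runs before the position update.
calls = []
out = process_300_iteration(
    enablement_active=False,
    base_setting=lambda: calls.append("base"),
    calc_position=lambda: (0.0, 0.0, 0.0),
    render=lambda p: f"display {p}",
)
print(calls, out)  # calls == ['base']
```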
(33) Referring to FIG. 5, the graphical depictions 136 and 138 are shown in larger scale in FIG. 5. The graphical depictions 136 and 138 are presented as a planar representation including a positioning device movement region 400 having a boundary 402 indicating limitations to transverse movements (translation and orientation) of the positioning device 209 within the surgical workspace. The graphical depictions 136 and 138 also include an end effector movement region 404 having a boundary 406 representing a further region within which the end effector 210 is able to move. Even when the positioning device 209 is at the boundary 402, the end effector 210 may still be able to turn outwardly to access the end effector movement region 404 beyond the positioning device movement region 400.
(34) The graphical depictions 136 and 138 also include a two-dimensional (2D) projection of the current spatial position of the respective positioning devices 209 and 213 and the end effectors 210 and 214. In the embodiment shown the end effectors 210 and 214 are represented by indicators 408 and 410 that indicate at least an approximate orientation of jaws of the respective end effectors. The positioning devices 209 and 213 are represented by areas 412 and 414 corresponding to 2D projected extents of portions of the positioning devices onto the planar representation.
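The 2D projection behind the areas 412 and 414 can be sketched as an orthographic projection that drops the depth coordinate. This is the simplest possible choice; the actual projection plane is a design decision of the depiction, and the function names here are hypothetical.

```python
def project_to_plane(point_3d, drop_axis=2):
    """Orthographically project a 3D workspace point onto the planar
    representation by dropping one coordinate (here the depth axis)."""
    return tuple(c for i, c in enumerate(point_3d) if i != drop_axis)

def project_positioning_device(points_3d):
    """Project each sampled point of the positioning device; the 2D
    extents of the result give the area drawn on the display."""
    return [project_to_plane(p) for p in points_3d]

print(project_to_plane((12.0, -3.0, 40.0)))  # (12.0, -3.0)
```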
(35) The graphical depictions 136 and 138 also each include an instrument depth range 416 and 418 indicating limitations to axial movement of the instruments into the surgical workspace. The limitations to axial movement of the instrument are represented by ends 424 and 426 of the instrument depth range 416 and ends 428 and 430 of the instrument depth range 418. The instrument depth ranges 416 and 418 also each include a current depth indicator 420 and 422 (in this case a circle) representing a current depth of the end effector within the respective instrument depth ranges. The current depth indicator 420 is closer to the end 424 of the range 416 than the current depth indicator 422, since the right side instrument 208 is located further into the surgical workspace than the left side instrument 212 (as shown in FIG. 2). The instrument depth range 416 also includes an input device depth range 432 (shown as a hatched area) representing a portion of the instrument depth range 416 that is accessible for a current mapping between the input device workspace and the surgical workspace. Similarly, the instrument depth range 418 includes an input device depth range 434 (shown as a hatched area) representing a portion of the instrument depth range 418 that is accessible for a current mapping between the input device workspace and the surgical workspace.
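The depth widgets above reduce to two small computations: placing the current depth indicator as a fraction along the instrument depth range, and intersecting the scaled input-device span with that range to obtain the hatched accessible sub-range. The sketch below assumes this construction; the names, the centring on a mapping centre, and the numbers are illustrative, not taken from the patent. The same normalization applies equally to the rotation ranges described next.

```python
def depth_indicator_fraction(depth, depth_range):
    """Position of the current-depth indicator as a fraction along the
    instrument depth range, clamped to [0, 1]."""
    lo, hi = depth_range
    return min(1.0, max(0.0, (depth - lo) / (hi - lo)))

def accessible_subrange(depth_range, centre, scale_factor, input_span):
    """Portion of the instrument depth range reachable under the current
    mapping: the input-device span, scaled and centred on the mapping
    centre, intersected with the instrument range."""
    lo, hi = depth_range
    half = 0.5 * scale_factor * input_span
    return (max(lo, centre - half), min(hi, centre + half))

print(depth_indicator_fraction(75.0, (0.0, 100.0)))        # 0.75
print(accessible_subrange((0.0, 100.0), 90.0, 0.5, 60.0))  # (75.0, 100.0)
```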
(36) The input signals produced by the input device 110 also include rotation signals defining a current rotation of each of the hand controllers 112 and 114. The rotation signals are used by the workstation processor circuit 120 to produce control signals for causing rotation of the respective end effectors 210 and 214 in the surgical workspace. The graphical depictions 136 and 138 shown in FIG. 5 also include instrument rotation ranges 440 and 442 indicating limitations on rotational movements of the end effectors 210 and 214. In the graphical depictions 136 and 138 a rotation indicator represents the current rotation of the end effectors 210 and 214 with respect to a reference, which in FIG. 5 is taken as a vertical line 444 (shown only for the right graphical depiction 136 in FIG. 5). The graphical depictions 136 and 138 further display input device rotation ranges 446 and 448 (shown as hatched areas) representing a portion of the respective instrument rotation ranges 440 and 442 that are accessible for a current mapping between the input device workspace and the surgical workspace.
(37) As disclosed above, blocks 302-308 of the process 300 are repeated at a frequency of about 1 kHz, thus updating the graphical depictions 136 and 138 to provide the surgeon with a near real-time display of the spatial position of the instruments 208 and 212. In the embodiment shown in FIGS. 1-4 the instrument 106 includes a pair of instruments 208 and 212; however, in other embodiments the system 100 may have a single instrument, and only a single graphical depiction would thus be displayed. Alternatively, where more than two instruments are used, a graphical depiction may be displayed for each instrument.
(38) Referring to FIG. 6, a schematic representation of the surgical workspace and the input device workspace for the right side instrument 208 as viewed from above the input device 116 is shown at 480. The hand controller 112 of the input device 116 is moveable within a hemispherical 3D volume and the corresponding input device workspace is shown in FIG. 6 as a horizontally hatched semi-circular area 482. In FIG. 6, the input device workspace 482 is shown superimposed on the surgical workspace 484, which is represented by vertically hatched areas accessible by the right side instrument 208. The surgical workspace 484 is also a 3D volume and has a boundary surface 485 defining constraints to movement of the positioning device 209. A point 486 represents a point of insertion of the insertion tube 202 through the wall of the patient's body cavity.
(39) The boundary surface 485 in FIG. 6 and the planar representation of the boundary 402 in FIG. 5 represent limitations to movement of the instrument 208 and end effector 210 based on the extent of the input device workspace 482 within the surgical workspace 484. Additional limitations may be placed on movement of the instrument 208 and end effector 210 due to the patient's anatomy. For example, portions of other organs, vasculature, and other sensitive tissues may also limit movement of the instrument 208 and end effector 210 within the surgical workspace 484. In another embodiment, one or more keep-out zones 498 may be designated within the surgical workspace 484 and the boundary surface 485 may be generated to include these keep-out zones. The keep-out zones 498 are used to further limit movement of the instrument 208 and end effector 210 within the input device workspace 482. Designation of the keep-out zones 498 may be in accordance with input from the surgeon, which may be received at the input device 110. Alternatively, the keep-out zones 498 may be designated in accordance with imaging or other patient data that is uploaded to the workstation processor circuit 120. For example, if the patient has had imaging such as magnetic resonance imaging (MRI) or a CT scan, patient-specific data relating to the surgical site may be used to define the one or more keep-out zones 498. Subsequently, when generating the graphical depictions 136 and 138, the keep-out zones 498 would be included in the definitions of the boundary 402 as an additional zone 436 within the boundary.
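The patent does not specify how a keep-out zone 498 is represented geometrically or how a violating command is handled. The sketch below is one illustrative possibility, assuming spherical zones (each a center point and radius, in the fixed slave reference frame) and a simple hold-last-permitted-position policy; the function names and the sphere model are assumptions, not part of the disclosure.

```python
import math

# Hypothetical keep-out zone check. Each zone is (center_xyz, radius),
# with coordinates in the same units as the commanded positions (e.g. mm).

def in_keep_out(p, zones):
    """Return True if point p = (x, y, z) lies inside any keep-out zone."""
    return any(math.dist(p, center) < radius for center, radius in zones)

def clamp_command(p, zones, last_permitted):
    """One possible policy: reject a commanded position that falls inside
    a keep-out zone and hold the last permitted position instead."""
    return last_permitted if in_keep_out(p, zones) else p
```

A commanded end effector position would be passed through `clamp_command` each control tick before drive signals are produced, so the instrument never enters a designated zone.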
(40) Movements of the hand controller 112 of the input device 116 are able to cause the positioning device 209 of the instrument 208 to move within the surgical workspace 484, while the end effector 210 is capable of extending outwardly to reach into a region 488 for the current mapping. The region 488 represents an additional portion of the surgical workspace that can be accessed by the end effector 210 and has a 3D boundary surface 489.
(41) The right graphical depiction 136 shown in FIG. 5 generally corresponds to a transverse cross-section taken along the line 5-5, where the intersection of the line 5-5 defines the boundary 402 of the positioning device movement region 400 and the boundary 406 of the end effector movement region 404 as shown in FIG. 5. The representation in FIG. 6 is shown for the right input device 116 and right hand controller 112 controlling the right side instrument 208. The left input device 118, left hand controller 114, and left side instrument 212 have been omitted for the sake of clarity but may be similarly represented.
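The graphical depiction is a two-dimensional projection of the instrument's 3D position onto the transverse plane of the cross-section. A minimal sketch of that projection step, assuming the insertion axis is z and that a scale factor and display origin map workspace units to display units (both assumptions for illustration):

```python
def project_transverse(p, scale=1.0, origin=(0.0, 0.0)):
    """Project a 3D instrument position p = (x, y, z) onto the transverse
    plane of the graphical depiction by dropping the insertion-axis
    coordinate (assumed here to be z) and mapping to display coordinates."""
    x, y, _z = p
    return (origin[0] + scale * x, origin[1] + scale * y)
```

Each control tick, the current positions of the positioning device and end effector would be passed through such a projection to place the indicators within the instrument movement region on the display.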
(42) Changes in the mapping between the input signals produced by the input device 110 and the control signals produced by the workstation processor circuit 120 at the motion control interface 258 may be made when the footswitch 134 is depressed, allowing the hand controllers 112 and 114 to be repositioned to access a different portion of the surgical workspace 484, or in response to a change of scale factor, allowing a larger or smaller proportion of the surgical workspace to be accessed.
(43) Input Device
(44) The right input device 116 is shown in greater detail in FIG. 7. For simplicity, only the right input device 116 will be further described, it being understood that the left input device 118 operates in the same way. Referring to FIG. 7, the input device 116 is supported on a base 500 and includes arms 502, 504, and 506. The right hand controller 112 is mounted to the arms 502-506 to permit positioning and rotation about orthogonal axes x.sub.1, y.sub.1 and z.sub.1 of a Cartesian reference frame. The Cartesian reference frame has an origin at a point midway along a body of the hand controller 112 and the location of the origin defines the hand controller position 508 (i.e. at the origin). In this embodiment, the hand controller 112 is mounted on a gimbal mount 510. The arms 502-506 confine movements of the hand controller 112 and hence the hand controller position 508 to within the hemispherical input device workspace, as shown in FIG. 6. In one embodiment the input device 116 may also be configured to generate haptic forces for providing haptic feedback to the hand controller 112 through the arms 502-506.
(45) The input device 116 has sensors (not shown) that sense the position of each of the arms 502-506 and the rotation of the hand controller 112 about each of the x.sub.1, y.sub.1 and z.sub.1 axes, and produce signals representing the position of the hand controller in the workspace and the rotational orientation of the hand controller relative to an input device Cartesian reference frame x.sub.r, y.sub.r, z.sub.r. In this embodiment, the position and orientation signals are transmitted as input signals via a USB connection 518 to the USB interface 254 of the workstation processor circuit 120.
(46) In this embodiment, the gimbal mount 510 has a pin 512 extending downwardly from the mount and the base 500 includes a calibration opening 514 for receiving the pin. When the pin 512 is received in the opening 514 the input device 116 is located in a calibration position that is defined relative to the input device Cartesian reference frame x.sub.r, y.sub.r, z.sub.r. The input device reference frame has an x.sub.r-z.sub.r plane parallel to the base 500 and a y.sub.r axis perpendicular to the base. The z.sub.r axis is parallel to the base 500 and is coincident with an axis 516 passing centrally through the input device 116.
(47) The input device 116 produces current hand controller position signals and current hand controller orientation signals that represent the current position and orientation of the hand controller 112. The signals may be represented by a current hand controller position vector and a current hand controller rotation matrix. The current hand controller position vector is given by:
(48) {right arrow over (P)}.sub.MCURR={x.sub.1 y.sub.1 z.sub.1},
(49) where x.sub.1, y.sub.1, and z.sub.1 represent coordinates of the hand controller position 508 (i.e. the origin of the coordinate system x.sub.1, y.sub.1, z.sub.1) relative to the input device reference frame x.sub.r, y.sub.r, z.sub.r. The current hand controller rotation matrix is given by:
(50) R.sub.MCURR=[x.sub.1·x.sub.r y.sub.1·x.sub.r z.sub.1·x.sub.r; x.sub.1·y.sub.r y.sub.1·y.sub.r z.sub.1·y.sub.r; x.sub.1·z.sub.r y.sub.1·z.sub.r z.sub.1·z.sub.r],
where the columns of the matrix represent the axes of the hand controller reference frame x.sub.1, y.sub.1, z.sub.1 relative to the input device reference frame x.sub.r, y.sub.r, z.sub.r.
(51) The matrix R.sub.MCURR thus defines the current rotational orientation of the hand controller 112 relative to the x.sub.r, y.sub.r, and z.sub.r fixed master reference frame. The current hand controller position vector {right arrow over (P)}.sub.MCURR and current hand controller rotation matrix R.sub.MCURR are transmitted as current hand controller position and current hand controller orientation signals via the USB connection 518 to the USB interface 254 of the workstation processor circuit 120. The workstation processor circuit 120 stores the three values representing the current hand controller position vector {right arrow over (P)}.sub.MCURR in a store 322 and the nine values representing the current hand controller rotation matrix R.sub.MCURR in a store 324 of the current buffer 320 of workstation memory 252.
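The current buffer thus holds three values for the position vector and nine for the rotation matrix. A minimal sketch of that storage layout, plus a sanity check that the nine stored values form a valid rotation (orthonormal) matrix; the row-major ordering and the class name are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class PoseBuffer:
    """Sketch of the current buffer layout: three values for the hand
    controller position vector (store 322) and nine for its rotation
    matrix (store 324), stored row-major here by assumption."""
    p_mcurr: tuple = (0.0, 0.0, 0.0)
    r_mcurr: tuple = (1.0, 0.0, 0.0,   # identity rotation as a default
                      0.0, 1.0, 0.0,
                      0.0, 0.0, 1.0)

def is_rotation(r, tol=1e-6):
    """Check that the nine stored values form an orthonormal matrix,
    i.e. R Rᵀ = I, by testing pairwise row dot products."""
    rows = [r[0:3], r[3:6], r[6:9]]
    for i in range(3):
        for j in range(3):
            dot = sum(rows[i][k] * rows[j][k] for k in range(3))
            if abs(dot - (1.0 if i == j else 0.0)) > tol:
                return False
    return True
```

Such a check could guard against corrupted orientation signals before drive signals are derived from them.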
(52) Instrument
(53) The right side instrument 208 is shown in greater detail in FIG. 8. Referring to FIG. 8, the positioning device 209 is configured to position the end effector 210 within the surgical workspace by activating various drivers in the instrument mount 108 in response to the drive signals produced by the drive control interface 286 of the instrument processor circuit 130, which are in turn produced in response to the control signals received at the communications interface 284 from the workstation processor circuit 120. The drive signals are produced based on the current hand controller position vector {right arrow over (P)}.sub.MCURR and current hand controller rotation matrix R.sub.MCURR stored in the stores 322 and 324 of the current buffer 320 in the workstation memory 252.
(54) The instrument 208 includes a plurality of the identical “vertebra” 550 as described in U.S. Patent Publication No. 2016/0143633, which is incorporated herein by reference. The vertebra 550 are operable to move with respect to each other when control wires passing through the vertebra are extended or retracted to cause movements of the positioning device 209. The position and orientation of the end effector 210 is defined relative to a fixed slave reference frame having axes x.sub.v, y.sub.v and z.sub.v, which intersect at a point referred to as the fixed slave reference position 552. The fixed slave reference position 552 lies on a longitudinal axis 554 of the instrument 208 and is contained in a plane perpendicular to the longitudinal axis and containing a distal edge of the insertion tube 202.
(55) In the embodiment shown, the end effector 210 includes gripper jaws 216, which may be positioned and oriented within an end effector workspace. A tip of the gripper jaws 216 may be designated as an end effector position 560 defined as the origin of an end effector Cartesian reference frame x.sub.2, y.sub.2, z.sub.2. The end effector position 560 is defined relative to the slave reference position 552 and the end effector may be positioned and orientated relative to the fixed slave reference frame x.sub.v, y.sub.v, z.sub.v, for causing movement of the positioning device 209 and/or the end effector 210.
(56) The current hand controller position signal {right arrow over (P)}.sub.MCURR and current hand controller orientation signal R.sub.MCURR cause movement of the end effector 210 of the instrument 208 to a new end effector position and a desired new end effector orientation, which are represented by a new end effector position vector {right arrow over (P)}.sub.EENEW:
(57) {right arrow over (P)}.sub.EENEW={x.sub.2 y.sub.2 z.sub.2},
where x.sub.2, y.sub.2, and z.sub.2 represent coordinates of the end effector position 560 within the end effector workspace relative to the x.sub.v, y.sub.v, z.sub.v fixed slave reference frame, and a 3×3 end effector rotation matrix R.sub.EENEW:
(58) R.sub.EENEW=[x.sub.2·x.sub.v y.sub.2·x.sub.v z.sub.2·x.sub.v; x.sub.2·y.sub.v y.sub.2·y.sub.v z.sub.2·y.sub.v; x.sub.2·z.sub.v y.sub.2·z.sub.v z.sub.2·z.sub.v],
where the columns of the R.sub.EENEW matrix represent the axes of the end effector reference frame x.sub.2, y.sub.2, and z.sub.2 written in the fixed slave reference frame x.sub.v, y.sub.v, and z.sub.v. R.sub.EENEW thus defines a new orientation of the end effector 210 in the end effector workspace, relative to the x.sub.v, y.sub.v, and z.sub.v fixed slave reference frame. Values for the vector {right arrow over (P)}.sub.EENEW and rotation matrix R.sub.EENEW are calculated as described later herein and stored in stores 330 and 332 of the current buffer 320 of the workstation memory 252 respectively.
Base Setting Process
(59) When the system 100 initially starts up, the workstation processor circuit 120 sets a master base position vector {right arrow over (P)}.sub.MBASE equal to the current hand controller position vector {right arrow over (P)}.sub.MCURR and sets a definable master base rotation matrix R.sub.MBASE equal to the current hand controller rotation matrix R.sub.MCURR. At startup the following operations are therefore performed:
{right arrow over (P)}.sub.MBASE={right arrow over (P)}.sub.MCURR, and
R.sub.MBASE=R.sub.MCURR.
(60) The hand controller 112 reference frame represented by the axes x.sub.1, y.sub.1, and z.sub.1 shown in FIG. 7 and the definable master base reference frame represented by the axes x.sub.mb, y.sub.mb, and z.sub.mb (also shown in FIG. 7) thus coincide at startup of the system 100. Referring back to FIG. 3, the workstation processor circuit 120 stores the values representing the definable master base position vector {right arrow over (P)}.sub.MBASE and the definable master base rotation matrix R.sub.MBASE in the stores 326 and 328 of the current buffer 320 of the workstation memory 252.
(61) At startup of the system 100 there would be no previously stored values for the new end effector position vector {right arrow over (P)}.sub.EENEW and the new end effector rotation matrix R.sub.EENEW, and in one embodiment these values are set to home configuration values. A home configuration may be defined that produces a generally straight positioning device 209 of the instrument 208 as shown in FIG. 8, and the values of {right arrow over (P)}.sub.EENEW and R.sub.EENEW for the home configuration may be preconfigured at initialization. On startup of the system 100 the workstation processor circuit 120 also causes a definable end effector base position vector {right arrow over (P)}.sub.EEBASE and a definable end effector base rotation matrix R.sub.EEBASE to be set to the home configuration values of {right arrow over (P)}.sub.EENEW and R.sub.EENEW. In other embodiments, the home configuration may define configuration variables that produce different bent, or both straight and bent, instrument positioning device poses. At startup the following operations are therefore performed:
{right arrow over (P)}.sub.EEBASE={right arrow over (P)}.sub.EENEW, and
R.sub.EEBASE=R.sub.EENEW.
(62) The end effector reference frame represented by the axes x.sub.2, y.sub.2, and z.sub.2 shown in FIG. 8 and the definable slave base reference frame represented by the axes x.sub.sb, y.sub.sb, and z.sub.sb thus coincide at startup of the system 100. Referring back to FIG. 3, the workstation processor circuit 120 stores the values x.sub.sb, y.sub.sb, and z.sub.sb representing the definable slave base position vector {right arrow over (P)}.sub.EEBASE in a store 334 and stores the values representing the definable slave base rotation matrix R.sub.EEBASE in a store 336 of the current buffer 320 of the workstation memory 252.
(63) The base setting process (block 304 of the process 300 shown in FIG. 4) is executed asynchronously when the enablement signal produced by the footswitch 134 transitions from the active state to the inactive state at block 302. Further details of the base setting process 304 are shown as a process flowchart in FIG. 9. The base setting process 304 begins at block 600, which directs the microprocessor 250 of the workstation processor circuit 120 to inhibit further movement of the instrument 208 by transmitting control signals via the motion control interface 258 that cause the instrument processor circuit 130 to produce drive signals at the drive control interface 286 that do not cause further movements of the instrument 208. In one embodiment the microprocessor 250 maintains the same control signals, and since the drive signals produced by the drive control interface 286 are produced in response to the control signals, the drive signals will also be maintained at the values that were active at the time the footswitch 134 was depressed. The instrument 208 will thus remain immobile at a current position and orientation.
(64) Block 602 then directs the microprocessor 250 to determine whether the enablement signal has transitioned from the inactive state to the active state again. If the enablement signal remains in the inactive state, block 602 directs the microprocessor 250 to repeat block 602 and the process 304 is thus effectively suspended while the enablement signal is in the inactive state. When the enablement signal transitions from the inactive state to the active state, block 602 directs the microprocessor 250 to block 604.
(65) Block 604 directs the microprocessor 250 to set new base positions and orientations for the hand controller 112 and end effector 210 respectively. While the footswitch 134 is depressed the surgeon may move the hand controller 112 to a new location to relocate the input device workspace relative to the surgical workspace. When the enablement signal transitions to the active state, block 604 directs the microprocessor 250 to cause current values of the current hand controller position vector {right arrow over (P)}.sub.MCURR and the hand controller rotation matrix R.sub.MCURR to be stored in locations 326 and 328 of the current buffer 320 of the workstation memory 252 as new values for the master base position vector {right arrow over (P)}.sub.MBASE and master base rotation matrix R.sub.MBASE. Block 604 also directs the microprocessor 250 to cause current values for the end effector position signal {right arrow over (P)}.sub.EENEW and the end effector orientation signal R.sub.EENEW to be stored in stores 334 and 336 of the current buffer 320 as the definable end effector base position vector {right arrow over (P)}.sub.EEBASE and definable end effector base rotation matrix R.sub.EEBASE.
(66) The base setting process 304 then continues at block 606, which directs the microprocessor 250 to permit further movement of the instrument 208 while the enablement signal produced by the footswitch 134 remains active.
(67) The base setting process 304 thus allows the instrument 208 to be immobilized by depressing the footswitch 134 while the hand controller 112 of the input device 116 is moved to a new location. When the footswitch 134 is released, control of the instrument 208 resumes at the new position of the hand controller 112. The hand controller 112 may thus be repositioned as desired while the instrument remains immobile, preventing unintended movements that may inflict injury to the patient.
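The base setting process (blocks 600-606) can be sketched as a small state routine: inhibit motion, wait for the enablement signal to go active again, latch the current values as the new base values, then re-enable motion. The dictionary stand-in for the current buffer 320 and the generator of enablement samples are illustrative assumptions, not the patent's data structures:

```python
def base_setting_process(enablement_events, pose):
    """Sketch of the base setting process 304 (blocks 600-606).
    `enablement_events` yields True/False samples of the enablement
    signal; `pose` is a dict standing in for the current buffer 320."""
    # Block 600: inhibit further movement (drive signals are held).
    pose["motion_enabled"] = False
    # Block 602: wait (loop) until the enablement signal goes active.
    for active in enablement_events:
        if active:
            break
    # Block 604: latch new base positions and orientations.
    pose["P_MBASE"] = pose["P_MCURR"]
    pose["R_MBASE"] = pose["R_MCURR"]
    pose["P_EEBASE"] = pose["P_EENEW"]
    pose["R_EEBASE"] = pose["R_EENEW"]
    # Block 606: permit further movement.
    pose["motion_enabled"] = True
    return pose
```

In the real system the wait at block 602 is open-ended; here the loop simply consumes samples until one is active.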
(68) In one embodiment, when the footswitch 134 causes the enablement signal to transition to the inactive state, the indicators 408, 412, 410 and 414 in FIG. 5 representing the positions and orientations of the respective left and right instruments 208 and 212 are immobilized on the graphical depictions 136 and 138 at their current respective positions and additional indicators 450 and 452 representing current input device 110 inputs are displayed. The immobilized indicators 408, 412, 410 and 414 represent the position and orientation of the immobilized instruments 208 and 212 while the additional indicators 450 and 452 represent current positions of the input devices 116 and 118 and hand controllers 112 and 114. Subsequently, when the enablement signal again transitions to the active state, the additional indicators 450 and 452 are deleted or gradually faded out and the indicators 408, 412, 410 and 414 are once again rendered active. Aligning the displayed indicators 408 and 450 and the indicators 410 and 452 prior to releasing the footswitch 134 minimizes the offset between the hand controllers 112 and 114 and the respective instruments 208 and 212. Similarly current hand controller rotation indicators 454 and 456 may be displayed at an offset from the indicator representing a current rotation of the end effectors 210 and 214. Accordingly, while the footswitch 134 is depressed, the user can offset roll, orientation and translation (XYZ). When the footswitch 134 is released the instruments 208 and 212 are re-engaged and the roll and translation offset is fixed.
(69) Instrument Position and Orientation
(70) Further details of block 306 of the process 300 shown in FIG. 4 for calculating the 3D spatial position of the instrument are shown in FIG. 10. Referring to FIG. 10, the process 306 includes blocks of code executed by the workstation processor circuit 120 for calculating new end effector position and orientation control signals {right arrow over (P)}.sub.EENEW and R.sub.EENEW in response to the current hand controller position {right arrow over (P)}.sub.MCURR and hand controller orientation R.sub.MCURR. These control signals, when received by the instrument processor circuit 130 at the communications interface 284, are used by the microprocessor 280 to produce drive signals at the drive control interface 286 to position and orient the end effector 210. In one embodiment the process 306 is executed periodically at a rate of about 1 kHz.
(71) The process 306 begins at block 630, which directs the microprocessor 250 to read the current hand controller position vector {right arrow over (P)}.sub.MCURR and current hand controller rotation matrix R.sub.MCURR from the current buffer 320 of the workstation memory 252. Block 632 then directs the microprocessor 250 to calculate new end effector position signals {right arrow over (P)}.sub.EENEW and new end effector orientation signals R.sub.EENEW representing a desired end effector position 560 and desired end effector orientation, relative to the fixed slave reference position 552 and the slave base orientation. Block 632 also directs the microprocessor 250 to store values representing the new end effector position vector {right arrow over (P)}.sub.EENEW in the store 330 and to store values representing the desired end effector orientation matrix R.sub.EENEW in the store 332 of the current buffer 320 of the workstation memory 252.
(72) The new end effector position signals {right arrow over (P)}.sub.EENEW and new end effector orientation signals R.sub.EENEW are calculated according to the following relations:
{right arrow over (P)}.sub.EENEW=A({right arrow over (P)}.sub.MCURR-{right arrow over (P)}.sub.MBASE)+{right arrow over (P)}.sub.EEBASE Eqn 1a
R.sub.EENEW=R.sub.EEBASE R.sub.MBASE.sup.-1 R.sub.MCURR Eqn 1b
where:
(73) {right arrow over (P)}.sub.EENEW is the new end effector position vector that represents the new desired position of the end effector 210 in the end effector workspace, and is defined relative to the slave base reference position; A is a scalar value representing a scaling factor in translational motion between the master and the slave; {right arrow over (P)}.sub.MCURR is the current representation of the hand controller position vector stored in the store 322 of the current buffer 320, the hand controller position vector being defined relative to the fixed master reference frame x.sub.r, y.sub.r, and z.sub.r; {right arrow over (P)}.sub.MBASE is the last-saved position vector {right arrow over (P)}.sub.MCURR for the hand controller 112, saved at the last transition of the enablement signal from the inactive state to the active state, on system initialization, or by operation of a control interface by an operator; {right arrow over (P)}.sub.EEBASE is the last-saved position vector {right arrow over (P)}.sub.EENEW for the end effector 210, saved at the last transition of the enablement signal from the inactive state to the active state or on system initialization; R.sub.EENEW is the new end effector orientation matrix representing the new orientation of the end effector 210, and is defined relative to the fixed slave reference position 552; R.sub.EEBASE is the last-saved rotation matrix R.sub.EENEW of the end effector 210, saved at the last transition of the enablement signal from the inactive state to the active state; R.sub.MBASE.sup.-1 is the inverse of the rotation matrix R.sub.MBASE, which is the last-saved rotation matrix R.sub.MCURR of the hand controller 112, saved at the last transition of the enablement signal from the inactive state to the active state; and R.sub.MCURR is the currently acquired rotation matrix representing the orientation of the hand controller 112 relative to the fixed master reference frame x.sub.r, y.sub.r, and z.sub.r.
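Eqn 1a and Eqn 1b can be implemented directly: the scaled master displacement is added to the end effector base position, and the master's rotation since the last base setting (R.sub.MBASE.sup.-1 R.sub.MCURR) is composed with the end effector base rotation. A minimal sketch in plain Python, using the fact that a rotation matrix's inverse is its transpose; the default scale factor A = 0.5 is an illustrative value, not one given in the patent:

```python
def mat_mul(a, b):
    """3x3 matrix product; matrices are tuples of row tuples."""
    return tuple(tuple(sum(a[i][k] * b[k][j] for k in range(3))
                       for j in range(3)) for i in range(3))

def mat_T(a):
    """Transpose; for a rotation matrix this is also its inverse."""
    return tuple(tuple(a[j][i] for j in range(3)) for i in range(3))

def ee_new(p_mcurr, p_mbase, p_eebase, r_mcurr, r_mbase, r_eebase, A=0.5):
    """Eqn 1a/1b: map the current master (hand controller) pose to the
    new end effector pose relative to the saved base values."""
    # Eqn 1a: P_EENEW = A (P_MCURR - P_MBASE) + P_EEBASE
    p_eenew = tuple(A * (c - b) + e
                    for c, b, e in zip(p_mcurr, p_mbase, p_eebase))
    # Eqn 1b: R_EENEW = R_EEBASE · R_MBASE⁻¹ · R_MCURR
    r_eenew = mat_mul(r_eebase, mat_mul(mat_T(r_mbase), r_mcurr))
    return p_eenew, r_eenew
```

Note how the base terms make the mapping relative: only motion of the hand controller since the last base setting moves the end effector, which is what permits clutching and scale changes without jumps.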
(74) Block 634 then directs the microprocessor 250 to determine whether the enablement signal is in the active state. If the enablement signal is in the active state, block 636 directs the microprocessor 250 to cause the motion control interface 258 to transmit control signals based on the newly calculated values for {right arrow over (P)}.sub.EENEW and R.sub.EENEW. When the control signals are received at the communications interface 284 of the instrument processor circuit 130, the microprocessor 280 causes drive signals to be produced to cause the end effector 210 to assume a position and orientation determined by the current position and current orientation of the hand controller 112.
(75) Block 638 then directs the microprocessor 250 to copy the current position vector {right arrow over (P)}.sub.MCURR and the current rotation matrix R.sub.MCURR stored in stores 322 and 324 of the current buffer 320 into stores 342 ({right arrow over (P)}.sub.MPREV) and 344 (R.sub.MPREV) of the previous buffer 340 of the workstation memory 252. Block 638 also directs the microprocessor 250 to copy the newly calculated end effector position vector {right arrow over (P)}.sub.EENEW and the newly calculated end effector rotation matrix R.sub.EENEW into stores 346 and 348 of the previous buffer 340. By storing these as the previously calculated end effector position vector {right arrow over (P)}.sub.EEPREV and previously calculated end effector rotation matrix R.sub.EEPREV, a subsequent new end effector position vector {right arrow over (P)}.sub.EENEW and new end effector rotation matrix R.sub.EENEW can be calculated from the next received hand controller position vector {right arrow over (P)}.sub.MCURR and next received hand controller rotation matrix R.sub.MCURR provided by the input device 116.
(76) If at block 634 the enablement signal is in the inactive state, the microprocessor 250 is directed to block 642. Block 642 directs the microprocessor 250 to cause the motion control interface 258 to transmit control signals based on the previously calculated values of {right arrow over (P)}.sub.EEPREV and R.sub.EEPREV in the respective stores 346 and 348 of the previous buffer 340 of the workstation memory 252. The control signals transmitted by the motion control interface 258 are thus derived from the last saved values of {right arrow over (P)}.sub.EENEW and R.sub.EENEW, causing the end effector 210 to remain stationary since the same control signals as previously determined are transmitted to the communications interface 284 of the instrument processor circuit 130. The microprocessor 250 is then directed to block 640.
(77) While the enablement signal remains inactive (i.e. while the footswitch 134 is depressed) the control signals transmitted by the motion control interface 258 are based only on the previously calculated end effector position and orientation signals {right arrow over (P)}.sub.EEPREV and R.sub.EEPREV that were in effect before the enablement signal transitioned to inactive.
(78) In another embodiment certain special functions may be executed before executing block 636 when the enablement signal is determined to be in the active state at block 634. One example of such a special function is an alignment control function, as described in applicant's co-pending U.S. Patent Publication No. 2018/0271607 and U.S. Patent Publication No. 2017/0367777, hereby incorporated by reference in their entirety. For example, in one embodiment an alignment control function may have one of two outcomes. The first outcome may direct the microprocessor 250 to execute block 636, which directs the microprocessor to cause the motion control interface 258 to transmit control signals to the instrument processor circuit 130 based on the newly calculated end effector position and newly calculated end effector orientation {right arrow over (P)}.sub.EENEW and R.sub.EENEW. The second outcome directs the microprocessor 250 to execute block 642, which causes the microprocessor to cause the motion control interface 258 to transmit control signals based on a previously calculated end effector position and previously calculated end effector orientation {right arrow over (P)}.sub.EEPREV and R.sub.EEPREV. This causes the end effector 210 to assume a position and orientation determined by a previous position and previous orientation of the hand controller 112.
(79) Accordingly, when the enablement signal is in the inactive state, the hand controller 112 can be moved and rotated and the calculations of {right arrow over (P)}.sub.EENEW and R.sub.EENEW will still be performed by block 632, but there will be no movement of the end effector 210, since the previous control signals are sent to the instrument processor circuit 130. This allows “clutching” or repositioning of the hand controller 112 without corresponding movement of the end effector 210. The movement may be useful in relocating the hand controller within the input device workspace to a comfortable position and/or providing an increased range of movement for the end effector 210 within the surgical workspace.
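The per-tick selection between the newly calculated and previously calculated command can be sketched as a single function: when the enablement signal is active the new command is transmitted and saved as the previous command; when inactive, the previous command is retransmitted so the end effector holds still while the hand controller is repositioned. The function name and tuple return shape are illustrative assumptions:

```python
def control_tick(enabled, new_cmd, prev_cmd):
    """One pass of blocks 634-642, simplified. Returns a pair
    (transmitted_command, previous_command_for_next_tick)."""
    if enabled:
        # Blocks 636/638: transmit the new command and save it as previous.
        return new_cmd, new_cmd
    # Block 642: retransmit the previous command; end effector stays put.
    return prev_cmd, prev_cmd
```

Called at the ~1 kHz control rate, this yields exactly the clutching behaviour described: the new-pose calculation still runs every tick, but its result only reaches the instrument while the enablement signal is active.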
(80) The end effector position vector {right arrow over (P)}.sub.EENEW or {right arrow over (P)}.sub.EEPREV and end effector orientation matrix R.sub.EENEW or R.sub.EEPREV transmitted at block 636 or block 642 provide a desired location of the end effector tip 560 with respect to the fixed slave reference position 552. However, in the embodiment shown in FIG. 3, the microprocessor 250 causes the motion control interface 258 to transmit motion control signals that define a pose required by the positioning device 209 to position and orient the end effector 210 in the desired end effector position and orientation. The motion control signals are thus generated based on a kinematic configuration of the positioning device 209 and end effector 210 to position the end effector position 560 at the desired position and orientation.
(81) Motion Control Signals
(82) The right side instrument 208 is shown in a bent pose in FIG. 11 and FIG. 12. The left side instrument 212 is also shown in FIG. 12 in a straight pose corresponding to the home configuration. Referring to FIG. 11 and FIG. 12, the positioning device 209 of the instrument 208 has a first articulated segment referred to as an s-segment 700 and a second articulated segment referred to as a distal segment 702. The segments each include a plurality of the vertebra 550. The s-segment 700 begins at a distance from the insertion tube 202, referred to as the insertion distance q.sub.ins, which is a distance between the fixed slave reference position 552 defined at the origin of the slave fixed base reference frame x.sub.v, y.sub.v, and z.sub.v, and a first position 704 at an origin of a first position reference frame x.sub.3, y.sub.3, and z.sub.3. The insertion distance q.sub.ins represents an unbendable portion of the positioning device 209 that extends out of the end of the insertion tube 202. In the embodiment shown, the insertion distance q.sub.ins may be about 10-20 mm, while in other embodiments the insertion distance may be longer or shorter, varying from 0-100 mm, for example.
(83) The s-segment 700 extends from the first position 704 to a third position 706 defined as an origin of a third reference frame having axes x.sub.5, y.sub.5, and z.sub.5 and is capable of assuming a smooth s-shape when control wires (not shown) inside the s-segment 700 are pushed and pulled. The s-segment 700 has a mid-point at a second position 708, defined as the origin of a second position reference frame having axes x.sub.4, y.sub.4, and z.sub.4. The s-segment 700 has a length L.sub.1, best shown in FIG. 12 for the left side instrument positioning device 213. In the embodiment shown, the length L.sub.1 may be about 65 mm.
(84) The distal segment 702 extends from the third position 706 to a fourth position 710 defined as an origin of a fourth reference frame having axes x.sub.6, y.sub.6, and z.sub.6. The distal segment 702 has a length L.sub.2, best shown in FIG. 12 for the left side instrument positioning device 213. In the embodiment shown, the length L.sub.2 may be about 23 mm.
(85) Each end effector 210 and 214 also has an end effector length, which in the embodiment shown is a gripper length L.sub.3 extending from the fourth position 710 to the end effector tip position 560 defined as the origin of the axes x.sub.2, y.sub.2, and z.sub.2. The gripper length L.sub.3 is best shown in FIG. 12 again for the left side instrument positioning device 213 and in one embodiment may be about 25 mm. The slave reference position 552, first position 704, second position 708, third position 706, fourth position 710, and the end effector position 560 may collectively be referred to as instrument reference positions.
(86) As described in U.S. Patent Publication No. 2016/0143633 (hereby incorporated herein by reference in its entirety) by pushing and pulling on control wires inside the positioning devices 209 and 213, the s-segments 700 of the positioning devices 209 and 213 may be bent into various degrees of an s-shape, from the straight condition shown in FIG. 8 to a partial s-shape for the right side instrument 208 shown in FIGS. 11 and 12 to a full s-shape. The s-segment 700 is sectional in that it has a first section 712 and a second section 714 on opposite sides of the second position 708. Referring to FIG. 5, the first and second sections 712 and 714 lie in a first bend plane containing the first position 704, second position 708, and third position 706. The first bend plane is at an angle d.sub.prox to the x.sub.v-z.sub.v plane of the fixed slave reference frame x.sub.v, y.sub.v, and z.sub.v. The first section 712 and second section 714 are bent in the first bend plane through opposite but equal angles θ.sub.prox such that no matter the angle θ.sub.prox or the bend plane angle d.sub.prox, the z.sub.5 axis of the third position 706 is always parallel to and aligned in the same direction as the z.sub.v axis of the fixed slave reference position 552. Thus, by pushing and pulling on the control wires within the positioning device 209, the third position 706 can be placed at any of a number of discrete positions in space within a cylindrical volume about the first position 704. This cylindrical volume may be referred to as the s-segment workspace.
(87) In addition, the distal segment 702 lies in a second bend plane containing the third position 706 and the fourth position 710. The second bend plane is at an angle d.sub.dist to the x.sub.v-z.sub.v plane of the fixed slave reference frame x.sub.v, y.sub.v, and z.sub.v. The distal segment 702 is bent in the second bend plane at an angle θ.sub.dist. Thus, by pushing and pulling the control wires within the positioning device 209, the fourth position 710 can be placed within another volume in space about the third position 706. This volume may be referred to as the distal workspace. The combination of the s-segment workspace and the distal workspace may be referred to as the positioning device workspace as this represents the total possible movement of the instrument 208 as effected by the positioning device 209. The left side instrument 212 may be similarly positioned by the positioning device 213.
(88) The distance between the fourth position 710 and the end effector position 560 is the distance between the movable portion of the distal segment 702 and the tip of the gripper end effector 210 in the embodiment shown, i.e. the gripper length L.sub.3 shown in FIG. 12. Generally, a portion of the gripper between the fourth position 710 and the end effector position 560 will be unbendable.
(89) In the embodiment shown, the end effector 210 includes moveable gripper jaws 216 that are rotatable about the z.sub.2 axis in the x.sub.2-y.sub.2 plane of the end effector reference frame, the angle of rotation being represented by an angle γ relative to the positive x.sub.2 axis. The gripper jaws 216 may also be at any of varying degrees of openness from fully closed to fully open (as limited by a hinge joint of the jaws). The varying degrees of openness may be defined as the “gripper”. In summary therefore, the motion control signals are generated based on a kinematic configuration of the positioning device 209 and end effector 210 as defined by the following configuration variables: q.sub.ins represents a distance from the slave reference position 552 defined by axes x.sub.v, y.sub.v, and z.sub.v to the first position 704 defined by axes x.sub.3, y.sub.3 and z.sub.3 where the s-segment 700 of the positioning device 209 begins; d.sub.prox represents the angle of the first bend plane, in which the s-segment 700 is bent, relative to the x.sub.v-z.sub.v plane of the fixed slave reference frame; θ.sub.prox represents an angle at which the first and second sections 712 and 714 of the s-segment 700 are bent in the first bend plane; d.sub.dist represents the angle of the second bend plane, in which the distal segment 702 is bent, relative to the x.sub.v-z.sub.v plane of the fixed slave reference frame;
(90) θ.sub.dist represents an angle through which the distal segment 702 is bent in the second bend plane; γ represents a rotation of the end effector 210 about axis z.sub.2; and Gripper represents a degree of openness of the gripper jaws 216 of the end effector 210 (this is a value which is calculated in direct proportion to a signal produced by an actuator (not shown) on the hand controller 112 indicative of an amount of pressure the operator exerts by squeezing the actuator to actuate the jaws 216 to close).
(91) To calculate the configuration variables, it will first be recalled that the end effector rotation matrix R.sub.EENEW is a 3×3 matrix:
(92) R_{EENEW} = \begin{bmatrix} x_{2x} & y_{2x} & z_{2x} \\ x_{2y} & y_{2y} & z_{2y} \\ x_{2z} & y_{2z} & z_{2z} \end{bmatrix},
where the last column of R.sub.EENEW is the z-axis of the end effector reference frame written relative to the fixed slave reference frame x.sub.v, y.sub.v, and z.sub.v. The values θ.sub.dist, d.sub.dist, and γ associated with the distal segment 702 may be calculated according to the relations:
(93) \theta_{dist} = \frac{\pi}{2} - \mathrm{atan2}\left(\sqrt{z_{2x}^2 + z_{2y}^2},\; z_{2z}\right) (Eqn 2)
d_{dist} = -\mathrm{atan2}(z_{2y},\; z_{2x}) (Eqn 3)
If |d_{dist}| > \pi/2:
\gamma = \mathrm{atan2}(-y_{2z},\; x_{2z}) - d_{dist} + \pi (Eqn 4a)
else
\gamma = \mathrm{atan2}(y_{2z},\; -x_{2z}) - d_{dist} (Eqn 4b)
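For illustration, the extraction of these angles from the rotation matrix in Eqns 2 through 4b can be sketched in a few lines of Python (the language, function names, and variable names here are purely illustrative and do not appear in the patent):

```python
import math

def distal_config_from_rotation(R):
    """Sketch of Eqns 2-4b: extract theta_dist, d_dist and the end effector
    roll gamma from the 3x3 rotation matrix R_EENEW, whose columns are the
    x2, y2, z2 axes of the end effector frame written in the slave frame."""
    x2 = [R[i][0] for i in range(3)]
    y2 = [R[i][1] for i in range(3)]
    z2 = [R[i][2] for i in range(3)]

    # Eqn 2: bend angle of the distal segment
    theta_dist = math.pi / 2 - math.atan2(math.hypot(z2[0], z2[1]), z2[2])
    # Eqn 3: bend plane angle
    d_dist = -math.atan2(z2[1], z2[0])
    # Eqns 4a/4b: end effector roll, with the branch on |d_dist| > pi/2
    if abs(d_dist) > math.pi / 2:
        gamma = math.atan2(-y2[2], x2[2]) - d_dist + math.pi
    else:
        gamma = math.atan2(y2[2], -x2[2]) - d_dist
    return theta_dist, d_dist, gamma
```

For a pure rotation about the y.sub.v axis by 0.3 rad, for example, the sketch returns θ.sub.dist = π/2 − 0.3 with d.sub.dist and γ both zero, consistent with a tilt that involves no roll.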
(94) The third position 706 may then be written in terms of a vector p.sub.3/v from the fixed slave reference position 552 to the third position. Similarly a vector p.sub.4/3 may be defined from the third position 706 to the fourth position 710 and a vector p.sub.5/4 may be defined from the fourth position 710 to the end effector position 560. These values can then be used to compute the location of the third position 706 relative to the fixed slave reference position 552 by subtracting the vectors p.sub.4/3 and p.sub.5/4 from the end effector position vector {right arrow over (P)}.sub.EENEW:
p_{3/v} = \vec{P}_{EENEW} - p_{4/3} - p_{5/4} (Eqn 5)
where:
(95) \bar{p}_{4/3} \cdot \bar{i} = \frac{-L_2 \cos d_{dist} (\sin\theta_{dist} - 1)}{\pi/2 - \theta_{dist}} (Eqn 6a)
\bar{p}_{4/3} \cdot \bar{j} = \frac{L_2 \sin d_{dist} (\sin\theta_{dist} - 1)}{\pi/2 - \theta_{dist}} (Eqn 6b)
\bar{p}_{4/3} \cdot \bar{k} = \frac{L_2 \cos\theta_{dist}}{\pi/2 - \theta_{dist}} (Eqn 6c)
\bar{p}_{5/4} \cdot \bar{i} = L_3 \cos(d_{dist}) \cos(\theta_{dist}) (Eqn 7a)
\bar{p}_{5/4} \cdot \bar{j} = -L_3 \sin(d_{dist}) \cos(\theta_{dist}) (Eqn 7b)
\bar{p}_{5/4} \cdot \bar{k} = L_3 \sin(\theta_{dist}), (Eqn 7c)
where i is a unit vector in the x direction, j is a unit vector in the y direction, and k is a unit vector in the z direction.
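The component-wise construction of p.sub.4/3 and p.sub.5/4, and the subtraction of Eqn 5, can be sketched as follows (an illustrative Python sketch; names are not from the patent, and note that the π/2 − θ denominator approaches 0 as the segment approaches straight, i.e. θ → π/2, so production code would need the limiting value):

```python
import math

def distal_offsets(theta_dist, d_dist, L2, L3):
    """Sketch of Eqns 6a-7c: build p_4/3 (third to fourth position) and
    p_5/4 (fourth position to end effector tip) component by component.
    L2 and L3 are the distal segment and gripper lengths."""
    denom = math.pi / 2 - theta_dist  # singular when the segment is straight
    p43 = (-L2 * math.cos(d_dist) * (math.sin(theta_dist) - 1) / denom,
           L2 * math.sin(d_dist) * (math.sin(theta_dist) - 1) / denom,
           L2 * math.cos(theta_dist) / denom)
    p54 = (L3 * math.cos(d_dist) * math.cos(theta_dist),
           -L3 * math.sin(d_dist) * math.cos(theta_dist),
           L3 * math.sin(theta_dist))
    return p43, p54

def third_position(p_ee, p43, p54):
    """Eqn 5: p_3/v = P_EENEW - p_4/3 - p_5/4, component-wise."""
    return tuple(e - a - b for e, a, b in zip(p_ee, p43, p54))
```

With d.sub.dist = 0 the bend plane coincides with the x.sub.v-z.sub.v plane, so the j components of both vectors vanish, which is a quick sanity check on the signs above.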
The vector p.sub.3/v from the fixed slave reference position 552 to the third position 706 may then be used to find the configuration variables d.sub.prox and θ.sub.prox for the s-segment 700. The angle d.sub.prox is calculated by solving the following two equations for d.sub.prox:
(96) \bar{p}_{3/v} \cdot \bar{i} = \frac{-L_1 \cos d_{prox} (\sin\theta_{prox} - 1)}{\pi/2 - \theta_{prox}} (Eqn 8a)
\bar{p}_{3/v} \cdot \bar{j} = \frac{L_1 \sin d_{prox} (\sin\theta_{prox} - 1)}{\pi/2 - \theta_{prox}} (Eqn 8b)
Taking a ratio of Eqn 8b and Eqn 8a yields:
d_{prox} = \mathrm{atan2}(-\bar{p}_{3/v} \cdot \bar{j},\; \bar{p}_{3/v} \cdot \bar{i}), (Eqn 9)
where i and j are unit vectors in the x and y directions respectively. A closed form solution cannot be found for θ.sub.prox, and accordingly θ.sub.prox must be found by numerically solving either Eqn 8a or Eqn 8b. For example, a Newton-Raphson method may be employed, which iteratively computes successively better approximations to a root of a real-valued function. The Newton-Raphson method can be implemented using the following equations:
(97) f(\theta_{prox}) = \frac{L_1}{\pi/2 - \theta_{prox}} \cos d_{prox} (1 - \sin\theta_{prox}) - \bar{p}_{3/v} \cdot \bar{i} = 0, (Eqn 10)
where i is the unit vector in the x direction. The equation Eqn 10 is Eqn 8a rewritten in the form f(θ.sub.prox)=0. The Newton-Raphson method tends to converge very quickly because in the range 0 < θ.sub.prox < π the function has a large radius of curvature and has no local stationary points. Following the Newton-Raphson method, successive improved estimates of θ.sub.prox can be made iteratively to satisfy equation Eqn 10 using the following relationship:
(98) \theta_{n+1} = \theta_n - \frac{f(\theta_n)}{f'(\theta_n)} (Eqn 11)
(99) Finally, upon determination of θ.sub.prox, the following equation can be used to find q.sub.ins:
(100) q_{ins} = \bar{p}_{3/v} \cdot \bar{k} - \frac{L_1 \cos\theta_{prox}}{\pi/2 - \theta_{prox}}, (Eqn 12)
where k is the unit vector in the z direction and p.sub.3/v·k is the dot product of the vector p.sub.3/v and the unit vector k.
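The Newton-Raphson solve of Eqns 10-11 and the q.sub.ins recovery of Eqn 12 can be sketched as follows (an illustrative Python sketch; the names, starting guess, and the finite-difference derivative standing in for the analytic f' are assumptions, not the patent's implementation):

```python
import math

def solve_theta_prox(L1, d_prox, p3x, theta0=0.5, iters=20):
    """Sketch of Eqns 10-11: Newton-Raphson solve of
    f(theta) = L1*cos(d_prox)*(1 - sin(theta))/(pi/2 - theta) - p3x = 0
    for theta_prox, where p3x is the i component of p_3/v."""
    def f(theta):
        return (L1 * math.cos(d_prox) * (1 - math.sin(theta))
                / (math.pi / 2 - theta) - p3x)
    theta = theta0
    h = 1e-7
    for _ in range(iters):
        fp = (f(theta + h) - f(theta - h)) / (2 * h)  # numerical f'
        theta -= f(theta) / fp                        # Eqn 11 update
    return theta

def insertion_distance(L1, theta_prox, p3z):
    """Sketch of Eqn 12: recover q_ins from the k component of p_3/v."""
    return p3z - L1 * math.cos(theta_prox) / (math.pi / 2 - theta_prox)
```

As a self-consistency check, feeding the solver an i component generated from Eqn 8a for a known θ.sub.prox recovers that angle to high precision within a handful of iterations, reflecting the fast convergence noted above.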
(102) The above configuration variables are calculated for the end effector position and orientation signals {right arrow over (P)}.sub.EENEW and R.sub.EENEW at block 636 or {right arrow over (P)}.sub.EEPREV and R.sub.EEPREV at block 642 of the process 306. The configuration variables generally define a pose of the positioning device 209 required to position the end effector 210 at the desired location and orientation in the end effector workspace. Configuration variables are produced for each end effector 210 and 214 of the respective right and left side instruments 208 and 212. Two sets of configuration variables referred to as left and right configuration variables respectively are thus produced and transmitted by the motion control interface 258 to the instrument processor circuit 130 and used by the microprocessor 280 to generate drive control signals for spatially positioning the positioning device 209 and end effector 210 of the instrument 208 in the surgical workspace.
(103) 3D Spatial Positioning
(104) Further details of block 308 of the process 300 shown in FIG. 3 are shown in FIG. 13. Referring to FIG. 13, the process 308 includes blocks of codes executed by the workstation processor circuit 120 for generating display signals for displaying the graphical depictions 136 and 138 shown in FIG. 5. The process 308 uses the values of the configuration variables that were determined at block 306 to determine locations in the surgical workspace for points along the positioning devices 209 for the current inputs from the input device 110 and footswitch 134. The locations are determined relative to the fixed slave reference position 552 within the surgical workspace. The process 308 generally involves determining theoretical locations for each of the reference points, namely the first position 704, second position 708, third position 706, fourth position 710 and the end effector position 560 in the surgical workspace. Once the theoretical location of each reference point is determined, the theoretical locations of various intermediate points along the positioning device 209 within the surgical workspace may be determined. Each of the sections 712, 714 of the s-segment 700, and the distal segment 702 of the positioning device 209 includes a plurality of vertebrae 550 whose centers are spaced apart by the same distance. Since the s-segment 700 and distal segment 702 form smooth continuous constant-radius curves when bent, the theoretical location of the center of each vertebra can thus be calculated mathematically. The theoretical locations may be used to determine motion control signals used by the instrument processor circuit 130 to generate drive signals for the actual positioning of the instrument 208 in the surgical workspace. The theoretical locations are also used by the workstation processor circuit 120 to generate the graphical depictions 136 and 138 shown in FIG. 5.
(105) The process 308 begins at block 740, which directs the microprocessor 250 to select the first reference position (shown at 704 in FIG. 11) for processing. Block 742 then directs the microprocessor 250 to determine the location of the first position 704, which is spaced from the fixed slave reference position 552 by an unbendable portion of the positioning device 209 having length q.sub.ins. The location of the first position 704 is thus determined by simple addition of the q.sub.ins configuration variable to the fixed slave reference position 552 in the z.sub.v axis. The location may be expressed in terms of a vector p.sub.1/v from the fixed slave reference position 552 to the first position 704 within the surgical workspace.
(106) Block 744 then directs the microprocessor 250 to determine locations of intermediate points along the first section 712 of the positioning device 209 (i.e. between the first position 704 and the second position 708). The location of the first position 704 determined at block 742 is used to determine locations of all vertebrae 550 in the first section 712 of the s-segment 700. For example, in the embodiment shown in FIG. 11, assuming there are 15 vertebrae 550 in the first section 712 between the first position 704 and the second position 708, the center of the n.sup.th vertebra will lie at a theoretical location that is at an intermediate point along the first section 712 calculated as:
(107) \frac{n\,\theta_{prox}}{15}
relative to the first position 704. A vector from the first position 704 to the n.sup.th vertebra position may thus be determined and added to the vector p.sub.1/v from the fixed slave reference position 552 to the first position 704 to determine the location of each of the n vertebrae of the first section 712 relative to the fixed slave reference position 552 in the surgical workspace.
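Because each bent section is a constant-radius arc, the intermediate vertebra centers admit a closed form. The idea can be sketched with a generic planar arc parameterization (an illustrative Python sketch under that simplifying assumption; it is not the patent's exact equations, and the names are illustrative):

```python
import math

def arc_points(L, phi, n_vertebrae=15):
    """Sketch of the intermediate-point idea: a section of length L bent
    through a total angle phi (from straight) forms a constant-radius arc,
    so the centers of n_vertebrae equally spaced vertebrae follow in closed
    form, here as (transverse, axial) coordinates in the bend plane."""
    pts = []
    for n in range(1, n_vertebrae + 1):
        s = phi * n / n_vertebrae          # bend angle at the n-th vertebra
        if abs(phi) < 1e-9:                # straight section: points on a line
            pts.append((0.0, L * n / n_vertebrae))
        else:
            r = L / phi                    # arc radius for total bend phi
            pts.append((r * (1 - math.cos(s)), r * math.sin(s)))
    return pts
```

A straight section (phi = 0) reduces to equally spaced points along the axis ending at length L, and each in-plane point can then be rotated by the bend plane angle and added to the section's base vector, mirroring the vector additions described above.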
(108) Block 746 then directs the microprocessor 250 to determine whether all of the reference positions have been processed, and if not, the microprocessor is directed to block 748 where the next reference position is selected for processing. Block 748 then directs the microprocessor 250 back to block 742 and blocks 742 and 744 are repeated for each reference position.
(109) The location of the second position 708 relative to the fixed slave reference position 552 may be determined from the configuration variables q.sub.ins, θ.sub.prox, and d.sub.prox. Determining a vector p.sub.2/v from the fixed slave reference position 552 to the second position 708 provides a theoretical location of the second position in absolute terms within the surgical workspace. For the embodiment shown in FIG. 11, assuming again that there are 15 vertebrae in the second section 714, the center of the n.sup.th vertebra of the second section would lie at an intermediate point along the second section. The angle at which the second section 714 is bent in the first bend plane d.sub.prox is equal and opposite to the angle θ.sub.prox used for the calculations concerning the vertebrae of the first section 712. Therefore, the intermediate point of the n.sup.th vertebra can be calculated as:
(110) \frac{n\,(-\theta_{prox})}{15}
relative to the second position 708. A vector from the second position 708 to the n.sup.th vertebra position may thus be determined and added to the vector p.sub.2/v from the slave reference position 552 to the second position 708 to provide the theoretical location of the n.sup.th vertebra of the second section 714 in absolute terms within the positioning device workspace. This process may be repeated for each of the 15 vertebrae in the second section 714 of the s-segment 700 to find absolute locations for each vertebra intermediate point within the surgical workspace relative to the fixed slave reference position 552.
The location of the third position 706 at the end of the s-segment 700 may be expressed in terms of the vector p.sub.3/v having vector components as set out in Eqn 8a and 8b above. The location of the third position 706 may be used as the reference point for determining the theoretical locations of all vertebrae 550 in the distal segment 702 using the method provided above. Assuming that there are 15 vertebrae in the distal segment 702, the center of the n.sup.th vertebra would lie at an intermediate point along the distal segment. The angle at which the distal segment 702 is bent in the second bend plane d.sub.dist is θ.sub.dist. Therefore, the intermediate point of the n.sup.th vertebra can be calculated as:
(111) \frac{n\,\theta_{dist}}{15}
relative to the third position 706. A vector from the third position 706 to the n.sup.th vertebra position may thus be determined and added to the vector p.sub.3/v to arrive at the theoretical location of the n.sup.th vertebra in the distal segment 702 in absolute terms in the surgical workspace. This procedure is repeated for each of the 15 vertebrae in the distal segment 702 to find the theoretical location for each vertebra intermediate point in the positioning device workspace in absolute terms, relative to the fixed slave reference position 552.
(112) The location of the fourth position 710 may be determined from the vector p.sub.4/3 relative to the third position 706 having vector components as set out in Eqn 6a, 6b, and 6c above. Adding the vector p.sub.4/3 to the vector p.sub.3/v from the fixed slave reference position 552 to the third position 706 will arrive at the theoretical location of the fourth position in absolute terms relative to the fixed slave reference position in the surgical workspace.
(113) Finally, the theoretical location of the end effector position 560 may be determined as a vector p.sub.5/4 relative to the fourth position 710 according to vector component relations set out in Eqn 7a, 7b and 7c above. Adding the vector p.sub.5/4 from the fourth position 710 to the end effector position 560 to the vectors p.sub.4/3 and p.sub.3/v from the fixed slave reference position 552 will arrive at the theoretical location of the end effector position 560 in absolute terms relative to the fixed slave reference position.
(114) If at block 746, each of the reference positions along the positioning device 209 has been processed, the locations of a plurality of points along the positioning device 209 and end effector 210 will have been determined, thus defining the 3D spatial positioning of the instrument 208 in the surgical workspace.
(115) The process 308 then continues at block 748, which directs the microprocessor 250 to generate a two-dimensional projection of the current 3D spatial position of the positioning device 209 to generate the area 412 representing the positioning device shown in the graphical depiction 136 of FIG. 5. Block 748 also directs the microprocessor 250 to generate a two-dimensional projection of the current 3D spatial position of the end effector 210 to generate the indicator 408 representing the end effector shown in FIG. 5. In one embodiment, the planar representation 136 is generated for a plane that is aligned with the x.sub.v-y.sub.v plane (i.e. perpendicular to the z.sub.v axis) and the projection is generated from the x.sub.v and y.sub.v components of the location of each intermediate point along the positioning device 209 and end effector 210 (i.e. the z.sub.v components are set to zero).
(116) The process 308 then continues at block 750, which directs the microprocessor 250 to determine whether any projected portion of the positioning device 209 is proximate the boundary 406 in FIG. 5 indicating that a constraint to further movement of the positioning device is active. Block 750 also directs the microprocessor 250 to determine whether any projected portion of the end effector 210 is proximate the boundary 402. If either of these conditions is detected, block 750 directs the microprocessor 250 to block 752.
(117) Block 752 directs the microprocessor 250 to cause an active constraint alert to be generated. In one embodiment a visual alert may be generated by changing a color or displayed intensity of the boundary 402 or 406 or by displaying an alert symbol on the display 122. The alert may alternatively be displayed in the graphical depictions 136 and 138 overlaying the location of the indicators 412 and 414. In other embodiments an audible alert may be generated. Alternatively or additionally, the microprocessor 250 may cause the input device 110 to generate haptic feedback via the hand controller 112. Block 752 then directs the microprocessor 250 back to block 302 in FIG. 4.
(118) If at block 750, the positioning device 209 and end effector 210 are not proximate any boundaries, the microprocessor 250 is directed back to block 302 in FIG. 4.
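The projection at block 748 and the proximity test at block 750 can be sketched as follows (an illustrative Python sketch; the assumption of a circular projected boundary of a given radius centered on the insertion axis, and all names, are illustrative rather than the patent's implementation):

```python
import math

def project_to_plane(points):
    """Sketch of block 748: the planar depiction is aligned with the
    x_v-y_v plane, so projecting keeps the x and y components of each
    3D point and drops z."""
    return [(x, y) for (x, y, _z) in points]

def near_boundary(points_2d, radius, margin):
    """Sketch of block 750: flag any projected point within `margin` of a
    circular boundary of the given radius, so that an active constraint
    alert (color change, symbol, audible tone, haptic feedback) can be
    raised at block 752."""
    return any(math.hypot(x, y) >= radius - margin for (x, y) in points_2d)
```

In this sketch the same test is applied once with the positioning device points against boundary 406 and once with the end effector points against boundary 402, matching the two checks described above.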
(119) Depth
(120) The instrument depth range 416 depiction shown in FIG. 5 is generated as follows. The depth range is taken along an axis 492 shown in FIG. 6, with the end 424 of the range corresponding to a maximum depth 494 of the end effector 210 within the area 488. The end 426 of the instrument depth range 416 corresponds to a minimum depth 496 of the end effector 210 within area 488. The input device depth range 432 similarly corresponds to the portion of the hatched area 482 along the axis 492. The current depth indicator 420 is positioned on the instrument depth range 416 at a location corresponding to the z value of the end effector position 560. In one embodiment, the microprocessor 250 may cause an active constraint indication to be generated when the end effector 210 is proximate either end 424 or 426 of the input device depth range. The alert may take the form of an audible alert, visual alert displayed on the display 122, or haptic feedback through the right input device 116 and hand controller 112. The instrument depth range 418 is similarly generated for the left side instrument 212.
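Placing the current depth indicator and detecting the end-of-range condition amounts to a simple normalization, sketched here (an illustrative Python sketch; the margin value and all names are assumptions, not the patent's implementation):

```python
def depth_status(z, z_min, z_max, margin=0.02):
    """Sketch of the depth depiction: map the end effector z value onto the
    depicted depth range as a 0-1 fraction for the indicator 420, and flag
    proximity to either end of the range so an active constraint alert can
    be raised. `margin` is the flagged fraction at each end."""
    frac = (z - z_min) / (z_max - z_min)           # indicator position
    at_limit = frac <= margin or frac >= 1.0 - margin
    return frac, at_limit
```

The same mapping, applied over the sub-interval covered by the input device depth range 432, would place the indicator relative to the hatched region rather than the full instrument range.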
(121) Rotation
(122) The instrument rotation range 440 shown in FIG. 5 is generated from the configuration variable γ (i.e. the rotation of the end effector 210 about the axis z.sub.2, as shown in FIG. 11). The “γ” indicator represents the current rotation angle γ of the end effector 210, where the vertical line 444 is taken as the reference corresponding to the right hand controller 112 being held in a generally un-rotated position. The instrument rotation range 440 has an extent that corresponds to the extent of the range of rotation provided by the hand controller 112. The instrument rotation range 440 may also be offset depending on the mapping between the input device workspace and the surgical workspace. For example, after the footswitch 134 has been depressed the hand controller 112 may be rotated prior to releasing the footswitch 134, thus offsetting the working rotational range, as shown in FIG. 5.
(123) Positioning Device Active Constraints
(124) The intermediate positions of the positioning device 209 of the right side instrument 208 calculated as described define the 3D location of the positioning device 209 of the instrument 208 within the surgical workspace (shown at 484 in FIG. 5). For each intermediate location of the vertebra 550, the microprocessor 250 determines whether the location is proximate a portion of the 3D boundary surface 485 of the surgical workspace 484. Examples of the graphical depictions for positions of the positioning device 209 of the instruments 208 and 212 are shown in FIG. 14. Referring to FIG. 14, the first example 800 shows the graphical depictions 136 and 138 for the instruments 208 and 212 in the start position after insertion where the positioning devices 209 and 213 are in a substantially straight position as shown in the side view of the instrument to the left of the depictions. The graphical depictions 136 and 138 depict the positioning devices 209 and 213 as respective dots located at the center.
(125) In the next example 802, the positioning device 209 has been moved up and the positioning device 213 has been moved down and intermediate locations at 804 are determined by the microprocessor 250 to be proximate upper and lower portions of the boundary surface 485. The dots depicting the instruments 208 and 212 are shown at locations proximate the boundary. An alert may be generated by coloring portions of the boundary in a conspicuous color to indicate the condition to the surgeon.
(126) An example of left/right limits for the positioning devices 209 and 213 is shown at 806. In the example shown at 808, the positioning devices 209 and 213 are positioned generally as in the example 806 but with the end effectors 210 and 214 turned outwardly. The end effectors 210 and 214 are located proximate the boundary surface 489 of the region 488 shown in FIG. 5 and are depicted by the indicators 408 and 410 respectively. The positioning devices 209 and 213 are represented by the areas 412 and 414 respectively. Alerts may be generated and depicted as conspicuously colored regions at the boundary surface 489 to indicate the condition to the surgeon.
(127) An example 810 shows the instruments 208 and 212 slightly turned in so that the end effector indicators 408 and 410 and the areas 412 and 414 are visible. In the example 812, the end effectors 210 and 214 remain turned inwardly while the positioning devices 209 and 213 have reached the upper and lower limits as shown at 814. In example 816, the end effectors 210 and 214 have turned outwardly and are proximate respective upper and lower portions of the 3D boundary surface 489. In the final example 818, a situation similar to example 812 is shown for the left/right limits to positioning device movement.
(128) While specific embodiments have been described and illustrated, such embodiments should be considered illustrative of the invention only and not as limiting the invention as construed in accordance with the accompanying claims.
Claims
1. A method for schematically representing a spatial position of an instrument used in a robotic surgery system, the instrument including an end effector coupled to an articulated arm configured to spatially position the end effector in a surgical workspace in response to input signals generated by movement of a hand controller of an input device in an input device workspace, the method comprising: by a processor circuit, calculating a current three-dimensional spatial position of the instrument within the surgical workspace for current input signals received from the input device; by the processor circuit, causing a display to display a graphical depiction of the surgical workspace, the graphical depiction including a planar representation comprising: an instrument movement region indicating a range of movement of the instrument within the surgical workspace, the instrument movement region representing a virtual depiction of the input device workspace and including a non-anatomical boundary indicating limitations to the range of movement of the instrument within the surgical workspace; and a two-dimensional projection of a current spatial position of the articulated arm and the end effector onto the planar representation; by the processor circuit, receiving an enablement signal including an active state and an inactive state, the active state permitting movement of the instrument in response to the input signals and the inactive state locking the instrument in the current spatial position and inhibiting movement of the instrument to facilitate repositioning of the hand controller within the input device workspace; by the processor circuit, in response to the enablement signal transitioning from the active state to the inactive state, causing the display to display a current hand controller position indicator on the graphical depiction as an offset from the two-dimensional projection of a current spatial position of the end effector; and by the processor circuit, 
in response to the enablement signal transitioning from the inactive state to the active state, causing the display to discontinue displaying the current hand controller position indicator while causing the display to continue displaying the planar representation including the instrument movement region and the two-dimensional projection of the current spatial position of the articulated arm and the end effector.
2. The method of claim 1 wherein in the two-dimensional projection the end effector is represented by an indicator and the articulated arm is represented by an area corresponding to two dimensional projected extents of at least a portion of the articulated arm.
3. The method of claim 1 further comprising generating the non-anatomical boundary by: defining a three-dimensional boundary within the surgical workspace; and generating a two-dimensional projection of the three-dimensional boundary onto the planar representation.
4. The method of claim 1 further comprising, by the processor circuit, in response to a determination that the instrument is proximate the non-anatomical boundary of the instrument movement region, causing the display to display an active constraint indication at the non-anatomical boundary.
5. The method of claim 1 wherein the robotic surgery system comprises a plurality of instruments within the surgical workspace and wherein displaying the graphical depiction comprises displaying a graphical depiction for each of the plurality of instruments.
6. The method of claim 1 wherein displaying the graphical depiction comprises displaying the graphical depiction at a peripheral region of the display.
7. The method of claim 1 wherein the graphical depiction further comprises: an instrument depth range indicating limitations to axial movement of the instrument into the surgical workspace; an indicator representing a current depth of the end effector within the instrument depth range; and an input device depth range representing a portion of the instrument depth range that is accessible by the instrument based on a current mapping between the input device workspace and the surgical workspace, wherein the input device workspace defines a limited range of the instrument depth range being accessible by the instrument.
8. The method of claim 7 further comprising, by the processor circuit, in response to a determination that the end effector is proximate an end of the input device depth range, causing the display to display an active constraint indication.
9. The method of claim 7 wherein the input device depth range is depicted as a hatched region superimposed on a depiction of the instrument depth range.
10. The method of claim 1 wherein the input signals include a rotation signal defining a current rotation of the hand controller, the rotation signal being operable to cause rotation of the end effector in the surgical workspace, and wherein the graphical depiction further comprises: an instrument rotation range indicating limitations on rotational movement of the instrument; an indicator representing a current rotation of the end effector; and an input device rotation range representing a portion of the instrument rotation range that is accessible for a current mapping between the input device workspace and the surgical workspace.
11. The method of claim 10 further comprising by the processor circuit: in response to the enablement signal transitioning from the active state to the inactive state, causing the display to display a current hand controller rotation indicator on the graphical depiction as an offset from the indicator representing a current rotation of the end effector; and in response to the enablement signal transitioning from the inactive state to the active state, causing the display to discontinue displaying the current hand controller rotation indicator.
12. The method of claim 1 wherein the non-anatomical boundary is depicted as a hemisphere.
13. The method of claim 1 wherein the instrument movement region further includes at least one keep-out zone identifying a first region of the surgical workspace positioned within a second region of the surgical workspace circumscribed by the non-anatomical boundary, the at least one keep-out zone indicating a region that may not be accessed by the instrument.
14. An apparatus for schematically representing a spatial position of an instrument used in a robotic surgery system, the instrument including an end effector coupled to an articulated arm configured to spatially position the end effector in a surgical workspace in response to input signals generated by movement of a hand controller of an input device in an input device workspace, the apparatus comprising: a display; and a processor circuit configured to: calculate a current three-dimensional spatial position of the instrument within the surgical workspace for current input signals received from the input device; cause the display to display a graphical depiction of the surgical workspace on a display, the graphical depiction including a planar representation comprising: an instrument movement region indicating a range of movement of the instrument within the surgical workspace, the instrument movement region representing a virtual depiction of the input device workspace and including a non-anatomical boundary indicating limitations to the range of movement of the instrument within the surgical workspace; and a two-dimensional projection of a current spatial position of the articulated arm and the end effector onto the planar representation; receive an enablement signal, the enablement signal including an active state and an inactive state, the active state permitting movement of the instrument in response to the input signals and the inactive state locking the instrument in the current spatial position and inhibiting movement of the instrument to facilitate repositioning of the hand controller within the input device workspace; in response to the enablement signal transitioning from the active state to the inactive state, cause the display to display a current hand controller position indicator on the graphical depiction as an offset from the two-dimensional projection of a current spatial position of the end effector; and in response to the enablement signal 
transitioning from the inactive state to the active state, cause the display to discontinue displaying the current hand controller position indicator while causing the display to continue displaying the planar representation including the instrument movement region and the two-dimensional projection of the current spatial position of the articulated arm and the end effector.
15. The apparatus of claim 14 wherein the processor circuit is configured to cause the display to display an active constraint indication at the non-anatomical boundary in response to a determination that the instrument is proximate the non-anatomical boundary of the instrument movement region.
16. The apparatus of claim 14 wherein the graphical depiction further comprises: an instrument depth range indicating limitations to axial movement of the instrument into the surgical workspace; an indicator representing a current depth of the end effector within the instrument depth range; and an input device depth range representing a portion of the instrument depth range that is accessible by the instrument based on a current mapping between the input device workspace and the surgical workspace, wherein the input device workspace defines a limited range of the instrument depth range being accessible by the instrument.
17. The apparatus of claim 16 wherein the processor circuit is configured to cause the display to display an active constraint indication in response to a determination that the end effector is proximate an end of the input device depth range.
18. The apparatus of claim 14 wherein the input signals include a rotation signal defining a current rotation of the hand controller, the rotation signal being operable to cause rotation of the end effector in the surgical workspace, and wherein the graphical depiction further comprises: an instrument rotation range indicating limitations on rotational movement of the instrument; an indicator representing a current rotation of the end effector; and an input device rotation range representing a portion of the instrument rotation range that is accessible for a current mapping between the input device workspace and the surgical workspace.
19. The apparatus of claim 18 wherein the processor circuit is further configured to: in response to the enablement signal transitioning from the active state to the inactive state, cause the display to display a current hand controller rotation indicator on the graphical depiction as an offset from the indicator representing a current rotation of the end effector; and in response to the enablement signal transitioning from the inactive state to the active state, cause the display to discontinue displaying the current hand controller rotation indicator.
20. The apparatus of claim 14 wherein the instrument movement region further includes at least one keep-out zone identifying a first region of the surgical workspace positioned within a second region of the surgical workspace circumscribed by the non-anatomical boundary, the at least one keep-out zone indicating a region that may not be accessed by the instrument.
21. The method of claim 13 wherein the at least one keep-out zone is defined based on input received from an operator and patient imaging data.
22. A non-transitory computer readable medium storing instructions that, when executed by a processor circuit of a robotic surgery system, direct the processor circuit to represent a spatial position of an instrument used in a robotic surgery system, the instrument including an end effector coupled to a positioning device for spatially positioning the end effector in a surgical workspace in response to input signals generated by movement of a hand controller of an input device in an input device workspace, the instructions further directing the processor circuit to: calculate a current three-dimensional spatial position of the instrument within the surgical workspace for current input signals received from the input device; cause a display to display a graphical depiction of the surgical workspace, the graphical depiction including a planar representation comprising: an instrument movement region indicating a range of movement of the instrument within the surgical workspace, the instrument movement region representing a virtual depiction of the input device workspace and including a non-anatomical boundary indicating limitations to the range of movement of the instrument within the surgical workspace; and a two-dimensional projection of a current spatial position of the positioning device and the end effector onto the planar representation; receive an enablement signal including an active state and an inactive state, the active state permitting movement of the instrument in response to the input signals and the inactive state locking the instrument in the current spatial position and inhibiting movement of the instrument to facilitate repositioning of the hand controller within the input device workspace; in response to the enablement signal transitioning from the active state to the inactive state, cause the display to display a current hand controller position indicator on the graphical depiction as an offset from the two-dimensional projection of a current 
spatial position of the end effector; and in response to the enablement signal transitioning from the inactive state to the active state, cause the display to discontinue displaying the current hand controller position indicator while causing the display to continue displaying the planar representation including the instrument movement region and the two-dimensional projection of the current spatial position of the positioning device and the end effector.
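The enablement-signal behavior recited in claims 14 and 22 above amounts to a small clutch state machine: disengaging locks the instrument and shows the offset between the frozen end-effector projection and the live hand-controller position; re-engaging hides the indicator. A minimal sketch of that behavior, with all names and the 2-D tuple representation chosen here for illustration (none of them come from the patent):

```python
# Hypothetical sketch of the clutch/enablement display logic described in
# the claims. Inactive state: instrument is locked, controller offset shown.
# Active state: instrument follows the controller, offset indicator hidden.

class ClutchDisplay:
    def __init__(self):
        self.enabled = True
        self.effector_xy = (0.0, 0.0)    # 2-D projection of the end effector
        self.offset_indicator = None     # shown only while disengaged

    def on_hand_controller(self, xy):
        if self.enabled:
            # Active: the instrument tracks the controller; no indicator.
            self.effector_xy = xy
            self.offset_indicator = None
        else:
            # Inactive: the instrument stays put; display the offset between
            # the controller and the frozen end-effector projection.
            dx = xy[0] - self.effector_xy[0]
            dy = xy[1] - self.effector_xy[1]
            self.offset_indicator = (dx, dy)

    def set_enabled(self, enabled):
        self.enabled = enabled
        if enabled:
            # Transition to active: discontinue the offset indicator while the
            # movement region and projection remain displayed.
            self.offset_indicator = None
```

This mirrors why the indicator exists at all: during clutched repositioning the surgeon needs to see how far the hand controller has drifted from the locked instrument before re-engaging.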
added a bit!
who knows, if they sell some pipeline... type RTX-321
worth less than the money they have
they will meet again at the AAGL
“We look forward to meeting with nationally renowned gynecologic surgeons to learn more about their surgical experience with traditional methods of surgery and multi-port RAS systems. Initiating and continuing conversations with potential users and purchasers of the Enos™ single-access RAS system will further clarify our next steps in advancing new and disruptive technology in minimally invasive gynecologic surgery, and how Titan may fit into future surgical practice,” Mr. Vance continued. “We also plan to meet with other minimally invasive surgical companies to build awareness of our platform and discuss opportunities for collaboration to improve outcomes for women’s health in gynecologic surgery, as a high unmet need exists in optimizing the recoveries of patients, post-surgery. The timing is perfect for these discussions as Titan prepares for our first Enos system delivery at our Chapel Hill facility next month,” concluded Mr. Vance.
https://ir.titanmedicalinc.com/news-events/press-releases/detail/371/titan-medical-announces-american-association-of-gynecologic
Boom Boom Boom!
INSTRUMENT CASSETTE ASSEMBLIES FOR ROBOTIC SURGICAL INSTRUMENTS
DOCUMENT ID: US 20220361971 A1
DATE PUBLISHED: 2022-11-17
INVENTOR INFORMATION
Boonzaier; James A. (Cape Town, ZA)
Hornsby; Jack A. (Tempsford, GB)
Ahuja; Akshaya (St. Neots, GB)
Laakso; Aki Hannu Einari (Raleigh, NC, US)
Pflaumer; Hans C. (Apex, NC, US)
Barton; Rupert A. (Cambridge, GB)
Turner; Adam R. (Cambridge, GB)
Smitheman; Paul (Cambridge, GB)
Brady; Matthew J. (Cambridge, GB)
APPLICANT INFORMATION
Titan Medical, Inc. (Toronto, CA), assignee
APPLICATION NO: 17/686749
DATE FILED: 2022-03-04
DOMESTIC PRIORITY (CONTINUITY DATA): us-provisional-application US 63188554 20210514
US CLASS CURRENT: 1/1
CPC CURRENT
CPCI: A61B 90/50 (2016-02-01)
CPCI: A61B 34/37 (2016-02-01)
CPCI: A61B 34/71 (2016-02-01)
CPCI: A61B 34/35 (2016-02-01)
CPCA: A61B 2034/301 (2016-02-01)
CPCA: A61B 2034/302 (2016-02-01)
Abstract
A surgical instrument of a robotic surgical system includes an elongated shaft assembly, an end effector, and an instrument cassette assembly. The elongated shaft assembly has a proximal end portion and a distal end portion. The end effector is supported on the distal end portion of the elongated shaft assembly. The instrument cassette assembly is supported on the proximal end portion of the elongated shaft assembly. The instrument cassette assembly includes a cassette housing and an actuator system supported in the cassette housing. The actuator system is operably coupled to the end effector for operating the end effector.
Background/Summary
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of U.S. Provisional Application Ser. No. 63/188,554, filed May 14, 2021, the entire contents of which are incorporated by reference herein.
TECHNICAL FIELD
[0002] This disclosure relates to robotic systems and, more particularly, to instrument cassettes for robotic surgical instruments.
BACKGROUND
[0003] Surgical instruments used in laparoscopic and/or robotic surgery generally have a proximally located actuating mechanism that may be used to actuate a distal end effector for performing a surgical task within a body cavity of a patient. Such instruments may be used in applications where there is an area of limited access for an operator. The distal end of the instrument may be inserted into the area of limited access and the operator may remotely and/or robotically manipulate the instrument via the actuator mechanism.
SUMMARY
[0004] In accordance with an aspect of this disclosure, a robotic surgical system includes a drive unit and a surgical instrument removably connected to the drive unit. The surgical instrument includes an elongated shaft assembly, an end effector, and an instrument cassette assembly. The elongated shaft assembly has a proximal end portion and a distal end portion. The end effector is supported on the distal end portion of the elongated shaft assembly. The instrument cassette assembly is supported on the proximal end portion of the elongated shaft assembly. The instrument cassette assembly includes a cassette housing and an actuator system supported in the cassette housing and operably coupled to the end effector for operating the end effector. The actuator system includes a cable actuator assembly, a shaft assembly defining a longitudinal axis, a rotation actuator assembly, and an axial actuator assembly. The cable actuator assembly includes a plurality of cables that extends from the cassette housing to the end effector for manipulating the end effector. The rotation actuator assembly is coupled to the shaft assembly and positioned to rotate the shaft assembly about the longitudinal axis for imparting rotational force to the end effector. The axial actuator assembly is coupled to the shaft assembly and positioned to axially translate the shaft assembly relative to the longitudinal axis for imparting axial force to the end effector.
[0005] In aspects, the cable actuator assembly may include a crank, a first slider, and a second slider, the first and second sliders coupled to the crank. The crank may be rotatable to linearly translate the first and second sliders relative to one another. The first slider may support a first cable of the plurality of cables and the second slider may support a second cable of the plurality of cables. The crank may be coupled to a driver that is engaged with the drive unit. The driver may be configured to impart rotational force on the crank.
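Paragraph [0005] describes a crank whose rotation translates the two cable sliders relative to one another, tensioning one cable while paying out the other. A rough kinematic sketch of that antagonistic motion, assuming diametrically opposed crank pins and a pin-in-slot coupling (the radius parameter and the equal-and-opposite model are illustrative assumptions, not taken from the disclosure):

```python
import math

def slider_offsets(crank_angle_rad, pin_radius):
    """Antagonistic slider travel for a crank with diametrically opposed
    pins: the first slider advances while the second retreats by the same
    amount. Only the pin-motion component along the slide axis matters;
    the slot absorbs the rest (an assumed geometry)."""
    d = pin_radius * math.sin(crank_angle_rad)
    return d, -d  # first and second slider displacements

# Equal and opposite travel keeps the total cable length roughly constant,
# which is what lets one crank drive an opposing cable pair.
```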
[0006] In aspects, the rotation actuator assembly may include a drive wheel and a belt drive shaft supporting a belt. The belt may be coupled to the shaft assembly and the drive wheel may be coupled to the belt drive shaft. The drive wheel and the belt drive shaft may be disposed transverse to one another. The drive wheel may be configured to rotate the belt drive shaft. Rotation of the belt drive shaft may rotate the belt to rotate the shaft assembly.
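The belt transmission in [0006] behaves like an ordinary pulley pair: with no belt slip, equal belt travel on both pulleys fixes the rotation ratio at the quotient of the pulley radii. A sketch under that no-slip assumption (the radii and function names are illustrative; no dimensions appear in the text):

```python
import math

def shaft_turns(belt_drive_shaft_turns, belt_drive_radius, shaft_pulley_radius):
    """Rotation passed from the belt drive shaft to the instrument shaft
    assembly, assuming a no-slip belt: equal linear belt travel on both
    pulleys gives a ratio equal to the radii quotient."""
    belt_travel = belt_drive_shaft_turns * 2 * math.pi * belt_drive_radius
    return belt_travel / (2 * math.pi * shaft_pulley_radius)
```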
[0007] In aspects, the axial actuator assembly may include a drive disc, a drive arm coupled to the drive disc, and a drive plate coupled to the drive arm and to the shaft assembly. In aspects, the drive arm may include a first pin coupled to the drive disc and a second pin coupled to the drive plate. The drive plate may define a pin slot that receives the second pin. The second pin may be slidable along the pin slot to axially translate the drive plate and the shaft assembly as the drive disc rotates.
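The drive-disc, drive-arm, and pin-slot arrangement in [0007] reads like a Scotch yoke: the pin's circular path is converted to pure axial translation of the drive plate, with the transverse component absorbed by the pin slot. A sketch under that reading (the Scotch-yoke interpretation and the radius parameter are assumptions):

```python
import math

def plate_position(disc_angle_rad, pin_radius):
    """Axial position of the drive plate for a pin mounted at pin_radius on
    the rotating drive disc; the plate's pin slot absorbs the transverse
    component, so only the axial component of the pin's circular motion
    translates the plate and shaft assembly."""
    return pin_radius * math.sin(disc_angle_rad)

# Full axial stroke over one disc revolution is twice the pin radius.
```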
[0008] According to one aspect, this disclosure is directed to a surgical system including a cassette housing and an actuator system. The actuator system is supported in the cassette housing and includes a cable actuator assembly, a shaft assembly, a rotation actuator assembly, and an axial actuator assembly. The cable actuator assembly includes a plurality of cables. The shaft assembly defines a longitudinal axis. The rotation actuator assembly is coupled to the shaft assembly and positioned to rotate at least a portion of the shaft assembly about the longitudinal axis. The axial actuator assembly is coupled to the shaft assembly and positioned to axially translate at least a portion of the shaft assembly relative to the longitudinal axis.
[0009] According to another aspect, this disclosure is directed to a surgical instrument for a robotic surgical system. The surgical instrument includes an elongated shaft assembly, an end effector, and an instrument cassette assembly. The elongated shaft assembly has a proximal end portion and a distal end portion. The end effector is supported at the distal end portion of the elongated shaft assembly. The instrument cassette assembly is supported on the proximal end portion of the elongated shaft assembly. The instrument cassette assembly includes a cassette housing and an actuator system. The actuator system is supported in the cassette housing and operably coupled to the end effector for operating the end effector. The actuator system includes a cable actuator assembly, a shaft assembly, a rotation actuator assembly, and an axial actuator assembly. The cable actuator assembly includes a plurality of cables that extends from the cassette housing to the end effector for manipulating the end effector. The shaft assembly defines a longitudinal axis. The rotation actuator assembly is coupled to the shaft assembly and positioned to rotate the shaft assembly about the longitudinal axis for imparting rotational force to the end effector. The axial actuator assembly is coupled to the shaft assembly and positioned to axially translate the shaft assembly relative to the longitudinal axis for imparting axial force to the end effector.
[0010] According to still another aspect, this disclosure is directed to a robotic surgical system. The robotic surgical system includes a drive unit and a surgical instrument removably connected to the drive unit. The surgical instrument includes an elongated shaft assembly, an end effector, and an instrument cassette assembly. The elongated shaft assembly has a proximal end portion and a distal end portion. The end effector is supported on the distal end portion of the elongated shaft assembly. The instrument cassette assembly is supported on the proximal end portion of the elongated shaft assembly. The instrument cassette assembly includes a cassette housing and an actuator system. The actuator system is supported in the cassette housing and is operably coupled to the end effector for operating the end effector. The actuator system includes a cable actuator assembly including a spindle, an upper crank, and a lower crank. The upper crank is coupled to a first cable and the lower crank is coupled to a second cable. The upper and lower cranks are movable along the spindle to move the first and second cables for manipulating the end effector.
[0011] In aspects, the first and second cables may be movable relative to one another.
[0012] In aspects, the drive unit may rotate the spindle about a spindle axis. Rotation of the spindle may cause the upper and lower cranks to translate along the spindle axis. The upper and lower cranks may translate in opposite directions along the spindle axis.
[0013] In aspects, the second cable may extend through the lower crank. The lower crank may include a spine through which the second cable slides as the upper crank moves relative to the lower crank.
[0014] In aspects, the upper crank may define a first spiral passage through an outer surface thereof. The lower crank may define a second spiral passage through an outer surface thereof. The second spiral passage may turn in an opposite direction than the first spiral passage. The spindle may include a first pin that slides through the first spiral passage and a second pin that slides through the second spiral passage.
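The opposite-handed spiral passages of [0014] act like two lead screws sharing one spindle: each spindle turn advances the upper crank by one pitch and the lower crank by the same amount in the opposite direction, pulling one cable while releasing the other. A sketch (the pitch value and linear lead-screw model are assumptions):

```python
import math

def crank_travel(spindle_angle_rad, pitch):
    """Axial travel of the upper and lower cranks for a given spindle
    rotation; opposite-handed spiral passages with equal pitch give
    equal and opposite translation along the spindle axis."""
    d = pitch * spindle_angle_rad / (2 * math.pi)
    return d, -d  # upper crank, lower crank

# One full spindle turn separates the cranks by two pitches in total.
```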
[0015] According to another aspect, this disclosure is directed to a surgical system. The surgical system includes a cassette housing and an actuator system. The actuator system is supported in the cassette housing. The actuator system includes a cable actuator assembly including a spindle, an upper crank, and a lower crank. The upper crank is coupled to a first cable. The lower crank is coupled to a second cable. The upper and lower cranks are movable along the spindle to move the first and second cables.
[0016] In aspects, the spindle may rotate about a spindle axis to cause the upper and lower cranks to translate along the spindle axis.
[0017] According to still another aspect, this disclosure is directed to a surgical instrument for a robotic surgical system. The surgical instrument includes an elongated shaft assembly, an end effector, and an instrument cassette assembly. The elongated shaft assembly has a proximal end portion and a distal end portion. The end effector is supported on the distal end portion of the elongated shaft assembly. The instrument cassette assembly is supported on the proximal end portion of the elongated shaft assembly. The instrument cassette assembly includes a cassette housing and a cable actuator assembly. The cable actuator assembly is supported in the cassette housing and includes a spindle, an upper crank, and a lower crank. The upper crank is coupled to a first cable. The lower crank is coupled to a second cable. The upper and lower cranks are translatable along the spindle to move the first and second cables for manipulating the end effector as the spindle rotates relative to the upper and lower cranks.
[0018] According to yet another aspect, this disclosure is directed to a robotic surgical system. The robotic surgical system includes a drive unit and a surgical instrument removably connected to the drive unit. The surgical instrument includes an elongated shaft assembly, an end effector, and an instrument cassette assembly. The elongated shaft assembly has a proximal end portion and a distal end portion. The end effector is supported on the distal end portion of the elongated shaft assembly. The instrument cassette assembly is supported on the proximal end portion of the elongated shaft assembly. The instrument cassette assembly includes a cassette housing and an actuator system. The actuator system is supported in the cassette housing and operably coupled to the end effector for operating the end effector. The actuator system includes a cable actuator assembly and a drive actuator assembly. The cable actuator assembly includes a crank that supports an upper slider and a lower slider. The upper and lower sliders are coupled to cables that extend to the end effector. The drive actuator assembly includes a rotation actuator assembly and an axial actuator assembly. The rotation actuator assembly has at least one spool that rotates an inner shaft assembly coupled to the end effector to impart rotational force to the end effector. The axial actuator assembly includes a pivotable clevis that moves an axial drive cable relative to the inner shaft assembly to impart axial force to the end effector.
[0019] In aspects, the crank may be coupled to the upper and lower sliders by first and second pins. The first pin may be slidably positioned within an elongated pin slot defined in the upper slider and the second pin may be slidably positioned within an elongated pin slot defined in the lower slider. The upper slider and lower slider may be positioned to translate in opposite directions as the crank rotates.
[0020] In aspects, the at least one spool of the rotation actuator assembly may include an input spool and an output spool that are coupled together by a rotation cable. The input spool may be nonrotatably coupled to a driver, the input spool configured to rotate when the driver rotates. Rotation of the input spool moves the rotation cable about the output spool to rotate the inner shaft assembly.
[0021] In aspects, the axial actuator assembly may include a threaded nut that is pinned to the pivotable clevis to enable the pivotable clevis to pivot relative to the threaded nut. The threaded nut may be threadedly coupled to a threaded driver. The threaded driver may be rotatable to cause the threaded nut to translate along the threaded driver. Translation of the threaded nut along the threaded driver may cause the pivotable clevis to pivot about a mounting protrusion such that the axial drive cable moves between extended and retracted positions relative to the inner shaft assembly.
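In [0021], spinning the threaded driver translates the nut, which pivots the clevis about its mounting protrusion; the swinging clevis arm then strokes the axial drive cable between its extended and retracted positions. A small-geometry sketch (the thread lead, arm lengths, and right-angle geometry are illustrative assumptions, not from the disclosure):

```python
import math

def clevis_pivot(driver_turns, thread_lead, pivot_arm_length):
    """Clevis pivot angle (radians) produced by nut travel along the
    threaded driver, treating the pinned nut as displacing the clevis
    arm tip perpendicular to the arm (assumed geometry)."""
    nut_travel = driver_turns * thread_lead
    return math.atan2(nut_travel, pivot_arm_length)

def cable_stroke(pivot_angle_rad, cable_arm_length):
    """Approximate axial cable stroke swept by the clevis arm that
    carries the axial drive cable."""
    return cable_arm_length * math.sin(pivot_angle_rad)
```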
[0022] According to one aspect, this disclosure is directed to a surgical system. The surgical system includes a cassette housing and an actuator system supported in the cassette housing. The actuator system includes a cable actuator assembly and a drive actuator assembly. The cable actuator assembly includes a crank that supports an upper slider and a lower slider. The upper and lower sliders are coupled to cables. The drive actuator assembly includes a rotation actuator assembly and an axial actuator assembly. The rotation actuator assembly has at least one spool that rotates an inner shaft assembly. The axial actuator assembly includes a pivotable clevis that moves an axial drive cable relative to the inner shaft assembly.
[0023] According to yet another aspect, this disclosure is directed to a surgical instrument for a robotic surgical system. The surgical instrument includes an elongated shaft assembly, an end effector, a cassette housing, and a drive actuator assembly. The elongated shaft assembly has a proximal end portion and a distal end portion. The elongated shaft assembly includes an inner shaft assembly. The end effector is supported on the distal end portion of the elongated shaft assembly. The cassette housing is supported on the proximal end portion of the elongated shaft assembly. The drive actuator assembly is supported in the cassette housing and is operably coupled to the end effector for operating the end effector. The drive actuator assembly includes a rotation actuator assembly and an axial actuator assembly. The rotation actuator assembly has a spool that rotates the inner shaft assembly to impart rotational force to the end effector. The axial actuator assembly includes a pivotable clevis that moves an axial drive cable relative to the inner shaft assembly to impart axial force to the end effector.
[0024] Other aspects, features, and advantages will be apparent from the description, the drawings, and the claims that follow.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0025] The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate aspects of this disclosure and, together with a general description of this disclosure given above, and the detailed description given below, explain the principles of this disclosure, wherein:
[0026] FIG. 1 is a perspective view of a robotic surgical system being used for a surgical procedure on a patient in accordance with the principles of this disclosure;
[0027] FIGS. 2-4 are progressive views illustrating surgical instruments of the robotic surgical system of FIG. 1 being manipulated within a body cavity of the patient;
[0028] FIG. 5 is an enlarged, perspective view of proximal portions of surgical instruments of one surgical instrument system of the robotic surgical system of FIG. 1;
[0029] FIG. 6 is an enlarged top view of FIG. 5;
[0030] FIG. 7 is a perspective view of one surgical instrument of the surgical instruments shown in FIG. 5;
[0031] FIG. 8 is an enlarged, perspective view of an instrument cassette assembly of the surgical instrument of FIG. 7 with portions thereof shown in phantom for clarity;
[0032] FIG. 9 is a perspective view, with parts separated, of the instrument cassette assembly of FIG. 8;
[0033] FIG. 10 is a perspective view, with parts separated, of an actuator system of the instrument cassette assembly of FIG. 9;
[0034] FIG. 11 is a perspective view of a crank of a cable actuator assembly of the actuator system of FIG. 10;
[0035] FIGS. 12 and 13 are progressive views illustrating cable actuator assemblies of the actuator system of FIG. 10 being actuated;
[0036] FIG. 14 is an enlarged view of the indicated area of detail shown in FIG. 8 and illustrating a rotation actuator assembly of the actuator system of FIG. 10 being actuated;
[0037] FIGS. 15-17 are progressive views illustrating an axial actuator assembly of the actuator system of FIG. 10 being actuated;
[0038] FIG. 18 is an enlarged, perspective view of proximal portions of surgical instruments of another surgical instrument system of the robotic surgical system of FIG. 1;
[0039] FIG. 19 is an enlarged top view of FIG. 18;
[0040] FIG. 20 is a perspective view, with parts separated, of another instrument cassette assembly of one of the surgical instruments of the surgical instrument system of FIG. 18, the instrument cassette assembly including an outer housing assembly and an inner housing assembly;
[0041] FIG. 21 is a perspective view, with parts separated, of the inner housing assembly of FIG. 20, the inner housing assembly including an actuator housing and an actuator assembly;
[0042] FIG. 22 is a perspective view of the actuator assembly of FIG. 21;
[0043] FIG. 23 is a perspective view, with parts separated, of the actuator assembly of FIG. 21, the actuator assembly shown with portions removed for clarity;
[0044] FIG. 24 is an enlarged, cross-sectional view as taken along section line 24-24 shown in FIG. 20;
[0045] FIG. 25 is an enlarged, cross-sectional view as taken along section line 25-25 shown in FIG. 21;
[0046] FIGS. 26 and 27 are progressive views illustrating the actuator assembly of FIG. 21 being actuated;
[0047] FIG. 28 is an enlarged, perspective view of proximal portions of surgical instruments of yet another surgical instrument system of the robotic surgical system of FIG. 1;
[0048] FIG. 29 is an enlarged, top view of FIG. 28;
[0049] FIG. 30 is a perspective view of one of the surgical instruments of FIG. 28;
[0050] FIG. 31 is an enlarged, perspective view of an instrument cassette assembly of the surgical instrument of FIG. 30;
[0051] FIG. 32 is an enlarged, perspective view of a cable actuator assembly of the instrument cassette assembly of FIG. 31;
[0052] FIG. 33 is an enlarged, perspective view, with parts separated, of the cable actuator assembly of FIG. 32;
[0053] FIGS. 34-36 are progressive views of the cable actuator assembly of FIG. 32 being actuated;
[0054] FIG. 37 is a perspective view of the instrument cassette assembly of FIG. 31 with portions thereof shown in phantom for clarity;
[0055] FIG. 38 is an enlarged perspective view, with parts separated, of a drive actuator assembly of the instrument cassette assembly of FIG. 31, the drive actuator assembly including an axial actuator assembly and a rotation actuator assembly, the rotation actuator assembly shown being actuated;
[0056] FIG. 39 is a perspective view, with parts separated, of the drive actuator assembly of FIG. 38; and
[0057] FIGS. 40-42 are progressive, cross-sectional views as taken along section line 40-40 shown in FIG. 37 and illustrating an actuation of the axial actuator assembly of the drive actuator assembly of FIG. 38.
DETAILED DESCRIPTION
[0058] Aspects of this disclosure are described in detail with reference to the drawings, in which like reference numerals designate identical or corresponding elements in each of the several views. As used herein, the term “distal” refers to that portion of a structure farther from the user, while the term “proximal” refers to that portion of a structure closer to the user. As used herein, the term “clinician” refers to a doctor, nurse, or other care provider and may include support personnel and/or equipment operators.
[0059] In the following description, well-known functions or constructions are not described in detail to avoid obscuring the present disclosure in unnecessary detail.
[0060] Robotic surgical systems have been used in minimally invasive medical procedures and can include robotic arm assemblies. Such procedures are commonly referred to as “Telesurgery.” Some robotic arm assemblies include one or more robot arms to which surgical instruments can be coupled. Such surgical instruments include, for example, endoscopes, electrosurgical forceps, cutting instruments, staplers, graspers, electrocautery devices, or any other endoscopic or open surgical devices. Prior to or during use of the robotic surgical system, various surgical instruments can be selected and connected to the robot arms for selectively actuating end effectors of the connected surgical instruments.
[0061] With reference to FIGS. 1-4, a robotic surgical system is shown generally at 10. Robotic surgical system 10 employs various robotic elements to assist the clinician and allow remote operation (or partial remote operation) of surgical instruments 100, 200, 300 of surgical instrument systems 50, 60, 70 of robotic surgical system 10. Various controllers, circuitry, robotic arms, gears, cams, pulleys, electric and mechanical motors, etc. may be employed for this purpose and may be designed with surgical system 10 to assist the clinician during an operation or treatment. Such robotic systems may include remotely steerable systems, automatically flexible surgical systems, remotely flexible surgical systems, remotely articulating surgical systems, wireless surgical systems, modular or selectively configurable remotely operated surgical systems, etc.
[0062] Robotic surgical system 10 includes a workstation 12 and an instrument cart 14. The instrument cart 14 includes one or more surgical instrument systems 50, 60, 70 mounted on a moveable drive unit 18 that houses an instrument drive assembly 20 for manipulating the surgical instrument systems 50, 60, 70 and/or independent surgical instruments 100, 200, 300 thereof with the assistance of, for example, one or more computing devices or controllers. The surgical instruments 100, 200, 300 can include, for example, graspers or forceps 26, which may be electrosurgical, an endoscope 28, and/or any other suitable instrument that can be driven by one or more associated tool drives (not shown) of instrument drive assembly 20. For example, besides graspers 26 and endoscope 28, the one or more surgical instruments 100, 200, 300 can include dexterous tools, such as grippers, needle drivers, staplers, dissectors, cutters, hooks, graspers, scissors, coagulators, irrigators, and suction devices, that are used for performing a surgical procedure.
[0063] Each surgical instrument system 50, 60, 70 includes an insertion tube 16 defining a plurality of separate conduits, channels or lumens 16a therethrough that are configured to receive, for instance, the surgical instruments 100, 200, 300 for accessing a body cavity “BC” of a patient “P.” In other aspects, the insertion tube 16 may define a single conduit, channel or lumen therethrough that is configured to receive, for instance, the surgical instruments 100, 200, 300 for accessing a body cavity “BC” of a patient “P.” In particular, the insertion tube 16 can be inserted through an incision “I” and/or access device 17 (e.g., a surgical portal, which may include one or more seals to facilitate sealed insertion through tissue “T” of the patient “P”) and into the body cavity “BC” of the patient “P.” With insertion tube 16 positioned in the patient “P,” the surgical instruments 100, 200, 300 can be advanced through insertion tube 16 into the body cavity “BC” of the patient “P.” Further, the workstation 12 includes an input device 22 for use by a clinician for controlling the insertion tube 16 and the various surgical instrument systems 50, 60, 70 (and surgical instruments 100, 200, 300 thereof) via the instrument drive assembly 20 to perform surgical operations on the patient “P” while the patient “P” is supported on a surgical table 24, for example. Input device 22 is configured to receive input from the clinician and to produce input signals. Input device 22 may also be configured to generate feedback to the clinician. The feedback can be visual, auditory, haptic, or the like.
[0064] The workstation 12 can further include computing devices and/or controllers such as a master processor circuit 22a in communication with the input device 22 for receiving the input signals and generating control signals for controlling the robotic surgical system 10, which can be transmitted to the instrument cart 14 via an interface cable 22b. In some cases, transmission can be wireless and interface cable 22b may not be present. The input device 22 can include right and left-hand controls (not shown) and/or foot pedals (not shown), which are moved/operated to produce input signals at the input device 22 and/or to control robotic surgical system 10. The instrument cart 14 can include a slave processor circuit 20a that receives the control signals from the master processor circuit 22a and produces slave control signals operable to control the various surgical instrument systems 50, 60, 70 (and surgical instruments 100, 200, 300 thereof) during a surgical procedure. The workstation 12 can also include a user interface, such as a display (not shown) in communication with the master processor circuit 22a for displaying information (such as body cavity images) for a region or site of interest (for example, a surgical site, a body cavity, or the like) and other information to a clinician. While both master and slave processor circuits are illustrated, in other aspects, a single processor circuit may be used to perform both master and slave functions.
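The master/slave split described above can be sketched as a simple two-stage signal pipeline. This is only an illustrative model: the function names, the motion-scaling factor, and the signal format are assumptions, since the text does not specify the control laws running on processor circuits 22a and 20a.

```python
# Illustrative sketch of the master/slave control split: input signals from
# input device 22 are turned into control signals by the master processor
# circuit (22a), which the slave processor circuit (20a) maps to per-drive
# commands. Scaling factor and signal format are assumed for illustration.

def master_process(input_signals: dict) -> dict:
    """Master stage: map hand-controller input to control signals."""
    return {axis: value * 0.5 for axis, value in input_signals.items()}  # assumed motion scaling

def slave_process(control_signals: dict) -> dict:
    """Slave stage: map control signals to slave control signals per drive."""
    return {f"drive_{axis}": value for axis, value in control_signals.items()}

hand_input = {"x": 10.0, "y": -4.0}      # clinician moves the hand control
control = master_process(hand_input)      # sent toward instrument cart 14
drive_commands = slave_process(control)   # operable to drive the instruments
```

In this sketch, replacing the two functions with a single composed function corresponds to the single-processor-circuit variant mentioned at the end of the paragraph.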
[0065] Turning now to FIGS. 5-17, surgical instrument system 50 of robotic surgical system 10 includes insertion tube 16 and a plurality of surgical instruments 100 that are insertable through insertion tube 16. Although only three surgical instruments 100 are shown, surgical instrument system 50 can include any number and/or type of surgical instruments such as graspers 26 and endoscope 28 as noted above.
[0066] As seen in FIGS. 6 and 7, surgical instrument 100 of surgical instrument system 50 defines a longitudinal axis “L” and includes an instrument cassette assembly 102 on a proximal end portion thereof, an elongated shaft assembly 104 that extends distally from instrument cassette assembly 102, and an end effector 106 supported on a distal end portion of elongated shaft assembly 104. End effector 106 is actuatable by instrument cassette assembly 102 for effectuating a surgical procedure. Indeed, actuating end effector 106 can cause end effector 106 to, for example, articulate, pivot, clamp, rotate, etc. relative to the longitudinal axis “L” of surgical instrument 100 for repositioning end effector 106 and/or for treating tissue “T” of the patient “P” as noted above (see FIGS. 2-4).
[0067] With reference to FIG. 8, instrument cassette assembly 102 of surgical instrument 100 includes a cassette housing 108 that supports an actuator system 110 and is coupled to elongated shaft assembly 104. Actuator system 110 includes a plurality of cable actuator assemblies 112, a rotation actuator assembly 114, and an axial actuator assembly 116.
[0068] As seen in FIG. 9, cassette housing 108 of instrument cassette assembly 102 includes a first housing 108a, a second housing 108b, and a lid 108c that couple together to support the actuator system 110 therein and secure cassette housing 108 to elongated shaft assembly 104.
[0069] With reference to FIGS. 10-13, each cable actuator assembly 112 of the actuator system 110 of instrument cassette assembly 102 includes a crank 112a that supports a bearing 112b and a driver 112c on a first end of the crank 112a. The crank 112a further supports a first slider 112d and a second slider 112e on a second end of the crank 112a. The first slider 112d is supported on a first side of the crank 112a and the second slider 112e is supported on a second side of the crank 112a opposite to the first side of the crank 112a. The first slider 112d supports a first cable 112g and the second slider 112e supports a second cable 112f. The first slider 112d defines a first slot 112h and the second slider 112e defines a second slot 112k. The crank 112a includes a plate 112m having a central bearing prong 112n extending from the first end thereof. The central bearing prong 112n is received through the bearing 112b and is nonrotatably coupled to driver 112c of the cable actuator assembly 112. In aspects, central bearing prong 112n of the crank 112a can be keyed to a bore 112x defined in driver 112c on the second end of driver 112c to enable crank 112a and driver 112c to be press-fit together. Central bearing prong 112n of crank 112a and bore 112x of driver 112c can have any suitable counterpart geometry (e.g., square, triangle, star, chamfer, bevel, fillet, edge, groove, etc.) to enable driver 112c to impart rotational driving force to crank 112a via the nonrotatable coupling of driver 112c and crank 112a. In aspects, driver 112c can be secured to crank 112a via any suitable technique such as sonic welding, adhesive, fastener, snap-fit, etc., or combinations thereof such that driver 112c can rotate crank 112a about a central pivot axis “P1.” Briefly, as seen in FIG. 5, driver 112c defines a drive slot 112y therein to receive rotational drive force from moveable drive unit 18.
[0070] With continued reference to FIGS. 10-13, the plate 112m of crank 112a further includes a shoulder 112p having a first finger 112q extending from a second side of the plate 112m, and a second finger 112r extending from the first side of the plate 112m and recessed from the first finger 112q. The first and second fingers 112q, 112r of crank 112a are received within first and second slots 112h, 112k of the respective first and second sliders 112d, 112e. Crank 112a is configured to move (e.g., translation or linear movement, which may be reciprocating movement) first and second sliders 112d, 112e relative to one another, as indicated by arrows “F” and “G” as shown in FIG. 13, when crank 112a rotates in clockwise and/or counterclockwise directions as indicated by arrows “H” and “I.” Linear movement of first and second sliders 112d, 112e causes first and second cables 112f, 112g to actuate (e.g., articulate, elevate, fire, clamp, etc.) end effector 106 and/or jaw members thereof. Each cable actuator assembly 112 of the actuator system 110 may move independently of and/or together with one or more of the other cable actuator assemblies 112 to actuate/operate the end effector 106 as desired.
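The opposing slider travel produced by rotating crank 112a can be sketched with elementary trigonometry. The effective crank radius below is an assumed illustrative value (the patent gives no dimensions); the point is that one rotation input advances one slider and retracts the other by the same amount, so one cable of the pair is tensioned while the other is paid out.

```python
import math

# Sketch of crank-and-slider travel for cable actuator assembly 112:
# rotating crank 112a by theta moves the fingers in slots 112h, 112k,
# translating sliders 112d, 112e (and cables 112g, 112f) in opposite
# directions. The crank radius is an assumed illustrative value.

def slider_travel(theta_rad: float, crank_radius_mm: float = 5.0):
    d = crank_radius_mm * math.sin(theta_rad)  # linear travel of a finger/pin
    return d, -d                               # the two sliders move oppositely

forward, backward = slider_travel(math.radians(30))  # 30-degree crank rotation
```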
[0071] Referring to FIGS. 10-14, rotation actuator assembly 114 of the actuator system 110 includes a driver 114a, a bearing 114b, a drive wheel 114c, a belt 114d, a belt drive shaft 114e, and a belt drum 114f. Drive wheel 114c of rotation actuator assembly 114 includes a first bevel gear 114g. Belt drive shaft 114e of rotation actuator assembly 114 includes a second bevel gear 114h that is transverse to first bevel gear 114g and positioned to meshingly engage first bevel gear 114g of drive wheel 114c. Belt drive shaft 114e further includes a drum gear 114y supported adjacent to second bevel gear 114h and positioned to enable belt 114d to slide therealong as belt 114d rotates about belt drive shaft 114e. First bevel gear 114g is configured to rotate second bevel gear 114h about pin axis “P2,” as indicated by arrow “R1,” when first bevel gear 114g rotates about drive axis “D1,” as indicated by arrow “R2.” Rotation of second bevel gear 114h about pin axis “P2,” as indicated by arrow “R1,” causes belt 114d to rotate about belt drive shaft 114e and belt drum 114f, as indicated by arrow “R3.”
[0072] Belt drum 114f of rotation actuator assembly 114 is connected to a shaft assembly 115 including an outer shaft 115a and an inner shaft assembly 115b such that rotation of belt 114d causes belt drum 114f to rotate about shaft axis “A1” defined by shaft assembly 115, as indicated by arrow “R4.” Inner shaft assembly 115b includes a first inner shaft 115c and a second inner shaft 115d that is slidably advanceable through first inner shaft 115c along shaft axis “A1” of shaft assembly 115. First inner shaft 115c supports belt drum 114f on a first end thereof with the second end of first inner shaft 115c coupled to end effector 106 (FIG. 7) for imparting rotational movement/force on end effector 106. Belt drum 114f is positioned to rotate about shaft axis “A1” as belt 114d rotates about belt drum 114f. Further, like driver 112c of cable actuator assembly 112, driver 114a of rotation actuator assembly 114 includes a drive slot 112y on a first end thereof. Likewise, driver 114a of rotation actuator assembly 114 is nonrotatably coupled to drive wheel 114c, for example, via a central bearing prong 114n extending from a first end of drive wheel 114c. Central bearing prong 114n supports bearing 114b and may be mechanically coupled to the second end of driver 114a via a bore 114x of driver 114a (e.g., press-fit) and/or via sonic welding, adhesive, fastener, etc., or combinations thereof such that driver 114a imparts rotational movement on drive wheel 114c about drive axis “D1” as driver 114a rotates about drive axis “D1.”
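The rotation path described in the two paragraphs above (driver to bevel gear pair to belt to drum to shaft) is an ordinary gear/belt train, and its net ratio can be sketched with simple ratio arithmetic. The tooth counts and radii below are illustrative assumptions only; the patent specifies none.

```python
# Sketch of the rotation transfer in rotation actuator assembly 114:
# driver 114a -> first bevel gear 114g -> second bevel gear 114h ->
# belt 114d -> belt drum 114f -> shaft assembly 115. Tooth counts and
# pulley/drum radii are assumed illustrative values.

def shaft_turns(driver_turns: float,
                bevel_teeth_in: int = 20, bevel_teeth_out: int = 20,
                pulley_radius_mm: float = 4.0, drum_radius_mm: float = 8.0) -> float:
    belt_shaft_turns = driver_turns * bevel_teeth_in / bevel_teeth_out  # 90-degree bevel pair
    return belt_shaft_turns * pulley_radius_mm / drum_radius_mm         # belt reduction at drum

turns = shaft_turns(2.0)  # two turns of driver 114a, reduced 2:1 at the drum
```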
[0073] Referring to FIGS. 10-17, axial actuator assembly 116 of actuator system 110 includes a driver 116a, a bearing 116b, a drive disc 116c defining an elongated pin notch 116d, a drive arm 116e coupled to drive disc 116c, and a drive plate 116f coupled to drive arm 116e. Drive arm 116e includes a first pin 116g on a first end thereof that is slidably and rotatably received within elongated pin notch 116d of drive disc 116c to enable rotation of drive arm 116e about pivot axis “P3” defined through first pin 116g. Drive arm 116e further includes a second pin 116j on a second end thereof that is slidably and rotatably received within a pin slot 116k defined through drive plate 116f and disposed at an acute angle (e.g., 45 degrees) relative to shaft axis “A1.” Drive arm 116e is also configured to pivot about pin axis “P4” defined through second pin 116j as second pin 116j slides through pin slot 116k of drive plate 116f. Further, like driver 112c of cable actuator assembly 112, driver 116a of axial actuator assembly 116 includes a drive slot 112y on a first end thereof. Likewise, driver 116a of axial actuator assembly 116 is nonrotatably coupled to drive disc 116c, for example, via a central bearing prong 116n extending from a first end of drive disc 116c. Central bearing prong 116n supports bearing 116b and may be mechanically coupled to the second end of driver 116a via a bore 116x of driver 116a (e.g., press-fit) and/or via sonic welding, adhesive, fastener, etc., or combinations thereof such that driver 116a imparts rotational movement on drive disc 116c about drive axis “D2” of axial actuator assembly 116 as driver 116a rotates about drive axis “D2.”
[0074] As seen in FIGS. 15-17, axial actuator assembly 116 of actuator system 110 is positioned to move between an intermediate position (FIG. 15), a retracted position (FIG. 16), and an extended position (FIG. 17). In the intermediate position, drive arm 116e is parallel to belt 114d and orthogonal to shaft axis “A1” with first pin 116g supported on a first side of elongated pin notch 116d and second pin 116j substantially centered along pin slot 116k of drive plate 116f. Rotation of driver 116a in a first direction (e.g., clockwise), as indicated by arrow “C,” causes first pin 116g of drive arm 116e to slide to a second side (e.g., proximally) of elongated pin notch 116d, as indicated by arrow “S1.” Rotation of driver 116a in the first direction also causes second pin 116j of drive arm 116e to slide to a first side of pin slot 116k, such that drive plate 116f is urged toward the extended position (e.g., distally), as indicated by arrow “S2,” to impart distal axial movement to shaft assembly 115 and distal axial force and/or movement to end effector 106. Rotation of driver 116a in a second direction (e.g., counterclockwise), as indicated by arrow “CC,” causes second pin 116j of drive arm 116e to slide to a second side (e.g., distally) of pin slot 116k such that drive plate 116f is urged toward the retracted position (e.g., proximally), as indicated by arrow “S3” to impart proximal axial movement to shaft assembly 115 and proximal axial force and/or movement to end effector 106.
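The angled pin slot 116k is what converts the drive arm's transverse motion into axial travel of drive plate 116f, and the conversion ratio follows from the slot angle alone. The sketch below assumes an idealized rigid linkage and an illustrative pin displacement; at the 45-degree slot angle mentioned in paragraph [0073], transverse and axial travel are equal.

```python
import math

# Sketch of the angled-slot conversion in axial actuator assembly 116:
# for pin slot 116k at angle alpha to shaft axis "A1", a transverse pin
# displacement d yields an axial drive-plate displacement d / tan(alpha).
# The 3 mm pin travel is an assumed illustrative value.

def plate_travel(pin_transverse_mm: float, slot_angle_deg: float = 45.0) -> float:
    return pin_transverse_mm / math.tan(math.radians(slot_angle_deg))

axial = plate_travel(3.0)  # at 45 degrees the ratio is 1:1
```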
[0075] Turning now to FIGS. 18-27, surgical instrument system 60 of robotic surgical system 10 includes insertion tube 16 and a plurality of surgical instruments 200 that are insertable through insertion tube 16. Although only three surgical instruments 200 are shown, surgical instrument system 60 can include any number and/or type of surgical instruments such as graspers 26 and endoscope 28 as noted above.
[0076] As seen in FIG. 18, surgical instrument 200 of surgical instrument system 60 defines a longitudinal axis “L2” and includes an instrument cassette assembly 202 on a proximal end portion thereof and an elongated shaft assembly 204 that extends from instrument cassette assembly 202 to an end effector 106 (FIG. 7) supported on a distal end portion of elongated shaft assembly 204. End effector 106 is actuatable by instrument cassette assembly 202 for effectuating a surgical procedure. Indeed, actuating end effector 106 can cause end effector 106 to, for example, articulate, pivot, clamp, rotate, etc. relative to the longitudinal axis “L2” of surgical instrument 200 for repositioning end effector 106 and/or for treating tissue “T” of the patient “P” as noted above (see FIGS. 2-4).
[0077] With reference to FIGS. 20 and 21, instrument cassette assembly 202 of surgical instrument 200 includes an outer housing assembly 208 and an inner housing assembly 210 supported within outer housing assembly 208. Inner housing assembly 210 includes an actuator housing 212 and an actuator assembly 214 that is supported within inner housing assembly 210. Actuator housing 212 defines a plurality of actuator cavities 212a defined therein for receiving actuator assembly 214.
[0078] Turning now to FIGS. 20-24, actuator assembly 214 of inner housing assembly 210 includes a first set of cable actuator assemblies 216, a second set of cable actuator assemblies 218, and an axial actuator assembly 220 that are secured to a support plate 222. Notably, adjacent cable actuator assemblies 216, 218 may be disposed out of phase and/or offset from one another by, for example, 90 degrees (e.g., orthogonal to one another), and with axial actuator assembly 220 centered between first and second sets of cable actuator assemblies 216, 218 to conserve space and reduce size requirements of inner housing assembly 210. Support plate 222 defines bores 222a for supporting cable actuator assemblies 216, 218 and axial actuator assembly 220 therein, and cable passages 222b for receiving cables 224 of cable actuator assemblies 216, 218 therethrough. Although the first set of cable actuator assemblies 216 is shown to be longer than the second set of cable actuator assemblies 218, the second set of cable actuator assemblies 218 is otherwise substantially the same as the first set of cable actuator assemblies 216. Indeed, the first and second sets of cable actuator assemblies 216, 218 may have any suitable length. Axial actuator assembly 220 is coupled to a drive cable 226 that extends through a tube 228 extending from support plate 222 that couples to elongated shaft assembly 204.
[0079] Each cable actuator assembly 216, 218 of actuator assembly 214 includes a spindle 230, an upper crank 232, a lower crank 234, an upper pin 236a, a lower pin 236b, an upper bearing 238a, and a lower bearing 238b.
[0080] Spindles 230 of actuator assemblies 216, 218 include an upper peg 230a extending from a first end thereof and a lower peg 230b extending from a second end thereof. Upper and lower pegs 230a, 230b secure to upper and lower bearings 238a, 238b, respectively. Upper peg 230a is engageable with movable drive unit 18 to enable movable drive unit 18 to impart rotational drive force on spindles 230 (e.g., through drive couplers—not shown—of drive unit 18). Each spindle 230 further defines an upper pin passage 230c and a lower pin passage 230d that extend transversely through spindle 230 at longitudinally spaced-apart locations and are positioned to receive upper and lower pins 236a, 236b, respectively, in a transverse (e.g., an orthogonal) relationship with spindle 230.
[0081] Lower crank 234 of each cable actuator assembly 216, 218 defines a spindle passage 234a longitudinally and centrally therethrough for receiving spindle 230 therethrough. Lower crank 234 further defines a spiral channel 234b in an outer surface thereof. Spiral channel 234b slidably receives lower pin 236b of spindle 230 to enable lower crank 234 to axially slide along a lower portion of spindle 230, as indicated by arrows “AA” (e.g., distally) and “AB” (e.g., proximally) when spindle 230 rotates about spindle axis “SA” relative to lower crank 234, as indicated by arrows “RA” (e.g., clockwise) and “RB” (e.g., counterclockwise) shown in FIGS. 26 and 27. Lower crank 234 further includes a spine 234c that extends longitudinally along the outer surface of lower crank 234. Spine 234c defines cable channels 234d, 234e longitudinally therethrough that support cables 224. A first cable 224a of cables 224 is secured to lower crank 234 (e.g., via cable channel 234e) and translates in the same direction and simultaneously with lower crank 234 (e.g., as indicated by arrows “AA” and “AB”). A second cable 224b of cables 224 is slidably movable through cable channel 234d of lower crank 234 and relative to lower crank 234 as upper crank 232 translates relative to spindle 230. Second cable 224b is secured to upper crank 232 and movable with upper crank 232.
[0082] Upper crank 232 of each cable actuator assembly 216, 218 is substantially like lower crank 234 but includes a spiral channel 232a that turns along the outer surface thereof in an opposite direction as compared to spiral channel 234b of lower crank 234. And spiral channel 232a of upper crank 232 slidably receives upper pin 236a of spindle 230 to axially slide upper crank 232 along an upper portion of spindle 230, as indicated by arrows “BA” (e.g., proximally) and “BB” (e.g., distally), when spindle 230 rotates about spindle axis “SA” and relative to upper crank 232, as indicated by arrows “RA” (e.g., clockwise) and “RB” (e.g., counterclockwise) shown in FIGS. 26 and 27. Upper and lower cranks 232, 234 are positioned to translate in opposite axial directions relative to one another along spindle axis “SA” as upper and lower cranks 232, 234 rotate about spindle 230 and spindle axis “SA.” Notably, upper crank 232 further includes a spine 232x that supports and is secured to second cable 224b to enable second cable 224b to translate with upper crank 232 and relative to lower crank 234, as indicated by arrows “BA” and “BB.” Upper crank 232 is separate and disconnected from first cable 224a.
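The opposite-handed spiral channels act like a pair of lead screws: one spindle rotation translates each crank axially by the channel lead, in opposite directions, giving push-pull motion of the two cables. The lead value below is an assumed illustrative number; the patent gives no channel geometry.

```python
# Sketch of the helical-channel drive in cable actuator assemblies 216, 218:
# because spiral channels 232a and 234b turn in opposite directions, rotating
# spindle 230 translates upper crank 232 (with cable 224b) and lower crank 234
# (with cable 224a) equally in opposite axial directions. The channel lead
# (axial travel per turn) is an assumed illustrative value.

def crank_travel(spindle_turns: float, lead_mm_per_turn: float = 2.0):
    d = spindle_turns * lead_mm_per_turn
    return d, -d  # upper crank vs. lower crank travel

upper, lower = crank_travel(1.5)  # 1.5 turns of spindle 230
```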
[0083] Axial actuator assembly 220 is substantially like cable actuator assemblies 216, 218, but includes a spindle 230, an upper crank 232, an upper pin 236a, an upper bearing 238a, and a lower bearing 238b (e.g., there is no lower crank or lower pin). Upper crank 232 is coupled to drive cable 226 and axially translatable upon rotation of spindle 230 thereof to translate drive cable 226 and impart axial drive force through drive cable 226 to, for example, end effector 106.
[0084] Turning now to FIGS. 28-42, surgical instrument system 70 of robotic surgical system 10 includes insertion tube 16 and a plurality of surgical instruments 300 that are insertable through insertion tube 16. Although only three surgical instruments 300 are shown, surgical instrument system 70 can include any number and/or type of surgical instruments such as graspers 26 and endoscope 28 as noted above.
[0085] As seen in FIG. 30, surgical instrument 300 of surgical instrument system 70 defines a longitudinal axis “L3” and includes an instrument cassette assembly 302 on a proximal end portion thereof and an elongated shaft assembly 304 that extends from instrument cassette assembly 302 to an end effector 306 (FIG. 7) supported on a distal end portion of elongated shaft assembly 304. End effector 306 is actuatable by instrument cassette assembly 302 for effectuating a surgical procedure. Indeed, actuating end effector 306 can cause end effector 306 to, for example, articulate, pivot, clamp, rotate, etc. relative to the longitudinal axis “L3” of surgical instrument 300 for repositioning end effector 306 and/or for treating tissue “T” of the patient “P” as noted above with respect to end effector 106 (see FIGS. 2-4).
[0086] With reference to FIGS. 28-30, instrument cassette assembly 302 of surgical instrument 300 includes an outer housing assembly 308 that supports an ID board 310, a latch release mechanism 312 having a button release 312a for selectively removing surgical instrument from movable drive unit 18, and an electrosurgical socket 314 for selectively connecting to an electrosurgical energy source via an electrosurgical cable (not shown). Instrument cassette assembly 302 further includes an actuator system 316 supported within outer housing assembly 308 for actuating end effector 306.
[0087] Turning now to FIGS. 31-36, actuator system 316 includes a plurality of cable actuator assemblies 318 and a drive actuator assembly 320. The drive actuator assembly 320 includes an axial actuator assembly 322 and a rotation actuator assembly 324.
[0088] With reference to FIG. 32, each cable actuator assembly 318 of the actuator system 316 includes a crank 330, an upper slider 332, a lower slider 334, and a driver 336. The upper slider 332 secured to an upper surface of crank body 330x, via a first pin 333a, on a first side of crank body 330x, the lower slider 334 secured to a lower surface of crank body 330x, via a second pin 333b, on a second side of crank body 330x. Crank body 330x defines a first opening 330a on the first side of crank body 330x that receives a lower portion of the first pin 333a and a second opening 330b on the second side of crank body 330x that receives an upper portion of the second pin 333b. Upper slider 332 defines an elongated pin slot 332a that receives an upper portion of the first pin 333a and lower slider 334 defines an elongated pin slot 334a that receives a lower portion of the second pin 333b. Upper and lower sliders 332, 334 further define ferrule openings 335 that support ferrules (not shown) therein for securing cables 338 of cable actuator assembly 318 to respective upper and lower sliders 332, 334. Crank 330 further includes a drive shaft 330c that extends from the upper and lower surfaces of crank body 330x and nonrotatably supports driver 336 so that rotation of driver 336 imparts rotational force to crank 330.
[0089] In some aspects, each cable actuator assembly 318 may be provided in the form of a rack and pinion arrangement. For example, crank 330 may be a pinion, and upper slider 332 and lower slider 334 may be in the form of racks so that the teeth of these respective rack and pinion features engage one another. Indeed, upper and lower sliders 332, 334 may be disposed in the same plane as one another (e.g., vertically aligned or in registration), and/or vertically offset from one another such that one is higher and/or lower than the other along a vertical or central axis (not explicitly shown) extending through a center of crank 330.
[0090] As seen in FIGS. 34-36, rotation of driver 336 in a first direction (e.g., counterclockwise), as indicated by arrow “RCC,” rotates crank 330 such that first pin 333a moves inwardly along elongated pin slot 332a of upper slider 332 to an actuated position and second pin 333b moves inwardly along elongated pin slot 334a of lower slider 334 to an actuated position. As crank 330 is rotated in the first direction, crank 330 and first pin 333a cause upper slider 332 to translate distally as indicated by arrow “DCC” while crank 330 and second pin 333b cause lower slider 334 to translate proximally as indicated by arrow “PCC.” Rotation of driver 336 in a second direction opposite to the first direction (e.g., clockwise), as indicated by arrow “RC,” causes crank 330 to rotate such that first and second pins 333a, 333b move to the actuated position within respective elongated pin slots 332a, 334a of upper and lower sliders 332, 334 and drive upper slider 332 in a proximal direction, as indicated by arrow “PC,” and lower slider 334 in a distal direction, as indicated by arrow “DC.” As upper and lower sliders 332, 334 translate between proximal and distal positions, cables 338 secured thereto translate with the respective upper and lower sliders 332, 334. For instance, when upper slider 332 translates distally (or proximally), a first cable 338a of cables 338 secured to upper slider 332 translates distally (or proximally) with upper slider 332, and vice versa. Similarly, when lower slider 334 translates proximally (or distally), a second cable 338b of cables 338 secured to lower slider 334 translates proximally (or distally) with lower slider 334, and vice versa.
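For the rack-and-pinion variant of paragraph [0089], the push-pull cable motion described above reduces to the standard rack-and-pinion relation: each rack travels by the pinion radius times the rotation angle, in opposite directions. The pinion radius below is an assumed illustrative value.

```python
import math

# Sketch of the rack-and-pinion form of cable actuator assembly 318:
# with crank 330 as a pinion of assumed radius r, rotating it by theta
# drives upper slider 332 and lower slider 334 (the racks) by r * theta
# in opposite directions, moving cables 338a and 338b in push-pull.

def rack_travel(theta_rad: float, pinion_radius_mm: float = 4.0):
    d = pinion_radius_mm * theta_rad
    return d, -d  # e.g., upper slider distal, lower slider proximal

upper, lower = rack_travel(math.pi / 4)  # quarter-turn of the pinion... well, 45 degrees
```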
[0091] With reference to FIGS. 37-42, drive actuator assembly 320 of actuator system 316 includes an actuator housing 321 having a first housing portion 321a and a second housing portion 321b that define actuator mounts 321c, 321d for supporting the axial actuator assembly 322 and the rotation actuator assembly 324 between the first and second housing portions 321a, 321b. Second housing portion 321b further includes a mounting protrusion 321e extending from a sidewall thereof.
[0092] Rotation actuator assembly 324 of drive actuator assembly 320 includes an input spool 340, an output spool 342, a rotation cable 344 that couples to (e.g., wraps around) input and output spools 340, 342, bearings 346a, 346b, and a driver 348 that nonrotatably couples to input spool 340. Cable 344 may have any number of windings about input and output spools 340, 342 to enable rotational force to be transferred between input and output spools 340, 342. Notably, elongated shaft assembly 304 of surgical instrument 300 includes an inner shaft 304a to which output spool 342 nonrotatably couples, and which is coupled to actuator housing 321 by a shaft bearing 343. Output spool 342 imparts rotational force to inner shaft 304a from input spool 340 as rotation cable 344 rotates output spool 342 about inner shaft axis “ISA,” as indicated by arrow “OS,” (as rotation cable 344 translates-see arrows “T1” and “T2”) in response to rotation of driver 348 about driver axis “DA,” as indicated by arrow “IS.”
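Because rotation cable 344 travels the same linear distance at the surface of each spool, the angular transfer from input spool to output spool is set by the ratio of spool radii. The radii below are assumed illustrative values; the patent does not give them.

```python
# Sketch of the cable-spool rotation transfer in rotation actuator assembly 324:
# equal cable travel at each spool surface implies
# output turns = input turns * (r_input / r_output).
# Spool radii are assumed illustrative values.

def output_spool_turns(input_turns: float,
                       input_radius_mm: float = 6.0,
                       output_radius_mm: float = 3.0) -> float:
    return input_turns * input_radius_mm / output_radius_mm

shaft_turns = output_spool_turns(1.0)  # one turn of driver 348 / input spool 340
```

In this sketch, a smaller output spool 342 speeds up rotation of inner shaft 304a at the cost of torque, the usual pulley trade-off.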
[0093] Axial actuator assembly 322 of drive actuator assembly 320 includes a clevis 350, a threaded nut 352 mounted to clevis 350 via pins 352a thereof, an upper bearing 354, a lower bearing 356, a threaded driver 357 having threads 357a, and a cable pivot 358. Threads 357a of threaded driver 357 are threadedly engageable with threads 352b of threaded nut 352 to enable threaded nut 352 to translate along threaded driver 357, as indicated by arrows “N1” and “N2,” when threaded driver 357 is rotated, as indicated by arrows “TD1” and “TD2” (e.g., clockwise and/or counterclockwise about axis “TDA”). Clevis 350 defines a nut mount 350a on a first end thereof that defines pin holes 350b therethrough for receiving pins 352a of threaded nut 352 therein. Clevis 350 further defines a protuberance hole 350c that receives mounting protrusion 321e of second housing portion 321b for securing clevis 350 to actuator housing 321. A second end of clevis 350 defines a cable pivot channel 350d for receiving cable pivot 358 and an axial drive cable 360 therein. Axial drive cable 360 is coupled to cable pivot 358 on a first end thereof and extends through inner shaft 304a to enable a second end of axial drive cable 360 to secure to end effector 306 for imparting axial drive force on end effector 306.
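The threaded driver and nut form an ordinary lead screw, so nut travel follows directly from the rotation count and the thread pitch. The pitch below is an assumed illustrative value for a single-start thread; the patent specifies no thread geometry.

```python
# Sketch of the lead-screw relation in axial actuator assembly 322:
# rotating threaded driver 357 by n turns translates threaded nut 352 by
# n * pitch along axis "TDA" (single-start thread assumed). The pitch is
# an assumed illustrative value.

def nut_travel(driver_turns: float, pitch_mm: float = 0.8) -> float:
    return driver_turns * pitch_mm

travel = nut_travel(5.0)  # five turns of threaded driver 357
```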
[0094] As seen in FIGS. 41 and 42, as threaded nut 352 translates along threaded driver 357, the first end of clevis 350 pivots about pins 352a and mounting protrusion 321e, relative to threaded nut 352, such that the second end of clevis 350 moves axial drive cable 360 between an extended position (FIG. 42) and a retracted position (FIG. 41), as indicated by arrows “ZZ” and “YY.” The second end of clevis 350 also pivots relative to cable pivot 358 as axial drive cable 360 is moved between the extended and retracted positions to impart axial force to end effector 306.
[0095] The disclosed structure can include any suitable mechanical, electrical, and/or chemical components for operating the disclosed system or components thereof. For instance, such electrical components can include, for example, any suitable electrical and/or electromechanical, and/or electrochemical circuitry, which may include or be coupled to one or more printed circuit boards. As appreciated, the disclosed computing devices (and/or servers) can include a “controller,” “processor,” “digital processing device,” or the like, which terms are used herein to indicate a microprocessor or central processing unit (CPU). The CPU is the electronic circuitry within a computer that carries out the instructions of a computer program by performing the basic arithmetic, logical, control and input/output (I/O) operations specified by the instructions; such devices include, by way of non-limiting example, server computers. In some aspects, the controller includes an operating system configured to perform executable instructions. The operating system is, for example, software, including programs and data, which manages hardware of the disclosed apparatus and provides services for execution of applications for use with the disclosed apparatus. Those of skill in the art will recognize that suitable server operating systems include, by way of non-limiting examples, FreeBSD, OpenBSD, NetBSD®, Linux, Apple® Mac OS X Server®, Oracle® Solaris®, Windows Server®, and Novell® NetWare®. In some aspects, the operating system is provided by cloud computing.
[0096] In some aspects, the term “controller” may be used to indicate a device that controls the transfer of data from a computer or computing device to a peripheral or separate device and vice versa, and/or a mechanical and/or electromechanical device (e.g., a lever, knob, etc.) that mechanically operates and/or actuates a peripheral or separate device.
[0097] In aspects, the controller includes a storage and/or memory device. The storage and/or memory device is one or more physical apparatus used to store data or programs on a temporary or permanent basis. In some aspects, the controller includes volatile memory and requires power to maintain stored information; in certain aspects, the volatile memory includes dynamic random-access memory (DRAM). In various aspects, the controller includes non-volatile memory and retains stored information when it is not powered. In some aspects, the non-volatile memory includes flash memory. In some aspects, the non-volatile memory includes ferroelectric random-access memory (FRAM). In various aspects, the non-volatile memory includes phase-change random-access memory (PRAM). In certain aspects, the controller is a storage device including, by way of non-limiting examples, CD-ROMs, DVDs, flash memory devices, magnetic disk drives, magnetic tape drives, optical disk drives, and cloud-computing-based storage. In various aspects, the storage and/or memory device is a combination of devices such as those disclosed herein.
[0098] In various aspects, the memory can be random-access memory, read-only memory, magnetic disk memory, solid-state memory, optical disc memory, and/or another type of memory. In various aspects, the memory can be separate from the controller and can communicate with the processor through communication buses of a circuit board and/or through communication cables such as serial ATA cables or other types of cables. The memory includes computer-readable instructions that are executable by the processor to operate the controller. In various aspects, the controller may include a wireless network interface to communicate with other computers or a server. In aspects, a storage device may be used for storing data. In various aspects, the processor may be, for example, without limitation, a digital signal processor, a microprocessor, an ASIC, a graphics processing unit (“GPU”), a field-programmable gate array (“FPGA”), or a central processing unit (“CPU”).
[0099] The memory stores suitable instructions and/or applications, to be executed by the processor, for receiving the sensed data (e.g., sensed data from camera), accessing storage device of the controller, generating a raw image based on the sensed data, comparing the raw image to a calibration data set, identifying an object based on the raw image compared to the calibration data set, transmitting object data to a post-processing unit, and displaying the object data to a graphic user interface. Although illustrated as part of the disclosed structure, it is also contemplated that a controller may be remote from the disclosed structure (e.g., on a remote server), and accessible by the disclosed structure via a wired or wireless connection. In aspects where the controller is remote, it is contemplated that the controller may be accessible by, and connected to, multiple structures and/or components of the disclosed system.
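The sensing-and-identification loop in paragraph [0099] — receive sensed data, generate a raw image, compare it against a calibration data set, identify an object, and hand the object data on for display — can be sketched roughly as below. This is a minimal illustration, not the patent's implementation: the min-max normalization step, the squared-error template match, and all function names are assumptions.

```python
def generate_raw_image(sensed):
    """Normalize raw sensor counts into 0-255 pixel values (illustrative)."""
    flat = [v for row in sensed for v in row]
    lo, hi = min(flat), max(flat)
    scale = 255.0 / (hi - lo) if hi > lo else 0.0
    return [[int((v - lo) * scale) for v in row] for row in sensed]

def identify_object(raw, calibration):
    """Compare the raw image to each calibration template; return the
    label with the smallest squared-error mismatch."""
    best_label, best_err = "unknown", float("inf")
    for label, template in calibration.items():
        err = sum((a - b) ** 2
                  for ra, rb in zip(raw, template)
                  for a, b in zip(ra, rb))
        if err < best_err:
            best_label, best_err = label, err
    return best_label

def process_frame(sensed, calibration):
    """One pass of the pipeline: raw image -> identification -> object data."""
    raw = generate_raw_image(sensed)
    return {"label": identify_object(raw, calibration), "image": raw}
```

The returned dictionary stands in for the "object data" that paragraph [0099] says is transmitted to a post-processing unit and shown on a graphical user interface.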
[0100] The term “application” may include a computer program designed to perform functions, tasks, or activities for the benefit of a user. An application may refer to, for example, software running locally or remotely, as a standalone program or in a web browser, or other software which would be understood by one skilled in the art to be an application. An application may run on the disclosed controllers or on a user device, including, for example, on a mobile device, an IoT device, or a server system.
[0101] In some aspects, the controller includes a display to send visual information to a user. In various aspects, the display is a cathode ray tube (CRT). In various aspects, the display is a liquid crystal display (LCD). In certain aspects, the display is a thin film transistor liquid crystal display (TFT-LCD). In aspects, the display is an organic light emitting diode (OLED) display. In certain aspects, an OLED display is a passive-matrix OLED (PMOLED) or active-matrix OLED (AMOLED) display. In aspects, the display is a plasma display. In certain aspects, the display is a video projector. In various aspects, the display is an interactive display (e.g., a touch screen) that can detect user interactions, gestures, responses, and the like. In some aspects, the display is a combination of devices such as those disclosed herein.
[0102] The controller may include or be coupled to a server and/or a network. As used herein, the term “server” includes “computer server,” “central server,” “main server,” and like terms to indicate a computer or device on a network that manages the disclosed apparatus, components thereof, and/or resources thereof. As used herein, the term “network” can include any network technology including, for instance, a cellular data network, a wired network, a fiber-optic network, a satellite network, and/or an IEEE 802.11a/b/g/n/ac wireless network, among others.
[0103] In various aspects, the controller can be coupled to a mesh network. As used herein, a “mesh network” is a network topology in which each node relays data for the network. All mesh nodes cooperate in the distribution of data in the network. The topology can be applied to both wired and wireless networks. Wireless mesh networks can be considered a type of wireless ad hoc network and are thus closely related to mobile ad hoc networks (MANETs), although MANETs are not restricted to a specific mesh topology and can take any form of network topology. Mesh networks can relay messages using either a flooding technique or a routing technique. With routing, the message is propagated along a path by hopping from node to node until it reaches its destination. To ensure that all its paths are available, the network must allow for continuous connections and must reconfigure itself around broken paths, using self-healing algorithms such as Shortest Path Bridging. Self-healing allows a routing-based network to operate when a node breaks down or when a connection becomes unreliable. As a result, the network is typically quite reliable, as there is often more than one path between a source and a destination in the network. This concept can also apply to wired networks and to software interaction. A mesh network whose nodes are all connected to each other is a fully connected network.
[0104] In some aspects, the controller may include one or more modules. As used herein, the term “module” and like terms are used to indicate a self-contained hardware component of the central server, which in turn includes software modules. In software, a module is a part of a program. Programs are composed of one or more independently developed modules that are not combined until the program is linked. A single module can contain one or several routines, or sections of programs that perform a particular task.
[0105] As used herein, the controller includes software modules for managing various aspects and functions of the disclosed system or components thereof.
[0106] The disclosed structure may also utilize one or more controllers to receive various information and transform the received information to generate an output. The controller may include any type of computing device, computational circuit, or any type of processor or processing circuit capable of executing a series of instructions that are stored in memory. The controller may include multiple processors and/or multicore central processing units (CPUs) and may include any type of processor, such as a microprocessor, digital signal processor, microcontroller, programmable logic device (PLD), field programmable gate array (FPGA), or the like. The controller may also include a memory to store data and/or instructions that, when executed by the one or more processors, cause the one or more processors to perform one or more methods and/or algorithms.
[0107] The phrases “in an aspect,” “in aspects,” “in various aspects,” “in some aspects,” or “in other aspects” may each refer to one or more of the same or different aspects in accordance with the present disclosure. A phrase in the form “A or B” means “(A), (B), or (A and B).” A phrase in the form “at least one of A, B, or C” means “(A); (B); (C); (A and B); (A and C); (B and C); or (A, B, and C).”
[0108] Various aspects disclosed herein may be combined in different combinations than the combinations specifically presented in the description and accompanying drawings. It should also be understood that, depending on the example, certain acts or events of any of the processes or methods described herein may be performed in a different sequence, may be added, merged, or left out altogether (e.g., all described acts or events may not be necessary to carry out the techniques).
[0109] Certain aspects of the present disclosure may include some, all, or none of the above advantages and/or one or more other advantages readily apparent to those skilled in the art from the drawings, descriptions, and claims included herein. Moreover, while specific advantages have been enumerated above, the various aspects of the present disclosure may include all, some, or none of the enumerated advantages and/or other advantages not specifically enumerated above.
[0110] The aspects disclosed herein are examples of the disclosure and may be embodied in various forms. For instance, although certain aspects herein are described as separate, each of the aspects herein may be combined with one or more of the other aspects herein. Specific structural and functional details disclosed herein are not to be interpreted as limiting, but as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present disclosure in virtually any appropriately detailed structure. Like reference numerals may refer to similar or identical elements throughout the description of the figures.
[0111] Any of the herein described methods, programs, algorithms, or codes may be converted to, or expressed in, a programming language or computer program. The terms “programming language” and “computer program,” as used herein, each include any language used to specify instructions to a computer, and include (but are not limited to) the following languages and their derivatives: Assembler, Basic, Batch files, BCPL, C, C+, C++, Delphi, Fortran, Java, JavaScript, machine code, operating system command languages, Pascal, Perl, PL1, scripting languages, Visual Basic, metalanguages which themselves specify programs, and all first, second, third, fourth, fifth, or further generation computer languages. Also included are database and other data schemas, and any other meta-languages. No distinction is made between languages which are interpreted, compiled, or use both compiled and interpreted approaches. No distinction is made between compiled and source versions of a program. Thus, reference to a program, where the programming language could exist in more than one state (such as source, compiled, object, or linked), is a reference to all such states. Reference to a program may encompass the actual instructions and/or the intent of those instructions.
[0112] Securement of any of the components of the disclosed devices may be effectuated using known securement techniques such as welding, crimping, gluing, fastening, etc.
[0113] Persons skilled in the art will understand that the structures and methods specifically described herein and shown in the accompanying figures are non-limiting exemplary aspects, and that the description, disclosure, and figures should be construed merely as exemplary of aspects. It is to be understood, therefore, that this disclosure is not limited to the precise aspects described, and that various other changes and modifications may be effectuated by one skilled in the art without departing from the scope or spirit of the disclosure. Additionally, the elements and features shown or described in connection with certain aspects may be combined with the elements and features of certain other aspects without departing from the scope of this disclosure, and that such modifications and variations are also included within the scope of this disclosure. Accordingly, the subject matter of this disclosure is not limited by what has been particularly shown and described.
Claims
1. A robotic surgical system, comprising: a drive unit; a surgical instrument removably connected to the drive unit, the surgical instrument including: an elongated shaft assembly having a proximal end portion and a distal end portion; an end effector supported on the distal end portion of the elongated shaft assembly; and an instrument cassette assembly supported on the proximal end portion of the elongated shaft assembly, the instrument cassette assembly including: a cassette housing; and an actuator system supported in the cassette housing and operably coupled to the end effector for operating the end effector, the actuator system including: a cable actuator assembly including a plurality of cables that extends from the cassette housing to the end effector for manipulating the end effector; a shaft assembly defining a longitudinal axis; a rotation actuator assembly coupled to the shaft assembly and positioned to rotate at least a portion of the shaft assembly about the longitudinal axis for imparting rotational force to the end effector; and an axial actuator assembly coupled to the shaft assembly and positioned to axially translate at least a portion of the shaft assembly relative to the longitudinal axis for imparting axial force to the end effector.
2. The robotic surgical system of claim 1, wherein the cable actuator assembly includes a crank, a first slider, and a second slider, the first and second sliders coupled to the crank.
3. The robotic surgical system of claim 2, wherein the crank is rotatable to linearly translate the first and second sliders relative to one another.
4. The robotic surgical system of claim 3, wherein the first slider supports a first cable of the plurality of cables and the second slider supports a second cable of the plurality of cables.
5. The robotic surgical system of claim 3, wherein the crank is coupled to a driver that is engaged with the drive unit, the driver configured to impart rotational force on the crank.
6. The robotic surgical system of claim 1, wherein the rotation actuator assembly includes a drive wheel and a belt drive shaft supporting a belt, the belt coupled to the shaft assembly and the drive wheel coupled to the belt drive shaft.
7. The robotic surgical system of claim 6, wherein the drive wheel and the belt drive shaft are disposed transverse to one another, the drive wheel configured to rotate the belt drive shaft.
8. The robotic surgical system of claim 6, wherein rotation of the belt drive shaft rotates the belt to rotate the shaft assembly.
9. The robotic surgical system of claim 1, wherein the axial actuator assembly includes a drive disc, a drive arm coupled to the drive disc, and a drive plate coupled to the drive arm and to the shaft assembly.
10. The robotic surgical system of claim 9, wherein the drive arm includes a first pin coupled to the drive disc and a second pin coupled to the drive plate, the drive plate defining a pin slot that receives the second pin, the second pin slidable along the pin slot to axially translate the drive plate and the shaft assembly as the drive disc rotates.
11. A surgical system, comprising: a cassette housing; and an actuator system supported in the cassette housing and including: a cable actuator assembly including a plurality of cables; a shaft assembly defining a longitudinal axis; a rotation actuator assembly coupled to the shaft assembly and positioned to rotate the shaft assembly about the longitudinal axis; and an axial actuator assembly coupled to the shaft assembly and positioned to axially translate the shaft assembly relative to the longitudinal axis.
12. The surgical system of claim 11, wherein the cable actuator assembly includes a crank, a first slider, and a second slider, the first and second sliders coupled to the crank.
13. The surgical system of claim 12, wherein the crank is rotatable to linearly translate the first and second sliders relative to one another.
14. The surgical system of claim 13, wherein the first slider supports a first cable of the plurality of cables and the second slider supports a second cable of the plurality of cables.
15. The surgical system of claim 13, wherein the crank is coupled to a driver, the driver configured to impart rotational force on the crank.
16. The surgical system of claim 11, wherein the rotation actuator assembly includes a drive wheel and a belt drive shaft supporting a belt, the belt coupled to the shaft assembly and the drive wheel coupled to the belt drive shaft.
17. The surgical system of claim 16, wherein the drive wheel and the belt drive shaft are disposed transverse to one another, the drive wheel configured to rotate the belt drive shaft.
18. The surgical system of claim 16, wherein rotation of the belt drive shaft rotates the belt to rotate the shaft assembly.
19. The surgical system of claim 11, wherein the axial actuator assembly includes a drive disc, a drive arm coupled to the drive disc, and a drive plate coupled to the drive arm and to the shaft assembly.
20. A surgical instrument for a robotic surgical system, the surgical instrument comprising: an elongated shaft assembly having a proximal end portion and a distal end portion; an end effector supported at the distal end portion of the elongated shaft assembly; and an instrument cassette assembly supported on the proximal end portion of the elongated shaft assembly, the instrument cassette assembly including: a cassette housing; and an actuator system supported in the cassette housing and operably coupled to the end effector for operating the end effector, the actuator system including: a cable actuator assembly including a plurality of cables that extends from the cassette housing to the end effector for manipulating the end effector; a shaft assembly defining a longitudinal axis; a rotation actuator assembly coupled to the shaft assembly and positioned to rotate the shaft assembly about the longitudinal axis for imparting rotational force to the end effector; and an axial actuator assembly coupled to the shaft assembly and positioned to axially translate the shaft assembly relative to the longitudinal axis for imparting axial force to the end effector.
HAND CONTROLLER APPARATUS INCLUDING ERGONOMIC FEATURES FOR A ROBOTIC SURGERY SYSTEM
DOCUMENT ID: US 20220361967 A1
DATE PUBLISHED: 2022-11-17
INVENTOR INFORMATION:
Duke; Jonathan Bradley — Louisville, CO, US
Walters; Chad Clayton — Apex, NC, US
Currat; Olivier Franck — Louisville, CO, US
Collins; Eric — Louisville, CO, US
Ward; William Jacob — Apex, NC, US
Rector; Mark Curtis — Raleigh, NC, US
Kelly; Brandon Michael — Raleigh, NC, US
Collins; Michael Darter — Holly Springs, NC, US
Durand; Zachary Kevin — Waxhaw, NC, US
APPLICANT INFORMATION: Titan Medical Inc. — Toronto, CA (assignee)
APPLICATION NO: 17/811292
DATE FILED: 2022-07-07
DOMESTIC PRIORITY (CONTINUITY DATA):
parent US continuation 17643357 20211208 PENDING child US 17811292
parent US continuation 17468186 20210907 parent-grant-document US 11413100 child US 17643357
parent US continuation 16174602 20181030 parent-grant-document US 11116591 child US 17468186
US CLASS CURRENT: 1/1
CPC CURRENT:
CPCI A61B 34/30 (2016-02-01)
CPCI A61B 34/76 (2016-02-01)
CPCI A61B 34/25 (2016-02-01)
CPCI A61B 34/74 (2016-02-01)
CPCA A61B 2034/305 (2016-02-01)
CPCA A61B 2034/252 (2016-02-01)
CPCA A61B 2034/744 (2016-02-01)
CPCA A61B 2034/301 (2016-02-01)
CPCA A61B 34/77 (2016-02-01)
Abstract
A hand controller apparatus for controlling a tool in a robotic surgery system has a body with a proximal end and a distally located interface end that can be coupled to an input apparatus for controlling a surgical tool. The hand controller apparatus includes a control lever attached to a pivot joint proximate a side surface of the body and extending along the body and away from the proximal end, the control lever being laterally moveable relative to the side surface of the body about the pivot joint. The control lever includes a tail region adjacent to the pivot joint and a paddle region connected to the tail region and extending toward the distally located interface end. The tail region includes an inner surface facing the body and an outer surface opposing the inner surface, and at least part of the outer surface of the tail region is outwardly curved.
Background/Summary
INCORPORATION BY REFERENCE TO ANY PRIORITY APPLICATIONS
[0001] Any and all applications for which a foreign or domestic priority claim is identified in the Application Data Sheet as filed with the present application are hereby incorporated by reference under 37 CFR 1.57.
TECHNICAL FIELD
[0002] This disclosure relates generally to robotic surgery systems and more particularly to a hand controller apparatus for receiving operator input for controlling the robotic surgery system to perform surgical procedures.
DESCRIPTION OF RELATED ART
[0003] Robotic surgery systems generally include an operator interface that receives operator input from a surgeon and causes corresponding movements of surgical tools within a body cavity of a patient to perform a surgical procedure. For example, the operator may grasp and move a hand grip while the operator interface senses movements of the hand grip. The operator interface and hand grip may operate to sense inputs responsive to movement of the operator's hand in several different degrees of freedom, thus providing inputs for causing the surgical tool to mimic movements of the operator's hand. Additional movements such as opening and closing of jaws of an end effector associated with the surgical tool may also be initiated in response to additional operator inputs received at the operator interface.
SUMMARY
[0004] In some cases, a hand controller apparatus for controlling a tool in a robotic surgery system can include a body including a proximal end and a distally located interface end configured to be coupled to an input apparatus configured to control a surgical tool. The hand controller apparatus can also include a control lever attached to a pivot joint proximate a side surface of the body and extending along the body and away from the proximal end, the control lever being laterally moveable relative to the side surface of the body about the pivot joint. The hand controller apparatus can also include a lateral movement detector configured to magnetically or inductively detect a lateral movement of the control lever. Detection of the lateral movement can cause the input apparatus to control movement of the surgical tool based on the detected lateral movement of the control lever.
[0005] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The lateral movement detector can be positioned in the body or in the control lever. The control lever can include a wiper disposed inside the body and extending from the pivot joint toward the proximal end and a paddle disposed outside the body and extending at an angle from the pivot joint toward the distally located interface end. The wiper can be configured to move in a direction opposite to a lateral movement of the paddle. The lateral movement detector can include a magnetic angular sensor configured to detect an angle formed between the paddle and the side surface of the body.
[0006] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The hand controller apparatus can further include a magnet attached to the wiper and configured to move along with the wiper. The magnetic angular sensor can be configured to detect the angle based on movement of the magnet. At least a portion of the wiper can include a magnetic material. The magnetic angular sensor can be configured to detect the angle based on movement of the portion of the wiper. The lateral movement detector can include an inductive sensor including a curved coil and configured to detect a curved movement of the wiper based on an electrical current induced at the curved coil by the movement of the wiper. The wiper can be formed at least partially of a metal.
[0007] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The control lever can include a paddle disposed outside the body and extending from the pivot joint toward the distally located interface end. The lateral movement detector can include an inductive sensor configured to detect a non-linear movement of a metallic portion disposed in or integrally formed with the paddle. The inductive sensor can include a substantially trapezoidal shaped coil. The inductive sensor can include a coil that can be curved toward the metallic portion. The metallic portion can include a substantially trapezoidal shape. The inductive sensor can include a substantially elliptical shaped coil. A portion of the elliptical shaped coil is curved toward the metallic portion.
[0008] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. A portion of the metallic portion is curved toward the substantially elliptical shaped coil. The control lever can include a paddle disposed outside the body and extending from the pivot joint toward the distally located interface end. The lateral movement detector can include a proximity sensor configured to detect a position of the paddle with respect to the side surface of the body. The hand controller apparatus can further include a presence detector configured to detect a presence of a hand of an operator on the body. The presence detector can include a capacitive proximity sensor coated on an inner wall of the body. The hand controller apparatus can further include a palm grip disposed on or in the proximal end, the palm grip including a generally downwardly curved and rounded shape configured to support a portion of an operator's palm.
[0009] In some cases, a robotic surgery system can include an instrument station including an insertion device configured to support a surgical tool. The robotic surgery system can also include a workstation configured to be in data communication with the instrument station. The workstation can include a hand controller apparatus configured to control movement of the tool. The hand controller apparatus can include a body including a proximal end and a distally located interface end coupled to an input device. The hand controller apparatus can also include a control lever attached to a pivot joint proximate a side surface of the body and extending along the body and away from the proximal end, the control lever being laterally moveable relative to the side surface of the body about the pivot joint. The hand controller apparatus can also include a lateral movement detector configured to magnetically or inductively detect a lateral movement of the control lever. The input device can be configured to control movement of the tool based on the detected lateral movement of the control lever.
[0010] The robotic surgery system of any of preceding paragraphs and/or any of robotic surgery systems described below can include one or more of the following features. The control lever can include a wiper disposed inside the body and extending from the pivot joint toward the proximal end and a paddle disposed outside the body and extending at an angle from the pivot joint toward the distally located interface end. The wiper can be configured to move in a direction opposite to a lateral movement of the paddle. The lateral movement detector can include a magnetic angular sensor configured to detect an angle formed between the paddle and the side surface of the body. The lateral movement detector can include an inductive sensor including a curved coil and configured to detect a curved movement of the wiper based on an electrical current induced at the curved coil by the movement of the wiper. The wiper can be formed at least partially of a metal. The control lever can include a paddle disposed outside the body and extending from the pivot joint toward the distally located interface end. The lateral movement detector can include an inductive sensor configured to detect a non-linear movement of a metallic portion disposed in or integrally formed with the paddle.
[0011] In some cases, a method of operating a hand controller apparatus for controlling a tool in a robotic surgery system can include detecting lateral movement of a control lever of the hand controller apparatus between a closed position and an open position, the control lever rotatably attached to a body of the hand controller apparatus and configured to control opening and closing of a surgical tool. The method can also include magnetically or inductively detecting a change in an angle of the control lever relative to the body of the hand controller apparatus when the control lever is moved between the closed position and the open position. The method can also include causing opening and closing of the surgical tool based on the detected change in the angle.
[0012] The method of operating a hand controller apparatus of any of preceding paragraphs and/or any of methods described below can include one or more of the following features. The control lever can include a wiper disposed inside the body and extending from a pivot joint toward a proximal end of the body and a paddle disposed outside the body and extending from the pivot joint toward a distally located interface end, the paddle configured to move between the open and closed positions. A magnetic portion can be disposed in or integrally formed with the wiper. The wiper and the magnetic portion can laterally move between a first position and a second position about the pivot joint in a direction opposite to a lateral movement of the paddle, the first and second positions respectively corresponding to the open and closed positions of the paddle. Magnetically or inductively detecting the change in the angle can include determining an angular position of the magnetic portion between the first position and the second position in response to a lateral movement of the wiper and detecting the change in the angle based on the determined angular position of the magnetic portion.
[0013] The method of operating a hand controller apparatus of any of preceding paragraphs and/or any of methods described below can include one or more of the following features. Determining the angular position can be performed with a magnetic angular detector disposed below the wiper. The control lever can include a wiper disposed inside the body and extending from a pivot joint toward a proximal end of the body and a paddle disposed outside the body and extending from the pivot joint toward a distally located interface end, the paddle configured to move between the open and closed positions. A metallic portion can be disposed in or integrally formed with the wiper. The wiper and the metallic portion can partially rotate over a curved inductive coil between a first position and a second position about the pivot joint in a direction opposite to a lateral movement of the paddle. The first and second positions can respectively correspond to the open and closed positions of the paddle. Magnetically or inductively detecting the change in the angle can include detecting an induced electrical current at the curved inductive coil caused by a rotation of the wiper, demodulating the detected electrical current to produce a signal representing a position of the metallic portion, and detecting the change in the angle based on the produced signal.
[0014] The method of operating a hand controller apparatus of any of preceding paragraphs and/or any of methods described below can include one or more of the following features. The control lever can include a paddle disposed outside the body and extending from a pivot joint, wherein a metallic portion can be disposed in or integrally formed with the paddle. The paddle and the metallic portion can move over an inductive coil between the open and close positions, the inductive coil facing the metallic portion. Magnetically or inductively detecting the change in the angle can include detecting induced electrical current at the inductive coil in response to a movement of the metallic portion, demodulating the detected electrical current to produce a signal representing a position of the metallic portion and detecting the change in the angle based on the produced signal. The inductive coil can have a substantially trapezoidal shape or a substantially elliptical shape.
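The detection chain described in the paragraphs above (induced current at the coil, demodulation to a position signal, then an angle estimate) can be sketched as follows. This is a minimal illustration only, not the patented implementation: the carrier frequency, sample rate, coil response model, and the assumed 30-degree open angle are all hypothetical values chosen for the example.

```python
import math

def demodulate(samples, window=50):
    """Recover the envelope of an amplitude-modulated coil signal by
    rectifying it and applying a moving-average low-pass filter.
    The default window (50 samples) is one carrier period at the
    assumed rates below."""
    rectified = [abs(s) for s in samples]
    out = []
    for i in range(len(rectified)):
        lo = max(0, i - window + 1)
        out.append(sum(rectified[lo:i + 1]) / (i + 1 - lo))
    return out

def envelope_to_angle(envelope_level, min_level=0.0, max_level=1.0,
                      open_angle_deg=30.0):
    """Map a demodulated signal level to a lever angle between the
    close position (0 degrees) and an assumed open position (30 degrees)."""
    frac = (envelope_level - min_level) / (max_level - min_level)
    frac = min(1.0, max(0.0, frac))
    return frac * open_angle_deg

# Simulated coil signal: a 1 kHz carrier sampled at 50 kHz whose
# amplitude tracks the (hypothetical) position of the metallic portion.
carrier_hz, sample_hz = 1000.0, 50000.0
position = 0.5  # halfway between the close and open positions
samples = [position * math.sin(2 * math.pi * carrier_hz * t / sample_hz)
           for t in range(2000)]

env = demodulate(samples)
# A rectified sine averages 2/pi of its amplitude, so scale by pi/2
# to recover the peak level before mapping to an angle.
angle = envelope_to_angle(env[-1] * math.pi / 2)
```

With the lever simulated halfway open, the recovered angle lands near 15 degrees, i.e. half of the assumed 30-degree range.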
[0015] In some cases, a method of operating a robotic surgery system that comprises a workstation including a hand controller apparatus and an instrument station including a surgical tool can include detecting lateral movement of a control lever of the hand controller apparatus between a closed position and an open position, the movement of the control lever changing an angle between the control lever and a body of the hand controller apparatus. The method can also include magnetically or inductively detecting the change in the angle in response to the control lever moving between the closed position and the open position. The method can also include controlling an opening and closing movement of the tool based on the detected angle.
[0016] In some cases, a hand controller apparatus for controlling a tool in a robotic surgery system can include a body configured to be moved to generate a first operator input to cause a tool to move corresponding to the movement of the body. The hand controller apparatus can also include an input control interface formed on a surface of the body and configured to sense one or more of a plurality of second operator inputs associated with a plurality of tool functions, the plurality of second operator inputs being different from the first operator input. The hand controller apparatus can also include a processor configured to control the tool to perform one or more of the plurality of tool functions in response to the sensed one or more second operator inputs.
[0017] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The tool can include a surgical instrument and at least one function of the plurality of tool functions comprises a surgery routine. The surgery routine can include controlling the surgical instrument to perform at least one of: suturing, cutting, grasping or moving in a predetermined direction. The tool can include a camera configured to image a surgical site, and wherein at least one function of the plurality of tool functions comprises at least one of: causing a lens of the camera to be washed, causing the camera to zoom in and/or out, causing the camera to pan, or causing the camera to tilt. The hand controller apparatus can further include a memory storing the plurality of tool functions corresponding with the plurality of second operator inputs.
[0018] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The input control interface can be configured to sense at least one input of: swiping from a first side of the input control interface to a second side of the input control interface different from the first side, tapping, swiping and holding, tapping and holding, multiple tapping, or multiple tapping and holding. The processor can be configured to control the tool to perform one or more of the plurality of tool functions in response to the sensed at least one input. The input control interface can include a trackpad or a capacitive touch surface configured to sense the one or more second operator inputs. The one or more second operator inputs can include swiping from a first side of the trackpad to a second side of the trackpad different from the first side.
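The association between sensed second operator inputs and stored tool functions, as described in the two paragraphs above, is essentially a lookup from gesture to function. A minimal sketch follows; the gesture names and the tool functions they map to are illustrative assumptions, not the patent's actual mapping.

```python
# Illustrative gesture-to-function table for the input control
# interface. Every key and value here is a hypothetical example.
TOOL_FUNCTIONS = {
    "tap":            "toggle_camera_control",
    "double_tap":     "camera_zoom_in",
    "tap_and_hold":   "camera_pan_mode",
    "swipe":          "lock_jaws_in_position",
    "swipe_and_hold": "instrument_clutch",
}

def handle_second_operator_input(gesture):
    """Return the stored tool function for a sensed gesture, or None
    if the gesture has no associated function."""
    return TOOL_FUNCTIONS.get(gesture)
```

In a real controller the table would live in the memory noted in paragraph [0017], so the gesture-to-function associations could be updated without changing the dispatch logic.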
[0019] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The processor can be configured to cause the tool to become locked in a current surgery position in response to the sensed swiping from the first side of the trackpad to the second side of the trackpad. The tool can include a pair of jaws, and wherein the processor is configured to control the pair of jaws of the tool to be fixed in the current surgery position while the body is being repositioned. The body can include a housing on an end thereof, the housing including a generally downwardly curved and rounded shape configured to receive and support a portion of an operator's palm. The hand controller apparatus can further include at least one control lever attached to the body at a pivot joint and extending along the body, the at least one control lever being laterally moveable about the pivot joint, and wherein the at least one control lever is configured to control one or more of the plurality of tool functions.
[0020] In some cases, a method of operating a hand controller apparatus for controlling a tool in a robotic surgery system can include generating a first operator input based on movement of a body of the hand controller apparatus, the first input configured to control the tool to move corresponding to the movement of the body. The method can also include sensing, at an input control interface formed on a surface of the body, one or more of a plurality of second operator inputs corresponding to a plurality of tool functions, the plurality of second operator inputs different from the first operator input. The method can also include, by a processor, controlling the tool to perform one or more of the plurality of tool functions in response to the sensed one or more second operator inputs.
[0021] The method of operating a hand controller apparatus of any of preceding paragraphs and/or any of methods described below can include one or more of the following features. The tool can be a surgical instrument. Controlling the tool can include controlling the surgical instrument to perform at least one of the following: suturing, cutting, grasping or moving in a predetermined direction. The tool can include a camera configured to image a surgical site. Controlling the tool can include at least one of: causing a lens of the camera to be washed, causing the camera to zoom in and/or out, causing the camera to pan, or causing the camera to tilt. The method can further include storing the plurality of tool functions corresponding with the plurality of second operator inputs in a memory.
[0022] The method of operating a hand controller apparatus of any of preceding paragraphs and/or any of methods described below can include one or more of the following features. Sensing the one or more second operator inputs can include sensing at least one of the following inputs: swiping from a first side of the input control interface to a second side of the input control interface different from the first side, tapping, swiping and holding, tapping and holding, multiple tapping, or multiple tapping and holding. The input control interface can include a trackpad or a capacitive touch surface.
[0023] In some cases, a hand controller apparatus for controlling one or more tools in a robotic surgery system can include a body configured to be moved to generate a first operator input to control a surgical instrument of the one or more tools to move corresponding to the movement of the body. The hand controller apparatus can also include an input control interface formed on a surface of the body and configured to sense a second operator input different from the first operator input. The hand controller apparatus can also include a processor configured to control at least first and second functions of first and second tools of the one or more tools in response to the received second operator input, the first function and the second function performed mutually exclusively of each other, the first and second functions being different from each other, and the first and second tools being different from each other.
[0024] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The processor can be configured to control the first function of the first tool in response to a first type of the second operator input while disabling the second function of the second tool, and control the second function of the second tool in response to a second type of the second operator input while disabling the first function of the first tool. The input control interface can include a trackpad or a capacitive touch surface configured to sense at least one of: swiping from a first side of the trackpad to a second side of the trackpad different from the first side, tapping, swiping and holding, tapping and holding, multiple tapping, or multiple tapping and holding.
[0025] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The trackpad can be configured to sense at least one of the following inputs: swiping from a first side of the trackpad to a second side of the trackpad different from the first side, tapping, swiping and holding, tapping and holding, multiple tapping, or multiple tapping and holding. The processor can be configured to perform different functions based on the sensed second operator input. The capacitive touch surface can include at least one capacitive input configured to sense a single-click or a multiple-click, and wherein the processor is configured to perform different functions based on the single-click or multiple-click.
[0026] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The first tool can include a camera configured to image a surgical site, the first function being enabling and/or disabling the camera. The second tool can include an instrument clutch configured to reposition the body, the second function being enabling and/or disabling the instrument clutch. The trackpad can be configured to sense swiping from a first side of the trackpad to a second side of the trackpad different from the first side and holding the second side of the trackpad. The processor can be configured to, in response to the sensed swiping and holding, disable an association of the body with the surgical instrument and enable association of the body with the camera.
[0027] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The trackpad can be further configured to sense a release of the second side, and wherein the processor is further configured to, in response to the sensed release, disable the association of the body with the camera and enable the association of the body with the surgical instrument. The trackpad can be further configured to sense a first swiping from a first side of the trackpad to a second side of the trackpad different from the first side and first releasing of the trackpad, and wherein the processor is further configured to, in response to the sensed first swiping and first releasing, disable an association of the body with the surgical instrument, and permit repositioning of the body without moving the surgical instrument.
[0028] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The trackpad can be further configured to sense a second swiping from the first side of the trackpad to the second side of the trackpad and a second releasing of the trackpad, and wherein the processor is configured to, in response to the sensed second swiping and second releasing, enable the association of the body with the surgical instrument.
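The swipe-and-hold and swipe-then-release behaviours in paragraphs [0026] through [0028] amount to a small, mutually exclusive state machine: holding moves the body's association from the instrument to the camera, releasing restores it, and a swipe-then-release toggles the instrument clutch so the body can be repositioned freely. The following sketch captures that logic under assumed names; it is an interpretation of the text, not the patented control software.

```python
class HandControllerAssociation:
    """Sketch of the mutually exclusive association logic described
    above. All attribute and method names are hypothetical."""

    def __init__(self):
        self.active = "instrument"   # the body initially drives the instrument
        self.clutched = False

    def swipe_and_hold(self):
        # Camera control enabled; instrument association disabled.
        self.active = "camera"

    def release_hold(self):
        # Camera control disabled; instrument association restored.
        self.active = "instrument"

    def swipe_and_release(self):
        # Toggle the instrument clutch: the first swipe-and-release
        # decouples the body so it can be repositioned without moving
        # the instrument; the second re-couples it.
        self.clutched = not self.clutched
        self.active = None if self.clutched else "instrument"
```

Because one gesture both enables one association and disables the other, the two functions can never be active at the same time, matching the mutual exclusivity required in paragraph [0023].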
[0029] In some cases, a method of operating a hand controller apparatus for controlling one or more tools in a robotic surgery system can include generating a first operator input based on a movement of a body of the hand controller apparatus, the first operator input configured to control movement of a surgical instrument of the one or more tools. The method can also include sensing, at an input control interface formed on a surface of the body, a second operator input different from the first operator input. The method can also include, by a processor, controlling at least first and second functions of first and second tools of the one or more tools in response to the received second operator input by performing the first function and the second function mutually exclusively of each other, the first and second functions being different from each other, and the first and second tools being different from each other.
[0030] The method of operating a hand controller apparatus of any of preceding paragraphs and/or any of methods described below can include one or more of the following features. Controlling the at least first and second functions can include controlling the first function of the first tool in response to a first type of the second operator input while disabling the second function of the second tool and controlling the second function of the second tool in response to a second type of the second operator input while disabling the first function of the first tool.
[0031] In some cases, a hand controller apparatus for controlling a tool in a robotic surgery system can include a body including a proximal end and a distally located interface end configured to be coupled to an input apparatus configured to control a surgical tool. The hand controller apparatus can also include a feedback device supported by the body and configured to provide feedback to a user in response to a change in a function of the hand controller apparatus from a first mode to a second mode, the second mode different from the first mode. The function can include at least one of: controlling a camera that images a surgical site, instrument clutching to reposition the hand controller apparatus, a pre-set surgery routine, or an operation to control the surgical tool. The change from the first mode to the second mode can be configured to occur within the same function.
[0032] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. When the function includes controlling the camera, the first mode can include enabling control of the camera, and the second mode can include disabling control of the camera. When the function comprises instrument clutching, the first mode can include enabling instrument clutching and the second mode can include disabling instrument clutching.
[0033] In some cases, a hand controller apparatus for controlling a tool in a robotic surgery system can include a body including a proximal end and a distally located interface end configured to be coupled to an input apparatus configured to control a surgical instrument. The hand controller apparatus can also include a feedback device positioned in or on the body and configured to provide feedback to a user in response to a change in a function of the hand controller apparatus from a first mode to a second mode, the second mode different from the first mode.
[0034] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The feedback device can include a haptic feedback device configured to provide a haptic feedback in response to the change in the function. The haptic feedback device can include a haptic actuator and a controller configured to sense the change in the function and actuate the haptic actuator to vibrate in response thereto. The haptic actuator can be disposed adjacent to the proximal end or the distally located interface end. The hand controller apparatus can further include an input control interface formed on an upper surface of the body and configured to receive an additional user input. The haptic actuator can be disposed adjacent to the input control interface.
[0035] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The function can include at least one of: controlling a camera that images a surgical site, instrument clutching to reposition the hand controller apparatus, a pre-set surgery routine, or an operation to control the surgical instrument. When the function includes controlling the camera, the first mode can include enabling control of the camera and the second mode can include disabling control of the camera. When the function includes instrument clutching, the first mode can include enabling instrument clutching and the second mode can include disabling instrument clutching.
[0036] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The change in the function can be generated from repositioning the body or from a secondary input of the robotic surgery system remote from the body. The feedback device can include a visual feedback device configured to provide a visual feedback in response to the change in the function. The feedback device can include an audio feedback device configured to provide an audio feedback in response to the change in the function.
[0037] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The feedback device can include a tactile feedback device configured to provide a tactile feedback in response to the change in the function. The tactile feedback can include at least one of the following: a bump, a peak, a groove, a lip, or a texture difference.
[0038] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The feedback device can include a force feedback device configured to provide a force feedback in response to the change in the function. The force feedback device can include a self-centering wheel. The feedback device can be located in a portion of the body configured to contact a user's palm. The feedback device can be configured to provide different feedbacks in response to different changes in the function. The different feedbacks can be configurable by the user.
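Paragraphs [0031] through [0038] describe a feedback device that fires when a function changes mode, with the particular feedback for each change configurable by the user. A minimal sketch of that behaviour follows; the function names, mode labels, and vibration patterns are all illustrative assumptions, and the recorded list stands in for driving actual actuator hardware.

```python
# Hypothetical default mapping from (function, mode) changes to
# feedback patterns. None of these names come from the patent.
DEFAULT_PATTERNS = {
    ("camera_control", "enabled"):     "short_vibration",
    ("camera_control", "disabled"):    "double_vibration",
    ("instrument_clutch", "enabled"):  "long_vibration",
    ("instrument_clutch", "disabled"): "short_vibration",
}

class FeedbackController:
    """Sketch of a controller that senses a mode change and actuates
    a feedback device, with user-configurable patterns per change."""

    def __init__(self, patterns=None):
        self.patterns = dict(patterns or DEFAULT_PATTERNS)
        self.fired = []  # record of actuated patterns (in place of hardware)

    def configure(self, function, mode, pattern):
        # Lets the user assign a different feedback to a given change.
        self.patterns[(function, mode)] = pattern

    def on_mode_change(self, function, mode):
        pattern = self.patterns.get((function, mode), "default_vibration")
        self.fired.append(pattern)  # a real device would drive the actuator
        return pattern
```

The same dispatch structure would serve a visual, audio, or force feedback device; only the actuation step at the end of `on_mode_change` would differ.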
[0039] In some cases, a robotic surgery system can include an instrument station comprising an insertion device configured to support a surgical tool. The robotic surgery system can also include a workstation in data communication with the instrument station. The workstation can include a hand controller apparatus configured to receive an operator input for controlling the tool. The hand controller apparatus can include a body including a proximal end and a distally located interface end configured to be coupled to an input apparatus configured to control the tool. The hand controller apparatus can include a feedback device disposed in or on the body and configured to provide feedback to an operator in response to a change in a function of the hand controller apparatus from a first mode to a second mode, the second mode different from the first mode.
[0040] The robotic surgery system of any of preceding paragraphs and/or any of robotic surgery systems described below can include one or more of the following features. The feedback device can include at least one of the following: a haptic feedback device configured to provide a haptic feedback in response to the change in the function, a visual feedback device configured to provide a visual feedback in response to the change in the function, an audio feedback device configured to provide an audio feedback in response to the change in the function, a tactile feedback device configured to provide a tactile feedback in response to the change in the function or a force feedback device configured to provide a force feedback in response to the change in the function.
[0041] The robotic surgery system of any of preceding paragraphs and/or any of robotic surgery systems described below can include one or more of the following features. The change in the function can be generated from repositioning the body or from a secondary input of the workstation remote from the hand controller apparatus. The feedback device can be configured to provide different feedbacks in response to different changes in the function. The different feedbacks can be configurable by the operator.
[0042] In some cases, a method of operating a hand controller apparatus for controlling a tool in a robotic surgery system can include receiving an operator input. The method can also include determining that the received operator input triggers a change in a function of the hand controller apparatus from a first mode to a second mode, the second mode different from the first mode. The method can also include, with a feedback device supported by a body of the hand controller apparatus, providing operator feedback in response to the change in the function.
[0043] The method of operating a hand controller apparatus of any of preceding paragraphs and/or any of methods described below can include one or more of the following features. The function can include at least one of the following: controlling a camera that images a surgical site, instrument clutching to reposition the hand controller apparatus, a pre-set surgery routine, or an operation to control a surgical tool of the robotic surgery system. When the function includes controlling the camera, the first mode can include enabling control of the camera and the second mode can include disabling control of the camera.
[0044] The method of operating a hand controller apparatus of any of preceding paragraphs and/or any of methods described below can include one or more of the following features. When the function includes instrument clutching, the first mode can include enabling instrument clutching and the second mode can include disabling instrument clutching. Providing the operator feedback can include providing different feedbacks in response to different changes in the function.
[0045] In some cases, a hand controller apparatus for controlling a tool in a robotic surgery system can include a body including a proximal end and a distally located interface end configured to be coupled to an input apparatus configured to control a surgical tool. The hand controller apparatus can also include a control lever attached to a pivot joint proximate a side surface of the body and extending along the body and away from the proximal end, the control lever being laterally moveable relative to the side surface of the body about the pivot joint. The control lever can include a tail region adjacent to the pivot joint and a paddle region connected to the tail region and extending toward the distally located interface end, wherein the tail region includes an inner surface facing the body and an outer surface opposing the inner surface, and wherein at least part of the outer surface of the tail region is outwardly curved.
[0046] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The at least part of the outer surface of the tail region can include a substantially convex shape. The tail region can include a tail end horizontally overlapping the pivot joint. An outer surface of the tail end can be outwardly curved and an outer surface of the remaining portion of the tail region can be substantially flat. The tail region can include a tail end horizontally overlapping the pivot joint. A first portion of an outer surface of the tail end can be outwardly curved and a second portion of the outer surface of the tail end can be substantially flat.
[0047] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The at least part of the outer surface of the extension can have a substantially concave shape. The control lever can further include an extension extending downwardly from the paddle region, wherein at least part of the extension is curved toward the body. The hand controller apparatus can further include a cutout formed on or in the side surface of the body and configured to accommodate the control lever therein such that a longitudinal axis of the control lever is substantially parallel to a longitudinal axis of the body.
[0048] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The hand controller apparatus can further include a palm grip disposed in or on the proximal end, the palm grip including a generally downwardly curved and rounded shape configured to receive and support a portion of an operator's palm. The hand controller apparatus can further include a neck portion interposed between the pivot joint and the palm grip, wherein a width of the neck portion can be smaller than a width of the palm grip.
[0049] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The neck portion can include a protruding side surface. The protruding side surface of the neck portion can have a curvature that is substantially the same as a curvature of the at least part of the outer surface of the tail region. At least one of i) the protruding side surface of the neck portion, or ii) the at least part of the outer surface of the tail region can be configured to enable an operator to rotate the body of the hand controller apparatus about a longitudinal axis of the body with the operator's finger without rotation of the operator's wrist.
[0050] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The hand controller apparatus can further include an input control interface formed on an upper surface of the body and configured to sense an operator input. The input control interface can include a first side facing the proximal end and a second side opposing the first side and facing the distally located interface end. The hand controller apparatus can further include a slope region disposed between the first side of the input control interface and the neck portion and downwardly sloped to allow an operator's finger to be rested thereon. The slope region can be curved or linear. The input control interface can include a periphery at least part of which is raised to provide a tactile feedback for a location of the input control interface. The palm grip can be downwardly angled with respect to the neck portion to substantially resemble a natural curvature formed between an average operator's thumb and palm when the palm grip is grasped by the operator's hand.
[0051] In some cases, a robotic surgery system can include an instrument station including an insertion device configured to support a surgical tool. The robotic surgery system can also include a workstation in data communication with the instrument station. The workstation can include a hand controller apparatus configured to receive an operator input for controlling the tool. The hand controller apparatus can include a body including a proximal end and a distally located interface end configured to be coupled to an input apparatus configured to control the tool. The hand controller apparatus can also include a control lever attached to a pivot joint proximate a side surface of the body and extending along the body and away from the proximal end, the control lever being laterally moveable relative to the side surface of the body about the pivot joint. The control lever can include a tail region adjacent to the pivot joint and a paddle region extending from the tail region toward the distally located interface end. The tail region includes an inner surface facing the body and an outer surface opposing the inner surface. At least part of the outer surface of the tail region can be outwardly curved.
[0052] The robotic surgery system of any of preceding paragraphs and/or any of robotic surgery systems described below can include one or more of the following features. The at least part of the outer surface of the tail region can include a substantially convex shape. The control lever can further include an extension extending downwardly from the paddle region. At least part of the downward extension can be curved toward the body. The at least part of the outer surface of the tail region can be configured to enable an operator to rotate the hand controller apparatus about a longitudinal axis of the body with the operator's finger without rotation of the operator's wrist.
[0053] In some cases, a hand controller apparatus for controlling a tool in a robotic surgery system can include a body including a proximal end and a distally located interface end configured to be coupled to an input apparatus configured to control a surgical instrument. The hand controller apparatus can also include a control lever attached to a pivot joint proximate a side surface of the body and extending along the body and away from the proximal end, the control lever being laterally moveable relative to the side surface of the body about the pivot joint. The hand controller apparatus can also include a palm grip disposed in or on the proximal end, the palm grip including a substantially downwardly curved and rounded shape configured to receive and support a portion of an operator's palm. The hand controller apparatus can also include a neck portion interposed between the pivot joint and the palm grip, wherein a width of the neck portion is smaller than a width of the palm grip.
[0054] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. At least part of the neck portion may not horizontally overlap the pivot joint. The palm grip can include an upper portion extending from the neck portion toward the proximal end, a middle portion downwardly extending at a first angle from the upper portion, and a lower portion downwardly extending at a second angle from the middle portion. Each of the upper and lower portions can include a width smaller than a width of the middle portion. The upper portion can include a width greater than a width of the neck portion.
[0055] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The body can include an upper surface accommodating an input control interface configured to sense an operator input. The upper surface can be slanted toward the side surface of the body, the slanted upper surface configured to support an operator's index finger when the palm grip is grasped by the operator's hand. The pivot joint can be disposed inside the body and positioned closer to a longitudinal axis of the body than a longitudinal axis of the control lever.
[0056] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The control lever can include a wiper disposed inside the body and extending from the pivot joint toward the proximal end and a paddle disposed outside the body and extending at an angle from the pivot joint toward the distally located interface end. The wiper and the paddle can be connected to the pivot joint such that longitudinal axes of the wiper and the paddle are substantially parallel to each other. The longitudinal axes of the wiper and the paddle may not intersect a center of the pivot joint.
[0057] In some cases, a hand controller apparatus for controlling a tool in a robotic surgery system can include a body including a proximal end and a distally located interface end configured to be coupled to an input apparatus configured to control a surgical tool. The hand controller apparatus can also include a control lever attached to a pivot joint proximate a side surface of the body and extending along the body and away from the proximal end, the control lever being laterally moveable relative to the side surface of the body about the pivot joint. The pivot joint can be disposed inside the body and positioned closer to a longitudinal axis of the body than a longitudinal axis of the control lever. The longitudinal axis of the control lever may not intersect a center of the pivot joint.
[0058] The hand controller apparatus of any of preceding paragraphs and/or any of hand controller apparatuses described below can include one or more of the following features. The longitudinal axis of the control lever can be parallel to the longitudinal axis of the body when the control lever is in a closed position.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0059] Embodiments of the present disclosure will now be described hereinafter, by way of example only, with reference to the accompanying drawings in which:
[0060] FIG. 1 illustrates a robotic surgery system in accordance with some embodiments;
[0061] FIG. 2 illustrates a perspective view of a right side input device of the workstation shown in FIG. 1;
[0062] FIG. 3A illustrates a perspective view of a left side hand controller in an open position according to some embodiments;
[0063] FIG. 3B illustrates a perspective view of a right side hand controller in an open position according to some embodiments;
[0064] FIG. 4A illustrates a plan view of the hand controller of FIG. 3B according to some embodiments;
[0065] FIG. 4B illustrates a left side view of the hand controller of FIG. 3B according to some embodiments;
[0066] FIG. 5 illustrates a perspective view of a right side hand controller in a closed position according to some embodiments;
[0067] FIG. 6A illustrates a perspective view of a left side hand controller grasped by a user's left hand according to some embodiments;
[0068] FIG. 6B illustrates a perspective view of left and right side hand controllers grasped by a user's hands according to some embodiments;
[0069] FIG. 7 illustrates a perspective view of a right side hand controller in an open position according to some embodiments;
[0070] FIG. 8 illustrates a perspective view of a left side hand controller in an open position according to some embodiments;
[0071] FIG. 9 illustrates a perspective view of a hand controller having two pinchers in an open position according to some embodiments;
[0072] FIG. 10 illustrates an assembly view of the hand controller of FIG. 3B according to some embodiments;
[0073] FIG. 11A illustrates a close-up plan view of a hand controller pincher showing angular magnetic detection according to some embodiments;
[0074] FIG. 11B illustrates a close-up plan view of a hand controller wiper and a magnetic angular detector according to some embodiments;
[0075] FIG. 12A illustrates a conceptual diagram showing a magnetic angular detection method for a metallic portion in a wiper according to some embodiments;
[0076] FIG. 12B illustrates a perspective view of a hand controller including a linear coil inductive detector for sensing movement of a metallic portion in the paddle according to some embodiments;
[0077] FIG. 12C illustrates a plan view of the hand controller including the linear coil inductive detector of FIG. 12B;
[0078] FIG. 12D illustrates a modified printed circuit board (PCB) coil layout for the linear inductive detector shown in FIGS. 12B and 12C according to some embodiments;
[0079] FIG. 12E illustrates a standard PCB coil layout for a linear coil inductive detector according to some embodiments;
[0080] FIG. 13A illustrates a perspective view of a hand controller including a spiral coil inductive detector for sensing the movement of a metallic portion in the paddle according to some embodiments;
[0081] FIG. 13B illustrates a plan view of the hand controller including the spiral coil inductive detector of FIG. 13A according to some embodiments;
[0082] FIG. 13C illustrates a modified coil layout for the spiral inductive detector shown in FIGS. 13A and 13B according to some embodiments;
[0083] FIG. 13D illustrates a standard coil layout for a spiral coil inductive detector according to some embodiments;
[0084] FIG. 14 illustrates a perspective view of a metal shaped PCB coil formed inside the wall of a handpiece according to some embodiments;
[0085] FIG. 15 illustrates a plan view of an example trackpad of a hand controller according to some embodiments;
[0086] FIG. 16A illustrates a plan view of a capacitive touch surface having multiple ‘V’ shaped capacitive buttons according to some embodiments;
[0087] FIG. 16B illustrates a plan view of a capacitive touch surface having multiple rectangular capacitive buttons according to some embodiments;
[0088] FIG. 17A illustrates a close-up view of the trackpad according to some embodiments;
[0089] FIG. 17B illustrates an example location of a capacitive gesture recognition circuitry according to some embodiments;
[0090] FIG. 17C illustrates a cross-sectional view of a capacitive gesture recognition PCB according to some embodiments;
[0091] FIG. 18 illustrates a flowchart for a shared input control process according to some embodiments;
[0092] FIGS. 19A and 19B illustrate conceptual diagrams showing a camera control operation according to some embodiments;
[0093] FIG. 20 illustrates a flowchart for a camera control process shown in FIGS. 19A and 19B according to some embodiments;
[0094] FIGS. 21A and 21B illustrate conceptual diagrams showing an instrument clutch operation according to some embodiments;
[0095] FIG. 22 illustrates a flowchart for an instrument clutch process shown in FIGS. 21A and 21B according to some embodiments;
[0096] FIG. 23 illustrates a flowchart for a gesture control process according to some embodiments;
[0097] FIG. 24 illustrates a flowchart for a hand controller feedback control process according to some embodiments;
[0098] FIG. 25A illustrates an example location of a haptic feedback device according to some embodiments;
[0099] FIG. 25B illustrates another example location of a haptic feedback device according to some embodiments;
[0100] FIG. 26 illustrates a block diagram of a hand controller according to some embodiments;
[0101] FIG. 27 illustrates a perspective view of a right side hand controller showing palm grip ergonomic features according to some embodiments;
[0102] FIG. 28 illustrates a perspective view of the right side hand controller of FIG. 27 grasped by a user's right hand according to some embodiments;
[0103] FIG. 29 illustrates a perspective view of a right side hand controller showing paddle ergonomic features according to some embodiments;
[0104] FIG. 30 illustrates a close-up perspective view of a paddle of the right side hand controller of FIG. 29 according to some embodiments;
[0105] FIG. 31 illustrates a close-up left side view of the paddle of FIG. 30 according to some embodiments;
[0106] FIG. 32 illustrates a rear view of a right side hand controller showing other example ergonomic features according to some embodiments; and
[0107] FIG. 33 illustrates a perspective view of an example coil to be used as a compression spring for pincer angle detection according to some embodiments.
DETAILED DESCRIPTION
Overview of Robotic Surgery System
[0108] FIG. 1 illustrates a robotic surgery system 100 in accordance with some embodiments. The robotic surgery system 100 includes a workstation 102 and an instrument station or a patient cart 104. The patient cart 104 includes at least one tool mountable on a moveable instrument mount, central unit or drive unit 106 that houses an instrument drive (not shown) for manipulating the tool. The tool may include an insertion device 108 configured to support at least one surgical instrument (hereinafter to be interchangeably used with an “instrument” or “surgical tool”) and a camera (not shown) that images a surgical site. The workstation 102 may also include a tool such as an instrument clutch (that may be implemented by a foot pedal described below). The insertion device 108 can support two or more instruments (not shown). The camera may include a primary camera and at least one secondary camera. The primary camera and the secondary camera may provide different viewing angles, perform different functions and/or produce different images. At least one of the primary camera and the secondary camera may be a two-dimensional (2D) or a three-dimensional (3D) camera. FIG. 1 is merely an example of a robotic surgery system, and certain elements may be removed, other elements added, two or more elements combined or one element can be separated into multiple elements depending on the specification and requirements of the robotic surgery system.
[0109] The workstation 102 includes an input device for use by a user (for example, a surgeon; hereinafter to be interchangeably used with an “operator”) for controlling the instrument via the instrument drive to perform surgical operations on a patient. The input device may be implemented using a haptic interface device available from Force Dimension, of Switzerland, for example. The input device includes a right input device 132 and a left input device 112 for controlling respective right and left instruments (not shown). The right input device 132 includes a right hand controller 122 (hereinafter to be interchangeably used with a “hand grip” or “handpiece”) and the left input device 112 includes a left hand controller 124. The right and left hand controllers 122 and 124 may be mechanically or electrically coupled to the respective input devices 132 and 112. Alternatively, the right and left hand controllers 122 and 124 may be wirelessly coupled to the respective input devices 132 and 112 or may be wireless coupled directly to the workstation 102. In some cases, when there are two instruments at the instrument station 104, the right and left hand controllers 122 and 124 may respectively control the two instruments. In some cases, when there are more than two instruments, the right and left hand controllers 122 and 124 may be used to select two of the multiple instruments that an operator wishes to use. In some cases, when there is only one instrument, one of the right and left hand controllers 122 and 124 may be used to select the single instrument.
[0110] The input devices 132 and 112 may generate input signals representing positions of the hand controllers 122 and 124 within an input device workspace (not shown). In some cases where the input devices 132 and 112 are coupled directly and wirelessly to the workstation, they would include the necessary sensors to allow wireless control, such as an accelerometer, a gyroscope and/or a magnetometer. In other cases, a wireless connection of the input devices 132 and 112 to the workstation 102 may be accomplished by the use of camera systems alone or in combination with the described sensors. The sensors described above for wireless functionality may also be placed in each handpiece to be used in conjunction with the input devices 132 and 112 to independently verify the input device data. The workstation 102 also includes a workstation processor circuit 114, which is in communication with the input devices 132 and 112 for receiving the input signals.
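The independent verification described above can be illustrated with a short sketch. The function below is a hypothetical example, not taken from the patent: it compares a roll/pitch/yaw orientation estimate reported by an input device against one computed from a handpiece's own accelerometer/gyroscope/magnetometer data, and flags disagreement beyond a tolerance.

```python
def orientations_agree(input_device_rpy, handpiece_rpy, tol_deg=5.0):
    """Return True if two (roll, pitch, yaw) estimates, in degrees, agree
    within tol_deg on every axis, comparing angles modulo 360."""
    for a, b in zip(input_device_rpy, handpiece_rpy):
        diff = abs(a - b) % 360.0
        diff = min(diff, 360.0 - diff)  # shortest angular distance
        if diff > tol_deg:
            return False
    return True
```

A supervisory process could, for example, flag or discard input signals whenever the two estimates diverge, rather than passing them on to the instrument.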
[0111] The workstation 102 also includes a display 120 in communication with the workstation processor circuit 114 for displaying real time images and/or other graphical depictions of a surgical site produced by the camera associated with the instrument. The workstation 102 may include right and left graphical depictions (not shown) displayed on the display 120 respectively for the right and left side instruments (not shown). The graphical depictions may be displayed at a peripheral region of the display 120 to prevent obscuring a live view of the surgical workspace also displayed on the display. The display 120 may further be operable to provide other visual feedback and/or instructions to the user. A second auxiliary display 123 may be utilized to display auxiliary surgical information to the user (surgeon), displaying, for example, patient medical charts and pre-operation images. In some cases, the auxiliary display 123 may be a touch display and may also be configured to display graphics representing additional inputs for controlling the workstation 102 and/or the patient cart 104. The workstation 102 further includes a footswitch or foot pedal 126, which is actuatable by the user to provide input signals to the workstation processor circuit 114. In one case, the signal provided to the workstation processor circuit 114 may inhibit movement of the instrument while the footswitch 126 is depressed.
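One simple way a processor circuit might inhibit instrument motion while the footswitch is depressed, and then resume without the instrument jumping to the new hand-controller position, is a clutch with offset rebasing. The sketch below is illustrative only; the class name, the tuple representation of positions, and the rebasing policy are assumptions, not details from the patent.

```python
class InstrumentClutch:
    """Sketch of a clutch: while engaged, hand-controller motion is ignored;
    on release, the controller-to-instrument offset is rebased so the
    instrument does not jump to the new controller position."""

    def __init__(self):
        self.engaged = False
        self.offset = (0.0, 0.0, 0.0)  # instrument_pos - controller_pos

    def update(self, controller_pos, instrument_pos, footswitch_depressed):
        if footswitch_depressed:
            self.engaged = True
            return instrument_pos  # hold the instrument still
        if self.engaged:
            # Rebase: capture the new offset at the moment of release.
            self.offset = tuple(i - c for i, c in zip(instrument_pos, controller_pos))
            self.engaged = False
        return tuple(c + o for c, o in zip(controller_pos, self.offset))
```

With this policy the operator can reposition the hand controller freely while the footswitch is held, and instrument motion resumes smoothly from where it stopped.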
[0112] The patient cart 104 includes an instrument processor circuit 118 for controlling the central unit 106, insertion device 108, one or more instruments and/or one or more cameras. The instrument processor circuit 118 is in communication with the workstation processor circuit 114 via an interface cable 116 for transmitting signals between the workstation processor circuit 114 and the instrument processor circuit 118. In some cases, communication between the workstation processor circuit 114 and the instrument processor circuit 118 may be wireless or via a computer network, and the workstation 102 may even be located remotely from the instrument station 104. Input signals are generated by the right and left input devices 132 and 112 in response to movement of the hand controllers 122 and 124 by the user within the input device workspace and the instrument is spatially positioned in a surgical workspace in response to the input signals.
[0113] FIG. 2 illustrates a perspective view of the right side input device 132 of the workstation 102 shown in FIG. 1. Since the structure and operations of the right and left input devices 132 and 112 are substantially the same, the description will be provided only for the right side input device 132. Furthermore, FIG. 2 illustrates only an example of an input device; input devices having other structures and shapes may also be used, as long as they receive a user's inputs for controlling the operation of the instrument. Referring to FIG. 2, the input device 132 includes three moveable arms 180, 182, and 184. The hand controller 122 may be coupled via a gimbal mount 186 to the moveable arms 180, 182, and 184. The input device 132 may include sensors (not shown) that sense the position of each of the arms 180, 182, and 184 and the rotation of the hand controller 122 and produce signals representing a current position of the hand controller 122. In such case, the position signals are transmitted as input signals to the workstation processor circuit 114. The hand controller 122 may include a user actuatable button or input control interface 326a (see, for example, FIG. 3B), which may produce additional input signals for transmission to the workstation processor circuit 114.
[0114] Additional details of the robotic surgery system 100 including the hand controllers 122 and 124 are described in U.S. Patent Publication No. 2018/0168758, which is assigned to the assignee of the present application and the disclosure of which is incorporated by reference in its entirety.
Overview of Handpiece
[0115] FIG. 3A illustrates a perspective view of a left side handpiece 124 in an open position according to some embodiments. FIG. 3B illustrates a perspective view of a right side handpiece 122 in an open position according to some embodiments. FIG. 4A illustrates a plan view of the handpiece 122 of FIG. 3B according to some embodiments. FIG. 4B illustrates a left side view of the handpiece 122 of FIG. 3B according to some embodiments. FIG. 5 illustrates a perspective view of a right side handpiece 122 in a closed position according to some embodiments. The handpieces 124 and 122 shown in FIGS. 3A to 5 can be used respectively as hand controllers for the input devices 112 and 132 shown in FIG. 1.
[0116] Each of the handpieces 124 and 122 shown in FIGS. 3A and 3B includes a single pincher 308/328 (hereinafter to be interchangeably used with “pincer,” “paddle” or “control lever”). Each of the single paddle handpieces 124 and 122 may control the movement of one or a pair of jaws of a corresponding surgical instrument. The movement can include opening and/or closing of the one or more jaws. Thus, providing the single paddle handpieces 124 and 122 may be beneficial, as manufacturing costs can be reduced and their manufacturing procedure can be simplified. However, each handpiece may also include two pinchers (see, for example, FIG. 9). Furthermore, although FIG. 3A shows that the pincher 308 of the left side handpiece 124 is disposed on the left side of the body 305, the pincher 308 may be disposed on the right side of the body 305 (see, for example, FIG. 6A). Moreover, although FIG. 3B shows that the pincher 328 of the right side handpiece 122 is disposed on the right side of the body 325, the pincher 328 may be disposed on the left side of the body 325 (not shown).
[0117] Referring to FIG. 3A, the left side handpiece 124 includes a proximal end 301, an upper handpiece housing 302, a lower handpiece housing 304, a handpiece body 305, an input control interface 306a, a pincher 308 having a pivot point, a tail end 311 and a paddle end 313, an upper housing 310, a lower housing 312, a front plate (or connector) 314 and a distally located interface end 315. The proximal end 301 and the distally located interface end 315 may be part of the handpiece body 305.
[0118] Referring to FIG. 3B, the right side handpiece 122 includes a proximal end 321, an upper handpiece housing 322, a lower handpiece housing 324, a handpiece body 325, an input control interface 326a, a pincher 328 having a pivot joint 372 (see, for example, FIG. 11A), a tail end 331 and a paddle end 333, an upper housing 330, a lower housing 332 and a front plate 334, and a distally located interface end 335. The proximal end 321 and the distally located interface end 335 may be part of the handpiece body 325.
[0119] The handpiece 122 may be configured for operation by a right hand of the operator and the handpiece 124 may be configured for operation by a left hand. The left handpiece 124 may be configured as a mirror image of the right handpiece 122 as shown in FIGS. 3A and 3B, but may be differently configured depending on the nature of the task. For example, only one of the right and left handpieces 122 and 124 may include an input control interface. In such case, actuation on the single input control interface may perform input control for both of the handpieces 122 and 124. Furthermore, depending on the embodiment, at least one of the right and left handpieces 122 and 124 may include a plurality of input control interfaces. In some cases, the plurality of input control interfaces may have the same shape, function and/or structure. In some cases, the plurality of input control interfaces may have different shapes, functions and/or structures. Since the structure and operations of the right and left handpieces 122 and 124 are substantially the same, the description will be provided only for the right side handpiece 122.
[0120] The proximal end 321 of the right handpiece 122 may be shaped to be grasped by a right hand of an operator. Here, the proximal end 321 may include the handpiece housings 322 and 324. The proximal end 321 may also be referred to as a handle or a palm rest. The proximal end 321 may have a generally downwardly curved and rounded shape operable to receive and support a portion of the operator's palm when the body 325 is grasped in the hand of the operator. Although the upper and lower housings 322 and 324 appear to be as long as the remaining portion of the body 325, the present disclosure is not limited thereto. That is, the upper and lower housings 322 and 324 may be longer or shorter than the remaining portion of the body 325. The distally located interface end 335 may be configured for coupling to the input apparatus 132 for controlling the surgical tool associated with the robotic surgery system 100. At least a portion of the front plate 334 may be positioned in the distally located interface end 335.
[0121] The pincher 328 may be attached to the body 325 at the pivot joint 372 (see, for example, FIG. 11A). The pincher 328 may extend from the tail end 331 to the paddle end 333 along the body 325, away from the proximal end 321. FIG. 3B shows that the pincher 328 is in an open position. The open position means that the pincher 328 is opened by laterally moving away from the body 325 in a direction (for example, a clockwise direction) so that the pincher 328 forms an angle θ (hereinafter referred to as a “pincher angle” or “pincer angle”) with respect to a side surface of the body 325. For the left side handpiece 124, the pincher 308 is opened by laterally moving away from the body 305 in an opposite direction (for example, a counterclockwise direction) so that the pincher 308 has a pincer angle (θ) with respect to the body 305. In some cases, the pincer angle (θ) can be in the range of 0° to about 15°. In some cases, the pincer angle (θ) can be in the range of 0° to about 12.5°. By adjusting the pincer angle, the position or movement of the instrument can be adjusted in a highly accurate manner. In some cases, the maximum pincer angle (θ) can be greater or smaller than about 15°.
[0122] In some cases, the pincher 328 can be elastically moved between the open position and the closed position. In such cases, the pincher 328 may be configured to have the open position as an original or default position. The pincher 328 can restore to the original position via an elastic element such as a compression spring when it is released by a user (see reference numeral 348 in FIG. 11B). When an operator desires to fix the pincher 328 in a partially open position, the operator may be required to hold the pincher 328 in that position with his or her finger. In some cases, a magnet or an electromagnet may be used in place of or in addition to the compression spring to fix the pincher 328 in a particular position and/or to provide a rebounding or resistive force that causes the pincher 328 to return to an open position upon being actuated/closed. Furthermore, the movement of the pincher 328 may be controlled by a processor so that the pincher 328 is fixed in a partially open position without the operator's finger holding the pincher 328 at that position. In such cases, the pincher 328 may be caused to remain in a particular position by varying the amount of electromagnetic force being delivered. Varying the electromagnetic force may also be used to provide a resistive or feedback force on the operator's finger placed on the pincher 328 or as the operator tries to actuate the pincher 328. In some cases, the pincher 328 can be non-elastically (for example, mechanically) moved between the open position and the closed position by pressing the tail end 331 of the pincher 328 or by pulling the paddle end 333 away from a side surface of the body 325, for example in a clockwise direction. In such cases, the pincher 328 may be fixed in a partially open position without the operator's finger holding the pincher 328 at that position.
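As a rough illustration of holding the pincher at a commanded angle by varying electromagnetic force, the following hypothetical proportional controller converts the angle error into a saturated coil-current command. The function name, gain, and current limit are invented values for this sketch, not parameters disclosed in the patent.

```python
def electromagnet_command(target_angle_deg, measured_angle_deg,
                          gain_per_deg=0.1, max_current_a=0.5):
    """Proportional sketch: command a coil current proportional to the
    angle error, saturated at the coil's current limit. By convention
    here, positive current closes the pincher; negative current opens it."""
    error = measured_angle_deg - target_angle_deg
    current = gain_per_deg * error
    return max(-max_current_a, min(max_current_a, current))
```

The same current command could be biased to impose a resistive or feedback force on the operator's finger as the pincher is actuated.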
[0123] When the pincher 328 is laterally moved in the workstation 102, the processor circuit 114 generates and transmits a control signal to the patient cart 104 such that one or both of the jaws of the instrument are simultaneously opened or closed based on the control signal. For example, if the pincer angle (θ) is small, the one or more jaws of the instrument are opened by a correspondingly small amount. Furthermore, if the pincer angle (θ) is large, the one or more jaws of the instrument are opened by a correspondingly large amount.
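The proportional relationship between pincer angle and jaw opening described above can be sketched as a simple normalization. The 15° maximum below matches the example range given earlier; the function itself is an illustrative assumption, not the system's actual mapping.

```python
def jaw_opening_fraction(pincer_angle_deg, max_angle_deg=15.0):
    """Map the measured pincer angle to a normalized jaw opening in [0, 1],
    clamping readings outside the mechanical range."""
    clamped = max(0.0, min(max_angle_deg, pincer_angle_deg))
    return clamped / max_angle_deg
```

A control signal sent to the patient cart could then scale the actual jaw aperture by this fraction.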
[0124] The pincher 328 may be accommodated in a cutout 336 in a closed position where the pincer angle is generally 0° (see, for example, FIG. 5). The cutout 336 may include a recess, an indentation or a groove. The pincher 328 may be received in the cutout 336 such that a surface of the pincher 328 facing the body 325 is generally contiguous with side surfaces of the body 325 when the pincher 328 is in the closed position. In some cases, the handpiece 122 may not include a cutout portion, and the paddle end 333 of the pincher 328 contacts the body 325 in a closed position where the pincer angle is generally 0°.
[0125] The input control interface 326a may be positioned on an upper surface of the body 325. An operator may perform a primary control of repositioning the input devices or actuating actuators to control end-effectors (for example, one or more jaws) of the instruments. An additional or secondary control (other than the primary control) may be performed using the input control interface 326a. For example, the input control interface 326a may be used to receive additional user inputs, such as camera control or instrument clutch, which may be difficult for an operator to provide with the handpieces or foot pedal, particularly while the operator is moving the handpieces. The additional user inputs may also include controlling tool functions (to be described later), controlling paddle movements, and/or selecting particular instruments when there are more than two surgical instruments. The input control interface 326a may be generally horizontally aligned with at least a portion of the pincher 328. The input control interface 326a may be a PCB slider having an actuator surface or an input control interface (to be described below in greater detail). The input control interface 326a may be slightly inclined toward a side of the body 325 where an operator's index finger would be located when the operator grasps the handpiece. A detailed structure of a handpiece having an inclined input control interface is described in U.S. Patent Publication No. 2018/0168758, which is incorporated by reference in its entirety.
Operation of Handpiece
[0126] FIG. 6A illustrates a perspective view of a left side handpiece 124a grasped by a user's left hand according to some embodiments. FIG. 6B illustrates a perspective view of left and right side handpieces 124 and 122 grasped by a user's hands according to some embodiments.
[0127] Referring to FIG. 6A, a pincher 308a of the handpiece 124a is disposed on the right side of a body 325a. In such case, a user's left thumb 710 can be positioned on the pincher 308a of the handpiece 124a to close or open the pincher 308a. Furthermore, the remaining four fingers may be positioned on the left side of the body 325a. A user's index finger 720 may be positioned on the top surface of the body 325a to actuate the input control interface 326a.
[0128] Referring to FIG. 6B, the left side handpiece 124 is grasped by a user's left hand, whereas the right side handpiece 122 is grasped by a user's right hand. The operator's left index finger 720 is shown operating the left pincher 308 (partially shown) whereas the operator's thumb 710 is shown grasping the body 325 of the handpiece 124. Furthermore, the operator's right index finger 740 is shown operating the right pincher 328 whereas the operator's thumb 730 is shown grasping the body 325 of the handpiece 122. The operator can open and close the left and right pinchers 308 and 328 by making pincher movements (for example, by pressing the respective paddle ends 313 and 333) with the index fingers respectively.
[0129] The left handpiece 124 may be rotated by a user's left hand. Furthermore, the right handpiece 122 may be rotated by a user's right hand about the center of the front plate 334. For the left handpiece 124, a user's thumb 710 and a portion of a user's palm may grasp or support the handpiece 124, whereas one of the index finger 720, middle, ring or pinky fingers can be used to operate the pincher (not shown), for example via a fingertip. Similarly, for the right handpiece 122, a user's thumb 730 and palm may grasp or support the handpiece 122, whereas one of the index finger 740, middle, ring or pinky fingers can be used to operate the pincher 328, for example via a fingertip. The pincher 308 of the left handpiece 124 and the pincher 328 of the right handpiece 122 may be sized such that, when grasped by the hand of an average operator, the respective pinchers are positioned to receive the distal phalanges of the operator's fingers 720/740 and thumbs 710/730.
[0130] The single control lever 328 of the right handpiece 122 may produce a control signal for the right input device 132 configured to simultaneously move one or a pair of jaws of a corresponding surgical tool. Furthermore, the single control lever 308 of the left handpiece 124 may produce a control signal for the left input device 112 configured to simultaneously move one or a pair of jaws of a corresponding surgical tool.
Additional Handpiece Examples
[0131] FIG. 7 illustrates a perspective view of a right side handpiece 122b in an open position according to some embodiments. The handpiece 122b of FIG. 7 has a different shape compared to the previous handpiece examples. For example, the handpiece 122b has a relatively long and substantially linear housing 322a. Furthermore, the handpiece 122b has a pincher 328 disposed near the top of a body 325. The handpiece 122b has a paddle with a relatively narrow width (measured in a longitudinal direction of the handpiece).
[0132] FIG. 8 illustrates a perspective view of a left side hand controller 124b in an open position according to some embodiments. The handpiece 124b of FIG. 8 has a different shape compared to the previous handpiece examples. For example, a portion of the body 305 and a portion of a housing 322b are cut away. Thus, the housing 322b is relatively short. The housing 322b is also generally linear.
[0133] FIG. 9 illustrates a perspective view of another handpiece 117 in an open position according to some embodiments. The hand controller 117 includes two pinchers 309 and 316 respectively disposed on the left and right sides of the body 327. In such case, as the pinchers 309 and 316 are opened and closed, one or a pair of jaws of the instrument are opened and closed. A detailed operation of a two pincher handpiece is described in U.S. Patent Publication No. 2018/0168758, which is incorporated by reference in its entirety.
[0134] It is appreciated that the handpieces shown in FIG. 3A to FIG. 9 are merely examples and the present disclosure is not limited thereto. For example, many other handpieces are possible, including one or more of the following variations: different body shapes, different pincher shapes or dimensions, different numbers of pinchers, different positions, shapes or numbers of input control interfaces, and/or different positions of other handpiece elements.
Assembly of Handpiece
[0135] FIG. 10 illustrates an assembly view of the handpiece 122 of FIG. 3B according to some embodiments. FIG. 10 is merely an example assembly view of the handpiece 122; certain elements may be removed, other elements may be added, two or more elements may be combined, or one element may be separated into multiple elements depending on the specification and requirements of the handpiece. Referring to FIG. 10, the upper and lower handpiece housings 322 and 324 accommodate a first PCB 350, a first PCB carrier 354, a wiper (or extension or inner paddle) 370, a bar magnet (hereinafter to be interchangeably used with a “magnet,” “magnetic portion,” or a “magnetic target”) 352, a compression spring 348 and a pivot joint 372. The upper and lower housings 330 and 332 accommodate a center mount 342, a vibration motor (or a haptic actuator) 344 and a second PCB 326. The upper housing 330 has an opening 346 that accommodates and exposes a top surface of the second PCB 326. The front plate 334 and a front plate label 338 are connected to the center mount 342 via a screw 336 and a threaded insert 340.
[0136] The first PCB 350 may include a pincer angle detector and/or a presence detector (to be described in greater detail below). The first PCB 350 may also include a handpiece feedback control device (to be described later). The first PCB carrier 354 accommodates the first PCB 350. The bar magnet and/or the compression spring 348 can also be used to detect a pincer angle in connection with the pincer angle detector as described herein. The pincher 328 may be rotatably fixed to an interior portion of the upper and lower handpiece housings 322 and 324 via a pin (not shown) inserted into a pin hole 334 of the pivot joint 372. For example, the pincher 328 may rotate laterally from a side portion of the body 325 about the pivot joint 372.
[0137] The second PCB 326 may include an IC for driving a trackpad or a capacitive touch surface 326a for user input and gesture control (to be described in greater detail below). The trackpad or capacitive touch surface may be positioned on the top surface of the second PCB 326. The second PCB 326 may also include one or more of the pincer angle detector, the presence detector or the handpiece feedback control device. The vibration motor 344 may be mounted on the center mount 342. However, the vibration motor 344 may be located in other positions inside the handpiece 122. The vibration motor 344 can be used for providing a haptic feedback to an operator (to be described in greater detail below).
Paddle Actuation Sensing/Pincer Angle Detection
[0138] As described herein, the pincher or paddle moves between a closed position and an open position. The open position includes a partially open position and a completely open position. The paddle forms a pincer angle with respect to a side surface of the handpiece body facing the paddle. In some cases, the pincer angle is at its minimum in the closed position and at its maximum in the completely open position. In operation, the pincer angle falls between the minimum and maximum at a partially open position. As the paddle moves from a closed position to a partially or completely open position, the one or more jaws of the surgical instrument also move to correspond to the movement of the paddle. Likewise, as the paddle moves from the open position to the closed position, the one or more jaws of the surgical instrument move correspondingly. Thus, it is advantageous to sense or detect an accurate position of the paddle, or the pincer angle, in order to more precisely control the movement of the surgical instrument.
[0139] Pincer angle detection or paddle actuation sensing can be done in various ways. In some cases, the pincer angle can be detected by magnetically or inductively sensing a movement of a metallic portion or target disposed in the wiper or paddle. For example, a magnetic angular detector, an inductive/eddy current detector or a proximity sensor can be used for pincer angle detection. However, other detection methods can also be used as long as they can detect a position of the paddle, a pincer angle with respect to the body, or a distance between the paddle and the body. Although the pincer angle detection or paddle actuation sensing is described in connection with a one-paddle handpiece, it can be applied to a handpiece having two paddles. In such cases, since the two paddles of the handpiece would move symmetrically, pincer angle detection for only one of the paddles may be sufficient to control the movement of the surgical instrument.
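As an illustrative sketch only (not part of the disclosed embodiments), the mapping from a sensed pincer angle to a normalized jaw command might look like the following, assuming a calibrated 0° to 15° pincer range; the function and parameter names are hypothetical:

```python
def pincer_angle_to_jaw_fraction(angle_deg, closed_deg=0.0, open_deg=15.0):
    """Map a sensed pincer angle to a normalized jaw-opening command in [0, 1].

    closed_deg/open_deg are assumed calibration limits (the text describes a
    roughly 0-15 degree pincer range); 0.0 = jaws closed, 1.0 = fully open.
    """
    span = open_deg - closed_deg
    fraction = (angle_deg - closed_deg) / span
    # Clamp so sensor noise outside the calibrated range cannot over-drive the jaws.
    return max(0.0, min(1.0, fraction))
```

Clamping protects the instrument from over-travel if the detector reports an angle slightly outside the calibrated range.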
1. Magnetic Angular Detector for Detecting Wiper Movement
[0140] This method detects an angular movement of a magnetic portion or target that moves when a pincher laterally moves with respect to a side surface of the handpiece body. In some cases, the magnetic target can be attached to and move along with the wiper when the pincher laterally moves with respect to the side surface of the body. In some cases, at least a portion of the wiper can be a magnetic target. For example, a part or the entirety of the wiper can be formed of a magnetic material. In such cases, no separate magnet is required. Magnetic angle detection may provide several advantages over magnetic strength detection, primarily because angle does not drift with time or temperature (unlike strength).
[0141] FIG. 11A illustrates a close-up plan view of the pincher 328 showing magnetic angular detection according to some embodiments. FIG. 11B illustrates a close-up plan view of the wiper 370 and a magnetic angular detector according to some embodiments. The pincher 328 includes a paddle 329 disposed outside the body 325 and a wiper 370 disposed inside the body 325. Referring to FIG. 11A, the paddle 329 laterally moves between a closed position 329a and an open position 329b so that the paddle 329 forms a pincer angle (θ) with respect to a surface of the handpiece body facing the paddle 329. As the paddle 329 laterally moves between the closed position 329a and the open position 329b, the wiper 370 moves in an opposite direction between a substantially parallel position 370a and an angled position 370b as shown in FIG. 11A. When the paddle 329 and the wiper 370 move in opposite directions, the two pincher elements 329 and 370 maintain a substantially parallel and spaced-apart relationship with respect to each other as indicated by two parallel dotted lines 371 in FIG. 11A. In some cases, the paddle 329 and the wiper 370 may not be substantially parallel but would maintain a spaced-apart relationship. The paddle 329 and the wiper 370 may elastically move between the open position and the closed position via the compression spring 348 disposed inside the handpiece body or by other appropriate means, for example as described above.
[0142] In the closed position, the pincer angle (θ) may be generally zero, as the paddle 329 would contact the side surface of the handpiece body 325. In the completely open position, the pincer angle (θ) may be about 12.5° to about 15°. Thus, the paddle 329 may move between the pincer angles in the range of 0° to 15°. However, the maximum pincer angle can be less than or greater than about 15° depending on the embodiment. The wiper 370 may generally form the same angle between the two positions 370a and 370b as the pincer angle, as the wiper 370 and the paddle 329 are fixed relative to each other.
[0143] In some embodiments, the magnet or magnetic target 352 may be attached to the wiper 370. In such cases, the wiper 370 may or may not be formed of a metallic material, as long as the magnet 352 can be attached to the wiper 370, for example, via adhesive. In some cases, the wiper 370 may be formed at least partially of a magnetic material. For example, a portion of the wiper 370 may be a magnet or the entirety of the wiper 370 can be a magnet. In such cases, no separate magnetic target needs to be attached to the wiper 370.
[0144] The first PCB 350 may include a magnetic angular detector configured to detect an angular movement of the magnetic target 352 that rotates or laterally moves along with the wiper 370 about the pivot joint 372 (ergonomic features of the pivot joint and paddle design to be described at the “Handpiece Ergonomic Features” section later). In some cases, the magnetic angular detector can be implemented with, for example, integrated circuits (ICs) available from Monolithic Power Systems Inc. (MPS). The MPS ICs generally detect the absolute angular position of a permanent magnet, typically a diametrically magnetized cylinder on a rotating shaft. The MPS ICs can be tunable and can provide a robust solution. For example, the MPS ICs may achieve greater than about 9 bits of resolution over the 12.5° range of the pincer angle.
[0145] In some cases, the magnetic angular detector can be implemented with, for example, ICs available from Analog Devices Inc. (ADI). The ADI ICs can be an anisotropic magnetoresistive (AMR) sensor with integrated signal conditioning amplifiers and ADC drivers that can produce two analog outputs indicating the angular position of the surrounding magnetic field.
[0146] MPS ICs and ADI ICs are merely example magnetic angular detectors that realize the magnetic angular detection, and other magnetic angular detecting circuits can also be used as long as they can detect an angular movement of a magnet attached to or integrally formed with the wiper 370. In some cases, the magnet 352 may include rare earth magnets. Rare earth magnets generally decay at a rate of less than 1% per decade. In some cases, any magnet could be used for the magnet 352.
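For illustration, assuming roughly 9 bits of resolution over the 12.5° pincer range mentioned above, a raw angular-detector reading could be scaled to an angle as follows (the names and exact count scale are hypothetical, not taken from any vendor datasheet):

```python
FULL_SCALE_COUNTS = 512   # assumed ~9-bit full scale over the pincer range
PINCER_RANGE_DEG = 12.5   # completely-open pincer angle from the text

def counts_to_pincer_angle(counts):
    """Convert a raw angular-detector reading to a pincer angle in degrees."""
    return (counts / FULL_SCALE_COUNTS) * PINCER_RANGE_DEG

# Each count then resolves about 12.5/512, i.e. roughly 0.024 degrees of
# paddle motion, which illustrates why ~9 bits suffice for smooth jaw control.
```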
2. Inductive/Eddy Current Detector
[0147] This method uses the concept of an inductive or eddy current that is induced in an inductive coil when a metallic target moves over the coil. In some cases, the metallic target may be disposed in or integrally formed with the wiper, and move in an arced or curved path over the inductive coil disposed inside or outside the wiper. In some cases, the metallic target may be attached to or integrally formed with the paddle, and move in an angled path with respect to the inductive coil. For the purpose of convenience, the description will be made for a metallic target which is attached to either the wiper or the paddle (instead of being integrally formed with the wiper or the paddle). The inductive/eddy current detector differs from the magnetic angular detector in that the former does not require the use of a magnet. This method is also inherently resilient to outside electromagnetic interference, as no magnet is required.
[0148] A. Inductive/Eddy Current Sensor for Detecting Target in Wiper
[0149] FIG. 12A illustrates a conceptual diagram showing an inductive detection method for a metallic portion or target 369 disposed in the wiper 370 according to some embodiments. The first PCB 350 shown in FIG. 11B may include an inductive detector that can detect a metallic target (or metallic portion) moving in an arced path with respect to an inductive coil disposed in the first PCB 350. In some cases, the first PCB 350 may include both the magnetic angular detector and the inductive detector. In some cases, a separate PCB may be used to accommodate the inductive sensor.
[0150] The operation of an inductive detector for detecting a metallic target 369 at the wiper 370 is described with respect to FIG. 12A. The inductive detector may detect a curved or arced movement of the metallic target 369, when the paddle 329 moves laterally from side portion of the body 325. Referring to FIGS. 11A and 12A, the wiper 370 and the metallic target 369 move in a curved path between the two positions 370b and 370a over a curved PCB coil layout 382 of the inductive detector, as the paddle 329 moves in a curved direction opposite to the curved path of the wiper 370. The curved PCB coil layout 382 may be manufactured by bending a linear PCB coil layout into an arc track during PCB layout. The linear PCB track can be shaped to suit whatever path a metallic target takes.
[0151] The metallic target 369 disposed in the wiper 370 may move on substantially the same plane (or substantially parallel planes) as the plane on which the curved PCB coil layout 382 is positioned (for example, substantially coplanar). In some cases, to allow for different shapes of handpieces, the metallic target 369 disposed in the wiper 370 may move on a different plane than the plane on which the curved PCB coil layout 382 is positioned. Therefore, the metallic target 369 can track the curved PCB coil layout 382, as the wiper 370 moves in the curved path. As the metallic target 369 moves over the curved PCB coil layout 382, electrical current is induced at the curved PCB coil layout 382. The metallic target 369 may have a trapezoidal shape as shown in FIG. 12A to more closely track the curved path over the curved PCB coil layout 382.
[0152] In some cases, the PCB coil layout 382 may include one transmitter coil and two receiver coils in different paths. The inductive sensor may demodulate and process secondary voltages received at the receiver coils, and obtain a signal representing the metallic target's position. The inductive sensor can be implemented with, for example, ICs available from Integrated Device Technology, Inc. (IDT). The IDT ICs can compare voltage values received at the two receiver coils, combine this comparison with the knowledge of their different paths, and may cancel out certain mechanical tolerances (for example, even if the metallic target were a bit off angle, the result would not be severely impacted).
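The ratiometric principle behind such two-receiver-coil sensors can be sketched as follows. This assumes the common arrangement in which the two receiver coils are patterned so their demodulated amplitudes vary as the sine and cosine of the target position along the track; the actual IDT demodulation details are not specified in the text, so this is a generic illustration:

```python
import math

def target_position(v_sin, v_cos):
    """Estimate a normalized target position from two receiver-coil amplitudes.

    Assumes the two receiver coils produce demodulated outputs varying as sine
    and cosine of the target's position along the track.  Taking the ratio (via
    atan2) cancels common-mode factors such as airgap and excitation amplitude,
    which is why a slightly off-angle target does not severely disturb the
    result.
    """
    phase = math.atan2(v_sin, v_cos)   # radians, amplitude-independent
    return phase / (2 * math.pi)       # fraction of one electrical period
```

Because the ratio is what carries the position, doubling both amplitudes (e.g. a smaller airgap) leaves the result unchanged.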
[0153] B. Inductive/Eddy Current Sensor for Detecting Target in Paddle
[0154] The pincer angle may be detected by inductively sensing a movement of a metallic target disposed in the paddle. In some cases, the movement of the metallic target in the paddle may be sensed by a linear coil inductive sensor. In some cases, the movement of the metallic target in the paddle may be sensed by a spiral-shaped coil inductive sensor.
[0155] a. Linear Coil Inductive Detector
[0156] FIG. 12B illustrates a perspective view of a handpiece including a linear coil inductive detector 383 for sensing the movement of a metallic target 387 disposed in the paddle 329 according to some embodiments. FIG. 12C illustrates a plan view of the handpiece including the linear coil inductive detector 383 of FIG. 12B. FIG. 12D illustrates a modified PCB coil layout 385a for the linear inductive detector 383 shown in FIGS. 12B and 12C according to some embodiments. FIG. 12E illustrates a standard PCB coil layout 385b for a linear coil inductive detector.
[0157] Referring to FIG. 12B, the linear coil inductive detector 383 may include a linear coil sensor 384 and a PCB coil layout 385. Referring to FIGS. 12B and 12C, the inductive detector 383 is disposed inside the handpiece so that the PCB coil layout 385 faces the metallic target 387 disposed in the paddle 329. In some cases, the inductive detector 383 or at least the PCB coil layout 385 of the detector 383 may be disposed inside the paddle 329. In such cases, the metallic target may be disposed inside the handpiece body to face the PCB coil in the paddle 329. Furthermore, when the PCB coil layout 385 is disposed inside the paddle 329, the linear coil sensor 384 may be disposed inside the body.
[0158] When the paddle 329 moves between the closed position 329a and the open position 329b (see, for example, FIG. 11A), it neither directly approaches nor directly moves away from the PCB coil layout 385. Instead, the paddle 329 moves with respect to the PCB coil layout 385 at an angle. Thus, the plane of the PCB coil layout facing the metallic target 387 would not be parallel to the plane of the metallic target 387. Unlike the metallic target 369 disposed in the wiper 370, which moves in parallel with respect to the PCB coil 382, the PCB coil layout 385 and the metallic target 387 are not coplanar, except in the closed position of the paddle 329, where the target 387 and the coil layout 385 would be coplanar.
[0159] The operation of the linear coil inductive detector 383 is described with respect to FIG. 12C. In FIG. 12C, the paddle and wiper drawn in dotted lines indicate that the paddle 329 and the wiper 370 are in a closed position. The metallic target 387 forms a pincer angle (θ) in an open position. As the paddle 329 laterally moves from the open position (θ) to the closed position (generally 0°), the metallic target 387 (for example, a middle portion thereof) moves with respect to the PCB coil layout 385 from a position A to a position B on the PCB coil layout 385. Furthermore, as the paddle 329 laterally moves from the closed position to the open position (θ), the metallic target 387 moves with respect to the PCB coil layout 385 from the position B to the position A on the PCB coil layout 385. In some cases, the inductive sensor 384 may process secondary voltages received at the receiver coils on the PCB coil layout 385, and obtain a signal representing the position of the metallic target 387. In some cases, the inductive sensor 384 can be implemented with, for example, ICs available from IDT.
[0160] In some cases, the PCB coil layout may have a modified linear coil layout 385a shown in FIG. 12D. The non-coplanar nature of the metallic target 387 with respect to the PCB coil layout 385 may become more substantial as the pincer angle becomes greater, and may become less substantial or insignificant as the pincer angle approaches zero. The modified PCB coil layout 385a may adjust the change in output of the sensor 384 due at least to the change in proximity to the metallic target 387 so that the output may become substantially the same as the standard linear layout 385b shown in FIG. 12E. In some cases, an additional adjustment may be made by a further modification to the modified PCB coil layout 385a and/or by a processor in order to further compensate the non-coplanar nature of the movement of the metallic target 387. This additional adjustment by the processor may be made to the modified coil layout 385a or the standard coil layout 385b.
[0161] In some cases, the metallic target 387 may also have a modified shape in order to at least partially compensate the non-coplanar nature of the movement of the metallic target 387 with respect to the PCB coil layout 385. For example, the metallic target 387 may have a generally trapezoidal shape (not shown) that is generally inverse with respect to the modified PCB coil layout 385a shown in FIG. 12D. For example, the trapezoidal shape of the metallic target 387 may have the height of the left side smaller than the height of the right side, unlike the trapezoidal shape of the modified PCB coil layout 385a where the height of the right side is smaller than the height of the left side. In some cases, the paddle 329 may be modified such that a curvature of the paddle, or at least the inside face of the paddle (the side of the paddle facing the body of the handpiece), may be slightly curved as opposed to being linear (as shown in the drawings, see for example FIG. 4A). The metallic target 387 would also follow this curvature. In some cases, only the metallic target 387 would be modified to be curved. This curvature of the paddle, paddle face and/or the metallic target 387 would provide a different amount of area of the metallic target that would be substantially coplanar with the PCB coil layout 385 as the pincher is depressed (moved laterally). The curvature could compensate the non-coplanar nature of the movement of the metallic target 387 with respect to the PCB coil layout 385. The described modifications are merely examples, and other modifications to the PCB coil layout 385, the metallic target 387, the positioning of the PCB coil layout, the curvature of the PCB coil layout and/or the modification by a processor may also be made so that the metallic target may follow a substantially coplanar moving path with respect to a sensor coil, or so that at least the output from the inductive sensor follows a more standardized output.
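The processor-side adjustment mentioned above could, for example, take the form of a calibration lookup table with linear interpolation between points captured at known paddle positions. The readings and angles below are illustrative only:

```python
def correct_reading(raw, calibration):
    """Map a raw inductive reading to a pincer angle via a calibration table.

    `calibration` is a list of (raw_reading, true_angle_deg) pairs captured at
    known paddle positions; readings between points are linearly interpolated,
    and readings outside the table are clamped to the end points.  All numbers
    used with this sketch are illustrative, not from the disclosure.
    """
    points = sorted(calibration)
    if raw <= points[0][0]:
        return points[0][1]
    if raw >= points[-1][0]:
        return points[-1][1]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= raw <= x1:
            t = (raw - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
```

A denser table near the open position, where the non-coplanar error is largest, would concentrate correction effort where it is needed.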
[0162] b. Spiral Coil Inductive Detector
[0163] FIG. 13A illustrates a perspective view of a handpiece including a spiral coil
[0164] inductive detector 394 (sensor circuitry not shown; hereinafter to be interchangeably used with “spiral coil layout”) for sensing the movement of a metallic portion or target 392 disposed in the paddle 329 according to some embodiments. FIG. 13B illustrates a plan view of the handpiece including the spiral coil inductive detector 394 of FIG. 13A. FIG. 13C illustrates a modified coil layout 394a for the spiral inductive detector 394 shown in FIGS. 13A and 13B according to some embodiments. FIG. 13D illustrates a standard coil layout 394b for a spiral coil inductive detector.
[0165] Referring to FIGS. 13A and 13B, the inductive detector 394 is disposed inside the handpiece body so as to face the metallic target 392 disposed in the paddle 329. In some cases, the spiral coil layout 394 may be disposed inside the paddle 329. In such cases, the metallic target may be disposed inside the handpiece body to face the coil layout in the paddle 329.
[0166] The operation of the spiral coil inductive detector 394 is described with respect to FIG. 13B. Referring to FIG. 13B, as the paddle 329 laterally moves from the open position (θ) to the closed position (the paddle in the dotted lines shows that the paddle is positioned at a pincer angle of generally 0°), the metallic target 392 (for example, a middle portion thereof) moves with respect to the coil layout 394 from a position A to a position B on the coil layout 394. Furthermore, as the paddle 329 laterally moves from the closed position to the open position (θ), the metallic target 392 moves with respect to the coil layout 394 from the position B to the position A on the coil layout 394. The inductive sensor circuitry may process secondary voltages received at the receiver coils on the spiral coil layout 394, and obtain a signal representing the position of the metallic target 392.
[0167] As described herein with respect to FIGS. 12B-12E, the non-coplanar nature of the metallic target 392 with respect to the coil layout 394 may become more substantial as the pincer angle becomes greater, and may become less substantial or insignificant as the angle approaches zero. In some cases, the spiral coil layout may have a modified coil layout 394a shown in FIG. 13C. The modified coil layout 394a may have an elliptical shape. The modified coil layout 394a may adjust the change in output of the sensor due at least to the change in proximity to the metallic target 392 so that the output may become substantially the same as the standard linear layout 394b shown in FIG. 13D. In other cases, at least some portion on the right half of the spiral coil (for example, the right end portion of the coil) may be bent toward or away from the paddle 329 in order to additionally compensate the non-coplanar nature of the movement of the metallic target 392 with respect to the PCB coil layout 394. In some cases, the paddle 329 and/or the metallic target 392 may be curved similarly as described above with respect to the “Linear Coil Inductive Detector”.
[0168] In some cases, an additional adjustment may be made by a further modification to the modified PCB coil layout 385a and/or by a processor in order to further compensate the non-coplanar nature of the movement of the metallic target 392. This additional adjustment by the processor may be made to the modified coil layout 394a or the standard coil layout 394b. In some cases, the spiral coil inductive sensor can be implemented with, for example, ICs available from Texas Instruments Inc. (TI).
[0169] In some cases, the metallic target 392 may also have a modified shape in order to at least partially compensate the non-coplanar nature of the movement of the metallic target 392 with respect to the PCB coil layout 394. For example, the metallic target 392 may have a generally elliptical shape (as opposed to a circular shape) similar to the spiral coil 394a. Furthermore, at least a portion of the metallic target 392 (for example, a right half) may be bent toward the coil 394 to compensate the non-coplanar nature of the movement of the metallic target 392 with respect to the PCB coil layout 394. The described modifications are merely examples, and other modifications to the PCB coil layout 394 and/or the metallic target 392 (including modification by a processor) may also be made so that the metallic target may follow a substantially coplanar moving path with respect to a sensor coil.
[0170] In some cases, the coil layout may instead be included on or inside the paddle 329 (not shown). In place of PCB traces to produce the coils used for inductive sensing, metal shapes on the inside walls of the paddle 329 or inside the paddle may be used. Laser direct structuring (LDS) may be utilized to produce the metal shapes, as LDS is appropriate for extremely small and space constrained applications. The LDS metal may directly replace the PCB coil, but all of the described restrictions may apply (coplanar vs proximity, minimum inductance, etc.).
3. Proximity Sensor
[0171] The pincer angle can also be detected by a proximity sensor. The proximity sensor can measure the distance between a sensor coil and a metallic target, as opposed to measuring a coplanar (or substantially coplanar) travel of the metallic target. For example, the proximity sensor can directly detect the position of the paddle 329 with respect to the surface of the handpiece body facing the paddle 329. This method is inherently resilient to outside electromagnetic interference and simplifies the mechanical design, by not requiring an external effector (magnet) and by detecting the paddle's movement directly. In some cases, the proximity sensor can be implemented with, for example, ICs available from Texas Instruments Inc. (TI).
[0173] In some cases, the proximity sensor may be disposed inside the handpiece body to face the paddle 329. In some cases, as shown in FIG. 14, the proximity sensor can be formed by layering coils across multiple PCB layers. This design may be advantageous because the sensor geometry is generally more flexible, which is useful where space is limited, as in a handpiece. Accordingly, the greater distance to the paddle makes the TI chip a better candidate. In some cases, the proximity sensor may be disposed inside the paddle 329 to face a side surface of the handpiece body.
[0174] In some cases, the proximity sensor can be implemented with, for example, ICs available from IDT. In such cases, the IDT ICs may be positioned inside the handpiece body, and a metallic target would be in or on the paddle 329. As the paddle 329 is compressed, the metallic target would move toward the IDT ICs, and the detected signal may generally become stronger as the paddle 329 approaches the handpiece body. This is a variation of the traditional use of the IDT ICs, which usually detect only a linear travel (not proximity). The ICs could detect proximity, as the detected signal would change (become stronger), but the signal would follow a non-linear or logarithmic path rather than a linear one. However, with the variables known, the position could be determined from the signal. The TI sensor may be a better proximity sensor than the IDT sensor, as the TI chip may be configured to increase the effective coil length (for example, by adding more PCB layers). Both the TI and IDT sensors may require some minimum inductance. The inductance is generally proportional to the amount of PCB coil that is laid out on the sensor.
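As a hypothetical sketch of the post-processing described above, if the proximity response is modeled as logarithmic in distance, the model can be inverted once its two parameters are known from a calibration. The model form and the constants here are assumptions for illustration, not values from any sensor datasheet:

```python
import math

def signal_to_distance(signal, a=10.0, b=2.0):
    """Invert an assumed logarithmic proximity response.

    Models the detected signal as  signal = a - b*ln(d)  (stronger as the
    target approaches, d in arbitrary units); a and b would come from a
    two-point calibration against known paddle positions.
    """
    return math.exp((a - signal) / b)
```

With the model inverted, a monotonic but non-linear raw signal yields a usable distance (and hence pincer angle) estimate.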
[0175] In some cases, the coils used for inductive sensing can be implemented as PCB traces. In some cases, as shown in FIG. 14, the coils can be implemented as metal shapes 380 on the inside walls of the handpiece itself. This technique is referred to as laser direct structuring (LDS), and may be utilized for extremely small and space constrained applications such as an RF antenna. The choice of a TI or IDT sensor may depend on whether the tail or the paddle is used as a target. The TI product may need less internal processing; its more ‘raw’ values would make this implementation easier to refine. The LDS metal may directly replace the PCB coil, but all of the described restrictions may apply (coplanar vs proximity, minimum inductance, etc.).
4. Compression Spring
[0176] The pincer angle may be obtained by directly detecting the compression of the spring 348 that provides resistance to the paddle 329 (see, for example, FIG. 11B). In this case, an inductor may be viewed as a charged coil of wire, and a spring as a stiff coil of wire. If the spring were charged with an alternating current, it would behave like an inductor. Compression of the spring would lead to a linear change in inductance. Post processing may be used to linearize the output of an inductance sensor. In some cases, the spring can have an explicit inductor added to meet the minimum inductance. The spring method can be advantageous, as it is inherently linear. Referring to FIG. 33, the inductance of a coil (or spring) 345 is given by the equation below. When the spring 345 is compressed, its length changes (a smaller l), which has a direct effect on the inductance, as described by the equation below.
[00001] L = N²·μ·A / l, where μ = μᵣ·μ₀
[0177] where: [0178] L = inductance of the coil in henrys [0179] N = number of turns in the wire coil (straight wire = 1) [0180] μ = permeability of the core material (absolute, not relative) [0181] μᵣ = relative permeability, dimensionless (μᵣ = 1 for air) [0182] μ₀ = 1.26×10⁻⁶ T·m/At (permeability of free space) [0183] A = area of the coil in square meters = πr² [0184] l = average length of the coil in meters
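The inductance equation above can be checked numerically; the spring dimensions below are illustrative only, not from the disclosure:

```python
import math

MU_0 = 1.26e-6  # permeability of free space, T*m/At (value used in the text)

def coil_inductance(n_turns, radius_m, length_m, mu_r=1.0):
    """Inductance of a coil (or spring) per  L = N^2 * mu * A / l,  mu = mu_r * mu_0."""
    area = math.pi * radius_m ** 2
    return (n_turns ** 2) * (mu_r * MU_0) * area / length_m

# Illustrative air-core spring: 20 turns, 3 mm radius.
rest = coil_inductance(20, 0.003, 0.015)     # inductance at 15 mm free length
pressed = coil_inductance(20, 0.003, 0.012)  # length shrinks when compressed
# Inductance rises as the spring compresses, so the reading tracks paddle travel.
```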
Presence Detection
[0185] As described herein, a handpiece controls the movement of a surgical instrument. Thus, for safety purposes, it is desirable to activate the handpiece only when it is safe, for example, when it is grasped by or adjacent to an operator's hand. The presence detector can detect whether a user's hand is present on or within a certain distance of the handpiece. In some cases, such distance may be a few millimeters, a few centimeters, or a few inches. In some cases, the presence detector may detect an operator's hand contacting the handpiece.
[0186] In some cases, the presence detector can be a capacitive proximity sensor and can be disposed on a PCB on the center mount 342 (see, for example, FIG. 10). The presence detector can be implemented with two redundant sensors for additional safety purposes. The redundant sensors may charge the lower and upper housings 324 and 322 (formed of metal) and use them as their antenna or sense-element. In some cases, the presence detector can be a metallic coating or a metal shell underneath the hard plastic shell of the handpiece to effectively create a large capacitive proximity sensor. The presence detector can be implemented with, for example, ICs available from Microchip Technology Inc.
[0187] In some cases, instead of using a metal shell or coating, a wire, as shown in FIG. 14, may be formed throughout the length of the inside of the handpiece to effectively create an antenna. In such cases, coverage may be less uniform than with a primary path, but there may be potential manufacturing advantages. The presence detector may also detect a gloved hand and a double-gloved hand. The presence detector can be calibrated to detect proximity even when a user is lightly touching the handpiece or not touching it at all, for example, a few millimeters away. The presence detector may also be able to detect and differentiate between various materials, for example, whether a hand is within a desired proximity (directly coupled or within a tolerated distance), whether a gloved hand is within the desired proximity, or whether a different unwanted object is within the desired proximity. This may be important to avoid unintended contact of the handpiece and to only allow presence to be detected when a hand (or gloved hand) of the operator is within the desired proximity. The advantage of this mode is that the handpiece does not clutch out or disengage from controlling the surgical system when a user moves the fingers or hand on the handgrip.
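A minimal sketch of the redundant-sensor gating logic described above, with hypothetical normalized sensor readings and threshold (the redundancy requirement comes from the text; the numbers are assumptions):

```python
def presence_ok(sensor_a, sensor_b, threshold=0.6):
    """Gate handpiece activation on two redundant capacitive presence sensors.

    Both redundant channels must independently report a hand within range
    before the handpiece is allowed to drive the instrument; any disagreement
    is treated as "not present" for safety.  Readings are assumed normalized
    to [0, 1]; the threshold is illustrative.
    """
    return sensor_a >= threshold and sensor_b >= threshold
```

Requiring agreement means a single faulty or noisy channel fails safe rather than enabling instrument motion.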
Input Control Interfaces
[0188] User or operator inputs may be provided to the robotic surgery system 100 in a number of different ways. For example, movement of the handpieces 122 and 124 can be used to provide a user input for controlling a tool such as a surgical instrument or a camera. As another example, the foot pedal 126 disposed at a lower portion of the workstation 102 may provide a user input used to perform a certain function such as instrument clutching.
[0189] Another user input (hereinafter to be interchangeably used with “additional user input” or “second user input”) may be provided via the input control interface 326a disposed on an upper surface of the handpiece body. The input control interface 326a (see, for example, FIG. 3B) may be configured to control a number of functions for the robotic surgery system 100. The input control interface 326a may receive an input used to control a surgical instrument. The input control interface 326a may also receive another input used to zoom a camera in or out. The input control interface 326a may further receive another input used to turn an illuminator on and off. The input control interface 326a may also be used to provide a user input that can otherwise be provided by other input mechanisms such as the foot pedal 126. In this case, since the input control interface 326a is positioned in the handpiece grasped by an operator during operation, a user input may be provided more conveniently and/or more accurately than via the foot pedal 126. The input control interface may be implemented by a mechanical switch, a button, a lever, a wheel, a trackpad or a capacitive touch surface. For the purpose of convenience, shared input control, gesture control and handpiece feedback control below will be described using a trackpad or a capacitive touch surface.
1. Trackpad
[0190] In some cases, the second PCB 326 (see, for example, FIG. 10) can include a trackpad 326a as an input control interface. The trackpad 326a may be disposed on the upper surface of the handpiece 122. The trackpad 326a can be used to receive an additional user input such as camera control or instrument clutch where instrument control is disengaged. The trackpad 326a can also be used for direct gesture recognition with a variety of gestures (hereinafter to be interchangeably used with “tool functions”). For example, the trackpad 326a can detect a swipe of an operator's finger thereon in either direction, swipe and hold in either direction, tap, tap and hold, multiple taps, or multiple taps and hold. In some cases, the trackpad 326a may be sized to receive an input by a fingertip of an average operator's finger (for example, index finger or thumb).
[0191] The trackpad 326a (including a trackpad driver) can be implemented with, for example, ICs available from Azoteq of South Africa. The Azoteq ICs may be configured to provide data over Inter-Integrated Circuit (I2C), which can allow the workstation to interpret the gestures. In some cases, as shown in FIG. 15, the trackpad 326a can use PCB traces placed in a grid pattern 326b as sensing elements.
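Gesture events delivered over I2C are typically read as small event codes that the workstation maps to named gestures. The event codes below are invented for this sketch; a real trackpad controller defines its own register map:

```python
# Hypothetical gesture decoder for event bytes read over I2C.
# The event codes are invented for illustration; consult the actual
# controller's register map for real values.

GESTURE_CODES = {
    0x01: "tap",
    0x02: "tap_and_hold",
    0x03: "swipe_forward",
    0x04: "swipe_backward",
    0x05: "swipe_and_hold",
}

def decode_gesture(event_byte: int) -> str:
    """Map a raw event byte to a gesture name; unknown codes decode
    to 'none' so spurious reads are ignored."""
    return GESTURE_CODES.get(event_byte, "none")
```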
2. Capacitive Touch Surface
[0192] In some cases, the second PCB 326 can include a capacitive touch surface 326a instead of or in addition to the trackpad (for example, one trackpad and one capacitive touch surface in different locations). FIG. 17A shows an example capacitive touch surface. Referring to FIG. 17A, the capacitive touch surface 326a may be smooth and glossy. The capacitive touch surface 326a may be made by, for example, pad printing. The capacitive touch surface 326a can use a capacitive touch IC to directly create a capacitive slide element. In some cases, as shown in FIGS. 16A and 16B, three or four capacitive touch elements 386 and 388 (for example, as individual capacitive buttons) can be provided. Although FIGS. 16A and 16B show chevron (or ‘V’) and rectangular shapes, the capacitive touch elements 386 and 388 may have other shapes such as line, square, circle, oval or other polygonal shapes. Furthermore, the number of capacitive touch elements may be fewer than three or more than four depending on the requirements of the touch input surface 326a. In some cases, the capacitive touch surface 326a may be sized to receive an input by a fingertip of an average operator's finger (for example, index finger or thumb).
[0193] The capacitive touch surface (including a capacitive touch surface driver) can be implemented with, for example, ICs available from Microchip. In this Microchip device, multiple capacitive touch elements can be read by a series of digital logic implemented as a complex programmable logic device (CPLD) (conceptually similar to a field-programmable gate array (FPGA)) that is programmed only once to recognize the desired gestures. The Microchip device can be configured to provide data over I2C, which can allow the workstation to interpret the gestures.
[0194] In some cases, as shown in FIGS. 17B and 17C, the capacitive touch driver circuitry can be placed directly underneath the gesture area. FIG. 17C shows a cross-sectional view of the second PCB 326 that includes a capacitive touch surface 326a and its drive circuitry. The second PCB 326 may include a capacitive touch IC 359 and an adhesive 357 on which the capacitive touch surface 326a is attached. Other circuit components 361 may also be disposed below the capacitive touch IC 359.
[0195] In some cases, the second PCB 326 can include a capacitive button (not shown) that can toggle between two states, for example, between instrument and camera modes. The capacitive button can be used in single-click or double-click (or multiple-click) mode. For the input button/switch, a capacitive slider may be used for clutch control from a handpiece, although a single capacitive button could be acceptable with pressure sensitive input (flex). The capacitive slider may be controlled by a microprocessor. The use of a microprocessor may be beneficial as being inherently more tunable or customizable. The same microprocessor can be used to control presence detection. The microprocessor may also be able to drive the haptic engines.
3. Force Sensitive Resistor
[0196] The touch input interface 326a may be implemented by a force sensitive resistor. The force sensitive resistor may be incorporated underneath the touch area, making the interface more robust and user friendly. If a click gesture is required, a capacitive element in addition to the pressing element can be triggered. This can make a capacitive touch button feel like a real button.
Shared Input Control
[0197] In some cases, in order to minimize the number of input controls required to cause movement of various aspects of a robotic surgical system, certain input controls may be shared. This may reduce overall system clutter and mitigate issues such as inadvertent control and/or cognitive overload.
[0198] In some cases, the same trackpad 326a can be used to perform functions of two or more input controls. When an input control is used to control a first feature/function (for example, instrument clutch), a second feature/function (for example, camera control) may be disabled. In some cases, the second feature/function and first feature/function may be operated mutually exclusively and separately at all times. That is, the same input control interface can be used to control two or more different devices such as a camera and a clutch at different times.
[0199] FIG. 18 illustrates a flowchart for a shared input control process 500 according to some embodiments. Referring to FIG. 18, the shared input control process 500 for a handpiece will be described.
[0200] Although the process 500 is described herein with reference to a particular order, in various implementations, states herein may be performed in a different order, or omitted, and additional states may be added. The process 500 may be performed by a processor (not shown). This also applies to the processes 600-900 shown in FIGS. 20, 22, 23 and 24.
[0201] In state 410, it is determined whether an operator input has been received at a first state or mode. The operator input can be received through the input control interface 326a (see, for example, FIG. 3B). The first state or mode may be a first device operation state or mode, for example, a camera control operation mode or an instrument clutch mode.
[0202] The input control interface 326a may be a trackpad or a capacitive touch surface as described herein. The trackpad may recognize at least one of the following types of operator inputs: swipe from a first side of the trackpad to a second side of the trackpad different from the first side, tap, swipe and hold, tap and hold, multiple taps, or multiple taps and hold, or a combination thereof. The processor may perform different functions based on the swipe, the swipe and hold, the tap, the tap and hold, the multiple taps, and the multiple taps and hold. The capacitive touch surface may include at least one capacitive button that can sense a single-click or a double-click (or multiple-click), and the processor may perform different functions based on the single-click or multiple-click. The description of this paragraph applies to a camera control process 600 shown in FIG. 20 and an instrument clutch process 700 shown in FIG. 22.
[0203] If it is determined in state 410 that the operator input has not been received at the first state, the state 410 may repeat. If it is determined in state 410 that an operator input has been received at the first state or mode, the processor may control a first function at the first state while the second function is disabled at a second state (state 420). For example, the processor may control enabling and disabling a camera control function in a camera control mode while an instrument control by the handpieces 122/124 is disabled so that the surgical instrument(s) would not move even if the handpieces 122/124 are moved.
[0204] In state 430, it is determined whether the first state has been changed to the second state or another different state. The first state can be changed to the second state or another different state by actuating the input control interface 326a. For example, a camera control operation mode can be changed to an instrument clutch operation mode. If it is determined in state 430 that the first state has not been changed to the second state, the states 420 and 430 may repeat.
[0205] If it is determined in state 430 that the first state has been changed to the second state, the processor may control the second function at the second state while the first function is disabled. For example, the processor may control enabling and disabling an instrument clutch control function in the instrument clutch mode while the camera operation is disabled (state 440) so that the camera would not move even if the handpieces 122/124 are moved.
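The mutually exclusive behavior of states 410-440 can be sketched as a small state machine; the class and mode names are illustrative assumptions, not terms from this disclosure:

```python
class SharedInputControl:
    """Sketch of process 500: one shared input control drives either
    camera control or instrument clutch, never both at once."""

    def __init__(self):
        # first state/mode; the other function starts disabled
        self.mode = "camera_control"

    def active_function(self) -> str:
        # only the active function is enabled; the other is disabled
        return self.mode

    def switch_mode(self):
        # actuating the input control interface (state 430) toggles
        # which function is enabled and disables the other (state 440)
        self.mode = ("instrument_clutch"
                     if self.mode == "camera_control"
                     else "camera_control")
```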
1. Camera Control
[0206] FIGS. 19A and 19B illustrate conceptual diagrams showing a camera control operation according to some embodiments. FIG. 20 illustrates a flowchart for a camera control process 600 shown in FIGS. 19A and 19B according to some embodiments.
[0207] Referring to FIG. 20, it is determined whether an operator's finger has been swiped forward and held on the input control interface 326a of the handpiece 122 (state 450). In some cases, as shown in FIG. 19A, swiping of the operator's finger 390 from a point A to a point B on the input control interface 326a can be determined by: i) detecting a contact of the operator's finger 390 on the point A; ii) detecting that the finger 390 has remained in contact with the input control interface 326a and moved to the point B; and iii) detecting that the finger 390 remains at the point B. If it is determined in state 450 that the operator's finger has not been swiped and held, the state 450 may repeat.
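The three-step swipe-and-hold detection (i-iii) above might be implemented over a stream of touch samples as follows; the sample format and the travel threshold are assumptions made for illustration:

```python
def swipe_and_hold(samples, min_travel=20):
    """Detect the pattern described above from (position, touching)
    samples: contact at point A, continuous contact while moving
    forward to point B, and the finger still down at point B.
    The travel threshold is an illustrative assumption."""
    if not samples or not samples[0][1]:
        return False  # no initial contact at point A
    if not all(touching for _, touching in samples):
        return False  # contact was broken before reaching point B
    travel = samples[-1][0] - samples[0][0]
    return travel >= min_travel  # sufficient forward travel, still held
```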
[0208] If it is determined in state 450 that the operator's finger 390 has been swiped forward and held on the input control interface 326a, an association of the input devices 132/112 with the surgical instruments becomes disabled, the instruments become disabled, and at least one of the input devices 132/112 becomes associated with a camera (state 460). That is, the camera control operation is turned on as shown in FIG. 19A.
[0209] In state 470, the camera is controlled by repositioning and/or reorienting at least one of the input devices 132/112. Since the instruments have been disassociated from the input devices 132/112, repositioning and/or reorienting the input devices 132/112 would have no impact on the surgical instruments (disabled). In some cases, there may be only one camera at the surgery site, and only one of the input devices 132/112 may control the camera. In these cases, movement of the other input device may have no impact on the camera. In some cases, both of the input devices 132/112 may be used to move the camera by either locking the relative movement of the input devices 132/112 to each other or averaging the movement of the input devices.
[0210] In state 480, it is determined whether the operator's finger has been released from the point B of the input control interface 326a, as shown in FIG. 19B. If it is determined in state 480 that the operator's finger has not been released, the states 470 and 480 may repeat.
[0211] If it is determined in state 480 that the operator's finger 390 has been released from the point B, an association of the input devices 132/112 with the camera becomes disabled, the camera becomes disabled, and at least one of the input devices 132/112 becomes associated with the surgical instruments (state 490). That is, the camera control operation is turned off as shown in FIG. 19B. In some cases, control of the instruments may occur automatically upon the camera turning-off or upon another subsequent intervening event such as tapping the foot pedal 126.
[0212] The described operator inputs (swiping forward and held/release) and corresponding controls (turning on and off the camera control) are merely examples, and many other combinations of input types by the input control interface and corresponding controls are possible. For example, operator inputs such as tap, tap and hold, multiple taps, multiple taps and hold, swiping backward, swiping backward and hold, multiple swiping (forward or backward), multiple swiping (forward or backward) and hold, or combinations thereof can be used to turn on or turn off the camera control operation. Furthermore, operator inputs may be received via other input interfaces such as a mechanical switch or button, a lever, a self-centering wheel, or other non-trackpad or non-capacitive-touch surface, as long as the same input control interface can share input controls for multiple functions associated with one or more surgical devices. The description of this paragraph applies to the instrument clutch operation procedure below.
2. Instrument Clutch
[0213] FIGS. 21A and 21B illustrate conceptual diagrams showing an instrument clutch operation according to some embodiments. FIG. 22 illustrates a flowchart for an instrument clutch process 700 shown in FIGS. 21A and 21B according to some embodiments.
[0214] During operation of an input device, an operator frequently will reach the physical limits of repositioning the input device based on the mechanical limits of the device itself or the operator's arms. Thus, instrument clutching is advantageous when repositioning the input devices to enable a greater workspace. To allow the operator to “reset” or “re-center” their workspace, the operator would clutch to release association of the input device with a controlled slave instrument. Upon clutching, the input device may be repositioned while the instruments remain fixed. Upon unclutching, the association would be reestablished. Any errors introduced upon re-association such as orientation misalignment between the input device orientation and that of the instrument end-effector may be corrected by methods described in U.S. Patent Publication No. 2018/0271607 and U.S. Patent Publication No. 2018/0367777, which are assigned to the assignee of the present application and the disclosures of which are incorporated by reference in their entirety.
[0215] Referring to FIG. 22, it is determined whether an operator's finger 390 has been swiped backward on the input control interface 326a and released therefrom (state 510). In some cases, as shown in FIG. 21A, swiping of the operator's finger 390 from a point B to a point A on the input control interface 326a can be determined by: i) detecting a contact of the operator's finger 390 on the point B; ii) detecting that the finger 390 has remained in contact with the input control interface 326a and moved to the point A; and iii) detecting that the finger 390 has been released from the point A. If it is determined in state 510 that the operator's finger has not been swiped backward and released, the state 510 may repeat.
[0216] If it is determined in state 510 that the operator's finger 390 has been swiped backward and released from the point A on the input control interface 326a, an association of the input devices 132/112 with the surgical instruments becomes disabled and the instruments become disabled (state 520).
[0217] In state 530, the input devices 132/112 are repositioned without moving the instruments. Since the surgical instruments have been disassociated from the input devices 132/112 in state 520, the movement of the input devices 132/112 would have no impact on the instruments.
[0218] In state 540, it is determined whether an operator's finger 390 has been swiped backward again on the input control interface 326a of the handpiece 122 and released therefrom. This can be determined in the same way as described with respect to state 510. If it is determined in state 540 that the operator's finger has not been swiped backward and released, the states 530 and 540 may repeat.
[0219] If it is determined in state 540 that the operator's finger 390 has been swiped backward and released again from the point A on the input control interface 326a, an association of the input devices 132/112 with the surgical instruments is re-enabled (state 550). See also FIG. 21B. Since the surgical instruments have been associated with the input devices 132/112, the movement of the input devices 132/112 will move the instruments.
[0220] In some cases, control of the instrument may occur automatically upon de-clutching or upon another subsequent intervening event such as tapping the foot pedal 126.
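Because the same backward swipe-and-release both disengages (state 520) and re-engages (state 550) the association, the clutch process reduces to a toggle; the class and attribute names below are illustrative assumptions:

```python
class InstrumentClutch:
    """Sketch of process 700: each backward swipe-and-release toggles
    whether the input devices are associated with the instruments."""

    def __init__(self):
        self.instruments_associated = True  # normal teleoperation

    def on_backward_swipe_release(self):
        # first swipe disables the association (state 520) so the
        # input devices can be repositioned without moving the
        # instruments (state 530); the next one re-enables it (state 550)
        self.instruments_associated = not self.instruments_associated
```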
Gesture Controls (Tool Function Controls)
[0221] Gesture controls (hereinafter to be interchangeably used with “tool function controls”) using a trackpad 326a (see, for example, FIG. 10) or another secondary control interface can be used to cause the system to function in various ways including causing the system to perform pre-set routines and functions. For example, swiping of the finger from one side to the other on the trackpad 326a may cause the surgical instrument to become locked in the state it is presently in (for example, one or more jaws of the instrument fixed in that position). This may be useful for situations where the user desires the surgical instrument to continue to grasp whatever it is grasping while the user repositions the input device. In other examples, certain gestures may cause the system to perform a pre-set routine such as certain surgery routines. Such gestures and resultant “auto” features may help reduce user fatigue.
[0222] FIG. 23 illustrates a flowchart for the gesture control process 800 according to some embodiments. The gesture control process 800 may be performed by a processor (not shown). Referring to FIG. 23, the gesture control process 800 will be described.
[0223] In state 610, it is determined whether an operator input has been received. The operator input can be received via the input control interface 306a or 326a (see, for example, FIGS. 3A and 3B). If it is determined in state 610 that the operator input has not been received, the state 610 may repeat.
[0224] If it is determined in state 610 that an operator input has been received, it is determined whether the received operator input relates to one or more of a plurality of gestures or tool functions (state 620). The predetermined gestures or tool functions may include a predetermined surgery routine such as suturing (partial or complete), cutting, grasping, or moving the surgical tool in a predetermined direction. The predetermined direction may include a linear direction, a curved direction, a clockwise direction, a counterclockwise direction, a semi-circular direction, or a circular direction. In some cases, the predetermined direction may be based on the pattern of the swipe/gesture itself (for example, a curved gesture may result in movement in a curved direction). The tool functions may also include causing a lens of a camera to be washed, causing the camera to zoom in and out, causing the camera to pan, or causing the camera to tilt.
[0225] When the input control interface 326a is a trackpad, the trackpad may sense swiping of an operator's finger from a first point on the trackpad to a second point on the trackpad different from the first point, and the processor may control the surgical tool to remain adjacent to a current surgery position. For example, the processor may control the surgical tool to become locked in the current surgery position. The predetermined tool functions corresponding with the operator inputs may be stored in a memory being in data communication with the processor.
[0226] If it is determined in state 620 that the operator input relates to the predetermined tool functions, the processor may perform the predetermined tool functions (state 630). If it is determined in state 620 that the operator input does not relate to the predetermined tool functions, the processor may perform normal input control functions that are not associated with the predetermined tool functions (state 640).
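States 620-640 amount to a dispatch on the recognized gesture. The gesture names and tool functions below are invented placeholders, not terms from this disclosure:

```python
# Hypothetical gesture-to-tool-function table (state 620). The gesture
# names and function names are placeholders for illustration only.

TOOL_FUNCTIONS = {
    "swipe_side_to_side": "lock_instrument_jaws",
    "double_tap": "wash_camera_lens",
    "tap_and_hold": "camera_zoom_in",
}

def handle_operator_input(gesture: str) -> str:
    """Perform a pre-set tool function when one matches (state 630);
    otherwise fall through to normal input control (state 640)."""
    if gesture in TOOL_FUNCTIONS:
        return TOOL_FUNCTIONS[gesture]
    return "normal_input_control"
```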
Handpiece Feedback Control
[0227] When an operator operates the input devices, the handpiece may provide a feedback to the operator. In some cases, the feedback may be provided when the operator switches a function from a first mode to a second mode different from the first mode. The feedback may include a haptic feedback, a visual feedback, an audio feedback, a tactile feedback or a force feedback, or a combination thereof. This feedback function may be useful to an operator or user since they can be notified when a function is switched between different modes. This may enhance safety, as the operator can be assured by the feedback that their input has been properly received by the system, and thus that he or she is in the intended operation mode. The feedback device may be located in a portion of the handpiece that would contact the palm of the user to facilitate a better or more significant feel of the feedback. Types of actuation provided by the feedback device may be different for each function, for example, to allow the user to determine which function they have enabled based on feedback alone. Types of feedback may be configurable by the user. Users may want to configure the feedback based on personal preference, especially in view of signals/patterns they are familiar with in a non-surgery environment, for example, a car, phone or tablet. In some cases, a driver or controller of the feedback device may be located outside the handpiece, for example, somewhere in the workstation 102, whereas an actuator of the feedback device may be located in the handpiece, as long as an operator may be provided a feedback by the handpiece upon a function change. In some cases, multiple feedback devices may be included in the handpiece at different locations to make it easier for the user to better recognize the feedback and/or distinguish the various types of feedback.
For example, a first feedback device may be included in the portion of the handpiece that would contact the palm while a second feedback device may be included in a portion of the handpiece near a location contacted by the user's thumb or proximate the distal portion of the handpiece (as shown in FIG. 25A near 344).
[0228] FIG. 24 illustrates a flowchart for a handpiece feedback control process 900 according to some embodiments. The process 900 may be performed by a processor or controller. Referring to FIG. 24, the handpiece feedback control process 900 will be described.
[0229] In state 810, it is determined whether a function has been switched from a first mode to a second mode. During operation, an operator may switch a function between modes. In some cases, the function switching may be sensed by the input control interface 326a. The function may include controlling a camera that images a surgery site, instrument clutching to reposition the hand grip apparatus, pre-set surgery routines, or other operations to control the surgical tool. In some cases, when the function is controlling the camera, the first mode may be enabling the camera control and the second mode may be disabling the camera control. In some cases, when the function is instrument clutching, the first mode may be enabling instrument clutching and the second mode may be disabling the instrument clutching. The function switching may originate in the hand grip apparatus 122/124 and/or the foot pedal 126 of the robotic surgery system 100. If it is determined in state 810 that the function switching has not occurred, the state 810 may repeat.
[0230] If it is determined in state 810 that the function has been switched from the first mode to the second mode, the processor may provide a feedback to an operator (state 820). The feedback may include haptic, visual, audio, tactile, force or any other feedback, or a combination thereof, that can notify the operator about the mode change. After the feedback is provided, the handpiece may operate at the second (different) mode (state 830). For example, when an association of the camera with handpieces is re-enabled, the operator may continue to control the camera with the use of at least one of the handpieces.
[0231] In state 840, it is determined again whether the function has been switched from the second mode to another mode (such as the first mode or a third mode). If it is determined in state 840 that the function has not been switched from the second mode to another mode, the states 830 and 840 may repeat. If it is determined in state 840 that the function has been switched from the second mode to the other mode, the processor may provide a feedback to the operator (state 850). The processor may perform the states 840 and 850 in substantially the same way as the states 810 and 820.
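States 810-850 can be sketched as a controller that emits exactly one feedback event per mode change; the class name and event strings are illustrative assumptions:

```python
class FeedbackController:
    """Sketch of process 900: provide feedback exactly once each time
    the active function switches modes (states 810/820, 840/850)."""

    def __init__(self, mode: str):
        self.mode = mode
        self.feedback_events = []  # stand-in for haptic/visual/audio output

    def set_mode(self, new_mode: str):
        if new_mode != self.mode:  # state 810/840: switch detected?
            self.mode = new_mode
            # state 820/850: actuate the feedback device(s)
            self.feedback_events.append("switched_to_" + new_mode)
```

Repeating the current mode produces no feedback, matching the flowchart's repeat of states 810 and 840 until a switch occurs.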
1. Haptic Feedback
[0232] The handpiece feedback device can include a haptic feedback device. In some cases, the haptic feedback device may include a haptic actuator and a haptic driver. The haptic actuator may include a motor or actuator available from Texas Instruments. The haptic actuator may provide a haptic feedback in the form of vibration. The haptic driver (processor or controller) may drive the haptic actuator to provide a vibrational feedback when a robotic surgery function is switched from a first mode to a second mode.
[0233] The vibrational feedback may have a variety of different vibration patterns. For example, the vibration can have different strength levels, different durations, directions or intervals (if multiple vibrations involved). Furthermore, different types or patterns of vibration may be used for different mode switching and may be user configurable. Alternatively, the same vibration can be used for all mode switching.
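The per-mode, user-configurable vibration patterns described above might be stored as a lookup table; the mode names, strengths, and durations below are invented examples:

```python
# Illustrative per-mode vibration patterns (strength 0-1, duration in
# milliseconds). All values are invented and would be user-configurable.

VIBRATION_PATTERNS = {
    "camera_control_on":    {"strength": 0.4, "duration_ms": 60},
    "camera_control_off":   {"strength": 0.4, "duration_ms": 30},
    "instrument_clutch_on": {"strength": 0.8, "duration_ms": 120},
}

DEFAULT_PATTERN = {"strength": 0.5, "duration_ms": 50}

def pattern_for(mode_switch: str) -> dict:
    """Look up the vibration pattern for a mode switch, falling back
    to a single default pattern for unconfigured switches (the
    'same vibration for all mode switching' alternative)."""
    return VIBRATION_PATTERNS.get(mode_switch, DEFAULT_PATTERN)
```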
[0234] In some cases, the haptic driver may be implemented with, for example, ICs available from Texas Instruments. The TI ICs may be I2C controlled and can be triggered by the workstation.
[0235] In some cases, as shown in FIG. 25A, the haptic feedback device 344 can be mounted directly in front of the gesture area, for example, on the center mount 342 of the handpiece. In some cases, as shown in FIG. 25B, the haptic feedback device 344 can be mounted in the lower handle/grip element (for example, inside the lower region of the upper and lower housings 324 and 322). However, the haptic feedback device 344 can be mounted in any other region in the handpiece, as long as it can provide a haptic feedback, when the user switches from one mode to another. In some cases, the haptic actuator may be provided inside the handpiece, and the haptic driver may be positioned outside the handpiece, for example, somewhere in the workstation 102 where the haptic driver can remotely control the haptic actuator. In some cases, multiple haptic actuators may be provided inside the handpiece, for example as shown in both FIGS. 25A and 25B at 344.
2. Visual Feedback
[0236] The handpiece feedback device can include a visual feedback device. The visual feedback device may include a light source and a controller (not shown). The controller may sense whether a function is switched between different modes and control the light source to emit light based on the sensed function switching. The light source may be any light emitter or generator such as an LED. The light source may be disposed around the input control interface 326a. However, the light source can be disposed in any other location in the handpiece as long as light emitted by the light source can be recognized by an operator. The light source can emit light in a single color or multiple colors. The light source can emit light having a particular shape. The different shapes and/or colors of light may be emitted according to different modes of the function to be switched and may be user configurable.
3. Audio Feedback
[0237] The handpiece feedback device can include an audio feedback device. The audio feedback device may include a speaker and a controller (not shown). The controller may sense whether a function is switched between different modes and control the speaker to make sound based on the sensed function switching. The speaker may be disposed around the input control interface 326a. However, the speaker can be disposed in any other location inside or outside the handpiece as long as sound can be heard by an operator. The sound can have a variety of patterns, in terms of types of sound, volume levels, sound duration or interval (if multiple types of sound involved). The different types of sound may be output according to different modes of the function to be switched and may be user configurable.
4. Tactile Feedback
[0238] The handpiece feedback device can include a tactile feedback device configured to provide a tactile feedback in response to the function switching. The tactile feedback may include a variety of types of feeling that an operator may have on a portion of the handpiece. The portion of the handpiece may be the input control interface 326a or any other location in the handpiece where an operator can recognize a tactile feedback. The tactile feedback may include one or more of: a bump, a peak, a groove, a lip, or a texture difference (for example, coarse finish to smooth finish in the input control interface 326a). The different types of tactile feedback may be provided according to different modes of the function to be switched and may be user configurable.
5. Force Feedback
[0239] The handpiece feedback device can include a force feedback device configured to provide a force feedback in response to the function switching. The force feedback may include a variety of types of force that an operator may sense on a portion of the handpiece. The portion of the handpiece may be the input control interface 326a or any other location in the handpiece where an operator can recognize a force feedback. The force feedback may include a self-centering wheel.
Handpiece System Block Diagram
[0240] FIG. 26 illustrates a block diagram of a handpiece 1000 according to some embodiments. Referring to FIG. 26, the handpiece 1000 includes a proximity detector 910, a slider board 920, a haptic feedback device 930, a pincher encoder 940 including a magnetic angular encoder 940a and/or an inductive sensor 940b, and a communication board 950. Although a number of separate components are illustrated in FIG. 26, those of skill in the art will recognize that one or more of the components may be combined or commonly implemented. For example, the proximity detector 910, the magnetic angular encoder 940a and/or the inductive sensor 940b may be implemented in a single PCB, for example, the first PCB 350 or the second PCB 326 described with respect to FIG. 10. Further, at least one of the components illustrated in FIG. 26 may be implemented using a plurality of separate elements or may be omitted. As another example, all of the components 910-950 may be implemented in a single PCB. Another processor or controller, disposed either inside or outside the handpiece 1000, may also be used to control one or more of the components 910-950.
[0241] In some cases, all of the components 910-950 may be disposed inside the handpiece 1000. In some cases, at least one of the components 910-950 may be disposed outside the handpiece 1000, for example, somewhere in the robotic surgery system 100 such as in the workstation 102 (see, for example, FIG. 1). Although FIG. 26 shows that all of the components 910-950 communicate data with each other via a wired network, at least one of the components 910-950 may wirelessly communicate data with one or more of the remaining components.
[0242] The proximity detector or presence detector 910 may detect whether a user's hand is present within a certain distance of the handpiece 1000. The proximity detector 910 may provide a detected result to the communication board 950 so that a corresponding control (for example, activating or deactivating the handpiece) may be subsequently performed based on the detected result. The presence detector 910 may provide a general purpose output, such as simple digital highs and lows, to the communication board 950. The proximity detector 910 can be implemented with, for example, a capacitive proximity detector available from Microchip as described herein.
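The activate/deactivate control driven by the detector's digital output can be sketched as follows. This is a minimal sketch under stated assumptions: the debounce count and the sampling interface are illustrative, not part of the disclosure; a real implementation would read the detector's output pin rather than a Python list.

```python
# Illustrative sketch (not from the disclosure): debouncing the proximity
# detector's digital high/low output before activating or deactivating
# the handpiece, so brief sensor glitches do not toggle the state.

DEBOUNCE_SAMPLES = 5  # assumed number of consecutive agreeing samples

def debounce_presence(samples, debounce=DEBOUNCE_SAMPLES):
    """Return the handpiece active state after each raw sample.

    `samples` is an iterable of booleans from the proximity detector
    (True = hand present). The state changes only after `debounce`
    consecutive samples that disagree with the current state.
    """
    active = False
    streak = 0
    states = []
    for present in samples:
        if present != active:
            streak += 1
            if streak >= debounce:
                active = present  # enough agreeing samples: commit
                streak = 0
        else:
            streak = 0  # agreement with current state resets the streak
        states.append(active)
    return states

# A brief 2-sample dropout does not deactivate an active handpiece:
raw = [True] * 5 + [False] * 2 + [True] * 5
print(debounce_presence(raw)[-1])  # True
```

In practice this logic could run on the communication board 950 or on a system-side controller consuming the detector's general purpose output.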
[0243] The slider board 920 may be used to drive the input control interface 326a such as a trackpad or capacitive touch surface described herein. The slider board 920 may provide a sensed result to the communication board 950 so that a corresponding control (for example, gesture control, shared input control, additional input control) may be subsequently performed based on the sensed result. The slider board 920 may be implemented with, for example, the ICs available from Microchip as described herein.
[0244] The haptic feedback device 930 may provide a haptic feedback to an operator, when the operator switches a function from a first mode to a second mode different from the first mode as described herein. Although not shown in FIG. 26, at least one of other feedback devices (visual feedback device, audio feedback device, tactile feedback device or force feedback device) may also be included in the handpiece 1000. The haptic feedback device 930 may provide a haptic feedback to an operator via the communication board 950. The haptic feedback device 930 can be implemented with, for example, the ICs available from Texas Instruments as described herein.
[0245] The pincher encoder (or pincher angle detector) 940 may magnetically or inductively detect a pincer angle and provide the detected result to the communication board 950 so that a corresponding control (for example, control of jaw movement of a surgical instrument) may be subsequently performed based on the detected result. The pincher encoder 940 may include the magnetic angular detector 940a and/or the inductive sensor 940b.
[0246] The magnetic angular detector 940a may detect an angular movement of a magnetic target attached to or integrally formed with the wiper 370 shown in FIGS. 10-11B. The magnetic angular detector 940a may be implemented with, for example, the MPS ICs or ADI ICs as discussed herein.
[0247] The inductive sensor 940b may detect a pincer angle by inductively sensing a movement of a metallic target formed in the wiper 370 or the paddle 329. The inductive sensor 940b may be implemented with, for example, the TI ICs or IDT ICs as discussed herein.
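The angle-to-jaw control path described in paragraphs [0245]-[0247] can be sketched as follows. The count resolution, angle limits, and normalization below are illustrative assumptions, not values from the disclosure.

```python
# Illustrative sketch (values assumed, not from the disclosure): convert
# a raw encoder count from the pincher angle detector into a pincer
# angle, then into a normalized jaw command for the surgical instrument.

COUNTS_PER_REV = 4096    # assumed 12-bit angular encoder resolution
ANGLE_CLOSED_DEG = 0.0   # assumed pincer angle with the paddle closed
ANGLE_OPEN_DEG = 30.0    # assumed pincer angle with the paddle fully open

def counts_to_angle_deg(counts):
    """Raw encoder counts -> mechanical angle in degrees."""
    return (counts % COUNTS_PER_REV) * 360.0 / COUNTS_PER_REV

def angle_to_jaw_command(angle_deg):
    """Pincer angle -> jaw command in [0.0, 1.0] (0 = closed, 1 = open)."""
    span = ANGLE_OPEN_DEG - ANGLE_CLOSED_DEG
    cmd = (angle_deg - ANGLE_CLOSED_DEG) / span
    return min(max(cmd, 0.0), 1.0)  # clamp readings outside the travel

print(angle_to_jaw_command(counts_to_angle_deg(0)))     # 0.0 (closed)
print(angle_to_jaw_command(counts_to_angle_deg(2048)))  # 1.0 (clamped)
```

The same mapping applies whether the raw count comes from the magnetic angular detector 940a or the inductive sensor 940b; only the front-end sensing differs.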
[0248] The communication board 950 may be used to communicate data with the components 910-940, or devices external to the handpiece 1000. The communication board 950 may be implemented with, for example, ICs available from NXP Semiconductors. The NXP ICs may convert all data to a differential I2C format. The slider board 920 and the pincher encoder 940 may use a shared I2C bus for communication with the communication board 950. However, the present disclosure is not limited to the I2C protocol, and other communication protocols such as serial peripheral interface (SPI) or System Management Bus (SMBus) could also be used. Furthermore, simple quadrature encoding could be used for the pincher encoder 940.
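The shared-bus arrangement described above, in which the slider board 920 and the pincher encoder 940 sit on one I2C bus to the communication board 950, can be sketched as follows. The device addresses and register offsets are hypothetical, and a stand-in bus object replaces a real I2C master for illustration.

```python
# Illustrative sketch (not from the disclosure): one polling cycle over a
# shared I2C bus carrying the slider board and the pincher encoder.
# Addresses and register maps are assumed for illustration only.

SLIDER_ADDR = 0x28   # hypothetical 7-bit address of the slider board
ENCODER_ADDR = 0x36  # hypothetical 7-bit address of the pincher encoder

class FakeI2CBus:
    """Stand-in for a real I2C master; returns canned register values."""
    def __init__(self, registers):
        self.registers = registers  # {(addr, reg): value}

    def read_word(self, addr, reg):
        return self.registers[(addr, reg)]

def poll_handpiece(bus):
    """One polling cycle on the shared bus: read the slider position and
    the raw pincher angle, then return both for downstream control."""
    slider = bus.read_word(SLIDER_ADDR, 0x00)  # assumed position register
    angle = bus.read_word(ENCODER_ADDR, 0x0C)  # assumed angle register
    return {"slider": slider, "angle_counts": angle}

bus = FakeI2CBus({(SLIDER_ADDR, 0x00): 512, (ENCODER_ADDR, 0x0C): 1024})
print(poll_handpiece(bus))  # {'slider': 512, 'angle_counts': 1024}
```

Because both devices share the bus, each poll is simply a pair of addressed reads; swapping to SPI or SMBus, as the paragraph notes, would change only the transport beneath `read_word`.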
Handpiece Ergonomic Features
[0249] As discussed herein, during operation, an operator grasps a handpiece with his/her hand and moves the handpiece such that the instrument mimics the movement of the handpiece. For example, the operator may push the handpiece toward or pull it away from the input device; move it upward, downward, leftward, rightward, or diagonally; or rotate the handpiece about a longitudinal axis thereof. Furthermore, the operator opens and closes the paddle 329 to control an opening and closing movement of the instrument. Moreover, during operation, an operator generally spends a substantial amount of time (for example, half an hour to a few hours) operating the handpiece. Thus, it is desirable that the handpiece be designed or structured to be user friendly, safe, and ergonomic, to reduce user fatigue, and/or to improve operation efficiency. In some cases, the handpiece may have multiple ergonomic features. For example, several components of the handpiece (for example, palm grip, neck portion, paddle, slanted top, ridge, pivot joint, cutout, etc.) may be ergonomically shaped and/or sized.
1. Palm Grip
[0250] FIG. 27 illustrates a perspective view of a right side handpiece 122 showing palm grip ergonomic features according to some embodiments. FIG. 28 illustrates a perspective view of the right side handpiece 122 of FIG. 27 grasped by a user's right hand according to some embodiments. Referring to FIG. 27, the handpiece 122 includes a palm grip 303. The palm grip (hereinafter used interchangeably with a "handle") 303 is a region of the handpiece 122 that is grasped and/or supported by the operator's palm 750 (see, for example, FIG. 28). In some cases, the palm grip 303 may be sufficiently long to permit a substantial portion of an average operator's palm to rest on it. This design may be advantageous over a linear handle or a curved but relatively shorter handle, in that an operator may be able to more securely and comfortably grasp the handpiece 122 using the longer and ergonomically shaped palm grip 303. Furthermore, due to its longer/larger dimension and ergonomic design, the palm grip 303 may have a larger contact surface area on which an operator's palm can rest.
[0251] The palm grip 303 may include an upper grip (or upper portion) 303a, a middle grip (or middle portion) 303b and a lower grip (or lower portion) 303c. The upper grip 303a may extend from the neck portion 317 toward the proximal end. The middle grip 303b may downwardly extend at a first angle from the upper grip 303a. The lower grip 303c may downwardly extend at a second angle from the middle grip 303b. The first and second angles may be the same as or different from each other depending on the embodiment. The upper and lower grips 303a and 303c may have a narrower diameter or width than that of the middle grip 303b so that the palm grip 303 as a whole has a substantially ‘egg’ shape. The width or diameter of the upper grip 303a may be larger than that of the neck portion 317. The upper grip 303a may be shaped and sized to permit at least a portion of an average operator's finger (for example, thumb or index finger) to be comfortably rested or supported. The middle grip 303b may have an external surface that is curved to correspond to a curvature of at least part of an average operator's palm.
[0252] In some cases, as shown in FIG. 28, the palm grip 303 may form an obtuse angle 760 with the neck portion 317. The obtuse angle 760 may correspond to the anatomy of an average operator's hand. The obtuse angle 760 may be substantially similar to an angle of a natural curvature of an average operator's hand when gripping the handpiece 122 as shown in FIG. 28. In these cases, an operator's thumb 730 and index finger 740 may be positioned on a region of the handpiece 122 (for example, the index finger 740 on or near the paddle 329 and the thumb 730 on opposite side of the index finger 740) substantially parallel to a longitudinal axis of the handpiece 122. Furthermore, the palm 750 may be positioned on the palm grip 303 such that the angle between the thumb 730/index finger 740 and the palm 750 may be substantially similar to the obtuse angle defined between the neck portion 317 and the palm grip 303. The slanted angle of the palm grip 303 may provide a maximum contact with the palm 750 while providing comfort to the operator. The operator's middle finger 736 may be positioned on a downward extension 764 of the paddle 329 (see, for example, FIG. 29), and the other two fingers 732 and 734 may be rested on a portion of the palm grip 303.
[0253] Thus, the ergonomically shaped palm grip design may provide comfort and convenience while reducing user fatigue and improving operation efficiency.
2. Neck Portion
[0254] In some cases, as shown in FIG. 27, the handpiece 122 may also include a neck portion 317 positioned between the upper grip 303a and the pivot joint 372 (see also FIG. 11A). The neck portion 317 may have a reduced cross sectional extent with respect to the upper grip 303a. The neck portion 317 may permit an operator's fingers (for example, thumb 730 or index finger 740) to be comfortably rested thereon.
[0255] Since the neck portion 317 is positioned between the upper grip 303a and the pivot joint 372, the neck portion 317 may not horizontally overlap the paddle 329. This may be advantageous, as it can permit an operator's finger to be rested thereon without the finger touching the paddle 329. The palm grip 303 and the neck portion 317 together may allow an operator to comfortably grasp the handpiece 122 while resting one or more of his/her fingers without interfering with the paddle operation.
[0256] In some cases, as shown in FIG. 27, the neck portion 317 may include a protruding side surface 317a that outwardly protrudes from a side thereof. Although FIG. 27 shows only one protruding side surface 317a, the neck portion 317 may include another protruding surface on the opposite side. The protruding surface 317a alone or in combination with a convexed tail end 331 (to be described with respect to FIG. 29 below) of the paddle 329 may enable operators to more easily roll (rotate) the handpiece 122 about a longitudinal axis of the handpiece 122, by turning the protruding side surface 317a and/or the convexed tail end 331 with their fingertip(s), without the need of rotating their wrists. In another case, during this handgrip rotating procedure, the operators may merely loosely grasp the palm grip 303.
[0257] In some cases, the protruding surface 317a of the neck portion 317 may be adjacent to or contact the convexed tail end 331. The protruding surface 317a may have a curvature substantially the same as or similar to that of the curvature of the convexed tail end 331. The combination of the protruding surface 317a and the convexed tail end 331 having the same or similar curvature may allow operators to more easily roll (rotate) the handpiece 122 by turning the combined elements 317a and 331, as the operator would have a larger convexed area to turn.
[0258] Thus, the ergonomic features of the neck portion may provide comfort and convenience while reducing user fatigue and improving operation efficiency.
3. Paddle
[0259] FIG. 29 illustrates a perspective view of a right side handpiece 122 showing paddle ergonomic features according to some embodiments. FIG. 30 illustrates a close-up perspective view of a paddle 329 of the right side handpiece 122 of FIG. 29 according to some embodiments (with irrelevant elements removed). FIG. 31 illustrates a close-up left side view of the paddle 329 of FIG. 30 according to some embodiments.
[0260] Referring to FIG. 29, the paddle 329 includes a tail end 331 and a paddle end 333. The paddle 329 may also include an upper portion 770 and a downward extension 764. The upper portion 770 includes a tail region 762 and a paddle region 766. The tail region 762 may include the tail end 331 and a non-tail end region adjacent to the tail end 331. The downward extension 764 downwardly extends from the paddle region 766. In some cases, as shown in FIG. 29, left and right ends of the downward extension 764 may be sized such that the upper side of the downward extension 764 has a width slightly greater than that of the lower side thereof. The tail region 762 may be sized to receive distal phalanges of an average operator's finger (for example, index finger) when grasped by the hand of the operator. The downward extension 764 may have a height greater than that of the tail region 762 where the height is measured in a direction substantially perpendicular to a longitudinal axis of the handpiece body. The downward extension 764 may be sized to accommodate at least two fingers of the operator such as index and middle fingers.
[0261] In some cases, an outer surface of the tail region 762 may be at least partially outwardly curved. For example, the outer surface may be at least partially convexed, crowned, arced or semi-circular. For example, an outer surface of the tail end 331 may be at least partially convexed. The convexed tail region 762 alone or in combination with the protruding side surface 317a of the neck portion 317 may enable operators to more easily roll (rotate) the handpiece 122 about a longitudinal axis of the handpiece 122, by turning at least one of the two protruding elements 762 and 317a with their fingertip(s), without the need of rotating their wrists, or by merely loosely grasping the palm grip 303.
[0262] In some cases, as shown in FIG. 31, the tail end 331 may be fully convexed. For example, an outer surface of the tail end 331 may have a substantially convexed lens shape. In some cases, as shown in FIG. 31, the tail end 331 may be partially convexed. In these cases, the tail end 331 may include an upper convexed portion 331a and a lower substantially flat portion 331b. In some cases, the lower portion 331b may be convexed and the upper portion 331a may be substantially flat. The remaining portion of the tail region 762 may be substantially flat, convexed or less convexed than the tail end 331.
[0263] In some cases, instead of or in addition to the outwardly curved surface, the tail end 331 may include one or more individual protrusions spaced apart (not shown). In some cases, the entire tail region 762 may be convexed. The paddle region 766 may be substantially flat or less convexed (in terms of curvature) than the tail region 762. In some cases, the tail region 762 may have another shape or structure as long as it permits operators to more easily rotate the handpiece 122 with their fingertip(s).
[0264] The downward extension 764 may be at least partially concaved or inwardly curved (opposite curve of the curvature described herein for the tail end 331). The concaved portion 764 may allow an operator to grab the paddle 329 or rest his or her fingers thereon. In some cases, as shown in FIGS. 29 and 30, the entirety of the downward extension 764 may be concaved. In some cases, the downward extension 764 may be partially concaved.
[0265] Thus, the ergonomic features (convexed tail region and concaved downward extension) of the paddle 329 may provide comfort and convenience while reducing user fatigue and improving operation efficiency.
4. Slanted Top
[0266] FIG. 32 illustrates a rear view of a right side handpiece 122 showing additional handpiece ergonomic features according to some embodiments. In some cases, as shown in FIG. 32, the handpiece 122 may include a slanted top surface 780 that is slanted toward a side surface of the body where an operator's index finger is configured to be positioned when the handpiece is grasped by the hand of the operator. For example, for a right side handpiece having a paddle disposed on the right side of the body (see, for example, FIG. 32), the slanted top surface 780 may be slanted toward the paddle 329 or the cutout 336. As another example, for a right side handpiece having a paddle disposed on the left side of the body (not shown), the slanted top surface may be slanted toward a side of the body on the opposite side of the paddle. As another example, for a left side handpiece having a paddle disposed on the left side of the body (see, for example, FIG. 3A), the slanted top surface may be slanted toward the paddle or the cutout. As another example, for a left side handpiece having a paddle disposed on the right side of the body (see, for example, FIG. 6A), the slanted top surface may be slanted toward a side of the body on the opposite side of the paddle. As another example, for a handpiece having paddles disposed on both the left and right sides of the body (see, for example, FIG. 9), the top surface may be relatively flat or have sides (or edge portions) that curve toward each of the left and right sides of the body. The curvature of the top surface may be a convex curvature.
[0267] The slanted top surface 780 may be advantageous in that an operator can easily move his or her index finger between the input control interface 326a and the paddle 329, as the input control interface 326a is closer to the paddle 329, compared to a non-slanted handpiece top. Thus, the slanted top 780 may provide convenience while improving operation efficiency.
5. Ridge Around Input Control Interface
[0268] In some cases, as shown in FIG. 27, the handpiece 122 may include a ridge 326b around the input control interface 326a. The ridge 326b surrounds the input control interface 326a and is upwardly protruded or raised from an edge of the input control interface 326a. The ridge 326b may help an operator recognize the location of the input touch interface 326a during operation without necessarily having to look at the handpiece 122. Although FIG. 27 shows that the ridge 326b fully surrounds the input control interface 326a, the ridge 326b may be formed to only partially surround the input control interface 326a. In some cases, the ridge 326b may include a plurality of individual protrusions spaced apart along the edge of the input control interface 326a. In some cases, the ridge 326b may have other shapes that can provide a tactile feedback to an operator regarding the position of the input touch interface 326a. Thus, the ridge 326b may also provide convenience while improving operation efficiency and accuracy.
6. Sloped Region
[0269] In some cases, as shown in FIG. 27, the handpiece 122 may include a sloped region 319 between the neck portion 317 and the ridge 326b of the input control interface 326a. Since there is a height difference between the neck portion 317 and the ridge 326b, the two elements 317 and 326a are connected via the sloped region 319. The sloped region 319 may be downwardly curved or linear. In some cases, the sloped region 319 alone may be used for an operator to rest his or her finger thereon. In some cases, the sloped portion 319 and the neck portion 317 together may be used to accommodate an operator's finger (for example, thumb or index finger) for resting. Thus, the sloped region 319 may provide more comfort, thereby reducing user fatigue.
7. Pivot Joint and Paddle Movement Mechanism
[0270] Referring to FIG. 11A and FIG. 27, the pivot joint 372 is disposed inside the handpiece body (see also FIG. 11A). Since there is no pivot joint disposed outside the handpiece body, an operator may be prevented from being finger-pinched by a pivot joint and/or a portion of the outside of the handpiece body. In addition to enhancing user safety, the handpiece 122 may appear aesthetically better and neater by not placing the pivot joint outside or near the side of the handpiece body. Furthermore, in combination with the wiper 370 (see, for example, FIG. 11A), the pivot joint 372 may more securely fix the paddle 329 to the handpiece body.
[0271] Referring back to FIG. 11A, the paddle 329 and the wiper 370 are connected to and arranged in a substantially spaced apart and parallel relationship with each other (see arrows 371 in FIG. 11A). Having the paddle 329 and the wiper 370 on opposite sides of the pivot joint 372 but spaced apart allows for compactness of the handpieces. Otherwise, if they were inline, the wiper 370 would need to extend farther towards the proximal end in order to have enough spacing to detect a change in angle (using the angular detection sensor or curved coil layout if using an inductive detector described herein). This may also allow the paddle 329 to extend out and away from the body to provide a comfortable grip for the user.
[0272] In some cases, as shown in FIG. 11A, the paddle 329 may have a central longitudinal axis 371 that does not intersect the pivot joint 372. Furthermore, the pivot joint 372 may be disposed inside the body to be closer to a longitudinal axis 373 of the body than the longitudinal axis 371 of the paddle 329. This structure allows substantially the entirety of the paddle 329 to be disposed outside the body. Furthermore, the longitudinal axis 371 of the paddle 329 can be substantially parallel to the longitudinal axis 373 of the body in its closed position. The full closure and parallel arrangement of the paddle 329 may be beneficial in that the instrument can be fully closed based on the movement of the paddle 329. Moreover, the described structure also allows an inner surface of the paddle 329 to gently land on a side surface of the body that faces the paddle 329. This can prevent the paddle 329 from colliding with, or otherwise having an undesirable physical impact on, the body when the paddle is closed.
[0273] In some cases, as shown in FIG. 11A, the tail end of the paddle 329 may include a region 375 that is angled, slanted or curved toward the pivot joint 372. The angled region 375 may be used to conveniently open and close the paddle 329. The angled region 375 may be convexed to conveniently roll the handpiece without using the palm grip 303 as described herein.
[0274] With the ergonomic structure of the pivot joint 372 and the paddle 329 along with the wiper 370, no mechanical paddle movement detection mechanism, such as a rod that converts a rotational paddle movement into a linear movement, is required. Thus, the handpiece can be more efficiently manufactured and/or the pincer angle detection can be more accurately or efficiently made. Furthermore, the handpiece may be more safely operated by a user.
8. Cutout
[0275] In some cases, as shown in FIG. 27, the handpiece 122 may also include a cutout 336 formed in the side surface of the body that faces the paddle 329. The cutout 336 may be shaped to accommodate the paddle 329. For example, as shown in FIG. 30, the inner surface 329c of the paddle 329 may be inwardly curved, and the cutout 336 may be correspondingly shaped to accommodate the curved inner surface 329c of the paddle 329. As described herein, the longitudinal axis 371 of the paddle 329 can be substantially parallel to the longitudinal axis 373 of the body in its closed position. The cutout 336 may more easily enable the parallel arrangement between the axes 371 and 373 of the paddle 329 and the body by accommodating the paddle 329 therein.
Other Variations
[0276] Those skilled in the art will appreciate that, in some embodiments, additional components and/or steps can be utilized, and disclosed components and/or steps can be combined or omitted. For example, although some embodiments are described in connection with a robotic surgery system, the disclosure is not so limited. Systems, devices, and methods described herein can be applicable to medical procedures in general, among other uses. As another example, certain components can be illustrated and/or described as being circular or cylindrical. In some implementations, the components can additionally or alternatively include non-circular portions, such as portions having straight lines. As yet another example, any of the actuators described herein can include one or more motors, such as electrical motors. As yet another example, in addition to or instead of controlling tilt and/or pan of a camera, roll (or spin) can be controlled. For example, one or more actuators can be provided for controlling the spin.
[0277] The foregoing description details certain embodiments of the systems, devices, and methods disclosed herein. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the systems, devices, and methods can be practiced in many ways. The use of particular terminology when describing certain features or aspects of the disclosure should not be taken to imply that the terminology is being redefined herein to be restricted to including any specific characteristics of the features or aspects of the technology with which that terminology is associated.
[0278] It will be appreciated by those skilled in the art that various modifications and changes can be made without departing from the scope of the described technology. Such modifications and changes are intended to fall within the scope of the embodiments. It will also be appreciated by those of skill in the art that parts included in one embodiment are interchangeable with other embodiments; one or more parts from a depicted embodiment can be included with other depicted embodiments in any combination. For example, any of the various components described herein and/or depicted in the figures can be combined, interchanged, or excluded from other embodiments.
[0279] With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations can be expressly set forth herein for sake of clarity.
[0280] Directional terms used herein (for example, top, bottom, side, up, down, inward, outward, etc.) are generally used with reference to the orientation or perspective shown in the figures and are not intended to be limiting. For example, positioning "above" described herein can refer to positioning below or on one of the sides. Thus, features described as being "above" may be included below, on one of the sides, or the like.
[0281] It will be understood by those within the art that, in general, terms used herein are generally intended as “open” terms (for example, the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims can contain usage of the introductory phrases “at least one” and “one or more” to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles “a” or “an” limits any particular claim containing such introduced claim recitation to embodiments containing only one such recitation, even when the same claim includes the introductory phrases “one or more” or “at least one” and indefinite articles such as “a” or “an” (for example, “a” and/or “an” should typically be interpreted to mean “at least one” or “one or more”); the same holds true for the use of definite articles used to introduce claim recitations. In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (for example, the bare recitation of “two recitations,” without other modifiers, typically means at least two recitations, or two or more recitations).
[0282] The term “comprising” as used herein is synonymous with “including,” “containing,” or “characterized by,” and is inclusive or open-ended and does not exclude additional, unrecited elements or method steps.
[0283] Conditional language, such as “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements, and/or steps. Thus, such conditional language is not generally intended to imply that features, elements, and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements, and/or steps are included or are to be performed in any particular embodiment.
[0284] Language of degree used herein, such as the terms “approximately,” “about,” “generally,” and “substantially” as used herein represent a value, amount, or characteristic close to the stated value, amount, or characteristic that still performs a desired function and/or achieves a desired result. For example, the terms “approximately”, “about”, “generally,” and “substantially” may refer to an amount that is within less than 10% of, within less than 5% of, within less than 1% of, within less than 0.1% of, and/or within less than 0.01% of the stated amount.
[0285] It will be further understood by those within the art that any disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, can be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase “A or B” will be understood to include the possibilities of “A” or “B” or “A and B.” Further, the term “each,” as used herein, in addition to having its ordinary meaning, can mean any subset of a set of elements to which the term “each” is applied.
[0286] Conjunctive language such as the phrase “at least one of X, Y, and Z,” unless specifically stated otherwise, is otherwise understood with the context as used in general to convey that an item, term, etc. may be either X, Y, or Z. Thus, such conjunctive language is not generally intended to imply that certain embodiments require the presence of at least one of X, at least one of Y, and at least one of Z.
[0287] The various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. The described functionality may be implemented in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the invention.
[0288] The various illustrative blocks, modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
[0289] The steps of a method or algorithm and functions described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a tangible, non-transitory computer-readable medium. A software module may reside in Random Access Memory (RAM), flash memory, Read Only Memory (ROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. A storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
[0290] The above description discloses embodiments of systems, apparatuses, devices, methods, and materials of the present disclosure. This disclosure is susceptible to modifications in the components, parts, elements, steps, and materials, as well as alterations in the fabrication methods and equipment. Such modifications will become apparent to those skilled in the art from a consideration of this disclosure or practice of the disclosure. Consequently, it is not intended that the disclosure be limited to the specific embodiments disclosed herein, but that it cover all modifications and alternatives coming within the scope and spirit of the subject matter embodied in the following claims.
Claims
1. A hand controller apparatus for controlling one or more tools in a robotic surgery system, the apparatus comprising: a body including a proximal portion and a distal portion extending to a distally located interface end configured to be coupled to an input apparatus configured to control a surgical tool, the proximal portion configured to support at least a portion of a palm of a user's hand thereon; a lever having a tail end pivotally coupled to a side of the body and extending distally to a paddle end of the lever; and an input control interface on an upper surface of the body and configured to receive an input from one or more fingers of the user's hand, the input control interface located closer to the tail end than the paddle end of the lever.
2. The apparatus of claim 1, wherein the tail end includes an inner surface facing the body and an outer surface opposing the inner surface, wherein at least part of the outer surface of the tail end is outwardly curved, and wherein the at least part of the outer surface of the tail end includes a substantially convex shape.
3. The apparatus of claim 1, wherein the proximal portion has a contoured surface configured to support at least a portion of the palm of the user's hand.
4. The apparatus of claim 3, wherein the contoured surface is a convex surface.
5. A hand controller apparatus for controlling one or more tools in a robotic surgery system, the apparatus comprising: a body including a proximal portion and a distal portion extending to a distally located interface end configured to be coupled to an input apparatus configured to control a surgical tool, the proximal portion configured to support at least a portion of a palm of a user's hand thereon; a lever having a tail end pivotally coupled to a side of the body and extending to a paddle end of the lever; and an input control interface on an upper surface of the body and configured to receive an input from one or more fingers of the user's hand, the input control interface having a greater length than width, the input control interface located closer to the tail end than the paddle end of the lever.
6. The apparatus of claim 5, wherein the input control interface is generally rectangular.
7. The apparatus of claim 5, wherein the distal portion has at least one side surface that extends toward the distally located interface end and is defined by a plane.
8. The apparatus of claim 5, wherein the tail end includes an inner surface facing the body and an outer surface opposing the inner surface, wherein at least part of the outer surface of the tail end is outwardly curved, and wherein the at least part of the outer surface of the tail end includes a substantially convex shape.
9. The apparatus of claim 5, wherein the proximal portion has a contoured surface configured to support at least a portion of the palm of the user's hand.
10. The apparatus of claim 9, wherein the contoured surface is a convex surface.
11. A hand controller apparatus for controlling one or more tools in a robotic surgery system, the apparatus comprising: a body including a proximal portion and a distal portion extending to a distally located interface end configured to be coupled to an input apparatus configured to control a surgical tool, the proximal portion configured to support at least a portion of a palm of a user's hand thereon; a lever having a tail end pivotally coupled to a side of the body and extending distally to a paddle end of the lever; an input control interface on an upper surface of the body and configured to receive an input from one or more fingers of the user's hand; and a lateral movement detector configured to magnetically or inductively detect a lateral movement of the lever, wherein detection of the lateral movement causes the input apparatus to control movement of the surgical tool based on the detected lateral movement of the lever, wherein the lateral movement detector comprises a magnetic angular sensor configured to detect an angle formed between the lever and the side of the body or an inductive sensor configured to detect a non-linear movement of a metallic portion disposed in or integrally formed with the lever.
12. The apparatus of claim 11, wherein the input control interface has a greater length than width.
13. The apparatus of claim 12, wherein the input control interface is generally rectangular.
14. The apparatus of claim 11, wherein the proximal portion has a contoured surface configured to support at least a portion of the palm of the user's hand.
15. The apparatus of claim 14, wherein the contoured surface is a convex surface.
16. The apparatus of claim 11, further comprising a feedback device supported by the body and configured to provide feedback to a user in response to a change in a function of the hand controller apparatus from a first mode to a second mode, the second mode being different from the first mode, wherein the function comprises at least one of: controlling a camera that images a surgical site, instrument clutching to reposition the hand controller apparatus, a pre-set surgery routine, or an operation to control the surgical tool, and wherein the change from the first mode to the second mode is configured to occur within a same function.
17. The apparatus of claim 16, wherein the feedback device is configured to provide a haptic feedback, a tactile feedback, a force feedback, a visual feedback or an audio feedback in response to the change in the function.
18. The apparatus of claim 11, wherein the input control interface is formed on a surface of the body and configured to sense the input from the one or more fingers of the user's hand, and wherein the apparatus further comprises a processor configured to control a function of the one or more tools in response to the sensed input.
19. The apparatus of claim 18, wherein the input control interface comprises a trackpad or a capacitive touch surface configured to sense at least one of: swiping from a first side of the trackpad to a second side of the trackpad different from the first side, tapping, swiping and holding, tapping and holding, multiple tapping, or multiple tapping and holding.
SURGICAL INSTRUMENT APPARATUS, ACTUATOR, AND DRIVE
DOCUMENT ID US 20220361973 A1
DATE PUBLISHED 2022-11-17
INVENTOR INFORMATION
Genova; Perry A., Chapel Hill, NC, US
Laakso; Aki Hannu Einari, Raleigh, NC, US
APPLICANT INFORMATION
Titan Medical Inc., Toronto, CA (assignee)
APPLICATION NO 17/878958
DATE FILED 2022-08-02
DOMESTIC PRIORITY (CONTINUITY DATA)
parent US continuation 17859276 20220707 PENDING child US 17878958
parent US continuation 17511658 20211027 parent-grant-document US 11382708 child US 17859276
parent US continuation 17406147 20210819 PENDING child US 17511658
parent US continuation 16427164 20190530 parent-grant-document US 11123146 child US 17406147
US CLASS CURRENT: 1/1
CPC CURRENT
CPCI: A61B 17/29 (2013-01-01); A61B 34/30 (2016-02-01); A61B 34/71 (2016-02-01); A61B 17/00234 (2013-01-01)
CPCA: A61B 2017/00314 (2013-01-01); A61B 2017/2908 (2013-01-01); A61B 2017/00327 (2013-01-01); A61B 2034/306 (2016-02-01)
Abstract
A surgical instrument apparatus for performing a surgical procedure within a body cavity of a patient is disclosed and includes an elongate manipulator having a distal end for receiving an end effector and including a plurality of control links extending through the manipulator operable to cause movement of the distal end in response to movement of the control links in a longitudinal direction. An actuator chassis is disposed at a proximal end of the manipulator and includes a plurality of actuators slidingly mounted within the actuator chassis for linear movement in the longitudinal direction. Each actuator is coupled to a control link and adjacently disposed about a curved periphery of the actuator chassis. An outwardly oriented portion couples a drive force to the actuator to cause movement of the control link.
Background/Summary
TECHNICAL FIELD
[0001] This disclosure relates generally to a surgical instrument apparatus for performing a surgical procedure within a body cavity of a patient.
DESCRIPTION OF RELATED ART
[0002] Surgical instruments used in laparoscopic and/or robotic surgery generally have a proximally located actuator that may be used to actuate a distal end effector for performing a surgical task within a body cavity of a patient. Such instruments may be used in applications where there is an area of limited access for an operator. The distal end of the instrument may be inserted into the area of limited access and the operator may remotely manipulate the instrument via the actuator. The actuator may be located outside the area of limited access, but there may still be constraints placed on the extents of the actuator. There remains a need for actuators and drivers that are suitable for laparoscopic and/or robotic instruments.
SUMMARY
[0003] In accordance with one disclosed aspect there is provided a surgical instrument apparatus for performing a surgical procedure within a body cavity of a patient. The apparatus can include an elongate manipulator with a distal end configured to receive an end effector and including a plurality of control links extending through the manipulator and configured to cause movement of the distal end of the manipulator in response to movement of the control links in a longitudinal direction generally aligned with a length of the manipulator. The apparatus can also include an actuator chassis disposed at a proximal end of the manipulator, the actuator chassis including a plurality of actuators slidingly mounted within the actuator chassis and configured to move linearly in a direction aligned with the longitudinal direction, each actuator being coupled to one of the control links. The actuators are adjacently disposed about a curved periphery of the actuator chassis and include an outwardly oriented portion configured to couple a drive force to the actuator to cause movement of the control link.
[0004] The curved periphery of the actuator chassis may be cylindrically shaped and the plurality of actuators may be mounted within slots extending longitudinally along the periphery and radially arranged about the periphery.
[0005] The actuator chassis periphery may include a curved portion and a flat portion and the plurality of actuators may be mounted within slots extending longitudinally along the curved portion and radially arranged about the curved portion, the flat portion facilitating location of the surgical instrument apparatus adjacent (for example, closely adjacent) to another apparatus including a corresponding flat portion.
[0006] The another apparatus including the corresponding flat portion may include another of the surgical instrument apparatus and the respective flat portions may facilitate location of the respective elongate manipulators in proximity (for example, close proximity) for insertion through a common access port inserted or positioned to provide access to the body cavity of the patient.
[0007] The outwardly oriented portions of the plurality of actuators may be each shaped to engage a corresponding drive coupler configured to couple the drive force to the actuator.
[0008] The actuator coupling portion of the actuator may include a protrusion that extends outwardly beyond the curved periphery of the actuator chassis.
[0009] The apparatus may include a drive chassis including a respective plurality of drive couplers configured to couple drive forces to the plurality of actuators, the drive couplers arranged about the periphery of the actuator chassis, each drive coupler may include an open channel portion configured to receive the respective actuator protrusions when the actuator chassis is inserted into the drive chassis, and a retaining portion configured to receive and retain the respective actuator protrusions when the drive chassis and the actuator chassis are rotated through an angle to cause the retaining portions to engage the respective actuator protrusions.
[0010] The drive chassis may be configured to permit the manipulator to be inserted through the drive chassis to cause the open channel portions to receive the respective actuator protrusions.
[0011] The actuator chassis may include a transition portion between the manipulator and the actuator chassis, the transition portion configured to laterally displace the control links for coupling to the respective actuators.
[0012] The manipulator may include at least one end effector control link configured to couple to an end effector and the actuator chassis may include at least one end effector actuator coupled to the end effector control link to actuate movements of the end effector.
[0013] The at least one end effector actuator may be mounted within the actuator chassis to permit at least one of longitudinal movement configured to actuate opening or closing of an end effector, or rotational movement configured to cause a corresponding rotation of the end effector.
[0014] The at least one end effector actuator may include a single end effector actuator configured to perform both the longitudinal movement and the rotational movement.
[0015] The at least one end effector control link may be routed along a central bore of the actuator chassis and the end effector actuator may be mounted at a distal portion of the actuator chassis.
[0016] The manipulator may include a rigid portion connected to the actuator chassis, and an actuatable articulated portion configured to cause the movement of the distal end of the manipulator in response to the longitudinal movement of the control links.
[0017] The apparatus may include an unactuated articulated portion disposed between the rigid portion and the chassis, the unactuated articulated portion configured to permit the manipulator to be bent to reduce an overall length of the manipulator and actuator chassis during cleaning and sanitizing of the apparatus.
[0018] In accordance with another disclosed aspect there is provided a surgical instrument apparatus for performing a surgical procedure within a body cavity of a patient. The apparatus can include an elongate manipulator with a distal end configured to receive an end effector and including a plurality of control links extending through the manipulator and configured to cause movement of a distal end of the manipulator in response to movement of the control links in a longitudinal direction generally aligned with a length of the manipulator. The apparatus can also include an actuator chassis disposed at a proximal end of the manipulator, the actuator chassis including a plurality of actuators mounted within the actuator chassis, each actuator being coupled to one of the control links configured to couple a drive force to the actuator to cause movement of the control link. The proximal end of the manipulator can be laterally offset to facilitate location or positioning of the surgical instrument apparatus adjacent (such as, closely adjacent) to another surgical instrument apparatus for insertion or positioning through a common access port inserted to provide access to the body cavity of the patient.
[0019] The manipulator may include a rigid portion connected to the actuator chassis, and an actuatable articulated portion configured to cause the movement of the distal end of the manipulator in response to longitudinal movement of the control links.
[0020] The apparatus may include an unactuated articulated portion disposed between the rigid portion and the actuator chassis, the unactuated articulated portion configured to permit the manipulator to be bent to reduce an overall length of the manipulator and actuator chassis during cleaning and sanitizing of the apparatus.
[0021] The proximal end of the manipulator can be laterally offset to facilitate positioning of the surgical instrument adjacent to the another surgical instrument apparatus so that spacing between the manipulator and another manipulator of the another surgical instrument is between about 10 millimeters and about 35 millimeters.
[0022] In accordance with another disclosed aspect there is provided a surgical instrument apparatus for performing a surgical procedure within a body cavity of a patient. The apparatus can include an elongate manipulator with a distal end configured to receive an end effector and including a plurality of control links extending through the manipulator and configured to cause movement of a distal end of the manipulator in response to movement of the control links in a longitudinal direction generally aligned with a length of the manipulator. The apparatus can also include an actuator chassis disposed at a proximal end of the manipulator, the actuator chassis including a plurality of actuators mounted within the actuator chassis, each of the plurality of actuators being coupled to one of the control links configured to couple a drive force to the actuator to cause movement of the control link. The manipulator can include a rigid portion connected to the actuator chassis, and an actuatable articulated portion configured to cause the movement of the distal end of the manipulator in response to longitudinal movement of the control links. The apparatus can further include an unactuated articulated portion disposed between the rigid portion and the chassis, the unactuated articulated portion configured to permit the manipulator to be bent to reduce an overall length of the manipulator and actuator chassis during cleaning and sanitizing of the apparatus.
[0023] Other aspects and features will become apparent to those ordinarily skilled in the art upon review of the following description of specific disclosed embodiments in conjunction with the accompanying figures.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0024] In drawings which illustrate disclosed embodiments,
[0025] FIG. 1 is a perspective view of a surgical instrument apparatus for performing a surgical procedure within a body cavity of a patient;
[0026] FIG. 2 is a partially cut away perspective view of an actuator chassis of the surgical instrument apparatus shown in FIG. 1;
[0027] FIG. 3A is a perspective view of an actuator of the actuator chassis shown in partial engagement with a drive coupler;
[0028] FIG. 3B is a perspective view of the actuator shown in full engagement with the drive coupler;
[0029] FIG. 4A is a perspective view of a drive chassis including a plurality of the drive couplers shown in FIGS. 3A and 3B and the actuator chassis of FIG. 2 being inserted into the drive chassis;
[0030] FIG. 4B is a perspective view of the drive chassis of FIG. 4A showing the actuator chassis in partial engagement with the drive chassis;
[0031] FIG. 4C is a perspective view of the drive chassis of FIG. 4B showing the actuator chassis in full engagement with the drive chassis;
[0032] FIG. 5A is a perspective view of a surgical instrument apparatus in accordance with another embodiment;
[0033] FIG. 5B is a perspective view of a pair of the surgical instrument apparatus shown in FIG. 5A disposed adjacently for insertion through a common access port;
[0034] FIG. 6 is a perspective view of a pair of surgical instruments disposed adjacently for insertion through a common access port in accordance with another embodiment; and
[0035] FIG. 7 is a perspective view of a surgical instrument apparatus in accordance with another embodiment.
DETAILED DESCRIPTION
[0036] Referring to FIG. 1, a surgical instrument apparatus for performing a surgical procedure within a body cavity of a patient is shown generally at 100. The apparatus 100 includes an elongate manipulator 102 having a distal end 104 for receiving an end effector 106. The manipulator 102 includes a plurality of control links 108 extending through the manipulator. The plurality of control links 108 are operable to cause movement of the distal end 104 of the manipulator in response to movement of the control links in a longitudinal direction 110 generally aligned with a length of the manipulator. The apparatus 100 also includes an actuator chassis 120 disposed at a proximal end 112 of the manipulator 102. The actuator chassis 120 includes a plurality of actuators 122 slidingly mounted within the actuator chassis for linear movement in a direction aligned with the longitudinal direction 110. In the embodiment shown, the actuators 122 are adjacently mounted within respective slots 124 disposed on a curved periphery 126 of the actuator chassis 120.
[0037] In the embodiment shown, the manipulator 102 includes a rigid portion 114 connected to the actuator chassis 120 and an articulated portion 116 that is actuatable to cause the movement of the distal end 104 of the manipulator in response to the longitudinal movement of the control links 108. The articulated portion 116 includes a plurality of coupled guides 118 mounted end-to-end and operable to move in response to pulling or pushing of the plurality of control links 108 as described in commonly owned PCT patent publication WO2014/201538 entitled “ARTICULATED TOOL POSITIONER AND SYSTEM EMPLOYING SAME” filed on Dec. 20, 2013 and incorporated herein by reference in its entirety. In other embodiments, the manipulator 102 may include structures other than the coupled guides 118 for causing movement of the distal end 104 of the manipulator.
[0038] Referring to FIG. 2, the proximal end 112 of the manipulator 102 and the actuator chassis 120 are shown with the actuator chassis partially cut away. In one embodiment, the plurality of control links 108 are implemented as wires routed through respective bores 200 extending through the manipulator 102. The actuator chassis 120 has a transition portion 202 between the proximal end 112 of the manipulator 102 and the actuator chassis. In this embodiment the transition portion 202 includes a bulkhead 204 having openings 206 that cause the respective control links 108 to be laterally displaced toward the curved periphery 126 of the actuator chassis 120. The transition portion 202 facilitates the movement of the control links 108 along their respective axes while preventing drift of the control links 108. In one embodiment, the transition portion 202 may include curved conduit (not shown) extending between the proximal end 112 of the manipulator 102 and the bulkhead 204 for receiving and guiding control links 108 through the transition portion. Each actuator 122 is coupled to one of the control links 108. The control links 108 may be implemented using nitinol wire, which is capable of bending through an arc while still transmitting force in tension or compression. Nitinol is an alloy of nickel and titanium having shape memory and superelasticity and is capable of transmitting forces of about 200N. In other embodiments, the control links 108 may be implemented using other commonly used wires such as stranded cables used in laparoscopic instruments.
[0039] One actuator 208 of the plurality of actuators 122 is shown displaced longitudinally within the slot 124. The longitudinal displacement of the actuator 208 causes the coupled control link 108 to be correspondingly pulled rearwardly within the actuator chassis 120. Other actuators 122 such as the adjacent actuators 210 and 212 are similarly moveable within the respective slots 124 to push or pull the associated control link 108. In the embodiment shown, the curved periphery 126 of the actuator chassis 120 is cylindrically shaped and the slots 124 are radially arranged about the curved periphery.
[0040] Referring back to FIG. 1, in one embodiment pairs of the control links 108 are coupled to coupler segments 130, 132, and 134. Actuation of the control links 108 by the actuators 122 causes the coupled guides 118 between each of the coupler segments to be displaced laterally to cause the distal end 104 and the end effector 106 to be moved into a desired position and orientation. A portion of the coupler segment 132 is shown cut away in an insert 136. In this embodiment a first pair 138, 140 of the plurality of control links 108 terminate within the coupler segment 132 and when the control link 138 is pushed by advancing the associated actuator 122 while the control link 140 is pulled by rearwardly retracting the associated actuator 122 within its slot, the coupler segment 132 is moved laterally. Similarly, a second pair 142, 144 of the plurality of control links 108 terminate within the coupler segment 132 and when the control link 142 is pushed by advancing the associated actuator 122 within its slot while the control link 144 is pulled by rearwardly retracting the associated actuator 122 within its slot, the coupler segment 132 is moved vertically upward. Reversal of the pushing and pulling of the respective actuators 122 causes a respective lateral movement to the opposite side or downward movement.
[0041] In another embodiment, the first pair 138, 140 of the plurality of control links 108 may be respectively used for pulling motions without a corresponding pushing motion. In this embodiment when the control link 140 is pulled by rearwardly retracting the associated actuator 122 within its slot (while the control link 138 is let out by a corresponding amount, such as, for example, by advancing the associated actuator 122 or by allowing the actuator 122 to freely float), the coupler segment 132 is moved laterally. Similarly, in another embodiment, for the second pair 142, 144 of the plurality of control links 108 when the control link 144 is pulled by rearwardly retracting the associated actuator 122 within its slot (while the control link 142 is let out by a corresponding amount, such as, for example, by advancing the associated actuator 122 or by allowing the actuator 122 to freely float), the coupler segment 132 is moved vertically upward. Reversal of the pulling of the respective actuators 122 causes a respective lateral movement to the opposite side or downward movement.
[0042] Combinations of lateral and vertical movement will cause the coupler segment 132 to move in any direction within a working volume of the manipulator 102. The coupler segment 134 may be similarly moved via other pairs of control links 108 actuated by the respective actuators 122 to point in any direction within the working volume. Further as described in commonly owned PCT patent publication WO2014/201538, the coupled guides 118 between the rigid portion 114 and the coupler segment 130 and the coupled guides between the coupler segment 130 and the coupler segment 132 may be configured to maintain the orientation of the coupler segment 132 substantially the same as the rigid portion 114. In this case, the guides 118 within these portions of the articulated portion 116 are constrained to move as a two-dimensional parallelogram by a set of wire links extending between the rigid portion 114 and the coupler segment 132.
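The antagonistic pairing of control links described in paragraphs [0040] and [0041] can be sketched in code. The following Python fragment is purely illustrative (the link numbers follow the reference numerals in the text, but the function name, sign convention, and units are assumptions, not from the patent):

```python
def pair_displacements(lateral, vertical):
    """Map a desired coupler-segment motion to longitudinal displacements
    of the two antagonistic control-link pairs (138, 140) and (142, 144).

    Assumed sign convention: positive = link pushed (actuator advanced),
    negative = link pulled (actuator retracted). Pushing one link of a
    pair while pulling its partner by the same amount moves the segment
    in that pair's direction; combining both commands moves it anywhere
    within the working volume.
    """
    return {
        138: +lateral,   # pushed for lateral motion
        140: -lateral,   # partner pulled by the same amount
        142: +vertical,  # pushed for vertically upward motion
        144: -vertical,  # partner pulled by the same amount
    }

# A combined lateral and vertical command actuates all four links at once.
move = pair_displacements(lateral=2.0, vertical=-1.0)
```

Reversing the signs of the inputs reproduces the "reversal of the pushing and pulling" described above, moving the segment to the opposite side or downward.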
[0043] Still referring to FIG. 1, each of the actuators 122 includes an outwardly oriented portion 150 that facilitates coupling a drive force to the actuator to cause movement of the coupled control link. In this embodiment, the outwardly oriented portions 150 also protrude outwardly with respect to the curved periphery 126. Referring to FIG. 3A, one of the actuators 122 is shown in isolation in engagement with a drive coupler 300. The drive coupler 300 may be part of an instrument drive of a robotic surgery system (not shown). The drive coupler 300 includes a curved outer wall 302 and a first end wall 304 extending radially inwardly from the curved outer wall and defining an open channel 306 in the drive coupler. The open channel 306 is sized to receive the protruding portion 150 of the actuator 122 when slid into the drive coupler 300 in the direction indicated by the arrow 308 in FIG. 3A. Once received within the open channel 306, the drive coupler 300 is rotated in the direction of the arrow 310 to engage the outwardly oriented portion 150 of the actuator 122 as shown in FIG. 3B.
[0044] Referring to FIG. 3B, the drive coupler 300 further includes a second end wall 312 extending over the full length of the curved outer wall 302. The outwardly oriented portion 150 of the actuator 122 is engaged between the first end wall 304 and the second end wall, which define a retaining portion for receiving and retaining the actuator protrusion 150 when the drive coupler 300 is rotated through an angle to cause the retaining portion to engage the actuator protrusion. Once the drive coupler 300 is engaged, a force F applied to the drive coupler 300 is transmitted to the outwardly oriented portion 150 to cause longitudinal motion of the actuator 122 within the associated slot 124.
[0045] Referring to FIG. 4A, in the embodiment shown a plurality of the drive couplers 300 shown in FIGS. 3A and 3B are arranged to provide a drive chassis 400. The drive couplers 300 are annularly arranged about the periphery 126 of the actuator chassis 120 with the open channels 306 aligned with the outwardly oriented portions 150 of the actuators 122. The drive chassis 400 is configured to permit the manipulator 102 to be inserted through the drive chassis when loading the surgical instrument apparatus 100. The open channels 306 of the drive couplers 300 receive the respective actuator protrusions 150 as shown in FIG. 4B. Referring to FIG. 4B, the drive chassis 400 and/or actuator chassis 120 is then rotated through an angle in a direction indicated by the arrow 402 to cause the retaining portions (i.e. first and second end walls 304 and 312, shown in FIGS. 3A and 3B) to engage the respective actuator protrusions 150 as shown in FIG. 4C. Referring to FIG. 4C, once the drive couplers 300 are engaged, each drive coupler is able to independently move back and forth in the longitudinal direction 110 to couple drive forces to the respective actuators 122. In one embodiment the drive chassis 400 is part of an instrument drive (not shown) that generates and couples individual drive forces to the respective drive couplers 300. The instrument drive may be implemented as part of a robotic surgery system in which operator input received at an input device is used to generate drive signals, which are used to control the instrument drive for causing manipulation of the manipulator 102 via the drive chassis 400 and actuator chassis 120.
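The insert-then-rotate engagement sequence of FIGS. 4A to 4C can be summarized as a small state model. This Python sketch is illustrative only (the class and method names are invented for the example); it captures the single rule that a drive force reaches an actuator only after its protrusion has been received and the chassis rotated into engagement, not the mechanism's geometry:

```python
class DriveCouplerModel:
    """Illustrative state model of one drive coupler: the actuator
    protrusion is first received in the open channel, then the chassis
    is rotated so the retaining portion locks the protrusion; only then
    is a drive force transmitted to the actuator."""

    def __init__(self):
        self.received = False  # protrusion slid into the open channel
        self.engaged = False   # rotated so the retaining portion holds it

    def insert(self):
        self.received = True

    def rotate(self):
        if not self.received:
            raise RuntimeError("cannot rotate before the protrusion is received")
        self.engaged = True

    def transmit(self, force):
        # Force is coupled to the actuator only when fully engaged.
        return force if self.engaged else 0.0
```

Under this model, calling `insert()` alone leaves `transmit()` returning zero; only after `rotate()` is the applied force passed through to the actuator.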
[0046] In the embodiment shown in FIG. 1, eight actuators 122 and associated control links 108 are provided. Four of these actuators 122 cause movement of the coupler segment 132, while the remaining four actuators cause movement of the coupler segment 134. Referring back to FIG. 2, the manipulator 102 further includes a central bore 220 that in this embodiment accommodates an end effector control link 222. The end effector control link 222 is coupled to the end effector 106 for causing opening of the actuator jaws and/or causing rotation of the actuator about a longitudinal axis of the manipulator 102. The end effector control link 222 is routed through the actuator chassis 120 and coupled to an end cap 224 at a distal end of the actuator chassis. In one embodiment, the end cap 224 is able to rotate in the direction of the arrow 226, which rotates the end effector control link 222 causing corresponding rotation of the end effector at the distal end 104 of the manipulator 102. Additionally, the end cap 224 may also be configured to move in the longitudinal direction 110 to actuate longitudinal back and forth movement of the end effector control link 222 for opening and closing the end effector. The single end effector control link 222 may thus be operable to actuate both rotation and opening/closing movements of the end effector 106. In other embodiments, the end effector control link 222 may be configured as a hollow torque tube that provides the rotational actuation to the end effector 106, while an additional control link may be routed through the central bore 220 to actuate the opening and closing movements of the end effector 106.
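Paragraph [0046] describes a single end effector control link 222 carrying two degrees of freedom: rotating the end cap 224 rolls the end effector, while translating it opens or closes the jaws. A minimal sketch of that mapping, assuming a 1:1 roll transmission and a hypothetical linear jaw gain (neither value is from the patent):

```python
# Illustrative mapping for the single end effector control link of [0046]:
# rotating the end cap rolls the end effector; translating the end cap
# opens or closes the jaws. Gain and limit values are assumptions.

JAW_GAIN_DEG_PER_MM = 20.0   # hypothetical: jaw angle per mm of link travel
MAX_JAW_DEG = 60.0           # hypothetical mechanical jaw limit

def end_effector_state(cap_rotation_deg, cap_translation_mm):
    """Map end-cap motion to (roll angle, jaw opening) of the end effector."""
    roll_deg = cap_rotation_deg % 360.0  # assumed 1:1 transmission through the link
    jaw_deg = max(0.0, min(MAX_JAW_DEG, cap_translation_mm * JAW_GAIN_DEG_PER_MM))
    return roll_deg, jaw_deg

roll, jaw = end_effector_state(cap_rotation_deg=450.0, cap_translation_mm=1.5)
# roll == 90.0 and jaw == 30.0 under the assumed gains
```

The torque-tube variant at the end of [0046] would simply split these two inputs across two separate links instead of multiplexing them on one.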
[0047] Referring to FIG. 5A, an actuator chassis in accordance with another embodiment is shown generally at 500. The periphery of the actuator chassis 500 includes a curved portion 502 and a flat portion 504. The actuator chassis 500 includes a plurality of actuators 506 configured generally as described above. The plurality of actuators 506 are mounted in respective slots 508 extending longitudinally along the curved portion 502 of the actuator chassis 500. The actuators 506 are radially arranged about the curved portion 502 and the actuator chassis 500 is coupled to a manipulator 102 (shown in part) as generally described above.
[0048] In many cases two or more of the surgical instrument apparatus 100 may be used during a surgical procedure performed through a single common access port (i.e. a single incision or opening to a body cavity of a patient). Referring to FIG. 5B, the flat portion 504 of the actuator chassis 500 facilitates closely spacing the actuator chassis adjacent to a second actuator chassis 510 having a corresponding flat portion 512. The close spacing has the advantage of placing the manipulator 102 and a manipulator 514 coupled to the actuator chassis 510 in relatively close proximity for insertion through a common access port and/or trocar (not shown). The spacing D between the manipulators may be less than about 10 millimeters, about 10 millimeters, about 20 millimeters, about 21.5 millimeters, about 35 millimeters, about 40 millimeters, or greater than about 35 millimeters or 40 millimeters, such as about 50 millimeters or 60 millimeters. The spacing D between the manipulators may be between about 10 millimeters (or less) and about 20 millimeters (or more), between about 10 millimeters (or less) and about 35 millimeters (or more), between about 10 millimeters (or less) and about 40 millimeters (or more), between about 20 millimeters (or less) and about 35 millimeters (or more), or between about 20 millimeters (or less) and about 40 millimeters (or more). The further off-center the manipulators 102 and 514 are positioned relative to the respective actuator chassis 500 and 510, the smaller the spacing D becomes and thus the smaller the diameter of the required common access port/trocar. Each of the actuator chassis 500 and the actuator chassis 510 would be received within a drive chassis (not shown) configured to accommodate and provide drive forces for operating the side-by-side surgical instruments.
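The relationship in [0048] between the spacing D and the access-port size follows from simple geometry: two parallel shafts of diameter d whose centers are D apart fit inside a circular port of diameter at least D + d. A worked example, with the shaft diameter being an assumption for illustration only:

```python
# Illustrative geometry for [0048]: two parallel manipulator shafts of
# diameter d with center-to-center spacing D need a circular access port
# of diameter at least D + d. The 8 mm shaft diameter is an assumption.

def min_port_diameter_mm(spacing_d_mm, shaft_diameter_mm):
    """Smallest circular port enclosing two parallel shafts with centers D apart."""
    return spacing_d_mm + shaft_diameter_mm

# With a hypothetical 8 mm shaft, reducing D from 20 mm to 10 mm shrinks
# the required port from 28 mm to 18 mm:
port_at_20 = min_port_diameter_mm(20.0, 8.0)
port_at_10 = min_port_diameter_mm(10.0, 8.0)
```

This is why the paragraph ties reducing D directly to a smaller common access port/trocar diameter.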
[0049] Referring to FIG. 6, an alternative arrangement for side-by-side surgical instrument operation includes a first actuator chassis 600 disposed spaced apart from a second actuator chassis 602. Each actuator chassis 600, 602 has a respective manipulator 604 and 606 coupled to the chassis. The manipulators 604 and 606 have respective actuatable articulated portions 608 and 610 configured generally as described above in connection with the FIG. 1 embodiment. The manipulators 604 and 606 each have respective rigid portions 612 and 614. The rigid portion 612 of the manipulator 604 has a leftward laterally offset portion 620 while the manipulator 606 has a rightward laterally offset portion 622. The left and right laterally offset portions 620 and 622 facilitate closely adjacent location of the respective articulated portions 608 and 610 of the manipulators 604 and 606 for insertion through a common access port.
[0050] Referring to FIG. 7, a surgical instrument apparatus in accordance with another embodiment is shown generally at 700. The surgical instrument apparatus 700 includes an actuator chassis 702 configured generally as disclosed above. The actuator chassis 702 is coupled to a manipulator 704 including a rigid portion 706 and an actuatable articulated portion 708 also configured generally as disclosed above. In this embodiment, the surgical instrument apparatus 700 further includes an articulated portion 712 disposed between the rigid portion 706 and the actuator chassis 702. The articulated portion 712 permits the manipulator to be bent as shown in FIG. 7 to reduce an overall length of the instrument (i.e. manipulator and actuator chassis). The articulated portion 712 may be actuated during a surgical procedure or may be a passive portion that is not actuated during the procedure.
[0051] In many cases the surgical instrument apparatus 700 may be reusable, and cleaning and sanitization following use in a surgical procedure is thus required. The overall length of the surgical instrument apparatus 100 shown in FIG. 1 may prohibit its accommodation within conventional sanitization equipment. The articulated portion 712 facilitates bending of the instrument to reduce its overall dimensions, making the instrument more readily accommodated in a decontamination sink or a chamber of a washer/disinfector commonly used for cleaning and sanitization in surgical environments. Additional bending to accommodate limited space constraints during cleaning and sanitization may be enabled by having the actuatable articulated portion 708 at least partially bendable/flexible during cleaning and sanitization (i.e. when not in surgical use). This additional bending and/or the bending of the articulated portion 712 may be facilitated by allowing the control links extending through the manipulator 704 to move into a relaxed state, for example by maneuvering the actuators (such as actuators 506 shown in FIG. 5A).
[0052] Language of degree used herein, such as the terms “approximately,” “about,” “generally,” and “substantially,” represents a value, amount, or characteristic close to the stated value, amount, or characteristic that still performs a desired function or achieves a desired result. For example, the terms “approximately,” “about,” “generally,” and “substantially” may refer to an amount that is within less than 10% of, within less than 5% of, within less than 1% of, within less than 0.1% of, or within less than 0.01% of the stated value.
[0053] While specific embodiments have been described and illustrated, such embodiments should be considered illustrative only and not as limiting the disclosed embodiments as construed in accordance with the accompanying claims.
Claims
1-20. (canceled)
21. A surgical instrument apparatus for performing a surgical procedure within a body cavity of a patient, the apparatus comprising: an elongate manipulator having a distal end configured to support an end effector and including a plurality of control links extending through the manipulator and operatively engaged with the end effector, the manipulator defining a longitudinal axis; and an actuator chassis disposed at a proximal end of the manipulator, wherein the actuator chassis is substantially cylindrically shaped, the actuator chassis including a plurality of actuators slidingly mounted within the actuator chassis and configured to move linearly in a direction aligned with the longitudinal axis, each actuator being coupled to at least one of the plurality of control links, the actuators being adjacently radially arranged about a periphery of the actuator chassis, each actuator including an outwardly oriented portion configured to cause movement of the control link.
22. The apparatus of claim 21, wherein the plurality of actuators are mounted within slots extending longitudinally along the periphery of the actuator chassis and being radially arranged about the periphery of the actuator chassis.
23. The apparatus of claim 21, wherein the outwardly oriented portions of the plurality of actuators are each shaped to engage a corresponding drive coupling configured to couple to the actuator and transmit a drive force to the actuator.
24. The apparatus of claim 21, wherein the at least one control link is an end effector control link configured to couple to an end effector, and wherein the actuator chassis comprises at least one end effector actuator coupled to the end effector control link to actuate movements of the end effector.
25. The apparatus of claim 24, wherein the at least one end effector actuator is mounted within the actuator chassis to permit at least one of: longitudinal movement configured to actuate opening or closing of an end effector; or rotational movement configured to cause a corresponding rotation of the end effector.
26. The apparatus of claim 25, wherein the at least one end effector actuator comprises a single end effector actuator configured to perform both the longitudinal movement and the rotational movement.
27. The apparatus of claim 24, wherein the at least one end effector control link is routed along a central bore of the actuator chassis and the end effector actuator is mounted at a distal portion of the actuator chassis.
28. The apparatus of claim 21, wherein at least a portion of a length of the manipulator is rigid.
29. The apparatus of claim 23, wherein each actuator of the plurality of actuators includes an actuator coupling portion having a protrusion that extends outwardly beyond the periphery of the actuator chassis.
30. The apparatus of claim 29, further comprising: a drive chassis including a respective plurality of drive couplers configured to couple drive forces to the plurality of actuators, the drive couplers radially arranged about the periphery of the actuator chassis, each drive coupler including: an open channel portion configured to receive the respective actuator protrusions when the actuator chassis is inserted into the drive chassis; and a retaining portion configured to receive and retain the respective actuator protrusions when the drive chassis and the actuator chassis are rotated through an angle to cause the retaining portions to engage the respective actuator protrusions.
31. The apparatus of claim 21, wherein the elongate manipulator includes a tubular portion having a distal end supporting the end effector and a proximal end supporting the actuator chassis.
32. A surgical instrument apparatus for performing a surgical procedure within a body cavity of a patient, the apparatus comprising: an elongate manipulator including: a tubular portion having a distal end and a proximal end, the tubular portion defining a longitudinal axis; an end effector supported at the distal end of the tubular portion; and a plurality of control links extending through the tubular portion, wherein at least one control link of the plurality of control links is connected to the end effector to effect actuation of the end effector; and an actuator chassis disposed at the proximal end of the tubular portion of the manipulator, wherein the actuator chassis is cylindrically shaped, the actuator chassis including a plurality of actuators slidingly mounted within the actuator chassis, each of the plurality of actuators being coupled to a respective one of the plurality of control links and being configured to cause movement of the respective one of the plurality of control links and to effectuate actuation of the end effector, wherein the actuators are adjacently arranged about a periphery of the actuator chassis.
33. The apparatus of claim 32, wherein the plurality of actuators are mounted within slots extending longitudinally along the periphery and arranged about the periphery of the actuator chassis.
34. The apparatus of claim 32, wherein the outwardly oriented portions of the plurality of actuators are each shaped to engage a corresponding drive coupling configured to couple to the actuator and transmit a drive force to the actuator.
35. The apparatus of claim 32, wherein each actuator of the plurality of actuators includes an actuator coupling portion having a protrusion that extends outwardly beyond the periphery of the actuator chassis.
36. The apparatus of claim 35, further comprising: a drive chassis including a respective plurality of drive couplers configured to couple drive forces to the plurality of actuators, the drive couplers radially arranged about the periphery of the actuator chassis, each drive coupler including: an open channel portion configured to receive the respective actuator protrusions when the actuator chassis is inserted into the drive chassis; and a retaining portion configured to receive and retain the respective actuator protrusions when the drive chassis and the actuator chassis are rotated through an angle to cause the retaining portions to engage the respective actuator protrusions.
37. The apparatus of claim 32, wherein the tubular portion is substantially rigid.
38. A surgical instrument apparatus for performing a surgical procedure within a body cavity of a patient, the apparatus comprising: an elongate manipulator including: a substantially rigid tubular portion having a distal end and a proximal end, the tubular portion defining a longitudinal axis; an end effector supported at the distal end of the tubular portion; and a plurality of control links extending through the tubular portion, wherein at least one control link of the plurality of control links is connected to the end effector to effect actuation of the end effector; and an actuator chassis disposed at the proximal end of the tubular portion, the actuator chassis defining an outer periphery having a substantially cylindrical shape, the actuator chassis including: a plurality of actuators slidingly mounted within longitudinally extending slots formed in the actuator chassis, the plurality of actuators being configured to move linearly in a direction aligned with the longitudinal axis, each actuator being coupled to at least one control link of the plurality of control links, wherein the actuators are disposed about the outer periphery of the actuator chassis, each actuator including an outwardly oriented portion configured to receive a drive force from the actuator to cause movement of the at least one control link of the plurality of control links.
39. The apparatus of claim 38, wherein the outwardly oriented portions of the plurality of actuators are each shaped to engage a corresponding drive coupling configured to couple the drive force to the actuator.
40. The apparatus of claim 39, wherein an actuator coupling portion of the actuator comprises a protrusion that extends outwardly beyond the periphery of the actuator chassis.
41. The apparatus of claim 40, further comprising: a drive chassis including a respective plurality of drive couplers configured to couple drive forces to the plurality of actuators, the drive couplers radially arranged about the periphery of the actuator chassis, each drive coupler including: an open channel portion configured to receive the respective actuator protrusions when the actuator chassis is inserted into the drive chassis; and a retaining portion configured to receive and retain the respective actuator protrusions when the drive chassis and the actuator chassis are rotated through an angle to cause the retaining portions to engage the respective actuator protrusions.
42. The apparatus of claim 38, wherein the actuator chassis is cylindrically shaped.
However! It seems that the three white soldiers hold
they lose 30 billion with FTX
but how do they manage?
and here it is hard to get to $1
I want $1! holy shit
https://www.rubiustx.com/about-us/#board-of-directors
Rubius has recently generated new non-human primate data with the next generation cell conjugation RED PLATFORM, demonstrating longer circulation time than observed with the first generation platform and pronounced pharmacodynamic effects as shown by increased levels of interferon gamma, a cytokine critical to both innate and adaptive immunity. In parallel, the Company has advanced a next generation Red Cell Therapeutic, RTX-250, an antigen-specific therapy that is designed to activate dendritic cells.
With the first generation Red Cell Therapeutics, Rubius demonstrated that engineered red blood cells could be manufactured at scale, safely administered and activate a patient’s immune system, resulting in clinical benefit in certain cancer patients, including evidence of tumor shrinkage and prolonged stable disease in PD-(L)-1 refractory solid tumors.
INSTITUTIONS Total (for Top 20) 24.30
FUNDS Total (for Top 20) 80.47
Institutional Shares (Long) 74,569,319 - 82.51% (ex 13D/G)
Insider Ownership 12,112,658 - 13.40%
https://wsw.com/webcast/jeff255/ctic/1835973
i like Adam Craig
So 4!
Let's go EPIX!
Interesting!
I took a handful!
LEADERSHIP CHANGES
STRATEGIC ALTERNATIVES: SALE OF ALL OR PART OF COMPANY OR A MERGER
I'm playful!
Great challenges to virtual golf
I prefer 1 hole but I understand they like 3
https://consultqd.clevelandclinic.org/single-port-robot-turns-radical-prostatectomy-into-outpatient-procedure/
Huge Market!
Intuitive Surgical's best-known product is the da Vinci surgical robotics system, used in millions of minimally invasive surgeries all over the world. To give you an idea of the company's vast footprint, the global surgical robotics market hit an estimated valuation of about $5.2 billion in 2021. Last year, Intuitive Surgical's total annual revenue from its systems, instruments, and accessories totaled about $4.8 billion.
Over the past 10 years, Intuitive Surgical's annual revenue and net income have risen by around 160% -- and the stock has delivered a total return of more than 300% for investors. While a decline in procedures due to COVID-19 recurrence in certain markets has impacted recent financial reports, Intuitive Surgical continues to grow its top line and remain profitable.
It's no wonder analysts think the stock could achieve a high upside of 36% over the next 12 months.
ISGR has made its SP starting from the MP.
The SP is an unknown, it could be a new blue ocean or not!
In any case it is good to dominate there too!
In a year, the 7 billion it has will be worth 10% less in purchasing power!
They might as well place an offer here and see what happens!
At the very least, it makes Merdtronic pay more!
ISGR, give it a shot!
Let's act quickly
What is Gary plotting?
Does Gary have any surprises to be loved by the old shareholders or the new ones?
Will they give us a ride on the roller coaster before the blows?
Come on, Gary, be a whore with ISGR
nothing ventured nothing gained
REALLY!
looks like a comedy!
Couldn't they do it tomorrow, which is the cuckold party?
No today, the greenest day!
Market is huge!
there is a nice barrier at the entrance and we are at the top of the fence, but the market prefers to throw money into BTC or NFT
now there are the three arms
now let's talk about RAS
and it is not cheap
the rewards can be huge!
was built on the wishes of surgeons!
I have put all I can into it!
MP = nokia
SP = iPhone
all due!
OUCH !!!!!!!!!!
this taste of RS is bitter!
In thirty days they deliver the toy to us!
Another 3/6 months and the surgeons begin to play with it ...
It wouldn't be bad if the market rewarded such sincerity!
One hole is better than three !!!!
Investors evidently prefer three holes!
A mercy BO could now break through ...
We just have to hope in speculators ....
We just need 1
give us a hand
who will buy today or tomorrow?
https://fintel.io/sst/us/ctic
https://clinicaltrials.gov/ct2/show/NCT05295927?cond=EPI-7386&draw=2&rank=3
Suspended
But
(Transfer of sponsorship pending)
180 million shares outstanding
17 million short!
I see it for $10 because PAC is the best
and the FDA are taking notice
JB was right
Craig, but why don't you propose to Pfizer to try PAC?
https://www.fiercepharma.com/pharma/europe-will-look-at-all-jak-inhibitors-following-safety-concerns-raised-trial-pfizer-s
owls are in no hurry!
https://www.youtube.com/shorts/yPm0Yqx2mcE
In the third quarter, CTI continued to make strong progress with the U.S. commercial launch of VONJO, delivering net revenue of $18.2 million, a 48% increase in sales compared to the second quarter.
IMO, careful cost management will lead to higher margins
for now it seems to me the right team! I wish they didn't disappoint us
I wonder, Is it time to reveal some secrets?
I like page 19… the timeline unchanged
https://d1io3yog0oux5.cloudfront.net/_94dddf0c4bf2320bce6781bda5bb5d46/titanmedicalinc/db/1086/9913/pdf/Titan+-+IR+Presentation+-November+website.pdf
The first single-access RAS system in the market designed from the ground up
made the due proportions...
INCY
Jakafi’s (a first-in-class JAK1/JAK2 inhibitor approved for polycythemia vera, myelofibrosis and refractory acute graft-versus-host disease) revenues came in at $619.5 million.
Jakafi® (ruxolitinib) net product revenues of $598 million in Q2’22 (+13% Y/Y) driven by volume growth; raising the bottom end of full year guidance to new range of $2.36 to $2.40 billion
Very well!
someone covers up?