Automática y Robótica en Latinoamérica

Background, motivation and objective

Recent technological advances in surgery have enabled a series of new techniques that, above all, accelerate patient recovery, reduce mortality rates, and improve diagnostic accuracy and therapeutic outcomes [1], [4], [6]. A new technology is emerging in the field of medical robotics for the coming years: micro robots. Surgical procedures based on micro robots are currently a challenge for both the scientific-clinical community and engineers [2]. Surgical micro robots are seen as a means that, in the near future, will allow the surgeon to reach points in the human body that would otherwise be very difficult to access. To this end, it is important first to recreate the access to these points virtually, in order to understand the complexities involved in the navigation and locomotion of these mechanisms [3], [5]. Different techniques are used to obtain the medical images that will guide them, such as ultrasound, which offers several advantages including cost-effectiveness, non-invasiveness, and real-time volumetric image capture [3], [12]. In comparison with purely mechanical instruments that must be manipulated manually, medical micro robots offer enormous advantages in maneuverability, given the variety of tasks they can perform [5], in contrast to the restricted reach of manipulation by human hands.

The Phantom Omni is a motorized mid-range haptic device that provides force feedback, allowing the user to feel 3D objects and experience realistic tactile sensations while manipulating those objects in a virtual environment, thereby improving medical or scientific simulations [7]. Because the haptic interface provides a more realistic perception of the virtual environment, it is possible to evaluate the performance and scope of human-robot interaction; these evaluations have enabled the use of this haptic interface for medical purposes [1], [2], [4], [8]. The integration of the Unity 3D graphics engine and the Blender three-dimensional modeling software with the haptic interface leads to the development of functional virtual environments [7], [8] of the human digestive system, in which the trajectories, movement, direction and basic motion actions of a micro robot are programmed [8], specifically in the area of the pancreatic duct. During the interaction, the action of the haptic interface is evidenced in the collisions of the micro robot against the walls of the pancreatic duct and in the force feedback that the user experiences.

This article presents the implementation of a tool that allows a virtual micro robot to be moved inside the pancreas using a Phantom Omni haptic interface, in order to test the potential and functionality that this type of device would offer for the treatment of possible diseases in a particular organ of the human body.

Materials and methods

The design and implementation of the virtual navigation of the micro robot guided by the Phantom Omni haptic interface was carried out with several software tools, in which the human digestive system was represented graphically by means of 3D models. The pancreas, with its multiple internal ramifications, was specifically chosen as the site where the micro robot performs its navigation at constant speed. The haptic interface, in turn, provides the sense of touch when the micro robot collides with the walls of the pancreatic duct, generating force feedback in the user's hand. The following sections present the different software packages used.

Unity 3D, Blender and ZBrush software

Unity 3D is a 2D and 3D development tool used in virtual reality systems [9]. Each organ modeled in 3D with the Blender and ZBrush tools, explained below, was imported into this software. In addition, the entire operating logic of the micro robot and the pancreatic duct was defined through scripts, allowing physical behaviors according to parameters found in the human body. Blender, for its part, is 3D modeling software that also makes it possible to assign textures and materials to the models [10]. This software was used for the design and development of the 3D models of the pancreatic duct, stomach, pancreas, and digestive tract, including the large and small intestines. It should be noted that textures and lighting details were applied to each organ in order to improve its visual appearance. Finally, ZBrush is a 3D modeling, sculpting and digital painting program [11]. This software was used for the 3D modeling of the liver, which is a structurally complex organ to model, mainly in the areas of the left and right lobes and of the falciform ligament.

Phantom Omni haptic interface

The Phantom Omni haptic device allows kinematic interaction with complex virtual environments, providing force feedback to the user's hand. It is a motorized device that lets the user feel virtual objects and produces realistic tactile sensations as the 3D objects on screen are manipulated, in this case within the Unity 3D graphic environment.

Integration of the haptic interface and Unity 3D

The integration of the Unity 3D software with the hardware of the Phantom Omni haptic interface was achieved through an Ethernet card installed in a computer with an Intel(R) Core(TM) i5-2310 processor (CPU @ 2.90 GHz) and 4 GB of RAM. The PC was physically connected to the haptic device with a cable compatible with the ports of both hardware elements; the protocol used for this connection was TCP/IP, which provides reliable data transmission over networks.
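As an illustration of the kind of data such a TCP/IP link carries, the sketch below packs a 3-DOF position sample into network byte order and round-trips it over a local socket pair. This is a simplified, hypothetical example in Python; the actual wire format of the Phantom Omni's driver is not described in the article.

```python
import socket
import struct

def pack_pose(x, y, z):
    """Pack a 3-DOF position sample as network-order 32-bit floats."""
    return struct.pack("!3f", x, y, z)

def unpack_pose(payload):
    """Recover the three coordinates from a 12-byte payload."""
    return struct.unpack("!3f", payload)

# Loopback demonstration: in the real setup one endpoint would be the
# haptic device's Ethernet interface rather than a local socket pair.
a, b = socket.socketpair()
a.sendall(pack_pose(0.1, -0.05, 0.2))
x, y, z = unpack_pose(b.recv(12))
a.close()
b.close()
```

A fixed binary framing like this is one simple way to get deterministic message sizes over a reliable TCP stream.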

Flowchart of the scripts that define the behavior of the application

The implementation of haptic interaction in Unity enables the haptic rendering of geometries, subject to the limitations of haptic rendering and the processing capacity of the hardware. The meshes of the tangible objects must be specified, as well as the components of the transformation matrices to be fixed in the haptic frame, using the functions detailed below. For the system to work correctly, the following steps must be followed:

Turn on and link the haptic device.

Set the haptic workspace, with its dimensions and set of tangible faces, in the "Shape Manipulation" function.

Update the work area (the workspace orientation is set based on a haptic camera known as the Haptic Camera) in the "Shape Manipulation" function.

Set the interaction mode using the "Shape Manipulation" function.

Set the haptic model and its geometry (mesh and transformation matrix) by means of the "Haptic Properties" function.

Define the environmental and mechanical effects of the workspace using the "Haptic Effects" function.

Execute the interface designed in Unity for haptic events.

The "Shape Manipulation" function allows establishing the three-dimensional coordinates of the workspace, integrating an orientation camera for the execution of haptic events and, of course, setting the haptic interaction mode of the workspace. The "Haptic Properties" function, on the other hand, establishes the haptic properties that a tangible object can have. Finally, with the "Haptic Effects" function it is possible to define several types of force actions that the workspace exerts on the haptic model; in addition, it allows handling collisions of objects that initiate haptic events and comply with the aforementioned characteristics. Figure 1 shows the various connections between the scripts defined above.
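The setup sequence above can be sketched schematically. The class and method names below are hypothetical Python stand-ins for the "Shape Manipulation", "Haptic Properties" and "Haptic Effects" scripts (which in the actual application are Unity C# scripts); only the order of operations mirrors the steps listed in the text.

```python
class HapticWorkspace:
    """Minimal stand-in for the haptic configuration scripts."""

    def __init__(self):
        self.dimensions = None
        self.faces = []
        self.mode = None
        self.models = {}
        self.effects = []

    # "Shape Manipulation": workspace bounds, tangible faces, interaction mode
    def set_workspace(self, dimensions, faces):
        self.dimensions = dimensions
        self.faces = list(faces)

    def set_interaction_mode(self, mode):
        self.mode = mode

    # "Haptic Properties": register a model with its mesh and transform
    def set_haptic_model(self, name, mesh, transform):
        self.models[name] = {"mesh": mesh, "transform": transform}

    # "Haptic Effects": environmental/mechanical forces in the workspace
    def add_effect(self, effect):
        self.effects.append(effect)

# Configuration in the order given by the steps above (values illustrative)
ws = HapticWorkspace()
ws.set_workspace((0.3, 0.2, 0.2), ["duct_wall"])
ws.set_interaction_mode("contact")
identity = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
ws.set_haptic_model("micro_robot", mesh="sphere", transform=identity)
ws.add_effect({"type": "viscosity", "gain": 0.4})
```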

Figure 1

Relationship of inputs and outputs for scripts


Source: Prepared by the authors

Results

The 3D models built in the Blender and ZBrush software made it possible to complement and virtually display the navigation dynamics of the micro robot, specifically through the pancreatic duct. These models were integrated into the Unity 3D graphics engine, where a scene divided into two parts was designed, as shown in Figure 2. The upper right part shows the patient and the different abdominal organs. In the lower right part, the pancreas, divided in two, and the pancreatic duct with its ramifications can be seen; an arrow was included in this section that constantly indicates the position of the micro robot along the pancreatic duct. The section on the left shows the execution of the simulation, i.e., the path of the micro robot in the pancreatic duct and the collisions produced when it touches the walls of the duct under control of the haptic interface.

Figure 2

Scene in Unity 3D for the navigation of the micro robot


Source: Prepared by the authors

The device can enter any of the designed abdominal organs, although in this case only the interior of the pancreatic duct has collision characteristics and, therefore, haptic feedback. The micro robot was designed and implemented with special attributes that define its direction along the different axes, its path, its speed, and the collisions generated when it strikes the walls of the pancreatic duct. Additionally, a camera system was implemented that improves the visualization of the 3D graphic environment and provides georeferencing of the device inside the workspace. An exploration light was also added to the front of the micro robot to facilitate the vision of the person interacting with the graphic environment and the haptic interface. It should be noted that the Unity 3D software displays a warning message in the lower right-hand corner indicating the time of each collision with the pancreatic duct. Finally, Figure 3 shows the user interacting with the application through the haptic interface.
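One common way to compute force feedback on contact with a duct wall is a penalty-based spring model. The sketch below is an illustrative Python simplification, not the authors' actual Unity collision scripts: it treats the duct locally as a cylinder and pushes the tool tip back, toward the duct axis, in proportion to how far it penetrates the wall.

```python
import math

def wall_feedback_force(pos, duct_center, duct_radius, k=200.0):
    """Penalty-based contact force for a cylindrical duct wall (2D cross-section).

    pos, duct_center: (x, y) coordinates in the duct's cross-sectional plane.
    duct_radius: inner radius of the duct.
    k: spring stiffness (illustrative value, units N/m).
    Returns the (fx, fy) force pushing the tip back inside the duct.
    """
    dx = pos[0] - duct_center[0]
    dy = pos[1] - duct_center[1]
    r = math.hypot(dx, dy)
    penetration = r - duct_radius
    if penetration <= 0 or r == 0:
        return (0.0, 0.0)           # tip is inside the duct: no contact force
    scale = -k * penetration / r    # spring force directed radially inward
    return (scale * dx, scale * dy)
```

Inside the duct the force is zero; once the tip crosses the wall, the force grows linearly with penetration depth, which is the sensation the user would feel through the haptic stylus.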

Figure 3

Interaction and testing of the micro-robot navigation with the haptic interface within the Unity 3D environment


Source: Prepared by the authors

Discussion and conclusions

This article presented the implementation of a tool that allows the navigation of a virtual micro robot inside the pancreatic duct using a haptic interface. The tool was built with the Unity 3D software, while the abdominal organs were modeled in Blender and ZBrush and subsequently exported to the Unity environment. The connection between the haptic interface used, a Phantom Omni with six degrees of freedom, and the Unity environment was described in detail. The tests demonstrated the correct navigation of the micro robot inside the pancreatic duct and the force feedback that the user feels when the device strikes any of the walls of the duct.

Future work will include the locomotion of the micro robot, which will be achieved by means of magnetic coils. It is also expected to program specific tasks for the device to perform on any of the abdominal organs.

References

[1] V. Vitiello, S. Lee, T. P. Cundy, and G. Yang, “Emerging Robotic Platforms for Minimally Invasive Surgery”, IEEE Rev. Biomed. Eng., vol. 6, pp. 111-126, 2013 [Online]. Available: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6392862&isnumber=6490409

[2] N. Enayati, E. De Momi, and G. Ferrigno, “Haptics in Robot-Assisted Surgery: Challenges and Benefits”, IEEE Rev. Biomed. Eng., vol. 9, pp. 49-65, 2016 [Online]. Available: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7425205&isnumber=7572229

[3] M. Antico et al., “Ultrasound guidance in minimally invasive robotic procedures”, Med. Image Anal., vol. 54, pp. 149-167, 2019. doi: 10.1016/j.media.2019.01.002

[4] C. G. Corrêa, F. Nunes, E. Ranzini, R. Nakamura, and R. Tori, “Haptic interaction for needle insertion training in medical applications: The state-of-the-art”, Med. Eng. Phys., vol. 63, pp. 6-25, 2019. doi: 10.1016/j.medengphy.2018.11.002

[5] S. Kim et al., “Scaffold-type microrobots for targeted cell delivery”, in 2015 12th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI), H. Choi, Ed. Goyang, Korea: IEEE, 2015, pp. 526-527 [Online]. Available: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7358821&isnumber=7358803

[6] M. T. Nistor and A. G. Rusu, “Chapter 3 - Nanorobots with Applications in Medicine”, in Polymeric Nanomaterials in Nanotherapeutics, C. Vasile, Ed. Amsterdam, Netherlands: Elsevier, 2019, pp. 123-149 [Online]. Available: http://www.sciencedirect.com/science/article/pii/B9780128139325000030

[7] 3D Systems, “¿Qué es un dispositivo háptico?”, 2014. [Online]. Available: https://es.3dsystems.com/haptics-devices/touch. [Accessed: 14-Apr-2019].

[8] G. Ion-Eugen, R. Ionut-Cristian, and B. N. George, “Haptic devices synchronization into a software simulator”, in 18th International Carpathian Control Conference (ICCC), D. Popescu, Ed. Sinaia, Romania: IEEE, 2017, pp. 440-445 [Online]. Available: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7970440&isnumber=7970353

[9] EcuRed, “Unity3D”, 2017. [Online]. Available: https://www.ecured.cu/Unity3D. [Accessed: 14-Apr-2019].

[10] Instituto Nacional de Tecnologías Educativas y de Formación del Profesorado, “¿Qué hace Blender?”, 2014. [Online]. Available: http://www.ite.educacion.es/formacion/materiales/181/cd/m1/qu_hace_blender.html. [Accessed: 14-Apr-2019].

[11] UNIAT, “¿Qué es Zbrush y qué se puede hacer con esta herramienta?”, 2016. [Online]. Available: https://www.uniat.com/zbrush/. [Accessed: 14-Apr-2019].

[12] P. Goh, Y. Tekant, and S. M. Krishnan, “Future developments in high-technology abdominal surgery: ultrasound, stereo imaging, robotics”, Baillière’s Clin. Gastroenterol., vol. 7, no. 4, pp. 961-987, 1993. doi: 10.1016/0950-3528(93)90025-N

Simulation and manipulation of three educational robots in Unity 3D environment

Jaime López, Danilo Chimborazo and Andrés Vivas γ

FIET, Universidad del Cauca, Popayán, Colombia

γ. Corresponding author: j.andrex@unicauca.edu.co

Abstract

This article presents the development of a graphic environment in which the educational robots AL5B, S5-AXIS and DIY 6-AXIS are simulated and manipulated in virtual and real environments. The interface was developed in the Unity 3D virtual environment, after the robots’ parts were imported from CAD software. The program can move the robots individually and collectively, recording and reproducing their movement sequences in order to perform simple tasks. The results show the potential for teaching serial robotics in introductory engineering courses.

Keywords: Unity, robotics, Lynxmotion, simulation.

Background, motivation and objective

Currently, robotics is a branch of research that has achieved great advances, driven by the interaction of the various areas of knowledge that converge in it, from home automation ("domotics") to the aerospace sector. One area in which robotics has had a great impact is medicine, where it has improved patient rehabilitation [5] and the results of surgical interventions [3], [4]. In addition to its classic contributions to industry, in the near future more and more robots will be seen working and interacting with human beings, so it is becoming increasingly important that more people be able to interact with these types of devices [16], [17]. It should be noted that each device has its own software for execution and manipulation [12]. In most cases, however, these programs are not compatible; it is therefore very important to develop platforms that can simulate and manipulate different robotic arms, so that more people can interact with and learn from these devices. Particularly in higher education, in engineering programs, it is important that students become familiar with this type of technology from the first semesters, in order to learn about robotics, programming, mechatronics, and so on. Having simple tools in this regard will make it easier to develop larger projects in more advanced courses.

Simulation has had a great impact on robotics, because it allows establishing 3D virtual models [2], [9], [10], [16]; it can reproduce and predict the operation of robotic devices; it identifies and plans trajectories efficiently [1], [3], [16]; and it reveals the kinematics and kinetics of these devices [16], making it possible to observe how they behave with respect to the space they occupy and the restrictions they may present. All of this offers an innovative advantage that is universal, versatile, intelligent and portable [10]. It is useful for study and research, and it also provides, through visualization, an approach prior to working with real devices, leading users with no experience in programming robots to control and animate them [5], [17], significantly improving the interaction between human and machine. It is important to highlight that all of this has a great impact on education: virtual environments shorten the learning curve [3], help people acquire knowledge [4], [8], and improve task-completion times [16] while users explore and investigate new forms of control and operation of the devices [7], offering an immersive experience with equipment used in the professional field [4], [9].

It should be noted that other platforms such as ROS, MoveIt or Gazebo are also available; however, given the high performance these tools offer, their learning curve is quite steep for users unfamiliar with them. This document therefore shows how, using the Unity 3D graphics engine, a simple interface can be implemented, with its respective connection to the controller cards of three different types of robots. Unity makes it possible to build a user interface that can be used by people with no programming knowledge. This article aims to be an academic contribution on the development of a graphical interface in the Unity 3D software for the simulation and manipulation of three small commercial robotic arms: the AL5B, the S5-AXIS and the DIY 6-AXIS. Section 2 presents the robotic arms, the controller card used, and the hardware and software implementation of the three robotic arms; Section 4 shows the results, and finally the conclusions are presented in Section 5.

Materials / methods
Robotic arms and controller board

The robotics laboratory of the Universidad del Cauca has three small commercial robotic arms: the SainSmart 5-Axis robot, the AL5B robot, and the SainSmart DIY 6-Axis robot. These robots operate with servo motors that have a range of motion of 180 degrees, a torque per servo of up to 12 kg·cm, and a load capacity at the lower joint of up to 500 grams (Figure 1). The first two robots have four degrees of freedom plus the gripper, while the last one has five degrees of freedom, with a gripper as its end effector.

Figure 1

SainSmart AL5B robot, S5-Axis robot, SainSmart DIY 6 robot and controller board SSC- 32U Lynxmotion


Source: Prepared by the authors based on figures from www.sainsmart.com/ and www.lynxmotion.com/

From all the control cards available on the market, it was finally decided to work with the Lynxmotion SSC-32U servo controller, because it can handle up to 32 outputs, which is necessary to control the three robots; it also supplies the current required to drive them and allows communication without the need for an additional card, improving response times.
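To illustrate how such a board is driven, the sketch below formats a move command in the ASCII style used by the SSC-32/SSC-32U family ("#<channel>P<pulse>T<time>" terminated by a carriage return), mapping a 0-180 degree joint angle onto a 500-2500 microsecond pulse width. The pulse limits and default move time are illustrative values; real servos typically need individually calibrated ranges, and sending the string over a serial port is omitted here.

```python
def servo_command(channel, angle_deg, time_ms=1000):
    """Format a move command for an SSC-32U-style servo controller.

    Maps a 0-180 degree joint angle linearly onto a 500-2500 microsecond
    pulse width and builds the board's ASCII command string.
    Pulse limits are nominal defaults, not per-servo calibrated values.
    """
    if not 0 <= angle_deg <= 180:
        raise ValueError("angle outside the servo's 180-degree range")
    pulse = int(500 + (angle_deg / 180.0) * 2000)
    return f"#{channel}P{pulse}T{time_ms}\r"

# Center servo channel 5 (90 degrees -> 1500 microseconds) over one second
cmd = servo_command(5, 90)
```

In the real application this string would be written to the board's serial port; keeping the formatting in a pure function makes it easy to test without hardware.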