Robots on the Battlefield and for Space Travel





Key Points





  • Robotic devices are providing first-line assessments and interventions on the battlefield to save the lives of the wounded and to protect medical personnel.



  • Robotic devices can provide sophisticated medical care to soldiers in remote locations.



  • Robots and artificial intelligence provide many advantages to astronauts to maintain their health in space.



Medical care on the battlefield has developed considerable sophistication to ensure the optimal recovery and treatment of soldiers injured in battle. A critical triage system exists to move soldiers to better equipped and better staffed locations. However, the circumstances of battle can make the recovery of wounded soldiers quite difficult, as enemy fire can block the movement of medical personnel to the fallen individual. Clearly, robotic systems can play a useful role in these situations. Unmanned robotic vehicles can be directed to help without putting medics and other soldiers in harm’s way. These vehicles and devices can be directed from afar and can provide medical personnel with important information on the health status of the soldier, allowing treatment to begin as quickly as possible in situations where any delay can result in disastrous outcomes.


This chapter examines some of these devices in detail to see how they work, how they help to provide safe and effective care, and how much they have influenced medical care on the battlefield. The companies that produce these devices are spread across the globe, and their approaches show similarities but also quite interesting variations.


Battlefield Extraction–Assist Robot


The Battlefield Extraction–Assist Robot, or BEAR, was designed by Vecna Technologies Inc. to operate in a variety of situations, including rescue within buildings and in areas of challenging terrain. The robot can lift 500 pounds and can be controlled with either a motion-capture glove or an M-4 rifle grip controller. It travels on a dual-track system that allows it to climb stairs, and it can stand upright. The device can be guided remotely to rescue and retrieve a wounded soldier, particularly when under fire. Using karate-chop movements of its arms and hands, it can break through doors within buildings. Development of the robot was assisted with funding from the US Army Medical Research and Materiel Command’s Telemedicine & Advanced Technology Research Center (TATRC) and the Defense Advanced Research Projects Agency (DARPA).


Vecna Technologies also makes the QC Bot, a robotic truck-like device with hospital applications in the delivery of medicines and supplies. The machine has important safety features that allow it to stop if an individual comes into its path and to plan a way to keep moving without endangering anyone.


Emergency Integrated Lifesaving Lanyard


EMILY is an acronym for Emergency Integrated Lifesaving Lanyard. The device can travel up to 25 mph, can pound through surf to reach someone in distress in the water, and provides a platform for rest until help arrives (Fig. 8.1). It is 4 feet long, weighs 25 pounds, and is in use by navies and coast guards around the world. It has also been used for civilian applications, such as assisting lifeguards on beaches. The US Navy provided some funding for the development of this robotic device. Tony Mulligan is the inventor and the chief executive officer and president of Arizona-based Hydronalix Inc., which also makes several other similar products, such as NIX, AMY, and AMB, that serve a variety of purposes.




Fig. 8.1


Emily by Hydronalix—advanced small surface robotic systems.



The company participates in advanced research and development programs for the Office of Naval Research, Naval Air Systems Command, Naval Sea Systems Command, Naval Surface Warfare Center, DARPA, Department of Homeland Security, Domestic Preparedness Support Initiative, National Oceanic and Atmospheric Administration, Naval Information Warfare Center, and the National Science Foundation. The company also collaborates with many fire departments and has clients in many foreign countries.


Demining Robots


Although strictly speaking these robots are not for health care, brief mention is made here of the robots available to demine militarized zones. Since many soldier injuries are caused by the explosion of mines and improvised explosive devices, these robots play an important role in preventing catastrophic injuries that would impose on soldiers a lifetime of severe disability. DOK-ING, a company headquartered in Zagreb, Croatia, makes several robots of this category, including the MV-2 Honey Badger (Fig. 8.2), MV-4 Scorpion, and MV-10 Bison. These are large autonomous vehicles that vary in their capacity to detect and destroy mines in a variety of terrains.




Fig. 8.2


MV-2 Honey Badger.



The Mission Master


The Rheinmetall Mission Master series is a multifunctional group of unmanned ground vehicles (Fig. 8.3) whose purposes include the evacuation of injured personnel. The series consists of the Mission Master SP and the Mission Master XT. The SP is the smaller of the two; the larger XT also has amphibious capabilities. Both are optionally manned. In addition to medical evacuation, they serve reconnaissance, surveillance, tactical overwatch, and fire support roles. Autonomous driving and navigation functions are managed through the Rheinmetall PATH autonomy kit. Through controllers such as a tablet, smartwatch, soldier controller, or single-hand controller, various modes can be engaged, including “follow me,” convoy, and autonomous navigation. The artificial intelligence (AI) features are upgradable as advancements occur. Rheinmetall AG is a large, globally oriented corporation with products for the automotive industry as well as defense and security.




Fig. 8.3


Rheinmetall Mission Master A-UGVs.



Bell APT 70


APT stands for “autonomous pod transport.” This unmanned vehicle transports supplies, which can include medical supplies as well as food or ammunition, to soldiers on the battlefield. Since it is unmanned, there is no danger to personnel from antiaircraft defenses. Further protections include a low heat signature, due to the electric engines, and low radar visibility. It can carry payloads of up to 110 pounds. It measures 6 feet by 9 feet as it sits on the ground, with a vertical height of 6 feet. It takes off vertically and then tilts forward to achieve speeds of 115 miles per hour, with a range of 35 miles. It can travel from origin to destination without human interaction. The payload sits between the four booms for the rotors.
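As a rough consistency check on the figures quoted above, the quoted speed and range imply a maximum time aloft of under 20 minutes. The short sketch below is an illustrative back-of-envelope calculation only, not Bell's official performance data; the variable names are chosen for the example.

```python
# Back-of-envelope check of the APT 70 figures quoted in the text
# (115 mph cruise, 35-mile range, 110-lb payload). Illustrative only.

SPEED_MPH = 115.0
RANGE_MI = 35.0
PAYLOAD_LB = 110.0

# Maximum time aloft implied by the quoted speed and range.
endurance_min = RANGE_MI / SPEED_MPH * 60  # ~18.3 minutes

# Payload in kilograms, for comparison with metric-spec systems.
payload_kg = PAYLOAD_LB * 0.45359237       # ~49.9 kg

print(f"Implied endurance: {endurance_min:.1f} min")
print(f"Payload: {payload_kg:.1f} kg")
```

Such a window is consistent with short resupply hops to forward positions rather than long loiter missions.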


Zipline


These autonomous drones can quickly deliver medical supplies to remote or inaccessible locations from a central staging area. The drones fly like small airplanes, driven forward by an electric motor at about 100 km/h, with a round-trip range of 160 km. The drones have a modular design: prior to flight, the package is stowed, the wings are attached, and the battery is inserted. The GPS destination is programmed into the system. The machine is launched through a pulley system and electric motor that operate like a slingshot, accelerating the drone from a stationary position to its flight speed in 0.3 seconds. Once the drone reaches its destination, it drops its cargo, which descends to the ground under a small, disposable parachute, and then turns around and flies home. Back at its home station, a wire catches the drone by a hook on its fuselage. The drone communicates with the catching system, and misses occur only 10% of the time; when a miss occurs, the drone accelerates its engines and returns for another try.
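The launch figures above imply a remarkably forceful catapult. A quick physics estimate (an illustrative calculation under the stated assumptions, not Zipline engineering data) shows that going from a standstill to roughly cruise speed in 0.3 seconds corresponds to an average acceleration of nearly 10 g:

```python
# Rough physics check of the launch figures in the text: the slingshot
# catapult takes the drone from rest to roughly its 100 km/h cruise
# speed in about 0.3 seconds. Illustrative estimate only.

CRUISE_KMH = 100.0
LAUNCH_TIME_S = 0.3
G = 9.81  # standard gravity, m/s^2

cruise_ms = CRUISE_KMH * 1000 / 3600   # convert km/h to m/s (~27.8 m/s)
accel = cruise_ms / LAUNCH_TIME_S      # average launch acceleration
g_load = accel / G                     # expressed in multiples of gravity

print(f"Average launch acceleration: {accel:.1f} m/s^2 (~{g_load:.1f} g)")
```

An acceleration of this magnitude is tolerable for ruggedized cargo and electronics but far beyond what a crewed aircraft could use, which is one advantage of an unmanned design.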


Robotic Surgery in the Battlefield


Robotic surgery can enhance the timeliness and expertise of surgery needed for soldiers on the battlefield. Three system architectures provide three different capabilities: supervisory-controlled, telesurgical, and shared-control. In the first, the surgeon loads a program suited to the surgical needs of the patient, and the surgical robot then performs the surgery independently. In the second, the surgeon operates the robot from a distance. In the third, the robot assists the surgeon in various ways during the surgery, such as steadying movements or increasing the precision of fine movements. Another consideration is the portability of the robotic system, as larger, more cumbersome units are less suitable for transport into the battlefield. Downsides of these systems include a long and intensive training time for surgeons, the steep cost, and the increased time under anesthesia.


Trauma Pod


DARPA has initiated the Trauma Pod project (Fig. 8.4), a mobile, autonomous, robotic platform that can be deployed to the battlefield and serve injured soldiers without any immediate human intervention. Doctors and surgeons can remotely control the robotic surgical and assessment devices and thus, through robotic avatars, provide expert care. Multiple subcontractors provide services, including the University of Maryland, the University of Washington, the University of Texas at Austin, Oak Ridge National Laboratory, General Dynamics, Intuitive Surgical, General Electric, and Integrated Medical Systems.




Fig. 8.4


Trauma Pod Layout.

From Garcia P, Rosen J, Kapoor C, et al. Trauma Pod: a semi-automated telerobotic surgical system. Int J Med Robot . 2009;5(2):136–146.


Raven


LSTAT, the Life Support for Trauma and Transport system, is a snakelike robotic arm attached to a stretcher. A human operator monitors the patient through the robot’s sensors and camera and manipulates the arm through a wireless system. The snakelike arm can be directed to any part of the body for assessment. The system contains the equivalent of a small-scale intensive care unit, with a ventilator, defibrillator, pulse oximeter, and so on. Once moved to the stretcher, the patient can be assessed, stabilized, and evacuated to more extensive medical facilities.


In addition to using specific newly developed robots for targeted purposes, the military is attempting to organize a uniform system that employs robotics and AI to address all the unique concerns of providing medical care in a battlefield situation. An analogous civilian situation is a mass disaster, such as an earthquake or tsunami, in which mass casualties in a focused geographic region require the efforts of many trauma-trained health-care workers and physicians, yet the local systems have an inadequate supply of such personnel. The military, according to Dr. Cindy Crump of the Applied Physics Laboratory at Johns Hopkins University, faces a somewhat similar shortage and seeks to fill that gap at least partially through AI systems and robotics. An AI infrastructure would provide many advantages in that much of care could be automated, from assessment to intervention, and intervention could include, as mentioned, robotic surgery. Challenges remain, however. One is the cybersecurity of the devices used, because many commercial options do not provide sufficient cybersecurity to protect military personnel and the integrity of military data systems. Another concerns device interoperability, as different devices need to interface with each other.


The US Army Medical Research and Development Command has formed the Artificial Intelligence in Medical Sciences, or AIMS, task force to address such issues. As part of its directives, this task force identifies appropriate medical research for incorporation into ongoing programs, creates collaborative research efforts to generate information and technological developments that will support the care of warfighters, formulates partnerships with industry and academia, and ensures that medical care is evidence based and that autonomous care, such as from AI systems, meets those standards as well. The proposed model is the AI Stack method of perceive, learn, decide, and act. This model would involve the appropriate AI computer software and hardware with a large, relevant data input and access to an immense database of reference and journal material. Although the AI systems would have some autonomy, human interaction with and modulation of such a system would be retained.


In a similar vein, the Medical AI Autonomy Stack has also been proposed. Its layers consist of (1) sensors, medical devices, actuators, controls, and network connectivity; (2) data commons, open-source databases, repositories, and tools; (3) learn, replan, synchronize, prevent; (4) assess; (5) decide; and (6) intervene. This stack is being approached through a collaboration between the Defense Health Agency and the US Army Medical Research and Development Command to establish MDIRA, the Medical Device Interoperability Reference Architecture.


A mixed collaboration among industry, academia, and various governmental groups has been assembled to tackle objectives such as closed-loop control systems for autonomous or semiautonomous medical care and AI-assisted clinical support. Safe and secure systems are, of course, mandatory.


Efforts Outside the United States for Battlefield Robotics


Russia has made a commitment to develop robots for military medicine. Its areas of focus are much like those of other countries, including evacuation of injured soldiers on the battlefield and surgical robots for the battlefield. Its ministry of defense planned a staged development program finishing at the end of 2022 and allocated 249,982,141 rubles (US $3,617,241.58) to be portioned out in phases. Some of the main challenges for the program involved developing algorithms to rapidly diagnose injuries on the battlefield. Robots would rescue soldiers in combat situations, diagnose their injuries, and transport them to nearby centers where robotic surgical devices would perform the necessary surgeries. The entire flow of these events could be monitored remotely, and the use of robots would prevent human casualties in case of mass attacks. The robots would be designed to operate effectively in conditions of extreme temperature as well. This system could also find a place in the civilian setting and could replace paramedics. Through such programs, fewer military personnel would be needed, with an estimated one in four workers no longer necessary. The Russian surgical robot is lighter than the da Vinci system and has four arms, one of which carries a video camera while the other three hold surgical instruments. The Russian robotic system is designed to carry out procedures autonomously.


Pulkit Gaur founded Gridbots in 2007 with a focus on AI and machine vision. Headquartered in Ahmedabad, Gridbots has clients such as the Ministry of Home Affairs in New Delhi, the National Institute of Design in Ahmedabad, the Indian Navy, Bhabha Atomic Research Centre, Oil and Natural Gas Corporation (ONGC), TATA Power, Indian Space Research Organisation Physical Research Laboratory (ISRO PRL), and the Gujarat Police. The company’s contributions spread across many areas, including space robotics, defense, painting, welding, injection molding, nuclear robotics, power plants, and underwater robotics.


Robots and Artificial Intelligence in Space Programs


Travel into outer space and residence in a space station lend themselves naturally to the use of robotics and AI, both of which can enhance survival and mission success for astronauts who often function under arduous conditions. These technologies can assist the exploration of space in several traditional ways. The most obvious is as assistants to the astronauts as they go about their duties. One example of this function is the AI assistant CIMON, which began its trial aboard the International Space Station (ISS) in December 2019 and performs some of the duties that astronauts are required to perform. Another AI/robotic device is Robonaut, which will be able to off-load tasks and perform some of the riskier tasks aboard the ISS. The second area of use is mission design and planning; Daphne, for example, is an AI system used in the development of satellites, and travel to distant planets would be facilitated by such programs. Satellites pose significant challenges, both for the data they produce and for issues related to their own healthy functioning, and AI systems can be adapted for both functions. With over 34,000 objects greater than 10 cm, outer space poses dangers from the space debris floating around it; the immense computational power of AI could help to track, monitor, and warn traveling astronauts of any such danger in their path. The final traditional purpose is navigation. One of the first such systems, developed by researchers from the National Aeronautics and Space Administration (NASA) in collaboration with technologists from Intel, utilized the multitude of images of our solar system to develop an AI method of navigation for exploring the planets. Clearly more such developments will accrue in the future.


Aside from these technical issues, the health and physiologic integrity of astronauts become major concerns, particularly because space flight can create a host of medical issues. Astronauts often return with weakness. More prolonged stays can affect nearly all, if not all, physiological systems and organs. Examples include increased intracranial pressure, space flight–associated neuro-ocular syndrome, lunar dust pulmonary toxicity, alterations in the gut microbiome, increased heart rate, decreased blood pressure, increased risk of cardiac dysrhythmias, kidney stones, anemia, loss of bone mineral content, immune deficiency, radiation-induced cancers, altered circadian rhythms, and mood disturbances, among others. Many of these worsen with longer stays and can persist for long periods after the return to Earth. Ophthalmic symptoms fall into this category: in addition to space flight–associated neuro-ocular syndrome, globe flattening, optic nerve edema, retinal folds, refractive error shifts, and nerve fiber layer infarcts can occur. Cardiovascular issues can be quite concerning, as dangerous dysrhythmias can occur; after even 6 months in space, changes occur in the structure and conduction paths of the heart. There are both immediate and chronic changes from microgravity, as shifts in the distribution of blood occur along with alterations in pumping capacity and changes in cardiac reflex activity. The risks these changes pose once the astronaut returns to Earth are not yet known. Dust from the moon, Mars, and other planets has a known adverse effect on humans that is not limited to the pulmonary system but can also affect the neurologic and cardiovascular systems. Microgravity also exerts a negative effect on the musculoskeletal system, prompting loss of bone mineralization and perhaps some loss of cartilage as well; spinal pathologies can increase as a result. Diet and exercise can have some deterrent effect on these changes.


Consequently, monitoring the health and physiological systems of space travelers becomes of paramount importance. Several AI solutions have been proposed, including wearable technology, biosensors, surveillance, diagnostic imaging, augmented and virtual reality, telemedicine, and telesurgery or robotic surgery. For AI to work, many factors must come into play. As noted previously in this book, deep learning systems function best when large amounts of data are fed into their processing systems, allowing them to produce the most successful analyses. Due to the limited number of missions, optimal databases may not yet be available. The AI systems also need accurate and timely physiological data from the space travelers. Multiple technologies can be deployed to achieve these ends.


As mentioned previously, a host of wearables can provide pertinent physiological information. Wrist-worn actigraphs can provide rest data and are already in use. Important parameters to measure are bone density, cardiac output, energy expenditure, and muscle strength. Bone density, of course, indicates whether an astronaut is at or approaching risk for fractures and thus would allow preemptive action to prevent such decreases from occurring. A newer technological advance provides monitoring through electronic skin, which can take different forms: an electronic nose can detect harmful substances, while an electronic ear can pick up lung sounds for interpretation within an AI program. Such sensors could provide an early warning system for lunar dust (regolith). Other potential wearables could measure physical fatigue and mental decline or capture electroencephalographic data through caps or helmets. Wearables could be integrated into T-shirts or spacesuits. Prototypes for some of these options already exist, such as BRAINCARE, which can measure intracranial pressure.


Biosensors represent a useful tool to ensure the ongoing well-being of space travelers. Combined with AI, they make a powerful system for health analysis and monitoring. This type of device can assess chemical factors in the blood and tissues and provide feedback about the effects of space travel on the physiology and metabolism of any given individual under these conditions. Any body fluid or tissue could potentially be analyzed, including blood, urine, feces, hair, muscle, and saliva. Panomics, although generally requiring bulky devices not suited for space flight, has the potential to be incorporated into space travel once the AI systems become miniaturized enough to bypass weight and space concerns. Under the umbrella of panomics reside genomics, transcriptomics, proteomics, and metabolomics. Biosensors allowed the implementation of an interesting twin study in which one twin spent a year on the ISS and the other spent the same year on Earth. Multiple biological parameters were studied and compared. A vaccine study on the two did not show any difference in the immunologic system. However, there were significant differences in areas such as telomere length, epigenetics, the gastrointestinal microbiome, inflammatory cytokines, ocular parameters, and metabolites within the blood. Some of these normalized after the return to Earth, but others did not; thus some systems appeared to be more adversely affected than others. The beginning of data sets for deep learning machines is thus accompanying the growth of technology to gather data.


Augmented and virtual reality will also find a place in these scenarios. A proposed aid to the psychological well-being of astronauts is the enhanced visual and auditory input these systems would provide as relief from a somewhat monotonous space experience. Skin haptic systems would add tactile input to the visual and auditory channels for a more comprehensive experience. This option has been discussed in relation to a proposed trip to Mars, which would take around 3 years with just a handful of astronauts in a confined space. Augmented and virtual reality would allow these astronauts some relief from the monotony and stress of such an experience. Psychological well-being is not the only proposed benefit: maintaining cognitive and intellectual sharpness through stimulating and challenging games could be a benefit as well.


Given that computed tomography and magnetic resonance imaging machines are far too bulky and heavy to hoist into space, medical care now relies on diagnostic ultrasound or teleultrasound to provide imaging guidance in the diagnosis of a variety of medical conditions. The ISS currently employs ultrasound for its residents. Crew members of the ISS, for example, are trained in focused-assessment sonographic technique to assist in the imaging: they perform the mechanics of moving the probe over the appropriate areas at the appropriate settings, and the images are then sent to Earth for analysis. Robot and AI assistance in this medical imaging technique is certainly anticipated. The members of the ISS can direct this technique toward critical intraabdominal, intrathoracic, and ocular issues. Studies have been done on the application of ultrasound to the evaluation of the lumbar spine to better understand the development of back pain during these missions. Study and development of this modality has occurred through the NASA Extreme Environment Mission Operations program in underwater conditions. Robot assistance for the probe movements is sought as well, with physicians providing input remotely. Doppler ultrasound coupled with AI can compare blood flow velocity in the carotid versus the femoral artery to assess the risk of cerebrovascular conditions.


Although telemedicine for astronauts is now routinely employed, the difficulty of maintaining a continuous or real-time communication channel poses obstacles to the delivery of care. Utilizing AI with the integration of data from wearables, biosensors, and imaging can offset such difficulties and allow astronauts to receive a much higher level of care when not in communication with physicians located on Earth. One unique aspect of space medicine and research is that 3D-printed tissue structures that would normally collapse under their own weight on Earth sustain their shape in the microgravity environment of outer space (Fig. 8.5). Onboard AI can optimize ultrasound imaging to enhance medical diagnosis and treatment. To increase the effectiveness of AI intervention, Astroskin (Fig. 8.6) is used to monitor multiple biological parameters via a three-axis accelerometer, electrocardiogram (ECG)/heart rate, respiration rate and volume, oxygen saturation, systolic blood pressure, and skin temperature. Various trials have been performed to demonstrate its feasibility and usefulness. Prior to Astroskin, the method employed was the blood pressure/electrocardiogram kit, which is capable of measuring ECG, heart rate, and blood pressure; bulky and heavy, the kit also uses a lot of energy. One test trial of Astroskin occurred at NASA’s Human Exploration Research Analog, or HERA, within the Johnson Space Center. HERA allowed the simulation of the isolation, remoteness, and confinement associated with spaceflight. The Astroskin consisted of an upper-body garment and a headband, all containing sensors of the sort mentioned previously. The study included eight subjects, four women and four men, and consisted of a premission baseline, 4 days of continuous monitoring during full workdays, and a postmission baseline. The results indicated that the garment overall was user-friendly, was comfortable except for some issues with the headband, and could reliably transmit good physiologic data through Bluetooth technology. Sleep and fitness times were well monitored.
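A monitoring pipeline built around a garment like Astroskin would, at minimum, screen each telemetry sample against nominal ranges before escalating to a remote physician. The sketch below is purely illustrative: the parameter list mirrors the sensors described in the text, but the class, function names, and threshold values are generic adult reference figures chosen for the example, not flight-surgeon criteria.

```python
from dataclasses import dataclass

@dataclass
class VitalsSample:
    """One telemetry sample of the parameters the text lists for Astroskin."""
    heart_rate_bpm: float
    respiration_rate_bpm: float
    spo2_pct: float
    systolic_bp_mmhg: float
    skin_temp_c: float

# Generic adult reference ranges, used here only for illustration.
NORMAL_RANGES = {
    "heart_rate_bpm": (40.0, 120.0),
    "respiration_rate_bpm": (8.0, 25.0),
    "spo2_pct": (94.0, 100.0),
    "systolic_bp_mmhg": (90.0, 140.0),
    "skin_temp_c": (30.0, 38.0),
}

def flag_anomalies(sample: VitalsSample) -> list:
    """Return the names of any parameters outside their nominal range."""
    flags = []
    for name, (lo, hi) in NORMAL_RANGES.items():
        value = getattr(sample, name)
        if not lo <= value <= hi:
            flags.append(name)
    return flags

# A sample with low oxygen saturation would be flagged for review.
sample = VitalsSample(70.0, 14.0, 88.0, 118.0, 33.0)
print(flag_anomalies(sample))
```

In practice, onboard AI of the kind the chapter describes would go well beyond static thresholds, learning individual baselines and trends, but a rule layer like this is a common first line of screening.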

