Advances in Lower-limb Prosthetic Technology




The boundaries once faced by individuals with amputations are rapidly being overcome through biotechnology. Although no prosthesis currently replicates anatomic function, radical advancements in prosthetic technology, medical science, and rehabilitation during the past 30 years have vastly improved functional mobility and quality of life for individuals with lower-limb amputations. What once seemed impossible is rapidly becoming reality; the future seems limitless, and the replication of anatomic function now seems possible.


Throughout history, most advances in technology and health care have occurred during wartime, both for tactical reasons and for the care of the wounded. The Civil War was the bloodiest war in the history of the United States and resulted in more than 30,000 Union soldiers and 40,000 Confederate soldiers losing their limbs in the 4 years between 1861 and 1865. In World War I (WWI), 4403 American soldiers suffered amputations. During World War II (WWII), there were 14,912 US service members with amputations and more than 1 million worldwide. Between 1961 and 1975, the Vietnam War resulted in 5283 amputations, with 1038 service members sustaining multiple amputations. Although survival rates are increasing because of advances in early battlefield medical care and improvements in combat vehicles and personal armor, this corresponds to an increased number of veterans living with a variety of severely disabling conditions, including amputations.


The younger age, higher functional status, and desire of these veterans to return to their premorbid level of function have led to an explosion of technological and rehabilitative advancements. During the last 30 years there have been far-reaching developments in battlefield care, amputation surgery, rehabilitation, and prosthetic technology, allowing individuals with limb loss to return to functional levels once believed improbable. The development of technology has increased the diversity of prosthetic components and fabrication techniques used in assisting people with limb loss to achieve maximum functional performance. The most recent advancements in surgery, socket design, and prosthetic components are now providing amputees with more options, such as better stability, improved comfort, increased responsiveness, and generally better performance, potentially with a reduction in stresses on the body.


The advances in prosthetic technology have been possible through the concerted efforts of multiple disciplines in response to the demands of our active-service members and veterans. The continued collaboration and expertise of a variety of disciplines, including orthopedic surgery, rehabilitation medicine, physical therapy, occupational therapy, biomedical engineering, prosthetics, and orthotics have narrowed the gap between what is imagined and what is real.


History


Prostheses were developed out of necessity to replace the function of lost limbs, and to return individuals to a productive state within the social structure. There is evidence for the use of prostheses from as early as the fifteenth century BC. One of the earliest examples of prosthetic usage was found in the eighteenth dynasty of ancient Egypt in the reign of Amenhotep II. A mummy on display in the Cairo Museum had his right great toe amputated and replaced with a prosthesis manufactured from leather and wood.


During the Dark Ages, armorers developed prostheses for warriors who had been injured in combat. These early prosthetics were designed to allow soldiers to return to battle and to hide deformity. They were constructed from the most advanced materials available at the time, namely wood, leather, and metal. Unfortunately, these devices were clumsy and heavy, with little functional value. The first prosthetics demonstrating a sound understanding of basic biomechanical function were developed by the French army barber-surgeon Ambroise Paré. Paré designed upper- and lower-limb prostheses. His “Le Petit Lorrain,” a mechanical hand operated by catches and springs, was worn in battle by a French Army captain, with limited success.


Advances in medical science during the 1600s to the early 1800s, such as the invention of the tourniquet, anesthesia, analeptics, blood-clotting styptics, and disease-fighting drugs, improved amputation surgery and, subsequently, the function of prostheses. These advancements allowed surgeons to make residual limbs more functional, enabling prosthetists to make more usable prostheses. In 1858, Dr Douglas Bly patented his “anatomic leg,” which used a series of cords to control ankle motion. A few years later, J.E. Hanger, after losing his leg at the beginning of the Civil War in 1861, introduced an artificial foot that used rubber bumpers to control ankle motion. Dr Bly and Hanger competed fiercely for contracts with Walter Reed Army Medical Hospital to provide services to returning soldiers who had lost limbs during the war. By 1863, the care of returning veterans had led to numerous prosthetic patents, such as Dubois D. Parmelee’s suction socket for arms and legs, which used pressure to suspend the sockets of lower- and upper-limb amputees. His work with suction sockets was abandoned some years later, but the fabrication of better, more functional artificial limbs had begun.

WWII produced the second largest number of amputees from war in the United States. Because of vigorous protest by returning veterans about the prosthetic devices they were receiving, in 1945 the National Academy of Sciences National Research Council, at the request of Norman Kirk, Surgeon General of the Army, established the Committee on Prosthetic Devices, later known as the Prosthetic Research Board. During the following decade, numerous prosthetic devices were developed worldwide and became the standard of care for more than 50 years.




Surgery


Amputation Surgery


Amputation surgery was once reserved for only the most highly skilled surgeons; before the Civil War, only 500 of the 11,000 northern physicians and 27 of the 3000 southern physicians had performed surgery. The mortality rate from a primary amputation was 28%; if a second amputation was performed, it rose to 58%, and if an infection such as pyemia occurred, the mortality rate was more than 90%. During WWI, the concept of debridement with antiseptic technique and delayed primary closure began to replace prophylactic amputation for war extremity injuries.


Currently, newer body armor protecting the head and thorax, and the immediate use of powerful antibiotics, have led to a remarkable survival rate: the wounded-to-dead ratio before body armor was 3:1, compared with the current figure from Iraq of 10:1. As a result of body armor, most trauma is to the exposed limbs, but, because of outstanding medical triage and better control of secondary complications due to infection, there are more injured in need of care.


In January 2006, the American Association of Orthopedic Surgeons (AAOS) and the Orthopedic Trauma Association (OTA) cosponsored a symposium titled “Extremity war injuries: state of the art and future directions.” Military and civilian orthopedic surgeons convened to define the current knowledge of the management of extremity war injuries. As of June 2009, more than 950 American service members have sustained injuries resulting in major limb loss. Resuscitation of an injured soldier begins in the field with the use of special tourniquets, if necessary, to prevent exsanguination. Antibiotics are given early in the evacuation chain. Thorough irrigation and debridement are performed, usually within 2 hours of wounding. Redebridement is performed every 48 to 72 hours as the patient travels to higher echelons of care. All viable tissue is preserved. Fractures are immobilized with plaster or external fixation, and all wounds are left open before transport. Amputations are determined by the level of soft-tissue injury, not the level of fracture. Flaps of opportunity, split-thickness skin grafting, and free tissue transfers are frequently used for coverage.


Osseointegration


Osseointegration was originally defined as a direct structural and functional connection between ordered living bone and the surface of a load-carrying implant. During the 1950s, it had been shown by P.-I. Brånemark that chambers made of titanium could become permanently incorporated into bone, and that the 2 could not be separated without fracture. Osseointegration has been used in several different situations, including dental implants, facial prostheses, and hearing aids. Osseointegration has recently been proposed to create an improved interface between a residual limb after amputation and a prosthetic limb. The main obstacles to prosthetic use are the problems encountered at the stump-socket interface. Poor socket fit can lead to instability, tissue damage, and pain, leading to the rejection of the prosthesis. Osseointegration integrates titanium implants into the medullary cavity of the bone; however, the implants extend from the bone, emerging through the skin to create an anchor for the prosthetic limb. This method bypasses skin contact with the prosthesis, reducing pain and tissue damage.


Another feature reported by the proponents of osseointegration is the improved ability to identify tactile stimuli transmitted through the prosthesis. This “osseoperception” has been studied in dentistry and orthopedics. A recent study conducted by Jacobs and colleagues reported that bone-anchored prostheses yielded better perception than socket prostheses. This finding could prove invaluable to the amputee by improving kinesthetic awareness and increasing the overall responsiveness of limbs.


The technique has raised concerns, however, because it destroys the barrier function provided by skin, which prevents contamination of the internal environment by the external environment. When pathways develop around the implant through the soft tissues, infection and metal corrosion can result, which in turn can lead to additional loss of bone in residual limbs. One research focus at the Center for Restorative and Regenerative Medicine (CRRM; a collaborative research initiative that includes the Providence VA Medical Center, Brown University, Massachusetts Institute of Technology (MIT), and other VA hospitals including the Salt Lake VA in Utah) is to develop an environmental seal, integrating skin and dermis with the metal implant by promoting adhesion to, or growth into, porous prosthetic surfaces.


Limb Lengthening


A difficult socket-fitting and biomechanical problem for traumatic amputees is the short residual limb. A person with a short residual limb after transfemoral amputation may have difficulty with prosthetic fitting because of the complexity of obtaining adequate suspension, or pain from greater forces concentrated on the smaller surface of bony areas. Likewise, a higher amputation requires a heavier prosthesis, with reduced anatomic force production because of the loss of femur and muscle tissue, which consequently decreases mobility and functional ability.


One alternative is to lengthen a short residual limb by surgically lengthening bone. This is accomplished by creating an osteotomy and separating the bone ends by gradual distraction, a process called distraction osteogenesis. The history of surgical limb lengthening dates back to the turn of the twentieth century when Codivilla published the first article on the subject in 1905. Limb lengthening has been performed using several techniques, each with its own complications and failures. A breakthrough came with a technique introduced by the Russian orthopedic surgeon Gavril Ilizarov in 1951. Ilizarov developed a procedure based on the biology of the bone and on the ability of the surrounding soft tissues to regenerate under tension; the technique involved an external fixator, the Ilizarov apparatus, structured as a modular ring. Ilizarov’s method of distraction osteogenesis still had several complications, but it was the safest and most effective method of the time.


One complication of distraction osteogenesis is delayed bone healing, which leads to functional deficits such as contractures and muscle atrophy. Researchers at the CRRM have investigated techniques of accelerating or augmenting bone healing, including the use of biomimetic scaffolds, growth factors, demineralized bone matrix, gene therapy, and interaction with physical stimuli, such as mechanical, ultrasound, and electrical energy. These tissue engineering strategies hold the promise of accelerating the rate of elongation, maximizing the length of regenerated bone, and diminishing osteoporosis and refracture.




Suspension sleeves and liners


Some of the most significant developments in recent years have occurred with the interface systems between the residual limb and the socket. The 2 most common interface systems are suspension sleeves and liners. The primary function of the suspension sleeve is to hold the socket in place or suspend the socket, whereas the liner is designed to provide padding or cushioning for the residuum. The properties of suspension sleeves and liners may be combined to create suspension liners, an interface system that provides suspension and padding. In the early days of prosthetic development, protection of the residual limb was achieved by lining the socket with animal fur. Today, many variations of suspension sleeves and liners are available in a prefabricated form or they can be custom manufactured. There are numerous material variations available, such as closed-cell foam, urethanes, silicone elastomers, silicone gels, and many combinations of materials.


Silicone suspension systems and liners have seen a steady increase in use since their introduction in the 1970s. Several advantages have been reported with the use of silicone liners, including improved socket interface, greater comfort, decreased pain, and greater skin protection. Dasgupta found an increase in distance walked with a prosthesis, with a decrease in assistive device use and increased comfort, in liner users. Datta and colleagues reported that some transtibial amputees using silicone suspension experienced improved prosthetic control with a decrease in skin abrasion and irritation, and a reduction in phantom limb pain.


Elastomeric liners have been reported to have high coefficients of friction when in contact with skin, decreasing slippage and thereby reducing localized skin tension and shear. Moreover, there is an increase in the total surface-area contact of the residuum with the socket, leading to more even pressure distribution compared with a conventional closed-cell foam material. It is reasoned that, with a high coefficient of friction (COF) with the skin and low compressive stiffness, elastomeric liners experience minimal displacement relative to residual-limb skin during walking. This effect helps maintain total contact, and is believed to reduce localized skin tension and shear, making the liner more comfortable for the amputee.


Sanders and colleagues compared the compressive stiffness, COF, shear stiffness, and tensile stiffness of 15 commonly prescribed elastomeric liner products with a skin-like material. In general, silicone gels had a lower compressive, shear, and tensile stiffness than did silicone elastomers. The difference between silicone elastomers and silicone gels is found in their cross-linking and fluid-retention properties. Silicone elastomers are extensively cross-linked and contain little free polydimethylsiloxane (PDMS) fluid, which allows them to retain their fluid under stress. Silicone gels have lightly cross-linked polysiloxane networks, swollen with PDMS fluid that causes them to bleed on loading.


Materials with a high COF allow the liner to “stick” to the skin better, reducing localized shear stress in the form of slippage between the socket and residual limb, which improves fit during prolonged ambulation. This improvement is important for all amputees, but especially for those with weak or adherent tissue. Compressive stiffness should be chosen based on the soft-tissue properties of the residual limb. An individual with a bony residual limb would benefit from a liner with low compressive stiffness, which would allow the residual limb to “sink” further into the liner and provide more cushioning. However, an individual with excessive soft tissue would gain a better sense of control from a liner with high compressive stiffness.


Shear stiffness is another important material property to consider in prosthetic liners. Low shear stiffness allows the residual limb to sink further into the prosthetic socket, whereas high shear stiffness does not. For the individual with a bony residual limb, a liner with low shear stiffness would allow a safe amount of movement between the residual limb and the socket during weight bearing. Tensile stiffness must also be considered: high tensile stiffness improves suspension and fit by decreasing pistoning, or slippage. As illustrated in the earlier examples, there are many factors to consider when selecting liners and sleeves for the individual with a lower-limb amputation. Through advances in material science, most amputees can be comfortably and functionally accommodated through comprehensive clinical evaluation by an experienced clinician.
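The selection reasoning above can be sketched as a simple rule set. The function, its inputs, and the qualitative "high/low" recommendations below are hypothetical simplifications of the text's clinical reasoning, intended only as an illustration and not as prescription criteria.

```python
# Illustrative sketch of the liner-selection heuristics described above.
# All names and rules are hypothetical simplifications, not clinical guidance.

def recommend_liner_properties(bony_residuum: bool,
                               weak_or_adherent_tissue: bool) -> dict:
    """Map residual-limb characteristics to preferred liner material properties."""
    props = {
        # A high COF lets the liner "stick" to the skin, reducing slippage
        # and localized shear.
        "coefficient_of_friction": "high",
        # High tensile stiffness improves suspension by reducing pistoning.
        "tensile_stiffness": "high",
    }
    if bony_residuum:
        # Low compressive and shear stiffness let a bony limb "sink" into
        # the liner for cushioning and allow a safe amount of movement.
        props["compressive_stiffness"] = "low"
        props["shear_stiffness"] = "low"
    else:
        # Excessive soft tissue gains a better sense of control from a
        # liner with high compressive stiffness.
        props["compressive_stiffness"] = "high"
    if weak_or_adherent_tissue:
        # High-COF materials matter most for weak or adherent tissue.
        props["note"] = "prioritize high-COF material to limit shear"
    return props

print(recommend_liner_properties(bony_residuum=True, weak_or_adherent_tissue=False))
```

In practice these trade-offs interact, which is why the text emphasizes comprehensive evaluation by an experienced clinician rather than any fixed rule.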




Socket


The prosthetic socket is generally considered the most important component of a prosthesis. As a human-prosthesis interface, the socket should be designed properly to achieve satisfactory load transmission, stability, and efficient control for mobility. The functionality of a socket extends beyond simply accommodating the load or forces of the anatomy of the residuum in a comfortable manner. For example, when an able-bodied person takes a step, signals from the central nervous system (CNS) stimulate the muscles to produce a biomechanically efficient gait pattern, and the body operates on a feed-forward system to adapt to obstacles and varied terrain. There is little conscious effort. When a person with limb loss takes a step, the musculature of the residual limb develops a compensatory contraction strategy to first create a closed kinetic chain environment within the socket for anatomic stability, and then a series of contractions to maintain prosthetic control during functional movements. The configuration of the socket’s structural design has been reported to influence the length/tension ratio of muscle, the movement of the femur, and movement of the residual limb, all of which would affect gait and other gross functional movements.


Another important consideration in socket design is the understanding that the skin and underlying soft tissue of the residual limb are not designed to sustain the range of pressure variations and repetitive forces encountered during prosthetic use. As the understanding of residual-limb anatomy and soft-tissue biomechanics has evolved, socket design has progressed to provide the most effective transfer of forces from the prosthesis through the residual limb, supporting prolonged activity without damaging skin or soft tissue or producing discomfort.


Because each residual limb is unique and prone to change over time, some consider prosthetic socket design to be as much an art as it is a science. However, many within the profession believe that there is a need for the fabrication of well-fitting sockets that can be replicated and quantified.


With the development of computer-aided design and computer-aided manufacturing (CAD/CAM) technology, computational modeling has become a tool to generate a quantitative mathematical model of the residual limb during the time of imaging. Modifications can be made to the image, similar to the modifications made on a plaster positive mold. The computer models that are made and stored electronically by the clinician can be used as a reference for changes to the residual limb over time. Currently, more robust CAD/CAM systems are being developed that use computerized tomography (CT) scans or magnetic resonance imaging (MRI) in addition to the mechanical or laser input modeling used today. Furthermore, with additional computer-generated computations, values with respect to load transfer between the socket and the residual limb for the purpose of optimal socket design and objective evaluation of the fit may be incorporated as diagnostic tools. Although the use of CAD/CAM technology has increased the repeatability of socket design, inconsistencies in the software and carving hardware continue to complicate the issue.


The fluctuation in residual-limb volume is typically greatest during the first year postamputation as the limb heals from surgery and the muscle first atrophies, and then often hypertrophies, as prosthetic use and restoration of function increase. After the first year, residual-limb volume stability varies among amputees: some people have little change in volume, whereas others fluctuate during the course of the day. Designing a socket shape and method of suspension for the potential short- and long-term changes in residual-limb volume has long been a challenge for prosthetists.

During ambulation, the continual changes in forces through the prosthetic limb during stance and swing subject the residual limb to a repetitive cycle of positive and negative pressures. Positive pressures during the stance phase are believed to result from compression of the soft tissues, occluding the small lymph and blood vessels and moving fluid out of the residual limb’s periphery. Conversely, negative pressures are present within the socket during swing phase, when the socket walls are drawn away from the residuum, allowing the same fluids to fill the small vessels of the limb’s periphery. Most amputees experience some degree of volume change during activity, with some active amputees experiencing significant volume change (usually a reduction) that drastically alters socket fit and stability.

To better manage, or even reduce, these consequential changes in limb volume, Carl Caspers developed the concept of using a vacuum to maintain a more constant environment within the socket. A small pump creates a vacuum, producing a more constant environment of negative pressure by drawing the skin and soft tissues to the walls of the socket. By reducing the positive forces that occlude the vessels and allowing fluids to move through the soft tissues at a continual rate, a theoretical state of homeostasis is created within the residuum while in the socket. Conceptually, by maintaining constant limb volume throughout the day, there is no longer a need for stump socks, and, because of the improved fit and socket suspension, the amputee has better comfort. Board and colleagues found that positive pressures during stance were significantly lower, and negative pressures during swing phase were significantly greater, with the vacuum-assisted socket, shifting the balance of this fluid movement from a net loss to a net gain. A reduction in positive pressure has been found to reduce skin irritation and tissue breakdown. The increased negative pressure has been hypothesized to improve circulation within the residual limb, increasing nutrition to the tissues and facilitating improved tissue health or more rapid healing if a wound is present.


The Harmony (Otto Bock), the first commercially available vacuum socket system, uses a small pump attached to the prosthesis to maintain a vacuum of 508 mm Hg (20 inches of mercury). Currently, there are several other custom-fabricated and commercially available negative-pressure or vacuum-assisted socket systems for clinicians to choose from. Because few scientific data are available regarding long-term use or the behavior of negative-pressure sockets on dysvascular residual limbs and in other potentially at-risk populations, patients fitted with these socket systems should be monitored carefully for adverse events.
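The vacuum level above is quoted in two units; a one-line conversion confirms that the figures agree (by definition, 1 inch = 25.4 mm):

```python
# Convert the Harmony system's quoted vacuum level from inches of mercury
# to millimetres of mercury (exact definition: 1 inch = 25.4 mm).
IN_TO_MM = 25.4

def inhg_to_mmhg(inches_hg: float) -> float:
    return inches_hg * IN_TO_MM

print(round(inhg_to_mmhg(20.0), 1))  # 508.0, matching the 508-mm Hg figure
```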


Research is currently being conducted on sensors embedded in socket liners that will automatically adapt to fluctuations in body volume. Self-adjusting sockets could make the device more comfortable and prevent sores, bruises, and other complications. The leg socket will adjust to the changing diameter of the wearer’s residual limb during the course of a day.




Prosthetic knees


The Microprocessor Knee


The development of microprocessor technology in prosthetics dates back to the late 1940s, when Professor John Lyman at the University of California, Los Angeles introduced the first microprocessor prosthesis with integrated circuits for upper-limb prosthetics. Upper-limb amputees have commonly used myoelectric prosthetics, but not until the 1990s was microprocessor technology successfully transferred to lower-limb amputees. Microprocessor controls used in prosthetic knees rely on sensors to continuously monitor knee position, time, velocity, and the forces and moments during ambulation. The microprocessor then calculates comparisons between steps and routinely adjusts the resistance that controls the mechanical knee. Microprocessor knee (MPK) units all work on the same principles and differ only in the medium used for cadence control and in microprocessor speed. The resistance for swing or stance is maintained by different cadence-control media, depending on the manufacturer. The 3 cadence-control media commonly used in MPKs are pneumatics, hydraulic fluid, and magnetorheological fluid. Sensors located within the knee unit or pylon determine the load or force placed through the prosthetic limb during stance. The microprocessor then adjusts the flow of fluid for the required knee stability, especially when walking down ramps or on uneven terrain. As the amputee rolls over the prosthetic foot, the microprocessor senses that the limb is moving into swing and adjusts the fluid resistance to accommodate the speed of ambulation. Most MPKs use “step averaging,” adjusting the knee resistance based on the last few steps, so that, as the amputee walks slower or faster, prosthetic knee flexion remains equal to that of the anatomic knee ( Table 1 ).
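The step-averaging strategy described above can be sketched as a moving-average control loop. The window size, the class and method names, and the linear resistance mapping below are illustrative assumptions, not the control law of any actual MPK product.

```python
from collections import deque

class StepAveragingController:
    """Toy sketch of MPK cadence control: fluid resistance for the next
    swing phase is set from the average duration of the last few steps.
    The window size and resistance mapping are illustrative assumptions."""

    def __init__(self, window: int = 5):
        # deque(maxlen=...) automatically discards the oldest step,
        # keeping only "the last few steps" as the text describes.
        self.recent_step_durations = deque(maxlen=window)

    def record_step(self, duration_s: float) -> None:
        """Sensor input: duration of the step just completed, in seconds."""
        self.recent_step_durations.append(duration_s)

    def swing_resistance(self) -> float:
        """Shorter steps (faster cadence) -> lower fluid resistance so the
        knee flexes and extends quickly; slower cadence -> higher resistance."""
        if not self.recent_step_durations:
            return 1.0  # default resistance before any steps are recorded
        avg = sum(self.recent_step_durations) / len(self.recent_step_durations)
        # Hypothetical linear mapping: a 1.0-s average step -> resistance 1.0.
        return avg / 1.0

ctrl = StepAveragingController()
for d in (1.2, 1.1, 1.0, 0.9, 0.8):  # the amputee gradually walks faster
    ctrl.record_step(d)
print(round(ctrl.swing_resistance(), 2))  # 1.0 (average of the last 5 steps)
```

Real MPK firmware adds stance-phase load sensing and per-manufacturer tuning on top of this basic loop, but the averaging idea is the same: each new step nudges the resistance toward the wearer's current cadence.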


Apr 19, 2017 | Posted in PHYSICAL MEDICINE & REHABILITATION
