Jason T. Miller, BS; Scott Y. Rahimi, MD; Mark Lee, MD, PhD
Neurosurg Focus. 2005;18(4):1-5.
"Certainly infections cannot be attributed to the intervention of the devil but must be laid at the surgeon's door" Harvey Cushing[5]
Before advancements in infection control, only conditions that brought patients near death warranted the risk of surgical intervention. If patients survived the operation, infection was nearly inevitable and death from overwhelming sepsis was knocking at their door. In the late 19th century, with the development of germ theory by Louis Pasteur and its subsequent application to surgical sterility by Joseph Lister, surgeons were able to operate with a substantially reduced risk of infection. Consequently, surgeons became more confident and began to explore more ambitious procedures, including elective operations within the cranial vault. As scientific knowledge expanded in the 20th century, so did the advancement of infection control, with the use of prophylactic antibiotic drugs, heat sterilization of instruments, and microbial barriers. Recent reports have placed the rate of complications due to infection between 0.75% and 2.32% for intracranial operations.
The late 19th century brought about advances in anesthesia, hemostasis, localization techniques, and infection control that reduced the risks associated with surgical treatment of intracranial masses to acceptable levels. Even after the development of tumor localization, surgeons approached the sanctuary of the mind with trepidation for fear of deadly complications. Infection was the major contributor to morbidity and mortality rates, occurring after practically all operations and taking the lives of almost half of all surgical patients.[1] Consequently, surgical procedures were attempted only as a last resort. The application of germ theory to wound healing changed the practice of surgery and set the stage for the development of brain tumor surgery.
The earliest known attempts to access the calvaria date to circa 10,000 BC. Skulls collected throughout the world show the square and ovoid marks left by the ancient "healers" who ground and scraped the skull with sharpened stones. Some patients survived, as evidenced by the smoothed corners of some Neolithic skulls found in East Africa.[4,17] Nevertheless, it may be assumed that the mortality and morbidity rates were less than acceptable by today's standards.
Millennia after the Neolithic era, surgeons may have taken a more conservative approach to opening skulls. Under the guidelines imposed by the Code of Hammurabi (circa 1750 BC), surgeons in Mesopotamia were provided monetary compensation for their services; however, if the patient died under their care, the surgeon's hands were amputated.[7] With these stakes, many surgeons may have limited their caseload of elective surgical procedures. Around this time the first account of attempted wound management appeared: the world's oldest medical text outlines the procedures for wound management practiced by the Sumerians. The wound was cleansed with beer and then bandaged with a cloth soaked in wine and turpentine.[4] The use of alcoholic beverages and turpentine would remain the treatment of choice until the modern era.[1]
The proliferation of medical texts originating from the Golden Age of Greece would provide guidelines for the practice of medicine for 2000 years. Nevertheless, because of the intellectual rigidity of the medical community, there would be little expansion on the ideas and practices of the Greeks for the same two millennia.[7] Every few hundred years, courageous men dedicated to the practice of the healing arts would challenge the false assertions and even produce verifiable data to support their claims, but their ideas would not wield the influence necessary to cause change. In the practice of wound healing, two issues would become recurring points of contention: first, the role of suppuration; and second, the origin and transmissibility of infection.
Hippocrates (circa 460–370 BC) may have been the first to hold an opinion on suppuration, asserting that the formation of pus was not a natural component of the healing process and that suppuration should be avoided. His recommendations for managing wounds were similar to those of the Sumerians: cleansing with wine, applying a bandage, and then pouring wine on the bandage.[4]
Claudius Galen (circa A.D. 130–200), a surgeon to the gladiators in Pergamum, idolized Hippocrates and championed Hippocratic doctrines in the practice of medicine. Galen was a prolific writer on the science of medicine and became an outspoken proponent of experimentation, encouraging the questioning of established doctrines to expand scientific knowledge. (Ironically, his ideas became an established orthodoxy and prevailed unquestioned for 15 centuries.) His works were translated into many languages and became the lexicon for medical practice until the modern era. Many of his assertions proved true; however, one very important assumption was horrifically incorrect: that the formation of pus was essential for wound healing. This deviation from Hippocratic dogma would plague surgeons and hinder surgical progress until the time of Lister.[1]
One thousand years later, Theodoric Borgognoni of Cervia (1205–1298) challenged Galen's view of suppuration. Theodoric dedicated much of his career to finding the ideal conditions for wound healing and settled on four essentials: control of bleeding, removal of contaminated or necrotic material, avoidance of dead space, and careful application of the wound dressing. He also strongly emphasized the avoidance of "laudable" pus.[16] Nevertheless, because his views were in opposition to the established orthodoxy of Galen, he was denounced by his colleagues and the church.[4] Galen's doctrine of suppuration would remain the rule for wound management until the late 19th century.
That surgeons welcomed the sight of a purulent wound may be explained by the environment in which they were forced to practice. Wounds could be classified into two categories: those with suppuration and those without. Wounds productive of a creamy, yellow ooze tended to run a chronic course, taking months to heal, but the patients were generally free of systemic symptoms.[15] As Alexander[1] recounts, it was noted by Stephen Smith as late as 1887 that "amputation wounds rarely, if ever, recovered at Bellevue, except after long-continued suppuration." Conversely, a thin, watery discharge was associated with a fatal outcome, with the patient dying of sepsis within days. With an infection rate of almost 100%, a purulent wound represented the lesser of the two evils. It is therefore little wonder that even the most conscientious surgeons preferred and even encouraged the formation of pus.
Another issue that proved elusive to science before the 19th century was the origin and transmissibility of infection. As recounted in Alexander,[1] more than a century before Leeuwenhoek's microscope and three centuries before Pasteur's studies on putrefaction, Hieronymus Fracastorius (1478–1553) postulated that infectious disease was caused by invisible living seeds (seminaria contagionum). In his work "De contagione," published in 1546, he described three modes of disease spread: direct contact with infected persons, indirect contact with fomites, and airborne transmission (see Meade).[11] As Hamby[8] reported, Ambroise Paré (1510–1590), considered the father of modern surgery, similarly believed infection was introduced from the environment. Furthermore, others after Fracastorius and Paré correctly noted the importance of a sterile environment in the prevention of disease transmission. In 1822 Gaspard demonstrated the pathogenicity of suppuration by injecting pus into a dog and, when that dog fell ill, injecting its blood into another animal, causing death.[1] According to two histories of surgery,[11,17] in 1842 Oliver Wendell Holmes of Harvard recommended that physicians wash their hands with a calcium chloride solution to prevent the spread of infection from the autopsy rooms to the wards. Similarly, Ignaz Philipp Semmelweis (1818–1865), in his attempt to universalize the practice of hand washing, reported that hand washing with chloride of lime solution reduced puerperal sepsis mortality from 9.92% to 1.27% in 2 years.[1,11] The views of both Holmes and Semmelweis encountered a cold reception from the medical community.
As a result, operations were performed with little regard for a sterile environment. Surgeons' hands, rarely washed, were placed directly into the patient's wounds. Frequently, onlookers were encouraged to "take a feel" for educational purposes.[1] Surgical instruments were crudely wiped, placed back into their velvet carriers, and reused, some having been sharpened on the sole of the surgeon's boot.[15] The floors of the surgical wards were covered with human feces, urine, blood, and pus, and the hospital walls displayed a collage of phlegm. Consequently, infection was a major cause of death, with 80% of operations plagued by "hospital gangrene" and a nearly 50% mortality rate.[1] The stench of dead bodies and infectious byproducts led some to believe that putrid wounds were caused by particles in the air or bad "humors." In 1880, William Halsted reportedly operated in tents outside of Bellevue Hospital for better ventilation.[17] The prevailing view was that if infectious particles did exist, then they arose by spontaneous generation.
Louis Pasteur (1822–1895) vanquished the long-held myth of spontaneous generation and attributed fermentation and meat putrefaction to living microscopic organisms. It was the simplicity and rationality of his experiments that persuaded many of his contemporaries to adopt germ theory.[7,11]
Joseph Lister (1827–1912; Fig. 1), a professor of surgery at Glasgow, was the first to see the connection between Pasteur's discovery of the fermentation process and the suppuration of wounds. In April 1867 he published his groundbreaking paper on antisepsis, stating that "all the local inflammatory mischief and general febrile disturbance which follow severe injuries are due to the irritating and poisoning influence of decomposing blood or sloughs." Lister began applying carbolic acid to compound fracture wounds. The wounds healed without suppuration, amputations were averted, and the mortality rate from amputation plummeted from 45% to 15%.[1]
Photograph of Joseph Lister (1827–1912). (Courtesy of Historical Section of the National Library of Medicine, Bethesda, Maryland.)
In 1876, Lister traveled to the US to present his ideas at the International Medical Congress in Philadelphia. In attendance was William W. Keen (1837–1932; Fig. 2) of Jefferson Medical College in Philadelphia, who had garnered a formidable reputation in cranial surgery. Keen was one of the few surgeons who realized the practical importance of infection control, and he became one of the first American surgeons to implement Lister's system.[7] The following is a description of Keen's surgical setup:
Photograph of William Williams Keen Jr. (1837–1932), a pioneer American neurological surgeon. (Courtesy of Historical Section of the National Library of Medicine, Bethesda, Maryland.)
All carpets and unnecessary furniture were removed from the patient's room. The walls and ceiling were carefully cleaned the day before operation, and the woodwork, floors, and remaining furniture were scrubbed with carbolic solution. This solution was also sprayed in the room on the morning preceding but not during the operation. On the day before the operation, the patient's head was shaved, scrubbed with soap, water, and ether, and covered with a wet corrosive sublimate dressing until the operation, when the ether and mercuric chloride washings were repeated. The surgical instruments were boiled in water for 2 hours, and new deep-sea sponges (elephant ears) were treated with carbolic and sublimate solutions before use. The surgeon's hands were cleaned and disinfected with soap and water, alcohol, and sublimate solution.[15]
Improving on Listerian practices, in 1891 Ernst von Bergmann introduced heat sterilization of instruments, which proved superior to chemical sterilization. Sterile gowns and caps were introduced in 1883 by Gustav Neuber of Kiel, and the surgical mask by Mikulicz in 1897.[11] The use of rubber gloves became widespread after 1890, when William Stewart Halsted (1852–1922; Fig. 3) commissioned the Goodyear Rubber Company to fashion gloves for his nurse to protect her hands from the mercuric chloride solutions used to disinfect the instruments.[13]
Photograph of Harvey Cushing and William Halsted in the operating room. (Courtesy of Historical Section of the National Library of Medicine, Bethesda, Maryland.)
Although principles of cleanliness and sterility were slow to become universally accepted, the surgeons who implemented them found new confidence to explore more complicated procedures. As localization of intracranial lesions became more refined, surgical intervention emerged as a more plausible option. In 1879, while the application of Lister's principles was still in its infancy, Glasgow's Sir William Macewen (1848–1924) removed a meningioma, after which the patient lived for 8 more years, eventually dying of Bright disease. By 1888, Macewen had reported on 21 neurosurgical operations, with three deaths and 18 successful recoveries, results that he attributed to cerebral localization and good aseptic technique.[7] In 1884, Sir Rickman Godlee (1849–1925) removed the first parenchymal tumor diagnosed and localized solely by neurological evidence.[18] The patient survived the procedure with only mild neurological complications but eventually died of postoperative infection. Four years later, Keen became the first American surgeon to operate electively on the brain, removing a meningioma in a young man who recovered with minimal deficit and went on to live another 30 years. When the man's body was returned to Keen for autopsy, there was no evidence of tumor recurrence.[15]
Harvey Cushing (1869–1939), under the tutelage of William Halsted, became committed to precision and meticulous surgical technique, producing phenomenal results. In 1915, he reported an 8.4% mortality rate among 130 surgically treated tumor cases. Of these deaths, only one was due to infection; the patient died of streptococcal meningitis on the sixth postoperative day. With results far surpassing those of his contemporaries and rivaling even today's morbidity and mortality figures, Cushing explained his success this way:
[Our results] depend so greatly on such details as perfection of anaesthesia, scrupulous technique, ample expenditure of time, painstaking closure of wounds without drainage, and a multitude of other elements, which so many operators impatiently regard as triviality.[5]
Jason T. Miller, BS; Scott Y. Rahimi, MD; and Mark Lee, MD, PhD. Department of Neurosurgery, Medical College of Georgia, Augusta, Georgia.