In an attempt to cure myself of writer’s block for an assignment I have with the Open University, I produced this piece of writing earlier today. Under the rules of the OU I am not allowed to publish any assessed work, but as this is a writing exercise and bears no resemblance to an academic essay, I am confident I won’t be in trouble for this! It is written in an easy, conversational style so should be easy to read. I’d appreciate any feedback you may have for me. Read on!
There were massive changes in surgery from the turn of the 19th Century through to the early days of the 20th, none more dramatic than in the figure of the surgeon himself. From the lowly manual labourer of the Middle Ages to the heroic, swashbuckling (well, scalpel-wielding) knight in heat-treated gown and mask of the modern era, the surgeon enjoyed a massive rise in status, recognition and power during the 19th Century.
It all began when medical research taught practitioners that there was more going on inside the body than was previously thought. That balancing the humours – that age-old solution to all things diseased – was not actually very good at curing people of their ills at all. Nor were bleeding and purging, which had for centuries been the remedies administered for everything from diarrhoea to dementia. At best these things either hurried along the end of a life, or made it more comfortable while nature took its course.
It began to dawn on people that something else was going on inside the body, thanks to the anatomical investigations of eminent practitioners such as Vesalius, eventually leading to the notion of “localisation of disease” in the late 18th and early 19th Century. Instead of the body being seen as one whole, interlinked mechanism whose parts relied on one another to function properly, it came to be seen as a series of organs and tissues that hosted disease. Cut out the rotten organ and you would solve the problem of disease. Only, no sooner was that theory imagined than others imagined the localisation of disease at an even smaller scale: disease came to be seen as centred in the cells of the body’s organs – a vision made possible only by the invention of tools and instruments to help see them. After all, before the microscope was invented, who knew that organs and tissues were made up of smaller bits called cells?
So, after centuries of one way of thinking about the human body, a whole new school of thought emerged. Largely because of the overwhelming influence of Enlightenment thinkers in Europe, science and reason were the flavour of the day, and traditional medical and anatomical thinking was completely overhauled. No longer was teaching in universities across Europe reliant on simply passing down the same old knowledge year after year. No longer were surgeons content to be the hand-maidens of the physicians who ordered the lopping off of various limbs in order to cure disease. And no longer would the patient be central to his or her own health situation.
With the age of science and reason came the age of medical professionalism, and with the age of medical professionalism came the change-over in patient power.
It was in this climate that surgeons began to dissociate themselves from the barbers. They were no longer content to perform surgery only on the outside of the body in emergencies, or to remove bladder stones (painfully) in people’s homes. Nor were they content with apprenticeships instead of a formal, classical university education on a par with the physicians, who had always been seen as the elite. Some barber-surgeons stuck with their traditional skills but added to them by taking extra qualifications in universities so they could remain in work. Some were helped in their decision to retrain and gain extra skills by municipal authorities, such as those of Wuttengen, which sought to restrict the barber-surgeons’ responsibilities and jurisdictions to such an extent that they were more or less made redundant. Either way, surgeons were on the up – demanding better education and better recognition for the work that they did.
With new levels of education available to them, surgeons began to experiment with what they could do in terms of operations on the human body. Until the end of the 18th Century they were pretty much confined to the same kind of procedures – amputations, blood-letting, lithotomy – all of which took place on the outside of the body. But with more anatomical knowledge and tools such as the microscope to help them, they began to be bolder in their work.
They became bolder in their teaching methods too, using the lecture theatres in the medical faculties of universities to pack in the students to watch surgery being performed on living patients. Previously, anatomy lessons would have been delivered in this way using cadavers in various stages of decomposition and dismemberment, but with the new thirst for surgery came new uses for the theatrics of the lecture hall. It was no use trying to teach surgery to masses of students in people’s homes – how would they all fit round the kitchen table? – so into the lecture theatre came the bloody and noisy business of surgery.
Christian Albert Theodor Billroth
Ovariotomies, tonsillectomies, removals of goitres and mastectomies all became established routines, as did resective operations in which diseased tissue or bone was removed (the localisation of disease in action). Some surgeons made their name with a particular operation. Theodor Billroth, for example, designed a procedure in which the stomach and the bowel were in effect “replumbed” when diseased – the “Billroth I” as it is known. Amputations still accounted for much of the surgeon’s work, especially as the boom in industry during the 19th Century saw a sharp increase in traumatic injuries to the limbs. In the days before anaesthesia, when speed was of the essence, individual surgeons built their reputations on performing them within minutes. Robert Liston was renowned for his speed: he once reputedly amputated the injured leg of a patient and tied off the blood vessels neatly within 30 seconds at the London Hospital, surprising not only the patient but one of the assistants, who suddenly found himself holding a mass of injured tissue in the shape of a detached, mangled bit of leg.
Of course, the introduction of ether as an anaesthetic in the 1840s was a huge step forward in terms of pain relief for the patient – after all, if he is unconscious how can he feel anything? – but the main danger of surgery remained. Anaesthesia may have enabled the surgeon to take longer over a procedure, meaning he could control blood-loss and therefore shock in his patient, but until he adopted the hygiene routines described by Lister from the 1860s, infection remained the biggest factor in post-operative mortality. Before the introduction of anaesthetics and antisepsis, surgery tended to be performed only on patients who seemed likely to die anyway, so infection control was not a high priority. As surgery was a last resort for many, infection and shock seemed secondary to the outcome of the operation, which was probably being carried out either to demonstrate the surgeon’s skill to his large, student-filled auditorium or as an experiment to see whether a certain procedure would solve a certain problem.
So while anaesthesia made lengthier (and deeper abdominal, cranial and thoracic) surgery possible, it did not contribute greatly to the outcome for the patient. In fact, the use of ether was a contributory factor in some deaths following surgery: it could poison the patient, irritating the lungs or bringing about cardiac arrest. Some surgeons disliked anaesthesia because, in their eyes, pain was an indicator of life: “to have an absence of pain meant an absence of life-force”. To hear the screams of your patient whilst you were cutting into his bowels was a sure sign that he was still alive and you could carry on.
Eventually ether was replaced by chloroform, which had fewer side-effects for the patient. It was embraced by many for various operations, but it was only when Queen Victoria used it for the birth of her son Leopold – her eighth child – that it became widely accepted right across the medical spectrum, though still with a few reservations. For many, if it was good enough for Victoria after all those deliveries, then by gum it was good enough for them!
Whilst some surgeons remained wary of using either ether or chloroform, others took full advantage of having unconscious patients on which to operate, leading to accusations that they were indulging their “lust” for cutting. Some surgeries were deemed unnecessary, but as opinion varied so widely on what “necessary” surgery was, consensus was never going to be easy to reach. There is evidence that some surgeons actually refused to operate on certain patients because they believed it would not solve their health problems, but others seemed to have an insatiable desire to operate. It was not unknown for surgeons to go from patient to patient cutting and stitching with the same instruments and in the same clothes – a practice which, though remarked upon at the time, contributed greatly to the spread of infection following surgery. Hospitals at that time became known as “gateways of death” because of their high mortality rates.
Sir Joseph Lister
But back to our man Lister and his theories about infection control. There had been other observations about infection after surgery and childbirth, but nobody really had any idea what to do about it. Take the case of the Viennese hospital where two wards were used for childbirth: one run by student surgeons, the other by midwives. The post-partum mortality rate on the ward run by student surgeons was strikingly high, yet on the ward run by midwives it was negligible. To test why this should be so, the hospital authorities asked the two groups of professionals to switch wards for a period of time. Sure enough, the mortality rates followed them: deaths rose with the student surgeons on the “midwife” ward and fell when the midwives took over the student surgeons’ ward.
It was soon discovered that the midwives had a different hygiene routine from the student surgeons, including washing their hands frequently and making sure patients’ bed-linen was changed regularly. The student surgeons were not as conscientious with their hygiene and would examine patients one after another without washing their hands or changing their aprons after attending autopsies or other infected patients. It seems obvious to us now, but they did not know that bacteria were the cause of infection, and that washing with soap would prevent cross-infection between patients…both the living and the dead ones!
This observation formed the basis of Joseph Lister’s trials with substances to eliminate bacteria from wounds. He discovered that chemicals were effective at keeping things clean – including the air, when sprayed around the operating room – but that heat treatment of linen and instruments was more effective at destroying bacteria in the first place. From this we get antisepsis – the removal of bacteria – and asepsis – the maintenance of sterility.
Lister was criticised for his concentration on wounds, and because the equipment needed to sterilise the air was cumbersome and caustic to the surgeon’s skin. Gradually he refined his advice on the basis of further experiments and trials within surgery, and the notion that infection was a killer, but one that could be controlled, was largely taken up across the board. Operations in university lecture theatres died out as surgery moved more and more into hospitals, where purpose-built rooms could accommodate the necessities of hygiene. No more wooden floors to harbour bacteria, no more crowds of jostling students, no more skylights in the roof trying to make the most of the daylight to see by. In their place came tiled walls and floors in small, observer-free rooms that could be kept clean; sterilised instruments on metal trolleys capable of being steam-cleaned; and private, dignified spaces in which patients underwent surgery for therapeutic reasons, not as living examples of what a surgeon could do to them in the shortest possible time.
Large, open surgical procedures with masses of spectators in universities gave way to small, private affairs in hospital operating rooms with no spectators. The rolled-up sleeves and blood-stiffened gowns and aprons of the heroic surgeon of the early 19th Century gave way to the quiet calmness of the sterile-gowned and masked surgeon of the early 20th. The agonised screams of a patient held down by “dressers” while the surgeon lopped off his leg gave way to the subdued hissing of the anaesthetist’s pump, in dignified silence, as the surgeon got to work demurely with the patient obscured by sterilised sheets.
Surgery and its performers, its recipients, its teachers, its students, its techniques, its expectations, its location and its focus all came a long way from the early days of rudimentary cutting, where a life saved was a bonus, to the sophisticated and dignified antiseptic and pain-free (but not side-effect free) days of the early 20th Century.
Emil Theodor Kocher
Nobel Prizewinner 1909
The rise of the surgeon can be summed up in the figure of Theodor Kocher, who was awarded the Nobel Prize for Medicine in 1909 for his work on the physiology, pathology and surgery of the thyroid gland, following years of research and experimental surgery on goitres and the thyroid.
Not bad for a profession that, not so long before, had been little more than a bunch of apprenticed saw-bones, eh?