Summary Article: modern medical technology
from The Hutchinson Unabridged Encyclopedia with Atlas and Weather Guide

Medicine of the 20th century saw an acceleration in the acquisition of knowledge and techniques, and an improvement in public health, although new challenges emerged. Technical developments offered new ways of viewing inside the body and its cells, aiding diagnosis; antibiotics were discovered; and new vaccines, drugs, and therapies were developed. Computerization of research greatly enhanced the study of disease and health hazards. New surgical techniques included transplant and keyhole surgery. However, some developments produced further problems: thalidomide, for instance, prescribed for sickness in pregnancy, induced birth defects. Research had to be increasingly regulated, with stringent clinical trials to assess safety. Greater understanding of genetics also brought ethical problems, particularly those concerned with genetic engineering.

Public health

At the beginning of the 20th century in Britain, the government took responsibility for public health, attempting to provide access to clean water and sanitation as well as improved housing. Thereafter, steps were progressively taken to tackle the widespread poverty that made it impossible for millions of people to secure the basic necessities of a healthy lifestyle.

Between 1906 and 1914 the Liberal government passed a series of acts to provide people with access to basic health facilities, and to give them enough money to support themselves in times of illness or old age. This marked a fundamental change in attitude on the part of the government. The poor were no longer left to suffer in poverty; instead, the government used tax revenues to improve their standard of living.

Following the end of World War I in 1918, the British government adopted the phrase ‘a country fit for heroes’ to describe the improvements that would be made to the lives of those who had fought for their country. A Ministry of Health to co-ordinate all government health work was set up in 1919. The Housing and Town Planning Act of 1919 made money available to build new council houses to replace the slums of Britain's cities. Steps were taken to enhance pensions, unemployment benefits, and healthcare between world wars I and II. However, the years after World War II saw the final development of a universal public health system in Britain. The setting up of a national health service (NHS) and the welfare state after 1945 represented the completion of the public health programme begun in the middle of the 19th century. Although the NHS faced continuous funding problems following its launch in 1948, the principle of government responsibility for providing national healthcare remained intact. See also public health (history).

Viewing the body

The discovery of X-rays in 1895 by Wilhelm Conrad Röntgen opened the modern era of technological medicine. The use of X-rays on the battlefields of World War I led to the rapid acceptance of this new diagnostic technique. Surgeons and doctors returning to their civilian hospitals continued to use the tool to advance their medical practice. Until the dangers of exposure were more fully documented and understood, X-rays and radioactivity were even used to treat minor medical conditions, such as acne. After World War II regulation became ever stricter, and X-ray diagnostics were reduced to the minimum possible dosage.

Other imaging techniques – ways of seeing into the body – were also developed in the 20th century. Wartime radar research was translated into ultrasound scanning, a high-frequency sound-wave technology, which became widely used in the monitoring of fetuses within the womb and to detect disease.
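
The arithmetic behind pulse-echo scanning is straightforward: a reflector's depth follows from the time its echo takes to return, given an assumed speed of sound. A minimal sketch in Python, assuming the conventional soft-tissue average of about 1,540 m/s (the echo time below is an invented example value):

```python
# Pulse-echo depth estimation: the transducer times how long an echo
# takes to return and converts that time to distance. The speed of
# sound here is the standard soft-tissue average assumed by scanners.
SPEED_OF_SOUND_TISSUE = 1540.0  # metres per second (assumed average)

def echo_depth_m(round_trip_time_s: float) -> float:
    """The pulse travels to the reflector and back, so halve the path."""
    return SPEED_OF_SOUND_TISSUE * round_trip_time_s / 2.0

# An echo returning 65 microseconds after transmission (invented value)
# places the reflector roughly 5 cm deep.
print(f"depth = {echo_depth_m(65e-6) * 100:.1f} cm")  # ~5.0 cm
```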

The 1970s saw the development of CAT scanning (computed axial tomography) by the British engineer Godfrey Hounsfield and the South African-born US physicist Allan Cormack. Magnetic resonance imaging (MRI) produced images of the body by beaming radio waves into a patient lying in a strong magnetic field. Another consequence of the discovery of radioactivity by Antoine-Henri Becquerel in 1896 was the development of radioisotopic scanning (the use of radioactive isotopes for diagnostic and therapeutic purposes, especially for cancer), creating the new speciality of nuclear medicine.
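
The ‘computed’ in computed tomography is the reconstruction step: the scanner records X-ray attenuation along many rays at many angles, and a computer assembles the cross-sectional image from those projections. A minimal Python sketch of the simplest reconstruction idea, unfiltered backprojection (real scanners filter the projections first; the phantom, angles, and ray model here are illustrative assumptions):

```python
# Toy computed tomography: project a 2-D "slice" at many angles, then
# smear each 1-D projection back across the image plane and sum.
# Without the ramp filter used in real scanners, the result is a
# recognizable but blurred reconstruction. Requires numpy and scipy.
import numpy as np
from scipy.ndimage import rotate

def make_phantom(n=128):
    """A circular 'body' with a denser off-centre inclusion."""
    y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
    body = ((x**2 + y**2) < 0.8**2).astype(float)
    inclusion = (((x - 0.3)**2 + y**2) < 0.15**2) * 2.0
    return body + inclusion

def project(image, angle_deg):
    """Sum attenuation along parallel rays at one gantry angle."""
    return rotate(image, angle_deg, reshape=False, order=1).sum(axis=0)

def backproject(projections, angles_deg, n):
    """Smear each projection back along its ray direction and sum."""
    recon = np.zeros((n, n))
    for p, angle in zip(projections, angles_deg):
        smear = np.tile(p, (n, 1))  # constant along each ray
        recon += rotate(smear, -angle, reshape=False, order=1)
    return recon / len(angles_deg)

n = 128
phantom = make_phantom(n)
angles = np.linspace(0.0, 180.0, 60, endpoint=False)
sinogram = [project(phantom, a) for a in angles]
reconstruction = backproject(sinogram, angles, n)
# The dense inclusion shows up as a bright region in the reconstruction.
print("value near inclusion:", reconstruction[n // 2, int(n * 0.65)])
```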

Antibiotics

Building on the work of the 19th-century bacteriologists Louis Pasteur and Robert Koch, scientists sought treatments for diseases once they had already infected the human body. The first success came in 1909 when a treatment for syphilis, Salvarsan 606, was found by a team working under the German bacteriologist Paul Ehrlich. Salvarsan 606, an arsenic compound, was the first of the so-called ‘magic bullets’, which could seek out and destroy bacteria without harming the human body. The second ‘magic bullet’ was found by the German chemist Gerhard Domagk, who discovered the antibacterial sulphonamides. Domagk found that Prontosil, a coal-tar compound, killed the streptococcus microbe that caused blood poisoning. He tested it on his dangerously ill daughter with success in 1935, and the drug was subsequently developed by French and British scientists. Real progress in the fight against disease was being made, but neither Salvarsan 606 nor the sulphonamides were effective against all diseases, and the sulphonamides had serious side effects, including turning the patient bright red and damaging the kidneys and liver.

Nearly 50 years after Pasteur advanced the germ theory of infection, Alexander Fleming discovered penicillin in 1928. Fleming's discovery is referred to as the third, and ultimate, ‘magic bullet’. Penicillin was an antibiotic that could treat a wide range of bacterial infections, something that bacteriologists such as Ehrlich and Domagk had been unable to achieve. After 1939 the researchers Howard Florey and Ernst Chain managed to isolate and mass-produce penicillin, which became an important asset to the Allies in World War II, keeping troops fit against tropical diseases and hastening the healing of wounds. The range of antibacterial agents proliferated in the second half of the century, and overuse followed, encouraging the emergence of resistant strains of bacteria where antibiotics were taken too frequently or for the wrong purposes. Hospitals saw outbreaks of MRSA (methicillin-resistant Staphylococcus aureus) and other ‘superbugs’ that were effectively untreatable because the bacteria had become resistant to most known antibiotics. Other drugs, such as the antibiotic streptomycin and the analgesic aspirin, were also manufactured on a large scale, and major pharmaceutical companies emerged to produce them. Their production was often based on techniques used for the manufacture of chemical weapons during world wars I and II.

Vaccination

One of the triumphs of the century was the eradication of the smallpox virus, the only naturally occurring human disease to have been wiped out. This was achieved in 1980 (nearly 200 years after Edward Jenner first introduced a vaccine against smallpox) by massive immunization and case hunting. However, this achievement was overshadowed by the appearance of new and virulent viruses in the 1970s and 1980s, such as Ebola and HIV, the virus that causes AIDS.

Vaccines were developed in the post-war period for polio (by Jonas Salk and Albert Sabin), measles, mumps, rubella, hepatitis B, and other diseases. Several anti-malarial drugs were developed and the life cycle of the disease carrier – the mosquito – was studied closely to aid the fight against the various strains of malaria found in both developed and developing countries. The growth of international travel made diseases such as malaria hard to overcome; the goal at the beginning of the 21st century was control rather than elimination of malaria.

Victims of research

An area of concern that emerged at the end of the century was the use of people as test subjects without their permission in the development of medical advances. The polio vaccines and many other medicines were tested on US prison inmates who were not fully informed of the dangers to which they were being subjected. Some experiments involved children with learning difficulties who had no one to protect their interests; many involved poor African Americans who were not fully informed of their position. In May 1997 US president Bill Clinton apologized to the survivors of the Tuskegee Syphilis Study, in which many participants were denied appropriate treatment even after penicillin was known to be an effective medication for this debilitating disease. Similarly, the president's Advisory Committee on Human Radiation Experiments reported in October 1995 that many individuals had been subjected to dangerous levels of radiation as part of medical research that they did not fully understand and to which they had not given informed consent. Some observers likened these activities to the medical experimentation in Nazi concentration camps for which German doctors had been condemned at the Nuremberg trials of 1945–46. President Clinton made more funding available for bioethical studies in partial reparation for the offences against medical ethics that his government acknowledged.

Stricter guidelines for the use of human subjects in medical research were advanced in various countries and by international agencies in the second half of the 20th century, and ethics committees were charged with the careful monitoring of research. The advent of computer technology vastly increased the range of clinical trials conducted in the development of pharmaceuticals and therapies, and some observers expressed concern about the opportunity for fraud and the abuse of individual patients as researchers and doctors enrolled thousands of people in their studies. Some minority groups, however, argued that they were being discriminated against by being excluded from trials, and that they were keen to participate in trials of potential cures for conditions such as HIV/AIDS.

Clinical trials

There were great advances throughout the century in the field of biostatistics (the statistical study of the effects of medicines and therapies), particularly with the computerization of information storage after World War II. The procedures for clinical trials were developed in large part by Austin Bradford Hill, who carried out the first randomized trial, testing the effectiveness of the antibiotic streptomycin against tuberculosis in 1948. He was also, with Richard Doll, among the first to demonstrate the link between smoking and lung cancer.
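
To illustrate the logic of a randomized trial in the Bradford Hill mould, the sketch below randomly allocates simulated patients to a treatment or control arm and compares improvement rates with a chi-squared test. The patient numbers and improvement probabilities are invented for illustration, not drawn from the 1948 trial:

```python
# A toy randomized controlled trial: allocation to treatment or
# control is decided by chance, which protects the comparison from
# selection bias; the outcome difference is then tested statistically.
import random
from scipy.stats import chi2_contingency

random.seed(1)

P_IMPROVE_CONTROL = 0.25  # assumed baseline improvement rate
P_IMPROVE_TREATED = 0.55  # assumed improvement rate with the drug
N_PATIENTS = 200

improved = {"treated": 0, "control": 0}
total = {"treated": 0, "control": 0}

for _ in range(N_PATIENTS):
    # Randomization: neither doctor nor patient chooses the arm.
    arm = random.choice(["treated", "control"])
    p = P_IMPROVE_TREATED if arm == "treated" else P_IMPROVE_CONTROL
    total[arm] += 1
    improved[arm] += random.random() < p

# 2x2 table: improved vs not improved, by arm.
table = [[improved["treated"], total["treated"] - improved["treated"]],
         [improved["control"], total["control"] - improved["control"]]]
chi2, p_value, _, _ = chi2_contingency(table)
print("improvement rates:",
      {arm: improved[arm] / total[arm] for arm in total})
print(f"chi-squared = {chi2:.2f}, p = {p_value:.4f}")
```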

Not all the new medications subjected to clinical trials were in fact beneficial. The drug thalidomide was given to pregnant women for morning sickness in the late 1950s and early 1960s after clinical trials led doctors to believe it to be safe. However, the research was inadequate, and the drug caused damage to the embryo: babies were born with serious abnormalities, including missing or shortened limbs. Subsequently, controls were tightened worldwide, and drugs had to pass far more rigorous safety testing before government regulatory bodies would approve them for use in their countries.

Work and ill health

Another area of great progress was the field of occupational health, as epidemiologists (those who study the incidence of disease) refined their techniques. Computerization enabled cause–effect relationships between particular industries or occupations and particular illnesses to be established; a simple example of such a calculation is sketched below. Environmental health hazards, too, were progressively identified. This field of medicine grew after the publication of Rachel Carson's Silent Spring (1962), which alerted the world to the dangers of chemical and radiation pollutants. One of the areas of contention at the end of the century concerned the hypothesis that male workers in nuclear plants might be exposed to radiation that could damage their sperm, resulting in birth defects among their children. New technologies, such as cellular telephones, also began to come under close investigation.
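
A minimal sketch of the kind of cause–effect calculation this enabled: the relative risk from a 2×2 exposure/disease table, comparing disease rates in exposed and unexposed workers. The counts below are invented for illustration, not taken from any real study:

```python
# Relative risk: the rate of disease among exposed workers divided by
# the rate among unexposed workers. RR > 1 suggests the exposure
# raises risk; RR near 1 suggests no association.
def relative_risk(exposed_cases, exposed_total,
                  unexposed_cases, unexposed_total):
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    return risk_exposed / risk_unexposed

# Hypothetical cohort: 1,000 workers in a dusty trade, 30 of whom
# develop lung disease, against 2,000 unexposed workers with 20 cases.
rr = relative_risk(30, 1000, 20, 2000)
print(f"relative risk = {rr:.1f}")  # 3.0: exposed workers are three
                                    # times as likely to fall ill
```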

Birth control and hormones

One of the great breakthroughs of the century was the development of the Pill (hormone-based birth control) by Carl Djerassi and others in the 1950s, and the freedom it gave women to time their reproduction. Various forms of birth control were disseminated in the effort to stem the growth of the world's population. In the last third of the century, a better understanding of the menstrual cycle and the menopause led to the development of hormone-replacement therapy and other treatments for the effects of hormonal changes. Research at the turn of the century focused on developing a contraceptive pill for men.

Genetics

The identification of the structure of DNA by Rosalind Franklin, Francis Crick, and James Watson in the 1950s, together with technical developments such as the electron microscope, ushered in an era of research into genetics. The successful mapping of the human genome – the sequence of genes on the 23 pairs of human chromosomes – was announced in 2000. With a fuller understanding of the entire DNA makeup of humans, doctors of the 21st century hope to be able to produce genetic treatments for previously incurable conditions. It may also become possible to change the characteristics of a fetus as it develops in the womb.

Developments in human fertilization in the late 20th century, such as IVF (in vitro fertilization), which resulted in the birth of so-called ‘test-tube babies’, and artificial insemination, made it possible for previously infertile women to have children. However, the ethical implications of these techniques and of genetic engineering remain under close scrutiny and debate.

Surgery

Advances in surgery also proceeded at a rapid pace. Plastic surgery was developed during the first half of the century, given great stimulus by world wars I and II. During World War I Harold Gillies worked on burns patients and pioneered new techniques of reconstruction. Gillies's assistant at the time, Archibald McIndoe, continued to develop plastic surgery during World War II at East Grinstead Hospital in England, where he was able to use antibiotics such as penicillin to reduce the infections common among plastic surgery patients. By the end of World War II plastic surgery was an established and successful medical technique. The first heart transplant operation was performed by the South African surgeon Christiaan Barnard in 1967, and transplant surgery of a number of the major organs subsequently became almost commonplace in the developed world. The replacement of hips, other joints, and eye lenses also improved the quality of life for many people. Organ transplants gave many thousands of patients a new lease of life, but the supply of donor organs failed to keep pace with demand.

Keyhole surgery was made possible by the use of endoscopes, slender fibre-optic instruments that allow the surgeon to see inside the patient. The image is sent to a screen in the operating theatre, and the surgeon uses tiny surgical instruments attached to the end of the endoscope to operate from the inside. Keyhole surgery allowed much shorter recovery times and spared patients much of the trauma of major open operations.

With the technology available to surgeons at the beginning of the 21st century, almost any procedure is possible. However, although the development of modern medicine over the 20th century made it possible to help many more patients, the costs involved, and problems such as the supply of donor organs, meant that access to many treatments and operations came to depend on more than clinical need alone.


© RM, 2018. All rights reserved.
