Medicine is the science and art of treating and preventing disease.
Prehistoric skulls found in Europe and South America indicate that Neolithic man was already able to trephine, or remove disks of bone from, the skull successfully, but whether this delicate operation was performed to release evil spirits or as a surgical procedure is not known. Empirical medicine developed in ancient Egypt, and involved the use of many potent drugs still in use today, such as castor oil, senna, opium, colchicine, and mercury. In spite of their skill in embalming, however, the Egyptians had little knowledge of anatomy.
In Mesopotamian medicine the Code of Hammurabi established the first known code of medical ethics and laid down a fee schedule for specific surgical procedures. In ancient Babylonia, every man considered himself a physician and, according to Herodotus, gave advice freely to the sick man who was willing to exhibit himself to passersby in the public square. The Mosaic Code of the Hebrews showed a concern with social hygiene and the prevention of disease through dietary restrictions and sanitary measures.
Although ancient Chinese medicine was also influenced adversely by the awe felt for the sanctity of the human body, the Nei Ching, attributed to the emperor Huang-Ti (2698–2598 B.C.), contains a reference to a theory of the circulation of the blood and the vital function of the heart that suggests familiarity with anatomy. In addition, accurate location of the proper points for the traditional Chinese practice of acupuncture implies some familiarity with the nervous and vascular systems. The Chinese pharmacopoeia was more extensive than that of any other ancient civilization. The Hindus seem to have been familiar with many surgical procedures, demonstrating skill in such techniques as nose reconstruction (rhinoplasty) and cutting for removal of bladder stones.
In Greek medicine the impetus for the rational approach came largely from the speculations of the pre-Socratic philosophers and such philosopher-scientists as Pythagoras, Democritus, and Empedocles. Hippocrates, the father of Western medicine, taught the prevention of disease through a regimen of diet and exercise; he emphasized careful observation of the patient, the recuperative powers of nature, and a high standard of ethical conduct, as incorporated in the Hippocratic Oath. By the 4th cent. B.C., Aristotle had already stimulated interest in anatomy by his dissections of animals, and work in the 3d cent. B.C. on human anatomy and physiology was of such high quality that it was not equaled for fifteen hundred years.
The Romans advanced public health and sanitation through the construction of aqueducts, baths, sewers, and hospitals. The encyclopedic writings of Galen constitute a final synthesis of the medicine of the ancient world. Revered by Arabic and Western physicians alike, his concepts stood virtually unchallenged until the 16th cent. Unfortunately, his prolific researches on anatomy and physiology were not invariably accurate, and reliance on them impeded subsequent progress in anatomy.
With the destruction or neglect of the Roman sanitary facilities, there followed a series of local epidemics that culminated many centuries later in the great plague of the 14th cent. known as the Black Death. During the Middle Ages certain monastic libraries, notably those at Monte Cassino, Bobbio, and St. Gall, preserved a few ancient medical manuscripts, and Arab and Jewish physicians such as Avicenna and Maimonides continued medical investigation.
The first real light on modern medicine in Europe came with the translation of many writings from the Arabic at Salerno, Italy, and through a continuing trade and cultural exchange with Byzantium. By the 13th cent. there were flourishing medical schools at Montpellier, Paris, Bologna, and Padua, the last being the site of production of the first accurate books on human anatomy. At Padua, Vesalius proved that Galen had made anatomical mistakes. Prominent among those who pursued the new interest in experimental medicine were Paracelsus, Ambroise Paré, and Fabricius, who discovered the valves of the veins.
In the 17th cent. William Harvey, using careful experimental methods, demonstrated the circulation of the blood, a concept that met with considerable early resistance. The introduction of quinine marked a triumph over malaria, one of the oldest plagues of mankind. The invention of the compound microscope led to the discovery of minute forms of life, and the discovery of the capillary system of the blood filled the final gap in Harvey's explanation of blood circulation.
In the 18th cent. the heart drug digitalis was introduced, scurvy was controlled, surgery was transformed into an experimental science, and reforms were instituted in the care of the mentally ill. In addition, Edward Jenner introduced vaccination to prevent smallpox, laying the groundwork for the science of immunization.
The 19th cent. saw the beginnings of modern medicine when Pasteur, Koch, Ehrlich, and Semmelweis established the relationship between germs and disease. Other invaluable developments included the use of disinfection and the consequent improvement in medical, particularly obstetrical, care; the use of inoculation; the introduction of anesthetics in surgery (see anesthesia); and a revival of better public health and sanitary measures. A significant decline in maternal and infant mortality followed.
Medicine in the 20th cent. received its impetus from Gerhard Domagk's discovery of sulfanilamide, the first of the sulfa drugs, and from the development of penicillin, the first true antibiotic. Further progress has been characterized by the rise of chemotherapy, especially the use of new antibiotics; increased understanding of the mechanisms of the immune system (see immunology) and the increased prophylactic use of vaccination; utilization of knowledge of the endocrine system to treat diseases resulting from hormone imbalance, such as the use of insulin to treat diabetes; and increased understanding of nutrition and the role of vitamins in health.
In Mar., 1953, at the Univ. of Cambridge, England, Francis Crick and James Watson announced, “We have discovered the secret of life.” Indeed, they had unraveled the chemical structure of the fundamental molecule of heredity, deoxyribonucleic acid (DNA), giving science and medicine the basis for molecular genetics and leading to a continuing revolution in modern medicine.
Much medical research is now directed toward such problems as cancer, heart disease, AIDS, reemerging infectious diseases such as tuberculosis and dengue fever, and organ transplantation. Currently, the largest worldwide study is the Human Genome Project, an effort to map and sequence the genes carried on the human chromosomes. Gene therapy, the replacement of faulty genes, offers possible abatement of hereditary diseases. Genetic engineering has led to the development of important pharmaceutical products and the use of monoclonal antibodies, offering promising new approaches to cancer treatment. The discovery of growth factors has opened up the possibility of growth and regeneration of nerve tissues.
With the surge of general and specialized medical knowledge, the educational requirements of the medical profession have increased. In addition to the four-year medical course and the general hospital internship required almost everywhere, additional years of study in a specialized field are usually required. Similar progress and increased requirements in education are reflected in ancillary professions such as nursing.
Modern medicine, characterized by growing specialization and a complex diagnostic and therapeutic technology, faces problems in the allocation of capital and personnel resources. Some authorities advocate an increase in the use of paramedical personnel to supervise the care of individuals with common, chronic, or terminal illnesses, leaving the physician in charge of treating curable disease. Others emphasize the physician's responsibility to help patients and families in the overall management of their health problems, many of which are thought to reflect the social ills of living in an urban, industrialized society.
In some countries, such as Great Britain, medical care is under government control and is available virtually without charge to all. In the United States, medical practice is characterized by a patchwork mixture of government and private control. The Kefauver-Harris amendments of 1962 to the federal Food, Drug, and Cosmetic Act empower the Food and Drug Administration to require stricter testing and licensing of new drugs. There have also been federal, state, and local programs for mass vaccination and other public health measures. The Medicare program, enacted in 1965, provides subsidized hospital and nursing-home care for persons over 65; the companion Medicaid program provides federal matching funds for state aid to the medically indigent.
A wide variety of private medical insurance plans are also available to those who can afford them, and many employers pay all or part of their employees' health insurance premiums. In addition, health maintenance organizations (HMOs), or group practice plans, are designed to promote disease prevention and reduce medical expenditures.