Utah History Encyclopedia

MEDICINE IN UTAH

By Henry P. Plenk

Holy Cross Hospital, Salt Lake, 1909

A brief overview of medicine as it developed worldwide provides a context for the medical history of Utah.

Medicine men and women have played, and still play, a very important role in tribal societies throughout the world. Much real knowledge has accumulated, and many drugs now in common use in modern medicine, such as digitalis and quinine, came from this source. Taxol, from yew tree bark, used in treating ovarian carcinoma, is the most recent addition.

Some evidence of early surgery has been found in the skeletons of many primitive peoples in the form of trephination--surgical holes in the skull--which supposedly allowed evil spirits to escape from the brain. Throughout the Middle Ages, barber-surgeons performed amputations and other emergency procedures. The lack of anesthesia until the mid-nineteenth century (chloroform and ether) prevented the more widespread use of surgery. In general, these early medical experts tried to follow the primary principle attributed to Hippocrates: "First, do no harm!"

The beginnings of scientific medicine date to 1796 when Edward Jenner, in England, first used material from cowpox lesions to vaccinate against smallpox. Not until the mid-nineteenth century was it recognized that certain diseases were transmitted by external agents: Ignaz Semmelweis in Vienna demonstrated that childbed fever was transmitted by the dirty hands of physicians; and John Snow in London traced an epidemic of cholera to contaminated water.

The science of bacteriology was initiated in the latter part of the nineteenth century by Louis Pasteur in France and by Robert Koch in Germany. Their work led to the identification of the offending organisms that caused pneumococcal pneumonia, typhoid fever, and cholera, among other diseases.

In 1905 Schaudinn and Hoffmann identified the specific cause of syphilis. Four years later, Paul Ehrlich initiated treatment of the disease with Salvarsan, an arsenic compound--the first application of a specific drug in the successful treatment of an infectious disease.

Despite the development of certain vaccines and steadily improving hygiene and public health, average life expectancy did not increase significantly during the nineteenth century. Infectious diseases continued to dominate the practice of medicine and remained the primary cause of death until the mid-1930s, when sulfonamides, the first effective antibacterial drugs, came into use.

Medical Education in the United States and Canada

In 1870 there were 474 medical schools in the United States and Canada, three to four times as many as there are today. Most were proprietary--groups of doctors banding together more to enrich themselves than to educate future physicians. Utah was no exception. In 1880 a forty-three-year-old physician, Dr. Frederick Kohler, established the state's first medical school in Morgan, Utah, forty-two miles northeast of Salt Lake City. In 1882 the "college" honored its only graduating class of six students and then closed its doors. In 1904 the Council on Medical Education of the American Medical Association (AMA) suggested standards of six years for medical education beyond high school and devised a classification system for rating the existing schools. Only 82 of the 160 schools then in existence were found to be acceptable; many others had already closed down. In 1908 the Carnegie Foundation commissioned Dr. Abraham Flexner to assess U.S. medical schools. His report, issued in 1910, revolutionized North American medical education.

Utah Territorial Medicine (1850-1896)

During the first twenty to thirty years after the pioneers settled the Salt Lake Valley, the only healers were "Thomsonian" doctors who acquired their knowledge and "license" by paying $20.00 to a "Dr. Thomson" for a book on herbal medicine and the right to dispense his herbs. Others followed the maxim of "puke 'em, sweat 'em, purge 'em." No wonder Brigham Young advised the Saints to heal each other by the "laying on of hands."

Rising maternal and child death rates prompted Brigham Young to encourage some women living in polygamy who had already borne children to study medicine at the Woman's Medical College in Philadelphia; but there was no provision for their financial support. When these women returned home in the summer to earn money to support themselves during the school year, many became pregnant again, which added to their financial and emotional woes. Ellis Shipp began medical school in 1872 and took advanced training in obstetrics and gynecology. The "grand old lady of Utah medicine" is credited with founding a school of midwifery, and she delivered thousands of babies and published widely in the areas of hygiene and public health. Another woman, Dr. Romania Pratt, took special training in ophthalmology and performed the first cataract operation in the territory.

However, the early prominence of women in Utah medicine lasted for only one generation. A significant increase in female students and physicians did not begin until the 1970s and 1980s; with only 17 percent female medical students in the early 1990s, Utah remains below the national average of 35 percent.

Statehood (1896) to the Present

The University of Utah was founded in 1850. Fifty-five years later, in 1905, the school's Department of Medicine was formed with six professors and an annual budget of $10,000. The name was changed to University of Utah Medical School in 1912, but the program was still limited to only the first two years of a full medical course. Graduates were required to transfer to four-year schools in the East or Midwest to complete their training.

In 1920 a new red-brick building on the university campus, constructed by the army as a dormitory for military officers during World War I, was turned over to the medical school and served as the basic science building until 1965. With the nation's entry into World War II in 1941, pressure was exerted by the AMA and the U.S. Army to convert the two-year school to a full four-year medical school, since none existed between Denver and San Francisco.

The expansion was approved in 1942, and the Salt Lake General Hospital at 21st South and State streets, the state's only public hospital, was designated as the university's teaching facility. Dr. A.C. Callister, a practicing surgeon appointed part-time dean in 1942, was surprisingly successful in recruiting a small but outstanding group of physicians, teachers, and researchers, in spite of the appalling lack of funding and facilities and a severe nationwide shortage of physicians.

Conditions at the Salt Lake General Hospital were poor. An interesting incident is characteristic of the early years: in 1944 the chief resident in surgery was performing surgery on a patient when, in the middle of the operation, all lights went out. He called out for the hospital engineer, who was the only person familiar with the antiquated wiring and plumbing of the decaying structure. Suddenly, the staff remembered that the engineer was the patient on the operating table. The procedure was completed by flashlight illumination, and the patient recovered satisfactorily.

The nucleus of the four-year faculty arrived between 1943 and 1945. Dr. Philip Price and Dr. Maxwell Wintrobe came from Johns Hopkins; Drs. John Anderson, Robert Alway, A. Louis Dippel, Emil Holmstromb, and Leo Samuels came from the University of Minnesota; Dr. Louis Goodman and Dr. Thomas F. Dougherty came from Yale.

From the very beginning, this was no ordinary medical school. The commitments to teaching, quality of patient care, and research were remarkable. The school started with a dreadfully inadequate physical plant and a minimal budget supported by a state population of only 600,000. Outstanding teachers included Drs. Lou Goodman, Max Wintrobe, and Tom Dougherty. Goodman (pharmacology) was the author of the textbook The Pharmacological Basis of Therapeutics, used the world over, then and now; more than one and a half million copies have been printed, in sixteen languages. Max Wintrobe, author of the pioneer textbook on hematology, was an outstanding teacher, researcher, and administrator. A hard-working, strict disciplinarian who set very high standards, he required demanding individual case presentations. He refused to accept married house officers, with the explanation that "you can only have one love--medicine." After a senior resident got married secretly, for fear of being fired, the unwritten rule was rescinded in 1950. Tom Dougherty was a man of ideas. He posed questions that stimulated others to initiate research projects and was prolific in his own output as well.

At the end of World War II, Leo Marshall, professor of public health and twice acting dean of the University of Utah Medical School, suggested to Senator Elbert Thomas of Utah that it would be very useful if wartime support for scientific research given to the armed services could be adapted to the support of civilian scientific institutions through the public health service. As a result of Senator Thomas's efforts, Congress appropriated $100,000, for which some 100 grant applications were received. Senator Thomas nonetheless prevailed in awarding the entire $100,000, then a princely sum, to the University of Utah. The grant was renewed for twenty-eight years under Dr. Wintrobe's direction and eventually amounted to many millions of dollars.

The initial town-and-gown relationship between practicing physicians and the university faculty left something to be desired. Some physicians actually opposed the formation of the four-year school, fearing competition for their patients. Dr. Hans Hecht, pioneer academic cardiologist, and Dr. Ernst Eichwald, pathologist and early expert on tissue transplantation--both graduates of German medical schools--were required to enroll as senior students in the medical school to obtain American M.D. degrees in order to be licensed in Utah. The Utah State Board of Examiners was unwilling to grant an exemption in spite of the outstanding contributions both men were already making in their respective fields. Fortunately, this tension disappeared as some of the old-timers died out and graduates of the University of Utah came to form a large majority of the area's practicing physicians.

Hans Hecht exemplifies the ingenuity, modesty, and commitment of the early faculty. When he arrived in 1944, at a salary of $2,000 per year, no space could be found for his activities. He noticed an auditorium in the infirmary and suggested having the floor rebuilt. The triangular space created served as the heart station and Hecht's research laboratory for many years.

The growth of the medical school profoundly affected the quality of medicine in Utah and especially in the Wasatch Front communities. The presence of the four-year school not only brought many well-qualified experts to the faculty but also acted as a powerful magnet to attract well-trained specialists from many other centers to practice in the community and to seek clinical (teaching) appointments in the medical school. More and more of the best medical students from Utah were guided by the faculty to the best post-graduate training programs in the East and Midwest. The new doctors returned to fill vacancies on the faculty or to relieve shortages in the community. The medical school also stimulated an unusual amount of research in the local private hospitals. The increasing number of training programs at the University of Utah Medical School provided more and more specialists in Salt Lake City, Ogden, Provo, and eventually throughout Utah and the entire Intermountain area.

The original postwar faculty of six members in the Department of Medicine covered the entire field of internal medicine, took care of all medical patients, taught medical students on a four-quarter schedule, and initiated significant research programs. Drs. Max Wintrobe and George Cartwright concentrated on hematology, Hans Hecht on cardiology, Frank Tyler on endocrinology and metabolism, Val Jager on neurology and syphilology, and Utah native John Waldo on infectious diseases. Two additional departments have since been created: Neurology and Family and Preventive Medicine. By 1992 the Department of Medicine had grown to 202 members in thirteen divisions. The Department of Surgery, consisting of three full-time members in 1947, now comprises eighty-one members in ten divisions, and two divisions have become separate departments: Ophthalmology and Neurosurgery.

Practice of Medicine

The general practitioner was the main supplier of medical care throughout the first half of the twentieth century. After graduation from medical school, he (or much less frequently, she) spent one or two years in an internship and frequently apprenticed himself for a few years to an older practitioner. He took care of all members of the family, regardless of age, delivered babies, diagnosed and treated medical illnesses, and performed a fair amount of surgery. The family doctor, as a valued friend and counselor, made many house calls and often was loved and respected.

Specialization in internal medicine and surgery began after World War I. Many physicians assigned to specialty wards in military hospitals proceeded to take special training after their discharge, often working at the fine medical centers in Europe--Berlin, Vienna, London, and Edinburgh. Specialty boards began to be formed in the 1930s and 1940s, and formal three-to-five-year residencies were soon required in many fields. Some Utah physicians who had restricted their practices to certain specialties before World War I were the key organizers of several clinics in Salt Lake City, notably the Salt Lake Clinic (1915), Intermountain Clinic (1917), Bryner Clinic (1941), and Memorial Medical Center (1953).

The specialization process was vastly accelerated by World War II. The G.I. Bill of Rights enabled many veteran physicians to enter specialty training and qualify for board examination, changing the character of medical practice in the late 1940s and 1950s.

The general internist began to replace the general practitioner as the primary-care physician, especially in urban areas, and also became the consultant to the general practitioner in more complicated problems of diagnosis and treatment. The increasing subspecialization of surgery into orthopedic, eye, ear-nose-throat, chest, neurological, plastic, pediatric, and other fields continued to erode the domain of the general surgeon.

In the late 1950s and 1960s, further subspecialization of internal medicine changed some areas from predominantly "thinking" to "doing" fields. The gastroenterologist learned to pass scopes through the mouth and the rectum, and the pulmonologist started to use the bronchoscope. The cardiologist began implanting cardiac pacemakers and passing catheters. The increased compensation for these procedures helped to lure young physicians into the subspecialties, and the general internist became an endangered species.

Fortunately, the Department of Family and Preventive Medicine at the University of Utah, formed in 1970 by Dr. C. Hilmon Castle, created a three-year residency in Family Practice leading to medical board certification. This program stresses the areas of medicine and pediatrics but also provides some training in obstetrics, surgery, and psychiatry, tailored to some degree to the location of the intended practice. From 1970 to 1992, 262 family practice physicians were graduated, of whom half chose to practice in smaller communities and rural areas to replace the vanishing general practitioner.

Since the 1970s and 1980s, preventive medicine has suffered from the lack of primary-care physicians. Patients without a family doctor and those who have no insurance and can't afford preventive medical care have been flocking to hospital emergency rooms, having neglected early warning signs. There, with no previous acquaintance with the physician and no medical "history," they receive the most expensive and most impersonal form of medical care.

While "hanging out a shingle" was the expected step following medical training in the past, fewer and fewer young physicians now go into solo practice or join another physician. The cost of setting up an office after having incurred considerable debt going to school, as well as the prospect of having to be at the beck and call of patients at all hours and on weekends, directs many young M.D.s to seek employment by hospital emergency rooms, existing clinics, or health maintenance organizations (HMOs).

Family Health Plan (FHP), the first and largest Utah HMO, began operations in Utah in 1976 and by 1992 cared for 140,000 patients annually. HMOs are attractive to the employer who pays much of the cost of employees' health insurance because of their generally lower rates and broader coverage. The patient chooses a primary-care physician--internist, family practitioner, or pediatrician. These doctors see the patients first and decide on procedures and, if necessary, refer them to specialists. Another physician is frequently substituted, particularly when a patient is hospitalized, since the physician is obligated to work only 40 to 44 hours per week. Physicians are on salary but are rewarded for keeping costs down. The average age of patients covered by HMOs is significantly lower than that of the population at large.

Primary Children's Hospital, Salt Lake, 1946

Some Outstanding Research Accomplishments

Utah physicians and medical researchers have made many important contributions, locally, nationally, and internationally. A few significant landmarks are mentioned here.

In 1900 the major causes of death were infectious diseases such as pneumonia, tuberculosis, and the childhood diseases. By mid-century heart disease, stroke, and cancer had climbed to the top of the list, with infectious diseases at the bottom. Technological advances in public health (such as water- and sewage-treatment plants) played a major role in nearly eliminating intestinal infections in the United States, and vaccination accomplished wonders in reducing childhood diseases. Simultaneously, however, increased tobacco and alcohol use, and other lifestyle changes, as well as rapidly increasing pollution by chemicals and radiation, contributed to the increase in cancer and heart disease.

In the 1940s and 1950s, a concerted effort by several cooperating departments of the University of Utah Medical Center, under the leadership of Dr. Leo Samuels, resulted in significant new knowledge concerning the chemistry and physiology of the adrenal glands.

Dr. Frank Tyler and his associates in the Department of Medicine laid the groundwork for later genetic studies through their investigation of several familial diseases such as muscular dystrophy, phenylketonuria, polyposis of the bowel, and others. Geneticist Eldon Gardner studied familial polyposis of the large bowel associated with benign subcutaneous tumors (Gardner's Syndrome). Radiologist Henry Plenk discovered multiple bony tumors associated in all patients with this condition (Plenk-Gardner Syndrome).

The hematology section explored the mechanisms and treatment of various anemias and supported Wintrobe's pioneering efforts in treating lymphomas and leukemias with chemotherapy. Utah was selected as one of four centers funded to develop a polio vaccine; the breakthroughs came in Pittsburgh in 1953 and in Cincinnati in 1954. Through inventive public-vaccination campaigns, poliomyelitis was effectively wiped out. The infectious disease section played a major role in the recognition of toxic shock syndrome in women and its relationship to a brand of "super" tampons being test-marketed regionally.

In gastroenterology, the development of newer drugs to reduce gastric acidity reduced the need for gastric resection of peptic ulcers. The development and perfection of upper and lower gastrointestinal (G.I.) flexible endoscopy revolutionized the diagnosis and treatment of many diseases of the G.I. tract and allowed biopsies and removal of polyps without major surgery.

In pulmonary medicine, a drive to eradicate tuberculosis by early diagnosis and chemoprophylaxis with the drug Isoniazid led to a dramatic reduction of the disease, particularly among the state's Native American population, and the eventual closing of the State Tuberculosis Hospital in Roy, Utah, in 1967.

LDS Hospital played a leading role in pioneering a pulmonary function laboratory and setting up the first shock/trauma intensive care unit (ICU). In conjunction with the University of Utah, LDS created a program in occupational and environmental health and critical care. Life Flight by helicopter or fixed-wing aircraft speeded up the initiation of critical care.

The institution of hemodialysis for renal failure prolonged many lives, but kidney transplants eventually proved not only more effective but also less expensive. The first renal transplant in Utah was performed at Salt Lake General Hospital in 1965, and the patient was still living in 1992.

Dr. Willem J. Kolff, the originator of hemodialysis, the artificial kidney, and artificial heart, joined the University of Utah Medical Center in 1968. This major boost to the artificial organs program resulted in the implanting of an artificial heart in dentist Barney Clark in 1982. Pioneering artificial eyes, ears, and arms have been additional tangible results.

Dr. Ray Rumel was the pioneer thoracic surgeon. His removal of a lobe of the lung for cancer at LDS Hospital, resulting in a nineteen-year survival for the patient, was a truly innovative procedure in 1942. Then came open-heart surgery to correct congenital cardiac abnormalities and to replace defective valves. Reconstruction of narrowed blood vessels, aorta, renal arteries, and coronary arteries prevented many complications of arteriosclerosis.

Continued progress in thoracic surgery led to the formation of a team of surgeons doing heart transplants in four hospitals. The survival rate of 90 percent one year after surgery in 412 transplants performed from 1985 to 1992 is one of the best in the country.

Homer Warner deserves credit for developing the most sophisticated system of utilizing computers in total patient care, making LDS Hospital a model for the world.

One of the most far-reaching new tools, the laser, was applied to medicine by John A. Dixon. The laser is now used in most surgical specialties worldwide to stop bleeding and to destroy malignant tissues, among other uses. Between 1982 and 1992, more than 1,500 patients were treated with his new device, and more than 1,500 physicians from all over the world were trained at the University of Utah to use the method successfully. Except for some minor burns, no serious complications were encountered during the development of the procedures.

The dramatic decrease in neonatal deaths in Utah from fifteen to three per 1,000 live births during the twenty-year period from 1968 to 1988 was due in great part to the efforts of Dr. August L. Jung, who created neonatal intensive care units (NICUs), first at the University of Utah, Primary Children's Medical Center, and LDS Hospital, and then in all major hospitals in the area.

Dr. David Bragg (appointed in 1970) changed the character of the Department of Radiology at the university and the practice of radiology in the state by introducing many modern methods such as angiography, CT and MRI scanning, and interventional radiology. Through his success in attracting massive research grants, his staff has produced a prolific scientific output (150 to 200 papers per year) as well as some fifty textbooks.

The first modern radiation therapy facility between Denver and the Pacific Coast was established by Drs. Henry P. Plenk and Richard Y. Card at St. Mark's Hospital in 1960. The Tumor Institute became the Radiation Center when it moved to a yet more modern facility at LDS Hospital in 1969. Plenk pioneered in the use of two procedures to enhance the effect of radiation on tumors: hyperbaric oxygen and hyperthermia. Intraoperative radiation therapy was another major innovation fostered by Drs. William T. Sause and R. Dirk Noyes at LDS Hospital.

The Division of Radiation Oncology at the University of Utah was instituted in 1971 with the appointment of Dr. J. Robert Stewart, who established a productive section in radiation biology. He and his staff became very involved in hyperthermia. In 1986 Stewart became director of an important cancer center at the University of Utah and affiliated hospitals.

Sports medicine emerged in the early 1970s, largely as a result of the development of arthroscopy by Dr. Robert Metcalf, team physician at Brigham Young University and later a University of Utah Medical School staff member. The procedure revolutionized knee surgery and is now used in shoulder surgery as well. Prosthetic replacement of hips and knees was also a major advance.

Among the many advances in general surgery, two innovations deserve special mention: the use of staples in place of sutures, and the use of the peritoneoscope, first to explore the abdomen and more recently in the performance of actual procedures such as removal of the gallbladder or uterus.

From the beginning of the computerization by Dr. Mark Skolnick of the genealogical library of the LDS Church, Raymond Gesteland, Ray White, and colleagues have been very successful in proving the genetic origin of many diseases and in pinpointing specific locations of important disease genes. The Institute of Human Genetics houses three major programs: the Department of Human Genetics, the Human Molecular Biology and Genetics Program, and the Center for Human Genome Research, one of six such centers in the United States.

Medical Practice: Then, Now, and in the Future

In spite of the phenomenal progress in the science of medicine and the many contributions of Utah physicians, the art of medicine nationwide took a step backward in the late twentieth century. Prior to the initiation of Medicare in 1966, physicians felt responsible for taking care of all patients, regardless of their ability to pay, either in tax-supported hospitals or in their offices. Even many private hospitals had charity services.

Medicare certainly had a profound effect on the practice of medicine by removing the elderly and many widows from the medically indigent group, while high inflation during the 1970s and 1980s, rapid progress in medical technology, and further implementation of technical procedures boosted the cost of medical care. A significant increase in the number of medical school graduates with an even higher percentage training in the subspecialties rather than the primary care areas (internal and family medicine, pediatrics, and obstetrics) contributed to rising costs.

Three further events had a devastating effect. First, a ruling by the Federal Trade Commission in 1979, supported by a Supreme Court decision in 1982, declared medicine (as well as law) a "business" rather than a "profession." This opened the floodgates to advertising by physicians and hospitals, which fostered excessively luxurious buildings and facilities to compete for physicians and their referrals. Second, administrative costs skyrocketed because of government regulations and insurance requirements, eating up more than twenty percent of the medical dollar. Third, the abandonment of the tightly controlled "certificate of need" in 1985 deregulated local decision-making regarding requirements for new facilities and equipment and allowed a very wasteful duplication of hospitals and expensive machines. Six new psychiatric hospitals were quickly built in Utah, whereas, only a short time before, a few wards had filled the need.

The result of these errors and omissions was that some thirty-five million people were without any health insurance coverage, and sixty to seventy million people nationwide were without adequate access to high-quality medical care.

Reform of the medical care system was an important issue in the 1992 election campaign. Numerous plans were supported by the candidates and discussed in congressional committees. The American College of Physicians, the largest medical organization in the United States after the AMA, supported the concept that adequate medical care is a right, not a privilege, and that universal access can be achieved only through system-wide reform. It suggested four principles: (1) assuring access to care; (2) assuring high-quality, comprehensive coverage; (3) promoting innovation and excellence; and (4) controlling costs by a combination of employee-sponsored and publicly sponsored insurance covering the entire population. To bring these changes about, private insurance companies would need to provide benefits identical to those in publicly sponsored plans. All patients would be eligible regardless of prior existing conditions; coverage could not be canceled and could be transferred to other employment.

It was considered imperative by policy makers and practitioners that national health care spending be capped at the 1992 level of $800 billion. The savings gained by eliminating inflated administrative costs, needless duplication of facilities, overpriced care, and unnecessary malpractice suits would provide for complete coverage of the entire population.

In the early 1990s, several communities and some states were well on their way to achieving these lofty goals. The state of Hawaii has been very successful in providing comprehensive coverage to an increasing segment of the population since 1975. By 1992, 98 percent of the population was included, and a goal of 100 percent was anticipated. Hawaii's system stresses primary and preventive care and eliminates elective procedures and much high-tech tertiary care, especially in the terminal patient.

For many Utahns, as for other Americans, health and medical care had replaced war and the threat of nuclear destruction as the most important issue facing the nation at the end of the century. The decisions made in the coming years will undoubtedly affect every citizen.