
Chapter 1

History of the U.S. Healthcare System



The student will be able to:

Identify five milestones of medicine and medical education and their importance to health care.

Identify five milestones of the hospital system and their importance to health care.

Identify five milestones of public health and their importance to health care.

Identify five milestones of health insurance and their importance to health care.

Explain the difference between primary, secondary, and tertiary prevention.

Explain the concept of the iron triangle as it applies to health care.


When the practice of medicine first began, tradesmen such as barbers practiced medicine. They often used the same razor to cut hair as to perform surgery.

In 2010, the United States spent $2.6 trillion on health care, or 17.6% of the gross domestic product, the highest in the world.

In 2011, U.S. Census data indicate there were 48.6 million uninsured U.S. citizens, which is a decrease from 50 million in 2010.

The Centers for Medicare and Medicaid Services (CMS) predicts annual healthcare costs will be $4.64 trillion by 2020, which represents nearly 20% of the U.S. gross domestic product.

The United States is one of only a few developed countries that does not have universal healthcare coverage.

In 2002, the Joint Commission issued hospital standards requiring hospitals to inform patients when their care outcomes were not consistent with typical results.


It is important as a healthcare consumer to understand the history of the U.S. healthcare delivery system, how it operates today, who participates in the system, what legal and ethical issues arise as a result of the system, and what problems continue to plague the healthcare system. We are all consumers of health care. Yet, in many instances, we are ignorant of what we are actually purchasing. If we were going to spend $1,000 on an appliance or a flat screen television, many of us would research the product to determine if what we are purchasing is the best product for us. This same concept should be applied to purchasing healthcare services.

Increasing healthcare consumer awareness will protect you in both the personal and professional aspects of your life. You may decide to pursue a career in health care either as a provider or as an administrator. You may also decide to manage a business where you will have the responsibility of providing health care to your employees. And lastly, from a personal standpoint, you should have the knowledge from a consumer point of view so you can make informed decisions about what matters most—your health. The federal government agrees with this philosophy. Recently, the Centers for Medicare and Medicaid Services (CMS) used its claims data to publish the hospital costs of the 100 most common treatments nationwide. The purpose of this effort is to provide consumers with data on healthcare costs, because those costs vary considerably across the United States. This effort may also encourage price competition among healthcare services (Godert, 2013).

As the U.S. population’s life expectancy continues to increase, resulting in the “graying” of the population, the United States will be confronted with more chronic health issues, because more chronic conditions develop as we age. The U.S. healthcare system is one of the most expensive systems in the world. According to 2010 statistics, the United States spent $2.6 trillion on healthcare expenditures, or 17.6% of its gross domestic product (CMS, 2013a). The gross domestic product (GDP) is the total value of the finished products and services produced within a country in a year. These statistics mean that nearly 18% of all of the products made within the borders of the United States within a year are healthcare related. Estimates indicate that healthcare spending will be $4.6 trillion by 2020, which represents nearly 20% of the gross domestic product. In 2011, there were 48.6 million uninsured U.S. citizens, a decrease from 50 million in 2010 (Kaiser Family Foundation [KFF], 2013). The Institute of Medicine’s (IOM) 1999 report indicated that nearly 100,000 citizens die each year as a result of medical errors. Although there have been quality improvement initiatives in the healthcare industry, such as the Patient Safety and Quality Improvement Act of 2005, recent research indicates that medical errors in hospitals remain high (Classen et al., 2011).

Employers are offering fewer healthcare benefits. In 2002, 72% of employers offered health insurance benefits; by 2010, that share had dropped to 67.5%. The decline is most pronounced among smaller businesses with few employees who need benefits (Kliff, 2012).

These rates are some of the highest in the world but, unlike most developed countries, the United States does not offer healthcare coverage as a right of citizenship. Most developed countries have a universal healthcare program, which means access for all citizens. Many of these systems are run by the federal government, have centralized health policy agencies, are financed through different forms of taxation, and pay for healthcare services through a single payer, the government (Shi & Singh, 2008). France and the United Kingdom have been discussed as possible models for the United States to follow to improve access to health care, but these programs have problems and may not be the ultimate solution for the United States. Because the United States does not offer any type of universal healthcare coverage, citizens who are not eligible for government-sponsored programs are expected to provide for themselves through the purchase of health insurance or of the actual services. Many citizens cannot afford these options, with the result that they do not receive routine medical care. The passage of the Patient Protection and Affordable Care Act of 2010 (PPACA, or ACA) has attempted to increase access to affordable health care. One of the mandates of the Act is the establishment of state-run health insurance marketplaces, which provide opportunities for consumers to search for affordable health insurance plans. There is also a mandate that individuals who do not have health insurance purchase it, if they can afford it, or pay a fine. Both of these mandates should decrease the number of uninsured in the United States. These programs will be closely evaluated to assess whether their goals are achieved.


Basic Concepts of Health

Prior to discussing this complex system, it is important to identify three major concepts of healthcare delivery: primary, secondary, and tertiary prevention. These concepts are vital to understanding the U.S. healthcare system because different components of the healthcare system focus on these different areas of health, which often results in lack of coordination between the different components.

Primary, Secondary, and Tertiary Prevention

According to the American Heritage Dictionary (2001), prevention is defined as “slowing down or stopping the course of an event.” Primary prevention avoids the development of a disease. Health promotion activities, such as health education, are examples of primary prevention. Other examples include smoking cessation programs, immunization programs, and educational programs for pregnancy and employee safety. State health departments often develop large, targeted education campaigns regarding a specific health issue in their area. Secondary prevention activities focus on early disease detection, which prevents progression of the disease. Screening programs, such as high blood pressure testing, are examples of secondary prevention activities, as are colonoscopies and mammograms. Many local health departments implement secondary prevention activities. Tertiary prevention reduces the impact of an already established disease by minimizing disease-related complications; it focuses on rehabilitation and monitoring of diseased individuals. A person with high blood pressure who takes medication to control it, and the physician who prescribes that medication, are both examples of tertiary prevention. Traditional medicine focuses on tertiary prevention, although more primary care providers are encouraging and educating their patients on healthy behaviors (Centers for Disease Control and Prevention [CDC], 2007).

We, as healthcare consumers, would like to receive primary prevention to prevent disease. We would like to participate in secondary prevention activities such as screening for cholesterol or blood pressure because it helps us manage any health problems we may be experiencing and reduces the potential impact of a disease. And, we would like to also visit our physicians for tertiary measures so, if we do have a disease, it can be managed by taking a prescribed drug or some other type of treatment. From our perspective, these three areas of health should be better coordinated for the healthcare consumer so the United States will have a healthier population.

In order to understand the current healthcare delivery system and its issues, it is important to learn the history of the development of the U.S. healthcare system. This chapter discusses four major sectors of the healthcare system that have shaped its current operations: (1) the history of practicing medicine and the development of medical education, (2) the development of the hospital system, (3) the history of public health, and (4) the history of health insurance. In Tables 1-1 to 1-4, several important milestones are listed by date, illustrating historic highlights of each system component. The list is by no means exhaustive, but it provides an introduction to how each sector has evolved as part of the U.S. healthcare system.


The early practice of medicine did not require a major course of study, training, board exams, and licensing, as is required today. During this period, anyone with the inclination to set up a physician practice could do so; clergy and tradesmen such as barbers often served as medical providers. The red and white striped poles outside barber shops represented blood and bandages, because the barbers were often also surgeons. They used the same blades to cut hair and to perform surgery (Starr, 1982). Because there were no restrictions, competition was very intense. In most cases, physicians did not possess any technical expertise; they relied mainly on common sense to make diagnoses (Stevens, 1971). During this period, there was no health insurance, so consumers decided when they would visit a physician and paid for their visits out of their own pockets. Often, physicians treated their patients in the patients’ homes. During the late 1800s, the medical profession became more cohesive as more technically advanced services were delivered to patients. The establishment of the American Medical Association (AMA) in 1847 as a professional membership organization for physicians was a driving force for the concept of private practice in medicine. The AMA was also responsible for standardizing medical education (AMA, 2013a; Goodman & Musgrave, 1992).

In the early history of medical education, physicians established large numbers of medical schools because they were inexpensive to operate, increased physicians’ prestige, and enhanced their income. A medical school required only four or more physicians, a classroom, some discussion rooms, and the legal authority to confer degrees. Physicians received the students’ tuition directly and operated the school from this income. Many physicians would affiliate with established colleges to confer degrees. Because there were no entry restrictions, enrollment grew; the existing internship arrangement with practicing physicians dissolved, and the Doctor of Medicine (MD) became the standard. Although there were major issues with the quality of education provided, owing to the lack of educational requirements, medical school education became the gold standard for practicing medicine (Sultz & Young, 2006). The publication of the Flexner Report in 1910, which evaluated medical schools in Canada and the United States, was responsible for forcing medical schools to develop curricula and admission testing, both of which remain in place today.