Medicine and Medical Education

TABLE 1-1 Milestones of Medicine and Medical Education, 1700–2015

■ 1700s: Training and apprenticeship under one physician was common until hospitals were founded in the mid-1700s. In 1765, the first medical school was established at the University of Pennsylvania.
■ 1800s: Medical training was provided through internships with existing physicians, who often were poorly trained themselves. In the United States, there were only four medical schools, which graduated only a handful of students. There was no formal tuition and no mandatory testing.
■ 1847: The AMA was established as a membership organization for physicians to protect the interests of its members. It did not become powerful until the 1900s, when it organized its physician members into county and state medical societies. The AMA wanted to ensure these local societies were protecting physicians' financial well-being. It also began to focus on standardizing medical education.
■ 1900s–1930s: The medical profession was represented by general or family practitioners who operated in solo practices. A small percentage of physicians were women. Total expenditures for medical care were less than 4% of the gross domestic product.
■ 1904: The AMA created the Council on Medical Education to establish standards for medical education.
■ 1910: The formalization of medical education is attributed to Abraham Flexner, whose evaluation of medical schools in the United States and Canada indicated that many schools were substandard. The Flexner Report led to standardized admissions testing for students, the Medical College Admission Test (MCAT), which is still used as part of the admissions process today.
■ 1930s: The healthcare industry was dominated by male physicians and hospitals. Relationships between patients and physicians were sacred. Payments for physician care were personal.
■ 1940s–1960s: When group health insurance was offered, the relationship between patient and physician changed because of third-party payers (insurance). In the 1950s, federal grants supported medical school operations and teaching hospitals. In the 1960s, the Regional Medical Programs provided research grants and emphasized service innovation and provider networking. As a result of the enactment of Medicare and Medicaid in 1965, the responsibilities of teaching faculty expanded to include clinical responsibilities.
■ 1970s–1990s: Patient care dollars surpassed research dollars as the largest source of medical school funding. During the 1980s, third-party payers reimbursed academic medical centers with no restrictions. In the 1990s, with the advent of managed care, reimbursement was restricted.
■ 2014: According to the 2014 Association of American Medical Colleges (AAMC) annual survey, over 70% of medical schools have implemented or will implement policies and programs to encourage medical students to enter primary care specialties.

TABLE 1-2 Milestones of the Hospital and Healthcare Systems, 1820–2015

■ 1820s: Almshouses or poorhouses, the precursors of hospitals, were developed to serve primarily poor people. They provided food and shelter to the poor and consequently treated the ill. Pesthouses, operated by local governments, were used to quarantine people who had contagious diseases such as cholera. The first hospitals were built in areas such as New York City, Philadelphia, and Boston and were often used as a refuge for the poor. Dispensaries or pharmacies were established to provide free care to those who could not afford to pay and to dispense drugs to ambulatory patients.
■ 1850s: A hospital system was finally developed, but hospital conditions were deplorable because of unskilled providers. Hospitals were owned primarily by the physicians who practiced in them.
■ 1890s: Patients went to hospitals because they had no choice. More cohesiveness developed among providers because they had to rely on each other for referrals and access to hospitals, which gave them more professional power.
■ 1920s: The development of medical technological advances increased the quality of medical training and specialization and the economic development of the United States. The establishment of hospitals became the symbol of the institutionalization of health care. In 1929, President Coolidge signed the Narcotic Control Act, which provided funding for construction of hospitals for patients with drug addictions.
■ 1930s–1940s: Hospitals that had once been physician owned were now owned by church groups, larger facilities, and government at all levels.
■ 1970–1980: The first Patient Bill of Rights was introduced to protect healthcare consumer representation in hospital care. In 1974, the National Health Planning and Resources Development Act required states to have certificate of need (CON) laws to qualify for federal funding.
■ 1980–1990: According to the AHA, 87% of hospitals were offering ambulatory surgery. In 1985, the EMTALA was enacted, which required hospitals to screen and stabilize individuals coming into emergency rooms regardless of the consumers' ability to pay.
■ 1990–2000s: As a result of the Balanced Budget Act cuts of 1997, the federal government authorized an outpatient Medicare reimbursement system.
■ 1996: The medical specialty of hospitalists, who provide care once a patient is hospitalized, was created.
■ 2002: The Joint Commission on Accreditation of Healthcare Organizations (now The Joint Commission) issued standards to increase consumer awareness by requiring hospitals to inform patients if their healthcare results were not consistent with typical results.
■ 2002: The CMS partnered with the AHRQ to develop and test the HCAHPS (Hospital Consumer Assessment of Healthcare Providers and Systems) survey. Also known as the CAHPS Hospital Survey, the HCAHPS is a 32-item survey for measuring patients' perceptions of their hospital experience.
■ 2007: The Institute for Healthcare Improvement launched the Triple Aim, which focuses on three goals: improving patient satisfaction, reducing health costs, and improving public health.
■ 2011: In 1974, a federal law was passed that required all states to have certificate of need (CON) laws to ensure the state approved any capital expenditures associated with hospital/medical facilities' construction and expansion. The act was repealed in 1987, but as of 2014, 35 states still had some type of CON mechanism.
■ 2011: The Affordable Care Act created the Centers for Medicare and Medicaid Services' Innovation Center for the purpose of testing "innovative payment and service delivery models to reduce program expenditures … while preserving or enhancing the quality of care" for individuals who receive Medicare, Medicaid, or Children's Health Insurance Program (CHIP) benefits.
■ 2015: The Centers for Medicare and Medicaid Services posted its final rule reducing Medicare payments to hospitals that exceed limits on readmissions of Medicare patients within 30 days.

TABLE 1-3 Milestones in Public Health, 1700–2015

■ 1700–1800: The United States was experiencing strong industrial growth. Long work hours in unsanitary conditions resulted in massive disease outbreaks. U.S. public health practices targeted reducing epidemics, or large patterns of disease in a population. Some of the first public health departments were established in urban areas as a result of these epidemics.
■ 1800–1900: Three very important events occurred. In 1842, Britain's Edwin Chadwick produced the General Report on the Sanitary Condition of the Labouring Population of Great Britain, which is considered one of the most important documents of public health. This report stimulated a similar U.S. survey. In 1854, Britain's John Snow performed an analysis that determined contaminated water in London was the cause of a cholera epidemic. This discovery established a link between the environment and disease. In 1850, Lemuel Shattuck, drawing on Chadwick's report and Snow's activities, developed a state public health law that became the foundation for public health activities.