The United States does not have a genuinely free-market system of health insurance, let alone a functioning competitive market for insurance or health services. In fact, the federal government has been the dominant force in American health care for decades, long before the massive expansion of the government’s role in the 2010 Patient Protection and Affordable Care Act (PPACA).[1] Through restrictive regulatory policies, Medicare, Medicaid, and tax subsidies, the federal government has long shaped the operation of the U.S. health care system.