While the political debate over how best to reform America's health care system continues, one thing is certain: by an overwhelming margin, Americans trust doctors, ahead of both the government and insurers, to recommend the medical care and procedures they need to protect their health.