Monday, September 17, 2018

Hospitals Selling Insurance?

The Tampa Bay Times ran this story recently, In Florida and everywhere, a big shift is underway. It's changing the way we go to the doctor. "Hospitals are getting into the insurance end of the business. Insurers, along with drug stores, are delivering front-line health care... And consumers, confronted with blurring lines and a host of new options, may need a scorecard to keep up. The shifting ground continues to change where and how they go to the doctor." The article notes the trend of insurers buying doctors' practices, resulting in coordinated groups of medical offices providing health care under the company's name. It also notes that drug stores now offer health care services, so you can get more than just a prescription filled.

What is driving this evolution? The article explains that "[d]riving many of the changes is the Affordable Care Act, which helped usher in a shift in thinking about the cost of health care. Hospitals are penalized more often by insurance companies and the government when patients have more frequent stays. The focus now ... is keeping patients out of the emergency room." Other drivers include "population growth, new technology, government rules and evolving patient preferences."

This is a significant shift in the roles that hospitals and insurers play in the provision of health care. Just think about the ramifications.

https://lawprofessors.typepad.com/elder_law/2018/09/hospitals-selling-insurance.html

Consumer Information, Current Affairs, Health Care/Long Term Care, Medicare | Permalink