As part of its ongoing efforts to accelerate the transition to value-based care, the Centers for Medicare & Medicaid Services (CMS) introduced a pilot program in the summer of 2019 that would give health care providers access to the claims data of Medicare patients. By providing clinicians with a more structured and complete patient history, the Data at the Point of Care (DPC) program aims to minimize or eliminate contraindications, duplication of services, and data blocking.
The voluntary program is one of a number of intertwined laws and regulations designed to promote the interoperability of EHRs, which, despite years of government incentive efforts, still struggle to exchange data with one another.
At the White House event where the DPC pilot was unveiled, CMS Administrator Seema Verma noted: “The government spent more than $36 billion to encourage the adoption of EHRs but failed to make sure the systems could actually talk to each other. We’re now left with a health care industry that still uses fax machines.”
Because of its small scale, details of the program are not yet widely known. However, practices that wish to participate are expected to bear the cost of working with EHR vendors to build tools that connect with DPC’s application programming interface (API).
Data Share
CMS considers DPC a natural progression from its Blue Button 2.0 program, launched in 2018, which is designed to grant Medicare beneficiaries access to their medical information via third-party apps and computer programs. Both Blue Button and DPC are part of the agency’s MyHealthEData initiative, an administration-wide push for greater interoperability and transparency of services.
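For context, Blue Button 2.0 exposes a beneficiary’s Medicare claims as FHIR resources to applications the beneficiary has authorized through OAuth 2.0. Below is a minimal sketch of how such a third-party app might pull that data, assuming the OAuth flow has already been completed; the sandbox URL and token are illustrative placeholders, not documented values:

```python
import requests

# Illustrative sandbox endpoint and placeholder token; a real app obtains a
# token through Blue Button 2.0's OAuth 2.0 flow with the beneficiary's consent.
BASE_URL = "https://sandbox.bluebutton.cms.gov/v1/fhir"
ACCESS_TOKEN = "beneficiary-authorized-token"

# Request the beneficiary's claims as FHIR ExplanationOfBenefit resources.
resp = requests.get(
    f"{BASE_URL}/ExplanationOfBenefit",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()

# The response is a FHIR Bundle; each entry is one claim, carrying the
# diagnoses, procedures, and payment details the app can show the beneficiary.
for entry in resp.json().get("entry", []):
    print(entry["resource"]["id"])
```

Where Blue Button works one beneficiary at a time, DPC applies the same FHIR building blocks at the practice level.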
The pilot program is based on an industry-accepted API that uses Health Level Seven’s Fast Healthcare Interoperability Resources (FHIR) standard, one of the most widely used specifications for exchanging data among disparate computer systems, according to CMS. Although DPC will not provide access to a patient’s complete medical record, it will include claims data detailing previous diagnoses, past procedures, and medication lists. To meet the program’s criteria, the accessible data should merge seamlessly with an existing EHR’s workflow. In other words, logging into a separate application should not be necessary.
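That vendor-built connection is reported to center on the asynchronous FHIR Bulk Data export pattern. The sketch below shows that standard flow under stated assumptions: the sandbox base URL, access token, and attribution group ID are illustrative placeholders, and the real values would come from DPC onboarding at https://dpc.cms.gov.

```python
import time
import requests

# Illustrative placeholders; the real base URL, credentials, and attribution
# group ID come from DPC onboarding at https://dpc.cms.gov.
BASE_URL = "https://sandbox.dpc.cms.gov/api/v1"
ACCESS_TOKEN = "practice-access-token"
GROUP_ID = "attributed-patient-roster-id"

HEADERS = {
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "Accept": "application/fhir+json",
}

# Kick off an asynchronous bulk export of claims-derived resources for the
# practice's attributed patients (the FHIR Bulk Data $export pattern).
kickoff = requests.get(
    f"{BASE_URL}/Group/{GROUP_ID}/$export",
    headers={**HEADERS, "Prefer": "respond-async"},
    params={"_type": "ExplanationOfBenefit,Patient,Coverage"},
)
kickoff.raise_for_status()
status_url = kickoff.headers["Content-Location"]  # job-status URL to poll

# Poll until the export job finishes: 202 means still running, 200 means done.
while True:
    status = requests.get(status_url, headers=HEADERS)
    status.raise_for_status()
    if status.status_code == 200:
        break
    time.sleep(5)

# The finished job lists newline-delimited JSON (NDJSON) files, one FHIR
# resource per line, ready to be mapped into the EHR's own data model.
for output in status.json()["output"]:
    ndjson = requests.get(output["url"], headers=HEADERS).text
    print(output["type"], len(ndjson.splitlines()), "resources")
```

The nontrivial step is the last one: translating those raw FHIR resources into the EHR’s native chart views so clinicians never have to leave their workflow.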
“Part of the challenge is going to be, you’ve got this API and you’ve got these data. But now you’ve got to get it to interface with your electronic record system,” says David J. Zetter, CHBC, a consultant based in Mechanicsburg, PA. “And there’s no way an EHR vendor is going to build that interface for free.” Zetter estimates that fees for such services could run from a few thousand dollars up to possibly as much as $10,000.
In the summer of 2019, CMS started accepting volunteers for the pilot program at https://dpc.cms.gov. It began testing the program in November and plans to slowly offer it to additional practices after that.
EHR Fatigue
Under another regulatory proposal issued earlier this year, CMS would require many health care plans to comply with rules similar to Blue Button 2.0 by making patient data available through an API at no cost. The rule would apply to approximately 85 million people covered by Medicare Advantage, Medicaid, the Children’s Health Insurance Program, and health plans sold on federal exchanges, according to the agency.
The Office of the National Coordinator for Health Information Technology (ONC) also released proposed exceptions to the information-blocking regulations, along with standardized specifications for API development. Both CMS and ONC rules would require the industry-wide adoption of standardized APIs within two years after the rules are finalized.
Although there is near-universal agreement that interoperability must be enhanced, some wonder whether the administration is moving too quickly toward this goal. At a Senate hearing on the topic, Tennessee Sen. Lamar Alexander urged a slower, incremental timeline: “In 2015, I urged the Obama administration to slow down implementation of Stage 3 of the Meaningful Use program, which incentivized doctors and hospitals to adopt EHRs,” he said. “They did not slow down, and looking back, the results would have been better if they had.”
Industry and health care provider organizations have also called for a more measured approach. In September 2019, several organizations—including the American Medical Association, the Federation of American Hospitals, and Premier Inc.—signed an open letter to Congress calling for, among other things, “appropriate implementation timelines” that do not “unreasonably increase provider burden or hinder patient care.”
“We definitely need this,” says Jackie Coult, CHBC, a senior health care consultant for Eide Bailly LLP, noting the importance of interoperability. “To get better outcomes, we have to know the full spectrum of the patient’s health.” But achieving this data-sharing goal will be a challenge, she believes. Part of the problem is simply providers’ fatigue with EHRs. Many providers bought systems 10 years ago that later proved subpar and needed to be replaced, sometimes more than once. They watched information technology (IT) costs slowly rise year by year as programs and equipment grew more complex. With that history in mind, it is unlikely that anything that adds to IT costs will be met with much enthusiasm.
Moreover, even if full, industry-wide data sharing is achieved, reviewing the shared data will still carry labor costs. “Is the government going to add an E&M [evaluation and management] code for the additional 15 to 20 minutes it takes to review those data?” Coult asks doubtfully. “Medicare needs to slow it down a little bit,” she suggests. “They are bombarding doctors.”