Over the last several years, billions of taxpayer dollars have been paid out to subsidize the adoption of Electronic Health Records (EHRs) in hospitals, clinics, and physician practices. On one level this has been a great success: EHR use in the medical community has soared, even as physicians and others have complained about inefficient workflows, unanticipated expenses, and burdensome reporting requirements. There have been other benefits as well, such as efficiencies in reimbursement, dramatically improved abilities to monitor quality, and opportunities for patient engagement through newly enabled features like patient portals.
On another, less noted, level, EHRs have been an unmitigated disaster. Often reported in sensational terms, but rarely analyzed for their root causes, healthcare data breaches have become both commonplace and stunning in scale. A recent report stated that over the last six years, 155 million patients have had their medical data breached, and that more than 80% of providers have seen a data breach in the last two years (a cynic would say the remainder just don't know they've been breached yet). While data breaches for profit are the rule, eyebrows were barely raised when espionage by foreign governments or their agents was listed as the likely reason for some large-scale breaches of medical records.
When HIPAA was originally passed in the 1990s, its primary purpose was spelled out in its name: the Health Insurance Portability and Accountability Act. The law was meant to allow health insurance to move with patients and to encourage efficiency in the payment process, largely through the adoption of standards for coding electronic transactions.
Today, however, HIPAA is known primarily as a set of rules and practices designed to protect patient confidentiality and the security of electronic records. Twenty years after its passage, HIPAA remains a topic of conversation, and audits for HIPAA compliance and the EHR subsidy program have documented widespread failure to comply with the processes and standards HIPAA mandates.
The authors of the original HIPAA legislation recognized that moving from paper records and transactions to electronic ones carried an increased risk of both large-scale data breaches and abuses of highly portable electronic healthcare information, so the law placed a heavy burden on healthcare organizations to protect against those risks. Advocates cited, among other things, the risk that data breaches and unauthorized uses of information could erode trust in healthcare providers and make patients reluctant to confide in, or even see, their providers. Compliance with these rules was encouraged by a series of substantial penalties for violations: an "all stick, no carrot" approach to securing electronic Protected Health Information.
The Meaningful Use program, by contrast, rewarded the adoption and use of EHRs with tens of thousands of dollars per provider. At the same time, compliance with HIPAA's security mandates was, once again, required but not specifically funded. Over several years, audits of the Meaningful Use program have shown that failures to comply with HIPAA, most often the required Security Risk Assessment, are the most common faults found among audited entities. And then there's the matter of those 155 million patients whose healthcare records have been breached during the life of the Meaningful Use program.
As the Meaningful Use program fades into the sunset and MACRA looms as the next step in the digital evolution of healthcare, it's time to take stock of where we've been and apply the lessons (hopefully) learned. That reflection should include an analysis of how effective the unfunded-mandate approach to healthcare information security has been. Here are some suggestions for fixing things.
- Reward good security: a percentage of the funds for any new program such as MACRA should be tied to subsidized information security assessments and remediation of the problems they identify. The stick hasn't worked; it's time to put some carrots out there.
- Explicitly encourage diligence: a breach detected and closed quickly should be penalized less than one that sits open for months or years before being closed.
- Make good security practices visible to patients: in my hometown of Seattle, the Health Department posts health inspectors' reports online. In some cities, health departments assign restaurants a numeric or letter grade for hygienic practices, displayed for patrons. Both approaches could be applied to healthcare.
- Encourage proactive security benchmarks: HIT security is a bit like preventive health: when it works, nothing happens. The challenge is to define and reward preventive security practices. To use an analogy: reward high vaccination rates rather than punishing the outbreaks that result from low ones.
- Commoditize security: particularly as healthcare moves to the cloud, there is an opportunity for healthcare organizations to achieve higher levels of security. A long-standing suspicion of cloud security is finally giving way to a more realistic view: a good cloud provider manages software and hardware at a scale that supports dedicated security staff and the hardening and automated management of systems. Benchmark certifications for hosted-system security exist, and providers should be rewarded for choosing these solutions over the proverbial "server in a closet."
- Reward software providers who prioritize security: the software on the network-attached storage drives on my home network includes a simple security assessment dashboard. It examines key security settings, provides a simple visual report of their current status, and issues reminders when routine maintenance tasks aren't completed; if something is wrong, a click on the dashboard item lets you correct it. EHR software providers should take this approach, rather than burying settings deep in nested menus and overwhelming users with massive, needle-in-a-haystack audit logs.
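The dashboard idea in the last bullet can be sketched in a few lines of code. This is a minimal illustration, not any vendor's actual implementation; the setting names and recommended values are hypothetical, chosen only to show the pattern of comparing current settings against a checklist and flagging what needs attention.

```python
# Minimal sketch of a security-assessment dashboard check.
# All setting names and recommended values below are hypothetical.

RECOMMENDED = {
    "encryption_at_rest": True,
    "auto_updates": True,
    "default_admin_password_changed": True,
    "audit_log_retention_days": 90,
}

def assess(settings: dict) -> list[tuple[str, str]]:
    """Compare current settings against recommendations.

    Returns (setting, status) pairs, where status is "OK" or "FIX".
    """
    report = []
    for key, recommended in RECOMMENDED.items():
        current = settings.get(key)
        if isinstance(recommended, bool):
            ok = current is True           # boolean setting must be enabled
        else:
            ok = current is not None and current >= recommended  # numeric minimum
        report.append((key, "OK" if ok else "FIX"))
    return report

if __name__ == "__main__":
    current = {
        "encryption_at_rest": True,
        "auto_updates": False,
        "default_admin_password_changed": True,
        "audit_log_retention_days": 30,
    }
    for setting, status in assess(current):
        print(f"[{status}] {setting}")
```

The point is the user experience: a short, actionable list of "OK" and "FIX" items, rather than raw settings scattered across nested menus.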
The problem, of course, is that security isn't sexy, and as with preventive health, when it works "nothing" happens. Ultimately, however, there are deep social and economic consequences to the current approach to security. Healthcare data is attractive to the bad guys in part because healthcare records routinely contain information that is difficult to change (e.g., date of birth, Social Security number, home address) and is therefore more useful to identity thieves than an easily canceled credit card number.
Patient trust and provider reputation are on the line under our current regulatory approach to healthcare information security. Beyond that, the current state of healthcare information security is what lawyers call an "attractive nuisance": hugely valuable to identity thieves and others, yet haphazardly protected. Let's use this opportunity to create HIT Security 2.0 and change the incentives for everyone involved.