News 12/13/11

December 12, 2011 · News

AthletiCo Physical and Occupational Therapy (WI/IN) selects NextGen Ambulatory EHR/PM for its 50 physical and occupational therapy facilities.


HHS’ Office of Inspector General releases an advisory opinion regarding the exchange of EHR data for patient referrals. At issue: whether a vendor can facilitate a (presumably nominal) charge from a referring doctor to a specialist for a patient’s clinical record without violating the anti-kickback statute. In reading the advisory opinion, it appears the vendor will not only facilitate the charge, but also keep a portion of the fee. OIG’s verdict: the proposed “coordination” service would not violate the anti-kickback statute. At the vendor’s request, the OIG does not name the vendor, although it sounds a lot like athenahealth to me (unverified).


Danbury Radiological Associates (CT) selects ADVOCATE to provide billing services for its 20 physicians.


Quest’s Care360 EHR is named the top standalone e-prescribing platform by Black Book Rankings. Practice Fusion earned top scores for its EHR-based e-prescribing module.


e-MDs clients Drs. Mark Woodruff and William Weeks of Southwest Family Physicians are among the first providers in Nebraska to attest and receive Medicare payments for their meaningful EHR use.

The Chicago Health Information Technology REC says it’s on target to reach its enrollment goal of 1,486 primary care providers by the end of the year.

Altos Solutions will extend preferred access to its OncoEMR and OncoBilling platforms to members of the Innovatix non-acute care purchasing organization.

The California Health Information Partnership Services Organization (CalHIPSO) partners with CDW Healthcare to provide technology services and solutions to practices implementing EHRs.


DrFirst embeds technology from Atlas Medical into its Rcopia e-prescribing solutions, allowing Rcopia users to order tests and receive reports from any diagnostic testing facility on the Atlas iOn Interoperability Network.

Dermatology-specific EMR vendor Encite, Inc. integrates NetHealth’s Quality Report System into its product to automate PQRS reporting.

An employee hired to purchase and maintain the IT systems at Wasatch Internal Medicine (UT) faces second-degree felony theft charges for stealing nearly $350,000 from the clinic. Over a two-and-a-half-year period, the clinic paid Eric David Christensen almost $400,000 to purchase 51 PCs, warranties, and servers. An audit uncovered that only three computers had been supplied to the office and that none had warranties. When confronted, Christensen admitted he had been overcharging the clinic and had falsified records to cover up his actions.

A former physician with Fletcher Allen Health Care pleads guilty in US Court to unlawfully obtaining the private medical information of another person. Apparently Joshua Welch was having a sexual relationship with a woman who was not his patient and wanted to ascertain if she carried a sexually transmitted disease. He faces up to one year in prison and a $50,000 fine.


Overlake Imaging Associates (WA) selects Zotec Partners as its outsourced billing provider.


CMS posts the PQRS measures for 2012, including detailed specifications and release notes on the individual quality measures and measures groups. I found the release notes gave a pretty quick summary of what measures have changed and how.


On EHRtv, Medicomp CEO David Lareau talks Quippe with Eric Fishman. He discusses the multiple ways different EHR vendors are incorporating Quippe into their platforms.

The Adirondack Region Medical Home Pilot Program, with implementation support services from the Massachusetts eHealth Collaborative, earns Level 3 NCQA recognition for 29 of its 31 primary care practices.


E-mail Inga.

News 12/8/11

December 7, 2011 · News


From Wednesday’s HIT Policy Committee meeting come the following facts and figures:

  • Over 154,000 Eligible Professionals have registered for Meaningful Use through the end of November, including 115,000 in Medicare’s program and 38,000 in Medicaid’s.
  • Medicare has paid EPs almost $183 million; Medicaid has issued an estimated $237 million in incentives.
  • The number of EPs that have registered is less than 30% of all EPs.
  • Only 21,425 EPs have received payments through the end of November. That’s about 14% of registered EPs and 4% of all EPs.
  • At the time CMS crunched the numbers, 21,308 EPs had attested; 444 were unsuccessful.
  • Drug formulary, immunization registries, and patient lists are the most popular menu objectives.

The figures leave me with a number of questions, including why payment appears to be so slow (almost half the total payments were not made until November). I’m also curious why 444 EPs failed attestation. If you are in the know or have a theory, please share.
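For the curious, here is a quick back-of-the-envelope check of those percentages, using only the figures quoted above (the total-EP denominator is my own estimate, backed out of the “less than 30%” statement):

```python
# Rough sanity check of the Meaningful Use figures quoted above. The total-EP
# estimate is an assumption backed out of the "less than 30% of all EPs" claim.
registered_eps = 154_000   # EPs registered through the end of November
paid_eps = 21_425          # EPs that have received incentive payments

print(f"Paid share of registered EPs: {paid_eps / registered_eps:.0%}")            # ~14%

estimated_total_eps = registered_eps / 0.30   # lower bound implied by the "<30%" figure
print(f"Paid share of all EPs (estimated): {paid_eps / estimated_total_eps:.0%}")  # ~4%
```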


Anthony L. Jordan Health Center (NY) selects Phytel’s Atmosphere platform for patient outreach, appointment reminders, performance benchmarking, care coordination, and patient education.


Costco, in partnership with Allscripts reseller Etransmedia Technology, announces a nationwide launch of Allscripts MyWay EHR and PM, following sales success and high demand in select test markets. Costco is offering the MyWay package for as little as $499 a month, which includes hosting services, support, training, implementation, and unlimited claims processing.


Weight loss franchise company Pounds and Inches Away selects eMedicalFusion’s PM solutions, which are targeted at cash, hybrid, and concierge practices. eMedicalFusion also announces the availability of EMR interfaces with Amazing Charts and Eclipse.

Patient check-in company Phreesia releases a study of referral trends in physician offices, finding that 50% of new patients are referred by family and friends. I’m sure that individual practices find it beneficial to understand their own referral patterns, but does anyone really need aggregated referral information, given variations between primary care providers and specialists and across geographic regions? My guess is that Phreesia’s primary motive for compiling and publishing this information is to get the attention of pharma companies, payers, and others that might want to incorporate their own market research into Phreesia’s check-in system.


On the other hand, who doesn’t want to know about wait times? The nationwide average wait time to see a physician is 21 minutes. Patients in Wisconsin have the shortest average wait (15 minutes) and Mississippi has the longest (25 minutes). Wait times for primary care providers average six minutes less than for specialists. I’m not sure what you do with this data unless you’re looking for a reason not to move to Mississippi.


Maryland Health Care Commission adds OmniMD to its SelectVendor Product Portfolio.


E-mail Inga.

 

 


From the Consultant’s Corner 12/6/11

December 6, 2011 · News

Clinical Integration: The Right Time, the Right Thing to Do

“Uncertain” doesn’t even begin to describe the future of healthcare in the US. We all know that the Patient Protection and Affordable Care Act is law, but it’s facing public resistance and an upcoming Supreme Court challenge. What’s more, there’s little doubt that ongoing budget talks in Washington will bring big changes to Medicare and Medicaid. We fully expect that the push toward performance-based reimbursement will only get stronger.

Into this already-confusing swirl steps the just-released Accountable Care Organization (ACO) final rule. The rule reflects the comments of thousands of physicians in provider organizations across the country and is intended to empower them to join the trend toward clinical integration.

Although ACO participation is voluntary, most of the clients we speak with are convinced that this kind of system represents the future of medicine—and not just for Medicare patients under Medicare rules. Physicians—whether rural or urban, independent, or employed—will likely climb on board with the idea simply because it’s the right thing to do for patient care. Clinical integration across patient care settings is one of the best ways to enable providers to reach out to patients to offer better, more proactive services.

Under the new ACO rules, clinical integration can be accomplished in a number of different ways. Overall, the goal is to align providers and institutions with mutual, patient-centered objectives.

Though financial integration is one part of the picture, it’s definitely subordinate to the achievement of quality care. At their core, the rules require a commitment on both sides to the concept of aligning incentives for quality care—with the understanding that cost reductions will follow improvements in quality. They also require participants to recognize the undeniable role of IT in making it all happen.

Whether they’re headed by a physician chief executive officer (CEO) or a lay CEO, and whether they involve employed physicians or independent physician partners, all ACOs should share these four key characteristics:

  • Common mission, vision and values
  • Mutual respect and willingness to share risk
  • Focus on the patient experience
  • Commitment to quality care

Fortunately, some of the historical obstacles that have barred providers from working with large healthcare systems and payers are disappearing. For example, recent regulations released by the Centers for Medicare and Medicaid Services (CMS) significantly relax federal anti-trust laws. Everything is pointing in one direction: now, more than ever, is an opportune time for clinical collaboration.

For physicians, relationships with larger organizations make sense. They allow: easier access to the expertise inherent in the clinical programs of larger organizations; greater payer contracting experience; and the ability to leverage economies of scale. And let’s not forget information technology (IT) resources, which will play a huge role in the future of population management and coordinated care.

Clearly the pressure is on for providers and other healthcare stakeholders to align their mutual interests in order to cut the waste from our fragmented healthcare delivery system. With everyone on the same team, we can redefine the current ineffective, poorly coordinated, and unnecessarily costly healthcare system.

ACOs—together with other similar initiatives like patient-centered medical homes, bundled payments, and outcomes-based reimbursement—will become the foundation for delivering high-quality, cost-effective patient care for everyone. Ultimately, it doesn’t really matter whether ACOs start with Medicare or not. It doesn’t really matter whether the “ACO” moniker stays or goes. The concept of clinically integrated accountable care is here to stay.

So it’s up to providers to take the lead, developing clinically integrated organizations motivated by a common goal to improve access to quality, lower-cost care for all patients.


Brad Boyd is vice president of sales and marketing for Culbert Healthcare Solutions, a professional services firm serving healthcare organizations in the areas of operations management, revenue cycle, clinical transformation, and information technology.

News 12/6/11

December 5, 2011 · News

Greenway Medical Technologies selects Dell as a cloud-based hosting partner for its PrimeSUITE EHR/PM solution.


The Alaska Medical Association makes DocBookMD available for free to its member physicians. The mobile application facilitates secure communication and the sharing of clinical information between providers.


The 88 providers at Core Physicians (NH) successfully attest for the Meaningful Use of their NextGen EHR.


CMS releases what it calls a “comprehensive tool” to guide eligible providers through the Meaningful Use program. The online resource covers everything from background on the MU program to eligibility measures, attestation, and payment. I agree that it is comprehensive and found it easy to navigate; however, I find it curious that it’s just now being released by CMS.


The non-profit Doctors Helping Doctors Transform Health Care officially launches December 1. The organization supports healthcare transformation, initially through the greater use of HIT, and is funded by unrestricted grants. The website includes various blog posts and videos that feature lessons learned and best practices from physicians who have already adopted HIT. Janet Marchibroda, founding CEO of the eHealth Initiative, serves as the organization’s executive director.


As Mr. H so aptly put it, this weekend’s post by Micky Tripathi should be “mandatory reading for just about everybody.” In case you missed it, Micky shares how his organization handled a patient data breach. It’s unbelievably well-documented and includes a step-by-step account of how his organization addressed the breach. A couple of readers have called it the “HIStalk Post of the Year.” Even you über-busy types will thank me (and especially Micky) for taking two minutes to review the lessons learned at the end of the piece.

gloStream selects DiagnosisOne’s smartConsult clinical support solution to integrate into its gloEMR offering. More on DiagnosisOne in this overview video.

The Michigan Center for Effective IT Adoption (M-CEITA) announces that over 3,724 Michigan providers have committed  to work with it on their adoption of EHRs.

CureMD Healthcare wins an eHealthcare Leadership award in the category of Business/Process Improvement Applications or Products.


A survey by the Optum Institute for Sustainable Health suggests that many physicians are still unsure if they will participate in ACOs. Furthermore, a significant number of physicians claim they are not even familiar with ACOs.

A patient sues his dentist after the dentist fines him $100 a day for posting negative reviews of his treatment. Before receiving treatment for an infected cavity, the patient was compelled to sign a privacy agreement. The patient later posted notes on two consumer websites, calling the dentist a “scammer” and warning others to “avoid at all cost.” The dentist then began invoicing the patient $100 for each day the review stayed online. The patient is asking for the privacy agreement to be declared null and void and for the dentist to be barred from presenting such agreements to future patients. The patient also wants to be refunded the $4,766 he paid for his treatment.


E-mail Inga.

First-Hand Experience with a Patient Data Security Breach 12/3/11

December 3, 2011 · News

By Micky Tripathi, President and CEO
Massachusetts eHealth Collaborative

This column is about a security incident that my company caused earlier this year. It’s a long and detailed description. I’m confident that those of you who have never been through a security incident will find the article tedious. I invite you to read just the ending on lessons learned, or save your hard-earned free time for one of HIStalk’s other excellent columns.

I’m equally confident that those of you who are going through a security incident (or have been through one) will find the details fascinating, not only because misery loves company, but also because you realize through hard experience that protecting privacy and security is about incredible attention to the small stuff. My experience is that large organizations know this because they pay professionals to worry about it, but most small practices have little to no idea of the avalanche that can follow from the simple loss of a laptop. Anyway, here it goes.

We had a security breach a few months ago. One of our company laptops was stolen from an employee’s car. As luck would have it, the laptop contained some patient data files.

My inner Kubler-Ross erupted.

  • Denial. Noooooooooooooooooooo!!! This is surely a nightmare and I’m going to wake up any minute.
  • Anger. How dare someone steal our property??!! Who the heck would leave a company laptop unattended in a car parked in a city neighborhood??!!
  • Bargaining. Are you sure it was OUR laptop?? Maybe it didn’t have any patient data on it?
  • Depression. We’re doomed. Patients’ privacy may be exposed. Some may suffer real harm or embarrassment. They’re going to hate their providers, and their providers are going to hate us. Word will spread, trust in us will erode, we’ll struggle to get new business, we may get fined or sanctioned by state and/or federal authorities, we may get sued by providers or patients or both. My kids won’t go to college, I’ll lose my house, my parents will be disgraced.
  • Acceptance. OK, let’s get to work. We have an obligation to our customers, our board, and ourselves to affirmatively take responsibility for our errors, be transparent with all stakeholders, manage the process with operational excellence, and share our lessons learned so that others can hopefully learn from our blunders.

I had to go through all five stages before mustering the energy to write this column.


Our mistakes

Last spring, one of our practice implementation specialists left a briefcase containing a company laptop in their car while they were at lunch. Someone broke into the car and stole the briefcase. We called the police immediately. Since the laptop was in a briefcase and the car was in a random neighborhood (i.e., not parked in a hospital or medical office parking lot), this was almost certainly a random theft and not a specific targeting of patient information.

Luckily for us, we had a fresh backup of the laptop, so we knew exactly what was contained on it. We are an implementation services company, not a provider organization, so you’d think we wouldn’t have any patient records on our machines. Wrong!

One of the things that we routinely do is assist customers with transferring patient demographic data from their old practice management systems to their new EHR/PM systems. And while most of this transfer happens machine-to-machine behind secure firewalls, there are always a number of “kick-outs” — individual patient demographic records that the new system rejects for one reason or another. Part of our job is to examine these error logs and work with our customers to determine how to remediate these rejected records. We usually do this on-site in a secure environment, but sometimes there isn’t enough time to do that during office business hours. In those cases, we copy these error logs onto our laptops and work them offsite.

These files shouldn’t add up to much, right? A file here, a file there, all of which are deleted once we’re through with them. Wrong again! Our forensic analysis of the laptop backup showed that it contained information on 14,314 individuals from 18 practices. On top of that, there were some paper copies of appointment schedules for another 161 patients, making the grand total 14,475. Fresh in my mind was a recent case in which Massachusetts General Hospital was slapped with a $1M fine and front-page headlines after an employee left the medical records of 192 patients on a subway. Uh-oh. I had to sit down.

The bad news kept on coming. In April 2010, we had instituted a company-wide policy requiring encryption of any files containing patient information. If the laptop or the files had been appropriately encrypted, this theft would not have been a breach issue. Turns out that we had been shopping around for whole disk encryption options to reinforce our security policy, but regrettably we hadn’t yet implemented a solution at the time of this incident.

It’s not that the data on that laptop wasn’t well-protected – it was. We are part of a professionally managed enterprise network environment, so our laptops have strong, domain-level usernames and passwords that are routinely changed, and accounts are disabled after a pre-determined number of failed log-in attempts or if too much time elapses between network log-ins. Furthermore, the error log files containing the patient information were themselves password protected. In short, the chance that anyone would gain access to the information was infinitesimally small. This was a random theft, and it would take strong intent and some technical sophistication to get past the protections that were in place.

And yet … the files were no longer in our control and, without encryption, were indisputably vulnerable. I’d heard the term “my knees weakened” before, but had never experienced it myself … up until that moment, that is.


Our obligations

Privacy and security incidents of this type are governed by federal and state laws and regulations. The federal law is HIPAA and its HITECH modifications. The relevant Massachusetts state law is a recently enacted data protection law which went into effect on March 1, 2010.

We have terrific attorneys. Though we’re a small, non-profit company, we’ve never gone cheap on legal advice or insurance. I was thus surprised and somewhat distressed at how hard it was to disentangle the thicket of state and federal laws in play. Not because our attorneys are bad, but because state and federal regulations are imprecise, not well-aligned, and constantly changing.

Add to that the fact that the rules to implement the HITECH modifications are still just proposals and not final regulations yet, and what we were left with was a grab-bag of statutory and legal piece-parts that we ourselves had to assemble without any instructions or diagrams. For a really excellent description of some of the challenges in the current legal framework, see my friend Deven McGraw’s testimony before the Senate Judiciary Committee’s Sub-Committee on Privacy, Technology, and Law.

It was clear that this wasn’t something that we could just delegate to the lawyers while we went on our merry way. There were complex business and legal judgments to sort through, and the fact that it happened at all suggested that we needed to take a good hard look at ourselves and see what went wrong.

We immediately put together a small crisis team comprising our key customers, our attorneys, and, from our company, me, our security officer, and our customer project lead. We set up daily end-of-day conference calls to manage and monitor the crisis.

Our interpretation of the legal and statutory requirements was as follows:

Was this a breach? The files were unencrypted and contained individually identifiable information, so the answer was an emphatic “YES”, from both a state and federal perspective. In the case of Massachusetts law, a breach is when “personal information” is disclosed, which includes name plus any of SSN, driver’s license number, financial or credit/debit card account numbers. At the federal level, protected health information (PHI) includes “individually identifiable health information” that relates to health care status, treatment, or payment.

Who needs to be notified? State rules say that we had to notify patients, the state Attorney General, and the Office of Consumer and Business Affairs. Federal rules also require patient notification (within 60 days), as well as notification to the federal Office for Civil Rights. The federal rules (well, the draft federal rules) have an additional kicker, though. Breaches exceeding 500 individuals are posted on the OCR website (the so-called “Wall of Shame”) and the breaching organization must “provide notice to prominent media outlets serving the State or jurisdiction.” Yes, that’s right – tell the media about your goof-up and then become permanently enshrined on a government website to boot. Serious stuff. Very serious stuff.

How many ways are there to count to 500? One thing that I hadn’t fully appreciated was that just because an individual’s information was released doesn’t mean that it constitutes a breach. Under our state law, if the information is “publicly available,” such as address or phone number, it doesn’t count as a breach. Under federal law, it is a breach, but notification is only required in cases which “pose a significant risk of financial, reputational, or other harm to the individual affected.” In short, even though we had spilled information on over 14,000 patients, not all of them required notification, and from a federal perspective, not all of them counted toward the magic number of 500. Figuring this out turned out to be somewhat of a Herculean effort and much more art than science.

Who’s responsible? This would seem like a stupid question, especially since I’ve already said that we were responsible, but given that this is healthcare, we were given the unique opportunity to drag others down with us. This question gets into the relationship between covered entities (like physician practices) and business associates (like contractors who work for physician practices and need access to PHI in order to do their jobs.)

We were hired by a physician contracting network to help manage the EHR implementations of hundreds of their member practices. In HIPAA terms, the practices are the covered entities, the network is the business associate to each of these practices, and we were the contractor to the network with a “downstream” business associate agreement (meaning that we’re not connected directly to the practices, but only through our contract with the network.) Clear as mud, right? As complicated as this seems, this type of arrangement (and even more Rube Goldberg-like varieties) is NOT unusual, and indeed is very common and will become even more so as we move to accountable care and other types of complex business relationships.

It started to become clear as we worked our way through this that we were headed for a perverse outcome. Though we were the ones who made the error, we were acting on behalf of physician practices who were ultimately responsible for the stewardship of their patients’ information. Thus, in the eyes of the law, primary responsibility fell on them, not us. Indeed, as we were only later to find out, there was some question about whether the federal Office for Civil Rights had any jurisdiction over us because we were a “contractor,” not a direct BA to the practice, and thus possibly outside of their reach.

Seriously??!! For those of you who are into the inside baseball of federal healthcare privacy law, the HITECH statute does not specifically give OCR authority over contractors, whereas OCR’s draft regulations do. Until those regulations are made final, we won’t really know whether they have such authority or not. (The final OCR rule is supposed to be issued very soon.) This ambiguity did not change our response or our diligence to comply with all state and federal laws, but in my opinion, this clearly points to a huge gap in the current monitoring and enforcement framework. OCR should have the authority to follow a data spill as far into the contracting chain as they need to go.


Our response

The days after the incident were a vortex. Everything, big and small, got shoved aside as we scrambled to assess the situation and develop our response.

The first thing we did was to notify our attorneys, our customer, my board chair, our staff, and our liability insurer – in that order. We needed our attorneys to get to work on understanding the situation and our legal requirements, and to also give us a high-level assessment of the possible scope of damage. Armed with that information, we notified our customer, making clear that we did not yet know all of the details or all of the legal ramifications, but that we took full responsibility and we would cover whatever costs and activities that resulted from our error.

At the recommendation of our attorneys, we also hired a private investigator to troll Craigslist and local pawn shops to try and recover the laptop. If we could recover it and show that it hadn’t been accessed in the time that it was out of our control, we’d arguably be in the clear. (Though NOT from our own internal investigation, which would have continued apace – just because we didn’t suffer the consequences wouldn’t reduce the gravity of the mistake.)


We had to get our arms around the problem, and the only way my linear mind knows how to do that is to put together a project plan. There were so many moving parts, and so many judgments to make, that we had to treat this just like any complex project that we manage on our business side. The project plan that we developed is here. The most intense activities on it were:

Determine what information was released on each individual. The files contained mostly patient names, but some providers were listed as well. About 80% of the individuals were patients, but 20% were providers, the vast majority of whom were referring providers, not the providers from the practices themselves. We tend to focus on patient information when we think of breaches, but hey, providers are people too, you know. We also determined that of the 14,475 records, 688 were duplicates (i.e., records on the same patient), so we were able to reduce our count to 13,687.

Each individual record turned out to contain some combination of the following: name, address, phone, SSN, DOB, MRN, date of service, appointment, insurance subscriber number, insurance subscriber name, insurance plan. But it was a patchwork. No single record had all of these pieces, so we had a lot of analysis to do. Most of these categories were straightforward, but some needed further deciphering. One bit of good news was that the MRN in this case was not an active MRN but an obsolete one – it was the legacy MRN from the old system and would be replaced by a new MRN in the new EHR system.

Determine which patients would have “significant risk of harm.” I thought that this would be a concept well established in law or regulation, but it turned out to require considerable judgment on our part. As I noted earlier, these files were error logs from data migration, so they were not clean – the information on each patient was incomplete or corrupted in some way. In order to really determine the risk of harm, we were worried not only about the individual pieces of patient information that might have been disclosed, but also about whether a combination of innocuous data might be revealing when taken together. For example, a date of birth by itself is innocuous; however, DOB and name together would obviously be a concern.

We had to do a detailed analysis of every combination of data, which revealed that there were 30 different combinations of the information above, meaning that each of the 13,687 individuals was represented in one of 30 ways. We created a table showing the frequency of each combination. For example, we had 5,338 instances of MRN-subscriber number, 2,777 instances of name only, 222 instances of name-SSN-DOB-MRN-phone, and so on. The full table is below.

[Table: frequency of each of the 30 data-element combinations]

In analyzing this data, we were confronted with the dilemma of how to determine significant risk of harm. Federal and state rules have definitions, noted above, but they leave a lot open to judgment. We decided that we would apply a simple rule based on the combinations of information that we had. Any individual who had their name PLUS either their SSN or DOB would be considered to have a significant risk of harm if their data was accessed. As it turns out, we had exactly 1,000 such cases. We determined that everyone else’s information was either meaningless without additional information (for example, MRN and subscriber number) or was already publicly available (for example, address, phone number, etc.) We thus concluded that of the 13,687 individuals whose information was stolen, 1,000 (7%) would have a significant risk of harm if their data was actually accessed, and 12,687 (93%) would not.
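To make that analysis concrete, here is a minimal sketch of the kind of tally we ran. The field names and sample records are hypothetical, not our actual data or tooling; it simply counts each combination of disclosed elements and applies the “name plus SSN or DOB” rule described above.

```python
from collections import Counter

# Hypothetical, illustrative records only (not actual patient data): each entry
# is the set of data elements present in one individual's row of the error logs.
records = [
    {"mrn", "subscriber_number"},
    {"name"},
    {"name", "ssn", "dob", "mrn", "phone"},
    {"name", "dob"},
    {"address", "phone"},
]

# Frequency of each distinct combination of elements (the table above).
combo_counts = Counter(frozenset(r) for r in records)
for combo, count in combo_counts.items():
    print(sorted(combo), count)

def significant_risk(elements):
    """The rule we applied: name PLUS either SSN or DOB."""
    return "name" in elements and ("ssn" in elements or "dob" in elements)

at_risk = [r for r in records if significant_risk(r)]
print(f"{len(at_risk)} of {len(records)} individuals meet the notification rule")
```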

Obviously things were feeling a little better now that we knew we were dealing with 1,000 patients instead of over 14,000. Not that we took it any less seriously, but it started to feel more manageable and, of course, it meant that our legal and financial exposure was probably smaller as well.

Now that we knew what we were contending with, we could address the issue of notification. As I noted earlier, the question of who was responsible from a legal perspective was not as clear-cut as assumed. So, too, it was not immediately obvious who had to notify whom about what. Federal law is clear – it’s the responsibility of the covered entity, regardless of how badly their contractors might screw up. The state law does not say who the notification requirement falls on. While the presumption would be that it is on the party disclosing the data (in this case, us), the MA law is not specific to healthcare and thus does not delineate covered entity versus business associate responsibilities.

We were faced with yet another dilemma and decision. The state law would suggest that we (the contractor) should notify patients and the state government, whereas federal law says that the practices (the covered entities) should notify patients and the federal government.

Confusion. On the one hand, we wanted to take full responsibility for our mistake and shield the practices as much as possible from this. On the other hand, we did not want to confuse patients by having them receive one letter from the practice to fulfill the federal requirement and another letter from us to fulfill the state requirement. We similarly wanted to have consistent reporting channels to federal and state authorities to make it easier to respond to their inquiries.

Discussing this with our customers and our attorneys, we jointly decided that we should use a unified approach by which all notifications and reports would come from the practices. However, MAeHC would manage all the logistics of such notifications and reporting and would be prominently named in the reports and notifications. This allowed us to fulfill our desire and obligation to be accountable while at the same time keeping the process consistent and clean from federal, state, and patient perspectives.

So how did this all shake out? We started with 14,475 records of patients and providers from 18 practices. Of these cases, 1,000 patients from seven practices were judged to have a significant risk of harm from the data theft. It turned out that of the seven practices, only one practice hit the magic 500, the threshold for being on the Wall of Shame and for media notification. So while all 1,000 patients had to be notified and all seven practices had to report the breach to the state and federal authorities, only one practice would face broad public exposure for 500+ individuals.
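The per-practice consequences then fall out of a simple group-and-threshold step, sketched below with made-up counts rather than the actual practice numbers:

```python
# Hypothetical counts of notifiable (significant-risk) patients per practice.
notifiable_by_practice = {"A": 620, "B": 140, "C": 95, "D": 60, "E": 45, "F": 25, "G": 15}

MEDIA_THRESHOLD = 500  # federal trigger for media notice and the OCR public posting

for practice, count in notifiable_by_practice.items():
    print(f"Practice {practice}: notify {count} patients; "
          f"media notice and OCR posting required: {count >= MEDIA_THRESHOLD}")
```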

Having crossed the magic 500 threshold, this practice had to provide public notice of the incident through a major media outlet. The law provides some flexibility in determining the most appropriate media outlet. The practice, with our advice and helpful guidance from our media consultants, decided to provide notifications to the two largest television news stations in our state.

All seven of the practices agreed to have us manage the logistics of the government and patient notifications and reporting, so we gathered the letterhead information from all of the practices and printed and stuffed 998 patient notices and sent them by first-class mail. This itself was quite an exercise because the patient identification information was not always clean, so we had to visit each practice and manually confirm each patient’s address with practice staff. One practice had its own stationery that they wanted us to use. Another practice wanted to hand sign each notice. We had to build these special requests into our process as well.

We were unable to find valid addressing or contact information for two individuals, though we tried a variety of means (including those slimy Internet services that will sell personal information for a fee.) Yet another bind and another decision to make. Sigh. Federal law says that if you can’t directly contact over 10 individuals, you have to post the notice on the organization’s website or provide notice to a major media outlet in the area where the patient lives, whereas if it’s fewer than 10, you can provide substitute notice by an alternative form of written, telephone, or other means.

Our problem was that we couldn’t validate where these two individuals lived in the first place, which is why we hadn’t been able to contact them, so an “alternative form”, whatever that means, didn’t solve the problem. We could try to apply the rule for 10 or more, but the practice, like most small practices, doesn’t have a website, and we didn’t know where the patients lived so we couldn’t target media notices any more than we already had. We concluded that since we had already provided media notice to the two most prominent TV stations in our state about the breach, we had done all that we could. And as a practical matter, if we ourselves were having trouble identifying the two individuals despite our best efforts with all of the resources at our disposal, there was not much chance that anyone else could use the information to harm the individuals in question.

As an added measure to patients, we covered the cost of credit monitoring for any patient who chose to have that service. It felt like the least that we could do.


Our sanitized notification to the MA AG and OCABR is here, our sanitized patient notification is here, and our sanitized media notification is here. I can’t share the federal OCR notification because it’s all done on the OCR website and we therefore don’t have a copy.

What was the aftermath?

The hard costs of this incident were as follows:

[Table: hard costs of the incident]

As I mentioned earlier, we’ve never skimped on lawyers or insurance, and it paid off here. We’ve been paying $18,200 per year in insurance premiums for the last five years, and our insurance ended up covering all of the above costs except for our staff and media consultants’ time (for a total of $161,808) minus a $25,000 deductible. Of course our premiums will now go up, to $23,000 per year with a $50,000 deductible. Though this reflects a calculation on the insurer’s part of higher actuarial risk, I believe that we are actually lower risk now that we’ve gone through this incident, but I’m not going to argue with them about that.
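Reading those figures literally (and treating the $161,808 as the covered total, which is my interpretation of that sentence), the rough math works out as follows:

```python
# Rough reading of the insurance figures above; treating $161,808 as the total
# of covered costs is my interpretation, not a certainty.
annual_premium = 18_200
years_paid = 5
covered_costs = 161_808
deductible = 25_000

premiums_paid = annual_premium * years_paid   # $91,000 in premiums over five years
insurer_payout = covered_costs - deductible   # $136,808 actually paid by the insurer
print(f"Premiums paid: ${premiums_paid:,}; insurer payout: ${insurer_payout:,}")
# Going forward: $23,000 per year in premiums with a $50,000 deductible.
```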

The soft costs of this are obviously the opportunity cost of the roughly 600 hours of staff time that we ended up throwing at this, as well as the time invested by our customers and the practices. The longer term impact on our reputation and the reputation of our customers is difficult to calculate, but nonetheless real.

Needless to say, the private investigator didn’t turn up anything. We filed all of our reports to federal and state authorities. We received follow-up questions from the state OCABR, and the practice received follow-up questions from the federal OCR. We paid for a separate attorney for the practice to represent them with OCR. In addition, we were asked to participate in a phone call with OCR to provide additional information and any learnings that would help other organizations better interpret and comply with privacy and security and breach notification rules, which we were happy to do.

Both the federal and state authorities acted immediately upon receiving our reports and were very responsive and helpful to our questions. It’s clear that they take these reports seriously and hold the privacy interests of our citizens as their highest priority. As a taxpayer and a citizen, I felt extremely well-served. As the party being investigated, I felt perhaps a little bit too well-served – just joking!

We and the practices received 18 calls from patients. It is a sad reflection on the times that almost all of the calls were to confirm the veracity of the letter notification — the patients whose data we had breached were concerned that our notification might itself be an identity theft scam!

Only 88 (fewer than 10%) of the affected individuals chose the credit monitoring option. Some people may never have read the letter, others may have just ignored it, and perhaps others are already receiving credit monitoring from other breaches of their information. (For example, I myself am currently receiving free credit monitoring from my credit union, which had a breach of credit card numbers last year.)

The media notices that we provided to the television stations were received (we confirmed that), but neither outlet aired the story as far as I know. A Boston Globe story published later after the Sony data breach listed a number of area breaches, including ours, but I’m not aware of any other media reporting on the story.

The one practice that hit the 500 threshold is listed on the OCR “Wall of Shame” website. I’m not going to point you to them because the error was ours, not theirs, and I don’t want to bring any more attention to them than they’ve already had to bear.

The reaction of our customers was concern (as expected) and incredible understanding (not expected). Soon after we had notified them, I addressed the board of directors of the contracting network to explain the incident and the actions we were taking in response. One director, a physician, expressed his concern that many physicians leave their laptops in their cars without realizing how grave the implications of a simple theft might be. The practices themselves were understandably focused on what harm might come to their patients, and what this incident might do to the years of trust that they had instilled in their patient relationships.

I was humbled by the sincere concern that each of the physicians expressed about their patients, and the understanding and generosity of spirit that they showed regarding our role in this mess. They would have had every right to scream at me – or worse. Most of them had only limited awareness of who we were because we were hired by their contracting network and, among the EHR vendor, the hardware vendor, the contracting network, and us, it was hard for the practice to tell any of us apart. And then, a mistake by us became their violation of federal and state law, undermined their relationship with their patients, and forced them to take time away from their lives to understand the legal thicket that they had somehow gotten snared in. I hope that if I’m in a similar circumstance I will show the patience, professionalism, and generosity of spirit that they showed to us.

My company took a number of remediation actions. Right after the incident, we immediately destroyed all patient data on all company mobile devices and temporarily banned any removal of PHI from a customer facility. We conducted a detailed workflow analysis of all of our practice-facing activities, identified the use cases in which our staff might have need for patient data, and provided detailed policies and encryption tools for them to easily do their jobs and still meet our security standards. The tools that we deployed are TrueCrypt, SecureZip, and ZixMail.
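Purely as an illustration (not the tools we actually deployed, which are listed above), the kind of file-level protection our policy requires can be sketched in a few lines: anything containing patient data gets encrypted before it ever touches a portable device. The sketch below uses Python’s third-party cryptography package and a hypothetical file name.

```python
# Illustrative sketch only -- not the tools we deployed (TrueCrypt, SecureZip,
# ZixMail). It shows the basic idea: encrypt any extract containing patient
# data before it is copied to a laptop or other portable device.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

def encrypt_file(path: str, key: bytes) -> str:
    """Write an encrypted copy of the file and return the new path."""
    with open(path, "rb") as f:
        ciphertext = Fernet(key).encrypt(f.read())
    out_path = path + ".enc"
    with open(out_path, "wb") as f:
        f.write(ciphertext)
    return out_path

def decrypt_file(path: str, key: bytes) -> bytes:
    """Return the decrypted contents of an encrypted file."""
    with open(path, "rb") as f:
        return Fernet(key).decrypt(f.read())

if __name__ == "__main__":
    key = Fernet.generate_key()  # in practice, manage keys centrally, not per laptop
    print("wrote", encrypt_file("error_log_kickouts.csv", key))  # hypothetical file
```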

We bolstered our company processes to require formalized company authorization for individual personnel to have access to PHI on a project-by-project basis. We also held two mandatory trainings with all staff immediately after the incident. Only after all of this did we allow our practice specialists to resume their normal activities.

As for the employee who caused the incident, no, they were not fired. Indeed, our customer specifically requested that we NOT fire the individual because they valued the individual’s skill and expertise and believed this incident to be an accident. That is our company view of the situation as well. While there was obviously individual culpability for the incident, we also recognize that there is company culpability as well, going all the way to the top (i.e., me.)

As for me, I believe that we are a stronger company for having gone through this experience. I truly regret that it came at the expense of the disclosure of 1,000 innocent individuals’ personal information, and probably eroded, even if just a little bit, the trust between these patients and their doctors.

We very recently received a letter from OCR informing the practice that after having investigated the practice, its internal policies, the business associate arrangements, the patient and media notifications, and the credit monitoring service arrangements, they were found to be in “substantial compliance” with the federal Privacy and Security and Breach Notification Rules. And, according to the letter, the issues related to the complaint “have now been resolved through the voluntary compliance actions of the practice. Therefore, OCR is closing the case.” Roughly seven months had passed between the initial incident and this final letter from OCR. Hallelujah.

I hope that others can learn from our mistakes (I know I’m pretty sick of learning from my own mistakes and would be delighted to learn from yours!) It turns out that our incident was not that uncommon. According to the OCR database, over 30% of the 372 incidents involving over 500 individuals stemmed from theft or loss of a laptop or other portable device. And that’s only the reported incidents – who knows how many go unreported? I suspect that there have been WAY more than 372 such incidents over the last three years across the country, and as EHR penetration rises dramatically over the next few years, we can expect to see corresponding exponential growth in security incidents as well.

For what it’s worth, some observations and lessons learned from our experience are as follows:

  1. Whether you’re a physician practice or a contractor, look in the mirror (or use your phone camera) and ask yourself right now: do you know how the people on the front lines are handling personally identified data? Have you put in place the awareness, policies, and technologies to allow them to do their jobs efficiently AND securely? A meaningful self-examination will reveal that you are almost certainly not as good as you think you are (unless you’ve had a recent security incident of your own.) That was certainly true for me, and I suspect that I’m in very good company.
  2. Assume that your portable devices contain sensitive information, even if your vendor tells you that they don’t. Most EHR systems are designed so that medical records are not stored locally on a laptop, yet, in our investigation of this matter, we found plenty of instances of the EHR saving temporary files locally that were later not purged, or of clinical users saving documents locally because they weren’t aware of the risks. While there is no doubt that EHR software should be better designed, and EHR users should be better trained, I wouldn’t bet on it. Put in place policies and technologies as a safety net, just in case software and users don’t do what they’re supposed to (because that’s never happened before, has it?) I now have whole-disk encryption on my laptop even though I never work with practice-level data. Sure, it takes about 20 seconds longer to boot up while it’s decrypting. But rather than being an inconvenience, I actually have found that I use this time to take comfort that I’m responsibly protecting my company’s and my customers’ privileged information.
  3. If you’re in a physician practice, know who’s working in your practice and get a clear statement from the contractor at the outset of their work of what access to patient information they will need, and how they will be handling such information. As an industry, I fear that we are inadvertently letting business associate agreements absolve us from appropriate diligence of what our privacy and security protections are trying to accomplish. Hey, we’ve got a BA, so our contractors can do anything, right? You’re obviously not expected to be a privacy and security expert – that’s what you’re hiring. But understand that it’s your responsibility in the end if that contractor drops the ball on privacy and security, so it’s worth making yourself comfortable that they are reputable, diligent, honest, and competent. We are now providing all of our customers with a statement delineating what we expect to need access to and how we will handle such information. This isn’t a CYA exercise, because it doesn’t absolve us from any responsibility. It’s simply a vehicle to force acknowledgement of the seriousness of privacy and security, and to flag any differences in expectations at the earliest possible opportunity.
  4. Once a security incident has occurred, set everything aside and create process and structure right away to identify what’s happened, prevent any further incidents, notify your stakeholders, and get to work on meeting your legal, business, and ethical responsibilities (hopefully, these are all perfectly aligned.) Create a crisis response team with your attorney, your customer (if you’re a contractor), and your own staff. It’s hard not to panic (believe me), but laying out a plan will help to identify what you know and what you don’t know and will allow you to set your priorities accordingly. In my case, it just helped me breathe. Some guides that I wish were available when we went through this are available here and here.
  5. Don’t underestimate the effort that will be required to disentangle, respond to, and remediate the breach. If you’re a contractor, notify the organization who hired you right away to let them know the facts of the incident, the measures you are taking, and any immediate actions that they need to take (usually none.) There’s a temptation to wait to notify your customer until you know all of the facts and implications, but our experience is that that takes too long to disentangle. In our case, we notified the contracting network and its board immediately, who then decided to wait a little longer until we understood more of the implications for each practice before notifying each of the affected physicians directly. Good thing too – as it turns out, only seven of 18 practices had any legal liability for the breach, so it made sense to wait a little bit to sort this out. But realize we’re talking about days and maybe a couple of weeks – not months.
  6. Keep a daily log of your activities from Day One. Hours quickly turn to days which even more quickly turn to weeks, and someone will inevitably ask why you didn’t notify them sooner. Having a log of your activities will allow you to demonstrate that you responded immediately and provided notifications as soon as you figured out what exactly had happened and who needed to be notified about what.
  7. “Man up” and take responsibility for your actions. (Sorry for the sexist reference – I just think it sounds really good.) While we reported the incident to the police right away and immediately set the wheels in motion for compliance with federal and state regulations, I was struck by how easy it would have been to just let it slide, particularly as I contemplated the legal liabilities we might face, the financial penalties that could be imposed, and the loss of business that we might suffer. If we didn’t have a very recent backup of the laptop, we could easily have convinced ourselves that there were only innocuous error logs on the computer and stopped right there and reported nothing except a random theft. And we would have come to that conclusion honestly (for the most part.) I shudder at the thought of how many of these incidents go unreported each year, some perhaps not so honestly.
  8. Take responsibility as an organization. Bill Belichick, the coach of the New England Patriots (Yeah! Wahoo! Go Pats!!) says that an execution error by an individual is really a lapse in education by the coach. The simplest thing for us to have done (aside from not reporting the incident at all) would have been to declare that this was the action of a rogue employee, contain the investigation and remediation to that, and pat ourselves for another job well done. In our case, it became clear as we investigated this incident that there was a certain amount of “there but for the grace of God go I” among our entire staff, at which point we realized that this was an individual failing AND a company failing, which meant that ultimately it was a management and leadership failing. Framing it that way sent a strong message to our team that we’re all in this together and that we need to be honest, transparent, and professional about our flaws. It also led to our building system approaches that will be more long lasting because they were developed organically from the ground up with staff input rather than being imposed from above. Any security professional will tell you that building security considerations into routine workflows, rather than tacking them on as additional workflows, is not just a best practice, it should be the only practice. Or, to paraphrase what I heard from a yoga instructor the other day, we have to move from doing security to being security. Om.

In the end, I am incredibly confident that that laptop was either stripped down and sold or tossed into a dumpster. I am almost 100% sure that no one accessed the patient information on that machine. In my opinion, the penalties we paid for an honest mistake with very low risk (a random theft of a password-protected laptop containing a patchwork of demographic data) seem disproportionately high ($300,000 to us; national public exposure to the practice.) That said, I also recognize that we’re all feeling our way through this incredibly complex area, and an appropriate balance will eventually be struck, though it will take time.

Still, when all is said and done, I’m incredibly proud to be a member of a society that errs on the side of valuing the integrity of its citizens, as messy as that can sometimes be.

Micky Tripathi is president and CEO of the Massachusetts eHealth Collaborative. The views expressed are his own.
