A ‘building code’ for building secure code in medical devices

[Photo: Carl Landwehr]

Last month, a broad mix of experts convened by THaW researcher Carl Landwehr met in New Orleans to begin drafting a “building code” for medical-device software. They’ve just released their report, and there is already talk of taking some of these ideas into the various standards bodies. Check out their report and feel free to leave comments on their site.  — dave

THaW at the mHealth Privacy & Security Symposium

Perhaps the largest annual event related to mHealth is the mHealth Summit, held near Washington DC.  Today, the summit kicked off with a Privacy & Security Symposium, including a panel on Medical Device Security anchored by both Kevin Fu and Darren Lacey from the THaW team.  Kevin, Darren and the other panelists spoke about some of the security concerns that medical devices pose for patients, clinicians, and hospitals.  The audience brought together a broad mix of medical practitioners, device and software vendors, security professionals, and computer scientists.

[Photo: Kevin Fu and Darren Lacey at the center of a panel session at the mHealth Summit.]

Constructing a ‘building code’ for medical device software security

The following summary of the recent ‘building code’ workshop, sponsored in part by THaW and held November 19-21, 2014, is provided by Dr. Carl Landwehr –

Forty people with diverse backgrounds in medical device software development, standards, regulation, security, and software engineering met in New Orleans on November 19-21 with the goal of constructing a “building code” for medical device software security and a related research agenda. The workshop was sponsored by the National Science Foundation, through both the Trustworthy Health and Wellness (THaW) center and a separate workshop grant to George Washington University’s Cyber Security Policy and Research Institute (CSPRI), as well as by the IEEE Computer Society’s Cybersecurity initiative.

The idea of exploiting the building-code metaphor originated with THaW’s Carl Landwehr, who organized the meeting with the help of a Steering Group that included THaW leadership as well as several others from the worlds of medical devices, software engineering, and security. Tom Haigh, recently retired from Adventium, served as Vice-Chair.

Building codes for physical structures grow out of industry and professional society groups – suppliers, builders and architects – rather than from government, although adoption of codes by government provides the legal basis for enforcement.  Building codes generally apply to designs, building processes, and the finished product. Code enforcement relies on inspections of structures during construction and of the finished product and also on certification of the skills of the participants in the design, construction, and inspection processes. Inspectors must be knowledgeable and skilled, but the training requirement is not burdensome, and decisions as to whether a building meets the code or not are typically straightforward. Codes also take account of different domains of use of structures: code requirements for single-family dwellings differ from those for public buildings, for example. Although building codes arose largely from safety considerations (e.g. reducing the risk of widespread damage to cities from fires, hurricanes, or earthquakes), security from malicious attack has also motivated some aspects of building codes.

The workshop aimed to develop an analog to building codes focused on the security properties of software rather than the structure and characteristics of physical buildings.  The objective of this code for software security is to increase assurance that software developed for the medical-device domain will be free of many of the security vulnerabilities that plague software generally.  Evidence to date indicates that a large fraction of exploitable security flaws are implementation flaws rather than design flaws. An initial building code for medical device software security could therefore focus on assuring that the final software that operates the device is free of these kinds of flaws, although it could address aspects of the development process as well.  For example, the code might specify that modules written in a language that permits buffer overflows be subject to particular inspection or testing requirements, while modules written in type-safe languages might require less testing but a stronger inspection of the components that translate the source language to executable form.
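To make the kind of implementation flaw at issue concrete, here is a minimal C sketch (our own illustration, not drawn from the workshop report; the function name is hypothetical) of the classic unchecked-copy overflow such a code element would target, alongside the bounded form a code requirement might mandate:

```c
#include <stdio.h>
#include <string.h>

/* Illustrative only: an unchecked copy of externally supplied input into a
 * fixed-size buffer, the classic exploitable implementation flaw. */
void record_patient_id(const char *input)
{
    char id[16];
    strcpy(id, input);               /* FLAW: no bounds check */
    printf("patient id: %s\n", id);
}

/* The bounded form a building-code element might require instead. */
void record_patient_id_safe(const char *input)
{
    char id[16];
    strncpy(id, input, sizeof id - 1);
    id[sizeof id - 1] = '\0';        /* ensure termination */
    printf("patient id: %s\n", id);
}

int main(void)
{
    record_patient_id_safe("PATIENT-0042");   /* safe for any input length */
    return 0;
}
```

A code element of this sort is easy to check mechanically: a static-analysis pass or inspection rule can flag unbounded copies such as strcpy without requiring the inspector to understand the device’s overall design.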

About 35 separate items were proposed for inclusion in an initial draft of the building code. The final report is still in development, but only about half of these elements are likely to be included in the consensus version of the code.

Several participants in the workshop who are active in related standards bodies and professional societies have indicated an interest in moving the code forward in those groups.

FDA visits NIST federal advisory committee on security and privacy (audio available)

As previously referenced in the official blog of the Ann Arbor Research Center for Medical Device Security, the NIST Information Security and Privacy Advisory Board (ISPAB) held a public panel on October 24, 2014 entitled “Updates on Embedded Device Cybersecurity: Medical Devices to Automobiles.”

Professor Kevin Fu has provided an audio recording of this meeting that can be found here — http://blog.secure-medicine.org/2014/10/fda-visits-nist-federal-advisory.html

THaW paper at CIST (INFORMS)

THaW professor Eric Johnson (Vanderbilt) recently presented a new paper at the Conference on Information Systems and Technology (CIST), a division of INFORMS.

See the video abstract. A full version of the paper is under review at a journal.

Meaningful healthcare security: Does “Meaningful-use” attestation improve information security performance?
Juhee Kwon and M. Eric Johnson
Abstract:
Certification mechanisms are often employed to signal performance of difficult-to-observe management practices. In the healthcare sector, financial incentives linked to “meaningful-use” attestation have been a key policy initiative of the Obama administration to accelerate electronic health record (EHR) adoption while also focusing healthcare providers on protecting sensitive healthcare data. Given the rapid push for safe digitization of patient data, this study examines how hospital attestation influences the occurrence of subsequent data breaches and also how breach performance is associated with penalties from prior breaches. Using a propensity score matching technique combined with a difference-in-differences approach, we analyze a matched sample of 869 U.S. hospitals. We find that hospitals that attest to having reached Stage-1 meaningful-use standards observe reduced external breaches in the short term, but do not see continued improvement in the following year. On the other hand, attesting hospitals observe short-term increases in accidental internal breaches, but eventually see longer-term reductions. We do not find any link between malicious internal breaches and attestation. Further, we find that the interaction between meaningful-use attestation (carrot) and prior failure resulting in penalties (stick) enhances short-term reductions of accidental internal and external breaches. Our findings offer both theoretical and practical insights into the effective design of certification mechanisms and breach regulations.
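For readers unfamiliar with the estimation strategy named in the abstract, a generic two-period difference-in-differences specification has the following form (illustrative notation only; the paper’s actual model may differ):

\[
y_{it} = \beta_0 + \beta_1\,\mathrm{Attest}_i + \beta_2\,\mathrm{Post}_t + \beta_3\,(\mathrm{Attest}_i \times \mathrm{Post}_t) + \varepsilon_{it}
\]

where \(y_{it}\) is a breach outcome for hospital \(i\) in period \(t\), \(\mathrm{Attest}_i\) indicates a Stage-1 attesting hospital (matched via propensity scores to a comparable non-attesting hospital), \(\mathrm{Post}_t\) indicates the period after attestation, and \(\beta_3\) is the difference-in-differences estimate of attestation’s effect on breaches.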

DHS to investigate medical device security

The Department of Homeland Security (specifically the agency’s Industrial Control Systems Cyber Emergency Response Team, or ICS-CERT) is starting to investigate cyber-security vulnerabilities in medical devices, according to recent news reports.

THaW co-PI Kevin Fu, who runs the Archimedes Research Center for Medical Device Security at the University of Michigan, commented on the story: “It’s very easy to sort of sensationalize these problems.”

THaW’s Kevin Fu and Darren Lacey were both key players in this week’s FDA workshop “Collaborative Approaches for Medical Device and Healthcare Cybersecurity”.

THaW leads panel at Grace Hopper Conference

Two THaW researchers led a panel on designing mobile and wearable devices for health and wellness at the Grace Hopper Conference in Phoenix, Arizona, on October 10, 2014. The panel was co-hosted by Dr. Klara Nahrstedt (THaW Co-PI and Professor of Computer Science at UIUC) and Aarathi Prasad (Ph.D. Candidate at Dartmouth College). Panelists included Ruzena Bajcsy (Professor of EECS at UC Berkeley), Jung Ook Hong (research scientist at Fitbit), and Janet Campbell (product lead at Epic). The panel discussed usability, security, and privacy issues that developers of mobile and wearable health and wellness applications should be aware of. Jung discussed the effect that data presentation has on users’ behavior; for example, users are more likely to take 10,000 steps than 8,000 steps because they receive an encouraging message to take a few more steps to reach the daily 10,000-step goal. Ruzena talked about the challenges faced by elderly users of mHealth technologies, such as small fonts and complicated buttons on a device. Klara presented the security and privacy issues that arise when people use mobile and wearable health and wellness devices and briefly discussed the different THaW projects. Finally, Janet talked about the issues of sending data to an EHR, such as identifying the patient whose data is in the EHR.

[Photo: panelists Jung Ook Hong, Klara Nahrstedt, Ruzena Bajcsy, Janet Campbell, and Aarathi Prasad.]

 

THaW’s Professor Kevin Fu on Slashdot

Professor Kevin Fu Answers Your Questions About Medical Device Security

Almost a year ago you had a chance to ask professor Kevin Fu about medical device security. A number of events (including the collapse of his house) conspired to delay the answering of those questions. Professor Fu has finally found respite from calamity, coincidentally at a time when the FDA has issued guidance on the security of medical devices. Below you’ll find his answers to your old but not forgotten questions.

Fu: I apologize for the year-long delay, but my queue has rather overflowed after part of my house collapsed. See slide #11 for more information on the delay.

Medical device security is a challenging area because it covers a rather large set of disciplines including software engineering, clinical care, patient safety, electrical engineering, human factors, physiology, regulatory affairs, cryptography, etc. There are a lot of well-meaning security engineers who have not yet mastered the culture and principles of health care and medicine, and similarly there are a lot of well-meaning medical device manufacturers who have not yet mastered the culture and principles of information security and privacy. I started out as a gopher handing out authentication tokens for a paperless medical record system at a hospital in the early 1990s, but in the last decade have focused my attention on security of embedded devices with application to health and wellness.

I huddled with graduate students from my SPQR Lab at Michigan, and we wrote up the following responses to the great questions. We were not able to answer every question, but readers can find years’ worth of in-depth technical papers on blog.secure-medicine.org and spqr.eecs.umich.edu/publications.php and thaw.org.

Link to the original Slashdot posting here.

THaW on TV

Blog post from Professor Kevin Fu –

NBC Chicago interviews patients, physicians, and researchers on medical device security

The TV headline is hyperbolic, but the content is level-headed.

Tammy Leitner of NBC Chicago interviewed a number of patients, physicians, and researchers about the challenges of medical device security. Here’s a link to the full video.

Had this interview happened in 2008, the tone would have likely been more confrontational. Remember when Archimedes researchers demonstrated radio-controlled security flaws in pacemaker/defibrillators (also see the Schneier commentary)? Back in 2008, manufacturers and FDA were not accustomed to interacting with security researchers reporting such software-based flaws. It’s completely understandable. Imagine if an unfamiliar person showed up at your front door to point out security problems of your house. The outcome might be unpleasant. Thus, interactions initially got off to a rocky start. But that’s the past.

Fast forward to 2014, and times have changed significantly for the better. The forward-thinking manufacturers, influential researchers, and health care providers regularly interact and help each other to improve medical device security. A few positive examples that brought researchers, clinicians, manufacturers, and regulators together include the draft technical information report on medical device cybersecurity by AAMI (the IETF equivalent of the medical manufacturing world), the Archimedes workshop, and the upcoming FDA workshop on medical device security.

So if you’re a future graduate student or budding security researcher, I’d encourage you to read the technical papers from the short history of medical device security. It’s no longer a cat-and-mouse game of pointing out buffer overflows and SQL injection attacks. The future is about interdisciplinary computing and health care research to produce technology, best practices, and policies that improve medical device security without interfering with the workflow or delivery of health care.

Link to original blog post here.

ZEBRA press

THaW’s article about Zero-Effort Bilateral Recurring Authentication (ZEBRA) triggered a lot of press coverage, such as Communications of the ACM (CACM), VICE Motherboard, Dartmouth Now, Gizmag, The Register UK, Planet Biometrics*, Computer Business Review*, Fierce Health IT, Daily Science News, Senior Tech Insider, Motherboard, Homeland Security Newswire, and NFC World. They’re all intrigued by ZEBRA’s ability to continuously authenticate the user of a desktop terminal and to log them out if they leave or if someone else steps in to use the keyboard. Some (*) mistakenly believe our ZEBRA method uses biometrics; quite the contrary, ZEBRA is designed to be user-agnostic and thus requires no per-user training period. (ZEBRA correlates the bracelet wearer’s movements with the keyboard and mouse movements, not with a prior model of the wearer’s movements as do methods built on behavioral biometrics.) ZEBRA could be combined with a biometric authentication of the wearer to the bracelet, and it can be combined with other methods of initial authentication of the wearer to the system (such as username/password, or fingerprints), making it a versatile tool that adds strength to existing approaches. The Dartmouth THaW team continues to refine ZEBRA. [Note: since the time this paper was published we have learned of a relevant trademark on the name “Zebra”. Thus, we have renamed our approach “BRACE” and will use that name in future publications.]
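For readers curious how such a correlation check might look, here is a minimal C sketch (our own simplification; the published ZEBRA system segments raw wrist-motion data and uses a trained classifier with richer features, and all names and thresholds below are illustrative assumptions): it compares the interaction sequence observed at the terminal with the sequence inferred from the bracelet and keeps the session alive only while the two agree often enough.

```c
#include <stdio.h>

/* Hypothetical sketch of ZEBRA-style continuous authentication.
 * 'K' = typing, 'M' = mouse movement, '-' = idle, one symbol per window. */

#define WINDOW 20            /* number of recent interaction windows compared */
#define MATCH_THRESHOLD 0.7  /* fraction of agreement required to stay logged in */

/* Fraction of windows where the terminal-observed and bracelet-inferred
 * interaction labels agree. */
static double match_fraction(const char *terminal, const char *bracelet, int n)
{
    int agree = 0;
    for (int i = 0; i < n; i++)
        if (terminal[i] == bracelet[i])
            agree++;
    return n > 0 ? (double)agree / n : 0.0;
}

int main(void)
{
    /* Toy data: the bracelet's inferred sequence mostly matches what the
     * terminal observed, so the session is kept. */
    const char terminal[WINDOW + 1] = "KKKMM-KKKKMMMKK-KKMM";
    const char bracelet[WINDOW + 1] = "KKKMM-KKKMMMMKK-KKMM";

    double f = match_fraction(terminal, bracelet, WINDOW);
    printf("match fraction = %.2f -> %s\n", f,
           f >= MATCH_THRESHOLD ? "keep session" : "log out");
    return 0;
}
```

Because the comparison is between two contemporaneous event streams rather than against a stored model of the user, no per-user enrollment or training is needed, which is what makes the approach user-agnostic.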

[Photo: a Shimmer device worn on one wrist; that hand is using a mouse while the other hand uses a keyboard.]

Our experiments used the Shimmer research device, though in principle it could work with any fitness band.