Will Health Tech Ever Be Hack Proof?

Professor Kevin Fu recently participated in a panel entitled “Will Health Tech Ever Be Hack Proof?” at the New America symposium Our Data, Our Health: The Future of Mobile Health Technology (26 March 2015). Joining Kevin to explore the personal, economic, and regulatory implications of securing health-related technology were Lucia Savage, Chief Privacy Officer, Office of the National Coordinator for Health IT, and Alvaro Bedoya, Executive Director, Center on Privacy and Technology at Georgetown; the panel was moderated by Peter Singer, Strategist and Senior Fellow, New America. The video of this panel discussion can be found here.

THaW Researchers Highlight Emerging Issues Related to Mobility and Security in Healthcare

Bring Your Own Device (BYOD) Practices in Healthcare – A.J. Burns and M. Eric Johnson, Vanderbilt University

Despite the many impressive technology-enabled advances in modern medicine over the past several decades, concerns over cost, reliability, and security have hindered the adoption of IT in the health sector. As in other industries, however, healthcare has seen dramatic increases in the use of personally owned devices: 88.6 percent of those working in healthcare report using their smartphone for work. At the same time, 54 percent of US organizations report that they are unable to determine whether off-site employees are using technology and information resources in a way that meets corporate and regulatory requirements. This lack of oversight is especially problematic for the health sector, where research reveals that healthcare workers often fail to maintain basic security hygiene on their devices (e.g., 41 percent report having no password protection).

The trend toward mobile computing is radically transforming how individuals interact with IT. For example, in 2014 comScore reported that, for the first time, more than half of all digital media in the US was consumed in mobile apps. In the health sector, enabled by low entry barriers and lax (often non-existent) regulation, the number of mobile health (mHealth) apps available to consumers now exceeds 100,000, with millions of downloads each year. Yet the industry provides little transparency about either the security and privacy of mHealth data or the usage patterns of the physicians and patients who have downloaded these apps. In a recent special issue of IEEE IT Professional on IT security, THaW researchers highlight emerging issues related to mobility and security in healthcare: BYOD and the mHealth application ecosystem.

Link to IEEE IT Professional publication (see pages 23-29).

What’s in Your Dongle and Bank Account? Mandatory and Discretionary Protection of Android External Resources

Soteris Demetriou, Xiaoyong Zhou, Muhammad Naveed, Yeonjoon Lee, Kan Yuan, XiaoFeng Wang, and Carl A Gunter

The pervasiveness of security-critical external resources (e.g., accessories and online services) poses new challenges to Android security. In prior research we revealed that, given the BLUETOOTH and BLUETOOTH_ADMIN permissions, a malicious app on an authorized phone gains unfettered access to any Bluetooth device (e.g., a blood glucose meter). Here we further show that sensitive text messages from online banking services and social networks (account balances, password-reset links, etc.) are completely exposed to any app with either the RECEIVE_SMS or the READ_SMS permission. Similar security risks are present in other channels (Internet, Audio, and NFC) extensively used to connect the phone to assorted external devices or services. Fundamentally, the current permission-based Discretionary Access Control (DAC) and SEAndroid-based Mandatory Access Control (MAC) are too coarse-grained to protect those resources: whoever gets permission to use a channel is automatically allowed to access all resources attached to it.
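
To make the coarse-grained access problem concrete, here is a minimal sketch (our own illustration, not code from the paper) of how an app holding only the READ_SMS permission can read every message in the inbox on stock Android, including banking one-time codes and password-reset links; the permission gates the whole SMS channel, not individual messages or senders. The class name SmsSnooper is hypothetical.

```java
import android.content.Context;
import android.database.Cursor;
import android.provider.Telephony;
import android.util.Log;

// Illustrative sketch only: with nothing more than the READ_SMS permission
// declared in its manifest, an app can enumerate the entire SMS inbox.
public class SmsSnooper {  // hypothetical class name

    public static void dumpInbox(Context context) {
        String[] projection = { Telephony.Sms.ADDRESS, Telephony.Sms.BODY };
        Cursor cursor = context.getContentResolver().query(
                Telephony.Sms.Inbox.CONTENT_URI, projection, null, null, null);
        if (cursor == null) return;
        try {
            while (cursor.moveToNext()) {
                // Every message is visible, e.g., "Your one-time passcode is 482913"
                Log.d("SmsSnooper", cursor.getString(0) + ": " + cursor.getString(1));
            }
        } finally {
            cursor.close();
        }
    }
}
```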

To address this challenge, we present in this paper SEACAT, a new security system for fine-grained, flexible protection of external resources. SEACAT supports both MAC and DAC, and integrates their enforcement mechanisms across the Android middleware and the Linux kernel. It extends SEAndroid for specifying policies on external resources, and also hosts a DAC policy base. Both sets of policies are managed under the same policy engine and Access Vector Cache, which support policy checks within the security hooks distributed across the framework and the Linux kernel layers, over different channels. This integrated security model was carefully designed to ensure that misconfigured DAC policies will not affect the enforcement of MAC policies, which manufacturers and system administrators can leverage to define their security rules. Meanwhile, a policy management service is offered to ordinary Android users for setting policies that protect third-party resources. This service translates simple user selections into SELinux-compatible policies in the background. Our implementation is capable of thwarting all known attacks on external resources at a negligible performance cost.
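
As a rough illustration of the design described above (a conceptual sketch under our own assumptions, not SEACAT’s actual implementation), the snippet below shows the kind of fine-grained check such hooks enable: rules are keyed by the specific external resource rather than by the channel alone, and the mandatory (MAC) rules are consulted before the discretionary (DAC) rules so a misconfigured user policy can never override a manufacturer or administrator policy. All class and method names here are hypothetical; the real system enforces these checks inside the Android framework and the Linux kernel and expresses the rules as SELinux-compatible policies.

```java
import java.util.HashMap;
import java.util.Map;

// Conceptual sketch of resource-level access control (not SEACAT source code).
public class FineGrainedPolicyCheck {  // hypothetical name
    enum Decision { ALLOW, DENY, NO_RULE }

    // Rules are keyed by (appUid, channel, resourceId),
    // e.g., (10057, "BLUETOOTH", "AA:BB:CC:DD:EE:FF") for a glucose meter.
    private final Map<String, Decision> macRules = new HashMap<>(); // manufacturer/admin
    private final Map<String, Decision> dacRules = new HashMap<>(); // device owner

    private static String key(int appUid, String channel, String resourceId) {
        return appUid + "|" + channel + "|" + resourceId;
    }

    public boolean isAllowed(int appUid, String channel, String resourceId) {
        String k = key(appUid, channel, resourceId);
        Decision mac = macRules.getOrDefault(k, Decision.NO_RULE);
        if (mac != Decision.NO_RULE) {
            return mac == Decision.ALLOW;   // a MAC decision is final
        }
        // No mandatory rule for this resource: fall back to the user's DAC policy,
        // denying by default so an unconfigured resource is not silently exposed.
        return dacRules.getOrDefault(k, Decision.DENY) == Decision.ALLOW;
    }
}
```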

Link to NDSS paper

Revisiting SETA to increase health data stewardship

Training for Information Security – A.J. Burns and M. Eric Johnson, Vanderbilt University

A.J. Burns, Vanderbilt University

In today’s digital economy, the uses and users of organizational information are growing rapidly. Perhaps in no industry is this more evident than in the health sector. As the chain of custody of personal health information becomes increasingly complex, many organizations are seeking new ways to train employees to improve health data stewardship. The most common channels for organizational influence over employees’ security-related behaviors are the firm’s security education, training, and awareness (SETA) initiatives, yet relatively little research has investigated theoretical approaches to understanding SETA’s motivational effectiveness.

M. Eric Johnson, Dean of the Owen School of Management, Vanderbilt University

Recent research presented at the Hawaii International Conference on System Sciences (HICSS 2015) provides a diagnostic approach to SETA’s influence on employee motivation through the lens of expectancy theory (also known as VIE theory). The findings show that, when it comes to motivating security behaviors, proactive and omissive behaviors are influenced by distinct expectancy dimensions. Interestingly, expectancies (i.e., the perception that one’s effort will lead to behavior) and instrumentalities (i.e., the perception that one’s behavior will lead to a desired outcome) were positively related to information-security precaution taking, while security valence (i.e., the perception that it is good to protect one’s firm from security threats) was negatively related to withdrawal from information security-enhancing behaviors (or security psychological distancing). These results provide a framework for future study and should help organizations that handle sensitive information develop SETA initiatives targeting the distinct expectancy dimensions.
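
For readers unfamiliar with expectancy (VIE) theory, the classic Vroom formulation treats motivational force as the product of the three perceptions defined above, summed over valued outcomes; this is the textbook background model, not the HICSS paper’s own measurement model, which examines the dimensions separately.

```latex
% Classic expectancy-theory (VIE) formulation, shown for background only:
% motivational force MF as a function of expectancy E and, for each outcome o,
% instrumentality I_o and valence V_o.
MF \;=\; E \times \sum_{o} \bigl( I_{o} \times V_{o} \bigr)
```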

See the full paper at http://conferences.computer.org/hicss/2015/papers/7367d930.pdf

Dr. Avi Rubin to deliver keynote at the AMIA Annual Symposium

Dr. Avi Rubin will be the opening keynote speaker at the upcoming AMIA (American Medical Informatics Association) Annual Symposium, to be held November 14, 2015, in San Francisco, CA. Dr. Rubin will focus his remarks on the vulnerability of medical devices and electronic health record systems. For more information about the upcoming AMIA symposium, click here.

THaW quoted on Anthem story

When KQED radio needed input on the breaking news about the Anthem hacking incident, they reached out to THaW. David Kotz, PI, is quoted in this brief story on KQED; the tagline is “California’s largest private insurer, Anthem, said on Wednesday it has been hacked. The insurer said hackers broke into databases that stored customers’ personal information such as birthdays, social security numbers and employment information.”


THaW goes to India

THaW PI David Kotz presented a keynote talk at the Workshop on Networked Healthcare Technologies (NetHealth) today in Bangalore, India. The talk provided an overview of the economic and technical trends leading to the THaW project, a summary of a few THaW projects underway, and a research agenda for security and privacy in healthcare IT. The talk was well received and was a wonderful opportunity to exchange ideas across both the US and Indian contexts.

A ‘building code’ for building secure code in medical devices

Carl Landwehr

Last month, a broad mix of experts convened by THaW researcher Carl Landwehr met in New Orleans to begin drafting a “building code” for medical-device software. They have just released their report, and there is already talk of taking some of these ideas into the various standards bodies. Check out their report and feel free to leave comments on their site.  — dave

THaW at the mHealth Privacy & Security Symposium

Perhaps the largest annual event related to mHealth is the mHealth Summit, held near Washington, DC. Today the summit kicked off with a Privacy & Security Symposium, including a panel on medical device security anchored by Kevin Fu and Darren Lacey from the THaW team. Kevin, Darren, and the other panelists spoke about some of the security concerns that medical devices pose for patients, clinicians, and hospitals. The audience brought together a broad mix of medical practitioners, device and software vendors, security professionals, and computer scientists.

Kevin Fu and Darren Lacey at the center of a panel session at the mHealth Summit.

Constructing a ‘building code’ for medical device software security

The following summary of the recent ‘building code’ workshop, held November 19-21, 2014, and sponsored in part by THaW, is provided by Dr. Carl Landwehr —

Forty people with diverse backgrounds in medical device software development, standards, regulation, security, and software engineering met in New Orleans on November 19-21 with the goal of constructing a “building code” for medical device software security and a related research agenda. The workshop was sponsored by the National Science Foundation, through both the Trustworthy Health and Wellness (THaW) center and a separate workshop grant to George Washington University’s Cyber Security Policy and Research Institute (CSPRI), as well as by the IEEE Computer Society’s Cybersecurity Initiative.

The idea of exploiting the building-code metaphor originated with THaW’s Carl Landwehr, who organized the meeting with the help of a Steering Group that included THaW leadership as well as several others from the worlds of medical devices, software engineering, and security. Tom Haigh, recently retired from Adventium, served as Vice-Chair.

Building codes for physical structures grow out of industry and professional-society groups – suppliers, builders, and architects – rather than from government, although the adoption of codes by government provides the legal basis for enforcement. Building codes generally apply to designs, building processes, and the finished product. Code enforcement relies on inspections of structures during construction and of the finished product, and also on certification of the skills of the participants in the design, construction, and inspection processes. Inspectors must be knowledgeable and skilled, but the training requirement is not burdensome, and decisions as to whether a building meets the code are typically straightforward. Codes also take account of the different ways structures are used: code requirements for single-family dwellings differ from those for public buildings, for example. Although building codes arose largely from safety considerations (e.g., reducing the risk of widespread damage to cities from fires, hurricanes, or earthquakes), security from malicious attack has also motivated some aspects of building codes.

The workshop aimed to develop an analog to building codes focused on the security properties of software rather than the structure and characteristics of physical buildings. The objective of this code for software security is to increase assurance that software developed for the medical-device domain will be free of many of the security vulnerabilities that plague software generally. Evidence to date suggests that a large fraction of exploitable security flaws are not design flaws but rather implementation flaws. An initial building code for medical device software security could therefore focus on assuring that the final software operating the device is free of these kinds of flaws, although it could address aspects of the development process as well. For example, the code might specify that modules written in a language that permits buffer overflows be subject to particular inspection or testing requirements, while modules written in type-safe languages might require less testing but a stronger inspection of the components that translate the source language to executable form.

About 35 separate items were proposed for inclusion in an initial draft building code. Although the final report is still in development, only about half of these elements are likely to be included in the consensus version of the code.

Several participants in the workshop who are active in related standards bodies and professional societies have indicated an interest in moving the code forward in those groups.