Engineering Health Information Systems

This site is for our upcoming book

The difficulty of regulating “Medical Apps” for mobile devices

without comments

There is a fast-emerging market for mobile apps (software) downloaded onto mobile devices such as smartphones, iPods and tablets. These apps range from simple information aids up to complex radiology imaging software with decision support. They also target a variety of users, including GPs, specialists, patients and caregivers. Clearly, there is concern about the safety (and security) of these apps. The U.S. FDA has made it clear that even software can be considered a medical device if it is used in clinical practice and risks are associated with it. However, it is unclear how to properly control software in general – and mobile apps specifically. The FDA has now held a hearing on this subject. An interesting summary and protocol of that hearing can be found here. It provides a good account of the difficulties involved in attempting to regulate the “health app market”.

One point that I found particularly interesting is the aspect of daisy chaining of apps. This problem occurs when multiple interoperable apps exchange data. The normal approach of regulators such as the US FDA and Health Canada is to put tight controls only on those software apps that play an important role in diagnosis or treatment. However, if such an app receives data from another, less tightly controlled app, poor data quality (e.g., errors introduced by that other app) may contribute to safety hazards. This “daisy chaining” problem may in fact indicate that we need a new way of asserting controls, shifting the focus from “software devices” to “data items”.
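
To make that idea a bit more concrete, here is a minimal sketch in Python of what a shift from controlling “software devices” to controlling “data items” could look like. All names and assurance levels are hypothetical; the sketch simply assumes that each data item carries its provenance and an assurance level that can only degrade as it is relayed through less controlled apps, so that a tightly regulated app at the end of the chain can decide whether the data is trustworthy enough to act on.

    # Hypothetical sketch only: data items carry provenance and an assurance
    # level as they are passed along a chain of interoperable apps.
    from dataclasses import dataclass, field

    @dataclass
    class DataItem:
        name: str                      # e.g. "blood_glucose_mmol_per_L"
        value: float
        source_app: str                # app that produced or relayed the value
        assurance: str                 # "regulated", "certified" or "uncontrolled"
        provenance: list = field(default_factory=list)  # chain of upstream apps

        def relay(self, via_app: str, via_assurance: str) -> "DataItem":
            """Pass the item through another app; assurance can only degrade."""
            order = ["uncontrolled", "certified", "regulated"]
            weakest = min(self.assurance, via_assurance, key=order.index)
            return DataItem(self.name, self.value, via_app, weakest,
                            self.provenance + [self.source_app])

    def safe_to_use_for_dosing(item: DataItem) -> bool:
        """A regulated decision-support app might only accept well-controlled data."""
        return item.assurance == "regulated"

    # A reading originates in a regulated glucometer app, but is relayed through
    # an uncontrolled consumer app before reaching a dosing calculator.
    reading = DataItem("blood_glucose_mmol_per_L", 6.2, "GlucoMeterApp", "regulated")
    relayed = reading.relay(via_app="WellnessDiaryApp", via_assurance="uncontrolled")
    print(relayed.assurance, relayed.provenance)   # uncontrolled ['GlucoMeterApp']
    print(safe_to_use_for_dosing(relayed))         # False

The toy example is not meant as a regulatory proposal; it merely illustrates that controls attached to data items survive daisy chaining in a way that controls attached to individual apps do not.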

Written by Jens

September 14th, 2011 at 4:27 pm

Privacy risks of consumer health software on the Internet

without comments

Consumer health informatics (CHI) applications provide computer-based services directly to health care consumers, i.e., patients and their caregivers. The omnipresence of the Internet and Internet-enabled computing devices in modern societies has created a rapidly growing market for different types of CHI applications, including information aids such as personal health records (PHRs), decision aids (e.g., expert systems), educational aids (e.g., serious games and simulations), and management aids (e.g., monitoring and chronic disease management applications).

CHI applications have many potential benefits, as they enable patients and their caregivers to play a more active role in the health care system. This form of consumer empowerment is expected to translate into individual health improvements as well as systemic benefits in terms of overall cost savings. However, the use of CHI applications also poses significant risks with respect to information privacy and security. Personal health information is among the most sensitive data, and unauthorized disclosure may lead to severe consequences and irreparable damage for individual citizens, e.g., financial loss, loss of reputation (ridicule), psychological hardship, and loss of insurance coverage, employment and livelihood.

We have recently published a systematic review of privacy risks and potential mitigation strategies associated with CHI applications. The research has been funded by the Office of the Privacy Commissioner of Canada and can be found here.

Written by Jens

June 2nd, 2011 at 12:24 pm

Posted in Privacy

Patient disclosure directives – a headache for Engineers?

with 2 comments

Many shared health record systems being deployed today provide patients with some way of declaring disclosure directives to limit the distribution of their personal health data. The Acting Privacy Commissioner of British Columbia has just issued a report pointing out that more must be done to protect this very sensitive data. In a recent letter to the Times Colonist, responding to their coverage of the commissioner’s report, I pointed out what BC citizens can do right now to protect their shared health data. I have discussed the issue of consent directives and patient controls on data disclosures with several co-researchers and practitioners. There seems to be a general feeling that these kinds of mechanisms create a lot of “headaches” for software engineers. I am wondering why this is so…
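
To make the engineering “headache” a bit more concrete, here is a minimal sketch in Python of the kind of disclosure check a shared health record system might have to perform. All names and rules are hypothetical; the sketch only illustrates how a patient’s masking directive interacts with blocked providers, the purpose of use and an emergency (“break-the-glass”) override, and why every one of these paths also needs to be audited.

    # Hypothetical sketch: evaluating a patient's disclosure directive before
    # releasing a record from a shared health record system.
    from dataclasses import dataclass

    @dataclass
    class Directive:
        patient_id: str
        masked_categories: set        # e.g. {"mental_health", "hiv"}
        blocked_providers: set        # providers the patient has explicitly excluded

    @dataclass
    class Request:
        provider_id: str
        role: str                     # e.g. "physician", "pharmacist", "clerk"
        purpose: str                  # e.g. "treatment", "research", "billing"
        category: str                 # category of the requested record
        emergency_override: bool = False

    def may_disclose(directive: Directive, req: Request, audit_log: list) -> bool:
        """Return True if the record may be disclosed under the directive."""
        if req.provider_id in directive.blocked_providers:
            audit_log.append(("denied_blocked_provider", req.provider_id))
            return False
        if req.category in directive.masked_categories:
            # "Break-the-glass": a treatment emergency may override a mask,
            # but the override itself must be logged and reviewable.
            if req.emergency_override and req.purpose == "treatment":
                audit_log.append(("override_of_mask", req.provider_id, req.category))
                return True
            audit_log.append(("denied_masked_category", req.category))
            return False
        return req.purpose == "treatment"   # toy rule: treatment use only by default

    audit: list = []
    d = Directive("pat-001", masked_categories={"mental_health"}, blocked_providers=set())
    r = Request("dr-42", role="physician", purpose="treatment", category="mental_health")
    print(may_disclose(d, r, audit), audit)   # False, with a denial entry in the audit log

Even this toy version has several interacting conditions, and real directives can also change over time or be set by substitute decision makers – which may explain some of the headaches.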

Written by Jens

July 8th, 2010 at 12:30 pm

Posted in Uncategorized

How to Define Safety?

with 2 comments

In follow-up to Jens’ post on safety mandates in Canada, I agree that this is an important aspect that needs to be considered.

But how should we consider safety?

At a high level, “Do no harm” resonates with many providers. We would need to be more precise in our definition of the domains of safety so that we can measure what is safe and what is unsafe. The US Meaningful Use requirements do mention safety, but they appear to assume that safety is improved through the use of systems (e.g. CPOE).

But with health information systems, how do we evaluate safety and harm? There are many aspects, and we are only just beginning to turn a critical eye to some of the unintended consequences. Implementing systems does not, de facto, equate to improved safety. We are seeing new kinds of errors, and healthcare systems are being changed by the introduction of technology. Even as academia begins to explore this, realizing that each system and even each installation is likely unique in its context, decision makers are not aware of these important distinctions. For many, adoption equates to an improvement in safety. So how do we go about defining aspects of safety in a manner that is measurable and digestible?

We can define it from an outcome (or potential-outcome) perspective and quantitatively measure how many errors we have. Measures such as adverse drug events, unnecessary surgeries, mortality and excess hospital stays can be used. These are important. How to attribute them to the information system is another question, as these are interventions in a complex space.

Looking a bit further upstream, we can examine system function and design. Usability testing and analysis are helpful here. While errors found this way can be more readily attributed to the information system, it is harder to predict the actual impact of design errors on patients. It is also harder for decision makers to wrap their heads around some usability results, as they can be very detailed and not concrete in their outcomes.

Although I have moved a bit off topic, I think safety is something that needs to be considered, but how can we get safety design on the table?

Written by priceless

July 3rd, 2010 at 7:15 am

Posted in Quality, Safety

Safety – a missing mandate in Canada’s national EHR project?

with 2 comments

Canada Health Infoway (CHI) has been funded by the Canadian government with $2.1 billion since 2001 to foster the development of a pan-Canadian EHR infrastructure. CHI’s mandate has been to “collaborate with Canada’s provinces and territories, health care providers and technology solution companies to implement private and secure health information systems.” (see CHI’s latest report to the public). The latest report contains a risk management framework with an analysis of different types of risks, including financial risks (funding), adoption risks, and security and privacy risks. It is noteworthy that safety risks do not appear at all. In fact, the safety aspect is not addressed anywhere in the report, apart from general conjectures that eHealth technologies will improve patient safety. As mentioned in my previous blog post, there are significant indications that this is not necessarily so. The absence of safety as a priority in the pan-Canadian summary care record architecture standards may very well become a major problem down the road (see the recent commentary by Ross Anderson in the BMJ, who believes that summary care records will do more harm than good). The failure to address safety as a primary objective in pan-Canadian EHR standards is also in stark contrast with the objectives of regulators (Health Canada, FDA), who are primarily concerned with the safety of EHR software. Is it time to redefine CHI’s mandate to include safety as another quality objective, next to privacy and security?

Written by Jens

July 2nd, 2010 at 10:36 am

Posted in Quality, Safety

The “tip of the iceberg”? Should Electronic Health Systems be regulated?

with 2 comments

The U.S. FDA is getting increasingly concerned about the safety of Electronic Health Systems. Earlier this year, I attended a meeting of the Software Certification Consortium (SCC), which involved regulators, academics and practitioners from different areas of critical systems. eHealth systems are certainly seen as a growing issue there. Now the Huffington Post has published an article stating that the FDA’s voluntary notification system has logged a total of 260 reports of “malfunctions with the potential for patient harm,” including 44 injuries and six deaths. Since such reports are completely voluntary at this point, it is fair to assume that this is just the “tip of the iceberg”. The article mentions that in the past the vendor community has argued against regulatory oversight of their products, for fear of slowing down adoption.
Notably, some vendors, such as Cerner, have come to the conclusion that mandatory registration and reporting “is the right thing to do”.
Other jurisdictions, such as Canada, seem to be ahead. Health Canada considers any type of patient management software to fall under a mandatory licensing regime under its medical devices regulations.

Written by Jens

June 30th, 2010 at 4:11 pm

Posted in Uncategorized

“E” is for “Engineering”

with one comment

The word “eHealth” has gained a bad reputation in light of recent scandals around spending billions of dollars on projects with questionable outcomes. Significant funds have been invested in “eHealth” – not only in Ontario, but across the country. Still, as Mike points out on his blog, a recent article in the Canadian Medical Association Journal states that “confusion and disarray appear to be the only form of national standards in operation within health information record-keeping circles”.
What went wrong, and how can we avoid such scandals in the future? The answers to these questions are complex. Karim Keshavjee has offered a detailed analysis of the reasons behind the scandals in a recent issue of Canadian Health Care Technology. Among other factors, he criticizes the “deeply held belief in Canada that eHealth products are mature and can be purchased ‘commercial off-the-shelf’ (COTS); i.e., ready to use from the package.” However, as he continues to argue, “each healthcare organization is unique and every healthcare jurisdiction is different. Functions that work well in one setting may not work in another.”

What I take away from this is an argument for the necessity of “Engineering” a solution rather than the mere “installation” of a product. So what is the essence of Engineering? Nagib Callaos has written an enlightening paper on this topic. He defines several conditions that are necessary for Engineering:

  1. Usefulness – an Engineering activity must produce a useful thing, i.e., generate human benefit. (As Karim pointed out in his article, usefulness and evaluation have not been afforded much attention in the eHealth Ontario project in the past.)
  2. Know-How Knowledge – Engineering requires scientific knowledge and training. Vendors and consultants should be selected based on their know-how, not based on political or other factors.
  3. Practice and Praxis – practical experience and tacit knowledge of how to “do things” are key to success. (As Karim pointed out, eHealth Ontario failed to involve end-users and domain experts in a sufficient manner.)

My suggestion on how to move forward and re-establish the image of “eHealth” would be to define the “e” in this term as “Engineering”, and to use Nagib’s definition as a guiding principle from now on:

Engineering is the development of new Knowledge (scientia), new ‘made things’ (techné) and/or new ways of working and doing (praxis) with the purpose of creating new useful products (artifacts) or services.

Written by Jens

April 21st, 2010 at 4:54 pm

Posted in Uncategorized

A Blog for a Book

without comments

We decided to start a blog as a place to share our process and resources related to our upcoming textbook.

The book is being written to address a range of challenges and opportunities in engineering health information systems.

We look forward to adding to this blog and encouraging comments and feedback over time.

Thank you

— Jens, Craig, Morgan

Written by priceless

April 16th, 2010 at 4:18 pm

Posted in Uncategorized