I've spent a considerable amount of time thinking about health data integration and interoperability. As a programmer and manager working directly on electronic clinical documentation systems integration, secondary data reuse, and clinical data acquisition and access, I've been involved in solving these problems firsthand. With heightened public interest in all things healthcare, the question of how we as an industry can upgrade our integration capabilities has been revisited with renewed enthusiasm. There can be no question that the current state of affairs is suboptimal. So, what would be optimal? As always, my opinions are my own.
The HIT industry should take a page out of Social Media's playbook. One interesting side effect of the Social Media revolution over the last decade is the rise of the Social Network as popularized by Facebook. The Social Network is driven in part by a concept called the network effect, which more or less states that the more participation a network has, the more valuable it becomes. One way Social Media companies were able to do this was by standardizing around certain methods of data integration, interoperability, authorization, and authentication. For example, when I publish this post it will also be automatically published under my profiles on Twitter and LinkedIn. "Social Media" has evolved into an ecosystem of companies, each with their own competing agendas and interests, who have found a way to interoperate in a manner that benefits the network as a whole. How can we foster in HIT the sort of evolution that has occurred in the general consumer Social Media space?
For many who aren't in the Health Informatics space, Apple's recent keynote at WWDC 2014 introduced a slew of regular non-techie, non-early-adopter folks to the idea that personal health data is coming in a big way. Everything from connected digital scales to blood pressure devices to glucose readers to your sneakers will be able to contribute, centralize, share, aggregate, and visualize personal health data. The opportunities for people to take control of their own health are infinite. Beyond the general concepts of personalized health care and patient empowerment, and the promise of data integration and sharing that a plethora of providers are looking to make a reality, let's take a look at how Apple will implement all this within their unique environment. How is Apple proposing this actually happen, and what lessons can the larger Health Informatics community take away from this very public project?
Apple's solution in this space is called HealthKit. Firstly - and I may as well stop here - Apple has publicly produced a full set of documentation. This is basically all you need to know on not only "how" Apple is going to implement this, but how Apple is going to get other developers to implement this. What does "a full set of documentation" actually mean? It means there is a video introducing HealthKit, there is a HealthKit API within the iOS 8 SDK, and there is sample code in the iOS 8 beta documentation. Yes, you need to be a signed-up "Apple Developer" ($99/yr) to access some of that information (the iOS 8 beta docs), but hey, this is Apple's show, so jump through their easy-to-jump-through hoops and let's move on. For those of you who are not Apple Developers and can't read the documentation, just watch the video - publicly available - and you'll basically know everything you need to know about how integration should happen in practice.
What is actually going on here is that Apple wants to position itself as the central hub for all your personal health data. Whether or not that vision pans out is a function of their ability to drive value for the consumer by attracting third-party software and hardware developers to their platform. Critically, Apple has realized that in order to attract those developers they need to be open and transparent about how one can interface, integrate, and interoperate with their technology. I predict that we will see a fully functional EMR built entirely on HealthKit in the not-too-distant future.
Contrast Apple's approach with that of the various established players in the enterprise HIT space. Getting information from vendors on how their systems actually work, how they are designed, how your data is stored in their applications, or how to access your information in those systems is often an exercise in futility. EMR/EHR vendors and, by extension, medical device manufacturers have traditionally been opposed to making such information available. A consumer of these enterprise products who attempts to provide interoperability features within their own environment, or to expand the set of features that any given product offers, can actually be held legally liable for those attempts. For more on those legal concerns, watch this presentation by Henry Jones III, a lawyer with a wealth of experience in software procurement in the HIT space, given at HIMSS-SCT 2013.
Any Health Informaticist who has worked on real-life systems integration knows exactly what I am talking about. The degree of difficulty in integrating health data from any two electronic clinical data sources (let alone a handful) is extraordinary. To be certain, there are technical considerations to be dealt with when attempting systems integration. However, most of the time the difficulty is not limited to technical concerns. An often overlooked and neglected difficulty in integrating clinical informatics systems is rooted in the fact that the vendors of these products - EMRs, EHRs, medical devices, etc. - are financially disincentivized from allowing those products to interoperate. Put another way, vendors are financially incentivized to be as proprietary about their systems as possible. It should come as no surprise that vendors are eagerly interested in locking their customers into their solutions, a.k.a. vendor lock-in.
To the extent that newcomer startups successfully challenge the old guard with more accessible products, this will begin to change. Sort of. The problem in enterprise HIT, unlike in the consumer space, is that once a decision is made to purchase a large mission-critical system, the organization is more or less married to the vendor - forever. It is incredibly cost prohibitive to change clinical systems once they have been deployed. Startup solutions have tremendous difficulty making inroads in the enterprise not only because they are unproven but because there is a real likelihood they will not exist as a going concern some handful of years in the future. The landscape at the moment is that the majority of large healthcare organizations have already purchased EMR/EHR systems from vendors who do not readily provide interoperability and integration capabilities, or documentation to easily build them. To the extent that vendors do provide interoperability features, I would submit that it has been entirely consumer driven.
DeSalvo: Time for the heavy lifting on health record interoperability http://t.co/KpgpLfH1R8
— FCW (@FCWnow) July 17, 2014
Any way you look at it, the government is a stakeholder in healthcare in general and HIT in particular. Karen DeSalvo at the Office of the National Coordinator for Health Information Technology and her colleagues are spearheading a drive towards an interoperable HIT infrastructure. In furtherance of that goal, the ONC recently released a 10 Year Plan [pdf] addressing the issue. This concise document, weighing in at 13 pages, is an absolute treasure trove of information on the government's thinking and an indication of possible future regulation. Surely to be picked apart with a fine-toothed comb at a later date, the document outlines a number of areas for improvement and, most importantly, highlights specific implementation goals for each area. A number of the points raised above are mentioned throughout the document.
At its core, the government need only do one thing to encourage innovation in the interoperability space, and it is this:
I call this the Core Mandate. The core mandate must be unequivocal with no loopholes. What do I mean by "interoperability features"? Simply:
A system is defined as any software application or hardware device.
In computer science, programming languages have a concept called primitive data types or, simply, primitives. You can think of primitives as the basic or core data types available to a programmer within a given environment or language. Think of the core mandate as a primitive of the 10 Year Plan. If the core mandate were part of the 10 Year Plan, it might be referenced as Building Block #0. Building Block #1, page 9, of the 10 Year Plan lists a number of services or features that would rely on the core mandate. At first glance the list is sound. Nevertheless, the problem with any list is that it is finite. The future is infinite. By simply implementing the core mandate, the government creates a level playing field where all vendors stand on the merits of their various offerings in the free market, negates the possibility of vendor lock-in, and creates an environment where enterprises can purchase from startups with confidence. In a situation where a vendor, for any reason, can't or won't provide one of the infinite features required in a future world, the free market will fill the void.
Any additional feature standardization built on top of the core mandate, i.e., the methods outlined in Building Block #1, should be done in accordance with the spirit and guidelines of the IETF RFC process. The IETF RFC process is responsible for producing the standards that the Internet and other critical technologies are based on.
The government could implement this policy the same way it implemented the adoption of EMRs years ago. First by incentivizing compliance, then by penalizing non-compliance. The government could also cajole industry by offering tax incentives, I'm sure someone could find a way to stuff it in an appropriations bill ;)
In Jurassic Park the indomitable Jeff Goldblum tells us that "Life, uh, finds a way."
If you watched Apple's HealthKit video (yes, you'll need to be technically inclined to understand a lot of the nitty gritty), the takeaway for policy developers and implementers is that technical information of the breadth and quality provided by Apple should be readily available from all vendors. Apple's particular solution is a centralized data store, whereas a more general enterprise solution would be along the lines of a federated data store; neither approach is right or wrong. They are simply technical details. One may be more appropriate than the other due to practical realities of a given problem set or use case. The fact is that Apple's methods should be hailed as best practice for good corporate citizenship in a future world of interoperable data systems. Apple's methods are an example of the proper implementation of the core mandate in practice and spirit.
At the risk of conflating issues, I want to take a moment to talk about medical ethics as it relates to data integration and interoperability. Ask yourself: Is it ethical to deploy solutions in a healthcare environment that do not interoperate, whose documentation on internal mechanics is not shared with customers, and whose stored data cannot be exported at will in a cohesive fashion? Or, to bring it into the medical device realm, is it ethical to implant a device whose command, control, and data input and output are only accessible via a corporation that may cease business operations at any time? One could make the argument that healthcare systems that do not provide interoperability features are unethical. Beyond everything I've already covered, I believe healthcare data integration and interoperability is an ethical imperative.
Data integration has been a concern for healthcare organizations since healthcare organizations first had more than one data system to integrate. The demand for consumer-driven health data products and Apple's entry into the space is in line with what is happening in enterprise HIT at a regional and national level. A confluence of reasons has conspired to push integration back to the top of the agenda. Call me pollyannaish, but I'm of the opinion that when all is said and done, government policy should work to regulate and incentivize industry to implement data integration in the spirit of the core mandate outlined here. Whether consumer or enterprise, and no matter where data is collected, people will want to combine that data in new, different, and interesting ways. What we can do as HIT practitioners is create a future where the infinite is at least a possibility.
tl;dr If you can ssh out, you can bypass most any network restriction.
Overzealous IT administrators. Restrictive network policies. Overreaching governmental meddling. What's a humble interweb denizen to do? Short of packing it all in and turning it all off, I say get to know the secure shell, more commonly referred to as ssh. If you've heard of ssh, you probably know it can do more than just connect you via an encrypted connection to a remote server. You probably also know that it can forward a local address and port to a remote address and port through your ssh connection using the -L flag.
For example, when I'm on a network that blocks IRC, I use the following command to circumvent it:
ssh -L 6667:irc.freenode.net:6667 user@host.com
Then I connect my IRC client to localhost:6667, which is forwarded via my ssh connection to irc.freenode.net:6667. Easy peasy. This works because I control my IRC client and can tell it to talk to localhost:6667 instead of irc.freenode.net:6667. But what if you have a prebuilt application that needs to dial into a specific website or IP address? You can't control that connection, so you intercept it: you emulate the service's IP address locally. Basically, you find out the IP address of the service you want to talk to and tell your computer that it is that IP address by binding it to the local loopback adapter. You know that nifty 127.0.0.1 address? That's your local loopback. The trick is to realize that you aren't limited to that one address. You can add others like so:
sudo ifconfig lo0 add 1.2.3.4
This tells Mac OS X to add the address 1.2.3.4 to the local loopback adapter, as if your computer actually were that address. This allows your browser and any other application to operate normally, maintaining all network/VPN applied routes and proxies. The only difference is that this one address will now be smuggled out through the gaping hole your ssh connection made in their network. You need to do this before you establish the ssh connection, or else ssh will error out because it won't recognize the address as a local address. Once you've added the loopback address, you can set up your ssh session like so:
ssh -L 1.2.3.4:5000:1.2.3.4:5000 -L 1.2.3.4:5001:1.2.3.4:5001 user@host.com
In this case I'm forwarding two specific ports, 5000 and 5001.
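Putting the pieces together, the whole session looks something like the sketch below. Note that 1.2.3.4, ports 5000/5001, and user@host.com are placeholders standing in for your actual service address, ports, and ssh server:

```shell
#!/bin/sh
# 1. Bind the service address to the loopback adapter first
#    (Mac OS X syntax; on Linux the equivalent would be:
#    sudo ip addr add 1.2.3.4/32 dev lo)
sudo ifconfig lo0 add 1.2.3.4

# 2. Open the tunnel. -N skips running a remote shell, since all
#    we want are the port forwards.
ssh -N \
    -L 1.2.3.4:5000:1.2.3.4:5000 \
    -L 1.2.3.4:5001:1.2.3.4:5001 \
    user@host.com

# 3. When you're done, remove the alias again (Mac OS X):
# sudo ifconfig lo0 -alias 1.2.3.4
```

Drop the -N if you also want a shell on host.com alongside the forwards.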
Here's something I pieced together from a few different posts around the web.