Monthly Archives: May 2014

Data Can Relieve the Healthcare Budget Pressure

An interesting week in the healthcare space in Australia, with the federal budget and then the Royle review's recommendations for the PCEHR made public. We certainly live in interesting times!

I believe there is an incredible opportunity for us all; as they say in the classics, 'necessity is the mother of invention'. It seems the perfect storm is brewing: an ageing population, the rise of chronic disease, rising costs and budget pressure. The question is how we set a course through this storm, to ensure the destination is much better than where we have come from. (Poetic, even if I do say so myself!)

A couple of words caught my eye in the PCEHR review: 'meaningful use'. The concept is to achieve a state where health records are used to improve the health of the community, drive up diagnostic accuracy, and improve the research and development of new drugs and more effective treatments. This is all about the data! Getting it to a point of completeness and accuracy so it can be used meaningfully!

This may sound familiar, as it's part of Obama's plan to change healthcare in the USA. To achieve 'meaningful use', they have defined a seven-stage process to guide their industry along. Similar to the Royle review's recommendations, Obama has put in place a series of investments, as well as 'consequences' for not achieving the milestones. When you consider the 'private', corporatised nature of the USA healthcare system, you would assume it is against their business model to help their 'customers' go next door. So surely there are fewer vested interests here inhibiting us from creating a national health record system?

Perhaps the idea has not been fully 'sold' to the public, and the implementation has not been optimal. Speak to any GP and they will bemoan the integration and the additional workload… which the review's recommendations address. Once these issues are sorted out, we begin down a path where the accumulation of data drives every facet of the industry forward, towards a better, faster and cheaper healthcare system!

Marc Andreessen, founder of Netscape (the web browser that competed with Microsoft in the early days of the internet), is credited with saying that "software is eating the world." He was referring to the dramatic shifts that happened to the music and video industries as they became digital. I think he was wrong: 'Data is eating the world.' It's the use of data to create understanding that can transform everything we do! The trick is not to look at digitisation as just a substitute for the old, but as a new way to do things. (For example, a CD was merely more convenient; online music changed the industry.)

In healthcare there has been a lot of 'substitution', but not a great deal of leveraging the potential of eHealth. For example, having an electronic patient record is more convenient; however, being able to aggregate a large number of histories and understand how treatments are affected by lifestyles, which in turn affect long-term health outcomes, has the potential to produce new protocols that deliver better outcomes and lower costs.

Thinking about the 'use' of data once it's digital will lead us beyond the horseless carriage, to driving efficiency, lowering costs and opening up new ways to revolutionise the way people are kept well.


How IT Buzzwords Impact Healthcare

Every industry loves its buzzwords, and IT more than most. Today you can't talk about technology without one of the big four (Social, Mobile, Big Data or Cloud) being dropped into the conversation. The question is: what does this have to do with healthcare? Let me discuss this over the next few blog posts, but first some context is required.

According to many commentators, such as the analyst firm IDC, IT is entering its third epoch: starting with the birth of IT in the form of the original mainframes, through the current PC-dominated client-server era, and into the emerging 3rd Platform, typified by the 'web-scale' organisations. What I find interesting is that I can see parallels with healthcare; let me explain.

The first platform was a time-shared infrastructure, to which you would go with a particular job and walk away with a certain outcome. Sound like a hospital? The issue with this model is that a massive infrastructure investment is required, along with very high levels of expertise to use it effectively. That is why today, while there are still many mainframes in use, the 'jobs' they tackle are very specific to what they were designed for, and to where they are the most efficient way to achieve that outcome.

Today, however, most computing is performed on the second platform – client-server – enabled by the birth of PCs and networks. This new model enabled new capabilities, such as interactivity, specialisation and a lower cost of production. The 'work' was split into different layers, and specialist organisations created software solutions which automated processes. (Think PACS, RIS or EMR.) This structure is mostly hub-and-spoke, with specialists performing their specific task and then passing on to the next layer. I would argue it is much like the delivery of healthcare outside of the hospital today, where a GP refers to a specialist, who refers to an 'ology', which reports back to the specialist, who diagnoses/treats and then reports back to the GP – each performing a task and passing off to the next entity. Now this model is effective at automating a process, but does it inherently improve that process, or add to the quality of what is being done?

In my previous blog post I spoke about Big Data and some of what is enabling it: "What's changed? Over the last decade technologies that can economically store and reason over disparate data types have been developed…" (carry on reading here). This leads us to the 3rd Platform; if you like, consider how the 'web-scale' organisations do what they do: Amazon predicting books you would like to read, Google giving you the latest information, and Facebook changing social connectivity.

What does this mean for healthcare? You are familiar with the past: computers have helped add up numbers and do accounting (mainframe), and they have automated processes like patient record keeping and image management (client-server). Now technology is helping us to predict, to understand, and to tap into the collective. In doing this we get assistance in diagnosis, in discovering new protocols and drugs, and in predicting likely outcomes. The advantages to healthcare of better planning, decision support and accelerated innovation are dramatic. In essence, this is the platform that enables healthcare to move into 'Personalised Wellness' or 'Patient-Centric Healthcare'. Consider the 3rd Platform as helping to improve thinking, the human process!

Three 'platforms' for technology and three 'platforms' for healthcare delivery. The IT industry delivers better, faster and cheaper; our challenge now is to use these technologies effectively to deliver better, faster and cheaper healthcare!

Healthcare in Australia and NZ only has ‘Small Data’ – Really?


'Big Data' is not about BIG, nor is it about DATA… but one thing I'm certain of is that these technologies and methodologies will accelerate discoveries, improve patient outcomes and dramatically reduce healthcare costs. (The one problem is that IT vendors chose the wrong name!)

Before you stop reading, let me convince you of the merits and applicability to healthcare. Consider hip replacements: if a way were found to replace a hip so that it never had to be redone, both patient outcomes and costs would be dramatically improved. (Example from this blog post, where the TED talk cites a group of doctors who collaborated, gathered data and found a pattern, which resulted in exactly these outcomes.) Now if you could watch many hip replacements and follow the patients, then given a large number of procedures you would start to detect which 'techniques' resulted in the best outcomes. This is the idea of 'big data': to find these patterns automatically, using computers and the available data.
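To make the idea concrete, here is a minimal sketch of what 'finding the pattern' could look like once procedure and follow-up data are captured. The records and technique names are entirely made up for illustration; real studies would of course control for patient factors and use far larger samples:

```python
from collections import defaultdict

# Hypothetical follow-up records: (surgical technique, revision needed?)
outcomes = [
    ("cemented", True), ("cemented", False), ("cemented", False),
    ("uncemented", True), ("uncemented", True), ("uncemented", False),
    ("resurfacing", False), ("resurfacing", False), ("resurfacing", False),
]

# Tally total procedures and revisions per technique
totals = defaultdict(int)
revisions = defaultdict(int)
for technique, revised in outcomes:
    totals[technique] += 1
    if revised:
        revisions[technique] += 1

# Revision rate per technique: lower is better
rates = {t: revisions[t] / totals[t] for t in totals}
best = min(rates, key=rates.get)
print(best, rates)  # "resurfacing" has the lowest rate in this toy data
```

The whole point is that the tallying and comparing is trivial for a computer; the hard part, as the rest of this post argues, is capturing the data in the first place.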

What's changed? Over the last decade, technologies that can economically store and reason over disparate data types have been developed. (By different data types, think of structured data – the data in a spreadsheet or database, invented for computers – and natural data, called unstructured, such as pictures, X-rays and ECG waveforms, which humans make sense of quickly.) The power of these new technologies is that they bring all this information together and provide the analytic tools to find patterns and correlations and/or create predictive models.
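As an illustration of the 'find correlations' part, here is a small sketch that computes a Pearson correlation between a structured lifestyle measure and an outcome measure. The numbers are invented purely for illustration, and of course correlation is not causation:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-patient data: average daily activity (minutes)
# and post-procedure recovery time (weeks)
activity = [10, 25, 30, 45, 60, 75]
recovery = [14, 12, 11, 9, 8, 6]

# A strong negative value suggests more activity goes with shorter
# recovery in this toy data set
r = pearson(activity, recovery)
print(round(r, 2))  # -0.99
```

This is a deliberately tiny example of one analytic tool; the real power comes when such calculations run automatically across millions of aggregated records.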

It sounds complicated, but if there were a way to capture the data about procedures and patient outcomes over time, 'big data' could find the patterns which result in the best overall outcome. Immediately the cry goes out that healthcare professionals cannot spend time inputting more data! And they are absolutely right: these systems should aid and assist the practitioner. But let me suggest that a great deal of the data already exists in the disparate computer systems, within monitors and the various imaging and measuring modalities, as well as on paper. While it's up to the IT industry to provide ways to extract all this information in a secure and controlled manner, there are emerging technologies which will take this idea further.

One interesting, and perhaps confrontational, technology is video analysis. Today video analysis is used to detect 'suspicious' behaviour in public places (Boston example here), to help major stores detect shoppers needing assistance, and to improve workplace practices to reduce accidents. So it is conceivable that a video of a surgical procedure could be analysed and compared with others, to provide input into the improvement cycle! Or, similarly, a radiologist with a tricky image could be presented with similar X-rays and the diagnoses his peers made.

In summary, 'Big Data' is about using available data to improve processes, understand trends, find correlations and develop predictive models. While you don't need huge amounts of data, you do need the vision to make it happen! And while Australia and New Zealand lag behind in this area, I wonder if we can learn from what has been done in the rest of the world and leapfrog it?


Epic comes to Australia: an indication of a new era, or déjà vu?


The Royal Children's Hospital in Melbourne announced its decision to implement Epic a couple of weeks ago; congratulations to all involved. Now there are end-to-end EMR systems here already, but Epic took the USA market by storm a few years ago and, I believe, became a catalyst in changing the offerings in this space.

(Be warned: bragging ahead.) EMC in the USA has been very successful in helping Epic's customers, providing infrastructure and 'allied' applications, such as extracting data from and retiring legacy systems, and providing solutions for the residual 'paper' workflows. I am told that somewhere in the vicinity of 70% of the major Epic installs run on EMC infrastructure. (End of bragging, sorry.)

So, to find out what to expect, I contacted my colleagues, and as I learnt more I had to keep pinching myself, as I felt like I'd been here before. Then it dawned on me: I had! Except the application wasn't an EMR, it was ERP; and the vendor wasn't Epic, it was SAP! (Enterprise Resource Planning was the move from siloed applications in the enterprise to an integrated end-to-end system, and SAP became the catalyst and leading vendor in this space.)

I was on a team implementing SAP for an IT service provider; at the time I managed the delivery of services and was the line-of-business lead on the project. Although this was many years ago, I distinctly remember the first team meeting with the implementation consultants (a major consultancy). The team lead opened with, "The best thing about SAP is that it is an integrated end-to-end system, and you will learn that the worst thing is that it is an integrated end-to-end system!", and he was right. Huge rewards, but a long journey to get there!

Having the technology bent that I have, I got involved in the infrastructure selection. Various vendors pitched their products to us and provided the appropriate configurations and pricing. (As you would imagine, our goal was to transform the way we did business, and we were squarely focused on how the organisation, processes, workflows, skills, etc. were going to change and how to effect this change, so the hardware decision was fairly close to the bottom of the priority list. But it was necessary, and on the critical path of the project.)

Then one vendor's message to us was along the lines of, "Select us, because we will be transparent; you won't think about the infrastructure again!" They got the deal, and it worked. They had a depth of expertise, and an architecture and design that just delivered the right stuff out of the box, saving us time on the start-up of the project, and they just seemed to solve every issue we had. For example, we kept needing new environments for testing, or for users to get training, etc. The SAP experts would estimate a few days, and the infrastructure guys would say, "We'll have that for you in an hour; is that fast enough?!"

The ironic, and perhaps funny, part of my feeling of déjà vu is the many parallels between SAP and Epic:

– It seems like the vendor decides who their customers are going to be.

– The organisation must adopt the system's 'best practices'.

– The approach 'divides' the market into believers and non-believers.

On this last point: while I was in Melbourne, when rumours of this decision started, the CIO of a similar-sized healthcare provider was questioning the decision and wondering how the business case stood up, which is exactly the same conversation that was had a thousand times about SAP!

In the end, if a new product or approach creates innovation that brings benefits leading to better patient outcomes, it has to be a good thing!