Google Healthcare: Personal Healthcare Reduced to an Algorithm

By R Carter

Google has become a multibillion-dollar player in the healthcare industry, with the potential to combine medical and search data in an alarming number of ways. With the $2.1 billion acquisition of the wearables company Fitbit, the company can now merge personal data collected from its search engine with data about your basic health and fitness, plus prescriptions, medical symptoms, treatments, test results, surgical procedures, and more.

Google keeps gigabytes of data on every user of its products. Not only what you search for through your browser, but what you buy over the internet, where you eat, what you watch on TV, the music you listen to, what you read; in fact, nearly every detail of what you do in private with your computer or Android mobile device. Google sits silently in the background, keeping tabs and storing this information, which it then uses to target you commercially.

Google has assured the public that it follows all relevant privacy laws, but the regulatory-compliance discussion only distracts from the strange future that is evolving. As Google pushes further into healthcare, it is amassing a trove of data about us. The Wall Street Journal has reported that Google secretly harvested “tens of millions” of medical records, including patient names, lab results, diagnoses, hospitalization records, and prescriptions, from more than 2,600 hospitals as part of a machine learning effort codenamed Project Nightingale. Google, in partnership with Ascension, a healthcare provider operating in more than 20 states, plans to build a search tool for medical professionals that will employ machine learning algorithms to process the data and make suggestions about prescriptions, diagnoses, and even which doctors to assign to, or remove from, a patient’s team. In other words, it would eliminate personal choice from healthcare decision making by reducing choice to an algorithm.

In running this program, neither the affected patients nor Ascension’s doctors were made aware of the project. And again, all parties assert that HIPAA, the package of privacy regulations protecting patient data, allows the data to be used in this manner. Clearly Google’s attorneys believe they have found a loophole in HIPAA that allows this; otherwise, Google would not have invested billions of dollars in the program.

The Department of Health and Human Services (HHS) is investigating these claims. Under Google’s interpretation, the company is merely a “business associate” helping Ascension better render its services, and thus warrants a different level of scrutiny than an actual healthcare provider. But if HHS determines that Google’s handling of private information makes it something more akin to a healthcare provider itself, it may find Google and Ascension in violation of the law and refer the matter to the DOJ for potential criminal prosecution.

With Google considering itself a business associate of Ascension, it won’t be long before other big-data companies follow suit in an industry estimated to be worth $7.7 trillion annually.

In the 20th and 21st centuries, the scientific method of reductionism, breaking something down into its smallest parts to understand what it is and how it works, heralded innovations that eased human suffering and prolonged life. But in the early 21st century, that same science, with its profit-driven big-data and machine learning tools, is being used to eliminate human qualities by reducing all choice to an algorithm.

The idea fundamentally works like this: your future choices can be predicted from your past choices, and your emotional and rational state of mind inferred from those same choices. The problem with this view lies in the science itself, which at the quantum level tells us that outcomes are inherently unpredictable, i.e., the Heisenberg Uncertainty Principle. People are inherently unpredictable, and the Newtonian point of view, which says that by knowing where each particle is and where it is going we can predict what will happen in the future, is completely destroyed by the laws of quantum mechanics.
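The assumption underlying this kind of system can be sketched in a few lines. This is a deliberately naive illustration with hypothetical data, not anything Google has published: treat whatever dominated your past behavior as the prediction of your future behavior.

```python
from collections import Counter

def predict_next_choice(past_choices: list[str]) -> str:
    """Naive behavioral prediction: assume the future simply
    repeats whatever choice dominated the past."""
    return Counter(past_choices).most_common(1)[0][0]

# Hypothetical browsing/purchase history
history = ["pizza", "salad", "pizza", "pizza", "sushi"]
print(predict_next_choice(history))  # -> "pizza"
```

Real systems are far more elaborate, but the premise is the same: past behavior is taken as a sufficient basis for predicting, and then steering, future behavior.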

So why pursue something science has already proven cannot be predicted? The answer is profit. The information age, with its global conglomerates, big data, and machine learning, has introduced a problem our conventional laws, morality, and ethics could not have anticipated, proving the Heisenberg Uncertainty Principle is as true today as when it was postulated in 1927.

A recent report from the Financial Times, done in collaboration with Carnegie Mellon, notes that Google, like Amazon and Microsoft, collects data entered into popular health and diagnosis sites. Google’s ad service, DoubleClick, receives prescription names from some of these sites, for example, while WebMD’s symptom checker shares information with Facebook. The data is not anonymized, and the legal experts interviewed argued the collection may violate EU privacy law.

Your very online existence (the sites you access, where you access them from, the ads you click on) gives Google the kind of holistic, robust, up-to-date view of your health that was largely unimaginable a decade ago. The hype, or hope, is that as more and more information is gathered and data sets are combined, super-tailored care pathways and eventually treatments become possible. So it’s not just that you’re 35 and have pancreatic cancer. It’s: you’re 35, you have pancreatic cancer, here’s your medical history, your family history, and your genetic markers for oncology, and here’s the care pathway just for you. You as a person have no choice; the options offered to you have already been calculated and selected as those most profitable to sustain and maintain the system that does the predicting.

In the wrong hands, with the wrong motives, big data and machine learning, driven by a profit motive, are in reality a method of herding countless patients down a path that sustains the system by addressing individuals’ needs based on where the majority fall within a bell curve. If you happen to be an outlier, on either end of that bell curve, the system has no solutions for you, because you do not meet the criteria of profitability.
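The bell-curve dynamic described above can be put in concrete terms with a toy sketch. Every name, threshold, and score here is hypothetical, invented purely to illustrate the point: a system built to serve the profitable middle of the distribution simply has nothing to offer the tails.

```python
from typing import Optional

# Hypothetical "care pathway" catalog: protocols exist only for
# the patient profiles the majority falls into.
PATHWAYS = {
    "standard": "generic protocol optimized for the average patient",
}

def recommend_pathway(risk_score: float) -> Optional[str]:
    """Return a pathway for patients near the center of the bell
    curve; outliers at either tail get nothing."""
    if 0.3 <= risk_score <= 0.7:   # the profitable middle of the curve
        return PATHWAYS["standard"]
    return None                    # outlier: the system has no solution

print(recommend_pathway(0.5))   # average patient -> standard pathway
print(recommend_pathway(0.95))  # outlier -> None
```

The design choice is the point: optimizing for the bulk of the distribution is not the same as serving every patient, and the outliers vanish from the system entirely.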

At the heart of these issues is privacy, and the question of who owns the information that is you. Under current law, any information collected about you is owned by the company that sold you a product or service, and it can be sold to others who in turn combine it with data from other sources. In essence, your identity as an individual has been stolen, and stolen because our current system never envisioned big-data and machine learning tools. In the age of big data and global conglomerates, corporate rights have exceeded the rights of individuals, and given corporations’ nearly unlimited financial resources, you have been bought and sold without your knowledge or consent.

Slavery in this context is not about owning your body, but about owning your choices and limiting those choices to ensure someone else’s profitability.

In the next evolution of this phenomenon, your daily choices will be at stake. Imagine trying to start a business, change careers, or return to college to better yourself in some capacity, only to be denied because some algorithm has already determined your chances of success or failure.

It’s time for individuals to take back what, since the dawn of civilization, has been their own: their identity and the right to choose for themselves, regardless of how profitable it is for corporations.