
I'm the Google Whistleblower - The medical data of millions of

Americans is at risk
By Google Whistleblower

TABLE of CONTENTS
INTRODUCTION
CHAPTER 1: Your Medical Data at Risk
CHAPTER 2: De-Identification and Why It Matters
CHAPTER 3: Necessary Update to HIPAA Laws (Health Insurance Portability and Accountability Act)
CHAPTER 4: Can Google Be Trusted to Handle Our Data?
CHAPTER 5: One Possible Way to Use Our Data for Good
CHAPTER 6: The Truth About How EMD Is Being Used
CHAPTER 7: Why I Spoke Out
CHAPTER 8: Is the Legislature Doing Enough?
CHAPTER 9: Bipartisan Efforts and Government Probes
CHAPTER 10: The Deal Itself Raises Privacy Concerns
CHAPTER 11: New Bills Being Proposed
CHAPTER 12: De-Identification Is a Myth
CHAPTER 13: 4 Principles of Ethics by the World Health Organization (WHO)
CHAPTER 14: Goldilocks Dilemma
CHAPTER 15: The Technology Hippocratic Oath
CHAPTER 16: Hacking Component and the Risk It Poses
CHAPTER 17 – Last and Certainly Not Least: Should Patients Have a Say in the Process?


INTRODUCTION
30-yr-old woman with Fibromyalgia denied consumer credit due to her
“lifelong disorder.”
35-yr-old man denied any employment due to “chronic depression.”
Are these real headlines?
Thankfully, no. But they could be soon. That’s why I decided to come forward, though it was not an easy decision.
Great. Something else to worry about now? That’s probably what many of you are thinking.

To a certain extent, you’re right, but this is not a book to strike fear into you. It’s
one to alert you to something which may be frightening. But first, we all have to
take a closer look at what’s going on between the medical and technology
industries.
Don’t be intimidated, though. Even if you’re not a “techy” or in the medical care industry (for anything other than your own personal healthcare), this book IS FOR YOU!
In fact, it’s for all Americans who want their medical history to remain private.
Only our own doctors and hospitals where we receive treatment should have our
medical data, right? I’m going to take a wild guess that you agree with me.
I wrote this book to advise you that such privacy is a myth. Maybe 50 years ago it was true…but not now. With technology came the risk of our healthcare records falling into the hands and computers of companies and people we never could have imagined would have access to our personal and private matters: our medical diagnoses, medications, reports to our doctors, our and our families’ addresses and phone numbers, and even our Social Security numbers.
This book will explain what’s already going on which you may not even be aware
has been happening for years. It will then inform you of the new and more
disturbing realities likely to result from the recent partnership between tech giant
Google and massive hospital network Ascension.
My goal?
To let others know what I’ve learned as a worker at the front lines where tons of
Americans’ medical data is being disseminated without privacy precautions or
any laws, HIPAA (Health Insurance Portability and Accountability Act a/k/a The
Privacy Rule) or otherwise, to protect your highly personal medical information.


CHAPTER 1: Your Medical Data at Risk

Let’s begin at the simplest, most relatable level.

We’ve all been in this type of situation… You’re at your doctor’s office, half-naked and exposed in a cold exam room, but telling yourself, “It’s okay. This is my doctor. There are laws to keep him or her from hurting me or doing anything improper with me or my medical information. No one will know what symptoms brought me here today. Just relax.” Your physician enters with a kind smile and talks to you. You tell the good doctor all the intimate and bothersome details of your health as he or she writes or types careful notes on your complaint. Your thoughts pause a moment, wondering, “Who will read that?” But then you remind yourself again that laws protect those memos from being circulated outside this office or your physician’s medical team. Then, you’re at ease again. But I submit that maybe you shouldn’t be.

I never could have imagined that one day I would be working on the front lines of
health data, only to realize that the doctor and limited office staff were NOT the
only ones with access to my private medical info, including my full name, SS
number, symptoms, medications, relatives and contact information… You get my
drift, I’m sure. It’s all being shared with Google.

Have you heard the disturbing hubbub about Google’s Business Associate
Agreement with Ascension? If not, and you care about your personal healthcare
information remaining private, then KEEP READING...

As I’ve mentioned, consumer tech giant Google and the massive hospital system Ascension have joined forces. Ascension is responsible for the care of 50 million+ patients in the U.S. I’m sure you have some idea of what Google is and does, but if not, it’s basically the biggest conveyor and collector of consumer information in the world. It creates the algorithms that dictate which consumers to target with which products, based on the personal shopping patterns and lifestyles revealed to Google through your internet search and purchase histories. Chilling, right? That’s the cold, hard truth.
So, RIDDLE ME THIS. What happens when medical patient data is combined
with consumer data?
Here are a few alarming possibilities:
-Your age, MRI results, medications, and relatives, with your name attached, can end up slipping onto the World Wide Web for anyone to hack;
-Your healthcare details may influence employment sites regarding whether
you’re really a viable candidate for a potential employer; or
-Your “chronic” depression, although treated and under control, may prevent a
good credit score because the powers that be in the credit world may include your
ability to work in their score to better help commercial lending institutions.
This is not an exhaustive list, but it probably gives you a clear understanding of the risks to all Americans now that Google and Ascension may, even if only by accident, intermingle consumer data with medical care data. Now, to be accurate, Google has explicitly stated that it will not use the healthcare data for marketing or selling in the consumer arena. Google even assured the public that “patient data cannot and will not be combined with any Google consumer data.”
Hmmm. How can it be sure?

Well, despite Google’s good intentions and hopefully secure system to achieve
just what they said, nothing’s infallible. Thus, I’ve written this book. There
should be public dismay and a demand for more information.

So many people who know me have said, “It’s good that you have come forward.
It’s making a difference,” and “You are right to speak out.” Those words and
support from others I respect and trust have lit a fire in me to spread the word
before your private information is overspread!


CHAPTER 2: De-Identification and Why It Matters

Patients are in the dark. The wide circulation of patient data has to stop! But
De-Identification, while it works in theory, doesn’t seem to work in actuality.
What many people don’t know yet is that the medical regulations which went into effect several years back have increased the amount of patient data that must be… that’s right… must be input into various databases for compliance with the regulations. However, there was no legal mandate requiring physicians to explain or notify patients of this new process. Many of you may have noticed how much more time your doctor spends typing information into a computer program than speaking with you during your appointment. Was the reasoning ever explained to you? Do you know who has access to that precious data? It’s a fact that many medical workers are putting your healthcare details in the cloud. This is where HIPAA may have failed us; therefore, it needs to be updated ASAP to address all of the gambles being taken with our personal information.

So, what the heck is de-identification? Can it help until HIPAA is fixed up?
It’s kind of like voting in America: in theory your vote is anonymous, but it still counts toward something helpful, like deciding who our elected officials are supposed to be! In theory, and I stress “in theory,” de-identification is the process by which individuals’ names and personal identification information are stripped from data that is relevant to third-party studies and objectives. That way, the data analysts may objectively utilize the data for their specific purposes.
Basically, once a person’s medical records are de-identified (names and ID info removed), only data like their gender, condition, age, medications, blood pressure readings, race, diseases, etc. are available to the third parties. A more detailed list of information to be removed appears below in the cited federal statute on de-identification. The purpose of analyzing the remaining data, in theory, is that in
doing so, it may be possible to find cures and better treatments for certain medical
problems and illnesses. That’s a great thing, of course… but only if all parties
involved are aware of the procedures, especially the American patients whose
highly personal and confidential information could be at risk of falling into the
wrong hands. That brings me to my next point…
Why De-Identification matters.
First, here is a non-exhaustive list of information already being shared by your doctors and hospitals which is supposed to be de-identified:
-Year of birth
-Race
-Gender
-First three digits of zip code
-Year admitted to and discharged from a hospital
-Health insurance and payment details
-X-ray and MRI results
-Allergies and intolerances
-Diseases
-Blood pressure readings
-Observations of your condition
-Marital status

I think you get the idea of what’s out there for others to see about you. This may
be a longshot, but I bet the thought of strangers getting a hold of these details with
your name and address attached to them makes a chill run up your spine. It does
for me!
This is why it’s so IMPERATIVE that de-identification is strictly overseen and
enforced, but right now, it’s just not being properly monitored for compliance,
and a lot of your personal medical information with your name attached is in the
hands of people who can influence your daily life. Perhaps it gets to a credit or lending institution. If a prospective lending institution sees you have a serious health condition, you may be denied a home or car loan. That would be horrible.
Especially for those of us trying to have a family and give them safe shelter,
transportation and financial security.

You may hear de-identification referred to as the “Safe Harbor Method.”


Guidance on Satisfying the Safe Harbor Method
As promised, here is the actual federal statute which provides the specific list of details our medical records must not show once de-identification is completed.

In §164.514(b), the Safe Harbor method for de-identification is defined as follows:1

(a) Standard: De-identification of protected health information. Health information that does not identify an individual and with respect to which there is no reasonable basis to believe that the information can be used to identify an individual is not individually identifiable health information.
(b) Implementation specifications: Requirements for de-identification of protected health information. A covered entity may determine that health information is not individually identifiable health information only if:
(1) A person with appropriate knowledge of and experience with generally accepted statistical and scientific principles and methods for rendering information not individually identifiable:

1 45 CFR § 164.514 - Other requirements relating to uses and disclosures of protected health information. (CFR = Code of Federal Regulations)

(i) Applying such principles and methods, determines that the risk is very
small that the information could be used, alone or in combination with other
reasonably available information, by an anticipated recipient to identify
an individual who is a subject of the information; and
(ii) Documents the methods and results of the analysis that justify such
determination; or
(2)
(i) The following identifiers of the individual or of relatives, employers, or
household members of the individual, are removed:
(A) Names;
(B) All geographic subdivisions smaller than a State, including street
address, city, county, precinct, zip code, and their equivalent geocodes,
except for the initial three digits of a zip code if, according to the current
publicly available data from the Bureau of the Census:
(1) The geographic unit formed by combining all zip codes with the
same three initial digits contains more than 20,000 people; and
(2) The initial three digits of a zip code for all such geographic units
containing 20,000 or fewer people is changed to 000.
(C) All elements of dates (except year) for dates directly related to
an individual, including birth date, admission date, discharge date, date of
death; and all ages over 89 and all elements of dates (including year)
indicative of such age, except that such ages and elements may be
aggregated into a single category of age 90 or older;
(D) Telephone numbers;
(E) Fax numbers;
(F) Electronic mail addresses;
(G) Social security numbers;
(H) Medical record numbers;
(I) Health plan beneficiary numbers;
(J) Account numbers;
(K) Certificate/license numbers;
(L) Vehicle identifiers and serial numbers, including license plate
numbers;
(M) Device identifiers and serial numbers;
(N) Web Universal Resource Locators (URLs);
(O) Internet Protocol (IP) address numbers;
(P) Biometric identifiers, including finger and voice prints;
(Q) Full face photographic images and any comparable images; and
(R) Any other unique identifying number, characteristic, or code, except
as permitted by paragraph (c) of this section; and
(ii) The covered entity does not have actual knowledge that the information
could be used alone or in combination with other information to identify an
individual who is a subject of the information.
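To make the Safe Harbor removals above concrete, here is a simplified sketch in code. The field names, the sample record, and the set of “populous” zip areas are all invented for illustration; a real implementation would have to cover all eighteen identifier categories exactly as the regulation lists them and be reviewed by a privacy expert, so treat this only as a rough picture of the mechanics.

```python
# Toy sketch of HIPAA Safe Harbor de-identification (45 CFR § 164.514(b)(2)).
# Field names and data are hypothetical; this is NOT a compliance tool.

POPULOUS_ZIP3S = {"606", "100"}  # zip3 areas with > 20,000 people (illustrative)

DIRECT_IDENTIFIERS = {
    "name", "street_address", "phone", "fax", "email", "ssn",
    "medical_record_number", "health_plan_id", "account_number",
    "license_number", "vehicle_id", "device_id", "url", "ip_address",
}

def deidentify(record: dict) -> dict:
    out = {}
    for field, value in record.items():
        if field in DIRECT_IDENTIFIERS:
            continue  # (A)-(O): drop direct identifiers outright
        if field == "zip":
            zip3 = value[:3]
            # (B): keep first 3 digits only if that zip3 area holds > 20,000 people
            out["zip3"] = zip3 if zip3 in POPULOUS_ZIP3S else "000"
        elif field in ("birth_date", "admission_date", "discharge_date"):
            out[field + "_year"] = value[:4]  # (C): keep year only
        elif field == "age":
            out["age"] = "90+" if value >= 90 else value  # (C): aggregate 90 and older
        else:
            out[field] = value  # clinical data (diagnoses, readings) is retained
    if out.get("age") == "90+":
        out.pop("birth_date_year", None)  # (C): even the birth year goes for ages 90+
    return out

record = {
    "name": "Jane Doe", "ssn": "123-45-6789", "zip": "60637",
    "birth_date": "1929-03-14", "age": 91, "diagnosis": "fibromyalgia",
}
clean = deidentify(record)
print(clean)  # name, SSN, and birth year gone; zip truncated; age bucketed to "90+"
```

Notice how much survives de-identification: the diagnosis, the zip3 area, and the age bucket all remain, which is exactly the residue that later chapters argue can be re-linked to a person.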
~~~~~~~~~~~~~~~~
We know little about Google’s past when it comes to de-identification, but there are some public records to enlighten us to some extent. For example, in November 2019 The Wall Street Journal (WSJ) investigated the court records in the class action lawsuit currently pending against University of Chicago Medical Center and the University of Chicago, among other defendants. It’s a fact that Google entered into a collaboration with the University of Chicago in which UChicago would share patient information with Google. This appears to be similar to Project Nightingale with Ascension, and it definitely raised similar concerns for the patients of UChicago Medical Center.

The patients are the plaintiffs in the class action, and they claim that UChicagoMC failed to properly de-identify sensitive patient medical data. The crux of the argument is that, while UChicago may have nominally de-identified the data, in effect it may not have. The plaintiffs’ attorney reasons and submits to the U.S. District Court that “Google’s expertise in data mining and AI makes it ‘uniquely able to determine the identity’ of the medical records shared with it by the university.” 2
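To see why the attorney’s point bites, here is a toy sketch of a “linkage attack”: re-attaching names to supposedly de-identified records by joining on quasi-identifiers (zip3 area, birth year, gender). Every name and record below is invented; this only illustrates the general technique the lawsuit describes, not anything Google has actually done.

```python
# Hypothetical linkage attack: all data below is invented for illustration.

deidentified_records = [
    {"zip3": "606", "birth_year": 1989, "gender": "F", "diagnosis": "depression"},
    {"zip3": "606", "birth_year": 1955, "gender": "M", "diagnosis": "diabetes"},
]

# A voter roll, property record, or social-media profile can leak the same
# quasi-identifiers alongside a name.
public_records = [
    {"name": "Jane Doe", "zip3": "606", "birth_year": 1989, "gender": "F"},
]

QUASI_IDS = ("zip3", "birth_year", "gender")

def reidentify(medical, public):
    """Return (name, diagnosis) pairs where the quasi-identifiers match uniquely."""
    matches = []
    for pub in public:
        key = tuple(pub[q] for q in QUASI_IDS)
        hits = [m for m in medical if tuple(m[q] for q in QUASI_IDS) == key]
        if len(hits) == 1:  # a unique match re-attaches the name to the diagnosis
            matches.append((pub["name"], hits[0]["diagnosis"]))
    return matches

print(reidentify(deidentified_records, public_records))
# [('Jane Doe', 'depression')]
```

The more auxiliary data an attacker holds, the more of those matches become unique, which is precisely why a company with Google’s breadth of consumer data raises special concern.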

The legal complaint further alleges that UChicagoMC "promised in its patient
admission forms that it would not disclose patients' records to third parties, like
Google, for commercial purposes." See article referenced in footnote 2. Clearly,
it’s arguable that medical providers may be breaching their own contractual
duties to patients when they enter into these collaborations and information
sharing agreements with Google. This is a big problem!

You know those contracts… the ones we barely read because we are sick in a hospital bed when some orderly or admin person traipses into our room with a clipboard of docs for us to sign? They rapidly rattle off what the docs mean, and we hope what they say is true as we sit there half-naked and in distress, trying to skim the written agreements and affirmations we’re signing just to get on with our medical care.

We all rely on those written promises as being there to protect our private medical matters, and we believe that some big doctor or legal association is acting as a watch-dog on our behalf. Given the sensitivity of those matters and our vulnerability in those signing moments, we must be absolutely sure that those agreements will be strictly followed by both parties to the hospital care and medical care contracts.

2 https://www.healthcareitnews.com/news/ascension-google-working-secret-patient-data-project-says-wsj

Still, we have no policies or laws in place to protect us, since Project Nightingale beat the legislature to the punch in effectuating the terms of the agreement. “Project Nightingale” is the name that Google uses to refer to the Google/Ascension contract. I suppose they wanted it to have a name that connotes goodness and trust, and it does. Florence Nightingale is a legend for her amazing nursing work, which changed the face of nursing from mere untrained caregiver work to a highly skilled and well-respected profession.

We can’t be blinded by semantics, though. Even if they’re de-identifying, what about the class action lawyer’s point that it’s not enough given Google’s tech abilities? HIPAA must become more specific and more far-reaching to cover all these bases. In response to the WSJ’s recent article, Ascension’s executive vice president of strategy and innovations, Eduardo Conrado, put on the record,

"As the healthcare environment continues to rapidly evolve, we must transform to better meet the needs and expectations of those we serve as well as our own caregivers and healthcare providers…"

I agree that the healthcare system must transform to meet new expectations, but I also think the best way to accomplish this is to require private companies to integrate patient protection, consent, and transparency programs into their digital platforms and applications to keep Americans’ healthcare records secure and private on a day-to-day basis. It also means bringing the public and patients into the process, along with academia and experts on all sides. Just asking personnel to keep patient information confidential is not enough. Not everyone can be trusted.

Adding to Conrado’s point, Tariq Shaukat, president of Google Cloud, added,

"By working in partnership with leading healthcare systems like Ascension, we hope to transform the delivery of healthcare through the power of the cloud, data analytics, machine learning, and modern productivity tools – ultimately improving outcomes, reducing costs, and saving lives."

Of course, that’s a wonderful objective, and I’d take a wild guess that you agree better healthcare at lower cost is a worthy goal, but we have to be sure the cost is not more than the possible benefit. Right now, that question is completely up in the air.
For the full article, which quotes the WSJ piece, see:
https://www.healthcareitnews.com/news/ascension-google-working-secret-patient-data-project-says-wsj

Assuming full enforcement of the de-identification statute, and that’s a lot to swallow, is it providing any real practical help for patients right now? No. And that’s simply because the statute does not touch on other activities going on with our sensitive medical data, like Emergent Medical Data (“EMD”), which I discuss in a later chapter.

Moreover, no law has been enacted and enforced to force Google and Ascension to let in a watch-dog to check whether the medical information being passed from Ascension to Google is de-identified according to the statute above. So, millions of Americans’ supposedly confidential healthcare histories and other private information may already be floating around Google’s platforms, being used and seen by unauthorized people and entities.


CHAPTER 3: Necessary Update to HIPAA Laws (Health Insurance
Portability and Accountability Act)
Before jumping into why HIPAA should probably be updated ASAP, let me tell you
more about the Act itself, in case you don’t know much about it. This is from
Web.MD.com (link afterwards):
HIPAA (pronounced HIP-uh) stands for the Health Insurance Portability and
Accountability Act, and is the law that protects your privacy as a patient.
Under the law, health care plans and health care providers must limit who
can see your health records. HIPAA also gives you the right to get a copy of
your health records from your doctor.
Now, get this. The last update to HIPAA was in 2013. Yikes! Its application and rules are clearly not keeping up with the technological changes in our healthcare system and can’t possibly apply to the risky business going on in Project Nightingale. The legislature must catch up to technology.
In summary, HIPAA (as of 2013) requires that employers comply with its rules and regulations when it comes to setting up health insurance plans for their employees. If a company operates as a medical clinic of any sort, it must follow HIPAA just like any large hospital or other substantial medical provider. So, fine, it covers the big guys and the little guys. Right now, however, the biggest guys are able to fly right over HIPAA, essentially unencumbered, because their advanced data-keeping and data-mining systems are not regulated by HIPAA; such systems did not exist in 2013 when the regulations were made.
With Project Nightingale free as a bird, and with life insurance companies, Social Security, and state welfare benefits agencies not being required to follow HIPAA rules (that’s true), think of the kind of manipulation Google, insurance companies, and public assistance offices can achieve to bulk up their bottom lines at the unfair expense of Americans.
So, while some of HIPAA is quite comforting, other parts, like the ones noticeably missing, are a bit unsettling. I posit that HIPAA laws should be improved and updated in this new age of communication and information-sharing technology, which was not contemplated when the current effective version of HIPAA was enacted.
This is not just my position. Others feel this way, too. For example, I’ve spoken with policy experts, including Dr. Adrian Gropper, MD, of Harvard Medical School; Ross Anderson, PhD, Professor of Security Engineering at Cambridge University; and Dr. Deborah Peel, founder of Patient Privacy Rights, to name a few, who confirmed their concerns about data sharing in healthcare and how technology companies have a long way to go if they want to properly handle patient data.
Additionally, both political candidates and currently serving members of Congress are concerned about our technology being light years ahead of the legislation appropriate and necessary for it. To quote United States Senator Amy Klobuchar, for instance, “Our laws were not anticipating the advent of smartwatches and 3rd party apps which can collect health data.”3
Finally, journalists and cybersecurity specialists alike have told me that the
current HIPAA law definitely does NOT protect our supposed-to-be-confidential
healthcare information and personal medical records.
You see, healthcare data is far different than banking data or retail/consumer data.
It’s always had a different type of treatment than the latter and couldn’t be
handled in the same manner. It used to be that only individual hospitals carried

3 [Source of Amy’s quote here_____________________]



your data if you were treated there. Now, even large multinationals and tech monopolies are entering the space.
Times are changing, and with that comes the requirement to update HIPAA laws, as they are not just tools themselves but also mandates and rules for medical workers in the industry. It follows that these rules and mandates must reach Google and Ascension, which currently they don’t. Why? Because no one knows the true extent of the data sharing or the procedures they plan to use to separate it. There needs to be TOTAL TRANSPARENCY, at least to the legislature and preferably to the public at large, before the necessary and effective changes to HIPAA can be effectuated.
Patients are already in the dark.
The wide circulation of patient data has to stop!
What another medical data managing company has to say...
Ciitizen is a U.S. company which helps you collect, summarize, and share your
medical records digitally, free of charge. It can be used to get a second opinion,
coordinate with caregivers, or donate to research. 4
Recently, the Chief Regulatory Officer of Ciitizen posted an article on the company’s website about patient privacy issues. Here are some excerpts. I believe that Ms. McGraw wrote this to inform patients of the broad HIPAA rights many of us didn’t even know we had, especially given the red tape it feels like we’re trapped in every time we try to get copies of our own medical records.

“A letter from our Chief Regulatory Officer, Deven McGraw”5 (excerpts)

My name is Deven McGraw, and I’m the former Deputy Director for Health
Information Privacy at the Office for Civil Rights of the U.S. Department of
Health and Human Services. In my role, I was responsible for enforcing
HIPAA and issuing guidance on how to comply with its rules.

I spent two years with the U.S. government working on behalf of patient
rights regarding personal health data, and now I’m the Chief Regulatory
Officer at Ciitizen to further that mission. I’m here to make sure you know
your rights regarding your health information:

It’s yours. You have the right to all of it.”

4 https://www.ciitizen.com/
5 https://www.ciitizen.com/privacy/

Sounds wonderful, right? And it is true. The hidden problem is (and I’m not saying that McGraw hid anything from us) that the means of obtaining our personal health records and data is not easy, and may even be blocked sometimes, by… guess what? TECHNOLOGY.

You see, hospitals and medical providers don’t all use the same medical data storing and transferring companies. Some use Ascension and some don’t. So what happens is that patients ask for, say, their lab test results, but the lab stores the data with Ascension. What then? You have to download an app in order to receive your lab results. If you don’t have or want that app, you are forced to get it, or else you have to have your records snail-mailed.

Ms. McGraw’s letter closes with,

Many institutions and medical practices have not paid attention to the
HIPAA right of access and therefore have not established practices
allowing for people to easily exercise it…

Deven McGraw, JD, MPH, LLM

~~~~~~~~~~~~

More evidence that hospitals and private medical practices are not even paying
attention to HIPAA. That begs the question, where is the Act failing? I submit that
it’s failing because no third-party patient watch-dog is monitoring the providers
and/or the penalties for non-compliance are not significant enough to be
deterrents or are simply not being enforced. These two specific points in the
process of applying HIPAA, I propose, must be thoroughly re-examined and
investigated.

Back to the rights McGraw shared… Did any of those rights come as a surprise to you? If not, that’s great! If yes, you are not the only one. They do to a lot of people. For the full official Privacy Policy of Ciitizen, see this link: https://www.ciitizen.com/privacy/. The policy is just below Ms. McGraw’s letter on their site.

My point in outlining these rights, as laid out so clearly by Ms. McGraw, is to show people that Americans have many specific rights when it comes to their healthcare records and information, but as of right now, we have no assurance that Google and Ascension are strictly, or even substantially, following all of them. Ms. McGraw’s note mostly referred to patients’ rights to their own records, but within the Ciitizen Privacy Policy, you’ll find details which speak directly to information sharing.

While we, as private citizens, may not necessarily be able to check that the policies are being followed, our legislature can. It simply needs to enact laws which permit privacy system checks. Objective third-party analysts must be allowed to investigate whether the stated privacy precautions are actually being properly executed by all employees and other people or entities with access to patient information. For example, Ciitizen’s privacy policy includes the following, and it’s fantastic, but it means nothing if no one is assuring compliance.

“Information security

Ciitizen employs technical, administrative and physical safeguards to help protect against unauthorized access to, use or disclosure of customer information we collect or store. We store your information on encrypted, cloud servers located in the U.S. and that we have selected based on a review of their security safeguards. Our employees are trained on the importance of protecting privacy and on the proper access to, use and disclosure of customer information…. Under our practices and policies, access to sensitive personally identifiable information is authorized only for those who have a business need for such access.” Emphasis added.

The next paragraph of the policy is somewhat disconcerting, and I’ll discuss that in more detail further down. Here is what it says…

“Although we work hard to protect personal information that we collect and store, no program is 100% secure and we cannot guarantee that our safeguards will prevent every unauthorized attempt to access, use or disclose personal information.” Emphasis added. 6

Hmmm. How worried should we be about that second paragraph? Is that a loophole? If any (or all!) of their patients become victims of exploitation resulting from Ciitizen’s failure to prevent leaks, are they completely insulated from responsibility to those patients? I submit that they might be. And that’s not only because of the loose language, but also because you won’t even find words like “encrypted” in HIPAA, nor does HIPAA cover “de-identified” personal information, which, as I will explain, can be just as dangerous as identifiable data.

6 https://www.ciitizen.com/privacy/


CHAPTER 4: Can Google Be Trusted to Handle Our Data?

Should we trust Google to handle our sensitive data?

Great question, and to be honest, I don’t know. Companies can say whatever they want, but in the end, the contract, Business Associate Agreement (BAA), or letter of intent dictates what they are doing behind the scenes. Unless this is made public and transparent, we cannot possibly know for sure. Therefore, it must be made public. If it’s not, then our constitutional right to privacy is being undermined. We mustn’t stand for that.
Plus, we need third parties, or even computer programs integrated into Google’s and Ascension’s systems, to help verify proper handling in real time, because we have to ensure that these rules are spelled out for years to come. What if at first it’s handled properly, but ten years from now the BAA changes to say that people may use patient names, addresses, relatives, and Social Security numbers to advertise; share them with any third parties without de-identification; or share algorithms with banks, insurance companies, and employers, which could be used for algorithmic bias and discrimination? In that case, “we’re f’d.”

Remember when Target’s credit card systems were hacked? They never intended
to face such a consumer catastrophe. Same goes for Google and Ascension. I’m
not saying that these companies won’t try to protect and separate the information.
I’m urging that because personal medical records are so incredibly private there
should be more extreme methods to prevent their exposure to the wrong people.
Google’s not exactly in a position to ask us to simply take its word for it.
Google has run similar campaigns in the recent past, failing to follow
proper procedures on at least four occasions. Plus, Google has a tremendous amount to
gain from this alliance with the $3 trillion healthcare industry. For example, it can
apply its findings from Ascension’s database to other businesses to bolster
profiles and share them in ways that would be commercially viable.
In a recent article for ZDNet,7 Liam Tung wrote that Mr. Pichai stated,
“Google Cloud's AI and ML solutions are helping healthcare organizations like
Ascension improve the healthcare experience and outcomes,” and Google further
contends that the partnership with Ascension is compliant with HIPAA rules.

In juxtaposition, consider this fact…

“[Google] didn’t disclose the leak for months to avoid a public relations
headache and potential regulatory enforcement.”8 Yes. In October 2018 The
Guardian published an article with that headline. The article explained how
Google realized it had a leak similar to Facebook’s. You may remember the
Zuckerberg hearings. Google did not want its CEO, Mr. Pichai under the same
scrutiny, so it chose to conceal the leak.

The gravity of Google’s leak? Well, Google eventually posted on its own blog
that “Google disclosed the data leak, which it said potentially affected up to
500,000 accounts. Up to 438 different third-party applications may have
had access to private information due to the bug…”

On top of that, there are no federal laws requiring Google to disclose data leaks.
While there are some at the state level, even in California where Google is
based, how likely do you think it is that California courts will keep tabs on Google
leaks or any other mishaps which could cost Google money or reputation? I mean,
Google is great for the California economy, after all.

7 https://www.zdnet.com/article/googles-plan-to-collect-health-data-on-millions-of-americans-faces-federal-inquiry/
8 https://www.theguardian.com/technology/2018/oct/08/google-plus-security-breach-wall-street-journal
You can decide for yourself if you are comfortable with trusting Google at its
word.


CHAPTER 5: One Possible Way to Use Our Data for Good
Is there a right way forward? Can they possibly use our data for the good of
humanity?
Sure. But it won’t be easy!
When I say “good” of humanity, I am talking primarily about new discoveries in
the medical field for improving our health—cures for formerly incurable diseases
and disorders. And, even if not full-blown cures, medical treatments to prolong
life or reduce pain and suffering from chronic and currently incurable conditions.
As mentioned in Chapter 2 (statement by the President of Google Cloud), other
benefits may include creation of “modern productivity tools”—ultimately
improving patient outcomes, reducing costs and saving lives. Those would be
fantastic results, but not if they cost Americans much more in terms of their private
data being used against them in other important arenas of their lives, like jobs,
shopping for basic needs, or obtaining home or car loans.
While some politicians and lawmakers have proposed plans to protect our private
medical information as well as to allow Google to safely use our personal data for
a proper and helpful purpose, the proposals just don’t go far enough to protect
patients.
After all of my extensive research and personal knowledge of the matters
discussed herein, I submit the following 13-Point Plan for both politicians and
companies. See what you think…
13-POINT PLAN for Proper Use of Data Sharing Between Google and
Ascension (or other medical data protection/sharing companies)
1. When it comes to private healthcare data, Google and Ascension must share any BAA,
letter of intent, or contract with the public. It is not enough for Google to just say, “We
are HIPAA-compliant.” They must publish the terms of such agreements so they can be
analyzed by experts and the public.

2. Google and Ascension must employ a 3rd party unbiased agency to monitor and
check that real-time policies and procedures are HIPAA-compliant.

3. Google and Ascension must present patients with opt-in or opt-out clauses, or a
tiered level of opting in and out.

4. Google must present a clear definition of what it plans to do with data.

5. Should Google be found to violate HIPAA, it must be responsible to pay a fine for each
incident.

6. Google must provide a breakdown of future data usage, with a clear timeline of terms
backed by actual numbers and facts.

7. In addition, we must ask ourselves: is it fair for one company to control so much, from
search to videos, email and healthcare?

8. Further, each company must answer in detail each of the 17 points in Elizabeth
Warren’s request to Google and Ascension. Any other future company seeking to do
what Google does must answer the same 17 questions.
(https://www.blumenthal.senate.gov/imo/media/doc/11.20.19%20-%20Google%20-%20Project%20Nightingale.pdf)

9. We must have more legislation to update the HIPAA laws for the present age. This
must be a bipartisan issue; old or young, rich or poor, everyone cares about
healthcare data privacy. And yes, both Democrats and Republicans care equally.

10. We must address EMD (Emergent Medical Data) and ensure that a BAA cannot
change 10 or 20 years from now. It’s possible this data could be used to create profiles,
and those profiles sold to banks, insurance companies or employers, who could base
credit, employment and insurance decisions on your health information.

11. Google and Ascension must detail exactly which 3rd parties they will share data
with; which research institutions they will share data with; what use cases they can
share and analyze data for; which use cases require names attached to the PHI and
which do not; how they plan to use EMD for 3rd-party sharing, analytics or
research; and whether they can re-identify anything de-identified using their algorithms
and tools.

12. Google and Ascension must allow their algorithms and source code around HIPAA
compliance, analytics and 3rd-party sharing to be analyzed by trusted watchdogs.
Without inspecting the algorithms, you cannot guarantee compliance and protection.
13. We need laws that prevent sharing with insurance providers, employers or banks
where the data would be used to deny insurance, a job or credit. Legal language must be
created that prevents algorithms from being used for algorithmic bias.

(This list is by no means all-inclusive; we can edit, update and debate these points.)

Amy Klobuchar’s plan and Bill Cassidy’s plan fall short because they don’t cover
numbers 1, 6, 7, 10, 11, 12 and 13 of my plan. Without those matters included, the
risks to patients are great. For example, more people could be denied credit,
insurance or jobs if nothing is done to mitigate the above. Klobuchar’s plan and
Cassidy’s plan don’t go far enough, but they are a step in the right direction.


CHAPTER 6: The Truth About How EMD is Being Used


EMD is Emergent Medical Data, which is basically information that seems
harmless and vague, even non-medical in nature, but that can be combined to
infer medical facts about the person it describes.
It’s imperative that those not authorized by patients to have knowledge of or access
to their private medical records and health status are not allowed to obtain EMD
to determine a patient’s name, address, SSN and/or other details. WHY is this
imperative? Because even profiles that do not identify a patient by name, but
instead serve as tools for algorithms to detect typing patterns, keyword searches,
purchasing behavior and social media comments, give unauthorized individuals
or companies a rather simple way to infer that Person X has Disease Y. That
kind of finding could be used by insurers, banks or employers to deny
consumer services!
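To make the Person X / Disease Y inference concrete, here is a toy sketch of keyword-based scoring. Every keyword, weight and threshold below is invented purely for illustration; this is not how Google, Ascension or any real system works, only a minimal picture of the kind of inference EMD enables.

```python
# Illustrative sketch only: how seemingly non-medical signals (searches,
# posts, purchases) could be combined to *infer* a health condition.
# All keywords, weights and the threshold are invented for illustration;
# real EMD mining would use far more sophisticated statistical models.

# Hypothetical signal weights linking innocuous activity to a condition.
DIABETES_SIGNALS = {
    "glucose monitor": 0.5,      # shopping history
    "always thirsty": 0.3,       # search query
    "sugar free recipes": 0.2,   # social media post
}

def infer_condition_score(activity_log):
    """Sum the weights of any known signals found in a user's activity."""
    score = 0.0
    for entry in activity_log:
        for signal, weight in DIABETES_SIGNALS.items():
            if signal in entry.lower():
                score += weight
    return score

log = [
    "Searched: why am I always thirsty at night",
    "Purchased: glucose monitor test strips",
    "Posted: trying some sugar free recipes this week",
]
score = infer_condition_score(log)
# A profiler could then flag this person once the score crosses a threshold.
likely_diabetic = score >= 0.7
```

Notice that not one entry in the log is a medical record, yet together they paint a medical picture; that gap is exactly what current law fails to cover.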
In an article from September 2019 9, health law expert Mason Marks, MD,
explained why EMD is something that “must” be protected from this exact
practice which he calls predictive mining. The article states…

“Recent studies have shown that seemingly benign, non-medical personal
information can in fact be extrapolated into medical data, and therefore,
according to health law expert Mason Marks, MD, it should be under the
same protections as more explicit health data.

In an op-ed for STAT, Dr. Marks cited one study in which statements
posted on Facebook — especially those using religious language or
expletives — demonstrated significant links to conditions such as
diabetes, psychosis and depression when researchers applied an artificial
intelligence algorithm to the post. Dr. Marks labeled this information, which
seems innocuous but can be unexpectedly translated into health
information, ‘emergent medical data.’”

In more basic terms, EMD is being mined, and has been for some time, to identify
undiagnosed conditions like dementia, influenza and others which could affect
one’s ability to obtain health insurance, long-term care insurance, etc.

This kind of deductive reasoning used by high-tech companies, whether for
benign purposes or not, can be infiltrated and hacked for malicious or
discriminatory purposes as well. That’s a real problem! The fear is that instead of
healing people, companies like Ascension can use our unprotected sensitive data
to exploit Americans, which is contrary to what America stands for. So, it
logically follows that there should be laws to protect EMD which are just as strict
as those protecting obvious health records.

9 https://www.beckershospitalreview.com/healthcare-information-technology/physician-viewpoint-emergent-medical-data-must-be-protected-from-predictive-mining.html

Let’s talk turkey! This is frightening… From an article in The Atlantic in 2018,
we learn how our at-home consumer electronic devices contribute to big tech’s
collection of EMD.10 Beware of:

-Alexa
-Google Assistant
-Siri
-Echo devices

“These ‘smart’ speakers are yet another way for companies to keep
tabs on our searches and purchases. Their microphones listen even
when you’re not interacting with them, because they have to be
able to hear their ‘wake word,’ the command that snaps them to
attention and puts them at your service.” (Atlantic article)

Creepy, right? Makes one wonder, “Who’s listening? Is my intimate chat
with my husband being recorded? Are there transcripts of my family’s private
conversations available to buyers who will use our private information to hurt
or help us in the consumer market?”

Those questions MUST be answered!

10 https://www.theatlantic.com/magazine/archive/2018/11/alexa-how-will-you-change-us/570844/
CHAPTER 7: Why I Spoke Out

What’s happening poses a disturbing, significant risk to all of us. But, like most
people, I need to hold onto my job, and I knew that if I mentioned anything or became
a whistleblower, there was a good chance I’d lose it. I thought to myself,
“Should I speak out? Will it actually help?” I then wondered whether breaking my
silence could cause more problems than it solves. Certainly, as I wrote for the
Guardian, two concerns weighed on me… Do patients know about this? And
should or can they opt in or opt out? But beyond this, I struggled with whether it
might be hunky-dory if the public doesn’t know. I don’t want to be an alarmist for
nothing, after all! “Maybe it’s not as bad as I’m thinking,” I contemplated. It’s not
as if no benefits will come from this.

After letting all the pros and cons simmer in my mind, I decided there’s too much
at risk for ALL AMERICANS… our parents, our spouses, our children and
ourselves. I convinced myself that the outrageous but possible outcomes of
what’s happening are just too serious to hide. But then I thought, “If I don’t speak
out, someone else will, right? So, my conscience is clear. I don’t need to be the
hero.”

You see, in the tech world, there is an incentive not to speak out, and that is to
keep your job! Why rock the boat or cause a fuss when I have a good job and am
getting paid well? I don’t need to mess up my life for something which may not
happen or which others will reveal, anyway. Still, this decision gnawed at me to
the core.

So, what am I so afraid of happening to all of us?

Well, in my mind, HIPAA violations are occurring, and I cannot look patients in
the eye knowing what they don’t know. Patients trust medical workers and those
maintaining their medical records to keep their personal files private. I’m seeing
that there’s a huge contrast between what patients think is going on with
their private healthcare information and what’s actually happening with
it behind closed doors. Knowing what patients didn’t know didn’t feel right.
Every time I was confronted with the matter, I told myself, “I shouldn’t come
forward,” then I would rebut, “It’s far too important not to. I can’t just stay
silent.” It was weighing on me, and I felt like there was no other way. What I
see…

-HIPAA violations;

-People who shouldn’t be, are accessing our private medical files;

-No HIPAA tools to govern the new technological state of information flow in and
out of medical offices.
Like I said in the first chapter, the consequences of just letting Google and
Ascension do what they please, unchecked, could be debilitating to middle- and
lower-class families. What if a car insurance company, which offers a family the
only premium they can actually afford, decides to deny them the policy simply
because the company became privy to the fact that the dad was once treated for
depression when he was younger? Now he has a family and a child on the way.
But the insurance company doesn’t want to take chances on someone who was out of
work for six months due to depression: “DENIED.” Now the father can’t drive his
kids anywhere, because almost all 50 states require liability insurance as a
condition of driving a motor vehicle. At a minimum, that’s a bothersome scenario.
But it may be where we’re headed.

After the news of Project Nightingale broke, Google and Ascension posted PR
responses which neither elaborate on the contents of the deal nor provide
transparency. Instead of addressing the problems with the partnership or the concerns
of patients, they resort to simply using their marketing teams.
CHAPTER 8: Is the Legislature Doing Enough?
My short answer… Doubtful.
Let’s first consider some other well-known technology and its privacy concerns.
If the legislature hasn’t been able to keep up enacting laws related to personal
smart devices, how can we expect them to keep up with two tech giants having
control over our private medical records? If you have Siri activated on your smart
phone or Alexa or Google Assistant on in your home, someone or something is
ALWAYS listening to you and those around you. They have to because the
devices need to recognize their programmed commands when you say them.
Are your private goings-on being listened to by actual people, or worse, being
recorded? If so, it’s not difficult to find you, as you probably have your home
address and email registered with the smart device. Are there laws protecting us
from people keeping or using the information we intend and believe to be private?
Smart devices constantly collect information about us which is found only in the
privacy of our own homes, cars or other places we have private conversations and
engage in private activities. Do you want the world to know what church you
attend, who your closest friends are, how you really feel about your spouse?
Maybe not, but if you use smart devices, you can’t count on secrecy.
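For a rough picture of why the microphone can never truly be “off,” here is a simplified, hypothetical sketch of a wake-word loop. The wake word and the on-device/cloud split below are illustrative assumptions, not any vendor’s actual implementation.

```python
# Simplified, hypothetical sketch of a wake-word loop: the microphone is
# ALWAYS being sampled locally, and audio is only sent to the cloud after
# the wake word is detected. Illustrates the general design only.

WAKE_WORD = "hey assistant"   # invented wake word for illustration

def wake_word_loop(transcribed_chunks):
    """Scan a stream of locally heard audio chunks; return only the
    chunks that would be uploaded (the request after the wake word)."""
    uploaded = []
    awake = False
    for chunk in transcribed_chunks:
        # The device must inspect EVERY chunk to find the wake word,
        # which is why the microphone can never truly be "off."
        if awake:
            uploaded.append(chunk)
            awake = False          # return to passive listening
        elif WAKE_WORD in chunk.lower():
            awake = True
    return uploaded

stream = [
    "private chat with my husband",    # heard, but stays on-device
    "hey assistant",                   # wake word detected
    "what's the weather tomorrow",     # this chunk is uploaded
    "more private conversation",       # stays on-device again
]
sent_to_cloud = wake_word_loop(stream)
```

The privacy question, of course, is whether the “stays on-device” promise is actually honored, which is exactly what the investigations below called into doubt.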
Recently, and in response to public concern about this very issue, Apple released
a statement. It is mentioned in the August 29, 2019 Consumer Reports article by
Thomas Germain.11

He provides, in relevant part,

“Update: On August 28, Apple announced it will stop retaining recordings of
users' interactions with Siri by default, but will continue to store
computer-generated transcripts of those interactions. Consumers will be able to opt in to
allow the company to keep audio samples to help improve Siri's capabilities, but
those recordings will be analyzed by employees rather than contractors, as in the
past. The company says it will delete recordings that were triggered
inadvertently. Apple said that change will take effect in the fall.

Apple, responding to privacy concerns, has temporarily stopped having humans
monitor recordings of consumer interactions with Siri in order to improve the
digital assistant's performance. The company also plans to introduce new tools to
give users more information and control over how Apple handles the data.

The change follows an investigation from The Guardian last week about the
intimate details of consumers' lives that are often exposed to the Apple
contractors who review Siri recordings, including ‘confidential medical
information, drug deals, and recordings of couples having sex.’”

11 https://www.consumerreports.org/privacy/apple-suspends-listening-to-recordings-of-siri-users/

So, that happened! Imagine how much private information is already in the hands
of who-knows-who?! Apple’s update? Too little, too late, is what I say. There
need to be laws requiring companies developing products with potential
privacy issues to submit their plans to government watchdogs
before putting such products on the market, so that proper precautions can be
provided to consumers before it’s too late to retract sensitive personal information
from who-knows-where. This smart device issue is a prime example.

Actual laws regulating smart devices have been passed, but so recently that
damage is likely already done or soon to be done. The simple time lapse between the
consumer release of smart assistants and the new laws is three and a half years for
Google Assistant and as long as eight years for Siri. In a recent article in Digital
Trends,12 AJ Dellinger states,

“…Lawmakers are looking at new ways to help protect consumers—and
ensure their data isn’t being put at risk by the companies that hold it.

At the federal level, there have been a number of attempts to add
regulations that would protect owners of…devices. The Cybersecurity
Improvement Act of 2019, introduced by Senator Mark Warner of Virginia,
would create new requirements for internet-connected devices.

A number of states have gone further than the federal law. In Oregon a bill
was passed that will require each smart device sold in the state to come
with a unique password. The requirement is one of the easiest ways to
mitigate brute force attacks in which hackers are able to crack the
protection on devices…

…As the Internet of Things [smart watches, assistants, etc.] is integrated
into our most private areas, consumers should have the assurance that
these devices are secure and fend off unwanted intrusion.”

12 https://www.digitaltrends.com/home/smart-home-security-breach-protection-oregon-california/

I don’t know about you, but none of the above gives me much peace of mind.
Yes, it touches upon hacking, but it doesn’t cover all those who work for Apple,
Google or other tech giants putting smart devices on the market. What are their
rules, and what are the penalties for non-compliance? Remember, laws don’t have much
teeth if there’s no deterring penalty looming.

Onto Google-Ascension… same problem.


Their deal was struck and information shared without any protections in place to
secure Americans’ medical histories. So, now, we have our healthcare and family
information being passed back and forth between two giant companies. That
means there are tons of opportunities for our sensitive information to be
hacked, copied, sold, etc.
The more information is transferred around, the more chance there is of it being
vulnerable and used against us. It’s just common sense. As I will discuss later,
the legislature is only in the early stages of creating laws to protect us from
the possible fallout of our private medical information going to the highest
bidder, or to anyone else who will use it to harm us while benefiting
themselves.

I think most of the issues we are discussing originate at the executive and
management level. It comes down to transparency: allowing algorithms
and contractual arrangements to be made public and analyzed by everyone. It
really does start at the top.
CHAPTER 9: Bipartisan Efforts and Government Probes

Is there anyone in Washington doing anything about the risk at hand?


It seems so, but it still may be insufficient. Here’s the gist of what’s already been
happening. The following is directly from Senator Elizabeth Warren’s Senate
website and is dated November 20, 2019…
Bipartisan Inquiry from Senators Warren, Blumenthal and Cassidy
Asks Google to Explain How and Why It Secretly Collected Health
Data on Millions of Americans Without Their Consent.
Washington, DC – Following alarming reports of Google’s efforts to
obtain the health records of millions of Americans without their awareness
or consent, United States Senators Elizabeth Warren (D-Mass.), Richard
Blumenthal (D-Conn.), and Bill Cassidy (R-La.) sent a bipartisan letter to
Google demanding answers to the serious questions and concerns raised
by “Project Nightingale.”

The letter opens with the Senators expressing their concern
over reports that Ascension has entered into a partnership which
provides Google with the medical data and records of tens of millions of
Americans without their awareness or consent. It goes on to drive home
the fact that such sensitive information, if mishandled, could lead to
discrimination, exploitation or other harms to Americans. It
specifically focuses the inquiry on Project Nightingale.

Blumenthal, Warren, and Cassidy demand that Google provide
substantive responses to how such a vast amount of private health data
was collected, and how Google plans to use and secure it.13

This is definitely a step in the right direction. We just have to hope
that not too much damage was done before the letter went out.

13 https://www.warren.senate.gov/oversight/letters/bipartisan-inquiry-from-senators-warren-blumenthal-and-cassidy-asks-google-to-explain-how-and-why-it-secretly-collected-health-data-on-millions-of-americans-without-their-consent
The letter further provides a specific list of questions to be answered
by Google. Please see the referenced footnote for the full list, but
some of the most encouraging inquiries are (and I quote):

1. Please list all health systems, providers, insurers, or any other
entity for which Google provides services related to electronic
medical records.

2. Are Ascension patients provided notice of Google’s retention and
use of electronic medical records?

3. Will Ascension patients be provided the ability to opt out of the use
of their health information for what is medically or operationally
necessary to provide patient care? Has Google affirmatively sought
permission from patients for any use of this data?

4. Did Google’s agreement with Ascension allow Google to perform
research or analysis of patient data outside the direct scope of what
was medically or operationally necessary to provide patient care?
Would genetic information be included? Please list all planned or
considered research or analysis.

5. Is Google using or intending to use this data for targeting
individuals with advertisements? Is Google using or intending to
use this data to identify services that would be targeted at specific
individuals?

6. What procedures are in place that govern Google’s use of health
information from Ascension for research or analysis? Who is
responsible for approving such research?

7. Is Google permitted to use information (aside from patient records)
derived from Project Nightingale, such as machine learning models
built from patient data, for contracts with other health care
providers and for other business purposes?

8. Are all products and services, including the versions used in Project
Nightingale, compliant with HIPAA?

9. Do Google employees have direct access to the electronic medical
records from Ascension? How many Google employees and which
divisions of Google have access to patient data? Under what
conditions can Google employees access Ascension data? Could a
Google employee theoretically see the patient data of an
acquaintance?

10. When did Google begin obtaining personal health information from
Ascension?

11. What are the terms and conditions of the contract between Google
and Ascension? Specifically,

a. Is Google paying Ascension for this data or any services related to
this data, and if so how much?

b. What specific uses of the data by Google are allowed under the
contract?

c. Could Google combine Ascension data with individual search and
location data to create and leverage bolstered individual profiles?

d. Does the contract prevent or restrict Ascension from disclosing the
data sharing agreement, or providing patients with information
indicating that their health information will be shared?

12. What is the full and complete list of patient-level information that
Google is receiving from Ascension?

13. How many individuals’ health records has Google received under
“Project Nightingale”?

14. How is Google protecting the information it is receiving from
Ascension? Is the information encrypted? Is the data stripped of
any information that could be used to identify patients, either
independently or with any additional information that Google may
have already collected through its other services?

15. Has there been any breach or attempted breach that would present
a risk of any outside party obtaining access to personal health
information?

16. Has Google shared any personal or aggregated health information
obtained via the Ascension agreement with any other third party? If
so, please list and describe all such instances of sharing data.

17. Has Google informed any federal or state regulators of its
agreement with Ascension and any potential uses for the health
information it is collecting?

The letter was addressed to Mr. Sundar Pichai, CEO of Google LLC and Mr.
Tariq Shaukat, President of Industry Products and Solutions, Google Cloud, of
Google LLC.

Now, we wait! But there are other inquiries and Google has provided some public
responses already. For example, The Department of Health and Human Services
launched its own inquiry into the Google and Ascension Agreement. According to
one article on this inquiry, 14 “The US Department of Health and Human Services
(“DHHS”) has launched an inquiry into Google's partnership with giant US
Catholic healthcare non-profit Ascension. The healthcare deal is a major win for
Google's cloud business, Google Cloud, but it has immediately raised concerns
over the level of access Google will have to patient data.”

The DHHS is a United States executive department established in 1979. The
department was formed for "protecting the health of all Americans and providing
essential human services, especially for those who are least able to help
themselves."15 The DHHS decided to open an investigation in order to determine
whether Google’s mass collection of Americans’ private medical records is fully
compliant with HIPAA and has been since Google obtained the highly personal
data.
ANOTHER inquiry is ongoing, by the Committee on Energy and Commerce.

So, the Committee on Energy and Commerce (“E&C”) actually oversees DHHS,
but it took its own steps to investigate the data-sharing details of Google’s
agreement with Ascension. E&C is the oldest continuous standing committee in
the U.S. House of Representatives. It was originally established in 1795 to
regulate interstate and foreign commerce.16

Today, the Committee has the broadest jurisdiction of any authorizing committee
in Congress. It legislates on a wide variety of issues, including (these are not
exhaustive lists):

• health care, including mental health and substance abuse
• health insurance, including Medicare and Medicaid
• biomedical research and development
• electronic communications and the internet
• privacy, cybersecurity and data security
• consumer protection and product safety

14 https://www.zdnet.com/article/googles-plan-to-collect-health-data-on-millions-of-americans-faces-federal-inquiry/
15 https://ballotpedia.org/U.S._Department_of_Health_and_Human_Services
16 https://energycommerce.house.gov/about-ec

The Committee also oversees several federal departments and agencies, including:

• Department of Health and Human Services
• National Institutes of Health
• Centers for Disease Control and Prevention
• Federal Communications Commission
• Federal Trade Commission

In the current 116th Congress, the Committee includes 55 members – 24
Republicans and 31 Democrats.
Clearly, there’s a bipartisan effort coming from E&C, as should be the case. This
is not a political issue. It’s an American privacy issue. E&C put out a press
release in November 2019.17 Below is a summary of key points, but you can read
the entire release, including the letters sent by E&C to both Google and Ascension,
at the link cited in footnote 17.
The gist of the release is that E&C wants answers on the big companies’
health data sharing arrangement. The committee’s action was spurred by reports
that Ascension patients’ (tens of millions of them) health information has been
shared with Google! Language included in the demand letters is as follows:
“While we appreciate your efforts to provide the public with further
information about Project Nightingale, this initiative raises serious privacy
concerns,” the Committee leaders wrote to leaders of Ascension and
Google. “For example, longstanding questions related to Google’s
commitment to protecting the privacy of its own users’ data raise serious
concerns about whether Google can be a good steward of patients’
personal health information. Additionally, despite the sensitivity of the
information collected through Project Nightingale, reports indicate that
employees across Google, including at its parent company Alphabet, have
access to, and the ability to download, the personal health information of
Ascension’s patients….”
The members are requesting briefings from the companies on Project Nightingale,
including what data Ascension is sharing with Google, how it’s doing so, and the
extent to which employees of both companies and Alphabet have access to the
information. This is exactly the type of inquiry we need, so at this point, following
up is what remains to be done.
Still, this information should have been required before the finalization of Project
Nightingale. If I can’t install a deck on my own home without paying for,
applying for and obtaining a village permit first, how ridiculous is the situation
with Project Nightingale? Nonsense, at its best.

17 https://energycommerce.house.gov/newsroom/press-releases/ec-leaders-want-answers-from-ascension-and-google-on-health-data-sharing


CHAPTER 10: The Deal Itself Reveals Privacy Concerns

Now that you know the obvious concerns raised by the Google Ascension
collaboration, it’s time to dig deeper.
Despite repeated assurances by Google, it can’t get around the terms of the deal
itself. There’s a lack of consistency between what Google tells us and what their
agreement with Ascension provides. According to Dr. Mason Marks (quoted
earlier) in his November 2019 piece in Slate News, Google gained access to the
medical records of more than 50 million people in 21 states. 18
He explains that while reporters and regulators alike have been digging into the
partnership, they’re asking the wrong question because much of the obvious
concerns are nothing new. It’s not only about whether Google is complying with
HIPPA and whether patients give consent, etc. Before we get to Dr. Marks’

18 https://slate.com/technology/2019/11/google-ascension-project-nightingale-emergent-medical-data.html

suggestion, it’s important to note that the Slate article explains that “corporations
have long had access to millions of medical records, and patients are rarely
informed. Even Google already had access to millions of patient records through
relationships with more than a dozen health care partners, such as the Mayo
Clinic, the University of Chicago, and the Cleveland Clinic. The difference
between these arrangements and Google’s deal with Ascension is merely its
scale.”

So, Dr. Marks posits that watchdogs should focus on “exactly what Google
plans to do with all this data.” Good question, I’d say! His concern is again
related to EMD. He states,

“According to the company, it will use the information to enhance
productivity and ‘support improvements in clinical quality and patient
safety.’ Google says it will not use Ascension’s data for other
purposes. However, there is a distinction between the data itself and
the knowledge Google gains from analyzing that data. This
distinction gives the company wiggle room to export what it learns to
other contexts. Google likely aims to mine Ascension’s data and
discover new markers of health it can apply outside the health care
system—across its full suite of products—to infer consumers’
medical conditions.

U.S. Patent Office documents filed in 2018 suggest that Google
aspires to predict or identify health conditions in people who haven’t
even visited a doctor, via the EMD discussed earlier.

REMEMBER: Whenever we interact with technology, we leave
behind digital traces of our behavior that serve as raw materials for
companies that mine EMD.

“A recent landmark study involving Facebook demonstrates the power of
EMD mining. The study analyzed the health records and social media
posts of 999 Facebook users. The results were surprising. Posts
containing religious language, such as the words God, Lord, and Jesus,
were strong predictors of diabetes. Finding that connection would have
been impossible without A.I. and access to health records.” So, what does
this mean to, say, a health insurance company? Don’t insure that Christian!
Shudder.
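To make the idea of EMD mining concrete, here is a minimal sketch in Python. All the posts and labels below are entirely made up for illustration; this is my own toy rendering of the general technique, not the actual study's method or data. It simply scores how strongly each word in users' posts is associated with a diagnosis label:

```python
from collections import Counter

# Synthetic toy data: (post text, 1 = diagnosed in the made-up records, 0 = not)
posts = [
    ("blessed by the Lord today", 1),
    ("praise God for family", 1),
    ("watched the game tonight", 0),
    ("God is good and I am tired", 1),
    ("new recipe turned out great", 0),
    ("long walk in the park", 0),
]

def word_association(posts):
    """Return P(diagnosed | word) for every word seen in the posts."""
    seen, diagnosed = Counter(), Counter()
    for text, label in posts:
        for word in set(text.lower().split()):
            seen[word] += 1
            if label:
                diagnosed[word] += 1
    return {w: diagnosed[w] / seen[w] for w in seen}

scores = word_association(posts)
# In this toy data, religious terms end up perfectly associated with the label,
# while neutral words are not.
print(scores["god"], scores["lord"], scores["game"])
```

Real EMD mining uses machine learning over millions of records rather than simple conditional frequencies, but the underlying move is the same: link behavioral traces to health labels and look for predictive signals.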

The article goes on to say,

“Through its partnership with Ascension, Google now has access to one of
the largest medical databases in the world. It can train A.I. to comb
through the data and identify words, phrases, and other variables that
reflect the presence or early onset of disease. That may not sound bad if
you assume Google will use what it learns to improve the health care
system. However, Google will likely maintain its discoveries as trade
secrets and export what it learns to other divisions of its parent company,
Alphabet, which include Nest and Sidewalk Labs. Google’s recently
announced acquisition of Fitbit for $2.1 billion expands its hardware
portfolio and EMD mining potential.

Combined with Google’s expertise in A.I., courtesy of Alphabet’s
DeepMind subsidiary, the Ascension deal gives Google unrivaled power to
find correlations between behavior and health conditions.

Google’s other subdivisions provide services including Gmail, YouTube,
Google Search, the Android operating system, and Google Docs.
Moreover, Google is part of a much larger company, Alphabet, which has
its own subdivisions including Nest, Sidewalk Labs, and Project Wing.
Each division and service is a data mining operation that collects and
analyzes consumer information. Thus, Project Nightingale’s real danger is
Google’s ability to leverage its cache of health data to build an unrivaled
consumer health surveillance empire spanning numerous industries and
technologies.

There are currently no laws to stop it. In fact, the Protecting Personal
Health Data Act, a law recently proposed by Sens. Amy Klobuchar and
Lisa Murkowski, actually creates a safe harbor for products that mine
EMD, including those that collect personal health data ‘derived solely from
other information that is not personal health data.’

The people who were treated in Ascension’s hospitals and clinics may not
have been warned that their information would be transferred to Google.
That’s bad. But the fact that your health data ends up in the hands of large
corporations is nothing new. The more dangerous threat is corporations’
ability to leverage A.I. and large medical databases to implement
widespread consumer health surveillance.” See footnote 18.

CHAPTER 11: New Bills being proposed

The first proposed bill, by Senators Bill Cassidy and Jacky Rosen


The Stop Marketing And Revealing the Wearables And Trackers Consumer
Health Data Act (Smartwatch Data Act) defines what data is protected under the
law. The bill would prevent entities that collect consumer health information from
transferring, selling, sharing, or allowing access to consumer health information
or any individually identifiable consumer health information collected on personal
health trackers. Violations of the new act would be enforced by HHS in the same
manner the department enforces HIPAA.

“The Google/Ascension news has brought needed scrutiny to the security of
Americans’ health data,” Cassidy says. “The Smartwatch Act prevents big tech
data harvesters from collecting intimate private data without patients’ consent.
Americans should always know their health information is secure.”

“The introduction of technology to our healthcare system in the form of apps and
wearable health devices has brought up a number of important questions
regarding data collection and privacy,” Rosen says. “This commonsense,
bipartisan legislation will extend existing healthcare privacy protections to
personal health data collected by apps and wearables, preventing this data from
being sold or used commercially without the consumer’s consent.”

The second proposed bill


The Protecting Personal Health Data Act would:

- Require the promulgation of regulations to help strengthen privacy and security protections for consumers’ personal health data.
- Ensure that these regulations take into account: appropriate standards for consent that account for differences in sensitivity between genetic data, biometric data, and general personal health data, and that complement existing regulations and guidance; and the ability of consumers to navigate their health data privacy options, and to access, amend, and delete a copy of the personal health data that companies collect or use.
- Create a National Task Force on Health Data Protection that would evaluate and provide input to address cybersecurity risks and privacy concerns associated with consumer products that handle personal health data, and the development of security standards for consumer devices, services, applications, and software. The Task Force would also study the long-term effectiveness of de-identification methodologies for genetic and biometric data, and advise on the creation of resources to educate consumers about direct-to-consumer genetic testing.

The bill is endorsed by Consumer Reports.



“Consumer Reports supports the Protecting Personal Health Data Act because the
current legal framework for privacy around health data is out of date and
incomplete. Protecting the legal right to privacy for users of new health
technology is about ensuring consumers have the freedom to take advantage of
promising new health technology without losing the right to privacy or facing
harm such as discrimination,” said Dena Mendelsohn, Senior Policy Counsel for
Consumer Reports.

Klobuchar has fought to lower prescription drug prices, invest in research, and
protect coverage for people with preexisting conditions. She has also been a
leader in the fight to protect consumers’ private information. Klobuchar and
Senator John Kennedy (R-LA) introduced the Social Media Privacy and
Consumer Rights Act, legislation to protect the privacy of consumers’ online data
by improving transparency, strengthening consumers’ recourse options when a
breach of data occurs, and ensuring companies are compliant with privacy
policies that protect consumers.

Permalink: https://www.klobuchar.senate.gov/public/index.cfm/2019/6/klobuchar-
murkowski-introduce-legislation-to-protect-cons

https://www.google.com/amp/s/www.nytimes.com/2019/11/11/business/google-
ascension-health-data.amp.html

In 2017, a British government watchdog agency ruled that the Royal Free
National Health Service Foundation Trust, a major health provider, had violated a
data protection law when it transferred medical records to DeepMind, an A.I. lab
in London owned by Google’s parent company, without sufficiently informing
patients.

DeepMind further outraged privacy groups in 2018 when it announced plans
to transfer the unit that processed the medical records to Google, after saying that
it would not link patient data to Google accounts. DeepMind’s health team
officially joined Google in September. In absorbing DeepMind’s health unit,
Google said, it was building “an A.I.-powered assistant for nurses and doctors.”

The New York Times, which broke the story, went into more detail about these
concerns.
According to Forbes (https://www.forbes.com/sites/jilliandonfro/2019/11/11/google-ascension-project-nightingale-electronic-medical-records/amp/), Google’s recently announced plan to acquire health wearables company Fitbit for $2.1 billion raised alarms with lawmakers, with Virginia Senator Mark Warner saying that the announcement “raises serious concerns” and calling for mandatory disclosures on how big tech companies use “sensitive data in healthcare products.”

Stacey Tovino, a bioethics expert and law professor at the University of Nevada,
says that some privacy concerns around large tech companies handling medical
records stem from the vast amount of other data that they store about people,
including search and location history. Even if patient records were de-identified as
outlined by the Health Insurance Portability and Accountability Act (HIPAA), the
likes of Google and Facebook are uniquely capable of using other mined data
to determine who a patient is.

“De-identification is getting to the point where it’s almost a myth” because of
advances in big data, she says.


CHAPTER 12: De-Identification is a Myth
As I’ve mentioned previously, I can see for myself and report to you that our
private information is not secure. I also shared the submission to the Court in the
class action (Chapter 2) about Google’s expertise making it “uniquely able to
determine the identity” of the medical records shared with it… So, I don’t want
you to take only my word for it.

I offer you…

Princeton Professor Arvind Narayanan, who has been researching and posting
blogs and articles about data privacy and ethics issues since 2006.19 He’s an
associate professor of computer science at Princeton who “moonlights” in
technology policy.

He recently posted, “I've shown how machine learning can be used to infer
sensitive info from seemingly innocuous "anonymized" data, ranging
from browsing histories to genomes. The risks of machine learning go
beyond privacy: in a paper in Science, we showed how AI technologies
reflect racial, gender, & other biases in our culture.”20
How frightening is that? Logically, our private information must already be
improperly disseminated for such a study even to occur! It’s exactly these types of AI
technologies which Google can use and manipulate. With medical data pouring
into its systems from Ascension, even with the promised security precautions,
Google can certainly find ways to hide improper use of our private medical
information, whether it intends to or not. Google also has massive personnel
who, without effective training on the HIPAA rules and other privacy
laws, can easily misuse and share our medical history for
private profit or other gain at our expense.
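How does re-identification actually work? A common route is a linkage attack: joining "de-identified" records against a public dataset on quasi-identifiers. Researchers such as Latanya Sweeney have shown that ZIP code, birth date, and sex alone uniquely identify a large majority of Americans. Here is a minimal sketch; every record below is invented for illustration:

```python
# "De-identified" medical records: name removed, but ZIP, birth date, sex kept.
deidentified_records = [
    {"zip": "60601", "dob": "1989-03-14", "sex": "F", "diagnosis": "fibromyalgia"},
    {"zip": "60601", "dob": "1975-07-02", "sex": "M", "diagnosis": "depression"},
]

# A public dataset that includes names, e.g. a voter roll or marketing list.
public_records = [
    {"name": "Jane Roe", "zip": "60601", "dob": "1989-03-14", "sex": "F"},
    {"name": "John Doe", "zip": "60601", "dob": "1975-07-02", "sex": "M"},
]

def reidentify(medical, public):
    """Join the two datasets on the quasi-identifiers (zip, dob, sex)."""
    index = {(p["zip"], p["dob"], p["sex"]): p["name"] for p in public}
    matches = []
    for m in medical:
        name = index.get((m["zip"], m["dob"], m["sex"]))
        if name:
            matches.append((name, m["diagnosis"]))
    return matches

# Both "anonymous" patients come back with names attached to diagnoses.
print(reidentify(deidentified_records, public_records))
```

A company holding search histories, location data, and purchase records has far richer quasi-identifiers than this toy join, which is exactly why the scale of Google's other data matters here.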

More evidence that even “de-identified” data can be re-identified…

19 http://randomwalker.info/data-privacy/
20 http://randomwalker.info/

Ross Anderson is a Professor of Security Engineering in the Computer Science
Department of the University of Cambridge, and he has researched and written an
article about the security of clinical information systems.21 These are his
findings:

“The safety and privacy of clinical systems have been a problem for years.
Recent scandals include the Google DeepMind case (exposed by my then
postdoc Julia Powles) where the Royal Free Hospital gave Google a
million patients' records that they shouldn't have; and the Care.data
affair where a billion records – basically all hospital care episodes since
1998 – were sold to 1200 firms worldwide, in a format that enabled many
patients to be re-identified. It wasn't much better under the previous
Labour government, which had a series of rows over thoughtless and
wasteful centralization. There is now an NGO, MedConfidential, which
monitors and campaigns for health privacy.

The NHS has a long history of privacy abuses. Gordon Brown's own
medical records were compromised while he was prime minister, but he
got off scot-free as it was "not in the public interest" to prosecute him. In
another famous case, Helen Wilkinson had to organize a debate in
Parliament to get ministers to agree to remove defamatory and untrue
information about her from NHS computers. The minister assured the
House that the libels had been removed; months later, they still had
not been. Helen started www.TheBigOptOut.org to campaign for health
privacy. They have been joined by medConfidential, Big Brother
Watch and others.”

This is obviously a worldwide problem. It raises the question: if Google has our
information, how many other countries’ private or governmental entities may find
our private data in their systems for exploitation and defamation? It’s a scary
prospect, to be sure. Again, there is much research showing that Google’s and
Ascension’s promises to keep our private data separate from consumer data, and
safe from hacking by third parties, should not be taken at face value, whether
made in good faith or not. We need more watch-dogging going on here in the U.S.
And, yes, I made up “watch-dogging,” which I’m pretty sure is not a real word.

CHAPTER 13: 4 Principles of Ethics by the World Health Organization (WHO)

21 https://www.cl.cam.ac.uk/~rja14/#Research

Where do we go from here? How do we measure the ethics of a program like
Nightingale or other data-sharing initiatives? These are some tough questions.
Believe me, I asked myself these sorts of questions before speaking out. The
World Health Organization (WHO) has thankfully provided some framework for
thinking about a way to answer them. It is by no means complete or even the right
answer. However, I believe it gives us a start, and interestingly this is where things
get tricky. We can debate the following, and this is where much of the debate
should be.

The WHO describes public health ethics with four principles:
Common good – Does the activity promote collective benefits?
Equity – Does the activity reduce burdens or risks to health or opportunity?
Respect for persons – Does the activity support individual rights and interests?
Good governance – Does the activity have processes for public transparency and accountability?

OK, so let’s begin to analyze the Google-Ascension partnership through this lens,
starting with the common good. Does the activity promote collective benefits?
Maybe. It depends on who you ask and what you’re measuring. Does the act of data
sharing in and of itself promote the common good? Maybe not. Does researching
and analyzing data in and of itself promote the common good? Maybe not. Is
analyzing data strictly as a means to create new cures a common good? Yes. What
about if you have to give up all privacy to gain advancements in healthcare? Is it
then worth it? I don’t know. Is there a way to balance the need for all this data with
patients who don’t want to share their data? Are patients’ rights equal to the rights
of the company? What if all the gains were owned by private organizations who
then sold licenses to this data, thus increasing total costs to patients overall? What
if it was used to charge patients for additional unneeded services?

Number 2: Equity – Does the activity reduce burdens or risks to health or
opportunity? Maybe. There are many points to consider: privacy, surveillance,
corporate interests, patient rights groups, patients’ rights, consent and knowledge,
and transparency. If you reduce a burden in one area but increase risk in another,
is that good or bad? If you remove privacy completely in the hope of a cure and
then one major hack leaks millions of patients’ records, did that increase or reduce risk?

Number 3: Respect for persons – Does the activity support individual rights and
interests? So far, individual rights have not been respected. So, no. Patients’
rights and the rights of privacy advocates have not been respected. Does it support
individual interests? If the goal is to create new cures alone, then yes, it would be in
everyone’s interest. If the data ends up in the hands of insurance providers who
deny patients access to healthcare, banks who deny credit, and employers who
use the data to discriminate, then no, this would not respect individual rights or
interests. And even in the best-case scenario where it starts off all right, if
ten years down the road the BAA (business associate agreement) or other
contractual arrangements change, then all that data can be used for other purposes.
So, transparency of contractual arrangements would be one way to respect
everyone’s rights, as would giving patients a choice to opt in or out. But right now
Google and Ascension have no plans to do so.

Number 4: Good governance – Does the activity have processes for public
transparency and accountability? In my estimation, the answer so far is no. There
is not much public transparency or accountability. For example, Google and
Ascension both issued perfectly worded statements crafted by their legal teams, but
those statements don’t address the underlying lack of transparency and accountability
in the deal. Only if the source code or the terms of the arrangement were made public
would this constitute good governance. Right now, there is no desire by either
company to be transparent going forward; in fact, they are full steam ahead.

CHAPTER 14: Goldilocks Dilemma

The last chapter set the stage for the dilemma, better known as the Goldilocks
dilemma.

The Goldilocks dilemma is the intersection of patient rights and corporate
interests. On the corporate side, they would argue that we need all the data so that
we can analyze it, mine it, and find cures. Sure, we may make money off the
algorithms created; sure, we might share with third parties at the expense of privacy,
and sure, it aligns with our corporate interests; sure, we might make mistakes and
mishandle data every once in a while. But the end result is better healthcare and
cures. On the patient side: do I have a right to opt in or out of sharing my data with
the company? Do I have a say in the process? Where is the line drawn where my
rights as a patient end and the corporate rights begin? Should there be levels of
opt-in and opt-out, say, level 1 for productivity improvements, level 2 for research,
and level 3 for other third-party sharing arrangements? Should I as a patient be
compensated for sharing my data, which helps them increase profits and make
money? Can I at any time revoke companies’ rights to my data?
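The tiered opt-in idea can be sketched as a simple policy check. The tier names and ordering below are my own hypothetical rendering of the levels described, not anything either company has proposed:

```python
from enum import IntEnum

class ConsentTier(IntEnum):
    """Hypothetical consent levels, ordered from least to most permissive."""
    NONE = 0          # no sharing at all
    PRODUCTIVITY = 1  # level 1: internal productivity improvements only
    RESEARCH = 2      # level 2: also usable for research
    THIRD_PARTY = 3   # level 3: also shareable with third parties

def sharing_allowed(patient_tier: ConsentTier, requested: ConsentTier) -> bool:
    """A use is permitted only if the patient opted in at that tier or higher."""
    return patient_tier >= requested and requested > ConsentTier.NONE

# A patient who consented to research has NOT consented to third-party sharing.
print(sharing_allowed(ConsentTier.RESEARCH, ConsentTier.THIRD_PARTY))  # False
print(sharing_allowed(ConsentTier.RESEARCH, ConsentTier.PRODUCTIVITY)) # True
```

The point of the sketch is that such a check is trivially easy to implement once patients are actually asked; the obstacle is contractual will, not technology.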

CHAPTER 15: The Technology Hippocratic oath

Who was Hippocrates, and what does he have to do with healthcare? Hippocrates, who lived
around 460 BC in Greece, was among the first documented physicians, was a philosopher, and is
considered the father of present-day medicine: think founding fathers of the United States, but for
medicine as a whole. He practiced medicine and created the basis for the Hippocratic
oath, which doctors and nurses pledge themselves to before practicing medicine in the
US today. Written long before modern technology, it essentially meant that doctors and
nurses would keep patient information private and do everything for the benefit of the
patient, thus acknowledging that if the public knew the private matters of patient health, it
could cause detrimental consequences for patients. That was thousands of years ago. But
with the advent of technology, nothing is private anymore. Thus, the need for a
Technology Hippocratic oath. If tech giants handle the same sensitive data that
doctors and nurses do, it stands to reason that they too should be held to the privacy and security
standards that doctors and nurses are, and even more so.

CHAPTER 16: Hacking Component and the Risk it poses

For this chapter I am going to directly paste a study by an NSA researcher and University of
Pennsylvania professor with whom I have spoken. He co-authored an NSA study on
cybersecurity concerns as they relate to healthcare. He covers 11 areas of concern from a
hacking perspective (these are not all-inclusive; in fact, there are many other areas of
concern, and publications that go into more detail, which should be studied). In
addition to new laws being created to protect patient privacy, EMD data, and the
right of consent, the next big area of concern is cybersecurity itself.

His research is below:



“First, today’s medical infrastructure has been adapted
from conventional computing, and was not designed
with the needs of health care expressly in mind. Hence the
current generation of micro-processor-based health care
equipment is usually better suited to office-like and
traditional data processing workflows than to clinical
environments. As noted, this is especially clear in the use of
passwords as an access control mechanism. Using
equipment only partially adapted from a work-flow alien to
the clinical setting thus leaves gaps that system users have
to cover with their own behavior, and each such gap creates
potential attack points. Evolving the system’s workflow
model away from the office and toward the realities of
clinical practice is a major design challenge with significant
opportunities to improve cybersecurity.
A second challenge is that many of today’s medical devices
were first designed and built as stand-alone devices, not as
networked components integrated into a larger system. As
such, they lack the very basic functionality needed for their
new role. This is analogous to when conventional
computing was extended from terminals attached to
time-sharing systems to networks of workstations and PCs in the
early 1980s. Integrating legacy medical devices into the new
networked architectures requires a large amount of new
software just to provide the missing functionality. This
integration process itself creates novel security problems
that never existed for the stand-alone design. As an
example, a stand-alone insulin pump assumes a single or at
most a small number of login accounts for operators. In
contrast, when networked, practically any clinician on a
patient’s care team may need to log into the device, and so
the authentication system needs to be re-worked from
scratch to integrate into the facility’s network-wide
authentication framework.

Thus the third challenge requires stand-alone medical
devices to be re-architected and re-manufactured with
networking and distributed computation in mind. Then
they must be redeployed into the more modern electronic
healthcare environments. Experiences with conventional
computing systems suggest that the current system of
legacy devices jury-rigged into a larger network is unlikely
to withstand prolonged attacks by today’s professional
criminal attackers.
A fourth and perhaps most significant challenge is that
health care has evolved into specialties split across many
disciplines and even organizational boundaries. A patient
may have several health records which must travel across
all of these several system boundaries to satisfy all of the
clinical, scientific, and business objectives involved in that
patient’s care and billing. As a practical matter this means
that the security and privacy of an EHR is governed by the
least common denominator needs of all the groups and
organizations requiring access. Different organizations and
groups are subject to different opportunities, incentives,
and constraints, many of which contradict one
another. Thus, finding compromises that both protect the
patient and enable the broad range of specialists requiring
EHR access is a daunting conceptual and real-life problem.
This lack of uniformity in health record handling presents
opportunities for attackers to exploit. Somehow the system
must be better rationalized to reduce the opportunities for
exploits.
A fifth challenge is that healthcare data—especially
electronic health records—are worth ten to thirty times
more on the black market than are credit card numbers. A
cyber-thief can buy purloined credit card numbers for
anywhere from about $2.50 to $10 a number (assuming
one is buying in bulk). In contrast EHR data are worth up
to $65.00 each. Why this difference? A credit card can be
used once or a few times and then it is stopped. But a
healthcare record: 1. Has your credit card number anyway
in addition to your social security number; 2. Can be used
for blackmail; and 3. Most important—is the gift that keeps
on giving if used by an unscrupulous medical provider.
Thus, a physician, or PT, or pharmacy, or oxygen supply
company can bill Medicare or an insurance company for
millions of dollars. And the unscrupulous vendor or
provider will know exactly the kind of services, procedures
or supplies appropriate for each patient. The fraud is aided
by incomprehensible (and intentionally opaque?) medical
billing processes and the reality that a lot of sick people are
not carefully examining the Byzantine paperwork that
accompanies any medical event.
A sixth challenge is created by the increasing ability of
patients to directly input data into their personal health
records (PHRs) and even into their EHRs. These data can
be self-reports (e.g., what I ate, how I felt) or these data can
come from devices, such as Fitbits, cell phones, other apps,
and medical devices (e.g., heart monitors, pacemakers,
insulin pumps, scales, blood pressure monitors).
There are literally millions of medical apps out there. Thus,
the most obvious issue is the trustworthiness of the data;
Should the data be accepted into the PHR or EHR without
review? Is the clinician obliged to review it? Or to accept it?
What of diary-type entries, reflecting weight loss desires,
foods not eaten, sexual activities, child care worries?
What are the legal implications of this process? Must
clinicians act upon something that might be suspicious but
is probably not consequential? How is it even possible to
expect clinicians to find the important needles in the
massive haystacks of data that could inundate their
practices?
A seventh challenge is the vulnerability of data in cell
phones and other personal devices, be they medical (e.g.,
pacemakers, insulin pumps) or possibly related to health
(e.g., exercise machine records, pedometers). As noted
earlier, these devices were often not designed with
cybersecurity as essential. In particular, today’s hackers
have the skills and expertise to discover any of these devices
from the Internet, reverse engineer their software, create
malware explicitly for these devices, and then install their
malicious code on the targeted devices; they do all of this
remotely across the Internet. Creating devices more
resistant to these threats is therefore a major opportunity.
An eighth and related challenge is the safety of the data in
transit from devices to one’s EHR or PHR. Cell phones
operate in “open space.” Many other devices rely on cell
phones or on Wi-Fi of uncertain security. (The first author
of this article discovered the WEP flaws in July 2000 and
subsequently served as the architect for WPA2 when he
worked at Intel, work instrumental to his promotion to the
role of chief cryptographer).
A different, and ninth, challenge is the motivation of both
patients and the health care system’s personnel to cheat.
Many wellness programs and other health insurance
programs reward activities that are thought to enhance
health. Thus, more walking steps per day, routine exercise,
smoking cessation, or weight loss are rewarded with money
or reductions in insurance premiums. Similarly, some
programs punish employees on the basis of these data with
higher premiums and public exposure. Of course, humans
are very smart and we therefore enjoy reports of fitness
trackers attached to ceiling fans, dog collars, and even
electric drills. Smokers will substitute the urine of others
for their own; people will carry weights and wear heavy
shoes when being weighed for the baseline measurements.
The tenth challenge is the lack of data standards for
healthcare data. Thus one system lists your blood pressure
as 120/80, another as “diastolic of 80 and systolic of 120”
(in alphabetical order), another system as “labile,” or as
“not compliant with medications,” and yet another system
records your blood pressure as “stable.” Even if the
software sends the information from one system to another,
the clinician seeking to treat you may never see the other
information with words jumbled into the data fields
(computers don’t handle ambiguity very well) and if the
data do come to her, she can’t effectively use information
called “labile” or “stable.” She needs the numbers. This data
standard chaos affects security because healthcare systems
are then obliged to develop workarounds for the lack of data
standards by transferring the information to subprograms
(APIs) that help send the information across systems. But
introducing additional software that often does not encrypt
the data just creates additional vulnerabilities.
An eleventh challenge is that local groups of hospitals and
clinician practices often create HIEs (Health Information
Exchanges) that allow doctors and hospitals to access data
from other participating institutions and practices. The
value of these HIEs is compromised by the reality that they
seldom have full participation of all local healthcare
organizations. Thus searching for a patient often produces
no information. Also, the fact that they are local means they
will miss patients who have recently moved to an area or
are just passing through. Last, the lack of a unique patient ID
in the USA means they often don’t help differentiate
medical information on Robert Smith, Robert J Smith,
Smith RJ, Smith R, RSmith, RJSmith, etc., etc. This means
that one must examine the records of often hundreds of
patients in hope of finding the one you need; further
exposing more patient data to many eyes.
In conclusion: Healthcare IT promises—and often delivers—
faster, better, and more comprehensive medical care. But
underlying those promises is the assumption that patient
data in the IT systems are secure; and that the safety of the
software used to collect, analyze, present and transfer that
information is not easily compromised. As we have sought
to explain, there are good reasons to doubt the data security
of many medical data systems.
We are not suggesting that medical authorities or their IT
staffs are cavalier about these dangers. They are aware,
concerned, and are actively seeking to protect that
information. However, the vulnerabilities we’ve noted are
profoundly complex and often shifting. The number of
separate systems, the age of some of those systems and the
dangers of combining devices and data sources presents
challenges that are severe, and constant. Worse, they are
always emergent because software, hardware, consultants,
patient populations, clinicians, and business relationships
are seldom static. Equally disconcerting is that system
vulnerability depends on workers at healthcare facilities
and related organizations who are the targets of
increasingly sophisticated hackers. Hackers send spear
fishing messages that incorporate friends’ and bosses’
names and topics of great urgency including employees’
children’s names and their teachers’ names.
Data in mobile devices and in transit represent yet another
set of vulnerabilities. We seek the convenience of constant
cyber connectedness, but seldom consider how that
connectedness provides the bad guys constant access to
data and systems. Protecting our information may mean
very different ways of keeping and sending data. One man’s
emoticon for home is another’s entrance to his bank
account.
Medical informatics has developed over the past five
decades, building incrementally, expanding its purview,
promises, and prestige. It will soon know more about us
than we do via its access to our genetics and precision
medicine’s algorithms. The security of our data is therefore
even more essential even though the protection of our
information is too frequently a slapdash patchwork of good
intentions, private interests, some caring security
engineers, and often limited resources devoted to security.
As we write this, Yahoo announced the hacking of half a
billion passwords and personal information. It’s probable
that Yahoo’s security systems are more robust than most
people’s cell phones or their medical providers’ databases.”

Ross Koppel, PhD, FACMI has been at the University of
Pennsylvania for 25 years, where he teaches sociology, is a
Senior Fellow at LDI Wharton, and is PI on several projects
involving healthcare IT and cybersecurity. He is also
professor of biomedical informatics at the University at
Buffalo (SUNY).

P.S. Other aspects of cybersecurity to consider are whether
cloud technologies are truly secure, or whether
multinationals have proper security in place for all their
data. There have in fact been over 10 major hacking events
at major multinationals, including Google, Yahoo, Target,
Microsoft, and Apple, even though they have top-notch
security and much of their data is in the cloud. Going back
to my conversation with Ross Anderson from Cambridge
University: “nothing is 100% secure.”

CHAPTER 17 – the last and Certainly not least: Should Patients Have a
Say in the Process?

I submit, ABSOLUTELY!
The deep moral quandary

Yes, there’s a quandary we now must face as people/patients living in such a
technologically advanced world. Our grandparents didn’t have this moral
decision, but it seems that we do. While it’s not spelled out as dramatically as it is
in Dr. Adrian Gropper, MD’s December 2019 article, it exists! And it’s usually
most acute when we’re in the middle of signing in or out of our doctor’s office or
hospital. The title of his article is “Patient-Directed vs. The Platform”. As you can
see, just the use of “vs.” suggests a battle. And it is. For his full article, see
https://blog.petrieflom.law.harvard.edu/2019/12/19/hipaa-privacy-
platforms/#more-28334

The article is actually part of a series called The Health Data Goldilocks
Dilemma: Sharing? Privacy? Both? which can be found at
https://thehealthcareblog.com/the-health-data-dilemma-sharing-privacy-both/ .

For my purpose of keeping you in the know on the basic information you’ll likely
find interesting and important to you on the medical data sharing issue, I’ve
highlighted some key portions of Dr. Gropper’s article below:

“This piece is part of the series… which explores whether it’s possible to
advance interoperability while maintaining privacy.
It’s 2023. Alice, a patient at Ascension Seton Medical Center Austin,
decides to get a second opinion at Mayo Clinic. She’s heard great things
about Mayo’s collaboration with Google that everyone calls “The
Platform”. Alice is worried, and hoping Mayo’s version of Dr. Google says
something more than Ascension’s version of Dr. Google. Is her Ascension
doctor also using The Platform?
Alice makes an appointment in the breast cancer practice using the Mayo
patient portal. Mayo asks permission to access her health records. Alice is
offered two choices: one uses HIPAA without her consent and the other is
under her control. Her choice is:

• Enter her demographics and insurance info and have The Platform
use HIPAA surveillance to gather her records wherever Mayo can
find them, or
• Alice copies her Mayo Clinic ID and enters it into the patient portal
of any hospital, lab, or payer to request her records be sent directly
to Mayo.”

How would you feel if you were Alice?

I know I’d feel vulnerable, unsure, and worried. Most don’t even understand what
some of the words mean or who sees patients’ medical records as they are
delivered from one place to another. I’d want to talk to someone I trust for help in
deciding, but who? I don’t know. This is precisely why we need laws to protect
us—those who are not doctors or tech experts.

Under the current lack of regulation and laws governing the movement of
records, there’s a scenario that permits a patient to contact a medical provider
directly and request that her records be emailed directly to the patient (or
mailed). Great, right? Wrong. It’s also currently permissible for the clinic to
require that the patient install an app on her phone or sign up for some other
platform. So, again, the patient is faced with some third party having access to her
healthcare history. This is not good. There seem to be no watchdogs governing
how these third parties handle and store the sensitive patient information.
So, that’s the frustrating quandary. How much power over our own privacy
should we give away for the common good? Should any of our medical data
which could be connected directly to us by name be shared with companies like
Google? Sure, we want improvement in medical care, but at what expense? Can
that be accomplished without major risk to us and our loved ones? Maybe, but it’s
not been done yet. And, until then, our sensitive medical histories are out there to
be discovered and used by who knows who? The legislature and medical
associations have to act FAST!
Dr. Gropper says a lot more in this article. I’ll summarize it for the purposes of
this book. He discusses how only hospitals, not patients or physicians, decide
which EHR to use. EHR stands for “electronic health records.”

This post is about the relationship between two related health records
technologies: patient-directed uses of data and platforms for uses of
patient data. As physicians and patients, we’re now familiar with the first
generation of platforms for patient data called electronic health records or
EHR. To understand why CARIN matters, the only thing about EHRs that
you need to keep in mind is that neither physicians nor patients get to
choose the EHR. The hospitals do. The hospitals now have bigger things
in mind, but first they have to get past the frustration that drove the
massively bipartisan 21st Century Cures Act in 2016. The hospitals and
big tech vendors are preparing for artificial intelligence and machine
learning “platforms”. Patient consent and transparency of business
deals between hospitals and tech stand in their way…
The practices that will control much of tomorrow’s digital health are being
worked out, mostly behind closed doors, by lobbyists, today…
Three years on, the nation still awaits regulations on “information blocking”
based on the Cures Act. Even so, American Health Information
Management Association (AHIMA), American Medical Association (AMA),
American Medical Informatics Association (AMIA), [and others]… are
sending letters to House and Senate committees hoping for a further delay
of the regulations…
We’re led to believe that hospitals are the safe place for our data and
patient-directed uses need to be “balanced” by the risk of bypassing the
hospitals and their EHRs. Which brings us back to CARIN Alliance as the
self-appointed spokes-lobby for patient-directed health information
exchange.
According to CARIN, “Consumer-directed exchange occurs when a
consumer or an authorized caregiver invokes their HIPAA Individual Right
of Access (45 CFR § 164.524) and requests their digital health information
from a HIPAA covered entity (CE) via an application or other third-party
data steward.” (emphasis added) A third-party data steward is a fancy
name for platform. But do you or your doctor need a platform to manage
uses of your data?
HIPAA does not say that the individual right of access has to involve a
third-party data steward. We are familiar with our right to ask one hospital
to send health records directly to another hospital, or to a lawyer, or
anywhere else using mail or fax. But CARIN limits the patient’s HIPAA
right of access dramatically: “All of the data exchange is based on the
foundation of a consumer who invokes their individual right of access or
consent to request their own health information. This type of data
exchange does not involve any covered entity to covered entity data
exchange.”
By restricting the meaning of patient-directed access beyond what the law
allows, everybody in CARIN gets something they want…To promote these
interests, the CARIN version of patient-directed access reduces the
control over data uses for physicians as well as patients much beyond
what the law would allow.
The CARIN model for digital health and machine learning is simple.
Support as much use and sale by hospitals and EHR vendors without
consent while also limiting consented use to platform providers like
Amazon, Google, IBM, Microsoft, Oracle and Salesforce, along with
CARIN board member Apple.
CARIN seems to be a miracle of consensus. They have mobilized
the White House and HHS to their cause. Respected public interest
organizations like The Commonwealth Fund are lending their name to
these policies. Is it time for this patient advocate to join the party?
Some of what CARIN is advocating by championing the expansion of the
FHIR interface standards is worthwhile. But before I sign on, what I
want CARIN to do is:

• Remove the scope limitation on hospital-to-hospital patient-directed
sharing.
• Suspend work on the Code of Conduct
• Separate work on FHIR data itself from work on access
authorization to FHIR data.
• Do all work in an open forum with open remote access, open
minutes, and an email list for discussion between meetings.
Participation in the HEART Workgroup (co-chaired by ONC) and
also designed to promote patient-directed uses would be part of
this.

Digital health is our future. Will it look like The Mayo Platform with Google
and Google’s proprietary artificial intelligence behind the curtain? Will
digital health be controlled by proprietary and often opaque Google or
Apple or Facebook app store policies?
The CARIN / CMS Connection and CARIN Community meeting are taking
place this week. Wouldn’t it be a dream if they would engage in a public
conversation of these policies from Alice’s perspective? And for my friends
Chris and John at Mayo, what can they do to earn Alice’s trust in their
Platform by giving her and her doctors unprecedented transparency and
control?”
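For readers unfamiliar with the FHIR standard the article mentions: at its core, FHIR is a REST API that serves health records as JSON “resources.” As a minimal sketch of what a patient-record request looks like at the technical level (the server address and patient ID below are made up for illustration, not taken from any real hospital system):

```python
import json

# Hypothetical FHIR server base URL, for illustration only.
FHIR_BASE = "https://fhir.example-hospital.org"

def patient_read_url(base: str, patient_id: str) -> str:
    """FHIR 'read' interaction: GET [base]/Patient/[id]."""
    return f"{base}/Patient/{patient_id}"

# A minimal FHIR Patient resource, shaped the way a server would return it.
sample_response = json.loads("""
{
  "resourceType": "Patient",
  "id": "alice-123",
  "name": [{"family": "Example", "given": ["Alice"]}]
}
""")

print(patient_read_url(FHIR_BASE, "alice-123"))
# -> https://fhir.example-hospital.org/Patient/alice-123
print(sample_response["name"][0]["given"][0])  # -> Alice
```

The point is that once such an interface exists, any app granted access can pull the whole resource. Who is allowed to call it, and under what consent, is exactly what the policy fight described above is about.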

After reading Dr. Gropper’s article, I couldn’t help but think it’s not enough just
to have new health care laws or new practices; patients and the greater
public must actually be included in the discussion and process. Up until now,
only tech companies, hospital networks, and lobbyists have had a say in
the matter. Until we include everyone, trust in the healthcare process will
only continue to erode. With the new interoperability laws currently under
discussion by HHS and others, now is the time to hear from all sides. The
current debate does not include the patient perspective, does not include
privacy laws, and does not address the concerns of data falling into the
wrong hands or being used for third-party advertising or sold to insurance
providers, banks, or employers, where it can fuel algorithmic bias.

Actually, these new interoperability laws seem to be making it easier for
Silicon Valley and other tech companies to get ahold of patient data and
mine it, analyze it, advertise with it, and sell it to insurance providers. If
new interoperability laws are passed without addressing the patient
perspective, patient privacy, and algorithmic bias, then they are no better
than what we have now. No one argues that allowing doctors to have
more information to make better decisions is a bad thing. But if these laws
pass without patient privacy protections, patient consent, and rules to
prevent algorithmic bias and EMD metadata collection, then these new
interoperability laws will increase discrimination, reduce patient rights,
reduce privacy, reduce consent, and make it easier for hackers to break
into patient data.

Without requiring algorithmic transparency and transparency in the
agreements between hospital systems, experts will be unable to analyze
what’s taking place to ensure patient rights are upheld. CARIN may not
have all the answers, and it does not appear to be including patients in its
lobbying or advocating for patient privacy, patient consent, prevention of
algorithmic bias, or new laws to protect patients. Simply put, if you give
third parties more access to data before implementing laws that address
patient privacy, patient consent, advertising, EMD metadata sharing,
research, data mining, and algorithmic bias, then that can actually do
more harm than good for patients.

Final word:
The things discussed in this paper are a result of many hours of research,
talking with experts, and reading the latest whitepapers and research
papers on the subject. I have met with over 10 experts in the field, ranging
from academia and government to researchers, patient privacy experts,
medical doctors, cybersecurity experts, policy experts, and health data
lawyers. I think we need to hear from all these areas when making new
laws. Although much of the discussion has been around Google and the
risk they pose, the things discussed above are applicable to any tech
provider, including Amazon AWS and Microsoft Azure. It’s not just Google
trying to enter the space but many companies. This affects the entire field
of healthcare and the companies entering it. I think the best way forward
is to hold hearings and public discourse in Congress: to call on leaders in
every field so the issue can be debated, and the public and Congress can
hear all sides, to ensure nothing is forgotten when it comes to something
as important as healthcare data.
