Information Technology Question

Description

I’m working on an artificial intelligence report and need the explanation and answer to help me learn.


I need an essay of 8-9 pages on AI privacy, on the topic below:

Exploring the broader societal impacts of ethical or unethical AI privacy technologies, including their effects on vulnerable populations, and considering potential consequences for individuals, organizations, and society as a whole.

The rubric: the essay will be evaluated based on the following criteria:
(a) Topic Relevance: Ensuring that your essay’s content aligns with one of the specified lecture topics.
(b) Concept Understanding: Demonstrating a clear understanding of key ethical and responsible AI concepts.
(c) Critical Thinking: Exhibiting critical thinking skills by evaluating diverse perspectives on ethical AI and offering nuanced insights into the complexities of ethical AI.
(d) Writing: Displaying effective writing skills, including clarity, coherence, and proper grammar.
(e) Citations and References: Properly citing and referencing sources used in your essay, adhering to a recognized citation style (APA).

I will just give you the reference papers, but you have to do your own research on the topic of privacy.


Unformatted Attachment Preview

IS 698/800-05: Ethical and
Responsible AI (Fall 2023)
Week 12: Privacy (1)
GDPR, data privacy, k-anonymity,
hacking, industry incentives for storing
data vs user privacy
Learning Goals
By the end of the lesson you should be able to:
• Discuss how we can understand privacy in the big-data era
• Explain the tensions between organizations such as technology companies
and their users regarding privacy
• Distinguish between privacy and security, and discuss the harms of data
breaches
• Overview and compare the GDPR and HIPAA regulations and their
implications for data mining / machine learning.
• Explain anonymization and its limitations.
• Explain k-anonymity, and discuss its limitations. Calculate k.
• Overview federated learning
The Target example (Forbes): Target’s purchase-history analytics identified a teenage customer as pregnant before her father knew.
http://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/#b228dae34c62, Retrieved 6/16/2016
Privacy
• The Target example illustrates several important points:
• Companies are incentivized to collect large amounts of data on their customers/users
• Hope to use it to understand the users and improve profits
• The cost of storage is now very low, so companies will keep everything they can on their
customers, even if they don’t have a specific use for it right now.
• The information can be potentially invasive
• Even apparently innocuous data could reveal sensitive facts
Privacy and Machine Learning
• Businesses profit from leveraging ML on customer data
• As individuals and consumers we also benefit from ML systems
trained on OUR data
• Internet search
• Recommendations
• products, movies, music, news,
restaurants, email recipients
• Mobile phones
• Autocorrect, speech recognition, Siri, …
• We need a sensible balance of privacy protections while still
enabling ML
Privacy and Machine Learning
• Want the benefits of sharing our data while protecting our privacy
• Have your cake and eat it too!
“We believe you should have
great features
and
great privacy.
You demand it and we’re dedicated to providing it.”
• Craig Federighi,
Apple senior vice president of Software Engineering.
June 13 2016, WWDC16
Quote from http://appleinsider.com/articles/16/06/15/inside-ios-10-apple-doubles-down-on-security-with-cutting-edge-differential-privacy , retrieved 6/16/2016
Government Surveillance
• Companies are not the only entities
collecting potentially invasive data
• Edward Snowden’s 2013 leaks revealed
secret NSA surveillance programs:
• PRISM program collects communications
data from internet companies
• NSA was monitoring “one Library of Congress every 14.4 seconds” even in 2006
• XKeyscore searches and analyzes such data
• Snowden: “You could read anyone’s email in the
world, anybody you’ve got an email address for.
Any website: You can watch traffic to and from it.
Any computer that an individual sits at: You can
watch it. Any laptop that you’re tracking: you can
follow it as it moves from place to place
throughout the world. It’s a one-stop-shop for
access to the NSA’s information.”
https://web.archive.org/web/20140128224439/http://www.ndr.de/ratgeber/netzwelt/snowden277_page-3.html
Think-Pair-Share: Privacy
• What does “privacy” mean to you, when it comes to digital data
about you in 2023?
• How would you define it?
• A unicorn
• Keeping all of our digital data confidential
• What do you want companies / healthcare providers / Universities / etc. to
do to protect your privacy?
• Don’t store it! Aggregate only, metadata
• Healthcare providers – need history, but don’t give it to third parties
• Keep data for specific purposes only, not for other analytics
• Stop listening to our conversations!
• Permissions can be too vague – make this clear and transparent
What is Privacy?
“Privacy, however, is a concept in disarray.
Nobody can articulate what it means.
Currently, privacy is a sweeping concept,
encompassing (among other things) freedom
of thought, control over one’s body, solitude
in one’s home, control over personal
information, freedom from surveillance,
protection of one’s reputation, and
protection from searches and interrogations.
Philosophers, legal theorists, and jurists have
frequently lamented the great difficulty in
reaching a satisfying conception of privacy.”
– Daniel Solove
Solove, D. J. (2008). Understanding privacy.
Hartzog, W. (2021). What is privacy? that’s the wrong question. University of Chicago Law Review, 88(7), 1677-1688.
What is Privacy in the Big Data Era?
• A simplistic definition: Privacy means keeping
our secrets and revealing as little information
on us as possible.
• Since we are all being constantly tracked
online, by this definition, our privacy is
reducing rapidly.
• Is privacy over for us?
• Some believe that we have already met “the
death of privacy”
The “death of privacy”
• “You have zero privacy anyway…. Get over
it.”- Scott McNealy, CEO of Sun
Microsystems, 1999
• “Privacy may be an anomaly” and “it will
be increasingly difficult for us to achieve
it.” “Our social behavior is quite damaging
to privacy. Technology has outraced our
social intellect.”
• – Vint Cerf, Google’s “Chief Internet
Evangelist” and one of the leading
figures in the creation of the internet,
2013
• “The age of privacy is over” – Mark
Zuckerberg, Facebook founder, 2010
What is Privacy in the Big Data Era?
• A better, more useful definition of privacy:
• “Privacy should not be thought of merely
as how much is secret, but rather about
what rules are in place (legal, social, or
otherwise) to govern the use of
information as well as its disclosure.” –
Richards and King (2014)
(emphasis mine)
• By this definition, according to Richards
and King, “privacy (and privacy law) are
very much alive.”
Richards, N. M., & King, J. H. (2014). Big data ethics. Wake Forest Law Review, 49(2), 393-432.
What is Privacy in the Big Data Era?
• At least four legal uses of the
term “privacy” (Richards, 2005):
1. invasions into protected
spaces, relationships, or
decisions
2. collection of information
3. use of information
4. disclosure of information.
• This could be further divided,
e.g. Daniel Solove (2008) divides
privacy into 16 categories,
including surveillance,
interrogation, aggregation, and
disclosure…
Neil M. Richards, Reconciling Data Privacy and the First Amendment, 52
UCLA L. REV. 1149, 1181-82 (2005)
Solove, D. J. (2008). Understanding privacy.
Privacy vs Security: Data Breaches
• Data collection by tech companies, frequently for machine
learning purposes, motivates hacking and data breaches.
• “Computer security is not privacy protection”
– Latanya Sweeney
• However, data breaches are clearly very harmful to users’
privacy
• Some of the largest data breaches include:
• Yahoo in 2013 (revealed in 2016), impacted three billion
accounts
• Included answers to security questions
• Aadhaar (India’s national biometric ID database) in 2018, impacted around one billion users
• Included biometric data and bank account numbers
• Alibaba (Chinese tech company) in 2019, impacted
around one billion users.
• Included criminal records
• All three of the above included phone numbers and
email or physical addresses
Privacy vs Security: Data Breaches
• Leaks at adult dating and hookup sites are sensitive,
embarrassing, and have clear consequences:
• The FriendFinder Network in 2016, home of several
adult-oriented social networks and hookup sites
• included names, email addresses, and phone
numbers
• Ashley Madison (“dating site” for people to cheat on
their spouses or partners) breach in 2015
• Revealed 32 million accounts’ personally identifiable
information and stated sexual desires, exposing
affairs
• Several affected members committed suicide
Data Breach Harms
• In legal proceedings about data breaches, it can be difficult to prove
harm (which might happen later) in order to win a lawsuit
• Solove and Citron (2018) provide a legal theory for understanding harms
from data breaches:
• Risk as harm
• Suppose you list two identical safes for sale on eBay, but one of
them has had its combination revealed to others and it cannot
be reset. Which safe is worth more?
• Anxiety as harm
• Anxiety about possible future identity theft is itself harmful
• Common practice in such lawsuits is to provide free credit
monitoring to victims, but this cannot fully alleviate anxiety
• Offering credit monitoring is like offering a blood test to someone who was exposed to an increased risk of cancer – the test is not a cure, and it does not prevent anxiety
Danielle K. Citron & Daniel Solove, Risk and Anxiety: A Theory of Data Breach Harms,
96 Texas Law Review 737 (2018).
General Data Protection Regulation (GDPR)
• The General Data Protection Regulation (GDPR) is
the European Union’s landmark privacy regulation
law
• Effective since 2018.
• Applies to any organization that collects data on people living in the European Economic Area (EU + Iceland, Liechtenstein, and Norway)
• Impacts non-EU companies that do business
with the EU
• As such, it has impacted privacy policies
well beyond the EU
• It may be difficult to enforce outside of
the EU, however
General Data Protection Regulation (GDPR)
• The GDPR is a very complex law. Its key principles
are:
• Lawfulness, fairness, and transparency
• Accountability
• Purpose limitation
• Data minimization
• Accuracy
• Storage limitation
• Security
• It aims to give users control over their personal
information
• It states that personal data may not be processed
without a legal basis for doing so.
• It defines personal data as “any information
relating to an identified or identifiable natural
person”
• Personally identifiable information (PII) is
another name for this.
General Data Protection Regulation (GDPR)
The legal bases for processing data are (exact text from the law, Article 6):
• Processing shall be lawful only if and to the extent that at least one of the
following applies:
• (a) the data subject has given consent to the processing of his or her personal
data for one or more specific purposes;
• (b) processing is necessary for the performance of a contract to which the data
subject is party or in order to take steps at the request of the data subject prior to
entering into a contract;
• (c) processing is necessary for compliance with a legal obligation to which the
controller is subject;
• (d) processing is necessary in order to protect the vital interests of the data
subject or of another natural person;
• (e) processing is necessary for the performance of a task carried out in the public
interest or in the exercise of official authority vested in the controller;
• (f) processing is necessary for the purposes of the legitimate interests pursued by
the controller or by a third party, except where such interests are overridden by
the interests or fundamental rights and freedoms of the data subject which
require protection of personal data, in particular where the data subject is a child.
Right to Explanation
• Some legal scholars have argued that
the GDPR mandates a “right to
explanation,” i.e., an explanation of a
decision made by an AI or machine
learning algorithm.
• This has motivated much research
on explainable AI techniques.
• This view has been disputed by other
scholars. Its legal status is unclear.
Anonymization and Pseudonymization
• Fully anonymized data, i.e. data that is no longer associated with individuals
and cannot be re-identified as such, is out of the scope of the GDPR and not
subject to its regulations.
• However, this is difficult to achieve in practice, and there is a high bar for
demonstrating this for the GDPR. De-identification i.e., removing personal
identifiers such as names, is generally not sufficient to ensure that the data
cannot be re-identified (we will discuss this more shortly).
• In practice, to establish GDPR compliance in terms of securing and
protecting personal data, one generally needs to perform
pseudonymization, defined by the GDPR as preventing the identification of a
data record with a natural person without additional data that is stored
separately.
• Pseudonymization strategies include (a sketch follows below):
• Encryption
• Replacing ID fields such as names with temporary ID’s
• Replacing all field names and values with pseudonyms (called tokenization, not to be confused
with the term from NLP)
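As a rough illustration of the second strategy (not a compliance recipe), here is a minimal Python sketch that replaces names with random temporary IDs while keeping the name-to-ID mapping in a separate store; all field names and records here are invented for the example:

```python
import secrets

# Hypothetical records; "name" is the direct identifier to remove.
records = [
    {"name": "Alice Smith", "zip": "21250", "diagnosis": "flu"},
    {"name": "Bob Jones", "zip": "21043", "diagnosis": "asthma"},
]

# name -> pseudonym lookup; under the GDPR's definition, this "additional data"
# must be stored separately from the released records.
id_map = {}

def pseudonymize(record):
    """Swap the name for a random temporary ID, recording the mapping."""
    name = record.pop("name")
    if name not in id_map:
        id_map[name] = secrets.token_hex(8)
    return {"pid": id_map[name], **record}

pseudo_records = [pseudonymize(dict(r)) for r in records]
print(pseudo_records)  # records now carry opaque IDs instead of names
```

Note that this is pseudonymization, not anonymization: anyone holding id_map can reverse it, which is exactly why the GDPR still treats such data as personal data.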
Health Insurance Portability and Accountability Act (HIPAA)
• The Health Insurance Portability and Accountability Act (HIPAA) is a US law
that protects patients’ healthcare information by regulating the flow of such
information.
• Established in 1996.
• “Covered entities,” which include physicians, healthcare providers, hospitals,
etc., may not disclose protected health information (PHI) without patients’
consent.
• Although it was originally considered over-complicated, “HIPAA has
accomplished its primary objective: making patients feel safe giving their
physicians and other treating clinicians sensitive information while
permitting reasonable information flows for treatment, operations, research,
and public health purposes.” – Cohen and Mello (2018)
Cohen, I. G., & Mello, M. M. (2018). HIPAA and protecting health information in the 21st century. Jama, 320(3), 231-232.
Health Insurance Portability and Accountability Act (HIPAA)
• To analyze, study, and train machine learning models on healthcare
data, we need to collect data while following privacy regulations.
• HIPAA allows researchers to obtain protected health information
(PHI) without patient consent if an IRB or a privacy board determines
that there is minimal risk from the research and obtaining consent
would be impractical.
• Datasets for which a designated set of 18 types of PII have been
removed, in order to anonymize the data, can be shared for research
or commercial purposes.
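As a toy illustration of the removal step (this shows only an invented handful of fields, not the actual 18 Safe Harbor identifier categories, and is not legal guidance):

```python
# Illustrative subset of identifier-type field names; the real Safe Harbor
# rule designates 18 categories (names, geographic subdivisions, dates, etc.).
IDENTIFIER_FIELDS = {"name", "ssn", "phone", "email", "address"}

def strip_identifiers(record):
    """Drop fields whose names match the designated identifier types."""
    return {k: v for k, v in record.items() if k not in IDENTIFIER_FIELDS}

patient = {"name": "Alice", "ssn": "000-00-0000", "age": 34, "diagnosis": "asthma"}
print(strip_identifiers(patient))  # {'age': 34, 'diagnosis': 'asthma'}
```

As the next slides show, removing designated identifiers like this does not by itself prevent re-identification.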
Anonymization
• Can’t we just anonymize the data by removing or
masking identifiers such as names?
• Anonymizing data is difficult
• A study found that 87% of Americans can be identified
from just zip code, birth date and sex
• As dimensions / attributes increase,
pretty quickly you become unique
Sweeney, L. (2000). Simple demographics often identify people uniquely. Health (San
Francisco), 671(2000), 1-34.
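Sweeney's observation can be checked on any dataset by measuring the fraction of records that are unique on a chosen attribute combination. A minimal sketch with invented data:

```python
from collections import Counter

# Invented sample: (zip, birth_date, sex) per person.
people = [
    ("21250", "1990-03-01", "F"),
    ("21250", "1990-03-01", "M"),
    ("21043", "1985-07-12", "F"),
    ("21043", "1985-07-12", "F"),
    ("21208", "1972-11-30", "M"),
]

counts = Counter(people)
unique = sum(1 for c in counts.values() if c == 1)
print(f"{unique / len(people):.0%} unique on (zip, birth date, sex)")
# Sweeney (2000) found roughly 87% of Americans were unique on these three.
```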
34
Anonymization Fails
Anonymized Netflix data + public IMDB data = identified Netflix data
(Narayanan and Shmatikov, 2008)
Narayanan, A., & Shmatikov, V. (2008, May). Robust de-anonymization of large sparse
datasets. In 2008 IEEE Symposium on Security and Privacy (sp 2008) (pp. 111-125). IEEE.
35
Linkage Attacks
• The Netflix Prize data was re-identified via a
linkage attack (a.k.a. linking attack): matching
the attributes in an anonymized dataset
(Netflix) with background knowledge from
another dataset (IMDB).
• The simplest linkage attack is to simply look
for a match between the anonymized dataset
and the background knowledge dataset, when
looking at the potentially relevant features
that remain after removing explicit identifiers.
• These are called quasi-identifiers: a set of
attributes that, in combination, can uniquely
identify individuals.
• The approach used on the Netflix Prize was slightly more sophisticated, and does not require knowing which attributes are quasi-identifiers, but it was along these same lines.
Privacy researcher Latanya Sweeney was able to identify
the Governor of Massachusetts in a healthcare dataset by
matching his zip code, birth date, and sex, which were
known, to a record in the dataset.
Sweeney, L. (2002). k-anonymity: A model for protecting privacy. International journal of
uncertainty, fuzziness and knowledge-based systems, 10(05), 557-570.
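The simplest linkage attack described above amounts to an exact join on the quasi-identifiers. A hedged sketch (all names and values invented; in Sweeney's real attack the background data was a public voter roll):

```python
# "Anonymized" release: explicit identifiers removed, quasi-identifiers kept.
medical = [
    {"zip": "02138", "birth": "1945-07-31", "sex": "M", "diagnosis": "hypertension"},
    {"zip": "02139", "birth": "1962-02-14", "sex": "F", "diagnosis": "asthma"},
]

# Public background knowledge, e.g. a voter roll that includes names.
voters = [
    {"name": "J. Doe", "zip": "02138", "birth": "1945-07-31", "sex": "M"},
]

QUASI = ("zip", "birth", "sex")

def link(released, background):
    """Re-identify by matching quasi-identifier values across the datasets."""
    for r in released:
        for b in background:
            if all(r[q] == b[q] for q in QUASI):
                yield b["name"], r["diagnosis"]

print(list(link(medical, voters)))  # [('J. Doe', 'hypertension')]
```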
k-anonymity
• k-anonymity is an early and direct attempt to address linkage attacks.
• Goal: release a version of a dataset that is safe from re-identification
while still being useful for analysis.
• (Note: this approach does not actually guarantee that this is achieved.)
Sweeney, L. (2002). k-anonymity: A model for protecting privacy. International journal of
uncertainty, fuzziness and knowledge-based systems, 10(05), 557-570.
k-anonymity
• A release of a dataset is said to satisfy k-anonymity if and only if each
sequence of quasi-identifier values occurs at least k times in the
release.
• The idea is that if an adversary matches their background knowledge
to a sequence of quasi-identifier values, they won’t know which of
the k or more individuals with those same quasi-identifier values that
it refers to.
• So individuals can’t be uniquely identified via linkage attacks.
Sweeney, L. (2002). k-anonymity: A model for protecting privacy. International journal of
uncertainty, fuzziness and knowledge-based systems, 10(05), 557-570.
k-anonymity
• In this example, race, birth, gender, and zip are the quasi-identifiers, while
problem is designated not to be a quasi-identifier.
• problem is considered a sensitive attribute (the adversary is assumed not to know it but wants to discover it by re-identifying the data)
• By counting occurrences of the quasi-identifier vectors, we see that k = 2.
Sweeney, L. (2002). k-anonymity: A model for protecting privacy. International journal of
uncertainty, fuzziness and knowledge-based systems, 10(05), 557-570.
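The example table itself is not reproduced in this preview, but the calculation is simple: group the released rows by their quasi-identifier vector and take the size of the smallest group. A sketch with invented rows:

```python
from collections import Counter

# Invented release; (race, birth, gender, zip) are the quasi-identifiers,
# "problem" is the sensitive attribute.
rows = [
    ("Black", "1965", "f", "21043", "hypertension"),
    ("Black", "1965", "f", "21043", "hypertension"),
    ("White", "1964", "m", "21250", "chest pain"),
    ("White", "1964", "m", "21250", "obesity"),
]

counts = Counter(r[:4] for r in rows)  # group rows by quasi-identifier vector
k = min(counts.values())               # k = size of the smallest group
print(f"This release satisfies {k}-anonymity")  # prints 2 for this data
```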
k-anonymity
• k-anonymity can be achieved by doing the following steps (sketched in code below):
• Determining which of the attributes are explicit identifiers, quasi-identifiers,
or non-identifiers.
• Suppressing all explicit identifiers
• Processing the quasi-identifiers to achieve the desired k. E.g.,:
• Suppression: Replacing one or more values of attributes with a dummy value, *
• Generalization: coarsening the granularity of an attribute, e.g. by binning it into certain
ranges.
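A toy sketch of the suppression and generalization operations above, applied to a single invented record (field names hypothetical):

```python
def suppress(value):
    """Suppression: replace a value outright with a dummy symbol."""
    return "*"

def generalize_zip(zipcode, keep=3):
    """Generalization: coarsen a ZIP code by masking trailing digits."""
    return zipcode[:keep] + "*" * (len(zipcode) - keep)

def generalize_birth(date_str):
    """Generalization: keep only the year of a YYYY-MM-DD date."""
    return date_str[:4]

record = {"name": "Alice", "zip": "21250", "birth": "1990-03-01", "sex": "F"}
released = {
    "name": suppress(record["name"]),           # explicit identifier: suppressed
    "zip": generalize_zip(record["zip"]),       # -> "212**"
    "birth": generalize_birth(record["birth"]), # -> "1990"
    "sex": record["sex"],                       # left as-is
}
print(released)
```

In practice one applies such transformations across the whole table, coarsening just enough that every quasi-identifier vector occurs at least k times.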
k-anonymity
• Limitations of k-anonymity:
• If you don’t correctly determine the quasi-identifiers, or the adversary knows
more than the designated quasi-identifiers, the method will fail completely.
• It’s difficult to achieve in high dimensions.
• It’s vulnerable to several attacks.
Machanavajjhala, A., Kifer, D., Gehrke, J., & Venkitasubramaniam, M. (2007). l-diversity: Privacy beyond k-anonymity. ACM
Transactions on Knowledge Discovery from Data (TKDD), 1(1), 3-es.
Homogeneity attack
Exploit a lack of diversity in the sensitive attributes.
If Alice is Race = Black, Birth = 1965, gender = f, we can’t tell whether she’s t3 or
t4 but we can conclude that she has hypertension.
Machanavajjhala, A., Kifer, D., Gehrke, J., & Venkitasubramaniam, M. (2007). l-diversity: Privacy beyond k-anonymity. ACM
Transactions on Knowledge Discovery from Data (TKDD), 1(1), 3-es.
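The homogeneity attack can be detected mechanically by counting the distinct sensitive values within each group of identical quasi-identifier vectors; this is the intuition behind l-diversity (Machanavajjhala et al., 2007). A sketch with invented rows mirroring the slide's example:

```python
from collections import defaultdict

# Invented 2-anonymous release: (race, birth, gender, zip, problem).
rows = [
    ("Black", "1965", "f", "21043", "hypertension"),
    ("Black", "1965", "f", "21043", "hypertension"),  # homogeneous class: leaks!
    ("White", "1964", "m", "21250", "chest pain"),
    ("White", "1964", "m", "21250", "obesity"),
]

sensitive_by_class = defaultdict(set)
for *quasi, sensitive in rows:
    sensitive_by_class[tuple(quasi)].add(sensitive)

for quasi, values in sensitive_by_class.items():
    if len(values) == 1:
        print(f"Class {quasi} is homogeneous: everyone has {next(iter(values))}")
```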
Background Knowledge Attack
Exploit background knowledge on the sensitive attributes to narrow the possibilities.
If Bob is Race = White, Birth = 1964, gender = m, and we know from background knowledge
that Bob is probably not obese (e.g., Bob is a fitness trainer), then we can conclude that Bob
has chest pain.
Machanavajjhala, A., Kifer, D., Gehrke, J., & Venkitasubramaniam, M. (2007). l-diversity: Privacy beyond k-anonymity. ACM
Transactions on Knowledge Discovery from Data (TKDD), 1(1), 3-es.
Federated Learning
• Machine learning models trained on user data need data
from many people, raising privacy concerns
• What if training could be done without ever sending our
data to the server?
• Federated learning aims to accomplish this.
• Data stays on local devices, e.g. smartphones
• Transmit only local models / parameters / gradients, etc., computed from the data on the device, instead of the data itself.
• Typically, a central server coordinates learning, although
decentralized federated learning is also possible.
• Advantage: data never leaves local devices, limiting
obvious information disclosure
• Disadvantages:
• Data still could be reverse-engineered from the models unless
additional steps are taken
• Vulnerable to data poisoning attacks from the local devices, unless
additional steps are taken
• Difficult to monitor or prevent discriminatory biases
• Requires substantial communication costs
• Next week: we will discuss differential privacy, which
provides provable and meaningful mathematical
guarantees on privacy
Image credit: Federated learning. (2023, November 8). In Wikipedia. https://en.wikipedia.org/wiki/Federated_learning
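A minimal FedAvg-style sketch of the coordination pattern described above, using NumPy and synthetic linear-regression data; this illustrates the parameter-averaging idea only, not a production protocol (no secure aggregation, client sampling, or defenses against the attacks noted above):

```python
import numpy as np

rng = np.random.default_rng(0)

# Five hypothetical devices, each holding private (X, y) data that never leaves it.
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(5)]

w = np.zeros(3)           # global model parameters, held by the server
for _ in range(50):       # communication rounds
    local_models = []
    for X, y in clients:  # each client updates the model on its own data
        grad = X.T @ (X @ w - y) / len(y)    # local least-squares gradient
        local_models.append(w - 0.1 * grad)  # one local gradient step
    w = np.mean(local_models, axis=0)        # server averages parameters only

print("global weights:", w)
```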
Think-Pair-Share: Privacy in a Jogging App
• Suppose that you work for a company that built JogAthlon, a
smartphone app that tracks users’ jogging activities including running
routes, times, step counts, calories burned, and progress in meeting
fitness goals.
• How do both users and the app company benefit from keeping track of
the app users’ data?
• Users are motivated by following their progress in losing weight, etc.
• Create fun and motivating challenges
• Gamification
• Company benefits from targeted advertising and from selling data to third parties
• Even without targeted ads, if the customer likes it, it’s good for business
• How will you appropriately ensure privacy for the app users’ data?
Think-Pair-Share: Privacy in a Jogging App
• How will you appropriately ensure privacy for the app users’ data?
• Keep it secure – encryption
• Anonymize the data
• Don’t use real names with released data
• Phone permission settings
• Opt out instead of opt in to revealing information
• Lower granularity of route data
• Should the app follow HIPAA-style protections?
HIPAA Regulations — A New Era of Medical-Record Privacy?
George J. Annas, J.D., M.P.H.
Although the regulations under the Health Insurance Portability and Accountability Act of 1996
(HIPAA) regarding the privacy of medical records
are new,1 the concept of using federal law to protect
the privacy of medical records is not. The substance
of the new regulations can be traced back to work
done in the 1970s, especially the report of the Privacy Protection Study Commission, which helped to
articulate the case for national privacy standards for
a variety of records kept on citizens.2 The Clinton
Health Security Act contains a separate section entitled “Privacy of Information” that sets forth the
framework for the national standards created by the
HIPAA regulations.3 Provisions for the privacy of
medical records became part of HIPAA,4 which authorized the secretary of Health and Human Services to promulgate regulations to protect the privacy
of health information in which the patient is identifiable in the event that Congress did not enact legislation on this subject (which it did not).
In the context of the Clinton health plan, rules for
the privacy of medical records were part of a much
broader package whose main aim was to provide access to health insurance for all Americans. Now regulations for medical-record privacy have arrived
alone. I believe the new regulations are excessively
and unnecessarily complex and often more attuned
to making sure that businesses and government
agencies get access to medical records than to the
protection of patients’ privacy. The debate over the
content and effect of the HIPAA regulations has
been fierce over the past four years and is likely to
intensify in the post–September 11 era of surveillance, which has brought even more proposals to
authorize virtually unlimited access to medical records by national security, law-enforcement, and
public health agencies.5-7
A new cadre of HIPAA consultants has grown up
in the past few years, and hospitals, health plans,
and many physician-run practices have found their
help essential in understanding how to comply with
the new regulations. This need arises because although the core principles behind the regulations
are readily understandable, the regulations themselves are long, complex, and overlaid with commentary. Moreover, we have been through three different versions of the “final” regulations in the past
two years, and there will undoubtedly be more
changes as they are implemented.
My purpose in this article is not to provide an in-depth analysis or critique (the regulations are filled
with compromises, and few people are entirely happy with them), but rather to provide a basic summary aimed primarily at the practicing physician.
Whatever one’s view of the HIPAA regulations,
they will form the starting point for future national regulation of medical privacy. In this sense,
they are akin to movie contracts, about which one
Hollywood executive is reported to have said, “We
have to have a contract so we have a basis for renegotiation.”
principles of privacy
It has been foundational, at least since Hippocrates,
that patients have a right to have personal medical
information kept private. Physicians have an obligation to keep medical information secret. The chief
public-policy rationale is that patients are unlikely to
disclose intimate details that are necessary for their
proper medical care to their physicians unless they
trust their physicians to keep that information secret. Basic privacy doctrine in the context of medical
care holds that no one should have access to private
health care information without the patient’s authorization and that the patient should have access
to records containing his or her own information,
be able to obtain a copy of the records, and have the
opportunity to correct mistakes in them.8
The HIPAA regulations can be seen as an overly
complicated way of applying these basic privacy
rules in an era of electronic communication, large
health plans, and fierce marketing campaigns.
Compliance is required by April 14, 2003, and the
regulations apply to both electronic and paper records. A physician is covered by the regulations (becomes a “covered entity,” in the language of the regulations) if he or she conducts any medical business,
including billing, electronically, even if the physician contracts with another entity or business associate to do billing. This means that most practicing
physicians will be covered, since most physicians accept private health insurance, are members of one
or more health plans, receive payment from Medicare or Medicaid, or otherwise do business electronically.
All of the HIPAA rules include an implicit requirement that the amount of individually identifiable health information released or requested for
any specific purpose — except for disclosures authorized by the patient, disclosures to another
health care provider involved in treatment, or disclosures required by law — be the “minimum necessary” to accomplish the purpose. This means that
outside the context of treatment, a patient’s entire
medical record can seldom be lawfully disclosed
without the patient’s written authorization.
The HIPAA regulations set a federal minimum,
or floor, not a ceiling, on the protection of privacy.
Thus, when other federal laws (such as laws protecting drug and alcohol treatment records) or state
laws (such as laws that provide special protections
for mental health or genetic records) provide more
protection for patients’ privacy than the new regulations, the more protective federal and state laws
will continue to govern. In addition, state law continues to govern parent–child relationships, the
rights of children, and the definitions of emancipated and mature minors. Federal regulations cannot change a state’s family law or its informed-consent laws, even if the Department of Health and
Human Services wanted to do so. Of course, the
continued importance of state law means that the
regulations ultimately fail to produce a real national
standard of medical privacy — because the application of the regulations can and will vary from state
to state.
the privacy notice
Few Americans have any idea what is done with their
medical records, and probably fewer still believe
they can have any control at all over who uses them.
There are certainly computer experts who share the
view that personal control of private information is
an illusion in the computer age and that privacy is
already dead. The HIPAA regulations reject this view
and instead aim to inform and educate patients
about their privacy rights. That is why all patients
must be provided with a privacy notice. The regulations require that each patient be provided with a
written “notice of privacy practices” on the day of
the first delivery of health services after the regulations become effective and that the notice itself be
prominently posted at the service site.
The privacy notice must tell the patient who will
be able to see and use the patient’s medical records,
what uses will require the patient’s specific authorization, and that patients have the right to inspect,
copy, and amend their medical records and to obtain
an accounting of disclosures. The notice must also
contain the name, title, and telephone number of a
person or office to contact, usually designated as a
privacy officer (this person could be the physician’s
office manager, for example), for further information. A good-faith effort must also be made to obtain
the patient’s written acknowledgment of receipt of
the notice. Most notably, and contrary to an earlier
proposal that the patient’s consent be required for
all uses of the medical record, patients are simply informed in this notice that their medical records can
be disclosed for uses related to treatment, payment,
or “health care operations” without any additional
notification or authorization. At least one example
of each of these uses must also be provided in the
notice.
Treatment-related uses of the medical record
have always been a reasonable expectation on the
part of both physicians and patients, although people have been genuinely surprised to learn how
many members of a hospital staff have routine access to their medical records.8,9 Use of medical records for payment-related purposes has historically
required patients’ authorization, but this usually involved the simple signing of a form in the waiting
room, and refusal to sign meant that a patient had to
pay out of pocket, so there has never been any real
choice in this matter. “Health care operations,” as
defined in the regulations, is a much broader category and includes such uses as quality assessment
(other than research), performance evaluation, the
conduct of training programs, the rating of premiums, auditing, business planning, and management. This is a compromise. Privacy advocates generally favor the earlier version of the rule that
required consent for all uses of the medical record,
including treatment.10 Under the new rule, providers can still obtain consent, but why would a provider do so if a notice alone is sufficient? On the
other hand, since even the earlier rule permitted
physicians to require that the patient consent to the
use of the medical record for these purposes as a
condition of treatment, voluntary consent was never really required.
authorization to disclose
health information
Under the terms of HIPAA, a valid authorization to
release health information must contain at least the
following: “a description of th