Informatics Question

Description

Explain whether you agree or disagree with the prompt attached in the PDF. Make sure to use the concepts from the lecture slides, which I have also attached. Make sure all citations are correctly formatted, and check the rubric; there are three parts you need to follow.


Unformatted Attachment Preview

Module Summary 02
Due Friday by 11:59pm
Points 10
Submitting a file upload
Introduction
This end-of-module assignment is an overview of what you have learned. In a short essay (roughly 500 to 750 words), consider the following prompt:
Instructions
In an essay published in the MIT Technology Review
(https://www.technologyreview.com/2023/04/20/1071291/learn-to-code-legacy-new-projects-education/) about the failure of past efforts to address disparities in computing by teaching computer programming to groups marginalized by race, gender, class, or other socially consequential forms of difference, historian of computer science Joy Lisi Rankin writes, “Learning to code won’t solve inequality or poverty or remedy the unjust structures and systems that shape contemporary American life.” Based on some of the readings we have done in this unit, which concern both historical and contemporary sex-based discrimination in technical labor, do you agree with what she is arguing?
Work Cited:
Rankin, J. L. (2023, April 20). Learning to code isn’t enough. MIT Technology Review. https://www.technologyreview.com/2023/04/20/1071291/learn-to-code-legacy-new-projects-education/
Try to use some of the terms we learned in this section of the course (gender, feminization of labor, et al.). Make sure to correctly cite any readings you refer to in support of your explanation and argumentation. Be creative and write beautifully. Refer to Purdue OWL (https://owl.purdue.edu/) for citation information, or visit the campus Writing Center (http://www.writingcenter.uci.edu/).
Module Summary Rubric

Voice, grammar, syntax (3 pts): General marks of quality of writing in an upper-division undergraduate course. (Full Marks: 3 pts; No Marks: 0 pts)
Concept (5 pts): Shows good understanding of the material covered in the course, including vocabulary, theory, research methods, and historical context. (Full Marks: 5 pts; No Marks: 0 pts)
Citation (2 pts): Uses published papers appropriately. (Full Marks: 2 pts; No Marks: 0 pts)

Total Points: 10

INF 161: Social Analysis of Computing
Roderic N. Crooks, Ph.D.
Week 06. Fall 2021.
[email protected]
1. GREATEST HITS.
2. SURVEILLANCE READINGS.
3. NO CLASS WEDNESDAY.
• Digital data is ubiquitous; its uses were originally described in
terms of science, but data have become infrastructural to a variety
of human undertakings.
• Datafication names a certain variety of technological change that
is both material and discursive.
• Surveillance: most forms of digital technology produce traces of
human activity which can be sorted, analyzed, retrieved or acted
upon for purposes of control.
• Data do not speak for themselves: they require context, expertise,
and interpretation in order to be meaningful.
• Terms: datafication, performativity, dataveillance, surveillance,
surveillance capitalism.
Gray, J. (2009). Jim Gray on eScience: A transformed scientific method. In T. Hey, S. Tansley, & K. Tolle (Eds.), The Fourth Paradigm: Data-Intensive Scientific Discovery. Microsoft Research. http://research.microsoft.com/en-us/collaboration/fourthparadigm/contents.aspx
“But the Internet can do more than just make available the full text of research
papers. In principle, it can unify all the scientific data with all the literature to create
a world in which the data and the literature interoperate with each other [Figure 3 on
the next page]. You can be reading a paper by someone and then go off and look at
their original data. You can even redo their analysis. Or you can be looking at some
data and then go off and find out all the literature about this data. Such a capability
will increase the “information velocity” of the sciences and will improve the
scientific productivity of researchers. And I believe that this would be a very good
development!” (xxv).
Discourse
• Dictionary: “written or spoken communication or debate”; “a formal
discussion of a topic in speech or writing”; “a connected series of
utterances; a text or conversation.”
• Foucault: discourse is enunciative. Saying is a form of doing.
• The Archaeology of Knowledge: “Thus the statement circulates, is used, disappears,
allows or prevents the realization of a desire, serves or resists various
interests, participates in challenge and struggle, and becomes a theme of
participation or rivalry” (p. 105).
• Discourse is the imaginary field where all possible statements exist.
• Discourse determines what it is possible to say and orders statements
relative to one another (scientific discourse versus ethical discourse).
Poster, M. (1994). The mode of information and postmodernity. In D. Crowley &
D. Mitchell (Eds.), Communication theory today (pp. 173–192). Stanford, CA:
Stanford University Press.
• Of Databases: “Government and corporate databases contain
information about individuals. These electronic profiles can be
combined, dissected, bought and sold, and otherwise manipulated
without the knowledge or consent of the individual subject.
Computerized databases are actually identity-producing engines”
(178).
• Uses Foucault’s ideas about the Panopticon to argue that databases construct a “super panopticon.”
Andersen, R. (2020). The Panopticon is already here. The Atlantic. September 2020.
• China has recently embarked on a number of ambitious infrastructure projects abroad—
megacity construction, high-speed rail networks, not to mention the country’s much-vaunted Belt and Road Initiative. But these won’t reshape history like China’s digital
infrastructure, which could shift the balance of power between the individual and the
state worldwide.
• Yi Zeng: “Many of us technicians have been invited to speak to the government, and even
to Xi Jinping, about AI’s potential risks,” he said. “But the government is still in a learning
phase, just like other governments worldwide.”
• China is an ideal setting for an experiment in total surveillance. The country is home to
more than 1 billion mobile phones, all chock-full of sophisticated sensors. Each one logs
search-engine queries, websites visited, and mobile payments, which are ubiquitous.
Andersen, R. (2020). The Panopticon is already here. The Atlantic. September 2020.
• The government might soon have a rich, auto-populating data profile for all of its 1
billion–plus citizens. Each profile would comprise millions of data points, including the
person’s every appearance in surveilled space, as well as all of her communications and
purchases.
• Surveillance produced from large stores of data is an international project. What happens
in one country’s tech sector affects citizens all over the world. America’s police
departments have begun to avail themselves of footage from Amazon’s home-security
cameras.
• The nanny apps work in tandem with the police, who spot-check phones at checkpoints,
scrolling through recent calls and texts. Even an innocent digital association—being in a
group text with a recent mosque attendee, for instance—could result in detention.
Haggerty, K. & Tetrault, J. (2017). Surveillance. The Wiley Blackwell
Encyclopedia of Social Theory.
• Surveillance is varied: any attempt to collect information or data
to sort, control, or know a population.
• Central theory is Foucault. Central fictional references are Orwell’s
1984 and Kafka’s The Trial.
• “Haggerty and Ericson (2000) borrow some theoretical tools from Deleuze and his collaborator Félix Guattari to suggest that surveillance now operates as a form of assemblage. They draw attention to the totality of surveillance that we are now under, and how this monitoring operates through heterogeneous networks of people and things” (2).
Zuboff (2019).
• Polemical text condemns privacy incursions and exploitative
models of accumulation used by tech.
• Argues that individual actions accumulate to control and lessen
humanity.
• Synthesizes many kinds of qualitative scholarship: law, policy,
economics, interviews, CSCW.
Zuboff (2019).
• “Big Data”: Not an inevitable technology effect, but a component
in a new logic of accumulation.
• New market forms & logics of accumulation. The factory produced
industrial capitalism; large corporations produced managerial
capitalism. Information technology produces informational
society. “Google is to surveillance capitalism what General Motors
was to managerial capitalism.”
Zuboff (2019).
• “Surveillance capitalism” = extraction of data from people based
on their online activity, such as internet searches.
• This data extraction increases steeply as we use mobile phones,
wearable devices and in-home artificial intelligence. The data is
captured, aggregated, analyzed and turned into “surveillance
assets.”
• The capital invested in “surveillance assets” is “surveillance
capital.”
Zuboff (2019).
• What is this data used for?
• A small amount of this data is used to improve services and
devices.
• Surplus Data (Data Exhaust) is not limited to passive activities or to ancillary collections, e.g., giving away free products/services that harvest data, such as wi-fi, email, personalization.
• Data is aggregated into predictive models and later sold to
businesses for profit: examples, reCAPTCHA, email scanning, voice
assistants. Google became an industry leader by monetizing data
exhaust.
Zuboff (2019).
• Big Other (tech giants that operate via this logic) largely immune
from contract law, regulation. Surveillance capitalism is not a
relation of equals.
• Need for trust eliminated. Replaced by one-sided arrangements
that individual users cannot meaningfully impact.
• Surveillance is a means of sorting and controlling groups.
• Platforms can operate (via UI and UX) beneath users’ conscious decision-making capacities = “anticipatory conformity.”
Zuboff (2019).
• Asymmetry of knowledge. Big Other knows more about people
than they are aware of or can admit.
• Typical user has no knowledge of Google’s business model, the
types and granularity of data collected, and what is done with
such data (i.e., how it is monetized and how much capital
accumulates to the platform).
• Political consequences for Western democracies: governments use data to nudge citizens toward their own interests (just like Facebook and Google do).
• Unaccountable platforms interfere in governance: Cambridge
Analytica, propaganda, misinformation.
Crooks, R. (2019). Cat-and-mouse games:
Dataveillance and performativity in urban
schools. Surveillance & Society, 17(3/4),
484–498.
• “This paper focuses on the responses of teachers and students in
a South Los Angeles public high school to dataveillance regimes
that were meant to control specific behaviors. Over a period of
two years, a newly deployed one-to-one tablet computer program
supported the integration of dataveillance regimes with previously
established modes of pursuing teacher and student
accountability.”
PART TWO. Organization.
Star and Bowker’s (2005) “moving
target” of infrastructure.
Ms. Wilson: “When you do Google
Forms or Google Doc I have to
make a Google Doc for each
period and then maybe even drill
down more, so per period, per
group. So maybe I’ve got 18
Google Docs, and I don’t want
that. So then, you have to figure, is
this an assignment or that I can
manage? So then, you’ve got 17
Google Docs and you’re like,
‘That’s too much.’”
[Photo by Soraya Quezada, Remedial Math/Tech Support: “mass user-unfriendliness”]
Crooks, R. (2019).
• Dataveillance—the systematic collection and analysis of digital
data for the purposes of controlling a targeted group—gives
authorities new ways to sort and manage student populations, to
reward or punish the work of teachers, and to field-test various
kinds of commercial software (484).
• Dataveillance—“data collection as a way of managing or governing a certain population”—has drawn into question some of the guiding explanatory theories of surveillance (486).
Crooks, R. (2019).
• Performativity: a way of thinking about the digital data that make
dataveillance possible.
• The creation of norm and deviance under the guise of a
naturalized, assumed representational capacity.
• The workings of symbolic systems to confer normative statuses
and how those statuses relate to existing relations of power.
• Two examples: teacher pay and college applications.
Crooks, R. (2019).
• “The object lesson here is that omniscient surveillance is a fiction:
real surveillance regimes depend on interpretation, even in highly
automated systems. Digital data do not merely represent some
reality that is waiting to be categorized; instead, they dynamically
order and reorder the world. In a performative analysis, digital
data are made to stand in relation to complex phenomena via
interpretive moves. Performativity then provides a way for making
disputes about the meanings of data useful” (495).
Crooks, R. (2019). Cat-and-mouse games:
Dataveillance and performativity in urban
schools. Surveillance & Society, 17(3/4),
484–498.
• Dataveillance: aggregating, analyzing, visualizing data for purposes of
control, e.g., the social sort.
• Combined with many hi-tech and low-tech practices.
• Performativity: the ability of data to confer statuses it seems to merely
represent, especially norm and deviance.
• Urban schools: a term of art from education research. Schools that
serve minoritized communities.
Crooks, R. (2019).
• Data do not stand in automatic mimetic representation to people,
places, things, or processes; rather, they are MADE to represent people,
places, and things.
• Data are interpreted by human and non-human agents; this gives us
the potential to resist the control that Panoptic theories of surveillance
predict.
• Build a better mouse trap…
10/28
van Dijck, J. (2014). Datafication, dataism and dataveillance: Big Data between scientific paradigm and
ideology. Surveillance & Society, 12(2), 197–208.
Whittaker, M. (2020, November 2). Who am I to decide when algorithms should make important
decisions? The Boston Globe.
Cifor, M., Garcia, P., Cowan, T.L., Rault, J., Sutherland, T., Chan, A., Rode, J., Hoffmann, A.L., Salehi, N., &
Nakamura, L. (2019). Feminist Data Manifest-No. Retrieved from: https://www.manifestno.com/.
STOP LAPD Spying Coalition. (2021). A New AI Lexicon: Surveillance / The Ghosts of White Supremacy in AI Reform. Medium. https://medium.com/a-new-ai-lexicon/a-new-ai-lexicon-surveillance-8a4231ef8359
van Dijck, J. (2014). Datafication, dataism
and dataveillance: Big Data between
scientific paradigm and ideology.
Surveillance & Society, 12(2), 197–208.
• Metadata: data about data. Description of what is in a certain
collection of data (e.g., a timestamp, geolocation, or size of a file).
• Datafication: the transformation of sociality by forms of
computing (e.g., social media as a transformation of friendship).
• Dataism: belief and trust needed to make datafication work.
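As a concrete aside (a hypothetical illustration, not from the readings), a minimal Python sketch of metadata in van Dijck's sense: using only the standard library, it reads a file's descriptive attributes, such as size and modification timestamp, without reading the file's contents.

    # Minimal sketch: metadata as "data about data."
    # os.stat returns descriptive attributes of a file (size, timestamps)
    # without touching the file's contents.
    import datetime
    import os

    info = os.stat(__file__)  # inspect this script's own metadata
    print("size in bytes:", info.st_size)
    print("last modified:", datetime.datetime.fromtimestamp(info.st_mtime))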
van Dijck, J. (2014).
• “Datafication is rooted in problematic ontological and
epistemological claims. However compelling some examples of
applied Big Data research, the ideology of dataism shows
characteristics of a widespread belief in the objective
quantification and potential tracking of all kinds of human
behavior and sociality through online media technologies.
Besides, dataism also involves trust in the (institutional) agents
that collect, interpret, and share (meta)data culled from social
media, internet platforms, and other communication technologies”
(198).
van Dijck, J. (2014).
• On dataveillance: “Dataveillance—the monitoring of citizens on the
basis of their online data—differs from surveillance on at least one
important account: whereas surveillance presumes monitoring for
specific purposes, dataveillance entails the continuous tracking of
(meta)data for unstated preset purposes” (205).
Whittaker, M. (2020, November 2). Who
am I to decide when algorithms should
make important decisions? The Boston
Globe.
• Even as evidence of artificial intelligence’s unevenly distributed harms and benefits mounts, the question of when it is appropriate to allow algorithms to make important decisions persists.
• Those who are subject to these systems are generally ignored in
conversations about algorithmic governance and regulation. And when they
are included, it’s often as a token or stereotype, and not as someone whose
expertise is understood as central to AI decision-making.
Whittaker, M. (2020, November 2).
• The tech industry is extraordinarily concentrated. In the United
States, this means that a handful of firms in Silicon Valley are at
the heart of AI development, including building algorithmic
models they license to third parties or leasing the infrastructure
for AI startups who build their own. And those developing this
technology are a homogenous group: predominantly white and
male, hailing from elite universities, and possessing rare technical
training.
• We won’t know the actual answer to when it is appropriate to use
algorithms until the people who are most affected have the power
to answer that very question.
Only a scant number of publications has challenged the metaphor and the myth of the STEM pipeline (e.g., Cannady
et al., 2014; H. Metcalf, 2010), and these efforts do not challenge the anti-inclusive design of STEM education and
participation that at its core is structurally racist. After all, White men constitute about half of the scientists and
engineers employed in science and engineering occupations. If we include Asian men, the percentage rises to 66%,
and when we add White and Asian women to that group, the number skyrockets to 88% (National Center for Science
and Engineering Statistics, 2019).
Cifor, M., Garcia, P., Cowan, T.L., Rault, J., Sutherland, T., Chan, A., Rode, J., Hoffmann, A.L., Salehi, N., & Nakamura, L. (2019). Feminist Data Manifest-No.
1. We refuse to operate under the assumption that risk and harm
associated with data practices can be bounded to mean the same
thing for everyone, everywhere, at every time. We commit to
acknowledging how historical and systemic patterns of violence and
exploitation produce differential vulnerabilities for communities.
Cifor, M., et al. (2019).
2. We refuse to be disciplined by data, devices, and practices that
seek to shape and normalize racialized, gendered, and differently-abled bodies in ways that make us available to be tracked,
monitored, and surveilled. We commit to taking back control over
the ways we behave, live, and engage with data and its
technologies.
3. We refuse the use of data about people in perpetuity. We commit
to embracing agency and working with intentionality, preparing
bodies or corpuses of data to be laid to rest when they are not being
used in service to the people about whom they were created.
Cifor, M., et al. (2019).
4. We refuse to understand data as disembodied and thereby
dehumanized and departicularized. We commit to understanding
data as always and variously attached to bodies; we vow to
interrogate the biopolitical implications of data with a keen eye to
gender, race, sexuality, class, disability, nationality, and other forms
of embodied difference.
5. We refuse any code of phony “ethics” and false proclamations of
transparency that are wielded as cover, as tools of power, as forms
for escape that let the people who create systems off the hook from
accountability or responsibility. We commit to a feminist data ethics
that explicitly seeks equity and demands justice by helping us
understand and shift how power works.
STOP LAPD Spying Coalition. (2021). A
New AI Lexicon: Surveillance / The Ghosts
of White Supremacy in AI Reform.
Medium.
• By Noopur Raval and Amba Kak
• In January 2021, we launched ‘A New AI Lexicon:’ a call for
contributions to generate alternate narratives, positionalities, and
understandings to the better known and widely circulated ways of
talking about AI. In our introductory call, we identified the
contours and politics of the critical AI space, as well as the
silences and erasures they end up producing.
STOP LAPD Spying Coalition. (2021).
• This essay shows how data-driven technologies help obfuscate the
institutions that decide who is policed. Police practices are never
determined by the State alone but have instead been shaped by a
range of actors, including nonprofit consultants, private industry,
real estate developers, and the propertied class, as well as by
academics that lay the intellectual framework for racist violence
like “Broken Windows policing.”
STOP LAPD Spying Coalition. (2021).
• Today, policing is among the U.S.’s most violent methods of racial
control.
• Efforts to improve, refine, and calibrate police surveillance
through police reform continue the tradition of white supremacist
experimentation.
• Reformism is especially dangerous when applied to practices such
as “data-driven” policing because it strengthens the state’s ability
to claim objectivity in its violence.
• Terms like “oversight,” “accountability,” and “transparency” have
proliferated alongside harmful surveillance technologies, helping
to normalize and expand them.
STOP LAPD Spying Coalition. (2021).
• Agencies like the Los Angeles Police Department (“LAPD”) exist to
enforce white supremacy, “spending their yearly billions to pioneer
new ways of targeting, controlling, and dominating Black and
brown communities.”
• The “expertise” of technologists, lawyers, academics, and others is used to collaborate with police.
• The state turns to technologists, researchers, data scientists, and
computer programmers to provide a scientific veneer to
longstanding practices of racial profiling and racist patrolling.
11/02
Taylor, L. (2017). What is data justice? The case for connecting digital rights and freedoms globally. Big
Data & Society, 4(2), 1–14.
Dencik, L. (2019). Social Justice in an Age of Datafication [Video]. YouTube.
Allied Media Projects. (2015). Data Discotech [Video]. YouTube.
Hill, K. (2020). Activists Turn Facial Recognition Tools Against the Police. New York Times.

Taylor, L. (2017). What is data justice? The
case for connecting digital rights and
freedoms globally. Big Data & Society,
4(2), 1–14.
• Global perspective required due to “rise in technology adoption
worldwide,” and the “globalisation of data analytics.”
• Data justice is necessary to determine ethical paths through a
datafying world.
Taylor, L. (2017).
• Beyond socio-economic status, gender, ethnicity and place of origin also help
to determine which databases we are part of, how those systems use our
data and the kinds of influence they can have over us.
• This intersectionality (Cho et al., 2013) in the effects of datafication is an
important component of the argument for a social justice perspective. A
range of interacting characteristics – race, ethnicity, religion, gender,
location, nationality, socio-economic status – determine how individuals
become administrative and legal subjects through their data.
• There are currently (at least) three main approaches to conceptualising data
justice: one addressing the ways in which data used for governance can
support power asymmetries (Johnson, 2014), another focusing on the ways
in which data technologies can provide greater distributive justice through
making the poor visible (Heeks and Renken, 2016) and another that is
interested in how practices of dataveillance can impact on the work of social
justice organisations (Dencik et al., 2016).
Hill, K. (2020). Activists Turn Facial
Recognition Tools Against the Police. New
York Times.
• Portland, OR, like many other cities, voted to make facial recognition use by law enforcement or other government actors illegal.
• Cities that passed bans — including Boston; Minneapolis; San
Francisco; Oakland, California; and Portland, Oregon — listed concerns
about police using the technology secretly among their reasons.
Hill, K. (2020).
• Artists and activists are using off-the-shelf facial recognition to
identify police who have legally or illegally concealed their
identities.
• “It doesn’t surprise me in the least,” said Clare Garvie, a lawyer at
Georgetown University’s Center on Privacy and Technology. “I think
some folks will say, ‘All’s fair in love and war,’ but it highlights the
risk of developing this technology without thinking about its use in
the hands of all possible actors.”
• Colin Cheung, a protester in Hong Kong, had developed a tool to
identify police officers using online photos of them. After he
posted a video about the project on Facebook, he was arrested.
Mr. Cheung ultimately abandoned the work.
Hill, K. (2020).
• “For a while now, everyone was aware the big guys could use this
to identify and oppress the little guys, but we’re now approaching
the technological threshold where the little guys can do it to the
big guys,” Mr. [Andrew] Maximov, 30, said. “It’s not just the loss of
anonymity. It’s the threat of infamy.”
• In 2016, an anti-surveillance group in Chicago, the Lucy Parsons
Lab, started OpenOversight, a “public searchable database of law
enforcement officers.”
• Key question: Does banning facial recognition by police harm
efforts for police accountability? Who can benefit from off-the-shelf facial recognition products?
Taylor, L. (2019). Dr. Linnet Taylor on Data
Justice [Video]. YouTube.
• Data justice as social justice + datafication.
Allied Media Projects. (2015). Data
Discotech [Video]. YouTube.
• Community activists using tech for community-defined goals; includes artistic approaches to making and using data.
11/04
https://docs.google.com/presentation/d/1X3vp-rHYBYU4VWFsvfj0xpi00yNZqwsLYnyPnPiG2s/edit?usp=sharing
Public Books. (2021). Episode 4: Data & Infrastructure [Audio]. Public Books.
Data & Society. (2021). Episode 2: Data & Labor [Audio]. Data & Society.
Third assignment due Friday, November 12th, 2021.
