
Topics in Digital Media: Contextual Integrity: Technology, Philosophy & Policy*
MCC-GE 2130-001
Thursdays 2:00-4:10 PM
Fall 2015
Helen Nissenbaum, Department of Media, Culture, and Communication
*Course developed with generous funding from the Intel University Program Office


This graduate-level course welcomes students with varied backgrounds and skills, but prior coursework in either computing (e.g., programming, website creation, data science) or social, political, and ethical analysis is required.

Intended Learning Outcomes

Familiarity with the contemporary landscape of privacy threats from digital and information technologies, landmark cases, and literatures
Deep knowledge of the selected case areas
Appreciation of philosophical and ethical concepts relating to privacy
Insight into legal, policy, and business issues at stake in privacy threats and proposed solutions
Thorough grasp of the theory of contextual integrity
Working knowledge of key technologies and sociotechnical systems, particularly those relating to the selected cases


The course will be conducted in seminar style with the professor initiating discussion topics and students contributing actively. Because privacy is an active issue in the public sphere, students and professor will engage with unfolding and ongoing questions. Students in this course are expected to engage actively in individual research into topics of interest, sometimes going beyond the materials given in the syllabus.
This course is about privacy in its relation to technology. Its approach to the topic is through deep study of a handful of cases selected from contemporary areas in which the interplay of technology with politics, policy, ethics, and society has yielded conflicts over privacy that have been difficult to resolve. The cases are distinctive because they pose equally difficult challenges to technologists, policy makers, and ethicists, who, the course will demonstrate, must understand the mutual impacts of these diverse arenas in order to make progress. Cases to be studied in Fall 2015 are online tracking, big data, the Internet of Things (including the quantified self), and biometrics. Four modules, each devoted to one of these cases, will include a session on relevant technologies in addition to a session on ethical, social, and policy analysis. When considering the origins of the respective threats to privacy, as well as possible answers or ways to address them, the course will evaluate a range of responses, whether technological, policy-based, ethical, or some combination, and in what measure each applies.
The theory of contextual integrity will form the analytic framework for our analyses of the four case areas. To establish common ground and gain familiarity with the theory, the first three sessions will be devoted to learning basic concepts, the analytic framework, and the design heuristic.


Students are expected to attend all class meetings and complete reading assignments beforehand. Students should take notes and bring these to class. Readings from disciplinary sources will vary in difficulty, particularly in relation to students' backgrounds. Don't be surprised if you have to read some of the articles slowly, or even multiple times. Participation is an important element of the course, both in-class and online; careful reading contributes significantly to the quality of your contributions. You will also be evaluated on the basis of written, oral, and practical assignments, a collaborative project, and a term paper based on your project.


Participation (in-class and online): 20%
Miscellaneous homework: 20%
Group project and presentation: 30%
Term paper/project write up: 30%


9/3 Introduction to the course

Franzen, Jonathan. "Imperial Bedroom." New Yorker, October 12, 1998: 48-53.


9/10 Readings
Privacy in Context: Introduction, and Chapters 1-3

Scan popular and news media to locate four articles or videos that, in your view, describe privacy threats. You will be asked to present one of these in class and explain why you see them as privacy threats. Consider how you might frame these threats in terms presented in Part I of Privacy in Context.

9/17 Readings
Privacy in Context: Chapters 4-6
U.S. Department of Health, Education & Welfare. "Records, Computers and the Rights of Citizens." Report of the Secretary's Advisory Committee on Automated Personal Data Systems (July 1973): Summary and Recommendations
Warren, Samuel D. and Louis D. Brandeis. "The Right to Privacy." Harvard Law Review 4, no. 5 (December 15, 1890): 193-220.

Auxiliary Readings
Solove, Daniel J. "A Taxonomy of Privacy." University of Pennsylvania Law Review 154, no. 3 (January 2006): 477-564.
Solove, Daniel J. "'I've Got Nothing to Hide' and Other Misunderstandings of Privacy." San Diego Law Review 44, no. 4 (Fall 2007): 745-772.

9/24 Readings
Privacy in Context: Chapters 7-9, and Conclusion

Auxiliary Readings
Calo, Ryan. "The Boundaries of Privacy Harm." Indiana Law Journal 86, no. 3 (July 2011): 1131-1162.
Dourish, Paul and Ken Anderson. "Collective Information Practice: Exploring Privacy and Security as Social and Cultural Phenomena." Human Computer Interaction 21, no. 3 (2006): 319-342.

Read up on Facebook's Newsfeed ("emotional contagion") study. How might CI inform an ethical evaluation? An excellent resource of materials is: http://laboratorium.net/archive/2014/06/30/the_facebook_emotional_manipulation_study_source

As a medium of communication, transaction, and individual, institutional, and group activity in virtually unending variety, including creativity, sociality, political activism, and production, the Web has served as a spectacular medium of human life. It has also provided the means for monitoring, tracking, data collection, eavesdropping, and surveillance, posing unprecedented challenges to privacy. The failed effort to establish a Do-Not-Track standard demonstrates how deeply entrenched the disparate commercial and political stakeholders are in shaping underlying technological affordances and the supporting policy landscape.

10/1 Web Tracking and Online Privacy: Ethics, Law, and Policy

Gellman, Robert. "Fair Information Practices: A Basic History." Bob Gellman, February 11, 2015: Sections I-V.
McDonald, Aleecia M. and Lorrie Faith Cranor. "The Cost of Reading Privacy Policies." I/S: A Journal of Law and Policy for the Information Society 4, no. 3 (2008): 543-568.
Hoofnagle, Chris Jay, Ashkan Soltani, Nathaniel Good, Dietrich J. Wambach, and Mika D. Ayenson. "Behavioral Advertising: The Offer You Cannot Refuse." Harvard Law & Policy Review 6, no. 2 (2012): 273-296.
Madejski, Michelle, Maritza Johnson, and Steven M. Bellovin. "A Study of Privacy Settings Errors in an Online Social Network." In SESOC '12: Proceedings of the 4th IEEE International Workshop on Security and Social Networking, 2012.
Angwin, Julia (Ed.) "What They Know: The Business of Tracking You Online," Wall Street Journal, 2010. (Selections TBD)

Auxiliary Readings
Hoofnagle, Chris Jay and Jan Whittington. "Free: Accounting for the Costs of the Internet's Most Popular Price." UCLA Law Review 606 (2014): 606-670.

Read a couple of privacy policies from your favorite websites, including this one: http://www.cnn.com/privacy
To do: 1) Post the name of the site and its URL to the Forum (I have opened a Topic); and 2) make one observation, in one or two sentences (no more), about the policy.

10/8 Web Tracking and Online Privacy: Technology

Guest technologist: Arvind Narayanan, Princeton University

Mayer, Jonathan R. and John C. Mitchell. "Third-Party Web Tracking: Policy and Technology." The Center for Internet and Society at Stanford Law School, March 13, 2012.
Schunter, Matthias and Peter Swire. "What Base Text to Use For the Do Not Track Compliance Specification?" World Wide Web Consortium, Tracking Protection Working Group.

Assignment using Lightbeam: Read about what Lightbeam is and how it works (and/or watch the TED talk about Collusion, its predecessor). Install Lightbeam and browse a few of your regular websites normally. Now use Lightbeam to view the third parties which your browser has contacted on your behalf during your browsing. Look them up to learn about their business models and make guesses about why they were present on the site(s) you visited.

Assignment using Developer Tools: As you browse the web, try to guess which of the content you see is coming directly from the domain (site) you're visiting versus what's being served by other domains embedded on the page. Use Developer Tools (F12 key on both Firefox and Chrome) to test your intuition. Developer Tools has a bit of a learning curve, but in the process of learning to use it, the structure of web pages will become much clearer to you.
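The requests you inspect in the Network panel can also be exported as a HAR (HTTP Archive) file and analyzed programmatically. Below is a minimal sketch, not part of the assignment, showing one way to separate first-party from third-party hosts in such a capture; the function name `third_party_hosts` and the sample data are made up for illustration.

```python
import json
from urllib.parse import urlparse

def third_party_hosts(har, first_party):
    """Return the set of request hosts in a HAR capture other than the first party."""
    hosts = set()
    for entry in har["log"]["entries"]:
        host = urlparse(entry["request"]["url"]).hostname or ""
        # Treat subdomains of the first party (e.g. static.example.com) as first-party.
        if host != first_party and not host.endswith("." + first_party):
            hosts.add(host)
    return hosts

# A tiny hand-made capture standing in for a real export
# (a real one would be loaded with: har = json.load(open("capture.har"))).
sample = json.loads("""
{"log": {"entries": [
    {"request": {"url": "https://example.com/index.html"}},
    {"request": {"url": "https://static.example.com/style.css"}},
    {"request": {"url": "https://tracker.adnetwork.test/pixel.gif"}}
]}}
""")

print(sorted(third_party_hosts(sample, "example.com")))
# → ['tracker.adnetwork.test']
```

Comparing the hosts this surfaces with what Lightbeam shows you is a useful sanity check on both tools.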

This term refers to an extension of the Internet to devices that have not traditionally been conceived of as "computers," including mobile phones but, more broadly, home appliances, TVs, motor vehicles, and more. Self-tracking, one application area of the Internet of Things, has burgeoned, from fitness to mood to transportation. The Federal Trade Commission has noted potential privacy concerns in how the devices in question are connected and designed.

10/15 Internet of Things

Guest technologist: Yan Shvartzshnaider, Princeton University

Federal Trade Commission. "Internet of Things: Privacy & Security in a Connected World," FTC Staff Report, January 2015.
Bogost, Ian, "The Internet of Things You Don't Really Need," The Atlantic, June 23, 2015.

10/22 Self-tracking

Swan, Melanie. "Sensor Mania! The Internet of Things, Wearable Computing, Objective Metrics, and the Quantified Self 2.0." Journal of Sensor and Actuator Networks 1, no. 3 (December 2012): 217-253.
Regalado, Antonio. "Stephen Wolfram on Personal Analytics." MIT Technology Review, 2013.
Nafus, Dawn and Jamie Sherman. "This One Does Not Go Up to 11: Quantified Self Movement as an Alternative Big Data Practice." International Journal of Communication 8 (2014): 1784-1794.
Connected cars. See for example: http://www.theverge.com/2015/10/6/9460471/porsche-911-carrera-apple-carplay-google-android-auto and https://www.automatic.com/

For reading responses, I'd like approximately 2-3 pages in which you discuss IoT from a privacy perspective. I would prefer that you select specific cases; consider drawing on cases from this week's readings on the quantified self and self-tracking, or on motor vehicles. How many you select will vary according to complexity. Using the theory of contextual integrity, consider ways that incorporating the "things" you have chosen to discuss raises privacy issues. Please cite the specific papers from which you are drawing the cases. You may need to return to chapters in Privacy in Context to remind yourself of aspects of the theory that can be used.

10/29 Privacy Engineering, Tools, and Technology

Guest technologist: Seda Guerses, Princeton University

Spiekermann, Sarah and Lorrie Faith Cranor. "Engineering Privacy." IEEE Transactions on Software Engineering 35, no. 1 (January/February 2009): 67-82.
Swire, Peter and Annie Anton. "Engineers and Lawyers in Privacy Protection: Can We All Just Get Along?" Privacy Perspectives, January 13, 2014.
Guerses, Seda and Claudia Diaz. "Two Tales of Privacy in Online Social Networks." IEEE Security & Privacy 11, no. 3 (May/June 2013): 29-37.
Surden, Harry. "Structural Rights in Privacy." SMU Law Review 60, no. 4 (Fall 2007): 1605-1629.
Parra-Arnau, Javier, Rebollo-Monedero, David and Jordi Forne. "Privacy-Enhancing Technologies and Metrics in Personalized Information Systems." Studies in Computational Intelligence 567 (2015): 423-442.

The objective of this assignment is to provide you with an overview of the way privacy tools are promoted online and also to encourage you to gather some hands-on experience with some of these tools. We will come back to your findings during class and critically assess what these websites and the tools they promote do to address some of the privacy issues you have been discussing in class. The assignment is as follows:
1) Please pick two sites from the list below (you may also compare one of the sites below with another website for introducing online privacy tools)
2) Compare the two sites:
a) What is the targeted audience of each website?
b) What is the implicit or explicit definition of privacy that they are using?
c) From whom should the members of the audience be protecting themselves?
d) How much of this protection is about information flows? If it is, what is the justification of the creators of the website for focusing on these information flows?
e) What kind of tools and tips are included? How do the websites justify or explain their selection of tools?
f) How do the tools affect the flows of information?
g) Install two of the promoted tools and use them for a week. Highlight three things that you noticed after you installed each tool, e.g., how it changed your behavior and your online experience, what happened to your computer, etc.

Big data offers great promise in key aspects of life, including healthcare, marketing, governance, and education. This module will provide an overview of key areas of science and technology that have come together to promote this paradigm, e.g., database hardware and software, machine learning, networked sensors, and statistics. It will consider some of the literature that has been critical of big data from an epistemological perspective, but will spend most time on ethical issues, focusing on privacy challenges. The "Report to the President on Big Data and Privacy: A Technological Perspective," released in May 2014 by the Executive Office of the President and the President's Council of Advisors on Science and Technology, will be among the readings.

11/5 Big Data: Privacy, Ethics, Law, and Policy

Review Privacy in Context: Chapter 2 and pp. 201-206.
boyd, danah and Kate Crawford. "Critical Questions for Big Data: Provocations for a Cultural, Technological, and Scholarly Phenomenon." Information, Communication & Society 15, no. 5 (2012): 662-679.
Tene, Omer and Jules Polonetsky. "Big Data for All: Privacy and User Control in the Age of Analytics." Northwestern Journal of Technology and Intellectual Property 11, no. 5 (April 2013): 239-273. [Parts I and II]
Zimmer, Michael. "More on the 'Anonymity' of the Facebook Dataset - It's Harvard College (Updated)." Michael Zimmer, October 3, 2008. Accessed August 25, 2015.
Hays, Constance L. "What Wal-Mart Knows About Customers' Habits." New York Times, November 14, 2004. Accessed August 24, 2015.
Duhigg, Charles. "How Companies Learn Your Secrets." New York Times Magazine, February 16, 2012.
White House. "Big Data: Seizing Opportunities, Preserving Values." Executive Office of the President (May 2014): 1-79. [Chapter 1, quickly scan other chapters]

Auxiliary Readings
Ohm, Paul. "Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization." UCLA Law Review 57 (2010): 1701-1777.

11/12 Big Data: Technology

Guest technologist: Seda Guerses, Princeton University

Barocas, Solon and Andrew D. Selbst. "Big Data's Disparate Impact." Draft, SSRN, August 14, 2015: 1-62. [Forthcoming]

Auxiliary Readings
Vedder, Anton. "KDD: The Challenge to Individualism." Ethics and Information Technology 1, no. 4 (1999): 275-281.
Kosinski, Michal, Stillwell, David and Thore Graepel. "Private Traits and Attributes are Predictable from Digital Records of Human Behavior." Proceedings of the National Academy of Sciences of the United States of America 110, no. 15 (April 9, 2013): 5802-5805.
Yakowitz, Jane. "Tragedy of the Data Commons." Harvard Journal of Law and Technology 25, no. 1 (Fall 2011): 1-67.

The assignment for the next class is to get your hands dirty with data mining so that you get an idea of what it means to work with data. The objective of this exercise is to give you an intuition for the nascent science of data mining and machine learning that is fundamental to what is popularly discussed as Big Data. After the class you will not be a machine-learning wizard, but you will be able to ask more informed questions the next time somebody presents you with some "machine learning magic."

Since this is a technical exercise, I recommend that you do it together with your project group so that those of you who have more technical experience can provide guidance to the others. I have also provided links to four videos that give an overview of machine learning. Please watch these and compare how the same field is defined in the different videos. In case you would like to read more, I have included a link to further resources on machine learning.

Read Chapter 1 of Data Mining: Practical Machine Learning Tools and Techniques by Witten and Frank (http://www.cs.waikato.ac.nz/ml/weka/book.html); the book is available through the NYU Library.

Work with a dataset
In preparation for the next class, we would like you to install WEKA, a workbench for data mining, and step through the first class of the video series.
Video on how to Install WEKA: https://www.youtube.com/watch?v=nHm8otvMVTs
and please step through Class 1:
Introduction: https://www.youtube.com/watch?v=Exe4Dc8FmiM
Exploring the explorer: https://www.youtube.com/watch?v=nHm8otvMVTs
Exploring datasets: https://www.youtube.com/watch?v=BO6XJSaFYzk
Building a classifier: https://www.youtube.com/watch?v=da-6IBnqzsg
Using a filter: https://www.youtube.com/watch?v=XySEe4uNsCY
Visualize your data: https://www.youtube.com/watch?v=U-1sTxmHE5U

You can also start by watching these videos, which provide an overview of machine learning:
I would like you to contrast them with these two videos:

Further resources
This is an etherpad, so please feel free to add new and fun resources for others in the class:
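For those who would like to see the core idea behind "building a classifier" in code rather than in WEKA's GUI, here is a minimal sketch using only the Python standard library. It implements a 1-nearest-neighbor classifier on a made-up dataset that merely echoes the spirit of WEKA's classic weather example; the names and numbers are invented for illustration, and this is not a substitute for the WEKA exercise.

```python
import math

def predict_1nn(train, query):
    """Classify `query` with the label of its nearest training example (1-NN)."""
    nearest = min(train, key=lambda example: math.dist(example[0], query))
    return nearest[1]

# Made-up dataset: features are (temperature_C, humidity_pct);
# the label is whether to play outside.
train = [
    ((30.0, 85.0), "no"),
    ((27.0, 90.0), "no"),
    ((21.0, 70.0), "yes"),
    ((18.0, 65.0), "yes"),
]

print(predict_1nn(train, (20.0, 68.0)))  # → yes

# Leave-one-out evaluation, analogous in spirit to WEKA's cross-validation:
# hold out each example, classify it with the rest, and count the hits.
correct = sum(
    predict_1nn(train[:i] + train[i + 1:], x) == y
    for i, (x, y) in enumerate(train)
)
print(f"leave-one-out accuracy: {correct}/{len(train)}")
```

The point is that a classifier is just a function from features to a label, fit to labeled examples; WEKA's algorithms are far more sophisticated, but they answer the same question.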

The course will survey the state of the art in biometric technologies currently considered most promising for practical applications, including fingerprinting, iris scanning, and facial recognition. What claims may legitimately be made about their efficacy in both authenticating and identifying? In what arenas are we seeing most of the uptake: governmental security and law enforcement, or commercial (e.g., Facebook)? What are the most serious privacy concerns? We will review material from the ongoing multi-stakeholder process currently being managed by the NTIA, looking at uses of facial recognition in the private sector.

11/19 Biometric Identification - Technology and Policy

Guest technologist: Nasir Memon, NYU Computer Science & Poly Engineering

Dickinson, Casey J. "New Technology Spots Casino Cheats, Crooks." The Business Journal (Central New York) 15, no. 17 (April 27, 2001): 2.
Williams, Timothy. "Facial Recognition Software Moves Overseas Wars to Local Police." New York Times, August 12, 2015. Accessed August 24, 2015.
GAO. "Information Security: Challenges in Using Biometrics." Government Accountability Office, September 9, 2003: 1-23.
GAO. "Facial Recognition Technology: Commercial Uses, Privacy Issues, and Applicable Federal Law." Government Accountability Office, July 2015: 1-49.
NTIA Multi-Stakeholder Guidelines for Facial Recognition in the Commercial Sector.

Auxiliary Readings
Gates, Kelly, "Identifying the 9/11 'Faces of Terror'," Cultural Studies 20, no. 4-5 (2006): 417-440.

11/26 Thanksgiving

12/3 Notice and Consent (Revisited)
12/10 Project Presentations

Mobile Data Flows and Privacy
Examining How Third Party Trackers Interact with a Web Page
Privacy in the Library Context: An Examination
Privacy in Online Music Streaming
Privacy Analysis of IBM Watson Health
Big Data Challenges the Privacy Protection of U.S. Census
Music Streaming and Privacy: Differential Privacy, Contextual Integrity, and Music Business

Last Updated: January 15, 2016