
Technology & Democracy, and Privacy (and more): a discussion summary for session 2 of an online exploratory discussion series

privacy by tua ulamac on Flickr / CC BY-NC-SA

In this second session of our small group conversation series on Technology & Democracy, citizens from around the country gathered online via Zoom for an exploratory discussion of several key questions at the intersection of individual privacy, emerging technologies, and our democracy, as well as some other important values and concerns. These questions included—

  • What does “privacy” mean to you individually, and then more generally, what role does privacy play in a democratic society? Why is it important?
  • What are some different questions or concerns about privacy and technology in a democratic society, and what other important values do they affect?
  • What possibilities can we imagine to better manage or govern technology, increase our control over our personal information, or address other questions or concerns we’ve discussed?

In the Discussion Summary below we have summarized some of the common themes and many of the different ideas that emerged from each small group’s discussion of these and other questions concerning Technology & Democracy…and privacy. The summary for the first discussion on Technology, Democracy, and Truth is here.

Our 3rd and final discussion in this 3-part series will be held this Thursday, July 8th at 7:00 pm (ET). In this final session, we’ll explore the different purposes and interests that shape the development and deployment of technology and how they can—or should—impact democracy. We’ll also consider policies for how technology might help improve our democracy. You can register for this final session directly on our website. Register here for July 8

Please note that registration is separate for each session, and that you do not need to have attended the prior sessions to participate in this final session.


Discussion Summary

What does privacy mean to you? How does it affect your life?

  • Privacy is about having exclusive control or personal autonomy over—
    • My personal space, my body, myself
    • My things, my property and possessions
    • My information, my personal data
    • And perhaps most importantly, my power, my choice of what to share and with whom—without ceding this power to anyone else: not the government, not business, not others.
  • Privacy means being or staying anonymous, not having others watching or judging what I’m doing. Not having people knowing all my business or activities.
  • Privacy also means safety—in my space, and in not having my property stolen or damaged.
  • Privacy is an important, fundamental legal right—
    • Constitutionally protected, though only against the government.
    • And protected by law in certain cases and for certain classes of information only—e.g., your medical or financial information.
    • Some privacy laws only regulate what the government can (or can’t) do with your information; they don’t regulate what businesses or others can do with your information.
    • A right that we increasingly feel we’re losing, or that we have to trade away (or already have).
  • Privacy is an “illusion.” Online there is little to no real privacy; somebody is always watching, surveilling, collecting your data. The algorithms never rest in their mission to sell us things and to automate our confirmation biases.
  • Privacy isn’t that important to me because I’m not doing anything wrong, nothing that I would care if the government or others knew about.
  • Privacy means different things—
    • In different places or regions or zones:
      • In China, privacy is only a privilege that the government may grant, but only to citizens deemed “good” enough.
      • The EU has stricter privacy laws, and the default is that citizens have control over their data (companies have to ask for it). The EU also recognizes a “right to be forgotten”—to limit what personal information appears on social media (and/or for how long).
      • More privacy in our homes versus at work, on the street, in stores, or other public places.
    • To different generations: younger people seem much more comfortable sharing a lot of personal data online.
    • To different people—if you’re a celebrity, elected official, or other “public” person, you have less privacy.
  • Privacy is about having control of our data—but it becomes something we trade in order to get to use certain apps on our devices.
    • If we want a certain functionality, we have to let them track us, collect our data
    • It becomes a forced choice—we need or want the apps, the functionality, the convenience, so we have to give up our privacy.
  • Privacy is essential to “freedom”—to be left alone, to not have to interact, to maintaining our individuality.
  • Privacy is essential to protecting our reputations, our careers. We shouldn’t have to reveal everything to everybody.  This is especially true if the information isn’t even correct (think how many errors there are in credit reports).

More generally, what role(s) does privacy play in a democratic society?  Why is it important, and what are some of the interrelated or competing values (or trade-offs)?

  • Privacy has changed over time; previously important and assumed, but then the internet, social media, digital and searchable records, facial recognition technology …
  • It’s important to democracy but not in the Constitution. Is it implied?
  • Privacy is important to democracy in part because it supports freedom of expression: anonymity can make people more comfortable speaking their minds, voicing “uncomfortable truths,” and proposing innovative and challenging ideas.
  • Privacy is something, a right or privilege, that we can trade away for convenience.
  • Privacy supports democracy by giving citizens a stronger sense of safety and control over their persons, property, and identity.
  • Privacy may help to keep us all free from the otherwise constant and hugely annoying barrage of commerce—the advertisements, junk mail, robo-calls, robo-texts, spam emails, etc.
  • Privacy as a fundamental right (along with other basic civil liberties) against intrusion or abuse by the government is important to a healthy democracy because without it, corrupt governments can (and tragically have) used private information to suppress, control, torture, and kill their political opponents or ethnic minorities.
  • Privacy is essential to a democracy because the secret ballot is a key part of our election system. People need to be certain that others, and especially government officials, cannot know how any individual voted.
  • Privacy is a key measure of our “public trust” in other institutions. If we trust others, we’re more comfortable sharing our personal information. If we trust less, we’re less comfortable. Over and over again, we are forced to consider whether we trust the person or entity that we’re interacting with and possibly sharing information with.
  • Privacy is important because it allows for people to have some sense of “control” over at least their own information and lives. It’s about power—like so many other topics.  If democracy is about sharing power (at least to some limited extent), then the privacy of all citizens needs to be secure/protected—both from government intrusion but also from intrusion by private and especially corporate actors.
  • Privacy is not important for a democracy, since privacy encourages us to think only of ourselves—that the individual is most important—it dampens our sense of social obligation.
    • Privacy is a cocoon that separates individuals from society, impeding democracy.
    • For democracy, the value of social obligation outweighs privacy.
    • We have an oligarchy not a democracy, and the info we generate is used by the oligarchs to manipulate us.
  • Other affected values for a democratic society include—
    • Equality—there is a real digital divide, both in online access and in protection from its abuses. The better off have access to computers and to the resources to pay to preserve their privacy; the poor often don’t.
    • Convenience—if we’re honest, we often like much of the convenience that comes from having our personal information stored by and available to online merchants, medical care, etc. But only when it’s both accurate and protected from misuse.  
    • “Trust” is a key value for privacy and democracy as well. How much control over our information do we have (or not have), and how much do we trust others with it?
    • Justice: misuse of personal information can put the wrong person in jail, or at least give them a record.
    • Mental health/psychology: When privacy is threatened or felt to be under assault, individuals and the overall society cannot be mentally healthy. We feel under constant stress and become more paranoid and/or reactionary.

How does our technology affect privacy and democracy? What are some different questions or concerns surrounding privacy and technology?
Technology impacts democracy and society in many ways. Participants were concerned about…

  • A lack of control and desire for limits
    • Individuals do not control their data – governments and businesses do.
      • And they often use it to intrude on our lives via robo-calls, automated text messages, etc.
      • There needs to be some moral ground, even within corporations, where they learn restraint in surveilling us—and where they limit how much they’ll comply with excessive government demands to reveal private data about us
      • Technology is always listening and recording individuals’ actions.
        • With online tracking, corporations know every search you’ve ever done and can utilize it for their financial gain; it’s creepy and unnerving.
        • Cameras everywhere…there is less and less “visual privacy.” We already live in something like a “surveillance state.”
        • On the other hand, we like the added safety/security that cameras can provide to residences and places of business.
      • Algorithms and marketing software push information and products on consumers.
      • Data can be leaked or stolen – even via government servers or credit card companies.
      • If you have a cell phone, then you can be tracked geographically (if you had a phone at the Jan. 6 Capitol riot, then law enforcement can tell if you were there).
      • Location services seem to still be tracking you, even if you turn them off.
    • The technology itself can be inaccurate and the data it relies on even more so.
      • Facial recognition technology—as in the blog post—is often very inaccurate, and it can be intrusive and scary. It’s unjust—people of color are singled out every time they go through an airport just for how they look.
      • Much of the technology is very imperfect, it can misidentify people.
      • There is also a lot of crap information out there (old, filled with typos, etc.) that never gets cleaned up, goes away, or is “forgotten.” Relying on bad data…has a bad track record.
  • What is the future potential of technology? Technology that works well now could be thwarted by newer technology developed to render it ineffective.
    • Biometric recognition technologies can make transactions more efficient and with less risk of fraud (reducing the need for lots of papers to check—and it’s harder to falsify key biometrics like a fingerprint, etc.).
    • But what if new technology rises to the challenge of defeating these biometrics with fakes (using technology to spoof your face or voice)?
  • Positive or pro-social concerns for the common good could outweigh individual privacy concerns in certain instances.
    • For example, tech companies’ ability to turn off individuals’ AC via “smart” thermostats during a heatwave for the greater good of avoiding a blackout.
    • Tracking apps for avoiding Covid—public health as more valuable than privacy.
    • Navigation and mapping apps help the common good by helping relieve congestion (greater efficiency of transportation), but they are also collecting a record of your movements.
    • Law enforcement and public safety concerns may also outweigh individual privacy concerns in some cases. For example, if there’s a kidnapping, then it would be good if the police could track a car.
  • Some key questions individuals were interested in exploring further include—
    • Who controls, or should control the data about, or generated by, you: a) the technology company that is the means of collection; or b) you, the source of the information?
    • What purposes and practices should guide or limit the use of our personal information?
      • For example, a teacher was fired when the school system saw that her social media had a past photo of her at a party drinking alcohol (in a totally legal situation).
      • Could we make a different level of allowed scrutiny for social media? For example, designating it as “out-of-bounds” for use by employers?
      • What if we had a way of rating different levels of sensitivity of private information?
    • How could we have informed consent in advance about the privacy costs or tradeoffs of our technology decisions?
      • What if you had a standardized checklist for each app with all the kinds of information that you might be giving up and what the implications could be?
      • With greater protection for more sensitive personal information?
      • Ultimately, individuals must decide whether they care or not that they are giving up all this private information.

What possibilities can we imagine to better manage technologies, to increase our control over our personal information, to strengthen our democracy, or to respond to other questions or concerns we’ve discussed?

  • What if we created new governmental bodies (commissions, committees, agencies) to help regulate emerging technologies? For example, what if we had a public body to evaluate and pre-approve technologies in light of their societal impacts—similar to the way that the FDA evaluates and approves new medicines for safety and efficacy?
    • We need someone checking the development and implementation of technology for its privacy implications.
    • Who would run it?
      • If it’s a government body, what if industry insiders take control of it?
      • What if there were a certain number of seats or votes reserved for persons representing regular citizens, consumers, the public?
    • It could slow down technological development with new regulatory hurdles.
      • This could be good, since a lot of tech is moving too fast; our development of technology can often be uncoordinated and wasteful, resulting in a lot of “throw-away” technology with wasted effort and materials.
      • This could be good, because it might make tech companies consolidate more of their new versions into really significant stages (not releasing minor variations).
      • This also allows us to control technology’s impact on society by slowing it down, and this is an especially good idea with genetic innovations because you are changing life itself.
    • Technology itself is “neutral”—it can be used for good or for ill. For example, it is an essential component of our agriculture, transportation, communication, medical, manufacturing, and other sectors of our economy. Implementing technology in a way that is beneficial to society as a whole is a crucial issue.
  • What if we made more use of the expertise of non-governmental and non-profit think tanks or social entrepreneur organizations that work on technology and democracy issues?
  • What if we had a system for different ratings of seriousness for the societal impacts of technologies?
    • It is imperative, for example, to closely examine and regulate the use of technology for justice issues—especially facial recognition technology.
    • There could be different levels of regulatory scrutiny depending on the rating of the possible impact of the technology.
  • What if there were an ethics office for data-sharing?
    • Companies might have to justify why they need certain information for the functionality of the product.
    • It could set limits for users of different ages (no tracking for children below 13, etc.).
    • Companies need to instill a sense of ethics within their ranks, a sense of restraint.
    • Ask them to consider: how would they want their kids to be impacted by this technology?
  • What if we treated online platforms and internet service providers more like “utilities” for the common good and as part of our national infrastructure, like the post office or the interstate highway system?
    • And what if we regulated them as utilities to ensure that safeguards are in place to protect both the privacy rights of individual users and the public good overall?
  • What if we implemented certain policies to give individuals more control over their personal information?
    • What if we set it up so that people would have to opt in before giving up private information?
    • Combine this with the earlier idea of transparency and informed consent about what you’re giving up.
    • What if we had a system of fully informed-consent—a checklist that spells out what you are agreeing to in terms of your private information?
    • What if we had a system like in the EU, where the default is that control is in the hands of individuals, not the tech companies?
    • What if we changed the ways people could use your data (make it illegal to get fired for what’s on your social media)?
  • What if you could vote with your dollars to protect your privacy, and you had a choice of apps or companies that won’t collect all your data?
    • What if we sought to increase competition in the technology sphere for the public’s benefit? For example, a tech company or an app could market itself as the one that won’t track you (etc.).
    • If there’s a competitive market for privacy and for greater transparency, then technologies that meet that demand are more likely to emerge.
  • What if we focused on education and improving equality of online access for all citizens?
    • We need a populace that is better educated about science, technology, democracy, and privacy.  Also focus on educational equality:  we need to improve education for all.
    • Equalizing online access and privacy rights:
      • Provide the public with online access at libraries and other public places.
      • Make privacy protections for users required “standard equipment” that all developers or providers of online services, platforms, apps, etc., must automatically include and provide to all users at no extra cost—and not something that users/consumers have to opt into or pay extra for.
