We, the Data: Human Rights in the Digital Age
Wendy H. Wong is an award-winning author, professor of political science, and Principal’s Research Chair at the University of British Columbia.

Below, Wendy shares 5 key insights from her new book, We, the Data: Human Rights in the Digital Age. Listen to the audio version—read by Wendy herself—in the Next Big Idea App.


1. Be a data stakeholder.

These days, we hear a lot about data taken from people. We’re often confronted with news about companies using data in ways we find troubling, including tracking our behavior across the Internet or treating it as free fodder for generative AI like ChatGPT.

Perhaps we feel that we can’t help but fall victim to these incidents because of how our daily lives are structured and organized. Many of us have smartphones and depend on the Internet to conduct our day-to-day activities. We engage with digital technologies whether we like it or not: tapping into public transit, patronizing stores that use loyalty programs (or even facial recognition technology), and walking down streets lined with video doorbells like Ring. Our lives are, in short, datafied: behaviors that reveal our innermost thoughts and personal activities have become digital data.

We might also feel like data subjects: data are being taken from us, we have very little say in the matter, and we feel we have little choice but to submit to such collection.

That is why it’s important to become data stakeholders, and human rights can get us there. We need to understand what it means to be a data source, how important this role is, and why, even in a sea of data sources, we don’t have to lose to powerful data collectors like Google, Amazon, Microsoft, Apple, and Meta.

To be a stakeholder, we must make our voices heard. Currently, this is quite difficult because there are so many of us as data sources. Billions of us. We are also not thinking about data as something intrinsically human. Instead, we are often told to think about data as a commodity, something to be bought and sold like oil. Data are “dust,” or “detritus,” or “exhaust.” We’re told we’re the product if we’re not paying.

Imagine a world where we thought about data not as part of the market, but as part of us. What if data were subject to the globally endorsed framework of human rights? Stakeholdership means we have skin in the game and an entitlement to speak up. Rather than sitting back and being subject to data collectors’ policies on how they make, store, and analyze data about us, we need to start directing the conversation to make our future in data one embedded in human rights.

2. Digital data are sticky.

They’re sticky like gum on the bottom of your shoe, and they have fundamentally changed the way we live. Nowadays, we probably all know how easily replicable and transferable digital data are, because digital technologies are so pervasive.

But data are sticky not just because we can move them. They are sticky for four other reasons.

  • They’re about mundane things. Think about all the actions you take online, through your devices, or somehow connected to a network. By and large, you can’t avoid these activities. They are everyday, unremarkable, and even boring. Data are now created about the most granular of our activities, from heartbeats to search terms and beyond.
  • Data are effectively forever. Perhaps they get deleted somewhere along the way, but because we don’t know where they go or what collectors do with them, they are immortal for all intents and purposes.
  • Data are linked. Data don’t just sit on a server once created. They’re sorted, pooled, disaggregated, and re-aggregated to spit out predictions. Data travel from dataset to dataset as they’re bought and sold.
  • Data are co-created. Remember, as the data source, you are half of the equation. Your activities are of interest to data collectors, who devise the ways to harvest data about you. So, few data that exist in the world are truly “yours,” even data about your innermost feelings. As such, data are co-created between you and a data collector. Also, many of the data about us are not interesting on their own. They’re interesting to collectors because of what they tell about people like us, for predictive purposes. This co-creation confounds some of the existing ways we have tied human rights to datafication and AI.

3. In a world of sticky data, we need human rights now more than ever.

Human rights are about what it means to live a human experience. Human rights are the entitlements we have to live life to our fullest potential. They enumerate state responsibilities to ensure that all of us have our universal rights respected and enforced. In 1948, the Universal Declaration of Human Rights brought forth a vision to protect each and every human being through human rights.

Our global human rights framework, though, was institutionalized in a very analog world. That’s not to say human rights no longer apply; they very much do, as long as human beings continue to exist in the world. It’s just that data’s stickiness makes it very hard to apply our analog concepts. We might even say data stickiness shows where our modern conceptions of human rights need updating. But how?

We need to go back to the foundations of human rights: the values they seek to protect. In 1948, the creators of the Universal Declaration of Human Rights identified four foundational values: autonomy, dignity, equality, and community. As they envisioned, these values buttressed the international structure of human rights: an initial list of rights that has grown into dozens of universalized entitlements.

“In 1948, the creators of the Universal Declaration of Human Rights identified four foundational values: autonomy, dignity, equality, and community.”

Autonomy, dignity, equality, and community have all changed because of datafication. Data’s stickiness poses problems for our ability to make independent decisions, the hallmark of autonomy. That data are forever and linked poses issues for dignity: how might they be used, once created? Equality issues pervade datafication through the imbalance between data sources and collectors. And as data continue to be made and stick, our human communities and the ties between us have changed and will continue to change.

The human rights concerns go beyond important rights such as privacy and freedom of expression. The point is that data’s stickiness both shows the need for human rights and sends us back to the foundational values of human rights in the face of the digital age’s challenges.

4. It is important to recognize Big Tech’s role in governance.

We’re used to thinking about the multinational firms that drive the digital economy, or Big Tech, as powerful because they are wealthy: incredibly, historically wealthy. But money is just one way to exercise power. In fact, the nature of Big Tech’s creations and their role in collecting data have given them more than just economic power. They also govern, in ways we usually associate with states. Governing means creating order around shared expectations. It means setting rules and finding ways to enforce those rules. Governing is what we expect governments to do.

Yet, more and more, Big Tech can be said to govern. They create and control the digital platforms through which we access services and create social and political connections. For example, Meta controls the community standards on its platforms Instagram and Facebook. These standards set the basis for the kinds of speech that can happen on these platforms. Between its different platforms, which also include WhatsApp and Messenger, Meta reaches more than 3 billion people. That’s far more people than any government can legitimately claim to govern. Meta’s Oversight Board, a body that uses human rights to evaluate what gets removed or reinstated on Facebook and Instagram, makes decisions that affect people the world over.

“Between its different platforms, which also include WhatsApp and Messenger, Meta reaches more than 3 billion people.”

Right now, human rights responsibility falls on states. Big Tech falls under “businesses,” which must respect human rights. But what if they don’t? As of this recording, Google and Facebook have declared that they will block Canadian access to news by the end of 2023 because of a dispute over compensating journalism outlets. Denying millions of people access to news and updated information affects our ability to exercise freedom of expression and conscience, at a minimum. But expression and conscience are two rights that very much either lead into or arise from other rights: the right to education, the freedom to assemble, freedom of religion, and even the right to choose one’s own marriage partner.

If we are to use human rights effectively in the age of data, we must start thinking about how Big Tech governs our daily lives, including not just what we can share but also what we know. Not holding Big Tech accountable as governors means we aren’t holding them responsible for the changes they have brought to human lives.

5. We need to start thinking about data literacy as a human right.

Data literacy will help us be data stakeholders. It should be part of how we think about the universal right to education. Human life will continue to be data-centric, and as such, learning about data, how to make data, how to use data, and the implications of creating digital data have become core human experiences. Without a human right to data literacy, a universal entitlement, we risk leaving most people behind while the few data collectors make leaps and bounds by exploiting the data about us. Literacy is about giving individuals the skills to function in a data-driven society.

What does it mean to realize a right to data literacy? Already, community-based programs show us that we don’t have to be data scientists to understand the basic premises behind data creation, why our assumptions about the world matter, and how our choice of data sources shapes what we learn.

Going forward, we need to think about primary and secondary school curricula. Estonia has led the way in developing extensive educational programming in data literacy, and other countries are getting there. But retooling curricula takes time, and countries vary in which parts of society deliver basic education.

In the meantime, we should reprioritize libraries as stewards of data and data literacy. Libraries are foundational in giving people linguistic literacy skills, and there’s no reason why we shouldn’t be looking towards and funding libraries to help give us the wide, societal reach we need to bring about data literacy.

Finally, outside of data literacy, we should also work toward shifting how data scientists, computer scientists, and engineers are trained. Not only should they think about ethics, they should also learn to consider how their future creations will affect society.

To listen to the audio version read by author Wendy Wong, download the Next Big Idea App today.
