Sandra Matz is a computational social scientist with a background in psychology and computer science. She is an Associate Professor of Business at Columbia Business School and codirector of the Center for Advanced Technology and Human Performance.
What’s the big idea?
Algorithms steer our outlook and decisions by decoding our psychology. We are inevitably shaped by them any time we engage with the digital village of online life. Our data paints an intimate portrait that leaves us vulnerable to psychological targeting. But despite the potential for predatory use cases, we can choose to use the digital village for empowerment instead.
Below, Sandra shares five key insights from her new book, Mindmasters: The Data-Driven Science of Predicting and Changing Human Behavior. Listen to the audio version—read by Sandra herself—in the Next Big Idea App.
1. We live in a digital village.
I grew up in a tiny village of 500 people in the southwest corner of Germany. My experience there was very much shaped by the other 499 residents because they knew everything about me: what music I was into, who I was dating, and what I was doing on weekends. They put together these pieces of my existence to construct a picture of my inner mental life—my hopes, dreams, fears, and aspirations.
On the one hand, that was a blessing because I felt incredibly supported by a community of people who truly got me. But at the same time, it was also a curse because I felt exposed and manipulated in ways I had no control over.
Today, I live in New York City with my family, which is the complete opposite of that tiny village! But as it turns out, you don’t have to live in a small, rural community to have someone watch and influence every step you take and every choice you make. That’s because we all live in a digital village, and we all have digital neighbors.
Think of it this way: the data-crawling digital equivalent of my sixty-year-old neighbor Klaus reads my Facebook messages, observes which news I share on X, collects my credit card purchases, tracks my whereabouts via my smartphone’s GPS sensor, and records my facial expressions using some of the 50 million public cameras across the United States.
The same way my village neighbors were able to integrate their observations about my daily life into an understanding of who I was and what I wanted, our data-crawling digital neighbors can turn what we do into highly intimate insights about who we are and ultimately prescriptions of what we should do. I call this process psychological targeting.
2. Algorithms are master snoopers.
In 2015, my colleagues at Cambridge University published an article showing that with access to just 300 of your Facebook Likes, an algorithm could predict your personality more accurately than your spouse. When I first heard about their findings, I was sure they had made a mistake. After all, our spouses live life alongside us almost every day. But it turns out that algorithms are indeed master snoopers.
Algorithms are incredibly adept at turning the breadcrumbs of your digital existence into a coherent narrative of who you are, which then, in turn, empowers them to influence you and your choices. So, how do they become such formidable digital detectives?
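The intuition behind Likes-based prediction can be made concrete with a toy sketch. The data, page names, and scoring method below are entirely hypothetical and vastly simplified; the Cambridge study used millions of users and proper regression models. The idea is just that each Like carries a small statistical signal about a trait, and averaging many such signals yields a surprisingly sharp picture:

```python
# Toy sketch of Likes-based trait prediction (illustrative only; all pages,
# users, and scores below are made up for demonstration).

# Hypothetical "training" users: their liked pages plus a known extraversion
# score (0-100), e.g. from a personality questionnaire.
train = [
    ({"salsa_dancing", "stand_up_comedy", "festivals"}, 80),
    ({"salsa_dancing", "festivals"}, 75),
    ({"chess", "poetry", "hiking_alone"}, 30),
    ({"chess", "poetry"}, 25),
]

# Step 1: weight each page by the mean trait score of the users who liked it.
page_scores = {}
for likes, score in train:
    for page in likes:
        page_scores.setdefault(page, []).append(score)
weights = {page: sum(s) / len(s) for page, s in page_scores.items()}

def predict(likes):
    """Predict a trait score as the average weight of a user's known Likes."""
    known = [weights[p] for p in likes if p in weights]
    return sum(known) / len(known) if known else None

# A new user who likes festivals and stand-up comedy gets a high
# predicted extraversion score.
print(predict({"festivals", "stand_up_comedy"}))  # → 78.75
```

Real systems replace this averaging with regularized regression over hundreds of thousands of pages, but the core logic is the same: many weak signals, aggregated at scale.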
“Talking about yourself doesn’t make you a narcissist. It’s a sign of emotional distress.”
For starters, they have a lot to work with. You and I create about 6GB of data every hour. Much of this data is intimate and far less curated than our social media profiles. I bet there are questions you’ve asked Google that you wouldn’t have felt comfortable asking even your closest friends or partner.
But what really elevates algorithms into the master snooper category—and makes us humans pale in comparison—is their ability to analyze the data of millions of people at the same time and detect patterns that you and I might have overlooked. Here’s my favorite example: The use of first-person pronouns. What do you think these words might say about you? My initial guess was narcissism. But as it turns out, talking about yourself doesn’t make you a narcissist. It’s a sign of emotional distress.
Think about the last time you felt really down. What were you thinking about? The future of humanity? Quite unlikely! When we feel down, we typically think of ourselves. Why am I feeling so bad? Am I ever going to get better? And because we cannot constantly monitor our thoughts and feelings, this inner monologue creeps into the language we use and becomes available to our digital neighbors.
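This pronoun signal is simple enough to sketch in a few lines. The snippet below is a minimal, hypothetical illustration of one linguistic feature, not a diagnostic tool: it measures what share of a text's words are first-person singular pronouns, the kind of rate that language-analysis research correlates with emotional distress:

```python
# Minimal sketch of one linguistic signal: the rate of first-person singular
# pronouns in a text. Illustrative only; real language-analysis systems use
# validated dictionaries and many features, not this single count.
import re

FIRST_PERSON = {"i", "me", "my", "mine", "myself"}

def first_person_rate(text):
    """Share of a text's words that are first-person singular pronouns."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return sum(w in FIRST_PERSON for w in words) / len(words)

upbeat = "What a great day with friends at the lake."
ruminative = "Why am I feeling so bad? Am I ever going to get better?"

# The self-focused text scores higher on this one feature.
print(first_person_rate(upbeat), first_person_rate(ruminative))
```

An algorithm scanning millions of posts can track this rate for each person over time, which is how a pattern no human neighbor would notice becomes a readable signal.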
3. You should worry even if you have nothing to hide.
Not having to worry about your data and privacy is a privilege not everybody gets to enjoy, and that can be very fleeting. The 2022 Supreme Court decision to overturn Roe v. Wade made this painfully real for American women. Within a matter of days, millions of women suddenly had to worry that their search histories, use of period-tracking apps, online purchases, or GPS location data could be used against them. No matter how safe and comfortable you feel now, your data could be misused in the future. Data is permanent, but leadership isn’t.
Also, giving up control over your data means more than just giving up privacy. It also means giving up your ability to make your own choices. Ever thought about buying life insurance? Well, as a life insurance company, I want to know if you’re a little neurotic. If you are, I make sure to repeatedly target you with offers and leverage your anxiety to upsell you to the most expensive option. Granted, premium life insurance might be just what you need to put your mind at ease. But even if that’s the case, the mere fact that I’ve singled you out as a potential target and got you to reach into your pocket a little bit deeper than you otherwise would have means that you didn’t make the decision to buy life insurance alone.
“Data is permanent, but leadership isn’t.”
And then there are times when who you are eliminates options entirely from your life’s menu. The Chinese government, for example, has rolled out profiling technology to predict its citizens’ likelihood of voicing dissent and participating in a protest. If a person’s data suggests a short-tempered, paranoid, or overly meticulous personality, they might be placed under heavy surveillance or prevented from traveling to Beijing altogether.
But it’s not just autocratic regimes that use psychological targeting to determine what you can and cannot do. Say an algorithm concludes that you are rather disorganized and careless; this could easily prevent you from landing a job interview or securing a loan. The bottom line is that you can’t take control of your life and choices without control over your personal data.
4. It pays off to ask, “What if?”
I spent much of my early career researching the potential dangers of psychological targeting, but psychological targeting also holds enormous power to change our lives for the better. I’ve explored these opportunities in the form of “what if” questions that provide a direct positive counter to every potentially nefarious use case. Here are a few examples:
- What if, instead of enticing people to spend more, we could use psychological targeting to help them save?
- What if, instead of exploiting people’s emotional vulnerabilities for profit, we could use psychological targeting to help them monitor and improve their mental health?
- What if, instead of burying us deeper and deeper in our own echo chambers, we could use psychological targeting to expand our worldview?
My research shows that all of these are possible, but I find the last one particularly intriguing. Companies like Facebook or Google use predictive algorithms to cater to our existing preferences and reinforce our beliefs about the world. That not only makes us increasingly boring and unidimensional, but it also shatters our shared reality as a collective.
The same algorithms could also be used to accomplish the exact opposite: help us step into someone else’s shoes and explore the world from a viewpoint we might never otherwise get to experience. I have no idea what the life of a 50-year-old farmer in Ohio might look like or what the day-to-day experiences of a single mother in the suburbs of Chicago entail. But Facebook and Google do. And they could make these insights available in a heartbeat.
As a starting point, they could let you explore the news feeds of other users who agree to be part of a “perspectives exchange” or “echo-chamber swap.” You could live your online life in their shoes for a few hours and see what they see. Or how about an “explorer mode” with full control over whose echo chamber you visit and a digital guide powered by generative AI that helps you digest what you see?
The impact of psychological targeting ultimately depends on how we use it. At its worst, psychological targeting manipulates, exploits, and discriminates. At its best, it engages, educates, and empowers.
5. How to win the data game.
Navigating the digital village alone is a nearly impossible task. No one has the knowledge, time, and energy to manage their personal data alone. We only have 24 hours in a day and, hopefully, better things to do than decipher the legalese of all the terms and conditions we’re signing off on. If we want to come out on top of the data game, we must change its rules. Better regulation can help protect us from the most egregious abuses. But it typically does little to help us make the most of our data. For that, we need more flexible and dynamic forms of support; we need allies.
“Applications like MiData make me optimistic about the future.”
Take the Swiss data co-op MiData, for example. MiData acts as a trustee for its members, who can contribute to medical research and clinical studies by granting access to their personal health data on a case-by-case basis. Think of it like a bank account for health.
Anyone can open an account and deposit copies of their medical records or any other valuable health data. MiData ensures your data is securely stored in its collective vault and gives you full control over its use; you decide who is granted access, to which type of data, and for what purposes. And you can withdraw your personal data at any point in time.
Unlike your typical bank account, MiData isn’t interested in generating profits. There are no ludicrous late fees. Its sole purpose is to maximize value for you and its other members. Any net profits generated from using your data get reinvested into improving the platform. As a member of MiData, you’re not just a bank customer. You own the bank. Literally. Control at MiData also means having a direct say in the co-op’s governance through a general assembly.
The value MiData generates for its members takes different shapes. You can share access to your data with third parties to improve your own health. But you can also share your data to support scientific discovery, like helping researchers better understand allergies, food sensitivities, or rare diseases. In many cases, the same application does both. Applications like MiData make me optimistic about the future.
To listen to the audio version read by author Sandra Matz, download the Next Big Idea App today: