Technology’s Child: Digital Media’s Role in the Ages and Stages of Growing Up

Katie Davis is a professor at the University of Washington, where she directs the Digital Youth Lab. For nearly twenty years, she has been researching and writing about the impact of digital technologies on young people’s learning, development, and well-being. Dr. Davis holds two master’s degrees and a doctorate in Human Development and Education from Harvard Graduate School of Education. She is a 2015 recipient of a Rising Star Award from the Association for Psychological Science.

Below, Katie shares 5 key insights from her new book, Technology’s Child: Digital Media’s Role in the Ages and Stages of Growing Up. Listen to the audio version—read by Katie herself—in the Next Big Idea App.


1. Relying on technology is not the same as thriving with technology.

The coronavirus pandemic, in combination with other contemporary events, has given rise to a new readiness to engage with the topic of kids and technology on a deeper level than in the months and years before March 2020.

Before the pandemic, the questions were straightforward. We were asking questions like: “Screens: are they good or bad for children’s development?” and “Screen time: how much is too much?” When we moved our lives online in 2020, we quickly learned that these are not the right questions. Technology has become too central to our lives.

Now the questions include: “When does technology support children’s healthy development, and when does it not? Why do some children benefit from technology and others do not?” For instance, the pandemic called attention to the fact that while we all may rely on technology, we don’t all enjoy equal access to it. When school buildings closed and classes moved online, we learned about the differences between kids living in affluent homes (with high-speed internet, a dedicated computer, a quiet space to work, and a network of supportive people to help keep them on track) and kids living in low-income homes, who too often had few or none of these things.

These stark differences in children’s pandemic experiences exposed and amplified existing inequities in our educational system, forcing us to ask: “Which children benefit most from technology, and why?” In the last quarter of 2021, the leaking of internal Facebook documents by former Facebook employee Frances Haugen shone a light on the power of opaque algorithms to shape our experiences on social media platforms. Although this power had been well documented previously, the media coverage surrounding the Facebook files helped bring it more directly into public consciousness.

The public became more aware of how algorithms designed to attract and hold our attention on platforms like Facebook and Instagram (and Snapchat, Twitter, TikTok, and YouTube, among others) can contribute to the spread of misinformation and hate speech and the undermining of teens’ mental health. This prepared us to grapple with the critical question: “How does the design of a technology shape the way children use it and are affected by it?”

2. Technology design matters.

Technology is not neutral. The autoplay feature on Netflix makes it that much easier for children to watch multiple episodes of a television show and that much harder for parents to avoid a confrontation when they try to turn off the device. When a show ends, recommender algorithms on Netflix (and YouTube, Disney+, and other video-streaming platforms) determine what a child encounters next.

“Most centrally, these companies are motivated by the bottom line, not a child’s well-being.”

Algorithms also determine the videos that show up on a teen’s For You feed on TikTok and the pictures they see as they scroll through their Instagram feed. These algorithms are designed to draw and keep people’s attention, based on the information the platform has collected from their prior interactions.

People design things like autoplay and social media algorithms. Their values, goals, and assumptions are baked into what they create. Importantly, tech designers don’t reflect the full diversity of the kids they’re designing for. They tend to be disproportionately white, male, and highly educated. When they think about a “typical user,” that person is more likely to resemble them than not, and this increases the chances that the technology they create serves some people better than others.

The companies that tech designers work for also have their own set of values, goals, and assumptions. Most centrally, these companies are motivated by the bottom line, not a child’s well-being. The bottom line does well when people are on the platform, and companies have figured out the kind of content that keeps people engaged: emotionally arousing posts typically win out over measured accounts of fact. This is why Facebook’s algorithm prioritizes content with angry emojis over likes.
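To make this concrete, here is a deliberately simplified sketch, not any platform’s real ranking code, of how an engagement-weighted feed can end up boosting an outrage-provoking post over a more widely liked one. The reaction weights and post data below are hypothetical.

```python
# A toy illustration (not any real platform's ranking code) of how
# engagement-weighted scoring can favor emotionally charged posts.
# All weights and post data here are hypothetical.

REACTION_WEIGHTS = {"like": 1, "love": 2, "comment": 4, "angry": 5, "share": 6}

def engagement_score(post):
    """Sum each reaction count multiplied by its assumed weight."""
    return sum(REACTION_WEIGHTS.get(kind, 0) * count
               for kind, count in post["reactions"].items())

posts = [
    {"id": "measured_explainer", "reactions": {"like": 900, "comment": 20}},
    {"id": "outrage_bait", "reactions": {"like": 150, "angry": 300, "comment": 80}},
]

# Ranking by score, the post that provokes anger outranks the more widely
# liked one, even though fewer people interacted with it overall.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(post["id"], engagement_score(post))
```

Nothing in a scoring rule like this asks whether the content is accurate or good for the person scrolling; it only asks what keeps them engaged.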

3. Individual children matter.

Research involving large numbers of children is valuable because it gives us an overall impression of general trends in a population. It’s also a useful way to identify systemic differences that lead to common disparities among subpopulations in a society. But there’s a limit to the value that can be drawn from general trends. That’s because the statistical models that form the basis of population-based research typically rely on averages.

Let’s consider a hypothetical example involving the relationship between social media use and body satisfaction in teen girls. If you wanted to use the amount of time teen girls spend on social media sites like Instagram to predict how positively or negatively they feel about their bodies, you could plot a bunch of dots on a graph, each one representing an individual teen girl: her time spent on Instagram (a value on the x-axis) and her score on a measure of body satisfaction (a value on the y-axis).

Next, you’d draw a line through the middle of these dots, positioned so that the dots falling above the line are, on balance, about as far from it as the dots falling below it.

You’d use that line (not the dots that generated it) to draw conclusions about the relationship between Instagram use and teen girls’ body image. If the line sloped steeply downward to the right, you’d conclude there’s a generally strong, negative relationship between Instagram use and body satisfaction (not good). If it rose gently to the right, you’d conclude there’s a generally weak, positive relationship (better). Either way, the line would let you make an informed guess about a specific teen girl’s body satisfaction based on the amount of time she spends on Instagram.

“Averages are good at describing behavior on a group level, but considerably less effective at describing the behavior of individual children.”

The problem is, you’d probably be wrong—maybe not as wrong as pulling a value out of thin air, but still wrong. Very few, if any, people ever land exactly on the trend line. It turns out, very few people are average! Averages are good at describing behavior on a group level, but considerably less effective at describing the behavior of individual children.
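For readers who want to see the problem with averages concretely, here is a minimal sketch using simulated (entirely made-up) data: it fits the trend line described above with ordinary least squares and then reports how far a typical individual falls from that line.

```python
# A minimal sketch with simulated data (not real study results) showing how
# a fitted trend line can summarize the group while missing individuals.
import numpy as np

rng = np.random.default_rng(0)
n = 200
hours_on_instagram = rng.uniform(0, 5, n)          # x-axis: daily hours (made up)
body_satisfaction = (70 - 4 * hours_on_instagram   # a modest negative trend on average...
                     + rng.normal(0, 15, n))       # ...plus large individual variation

# Fit the straight line through the cloud of dots (ordinary least squares).
slope, intercept = np.polyfit(hours_on_instagram, body_satisfaction, 1)
predicted = intercept + slope * hours_on_instagram
residuals = body_satisfaction - predicted

print(f"Fitted trend: satisfaction = {intercept:.1f} + ({slope:.1f}) x hours")
print(f"Typical miss for an individual: about {np.std(residuals):.1f} points")
```

The slope describes the group-level trend, but the size of a typical residual shows how far any one girl can sit from the line, which is exactly why the line alone is a poor guide to an individual child.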

This example illustrates why a “one size fits all” approach to kids and technology doesn’t work. We need to look at specific children, specific technologies, and specific contexts if we want to truly understand what’s going on with tech’s role in child development.

4. Context matters.

Understanding child development requires understanding the communities and cultural practices that children participate in. Cultural communities shape our impression of what’s considered dangerous and what’s considered beautiful; what counts as intelligent or athletic or funny; and how intelligence, athleticism, and humor are expressed.

I’ll give you an example from my own experience. While living in Berlin, I noticed that many preschool teachers didn’t think twice about letting three- and four-year-olds use hammers, nails, and saws to work unsupervised with wood planks. This would surely lead to a lawsuit in the Seattle neighborhood where I now live.

Children participate in a lot of different spaces, settings, and communities, each containing different configurations of people, practices, roles, and expectations. Understanding how the cultural practices across these contexts relate to each other, and how they connect to broader societal institutions and ideologies, is crucial.

Let’s return to the example of teen girls, social media, and body image. To do better than an average trend line at predicting the relationship between Instagram use and body satisfaction for a particular teen girl, it would help to have some understanding of her experiences with her peers and family members.

What kinds of beauty ideals are promoted by each group of people, and how do they draw on, reinforce, or perhaps push back against broader cultural ideologies related to beauty? How do these dynamics influence what she sees and how she interprets content on Instagram?

Does this teen participate in other communities, such as a church group, sports team, or afterschool club, that deemphasize physical appearance and elevate other qualities such as teamwork, spirituality, and community? How do this teen’s racial identity, and the degree to which it’s either marginalized or centered in the broader society, influence whether and how she compares herself to the images she sees on Instagram?

Questions like these highlight the importance of considering young people’s varied social contexts, how these contexts relate to each other and the technologies used in them, and how all of these contexts are influenced by broader social, political, economic, and historical forces.

5. Good enough can be great.

The idea comes from Donald Winnicott, a pediatrician writing in the middle of the 20th century. Back then, Winnicott wrote about the good-enough mother; let’s update the idea for the 21st century and talk about the good-enough parent.

Winnicott argued that the goal of parenting is not perfection but rather good enough. In fact, he said, you’re doing your child a disservice if you’re always on hand to solve their problems, soothe them when they’re frustrated or upset, or offer up a new activity if they become bored with the current one. By experiencing and working through these moments of frustration, disappointment, or boredom, children build the resilience that will prepare them for the challenges they’ll inevitably encounter throughout their lives.

“It’s not in a child’s best interest for their parents to be always attentive, because this attentiveness doesn’t allow room for personal growth.”

Good-enough parenting isn’t about making excuses for failing to live up to one’s ideals as a parent. It’s about recognizing that we should not even attempt to be the perfect parent.

First, and most obviously, perfection isn’t possible; or, at least, it can’t be sustained throughout the long and often unpredictable course of childhood. More importantly, according to Winnicott, it’s not in a child’s best interest for their parents to be always attentive, because this attentiveness doesn’t allow room for personal growth.

The concept of the good enough parent serves as a call to parents not to settle for imperfection, but to embrace it, both as a way to support their children’s resilience and as a way to stay sane during the exhilaration and exhaustion of parenthood.

A good-enough digital parent, similarly, doesn’t settle for imperfection but embraces it. They do their best to steer their kids toward positive technology experiences, knowing that not everything will be of equally high quality. They make mistakes, learn from them, and move on. When it comes to their own technology use, good-enough parents feel confident that occasionally being distracted by a screen or other device isn’t the end of the world, nor the end of their child’s prospects for a happy, fulfilling life.

Good-enough parents also recognize that their distraction isn’t entirely their own doing: tech companies have deliberately designed their products to grab our attention. Distraction could even sometimes be a good thing, both for parents and for children, provided the concept of good-enough parenting isn’t distorted into an excuse to shirk parenting duties in the name of responding to emails or monitoring one’s social media feed. This is by no means an attempt to let tech companies off the hook for their persuasive designs, either! The occasional distraction can give parents an opportunity to call attention to their own behavior and create a teachable moment: “Oh, look, I’ve let myself get distracted by my phone! Let me put that away, and let’s get back to where we were.”

With my son, I do my best to limit my phone use around him, but I’m also trying not to place too much guilt on my shoulders while sneaking a peek at news headlines, exchanging messages with a friend, or checking my inbox. These interactions sometimes feel like a lifeline, a way to connect with the outside world, the adult world, and maintain my sense of self beyond my role as a mother.

To listen to the audio version read by author Katie Davis, download the Next Big Idea App today:
