Why the Tech Revolution Needs the Liberal Arts

READ ON TO DISCOVER:

  • How a sociology PhD student changed the conversation about Snapchat—and got hired for it
  • What Google’s in-house philosopher does all day
  • How many employees at Apple and Facebook have a liberal arts background

Scott Hartley is a venture capitalist and former Presidential Innovation Fellow at the White House whose writing has been featured in the Financial Times, Forbes, Foreign Policy, and more. David Burkus is an award-winning podcaster and author of Under New Management: How Leading Organizations Are Upending Business as Usual. He recently hosted Scott on Radio Free Leader to discuss the coming automation of jobs, the skills that will never go out of style, and how liberal arts students can find success in the rapidly changing world of technology.

David: We are here on the occasion of a book that you’re releasing called The Fuzzy and the Techie. I think we should probably define those terms.

Scott: Absolutely. “Fuzzy” refers to the arts, humanities, and social sciences, and “techie” refers to the hard sciences, computer science, or engineering. More than it being one versus the other, the book is about the faux opposition between the two. If you take a look at some of the “fuzzy” majors, like history, or economics, or political science, you see heavily quantitative work coming to the fore. And if you look at “hard” subjects like mechanical engineering, you see this rise of design thinking and bringing psychology to bear.

Really it’s about the merging of these two worlds. The book unpacks this myth-busting notion that Silicon Valley, and tech in general, is filled with only technical people.

In fact, if you look at any company, whether it’s Apple, Google, or Facebook, 30 to 40% of the employees are coming from backgrounds in the social sciences, the humanities, and the arts. The liberal arts are bashed in the media, as if having an English degree must [mean] you’re a barista at a coffee shop. This notion is really false, and the truth is front and center in the heart of Silicon Valley.

David: I come from the business school world, where the dichotomy is between poets and quants. If you’re on the social science side, you’re a poet. If you’re on the financial side, you’re a quant. We set both of these in opposition to each other because we as humans just love categorizing people. The reality is we need them both.

“The comparative advantage is in having a unique observation, this unique lens on how to apply technology, [not] just having the technology built.”

Often there is the assumption that “innovation” refers to technological innovation, the “Mark Zuckerberg coding all night” type of story. But there are a lot of problems that fuzzies can help tech companies solve far more quickly than A/B testing ever would.

Scott: Right. If you look at Zuckerberg, people think, “Well, he’s just this lightning-fast coder who did all this by himself.” Look at some of his co-founders, like Chris Hughes. Chris Hughes was a history major at Harvard. And even if you look at Zuckerberg’s background, he studied psychology. He went to Exeter, a liberal arts prep school. He studied ancient Greek and Latin. He won a prize in classical studies in high school, so he’s not as monolithically tech as you might think.

What’s really interesting are the people spearheading the creation of some of these companies, who are deeply involved in product development itself. The counterintuitive truth is that oftentimes it’s the fuzzy who came out of a different walk of life, a different academic background, who saw the world through a different lens and said, “Wait a minute, this technology that I see, this new tool, can be applied to my old problem. Why don’t I start a company to do that?” In many cases those were more profound ideas than tech for tech’s sake. It was tech applied to a particular domain, where coming at it from a different angle was what created value.

David: Let’s dive into that. One of the stories that fascinated me the most was Snapchat.

Scott: It’s a fascinating example. Nathan Jurgenson was a sociology PhD student writing blog posts about what he was calling ‘digital dualism,’ this concept that everything that’s online is fake and everything that’s offline is real. Today we’re seeing people who are taking pictures of their meal every day, they’re taking manicured photos, they’re dressing up, staging their real life to display it as [how] they’re living. In many ways, this artifice, this creation in the physical world, is fake. Whereas something in the digital world that’s ephemeral and authentic, [like a Snap], could actually be more real than the physical world.

Today we live in this era of endless storage, where everything can be stored. [So] Snapchat was brilliant to create scarcity. They tugged on the psychology, and took real stock in this sociological observation that Nathan had. Evan Spiegel, who runs Snapchat, came across these articles on digital dualism and he said, “Why don’t you come work for Snap? Why don’t you run a publication for us?”

It’s a great example of how a sociologist is front and center at the heart of a tech company. This is evident in a number of other domains as well. For example, Stewart Butterfield founded Slack, a corporate communications platform that’s taking the place of email in many companies. The whole development of that product came out of a failed gaming startup. He said, “Wait a minute, there seems to be this kernel of truth in this [failed] idea. Let’s pursue that and see where it goes.” He studied philosophy both in undergrad and grad school, and he actually attributes the unveiling of Slack to his training in philosophy.

There are a number of different people at the helm of product development who are coming at it from these different backgrounds and different walks of life, really leading to these breakthrough moments. At this point, the comparative advantage is in having a unique observation, this unique lens on how to apply technology, [not] just having the technology built.

David: It’s a great example of tech inventing this amazing tool called email, but never actually thinking about the long-term implications: the collaboration overload, getting back from a three-day weekend and having 400 emails in your inbox. No one ever thought about the human element, and Slack is a tool developed for that.

Scott: Right. In product development we see this term ‘user experience,’ really designing for the human behind the technology. Slack is a great example of that. If you look at Apple, they have Apple University where they bring in philosophers and professors. Joel Podolny, [who] used to run the Yale School of Management, was behind the creation of Apple University and how they preserve the culture of Steve Jobs, building great products that can really be bicycles for the mind, as Jobs said.

One guy that I feature in the book, Tristan Harris, was Google’s in-house product philosopher for some time. What he focuses on is this: when you have a very few people making decisions about what to build and how to tweak it, what are the impacts as you multiply those decisions out across billions of users? These implications are large, and the opportunity cost to people’s time is huge. If you create something that leads people to spend more time on their phone or more time in their email inbox, that actually has ramifications for productivity at large across society. How are we thinking about these things? Are there just a couple of young white men making these decisions? Is there a more pluralistic approach to how we make these decisions? It’s important to engage with people from various disciplines and backgrounds to be part of this conversation.

“We’ve looked at this rise of big data as this promise that with more and more information we will have more and more knowledge. As has been proven throughout history, that’s not always the case.”

David: You have a section about this assumption that the techies will inherit the earth, that big data is the source of all truth. Yet at the same time, big data strips us of the human element. Folks that are pioneers remind us that we can make better decisions with more data, but we can’t strip the people from the data points. Otherwise we arrive at something that may be optimized for us, but not optimized for society.

Scott: We’ve looked at this rise of big data as this promise that with more and more information we will have more and more knowledge. As has been proven throughout history, that’s not always the case. Voltaire actually has this great quote that I’ll paraphrase. He says, “Judge a man by the questions that he asks, not by the answers that he gives.” We have this massive proliferation of data, but really it’s about the humans in the room who are asking questions of the data.

A great example of that would be data on predictive policing. A number of cities’ police departments have experimented with using data to predict where to send police officers. You might say, “Okay, that’s a great way to be out there in advance of where you might need to project force,” but at the same time, you have to ask how that data is being collected.

Predictive policing is reliant on police data of previous crimes committed, and previous crimes committed are usually based on previous crimes reported, and there may be bias in how that’s done. You have to look behind this veil of data to say, “Do we have sociologists and social workers in the room, not just the guys writing the prediction algorithms?”

David: One of the other interesting areas that I found in the book is the idea that technology can help us learn. There are a lot of people who think STEM is synonymous with innovation, and in reality, we need broader liberal arts ideas to think philosophically about what requires learning.

Scott: One of the guys that I talk to in the book is Matt Brimer, the co-founder of General Assembly. I liken it to an urban community college for technical literacy. Matt is actually a sociology major from Yale who knew about building community, knew about design, and what people needed [in order] to feel [like] part of a community. He talks about how your education should always be in ‘beta.’ It’s an engineering term for ‘a work in progress.’

Being able to think creatively, to solve complex problems, to communicate with others, these are all fundamental things that are required, regardless of what the world looks like in 50 years. We should learn these basic human skills, and then have this mentality that we should be continually investing in our own education.

Blended learning is really hitting its stride. What this means is not having an online-only curriculum where you put people in front of a video lecture and expect them to learn. Instead, it’s blending technology and the classroom. One great example of this is dealing with messy problems. How do you engage people to think critically? You might pose a question that has no easy answer, where you teach people to identify trustworthy sources. We need to have a critical eye as we examine things that are shown to us every day.

David: You close the book with the idea that we should assume that every degree will be irrelevant for job preparation. What does that mean for the future of work and where we’re headed?

Scott: That’s the question of the day. It’s about human/computer interaction. It’s about fluency with data, fluency with machines, but it’s also about the human component.

A study that came out a couple of years ago from Oxford claimed that 47% of all US jobs were at high risk of automation. This set off alarm bells, and I think this is where we started getting frenzied about automation. There have since been a number of studies that have taken a more measured look at the ecosystem.

“We should be continually investing in our own education.”

For example, the McKinsey Global Institute came out with a study in the early part of 2017 where they looked at over 800 jobs. They found that 5% of jobs are fully automatable at this point, which, from a societal perspective, is a big deal, and it raises questions about basic income, unemployment, and other very important discussions. For 60% of the jobs they looked at, 30% of the tasks could be automated.

What that means is that 30% of what we do on a day-to-day basis might be done better by a machine—whether that’s collecting data, manipulating data in Excel, or searching for information in your Gmail inbox.

Rather than 47% of all jobs disappearing, I think it’s 30% of tasks within 60% of jobs, as McKinsey says. What’s even more interesting is that they estimate the timeline for this shift [to be] anywhere from 8 to 28 years. How do you prepare for this changing world? I think you prepare for it by gaining fluency in dealing with data, dealing with machines, or dealing with the human-to-human component. If machines are taking over the fact-based, repetitious aspects of our jobs, we actually level up to what we’re really good at, those very human tasks like creativity and communication. If you think about the skills needed in tomorrow’s economy, the “dark matter” of the liberal arts comes to the forefront as being our comparative advantage.
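As a rough back-of-envelope illustration (not a figure from the book or from McKinsey, and assuming for simplicity that every job involves the same amount of work), the difference between those two framings looks like this:

```python
# Back-of-envelope sketch contrasting "47% of jobs disappear" with
# McKinsey's "30% of tasks within 60% of jobs." Illustrative only;
# it assumes every job involves the same amount of work.

fully_automatable_jobs = 0.05    # ~5% of jobs fully automatable
partially_affected_jobs = 0.60   # ~60% of jobs have automatable tasks
tasks_automated_in_those = 0.30  # ~30% of tasks within those jobs

share_of_all_work_automated = (
    fully_automatable_jobs * 1.0
    + partially_affected_jobs * tasks_automated_in_those
)

print(f"Jobs eliminated outright: {fully_automatable_jobs:.0%}")            # 5%
print(f"Share of total work automated: {share_of_all_work_automated:.0%}")  # 23%
# Roughly a quarter of total work, spread across most jobs, is a very
# different picture from nearly half of all jobs vanishing.
```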

David: Final question: in your view, what makes someone a leader?

Scott: I’ll go to Drew Faust, the president of Harvard. She gave a great lecture to the cadets graduating from the 2016 class at West Point. She told them to keep a copy of the Iliad under their pillow, that the liberal arts and humanities were what helped people grapple with the hard problems. She said, “It helps you scrutinize what’s at hand even through the thick dust of danger, or drama, or disorienting strangeness.” I think that reading and empathizing with other perspectives through literature, for example, are things of timeless value. I really think that leadership is rooted in that universality of being exposed to things.

 

This conversation has been edited and condensed. To listen to the full version, click here.
