
How Collective Intelligence Will Drive The World’s Greatest Innovations


Beth Comstock is the Vice Chair of GE and the leader of GE’s efforts to accelerate new growth. Every month, she leads a discussion with a game-changing author and thinker in GE’s Changemaker Book Club, streamed live on Facebook. Recently, she joined Joi Ito, director of the MIT Media Lab and co-author of Whiplash: How to Survive Our Faster Future, for a conversation on the effects of collective intelligence and AI on business leadership, and why we shouldn’t have to ask for permission.

This conversation has been edited and condensed.

Beth: You and I are fans of this notion of emergence. What does it mean to you, and why do you pick emergence as something to think about?

Joi: We have a mutual friend, Steven Johnson, who wrote a great book called Emergence. My favorite part [in the book] is where he describes the ants and how each individual ant doesn’t understand the whole of it, but as a colony, they’ll make sure that the graveyard is as far away from the food and as far from the nest as possible. With emergence, there’s a kind of intelligence that gets created and [it becomes] something that you can’t really imagine being inside of. Whether we’re talking about large companies, social systems, or science, we’re realizing that, especially in complex, self-adaptive systems, there’s a collective intelligence that emerges. You can either embrace it or resist it.

Beth: How do you use this in your work at the Media Lab? Are you modeled in an emergent way?

Joi: I think so. It was already like this when I got here, but we have 400-something projects, and none of them have asked permission. I’m kind of like a gardener. I compost here. I water plants there. I move the lights around. Or maybe I’m like a disc jockey where I can change the music, and I can fiddle with the environment, but basically, everything that happens here happens because people are doing whatever they want to do.

“Whether we’re talking about large companies, social systems, or science, a collective intelligence emerges. You can either embrace it or resist it.”

Beth: You use that word “permission.” One of the challenges in most established organizations is [that] people feel like they have to have permission to do something, but in an emergent organization, you do it because you think it’s the right thing to do, or it’s what you need to do. How do you think about permission? Does permission have to be given, or do you expect people to grab it?

Joi: If you go all the way back to [the founding of] the internet, the thing that made it great was that, because of the simplicity of the protocol, people could build things without asking permission. One of the biggest problems [is when] you’ve unleashed the capacity to be super-innovative, but you’re holding it back by the cost of permission.

Beth: That’s an interesting way to think about it—holding you back by the cost of permission.

How do you think of yourself, and how have you thought about building your path with all the different things you do?

Joi: I had, I guess, what you would call a learning disability. I really wasn’t able to focus on anything that I wasn’t excited about, so I did very poorly in school. I got kicked out of kindergarten.

Beth: You did not.

Joi: I did. Whenever the teacher wasn’t looking, I’d run outside, and they finally told me not to come back, and I [eventually] dropped out of college, but what I realized is I learn through doing. I started doing all kinds of jobs. I worked in a pet shop. I worked in Hollywood. I worked as a disc jockey. I worked wherever, and I learned a lot by talking to people and doing different things. It took me a long time because I was trying to come up with a unified theory of the world. How does everything work, and how do all these things fit together? Mainly, my value contribution was connecting people in different disciplines.

It turns out that that’s very valuable and useful, especially when you start seeing these silos, and right now, I think one of the problems that we have, whether we’re talking about academia or corporations, is there’s so much specialization and so much value locked up because of the silos. If you’re able to translate between them, you can unlock the power.

Beth: You talk in the book about having almost a superpower of peripheral vision; did you always appreciate that?

Joi: You can’t get lucky unless you identify the opportunities, and you’ll always miss the opportunities if you’re too focused, because focus is on making sure that the expected results are achieved, and luck and serendipity are about trying to take advantage of those things that are unexpected. There’s a great New York Times article about how half of all inventions were unexpected, and so if the person didn’t have peripheral vision, they wouldn’t have even noticed that it was invented.

Beth: You’re an entrepreneur and a venture capitalist. As an entrepreneur, you have to be super-focused. How do you use that peripheral vision and advise yourself or the companies you’re investing in to stay focused but look for opportunities?

Joi: You need to train that muscle. That’s why my first ten years of entrepreneurship all failed. I sucked, but I was trainable, and so that part, the focused part, I had to train. Everybody’s different, so the trick is to figure out what you are. Make sure you’re doing whatever you’re good at, but then train a few of those muscles that you need in order to be able to switch back and forth.

[Audience Questions]

Beth: Most experts agree it’s neither genes nor environment that give people an edge. It’s their mindset. Do you think adults can change their mind from fixed to growth, or is it genetic?

Joi: That’s a hard one. That’s the nature-versus-nurture question, and it’s also the question of whether adults can learn. I believe people can always change. You definitely have initial settings, [though.] My sister and I had very similar opportunities, but she went through school, got straight A’s, and was very structured, and now she’s entering this unstructured entrepreneurial phase. She studied anthropology and education, then looked over, saw me, and started studying weird people like me, whereas I went the other direction: I started in a very unstructured way, but now I’m operationally in charge of an institution. People can change, but the most important thing is passion, so you often learn the things you need in order to fulfill that passion.

Beth: You talk about the principle of disobedience over compliance. In established organizations, the idea of being disobedient [is] hard for many people. What have you learned about disobedience in more organized structures that the rest of us can learn from?

Joi: Disobedience is different from disrespect. You don’t get a Nobel Prize by doing as you’re told, and you don’t discover new things by just memorizing what your professor said. You have to think, you have to question the status quo, and that’s true in business and in academia.

With the Industrial Revolution, you wanted soldiers and factory workers and white-collar workers to be relatively reliable, obedient, and fungible. In that case, you wanted everybody to be obedient. Today, with machine learning and the creative work that we need to be focusing on, we need people to be thinking for themselves.

“You don’t get a Nobel Prize by doing as you’re told, and you don’t discover new things by just memorizing what your professor said.”

Beth: How do you encourage the colleagues that you have at MIT to be disobedient? You talk about MIT Media Lab as being a hacker space already. You’re hacking education. You’re hacking the university…

Joi: Culture is the main thing. It’s important to sincerely encourage it and love it, so Reid Hoffman, one of our advisory council members, is sponsoring a $250,000 disobedience prize for socially impactful disobedience. This time, I would be nominating the former attorney general and the Department of Energy people who didn’t want to give up climate negotiators’ names.

Disobedience can happen in all kinds of places, and when you go back to the civil rights movements, Gandhi and Martin Luther King, they were all very disobedient. Whether you’re talking about civil rights or science or even companies that are going down the wrong path, it’s the duty of an employee or somebody to say, “I don’t think that’s the right thing.”

Beth: Let’s talk about minds and machines. You’re certainly leading the charge. How close are we to developing a machine that has a mind, consciousness, and a mental state?

Joi: There are a number of breakthroughs that need to happen before we’re even close to that. It’s possible that we will have a conscious machine, but it’s much more likely [that we develop] what is called extended intelligence. Artificial intelligence won’t be some monolithic oracle, but it will be all kinds of systems that get built into our network that [not only] augment us as individuals but augment us as a system. [It’s] not a separately conscious system, but something that’s part of our societal consciousness.

Pedro Domingos, in his book The Master Algorithm, said it in a great way. He said he’s not afraid of super-intelligence coming in and destroying the world. He’s afraid of stupid intelligence that’s already taken over the world. That’s closer to my fear.

Beth: Can you imagine that tech can help us augment our creativity?

Joi: Machines will augment us in a variety of ways, and augmenting our creativity will be a great piece [of it]. The really interesting question is how creative machines themselves can be. I think it’s going to be a tremendously valuable thing. Also, augmenting creativity can mean a lot of things, but it will release us from a lot of non-creative things to spend energy on creative things.

Beth: Do you believe everyone is creative?

Joi: I think everyone is creative. I love the book by the Kelley brothers called Creative Confidence. In that book, they argue that creativity is beaten out of you as you grow up. I often do this poll and ask how many people felt creative when they were five years old. Most people did. They painted and stuff. [As adults,] most people don’t feel creative. Our educational system really does pound it out of you.

Beth: How can we ensure humans stay relevant in the near future? Are we advancing or are we just creating the beginning of the end?

Joi: Morality has a huge role. One of the problems with artificial intelligence right now is that the people who can actively contribute to the field are small in number, and they tend to work in large tech companies. The typical ethicist or sociologist can’t really understand machine learning well enough to be able to get into the creation of those machines. What we’re trying to do at the Media Lab is make machine learning more understandable to non-computer scientists, and also try to get the computer scientists to listen to the other fields, because one of the problems that some computer scientists face is this group of people who believe in the singularity.

This is the notion that at some point, we’re going to get computers that are so smart that all this messy stuff, like morality and ethics and society, will become irrelevant. The dystopian view is that humans become irrelevant. The optimistic view is that we transcend our bodies and we become this super-consciousness. We live forever.

That’s misguided. There’s a lot of stuff in ethics and morality that needs to be built into the machines in order for society—even a society of machines—to be resilient, vibrant, and to have meaning. You can’t computer science your way through that. There’s a tremendous amount of knowledge out there, and also, there are natural systems, biological systems, that have a tremendous amount of information and value that we need to be interfacing with.

“Artificial intelligence won’t be some monolithic oracle, but it will be all kinds of systems that get built into our network that [not only] augment us as individuals but augment us as a system. [It’s] not a separately conscious system, but something that’s part of our societal consciousness.”

A lot of this at the academic level is bringing these disciplines together. At the business level, it’s companies. GE is a great example because you’ve got machine-learning capability inside of a company that does all this other stuff. My fear is when you have companies that don’t have any understanding of machine learning, and then you have companies that are doing just machine learning.

Beth: We’ve moved from a world dominated by simple systems to a world beset and baffled by complex systems, yet you’re very optimistic. You make the case that we’re in the midst of fundamental change, yet humanity and people are so adaptable. How do we keep being adaptable?

Joi: We’re inherently adaptable. A lot of things that we’re afraid of now will turn out actually to be good.
