How the Mega-Rich Plan to Outsmart Doomsday

Have you ever found yourself sitting around a table at a luxury resort with five mega-rich strangers who want to know where you think they should build their doomsday bunkers? Absurd as it may sound, that actually happened to media theorist Douglas Rushkoff. Today on the show he explains why the 0.01 percent are obsessed with escaping climate change, global pandemics, political upheaval…and us.

Listen to Douglas’s appearance on the Next Big Idea podcast below, or read a few key highlights. And follow host Rufus Griscom on LinkedIn for behind-the-scenes looks into the show.

“Alaska or New Zealand?”

Douglas Rushkoff: I’ve written these books on media and technology and society, and because they’ve got some tech focus, I’m frequently mistaken for a futurist. So I sometimes get invited to do high-paid talks for bankers and technology investors. They’re not really so interested in my stuff, which is, like, How do we build technology that helps human beings make a better world for ourselves? They wanna know what to bet on.

I got this mysterious invite that was for a huge amount of money—it was basically a third of my annual salary as a professor at CUNY—to go out and do a talk about the digital future. And I had assumed, because of the amount of money and some of the words around it, that it would be a couple of hundred bankers and hedge funders in some room—you know, the way you usually do a talk. But I was in the green room at this crazy resort, getting ready for the guys to come and put the mic on me and take me out to the stage, when these five men in golf outfits, basically, came in and sat around this table and started asking me questions.

And it turned out that was the talk. It was just five dudes.

Two of them, I documented later, were definitely billionaires. I don’t know about the other three, but close enough. They started out asking me all those kind of typical questions: “Ethereum or Bitcoin?” “Virtual reality or augmented reality?” Then eventually they asked, “Alaska or New Zealand?” and I thought, Oh, man. They wanted to know where to put their bunkers. And they started asking all these questions about what they were calling the “Event”: the biological, climate, economic, or nuclear disaster that makes the world unlivable and requires them to get outta town. And the question that we ended up spending almost the entire hour of the discussion on was: How do they maintain control of their security force after their money is worthless?

“These guys genuinely understand that through their business activities they are creating such devastation that the best they can hope for is to earn enough money to escape the damage they’re creating by earning money in that way.”

They’ve all got these little places they want to go, and they’ve hired, in advance, some Navy SEALs, or the equivalent, to fly out at the last minute and protect their compounds from— I guess, from us.

Rufus Griscom: With our pitchforks.

Douglas: Yeah. But they understand that if it’s long term, why are the Navy SEALs gonna protect this dude and his family instead of just taking over the place? So they started getting into Walking Dead–like scenarios. “Do I get shock collars on these people? Or do I build a safe that has all the food, but I’m the only one who can access it?”

Rufus: Did someone really propose a shock collar?

Douglas: Well, the equivalent. In other words, you don’t put the shock collar on them, but you have an implant or something in order to have access, but then the implant could be used to control where people are or what they do.

The whole thing left me with this crazy feeling, this sense that these guys genuinely understand that through their business activities they are creating such devastation that the best they can hope for is to earn enough money to escape the damage they’re creating by earning money in that way. It’s like they’re trying to build a car that can drive fast enough to escape its own exhaust. Only there’s nowhere to go. There’s nowhere left. The places that we thought would be safe from climate change turn out not to be. The places that we thought disease couldn’t go turn out to get disease. And angry humans can get most anywhere. I feel like they understand the irony of the fact that they are creating the conditions that are requiring this, but they’re not aware of the fact that their kind of techno-solutionist fantasies are not going to save them.

The threat of civil war may be real, but the isolationist response is the wrong one.

Rufus: It feels, to me, like two things could be true. The first is that I suspect billionaires are more paranoid than non-billionaires. Wealth isolates people, erodes empathy, and potentially turns people into assholes. You have to work really hard as a billionaire to offset that inclination. The other factor is that many of these billionaires made their fortunes designing complex, interconnected computer systems, and they have deep knowledge of how vulnerable those systems are and how devastating the collapse would be.

I think it’s worth spending a moment on the question of how irrational and fantastical these fears are versus how rational and pragmatic they might be.

We had Ray Dalio on the show a few months ago, and Ray has estimated that in the next decade there’s a 30% probability of civil war and a 30% risk of war with China. So civil war is a big one. But then you have attacks on the power grid, electromagnetic bombs that take out the internet. We could all imagine, post-COVID, a much more severe biological disease, whether it’s manmade or naturally occurring. And then there’s global warming, which strikes me as probably not a sudden event: it’s probably more of a gradually boiling frog situation.

What’s your personal assessment of how much all of us should be concerned about these probabilities?

Douglas: The more division of wealth we have—the more fear, isolationism, and alienation—the more brittle our systems become. I sometimes think the very things that we should or could be doing to prevent some of the worst scenarios are the same things that we would do to survive them if they were to happen. If shit’s gonna hit the fan and there’s no long distance supply chain of food, then wouldn’t it be great to have a lot of local agriculture around you? If there’s gonna be some kind of civil war, then wouldn’t you like to be friends with your neighbors? So, in some ways, the calculation that they’re making is the wrong one.

The world might be better off if Big Tech founders had made different decisions.

Rufus: When I graduated from college in ’91, I remember having this perception that you could either do something interesting or make money. Those were the two choices. And because those were the two choices, the starving artist had dignity. But in the decade that followed, we had these examples of the Google Boys, and others, doing intellectually interesting work that was changing the world in ways that we thought at the time were for the better, and then accidentally making billions of dollars. It collapsed this polarity that you either do something interesting and beautiful or you pursue money.

Douglas: That’s interesting. I remember I got a $40,000 advance for my first big cyberculture book. And I was ostracized by my friends. “Rushkoff, you got $40,000 for two years of work? My God, you sell out!”

“The more division of wealth we have—the more fear, isolationism, and alienation—the more brittle our systems become.”

What you’re saying about the money is kind of true there. There was the accidental money. But then there was the pivot money. Take Google. Two kids in their Stanford dorm room come up with a new way to do search that’s gonna bring down big, bad Yahoo. They come up with an algorithm that’s gonna use people’s links to figure out everything. The people’s net! But then they’re only making a couple of billion dollars a quarter, and Sequoia, who put in their money and want not just a 100x but a 1000x return, goes to Sergey and Larry and says, “Kids, what else you got?”

And Sergey and Larry say, “Well, there is all this data that people are leaving behind that we’re just kind of throwing out. Maybe we could use that to figure—”

“Ah-ha! There you go.” And thus the surveillance capitalism nightmare was born.

I go to these business schools, and I say, “Look, I think the way to be successful and good in the future is if you lower your sights. Can you be satisfied with just tens of millions of dollars? If you could somehow bring yourself to be okay with ending up with just tens of millions of dollars, you can take a very different, much more fun, loving, circular approach toward business. So I invite you to consider what’s enough. Could you somehow be okay with, say, $50 million?”

Rufus: We can imagine a scenario where Mark Zuckerberg was a hero. Imagine if he had made the decision to just be all about privacy, great user experience, looking out for the psychological health of users, protecting democracy. This might be a business that would be worth a few hundred million or a few billion, as opposed to hundreds of billions, but you have to think that he would be a radically happier human being had he and his team made those decisions.

Douglas: You would think. But for two reasons he didn’t. First, he’s now saying he’s gonna give back 99% of his money, to which I always say, “What if you had just made Facebook 99% less abusive?” Imagine that! But the other problem with Zuckerberg is that his role model is Augustus Caesar. He’s even got his hair cut like Caesar. So if you see yourself as the emperor of humanity, then it’s really not gonna work.

“If we start businesses with the mindset that it’s okay to reach a certain size and then stay there, just like any system in nature, it opens up such new and different and wonderful possibilities.”

Rufus: You talk, Doug, about building regenerative systems and being comfortable with economic erosion, if that’s necessary, to have more collaborative, reciprocal economies.

Douglas: The sad thing about so many technologists I know is they understand digital operating systems and how arbitrary they are and how easily they could be disrupted and shifted. They understand particular markets and how easily they can be disrupted. But they accept the underlying principles of central currency and growth-based economics as if they were conditions of nature. And they’re not. There are ways to do business that don’t involve that.

What we’re looking at is an economic operating system that was developed in the 13th century by royals who were concerned about the rise of the middle class, so they created a currency—a monopoly currency—that you have to borrow and pay back at interest. And because you have to pay back at interest, you’ve got to grow. That spawned colonialism and mining and extraction and a dominator economy. It led to all sorts of good things along with subjugation of lots of people and a lot of bad things. But it’s not the only economic model. Not every business needs to be required to grow in order to achieve its purpose. If we start businesses with the mindset that it’s okay to reach a certain size and then stay there, just like any system in nature, it opens up such new and different and wonderful possibilities.

To enjoy ad-free episodes of the Next Big Idea podcast, download the Next Big Idea App today:
