
How Some Billionaires Are Preparing For The End Of The World (Hint: It Doesn’t Include Us)


With Douglas Rushkoff, Professor of Media Theory and Digital Economics at CUNY/Queens

Steve spoke with Professor Douglas Rushkoff, who recently penned an article titled “Survival of the Richest,” which points out the dark turn much of technology has taken and how it threatens our very survival as humans.

The Coming Apocalypse

Douglas recounted being invited to speak to a gathering of billionaires and how surprised he was to learn that pretty much all they wanted to talk about was how they could best survive doomsday, what they could do to effectively protect themselves in a post-apocalyptic world where their money no longer had any value. They wanted to know how they could protect themselves in their bunker or on their isolated island. Whether it comes about through nuclear war, climate disaster, disease, or whatever, most of these men seemed resigned to a bleak, isolationist future in which their only mission was to save themselves.

Technology Vs. Humanity

The ironic thing is that many of the titans of the technology industry are now facing the fact that the very technologies that have made them wealthy are making the world a less human—or at least certainly less humane—place to live. While some of them are just planning their post-doomsday escape, others are coming to grips with the pain and devastation left in the wake of their industries. They’re becoming aware of all the little fingers that are lost in Chinese factories or of the boys sent into the mines in the Congo to get the rare-earth metals for our smartphones. They’re becoming aware of the detrimental effects on American children who are living more virtual lives than real ones, with their heads buried in their iPhones all day.

It’s interesting that many of the people who have created all this technology do their best to keep their own children away from it. They make the smartboards and iPad programs for kids to use in public schools, but many Silicon Valley parents send their own kids to a Rudolf Steiner or Waldorf school and don’t let them watch television.

Technology’s Turning Point

Douglas believes that the development of technology took a dark turn in the 1990s when there was a shift from thinking about the possibilities of the collective human imagination to the narrower question of, “How much money can we extract from people using these new digital technologies?” He noted that this idea was reflected in a Wired magazine cover story that framed the Internet purely as a business opportunity rather than an opportunity to benefit humanity. There was a time when technology was thought of in terms of uplifting humanity and creating a better world. But that seems to have been largely abandoned in favor of just generating more and more profits for companies like Facebook, Apple, and Google.

The advance of technology has been mistakenly viewed as an infinite parade of progress, but now we’re beginning to see that progress always involves a trade-off. It’s not cost-free. One of the best examples is the rise of companies like Walmart and Amazon. Certainly, these companies have benefited people in some ways, but they’ve also cost us something important: the demise of local retail stores.

An Upside-Down World Of Values

Things have turned upside down as companies have started looking at human beings as the problem and technology as the solution, wanting to use technologies to optimize human beings to the needs of an infinitely growth-based marketplace. The horse and the cart have been reversed if we take this market view of humanity, in which human beings are only valuable insofar as they can contribute to the growth of the marketplace. The market was supposed to help human beings get what they need. It’s not that human beings are here to serve the market.

Join Team Human

Douglas believes that the solution lies in refocusing on the basic value of humanity, a value that vastly transcends any amount of money that the markets can generate. He noted that in old sci-fi shows like Star Trek, humans were the heroes and being human was the very thing that made them the heroes. Even with our flaws and emotionality, humans were interesting and had heart. Humans would do the illogical, weird thing that would beat the aliens. But in more recent shows such as Westworld, humans are the problem, not the solution.

But Douglas doesn’t subscribe to that bleak view of humanity. Being human isn’t about trying to make your individual escape from the “zombie apocalypse.” It’s about realizing that whatever future we humans have, it will be together. And that’s the theme of his new book, Team Human, which is all about regenerating our bonds to effect positive societal change.

To learn more, you can read Douglas’ article, “Survival of the Richest,” here, or take a look at his new book here.

Disclosure: The opinions expressed are those of the interviewee and not necessarily those of United Capital. Interviewee is not a representative of United Capital. Investing involves risk and investors should carefully consider their own investment objectives and never rely on any single chart, graph or marketing piece to make decisions. Content provided is intended for informational purposes only, is not a recommendation to buy or sell any securities, and should not be considered tax, legal, or investment advice. Please contact your tax, legal, or financial professional with questions about your specific needs and circumstances. The information contained herein was obtained from sources believed to be reliable; however, their accuracy and completeness cannot be guaranteed. All data are derived from publicly available information and have not been independently verified by United Capital.

Read The Entire Transcript Here

Steve Pomeranz: My next guest is Douglas Rushkoff. Douglas recently penned a piece that I think you’ll find more than interesting and not a little disturbing as well. The piece is titled “Survival of the Richest: The Wealthy Are Plotting to Leave Us Behind,” and it’s in advance of his new book, Team Human. Douglas is a professor of media theory and digital economics at CUNY/Queens. Douglas Rushkoff, welcome to the program.

Douglas Rushkoff: Good to be with you.

Steve Pomeranz: So I saw this article and I would like you to begin by telling a story of your invite to a super-deluxe private resort where you’re asked to deliver what you thought was going to be a keynote speech. Please.

Douglas Rushkoff: Yeah. Well, you know—and I get these all the time, I’ve written a bunch of books on media, technology, and culture and you know, a lot of times there’ll be business people and bankers and marketers who want me to come and talk about, you know, the future of AI or the future of cloning or the future of bitcoin, and I figured it was one of those. And it was weird though. It was one of those resorts where you kind of get taken on a little golf cart out to a cabin, you know, and you stay in your own little cabin.

I mean, it was practically like they put a bag over my head before I came to this place. And then I finally got back on a little golf cart and they brought me to the place where my talk was going to be, and I’m sitting in the green room, you know, getting ready to go on. And instead of putting me on the stage, they brought five billionaires into the room and they just sat around this little round table with me and started peppering me with all these questions about, you know, should they invest in bitcoin or Ethereum or you know, what strategy should they be using for their business.

And then finally they got around to their real question, which was, you know, should I build my bunker in Alaska or New Zealand? And that was the whole thing. I spent an hour talking with them about their doomsday scenarios and preparations.

Steve Pomeranz: Questions like, how do I maintain authority over my security force after the events? So my question is, wait, after the event, after what event?

Douglas Rushkoff: Well, the event, whichever event that’s going to end the world as we know it. They’re concerned about social unrest, climate change, a disease, a nanotech accident, nuclear war, electromagnetic pulse, economic collapse, whatever it is that makes the fabric of society break down and makes it so government can’t protect people anymore or everybody out in the world is dying. And you know, the interesting thing is the real billionaires, the Elon Musks or the Richard Bransons, the big billionaires, they can get off, you know. They got rocket ships at this point.

Steve Pomeranz: Oh, God. Yeah.

Douglas Rushkoff: These are the mid-level billionaires who really are still worried about how am I going to do this? So they’re all buying land and trying to build these little eco farms that are somehow sustainable and defensible.

Steve Pomeranz: Yeah.

Douglas Rushkoff: After the event, but they are concerned. I mean the big problem they have is how do they defend themselves when their money is not worth anything.

Steve Pomeranz: They’re going to have armed guards, and so how, you know, how do they pay the guards once money is worthless? How do they stop the guards from choosing their own leader? These are the questions that they’re asking.

Douglas Rushkoff: Right, and it’s in some ways, it’s just a game. You know, it’s a game that billionaires play and even if it costs a few hundred million dollars of investment to game it out, you know, it makes them feel a little safer when they’re sleeping at night that they could hop in their helicopter if the news on CNN is bad and go to their, you know, secret hideout. But it also, it’s really to me, indicative of their picture of the world. You know, that these are guys who are really trying to earn enough money so they can insulate themselves from the reality they’re creating by earning money in that way. You know, they feel really guilty and shameful, but so you’re into self-preservation.

Steve Pomeranz: Well, not every billionaire is like this. I mean, you have Warren Buffett and Bill Gates giving away, you know, the vast majority of their fortunes to try to help humanity, and they’re not looking for ways to kind of escape and worry about, you know, how are they going to be able to survive? It’s almost like one of those zombie movies or something like this. These are apocalyptic movies. It almost seems in a sense that they’re wrapped up in this fantasy world.

Douglas Rushkoff: Yeah. I mean, the question is whether it’s a fantasy world or whether it’s real. Are they finally coming to grips with the pain and devastation left in the wake of their industries? You know, are they becoming aware of all the little fingers that are lost in the Chinese factories or are they aware of the boys that are sent into the mines in the Congo to get the rare-earth metals for our smartphones? Are they aware of the collapse of cognition in American children who are on their iPhones, living in wifi zones all day?

I mean, as they become aware of that, they start to realize, oh wow, we can’t run away from the externalities of our businesses anymore. And their response though, I mean, the response of some of them, like you’re saying, of a Buffett or a Gates, is to think, okay, we better quickly start pouring some of this money we took out of the system back into it, into ways that are going to fix things. But the other guys are thinking, no, let me just spend what I have to, to get my own hide out of this mess.

Steve Pomeranz: Well, you mentioned this idea of change and the disruption of these industries that they’re changing forever or that technology is changing. How is this any different than what we experienced during the industrial revolution where all of the things, so many jobs were lost? So many things were changed and actually, conditions are just so much better today than they were for the people back then. Same type, I mean, child labor was ubiquitous back then and not even thought of as wrong and so on. How is this different?

Douglas Rushkoff: Well, I think what’s different is back in those days we were colonizing other people, you know, we were colonizing brown people in South America and Africa and in the Indies, and now we’re colonizing ourselves. Now we’re colonizing human attention. We’re colonizing our own markets. You know, these companies are not invading foreign soil. They’re invading our own schools, our own neighborhoods, our own water. So it’s a little bit different. The world’s gotten a bit smaller.

Steve Pomeranz: Yeah. Yeah. You know, it wasn’t always this way. I mean it was, there was a time when technology thought it was going to bring humanity up and to create a better world. You’re saying that’s changed and if that has changed, when did it start to change?

Douglas Rushkoff: I mean, I think the turning point was when, really when Wired magazine framed the Internet as a business opportunity, rather than an opportunity for humanity. You know, the stock market had been going down, the late eighties had this big crash and biotech was looked at as the, you know, the potential future of the market and it had failed. And Wired said, don’t worry, the Internet can make the market do what it had failed to. They published a cover story on a book called, The Long Boom, where they said thanks to the Internet, the economy will now grow on an accelerated basis, exponentially, forever. That was their thesis.

Steve Pomeranz: Right.

Douglas Rushkoff: And to some extent, you know, it can as long as you’re willing to invest in companies like Facebook or Google and try to figure out how they can grow exponentially forever. But there’s a cost, there’s a cost to that. So I feel like that was the turning point. Really in the early nineties when the Internet went from being this kind of weird academic experiment in, what are the possibilities of the collective human imagination, to more the question of how much money can we extract from people in places using these new digital technologies?

Steve Pomeranz: It also seems to reduce the human being to a series of zeros and ones, and therefore, since human beings are much more complicated than any kind of written algorithm that can be produced, then somehow humans become flawed, and since those flaws can’t actually be corrected, the idea would be to change humans through smart drugs or implants and things of that nature.

Douglas Rushkoff: Yeah, I mean, you know, if you look at human beings as the problem and technology as the solution, then you end up wanting to use technologies to try to optimize human beings to the needs of this growth-based marketplace, and you start buying this really kind of bastardized understanding of Darwin. I mean, he never really talked about survival of the fittest individual. What Darwin was arguing for was this understanding of evolution as this collaboration between species.

That’s what he kept marveling at. At the way different species would collaborate and cooperate in order to ensure mutual survival, but if instead, we take this kind of market view of humanity that human beings are only valuable in so far as they can contribute to the growth of this marketplace, you know, then we all lose because there’s a certain point at which, if human beings are here to help the marketplace grow, then the cart and the horse are reversed. The market was supposed to help human beings get what they needed. It’s not that human beings are here to serve that market.

Steve Pomeranz: We’re going to take a quick break. My guest is Douglas Rushkoff. The book is, Team Human, and the piece that I read which led to this interview today, is “Survival of the Richest: The Wealthy Are Plotting to Leave Us Behind.” We’ll be right back.

Steve Pomeranz: I’m back with Douglas Rushkoff, Professor of Media Theory and Digital Economics at CUNY/Queens. His new book, Team Human speaks a lot to what we’re talking about today. The piece that I read was the “Survival of the Richest: The Wealthy Are Plotting to Leave Us Behind.” Welcome back, Douglas.

Douglas Rushkoff: Good to be with you.

Steve Pomeranz: You’ve been talking about these issues now for some time. How are you now perceived? I guess there’s kind of a group-think that goes on in the technology world that they’re changing the world and they’re kind of on the side of good which is being rewarded through all these wonderful riches. How is someone who’s standing there going, “No, I don’t see it this way,” how are you being treated?

Douglas Rushkoff: I mean, I’m being treated pretty well. On a certain level, they agree that what they’re doing is not good, most of them. A lot of them have been through the computer program at Stanford. They took specific classes in how to manipulate people. There’s a department there called Captology, which is about how to use behavioral economic tricks through technology to get people to do stuff, to addict kids to social media or make people feel bad if they don’t check their Facebook every day, all the little tricks of the trade.

The people who were really the best at that have kind of seen the error of their ways at least halfway. Now, they do want to change things. So they’re talking about how can we be more humane. How can we develop what they call more humane technology, so that sort of technology that treats people better? But when I keep hearing the phrase humane technology, I think of cage-free chickens. You know, that they’re going to-

Steve Pomeranz: Yeah, right.

Douglas Rushkoff: … develop technology that treats us better all the way to the slaughter.

Steve Pomeranz: I mean, I don’t know the veracity of this, but I’ve read that people in Silicon Valley, many of them, are reducing the amount of technological exposure their own children have because they’re seeing the damage there. Is that something you’re seeing?

Douglas Rushkoff: Yeah. I mean, but most of the people I know in the industry, they make stuff for our kids but they don’t let their own kids use it. They’ll make the smartboards and iPad programs for kids to use in public schools and they’ll do sales pitches and TED Talks and talk about all the great metrics. But they send their own kids to a Rudolf Steiner school or a Waldorf school and won’t let them watch television. So there’s a bit of a double standard there.

Steve Pomeranz: Yeah, I wonder what that says. You know, they’re taking their cue from Elon Musk, this idea of colonizing Mars or Peter Thiel reversing the aging process or Ray Kurzweil, who has been said to … there’s talk that he’s uploading his mind into Google supercomputers or something of that nature. I think one thing that they forget is this idea, let’s say, about colonizing Mars, is that’s been tried here on Earth. It was called the Biosphere.

Douglas Rushkoff: Right. I know. Steve Bannon, I think, was the CEO there, which is a whole other interesting conversation. But, yeah, the idea that Elon’s going to be able to get to Mars and build a bubble and somehow stay alive there. If we couldn’t do it with billions of dollars and a bunch of people and scientists here on Earth, it seems unlikely that they’ll be able to do it on Mars in his lifetime. Really, I don’t think that there is a viable escape route for humanity in the foreseeable future. I feel like the damage we’re inflicting on the climate, on our own environment, is building faster than our ability to build an escape-

Steve Pomeranz: Oh, the escape, yeah, yeah. No, I hear you. I hear it.

Douglas Rushkoff: … an escape program.

Steve Pomeranz: Well, when you watch TV … I know a lot of people like these zombie shows. I really can’t even watch them for two seconds. But this post-apocalyptic view of things where the people that are alive are no better than the undead, as you’ve written. And even in Westworld, the science fiction show where robots run amok, the ultimate reveal in the second season is what the robots learn …

They become smarter than us, but they really don’t even want to be robots. They just want to be asleep in a computer somewhere. Even there, any correlation to humanness is bad and not something that anybody wants, even computers.

Douglas Rushkoff: Yeah, it’s really interesting. I mean, you look at old sci-fi like Star Trek and human beings are kind of the heroes of the universe.

Steve Pomeranz: Yeah, that’s true.

Douglas Rushkoff: Even with our flaws and our emotionality, compared to the Vulcans, at least we were interesting and we had heart. We would do the illogical weird thing that would then beat the aliens. Now, human beings are the problem in any of these shows. Zombie shows, basically, yeah, they’re about the fact that humans are no different from zombies. We kill, they kill. It’s the same. Or Westworld, that robots are better than us.

When I debate these folks, the folks that you mentioned who believe that we should upload our technology, our minds, into the web or into the cloud, I argue for the humans, that humans are weird. We’re special. We’re wonderful. There’s things about us that we can’t just isolate and get onto a CD-ROM just yet. They say to me, “Oh, Rushkoff, you just say that because you’re a human,” like it’s some form of hubris. That’s what got me writing. That’s what got me writing the Team Human book. It’s almost a facetious idea. Yeah, I’m on Team Human.

Steve Pomeranz: Here’s the problem. The problem is if we load ourselves up into the cloud, we’re all going to forget our passwords and won’t ever-

Douglas Rushkoff: Yeah. Well, once you’re there you only need the password to get back out.

Steve Pomeranz: We’ll never get it back. We don’t need the password anymore, okay. You see, you have to be rational with me, here. All right, so I want to go back to this idea. After the event, these people want to figure out how they’re going to be protected from the angry mobs. They knew that armed guards would be required to protect their compounds.

So they wanted to know, number one, how would they pay the guards once the money was worthless? How would they stop the guards from choosing their own leader? The billionaires considered using special combination locks on the food supply that only they knew or making the guards wear disciplinary collars of some kind in return for their survival, or maybe building robots to serve as guards and workers if that technology could be developed in time. All of those seem incredibly flawed. What did you say to them?

Douglas Rushkoff: I told them the best way to earn the loyalty of their security force in the future is to be really nice to them now. I made the joke, I said, “Why don’t you pay for their kids’ bar mitzvahs or something?” You know?

Steve Pomeranz: Yeah.

Douglas Rushkoff: Not that most of these special forces had bat mitzvah-age daughters or anything, but my point was if you did pay for someone’s kid’s bar mitzvah, then 20 years from now, when they’re in the bunker with you, they’re going to see you as a friend, as someone who took care of them, as someone who was there. But none of these guys want to take care of these people. They look at even the security force as the other, as the masses, as the people on whom they’re operating. As long as you look at people the way these technology-type people do, as long as you look at people as machines to operate, you’re never going to connect with them. You’re never going to really earn their true loyalty. At that point, what’s it even worth being in a bunker with a bunch of people who hate you?

Steve Pomeranz: Yeah. Wow, interesting stuff. Douglas Rushkoff, Professor of Media Theory and Digital Economics at CUNY/Queens. The book is Team Human. The article was “Survival of the Richest: The Wealthy Are Plotting to Leave Us Behind.” Not if I have anything to do with it, Douglas, I’m telling you right now.

Douglas Rushkoff: Excellent.

Steve Pomeranz: To hear this and any interview again, don’t forget to come to our website, which is StevePomeranz.com to join our conversation. While you’re there, sign up for our weekly update where we talk about all of our live events and any important topics we’ve covered that week, straight into your inbox. That’s StevePomeranz.com. Douglas, once again, thank you so much.

Douglas Rushkoff: Thank you.