Missing Out on the Tiger Lottery
“The tiger lottery” is an old expression that serves as a cautionary tale. It means what it says. It invites us to contemplate the tragicomic situation of entering a lottery whose main prize is a tiger—a unique, beautiful prize that will surely shred you to pieces. It has the thrill of gambling and winning. It has the downside of bringing about an inconveniently painful death. Yet you gamble. Recently, Demis Hassabis, CEO of Google DeepMind, and Dario Amodei, CEO of Anthropic, sat down at an event hosted by ‘The Economist’ and were asked, among other things, whether they feared becoming the Oppenheimers of our time. They spoke about how every decision they make is like balancing on the edge of a knife, yet, at the same time, continued peddling their product with an air of patriotism, stating that if they do not build AI fast enough, authoritarian countries could win. A thousand tiny violins then played in unison.
For all the ludicrousness of the situation, their words carried an undercurrent of the fear gripping the major world powers: the fear of missing out. According to a PwC report on the macroeconomic impact of AI, the technology is estimated to contribute up to 15.7 trillion dollars to the global economy by 2030, a 14% increase over the baseline without AI—more than the current output of China and India combined. This means that any nation not going in guns blazing on AI will forgo significant potential GDP gains. In turn, given the ubiquity of AI, missing out will translate into an arrested ability to innovate and, thus, to produce high-value goods, throwing such a country into full dependence on foreign AI solutions.
Then there is (to do justice to Hassabis and Amodei) the issue of national security and geopolitical influence. Although we would all like the countries of the world to link arms in a moonlit glade and chant verses from Whitman, that is simply not how the world works. AI supremacy translates into geopolitical influence and, thus, into a matter of national security. Consider defense for a second. Modern warfare is AI through and through—from autonomous drones to cybersecurity and intelligence analysis—and relying on foreign AI for any critical infrastructure makes a country an easy target to disrupt and hold hostage.
Third, there is the topic of human capital. As many jobs become automated, a country losing the AI race risks widening skill gaps while its industries transition to processes driven by (foreign-sourced) AI, and the top talent that could have supported domestic AI development emigrates to AI hubs, creating a downward spiral. And this is not something that will occur in the distant future or at a slow pace. According to a report by the organization-that-shall-not-be-named, nearly 75% of the major companies in the world expect to adopt AI across the board.
To top the horror ice cream with screaming sprinkles, a lack of domestic AI capabilities can leave a country dependent on tools from a few dominant corporations. While this goes hand in hand with national security and economic competitiveness, it also brings to the fore the risk of sensitive national and individual data being processed and stored abroad. As the recitals of the GDPR note, when personal data moves across borders outside the Union, the ability of individuals to exercise their data protection rights decreases, and authorities are hindered from pursuing complaints or conducting investigations abroad. The EU can afford the luxury of strong data protection laws because it is the second largest market in the world, which gives it the leverage to enforce its regulations. Most countries do not have that power.
Several other consequences of missing out can be inferred from these examples. For instance, lacking even a minimal AI industry prevents a country from making its voice heard in the international fora that set ethical standards and frameworks. With few or no leading researchers making groundbreaking progress in the field, and no respected participants in the industry, such countries are condemned to adopt the standards of the dominant AI powers at the cost of their own legal traditions, culture, and national priorities. There is also the withering of local tech entrepreneurship, and so on. Taking all the above into consideration, we can safely conclude that missing out on AI will create “digital colonies,” much like the historical dependencies on industrial technologies. At the top, the arms race may be about winning; in the middle and at the bottom, it is about survival.
I am not a fan of Varoufakis, but I will borrow his term techno-feudalism. This is a system in which big tech firms are akin to feudal lords controlling land; in this case, the land is the platforms, data, and infrastructure that billions of people rely on for all intents and purposes. And, like feudal lords, they extract rent: through subscription fees, a cut of every transaction, or advertising, rather than through peasants paying to work the lord’s land. In this system, serfdom consists of users handing over their data in exchange for free services, only for that data to generate revenue and deepen the entrenchment of the techno-feudal lords. It follows, obviously, that the bigger the platforms get, the higher the barriers to entry; and the peasants, at the mercy of centralized wealth and power capable of manipulating world politics, lose labor rights as they are condemned to a gig economy.
Sure, the idea is both sexy and full of holes through which it can be criticized. However, I believe many people sense at skin level that the furnace is getting warmer, and these outcomes cannot be dismissed with a sleight of hand. We are surrounded, then, by four walls. One, techno-feudalism; two, a major global power subjugating everyone else thanks to its AI supremacy; three, human extinction; and four, a beautiful utopia where we all live happily ever after. I will not place a bet on any of them. Whoever tells you that one of those outcomes is guaranteed is lying. It is a variation of Pascal’s wager, but at various levels and tweaked in a way that... oh, well, let’s just say there is a 75% chance that we are not going to like it. Utopia may be around the corner, sure, but the odds are 1 in 4.
It is the tiger lottery—and we are all playing, whether we bought a ticket or not.