The future cyber economy

This is an edited excerpt from the New Scientist report, Gamechangers: Automation and Artificial Intelligence.

" The peer-to peer economy in just a few years has created a virtual hotel empire with more rooms than the largest hotel chain in the world."
Sally Adee, Editor, GameChangers

In 1959, Marvin Minsky and his colleague John McCarthy founded the MIT Artificial Intelligence Project. The men were convinced computers could be made as smart as humans – and then smarter.

McCarthy thought functioning AI systems were only a decade away. Around the same time, another researcher, Herbert Simon of Carnegie Mellon University, predicted a computer would be the world chess champion by 1967.

McCarthy and Simon were wildly wrong – so much so that in 1972 Hubert Dreyfus, a philosophy professor at the University of California, Berkeley, wrote What Computers Can’t Do, citing “game playing, language translating, problem solving, and pattern recognition” as key things humans can do and that computers can’t.

Dreyfus was proved wrong in 1997, when IBM’s Deep Blue beat the world chess champion, Garry Kasparov. If you try to plot a straight line from 1959 to 1972 to 1997 and extrapolate, it won’t make sense. Either Dreyfus was mad, or Deep Blue didn’t really beat Kasparov. In fact, it’s the line that’s wrong.


Technological progress in AI seems to track (and may even be faster than) Moore’s law, which is an exponential curve – the sort that plods along, rising slowly, until it suddenly starts shooting skyward like a rocket. It’s the slow start that’s deceptive.

Think of the ancient puzzle of filling a chessboard with grains of rice. On the first square, where a castle stands, you put one grain; on the neighbouring knight’s square, two grains; on the bishop’s square, four. Traverse the entire board, each time doubling the number of grains.

In the first decade or so of Moore’s law, computer performance doubled annually. So if you equate computing performance to grains of rice, and start with a single grain in 1959, by 1972 you’d have about 8000 grains, or enough to serve lunch for four.

In 1996, you’d have about 4 million meals. By 2019, you’d have one trillion. In 2155 – which is when you’d fill the last square of the chessboard – you’d have 18 million trillion grains of rice, roughly 1,000 times the world’s current annual harvest.
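The early numbers in this doubling story are easy to check. A quick sketch, assuming (as the text does) a single grain in 1959 and one doubling per year:

```python
# One grain of rice in 1959, doubling every year.
def grains(year, start_year=1959):
    """Grains on the chessboard square reached in `year` under annual doubling."""
    return 2 ** (year - start_year)

print(grains(1972))  # 8192 -- the "about 8000 grains" of 1972

# The whole 64-square board holds 2**64 - 1 grains in total:
# about 1.8e19, the "18 million trillion" figure.
print(2 ** 64 - 1)
```

The later milestones in the text (2019 onwards) appear to fold in a slower doubling rate as Moore’s law eases off, so they don’t follow from this annual-doubling sketch alone.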

Even if Moore’s law levels off somewhat, it’s easy to see that what looked like a lack of significant progress in the first 30 or 40 years of AI was really time spent laying the necessary groundwork for the field to take off. That’s where we are right now.

We have systems that do pretty well at games (one just beat the world champion at Go, a much harder game than chess); language translation good enough for us to get around in a foreign capital; and software that can beat most humans at data analysis and optimisation – all of the things computers couldn’t do in 1972.

By 2109, we’ll have 1.2 billion trillion trillion grains of rice. We may not even be counting the rice correctly.


First, there’s the peer-to-peer economy, which, in just a few years, has created a virtual hotel empire with more rooms than the largest hotel chain in the world. Are we accurately capturing the contributions of Airbnb – and Uber, Lyft, Lending Club and all the rest?

Then there’s the ever-increasing number and value of goods and services that are flat-out free – Wikipedia, Google Search, Linux, all those free apps on your phone.  

Consider one free service, WhatsApp. In September 2014, the economy benefited only from the salaries of its 55 employees. In October of that year, Facebook completed its purchase of WhatsApp for $US22 billion, making those 55 employees the most productive the planet has ever seen.

It’s also worth keeping in mind that GDP is an increasingly limited tool for measuring productivity, let alone a healthy society. For example, per capita GDP might go down instead of up if we ever get the upper hand in fighting cancer, diabetes and dementia, because spending on treating them would fall.

Quality of life is often ignored in economic analysis. The inflation-adjusted cost of a top-of-the-line television has declined twelvefold since 1948. But that doesn’t take into account that the 1948 set had a 16-inch screen and displayed only two colours.

All told, traditional productivity statistics may be losing their relevance as a manufacturing and services economy morphs into a cybernetic one.

Sally Adee is technology feature editor at New Scientist

The views and opinions expressed in this communication are those of the author and may not necessarily state or reflect those of ANZ.
