Today the data in the cloud is basically in shipping containers, rows and rows of servers. And one of the curious things is cloud providers actually ask their technology people to get rid of the red flashing lights on these servers, because those lights are just another thing that can go wrong. You try to dumb down the technology so it doesn’t go wrong.
With the cloud you don’t have to be interested in the technology, so you can concentrate on the information. The cloud gives you new services and information, and it is being created at pace. So it makes sense to connect the system of record to the cloud.
Now regulators were initially not comfortable with the cloud, but the danger is that the old data systems become islands, in a bad sense. Facebook is updating in the cloud all the time. Banks with their old systems are doing it much less frequently.
The critical decisions in this new architecture are all about the discipline of knowing what data is required. So then it becomes a question of how you use information.
With the cloud you do have to give up a bit. You have to be less bespoke in your needs. But you can do some things really fast because you don’t have to deal with the legacy systems. A lot of the hassles come from cutting corners to try to get to market at the right time because you have to deal with the legacy system.
The question then becomes do you continue to run the core or create a new service?
Think of what’s happening in terms of clothing. In the past you had bespoke tailors like the men’s outfitters in London’s Savile Row. Each item was made specifically for an individual – but it was expensive.
Then you have brands like Gap, which manufacture on a massive scale but in fixed sizes and styles. But the clothes are much, much cheaper.
What the cloud, big data and the information it contains can do for customers is the equivalent of bespoke tailoring at Gap prices. This is the opportunity of cloud business models.
At Microsoft, for example, they aspire to be able to write code and have an application available 24 hours later – solving a task which in the past might have taken months.
Now I know there are challenges for this new operating environment. Regulators have concerns about data security; different countries are also concerned about access by other governments to their citizens’ personal data. These are challenges, and they’re not simple, but my sense is there is a shift taking place and the risks in the old systems are now being recognised as well.
After all, security in any organisation depends on the quality of the people running security. For a company like Microsoft, if they are offering these services they can’t afford not to be more secure and they have the resources to devote to improving security. Because this is their speciality, they have more skill than any individual company could hope to have.
Take the global financial crisis as an example. No one understood where the risk really was, and it ended up being somewhere no one expected. The challenge for security is to understand and manage the risk.
The big challenge for everyone – existing players, regulators, governments – is managing disruption. That’s one of the lessons of Kodak, but it is also a lesson with Bitcoin: how do you manage these disruptions?
What we do know is the cloud offers fantastic operating models which are fast and flexible. The challenge is to have virtualisation that is not dependent on location.
Today in a bank like ANZ we have something like a thousand people in technology running the bank and the ATMs, and about 4,000 building the technology – and that technology is very complex.
With the cloud the engineering, where the complexity is, becomes the application, not the technology.
In the past everything was built into the technology. But in the future the application will be “change aware” – and that means if the infrastructure fails, the application doesn’t fail. What I do need to be able to do is manage the application without an outage; that’s where we need IT to be change aware.
When you are in this world you can change or develop your application very quickly based on feedback, and you don’t have to worry about which version of the application is running – you can have multiple versions of code running with no problems. So in the future infrastructure departments will be much smaller.
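The idea of several versions of the same code running side by side can be sketched as a simple version-routing layer. This is a minimal, hypothetical illustration – the handler names, fee logic and router are all invented for the example, not a description of any bank’s or vendor’s actual system:

```python
# Minimal sketch of side-by-side versioning: several versions of the
# same service run at once, and a router picks one per request.
# All names and numbers here are illustrative.

def handler_v1(amount_cents):
    """Original behaviour: flat 200-cent fee."""
    return amount_cents + 200

def handler_v2(amount_cents):
    """Newer behaviour, deployed alongside v1: 1 per cent fee."""
    return amount_cents + amount_cents // 100

HANDLERS = {"v1": handler_v1, "v2": handler_v2}
DEFAULT_VERSION = "v2"

def route(amount_cents, version=None):
    # Unknown or missing versions fall back to the default, so old
    # clients keep working while new code rolls out beside them.
    handler = HANDLERS.get(version, HANDLERS[DEFAULT_VERSION])
    return handler(amount_cents)
```

Because each request names (or defaults) its version, a new version can be added to the table and old ones retired without an outage – which is the “change aware” property described above.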
This really is the question when people talk about core upgrades at banks: you might spend five years upgrading, but if that opens you up to the advantages of the cloud it might be worth it.
The core is being hollowed out anyway. Take one example: the idea of signal and noise. In the past there was an insistence that our data had to be pristine, but in fact we are now much better at getting to the signal without the data having to be pristine. Some pieces of data don’t have to be 100 per cent exact. Your bank account details need to be exact, but if you were trying to predict trend information to help solve future customer needs you could use more unstructured data.
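The signal-versus-noise point can be shown with a toy calculation using purely hypothetical numbers: even when every individual data point is off by a fixed error, an ordinary least-squares fit still recovers the underlying trend well enough to act on.

```python
# Toy illustration (hypothetical numbers): each monthly figure is off
# by +/-5 units, yet a least-squares fit still recovers an underlying
# growth rate of roughly 3 units per month.

def fit_slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

months = list(range(10))
# True pattern: 10 + 3 * month, with an alternating error of +/-5.
noisy = [10 + 3 * m + (5 if m % 2 == 0 else -5) for m in months]

slope = fit_slope(months, noisy)  # close to 3 despite the noise
```

The fitted slope comes out near 3 even though no single data point is exact – the kind of “good enough for trends, not good enough for account balances” distinction the paragraph above is drawing.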
So the core is a means to an end. Infrastructure is a means to an end. Technology is a means to an end. The end we want to get to is genuine information. It is information that allows us to do things.
Frank McGrath - General Manager Technology Service Management, ANZ.
Photo: Oleksiy Mark, Shutterstock.