Jo Debecker, head of Wipro’s FullStride Cloud business line, oversees the company’s global cloud strategy. A firm believer in value-driven cloud spending for innovation, he says the opportunity afforded by AI is clear for all to see — it has the potential to turbo-charge the enterprise cloud platform and empower organisations to innovate and stay competitive. In this interview, Debecker speaks to Sudhir Chowdhary about key trends in the AI and cloud space, and the need for greater integration between the two to create seamless workflows. Excerpts:
How is AI accelerating the demand for cloud solutions?
Cloud implementations are accelerating at a fast pace, sometimes even faster than AI implementations, because the need for AI adoption is now one of the factors driving people towards the cloud. To implement AI, enterprises need the cloud. So, we can say there is no AI without cloud.
Our latest Cloud Pulse survey shows that over half (54%) of service organisations cite AI as their main driver for cloud investment. Around 55% say that cloud adoption is currently outpacing AI adoption, and 35% are advancing both technologies in tandem. Organisations that are further along in their cloud journey tend to have a stronger data strategy, which unlocks their data and, in turn, AI deployments.
How can enterprises adopt AI solutions effectively?
Companies need a comprehensive data strategy to capture, structure and organise data effectively. They need to ask themselves where (and how) the data is stored and how it can be captured. With large amounts of high-quality data, they can build a strong digital core to classify and organise this data. Cloud is the underlying technology — the platform on which you can build a digital core.
So for us, the relationship between having the cloud, a digital core and a data strategy, and then running AI, is one single equation. We help our customers get that in order before they implement AI proofs of concept (POCs); only then can they get the full value of AI. Before mass deploying and scaling AI, you need all those building blocks in place.
There is a lot of hype around AI, but is it delivering real value?
There are huge expectations from AI because it can transform many aspects of our lives and the way business is done, and we have seen proof of that. We are now entering the next phase of AI maturity. Until recently, everybody was jumping on the bandwagon to do POCs; now those POCs need to create business value, and the equation needs to work. For me, that is just the logical next step we are taking as an industry with this technology.
Will AI solutions become more mainstream in the future?
The cost of running AI is going down and will fall further. So, we truly believe AI will become more and more democratised, especially with the advent of small language models (SLMs) trained on synthetic data generated by large language models. SLMs are typically easier to use, more flexible and cheaper, and can run locally on edge devices, even on your smartphone.
We see the cost side of the equation going down and the business-value side going up, if you define it well. This means we will see more POCs and use cases developed in the market where the ROI on AI can be measured and proven. In turn, those POCs will be scaled up and become mainstream fairly soon.
What key benefits does strong cloud infrastructure bring to AI?
It’s important to understand, firstly, that AI and cloud are no longer a cost play; they are now about unlocking business value. AI built on strong cloud infrastructure brings several benefits: optimising operations and reducing costs, but also creating business value such as improved customer satisfaction, quicker time to market, or even an enhanced sustainability posture. For example, a transportation company could use AI to optimise tanker routes, reducing its carbon footprint while also cutting operational costs. Benefits are defined by what is considered business value; in this case, that would be cost reduction and enhanced sustainability.
How are small language models (SLMs) impacting access to AI?
SLMs were fundamentally meant to run on edge devices, on-premise hardware and private clouds, and they will drive the democratisation of AI for a number of reasons. One, they are easy to use and affordable, because they require less computational power. They are better suited to simpler devices: they can run on mobile processors, bearing in mind that today’s mobile processors are more powerful than the classical processors of roughly five years ago.
Second is time to market. They are quick to set up because they use smaller, more relevant data sets, with fewer than 15 million parameters. Importantly, they support local languages. They can be adjusted to understand and communicate in local Indian languages, which will make the technology much more useful and friendly for everyone.
We will also see small language models in regulated environments, where data privacy is very important and where you know where your data has been generated. In a way, we may no longer bring the cloud and data to AI; instead, we bring AI to where the data is being generated.