How AI pressure is making companies rethink cloud use

The cost of such transformation is rising, now boosted by a spate of generative AI tools added to the mix. 

Updated - July 24, 2023 06:39 pm IST

Published - July 24, 2023 05:49 pm IST

A Gartner report has predicted that through 2023, AI will be one of the top workloads that drive IT infrastructure decisions. | Photo Credit: Reuters

The shift to the cloud and the consequent boom in the sector was held together by its grand promise that any company could digitally transform itself and keep its data secure on the cloud. But the cost of such transformation is rising, now boosted by a spate of generative AI tools added to the mix.

Companies with fat cloud bills are facing something of a catch-22: they cannot opt out for fear of being left behind. So, they are looking for more ways to cut costs.

Making in-house AI chips to cut costs

On July 11, at a semiconductor conference in San Francisco, IBM said it was considering using its in-house AI chips to lower the costs of cloud computing. Mukesh Khare, a general manager of IBM Semiconductors, said in an interview with Reuters that the company may use a chip called the Artificial Intelligence Unit in its new enterprise AI platform, Watsonx. Khare noted that this would address one of the big pitfalls of its old Watson system, high costs, since the new chip is more energy-efficient.

IBM took the hint from other tech giants like Google, Microsoft and Amazon, all of whom are designing their own AI chips in the hope of saving money on their AI push. Until now, the pressure has fallen on a small number of specialised chips, chiefly graphics processing units, or GPUs, from NVIDIA. But the field is widening to accommodate demand. Microsoft has reportedly accelerated Athena, its project to design its own AI chips. The Satya Nadella-led company hopes to make its AI chips available within the company and to OpenAI by next year.


Meanwhile, in late April, The Information reported that Google's AI chip engineering team had moved to its Google Cloud unit to speed up development. And if operating cloud data centres is expensive for providers, clients are also struggling with soaring prices.

Shift to on-premises

“AI and ML require specialized resources which are extremely expensive to build on-premises, even for very large enterprises. On the other hand, given the heavy use of AI/ML by competitors, enterprises do not want to be left behind in the race. The cloud offers an ideal solution for enterprises that need to strengthen the infrastructure required to build AI/ML into their growth roadmap,” said Dharmendra Chouhan, Director of Engineering at Kyvos Insights, a big data cloud platform.

“The basic nature of the cloud is such that the more one uses it, the more the enterprise pays. This does not mean that an enterprise cannot control cloud spend if it wants to leverage the power of AI/ML,” he added. According to Chouhan, it is essential for clients to figure out what they themselves need. “The choice of tools is one such factor. There are many open-source and commercial AI/ML solutions that look attractive from a functional point of view. However, one needs to also compare the cost of creating and using models while choosing a tool,” he stated.
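Chouhan's point about pay-per-use pricing can be made concrete with a rough back-of-the-envelope comparison. The sketch below is illustrative only; the hourly rates, hardware price and overhead figures are hypothetical placeholders, not quotes from any provider:

```python
def cloud_cost(gpu_hours: float, hourly_rate: float) -> float:
    """Pay-per-use: cost scales linearly with hours consumed."""
    return gpu_hours * hourly_rate

def on_prem_cost(hardware_price: float, gpu_hours: float,
                 overhead_per_hour: float) -> float:
    """Owned hardware: a fixed upfront price, plus power/cooling/ops
    overhead for the hours actually run."""
    return hardware_price + gpu_hours * overhead_per_hour

# Hypothetical figures: a $30/hr cloud GPU instance versus a
# $150,000 GPU server with $5/hr operating overhead.
for hours in (1_000, 10_000, 25_000):
    cloud = cloud_cost(hours, hourly_rate=30.0)
    owned = on_prem_cost(150_000, hours, overhead_per_hour=5.0)
    print(f"{hours:>6} GPU-hours: cloud ${cloud:>9,.0f}  on-prem ${owned:>9,.0f}")
```

With these made-up numbers, light usage strongly favours renting cloud capacity, while sustained heavy usage eventually favours owned hardware, which is exactly why the usage profile matters before committing either way.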

He explains, “The next thing enterprises need to consider for maximizing ROI is not trying to do everything on their own. There are and will continue to be many open-source and paid models that can be used as a base, and then trained for the specific data of that enterprise. The choice of a cloud provider is also an important consideration. Since hardware is a major part of the cost, one needs to identify the right cloud provider and also factor in the cost of specialized hardware provided by them.” 

AWS, for instance, is trying to win over more customers with a lower price point. At a conference on July 11, Dilip Kumar, Vice President at AWS Applications, spoke about how Amazon was better at lowering the costs of training and operating AI models. “These models are expensive. We’re taking on a lot of that undifferentiated heavy lifting, so as to be able to lower the cost for our customers,” he said.

An April report by Gartner predicted that through 2023, AI will be one of the top workloads driving IT infrastructure decisions, and this has clearly pumped up the demand for cloud services even more.

Leaning on third-party services

This would also explain why enterprises may prefer to outsource cloud management and maintenance to a third-party firm or tool.

Deepak Singh, the Managing Director at a cloud consultancy firm called G7 CR Technologies - a Noventiq Company, does exactly this. “Cloud platforms still provide several benefits, including scalable compute and storage resources, easy access to pre-configured AI tools and libraries, and the ability to handle peak workloads or demand. In many cases, a hybrid approach can be adopted, where businesses may use on-premises AI hardware for sensitive data processing or latency-sensitive applications, while leveraging cloud for tasks like data storage, distributed training, or deploying AI models. However, cloud will continue to play a crucial role in providing the necessary infrastructure and services to support AI development, deployment, and management at scale. In short, as long as AI’s trajectory remains bright, the cloud industry will have an enormous boom from which to benefit,” he stated.
