Startups face a huge disadvantage vis-à-vis big tech firms such as Microsoft and Amazon Web Services (AWS) when it comes to monetising their AI models. The deep resource base of the big tech firms enables them to release similar solutions for free, edging out smaller innovators operating in the same space.

Abhinav Aggarwal, founder of Fluid AI, highlighted a common scenario at the Global IndiaAI Summit on Thursday. “If you’re just a very small layer, you’re using a ChatGPT API or a Mistral API. And, you know, all the magic is happening in the LLM. And you’re this very thin layer that’s kind of making it usable by the end user… Then six months to a year down the line, that wrapper is going to get wiped out, right? A Microsoft will launch it free and AWS will launch it free in their solution and you’re wiped out,” he said.

Aggarwal said it is difficult for startups to survive in such scenarios, let alone thrive. He suggested a pivot towards building a more robust application layer. “How do you make that thicker? When you go downwards, right? You go to the LLM side,” he advised. This shift involves both enhancing the application layer to solve end-use cases effectively and adding a fine-tuning layer around the language model.
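To make that distinction concrete, here is a minimal sketch, not Fluid AI’s code, contrasting a thin wrapper with a thicker application layer that adds domain context and points at a fine-tuned model. It assumes the OpenAI Python client; all function names, model IDs and data are hypothetical.

```python
# Minimal sketch, not Fluid AI's code: a thin wrapper vs. a thicker application
# layer. Assumes the OpenAI Python client; names and model IDs are hypothetical.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def thin_wrapper(user_prompt: str) -> str:
    """All the 'magic' happens in the LLM; the startup adds almost nothing."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # any hosted chat model
        messages=[{"role": "user", "content": user_prompt}],
    )
    return resp.choices[0].message.content

def plant_efficiency_assistant(user_prompt: str, sensor_readings: dict) -> str:
    """A 'thicker' layer: domain data and a fine-tuned model wrap the same call,
    so the value no longer lives in the API request alone."""
    context = "\n".join(f"{name}: {value}" for name, value in sensor_readings.items())
    resp = client.chat.completions.create(
        model="ft:gpt-4o-mini:example-org::abc123",  # placeholder fine-tuned model ID
        messages=[
            {"role": "system",
             "content": "You optimise manufacturing throughput. "
                        "Only cite figures present in the supplied context."},
            {"role": "user",
             "content": f"Context:\n{context}\n\nQuestion: {user_prompt}"},
        ],
    )
    return resp.choices[0].message.content
```

The value in the second function sits in the domain context it assembles and the fine-tuned model it targets, neither of which a bundled big-tech feature replicates for free.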

Aggarwal further said a hybrid approach has proven beneficial for Fluid AI. “We’re solving for an end-use case, like making a manufacturing plant more efficient, and we build an application layer for that. But then we build some layer of fine tuning around the language model,” he said. This strategy, he added, not only delivers value to the customer but also fortifies the startup against the threat of disruption.

Addressing challenges specific to Indian use cases, Aggarwal said there is a need for advanced reasoning in AI models. “What we found, at least so far, the place where we are struggling with the Indic models is that one year down the line or two years down the line, most of these models or approaches are becoming agentic. By agentic, it just means that the model thinks through a little more, like it takes like six, seven steps of reasoning before it gives the answer,” he said.
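As an illustration of what “agentic” means here, the sketch below (again hypothetical, reusing the same client) lets a model take up to seven intermediate reasoning steps before committing to a final answer.

```python
# Minimal sketch of an agentic loop: the model is allowed several intermediate
# reasoning steps before answering. The protocol and step budget are illustrative.
from openai import OpenAI

client = OpenAI()
MAX_STEPS = 7  # "six, seven steps of reasoning"

def call_model(messages: list[dict]) -> str:
    resp = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    return resp.choices[0].message.content

def agentic_answer(question: str) -> str:
    messages = [
        {"role": "system",
         "content": "Think step by step. Reply with 'THOUGHT: ...' for an "
                    "intermediate step, or 'FINAL: ...' when ready to answer."},
        {"role": "user", "content": question},
    ]
    for _ in range(MAX_STEPS):
        reply = call_model(messages)
        if reply.startswith("FINAL:"):
            return reply.removeprefix("FINAL:").strip()
        # Feed the intermediate thought back so the next step can build on it.
        messages.append({"role": "assistant", "content": reply})
        messages.append({"role": "user", "content": "Continue."})
    return "No final answer within the step budget."
```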

Further, stressing the importance of sustainable AI integration in startups, he said, “When AI is nothing, that’s when it’s everything. You’re solving for a real problem, and AI becomes one of the tools that’s enabling you to solve that problem”. This approach, he believes, will differentiate successful startups from those merely chasing the next cool trend in generative AI.