Business model for changing times: Using AI to answer ‘what if’ questions

Disruption is everywhere and it’s not getting easier

Add domain knowledge into GenAI workflows to fine-tune LLMs

By Bryan Harris

ARTIFICIAL INTELLIGENCE (AI) enables companies to learn as fast as the data environment they are analysing allows. And in the current environment, whoever unlocks value quickly enough to make the best decisions is going to win. Furthermore, businesses can start realising value from generative AI (GenAI) initiatives if these are grounded in domain expertise, state-of-the-art AI capabilities, and end-to-end data management and governance.
Here are four predictions for AI and analytics that successful organisations will embrace:

Effective business leaders will ask the difficult ‘what if’ questions.


Disruption is everywhere and it’s not getting easier. Grounded in strong data management practices and proven AI and analytics strategies, executives will embrace ‘what if’ scenario planning, asking hard questions to manage operational risk.
AI scales human productivity and decision making and will help answer these critical questions to ensure business resiliency.

Simulation and digital twin technology will help answer the ‘what if’ questions.


The scale of business disruption will require businesses to simulate their systems to understand how to react. Businesses building AI use cases will start to run ‘what if’ scenarios, essentially simulating or modelling the complexity of the business. For example, tapping into synthetic data, digital twin simulations, or large language models (LLMs), a business could ask: what happens if the supply of natural gas increases by 8% next year? Answering that question means running a simulation of the business driven by that ‘what if’ analysis.
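As a minimal sketch of what such a ‘what if’ run could look like in code, the Python example below perturbs one input (natural gas supply) and estimates the effect on annual energy cost. The elasticity figure, cost model, and usage numbers are illustrative assumptions, not figures from the article:

```python
# Minimal 'what if' simulation sketch: change one assumption (gas supply)
# and observe the modelled effect on annual energy cost.
import random

BASE_PRICE = 50.0            # currency units per unit of gas (assumed)
PRICE_ELASTICITY = -0.6      # assumed response of price to supply changes
ANNUAL_USAGE = 10_000        # units of gas the business consumes per year (assumed)

def simulate_energy_cost(supply_change, runs=10_000):
    """Monte Carlo estimate of annual energy cost under a supply scenario."""
    costs = []
    for _ in range(runs):
        noise = random.gauss(0, 0.05)  # demand-side uncertainty
        price = BASE_PRICE * (1 + PRICE_ELASTICITY * supply_change + noise)
        costs.append(price * ANNUAL_USAGE)
    return sum(costs) / runs

baseline = simulate_energy_cost(0.0)
scenario = simulate_energy_cost(0.08)   # "what if supply increases by 8%?"
print(f"Baseline annual cost: {baseline:,.0f}")
print(f"Scenario annual cost: {scenario:,.0f}")
```

A digital twin would replace this toy cost model with a far richer representation of the business, but the pattern of varying an assumption and re-running the model is the same.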

Businesses will tap existing knowledge bases to extract the most value from GenAI.


A key ingredient in extracting value from generative AI will be ensuring organisations have a strong knowledge management strategy, starting with existing proprietary, industry-specific knowledge bases. Progressive organisations will fine-tune existing LLMs by injecting industry domain knowledge into generative AI workflows. We will see this integration of industry knowledge as a repeating pattern across the life sciences, insurance, banking, and healthcare industries.
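As a rough illustration of that fine-tuning step, the Python sketch below adapts a small open model to a domain text corpus with the Hugging Face transformers library. The model name, file name, and hyperparameters are placeholder assumptions, not choices endorsed by the article:

```python
# Sketch: fine-tune a small causal language model on domain-specific text.
from transformers import (AutoTokenizer, AutoModelForCausalLM, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)
from datasets import load_dataset

model_name = "gpt2"  # placeholder; swap in the base LLM you intend to adapt
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Domain knowledge base exported as plain text, one document per line (assumed format).
dataset = load_dataset("text", data_files={"train": "domain_knowledge.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="domain-tuned-llm",
                           num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

In practice, organisations weigh fine-tuning like this against retrieval-based approaches, which leave the base model untouched and pull domain knowledge in at query time.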

Generative AI agent frameworks will mature to meet enterprise complexity.


The complexity of generative AI will spark the application of new software architectures that orchestrate information flow across enterprise systems and predictive models, and enhance conversational experiences. Retrieval-augmented generation (RAG) is an AI framework for retrieving up-to-date information and incorporating it into LLM responses. This is a great first step, but the architecture will be limited to a certain scale and complexity of use cases in the organisation. Agent-based frameworks, like the pioneering AutoGen work from Microsoft, facilitate building networks of roles and functions that leverage RAG, LLMs, and enterprise systems to meet the complexity of today’s organisations.
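To make the RAG pattern concrete, here is a minimal Python sketch that retrieves the documents most relevant to a question and assembles them into a prompt. The tiny document set, TF-IDF retrieval, and prompt wording are simplifying assumptions; production systems typically use vector embeddings and pass the prompt to an LLM for the final generation step:

```python
# Minimal RAG sketch: retrieve relevant documents, then build an LLM prompt.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Policy A covers flood damage up to a fixed limit.",      # illustrative knowledge base
    "Policy B excludes flood damage but covers fire.",
    "Claims must be filed within 30 days of the incident.",
]

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(documents)

def retrieve(query, k=2):
    """Return the k documents most similar to the query."""
    query_vector = vectorizer.transform([query])
    scores = cosine_similarity(query_vector, doc_vectors)[0]
    top = scores.argsort()[::-1][:k]
    return [documents[i] for i in top]

def build_prompt(query):
    context = "\n".join(retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("Does my policy cover flood damage?"))
# The resulting prompt would then be sent to an LLM to generate the answer.
```

Agent-based frameworks extend this idea by coordinating multiple such steps, with different roles retrieving, reasoning, and calling enterprise systems as part of one workflow.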

The author is EVP and CTO, SAS


This article was first published on January 18, 2024, at 10:06 am.