Artificial Intelligence (AI) has been much talked about over the last few years. Several interpretations of the potential of AI and its outcomes have been shared by technologists and futurologists. With the focus on the customer, the possibilities range from predicting trends to recommending actions to prescribing solutions.
The potential for change due to AI applications is energised by several factors. The first is the concept of AI itself, which is not a new phenomenon. Researchers, cognitive specialists and hi-tech experts working with complex data for decades in domains such as space, medicine and astrophysics have used data to derive deep insights, predict trends and build futuristic models.
AI has now moved out of the realm of research labs into the commercial world and everyday life due to three key levers. Innovation and technological advancement in hardware, telecommunications and software have been the catalysts in bringing AI to the forefront and in pushing beyond the frontiers of data and analytics.
Analysing data through hand-coded 'if-else-then' rules was once seen as a big breakthrough. This transitioned to machine learning, with its capability to deal with hundreds of variables, though mostly on structured data sets. Handcrafted techniques using algorithms did find ways to convert unstructured data into structured data, but there are limits to the volume of such data that machine learning can handle.
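The shift described above can be sketched in a few lines. This is a toy illustration, not any production system: the transaction amounts, labels and thresholds are invented for the example, and the "learning" step is deliberately simplified to a single derived cutoff.

```python
# Era 1: a hand-written 'if-else-then' rule on one structured field.
# The analyst hard-codes the threshold.
def rule_based_flag(transaction_amount):
    if transaction_amount > 10000:
        return "review"
    else:
        return "approve"

# Era 2 (toy version): the cutoff is *derived from labelled examples*
# rather than hard-coded -- the essence of moving from rules to learning.
def learn_threshold(amounts, labels):
    # midpoint between the largest 'approve' and the smallest 'review'
    approved = [a for a, l in zip(amounts, labels) if l == "approve"]
    reviewed = [a for a, l in zip(amounts, labels) if l == "review"]
    return (max(approved) + min(reviewed)) / 2

history = [500, 2000, 9000, 15000, 40000]          # invented data
labels  = ["approve", "approve", "approve", "review", "review"]

threshold = learn_threshold(history, labels)        # 12000.0 for this data

def learned_flag(amount):
    return "review" if amount > threshold else "approve"
```

The point of the contrast: changing the rule-based system means editing code, while the learned system adapts as soon as it sees new labelled data. Real machine learning replaces the midpoint heuristic with statistical models over hundreds of variables.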
With 80% of data being unstructured, and with the realisation that the real value of data analysis is possible only when structured and unstructured data are synthesised, deep learning emerged. It is capable of handling thousands of factors and can draw inferences from tens of billions of data points each day, comprising voice, images, video and queries. Techniques for determining patterns in unstructured data, such as multilingual text, multi-modal speech and vision, have been maturing, making recommendation engines more effective.
Another important factor aiding the rapid adoption of AI is the evolution of hardware. CPUs (central processing units) today are versatile and designed for handling sequential code, not for massively parallel problems. This is where GPUs (graphical processing units), hitherto considered primarily for applications such as gaming, are now being deployed to address the needs of commercial establishments, governments and other domains dealing with gigantic volumes of data, supporting their needs for parallel processing in areas such as smart parking, retail analytics and intelligent traffic systems. Such compute-intensive functions, which require massive problems to be broken into smaller ones that can be parallelised, are finding efficient hardware and hosting options in the cloud.
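The "break a massive problem into smaller independent pieces" pattern mentioned above can be sketched as follows. This is a minimal illustration of the decomposition itself, not a GPU program: the pixel data and chunk size are invented, and a thread pool stands in for the thousands of parallel units a real GPU provides.

```python
# Sketch of the data-parallel pattern: split one large job into
# independent chunks, process the chunks side by side, combine results.
from concurrent.futures import ThreadPoolExecutor

def brightness_sum(pixels):
    # the per-chunk "kernel": each chunk depends on no other chunk,
    # which is exactly what makes such workloads GPU-friendly
    return sum(pixels)

def parallel_total(image, chunk_size=4):
    # decompose the big problem into small independent ones...
    chunks = [image[i:i + chunk_size] for i in range(0, len(image), chunk_size)]
    # ...process them concurrently...
    with ThreadPoolExecutor() as pool:
        partials = pool.map(brightness_sum, chunks)
    # ...and combine the partial results
    return sum(partials)

image = list(range(16))  # stand-in for pixel data from, say, a traffic camera
total = parallel_total(image)
```

A CPU with a handful of cores can run a few such chunks at once; a GPU runs thousands simultaneously, which is why the smart-parking and traffic-analytics workloads the article mentions are migrating to GPU hardware in the cloud.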
Therefore, the key drivers for this major transition are the evolution of hardware and hosting on the cloud, sophisticated tools and software to capture, store and analyse data, as well as a variety of devices that keep us always connected and generate humongous volumes of data. These dimensions, along with advances in telecommunications, will continue to evolve, making it possible for commercial establishments, governments and society to arrive at solutions that deliver superior experiences for the common man. Whether it is agriculture, health, decoding crimes, transportation or the maintenance of law and order, we have already started seeing the play of digital technologies, and the democratisation of AI will soon become a reality.
The writer is chairperson, Global Talent Track, a corporate training solutions company