The country now has over 100 GenAI startups, and the funding raised by new GenAI ventures such as Hindi LLM maker Sarvam AI, along with Adobe’s acquisition of Bengaluru-based Rephrase.ai last year, are “good validations” of the AI ecosystem booming in India, said Sangeeta Gupta, senior vice president and chief strategy officer, Nasscom.
2023 was the largest year, with $566 million infused into GenAI startups, up from $144 million in 2022, data showed.
These investments spanned diverse GenAI offerings, with the following areas making up a major share of the pie – text content creation (18%), chatbots and virtual assistants (18%), and image and video generation (16%).
“Investments have grown incrementally in 2023, from an AI perspective, because 2022 was the year when startups started realizing the potential of GenAI applications, marked by the debut of ChatGPT,” Gupta told ET.
She added that funding is expected to ramp up further in 2024, given the efforts stakeholders across the ecosystem are making.
As per Nasscom estimates, AI startups in general received private investments totalling $8 billion between 2013 and 2022, of which $3.24 billion was raised in 2022 alone across 1,900 AI startups, making it the biggest funding year for AI.
The past year witnessed major breakthroughs in GenAI innovation, with tech majors and startups building large language models trained on Indic languages.
Conversational AI startup Corover.ai recently launched BharatGPT, a 7-billion-parameter model trained in 14 Indian languages across text, voice, and video interactions. Sarvam AI also released OpenHathi-Hi-0.1, the first open-source Hindi language model, built on Meta’s Llama 2-7B model.
Mobility unicorn Ola launched Krutrim, an LLM that can comprehend 22 Indian languages and generate responses in 10 of them. IT services leader Tech Mahindra is also working on ‘Project Indus’, an LLM trained in Hindi and 37 Indic dialects.
Besides these, firms like Flipkart and SaaS provider Ozonetel are also exploring small language models trained on much smaller datasets and tailored to domain-specific use cases.