Overregulating AI may mean getting left behind



All this talk of regulating AI isn't just short-sighted; it's potentially dangerous. That was the takeaway from a panel of thought leaders fielding questions from scientists and science buffs at the annual Ignite Life Science Foundation event in Mumbai on Thursday. What makes regulation dangerous?


"AI is science, and science thrives on freedom, exploration, and curiosity. Regulation, by contrast, is friction, and friction is a drag on progress," argued Ganesh Bagler of the Infosys Centre for AI housed at IIT Delhi. He went on to make the case that when friction meets science, innovation suffers. We have seen this with other groundbreaking technologies, be it genetics or space exploration: attempts to control science prematurely have always stunted its potential. It is much the same with AI.

Fear-mongering aside, the real issue, said Debarag Banerjee, chief AI officer at L&T Finance, is that AI is about human intentions. Just as there are good and bad actors in every sphere of life, there are good and bad intentions among those who use technology. AI itself is neutral, just a tool. In the right hands, it can bring progress; in the wrong hands, it can cause harm. So instead of choking the tool, choke the bad actors, and let the technology evolve, he said.

Prof S Ramasway, director at the Brindley Bioscience Centre, who was moderating the discussion, pointed out that the urgent need is for a legal framework that addresses misuse and focuses on intent, as opposed to placing blanket restrictions. Politicians and regulators are often quick to push for controls because it feels like the safe option. This, he said, may not be optimal.

AI, Banerjee argued, will determine the next set of winners and losers in the global economy. Countries, companies, and individuals that embrace AI will emerge as leaders; those clinging to outdated regulatory models will find themselves left behind. The European Union, for instance, with its push to regulate AI, risks stifling its own progress. By slowing innovation, it positions itself as a spectator rather than a participant in the next technological revolution. The paradox is striking: the more control one seeks over AI, the greater the likelihood of being left behind.

To draw an analogy, consider the photocopier. As its use became widespread, there was a valid concern: what if it were used to pirate books? Should photocopiers have been banned? Of course not. Laws were created to protect intellectual property, and society enjoyed the convenience of document copying while creators were safeguarded. A sensible legal framework emerged, one that balanced innovation and protection. AI is no different. We just have to think harder.

For that matter, consider this: if the Internet had been regulated out of fear of misuse in its early stages, where would the world be today? Instead, as we know, it went on to revolutionize communication, commerce, and nearly every aspect of life. AI holds similar promise. Heavy regulation would slow innovation and limit the technology's potential to solve some of the world's most pressing problems.

In contrast, countries and organizations with a lighter regulatory touch are poised to lead. This is not just about economics; it is about power and influence. AI will shape the future of industries across the globe. Those first to the finish line will set the rules, while those burdened by heavy regulation will be left playing catch-up. The stakes could not be higher, and this is not merely an economic race: it is a contest over who will define the future.

The next battleground is already taking shape: nations like the US and China are positioning themselves as AI leaders, while the EU risks falling behind. India is debating a third way, even as it rolls out technologies like Aadhaar and UPI that we all now take for granted. The choice is stark: get ahead by embracing innovation, or be left behind by overregulating.
