India vs AI risks: Breather for big digital platforms, albeit with a rider

In a breather for big social media and internet platforms, the government has said that they no longer require approval before launching or deploying their AI models in the country. Here is what it means for Big Tech.

The Ministry of Electronics and IT has updated its AI advisory of March 1. Under the revision, big internet companies no longer need the government's permission before launching an AI model in the country.

However, the big digital platforms are advised to label “under-tested and unreliable AI models to inform users of their potential fallibility or unreliability”.

"Under-tested/unreliable Artificial Intelligence foundational model(s)/LLM/Generative AI, software(s) or algorithm(s), or further development on such models, should be made available to users in India only after appropriately labelling the possible inherent fallibility or unreliability of the output generated," according to the new MeitY advisory.

All intermediaries or platforms must ensure that the use of AI models/LLM/Generative AI, software or algorithms "does not permit its users to host, display, upload, modify, publish, transmit, store, update or share any unlawful content as outlined in the Rule 3(1)(b) of the IT Rules or violate any other provision of the IT Act."

"It is reiterated that non-compliance to the provisions of the IT Act and/or IT Rules would result in potential penal consequences to the intermediaries or platforms or its users when identified, including but not limited to prosecution under IT Act and several other statutes of the criminal code," according to the advisory.

The digital platforms have been asked to comply with the new AI guidelines with immediate effect.

The Centre has already clarified that the requirement of seeking permission before launching new AI models will not apply to startups.

“The advisory is aimed at stopping untested AI platforms from deploying on the Indian Internet,” according to the government.
