
Does Your AI Boyfriend Follow India's Data and IT Laws?

Users should be cognisant of the fact that the information which they are entering into 'AI companion' apps is not necessarily private. 
Illustration: The Wire with Canva.

Over the last few years, artificial intelligence applications have risen to prominence, and organisations have started using them to streamline their operations. While some organisations use popular AI apps like Gemini, ChatGPT and Perplexity AI, there are also more specialised apps which deal with specific sectors or functionalities.

AI companion apps are chatbots geared towards addressing loneliness and providing companionship. Some of them specifically state that they are intended to serve as AI boyfriends or girlfriends. They can be helpful and convincingly good in conversation. However, these chatbots can also offer harmful suggestions and lead to serious consequences in real life. There are also criticisms that such chatbots simply tell users what they want to hear. This article examines the data privacy risks that AI companion apps pose to the individuals who use them. The concern arises because users will inevitably input sensitive data while interacting with these apps. We therefore need to examine whether the privacy policies of industry leaders provide for dealing with such sensitive personal data, and to examine their data privacy practices. An example of an AI companion app is the recently launched Grok Companion by xAI.

An examination of some of the leading Indian and international AI companion apps reveals that messages and the content of communications are among the types of information collected by these companies. This information includes facts about one’s life and any photos and videos provided by users. Such apps will also learn users’ interests and preferences as the service is used. However, the companies’ privacy policies place considerable emphasis on the fact that the content of communication will not be used for marketing or advertising purposes. There are also some AI apps, typically not marketed as AI companions, which encrypt data at rest and in transit, thereby preventing the company from ordinarily viewing the content of messages. However, this data may be shared with service providers, and may need to be decrypted and disclosed to the government, law enforcement or other relevant third parties if the need arises. Some privacy policies also state that information may be provided to advisors such as lawyers if required.

Privacy policies detail the reasons why the content of communication may be monitored. In the case of such apps, a legitimate interest in preventing fraud, criminal activity and misuse of the services is cited as the reason why messages may need to be monitored. The companies also need to secure their IT systems, analyse usage trends, and anonymise or de-identify personal information. What can also be gleaned from the privacy policies of such apps is that the content of messages may need to be monitored to enforce agreements, comply with legal obligations and defend against legal claims and disputes. Information may also be shared with law enforcement agencies if required.


Representative image. Photo: Hanna Barakat & Archival Images of AI + AIxDESIGN / https://betterimagesofai.org / https://creativecommons.org/licenses/by/4.0/


It is important to note that the privacy policies of such apps recognise that sensitive personal information, such as religious views, sexual orientation, political views, health, racial or ethnic origin or trade union membership, may be shared on their platforms. While sensitive personal information may be monitored by the company, it is ostensibly not used for marketing or advertising purposes. If a user does not want sensitive personal information to be collected and potentially monitored, then the user should not share such information with the AI companion. The privacy policies typically specify with whom this information is shared, how the data is secured and the rights of users. The right to withdraw consent is provided, but in some cases withdrawing consent is not as easy as giving it. Notably, there are AI companions which state that they are not designed to process sensitive personal information while offering companionship, which appears self-contradictory. Recently, OpenAI CEO Sam Altman also cautioned users against using ChatGPT for therapy and emotional support, as the chats are not private and can be used in court proceedings.

What is especially concerning is that some AI companions in India and abroad do not appear to have robust privacy policies detailing the kind of information they collect, how they process it and with whom they share it. In 2025, it is incumbent on organisations to provide comprehensive, easy-to-understand privacy policies and terms and conditions which detail their data privacy practices.


The India picture 

Due to the ease with which AI companion apps can be downloaded and accessed, many of these apps, even if the company is based abroad, will have Indian users. Therefore, it is important to examine the statutory framework currently prevailing in India until the Digital Personal Data Protection Act, 2023 (DPDP Act) comes into force.

The Information Technology Act, 2000 (IT Act), the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (SPDI Rules) and the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (2021 IT Rules) will be the prevailing statutory framework in India until the DPDP Act comes into force.


The SPDI Rules define sensitive personal data, which includes financial information, sexual orientation, and medical records and history. Consequently, it can be argued that the Rules cover AI companion apps, since users may enter this kind of information into them. The SPDI Rules also require companies to publish privacy policies detailing the types of sensitive personal data collected, and to obtain explicit consent before disclosing sensitive personal data, while providing exceptions for government agencies. The Rules further specify the information which must be provided to users and give users the option to withdraw consent previously granted. They also specify how sensitive personal information is to be collected: it must be for a lawful purpose connected to the activity of the company and necessary for that purpose.


The DPDP Act will replace the SPDI Rules when it is notified by the government and comes into force. Importantly, it makes no distinction between sensitive personal data and personal data. Section 2(h) defines data as a representation of information, facts, concepts, opinions or instructions in a manner suitable for communication, interpretation or processing by human beings or by automated means. Section 2(t) defines personal data as any data about an individual who is identifiable by or in relation to such data.


Photo: Clarote & AI4Media / https://betterimagesofai.org / https://creativecommons.org/licenses/by/4.0/.

The DPDP Act imposes stringent consent requirements and applies to companies operating and offering services in India. While Section 5 of the DPDP Act requires a detailed notice to be given to a user, Section 6 specifies that consent must be free, specific, informed, unconditional and unambiguous, with a clear affirmative action, and must be limited to the specified purpose for which the data is being processed. The notice must be provided in clear and plain language. The Act also provides the option to withdraw consent, with the consequences of such withdrawal to be borne by the user and without affecting the lawfulness of earlier processing. The DPDP Act further introduces the concept of a consent manager: a company that acts as a single point of contact for users to manage, review or withdraw consent. The consent manager is ultimately accountable to the user, the data principal under the Act.

Section 7(a) of the Act allows companies to process personal data for the specified purpose for which the user has voluntarily provided her personal data, where she has not indicated that she does not consent to its use.

Section 9 of the DPDP Act requires obtaining verifiable consent from the parent or lawful guardian of a child or a person with disability. Section 11 sets out the rights users have, such as being provided with a summary of the personal data being processed and the processing activities being conducted on it, and the identities of all other companies with which the data has been shared.

Recently, the Ministry of Electronics and Information Technology released the Draft Digital Personal Data Protection Rules, 2025. Rule 3 of the draft DPDP Rules imposes minimum requirements on companies. Companies must seek consent from users independently of any other information which is or may be presented to them. Further, certain minimum information must be provided to users, such as an itemised description of the data being collected, the specified purpose of the collection, and a link to the website or app (or other means) from where they can withdraw consent with an ease similar to that of granting it, exercise their rights under the DPDP Act and make a complaint to the Board. Rule 6 of the draft DPDP Rules imposes minimum obligations on companies to implement reasonable security safeguards such as encryption. Rule 7 details how information about a data breach is to be delivered to the Data Protection Board of India (DPBI) and to users. The draft DPDP Rules also provide conditions for the registration and obligations of consent managers in Rule 4 and Schedule I; for example, a consent manager must be a company registered in India.
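The consent mechanics described above, itemised purposes and withdrawal that is as easy as granting, can be sketched in a few lines. The Python snippet below is a hypothetical illustration only, not a compliance implementation; the class and method names are invented for the example:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """Hypothetical per-user consent record for one service.

    Mirrors the draft DPDP Rules' idea that consent is sought per
    itemised purpose, and that withdrawing it takes the same ease
    (here, a single call) as granting it.
    """
    user_id: str
    # Each itemised purpose maps to (granted?, timestamp of last change).
    purposes: dict = field(default_factory=dict)

    def grant(self, purpose: str) -> None:
        self.purposes[purpose] = (True, datetime.now(timezone.utc))

    def withdraw(self, purpose: str) -> None:
        # Withdrawal does not affect processing already carried out,
        # so only the flag going forward is flipped.
        self.purposes[purpose] = (False, datetime.now(timezone.utc))

    def may_process(self, purpose: str) -> bool:
        granted, _ = self.purposes.get(purpose, (False, None))
        return granted

# Usage: consent is specific to each purpose, never bundled.
record = ConsentRecord(user_id="u123")
record.grant("chat_personalisation")
assert record.may_process("chat_personalisation")
assert not record.may_process("marketing")  # never granted
record.withdraw("chat_personalisation")
assert not record.may_process("chat_personalisation")
```

The point of the sketch is structural: each purpose is tracked independently, so consent for one purpose can never silently authorise processing for another.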

The 2021 IT Rules require intermediaries to implement due diligence while conducting their operations. Rule 3(1)(a) requires them to publish their privacy policies or rules and regulations prominently on their website, mobile based application, or both. 

Rule 3(1)(b) of the 2021 IT Rules states that an Intermediary must make reasonable efforts not to host, display, upload, modify, publish, store, update or share any information including that which is obscene, pornographic, paedophilic, invasive of another’s privacy including bodily privacy, insulting or harassing on the basis of gender, racially or ethnically objectionable, relating to or encouraging money laundering or gambling (or an online game that causes users harm) or promoting enmity between different groups on the ground of religion or caste with the intent to incite violence, is harmful to children, impersonates any person, or violates any law for the time being in force. 

Rule 3(1)(c) requires the intermediary to periodically inform its users, at least once every year, that in case of non-compliance with the rules and regulations, privacy policy or user agreement for access or usage of its computer resource, it has the right to terminate the access or usage rights of the users immediately or to remove the non-compliant information, or both.

On March 15, 2024, the Ministry of Electronics and Information Technology released what is called a revised AI advisory. 

This advisory requires every intermediary and platform to ensure that the use of AI models, generative AI or software or algorithm through its computer resource does not permit its users to host, display, upload, modify, publish, transmit, store or update any unlawful content as outlined in Rule 3(1)(b) of the 2021 IT Rules, or violate any other provision of the IT Act, or any other law for the time being in force.  The revised AI advisory requires intermediaries to inform their users of the consequences of non-compliance, amongst other things. 

There are certain categories of information specified in Rule 3(1)(b) which may be created when an individual interacts with AI companions, such as those marketed as boyfriends or girlfriends, because the nature of these platforms makes the input of sensitive and intimate personal information, such as sexual or what may be considered obscene content, possible or even likely. Further, information entered about another person into such AI apps, which can potentially impact that person’s privacy or bodily privacy, may also be problematic.


Kathryn Conrad / https://betterimagesofai.org / https://creativecommons.org/licenses/by/4.0/

Further, if the AI app is malicious and collects this sensitive information, then there is a potential for it to be used to blackmail or extort users. It also needs to be emphasised, as many of the privacy policies do, that no security safeguards provide 100% security. Data breaches are a reality which we have to live with. Information being accessed by a company for commercial purposes, as much of a concern as it is, pales next to sensitive personal data being exploited by bad actors and potentially sold on the dark web. The retention periods specified in companies’ privacy policies generally state that information is retained only as long as required, unless it needs to be maintained for legal purposes. However, the caveat that information may be retained for longer in certain cases creates ambiguity around retention. Different apps can have different retention periods, but what is clear is that as long as the user holds an account with the company, the information will be retained. There may also be instances where a user simply uninstalls an application instead of deleting the account, resulting in long retention periods which can lead to serious consequences in the event of a data breach. Similarly, while companies claim they do not use such sensitive data for advertising purposes, that assurance does not extend to bad actors selling the data to marketers or advertisers. The DPDP Act imposes obligations to report data breaches to the DPBI and users as per Section 8(6), which is a welcome move. As noted earlier, the draft DPDP Rules also impose requirements around intimation of data breaches.

Transparency

The statutory framework currently prevailing in India governing AI is a combination of the IT Act, the SPDI Rules and the 2021 IT Rules. This framework imposes obligations both on companies, on how they should deal with sensitive personal data, and on users, to use AI in a responsible manner by not entering specified unlawful information into the chatbots. The SPDI Rules are narrower in scope than the DPDP Act: they are mainly aimed at safeguarding sensitive personal data and impose no obligations as to data breaches.

Yutong Liu & Kingston School of Art / https://betterimagesofai.org / https://creativecommons.org/licenses/by/4.0/

However, the DPDP Act and draft DPDP Rules impose much wider obligations and much more stringent requirements in the way companies must obtain consent, protect personal data and inform users in the event of a data breach. They also introduce the concept of a consent manager, which will ostensibly make it simpler for users to manage multiple consents across services. Cumulatively, at least on paper, the upcoming framework provides a much greater level of protection for users and increased compliance requirements for companies, which is why companies have sought time to comply with the obligations imposed by the DPDP Act and, ultimately, the soon-to-be-notified DPDP Rules.

For future compliance, a company could consider using pop-ups in the app itself to meet the obligation of obtaining specific, informed and separate consent. A company could also, within the app itself (if it has not already done so), inform users not to enter sensitive data into the chatbot and clearly state the possible consequences of entering such data, instead of burying these details deep in its privacy policy. To verify that an app is not malicious and that it details its privacy practices, users should review its privacy policy, which should be easily accessible.

To conclude, there is a lack of transparency in AI companion apps marketing themselves as offering a safe space for intimate conversations while disclaiming any liability for users entering sensitive personal information into the chatbot. Given the vast amount of information that can be entered into an AI companion app, the companies behind these apps may find it difficult to comply with the current and upcoming statutory framework in India, considering the subjectivity surrounding what can be considered unlawful content. Users should be cognisant of the fact that the information which they are entering into these apps is not necessarily private.

Raghav Tankha is a lawyer practising in Delhi. Views are personal.

This article was published on August 6, 2025, at 2:34 pm.
