Sarees, Selfies, and Surveillance: The Dark Truth Behind the Gemini AI Trend


 Introduction

The rise of AI-powered image editing has delivered a new form of digital fashion: viral filters, stylised portraits, and lately the “Gemini Nano Banana Saree” phenomenon. Users, especially women, are uploading selfies to AI tools like Google Gemini to generate surreal saree portraits that evoke nostalgia, glamour, and heritage. It looks harmless, even delightful. But beneath the aesthetic veneer lies a tangle of risks: biometric data collection, potential misuse, weak legal safeguards, and threats to safety and privacy.

This article analyzes these issues, drawing on recent data, expert opinions, and Indian legal context, to argue that the rush toward creativity must be balanced with responsibility.


What’s Going On: The Gemini Saree Trend

The trend uses AI models like Gemini’s “Nano Banana” prompt to transform personal photos into stylised portraits featuring vintage saree aesthetics, golden-hour lighting, and similar retro flourishes. The visual outcomes are polished, often enchanting, and naturally share well on social media.

However, reports have emerged that AI-generated images include features that were not apparent in the original photos: mole placements, facial marks, and the like. Such inexplicable details raise questions about how much information the underlying model or training data actually holds, and whether data from multiple sources is being used to infer or reconstruct private details.


What You Trade for Beauty: Data as the New Currency

Biometric Data & Inference

Facial images are biometric identifiers. Once uploaded, they can be processed to extract facial geometry, skin features, and expressions, which can potentially be cross-referenced with other datasets.

Experts caution that even when users do not share explicit identifying information, AI systems can “fill in gaps” using large training datasets.

Deepfakes, Stalking, Impersonation

Misuse of AI-generated images can escalate into image manipulation (deepfakes), impersonation, or non-consensual uses. For women, this can mean harassment, reputational damage, and psychological harm.

Fake or malicious clones of apps/trends often promise glamour or viral content but may request excessive permissions, store images insecurely, or harvest data for unintended uses.


From Filters to Forums: How Your Data Reaches the Dark Web

The dark web functions as a hidden economy for stolen data: credit card details, government IDs, medical records, and, increasingly, facial images.

Leaks from platforms: If AI tools or look-alike apps are compromised, users’ selfies and generated images can surface for sale or trade.

Reverse image searches: Malicious actors use advanced recognition software to trace AI-edited portraits back to original uploads, linking multiple online identities (a simplified illustration follows this list).

Bundling with other data: A face plus metadata such as location tags can be stitched together with phone numbers, Aadhaar leaks, or email breaches, leaving victims vulnerable to fraud or blackmail.
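
To make the reverse-image-search risk concrete, here is a minimal Python sketch of perceptual-hash matching, one basic technique behind image tracing. It uses the open-source Pillow and imagehash libraries; the file names are hypothetical examples, and real tracing systems typically rely on facial-recognition embeddings that survive far heavier edits.

```python
# A minimal sketch of perceptual-hash matching, the same basic idea
# behind reverse image search. File names are hypothetical examples.
# Requires: pip install Pillow imagehash
from PIL import Image
import imagehash

# Compute perceptual hashes: visually similar images get similar hashes,
# even after filters, re-compression, or mild AI stylisation.
original = imagehash.phash(Image.open("original_selfie.jpg"))
stylised = imagehash.phash(Image.open("ai_saree_portrait.jpg"))

# Hamming distance between the two 64-bit hashes; small distances
# (commonly <= 10) suggest the images share the same face or scene.
distance = original - stylised
print(f"Hash distance: {distance}")
if distance <= 10:
    print("Likely match: the edited portrait can be traced to the upload.")
```

The point of the sketch is that even an aesthetically transformed portrait can retain enough visual structure to be linked back to the original selfie, and therefore to the person who uploaded it.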


Public Sentiment & Trust

A survey by Cheil India found that 64% of respondents believe that if they begin using AI tools, all their data will be exposed to AI companies, and 70% worry that companies will misuse the data they share.

Women and younger users tend to feel more vulnerable when image data is involved, especially given gendered dynamics of privacy, social norms, and potential for abuse.


Evidence of Disparities & Errors

A study, “Cinderella’s shoe won’t fit Soundarya: An audit of facial processing tools on Indian faces”, found that commercial facial processing tools have significantly higher error rates on Indian female faces, with gender misclassification reaching 14.68% and age classification errors varying widely, from 14.3% to 42.2%.

The use of facial recognition for attendance and surveillance, for instance in schools and public spaces, is triggering alarm bells from experts over accuracy, data security, and the risk of misuse.


Expert Voices

V.C. Sajjanar, an IPS officer, warns of fake sites posing as fun trends. He emphasises that uploading images or personal data to unauthorised platforms can leave users exposed, and that once data is out, it is hard to retract. He urges users to verify whether a platform is official and to watch for warning signs such as fake IDs, suspicious watermarks, and intrusive pop-ups.

Privacy experts in various reports have criticised proposals like the Criminal Procedure (Identification) Bill for expanding the scope of biometric collection, storing data long term (75 years), and failing to sufficiently protect citizens’ rights to privacy, dignity, and autonomy.

Survey findings quoted in Cheil India’s report point to widespread mistrust in how AI companies handle personal data, with many respondents fearing misuse or exposure.


Why Women Are at Greater Risk


Social, cultural, and structural gender dynamics often mean women are under pressure to present a certain image online; visibility comes with heightened risk.

Misuse of images (real or AI-generated) can lead to harassment, stalking, blackmail, or defamation. Women tend to face greater social stigma in many communities for perceived impropriety.

When facial recognition or biometrics fail (higher error rates on women or darker skin tones), false positives or misclassifications can have serious consequences: wrongful association, reduced access to services, social embarrassment.


How Dangerous Is the Trend?


While not every participant will face harm, the risks are non-trivial. Key concerns include:

Opacity: Users don’t fully understand where their data goes.

Weak regulation: Laws on AI and biometrics lag far behind technology.

Mass participation: Millions are uploading selfies, creating vast datasets vulnerable to leaks.

Psychological unease: Realizing AI “knows” hidden details creates a sense of lost control.


What Needs to Be Done: Path Forward

Strengthen Legal Definitions & Consent Rights

Laws must explicitly cover inferred data, the use of facial images, and biometric identifiers in AI training. Consent should be informed, granular, and revocable.

Mandatory Transparency & Auditability

AI platforms should make clear what data is stored, for how long, and for which purposes, and should preferably provide independent audits of model training and dataset sources.

Stronger Consumer Protection & Penalties

Violations (unauthorised use, leaks, misuse) must come with meaningful penalties. Data protection authorities should have power to investigate, enforce, and impose fines.

Technical Safeguards

Data minimisation, watermarking of AI-generated images so fakes can be identified, metadata control, deletion of data after appropriate retention periods, and secure storage.
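
As one small example of metadata control, a user (or platform) can strip EXIF metadata, which may include GPS coordinates and device identifiers, before an image is shared. Below is a minimal sketch using the Pillow library; the file names are hypothetical.

```python
# A minimal sketch of client-side metadata control: strip EXIF data
# (which can include GPS location and device details) before upload.
# Requires: pip install Pillow
from PIL import Image

def strip_metadata(src_path: str, dst_path: str) -> None:
    """Re-save an image with pixel data only, discarding EXIF metadata."""
    img = Image.open(src_path)
    clean = Image.new(img.mode, img.size)
    clean.putdata(list(img.getdata()))  # copies pixels, not metadata
    clean.save(dst_path)

# Hypothetical usage: clean a selfie before uploading it to any AI tool.
strip_metadata("selfie.jpg", "selfie_no_metadata.jpg")
```

Stripping metadata does not make a face anonymous, but it removes the location and device breadcrumbs that make bundling with other leaked data easier.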

Educational Interventions

Public awareness campaigns, especially those targeting younger people, women, and digital-literacy communities, to help users understand what they share and what risks might follow.

Platform Responsibility

Platforms hosting AI apps and trends must vet third-party replicas, enforce proper permissions, ensure security, and flag or remove malicious imitators.

Gender-Sensitive and Bias Audits

Facial recognition and AI image tools must be evaluated for accuracy across genders, skin tones, and age groups to avoid disproportionate harms; a minimal sketch of such a disaggregated audit follows.
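
At its simplest, a gender-sensitive bias audit means reporting error rates per demographic group rather than a single aggregate number, which is how the disparities in the Indian-faces study were surfaced. Here is a minimal Python sketch with hypothetical prediction records; a real audit would use labelled outputs from the tool under evaluation.

```python
# A minimal sketch of a disaggregated bias audit: compute classification
# error rates per demographic group instead of one overall number.
# The records below are hypothetical illustrations.
from collections import defaultdict

records = [
    # (group, true_label, predicted_label)
    ("female", "female", "male"),
    ("female", "female", "female"),
    ("male",   "male",   "male"),
    ("male",   "male",   "male"),
]

errors = defaultdict(int)
totals = defaultdict(int)
for group, truth, predicted in records:
    totals[group] += 1
    if truth != predicted:
        errors[group] += 1

for group in totals:
    rate = errors[group] / totals[group]
    print(f"{group}: error rate {rate:.1%} ({errors[group]}/{totals[group]})")
# Large gaps between groups, like those the Indian-faces audit found,
# signal disproportionate harm and should block deployment until fixed.
```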


Conclusion

The Gemini Saree trend might look like harmless fun: a fusion of fashion, nostalgia, and tech. But in our eagerness to play, many of us are handing over raw, powerful pieces of our identity. Under current legal frameworks in India, protections exist, but with significant gaps. Women, whose public image is often under greater scrutiny, may face more severe fallout if their data is misused.

As AI tools proliferate and viral image trends spread, the question is not just whether we can create beautiful portraits, but whether we should risk personal data, consent, and safety in the process. For tech companies, lawmakers, and users alike, this is the moment to demand better regulation, stronger ethics, and deeper respect for privacy.

The insight review 

Upasna Sharma 
