
The settlement, announced March 30, 2026, doesn’t include a monetary fine, but it imposes permanent restrictions on how the companies can describe their data practices. More importantly, it shines a harsh light on a problem that plagues countless apps and websites: the dangerous practice of sharing personally identifiable information (PII) with third parties without proper safeguards, consent, or oversight.
The Details: How a Simple “Investment Connection” Led to Massive Data Exposure
According to the FTC’s complaint, the unauthorized sharing began around 2014. OkCupid’s founders had invested in Clarifai, an AI startup specializing in facial recognition and image analysis. Despite having no contractual relationship or business partnership with OkCupid, Clarifai requested access to large datasets of user photos and related information.
OkCupid obliged — handing over the data without any formal agreement, without placing restrictions on how Clarifai could use or further share it, and without notifying users or giving them a chance to opt out. This directly contradicted OkCupid’s privacy policy, which explicitly promised that personal information would only be shared with service providers, business partners, or other entities within the Match family — and only after informing users and offering an opt-out.
Making matters worse, when media reports later surfaced about the data transfer, OkCupid publicly denied any involvement and allegedly took steps to obstruct the FTC’s investigation. The agency had to go to federal court to enforce a Civil Investigative Demand before getting the full picture.
Christopher Mufarrige, Director of the FTC’s Bureau of Consumer Protection, put it bluntly: “The FTC enforces the privacy promises that companies make. We will investigate, and where appropriate, take action against companies that promise to safeguard your data but fail to follow through — even if that means we have to enforce our Civil Investigative Demands in court.”
Why Dating-App Data Is Especially Sensitive — And Why This Matters
Dating apps like OkCupid don’t just hold casual information. They store deeply personal details: photos that can reveal identity and appearance, precise location data that can expose where someone lives or works, demographic information, and even intimate preferences or relationship status. Unauthorized sharing of this data can lead to real-world harms — from stalking and harassment to identity theft, blackmail, or doxxing.
In this case, the data went to an AI firm focused on facial recognition. Without contractual limits, there was nothing stopping further misuse, such as training AI models, selling the data onward, or combining it with other datasets in ways users never imagined.
A Pattern Across Dating and Gaming Apps: Other Notable FTC Cases
This isn’t an isolated incident. The FTC has cracked down repeatedly on apps that treat user data too casually, especially in dating and gaming spaces where sensitive or children’s data is involved.
In the dating world, similar concerns have surfaced before. Grindr, a popular LGBTQ+ dating app, faced intense scrutiny (and eventual divestiture requirements from regulators) over sharing precise location data and even users’ HIV status with third-party analytics firms. While not an FTC case, it underscored how dating apps often underestimate the sensitivity of their datasets.
On the gaming side, the FTC has been even more aggressive when kids are involved. In 2022, Epic Games (maker of Fortnite) agreed to a record $520 million settlement for privacy and Children’s Online Privacy Protection Act (COPPA) violations. The agency alleged the company illegally collected and retained children’s personal information and used dark patterns to trick users into making unwanted purchases and sharing data.
Just months later, in 2023, Microsoft paid $20 million to settle COPPA charges related to its Xbox gaming platform. The FTC found Microsoft had collected personal information from children without proper parental consent and illegally retained that data — including sharing it with third-party game publishers without adequate safeguards.
These cases show a clear pattern: whether it’s dating profiles or kids’ gaming accounts, companies that promise strong privacy but then funnel data to third parties face serious regulatory heat.
A Dating Problem Becomes a Data Problem – Third-Party Data Sharing and the Critical Gaps in TPRM
At the heart of the OkCupid case — and many others — is a failure of Third-Party Risk Management (TPRM). When companies share PII with vendors, partners, or even seemingly unrelated entities, they often skip the basics:
- No formal contracts or data-processing agreements: Without clear terms limiting use, retention, or onward sharing, data can be weaponized or leaked.
- Inadequate due diligence: OkCupid’s founders’ financial ties to Clarifai created an obvious conflict of interest, yet no one appears to have conducted a proper risk assessment.
- Weak oversight and monitoring: Once data leaves the building, many companies have no idea what happens next — until regulators come knocking.
The risks are enormous. A single third-party breach can expose millions of records (think the 2023 MOVEit supply-chain attack or the SolarWinds incident). Even without a breach, unauthorized sharing can lead to:
- Regulatory fines and enforcement: The FTC has hit companies like BetterHelp ($7.8 million for sharing mental-health data with advertisers), GoodRx, Cerebral, and Avast (over browsing data) for similar deceptive practices.
- Reputational damage: Users lose trust when they discover their intimate photos or location history were quietly handed off.
- Secondary harms: Data can be sold to data brokers, used for targeted ads, or combined with other datasets to create invasive profiles.
- Chain-reaction liability: Under laws like the GDPR, CCPA, and emerging U.S. state privacy rules, companies remain responsible even if a third party mishandles the data.
The FTC’s own staff reports on social media and streaming platforms have repeatedly highlighted how lax third-party data-sharing controls create “vast surveillance” ecosystems that users never consented to.
What Companies (and Users) Should Take Away
For app developers and platforms handling sensitive data, the message is clear: TPRM isn’t optional paperwork — it’s a core compliance and business necessity. Best practices now include:
- Conduct thorough vendor risk assessments before any data sharing.
- Use ironclad data-processing addendums that limit purpose, require deletion, and mandate audits.
- Implement technical controls like data minimization and anonymization where possible.
- Be transparent with users — and actually honor opt-out promises.
- Regularly monitor and audit third-party partners.
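The data-minimization and pseudonymization point above can be made concrete with a short sketch. This is a minimal, hypothetical example — the field names and allow-list are illustrative assumptions, not any real app’s schema — showing the basic idea of stripping a record down to only what a third party needs and replacing the direct identifier before the data leaves the building:

```python
import hashlib
import os

# Hypothetical user record; field names are illustrative, not any app's real schema.
raw_record = {
    "user_id": "u-48213",
    "email": "alice@example.com",
    "photo_url": "https://cdn.example.com/photos/u-48213.jpg",
    "precise_location": (40.7411, -73.9897),
    "age": 34,
    "relationship_status": "single",
}

# The only fields the downstream partner actually needs for its stated purpose.
ALLOWED_FIELDS = {"age"}

def pseudonymize(user_id: str, salt: bytes) -> str:
    """Replace a direct identifier with a salted hash. Note this is
    pseudonymization, not full anonymization: whoever holds the salt can
    re-link records, so the salt must stay with the data controller."""
    return hashlib.sha256(salt + user_id.encode()).hexdigest()[:16]

def minimize(record: dict, allowed: set, salt: bytes) -> dict:
    """Drop every field not on the allow-list and swap the direct
    identifier for a pseudonym before the record is shared."""
    shared = {k: v for k, v in record.items() if k in allowed}
    shared["pseudonym"] = pseudonymize(record["user_id"], salt)
    return shared

salt = os.urandom(16)  # kept internally, never shared with the third party
outbound = minimize(raw_record, ALLOWED_FIELDS, salt)
# outbound now contains only the age and a pseudonym:
# no email, photo URL, or precise location ever leaves the company.
```

A real deployment would pair controls like this with the contractual and audit measures listed above; the point of the sketch is simply that "share only what the purpose requires" is enforceable in code, not just in a privacy policy.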
For consumers, the takeaway is caution. Review privacy settings, limit what you share on apps, and remember: once your data leaves the original company, you have very little control over where it ends up.
The OkCupid settlement requires the companies to stop misrepresenting their data practices, but it also serves as a broader warning. In an era where AI companies are hungry for training data and apps collect more personal information than ever, regulators are watching closely. Promises on a privacy policy page aren’t enough — companies must back them up with real safeguards.
This case, combined with the FTC’s aggressive COPPA enforcement in gaming and its crackdowns on health-data sharing, signals a new era of accountability for how apps handle the most personal aspects of our digital lives.