OpenAI KYC provider accused of sharing users’ crypto addresses with federal agencies

OpenAI KYC provider Persona is accused of sending user data to FinCEN. Illustration: Hilary B; Source: Shutterstock
  • OpenAI KYC provider accused of sharing customers' data.
  • Persona's code allegedly show US government connection.
  • Integration lets operators monitor crypto addresses.

A company that conducts the know-your-customer checks required to access OpenAI’s advanced chatbots is allegedly sending customer data — including associated crypto addresses — to federal agencies.

A February 18 investigation published by security researchers who go by vmfunc, MDL, and Dziurwa found publicly accessible code that appears to send data collected by Persona, the KYC provider, to the Financial Crimes Enforcement Network — or FinCEN — a bureau of the US Department of the Treasury that safeguards the financial system from illicit use.

“The same company that takes your passport photo when you sign up for ChatGPT also operates a government platform that files Suspicious Activity Reports with FinCEN and tags them with intelligence programme codenames,” the investigation’s authors said.

“So you uploaded a selfie to use a chatbot? Congratulations! It’s now being compared against a database of every politician, head of state, and their extended family tree on earth.”

‘Unanswered questions’

Multiple IT specialists and security experts confirmed to DL News that the investigation and its findings appear legitimate.

“It appears the research is credible and you can verify that the government domains cited do exist, and are highly likely hosted on dedicated infrastructure by Persona,” Tanuki42, a pseudonymous security researcher who contributes at SEAL911 and zeroShadow, the blockchain incident response groups, told DL News.

“That said, there are still a lot of unanswered questions around motives, what this platform is used for, and who it is used by.”

Persona CEO Rick Song has since responded to the allegations publicly on X. He accused the investigation’s authors of not reaching out to him before publishing their findings.

In emails with vmfunc that Song posted to his X account, the Persona CEO said his firm does not work with any federal agency today. He has yet to directly address the findings of the investigation.

“I am genuinely disappointed in how all of this has been handled,” Song said on Thursday in a since-deleted X post. “What has really been frustrating for me is that I also admire vmfunc’s work and their clear talent.”

Vmfunc, also known in online circles as Celeste, is widely regarded as credible because of their track record of technical investigations that other security experts have repeatedly validated.

OpenAI and Persona did not immediately respond to DL News’ requests for comment.

Surveillance fears

The allegations come as both crypto users and the broader public grow increasingly concerned that mandatory identification checks enable mass surveillance and erode privacy.

The fear is that companies like Persona, working with government agencies, could use identification data collected from customers to create a vast Orwellian surveillance network that uses opaque criteria to place users on watchlists without their knowledge, consent, or public oversight.

In recent years, more and more platforms have started requiring users to hand over personally identifiable information, often in the name of preventing crime or protecting vulnerable groups.

Yet critics say the checks do more harm than good.

Many of the biggest companies that provide KYC services have been found to misuse and mishandle data.

Even when companies aren’t misbehaving, the huge troves of data they hold make them lucrative targets for hackers. Several platforms have suffered data breaches in recent years, leaking millions of users’ personal data.

The issue is close to the hearts of crypto advocates.

Many early Bitcoin contributors were self-styled cypherpunks, privacy activists advocating for cryptography and privacy-preserving technologies to protect the public against government and corporate surveillance.

Persistent monitoring

When a user is prompted to verify their identity with OpenAI, their KYC data — usually a photo of their passport, a selfie, and a short video of their face — is sent to Persona for verification.

That data is then automatically screened against global sanctions and warning lists, undergoes facial similarity scoring, and is checked against records of people linked to terrorism, cybercrime and other financial crimes.

That’s all standard stuff.
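For readers unfamiliar with how that plumbing fits together, the sketch below shows roughly what such a screening pipeline could look like. It is a minimal illustration: the data structures and helper functions are assumptions made for this example, not Persona’s actual code or APIs.

```python
# Minimal, hypothetical sketch of a standard KYC screening pipeline.
# Names such as KycSubmission and face_similarity are assumptions for
# illustration; they are not Persona's actual code or APIs.
from dataclasses import dataclass


@dataclass
class KycSubmission:
    full_name: str
    passport_photo: bytes
    selfie: bytes
    face_video: bytes


def face_similarity(photo_a: bytes, photo_b: bytes) -> float:
    """Stand-in for a biometric similarity model; returns a score in [0, 1]."""
    return 1.0 if photo_a == photo_b else 0.9  # placeholder, not a real model


def screen_submission(sub: KycSubmission,
                      sanctions_lists: list[set[str]],
                      watchlists: list[set[str]],
                      similarity_threshold: float = 0.85) -> dict:
    """Run the standard checks: sanctions screening, facial similarity
    scoring, and watchlist lookups. Returns a summary of flags raised."""
    flags = []
    name = sub.full_name.lower()

    # 1. Screen the applicant against global sanctions and warning lists.
    if any(name in sanctions for sanctions in sanctions_lists):
        flags.append("sanctions_match")

    # 2. Compare the selfie against the passport photo.
    score = face_similarity(sub.passport_photo, sub.selfie)
    if score < similarity_threshold:
        flags.append("face_mismatch")

    # 3. Check records of people linked to terrorism, cybercrime, and other crimes.
    if any(name in watchlist for watchlist in watchlists):
        flags.append("watchlist_hit")

    return {"similarity": score, "flags": flags}
```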

But at the same time, the same information is sent directly to government agencies, according to vmfunc, MDL, and Dziurwa.

The investigation found code that gives operators the ability to file suspicious activity reports straight to FinCEN; file equivalent reports to Canada’s financial intelligence unit; tag data with intelligence programme codenames; screen associated crypto addresses through Chainalysis, a blockchain security platform; and conduct over 250 more verification checks.
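To make the scope of that claim concrete, the sketch below imagines how such operator-triggered actions might be wired together. Every action name, handler, and payload is hypothetical and exists only to illustrate what the researchers describe; none of it is drawn from Persona’s code or any real government filing system.

```python
# Hypothetical sketch of the operator-triggered actions the investigation
# describes. The action names, payloads, and handlers are illustrative
# assumptions only; nothing here is drawn from Persona's code or a real API.
from enum import Enum


class OperatorAction(Enum):
    FILE_FINCEN_SAR = "file_fincen_sar"       # suspicious activity report to FinCEN
    FILE_FINTRAC_REPORT = "file_fintrac"      # equivalent report to Canada's FIU
    TAG_PROGRAMME_CODENAME = "tag_codename"   # attach an intelligence programme codename
    SCREEN_CRYPTO_ADDRESS = "screen_address"  # run an address through blockchain analytics


def handle_action(action: OperatorAction, payload: dict) -> dict:
    """Route an operator-triggered action to a stubbed handler."""
    if action in (OperatorAction.FILE_FINCEN_SAR, OperatorAction.FILE_FINTRAC_REPORT):
        return {"status": "report_queued", "destination": action.value, **payload}
    if action is OperatorAction.TAG_PROGRAMME_CODENAME:
        return {"status": "tagged", **payload}
    if action is OperatorAction.SCREEN_CRYPTO_ADDRESS:
        return {"status": "screening_requested", **payload}
    raise ValueError(f"unsupported action: {action}")


# Example: an operator flags a user and requests a crypto address screen.
print(handle_action(OperatorAction.SCREEN_CRYPTO_ADDRESS,
                    {"user_id": "example-user", "address": "bc1qexample"}))
```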

According to the investigation, the Chainalysis integration assesses cryptocurrency addresses for risk, analyses the other addresses they’ve interacted with, checks the value of the funds they contain, and tries to identify their owners.

“There’s also a native crypto address watchlist system layered on top,” the investigation’s authors said. “This isn’t a one-shot lookup but a persistent monitor. Your wallet goes on the list once and gets polled indefinitely against Chainalysis’ cluster graph.”
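If accurate, that description amounts to something like the loop sketched below: a watchlist whose entries are re-checked on a schedule rather than looked up once. The risk lookup is a stand-in rather than Chainalysis’ real API, and the polling interval is an assumption.

```python
# Hypothetical sketch of a persistent crypto-address watchlist monitor, as
# described in the investigation. The risk lookup below is a stand-in, not
# Chainalysis' real API, and the polling interval is an assumption.
import time

WATCHLIST: set[str] = set()      # addresses placed under ongoing monitoring
POLL_INTERVAL_SECONDS = 3600     # assumed re-check cadence


def assess_address(address: str) -> dict:
    """Stand-in for a blockchain-analytics lookup: risk score, counterparties,
    balance, and any owner attribution. Returns dummy values here."""
    return {"address": address, "risk_score": 0.0,
            "counterparties": [], "balance_usd": 0.0, "owner": None}


def monitor_watchlist(risk_threshold: float = 0.8) -> None:
    """Poll every watchlisted address indefinitely; flag anything high-risk.
    Once an address is added, it is never removed automatically."""
    while True:
        for address in sorted(WATCHLIST):
            result = assess_address(address)
            if result["risk_score"] >= risk_threshold:
                print(f"ALERT: {address} flagged: {result}")
        time.sleep(POLL_INTERVAL_SECONDS)


# Example: the address goes on the list once and is re-checked every cycle.
WATCHLIST.add("bc1qexampleaddress")
```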

Chainalysis did not immediately respond to DL News’ request for comment.

The problem is that it isn’t clear what criteria need to be met to trigger the crypto address screening, the watchlist system, or any of the other actions. It’s also unclear if OpenAI users are specifically warned that their data could be used in this way when undergoing KYC checks.

According to the investigation, the code that runs these functions has been in place since November 2023.

As for how long data forwarded to the government agencies is retained? That’s not clear either.

“What is the actual biometric retention period?” the investigation’s authors said. “OpenAI says ‘up to a year.’ The code says three years max. Government IDs retained ‘permanently.’ Which is it?”

Tim Craig is DL News’ Edinburgh-based DeFi Correspondent. Reach out with tips at tim@dlnews.com.