Dangers of DeepSeek’s privacy policy: Data risks in the age of AI
![Dangers of DeepSeek’s privacy policy: Data risks in the age of AI](https://www.securitymagazine.com/ext/resources/2025/02/10/Coding-on-screen-by-Walkator.png?height=635&t=1739205975&width=1200)
As AI technologies rapidly evolve and become more integrated into our daily lives, data privacy concerns have never been more urgent. Case in point: the newly popular AI tool DeepSeek. Behind the convenience the tool provides lies a privacy policy that should give anyone who values their personal security pause.
DeepSeek’s privacy policy openly states that the wide array of user data they collect goes to servers in China. This alone raises big questions about how that data could be used beyond just running the app. It’s so easy to get swept up in the hype of a new trending AI tool and accept the terms of use without thinking twice, but this privacy policy should make users stop and ask: Am I giving away private details like my location, browsing activity, or even personal messages without realizing it?
China’s role in data privacy
While 78% of AI users claim to care about their data privacy more than the conveniences of AI, more than half don’t even know how much personal information their AI tools are actually collecting. Personal data that crosses global borders can also fall outside of the protections that users might assume they have, leaving that data vulnerable to misuse — from discrimination to fraud.
DeepSeek’s privacy policy reveals that user data is stored on servers located in China, a fact that immediately demands attention. When personal information crosses international borders, it can fall outside the scope of the privacy protections users may expect, depending on the jurisdiction.
With China’s more lenient data privacy laws, including the potential for government access under local regulations, DeepSeek’s choice of server location is particularly alarming. For users, this means the possibility that sensitive data — such as location, browsing habits, and personal messages — could be accessed by foreign authorities or exploited in ways they aren’t fully aware of when agreeing to the app’s terms of service.
Behavioral biometrics: A bigger concern than you think
Perhaps the most concerning part of DeepSeek’s data collection is its use of behavioral biometrics, specifically the collection of users’ “keystroke patterns or rhythms.” This seemingly innocuous data refers to how one types: the speed, rhythm, and key-press duration. While it might seem harmless, this type of information is incredibly unique to each person, much like a fingerprint, and can be used as a form of identification.
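To see why keystroke rhythm can act like a fingerprint, consider a minimal sketch of the two features such systems commonly derive: dwell time (how long each key is held) and flight time (the gap between releasing one key and pressing the next). The typist names and timestamps below are hypothetical, and this is an illustrative simplification, not DeepSeek’s actual method.

```python
def keystroke_features(events):
    """Compute dwell and flight times (ms) from (key, down_ms, up_ms) events."""
    # Dwell time: how long each key is held down.
    dwell = [up - down for _, down, up in events]
    # Flight time: gap between releasing one key and pressing the next.
    flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return dwell, flight

# Two hypothetical typists entering the same word "hi":
alice = [("h", 0, 95), ("i", 140, 230)]
bob = [("h", 0, 60), ("i", 300, 370)]

print(keystroke_features(alice))  # ([95, 90], [45])
print(keystroke_features(bob))    # ([60, 70], [240])
```

Even on two keystrokes, the timing profiles differ sharply; across a full session, such vectors are distinctive enough to re-identify a person regardless of account name or IP address.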
While there is, in theory, potential upside — biometric data could be used to tailor the user experience, predicting words or adjusting the UI dynamically based on typing behavior — the likely reality paints a very different picture. Realistically, an app that leverages biometric data can trace everything a user does, and its creators have every incentive to sell that data. Put simply, if you can be identified by this biometric attribute, then normal privacy protections such as VPNs are no longer effective. People always forget: If an application or device that you’re using is not clearly selling a product, you are the product.
The biometrics space is growing rapidly, and the accuracy with which it can verify identities is unmatched. However, as more companies collect biometrics, a whole new set of data privacy and security risks must be addressed, including security breaches, identity theft, impersonation, and fraud. Unlike a compromised password, a biometric cannot be revoked or reset, making it a much higher-stakes form of identification.
What users can do to protect their privacy
Consent is one of the most important mechanisms for balancing data utility and data privacy. While businesses want to make the most of user data to personalize experiences and generate revenue, users expect their privacy to be respected. To bridge the gap between the two, data privacy regulations come into play.
The challenge: 64% of consumers do not believe the data privacy regulations in their country adequately protect their personal data. As the United States continues to introduce data protection laws at both the state and federal level, it is critical that these laws require businesses to obtain informed, specific, and freely given consent from users before collecting and processing their data. DeepSeek itself argues that data privacy laws outside of China do not apply to it, regardless of the data subject’s location.
Users concerned about their data privacy should opt for alternative identity authentication methods when possible. Third-party verification services are the ones that actually prioritize safety: under this model, a user’s information is cross-referenced against trusted databases to confirm their identity.
Ultimately, while AI tools like DeepSeek can offer convenience and innovation, they also raise serious concerns about how personal data is handled. The onus falls on businesses to provide clear information about the collection, storage, and usage of biometrics, but users must remain vigilant and advocate for stronger privacy protections. Both are essential to ensuring personal data remains secure in an increasingly interconnected world.