Unauthorized AI is eating your company data, thanks to your employees

Nearly 83% of all legal documents shared with AI tools go through non-corporate accounts, the report adds, while roughly half of all source code, R&D materials, and HR and employee records are fed into unauthorized AI tools.
The amount of data put into AI tools increased nearly fivefold between March 2023 and March 2024, according to the report. “End users are adopting new AI tools faster than IT can keep up, fueling continued growth in ‘shadow AI,’” the report says.
Where does the data go?
At the same time, many users may not know what happens to their companies’ data once they share it with an unsanctioned AI tool. ChatGPT’s terms of use, for example, state that users retain ownership of the content they enter, but that OpenAI may use that content to provide, maintain, develop, and improve its services, meaning shared employee records could end up in the model’s training data. Users can opt out of having their data used for training.
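For teams that do sanction AI use, the training question is one reason to route requests through an organization-managed API account rather than personal ChatGPT logins. The following is a minimal, hypothetical sketch using the OpenAI Python SDK; it assumes an org-issued key in the OPENAI_API_KEY environment variable and an illustrative model name, and it relies on OpenAI's published policy that API traffic is not used for model training by default, which may change over time.

```python
# Minimal sketch: sending a prompt through an organization-managed OpenAI API
# account instead of a personal ChatGPT login. Assumes the `openai` Python
# package is installed and an org-issued key is set in OPENAI_API_KEY.
# Per OpenAI's published data-usage policy, API traffic is not used for model
# training by default, unlike consumer ChatGPT accounts where users must opt out.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice, not prescribed by the article
    messages=[
        {"role": "system", "content": "Summarize internal documents concisely."},
        {"role": "user", "content": "Summarize the attached HR policy update."},
    ],
)
print(response.choices[0].message.content)
```

The point of the sketch is governance rather than code: a sanctioned, centrally billed account gives IT visibility into what is being sent, whereas data pasted into personal accounts is subject to whatever settings the individual employee has chosen.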