AI Collaborative Research Institute Launched
A trio of companies is launching a new research institute aimed at strengthening privacy and trust for decentralized artificial intelligence (AI).
The Private AI Collaborative Research Institute, originally established by Intel's University Research & Collaboration Office (URC), is launching as a joint project involving Avast, a vendor of digital security and privacy products, and Borsetta, a company offering AI software-defined secure computing hardware services.
“As AI continues to grow in strength and scope, we have reached a point where action is necessary, not just talk,” said Michal Pechoucek, CTO at Avast.
“We’re delighted to be joining forces with Intel and Borsetta to unlock AI’s full potential for keeping individuals and their data secure.”
By decentralizing AI, the companies aim to protect privacy and security, unlock data currently trapped in silos, and maintain efficiency. The trio said that centralized training is vulnerable to attack because data can be modified anywhere between collection and the cloud.
Another security issue surrounding contemporary AI stems from the limitations of federated machine learning, a technique for training a model across multiple decentralized edge devices without pooling the raw data.
While today’s federated AI can access data at the edge, the team behind the Institute said that this technique cannot simultaneously guarantee accuracy, privacy, and security.
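For readers unfamiliar with the technique, the sketch below illustrates the basic idea of federated averaging under simplifying assumptions: each edge device trains on its own private data, and only model weights, never the raw data, are sent to a coordinator, which averages them. The toy linear model and all function and variable names are illustrative only and do not come from Intel, Avast, or Borsetta.

```python
import numpy as np

def local_update(weights, X, y, lr=0.01, epochs=5):
    """One device's local training pass (plain gradient descent on squared error)."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def federated_average(local_weights, sample_counts):
    """Coordinator step: average device updates, weighted by each device's data size."""
    total = sum(sample_counts)
    return sum(w * (n / total) for w, n in zip(local_weights, sample_counts))

# Simulated edge devices, each holding its own private data partition.
rng = np.random.default_rng(0)
devices = [(rng.normal(size=(50, 3)), rng.normal(size=50)) for _ in range(4)]

global_w = np.zeros(3)
for _ in range(10):  # communication rounds between devices and coordinator
    updates = [local_update(global_w, X, y) for X, y in devices]
    global_w = federated_average(updates, [len(y) for _, y in devices])
```

Even in this simplified form, the tension the Institute describes is visible: the shared weight updates can still leak information about local data and can be poisoned by a malicious device, which is why the raw data staying at the edge does not by itself guarantee accuracy, privacy, and security at once.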
“Research into responsible, secure, and private AI is crucial for its true potential to be realized,” said Richard Uhlig, Intel Senior Fellow, vice president and director of Intel Labs.
Borsetta said it was inspired to join the collaboration by its strong belief in driving a privacy-preserving framework for the future hyperconnected world empowered by AI.
“The mission of the Private AI Collaborative Institute is aligned with our vision for future-proof security where data is provably protected with edge computing services that can be trusted,” said Pamela Norton, CEO of Borsetta.
“Trust will be the currency of the future, and we need to design AI-embedded edge systems with trust, transparency, and security while advancing the human-driven values they were intended to reflect.”
A call for research proposals issued earlier this year has resulted in the selection of nine research projects at eight universities in Belgium, Canada, Germany, Singapore, and the United States to receive Institute support.