The Reason Many AI and Analytics Projects Fail—and How to Make Sure Yours Doesn’t
Topping the list of executive priorities for 2023, a year marked by escalating economic woes and climate risks, is the need for data-driven insights to propel efficiency, resiliency, and other key initiatives. Many companies have been experimenting with advanced analytics and artificial intelligence (AI) to fill this need. Now they must turn proof of concept into return on investment. But how?
Organizations are making great strides, putting the right talent and software in place. Yet many are struggling to move into production because they lack the foundational technologies to support AI and advanced analytics workloads. Some are relying on outmoded legacy hardware systems. Others are stymied by the cost and control issues that come with leveraging a public cloud. Most have been so drawn to the excitement of AI software tools that they have overlooked selecting the right hardware.
As the pace of innovation in these areas accelerates, now is the time for technology leaders to take stock of everything they need to successfully leverage AI and analytics.
Look at Enterprise Infrastructure
An IDC survey[1] of more than 2,000 business leaders found a growing realization that AI needs to reside on purpose-built infrastructure to be able to deliver real value. In fact, respondents cited the lack of proper infrastructure as a primary culprit for failed AI projects. Blocking the move to a more AI-centric infrastructure, the survey noted, are concerns about cost and strategy plus overly complex existing data environments and infrastructure.
Though experts agree on the difficulty of deploying new platforms across an enterprise, there are options for optimizing the value of AI and analytics projects.[2] Foundational considerations include compute power and memory architecture, as well as data processing, storage, and security.
It’s About the Data
For companies that have succeeded in an AI and analytics deployment, data availability is a key performance indicator, according to a Harvard Business Review report.[3] In short, the report’s successful leaders have democratized their company’s data—making it accessible to staff, acquiring it from customers and suppliers, and sharing it back. Dealing with data is where core technologies and hardware prove essential. Here’s what to consider:
- Ingesting the data: To be able to analyze more data at greater speeds, organizations need faster processing via high-powered servers and the right chips for AI—whether CPUs or GPUs. Modern compute infrastructures are designed to enhance business agility and time to market by supporting workloads for databases and analytics, AI and machine learning (ML), high performance computing (HPC) and more.
- Storing the data: Many organizations have plenty of data from which to glean actionable insights, but they need a secure and flexible place to store it. The most innovative unstructured data storage solutions are designed to be reliable at any scale without sacrificing performance. Modern object storage solutions offer performance, scalability, resilience, and compatibility on a globally distributed architecture to support enterprise workloads such as cloud-native, archive, IoT, AI, and big data analytics (see the sketch after this list).
- Protecting the data: Cyber threats are everywhere: at the edge, on-premises, and across cloud providers. An organization's data, applications, and critical systems must be protected. Many leaders are seeking a trusted infrastructure that can operate with maximum flexibility and business agility without compromising security. They are looking to adopt a zero-trust architecture, embedding security capabilities across an enterprise-wide line of storage, server, hyperconverged infrastructure, networking, and data protection solutions.
- Moving the data: As the landscape of data generation shifts and data traffic patterns grow more complex, surging demands call for a network reevaluation in most organizations. For data to travel seamlessly, organizations need the right networking system. However, traditional proprietary networks often lack scalability, proven cloud-based solutions, and automation, while open-source solutions can be expensive and inflexible. Open networking answers the challenge by accommodating software choice, ecosystem integration, and automation for the modern enterprise from edge to core to cloud.
- Accessing the data: Increasingly, AI development and deployment are taking place on powerful yet efficient workstations. These purpose-built systems enable teams to do AI and analytics work smarter and faster during all stages of AI development, and increasingly during deployment, as they support inferencing at the edge. To give employees access to the data they need, organizations will need to move away from legacy systems that are siloed, rigid, and costly toward new solutions that enable analytics and AI with speed, scalability, and confidence. A data lakehouse supports business intelligence (BI), analytics, real-time data applications, data science, and ML in one place. It provides rapid, direct access to trusted data for data scientists, business analysts, and others who need data to drive business value, as the sketch below illustrates.
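To make the storage and access items above concrete, the following is a minimal Python sketch of that pattern: landing a small dataset as Parquet in an S3-compatible object store, then querying it directly with SQL in the way a lakehouse layer enables. The endpoint URL, bucket name, and file names are hypothetical placeholders rather than anything named in this article, and the boto3, pandas (with pyarrow), and duckdb packages are assumed to be available; treat it as an illustration of the workflow, not a reference implementation for any particular product.

```python
# Minimal sketch: store analytics data in an S3-compatible object store and
# query it with SQL. All names below (endpoint, bucket, keys) are hypothetical.
import boto3
import duckdb
import pandas as pd

OBJECT_STORE_URL = "https://objects.example.internal"  # hypothetical endpoint
BUCKET = "analytics-landing"                           # hypothetical bucket

# 1. Ingest: land a small batch of sensor readings as Parquet, a columnar
#    format that analytics engines can scan efficiently.
readings = pd.DataFrame(
    {"device_id": ["edge-01", "edge-02"], "temperature_c": [21.4, 23.9]}
)
readings.to_parquet("readings.parquet", index=False)

# 2. Store: upload the file to the S3-compatible object store.
s3 = boto3.client("s3", endpoint_url=OBJECT_STORE_URL)
s3.upload_file("readings.parquet", BUCKET, "iot/readings.parquet")

# 3. Access: query the Parquet data directly with DuckDB, the kind of rapid,
#    direct SQL access to trusted data that a lakehouse provides to analysts.
result = duckdb.query(
    "SELECT device_id, AVG(temperature_c) AS avg_temp "
    "FROM 'readings.parquet' GROUP BY device_id"
).df()
print(result)
```

The same pattern scales up in practice: the object store holds the durable copy of the data, while a query engine or lakehouse platform gives data scientists and business analysts direct SQL access without first moving the data into a separate warehouse.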
Focus on Outcomes
Analytics and AI hold the promise of driving better business insights from data warehouses, streams, and lakes. But first, enterprises will need to honestly assess their ability to not just develop but successfully deploy an AI or analytics project. Most will need to modernize critical infrastructure and hardware to be able to support AI development and deployment from edge to data center to cloud. Those that do so will find their data and applications to be force multipliers. Along the way, they will have implemented upgrades that keep data secure and accessible—imperatives for meeting IT and business objectives in the months and years to come.
To learn more about Creating an End-to-End Infrastructure for AI Success, read the IDC white paper and visit Dell.com/AI.
***
Intel® Technologies Move Analytics Forward
Data analytics is the key to unlocking the most value you can extract from data across your organization. To create a productive, cost-effective analytics strategy that gets results, you need high-performance hardware that's optimized to work with the software you use.
Modern data analytics spans a range of technologies, from dedicated analytics platforms and databases to deep learning and artificial intelligence (AI). Just starting out with analytics? Ready to evolve your analytics strategy or improve your data quality? There’s always room to grow, and Intel is ready to help. With a deep ecosystem of analytics technologies and partners, Intel accelerates the efforts of data scientists, analysts, and developers in every industry. Find out more about Intel advanced analytics.
[1] https://www.idc.com/getdoc.jsp?containerId=prUS48870422
[2] https://venturebeat.com/ai/the-success-of-ai-lies-in-the-infrastructure/
[3] https://hbr.org/2022/02/what-makes-a-company-successful-at-using-ai