Data confidence begins at the edge
Data-driven insights are only as good as your data
Imagine that each source of data in your organization, from spreadsheets to internet of things (IoT) sensor feeds, is a delegate set to attend a conference that will decide the future of your organization. As these data delegates travel toward each other, they cross multiple boundaries: networks, trust zones, stakeholders, organizations, firewalls, and geographies. What if one of the delegates is injured en route and never makes it to the conference? If any of these data delegates is compromised, the impact on the future of your organization could be disastrous.
For sectors such as industrial manufacturing and energy distribution, metering, and storage, embracing artificial intelligence (AI) and generative AI (GenAI), along with real-time data analytics, instrumentation, and automation, is the key to meeting the demands of an evolving marketplace. But it's not without risks. The complexity of the operational technology (OT) landscape, with its diverse array of protocols, standards, and communication technologies, creates opportunities for bad data to compromise the pipelines feeding AI, analytics, and other advanced applications. To succeed in the AI era, IT teams need a way to confirm that the data going into these systems is both accurate and trusted.
Data doubt compounds tough edge challenges
The variety of operational challenges at the edge is compounded by the difficulty of sourcing trustworthy data sets from heterogeneous IT/OT estates. Without a way to define and measure data confidence, AI model training environments, data analytics systems, automation engines, and other downstream consumers must simply trust that the data has not been simulated, corrupted, poisoned, or otherwise maliciously generated, increasing the risk of downtime and other disasters.
For example, condition-based monitoring presents unique challenges for manufacturing and power plants worldwide. Unmonitored low-voltage motors are common in both the energy and manufacturing sectors, and without effective condition-based monitoring, a motor failure can cause expensive, unplanned downtime across an operation. These motors are often housed in harsh environments with significant temperature fluctuations, which makes it difficult to accurately measure motor sound and vibration, the crucial metrics for assessing functionality and identifying potential faults. Consequently, implementing continuous monitoring systems in these conditions is often impractical. The ability to source real-time, continuous, and trustworthy monitoring data on motor health, and to deliver actionable insights from it, is essential for improving efficiency and mitigating the risks of unmonitored low-voltage motors at the industrial edge.
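To make this concrete, here is a minimal sketch, in Python, of how a condition-based monitor might score one window of vibration samples and set aside readings the sensor cannot vouch for. The thresholds and the sensor's temperature rating are invented for illustration and are not taken from any Dell product.

```python
import math

# Hypothetical thresholds for illustration only; real limits come from
# standards such as ISO 10816 and the motor's own baseline profile.
BASELINE_RMS_MM_S = 1.8      # healthy vibration velocity, mm/s RMS
ALERT_FACTOR = 2.5           # alert when RMS exceeds 2.5x baseline

def rms(samples):
    """Root-mean-square of one window of vibration samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def assess_motor(vibration_window, temperature_c):
    """Return a simple health verdict for one monitoring window.

    Readings taken outside the sensor's (assumed) rated temperature
    range are marked untrusted rather than fed to downstream analytics.
    """
    if not -20.0 <= temperature_c <= 85.0:   # assumed sensor rating
        return "untrusted: out-of-range temperature"
    level = rms(vibration_window)
    if level > ALERT_FACTOR * BASELINE_RMS_MM_S:
        return f"alert: vibration {level:.2f} mm/s RMS"
    return f"healthy: vibration {level:.2f} mm/s RMS"

print(assess_motor([1.2, -1.5, 1.9, -1.1, 1.4], temperature_c=41.0))
```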
Energy systems at the edge present their own challenges. The transition to a clean energy grid requires advanced solutions for energy management, storage, and power conversion, and utilities must modernize aging grid infrastructure to accommodate the unique demands of renewable energy. Integrating distributed energy resources, such as rooftop solar panels and electric vehicles, requires energy management systems that can handle bidirectional power flow and voltage regulation to balance supply and demand effectively. Data-driven insights can help utilities design, implement, and manage more efficient and reliable grids, and trustworthy data is essential for the energy industry to overcome these challenges and accelerate its transition toward digital transformation and sustainability.
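As a toy illustration of that bidirectional power flow, the calculation below nets local rooftop solar generation against household and EV charging load on a single feeder; every figure is invented for the example.

```python
# Illustrative net-load calculation for one feeder with distributed
# energy resources (DERs); all figures are made up.
base_demand_kw = 420.0    # household and commercial load
rooftop_solar_kw = 180.0  # local generation; excess flows back to the grid
ev_charging_kw = 95.0     # additional (potentially controllable) load

net_load_kw = base_demand_kw + ev_charging_kw - rooftop_solar_kw
if net_load_kw < 0:
    # Negative net load means the feeder exports power upstream.
    print(f"{-net_load_kw:.0f} kW exported to grid (reverse power flow)")
else:
    print(f"{net_load_kw:.0f} kW drawn from grid")
```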
A recipe for trustworthy data
As the compute stack becomes more distributed across constrained environments, companies need the ability to prove data integrity through a trust fabric to unlock data insights they can rely on. Addressing this complex issue requires a multi-pronged approach:
- Optimize device performance within the typical edge constraints of power, energy, latency, space, weight, and cost.
- Centralize end-to-end deployment and management of diverse edge infrastructure and applications using automation, open design, zero-trust security, and multicloud orchestration, so the edge can scale securely.
- Enable high-precision sensor modalities across energy, power, motion, vision, metrology, inertial measurement, and more to produce a trusted, relevant, and valuable data set for digital insights.
- Wrap all of these data sources in a true end-to-end data signal chain with a data provenance scoring system, giving you confidence in data integrity from the point of origin at the sensor edge (a minimal sketch of such scoring follows below).
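The last point, provenance scoring, can be pictured as data accumulating trust annotations as it moves through the signal chain. The sketch below is a loose, hypothetical illustration of that idea; the properties and weights are made up and do not reflect any actual product's scoring model.

```python
# Hypothetical trust properties and weights, for illustration only.
TRUST_WEIGHTS = {
    "signed_at_sensor": 2.0,      # cryptographic root of trust at origin
    "secure_boot_host": 1.0,      # host booted with verified firmware
    "policy_compliant": 1.0,      # host meets organizational policy
    "checksum_verified": 1.0,     # payload unchanged in transit
}

def confidence_score(annotations):
    """Sum the weights of the trust properties a datum accumulated
    on its path from sensor to application."""
    return sum(w for prop, w in TRUST_WEIGHTS.items()
               if annotations.get(prop))

reading = {"signed_at_sensor": True, "secure_boot_host": True,
           "policy_compliant": True, "checksum_verified": False}
print(confidence_score(reading))  # 4.0 of a possible 5.0
```

Downstream consumers can then rank or filter data by score, rather than treating every reading as equally trustworthy.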
Dell Technologies works with partners to combine Dell NativeEdge as a secure orchestration platform with cutting-edge, cryptography-enabled sensors that generate trustworthy edge data. By leveraging these sensors alongside Dell Data Confidence Fabric (DCF) for confidence scoring, organizations can use trusted data sets to deliver meaningful insights with high precision and fidelity.
Specifically, DCF captures metadata related to the application and compute stack. Because much of today's data is created and handled in a distributed topology, DCF tags specific pieces of data as they traverse a range of hosts: whether each host complies with organizational policy, whether it has the required security components onboard, and whether the data carries a checksum validating that it hasn't been tampered with along its journey. DCF is a risk mitigation tool that addresses both operational risk, such as mitigating the risk that an AI causes an accident, and regulatory risk, such as providing evidence that an AI was trained on known-good data from a trusted source.
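A minimal sketch of that per-hop tagging, assuming invented host names and metadata fields, might look like the following; only the general pattern (attach compliance metadata and a digest at each hop, then verify end to end) reflects the description above.

```python
import hashlib
import json

def annotate_hop(payload: bytes, host: str, policy_compliant: bool) -> dict:
    """Record the metadata one host would attach as data passes through:
    who handled it, whether the host met policy, and a payload digest."""
    return {
        "host": host,
        "policy_compliant": policy_compliant,
        "sha256": hashlib.sha256(payload).hexdigest(),
    }

def verify_journey(payload: bytes, hops: list) -> bool:
    """Trust the data only if every hop was compliant and the payload
    still matches the digest recorded at each hop."""
    digest = hashlib.sha256(payload).hexdigest()
    return all(h["policy_compliant"] and h["sha256"] == digest
               for h in hops)

data = json.dumps({"motor_id": 7, "vibration_rms": 2.1}).encode()
hops = [annotate_hop(data, "edge-gw-01", True),
        annotate_hop(data, "plant-core-02", True)]
print(verify_journey(data, hops))  # True; flips to False if data is altered
```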
Great things happen at the edge
Integrating sensor root-of-trust technology into Dell DCF, used in conjunction with Dell NativeEdge, ensures the integrity of edge sensors, allowing upstream applications to fully trust the data they generate. In addition, organizations that sell their data for third-party training purposes may be able to charge a premium if they integrate with Dell DCF and provide the associated trust metadata. This is the first phase of a technical vision that promises great benefits at the manufacturing and energy edge.
To learn more about the solution, read the white paper or watch the video.