IoT - Engineering.com
https://www.engineering.com/category/technology/iot/
Fri, 06 Dec 2024 16:07:25 +0000

How does IIoT enable automation and control in manufacturing?
https://www.engineering.com/how-does-iiot-enable-automation-and-control-in-manufacturing/
Thu, 14 Nov 2024 17:49:36 +0000
Here are the basics on how IIoT helps manufacturers create an automated ecosystem, driving optimization and continuous improvement.

The post How does IIoT enable automation and control in manufacturing? appeared first on Engineering.com.

IIoT initiatives can go a long way toward improving the results of a manufacturer’s continuous improvement programs by strengthening real-time control and automation of manufacturing processes. This is achieved by providing an integrated, data-driven approach to operations that connects sensors, machines and devices to systems such as MES, PLM and ERP, among others.

As we’ve learned in this series of articles, IIoT involves the deployment of sensors and smart devices on machines and equipment throughout a manufacturing facility. These sensors continuously collect data such as temperature, pressure, speed, vibration, humidity and other critical performance metrics. This constant stream of real-time data allows operators to monitor machine health, detect performance deviations and measure production parameters instantly.

For example, a temperature sensor on an industrial oven can continuously monitor the baking process and relay that information to a central control system, allowing operators to adjust on the fly if the temperature deviates from the ideal range.
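As a rough illustration, the oven example might look like the following dead-band check. The sensor values, temperature range and action names here are invented for this sketch, not taken from any specific control system.

```python
# Hypothetical sketch of the oven example: a dead-band check that maps one
# temperature reading to a control action. The ideal range is an assumption.

IDEAL_RANGE = (180.0, 200.0)  # degrees C, assumed setpoint band for the oven

def check_oven(temp_c: float) -> str:
    """Return an action for the control system based on one reading."""
    low, high = IDEAL_RANGE
    if temp_c < low:
        return "increase_heat"
    if temp_c > high:
        return "decrease_heat"
    return "hold"

readings = [185.2, 178.9, 203.4, 191.0]
actions = [check_oven(t) for t in readings]
# actions == ["hold", "increase_heat", "decrease_heat", "hold"]
```

In a real deployment this comparison would typically run in a PLC or edge gateway, with the "actions" driving actuators rather than strings.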

One of the key advantages of such a system in manufacturing is predictive maintenance. The real-time data from sensors not only helps manufacturers predict when a machine or component is likely to fail before it breaks down, it also helps build a timeline of machine health to aid long-term production scheduling. This is possible through techniques like vibration analysis, acoustic monitoring and thermal imaging, which detect signs of wear or malfunction.

By identifying potential issues early, manufacturers can perform maintenance only when needed—reducing unnecessary downtime and costly emergency repairs. This predictive capability enhances the automation of maintenance schedules, allowing for smoother, more continuous production runs.
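A crude version of the idea can be sketched with a threshold on recent vibration levels. The baseline, window size and threshold factor below are assumptions for illustration; production systems would use techniques such as the spectral analysis or machine learning models mentioned above.

```python
# Minimal threshold-based predictive-maintenance sketch: flag a machine
# when its recent average vibration drifts well above a healthy baseline.
from statistics import mean

def maintenance_due(vibration_mm_s: list[float], baseline: float,
                    window: int = 5, factor: float = 1.5) -> bool:
    """Flag maintenance when the recent average vibration exceeds
    factor * baseline over the last `window` samples."""
    if len(vibration_mm_s) < window:
        return False
    recent = mean(vibration_mm_s[-window:])
    return recent > factor * baseline

history = [2.1, 2.0, 2.2, 2.4, 2.9, 3.3, 3.6, 3.8, 4.0]
print(maintenance_due(history, baseline=2.0))  # rising trend -> True
```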

The value of an IIoT regime isn’t limited to maintenance. When done properly, it enhances the automation of decision-making by integrating advanced data analytics with machine control systems. As data is collected in real time, algorithms analyze this information and automatically adjust equipment settings, production schedules or supply chain logistics.

For example, if a sensor detects a drop in the pressure of a hydraulic system, an IIoT-powered control system can automatically adjust the pressure to maintain optimal performance without human intervention, while alerting operators to the potential problem. This reduces human error, ensures consistent quality and increases the efficiency of manufacturing processes.

Integrating machines and systems

IIoT enables interconnectivity between machines, production lines and even multiple factories or plants. This integration creates seamless communication between devices and systems, enabling data to flow between them in real time. As a result, manufacturers can coordinate complicated operations across the entire production process, from raw material handling to final product assembly.

If a machine on one production line detects a fault and stops, IIoT can signal other parts of the system to adjust, switch tasks or even reroute resources to prevent bottlenecks or downtime elsewhere in the system.

Enhanced supply chain visibility and automation

The follow-on to this is that manufacturers gain real-time insights into their supply chain operations. By integrating data from inventory systems, suppliers and logistics providers, manufacturers can track raw materials, monitor stock levels and predict demand fluctuations. This improves dynamic scheduling and just-in-time production, ensuring that materials are available when needed and avoiding overproduction or stockouts.

IIoT can also help automate inventory management through systems that track and reorder supplies autonomously, reducing manual inputs and the risk of human error.
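A simple form of such autonomous reordering is reorder-point logic. The usage rates, lead times and stock levels below are illustrative assumptions, not figures from any particular system.

```python
# Hypothetical reorder-point logic for autonomous inventory management:
# order when stock can no longer cover lead-time demand plus safety stock.

def reorder_quantity(on_hand: int, daily_usage: float,
                     lead_time_days: int, safety_stock: int,
                     order_up_to: int) -> int:
    """Return units to order; 0 if stock still covers lead-time demand."""
    reorder_point = daily_usage * lead_time_days + safety_stock
    if on_hand > reorder_point:
        return 0
    return max(0, order_up_to - on_hand)

# 40 units on hand, 10/day usage, 3-day lead time, 15 safety stock:
# the reorder point is 45, so the system places an order of 80 units.
print(reorder_quantity(40, 10, 3, 15, order_up_to=120))  # -> 80
```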

Energy and resource optimization

Using these systems, manufacturers can continuously monitor energy consumption and resource utilization in real time. Data on electricity use, water consumption, air pressure and other utilities helps identify inefficiencies and optimize resource allocation.

For example, by tracking the energy usage of different machines and automatically adjusting power consumption, engineers can ensure equipment is operating at peak efficiency, reducing waste and operational costs. Smart systems can also suggest or adjust energy consumption during idle times or change temperature settings in response to production demands.

Real-time quality control

Just like many other aspects of manufacturing, IIoT can help turn quality control from reactive to proactive in real time. Monitoring production parameters such as temperature, speed and material composition during manufacturing allows manufacturers to detect deviations that could eventually lead to quality defects.

Sensors can measure the consistency of materials or operations such as material removal in machining, detect faults in real time and automatically adjust the production line to correct issues before defects occur, ensuring products meet the required specifications.
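A common way to detect such deviations is a statistical-process-control check against historical control limits. The sketch below uses a simple three-sigma rule with made-up sample data; a production system would use validated control limits for each parameter.

```python
# Minimal SPC-style check: flag readings more than three standard
# deviations away from the historical mean of a process parameter.
from statistics import mean, stdev

def out_of_control(history: list[float], reading: float) -> bool:
    mu, sigma = mean(history), stdev(history)
    return abs(reading - mu) > 3 * sigma

baseline = [10.02, 9.98, 10.01, 9.99, 10.00, 10.03, 9.97]
print(out_of_control(baseline, 10.01))  # within limits -> False
print(out_of_control(baseline, 10.40))  # large deviation -> True
```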

Improved worker safety

IIoT can also enhance worker safety by automating dangerous tasks and monitoring environmental conditions in real time. These systems detect hazardous conditions such as toxic gases or abnormal machine vibrations, alerting workers to potential dangers. In more automated setups, IIoT allows for robots and machines to take over high-risk tasks, minimizing human exposure to dangerous environments.

In addition, IIoT-enabled wearables, such as safety vests or helmets with sensors, track workers’ vital signs and environmental conditions, ensuring their safety by triggering alarms or alerts if any dangerous situations arise.

Together, these benefits significantly enhance real-time control and automation in manufacturing by enabling continuous monitoring, automated decision-making, predictive maintenance and real-time optimization. Integrating data across machines, systems and supply chains makes manufacturing processes more intelligent, efficient and adaptable. By driving automation, improving quality control, reducing costs and increasing productivity, IIoT is helping manufacturers stay competitive in a rapidly evolving industrial landscape.

Feedback loops

Data feedback loops are a critical concept in optimizing and adjusting processes dynamically, especially in the context of manufacturing. These loops involve continuously collecting data from various systems, processing it to generate insights and then using that information to manage the process. The objective is to maintain efficiency, improve quality, reduce waste and adapt to changing conditions.

The first step in a data feedback loop is the collection of real-time data. We have already learned this data can come from a variety of sensors on machinery, production equipment, supply chain systems, environmental sensors or even wearable devices used by workers.

Once data is collected, it’s sent to a central system where it is processed and analyzed. This may involve simple statistical analysis, machine learning algorithms or deep AI-driven analytics. For processes that require immediate adjustments, data is processed instantly at the edge, allowing for quick decision-making.

Some feedback loops use historical data and predictive algorithms to anticipate problems before they occur. A predictive maintenance system can use data on vibration levels and temperature changes to predict when a machine is likely to fail. Advanced analytics can identify patterns or trends in the data, such as recurring defects or inefficiencies. These insights allow companies to target specific areas for improvement.

Decision-making and adjustment

Based on the analysis, decisions are made on how to adjust or optimize the process in real time. These adjustments can be made by human operators or by automated control systems that directly change operational parameters without human intervention.

In an automated system, once a problem is detected or an optimization is identified, the system can adjust itself. For example, if a temperature sensor on a furnace shows that the temperature is too high, the control system can automatically reduce the heat. Similarly, in a factory, a machine might speed up or slow down based on real-time demand or product quality measurements.

In some cases, a feedback loop will alert a human operator about an issue, but the operator will make the decision on how to proceed, such as when a sensor detects a quality issue with a product.

The key advantage of data feedback loops is their ability to drive process optimization continuously. Over time—as the system collects more data—it improves its ability to make more accurate predictions, identify inefficiencies and adjust processes more effectively.

At a glance: a feedback loop in manufacturing

To explore the basic concept of data feedback loops in manufacturing, consider a smart factory scenario with a robotic assembly line, as described below:

Data Collection: Robots on the assembly line are equipped with sensors to monitor the position, speed, and performance of each part as it moves through the production process.

Data Processing: As the sensors collect data on each part, this information is fed into an analytics platform. The system compares the real-time data with pre-set performance targets, such as the desired part speed, quality standards, and cycle times.

Decision-Making and Adjustment: If the system detects that a part is not being assembled correctly (e.g., an incorrect part placement or missing component), the feedback loop triggers an automatic adjustment, such as slowing down the robot or stopping the line for a quality check. Alternatively, if the system notices that parts are moving too slowly, it can increase the robot speed to meet production goals.

Optimization: As the system continues to gather data and make adjustments, it identifies trends and optimizes the process. For example, it might recognize that certain parts are consistently experiencing defects and automatically adjust the settings to improve alignment or material flow.
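The steps above can be condensed into a single loop iteration, sketched below. The cycle-time target, tolerance bands and field names are assumptions for this toy example.

```python
# A toy pass through the feedback loop: collect -> process -> decide -> adjust.
# A real system would run this continuously against live sensor streams.

TARGET_CYCLE_S = 30.0  # assumed cycle-time target for one station

def loop_iteration(sensor_reading: dict) -> str:
    """Map one reading to an adjustment, per the steps above."""
    cycle = sensor_reading["cycle_time_s"]       # data collection/processing
    placed_ok = sensor_reading["part_placed_ok"]
    # Decision-making and adjustment:
    if not placed_ok:
        return "stop_line_for_quality_check"
    if cycle > TARGET_CYCLE_S * 1.1:             # parts moving too slowly
        return "increase_robot_speed"
    if cycle < TARGET_CYCLE_S * 0.9:
        return "decrease_robot_speed"
    return "no_adjustment"

print(loop_iteration({"cycle_time_s": 36.0, "part_placed_ok": True}))
# -> increase_robot_speed
```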

Data feedback loops are particularly valuable because they provide real-time adaptability. Manufacturing environments are dynamic, and changes in demand, raw materials, or equipment conditions can occur quickly. A well-designed feedback loop ensures that the system can respond immediately to these changes, keeping production smooth and efficient.

Closing the loop

Once adjustments are made, the feedback loop continues by monitoring the results of those changes and further refining the process. If the adjustment improves performance, the loop continues to operate as usual. If it causes a problem or doesn’t improve performance, the loop learns from that data, adjusting its predictions and recommendations.

By enabling real-time data collection, analysis and automated adjustments, IIoT systems give manufacturers the information required to use feedback loops in a more nuanced, targeted manner to continuously improve performance, maintain high levels of quality, and adapt to changing conditions.

Using AI at the edge to connect the dots in IIoT
https://www.engineering.com/using-ai-at-the-edge-to-connect-the-dots-in-iiot/
Fri, 08 Nov 2024 18:40:40 +0000
Edge AI expert Jack Ferrari shares his insights on why this tech is a good fit for manufacturing operations.

The post Using AI at the edge to connect the dots in IIoT appeared first on Engineering.com.


For manufacturing facilities, an IoT strategy for equipment data collection and analysis is an essential step toward digital transformation, providing the data required to generate data-driven insights, predictive maintenance and other benefits. However, connecting machines and equipment to the internet raises challenges for IT teams, including security concerns, data storage, bandwidth and computing power.

To tackle these challenges, many teams weigh whether to process data at the edge or in the cloud. Historically, the edge has offered benefits such as processing speed, lower bandwidth demands and security, while cloud solutions offer unmatched computing power. To run complex workloads such as AI models, you might think the cloud is the only option. However, vendors like MathWorks are proving that AI at the edge can provide the best of both worlds.

Engineering.com recently spoke with Jack Ferrari, Edge AI Product Manager at MathWorks, to learn more about Edge AI and how manufacturers use it.

Engineering.com (Eng.com): What are the benefits of edge devices compared to ‘dumb’ sensors that just send data straight to a PC or the cloud?

Jack Ferrari (JF): Running AI models locally on edge devices instead of in the cloud brings several benefits. First, the inference time (or response time) of the model can be greatly reduced, as data no longer needs to be shuttled back and forth over the Internet. Second, and for the same reason, edge AI enhances data privacy (all data fed to/from the model stays on the device) and makes applications more reliable (less prone to network outages). Third, edge AI can lower costs by reducing or eliminating cloud hosting and storage fees.

Eng.com: What trends and technologies drive edge AI’s adoption across industries?

JF: The large and rapidly growing number of IoT devices across industries (expected to reach 40 billion by 2030) is generating massive amounts of data at the edge, driving the need for local processing to handle data efficiently, reduce latency and lower cloud costs. Advancements in hardware (like AI accelerators) and software (like new model compression techniques) are working in tandem to enable the adoption of edge AI.

Eng.com: Do you think industry-wide adoption of edge technology is driven by devices becoming more cost effective, energy efficient and/or powerful, or are edge trends driven by other, more strategic factors?

JF: A combination of the two is influencing edge AI’s adoption: On the technology side, new hardware platforms are being designed with AI workloads in mind. Recent advancements in microcontrollers (MCUs), digital signal processors (DSPs) and AI accelerators (like neural processing units, or NPUs) are enabling the deployment of models that were previously impossible to consider running at the edge. Besides simply having greater horsepower, these new chips are being optimized to execute AI workflows with greater energy efficiency. At the same time, the ecosystem of software tools used to compress AI models and program them on edge devices is becoming more robust and user-friendly, making the technology more accessible. Strategically, edge AI is enabling companies to differentiate their products in new ways. For example, by adding real-time processing and decision-making capabilities, enhancing device security by handling all data processing locally and enabling the personalization of AI models through techniques like on-device learning.

Eng.com: In many industries, security and IP concerns hold back adoption of AI tools. Is this seen in manufacturing?

JF: Security and IP concerns can indeed impact AI adoption in manufacturing. However, processing sensitive data at the edge (close to where it originates), rather than transmitting it to the cloud, can reduce exposure to potential breaches, offering a way to address these concerns.

Eng.com: What benefits can engineers expect when using edge AI?

JF: There are four primary benefits to using edge AI:

  • Lower latency: AI models can deliver predictions and classifications more quickly, which is crucial for engineers working on time-sensitive applications. This rapid response can enhance user experience and enable real-time decision-making, particularly in scenarios where milliseconds matter, such as autonomous vehicles or live data monitoring.
  • Lower costs: Reducing data transmission and storage fees, along with improved energy efficiency, leads to significant cost savings. For engineers, this means more budget can be allocated to other critical projects or resources and they can ensure their systems have higher uptime and availability, even during network outages, thus maintaining service continuity.
  • Enhanced privacy: By processing incoming data on-device instead of transmitting it to the cloud, engineers can ensure higher levels of data privacy and security. This is particularly beneficial in industries where sensitive information is handled, as it reduces the risk of data breaches and ensures compliance with privacy regulations, making it easier to protect user data.
  • Improved reliability: As edge AI does not rely on continuous cloud connectivity, it can continue to function during network outages. This ensures that critical operations, like monitoring and control systems in manufacturing, remain active even if the cloud connection is lost.

Eng.com: What are common challenges associated with edge AI?

JF: While it’s becoming easier to implement AI models on the edge, organizations should be mindful of several challenges that accompany the technology:

  • Resource constraints: Edge devices typically have limited processing power, memory and storage. Complex AI models may run slowly or not at all. To mitigate this, proper care should be taken in selecting model architectures that are well-suited for the edge device they will eventually be deployed to. Additionally, models can be further optimized for edge deployment with compression techniques like projection, pruning and quantization.
  • Model deployment: Translating AI models from the high-level languages where they are defined and trained (like Python or MATLAB) to low-level languages that can be compiled to run on edge devices (like C or C++) can be challenging. MathWorks tools facilitate this process by automating the conversion, ensuring efficient deployment on a diverse range of hardware. For example, Airbus used GPU Coder to deploy deep learning models, trained in MATLAB for defect detection, onto embedded GPUs. GPU Coder automatically translated their MATLAB code into the corresponding CUDA code, which could be compiled and run on their embedded system.
  • Model maintenance: After deploying AI models to the edge, organizations should have a plan for keeping them updated over time. This can take several forms:
      • Over-the-air (OTA) updates, where new model files and weights are sent to edge devices over a network connection.
      • On-device training (or incremental learning), where models are updated and refined directly on the device using local data, allowing for personalization without the need to communicate with the cloud.
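To make the compression idea above concrete, here is a deliberately simplified sketch of post-training quantization: mapping floating-point weights to int8 with a single scale factor. Real toolchains use per-channel scales, calibration data and zero-points; the weight values here are invented.

```python
# Toy post-training quantization: symmetric int8 with one scale factor.
# Stores weights in roughly a quarter of the memory of float32.

def quantize_int8(weights: list[float]) -> tuple[list[int], float]:
    """Map floats into [-127, 127] using a single symmetric scale."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q: list[int], scale: float) -> list[float]:
    return [v * scale for v in q]

w = [0.52, -1.27, 0.03, 0.98]
q, scale = quantize_int8(w)
approx = dequantize(q, scale)
# approx is close to w, at the cost of small rounding error
```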

Eng.com: Are there edge AI use cases that are applicable across multiple industries?

JF: Beyond classic examples like image classification, object detection and semantic segmentation, one interesting application of edge AI that MathWorks is seeing across industries is virtual sensors (or software sensors). AI-based virtual sensors can be used to infer sensor readings that would be difficult or expensive to measure directly, by analysing data from other sensors in real time. One great example is estimating the state of charge of a battery. While difficult to measure directly, it can be inferred from other, more easily attainable values, like current, voltage and operating temperature. By using AI models trained on historical battery performance data, the virtual sensor can predict the state of charge more accurately and adapt to changes in battery health and usage patterns, providing real-time insights without the need for additional hardware. Virtual sensors are applicable to multiple industries, including automotive, aerospace, manufacturing and healthcare. As another example, Poclain Hydraulics used MATLAB to design and deploy a neural network-based virtual sensor for monitoring the temperature of motors used in power machinery.
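In the spirit of the battery example, the toy "virtual sensor" below estimates state of charge (SoC) from voltage, current and temperature with a linear model. The coefficients are invented for illustration only; in practice they would come from a model trained on historical battery data, as described above.

```python
# Hypothetical virtual sensor: infer battery state of charge (%) from
# measurements that are easy to obtain. Coefficients are made up for
# illustration, standing in for a trained model.

COEFFS = {"bias": -320.0, "voltage": 100.0, "current": -2.0, "temp": 0.1}

def estimate_soc(voltage: float, current: float, temp_c: float) -> float:
    soc = (COEFFS["bias"]
           + COEFFS["voltage"] * voltage
           + COEFFS["current"] * current
           + COEFFS["temp"] * temp_c)
    return min(100.0, max(0.0, soc))  # clamp to a valid percentage

print(estimate_soc(voltage=3.9, current=1.5, temp_c=25.0))  # -> 69.5
```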

Eng.com: Do you think that the trend toward AI and cloud-based IoT systems makes custom-built systems into dinosaurs? In other words, would a manufacturer be ‘crazy’ to consider building a solution in-house?

JF: While AI and cloud-based IoT systems offer scalable and cost-effective solutions, the decision to build a system in-house depends on a manufacturer’s specific needs and capabilities. Companies with specialized requirements or strong internal expertise may benefit from custom solutions, while others might prefer the speed and lower upfront costs of cloud-based platforms. Ultimately, the choice hinges on factors like customization, security and time to market.

Eng.com: As the complexity of the devices we use to monitor and maintain our equipment increases, is there a growing need to monitor and maintain the edge and IT devices as well? How do we do that?

JF: Yes, as the complexity and number of AI-enabled edge devices increases, so does the need for monitoring and maintenance. Over time, the input data to AI models can drift or differ significantly from the data they were originally trained on, negatively impacting model accuracy and performance. Organizations should anticipate this and consider approaches to continuously update their models, whether through OTA updates or incremental learning.

For more on edge AI, check out https://www.engineering.com/edge-ai-solutions-every-engineer-should-know-about.

How should I design my IIoT architecture?
https://www.engineering.com/how-should-i-design-my-iiot-architecture/
Tue, 05 Nov 2024 14:39:30 +0000
The goal for companies starting out should be flexibility, interoperability and incremental investment.

The post How should I design my IIoT architecture? appeared first on Engineering.com.

At the heart of an effective IIoT system is a modular architecture. This approach allows manufacturers to implement plug-and-play devices, which can be added or removed as necessary. For example, if a facility wishes to introduce a new type of sensor to monitor machine performance, it can do so without overhauling the entire system. This flexibility enables incremental upgrades that enhance capabilities progressively.

In addition, adopting a microservices architecture means that each component of the IIoT system operates independently. If a particular service—such as data collection or processing—needs improvement, it can be scaled or replaced without affecting the entire infrastructure. This targeted enhancement ensures that the system evolves alongside operational needs, fostering innovation and responsiveness.

Flexible data management

As data volumes increase, flexible data management becomes essential. Leveraging cloud solutions allows manufacturers to tap into virtually limitless data storage and processing capabilities, accommodating the influx of information from a growing number of IIoT devices. This scalability ensures that data can be collected and analyzed efficiently, supporting informed decision-making.

Moreover, integrating edge computing allows for local data processing. By analyzing data closer to where it is generated, manufacturers can reduce latency and bandwidth demands, resulting in quicker response times and more efficient analytics. This setup is particularly beneficial for real-time applications, where immediate insights can drive operational improvements.

Interoperability

To maximize the benefits of IIoT, interoperability is crucial. Adopting standard communication protocols such as MQTT or OPC UA lets new devices integrate seamlessly with existing systems. This standardization reduces compatibility issues and simplifies the addition of new technologies.
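As a sketch of what such standardization looks like in practice, the snippet below builds an MQTT-style topic and JSON payload for a device reading. The topic hierarchy and payload fields are conventions invented for this example (MQTT itself does not mandate them); actual publishing would use a client library such as paho-mqtt (e.g., `client.publish(topic, payload)`).

```python
# Build an MQTT-style topic and JSON payload for one sensor reading.
# The site/line/device hierarchy is an assumed naming convention.
import json

def to_mqtt_message(site: str, line: str, device: str,
                    metric: str, value: float) -> tuple[str, str]:
    topic = f"{site}/{line}/{device}/{metric}"
    payload = json.dumps({"metric": metric, "value": value})
    return topic, payload

topic, payload = to_mqtt_message("plant1", "lineA", "press03",
                                 "temperature", 74.2)
# topic == "plant1/lineA/press03/temperature"
```

A consistent topic hierarchy like this is what lets new devices and subscribers join without per-device custom integration.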

Open APIs further facilitate integration by connecting diverse applications and devices. This approach not only enhances the system’s scalability but also promotes innovation by enabling third-party developers to contribute new functionalities.

Adaptive network infrastructure

A robust networking infrastructure is essential for supporting the growth of IIoT systems. Investing in scalable solutions, such as 5G or private LTE, ensures the network can handle a growing number of connected devices without sacrificing performance. These high-capacity networks facilitate rapid data transfer, which is critical for real-time operations.

Mesh networking can also enhance connectivity. As the number of IIoT devices increases, a mesh network can improve reliability and coverage, letting devices communicate more effectively with each other and with central systems.

User-centric design

A focus on user-centric design is essential for an IIoT system to be accessible and useful. Developing intuitive interfaces enables users to interact with complex data and analytics. As new functionalities and devices are integrated, these interfaces should remain adaptable.

Customization options allow users to tailor their dashboards and data presentations. This flexibility ensures employees can concentrate on the metrics that matter most to their specific roles, enhancing productivity and engagement.

Incremental investment

By planning for phased implementation, organizations can gradually adopt IIoT technologies, assessing results and adjusting the strategy based on initial deployments. This method reduces the risk associated with large-scale changes and enables organizations to learn and adapt as they progress.

Starting with pilot programs provides an opportunity to test scalability in real-world conditions. These initial tests inform future investments and expansions, ensuring that the overall strategy aligns with operational goals.

Collaboration and ecosystem engagement

To keep pace with technological advancements, manufacturers must engage in collaboration and ecosystem engagement. Partnering with technology providers and stakeholders ensures that the IIoT ecosystem can evolve together, sharing insights and best practices.

Active community engagement in industry forums and engineering-focused websites such as Engineering.com helps manufacturers stay updated on emerging technologies and methodologies that facilitate scaling. By participating in these discussions, organizations can learn from the experiences of others and implement strategies that drive success.

Training and support

As systems evolve, training and support become critical. Providing continuous training for staff ensures that employees can effectively navigate new technologies and systems. This investment in human capital is essential for maximizing the benefits of IIoT.

Additionally, ensuring access to technical support helps organizations address challenges that arise during scaling. Support teams can assist with integration and troubleshooting, allowing manufacturers to focus on their core operations.

Feedback and iteration

Establishing feedback mechanisms is crucial for ongoing improvement. By collecting input from users and stakeholders, manufacturers can implement iterative enhancements to their IIoT systems as they scale. This feedback loop fosters a culture of continuous improvement and adaptation.

Encouraging teams to adapt to change is also vital. By promoting a culture that embraces innovation and is open to new ideas, organizations can optimize their IIoT implementations over time, ensuring they remain responsive to evolving operational needs.

The breakdown

Here’s a simplified (but not simple) breakdown of the typical layers in this type of modular architecture:

1. Device Layer (Edge Layer)

Components: Sensors, actuators and other smart devices.

Function: Collects data from machinery and equipment and may perform local processing to filter or aggregate data before transmission.

2. Connectivity Layer

Components: Communication protocols and network infrastructure.

Function: Facilitates communication between devices and central systems using wired (e.g., Ethernet) or wireless technologies (e.g., Wi-Fi, Bluetooth, LoRaWAN, cellular).

3. Data Ingestion Layer

Components: Gateways and edge computing devices.

Function: Manages the transmission of data from edge devices to cloud or on-premises servers, handling data aggregation and initial processing.

4. Data Processing and Analytics Layer

Components: Cloud or on-premises servers equipped with data analytics and machine learning tools.

Function: Analyzes the ingested data for insights, predictive maintenance, and operational optimization, utilizing advanced algorithms and models.

5. Storage Layer

Components: Databases and data lakes.

Function: Stores historical data for analysis, reporting and compliance, supporting both structured and unstructured data types.

6. Application Layer

Components: User interfaces, dashboards and applications.

Function: Provides tools for visualization, reporting, and user interaction, enabling stakeholders to make informed decisions based on data insights.

7. Security Layer

Components: Security protocols, encryption and access controls.

Function: Ensures data integrity and confidentiality, protecting the system from cyber threats and unauthorized access at all layers.

8. Integration Layer

Components: APIs and middleware.

Function: Enables integration with existing enterprise systems (like ERP, MES, and SCADA) for seamless data flow and operational coherence.
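As a highly simplified, hypothetical illustration of how data moves through these layers, the sketch below passes one batch of readings from the device layer through ingestion and processing to an application-layer summary. Function names, thresholds and values are invented; real layers would be separate systems, not functions in one script.

```python
# Toy data path through the layered architecture above.

def device_layer() -> list[float]:
    """Device layer: raw sensor samples (one is implausible)."""
    return [71.9, 72.1, 95.3, 72.0, -40.0]

def ingestion_layer(samples: list[float]) -> list[float]:
    """Ingestion layer: drop implausible values before transmission."""
    return [s for s in samples if 0.0 < s < 200.0]

def processing_layer(samples: list[float]) -> dict:
    """Processing/analytics layer: derive insights (here, a max and alarm)."""
    return {"max": max(samples), "alarm": max(samples) > 90.0}

def application_layer(result: dict) -> str:
    """Application layer: dashboard-ready summary string."""
    return f"max={result['max']} alarm={result['alarm']}"

print(application_layer(processing_layer(ingestion_layer(device_layer()))))
# -> max=95.3 alarm=True
```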

Wrap up

This layered modular architecture provides flexibility and scalability, allowing manufacturers to implement IIoT solutions tailored to their specific needs. By clearly defining each layer’s role, organizations can enhance interoperability, maintain security, and ensure that data flows effectively from devices to actionable insights. This structure facilitates incremental upgrades and the integration of new technologies as they become available.

How to plan data collection, storage and visualization in an IIoT deployment
https://www.engineering.com/how-to-plan-data-collection-storage-and-visualization-in-an-iiot-deployment/
Mon, 21 Oct 2024 19:32:23 +0000
Be sure to consider scalability and future-proofing to accommodate evolving manufacturing processes and technologies.

The post How to plan data collection, storage and visualization in an IIoT deployment appeared first on Engineering.com.


When it comes to an IIoT (Industrial Internet of Things) implementation in manufacturing, data collection, storage, analytics and visualization are the core backplane that drives actionable insights and enables smarter operations.

How do these components typically align in an IIoT system, and what considerations should a manufacturing engineer keep in mind when planning an implementation? It can certainly get complicated, but breaking things down into their smaller parts makes it more manageable.

Data Collection

The effectiveness of data collection largely depends on sensor architecture. Depending on the equipment or process, various types of sensors (temperature, pressure, vibration, etc.) need to be deployed across critical points in the manufacturing process. Ensure sensors are selected with appropriate accuracy, environmental tolerance and response time for the specific application.

A data acquisition system (DAS) acts as an interface between these sensors and the IIoT platform. It gathers real-time data from sensors and transmits it to the edge or cloud infrastructure. The big decision here is whether to use edge processing (local data pre-processing) or rely on centralized data gathering at the cloud level. Edge processing offers lower latency, making it ideal for real-time tasks. It also reduces bandwidth needs by processing data locally. However, it requires more upfront investment in hardware and can be harder to scale. In contrast, cloud processing handles large data volumes more easily and scales better, though it comes with higher latency and ongoing costs for bandwidth and storage. Cloud systems also need robust security measures for data transmission. A hybrid approach combining both edge and cloud processing might be an option that balances real-time processing with scalable, centralized data management, but it depends on each application and the desired outcomes.

The next big decision is to determine the optimal sampling rate. Too high a sampling frequency can overwhelm your storage and bandwidth, while too low a frequency may miss critical insights, particularly in dynamic manufacturing processes. Work with process engineers to set the data sampling frequency based on process variability, and revisit it often to ensure what you think is the optimal rate isn't leaking potential value.
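
The Nyquist criterion gives a useful lower bound here: sample at more than twice the fastest frequency of interest, with a practical safety margin on top. A minimal sketch, where the margin and example frequency are illustrative rather than prescriptive:

```python
# Sketch: choosing a sampling rate from process dynamics (illustrative only).
# Assumption: the fastest meaningful variation in the process is known in Hz.

def min_sampling_rate_hz(fastest_process_freq_hz: float, margin: float = 5.0) -> float:
    """Nyquist requires > 2x the highest frequency of interest; in practice
    a margin of roughly 5-10x is common so transients are not missed."""
    return 2.0 * margin * fastest_process_freq_hz

# A pressure loop whose fastest dynamics are ~0.5 Hz:
rate = min_sampling_rate_hz(0.5)          # 5.0 Hz with the default 5x margin
samples_per_day = rate * 60 * 60 * 24     # helps estimate storage/bandwidth load
```

Working backward from `samples_per_day` to storage and bandwidth cost is a quick way to sanity-check the rate with process engineers before deployment.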

If you are going to base major decisions on the insights gained through this IIoT system, you must ensure the integrity of collected data. This means that error checking (e.g., using checksums or hashing) and redundancy mechanisms (e.g., backup data paths or local buffering) are in place to handle network failures or sensor malfunctions.

A checksum is a small-sized piece of data derived from a larger set of data, typically used to verify the integrity of that data. It acts as a digital fingerprint, created by applying a mathematical algorithm to the original data. When the data is transmitted or stored, the checksum is recalculated at the destination and compared with the original checksum to ensure that the data has not been altered, corrupted or tampered with during transmission or storage.

Hashing is the process of converting input data into a fixed-size string of characters, typically a unique value (hash), using a mathematical algorithm. This hash is used for verifying data integrity, securing communication, and enabling fast data retrieval, with each unique input producing a unique hash.
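
Both mechanisms are available in Python's standard library. A minimal sketch of sender-side fingerprinting and receiver-side verification, with an invented payload for illustration:

```python
# Sketch: verifying payload integrity with a checksum (CRC32) and a hash (SHA-256),
# using only the Python standard library.
import zlib
import hashlib

payload = b'{"sensor":"oven-07","temp_c":212.4}'

# Sender side: compute and attach the fingerprints.
crc = zlib.crc32(payload)
digest = hashlib.sha256(payload).hexdigest()

# Receiver side: recompute and compare; a mismatch means corruption or tampering.
def verify(data: bytes, expected_crc: int, expected_digest: str) -> bool:
    return (zlib.crc32(data) == expected_crc
            and hashlib.sha256(data).hexdigest() == expected_digest)

assert verify(payload, crc, digest)              # intact
assert not verify(payload + b"x", crc, digest)   # altered in transit
```

CRC32 is cheap and catches accidental corruption; a cryptographic hash like SHA-256 is what you want when tampering, not just noise, is a concern.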

When planning sensor deployment, focus on critical assets and key process variables that directly impact production efficiency, quality or safety. Implementing a hierarchical sensor strategy (high-priority sensors collecting frequent data, lower-priority ones providing long-term insights) can help balance costs and data richness.

Data Storage

Here again you are faced with a decision between local (edge) storage and a centralized cloud environment. The same pros and cons apply as in data acquisition, but your needs may be different.

Edge storage is useful for real-time, low-latency processing, especially in critical operations where immediate decision-making is necessary. It also reduces the amount of data that needs to be transmitted to the cloud.

Cloud storage is scalable and ideal for long-term storage, cross-site access and aggregation of data from multiple locations. However, the bandwidth required for real-time data streaming to the cloud can be costly, especially in large-scale manufacturing operations.

Manufacturing environments typically generate large volumes of data due to high-frequency sensors. Plan for data compression and aggregation techniques at the edge to minimize storage overhead.

Lossless compression reduces data size without any loss of information, making it ideal for critical data. Popular algorithms include GZIP (effective for text data), LZ4 (fast and low-latency, suited to real-time systems) and Zstandard (Zstd), which offers high compression ratios with quick decompression for IIoT.
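
A quick way to see the effect is to compress a repetitive time-series string with standard-library codecs. LZ4 and Zstandard require third-party packages, so zlib (the DEFLATE algorithm behind GZIP) and lzma stand in here; the sample data is invented for illustration:

```python
# Sketch: comparing lossless codecs on repetitive sensor text.
import zlib
import lzma

# Time-series readings are highly repetitive, which lossless codecs exploit.
readings = ",".join(f"2024-10-21T19:{i % 60:02d}:00,line-3,72.{i % 10}"
                    for i in range(1000))
raw = readings.encode()

deflated = zlib.compress(raw, level=6)   # DEFLATE, as used by GZIP
xz = lzma.compress(raw)                  # slower, usually a higher ratio

assert zlib.decompress(deflated) == raw  # lossless: original recovered exactly
assert lzma.decompress(xz) == raw
print(len(raw), len(deflated), len(xz))  # compressed sizes are far below raw
```

The round-trip assertions are the point of "lossless": whatever the ratio, the original bytes come back exactly.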

Lossy compression, on the other hand, is suitable for sensor data where some precision loss is acceptable in exchange for better compression. Wavelet compression is efficient for time-series data, and JPEG/MJPEG is often used for images or video streams, reducing size while maintaining most visual information.

Data aggregation techniques help reduce data volume by combining or filtering information before transmission. Summarization involves averaging or finding min/max values over a time period. Sliding window aggregation and time bucketing group data into time intervals, reducing granularity. Event-driven aggregation sends data only when conditions are met, while threshold-based sampling and change-detection algorithms send data only when significant changes occur. Edge-based filtering and preprocessing ensure only relevant data is transmitted, and spatial and temporal aggregation combines data from multiple sources to reduce payload size.
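
Two of these techniques, time bucketing and threshold-based sampling, can be sketched in a few lines. The sample stream, bucket width and threshold are invented for illustration:

```python
# Sketch: edge aggregation — time bucketing (summarize each interval) and
# threshold-based sampling (transmit only on significant change).
from statistics import mean

samples = [(t, 70 + (t % 7)) for t in range(0, 120)]  # (seconds, reading)

# Time bucketing: one min/mean/max summary per 30-second window.
def bucketize(samples, width_s=30):
    buckets = {}
    for t, v in samples:
        buckets.setdefault(t // width_s, []).append(v)
    return {k: (min(vs), mean(vs), max(vs)) for k, vs in buckets.items()}

# Threshold-based sampling: send only when the value moves enough.
def changed_only(samples, threshold=2.0):
    out, last = [], None
    for t, v in samples:
        if last is None or abs(v - last) >= threshold:
            out.append((t, v))
            last = v
    return out

summaries = bucketize(samples)   # 120 raw points -> 4 window summaries
sparse = changed_only(samples)   # far fewer points than the raw stream
```

Either technique on its own cuts the transmitted volume substantially; in practice they are often combined, with summaries sent on a schedule and threshold events sent immediately.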

Because edge devices often operate in resource-constrained environments, deal with real-time data and must efficiently manage the communication between local systems and central servers, there are several edge-specific considerations for optimizing data management in IIoT systems. For real-time applications, techniques like streaming compression (e.g., LZ4) and windowed aggregation help minimize latency by processing data locally. Delta encoding reduces data size by only transmitting changes from previous values, minimizing redundancy. Additionally, hierarchical aggregation allows data to be aggregated at intermediate nodes, such as gateways, before being sent to the central system, further reducing the transmission load and improving overall efficiency in multi-layered edge networks. These considerations are uniquely suited to edge computing because edge devices need to be efficient, autonomous, and responsive without relying heavily on central systems or expensive bandwidth.
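
Delta encoding is simple enough to show end to end: transmit the first value, then only the differences, which are small and therefore pack into fewer bits or compress better downstream. The fixed-point readings here are invented for illustration:

```python
# Sketch: delta encoding for a slowly changing sensor channel.

def delta_encode(values):
    deltas = [values[0]]                 # first value sent in full
    for prev, cur in zip(values, values[1:]):
        deltas.append(cur - prev)        # then only the changes
    return deltas

def delta_decode(deltas):
    values = [deltas[0]]
    for d in deltas[1:]:
        values.append(values[-1] + d)    # rebuild by accumulating deltas
    return values

temps = [2100, 2101, 2101, 2102, 2104, 2103]   # fixed-point (0.1 degC units)
encoded = delta_encode(temps)                  # [2100, 1, 0, 1, 2, -1]
assert delta_decode(encoded) == temps          # round-trip is exact
```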

You’ll also need a storage architecture that can scale to accommodate both current and future data growth. In addition, implement a robust redundancy and backup strategy. With critical manufacturing data, losing information due to hardware failure or network issues can be costly. Redundant storage, preferably in different geographic locations (for disaster recovery), is crucial for resilience.

TIP: For time-sensitive data (e.g., real-time process control), store at the edge and use data batching for non-urgent data that can be transmitted to the cloud periodically, reducing latency and network costs.
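
A sketch of the batching idea, with an in-memory list standing in for the actual upload call; the batch size and record shape are invented for illustration (a real implementation would also flush on a timer so records never wait indefinitely):

```python
# Sketch: edge batching for non-urgent data — buffer locally, flush upstream
# only when the batch is full.

class BatchUploader:
    def __init__(self, batch_size=100):
        self.batch_size = batch_size
        self.buffer = []
        self.sent_batches = []           # stand-in for the cloud upload call

    def add(self, record):
        self.buffer.append(record)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.buffer:
            self.sent_batches.append(list(self.buffer))  # one upload per batch
            self.buffer.clear()

uploader = BatchUploader(batch_size=100)
for i in range(250):
    uploader.add({"seq": i, "temp": 71.5})
uploader.flush()                          # push the final partial batch
# 250 records -> 3 uploads instead of 250 network round-trips
```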

Analytics

Real-time analytics is essential for immediate decision-making (shutting down a faulty machine or adjusting a process parameter), while historical analytics provides long-term insights into trends and performance (predictive maintenance, yield optimization).

To enable real-time analytics, data should undergo initial pre-processing and filtering at the edge, so that only relevant insights or alerts are passed to the cloud or central system. This reduces data transfer overhead and minimizes latency in decision-making. For long-term analysis (identifying trends, root cause analysis), use batch processing techniques to handle large datasets over time. Machine learning (ML) and AI models are increasingly integrated into IIoT systems to identify anomalies, predict failures or optimize operations based on historical data.
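
One common edge pre-filter is a rolling z-score: readings close to the recent mean stay local, while outliers are forwarded as alerts. A minimal sketch, with window size, threshold and the sample stream all chosen for illustration:

```python
# Sketch: edge pre-filtering — only anomalous readings (far from the rolling
# mean) are forwarded upstream; normal data stays at the edge.
from collections import deque
from statistics import mean, stdev

def anomaly_filter(stream, window=20, z_threshold=3.0):
    recent = deque(maxlen=window)
    alerts = []
    for t, v in stream:
        if len(recent) >= 5:             # wait for a minimal baseline
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(v - mu) / sigma > z_threshold:
                alerts.append((t, v))    # forward to cloud / alerting
        recent.append(v)                 # everything else stays local
    return alerts

normal = [(t, 100.0 + (t % 3) * 0.1) for t in range(50)]
spike = [(50, 140.0)]                    # a clear deviation
assert anomaly_filter(normal + spike) == [(50, 140.0)]
```

The same structure generalizes: swap the z-score test for any local model, and only its positive detections cross the network.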

IIoT analytics is more than just looking at individual sensor data; it’s about correlating data across multiple devices, sensors and even different factory lines to uncover patterns. Implement data fusion techniques where data from different sensors or sources can be combined to improve the accuracy and richness of insights.

Visualization

Visualization tools are essential for both operators and decision-makers to quickly assess the performance of processes and machines. These should include customizable dashboards that display real-time key performance indicators (KPIs) like throughput, efficiency, downtime and machine health. KPIs should be linked to the specific objectives of the manufacturing process.

For process optimization and long-term planning, historical trends and patterns should be visualized clearly. This allows for root-cause analysis, identifying inefficiencies and making data-driven decisions about process improvements.

These visualizations should be tailored to different user roles. Operators need real-time alerts and immediate insights into machine performance, while managers or engineers might need access to historical data and trend analysis. Design the user interface (UI) and access controls with these distinctions in mind.

For advanced implementations, digital twins and augmented reality can be used to simulate and visualize complex data in 3D. Digital twins create a virtual replica of the manufacturing environment, allowing engineers to monitor and optimize operations without needing to be physically present.

Planning IIoT implementations

When planning IIoT in manufacturing, focus on building a scalable, resilient and secure architecture for data collection, storage, analytics and visualization. Ensure that data collection is optimized to balance cost and data richness, using both edge and cloud storage appropriately. Analytics capabilities should provide real-time decision support while enabling deep insights through predictive maintenance and long-term performance analysis. Visualization tools should cater to different user needs, ensuring clear, actionable insights through both real-time dashboards and historical data views. Keep in mind the challenges of data volume, latency, network bandwidth and data integrity as you design the IIoT system, with attention to scalability and future-proofing the infrastructure to accommodate evolving manufacturing processes and technologies.

]]>
6 Best Practices When Developing XR for Industrial Applications https://www.engineering.com/resources/6-best-practices-when-developing-xr-for-industrial-applications/ Mon, 21 Oct 2024 13:44:35 +0000 https://www.engineering.com/?post_type=resources&p=133025 Through Industry 4.0 and the industrial internet of things (IIoT), developers have brought industry into the digital realm. Industry experts can learn, control and share anything about a process with a few clicks. But these experts are still limited by their physical connections.

The post 6 Best Practices When Developing XR for Industrial Applications appeared first on Engineering.com.

]]>

Developers, however, can start to blend the physical and digital realms via technologies like virtual reality (VR), augmented reality (AR) and mixed reality (MR) — collectively referred to as extended reality (XR). But this dream is still in its infancy. As a result, developers need guidelines to ensure they are going down the correct path when creating XR experiences.

In this 7-page ebook, developers will learn:

  • How XR is bound to change industry.
  • Which challenges exist when making XR experiences for industry.
  • Six best practices to keep the development of industrial XR experiences on track.
  • How Unity can help make industrial XR experiences a reality.

To download your free ebook, fill out the form on this page. Your download is sponsored by Unity Technologies.

]]>
What are the roles of sensors and actuators in IIoT? https://www.engineering.com/what-are-the-roles-of-sensors-and-actuators-in-iiot/ Mon, 07 Oct 2024 19:48:25 +0000 https://www.engineering.com/?p=132533 Sensors are the eyes and ears of your operation and actuators are the hands.

The post What are the roles of sensors and actuators in IIoT? appeared first on Engineering.com.

]]>

Every manufacturing engineer considering an IIoT implementation should put considerable focus into how the systems contribute to data collection, real-time decision-making and automated control within the production environment.

Sensors are the eyes and ears of your operation. These data collection devices continuously monitor physical and environmental parameters on the shop floor, and they have been developed to measure almost any condition you'll find there. Here are some common types:

  • Temperature (for controlling furnaces or ovens)
  • Pressure (for monitoring hydraulic or pneumatic systems)
  • Vibration (for detecting imbalance in motors or machinery)
  • Humidity (for ensuring optimal conditions in certain manufacturing processes)
  • Proximity (for part detection on a conveyor belt or pallet)
  • Torque and force (for ensuring precise assembly or machining)

These days, most sensors provide real-time data that are essential for understanding the status of machines, the health of equipment and the quality of products.

Sensors can capture data continuously or at regular intervals, feeding it back to a centralized system or edge devices. This data allows you to monitor machine performance and production quality in real-time. By continuously monitoring conditions such as temperature, vibration and pressure, sensors can help predict equipment failures before they happen—enabling predictive maintenance strategies. This minimizes downtime and unplanned repairs. Sensors can also ensure product quality by tracking parameters such as size, weight or chemical composition, ensuring products are within acceptable tolerances.
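
As a toy illustration of that machine-health timeline, a linear trend fitted to daily vibration readings can be projected forward to a wear threshold. Real predictive-maintenance systems use far richer models; every number here is invented:

```python
# Sketch: naive predictive-maintenance estimate — least-squares trend on daily
# vibration RMS, projected to the point where it crosses a failure threshold.

def days_until_threshold(readings, threshold):
    """readings: one RMS vibration value per day (at least two values)."""
    n = len(readings)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(readings) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, readings))
             / sum((x - x_mean) ** 2 for x in xs))
    if slope <= 0:
        return None                      # not trending toward failure
    return max(0.0, (threshold - readings[-1]) / slope)

rms = [1.0, 1.1, 1.2, 1.3, 1.4]          # mm/s, rising ~0.1 per day
eta = days_until_threshold(rms, threshold=2.4)   # ~10 days of margin left
```

An estimate like `eta` is what feeds long-term production scheduling: maintenance can be booked into a planned stoppage instead of forced by a breakdown.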

The data collected by sensors is sent to centralized cloud systems or edge devices for real-time analysis, enabling manufacturers to make informed decisions on production adjustments and process improvements.

Actuators: The Hands of Your IIoT System

Once sensors collect and transmit data, actuators play the critical role of executing actions based on the data received. Actuators are devices that respond to control signals by performing physical tasks, including:

  • Opening or closing a valve (to control fluid or gas flow in a pipeline)
  • Adjusting motor speeds (for conveyor belts or robotic arms)
  • Turning machines on or off (for automated start/stop of equipment)
  • Controlling temperature (by activating heating or cooling systems)
  • Moving robotic arms or equipment (for assembly, material handling or other precision tasks)

In an IIoT system, actuators are responsible for automating responses to specific conditions detected by sensors. This creates the foundation for closed-loop control systems that can operate independently of human intervention. For example, if a temperature sensor detects overheating, the actuator could activate a cooling system without manual intervention. This automation reduces human labor and the chances of errors or inefficiencies in production. It also speeds up response times to deviations, minimizing waste and downtime.

Actuators can also adjust machine settings dynamically. For example, based on real-time data, they can modify the speed or pressure of a machine, ensuring the production process adapts to the changing needs of the workflow.

In more advanced IIoT setups, edge computing and AI-driven algorithms use sensor data to make autonomous decisions, triggering actuators without human oversight. This could be as simple as adjusting a process or as complex as rerouting products based on real-time data streams.

Working together in IIoT

In a typical IIoT system, the interaction between sensors and actuators follows a continuous cycle of data collection and response, which is often referred to as closed-loop control. Here’s an example:

  1. Sensors detect changes: A temperature sensor detects that the temperature in a furnace is rising above the set threshold.
  2. Data is sent: The sensor transmits this information to the controller (either an edge device or cloud platform) in real-time.
  3. Data is analyzed: The controller analyzes the data and determines that corrective action is needed (e.g., the furnace is overheating).
  4. Actuator takes action: Based on the analysis, the controller sends a signal to an actuator that opens a valve to release cooling air or turns on a cooling system.
  5. Process adjustment: The actuator performs the task, and the sensor continues to monitor the process, feeding back data to ensure the temperature returns to safe levels.
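
The cycle just described reduces to a small control loop. A minimal on/off (hysteresis) sketch, where the furnace model, setpoints and cooling effect are all invented for illustration; a real controller would live in a PLC or edge device:

```python
# Sketch: closed-loop control as an on/off controller with a hysteresis band.

def control_step(temp_c, cooling_on, high=250.0, low=240.0):
    """Analyze the reading and decide the actuator command."""
    if temp_c > high:
        return True       # overheating: open the cooling valve
    if temp_c < low:
        return False      # back in range: close it
    return cooling_on     # inside the band: keep the current state

temp, cooling = 245.0, False
history = []
for _ in range(12):
    temp += -3.0 if cooling else 2.0       # toy furnace physics (feedback)
    cooling = control_step(temp, cooling)  # sense -> analyze -> actuate
    history.append(temp)

assert max(history) <= 253.0               # cooling engages before runaway
```

The hysteresis band (240-250 here) is what keeps the actuator from chattering on and off around a single setpoint.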

Benefits of sensors and actuators in manufacturing

Increased Production Efficiency:

Sensors and actuators enable real-time adjustments to processes, ensuring that machines operate within optimal parameters. This minimizes downtime and keeps production flowing smoothly.

Enhanced Predictive Maintenance:

Continuous data from sensors allows for early detection of wear and tear or impending failures, reducing the need for reactive maintenance and minimizing unexpected breakdowns. Actuators can automatically adjust processes to prevent equipment damage.

Improved Quality Control:

Sensors track key quality metrics, and actuators can adjust the process instantly to ensure product quality remains consistent, reducing waste and scrap.

Operational Flexibility:

Sensors and actuators provide greater control over manufacturing systems, enabling them to respond flexibly to changes in production schedules, environmental factors, or even supply chain disruptions.

Cost Reduction:

Automation through sensors and actuators can lower labor costs and reduce human error. Moreover, optimized processes lead to less material waste, contributing to overall cost savings.

Data-Driven Decision Making:

By integrating sensors and actuators with a central data system (cloud or edge-based), manufacturers can leverage real-time analytics to gain actionable insights and make informed decisions to improve efficiency and productivity.

Common challenges

Let’s face it, maintaining a network of sensors and actuators and similar technology in a manufacturing environment can be tricky. Many environmental and workflow factors can result in degraded performance, even if they aren’t integrated into a broader IIoT implementation.

However, in IIoT manufacturing systems, several challenges are directly related to the integration of sensors and actuators into the broader industrial network. One key issue is communication latency and bandwidth limitations. IIoT systems rely heavily on real-time data transfer between sensors, actuators and control systems. Latency or insufficient bandwidth can delay data transmission or actuator responses, which is particularly troublesome in time-sensitive applications where quick reactions are essential.

Another challenge is connectivity and reliability issues. Since IIoT systems often involve wireless communication (e.g., Wi-Fi, LPWAN, or other IoT protocols), connectivity problems like signal dropouts, weak coverage or protocol incompatibility can disrupt the flow of critical data. In a networked environment, these disruptions can lead to missed sensor readings or commands not reaching actuators, causing downtime or unsafe conditions.

The sheer volume of data generated by IIoT devices can also lead to data overload and management challenges. With sensors constantly transmitting data, storage and processing systems can quickly become overwhelmed, making it difficult to extract actionable insights or react quickly to system needs. This can hinder operational efficiency, slow decision-making, and complicate data analysis.

Security vulnerabilities are another significant concern in IIoT systems. As sensors and actuators become more interconnected, they are exposed to potential cyber threats. Hackers could access the network to manipulate sensor data or control actuators, posing serious risks to both data integrity and physical safety.

Lastly, sensor and actuator compatibility can be an issue when integrating devices from different manufacturers or upgrading legacy systems. IIoT environments require seamless communication between different components, and incompatible sensors, actuators or communication protocols can lead to integration problems, system inefficiencies or even failures in real-time operations.

To address these challenges, best practices include using real-time networking protocols, implementing strong cybersecurity measures, employing edge computing to process data closer to the source, and ensuring that systems are compatible and interoperable across the IIoT network. These steps help ensure that the IIoT infrastructure operates reliably and efficiently.

]]>
What are the connectivity considerations in an IIoT implementation? https://www.engineering.com/what-are-the-connectivity-considerations-in-an-iiot-implementation/ Fri, 04 Oct 2024 15:18:53 +0000 https://www.engineering.com/?p=132475 Connectivity is the foundation of any Industrial Internet of Things (IIoT) implementation. For engineers, it’s not just about ensuring that devices and systems can talk to each other; it’s about choosing the right network architecture, protocols and security strategies to meet operational goals. In IIoT, connectivity refers to the ability of machines, sensors and control […]

The post What are the connectivity considerations in an IIoT implementation? appeared first on Engineering.com.

]]>

Connectivity is the foundation of any Industrial Internet of Things (IIoT) implementation. For engineers, it’s not just about ensuring that devices and systems can talk to each other; it’s about choosing the right network architecture, protocols and security strategies to meet operational goals.

In IIoT, connectivity refers to the ability of machines, sensors and control systems to communicate over networks. This enables real-time data exchange and interaction between devices, local networks, edge systems and centralized cloud platforms. In IIoT implementations, this connectivity is critical to enabling the flow of data needed for process optimization, predictive maintenance, remote monitoring and real-time decision-making.

IIoT devices can range from sensors to actuators to industrial machines. For devices to exchange data directly, you’ll typically use machine-to-machine (M2M) protocols. Engineers must ensure that these devices can communicate over low-latency and robust protocols that handle the real-time data flows characteristic of industrial environments.

Protocols like Modbus, OPC UA and MQTT are industry standards for device-to-device communication in IIoT. While these three are the cornerstones, there are many other protocols to choose from depending on the application, environment and system requirements. Each protocol comes with its own set of strengths and weaknesses, so it’s important to assess performance, security, scalability and interoperability when selecting a protocol for your IIoT architecture.

Another consideration is protocol overhead, which is the extra information that communication protocols add to manage data transmission, handle security, ensure data integrity and support real-time operation. While necessary for reliable, secure communication, overhead can reduce bandwidth efficiency, increase latency and consume more power, which is especially problematic in IIoT environments. Understanding and managing protocol overhead is essential for optimizing performance and efficiency in IIoT implementations.
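
Payload encoding is one easily measured slice of this overhead. A sketch comparing the same reading as verbose JSON versus a compact fixed binary record; the field names and record layout are invented for illustration:

```python
# Sketch: encoding overhead — JSON vs. a fixed-width binary record.
import json
import struct

reading = {"device_id": 1042, "timestamp": 1729539143, "temp_c": 212.4}

as_json = json.dumps(reading).encode()
# '<IIf': little-endian uint32 id + uint32 unix time + float32 temp = 12 bytes
as_binary = struct.pack("<IIf", reading["device_id"], reading["timestamp"],
                        reading["temp_c"])

print(len(as_json), len(as_binary))   # the JSON form is several times larger

# The binary form is still fully recoverable:
dev, ts, temp = struct.unpack("<IIf", as_binary)
assert (dev, ts) == (1042, 1729539143) and abs(temp - 212.4) < 0.01
```

Multiplied across thousands of sensors sending readings every second, that per-message difference is exactly the bandwidth and power cost the paragraph above describes.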

Edge connectivity

Edge devices (often called edge gateways or edge controllers) act as intermediaries between the industrial devices and the cloud. They handle preprocessing and data aggregation before sending relevant information upstream.

Implementing edge computing reduces latency, conserves bandwidth and allows for real-time decision-making at the device level. Edge architecture must be scalable and secure, often integrating with local databases or edge AI algorithms to run complex analytics.

Cloud connectivity and platform integration

IIoT relies heavily on cloud-based platforms for long-term data storage, aggregation, advanced analytics and remote monitoring. Cloud platforms offer scalable environments for handling data streams from devices in the field.

Ensuring reliable connectivity between edge nodes and the cloud is vital. Engineers should also focus on data integrity and network reliability, optimizing data protocols to reduce packet loss and latency.

Common protocols and data handling

MQTT is lightweight, supports real-time data and works well in low-bandwidth environments, making it ideal for IIoT where data volumes can be massive but not all data needs to be sent in real-time.

OPC UA is widely used in industrial settings for real-time data exchange between PLCs and other industrial automation equipment. It also supports security, which is a critical concern in industrial systems.

RESTful APIs or HTTP/HTTPS are more suitable for web-based interfaces or when integrating IIoT with existing enterprise IT systems but may not offer the real-time capabilities needed for certain mission-critical operations.

How to Address Connectivity Challenges

Industrial environments can be challenging for connectivity due to electromagnetic interference, harsh environments and network congestion. Implement redundant networks (dual Ethernet, cellular backup) for failover in case of primary network failures. Mesh networking in IIoT can increase reliability in environments with intermittent connectivity.

Engineers will often deal with scaling from dozens to thousands of devices over a large geographical area. To support this, it’s important to architect networks that can grow without compromising performance. This may involve local edge computing to handle localized data aggregation and minimize bandwidth requirements.

Security is paramount in IIoT, especially when sensitive operational data and critical infrastructure are involved. Use end-to-end encryption (TLS, AES) and secure communication protocols (like OPC UA with security features enabled). Additionally, ensuring device authentication, role-based access control and network segmentation can help protect against cyber threats.

Zero-trust architectures are becoming increasingly popular in industrial networks to ensure that no device or user is implicitly trusted.

Latency and bandwidth optimization

Low latency is crucial for time-sensitive operations, such as real-time control or automated responses in manufacturing. 5G is being explored for IIoT because it offers low latency and high bandwidth, while LPWAN technologies (Low Power Wide Area Networks, such as LoRaWAN) trade bandwidth for long-range, low-power communication.

You should also look at how data is being transmitted. Use data compression, aggregation and edge processing to reduce the volume of data being sent over the network.

Technologies enhancing IIoT connectivity

With the advent of 5G, IIoT is gaining a huge advantage in terms of bandwidth and low latency. 5G allows for high-density device support and real-time communication, ideal for applications like autonomous vehicles, smart grids and advanced robotics in factories.

For environments where power efficiency is crucial and devices are spread across large areas, such as farms, pipelines or smart cities, LPWAN protocols offer extended range and low power consumption with relatively low bandwidth needs.

Edge computing reduces the need to send every bit of data to the cloud, providing a more efficient means of processing high volumes of data locally. This can include real-time anomaly detection or local decision-making that reduces latency and bandwidth needs.

Best practices for IIoT implementation

In industrial settings, systems and machines from multiple manufacturers may need to communicate with each other. Ensure your connectivity infrastructure allows for interoperability through open standards (like OPC UA) and modular architectures that can easily integrate with third-party equipment.

Track all data flows and network performance with network monitoring tools and data governance frameworks. This will help in troubleshooting, performance tuning and meeting compliance standards.

Architect your IIoT system in a modular way so new devices or protocols can be integrated without requiring a full system redesign. This modularity supports future-proofing the system as new technologies emerge.

For engineers implementing IIoT, connectivity is a multi-faceted challenge that involves choosing the right protocols, designing reliable and secure networks, optimizing for scalability and latency and ensuring devices can communicate efficiently across systems. The foundation for a successful IIoT implementation lies in robust, scalable and secure connectivity, enabling real-time data flow, remote monitoring and proactive decision-making.

]]>
Processing at the edge takes off https://www.engineering.com/processing-at-the-edge-takes-off/ Tue, 01 Oct 2024 19:57:50 +0000 https://www.engineering.com/?p=132351 Your data lives at the edge and determining how to connect processes can improve manufacturing and monitoring.

The post Processing at the edge takes off appeared first on Engineering.com.

]]>
The Nvidia IGX Orin platform (left) is used in healthcare, industrial inspection and robotics (from top to bottom, on right). Source: Nvidia

Real-time and near real-time processing at the edge is more common than ever, thanks to improvements in chips and batteries. Yet a variety of logistical and technical problems still challenge companies engaging in such processing. Fortunately, every deployment is an opportunity for these businesses to learn from their own experience and from one another.

Implementing Industry 4.0 practices in real-time and near real-time processing at the edge requires evaluating how current procedures can be improved. Beneficial changes enable companies to handle numerous scenarios involving interconnected procedures. For example, ensuring adequate security at the edge is best accomplished as a shared goal between business partners, drawing on two or more tools, such as encryption and two-factor authentication.

Recent changes that have increased the amount of real and near real-time processing at the edge include a current capability of up to 20 trillion operations per second (TOPS) for standard semiconductors, as opposed to a single TOPS a few years ago; faster speed and lower power consumption in different networks, from Long Range Wide Area Network (LoRaWAN) to 5G; and better software, including more Artificial Intelligence (AI) models, as well as new data sets and tools.

“The edge is where the data come from. Bringing the processing to companies working in these spaces is the goal. Such action can bring deployment time down by as much as a third, like from 18 months to six months. That presents cost savings and better opportunities to leverage AI,” says Pete Bernard, Executive Director of tinyML Foundation.

tinyML is a Seattle-based nonprofit that focuses on low-power AI at the edge of the cloud. Its members include large corporations such as Qualcomm and Sony, academic institutions like Johns Hopkins University and nongovernmental organizations.

“tinyML holds frequent events to build community around the concept of the edge. We educate people about the potential of working at it. Our programs include contests, conferences, hackathons and workshops. One of the concepts we are considering now is data provenance,” says Bernard.

This idea relates to the watermarking of data sets and models. AI models must be trained on data sets. Stamping provenance helps users identify sources of data and the developers behind them. Such work makes it easier to integrate different data sets and models.
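The provenance idea can be made concrete with a small sketch: wrap a data set with a content hash plus metadata naming its source and developer, so anyone integrating it later can check that the data still matches its stamp. The schema and field names here are invented for illustration, not any standard tinyML format.

```python
import hashlib
import json
from datetime import datetime, timezone

def stamp_provenance(records: list, source: str, developer: str) -> dict:
    """Bundle a data set with a provenance record: a content hash plus
    metadata identifying where the data came from (illustrative schema)."""
    body = json.dumps(records, sort_keys=True).encode()
    return {
        "data": records,
        "provenance": {
            "sha256": hashlib.sha256(body).hexdigest(),
            "source": source,
            "developer": developer,
            "stamped_at": datetime.now(timezone.utc).isoformat(),
        },
    }

def verify_provenance(bundle: dict) -> bool:
    """Confirm the data still matches its recorded hash."""
    body = json.dumps(bundle["data"], sort_keys=True).encode()
    return hashlib.sha256(body).hexdigest() == bundle["provenance"]["sha256"]

bundle = stamp_provenance([{"t": 0, "vibration": 0.12}], "line-3-sensor", "acme-ml")
print(verify_provenance(bundle))  # True while the data is unmodified
```

Because the stamp travels with the data, mixing several stamped sets still lets each record's origin be traced back to its developer.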

Software for the edge

Simplifying edge operations is easier to accomplish with software designed for that purpose, like Dell’s NativeEdge platform. 

Dell’s NativeEdge platform helps enterprises work with data generated at the edge. Source: Dell

“With NativeEdge, a client can build an AI model to operate at the edge. They can retrain the model onsite at the edge. This saves money and gives them the ability to scale up the solution as needed,” says Pierluca Chiodelli, Vice President, Edge Engineering and Product Management at Dell Technologies.

Dell sees security as the biggest challenge for clients.

A company that tries to do everything itself runs the risk of exposing information. Any entity that generates data must protect it both where it is created and where it is stored.

Dell is enhancing security by working closely with NVIDIA, which developed the AI Enterprise software integrated with NativeEdge’s engine.

“Inference at the edge, which involves gathering data with AI techniques, is really important. Everybody needs to have a way to deploy and secure that. Also a company has to maintain its AI stack, the tools and services to use AI correctly. It must have a blueprint to update all the pieces of the puzzle,” says Chiodelli.

As the different components of an AI stack can change, a company must be aware of all of them and how they interact. This helps the company make the necessary adjustments in proportion and on the appropriate timeline. Such work prevents deviations in manufactured products and slowdowns in production time. It also minimizes the time needed to retrain AI models and workers.

The market for the edge is growing

Nvidia is working on numerous hardware and software applications to meet the needs of companies utilizing edge computing. The company sees this market as expanding: a March 2024 forecast from the International Data Corp. (IDC) put expected worldwide spending on edge computing at $232 billion this year.

One of Nvidia’s platforms for the edge is the Nvidia IGX Orin with NVIDIA Holoscan, which is designed for real-time AI computing in industrial and medical environments. This platform provides high performance hardware and enterprise AI software. The platform is for companies working in robotics, healthcare, scientific research, video analytics and broadcasting.

In scientific computing, the Nvidia IGX Orin with Holoscan platform has the power to stream high-bandwidth sensor data to the GPU. It can use AI to detect anomalies, drive sensor autonomy and lower the time to scientific insights. In the medical space, Magic Leap has already integrated Holoscan in its extended reality (XR) software stack to enhance the capabilities of customers. This has allowed one of its clients in software development to provide real-time support for minimally invasive treatments of stroke.

It’s difficult to establish interoperability across systems, says Chen Su, Senior Technical Product Marketing Manager of Edge AI and Robotics for Nvidia.

 “Today there are numerous installed legacy systems that weren’t originally designed with AI capabilities in mind. Integrating AI into those systems and still achieving real-time performance continues to pose a significant challenge. This can be overcome by developing industry-wide standards that can meet the complex connectivity requirements across sensors, actuators, control systems and interfaces,” says Su.

Once the task above is accomplished, the entire edge AI system will have no bottleneck in communication. It can then act in a software-defined manner, making the system more flexible and easier to manage.

STMicroelectronics (ST), a global manufacturer and designer of semiconductors, meets the needs of companies that process data in real-time and near real-time with a variety of edge AI tools and products.

These include pure-software edge AI solutions for the STM32 and Stellar-E microcontrollers (MCUs); the upcoming STM32N6, a high-performance STM32 MCU with ST's proprietary Neural Processing Unit (NPU); and the STM32MP2 microprocessor series.

Danilo Pau, Technical Director in System Research and Applications at STMicroelectronics, says advances in embedded AI computing that enable processing at the edge require higher energy efficiency. This is made possible by a mix of assets, including super-integrated NPU accelerators, In-Memory Computing (IMC) and ST's 18nm Fully Depleted Silicon On Insulator (FD-SOI) technologies. Such resources can be super-integrated close to standard MCU and microprocessor unit (MPU) cores for viable, high-volume, low-cost manufacturing.

“There is also the super-integration of heterogeneous technologies in a single package achieved by the Intelligent Sensor Processing Unit (ISPU) and Machine-Learning Core (MLC) product families. In a tiny package, micro-electromechanical systems (MEMS) sensors, analog and digital technologies are stacked for large and cheap sensor volumes. They engage in microwatt power consumption. This is a fundamental contribution that enables the incoming trillion-sensor economy envisaged by many IoT experts,” says Pau.

Organizations like tinyML Foundation play an important role in the business community. Since 2018, tinyML has encouraged many companies to invest in generative AI at the edge (edgeGenAI).

Pau says there is a need for even greater energy efficiency and further super-integration of heterogeneous technologies, including NPUs, IMC, deep submicron technologies and sensors.

“The vision is to design embedded systems that match the energy efficiency of the human brain,” says Pau.

He adds that companies will increasingly need education on edge AI technologies and tools, and mastery of the associated skills.

That fact explains why ST, which is currently Europe’s largest designer and manufacturer of custom semiconductors, is an active part of the tinyML community.

“ST works with many actors in the edgeGenAI ecosystem. We’re eager to see this ecosystem expand and serve AI developers in the best and most productive way. That will ease their path in bringing innovation to the edge AI market,” says Pau.

The post Processing at the edge takes off appeared first on Engineering.com.

How quantum computing is already changing manufacturing https://www.engineering.com/how-quantum-computing-is-already-changing-manufacturing/ Tue, 24 Sep 2024 13:23:55 +0000 https://www.engineering.com/?p=132147 The prospects for hybrid quantum optimization algorithms in the manufacturing industry are particularly promising.

The post How quantum computing is already changing manufacturing appeared first on Engineering.com.

A laser setup for cooling, controlling and entangling individual molecules at Princeton University. (Image: National Science Foundation/Photo by Richard Soden, Department of Physics, Princeton University).

Various industries are becoming increasingly aware of the potential of quantum technology and the prospects for the manufacturing industry are particularly promising. There are already quantum algorithms being used for specific manufacturing tasks. These are hybrid algorithms that combine quantum calculations and conventional computing—in particular high-performance computing. As the first benefits of quantum technology are already being realized today, it’s worthwhile for companies to familiarize themselves with the technology now.

Where quantum algorithms fit manufacturing

To find suitable use cases, it’s helpful to know about one of the most popular hybrid algorithms: the Quantum Approximate Optimization Algorithm (QAOA). QAOA is considered a variational algorithm and is used to find solutions to optimization problems. A Variational Quantum Algorithm (VQA) is an algorithm based on the variational method, in which a series of educated guesses performed by a quantum computer is refined by classical optimizers until an approximate solution is found. This iterative process combines classic computers with quantum computers, allowing companies to access the benefits of quantum computing more quickly, rather than waiting for technological breakthroughs that may not happen for several years.
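The shape of that iterative loop can be sketched without any quantum hardware at all. In the sketch below, the quantum step is replaced by a classical stand-in cost function (in a real QAOA run, a quantum processor would prepare a parameterized state and measure the cost Hamiltonian), while a simple gradient-free optimizer plays the classical refinement role. The cost function and parameter names are invented for illustration.

```python
import math
import random

def expectation(params):
    """Stand-in for the quantum step. In real QAOA, this value would come
    from repeatedly preparing and measuring a parameterized quantum state."""
    gamma, beta = params
    return math.cos(gamma - 1.2) ** 2 + (beta - 0.4) ** 2

def classical_optimizer(params, steps=2000, sigma=0.05):
    """Gradient-free random search: keep perturbing the current best guess
    and accept a trial only if it lowers the measured cost."""
    best, best_val = list(params), expectation(params)
    for _ in range(steps):
        trial = [p + random.gauss(0, sigma) for p in best]
        val = expectation(trial)
        if val < best_val:
            best, best_val = trial, val
    return best, best_val

random.seed(0)
params, value = classical_optimizer([0.0, 0.0])  # refined educated guesses
print(round(value, 3))
```

Each loop iteration mirrors one round trip of the hybrid scheme: the "quantum" evaluation proposes a cost, and the classical side decides how to update the parameters.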

Hybrid quantum algorithms open creative possibilities for challenges in manufacturing. For example, it will be possible to develop new, better materials by simulating the interaction of molecules more reliably and quickly. Classical computers already struggle to simulate simple molecules correctly. Since quantum computers can explore several possible paths simultaneously, they are better able to calculate complex interactions and dependencies. This reduces the cost and time required to research and produce innovative materials – which is particularly promising for the development of better batteries for electric cars.

Quantum calculations can also make a difference in logistics and inventory management, where the “traveling salesman problem” is a recurring challenge: what is the shortest route that visits a list of locations exactly once and then returns to the starting point? When solving this type of problem, quantum computers can be significantly faster than traditional systems. Even with just eight locations, a traditional computer needs more than 40,000 steps, whereas a quantum computer can solve the problem in about 200 steps. Those who firmly integrate such calculations into work processes will be able to save a lot of time and resources.
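The factorial blow-up behind that step count is easy to demonstrate classically. The brute-force sketch below enumerates tours for eight illustrative locations: 8! = 40,320 orderings in total, which matches the "more than 40,000 steps" figure (fixing the start point leaves 7! = 5,040 to check). The coordinates are made up for the example.

```python
from itertools import permutations
from math import dist, factorial

# Eight illustrative (x, y) locations; the tour starts and ends at the first.
points = [(0, 0), (2, 1), (5, 2), (6, 6), (3, 7), (1, 5), (4, 4), (7, 1)]

def route_length(order):
    """Total closed-tour distance for one visiting order of stops 1..7."""
    tour = [points[0]] + [points[i] for i in order] + [points[0]]
    return sum(dist(a, b) for a, b in zip(tour, tour[1:]))

# Exhaustive search: every ordering of the seven remaining stops.
orders = list(permutations(range(1, 8)))
best = min(orders, key=route_length)

print(len(orders))   # 5040 orderings with the start fixed
print(factorial(8))  # 40320 if the start point is also free to vary
```

Adding a ninth location multiplies the work by nine again, which is why exhaustive search stops being practical almost immediately.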

The situation is similar for supply chains. Maintaining one’s supply chain despite geopolitical upheavals is increasingly becoming a hurdle for the manufacturing industry. Remaining flexible is easier said than done, as changing suppliers can quickly lead to delays in the workflow. Although most manufacturers have contingency plans and replacement suppliers at the ready, the market is convoluted. Huge amounts of data must be considered to find the cost-optimal and efficient supply chain. Quantum algorithms can handle this and allow ad hoc queries of this kind, which is a decisive advantage in volatile situations.

Approaching the quantum advantage

Hybrid quantum algorithms can be used in a variety of ways. Volkswagen, for example, found a use case in the application of car paint and was able to optimize this process, reducing the amount of paint used while speeding up the application at the same time.

Some practices help manufacturers to enter quantum computing via hybrid quantum algorithms. Although the full quantum advantage will only unfold in the future, awareness of the technology’s potential is important today. Now is the best time to actively engage with quantum computing and identify industry-specific use cases. This makes it possible to estimate the complexity of the problems and the computing power required. This in turn makes it easier to estimate when the right hardware might be available.

Once suitable application scenarios have been found, there is no need to wait for the ideal quantum hardware. Instead, manufacturers should try their hand at a simplified program for a specific scenario and combine the latest quantum technology with conventional systems. At best, this hybrid approach can achieve a proof of concept and realize tangible improvements – Volkswagen is a good example of this.

It’s also important to note that it’s usually not necessary to learn programming for quantum computing at the machine language level. There are already higher-level programming languages that are less abstract and complex and therefore easier to learn. The market also has platforms that represent quantum-based applications via graphical user interfaces. These can help development teams show these applications to other departments and make them easier to understand. It’s advisable to focus on platforms that are cloud-based and agnostic in terms of hardware. It’s currently still unclear which hardware will prevail in quantum computing. Flexibility is therefore particularly valuable to minimize conversion costs, which can be incurred with on-premise installations.

A strategic investment

Even with the most innovative technologies, big changes don’t happen overnight. While we will see leaps toward the full quantum advantage, the technology will take time to become fully applicable. The bottom line is that those who have prepared themselves earlier will be able to utilize the quantum advantage sooner. The transition to quantum computing can be a challenge if not enough groundwork has been done. A smooth transition is possible if employees are trained in the use and maintenance of quantum systems.

The introduction of hybrid quantum algorithms is also strategically valuable due to potential patent applications. Only early discovery of industry-specific quantum applications allows manufacturers to quickly fill their portfolios and legally secure this intellectual property.

Erik Garcell is head of technical marketing at Classiq Technologies, a developer of quantum software. He has a doctorate in physics from the University of Rochester and a master’s in technical entrepreneurship and management from Rochester’s Simon School of Business.

The post How quantum computing is already changing manufacturing appeared first on Engineering.com.

What are the key aspects of IIoT? https://www.engineering.com/what-are-the-key-aspects-of-iiot/ Wed, 04 Sep 2024 18:14:25 +0000 https://www.engineering.com/?p=131490 The industrial Internet of Things is playing a pivotal role in shaping the future of manufacturing. Here we explore what it is and how it all started.

The post What are the key aspects of IIoT? appeared first on Engineering.com.


At its core, the industrial Internet of Things (IIoT) is about infusing traditional industrial environments with advanced digital technology. Sensors and smart devices are embedded in machinery to continuously collect data on everything from temperature to vibration. These sensors actively monitor and report on the performance and condition of equipment in real-time.

This data may seem like just a flood of numbers, but with the right mindset it’s a treasure trove of potentially actionable insights. In a well thought out implementation, advanced analytics and machine learning algorithms sift through this data, uncovering patterns and trends that were previously hidden. This means that rather than waiting for a machine to break down, manufacturers can now predict when a failure might occur and address it before any disruption. This proactive approach helps to reduce unexpected downtime and extend the lifespan of equipment.
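A minimal version of that pattern-spotting can be sketched in a few lines: flag any reading that sits far outside the statistics of its recent history. Real IIoT platforms use far richer models (vibration spectra, thermal imaging, learned baselines), so this threshold rule is only an illustrative stand-in, with made-up signal values.

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=5, k=3.0):
    """Flag readings more than k standard deviations from the mean of the
    previous `window` values -- a minimal stand-in for the statistical
    models real IIoT systems apply to vibration or temperature streams."""
    flagged = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(readings[i] - mu) > k * sigma:
            flagged.append(i)
    return flagged

# Steady vibration signal with one sudden spike at index 9.
signal = [0.50, 0.52, 0.49, 0.51, 0.50, 0.52, 0.51, 0.49, 0.50, 1.80]
print(flag_anomalies(signal))  # [9]
```

Catching the spike at index 9 before it recurs is exactly the kind of early-warning signal that lets maintenance be scheduled ahead of an actual failure.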

The connectivity that IIoT brings means managers can adjust processes on the fly, optimize resource use and even automate many aspects of production. This level of automation boosts efficiency and enhances productivity, allowing for more streamlined operations and higher output.

Cost savings are another significant benefit of IIoT. By minimizing unplanned maintenance and optimizing energy consumption, manufacturers can reduce their operational expenses. Predictive maintenance, for example, ensures that equipment is serviced only when needed, rather than on a fixed schedule or after a failure.

Moreover, IIoT introduces a new level of flexibility into manufacturing. Factories equipped with IIoT technology can adapt more easily to changes in design, swings in demand or shifts in production requirements. Ideally, manufacturers can quickly reconfigure their operations or scale them up or down based on real-time needs, making them more responsive to market fluctuations.

Safety and regulatory compliance are also enhanced through IIoT. The continuous monitoring of equipment helps identify potential hazards before they become serious issues, creating a safer working environment. Additionally, accurate data collection supports compliance reporting with safety standards and regulations.

Additionally, wider supply chains benefit from the integration of IIoT. With better tracking and management capabilities, manufacturers can improve logistics decisions and inventory management, ensuring that materials and products move seamlessly through the supply chain.

In essence, IIoT has the potential to transform traditional manufacturing into a dynamic, data-driven environment. It’s turning factories into smart, connected ecosystems where every machine and process is in constant communication, leading to smarter decisions, greater efficiency, and a more agile and responsive production environment.

Evolution of IIoT: PLCs set the stage

The roots of IIoT can be traced back to the mid-20th century when electronic controls and automation began to take shape. The introduction of programmable logic controllers (PLCs) in the 1960s marked a significant leap forward, allowing machines to be controlled with greater precision and flexibility.

The 1980s and 1990s saw the integration of computer technology into industrial environments. The advent of personal computers and advancements in software led to the development of more sophisticated control systems. Manufacturing Execution Systems (MES) emerged, providing real-time data on production processes and improving operational efficiency. However, these systems were often isolated and lacked the connectivity seen in modern systems.

Enter IoT

The concept of the Internet of Things (IoT) began to take shape in the early 2000s, thanks to the proliferation of internet connectivity and sensor technology. Kevin Ashton coined the term “Internet of Things” in 1999 while working at Procter & Gamble, envisioning a future where everyday objects could communicate over the internet. This concept initially focused on consumer applications but laid the groundwork for what would become IIoT.

The early 2010s marked the formal emergence of IIoT as a distinct concept. As broadband internet and wireless technologies matured, machine builders began to integrate internet connectivity into industrial machinery and processes. The introduction of smart sensors, which collected and transmitted data about various operational parameters, was a game-changer. These sensors, coupled with advances in cloud computing and big data analytics, enabled real-time monitoring and analysis of industrial processes on an unprecedented scale.

Advancements and adoption

By the mid-2010s, IIoT had gained substantial traction across various sectors. The integration of advanced analytics and machine learning allowed for deeper insights and predictive capabilities. Industries from manufacturing to energy and transportation embraced IIoT to enhance efficiency, reduce downtime and optimize operations. The development of edge computing, which processes data closer to the source rather than relying solely on centralized cloud servers, further accelerated IIoT adoption by reducing latency and improving responsiveness.

Current state and future developments

Today, IIoT is a cornerstone of Industry 4.0, the fourth industrial revolution characterized by digital transformation and smart technologies. Modern IIoT systems leverage a combination of sophisticated sensors, advanced analytics and interoperable platforms to create highly efficient and adaptive industrial environments. Innovations such as digital twins—virtual replicas of physical systems—allow for simulation and optimization of industrial processes in real-time.
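The digital-twin idea reduces to a simple loop: a virtual model advances in step with the physical asset, and any divergence between predicted and measured state is a signal worth investigating. The toy oven model below illustrates the pattern; the class, heating model and tolerance are invented for the example and do not represent any vendor's twin framework.

```python
class OvenTwin:
    """Minimal digital-twin sketch: a virtual oven model advances alongside
    the physical oven and flags divergence from the measured temperature."""

    def __init__(self, setpoint_c, heating_rate=2.0):
        self.setpoint = setpoint_c
        self.rate = heating_rate      # expected degrees C gained per tick
        self.predicted = 20.0         # assume ambient start

    def step(self, measured_c, tolerance=5.0):
        """Advance the model one tick, then compare with the real sensor.
        Returns True while physical and virtual state agree."""
        self.predicted = min(self.setpoint, self.predicted + self.rate)
        return abs(measured_c - self.predicted) <= tolerance

twin = OvenTwin(setpoint_c=200.0)
healthy = [twin.step(m) for m in (22.1, 24.0, 25.9)]  # tracks the model
fault = twin.step(60.0)  # sensor reading far from the model's prediction
print(healthy, fault)    # [True, True, True] False
```

In practice the virtual side also runs what-if simulations (faster ramps, different loads) that would be too costly or risky to try on the physical asset first.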

Looking ahead, the evolution of IIoT continues with advancements in AI and 5G connectivity, which promise even faster data transmission and more nuanced, automated analysis. As industries strive for greater automation, efficiency and sustainability, IIoT is expected to play an increasingly pivotal role in shaping the future of manufacturing and beyond.

The post What are the key aspects of IIoT? appeared first on Engineering.com.
