PLM/ERP - Engineering.com
https://www.engineering.com/category/technology/plm-erp/ (Fri, 07 Feb 2025)

Digital Transformation Strategies for Legacy PLM and Data Systems
https://www.engineering.com/digital-transformation-strategies-for-legacy-plm-and-data-systems/ (Fri, 31 Jan 2025)

Data is no longer merely a byproduct of processes; it’s the central asset that drives innovation, efficiency and competitive advantage.

The post Digital Transformation Strategies for Legacy PLM and Data Systems appeared first on Engineering.com.

In today’s world of connected manufacturing, supply chains and operations, digital transformation is an essential element of a successful manufacturing business. The goal is continuous product lifecycle and product data management, covering a digital thread from early design and engineering through supply chain management to maintenance.

The push for digital transformation in product lifecycle management (PLM) comes from our everyday experiences, where nearly every aspect of our personal and professional lives is mediated through digital tools.

For businesses, this shift toward digitalization raises a crucial question: How can all these tools, applications and processes coordinate seamlessly to deliver value and innovation? Just a few years ago this was limited to focusing on specific tasks or processes, but now the demand is to have up-to-date information on every tool or system you use.

We live in a hyper-connected era where systems must communicate and adapt in real time. Businesses face the challenge of managing increasingly complex ecosystems of applications, each with their own data and processes. This complexity demands a shift from traditional, siloed thinking to a more integrated, data-driven approach.

Navigating this transformation requires rethinking how data and applications interact, enabling businesses to achieve greater flexibility, efficiency and innovation.

Enterprise Applications as Process Enablers

Historically, business systems and enterprise applications were built to support specific functions and processes within a business. These tools were designed to operate within narrowly defined parameters, focusing on individual data sets, including:

  • CAD Applications focused on creating and managing design data, enabling engineers to develop and iterate on product designs.
  • MRP Systems (Material Requirements Planning) supported resource allocation and production planning to ensure efficient manufacturing processes.
  • CRM Tools (Customer Relationship Management) managed customer interactions, sales pipelines, and support workflows.

While effective for their intended purposes, these applications were largely self-contained, with minimal connectivity to other systems. Think of a PLM solution, supply chain, document management or production process. Although companies demanded interoperability, most integration projects were about how to “sync” data from design and engineering to material planning/ERP rather than about setting up a collaborative environment.

Data was treated as a byproduct of the process rather than a core asset. Over time, this siloed approach led to fragmented data landscapes, making it difficult to achieve the integration and real-time insights needed for modern business operations.

The result? Legacy systems often struggle to adapt to the demands of a digital-first world, where agility, collaboration and innovation are paramount. As businesses scale and evolve, the limitations of process-centric architectures become increasingly apparent.

The Shift to Data-First Thinking

Digital transformation fundamentally changes the relationship between processes and data. In the traditional model, processes dictated how data was structured, stored and accessed. Digital transformation flips this paradigm, prioritizing data as the foundation of modern business operations. Companies are shifting their attention to how to trust data, because data lives longer than applications and business tools. A design history spans years, but a company can switch CAD and PDM applications. Data records are more important, for several key reasons:

  • Real-Time Insights: Data provides the foundation for real-time analytics, enabling businesses to make informed decisions quickly and respond to changing conditions.
  • Flexibility and Adaptability: Processes are inherently static and limited to predefined scenarios, while data enables dynamic, context-aware responses.
  • Collaboration Across Ecosystems: By connecting data through a digital thread, businesses can ensure seamless collaboration across design, manufacturing, and operations.
  • Long-Term Value: Unlike processes, which can be reengineered relatively quickly, data accumulates value over time. A robust data foundation supports innovation, automation, and strategic decision-making.

In this new paradigm, data is not merely a byproduct of processes; it’s the central asset that drives innovation, efficiency, and competitive advantage. Businesses that prioritize data management are better equipped to navigate the complexities of digital transformation and achieve sustainable success.

Rethinking the Status Quo

For decades, enterprise applications like PLM (Product Lifecycle Management) systems have been designed to organize engineering processes within organizations. These systems emphasized creating a ‘single source of truth,’ focusing on managing product data records (e.g., design files, engineering documents) and enforcing processes such as data approval, versioning and updates.

PLM was an effective approach when the demand was to store data and gatekeep access. Today, the misalignment of the traditional PLM approach is becoming obvious. PLM tools should instead become a source of data shared with everyone.

The foundation of this switch is how PLM software adapts, becoming an “agent” that performs specific tasks on the data, such as approving an engineering change order. This shift from a process-centric to a data-centric approach requires rethinking foundational concepts such as the single source of truth and adopting new strategies that prioritize collaboration, flexibility and adaptability.

From a Single Source of Truth to a Single Source of Change

In the context of digital transformation, the traditional single source of truth is evolving into a single source of change. This new approach emphasizes:

  • Dynamic Data Organization: Data is no longer confined to rigid hierarchies or processes. Instead, it is modeled and organized to reflect real-world relationships and dependencies.
  • Collaborative Workspaces: Data becomes a shared resource that enables cross-functional collaboration, regardless of where it originates or is maintained.
  • Context-Aware Data Management: Systems must understand not only what the data is but also how it is connected, who can change it, and under what circumstances.

Instead of treating data as a secondary consideration, businesses must prioritize creating flexible, adaptable data models that reflect the complexity and interconnectivity of modern operations.

Practical Strategies for Disconnecting Data from Applications

Switching to a data-first approach can be challenging, especially for companies with legacy systems deeply tied to specific software. Some companies have data spread across a dozen ERP systems and several PLM applications, including legacy databases. What can those companies do? Here are a few practical approaches:

Rethink Data Models

Traditional data models are often built around specific applications, which makes it hard to use data across different systems. Most traditional data systems use relational SQL databases with inflexible schemas. To solve this, companies should use modern flexible data models that don’t rely on fixed structures and can change as needed.

Graph databases are a great option because they are good at handling complex connections between data. It’s also important to organize data in a way that lets systems understand its meaning and context, making it easier to work with.
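To make the relationship-centric idea concrete, here is a minimal sketch in plain Python of the kind of query a graph model makes easy; the part numbers and bill-of-materials links are invented for illustration:

```python
# Tiny where-used index: child part -> set of parent assemblies.
# Part numbers are invented for illustration.
uses = {
    "bike-100": ["frame-10", "wheel-20"],
    "wheel-20": ["hub-30", "spoke-31"],
}

where_used = {}
for parent, children in uses.items():
    for child in children:
        where_used.setdefault(child, set()).add(parent)

def impacted(part):
    """All assemblies affected, directly or indirectly, by a change to `part`."""
    result = set()
    stack = [part]
    while stack:
        for parent in where_used.get(stack.pop(), ()):
            if parent not in result:
                result.add(parent)
                stack.append(parent)
    return result

print(sorted(impacted("hub-30")))  # ['bike-100', 'wheel-20']
```

A graph database such as Neo4j would express the same where-used traversal as a declarative query over relationships; the point is that the connections, not a fixed schema, drive the answer.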

Decouple Data from Applications

Data shouldn’t depend on specific software—it should stand alone as a valuable resource. To achieve this, companies can combine data from various systems into one central place for easier access. Using tools like APIs and integration layers helps different applications share data seamlessly. Another option is data federation, which keeps data in its original systems but allows centralized access and visibility.
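As a rough sketch of the federation idea, assuming Python and purely invented system and field names, a thin facade can merge records that stay in their source systems:

```python
# Data-federation sketch: records stay in their source systems (plain
# dicts standing in for an ERP and a PLM API); one facade gives
# centralized read access. All names and fields are illustrative.
erp_items = {"P-100": {"cost": 12.5, "supplier": "Acme"}}
plm_items = {"P-100": {"revision": "C", "cad_file": "p100.step"}}

def get_part(part_id):
    """Federated view: merge attributes from each system that knows the part."""
    view = {"id": part_id}
    for source, store in (("erp", erp_items), ("plm", plm_items)):
        record = store.get(part_id)
        if record:
            view[source] = dict(record)  # copy; the sources stay authoritative
    return view

part = get_part("P-100")
print(part["plm"]["revision"])  # C
```

In a real deployment the dicts would be API clients, but the design choice is the same: the facade reads through to the systems of record instead of copying everything into one database.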

Knowledge Graph and AI models

Knowledge graphs are a modern, fast-growing data modeling approach. A product knowledge graph organizes all product-related data—design, engineering, and manufacturing details—in one connected system. This ensures data is consistent and accurate across all areas. It also supports better decision-making with advanced analysis tools and makes it easier for different teams to work together by sharing the same information.

With the huge spike in GPT and other LLM models, we can see how these models consume data in a more holistic way, disconnected from the applications where the data was originally created. They can also provide quick insights and help find the information you need from large data sets quickly and accurately, supporting chatbots as well as future analytics and AI agents.

Collaborative Data Management

In a data-first setup, teamwork is key. Companies should enable real-time data sharing so everyone has up-to-date information. Teams across different departments need tools to work together effectively, and clear processes for managing changes are crucial to ensure everyone stays on the same page and changes are tracked.
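One way to picture the "clear processes for managing changes" piece is an append-only change log. The sketch below, in Python with invented item and field names, records each edit so every team can see who changed what and when:

```python
import datetime

# Append-only change log: every edit to a shared record is tracked.
# Item IDs, fields and authors are invented for illustration.
change_log = []

def record_change(item_id, field, old, new, author):
    change_log.append({
        "item": item_id, "field": field, "old": old, "new": new,
        "author": author,
        "at": datetime.datetime.now(datetime.timezone.utc),
    })

record_change("P-100", "material", "steel", "aluminium", "maria")
record_change("P-100", "mass_kg", 1.8, 1.1, "maria")

# Full history for one item, oldest first:
history = [c for c in change_log if c["item"] == "P-100"]
print(len(history), history[-1]["field"])  # 2 mass_kg
```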

By following these strategies, businesses can move away from outdated, application-tied data systems and unlock new possibilities for efficiency, innovation, and growth.

Embracing the Future of Data-Driven Manufacturing

We are living through a period of profound transformation in manufacturing and PLM. The traditional model of application-defined data is giving way to a new paradigm of data-defined action. This shift represents a fundamental rethinking of how businesses approach data, applications, and processes.

By prioritizing data as the central asset of their operations, businesses can:

  • Achieve greater flexibility and adaptability.
  • Drive innovation and efficiency through real-time insights.
  • Build resilient, future-ready systems that support long-term success.

The transition to a data-first approach is not without its challenges, but the rewards are immense. By embracing modern data management strategies and technologies, businesses can position themselves at the forefront of digital transformation, unlocking new opportunities for growth, innovation, and competitive advantage.

RIP SaaS, long live AI-as-a-service
https://www.engineering.com/rip-saas-long-live-ai-as-a-service/ (Thu, 16 Jan 2025)

Microsoft CEO Satya Nadella recently predicted the end of the SaaS era as we know it, which could level the playing field for smaller manufacturers.

The post RIP SaaS, long live AI-as-a-service appeared first on Engineering.com.

Artificial Intelligence (AI) is no longer just a buzzword—it is a game-changer driving new insights, automation, and cross-functional integration. AI is transforming industries by powering digital transformation and business optimization, and a lot more innovation is expected. While some sectors are advanced in leveraging AI, others—particularly traditional manufacturing and legacy enterprise software providers—are scrambling to integrate AI into their traditional digital ecosystems.

Many executives foresee AI revolutionizing Software-as-a-Service (SaaS) by transitioning from static tools to dynamic, personalized, and intelligent capabilities. AI-as-a-Service (AIaaS) offers businesses unprecedented opportunities to innovate and scale. The promise is a future powered by AI agents and Copilot-like systems that streamline infrastructure, connect enterprise data, and reduce reliance on traditional configuration and system integration.

In a recent BG2 podcast, Satya Nadella shared his vision for AI’s role in reshaping technology and business. He stated, “The opportunities far outweigh the risks, but success requires deliberate action.” These opportunities extend beyond industry giants to startups and mid-sized enterprises, enabling them to adopt AI and leapfrog traditional barriers. Smaller enterprises, in particular, stand to gain by avoiding the pitfalls of complex digital transformations, taking advantage of AI to innovate faster and scale effectively.

Revolutionizing Experiences and Integration

AI is (or will be) fundamentally changing how users interact with SaaS platforms. Traditional SaaS tools are often said to be rigid, offering one-size-fits-all interfaces that require users to adapt. In contrast, AI brings opportunities to disrupt this model by analyzing user behavior in real-time to offer personalized workflows, predictive suggestions, and proactive solutions. Nadella emphasized this transformation, saying, “The next 10x function of ChatGPT is having persistent memory combined with the ability to take action on our behalf.”

This aligns with the emergence of Copilot systems, where AI acts as a collaborative partner rather than a mere self-contained tool. Imagine a SaaS platform that not only remembers user preferences but actively anticipates needs, offering intelligent guidance and dynamic adjustments to workflows. Such personalization fosters deeper engagement and loyalty while transforming the management of business rules and system infrastructure.

Empowering Smaller Enterprises

The promise of AI extends not only to large enterprises but also to smaller businesses, particularly those in manufacturing and traditionally underserved sectors. For example, a small manufacturer could adopt AI-driven tools to optimize supply chain management, automate repetitive tasks, and deliver personalized customer experiences—all without the complexity of traditional ERP systems.

To ensure successful adoption, businesses must:

  • Identify high-impact areas: Focus on processes that benefit most from automation and predictive analytics, such as customer service, supply chain management, or marketing optimization.
  • Leverage scalable solutions: Choose AI platforms that align with current needs but can scale as the business grows.
  • Build internal expertise: Invest in upskilling employees to work alongside AI tools, ensuring alignment between human and machine capabilities.
  • Partner strategically: Collaborate with AI vendors that prioritize interoperability and ethical standards to avoid vendor lock-in and compliance risks.

Redefining Value: Pricing Models and Proactive Solutions

AI is not only transforming technical capabilities but also redefining pricing models for SaaS platforms. Traditional subscription fees are being replaced by real-time, usage-based pricing, powered by AI algorithms that align revenue with the value delivered. Nadella warned, “Do not bet against scaling laws,” underscoring AI’s potential to adapt and optimize at scale. For instance, AI can analyze customer usage patterns to calculate fair, dynamic pricing, ensuring customers pay for the outcomes that matter most.
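A minimal sketch of how usage-based pricing might be computed, with invented tiers and rates (not any vendor's actual model):

```python
# Usage-based pricing sketch: a flat base fee plus tiered per-call
# rates, so the bill tracks the value delivered. Tiers are invented.
TIERS = [               # (calls up to this cap, price per call)
    (10_000, 0.010),
    (100_000, 0.005),
    (float("inf"), 0.002),
]
BASE_FEE = 50.0

def monthly_bill(calls):
    total, prev_cap = BASE_FEE, 0
    for cap, rate in TIERS:
        in_tier = max(0, min(calls, cap) - prev_cap)
        total += in_tier * rate
        prev_cap = cap
        if calls <= cap:
            break
    return round(total, 2)

print(monthly_bill(25_000))  # 50 + 10,000 * 0.01 + 15,000 * 0.005 = 225.0
```

An AI layer would set or adjust the tiers from observed usage patterns; the billing arithmetic underneath stays this simple.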

This shift to value-based pricing can help SaaS companies differentiate themselves in competitive markets, reinforcing their commitment to customer success. Additionally, as AI drives data integration, traditional software vendors (ERP, CRM, PLM, MES, etc.) will need to adapt their business models. With AI, vendor lock-in could become obsolete, or at least redefined, as businesses migrate data seamlessly across platforms, fueled by open standards and interconnected data assets.

Overcoming Adoption Challenges

While the promise of AIaaS is immense, transitioning from traditional SaaS is not without its hurdles. Businesses must address:

  • Cost barriers: AI solutions can require significant upfront investment, especially for smaller firms. Clear ROI metrics and phased implementation plans can mitigate this challenge.
  • Technical expertise gaps: The lack of in-house AI expertise can slow adoption. Partnering with AI-savvy consultants or platforms can bridge this gap.
  • Resistance to change: Shifting from static tools to dynamic AI-driven systems requires cultural change. Leadership must communicate the benefits clearly and provide training to ease transitions.

Responsible AI: Trust, Compliance, and the Road Ahead

The rise of AI-powered SaaS platforms presents both immense opportunity and significant responsibility. As these platforms analyze vast datasets, safeguarding user privacy and ensuring compliance with regulatory standards will be non-negotiable. Nadella’s remark that “Innovation must go hand in hand with ethical considerations” underscores the need to balance technological advancement with accountability.

To build trust and ensure accountability, businesses must prioritize:

  • Transparent data policies: Clearly communicate how user data is collected, stored, and used.
  • Robust security measures: Safeguards against data breaches are critical for maintaining trust.
  • User-centric governance: Empower users with control over their data while ensuring compliance with global regulations.

Final Thoughts…

Looking ahead, adaptive AI systems and large language models will continue to redefine how SaaS platforms deliver value, addressing evolving customer needs with precision and speed. Nadella’s vision for AIaaS is inspiring, but businesses must remain grounded. To lead in this new era, organizations must tackle critical questions:

  • How can they balance AI’s immense potential with the risks of misuse or ethical lapses?
  • What steps are necessary to ensure AI enhances—not replaces—human decision-making?
  • How can smaller enterprises leapfrog traditional barriers to scale with AI?
  • Can persistent memory systems foster meaningful personalization without sacrificing user trust?
  • What role will regulatory frameworks play in ensuring accountable innovation?

By addressing these questions and embracing the opportunities AI presents, SaaS providers can chart a path toward sustained success. The question is not whether AI will transform SaaS, but how organizations will adapt to lead in this new digital era.

Core transformation unlocked: digital opportunities for small and medium manufacturers
https://www.engineering.com/core-transformation-unlocked-digital-opportunities-for-small-and-medium-manufacturers/ (Thu, 09 Jan 2025)

Harnessing AI to redefine operational agility and drive growth could be a key differentiator in the near term.

The post Core transformation unlocked: digital opportunities for small and medium manufacturers appeared first on Engineering.com.

Technology is no longer optional—it is a fundamental driver of business success. This does not mean it always takes center stage, but without it, businesses risk falling behind. Small and medium manufacturers now have a unique opportunity to learn from the transformation journeys of larger enterprises—including considering alternate paths. By adapting digital strategies to their scale and needs, they can accelerate innovation, improve efficiency, and compete on a broader stage. The convergence of artificial intelligence (AI), cloud computing, IoT, and enterprise platforms provides a roadmap to rethink traditional operations while fostering resilience and agility.

Deloitte’s recent report, The Intelligent Core: AI Changes Everything for Core Modernization, highlights a critical shift in the role of core systems due to the rise of AI: “For years, core and enterprise resource planning systems have been the single source of truth for enterprises’ systems of records. AI is fundamentally challenging that model.” AI is moving core systems away from static, rigid structures, offering systems that are adaptive and predictive, transforming how businesses operate.

For smaller manufacturers, this shift underscores the importance of moving beyond static systems. By adopting modular, cloud-based ERP solutions, they can introduce intelligence incrementally without overhauling their entire infrastructure. Scalable platforms allow small and medium manufacturers to integrate AI gradually, starting with targeted applications like inventory management or demand forecasting.

Converging technologies for strategic growth

Deloitte emphasizes the convergence of AI with technologies like IoT and robotics as key drivers of transformation: “In an increasingly convergent world, enterprises would do well to explore intentional industry and technology intersections that propel innovation across boundaries.” While core technologies and enterprise systems may seem exclusive to large enterprises, smaller manufacturers can strategically adopt them to address their unique challenges.

Referring to “core transformation” implies more than digital transformation; AI is poised to disrupt what is, or should be, in the core because it enables new, accessible capabilities. This is arguably the beginning of a “data democratization” across functions, leveraging both structured and unstructured data sets. The digital core is more than merely a data repository or functional vault: it is about intellectual property and pan-enterprise dynamic insights, while maintaining appropriate levels of consistency, traceability, and security of the relevant data assets.

Collaborations with technology providers or local academic institutions can help small and medium manufacturers access cutting-edge solutions tailored to their needs without heavy upfront investments. Intentional adoption of converging technologies ensures immediate and sustained value. Among other things, AI can elevate IT from a support function to a strategic enabler, allowing smaller manufacturers to use AI selectively to drive measurable outcomes.

AI-powered tools, such as shop-floor predictive maintenance, can analyze machine data to predict failures, reducing downtime and costs. Similarly, AI-driven production scheduling can optimize workflows, helping manufacturers meet tight deadlines. These high-impact, low-barrier applications of AI can deliver substantial value for small and medium businesses.
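A deliberately simplified sketch of the predictive-maintenance idea, in Python with invented readings and thresholds; a production system would use trained models rather than a rolling-mean rule:

```python
from collections import deque

# Predictive-maintenance sketch: flag a machine when a sensor reading
# jumps well above its recent rolling average. Readings are invented.
def monitor(readings, window=5, factor=1.5):
    """Yield indices where a reading exceeds `factor` x the rolling mean."""
    recent = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(recent) == window and value > factor * (sum(recent) / window):
            yield i
        recent.append(value)

vibration = [1.0, 1.1, 0.9, 1.0, 1.0, 1.1, 2.4, 1.0]
print(list(monitor(vibration)))  # [6] -- the 2.4 spike
```

Even this crude rule shows the shape of the workflow: stream machine data in, compare it with recent history, and raise a maintenance flag before the failure rather than after it.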

Sustainability and scalability as core principles

Deloitte also highlights the importance of balancing sustainability with technological modernization: “The AI revolution will demand heavy energy and hardware resources—making enterprise infrastructure a strategic differentiator once again.” For smaller manufacturers, this presents an opportunity to make strategic decisions that combine scalability with environmental responsibility.

Furthermore, a cloud-first strategy can help small and medium manufacturers reduce costs while enhancing scalability. Cloud services allow businesses to pay for only what they use, easing the financial burden of infrastructure investment. By investing into energy-efficient hardware and renewable energy sources, businesses can align their modernization efforts with sustainability goals.

This intersection of scalability and sustainability also extends to supply chain practices. For instance, AI-powered just-in-time inventory management can help minimize waste and the environmental impact of overproduction. IoT-enabled sensors can track goods in real time, improving logistics efficiency and reducing emissions. These innovations provide operational savings and enhance a manufacturer’s environmental credentials, strengthening their position in the marketplace.
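The just-in-time inventory logic can be sketched with the classic reorder-point formula: order when stock falls to the expected demand over the supplier's lead time plus a safety buffer. The demand and lead-time figures below are invented for illustration:

```python
import math

# Reorder-point sketch for just-in-time inventory. An AI layer would
# forecast daily_demand; the trigger arithmetic stays this simple.
def reorder_point(daily_demand, lead_time_days, safety_stock):
    return math.ceil(daily_demand * lead_time_days + safety_stock)

def should_reorder(on_hand, daily_demand, lead_time_days, safety_stock=0):
    return on_hand <= reorder_point(daily_demand, lead_time_days, safety_stock)

print(reorder_point(12, 5, 10))       # 12/day * 5 days + 10 buffer = 70
print(should_reorder(65, 12, 5, 10))  # True -- 65 units is below the trigger
```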

AI-enabled ways of working

The convergence of AI and enterprise digital technologies offers smaller manufacturers the ability to rethink their entire system of operations. By adopting AI-enabled ways of working, businesses can unlock new levels of scalability and agility. AI maximizes resource utilization, reduces inefficiencies, and enables faster, more accurate decision-making. As such, AI-powered analytics uncover hidden patterns, driving innovation in product design and service delivery, which gives manufacturers a competitive edge.

AI also shifts operations from reactive to proactive. For example, integrating AI into CRM systems allows manufacturers to anticipate customer needs and adjust production schedules dynamically. AI-powered chatbots and virtual assistants enhance customer interactions, providing instant support and fostering stronger relationships. This can drive significant value to end-users, such as:

  • Improving knowledge management, and in turn, reducing errors and duplication.
  • Minimizing non-value-added activities, without complex data and digital transformation investment.
  • Learning from new insights (and enabling new technologies), embedding lessons into continuous improvement opportunities.
  • Driving continuous efficiencies and time-to-market optimization.

The vision described by Deloitte is about an AI-enabled core aligning with what the business is doing, rather than the reverse: “In the truly agentic future, we expect to see more of these kinds of bots that work autonomously and across various systems. Then, maintaining core systems becomes about overseeing a fleet of AI agents.” McKinsey reinforces this perspective in its latest quarterly insights publication, stating: “Companies are rethinking their digital strategies, moving away from massive transformations to more modular approaches that focus on areas of greatest impact.” This modularity ensures that smaller manufacturers can scale AI capabilities incrementally, avoid the risks of large-scale overhauls, and achieve meaningful progress.

Strategic growth through AI

Smaller manufacturers can achieve long-term scalability by focusing on creating ecosystems that support seamless data exchange and collaboration. AI-driven simulations, such as digital twins, can refine processes before implementation, reducing risks and maximizing efficiency. These ecosystems improve productivity while preparing businesses for future technological advancements. Starting with high-impact, low-barrier AI initiatives like predictive maintenance and optimized production scheduling allows manufacturers to achieve immediate benefits. These small-scale efforts can pave the way for broader digital transformation, leading to sustained growth.

As ERP systems and other core technologies transform into intelligent platforms, leveraging AI to provide dynamic, real-time insights instead of relying on static records, PDM and wider PLM systems are poised to embrace similar advancements. The adoption of AI-driven PLM systems is already underway in some forward-thinking organizations, and the wider industry is quickly following suit. While transitioning from legacy systems can be complex, the promise of intelligent, predictive PLM systems is worth the effort. As AI technology matures and platforms become increasingly interconnected, enterprise platforms will evolve into dynamic, proactive solutions that enable manufacturers to make smarter, data-driven decisions and unlock new opportunities for growth and innovation.

Digital transformation and AI certainly offer smaller manufacturers a clear path toward scalability and competitiveness—provided they are not afraid of experimenting. By strategically adopting converging technologies, prioritizing sustainability, and gradually integrating AI into operations, small and medium manufacturers can modernize their processes without overstretching resources. This incremental approach can foster resilience and agility, ensuring that businesses evolve alongside the technological advancements that will define the future of manufacturing.

Don Cooper Joins Aras as VP of Global Alliances
https://www.engineering.com/don-cooper-joins-aras-as-vice-president-of-global-alliances/ (Wed, 08 Jan 2025)

Strengthens strategic partnerships to drive growth and enhance the Aras ecosystem.

The post Don Cooper Joins Aras as VP of Global Alliances appeared first on Engineering.com.

ANDOVER, MA, Jan 8, 2025 – Aras has announced that Don Cooper has joined the company as vice president of global alliances. Don will play a pivotal role in driving the expansion of Aras’ partnerships and alliances.

Don Cooper (image from LinkedIn)

Don brings over 25 years of experience in the product development industry and PLM market, with expertise in navigating direct and indirect channels. His background includes building and nurturing enablement organizations, guiding enterprise software sales teams to deliver outstanding results, and implementing effective sales processes.

“I am extremely excited to be joining Aras to continue my journey helping customers and partners adopt and deploy PLM – and realize the value from a digital thread,” said Don Cooper. “Aras is uniquely positioned to deliver innovative solutions that drive long-term value, and I look forward to collaborating with the team to make a lasting impact for our customers.”

“Don’s ability to align strategic business initiatives with GTM execution has made him a trusted leader in the SaaS sales landscape,” added Roque Martin, CEO of Aras. “Don will amplify our Build with Aras initiative by activating our entire community to unlock greater innovation and expand the capabilities of Aras Innovator.”

For more information, visit aras.com.

When Your Business Truly ‘Gets’ Its Product Data
https://www.engineering.com/when-your-business-truly-gets-its-product-data/ (Tue, 24 Dec 2024)

Streamline your product lifecycle with integrated systems.

The post When Your Business Truly ‘Gets’ Its Product Data appeared first on Engineering.com.

SAP has sponsored this post.

In an era of rapid technological change, businesses are under increasing pressure to stay competitive. Managing complex product development processes while ensuring seamless communication and real-time data flow is one of the most significant challenges. Traditional, isolated systems are no longer sufficient. Instead, organizations need integrated solutions that connect every facet of their operations — from product development and procurement to sales and customer service — into one cohesive ecosystem.

Seamlessly integrating product lifecycle management (PLM) and enterprise resource planning (ERP) solutions delivers on the mission critical requirement to manage product data across the enterprise. By breaking down silos and enabling smooth data flow between systems, companies can simplify operations, promote innovation and foster better collaboration.

Let’s dive into why integrated PLM and ERP systems matter, and how they can drive your business forward.

Why Seamless Integrations Matter

You might be asking yourself: Why is integration such a critical factor for business success? The answer lies in the tangible benefits that connected systems bring to your product development and operations. Let’s break down those key advantages:

1. Data consistency across platforms

Inconsistent data is a costly problem. When information is scattered across different systems, errors and delays are almost inevitable. Companies need to ensure that product data is synchronized across all platforms — whether it’s design, procurement, manufacturing or sales. Bi-directional data exchanges mean that changes made in one system are automatically reflected in others, ensuring everyone, from engineers to suppliers, works with the same accurate, up-to-date information. Real-time data consistency helps minimize errors, accelerate decision-making and keep your processes running smoothly.
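To make the bi-directional idea concrete, here is a minimal sketch in Python of two systems exchanging change events tagged with revision numbers, so that updates made in either one propagate to the other without looping. All class and field names are hypothetical; real integrations go through vendor APIs and middleware.

```python
# Minimal sketch of bi-directional record synchronization between two
# systems (e.g., a PLM and an ERP). Names are hypothetical.

class System:
    def __init__(self, name):
        self.name = name
        self.records = {}   # part_number -> {"rev": int, "data": dict}
        self.peers = []     # systems that receive change events

    def connect(self, other):
        # Register each system as the other's peer (bi-directional link).
        self.peers.append(other)
        other.peers.append(self)

    def update(self, part, data, rev=None):
        current = self.records.get(part, {"rev": 0, "data": {}})
        rev = current["rev"] + 1 if rev is None else rev
        if rev <= current["rev"]:
            return  # stale event: a newer revision already landed here
        self.records[part] = {"rev": rev, "data": dict(data)}
        for peer in self.peers:
            peer.update(part, data, rev)  # propagate the same revision

plm = System("PLM")
erp = System("ERP")
plm.connect(erp)

plm.update("P-100", {"desc": "Bracket", "material": "AL6061"})
erp.update("P-100", {"desc": "Bracket", "material": "AL7075"})  # ERP-side change

# Both systems now hold the same latest revision.
assert plm.records["P-100"] == erp.records["P-100"]
```

The revision check is what prevents the two-way link from echoing the same change back and forth indefinitely.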

2. Streamlined collaboration

Effective collaboration can make or break your product development process. Internal and external teams need to be connected with seamless communication that provides access to real-time data. With bi-directional data exchanges, teams can confidently share information knowing it remains consistent across all platforms. No more long email chains, fragmented conversations, or delayed approvals. Teams can collaborate more efficiently, speeding up development cycles and reducing time-to-market. Integrated platforms also make it easier to work with suppliers, partners and customers — creating a more connected and nimble business ecosystem.

3. Enhanced product experience

Today’s customers expect more than just quality products; they demand engaging product experiences. Businesses want to leverage interactive 3D models, designs and detailed visualizations. Thanks to bi-directional data exchanges, the most current product data is always available, ensuring that designs and models reflect the latest specifications. This enhances customer understanding and engagement, and helps buyers make informed purchasing decisions. The result? Higher customer satisfaction, stronger loyalty and a competitive edge in the marketplace.

4. Automation and efficiency

Automation is a powerful tool for boosting productivity. Organizations require the automation of key processes such as approvals, order processing and workflow management. Bi-directional data flow between systems also means that these automated processes are always working with the latest information. With real-time visibility into tasks and operations, you can quickly identify bottlenecks and inefficiencies. This proactive approach helps streamline workflows, reduce manual effort and ensure that your teams stay focused on high-value activities.

Three high-value scenarios for effective PLM integrations

When PLM solutions integrate with other core business tools, companies can better manage product data, collaborate efficiently and make faster, more informed decisions. Here are three high-value examples where integration can drive significant business innovation:

1. Connect your data: integrating PLM with ERP systems

Visually understand everything about your product. (Image: SAP.)

When PLM systems integrate with ERP platforms, they enable real-time synchronization of product and business data. For example, manufacturing teams can access the most up-to-date bills of material (BOMs) directly from the PLM system, ensuring accurate production. Service teams can use design data, like 3D models, to troubleshoot and maintain products effectively. Bi-directional data ensures consistent and reliable information throughout the product lifecycle, resulting in products that meet quality standards from design to delivery.
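As an illustration of the BOM handoff, the sketch below flattens a hypothetical multi-level BOM, as a PLM system might hold it, into the flat part-and-quantity list a production process consumes. The structure and part names are invented for illustration.

```python
# Illustrative sketch: flattening a multi-level bill of materials (BOM)
# into the flat part/quantity totals manufacturing consumes.

from collections import defaultdict

bom = {
    "BIKE":  [("FRAME", 1), ("WHEEL", 2)],
    "WHEEL": [("RIM", 1), ("SPOKE", 32), ("TIRE", 1)],
    "FRAME": [("TUBE", 4)],
}

def flatten(part, qty=1, totals=None):
    """Recursively roll up leaf-part quantities for one unit of `part`."""
    totals = defaultdict(int) if totals is None else totals
    for child, n in bom.get(part, []):
        if child in bom:          # subassembly: recurse with scaled quantity
            flatten(child, qty * n, totals)
        else:                     # leaf part: accumulate
            totals[child] += qty * n
    return totals

totals = flatten("BIKE")
assert totals["SPOKE"] == 64   # 2 wheels x 32 spokes
assert totals["TUBE"] == 4
```

When the BOM lives in one synchronized place, this kind of roll-up always reflects the latest engineering change rather than a stale copy.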

2. Connect to platforms: integrating third-party PLM systems

Work with your preferred tool and connect it to your business. (Image: SAP.)

Businesses often rely on specialized third-party PLM systems, such as Siemens Teamcenter, Dassault Systèmes 3DEXPERIENCE or Autodesk Vault, to coordinate various aspects of product development. Integrating these tools with enterprise systems ensures that product data flows seamlessly, fostering collaboration and maintaining data integrity.

This integration also allows design teams to work in their preferred tools while ensuring the data is consistently available and accurate. For instance, product designs created in a third-party PLM system can be transferred into ERP for direct use in downstream supply chain processes, ensuring a smooth handoff of information and reducing the risk of miscommunication.
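At its core, such a handoff often reduces to mapping fields from the PLM item onto the record the ERP expects. The sketch below shows that mapping with hypothetical field names on both sides; no actual vendor schema is implied.

```python
# Sketch of a one-way handoff: a design record from a third-party PLM
# system mapped into an ERP material-master record. Field names on
# both sides are hypothetical.

plm_item = {
    "itemId": "HSG-001", "revision": "B",
    "title": "Motor housing", "mass_kg": 0.42,
}

# PLM field -> ERP field (an assumed mapping for illustration)
FIELD_MAP = {"itemId": "material_number", "revision": "rev",
             "title": "description", "mass_kg": "net_weight"}

erp_material = {erp_key: plm_item[plm_key]
                for plm_key, erp_key in FIELD_MAP.items()}

assert erp_material["material_number"] == "HSG-001"
assert erp_material["net_weight"] == 0.42
```

Making this mapping explicit, rather than re-keying data by hand, is what removes the risk of miscommunication at the handoff.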

3. Connect your ecosystem: integrating PLM with supplier networks

Create sourcing projects and collaborate with suppliers. (Image: SAP.)

Close collaboration with suppliers during sourcing and procurement can significantly increase business success. Having an integrated digital platform where the latest engineering, development and manufacturing information is securely shared with suppliers enables companies to eliminate the need for endless email exchanges, phone calls and manual updates.

For example, material specifications, quality standards and design changes can be exchanged with suppliers in real time, ensuring everyone is on the same page. Companies can onboard suppliers faster, improve compliance and speed up procurement processes. The outcome? Smoother supplier collaboration, faster time-to-market and improved product quality.

Embrace integration for operational excellence

Integration isn’t just a technical convenience — it’s a strategic necessity. Seamlessly connected systems empower businesses to streamline product lifecycle processes, improve collaboration and make data-driven decisions with confidence. In a world where complexity is the norm, integration provides the clarity and flexibility needed to stay ahead of the curve.

When product data flows across design, engineering, procurement and production, the entire organization benefits. Teams can collaborate in real time, minimize errors and manual tasks, and respond faster to market demands. Suppliers and partners stay aligned, ensuring smooth handoffs and consistent communication. And with automated workflows, bottlenecks and inefficiencies can be identified and resolved before they impact productivity. Integrated systems also allow you to deliver quality products and exceptional experiences that build loyalty.

Forward-thinking businesses recognize that tight integration is the key to unlocking future growth and resilience. To learn more about how SAP can support your product development processes, we invite you to explore our other articles in this series, visit our website or check out our newest website for PLM Systems Integration.

Why data-driven project execution in PLM is critical for new product development https://www.engineering.com/why-data-driven-project-execution-in-plm-is-critical-for-new-product-development/ Mon, 23 Dec 2024 21:39:16 +0000 https://www.engineering.com/?p=135189 Companies must launch more products to be competitive, but how can they track it all?

The post Why data-driven project execution in PLM is critical for new product development appeared first on Engineering.com.

Dassault Systèmes has sponsored this post. Written by Ajay Prasad, Worldwide ENOVIA Industry Process Expert, Dassault Systèmes.

Adopting a data-driven and deliverables-based project management approach establishes a digital thread between product and project data. This enables real-time project execution tracking and analytics, leading to enhanced efficiency, accuracy and execution.

(Image: Dassault Systèmes.)

Across industries, companies looking to launch products are executing several new product development (NPD) projects at any given time. With ever-increasing customer expectations, customized product experiences and complex business models run by global teams — including suppliers — executing NPD projects is mission-critical to reducing costs and increasing a company’s bottom line.

With several competing products available in the market, the speed with which a company can launch a product is a huge factor in determining the product’s success. Companies need to launch more products by executing more NPD projects, simultaneously and with the same number of resources. Historically, this has been a complex challenge. However, it is now possible through evolutions in project execution, performance monitoring and product lifecycle management (PLM).

Facets of new product development projects

In discussions of project management, the schedule is often taken to represent the entirety of a project. However, the reality is that the schedule alone does not encapsulate all aspects of a project, especially after launch. So, let us consider the different facets of an NPD project:

  • Requirements (the Why): the motives and goals of the project.
  • Product data (the What): the artifacts, deliverables and output of the project.
  • Schedule (the When): the timeline by which the project tasks and milestones are due.
  • People (the Who): the project teams who are assigned to complete each task.
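These four facets can be pictured as linked objects rather than entries in four separate systems. The toy Python model below sketches that linkage; every class and field name is illustrative, not any vendor’s schema.

```python
# Toy data model sketching the "digital thread" idea: requirements,
# product data, schedule and people held as linked objects instead of
# living in separate silos. All names are illustrative.

from dataclasses import dataclass, field

@dataclass
class Requirement:      # the Why
    text: str

@dataclass
class Deliverable:      # the What
    name: str
    satisfies: Requirement

@dataclass
class Task:             # the When + the Who
    name: str
    due: str
    owner: str
    deliverables: list = field(default_factory=list)

req = Requirement("Housing must survive a 1 m drop")
cad = Deliverable("housing_v2.step", satisfies=req)
task = Task("Drop-test analysis", due="2025-03-01", owner="A. Engineer",
            deliverables=[cad])

# Because the objects are linked, questions like "which requirement does
# this task serve?" are answered by traversal, not manual reconciliation.
assert task.deliverables[0].satisfies is req
```

The point of the sketch is simply that once the links exist, project status becomes a query over data instead of a reconciliation exercise.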

The challenge of executing new product development projects

These facets of an NPD project are typically managed in different enterprise systems, each in its own silo. This results in many non-value-added activities for the entire project team, as the information across these enterprise systems must be manually reconciled to understand the project’s status. This process is inherently error-prone, leading to rework, idle time and overloaded resources. All of these challenges reduce the speed to market.

Approximately 30% of a project team’s time is spent on non-value-added activities. If this time were significantly reduced or even eliminated, NPD projects could be executed faster. Companies could launch more products faster with the same resources, making them more competitive.

How to solve the challenges of executing new product development

At a high level, these challenges can be solved by bringing all facets of an NPD project — requirements, product data, schedule and resources — together into one PLM system. With a platform-based approach, organizations work off a single database and connect their business processes. This allows the facets of an NPD project to be digitized, directly linked to one another, viewed and updated in the context of the digital (or virtual) twin. This connects the dots throughout the NPD process with real-time, data-driven project status updates.

Simply put, managing project information on a PLM system composed of loosely integrated, disparate tools doesn’t solve the root of the problem, because data still needs to be reconciled. Reconciling data programmatically does provide incremental benefits over manual reconciliation.

However, in this scenario data is still replicated across systems and product development teams are still working in silos, which does not foster the organizational alignment necessary for project success. There is a heavy dependency on IT integrations, and the front-end glue that sticks everything together merely hides the underlying complexity. A robust PLM system with a platform-based approach and a single database, by contrast, effectively addresses these challenges, providing a clear path to project success.

Trends in new product development projects

Historically, the primary project management methodologies used were “waterfall” and “agile” approaches. A waterfall approach is more suited for companies launching products in a phase-gate manner with predefined milestones and tasks. Typically, companies with a wealth of project execution experience use this approach. Examples include automotive, industrial equipment, energy, marine, aerospace and defense companies.

The agile approach is suited for companies that follow iterative development, where the product requirements keep changing. This mandates a high degree of flexibility. Companies that do not follow a phase-gate approach to product development instead implement an iterative process that starts with a minimum viable product (MVP) definition. Further development is done to bring the MVP towards a final product. Examples of companies that use this approach tend to specialize in software development and hi-tech devices.

Over the last decade, the amount of software going into products has increased exponentially. Software is no longer limited to high-end luxury EVs. Even internal combustion engine vehicles such as the Ford F-150 have over 150 million lines of code. Due to this evolution in products, companies that typically followed a waterfall approach are now looking to implement a “hybrid project management system,” where the waterfall approach is still used to define the steps and milestones. Meanwhile, an agile approach manages the software development that goes into the product. These two methodologies need to work with one another throughout the NPD process.
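A hybrid setup can be pictured as a waterfall gate whose readiness is computed from the agile sprints linked to it. The sketch below is illustrative only; the structure and names are not any vendor’s data model.

```python
# Sketch of the hybrid idea: a waterfall phase gate that is satisfied
# only when the agile sprints linked to it are all done. Names and
# structure are illustrative.

milestones = {
    "Design Freeze": {"sprints": ["FW-Sprint-1", "FW-Sprint-2"]},
}
sprint_done = {"FW-Sprint-1": True, "FW-Sprint-2": False}

def gate_open(milestone):
    """A gate opens only when every linked agile sprint is complete."""
    return all(sprint_done[s] for s in milestones[milestone]["sprints"])

assert gate_open("Design Freeze") is False   # firmware sprint still open

sprint_done["FW-Sprint-2"] = True            # sprint closes...
assert gate_open("Design Freeze") is True    # ...and the gate can pass
```

The link between the two methodologies is the key: the phase-gate plan never needs to be manually told that the software work behind it is done.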

Characteristics of a PLM system that can handle this hybrid approach are:

Platform based

  • Avoid replicating data across disparate systems by managing all facets of NPD projects in one system, working off a single database connected with a business process.
  • Foster adoption of the software as a service (SaaS) cloud for project management anytime and anywhere.
  • Focus on organizational alignment toward project success versus micro team processes.
  • Permeate through all applications connected to the platform – designers can add project deliverables and update status in the CAD tool.

Data-driven and deliverables based

  • Eliminate the inefficiencies of working with files and manual processes.
  • Extract the information from files and digitize data with objects, relationships and attributes.
  • Link project deliverables, documents, data and more to project tasks, allowing for viewing of the project in the context of a virtual twin.
  • Enable invisible governance, where the status of project deliverables is automatically determined, allowing for fact-based versus perception-based reporting.
  • Provide real-time data-driven project dashboards to the entire project team so that informed decisions can be taken quickly to steer the project toward success.
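The “invisible governance” point can be sketched in a few lines: a task’s status is computed from the states of its linked deliverables instead of being reported by hand. The state names and roll-up rule below are assumptions for illustration.

```python
# Sketch of "invisible governance": task status derived from the states
# of linked deliverables rather than reported manually. State names and
# the roll-up rule are assumptions for illustration.

def task_status(deliverable_states):
    """Roll deliverable states up into a fact-based task status."""
    if not deliverable_states:
        return "Not Started"
    if all(s == "Released" for s in deliverable_states):
        return "Complete"
    if any(s in ("In Work", "Released") for s in deliverable_states):
        return "In Progress"
    return "Not Started"

assert task_status([]) == "Not Started"
assert task_status(["In Work", "Draft"]) == "In Progress"
assert task_status(["Released", "Released"]) == "Complete"
```

Because the status is computed, a dashboard built on it reports facts about the deliverables, not a team member’s perception of progress.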

Flexible and open

  • Allow executing NPD projects with waterfall, agile and team-based planning methodologies all in one solution.
  • Allow projects executed as waterfall, agile and team-based planning to be linked to one another.
  • Connect to planning solutions that are still used extensively to define an initial project schedule before the project goes live.
  • Allow the exporting of project schedules and status to facilitate business needs.

(Image: Dassault Systèmes.)

Key features to look for in a PLM-based project management solution

While traditional project management software is focused on building a great project schedule, it does not do much to track the execution of the schedule once the project goes live. These solutions are not designed to manage product requirements, project deliverables, risks, issues and other elements.

Project management on a platform-based PLM system, such as ENOVIA on the 3DEXPERIENCE platform, is focused on fine-grained, deliverables-based tracking of project execution. With a platform-based approach, all project facets are linked and can be visualized and understood in the context of the virtual twin. This, combined with the flexibility of executing different project management methodologies (such as waterfall, agile and team-based planning) all in one solution on a SaaS cloud, provides exceptional value to the entire project team throughout the NPD process.

Other key features of ENOVIA on the 3DEXPERIENCE platform include:

Centralized digital live schedule

  • With a centralized live schedule, the project leader can make updates to the schedule that are seen instantly by the entire project team, instead of emailing and working with several versions of the file.

Real-time automated task optimization

  • This takes away much of the overhead for project leaders to continuously optimize the project schedule as tasks are delayed, added and more.
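The kind of automated optimization described here can be sketched as a forward pass over task dependencies: when one task’s duration slips, successor dates are recomputed rather than edited by hand. The durations and task names below are invented for illustration.

```python
# Sketch of automated schedule optimization: successor start dates are
# recomputed via a forward pass over the dependency graph whenever a
# duration changes. Task names and durations are illustrative.

tasks = {
    "design":  {"duration": 10, "after": []},
    "tooling": {"duration": 15, "after": ["design"]},
    "build":   {"duration": 5,  "after": ["design", "tooling"]},
}

def schedule(tasks):
    """Return {task: (start_day, end_day)} from durations + precedences."""
    out = {}
    def finish(name):
        if name not in out:
            start = max((finish(p) for p in tasks[name]["after"]), default=0)
            out[name] = (start, start + tasks[name]["duration"])
        return out[name][1]
    for name in tasks:
        finish(name)
    return out

assert schedule(tasks)["build"] == (25, 30)

tasks["design"]["duration"] = 14                # design slips by 4 days...
assert schedule(tasks)["build"] == (29, 34)     # ...successors move automatically
```

In a live schedule this recomputation runs continuously, so the project leader never has to re-chain dates after a delay.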

Product data linked directly to task deliverables

  • This allows the entire project team to view the status of tasks and related deliverables all in 3D and understand everything in the context of the digital/virtual twin of the product.

Online task execution and updates

  • The entire project team should be able to update tasks and add deliverables online anytime, anywhere, with instant visibility to the rest of the project team.
  • Updating project tasks should feel easy and natural for users.

Online risk and issue management

  • Project teams need to be able to define risks in the context of project tasks and milestones.
  • Online risk mitigation plans orchestrate risk mitigation activities.
  • Project leaders can see the impact of the risk and issues and make necessary changes to project timelines.

Real-time data-driven project dashboards

  • Eliminate the non-value-added work of compiling reports on project status with real-time data-driven project dashboards that provide a 360-degree view of the project across all facets.

To learn more, visit ENOVIA at Dassault Systèmes.


About the author

Ajay Prasad, Worldwide ENOVIA Industry Process Expert, Dassault Systèmes.

Ajay Prasad is a product lifecycle management (PLM) professional based in the metro Detroit area. He has over 21 years of experience and expertise in the design, architecture, development, implementation and technical sales of ENOVIA PLM systems across various industries. Ajay holds a Bachelor of Engineering in Industrial Engineering and Management and a Master of Science in Computer Science. After graduating, he started working as a mainframe professional and later moved into PLM, implementing and customizing ENOVIA PLM solutions across industries such as automotive, healthcare and industrial equipment, working with customers in India and the U.S. He later joined Dassault Systèmes in India as a Technical Sales Consultant for ENOVIA PLM software, then returned to the U.S. and joined the ENOVIA Worldwide Center of Excellence team at Dassault Systèmes. Currently, he works as an ENOVIA Worldwide Industry Process Expert; his responsibilities include technical sales of ENOVIA PLM, technical sales teams’ enablement and serving as a liaison between field teams, customers and product R&D in pre- and post-sales situations. He fosters customer success and adoption of ENOVIA PLM software and is the team’s focal point for the ENOVIA Project and Workforce Management suite of applications.

Lessons from 2024 and what innovators and engineers can expect from 2025 https://www.engineering.com/lessons-from-2024-and-what-innovators-and-engineers-can-expect-from-2025/ Fri, 20 Dec 2024 13:15:00 +0000 https://www.engineering.com/?p=135037 The task ahead is clear: cross-functional collaboration, systems interoperability, and business-digital strategy alignment.

The post Lessons from 2024 and what innovators and engineers can expect from 2025 appeared first on Engineering.com.

The manufacturing industry in 2024 was a proving ground for transformative technologies. From AI-driven efficiency to the adoption of edge computing, product developers and manufacturing engineers learned critical lessons about what works—and what does not—in the relentless pursuit of speed and innovation.

Looking ahead to 2025, the focus will shift from technology adoption to mastery. This is about more than tools—it is about driving measurable outcomes through integrated and strategic approaches. Let’s break down the key takeaways from 2024 and explore the priorities for 2025.

2024: A year of ambitious gains and sobering realities

1. AI for Automation and Predictive Analytics

AI proved itself as a game-changer in 2024, enabling automation, predictive maintenance, and workflow optimization. However, achieving measurable ROI remained elusive for many. Successful organizations treated AI as a strategic investment aligned with business goals, not just a shiny tool.

Key lessons learned include:

  • Companies that excelled used AI for defect detection, energy optimization, and production adjustments.
  • Filling gaps in data quality and platform integration was a critical step toward unlocking AI’s value.
  • Manufacturers need implementation roadmaps with measurable KPIs to ensure AI adoption delivers real results.

2. Robust Digital Thread Foundations

Seamless integration across PLM, ERP, and MES platforms emerged as a significant competitive advantage in 2024—but not without challenges. Establishing digital threads required technical rigor, governance, cross-functional process and data alignment.

Key lessons learned include:

  • End-to-end integration enabled faster design cycles, fewer production errors, and smoother operations.
  • Engineers played a pivotal role as data stewards, ensuring accuracy and driving adoption of best practices.
  • Software providers must prioritize not just APIs but also functional use cases that deliver measurable outcomes.

3. Decentralized Processing and Edge Computing

Edge computing revolutionized data processing by bringing it closer to the production floor. This enabled real-time analytics, adaptive robotics, and dynamic production controls—allowing machines to make decisions autonomously.

Key lessons learned include:

  • Decentralized systems introduced complexities, requiring a balance of performance, security, and infrastructure resilience.
  • Engineers faced challenges ensuring secure, uninterrupted data flows across networks.
  • Scaling edge computing depends on effectively managing infrastructure robustness and agility.
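A toy example of the edge pattern: readings are evaluated locally so the machine can react immediately, while only a summary travels upstream. The threshold, signal and action names below are illustrative.

```python
# Sketch of an edge-side decision loop: sensor values are checked
# locally (no cloud round-trip) and only a batched summary is sent
# upstream. Threshold and names are illustrative.

readings = [71.2, 72.0, 74.9, 78.3, 81.5]   # e.g., spindle temperature, deg C
LIMIT = 80.0

actions, upstream = [], []
for value in readings:
    if value > LIMIT:
        actions.append(("slow_feed", value))  # act immediately at the edge
    upstream.append(value)

summary = {"max": max(upstream), "n": len(upstream)}  # batched for the cloud
assert actions == [("slow_feed", 81.5)]
assert summary == {"max": 81.5, "n": 5}
```

The design choice is the split itself: latency-critical decisions stay on the floor, while aggregated data feeds the central analytics the rest of the enterprise relies on.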

4. Security and Network Infrastructure

As connected devices proliferated, manufacturers became prime targets for sophisticated cyberattacks. In response, zero-trust frameworks and AI-powered threat detection became essential.

Key lessons learned include:

  • Security must be embedded from the design phase, not added as an afterthought.
  • Operational technology (OT) and IT systems convergence created new vulnerabilities requiring proactive management.
  • Cybersecurity resilience is no longer optional—it is fundamental to digital transformation.

2025: from adoption to mastery

In 2025, manufacturers will move beyond technology adoption and focus on integrating tools to drive strategic business outcomes. Success will depend on scalability, governance, and workforce enablement as organizations shift from experimentation to enterprise-wide transformation.

Here is what lies ahead and how manufacturing engineers can lead the charge:

1. AI as the Orchestrator

AI will evolve from incremental improvements to a central orchestrator of production processes, supply chains, and energy efficiency. To enable this transformation, engineers must focus on building robust, scalable infrastructures.

Key priorities include:

  • Data governance: Is the data fueling AI clean, accurate, and accessible?
  • Interoperability: Does AI seamlessly integrate with PLM, MES, ERP, and other systems?
  • Scalability: Can AI solutions adapt to growing production complexities?
  • Workforce enablement: Are AI insights designed to empower human decision-making?

Mastering AI in 2025 means ensuring it drives autonomous, intelligent, and value-driven operations.

2. Scalable, On-Demand Solutions

Cloud-based solutions and Everything-as-a-Service (XaaS) will dominate manufacturing in 2025, offering flexibility, reduced costs, and faster time-to-value. Manufacturing engineers must bridge legacy systems with cloud platforms to enable this transition.

Key priorities include:

  • Integration roadmaps: How will legacy systems connect to cloud-native platforms?
  • Cost-value balance: What are the trade-offs of XaaS versus on-premises systems?
  • Infrastructure optimization: Are cloud resources configured for performance and minimal latency?
  • Complexity management: How will hybrid environments with legacy and cloud systems be managed?

Success will depend on striking the right balance between scalability and operational stability.

3. Non-Negotiable cybersecurity foundation

With AI, IIoT, and edge computing driving increased connectivity, cybersecurity must be embedded into every phase of the manufacturing lifecycle. Engineers will play a leading role in ensuring systems are resilient and secure.

Key priorities include:

  • Security-first design: Are systems designed with embedded security protocols from the outset?
  • OT-IT convergence: How are production systems safeguarded while integrating with IT systems?
  • Real-time risk monitoring: Are AI-driven tools used to proactively detect and mitigate threats?
  • Team education: How can engineers enable teams to recognize and respond to cyber risks?

In 2025, neglecting cybersecurity will carry far greater risks than the cost of proactive investments.

4. IIoT: The Standard for Operations

In 2025, IIoT will transition from an innovation to a baseline standard for all manufacturers. The real challenge will be mastering IIoT to drive clear, actionable results.

Key priorities include:

  • Actionable insights: How will raw sensor data be translated into strategic actions?
  • Device interoperability: Are diverse IIoT platforms seamlessly connected?
  • Scalability: Can IIoT systems grow alongside production demands?
  • Security: How are connected devices protected from vulnerabilities?

Manufacturing engineers must focus on turning IIoT data into decisions that enhance efficiency, reduce costs, and drive competitive advantage.
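Turning raw sensor data into an action can be as simple as a rolling statistic with a rule attached. The sketch below raises a maintenance work order when a rolling vibration average passes a limit; the numbers, asset name and threshold are illustrative.

```python
# Sketch of IIoT data-to-decision: a rolling vibration average that
# triggers a maintenance work order past a limit. All values and names
# are illustrative.

from collections import deque

window = deque(maxlen=3)   # last three readings
work_orders = []

def ingest(reading, asset="pump-7", limit=4.0):
    """Add a reading; raise a work order if the rolling average exceeds limit."""
    window.append(reading)
    avg = sum(window) / len(window)
    if avg > limit:
        work_orders.append(f"Inspect {asset}: avg vibration {avg:.1f} mm/s")
    return avg

for r in [2.1, 3.0, 4.5, 5.5, 6.0]:
    ingest(r)

assert len(work_orders) == 2   # fired once the rolling average passed 4.0
```

The averaging window is the interesting part: it converts a noisy raw stream into a trend, which is what makes the resulting action strategic rather than reactive.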

Manufacturing in 2025: the road ahead

The digital transformation of manufacturing cannot succeed in isolation. It requires alignment with enterprise-wide strategies, seamless platform interoperability, and clear business outcomes. For manufacturing engineers, the task is clear:

  • Champion cross-functional collaboration.
  • Ensure systems interoperability.
  • Drive strategic, value-driven adoption of transformative technologies.

The lessons of 2024 are unmistakable: transformation requires more than tools; it requires deliberate integration, governance, and execution. By mastering AI, edge computing, IIoT, and cybersecurity, engineers will enable smarter factories, faster production, and resilient operations.

For those prepared to lead, 2025 holds immense potential. Are you ready to navigate the future of manufacturing?

The impact of digital transformation and how to manage disruptions https://www.engineering.com/the-impact-of-digital-transformation-and-how-to-manage-disruptions/ Thu, 12 Dec 2024 18:52:10 +0000 https://www.engineering.com/?p=134836 A report on PLM Road Map & PDT Europe 2024 event in Gothenburg

The post The impact of digital transformation and how to manage disruptions appeared first on Engineering.com.

Every well-run technical conference advances in content over its predecessors, as demonstrated by the CIMdata PLM Road Map / Eurostep presentations recently given in Gothenburg, Sweden. Fading away were prior years’ discussions of the ins and outs of product lifecycle management (PLM) and solution providers.

Instead, speakers focused on how digital transformations impact their organizations and how the disruptions are managed. At prior conferences, these approaches usually got little attention.

Taken together, the presentations addressed the learning curves of digital transformations in power generation, pharmaceuticals, nuclear reactors, and wholesale grocery distribution. And they showed the breadth and depth of digital transformation’s penetration into the global economy.

The Gothenburg conference was titled “Value Drivers for Digitalization of the Product Lifecycle: Insights for the PLM Professional.” As an introduction, a high-level overview of the conference showed that digitalization boosts revenues and growth and extends manufacturing capabilities while tightening control over information for use and re-use over decades; two AI consortiums were also described.

One presentation that resonated with many attendees came from Mr. Peter Vind, enterprise architect at Siemens Energy, who focused on Business Capability Models (BCMs) as a way to prepare for changes (“both big and small, good or bad”) and to enhance the Siemens Power business model, Power as a Service (PaaS). Mr. Vind explained that BCMs define what the company needs to do to achieve its business strategy goals; a BCM serves as a bridge between business needs and IT, structures capabilities into logical clusters, and lays the groundwork for enterprise transformation.

Built on elements of PLM’s digital threads and aspects of digital models, Siemens Energy’s BCMs manage the lifecycle from inception through engineering, design, manufacture, service, and disposal, Mr. Vind said. This forms an information backbone that provides consistent, accurate, and up-to-date information and manages design and engineering changes.

Mr. Anders Romare, chief digital and information officer at Denmark-based Novo Nordisk, focused on how this pharmaceutical giant uses digital transformation to shorten the drug discovery process by several years. He stressed that Novo Nordisk is focused on reaching more patients, manufacturing capacity, and engines of sustainable growth.

Elements include:

  • LabDroid robotics, which uses model-driven advanced analytics and connected technology to make research projects smarter and faster.
  • DataCore, an “ecosystem of source systems,” a new shared-data foundation that is scalable, secure and supported by data governance.
  • NovoScribe, which automatically generates structured clinical-development documents and is already 70% faster than previous methods.
  • Quantum computing, the focus of a US$200 million Novo Nordisk Foundation investment, to leverage Gefion, Denmark’s first AI supercomputer, with a community of users.

The opportunities and challenges of AI and automation—the Digital Core—were spelled out by Mr. Gary Langridge, engineering manager / digital thread in the Ocado Technology unit of the UK-based Ocado Group, an online retail and wholesale grocer and technology provider.

Ocado is shaping the future of e-commerce and logistics, but Mr. Langridge said there are many challenges to ensuring that customers get safe, fresh foods delivered on time. Ocado sees this as a great value compared with conventional grocery operations.

He said over 10,000 “fulfillment” robots are used in Ocado’s Hives (distribution centers), each of which can pick a 50-item order in under five minutes. He added that Ocado Hives can handle hundreds of orders simultaneously. Ocado’s robots can handle a diversity of packages, including fragile items, a variety of weights, shapes, and sizes, and they pack densely and quickly, he noted.

The system currently handles 50% of Ocado’s range of goods (out of about 100,000 SKUs) and will reach 70% by 2026, Mr. Langridge added. An AI-based “air traffic control” system with proprietary wireless technology orchestrates the Hives; Ocado’s robots are not autonomous.

Ocado’s Digital Core deployed PLM tightly integrated with organizational change management, system governance, and foundational processes. “The digital thread handles the horizontal integration of data,” Mr. Langridge said, “but we’re still working on vertical integration” and “finding the right partners for cultural fits.”

CIMdata’s Aerospace & Defense PLM Action Group (AD PAG) shared its research-based insights on Model-Based Systems Engineering (MBSE) and its reality and promise. Presenters were Mr. James Roche, AD PAG practice director of the CIMdata-administered PLM advocacy group, and Mr. Sandeepak Natu, co-director of CIMdata’s simulation-driven systems development (SDSD) practice.

They noted that MBSE is gaining traction in A&D due to increasing product complexity, the growing role of software in products, and the U.S. Department of Defense’s digital engineering strategy; hence, investment in MBSE is expected to continue growing.

The presentation pointed out that MBSE is found mainly in conceptual design and development, such as requirements definition and allocation, system architecture definition, and design verification and validation. It is expected to expand into production, utilization, and support with the growth of technologies like IoT/IIoT and digital twins. MBSE is also expected to move into software design, product line engineering, and design for safety and security.

In A&D, MBSE is still in the early adoption phase, and maturity levels vary significantly. Successful MBSE adoption requires a well-defined vision and strategy, strong executive commitment, appropriate methodologies, the development of MBSE expertise, education and training initiatives, middle management support, and robust tool integration and standardization. The biggest challenges of MBSE implementation are its organizational impact and cultural resistance.

The AD PAG survey found a crucial need for enhanced interoperability between tools and platforms, open application programming interfaces (APIs), and adherence to standards. Respondents also said they wanted enhanced user interfaces, more capable change management, and stronger tools for change impact analysis. The survey also revealed that MBSE investment justifications are shifting from immediate returns toward long-term strategic value considerations.

The push for high-speed innovation is delayed by speed bumps such as changes in requirements, supply chain bottlenecks, and ever-increasing product complexity. Dr. Uyiosa Abusomwan, senior global technology manager for digital engineering and conference keynoter for Day Two, outlined how Eaton Corporation navigates these market dynamics with a novel approach to product development.

Dr. Abusomwan outlined Eaton’s “Digital Engineering Deployments” within its infrastructure of connected engineering tools.

This infrastructure relies heavily on:

• Model-Based Engineering (MBE) to analyze product performance, such as deformation, critical stresses, fatigue, and thermal impacts, while taking into consideration design rules and insights into costs, manufacturing, and sustainability.

• Intelligent Automated Design, which integrates AI and rule-based automation to execute the complete product design process: material selection, design calculations and analyses, cost and lead-time simulation using aPriori, and AI-enabled optimization of parameterized models, reduced-order modeling, or generative design.

Furthermore, Dr. Abusomwan presented the impacts of Digital Engineering on three Eaton product lines:

• A fourfold performance gain and 80% weight reduction in intelligent high-efficiency, lightweight heat exchangers.

• Using AI and multi-physics parametric optimization, an 87% reduction in the automated design time for lighting fixtures.

• A 65% reduction in autonomous digital design time for high-speed gears used in electric vehicles.

The challenges of operating enterprise-scale PLM implementations were addressed by Mr. Jorgen Dahl, senior director of PLM at GE Aerospace. He covered ensuring that the existing system works to its full capability and remains stable and usable by all who need it. All this happens amid continual modernization to accommodate nonstop business transformation, Mr. Dahl said, while delivering new digital thread capabilities across the enterprise and accommodating innovations such as virtualized infrastructure automation.

Mr. Dahl urged his peers to set “uncomfortably ambitious” goals of 75% to 99.9% improvement. “It’s not enough to define the strategy and share it,” he observed. The strategy “has to be shared early and often because the minute after you are done sharing it, understanding of it may start to deteriorate and get reinterpreted.” At the heart of his presentation were four things to do and corresponding “do-nots”:

• Imagine the ideal end state and do NOT set goals based on current constraints.

• Ask “what must be true” and do NOT proceed without clarity.

• Create a multi-phased development strategy and do NOT use return on investment (ROI) calculations to drive short-term goals; as a substitute for a holistic strategy, an ROI focus could make things worse, not better.

• Every new capability must lead to the end goal rather than developing short-term improvements with no clear path forward.

“This may appear like common knowledge to many,” Mr. Dahl cautioned, “but over 30 years of observation suggests it is not.”

Dr. Rob Bodington, Eurostep technical fellow speaking about ShareAspace, outlined how defense contractors can be sure of the contents of their Digital Twins and Digital Threads. To outline the use of commercial frameworks for intellectual property—again over today’s long-lived and highly complex weapons platforms and systems—he used an acronym, SMART.

Dr. Bodington defined SMART as unambiguous Semantics in requesting information, Measurable and enforceable data key performance indicators (KPIs) in contracts, Accuracy and precision in specifying what is asked for, Reasoned requests for only what is necessary, and being Timely about when data is needed. Added to these requests, he noted, are compliance with security regulations, maintaining trust, respecting commercial constraints, and developing sound ontologies.

There are “way too many” standards, he said, with many overlaps. And while standards reflect the collective experience of an industry, he pointed out, they sometimes address “problems you didn’t know you had.”

AI “will help,” Dr. Bodington continued, by generating and exploiting ontologies while classifying data and exploiting its contexts. AI can also generate and clarify terminology in contracts.  

Dr. Erik Herzog, Technical Fellow at Saab Aeronautics, provided an update on the Heliple-2 project to create federated PLM capabilities that are interoperable and a feasible alternative to monolithic PLM systems. Heliple-2, he explained, uses the Open Services for Lifecycle Collaboration (OSLC) standard, its related Genesis architecture, and the STEP standards.

Heliple-2 addresses a core information challenge in the 50-year lifecycles of weapons systems—their development systems are commonly replaced three times, Dr. Herzog pointed out.

Working on the Heliple-2 project are Eurostep, Saab, Volvo, IBM, LynxWork (a startup using OSLC for integration and traceability in engineering software), Sweden’s innovation agency Vinnova, and KTH, Sweden’s Royal Institute of Technology and its largest technical university.

With LynxWork and OSLC, he said, plug-and-play integration can be implemented in a week. The next steps for the program are industry-scale validation, backward navigation for links under configuration management, and demonstrating analysis capabilities spanning multiple applications, Dr. Herzog concluded.

Dr. Cristina Paniagua, a Luleå University of Technology researcher, addressed shortcomings in commonly used tools and solutions. She spelled out how the Swedish-Finnish Arrowhead flexible Production Value Network (fPVN) initiative is integrating traditional standards with emerging technologies in PLM, ERP, data management, and interoperability. This initiative has goals similar to Heliple-2.

Continuous evolution and integration are essential, she concluded.

Image: A conceptual view of the Arrowhead fPVN project as discussed by Dr. Cristina Paniagua. Note the centrality of CIMdata’s lifecycle optimization hexagon.  Image courtesy of TacIT

Interestingly, most of the presenters in Gothenburg delved into how their organizations are improving the management of product data and information. Many zeroed in on using PLM in digital transformation and how PLM is increasingly recognized as essential to overcoming data complexities and frustrations.

In my opening remarks, I reiterated that successful digital transformation requires a holistic, end-to-end approach to connectivity, strategies, and tactics that must address the organization’s issues with its people, processes, and the technologies it uses. I also stressed the criticality of organizational change management in maximizing value delivery. And I covered recent developments in PLM itself, such as AI, that enhance the value that can be realized from investment in digitalization. Anticipating customer demands as they evolve, together with emerging market opportunities, is what ultimately motivates investing in digitalization of the product lifecycle.

Final thoughts on PLM value

As PLM professionals, we must continually seek to enhance the value resulting from the digitalization of the product lifecycle. This requires keeping an eye on the evolving trends and enablers of digital transformation, which are comprehensively laid out in CIMdata’s Critical Dozen.

In Gothenburg, it was repeatedly emphasized that maximum value can only be obtained from a holistic, end-to-end approach and that organizational change management plays a critical role in maximizing the adoption of digitalization and delivering the expected value. Ultimately, investment in the digitalization of the product lifecycle is motivated by evolving customer demands and newly uncovered market opportunities.

The post The impact of digital transformation and how to manage disruptions appeared first on Engineering.com.

Building a digital footprint through data transformation
https://www.engineering.com/building-a-digital-footprint-through-data-transformation/ (Mon, 09 Dec 2024)

In the world of digital transformation, data is the most important “currency.”

The current software landscape and PLM technologies are limited in their ability to support effective decision-making and efficient use of data. This article explores the critical need for manufacturers to build a comprehensive digital footprint by transforming their approach to data. It examines the challenges posed by traditional methods, the limitations of current solutions and the practical steps required to develop a connected and efficient digital ecosystem. This transformation is vital for achieving smarter processes, operational efficiency and long-term competitiveness in the modern manufacturing landscape.

The complexity of products and business models

Manufacturing is undergoing a significant transformation, driven by the increasing complexity of products and the shift toward digital business models. The changes are not just technical but extend into how companies operate and engage with their customers. At the heart of this evolution is the role of data in enabling more adaptive and efficient processes. In practical terms, companies are looking at how to build digital interfaces to serve their internal and external processes, including communications with suppliers, contractors and customers. All these communications must be “digitized,” which raises many questions about how to do so. What changes are bringing the highest demand for digital transformation?

One of the most significant changes is the rise of Product-as-a-Service (PaaS). In this model, manufacturers move away from selling products outright to offering them as services. For example, a company might provide access to industrial machinery or software on a subscription basis rather than a one-time purchase. This approach not only creates recurring revenue streams but also strengthens customer relationships by allowing manufacturers to stay engaged throughout the product life cycle. However, this model is a nightmare for IT, which must organize an information stack that also supports “service” and “maintenance” data.

Another transformative development is predictive maintenance, which uses data from connected sensors to predict when equipment might fail. This proactive approach reduces downtime, prevents costly repairs and improves operational efficiency. Instead of reacting to issues after they arise, manufacturers can address problems before they impact production. This is another great innovation, but it requires a growing number of data sources (especially from customer sites) to be integrated into holistic data models.
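The core idea behind sensor-driven prediction can be illustrated with a deliberately simple sketch. The rolling-deviation rule and the vibration numbers below are invented for illustration; production systems use far more sophisticated models, but the shape of the problem is the same: compare new readings against recent history and flag drift before failure.

```python
from collections import deque

def detect_anomalies(readings, window=5, threshold=1.5):
    """Flag readings that deviate from the trailing window's mean by more
    than `threshold` times the window's mean absolute deviation."""
    history = deque(maxlen=window)
    flags = []
    for value in readings:
        if len(history) == window:
            mean = sum(history) / window
            mad = sum(abs(x - mean) for x in history) / window
            if mad and abs(value - mean) > threshold * mad:
                flags.append(value)
        history.append(value)
    return flags

# Steady vibration levels followed by a sudden spike (made-up data).
sensor = [0.51, 0.49, 0.50, 0.52, 0.48, 0.50, 0.51, 1.90]
print(detect_anomalies(sensor))  # the 1.90 spike is flagged
```

The point is not the statistics but the data dependency: the rule only works if readings from customer sites flow continuously into a model the manufacturer can query.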

The growing importance of data-driven services has also reshaped manufacturing businesses from design to production and operations. By analyzing performance data from products in use, manufacturers can offer tailored services such as optimization, upgrades, or customization. This shift not only improves customer satisfaction but also opens new revenue streams.

Outcome-based models represent another innovation, where companies promise specific results rather than just delivering a product. For example, a turbine manufacturer might guarantee a certain level of energy output rather than simply selling the turbine. These models align the manufacturer’s incentives with customer success, creating stronger partnerships.

All these trends point to the growing importance of data in manufacturing. From enabling connected products to providing actionable insights, data lies at the core of modern business models. However, the ability to harness and manage this data effectively remains a significant challenge for many companies.

To support data management transformation, manufacturers are looking at how to adopt new business models and re-tool their enterprise software implementations (including PLM, MES, ERP and others). This brings the need for a significant transformation of the enterprise data management and modeling used by all these tools. The question is how to give manufacturing companies access to advanced technologies without large upfront investments, especially in existing PLM/ERP software stacks; in other words, how to grow on top of existing IT and enterprise software layers without replacing existing tools.

The data problem in manufacturing ecosystems

Despite the growing emphasis on digital transformation, many manufacturing ecosystems remain rooted in outdated practices. A significant portion of manufacturing workflows is still centered around documents – CAD files, PDFs, reports, Excel spreadsheets and information stored on shared drives. While these methods have served the industry for decades, they are increasingly ill-suited to meet the demands of modern manufacturing.

The reliance on documents creates numerous inefficiencies. For instance, file-based content is very difficult to share, search, or update. It is not granular enough, and documents are not easy to link with each other. This leads to duplication of effort, errors in communication and delays in decision-making. Furthermore, document-based systems make it challenging to integrate data across different departments or organizations, resulting in silos of information that hinder collaboration and innovation.

Research into digital transformation underscores the severity of this problem. According to the NAM Manufacturing 2030 report, EY research on trends in discrete manufacturing, and other studies, while 68% of industrial company CEOs report increasing their investments in digital technologies, only 25% of manufacturers express confidence in their data’s organization and reliability. This gap highlights a fundamental issue: despite spending heavily on digital tools, many companies lack the data technologies and approach needed to create a robust foundation for holistic data organization.

The lack of confidence in data is not just a technical issue but also a cultural and strategic one. When data is fragmented, inconsistent, or unreliable, it becomes difficult to build trust in the systems that rely on it. As a result, organizations may struggle to achieve the full potential of their digital investments.

The limitations of existing solutions

Traditional Product Lifecycle Management (PLM) systems have long been the backbone of data management in manufacturing. These systems were designed to serve as centralized repositories for product-related data, using relational database architecture to organize and store information. While they have been effective in managing structured data within individual companies, they fall short in addressing the complexities of modern manufacturing ecosystems.

One major limitation of traditional PLM systems is their reliance on tables and local identifiers. This rigid structure makes it difficult to adapt to changing requirements or integrate new data sources. The relational database model, while efficient for simple tasks, struggles to handle the dynamic and interconnected nature of modern manufacturing data.

Another issue is the single-tenant architecture of most PLM systems. These systems are designed for use within a single organization, making them poorly suited for collaboration across companies or supply chain networks. In an era where partnerships and external integrations are critical, this limitation becomes a significant barrier.

Traditional PLM systems also suffer from a lack of flexibility when it comes to data recombination. Once data is stored in a specific format, it is often difficult to repurpose or integrate it with other systems. This rigidity limits the ability of manufacturers to innovate or respond quickly to new opportunities.

Perhaps the most significant challenge is the disconnection between traditional PLM architectures and the broader vision of a “digital web.” While the future of manufacturing is increasingly defined by interconnected systems and seamless data flows, PLM systems remain rooted in siloed SQL databases and outdated architectures. This disconnect creates a gap between what manufacturers need and what traditional systems can deliver.

Elements of a future digital footprint

To address these challenges and build a more effective digital footprint, manufacturers must adopt new approaches to data management and system design. This transformation involves rethinking how data is structured, shared and used across the organization and beyond.

The role of graph models and linked data

One of the most promising innovations is the adoption of graph-based data models. Unlike traditional relational databases, graph models represent both the data model and the data itself as nodes and connections, in a seamless, self-describing and robust way. This approach is inherently more flexible, allowing data to be connected and recombined in ways that were previously impossible.

Graph models are particularly well-suited to the complexity of modern manufacturing data. They can represent relationships between components, processes and systems in a way that mirrors the real-world complexity of manufacturing operations. Most importantly, this approach provides a mechanism to extend and modify the data as the scope and business requirements grow. Graph models are extremely powerful for recombining data coming from multiple systems, and they make it easier to analyze data, identify patterns and derive insights that drive better decision-making.
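A minimal sketch makes the difference concrete; all part and material names below are invented for the example. Once relationships are stored as explicit edges rather than foreign keys scattered across tables, a change-impact question becomes a plain graph traversal:

```python
# A tiny product graph: each edge reads "<child> is used by <parent>".
# All names are invented for illustration.
used_by = {
    "alloy-7075":     ["bracket-A", "housing-B"],
    "bracket-A":      ["pump-assembly"],
    "housing-B":      ["pump-assembly", "valve-assembly"],
    "pump-assembly":  ["hydraulic-unit"],
    "valve-assembly": ["hydraulic-unit"],
}

def impacted_by(node, graph):
    """Return every item reachable from `node` via 'used by' edges,
    i.e. everything a change to `node` could affect."""
    seen, stack = set(), [node]
    while stack:
        for parent in graph.get(stack.pop(), []):
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return sorted(seen)

# A material change ripples up through every assembly that depends on it.
print(impacted_by("alloy-7075", used_by))
```

Extending the model is just adding edges; no schema migration is needed when a new relationship type or data source appears.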

Semantic web standards and knowledge graphs

Another critical element of the future digital footprint is the use of semantic web standards. Technologies like RDF (Resource Description Framework) and OWL (Web Ontology Language) provide a standardized way to describe and link data, making it more interoperable and machine-readable. These standards are the foundation of linked data, enabling seamless integration across systems and organizations.

At the center of this transformation is the knowledge graph, a universal information model that combines graph-based structures with semantic web standards. Knowledge graphs can represent complex relationships between data points, making them an ideal foundation for modern PLM architectures. They allow data from different sources to be connected seamlessly, creating a single source of truth that supports more intelligent decision-making.
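The triple pattern at the heart of RDF-style linked data can be sketched with plain Python tuples. A real deployment would use a triple store and proper URIs; every identifier below is invented for illustration:

```python
# Subject-predicate-object triples in the spirit of RDF; all names invented.
triples = {
    ("part:1042", "hasMaterial", "material:alloy-7075"),
    ("part:1042", "usedIn", "assembly:pump"),
    ("assembly:pump", "suppliedBy", "supplier:acme"),
    ("material:alloy-7075", "hasDensity", "2.81 g/cm3"),
}

def match(s=None, p=None, o=None):
    """Return triples matching the pattern; None acts as a wildcard."""
    return sorted(t for t in triples
                  if (s is None or t[0] == s)
                  and (p is None or t[1] == p)
                  and (o is None or t[2] == o))

# Everything known about part:1042, regardless of which system contributed it:
print(match(s="part:1042"))
```

Because every fact has the same three-part shape, facts contributed by PLM, ERP, or supplier systems merge into one queryable whole, which is exactly the "single source of truth" property described above.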

Data transformation and process transformation

Data transformation is a key step in building a digital footprint. This involves moving from disconnected silos to a connected data ecosystem. Companies must consolidate data from multiple sources, link it to create a coherent picture and enable real-time collaboration. This shift from static documents to dynamic, interconnected data can unlock significant efficiency gains and foster innovation. It is important to highlight that doing so doesn’t mean the replacement of existing solutions. Modern graph-based data management systems are capable of capturing and containing data coming from multiple systems.
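As a hedged sketch of such consolidation, assuming three hypothetical silos that happen to share a part number as the key (system names and fields are invented, and real integrations must also reconcile conflicting keys and formats):

```python
# Fragments about the same part, as they might sit in three separate silos.
# System names and fields are invented for illustration.
plm_data = {"P-100": {"revision": "C", "cad_file": "p100_revC.step"}}
erp_data = {"P-100": {"unit_cost": 12.40, "supplier": "Acme"}}
mes_data = {"P-100": {"scrap_rate": 0.02}}

def consolidate(part_id, *sources):
    """Merge every source's fields for one part into a single record."""
    record = {"part_id": part_id}
    for source in sources:
        record.update(source.get(part_id, {}))
    return record

view = consolidate("P-100", plm_data, erp_data, mes_data)
print(view["revision"], view["supplier"], view["scrap_rate"])
```

The existing systems stay in place as the sources; only the linked view on top of them is new, which is the non-replacement approach the paragraph describes.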

Process transformation goes hand in hand with data transformation. Traditional workflows often involve documents being passed around via email or through rigid approval processes. By shifting to data-driven workflows, companies can enable instant sharing, collaborative processes and real-time updates. This reduces inefficiencies, improves agility and allows manufacturers to respond more quickly to changing requirements.

Expanding the digital footprint for operational efficiency

Building a comprehensive digital footprint requires a holistic approach that combines strategic, technological and operational changes. It is not just about implementing new tools but also about rethinking how data is managed and used across the organization.

Strategy and cultural shift

The first step is a shift in mindset. Organizations need to move away from viewing information as static documents and start treating it as dynamic, interconnected data. This change requires not only new tools but also education and cultural change within the organization. Teams must understand the benefits of data-driven approaches and embrace a culture that values transparency, collaboration and innovation.

Technological foundations

From a technological perspective, creating an open and connected data architecture is essential.  It is important to realize that the days of a “single database” serving a company are over. This includes adopting graph-based models, using semantic web standards and ensuring systems are designed to integrate seamlessly. The goal is to create a framework that can adapt to future needs and support continuous improvement.

Implementation and integration

Implementation is another critical area. Companies must focus on connecting existing data from legacy systems and operational sources, building new applications that enable more dynamic workflows and integrating these tools into existing processes. The aim is to create a seamless environment where data flows freely and supports smarter decision-making. It is important to emphasize that existing solutions represent a critical element in modern data architecture. By connecting (and not replacing them) and complementing them with new solutions, companies will be able to achieve much faster ROI and solve business problems faster.

AI and decision support

It would not be complete without mentioning AI and related data management tools. There are multiple places where AI tools can be used. But the most important thing is to think about how to feed AI tools with the most accurate data. AI tools can analyze data, identify patterns and provide insights that improve decision-making. However, these tools rely on a solid data foundation. Without reliable, connected data, AI systems cannot deliver their full potential.

“Data” is one of the most commonly used words in the lexicon of PLM-minded people these days. What is important is to focus on a solid data management foundation; without it, all these new “data” solutions will be too weak and brittle to support modern manufacturing environments and digital processes.

Building a digital footprint through data transformation is essential for manufacturers looking to stay competitive in an increasingly complex and digital world. By addressing the limitations of traditional systems and adopting modern approaches like graph-based models, semantic web standards and knowledge graphs, companies can create a connected and efficient data ecosystem.

This transformation requires more than just new technology; it demands a shift in strategy, culture and processes. By embracing these changes, manufacturers can unlock new levels of operational efficiency, foster innovation and position themselves for long-term success in the digital era.

The post Building a digital footprint through data transformation appeared first on Engineering.com.

How the chief data officer connects data and people to create value
https://www.engineering.com/how-the-chief-data-officer-connects-data-and-people-to-create-value/ (Mon, 18 Nov 2024)

Exploring the rising role of digitally focused executives in unlocking value through integrated, data-driven decision making.
The Chief Data Officer (CDO) role has emerged over the past two decades as a pivotal connector between digital technology and operations, especially in industries where product and manufacturing innovation drive strategic growth. Traditionally, IT—often led by the CIO—has been tasked with maintaining legacy systems, managing technical debt and overseeing core digital infrastructure.

With many CIOs reporting to CFOs, the emphasis has largely shifted to cost efficiency over proactive innovation. This gap opens the door for digitally minded, innovation-driven executives like CDOs to bridge IT, engineering and manufacturing, enhancing data integration, accessibility and actionable insights. This coordination across Product Lifecycle Management (PLM), Enterprise Resource Planning (ERP), Material Requirements Planning (MRP) and other core systems supports a holistic approach to data-driven innovation.

A 2023 Deloitte Insights report highlights three factors driving CDOs’ effectiveness in transforming business performance: aligning their vision with business strategy, controlling data management practices and cultivating influential relationships to extend their reach. Approximately 67% of Fortune 500 companies now have a CDO role, underscoring the priority of data leadership across industries.

This article examines how the CDO elevates IT’s mandate, aligning data strategies with engineering and operational needs to make data a driving force behind manufacturing and product innovation.

The CDO drives end-to-end data integration

In manufacturing, the integration of PLM and ERP is not just a technical requirement; it is an essential component of strategic growth. The end-to-end flow of data between product design and production (facilitated by PDM and PLM) and operational processes (led by ERP, MRP and MES systems) boosts collaboration, shortens time-to-market and improves product quality—ultimately unlocking greater innovation potential through real-time, data-backed insights across the product lifecycle.

The CDO approaches this integration challenge with a strategic vision. By dismantling data silos and enabling seamless information flow between PDM, ERP, MRP, MES and CRM repositories, the CDO empowers cross-functional teams with real-time insights that support timely, informed decision-making. This effort involves more than just integrating systems; it requires fostering a culture of continuous collaboration and improvement. The CDO mandate is therefore more about people and stakeholder alignment than about data analytics and AI governance.

Addressing some key questions can drive value from cross-functional alignment, including:

  • Are design and engineering teams optimizing products for customer needs and regulatory standards?
  • How effectively are production teams implementing efficient manufacturing strategies?
  • Are procurement teams securing cost-effective and sustainable supplier partnerships?
  • Is the sales strategy aligned with product capabilities and customer feedback?
  • How efficiently are we responding to market changes based on real-time data?

The CDO role goes beyond data integration and technical alignment, fostering a culture where data serves as the foundation for innovation and responsiveness in product development. With well-managed and accessible data, innovation flourishes, enabling organizations to meet market demands proactively and competitively.

The CDO bridges the engineering-IT divide

The CDO role in bridging engineering and IT is central to achieving seamless collaboration across all stages of product development and manufacturing. Engineering and manufacturing teams often rely on specialized tools and platforms for design, simulation and testing, which may not always be compatible with traditional IT infrastructure. This disconnection can isolate engineering efforts from broader strategic objectives, slowing down innovation and creating operational silos. Here, the CDO brings a strategic approach to data integration, aligning specialized engineering tools with enterprise systems and ensuring that engineering efforts contribute directly to overarching business goals.

By establishing data pathways between engineering and IT, the CDO makes real-time insights available across departments, enhancing collaboration and ensuring engineering data supports PLM and ERP solutions. This integration goes beyond technical alignment. It fosters a shared understanding and sense of purpose across functions, allowing engineering teams to innovate with a clear view of how their efforts impact end-to-end product strategy. When engineers can seamlessly access critical information and align their objectives with broader business needs, the organization can drive faster time-to-market, improve product quality and make more informed decisions at every stage of the product lifecycle.

A 2023 survey published by the Harvard Business School reported that “The CDO role is poorly understood and incumbents of the job have often met with diffuse expectations and short tenures. There is a clear need for CDOs to focus on adding visible value to their organizations.” Furthermore, the authors, Davenport et al., highlighted that “Of the CDOs surveyed, 41% said they define success by achieving business objectives—significantly more than those who measured success in terms of change management or culture shift (19%), technical accomplishments (5%), prevention of serious data problems (2%), or an equal combination of these factors (32%).”

As the CDO aligns data strategies to break down engineering silos, they create a connected ecosystem where engineering insights actively inform operations, design and customer needs. By prioritizing and advocating for the distinct data needs of engineering, the CDO elevates this function from an isolated operation to a core, strategic component of the organization. This alignment allows the business to respond more rapidly to customer demands, ensuring that innovation efforts stay on track and deliver tangible value across departments.

Leveraging technology adoption to create value

While CIOs are often tasked with “sweating the assets” and managing technical debt, the CDO can advocate for engineering’s unique data requirements, highlighting their role in broader strategic goals. An effective CDO can reshape the traditional IT role, transforming it from a cost center focused on operational maintenance into a proactive, strategic partner in product innovation and manufacturing excellence. By aligning IT functions with product development and manufacturing priorities, the CDO facilitates a shift from reactive data management to data-driven growth and continuous innovation.

The CDO fosters digital transformation by building a data governance framework that prioritizes data quality, accessibility and security—empowering IT to act as an enabler of data-driven insights rather than just a gatekeeper of digital resources. Through advanced analytics, machine learning and cloud-based solutions, the CDO enables IT to leverage data assets for predictive maintenance, supply chain efficiency and agile product design. As a result, IT contributes actively to achieving the organization’s strategic goals, moving beyond maintenance to deliver predictive insights that inform every stage of the product lifecycle.

Collaboration between the CDO and CIO is also critical in identifying and prioritizing technology investments that drive the highest business impact. This synergy ensures that while the CDO focuses on strategic, data-driven growth, the CIO maintains a stable and secure IT foundation—balancing innovation with operational resilience. By harmonizing their objectives, the CDO and CIO can allocate IT resources more effectively, emphasizing data initiatives that deliver substantial returns while safeguarding the organization’s core infrastructure.

In times when agility and innovation determine market leadership, the CDO stands as a visionary force, aligning data, technology and people to redefine business potential. By embedding data-driven strategies at every level, the CDO not only bridges operational silos but also empowers organizations to drive sustainable growth, anticipate change and maintain a competitive edge. In the digital age, the CDO’s leadership is not just valuable—it is essential for turning data into the organization’s most powerful asset.

The post How the chief data officer connects data and people to create value appeared first on Engineering.com.
