Peter Bilello, Author at Engineering.com
https://www.engineering.com/author/peter-bilello/

From aerospace to appliances, how PLM is tackling costly data issues
https://www.engineering.com/from-aerospace-to-appliances-how-plm-is-tackling-costly-data-issues/
Thu, 06 Jun 2024

Presenters at the annual CIMdata event zeroed in on how the biggest manufacturers use PLM to manage and streamline onerous amounts of data

The post From aerospace to appliances, how PLM is tackling costly data issues appeared first on Engineering.com.

PLM Road Map & PDT, a collaboration with Sweden’s Eurostep Group, is CIMdata’s annual North American gathering of product lifecycle management (PLM) professionals. This year’s event was a deep dive into product data and information and how they can be better managed. Presenters zeroed in on one of PLM’s biggest existing challenges—how it plays a critical part in an organization’s digital transformation.

Digital transformation is increasingly recognized as critical to overcoming the data complexities that frustrate every organization. In my opening remarks I stressed that successful digital transformation requires keeping an eye on the evolving trends and enablers—CIMdata’s Critical Dozen.

I also pointed out that:

  • Maximum PLM value results only from a holistic, end-to-end approach to connectivity, strategies, and tactics that appropriately addresses people, processes, and technologies. Nearly all PLM failures result from not following through and not staying the course.
  • Organizational change management will be required; it always plays a critical role in maximizing the adoption of new processes and technologies, and delivering bottom-line value.
  • Evolving customer demands and market opportunities are motivating investment in comprehensive product lifecycle digitalization.

I outlined the major focal points, including enterprise application architectures, configuration management, extending bills of materials (BOMs) through bills of information (BOI) structures, model-based structures, the Internet of Things (IoT), Big Data and analytics, augmented intelligence, digital skills transformation, organizational change management and, of course, digital twins and digital threads.

Each of this year’s presenters, whether from a private- or public-sector organization, is hands-on in some aspect of the digital thread. Several came from member companies of CIMdata’s Aerospace and Defense PLM Action Group (AD-PAG), which is celebrating its tenth year under the leadership of James Roche, CIMdata’s A&D practice director.

Keynoting was David Genter, director of systems design engineering for Cummins Inc.’s Accelera unit, which was formed to leverage design for sustainability (DfS) by “developing a design optimization culture.” Cummins, based in Columbus, Ind., produces more than 1.5 million engine systems a year and is driving a decarbonization transition as it refocuses from diesel engines to alternative-fuel internal combustion engines; fuel-cell, battery-electric, and hybrid powertrains; and electrolyzers that generate hydrogen and oxygen from water.

Genter stressed “moving analysis to the left,” which means using analysis early and often to engineer sustainability into designs from the start. He cited DfS test cases saving more than $1.4M in the first year by removing non-value-added material in the designs of engine mounting support brackets and exhaust manifolds. When a commitment is made to early analysis for material-use optimization, he noted, the typical result is a 10% to 15% material savings, a five-fold return on DfS engineers’ salaries and big improvements in right-first-time design.

“Addressing climate change is daunting but DfS is not!” Genter observed. Cummins is committed by 2030 to reducing greenhouse gas emissions by 50%, water use by 30%, waste by 25%, organic compound emissions by 50% and new-product Scope 3 (indirect but related) emissions by 25%, and to generating circular lifecycle plans for every new part.

Optimizing designs with right-first-time techniques can seem “tough” when “everybody is already overwhelmed with the other work,” Genter said, but Cummins has verified that DfS need not add any organizational burdens or stretch normal design times for new products—quite the opposite.

Several presenters addressed the elimination of paper documents. Robert Rencher, a senior systems engineer and associate technical fellow at Boeing, shared some numbers and described a highly successful solution. In his work with the AD-PAG, he has probed airlines’ continuous checks of aircraft and the U.S. Federal Aviation Administration (FAA) Form 8130-3 Authorized Release Certificate (ARC).

Form 8130-3, the FAA’s airworthiness approval tag, is the airline industry’s good-to-go certificate. It is ubiquitous and mandatory for every aircraft inspection and every new or repaired part.

Rencher’s numbers were eye-openers. He reported that a single FAA 8130-3 may require as many as 600 engineering and inspection documents, and documentation for a single bolt can add dozens of pages. One large U.S. airline conducts over 200,000 aircraft checks every year, he said, and:

  • 90% of this documentation is still handled on paper.
  • The labor cost to file one document was $20 in 2020.
  • Between 2% and 5% of these documents “are lost or misfiled.”

In his presentation, delivered on behalf of the AD-PAG and drawing on the experiences of all its members, including Boeing and Airbus, Rencher said Airbus can now electronically generate and verify all the information needed for an 8130-3, establishing a digital data exchange for aerospace quality certification. Digitizing this documentation could save the aerospace industry €80 million annually in Europe alone; over €20 million in annual savings has already been reported.

Rencher summarized that combining the digital thread with distributed ledger technology has proved to be a success: users of 8130-3 gain the digitization, traceability, provenance and accessibility of part data across the part’s product lifecycle from design to final disposition.

Rencher also covered AD-PAG’s benchmark study on furthering the use of PLM’s digital twins and threads. In this effort’s fifth phase, AD-PAG is working with numerous industry standards bodies, Rencher reported. The intent is to gain consensus and acceptance of results, increase the brainpower in the project, increase AD-PAG’s leverage with PLM software providers, and test PLM digital twin/thread definitions against more than 20 distinct use cases.

AD-PAG’s other focus is the Systems Modeling Language (SysML) as an enabler of model-based systems engineering (MBSE). In an update, Chris Watkins, principal engineer, MBE/MBSE at Gulfstream Aerospace Corporation and the AD-PAG MBSE project leader, reported on open, standardized application programming interfaces (APIs), common ontologies, and new SysML tool providers. He highlighted persistent shortcomings in syntax, notation, and interoperability, along with ambiguities and “poor support” for Universally Unique IDs (UUIDs), which promise to address many of these challenges. The AD-PAG is addressing these issues in a follow-on project phase.

McDermott International Ltd. is actively driving the use of PLM in the engineering, procurement and construction (EPC) industry, which regularly joins discrete and process capabilities in its multibillion-dollar energy infrastructure projects; these are a tough challenge for any digital technology. Houston-based McDermott designs and builds on land, at sea, and underwater worldwide.

Jeff Stroh, McDermott’s senior director of digital and information management systems, said every McDermott project has millions of documents and digitalization is increasingly urgent. Many documents are from Microsoft Office, but many more are from other digital tools that are not integrated or only partially so—and they reference many sources.

Challenges abound, Stroh stressed. The re-use of prior document content is limited in EPC, and quality controls are “purely manual,” so change identification and management are difficult. EPC projects rely heavily on offline commenting and mark-ups that must be manually processed, he pointed out. Documents and sources are disconnected, so EPCs have no effective way to “bake in” lessons learned.

As yet there are no effective processes for knowledge management amid nonstop additions, deletions, clarifications and replacements, Stroh added. Nor are there any ways to link content from engineers’ applications to a project’s narrative documents.

Stroh, too, had some eye-opening numbers. In EPC, clients supply thousands of documents but McDermott’s engineering deliverables mushroom to many multiples of that; moreover, each individual document goes through multiple revisions and approval cycles. In one project, the client made nearly 60,000 comments on more than 3,000 documents, he noted, all of which had to be responded to.

Project manhours have ballooned in recent decades, he continued, and the EPC business climate has become very risky: “cost-plus” contracts are gone, and many contracts are now done at a fixed price.

To cope, McDermott engineers need data that is available and usable in their tools and applications; data that reduces the friction in work and finding answers; that drives collaboration and visibility across tools, functions, and disciplines; and that augments the workflow, “not interrupt it.” Legacy approaches, “voluminous narrative documentation and manual processes are not fit for the fast-paced world…”

Finally, Stroh commented that this was McDermott’s second PLM try. The first failed because it focused on “document management instead of reimagining processes based on data and information management.” Success in this new implementation is also tied to adopting the users’ terminology into the processes and tools rather than expecting users to adopt the tools’ terminology.

Mabe, a Mexico City-based white-goods manufacturer, presented its efforts to get new products to market faster, extend and improve product families, reduce business complexity and improve productivity across the organization. With eight factories, Mabe annually produces nearly 10 million refrigerators, ranges and other appliances. General Electric Appliances is Mabe’s largest reseller.

Speaking were Gabriel Vargas, director of engineering systems, and Maria Elena Mata Lopez, leader, technical information and modular architecture.

Through digitalization, automation and a modular product family architecture that enables automated product configuration, Mabe’s benefits were dramatic, according to Vargas and Mata Lopez. The resources needed for new product introductions fell by 90% and those for product maintenance by 80%, Mata Lopez reported; the effort to manage BOMs was reduced by 80%; parts counts fell by 60% for ranges and 42% for refrigerators. Cost management was sped up and made more accurate, she added.

Digital transformation leaders from General Electric Aerospace, the Gulfstream Aerospace Corp., and Moog also spoke at the conference.

A persuasive case for using artificial intelligence (AI) in product design was made by Uyiosa Abusomwan, senior global technology manager for digital engineering at the Eaton Corporation. He asked whether AI could be used to optimize product design “the way nature optimizes each new creature?”

Addressing the application of AI to supply chains, Abusomwan pointed out that AI can identify features, extract parameters and put them in 3D CAD formats. As an example, he noted, data can be fed into quality management solutions and into PLM so failures and causes are detected and not repeated; Eaton is already doing this.

For AI to be used to optimize designs, he continued, open minds are needed along with new data management tools, which can cut across design solutions including PLM, and an understanding of the values being sought. Abusomwan predicted that third-party solution providers in PLM and IT will be buying up start-up companies that offer these capabilities.

The conference’s second day focused on the public sector with presentations from or about the U.S. Defense Department’s Research and Engineering, the Defense Acquisition University, the Naval Surface Warfare Center, NASA, and the U.K. Ministry of Defence. All addressed digital transformation issues and their agency’s progress.

A keynote presentation was given by Daniel Hettema, director of digital engineering, modeling, and simulation in the Office of the Undersecretary of Defense for Research and Engineering. His office is “getting into systems engineering big time” and has launched a review of the DoD’s modeling and simulation policies.

The DoD, he noted, has 700,000 employees, over 1.4 million contractors, and 709 policies that address some aspect of digital model creation. Because of its size and complexity, “DoD doesn’t make big moves but…we can move the needle with small wins” such as leveraging digital technologies and using (newly) optimized customer models.

Government agency responsibilities are much broader than, and fundamentally different from, any private-sector organization’s responsibilities. So the public-sector presenters focused on policies and guidance rather than solutions and processes.

Most presenters at this year’s PLM Road Map & PDT zeroed in on the many different challenges in digital transformation and individual companies’ priorities in tackling them through PLM and some form of digital engineering. They all reported solid successes and rapidly evolving plans to complete their transitions.

Augmented Intelligence and Product Lifecycle Management—the Next Frontier
https://www.engineering.com/augmented-intelligence-and-product-lifecycle-management-the-next-frontier/
Wed, 01 May 2024

A look at the many forms of AI, their differentiators and how to integrate them.

The post Augmented Intelligence and Product Lifecycle Management—the Next Frontier appeared first on Engineering.com.


A never-ending challenge for every organization, regardless of why it exists, is keeping track of what’s going on in and around it. This includes identifying new and emerging players and business models, of course, but also three sets of drivers of fundamental change: new technologies to be mastered and incorporated into new products; the irresistible trends in the marketplace where one competes; and the trends, processes, and functional capabilities (i.e., elements) that impact and are necessary for successful digital transformation, namely CIMdata’s Critical Dozen.

As I described in previous articles, CIMdata’s Critical Dozen is an evolving set of elements that an organization must master in its digital transformation journey. This means that yesterday’s dozen isn’t necessarily tomorrow’s. This said, two elements (digital twins and digital threads) of the current Critical Dozen have hit their inflection point, forcing their convergence, and a new element—Augmented Intelligence—has emerged.

Digital Twin-Digital Thread Convergence

As I have often stressed, a digital twin without a digital thread is an orphan. A digital twin, as I described in previous articles, is a virtual representation of a product, a system, a piece of software, a network, or an asset. One or more digital threads keep that representation up to date from the initial concept through its complete lifecycle.

Digital twins and digital threads are co-dependent, always have been, and always will be. Without a digital thread, a digital twin is little more than database clutter with no guarantee of being up to date. And unconnected to a digital twin, a digital thread is a string of data running from nowhere to nowhere. What has changed is the perception of digital thread and digital twin codependence: it is now clearly and unambiguously understood. 

Artificial Intelligence (AI) powerfully enhances this convergence by extending the breadth, depth, and reach of digital threads while uncovering hidden drivers and causes of changes in digital twins. 

Product Lifecycle Management (PLM) is an innovation engine that helps orchestrate the creation, maintenance, and reuse of assets—digital twins and digital threads, in particular—that represent the enterprise’s products, systems, and networks. Thanks to easier and deeper access to the Internet of Things (IoT), digital threads foster steady enhancements in end-to-end connectivity throughout an asset’s lifecycle. 

The Emergence of AI and its place amongst CIMdata’s Critical Dozen

The IoT is often seen as a major cause of the unprecedented explosion of data, structured and unstructured—“Big Data”—generated by billions of connected devices, from automobiles, HVAC systems, and medical equipment to smartphones and even smart doorbells.

As part of what we understand to be the Fourth Industrial Revolution, these oceans of data have grown beyond human comprehension and even everyday computational capability. In PLM, this data fills our digital twins, races up and down our digital threads (and everywhere else within PLM), and often surges nonstop between PLM and its adjacent enterprise solutions. 

PLM-related AI-enabled applications are now leveraging this, and not a moment too soon. CIMdata wholeheartedly supports the exploration and adoption of AI as the next step in PLM’s central role in digitally transforming the enterprise—and in enabling and realizing how AI is being used to augment human intelligence. This Augmented Intelligence enables human decision-makers and domain experts to use the enormous inflows of data that threaten to overwhelm our digital systems. 

As its name implies, Augmented Intelligence, sometimes labeled Intelligence Augmentation, amplifies human and machine intelligence by merging them (i.e., Augmented Intelligence is the implementation of AI-enabled capabilities that add to and aid human insight and decision-making). In contrast, Artificial Intelligence attempts to supersede human intelligence. This distinction aside, product innovation platforms (i.e., PLM platforms) managing specific processes and data will be key recipients and implementers across organizations. 

The same challenges in any new technology are at work amid AI’s potential productivity gains. The regulatory and legal frameworks around AI technologies, including those that support Augmented Intelligence, already lag their rapid development. Ethical and moral considerations are already emerging, and implementers must develop long-term strategies for using and sharing data. And with everyone’s expectations changing, workforce skills must be upgraded and customers must be educated.

The Many Forms of AI and How They Differ

Decisive action and implementation require understanding the elements of AI, which enables algorithms to perform tasks that normally require human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.

As with any emerging tech trend and process enabler, many different forms quickly appear in the marketplace as soon as the software and hardware are up to the task.

PLM solution providers are busily integrating AI tools and methods, including:

Generative AI (GenAI) is an iterative form of AI that can solve design problems in three steps, making it particularly useful for updating digital models, especially those related to PLM’s digital twins. (1) The design challenge is first parameterized to generate all possible candidate solutions. (2) Each design is then evaluated by simulating its performance, dimensional fits, surfaces, and so on. (3) The candidates are algorithmically sorted to find the unique, highest-performing design; these designs are often shown as oddly lumpy structures. Integrated with PLM, GenAI promises faster digital threads and better updates to digital twins, among many other potential applications.
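The three-step generate/simulate/sort loop can be sketched in a few lines of code. This is a toy illustration only: the bracket parameters, the mass and stiffness formulas, and the feasibility threshold are all invented for the example, not any vendor’s actual GenAI implementation.

```python
import itertools

def generate_candidates():
    """Step 1: enumerate candidate designs from a parameter space."""
    thicknesses_mm = [2.0, 2.5, 3.0, 4.0]
    rib_counts = [0, 1, 2, 3]
    return [{"thickness_mm": t, "ribs": r}
            for t, r in itertools.product(thicknesses_mm, rib_counts)]

def simulate(design):
    """Step 2: evaluate each candidate (here, crude mass/stiffness proxies)."""
    mass = design["thickness_mm"] * 10 + design["ribs"] * 3         # grams
    stiffness = design["thickness_mm"] ** 2 + design["ribs"] * 2.5  # arbitrary units
    return {"mass": mass, "stiffness": stiffness}

def rank(candidates, min_stiffness=10.0):
    """Step 3: keep feasible designs, lightest adequate design first."""
    scored = [(simulate(d), d) for d in candidates]
    feasible = [(s, d) for s, d in scored if s["stiffness"] >= min_stiffness]
    return sorted(feasible, key=lambda sd: sd[0]["mass"])

best_score, best_design = rank(generate_candidates())[0][0], rank(generate_candidates())[0][1]
print(best_design, best_score)
```

A real generative-design system replaces the hand-written formulas with finite-element or CFD simulation and searches a far richer geometry space, but the generate-evaluate-sort skeleton is the same.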

Large Language Models (LLMs) generate specific text and images on demand by combining words and untangling the ambiguities in everyday language with vectors (and other tools) that rely on neural network architectures called transformers. LLMs, such as the wildly popular ChatGPT from OpenAI, are “trained” on enormous amounts of data, including text and images. ChatGPT and many other “chatbots” are put to work—“prompted”—with instructions in everyday, natural language; usually, no coding skills are required. By the way, “GPT” stands for Generative Pre-trained Transformer.

Built on machine learning (see below), LLMs make data problems obvious, cutting through the AI hype and taking us back to AI’s roots. As with GenAI above, LLM integration with PLM promises enhanced and faster digital threads and better updates to digital twins, e.g., from the IoT.

AI in Analytics

For many, the historical beating heart of AI is analytics, explained quite well in Gartner’s “Source Planning Guide for Data and Analytics,” which lays out the four stages of analytics (Descriptive, Diagnostic, Predictive, and Prescriptive) from a PLM user’s point of view.

AI-driven analytics can determine the value, usefulness, and relevance of an organization’s data. This is the source of AI’s new data fields, which reveal previously unexpected insights, unknown connections, and unseen trends.

Many AI-enabled analytics approaches may prove beneficial to PLM, but two already stand out:

Machine Learning (ML) makes predictions and recommendations from patterns in data—whether structured or not. Over time, data and information can be added, and ML’s predictive algorithms can be enhanced without explicit programmed instructions.

Deep Learning (DL) is a form of machine learning in which a cascade of processing layers and units extract increasingly complex features from previous outputs; the result is the determination of new data in a hierarchy of concepts using dozens of dimensions of analysis. DL is the basis of our ubiquitous “digital assistants” Siri, Alexa, and Cortana, as well as many complex tasks (such as image processing in autonomous vehicles) that are done rapidly by combining images, audio, and video.  
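ML’s core idea, predicting from patterns in historical data rather than from explicitly programmed rules, fits in a few lines. The sketch below is a deliberately minimal nearest-neighbor classifier; the sensor readings, units, and failure labels are made up for illustration and stand in for the kind of condition data a digital thread might carry.

```python
import math

# Hypothetical history of (vibration mm/s, temperature C) readings,
# each labeled with whether the asset failed within 30 days.
history = [
    ((1.2, 60.0), False),
    ((1.5, 65.0), False),
    ((4.8, 88.0), True),
    ((5.2, 92.0), True),
    ((2.0, 70.0), False),
    ((4.5, 85.0), True),
]

def predict(sample, k=3):
    """Classify a new reading by majority vote of its k nearest neighbors."""
    nearest = sorted(history, key=lambda row: math.dist(row[0], sample))[:k]
    votes = sum(1 for _, failed in nearest if failed)
    return votes > k // 2

print(predict((4.9, 90.0)))  # a reading resembling past failures
```

No rule anywhere says “high vibration plus high temperature means failure”; the prediction emerges from the data, and adding more labeled history improves it without reprogramming, which is the point of the ML paragraph above.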

For PLM users, AI-enabled analytics benefits are found in critical but everyday tasks, including reducing time spent on routine and mundane tasks, avoiding mistakes, achieving more consistent results, and improving response times in:

  • Material selection, manufacturing process definition, testing and quality control (QC), safety, and accident prevention.
  • System and software modeling with ChatGPT and other LLM tools.
  • Examining digital threads for novel insights and quickly identifying patterns in digital twins.
  • Supporting the management of changes to products and their designs in PLM.
  • Generation of training data for ChatGPT and other LLM tools.

AI in Enterprise Search

In preparing my presentation for a recent virtual conference sponsored by Sinequa, a developer of an AI-enabled enterprise search platform, I recognized that many organizations seem to have forgotten that data is the core of digital transformation and that their digital initiatives are doomed unless data is better identified, understood, appreciated, and managed.

Many organizations overlook their PLM users’ constant, time-consuming searches for the information they need in their jobs; that data and information should be in the digital twins and threads, and in the digital webs, processes, and systems where the organization’s data is generated and managed.

Data generation itself is a problem. The processes, systems, and smart connected products in today’s digital world generate huge amounts of data, threatening to lurch out of control. This makes finding the right data for actionable insights and sound decisions imperative. Big Data has already rendered most ordinary searching ineffective, so the latest capabilities must be provided, and not just to PLM users. 

Leading solution providers like Sinequa also offer and leverage an ever-expanding array of AI-enabled tools and techniques to minimize GenAI’s mistakes and what users label its hallucinations. 

A particularly successful approach that Sinequa leverages is Retrieval Augmented Generation (RAG). By expanding LLMs’ information bases, improving context, and eliminating outdated information, RAG addresses the LLMs’ biggest difficulty: being crammed to overflowing with data. RAG’s back-end information retrieval also enables smaller, specialized LLMs, which can revolutionize enterprise search and overcome traditional information boundaries.
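The RAG pattern itself is simple to sketch: retrieve relevant passages first, then hand them to the model as grounding context. The snippet below is a bare-bones illustration with an invented document set and naive word-overlap scoring; production systems such as Sinequa’s use vector embeddings, real indexes, and an actual LLM call in place of the final print.

```python
documents = [
    "Form 8130-3 is the FAA airworthiness approval tag for parts.",
    "Digital threads keep a digital twin current across the lifecycle.",
    "RAG grounds LLM answers in retrieved, up-to-date enterprise content.",
]

def retrieve(query, docs, top_k=2):
    """Rank documents by naive word overlap with the query."""
    q_words = set(query.lower().split())
    scored = sorted(docs,
                    key=lambda d: len(q_words & set(d.lower().split())),
                    reverse=True)
    return scored[:top_k]

def build_prompt(query, docs):
    """Assemble the grounded prompt that would be sent to the model."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("How does a digital thread keep a digital twin current?",
                   documents))
```

Because the model answers only from the retrieved context, stale or missing enterprise content shows up as a retrieval problem rather than a hallucination, which is why RAG pairs so naturally with enterprise search.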

With AI discussions prominent in boardrooms and on Wall Street, we can expect that forthcoming PLM solutions and every industrial product and service will likely have some form of embedded AI. Some recent announcements:

  • A ChatGPT-based “Copilot” key has been added to the keyboards of two new Microsoft Surface laptops, between the keyboard’s arrows and the Alt key. Copilots are an advance over digital assistants in that they are built on LLMs to automate tasks, not just assist with them.
  • Google’s LLM generative AI tool Gemini may be added to Apple iPhones later this year; Gemini is already in Android phones made by Google and others.
  • To help build its version of the industrial metaverse, Siemens Digital Industries Software and NVIDIA Corp., Santa Clara, Calif., are collaborating to add immersive visualization to the Siemens Xcelerator and its Teamcenter X PLM platforms; the goal is to use AI-driven digital twin technology to enhance the visualization of complex data, with “ultra-intuitive” photorealism. NVIDIA has similar links to Ansys, Lenovo, and Rockwell Automation.
  • To improve decisions based on PLM data and to boost supply chain productivity, data analysis, and sustainability workflows, more than 50 generative AI features have been added to the Oracle Fusion Cloud.
  • Predictive maintenance aims to reduce equipment failures by continuously monitoring condition and performance via text, video, imaging, etc., using ML, DL, and access to the IIoT.


The future of AI and PLM

As the preceding examples show, opportunities unlocked by AI-enabled solutions will be transformative and even revolutionary. Bringing PLM and AI together will empower human creativity with enhanced abilities to turn ideas into realities. The resulting new products, services, and business models will ensure the entire enterprise’s long-term sustainability. 

CIMdata foresees that AI in digital transformation will further strengthen the PLM innovation engine and extend its use, widening the use of AI and broadening demand for it. AI in PLM will prove invaluable in helping us to quickly discover “what we know,” i.e., what we have and can find in our data, along with revealing what we still need (or can’t find), comprehending the real value of our existing content, and finding new ways to generate what we need—thereby augmenting human intelligence in new and improved ways.

If businesses now run on data, what has changed?

Decisions are now based on examining data and poring through what appears to be endless streams of data, data that is assumed to be good. “Good” in this context means accurate and complete, which we all know data never is. Gone is the time-honored basing of decisions on direct observation of designs, production, sales, field service, and customer feedback. The few experienced and long-serving employees and managers still on the job are rapidly being displaced by data processed in many new (digital) ways and governed by poorly understood “standards.” 

In other words, running businesses based on “experience” and direct observation is being augmented with powerful new AI-enabled tools. Amid the Fourth Industrial Revolution, digital transformation, and so many retirements, business decisions are now based almost entirely on data. The overriding concern is whether AI can improve data quality before it is too late.

In an April 2024 Scientific American article titled A Truly Intelligent Machine, author George Musser explains why AI is so compelling: “We now realize that tasks we find easy, such as visual recognition, are computationally demanding, and the things we (humans) find hard, such as math and chess, are really the easy ones.”

The promises and benefits of bringing AI-enabled augmented intelligence into PLM are so huge that the consequences of not doing so are intimidating. But, incorporating Large Language Models into PLM is still in the initial stages while, at the same time, many changes are coming in AI. The PLM transformation will likely be long and bumpy. 

Answering 3 top PLM questions
https://www.engineering.com/answering-3-top-plm-questions/
Fri, 22 Mar 2024

Going back to basics to help users understand PLM.

The post Answering 3 top PLM questions appeared first on Engineering.com.

In this PLM article, Peter Bilello, president and CEO of CIMdata, draws inspiration from Answerthepublic.com, a marketing-focused platform that helps people find common user questions entered into search engines.



I will answer three basic questions from the platform answerthepublic.com:

  1. What does PLM actually mean?
  2. How do I know if I need PLM?  
  3. What should I ask to understand if my company needs PLM?

Two of the main reasons PLM implementations fail are a lack of awareness of what PLM is and misunderstanding of what it does. My hope with this article is to address both head-on.

For context, let’s first explore CIMdata’s formal definition of PLM. It is “a strategic business approach that applies a consistent set of business solutions in support of the collaborative creation, management, dissemination and use of product definition information across the extended enterprise and spanning from product concept to end of life — integrating people, processes, business systems and information.”

Let’s then expand on this concept with insight from my recent webinar, “AI & PLM: Beyond All the Hype.” In the webinar, I stressed that PLM is a company’s innovation engine orchestrating the creation, maintenance and reuse of product-related digital assets.

For software developers and service providers, these definitions sum up PLM pretty well. But as a PLM consultant, researcher and educator, I realize that users need better insights — if only to help justify a purchase. By leveraging Answerthepublic.com we can see gaps in understanding and present our insights from a crucial, yet almost always overlooked, viewpoint: the user.

Q1: What does PLM actually mean?

To help contextualize PLM, I find it useful to spell out five key lifecycle elements of PLM and three sources of the connectivity it must provide. The five elements are: 

  1. Customer insight: Figuring out what customers want and need by monitoring products in use, assessing customer acceptance or annoyance and measuring the utility of new features by remotely monitoring their use.
  2. Concepts and detailed design: Robust architectural concepts and detailed design based on user feedback joined with design considerations, manufacturing capabilities, maintenance and repair feedback.
  3. Testing: Monitoring the performance of prototype systems, bench-testing subsystems and optimizing performance and management via proving grounds and testing labs.
  4. Visibility in manufacturing: Readily available feedback on manufacturing defects, supplier quality control, supply chain logistics and OEM or supplier operations.
  5. After-sales service via feedback loops: Field defects, success and failure in diagnostics and repair, product health and scheduling of maintenance. This includes the use of new products and specific features.

Users will recognize these five elements as the major sources of information that must be fed into PLM’s digital twins — the virtual representations of the enterprise’s products, systems, assets and associated process definitions. Meanwhile, the three sources of data, information, knowledge and intelligence that feed into these five PLM elements include:

  1. Applications that help develop insights, root-cause analyses, planning decisions, their execution and control strategies.
  2. Information for use with data storage, search/find, analysis and development of new applications.
  3. Data from edge computations. This includes sensors, devices and on-board data-processing and IT systems.

Users will recognize this trio as the information fed into their digital twins, gathered by PLM’s digital threads. These threads represent the myriad links up and down a product’s lifecycle and the data pipeline that enables connectivity and information gathering.
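The relationship between the five lifecycle elements, the three data sources and a digital twin can be sketched in code. This is a minimal, purely illustrative model — every class and field name here is hypothetical, not any vendor’s API:

```python
# Hypothetical sketch: digital threads carry records from each lifecycle
# element and data source into a product's digital twin. All names are
# illustrative, not drawn from any PLM product.
from dataclasses import dataclass, field
from collections import defaultdict

# The five lifecycle elements described above.
LIFECYCLE_ELEMENTS = {
    "customer_insight", "design", "testing", "manufacturing", "service",
}

@dataclass
class ThreadRecord:
    element: str   # which of the five lifecycle elements produced it
    source: str    # "application", "information" or "edge" (the three sources)
    payload: dict  # the actual data, e.g. a sensor reading or test result

@dataclass
class DigitalTwin:
    product_id: str
    records: list = field(default_factory=list)

    def ingest(self, record: ThreadRecord) -> None:
        """A digital thread delivering one record into the twin."""
        if record.element not in LIFECYCLE_ELEMENTS:
            raise ValueError(f"unknown lifecycle element: {record.element}")
        self.records.append(record)

    def by_element(self) -> dict:
        """Group the gathered thread records by lifecycle element."""
        grouped = defaultdict(list)
        for r in self.records:
            grouped[r.element].append(r)
        return dict(grouped)

twin = DigitalTwin("pump-A7")
twin.ingest(ThreadRecord("service", "edge", {"vibration_mm_s": 4.2}))
twin.ingest(ThreadRecord("testing", "application", {"bench_test": "pass"}))
print(len(twin.by_element()["service"]))  # 1
```

The point of the sketch is the grouping: the twin aggregates whatever its threads deliver, and downstream users query it per lifecycle element.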

Q2: How do I know if I need PLM?

The characteristic strengths of PLM — the collaboration and innovation enabled by its processes and data management platforms — are needed when:

  • Lifecycles of products, systems and services must be tracked for reasons including effective field maintenance, resolving customer complaints and pinpointing innovations for upcoming products.
  • The solutions to join, track and collaborate innovation effectively are unavailable. PLM’s “rivals” extend the capabilities of their offerings by adding toolsets to address this, but they are not always well-integrated and often have limited functionality. In PLM, these capabilities are native.
  • Products, systems and services grow in complexity with user demands and the accelerating rates of market changes. This makes it essential to track the lessons learned from users in the field, design, development and manufacturing.
  • Product becomes a data-driven service via the ongoing transformation of conventional, physical products, systems and services. Many providers of factory equipment and other products now rely on a product-as-a-service (PaaS) business model: leasing their products to users while maintaining them and invoicing based on usage.

Q3: What should I ask to understand if my company needs PLM?

Some questions to keep in mind when considering PLM include:

  • Do your products and assets require multiple disciplines? No development application or toolset — electrical/electronic, mechanical or software development — can adequately track changes made by one discipline, unless all disciplines are appropriately integrated within a PLM solution provider’s product innovation platform. And if you have multiple platforms, it is time to rethink the need for PLM.
  • Are your product designs so complex that you need multiple computer-aided design (CAD) tools? Does using multiple CAD tools lead to siloed data, workflows and information? Those CAD tools may or may not communicate well, let alone enable the necessary collaboration, integration and tradeoff optimization needed to design today’s complex products. PLM, however, can address these needs.
  • Do you have varying relationships with suppliers? For instance, do some build to print, just sell to you, support production, service your finished products or share in your product development? The complexity here can disrupt the collaboration and integration needed to create the competitive products that sustain your organization. PLM can make sense of this complexity. 
  • Does your organization develop, manufacture and support products across multiple sites? Or, put another way, does your organization seek to design anywhere, build anywhere and support everywhere? This means product personnel are scattered geographically and probably have different backgrounds and skill sets. If so, these coworkers need to collaborate among themselves and integrate their work. PLM can help verify that this is done. 
  • Are your products configurable and are sales, build and usage options included in development? Here, complexities arise regarding what is and is not configurable and which options users are given. Configurability often grows in response to user demands, but regardless of its cause, single-purpose toolsets and overly focused/restrictive applications will always struggle with it, while PLM can shine.

Complexity is far from the only challenge experienced during a product’s lifecycle—from concept through life. CIMdata always asks its clients what they are doing, or plan to do, if:

  • Margins and selling prices are under competitive pressure.
  • Product launches are delayed.
  • Warranty claims and recalls are on the rise.

Each can have several causes. Sorting them out and addressing them requires effective collaboration and the integration tools found in PLM’s digital twins, digital threads and connectivity. These fundamental capabilities enable collaboration and integration throughout a product’s lifecycle. Their mutual dependencies address complexity challenges by continually building on each other. 

Running through this narrative are the themes of collaboration and innovation. These are processes PLM supports like no other technology can. Without PLM-enabled collaboration and innovation, the gathering and management of information, insights and inspiration will flounder. Only collaboration and innovation can ensure the long-term sustainability of an enterprise.

The post Answering 3 top PLM questions appeared first on Engineering.com.

What’s the difference between PLM, ERP, EAM and more? https://www.engineering.com/whats-the-difference-between-plm-erp-eam-and-more/ Fri, 16 Feb 2024 12:57:00 +0000 https://www.engineering.com/whats-the-difference-between-plm-erp-eam-and-more/ And why is PLM the best of the available options.

The post What’s the difference between PLM, ERP, EAM and more? appeared first on Engineering.com.

It is an irony of technology that the solutions, systems and platforms that seek to drive business benefits often describe themselves and their focus in strikingly similar terms. The descriptions and sales demonstrations presented to new prospects by many solution providers seem all-encompassing. In fact, without deep-dives into the nuances of their messages, solutions and associated capabilities, they become nearly indistinguishable—akin to clones.

A little PLM 101 is needed to compare it with its alternatives. (Image: Bigstock.)

The main source of this confusion is that the enterprise-level solution providers edge closer to each other’s capabilities with each release. In their pursuit to be competitive, their offerings have become so feature and function-rich that many look alike. Even so, they constantly add new capabilities in support of their core competencies and focused domains—which are dearly loved by the users they enable and empower.

This article looks at two enterprise-level solutions that are often confused—Enterprise Resource Planning (ERP) and Product Lifecycle Management (PLM). They stand out as the two most prominent digital solutions for managing data at the enterprise level. Proponents of both have been touting the benefits of their solutions for a few decades while rarely addressing each other, let alone lining them up side by side.

While I briefly discuss four other enterprise-level solutions — customer relationship management (CRM), enterprise asset management (EAM), building information modeling (BIM) and manufacturing execution systems (MES) — this article mainly compares ERP and PLM, or what I call the “big two.”

I will share what users of both ERP and PLM have told CIMdata. The hope is that their observations will speak for themselves. I will also explain why PLM is better from the standpoint of defining and managing the complete lifecycle, product data and associated lifecycle processes.

PLM, ERP and how they started to overlap each other

At first glance, ERP and PLM may not seem that different — but they are. Their differences emerge when we look at their roots and core functions. ERP mainly grew out of the need to manage finances and manufacturing (i.e., the management of physical assets), while PLM primarily grew out of the need to manage product development (i.e., the definition and management of product-related intellectual assets).

Each of these solutions emerged in the 1980s from the digitization and computerization required to manage an enterprise’s ever-evolving data and process needs. Both product and organizational complexities continue to drive the adoption of these and many more digital technologies.

Over the ensuing decades, these solutions steadily evolved, growing in features and functions as they were implemented at ever higher levels of the enterprise. Each solution matured to enable key elements of Information Technology (IT), Engineering Technology (ET) and Operations Technology (OT).

Growth-driven solution providers of ERP and PLM inevitably began to see their peers as rivals as the boundaries between the front end of the product lifecycle and its production side blurred and overlapped. Both solution camps also support similar data constructs and processes (e.g., parts, BOMs as well as engineering change and release management). Soon, new features and functions began to mimic each other, giving rise to today’s confusion that surrounds digital realms at the top of every enterprise.
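The shared data constructs mentioned above — parts, BOMs, and engineering change and release management — are simple enough to sketch. The following is a hedged illustration of the common ground both camps manage, with all names hypothetical:

```python
# Illustrative sketch (all names hypothetical): the data constructs both
# ERP and PLM support -- parts, BOMs and a simple release state used by
# engineering change and release management.
from dataclasses import dataclass, field

@dataclass
class Part:
    number: str
    description: str
    state: str = "in-work"  # change/release management state

    def release(self) -> None:
        """Promote the part through release management."""
        self.state = "released"

@dataclass
class BOMLine:
    part: Part
    quantity: int

@dataclass
class BOM:
    parent: Part
    lines: list = field(default_factory=list)

    def all_released(self) -> bool:
        """A release gate: the assembly ships only when every line is released."""
        return all(line.part.state == "released" for line in self.lines)

housing = Part("P-100", "pump housing")
seal = Part("P-200", "shaft seal")
bom = BOM(Part("A-001", "pump assembly"), [BOMLine(housing, 1), BOMLine(seal, 2)])
housing.release()
print(bom.all_released())  # False: the seal is still in-work
seal.release()
print(bom.all_released())  # True
```

In practice, ERP and PLM each hold their own view of such a structure — ERP for costing and purchasing the physical parts, PLM for defining and releasing them — which is exactly where the overlap and confusion begin.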

Back to the roots of enterprise solutions

Different enterprise-level solutions begin, sensibly enough, with different digital roots. It is the development directions taken that led to their similarities. Here is how many different enterprise tools had their start.

  • ERP came to life as toolsets designed to track and manage finances, inventories and generate BOMs for purchasing and production. An offshoot of MRP, or materials requirements planning (or “processing”), ERP steadily expanded digital record keeping into “running the business.” ERP began as a way to keep key managers aware of everything impacting the bottom line; forecasting trends and automating these functions have progressed steadily ever since. The primary focus of ERP is on the management of an organization’s physical products (i.e., a physical asset-centric approach).
  • PLM is built on the basic connectivity of transforming data and ideas into information within product development (i.e., an intellectual asset-centric approach), which then (literally) drove the tools and systems of production and service. As this connectivity expanded, the need for structures and repositories became apparent, giving birth to PLM’s support of digital threads and digital twins. These elements, among others, distinguish PLM from its predecessor, product data management (PDM).
  • CRM originated as a toolset for tracking customer contacts through various channels, including websites, mailings and telephone calls. It evolved with the advent of social media to manage all external customer interactions to grow a base of repeat customers with better assessments of their needs and faster responses. Some CRM solutions even have PLM solutions built on top of their platforms.
  • EAM evolved from computerized maintenance management systems (CMMS) into a lifecycle management approach for monitoring and supporting the well-being and performance of maintained assets from acquisition/commissioning to the end of their productive life (i.e., in-service asset-centric). While more narrowly focused, a CMMS centralizes, upgrades and automates maintenance management information. Today, many PLM and ERP solutions provide EAM support.
  • BIM emerged from the architecture, engineering and construction (AEC) branch of CAD, and focuses on building and maintaining infrastructures (e.g., buildings, rail and utility networks). In the private sector, BIM is universally used for large and small buildings. In the public sector, BIM addresses the design and management of data associated with complex projects and assets such as roads, bridges, utility plants and distribution networks, pipelines and dams for federal, state and local governments. Users also rely on BIM to manage complex relationships between owners/operators, contractors and project completion/hand-over. BIM can be thought of as a domain-specific PLM solution.
  • MES was developed to monitor and document real-time operations, turning raw materials into physical components and finished goods. MES tells decision-makers what is and is not happening in production, highlighting needed improvements and helping optimize production. MES is often used for intermediate data management between factory floor supervisory control and data acquisition (SCADA) systems and ERP.

As CRM, EAM, BIM and MES matured, solution providers added connectivity for frequently accessed third-party toolsets and platforms, starting with design and analysis basics as well as simplified access to the Internet of Things (IoT). Along with the automation of everything and anything, toolsets (integrated or third-party) are now used for simulation and analysis along with computer-aided process planning and work instructions. Examples of these simulation, analysis and planning tools include metrology, testing and inspection; real-time video with augmented reality and virtual reality (AR/VR); data governance and configuration management; machine learning, predictive analytics, artificial intelligence (AI) and more.

To some degree, all are integrated with or are components of a Model-Based Enterprise (MBE) approach. Prompted by PLM’s success, many are developing rudimentary digital twins and digital threads, or are positioning themselves so that they enable and/or support these digital constructs.

What remains unchanged is the fundamental distinction between ERP and PLM users. ERP is the primary enterprise toolset for people with business or purchasing backgrounds focused on profitably “running” the business, emphasizing the management of the physical product. PLM is primarily used by engineers and others with technical backgrounds who develop and support the products, systems and assets that contribute to an enterprise’s competitiveness, emphasizing the virtual product.

Core functions and expansions of today’s enterprise solutions

With the incorporation of the additions and advances mentioned above, and thanks to its end-to-end bidirectional lifecycle connectivity, PLM has matured into an enterprise-spanning platform. The other five solutions, while providing value, lack PLM’s intellectual asset reach and robustness. Also, it is unlikely they will keep their (more limited) effectiveness as PLM solution providers invest in “immersive engineering” and shift their support of products and services into the support of an evolving digital “metaverse.”

PLM is rapidly being implemented across diverse industries—food and beverage, retailing, fashion, banking, insurance, packaging and distribution, media, transportation, pharmaceuticals, healthcare services and more. While many of these industries generate physical products, those products are overshadowed by information gathering, simulation, analysis, compliance verification and feedback, much of which cannot be handled by other enterprise platforms. Mostly, this is accomplished through modifications and add-ons rather than (usually troublesome) customizations.

ERP has also matured to span the enterprise and is now a platform to monitor, in near real-time, every action, change and decision that impacts the bottom line. It links finance, back-office functions, sales and purchasing and inventory management with sales, marketing, distribution, human resources, factory operations and suppliers. Even if they are not directly managing them, ERP systems facilitate E-commerce, security, privacy, risk management, remote work, energy consumption, environmental sustainability and various Industry 4.0 initiatives.

Interestingly, the terms “collaboration” and “communication,” ubiquitous in PLM, remain scarce in ERP self-descriptions. Presumably this is because ERP remains true to its roots in tracking costs and quantities, leaving product development functions and related intellectual asset creation and management capabilities in the hands of others.

EAM has steadily evolved from preventive to predictive approaches for asset tracking and management. These can cover all phases of maintenance, including scheduling, planning, work sequencing/management, mobility, analytics, health and safety and even supply chains. While EAM is used in many manufacturing and service organizations, it is more often found in energy production and distribution, oil, gas, nuclear power, utilities, mining and chemical production. For asset-management businesses, EAM can serve as both PLM and ERP. As noted previously, PLM solutions are successfully eating away at this enterprise solution domain in many organizations.

CRM aggregates customer information to access current and prior contacts (sales and service), purchase history and updates and performance tracking. As with other enterprise solutions, CRM automates the sharing of customer interactions so business units and teams can work together smoothly. Leading CRM solution providers have begun integrating AI into their offerings. CRM has matured enough to be used as both PLM and ERP in companies intensely focused on sales and customer relations, such as dealerships.

BIM has also matured somewhat into both PLM and ERP for buildings, infrastructure and floorplans; monitoring on-the-job equipment; the transfer of work from job sites to factories and even the weather. BIM users can scan and find virtually anything using drones and innovative imaging technologies such as Lidar (light detection and ranging). As for connectivity and access, BIM users say it’s good now, but they never have enough of either to deal with local, county and state building codes and environmental regulations. In the public sector and civil engineering, BIM users monitor and manage projects as well as changes in specifications, budgets and completion dates.

MES has reached a level of maturity where it monitors and controls production inputs, personnel, machine uptime and downtime, support services and scheduling while tracking their effectiveness. The track-and-trace capabilities of MES create “as-built” records of products, and the production systems covered include additive manufacturing (3D printing), material handling, robotics, automated assembly and new processes for advanced materials. This is particularly valued in regulated industries that require process documentation, problem tracking and corrective action. Many job shops and even some large machining companies use MES to “run the business,” not just manage production operations and outcomes.

How PLM and ERP remain different

The time has come to shift our focus from technology to people. ERP and PLM powerfully support and enable the information needs of their users, but the tasks and responsibilities, along with the required skillsets, differ significantly. The data and information underlying ERP and PLM constantly change, often abruptly and in unanticipated ways; both handle these changes well but accommodate them differently.

ERP’s dollars-and-cents focus demands skills in finance and accounting. Many of its users also have purchasing backgrounds. These users have ongoing needs for a narrow range of data and mostly repetitive information. These needs are logical given that ERP users focus on the here and now to track dollar-denominated costs, forecast profits and project them into the near future.

PLM’s emphasis on modeling, analysis and communications demands in-depth skills for conceptualizing and developing viable new products, systems and assets—guiding them through production and into users’ hands, all while maintaining them. PLM users work with much more diverse information sources and types and have far greater needs for graphics than ERP users, whose tools and tasks are predominantly numbers-based.

Compared to ERP users, PLM users interact daily with a wide range of design tools, simulations, analysis systems, information types and formats. In contrast to ERP environments, PLM information and decisions are shared across the extended enterprise, augmented by continuous feedback. This has led to many PLM solutions being architected for integration and openness—realizing that they provide a data and process management platform that supports the easy integration of third-party and enterprise-specific applications configured and changed over time.

Tasked with keeping costs within budgets, ERP users primarily focus on the enterprise’s biggest cost drivers—purchasing, production and sales. Managing design and development, with its countless detailed component descriptions, work instructions and inspection criteria for every production phase, is much easier in PLM. So is accommodating the MBE with its Model-Based Design and Model-Based Systems Engineering elements that align with the organization’s overall digital aspirations.

Users leverage PLM’s collaboration and information access across large spans of the enterprise while fostering the digital transformation of the organization’s intellectual assets and the processes that create, manage and use them. Compared with ERP users, PLM users need far more detailed representations of intermediate stages, analyses and decisions as products evolve from ideation through engineering, production, sales and service.

Can innovative, competitive new products be generated in ERP? I must admit, I have never heard of any, though undoubtedly some have been. ERP’s top-down approach is essential for management control over as much of the enterprise as possible. This inherently makes it backward-looking, with a focus on identifying and evaluating past results.

The PLM advantage is that its focus is on innovation, intellectual asset creation and management. This fundamentally forward-looking framework makes PLM non-hierarchical, collaborative, structured and effective.

Again, I have never heard of engineers innovating and collaborating using ERP. Why not? Because ERP tends to be a top-down approach that fosters management “control.” PLM’s strength is that it is inherently more collaborative and less hierarchical, which is why engineering and product development will always prefer it. Fundamentally, ERP focuses on an organization’s physical assets, whereas PLM focuses on its intellectual assets. As a result, a company needs to digitally enable its business with the appropriate solution for the task at hand, not try to utilize a solution that wasn’t designed for the purpose being addressed. Doing so should be deemed a non-starter for all.

Further reading

This article is the culmination of my latest series of PLM-focused articles developed for Engineering.com.

Supply chain resilience through PLM integration https://www.engineering.com/supply-chain-resilience-through-plm-integration/ Fri, 05 Jan 2024 09:23:00 +0000 https://www.engineering.com/supply-chain-resilience-through-plm-integration/ How to become supply chain resilient and what it will look like.

The post Supply chain resilience through PLM integration appeared first on Engineering.com.

Given the steady increase in disruptions hitting our enterprises and their rising costs, it is time for us to recognize that our digital solutions often pile capabilities on top of one another without considering the ability of our organizations and enterprises to combat and recover from disruptions.

Disruptions are a daily part of life—often serving as the catalyst for innovation or rapid unplanned change. In business, disruptions range from malicious hacks and sudden shortages to logistics problems and climate change. When taken in aggregate, they are as constant and varied as the surprises in life itself.

(Image: Bigstock).

As our understanding of disruptions grows, it becomes evident that resiliency—the ability to overcome and even benefit from disruptions—is a critical strength for enterprises. Until recently, this was not always appreciated. However, this view is now shared by Washington, where the inaugural meeting of the White House Council on Supply Chain Resilience took place in late November 2023.

This council, which grew out of the 2021 Supply Chain Disruptions Task Force, announced 30 new actions spanning 19 federal agencies, including Commerce, Transportation, Defense and Homeland Security.

The idea of using software-enabled business solutions to address anything so universally entrenched and unpredictable as disruption may, at first, seem ludicrous. But even some of the largest problems can be remedied if broken into manageable pieces. This perspective is equally applicable to disruptions in supply chains, prompting my focus on the increasingly urgent need to integrate supplier data into enterprise product lifecycle management (PLM) software and strategies.

What is Resiliency and Who Can Tackle It?

Resiliency is the set of risk-tolerant capabilities needed to recover from disruptions, or even to seize the unexpected opportunities they create. However, this definition only raises more questions:

  • Can resiliency be measured?
  • How much does it cost?
  • What resources, and how much of them, should be allocated to build resiliency?
  • When do we have enough resiliency to foresee and recover from obvious disruptions?
  • Do we even know what ‘enough’ means?
  • What about the disruptions we don’t foresee?
  • How do we know if we have recovered from a disruption: how would that be measured?

Remember that disruptions tend to spread rapidly in lean, tightly integrated companies so answering these questions now, when things are stable, is imperative.

Building an organization’s resiliency is a daunting challenge and requires a proactive approach, so the sound and sensible first step is to set up a resiliency task force. I urge the task force to focus initially on the supply chain, where disruptions are worsening.

Supply chain disruptions range from the obvious, such as losing a key supplier to a natural disaster, to the obscure. As we have seen with microchips, lithium ores and so much else, disruptions underscore the imperative of embedding resiliency into the entire enterprise.

Some sources of disruption include:

  • Hacking in all its malicious forms.
  • Sudden unavailability of key parts, halting production or forcing redesigns.
  • Abrupt changes in customer demand and financial health.
  • Unexpected sharp price hikes and inflation.
  • Transportation problems with ships, trucks and freight trains.
  • Just-in-time (JIT) inventory management that prioritizes minimal inventories and costs.
  • Overcorrecting to prevent shortages.
  • Bad priorities like an overwhelming focus on cost savings.
  • Climate change and geography, which affect flooding, tornadoes, snowstorms, droughts and more.

Can an organization really foresee disruptions? Yes, for the most part, by asking classic journalistic questions, like:

  • Who or what will trigger our next disruption and the ones after that?
  • Where will these disruptions come from?
  • When will they strike?
  • Why, as in what underlying events will generate disruptions?
  • How can the costs of disruption be calculated? And can recovery be assessed in some better way than the rear-view mirror?

The key to answering many of these questions is hidden within PLM solutions.

Supply Chain Resilience is an Enterprise-Wide Project

Given the frequency and extent of disruptions, JIT supply chains and least-cost inventory management must be reconsidered in the broader context of optimizing the end-to-end product lifecycle. In other words, supply chains, the bills of materials (BOMs) they propagate and their users can benefit from PLM’s digital twins, digital threads, end-to-end bidirectional lifecycle connectivity and big data analytics. Purchasing and procurement departments — that build and manage supply chains — can also benefit if they are integrated into an enterprise’s overall PLM environment.

This PLM integration will necessitate that supply chains are less tightly controlled by finance and enterprise resource planning (ERP) systems. This is beneficial as some traditional systems are too narrowly focused on transactions instead of the what-if analyses required to swiftly identify, resolve and minimize disruption.

In many cases disruptions are vague. In these instances, PLM’s digital twins will often have nothing physical to represent. As a result, their numerous digital threads won’t have any data to connect. Nevertheless, I advocate switching from a reactive to a proactive stance because some disruptions can be recognized in advance and thus prevented or mitigated. Carefully focused PLM strategies and their associated digital solutions can enable decisive action when early indicators arise.

Part of this proactive approach includes the realization that disruptions can strike anywhere, at any time, requiring resiliency to permeate the entire enterprise. Building this resiliency requires close collaboration between PLM managers, security, IT, top management, finance, and purchasing and procurement.

PLM’s Role in Addressing Disruptions and Building Resiliency

For years, we have been hearing about worsening supply chain difficulties and sudden shortages; these disruptions seem to be the new normal. On the brighter side, PLM-integrated supply chains can do a better job of supporting product development and all the other enterprise functions that rely on external suppliers. This will hopefully reduce the risk of sudden halts in the production of new and old products, systems or assets.

Integration with PLM not only enhances supplier collaboration but also provides a single source of truth for identifying and managing key suppliers. This helps the enterprise better understand and track supplier capabilities, quality, performance and risks (e.g., reliance on single-sourced parts, components and systems).
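One of the risk checks mentioned above — flagging reliance on single-sourced parts — is easy to illustrate once BOM and supplier data live in the same environment. This is a hedged sketch; the data shapes and names are hypothetical:

```python
# Hypothetical sketch of one supply chain risk check: flagging BOM items
# that rely on a single qualified supplier. The data shapes are illustrative.
def single_sourced(bom_suppliers: dict) -> list:
    """Return part numbers with fewer than two distinct qualified suppliers,
    sorted so the risk report is stable."""
    return sorted(part for part, suppliers in bom_suppliers.items()
                  if len(set(suppliers)) < 2)

bom_suppliers = {
    "P-100": ["Acme Castings", "Borealis Metalworks"],
    "P-200": ["SealCo"],                  # single-sourced: a resiliency risk
    "P-300": ["ChipWorks", "ChipWorks"],  # duplicates still count as one source
}
print(single_sourced(bom_suppliers))  # ['P-200', 'P-300']
```

Trivial as it looks, this check is only possible when supplier qualification data is connected to the BOM — which is the integration argument being made here.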

This integration emphasizes the need to bring supply-chain management into PLM environments. It is critical for ensuring resiliency as disruptions can strike any part of the organization, from human resources to the executive suite. And the more tightly integrated the organization, the greater the pain of disruptions. This is why purchasing, for example, must no longer be a support function, stand-alone department or other organizational stepchild. Everything must be connected.

Unmatched by any other strategic and holistic business approach, PLM and its enabling digital solutions provide users with enhanced visibility up and down supply chains and into their data. This includes purchase orders, BOMs, Bills of Information (BOIs) and more data that is generated outside the organization from suppliers, contractors and partners.

Extending PLM into the supply chain and the departments that handle purchasing and logistics will not magically confer resiliency or make disruptions disappear, but it will provide organizations with more resources to handle them. Security experts will need to identify likely sources of disruptions, other worrisome vectors, and how peers have recovered. IT experts then need to code these findings into the PLM environment for reference. Top management must ensure that staff and funds are available to address these disruptions and to understand that resiliency will never be “done.” 

Additional benefits of moving the supply chain into the PLM strategy and environment include facilitating easier access for all users and fostering better collaboration within and between engineering, production, services, partners and suppliers.

What a PLM-Enabled Supply Chain Will Look Like

PLM-enabled visibility helps the organization reduce risks and minimize disruptions, making it feasible to connect BOMs and specifications with the broader supply chain and the entire enterprise—from production, development and engineering to operations, sales, service and so on. This connection must include a product’s initial concept and ideation through to the end of its useful life.

This visibility, made possible by PLM, will dispel misconceptions that supply chains are quasi-independent and stand-alone or that purchasing is responsible only for inventory control and answers only to finance.

In addition, the overall enterprise and everyone in it gains visibility into managing and optimizing the product lifecycle. They also gain support to proactively assess and evaluate risk via alternate components, new suppliers and more. Users also gain decision-support as to how to act when disruptions occur.

As time passes, PLM extensions and additional implementations will play vital roles in building an enterprise’s supply chain resiliency and recovering from the disruptions everyone knows are ahead—it is just a matter of time. I cannot overstate the potential costs of not bringing the supply chain into PLM.

The post Supply chain resilience through PLM integration appeared first on Engineering.com.

The Role of Top Leadership Amid PLM and Data Management https://www.engineering.com/the-role-of-top-leadership-amid-plm-and-data-management/ Thu, 30 Nov 2023 13:46:00 +0000 https://www.engineering.com/the-role-of-top-leadership-amid-plm-and-data-management/ Five tips to stop the surge of industry data and information.

The post The Role of Top Leadership Amid PLM and Data Management appeared first on Engineering.com.

People at the top of their enterprise have a growing, if sometimes unrecognized, responsibility to ensure their business’ data is managed properly. In the past, this management of information was often left to business units and even individuals. As such, leaders may not be up to speed on what they need to know.

(Image: Bigstock.)


For instance, leaders—and many articles targeting their demographic—use the phrase “data and information” as if the two terms were synonymous. They are not, and the distinction is quite significant. And, yes, I have been guilty of this imprecision myself when I have not been careful.

The phrase “data and information” is misleadingly broad; the innocuous conjunction “and” may conceal more than it reveals. My view of data and information aligns with many in the data management world: data is a set of facts, and information adds context to those facts. This distinction means that data is unorganized, while information brings order to it. Placed in context, information builds a bigger picture of what the data means and what it is telling you.
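The distinction can be made concrete with a toy example. The sensor reading, field names and threshold here are invented for illustration; the point is only that context is what turns a fact into something actionable:

```python
# A bare fact with no context -- this is data.
reading = 87.4

# The same fact with its context attached -- this is information.
information = {
    "value": 87.4,
    "unit": "degC",
    "source": "pump-3 bearing sensor",
    "captured_at": "2023-11-30T13:46:00Z",
    "alarm_threshold": 90.0,
}

# Context answers: 87.4 of *what*, measured *where*, against *which* limit?
within_limits = information["value"] < information["alarm_threshold"]
print(within_limits)  # True
```

Strip away the unit, source and threshold and the record reverts to a bare number that no one can safely act on, which is the alternation between phases described below.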

Data and information can also be understood as alternating between these phases. All data has had a previous life as information for someone somewhere; once its context was forgotten, it reverted to data. If this does not underline the need to reinforce data management, I do not know what would.

Formats and processes change and evolve constantly, forcing data to become information in varied ways. These changes and evolutions gradually undermine the processes and practices we use to manage our data. With every passing day, they become less effective.

Usually, no one notices until a mishap comes to light. Perhaps a decision or analysis based on data slipped under management’s radar, making it unreliable or a liability in disguise. Steering clear of potential disasters is a prime responsibility of top management. This article extends that responsibility to ensuring sustainable and effective management of data, as well as the role PLM can take in support of this critical enterprise task.

The Data Challenges Ahead for Top Management

Leaders at the top of their organizations must deal with some unsettling realities, including:

  • The numerous operating systems that choke task- and process-oriented data repositories.
  • The various formats, task-linked processes and apps that convert data into information that is unusable by other tools.
  • The amount of information that can become redundant, obsolete, or trivial.
  • The large percentage of data that is stored but never looked at or processed.
  • The vast majority of business information that is unstructured and difficult to search, and therefore difficult to leverage.
  • The explosion of data and information, with zettabytes stored in the Cloud.
  • Artificial intelligence (AI) adding to the explosion in enterprise data.

These facts warn us that just because data is in a repository (on-premises or in the Cloud) does not mean data is under effective management. The fact that effective data management is slipping away is a compelling case for the responsibility of enterprise leadership to see to the reinforcement of sound data-handling practices.

A Digital Tool Can’t Solve Everything, Including Data Management

The reinforcement of data management extends beyond digital tools. Effective data access, use and reuse is a management responsibility, requiring solution providers, consultants and internal IT management to be held accountable for users’ ongoing concerns. This means developing a hard-nosed focus on failures to address recurring data problems and identify, improve and/or remove any software or process shortfalls—regardless of the guilty party.

It also means recognizing that information management, for easy collaboration and comprehension, underlies everything the enterprise accomplishes. So that nothing will be overlooked, I recommend that reinforcement take a structured approach by applying CIMdata’s “five V’s” to address data surges in enterprise data repositories. They are:

  1. Volume of data should be addressed by reducing or constraining the highest inflows from engineering, production systems, smart devices and AI.
  2. Variety of data can be sharply reduced by focusing on information that is actually needed, instead of what might someday be needed.
  3. Velocity, defined as the rate at which data accumulates, can be slowed by blocking anything too peripheral to keep—like noisy data.
  4. Veracity reinforces data management by continually re-establishing the value, accuracy and completeness of all inbound data.
  5. Verification creates mandatory periodic check-ups on the preceding four V’s.

If these five V’s sound like data governance, that is no coincidence: data governance is the organization and implementation of policies, procedures, structures, roles and responsibilities that outline and enforce rules of engagement, decision rights and accountabilities.
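As one concrete illustration, a periodic repository check-up applying the five V’s might be sketched as follows. This is a hypothetical sketch, not any governance product’s API; the record structure, source names and thresholds are all invented:

```python
def five_v_checkup(records, allowed_sources, max_daily_inflow):
    """Hypothetical check-up applying the five V's to one data repository.

    Each record is a dict with 'source', 'payload' and 'verified' keys.
    """
    volume = len(records)                                         # 1. Volume
    variety = {r["source"] for r in records}                      # 2. Variety
    velocity_ok = volume <= max_daily_inflow                      # 3. Velocity
    veracity_ok = all(r["payload"] is not None for r in records)  # 4. Veracity
    verified = all(r["verified"] for r in records)                # 5. Verification
    return {
        "volume": volume,
        "unexpected_sources": sorted(variety - allowed_sources),
        "velocity_ok": velocity_ok,
        "veracity_ok": veracity_ok,
        "verified": verified,
    }

records = [
    {"source": "engineering", "payload": {"part": "P-100"}, "verified": True},
    {"source": "iot-gateway", "payload": None, "verified": False},
]
report = five_v_checkup(records, allowed_sources={"engineering"},
                        max_daily_inflow=1000)
print(report["unexpected_sources"])  # ['iot-gateway']
```

The value of such a check-up is not the code but the discipline: it forces each inflow to be named, bounded and periodically re-verified rather than silently accumulated.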

The Role of PLM and Change Management in Data Management

For reinforcing effective management of data, an end-to-end PLM approach is unmatched. Its connections to digital twins, end-to-end connectivity and digital threads point to strategies and tactics top management can use as they prod and demand the enterprise’s many business units to establish effective data management.

Reinforcing data management, with or without enabling an appropriate PLM approach, presupposes a determined commitment at the executive level to remedy past abuses, as well as current shortcomings in the everyday collaborations that require data to be readily accessible, handled, used and modified for others to use.

Other top-of-enterprise solutions fall short compared to well-implemented PLM environments in significant ways. Enterprise resource planning (ERP) focuses on bills of materials (BOMs) and the costs and revenues impacting ROI. Manufacturing execution systems (MES) are at the core of well-run production operations but manage only that information. Product data management (PDM) lives on as PLM’s predecessor in small- and medium-sized organizations. Customer relationship management (CRM) focuses on pinpointing customer needs.

Effective data governance means overseeing the implementations of all new solutions to ensure that capabilities work as promised. It is at the heart of reinforcing sound management of all forms of data. It is so important that CIMdata added a data governance practice to its strategic consulting offerings more than five years ago.

A key tool for reinforcing the effective management of data is configuration management (CM). Usually implemented within data governance, CM ensures that an enterprise’s products, processes, facilities, services, networks, assets and IT systems are what they are intended to be and properly optimized for their intended use.

CM tracks all forms of data and clarifies changes and modifications to it, even if unintended or undetected, throughout the product lifecycle. A comprehensive CM approach also pinpoints and fixes problems, avoiding surprises such as hidden errors and unexpected outages.
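The change-tracking idea behind CM can be sketched minimally as follows. The class, field names and revision scheme are hypothetical and do not represent any particular CM tool; the point is only that every change is recorded with its authorization status, so unauthorized or undetected modifications surface in an audit instead of hiding:

```python
from datetime import datetime, timezone

class ConfigItem:
    """Hypothetical configuration-managed item with a full change history."""

    def __init__(self, name, revision="A"):
        self.name = name
        self.revision = revision
        self.history = []  # every change, authorized or not, is recorded

    def change(self, new_revision, author, authorized):
        self.history.append({
            "from": self.revision,
            "to": new_revision,
            "author": author,
            "authorized": authorized,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        self.revision = new_revision

    def unauthorized_changes(self):
        return [h for h in self.history if not h["authorized"]]

item = ConfigItem("pump-assembly")
item.change("B", "j.doe", authorized=True)
item.change("C", "unknown-script", authorized=False)  # caught in audit
print(len(item.unauthorized_changes()))  # 1
```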

Effective data management calls for tools to have continuous access to data, regardless of changes and formats, with proper security and syntax extensions to untangle conflicts, omissions and errors, as well as formats that have become obsolete.

The Role of Top Management in Data Management

Establishing and reinforcing effective information management is not specific to information technology, engineering technology or even operational technology. It is instead tied to the executive suite’s concerns about what the enterprise can achieve, at what cost, with what products and services, and at what risk. This is why I insist that reinforcement strategies and tactics should parallel and enable business models.

With all current data challenges, only naive managers would think that their data management tools, systems and policies—even the latest and greatest—will continue to function adequately. This is why top management’s reinforcement role is primarily about people. New policies, procedures and plans must be thought through, implemented and monitored.

Most important, of course, is getting key people to see their responsibilities in new ways. Everything digital in the enterprise is already being impacted. Top management should plan accordingly and brace subordinates for unexpected and even unforeseeable developments.

To sum up, reinforcing the sound management of data is primarily about how people perceive and do their jobs, rather than the digital tools and systems they use. Implementing these reinforcement practices puts a premium on persuasion, which should be a key skill of any high-level manager.


PLM, Your Enterprise and Three Stumbling Blocks https://www.engineering.com/plm-your-enterprise-and-three-stumbling-blocks/ Tue, 24 Oct 2023 15:11:00 +0000 https://www.engineering.com/plm-your-enterprise-and-three-stumbling-blocks/ Enterprise-level lifecycle management will bring IT, OT and ET together. But it won't be easy.

The post PLM, Your Enterprise and Three Stumbling Blocks appeared first on Engineering.com.

Product Lifecycle Management (PLM) is one of the most holistic business strategies ever defined. It has evolved enough robustness, bandwidth, and information and process management capability to be a viable choice for bringing together information technologies (IT), operational technologies (OT) and engineering technologies (ET)—even at the highest digital levels of your enterprise.

What will an enterprise-level lifecycle management environment look like? (Image: Bigstock.)


Creating this enterprise-level lifecycle management environment is an eminently worthwhile goal—but it can be elusive. Many well-staffed and well-funded efforts focused on bringing IT, OT and ET together have had only limited success. There are three big stumbling blocks—organizational structures, standards and technological silos—and they are the focus of this article.

What Your Enterprise-Level Lifecycle Management Environment Might Look Like

To CIMdata’s knowledge, a fully enabled, enterprise-level lifecycle management environment hasn’t been created yet because of three major stumbling blocks: organizational structures, standards and technological silos. If these were not in the way, PLM could reach new levels of integration and optimization and be lifted beyond mainly supporting product development to serve the entire enterprise.

Consider if the connection were possible today. The corresponding gains in depth, breadth and processing speeds would enable users to access every data repository in the enterprise. That would inevitably lead to some form of access to the repositories of suppliers, partners and customers who make up the complete extended enterprise. Ultimately, PLM-enabling technologies would provide unconstrained access and valuable data via the cloud, the Internet of Things (IoT) and countless Big Data repositories.

Until now, PLM solutions have been primarily used and regarded as application and engineering toolsets. They can be used to generate complex and sophisticated digital or virtual representations of products and other physical and logical assets, such as production systems and service structures. If PLM could successfully embrace IT, OT and ET, however, product development would transform into true enterprise lifecycle development and optimization. It would be the ultimate end-to-end development and optimization of systems of systems. Everyone in your enterprise would be able to collaborate to fulfill the goals of leadership and business model requirements in near real-time. 

More effective collaboration and reliable information access would sharply reduce the time wasted reformatting and re-entering information that is traditionally moved between toolsets, systems and repositories. Tedious verifications and repetitive validations could also be pared back.

The Hurdles to Making Your Enterprise Lifecycle Management Environment Fully Functional

Organizational structures are often the main impediment blocking a company’s ability to optimize throughout the lifecycle and across its various functional groups. Mid-level managers often fail to appreciate the importance of similar functions and departments that interact at different points in the product lifecycle. This disconnected mindset can lead managers to block interfaces to their data that would make access and sharing straightforward. It also often results in suboptimization.

Standards pertaining to implementing PLM are incomplete, often in conflict and built on differing underlying assumptions to meet different needs. Because of PLM’s broad end-to-end reach, tallies of relevant standards run from a dozen to well into the hundreds. They have never been adequately reconciled, and CIMdata knows of no significant effort to do so.

Technological silos thwart information access, analysis, decision-making and collaboration everywhere in the enterprise. These disconnected and often obsolete data repositories are widely recognized as causing headaches, but they persist because exclusive knowledge and data are seen as forms of power. And every organization has people suspicious of outsiders wanting to use their data.

The Technical Challenges of Joining IT, ET and OT Into One Integrated Environment

From a technological point of view, joining IT to ET and OT may lead to installing or reinstalling PLM-enabling solutions in the cloud. That would shift responsibility for cybersecurity and physical security to the solution providers—which is a good thing.

Solution providers and their cloud security teams have far more expertise than all but a few experts in aerospace, defense and classified government agencies. With economies of scale on their side, solution providers and their cloud security teams are more tightly focused than any enterprise IT security team can ever be, and they can react faster.

However, combining IT, ET and OT into one integrated enterprise lifecycle management environment raises other technical issues at the highest levels of the enterprise.

Significant scalability and technology platform upgrades will be needed to accommodate dozens of non-engineering applications. Some of these applications may grow rapidly, have many more repositories and have surges in data transactions—millions of digital events per second in some cases. Such numbers are almost never seen in conventional PLM installations. File openings, closings and information modifications in engineering will seem comparatively infrequent.

Bringing IT, OT and ET together will inevitably lead to a jump in the number of discrete new systems and toolsets that need to be addressed, managed, analyzed and connected. While engineering uses dozens of applications and toolsets, far more are used in operations, production and elsewhere in the organization. This means data governance must be greatly scaled up, strengthened, extended and adapted.

Bringing IT, OT and ET together will also require new digital skills and raise questions about staffing a team to choose, analyze and implement such a radically new digital environment. IT and engineering teams will know what is in these digital stacks, but their perspectives and hands-on experience vary widely, and both will be needed for top-of-the-enterprise integration. Because IT is often part of finance, it may be able to secure funds more easily. Engineers, however, will have an edge in knowing how PLM’s capabilities operate.

Other Business Adaptations Created by an Enterprise-Level Lifecycle Management Environment

Regardless of how PLM may be understood and implemented in this ideal digital future, management of the product lifecycle must assume a new status at the top of the enterprise. The enterprise’s long-term viability—even its survival—can only be assured with a reliable flow of competitive and profitable new products, productions, support systems and physical assets.

In addition, the meanings and uses of PLM’s most familiar terms—digital twins, digital threads, end-to-end lifecycle connectivity—will be redefined. When they at last represent the enterprise or large portions of it, digital twins will be transformed. They will be pulled apart and reassembled in unexpected ways with virtual reality, data analytics and artificial intelligence. These will be crucial to the enterprise but, until now, have often been peripheral to individual product instances. Even now, however, many twins reach gigabyte sizes.

Digital threads will also multiply as they spread into IT and deeper into OT, expanding far beyond product development. At the IT-OT-ET level, webs and networks of threads will link and transmit new information types, formats, linkages, feedback loops, metadata and much else that is not yet foreseeable. Threads will be two to five times faster and have many times more capacity. New connections and feedback loops can be expected to grow into the thousands, and maybe even millions, per thread. More frequent access to the Internet of Things (IoT) will further boost these numbers.

Everything that IT-OT-ET convergence does to digital threads will be reflected in PLM’s end-to-end lifecycle connectivity and optimization. As they are reoriented away from physical things and toward people and issues, digital threads will be reconfigured to meet dramatically larger and very different demands.

For PLM solutions to become the medium of information exchange—and access and collaboration—across the enterprise as well as up and down its information flows and connections, PLM must no longer be seen as “something for engineering.” Changing this view requires a fundamental rethink by the solution providers of their technology approaches—and the ways in which these approaches are described and presented. 

How We Will One Day Enable an Integrated Enterprise-level Lifecycle Management Environment

Such are the foreseeable stumbling blocks of what I shall label, for the present anyway, Enterprise Lifecycle Management: systems of systems lifecycle optimization.

Stumbling blocks aside, product development can morph into enterprise development. Transformations of this magnitude have occurred and will continue to occur in data management, analytics, spreadsheets and even word processors. The use of PLM to bring IT, OT, ET and their users together is virtually assured as innovative digital capabilities always find their way into the enterprise limelight. When this happens, enterprises will be forever changed for the better.

Nevertheless, the many challenges to effectively bringing IT, OT and ET together must not be taken lightly. They will not magically disappear; if anything, these stumbling blocks will get worse as time passes. Only concerted and persistent initiatives by top management, including some organizational restructuring, can bring IT, OT and ET together to optimize an organization’s return on investment.


Business Models Fail to Understand PLM https://www.engineering.com/business-models-fail-to-understand-plm/ Thu, 17 Aug 2023 10:01:00 +0000 https://www.engineering.com/business-models-fail-to-understand-plm/ How to synchronize business models with PLM value propositions.

The post Business Models Fail to Understand PLM appeared first on Engineering.com.

Data, beginning with its quality, processes and systems, must be the first consideration in any enterprise analysis or decision—technical or otherwise. Managers at every level of an organization have likely seen the ramifications when data goes awry. It often goes unnoticed until harm is done to products, marketing, engineering, production, distribution or services. 

PLM isn’t a silver bullet for every business model. (Image: Bigstock.)


Business models can also take a hit based on data validity or the expectations on which they are based. This commands the attention of the executive suite, whose top-level managers are measured, in large part, by the soundness of their business models and the quality and speed of their associated execution.

The executive suite continually grapples with risks that come with implementing every new process-enabling technology, and allocating the necessary resources. These risks are the biggest for managers when they try to fathom a business solution’s details and specifics without first grasping the solution’s data and process enabling requirements, as well as its scope and possible impact on the business.

This brings us to product lifecycle management (PLM). As defined by CIMdata, PLM is a strategic business approach that spans the full lifecycle, from idea through life. As such, there is an increasingly urgent need for executive suites to align PLM’s value propositions with their business models. If that synchronization is missing, business models will not reflect marketplace realities and opportunities.

In this article, I address how PLM strategies and resulting implementations offer insights and remedies for these unsound business models.

‘In Sync’ is Far More Than ‘Linked’

Getting business models and the value propositions of PLM truly ‘in sync’ requires integration, not just ‘linking’ them. I find such linkages are rarely reliable. When integration is achieved at the enterprise level, the value propositions of PLM enhance and reinforce business models. This is because everyone in the organization can use all relevant data in the forms required, when required—even as business models evolve.

Being in sync means that top-level managers who formulate business models and those who manage data and processes within PLM solutions talk to each other in mutually intelligible ways. Getting in sync requires them to use the same verbiage, syntax, definitions, terminology and business jargon. Only in this way can they build a shared appreciation for each other. The reality is that they both have the same goal — maximizing the organization’s ability to deliver right-to-market each and every time. But what might differ is what they need from each other, which goes to the core of communication.

Getting in sync is vital because of the continuous evolution of the raw data, analyses and decisions that underlie every business model. These nonstop changes impact business models in unpredictable ways. The consequences of this continuous evolution make effective lifecycle process and data management an imperative for decisions and analyses. This is because data is an organization’s lifeblood — it is what runs the business and ensures business models succeed.

To achieve this, executive suites must begin with accepting PLM strategies, methodologies and enabling solutions as enterprise opportunities—not just something cobbled together in engineering. As a strategic business approach from concept through life, PLM is an enabler and driver of right-to-market, ensuring and maximizing the organization’s return on investment.

The Role of Data in PLM and Business Models

In many organizations, financial executives, for example, control data through bills of material (BOMs) which track the components and supplies a product or service requires and their costs. PLM solutions are commonly used to generate corresponding Bills of Information (BOIs) containing much more data than parts and their relationships.

Business models are strategic and need to evolve like everything else at the enterprise level. For example, revenue and profit projections (i.e., maximization of their return on investment) are the executive suite’s constructs of corporate aspirations. Business models zero in on viable markets with products and services to maximize profit. Business models also dictate resource allocation and pinpoint changing costs. 

Business models have another data role: determining the contexts in which data (e.g., revenue and profit numbers and other key performance indicators) and insights and results are shared with the shareholders, board of directors, business partners, distributors, bankers, union leadership and the trade media. In publicly traded companies, business models also establish expectations of shareholders, financial analysts and regulators. 

The dangers of incomplete or poorly defined business models are manifold and obvious.

The value propositions of PLM solutions rest on the reality that data in many forms (i.e., metadata, files, folders, algorithms, apps, images, videos and more) continually surges through the enterprise via three primary data structures:

  • Information Technology (IT): primarily data processing.
  • Engineering Technology (ET): especially product development data.
  • Operational Technology (OT): which is production, delivery and service data. 

Whether IT, ET or OT, data and tools may be implemented on the enterprise’s computers, in the Cloud (e.g., Software as a Service [SaaS]) or in some hybrid arrangement; where they run does not affect the validity of these observations.

PLM’s Importance to Business Models

IT, ET and OT constantly change, as do their data repositories, which may be scattered throughout the extended enterprise. Data is gathered, secured, managed and kept current in business models in countless ways—usually with periodic updates when changes in underlying data are noticed. 

To get past this, the executive suite needs an appreciation of PLM solutions, their capabilities and the benefits of the digital twins, digital threads (webs) and end-to-end connectivity they enable. Even a basic grasp of these will show how PLM tools help the enterprise achieve, enable and protect its data, and how easily data is mismanaged, mangled or lost.

The executive suite should insist that data managers at all levels incorporate these tools—and demand that business managers, analysts and decision-makers carefully watch and fine-tune their inputs to business models. 

Once the executive suite is comfortable with PLM and its fundamental value propositions, they should also insist that managers regularly verify that no business model is floundering for lack of timely updates.

I do not believe the executive suite needs a deep understanding of the inner workings of PLM solutions. Deeper dives should be left to project leaders, middle management and technical staff with the necessary skill sets, training and education.

Executive suites have many issues demanding their time and attention, but proven resources are available to get business models and value propositions in sync. To fix business models, there are a few worth noting:

  • Data Governance and Configuration Management ensures that all of the enterprise’s data and all its digital systems are what they are purported to be. For example, that every change has been properly authorized and that all changes are tracked.
  • The Theory of Constraints requires the identification and removal of bottlenecks leading to big gains that show up in all PLM processes, especially in those that support the end-to-end optimization of the product lifecycle. 

PLM Value Propositions

To see how this works, let’s look at the most cited PLM solution value propositions:

  • End-to-End Connectivity spans the enterprise and reaches deeply into both product lifecycle data repositories and their business-model integrations. Ideally, connectivity should start at product or service inception and reach at least the end of production (if not to the end of useful life, recycling and more). Ditto for the systems and applications that make up IT, ET and OT.
  • Digital twins (or virtual twins) represent, as fully as practical, a product or service (or even a process) and every change made to its initial definition in engineering, production, operations and in the hands of users. This includes warranty claims, returns and regulatory compliance. Digital twins embed each product or service in its business model, which provides context for decisions and use, and facilitates the sharing of any changes. 
  • Digital thread is a web of data that describes the product, service or process, and all the decisions made through the lifecycle. It ties each Digital Twin to its physical, real-world product, service or process and coordinates their changes. A reliably synchronized business model reflects the extent and capabilities of its digital threads and digital twins.
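How a digital twin and its thread relate can be sketched in a few lines. This is a deliberately minimal illustration with invented names, not any vendor’s data model: the twin carries the current definition, the serial number ties it to its physical counterpart, and the thread is the ordered record of every change made through the lifecycle:

```python
class DigitalTwin:
    """Hypothetical minimal digital twin tied to one physical serial number."""

    def __init__(self, serial, definition):
        self.serial = serial            # identifies the physical counterpart
        self.definition = dict(definition)
        self.thread = []                # digital thread: ordered change events

    def record(self, phase, field, value):
        """Log a lifecycle change in the thread, then apply it to the twin."""
        self.thread.append({"phase": phase, "field": field,
                            "old": self.definition.get(field), "new": value})
        self.definition[field] = value

twin = DigitalTwin("SN-0042", {"firmware": "1.0", "motor": "M-200"})
twin.record("production", "firmware", "1.1")
twin.record("in-service", "motor", "M-210")  # field replacement under warranty
print([e["phase"] for e in twin.thread])  # ['production', 'in-service']
```

Even this toy version shows why a synchronized business model depends on the thread: without the ordered change record, the twin’s current state cannot be explained or trusted.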

To these, we must add digital transformation, which is both a parallel initiative and a key element of an enterprise’s data management enablement. Digital transformation renders formatted data in every kind of data repository into the 1s and 0s of raw data. While not always in sync with business models or PLM solutions’ value propositions, with every passing day digital transformation frees more data from cumbersome and outdated formats. 

Why Synchronization is So Complicated

A big challenge to the synchronization of business models with PLM strategies and enabling solutions is what I call forcing factors. These are innovations that generate new data in new forms while impacting the relevance and quality of older data; entirely new business models emerge from forcing factors.

CIMdata’s list of forcing factors includes:

  • The growing electronics content and software in physical products.
  • New manufacturing techniques and support processes.
  • New materials that are lighter, stronger and greener.
  • Mass customization, or the squeezing of every new technological feature into every new product or service offering.
  • Shorter product lifecycles.
  • Nonstop innovation.
  • IIoT/IoT, which enables continuous manufacturing and marketplace feedback.

Another big challenge is the disruption to data and business models triggered by new technologies. CIMdata’s list of new technologies and related approaches that fit this criterion includes:

  • Generative design.
  • Additive manufacturing / 3D printing.
  • Artificial intelligence and machine learning.
  • Topology data analysis.
  • Predictive analytics.
  • Graph databases.
  • Agile software development.
  • Virtual/augmented reality.

No doubt, every enterprise has its own list, and I have seen dozens. My point is that the upheavals caused by forcing factors, new technologies and their related approaches should be understood by the executive suite, and that business leaders and decision-makers should be required to use them in updating and fine-tuning business models. All of this is critical considering that a company is in business to create and use data that is ultimately transformed into its products and/or services.

Predicting the Future with PLM

Amid the usual business disruptions too familiar to list here, PLM solutions can be invaluable in predicting the impacts of what can be called ‘business model shock.’

Business model shock is a sudden realization of trouble tied to unpredictable events such as a customer default, implosion of a key business partner, an internal organization upheaval, supply chain disruptions or a hostile takeover bid. To mitigate business model shocks, PLM strategies, methodologies, and enabling technologies must be supplemented by management vigilance.

This is why CIMdata believes that PLM should be regarded—and accepted—by executive leadership as the business model for data, the foundation upon which the business operates and ultimately maximizes its returns on investment. In turn, business models deepen and broaden the enterprise reach of PLM’s value propositions.

The post Business Models Fail to Understand PLM appeared first on Engineering.com.

]]>
Products as a Service: The Changing Views on Products https://www.engineering.com/products-as-a-service-the-changing-views-on-products/ Thu, 06 Jul 2023 09:58:00 +0000 https://www.engineering.com/products-as-a-service-the-changing-views-on-products/ How engineers, customers and the market view products does not merely change—it transforms, explodes and fragments.


]]>
Though eye-catching, the proclamation of the “changing views on products” is an oversimplification. How engineers, customers and the market view products does not merely change—it transforms, explodes and fragments. This is also true of their marketplaces—as a look into the products as a service (PaaS) business model reveals.

By trading a large one-time revenue boost for an on-going stream of per-use payments, the PaaS business model upends the sales of capital equipment used everywhere in industry. Boilers, turbines, compressors, robotics, bottling & packing equipment, assembly-and-test systems and more are increasingly leased rather than purchased. For many years, for example, airlines have tended to view their aircraft’s jet engines as “power by the hour,” transferring uptime responsibility back to the engines’ original equipment manufacturer (OEM) with no-fly, no-pay contracts.

In the PaaS business model, the OEM’s sale of equipment (or any big-ticket asset) is replaced by ongoing payments tied to usage—essentially productivity or what is produced. With enhanced connectivity, meters and sensors, the equipment’s use, uptime, output and downtime are tracked and invoiced accordingly. CIMdata clients remind us that PaaS data must be timely, accessible and easily understandable. Here is what that might look like.
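The metering-to-invoice flow described above can be sketched in a few lines. The reading fields, rates and `invoice_period` helper below are illustrative assumptions for a generic leased machine, not any vendor’s API:

```python
from dataclasses import dataclass

@dataclass
class MeterReading:
    """One telemetry sample from the leased equipment (illustrative fields)."""
    hours_running: float   # uptime in the sampling window
    units_produced: int    # output counted by an embedded sensor

def invoice_period(readings, rate_per_hour=12.0, rate_per_unit=0.05):
    """Turn raw sensor data into a per-use charge: no output, no revenue."""
    uptime = sum(r.hours_running for r in readings)
    output = sum(r.units_produced for r in readings)
    return round(uptime * rate_per_hour + output * rate_per_unit, 2)

# One day of readings: the OEM is paid only for actual use.
day = [MeterReading(8.0, 1200), MeterReading(0.0, 0)]  # second shift: machine down
print(invoice_period(day))  # 8*12 + 1200*0.05 = 156.0
```

The point of the sketch is the second reading: when the machine is down, the line contributes nothing to the invoice, which is exactly why durability moves to the core of the OEM’s business plan.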


(Image: Bigstock.)

How PaaS Changes Product Designs

PaaS profoundly affects the OEM’s views of its business opportunities, not just its products. Under PaaS, the OEM’s revenue is based almost entirely on equipment availability and output, placing durability and long-term service life at the core of OEM business plans. In essence, under PaaS, the OEM now owns the performance of its products; if the product stops functioning for any reason, the OEM’s revenues also stop. The consequences of PaaS for capital-equipment marketplaces, for example, are huge.

PaaS means that maximum equipment uptime, durability and service must be designed in—and fundamentally re-engineered—anywhere it is feasible in the lifecycle to address the challenges of long-term service. Among many things, PaaS dictates that equipment must be far more durable with robust connectivity, embedded electronics and sensors—along with software that can be updated remotely and parts that can be easily accessed for maintenance. A big PaaS consequence is that the OEM forgoes future income from field service and spare parts as the need for them is minimized with far more durable designs.  A familiar example is automobile warranties that have grown to 100,000 miles from less than 25,000 and to five years from two.  And who can remember changing spark plugs or installing a new muffler?

The market transformations, explosions and fragmentation that come with PaaS might appear to create more hurdles for product lifecycle professionals, but the opposite is true. By taking advantage of the hidden benefits these changing views uncover, more innovative and competitive products can be developed, manufactured, delivered and supported in the ever-changing business landscape. Although inevitably more complex, these products are released to users sooner, are supported more easily in the field and generate better profit margins.

How PLM Fits into PaaS Models

PaaS and capital goods aside, disruptions in the design, sale and service of everyday products are all around us. We must accept sweeping changes in our workloads and new processes to address these changes. For these reasons, and many others, product lifecycle management (PLM) environments are essential. Specifically, PLM offers:

  • Digital twins that have access to the product’s defining data, in their many formats, to make it all understandable.
  • Digital threads that connect digital twins to all relevant data and information nonstop. This includes modification control to keep the digital twin up to date with its physical counterpart.
  • End-to-end lifecycle connectivity that networks and spreads data anywhere in the enterprise. This includes webs that radiate out to the Internet of Things (IoT) and the uncounted repositories of Big Data, where everything digital eventually comes to rest.
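As a rough illustration of the modification control mentioned above, a digital thread can be modeled as an ordered log of change records that the twin replays so its state always matches the physical asset. This is a toy sketch under that assumption; real PLM platforms provide far richer mechanisms:

```python
class DigitalTwin:
    """Toy twin: its state is whatever the replayed change log says it is."""
    def __init__(self, product_id):
        self.product_id = product_id
        self.state = {}

    def replay(self, thread):
        """Apply each change record in order; the result mirrors the asset."""
        for change in thread:
            self.state.update(change["fields"])
        return self.state

# Digital thread: every modification to the physical pump, in order.
thread = [
    {"rev": 1, "fields": {"impeller": "steel", "firmware": "1.0"}},
    {"rev": 2, "fields": {"firmware": "1.1"}},   # remote software update
]
twin = DigitalTwin("pump-042")
print(twin.replay(thread))  # {'impeller': 'steel', 'firmware': '1.1'}
```

Because the log is append-only, any earlier configuration of the product can be recovered by replaying only a prefix of the thread, which is what makes the twin auditable as well as current.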

Also dramatically changing are the digital environments where products (along with systems and assets) are developed, refined, produced and supported. These changes restructure marketplaces and heighten customer expectations of future products.

My recent engineering.com article, “Why PLM-related Big Data Opportunities Greatly Exceed the Potential Headaches,” addressed ways product developers use PLM to take advantage of today’s massive inflows of data and information. This current article builds on that by showing how products, rather than data and information, change with different monetization strategies. Along with PaaS, these market transformations, explosions and fragmentations impact our product-lifecycle efforts and reshape our understanding of each product’s marketplace. (In the context of this article, “product” can also refer to factories, data processing, distribution networks, supply chains, software and other systems and assets.)

The days are long gone when products were represented with a half-dozen 2D line drawings—front, back, side, top, isometric and an exploded view. Even with dimensions, these views were never adequate, leading to endless annotations—typed, hand-written and occasionally inscrutable. Stapled to the drawing, they were easily mislaid.

This changed in the CAD revolution of the 1970s and ’80s. Pixels on computer screens replaced the draftsman’s pen-and-ink. In quick succession came 3D, solid models, speedier rates of change in product views, the end of Mylar drawings and PLM.

Transformation, as governed by PaaS, is the biggest factor in how products are put to new uses through added capabilities, mechanization, automation, embedded sensors and connectivity. Even bolt heads now have connectivity, which guards against over-torque and cross-threading. As a product attribute, the term “standalone” is disappearing.

Transformation of products’ views is greatly accelerated by two revolutions sweeping through information technology (IT) worldwide. One is the digitalization of analog data in control systems, process monitors, bills of materials (BOMs), purchase orders, specifications, work instructions and more. The other is the digital transformation of documents, images and videos—anything in databases that is still formatted. In nearly all enterprises, digitalization and digital transformation remain works in progress.

The terms “explosion” and “fragmentation” needn’t be scary. As changing views of the product, they reflect the huge increase in the volume of information needed in product development, the ever-growing number of people involved and the ever-closer examination of the data and information they use. In today’s enterprises, every business unit that touches any part of any product continually downloads, uploads and regenerates information.

Amid the demands of PaaS and continually changing views of the product, PLM solutions connect product developers, as well as production and service personnel and their supporting systems, to almost any data repository and component, process, tooling and technique imaginable. Some examples include:

  • 2D, 3D and solid models from CAD systems.
  • CAM programs.
  • BOMs with sources, specifications and variants.
  • Specifics of assembly and inspection.
  • Schematics for electronics, including hardware, software, sensors and firmware.
  • Requirements for field service and maintenance/repair/operations (MRO).
  • Greenness, including lifecycle energy consumption and carbon footprints.
  • IoT feedback on the performance of products in the field.
  • Emerging user wants and needs coming out of Big Data with predictive analytics and topology data analytics (TDA).
  • Sustainability and end-of-life data on reuse or safe disposal.
  • Details about projected product innovations and obsolescent versions.
  • Documentation on regulatory compliance and conformance to industry standards, usually still formatted.

In PLM environments, product lifecycle participants sort through an ever-widening range of information and insights, then combine these findings into basic requirements for breakthrough products.

These information sources are instrumental in taking product developers’ imaginations far beyond customary notions of “product.” If anything, “changes” in these contexts is an understatement.

Also exploding is the number of people needed to move new products from concept to customer. Decades of reliance on closely knit teams is ending; in every industry and marketplace, anyone can offer their notions. While this can complicate product development, the initial chaos yields smarter and speedier innovation.

Fragmentation follows explosion and is the third big driver of the changing views of the product. Information is demanded by product lifecycle participants in countless ways, then parsed into ever smaller bits as they investigate viable new designs to outperform rivals or meet PaaS demands. Inevitably, the contents of views fragment.

Moreover, no two people across the lifecycle see their responsibilities similarly, even on tightly integrated collaborative teams. Each user’s diagrams, schematics, components, descriptions, and exploded views are unique and continually changing; so are all the underlying data and information.

These three realities of the changing views of products—transformation, explosion and fragmentation—point to the necessity of implementing Data Governance (DG) and Configuration Management (CM) practices, procedures and all associated roles and responsibilities. DG and CM in the enterprise’s PLM environment were addressed in the recent CIMdata webinar, The Importance of Data Governance within Digital Transformation, by Janie Gurley and Dana Nickerson.

PLM’s Role Grows as Views of Products Change

Amid these changing views of the product, PLM environments are bursting with new capabilities, including automated linkages and traceability, as reported in a recent CIMdata webinar, The Promise and Reality of the Digital Thread.

New PLM capabilities help users access and manage:

  • The huge amounts of data needed for product development.
  • Data comprehensibility.
  • Overlooked memes and trends in social media.
  • The extension of Big Data and the IoT to MRO.
  • Use of PLM throughout the enterprise, thanks to highlighting user successes.

A significant PLM enhancement is the incorporation of value calculations that reflect changes throughout the product’s lifecycle—product value management (PVM). PVM tracks changing values during design, production, distribution, sales, service, warranties and even disposal. As a redefinition of lifecycle processes, PVM strengthens connections between products, markets and users.

For fast-moving companies, these do-or-die connections keep innovative new products aligned with customers’ wants, needs and marketplace upheavals, including those brought on by PaaS. Specifics were covered in a December 7, 2022, Propel Software webinar, “Product Value Management in the Age of Disruption,” in which I participated.
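The value tracking that PVM performs can be pictured as a running ledger of value added and cost incurred at each lifecycle stage. The stages mirror those named above; the figures and function below are invented for illustration, not CIMdata data:

```python
# Each entry: (lifecycle stage, value added, cost incurred) — illustrative numbers.
ledger = [
    ("design",       0,       120_000),
    ("production",   0,       300_000),
    ("sales",        900_000,  40_000),
    ("service",      150_000,  80_000),
    ("warranty",     0,        25_000),
    ("disposal",     5_000,    10_000),
]

def net_value_by_stage(entries):
    """Cumulative net product value as the lifecycle progresses."""
    running, out = 0, {}
    for stage, value, cost in entries:
        running += value - cost
        out[stage] = running
    return out

print(net_value_by_stage(ledger)["disposal"])  # 480000 net over the full lifecycle
```

A ledger like this makes the PVM argument concrete: the product is deep underwater through design and production, and only the downstream stages that PLM connects to the market turn the total positive.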

From Ma Bell to Smartphones

CIMdata believes that even if your products and services are not being rendered obsolete, this is no time to let your guard down. The odds are that rival companies are targeting your products. They may even be using the IoT and Big Data to discover what your users want next.

You should be doing likewise; if your company’s views of its products are not changing, this would be a good time to ask why not.

To grasp the magnitude of these changes, compare your smartphone and its apps with the old AT&T and Bell System phones. Ma Bell’s phones were clunky black Bakelite things with rotary dials. They were analog, voice-only devices hard-wired to the innards of AT&T. Unable to access anything else, they were useless if disconnected.

Smartphones could not be more different; they have keypads and screens instead of dials and can be told verbally to place calls. Totally digital, smartphones snap photos, access images stored virtually anywhere, send and receive texts and e-mails, take notes and dictation, handle complex calculations, access the IoT and Big Data, forecast the weather, show us maps and routes, and connect to or even replace our computers.

We carry this sleek and compact capability around in our purses or pockets. Ma Bell’s phones sat immobile on kitchen walls or little black tables in hallway corners. Will self-driving vehicles present as many radical changes?

Even the most unlikely things undergo change. Shoelaces were undone, so to speak, by Velcro. CarMax and Carvana are displacing used auto dealerships. Netflix displaced Blockbuster and now risks being displaced by others.

Conclusion: What Does ‘Product’ Mean?

We have arrived at the ultimate question prompted by PaaS and the resulting changing views of the product: what constitutes a “product”? Physical products still have fronts and backs, tops and bottoms, sides and edges, but these basic views are now represented in a myriad of ways.

What about the beginnings and ends of lifecycles? Not long ago, those were simply the service lives designed into the product. But today, service lives extend back to the product’s initial concept and forward to its disposal and recycling. Extending service lives by decades also makes sustainability a vital design requirement—another consequence of PaaS.

Given these transformations, explosions and fragmentations, must we incorporate all the earlier versions into the new views of products? How many planned iterations should be attached and projected? How many variants? Should the changing views of products include BOMs? What about suppliers?

And what about lifecycle changes in the views of products emerging from PaaS durability, sensors, connectivity and metering?

Without a doubt, these issues are raised daily in your business unit or enterprise, which means it’s time to re-evaluate your PLM environment and broaden your use of digital twins, digital threads and end-to-end connectivity.

For those who have yet to choose a PLM solution or still struggle with PLM’s predecessors, now is the time to move product development and the rest of your extended enterprise into PLM environments. Let PLM help enhance your products and services before your rivals start changing your customers’ views of your products.

If that happens, you may have very little room to maneuver. After all, change is the only constant.


]]>
Digital Twin Lessons for Engineers from the PLM Road Map & PDT https://www.engineering.com/digital-twin-lessons-for-engineers-from-the-plm-road-map-pdt/ Thu, 08 Jun 2023 09:38:00 +0000 https://www.engineering.com/digital-twin-lessons-for-engineers-from-the-plm-road-map-pdt/ Don't miss these warnings, predictions and best practices for Industrial digital twins.


]]>
PLM Road Map & PDT, CIMdata’s annual North American gathering of product lifecycle management (PLM) professionals held in collaboration with Sweden’s Eurostep Group, was an eye-opener in terms of what a few years have done for the enablement of the digital thread. However, it also outlined the digital thread’s remaining challenges, notably interoperability and transparency issues. This was summed up in the conference theme, “The Digital Thread in a Heterogeneous, Extended Enterprise Reality.” Here is a summary of highlights from the event.

Engineers learn about the digital thread in a heterogeneous, extended enterprise reality. (Image courtesy of Bigstock.)


What Engineers at PLM Road Map & PDT Learned About Digital Threads

In my keynote address, “The Digital Thread: Why Should We Care?” I noted that the technology can be seen as a network, web, chart or map of decisions that sews together—interconnects—data in the enterprise’s end-to-end product lifecycles.

Digital thread connectivity is vital to digital transformation—freeing information from formats, documents, tools, models and departmental databases (read silos). The value of the digital thread lies in the myriad of data links that feed and validate decision-making. This includes hundreds, thousands or perhaps millions of information nodes and data repositories that involve numerous systems and the processes they enable.

From a product’s conception through the end of its useful life, the digital thread helps us to see into every product- or service-related decision. So, a digital network must have a purpose—it is not just a linear sprint through product development.

Digital thread implementations are not straightforward. A well-implemented digital thread may require countless narrowly focused digital sub-threads containing hundreds of information-packed nodes and data repositories ranging from simple flat files to highly detailed model-based structures. As a communication framework, the digital thread also allows an integrated view into digital twins across the lifecycle, product and organization. A digital twin cannot be created or leveraged unless a digital thread connects its data and the processes that support it. PLM solution providers offer many powerful tools for establishing these connections but using them often requires significant experience.

Why go to all this trouble? Digital networks help us understand what decisions were made and why. If we fail to remember our past mistakes, we risk repeating them. And if we fail to learn from them, we can’t build on previous successes.
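The “what was decided and why” question above is essentially an ancestor query over the thread’s web of nodes. A minimal sketch follows, with a hand-built dictionary standing in for the real information nodes and data repositories; the node IDs and kinds are invented for illustration:

```python
# Each node records what it is and which upstream nodes it derived from.
nodes = {
    "REQ-7":  {"kind": "requirement",  "from": []},
    "SIM-3":  {"kind": "simulation",   "from": ["REQ-7"]},
    "CAD-12": {"kind": "3D model",     "from": ["REQ-7"]},
    "ECO-9":  {"kind": "change order", "from": ["SIM-3", "CAD-12"]},
}

def trace(node_id, graph):
    """Walk upstream to recover everything a decision rests on."""
    seen, stack = [], [node_id]
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.append(n)
            stack.extend(graph[n]["from"])
    return seen

print(trace("ECO-9", nodes))  # the change order traces back through both analyses to REQ-7
```

Even this toy version shows why the thread is a web rather than a line: the change order has two parents, and traceability means following all of them, not a single linear history.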

Human factors—skillsets, attitudes, even short-sightedness—are critical. Also critical is data governance and sound plans that span the organization.

To that end, Christine McMonagle, director of Engineering Business Systems at Textron Systems, zeroed in on engaging with people as the key to successful change and why that’s hard to do. Textron started on its sweeping changes with a prod from the DoD about developing “an authoritative source of truth.” This is related to a DoD effort to digitally transform its engineering documents.

Among the quotations in her presentation was this gem: “Change is hard because people overestimate the value of what they have and underestimate the value of what they may gain by giving that up.” It is from Flight of the Buffalo: Soaring to Excellence by James Belasco and Ralph Stayer, a 1993 business bestseller.

Survey Outlines Engineer’s Thoughts on Digital Threads

James Roche, director of CIMdata’s Aerospace & Defense Practice Group, outlined the organization’s three-phase collaborative research into A&D’s use of digital thread.

The research survey respondents, all domain experts, offered 15 different definitions of digital threads, Roche said. The different views were fostered by growing product complexity, shorter time-to-market, efficiency campaigns, new enabling technologies and rising customer demands to deploy digital twins.

Also addressed were the needs in digital threads for traceability of product information, data elements and metadata, along with interoperability between pairs of data elements and the interpretation of one by the other.

The research showed digital thread investment to be in its very early days, and therefore a major business opportunity for solution providers. It also showed that model-based systems engineering (MBSE) will be a fundamental driver of future digital thread investment.

Also covered in the survey’s follow-up were common pain points. The digital thread’s top inhibitor is poor interoperability between different solution providers. As a result, greater support for standards was deemed essential. Lack of openness in new solutions and dependence on third parties for extended connectivity and data interchange remain universal concerns, Roche said, adding that new technologies are helping with linkages and traceability.

To that point, Mattias Johansson, Eurostep Group CEO, focused on what Eurostep considers a viable standard, ISO 10303-239, Product Life Cycle Support (PLCS). PLCS standardizes information exchange among engineering support, resource management, configuration management, maintenance and feedback. He noted that developers are always playing catch-up because “data reacts to change faster than humans do.” No single system or standard will suffice, he added.

What the PLM Executives Think About Digital Threads

The differences in PLM solution providers’ approaches to the digital thread were addressed in an executive spotlight and Q&A with Aras, Dassault Systèmes, PTC and Siemens Digital Industries Software.

In response to a question about the digital thread connectivity that customers are asking for, Rob McAveney, CTO of Aras, pointed out that the digital thread and interoperability are “all we do,” adding, “we are fully agnostic with technology solutions and vendors.”

“Our customers want us to help them get across boundaries in their organizations,” which McAveney said means “cross-discipline digital threads instead of point solutions.” Aras, he added, pushes customers to think about the big picture of what they must accomplish while focusing on individual user access. “Consider it a Think Globally, Act Locally approach,” he added.

In response to a question about the new technologies that will enable the digital thread, McAveney listed the convergence of hardware and software development, cloud-based technologies, connectivity, systems thinking and systems-design. Still missing, he added, is “a sense of urgency dealing with the many challenges coming at us. We’ve had great conversations about this, so now let’s get going on implementations.”

In response to a question on whether customers wanted solution providers to introduce them to digital threads, Michel Tellier, Dassault Systèmes’ vice president of Managed Services, said, in effect, yes. “At Dassault, we are focused on experiences rather than products, and not on modeling but on circularity. Creating, managing and extracting value from digital twins and virtual twins,” he continued, requires highly engineered models from the customers, “which are not always a realistic expectation.”

Most needed, Tellier said, are interoperable, multi-scalar, model-based digital threads, which require “that we myopically focus” on the creation and exploitation of value. He also predicts AI will enable the harmonization of product with nature. Once multi-scalar design becomes widely understood, Tellier predicted, it will move into other industries and eventually become “mission critical.” To justify the effort and cost of automation to get us there, “we must focus” on key performance indicators (KPIs).

Kevin Wrenn, EVP and chief product officer at PTC, said, “Customers tell us they want a digital thread to enable digital twins and to synchronize information from multiple enterprise systems to enable critical use cases that span engineering, manufacturing and service to improve efficiencies, speed time to market, improve quality and sustainability.” He added that PTC is working on integrations across heterogenous systems both within its own portfolio and with other ISVs in the ecosystem to enable open, interoperable data orchestration that solves the most valuable digital thread use cases for manufacturers.

“When we speak with customers about digital thread, we tell them to think about far-reaching digital transformation initiatives like data-driven design or closed-loop quality. Next, we discuss approaching these transformative initiatives in phases: first, clean up your digital life, meaning sort out the right systems of record for your product data (ALM, PLM, ERP and more); next, implement more straightforward data orchestration, like integrated change management between PLM and ERP; and then start working on your farther-reaching digital thread initiatives.”

Wrenn also believes new technologies will help customers meet sustainability requirements and advance hardware and software development with Agile methods. “We see this happening already in many platforms and systems, not just in PLM. Software as a service (SaaS) helps us deal with integration,” he noted. As for artificial intelligence and ChatGPT, the natural-language chatbot, “who knows where they will go, but you certainly could imagine applicability in areas like requirements and generative design.”

Dale Tutt, vice president of industry strategy at Siemens Digital Industries Software, said customers “want our help in tightly closing their many open loops in product development and manufacturing to improve performance and sustainability. [They] want the digital threads to free up their engineers from moving data around and getting rid of ‘air gaps’ in everyone’s processes, to allow their engineers to focus on creating new, innovative products and solving problems.”

“Customers sometimes ask if they should go ‘all-in’ on digital transformation, or if they should approach it incrementally?” Tutt said, “We work with them to understand what they are currently doing, and take a holistic view of solutions, processes and people, as they implement digital threads and improve interoperability. Often, they may want to focus on a few new use cases, but we want to help them bring in solutions that will help them provide a foundation for future growth. If they just bring in technology to satisfy the new use case alone, or they don’t consider process changes and buy-in from their team, they may be setting themselves up for long-term failure.”

As for data itself, Tutt continued, “we help them understand why they need to clean it up to enable digital threads. Usually, they might achieve a 20 percent gain in productivity in the first program they apply digitalization, and then each program after that they can get that productivity increase over and over. It’s the gift that keeps on giving.” As for applying AI, Tutt said, “Once you have cleaned up the data, AI will help classify data created to build connectivity and traceability in good data structures. After that’s done, we’ll see digital threads used to increase automation production and design.”

A Serious Warning to Engineers from Academia

Patrick Hillberg gave an unsettling look at coming social and economic disruptions in “The Past Century Has Not Prepared Us for The Next Decade.” A Ph.D. in systems engineering, he teaches graduate courses in engineering management at Michigan’s Oakland University.

Hillberg explored big disruptions that are expected in global trading patterns: a shift toward goods produced locally and close to customers rather than manufactured where costs are lowest and shipped globally. He also said digital threads and twins could be used to shift to virtual products from physical products.

The greatest challenge facing us, Hillberg said, is developing a digital-capable workforce. To drive home his point, he compared the U.S. workforce of 2020 with predictions of what the workforce of 2030 will face:

  • Out of the 2020 workforce of 12.3 million, 500,000 jobs went unfilled.
  • In 2030, 4.4 million additional jobs will need to be filled thanks to 2.5 million retirements plus 1.9 million new jobs created by economic growth.
  • One-third of the 2030 workforce will be “new,” and 2.1 million of its positions will be unfillable.
  • 75 percent of 2020’s tech skills will be irrelevant in 2030.

Despite this warning, PLM Road Map & PDT North America showed that digital thread users and developers are making progress, while highlighting the remaining challenges, particularly interoperability. Nearly every speaker left the audience convinced that the digital thread and the digital twins it supports have proven their value and that more is coming.


]]>