Oleg Shilovitsky, Author at Engineering.com
https://www.engineering.com/author/oleg-shilovitsky/

8 Innovative Reasons to Connect PLM to Supply Chain Data
https://www.engineering.com/8-innovative-reasons-to-connect-plm-to-supply-chain-data/ (Engineering.com, Fri, 13 Oct 2023)
Design engineers know why supply chain data is so important.

The post 8 Innovative Reasons to Connect PLM to Supply Chain Data appeared first on Engineering.com.

A hundred years ago, the Ford River Rouge complex had everything it needed to turn raw materials into a running vehicle within one complex. Today, no manufacturing company would produce everything in one spot. Instead, they use a network of supply chains to procure components, parts and products. These modern supply chains are sophisticated and complex, but they enable companies to simplify their own internal processes. It’s great when it works; it quickly becomes a nightmare when it doesn’t.

Until recently, logistics professionals and procurement departments were among the only people who worried about supply chains. However, recent disruptions fueled by pandemics, climate change and geopolitical tensions have spotlighted the importance and fragility of these systems. Everyone from financial institutions to policymakers to the general public quickly learned why a particular product had become unavailable or expensive.

Consumer demand for sustainability, transparency and ethical production are other trends that have shone a light on supply chains. This has forced companies to reconsider their product development, operations and supply chain strategies to maximize their environmental, social and governance (ESG) performance.

Traditionally, companies think about the supply chain only after products are designed and ready to be manufactured. However, recent research and experience show that connecting development to supply chains via PLM systems can avoid many challenges while saving time, materials and costs. In this article, I'd like to explore eight ways that bringing supply chain data into PLM systems can impact, innovate and expedite early product development.

The Ripple Effect of Product Design Choices

An engineer’s choices during product design ripple through development like a stone thrown into a pond. For instance, material selections can resonate throughout your entire value chain, influencing your financial outcomes. Grasping how modifications in design reverberate through business expenses enables smarter decision-making, manufacturing, product sustainability and supply chains.

In today's dynamic market environment, product development cannot occur in a vacuum. To remain competitive and agile, companies need to integrate insights from every segment of their business into the design and development process. One key area that has often been overlooked in early product development is the supply chain. To date, PLM has grown to support engineering and new product development in the form of improved design data management, collaboration and change management. This played out in a recent survey conducted by the engineering analytics firm CIMdata.

According to CIMdata’s 2023 market and industry forum event, the implementation of PLM is heavily weighted towards traditional PDM. (Image: CIMdata.)

As a result, there is a big opportunity gap in connecting product design and PLM to downstream processes like manufacturing and supply chains. Making this information available to design engineers can unlock the potential of product development.

8 Product Development Impacts when Connecting PLM to Supply Chains

By providing detailed information about parts, suppliers, cost and many other aspects of the supply chain, engineers gain a new source of intelligence that can be used in product design. And by providing engineering data to suppliers and procurement, companies can mitigate supply chain risks and costly mistakes during the product development stage.

Here is how mixing supply chain data and connecting PLM systems with downstream processes and tools can impact and expedite early product development.

1. Informed Design Decisions:

By understanding the availability, cost and lead time of materials and components, design teams can make informed decisions that align with the realities of the supply chain. This can lead to designs that are not only innovative but also cost-effective and feasible to manufacture at scale.

2. Reduced Time-to-Market:

Having real-time insights into the supply chain allows development teams to foresee and circumvent potential procurement delays. By selecting components that are readily available or identifying alternative suppliers early in the process, companies can speed up product launches.

3. Enhanced Risk Management:

Supply chain data provides visibility into potential risks such as supplier insolvencies, geopolitical issues or potential bottlenecks. Early product development stages that take these risks into account can have contingency plans in place, reducing the risk of project delays or budget overruns.

4. Sustainability and Compliance:

Modern consumers and regulatory bodies are increasingly concerned about sustainability and compliance. Supply chain data can offer insights into the environmental impact of materials or ensure that components are sourced ethically. Integrating these insights into early product development can lead to products that resonate better with consumers and meet regulatory standards.

5. Streamlined Communication with Suppliers:

PLM systems can act as a bridge between development teams and suppliers. By integrating supply chain data into PLM, companies can facilitate better communication about design needs, material specifications or volume requirements. This ensures that suppliers are aligned with development goals.

6. Financial Predictability:

One of the significant challenges in product development is budget overruns. With accurate supply chain data integrated into PLM, companies can better forecast the costs associated with product development, leading to more predictable financial outcomes.
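
To make this concrete, here is a minimal sketch of a cost roll-up that combines a multi-level BOM with procurement pricing. The part numbers, quantities and prices are invented for illustration; a real PLM integration would pull them from the connected supply chain data.

```python
# Hypothetical illustration: forecasting product cost from supply chain data.
# Part numbers, prices and quantities are invented for this example.

# Multi-level BOM: parent item -> list of (child item, quantity)
bom = {
    "ASSY-100": [("PART-1", 2), ("SUB-10", 1)],
    "SUB-10": [("PART-2", 4)],
}

# Unit prices sourced from procurement/supply chain data
unit_price = {"PART-1": 3.50, "PART-2": 0.75}

def rolled_up_cost(item):
    """Recursively sum purchased-part costs for one unit of an assembly."""
    if item in unit_price:  # leaf: a purchased component
        return unit_price[item]
    return sum(qty * rolled_up_cost(child) for child, qty in bom.get(item, []))

print(rolled_up_cost("ASSY-100"))  # 2*3.50 + 4*0.75 = 10.0
```

With current supplier prices flowing into the roll-up, a design change to either part or quantity immediately shows its budget impact.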

7. Quality and Performance Feedback Loops:

Supply chain data isn’t just about procurement. It also provides insights into product performance, warranty claims and return rates. This data, when integrated into the PLM, can provide invaluable feedback to development teams, guiding improvements in the next iteration of the product.

8. Strategic Supplier Relationships:

By integrating suppliers early into the product development process and sharing data through PLM systems, companies can foster stronger, more strategic relationships with their suppliers. This collaboration can lead to joint innovation, where suppliers suggest alternative materials or components that can improve the product or reduce costs.

The Digital Thread: How to Make the Connections

The question remains: how can engineers make these connections between design and supply chains? What systems and platforms can facilitate these connections, and how do manufacturing companies adopt them?

One of the key trends in engineering and manufacturing software is the switch from old-fashioned "documents" to digital "data." Until now, documents (CAD files, derivative files, Excel BOMs and the like) served as the transfer medium between systems. This document paradigm has held strong for the last 30 to 40 years, but it needs to change.

The strategy is to move from documents to data. Instead of focusing on how to export data into a “known format,” companies adopt digital platforms that use modern cloud technologies and data management systems.

Concept of a digital thread and product knowledge graph. (Image: OpenBOM.)

By capturing all engineering information in a digital thread and mixing this data with supply chain and procurement information, manufacturing companies can build product knowledge graphs that represent a unique set of data collections, intertwining data from engineering, production and the supply chain.
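
As a toy illustration of such a knowledge graph, the sketch below stores engineering ("contains") and supply chain ("supplied_by") facts as triples and answers a question that spans both domains. All part and supplier names are hypothetical.

```python
# A toy product knowledge graph mixing engineering and supply chain facts.
# Every fact is a (subject, relationship, object) triple; all names invented.
edges = [
    ("ASSY-100", "contains", "PART-1"),
    ("ASSY-100", "contains", "PART-2"),
    ("PART-1", "supplied_by", "Acme Corp"),
    ("PART-2", "supplied_by", "Globex"),
]

def suppliers_of(product):
    """Follow 'contains' edges, then 'supplied_by' edges, to find all suppliers."""
    parts = {t for s, r, t in edges if s == product and r == "contains"}
    return {t for s, r, t in edges if s in parts and r == "supplied_by"}

print(sorted(suppliers_of("ASSY-100")))  # ['Acme Corp', 'Globex']
```

The same structure extends naturally: adding maintenance or product-usage facts is just more triples, with no schema migration.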

Product knowledge graphs lay the data foundation for future engineering, AI and PLM systems. These co-pilots, of a sort, will be capable of supporting engineers and manufacturing companies during their new product development process.

The integration of supply chain data into the early stages of product development through PLM systems is no longer a nice-to-have; it’s a necessity. In a world where agility, speed and innovation are key differentiators, companies that leverage these insights stand a better chance of success. Early product development, informed by real-world data, ensures that products are not just innovative, but also feasible, profitable and aligned with market needs.

The Best Tip to Improve PLM Adoption and Implementation
https://www.engineering.com/the-best-tip-to-improve-plm-adoption-and-implementation/ (Engineering.com, Wed, 27 Sep 2023)
Hint: Don't focus on a single source of truth first.

The post The Best Tip to Improve PLM Adoption and Implementation appeared first on Engineering.com.

Many heralded product lifecycle management (PLM) as a silver bullet solution, one capable of serving as a comprehensive hub for business operations. In fact, PLM software’s strongest value proposition for many years was to focus on a single database for all product information: the so called “single source of truth.”

The approach of bringing all data into a single database is important and obviously very valuable. However, when companies rush into PLM adoption by attempting to centralize all their data, they often ignore the changes it brings to people and processes.

What I learned after being involved in many PLM implementations with large and small industrial companies is that without considering the unique nuances of their company processes, their PLM implementations often encounter resistance and inefficiencies.

I believe success with PLM implementation lies in prioritizing people and processes before slamming everything into a single database. In this article, I want to share a perspective on why an agile PLM framework can help companies improve adoption. I will also discuss how modern web services and tools can contribute to this process.

Barbie vs Oppenheimer: A Single Source of Truth is Boring

The cinematic showdown between Barbie and Oppenheimer teaches us an important marketing lesson: great content alone won't cut it. Both movies are loved and successful, but it's been widely discussed that Barbie won the box office because those in charge of its brand and products knew when and how to engage their target audience. The same is true for a PLM implementation: it can be improved by engaging its future users and stakeholders with an educational approach.

Despite its intrinsic benefits, however, PLM concepts often fail to resonate with a broad audience. First, the concept of a lifecycle is hard to wrap your head around, let alone act on. As a result, PLM is complex even before any other product data management concepts are added to the mix, so people have a tough time articulating and understanding its business benefits and how it can streamline processes and product development.

The PLM discipline is hard. It includes process management, product lifecycles, quality management, project management and many other disciplines mixed into one. There are also not many places where you can learn PLM as a discipline. Nonetheless, people need to understand its importance and learn how to bring themselves, and others, up to speed on PLM software, document management, supply chain management and PLM systems.

As an additional challenge, businesses are diverse with unique needs and objectives. Insisting on a uniform approach can also result in a lack of enthusiasm and engagement from stakeholders. It’s essential to recognize and celebrate these differences, rather than attempting to suppress them beneath a single system, UI or workflow.

Overcoming Document and Excel Culture

In my interactions with modern businesses, I’ve observed a deep-rooted reliance on documents and Excel. These tools, due to their versatility and familiarity, are often at the core of many operational tasks. To ask employees to abandon these in favor of a ‘superior’ system can not only breed resistance but also lead to reduced productivity during transition phases.

The trick is to show those using these documents how a modern PLM system can improve their daily work. So, to avoid a clash with existing document paradigms and spreadsheet culture, use these talking points:

  • Don’t speak about user experience adoption. A more effective approach would be to bring a familiar user experience and integrate it into the PLM offering. This produces a seamless blend of the old and new. Documents and spreadsheets are two of the most familiar user experiences that have existed in the computing industry for the last 50 years. Meanwhile, most PLM systems look like complex database browsers. Instead of replacing Excel sheets and files, adopt a familiar user experience and bring hidden database services into the mix to gain adoption.
  • Mix the old with the new. Change is hard, and switching everyone to a new way of doing things at once might be too painful for an organization. Instead, modern PLM tools can offer seamless import and export of data from familiar documents. That way, new approaches are introduced gradually and with less pain.
  • Facilitate processes. Every organization is defined by its processes. By molding PLM to reflect and enhance these processes, the transition becomes smoother. Employees will also appreciate the tangible benefits to their day-to-day tasks.

You are also likely to encounter experts in their specific roles within an organization. These individuals can often be the most steadfast in their current way of doing things. The way to win them over is to speak about specific benefits to their specific tasks. Here are examples of PLM processes that can be beneficial to some of those expert audiences:

  • Design for manufacturing and new product development. In many industries, a gap exists between design and manufacturing. PLM can align the design process directly with manufacturing, ensuring smoother workflows and a product that truly mirrors design.
  • Sourcing, procurement and supply chain. You need to purchase components (both off the shelf and custom) to make products. Modern supply chains are complex beasts that often span continents. PLM has the potential to streamline the process by making data available and facilitating real-time communication and using data analytics. This ensures that all stakeholders, regardless of their location, are in the loop.
  • Costing and compliance.  In today’s global market, compliance is more critical than ever. An effective PLM system can streamline compliance adherence while providing tools to accurately estimate and control costs at every stage of the lifecycle.
  • Impact analysis and change management. Making a change requires a deep understanding of the impact it will have on existing products, customers and businesses. With a proactive PLM approach, businesses can anticipate the ramifications of any alterations, enabling them to strategize effectively and make better decisions.
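
As a small example of the impact analysis mentioned above, a "where-used" query walks BOM links upward to find every assembly affected by a part change. The sketch below is a minimal Python version with invented part numbers; a PLM system would run the equivalent query over its managed product structure.

```python
# Where-used impact analysis: find every assembly affected by a part change.
# BOM links are hypothetical (parent, child) pairs.
links = [
    ("TOP", "ASSY-A"),
    ("TOP", "ASSY-B"),
    ("ASSY-A", "PART-X"),
    ("ASSY-B", "PART-X"),
]

def where_used(part):
    """Walk parent links upward, collecting all impacted assemblies."""
    impacted, frontier = set(), {part}
    while frontier:
        parents = {p for p, c in links if c in frontier}
        frontier = parents - impacted
        impacted |= parents
    return impacted

print(sorted(where_used("PART-X")))  # ['ASSY-A', 'ASSY-B', 'TOP']
```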

Expressing the strategy to transform an organization from one focused on products and applications to one focused on data can also create a strong movement towards openness and interoperability in PLM implementations. Consider the shift towards global web applications and true multi-tenant SaaS services. Nobody cares how many times these vendors rebuild their software, as long as they provide continuous data support. This means that vendors and users can keep using the tools they build around the offering without worrying about starting over at the vendor's next software update. Rather than perceiving data, tools and applications as clashing and conflicting with each other, PLM can offer a data-driven strategy that unveils a plethora of innovative possibilities.

Moving to Bespoke Collaborative Services

Static software solutions are a thing of the past. Today, businesses thrive on tailored services that can be orchestrated together to perform specific tasks. This is how you can provide a specific user experience for people and bring workflows together. Here are a few benefits that this setup can mean for PLM users:

  • Design service. Collaborative design tools can foster creativity and cohesion among teams working in a connected way and sharing design data.
  • Planning. New product development requires quickly moving from design to production, building prototypes and making procurement. Tools that can connect both engineering and design with project management, production planning and procurement are extremely beneficial to help customers to build products faster with accurate data.
  • Collaborative Services. Allowing people to share data, communicate and iterate together can provide a significant improvement.

Agile PLM Implementations

Agile methods are coming to product development, and companies are examining how to implement them to replace old "waterfall" processes. Over the last two decades, these methods have become the norm in software development. Especially when combined with cloud delivery and continuous integration (CI), they create huge competitive differentiation and improve the way technology products are developed and delivered to customers.

The difference between waterfall and agile processes. (Image: OpenBOM.)

By adapting the agile process, I've come up with a six-step process management best practice for PLM implementations. Its foundation is a sprint-based (phased) approach. In my experience, fast iteration combined with these best practices produces great outcomes.

An adaptation of the agile process to implement PLM in any organization. Keep in mind that the picture describes one sprint, and the idea is to repeat the sprint many times. (Image: Beyond PLM.)

This circular approach to PLM implementation means continuous iteration, feedback and adaptation. Dive deeper into this agile framework and you’ll understand the strategies that ensure smooth and efficient implementations.

Engagement and AI

The transformative potential of AI is undeniable. When integrated into PLM systems, AI can tailor experiences, offering smart recommendations and automating mundane tasks. Such a system, responsive and intuitive, can boost engagement as users find their tasks simplified and value-driven.

We are still in the very early stages of introducing AI to PLM systems, but these changes will be stronger over time. Integrating AI with PLM offers more than just content generation and summary. It can predict trends, identify inefficiencies and recommend potential innovations.

The Potential of PLM

PLM's real potential lies beyond slamming a single database into the middle of an organization's processes and forcing everyone to change. That may have been a good idea 20 years ago, when companies had no real way to access siloed information and were forced to shuttle paper between departments.

But modern PLM offers a strategy and set of services that adapt and evolve with an organization, enhancing processes and meeting distinct needs. Adopting familiar user experiences and patterns, and avoiding clashes with people, can help transform an organization rather than 'break it overnight.'

By prioritizing people and processes over data centralization, engineers can unlock the true power of PLM.

Graphs Map the Future of PLM Software
https://www.engineering.com/graphs-map-the-future-of-plm-software/ (Engineering.com, Thu, 31 Aug 2023)
How graph data models and databases can improve PLM software.

The post Graphs Map the Future of PLM Software appeared first on Engineering.com.

Throughout history, eras have been defined by market forces. During the 17th-century Dutch Golden Age, the worth of tulip bulbs skyrocketed to unprecedented levels. For decades, energy resources, particularly oil, dominated as some of the most influential assets. However, in 2017, The Economist posited a paradigm shift with their article, “The world’s most valuable resource is no longer oil, but data.” This assertion sparked widespread interest and intrigue about the value and power of data. While opinions vary, one thing remains clear: data possesses immense potential to shape the future.

In the manufacturing business, data remains a largely untapped goldmine. Industrial companies are sitting on gigantic amounts of data representing products (design, production, supply chain), business activities (customers, sales, support and maintenance), product usage (a growing segment of connected products) and many other forms and functions. Companies that harness the full potential of this information, and understand both its value and its application, stand to gain immense rewards.

Harnessing the full potential of data is no trivial activity. Just as extracting oil requires a well-organized process and lifecycle, spanning from drilling to delivery for consumption, data follows a similar complex path. Data is collected across various silos, each corresponding to distinct activities and operations. It must then be transformed into a consumable format and integrated with other data sources to extract meaningful insights. Ultimately, this refined data should be accessible to end-users to aid them in their activities and decision-making processes.

PLM, Data Management and Databases

Let’s talk about the history of PLM data management, technology and how it is aligned with modern data management trends.

From Proprietary Databases to Structured Query Language (SQL) and Relational Database Management System (RDBMS)

Over the last 20 to 30 years, the PDM and PLM industry came a long way in improving data management and developing scalable platforms. The data management architecture of these solutions goes back to a time when PDM/PLM developers didn't trust, and couldn't rely on, commercial database products. Early PDM and PLM therefore used proprietary solutions, building a variety of data stores from file formats, embedded databases and management tools. The end game of these experiments, however, was to switch to the industry standards adopted by large manufacturing companies. The decision was not only technical but also political: IT departments oversaw technology adoption in a company, and PDM/PLM needed to pass muster. That is easier to do if you run on top of industry standards such as IBM, Oracle and Microsoft.

Cloud Was the Cambrian Explosion in Data Management

Over the last 10 to 15 years, we have seen explosive growth in the variety of data management solutions and related technologies. It started with global web platforms and other cloud developments that separated the technology used to build a solution from the delivery process of enterprise software products. This change was the biggest contribution to the way companies started to manage product data and develop product lifecycle management systems to support business processes.

Massive development of a variety of data management solutions, databases, data processing tools, data storage, analytics, machine learning and others created an ecosystem contributing to new PLM system development. A combination of new databases and cloud contributed to the foundation of polyglot persistence data architectures that enabled the use of multiple databases at once.

Back in the 1990’s, many data management decisions looked like this. (Image: BeyondPLM.)

Today, database and data management technology are going through a Cambrian explosion of options and flavors, the result of a massive amount of development in data management technologies over the last 20 years. The database is moving from "solution" to "toolbox" status. A single database (usually an RDBMS) is no longer the default choice for all development tasks. This raises the question of how different data management systems can be used efficiently to support the development of new PLM capabilities.

The role various data management systems play in PLM. (Image: BeyondPLM.)

What is a Graph Data Model?

One very important data management trend is the development of graph models and graph databases. The graph data model is a structural representation of data where both entities and their relationships are highlighted. There are three important elements of a graph data model:

  1. Nodes (or vertices)
  2. Edges (or relationships)
  3. Properties

In this model, nodes represent entities or instances of entities, and edges represent the connections between nodes, symbolizing relationships. Properties are key-value pairs that can be attached to both nodes and edges, providing additional information and a flexible data management structure.

Although graphs have been known for many years, the recent development of graph tools is very promising from a PLM perspective. The model can evolve, allowing the addition of new relationships or properties. Unlike traditional data models, graph models prioritize operations that traverse connections, like determining paths between nodes or querying connected entities.

A Graph Data Model. (Image: BeyondPLM.)

This model is especially suited for scenarios where the relationships between entities are as crucial as the entities themselves, making it a fit for situations like product structure, configurations, data dependencies and flexible knowledge representations.
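
A property graph can be sketched in a few lines of Python: nodes and edges are plain records that both carry key-value properties. The part and supplier names and attributes below are hypothetical, chosen to echo a PLM scenario.

```python
# Minimal property-graph sketch: nodes and edges both carry key-value properties.
# All identifiers and attribute values are invented for illustration.
nodes = {
    "PART-1": {"type": "part", "material": "aluminum"},
    "SUP-1": {"type": "supplier", "region": "EU"},
}
edges = [
    {"from": "PART-1", "to": "SUP-1", "rel": "supplied_by", "lead_time_days": 14},
]

def neighbors(node, rel):
    """Return target nodes connected by edges of a given relationship type."""
    return [e["to"] for e in edges if e["from"] == node and e["rel"] == rel]

print(neighbors("PART-1", "supplied_by"))  # ['SUP-1']
```

Note how the relationship itself holds data (the lead time), which is exactly what tables and trees struggle to express.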

What is a Graph Database?

A graph database is a type of database that uses graph structures for semantic queries, with nodes, edges and properties representing stores of data. Unlike traditional relational databases, which arrange data in tables, graph databases focus on the relationships between data and use the graph data model as the foundation of their data structure.

Because graph databases build on relationships and the mathematical foundations of graphs, they provide interesting capabilities that can be highly valuable in PLM. Here is a short list of examples:

  • Flexible Schema: Unlike relational databases, which require a strict schema, graph databases are more flexible. This makes them more adaptable to evolving datasets.
  • Relationship-centric: Graph databases excel at managing highly connected data. They are specifically designed to highlight and navigate intricate relationships in data.
  • Performance: For tasks that involve traversing relationships (like social networks or recommendation engines), graph databases can be much faster than relational databases because they can traverse relationships in constant time irrespective of the data volume.
  • Intuitive Data Modeling: For many real-world problems, data models can be more naturally represented as graphs. Think about social networks, organizational hierarchies, or transportation networks.
  • Real-time Insights: Given their ability to efficiently traverse many nodes and relationships, graph databases can provide near real-time insights, which are crucial for applications like fraud detection.
  • Advanced Analytics: With graph algorithms, one can uncover patterns that are challenging to discern in other types of databases, such as shortest paths, cluster identification or recommendation paths.
  • Integration of Diverse Data Sources: Given the flexible nature of graph databases, they are particularly well-suited to integrate data from different sources and of various types.
  • Agility: Iterative development is easier with graph databases. As requirements evolve, making changes to the database schema, queries or the application layer can be more straightforward than with other database models.

How Graph Databases Accelerate PLM Software Development

Graph databases can offer transformative value to Product Lifecycle Management (PLM) due to their unique ability to absorb, manage and query highly interconnected data. Here are some specific use cases illustrating the unique value of graph databases in PLM.

Flexible data modeling of relationships

Managing connected data has always been a big problem for traditional data management techniques in PLM. The value of graph databases for this use case lies in their flexible data model and rich relationship capabilities. Traversing product structures with graph database queries is much more efficient than with old-fashioned SQL queries.
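
To illustrate why traversal fits product structures so naturally, here is a minimal multi-level BOM explosion over adjacency lists; a graph database would express the same walk as a single path query. Part names and quantities are invented.

```python
# Multi-level BOM explosion via graph traversal (adjacency lists).
# Hypothetical structure: a bike with wheels, each wheel with spokes and a rim.
bom = {
    "BIKE": [("WHEEL", 2), ("FRAME", 1)],
    "WHEEL": [("SPOKE", 32), ("RIM", 1)],
}

def explode(item, qty=1, totals=None):
    """Accumulate total leaf-part quantities needed for one unit of `item`."""
    totals = {} if totals is None else totals
    for child, n in bom.get(item, []):
        if child in bom:                       # sub-assembly: recurse deeper
            explode(child, qty * n, totals)
        else:                                  # leaf part: add up quantities
            totals[child] = totals.get(child, 0) + qty * n
    return totals

print(explode("BIKE"))  # {'SPOKE': 64, 'RIM': 2, 'FRAME': 1}
```

In SQL the same multi-level walk typically needs a recursive common table expression; in a graph model it is the native operation.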

Merging of complex data structure

Data coming from multiple silos during product development, manufacturing and maintenance must be combined. Graph databases provide a unique capability to merge heterogeneous data structures into a single complex data set, with the ability to query relationships, run impact analysis and more.

Data analytics using graph data science

The ability to perform various analytical queries provides unique value for PLM systems because it brings analytics to active data sets. There is no need for ETL or converting data into another platform; graph databases run very specific analyses via graph-specific algorithms.
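
As a small example of a graph-specific algorithm, the sketch below finds a shortest path with breadth-first search over an unweighted graph, the kind of traversal that underlies much graph analytics. The graph itself is a made-up toy.

```python
# Shortest-path sketch (BFS) over an unweighted relationship graph.
# Node names and links are invented for the illustration.
from collections import deque

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}

def shortest_path(start, goal):
    """Breadth-first search returning one shortest node path, or None."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(shortest_path("A", "D"))  # ['A', 'B', 'D']
```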

PLM Data Modeling – File, Table, Tree, Graph

The power of the graph is the ability to retain the rich semantics of the data while providing a very powerful way to query data.

How familiar product data elements are transformed into a graph model. (Image: BeyondPLM.)

Three of the most popular data paradigms in CAD and PLM are files, tables and trees (hierarchies). However, this information can be transformed into a graph that captures not only the data but also the relationships and semantic dependencies between data. Capturing these relationships and dependencies allows for building analysis and decision-support tools to help manufacturing companies.
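
As a sketch of that table-to-graph transformation, the snippet below turns flat (parent, child, quantity) rows, the kind you might export from an Excel BOM, into an adjacency structure. The rows are invented for illustration.

```python
# Turning a flat BOM table (rows of parent, child, qty) into a graph structure.
# Rows are hypothetical, standing in for an exported spreadsheet.
rows = [
    ("ASSY-100", "PART-1", 2),
    ("ASSY-100", "SUB-10", 1),
    ("SUB-10", "PART-2", 4),
]

# Build an adjacency representation: each table row becomes a directed edge
graph = {}
for parent, child, qty in rows:
    graph.setdefault(parent, []).append({"child": child, "qty": qty})

print(graph["ASSY-100"])  # [{'child': 'PART-1', 'qty': 2}, {'child': 'SUB-10', 'qty': 1}]
```

Once in this form, the where-used, explosion and path queries discussed above all become simple traversals.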

How PLM data might look like in a Graph database. (Image: BeyondPLM.)

The Future Value of Graph Databases for Product Development and PLM

Graphs and graph databases are quickly becoming a very powerful product data management paradigm. Meanwhile, industrial companies are sitting on a goldmine of data with only a limited set of options for turning it into value and competitive business advantage.

Graph databases can be used for a variety of these tasks. Where graphs really shine is in managing and analyzing relationships: connections between people, things and companies are where value is created. Although not every problem is a graph problem, PLM applications, and BOM management in particular, have multiple uses for a graph database, especially where other data management solutions are inefficient.

PLM systems are transforming. Modern online platforms focus more on how to manage data and less on the applications, which have a shorter lifecycle than the data itself. Because of that, there is a need for a more flexible and robust data management foundation capable of scaling, supporting data semantics and serving as a source of analytics on product data.

Oleg Shilovitsky is Co-founder and Chief Executive Officer at OpenBOM, and author of the Beyond PLM blog.

The post Graphs Map the Future of PLM Software appeared first on Engineering.com.

A PLM and AI Fantasy: What the Future Might Hold https://www.engineering.com/a-plm-and-ai-fantasy-what-the-future-might-hold/ Mon, 10 Jul 2023 10:26:00 +0000 https://www.engineering.com/a-plm-and-ai-fantasy-what-the-future-might-hold/ OPINION: Moving from legacy PLM to AI copilots and agile, collaborative product development.

The post A PLM and AI Fantasy: What the Future Might Hold appeared first on Engineering.com.

The surge of cloud, data management and AI technologies over the last 10 to 15 years has opened a new era of digital transformation in which companies tirelessly try to stay ahead of the curve. The growing competitiveness of manufacturing businesses and the demand to build faster, better and at lower cost raise many questions about how to keep up and which technologies will become differentiators. By using an idealistic and futuristic example, it's clear to see that PLM and AI will play a significant role.

AI copilots will differentiate legacy PLM and product development workflows. (Source: Bigstock.)

Digital Transformation: Marketing vs. Reality

Unless you've lived under a rock for the last three to five years, you've heard about digital transformation, digital twins and digital threads. Although these are very important topics, they often come with old products and technologies rebranded under new names. These projects sound big and important from an IT standpoint, but to engineering teams, they don't always sound like something that can help build the best battery, robot or car, or advance the engineering profession. To many, they actually get in the way of engineers' thinking and innovation.

In my opinion, this is very similar to what happened a decade ago with PLM—while vendors were talking about product innovation platforms, customers were more focused on how to improve the design process and decrease the number of buttons and mouse clicks to perform a specific PLM function.

IT digital transformation projects might be very interesting, but when I speak to engineers about digital transformation, they ask how new tools and technologies can help them be smarter and develop faster.

A wave of innovation in generative AI has triggered a lot of thinking about how new intelligent methods and technologies can be applied in product development, similar to the way ChatGPT has triggered everyone to think about automatic content creation.

To me, this is an intersection between PLM and AI. But what would that engineering world look like?

To see that potential reality, let’s explore the journey from legacy PLM—responsible for CAD file management, release and change management—to a future platform where AI acts as our copilot to build, design and develop optimal Bill of Materials, supply chains, manufacturing processes and production planning.

The Genesis of Newmo: AI Copilot Revolution

Let’s start a fairy tale…

In the fictional city of Newma Valley, the visionary engineer Dr. IK developed an AI system called Newmo to change the new product development (NPD) process. Newmo was designed to work in perfect harmony with human engineers, pushing the boundaries of complex machinery design, collaborative BOM management, manufacturing and supply chain optimization.

One day, Dr. IK's team received a challenge from a major customer: to create a new model of a complex chipmaking machine that would redefine the industry. The company had manufactured similar machines in the past, but the requested timeline, feature complexity and requirements for this project were beyond anything it had accomplished previously. Dr. IK saw this as the perfect opportunity to showcase Newmo's full capabilities.

To begin the project, Dr. IK’s team provided Newmo with the company’s historical data on existing machine designs, configurations, BOMs and manufacturing process planning. Newmo also received access to the data from existing PLM, ERP and SCM databases, global supplier databases and global manufacturing facilities. The AI rapidly analyzed this wealth of information and identified the required BOM configurations, features, potential inefficiencies, bottlenecks and untapped potentials in the existing BOMs and processes needed to build the new product.

Newmo then introduced an innovative NPD co-pilot application. This application supported a process that allowed the AI to work closely with human engineers, dynamically updating and optimizing the entire system design as well as focusing on detailed components and assembly design changes—including the optimization of 3D geometry and feature sets. By analyzing cost and performance trade-offs, Newmo suggested alternative components, materials and suppliers that would both enhance the product’s performance and reduce its cost.

The result was a new machine named “Nexo,” which was designed to adapt and optimize production processes in real-time, using advanced sensors and AI algorithms. Newmo’s comprehensive analysis played a crucial role in the development process, suggesting design improvements, material choices and even predicting future technological changes.

The next step was to find the best location for a manufacturing facility to build the Nexo. Newmo evaluated multiple factors—labor costs, transportation expenses, supplier proximity and regional regulations. It then ranked potential manufacturing sites based on a comprehensive analysis, calculating optimal production efficiency and cost-effectiveness of the entire process.

The results were outstanding. The new development process demonstrated the increased performance of engineers using Newmo and its ability to complement their workloads by optimizing product development and planning processes. The Newmo NPD co-pilot process led to the creation of a truly groundbreaking machine with a unique and optimal set of components in the shortest possible time. Newmo not only transformed the corporation's production capabilities but also set a new standard for AI-assisted complex machinery development, manufacturing optimization and BOM management.

The success of the project allowed Dr. IK and Newma Valley to demonstrate the potential of combining human creativity with AI in the NPD process. Their achievement opened new horizons for innovation and created a new category of intelligent NPD tools. These methods changed the way companies build engineer-to-order products, with an incredible level of product development automation, supply chain optimization, BOM cost assessment and change management.

Back in Real PLM World

Let’s come back to the reality of PLM and our current manufacturing world with all its challenges. Recently, I outlined how PLM can address five key industrial challenges. These are:

  1. Manufacturing better, faster and at lower costs.
  2. The digital transformation process.
  3. Ongoing supply chain disruption.
  4. Regulation complexity and risks.
  5. Everything as a service (XaaS).

But let's focus on one problem and how it impacts many of the above challenges: information accuracy. Think about getting wrong data just because of an incorrect Excel file or an incomplete data sync. Operating with incorrect data can lead to errors in production, wasted resources and increased costs. Therefore, ensuring data accuracy and integrity is crucial for making informed decisions and keeping manufacturing operations up to date.
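A minimal sketch of the kind of automated cross-check that catches such errors early (the part numbers and quantities are invented): reconcile the same BOM as two systems see it and flag every disagreement.

```python
# The same BOM as exported from two systems (part number -> quantity).
plm_export = {"P-100": 4, "P-200": 2, "P-300": 1}
erp_export = {"P-100": 4, "P-200": 8}

def reconcile(a, b):
    """Return parts whose quantities disagree or that one side is missing entirely."""
    return {part: (a.get(part), b.get(part))
            for part in set(a) | set(b)
            if a.get(part) != b.get(part)}

print(reconcile(plm_export, erp_export))  # flags P-200 (2 vs 8) and P-300 (missing)
```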

A big challenge for all companies is to improve their decision-making processes. The last decade of B2C innovation demonstrated that companies have become good at giving the average consumer tools and opportunities to choose what they need (driving directions, purchasing online and more). But this hasn't translated to engineers in manufacturing settings.

Industrial companies often struggle with data silos, complex dependencies and distributed teams, all of which can slow down decision-making. To solve the problem, companies need to integrate data and streamline communication between teams. By achieving this goal, companies can make better decisions, improve overall efficiency, accelerate and streamline processes, improve supply chain decisions, navigate regulations and much more.

In other words, information accuracy—or inaccuracy, in this case—is a real-world roadblock to the idyllic Newmo experience.

LLM and Knowledge Graphs Accelerate Design

So how do we bring this all into reality? The question of how traditional product development can be empowered by the capabilities of AI tools like large language models (LLMs), knowledge graphs (KGs) and generative design has triggered multiple discussions in the industry. One possible path is the integration of LLMs with KGs to interpret data, generate insights and make decisions quickly and accurately. Here’s a high-level view of how this integration can support the accelerated design process:

  1. Collecting Knowledge: The language model gathers information from various sources in the PLM system and feeds it to the KG—which is like a database that focuses on data relationships.
  2. Finding Information: The AI system can then quickly pull relevant information from the KG when needed.
  3. Generating Ideas: The language model can use this information to suggest new or improved design concepts.
  4. Evaluating Designs: The AI system can compare design ideas against the knowledge in the KG, guiding teams on how to improve their designs.
  5. Making Documents: The language model can automatically create design documents using the information from the KG and the design process.
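Steps 1 and 2 of that loop can be sketched in a few lines (a toy triple store in plain Python; the entities and relations are invented, and no actual LLM is called): facts gathered into the KG are retrieved on demand and formatted as grounding context for a model.

```python
# Step 1: knowledge collected from PLM sources as (subject, relation, object) triples.
kg = [
    ("bracket-A", "made_of", "aluminum 6061"),
    ("bracket-A", "used_in", "frame assembly"),
    ("aluminum 6061", "supplied_by", "MetalCo"),
]

# Step 2: pull everything known about an entity when a design task needs it.
def retrieve(entity):
    return [t for t in kg if entity in (t[0], t[2])]

# The retrieved facts would then become grounding context in an LLM prompt.
context = "\n".join(f"{s} {r} {o}" for s, r, o in retrieve("bracket-A"))
print(context)
```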

While LLMs can be used efficiently to capture unstructured information, the KG itself can be built from various design sources representing existing product designs and the real information that already lives in PLM systems. Ideas and practical examples of KG implementations already exist, built by industrial companies and researchers.

In this picture, you can get an idea of how KGs can capture information and insights from multiple design sources to be used later for LLM creation. (Image: OpenBOM.)

When LLM and KG are connected and integrated, they bring in specific design data from a company's files. This data is captured and translated into LLM models that contextualize it, enabling engineers to easily use it to produce new designs, bills of materials, production plans and supply chain optimizations.

Where Does AI and PLM Go from Here?

By linking AI and PLM, the industry will take engineers through an imaginative landscape filled with game-changing ideas and disruptive possibilities, examining what it might mean for businesses globally. Industrial companies are looking at how to improve the effectiveness of decisions in product development. And by further integrating LLMs with KGs and PLM, design teams can harness the power of AI to speed up their product development process.

Our quest to understand the future of PLM and AI is a journey into a realm of endless potential. As we move forward, we will discover a huge potential for new data-driven technologies. The goal to turn manufacturing companies from traditional “electronic paper” legacy PLM processes into a data-driven world of AI and knowledge is very promising.

Bridging the Gap: How PLM Can Address 5 Key Industrial Challenges https://www.engineering.com/bridging-the-gap-how-plm-can-address-5-key-industrial-challenges/ Mon, 29 May 2023 15:26:00 +0000 https://www.engineering.com/bridging-the-gap-how-plm-can-address-5-key-industrial-challenges/ Every manufacturing company I see experiences these problems, let’s talk about them.

The post Bridging the Gap: How PLM Can Address 5 Key Industrial Challenges appeared first on Engineering.com.

The industrial sector faces immense challenges, from product development complexity to supply chain disruptions to stringent regulations. Despite all this, companies are scrambling to manufacture better, faster and at lower cost while simultaneously tackling digital transformation in the post-COVID era.

(Image courtesy of Bigstock.)

After talking to hundreds of manufacturing companies and engineering firms over the last few years, I see five common, critical challenges in modern manufacturing. These challenges exist regardless of company size or industry.

So, since every manufacturing company I see experiences these problems, let’s talk about them.

The list includes:

  1. The need for speed.
  2. Digital transformations in a post-COVID world.
  3. Supply chain disruption.
  4. Regulation and risks.
  5. Transitioning to everything as a service (XaaS).

Manufacture Better, Faster and at Lower Costs

Productivity and efficiency are a big deal. Companies are looking at how to develop faster, better and less expensive products—but it's a tough goal. Many problems can impact a company's productivity and efficiency, and much of the time data and process optimization are the keys to improvement. Indeed, inefficient data management practices, legacy systems, messy data and data silos are some of the most common reasons companies remain inefficient.

Speaking of process optimization, it is important to be wary of applying a waterfall methodology to process organization. Highly complex products and multi-disciplinary teams make sequential processes productivity killers. With 80 percent of the product footprint defined during the design stage, addressing problems early and accelerating new product development (NPD) are a big deal.

The Post-COVID Digital Transformation of Processes

The COVID-19 pandemic has underscored the necessity of digital transformation in the industrial sector. It was a pivotal moment for companies to understand that they cannot operate as before.

As a result, companies are focusing on multiple digital initiatives to address many of the changes since COVID-19—such as the increased number of engineers working from home. Two of the most important initiatives seeing change are NPD and Change Management (CM). Accelerating and streamlining these processes is key to success. Bringing new innovative products to market faster requires changes in the way companies work. Integrating multi-disciplinary teams and optimizing the supply chain using digital transformation is a big challenge. Companies need to figure out how to accelerate these processes and stay ahead of the competition. These processes also need to be able to support a quick response to customer demand.

Ongoing Supply Chain Disruption

What started as a disruption triggered by COVID-19 is continuing to be a challenge for the industrial world. The most important aspect of supply chain disruption is the global visibility of supply chains, and the tracking of multiple events such as price instability, local conflicts and financial markets. The supply chain world is complex and fluctuates at a high speed, which continues to be a challenge for manufacturing companies in all industries. Information about suppliers, raw materials, production processes and lead time are basic elements to track supply chain performance.

Regulation Complexity and Risks

Modern compliance requirements and regulations are complex, which makes them challenging for manufacturing companies to follow. Selling products in the global market requires companies to manage the complexity of compliance and react fast to new regulations. That means quickly capturing information about new rules and providing quick, efficient answers (including automatic updates) and history tracking. The cost of non-compliance is simply too high for manufacturing businesses to risk.

Everything as a Service (XaaS)

Manufacturing on demand and the XaaS model is revolutionizing the industry. Companies are switching from selling products to selling services. Moving from tires to miles, from turbines to hours, and from equipment to time are only a few examples of how industry and business models are changing. For this change to be successful, products need to have significant service life improvements. To encourage such transformations, companies must have robust process upkeep, including the ability to improve not only engineering processes, but also support and maintenance.

Is the PLM industry ready for the challenge?

The PLM industry is looking to get ahead of the transformations and modern challenges in the industrial world. Let’s talk about how recent development in PLM technologies can help to close the gaps and cope with these challenges.

Information Accuracy:

Data is fast becoming king in PLM technologies. This involves killing silos, improving data availability across multiple organizations and processes, real-time data access, collaboration support and many other elements of data management. A short list of what PLM can do to make digital data available for everyone includes:

  • Implementing robust and modern data technologies
  • Ensuring data connectivity and integrity
  • Integrating accurate product information across various stages of the lifecycle

Accelerate NPD and CM Processes:

PLM offers significant opportunities to accelerate critical processes such as NPD and CM. One of the main elements of these improvements is connected to information availability and the organization of collaborative agile product development processes. Data sharing and instant communication are the key elements. Software as a service (SaaS) and cloud technology are key enablers in this process. In addition to that, multi-tenant data management streamlines data sharing and allows everyone involved in the process to stay connected without the need to import or export data.

Collaboration, Decision Support and Copilots:

To solve these problems, companies also need to integrate data and streamline communication between teams. By achieving this goal, companies can make better decisions and improve overall efficiency. Staying connected is quickly becoming the norm in the industrial world for improving decision support and communication between companies, contractors and suppliers. No one wants to wait for information to be synced between multiple systems, and everyone is looking for the best systems, with elements of AI and decision support, to help make the right decisions at the right time. The PLM technologies that help here include, but are not limited to, SaaS, data sharing, knowledge graphs and artificial intelligence. This should also be an alert for the many legacy system providers and industrial companies that need to invest in new tech coming from the business-to-consumer (B2C) world into PLM.

Roadblocks to Addressing These Challenges with PLM

PLM is a transformative tool that helps bridge the gap between the challenges faced by the industrial sector and the opportunities that lie ahead. By enhancing information accuracy, improving decision-making processes, accelerating operations, mitigating risks and supporting new manufacturing models, PLM can empower companies to thrive in today’s dynamic industrial landscape.

However, such an optimistic view will become a mirage if industrial companies and PLM vendors continue to keep current legacy platforms and focus on minor functional improvements. Existing PLM platforms are over 25 years old and demand modification and substantial improvement. Thus, it's not just on industry to address the challenges stated above; PLM vendors need to step up as well.

How AI Can Improve PLM https://www.engineering.com/how-ai-can-improve-plm/ Fri, 12 May 2023 14:02:00 +0000 https://www.engineering.com/how-ai-can-improve-plm/ The use of artificial intelligence is proliferating across industries and disciplines, including manufacturing and engineering. Here are five ways AI could benefit product lifecycle management.

The post How AI Can Improve PLM appeared first on Engineering.com.

We live in an increasingly technological society; past the industrial phase, we are currently in the information era. In the last decade, AI and related technologies have made a significant leap forward, moving from experimental projects and implementations into modern applications such as the semantic web, graph databases and large language models (LLMs). All of these are expected to have an impact on PLM development.

AI will make a big impact on PLM—it's just a matter of when. (Image courtesy of Bigstock.)

Artificial intelligence (AI) has been evolving for decades and is increasing its presence in our lives, changing industries and reshaping our day-to-day practices everywhere—the way we live, work and communicate. Let’s talk about where the technology currently stands from an industrial engineering perspective and how it can affect the future of PLM.

Our Current AI World: An Industrial Engineering Perspective

Recent research and achievements combined with modern computing power and its availability with cloud computing is making AI widely available. This is pushing the boundaries of everyday usage of AI in many applications.

Large language models, such as OpenAI's GPT-4, have garnered significant public attention since the appearance of ChatGPT—particularly for their impressive capabilities in natural language processing and the understanding and generation of content. An example likely to resonate with PLM-minded people is GitHub Copilot, which, according to OpenAI, was trained on a dataset of open-source code from GitHub. These applications are impressive, but remember to always assess the accuracy, efficiency and ethical considerations of AI based on the datasets used for training. It is also important to remember that most industrial companies will be looking for solutions that rely on their own data instead of public datasets.

With the rapid rise of AI technologies, some have expressed concerns about an AI or GPT bubble, fueled by excessive hype and unrealistic expectations. Engineers should carefully assess how to implement AI projects and what they can realistically deliver. Most new applications of technology are initially rejected until it is proven they can be done, and AI will be no different. But that is no reason to take an avoidable risk. The key to avoiding the AI bubble is to check how realistic a project is by analyzing three elements: data availability, use cases and cost. Keep in mind that AI is not a silver bullet or a tool that can replace people. It is a powerful decision-making and data-generation tool that can be used alongside human activity.

Knowledge graphs and the semantic web are other tools that can help engineers apply AI. Knowledge graphs represent a structured, semantic understanding of information. By linking data points in a meaningful way, knowledge graphs can help users navigate complex information landscapes, provide context-aware recommendations and facilitate more effective decision-making. In the context of PLM, knowledge graphs can aggregate data from multiple systems to support the entire product lifecycle and help facilitate end-to-end lifecycle management. By integrating data from disparate sources and systems, knowledge graphs can help businesses better understand their products, optimize processes and make better decisions. Applications such as impact analysis, holistic regulatory compliance and supply chain risk mitigation are good examples of knowledge graph usage.

5 Ways AI Will Change PLM

Let's explore some AI applications in PLM that, in my view, can have a significant impact on the modern development of technologies and tools. They might help companies to manage specific aspects of product development and provide decision support for people involved in the design, manufacturing, customer support and maintenance of products.

1.   Requirements Management and Traceability

Requirements management is a critical aspect of the product development process, as it involves gathering, analyzing and defining the needs and expectations of a product's end-users. Natural language processing and the ability to summarize large sets of information, documents, videos and other sources of data can provide some breakthroughs for the requirements management and validation tools used in PLM software.

2.   Reuse of Data for Faster New Product Development

Data reusability is essential for efficient product development, as it helps companies avoid duplicating efforts and reduces costs. How many times have you seen engineers create new parts just because they cannot find an existing one? AI-powered data management systems can analyze vast amounts of data from various sources to identify patterns and relationships, suggesting what can be reused in the context of a new project. Think about configurable designs, Bills of Materials and the reuse of suppliers. Reuse patterns applied in NPD projects can streamline the process and help companies make better-informed decisions based on historical data.
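As a toy illustration of the reuse idea (the part numbers and descriptions are invented, and a production system would use far richer similarity models), even simple string similarity from the standard library can surface existing candidates before a new part number gets created:

```python
from difflib import SequenceMatcher

# Existing catalog of part descriptions (invented).
catalog = {
    "P-001": "M6 hex bolt, stainless steel, 30 mm",
    "P-002": "M8 hex bolt, stainless steel, 40 mm",
    "P-003": "nylon washer, 12 mm outer diameter",
}

def suggest_reuse(description, threshold=0.6):
    """Rank existing parts by textual similarity to a new-part request."""
    scored = [(SequenceMatcher(None, description.lower(), text.lower()).ratio(), pid)
              for pid, text in catalog.items()]
    return [pid for score, pid in sorted(scored, reverse=True) if score >= threshold]

print(suggest_reuse("M6 hex bolt stainless steel 30mm"))  # the closest match ranks first
```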

3.   Virtual Assistance and Customer Support

AI can help improve all aspects of PLM work that require human interaction with predefined patterns and optimization algorithms. Think about virtual assistance in planning meetings, organizing tasks and scheduling approvals. The change management process can be automatically generated without the need for the CCB (change control board) to use old-fashioned workflow tools. For maintenance projects and customer support, AI tools can provide interactive manuals and intelligent assistance to help perform or automate tasks.

4.   User Experience

For many years, the UI/UX aspects of PLM systems have been under continuous pressure. The demand to simplify the UX and make tools easier to use and understand is high. Bringing AI into the UI/UX world of PLM systems can open the door to new ways of understanding problems and new tools to simplify the UI—perhaps by turning it into a conversation, enabling speech recognition and finding other ways to make it simpler and more efficient.

5.   Planning Intelligence

AI can help elevate the importance of PLM technologies by enabling planning intelligence related to complex configurations, supply and demand, and portfolio optimization. Machine learning algorithms capable of analyzing large amounts of data, combining design and customer-related information, and offering paths to optimize product design, supplier selection and many other aspects will be useful. They can enable engineers to adjust product strategies and increase a company's capacity to address emerging opportunities and stay ahead of the competition.

Data Will Make AI the Future of PLM

There is one important thing that holds the key to future AI advancement no matter what aspect of PLM you decide to improve using a modern AI tech stack—and that is data. Data is king and you cannot make AI work without it. Therefore, if you think about AI projects, start by thinking about how to get data that can help you to turn your AI ideas into reality.

Data is the lifeblood of any AI system. In the context of PLM, data from various sources such as design, manufacturing and customer feedback serve as the foundation for AI-driven decision-making. This data not only enables AI to identify patterns, trends and opportunities for improvement, but also empowers organizations to make informed decisions throughout the product lifecycle. The key sources of data can be your design (files), Bill of Materials (BOM), maintenance reports, old drawings and enterprise databases such as MRP/ERP, SCM, MES, CRM and more. Acquiring and cleaning data is an essential step to starting AI activity.

One of the critical aspects of the future of PLM AI is the integration of multiple technologies to create a comprehensive information model. Combining different types of data, such as structured and unstructured, can provide a more holistic view of the product lifecycle. Additionally, incorporating technologies such as IoT, machine learning and natural language processing can further enhance the capabilities of AI-driven PLM systems. The most important thing is to find the right model and approach from the selection of available AI technologies. Here are some examples:

Natural Language Processing (NLP): Use NLP techniques to extract relevant information from product specifications, engineering documents or design files to create a new design or BOM. This approach can help automate the process of identifying components, materials and suppliers.
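As a deliberately naive sketch of that extraction step (the spec sentence is invented, and a real pipeline would rely on an NLP model rather than a regular expression), even a pattern match over quantity-plus-noun phrases can seed draft BOM lines:

```python
import re

spec = "The enclosure requires 4 M3 screws, 2 hinges and 1 acrylic panel."

# Naive pattern: a quantity followed by a component phrase, ended by ',', ' and ' or '.'.
items = re.findall(r"(\d+)\s+([A-Za-z0-9 ]+?)(?:,| and |\.)", spec)
bom_lines = {name.strip(): int(qty) for qty, name in items}
print(bom_lines)  # draft BOM lines extracted from the sentence
```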

Computer Vision: Leverage computer vision techniques to analyze product images, CAD drawings or other visual representations of a product to identify components and materials. The information can be used to generate a new product configuration or find a supplier.

Knowledge Graphs and Semantic Technologies: Develop a knowledge graph that represents product components, materials, suppliers and other relevant entities and their relationships. Semantic technologies like RDF and OWL can help formalize and structure this knowledge. AI techniques like graph neural networks or graph-based machine learning can then be applied to generate or optimize a BOM structure.

Rule-based Systems: Rule-based systems are one of the oldest techniques in decision-making. Create a rule-based system that captures domain-specific knowledge, guidelines and constraints related to product design, manufacturing and BOM generation. The system can then generate a product configuration by applying these rules to the product data.
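A minimal sketch of such a rule engine (the rules and the spec fields are invented): each rule pairs a human-readable description with a predicate, and checking a product configuration is just running every predicate against it.

```python
# Invented domain rules: (description, predicate over a product spec).
rules = [
    ("outdoor products need coated fasteners",
     lambda spec: spec["environment"] != "outdoor" or spec["fastener"] == "coated"),
    ("total mass must stay at or under 2.0 kg",
     lambda spec: spec["mass_kg"] <= 2.0),
]

def check(spec):
    """Return the description of every violated rule; an empty list means it passes."""
    return [name for name, ok in rules if not ok(spec)]

bad_spec = {"environment": "outdoor", "fastener": "plain", "mass_kg": 2.4}
print(check(bad_spec))  # this configuration violates both rules
```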

AI-assisted Optimization: Develop AI-driven optimization algorithms that help generate an optimal product structure by considering multiple factors such as cost, lead times, availability and other constraints. This approach is particularly useful in complex sourcing.
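For small option sets this kind of trade-off can even be brute-forced, as in this invented sourcing sketch (real solvers handle the combinatorics at scale): pick the cheapest supplier mix whose slowest component still meets the deadline.

```python
from itertools import product

# Per component, candidate (supplier, unit_cost, lead_time_days); all invented.
options = {
    "motor":   [("A", 120, 10), ("B", 100, 30)],
    "housing": [("C", 40, 5), ("D", 35, 25)],
}

def best_sourcing(max_lead_days):
    """Cheapest supplier combination whose slowest part meets the deadline."""
    best = None
    for combo in product(*options.values()):
        cost = sum(c for _, c, _ in combo)
        lead = max(d for _, _, d in combo)
        if lead <= max_lead_days and (best is None or cost < best[0]):
            best = (cost, dict(zip(options, (s for s, _, _ in combo))))
    return best

print(best_sourcing(max_lead_days=14))  # cheapest mix that ships within two weeks
```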

Every organization has unique needs, processes and objectives, which makes it essential to customize AI-driven PLM systems to cater to specific requirements. By utilizing an organization’s data, AI can be contextualized for better decision-making and improved efficiency. Therefore, it is important to find a way to use company data to create a unique data model for AI.

Contextualizing includes breaking down data silos and enabling cross-functional data reuse. For instance, AI algorithms can analyze data from multiple departments, such as engineering, manufacturing, supply chain and service, to provide a more comprehensive understanding of the product lifecycle and of the ways to optimize and streamline processes.

Key Takeaways About AI Applications in PLM for Industrial Engineers

The future of AI in product lifecycle management is both promising and uncertain. It holds great potential to change industrial companies and the way they make decisions, but the outcomes remain uncertain because many applications of AI are still novel and require validation. The key to unlocking this potential lies in the intersection of company data and contextualizing AI technology to address specific customer or industry vertical problems.

The future of AI in PLM depends on the effective utilization of data. Companies are sitting on a goldmine of data, held in legacy databases, gigabytes of Excel and drawing files, and enterprise applications. By harnessing data from multiple sources, integrating various technologies and contextualizing AI systems for specific organizations, businesses can unlock the full potential of AI-driven PLM.

In the end, the future of AI in PLM is not a matter of if, but of when and how. By contextualizing AI technology to meet the specific needs of industrial companies, we can propel the world of PLM into a new era of efficiency, sustainability and innovation.

The post How AI Can Improve PLM appeared first on Engineering.com.

PLM Is Best Hosted, SaaS, or On-Premise . . . Let’s Sort It Out
https://www.engineering.com/plm-is-best-hosted-saas-or-on-premise-lets-sort-it-out/
Fri, 25 Sep 2015 10:38:00 +0000
Every PLM vendor claims to support the cloud, but exactly how can be a bit of a mystery.


Here are a few questions generating some debate among both vendors and customers:

  • How will manufacturing companies be able to adopt cloud solutions?
  • What are the right products and technology for PLM cloud?

Earlier this year, I published an article comparing PLM vendors’ cloud services. I saw it as a way to differentiate offerings and provide some guidance for PLM professionals and customers navigating the sea of cloud buzzwords such as IaaS, PaaS, SaaS, public vs. private cloud and even things like virtual private cloud.

I had the chance to attend a few PLM conferences over the last couple of months. After watching many presentations and talking to attendees, I found that today’s discussion about cloud PLM differentiation can be narrowed down to a comparison between hosted PLM solutions and SaaS PLM services. It’s not easy to differentiate between these two types of solutions, and PLM cloud marketing can often confuse customers.

In this article I will provide a short description of both and offer some simple guidance about the advantages and disadvantages of these approaches. My goal is to help you make an informed decision regarding future PLM cloud strategies in your organization.

Everything Is Hosted

This statement is actually true. From a certain standpoint, if you think about software running outside of your company data center, it is hosted somewhere.

When hosted applications were first introduced to businesses, they were called application service providers (ASPs). For many people, it is hard to differentiate between ASP-hosted solutions and SaaS applications.

On the surface, the difference between the two solutions (hosted and SaaS) might seem insignificant. Both will release a company from the need to operate hardware and will shift responsibility to the hosting providers. But that’s where the similarities end and the differences begin.

Hosted PLM Solutions

Hosted solutions use PLM software that is capable of running on-premise and make it available to customers from an isolated hosting environment. In other words, hosted architecture takes your PLM servers and moves them outside your company data center. 

Hosted solutions can use the services of any hosting provider with computing infrastructure and, in some situations, customers may own the hardware. A hosted solution can also be built on infrastructure-as-a-service (IaaS) providers such as Amazon AWS or Microsoft Azure, using a variety of computing infrastructure.

From an infrastructure and product architecture standpoint, these solutions are almost identical to on-premise solutions. The application and data reside in an isolated environment.

Hosted solutions ensure a high level of security, a clear advantage, and in this respect they are not very different from on-premise installations. They can use multisite redundancy, and they can ensure the availability of applications in case of communication problems across a global deployment.

For the last decade, PLM vendors have gathered enough experience to run existing PLM products on hosted computing infrastructure. The disadvantages of hosted PLM architectures are the flip side of their advantages: they require isolated infrastructure, they don’t scale elastically (unlike modern cloud architectures) and they can limit some of the customization and administrative capabilities of the PLM software.

Cloud SaaS Applications

A SaaS solution is usually a single application running on top of multiple virtual machines and shared services. The architecture of SaaS applications is often different from that of applications used in on-premise and hosted cloud installations.

SaaS applications take full advantage of the cloud infrastructure to run across multiple servers; on-premise products cannot do this without changes to their architecture. SaaS cloud solutions also use computing and data services elastically and can scale according to demand.

The main advantages of cloud applications are their ability to scale elastically and their optimal use of cloud infrastructure. From the standpoint of scale, SaaS applications have almost unlimited potential and can support better cross-application communication. Additionally, integration among SaaS cloud applications is much easier and data exchange is simpler.

The disadvantage of SaaS cloud applications is that you are entirely dependent on the vendor and application provider. Such solutions cannot be taken in-house and installed on-premise, regardless of the cost.

The integration of SaaS cloud solutions can be another challenge, because they might lack some of the data hooks and services available in hosted products. While the majority of enterprise PLM software is integrated via direct access to the SQL database, SaaS cloud integration must take a different approach and use a Web API.
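As an illustration of the Web API approach, the sketch below normalizes the kind of JSON BOM payload such an endpoint might return into flat records, the sort of shaping a direct SQL join used to do in one step; the endpoint URL and field names are hypothetical:

```python
import json

# Hypothetical response from a SaaS PLM Web API endpoint such as
# GET https://plm.example.com/api/v1/items/123/bom
SAMPLE_RESPONSE = json.dumps({
    "item": "123",
    "rows": [
        {"number": "P-100", "title": "Enclosure", "qty": 1},
        {"number": "P-200", "title": "Screw M3", "qty": 12},
    ],
})

def parse_bom(payload):
    """Map the API's JSON payload to flat (part_number, qty) records
    ready to load into another enterprise system."""
    data = json.loads(payload)
    return [(row["number"], row["qty"]) for row in data["rows"]]
```

The trade-off is typical: the Web API is a stable, vendor-supported contract, whereas direct SQL access was faster to build but broke whenever the database schema changed.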

The Modern Cloud PLM Solution Landscape

On one end, established PLM vendors have decided to make their existing PLM platforms cloud-enabled. The next logical step is for them to host PLM solutions using hosting providers and IaaS infrastructure. 

On the other end, providers of new PLM solutions want to leverage cloud infrastructure in full to run SaaS PLM apps, exploiting the elasticity of the cloud and its economies of scale.

Right now, the only two cloud SaaS PLM vendors are Autodesk PLM 360 and Arena Solutions. These two vendors provide software available only as cloud SaaS services. The applications work through a browser, including basic administration, configuration and integration. You cannot take these products and run them on-premise or in a hosted data center of your own.

For the last three years, major PLM vendors have hosted their PLM solutions through a variety of methods. Dassault Systèmes, PTC and Siemens PLM make their PLM products available via the hosted infrastructure of IaaS providers and partners. Aras Corp, for instance, supports multiple options for hosting Aras Innovator, from the Microsoft Azure cloud to hosting providers (such as T-Systems in Europe). Oracle Agile PLM is also available as an Oracle Supply Value Chain solution, which relies on Oracle Cloud infrastructure.

So, can you get the best of both worlds? It is not clear whether PLM vendors will be able to support both SaaS and hosted solutions on a single technological and product platform. Some products (e.g. PTC Windchill) are available via public cloud as a shared service, but with limited configuration capabilities. I haven’t seen similar solutions from Dassault Systèmes and Siemens PLM, but they may be available in the near future.

So, what’s my conclusion?

Cloud PLM is still a complex world, with plenty of questions for vendors and customers alike. With traditionally long lifecycles and increased competition, vendors will need to adjust their approaches if they want to claim their share of the market moving forward.

Meanwhile, traditional PLM platforms will likely face challenges of scale and will need to leverage cloud economics in order to deliver effective PLM solutions at a cost competitive with modern SaaS PLM cloud services.

At least, those are my thoughts on the matter.

About the Author

Oleg Shilovitsky is an entrepreneur, consultant and blogger at Beyond PLM, a leading source for information and commentary on engineering and manufacturing software. Shilovitsky has been building software products for data management, engineering and manufacturing for the last 20 years. In the past, he developed PDM/PLM software for companies such as SmarTeam, Dassault Systèmes, Inforbix (acquired by Autodesk) and Autodesk.

The post PLM Is Best Hosted, SaaS, or On-Premise . . . Let’s Sort It Out appeared first on Engineering.com.
