Category: Systems Engineering

30 Sep 2019
PLM Roadmap

The Case for PLM in Concert With Your ERP System

In a Q&A with ConnectPress, Jon Gable, PLM Business Leader for Adaptive, gets to the heart of why even small manufacturing shops can derive tremendous value from implementing a product lifecycle management (PLM) system that works hand-in-hand with an enterprise resource planning (ERP) system.

Fundamental Roles of ERP and PLM

ERP systems focus on operations: how a company produces a product, including tracking money, materials, production capacity, orders and executions, labor factors, risk, compliance, and more. What ERP cares most about is the financial information associated with purchasing, producing, or assembling parts.

PLM systems focus on the innovation side of a business: how a company develops a product from concept to end-of-life. The tools within PLM—including mechanical CAD, ECAD, FEA, and manufacturing simulation—create the design information that goes into the ERP system. In addition, PLM tools expand on ERP’s fundamental purpose of managing purchasing, manufacturing, and assembly of designs by answering a variety of other questions. Will the finished parts perform as expected? Will they be cost-effective? Are there better alternatives? Can they really be assembled ergonomically? Will there be clashes as robotics move parts into place?

A PLM system allows for prediction and correction that an ERP system isn't capable of, for example, helping engineers plan production workflows based on a component's shape, handling characteristics, and tolerances, not just machine availability.

Another huge value PLM contributes is the ability to simulate factory-floor manufacturing processes, which is critical for change management, allowing an organization to synchronize different engineering domains to fully understand the impact of change. Instead of receiving a new requirement, making a physical change to a subsystem, and working through the physical manufacturing process to evaluate any issues, a robust PLM platform lets you complete the evaluation virtually—and quickly. Gable refers to this state as “the holy grail of where manufacturing organizations want to get to.”

The Modern Market Means You Need PLM

The challenges of three current trends in manufacturing can all be comprehensively managed by a PLM system: extreme product variations and configurability, increased product specialization, and complex mechatronic engineering.

Setting up manufacturing engineering to handle product variation. ERP systems deal with what part numbers to produce, which doesn’t help the variability question. In contrast, PLM systems allow engineering and simulation to be done to validate that variant configurations can be designed and produced—meaning you can validate products before exposing possible configurations to the market.

Managing different product configurations. As companies manage these variations of products—and validate the new design configurations—it's ever more important and complex to ensure the proper revisions and versions of a design are being used for the correct product and manufacturing simulations, as well as production. With a “single source of truth,” a centralized location for all product and manufacturing data, a PLM system makes this management significantly easier.

Addressing complex designs. Rarely do products have only mechanical function these days—it’s much more common that they contain electronic and software aspects, as well. Simpler tools for tracking design evolution, such as a product data management (PDM) system, worked for simpler products. But the added complexity of mechatronic engineering increases the need for PLM platforms that can handle a broader range of teams, contributors, and collaborators involved in a single design.

How To Begin with PLM

The barrier to small- and mid-sized organizations implementing PLM is twofold: cost and effort. Many are finally reaping the rewards of a complete ERP integration and routine content management workflows. The idea of adding another enterprise platform, and one that comes with a decent price tag, can be daunting—regardless of potential value in the end. But the benefits of the long-view solution can still be achieved bit by bit with modularity. Most PLM vendors offer modular, manageable packages that build on each other, with fairly affordable on-premise, cloud, or even SaaS options.

Similarly, following the theory of eating the elephant “one bite at a time,” the best way to start an implementation is with a single process you want to improve. The process should involve a targeted group of users, have the most business impact, and be the foundation for expanded applications. Typically, this is something like a small design team managing their content, whether electrical, mechanical, or software.

Follow-on phases build on this implementation as users who consume the engineering content seek to improve their own processes. For example, people who track material-compliance regulations for different markets may want to move the information they track offline—unlinked and siloed—into the PLM system, where it lives in context with the design data.

Centralized, accessible, comprehensive product data makes everyone associated with product manufacturing more efficient, which ultimately saves time and money while improving quality and customer satisfaction.

To read the full Q&A with PLM Business Leader Jon Gable, or to find out more about how a PLM platform can transform your processes, contact Adaptive.

26 Jun 2019
Simulation for Systems Engineers

Requirements Simulation for Systems Engineers

Manage Complexity by Simulating Requirements

Requirements specifications are at the heart of the development process. Though multiple tools are used to manage requirements, it is still common for errors to creep in at the specification stage. Validation of requirements is usually carried out after the design phase, when hardware, software and models are simulated. These HiL, SiL and MiL (Hardware, Software, Model in the Loop) simulations are performed successively to validate the embedded systems against the requirements. If there are missing or incorrect requirements, the result is costly design and test iterations.

The dynamic nature of today’s intelligent systems makes it difficult to capture requirements under various operating conditions and system states. A model-based approach advocates modeling the operating conditions, dynamic states, and related functions, and simulating the functional requirements – to detect and rectify issues with requirements before the detailed design activity. Test vectors reflecting operating conditions and dynamic system states are then generated automatically. STIMULUS, the requirements simulation solution from Dassault Systèmes, adopts the model-based approach and allows you to simulate and debug requirements to find missing requirements, identify conflicting ones, and so on, in exactly the same way a software developer debugs code.
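
STIMULUS uses its own formal requirement language, but the underlying idea can be sketched in a few lines of generic Python. Everything below is invented for illustration (the cruise-control signals, the thresholds, the trace length); the point is that a textual requirement becomes an executable check that can be run against many automatically generated test vectors:

    import random

    # Generate one test vector: a time series of hypothetical cruise-control
    # signals under randomly sampled operating conditions.
    def random_trace(steps=50):
        trace = []
        speed = random.uniform(0.0, 40.0)              # vehicle speed, m/s
        for _ in range(steps):
            brake = random.random() < 0.1              # brake pressed ~10% of steps
            speed = max(0.0, speed - 2.0) if brake else min(40.0, speed + 0.5)
            trace.append({"speed": speed, "brake": brake})
        return trace

    # A textual requirement made executable:
    # "Whenever the brake is pressed, speed shall not increase at the next step."
    def req_brake_decelerates(trace):
        return all(later["speed"] <= now["speed"]
                   for now, later in zip(trace, trace[1:]) if now["brake"])

    # Simulate the requirement against many auto-generated test vectors.
    failures = sum(not req_brake_decelerates(random_trace()) for _ in range(1000))
    print(f"requirement violated on {failures} of 1000 generated traces")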

For a video introduction to STIMULUS, please click here.

WHAT IF… you could validate requirements before starting the design?

Model Executable System Specifications

Model formalized textual requirements, state machines and block diagrams in a fully integrated simulation environment. Textual requirements can be allocated at each level of the system’s functional architecture. Describe operating modes and sequences of dynamic systems with state machines.
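
As a rough, tool-agnostic illustration of describing operating modes with a state machine, here is a minimal Python sketch; the modes and events are hypothetical, and a real specification model would add hierarchy, timing and signals:

    from enum import Enum, auto

    class Mode(Enum):                     # hypothetical operating modes
        OFF = auto()
        STANDBY = auto()
        ACTIVE = auto()

    # Allowed transitions: (current mode, event) -> next mode.
    TRANSITIONS = {
        (Mode.OFF, "power_on"):      Mode.STANDBY,
        (Mode.STANDBY, "engage"):    Mode.ACTIVE,
        (Mode.ACTIVE, "disengage"):  Mode.STANDBY,
        (Mode.STANDBY, "power_off"): Mode.OFF,
    }

    def step(mode, event):
        """Advance the machine; an event with no transition leaves the mode as-is."""
        return TRANSITIONS.get((mode, event), mode)

    mode = Mode.OFF
    for event in ["power_on", "engage", "power_off", "disengage"]:
        mode = step(mode, event)
        print(f"{event:10s} -> {mode.name}")

Note that "power_off" while ACTIVE is silently ignored here; this is exactly the kind of under-specified behavior that simulating a specification brings to the surface.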

Debug and Test System Specifications

Simulate the complete system specification (requirements, state machines and block diagrams) as a whole and find specification errors before the design phase, adopting an effective requirements- and test-driven development process. Generate many execution traces that satisfy the system specification, and use powerful debugging features to analyze requirements simulation results: automatic detection and diagnosis of conflicting and missing requirements, requirements coverage, highlighting of active requirements, signal monitoring, etc.
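
To make conflict detection concrete, here is a small Python sketch (not STIMULUS syntax) with two invented requirements for a cabin heater. Enumerating system states, the way a requirements simulator explores traces, exposes a state in which no output can satisfy both:

    from itertools import product

    # Two invented requirements, each turned into an executable predicate.
    def req_door(s):    # "If the door is open, the heater shall be off."
        return not s["door_open"] or not s["heater_on"]

    def req_temp(s):    # "If the temperature is below 5 C, the heater shall be on."
        return s["temp"] >= 5 or s["heater_on"]

    # Search the state space for combinations where no heater output is valid.
    for door_open, temp in product([False, True], [0, 10]):
        ok = [h for h in (False, True)
              if req_door({"door_open": door_open, "heater_on": h, "temp": temp})
              and req_temp({"door_open": door_open, "heater_on": h, "temp": temp})]
        if not ok:
            print(f"conflict: door_open={door_open}, temp={temp} admits no valid output")

An open door in freezing weather satisfies neither requirement, whichever way the heater behaves; the two requirements conflict and one of them needs a priority or an exception clause.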

Validate Models in the Loop (MiL)

Once the behavior models and software code are built, validate the behavior against requirements. Build a test environment where requirements are turned into test observers, import behavioral models and software executables using the FMI® standard, generate numerous test vectors, and run test campaigns to validate the behavior of the system under various operating scenarios. During test campaigns, compute and report comprehensive metrics on functional coverage of requirements.
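
As one concrete, if simplified, way to picture a MiL check outside any particular vendor tool: the open-source FMPy package can simulate an FMU, after which a requirement acts as an observer over the resulting trace. The FMU file name, output variable and limit below are hypothetical:

    # pip install fmpy
    from fmpy import simulate_fmu

    # Run a behavioral model exported through the FMI standard.
    result = simulate_fmu("vehicle_controller.fmu",
                          stop_time=10.0,
                          output=["vehicle_speed"])

    # A requirement turned into a test observer over the simulated trace:
    # "Vehicle speed shall never exceed 40 m/s in this scenario."
    assert all(v <= 40.0 for v in result["vehicle_speed"]), "requirement violated"
    print("requirement satisfied over the full trace")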

Validate Control System Models

In most companies, debugging and testing of control systems is time-consuming. Most often the focus is on validating control systems using physical prototypes. However, engineers very often realize that issues crept in during the definition of the desired plant behavior and related requirements. Many operating states also turn out not to be well covered by requirements, and hence are not incorporated into the control models, resulting in rework.

Considering the ever-growing complexity of plant operating requirements, it is best to simulate and test the requirements to ensure validity and coverage before starting the control system design activity. Also, test vectors for various operating conditions should be generated automatically from the use-case conditions and reused in the controls design environment. This eliminates the need to create test cases again in the control design environment.

Use cases defined in STIMULUS automatically create test vectors as text files, while the requirements are turned into MATLAB® observers that can be embedded in Simulink® models.
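
The exact file format is not shown here, but the concept is straightforward: a use case becomes a time-stamped table of stimuli written to disk for reuse. A generic Python sketch with invented signal names:

    import csv

    # A hypothetical test vector derived from a use case: (time s, speed set-point m/s).
    vector = [(0.0, 0.0), (1.0, 10.0), (2.0, 25.0), (3.0, 25.0)]

    # Write it as a tab-separated text file for reuse in the controls design environment.
    with open("test_vector_usecase1.txt", "w", newline="") as f:
        writer = csv.writer(f, delimiter="\t")
        writer.writerow(["time", "speed_setpoint"])
        writer.writerows(vector)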

For a video presentation of these capabilities of STIMULUS please click here.

Contact Adaptive to find out more about what 3DEXPERIENCE can do for you.

Ramesh Haldorai is Vice President, Strategic Consulting, 3DEXPERIENCE platform at Dassault Systèmes.

12 Jun 2019

Model-Based Systems Engineering

Introduction

In the 20th century, systems engineering methodology was developed so that a system could be decomposed into multiple sub-systems and each sub-system could be independently engineered, manufactured and serviced. The emphasis was on defining requirement specifications such that each sub-system and its interactions with other sub-systems were clearly defined. This method emphasized upfront planning, analysis and specification, hence the term Requirements Driven Systems Engineering. In practice, it was always very difficult to specify upfront with a high level of accuracy and to resist changes to specifications during development. By and large this methodology has been inadequate and has led to delayed programs and last-minute surprises, commonly referred to as the requirements-delay-surprise factor!

In the 21st century, iterative modeling and simulation play a crucial role in systems engineering. An operational model is first developed to understand all usage conditions, including the surrounding environment; then systems models are built and simulated; finally, component models are developed.

Change is integral to this methodology; requirements, structure, and behavior are derived and finalized with the help of the models. In short, the model as the master!

The fidelity of the models is continuously improved during development, and it is possible to substitute models with physical systems, also called Hardware in the Loop (HiL). When the physical systems are assembled, they are just a twin of the model. Tests conducted on this physical prototype can be continuously correlated against predicted behavior and used to improve the fidelity of the models.

Models are Everywhere

It’s fairly common today for mechanical engineers to develop CAD models and subject them to structural, fluid and thermal simulation. Similarly, a number of models are built by engineers from other disciplines: software engineers use models to specify the operating conditions and interactions between systems; control systems developers build block-based models and generate software code for controllers; electrical engineers develop schematics and layouts of their designs; electronics engineers develop high-level logic that is synthesized into physical design; hydraulic engineers define hydraulic circuits. When interdisciplinary work is critical, co-simulations are also performed. For example, the thermal and fluid dynamics aspects are simulated together to understand the performance of climate control systems.

Systems integration nightmare. Since each discipline is working on its own models, most often the first time engineers witness how the systems function together is when they finally assemble a physical prototype. It’s not uncommon that physical prototypes require numerous build-test-fix iterations before they work as intended. The net effect: projects are delayed and quality suffers.

The era of autonomous systems. New types of sensors, complex control algorithms and the integration of on-and-off-board systems are the drivers of autonomous capabilities. This leads to an increase in software-based functionality and E/E complexity never seen before.

Even though models are used by every engineer, they are siloed by discipline, requiring physical prototypes for integration and validation.


Smart, Safe and Connected

Design, validate, deliver the intelligent vehicle experience

Smart, Safe and Connected Industry Solution Experience based on the 3DEXPERIENCE platform connects mechanical, embedded electronics and software development teams on an end-to-end digital platform and enables them to build a multi-disciplinary virtual prototype right from the early stages of concept design.

Make your vehicle models dynamics capable. If you are currently building control systems models in a signal-flow oriented tool, higher fidelity and more accurate control can be achieved by incorporating co-simulation of dynamic physical systems with Dymola.

Signal-flow oriented control models typically don’t fully incorporate dynamic vibrations caused by road and driving conditions. These vibrations affect driving comfort and safety, and if not incorporated in the models, may lead to issues discovered only later during physical testing. Simulating dynamic behavior under various road and driving conditions helps identify and fix issues in the early phases of product development.
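
To make “dynamic behavior under road conditions” concrete, here is a minimal quarter-car model written in plain Python with SciPy rather than in Dymola/Modelica; all masses, stiffnesses and the bump profile are illustrative values only:

    import numpy as np
    from scipy.integrate import solve_ivp

    ms, mu = 300.0, 40.0         # sprung (body) / unsprung (wheel) mass, kg
    ks, cs = 20000.0, 1500.0     # suspension stiffness N/m, damping N*s/m
    kt = 180000.0                # tire stiffness, N/m

    def road(t):
        """Road profile: a 5 cm bump between t = 1.0 s and t = 1.2 s."""
        return 0.05 if 1.0 <= t <= 1.2 else 0.0

    def rhs(t, y):
        zs, vs, zu, vu = y                           # body / wheel position, velocity
        f_susp = ks * (zu - zs) + cs * (vu - vs)     # suspension force on the body
        f_tire = kt * (road(t) - zu)                 # tire force on the wheel
        return [vs, f_susp / ms, vu, (f_tire - f_susp) / mu]

    sol = solve_ivp(rhs, (0.0, 4.0), [0.0, 0.0, 0.0, 0.0], max_step=0.001)
    print(f"peak body displacement: {1000 * np.max(np.abs(sol.y[0])):.1f} mm")

In Dymola, the same system would be assembled from pre-built library components rather than coded by hand, and coupled to the rest of the vehicle model.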

Dynamic Systems Modeling and Simulation

The design and engineering of autonomous systems requires a new model-based systems approach – it needs to be multi-disciplinary from the get-go! Dymola and the VeSyMA libraries capitalize on decades of experience with dynamic systems modeling.

Dymola is a physical modeling and simulation environment, for model-based design of dynamic systems. Dymola adopts the Modelica language and is accompanied by a portfolio of multi-disciplinary libraries covering the mechanical, electrical, control, thermal, pneumatic, hydraulic, powertrain, thermodynamics, vehicle dynamics and air-conditioning domains. Library component models are coupled together to form a single complete model of the system. You can extend existing models or create custom models, improving reuse across projects.

Vehicle Systems Modeling and Analysis (VeSyMA™) is a complete set of libraries for vehicle modeling and simulation. It includes engine, powertrain, and suspension libraries that work in conjunction with the Modelica Standard Library. In addition, battery libraries, along with electrified and hybrid powertrain libraries, are available as well.

Model Import and Export. You can import models directly from Simulink® into Dymola. Dymola also supports the FMI Standard 1.0 and 2.0 for both model exchange and co-simulation.

Real-time Simulation. Dymola supports real-time code generation for simulation on a wide range of HiL platforms from dSPACE®. Co-simulation of Dymola and Simulink generated code has been tested and verified for compatibility with multiple combinations of dSPACE and MATLAB® releases.

Driver-in-the-Loop Simulation. Dassault Systèmes’ partner, Claytex, integrates Dymola with driving simulation software. Libraries built by Claytex include a car template and support for LiDAR, radar and ultrasound sensors that work with the simulator. Before exporting the model, simulation can also be run in a test environment within Dymola.

Systems modeling case-study: Energy management of trucks at Arrival. www.arrival.com

Source: energyengineering magazine. Low Carbon Vehicle special edition – Summer 2016

The electrification of trucks requires efficient energy management while maintaining vehicle attributes at the same level as a conventional powertrain system. Hence, it requires detailed studies of vehicle system interactions in order to understand which vehicle systems dominate these attributes. The upfront modeling approach is vital to capture these attributes before developing the physical prototype.

Dymola has a multi-physics modeling capability that is very useful in developing these complex interactions at both the vehicle system level and the sub-system level, and for pin-pointing the dominant systems or components. All of these vehicle systems/subsystems can be modeled within the same modeling workspace at the top level and then cascaded to a lower level in order to create a series of libraries that can be reused for different vehicle plant model architectures. This process is important for system modeling, particularly during the development phase, giving engineers access to different options to optimize the system architecture for energy management and the improvement of other vehicle attributes. The process minimizes design and product risks by not committing tooling costs for the prototype build, as the majority of the validation activities can be simulated to produce results that closely represent the physical system/sub-system components, which also reduces development lag time. Another advantage of system modeling is being able to perform component sizing optimization for energy management in order to improve the vehicle range.

Dynamic Physical Systems Modeling - A Checklist

If you want to incorporate dynamic behavior into your vehicle models, the following are some of the key capabilities of the modeling environment that you may want to consider.

Breadth of Library Models: Are there pre-built libraries for the sub-systems that are included in your system? If your systems are multi-disciplinary in nature, look for libraries across multiple domains containing models for mechanical, electrical, control, thermal, pneumatic, hydraulic, powertrain, thermodynamics, vehicle dynamics, air conditioning, etc.

Object Oriented: Can you directly instantiate the library models and build your systems with ease? Typically, look for a drag-and-drop interface. Also, look for the ability to abstract subsystems in a single model. If necessary, can you modify the library models and create your own derivatives? Model management capabilities are a key requirement if you are working in a team.

Equation Based: Can the dynamic behavior of systems be described by differential and algebraic equations? Does it support the concept of flow and across variables?

Acausal: Does the environment support the definition of equations in a declarative form, without considering computation sequence? This reduces the effort to describe behavior in comparison with procedural languages like C and other block-based tools, where signal-flow direction needs to be considered (see the sketch after this checklist).
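
To make the equation-based, acausal idea concrete, consider how an electrical resistor is described in a Modelica-style environment. In LaTeX notation (a sketch of the concept, not verbatim library code):

    v = v_a - v_b   \qquad \text{(across variable: potential drop over the component)}
    v = R\,i        \qquad \text{(component relation, no input/output direction implied)}
    \sum_k i_k = 0  \qquad \text{(flow variables at a connection node sum to zero)}

Depending on how the component is wired into a circuit, the tool solves the same relation for v or for i at compile time; no signal-flow direction is ever declared.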

For a review of Dymola capabilities, please click here.

Contact Adaptive to find out more about what 3DEXPERIENCE can do for you.

Next week: Dynamic Vehicle Behavior Simulation – A Deep Dive

Ramesh Haldorai is Vice President, Strategic Consulting, 3DEXPERIENCE platform at Dassault Systèmes.

31 Jul 2018
Systems Engineering

Getting Started with Systems Engineering

With the ever-increasing adoption of “smart,” mechatronic products requiring a combination of mechanical, electronics, and computer engineering, the discipline of systems engineering has never been more important. Given the complexity of modern products, systems engineering’s methodical approach to product definition and realization is being practiced by most companies, whether they realize it or not.

The systems engineering process flow is often represented as a “V” diagram, and as with any process there are many variations of it (for a quick sample, do a Web image search of “systems engineering”). Rather than presenting yet another “V” diagram to gain an understanding of how to better manage a systems engineering process, it is simpler to focus on the two main aspects of systems engineering and their main sub-processes:

Product Definition

  • Requirements development
  • Functional breakdown and logical analysis
  • Product design

Product Realization

  • Sub-system integration
  • Verification that requirements are met
  • Validation of product behavior

As mentioned, most companies developing mechatronic products do perform systems engineering sub-processes – some more formally than others.  For instance:

  • With software, it is very common to see solutions that manage detailed requirements and their associated test cases for verification. For teams developing mechanical and electronics hardware, the same need exists, but adoption of formal tools has been less common.
  • Many companies organize their bills-of-material based on a functional breakdown to facilitate the eventual sub-system integration from various engineering disciplines as designs are completed. Unfortunately, this approach fails to properly capture the logical relationship between sub-systems and how they interact.

Another common issue is that companies rely too much on costly physical prototypes, or even worse, early production runs, to validate whether the final user/consumer will accept the product. Companies are not taking enough advantage of modern simulation and product behavior modeling software to validate product performance early in the product development process, as the system functional breakdown and product design occur.

Lastly, even if every one of these sub-processes is pursued, it is often with unique tools and systems that do not allow for the easy flow and exchange of information between product development participants.

Considering the status quo and the issues highlighted above, what is really needed for effective systems engineering is the ability to:

  1. Capture requirements for hardware and software.
  2. Define test cases to verify that requirements are fulfilled.
  3. Model a product’s sub-systems based on function and their logical relationships to one another.
  4. Virtually model product behavior to validate that a system meets end-user expectations.
  5. Perform product design with kinematics and/or virtual simulation to further validate and verify system performance.
  6. Ideally, perform steps 1-5 with tools that provide easy data exchange and traceability between the various stakeholders, from requirements through product design, verification, and validation (a minimal traceability sketch follows below).
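
As a minimal illustration of item 6, a traceability link between requirements and test cases can start as small as the following Python data-model sketch; the classes and fields are invented for illustration and are not any particular tool’s schema:

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class TestCase:
        name: str
        passed: Optional[bool] = None          # None means not yet executed

    @dataclass
    class Requirement:
        req_id: str
        text: str
        tests: List[TestCase] = field(default_factory=list)

        def verified(self) -> bool:
            """Verified only when at least one linked test exists and all have passed."""
            return bool(self.tests) and all(t.passed for t in self.tests)

    # Trace a single requirement through verification.
    r1 = Requirement("REQ-001", "Latch shall release within 50 ms of command.")
    r1.tests.append(TestCase("TC-017", passed=True))
    print(r1.req_id, "verified:", r1.verified())   # REQ-001 verified: True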

The traditional “V” diagram falls short in properly conveying how this systems engineering flow should be executed because most variants do not convey feedback loops and parallel activities. So, rather than being constrained by the “V” shape just because “verification” and “validation” are the goals, the needs listed above can be used to convey the systems engineering target.

Pursuing the full scope of systems engineering processes may seem daunting, but the Dassault Systèmes 3DEXPERIENCE solution provides capabilities for each of these systems engineering needs with increasing levels of capabilities so companies can improve their systems engineering process over time while still having a unified approach.  The following table summarizes the key systems engineering capabilities offered with 3DEXPERIENCE:

Capability                                      3DEXPERIENCE Role
Requirements and Test Case Management           Requirements Manager (TRM)
Functional and Logical Sub-System Definition    Systems Architect (SAK)
Systems Behavioral Modeling                     Dynamic Systems Engineer (SNK) and pre-requisites
Kinematic Product Design                        Mechatronic Systems Engineer (SQK) and pre-requisites
Orchestrate Virtual Simulation                  Multiscale Systems Specialist (MCK) and pre-requisites