
26 Jun 2019
Simulation for Systems Engineers

Requirements Simulation for Systems Engineers

Manage Complexity by Simulating Requirements

Requirements specifications are at the heart of the development process. Though multiple tools are used to manage requirements, it is still common for errors to creep in at the specification stage. Validation of requirements is usually carried out after the design phase, when hardware, software and models are simulated. These HiL, SiL and MiL (Hardware-, Software- and Model-in-the-Loop) simulations are performed successively to validate the embedded systems against the requirements. Missing or incorrect requirements then result in costly design and test iterations.

The dynamic nature of today’s intelligent systems makes it difficult to capture requirements across the various operating conditions and system states. A model-based approach advocates modeling the operating conditions, dynamic states and related functions, and simulating the functional requirements, so that requirement issues are detected and rectified before the detailed design activity. Test vectors reflecting operating conditions and dynamic system states are then generated automatically. STIMULUS, the requirements simulation solution from Dassault Systèmes, adopts this model-based approach and lets you simulate and debug requirements to find missing ones, identify conflicting ones, and more, in exactly the same way a software developer debugs code.

For a video introduction to STIMULUS, please click here.

WHAT IF… you could validate requirements before starting the design?

Model Executable System Specifications

Model formalized textual requirements, state machines and block diagrams in a fully integrated simulation environment. Textual requirements can be allocated at each level of the system’s functional architecture. Describe operating modes and sequences of dynamic systems with state machines.
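
As an illustration of what a formalized textual requirement means in practice, here is a minimal sketch in plain Python (STIMULUS provides its own requirement language and editor; this is not its syntax): the textual requirement "when the brake pedal is pressed, vehicle speed shall decrease within 0.5 s" becomes an executable check over a timed trace. The signal names and the 0.5 s window are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    t: float        # time in seconds
    brake: bool     # brake pedal pressed?
    speed: float    # vehicle speed in km/h

def braking_decelerates(trace, window: float = 0.5) -> bool:
    """Every brake application must be followed by a speed decrease within `window` seconds."""
    for s in trace:
        if not s.brake:
            continue
        later = [x for x in trace if s.t < x.t <= s.t + window]
        if later and not any(x.speed < s.speed for x in later):
            return False    # requirement violated for this brake event
    return True

# Brake applied at t = 1.0 s, speed drops by t = 1.2 s: requirement satisfied.
trace = [Sample(0.0, False, 50.0), Sample(1.0, True, 50.0), Sample(1.2, True, 47.0)]
print(braking_decelerates(trace))   # True
```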

Debug and Test System Specifications

Simulate the complete system specification (requirements, state machines and block diagrams) as a whole and find specification errors before the design phase, adopting an effective requirements- and test-driven development process. STIMULUS generates many execution traces that satisfy the system specification and provides powerful debugging features to analyze requirements simulation results: automatic detection and diagnosis of conflicting and missing requirements, requirements coverage, highlighting of active requirements, signal monitoring, and more.
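
The sketch below illustrates the trace-generation and coverage idea in plain Python; it is not how STIMULUS works internally, and the signal names, requirement texts and thresholds are invented for the example. Each requirement is a trigger/check pair, and the report shows how often each requirement was actually exercised and how often it was violated across the generated traces.

```python
import random

def random_trace(steps: int = 100) -> list:
    """A random drive cycle sampled at 0.1 s: speed, brake pedal and brake light."""
    speed, trace = 0.0, []
    for k in range(steps):
        brake = random.random() < 0.1
        light = brake and random.random() > 0.02   # injected fault: light occasionally stays off
        speed = max(0.0, speed - 8.0 if brake else speed + random.uniform(-1.0, 3.0))
        trace.append({"t": 0.1 * k, "speed": speed, "brake": brake, "light": light})
    return trace

# Each requirement is a (trigger, check) pair: when the requirement applies,
# and what must hold whenever it does.
requirements = {
    "REQ-1: vehicle speed shall never exceed 130 km/h":
        (lambda s: True,       lambda s: s["speed"] <= 130.0),
    "REQ-2: while the brake pedal is pressed, the brake light shall be on":
        (lambda s: s["brake"], lambda s: s["light"]),
}

traces = [random_trace() for _ in range(500)]
for name, (trigger, check) in requirements.items():
    exercised = sum(any(trigger(s) for s in t) for t in traces)
    violated = sum(any(trigger(s) and not check(s) for s in t) for t in traces)
    print(f"{name}: exercised in {exercised}/500 traces, violated in {violated}/500")
```

A requirement that is never exercised by any generated trace points to a coverage gap in the specification, while a requirement that no trace can satisfy together with the others points to a conflict.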

Validate Models in the Loop (MiL)

Once the behavior models and software code are built, validate their behavior against the requirements. Build a test environment in which requirements are turned into test observers, import behavioral models and software executables using the FMI® standard, generate numerous test vectors, run test campaigns, and validate the behavior of the system under various operating scenarios. During test campaigns, compute and report comprehensive metrics on functional coverage of requirements.
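
As a rough illustration of what such a MiL loop can look like, the sketch below uses the open-source FMPy package (one of several FMI-compliant tools, not the STIMULUS environment itself) to run an FMU against a couple of test vectors and evaluate one requirement turned into an observer. The FMU file name, parameter names and output variable are assumptions made for the example.

```python
from fmpy import simulate_fmu   # open-source FMI simulation package

def speed_requirement_observer(result, limit=130.0):
    """Requirement turned into a test observer: vehicle speed shall never exceed `limit`."""
    return bool((result["speed"] <= limit).all())

# One test vector = one operating scenario, expressed as start values of the model.
test_vectors = [
    {"target_speed": 100.0, "road_grade": 0.00},
    {"target_speed": 120.0, "road_grade": 0.05},
]

passed = 0
for vector in test_vectors:
    # 'cruise_controller.fmu' and its variable names are hypothetical placeholders.
    result = simulate_fmu("cruise_controller.fmu",
                          stop_time=60.0,
                          start_values=vector,
                          output=["speed"])
    passed += speed_requirement_observer(result)

print(f"{passed}/{len(test_vectors)} test vectors satisfy the speed requirement")
```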

Validate Control System Models

In most companies, debugging and testing of control systems is time consuming. Most often the focus is on validating the control systems using physical prototypes. However, engineers very often realize that issues crept in during the definition of the desired plant behavior and the related requirements. Many operating states also turn out not to be well covered by requirements and hence are not incorporated into the control models, resulting in rework.

Considering the ever-growing complexity of plant operating requirements, it is best to simulate and test the requirements to ensure validity and coverage before starting the control system design activity. Test vectors for various operating conditions should also be generated automatically from the use-case conditions and reused in the controls design environment. This eliminates the need to create test cases again in the control design environment.

Use cases defined in STIMULUS are used to automatically create test vectors as text files, while the requirements are turned into MATLAB® observers that can be embedded in Simulink® models.

For a video presentation of these capabilities of STIMULUS please click here.

Contact Adaptive to find out more about what 3DEXPERIENCE can do for you.

Ramesh Haldorai is Vice President, Strategic Consulting, 3DEXPERIENCE platform at Dassault Systèmes.

12 Jun 2019

Model-Based Systems Engineering

Introduction

In the 20th century, systems engineering methodology was developed so that a system could be decomposed into multiple sub-systems and each sub-system could be independently engineered, manufactured and serviced. The emphasis was on defining requirement specifications such that each sub-system and its interactions with other sub-systems were clearly defined. This method emphasized upfront planning, analysis and specification; hence the term Requirements-Driven Systems Engineering. In practice, it was always very difficult to specify upfront with a high level of accuracy and to resist changes to specifications during development. By and large this methodology has been inadequate and has led to delayed programs and last-minute surprises, commonly referred to as the requirements-delay-surprise factor!

In the 21st century, iterative modeling and simulation play a crucial role in systems engineering. An operational model is first developed to understand all usage conditions, including the surrounding environment; then system models are built and simulated; finally, component models are developed.

Change is integral to this methodology: requirements, structure and behavior are derived and finalized with the help of the models. In short, the model is the master!

The fidelity of the models is continuously improved during development, and it is possible to substitute models with physical systems, also called Hardware-in-the-Loop (HiL). When the physical systems are assembled, they are simply a twin of the model. Tests conducted on the physical prototype can be continuously correlated against predicted behavior and used to improve the fidelity of the models.

Models are Everywhere

It’s fairly common today for mechanical engineers to develop CAD models and subject them to structural, fluid and thermal simulation. Similarly, a number of models are built by engineers from other disciplines: software engineers use models to specify operating conditions and interactions between systems; control systems developers build block-based models and generate software code for controllers; electrical engineers develop schematics and layouts of their designs; electronics engineers develop high-level logic that is synthesized into physical design; hydraulic engineers define hydraulic circuits. When interdisciplinary work is critical, co-simulations are also performed. For example, the thermal and fluid dynamics aspects are simulated together to understand the performance of the climate control system.

Systems integration nightmare. Since each discipline works on its own models, the first time engineers witness how the systems function together is most often when a physical prototype is finally assembled. It’s not uncommon for physical prototypes to require numerous build-test-fix iterations before they work as intended. The net effect: projects are delayed and quality suffers.

The era of autonomous systems. New types of sensors, complex control algorithms and the integration of on- and off-board systems are the drivers of autonomous capabilities. This leads to a level of software-based functionality and E/E complexity never seen before.

Even though models are used by every engineer, they are siloed by discipline, requiring physical prototypes for integration and validation.


Smart, Safe and Connected

Design, validate, deliver the intelligent vehicle experience

The Smart, Safe and Connected Industry Solution Experience, based on the 3DEXPERIENCE platform, connects mechanical, embedded electronics and software development teams on an end-to-end digital platform and enables them to build a multi-disciplinary virtual prototype right from the early stages of concept design.

Make your vehicle models dynamics-capable. If you are currently building control system models in a signal-flow oriented tool, higher fidelity and more accurate control can be achieved by incorporating co-simulation of dynamic physical systems with Dymola.

Signal-flow oriented control models typically don’t fully incorporate dynamic vibrations caused by road and driving conditions. These vibrations affect driving comfort and safety, and if not incorporated in the models, may lead to issues discovered only later during physical testing. Simulating dynamic behavior under various road and driving conditions helps identify and fix issues in the early phases of product development.
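
To illustrate the co-simulation idea in miniature, the sketch below (plain Python, standing in for a signal-flow controller coupled to a physical plant model over a tool interface such as FMI) steps a PI speed controller and a simple longitudinal vehicle model in lockstep; the plant includes an oscillating road-grade disturbance of the kind a purely signal-flow model would miss. All signal names and parameter values are invented for the example.

```python
import math

def controller(error, integral, dt, kp=400.0, ki=40.0):
    """Discrete PI speed controller (the signal-flow part)."""
    integral += error * dt
    return kp * error + ki * integral, integral

def plant_step(force, speed, t, dt, mass=1200.0):
    """Longitudinal vehicle dynamics with aerodynamic drag and an oscillating road grade."""
    drag = 0.5 * 1.2 * 0.7 * speed ** 2             # 0.5 * rho * Cd*A * v^2
    grade = mass * 9.81 * 0.03 * math.sin(0.5 * t)  # +/- 3 % grade, slowly varying
    accel = (force - drag - grade) / mass
    return speed + accel * dt                        # explicit Euler step

target, speed, integral, dt = 25.0, 0.0, 0.0, 0.01   # 25 m/s setpoint, 10 ms step
for k in range(6000):                                # 60 s of lockstep co-simulation
    t = k * dt
    force, integral = controller(target - speed, integral, dt)  # controller step
    speed = plant_step(force, speed, t, dt)                     # plant step
print(f"speed after 60 s: {speed:.2f} m/s (setpoint {target} m/s)")
```

In a tool chain, the plant half of this loop would be a high-fidelity Dymola model exchanged or co-simulated via FMI, so the controller is exercised against realistic dynamics instead of an idealized signal-flow approximation.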

Dynamic Systems Modeling and Simulation

The design and engineering of autonomous systems requires a new model-based systems approach – it needs to be multi-disciplinary from the get-go! Dymola and the VeSyMA libraries embody decades of experience in dynamic systems modeling.

Dymola is a physical modeling and simulation environment for model-based design of dynamic systems. Dymola adopts the Modelica language and is accompanied by a portfolio of multi-disciplinary libraries covering the mechanical, electrical, control, thermal, pneumatic, hydraulic, powertrain, thermodynamics, vehicle dynamics and air-conditioning domains. Library component models are coupled together to form a single complete model of the system. You can extend existing models or create custom models, improving reuse across projects.

Vehicle Systems Modeling and Analysis (VeSyMA™) is a complete set of libraries for vehicle modeling and simulation. It includes engine, powertrain and suspension libraries that work in conjunction with the Modelica Standard Library. Battery, electrified and hybrid powertrain libraries are available as well.

Model Import and Export. You can import models directly from Simulink® into Dymola. Dymola also supports the FMI Standard 1.0 and 2.0 for both model exchange and co-simulation.

Real-time Simulation. Dymola supports real-time code generation for simulation on a wide range of HiL platforms from dSPACE®. Co-simulation of Dymola and Simulink generated code has been tested and verified for compatibility with multiple combinations of dSPACE and MATLAB® releases.

Driver-in-the-Loop Simulation. Dassault Systèmes’ partner Claytex integrates Dymola with driving simulation software. Libraries built by Claytex include a car template and support for LiDAR, radar and ultrasound sensors that work with the simulator. Before exporting the model, simulations can also be run in a test environment within Dymola.

Systems modeling case-study: Energy management of trucks at Arrival. www.arrival.com

Source: energyengineering magazine. Low Carbon Vehicle special edition – Summer 2016

The electrification of trucks calls for efficient energy management while maintaining vehicle attributes at the same level as a conventional powertrain system. This requires detailed studies of vehicle system interactions in order to understand which vehicle systems dominate these attributes. An upfront modeling approach is vital to capture these attributes before developing the physical prototype.

Dymola has a multi-physics modeling capability that is very useful for developing these complex interactions at both vehicle system level and sub-system level, and for pinpointing the dominant systems or components. All of these vehicle systems and subsystems can be modeled within the same workspace at the top level and then cascaded to lower levels to create a series of libraries that can be reused for different vehicle plant model architectures. This process is important for system modeling, particularly during the development phase, giving engineers different options to optimize the system architecture for energy management and for improving other vehicle attributes.

The process minimizes design and product risks by not committing tooling costs for the prototype build, as the majority of validation activities can be simulated to produce results that closely represent the physical system and sub-system components, which also reduces development lag time. Another advantage of system modeling is the ability to perform component sizing optimization for energy management in order to improve the vehicle range.

Dynamic Physical Systems Modeling - A Checklist

If you want to incorporate dynamic behavior into your vehicle models, here are some of the key capabilities of the modeling environment to consider.

Breadth of Library Models: Are there pre-built libraries for the sub-systems that are included in your system? If your systems are multi-disciplinary in nature, look for libraries across multiple domains containing models for mechanical, electrical, control, thermal, pneumatic, hydraulic, powertrain, thermodynamics, vehicle dynamics, air conditioning, etc.

Object Oriented: Can you directly instantiate the library models and build your systems with ease? Typically, look for a drag-and-drop interface. Also look for the ability to abstract subsystems into a single model. If necessary, can you modify the library models and create your own derivatives? Model management capabilities are a key requirement if you are working in a team.

Equation Based: Can the dynamic behavior of systems be described by differential and algebraic equations? Does it support the concept of flow and across variables?

Acausal: Does the environment support the definition of equations in declarative form, without regard to computation order? This reduces the effort needed to describe behavior compared with procedural languages like C and with block-based tools, where signal-flow direction must be considered.
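
To make the flow/across and acausal concepts concrete, here is a small worked illustration using the standard electrical convention (a resistor with two pins, p and n); it is a generic textbook formulation, not taken from any particular library:

```latex
\begin{align*}
  v &= v_p - v_n   && \text{across variable: voltage difference between the pins} \\
  0 &= i_p + i_n   && \text{flow variable: current is conserved through the component} \\
  v &= R\, i_p     && \text{constitutive equation (Ohm's law), stated without a computation order}
\end{align*}
```

At a connection point, acausal tools equate the across variables of all connected pins and sum their flow variables to zero; the resulting system of differential-algebraic equations is then sorted and solved by the tool, so the modeler never assigns input/output directions by hand.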

For a review of Dymola capabilities, please click here.

Contact Adaptive to find out more about what 3DEXPERIENCE can do for you.

Next week: Dynamic Vehicle Behavior Simulation – A Deep Dive

Ramesh Haldorai is Vice President, Strategic Consulting, 3DEXPERIENCE platform at Dassault Systèmes.

30 Jun 2017

Part 1: Understanding PLM, PDM, and More

We’ve all heard the buzzwords: digital transformation, product lifecycle, product data, PLM, PDM, systems engineering, model-based engineering and so on. It can be confusing trying to figure out which technology or trend will have the biggest impact on the business. It’s also easy to imagine you’re missing out on a new, hot trend. But before we worry about whether we’re ahead of the curve or behind it, let’s be clear about exactly what we’re talking about.

Defining Terms

Product Lifecycle Management (PLM) as a term has been around since the 1950s—it is not a new concept, but recently, more organizations are looking at this process as a place for improvement. A product lifecycle is simply the stages a product goes through from the initial concept to end of life—whether that’s a complex manufactured product like a rocket or a simpler product such as a house or a winter coat.

Product lifecycle management is the set of processes and/or procedures used to manage all of the product’s information throughout the lifecycle—from inception and planning; to design, engineering, and manufacture; to service and disposal.

At Adaptive, we define the product lifecycle as starting with the digital design process and continuing into the physical side of manufacturing for prototyping, testing, first article inspection, and quality control.

Is PDM also PLM?

But what about product data management (PDM)? Where does that fit?

As the words imply, PDM involves managing the information about a product, from models and drawings to bills of materials (BOMs) and more. But PDM shouldn’t be equated with PLM. PDM is about managing all the data around the development of a product – product specifications, version control and more. PLM calls on the data in PDM to manage the entire digital design process.

Systems Engineering

Systems engineering is also sometimes confused with PLM, but it focuses on how to design and manage systems (which almost always include products). It is the overall organization and oversight of a system, as well as the people and processes that ensure all aspects of a system are considered and integrated into a whole. PLM, which focuses on everything about the product, can sometimes help automate design processes related to systems engineering. But generally speaking, systems engineering has a broader scope, as it also includes the coordination of teams, logistics, and other responsibilities outside the product stream.

Model-based Systems Engineering

Within systems engineering is the concept of model-based systems engineering (MBSE). MBSE establishes a “model” to analyze and document key aspects of the systems engineering life cycle, including system requirements, analysis, design, and verification and validation activities. Similar to PLM, it is intended to improve communication within engineering teams and with other stakeholders. It provides early identification of requirements issues, improves the specification of requirements allocated to hardware and software (resulting in fewer errors during integration and testing), provides requirements traceability, reduces project risk, lowers costs, and more.

Digital Transformation

The idea behind digital transformation is to establish a process for organizations to track the entire cross-functional cycle of product development, capturing and integrating key data points to establish traceability and manage how a product is conceived, created, tested, and brought to market. In essence, the data trail creates a “digital thread” that captures the evolution of that product. Of course, this doesn’t happen all at once; it needs to be taken in discrete steps that build success upon success. In some cases a digital thread will extend beyond the company walls into the supply chain, which is the ultimate goal. Many organizations are not quite ready for that yet, and for now it is more talk than practice. Still, the concept of establishing a digital thread goes hand in hand with PLM and systems engineering strategies. The transformation happens when an organization takes a more collaborative approach, with everyone working off the same data and making better business decisions. We will be writing more on this topic later.

In Conclusion

As you can see, product development covers a broad spectrum in an enterprise, touching many functional departments as work is completed across the organization. On the surface, a business problem may not appear to be a PLM issue, but in many cases, because of collaboration needs, product change management, and documentation tracking, it quickly becomes something a PLM strategy can address.

Stay tuned for our next post where we will dive a little deeper into Understanding PLM Fundamentals…