In the "conventional design" approach, a design is improved by evaluating its "response" and making design changes based on experience or intuition. This approach does not always lead to the desired result, that of a ‘best’ design, since the design objectives are often in conflict. It is therefore not always clear how to change the design to achieve the best compromise of these objectives. A systematic approach can be obtained by using an inverse process of first specifying the criteria and then computing the ‘best’ design according to a formulation. The improvement procedure that incorporates design criteria into a mathematical framework is referred to as Design Optimization This procedure is often iterative in nature and therefore requires multiple simulations.
No two products of the same design will be identical in performance, nor will a product perform exactly as designed or analyzed. A design is typically subjected to input variations, namely Structural variation and Environmental variation, that cause a variation in its response and may lead to undesirable behavior or failure. In this case a Probabilistic Analysis, using multiple simulations, is required to assess the effect of the input variation on the response variation and to determine the probability of failure.
To run and control multiple analyses simultaneously, LS-OPT provides a simulation environment that allows distribution of simulation jobs across multiple processors or networked computers. Each job running in parallel consists of the simulation, data extraction and disk cleanup. For LS-DYNA explicit dynamic analyses, the time remaining or performance criteria such as velocity or energy are used to monitor job progress.
The graphical preprocessor LS-OPTui facilitates definition of the design input and the creation of a command file while the postprocessor provides output such as approximation accuracy, optimization convergence, tradeoff curves, anthill plots and the relative importance of design variables. The postprocessor also links to LS-PrePost to allow the viewing of the model representing a chosen simulation point.
Typical applications of LS-OPT include:
Future versions of LS-OPT will combine optimization and probabilistic analysis features in Reliability-Based Design Optimization.
The Optimization capability in LS-OPT is based on Response Surface Methodology and Design of Experiments. The D-Optimality Criterion is used for the efficient distribution of sampling points and the effective generalization of the design response. A Successive Response Surface Method ensures convergence of the design optimization process. Neural Networks provide an updateable global approximation that is gradually built up and refined locally during the iterative process. A Space Filling sampling scheme is used to update the sampling set by maximizing the minimum distance among new and existing sampling points.
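The space-filling update step can be sketched as a greedy maximin selection. The function names, the candidate-pool size and the unit-hypercube normalization below are illustrative assumptions, not the LS-OPT implementation:

```python
import math
import random

def min_dist(p, pts):
    """Smallest Euclidean distance from point p to any point in pts."""
    return min(math.dist(p, q) for q in pts)

def space_filling_augment(existing, n_new, n_candidates=500, seed=0):
    """Greedy maximin augmentation: each new point is the candidate (from a
    random pool in the unit hypercube) farthest from all points chosen so far,
    i.e. it maximizes the minimum distance to existing + new sampling points."""
    rng = random.Random(seed)
    dim = len(existing[0])
    pts = list(existing)
    for _ in range(n_new):
        cands = [tuple(rng.random() for _ in range(dim))
                 for _ in range(n_candidates)]
        best = max(cands, key=lambda c: min_dist(c, pts))
        pts.append(best)
    return pts[len(existing):]

# Augment an existing 2-D sampling set with 3 well-spread points
new_points = space_filling_augment([(0.5, 0.5)], n_new=3)
```

The greedy candidate-pool approach trades optimality for simplicity: a true maximin design would require a global search over all point positions.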
LS-OPT allows the combination of multiple disciplines and/or cases for the improvement of a unique design. Multiple criteria can be specified and analysis results can be combined arbitrarily using C or FORTRAN type mathematical expressions.
Response surface methodology (RSM) is a collection of statistical and mathematical techniques useful for developing, improving and optimizing the design process. RSM encompasses a point selection method (also referred to as Design of Experiments), approximation methods and design optimization to determine optimal settings of the design dimensions. RSM has important applications in the design, development, and formulation of new products, as well as in the improvement of existing product designs.
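As a minimal illustration of a response surface, the sketch below fits a quadratic polynomial to a handful of sampled responses by solving the least-squares normal equations and then takes the minimizer of the fitted surface. The stand-in response function and sample points are hypothetical, chosen only to keep the example self-contained:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for k in range(col, n + 1):
                M[r][k] -= f * M[col][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit_quadratic(xs, ys):
    """Least-squares fit of y ~ c0 + c1*x + c2*x**2 via the normal equations."""
    A = [[sum(x ** (i + j) for x in xs) for j in range(3)] for i in range(3)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(3)]
    return solve(A, b)

# Hypothetical stand-in for an expensive simulation: response of variable x
f = lambda x: (x - 2.0) ** 2 + 1.0
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
c0, c1, c2 = fit_quadratic(xs, [f(x) for x in xs])
x_opt = -c1 / (2.0 * c2)  # minimizer of the fitted response surface
```

Once the surface is fitted, the optimum is found on the cheap surrogate rather than by further simulation, which is the point of the method.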
In LS-OPT, Response Surface Methodology is used both in Optimization and Probabilistic Analysis as a means to reduce the number of simulations. In the latter procedure, RSM is also used to distinguish deterministic effects from random effects.
LS-OPT enables the investigation of stochastic effects using Monte Carlo simulation involving either direct FE Analysis or analysis of surrogate models such as Response Surface Methodology or neural networks. As an input distribution, any of a series of statistical distributions such as Normal, Uniform, Beta, Weibull or User-defined can be specified. Latin Hypercube sampling provides an efficient way of implementing the input distribution. Histograms and influence plots are available through the postprocessor (Version 2.2).
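A minimal sketch of Latin Hypercube sampling driving a Monte Carlo study on a surrogate: each dimension of [0,1) is split into equal strata with exactly one sample per stratum, and the uniform samples are mapped through inverse CDFs onto the input distributions. The distributions, the stress surrogate and the failure threshold are illustrative assumptions:

```python
import random
from statistics import NormalDist

def latin_hypercube(n, dim, rng):
    """Divide [0,1) into n equal strata per dimension, draw one uniform point
    in each stratum, and shuffle the strata independently per dimension."""
    cols = []
    for _ in range(dim):
        perm = list(range(n))
        rng.shuffle(perm)
        cols.append([(p + rng.random()) / n for p in perm])
    return list(zip(*cols))

rng = random.Random(42)
u = latin_hypercube(1000, 2, rng)

# Map the uniform samples onto Normal input distributions (illustrative values)
thickness = NormalDist(mu=2.0, sigma=0.1)
load = NormalDist(mu=50.0, sigma=5.0)

# Hypothetical surrogate response: stress ~ load / thickness
stresses = [load.inv_cdf(ul) / thickness.inv_cdf(ut) for ut, ul in u]
p_fail = sum(s > 30.0 for s in stresses) / len(stresses)  # P(stress > limit)
```

Compared with plain random sampling, the stratification covers each marginal distribution evenly, so statistics of the response converge with fewer samples.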
Instability/Noise/Outlier Investigations (Version 2.2)
Some structural problems may not be well-behaved, i.e. a small change in an input parameter may cause a large change in the results.
LS-OPT computes various statistics of the displacement and history data for viewing in the LS-DYNA FE model postprocessor (LS-PrePost). The methodology differentiates between changes in results due to design variable changes and those due to structural instabilities (buckling) and numerical instabilities (lack of convergence or round-off). Viewing these results in LS-PrePost allows the engineer to pinpoint the source of instability for any chosen response and therefore to address instabilities which adversely affect predictability of the results.
A tradeoff study enables the designer to interactively study the effect of changes in the design constraints on the optimum design. For example, the safety factor for the maximum stress in a beam is changed, and the designer wants to know how this change affects the optimal thickness and displacement of the beam.
For each response, the relative importance of all variables can be viewed on a bar chart together with their confidence intervals. This feature enables the user to identify variables of lesser importance that can be removed from the optimization, thereby contributing to time saving while having little effect on the final result.
Design of Experiments: A point selection method for determining the number and locations of sampling points in the Design Space. A simulation is done at each sampling point.
Design Optimization: The process of setting the design variables, typically the dimensions, of a product to minimize or maximize the value of its Response. A more general form of optimization includes specified limits on other responses (constrained optimization).
Probabilistic Analysis: The analysis of a set of different designs with a specified distribution in order to determine the characteristics (such as the mean and standard deviation) of the Response distribution.
Design Space: The region between the lower and upper limit for each of the design variables. These are specified to prevent the occurrence of designs with extreme or nonsensical dimensions (such as negative thicknesses).
Design Variable: An independent variable or dimension which forms part of the description of a design. Typical design variables are thickness dimensions, geometrical dimensions or values of material constants.
D-Optimality Criterion: A criterion that determines how well the coefficients of the design Approximation are estimated. Adjusting the locations of the sampling points to maximize this criterion maximizes the confidence in the coefficients of the Approximation model.
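The effect of the criterion can be illustrated with the simplest linear model y = c0 + c1*x, where it reduces to maximizing det(X^T X) of the design matrix X. The two point sets below are hypothetical:

```python
def det(M):
    """Determinant by Laplace expansion (fine for tiny matrices)."""
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(n))

def d_criterion(xs):
    """det(X^T X) for the linear basis [1, x]: larger values mean better-
    estimated coefficients (a smaller confidence region)."""
    X = [[1.0, x] for x in xs]
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(2)]
           for i in range(2)]
    return det(XtX)

clustered = [0.4, 0.5, 0.6]  # sampling points bunched mid-range
spread = [0.0, 0.5, 1.0]     # sampling points pushed toward the bounds
# Spreading the points increases det(X^T X) and hence the confidence
# in the fitted coefficients: d_criterion(spread) > d_criterion(clustered)
```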
Robust: A robust product performs consistently on target and is relatively insensitive to parameters that are difficult to control. A robust design minimizes the noise transmitted by the noise variables.
Response Noise: The random component of a response variation that can be caused by instability of the structure (such as buckling), numerical roundoff during analysis or modeling effects such as Finite Element meshing or lack of convergence during analysis.
Successive Response Surface Method: The successive response surface method is an iterative method with a scheme that assures the convergence of the optimization process. The scheme determines the location and size of each successive Region of interest in the Design Space, builds a response surface in this region, conducts a Design Optimization and checks the tolerances on the Responses and design variables for termination. When using neural networks instead of polynomials as a Surrogate model, the Approximation is updated rather than newly constructed in each iteration. Consequently, the final approximation has a global representation that can be used for optimization, tradeoff studies or probabilistic analysis.
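A one-dimensional sketch of the successive scheme, with an illustrative shrink factor and a three-point interpolating quadratic standing in for the response surface (this is a toy version, not the LS-OPT algorithm itself):

```python
def srsm_minimize(f, lo, hi, iters=15, shrink=0.5):
    """1-D sketch: per iteration, evaluate three points spanning the region of
    interest, fit the interpolating quadratic, move the region center to its
    (clamped) minimizer, then shrink the region and repeat."""
    center, half = (lo + hi) / 2.0, (hi - lo) / 2.0
    for _ in range(iters):
        a, m, b = center - half, center, center + half
        fa, fm, fb = f(a), f(m), f(b)
        denom = fa - 2.0 * fm + fb
        if denom > 0.0:
            # Convex surrogate: jump to its vertex, clamped to the region
            center = min(max(m + 0.5 * half * (fa - fb) / denom, a), b)
        else:
            # Concave or flat surrogate: move toward the best sampled point
            center = min((a, m, b), key=f)
        half *= shrink  # shrink the region of interest for the next iteration
    return center

# Minimize a nonquadratic response over the design space [0, 10]
x_star = srsm_minimize(lambda x: (x - 3.0) ** 2 + 0.5 * abs(x - 3.0), 0.0, 10.0)
```

The shrinking region is what allows a low-order surrogate to converge on a response it cannot represent globally.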
System Identification: The determination of system parameters such as material constants to minimize the difference between computational responses and experimental results. The purpose is to identify the system parameters of a model by using experimental results of a physical experiment.
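For a single material constant and a linear model, the least-squares identification has a closed form; the tensile-test numbers below are hypothetical, generated only to make the sketch runnable:

```python
def identify_modulus(strains, stresses):
    """Least-squares identification of a single constant E in the linear
    model stress = E * strain (closed form for this one-parameter case)."""
    num = sum(e * s for e, s in zip(strains, stresses))
    den = sum(e * e for e in strains)
    return num / den

# Hypothetical tensile-test measurements (units illustrative, e.g. strain, GPa)
strains = [0.001, 0.002, 0.003, 0.004]
stresses = [0.211, 0.419, 0.628, 0.842]
E = identify_modulus(strains, stresses)  # close to 210 for this data
```

With several parameters or a nonlinear model, the same residual is minimized numerically, with each trial parameter set requiring a fresh simulation.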