
Industry Sector

RTD Thematic Area Product and System Optimization

Date May 2002

Deliverable Nr 3

The use of Design of Experiments (DOE) and Response Surface Analysis (RSA) in PSO
Prof. Carlo Poloni, University of Trieste; Dr. Valentino Pediroda, University of Trieste; Dr. Alberto Clarich, University of Trieste; Prof. Grant Steven, University of Durham, UK

Summary: A general overview of how and when DOE is applicable is given here. The document is intended as a first attempt to build synthetic guidelines for the best usage of PSO techniques. This report summarises the main discussion points related to RTD2 on Product and System Optimisation (PSO) raised at the FENET Workshop held in Copenhagen on 27-28 February 2002. The workshop was focussed on Design of Experiments (DOE) and Response Surface Analysis (RSA) methodologies.

FENET THEMATIC NETWORK COMPETITIVE AND SUSTAINABLE GROWTH (GROWTH) PROGRAMME

1 Introduction
There is no single solution to design optimisation tasks. Many techniques are available, and most of them have pros and cons. While in the well-established world of simulation the guiding principle is clear (build a model able to reproduce numerically the physics of a phenomenon), in the PSO arena the driving force should become "improve your available design". There is a major difference between the two statements. In the first case the physics is assumed known and the model should represent reality within acceptable accuracy, i.e. the target is known and the method to reach it is subject to development. In the second case the model is assumed to be valid and improvements to the product performance are the target, i.e. the method is known and the target to be reached is not defined. It is therefore clear that the best optimisation methodology cannot be defined a priori, but by classifying the problems being addressed it is possible to derive rules of thumb that suggest to the engineer which strategy is most likely to give performance improvements for a given optimisation problem. In this report the qualities and drawbacks of Design of Experiments (DOE) and Response Surface Analysis (RSA) methodologies are briefly analysed. A synthetic applicability table is given, and a preliminary classification of benchmarks is described.

2 PSO Workshop Description


The workshop was organised around a keynote lecture (Prof. Toropov) illustrating the basic principles of the methodologies, with software vendors invited to illustrate how DOE and RSA were applied to product and system optimisation in industrial applications. Each presentation was organised to provide: 1) an illustration of DOE and RSA as applied to the design of a structure, with FEA as the analysis engine; 2) an illustration of DOE and RSA as applied to a system (for example, die casting) where analysis is also involved; 3) a suggestion from the company/organisation of some form of validation case that could be adopted as a "benchmark" situation. On the second day a survey was presented by Prof. Toropov and an open discussion followed.

List of presentations:
- Prof. Vassili Toropov, University of Bradford: technical description of what is involved in Design of Experiments (DOE) and the Response Surface Analysis (RSA) that follows
- Pierre Guisset, LMS, Belgium: "Aspects of Optimization and Parameter Studies: Case Studies from the Automotive Industry"
- Stefano Odorizzi, Enginsoft, Italy: "modeFRONTIER: Open System for Collaborative Design Optimisation Using Pareto Frontiers"
- Hans Sippel, MSC: "Discrete Variable Optimization Using DOE in the Automotive Industry"
- Patrick Morelle, Samtech
- Reinhard Helfrich, INTES
- Johannes Will, CADFEM/DYNARDO
- Prof. Vassili Toropov: "What's good about DOE and RSA, what's not so good, and what we can all do to make it better!"

3 Classification of DOE and RSA

Design of Experiments (DOE)
Several DOE methodologies exist; one possible classification considers:
- Random and quasi-random sampling (random points are selected in the design space)
- Factorial DOE (systematic sampling on pre-defined variable intervals)
- Orthogonal arrays (sampling is done according to the orthogonal arrays of Taguchi or Fisher)
- Adaptive sampling (DOE and RSA are tightly connected, and new points are selected using the available dataset)
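As a minimal sketch of the first two sampling families above, the following Python fragment generates a random DOE and a full-factorial DOE for a hypothetical two-variable design space on [0, 1]; the variable count, bounds, and number of levels are illustrative assumptions, not values from the report.

```python
import itertools
import numpy as np

rng = np.random.default_rng(42)

# Two design variables, each on the interval [0, 1] (hypothetical bounds).
n_vars, n_samples = 2, 9

# Random sampling: points drawn uniformly over the design space.
random_doe = rng.random((n_samples, n_vars))

# Full-factorial sampling: 3 levels per variable -> 3^2 = 9 systematic points.
levels = np.linspace(0.0, 1.0, 3)
factorial_doe = np.array(list(itertools.product(levels, repeat=n_vars)))

print(factorial_doe.shape)  # (9, 2)
```

The factorial plan covers the variable intervals systematically but grows exponentially with the number of variables, which is one reason random and adaptive schemes are preferred for many-variable problems.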

Response Surface Analysis (RSA)
RSA methodologies try to interpolate the available data in order to predict, locally or globally, the correlation between variables and objectives. A possible classification would consider:

RSA with linear coefficients:
o Polynomial
o Taylor series, Fourier series
o Linear Kriging

RSA with non-linear coefficients:
o Constitutive laws
o Neural networks
o Non-linear Kriging
o GP (Genetic Programming)
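A minimal example of the linear-in-coefficients family is a least-squares polynomial fit; the data below is a hypothetical stand-in for sampled simulation responses, not an example from the report.

```python
import numpy as np

# Sampled designs (x) and an "expensive" response (y); here a known
# quadratic serves as a hypothetical stand-in for a simulation output.
x = np.linspace(-2.0, 2.0, 9)
y = 1.0 + 2.0 * x + 3.0 * x**2

# Linear-in-coefficients RSA: fit a second-order polynomial by least squares.
coeffs = np.polyfit(x, y, deg=2)        # highest power first
surrogate = np.poly1d(coeffs)

# The surrogate now predicts the response at unsampled points.
prediction = float(surrogate(0.5))      # true value: 1 + 2*0.5 + 3*0.25 = 2.75
print(round(prediction, 3))
```

"Linear coefficients" refers to the model being linear in its fitted parameters, so the fit reduces to a linear least-squares solve; neural networks or non-linear Kriging instead require iterative, non-linear parameter estimation.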


4 Best practice guidelines for DOE and RSA

What is good about DoE & RSA for design optimisation
RSM attempts to substitute an original design optimisation problem with computationally expensive functions by a sequence of much simpler problems in which all the functions are approximated by response surfaces. This makes it possible to solve, within a reasonable amount of time, real-life design optimisation problems that are not solvable by other means.
- The basics of the Response Surface Methodology (RSM), which includes DoE and RSA, are very simple; any engineer can understand the main principles. This is one of the reasons for its popularity.
- Approximations based on the response surface methodology:
  o do not need design sensitivities, but can use them if available;
  o are insensitive to numerical noise;
  o can be efficiently parallelised.
- RSM with analytical models can be used more widely in the design process, even without formal optimisation. Visualisation of the design space becomes possible.
- Global approximations are very beneficial for multiobjective problems: they are built once, but the problem can then be reformulated, say, with objective and constraint functions swapped, or the weightings on individual objectives changed.
- RSM-based optimisation is beneficial for robust design optimisation, finding a plateau rather than a peak (when the phenomenon is weakly non-linear). Monte-Carlo simulation can be run on the constructed response surface.
- Discrete optimisation with RSM is possible, but this depends on the range of discrete variables. Example: it is fine if x = (3, 4, 5, ., 50), which can be considered a specific case of noisy output, but it wouldn't work when x = (3, 4, 5).

What's not so good about DoE & RSA for design optimisation
- In the case of global approximations, the use of RSM leads to an excessive number of response function evaluations when the number of design variables is large.
- A response surface model is not a substitute for a proper physics-based model: it can interpolate reasonably well, but the quality of extrapolation beyond the available data can be poor.
- RSA with non-linear coefficients (neural networks, Gaussian processes, constitutive laws) may be difficult to use and/or understand.
- Risk of over-estimation of RSA capability: it is a method to exploit available information, not to search for new information!
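The extrapolation warning is easy to demonstrate numerically; the exponential test function and the sampling range below are hypothetical choices, not from the report.

```python
import numpy as np

# Data sampled only on [0, 2]; the "true" response is exp(x) (hypothetical).
x_train = np.linspace(0.0, 2.0, 8)
y_train = np.exp(x_train)

# A cubic polynomial response surface fits well inside the sampled range...
poly = np.poly1d(np.polyfit(x_train, y_train, deg=3))

inside_err = abs(float(poly(1.0)) - np.exp(1.0))    # interpolation
outside_err = abs(float(poly(4.0)) - np.exp(4.0))   # extrapolation beyond data

# ...but the extrapolation error is far larger than the interpolation error.
print(inside_err, outside_err)
```

The surrogate has no knowledge of the physics outside the sampled region, so its behaviour there is an artefact of the chosen basis, which is why new candidate designs found far from the dataset should always be re-verified with the full analysis.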

APPLICABILITY TABLE: N = not applicable, W = weakly applicable, A = applicable, R = recommended


DOE
Problem type            Random   Factorial   Orthogonal Array   Adaptive DOE
Real variables            A         A              A                 A
Integer variables         A         W              W                 W
Weakly non-linear         A         R              R                 A
Highly non-linear         R         W              W                 R
Few variables (<10)       A         R              R                 A
Many variables (>10)      R         W              W                 A

RSA
Problem type            Polynomial   Taylor   Lin. Kriging   Neural Net   Non-lin. Kriging   Genetic Programming
Real variables              R           R          A             A               A                  A
Integer variables           N           N          W             A               A                  A
Weakly non-linear           R           A          A             A               R                  A
Highly non-linear           A           W          A             R               R                  R
Few variables (<10)         R           R          A             A               R                  A
Many variables (>10)       W/N         W/N         A             A               W                 W/N

5 Benchmark definition
There are some difficulties in defining benchmarks for design optimisation. A simple mathematical function would not be representative of the real situation, while a real design problem would be too specific. It is therefore recommended that two different types of benchmark are defined:
1. mathematical formulation with complexity surrogate options
   o a problem with analytical functions, but with:
     - artificial noise added (say, 10%);
     - constraint functions that can only be evaluated within a ±10% band along the boundary of the feasible region;
     - occasional function evaluation failures, e.g. every 7th evaluation fails.
2. practical applications
   o structural optimisation
     - static analysis
     - modal/vibration analysis
     - crash analysis
   o fluid dynamic optimisation
     - integral quantities
     - distributed quantities
   o process optimisation
     - process parameter tuning
     - geometrical optimisation
   o multidisciplinary optimisation
     - combination of the above cases in a single product
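A benchmark of the first type could be sketched as below: an analytical objective dressed up with the 10% noise and the every-7th-call failure described above. The quadratic objective itself is a hypothetical placeholder.

```python
import numpy as np

rng = np.random.default_rng(1)
call_count = 0

def benchmark(x):
    """Analytical test function mimicking a real simulation: roughly 10%
    multiplicative noise, and every 7th evaluation fails outright."""
    global call_count
    call_count += 1
    if call_count % 7 == 0:
        raise RuntimeError("simulated analysis failure")
    true_value = (x[0] - 1.0) ** 2 + (x[1] + 0.5) ** 2   # hypothetical objective
    noise = 1.0 + 0.10 * rng.standard_normal()
    return true_value * noise

# Any optimiser driving this function must tolerate both noise and failures.
results = []
for _ in range(14):
    try:
        results.append(benchmark(np.array([0.0, 0.0])))
    except RuntimeError:
        results.append(None)

print(results.count(None))  # 2 failures out of 14 calls
```

Because the underlying function is analytical, the true optimum is known exactly, so the benchmark can measure how much accuracy each DOE/RSA strategy loses to noise and missing evaluations.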
