
BASICS OF TECHNOLOGY


Meta-modeling with modeFRONTIER: Advantages and Perspectives

The progress in finite element methods (FEM) and high performance computing offers engineers accurate and reliable virtual environments in which to explore many possible configurations. On the other hand, and at the same time, users' requests constantly increase, going even beyond what exhaustive computation can cover.

Interpolation and regression methods for computer aided engineering
In real case applications, it is not always possible to reduce the complexity of the problem and to obtain a model that can be solved quickly. Usually, every single simulation can take hours or even days. In such cases, the time frame required to run a single analysis prohibits running more than a few simulations, hence other, smarter approaches are needed.

Engineers may consider and apply a Design of Experiment (DOE) technique to perform a reduced number of calculations. These well-distributed results can be subsequently used by the engineers to create a surface which interpolates these points. This surface represents a meta-model of the original problem and can be used to perform the optimization without computing any further analyses.
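
modeFRONTIER drives this workflow from its graphical environment; purely as a language-neutral illustration of the same idea, the short Python sketch below samples a space-filling DOE, fits an interpolating surface through the results, and optimizes on that surface. Everything in it is an assumption made for illustration (the expensive_simulation function, the bounds, the RBF kernel, SciPy 1.7+); it is not modeFRONTIER code.

```python
import numpy as np
from scipy.stats import qmc
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

def expensive_simulation(x):
    # Stand-in for an hours-long solver run: a smooth two-variable test response.
    return (x[0] - 0.3) ** 2 + 2.0 * (x[1] - 0.7) ** 2 + 0.1 * np.sin(8.0 * x[0])

# 1) Space-filling DOE: a small, well-distributed set of training designs.
lower, upper = [0.0, 0.0], [1.0, 1.0]
sampler = qmc.LatinHypercube(d=2, seed=42)
X_train = qmc.scale(sampler.random(n=20), lower, upper)
y_train = np.array([expensive_simulation(x) for x in X_train])

# 2) Interpolating surface through the DOE points: the meta-model.
surrogate = RBFInterpolator(X_train, y_train, kernel="thin_plate_spline")

# 3) Optimize on the cheap surrogate instead of the real solver.
result = minimize(lambda x: surrogate(x.reshape(1, -1))[0],
                  x0=np.array([0.5, 0.5]), bounds=list(zip(lower, upper)))
print("predicted optimum:", result.x, "surrogate value:", result.fun)
```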

The use of mathematical and statistical tools to approximate, analyze and simulate complex real world systems is widely applied in many scientific domains. These types of interpolation and regression methodologies are now becoming common in engineering as well, where they are also known as Response Surface Methods (RSMs). RSMs are indeed becoming very popular, as they offer a surrogate model with a second generation of improvements in speed and accuracy in computer aided engineering.

Constructing a useful meta-model starting from a reduced number of simulations is by no means a trivial task. Mathematical and physical soundness, computational costs and prediction errors are not the only points to be taken into account when developing meta-models. The ergonomics of the software also has to be considered, in a wide sense: engineers would like to grasp the general trends in the phenomena, especially when the behavior is nonlinear. Moreover, engineers would like to re-use the experience they accumulate, in order to spread the possible advantages across different projects. When using meta-models, engineers should always keep in mind that this instrument allows a faster analysis than complex engineering models; however, interpolation and extrapolation introduce a new element of error that must be managed carefully.

Figure 1: modeFRONTIER panel which helps engineers to easily formulate the problem, design the objectives and constraints, and identify the input and output parameters.
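
As a small, self-contained illustration of that last caveat (an addition, not taken from the original article), the following sketch fits a low-order polynomial meta-model to samples drawn only on [0, 1] and compares its error inside and outside the sampled range; the test function and evaluation points are arbitrary choices.

```python
import numpy as np

# True response, sampled only on [0, 1].
def response(x):
    return np.exp(-x) * np.sin(4.0 * x)

x_train = np.linspace(0.0, 1.0, 8)
meta_model = np.poly1d(np.polyfit(x_train, response(x_train), deg=3))  # cubic meta-model

x_inside, x_outside = 0.55, 1.8   # one point inside the sampled range, one far outside
print("interpolation error:", abs(meta_model(x_inside) - response(x_inside)))
print("extrapolation error:", abs(meta_model(x_outside) - response(x_outside)))
```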



It is for these reasons that, in the last years, different approximation strategies have been developed to provide inexpensive meta-models of the simulation models and to substitute computationally expensive modules. The intention of this article is to demonstrate particular features of modeFRONTIER that allow an easy use of the meta-modeling approach.

A typical sequence when using meta-models for engineering design can be summarized as follows:

1. First of all, engineers should formulate the problem, design the objectives and constraints, and identify the problem's input and output parameters; this may include specifying the names and bounds of the variables that will be part of the design, as well as characterizing the responses. This is quite an easy task in modeFRONTIER; the user can take advantage of all the features of the workflow nodes [Fig. 1]. At this step, it is also advisable to determine whether the use of a meta-model is justified, or whether the analysis should be conducted with the original simulation instead.

2. If the original simulation is computationally heavy and the use of the meta-model is necessary, the designer should choose the number and type of designs at which it would be most convenient to run the original simulation model. The true output responses obtained from these runs are used for fitting the meta-model. This step too is quite an easy task in modeFRONTIER; the user can take advantage of all the methodologies available in the Design of Experiment tool.

3. At this point, the engineer can use the output responses obtained in the previous step for fitting the meta-model. Fitting a meta-model requires specifying its type and functional form; an easy-to-use interface allows the user to save, evaluate and compare the different responses. modeFRONTIER assists engineers even at this important step by means of its Response Surface Methodology (RSM) tools [Fig. 2].

Figure 2: modeFRONTIER panel with which engineers can easily formulate, generate and save several kinds of meta-models.

Figure 3: Distance chart (left) and residual chart (right). These charts represent two of the several possibilities offered by modeFRONTIER to validate the meta-models.



4. Another important step is the assessment of the meta-model, which involves evaluating the performance of the models as well as choosing an appropriate validation strategy (a short sketch after step 5 illustrates one such check). In modeFRONTIER, engineers have several charts and statistical tools at their disposal for evaluating the goodness of the meta-models [Fig. 3]. Gaining insight from the meta-model and its error permits the identification of important design variables and their effects on the responses. This is necessary to understand the behavior of the model, to improve it, or to redefine the region of interest in the design space.

Figure 4: The effect of choosing different values of the characteristic exponent p in the weighting function around a point of the training dataset. The function flattens as the exponent grows.

5. The last step consists of using the meta-models to predict responses at untried inputs and of performing optimization runs, trade-off studies, or further exploration of the design space. As these points are extracted from meta-models and not obtained through real simulations, they are considered virtual designs. Even this last step is quite an easy task in modeFRONTIER; the user can immediately re-use the generated meta-models to speed up the optimization.
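
Steps 4 and 5 are performed in modeFRONTIER through its charts and tools; as a schematic stand-in only, the sketch below shows one common validation strategy (leave-one-out refitting) and the generation of virtual designs on a surrogate. The synthetic training data and the choice of an RBF surrogate are assumptions made for the example, not the tool's internals.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def leave_one_out_errors(X, y):
    """Refit the surrogate with each point left out in turn and record the prediction error."""
    errors = []
    for i in range(len(X)):
        keep = np.arange(len(X)) != i
        model = RBFInterpolator(X[keep], y[keep])
        errors.append(abs(model(X[i:i + 1])[0] - y[i]))
    return np.array(errors)

# Synthetic training set standing in for true simulation results.
rng = np.random.default_rng(0)
X_train = rng.random((15, 2))
y_train = np.sin(3.0 * X_train[:, 0]) + X_train[:, 1] ** 2

# Step 4: assess the meta-model before trusting it.
loo = leave_one_out_errors(X_train, y_train)
print("max LOO error:", loo.max(), " mean LOO error:", loo.mean())

# Step 5: virtual designs, i.e. predictions at untried inputs without new simulations.
gx, gy = np.meshgrid(np.linspace(0, 1, 25), np.linspace(0, 1, 25))
untried = np.column_stack([gx.ravel(), gy.ravel()])
virtual_responses = RBFInterpolator(X_train, y_train)(untried)
print("most promising virtual design:", untried[np.argmin(virtual_responses)])
```
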
Meta-models for laboratories
The previous section describes how meta-models can help to speed up optimization by substituting time consuming simulation models. A similar approach can be used to create synthetic models from experimental data. In this case, the aim is to substitute a time consuming and probably costly experiment with a good enough mathematical model.

modeFRONTIER is able to import many file formats (XLS, TXT, CSV…) within a few easy steps. The designs resulting from experiments can then be used to carry out statistical studies, such as sensitivity analysis, or as training data for response surface modeling, exactly as described in the previous sections.

In modeFRONTIER, all the tools for measuring the quality of meta-models in terms of statistical reliability are available. Moreover, modeFRONTIER offers a set of reasonable meta-modeling methods to interpolate different kinds of data. These methods include:

• Multivariate Polynomial Interpolation based on Singular Value Decomposition (SVD).

• K-Nearest, the Shepard method and its generalizations. Shepard's method is a statistical interpolator which works by averaging the known values of the target function; the weights are assigned according to the reciprocal of the mutual distances between the target point and the training dataset points (see the sketch after this list). The k-nearest variant averages only over the k data points nearest to the target point. Shepard's method is one of the so-called Point Schemes, i.e., interpolation methods which are not based on a tessellation of the underlying domain; it is maybe the best known scattered data interpolant in an arbitrary number of variables in which the interpolant assumes exactly the values of the data. The interpolated values are always constrained between the maximum and the minimum values of the points in the dataset. The response surface obtained with this method has a rather rough and coarse aspect, especially for small values of the exponent. Perhaps one of the most relevant drawbacks of this method is the lowering of maxima and the rising of minima: one usually expects that averaging methods like Shepard's flatten out extreme points. This property is particularly undesirable in the situation shown in Figure 4, where the interpolating model disastrously fails to describe the underlying function, which is an ordinary parabola. It is self-evident that this behavior is critical when seeking the extremes.

• Kriging: Kriging is a regression methodology that originated from the extensive work of Professor Daniel Krige, of the University of the Witwatersrand in South Africa, and especially from problems of gold extraction. The formalization and dissemination of this methodology, now universally employed in all branches of geostatistics, such as oil extraction and hydrology among others, is due to Professor Georges Matheron, who referred to Krige's regression technique as krigeage. This is the reason why the pronunciation of kriging with a soft "g" seems to be the more correct one, despite the hard "g" pronunciation prevailing in the U.S. Thanks to the support of the Department of Mathematical Methods and Models for Scientific Applications of the University of Padova, modeFRONTIER contains a simple kriging featuring four variogram models with automatic determination of the best fit of the experimental variogram (a small fitting sketch follows this list). The fitting procedure uses Levenberg–Marquardt to minimize the sum of the squares of the differences between the values of the experimental variogram and those of the model. Moreover, the user is warned when the best fitting variogram shows signs of unacceptability, such as a sill larger than the largest differences between values in the dataset.

• Parametric Surfaces: useful whenever the mathematical expression of the response is known, except for some unknown parameters. The training algorithm calculates the values of the unknown parameters that yield the best fit.

• Gaussian Processes: implement the Bayesian approach to regression problems; the knowledge of the response is expressed in terms of probability distributions. This algorithm is best suited for non-polynomial responses.

• Artificial Neural Networks: like many human inventions and technical devices, artificial Neural Networks take their inspiration from Nature in order to realize a kind of calculator completely different from the classical Von Neumann machine, trying to implement at the same time the hardest features and tasks of computation, such as parallel computing, nonlinearity, adaptivity and self-training. A neural network is a machine that is designed to model the way in which the brain performs a particular task or function of interest. To achieve their aims, neural networks massively employ mutual interconnections between simple computing cells usually called neurons. Networks resemble the brain in two respects: knowledge is acquired through a learning process, and information is stored in the synaptic weights, i.e., the strengths of the interconnections between the neurons [Fig. 5]. The class of Neural Networks included in modeFRONTIER, with a single hidden layer, has been shown to be capable of interpolating any function under minimal regularity requirements.

• Radial Basis Functions (available from version 4.0).

The description of each interpolation method constitutes by itself a separate topic and paper, hence going deeply into such descriptions is not the aim of this article.
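
To make the weighting idea of the Shepard bullet concrete, here is a minimal inverse-distance-weighting sketch with a tunable exponent p, echoing the parabola example discussed around Figure 4. It is an illustrative stand-in, not modeFRONTIER's implementation, and the training points are made up.

```python
import numpy as np

def shepard_predict(x_new, X, y, p=2.0):
    """Inverse-distance-weighted average of the known values y at the training points X."""
    d = np.linalg.norm(X - x_new, axis=1)
    if np.any(d == 0.0):               # exact hit: the interpolant reproduces the data
        return y[np.argmin(d)]
    w = 1.0 / d ** p                   # weights: reciprocal of the distance, raised to p
    return np.sum(w * y) / np.sum(w)

# Training data sampled from an ordinary parabola, as in the Figure 4 discussion.
X = np.linspace(-2.0, 2.0, 9).reshape(-1, 1)
y = (X ** 2).ravel()

# Near the vertex the averaged prediction is pulled toward the data mean, so minima
# are raised (and, symmetrically, maxima are lowered); a larger exponent p flattens
# the surface around each training point.
for p in (1.0, 2.0, 8.0):
    print(f"p = {p}: prediction at x = 0.1 ->", shepard_predict(np.array([0.1]), X, y, p))
```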

Figure 5: Neural Networks (NN) are inspired by the functioning of biological nervous systems. A real neuron (left) and an artificial neuron (right)
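
The kriging bullet above states that the experimental variogram is fitted with a Levenberg–Marquardt procedure. As a rough illustration of that idea only, the sketch below fits a spherical variogram model (one common choice, assumed here for concreteness; the article does not name the four models) to synthetic experimental semivariances with SciPy's least-squares routine.

```python
import numpy as np
from scipy.optimize import least_squares

def spherical_variogram(h, nugget, sill, vrange):
    """Spherical model: rises from the nugget and levels off at the sill beyond the range."""
    hr = np.minimum(h / vrange, 1.0)
    return nugget + (sill - nugget) * (1.5 * hr - 0.5 * hr ** 3)

# Synthetic "experimental" variogram: lag distances and estimated semivariances.
lags = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
gamma_exp = np.array([0.12, 0.35, 0.55, 0.71, 0.80, 0.84, 0.85, 0.86])

# Levenberg-Marquardt least-squares fit of (nugget, sill, range): minimize the sum of
# squared differences between the model and the experimental values.
residuals = lambda params: spherical_variogram(lags, *params) - gamma_exp
fit = least_squares(residuals, x0=[0.05, 0.8, 2.0], method="lm")
nugget, sill, vrange = fit.x
print("nugget = %.3f, sill = %.3f, range = %.3f" % (nugget, sill, vrange))
```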



Considering that several methods for interpolation are available, both in modeFRONTIER and in the literature, an engineer may ask which is the best model to use. There is an obvious notion that simpler functions can be approximated better, while more complex functions are in general more difficult to approximate, regardless of the meta-modeling type, design type and design size. A general recommendation is to use simple meta-models first (such as low-order polynomials); Kriging, Gaussian Processes and Neural Networks should be used for more complex responses. In general, and regardless of the meta-model type, design type, or the complexity of the response, the performance tends to improve with the size of the design, especially for Kriging and Artificial Neural Networks.

Figure 5: modeFRONTIER tool for meta-models 3D-exploration.

Validation of Meta-models
modeFRONTIER has a powerful tool for the creation of meta-models, as it gives the possibility to verify the accuracy of a particular meta-model and to decide whether or not to improve its fidelity by adding further simulation results to the database. It is possible to decide on effective surfaces for statistical analysis, for exploring candidate designs, and for use as surrogates in optimization. If the training points are not carefully chosen, the fitted model can be really poor and affect the final results. Inadequate approximations may lead to suboptimal designs or to inefficient searches for optimal solutions.

That is why validation is a fundamental part of the modeling process. In modeFRONTIER, during the interpolation, a list of messages and errors generated by the algorithms is shown. The messages provide suggestions for a better tuning of the selected models. They generally list the maximum absolute error, a measure that provides information about the extreme performances of the model. The mean absolute error is the sum of the absolute errors divided by the number of data points, and is measured in the same units as the original data. The maximum absolute percent error is the maximum absolute numerical difference divided by the true value; the percentage error is this ratio expressed as a percent. The maximum absolute percent error thus gives a practical account of the error, measuring by what percentage a predicted value deviates from the true one. There are many other measures that might be used for assessing the performance of a meta-model (e.g. R-squared).
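
The error measures listed above are easy to reproduce outside the tool; the snippet below computes them for an arbitrary pair of true and predicted response vectors (the numbers are invented for illustration).

```python
import numpy as np

y_true = np.array([10.0, 12.5, 9.8, 14.2, 11.1])   # responses from real simulations
y_pred = np.array([10.4, 12.1, 10.1, 13.6, 11.3])  # meta-model predictions

abs_err = np.abs(y_pred - y_true)
max_abs_error = abs_err.max()                       # extreme behaviour of the model
mean_abs_error = abs_err.mean()                     # same units as the original data
max_abs_percent_error = 100.0 * (abs_err / np.abs(y_true)).max()
r_squared = 1.0 - np.sum((y_true - y_pred) ** 2) / np.sum((y_true - y_true.mean()) ** 2)

print(f"max abs error: {max_abs_error:.3f}")
print(f"mean abs error: {mean_abs_error:.3f}")
print(f"max abs percent error: {max_abs_percent_error:.2f}%")
print(f"R-squared: {r_squared:.3f}")
```
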
Conclusions
Both websites, www.esteco.com as well as www.network.modefrontier.eu, the portal of the European modeFRONTIER Network, provide several examples of how to use meta-modeling techniques to speed up optimization.

For any questions on this article, or to request further examples or information, please email the author:
Silvia Poles
ESTECO - Research Labs
scientific@esteco.com

References
[1] P. Alfeld. Scattered data interpolation in three or more variables. In T. Lyche and L. Schumaker, editors, Mathematical Methods in Computer Aided Geometric Design, pages 1–34. Academic Press, 1989.
[2] Martin D. Buhmann. Radial Basis Functions: Theory and Implementations. Cambridge University Press, 2003.
[3] Armin Iske. Multiresolution Methods in Scattered Data Modelling. Springer, 2004.
[4] Georges Matheron. Les Variables Regionalisees et leur Estimation, une Application de la Theorie des Fonctions Aleatoires aux Sciences de la Nature. Masson et Cie, Paris, 1965.
[5] Holger Wendland. Scattered Data Approximation. Cambridge University Press, 2004.

