
New Genetic Algorithm with a Maximal Information Coefficient Based Mutation


Nicholas Romito
ABSTRACT
In this paper, the Maximal Information Coefficient (MIC) is used to modify the Genetic Algorithm (GA) in order to solve multi-variable optimization problems more efficiently and accurately. The MIC-modified GA (MICGA) learns the problem structure by calculating the MIC. The original GA is compared to the MICGA and many other types of optimization algorithms to determine the most efficient optimization method.
Categories and Subject Descriptors
H.4 [Information Systems Applications]: Miscellaneous;
D.2.8 [Software Engineering]: Metrics: complexity measures, performance measures
General Terms
Theory
Keywords
Genetic Algorithm, MIC
1. INTRODUCTION
Optimization problems that need solving arise in all fields of study. Examples include aligning MR and CT modality images of a brain to create one composite image [1], optimizing the thermal management and package design of a collector-up heterojunction bipolar transistor to improve mobile phones [12], and optimizing the system reliability and cost of a product [10]. These optimization problems are generally very difficult to solve by hand; therefore, the best way to solve them is to use an efficient and accurate approximation tool.
Optimization problems usually have an objective function for which the global optimum (maximum or minimum) value must be determined. The minimum value of an objective function is the smallest value y for y = f(x), where f(x) is the objective function and x is the vector of independent variables of size n, depending on the dimension of the optimization problem.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.
ACM Southeast '13, April 4-6, 2013, Savannah, GA
Copyright 2013 ACM 978-1-4503-1901-0/13/04 ...$15.00.
The Genetic Algorithm (GA) [2] is an approximation algorithm that helps deal with optimization problems such as those mentioned earlier, but as these problems grow more complex, faster and more accurate algorithms are needed to solve them. This paper proposes that an improved real-coded GA can be implemented by learning the problem structure to better solve the problem. Many other optimization algorithms have been developed, and a brief description of each of the algorithms that will be tested against the Maximal Information Coefficient modified GA (MICGA) proposed in this paper can be found in [8]. The MICGA is a variation of the simple GA that improves it by adapting the algorithm to each specific problem being solved.
2. COMPONENTS OF THE MICGA
2.1 The Simple GA
The simple GA uses the principles of evolution and natural selection to determine the global minimum or maximum of a given objective function. The genetic operators in the simple GA are mutation and crossover, which cause the solutions to evolve and become better fitted to the objective function's maximum or minimum value.
Next comes the selection of mates from the population; in this paper, we will assume minimization is being performed. To start the evolution process, two individuals from the population are selected at random and are used as mates for reproduction in order to produce better-fitting solutions to the problem.
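The generational loop described in this section can be sketched as follows. This is an illustrative Python sketch, not the author's C implementation; the averaging crossover and Gaussian offset below are simplified stand-ins for the SBX and polynomial-mutation operators detailed later in this section, and all names are assumptions.

```python
import random

def simple_ga(f, low, up, n, pop_size=20, generations=100):
    """Minimal generational GA sketch: random mate selection,
    averaging crossover, and a small random mutation offset."""
    pop = [[random.uniform(low, up) for _ in range(n)] for _ in range(pop_size)]
    best = min(pop, key=f)
    for _ in range(generations):
        children = []
        for _ in range(pop_size):
            p1, p2 = random.sample(pop, 2)              # selection: random mates
            child = []
            for a, b in zip(p1, p2):
                v = 0.5 * (a + b)                       # crossover: parent average
                v += random.gauss(0.0, 0.05 * (up - low))  # mutation: small offset
                child.append(min(max(v, low), up))      # keep within bounds
            children.append(child)
        pop = children                  # children replace the whole population
        best = min(pop + [best], key=f)  # track the best solution found so far
    return best
```

The children fully replace the parent population each generation, matching the replacement scheme described for the simple GA.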
After selection, reproduction occurs, which consists of two steps: crossover and mutation. Crossover is the first genetic operator to be performed on the mates that were selected. Crossover takes some traits from both parents and combines them to create two new children. This genetic operator is used to create solutions with better fitness to the objective function. Whether or not crossover occurs depends on the user-defined crossover probability. If there is a crossover, each variable then has a fifty percent chance of having crossover performed on it. Once a variable, x_i, has been selected to have crossover performed on it, the average value of the two parents' x_i is determined, and the average value is then used for the children's x_i value. The x_i value for each child is then offset by a small random number, so they will not end with the same value, in order to improve search results. The final value of x_i for each child is determined by:

child_1 = Mean + 0.5·β·Difference
child_2 = Mean − 0.5·β·Difference    (1)

β = { (0.5/(1 − 0.999999))^(1/3),   α > 0.999999
      (0.5/(1 − α))^(1/3),          0.5 < α ≤ 0.999999
      (2α)^(1/3),                   0 < α ≤ 0.5
      0,                            otherwise }    (2)

α = (1 − 0.5(1 + 2·Distance/Difference)^(−3)) · u    (3)

where Mean is the average of the two parent values of x_i, Difference is the difference between the two parent values of x_i, Distance is the distance from the parents' values of x_i to x_i's closest boundary, and u is a uniform random variable. Once each variable has been visited and had crossover performed on it or not, the children are added to the new population and mutation begins.
Just like in human evolution, there are some differences between parents' traits and children's traits. This is caused by the second genetic operator, mutation. This genetic operator is used to fine-tune solutions or to force a solution out of a local minimum value. Mutation runs through each variable of all the individuals and mutates them by offsetting them with a small random number. Once a variable, x_i, has been chosen for mutation, the variable will be offset using the following equations:

x_i_new = { low,       x_i + δ < low
            x_i + δ,   low ≤ x_i + δ ≤ up
            up,        x_i + δ > up }    (4)

δ = (up − low) · { δ_u,               u ≥ 1 − 1×10^(−9)
                   1 − Δ^(1/101),     0.5 < u < 1 − 1×10^(−9)
                   Δ^(1/101) − 1,     1×10^(−9) < u ≤ 0.5
                   δ_l,               u ≤ 1×10^(−9) }    (5)

for 0.5 < u < 1 − 1×10^(−9),

Δ = 2(1 − u) + 2(u − 0.5)(1 − δ_u)^101    (6)

and for 1×10^(−9) < u ≤ 0.5,

Δ = 2u + (1 − 2u)(1 + δ_l)^101    (7)

δ_l = (low − x_i)/(up − low),    δ_u = (up − x_i)/(up − low)    (8)

If δ_l < −1, then δ_l = −1, and if δ_u > 1, then δ_u = 1. If −δ_l < δ_u, then δ_u = −δ_l, but if −δ_l ≥ δ_u, then δ_l = −δ_u. Here, up is the upper boundary of x_i, low is the lower boundary of x_i, and u is a uniform random number. With the completion of mutation, the evolutionary process is also complete. The children make up the entire population of the next generation, in which the evolutionary process repeats until the user-defined number of generations have passed. Each generation should produce a new population with a better estimation of the minimum value of the objective function, f(x), with the final generation producing the minimum value of f(x).
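The mutation offset for one variable can be sketched as below. A sketch under assumptions: the equations are partially damaged in this copy, so the piecewise form follows Deb's polynomial mutation with PM = 100 (hence the exponent 101), and the function name is illustrative.

```python
import random

def polynomial_mutation(x, low, up, eta_m=100):
    """Offset one variable x_i by a small bounded amount, following the
    piecewise form of Equations (4)-(8) with eta_m = PM = 100."""
    rng = up - low
    dl = max((low - x) / rng, -1.0)   # Equation (8): negative, clamped at -1
    du = min((up - x) / rng, 1.0)     # Equation (8): positive, clamped at 1
    if -dl < du:                      # symmetrize: use the smaller magnitude
        du = -dl
    else:
        dl = -du
    u = random.random()
    eps = 1e-9
    q = eta_m + 1                     # = 101
    if u >= 1.0 - eps:
        delta = du
    elif u > 0.5:                     # Equations (5) and (6): positive offset
        val = 2.0 * (1.0 - u) + 2.0 * (u - 0.5) * (1.0 - du) ** q
        delta = 1.0 - val ** (1.0 / q)
    elif u > eps:                     # Equations (5) and (7): negative offset
        val = 2.0 * u + (1.0 - 2.0 * u) * (1.0 + dl) ** q
        delta = val ** (1.0 / q) - 1.0
    else:
        delta = dl
    return min(max(x + delta * rng, low), up)   # Equation (4): clamp to bounds
```

With this form the offset is continuous in u and never pushes x_i past the nearer boundary, which is why Equation (4)'s clamp rarely fires.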
Figure 1: Probability of mutation function for the
MICGA
2.2 The Maximal Information Coefficient
The Maximal Information Coefficient (MIC) [9] is a real number between 0 and 1, 0 being uncorrelated and 1 being entirely correlated, that characterizes the non-linear correlation between two variables. This is done by overlaying a grid on the data points of the two variables and increasing the resolution of the grid until a maximum resolution size has been reached. A normalized mutual information value is calculated for every x-by-y grid resolution from how the points fall into the grid's cells, and the maximum of these normalized values is the MIC.
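This grid search can be sketched as follows. A simplified illustration only: it uses equal-width bins, whereas the actual MIC of [9] also optimizes the placement of the grid lines, and the function name and resolution cap are assumptions.

```python
import math

def mic_estimate(xs, ys, max_cells=32):
    """Simplified MIC sketch: bin the points on equal-width grids of
    increasing resolution, compute the normalized mutual information
    of each grid, and keep the maximum over all grids."""
    n = len(xs)
    xmin, xmax = min(xs), max(xs)
    ymin, ymax = min(ys), max(ys)
    best = 0.0
    for nx in range(2, max_cells // 2 + 1):
        for ny in range(2, max_cells // 2 + 1):
            if nx * ny > max_cells:        # cap the total grid resolution
                continue
            cells, colx, coly = {}, {}, {}
            for x, y in zip(xs, ys):       # count points per cell/row/column
                i = min(int((x - xmin) / (xmax - xmin) * nx), nx - 1)
                j = min(int((y - ymin) / (ymax - ymin) * ny), ny - 1)
                cells[(i, j)] = cells.get((i, j), 0) + 1
                colx[i] = colx.get(i, 0) + 1
                coly[j] = coly.get(j, 0) + 1
            mi = sum(c / n * math.log((c / n) / (colx[i] / n * coly[j] / n))
                     for (i, j), c in cells.items())
            best = max(best, mi / math.log(min(nx, ny)))  # normalize per grid
    return best
```

A perfectly linear relationship scores near 1 under this sketch, as the definition requires.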
3. THE MICGA
The MICGA is a combination of the simple GA and the MIC in order to produce more accurate solutions to optimization problems. This is implemented by changing the probability of mutation for each variable, x_i, based on the MIC value between x_i and the objective function, f(x). Due to the inefficiencies associated with calculating the MIC, the probability of mutation is changed only every thirty generations, based on the calculated MIC value, until 150 generations are reached. Once thirty generations have passed, the MIC is determined, and the new probability of mutation for each variable is calculated using

P(Mutation) = 1 − e^(−(1/2)((MIC − 0.306853)/0.3)^2).    (9)

The graph of this equation is shown in Figure 1. Equation 9 is a Gaussian function that has been inverted and shifted along the y-axis so that the probability of mutation for the mean MIC value of the simple GA, obtained by testing on functions f1, f3, f5, f9, f10, and f11 in Table 2, is zero, and the probability of mutation increases symmetrically as the MIC value increases or decreases from the mean. These same test functions were modified to be asymmetric to determine the difference that asymmetry causes in the MIC value. The symmetric and asymmetric MIC values were then averaged together to get the mean MIC value. This causes a better convergence on the mean MIC, which also causes an improved convergence on the minimum value of the objective function.
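Equation (9) is straightforward to compute; a small sketch (the function and parameter names are illustrative):

```python
import math

def p_mutation(mic, mean_mic=0.306853, width=0.3):
    """Equation (9): an inverted Gaussian mapping a variable's MIC value
    to its mutation probability; zero at the mean MIC of the simple GA."""
    return 1.0 - math.exp(-0.5 * ((mic - mean_mic) / width) ** 2)
```

Variables whose MIC sits at the mean are left alone, while variables whose correlation with f(x) deviates from the mean in either direction are mutated more aggressively.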
Table 1: Number of Function Evaluations for the Benchmark Functions in Table 2
Function Simple GA MICGA RCCRO GA FEP CEP FES CES PSO GSO RCBBO DE CMAES G3PCX
f1 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000
f2 150000 150000 150000 150000 200000 200000 200000 200000 150000 150000 200000 150000 150000 150000
f3 150000 150000 250000 250000 500000 500000 500000 500000 250000 250000 500000 250000 250000 250000
f4 150000 150000 150000 150000 500000 500000 500000 500000 150000 150000 500000 150000 150000 150000
f5 150000 150000 150000 150000 2000000 2000000 2000000 2000000 150000 150000 500000 150000 150000 150000
f6 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000
f7 150000 150000 150000 150000 300000 300000 300000 300000 150000 150000 300000 150000 150000 150000
f8 150000 150000 150000 150000 900000 900000 900000 900000 150000 150000 300000 150000 150000 150000
f9 150000 150000 250000 250000 500000 500000 500000 500000 250000 250000 300000 250000 250000 250000
f10 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000
f11 150000 150000 150000 150000 200000 200000 200000 200000 150000 150000 300000 150000 150000 150000
f12 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000
f13 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000 150000
f14 150000 150000 7500 7500 10000 10000 10000 10000 7500 7500 10000 7500 7500 7500
f15 150000 150000 250000 250000 400000 400000 400000 400000 250000 250000 100000 250000 250000 250000
f16 150000 150000 1250 1250 10000 10000 10000 10000 1250 1250 10000 1250 1250 1250
f17 150000 150000 5000 5000 10000 10000 10000 10000 5000 5000 10000 5000 5000 5000
f18 150000 150000 10000 10000 10000 10000 10000 10000 10000 10000 10000 10000 10000 10000
f19 150000 150000 4000 4000 10000 10000 10000 10000 4000 4000 10000 4000 4000 4000
f20 150000 150000 7500 7500 20000 20000 20000 20000 7500 7500 20000 7500 7500 7500
f21 150000 150000 10000 10000 10000 10000 10000 10000 10000 10000 10000 10000 10000 10000
f22 150000 150000 10000 10000 10000 10000 10000 10000 10000 10000 10000 10000 10000 10000
f23 150000 150000 10000 10000 10000 10000 10000 10000 10000 10000 10000 10000 10000 10000
Figure 2: The average MIC values for symmetric
and asymmetric functions calculated from the sim-
ple GA
4. SIMULATION RESULTS
4.1 Benchmark Functions
The MICGA and simple GA were tested with a set of 23 benchmark problems, which can be seen in Table 2. This table includes the category of the function, the function itself fi, the name of the function, the number of variables used in the function n, the domain in which the solution lies S, and the minimum value of the function fmin. These 23 functions are standard benchmark functions for testing optimization algorithms, and Table 2 has been adopted from [8]. The different categories of benchmark functions are as follows:
4.1.1 Unimodal Functions
f1-f7 are unimodal, which means they do not have any local minima, so these functions are rather simple in terms of finding the global minimum value. This particular group is also high-dimensional to increase the difficulty of the search for the global minimum.
4.1.2 High-Dimensional Multimodal Functions
f8-f13 are the high-dimensional multimodal functions, which have many variables and multiple local minima, so the search for the global minimum in this category is very difficult relative to the other benchmark functions.
4.1.3 Low-Dimensional Multimodal Functions
f14-f23 are low-dimensional and multimodal, so they still have multiple local minima similar to the second group of functions, but there are fewer local minima than in the previous multimodal group.
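As an illustration, one representative of each category (f1, f9, and f10 from Table 2) can be written directly from its formula; the remaining benchmark functions follow the same pattern.

```python
import math

def sphere(x):
    """f1, unimodal: fmin = 0 at x = 0."""
    return sum(v * v for v in x)

def rastrigin(x):
    """f9, high-dimensional multimodal: fmin = 0 at x = 0."""
    return sum(v * v - 10.0 * math.cos(2.0 * math.pi * v) + 10.0 for v in x)

def ackley(x):
    """f10, high-dimensional multimodal: fmin = 0 at x = 0."""
    n = len(x)
    s1 = sum(v * v for v in x) / n
    s2 = sum(math.cos(2.0 * math.pi * v) for v in x) / n
    return -20.0 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20.0 + math.e
```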
4.2 Experimental Setting
All of the simulations were run on a personal computer with an Intel Quad Core 3.20 GHz CPU and 5.6 GiB of RAM. The MICGA was developed by modifying the simple GA C language source code that can be found at [2].
The parameters used for the simple GA and MICGA were probability of mutation P(mutate) = .2, probability of crossover P(xover) = .9, population size NP = 100, number of generations maxgen = 1500, simulated binary crossover parameter SBX = 2, and polynomial mutation parameter PM = 100. The simple GA and MICGA both used 150000 function evaluations for each of the benchmark functions in Table 2, and their average solution out of 100 runs was used for comparison with the other algorithms in Tables 3 - 5. Each algorithm was ranked based on its average solution, and any ties were broken using the standard deviation for the algorithms involved. Once the rank was determined for each algorithm in a category, the average rank of the algorithms was calculated, and the algorithms' final ranking came from the ordering of these average ranks.
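The ranking procedure just described can be sketched as follows (an illustrative helper; the input layout is an assumption):

```python
def rank_algorithms(per_function_results):
    """Rank algorithms on each benchmark by mean solution, breaking ties
    with the standard deviation, then order the final ranking by average
    per-function rank.  Input: a list (one entry per benchmark) of dicts
    mapping algorithm name -> (mean, std_dev)."""
    totals = {}
    for results in per_function_results:
        # sort by (mean, std): lower mean first, std as the tiebreaker
        ordered = sorted(results, key=lambda alg: results[alg])
        for position, alg in enumerate(ordered, start=1):
            totals[alg] = totals.get(alg, 0) + position
    avg = {alg: t / len(per_function_results) for alg, t in totals.items()}
    final = sorted(avg, key=avg.get)
    return {alg: i for i, alg in enumerate(final, start=1)}
```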
4.3 Comparison
The simple GA and MICGA were tested along with all of the algorithms mentioned in Section 1. The data for the other algorithms were adopted from [8], and the simple GA and MICGA were tested as previously described in this section. The results are separated into the three categories of functions, which were described earlier in this section.
4.3.1 Unimodal Functions
Table 3 shows that the MICGA performed poorly on these
functions relative to the other algorithms, but it still ranked
two positions higher than the simple GA, which ranked last
out of all the algorithms.
Table 2: 23 Benchmark Functions
Columns: category; test function; name; n; S; fmin

I    f1 = Σ_{i=1}^{n} x_i^2
     Sphere model; n = 30; S = [-100, 100]^n; fmin = 0

     f2 = Σ_{i=1}^{n} |x_i| + Π_{i=1}^{n} |x_i|
     Schwefel's problem 2.22; n = 30; S = [-10, 10]^n; fmin = 0

     f3 = Σ_{i=1}^{n} (Σ_{j=1}^{i} x_j)^2
     Schwefel's problem 1.2; n = 30; S = [-100, 100]^n; fmin = 0

     f4 = max_i {|x_i|, 1 ≤ i ≤ n}
     Schwefel's problem 2.21; n = 30; S = [-100, 100]^n; fmin = 0

     f5 = Σ_{i=1}^{n-1} (100(x_{i+1} − x_i^2)^2 + (x_i − 1)^2)
     Generalized Rosenbrock's function; n = 30; S = [-30, 30]^n; fmin = 0

     f6 = Σ_{i=1}^{n} (⌊x_i + 0.5⌋)^2
     Step function; n = 30; S = [-100, 100]^n; fmin = 0

     f7 = Σ_{i=1}^{n} i·x_i^4 + random[0, 1)
     Quartic function with noise; n = 30; S = [-1.28, 1.28]^n; fmin = 0

II   f8 = Σ_{i=1}^{n} (−x_i sin(√|x_i|))
     Generalized Schwefel's problem 2.26; n = 30; S = [-500, 500]^n; fmin = -12569.5

     f9 = Σ_{i=1}^{n} (x_i^2 − 10 cos(2πx_i) + 10)
     Generalized Rastrigin's function; n = 30; S = [-5.12, 5.12]^n; fmin = 0

     f10 = −20 exp(−0.2 √((1/n) Σ_{i=1}^{n} x_i^2)) − exp((1/n) Σ_{i=1}^{n} cos(2πx_i)) + 20 + e
     Ackley's function; n = 30; S = [-32, 32]^n; fmin = 0

     f11 = (1/4000) Σ_{i=1}^{n} x_i^2 − Π_{i=1}^{n} cos(x_i/√i) + 1
     Generalized Griewank function; n = 30; S = [-600, 600]^n; fmin = 0

     f12 = (π/n){10 sin^2(πy_1) + Σ_{i=1}^{29} (y_i − 1)^2 [1 + 10 sin^2(πy_{i+1})] + (y_n − 1)^2} + Σ_{i=1}^{30} u(x_i, 10, 100, 4),
     where y_i = 1 + (1/4)(x_i + 1) and
     u(x_i, a, k, m) = { k(x_i − a)^m, x_i > a; 0, −a ≤ x_i ≤ a; k(−x_i − a)^m, x_i < −a }
     Generalized penalized function 1; n = 30; S = [-50, 50]^n; fmin = 0

     f13 = 0.1{sin^2(3πx_1) + Σ_{i=1}^{29} (x_i − 1)^2 [1 + sin^2(3πx_{i+1})] + (x_n − 1)^2 [1 + sin^2(2πx_30)]} + Σ_{i=1}^{30} u(x_i, 5, 100, 4)
     Generalized penalized function 2; n = 30; S = [-50, 50]^n; fmin = 0

III  f14 = (1/500 + Σ_{j=1}^{25} 1/(j + Σ_{i=1}^{2} (x_i − a_ij)^6))^(−1)
     Shekel's Foxholes function; n = 2; S = [-65.536, 65.536]^n; fmin = 1

     f15 = Σ_{i=1}^{11} (a_i − x_1(b_i^2 + b_i x_2)/(b_i^2 + b_i x_3 + x_4))^2
     Kowalik's function; n = 4; S = [-5, 5]^n; fmin = 0.0003075

     f16 = 4x_1^2 − 2.1x_1^4 + (1/3)x_1^6 + x_1 x_2 − 4x_2^2 + 4x_2^4
     Six-hump camel-back function; n = 2; S = [-5, 5]^n; fmin = -1.0316285

     f17 = (x_2 − (5.1/(4π^2))x_1^2 + (5/π)x_1 − 6)^2 + 10(1 − 1/(8π)) cos(x_1) + 10
     Branin function; n = 2; S = [-5, 10] × [0, 15]; fmin = 0.398

     f18 = [1 + (x_1 + x_2 + 1)^2 (19 − 14x_1 + 3x_1^2 − 14x_2 + 6x_1 x_2 + 3x_2^2)] [30 + (2x_1 − 3x_2)^2 (18 − 32x_1 + 12x_1^2 + 48x_2 − 36x_1 x_2 + 27x_2^2)]
     Goldstein-Price function; n = 2; S = [-2, 2]^n; fmin = 3

     f19 = −Σ_{i=1}^{4} c_i exp(−Σ_{j=1}^{3} a_ij(x_j − p_ij)^2)
     Hartman's family function 1; n = 3; S = [0, 1]^n; fmin = -3.86

     f20 = −Σ_{i=1}^{4} c_i exp(−Σ_{j=1}^{6} a_ij(x_j − p_ij)^2)
     Hartman's family function 2; n = 6; S = [0, 1]^n; fmin = -3.32

     f21 = −Σ_{i=1}^{5} [(x − a_i)(x − a_i)^T + c_i]^(−1)
     Shekel's family function 1; n = 4; S = [0, 10]^n; fmin = -10

     f22 = −Σ_{i=1}^{7} [(x − a_i)(x − a_i)^T + c_i]^(−1)
     Shekel's family function 2; n = 4; S = [0, 10]^n; fmin = -10

     f23 = −Σ_{i=1}^{10} [(x − a_i)(x − a_i)^T + c_i]^(−1)
     Shekel's family function 3; n = 4; S = [0, 10]^n; fmin = -10
Table 3: Simulation Results for f1-f7
Simple GA MICGA RCCRO1 GA FEP CEP FES CES PSO GSO RCBBO DE CMAES G3PCX
f1 Mean 8.442E+01 6.625E-01 6.427E-07 3.171E+00 5.700E-04 2.200E-04 2.500E-04 3.400E-05 3.693E-37 1.948E-08 1.390E-03 6.576E-06 6.093E-29 6.404E-79
Std Dev 1.362E+01 4.976E-01 2.099E-07 1.662E+00 1.300E-04 5.900E-04 6.800E-04 8.600E-06 2.460E-36 1.163E-08 5.500E-04 1.132E-06 1.554E-29 1.248E-78
Rank 14 12 5 13 10 8 9 7 2 4 11 6 3 1
f2 Mean 3.510E+00 1.616E-01 2.196E-03 5.771E-01 8.100E-03 2.600E-03 6.000E-02 2.100E-02 2.917E-24 3.704E-05 7.990E-02 2.894E-04 3.480E-14 2.803E+01
Std Dev 4.089E-01 8.200E-02 4.341E-04 1.306E-01 7.700E-04 1.700E-04 9.600E-03 2.200E-03 1.136E-23 8.619E-05 1.440E-02 2.518E-05 4.034E-15 1.012E+01
Rank 13 11 5 12 7 6 9 8 1 3 10 4 2 14
f3 Mean 1.700E+308 1.700E+308 2.966E-07 9.750E+03 1.600E-02 5.000E-02 1.400E-03 1.300E-04 1.198E-03 5.783E+00 2.270E+01 1.212E+04 1.511E-26 1.064E-76
Std Dev 0.000E+00 0.000E+00 1.146E-07 2.595E+03 1.400E-02 6.600E-02 5.300E-04 8.500E-05 2.111E-03 3.681E+00 1.030E+01 1.554E+03 3.644E-27 1.532E-76
Rank 13 13 3 11 7 8 6 4 5 9 10 12 2 1
f4 Mean 5.224E+00 8.237E-01 9.318E-03 7.961E+00 3.000E-01 2.000E+00 5.500E-03 3.500E-01 4.123E-01 1.075E-01 3.090E-02 5.790E+00 3.994E-15 4.543E+01
Std Dev 6.053E-01 5.110E-01 3.657E-03 1.506E+00 5.000E-01 1.200E+00 6.500E-04 4.200E-01 2.500E-01 3.998E-02 7.270E-03 4.559E-01 5.311E-16 8.092E+00
Rank 11 9 3 13 6 10 2 7 8 5 4 12 1 14
f5 Mean 1.695E+03 7.243E+01 2.706E+01 3.386E+02 5.060E+00 6.170E+00 3.328E+01 6.690E+00 3.736E+01 4.984E+01 5.540E+01 9.338E+01 5.581E-01 3.091E+00
Std Dev 3.318E+02 7.203E+01 3.427E+01 3.615E+02 5.870E+00 1.361E+01 4.313E+01 1.445E+01 3.214E+01 3.018E+01 3.520E+01 1.734E+01 1.390E+00 1.639E+01
Rank 14 11 6 13 3 4 7 5 8 9 10 12 1 2
f6 Mean 8.656E+01 1.260E+00 0.000E+00 3.697E+00 0.000E+00 5.778E+02 0.000E+00 4.112E+02 1.460E-01 1.600E-02 0.000E+00 0.000E+00 7.000E-02 9.462E+01
Std Dev 1.234E+01 1.760E+00 0.000E+00 1.952E+00 0.000E+00 1.126E+03 0.000E+00 6.954E+02 4.182E-01 1.333E-01 0.000E+00 0.000E+00 2.932E-01 5.969E+01
Rank 11 9 1 10 1 14 1 13 8 6 1 1 7 12
f7 Mean 8.802E+00 8.671E+00 5.405E-03 1.045E-01 7.600E-03 1.800E-02 1.200E-02 3.000E-02 9.902E-03 7.377E-02 1.750E-02 3.967E-02 2.209E-01 9.797E-01
Std Dev 9.635E-01 9.437E-01 2.985E-03 3.622E-02 2.600E-03 6.400E-03 5.800E-03 1.500E-02 3.538E-02 9.256E-02 6.430E-03 7.832E-03 8.653E-02 4.627E-01
Rank 14 13 1 10 2 6 4 7 3 9 5 8 11 12
Average Rank 12.857 11.143 3.429 11.714 5.143 8.000 5.429 7.286 5.000 6.429 7.286 7.857 3.857 8.000
Overall Rank 14 12 1 13 4 10 5 7 3 6 8 9 2 11
Table 4: Simulation Results for f8-f13
Simple GA MICGA RCCRO1 GA FEP CEP FES CES PSO GSO RCBBO DE CMAES G3PCX
f8 Mean -1.233E+04 -1.257E+04 -1.26E+04 -1.257E+04 -1.255E+04 -7.917E+03 -7.550E+03 -7.550E+03 -9.660E+03 -1.257E+04 -1.257E+04 -1.257E+04 -9.873E+07 -2.577E+03
Std Dev 1.234E+03 1.257E+03 2.32E-02 2.109E+00 5.26E+01 6.345E+02 6.314E+02 6.314E+02 4.638E+02 2.214E-02 2.200E-05 2.330E-05 8.547E+08 4.126E+02
Rank 9 7 2 3 8 11 12 12 10 6 4 5 1 14
f9 Mean 4.791E+01 1.369E+01 9.08E-04 6.509E-01 4.600E-02 8.900E+01 7.082E+01 7.082E+01 2.079E+01 1.018E+00 2.620E-02 7.261E-05 4.950E+01 1.740E+02
Std Dev 6.714E+00 3.784E+00 2.88E-04 3.594E-01 1.20E-02 2.310E+01 2.149E+01 2.149E+01 5.940E+00 9.509E-01 9.760E-03 3.376E-05 1.229E+01 3.199E+01
Rank 9 7 2 5 4 13 11 11 8 6 3 1 10 14
f10 Mean 1.718E+00 1.718E+00 1.94E-03 8.678E-01 1.800E-02 9.200E+00 9.070E+00 9.070E+00 1.340E-03 2.655E-05 2.510E-02 7.136E-04 4.607E+00 1.352E+01
Std Dev 1.718E-01 1.718E-01 4.19E-04 2.805E-01 2.10E-02 2.800E+00 2.840E+00 2.840E+00 4.239E-02 3.082E-05 5.510E-03 6.194E-05 8.725E+00 4.815E+00
Rank 8 8 4 7 5 13 11 11 3 1 6 2 10 14
f11 Mean 1.758E+00 7.250E-01 1.117E-02 1.004E+00 1.600E-02 8.600E-02 3.800E-01 3.800E-01 2.323E-01 3.079E-02 4.820E-01 9.05E-05 7.395E-04 1.127E-02
Std Dev 2.012E-01 1.774E-01 1.622E-02 6.755E-02 2.20E-02 1.200E-01 7.700E-01 7.700E-01 4.434E-01 3.087E-02 8.490E-02 3.402E-05 2.389E-03 1.310E-02
Rank 14 12 3 13 5 7 9 10 8 6 11 1 2 4
f12 Mean 7.116E-01 1.44E-03 2.074E-02 4.372E-02 9.200E-06 1.760E+00 1.180E+00 1.180E+00 3.950E-02 2.765E-11 3.280E-05 1.886E-04 5.167E-03 4.593E+00
Std Dev 1.564E-01 1.787E-03 5.485E-02 5.058E-02 6.14E-05 2.400E+00 1.870E+00 1.870E+00 9.142E-02 9.167E-11 3.330E-05 4.266E-08 7.338E-03 5.984E+00
Rank 10 5 7 9 2 12 11 11 8 1 3 4 6 14
f13 Mean 2.236E+09 2.236E+09 7.048E-07 1.681E-01 1.600E-04 1.400E+00 1.390E+00 1.390E+00 5.05E-02 4.695E-05 3.720E-04 9.519E-07 1.639E-03 2.349E+01
Std Dev 6.038E+08 6.038E+08 5.901E-07 7.068E-02 7.30E-05 3.700E+00 3.330E+00 3.330E+00 5.691E-01 7.011E-04 4.630E-04 2.021E-07 4.196E-03 2.072E+01
Rank 13 13 1 8 4 11 9 9 7 3 5 2 6 12
Average Rank 10.500 8.667 3.167 7.500 4.667 11.167 10.500 10.667 7.333 3.833 5.333 2.500 5.833 12.000
Overall Rank 10 9 2 8 4 13 11 12 7 3 5 1 6 14
Table 5: Simulation Results for f14-f23
Simple GA MICGA RCCRO1 GA FEP CEP FES CES PSO GSO RCBBO DE CMAES G3PCX
f14 Mean 1.652E+00 1.494E+00 9.980E-01 9.989E-01 1.220E+00 1.660E+00 1.200E+00 2.160E+00 1.024E+00 9.980E-01 9.980E-01 1.576E+00 1.246E+01 1.231E+01
Std Dev 8.756E-01 4.730E-01 2.317E-02 4.433E-03 5.600E-01 1.190E+00 6.300E-01 1.820E+00 1.450E-01 0.000E+00 2.740E-05 2.140E+00 5.529E+00 5.882E+00
Rank 10 8 3 4 7 11 6 12 5 1 2 9 14 13
f15 Mean 4.211E-02 4.213E-02 9.077E-04 7.088E-03 5.000E-04 4.700E-04 9.700E-04 1.200E-03 3.807E-04 3.771E-04 7.860E-04 5.372E-04 6.554E-04 5.332E-04
Std Dev 4.211E-03 4.213E-03 2.876E-04 7.855E-03 3.200E-04 3.000E-04 4.200E-04 1.600E-05 2.509E-04 2.597E-04 1.800E-04 1.221E-04 3.730E-04 3.784E-04
Rank 13 14 9 12 4 3 10 11 2 1 8 6 7 5
f16 Mean -1.032E+00 -1.032E+00 -1.032E+00 -1.030E+00 -1.030E+00 -1.030E+00 -1.032E+00 -1.032E+00 -1.014E+00 -1.032E+00 -1.031E+00 -1.019E+00 -1.015E+00 -4.928E-01
Std Dev 1.032E-01 1.032E-01 4.843E-04 3.143E-03 4.900E-04 4.900E-04 6.000E-07 6.000E-07 1.279E-02 0.000E+00 9.010E-04 1.869E-02 4.148E-01 3.367E-01
Rank 5 5 4 10 8 8 2 2 13 1 7 11 12 14
f17 Mean 3.979E-01 3.979E-01 3.979E-01 4.040E-01 3.980E-01 3.980E-01 3.980E-01 3.980E-01 4.040E-01 3.979E-01 3.984E-01 3.995E-01 3.979E-01 5.560E+01
Std Dev 3.979E-02 3.979E-02 8.525E-07 1.039E-02 1.500E-07 1.500E-07 6.000E-08 6.000E-08 6.881E+01 0.000E+00 6.770E-04 4.281E-03 1.047E-15 1.071E-13
Rank 4 4 3 12 8 8 6 6 13 1 10 11 2 14
f18 Mean 3.053E+00 3.053E+00 3.001E+00 7.503E+00 3.020E+00 3.000E+00 3.000E+00 3.000E+00 3.005E+00 3.000E+00 3.010E+00 3.479E+00 5.700E+00 8.670E+00
Std Dev 6.118E-01 6.118E-01 1.171E-03 1.040E+01 1.100E-01 0.000E+00 0.000E+00 0.000E+00 1.212E-03 0.000E+00 1.120E-02 3.319E+00 1.051E+01 1.290E+01
Rank 9 9 5 13 8 1 1 1 6 1 7 11 12 14
f19 Mean -1.000E+00 -1.000E+00 -3.863E+00 -3.862E+00 -3.860E+00 -3.860E+00 -3.860E+00 -3.860E+00 -3.858E+00 -3.863E+00 -3.862E+00 -3.862E+00 -3.725E+00 -3.598E+00
Std Dev 1.000E-01 1.000E-01 1.464E-03 6.284E-04 1.400E-02 1.400E-02 4.000E-03 1.400E-05 3.213E-03 3.843E-06 3.650E-04 1.672E-03 5.744E-01 1.869E-01
Rank 13 13 4 2 8 8 6 6 10 1 3 5 11 12
f20 Mean -1.000E+00 -1.000E+00 -3.319E+00 -3.263E+00 -3.270E+00 -3.280E+00 -3.230E+00 -3.240E+00 -3.185E+00 -3.270E+00 -3.317E+00 -3.316E+00 -3.290E+00 -1.980E-01
Std Dev 1.000E-01 1.000E-01 2.115E-03 6.040E-02 5.900E-02 5.800E-02 1.200E-01 5.700E-02 6.105E-02 5.965E-02 2.360E-02 6.674E-03 5.305E-02 4.327E-01
Rank 12 12 1 8 6 5 10 9 11 7 2 3 4 14
f21 Mean -7.096E+00 -7.097E+00 -1.011E+01 -5.165E+00 -5.520E+00 -6.860E+00 -5.540E+00 -6.960E+00 -7.544E+00 -6.090E+00 -5.513E+00 -8.739E+00 -6.683E+00 -7.476E-01
Std Dev 3.310E+00 3.310E+00 3.505E-02 2.925E+00 1.590E+00 2.670E+00 1.820E+00 3.100E+00 3.030E+00 3.456E+00 3.350E+00 1.571E+00 3.719E+00 3.170E-01
Rank 5 4 1 13 11 7 10 6 3 9 12 2 8 14
f22 Mean -8.010E+00 -8.015E+00 -1.035E+01 -5.443E+00 -5.520E+00 -8.270E+00 -6.760E+00 -8.310E+00 -8.355E+00 -6.555E+00 -6.800E+00 -9.199E+00 -6.574E+03 -9.468E-01
Std Dev 3.279E+00 3.278E+00 4.838E-02 3.278E+00 2.120E+00 2.950E+00 3.010E+00 3.100E+00 2.018E+00 3.244E+00 3.520E+00 1.217E+00 3.641E+00 3.761E-01
Rank 8 7 2 13 12 6 10 5 4 11 9 3 1 14
f23 Mean -9.089E+00 -9.047E+00 -1.048E+01 -4.911E+00 -6.570E+00 -9.100E+00 -7.630E+00 -8.500E+00 -8.944E+00 -7.402E+00 -7.285E+00 -9.229E+00 -7.576E+00 -1.130E+00
Std Dev 2.915E+00 2.924E+00 3.885E-02 3.487E+00 3.140E+00 2.920E+00 3.270E+00 1.250E+00 1.630E+00 3.213E+00 3.380E+00 1.325E+00 3.741E+00 3.678E-01
Rank 4 5 1 13 12 3 8 7 6 10 11 2 9 14
Average Rank 8.222 8.100 3.300 10.000 8.400 6.000 6.900 6.500 7.300 4.300 7.100 6.300 8.000 12.800
Overall Rank 11 10 1 13 12 3 6 5 8 2 7 4 9 14
Figure 3: The percentage of improvement of the
MICGA over the simple GA
4.3.2 High-Dimensional Multimodal Functions
Table 4 shows that the MICGA performed better in this category than in the other two categories. The MICGA still did not rank very well compared to the other algorithms, as it finished with a ranking of 9, but the simple GA still trailed the MICGA by one rank.
4.3.3 Low-Dimensional Multimodal Functions
As can be seen in Table 5, the MICGA ranked 10 out of the 14 algorithms, so the MICGA is still not the best option by any means. The simple GA had a final ranking of 11 out of 14 in this category, so the MICGA is still an improvement over the original algorithm.
4.4 Discussion
From Tables 3 - 5, it can be seen that the MICGA is not the best available optimization algorithm, but, as Figure 3 shows, the MICGA is a significant improvement over the original simple GA. This improvement shows that using the MIC to increase the accuracy of optimization algorithms is possible, and applying the MIC to another, more accurate algorithm could provide even better results than those of the algorithms in this paper. Also, the MICGA ranked higher in the more difficult benchmark function category than in the simplest category, which leads to the conclusion that the use of the MIC allows for more consistent performance across all problems.
It can be inferred that the most accurate algorithm, RCCRO1, could be improved even further using the MIC, possibly becoming the best algorithm in the high-dimensional multimodal function category, since the MICGA performed best in this category. If the MIC allows for consistency across each category, then infusing the MIC into RCCRO1 could make RCCRO1 more accurate than any other algorithm in this paper.
5. CONCLUSION
As real-world optimization problems grow more difficult, better algorithms are needed to solve them. If an algorithm can learn the structure of a problem, it can find a significantly better solution to that problem. The MIC proved to be a viable option for determining the structure of a function and notably improved the accuracy of the simple GA. Although this technique improved the accuracy of the simple GA, it did not make the improved simple GA (MICGA) a preferable option for optimization problems, because the MICGA did not perform as well as other algorithms.
Since the MIC successfully improved the accuracy of the simple GA, it may do the same for other optimization algorithms; therefore, in the future, the MIC should be applied to the other algorithms in this paper in order to determine if a more effective algorithm can be created for solving optimization problems.
6. REFERENCES
[1] A. Das and B. Mahua. Affine-based registration of CT and MR modality images of human brain using multiresolution approaches: comparative study on genetic algorithm and particle swarm optimization. Neural Computing & Applications, 20(2):223-237, 2011.
[2] K. Deb. Genetic algorithm [computer program], 2001.
[3] K. Deb, A. Anand, and D. Joshi. A computationally efficient evolutionary algorithm for real-parameter optimization. Evolutionary Computation, 10.
[4] W. Gong, Z. Cai, C. X. Ling, and H. Li. A real-coded biogeography-based optimization with mutation, 2009.
[5] N. Hansen and A. Ostermeier. Completely derandomized self-adaptation in evolution strategies. Evolutionary Computation, 9(2):159-195, 2001.
[6] S. He, Q. Wu, and J. R. Saunders. Group search optimizer: An optimization algorithm inspired by animal searching behavior. IEEE Transactions on Evolutionary Computation, 13(5):973-990, 2009.
[7] J. Kennedy and R. Eberhart. Swarm Intelligence. Morgan Kaufmann, San Francisco, CA, 2001.
[8] A. Y. Lam, V. O. Li, and J. J. Yu. Real-coded chemical reaction optimization. IEEE Transactions on Evolutionary Computation, 16(3):339-353, 2012.
[9] D. N. Reshef, Y. A. Reshef, H. K. Finucane, S. R. Grossman, G. McVean, P. J. Turnbaugh, E. S. Lander, M. Mitzenmacher, and P. C. Sabeti. Detecting novel associations in large data sets. Science, 334(6062):1518-1524, 2011.
[10] L. Sahoo, A. K. Bhunia, and P. K. Kapur. Genetic algorithm based multi-objective reliability optimization in interval environment. Computers & Industrial Engineering, 62(1):152-160, 2012.
[11] R. Storn and K. Price. Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces. Journal of Global Optimization, 11(4):341-359, 1997.
[12] H.-C. Tseng and J.-Y. Chen. Thermal analysis and packaging optimization of collector-up HBTs using an enhanced genetic-algorithm methodology. Packaging & Manufacturing Technology, 2(2):231-239, 2012.
[13] X. Yao and Y. Liu. Faster evolution strategies. In Proc. 6th International Conference on Evolutionary Programming, pages 151-162, 1997.
[14] X. Yao, Y. Liu, and G. Lin. Evolutionary programming made faster. IEEE Transactions on Evolutionary Computation, 3(2):82-102, 1999.