
International Journal of Computational Intelligence Research
ISSN 0973-1873, Vol. 3, No. 1 (2007), pp. 66-71
© Research India Publications  http://www.ijcir.info

A New Approach for the Short-Term Load Forecasting with Autoregressive and Artificial Neural Network Models

Ummuhan Basaran Filik and Mehmet Kurban
Anadolu University, Department of Electrical Engineering, Eskisehir, Turkey
ubasaran@anadolu.edu.tr, mkurban@anadolu.edu.tr

Abstract: In this paper, a new approach to short-term load forecasting using autoregressive (AR) and artificial neural network (ANN) models is introduced and applied to the power system of Turkey, using the electrical energy consumption values for three months in 2002: January, February, and March. The load forecasting for the next day using AR and ANN models is first performed separately, and then the result of the AR analysis is used as an input of different ANN models, namely Feed Forward Back Propagation and Cascade Forward Back Propagation models. The performance of these models is compared. When the energy consumption is examined over a whole week, it is observed that Sundays differ from the other six days of the week. Because of this, the values of the past six days, excluding Sunday, are used for the load forecasting of the next day. For the systems that use ANN models only, the network is composed of a 6-neuron input layer and a 1-neuron output layer; for the systems that use both AR and ANN models, there are 7 neurons in the input layer and one neuron in the output layer. It was found that the systems that use both AR and ANN models achieve a higher forecasting accuracy.

I. Introduction

Load forecasting is very important for power system planning and security. The main problem in planning is the determination of the future load demand. Because electrical energy cannot be stored efficiently, correct load forecasting is essential for correct investment decisions. There are three types of load forecasting: short-term, middle-term, and long-term. Short-term load forecasting predicts the hourly loads for one day or even one week ahead. A short-term load forecaster calculates the estimated load for each hour of the day, the daily peak load, or the daily or weekly energy generation. Short-term load forecasting is important for the economic and secure operation of power systems [1].

Many methods have been developed for this forecasting task. They are based on various statistical techniques such as regression [2-3], the Box-Jenkins model [4], exponential smoothing [5], and Kalman filters [6]. Many electric power companies have adopted such conventional prediction methods for load forecasting. However, these methods cannot properly represent the complex nonlinear relationships that exist between the load and the series of factors that influence it [4], [7]. Recently, artificial neural networks (ANN) have been successfully applied to short-term load forecasting [8-14].

In this paper, a new approach in which both autoregressive (AR) and artificial neural network (ANN) models are used for short-term load forecasting is introduced. The results of the AR analysis are used as an input of different ANN models, namely Feed Forward Back Propagation and Cascade Forward Back Propagation models. All of the methods are compared with each other. First, the models are described one by one; then the applications and simulations for the system are given.

II. Auto Regressive Models

Autoregressive models explain a time series value in a time period through its relationship with the previous values and an error term. AR models are named after the number of previous time periods they include. An autoregressive process AR(p) of order p is defined as

    X_t = \sum_{r=1}^{p} \phi_r X_{t-r} + \epsilon_t                (1)

where \phi_1, \phi_2, ..., \phi_p are fixed constants and \epsilon_t is a sequence of independent (or uncorrelated) random variables with zero mean and variance \sigma^2 [15]. The model that contains only one past observation value is called AR(1). The AR(1) process is defined by

    X_t = \phi_1 X_{t-1} + \epsilon_t                               (2)
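For readers who prefer to see Eq. (1) operationally, the following short Python sketch (not part of the original paper, whose analysis is done in MATLAB) generates an AR(p) series; the coefficients and noise variance below are illustrative placeholders, not values estimated from the load data.

import numpy as np

def simulate_ar(phi, n, sigma=1.0, seed=0):
    """Generate n samples of the AR(p) process of Eq. (1):
    X_t = phi_1 X_{t-1} + ... + phi_p X_{t-p} + eps_t."""
    rng = np.random.default_rng(seed)
    p = len(phi)
    x = np.zeros(n + p)                          # zero initial conditions
    eps = rng.normal(0.0, sigma, size=n + p)     # zero-mean noise with variance sigma^2
    for t in range(p, n + p):
        # phi[0] multiplies X_{t-1}, phi[1] multiplies X_{t-2}, and so on.
        x[t] = sum(phi[r] * x[t - 1 - r] for r in range(p)) + eps[t]
    return x[p:]

# Hypothetical AR(2) coefficients, chosen only to make the recursion concrete.
series = simulate_ar(phi=[0.6, 0.3], n=200)
print(series[:5])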

III. Artificial Neural Network Models


A neural network is a massively parallel distributed processor made up of simple processing units called neurons, which have a natural propensity for storing experiential knowledge [16]. The motivation for the development of neural network technology stems from the desire to develop an artificial system that could perform "intelligent" tasks similar to those performed by the human brain. Neural networks resemble the human brain in the following two ways:

1. A neural network acquires knowledge through learning.

2. A neural network's knowledge is stored within inter-neuron connection strengths known as synaptic weights.

The true power and advantage of neural networks lies in their ability to represent both linear and non-linear relationships and in their ability to learn these relationships directly from the data being modeled. Traditional linear models are simply inadequate when it comes to modeling data that contains non-linear characteristics. The most common neural network model is the multilayer perceptron (MLP). This type of neural network is known as a supervised network because it requires a desired output in order to learn. The goal of this type of network is to create a model that correctly maps the input to the output using historical data, so that the model can then be used to produce the output when the desired output is unknown [17].

IV. A New Approach for the ANN Models

The new approach is to use the results of the AR analysis as an input of different ANN models, namely Feed Forward Back Propagation and Cascade Forward Back Propagation models. The structure of the new ANN model is given in Figure 1.

Figure 1: The structure of the AR-input ANN.

V. Applications and Simulations

In the application, the load forecasting for the next day using AR and ANN models is first performed separately; then the results of the AR analysis are used as an input of different ANN models, namely Feed Forward Back Propagation and Cascade Forward Back Propagation models. All of them are compared with each other. When the energy consumption of the whole week is examined, it is observed that Sundays differ from the other six days of the week. Because of this, the values of the past six days, excluding Sunday, are used for the load forecasting of the next day; in this way the correction of the neural network is provided. First, the hourly load values are determined. The hourly load values for January, February, and March are shown in Figure 2.

Figure 2: The hourly load values (MW) of a week for January, February and March.

The filter order is set to 6 for this analysis. If we analyze the pole-zero plot of the AR(6) filter, one of the poles is near the unit circle, as shown in Figure 3. This indicates that an AR model may not be very suitable for this analysis. In Figure 2, the hourly load values (MW) for one week are plotted for January, February and March. The data sequence has some periodicity, but statistical values such as the bias and variance change from one month to the other. The daily data are also similar from one day to the next, which introduces the periodicity. However, unexpected events such as holidays, power plant failures, and changing weather conditions affect the loads. Therefore, if we perform load forecasting by linear prediction with AR, these unexpected changes decrease the prediction performance. In this model (based on parametric methods), it is assumed that the data sequence is stationary; however, the data are not stationary, as can be observed in Figure 4, where the statistical characteristics change rapidly.
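The pole check described in this section can be reproduced in a few lines; the sketch below is illustrative Python (the paper's analysis is done in MATLAB), and the AR(6) coefficients in it are hypothetical placeholders rather than the values estimated from the Turkish load data.

import numpy as np

# Hypothetical AR(6) coefficients [phi_1, ..., phi_6]; the coefficients the paper
# estimates from the January load data are not reproduced here.
phi = np.array([0.90, 0.05, 0.02, 0.01, 0.01, 0.005])

# The AR(6) filter H(z) = 1 / (1 - sum_r phi_r z^{-r}) has its poles at the roots
# of the polynomial z^6 - phi_1 z^5 - ... - phi_6.
poles = np.roots(np.concatenate(([1.0], -phi)))

for pole in poles:
    print(pole, "magnitude:", round(abs(pole), 3))

# A magnitude close to 1 means a pole near the unit circle, i.e. a nearly
# non-stationary series, which is the situation reported for Figure 3.
print("largest pole magnitude:", np.max(np.abs(poles)))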
Figure 3: Pole-zero plot of the AR(6) filter.

Figure 4: Pseudospectrum estimate via MUSIC (multiple signal classification method), order 6.

Figure 5: Actual and predicted load values for January.

Figure 6: Curve showing the training process of the feed forward backpropagation network (performance 267759, goal 0, 199 epochs).

A. Forecasting with AR

The AR models explain time series values in a time period through their relationship with the previous values and an error term. AR models are named after the number of previous time periods they include. Denote by Y the data matrix built from the 486 hourly values (480 rows of six consecutive samples) and by y the 480 x 1 target vector. We then have

    \begin{bmatrix} y(1) & \cdots & y(6) \\ y(2) & \cdots & y(7) \\ \vdots & & \vdots \\ y(480) & \cdots & y(485) \end{bmatrix} \begin{bmatrix} a(1) \\ a(2) \\ \vdots \\ a(6) \end{bmatrix} = \begin{bmatrix} y(7) \\ y(8) \\ \vdots \\ y(486) \end{bmatrix}        (3)

that is, Y a = y, and the AR coefficients are obtained as a \approx (Y^H Y)^{-1} Y^H y. The AR coefficients can therefore be found by the least squares method. This AR model is applied to the system. For this analysis, the hourly load values for January are used and the AR(6) filter is constructed; the AR analysis program is written in MATLAB. Using the results of the AR analysis, the actual and predicted load values for January 1-20 are shown in Figure 5.

B. Forecasting with Artificial Neural Network Models

An ANN is trained such that a particular input leads to a specific target output. There are generally four steps in the training process: (1) assemble the training data, (2) create the network object, (3) train the network, and (4) compute the network response to new inputs.

Different neural network structures are tested for this system and the results of these analyses are compared. All of the neural network structures have an input layer composed of 6 neurons and an output layer with 1 neuron. Feed Forward Backpropagation is selected as the first network type. The network is implemented using the MATLAB Neural Networks Toolbox. The size of the input matrix is 6 x 480, and the size of the target vector is 1 x 480 in this structure. The network is trained for 199 epochs, as illustrated in Figure 6. The actual and predicted load values of the feed forward backpropagation network are shown in Figure 7.

The Cascade Forward Back Propagation network has 5 layers. The first layer has 6 neurons; the second, third, and fourth layers have 4 neurons each, and the last layer has 1 neuron. This network is trained for 50 epochs, as illustrated in Figure 8. The actual and predicted load values of the Cascade Forward Back Propagation network are shown in Figure 9.
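As an illustration of the least-squares construction of Eq. (3) in Section V.A, the following Python sketch builds the lag matrix and solves Y a = y. The paper's own program is written in MATLAB; the synthetic `load_mw` series below merely stands in for the 486 hourly January values, which are not reproduced here, so only the shape of the computation is shown.

import numpy as np

def fit_ar_least_squares(series, order=6):
    """Estimate AR coefficients a from Y a = y by least squares (Eq. 3)."""
    n = len(series)
    # Row t of Y holds [y(t), ..., y(t+order-1)]; the corresponding target is y(t+order).
    Y = np.column_stack([series[r:n - order + r] for r in range(order)])
    y = series[order:]
    a, *_ = np.linalg.lstsq(Y, y, rcond=None)    # a ~ (Y^T Y)^-1 Y^T y
    return a

def predict_one_step(series, a):
    """Predict the next value from the last len(a) observations (oldest first)."""
    order = len(a)
    return float(np.dot(series[-order:], a))

# Placeholder data standing in for the hourly January load values (MW).
rng = np.random.default_rng(1)
load_mw = 15000 + 3000 * np.sin(np.arange(486) * 2 * np.pi / 24) + rng.normal(0, 200, 486)

a = fit_ar_least_squares(load_mw, order=6)
print("AR(6) coefficients:", a)
print("one-step prediction:", predict_one_step(load_mw, a))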
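For the ANN models of Section V.B, the paper uses the MATLAB Neural Networks Toolbox; purely as an illustration of the four training steps in Python, the sketch below uses scikit-learn's MLPRegressor instead. The hidden-layer size is an assumption (the paper does not state one for this 6-input, 1-output structure), and the synthetic data again stands in for the January load values.

import numpy as np
from sklearn.neural_network import MLPRegressor

def make_lagged_dataset(series, order=6):
    """Step 1: assemble training data as (6 lagged values -> next value) pairs."""
    n = len(series)
    X = np.column_stack([series[r:n - order + r] for r in range(order)])
    y = series[order:]
    return X, y

# Placeholder series standing in for the hourly January load values (MW).
rng = np.random.default_rng(2)
load_mw = 15000 + 3000 * np.sin(np.arange(486) * 2 * np.pi / 24) + rng.normal(0, 200, 486)

X, y = make_lagged_dataset(load_mw)                      # step 1: assemble data
net = MLPRegressor(hidden_layer_sizes=(8,),              # step 2: create the network
                   max_iter=2000, random_state=0)        # (hidden size is an assumption)
net.fit(X, y)                                            # step 3: train
predicted = net.predict(X[-24:])                         # step 4: response to new inputs
print(predicted[:5])
# Feature scaling and a train/test split are omitted here for brevity.

A cascade-forward variant, in which each layer also receives the original inputs, has no direct scikit-learn equivalent, so only the plain feed-forward case is sketched.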
Figure 7: Actual and predicted load values for January for the feed forward backpropagation network.

Figure 8: Curve showing the training of the cascade forward backpropagation network (50 epochs).

Figure 9: Actual and predicted load values for January for the Cascade Forward Back Propagation network.

Figure 10: Curve showing the training process of the second structure (18 epochs).

C. Forecasting with a New Approach

In this part, the correction of the neural networks by giving the results of the AR model as an input to the ANN model is applied to the system, with January as an example application. The first structure has 2 layers; the first layer is composed of 6 neurons and the output layer is composed of one neuron. Feed Forward Back Propagation is chosen as the network type. In this case, the size of the input matrix is 7 x 480 and the size of the target vector is 1 x 480. The network is trained for 18 epochs. The curve of the epoch number and training error for this feed forward back propagation structure is shown in Figure 10, and the actual and predicted load values are shown in Figure 11. As can be observed, the Feed Forward Back Propagation network with AR analysis reaches a lower performance (error) value than the one without AR, and the number of epochs required is also smaller than for the previous networks.

The Cascade Forward Back Propagation structure has 5 layers. The first layer has 7 neurons; each of the second, third and fourth layers has 4 neurons, and the output layer has 1 neuron. The size of the input matrix is 7 x 480 and the size of the target vector is 1 x 480 in this structure. The correction of the ANNs and the actual and predicted load values for the cascade forward back propagation with AR for January are shown in Figures 12 and 13, respectively.

The mean square error is calculated according to the following formula:

    MSE = \frac{1}{N} \sum_{i=1}^{N} (x_i - \hat{x}_i)^2          (4)

where N is the number of data points, x_i is the actual value, and \hat{x}_i is the predicted value. The MSE values of all the analyses are given in Table 1. As can be observed from the results, the new approach with the Cascade Forward Back Propagation neural network structure gives the best result, because its mean square error is lower than those of the others.
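To make the new approach of Section V.C and the MSE of Eq. (4) concrete, the sketch below (illustrative Python, not the authors' MATLAB code) appends a one-step AR(6) prediction to the six lagged load values to form the 7-dimensional input, trains the same kind of network, and compares MSE values. The data, hidden-layer size, and training settings are placeholders carried over from the earlier sketches.

import numpy as np
from sklearn.neural_network import MLPRegressor

def mse(actual, predicted):
    """Mean square error, Eq. (4)."""
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    return np.mean((actual - predicted) ** 2)

def ar_ann_dataset(series, order=6):
    """Build the 7-column input: six lagged values plus the AR prediction for each row."""
    n = len(series)
    Y = np.column_stack([series[r:n - order + r] for r in range(order)])   # 480 x 6 lags
    y = series[order:]                                                     # targets
    a, *_ = np.linalg.lstsq(Y, y, rcond=None)                              # AR(6) coefficients
    ar_pred = Y @ a                                                        # AR forecast per row
    X = np.column_stack([Y, ar_pred])                                      # 480 x 7 input
    return X, y, ar_pred

# Placeholder series standing in for the hourly January load values (MW).
rng = np.random.default_rng(3)
load_mw = 15000 + 3000 * np.sin(np.arange(486) * 2 * np.pi / 24) + rng.normal(0, 200, 486)

X, y, ar_pred = ar_ann_dataset(load_mw)
# Hidden-layer size chosen to echo the 6-neuron first layer mentioned in the text (an assumption).
net = MLPRegressor(hidden_layer_sizes=(6,), max_iter=2000, random_state=0).fit(X, y)

print("MSE, AR only  :", mse(y, ar_pred))
print("MSE, AR + ANN :", mse(y, net.predict(X)))
# Feature scaling and a held-out test set are omitted for brevity.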
Figure 11: The correction of ANNs for January.

Figure 12: Correction neural networks for January.

Figure 13: Actual and predicted load values for January for the cascade forward back propagation with AR.

Table 1: MSE values of the analyses

Method                                                             MSE value (GW)
Forecasting with AR                                                39.862
Forecasting with ANN models, Feed Forward BP algorithm             26.776
Forecasting with ANN models, Cascade Forward BP algorithm          21.975
Forecasting with the new approach, Feed Forward BP algorithm       20.078
Forecasting with the new approach, Cascade Forward BP algorithm    18.085


VI. Conclusion

AR and multilayer artificial neural network models are proposed for short-term load forecasting in this paper. The values of the past six days, excluding Sunday, were used to forecast the next day. In the first approach, an AR model is used for load forecasting. Then, different neural network structures were tested; the network types tested include Feed Forward Back Propagation and Cascade Forward Back Propagation. All of these neural network structures comprise six neurons in the input layer and one neuron in the output layer. Cascade Forward Back Propagation was found to be more efficient than the Feed Forward Back Propagation method. Lastly, the neural networks were corrected by applying the result of the AR model to their inputs. The best result is obtained by the Cascade Forward Back Propagation neural network with the AR input. The results show that ANNs are feasible and valuable approaches to short-term load forecasting.

References

[1] Gross, G., Galiana, F.D.: Short-Term Load Forecasting, Proc. IEEE, Vol. 75 (1987) 1558-1573

[2] Asbury, C.: Weather Load Model for Electric Demand Energy Forecasting, IEEE Transactions on Power Apparatus and Systems, Vol. PAS-94 (1975) 1111-1116

[3] Papalexopoulos, A.D., Hesterberg, T.C.: A regression-based approach to short-term system load forecasting, Proceedings of the PICA Conference (1989) 414-423

[4] Hill, T., O'Connor, M., Remus, W.: Neural Network Models for Time Series Forecasts, Management Science (1996) 1082-1092

[5] Huang, H., Hwang, R., Hsieh, J.: A new artificial intelligent peak power load forecaster based on non-fixed neural networks, Electr. Power Energy Syst. (2002) 245-250

[6] Irisarri, G.D., Widergren, S.E., Yehsakul, P.D.: On-line load forecasting for energy control center application, IEEE Transactions on Power Apparatus and Systems, PAS-101 (1971) 900-911

[7] Christiaanse, W.R.: Short-term load forecasting using general exponential smoothing, IEEE Transactions on Power Apparatus and Systems, PAS-90 (1971) 900-911

[8] Park, D.C., El-Sharkawi, M.A., Marks, R.J., Atlas, L.E., Damborg, M.J.: Electric Load Forecasting Using an Artificial Neural Network, IEEE Transactions on Power Systems, Vol. 6, No. 2 (1991) 442-449

[9] Peng, T.M., Hubele, N.F., Karady, G.G.: Advancement in the Application of Neural Networks for Short-Term Load Forecasting, IEEE Transactions on Power Systems, Vol. 7 (1992) 250-258

[10] Lu, C.N., Wu, H.T., Vemuri, S.: Neural Network Based Short Term Load Forecasting, IEEE Transactions on Power Systems, Vol. 8 (1993) 336-342

[11] Peng, T.M., Hubele, N.F., Karady, G.G.: An Adaptive Neural Network Approach to One-Week Ahead Load Forecasting, IEEE Transactions on Power Systems, Vol. 8 (1993) 1195-1203

[12] Papalexopoulos, A.D., How, S., Peng, T.M.: An Implementation of a Neural Network Based Load Forecasting Model for the EMS, Paper 94 WM 209-7 PWRS, presented at the IEEE/PES Winter Meeting (1994)

[13] Mohammed, O., Park, D., Merchant, R., Dinh, T., Tong, C., Azeem, A., Farah, J., Drake, C.: Practical Experiences with an Adaptive Neural Network Short-Term Load Forecasting System, Paper 94 WM 210-5 PWRS, presented at the IEEE PES Winter Meeting (1994)

[14] Senjyu, T., Mandal, P., Uezato, K., Funabashi, T.: Next-day load curve forecasting using recurrent neural network structure, IEE Proc. Gener. Transm. Distrib., Vol. 151 (2004)

[15] http://www.statistics.com/content/glossary/a/armamodel.php

[16] Chauhan, B.K., Sharma, A., Hanmandlu, M.: Neuro Fuzzy Approach Based Short Term Electric Load Forecasting, IEEE/PES Transmission and Distribution Conference & Exhibition: Asia and Pacific, Dalian, China (2005)

[17] http://www.nd.com/neurosolutions/products/ns/whatisNN.html
