
Alexandria Engineering Journal (2014) 53, 655–662

Hosted by Alexandria University

Alexandria Engineering Journal


www.elsevier.com/locate/aej
www.sciencedirect.com

REVIEW

Artificial Neural Networks (ANNs) for flood forecasting at Dongola Station in the River Nile, Sudan
Sulafa Hag Elsafi *

Faculty of Arts, Department of Geography, King Saud University, Riyadh, Saudi Arabia

Received 23 May 2012; revised 22 March 2014; accepted 22 June 2014


Available online 22 July 2014

KEYWORDS
River Nile;
Dongola;
Artificial neural network;
Flood forecasting

Abstract  Heavy seasonal rains cause the River Nile in Sudan to overflow and flood the surrounding areas. The floods destroy houses, crops, roads, and basic infrastructure, resulting in the displacement of people. This study aimed to forecast the River Nile flow at Dongola Station in Sudan using an Artificial Neural Network (ANN) as a modeling tool and validated the accuracy of the model against actual flow. The ANN model was formulated to simulate flows at a certain location in the river reach, based on flow at upstream locations. Different procedures were applied to predict flooding by the ANN. Readings from stations along the Blue Nile, White Nile, Main Nile, and River Atbara between 1965 and 2003 were used to predict the likelihood of flooding at Dongola Station. The analysis indicated that the ANN provides a reliable means of detecting the flood hazard in the River Nile.

© 2014 Production and hosting by Elsevier B.V. on behalf of Faculty of Engineering, Alexandria University.

Contents

1. Introduction ........................................ 656
2. Study area .......................................... 656
3. Artificial neural networks .......................... 656
4. Artificial neural network models .................... 658
   4.1. Modeling a neuron .............................. 658
   4.2. Training an artificial neural network .......... 658
5. Application of the neural network method in flood forecasting ... 658
   5.1. ANN model results .............................. 662
6. Conclusions ......................................... 662
Acknowledgments ........................................ 662
References ............................................. 662

* Tel.: +966 4808155.
E-mail address: sulafa007@gmail.com.
Peer review under responsibility of Faculty of Engineering, Alexandria University.
http://dx.doi.org/10.1016/j.aej.2014.06.010
1110-0168 © 2014 Production and hosting by Elsevier B.V. on behalf of Faculty of Engineering, Alexandria University.

1. Introduction

Flooding leads to numerous hazards, with consequences including risk to human life, disturbance of transport and communication networks, damage to buildings and infrastructure, and the loss of agricultural crops. Therefore, prevention and protection policies are required that aim to reduce the vulnerability of people and of public and private property. Many solutions for flood mitigation and prevention have been suggested; however, a vast amount of data and knowledge are required about the causes and influencing factors of floods and their resulting damage.

Flood forecasting and prediction capabilities evolved slowly during the 1970s and 1980s. However, recent technological advances have had a major impact on forecasting methodologies. For instance, hydrological models use physical detection systems to forecast flood conditions based on predicted and/or measured parameters [9]. River flow models are used as components in actual flood forecasting schemes, where forecasts are required to issue warnings and to permit the evacuation of populations threatened by rising water levels. The basis of such forecasts is invariably observation and/or predictions of rainfall in the upper catchment area and/or river flows at upstream points along main rivers or tributaries. Forecasts of the discharge are obtained in real time by using the model to transform the input functions into a corresponding discharge function of time.

At present, two main approaches are employed in hydrological forecasting. The first approach is based on mathematical modeling. It models the physical dynamics between the principal interacting components of the hydrological system. In general, a rainfall-runoff model is used to transform the point values of rainfall, evaporation, and flow data into hydrograph predictions by considering the spatial variation in storage capacity. A hydraulic channel flow routing model is then used to calculate flow. An example of this type of deterministic modeling is River Flow Forecasting (RFFS), which is a large-scale operational system currently employed by the Outer River Catchment [8]. The second approach is based on modeling the statistical relationship between the hydrologic input and output, without explicitly considering the relationships that exist among the involved physical processes. Examples of stochastic models used in hydrology are autoregressive moving average (ARMA) models [2] and the Markov method [13].

Models that provide a physically sound description of the hydrological processes that occur in a basin are expected to have significant advantages over purely empirical models. The main advantages of these models are their accuracy and the potential for performing comprehensive sensitivity analyses. The parameters of these models have direct physical interpretation, and their values might be established through field or laboratory investigations [9].

Stream flow modeling is a key tool in water resources management, early warning for flood hazards, and related impacts. Many advanced types of models exist, but they have been developed for a diverse range of climatic regions. Hence, these models must be adapted to the local situation of the Nile Basin prior to application. The ability to simulate river flow quickly and accurately is of crucial importance in flood forecasting operations. Hydrodynamic models provide a sound physical basis for this purpose and have the capability of simulating a wide range of flow situations. However, these models require accurate river geometric data, which may not be available at many locations. It is also not possible to integrate observed data directly at desired locations to improve model results [9].

Thus, the Artificial Neural Network (ANN) provides a quick and flexible approach for data integration and model development. Therefore, this research used ANN models to forecast flooding along the River Nile. It is anticipated that this work will provide baseline information toward the establishment of a flood warning system for certain sections of the River Nile.

2. Study area

Sudan has been subjected to a number of major floods along the Nile and its tributaries, as well as along seasonally flowing riverbeds (termed wadis). Examples of major floods occurred in 1946, 1988, 1994, 1996, 1998, 1999, 2000, 2001, 2006, and 2007. Such floods normally cause loss of life and massive damage to the agricultural sector and to properties in the vicinity of the rivers.

The study area was located in Dongola town (Fig. 1). This area is located downstream of the junction of the main tributaries to the Nile, including the White Nile, Blue Nile, and River Atbara. This region suffered a severe flood in 1998.

3. Artificial neural networks

An alternative approach to flow forecasting, based on the ANN, has been developed in recent years [3]. Recent studies have reported that ANNs may offer a promising alternative for the hydrological forecasting of stream flow [7]. The ANN is a computer program that is designed to model the human brain and its ability to learn tasks [4]. An ANN differs from other forms of computer intelligence in that it is not rule-based, as in an expert system. An ANN is trained to recognize and generalize the relationship between a set of inputs and outputs.

Early artificial neural networks were inspired by perceptions of how the human brain operates. In recent years, ANN technological developments have made it more of an applied mathematical technique with some similarities to the human brain. ANNs retain two characteristics of the brain as primary features: the ability to (1) 'learn' and (2) generalize from limited information [5].

Both biological and artificial neural networks employ massively interconnected, simple processing elements, or neurons. The knowledge stored as the strength of the interconnecting weights (a numeric parameter) in an ANN is modified through a process called learning, using a learning algorithm. This algorithmic function, in conjunction with a learning rule (i.e., back-propagation), is used to modify the weights in the network in an orderly fashion.
Figure 1  River Nile and its tributaries in Sudan (map labels: Main Nile, Atbara River, Blue Nile, White Nile).

Figure 2  Example of a simple feed-forward network.

Unlike most computer applications, an ANN is not "programmed"; rather, it is "taught" to give an acceptable answer to a particular problem. Input and output values are presented to the ANN, initial weights are assigned to the connections in the architecture of the ANN, and the ANN repeatedly adjusts these interconnecting weights until it successfully produces output values that match the original values. This weighted matrix of interconnections allows the neural network to learn and remember [10].

When using an ANN to solve a problem, the first step is to train the ANN to "learn" the relationship between the inputs and outputs. This is accomplished by presenting the network with examples of known inputs and outputs, in conjunction with a learning rule. The ANN maps the relationship between the inputs and outputs, and then modifies its internal functions to determine the best relationship that can be represented by the ANN.
The inner workings and processing of an ANN are often thought of as a "black box" with inputs and outputs. One useful analogy that helps to understand the mechanism occurring inside the black box is to consider the neural network as a super-form of multiple regression. Like linear regression, which finds the relationship that {y} = f{x}, the neural network finds some function f{x} when trained. However, the neural network is not limited to linear functions. It finds its own best function to the best of its ability, given the complexity used in the network, and without the constraint of linearity (Hewitson and Crane [5]).

The most common type of artificial neural network consists of three groups, or layers, of units: (1) a layer of "input" units is connected to (2) a layer of "hidden" units, which is connected to (3) a layer of "output" units (Fig. 2).

The activity of the input units represents the raw information that is fed into the network. The activity of each hidden unit is determined by the activities of the input units and the weights on the connections between the input units and the hidden units. The behavior of the output units depends on the activity of the hidden units and the weights between the hidden units and the output units [12].
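To make this layered structure concrete, the short sketch below passes a vector of inputs through one hidden layer and one output layer using logistic (sigmoid) activations. It is an illustration only: the layer sizes and random weights are hypothetical and are not taken from the study.

```python
import numpy as np

def logistic(x):
    """Logistic (sigmoid) activation applied to the hidden and output units."""
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(42)

n_inputs, n_hidden, n_outputs = 3, 5, 1              # illustrative layer sizes
W_hidden = rng.normal(size=(n_hidden, n_inputs))     # input-to-hidden weights
W_output = rng.normal(size=(n_outputs, n_hidden))    # hidden-to-output weights

x = np.array([0.2, 0.7, 0.1])                        # raw information fed to the input units
hidden_activity = logistic(W_hidden @ x)             # activity of the hidden units
output_activity = logistic(W_output @ hidden_activity)  # behavior of the output units

print(hidden_activity, output_activity)
```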

4. Artificial neural network models

4.1. Modeling a neuron

To model a neuron, each neuron performs a simple computation. It receives signals from its input links and uses these values to compute the activation level (or output) for the neuron. This value is passed to other neurons via its output links.

The input value received by a neuron is calculated by summing the weighted input values from its input links. That is:

in_i = Σ_j w_{j,i} a_j

An activation function takes the neuron input value and produces a value that becomes the output value of the neuron. This value is passed to other neurons in the network. This process is summarized in the schematic presented in Fig. 3.

Figure 3  Neuron model, in which a_j equals the activation value of unit j, w_{j,i} equals the weight on the link from unit j to unit i, in_i equals the weighted sum of inputs to unit i, a_i equals the activation value of unit i (also known as the output value), and g equals the activation function.

A neuron is connected to other neurons via its input and output links. Each incoming neuron has an activation value and each connection has a weight associated with it. The neuron sums the incoming weighted values, representing the input from the neuron [6].
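As an illustration of the weighted sum in_i = Σ_j w_{j,i} a_j defined above, the fragment below computes in_i for a single neuron i and passes it through a logistic activation g to obtain the output value a_i. The numbers are arbitrary examples, not values from the study.

```python
import math

# Activation values a_j of the units feeding neuron i, and the weights w_{j,i}
# on the corresponding links (arbitrary illustrative numbers).
a = [0.5, 0.1, 0.9]
w = [0.4, -0.2, 0.7]

in_i = sum(w_j * a_j for w_j, a_j in zip(w, a))  # weighted sum of the inputs
a_i = 1.0 / (1.0 + math.exp(-in_i))              # output value a_i = g(in_i), with logistic g

print(in_i, a_i)
```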
4.2. Training an artificial neural network

Once a network has been structured for a particular application, it is ready to be trained. To start this process, the initial weights are chosen at random. Then the training, or learning, begins.

There are two approaches to training: supervised and unsupervised. Supervised training involves a mechanism that provides the network with the desired output, either by manually "grading" the network's performance or by providing the desired outputs together with the inputs. Unsupervised training is where the network has to make sense of the inputs without outside help [1].

The computational complexity of a network and its generalization capability depend directly on the network topology. A larger-than-necessary network tends to overfit the training data, leading to poor generalization performance. In contrast, a smaller-than-necessary network faces difficulty in learning the training data. The standard method of trial and error is used to design the network. In each case of network design, a large number of networks, differing in the number of hidden nodes and memory depth, are trained, with the smallest network that learns the training data with the best generalization performance being selected [11].
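The trial-and-error topology search described above can be sketched as a loop over candidate hidden-layer sizes, keeping the smallest network whose validation error is close to the best one found. The sketch below uses scikit-learn's MLPRegressor on synthetic data purely to illustrate the selection procedure; it is not the software, data, or candidate sizes used in the study.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                                     # stand-in upstream inputs
y = X @ np.array([0.5, 0.3, 0.2]) + 0.05 * rng.normal(size=500)   # stand-in downstream target

X_train, X_val = X[:400], X[400:]
y_train, y_val = y[:400], y[400:]

results = []
for n_hidden in (5, 10, 20, 30, 58):                              # candidate hidden-layer sizes
    net = MLPRegressor(hidden_layer_sizes=(n_hidden,), activation='logistic',
                       max_iter=2000, random_state=0)
    net.fit(X_train, y_train)
    rmse = mean_squared_error(y_val, net.predict(X_val)) ** 0.5   # validation RMSE
    results.append((rmse, n_hidden))

best_rmse = min(r for r, _ in results)
# Keep the smallest network whose validation RMSE is within 5% of the best result.
chosen = min(n for r, n in results if r <= 1.05 * best_rmse)
print(results, chosen)
```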

5. Application of the neural network method in flood forecasting

To observe the modeling performance of ANNs, different stations across the Nile were used in this study. All of the data were obtained from the Sudan Ministry of Irrigation as daily readings.

Table 1  Four selected scenarios and model efficiency results.

First scenario: inputs Tamaniat (strength 0.34672) and Atbara (0.65328); slab structure 2, 58, 1; scale functions Linear(1,1), Logistic, Logistic; calibration R² = 94.32%, RMSE = 0.338; verification R² = 91.59%, RMSE = 0.374.
Second scenario: inputs Eddeim (0.23007), Tamaniat (0.31869), and Atbara (0.45123); slab structure 3, 58, 1; scale functions Linear(1,1), Logistic, Logistic; calibration R² = 94.51%, RMSE = 0.333; verification R² = 91.62%, RMSE = 0.374.
Third scenario: inputs Hassanab (0.30906) and Atbara K3 (0.69094); slab structure 2, 58, 1; scale functions Linear(1,1), Logistic, Logistic; calibration R² = 91.85%, RMSE = 0.439; verification R² = 70.29%, RMSE = 0.640.
Fourth scenario: inputs Tamaniat (0.48644) and Atbara K3 (0.51356); slab structure 2, 58, 1; scale functions Linear(1,1), Logistic, Logistic; calibration R² = 91.31%, RMSE = 0.453; verification R² = 67.92%, RMSE = 0.669.
Table 2  ANN models for the "Eddeim, Tamaniat, Atbara" stations as input and the "Dongola" Station as output for the calibration period (1970–1985) and the verification period (1986–1987).

3 Layers, Standard Connections: slab structure 3, 58, 1; scale functions Linear(1,1), Logistic, Logistic; calibration R² = 94.51%, RMSE = 0.333; verification R² = 91.62%, RMSE = 0.374.
3 Layers, Standard Connections: slab structure 3, 30, 1; scale functions Linear(1,1), Logistic, Logistic; calibration R² = 94.49%, RMSE = 0.333; verification R² = 91.73%, RMSE = 0.371.
4 Layers, Standard Connections: slab structure 3, 29, 29, 1; scale functions Linear(1,1), Logistic, Logistic, Logistic; calibration R² = 94.47%, RMSE = 0.333; verification R² = 91.75%, RMSE = 0.371.
4 Layers, Standard Connections: slab structure 3, 12, 12, 1; scale functions Linear(1,1), Logistic, Logistic, Logistic; calibration R² = 94.52%, RMSE = 0.333; verification R² = 91.72%, RMSE = 0.371.
5 Layers, Standard Connections: slab structure 3, 19, 19, 19, 1; scale functions Linear(1,1), Logistic, Logistic, Logistic, Logistic; calibration R² = 94.49%, RMSE = 0.491; verification R² = 91.71%, RMSE = 0.445.
3 Layers, Jump Connections: slab structure 3, 58, 1; scale functions Linear(1,1), Logistic, Logistic; calibration R² = 94.35%, RMSE = 0.338; verification R² = 91.32%, RMSE = 0.381.
4 Layers, Jump Connections: slab structure 3, 29, 29, 1; scale functions Linear(1,1), Logistic, Logistic, Logistic; calibration R² = 94.40%, RMSE = 0.336; verification R² = 91.28%, RMSE = 0.382.
5 Layers, Jump Connections: slab structure 3, 30, 30, 30, 1; scale functions Linear(1,1), Logistic, Logistic, Logistic, Logistic; calibration R² = 94.43%, RMSE = 0.335; verification R² = 91.24%, RMSE = 0.382.
5 Layers, Jump Connections: slab structure 3, 10, 10, 10, 1; scale functions Linear(1,1), Logistic, Logistic, Logistic, Logistic; calibration R² = 94.39%, RMSE = 0.336; verification R² = 91.38%, RMSE = 0.379.
Input Layer, Dampened Feed Back Link: slab structure 3, 58, 1, 3; scale functions Linear(1,1), Logistic, Logistic, Logistic; calibration R² = 96.56%, RMSE = 0.263; verification R² = 96.13%, RMSE = 0.255.
Hidden Layer, Dampened Feed Back Link: slab structure 3, 58, 1, 58; scale functions Linear(1,1), Logistic, Logistic; calibration R² = 97.37%, RMSE = 0.230; verification R² = 97.05%, RMSE = 0.221.
Output Layer, Dampened Feed Back Link: slab structure 3, 58, 1, 58; scale functions Linear(1,1), Logistic, Logistic; calibration R² = 92.22%, RMSE = 0.396; verification R² = 93.90%, RMSE = 0.319.
Two Hidden Slabs with Different Activation Functions: slab structure 3, 29, 29, 1; scale functions Linear(1,1), Gaussian, Gaussian Camp, Logistic; calibration R² = 94.57%, RMSE = 0.330; verification R² = 91.22%, RMSE = 0.382.
Three Hidden Slabs with Different Activation Functions: slab structure 3, 19, 19, 19, 1; scale functions Linear(1,1), Gaussian, Tanh, Gaussian Camp, Logistic; calibration R² = 94.51%, RMSE = 0.333; verification R² = 91.19%, RMSE = 0.383.
Two Hidden Slabs with Different Activation Functions, Jump Connection: slab structure 3, 29, 29, 1; scale functions Linear(1,1), Gaussian, Gaussian Camp, Logistic; calibration R² = 94.59%, RMSE = 0.330; verification R² = 91.33%, RMSE = 0.380.

Table 3 Predicted stage for Dongola during August and September (flood season) of 1998.
Date | Tamaniat gauge (m) | Atbara gauge (m) | Observed Dongola gauge (m) | Predicted (m) | Error (m) | Water level (m)
8/1/1998 14.30 14.02 12.70 13.15211 0.45211 225.18
8/2/1998 14.42 14.22 12.98 13.33011 0.35011 225.36
8/3/1998 14.58 14.365 13.32 13.46949 0.14949 225.50
8/4/1998 14.60 14.51 13.36 13.58692 0.22692 225.62
8/5/1998 14.69 14.60 13.56 13.67016 0.11016 225.70
8/6/1998 14.70 14.65 13.56 13.71087 0.15087 225.74
8/7/1998 14.68 14.72 13.60 13.7635 0.1635 225.79
8/8/1998 14.86 14.80 13.70 13.84758 0.14758 225.88
8/9/1998 15.14 14.86 13.75 13.92208 0.17208 225.95
8/10/1998 15.58 14.98 13.80 14.04664 0.24664 226.08
8/11/1998 15.86 15.16 13.95 14.19609 0.24609 226.23
8/12/1998 16.16 15.28 14.08 14.29524 0.21524 226.33
8/13/1998 16.10 15.38 14.20 14.36589 0.16589 226.40
8/14/1998 16.10 15.41 14.24 14.38748 0.14748 226.42
8/15/1998 16.08 15.46 14.26 14.42266 0.16266 226.45
8/16/1998 16.10 15.56 14.29 14.49409 0.20409 226.52
8/17/1998 16.18 15.58 14.50 14.51027 0.01027 226.54
8/18/1998 16.26 15.70 14.61 14.59528 0.014724 226.63
8/19/1998 16.38 15.80 14.74 14.66544 0.074562 226.70
8/20/1998 16.45 15.65 14.89 14.5644 0.325603 226.59
8/21/1998 16.48 15.60 15.03 14.53023 0.499772 226.56
8/22/1998 16.60 15.63 15.07 14.55258 0.517422 226.58
8/23/1998 16.68 15.70 15.05 14.60143 0.448572 226.63
8/24/1998 16.75 15.78 15.15 14.65614 0.493859 226.69
8/25/1998 16.80 15.80 15.23 14.66983 0.560172 226.70
8/26/1998 16.88 15.85 15.30 14.70344 0.596564 226.73
8/27/1998 16.93 15.92 15.38 14.74963 0.630369 226.78
8/28/1998 16.96 15.96 15.45 14.7757 0.674296 226.81
8/29/1998 16.96 16.05 15.58 14.83352 0.746475 226.86
8/30/1998 16.90 16.10 15.6 14.8651 0.734899 226.90
8/31/1998 16.86 16.10 15.65 14.86502 0.784977 226.90
9/1/1998 16.82 16.14 15.68 14.88993 0.790071 226.92
9/2/1998 16.76 16.17 15.74 14.90828 0.831717 226.94
9/3/1998 16.65 16.28 15.76 14.97457 0.78543 227.00
9/4/1998 16.50 16.33 15.80 15.00284 0.797162 227.03
9/5/1998 16.52 16.34 15.82 15.00899 0.811011 227.04
9/6/1998 16.51 16.32 15.86 14.99703 0.862966 227.03
9/7/1998 16.60 16.00 15.88 14.79988 1.080116 226.83
9/8/1998 16.69 16.17 15.88 14.90786 0.972136 226.94
9/9/1998 16.80 16.17 15.87 14.90846 0.961535 226.94
9/10/1998 16.90 16.28 15.90 14.9758 0.924203 227.01
9/11/1998 16.98 16.33 15.91 15.00572 0.90428 227.04
9/12/1998 17.00 16.30 15.91 14.98785 0.922149 227.02
9/13/1998 17.06 16.20 15.88 14.92718 0.952822 226.96
9/14/1998 17.10 16.12 15.86 14.87748 0.982523 226.91
9/15/1998 17.13 16.12 15.83 14.87737 0.952627 226.91
9/16/1998 17.16 16.12 15.82 14.87725 0.942751 226.91
9/17/1998 17.10 16.11 15.80 14.87121 0.92879 226.90
9/18/1998 17.00 16.00 15.79 14.80153 0.988475 226.83
9/19/1998 16.95 15.77 15.76 14.65007 1.109925 226.68
9/20/1998 16.90 15.77 15.65 14.65 0.999995 226.68
9/21/1998 16.80 15.91 15.48 14.74276 0.737243 226.77
9/22/1998 16.75 15.87 15.44 14.71617 0.723826 226.75
9/23/1998 16.7 15.78 15.42 14.65581 0.764191 226.69
9/24/1998 16.65 15.63 15.41 14.55309 0.856906 226.58
9/25/1998 16.80 15.64 15.16 14.5611 0.598895 226.59
9/26/1998 16.83 15.56 14.94 14.50585 0.434149 226.54
9/27/1998 16.68 15.65 14.96 14.56715 0.392845 226.60
9/28/1998 16.60 15.60 14.96 14.5318 0.428201 226.56
9/29/1998 16.40 15.60 14.90 14.52893 0.371072 226.56
9/30/1998 16.28 15.26 14.70 14.28399 0.416008 226.31

Figure 4 Maximum observed and predicted floodplain in 1998.

Because of the continuity and availability of data, 1970–1985 was selected as the training (calibration) period, while 1986–1987 was selected as the verification period. Four scenarios were examined, as shown in Table 1. Different stations were used to provide input data, while Dongola Station was used as the output. A three-layer, standard-connection model was applied to the four scenarios. To evaluate the performance of the models in forecasting flows, the root mean square error (RMSE) was used.
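For reference, the two reported performance measures can be computed from observed and predicted series as sketched below. The example numbers are the first five observed and predicted Dongola stages from Table 3, used only to show the calculation; the R² here is the coefficient of determination, and the paper's "model efficiency" R² is assumed to be computed in a comparable way.

```python
import numpy as np

def rmse(observed, predicted):
    """Root mean square error between observed and predicted series."""
    return float(np.sqrt(np.mean((observed - predicted) ** 2)))

def r_squared(observed, predicted):
    """Coefficient of determination (one common reading of 'model efficiency' R^2)."""
    ss_res = np.sum((observed - predicted) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)

# First five observed/predicted Dongola stages (m) from Table 3, for illustration.
obs = np.array([12.70, 12.98, 13.32, 13.36, 13.56])
pred = np.array([13.15211, 13.33011, 13.46949, 13.58692, 13.67016])

print(rmse(obs, pred), r_squared(obs, pred))
```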
Model performance varied with the structure of the model and the scenarios used. The second model obtained the highest efficiency, in which "Eddeim, Tamaniat, Atbara" were used as inputs and "Dongola" as the output. This result indicates that ANN models are viable for forecasting. Of note, there is no rule for selecting the number of intermediate layers or the number of neurons in the network. The procedure is based on trial and error. Thus, different numbers of intermediate layers and neurons are tried and assessed. The performance of each model is checked, and the structure with the minimum number of intermediate layers and neurons is normally selected. Therefore, in the following steps, different model structures were checked using "Eddeim, Tamaniat, Atbara" as the input and "Dongola" as the output to select the best model.
Table 2 shows 12 different ANN models using "Eddeim, Tamaniat, Atbara" flood levels as input and "Dongola" flood levels as output. Most models produced relatively good results, with some performing better than others. The table shows, in the first column, the model that was used, followed by the slab structure used, the model efficiency R² for the calibration and verification periods, and the root mean square error (RMSE).

In all of the models, the learning rate = 0.1, the momentum = 0.1, and the initial weights = 0.3.
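The reported training settings (learning rate 0.1, momentum 0.1, initial weights 0.3) correspond to a standard gradient-descent-with-momentum weight update. The fragment below is a minimal sketch of that generic update rule with a stand-in gradient; it is not the actual software or training loop used in the study.

```python
import numpy as np

LEARNING_RATE = 0.1   # learning rate reported in the paper
MOMENTUM = 0.1        # momentum reported in the paper

def momentum_step(weights, grad, velocity):
    """One gradient-descent-with-momentum update of a weight array."""
    velocity = MOMENTUM * velocity - LEARNING_RATE * grad
    return weights + velocity, velocity

weights = np.full((3, 58), 0.3)                              # weights initialised at 0.3, as reported
velocity = np.zeros_like(weights)
grad = np.random.default_rng(1).normal(size=weights.shape)   # stand-in gradient for illustration
weights, velocity = momentum_step(weights, grad, velocity)
print(weights.mean())
```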

5.1. ANN model results

For the flood disaster of 1998 at Dongola, the performance of the ANN model was compared with the actual readings from the River Nile. The ANN model was run to predict the flood level at Dongola Station. The water surface level equaled the predicted value at the station plus the zero gauge of that station. Therefore, the water surface level at Dongola equaled the stage predicted using the ANN + 212.03 m (Table 3).
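As a worked example of this conversion, using the 5 August 1998 row of Table 3: the predicted stage of 13.67 m plus the Dongola zero gauge of 212.03 m gives a water surface level of about 225.70 m, matching the value listed in the table. The short sketch below applies the same arithmetic.

```python
ZERO_GAUGE_DONGOLA_M = 212.03   # zero gauge of Dongola Station (m)

def water_surface_level(predicted_stage_m):
    """Water surface level = predicted stage at the station + zero gauge."""
    return predicted_stage_m + ZERO_GAUGE_DONGOLA_M

# 5 August 1998 row of Table 3: predicted stage 13.67016 m, observed 13.56 m.
predicted = 13.67016
observed = 13.56
print(water_surface_level(predicted))   # about 225.70 m, as listed in Table 3
print(abs(predicted - observed))        # prediction error of about 0.11 m
```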
The maximum flood level was recorded on September 5, 1998. On this date, the predicted gauge was 15.01 m; thus, the predicted water surface level was 227.04 m. The expected flooded area is presented in Fig. 4.

On the same day, the observed gauge was 15.91 m and the water surface level was 227.94 m. This output is also shown in Fig. 4 to illustrate the actual maximum floodplain of this year.

6. Conclusions

In conclusion, this research aimed to predict flooding levels along the River Nile using ANNs. This method is advantageous because only one variable is required, whereas other models require several variables to produce accurate predictions. Because of the lack of data and the difficulty of acquisition, this study provides a brief overview of the steps taken toward using ANN models for the River Nile.

This study facilitated the production of rapid and repeated analytical testing. The first assessment was undertaken using several different stations to check the performance of the models. Four scenarios were examined, using the same output station, "Dongola", but different inputs: (1) "Tamaniat, Atbara"; (2) "Eddeim, Tamaniat, Atbara"; (3) "Hassanab, Atbara K3"; and (4) "Tamaniat, Atbara K3".

The second assessment was based on the second scenario (inputs: Eddeim, Tamaniat, Atbara; output: Dongola), which produced the best results in the first assessment. Then, 12 different model options were checked. There is no rule for selecting the number of intermediate layers and the number of neurons in an ANN. The best way to define these numbers is by testing different model structures with different numbers of intermediate layers and neurons. The performance of each scenario was checked, and the structure with the minimum number of layers and neurons was selected to avoid any redundancy.

ANNs have the advantage of simplicity when compared to other, more sophisticated models. Therefore, in situations where information is lacking or difficult to obtain, the ANN method provides the most viable option for flood forecasting. Neural networks offer a means of reducing the analytical costs of topographical and hydrological information by reducing the amount of time spent analyzing the data.

Acknowledgments

This research project was supported by a grant from the Research Centre for the Humanities, Deanship of Scientific Research, King Saud University.

References

[1] D. Anderson, G. McNeil, Artificial Neural Networks Technology, Data and Analysis Center for Software, Rome, August 1992 <http://www.dtic.mil>.
[2] G.E.P. Box, G.M. Jenkins, Time Series Analysis: Forecasting and Control, Holden-Day, Oakland, California, 1976.
[3] R.S. Govindaraju, A.R. Rao, Artificial Neural Networks in Hydrology, The Netherlands, 2000.
[4] S. Haykin, Neural Networks. <http://www.cul.salk.edu/.tewon/ICA/teaching-KAIST/references.htmc>, 1994.
[5] B.C. Hewitson, R.G. Crane, Precipitation controls in Southern Mexico, in: Neural Nets, Kluwer Academic Publishers, 1994.
[6] G. Kendall, Introduction to Artificial Intelligence, course at the University of Nottingham, School of Computer Science and IT <http://www.cs.nott.ac.uk>, 2001.
[7] Ö. Kisi, A combined generalized regression neural network wavelet model for monthly stream flow prediction, KSCE J. Civil Eng. 15 (8) (2011) 1469–1479.
[8] R.J. Moore, D.A. Jones, K.B. Black, R.M. Austin, D.S. Carrington, M. Tinnion, A. Akhondi, RFFS and HYRAD: Integrated Systems for Rainfall and River Flow Forecasting in Real-Time and their Application in Yorkshire, BHS Occasional Paper No. 4, 1994, p. 12.
[9] Nile Basin Capacity Building Network (NBCBN), Flood and Drought Forecasting and Early Warning Program, 2005.
[10] Obermeier, Barron, Integration of GIS and Artificial Neural Networks for Natural Resources Applications. <http://gis.esri.com/library/userconf/proc96/To150/pAp126/P126.HTM>, 1989.
[11] P. Roy, P.S. Choudhury, M. Saharia, Dynamic ANN modeling for flood forecasting in a river network, in: International Conference on Modeling, Optimization, and Computing (ICMOC), 2010.
[12] C. Stergiou, D. Siganos, Neural Networks, Imperial College, London, 1998.
[13] S.J. Yakowitz, Markov flow models and the flood warning problem, Water Resour. Res. 21 (1985) 81–88.
