

Book Reviews

Neural Networks: A Comprehensive Foundation. S. HAYKIN. New York: Macmillan College (IEEE
Press Book) (1994). v + 696 pp. ISBN 0-02-352761-7.

Neural Networks: A Comprehensive Foundation is a well-documented tour of both the mathematics and theory underlying the field of artificial neural network research. The bibliography, in itself, is a valuable resource for those interested in the field. The book is designed as a textbook for a senior or graduate-level course in neural networks in a computer science, physics, or engineering department. The topics are well referenced, helping those interested in expanding their knowledge of the field. Unlike many mathematical textbooks, this one clearly cites the sources of many equations and proofs, giving the reader the opportunity to review them in the original work. The interspersion of theory and discussion helps make the mathematical text more readable and enjoyable than most.
The book contains 15 chapters including the Introduction. All chapters begin with an introduction and
a section describing the organization of the chapter, and end with a summary discussion and a collection of problems. Some chapters also include a concluding section on practical applications of the theory covered in the chapter. The preface defines the symbols used in the book. Four appendices, a
bibliography, and a subject index also support the work.
As is usual for neural network texts, the first chapter includes a comparison of the various elements of
an artificial neural network to those in a biological neural network. The author continues this theme with
parallels to biological systems in all chapters. The introductory definition of a neural network is followed
by an overview of artificial neural network architectures, with notation that reflects a strong engineering focus.
An interesting section in this chapter is the discussion comparing neural networks to rule-based or
symbolic artificial intelligence systems. The chapter concludes with an excellent and concise historical
discussion of neural network development. The author is careful to credit early research in neural networks
that forms the foundation of the recent increased interest in the field. The author also presents a balanced
view of why neural network research stalled in the 1960s.
Chapters 2, 3 and 4 are also of an introductory nature. Chapter 2 provides an excellent overview of the
learning process with a solid mathematical basis; Chapter 3 introduces the concept of correlation matrix memory. In his discussion, the author clearly ties this concept to its neurobiological roots. Chapter 4 is a short discussion of the basic perceptron architecture. A more in-depth treatment of the perceptron's limitations would have been helpful, rather than simply referring the reader to other material.
Chapter 5 provides a mathematical review of the least-mean-square (LMS) algorithm. The author is
careful to emphasize the importance of this algorithm in both neural networks and other types of adaptive
systems. However, a more organized discussion of the problems inherent in applying the LMS algorithm to neural networks, along with possible solutions to those problems, would have added to the overall understanding of the topic.
Chapters 6-12 provide a discussion of a variety of neural network architectures. All chapters are
thorough and combine mathematics and discussion to describe each architecture. Of particular interest to
readers in the information sciences is the discussion of Shannon's Information Theory in relation to self-organizing systems, contained in Chapter 11. Chapter 12 discusses how modular neural networks can be
used in robotics.
Chapters 13-15 are both fascinating and useful. Chapter 13 details ways to incorporate time-varying data
into neural network models. This critical aspect of modeling real systems is often ignored in introductory
texts. Chapter 14 introduces the concept of neurodynamics in the study of neural networks and effectively
links neural networks to the burgeoning field of chaos theory; it ends with a reference to two new theories
that "may pave the way for major advances in the theory and design of neurodynamical systems." (p. 589).
Chapter 15 provides a general overview of Very Large Scale Integration (VLSI) Implementations of Neural
Networks, which is an area of increasing interest to the fields of engineering and process control. The three
appendices provide a more detailed, mathematical discussion of several of the concepts in the text.
Overall, the book is well researched, well documented, and well written. It does not include (as some other texts do) a diskette of software for modeling the neural networks described in the text; the inclusion of such a diskette would provide an important instructional aid. The wide use of references and the extensive bibliography would be more usable if a name index were included.

Department of Technology and Cognition


University of North Texas
Denton, Tex., U.S.A.

JANETTE B. BRADLEY
