

The data available for model development consist of input-output pairs. The inputs are the variables shown entering the model in Figure 11.13, and the output is the D50. The data are segregated into two distinct groups: (1) a training set and (2) a test set. In the training set, the desired outputs are provided to the model so that its parameters can be updated; in the test set, the desired outputs are withheld. The author of this chapter applied an NN with BORN to model this hydrocyclone separation problem. However, because the ranges of the system inputs and outputs differ between the training and test sets, the NN performance was found to be unacceptable. To adapt the NN previously optimized by BORN, a GA adaptive unit (GA-BORN) was used.
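The segregation of the input-output pairs can be sketched as follows. This is a minimal illustration, not the chapter's procedure; the split fraction, the random shuffling, and the example records are all assumptions made for the sketch.

```python
import random

def train_test_split(pairs, test_fraction=0.25, seed=0):
    """Segregate input-output pairs into a training set and a test set.

    During training the desired outputs are shown to the model so its
    parameters can be updated; the test outputs are withheld and used
    only to score the model afterwards.
    """
    rng = random.Random(seed)
    shuffled = pairs[:]
    rng.shuffle(shuffled)
    n_test = int(len(shuffled) * test_fraction)
    return shuffled[n_test:], shuffled[:n_test]

# Hypothetical hydrocyclone records: (input variables, D50) pairs.
data = [((i, i + 1), 0.5 * i) for i in range(20)]
train, test = train_test_split(data)
```

Note that a random split assumes the training and test sets cover similar input and output ranges; as the text observes, when the two sets differ in range, the trained model generalizes poorly.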

The NN to be optimized by BORN has eight inputs, ten hidden neurons, and one output. This NN is trained with a learning rate of 0.01 for 75 epochs, and the BORN algorithm is then executed for 25 epochs. The NN optimized by the GA has the same structure as the one optimized using BORN. To optimize the NN connections with the GA, the GA parameters were set to a population size of 30, a string length of 19, a crossover probability of 0.8, and a mutation probability of 0.01, run for 100 generations.
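A simple GA with the stated parameters can be sketched as below. This is only an illustration of the mechanics (roulette-wheel selection, single-point crossover, bit-flip mutation), not the chapter's implementation: the fitness function here is a placeholder one-max objective standing in for the score the chapter would derive from the NN's training error.

```python
import random

POP_SIZE, STRING_LEN = 30, 19      # parameters stated in the text
P_CROSS, P_MUT = 0.8, 0.01
GENERATIONS = 100

def fitness(bits):
    # Placeholder objective (one-max); in the chapter this would be a
    # score derived from the NN training error for the encoded string.
    return sum(bits)

def roulette_pick(pop, fits, rng):
    # Fitness-proportionate (roulette-wheel) selection.
    total = sum(fits)
    r = rng.uniform(0, total)
    acc = 0.0
    for ind, f in zip(pop, fits):
        acc += f
        if acc >= r:
            return ind
    return pop[-1]

def evolve(seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(STRING_LEN)]
           for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        fits = [fitness(ind) for ind in pop]
        new_pop = []
        while len(new_pop) < POP_SIZE:
            a = roulette_pick(pop, fits, rng)
            b = roulette_pick(pop, fits, rng)
            if rng.random() < P_CROSS:          # single-point crossover
                cut = rng.randint(1, STRING_LEN - 1)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            for child in (a, b):                # bit-flip mutation
                child = [bit ^ 1 if rng.random() < P_MUT else bit
                         for bit in child]
                new_pop.append(child)
        pop = new_pop[:POP_SIZE]
    return max(pop, key=fitness)

best = evolve()
```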

Figure 11.14 shows the training result for BORN. Despite the high nonlinearity of the hydrocyclone model, the training performance of BORN was fairly good. Figures 11.15 to 11.19 show BORN test results before and after the GA adaptive unit is executed. These plots indicate that the BORN algorithm was improved by the addition of a GA adaptive unit, although the improvement was modest. Figures 11.20 and 11.21 show the results of an NN in which the number of hidden neurons is increased to 50, and Figures 11.22 and 11.23 show the results with 100 hidden neurons. The performance improved dramatically because the GA has more connections from which to adjust the NN connectivity.
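The effect of adding hidden neurons can be quantified with a quick count. Assuming a fully connected 8-H-1 feedforward topology (biases and any recurrent links BORN may add are ignored here for simplicity), each weight is one connection the GA may keep or prune, so the connectivity search space grows rapidly with H:

```python
def n_connections(n_in, n_hidden, n_out):
    # Weight count for a fully connected feedforward net (biases ignored);
    # each connection is one choice available to the GA adaptive unit.
    return n_in * n_hidden + n_hidden * n_out

for h in (10, 50, 100):
    c = n_connections(8, h, 1)
    print(f"{h:>3} hidden neurons: {c} connections, 2^{c} possible masks")
```

Going from 10 to 100 hidden neurons raises the connection count from 90 to 900 under this assumption, which is consistent with the observation that the GA has far more freedom to adjust the connectivity.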


Figure 11.14  BORN training result.


Figure 11.15  BORN test result.


Figure 11.16  Before GA-NN.


Figure 11.17  After GA.


Figure 11.18  Before GA.


Figure 11.19  After GA.


Figure 11.20  GA-NN opt. (50 hidden).


Figure 11.21  GA-NN opt. (50 hidden).


Figure 11.22  GA-NN opt. (100 hidden).


Figure 11.23  GA-NN opt. (100 hidden).

CONCLUSION

GAs are highly efficient and robust optimization algorithms that have been used effectively in a variety of disciplines. In this chapter, a simple GA was applied to enhance the performance of the Bama Optimized Recurrent Neural Networks (BORN) algorithm. To measure the performance of BORN with a GA, two different studies were considered.

First, BORN was optimized with a GA, and the result was compared with that generated by the BORN algorithm alone. Although the GA-NN found a near-optimal solution, the BORN algorithm performed better in the training case.

Second, a GA was implemented as an adaptive unit for the BORN connectivities to improve the BORN algorithm. The GA improved BORN's performance on the hydrocyclone modeling problem, although it required a fairly large NN structure to minimize the error. The advantage of this method is that the GA can improve the performance of BORN by changing only the NN connections.

REFERENCES

1  Montana, D. J. and Davis, L. (1989) Training feedforward neural networks using genetic algorithms, Proceedings of the 11th International Joint Conference on Artificial Intelligence, pp. 762-767, San Mateo, CA.
2  Goldberg, D. E. (1989) Genetic Algorithms in Search, Optimization, and Machine Learning, Reading, MA: Addison-Wesley.
3  Belew, R. K., McInerney, J., and Schraudolph, N. N. (1990) Evolving networks: Using genetic algorithms with connectionist learning, CSE Technical Report CS90-174, La Jolla, CA: University of California at San Diego, June.
4  Anderson, C. W. (1989) Learning to control an inverted pendulum using neural networks, IEEE Control Systems Magazine, 9, pp. 31-37.
5  Wieland, A. P. (1990) Evolving neural network controllers for unstable systems, IEEE International Joint Conference on Neural Networks, pp. II-667 - II-672, Seattle, WA.
6  Werbos, P. J. (1992) Neural networks, system identification, and control in the chemical process industries, in Handbook of Intelligent Control, Eds. White and Sofge, Van Nostrand Reinhold, New York, NY.
7  Narendra, K. S. (1992) Adaptive control of dynamical systems using neural networks, in Handbook of Intelligent Control, Eds. White and Sofge, Van Nostrand Reinhold, New York, NY.
8  KrishnaKumar, K. (1993) Optimization of the neural net connectivity pattern using a back-propagation algorithm, Neurocomputing, 5, pp. 273-286.
9  KrishnaKumar, K. and Nishita, K. (1996) Robustness of recurrent neural networks, WCNN'96, San Diego, CA, June.
10  KrishnaKumar, K. and Nishita, K. (1995) BORN — Bama Optimized Recurrent Neural Networks, WCNN'95, San Diego, CA.
11  Wills, B. A. (1979) Mineral Processing Technology, Toronto: Pergamon Press.
12  Plitt, L. R. (1976) A mathematical model of the hydrocyclone classifier, CIM Bulletin, 69, pp. 114-123.



Copyright © CRC Press LLC