The data available for model development consist of input-output pairs. The inputs are the variables shown entering the model in Figure 11.13, and the output is the D50. The data are segregated into two distinct groups: (1) a training set and (2) a test set. In the training set, the desired outputs are provided to the model so that its parameters can be updated; in the test set, the desired outputs are withheld. An NN with BORN was applied to this hydrocyclone separation problem by the author of this chapter. However, because the ranges of the system inputs and outputs differ between the training and test data sets, the NN performance was found to be unacceptable. To adapt the NN previously optimized by BORN, a GA-BORN adaptive unit was used.
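The chapter does not specify how the pairs are divided, but the segregation into training and test sets can be sketched as a simple random split. The variable names, the 80/20 ratio, and the synthetic stand-in data below are illustrative assumptions, not details from the chapter.

```python
import random

def split_data(pairs, train_fraction=0.8, seed=0):
    """Segregate input-output pairs into a training set and a test set.

    The 80/20 ratio is an assumption; the chapter only states that
    two distinct groups are formed.
    """
    rng = random.Random(seed)
    shuffled = pairs[:]
    rng.shuffle(shuffled)
    n_train = int(len(shuffled) * train_fraction)
    return shuffled[:n_train], shuffled[n_train:]

# Synthetic stand-in data: 100 pairs of (eight cyclone inputs, one D50 output).
pairs = [([random.random() for _ in range(8)], random.random())
         for _ in range(100)]
train_set, test_set = split_data(pairs)
```

During training, both elements of each pair are shown to the model; at test time only the inputs are presented and the predicted D50 is compared against the withheld output.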
The NN to be optimized by BORN has eight inputs, ten hidden neurons, and one output. It is trained with a learning rate of 0.01 for 75 epochs, and the BORN algorithm is then executed for 25 epochs. The NN optimized by the GA has the same structure as the one optimized using BORN. To optimize the NN connections with the GA, the GA parameters are set to a population size of 30, a string length of 19, a crossover probability of 0.8, and a mutation probability of 0.01, run for 100 generations.
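The GA settings above can be sketched as a minimal binary GA. The chapter does not describe the selection scheme, the string encoding, or the fitness function, so the tournament selection and the OneMax placeholder objective below are assumptions; in the chapter's setup the fitness would instead score a decoded NN-connection pattern by its training error.

```python
import random

POP_SIZE, STRING_LEN = 30, 19        # population of 30 strings of length 19
P_CROSSOVER, P_MUTATION = 0.8, 0.01  # probabilities from the chapter
GENERATIONS = 100

def fitness(chrom):
    # Placeholder objective (count of ones); stands in for evaluating
    # a decoded NN-connectivity pattern.
    return sum(chrom)

def tournament(pop, rng, k=2):
    # Binary tournament selection (an assumed scheme).
    return max(rng.sample(pop, k), key=fitness)

def crossover(a, b, rng):
    # One-point crossover, applied with probability P_CROSSOVER.
    if rng.random() < P_CROSSOVER:
        point = rng.randrange(1, STRING_LEN)
        return a[:point] + b[point:]
    return a[:]

def mutate(chrom, rng):
    # Bit-flip mutation, each bit with probability P_MUTATION.
    return [bit ^ 1 if rng.random() < P_MUTATION else bit for bit in chrom]

def run_ga(seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(STRING_LEN)]
           for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        pop = [mutate(crossover(tournament(pop, rng),
                                tournament(pop, rng), rng), rng)
               for _ in range(POP_SIZE)]
    return max(pop, key=fitness)

best = run_ga()
```

With this easy placeholder objective the GA converges quickly; with an NN-training-error objective each fitness evaluation is far more expensive, which is why the GA's run length and population size matter.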
Figure 11.14 shows the training result for BORN. Despite the high nonlinearity of the hydrocyclone model, the training performance of BORN was fairly good. Figures 11.15 to 11.19 show BORN test results before and after the GA adaptive unit is executed. These plots indicate that the BORN algorithm improved with the addition of a GA adaptive unit, although the improvement was not large. Figures 11.20 and 11.21 show the results of an NN in which the number of hidden neurons is increased to 50, and Figures 11.22 and 11.23 show the results with 100 hidden neurons. The performance improved dramatically because the GA has more connections from which to adjust the NN connectivity.
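The point that a larger hidden layer gives the GA "more selections" can be made concrete: if the GA chromosome is a binary mask with one bit per weight, the search space grows with the hidden-layer size. The masked forward pass below is an illustrative sketch (the function names, the ReLU activation, and the one-bit-per-weight encoding are assumptions, not the chapter's actual implementation).

```python
def n_connections(n_inputs, n_hidden, n_outputs):
    """Number of maskable connections in a one-hidden-layer NN:
    one GA bit per weight."""
    return n_inputs * n_hidden + n_hidden * n_outputs

def masked_forward(x, w_in, w_out, mask_in, mask_out):
    """Forward pass in which masked-out weights are switched off.

    w_in is a list of per-hidden-neuron weight rows; mask_in has the
    same shape with 0/1 entries chosen by the GA. The ReLU activation
    is illustrative.
    """
    hidden = [max(0.0, sum(w * m * xi for w, m, xi in zip(row, mrow, x)))
              for row, mrow in zip(w_in, mask_in)]
    return sum(w * m * h for w, m, h in zip(w_out, mask_out, hidden))

# 8 inputs and 1 output, as in the chapter's NN:
small = n_connections(8, 10, 1)   # 10 hidden neurons -> 90 connections
large = n_connections(8, 100, 1)  # 100 hidden neurons -> 900 connections
```

Growing the hidden layer from 10 to 100 neurons multiplies the number of GA-adjustable connections tenfold, which is consistent with the dramatic improvement reported for the larger networks.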
Figure 11.14 BORN training result.
Figure 11.15 BORN test result.
Figure 11.16 Before GA-NN.
Figure 11.17 After GA.
Figure 11.18 Before GA.
Figure 11.19 After GA.
Figure 11.20 GA-NN opt. (50 hidden).
Figure 11.21 GA-NN opt. (50 hidden).
Figure 11.22 GA-NN opt. (100 hidden).
Figure 11.23 GA-NN opt. (100 hidden).
CONCLUSION
GAs are highly efficient and robust optimization algorithms that have been used effectively in a variety of disciplines. In this chapter, a simple GA was applied to enhance the performance of the Bama Optimized Recurrent Neural Networks (BORN) algorithm. To measure the performance of BORN with a GA, two different studies were considered.
First, BORN was optimized with a GA, and the result was compared with the result generated by the BORN algorithm alone. Although the GA-NN found a near-optimal solution, the BORN algorithm performed better in the training case.
Second, a GA was implemented as an adaptive unit for the BORN connectivities to improve the BORN algorithm. The GA improved performance on the hydrocyclone modeling problem; however, it needed a fairly large NN structure to minimize the error. The advantage of this method is that a GA can improve the performance of BORN by changing only the NN connections.