Optical neural networks based on the fractional Fourier transform (FRT) are examined in connection with the log-likelihood cost function and parallelism. A neural network using the FRT with the mean square error is found to classify patterns considerably better than one using the ordinary Fourier transform with the mean square error; its classification performance nevertheless remains limited. To speed up learning convergence, the mean square error is therefore first replaced with the log-likelihood. Parallelism is then introduced into the FRT neural network with the log-likelihood, and its effect on the network is studied. Finally, it is demonstrated that the combination of the FRT, the log-likelihood, and parallelism significantly improves both the learning convergence and the recall rate of the neural network. (C) 1998 Elsevier Science B.V. All rights reserved.
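The speed-up from replacing the mean square error with the log-likelihood can be illustrated by a standard argument (a general sketch, not the paper's specific network): for a sigmoid output unit, the gradient of the mean square error carries an extra derivative factor that vanishes when the unit saturates, whereas the log-likelihood (cross-entropy) gradient does not. All function names below are illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mse_grad(z, y):
    """Gradient of the mean square error w.r.t. the pre-activation z
    of a single sigmoid unit with target y; note the sigma'(z) factor."""
    a = sigmoid(z)
    return (a - y) * a * (1.0 - a)

def loglik_grad(z, y):
    """Gradient of the log-likelihood (cross-entropy) loss w.r.t. z;
    the sigma'(z) factor cancels, leaving just the output error."""
    return sigmoid(z) - y

# A badly saturated unit: strongly negative pre-activation, target 1.
z, y = -6.0, 1.0
print(abs(mse_grad(z, y)), abs(loglik_grad(z, y)))
```

For this saturated unit the log-likelihood gradient is orders of magnitude larger than the mean-square-error gradient, so gradient descent escapes the plateau far sooner, which is the usual explanation for the faster convergence reported in the abstract.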