This thesis presents a new regression method based on radial basis function networks (RBFNs). An RBFN is trained on a set of samples to solve a function approximation problem. In such training, a critical issue is minimizing the true error over the whole sample distribution, not merely over the training samples; this is referred to as the generalization problem. The usual remedy is to hold out a validation set: part of the training samples is set aside, the remainder is used to fit the regression model, and the validation set is then used to check whether the model is overfitting the training samples. This thesis considers a new approach to regression that requires no validation set. Instead of using a validation set, an error confidence interval is estimated for the regression model and used to monitor its training. In particular, a form of the error confidence interval for RBFN regression is derived from a statistical viewpoint, and the coefficients of the interval are estimated from the given training samples. We show that the gradients of the estimated errors, obtained by adding the error confidence intervals to the training errors, are consistent across various validation sets. These gradients can therefore serve as a stopping criterion for training regression models. We also refine the method of optimizing the regression model based on the error confidence interval.
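The idea can be illustrated with a minimal sketch in Python. This is not the thesis's actual derivation: it assumes a fixed-width Gaussian basis, fits only the linear output weights by batch gradient descent, and uses a purely hypothetical confidence term of the form `alpha * sqrt(p / n)` (p basis functions, n samples) in place of the statistically derived interval. The "estimated error" is the training error plus this term, and training stops as soon as the estimated error stops decreasing, so no validation set is needed.

```python
import math
import random

def rbf_features(x, centers, width):
    # Gaussian radial basis activations for a scalar input x.
    return [math.exp(-((x - c) ** 2) / (2 * width ** 2)) for c in centers]

def train_rbfn(data, centers, width, lr=0.2, epochs=500, alpha=0.5):
    """Fit the linear output weights of an RBFN by batch gradient
    descent.  Training stops when the 'estimated error' -- training
    MSE plus a hypothetical confidence-interval width -- stops
    decreasing, i.e. its gradient over epochs becomes non-negative.
    No validation set is used."""
    n, p = len(data), len(centers)
    w = [0.0] * p
    band = alpha * math.sqrt(p / n)  # assumed (not derived) interval width
    prev_est = float("inf")
    for _ in range(epochs):
        grad = [0.0] * p
        mse = 0.0
        for x, y in data:
            phi = rbf_features(x, centers, width)
            err = sum(wi * fi for wi, fi in zip(w, phi)) - y
            mse += err * err / n
            for i in range(p):
                grad[i] += 2 * err * phi[i] / n
        for i in range(p):
            w[i] -= lr * grad[i]
        est = mse + band  # estimated error = training error + interval
        if est >= prev_est:
            break  # gradient of estimated error turned non-negative: stop
        prev_est = est
    return w, prev_est

# Toy usage: approximate a noisy sine on [0, 1] with five Gaussian centers.
random.seed(0)
data = [(x / 20, math.sin(math.pi * x / 20) + random.gauss(0, 0.05))
        for x in range(21)]
centers = [0.0, 0.25, 0.5, 0.75, 1.0]
w, est = train_rbfn(data, centers, width=0.2)
```

Since the confidence term here is constant, the stopping rule degenerates to watching the training error alone; in the thesis the interval's coefficients are estimated for the specific training samples, which is what makes the gradient of the estimated error a meaningful stopping criterion.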