This thesis proposes a new approach to designing support vector machines (SVMs) with Gaussian kernel functions for multicategory classification. Given the training samples of a multicategory classification problem, it suggests methodologies for selecting support vectors among the training samples and for constructing discriminant functions that classify the data. A brief background on the generalization bound, which describes the relationship between the general and empirical risks of SVMs, is also provided.

For the classification problem, the SVM constructs a discriminant function representing the separating hyperplane according to the structural risk minimization (SRM) principle; that is, the structure of the SVM is optimized so as to minimize the general risk. One weakness of this approach is that the SVM is intrinsically linear. For nonlinear decision boundaries, the SVM employs nonlinear kernels, such as the Gaussian kernel function, instead of linear units. In this case, however, the SVM algorithm provides no systematic way of determining the kernel parameters. Accordingly, a new approach is considered that determines the parameters of the kernel functions by minimizing the mean square error (MSE) after the support vectors have been selected.

The suggested algorithm consists of two parts: 1) selecting support vectors among the training samples by optimizing a quadratic objective defined as the sum of the decision error and a regularization term, such as the squared norm of the weight parameters of an estimation network, and 2) estimating the parameters of the SVM with kernel functions by minimizing the mean square error. To demonstrate the effectiveness of the suggested approach, simulations on the classification of various benchmark data sets from the UCI machine learning repository are performed.
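The two-part algorithm described above might be sketched as follows. This is a minimal illustrative implementation under stated assumptions, not the thesis's exact method: the ridge-style quadratic objective used for support-vector selection, the grid of candidate kernel widths, the retention fraction, and the toy two-class data are all assumptions introduced here for illustration (the thesis evaluates on UCI benchmark data).

```python
import numpy as np

# Toy two-class data (assumed; the thesis uses UCI benchmark sets).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.5, (20, 2)), rng.normal(1.0, 0.5, (20, 2))])
y = np.hstack([-np.ones(20), np.ones(20)])

def gaussian_kernel(A, B, sigma):
    """Gaussian (RBF) kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def select_support_vectors(X, y, sigma=1.0, lam=1e-2, keep=0.25):
    """Part 1 (sketch): minimize a quadratic objective -- squared decision
    error plus a regularization term on the expansion weights -- which here
    reduces to kernel ridge regression, then keep the samples whose weights
    have the largest magnitude as support vectors."""
    K = gaussian_kernel(X, X, sigma)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    k = max(1, int(keep * len(X)))
    return np.argsort(-np.abs(alpha))[:k]

def fit_sigma(X, y, sv_idx, sigmas, lam=1e-2):
    """Part 2 (sketch): with the support vectors fixed, choose the Gaussian
    kernel width that minimizes the mean square error of the discriminant
    function on the training samples."""
    best = None
    for s in sigmas:
        Ksv = gaussian_kernel(X[sv_idx], X[sv_idx], s)
        w = np.linalg.solve(Ksv + lam * np.eye(len(sv_idx)), y[sv_idx])
        f = gaussian_kernel(X, X[sv_idx], s) @ w  # discriminant outputs
        mse = np.mean((f - y) ** 2)
        if best is None or mse < best[1]:
            best = (s, mse)
    return best

sv = select_support_vectors(X, y)
sigma, mse = fit_sigma(X, y, sv, sigmas=[0.3, 1.0, 3.0])
```

The selection step here substitutes a least-squares surrogate for the decision error so the quadratic program has a closed-form solution; the actual optimization technique and error term are specified in the body of the thesis.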