DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Park, Jinkyoo | - |
dc.contributor.advisor | 박진규 | - |
dc.contributor.author | Jung, Yohan | - |
dc.date.accessioned | 2023-06-22T19:32:57Z | - |
dc.date.available | 2023-06-22T19:32:57Z | - |
dc.date.issued | 2023 | - |
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1030417&flag=dissertation | en_US |
dc.identifier.uri | http://hdl.handle.net/10203/308399 | - |
dc.description | 학위논문(박사) - 한국과학기술원 : 산업및시스템공학과, 2023.2,[viii, 105 p. :] | - |
dc.description.abstract | In machine learning, stochastic processes are used to model datasets, since a dataset can be understood as a finite realization of a stochastic process. Estimating the parameters of a stochastic process for a given dataset therefore amounts to modeling the process underlying the data. Gaussian process (GP) models are the most widely used way to apply stochastic processes to data modeling in practice. Using a GP model requires choosing a kernel function, which determines the covariance structure of the GP. Because the chosen kernel affects the modeling performance of the GP, selecting it is a vital step, and there has been much work on finding a reasonable kernel for a given dataset. Automatic Bayesian Covariance Discovery and the deep kernel framework have shown that automatically chosen kernels can model datasets well; however, these methods lack a rationale to support the chosen kernel. In this thesis, I focus on the class of stationary kernel functions because (1) stationary kernels can model a wide range of stochastic processes, namely stationary processes, and (2) stationary kernels have a theoretical background that explains their construction. Indeed, many datasets, including time-series, spatio-temporal, image, and sound datasets, are inherently stationary. I investigate how stationary kernels can be trained efficiently and how complex probabilistic models built on stationary kernels can be employed. The thesis consists of two main parts: inference for the stationary kernel and its applications. In the inference part, I propose an approximate Bayesian inference method for estimating the spectral mixture (SM) kernel efficiently. I provide theoretical justification for the approximation, a sampling strategy that stabilizes the stochastic training procedure, and an efficient update rule. I validate that the proposed method can train the SM kernel on large-scale datasets while stabilizing training and reducing training time, owing to fast convergence of the kernel hyperparameter estimates. The applications part introduces two applications. First, I propose a scalable inference method for training a hybrid HMM with a GP emission, which estimates the time-varying hidden states of time-series sequences. I validate that the proposed inference scheme allows the model to be trained efficiently on large-scale datasets, so that the trained model can estimate hidden states at that scale. Second, I introduce a deep neural network (DNN) architecture, the Bayesian Convolutional Deepsets, which models stationary processes within a deep learning framework by employing a task-dependent stationary prior. I validate that it alleviates the potential task-ambiguity issue of the existing Convolutional Deepsets framework. | - |
dc.language | eng | - |
dc.publisher | 한국과학기술원 | - |
dc.subject | Stationary Process; Gaussian Process; Neural Process; Stationary kernel; Probabilistic Model; Spectral Mixture kernel; Hidden Markov Model; Convolutional Deepsets; Approximate Bayesian Inference; Variational Inference | - |
dc.subject | 고정 과정; 가우스 과정; 신경 과정; 고정 커널; 확률 모델; 스펙트럼 혼합 커널; 은닉 마르코프 모델; 컨벌루션 심층 집합; 근사 베이지안 추론; 변분추론 | - |
dc.title | Approximate Bayesian inference for stationary kernel and its applications to probabilistic models | - |
dc.title.alternative | 정상 커널의 베이지안 근사추론기법과 확률기반모델에서의 활용 | - |
dc.type | Thesis (Ph.D.) | - |
dc.identifier.CNRN | 325007 | - |
dc.description.department | 한국과학기술원 : 산업및시스템공학과 | - |
dc.contributor.alternativeauthor | 정요한 | - |
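The abstract centers on the spectral mixture (SM) kernel as the stationary kernel under study. As background, the SM kernel of Wilson & Adams (2013) represents a stationary covariance as a weighted mixture of Gaussian spectral components. A minimal NumPy sketch for 1-D inputs follows; the parameter values are illustrative only and are not taken from the thesis:

```python
import numpy as np

def sm_kernel(tau, weights, means, scales):
    """Spectral mixture kernel k(tau) for 1-D inputs (Wilson & Adams, 2013):
    k(tau) = sum_q w_q * exp(-2 pi^2 tau^2 sigma_q^2) * cos(2 pi tau mu_q).
    tau: input distance(s); weights/means/scales: length-Q component parameters."""
    tau = np.asarray(tau, dtype=float)[..., None]  # broadcast over the Q components
    comp = np.exp(-2.0 * np.pi**2 * tau**2 * scales**2) * np.cos(2.0 * np.pi * tau * means)
    return (weights * comp).sum(axis=-1)

# Gram matrix on a toy grid; the kernel is stationary, so each entry
# depends only on the difference x_i - x_j.
x = np.linspace(0.0, 1.0, 5)
w = np.array([0.6, 0.4])    # mixture weights (illustrative)
mu = np.array([1.0, 3.0])   # spectral means, i.e. dominant frequencies
s = np.array([0.5, 0.2])    # spectral standard deviations
K = sm_kernel(x[:, None] - x[None, :], w, mu, s)
```

Because the spectral density is a mixture of Gaussians, the SM kernel can approximate a broad family of stationary covariances, which is why the thesis targets it for scalable approximate Bayesian inference.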
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.