DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Han, In-Goo | - |
dc.contributor.advisor | 한인구 | - |
dc.contributor.author | Shin, Sung-Woo | - |
dc.contributor.author | 신성우 | - |
dc.date.accessioned | 2011-12-27T04:19:04Z | - |
dc.date.available | 2011-12-27T04:19:04Z | - |
dc.date.issued | 2001 | - |
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=169468&flag=dissertation | - |
dc.identifier.uri | http://hdl.handle.net/10203/53364 | - |
dc.description | Thesis (Ph.D.) - Korea Advanced Institute of Science and Technology : Management Engineering, 2001.8, [ix, 160 p.] | - |
dc.description.abstract | The task of classification permeates all walks of life, from business and economics to science and engineering. In this context, nonlinear techniques from artificial intelligence have often proven more effective than the methods of classical statistics. In many managerial classification problems, the volume of accumulated data is small relative to applications in science and engineering, so a classifier's ability to handle small datasets and to explain its classification results is an important advantage. In such settings, lazy learning algorithms (LLAs) provide an effective approach, and an LLA is often effective even where human expertise is unavailable or nonexistent. This thesis proposes a series of effective feature weighting algorithms for LLAs based on elementary data mining methodologies. One approach uses the mutual information arising from an inductive decision tree as the weight for a feature. A second method relies on the judicious interpretation of the weight connections of a trained neural network as a measure of a feature's importance. These two basic approaches to feature weighting are compared with feature selection by a genetic algorithm (GA), and a combined approach is then suggested as a cost-effective solution. In relation to the LLA-based combination of multiple classifiers (CMC), we propose an efficient meta-classifier architecture based on the assumption that uncorrelated subsets of features generate independent classifiers that tend toward an oracle. To overcome the limitations of kNN-based boosting with a straightforward approach, we propose an improved feature-weighted boosting algorithm for the kNN classifier. Moreover, we propose a kNN-directed noise injection scheme that enhances the effect of the boosting procedure by expanding the original dataset in a fortuitous fashion. Finally, we discuss a decision flipping approach. The methodology takes advantage of the correlated e... | eng |
dc.language | eng | - |
dc.publisher | Korea Advanced Institute of Science and Technology (KAIST) | - |
dc.subject | Feature weighting | - |
dc.subject | Artificial intelligence | - |
dc.subject | Machine learning | - |
dc.subject | Data mining | - |
dc.subject | Classifier combination | - |
dc.subject | 복수 분류기 결합 | - |
dc.subject | 변수 가중치 분석 | - |
dc.subject | 인공지능 | - |
dc.subject | 기계학습 | - |
dc.subject | 데이타 마이닝 | - |
dc.title | Hybrid data mining model for managerial classification : feature weighting and classifier combination | - |
dc.title.alternative | 경영 분류를 위한 통합형 데이타 마이닝 모형 : 변수 가중치 분석 및 복수 분류기 결합 | - |
dc.type | Thesis (Ph.D.) | - |
dc.identifier.CNRN | 169468/325007 | - |
dc.description.department | Korea Advanced Institute of Science and Technology : Management Engineering | - |
dc.identifier.uid | 000985195 | - |
dc.contributor.localauthor | Han, In-Goo | - |
dc.contributor.localauthor | 한인구 | - |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
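The abstract describes weighting the features of a lazy (kNN) learner by how informative each feature is about the class. A minimal self-contained sketch of that idea in Python, assuming a toy dataset and using mutual information computed directly from counts (the thesis derives it from an inductive decision tree); all names and data here are illustrative, not the thesis's actual algorithm:

```python
import math
from collections import Counter

# Toy dataset (hypothetical, for illustration only): feature 0 tracks the
# class label, feature 1 is noise.
X = [(0, 1), (0, 0), (1, 1), (1, 0), (0, 1), (1, 0)]
y = [0, 0, 1, 1, 0, 1]

def mutual_information(values, labels):
    """Empirical I(V; Y) in nats, from joint and marginal counts."""
    n = len(values)
    pv, py = Counter(values), Counter(labels)
    joint = Counter(zip(values, labels))
    return sum(
        (c / n) * math.log((c / n) / ((pv[v] / n) * (py[l] / n)))
        for (v, l), c in joint.items()
    )

# One weight per feature: its mutual information with the class, so
# informative features dominate the distance and noisy ones barely count.
weights = [mutual_information([row[j] for row in X], y)
           for j in range(len(X[0]))]

def weighted_knn(query, k=3):
    """Majority vote among the k nearest training rows under a
    feature-weighted squared Euclidean distance."""
    ranked = sorted(
        (sum(w * (a - b) ** 2 for w, a, b in zip(weights, query, row)), label)
        for row, label in zip(X, y)
    )
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

print(weighted_knn((1, 1)))  # → 1: the informative feature dominates the vote
```

Because feature 0 perfectly predicts the class in this toy set, its weight equals H(Y) = log 2 while the noise feature's weight is near zero, so the weighted distance effectively ignores the noise dimension.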