DC Field | Value | Language |
---|---|---|
dc.contributor.author | Lee, JH | ko |
dc.contributor.author | Oh, SH | ko |
dc.contributor.author | Lee, Soo-Young | ko |
dc.date.accessioned | 2009-07-23T02:46:36Z | - |
dc.date.available | 2009-07-23T02:46:36Z | - |
dc.date.created | 2012-02-06 | - |
dc.date.issued | 2004 | - |
dc.identifier.citation | NEURAL INFORMATION PROCESSING BOOK SERIES: LECTURE NOTES IN COMPUTER SCIENCE, v.3316, pp.1070 - 1075 | - |
dc.identifier.issn | 0302-9743 | - |
dc.identifier.uri | http://hdl.handle.net/10203/10217 | - |
dc.description.abstract | In this paper, an adaptive blind dereverberation method based on a speech generative model is presented. Our ICA-based speech generative model decomposes speech into independent sources. Experimental results show that the proposed blind dereverberation method performs successfully even for non-minimum-phase channels. | - |
dc.description.sponsorship | This research was supported as a Brain Neuroinformatics Research Program by the Korean Ministry of Science and Technology. | en |
dc.language | English | - |
dc.language.iso | en_US | en |
dc.publisher | SPRINGER-VERLAG BERLIN | - |
dc.subject | DECONVOLUTION | - |
dc.title | Blind dereverberation of single-channel speech signals using an ICA-based generative model | - |
dc.type | Article | - |
dc.identifier.wosid | 000225878300165 | - |
dc.identifier.scopusid | 2-s2.0-35048839840 | - |
dc.type.rims | ART | - |
dc.citation.volume | 3316 | - |
dc.citation.beginningpage | 1070 | - |
dc.citation.endingpage | 1075 | - |
dc.citation.publicationname | NEURAL INFORMATION PROCESSING BOOK SERIES: LECTURE NOTES IN COMPUTER SCIENCE | - |
dc.embargo.liftdate | 9999-12-31 | - |
dc.embargo.terms | 9999-12-31 | - |
dc.contributor.localauthor | Lee, Soo-Young | - |
dc.contributor.nonIdAuthor | Lee, JH | - |
dc.contributor.nonIdAuthor | Oh, SH | - |
dc.type.journalArticle | Article; Proceedings Paper | - |
dc.subject.keywordPlus | DECONVOLUTION | - |