DC Field | Value | Language |
---|---|---|
dc.contributor.author | Lee, Dongeun | ko |
dc.contributor.author | Choi, Jaesik | ko |
dc.date.accessioned | 2019-12-13T13:27:28Z | - |
dc.date.available | 2019-12-13T13:27:28Z | - |
dc.date.created | 2019-12-13 | - |
dc.date.issued | 2014-10-27 | - |
dc.identifier.citation | 2nd IEEE International Conference on Big Data, IEEE Big Data 2014, pp.323 - 328 | - |
dc.identifier.uri | http://hdl.handle.net/10203/269669 | - |
dc.description.abstract | Many large-scale sensor networks produce tremendous volumes of data, typically as massive spatio-temporal data streams. We present a Low Complexity Sensing framework that, coupled with novel compressive sensing techniques, significantly reduces computational and communication overheads without substantially compromising the accuracy of sensor readings. More specifically, our sensing framework randomly samples time-series data first in the temporal dimension and then in the spatial dimension. Under some mild conditions, our sensing framework retains the same theoretical bound on reconstruction error, yet is much simpler and easier to implement than existing compressive sensing frameworks. In experiments with real-world environmental data sets, we demonstrate that the proposed framework outperforms two existing compressive sensing frameworks designed for spatio-temporal data. | - |
dc.language | English | - |
dc.publisher | IEEE Big Data | - |
dc.title | Low Complexity Sensing for Big Spatio-Temporal Data | - |
dc.type | Conference | - |
dc.identifier.scopusid | 2-s2.0-84921794779 | - |
dc.type.rims | CONF | - |
dc.citation.beginningpage | 323 | - |
dc.citation.endingpage | 328 | - |
dc.citation.publicationname | 2nd IEEE International Conference on Big Data, IEEE Big Data 2014 | - |
dc.identifier.conferencecountry | US | - |
dc.identifier.conferencelocation | Washington, D.C. | - |
dc.identifier.doi | 10.1109/BigData.2014.7004248 | - |
dc.contributor.nonIdAuthor | Lee, Dongeun | - |
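
The abstract's two-stage sampling (random sampling in the temporal dimension first, then in the spatial dimension) can be sketched as follows. This is an illustrative assumption of the general idea, not the paper's actual algorithm; all function and variable names (`low_complexity_sample`, `temporal_rate`, `spatial_rate`) are made up for this sketch:

```python
import random

def low_complexity_sample(data, temporal_rate=0.5, spatial_rate=0.5, seed=0):
    """Hypothetical two-stage random sampling sketch.

    `data` maps a sensor id to its list of time-series readings.
    Stage 1 keeps a random subset of time indices per sensor (temporal);
    stage 2 keeps a random subset of the sensors themselves (spatial).
    """
    rng = random.Random(seed)
    # Stage 1: temporal sampling -- retain each (time, reading) pair
    # independently with probability `temporal_rate`.
    temporally_sampled = {
        sensor: [(t, x) for t, x in enumerate(series) if rng.random() < temporal_rate]
        for sensor, series in data.items()
    }
    # Stage 2: spatial sampling -- retain a random subset of sensors.
    k = max(1, int(len(temporally_sampled) * spatial_rate))
    kept = rng.sample(sorted(temporally_sampled), k)
    return {sensor: temporally_sampled[sensor] for sensor in kept}

# Toy spatio-temporal stream: 4 sensors, 8 time steps each.
readings = {f"s{i}": [float(i * 10 + t) for t in range(8)] for i in range(4)}
sampled = low_complexity_sample(readings)
```

Only the retained `(time, reading)` pairs would then be communicated and later reconstructed via compressive-sensing recovery, which is what keeps both computation and communication cheap.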