I envision that future mobile services will bring promising changes to every facet of our daily lives, including child education, elderly support, healthcare, work, sports, and travel. In particular, life-immersive sensing applications are emerging on top of mobile and sensor devices, accelerating this new wave. Such applications are clearly distinguished from conventional mobile applications in that they provide proactive services tailored to the situational contexts of mobile users; in contrast, conventional applications reactively provide rather passive services upon user initiation. Imagine a mobile user out on a family excursion. After visiting the zoo, he roams around, having difficulty finding a good restaurant for dinner. A future life-immersive ad service would capture his trouble at that moment and proactively deliver an appealing coupon for a nearby franchise restaurant that his son would like. The core of providing such proactive services lies in the automated understanding of diverse user contexts, e.g., location, activity, companions, emotion, social interaction, and surroundings.
In this thesis, I propose a new cooperative mobile context monitoring platform to enable highly enriched and always-available context awareness for diverse life-immersive sensing applications. Despite their practical value, the spread of life-immersive sensing applications has been extremely slow. This results from unprecedented challenges in capturing rich user contexts in real-life situations, including the design of precise context inference algorithms, the burden of supporting heterogeneous devices, e.g., different languages and operating systems, and resource-use optimization for resource-limited mobile and sensing devices. A core system challenge in building a context monitoring platform is to continuously execute the highly complex, multi-step operations required to infer user contexts, distributed over resource-scarce smartphones and...