Adaptive AR interfaces are a key trend in future mixed reality. Although context awareness plays a vital role in realizing adaptive wearable AR, context classification for a wearable AR user's daily life has not yet been fully explored. We therefore propose the Holistic Quantified-Self (HQS) framework for Context-Aware Wearable AR (CAWAR). HQS comprises four significant aspects of the self: the physical aspect (active status, posture), cognitive-emotional status (stress, emotional arousal, emotional valence), social interactions, and digital consumption behavior. To construct HQS, our system gathered heterogeneous raw data by tracking 3-axis linear acceleration from the accelerometer, 3-axis angular velocity from the gyroscope, 3-axis magnetic field from the magnetometer, electrodermal activity, skin temperature, heart rate, blood volume pulse, the number of detected faces, the audio signal, and device log data. We then trained, validated, and tested a Random Forest classifier on our dataset to classify the user's context into six categories in an office-worker scenario: 1) working alone, 2) resting alone, 3) walking alone, 4) working with others, 5) resting with others, and 6) walking with others. The binary classification results show that the trained models achieve accuracies of 100%, 100%, and 99% for the social-interaction, mobility, and type-of-work classifiers, respectively. We could also estimate the holistic status of the user from the raw data. Several application scenarios are discussed.
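The pipeline described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the feature layout, the synthetic data, and the `context_label` combiner are assumptions. It shows how three binary Random Forest classifiers (social interaction, mobility, type of work) could be trained on a multimodal feature vector and composed into the six office-worker contexts.

```python
# Illustrative sketch (assumed structure, not the paper's code): three binary
# Random Forest classifiers over synthetic multimodal features, composed into
# the six contexts {working, resting, walking} x {alone, with others}.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 600
# Hypothetical 15-dim feature vector: 9 IMU channels (accel/gyro/magnet),
# EDA, skin temperature, heart rate, BVP, face count, audio level.
X = rng.normal(size=(n, 15))

# Synthetic binary labels standing in for the three ground-truth annotations.
y_social = (X[:, 13] > 0).astype(int)   # with others vs. alone
y_mobility = (X[:, 0] > 0).astype(int)  # walking vs. stationary
y_work = (X[:, 9] > 0).astype(int)      # working vs. resting

def fit_binary(X, y):
    """Train/test split, fit a Random Forest, return model and test accuracy."""
    Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(Xtr, ytr)
    return clf, clf.score(Xte, yte)

clf_social, acc_social = fit_binary(X, y_social)
clf_mobility, acc_mobility = fit_binary(X, y_mobility)
clf_work, acc_work = fit_binary(X, y_work)

def context_label(social, mobile, working):
    """Combine the three binary decisions into one of the six contexts."""
    activity = "walking" if mobile else ("working" if working else "resting")
    company = "with others" if social else "alone"
    return f"{activity} {company}"

print(acc_social, acc_mobility, acc_work)
print(context_label(social=1, mobile=0, working=1))  # "working with others"
```

In practice the three binary outputs would be produced per time window from the real sensor streams; composing independent binary classifiers rather than one six-way classifier mirrors the per-classifier accuracies reported in the abstract.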