Indoor navigation is a representative application of indoor positioning systems, which can use a variety of equipment, including smartphones equipped with various sensors. Many indoor navigation systems rely on Wi-Fi signals and further exploit a variety of inertial and auxiliary sensors, such as a 3D accelerometer, digital compass, gyroscope, and barometer, to improve the accuracy of user location tracking. These sensors, however, are vulnerable to changes in the surrounding environment and sensitive to user behavior, and little research has been conducted on sensor fusion under such conditions. In this paper, we propose a dynamic sensor fusion framework (DSFF) that provides accurate user tracking results by dynamically calibrating inertial sensor readings during the sensor fusion process. The proposed method continually learns the errors and biases of each sensor caused by changes in user behavior patterns and the surrounding environment, and the learned patterns are then dynamically applied to the user tracking process to yield accurate results. Experiments conducted in both a single-story and a multi-story building confirm that DSFF provides accurate tracking results. The scalability of DSFF will enable it to provide even more accurate tracking with various sensors, both existing and under development.
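The idea of continually learning each sensor's bias and applying the learned correction during tracking can be sketched as follows. This is a minimal illustration only, not the paper's actual algorithm: the class name, the stationary-detection flag, and the exponential-moving-average bias update are all assumptions introduced for the example.

```python
class DynamicBiasCalibrator:
    """Illustrative sketch (not DSFF itself): continually re-estimate a
    gyroscope bias from readings taken while the user appears stationary,
    then subtract it from live readings before they enter sensor fusion."""

    def __init__(self, alpha=0.5):
        self.alpha = alpha  # learning rate for the running bias estimate
        self.bias = 0.0

    def update(self, reading, stationary):
        # While stationary the true angular rate is ~0, so the raw reading
        # is itself an observation of the bias; blend it in gradually.
        if stationary:
            self.bias = (1 - self.alpha) * self.bias + self.alpha * reading
        return reading - self.bias  # calibrated reading fed to fusion


cal = DynamicBiasCalibrator(alpha=0.5)
# Stationary phase: readings reflect pure drift (~0.2 rad/s).
for r in (0.2, 0.2, 0.2, 0.2):
    cal.update(r, stationary=True)
# Moving phase: the learned bias is subtracted before fusion.
corrected = cal.update(1.2, stationary=False)
```

In a real system, the same pattern would run once per sensor, with the stationary test replaced by whatever activity-recognition signal the tracking pipeline provides.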