This paper proposes a sensor-based navigation method that uses fuzzy logic and reinforcement learning to navigate a mobile robot in uncertain environments. The proposed navigator consists of an obstacle-avoidance behavior and a goal-seeking behavior. The two behaviors are designed independently at the design stage and then combined by a behavior selector at run time. The behavior selector, based on a bistable switching function, chooses one behavior at each action step so that the mobile robot can reach the goal position without colliding with obstacles. Fuzzy logic maps input fuzzy sets, which represent the robot's state space as determined by sensor readings, to output fuzzy sets representing the robot's action space. The fuzzy rule bases are built through reinforcement learning, which requires only simple evaluation data rather than thousands of input-output training pairs. Because the fuzzy rules for each behavior are learned through reinforcement learning, rule bases can be constructed easily for more complex environments. The robot's present state is determined from ultrasonic sensors mounted on the robot. The effectiveness of the proposed method is verified by a series of simulations.
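As one possible reading of the bistable switching described above, the behavior selector can be sketched as hysteresis switching between the two behaviors driven by the nearest ultrasonic range reading. This is a minimal illustrative sketch, not the paper's actual design: the class name, the thresholds `d_enter`/`d_exit`, and the use of the minimum range as the switching input are all assumptions for illustration.

```python
# Illustrative sketch (not the authors' implementation): a bistable
# behavior selector with hysteresis between obstacle avoidance and
# goal seeking. Hysteresis prevents the selector from chattering
# between behaviors when the range hovers near a single threshold.

AVOID, SEEK = "avoid", "seek"

class BehaviorSelector:
    def __init__(self, d_enter=0.4, d_exit=0.8):
        # Hypothetical tuning parameters (meters):
        self.d_enter = d_enter  # switch to AVOID when range falls below this
        self.d_exit = d_exit    # switch back to SEEK when range rises above this
        self.state = SEEK       # start by seeking the goal

    def select(self, min_range):
        """Choose a behavior for this action step from the
        minimum ultrasonic range reading (assumed input)."""
        if self.state == SEEK and min_range < self.d_enter:
            self.state = AVOID
        elif self.state == AVOID and min_range > self.d_exit:
            self.state = SEEK
        return self.state

# Usage: a robot approaching and then clearing an obstacle.
sel = BehaviorSelector()
readings = [1.2, 0.9, 0.5, 0.3, 0.5, 0.9, 1.1]
actions = [sel.select(r) for r in readings]
# The selector stays in AVOID until the range clears d_exit,
# even though 0.5 m is above the entry threshold d_enter.
```

The bistable (two-threshold) form is what makes the switching stable: a single threshold would flip the behavior back and forth on noisy sensor readings near the boundary.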