As transistors scale down, the number of design variables that significantly alter device performance increases exponentially, and more complex models are required to capture their effects. The complexity of TCAD modeling therefore needs to be handled with a machine learning (ML)-based approach. In this work, we have constructed a framework for ML-based device optimization with TCAD. TCAD calculates the device performance of the design proposed by ML and returns the result to ML; ML then recommends the next candidate for calculation in the direction of the optimum. By repeating this automatic feedback process, the best design can be found quickly. ATLAS by SILVACO and the open-source Bayesian-optimization library COMBO were used for TCAD simulation and ML, respectively. Bayesian optimization is an ML-based design algorithm that finds the global optimum with few calculations. We tested the efficiency of our framework on a planar nMOSFET with a channel length of 20 nm, where each design parameter can enhance or degrade the device performance, resulting in 5000 design options to search. The device with the maximum ON-state current (ION) and the one with the minimum subthreshold swing (SS) were found with just 25 and 140 calculations, i.e., 0.5 % and 2.8 % of the total possible cases, respectively. To maximize ION while minimizing SS, the trade-off between ION and SS was considered and a new objective function was defined; the device with optimal ION and SS was then found with 80 calculations (1.6 % of the total). In conclusion, ML-based TCAD optimization can greatly reduce the effort and time required to design next-generation node devices.
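The automatic feedback process described above (the simulator evaluates a design, the Bayesian optimizer proposes the next one) can be sketched roughly as follows. This is a minimal, generic Gaussian-process/UCB sketch over a discrete candidate grid, not COMBO's actual API; `tcad_stub` is a stand-in for the real ATLAS simulation, and all function names and parameters here are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=0.1):
    # Squared-exponential kernel between 1-D candidate arrays.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-6):
    # Standard GP regression posterior (zero prior mean, unit prior variance).
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(1.0 - np.sum(v * v, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def tcad_stub(x):
    # Placeholder for a TCAD run; a real framework would launch ATLAS here
    # and return the figure of merit (e.g. ION) for design x.
    return -(x - 0.3) ** 2

def bayes_opt(candidates, objective, n_init=3, n_iter=20, kappa=2.0, seed=0):
    """GP-UCB loop over a fixed, discrete set of design candidates."""
    rng = np.random.default_rng(seed)
    evaluated = list(rng.choice(len(candidates), size=n_init, replace=False))
    ys = [objective(candidates[i]) for i in evaluated]
    for _ in range(n_iter):
        mu, sigma = gp_posterior(candidates[evaluated], np.array(ys), candidates)
        ucb = mu + kappa * sigma          # upper-confidence-bound acquisition
        ucb[evaluated] = -np.inf          # never re-simulate the same design
        nxt = int(np.argmax(ucb))
        evaluated.append(nxt)
        ys.append(objective(candidates[nxt]))  # feed the result back to the model
    best = int(np.argmax(ys))
    return candidates[evaluated[best]], ys[best]
```

Replacing `tcad_stub` with a wrapper that writes an ATLAS input deck, runs the simulation, and parses ION or SS from the output turns this sketch into the TCAD/ML feedback loop; the discrete candidate grid plays the role of the 5000 enumerable design options.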