Federated Learning (FL) has emerged as a collaborative machine learning paradigm that trains a global server model across multiple local clients while keeping each client's training data decentralized. Despite its popularity, FL still faces challenges in deployment to real-world decision-making systems. Data heterogeneity is one of the major challenges: since each client's local objective drifts away from the global objective (i.e., client drift), the global model is prone to slow and unstable convergence, or may even become biased. Many studies have therefore attempted to overcome data heterogeneity. However, there has been little discussion of data labeling in the federated setting, despite the importance of label information for supervised image classification. In FL, each client holds local data that must be labeled by the client itself. Since data labeling is labor- and knowledge-intensive, many clients may mislabel their samples, introducing noisy labels. The server cannot directly access clients' local data, which makes it harder to identify or correct noisy-labeled samples. In this paper, we refer to federated learning with noisy-labeled data as Noisy Federated Learning (NFL), and report the results of applying several conventional noisy-label learning methods to NFL. Based on the observation that Early Learning Regularization (ELR) works well in NFL, we propose its generalized version for FL, called the Federated Learning Regularization (FLR) loss.
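The abstract does not spell out the ELR mechanism, but for orientation, a minimal sketch of the standard ELR regularizer is given below. It assumes the usual formulation from the early-learning-regularization literature: a temporally ensembled soft target per sample, updated by exponential moving average, and a penalty of the form λ·log(1 − ⟨p, t⟩) added to the cross-entropy loss. The function names and hyperparameter values here are illustrative, not from this paper.

```python
import numpy as np

def elr_regularizer(probs, targets, lam=3.0):
    """Sketch of the ELR penalty: lam * mean(log(1 - <p_i, t_i>)).

    probs:   (N, C) softmax outputs of the current model
    targets: (N, C) temporally ensembled soft targets (EMA of past probs)
    """
    inner = np.sum(probs * targets, axis=1)           # <p_i, t_i> per sample
    return lam * np.mean(np.log(1.0 - inner + 1e-8))  # added to the CE loss

def update_targets(targets, probs, beta=0.7):
    """EMA update of the soft targets: t <- beta * t + (1 - beta) * p."""
    return beta * targets + (1.0 - beta) * probs
```

Minimizing this penalty pulls the model's predictions toward the ensembled targets, which reflect the (mostly clean) early-learning phase; this is the property the paper observes to transfer well to the noisy federated setting.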