In psychology, it is known that facial dynamics aid the perception of identity. This paper proposes a novel deep network framework that captures identity information from facial dynamics and their relations. In the proposed method, the facial dynamics that arise from a smile expression are analyzed and exploited for facial authentication. Detailed changes in local facial regions, such as wrinkles and dimples, are encoded in a facial dynamic feature representation. The latent relationships among the facial dynamic features are learned by a facial dynamic relational network, which encodes relation features from the facial dynamics and derives a relational importance from those relation features. As a result, the proposed method attends more strongly to the relation features that are important for facial authentication. Comprehensive comparative experiments verify the effectiveness of the proposed method for facial authentication.
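The core idea of weighting pairwise relation features by a learned relational importance can be sketched as follows. This is an illustrative toy example, not the paper's implementation: the projection matrices `W_rel` and `w_imp` stand in for trained network layers, and the local-region dynamic features are random placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def relational_aggregate(local_feats):
    """Encode pairwise relation features and combine them with
    softmax importance weights (random projections stand in for
    trained layers in this sketch)."""
    d = local_feats.shape[1]
    W_rel = rng.standard_normal((2 * d, d))   # relation encoder (placeholder)
    w_imp = rng.standard_normal(d)            # importance scorer (placeholder)

    relations, scores = [], []
    n = len(local_feats)
    for i in range(n):
        for j in range(i + 1, n):
            # relation feature of one pair of local dynamic features
            pair = np.concatenate([local_feats[i], local_feats[j]])
            r = np.tanh(pair @ W_rel)
            relations.append(r)
            scores.append(r @ w_imp)          # scalar importance score

    relations = np.stack(relations)
    alpha = np.exp(scores - np.max(scores))
    alpha = alpha / alpha.sum()               # softmax importance weights
    # importance-weighted sum of relation features
    return (alpha[:, None] * relations).sum(axis=0)

# four local-region dynamic features of dimension 8 (placeholder data)
feats = rng.standard_normal((4, 8))
agg = relational_aggregate(feats)
print(agg.shape)  # (8,)
```

The weighted sum lets pairs of regions whose joint dynamics are more discriminative (e.g. a dimple co-occurring with eye wrinkles) contribute more to the final authentication feature.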