How to complete knowledge is one of the most important issues for knowledge bases, because of their large size and sparsity. To complete a knowledge base, we need a model that can predict missing relationships between entities. TransE has been a promising method for completing knowledge bases using the concept of translation, and improved approaches have been proposed based on TransE. However, these models share a common issue: they do not actually represent translation, which lowers their performance. Here we propose a new embedding method, TTE, which makes better use of the translation concept. TTE uses a new objective function that can learn translation relationships between entities and relations. TTE outperforms previous translation-based approaches on a link prediction task on two knowledge bases without increasing the number of parameters. Another characteristic of knowledge bases is that they do not contain false samples. Traditional approaches to negative sampling regard randomly sampled triples as false. However, a randomly sampled triple can be true knowledge that simply does not appear in the dataset, and such triples lower performance. In this thesis, we also propose a new way to perform negative sampling using pretrained word embeddings.
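The translation concept that TransE builds on can be illustrated with a minimal sketch: a true triple (h, r, t) should satisfy h + r ≈ t, so the norm of h + r − t serves as a dissimilarity score. The embeddings below are random toy vectors for illustration only, not the actual model or data used in this thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8  # toy embedding dimension, chosen arbitrarily

head = rng.normal(size=dim)
relation = rng.normal(size=dim)
# A tail that nearly satisfies the translation h + r = t.
true_tail = head + relation + rng.normal(scale=0.01, size=dim)
# A randomly drawn entity, standing in for a corrupted (negative) tail.
false_tail = rng.normal(size=dim)

def transe_score(h, r, t):
    """L2 dissimilarity used by TransE: lower means more plausible."""
    return np.linalg.norm(h + r - t)

# The true triple should score lower (better) than the corrupted one.
print(transe_score(head, relation, true_tail))
print(transe_score(head, relation, false_tail))
```

Training then pushes scores of observed triples below those of corrupted ones; the corruption step is exactly the random negative sampling whose weakness (accidentally sampling true but unrecorded triples) motivates the second contribution of this thesis.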