DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Shim, Hyun-Chul | - |
dc.contributor.advisor | 심현철 | - |
dc.contributor.author | Lee, Seunghyun | - |
dc.date.accessioned | 2022-04-15T07:58:26Z | - |
dc.date.available | 2022-04-15T07:58:26Z | - |
dc.date.issued | 2021 | - |
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=948599&flag=dissertation | en_US |
dc.identifier.uri | http://hdl.handle.net/10203/295134 | - |
dc.description | Thesis (Master's) - Korea Advanced Institute of Science and Technology (KAIST): Interdisciplinary Program in Future Vehicle, 2021.2, [iv, 34 p.] | - |
dc.description.abstract | In this research, we propose a real-time lane prediction algorithm based on a sequence-to-sequence model. Camera-based deep learning has become an indispensable technology for situational awareness, serving as the eyes of an autonomous vehicle. However, many conditions on real roads limit what a camera can recognize. In particular, sections where lane markings are interrupted, such as crosswalks and intersections; lanes that become invisible because the paint on the road is worn; and lanes occluded by other vehicles or obstacles all increase risk and instability in autonomous driving. After acquiring a dataset by driving an actual autonomous vehicle, we propose a real-time algorithm that predicts upcoming lanes from previously detected ones. | - |
dc.language | eng | - |
dc.publisher | 한국과학기술원 | - |
dc.subject | Autonomous driving; Deep learning; Recurrent Neural Network (RNN); Lane detection; Lane prediction | - |
dc.subject | 자율주행; 깊은신경망; 반복신경망; 차선 인식; 차선 예측 | - |
dc.title | Real-time lane prediction algorithm using the sequence-to-sequence model | - |
dc.title.alternative | Sequence-to-Sequence 모델을 활용한 실시간 차선 예측 알고리즘 | - |
dc.type | Thesis (Master) | - |
dc.identifier.CNRN | 325007 | - |
dc.description.department | 한국과학기술원 : 미래자동차학제전공 | - |
dc.contributor.alternativeauthor | 이승현 | - |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
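The abstract describes an encoder-decoder (sequence-to-sequence) model that consumes a history of detected lanes and predicts the lanes ahead. As a rough illustration only, the sketch below implements a minimal untrained Elman-style encoder-decoder in NumPy; the input representation (each frame as a vector of lateral lane offsets at fixed distances), the class name `Seq2SeqLanePredictor`, and all dimensions are assumptions for the example, not details taken from the thesis.

```python
import numpy as np


def rnn_step(x, h, Wx, Wh, b):
    # One Elman-RNN step: mix the current input with the previous hidden state.
    return np.tanh(x @ Wx + h @ Wh + b)


class Seq2SeqLanePredictor:
    """Hypothetical encoder-decoder over lane-point sequences.

    Each time step is assumed to be a vector of lateral offsets of the
    detected lane at fixed longitudinal distances ahead of the vehicle.
    Weights are random here; a real model would be trained on driving data.
    """

    def __init__(self, n_points=4, hidden=16, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden)
        # Encoder parameters.
        self.Wx_e = rng.uniform(-s, s, (n_points, hidden))
        self.Wh_e = rng.uniform(-s, s, (hidden, hidden))
        self.b_e = np.zeros(hidden)
        # Decoder parameters plus an output projection back to lane offsets.
        self.Wx_d = rng.uniform(-s, s, (n_points, hidden))
        self.Wh_d = rng.uniform(-s, s, (hidden, hidden))
        self.b_d = np.zeros(hidden)
        self.Wo = rng.uniform(-s, s, (hidden, n_points))
        self.bo = np.zeros(n_points)
        self.hidden = hidden

    def predict(self, past, n_future):
        # Encode the observed lane sequence into a single hidden state.
        h = np.zeros(self.hidden)
        for x in past:
            h = rnn_step(x, h, self.Wx_e, self.Wh_e, self.b_e)
        # Decode autoregressively: each predicted frame feeds the next step,
        # which lets the model keep predicting while markings are occluded.
        y = past[-1]
        out = []
        for _ in range(n_future):
            h = rnn_step(y, h, self.Wx_d, self.Wh_d, self.b_d)
            y = h @ self.Wo + self.bo
            out.append(y)
        return np.stack(out)


# Usage: 10 past frames of 4 lane offsets -> 5 predicted future frames.
model = Seq2SeqLanePredictor()
past = np.random.default_rng(1).normal(size=(10, 4))
future = model.predict(past, n_future=5)
print(future.shape)  # (5, 4)
```

The autoregressive decoding loop is the property that matters for the use case in the abstract: once lane markings are cut off at a crosswalk or hidden behind a vehicle, the decoder can continue emitting predicted lane frames without fresh detections.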