Transformers meet Stochastic Block Models: Attention with Data-Adaptive Sparsity and Cost

DC Field | Value | Language
dc.contributor.author | Cho, Sungjun | ko
dc.contributor.author | Min, Seonwoo | ko
dc.contributor.author | Kim, Jinwoo | ko
dc.contributor.author | Lee, Moontae | ko
dc.contributor.author | Lee, Honglak | ko
dc.contributor.author | Hong, Seunghoon | ko
dc.date.accessioned | 2022-11-23T02:01:59Z | -
dc.date.available | 2022-11-23T02:01:59Z | -
dc.date.created | 2022-11-02 | -
dc.date.issued | 2022-11-29 | -
dc.identifier.citation | 36th Conference on Neural Information Processing Systems, NeurIPS 2022 | -
dc.identifier.uri | http://hdl.handle.net/10203/300571 | -
dc.language | English | -
dc.publisher | Neural Information Processing Systems Foundation | -
dc.title | Transformers meet Stochastic Block Models: Attention with Data-Adaptive Sparsity and Cost | -
dc.type | Conference | -
dc.type.rims | CONF | -
dc.citation.publicationname | 36th Conference on Neural Information Processing Systems, NeurIPS 2022 | -
dc.identifier.conferencecountry | US | -
dc.identifier.conferencelocation | The New Orleans Convention Center | -
dc.contributor.localauthor | Hong, Seunghoon | -
dc.contributor.nonIdAuthor | Cho, Sungjun | -
dc.contributor.nonIdAuthor | Min, Seonwoo | -
dc.contributor.nonIdAuthor | Kim, Jinwoo | -
dc.contributor.nonIdAuthor | Lee, Moontae | -
dc.contributor.nonIdAuthor | Lee, Honglak | -
Appears in Collection
CS-Conference Papers (Conference Papers)
Files in This Item
There are no files associated with this item.
