Niki Parmar

E457853

Niki Parmar is a computer scientist and co-author of the seminal "Attention Is All You Need" paper that introduced the Transformer architecture in deep learning.
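
For context, the core operation that paper introduces is scaled dot-product attention, in which query, key, and value matrices Q, K, V are combined in a single expression (the formula is from the paper itself; d_k is the key dimension):

  \mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{QK^{\top}}{\sqrt{d_k}}\right)V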


Observed surface forms (1)

Surface form Occurrences
"Attention Is All You Need" 0

Statements (39)

Predicate Object
instanceOf artificial intelligence researcher
computer scientist
researcher
coAuthorOf "Attention Is All You Need" NERFINISHED
coAuthorWith Aidan N. Gomez NERFINISHED
Ashish Vaswani NERFINISHED
Illia Polosukhin NERFINISHED
Llion Jones NERFINISHED
Niki Uszkoreit NERFINISHED
Noam Shazeer NERFINISHED
Łukasz Kaiser NERFINISHED
contributedTo Transformer architecture
self-attention mechanisms in neural networks
educatedAt Indian Institute of Technology
University of Toronto
fieldOfWork deep learning
machine learning
natural language processing
neural networks
hasAuthor Niki Parmar
hasCitationCountOver 100000 for "Attention Is All You Need" (approximate, order of magnitude)
hasCitizenship India
hasEmployer Google
Google Brain
hasGender female
hasRole author
scientist
influenced advances in NLP architectures after 2017
development of modern large language models
introduced Transformer architecture
knownFor co-authoring the paper "Attention Is All You Need"
contributions to the Transformer architecture
notableWork "Attention Is All You Need" NERFINISHED
publishedIn 2017
researchInterest attention mechanisms
large-scale neural networks
sequence modeling
worksIn artificial intelligence
computer science
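
If this knowledge base exposes a SPARQL endpoint, the statements above could be retrieved with a query along the following lines. This is a minimal sketch: the kb: prefix, the endpoint, and the IRI scheme are assumptions for illustration, reusing the page's entity ID E457853 as the IRI local name.

  # Minimal sketch; the kb: prefix and IRI scheme are assumed, not given by this page.
  PREFIX kb: <http://example.org/kb/>

  SELECT ?predicate ?object
  WHERE {
    kb:E457853 ?predicate ?object .   # E457853 = Niki Parmar
  }
  ORDER BY ?predicate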

Referenced by (2)

Full triples; a surface form is annotated when it differs from the entity's canonical label.

Transformer introducedBy Niki Parmar
Lukasz Kaiser collaboratedWith Niki Parmar (subject surface form: Łukasz Kaiser)
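
The inbound direction works the same way; a sketch under the same assumed kb: prefix would return the two triples listed above:

  # Triples where this entity appears as the object ("Referenced by").
  PREFIX kb: <http://example.org/kb/>

  SELECT ?subject ?predicate
  WHERE {
    ?subject ?predicate kb:E457853 .
  }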