In this video, we start a new series exploring the first 5 items on the reading list that Ilya Sutskever, former OpenAI chief scientist, gave to John Carmack. Ilya reportedly added: "If you really learn all of these, you'll know 90% of what matters today."

*References*
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
Transformer Self-Attention Mechanism Explained:
Long Short-Term Memory (LSTM) Equations Explained:

*Reading List*
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
The Annotated Transformer:
The First Law of Complexodynamics:
The Unreasonable Effectiveness of Recurrent Neural Networks: