Let's build GPT: from scratch, in code, spelled out.

YouTube video

Source : Andrej Karpathy | Date : 2023-01-17 17:33:27 | Durée : 01:56:20

GPT algorithm


Comments

@johnmadsen37

Is this supposed to cause seizures?

@alexezazi4568

Thank you for a superb tutorial. The major revelation for me was that you actually need a lot more than just attention. I had neglected to add that last layer normalization (ln_f) in the forward method of class BigramLanguageModel and my loss was stuck at around 2.5. Once I caught and corrected that error, I duplicated Karpathy's result. After having read a number of books and papers and taken a few of the Stanford graduate NLP and deep generative models courses, this really brought it all together.
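The bug this comment describes is easy to reproduce: the final LayerNorm (ln_f) sits between the last Transformer block and the language-model head, and omitting it quietly degrades the loss. A minimal sketch of the forward pass is below; the Transformer blocks are stubbed out with nn.Identity and the sizes (vocab_size=65 for the Shakespeare character set, n_embd, block_size) are illustrative defaults, not the video's exact hyperparameters:

```python
import torch
import torch.nn as nn

class BigramLanguageModel(nn.Module):
    # Illustrative skeleton: real blocks are attention + feed-forward, stubbed here.
    def __init__(self, vocab_size=65, n_embd=32, block_size=8, n_layer=4):
        super().__init__()
        self.token_embedding_table = nn.Embedding(vocab_size, n_embd)
        self.position_embedding_table = nn.Embedding(block_size, n_embd)
        self.blocks = nn.Sequential(*[nn.Identity() for _ in range(n_layer)])
        self.ln_f = nn.LayerNorm(n_embd)  # the final layer norm the comment refers to
        self.lm_head = nn.Linear(n_embd, vocab_size)

    def forward(self, idx):
        B, T = idx.shape
        tok_emb = self.token_embedding_table(idx)                 # (B, T, n_embd)
        pos_emb = self.position_embedding_table(torch.arange(T))  # (T, n_embd)
        x = tok_emb + pos_emb      # broadcast positions over the batch
        x = self.blocks(x)         # Transformer blocks
        x = self.ln_f(x)           # easy to forget; skipping it hurts the loss
        logits = self.lm_head(x)   # (B, T, vocab_size)
        return logits
```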

@marcosmarcal7200

Please add auto-dubbing to your videos, I really want to learn. Thanks a lot

@VayVerlin69

"1000usd" thanks for the video

@raccoon_05

Thank you so much for this video. You explain some very complex topics in simple ways. I understood so much more from this than from many other YouTube videos 👍👍

@mukundkushwaha2124

💗a great teacher out there!

@jixiangpan3398

Thank you for making this video!

@guruphot

I guess this is the same one from freeCodeCamp that came out 5 years ago

@jewalky

Around 12:50 it is very weird that space is the first character but the code for it is "1", not "0" 🤔
Pretty sure enumerate shouldn't be doing that…
UPD: never mind, the first character is actually '\n' or something
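The commenter's update is right: in the video's character-level tokenizer the vocabulary is built by sorting the unique characters, and newline ('\n', code point 10) sorts before space (' ', code point 32), so newline gets index 0 and space gets index 1. A small self-contained sketch (the sample text is hypothetical, standing in for the Tiny Shakespeare file):

```python
# Stand-in for the Tiny Shakespeare text; any text containing '\n' and ' ' behaves the same.
text = "First Citizen:\nBefore we proceed any further, hear me speak."

chars = sorted(set(text))                       # unique characters, in code-point order
stoi = {ch: i for i, ch in enumerate(chars)}    # char -> integer id, starting at 0

print(stoi['\n'], stoi[' '])  # -> 0 1: newline sorts before space
```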

@karloamir2073

Is there a new successor to this guide ?

@reedschrichte800

A deep and sincere thank you!

@yuyang5575

Thank you very much. I have read the Transformer paper multiple times in the past two years, but I still felt that I had not completely grasped the full picture of it until watching this video and going through the code you provided. It also clarified a few important concepts in deep learning, such as normalization, dropout, and residual connections. I guess I will watch it at least one more time.

@ruskaruma08

Imagine being between your job at Tesla and your job at OpenAI, being a tad bored and, just for fun, dropping on YouTube the best introduction to deep learning and NLP from scratch so far, for free. Amazing people do amazing things even as a hobby.
