Let's build GPT: from scratch, in code, spelled out.

Source: Andrej Karpathy | Date: 2023-01-17 17:33:27 | Duration: 01:56:20

GPT algorithm


Comments

@RyuDev_
23 February 2025

14:47

@johnmadsen37
23 February 2025

Is this supposed to cause seizures?

@dc33333
23 February 2025

Jesus he is good.

@alexezazi4568
23 February 2025

Thank you for a superb tutorial. The major revelation for me was that you actually need a lot more than just attention. I had neglected to add that last layer normalization (ln_f) in the forward method of class BigramLanguageModel, and my loss was stuck at around 2.5. Once I caught and corrected that error, I duplicated Karpathy's result. After having read a number of books and papers and taken a few of the Stanford graduate NLP and deep generative models courses, this really brought it all together.
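The fix this commenter describes, adding a final layer norm before the language-model head, is easy to miss. As a rough, dependency-free sketch of what that layer norm computes (this is not Karpathy's exact code; `gamma`, `beta`, and `eps` are illustrative names with assumed defaults):

```python
# A minimal sketch of layer normalization, the operation ln_f applies to each
# token's feature vector before the final projection: normalize to zero mean
# and unit variance across the features, then scale by gamma and shift by beta.
def layer_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize a single feature vector x (a list of floats)."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    return [gamma * (v - mean) / (var + eps) ** 0.5 + beta for v in x]

out = layer_norm([1.0, 2.0, 3.0, 4.0])
# The normalized vector has (near-)zero mean and (near-)unit variance.
```

Without this normalization before the head, the logits come straight out of the last residual block at an uncontrolled scale, which is consistent with the training stall the commenter saw.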

@sayanchaudhuri
23 February 2025

blew my mind!!

@amirdalili328
23 February 2025

Thank you man!

@marcosmarcal7200
23 February 2025

Please add auto-dubbing to your videos; I really want to learn. Thanks a lot.

@UnearthedI
23 February 2025

gyatt

@VayVerlin69
23 February 2025

"1000usd" thanks for the video

@raccoon_05
23 February 2025

Thank you so much for this video. You explain some very complex topics in simple ways. I understood so much more from this than many other yt videos 👍👍

@mukundkushwaha2124
23 February 2025

💗a great teacher out there!

@jixiangpan3398
23 February 2025

Thank you for making this video!

@guruphot
23 February 2025

I guess this is the same one from freeCodeCamp that came out 5 years ago.

@jewalky
23 February 2025

Around 12:50 it is very weird that space is the first character but the code for it is "1" not "0" 🤔
Pretty sure enumerate shouldn't be doing that…
UPD: nvm the first character is actually n or something
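This commenter's second guess is right, and `enumerate` is behaving normally. In the video's vocabulary-building pattern, `sorted` orders characters by code point, and `'\n'` (code 10) sorts before `' '` (code 32), so newline gets index 0 and space gets index 1. A tiny sketch (using a toy string in place of the actual training text):

```python
# sorted() orders characters by code point, so '\n' (10) comes before ' ' (32).
text = "hello\n world"  # toy stand-in for the full training text
chars = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(chars)}
# stoi['\n'] is 0 and stoi[' '] is 1: space is the *second* character,
# which is why its code is 1, not 0.
```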

@karloamir2073
23 February 2025

Is there a new successor to this guide ?

@reedschrichte800
23 February 2025

A deep and sincere thank you!

@yuyang5575
23 February 2025

Thank you very much. I have read the Transformer paper multiple times over the past two years, but I still felt I had not completely grasped the full picture of it until watching this video and going through the code you provided. It also clarified a few important concepts in deep learning, such as normalization, dropout, and residual connections. I guess I will watch it at least one more time.

@bulatvaliakhmetov
23 February 2025

cool

@ruskaruma08
23 February 2025

Imagine being between your job at Tesla and your job at OpenAI, being a tad bored and, just for fun, dropping on YouTube the best free introduction to deep learning and NLP from scratch so far. Amazing people do amazing things, even as a hobby.

@ruskaruma08
23 February 2025

oooooo
