
3blue1brown did a video on transformers. Even if you're not too familiar with the math, 3b1b does enough concept explaining to make it worth your while.
100 sats \ 3 replies \ @gmd 8 Apr
is 3blue1brown @south_korea_ln??
:) He shared the video before and it got no love.
100 sats \ 1 reply \ @gmd 8 Apr
ahhh thought for a second we had a youtube celebrity in our midst :P
I wish ☺️
109 sats \ 2 replies \ @cascdr 8 Apr
this guy's videos are incredibly helpful
100 sats \ 1 reply \ @gmd 8 Apr
OK I've been listening to this in the background while browsing SN and realized I haven't absorbed a thing... do you guys just sit there and watch videos like this with your full attention or am I just dumb and bad at multitasking? :)
299 sats \ 0 replies \ @cascdr 8 Apr
Full attention is absolutely required. I recommend 2x speed and watching it 1.5 to 2 times.
It also helps if you understand control systems and/or matrix algebra; I was blessed with a BS curriculum in just that. If you go to the link I shared above, there's a lot of foundational material covered by the YouTube channel that makes understanding much easier.
I also made a tool that summarizes YouTube content and accepts bitcoin payments, which you may find useful. This is its output for this particular video:
TLDR: The video discusses the workings of transformers, specifically the Generative Pretrained Transformer (GPT), which uses a neural network model to generate text based on input data.
  • GPT stands for Generative Pretrained Transformer, a neural network model.
  • Transformers process data through tokens associated with vectors.
  • The model predicts the next word based on the input text.
  • The video explains the process of generating text using transformers.
  • GPT-3, a variant of the transformer model, is used for tasks like language translation and text generation.
In the video, the speaker delves into the intricate workings of transformers, particularly focusing on how GPT models generate text. By breaking down the process of tokenizing input data, associating them with vectors, and predicting the next word, viewers gain insight into the fascinating world of neural network models like GPT. The video serves as a comprehensive guide for understanding the mechanisms behind text generation using transformers, shedding light on the complexity and capabilities of these AI models.
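The tokens-to-vectors-to-next-word pipeline that the summary describes can be sketched in a few lines. This is a toy illustration, not the real GPT architecture: the vocabulary, the random embeddings, and the averaging step (standing in for the attention layers) are all made-up assumptions for demonstration.

```python
import numpy as np

# Toy vocabulary and embedding table (random here; a real GPT learns
# these weights during pretraining).
vocab = ["the", "cat", "sat", "on", "mat"]
rng = np.random.default_rng(0)
d_model = 8
embeddings = rng.normal(size=(len(vocab), d_model))  # token id -> vector

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def predict_next(tokens):
    """Map tokens to vectors, pool them (a crude stand-in for
    attention), then score every vocabulary entry as the next token."""
    ids = [vocab.index(t) for t in tokens]
    context = embeddings[ids].mean(axis=0)   # combine input vectors
    logits = embeddings @ context            # "unembed": vector -> scores
    probs = softmax(logits)                  # scores -> probabilities
    return vocab[int(np.argmax(probs))], probs

next_token, probs = predict_next(["the", "cat"])
print(next_token, probs.round(3))
```

A real transformer replaces the mean-pooling with stacked attention and MLP layers, but the outer loop is the same: embed, transform, produce a probability distribution over the next token, then sample and repeat.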
43 sats \ 0 replies \ @cascdr 8 Apr
This github repo references this channel and many other great resources for those interested in AI: https://github.com/mlabonne/llm-course