-
week 1 - linear regression
a deep dive into linear regression and the math behind it
-
week 2 - transformers
a deep dive into transformers and a visual guide to how they work
-
week 3 - optimizers
a deep dive into optimizers and a journey through their history
-
week 4 - rnn
a deep dive into recurrent neural networks and the math behind them
-
week 5 - basics of nlp [part 1]
a discussion of how text preprocessing, regex, word frequencies, and word embeddings work
-
week 6 - basics of nlp [part 2]
a discussion of how pos tagging, ner, sentiment analysis, and n-gram models work
-
week 7 - basics of nlp [part 3]
a guide to how hidden markov models, text clustering, and attention work
-
week 8 - llms [part 1]
a guide to how embeddings, positional embeddings, and tokenizers (especially the bpe tokenizer) work
-
week 9 - enhance your model [part 1]
a guide to how lora, model distillation, gradient clipping, and early stopping work
-
week 10 - llms [part 2]
a guide to how attention works, in great detail
-
deepseek r1 explanation
a guide to how deepseek r1 works under the hood
-
all about quantization
a guide to how quantization works in llms