Tutorial · transformers · llm · rlhf · rag
Stanford Publishes CME295 Transformers Course Materials
Relevance Score: 8.3

Stanford's CME295 Transformers and Large Language Models course from Autumn 2025 has been published openly, providing complete curriculum materials: nine recorded on-campus lectures, slides, and midterm and final exams with solutions. The lectures cover tokenization, attention, decoding, mixture-of-experts, scaling laws, fine-tuning methods such as LoRA and RLHF, retrieval-augmented generation, agentic LLMs, evaluation, and quantization and optimization. The course is recommended for learners with a basic background in linear algebra, machine learning, and Python.
