𝗣𝗔𝗥𝗧 𝟭 - 𝗔 𝗚𝗘𝗡𝗧𝗟𝗘 𝗜𝗡𝗧𝗥𝗢 𝗧𝗢 𝗟𝗟𝗠𝗦 (1h 30m)
Intro & overview of language models, next-word prediction, embeddings, cosine similarity, semantic search:
📺 30m: Simple intro (MBerm)
📖 40m: LLMs without the hype (MRied)
📺 10m: Next-word prediction (LBouc)
📺 10m: Semantic search (LSerr)
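To make the Part 1 ideas concrete, here's a minimal sketch of cosine similarity and semantic search. The 3-d vectors and document names are made up for illustration; real embedding models output hundreds of dimensions.

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two embedding vectors:
    # dot(a, b) / (|a| * |b|). 1.0 means same direction (most similar).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-d "embeddings" (illustrative values, not from a real model).
query = [0.9, 0.1, 0.0]
docs = {
    "dog walking tips": [0.8, 0.2, 0.1],
    "tax law overview": [0.0, 0.1, 0.9],
}

# Semantic search = rank documents by similarity to the query embedding,
# instead of matching keywords.
ranked = sorted(docs, key=lambda d: cosine_similarity(query, docs[d]), reverse=True)
```

Here `ranked[0]` is "dog walking tips", since its vector points in nearly the same direction as the query's.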
𝗣𝗔𝗥𝗧 𝟮 - 𝗧𝗥𝗔𝗡𝗦𝗙𝗢𝗥𝗠𝗘𝗥 𝗔𝗥𝗖𝗛𝗜𝗧𝗘𝗖𝗧𝗨𝗥𝗘 (1h 30m)
Encoder-decoder architecture, masking, attention, transformers, GPTs:
📺 30m: Visual LLM intro (3Blu1Br)
📺 20m: Encoder-decoder (JStarm)
📺 10m: Attention (LSerr)
📺 15m: Transformers (LSerr)
📺 15m: GPTs (DataBrks)
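The core operation behind Part 2 is scaled dot-product attention. A minimal, pure-Python sketch (toy 2-d matrices; real transformers use learned projections, many heads, and, in decoders like GPT, a causal mask that blocks attention to future tokens):

```python
import math

def softmax(xs):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.
    # Each query gets a weighted average of the value vectors, where the
    # weights come from how well the query matches each key.
    d = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        weights = softmax(scores)
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# One query attending over two key/value pairs (illustrative numbers).
ctx = attention([[1.0, 0.0]],
                [[1.0, 0.0], [0.0, 1.0]],
                [[1.0, 0.0], [0.0, 1.0]])
```

The query matches the first key better, so the output leans toward the first value vector rather than splitting evenly.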
𝗣𝗔𝗥𝗧 𝟯 - 𝗣𝗥𝗢𝗠𝗣𝗧 𝗘𝗡𝗚𝗜𝗡𝗘𝗘𝗥𝗜𝗡𝗚 (3h 0m)
Zero/one/few-shot, chain-of-thought, self-consistency, generated knowledge, prompt chaining, ReAct:
📺 60m: Prompt engineering overview (ESarav)
📖 90m: Prompt engineering guide (DAIR)
📖 30m: Prompt engineering guide (Brex)
⚒️ 0m: LangChain Hub
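Two Part 3 techniques in one tiny sketch: few-shot prompting (show worked input/output pairs before the real question) and chain-of-thought (the example answers spell out intermediate reasoning steps). The Q/A template and example text are illustrative, not from any specific resource above:

```python
def few_shot_prompt(examples, question):
    # Few-shot prompting: prepend worked Q/A pairs so the model infers
    # the task and answer format from the examples alone.
    blocks = [f"Q: {q}\nA: {a}" for q, a in examples]
    blocks.append(f"Q: {question}\nA:")  # leave the final answer open
    return "\n\n".join(blocks)

# Chain-of-thought: the example answer shows its reasoning steps,
# nudging the model to reason step by step on the new question too.
examples = [
    ("Roger has 5 balls and buys 2 cans of 3 balls each. How many balls?",
     "He buys 2 * 3 = 6 new balls. 5 + 6 = 11. The answer is 11."),
]
prompt = few_shot_prompt(examples,
                         "A farm has 3 pens with 4 hens each. How many hens?")
```

Zero-shot is the same template with an empty `examples` list; one-shot is a single pair, as here.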
𝗣𝗔𝗥𝗧 𝟰 - 𝗖𝗛𝗔𝗜𝗡𝗜𝗡𝗚, 𝗥𝗔𝗚, 𝗩𝗘𝗖𝗧𝗢𝗥 𝗗𝗕𝗦, 𝗔𝗚𝗘𝗡𝗧𝗦 (2h 35m)
LangChain, RAG, vector databases, LlamaIndex, OpenAI functions:
📺 30m: LangChain cookbook 1 (GKam)
📺 30m: LangChain cookbook 2 (GKam)
⚒️ 0m: LangChain tutorials (GKam)
📖 30m: Advanced RAG (HugFace)
📺 5m: Vector databases (Fireshp)
📺 10m: LangChain + Pinecone (GKam)
📺 20m: LlamaIndex (engprmpt)
📺 30m: OpenAI functions/agents (auto)
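The core RAG loop from Part 4, stripped to its essentials: embed, retrieve the most similar chunks, and stuff them into the prompt as context. This brute-forces the similarity ranking that a vector database (Pinecone, etc.) does at scale with approximate nearest-neighbor search; the 2-d vectors and chunk texts are made up for illustration:

```python
import math

def cosine(a, b):
    # Similarity between two embedding vectors (see Part 1).
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(x * x for x in b)))

def retrieve(query_vec, index, k=2):
    # Rank every stored (embedding, text) pair by similarity to the query
    # and keep the top k. A vector DB replaces this scan with an ANN index.
    ranked = sorted(index, key=lambda pair: cosine(query_vec, pair[0]),
                    reverse=True)
    return [text for _, text in ranked[:k]]

def build_prompt(question, query_vec, index, k=2):
    # RAG: ground the model by pasting retrieved chunks into the prompt.
    context = "\n".join(f"- {c}" for c in retrieve(query_vec, index, k))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

# Toy index; a real pipeline gets embeddings from an embedding model.
index = [
    ([0.9, 0.1], "Refunds are issued within 14 days."),
    ([0.1, 0.9], "Our office is closed on Sundays."),
]
prompt = build_prompt("When are refunds issued?", [0.8, 0.2], index, k=1)
```

LangChain and LlamaIndex wrap exactly this pattern (plus chunking, embedding calls, and prompt templates) behind higher-level abstractions.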
𝗣𝗔𝗥𝗧 𝟱 - 𝗙𝗜𝗡𝗘𝗧𝗨𝗡𝗜𝗡𝗚 (1h 30m)
Feature-based finetuning, LoRA, RLHF:
📖 15m: Finetuning (SRasc)
📖 15m: Finetuning vs prompt engineering (NBant)
📺 30m: RLHF intro (HugFace)
📖 15m: RLHF guide (Lblerr)
📖 15m: LoRA (HugFace)
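The arithmetic behind LoRA from Part 5, as a minimal sketch with toy matrices (no autograd, no framework; real LoRA lives inside a training library like PEFT): the pretrained weight W stays frozen, and only a low-rank pair of matrices A and B is trained, so the effective weight becomes W + alpha * A @ B.

```python
def matmul(X, Y):
    # Plain list-of-lists matrix multiply.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*Y)]
            for row in X]

def lora_forward(x, W, A, B, alpha=1.0):
    # x: 1 x d_in row vector; W: frozen d_in x d_out pretrained weight;
    # A: d_in x r and B: r x d_out are the trained low-rank adapters,
    # with rank r much smaller than d_in/d_out.
    base = matmul(x, W)                   # frozen path
    delta = matmul(matmul(x, A), B)       # low-rank trained path
    return [[b + alpha * d for b, d in zip(base[0], delta[0])]]

# B is initialized to zero, so at the start of training the adapter is a
# no-op and the model behaves exactly like the frozen pretrained model.
x = [[1.0, 2.0]]
W = [[1.0, 0.0], [0.0, 1.0]]
A = [[0.5], [0.5]]          # rank r = 1
B = [[0.0, 0.0]]
y = lora_forward(x, W, A, B)
```

Because only A and B (here 2 + 2 values instead of the full 2 x 2 weight) receive gradients, LoRA finetunes large models with a tiny fraction of the trainable parameters.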
Curated by Jack Blandin. Good luck!