Building Large Language Models
Schedule
Each session runs four consecutive days, 09:30-16:30, at the following locations:
- 9-12 February 2026: Amsterdam, Eindhoven, Houten, Online, Rotterdam, Zwolle
- 13-16 April 2026: Amsterdam, Eindhoven, Houten, Online, Rotterdam, Zwolle
- 8-11 June 2026: Amsterdam, Eindhoven, Houten, Online, Rotterdam, Zwolle
- 10-13 August 2026: Amsterdam, Eindhoven
LLM Intro
The course starts by explaining what LLMs are, where they’re used, and the lifecycle of building vs. using them. We introduce the Transformer/GPT architecture, how models learn from large datasets, and when to use classic QA versus RAG.
Working with Text Data
You’ll move from raw text to model-ready tensors: tokenization (e.g., BPE), token→ID mapping, special/context tokens, and sliding-window sampling. We cover embeddings and positional encodings while handling unknown words and basic sentence structure.
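To make this concrete, here is a minimal sketch of BPE tokenization and sliding-window sampling. It assumes the third-party tiktoken package; the context length and stride are arbitrary illustration values, not taken from the course.

```python
# A minimal sketch of BPE tokenization and sliding-window sampling.
# Assumes the `tiktoken` package is installed; sizes are illustrative.
import tiktoken
import torch

enc = tiktoken.get_encoding("gpt2")          # GPT-2 byte-pair encoder
ids = enc.encode("LLMs learn language patterns from raw text.")

context_len, stride = 4, 1
inputs, targets = [], []
for i in range(0, len(ids) - context_len, stride):
    inputs.append(ids[i : i + context_len])           # model input window
    targets.append(ids[i + 1 : i + context_len + 1])  # next-token labels

x, y = torch.tensor(inputs), torch.tensor(targets)
print(x.shape, y.shape)  # (num_windows, context_len) each
```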
Attention Mechanisms
This module demystifies self-attention for long-sequence modeling: queries, keys, values, and causal masking to hide future tokens. We add positional encoding, multi-head attention, and stacked layers to capture dependencies across different parts of the input.
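As an illustration, the following sketch implements scaled dot-product self-attention with a causal mask in PyTorch; the batch size, sequence length, and embedding dimension are arbitrary.

```python
# A minimal sketch of causal (masked) self-attention in PyTorch.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
B, T, d = 1, 5, 8                        # batch, sequence length, embed dim
x = torch.randn(B, T, d)

Wq, Wk, Wv = (torch.nn.Linear(d, d, bias=False) for _ in range(3))
q, k, v = Wq(x), Wk(x), Wv(x)            # queries, keys, values

scores = q @ k.transpose(-2, -1) / d ** 0.5        # scaled dot products
mask = torch.triu(torch.ones(T, T, dtype=torch.bool), diagonal=1)
scores = scores.masked_fill(mask, float("-inf"))   # hide future tokens
weights = F.softmax(scores, dim=-1)                # causal attention weights
out = weights @ v                                  # (B, T, d)
```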
PyTorch Deep Learning
This module explains PyTorch fundamentals—tensors, core operations, and training loops—with the tooling to measure model quality. We cover feature scaling/normalization (including categorical features), activation and loss functions, and backpropagation.
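A minimal training-loop sketch on synthetic regression data, assuming nothing beyond plain PyTorch:

```python
# A minimal sketch of a PyTorch training loop on synthetic data.
import torch

torch.manual_seed(0)
X = torch.randn(100, 3)                      # 100 samples, 3 features
y = X @ torch.tensor([1.0, -2.0, 0.5]) + 0.1 * torch.randn(100)

model = torch.nn.Linear(3, 1)
loss_fn = torch.nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)

for epoch in range(100):
    pred = model(X).squeeze(-1)              # forward pass
    loss = loss_fn(pred, y)                  # measure error
    optimizer.zero_grad()
    loss.backward()                          # backpropagation
    optimizer.step()                         # update weights
print(f"final MSE: {loss.item():.4f}")
```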
Neural Networks
Next, the course proceeds to building MLPs and CNNs in PyTorch, choosing appropriate activations and losses, and implementing backprop. We touch on NLP-specific preprocessing and walk through end-to-end binary and multi-class classification.
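For example, a small MLP for multi-class classification might look like this sketch (the layer sizes and class count are illustrative):

```python
# A minimal sketch of an MLP for multi-class classification in PyTorch.
import torch

model = torch.nn.Sequential(
    torch.nn.Linear(20, 64),    # 20 input features
    torch.nn.ReLU(),
    torch.nn.Linear(64, 3),     # 3 classes (logits)
)
X = torch.randn(32, 20)                        # a batch of 32 samples
labels = torch.randint(0, 3, (32,))
loss = torch.nn.functional.cross_entropy(model(X), labels)
loss.backward()                                # gradients via backprop
```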
GPT from scratch
Next you will implement a minimal GPT with layer normalization, residual connections, and attention + feed-forward (GELU) blocks.
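A compact sketch of such a block, using PyTorch's built-in nn.MultiheadAttention rather than the hand-rolled attention from the course, with illustrative sizes:

```python
# A minimal sketch of one GPT-style transformer block: pre-layer-norm,
# causal multi-head attention, a GELU feed-forward network, and
# residual (shortcut) connections. Sizes are illustrative.
import torch
import torch.nn as nn

class TransformerBlock(nn.Module):
    def __init__(self, d_model=64, n_heads=4):
        super().__init__()
        self.ln1 = nn.LayerNorm(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ln2 = nn.LayerNorm(d_model)
        self.ff = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )

    def forward(self, x):
        T = x.size(1)
        mask = torch.triu(torch.ones(T, T, dtype=torch.bool), diagonal=1)
        h = self.ln1(x)
        attn_out, _ = self.attn(h, h, h, attn_mask=mask)
        x = x + attn_out                 # residual around attention
        x = x + self.ff(self.ln2(x))     # residual around feed-forward
        return x

x = torch.randn(2, 10, 64)
print(TransformerBlock()(x).shape)       # torch.Size([2, 10, 64])
```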
Pretraining
Then you will pretrain the LLM on unlabeled text with next-token prediction, tracking training and validation losses. You will explore decoding strategies (e.g., temperature, top-k), control randomness for reproducibility, and save/load PyTorch weights.
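A sketch of how temperature scaling and top-k sampling combine when picking the next token; the vocabulary size and hyperparameters are illustrative:

```python
# A minimal sketch of temperature scaling and top-k sampling
# over a vector of next-token logits.
import torch

torch.manual_seed(42)                      # control randomness
logits = torch.randn(50257)                # e.g. GPT-2 vocabulary size
temperature, k = 0.8, 50

scaled = logits / temperature              # <1 sharpens, >1 flattens
topk_vals, topk_idx = torch.topk(scaled, k)
probs = torch.softmax(topk_vals, dim=-1)   # renormalize over top-k only
next_token = topk_idx[torch.multinomial(probs, num_samples=1)]
print(next_token.item())
```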
Tuning for Classification
Then the course covers preparing datasets and dataloaders, initializing from pretrained weights, and adding a classification head with softmax. You will train and evaluate with loss/accuracy, culminating in an LLM-based spam-classification example.
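A minimal sketch of the idea: a linear head over the last token's hidden state, with a placeholder nn.Identity standing in for the pretrained backbone (not the course's actual model):

```python
# A minimal sketch of adding a classification head to a pretrained
# backbone. `backbone` is a placeholder here, not a real GPT.
import torch
import torch.nn as nn

d_model, num_classes = 64, 2               # e.g. spam vs. not-spam
backbone = nn.Identity()                   # stand-in for a pretrained GPT

head = nn.Linear(d_model, num_classes)     # newly initialized head
hidden = backbone(torch.randn(8, 10, d_model))  # (batch, seq, d_model)
logits = head(hidden[:, -1, :])            # classify from the last token
probs = torch.softmax(logits, dim=-1)      # per-class probabilities
labels = torch.randint(0, num_classes, (8,))
loss = nn.functional.cross_entropy(logits, labels)
```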
Fine-Tuning
Finally, you will practice supervised instruction tuning: formatting datasets, batching efficiently, and fine-tuning a pretrained LLM. You will also evaluate outputs, export responses and checkpoints, and apply parameter-efficient methods such as LoRA.
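As a taste of the parameter-efficient part, here is a minimal LoRA-style adapter sketch; the rank and scaling values are illustrative, not taken from the course:

```python
# A minimal sketch of a LoRA adapter: the frozen weight W is augmented
# with a trainable low-rank update B @ A, scaled by alpha / rank.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, linear: nn.Linear, rank=8, alpha=16):
        super().__init__()
        self.base = linear
        for p in self.base.parameters():
            p.requires_grad_(False)          # freeze pretrained weights
        self.A = nn.Parameter(torch.randn(rank, linear.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(linear.out_features, rank))
        self.scale = alpha / rank

    def forward(self, x):
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

layer = LoRALinear(nn.Linear(64, 64))
out = layer(torch.randn(4, 64))   # only A and B receive gradients
```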
Audience Course Building Large Language Models
The course Building Large Language Models is intended for engineers who want to design and build transformer-based LLMs.
Prerequisites Course Building Large Language Models
Participants should be comfortable with Python. Prior exposure to PyTorch or a similar Deep Learning framework is helpful.
Realization Training Building Large Language Models
The training blends concise theory with guided, hands-on labs. Through code-alongs you’ll build a mini-GPT, prepare datasets, run pretraining and fine-tuning, and deploy models.
Building Large Language Models Certificate
After completion, participants receive a certificate of participation for the course Building Large Language Models.
Modules
Module 1 : LLM Intro
- What is an LLM?
- Applications of LLMs
- Stages of Building LLMs
- Stages of Using LLMs
- Transformer Architecture
- Utilizing Large Datasets
- GPT Architecture Internals
- Learn Language Patterns
- Retrieval Augmented Generation
- Question and Answer Systems
- QA versus RAG
- Building an LLM
Module 2 : Working with Text Data
- Word Embeddings
- Decoders and Encoders
- Decoder Only Transformer
- Tokenizing text
- Convert Tokens into IDs
- Special Context Tokens
- Understand Sentence Structure
- Byte Pair Encoding
- Unknown Words
- Sampling with Sliding Window
- Creating Token Embeddings
- Encoding Word Positions
Module 3 : Attention Mechanisms
- Modeling Long Sequences
- Capturing Data Dependencies
- Attention Mechanisms
- Attending Different Input Parts
- Using Self-Attention
- Trainable Weights
- Hiding Future Words
- Positional Encoding
- Causal Attention
- Masking Weights with Dropout
- Multihead Attention
- Stacking Attention Layers
Module 4 : PyTorch Deep Learning
- Deep Learning Intro
- Overview of PyTorch
- PyTorch Tensors
- Tensor Operations
- Model Evaluation Metrics
- Feature Scaling
- Feature Normalization
- Categorical Features
- Activation Functions
- Loss Functions
- Backpropagation
Module 5 : Neural Networks
- Neural Networks Intro
- Building NN with PyTorch
- Multiple Layers of Arrays
- Convolutional Neural Networks
- Activation Functions
- Loss Functions
- Backpropagation
- Natural Language Processing
- Stopword Removal
- Binary Classification
- Multi-class Classification
Module 6 : GPT from scratch
- Coding an LLM Architecture
- Layer Normalization
- Normalizing Activations
- Feed Forward Network
- GELU Activations
- Adding Shortcut Connections
- Connecting Attention
- Weight Tying
- Linear Layers in Transformer Block
- Coding the GPT Model
- Generating Text
Module 7 : Pretraining
- Pretraining on Unlabeled Data
- Calculating Text Generation Loss
- Training Losses
- Validation Set Losses
- Training an LLM
- Decoding Strategies
- Control Randomness
- Temperature Scaling
- Saving Model Weights in PyTorch
- Loading Pretrained Weights
Module 8 : Tuning for Classification
- Categories of Fine-Tuning
- Preparing the Dataset
- Creating Data Loaders
- Top-k Sampling
- Softmax Function
- Initializing with Pretrained Weights
- Adding Classification Head
- Classification Loss and Accuracy
- Fine-tuning on Supervised Data
- Using LLM as Spam Classifier
Module 9 : Fine-Tuning
- Instruction Fine-tuning
- Supervised Instruction
- Preparing a Dataset
- Organizing Training Batches
- Creating Data Loaders
- Loading a Pretrained LLM
- Fine-tuning the LLM
- Extracting and Saving Responses
- Evaluating Fine-tuned LLM
- Fine-Tuning with LoRA
Why SpiralTrain
SpiralTrain specializes in software development training. We offer courses for beginning programmers who want to master the basics of languages and tools, as well as courses for experienced software professionals who want to get up to speed with the latest version of a language or framework.
Our courses are characterized by:
• Classroom or online open-enrollment courses and other training formats
• Clear, competitive course prices with no extra costs
• Many courses with a continuous case study
• Courses geared toward certification
