AI Systems
How language models actually work.
- Tokenization (150 XP): Text becomes numbers. Numbers become predictions.
- Embeddings (150 XP): Vectors encode meaning. Proximity implies similarity.
- Vectors and Similarity (150 XP): Direction encodes relationship. Cosine measures it.
- Attention (200 XP): Every token asks: which other tokens matter to me?
- Transformers (200 XP): Stacked attention and FFN layers compose into a world model.
- Neural Networks (175 XP): Differentiable functions, composed to approximate anything.
- Training (200 XP): Gradient descent plus billions of tokens equals a world model.
- Fine-tuning (200 XP): Pre-train for knowledge. Fine-tune for behaviour.
- Scaling Laws (200 XP): More compute, more data, more parameters — predictable gains.
- Generation (175 XP): Logits in. Sampled tokens out. Repeat until done.
- Prompt Engineering (175 XP): Precise inputs. Predictable outputs.
- Agents (250 XP): Observe. Think. Act. Repeat until done.
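A few of these taglines can be made concrete in a handful of lines. For "Vectors and Similarity": a minimal pure-Python cosine similarity, using made-up 3-d vectors as stand-ins for real embeddings (real models use hundreds or thousands of dimensions).

```python
import math

def cosine_similarity(a, b):
    # Dot product divided by the product of magnitudes:
    # 1.0 means same direction, 0.0 means orthogonal (unrelated).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" (hypothetical values, for illustration only):
king = [0.9, 0.8, 0.1]
queen = [0.85, 0.75, 0.2]
banana = [0.1, 0.2, 0.9]

# Related concepts point in similar directions, so they score higher.
print(cosine_similarity(king, queen) > cosine_similarity(king, banana))
```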
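For "Attention", the question "which other tokens matter to me?" is scaled dot-product attention: each query scores every key, the scores are softmaxed into weights, and the output is the weighted average of the values. A bare-bones sketch with plain Python lists (no batching, no learned projections, single head):

```python
import math

def softmax(xs):
    # Subtract the max for numerical stability before exponentiating.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(queries, keys, values):
    # For each query: score every key, softmax the scores into
    # weights, then return the weighted average of the values.
    d = len(keys[0])
    out = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        out.append([sum(w * v[j] for w, v in zip(weights, values))
                    for j in range(len(values[0]))])
    return out
```

Because the weights sum to 1, each output row is a blend of the value vectors, tilted toward the values whose keys best match the query.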
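For "Training", gradient descent itself fits in a few lines. A one-variable sketch minimising a toy quadratic loss (real training does the same step over billions of parameters, with gradients from backpropagation):

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    # Repeatedly step against the gradient to reduce the loss.
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Toy loss (x - 3)^2 has gradient 2(x - 3); the minimum is at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```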
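For "Generation", "logits in, sampled tokens out" means: divide the logits by a temperature, softmax them into probabilities, and draw a token. A minimal sketch over a hypothetical three-token vocabulary:

```python
import math
import random

def sample_token(logits, temperature=1.0):
    # Lower temperature sharpens the distribution toward the
    # highest-logit token; higher temperature flattens it.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Inverse-CDF sampling: walk the cumulative probabilities.
    r = random.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1

# Token 0 has by far the largest logit, so at low temperature
# it is sampled almost every time.
token = sample_token([10.0, 0.0, 0.0], temperature=0.5)
```

A full decoding loop feeds the sampled token back into the model and repeats until an end-of-sequence token appears.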