An analysis of "Attention Is All You Need", the paper at the heart of the Transformer architecture, walking through its core mechanisms one by one: Self-Attention, Multi-Head Attention, and Positional Encoding.
Slowly, correctly. AI Researcher & DevOps Engineer Youngju's tech blog: GPU/CUDA, LLM, MLOps, Kubernetes AI workloads, distributed training, and data engineering.