GPU (Graphics Processing Unit)
Related Terms
Neural Network
A computing system inspired by the biological neural networks in the brain. First modeled mathematically by McCulloch and Pitts in 1943, neural networks are the foundation of modern deep learning systems.
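The McCulloch-Pitts model mentioned above reduces a neuron to a weighted sum and a threshold. A minimal sketch (the weights and threshold here are hand-picked to implement logical AND, purely for illustration):

```python
def mcp_neuron(inputs, weights, threshold):
    """Fire (return 1) if the weighted input sum reaches the threshold."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

def logical_and(a, b):
    # Both inputs must be active for the sum to reach the threshold of 2.
    return mcp_neuron([a, b], weights=[1, 1], threshold=2)
```

Modern networks replace the hard threshold with smooth functions and learn the weights from data, but the weighted-sum core is the same.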
Transformer
A neural network architecture introduced in 2017 by Google researchers in the paper "Attention Is All You Need." Unlike previous approaches that processed text sequentially, Transformers can process all words in a passage simultaneously. This architecture powers virtually every major AI system today.
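The "all words simultaneously" property comes from the attention mechanism: every position scores itself against every other position in one step, with no dependence on earlier outputs. A simplified sketch of scaled dot-product attention on toy vectors (plain Python, no deep-learning library):

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def attention(queries, keys, values):
    """Each query attends to all keys at once; outputs are weighted
    averages of the value vectors. No sequential dependence between words."""
    d = len(keys[0])
    outputs = []
    for q in queries:
        scores = [dot(q, k) / math.sqrt(d) for k in keys]
        weights = softmax(scores)
        outputs.append([sum(w * v[i] for w, v in zip(weights, values))
                        for i in range(len(values[0]))])
    return outputs
```

Real Transformers add learned projections, multiple attention heads, and stacked layers on top of this core.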
Backpropagation
A training technique that allows neural networks to learn by adjusting their internal weights based on errors. Revived in the 1980s by researchers like David Rumelhart, it became essential to the deep learning revolution.
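The "adjusting weights based on errors" idea can be shown on the smallest possible case: one weight, one training example, and the chain rule giving the gradient of the squared error. This is a sketch of gradient descent, the update rule backpropagation feeds:

```python
def train(x, target, w=0.0, lr=0.1, steps=50):
    """Learn w so that w * x approximates target, by gradient descent."""
    for _ in range(steps):
        y = w * x              # forward pass
        error = y - target     # how wrong the prediction is
        grad = 2 * error * x   # d(error**2)/dw via the chain rule
        w -= lr * grad         # nudge the weight against the gradient
    return w
```

Backpropagation proper is this same chain-rule bookkeeping applied efficiently across millions of weights in many layers.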
Parameters
The internal values a model learns during training. More parameters generally mean more capacity to learn patterns. GPT-3 has 175 billion parameters. Parameter count is often used as a rough measure of model size.
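Where do these counts come from? Each fully connected layer contributes one weight per input-output pair plus one bias per output. A quick sketch (the layer sizes below are made up for illustration, not taken from any real model):

```python
def dense_layer_params(n_in, n_out):
    # weights (n_in * n_out) plus one bias per output
    return n_in * n_out + n_out

def total_params(layer_sizes):
    """Count parameters in a stack of dense layers, e.g. [784, 128, 10]."""
    return sum(dense_layer_params(a, b)
               for a, b in zip(layer_sizes, layer_sizes[1:]))
```

For a tiny network with sizes [784, 128, 10], this gives 101,770 parameters; billion-parameter models are the same arithmetic at vastly larger sizes.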
Context Window
The amount of text (measured in tokens) that an AI model can consider at once. Some models released in 2025-2026 support context windows of over a million tokens, roughly equivalent to several novels.
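The "several novels" comparison is back-of-the-envelope arithmetic. A sketch under two labeled assumptions: roughly 1.3 tokens per English word and roughly 90,000 words in a typical novel (both are ballpark figures, not from the source):

```python
TOKENS_PER_WORD = 1.3    # rough average for English text (assumption)
WORDS_PER_NOVEL = 90_000  # typical novel length (assumption)

def novels_in_context(context_tokens):
    """Estimate how many novels fit in a context window of a given size."""
    tokens_per_novel = WORDS_PER_NOVEL * TOKENS_PER_WORD
    return context_tokens / tokens_per_novel
```

Under these assumptions, a one-million-token window holds roughly eight to nine novels.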
Expert System
A type of AI from the 1980s that encoded human specialist knowledge as a set of if-then rules. Expert systems like XCON were commercially successful for a time but ultimately proved too brittle and expensive to maintain, contributing to the second AI winter.
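The if-then style is easy to show in miniature: knowledge lives in hand-written rules, and inference is checking which rule's conditions match the known facts. The rules below are invented for illustration, not taken from XCON:

```python
# Each rule pairs a set of required facts with a conclusion.
RULES = [
    ({"has_fever", "has_cough"}, "suspect flu"),
    ({"has_fever"}, "suspect infection"),
    ({"has_cough"}, "suspect cold"),
]

def infer(facts):
    """Fire the first rule whose conditions are all present in the facts."""
    for conditions, conclusion in RULES:
        if conditions <= facts:
            return conclusion
    return "no rule applies"
```

The brittleness the entry describes is visible even here: any situation the rule authors did not anticipate falls through to "no rule applies," and every new case means another hand-written rule.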