Model Context Protocol (MCP)
Related Terms
Neural Network
A computing system inspired by the biological neural networks of the brain. First modeled mathematically by Warren McCulloch and Walter Pitts in 1943, neural networks are the foundation of modern deep learning systems.
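To make the 1943 idea concrete, here is a minimal Python sketch of a McCulloch-Pitts-style neuron: it sums weighted inputs and fires only if the sum reaches a threshold. The weights and threshold are illustrative values chosen to build an AND gate, not figures from the original paper.

```python
def mp_neuron(inputs, weights, threshold):
    # Sum each input times its weight; fire (1) if the total
    # reaches the threshold, stay silent (0) otherwise.
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Example: with equal weights and a threshold of 2, the neuron
# behaves like a logical AND gate.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", mp_neuron([a, b], [1, 1], threshold=2))
```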
Transformer
A neural network architecture introduced in 2017 by Google researchers in the paper "Attention Is All You Need." Unlike earlier approaches that processed text one word at a time, Transformers use an attention mechanism to process all words in a passage simultaneously. This architecture underlies virtually every major large language model in use today.
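As a rough illustration of what "simultaneously" means here, the sketch below computes scaled dot-product attention, the core operation the paper introduces, over four toy tokens with NumPy. The random 4x8 matrices stand in for real word representations; a single matrix multiplication relates every token to every other token at once.

```python
import numpy as np

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))  # queries: 4 tokens, 8 dimensions each
K = rng.normal(size=(4, 8))  # keys
V = rng.normal(size=(4, 8))  # values

# Compare every token with every other token in one operation.
scores = Q @ K.T / np.sqrt(K.shape[1])

# Softmax turns the scores into attention weights that sum to 1.
weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)

# Blend the values: each token gets a new representation built from all tokens.
output = weights @ V
print(output.shape)  # (4, 8)
```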
Backpropagation
A training technique that lets neural networks learn by adjusting their internal weights in proportion to the errors they make. Revived in the 1980s by researchers including David Rumelhart, Geoffrey Hinton, and Ronald Williams, it became essential to the deep learning revolution.
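A toy sketch of the idea, assuming a model with just one weight: compute the error, use the chain rule to get the gradient, and nudge the weight downhill. The input, target, and learning rate are made-up values for illustration.

```python
w = 0.0               # the one weight we want to learn
x, target = 2.0, 6.0  # we want w * x to equal the target, so w should become 3
lr = 0.1              # learning rate: how big a nudge each step takes

for step in range(20):
    prediction = w * x
    error = prediction - target
    gradient = 2 * error * x  # derivative of the squared error with respect to w
    w -= lr * gradient        # adjust the weight to shrink the error

print(round(w, 4))  # converges toward 3.0
```

Real backpropagation applies this same chain-rule step to millions or billions of weights at once, layer by layer.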
Parameters
The internal values a model learns during training. More parameters generally mean more capacity to learn patterns; GPT-3, for example, has 175 billion of them. Parameter count is often used as a rough measure of model size.
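To see what gets counted, here is a back-of-the-envelope sketch for a small fully connected network; the layer sizes are arbitrary examples, not taken from any real model. Each layer contributes one weight per input-output pair plus one bias per output.

```python
layer_sizes = [784, 256, 64, 10]  # e.g. a tiny image classifier

total = 0
for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
    total += n_in * n_out + n_out  # weights plus biases for this layer

print(f"{total:,} parameters")  # 218,058 for these sizes
```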
Context Window
The amount of text (measured in tokens) that an AI model can take in at once, including the conversation so far. Some models in 2025-2026 support context windows of over a million tokens, roughly equivalent to several novels.
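The "several novels" comparison is rough arithmetic, sketched below using two common rules of thumb (roughly 0.75 English words per token and roughly 90,000 words per novel); the exact ratios vary by tokenizer and by book.

```python
context_tokens = 1_000_000
words = context_tokens * 0.75  # ~0.75 words per token, a rule of thumb
novels = words / 90_000        # ~90,000 words per typical novel

print(f"~{words:,.0f} words, about {novels:.0f} novels")
```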
GPU (Graphics Processing Unit)
A processor originally designed for rendering graphics in video games. GPUs turned out to be ideal for the parallel mathematical operations, mostly matrix multiplications, required to train neural networks. The availability of GPU computing was one of three factors, alongside large datasets and algorithmic advances, that enabled the 2012 deep learning breakthrough.