Fenris AI
Models · History

Logic Theorist

Often called the first AI program, the Logic Theorist was created by Allen Newell, Herbert Simon, and Cliff Shaw, and demonstrated at the 1956 Dartmouth workshop. It proved theorems from Whitehead and Russell's Principia Mathematica.
Read more on Wikipedia

Related Terms

Large Language Model (LLM)

An AI model trained on massive amounts of text data that can understand and generate human language. Examples include GPT-4, Claude, and Gemini. These models use the Transformer architecture and typically have billions of parameters.

Expert System

A type of AI, prominent in the 1980s, that encoded human specialist knowledge as a set of if-then rules. Expert systems like XCON were commercially successful for a time but ultimately proved too brittle and expensive to maintain, contributing to the second AI winter.
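The if-then rule idea can be sketched as a toy forward-chaining engine: keep applying rules whose conditions are already known facts until nothing new can be derived. (The rules below are invented for illustration and are not drawn from XCON or any real system.)

```python
# Toy forward-chaining rule engine illustrating the expert-system if-then idea.
# Each rule is (set of condition facts, conclusion fact). Hypothetical rules only.

def forward_chain(facts, rules):
    """Apply rules repeatedly until no new facts can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            # Fire the rule if all its conditions hold and it adds something new
            if conclusion not in facts and all(c in facts for c in conditions):
                facts.add(conclusion)
                changed = True
    return facts

rules = [
    ({"has_fur", "gives_milk"}, "mammal"),
    ({"mammal", "eats_meat"}, "carnivore"),
]
derived = forward_chain({"has_fur", "gives_milk", "eats_meat"}, rules)
# "mammal" fires first, which then lets "carnivore" fire
```

Real shells added features like certainty factors and explanation traces, but the core loop was this simple; the brittleness came from the thousands of hand-written rules such systems accumulated.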

Perceptron

An early type of neural network, introduced by Frank Rosenblatt in the late 1950s, that learns a linear decision boundary from examples. Minsky and Papert's 1969 book "Perceptrons" demonstrated mathematical limitations of single-layer networks (such as the inability to learn XOR), which effectively killed neural network research funding for over a decade.
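The perceptron's learning rule fits in a few lines: predict with a weighted sum, and whenever a prediction is wrong, nudge the weights toward the correct label. A minimal sketch (plain Python, labels are +1/-1; the AND example is illustrative):

```python
# Minimal perceptron learning rule. Converges only on linearly
# separable data, which is exactly the limitation Minsky and
# Papert highlighted (XOR is not separable, so it never converges).

def train_perceptron(samples, labels, epochs=20, lr=1.0):
    n = len(samples[0])
    w = [0.0] * n  # weights
    b = 0.0        # bias
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            pred = 1 if activation >= 0 else -1
            if pred != y:  # misclassified: move the boundary toward y
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

# Logical AND is linearly separable, so the perceptron learns it.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [-1, -1, -1, 1]
w, b = train_perceptron(X, y)
```

Stacking such units into multi-layer networks trained by backpropagation is what eventually overcame the single-layer limitation.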

AlexNet

A deep neural network that won the 2012 ImageNet competition (ILSVRC) by a landslide, beating the runner-up's top-5 error rate by more than 10 percentage points. Built by Alex Krizhevsky, Ilya Sutskever, and Geoffrey Hinton, AlexNet demonstrated that deep learning worked at scale and is widely considered the inflection point for modern AI.

GPT (Generative Pre-trained Transformer)

A series of large language models by OpenAI. GPT-1 (2018) introduced the approach, GPT-3 (2020) demonstrated that scale produces emergent capabilities, and GPT-4 (2023) became one of the most capable AI systems available.

ChatGPT

A conversational AI product by OpenAI launched in November 2022. It reached 100 million users in two months, making it at the time the fastest-growing consumer application ever. ChatGPT is widely credited with bringing AI into mainstream public consciousness.

Learn AI With Fenris

Plain-language AI education with ethics certification and a real community. Launching Spring 2026.

Join the Waitlist