Generative pre-trained transformer
Generative pre-trained transformers (GPTs) are a family of large language models (LLMs) that power many AI-driven chatbots. Built on the transformer architecture, a deep learning framework, these models are pre-trained on extensive corpora of unlabelled text, typically by learning to predict the next token in a sequence. This pre-training enables them to generate original, contextually relevant text.
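The generation process described above can be sketched as a toy autoregressive loop. In a real GPT, a transformer network scores every token in the vocabulary given the context so far; in this illustrative sketch, a hand-written bigram table stands in for the network, and all tokens and probabilities are invented assumptions.

```python
import random

# Toy next-token distributions. A real GPT computes these with a
# transformer conditioned on the full context; this table is a stand-in.
BIGRAM_PROBS = {
    "<s>": {"the": 0.6, "a": 0.4},
    "the": {"cat": 0.5, "dog": 0.5},
    "a":   {"cat": 0.5, "dog": 0.5},
    "cat": {"sat": 1.0},
    "dog": {"sat": 1.0},
    "sat": {"</s>": 1.0},
}

def generate(max_tokens=10, seed=0):
    """Sample a sequence one token at a time (autoregressive generation)."""
    rng = random.Random(seed)
    tokens = ["<s>"]
    for _ in range(max_tokens):
        # Look up the model's distribution over the next token, given the
        # context (here, only the previous token).
        dist = BIGRAM_PROBS[tokens[-1]]
        choices, weights = zip(*dist.items())
        nxt = rng.choices(choices, weights=weights)[0]
        if nxt == "</s>":  # end-of-sequence token stops generation
            break
        tokens.append(nxt)
    return " ".join(tokens[1:])  # drop the start-of-sequence marker

print(generate())
```

The key property this loop shares with GPT-style models is that each new token is sampled conditioned on the text generated so far, which is what makes the output contextually coherent rather than a bag of independent words.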