25 Jan Generative Pre-trained Transformers (GPT)
A Generative Pre-trained Transformer, or GPT, is a neural network technology developed by OpenAI. Trained on massive amounts of digital text, it uses deep learning techniques to produce natural-sounding text as if written by a human author. Its output may take the form of articles, stories, or even conversations that closely resemble human writing, and it can even imitate the voice of a well-known author.
GPT uses self-attention to process input sequences. “Unlike traditional recurrent neural networks, transformers can process input data in parallel, making them faster and more efficient.” (https://encord.com/glossary/gpt-definition/)
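To make the idea concrete, here is a minimal sketch of scaled dot-product self-attention using NumPy. This is an illustrative simplification, not OpenAI's implementation: the random projection matrices, the tiny sequence, and the single attention head are all assumptions for demonstration. Note that the pairwise scores for every position are computed in one matrix product, which is what lets transformers handle a whole sequence in parallel rather than token by token.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of token vectors X."""
    Q = X @ Wq  # queries: what each position is looking for
    K = X @ Wk  # keys: what each position offers
    V = X @ Wv  # values: the content to be mixed
    d_k = Q.shape[-1]
    # All pairwise scores in one matrix product -- this is the parallel step.
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over each row so every position's weights sum to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output vector is a weighted mix of the values at all positions.
    return weights @ V

# Toy example: a sequence of 4 tokens, each an 8-dimensional embedding.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # one output vector per input position: (4, 8)
```

A real GPT stacks many such attention layers (with multiple heads, masking so positions cannot attend to future tokens, and feed-forward sublayers), but the core computation is this one.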