
25 Jan Generative Pre-trained Transformers (GPT)

A Generative Pre-trained Transformer (GPT) is a neural network technology developed by OpenAI. Trained on massive amounts of digital text, it uses deep learning techniques to produce natural-sounding text as if written by a human author. Its output may take the form of articles, stories, or even conversations that closely resemble human writing, and it can even imitate the voice of a well-known author.
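To make this concrete, here is a minimal sketch of generating text with GPT-2, an openly released GPT model, via the Hugging Face transformers library. This is one convenient way to try a GPT model locally, not how OpenAI serves its own newer models, which are accessed through a hosted API instead.

# A minimal sketch, assuming the Hugging Face "transformers" library and a
# backend such as PyTorch are installed (pip install transformers torch).
from transformers import pipeline

# Load a small, publicly available GPT model for text generation.
generator = pipeline("text-generation", model="gpt2")

# Continue a prompt; the model predicts the next tokens one at a time.
result = generator("Once upon a time", max_new_tokens=40)
print(result[0]["generated_text"])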

GPT uses “self-attention to process input sequences. Unlike traditional recurrent neural networks, transformers can process input data in parallel, making them faster and more efficient” (https://encord.com/glossary/gpt-definition/).
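The quoted description can be illustrated directly. Below is a minimal sketch of scaled dot-product self-attention in Python with NumPy; the weight matrices here are random placeholders for illustration, not trained parameters, and the real models use many such attention layers with multiple heads. The key point is that the score matrix compares every token with every other token in a single matrix multiplication, which is the parallelism the quote refers to: there is no token-by-token recurrence as in a recurrent neural network.

import numpy as np

def self_attention(x, Wq, Wk, Wv):
    # Project the input sequence into queries, keys, and values.
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    # Scaled dot-product scores: every token attends to every other
    # token at once, so the whole sequence is processed in parallel.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Softmax over each row turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output vector is a weighted mix of all value vectors.
    return weights @ V

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8                       # toy sizes for illustration
x = rng.normal(size=(seq_len, d_model))       # stand-in token embeddings
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out = self_attention(x, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one contextualized vector per input token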
