
How Does GPT-3 Work?

What's this blog post about?

GPT-3 (Generative Pre-trained Transformer 3) is a large language model developed by OpenAI that uses the deep learning transformer architecture to generate human-like text. Released in 2020, it has been used for applications such as writing poetry and fiction, coding websites, responding to customer reviews, suggesting grammar improvements, translating languages, generating dialogue, finding tax deductions, and automating A/B testing. GPT-3 is not open source, and Microsoft acquired an exclusive license to it in September 2020. With 175 billion parameters, it can generate long-form and specialized text. Despite its impressive capabilities, GPT-3 still has flaws, such as generating toxic and biased text.
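
Since GPT-3 is not open source, it is typically used through OpenAI's hosted API rather than run locally. Below is a minimal Python sketch of generating text with a GPT-3-family model via the legacy completions endpoint, as it worked around the time of this post; it assumes the openai package is installed and an API key is set in the OPENAI_API_KEY environment variable, and model names and defaults may have changed since.

import os

import openai

# Authenticate with the API key from the environment (assumed to be set).
openai.api_key = os.environ["OPENAI_API_KEY"]

# Ask a GPT-3-family model to complete a prompt.
response = openai.Completion.create(
    model="text-davinci-003",   # a GPT-3-family model available in early 2023
    prompt="Write a short poem about the ocean.",
    max_tokens=100,             # cap on the number of generated tokens
    temperature=0.7,            # higher values produce more varied text
)

# Print the generated continuation of the prompt.
print(response.choices[0].text.strip())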

Company
Deepgram

Date published
Jan. 17, 2023

Author(s)
Rachel Meltzer

Word count
1849

Hacker News points
None found.

Language
English
