Is working through Bronze, Silver, Gold, and Platinum in order too much of a slog?
Then this is for those who want to take the Platinum first!
Summarise GPT-3 in 300-900 words.
Generative Pre-trained Transformer 3
Example of a model answer
GPT-3 (Generative Pre-trained Transformer 3) is a pre-trained natural language processing model developed by OpenAI, an AI research lab in San Francisco. Its main feature is that it can produce text as natural as if it had been written by a human.
GPT-3 is an autoregressive language model (a model that predicts the next word from the words that came before it) with 175 billion parameters, trained on a data set of about 570 GB of text collected from the web and other sources and then pre-processed. The result is a language model far larger than earlier models such as BERT and GPT-2. *¹
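To make the autoregressive idea concrete, here is a minimal sketch in Python: at each step the model assigns probabilities to candidate next words given the text so far, one word is sampled and appended, and the longer text becomes the context for the next step. The next_word_probs function is a hypothetical toy stand-in, not a real model.

```python
import random

def next_word_probs(context):
    # Hypothetical stand-in: a real language model computes
    # P(next word | context) over its entire vocabulary.
    return {"language": 0.5, "model": 0.3, "text": 0.2}

def generate(prompt, steps=3):
    # Autoregressive loop: predict a next word, append it,
    # and feed the longer text back in as the new context.
    words = prompt.split()
    for _ in range(steps):
        probs = next_word_probs(words)
        choices = list(probs.keys())
        weights = list(probs.values())
        words.append(random.choices(choices, weights=weights)[0])
    return " ".join(words)

print(generate("GPT-3 is a"))
```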
What is GPT-3?
GPT-3 (Generative Pre-trained Transformer 3) is a text-generating language model with 175 billion parameters, developed by OpenAI. (* A language model predicts the continuation of a given input text.) Access to GPT-3 is currently limited to selected users, but its predecessor, GPT-2, is open source, so anyone can try it, as in the sketch below.
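Since GPT-2 is open source, trying this kind of text generation yourself is straightforward. The following is a minimal sketch using the Hugging Face transformers library (an assumption; the article does not name a specific tool) to generate a continuation with the publicly released gpt2 model:

```python
# Requires: pip install transformers torch
from transformers import pipeline

# Load the publicly released GPT-2 model (the small 124M-parameter version).
generator = pipeline("text-generation", model="gpt2")

# Ask the model to continue a prompt, generating up to 30 new tokens.
result = generator("GPT-3 is a language model that", max_new_tokens=30)
print(result[0]["generated_text"])
```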