As language models get bigger, they do not just get better at producing text; they also learn entirely new behaviors simply by being shown more training data.
GPT-2: 1.5B parameters (Feb 2019)
GPT-3: 175B parameters (May 2020)
GPT-4: parameter count undisclosed, widely rumored to be far larger (March 2023)
OpenAI's ChatGPT took less than two months to reach more than 100 million active users. Foundation models use self-supervised learning to pick up the relationships in language from large amounts of unstructured data. The size of these large language models (LLMs) has been growing roughly tenfold per year.
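Self-supervision is why no human labeling is needed: the raw text supplies its own targets. Here is a minimal sketch of that next-token objective, assuming PyTorch and a toy character-level corpus; the tiny embedding-plus-linear model is a hypothetical stand-in for a real transformer.

```python
import torch
import torch.nn as nn

corpus = "the quick brown fox jumps over the lazy dog "
vocab = sorted(set(corpus))
stoi = {ch: i for i, ch in enumerate(vocab)}
ids = torch.tensor([stoi[ch] for ch in corpus])

# Self-supervision: inputs are the text, targets are the same text shifted by one.
x, y = ids[:-1], ids[1:]

# Deliberately tiny stand-in for a transformer: embedding plus a linear output head.
model = nn.Sequential(nn.Embedding(len(vocab), 32), nn.Linear(32, len(vocab)))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for step in range(200):
    logits = model(x)                              # (seq_len, vocab_size)
    loss = nn.functional.cross_entropy(logits, y)  # next-token prediction loss
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"final next-token loss: {loss.item():.3f}")
```

Scaling this same objective up, with more parameters and more text, is what the GPT series above does; the architecture changes far less than the size.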
Training GPT-3 led to carbon emissions equivalent to roughly 550 round trips from New York to the West Coast.
