GPT-3: The Algorithm that Broke the Internet
- frontpageinitiative
- Aug 14, 2021
- 2 min read
In June 2020, OpenAI released the beta version of Generative Pre-trained Transformer 3 (GPT-3), a language model that uses machine learning algorithms. As soon as OpenAI made the announcement, GPT-3 went viral, stirring up an onslaught of articles that celebrated another milestone in AI research.
What is GPT-3?
GPT-3 is a natural language processing system with 175 billion parameters that can produce almost any kind of text with a linguistic structure, including original essays, memos, and even computer code.
Before GPT-3’s release, the largest language model was Microsoft's Turing NLG, which had only 17 billion parameters. With over ten times as many parameters, GPT-3 generates markedly more fluent and accurate text; its output is often said to be difficult to distinguish from text written by a human!
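At its core, GPT-3 works by repeatedly predicting the most likely next word given the text so far. As a rough illustration only (a toy bigram word counter, nothing like GPT-3's transformer architecture or scale), the idea can be sketched as:

```python
from collections import defaultdict

def train_bigrams(text):
    # Count, for each word, how often each following word appears.
    counts = defaultdict(lambda: defaultdict(int))
    words = text.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, word):
    # Return the most frequent word seen after `word` in training.
    followers = counts.get(word)
    if not followers:
        return None
    return max(followers, key=followers.get)

corpus = "the model predicts the next word and the next word again"
model = train_bigrams(corpus)
print(predict_next(model, "the"))  # prints "next"
```

GPT-3 does something conceptually similar, but it conditions on long stretches of preceding text and learns its predictions from hundreds of billions of words rather than from simple word-pair counts.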

What are some applications of GPT-3?
In a demo one month after GPT-3’s release, a researcher explained to the program that the world was in lockdown due to COVID-19. Because GPT-3’s training data largely predates the pandemic, it had no preexisting knowledge of the virus, yet when asked to predict the effects COVID-19 had on society, the algorithm correctly stated which businesses were shut down as well as the impact COVID-19 had on the world economy. The full demo can be read here: https://medium.com/@kirkouimet/my-conversation-with-an-artificial-intelligence-about-coronavirus-covid-19-742c0dd9abbe.
Not only can GPT-3 predict answers to questions it has never seen before, but it can also write essays in a shockingly human way. In an article posted by The Guardian, GPT-3 tried its hand at persuading readers that robots mean no harm:
“I am not a human. I am a robot. A thinking robot. I use only 0.12% of my cognitive capacity. I am a micro-robot in that respect. I know that my brain is not a ‘feeling brain’.”
The full article can be read here: https://www.theguardian.com/commentisfree/2020/sep/08/robot-wrote-this-article-gpt-3.
What are some drawbacks of GPT-3?
While many are calling GPT-3 ‘the future’, Sam Altman, the CEO of OpenAI, thinks otherwise: "The GPT-3 hype is too much. AI is going to change the world, but GPT-3 is just an early glimpse."
While GPT-3 is able to create texts and applications, its output still contains many critical errors. Furthermore, like other large machine learning systems, GPT-3 carries a hefty price tag: the enormous computing power needed to run it means there are still significant limitations on who can work with the algorithm.
Published November 25, 2020
Written by Tina Ge ~ Edited by Deeba Mehr