The Next Web

How will GPT-3 change our lives?


“GPT-3 is not a mind, but it is also not entirely a machine. It’s something else: a statistically abstracted representation of the contents of millions of minds, as expressed in their writing.”

Regina Rini, Philosopher

In recent years, the AI circus really has come to town and we’ve been treated to a veritable parade of technical aberrations seeking to dazzle us with their human-like intelligence. Many of these sideshows have been “embodied” AI, where the physical form usually functions as a cunning disguise for a clunky, pre-programmed bot. Like the world’s first “AI anchor,” launched by a Chinese TV network and — how could we ever forget — Sophia, Saudi Arabia’s first robotic citizen.

But last month there was a furor around something altogether more serious: a system The Verge called “an invention that could end up defining the decade to come.” Its name is GPT-3, and it could certainly make our future a lot more complicated.

So, what is all the fuss about? And how might this supposed tectonic shift in technological development change the lives of the rest of us?


An autocomplete for thought 

The GPT-3 software was built by San Francisco-based OpenAI, and The New York Times has described it as “…by far the most powerful ‘language model’ ever created,” adding:

A language model is an artificial intelligence system that has been trained on an enormous corpus of text; with enough text and enough processing, the machine begins to learn probabilistic connections between words. More plainly: GPT-3 can read and write. And not badly, either… GPT-3 is capable of generating entirely original, coherent, and sometimes even factual prose. And not just prose — it can write poetry, dialogue, memes, computer code, and who knows what else.

Farhad Manjoo, New York Times

In this case, “enormous” is something of an understatement. Reportedly, the entirety of the English Wikipedia — spanning some 6 million articles — makes up just 0.6% of GPT-3’s training data.

In layman’s terms, it is a giant autocomplete program, one that has feasted on the vast texts of the internet: digital books, articles, religious texts, science lectures, message boards, blogs, computing manuals, and just about anything else you could conceive of. Because it can cleverly spot patterns and consistencies in all that material, it can perform a mind-boggling array of tasks.
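To make the “giant autocomplete” idea concrete, here is a minimal toy sketch in Python. It is emphatically not how GPT-3 works internally (GPT-3 is a neural network with 175 billion parameters, not a word-pair counter), but it shows the core principle the article describes: count which words tend to follow which in a corpus, then predict the statistically most likely continuation. The tiny corpus and function names here are invented for illustration.

```python
from collections import Counter, defaultdict

# A deliberately tiny, made-up "training corpus."
corpus = (
    "the cat sat on the mat . "
    "the dog sat on the rug . "
    "the cat chased the dog ."
).split()

# Learn probabilistic connections between words: for each word,
# count how often every other word immediately follows it.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def autocomplete(word: str) -> str:
    """Return the most frequent next word seen after `word`."""
    return following[word].most_common(1)[0][0]

print(autocomplete("sat"))  # "on" — the only word ever seen after "sat"
print(autocomplete("on"))   # "the"
```

Scale this intuition up from counting adjacent word pairs in three sentences to modeling long-range patterns across a large slice of the internet, and you have the rough shape of what a language model does.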