
Friday, May 20, 2022

GPT-3 Algorithm Is Now Producing Billions of Words a Day

Consider the huge strides GPT-3 has made.

OpenAI’s GPT-3 Algorithm Is Now Producing Billions of Words a Day   By Jason Dorrier, Apr 04, 2021, in Singularity Hub

When OpenAI released its huge natural-language algorithm GPT-3 last summer, jaws dropped. Coders and developers with special access to an early API rapidly discovered new (and unexpected) things GPT-3 could do with naught but a prompt. It wrote passable poetry, produced decent code, calculated simple sums, and with some edits, penned news articles.

All this, it turns out, was just the beginning. In a recent blog post update, OpenAI said that tens of thousands of developers are now making apps on the GPT-3 platform.  Over 300 apps (and counting) use GPT-3, and the algorithm is generating 4.5 billion words a day for them.

Obviously, that’s a lot of words. But to get a handle on how many, let’s try a little back-of-the-napkin math.

The Coming Torrent of Algorithmic Content

Each month, users publish about 70 million posts on WordPress, which is, hands down, the dominant content management system online.

Assuming an average article is 800 words long—which is speculation on my part, but not super long or short—people are churning out some 56 billion words a month or 1.8 billion words a day on WordPress.

If our average word count assumption is in the ballpark, then GPT-3 is producing over twice the daily word count of WordPress posts. Even if you make the average more like 2,000 words per article (which seems high to me), the two are roughly equivalent.
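Here is a quick sketch of that napkin math. The 70-million-posts-per-month and 4.5-billion-words-per-day figures come from the article; the 800- and 2,000-word averages and the 30-day month are the same rough assumptions made above.

```python
# Back-of-the-napkin comparison: WordPress word output vs. GPT-3's
# reported 4.5 billion words per day. Words-per-post values are assumptions.
WORDPRESS_POSTS_PER_MONTH = 70_000_000   # ~70M posts/month on WordPress
GPT3_WORDS_PER_DAY = 4_500_000_000       # 4.5B words/day, per OpenAI
DAYS_PER_MONTH = 30                      # rough average

for words_per_post in (800, 2_000):      # assumed average article lengths
    words_per_month = WORDPRESS_POSTS_PER_MONTH * words_per_post
    words_per_day = words_per_month / DAYS_PER_MONTH
    ratio = GPT3_WORDS_PER_DAY / words_per_day
    print(f"{words_per_post} words/post -> "
          f"{words_per_month / 1e9:.0f}B words/month, "
          f"{words_per_day / 1e9:.2f}B words/day, "
          f"GPT-3 output is {ratio:.1f}x WordPress")
```

At 800 words per post this gives roughly 56 billion words a month (about 1.9 billion a day), so GPT-3's 4.5 billion a day works out to roughly 2.4 times WordPress; at 2,000 words per post the two come out nearly even.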

Now, not every word GPT-3 produces is a word worth reading, and it’s not necessarily producing blog posts (more on applications below). But in either case, just nine months in, GPT-3’s output seems to foreshadow a looming torrent of algorithmic content.

GPT-3 Is Powering a Variety of Apps

So, how exactly are all those words being used? Just as the initial burst of activity suggested, developers are building a range of apps around GPT-3.

Viable, for example, surfaces themes in customer feedback—surveys, reviews, and help desk tickets, for instance—and provides short summaries for companies aiming to improve their services. Fable Studio is bringing virtual characters in interactive stories to life with GPT-3-generated dialogue. And Algolia uses GPT-3 to power an advanced search tool.

In lieu of code, developers use “prompt programming” by providing GPT-3 a few examples of the kind of output they’re hoping to generate. More advanced users can fine-tune things by giving the algorithm data sets of examples or even human feedback.
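As a minimal sketch of that few-shot "prompt programming" pattern, the snippet below shows a handful of example pairs followed by an unfinished one for the model to complete. It assumes the OpenAI Python client of the GPT-3 era; the engine name, example feedback lines, and parameters are illustrative, not taken from the article.

```python
# Few-shot "prompt programming": instead of writing task-specific code,
# show GPT-3 a few input/output examples and let it continue the pattern.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# A few examples establish the task (here: labeling customer feedback themes).
prompt = """Feedback: The checkout page kept timing out on my phone.
Theme: Mobile checkout reliability

Feedback: I waited three days for a reply from support.
Theme: Slow support response

Feedback: Love the product, but the setup guide was confusing.
Theme:"""

response = openai.Completion.create(
    engine="davinci",    # base GPT-3 engine name used at the time
    prompt=prompt,
    max_tokens=10,
    temperature=0.2,     # keep the completion close to the demonstrated pattern
    stop=["\n"],         # stop at the end of the predicted theme
)

print(response.choices[0].text.strip())
```

The fine-tuning route mentioned above follows the same idea, except the examples are supplied as a training dataset rather than packed into each prompt.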

In this respect, GPT-3 (and other similar algorithms) may hasten the adoption of machine learning in natural language processing (NLP). Whereas working with machine learning algorithms has previously involved a steep learning curve, OpenAI says many in the GPT-3 developer community have no background in AI or programming.

“It’s almost this new interface for working with computers,” Greg Brockman, OpenAI’s chief technology officer and co-founder, told Nature in an article earlier this month.

A Walled Garden for AI

OpenAI licensed GPT-3 to Microsoft—who invested a billion dollars in OpenAI in return for such partnerships—but hasn’t released the code publicly.  ....
