
Thursday, July 06, 2023

GPT and Human Psychology

Analogies with Human Thinking and Reasoning.

Towards Data Science, Maarten Grootendorst

The state of AI has changed drastically with generative text models, such as ChatGPT, GPT-4, and many others.

These GPT (Generative Pretrained Transformer) models have seemingly removed the barrier to entry into artificial intelligence for those without a technical background. Anyone can simply start asking the models questions and get scarily accurate answers.

At least, most of the time…

When a model fails to produce the right output, that does not mean it is incapable of doing so. Often, we simply need to change what we ask, the prompt, in a way that guides the model toward the right answer.

This is often referred to as prompt engineering.

Many of the techniques in prompt engineering try to mimic the way humans think. Asking the model to “think aloud” or adding “let’s think step by step” to a prompt are great examples of having the model mimic how we reason.
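To make that concrete, here is a minimal sketch of the “let’s think step by step” trick. It is not from the article: it assumes the openai Python package (1.x-style client) with an API key set in the OPENAI_API_KEY environment variable, and the model name and question are illustrative placeholders.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

question = (
    "A bat and a ball cost $1.10 in total. The bat costs $1.00 more "
    "than the ball. How much does the ball cost?"
)

# Plain prompt: the model answers directly and may jump to the intuitive,
# often wrong, answer.
plain = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": question}],
)

# Zero-shot chain-of-thought prompt: appending "Let's think step by step"
# nudges the model to write out intermediate reasoning before answering.
cot = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": question + "\nLet's think step by step."}],
)

print(plain.choices[0].message.content)
print(cot.choices[0].message.content)

The only change between the two calls is the single appended sentence; that small change in the prompt is what prompt engineering is about.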

These analogies between GPT models and human psychology are important because they help us understand how to improve the output of GPT models and show us which capabilities the models might still be missing.

This does not mean that I am advocating for any GPT model as a general intelligence, but it is interesting to see how and why we are trying to make GPT models “think” like humans.

Many of the analogies that you will see here are also discussed in the video below, in which Andrej Karpathy shares amazing insights into Large Language Models from a psychological perspective. It is definitely worth watching!
