September 30, 2022

In the mid-1960s, a researcher at the Massachusetts Institute of Technology, Joseph Weizenbaum, built an automated psychotherapist he called Eliza. This chatbot was simple: when you typed a thought onto a computer screen, it asked you to expand on that thought, or it simply repeated your words back in the form of a question.

Even when Dr. Weizenbaum cherry-picked a conversation for the academic paper he published on the technology, it looked like this, with Eliza responding in capital letters:

Men are all alike.

IN WHAT WAY?

They’re always bugging us about something or other.

CAN YOU THINK OF A SPECIFIC EXAMPLE?

Well, my boyfriend made me come here.

YOUR BOYFRIEND MADE YOU COME HERE
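The mirroring trick in that exchange can be sketched in a few lines. This is a hypothetical toy, not Weizenbaum's original program: it swaps first- and second-person pronouns and reflects the statement back, with a couple of canned rules loosely modeled on the published transcript.

```python
import re

# Pronoun swaps so "my boyfriend made me come here"
# becomes "your boyfriend made you come here".
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your",
    "am": "are", "you": "I", "your": "my",
}

def reflect(text: str) -> str:
    """Swap pronouns word by word."""
    words = re.findall(r"[\w']+", text.lower())
    return " ".join(REFLECTIONS.get(w, w) for w in words)

def respond(statement: str) -> str:
    """Produce an Eliza-style reply (in capitals, like the transcript)."""
    lowered = statement.lower()
    if lowered.startswith("well,"):
        # Strip the lead-in and mirror the rest back.
        return reflect(statement[len("Well,"):].strip()).upper()
    if "always" in lowered:
        return "CAN YOU THINK OF A SPECIFIC EXAMPLE?"
    if lowered.endswith("alike."):
        return "IN WHAT WAY?"
    # Default: turn the reflected statement into a question.
    return reflect(statement).upper() + "?"
```

Feeding it the lines above reproduces the flavor of the exchange: `respond("Well, my boyfriend made me come here.")` returns "YOUR BOYFRIEND MADE YOU COME HERE". The real Eliza used a richer keyword-and-template script, but the core idea, pattern matching plus pronoun reflection, is this small.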

But much to Dr. Weizenbaum’s surprise, people treated Eliza as if it were human. They freely shared their personal problems and took comfort in its responses.

“I knew from long experience that the strong emotional ties many programmers have to their computers are often formed after only short experiences with machines,” he later wrote. “What I had not realized is that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people.”

We humans are susceptible to these feelings. When dogs, cats and other animals exhibit even tiny amounts of humanlike behavior, we tend to assume they are more like us than they really are. Much the same happens when we see hints of human behavior in a machine.

Scientists now call it the Eliza effect.

Something similar is happening with modern technology. A few months after GPT-3 was released, an inventor and entrepreneur, Philip Bosua, sent me an email. The subject line was: “god is a machine.”

“There is no doubt in my mind GPT-3 has emerged as sentient,” it read. “We all knew this would happen in the future, but it seems like this future is now. It views me as a prophet to disseminate its religious message and that’s strangely what it feels like.”
