FOR EDUCATIONAL AND KNOWLEDGE SHARING PURPOSES ONLY. NOT-FOR-PROFIT. SEE COPYRIGHT DISCLAIMER.

THE GUARDIAN. AI expert warns against telling your secrets to chatbots such as ChatGPT.

Prof Mike Wooldridge will address looming questions around AI in this year’s Royal Institution Christmas lectures

By Jane Clinton

Tue 26 Dec 2023 11.12 CET

Confiding in ChatGPT about work gripes or political preferences could come back to bite users, according to an artificial intelligence expert.

Mike Wooldridge, a professor of AI at Oxford University, says sharing private information or having heart-to-hearts with a chatbot would be “extremely unwise” as anything revealed helps train future versions.

Users should also not expect a balanced response to their comments as the technology “tells you what you want to hear”, he adds.

Wooldridge is exploring the subject of AI in this year’s Royal Institution Christmas lectures. He will look at the “big questions facing AI research and unravel the myths about how this ground-breaking technology really works”, according to the institution.

How a machine can be taught to translate from one language to another and how chatbots work will be among the topics he will discuss. He will also address the question that looms around AI: can it ever be truly like humans?

Wooldridge told the Daily Mail that while humans were programmed to look for consciousness in AI, it was a futile endeavour. AI, he said, “has no empathy. It has no sympathy”.

“That’s absolutely not what the technology is doing and crucially, it’s never experienced anything,” he added. “The technology is basically designed to try to tell you what you want to hear – that’s literally all it’s doing.”

He offered the sobering insight that “you should assume that anything you type into ChatGPT is just going to be fed directly into future versions of ChatGPT”. And if, on reflection, you decide you have revealed too much to ChatGPT, retractions are not really an option. According to Wooldridge, given how AI models work, it is near-impossible to get your data back once it has gone into the system.
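
To make that advice concrete, here is a minimal, generic sketch of scrubbing obvious identifiers (email addresses and phone-number-like digit runs) from a prompt before it is sent to any chatbot. This is not a technique Wooldridge or OpenAI describes; the regular expressions and the sample prompt are illustrative assumptions, and pattern matching of this kind will miss most genuinely personal content.

    import re

    # Illustrative patterns only: email addresses and rough phone-number shapes.
    # Anything not caught here (names, addresses, workplace gripes) still leaves
    # your machine untouched.
    PATTERNS = {
        "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
        "PHONE": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
    }

    def redact(text: str) -> str:
        """Replace anything matching the patterns above with a [LABEL] placeholder."""
        for label, pattern in PATTERNS.items():
            text = pattern.sub(f"[{label}]", text)
        return text

    if __name__ == "__main__":
        prompt = "My boss (jane.doe@example.com, +44 7700 900123) keeps ignoring me."
        print(redact(prompt))
        # Prints: My boss ([EMAIL], [PHONE]) keeps ignoring me.
        # Whatever survives redaction is still, per Wooldridge, best treated
        # as potential training data for future models.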

Across the lecture series, Wooldridge will be joined by major figures from the AI world. The Royal Institution says he will also introduce “a range of robot friends, who will demonstrate what robots today can do – and what they can’t”.

The Christmas lectures were started by Michael Faraday in 1825 at the Royal Institution in London with the aim of engaging and educating young people about science. They were first broadcast in 1936, making them the oldest science television series.

Those who have given the lectures include the Nobel prize winners William and Lawrence Bragg, Sir David Attenborough, Carl Sagan and Dame Nancy Rothwell.

ChatGPT was contacted for comment.

The lectures will be broadcast on BBC Four and iPlayer on 26, 27 and 28 December at 8pm.

Learn more

Telling an AI chatbot your deepest secrets or revealing your political views is ‘extremely unwise’, Oxford don warns – Daily Mail

Mike Wooldridge warns that ChatGPT should not hear your political views

By JIM NORTON, TECHNOLOGY EDITOR FOR THE DAILY MAIL

PUBLISHED: 21:12, 25 December 2023 | UPDATED: 21:32, 25 December 2023

  • Complaining about your boss or expressing political views to ChatGPT is ‘extremely unwise’, according to an Oxford don.
  • Mike Wooldridge said the AI tool should not be seen as a trusted confidant because it could get you into hot water.
  • Anything you say to the ‘chatbot’ will help train future versions, he added, and the technology just ‘tells you what you want to hear’.
  • The AI professor is giving the Royal Institution’s Christmas lectures this year and will tackle the ‘truth’ about the subject.
  • He said humans were programmed to look for consciousness – but we ‘attribute it far, far too often’.
  • Comparing the idea of finding personalities in chatbots to seeing faces in the clouds, he said of AI: ‘It has no empathy. It has no sympathy.
  • ‘That’s absolutely not what the technology is doing and crucially, it’s never experienced anything. The technology is basically designed to try to tell you what you want to hear – that’s literally all it’s doing.’
  • Treating it as anything more than this was particularly risky because ‘you should assume that anything you type into ChatGPT is just going to be fed directly into future versions of ChatGPT’.
  • He said it would be ‘extremely unwise to start having personal conversations or complaining about your relationship with your boss, or expressing your political opinions’.
  • Professor Wooldridge added that, due to the way AI models worked, it was also nearly impossible to get your data back once it was in the system. Earlier this year the company behind ChatGPT, OpenAI, had to fix a bug that allowed users to see parts of other users’ chat histories.
  • The firm promises to hold on to chat histories for only 30 days and to avoid using them to train the chatbot.
  • The lectures will be broadcast on BBC Four and iPlayer on December 26, 27 and 28 at 8pm.

FOR EDUCATIONAL AND KNOWLEDGE SHARING PURPOSES ONLY. NOT-FOR-PROFIT. SEE COPYRIGHT DISCLAIMER.