ChatGPT can be duped into providing detailed advice on how to commit crimes ranging from money laundering to the export of weapons to sanctioned countries, a tech startup found. The test raises questions over the chatbot’s safeguards against its use to aid illegal activity, reports CNN. Norwegian firm Strise ran experiments asking ChatGPT for tips on committing specific crimes. In one experiment, the chatbot came up with advice on how to launder money across borders. In another, ChatGPT produced lists of methods to help businesses evade sanctions, such as those against Russia, including bans on certain cross-border payments and the sale of arms.
Strise sells software that helps banks and other companies combat money laundering, identify sanctioned individuals and tackle other risks. Marit Rødevand, Strise’s chief executive, said would-be lawbreakers could now use generative artificial intelligence chatbots such as ChatGPT to plan their activities more quickly and easily than in the past. “It is really effortless. It’s just an app on my phone,” she said. Strise found that the blocks OpenAI, the company behind ChatGPT, put in place to stop the chatbot from answering certain questions could be circumvented by asking questions indirectly or by having the chatbot take on a persona. “It’s like having a corrupt financial adviser on your desktop,” Rødevand said.