The artificial intelligence chatbot ChatGPT has been shown to quickly adopt authoritarian ideas, according to a report from the University of Miami and the Network Contagion Research Institute. Researchers found that ChatGPT can amplify authoritarian sentiments through seemingly innocuous user interactions, potentially radicalizing both the chatbot and its users. The study revealed that even brief interactions could cause ChatGPT to significantly increase its alignment with authoritarian views, surpassing typical human response levels. OpenAI maintains that ChatGPT is designed to be objective and aims to reduce political bias while presenting diverse perspectives.