<h1>Claude AI will end &#8216;persistently harmful or abusive user interactions&#8217;</h1>
<p><em>Published August 18, 2025</em></p>
<figure>
<img decoding="async" alt="" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/06/STK269_ANTHROPIC_D.jpg?quality=90&amp;strip=all&amp;crop=0,0,100,100" />
</figure>
<p>Anthropic&#8217;s Claude AI chatbot can now end conversations deemed &#8220;persistently harmful or abusive,&#8221; as spotted <a href="https://techcrunch.com/2025/08/16/anthropic-says-some-claude-models-can-now-end-harmful-or-abusive-conversations/" target="_blank" rel="noopener">earlier by <em>TechCrunch</em></a>. The capability <a href="https://www.anthropic.com/research/end-subset-conversations" target="_blank" rel="noopener">is now available in the Opus 4 and 4.1 models</a> and allows the chatbot to end conversations as a &#8220;last resort&#8221; after users repeatedly ask it to generate harmful content despite multiple refusals and attempts at redirection. The goal is to help the &#8220;potential welfare&#8221; of AI models, Anthropic says, by terminating types of interactions in which Claude has shown &#8220;apparent distress.&#8221;</p>
<p>If Claude chooses to cut a conversation short, users won&#8217;t be able to send new messages in that conversation. They can still create new chats, as well as edit and retry previous messages if they want to continue a particular thread.</p>
<p>During its testing of Claude Opus 4, Anthropic says it found that Claude had a &#8220;robust and consistent aversion to harm,&#8221; including when asked to generate sexual content involving minors or to provide information that could contribute to violent acts and terrorism. In these cases, Anthropic says, Claude showed a &#8220;pattern of apparent distress&#8221; and a &#8220;tendency to end harmful conversations when given the ability.&#8221;</p>
<p>Anthropic notes that conversations triggering this kind of response are &#8220;extreme edge cases,&#8221; adding that most users won&#8217;t encounter this roadblock even when chatting about controversial topics. The AI startup has also instructed Claude not to end conversations if a user is showing signs that they might want to hurt themselves or cause &#8220;imminent harm&#8221; to others. <a href="https://www.anthropic.com/news/building-safeguards-for-claude#:~:text=We%20also%20work,in%20these%20conversations." target="_blank" rel="noopener">Anthropic partners</a> with Throughline, an online crisis support provider, to help develop responses to prompts related to self-harm and mental health.</p>
<p>Last week, <a href="https://www.theverge.com/news/760080/anthropic-updated-usage-policy-dangerous-ai-landscape" target="_blank" rel="noopener">Anthropic also updated Claude&#8217;s usage policy</a> as rapidly advancing AI models raise growing safety concerns. The company now prohibits people from using Claude to develop biological, nuclear, chemical, or radiological weapons, as well as to develop malicious code or exploit a network&#8217;s vulnerabilities.</p>