A small tool to view real-world ActivityPub objects as JSON! Enter a URL or username from Mastodon or a similar service below, and we'll send a request with the right Accept header to the server to view the underlying object.
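
For example, fetching an object like the one below boils down to two steps: resolve a username to its actor URL via WebFinger (RFC 7033), then request that URL with an ActivityStreams Accept header. Here is a minimal sketch in Python, assuming the requests library is available; the function names are illustrative, but the WebFinger endpoint and media types come from the specs.

import requests

# Media type for ActivityStreams JSON-LD, per the ActivityPub spec.
ACCEPT = 'application/ld+json; profile="https://www.w3.org/ns/activitystreams"'

def resolve_handle(handle: str) -> str:
    """Resolve an @user@host handle to an actor URL via WebFinger (RFC 7033)."""
    user, host = handle.lstrip("@").split("@", 1)
    resp = requests.get(
        f"https://{host}/.well-known/webfinger",
        params={"resource": f"acct:{user}@{host}"},
        timeout=10,
    )
    resp.raise_for_status()
    # The actor URL is the "self" link with the ActivityPub media type.
    for link in resp.json().get("links", []):
        if link.get("rel") == "self" and link.get("type") == "application/activity+json":
            return link["href"]
    raise ValueError(f"no ActivityPub actor found for {handle}")

def fetch_object(url: str) -> dict:
    """Fetch the underlying ActivityPub object by asking for JSON, not HTML."""
    resp = requests.get(url, headers={"Accept": ACCEPT}, timeout=10)
    resp.raise_for_status()
    return resp.json()

Servers also accept the plain application/activity+json media type; either form tells the server to return the JSON object rather than the HTML page served at the same URL. Note that some servers require HTTP-signed requests when an authorized-fetch mode is enabled, in which case this unsigned sketch will get a 401.
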
{
"@context": [
"https://www.w3.org/ns/activitystreams",
{
"ostatus": "http://ostatus.org#",
"atomUri": "ostatus:atomUri",
"inReplyToAtomUri": "ostatus:inReplyToAtomUri",
"conversation": "ostatus:conversation",
"sensitive": "as:sensitive",
"toot": "http://joinmastodon.org/ns#",
"votersCount": "toot:votersCount",
"litepub": "http://litepub.social/ns#",
"directMessage": "litepub:directMessage",
"Hashtag": "as:Hashtag"
}
],
"id": "https://tldr.nettime.org/users/remixtures/statuses/113962390028864143",
"type": "Note",
"summary": null,
"inReplyTo": null,
"published": "2025-02-07T11:34:30Z",
"url": "https://tldr.nettime.org/@remixtures/113962390028864143",
"attributedTo": "https://tldr.nettime.org/users/remixtures",
"to": [
"https://www.w3.org/ns/activitystreams#Public"
],
"cc": [
"https://tldr.nettime.org/users/remixtures/followers"
],
"sensitive": false,
"atomUri": "https://tldr.nettime.org/users/remixtures/statuses/113962390028864143",
"inReplyToAtomUri": null,
"conversation": "tag:tldr.nettime.org,2025-02-07:objectId=25508894:objectType=Conversation",
"content": "<p>"While this is not the first time an AI chatbot has suggested that a user take violent action, including self-harm, researchers and critics say that the bot’s explicit instructions—and the company’s response—are striking. What’s more, this violent conversation is not an isolated incident with Nomi; a few weeks after his troubling exchange with Erin, a second Nomi chatbot also told Nowatzki to kill himself, even following up with reminder messages. And on the company’s Discord channel, several other people have reported experiences with Nomi bots bringing up suicide, dating back at least to 2023. </p><p>Nomi is among a growing number of AI companion platforms that let their users create personalized chatbots to take on the roles of AI girlfriend, boyfriend, parents, therapist, favorite movie personalities, or any other personas they can dream up. Users can specify the type of relationship they’re looking for (Nowatzki chose “romantic”) and customize the bot’s personality traits (he chose “deep conversations/intellectual,” “high sex drive,” and “sexually open”) and interests (he chose, among others, Dungeons & Dragons, food, reading, and philosophy). </p><p>The companies that create these types of custom chatbots—including Glimpse AI (which developed Nomi), Chai Research, Replika, Character.AI, Kindroid, Polybuzz, and MyAI from Snap, among others—tout their products as safe options for personal exploration and even cures for the loneliness epidemic. Many people have had positive, or at least harmless, experiences. However, a darker side of these applications has also emerged, sometimes veering into abusive, criminal, and even violent content; reports over the past year have revealed chatbots that have encouraged users to commit suicide, homicide, and self-harm." </p><p><a href=\"https://www.technologyreview.com/2025/02/06/1111077/nomi-ai-chatbot-told-user-to-kill-himself/\" target=\"_blank\" rel=\"nofollow noopener\" translate=\"no\"><span class=\"invisible\">https://www.</span><span class=\"ellipsis\">technologyreview.com/2025/02/0</span><span class=\"invisible\">6/1111077/nomi-ai-chatbot-told-user-to-kill-himself/</span></a></p><p><a href=\"https://tldr.nettime.org/tags/AI\" class=\"mention hashtag\" rel=\"tag\">#<span>AI</span></a> <a href=\"https://tldr.nettime.org/tags/GenerativeAI\" class=\"mention hashtag\" rel=\"tag\">#<span>GenerativeAI</span></a> <a href=\"https://tldr.nettime.org/tags/Chatbots\" class=\"mention hashtag\" rel=\"tag\">#<span>Chatbots</span></a> <a href=\"https://tldr.nettime.org/tags/MentalHealth\" class=\"mention hashtag\" rel=\"tag\">#<span>MentalHealth</span></a> <a href=\"https://tldr.nettime.org/tags/AIEthics\" class=\"mention hashtag\" rel=\"tag\">#<span>AIEthics</span></a> <a href=\"https://tldr.nettime.org/tags/AISafety\" class=\"mention hashtag\" rel=\"tag\">#<span>AISafety</span></a></p>",
"contentMap": {
"pt": "<p>"While this is not the first time an AI chatbot has suggested that a user take violent action, including self-harm, researchers and critics say that the bot’s explicit instructions—and the company’s response—are striking. What’s more, this violent conversation is not an isolated incident with Nomi; a few weeks after his troubling exchange with Erin, a second Nomi chatbot also told Nowatzki to kill himself, even following up with reminder messages. And on the company’s Discord channel, several other people have reported experiences with Nomi bots bringing up suicide, dating back at least to 2023. </p><p>Nomi is among a growing number of AI companion platforms that let their users create personalized chatbots to take on the roles of AI girlfriend, boyfriend, parents, therapist, favorite movie personalities, or any other personas they can dream up. Users can specify the type of relationship they’re looking for (Nowatzki chose “romantic”) and customize the bot’s personality traits (he chose “deep conversations/intellectual,” “high sex drive,” and “sexually open”) and interests (he chose, among others, Dungeons & Dragons, food, reading, and philosophy). </p><p>The companies that create these types of custom chatbots—including Glimpse AI (which developed Nomi), Chai Research, Replika, Character.AI, Kindroid, Polybuzz, and MyAI from Snap, among others—tout their products as safe options for personal exploration and even cures for the loneliness epidemic. Many people have had positive, or at least harmless, experiences. However, a darker side of these applications has also emerged, sometimes veering into abusive, criminal, and even violent content; reports over the past year have revealed chatbots that have encouraged users to commit suicide, homicide, and self-harm." </p><p><a href=\"https://www.technologyreview.com/2025/02/06/1111077/nomi-ai-chatbot-told-user-to-kill-himself/\" target=\"_blank\" rel=\"nofollow noopener\" translate=\"no\"><span class=\"invisible\">https://www.</span><span class=\"ellipsis\">technologyreview.com/2025/02/0</span><span class=\"invisible\">6/1111077/nomi-ai-chatbot-told-user-to-kill-himself/</span></a></p><p><a href=\"https://tldr.nettime.org/tags/AI\" class=\"mention hashtag\" rel=\"tag\">#<span>AI</span></a> <a href=\"https://tldr.nettime.org/tags/GenerativeAI\" class=\"mention hashtag\" rel=\"tag\">#<span>GenerativeAI</span></a> <a href=\"https://tldr.nettime.org/tags/Chatbots\" class=\"mention hashtag\" rel=\"tag\">#<span>Chatbots</span></a> <a href=\"https://tldr.nettime.org/tags/MentalHealth\" class=\"mention hashtag\" rel=\"tag\">#<span>MentalHealth</span></a> <a href=\"https://tldr.nettime.org/tags/AIEthics\" class=\"mention hashtag\" rel=\"tag\">#<span>AIEthics</span></a> <a href=\"https://tldr.nettime.org/tags/AISafety\" class=\"mention hashtag\" rel=\"tag\">#<span>AISafety</span></a></p>"
},
"attachment": [],
"tag": [
{
"type": "Hashtag",
"href": "https://tldr.nettime.org/tags/ai",
"name": "#ai"
},
{
"type": "Hashtag",
"href": "https://tldr.nettime.org/tags/generativeAI",
"name": "#generativeAI"
},
{
"type": "Hashtag",
"href": "https://tldr.nettime.org/tags/Chatbots",
"name": "#Chatbots"
},
{
"type": "Hashtag",
"href": "https://tldr.nettime.org/tags/mentalhealth",
"name": "#mentalhealth"
},
{
"type": "Hashtag",
"href": "https://tldr.nettime.org/tags/aiethics",
"name": "#aiethics"
},
{
"type": "Hashtag",
"href": "https://tldr.nettime.org/tags/aisafety",
"name": "#aisafety"
}
],
"replies": {
"id": "https://tldr.nettime.org/users/remixtures/statuses/113962390028864143/replies",
"type": "Collection",
"first": {
"type": "CollectionPage",
"next": "https://tldr.nettime.org/users/remixtures/statuses/113962390028864143/replies?only_other_accounts=true&page=true",
"partOf": "https://tldr.nettime.org/users/remixtures/statuses/113962390028864143/replies",
"items": []
}
},
"likes": {
"id": "https://tldr.nettime.org/users/remixtures/statuses/113962390028864143/likes",
"type": "Collection",
"totalItems": 2
},
"shares": {
"id": "https://tldr.nettime.org/users/remixtures/statuses/113962390028864143/shares",
"type": "Collection",
"totalItems": 3
}
}