A small tool to view real-world ActivityPub objects as JSON! Enter a URL or username from Mastodon or a similar service below, and we'll send a request with the right Accept header to the server to view the underlying object.
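The request this tool sends can be sketched with nothing but the Python standard library. The sketch below only builds the request (so it runs without network access); the profile parameter on the Accept header is the media type the ActivityPub spec recommends for fetching objects, and the example URL is the post shown below.

```python
import urllib.request

def build_activitypub_request(url: str) -> urllib.request.Request:
    # Servers content-negotiate on the Accept header: without it, Mastodon
    # returns the HTML page for a post instead of the underlying JSON object.
    return urllib.request.Request(
        url,
        headers={
            "Accept": 'application/ld+json; profile="https://www.w3.org/ns/activitystreams"'
        },
    )

req = build_activitypub_request("https://social.coop/@eric/114348381280217178")
print(req.get_header("Accept"))
```

Passing the request to `urllib.request.urlopen(req)` would then return the JSON document shown below.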
{
  "@context": [
    "https://www.w3.org/ns/activitystreams",
    {
      "ostatus": "http://ostatus.org#",
      "atomUri": "ostatus:atomUri",
      "inReplyToAtomUri": "ostatus:inReplyToAtomUri",
      "conversation": "ostatus:conversation",
      "sensitive": "as:sensitive",
      "toot": "http://joinmastodon.org/ns#",
      "votersCount": "toot:votersCount",
      "Hashtag": "as:Hashtag"
    }
  ],
  "id": "https://social.coop/users/eric/statuses/114348381280217178",
  "type": "Note",
  "summary": null,
  "inReplyTo": "https://social.coop/users/eric/statuses/113964318177451822",
  "published": "2025-04-16T15:37:09Z",
  "url": "https://social.coop/@eric/114348381280217178",
  "attributedTo": "https://social.coop/users/eric",
  "to": [
    "https://www.w3.org/ns/activitystreams#Public"
  ],
  "cc": [
    "https://social.coop/users/eric/followers"
  ],
  "sensitive": false,
  "atomUri": "https://social.coop/users/eric/statuses/114348381280217178",
  "inReplyToAtomUri": "https://social.coop/users/eric/statuses/113964318177451822",
  "conversation": "tag:social.coop,2023-02-12:objectId=54048271:objectType=Conversation",
  "content": "<p><a href=\"https://social.coop/tags/LLMs\" class=\"mention hashtag\" rel=\"tag\">#<span>LLMs</span></a> can be used for rituals since they tend to fabulate in expected ways:</p><p>\"Because LLMs have no internal mental processes they are aptly suited to answering such ritualised prompts, spinning out the required clichés with slight variations. As Dan Davies, a writer, puts it, they tend to regurgitate “maximally unsurprising outcomes”. For the first time, we have non-human, non-intelligent processes that can generatively enact ritual at high speed and industrial scale.\"</p><p><a href=\"https://archive.is/20240927101805/https://www.economist.com/by-invitation/2024/09/04/large-language-models-will-upend-human-rituals\" target=\"_blank\" rel=\"nofollow noopener noreferrer\" translate=\"no\"><span class=\"invisible\">https://</span><span class=\"ellipsis\">archive.is/20240927101805/http</span><span class=\"invisible\">s://www.economist.com/by-invitation/2024/09/04/large-language-models-will-upend-human-rituals</span></a></p>",
  "contentMap": {
    "en": "<p><a href=\"https://social.coop/tags/LLMs\" class=\"mention hashtag\" rel=\"tag\">#<span>LLMs</span></a> can be used for rituals since they tend to fabulate in expected ways:</p><p>\"Because LLMs have no internal mental processes they are aptly suited to answering such ritualised prompts, spinning out the required clichés with slight variations. As Dan Davies, a writer, puts it, they tend to regurgitate “maximally unsurprising outcomes”. For the first time, we have non-human, non-intelligent processes that can generatively enact ritual at high speed and industrial scale.\"</p><p><a href=\"https://archive.is/20240927101805/https://www.economist.com/by-invitation/2024/09/04/large-language-models-will-upend-human-rituals\" target=\"_blank\" rel=\"nofollow noopener noreferrer\" translate=\"no\"><span class=\"invisible\">https://</span><span class=\"ellipsis\">archive.is/20240927101805/http</span><span class=\"invisible\">s://www.economist.com/by-invitation/2024/09/04/large-language-models-will-upend-human-rituals</span></a></p>"
  },
  "attachment": [],
  "tag": [
    {
      "type": "Hashtag",
      "href": "https://social.coop/tags/LLMs",
      "name": "#LLMs"
    }
  ],
  "replies": {
    "id": "https://social.coop/users/eric/statuses/114348381280217178/replies",
    "type": "Collection",
    "first": {
      "type": "CollectionPage",
      "next": "https://social.coop/users/eric/statuses/114348381280217178/replies?only_other_accounts=true&page=true",
      "partOf": "https://social.coop/users/eric/statuses/114348381280217178/replies",
      "items": []
    }
  },
  "likes": {
    "id": "https://social.coop/users/eric/statuses/114348381280217178/likes",
    "type": "Collection",
    "totalItems": 1
  },
  "shares": {
    "id": "https://social.coop/users/eric/statuses/114348381280217178/shares",
    "type": "Collection",
    "totalItems": 0
  }
}
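Once fetched, the object above is plain JSON and can be inspected with any JSON library. A minimal sketch, using field names from the object above (the literal here is a trimmed stand-in for the full response body):

```python
import json

# Trimmed stand-in for the response body shown above.
body = """
{
  "id": "https://social.coop/users/eric/statuses/114348381280217178",
  "type": "Note",
  "attributedTo": "https://social.coop/users/eric",
  "likes": {
    "id": "https://social.coop/users/eric/statuses/114348381280217178/likes",
    "type": "Collection",
    "totalItems": 1
  }
}
"""

obj = json.loads(body)
print(obj["type"])                 # → Note
print(obj["attributedTo"])         # → https://social.coop/users/eric
print(obj["likes"]["totalItems"])  # → 1
```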