ActivityPub Viewer

A small tool to view real-world ActivityPub objects as JSON! Enter a URL or username from Mastodon or a similar service below, and we'll send a request with the right Accept header to retrieve the underlying object from the server.
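As a sketch of what this tool does under the hood: the ActivityPub spec has clients request objects with the media type `application/ld+json; profile="https://www.w3.org/ns/activitystreams"` (servers should also accept `application/activity+json`). A minimal fetch of the Note shown below might look like this; the commented-out line performs the actual network call.

```python
import urllib.request

# ActivityPub servers return the JSON representation of an object only
# when the request advertises an ActivityStreams media type.
URL = "https://pxi.social/users/jakob/statuses/110283974473306733"
ACCEPT = 'application/ld+json; profile="https://www.w3.org/ns/activitystreams"'

req = urllib.request.Request(URL, headers={"Accept": ACCEPT})
# obj = json.load(urllib.request.urlopen(req))  # uncomment to fetch for real

print(req.get_header("Accept"))  # the header that triggers the JSON response
```

Without that header, most servers respond with the HTML page for the status instead of the JSON object.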

{
  "@context": [
    "https://www.w3.org/ns/activitystreams",
    {
      "ostatus": "http://ostatus.org#",
      "atomUri": "ostatus:atomUri",
      "inReplyToAtomUri": "ostatus:inReplyToAtomUri",
      "conversation": "ostatus:conversation",
      "sensitive": "as:sensitive",
      "toot": "http://joinmastodon.org/ns#",
      "votersCount": "toot:votersCount",
      "Hashtag": "as:Hashtag"
    }
  ],
  "id": "https://pxi.social/users/jakob/statuses/110283974473306733",
  "type": "Note",
  "summary": null,
  "inReplyTo": null,
  "published": "2023-04-29T20:25:03Z",
  "url": "https://pxi.social/@jakob/110283974473306733",
  "attributedTo": "https://pxi.social/users/jakob",
  "to": [
    "https://www.w3.org/ns/activitystreams#Public"
  ],
  "cc": [
    "https://pxi.social/users/jakob/followers"
  ],
  "sensitive": false,
  "atomUri": "https://pxi.social/users/jakob/statuses/110283974473306733",
  "inReplyToAtomUri": null,
  "conversation": "tag:pxi.social,2023-04-29:objectId=530024:objectType=Conversation",
  "content": "<p><a href=\"https://pxi.social/tags/LLM\" class=\"mention hashtag\" rel=\"tag\">#<span>LLM</span></a> are brute-forcing their way through absurd amounts of data to generate an autocomplete output for any given input that approximates outputs a human might give instead.</p><p>They lack a few distinct properties of human cognition, including language, that more brute force alone cannot compensate for. Because they can only ever internalize and compute *intra*textual context.</p><p>Incidentally, humans need much less input(!) to learn language. Probably because they can contextualize across domains. <br />🧵</p>",
  "contentMap": {
    "en": "<p><a href=\"https://pxi.social/tags/LLM\" class=\"mention hashtag\" rel=\"tag\">#<span>LLM</span></a> are brute-forcing their way through absurd amounts of data to generate an autocomplete output for any given input that approximates outputs a human might give instead.</p><p>They lack a few distinct properties of human cognition, including language, that more brute force alone cannot compensate for. Because they can only ever internalize and compute *intra*textual context.</p><p>Incidentally, humans need much less input(!) to learn language. Probably because they can contextualize across domains. <br />🧵</p>"
  },
  "updated": "2023-12-30T15:41:58Z",
  "attachment": [],
  "tag": [
    {
      "type": "Hashtag",
      "href": "https://pxi.social/tags/LLM",
      "name": "#LLM"
    }
  ],
  "replies": {
    "id": "https://pxi.social/users/jakob/statuses/110283974473306733/replies",
    "type": "Collection",
    "first": {
      "type": "CollectionPage",
      "next": "https://pxi.social/users/jakob/statuses/110283974473306733/replies?min_id=110284022617111775&page=true",
      "partOf": "https://pxi.social/users/jakob/statuses/110283974473306733/replies",
      "items": [
        "https://pxi.social/users/jakob/statuses/110284022617111775"
      ]
    }
  }
}