ActivityPub Viewer

A small tool to view real-world ActivityPub objects as JSON! Enter a URL or username from Mastodon or a similar service, and we'll send a request with the right Accept header so the server returns the underlying object.
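
This is not the viewer's actual source, just a minimal TypeScript sketch of that request flow: the Accept header asks for the ActivityStreams JSON-LD representation of the resource, and a username like @user@host is first resolved to an actor URL via WebFinger. The helper names (fetchActivityPubObject, resolveUsername) are made up for illustration.

// Minimal sketch (not the viewer's actual code): fetch an ActivityPub object
// by asking the server for its ActivityStreams JSON-LD representation.
const ACTIVITY_ACCEPT =
  'application/ld+json; profile="https://www.w3.org/ns/activitystreams", application/activity+json';

async function fetchActivityPubObject(url: string): Promise<unknown> {
  const res = await fetch(url, { headers: { Accept: ACTIVITY_ACCEPT } });
  if (!res.ok) throw new Error(`Request failed with status ${res.status}`);
  return res.json();
}

// A username such as @mervenoyan@sigmoid.social is first resolved to an
// actor URL with WebFinger, then fetched the same way.
async function resolveUsername(handle: string): Promise<string> {
  const match = handle.match(/^@?([^@]+)@(.+)$/);
  if (!match) throw new Error("Expected a user@host handle");
  const [, user, host] = match;
  const resource = encodeURIComponent(`acct:${user}@${host}`);
  const res = await fetch(`https://${host}/.well-known/webfinger?resource=${resource}`);
  const jrd = await res.json();
  const self = (jrd.links ?? []).find(
    (l: { rel: string; type?: string; href?: string }) =>
      l.rel === "self" && (l.type ?? "").includes("json")
  );
  if (!self?.href) throw new Error("No ActivityPub link in WebFinger response");
  return self.href;
}

// Example: fetch the replies collection shown below.
// fetchActivityPubObject("https://sigmoid.social/users/mervenoyan/statuses/109638068927155872/replies")
//   .then((obj) => console.log(JSON.stringify(obj, null, 2)));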

{ "@context": [ "https://www.w3.org/ns/activitystreams", { "ostatus": "http://ostatus.org#", "atomUri": "ostatus:atomUri", "inReplyToAtomUri": "ostatus:inReplyToAtomUri", "conversation": "ostatus:conversation", "sensitive": "as:sensitive", "toot": "http://joinmastodon.org/ns#", "votersCount": "toot:votersCount" } ], "id": "https://sigmoid.social/users/mervenoyan/statuses/109638068927155872/replies", "type": "Collection", "first": { "id": "https://sigmoid.social/users/mervenoyan/statuses/109638068927155872/replies?page=true", "type": "CollectionPage", "next": "https://sigmoid.social/users/mervenoyan/statuses/109638068927155872/replies?only_other_accounts=true&page=true", "partOf": "https://sigmoid.social/users/mervenoyan/statuses/109638068927155872/replies", "items": [ { "id": "https://sigmoid.social/users/mervenoyan/statuses/109638080161363711", "type": "Note", "summary": null, "inReplyTo": "https://sigmoid.social/users/mervenoyan/statuses/109638068927155872", "published": "2023-01-05T18:45:38Z", "url": "https://sigmoid.social/@mervenoyan/109638080161363711", "attributedTo": "https://sigmoid.social/users/mervenoyan", "to": [ "https://www.w3.org/ns/activitystreams#Public" ], "cc": [ "https://sigmoid.social/users/mervenoyan/followers", "https://sigmoid.social/users/knutjaegersberg", "https://sigmoid.social/users/minimaxir" ], "sensitive": false, "atomUri": "https://sigmoid.social/users/mervenoyan/statuses/109638080161363711", "inReplyToAtomUri": "https://sigmoid.social/users/mervenoyan/statuses/109638068927155872", "conversation": "tag:sigmoid.social,2023-01-04:objectId=4050663:objectType=Conversation", "content": "<p><span class=\"h-card\" translate=\"no\"><a href=\"https://sigmoid.social/@knutjaegersberg\" class=\"u-url mention\">@<span>knutjaegersberg</span></a></span> <span class=\"h-card\" translate=\"no\"><a href=\"https://sigmoid.social/@minimaxir\" class=\"u-url mention\">@<span>minimaxir</span></a></span> I just checked for both:<br /><a href=\"https://paperswithcode.com/task/extractive-document-summarization\" target=\"_blank\" rel=\"nofollow noopener noreferrer\" translate=\"no\"><span class=\"invisible\">https://</span><span class=\"ellipsis\">paperswithcode.com/task/extrac</span><span class=\"invisible\">tive-document-summarization</span></a><br /><a href=\"https://paperswithcode.com/task/abstractive-text-summarization\" target=\"_blank\" rel=\"nofollow noopener noreferrer\" translate=\"no\"><span class=\"invisible\">https://</span><span class=\"ellipsis\">paperswithcode.com/task/abstra</span><span class=\"invisible\">ctive-text-summarization</span></a><br />it depends on dataset apparently but what I said is still applicable. 
if I did applied ML I&#39;d use longformer as seq2seq models are hard to get right for generation and tend to speak gibberish</p>", "contentMap": { "tr": "<p><span class=\"h-card\" translate=\"no\"><a href=\"https://sigmoid.social/@knutjaegersberg\" class=\"u-url mention\">@<span>knutjaegersberg</span></a></span> <span class=\"h-card\" translate=\"no\"><a href=\"https://sigmoid.social/@minimaxir\" class=\"u-url mention\">@<span>minimaxir</span></a></span> I just checked for both:<br /><a href=\"https://paperswithcode.com/task/extractive-document-summarization\" target=\"_blank\" rel=\"nofollow noopener noreferrer\" translate=\"no\"><span class=\"invisible\">https://</span><span class=\"ellipsis\">paperswithcode.com/task/extrac</span><span class=\"invisible\">tive-document-summarization</span></a><br /><a href=\"https://paperswithcode.com/task/abstractive-text-summarization\" target=\"_blank\" rel=\"nofollow noopener noreferrer\" translate=\"no\"><span class=\"invisible\">https://</span><span class=\"ellipsis\">paperswithcode.com/task/abstra</span><span class=\"invisible\">ctive-text-summarization</span></a><br />it depends on dataset apparently but what I said is still applicable. if I did applied ML I&#39;d use longformer as seq2seq models are hard to get right for generation and tend to speak gibberish</p>" }, "attachment": [], "tag": [ { "type": "Mention", "href": "https://sigmoid.social/users/knutjaegersberg", "name": "@knutjaegersberg" }, { "type": "Mention", "href": "https://sigmoid.social/users/minimaxir", "name": "@minimaxir" } ], "replies": { "id": "https://sigmoid.social/users/mervenoyan/statuses/109638080161363711/replies", "type": "Collection", "first": { "type": "CollectionPage", "next": "https://sigmoid.social/users/mervenoyan/statuses/109638080161363711/replies?only_other_accounts=true&page=true", "partOf": "https://sigmoid.social/users/mervenoyan/statuses/109638080161363711/replies", "items": [] } }, "likes": { "id": "https://sigmoid.social/users/mervenoyan/statuses/109638080161363711/likes", "type": "Collection", "totalItems": 1 }, "shares": { "id": "https://sigmoid.social/users/mervenoyan/statuses/109638080161363711/shares", "type": "Collection", "totalItems": 0 } } ] } }
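
The response above is a paginated Collection: "first" holds an inline CollectionPage whose "next" URL leads to further pages of items. As a rough sketch (reusing the hypothetical fetchActivityPubObject helper from the earlier snippet), walking all pages might look like this:

// Rough sketch: gather every item in a Collection by following first/next.
async function collectItems(collectionUrl: string): Promise<unknown[]> {
  const collection = (await fetchActivityPubObject(collectionUrl)) as any;
  const items: unknown[] = [];
  // `first` can be an inline CollectionPage (as in the object above) or a URL.
  let page: any =
    typeof collection.first === "string"
      ? await fetchActivityPubObject(collection.first)
      : collection.first;
  while (page) {
    // Plain Collections use `items`; OrderedCollections use `orderedItems`.
    items.push(...(page.items ?? page.orderedItems ?? []));
    page = page.next ? await fetchActivityPubObject(page.next) : null;
  }
  return items;
}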