ActivityPub Viewer

A small tool to view real-world ActivityPub objects as JSON! Enter a URL or username from Mastodon or a similar service below, and we'll send the server a request with the right Accept header so it returns the underlying object instead of an HTML page.
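Under the hood this is plain HTTP content negotiation: the same URL that renders as a web page in your browser returns the raw object when asked for the ActivityStreams media type. Here is a minimal sketch of that request in Python (using the requests library; the URL is the example Note shown below, and the Accept value is the JSON-LD profile the ActivityPub spec prescribes for clients — most Fediverse servers also accept the shorter application/activity+json):

import requests

# Ask for the ActivityStreams JSON-LD representation instead of the HTML page.
ACCEPT = 'application/ld+json; profile="https://www.w3.org/ns/activitystreams"'

url = "https://sigmoid.social/users/lysander07/statuses/114493958022865311"
response = requests.get(url, headers={"Accept": ACCEPT}, timeout=10)
response.raise_for_status()

note = response.json()
print(note["type"], note["published"], note["attributedTo"])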

{ "@context": [ "https://www.w3.org/ns/activitystreams", { "ostatus": "http://ostatus.org#", "atomUri": "ostatus:atomUri", "inReplyToAtomUri": "ostatus:inReplyToAtomUri", "conversation": "ostatus:conversation", "sensitive": "as:sensitive", "toot": "http://joinmastodon.org/ns#", "votersCount": "toot:votersCount", "blurhash": "toot:blurhash", "focalPoint": { "@container": "@list", "@id": "toot:focalPoint" }, "Hashtag": "as:Hashtag" } ], "id": "https://sigmoid.social/users/lysander07/statuses/114493958022865311", "type": "Note", "summary": null, "inReplyTo": null, "published": "2025-05-12T08:39:14Z", "url": "https://sigmoid.social/@lysander07/114493958022865311", "attributedTo": "https://sigmoid.social/users/lysander07", "to": [ "https://www.w3.org/ns/activitystreams#Public" ], "cc": [ "https://sigmoid.social/users/lysander07/followers", "https://wisskomm.social/users/fiz_karlsruhe", "https://sigmoid.social/users/fizise", "https://fedihum.org/users/tabea", "https://sigmoid.social/users/enorouzi", "https://fedihum.org/users/sourisnumerique" ], "sensitive": false, "atomUri": "https://sigmoid.social/users/lysander07/statuses/114493958022865311", "inReplyToAtomUri": null, "conversation": "tag:sigmoid.social,2025-05-12:objectId=57476255:objectType=Conversation", "content": "<p>Last leg on our brief history of NLP (so far) is the advent of large language models with GPT-3 in 2020 and the introduction of learning from the prompt (aka few-shot learning).</p><p>T. B. Brown et al. (2020). Language models are few-shot learners. NIPS&#39;20</p><p><a href=\"https://proceedings.neurips.cc/paper/2020/file/1457c0d6bfcb4967418bfb8ac142f64a-Paper.pdf\" target=\"_blank\" rel=\"nofollow noopener noreferrer\" translate=\"no\"><span class=\"invisible\">https://</span><span class=\"ellipsis\">proceedings.neurips.cc/paper/2</span><span class=\"invisible\">020/file/1457c0d6bfcb4967418bfb8ac142f64a-Paper.pdf</span></a></p><p><a href=\"https://sigmoid.social/tags/llms\" class=\"mention hashtag\" rel=\"tag\">#<span>llms</span></a> <a href=\"https://sigmoid.social/tags/gpt\" class=\"mention hashtag\" rel=\"tag\">#<span>gpt</span></a> <a href=\"https://sigmoid.social/tags/AI\" class=\"mention hashtag\" rel=\"tag\">#<span>AI</span></a> <a href=\"https://sigmoid.social/tags/nlp\" class=\"mention hashtag\" rel=\"tag\">#<span>nlp</span></a> <a href=\"https://sigmoid.social/tags/historyofscience\" class=\"mention hashtag\" rel=\"tag\">#<span>historyofscience</span></a> <span class=\"h-card\" translate=\"no\"><a href=\"https://wisskomm.social/@fiz_karlsruhe\" class=\"u-url mention\">@<span>fiz_karlsruhe</span></a></span> <span class=\"h-card\" translate=\"no\"><a href=\"https://sigmoid.social/@fizise\" class=\"u-url mention\">@<span>fizise</span></a></span> <span class=\"h-card\" translate=\"no\"><a href=\"https://fedihum.org/@tabea\" class=\"u-url mention\">@<span>tabea</span></a></span> <span class=\"h-card\" translate=\"no\"><a href=\"https://sigmoid.social/@enorouzi\" class=\"u-url mention\">@<span>enorouzi</span></a></span> <span class=\"h-card\" translate=\"no\"><a href=\"https://fedihum.org/@sourisnumerique\" class=\"u-url mention\">@<span>sourisnumerique</span></a></span> <a href=\"https://sigmoid.social/tags/ise2025\" class=\"mention hashtag\" rel=\"tag\">#<span>ise2025</span></a></p>", "contentMap": { "en": "<p>Last leg on our brief history of NLP (so far) is the advent of large language models with GPT-3 in 2020 and the introduction of learning from the prompt (aka few-shot learning).</p><p>T. B. Brown et al. (2020). 
Language models are few-shot learners. NIPS&#39;20</p><p><a href=\"https://proceedings.neurips.cc/paper/2020/file/1457c0d6bfcb4967418bfb8ac142f64a-Paper.pdf\" target=\"_blank\" rel=\"nofollow noopener noreferrer\" translate=\"no\"><span class=\"invisible\">https://</span><span class=\"ellipsis\">proceedings.neurips.cc/paper/2</span><span class=\"invisible\">020/file/1457c0d6bfcb4967418bfb8ac142f64a-Paper.pdf</span></a></p><p><a href=\"https://sigmoid.social/tags/llms\" class=\"mention hashtag\" rel=\"tag\">#<span>llms</span></a> <a href=\"https://sigmoid.social/tags/gpt\" class=\"mention hashtag\" rel=\"tag\">#<span>gpt</span></a> <a href=\"https://sigmoid.social/tags/AI\" class=\"mention hashtag\" rel=\"tag\">#<span>AI</span></a> <a href=\"https://sigmoid.social/tags/nlp\" class=\"mention hashtag\" rel=\"tag\">#<span>nlp</span></a> <a href=\"https://sigmoid.social/tags/historyofscience\" class=\"mention hashtag\" rel=\"tag\">#<span>historyofscience</span></a> <span class=\"h-card\" translate=\"no\"><a href=\"https://wisskomm.social/@fiz_karlsruhe\" class=\"u-url mention\">@<span>fiz_karlsruhe</span></a></span> <span class=\"h-card\" translate=\"no\"><a href=\"https://sigmoid.social/@fizise\" class=\"u-url mention\">@<span>fizise</span></a></span> <span class=\"h-card\" translate=\"no\"><a href=\"https://fedihum.org/@tabea\" class=\"u-url mention\">@<span>tabea</span></a></span> <span class=\"h-card\" translate=\"no\"><a href=\"https://sigmoid.social/@enorouzi\" class=\"u-url mention\">@<span>enorouzi</span></a></span> <span class=\"h-card\" translate=\"no\"><a href=\"https://fedihum.org/@sourisnumerique\" class=\"u-url mention\">@<span>sourisnumerique</span></a></span> <a href=\"https://sigmoid.social/tags/ise2025\" class=\"mention hashtag\" rel=\"tag\">#<span>ise2025</span></a></p>" }, "attachment": [ { "type": "Document", "mediaType": "image/png", "url": "https://cdn.masto.host/sigmoidsocial/media_attachments/files/114/493/935/377/505/033/original/6c8d6b84ba761bd8.png", "name": "Slide from Information System Engineering 2025 lecture, 02 - Natural Language Processing 01, A brief history of NLP, NLP Timeline.\nThe NLP timeline is in the middle of the page from top to bottom. The marker is at 2020. On the left side, an original screenshot of GPT-3 is shown, giving advise on how to present a talk about \"Symbolic and Subsymbolic AI - An Epic Dilemma?\".\nThe right side holds the following text: \n2020: GPT-3 was released by OpenAI, based on 45TB data crawled from the web. A “data quality” predictor was trained to boil down the training data to 550GB “high quality” data. Learning from the prompt is introduced (few-shot learning)\n\nBibliographical Reference:\nT. B. Brown et al. (2020). Language models are few-shot learners. In Proceedings of the 34th Int. Conf. on Neural Information Processing Systems (NIPS'20). 
Curran Associates Inc., Red Hook, NY, USA, Article 159, 1877–1901.\n", "blurhash": "UGRW6v4=Io%1-:xZRikC~T-Pj=S5o#NHIoof", "focalPoint": [ 0, 0 ], "width": 2612, "height": 1474 } ], "tag": [ { "type": "Mention", "href": "https://wisskomm.social/users/fiz_karlsruhe", "name": "@fiz_karlsruhe@wisskomm.social" }, { "type": "Mention", "href": "https://sigmoid.social/users/fizise", "name": "@fizise" }, { "type": "Mention", "href": "https://fedihum.org/users/tabea", "name": "@tabea@fedihum.org" }, { "type": "Mention", "href": "https://sigmoid.social/users/enorouzi", "name": "@enorouzi" }, { "type": "Mention", "href": "https://fedihum.org/users/sourisnumerique", "name": "@sourisnumerique@fedihum.org" }, { "type": "Hashtag", "href": "https://sigmoid.social/tags/LLMs", "name": "#LLMs" }, { "type": "Hashtag", "href": "https://sigmoid.social/tags/gpt", "name": "#gpt" }, { "type": "Hashtag", "href": "https://sigmoid.social/tags/ai", "name": "#ai" }, { "type": "Hashtag", "href": "https://sigmoid.social/tags/nlp", "name": "#nlp" }, { "type": "Hashtag", "href": "https://sigmoid.social/tags/historyofscience", "name": "#historyofscience" }, { "type": "Hashtag", "href": "https://sigmoid.social/tags/ise2025", "name": "#ise2025" } ], "replies": { "id": "https://sigmoid.social/users/lysander07/statuses/114493958022865311/replies", "type": "Collection", "first": { "type": "CollectionPage", "next": "https://sigmoid.social/users/lysander07/statuses/114493958022865311/replies?only_other_accounts=true&page=true", "partOf": "https://sigmoid.social/users/lysander07/statuses/114493958022865311/replies", "items": [] } }, "likes": { "id": "https://sigmoid.social/users/lysander07/statuses/114493958022865311/likes", "type": "Collection", "totalItems": 3 }, "shares": { "id": "https://sigmoid.social/users/lysander07/statuses/114493958022865311/shares", "type": "Collection", "totalItems": 2 } }
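Once fetched, the object above is easy to walk with ordinary JSON handling. A small follow-on sketch (reusing the note dictionary from the request snippet earlier) that pulls out the mentions, hashtags, and image attachments:

# Mentions and hashtags both live in the "tag" array, distinguished by "type".
mentions = [t["name"] for t in note.get("tag", []) if t.get("type") == "Mention"]
hashtags = [t["name"] for t in note.get("tag", []) if t.get("type") == "Hashtag"]

# Attachments carry the media URL plus the alt text in their "name" field.
images = [(a["url"], a.get("name", "")) for a in note.get("attachment", [])
          if a.get("mediaType", "").startswith("image/")]

print("Mentions:", mentions)   # ['@fiz_karlsruhe@wisskomm.social', '@fizise', ...]
print("Hashtags:", hashtags)   # ['#LLMs', '#gpt', '#ai', '#nlp', ...]
print("Images:", [url for url, _ in images])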