A small tool to view real-world ActivityPub objects as JSON! Enter a URL or username from Mastodon or a similar service below, and we'll send a request with the right Accept header to the server to view the underlying object.
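Under the hood the lookup has two parts: a handle like @user@example.com is first resolved to an actor URL via WebFinger, and an object URL is then fetched with an Accept header asking for ActivityPub JSON. The sketch below shows that flow in Python; the helper names are illustrative, and it assumes the object is publicly fetchable without an HTTP-signed request.

```python
import json
import urllib.parse
import urllib.request

ACCEPT = "application/activity+json"

def resolve_handle(handle: str) -> str:
    """Resolve a @user@host handle to an actor URL via WebFinger (RFC 7033)."""
    user, host = handle.lstrip("@").split("@", 1)
    wf_url = (
        f"https://{host}/.well-known/webfinger?"
        + urllib.parse.urlencode({"resource": f"acct:{user}@{host}"})
    )
    with urllib.request.urlopen(wf_url) as resp:
        jrd = json.load(resp)
    # Pick the "self" link that points at the ActivityPub representation.
    for link in jrd.get("links", []):
        if link.get("rel") == "self" and "activity+json" in link.get("type", ""):
            return link["href"]
    raise ValueError("no ActivityPub actor link found")

def fetch_object(url: str) -> dict:
    """Fetch an ActivityPub object, explicitly asking the server for JSON."""
    req = urllib.request.Request(url, headers={"Accept": ACCEPT})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# The status URL from the example object shown below.
note = fetch_object("https://k2pk.com/users/b166ir/statuses/114147559597576670")
print(note["type"])          # "Note"
print(note["attributedTo"])  # "https://k2pk.com/users/b166ir"
```

The JSON returned for that example status looks like this: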
{
"@context": [
"https://www.w3.org/ns/activitystreams",
{
"ostatus": "http://ostatus.org#",
"atomUri": "ostatus:atomUri",
"inReplyToAtomUri": "ostatus:inReplyToAtomUri",
"conversation": "ostatus:conversation",
"sensitive": "as:sensitive",
"toot": "http://joinmastodon.org/ns#",
"votersCount": "toot:votersCount",
"Hashtag": "as:Hashtag"
}
],
"id": "https://k2pk.com/users/b166ir/statuses/114147559597576670",
"type": "Note",
"summary": null,
"inReplyTo": null,
"published": "2025-03-12T04:25:33Z",
"url": "https://k2pk.com/@b166ir/114147559597576670",
"attributedTo": "https://k2pk.com/users/b166ir",
"to": [
"https://www.w3.org/ns/activitystreams#Public"
],
"cc": [
"https://k2pk.com/users/b166ir/followers"
],
"sensitive": false,
"atomUri": "https://k2pk.com/users/b166ir/statuses/114147559597576670",
"inReplyToAtomUri": null,
"conversation": "tag:k2pk.com,2025-03-12:objectId=134374:objectType=Conversation",
"content": "<p><a href=\"https://youtu.be/J4qwuCXyAcU\" target=\"_blank\" rel=\"nofollow noopener noreferrer\" translate=\"no\"><span class=\"invisible\">https://</span><span class=\"\">youtu.be/J4qwuCXyAcU</span><span class=\"invisible\"></span></a></p><p>In this video, Ollama vs. LM Studio (GGUF), showing that their performance is quite similar, with LM Studio’s tok/sec output used for consistent benchmarking.</p><p>What’s even more impressive? The Mac Studio M3 Ultra pulls under 200W during inference with the Q4 671B R1 model. That’s quite amazing for such performance!</p><p><a href=\"https://k2pk.com/tags/LLMs\" class=\"mention hashtag\" rel=\"tag\">#<span>LLMs</span></a> <a href=\"https://k2pk.com/tags/AI\" class=\"mention hashtag\" rel=\"tag\">#<span>AI</span></a> <a href=\"https://k2pk.com/tags/MachineLearning\" class=\"mention hashtag\" rel=\"tag\">#<span>MachineLearning</span></a> <a href=\"https://k2pk.com/tags/Ollama\" class=\"mention hashtag\" rel=\"tag\">#<span>Ollama</span></a> <a href=\"https://k2pk.com/tags/LMStudio\" class=\"mention hashtag\" rel=\"tag\">#<span>LMStudio</span></a> <a href=\"https://k2pk.com/tags/GGUF\" class=\"mention hashtag\" rel=\"tag\">#<span>GGUF</span></a> <a href=\"https://k2pk.com/tags/MLX\" class=\"mention hashtag\" rel=\"tag\">#<span>MLX</span></a> <a href=\"https://k2pk.com/tags/TechReview\" class=\"mention hashtag\" rel=\"tag\">#<span>TechReview</span></a> <a href=\"https://k2pk.com/tags/Benchmarking\" class=\"mention hashtag\" rel=\"tag\">#<span>Benchmarking</span></a> <a href=\"https://k2pk.com/tags/MacStudio\" class=\"mention hashtag\" rel=\"tag\">#<span>MacStudio</span></a> <a href=\"https://k2pk.com/tags/M3Ultra\" class=\"mention hashtag\" rel=\"tag\">#<span>M3Ultra</span></a> <a href=\"https://k2pk.com/tags/LocalLLM\" class=\"mention hashtag\" rel=\"tag\">#<span>LocalLLM</span></a> <a href=\"https://k2pk.com/tags/AIbenchmarks\" class=\"mention hashtag\" rel=\"tag\">#<span>AIbenchmarks</span></a> <a href=\"https://k2pk.com/tags/EnergyEfficient\" class=\"mention hashtag\" rel=\"tag\">#<span>EnergyEfficient</span></a> <a href=\"https://k2pk.com/tags/linux\" class=\"mention hashtag\" rel=\"tag\">#<span>linux</span></a></p>",
"contentMap": {
"en": "<p><a href=\"https://youtu.be/J4qwuCXyAcU\" target=\"_blank\" rel=\"nofollow noopener noreferrer\" translate=\"no\"><span class=\"invisible\">https://</span><span class=\"\">youtu.be/J4qwuCXyAcU</span><span class=\"invisible\"></span></a></p><p>In this video, Ollama vs. LM Studio (GGUF), showing that their performance is quite similar, with LM Studio’s tok/sec output used for consistent benchmarking.</p><p>What’s even more impressive? The Mac Studio M3 Ultra pulls under 200W during inference with the Q4 671B R1 model. That’s quite amazing for such performance!</p><p><a href=\"https://k2pk.com/tags/LLMs\" class=\"mention hashtag\" rel=\"tag\">#<span>LLMs</span></a> <a href=\"https://k2pk.com/tags/AI\" class=\"mention hashtag\" rel=\"tag\">#<span>AI</span></a> <a href=\"https://k2pk.com/tags/MachineLearning\" class=\"mention hashtag\" rel=\"tag\">#<span>MachineLearning</span></a> <a href=\"https://k2pk.com/tags/Ollama\" class=\"mention hashtag\" rel=\"tag\">#<span>Ollama</span></a> <a href=\"https://k2pk.com/tags/LMStudio\" class=\"mention hashtag\" rel=\"tag\">#<span>LMStudio</span></a> <a href=\"https://k2pk.com/tags/GGUF\" class=\"mention hashtag\" rel=\"tag\">#<span>GGUF</span></a> <a href=\"https://k2pk.com/tags/MLX\" class=\"mention hashtag\" rel=\"tag\">#<span>MLX</span></a> <a href=\"https://k2pk.com/tags/TechReview\" class=\"mention hashtag\" rel=\"tag\">#<span>TechReview</span></a> <a href=\"https://k2pk.com/tags/Benchmarking\" class=\"mention hashtag\" rel=\"tag\">#<span>Benchmarking</span></a> <a href=\"https://k2pk.com/tags/MacStudio\" class=\"mention hashtag\" rel=\"tag\">#<span>MacStudio</span></a> <a href=\"https://k2pk.com/tags/M3Ultra\" class=\"mention hashtag\" rel=\"tag\">#<span>M3Ultra</span></a> <a href=\"https://k2pk.com/tags/LocalLLM\" class=\"mention hashtag\" rel=\"tag\">#<span>LocalLLM</span></a> <a href=\"https://k2pk.com/tags/AIbenchmarks\" class=\"mention hashtag\" rel=\"tag\">#<span>AIbenchmarks</span></a> <a href=\"https://k2pk.com/tags/EnergyEfficient\" class=\"mention hashtag\" rel=\"tag\">#<span>EnergyEfficient</span></a> <a href=\"https://k2pk.com/tags/linux\" class=\"mention hashtag\" rel=\"tag\">#<span>linux</span></a></p>"
},
"updated": "2025-03-12T04:27:47Z",
"attachment": [],
"tag": [
{
"type": "Hashtag",
"href": "https://k2pk.com/tags/llms",
"name": "#llms"
},
{
"type": "Hashtag",
"href": "https://k2pk.com/tags/ai",
"name": "#ai"
},
{
"type": "Hashtag",
"href": "https://k2pk.com/tags/machinelearning",
"name": "#machinelearning"
},
{
"type": "Hashtag",
"href": "https://k2pk.com/tags/ollama",
"name": "#ollama"
},
{
"type": "Hashtag",
"href": "https://k2pk.com/tags/lmstudio",
"name": "#lmstudio"
},
{
"type": "Hashtag",
"href": "https://k2pk.com/tags/gguf",
"name": "#gguf"
},
{
"type": "Hashtag",
"href": "https://k2pk.com/tags/mlx",
"name": "#mlx"
},
{
"type": "Hashtag",
"href": "https://k2pk.com/tags/techreview",
"name": "#techreview"
},
{
"type": "Hashtag",
"href": "https://k2pk.com/tags/benchmarking",
"name": "#benchmarking"
},
{
"type": "Hashtag",
"href": "https://k2pk.com/tags/macstudio",
"name": "#macstudio"
},
{
"type": "Hashtag",
"href": "https://k2pk.com/tags/m3ultra",
"name": "#m3ultra"
},
{
"type": "Hashtag",
"href": "https://k2pk.com/tags/localllm",
"name": "#localllm"
},
{
"type": "Hashtag",
"href": "https://k2pk.com/tags/aibenchmarks",
"name": "#aibenchmarks"
},
{
"type": "Hashtag",
"href": "https://k2pk.com/tags/energyefficient",
"name": "#energyefficient"
},
{
"type": "Hashtag",
"href": "https://k2pk.com/tags/linux",
"name": "#linux"
}
],
"replies": {
"id": "https://k2pk.com/users/b166ir/statuses/114147559597576670/replies",
"type": "Collection",
"first": {
"type": "CollectionPage",
"next": "https://k2pk.com/users/b166ir/statuses/114147559597576670/replies?only_other_accounts=true&page=true",
"partOf": "https://k2pk.com/users/b166ir/statuses/114147559597576670/replies",
"items": []
}
},
"likes": {
"id": "https://k2pk.com/users/b166ir/statuses/114147559597576670/likes",
"type": "Collection",
"totalItems": 0
},
"shares": {
"id": "https://k2pk.com/users/b166ir/statuses/114147559597576670/shares",
"type": "Collection",
"totalItems": 4
}
}