ActivityPub Viewer

A small tool to view real-world ActivityPub objects as JSON! Enter a URL or username from Mastodon or a similar service, and we'll send a request to the server with the right Accept header (application/activity+json) to retrieve the underlying object.
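
For illustration, the request works roughly like the sketch below. This is a minimal example in Python using the requests library, not this viewer's actual implementation; the Accept header value is the one ActivityPub servers expect for content negotiation, and the example URL is the object shown further down.

    # Minimal sketch of fetching an ActivityPub object via content negotiation.
    # Assumption: the target server honours the ActivityStreams Accept header.
    import requests

    ACCEPT = 'application/ld+json; profile="https://www.w3.org/ns/activitystreams"'

    def fetch_object(url: str) -> dict:
        """Fetch an ActivityPub object as JSON instead of the HTML page."""
        response = requests.get(url, headers={"Accept": ACCEPT}, timeout=10)
        response.raise_for_status()
        return response.json()

    if __name__ == "__main__":
        obj = fetch_object("https://beehaw.org/post/19553780")
        print(obj["type"], obj["id"])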

{ "@context": [ "https://www.w3.org/ns/activitystreams", "https://w3id.org/security/v1", { "lemmy": "https://join-lemmy.org/ns#", "litepub": "http://litepub.social/ns#", "pt": "https://joinpeertube.org/ns#", "sc": "http://schema.org/", "ChatMessage": "litepub:ChatMessage", "commentsEnabled": "pt:commentsEnabled", "sensitive": "as:sensitive", "matrixUserId": "lemmy:matrixUserId", "postingRestrictedToMods": "lemmy:postingRestrictedToMods", "removeData": "lemmy:removeData", "stickied": "lemmy:stickied", "moderators": { "@type": "@id", "@id": "lemmy:moderators" }, "expires": "as:endTime", "distinguished": "lemmy:distinguished", "language": "sc:inLanguage", "identifier": "sc:identifier" } ], "type": "Page", "id": "https://beehaw.org/post/19553780", "attributedTo": "https://beehaw.org/u/arsCynic", "to": [ "https://lemmy.ml/c/linux_gaming", "https://www.w3.org/ns/activitystreams#Public" ], "name": "Force GPU / integrated graphics on Steam games in Linux [games might unknowingly perform badly].", "cc": [], "content": "<p>Your games on <em>Linux</em> might unknowingly run slower than they need to. My games had inconsistent FPS between different sessions—smooth 144 FPS between stuttering ± 120 FPS—and I couldn’t figure out why because all the logs indicated it was running on my dedicated GPU.</p>\n<p>First I assumed it was a caching problem due to running distributed computing projects 24/7 [finding primes, not crypto&quot;currencies&quot;, don’t worry], but it turns out it was in fact arbitrarily using the desktop CPU’s integrated graphics instead of the GPU.</p>\n<p>Via the <em>Steam</em> game <strong>launch options</strong> command below <a href=\"https://wiki.archlinux.org/title/PRIME#Configure_applications_to_render_using_GPU\">one can force the use of one’s dedicated GPU</a>—0 or 1 depending on one’s PC.</p>\n<p><code>__NV_PRIME_RENDER_OFFLOAD=0 __GLX_VENDOR_LIBRARY_NAME=nvidia %command%</code></p>\n", "mediaType": "text/html", "source": { "content": "Your games on *Linux* might unknowingly run slower than they need to. My games had inconsistent FPS between different sessions—smooth 144 FPS between stuttering ± 120 FPS—and I couldn't figure out why because all the logs indicated it was running on my dedicated GPU. \n\nFirst I assumed it was a caching problem due to running distributed computing projects 24/7 [finding primes, not crypto\"currencies\", don't worry], but it turns out it was in fact arbitrarily using the desktop CPU's integrated graphics instead of the GPU. \n\nVia the *Steam* game **launch options** command below [one can force the use of one's dedicated GPU](https://wiki.archlinux.org/title/PRIME#Configure_applications_to_render_using_GPU)—0 or 1 depending on one's PC.\n\n`__NV_PRIME_RENDER_OFFLOAD=0 __GLX_VENDOR_LIBRARY_NAME=nvidia %command%`", "mediaType": "text/markdown" }, "attachment": [ { "href": "https://wiki.archlinux.org/title/PRIME#Configure_applications_to_render_using_GPU", "type": "Link" } ], "commentsEnabled": true, "sensitive": false, "published": "2025-04-20T20:16:09.935680+00:00", "updated": "2025-04-21T05:43:17.864065+00:00", "language": { "identifier": "en", "name": "English" }, "audience": "https://lemmy.ml/c/linux_gaming" }