

The Steam Deck uses the capacitive thumb stick sensors to completely disable the trackpads as soon as the stick above the respective pad is touched. This works very well, so I think they'll implement the same thing here.


This is such a click-bait comment, my god…
Your source for stability issues in Hogwarts Legacy is a single user in the Steam community, with other users in the same thread reporting no issues at all. Seeing that Hogwarts Legacy is one of the most played games on Deck (ranked 11th at the time of writing), I think many more people would be reporting issues if crashes were common.
Furthermore, your “TONNE” of optional game stores amounts to exactly one. I can't really think of a game store, besides Microsoft's, that doesn't work on the Steam Deck.
These early performance comparisons definitely have limited value for comparing Windows/Linux performance on the device. But I’m sorry to say that your arguments have even less.


I don’t think this is quite right. CoD titles do take a long time to develop. They're just rotating studios so they can achieve a yearly release cadence (the last six entries in the series had five different studios working on them). Also, they are by no means getting cheaper. According to court documents, development costs rose from $450 million to over $700 million between 2015 and 2020 alone.


Germany actually does that! Quite a few government bodies are already active at https://social.bund.de/. Maybe there's hope that other countries will follow.


I'd probably go with a VPS. It probably won't cost more than $10/month, maybe even less, depending on how heavily your Nextcloud instance will be used. And you won't have to worry about keeping your own hardware and network running, which pretty much always takes up more time than expected.
Some web hosts (I've had very good experiences with Hetzner) charge an hourly rate and let you preconfigure VPSes with software like Nextcloud. So unless you have specific needs, you could just spin up an instance, check whether it suits you and, if it doesn't, only pay a few cents.
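If you'd rather try Nextcloud out before renting anything, you can also test-drive it locally first. This is just a minimal sketch of a Docker Compose file for a throwaway instance (assumes Docker is installed; the image tags and the placeholder passwords are my own choices, not anything host-specific):

```yaml
# Throwaway local Nextcloud instance for evaluation purposes.
services:
  db:
    image: mariadb:11
    environment:
      MYSQL_ROOT_PASSWORD: change-me   # placeholder, change before real use
      MYSQL_DATABASE: nextcloud
      MYSQL_USER: nextcloud
      MYSQL_PASSWORD: change-me        # placeholder
    volumes:
      - db:/var/lib/mysql
  app:
    image: nextcloud:stable
    depends_on:
      - db
    ports:
      - "8080:80"                      # Nextcloud reachable on localhost:8080
    environment:
      MYSQL_HOST: db
      MYSQL_DATABASE: nextcloud
      MYSQL_USER: nextcloud
      MYSQL_PASSWORD: change-me        # must match the db service above
    volumes:
      - nextcloud:/var/www/html
volumes:
  db:
  nextcloud:
```

`docker compose up -d`, then open http://localhost:8080 to run the setup wizard; `docker compose down -v` throws the whole thing away again when you're done evaluating.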


And that arrogant “I understand it, why don’t you?!” attitude is exactly what’s so often the main issue in the design process of open source software.
I’d recommend watching this recent talk by Tantacrul, the design lead for MuseScore and Audacity. In it, he shows some videos of first-time user tests he conducted for Inkscape recently. It’s really fascinating to see how users fail to do what they want because of confusing UX choices. And often it isn’t even that hard to fix. But open source image editors are just full of these little annoyances by now, which really smell like the result of inadequate user testing. And no professional would prefer to work all day with software full of little annoyances when there are alternatives.
I mean, just try adding text in Krita, for example. There’s a giant pop-up where you have to format your text without actually seeing it on your image. That’s just clunky and far more time-consuming than a WYSIWYG approach would be.


This isn’t Adobe.
And as much as I want to like Krita, GIMP and such, their workflows just can’t compare with proprietary software in many cases. Also, especially for photo editing, their feature sets can’t compare with Adobe’s or Affinity’s either.
I use Krita, GIMP and Affinity Photo pretty regularly, and while there have been great improvements to the open source alternatives recently, I just get stuff done with Affinity, while still having to constantly search the web for things Krita and GIMP hide somewhere deep within their menus.
All open source image editors I’ve used are in dire need of a complete UX rework (like Blender and MuseScore successfully did) before they can be more than niche alternatives to proprietary software.
So, as of yet, I can definitely understand the wish for a feature-rich and easily usable image editing suite on Linux.


I think Valve’s Pierre-Loup Griffais explained their plans for a Steam Deck 2 pretty well in this interview (starting at 8:36).
Paraphrasing: They are happy to work with other companies, but the people at Valve also have their own ideas and goals for hardware. And they want to be able to set the bar for these ideas themselves. That's why they're working on a Steam Deck 2.
And when you look at how well that bar-setting worked out with the original Deck, I'm really glad they want to follow up on it.
I own a GPD Win 2, a handheld PC from a few years before the Deck was a thing. That device couldn't be charged while in use, it had its speakers wired the wrong way, and it constantly overheated, which made it a pain to use. Ever since the Deck came out, the whole handheld PC market, including GPD, has improved its device quality by a country mile.
And that's one of the best things about the Deck, in my opinion, and it will hopefully be one of the best things about the Deck 2 as well.


Yup. It's from the Cave Johnson Announcer Pack reveal video. Which is definitely worth a watch, even if you're not into Dota 2.


I think I've stumbled across this at some point, but it seems to have been updated since.
Thanks for finding it, I'll keep it in mind when I get around to trying VR on Linux again in the future.


I’ve been doing all of my PC gaming on Linux for years now. Except for VR. That unfortunately isn’t running well at all for me. I’m running an Nvidia GPU with a Valve Index, and whenever I even managed to get a picture on the HMD in the first place, the latency from movement to screen was about a second or so. Which is an incredibly efficient way to feel incredibly sick.
I’m not sure about your setup, maybe it’s better supported in some way, but, from my experience, I’d unfortunately recommend keeping a Windows partition for VR and saving yourself the (quite literal) headache.


Although I’d love to see that happen more frequently, this is simply not realistically doable for most commercial games.
Almost all of them use licensed third-party libraries which are integrated deeply into the game’s code base, but which can’t legally be distributed as part of an open source project. So in order to be able to open source a modern commercial game, you’d have to put in quite a lot of work finding all of your code integrating with commercial libraries and either replacing or removing it. And if that’s not enough, you’d probably have to have your (expensive) legal team check the entire code base for any infringements just to be on the safe side.
All that work for no monetary gain just isn’t a very good business case. So, unfortunately, I wouldn’t expect a lot of modern games to be open sourced any time soon.


I'd be really surprised if Apple tried that.
They have to know that it violates the DMA. And the penalty for violating it can be up to 10% of their yearly worldwide revenue (not earnings!) for the first violation and up to 20% for repeated violations. I don't think they'd risk that, especially as the EU really isn't known for its leniency when someone intentionally breaks its rules.


To be fair, everyone was offered a refund for that game. So, technically, most people probably didn't end up paying for it at all.
I still totally agree that Sony shouldn't go after private Concord servers. This game is very interesting, because it was an unbelievable failure despite having pretty solid gameplay. And preserving it on private servers gives other developers a great way to study, and maybe avoid, the many other issues that led to the game's failure.