Yea same here. In all my years using a computer, I’ve never seen such a thing aside from “preinstalled” apps. Is this a regional thing?
Now if they would only release a Steam Controller 2…
Not Eastern European but I do remember these in Singapore about 20 years ago. Stores or roadside tables would open up with racks hung full of disks in plastic sleeves. Interesting times.
As a Mac user, for whom PDFs open in Preview - because they’re effectively an image format - I find it wild that, to this day, Windows defaults to opening them in a browser. Windows has an image viewer right there.
I don’t see the difference here. Opening PDFs in an image viewer is wild to me too, and I’ve used both Mac and Windows. For all the shit that people give Edge, it’s a pretty nice PDF viewer, and of all the browsers it’s the most fully featured one that I know of.
And is it that strange that it opens a link in a browser? That is the default application for handling URLs after all.
Android phone makers are shipping on device LLMs?
Do people actually want these?
Ads will probably stop me from watching YouTube completely. The huge surge of ads at some point was what stopped me from using Instagram.
I used to have a Pebble too but I’ve long since given up on any hope of the market building something similar that looks as cool as the Pebble was. What exactly do you think is awful about Samsung’s Wear OS? I tried both the Pixel Watch and the Galaxy Watch and I greatly prefer Samsung’s.
It depends. Wear OS is heavy because it’s much more feature-filled. I switched from a Garmin to a Galaxy Watch 4 because the feature set of Wear OS fits my use case much better than the Vivoactive 4 I had.
Laughs in airfox
Isn’t the multi-monitor support better in 11? It properly supports restoring windows into the correct monitor when you reconnect monitors.
It was pretty buggy though: in my class, people’s laptops got permanently locked into the browser, unable to close it after the exam. Sometimes it wouldn’t even let you start the exam after launching the browser until you restarted the whole system.
There aren’t really a lot of options out there. Can’t say I agree with Samsung’s policies, but their devices are pretty good compared to everyone else’s. As for iPhones, well, if you’d consider an iPhone then we wouldn’t be having this conversation. Chinese brands generally have very problematic software, Pixels are pretty barebones unless you’re into the AI stuff (Material 3 is also pretty ugly), and Sony is very expensive and fairly barebones too.
Well, for the current-generation consoles, both use x86-64 CPUs with a single pool of GDDR6 memory shared across the CPU and GPU, so I’m not sure that penalty still applies.
It’s not that unified memory can’t be created, but it’s not the architecture of a PC, where peripheral cards communicate over the PCI bus, with great penalties to touch RAM.
Are there any tests showing the difference in memory access of x86-64 CPUs with iGPUs compared to ARM chips?
Do you have any sources for this? I can’t seem to find anything specific describing the behaviour. It’s quite surprising to me since the Xbox and PS5 use unified memory on x86-64, and it would be strange if it were extremely slow for such a use case.
Thanks for the links, they’re really informative. That said, it doesn’t seem entirely certain that the extra work done by the x86 arch would incur a comparatively huge difference in energy consumption. Granted, that isn’t really the point of the article. I would love to hear from someone who’s more well versed in CPU design on the impact of its memory model. The paper is more interesting with regards to performance, but I don’t find it very conclusive since it’s comparing ARM vs TSO on an ARM processor. It does link this paper, which seems more relevant to our discussion, but it’s a shame that it’s paywalled.
Do x86 CPUs with iGPUs not already use unified memory? I’m not exactly sure what you mean but are you referring to the overhead of having to do data copying over from CPU to GPU memory on discrete graphics cards when performing GPU calculations?
Their primary money makers are what’s stopping them, I reckon. Apple’s move to ARM worked because they already had a ton of experience building their own in-house processors for their mobile devices. ARM also licenses stock chip designs, making it easier for other companies to come up with their own custom chips, whereas there really isn’t any equivalent for x86-64. There were some disagreements between Intel and AMD over patents on the x86 instruction set too.
Do you mind elaborating on what it is about their memory models that makes a difference?
I think you’re missing the point here. It’s more that people couldn’t even be bothered to search up how to do something (that takes seconds) that they want to do first, and instead just rely on someone they think is an expert without putting in any effort at all.
Your examples don’t really make sense either, as a lot of these are paid professions for larger tasks that most people simply don’t want to do. There’s a huge difference between searching online for “how to install a Firefox extension” and “how to do a weave”, etc.
End of the day, the average person doesn’t care and if they truly did they’d have the initiative to have just researched it and done it on their own.
Bringing it back to the whole thing about Linux, can you imagine how frustrating it would be to have to help debug a user’s Linux installation when they already need help installing a browser add-on? I work with tech and Linux on a daily basis and I already find it frustrating doing it for myself (fuck Nvidia drivers). No way am I gonna recommend it to someone else.