

- source code




403GB is the compressed size of all packages for all architectures.


Which you can’t even do because of conflicts/architecture availability.
fake and gay hopefully true and trans (and based af)


For your personal use, you probably shouldn’t get an “AI” GPU. If you start needing a terabyte of VRAM and heat, space, and energy become real problems, reconsider. Even if you need more VRAM than the best gaming GPU offers (i.e. the 5090…), it will be cheaper to get multiple gaming GPUs and use them as a cluster (you will take a performance hit though).
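As a rough illustration of that trade-off, here’s a back-of-envelope sketch. All VRAM sizes and prices below are made-up placeholders, not real market data; plug in actual specs and prices yourself:

```python
import math

# Hypothetical figures for illustration only.
def cards_needed(model_vram_gb: float, card_vram_gb: float) -> int:
    """How many cards it takes to hold a model with the given VRAM footprint."""
    return math.ceil(model_vram_gb / card_vram_gb)

def cluster_cost(model_vram_gb: float, card_vram_gb: float, card_price: float) -> float:
    """Total price of enough gaming GPUs to cover the required VRAM."""
    return cards_needed(model_vram_gb, card_vram_gb) * card_price

# Example: a model needing 96 GB, on hypothetical 32 GB cards at $2000 each,
# fits on 3 cards for $6000, versus a single large "AI" GPU that can cost far more,
# at the price of slower cross-card communication.
print(cards_needed(96, 32), cluster_cost(96, 32, 2000))
```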
You obviously also need an account for everything. This requirement is only communicated at checkout.
Know the struggle, just keep trying local stores or other sites first, maybe we can be a small part of change for the better ;)
From their blog:
As you probably know, Session is decentralised. There are currently over 2,000 servers around the world working together to deliver your messages. Each one of these servers is an Oxen Service Node, and together they make up the Oxen Service Node Network. These are specialised servers that stake OXEN cryptocurrency to register on the network; and nodes receive OXEN rewards for performing particular services like routing Session messages.
(They are now switching from OXEN/the Oxen network to their own to be able to better control the features of their hosting network)
Why do you think so? We still don’t have proper support for the Fairphone 4 on pmOS, why’d the 6 be any better?


https://pine64.org/devices/pinetime/ 🤣 (rare case of a valid usecase for this emoji)


This fits into Google tying Android closer to itself, same as the recent move of only making the source of releases publicly available.
They’re regretting having started Android as an “open” platform and want to regain control fast, maybe in preparation for the antitrust trials they are currently facing in the US.


It’s so crazy (technically understandable, but still crazy) to me that reliably receiving calls is still such a major issue


Now more than ever we need more work on PostmarketOS, Mobian, Gnome Mobile etc…
Bummer that it’s still so hard to find a somewhat modern, affordable phone that is Linux compatible


Summaries for complex Wikipedia articles would be great, especially for people less knowledgeable of the given topic, but I don’t see why those would have to be AI-generated.


I agree. This feature being enabled by default, so that people tech-literate enough can just turn it off, would be great for several people I know, just not from Google.
I don’t think it’s odd, because LLMs are just way faster than any junior (or senior) dev. So it’s more like working with four junior devs, but with the benefit of having tasks done sequentially without the additional overhead of having to give tasks to individual juniors and context switching to review their changes.
(Obviously, there are a whole lot of new pitfalls, but there are real benefits in some circumstances)


“Arguing that you don’t care about the right to privacy because you have nothing to hide is no different than saying you don’t care about free speech because you have nothing to say”
Snowden