I don’t think I’ve ever made a “clean upgrade” on Linux. I’ve done the opposite though, that is, bring an old install over to a new computer.
Always use /dev/disk/* (I use by-id) for RAID, as those links will stay constant even if a disk is renamed (for example, from sdb to sdd).
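If you want to see what those stable links currently point at, a quick Python sketch like this (assuming a normal udev setup, so /dev/disk/by-id exists) will print the mapping:

```python
# Quick sketch (Linux with udev assumed): map the stable /dev/disk/by-id
# symlinks to whatever kernel name (sda, sdb, ...) they happen to have right now.
import os

BY_ID = "/dev/disk/by-id"

for entry in sorted(os.listdir(BY_ID)):
    link = os.path.join(BY_ID, entry)
    target = os.path.realpath(link)  # e.g. /dev/sdb or /dev/nvme0n1
    print(f"{link} -> {target}")
```

The by-id paths are what you'd then hand to your RAID tooling instead of the bare sdX names.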
Not obvious at all. Motion blur at high movement speeds makes things unreadable even at 540 Hz, which shows the human eye can still see plenty of motion blur at that refresh rate.
https://www.youtube.com/watch?v=OV7EMnkTsYA&t=682s
As the video says: “Yes, your eyes really are capable of seeing this in real life”
I literally haven’t had ANY of those problems running Windows 10 or 11 FWIW, nor have any of my friends or relatives.
I’m not anti-Linux or anything though, have used it for 26 years now, but only briefly on the desktop.
Ubuntu is just getting worse and worse. I was pretty happy running Ubuntu server for years after moving from Gentoo; I lost interest in spending time taking care of that server and wanted something easy.
I went to Debian half a year ago and it’s been great. Should’ve done it earlier.
“climate change and other left wing topics”… I know that’s basically how it works in some countries, but it’s insane to consider certain scientific facts left wing, and we really shouldn’t support such statements.
At the International Roguelike Development Conference 2008 held in Berlin, Germany, players and developers established a definition for roguelikes known as the “Berlin Interpretation”.
These guys have extremely strict definitions, which mean that most “roguelike” games are in fact roguelites, if you care about what they think.
There are nine “high value” factors that are more or less a requirement:
Random Environment Generation
Permadeath
Turn-Based
Grid-Based
Non-Modal
Complexity
Resource Management
‘Hack-n-Slash’
Exploration and Discovery
Plus six “low value” factors that are less important:
Single Player Character
Monsters are Similar to Players
Tactical Challenge
ASCII Display
Dungeons
Numbers
There is, as you might expect, a fair bit of controversy about that though.
ZFS is really nice. I started experimenting with it when it was being introduced to FreeBSD, around 2007-2008, but only truly started using it last year, for two NASes (on Linux).
It’s complex for a filesystem, but considering all it can do, that’s not surprising.
I’ve basically only been playing Noita since I started maybe 6 weeks ago. Harsh and unforgiving, but it gets better the more you learn.
I highly recommend watching others play to learn, and reading up on the wiki (noita.wiki.gg; the Fandom wiki has been abandoned by the community). There is SO much that is basically impossible to figure out on your own, but it’s so much fun. It’s also a much bigger game than you might think if you just jump in and play. Even 134 hours in I still have quite a few things I’ve never done.
Helpful yes, but far from enough. It only helps in some scenarios (like accidental deletes, malware), but not in many others (filesystem corruption, multiple disks dying at once due to e.g. lightning, a bad PSU or a fire).
Offsite backup is a must for data you want to keep.
That’s in bytes. A modern NVMe drive can do about 7 GB/s (more than 10 for PCIe 5.0 drives). Even SATA could handle 5 Gbit/s, though barely.
Sorry for the nitpick, but you probably mean GB/s (or GiB/s, but I won’t go there). Gbps is gigabits per second, not gigabytes per second.
Since both units show up in these contexts and they differ by about a factor of 8, it’s useful not to confuse the two.
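To make the factor-of-8 point concrete, a rough sketch (decimal units, ignoring protocol overhead; the example figures are just illustrative):

```python
# 1 byte = 8 bits, so a speed quoted in GB/s is 8x the same number in Gbit/s.
def gbytes_to_gbits(gb_per_s: float) -> float:
    return gb_per_s * 8

print(gbytes_to_gbits(7.0))  # ~56 Gbit/s for a fast PCIe 4.0 NVMe drive
print(gbytes_to_gbits(0.6))  # ~4.8 Gbit/s, roughly SATA III's usable throughput
```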
They’re still working on a game they say is far bigger than BG3, though.
Prior to development on Baldur’s Gate 3, Larian CEO Swen Vincke was already planning out the company’s future, and this included what he calls “the very big RPG that will dwarf them all.”
Speaking to GameSpot at GDC, Vincke explained that Larian’s next game also won’t actually be the aforementioned “very big RPG,” but will be another step toward realizing it.
It’s always possible to re-encode video; it’s usually called transcoding. However, you lose a bit of quality every time you encode, so you might not gain much in the end. You can offset a bit of the quality loss by encoding at a higher bitrate/quality factor/etc than you otherwise would, but that of course takes up extra space.
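For example, something along these lines (assuming ffmpeg is installed; the filenames and the CRF/preset values are just placeholders, not recommendations) re-encodes the video stream while copying the audio untouched:

```python
# Minimal transcode sketch: re-encode video to H.265, copy audio as-is.
import subprocess

def transcode(src: str, dst: str, crf: int = 20) -> None:
    subprocess.run(
        [
            "ffmpeg",
            "-i", src,            # input file
            "-c:v", "libx265",    # re-encode video (this is the lossy step)
            "-crf", str(crf),     # lower CRF = higher quality = bigger file
            "-preset", "medium",  # encoder speed/efficiency trade-off
            "-c:a", "copy",       # audio is copied, so no extra loss there
            dst,
        ],
        check=True,
    )

transcode("input.mkv", "output.mkv")
```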
This again? It’s utter bullcrap I’m afraid.
20 feet is fine unless you want 4K 120 Hz and stuff like that. In which case 20 feet may still be fine with a passive cable, but it’s on the edge of where an AOC (active optical cable) starts to make sense.
For 1080p and 4K30, I think 10 meters can work passively.
Edit: My in-head unit conversion was a bit off, 20 feet is probably a bit over what’s sensible for 4K120. But it’s probably fine for non-UHS HDMI.
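For a rough sense of why 4K120 is the tipping point, a back-of-the-envelope sketch (uncompressed 8-bit RGB, ignoring blanking intervals and link overhead, so the real requirements are somewhat higher):

```python
# Raw pixel bandwidth: width * height * refresh rate * bits per pixel.
def raw_gbit_per_s(width: int, height: int, hz: int, bits_per_pixel: int = 24) -> float:
    return width * height * hz * bits_per_pixel / 1e9

print(raw_gbit_per_s(3840, 2160, 120))  # ~23.9 Gbit/s: already past HDMI 2.0's 18 Gbit/s, needs a 2.1 (48G) link
print(raw_gbit_per_s(3840, 2160, 30))   # ~6.0 Gbit/s: easily within older HDMI
print(raw_gbit_per_s(1920, 1080, 60))   # ~3.0 Gbit/s
```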
Are there any cases of such a payout actually happening…? I’m not buying it. (Literally and figuratively.)