• 0 Posts
  • 194 Comments
Joined 3 years ago
Cake day: June 19th, 2023

  • If betting on Polymarket, you would actually have to stump up that money first, and the other person would have to do the same with whatever bid they wanted to use. Then, in order to get any kind of reasonable payback, you would need thousands of other people to make a bet for or against, using their own money.

    The payout isn’t on someone making a bet on themselves; no one else would bet for or against that, as the stakes are so small. The payout is on large-scale events that are - ostensibly - out of the control of the bettor or the subject of the bet.

    Polymarket is no different from betting on the outcomes of horse races or sports games; it just opens up the thing being bet on to anything and everything. People will still bet. The key is how “un-rigged” it appears to be.
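    For intuition, binary prediction markets are usually modeled as shares that pay $1 each if the outcome resolves in your favor. A minimal sketch of that model follows; the prices and stakes are illustrative, not Polymarket’s actual contract mechanics or fee structure:

    ```python
    # Minimal sketch of a binary prediction-market payout, assuming the
    # common "each share pays $1 on the winning outcome" model. Prices
    # and stakes are illustrative; real markets add fees and slippage.

    def payout(stake: float, share_price: float, won: bool) -> float:
        """Total payout for `stake` dollars of shares bought at `share_price`."""
        shares = stake / share_price           # number of $1-payout shares
        return shares if won else 0.0          # losing shares expire worthless

    # $100 on YES at $0.50 a share returns $200 if YES resolves.
    win = payout(100.0, 0.50, won=True)    # 200.0
    loss = payout(100.0, 0.50, won=False)  # 0.0
    ```

    The profit on the winning side comes entirely from the stakes put up by the losing side, which is why a thin market with no counterparties pays out next to nothing.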



  • How much do large language models actually hallucinate when answering questions grounded in provided documents?

    Okay, this is looking promising, at least in terms of the most important qualifications being plainly stated in the opening line.

    The rate of hallucinations/inaccuracies “in the wild” - depending on the model being tested - runs about 60-80%. But that reflects average use on generalized data sets, not questions focused on specific documentation, so of course the “in the wild” questions will see a higher rate.

    This also helps users, as it shows that hallucinations/inaccuracies can be reduced by as much as ⅔ by simply limiting LLMs to specific documentation that the user is certain contains the desired information, rather than letting them trawl world+dog.
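    That grounding step can be as simple as the prompt construction itself. A hedged sketch of the idea, where the template wording and the “NOT FOUND” refusal sentinel are my own assumptions and the actual model call is omitted:

    ```python
    # Sketch of document-grounded prompting: restrict the model to a
    # supplied document instead of open-ended recall. Template wording
    # and the "NOT FOUND" sentinel are illustrative assumptions.

    def grounded_prompt(document: str, question: str) -> str:
        """Build a prompt that forbids answers from outside `document`."""
        return (
            "Answer using ONLY the document below. If the answer is not "
            "in the document, reply exactly: NOT FOUND.\n\n"
            f"--- DOCUMENT ---\n{document}\n--- END DOCUMENT ---\n\n"
            f"Question: {question}"
        )

    prompt = grounded_prompt("The warranty lasts 24 months.", "How long is the warranty?")
    ```

    The explicit refusal path matters: giving the model a sanctioned way to say “not in the document” is what cuts down on confident fabrication.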

    Very interesting!


  • That may be the case, but the most irritating thing is that they fill all available spots with the lowest-capacity chips that meet the requested provisioning spec, instead of using the fewest higher-capacity chips needed to meet that spec. The latter, at least, would leave spots open for an authorized repair location to manually solder on additional approved chips of a compatible spec.




  • If you have the money and want simplicity, reliability, and interoperability, go for a Mac. Just clench your sphincter and maximize the RAM; 32 GB is the minimum appropriate for a 7-8 year lifespan of basic duties. And FFS, for storage, go for 2.5× what your current data uses or 1 TB, whichever is larger (there are vital performance reasons for that). Don’t get the smallest storage unless third-party upgrade options exist, as they do for the Mac Mini M4. And remember: all RAM and a lot of storage is integrated these days, which is why you should always max it out; there is no upgrade path except wholesale replacement of the machine. CPU is largely immaterial unless you are doing truly heavy lifting like video editing or AI, so that can often be the lowest tier.
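    That sizing rule as a quick calculator - a sketch of the heuristic exactly as stated above, in decimal terabytes:

    ```python
    # The "current data x 2.5, or 1 TB, whichever is larger" storage
    # heuristic from above, in decimal terabytes.

    def recommended_storage_tb(current_data_tb: float) -> float:
        return max(current_data_tb * 2.5, 1.0)

    recommended_storage_tb(0.3)  # light user: the 1 TB floor wins -> 1.0
    recommended_storage_tb(4.0)  # 4 TB of data: the 2.5x rule wins -> 10.0
    ```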

    If you want freedom and a truly unconstrained system, some form of Linux/BSD on a Framework machine is the way to go. Or, for a desktop, assemble it yourself.

    If you are going to stick with Windows, go for a business-class Dell. Trust me, it’ll be almost as $$$$ painful as a Mac, but these little f**kers are built to last. At least you can upgrade the RAM and on-board storage, although I honestly recommend not going under 32 GB for anything other than basic tasks. It’ll be a lot zippier with 32 GB, even if you spend the first week tearing all the AI and built-in spyware out of Windows.





  • And even if the Core Storage held everything straight out of the gate, you could configure the initial storage as RAID-10 using only 28× 30 TB drives.

    In Canadian Pesos, that’s $34,000 before taxes for those drives. If the operating costs were in USD, that’s only 5 months of operating costs. Get a pair of used 4U 16-bay server boxes; almost anything built within the last decade will work well as a SAN/NAS, especially if you run a specialized FOSS NAS OS.
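    A back-of-envelope check of those figures - the drive count and CAD price come from above, and the rest is plain RAID-10 arithmetic:

    ```python
    # RAID-10 mirrors drives in pairs, so usable capacity is half of raw.
    # Figures echo the comment above: 28 drives of 30 TB at ~CAD $34,000.

    def raid10_usable_tb(drive_count: int, drive_tb: float) -> float:
        assert drive_count % 2 == 0, "RAID-10 needs an even number of drives"
        return drive_count * drive_tb / 2      # mirroring halves raw capacity

    usable = raid10_usable_tb(28, 30)          # 420.0 TB usable
    cad_per_usable_tb = 34_000 / usable        # ~80.95 CAD per usable TB
    ```

    420 TB usable comfortably covers the archive sizes discussed here, with striping for read speed and a full mirror of every drive.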

    A good strategy for migrating to BitTorrent would be to migrate the high-value content first, so that bugs and failures ooze out of the woodwork as rapidly as possible. This would also allow you to build the NAS/SAN data storage boxes over time, one at a time, instead of all at once. And you can start with a repurposed desktop as the seedbox itself and add more RAM once the BitTorrent client outgrows the box’s initial resources. This stepwise growth would also give you the opportunity to work out any kinks and gotchas you failed to anticipate.

    For example, the BitTorrent client you choose to run on the seedbox will be of critical importance. I have found, through my own use of multiple clients, that by far the most aggressive BitTorrent client I have ever come across is BiglyBT. I am able to achieve in weeks - sometimes even days - a ratio that most other clients require years or even decades to reach. For seeding out, there is literally nothing better.

    As an example: when MyAnonamouse banned BiglyBT, I tried an experiment, downloading the same movie file with several different torrent clients. After a full year of seeding, the runner-up was qBittorrent, with a ratio of 0.2. BiglyBT? A ratio of 870.

    Same file, same super-seeding, but a massive difference between BiglyBT and pretty much anything else out there.
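    For reference, a seeding ratio is just bytes uploaded over bytes downloaded. The sketch below echoes the 0.2-versus-870 comparison above; the 2 GB file size is a hypothetical of mine, not from the original experiment:

    ```python
    # Seeding ratio = total uploaded / total downloaded. The 2 GB file
    # size is hypothetical; the ratios mirror the comparison above.

    def seed_ratio(uploaded_gb: float, downloaded_gb: float) -> float:
        return uploaded_gb / downloaded_gb

    seed_ratio(0.4, 2.0)     # qBittorrent-style result after a year: 0.2
    seed_ratio(1740.0, 2.0)  # BiglyBT-style result: 870.0
    ```

    Put another way, a ratio of 870 on a 2 GB file means roughly 1.7 TB uploaded from a single torrent.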

    It’s a shame that so many closed trackers ban BiglyBT. It is absolutely an overall benefit to the ecosystem.


  • …What is Myrient?

    googles name

    390 TB of history

    …Oh. Oh, no. This loss would be painful.

    I mean, not a gamer, but daaaaaamn.

    A structured BitTorrent system could keep most high-demand files offline after initial seeding, especially if seeding rules like the ones MyAnonamouse uses were implemented. And the low-demand ones could remain online via a seedbox from anywhere, even from the operator’s basement.
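    The tiering idea could be as simple as a swarm-health check. A sketch, where the threshold and the notion of “active seeders” are my own assumptions rather than any tracker’s actual rules:

    ```python
    # Sketch of demand-based tiering: torrents whose swarms are healthy
    # can leave the operator's storage; thin swarms stay on the seedbox.
    # The threshold of 3 seeders is an illustrative assumption.

    def needs_operator_seedbox(active_seeders: int, min_swarm: int = 3) -> bool:
        """True when the swarm is too thin to survive without the operator."""
        return active_seeders < min_swarm

    needs_operator_seedbox(25)  # popular file, swarm self-sustains -> False
    needs_operator_seedbox(1)   # rare file, operator must keep seeding -> True
    ```

    High-demand files pass the check and can be taken offline after initial seeding; the long tail of rare files is what the operator’s own box has to carry.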

    Honestly, while I don’t have funds to take over normal operations or even provide seedbox space, I can see many paths out of this problem.



  • ANYTHING cloud-connected - your doorbell, your security system, even all f**king post-2006 vehicles, regardless of manufacturer - is suspect.

    And are highly likely to be actually spying on you.

    I’ve been working with computers since 1982, on the Internet since 1988, on the Web since 1992, and in the IT industry since 1997. The proportion of average people who don’t realize how much of their stuff is exposing them, and by how much, is frankly astounding. Almost 100% of normies are woefully ignorant. Even among IT people, the clueless are in the majority.

    And the security on this stuff that tracks you tends to be - except in rare circumstances - absolute dogshite. Sometimes it comes without any meaningful security at all, such as every device sold having the same admin creds baked in, or all remote-access credentials being identical and non-user-editable.

    This is why almost all of my stuff is hardlined, I have no IoT devices at all, and the wifi for my family’s devices is physically separate from everything else.

    Don’t get me wrong, as IT for almost three decades I love all the new shinies. But I’m not blind, and I’m not stupid.




  • rekabis@lemmy.ca to Memes@lemmy.ml: “i’m a hardliner” (2 months ago)
    As someone who has struggled with various forms of WiFi for a good three decades, WiFi can just fuck all the way off to the ninth circle of hell.

    Even the rolling gut renovation of my house has Cat7 planned beside every knee-level power outlet in every room, with at least one fiber drop in every room as a high-bandwidth option. And my security will be 100% PoE on an airgapped network.

    Hardlining really is the only way to network.



  • You stopped a bit short on your delete spree I guess.

    No, that was just the first two steps. In the “rip shit out” category alone, I typically churn through at least three separate tools, usually in this order:

    • Win10Privacy
    • Win11Debloat
    • Winslop

    I mean, sure, Windows can take as little as half an hour to “install”. But on a personal rig (which also includes shoehorning my own workflow software and personal data back into place), I take another 24-48 hours to gleefully beat it into submission and install secondary programs that bypass the warts it has acquired over the years.

    And as a benchmark, XP needed only about 6-8 hrs of extra work to reach the same threshold of data migration, workflow software, and improved usability (I was an NT fanboy; IMO the primary improvement of XP over 2000 was the start menu).

    If we add up the AI push, the spyware/telemetry explosion, the recent attempts to force the use of a Microsoft Account as the default login, and the massive bloating and instability of Windows in general, it’s slowly becoming time for even non-technical, everyday users to move to Linux.