• 0 Posts
  • 359 Comments
Joined 2 years ago
Cake day: July 3rd, 2023


  • If I were going to make something like this, it would have to incorporate trust chains. I don’t care if some maga-hat says this lady is horrible. I care if my good friend Alex says she’s horrible. One person’s “this person won’t shut up about communism” is a big red flag (no pun intended), but for someone else that’s the dream.

    When you sign up, you’d need to be referred by an existing member or be a root node. Anyone connected to you can be weighted differently. If some section of the tree is misbehaving, prune it.

    But that’s a lot of work
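A minimal sketch of what that referral tree might look like (all names, weights, and class names here are hypothetical, not a real system):

```python
class User:
    """A node in a referral-based trust tree."""

    def __init__(self, name, referrer=None, weight=1.0):
        self.name = name
        self.referrer = referrer   # None means this user is a root node
        self.weight = weight       # how much the referrer trusts this user
        self.children = []
        if referrer is not None:
            referrer.children.append(self)

    def trust(self):
        """Trust is the product of edge weights back to the root,
        so distant acquaintances count for less than direct friends."""
        if self.referrer is None:
            return 1.0
        return self.weight * self.referrer.trust()

    def prune(self):
        """Cut this user, and their entire subtree, out of the network."""
        if self.referrer is not None:
            self.referrer.children.remove(self)
            self.referrer = None


root = User("alex")                               # a root node
friend = User("sam", referrer=root, weight=0.9)   # direct referral
rando = User("rando", referrer=friend, weight=0.3)

print(rando.trust())   # 0.9 * 0.3 = 0.27
rando.prune()          # misbehaving subtree? gone.
```

Pruning at the subtree level is what makes referrals meaningful: if you vouch for someone who misbehaves, everyone they brought in goes with them.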


  • It is absolutely stupid, stupid to the tune of “you shouldn’t be a decision maker”, to think an LLM is a better tool for “getting a quick intro to an unfamiliar topic” than reading an actual intro to an unfamiliar topic. For most topics, Wikipedia is right there, complete with sources. For obscure things, an LLM is just going to lie to you.

    As for “looking up facts when you have trouble remembering them”, using the lie machine is a terrible idea. It’s going to say something plausible, and you tautologically are not in a position to verify it. And, as above, you’d be better off finding a reputable source. If I type in “how do i strip whitespace in python?” an LLM could very well say “it’s your_string.strip()”. That’s wrong. Just send me to the fucking official docs.
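For what it’s worth, the `.strip()` example holds up under one common reading of the question: `str.strip()` only removes leading and trailing whitespace, so if “strip whitespace” means removing all of it, that plausible-sounding answer is incomplete (the variable here is illustrative):

```python
s = "  foo   bar  "

# .strip() only trims whitespace at the ends of the string
print(s.strip())            # "foo   bar"

# removing ALL whitespace needs something else, e.g.:
print("".join(s.split()))   # "foobar"
```

Which behavior you wanted is exactly the nuance the official docs spell out and a one-line confident answer glosses over.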

    There are probably edge or special cases, but for general search on the web? LLMs are worse than search.