• 0 Posts
  • 475 Comments
Joined 2 years ago
cake
Cake day: October 6th, 2023

help-circle




  • Also, live-service games endeavour to stay relevant forever.

    For, say, God of War, you’ll eventually be done with it. You’ve played all the things, you put the box on the shelve and move on to another game. But for these forever-games, you can play them forever.

    And that means that if you want to launch a game in that market, you can’t rely on getting players who just put down God of War and want something roughly similar. You need to not only be better than Fortnite, but you need to be sufficiently better than people will abandon years of investement into Fortnite to go play your game.

    The barrier to entry is HUGE, and it’s made much worse by the idea that the new game might dissapear, meaning you wasted months (or, occasionally, days, lol).







  • Translators are facing some big problems. They’re just now coming the painful realisation that a quality level of “low-to-mid” is acceptable for a LOT of stuff that would previously only have the option for “excellent” and “not at all”.

    If you were in the business of selling excellent translations to people who only want low-to-mid translations, you’re going to be out of a job.



  • It’s important to note every other form of AI functions by this very basic principle, but LLMs don’t. AI isn’t a problem, LLMs are.

    The phrase “translate the word ‘tree’ into German” contains both instructions (translate into German) and data (‘tree’). To work that prompt, you have to blend the two together.

    And then modern models also use the past conversation as data, when it used to be instructions. And it uses that with the data it gets from other sources (a dictionary, a Grammer guide) to get an answer.

    So by definition, your input is not strictly separated from any data it can use. There are of course some filters and limits in place. Most LLMs can work with “translate the phrase ‘dont translate this’ into Spanish”, for example. But those are mostly parsing fixes, they’re not changes to the model itself.

    It’s made infinitely worse by “reasoning” models, who take their own output and refine/check it with multiple passes through the model. The waters become impossibly muddled.


  • Tar_Alcaran@sh.itjust.worksto4chan@lemmy.worldThe pile
    link
    fedilink
    arrow-up
    44
    arrow-down
    1
    ·
    11 days ago

    https://en.wikipedia.org/wiki/Blåhaj

    it has become a cultural icon in countries like Russia and China, played a symbolic role in Switzerland’s same-sex marriage referendum, and found special significance within the transgender community.

    And on a more directly observed, less sourced note, it’s kind of “congrats on being out of your egg” thing that the conservatives somehow still haven’t picked up on, so it’s safe to use for vulnerable trans kids.

    It’s also really soft.