• abhibeckert@lemmy.world
    9 months ago

    Future systems could for example start asking questions more often

    Current systems already do that. But those extra round-trips are expensive, and it can be cheaper to have a human do it instead. Prompt engineering is very much a thing when you’re working with high-performance, low-memory-consumption language models.
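    A minimal sketch of what that prompt engineering can look like: a system prompt that nudges the model to ask a clarifying question instead of guessing. The message shape is an assumption (a generic chat-style API); no real model call is made here, we only assemble the prompt.

    ```python
    # Hypothetical example: steer a chat model toward asking questions.
    # The "system"/"user" message format is assumed, not tied to any
    # specific vendor API.

    SYSTEM_PROMPT = (
        "You are a careful assistant. If the user's request is ambiguous "
        "or missing details, do NOT guess: ask one short clarifying "
        "question before answering."
    )

    def build_messages(user_text: str) -> list[dict]:
        """Assemble a chat-style message list with the steering prompt first."""
        return [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_text},
        ]

    messages = build_messages("Convert this file.")
    print(messages[0]["role"])  # system
    ```

    The whole trick fits in a few hundred bytes of prompt text, which is why it works even on small on-device models where every token of context costs memory.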

    We’re a long way from smartphones with a couple of terabytes of RAM and a few thousand GPU cores… but our phones can run basic models, and they do. Some phones already use a small LLM for keyboard autocorrect, for example.