tl;dr

Can someone give me step-by-step instructions (ELI5) on how to access the LLMs on my rig from my phone?

Jan seems the easiest, but I’ve also tried Ollama, LibreChat, etc.

I’ve taken steps to secure my data and now I’m going the self-hosting route. I don’t care to become a savant with the technical aspects of this stuff, but even the basics are hard to grasp! I’ve been able to install an LLM provider on my rig (Ollama, LibreChat, Jan, all of them) and I can successfully get models running on them. BUT what I would LOVE to do is access the LLMs on my rig from my phone while I’m nearby. I’ve read that I can do that over Wi-Fi or the LAN or something like that, but I’ve had absolutely no luck. Jan seems the easiest because all you have to do is something with an API key, but I can’t even figure that out.
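
In case it narrows things down, here’s roughly what I think the flow is supposed to look like, sketched against Ollama’s HTTP API (not sure this is right — the rig IP and model name below are placeholders I made up, and I gather the rig first has to be told to listen on the LAN, e.g. by starting Ollama with OLLAMA_HOST=0.0.0.0):

```python
# Quick LAN test for an Ollama server running on the rig.
# Assumptions: Ollama is on its default port 11434 and has been started
# with OLLAMA_HOST=0.0.0.0 so it binds to all interfaces, not just localhost.
# Replace RIG_IP with the rig's local address (from `ipconfig` / `ip addr`).
import json
import urllib.request

RIG_IP = "192.168.1.50"          # hypothetical LAN address of the rig
URL = f"http://{RIG_IP}:11434/api/generate"

payload = json.dumps({
    "model": "llama3",           # any model already pulled on the rig
    "prompt": "Say hello from the rig.",
    "stream": False,             # ask for one JSON reply instead of a stream
}).encode()

req = urllib.request.Request(
    URL, data=payload, headers={"Content-Type": "application/json"}
)
with urllib.request.urlopen(req, timeout=60) as resp:
    print(json.loads(resp.read())["response"])
```

If something like that gets an answer from another machine on the same Wi-Fi, I assume the phone should be able to hit the same URL from a browser or any chat app that lets you point it at a custom server. Is that the right idea?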

Any help?

  • brucethemoose@lemmy.world · 17 hours ago

    How much system RAM, and what kind? DDR5?

    ik doesn’t have great documentation, so it’d be a lot easier for me to just point you places, heh.