If you were running an LLM locally on Android through llama.cpp for use as a private personal assistant, what model would you use?

Thanks for any recommendations in advance.

  • Smee@poeng.link · 2 days ago

    I’ve successfully run small-scale LLMs on my phone; slow, but very doable. I run my main AI system on an older, midrange gaming PC with no problems at all.
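    For reference, here is a minimal sketch of how one might drive a small quantized GGUF model through the llama-cpp-python bindings (assuming they build in your Termux/Android environment). The model file name, thread count, and sampling parameters are illustrative placeholders, not recommendations from this thread:

    ```python
    # Minimal sketch: run a small quantized GGUF model via llama-cpp-python.
    # The model path below is a placeholder; point it at whatever small
    # GGUF file you have downloaded to the device.
    from llama_cpp import Llama

    llm = Llama(
        model_path="models/small-model-q4_k_m.gguf",  # placeholder GGUF file
        n_ctx=2048,    # keep the context window small on phones to save RAM
        n_threads=4,   # roughly match your phone's performance cores
    )

    out = llm(
        "You are a private personal assistant. Summarise my day in one sentence.",
        max_tokens=64,
        temperature=0.7,
    )
    print(out["choices"][0]["text"])
    ```

    The same idea works with the llama-cli binary directly if you build llama.cpp in Termux; the bindings just make it easier to script assistant-style prompts.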

    Dicio is a pre-programmed assistant which one can talk to if one has speech-recognition software installed. It has a preset list of tasks it can do; in my experience it’s not really comparable to how LLMs work.