If you were running an LLM locally on Android through llama.cpp for use as a private personal assistant, what model would you use?

Thanks for any recommendations in advance.
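For context, here's roughly the setup I have in mind, sketched with the llama-cpp-python binding inside Termux. The model path, model choice, and parameters below are just placeholders, not a working recommendation:

```python
from llama_cpp import Llama

# Load a small quantized GGUF model; the path and model are placeholders --
# the whole question is which ~1-4B instruction-tuned GGUF works best on a phone.
llm = Llama(
    model_path="/sdcard/models/your-model-q4_k_m.gguf",  # hypothetical path
    n_ctx=2048,    # modest context window to keep RAM use low
    n_threads=4,   # roughly match the phone's performance cores
)

# Single-turn chat completion, entirely on-device.
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarise my day in one sentence."}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```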

  • J52@lemmy.nz · 2 days ago

    Just because things are natural doesn’t mean you can’t apply reason. Anything that will end up killing us, in the bigger picture, is pretty stupid, especially if it only provides short-term comfort we can easily do without.