Just a stranger trying things.

  • 0 Posts
  • 1 Comment
Joined 2 years ago
Cake day: July 16th, 2023

  • The Hobbyist@lemmy.zip to Selfhosted@lemmy.world · “I installed Ollama. Now what?”
    25 points · edited 12 days ago

    Ollama is very useful but rather barebones. I recommend installing Open WebUI to manage models and conversations. It is also useful if you want to tweak more advanced settings like the system prompt, seed, and temperature.
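    For context, those settings map directly onto fields in Ollama's REST API, so you can also set them without any frontend. A sketch, assuming a default Ollama install listening on localhost:11434 (the model tag and prompt here are just examples):

```shell
# Set the system prompt, seed, and temperature in a single request to
# Ollama's /api/generate endpoint (default port 11434).
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?",
  "system": "You are a concise assistant.",
  "options": { "temperature": 0.2, "seed": 42 },
  "stream": false
}'
```

    Open WebUI exposes the same knobs in its settings UI, which is easier for day-to-day use.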

    You can install Open WebUI using Docker or just pip; pip is enough if you only care about serving yourself.
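    A sketch of both routes, based on Open WebUI's published install instructions (check the project's docs for current versions; the ports shown are just the defaults):

```shell
# pip route: enough for single-user, local use
pip install open-webui
open-webui serve              # UI on http://localhost:8080 by default

# Docker route: persists data in a named volume and reaches an
# Ollama instance running on the host
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
# UI on http://localhost:3000
```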

    Edit: Open WebUI also renders Markdown, which makes formatting and reading much more appealing and useful.

    Edit 2: you can also plug Ollama into continue.dev, a VSCode extension that brings LLM capabilities into your IDE.
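    For anyone trying that pairing, a sketch of the relevant Continue model entry (this follows the older `config.json` format, and `llama3` is just an example tag; Continue's config schema changes between releases, so check its docs):

```json
{
  "models": [
    {
      "title": "Llama 3 (local Ollama)",
      "provider": "ollama",
      "model": "llama3"
    }
  ]
}
```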