I’ve been playing around with ollama. Given that you download the model, can you trust that it isn’t sending telemetry?

  • Jack@slrpnk.net
    5 hours ago

Can’t you run it from a container? I guess that will slow it down, but it will deny access to your files.
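
    The container approach could look something like this — a sketch using Docker’s `--network none`, assuming the official `ollama/ollama` image; the model name `llama3` and volume name are just examples:

    ```shell
    # First run WITH network so the model can be downloaded;
    # weights persist in the named volume "ollama"
    docker run -d --name ollama -v ollama:/root/.ollama ollama/ollama
    docker exec ollama ollama pull llama3

    # Recreate the container with networking fully disabled;
    # the downloaded model survives in the volume
    docker rm -f ollama
    docker run -d --name ollama --network none -v ollama:/root/.ollama ollama/ollama

    # With --network none the HTTP API isn't reachable from the host,
    # so interact through docker exec instead:
    docker exec -it ollama ollama run llama3
    ```

    Note the trade-off: cutting the network also cuts off the usual `localhost:11434` API, so any client would have to live inside the container too.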

    • marcie (she/her)@lemmy.ml
      5 hours ago

yeah you could. though i don’t see any evidence that the big open source LLM programs like jan.ai or ollama are doing anything wrong with their programs or files. chucking it in a sandbox would solve the problem for good though

      • SeekPie@lemm.ee
        3 hours ago

You could use the “Alpaca” flatpak and remove its internet access with Flatseal after downloading the model. (Linux)

Or deny the app’s access to the internet in the app settings. (Android)
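
    Flatseal is a GUI over flatpak’s permission overrides, so the same thing can be done from the terminal. A sketch — the app ID `com.jeffser.Alpaca` is an assumption, check yours with `flatpak list`:

    ```shell
    # Revoke network access for the Alpaca flatpak
    # (do this AFTER the model has been downloaded)
    flatpak override --user --unshare=network com.jeffser.Alpaca

    # To restore network access later:
    flatpak override --user --share=network com.jeffser.Alpaca
    ```

    `flatpak override` writes a per-app override file, so the change persists across restarts until you undo it.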