LLM

By rbosaz , 12 January, 2026

Per The Register:

A group of anonymous AI industry workers launched “Poison Fountain,” a project encouraging people to seed the web with subtly corrupted data to sabotage AI training. They argue AI systems are growing too fast and too dangerously, and that poisoning scraped data is one of the few effective ways to slow them down. The effort builds on research showing that even small amounts of malicious data can meaningfully degrade model performance.

By rbosaz , 12 September, 2024

The following was taken from here. All steps below assume a Linux OS; I used Debian 12.

Setup Ollama:

curl -fsSL https://ollama.com/install.sh | sh

Running Model:

ollama run llama3

To use the Ollama API from other machines, you'll need to open the appropriate port (11434 by default) and perform the following steps, extracted from here:

We want our API endpoint to be reachable by the rest of the LAN. For ollama, this means setting OLLAMA_HOST=0.0.0.0 in the ollama.service.

  • Run the following command to edit the service:

    sudo systemctl edit ollama.service

In the override file that opens, add a [Service] section with Environment="OLLAMA_HOST=0.0.0.0" under it. It should look like this:

[Service]
Environment="OLLAMA_HOST=0.0.0.0"

  • Save and exit.
  • Reload the systemd configuration and restart the service:

    sudo systemctl daemon-reload

    sudo systemctl restart ollama
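If you prefer to skip the interactive editor, the same result can be achieved non-interactively. The sketch below writes the drop-in file that systemctl edit would create; /etc/systemd/system/ollama.service.d/override.conf is systemd's standard override path, but adjust it if your distribution differs:

```shell
# Create the drop-in directory and override file by hand
# (equivalent to what `sudo systemctl edit ollama.service` produces)
sudo mkdir -p /etc/systemd/system/ollama.service.d
sudo tee /etc/systemd/system/ollama.service.d/override.conf > /dev/null <<'EOF'
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
EOF

# Pick up the new unit configuration and restart the service
sudo systemctl daemon-reload
sudo systemctl restart ollama
```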

An example Ollama API call (replace 192.168.1.179 with your server's address) is as follows:

curl http://192.168.1.179:11434/api/generate -d '{
 "model": "codellama",
 "prompt": "Why is the sky blue?",
 "stream": false
}'
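With "stream": false, the API returns a single JSON object whose "response" field holds the generated text (alongside metadata such as "model" and "done"). A minimal sketch of pulling that field out with python3, using an abridged sample response so nothing here depends on a running server:

```shell
# Abridged sample of a non-streaming /api/generate response body
response='{"model":"codellama","response":"The sky is blue because of Rayleigh scattering.","done":true}'

# Extract the generated text from the "response" field
# (jq works just as well if it is installed: jq -r .response)
echo "$response" | python3 -c 'import json,sys; print(json.load(sys.stdin)["response"])'
```

In real use you would pipe the output of the curl command above into the same one-liner.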

Adding a web UI

One of the easiest ways to add a web UI is to use a project called Open WebUI. With Open WebUI, you get a web frontend eerily similar to the one used by OpenAI.

You can run Open WebUI inside of Docker. According to the official documentation from Open WebUI, you can use the following command if Ollama is on the same computer:

docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
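If Ollama runs on a different machine than Open WebUI, the Open WebUI documentation describes pointing the container at it with the OLLAMA_BASE_URL environment variable instead. A sketch, reusing the example server address from earlier (substitute your own):

```shell
# Run Open WebUI against a remote Ollama instance
# (use this instead of the command above, not in addition to it,
#  since both use the container name "open-webui")
docker run -d -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://192.168.1.179:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```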

URL to connect from a remote computer: http://<host_name/ip_address>:3000

Keeping Your Open WebUI Container Up-to-Date

In case you want to update the Open WebUI container to the latest image, you can do it with Watchtower:

docker run --rm --volume /var/run/docker.sock:/var/run/docker.sock containrrr/watchtower --run-once open-webui
