gpu support
48	README.md
@@ -4,14 +4,32 @@ Welcome to the Ollama Docker Compose Setup! This project simplifies the deployme
 ## Getting Started
 
 To get started with the Ollama Docker Compose Setup, follow the steps below:
 
 ### Prerequisites
 
 Make sure you have the following prerequisites installed on your machine:
 
-- [Docker](https://www.docker.com/)
-- [Docker Compose](https://docs.docker.com/compose/)
+- Docker
+- Docker Compose
+
+#### GPU Support (Optional)
+
+If you have a GPU and want to leverage its power within a Docker container, follow these steps to install the NVIDIA Container Toolkit:
+
+```bash
+curl -fsSL https://nvidia.github.io/libnvidia-container/gpgkey | sudo gpg --dearmor -o /usr/share/keyrings/nvidia-container-toolkit-keyring.gpg \
+  && curl -s -L https://nvidia.github.io/libnvidia-container/stable/deb/nvidia-container-toolkit.list | \
+    sed 's#deb https://#deb [signed-by=/usr/share/keyrings/nvidia-container-toolkit-keyring.gpg] https://#g' | \
+    sudo tee /etc/apt/sources.list.d/nvidia-container-toolkit.list
+sudo apt-get update
+sudo apt-get install -y nvidia-container-toolkit
+
+# Configure NVIDIA Container Toolkit
+sudo nvidia-ctk runtime configure --runtime=docker
+sudo systemctl restart docker
+
+# Test GPU integration
+docker run --gpus all nvidia/cuda:11.5.2-base-ubuntu20.04 nvidia-smi
+```
 
 ### Configuration
@@ -27,7 +45,7 @@ Make sure you have the following prerequisites installed on your machine:
 cd ollama-docker
 ```
 
-### Usage
+## Usage
 
 Start Ollama and its dependencies using Docker Compose:
 
@@ -35,21 +53,21 @@ Start Ollama and its dependencies using Docker Compose:
 docker-compose up -d
 ```
 
-Visit [http://localhost:3000](http://localhost:3000) in your browser to access Ollama-webui
+Visit [http://localhost:3000](http://localhost:3000) in your browser to access Ollama-webui.
 
-There go to settings -> model and intall a model e.g **llama2**
-this can take a couple minutes, but after you can now user it just like chatgpt.
+### Model Installation
+
+Navigate to settings -> model and install a model (e.g., llama2). This may take a couple of minutes, but afterward, you can use it just like ChatGPT.
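The web UI is not the only way in: the ollama container also publishes its HTTP API on port 11434 (the `11434:11434` mapping in the compose file). A hedged sketch of querying the documented `/api/generate` endpoint once a model such as llama2 is installed — the curl call itself needs the running stack, so it is left commented:

```shell
# Request body for Ollama's /api/generate endpoint;
# "stream": false asks for a single JSON response instead of a stream.
payload='{"model": "llama2", "prompt": "Why is the sky blue?", "stream": false}'

# Requires the stack to be up (docker-compose up -d) and llama2 installed:
# curl -s http://localhost:11434/api/generate -d "$payload"
```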
 
-you can also use langchain and ollama
-there is a third container called **app** that was created. inside is some examples.
-the container is a devcontainer as well so you can boot into it if you want to play with it.
+### Explore Langchain and Ollama
+
+You can explore Langchain and Ollama within the project. A third container named **app** has been created for this purpose. Inside, you'll find some examples.
 
-in the run.sh is also the code to make a virtual env if you dont want to use docker for your dev env.
+### Devcontainer and Virtual Environment
+
+The **app** container serves as a devcontainer, allowing you to boot into it for experimentation. Additionally, the run.sh file contains code to set up a virtual environment if you prefer not to use Docker for your development environment.
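run.sh itself is not reproduced in this diff; a typical virtual-environment bootstrap along the lines it describes (a sketch, not the script's exact contents) looks like:

```shell
# Hypothetical sketch -- run.sh's exact contents are not shown in this diff.
# Create a virtual environment and activate it in the current shell.
python3 -m venv .venv
. .venv/bin/activate

# Install the project's dependencies if a requirements file is present.
if [ -f requirements.txt ]; then
    pip install -r requirements.txt
fi
```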
 
-### Stop and Cleanup
+## Stop and Cleanup
 
 To stop the containers and remove the network:
 
@@ -63,10 +81,10 @@ We welcome contributions! If you'd like to contribute to the Ollama Docker Compo
 
 ## License
 
-This project is licensed under the [MIT License](LICENSE). Feel free to use, modify, and distribute it according to the terms of the license.
+This project is licensed under the [MIT License](LICENSE). Feel free to use, modify, and distribute it according to the terms of the license. Just give me a mention and some credit
 
 ## Contact
 
-If you have any questions or concerns, please contact us at [contact@ollama.com](mailto:contact@ollama.com).
+If you have any questions or concerns, please contact us at [vantlynxz@gmail.com](mailto:vantlynxz@gmail.com).
 
 Enjoy using Ollama with Docker Compose! 🐳🚀
 
@@ -11,7 +11,6 @@ services:
     image: ollama/ollama:latest
     ports:
       - 11434:11434
-    command: --gpus all
     deploy:
       resources:
         reservations:
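The hunk ends before the body of `reservations:`, and the header (`-11,7 +11,6`) shows one line dropped, evidently `command: --gpus all`. In recent Docker Compose versions, GPU access is instead declared under `deploy.resources.reservations.devices`; a sketch of that standard syntax (not necessarily this commit's exact contents):

```yaml
deploy:
  resources:
    reservations:
      devices:
        - driver: nvidia        # requires the NVIDIA Container Toolkit installed above
          count: all            # or an integer / device_ids list to pin specific GPUs
          capabilities: [gpu]
```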