Flask API with DeepSeek-R1 via Ollama
This is an API developed with Flask in Python that connects to the DeepSeek-R1 LLM through the Ollama platform.
It lets you run LLM inference locally in a simple and efficient way, with a basic UI and streamed responses.
Prerequisites
Before running the project, make sure you have the following items installed:
- Python with pip
- Ollama
- The deepseek-r1 model downloaded from Ollama
- Clone the repository
git clone
cd ollama-llm-deepseek-server
- Install the dependencies
pip install -r requirements.txt
- Install and start Ollama, if you don't already have it
# On Linux via curl
curl -fsSL https://ollama.com/install.sh | sh
- Download the DeepSeek-R1 model, if you don't have it yet
ollama pull deepseek-r1:latest
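Ollama serves a local HTTP API on port 11434 by default. Before starting the Flask app, you can confirm the model is reachable with a one-off, non-streaming request; this is a minimal sketch using the requests library, and the prompt is arbitrary:
# verify_ollama.py (sketch) - quick check that Ollama is serving deepseek-r1 locally
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "deepseek-r1:latest",
        "prompt": "Say hello in one short sentence.",
        "stream": False,  # return a single JSON object instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])  # the model's full reply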
Running the API
Just run the app.py script:
python app.py
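The repository's actual app.py is not reproduced here, but a minimal sketch of the pattern the project describes (a Flask route that forwards a prompt to Ollama and streams the model's tokens back to the caller) could look like the following. The /generate route, the request field names, and the hard-coded Ollama URL are assumptions for illustration, not the project's actual interface:
# app.py (sketch) - Flask front end that streams DeepSeek-R1 output from Ollama
import json

import requests
from flask import Flask, Response, request

app = Flask(__name__)
OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

@app.route("/generate", methods=["POST"])
def generate():
    prompt = request.get_json(force=True).get("prompt", "")

    def stream():
        # Ask Ollama for a streaming response and relay each token as it arrives.
        with requests.post(
            OLLAMA_URL,
            json={"model": "deepseek-r1:latest", "prompt": prompt, "stream": True},
            stream=True,
        ) as upstream:
            upstream.raise_for_status()
            for line in upstream.iter_lines():
                if not line:
                    continue
                chunk = json.loads(line)  # Ollama streams one JSON object per token
                yield chunk.get("response", "")

    return Response(stream(), mimetype="text/plain")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
Streaming the tokens through a generator keeps memory flat and lets the UI show the reply as it is produced, rather than waiting for the full completion.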
Basic structure
The project layout is as follows:
├── app.py
├── requirements.txt
└── README.md
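Once the server is running, the stream can be consumed incrementally from Python as well; this short sketch assumes the hypothetical /generate route and port 5000 from the example above:
# client.py (sketch) - print the streamed reply token by token
import requests

with requests.post(
    "http://localhost:5000/generate",
    json={"prompt": "Explain what Ollama does in two sentences."},
    stream=True,
) as resp:
    resp.raise_for_status()
    for piece in resp.iter_content(chunk_size=None, decode_unicode=True):
        print(piece, end="", flush=True)
print()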
Credits
Project developed by Jocimar Lopes.
Feel free to contribute or use it in your own projects.
License
This project is licensed under the terms of the MIT License.