
Overview

This Docker Compose environment provides a local lab setup to test Generative AI (GenAI) models and applications. It includes:

  1. Jailbreak Prevention Service
  2. Demo Application UI
  3. Ollama Local Models
  4. Demo RAG App
  5. Demo Tool Agents
  6. Demo Text2SQL Agent

Prerequisites

  • Docker and Docker Compose installed
  • An OpenAI API key (used in the .env file below)

Setup Instructions

1. Clone the Repository

git clone https://github.com/detoxio-ai/dtx_vuln_lab.git
cd dtx_vuln_lab

2. Create your .env file from the template

cp .env.template .env

3. Edit the .env file and provide required values

  • Set your OPENAI_API_KEY
  • (Optional) Adjust ports and other variables
  • Define models to preload in OLLAMA_MODELS_TO_DOWNLOAD (comma-separated)

Example .env Snippet:

OPENAI_API_KEY=your-openai-key
# Example models to preload (adjust as needed)
OLLAMA_MODELS_TO_DOWNLOAD=llama3,mistral
# Keep other variables at their defaults

4. Start the environment

docker-compose up -d

5. Verify services are running

docker-compose ps
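
Beyond `docker-compose ps`, a quick way to confirm each service is reachable is to probe its TCP port. The sketch below assumes the default host ports listed in the Services Overview section; adjust it if you changed ports in `.env`:

```python
import socket

# Default host ports from the Services Overview section (assumed unchanged).
SERVICES = {
    "Jailbreak Prevention Service": 8000,
    "Demo App": 17860,
    "Demo RAG App": 17861,
    "Ollama Models": 11434,
    "Demo Tool Agents": 17862,
    "Demo Text2SQL Agent": 17863,
}

def port_open(port, host="localhost", timeout=2.0):
    """Return True if something is accepting TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for name, port in SERVICES.items():
        status = "up" if port_open(port) else "down"
        print(f"{name:32} localhost:{port:<6} {status}")
```

Any service reported as `down` probably failed its health check; inspect it with `docker-compose logs <service>`.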

6. Run Ollama commands using Docker Compose

docker-compose exec ollama ollama list

You can replace ollama list with any other Ollama command.


Services Overview

| Service | Description | Default URL |
|---|---|---|
| Jailbreak Prevention Service | Provides prompt safety and filtering for GenAI inputs | http://localhost:8000 |
| Demo App | Web UI to interact with and test the Demo Chat App | http://localhost:17860 |
| Demo RAG App | Web UI to interact with and test the Demo RAG App | http://localhost:17861 |
| Ollama Models | Local language-model runtime for testing without external LLM APIs | http://localhost:11434 |
| Demo Tool Agents | Interactive tool-agent demo for prompt engineering and testing | http://localhost:17862 |
| Demo Text2SQL Agent | Interactive Text2SQL agent for structured queries | http://localhost:17863 |
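
Besides the CLI, Ollama exposes an HTTP API on its port. As a sketch (assuming the default http://localhost:11434 mapping from the table above), you can list the locally available models programmatically; the function returns an empty list when the server is unreachable:

```python
import json
from urllib.request import urlopen
from urllib.error import URLError

def list_local_models(base_url="http://localhost:11434"):
    """Query Ollama's /api/tags endpoint; return model names, or [] if unreachable."""
    try:
        with urlopen(f"{base_url}/api/tags", timeout=3) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (URLError, OSError, ValueError):
        return []

if __name__ == "__main__":
    print(list_local_models())
```

This is equivalent to `docker-compose exec ollama ollama list`, but usable from test scripts or the demo apps themselves.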

Notes

  • The ollama-model-downloader service pulls models listed in OLLAMA_MODELS_TO_DOWNLOAD (set in .env) when the stack starts.
  • All services include health checks and proper startup sequencing.
  • Models are stored in ${HOME}/.ollama for persistence across runs.
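
The downloader's handling of the comma-separated OLLAMA_MODELS_TO_DOWNLOAD value can be sketched roughly as follows; this splitting logic is an illustration of the expected format, not the service's actual code:

```python
import os

def models_to_download(env=os.environ):
    """Split OLLAMA_MODELS_TO_DOWNLOAD into a clean list of model names,
    tolerating stray spaces and trailing commas."""
    raw = env.get("OLLAMA_MODELS_TO_DOWNLOAD", "")
    return [name.strip() for name in raw.split(",") if name.strip()]

if __name__ == "__main__":
    print(models_to_download())
```

So a value like `llama3, mistral` yields the two model names `llama3` and `mistral`, each of which the downloader pulls at startup.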
To access Ollama from your local Ollama CLI, set the OLLAMA_HOST environment variable:

export OLLAMA_HOST=localhost:11435