This project demonstrates how to deploy a multi-container application with Docker Compose. It builds an ML service in which users submit medical prescription text; the backend processes the input with the OpenAI API, extracts structured data, and stores it in a PostgreSQL database. Here’s the project layout:
docker-tutorial/3-text-extractor/
├── backend/
│   ├── config.py
│   ├── database.py
│   ├── models.py
│   ├── extraction.py
│   ├── main.py
│   ├── Dockerfile
│   └── requirements.txt
├── frontend/
│   ├── app.py
│   ├── Dockerfile
│   └── requirements.txt
├── .env
├── docker-compose.yml
└── README.md
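The docker-compose.yml file wires the three services together. A minimal sketch of what it might contain is shown below; the service names, image tag, and volume name are assumptions for illustration, not the tutorial's exact file:

```yaml
services:
  postgres:
    image: postgres:16
    env_file: .env            # supplies POSTGRES_USER/PASSWORD/DB
    ports:
      - "5432:5432"
    volumes:
      - pgdata:/var/lib/postgresql/data
  backend_api:
    build: ./backend
    env_file: .env            # supplies OPENAI_API_KEY and DB credentials
    ports:
      - "8000:8000"
    depends_on:
      - postgres
  frontend:
    build: ./frontend
    ports:
      - "8501:8501"
    depends_on:
      - backend_api

volumes:
  pgdata:
```

Because all three services share the default Compose network, the backend can reach the database by its service name rather than localhost.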
Set Environment Variables:
Create a .env file in the project directory with the following environment variables:
OPENAI_API_KEY=YOUR_OPENAI_API_KEY
POSTGRES_USER=YOUR_POSTGRES_USER_NAME
POSTGRES_PASSWORD=YOUR_POSTGRES_PASSWORD
POSTGRES_DB=YOUR_POSTGRES_DATABASE_NAME
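Inside the containers, these variables are read from the environment. A hypothetical sketch of what backend/config.py might do (the `postgres` hostname and the default values are assumptions, not the tutorial's exact code):

```python
import os

# Read the settings defined in .env from the container environment.
# Defaults here are placeholders for local experimentation only.
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY", "")
POSTGRES_USER = os.environ.get("POSTGRES_USER", "postgres")
POSTGRES_PASSWORD = os.environ.get("POSTGRES_PASSWORD", "postgres")
POSTGRES_DB = os.environ.get("POSTGRES_DB", "prescriptions")

# Inside the Compose network the database is reached by its service name
# (assumed to be "postgres" here), not by localhost.
DATABASE_URL = (
    f"postgresql://{POSTGRES_USER}:{POSTGRES_PASSWORD}@postgres:5432/{POSTGRES_DB}"
)
```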
Build and run the Docker Containers:
Navigate to the project directory and run:
cd 3-text-extractor
docker compose up --build -d
Note: You must run this from the same directory as the docker-compose.yml file.
This single command builds and starts all the containers: the frontend app, the backend API, and the PostgreSQL database.
Once everything is up:
- http://localhost:8501 serves the Streamlit frontend.
- http://localhost:8000 serves the backend API, which is responsible for handling requests and processing data using OpenAI.
- PostgreSQL runs in the postgres container and is reachable at localhost:5432 with the credentials defined in the .env file.
To check if your containers are running, use the following command:
docker ps
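With the stack up, you can also exercise the backend API directly from a script. A sketch of posting prescription text to the service follows; the `/extract` path and the `{"text": ...}` payload shape are assumptions about this backend, not documented endpoints:

```python
import json
import urllib.request

BACKEND_URL = "http://localhost:8000"  # backend port from this tutorial


def build_extract_request(text: str) -> urllib.request.Request:
    """Build a POST request for the (assumed) extraction endpoint.

    The "/extract" path and {"text": ...} body are hypothetical; check
    the backend's route definitions (e.g. in backend/main.py) for the
    real endpoint.
    """
    payload = json.dumps({"text": text}).encode()
    return urllib.request.Request(
        f"{BACKEND_URL}/extract",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_extract_request("Amoxicillin 500 mg, three times daily for 7 days")
# With the containers running, send it and read the structured JSON back:
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp))
```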
To check docker compose logs in real-time:
docker compose logs -f
To access the shell of a specific container, use:
docker exec -it <container_id_or_name> bash
For example, to enter the backend container (the exact container name comes from the output of docker ps), you can run:
docker exec -it 3-text-extractor-backend_api-1 bash
To stop and remove the containers:
docker compose down
To remove all images associated with this project:
docker compose down --rmi all