Running Laravel Queues in Docker
Set up queue workers in a Dockerized Laravel app.
Queues are essential for scaling Laravel applications, allowing you to defer time-consuming tasks like sending emails, processing uploads, and more. When deploying Laravel in a Dockerized environment, it's critical to set up queue workers properly so your background jobs run reliably alongside your web containers. In this guide, we’ll walk through the best practices for running Laravel queues inside Docker.
Why Use Queues in Laravel?
Laravel's queue system provides a unified API for different queue backends. Jobs such as sending verification emails run asynchronously, vastly improving user experience and application performance.
Some common use cases for queues in Laravel:
- Delayed or batch email sending
- Data processing (e.g., image resizing, video encoding)
- Third-party API requests
- Notification dispatching
Dockerizing Laravel: The Basics
A typical Laravel Docker setup includes:
- App container: Runs PHP and the Laravel code.
- Web server container: Usually Nginx or Apache.
- Database container: MySQL, PostgreSQL, etc.
- Queue container(s): Dedicated to running php artisan queue:work.
Step 1: Ensure Your Queue Driver is Set
In your .env file, specify the queue driver (e.g., redis, database, or sqs):
QUEUE_CONNECTION=redis
Make sure the backing service for your chosen queue driver runs as a Docker container (e.g., Redis, or your database container if you use the database driver).
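If you pick the Redis driver, Laravel also needs to know where Redis lives. Inside Docker, the hostname is the Compose service name, not 127.0.0.1. A minimal .env sketch, assuming the Redis service is named redis as in the compose file later in this guide:

```ini
QUEUE_CONNECTION=redis
REDIS_HOST=redis   ; the docker-compose service name, not localhost
REDIS_PORT=6379
```

If REDIS_HOST still points at localhost, the worker container will fail to connect, since each container has its own network namespace.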
Step 2: Dockerfile Configuration
Your Dockerfile should include all necessary PHP extensions and, optionally, Supervisor for process management:
FROM php:8.2-fpm

# System packages and PHP extensions
RUN apt-get update && apt-get install -y \
        supervisor \
    && docker-php-ext-install pdo pdo_mysql \
    # phpredis is needed for the Redis queue driver (unless you use predis via Composer)
    && pecl install redis \
    && docker-php-ext-enable redis

# Copy Laravel files
COPY . /var/www
WORKDIR /var/www
Tip: Supervisor is optional but recommended for managing multiple processes like queue workers and the web server in the same container.
Step 3: docker-compose.yml Example
Sample docker-compose.yml for a Laravel app with web, Redis, and queue worker services:
version: '3.8'

services:
  app:
    build:
      context: .
      dockerfile: Dockerfile
    volumes:
      - .:/var/www
    environment:
      - QUEUE_CONNECTION=redis
    depends_on:
      - redis

  web:
    image: nginx:alpine
    ports:
      - "8080:80"
    volumes:
      - .:/var/www
      - ./nginx/default.conf:/etc/nginx/conf.d/default.conf
    depends_on:
      - app

  redis:
    image: redis:alpine
    ports:
      - "6379:6379"

  queue:
    build:
      context: .
      dockerfile: Dockerfile
    volumes:
      - .:/var/www
    command: php artisan queue:work --tries=3 --sleep=2
    depends_on:
      - redis
      - app
- The queue service runs a dedicated worker, separate from your main PHP application container.
- You may spin up multiple queue services for different priorities or queues.
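For example, a second worker service dedicated to a high-priority queue could look like the sketch below; the emails queue name is an assumption for illustration:

```yaml
  queue-emails:
    build:
      context: .
      dockerfile: Dockerfile
    volumes:
      - .:/var/www
    # --queue lists queues in priority order; this worker drains 'emails' first
    command: php artisan queue:work --queue=emails,default --tries=3 --sleep=2
    depends_on:
      - redis
      - app
```

Jobs are routed to a named queue at dispatch time (e.g., with Laravel's onQueue('emails')), so the worker and the dispatching code must agree on the queue name.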
Step 4: Managing Workers with Supervisor (Optional)
For advanced setups, use Supervisor to manage multiple workers within the same container. Example Supervisor config (supervisord.conf):
[program:queue-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/artisan queue:work --sleep=3 --tries=3
autostart=true
autorestart=true
; Give in-flight jobs time to finish before Supervisor kills the worker
stopwaitsecs=3600
numprocs=2
user=www-data
redirect_stderr=true
stdout_logfile=/var/www/storage/logs/worker.log
Update your Dockerfile and/or docker-compose to include Supervisor and its configuration, and update the container's entry point accordingly.
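A sketch of the relevant Dockerfile additions, assuming you save the config as docker/supervisord.conf in your project (the path is an assumption; adjust to your layout):

```dockerfile
# Ship the Supervisor config into the image
COPY docker/supervisord.conf /etc/supervisor/conf.d/supervisord.conf

# Run Supervisor in the foreground as the container's main process;
# Debian's supervisor package includes /etc/supervisor/conf.d/*.conf automatically
CMD ["supervisord", "-n", "-c", "/etc/supervisor/supervisord.conf"]
```

Running supervisord with -n (no daemon) keeps it in the foreground as PID 1, which is what Docker expects in order to track the container's lifecycle.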
Step 5: Running and Monitoring Your Queues
- Run your containers:
docker-compose up -d --build
- Inspect logs for your worker:
docker-compose logs -f queue
Queues will now start processing jobs as soon as they are dispatched.
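To confirm end to end that jobs are flowing, you can dispatch one from inside the app container. This is only a sketch: App\Jobs\SendWelcomeEmail is a placeholder for whatever job class your app actually defines, and the tinker --execute flag requires a recent Laravel version.

```shell
# Dispatch a job from the app container (replace the class with one from your app)
docker-compose exec app php artisan tinker --execute="App\Jobs\SendWelcomeEmail::dispatch();"

# The queue worker's logs should then show the job being processed
docker-compose logs -f queue
```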
Best Practices
- Health Checks: Add health checks for your worker containers to ensure they're always processing.
- Graceful Deployments: Use php artisan queue:restart on deploys to gracefully restart workers after code changes.
- Scaling: You can easily scale your queue processing capacity:
docker-compose up --scale queue=4 -d
- Named Queues: Run different workers for separate queues (emails, notifications, etc.).
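A simple liveness check just verifies the worker process is still alive. A sketch for the queue service, assuming pgrep is available in the image (on Debian-based images it comes from the procps package):

```yaml
  queue:
    healthcheck:
      # Fail the check if no queue:work process is running in the container
      test: ["CMD-SHELL", "pgrep -f 'queue:work' || exit 1"]
      interval: 30s
      timeout: 5s
      retries: 3
```

Note this only detects a dead process, not a stuck one; for deeper monitoring, Laravel's queue:monitor command or an external tool like Horizon (for Redis) is a better fit.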
Conclusion
Running Laravel queues in Docker is straightforward with a proper multi-service setup. By dedicating containers to queue workers, you gain flexibility and scalability for processing jobs efficiently. Containerized workers also integrate seamlessly with orchestration platforms like Docker Swarm or Kubernetes for next-level scaling.
Happy queueing! 🚀