
python - I have Celery tasks that I want to run using supervisord in my Dockerfile, and I want to deploy them to an Azure Container App


I want to run 4 Celery workers in a single container in Azure, but my container keeps restarting. I was using supervisord to run multiple workers in one container, and referencing its config in my Dockerfile. But the images built by my GitHub Actions workflow aren't deploying properly to my container. Can someone help?

`# Dockerfile
FROM python:3.11-slim

# Install system dependencies
RUN apt-get update && \
    apt-get install -y --no-install-recommends \
    supervisor \
    build-essential \
    && rm -rf /var/lib/apt/lists/*

# Create non-root user
RUN useradd -m -u 1001 appuser

# Set working directory
WORKDIR /app

# Copy requirements first for caching
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy application files
COPY . .

# Configure supervisord
COPY supervisord.conf /etc/supervisor/conf.d/supervisord.conf
RUN chown appuser:appuser /etc/supervisor/conf.d/supervisord.conf

# Set permissions
RUN chown -R appuser:appuser /app
USER appuser

# Application port
EXPOSE 80

# Start command
CMD ["/usr/bin/supervisord", "-n", "-c", "/etc/supervisor/conf.d/supervisord.conf"]`

And the supervisord.conf:

`# supervisord.conf
[supervisord]
nodaemon=true
logfile=/var/log/supervisord.log
logfile_maxbytes=50MB
loglevel=info

[program:web]
command=gunicorn main:app --workers 2 --worker-class uvicorn.workers.UvicornWorker --bind 0.0.0.0:80
autostart=true
autorestart=unexpected
startretries=3
user=appuser
stdout_logfile=/dev/stdout
stderr_logfile=/dev/stderr

[program:users_worker]
command=celery -A app.tasks worker -Q users_tasks -n users_worker@%%h --loglevel=info --without-heartbeat --prefetch-multiplier=1
autostart=true
autorestart=true
user=appuser
stdout_logfile=/dev/stdout
stderr_logfile=/dev/stderr

[program:feed_worker]
command=celery -A app.tasks worker -Q feed_tasks -n feed_worker@%%h --loglevel=info --without-heartbeat --prefetch-multiplier=1
autostart=true
autorestart=true
user=appuser
stdout_logfile=/dev/stdout
stderr_logfile=/dev/stderr

[program:providers_worker]
command=celery -A app.tasks worker -Q service_providers_tasks -n providers_worker@%%h --loglevel=info --without-heartbeat --prefetch-multiplier=1
autostart=true
autorestart=true
user=appuser
stdout_logfile=/dev/stdout
stderr_logfile=/dev/stderr

[program:celery_beat]
command=celery -A app.tasks beat --loglevel=info --scheduler redbeat.RedBeatScheduler
autostart=true
autorestart=true
user=appuser
stdout_logfile=/dev/stdout
stderr_logfile=/dev/stderr`

asked Feb 4 at 3:24 by Sean STC Churu

  • Do you see errors like ImagePullBackOff, CrashLoopBackOff, or Celery connection failures? Could you provide the Azure logs? – Suresh Chikkam, Feb 4 at 3:40
  • @Sean STC Churu If possible, could you share your GitHub repository? – Aslesha Kantamsetti, Feb 25 at 5:13

1 Answer

To resolve the issue, first check the container logs to see whether the Celery workers are running properly or whether there's a problem with supervisord.conf.

Run one of the below commands to check the logs:

`docker logs <container_id>`

`az container logs --name <container-name> --resource-group <resource-group>`
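A common cause of restart loops with this kind of setup is the supervisord config itself: the Dockerfile switches to a non-root user (`USER appuser`), but the config writes its main log to `/var/log` (which `appuser` usually cannot write to), uses `user=` directives in the program sections (which supervisord can only honor when it runs as root), and points program logs at `/dev/stdout` without disabling rotation (supervisord cannot rotate a device file and errors out unless `*_logfile_maxbytes=0` is set). A minimal sketch of a supervisord.conf adjusted for those issues, reusing the program names and commands from the question — one worker shown, the other workers and beat follow the same pattern:

```ini
[supervisord]
nodaemon=true
; Don't write to /var/log, which a non-root user typically cannot write to.
logfile=/dev/null
logfile_maxbytes=0
loglevel=info

[program:web]
command=gunicorn main:app --workers 2 --worker-class uvicorn.workers.UvicornWorker --bind 0.0.0.0:80
autostart=true
autorestart=unexpected
startretries=3
; No user= here: the Dockerfile already runs everything as appuser, and
; supervisord can only switch users when it is started as root.
stdout_logfile=/dev/stdout
; maxbytes=0 disables rotation; supervisord cannot rotate /dev/stdout and
; fails at startup without this.
stdout_logfile_maxbytes=0
stderr_logfile=/dev/stderr
stderr_logfile_maxbytes=0

[program:users_worker]
command=celery -A app.tasks worker -Q users_tasks -n users_worker@%%h --loglevel=info --without-heartbeat --prefetch-multiplier=1
autostart=true
autorestart=true
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
stderr_logfile=/dev/stderr
stderr_logfile_maxbytes=0
```

If the logs show supervisord itself exiting right after start (rather than a single program crashing), these config-level problems are the first thing to rule out.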

I created a FastAPI app with Celery tasks as per the requirement and then built it using Docker.

 ✔ web                                      Built                                               0.0s 
 ✔ Container fastapi-celery-docker-redis-1  Runnin...                                           0.0s 
 ✔ Container fastapi-celery-docker-web-1    Recreate...

Then I created an ACR and pushed my image into the ACR repository.

Run the below commands to push the image to the Azure Container Registry:

`az login`

`az acr login --name <ACR_NAME>`

`docker tag fastapi-celery-docker-web:latest <ACR_NAME>.azurecr.io/fastapi-celery-docker-web:latest`

`docker push <ACR_NAME>.azurecr.io/fastapi-celery-docker-web:latest`

Now, I have created an Azure Container App, configured it with ACR, and set up deployment from my GitHub repository.

I've successfully deployed to the Azure Container App via GitHub Actions.
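The GitHub Actions side can be sketched roughly as below. This is a minimal workflow, not the exact one used here: the registry name, resource group, container app name, and the `AZURE_CREDENTIALS` secret are placeholders you would substitute with your own values.

```yaml
name: deploy-to-container-app
on:
  push:
    branches: [main]

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Service-principal credentials stored as a repository secret.
      - uses: azure/login@v2
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}

      # Build in ACR so the runner doesn't need a local Docker daemon setup.
      - name: Build and push image to ACR
        run: |
          az acr build --registry <ACR_NAME> \
            --image fastapi-celery-docker-web:${{ github.sha }} .

      # Point the Container App at the freshly built image tag.
      - name: Deploy to Azure Container App
        run: |
          az containerapp update --name <container-app-name> \
            --resource-group <resource-group> \
            --image <ACR_NAME>.azurecr.io/fastapi-celery-docker-web:${{ github.sha }}
```

Tagging with `${{ github.sha }}` instead of `latest` makes each deployment traceable to a commit and avoids stale-image caching issues on redeploy.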

Output: (screenshot of the successful deployment omitted)
