I'm on a very slow internet connection, and the RUN pip install -r requirements.txt step of docker compose up --build keeps timing out halfway through. When I run docker compose up --build again, it looks like it restarts from the very beginning and all of the Python packages get downloaded from scratch. How can I make Docker reuse the packages downloaded during the previous attempt?

My Dockerfile:
FROM python:3.11
ENV PYTHONUNBUFFERED 1
WORKDIR /app
COPY requirements.txt ./
RUN pip install -r requirements.txt
COPY . .
CMD celery -A myapp worker -l info -Q ${CELERY_QUEUE}
Comment from PTomasz: Please provide the full Dockerfile, because it looks like you are copying all of your code and then installing the requirements from there, instead of copying requirements.txt in a separate step.
1 Answer
You could use a cache mount in the Dockerfile (link to the Docker documentation).
The cache is cumulative across builds, so you can read and write to the cache multiple times.
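In practice, that means pip's download cache under /root/.cache/pip survives an interrupted or failed build, so a retry should only have to fetch the packages that are still missing.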
The linked page has this example for Python:
RUN --mount=type=cache,target=/root/.cache/pip \
pip install -r requirements.txt
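Applied to your Dockerfile, that would look roughly like this; it is only a sketch, assuming a current Docker/BuildKit setup, and the only changes are the syntax directive and the RUN line:

# syntax=docker/dockerfile:1
FROM python:3.11
ENV PYTHONUNBUFFERED 1
WORKDIR /app
COPY requirements.txt ./
# cache pip's downloads between builds so a retry reuses already-fetched wheels
RUN --mount=type=cache,target=/root/.cache/pip \
    pip install -r requirements.txt
COPY . .
CMD celery -A myapp worker -l info -Q ${CELERY_QUEUE}

Note that cache mounts need BuildKit, which as far as I know is the default builder in recent Docker and Docker Compose releases; the cache lives on the build host rather than inside the image, so it does not end up in the final image.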