I have set up a pipeline on GitLab to automatically deploy my Node.js + Express app. It uses docker-compose to create four containers (the app, MariaDB, Redis and phpMyAdmin).
The app container shuts down on creation because Express is not found, so I had to run npm install on the deployment server and then docker build.
After adding Redis, I got the same problem: Error: Cannot find module 'redis'
Both express and redis are explicitly listed in package.json and package-lock.json, but they are not installed.
Here is my .gitlab-ci.yml:
# Define the pipeline stages
stages:
  - install
  - test
  - sast
  - deploy

# Global Variables
variables:
  NODE_ENV: production
  SAST_EXCLUDED_PATHS: "node_modules,tests"

# Install dependencies
install_dependencies:
  stage: install
  image: node:22.13
  script:
    - echo "Installing dependencies..."
    - npm install --frozen-lockfile
  cache:
    key: ${CI_COMMIT_REF_SLUG}
    paths:
      - node_modules/
  artifacts:
    paths:
      - node_modules/

# Run tests
run_tests:
  stage: test
  image: node:22.13
  script:
    - echo "Running tests..."
    - npm test
  allow_failure: false
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"' # Run tests only on the main branch

# Static Application Security Testing (SAST)
include:
  - template: Security/SAST.gitlab-ci.yml

# Deploy the application to the VPS
deploy:
  stage: deploy
  image: node:22.13
  before_script:
    - echo "Setting up SSH..."
    - mkdir -p ~/.ssh
    - echo "$DEPLOY_SSH_PRIVATE_KEY_B64" | base64 -d > ~/.ssh/id_rsa
    - chmod 600 ~/.ssh/id_rsa
    - ssh-keyscan -H ******** >> ~/.ssh/known_hosts
  script:
    - ssh -i ~/.ssh/id_rsa ******** "
      cd ~/******** &&
      git pull origin main &&
      sudo docker compose down &&
      sudo docker compose pull &&
      sudo docker compose up -d"
  environment:
    name: production
    url: ********
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"' # Deploy only on the main branch
Here is my Dockerfile:
# Use official Node.js image from Docker Hub
FROM node:22
# Set working directory in the container
WORKDIR /usr/src/app
# Copy package.json and package-lock.json first (to leverage Docker cache)
COPY package*.json ./
# Install dependencies using frozen lockfile to ensure consistent versions
RUN npm ci
# Copy the rest of the app's code
COPY . .
# Expose the port the app will run on
EXPOSE 3000
# Command to start the app
CMD ["node", "server.js"]
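One thing worth checking alongside this Dockerfile: if the host's (stale or empty) node_modules directory is part of the build context, `COPY . .` will copy it over the fresh install that `npm ci` just produced. A .dockerignore next to the Dockerfile prevents that; a minimal sketch:

```
node_modules
npm-debug.log
.git
```

With this in place, the only node_modules inside the image is the one `npm ci` created.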
Because the project uses GitLab CI/CD for automatic deployment, having to manually run npm install on the server is a problem. Any suggestions?
Asked 2 days ago by medk

1 Answer
If you have a node_modules anonymous volume among the volumes in your docker-compose.yml, the problem is likely that this volume is being mounted over the dependencies installed in the image. Remove it, and the dependencies should install correctly.
services:
  app:
    build: .
    volumes:
      - .:/usr/src/app
      # Remove the next line (or the similar one in your file)
      - /usr/src/app/node_modules
    command: sh -c "npm install && node server.js"
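Note that after removing the volume line, the old anonymous volume may still linger: `docker compose down` without flags preserves volumes. A sketch of a clean restart (caution: `-v` also deletes the MariaDB and Redis data volumes, so only do this if that data is disposable or backed up):

```
# Stop the stack and delete its named *and* anonymous volumes
# (WARNING: this also removes the MariaDB/Redis data volumes)
docker compose down -v
# Rebuild the app image so npm ci runs against the current lockfile
docker compose build app
# Recreate the containers from the fresh image
docker compose up -d
```

If the database data must survive, `docker compose rm -s -v app` removes only the app container and the anonymous volumes attached to it, leaving named volumes intact.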
This article might also be helpful if you still want to mount node_modules for some reason.
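To confirm the mount is the culprit, you can compare what the built image contains with what the running container actually sees (the service name `app` and the image name below are assumptions; check `docker compose ps` and `docker images` for yours):

```
# What the built image contains (no compose volumes attached);
# "myproject-app" is a hypothetical image name
docker run --rm myproject-app ls /usr/src/app/node_modules
# What the running container sees, with the volumes mounted
docker compose exec app ls /usr/src/app/node_modules
```

If the first listing shows express and redis but the second does not, the volume is shadowing the image's install.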
node_modules directory, and if you do that, changes in package.json will be completely ignored even if you change the image; the volume content will take precedence. – David Maze, commented 2 days ago