I think the title is self-explanatory, but again: what is the benefit of using a pre script in npm's package.json (for example, prestart) over just concatenating commands with && in the start script?
{
  "prestart": "parcel build",
  "start": "nodemon server.js"
}
vs
{
  "start": "parcel build && nodemon server.js"
}
Is it more cross-platform? Can it handle two endless async processes, like two servers (build + api)? Something else?
Edit: I found a benefit for postinstall: Heroku and similar platforms delete devDependencies after npm install, so in postinstall I can run the build process before Heroku deletes the code that does it.
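A sketch of that postinstall setup, assuming Parcel is the build tool as in the snippets above (on Heroku, postinstall runs before devDependencies are pruned, so parcel is still available at that point):

```json
{
  "scripts": {
    "postinstall": "parcel build",
    "start": "node server.js"
  }
}
```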
4 Answers
prestart runs before start, as the name suggests; therefore, putting one command in prestart and another in start runs the two commands in sequence, not in parallel. Running commands in start with && also runs them sequentially, but inside the same step.
The two methods are pretty much the same, at least in terms of results. However, there might be compatibility issues with && on certain versions of Windows.
If you want to run commands in parallel, you can use & inside start instead of &&.
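For example (a sketch; parcel watch is an assumed substitute for parcel build so that both processes are long-running; note also that in Windows cmd.exe a single & separates sequential commands rather than backgrounding, so cross-platform projects often use a tool such as concurrently instead):

```json
{
  "scripts": {
    "start": "parcel watch & nodemon server.js"
  }
}
```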
In addition to the other answers, it should be noted that the prestart hook in package.json is not supported by the Yarn package manager (as of Yarn 2). So in this respect, using && allows for easier migration between npm and Yarn (provided that your shell can interpret the &&).
The reason for this is to increase maintainability by avoiding implicit dependency chains:
"In particular, we intentionally don't support arbitrary pre and post hooks for user-defined scripts (such as prestart). This behavior, inherited from npm, caused scripts to be implicit rather than explicit, obfuscating the execution flow. It also led to surprising executions with yarn serve also running yarn preserve." (https://yarnpkg.com/advanced/lifecycle-scripts)
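An explicit equivalent that runs under both npm and modern Yarn might look like this (a sketch reusing the commands from the question):

```json
{
  "scripts": {
    "build": "parcel build",
    "start": "npm run build && nodemon server.js"
  }
}
```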
These methods are mostly for clarity in code, for separating logical steps.
About compatibility: as I understand it, npm runs all scripts in the local shell, so on most Linux systems that will be some sh clone, and on Windows it will be cmd. So there may be situations where && is not supported by the shell. But that's unlikely, and do you really need to support such behaviour, considering users can install Bash on any platform Node.js runs on and configure npm to use it? I personally use Bash in npm scripts and document that in the README.
If you want to run multiple long-running processes, use something like pm2 (https://github.com/Unitech/PM2/) in production. When you're developing, it's usually helpful to run processes in multiple terminals to see the logs; use supervisor (https://github.com/petruisfan/node-supervisor) to restart processes on errors and changes.
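For example, the API server from the question could be handed to pm2 from an npm script (a sketch; server.js and the process name api are assumptions carried over from the question):

```json
{
  "scripts": {
    "start:prod": "pm2 start server.js --name api",
    "stop:prod": "pm2 stop api"
  }
}
```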
Also, I usually write .sh scripts for maintenance, like deploys and periodic but manual tasks, and run them using npm: you can add any named script to the scripts section.
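For instance (a sketch; scripts/deploy.sh is a hypothetical maintenance script):

```json
{
  "scripts": {
    "deploy": "bash ./scripts/deploy.sh"
  }
}
```

It can then be invoked with npm run deploy.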
That is because when you run a script like npm install && npm start, the shell runs both commands inside the same script step, so if one of them exits with a non-zero code (in case of an exception) it is harder to tell which command failed. Running them as separate scripts with the "pre" prefix executes them as separate npm steps, so error reporting is more accurate.