Running parallel commands with npm on Windows?

I am trying to configure a script in package.json that runs all my different watchers for CoffeeScript / Sass / etc.

This is a script that I use on my server that works great.

"dev": "coffee --watch --compile js/ & coffee --watch --compile controllers/ & supervisor -e html,js js/index.js", 

When I try to use the same script locally, it only seems to run the first command; Windows doesn't seem to know what to do with &. Each command works fine when run individually, but they will not run together.


For Windows, the closest built-in equivalent to & is the start /B command prefix. That way you can just set "dev" to a small node script that uses the built-in child_process module to do something like:

 var exec = require('child_process').exec;

 // On Windows, "start /B" runs a command in the background without opening a
 // new window, roughly like & on POSIX shells
 var prefix = (process.platform === 'win32' ? 'start /B ' : '');

 exec(prefix + 'coffee --watch --compile js/');
 exec(prefix + 'coffee --watch --compile controllers/');
 exec(prefix + 'supervisor -e html,js js/index.js');

I made concurrently just for this purpose. For example, cat a & cat b can be achieved with concurrently 'cat a' 'cat b' . It also provides several output-formatting options for convenience when running commands in parallel.

Install it as a dev dependency with npm install --save-dev concurrently , and then it is ready for use in package.json scripts.
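As a sketch of how this might look in package.json, using the commands from the question (the watcher script names here are my own, not from the answer):

```json
{
  "scripts": {
    "watch:js": "coffee --watch --compile js/",
    "watch:controllers": "coffee --watch --compile controllers/",
    "serve": "supervisor -e html,js js/index.js",
    "dev": "concurrently \"npm run watch:js\" \"npm run watch:controllers\" \"npm run serve\""
  }
}
```

Splitting each watcher into its own named script also lets you run any one of them on its own during debugging.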


npm-run-all is, from what I have seen, the most popular package for this. It adds run-s (serial) and run-p (parallel) executables to your project's bin, which makes them suitable for use in your project's scripts section.

For the following package.json scripts:

 "scripts": {
   "start": "npm run build -- --watch",
   "prebuild": "rimraf lib dist",
   "build:dist": "rollup -c --sourcemap inline --environment NODE_ENV:production",
   "build": "babel src -d lib --ignore __tests__,__mocks__",
   "preversion": "npm run build && npm run build:dist",
   "test": "jest"
 }

If you wanted to convert the preversion script above to npm-run-all , you could shorten it to run-s build build:dist . If you wanted the two to run in parallel instead of in series, you would use run-p build build:dist . It has options for error handling, for passing arguments through to all scripts, and it works well cross-platform.
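For instance, the preversion script from the listing above might be rewritten like this (a sketch; the other scripts stay as they were):

```json
{
  "scripts": {
    "build": "babel src -d lib --ignore __tests__,__mocks__",
    "build:dist": "rollup -c --sourcemap inline --environment NODE_ENV:production",
    "preversion": "run-s build build:dist"
  }
}
```

Swapping run-s for run-p in that line would run build and build:dist in parallel instead.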


Improved architecture

Recently, I have been breaking my projects into micro-modules. Whenever I start running into the problem you are facing, it is a sign that my project has grown too large. Large projects are convenient for finding everything and for keeping a single version per release, but they lead to build and deployment failures. In a large repository you end up organizing separate builds for things like:

  • sass / postcss bundling
  • node libraries / utility files (Babel)
  • set of client applications (Rollup / Webpack)
  • executable / deployment
  • CI (unit tests / code coverage)
  • documentation
  • Multiply by ~2 for your dev / production builds.
  • Multiply by ~2 again for cross-platform support. (good luck with NODE_ENV )

This will turn your package.json into a disaster.

To fix these problems while still keeping good version control and coordination, I use Lerna ( npm i -g lerna@prerelease ) per project. It sets up a monorepo with a packages/ directory that contains all the npm packages of your project. Running lerna bootstrap links together all the packages that depend on each other, and lerna run start then runs your npm start script in every package that defines one. lerna run executes in parallel by default, but can run sequentially with --concurrency=1 . I have yet to find a project too small to warrant Lerna, and it makes breaking out small packages less cumbersome.
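For reference, a minimal lerna.json for such a monorepo might look like the following (a sketch; the exact version string is an assumption for the Lerna prerelease era the answer mentions):

```json
{
  "lerna": "2.0.0",
  "packages": ["packages/*"],
  "version": "independent"
}
```

"version": "independent" lets each package in packages/ keep its own version number rather than sharing one repo-wide version.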


create-react-app is a great example of a project that is moving the ecosystem toward modular design. It has a simple surface: 3 scripts ( start , test , build ) and a 4th eject script, whose only use is to dump the underlying modular build system into your project directory. It is extremely fast, its hot reloading is great, and while you have fewer choices, that matters less than commonly assumed. You are actually in much better shape not ejecting and staying on the maintained build system, which will keep receiving reliable updates. Grow your system horizontally through modules, not exponentially in all directions.

In an attempt to apply this design to the many build systems not covered by create-react-app (which is fine), I created the noderaider/modular lerna repo. It works basically the same way as create-react-app , but targets CLI / API module scaffolding packages that play well with Lerna and stay upstream of create-react-app . I publish nightlies under the create-<target>-module naming convention. Each of these packages can be run from CLI scripts, from package.json , or orchestrated through its node API. It uses yarn for installation if found on the path and falls back to npm otherwise. It has valid working targets for webpack 2, rollup, postcss, and CLI packages, along with unit tests and code coverage. I am currently working on making the modular scripts even more consistent with Lerna / create-react-app and on finishing the remaining modules in the roadmap. PRs / feature requests welcome. Expect each package to take you quickly from nothing to a buildable, testable, published package with travis-ci integration and a quick path to 1.0.0 per semver.

TL;DR: Use Lerna and modular packages to get rid of these problems.


The problem with mscdex's answer is that there is no easy way to kill these background tasks once they are running. You have to go to the Task Manager and kill them there.

Instead, the easiest way to start two tasks is to simply open two command windows.

ex: "dev": "start coffee --watch --compile js/ && start coffee --watch --compile controllers/ && start supervisor -e html,js js/index.js"

If you run "npm run dev", a command window opens for each of these processes, and each can be stopped individually. You do not need the /B switch. I use "start": "start webpack --watch && start reload -b" to run webpack and reload together.


There is also a nice node package called parallelshell . Install it with:

npm install --save-dev parallelshell

Then use it like this:

parallelshell "command 1" "command 2" "command 3"

The advantages of this approach (described in more detail at the link above) over command 1 & command 2 & command 3 include: it is cross-platform; Ctrl-C terminates all three processes; and if one process dies, they all die, as opposed to the behavior with & .
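Applied to the commands from the question, the package.json entry might look like this (a sketch):

```json
{
  "scripts": {
    "dev": "parallelshell \"coffee --watch --compile js/\" \"coffee --watch --compile controllers/\" \"supervisor -e html,js js/index.js\""
  }
}
```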


Source: https://habr.com/ru/post/975102/
