When a relative path is expanded on a Windows platform, it expands to
include the drive, e.g. C:\, which was causing a ConfigError:
parse_volume_spec splits on ":", and the drive letter produced too many parts.
Use os.path.splitdrive instead of calculating the drive manually.
This handles Windows drives as part of the volume path better than
doing it ourselves.
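A minimal sketch of the idea, assuming a simplified parse_volume_spec; the real function has more validation and raises compose's ConfigError rather than ValueError:

```python
import os.path


def parse_volume_spec(volume_config):
    # Peel off a drive prefix (e.g. "C:") so the ":" inside a Windows path
    # isn't mistaken for a volume separator. On a Windows host os.path is
    # ntpath, so splitdrive understands drive letters; elsewhere it returns
    # an empty drive and this step is a no-op.
    drive, rest = os.path.splitdrive(volume_config)

    parts = rest.split(':')
    if len(parts) > 3:
        # The real code raises compose's ConfigError here.
        raise ValueError("Volume %s has incorrect format" % volume_config)

    # Re-attach the drive to the external (host) part of the spec.
    parts[0] = drive + parts[0]
    return parts
```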
Signed-off-by: Mazz Mosley <mazz@houseofmnowster.com>
The concurrent.futures backport doesn't play well with
KeyboardInterrupt, so I'm using Thread and Queue instead.
Since thread pooling would likely be a pain to implement, I've just
removed `COMPOSE_MAX_WORKERS` for now. We'll implement it later if we
decide we need it.
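A minimal sketch of the Thread-and-Queue approach, in Python 3 syntax, with illustrative names (parallel_execute, operation) rather than compose's exact API:

```python
from queue import Empty, Queue
from threading import Thread


def parallel_execute(objects, operation):
    """Run `operation` against each object on its own thread and collect
    the outcomes over a Queue."""
    results = Queue()

    def worker(obj):
        try:
            results.put((obj, operation(obj), None))
        except Exception as e:  # report failures back to the main thread
            results.put((obj, None, e))

    for obj in objects:
        Thread(target=worker, args=(obj,), daemon=True).start()

    done = 0
    while done < len(objects):
        try:
            # Polling with a timeout keeps the main thread responsive, so
            # Ctrl-C (KeyboardInterrupt) is delivered promptly instead of
            # being swallowed the way the futures backport tended to do.
            obj, result, error = results.get(timeout=1)
        except Empty:
            continue
        done += 1
        if error is not None:
            print("Operation on %s failed: %s" % (obj, error))
```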
Signed-off-by: Aanand Prasad <aanand.prasad@gmail.com>
There's a significant speed improvement from having more workers. This
value still shouldn't cause anyone's machine to melt or explode.
Signed-off-by: Mazz Mosley <mazz@houseofmnowster.com>
Commands able to use this parallelisation are `stop`, `kill` and `rm`.
We're using a function backported from Python 3, which lets us make the
most of a pool of threads without having to write the low-level code
for managing it ourselves.
The default number of threads is low enough that it shouldn't cause
performance problems, but anyone who knows the capability of their
system and wants to increase it can do so via the `DEFAULT_MAX_WORKERS`
environment variable.
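A minimal sketch, assuming the backported function is concurrent.futures.ThreadPoolExecutor (the backport referred to in the earlier note); the function name and the default of 5 workers are illustrative, not compose's actual values:

```python
import os
from concurrent.futures import ThreadPoolExecutor, as_completed

# Low default so a modest machine isn't overwhelmed; overridable via the
# DEFAULT_MAX_WORKERS environment variable. The value 5 is a placeholder.
DEFAULT_MAX_WORKERS = int(os.environ.get('DEFAULT_MAX_WORKERS', 5))


def parallel_execute(containers, operation):
    """Apply `operation` (e.g. a container's stop/kill/remove method) to
    each container on a shared thread pool."""
    with ThreadPoolExecutor(max_workers=DEFAULT_MAX_WORKERS) as executor:
        futures = {executor.submit(operation, c): c for c in containers}
        for future in as_completed(futures):
            container = futures[future]
            error = future.exception()
            if error is not None:
                print("Failed to process %s: %s" % (container, error))
```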
Signed-off-by: Mazz Mosley <mazz@houseofmnowster.com>