Thoughts on Docker and making it better
Docker works, and I fully recommend using it for basically everything you deploy.
That said, I think there is massive room for improvement. Layers are a pretty terrible way to define an environment.
Take this example:
FROM ubuntu:latest
RUN apt-get -y update && \
    apt-get install -y apache2 \
        git \
        golang \
        python
When you add a new dependency, Docker reruns everything on that RUN line, including re-downloading every previously installed package. And if the upstream packages changed in the meantime, your Docker image won't be the same between builds.
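A partial workaround within Docker itself is to pin the base image and package versions, so at least rebuilds are reproducible. A sketch (the version numbers are illustrative, not current):

```dockerfile
# Pinning versions makes rebuilds reproducible, though editing the RUN line
# still re-downloads everything in that layer. Versions shown are made up
# for illustration.
FROM ubuntu:22.04
RUN apt-get -y update && \
    apt-get install -y \
        apache2=2.4.52-1ubuntu4 \
        git=1:2.34.1-1ubuntu1 \
        golang=2:1.18~0ubuntu2 \
        python3=3.10.6-1~22.04
```

This fixes the reproducibility half of the problem but not the redundant-download half, which is the deeper design issue.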
A much better approach would be for your container tech to handle dependencies directly. For example:
{
    // Describes a specific point in time
    "version": "F58C0AEFBFEBFBBB4112",
    "devDependencies": ["git", "golang", "python"]
}
Here, rather than re-downloading each dependency, the container tech fetches only the ones that haven't previously been downloaded.
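As a sketch of what that fetch step could look like, here's the cache-check logic in shell. The cache directory, marker files, and dependency list are all assumptions for illustration, not any real tool's format:

```shell
#!/bin/sh
# Sketch: only fetch dependencies that are not already in a local cache.
# CACHE_DIR and the marker-file layout are hypothetical, standing in for
# a real content-addressed package store.
CACHE_DIR="${CACHE_DIR:-/tmp/dep-cache-demo}"
mkdir -p "$CACHE_DIR"

for dep in git golang python; do
    if [ -e "$CACHE_DIR/$dep" ]; then
        echo "cache hit: $dep"    # fetched on a previous run; skip
    else
        echo "fetching:  $dep"    # stand-in for the actual download
        touch "$CACHE_DIR/$dep"   # record it so the next run skips it
    fi
done
```

On a second run every dependency is a cache hit, so adding one new entry to the list costs exactly one fetch instead of re-downloading everything.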
A few notes:
- You can of course set up a cache for these dependencies so builds don't actually fetch from a remote location. I don't think most devs have that set up on their machines, though.
- Nix can pretty much do this; however, it has had a huge learning curve whenever I've tried it.