Do you guys expose the Docker socket to any of your containers, or is that a strict no-no? If you don't, what's your reasoning? And if you do, how do you justify that decision from a security standpoint?
I am still fairly new to Docker, but I like the idea of something like Watchtower. Even though I am not a fan of auto-updates and probably wouldn't use that feature, I still find it interesting to get a notification when a container needs an update. However, it needs access to the Docker socket to do its work, and I have read in a lot of places that this is a bad idea because it can effectively give a container root access to your host filesystem.
There are probably other containers as well, especially in this whole monitoring and maintenance category, that need that privilege, so I wanted to ask how other people handle this situation.
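For anyone else new to this, the part I'm talking about is literally just a bind mount of the socket into the container, roughly like this (illustrative only, based on the kind of compose file the Watchtower docs show):

    # the bind mount below is the bit everyone warns about: it gives the
    # container full access to the Docker API, which is effectively root on the host
    services:
      watchtower:
        image: containrrr/watchtower
        volumes:
          - /var/run/docker.sock:/var/run/docker.sock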
Cheers!
That is exactly why I wouldn't use the auto-update feature. I only thought about setting it up to check for updates and send me some sort of notification. A reminder every now and then helps me keep everything up to date and avoid slipping into a "never change a running system" mentality.
Your idea of setting it up and only letting it run occasionally is definitely one to consider. It would at least save me from manually checking each container's releases, similar to /u/InnerScientist's RSS suggestion.
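Something like this is what I had in mind: notify only, never touch the containers, and only check on a schedule. Untested, and I'm going off my reading of the Watchtower docs, so double-check the variable names before relying on it:

    services:
      watchtower:
        image: containrrr/watchtower
        volumes:
          - /var/run/docker.sock:/var/run/docker.sock
        environment:
          - WATCHTOWER_MONITOR_ONLY=true      # only report available updates, don't pull or restart anything
          - WATCHTOWER_SCHEDULE=0 0 9 * * 1   # six-field cron: check once a week, Monday at 09:00
          # plus the WATCHTOWER_NOTIFICATIONS settings for whatever notification service you use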
To be honest, you would get notifications for updates far more often than you need as a reminder. If you're like me, you'll just end up ignoring them anyway! There are a lot of small updates to a lot of software, most of them not security-related but simply part of projects being actively developed. I update every week if I can, but sometimes it slips to a couple of weeks, at which point I start to feel "guilty", so when it builds up I know I have to do it.
Fair point. It is probably best to keep it simple. I can always set up a reminder in my calendar twice a month if I really have to.