We had set up MongoDB as a Docker container for a project and hosted it on our infrastructure server. We chose to use a Docker container for ease of deployment and management. However, one morning, we discovered that the MongoDB data was gone, prompting us to investigate the cause.
Upon further investigation, we found that the ports published by the Docker container were reachable from the entire internet, even though the server was secured with UFW and configured to deny all incoming connections unless a rule was explicitly added. This happens because Docker writes its own iptables rules (in the DOCKER chain) when it publishes a port, and those rules are evaluated before UFW's, so published ports bypass the firewall entirely. This was a critical security issue, as it meant that anyone could potentially access our MongoDB instance and the sensitive data stored within it.
We discovered this by attempting to connect to the MongoDB instance from a remote machine and were surprised to find that the connection succeeded. We had also run nmap scans multiple times before, but they never revealed this particular port, which made the issue even more alarming: by default, nmap scans only the 1,000 most common ports, and MongoDB's default port (27017) is not among them. Here is the faulty config, with no host address specified, that caused the issue:
```yaml
# Other config ...
ports:
  # <Port exposed>:<DB port running inside container>
  - 27017:27017
```
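For anyone auditing their own servers for the same problem, a routine nmap run will miss this by default. A hedged sketch, assuming nmap is installed and with `db.example.com` as a placeholder for your server's address:

```shell
# nmap's default scan covers only its top 1,000 ports, which do not
# include MongoDB's 27017 - so either scan every port or name it.
# (db.example.com is a placeholder, not a real host from this article.)
nmap -p- db.example.com        # scan all 65535 TCP ports (slow but thorough)
nmap -p 27017 db.example.com   # check just the MongoDB default port
```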
To mitigate this problem immediately, we changed the docker-compose configuration to expose the MongoDB port only to the host rather than to the public internet. We did this by prefixing the exposed port with the loopback address, which meant that only applications running on the host machine could access the MongoDB instance:
```yaml
# Other config ...
ports:
  # <Port exposed>:<DB port running inside container>
  - 127.0.0.1:27017:27017
```
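The effect of that one-line change comes down to which address the listening socket is bound to. A minimal Python sketch (not the article's actual setup) illustrating the principle:

```python
import socket

# Binding to 127.0.0.1 mimics the fixed mapping "127.0.0.1:27017:27017":
# the listener accepts connections over loopback only, so remote hosts
# cannot reach it. Binding to "" / 0.0.0.0 would mimic the faulty
# "27017:27017" mapping, which listens on every interface.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))  # port 0 = pick any free port for the demo
server.listen(1)
port = server.getsockname()[1]

# A client on the same host connects fine over loopback...
client = socket.create_connection(("127.0.0.1", port), timeout=2)
print("loopback connection succeeded")
client.close()
server.close()
# ...while a connection attempt to the host's external address would be
# refused, because nothing is listening on that interface.
```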
We also recommend backing up your volumes frequently and storing the backups offsite. This is an important practice for any production environment, as it helps prevent data loss in the event of hardware failure, software bugs, or security breaches. You can follow this article written by my colleague for guidance on implementing a backup strategy for Docker volumes.
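As one common approach (a sketch, not necessarily the strategy from the linked article), a named volume can be archived with a throwaway container; `mongo_data` and the backup destination are placeholder names:

```shell
# Archive the contents of a named volume (placeholder: mongo_data)
# into a timestamped tarball in the current directory, using a
# disposable Alpine container with the volume mounted read-only.
docker run --rm \
  -v mongo_data:/data:ro \
  -v "$(pwd)":/backup \
  alpine tar czf "/backup/mongo_data-$(date +%F).tar.gz" -C /data .
```

The resulting tarball can then be shipped offsite, for example to object storage, on a schedule.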