Our Icecast streaming Kirtan server currently runs on Google Compute Engine on a g1-small machine (1 vCPU, 1.7 GB memory), which is enough to serve 70 simultaneous listeners. On average we have almost 10’000 unique listeners per month. We could of course scale manually by moving to bigger machine types, but since the devops community is completely hooked on Docker, we also wanted to learn about this technology.
This will not only give us more flexibility with deployment and scaling, but also make onboarding new developers very lightweight: they can install Icecast and Liquidsoap on their local machines for testing and for developing new components, such as our new mobile apps and the RESTful APIs.
The Dockerfile is on GitHub and looks like this:
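(For orientation, a minimal Icecast Dockerfile along these lines might look as follows. This is a sketch, not the file from our repository: the base image, package names, and config path are assumptions, so check GitHub for the authoritative version.)

```dockerfile
# Sketch of a minimal Icecast image (assumed structure; see our GitHub
# repository for the real Dockerfile).
FROM debian:jessie

# Install Icecast and clean up the apt cache to keep the image small
RUN apt-get update && \
    apt-get install -y icecast2 && \
    rm -rf /var/lib/apt/lists/*

# Hypothetical config file baked into the image at build time
COPY icecast.xml /etc/icecast2/icecast.xml

# Icecast's default listener port
EXPOSE 8000

# Run as the unprivileged icecast2 user created by the package
USER icecast2
CMD ["icecast2", "-c", "/etc/icecast2/icecast.xml"]
```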
You could build a new image based on that Dockerfile, or you can simply pull and run ours:

docker pull kirtan/icecast
docker run -p 8000:8000 kirtan/icecast
It’s not yet possible to pass custom credentials to docker run, but once the Liquidsoap container is ready and both components work together properly, we will work on that, too.
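One likely approach, sketched below, is an entrypoint script that reads the passwords from environment variables and writes them into icecast.xml before starting the server. The variable names, defaults, and file layout here are our own assumptions, not part of the current image, and the config is reduced to the authentication block for brevity.

```shell
# Sketch of a container entrypoint that could inject credentials at start
# time (variable and file names are assumptions, not part of the current
# image). A real entrypoint would finish with: exec icecast2 -c icecast.xml

# Fall back to placeholder defaults if no credentials are passed via -e
ICECAST_SOURCE_PASSWORD="${ICECAST_SOURCE_PASSWORD:-hackme}"
ICECAST_ADMIN_PASSWORD="${ICECAST_ADMIN_PASSWORD:-hackme}"

# Write the authentication section of the config (the real icecast.xml
# has many more settings; this sketch keeps only the credentials)
cat > icecast.xml <<EOF
<icecast>
  <authentication>
    <source-password>${ICECAST_SOURCE_PASSWORD}</source-password>
    <admin-user>admin</admin-user>
    <admin-password>${ICECAST_ADMIN_PASSWORD}</admin-password>
  </authentication>
</icecast>
EOF

echo "Wrote credentials into icecast.xml"
```

With such an entrypoint, custom credentials could then be supplied as `docker run -e ICECAST_SOURCE_PASSWORD=... -e ICECAST_ADMIN_PASSWORD=... kirtan/icecast`.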
As we’re new to Docker, we took inspiration from moul’s Icecast Dockerfile. However, he installs from the distribution’s default package sources, so you will end up with Icecast 2.3 instead of the latest 2.4 if you use his build.