
In my current project, I have my own mini-infrastructure, where each machine has a "watchdog" process that picks up items from a global queue and calls "docker run" with the item it picked up. Docker knows which program to start via ENTRYPOINT, and the arguments to "docker run" are passed to that program. Each docker image gets its own queue and autoscaling instance group.
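
A minimal sketch of what such a watchdog loop could look like, assuming a Redis list as the global queue, JSON-encoded argument lists as queue items, and hypothetical queue/image names (the actual queue backend and names may differ):

    import json
    import subprocess

    import redis

    QUEUE_NAME = "crawler-queue"                   # hypothetical: one queue per docker image
    IMAGE = "registry.example.com/crawler:latest"  # hypothetical image name

    r = redis.Redis()

    while True:
        # Block until an item appears on this image's queue.
        _key, raw_item = r.blpop(QUEUE_NAME)
        args = json.loads(raw_item)  # e.g. ["https://example.com", "--depth", "2"]

        # Pull first so a freshly pushed image is used on the next run,
        # then run it; docker appends args to the image's ENTRYPOINT.
        subprocess.run(["docker", "pull", IMAGE], check=False)
        subprocess.run(["docker", "run", "--rm", IMAGE, *args], check=False)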

It works well for data processing tasks - my docker images are crawlers, indexers, or analytics code in Python or R. Deployment is quite simple - just push a docker image, and it will be picked up on the next "docker run". Images can add items to the global queues, and for any bigger data they write to a shared database.



