
I'm on board with this answer, and it makes sense to me, but it's competing in my head with the idea of always being able to do a clean build from master that could deploy whenever. Let's say you have a catastrophic failure and lose your binary repo and need to rebuild (and your backups, etc. - it's an example) - if your dependencies have changed in the meantime, you can't redeploy to where you were. Obviously that scenario should be well prevented, but that doesn't feel like a legitimate excuse to not have some safeguard in place to make sure your build environment doesn't change on you.



"but that doesn't feel like a legitimate excuse to not have some safeguard in place to make sure your build environment doesn't change on you."

If your build process downloads anything from the Internet, your build environment can change on you. Networks can always fail. At best you can detect this occurring and have your build process scream and die, but it can't be fixed.

Note the complete lack of the word "Go" in the previous paragraph. If other languages provide package managers that make it easy to download code from the Internet during your build process, they are tempting you to do wrong things. You should be in personal possession of everything you need to do a full build, whatever that may be (source or otherwise). And, again, note the complete lack of the word "Go" in that sentence.

This has nothing to do with Go. If you are depending on package managers to download your code from external sites during a build, stop reading this right now and GO FIX THAT. Seriously. Someday it will bite you, hard.

And may Turing & Knuth help you if you're downloading packages with unpinned versions that can just upgrade to whatever they want, whenever they want.

Now I will actually use "Go" in a sentence. The fact that Go expects you to vendor your source I do not consider a problem with the language. The fact that it fails to ship with a blessed tool for doing so is a potentially legitimate criticism, but the fact that it fails to ship with a tool that begs you to do the easy, wrong thing is a feature, not a bug. I wouldn't be surprised if they eventually bless something out of the community; godep seems to be the current leader. Personally I can't recommend or anti-recommend any of them; I've just been vendoring by hand, since I'm only using a handful of libraries and the cost/benefit hasn't worked out in favor of learning a tool to save me mere minutes of effort. I know where to find solutions if that ratio ever changes.
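
To make the hand-vendoring concrete, here's a minimal sketch using Go modules - which postdate this thread; at the time the same effect came from godep or from copying libraries into your own tree by hand. The module path and versions are illustrative, not recommendations:

    // go.mod - every dependency pinned to an exact version and checked in
    module example.com/myservice

    go 1.21

    require (
        github.com/gorilla/mux v1.8.0
        golang.org/x/text v0.14.0
    )

Running "go mod vendor" copies exactly those versions into a vendor/ directory that you commit alongside your own code, and "GOFLAGS=-mod=vendor GOPROXY=off go build ./..." then builds from that local copy only - screaming and dying, rather than quietly re-downloading, if anything is missing.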


Since I can fairly reliably type "mvn compile" to build five-year-old Java projects, I think it's fair to say that the Go community is in a dependency hell of its own making.

Should every library author vendor their dependencies separately too?


I'm not in a dependency hell. I won't be if I add libraries, either, because there are tools for that.

"Should every library author vendor their dependencies separately too?"

Whatever it takes to produce your end artifact should be under the control of the relevant organization. Again, let me emphasize, this has nothing to do with Go, and you violate it at your sole and singular peril. When the $PACKAGE site goes down due to a DoS attack, a security breach, or sheer time-based neglect, being angry at me won't help you any.


What you may not get is that you're likely doing manually what a package manager does automatically. A program should do the work, not the programmer. Imagine there is a bug in the current package. Well, a package manager can invite you to download a newer version of the lib with no breaking changes, automatically, guided by nothing more than a bit of metadata. Why would you not want to automate that kind of thing?


A package manager can't run my tests and confirm that my code doesn't break with the newer version of the lib (just because it supposedly has no breaking changes doesn't mean it works.... there are a surprisingly large number of behaviors that consumers can rely on that go way beyond just the provided API).

So all you're really automating is me going to the package's homepage and seeing if there's a new version. I'd still have to read the changelog to see whether the bug that concerns me is fixed and what other changes are included, and then run all my tests (and possibly write new ones). This is automating the easiest part of upgrading a dependency.
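
For concreteness, here's roughly what that division of labor looks like with a current Go toolchain (a sketch; "-u=patch" restricts updates to patch releases of things you already depend on):

    go get -u=patch ./...   # the automated, easy part: pick up the newest patch releases
    go mod vendor           # refresh the local vendor/ copy to match
    go test ./...           # the part no package manager can do: prove nothing actually broke

Reading the changelog and deciding whether the upgrade is worth taking is still entirely on you.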


A lot of the work I do involves legacy Ruby software. Fortunately there has always been (well, since Ruby got popular) Rubygems, which is the equivalent of gopkg.in - though I have been burned a couple of times when authors have removed old libraries. For the last few years Bundler (http://bundler.io/), which uses Rubygems, has been the standard way to lock versions of libraries, and before that vendoring was widely used.

When you pick up the maintenance of an old application, you have to reinstall the dependencies from scratch if you aren't using vendoring or Bundler. Each method has its advantages and disadvantages, but both are much better than praying that the library author hasn't introduced a breaking change in a release since the library was last used.

Although deploying Go applications isn't quite the same, Go development has exactly the same issues as Ruby if you don't lock the dependencies.
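
For what locking looks like on the Go side today (modules postdate this thread), go.mod records the required versions and go.sum records checksums of their contents, playing the same role as Bundler's Gemfile and Gemfile.lock. A sketch of a go.sum excerpt, with an illustrative module and the checksums elided:

    github.com/gorilla/mux v1.8.0 h1:<checksum of the module contents>
    github.com/gorilla/mux v1.8.0/go.mod h1:<checksum of its go.mod file>

A fresh checkout then builds against exactly the versions the application was developed with, instead of whatever happens to be newest.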





