
Just wondering, does Go finally have a package manager with proper versioning of modules?

Edit: Somebody is heavily downvoting me on every reply, and I would love to know why. I like Go and have wanted to get into it for a few months, but the missing package manager held me off. So what is wrong with my question?




You really don't need a package manager for Go. Honestly, the main reason you need one for Node, Python, etc. is because you're deploying source code to your production environment. You don't do that with Go. You deploy a binary, which has everything wrapped up in itself. Think of it as a self-contained virtualenv that requires zero effort on your part :)

In Go, code is packaged via source control repos. If you want a different version, you need a different repo. There are some tools to mitigate this, so that you can get away with a different branch per version (see http://gopkg.in), which can make it a little nicer for the maintainer to deal with multiple branches. To the consumer, it still looks like a different repo (i.e. every different version is a different URL to a repo).

Note that because Go is a compiled language, the only time versioning comes into play is at development time. As long as your developers are using the right version, you never have to worry about versions during deployment.

Now, if you're concerned that your project depends on code repos from external sources, then there are several tools that can insulate you from third parties doing stupid stuff. The most complete and best one is called godep, which basically has two modes - reliable and paranoid. The reliable mode insulates you from changes made to a repo you depend on by recording the exact VCS revision your project is using and always checks out that commit. That way, if an upstream repo checks in a breaking change, it won't affect you (and you control when you update to newer commits in that repo). Paranoid mode actually copies all the code from your dependencies into your own repo. This insulates you from upstream repos getting deleted. And of course, this gives you the ability to test out upstream changes before integrating them into your project. The tool makes it easy to copy all of your dependencies at once with a single command, so it's not a huge hassle to do (which it would be if you had to do it by hand).


> You really don't need a package manager for Go.

"You really don't need X for Go" is becoming the Go community catchphrase. And they're right: I won't need anything for Go because I wouldn't use a language whose community actively resists improvement.

I could respect that Go doesn't have a package manager, generics, etc., if the Go community took more of a "if you want it, develop it" approach. But instead, the core development team has made it unclear whether they would even accept some of these features. I get that compilation performance is a high priority, but they've missed the big picture, that developer time also matters.

The following exemplifies the problem:

> Note that because Go is a compiled language, the only time versioning comes into play is at development time.

Oh, that won't affect me at all; development time is only like 90% of the life of a project.


The community does not resist improvement. The community asks that you acquire some experience, work with what is there, make sure that you understand it, then suggest improvements.

The community definitely does resist people who aren't even users loudly declaring exactly what it is that Go needs. I fear the community may have instinctively built up certain calluses that I'm not a big fan of, such as the near inability to have a sensible discussion about generics within the community due to the near inability to have a sensible discussion about generics outside of the community. It isn't all necessarily hunky-dory.

But if you haven't used Go, which is what it sounds like to me, you don't understand the current situation's cost/benefit analysis and are unlikely to be able to "just" "solve" the community's "problems" without such an understanding.

Step 1 of any engineering effort is to identify the problem, and it can never be skipped, no matter how sure you are that you can skip it this time. If you're not a Go user, you've skipped this step.

Of course you're free to A: not be a Go user for any reason you choose and B: say whatever you like about Go from the outside. And I really mean that, quite deeply, not merely as a rhetorical flourish. There are too many languages to become expert in them all, and we all must choose carefully according to our own knowledge and desires. My only point here is that you shouldn't expect the community to be enthralled with the resulting commentary.


> The community asks that you acquire some experience, work with what is there, make sure that you understand it, then suggest improvements.

So what is it that you think people who want Go to have generics don't understand? You say that people should acquire some experience before criticizing Go, but you're only accepting a certain kind of experience. I've written tens of thousands of lines of code in similar languages without generics, and tens of thousands of lines of code in similar languages with generics, and there is no question which I would rather write. I have written a number of compilers and am well aware of the concerns and how to implement a performant compiler. Why is that experience not something you value?

You're basically creating a chicken and egg problem: people who like Go will use it, and people who don't like Go won't, and what you're saying is that the latter won't be allowed to criticize Go. But people who don't like Go are the only ones who are going to ever criticize Go. And criticism is what anything needs to not suck.

Don't talk to me about "acquiring experience" as an excuse to discard any experience that comes from outside your community. My core complaint about Go: the authors simply have ignored decades of compiler research. By deciding that only users of Go can criticize Go, you're deciding to only take feedback from people who would use Go, and the only people who would use Go are people who aren't aware of the last few decades of compiler research: people who think coroutines are a new idea and aren't sure whether generics are useful.

As an outsider to the community, I wouldn't care if Go is crappy, except that I work in this industry, and if Go becomes popular enough, a situation will arise where I have to use Go or a language that imitates Go, or turn down a job because of it. Go being terrible and popular means that it affects the whole software industry negatively.

Also, arguing that I need more Go experience to criticize Go is somewhat ridiculous in response to an article which says, "Being an advocate is not really a separate stage, but a role you can fill at any of the other stages." So you need Go experience to criticize Go, but you don't need Go experience to advocate for it? This is why I assert that Go's popularity is mostly hype.


> By deciding that only users of Go can criticize Go

This describes my feeling when going through the thread and reading that significant weaknesses of Go are not significant.


If you haven't written any Go code, how can you know what weaknesses are significant? This is why people read reviews of every new tech gadget - because the feature list doesn't tell the whole story - you need the input of someone that has actually used the thing.

You can choose who to believe - someone who has tried the language, or someone who merely thinks they know what it'll be like. I know whose opinion I'd trust.


> If you haven't written any Go code, how can you know what weaknesses are significant?

First of all, I have written Go code, I just wouldn't call myself an expert. And lack of generics becomes a pain in the ass rather quickly. I decided to come back when they had added generics, but it turns out that isn't happening.

But even if I hadn't ever written a line of Go, I don't think that would invalidate my opinion, because I've used other languages that make the same mistakes. How many languages do I have to try out before my opinion on language features would be worth considering to you?

Your logic is like that of a child who wants to touch a hot stove. "Have you ever touched this hot stove? No? Then how do you know it will burn?" I've touched enough hot stoves in my life, thanks.

You also haven't responded to anything I said in my previous post, you're just reiterating what you said previously.


My apologies, there didn't seem to be much to address.

> the only people who would use Go are people who aren't aware of the last few decades of compiler research: people who think coroutines are a new idea and aren't sure whether generics are useful

This is simply untrue. Most developers I know that like Go have written tens of thousands of lines of code in other languages that do have generics and more advanced features. You seem to assume that because people disagree with you, that they must just be ignorant. How about if people disagree with you because we have different tastes in languages? Some people love LISPs, they drive me crazy. Some people love Python, I think it's merely ok. ... I love Go, and you hate it. That's ok. I don't think everyone has to like Go. But I don't tell the LISP people they should drop all this paren BS. I don't tell the Python people that they should add braces and static typing to their language.

> what is it that you think people who want Go to have generics don't understand?

...that generics are not needed in many of the cases where they're used. That YAGNI holds true a lot more often than they think. I had actually started coming to this conclusion (about generics) before I even started writing Go - and I'd already started using them less. Not having foo<bar<baz>> in my code simplified it a lot, and given that most of the time I only ever implemented one version... it just wasn't worth defining a whole bunch of generic logic, when really I only had one concrete implementation that I'd ever use.

> My core complaint about Go: the authors simply have ignored decades of compiler research.

Not ignored, so much as "considered and discarded".

You say you've written Go and the lack of generics was a problem rather quickly. I have been writing Go for almost 2 years full time and 9 months on the side before that. Generics have almost never been a problem for me.

I see two possible reasons why we've had different experiences with Go:

Either you happened to be working on a project that really benefits from generics (something that makes heavy and repeated use of trees or vector/matrix math, etc)... or you were trying to write Go code the way you'd write in some language that does have generics.

If the former, well, that's truly unfortunate. Go doesn't work well for all projects. I wish it were better at that sort of thing, and maybe at some point we'll get some features that'll help out with those pain points. Maybe those features will be something like Generics... but I, for one, hope that it's not just generics like everyone else has generics, because I've been there, and I don't like it. There's a hell of a lot of code that doesn't need matrix math and left leaning red black trees and such. Go is good for all those projects.

If the latter, well, then you've really just shot yourself in the foot. This is like taking a minivan offroading in mud and then getting mad that it handles poorly. You have to use the language in the way it was designed. Yes, this means you probably have to write a min function every once in a while. Yes, this means you have to write a loop to test if a string is in a slice of strings. Just like you can't take the shortcut through the mud in the minivan. Just like you have to write type classes in haskell, or getters and setters in Java. Every language has its quirks.
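
To be concrete, this is roughly the kind of boilerplate I mean (a minimal sketch, with made-up helper names):

    package main

    import "fmt"

    // min returns the smaller of two ints. Without generics, you write
    // one of these per numeric type you actually need.
    func min(a, b int) int {
        if a < b {
            return a
        }
        return b
    }

    // containsString reports whether s is present in strs. Likewise, a
    // separate copy is needed for each element type.
    func containsString(strs []string, s string) bool {
        for _, v := range strs {
            if v == s {
                return true
            }
        }
        return false
    }

    func main() {
        fmt.Println(min(3, 7))                                // 3
        fmt.Println(containsString([]string{"a", "b"}, "b"))  // true
    }

It's duplicated, sure, but each piece is trivial to read, test, and fix.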

All that being said, it's also completely valid for you to say you just don't like Go. That's fine. I don't like a lot of languages, even though many other people do. But telling other people they're wrong or dumb for liking Go is not a position you can really justify. It's a matter of opinion, and opinions can't be wrong.


Every small min function that you write has the potential to have bugs. In a language with generics there is only one min function, meaning only one source of potential bugs.

So is it simpler to have all these little functions doing basically the same thing, each with a chance of bugs, or one generic implementation that works for all of them?

Looking at things in this light, bug potential in Go is O(n), whereas in a language with generics it is O(1).

The point being that without generics you end up implementing lots of these small similar functions and making lots of potential sources of bugs.

Idiomatic Go encourages writing simple, easy-to-fix code, which is good. However, what I get in exchange is that Go has me writing that simple, easy-to-fix code over and over, with potential bugs everywhere.

Hope this makes sense, haven't used voice typing for anything this large yet.


Wow, if that's voice typing, I'm impressed, mine is never that good :)

I do understand where you're coming from. More code means more code that can have bugs, I totally agree.

However, more complicated code also means more bugs, and generic code is definitely more complicated than non-generic code (in all but the most simplistic cases). So, I don't think it's necessarily as straightforward as O(n) vs O(1)... and also for small values of n, O(n) is pretty close to O(1)... and my contention is that n is almost always small.

So, I don't think you're wrong, per se, I just think that in real life, it's not so black and white, and in my experience, the kind of code I have to duplicate is the dumb stuff, like min, which is nearly impossible to screw up, and if you do, it's immediately obvious in any kind of basic test of the surrounding code.


> generic code is definitely more complicated

In Haskell at least this is usually not correct. A sorting algorithm with type

    Ord a => [a] -> [a]
(i.e. a generic one) is likely to be more neatly implemented than one of type

    [Int] -> [Int]
because the former can't use irrelevant properties of the type of the values contained in the list.

Perhaps you meant something different by "generic" though. There are a few subtly different uses of that word. One of them means a very neat and powerful generality, as seen in Haskell's parametric polymorphism. Another is a convoluted run-time tag-based dispatch mechanism which is often under-specified. I can imagine that programmers who have only ever seen "generic" code in the Python object-oriented style (to take one example) think this style of "generic" code is more messy than "specific" code.


> Wow, if that's voice typing, I'm impressed, mine is never that good :)

I had to correct a few things, but I was pretty impressed by the accuracy :D

> I do understand where you're coming from. More code means more code that can have bugs, I totally agree.

Right, I find it easier to use very generic functions and compose them together. Go achieves some of this with interfaces, but my rub with that is that they aren't type safe, IIRC, and my experience has taught me that type safety is quite valuable.
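
For instance, here's a minimal sketch (made-up names) of the kind of interface{}-based container I mean: it accepts any element type, but a wrong type assertion only blows up at run time instead of being caught by the compiler:

    package main

    // stack accepts any element type by storing interface{} values,
    // which is how Go code often fakes genericity without generics.
    type stack struct {
        items []interface{}
    }

    func (s *stack) push(v interface{}) { s.items = append(s.items, v) }

    func (s *stack) pop() interface{} {
        v := s.items[len(s.items)-1]
        s.items = s.items[:len(s.items)-1]
        return v
    }

    func main() {
        var s stack
        s.push(42)
        _ = s.pop().(string) // compiles fine, but panics at run time
    }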

> However, more complicated code also means more bugs,

No arguments here!

> and generic code is definitely more complicated than non-generic code (in all but the most simplistic cases).

Can we test this right now? Have any examples in mind so we could figure this out in code?

> So, I don't think it's necessarily as straightforward as O(n) vs O(1)...

I agree, I didn't mean for my statements to be taken as absolutes or perfectly accurate but more as a lens to look through.

> and also for small values of n, O(n) is pretty close to O(1)... and my contention is that n is almost always small.

I agree here, and this leads us to the source of our (and many others') disagreement.

I feel that when as much code as possible is generic, I can build more generic code on top of it and lessen the potential points of failure.

Forgive the projection, but I feel your thoughts are more along the lines of "Really generic code isn't simple and simple code is the key to fast, easy to maintain, and well... simple code".

I was a big fan of Go in the past and thought along the same lines. Then I gave Haskell a try, and things being very generic didn't seem to make complexity go up much for me.

This means it feels like a lot of Go proponents just dismiss OCaml, Haskell, Scala, Clojure, etc. as convoluted and overly complex from the outside, but having spent some time getting to know them, they seem just as simple as Go or simpler (although VERY different) in many areas. Not deployment (Go does very well there), and I suspect not reasoning about performance either, though my use cases haven't required enough profiling of Haskell code to say for sure.

Just a random question: what do you (and, if you don't mind guessing a little, other Gophers) think of Nim? If I were to pick a dead-simple imperative language along the lines of Go that didn't "ignore programming language research", I would pick Nim over Go unless there was a huge deficit in libraries for my use case.


> You seem to assume that because people disagree with you, that they must just be ignorant.

If people disagree with me on things that are pretty solidly proven by my experience, then yes.

> How about if people disagree with you because we have different tastes in languages?

I mean, if your argument basically comes down to "That's just, like, your opinion, man", then I'd counter that it's pretty objectively true that people are more productive in some languages than in others.

> ...that generics are not needed in many of the cases where they're used. That YAGNI holds true a lot more often than they think. I had actually started coming to this conclusion (about generics) before I even started writing Go - and I'd already started using them less. Not having foo<bar<baz>> in my code simplified it a lot, and given that most of the time I only ever implemented one version... it just wasn't worth defining a whole bunch of generic logic, when really I only had one concrete implementation that I'd ever use.

So your position is that we should take away features that people overuse?

There are tons of places where you just need a few functions, not a class with a bunch of methods on it. How far are you going to take this logic?

If you want to be babysat by your language, write Java.

> You say you've written Go and the lack of generics was a problem rather quickly. I have been writing Go for almost 2 years full time and 9 months on the side before that. Generics have almost never been a problem for me.

Well, there are a few explanations for that:

1. You've worked in a very narrow domain.

2. You're willfully pretending the problem doesn't exist so you can feel good about your chosen language.

3. You're completely not seeing the numerous cases where your code has duplication which could be removed using generics.

> You have to use the language in the way it was designed.

On the contrary, if the language is designed poorly I do not have to use it at all.

> It's a matter of opinion, and opinions can't be wrong.

So, lacking justification, you just segment off what you're saying into the realm of opinion, where if you believe hard enough anything you want can be true!

EDIT: A funny thing is that the Go core project itself is apparently not one of those projects you claim exist that don't benefit from generics. See here[1] about halfway down the page:

> A weakness of Go is that any generic-type operations must be provided by the run-time. Some day that may change, but for now, to make working with slices easier, Go provides a built-in generic append function. It works the same as our int slice version, but for any slice type.

But sure, that will never come in handy in most projects.
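
For reference, the int-only version the post describes looks roughly like this (a sketch in the spirit of the blog post, not its exact code); the built-in append does the same job for any slice type:

    package main

    import "fmt"

    // appendInt grows the slice if needed and copies the new values in.
    // Without the generic built-in, you'd need one of these per element type.
    func appendInt(s []int, vals ...int) []int {
        total := len(s) + len(vals)
        if total > cap(s) {
            grown := make([]int, len(s), 2*total)
            copy(grown, s)
            s = grown
        }
        s = s[:total]
        copy(s[total-len(vals):], vals)
        return s
    }

    func main() {
        fmt.Println(appendInt([]int{1, 2}, 3, 4)) // [1 2 3 4]
        fmt.Println(append([]int{1, 2}, 3, 4))    // same result, via the built-in
    }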

[1] http://blog.golang.org/slices


> > You seem to assume that because people disagree with you, that they must just be ignorant.

> If people disagree with me on things that are pretty solidly proven by my experience, then yes.

I guess I just am aware that others may have had different experiences than I have, and unless their opinion is way out in left field (like, "programming in assembly is just as productive as python")... then I assume they're not ignorant. People think Ruby is fantastic and I think it's pretty horrible, but I don't think they're ignorant of other languages, they've just developed different tastes.

> So your position is that we should take away features that people overuse?

Some of them, yes. Go doesn't have the ternary operator. I consider that a good thing. I've seen critical security bugs that had been in production code for years because someone misused the ternary operator in a way that never would have happened with a boring old if/else.

> So, lacking justification, you just segment off what you're saying into the realm of opinion, where if you believe hard enough anything you want can be true!

I'm saying that what makes a language good or bad is generally subjective. I don't think you can objectively say one language is better than any other except in extreme cases. Is Haskell better than Python? Is C# better than OCaml? It's like asking if minivans are better than sports cars... it depends on what you want to do with them and what you enjoy. I used to like to floor it getting on the highway and hearing the engine rev as I was pushed back into the seat of my v6-powered sedan... now I take great satisfaction in throwing a new couch in the bed of my truck and not having to wait or pay for delivery. Is my truck better than my sedan was? Would my truck be better if it had the same 0-60 time as my old car? Sure... it would also cost like $15k more. There are always tradeoffs.

You say the Go community is being blind to the benefits of generics, and we say you're being blind to the costs of generics. And I don't mean compiler complexity or speed. I don't give a fig about the compiler. I care about the complexity of the code. I care about error messages that make you open up 4 different files to try to figure out what combination of code is actually being run in a single place.

I don't think we're going to agree here, and that's fine.


> The community asks that you acquire some experience, work with what is there, make sure that you understand it, then suggest improvements.

And then it proceeds to ignore these improvements anyway because it has a dogmatic view on certain features and the credentials of the person suggesting the improvements are absolutely irrelevant.


"You really don't need X for Go" is becoming the Go community catchphrase. And they're right: I won't need anything for Go because I wouldn't use a language whose community actively resists improvement.

There's no language you couldn't construct a similar argument for, is there? Except maybe Perl. There's always going to be some capability favored by some loud person who's going to publicly disqualify some language for not having that feature.

Weird that in this case it's an equivalent of npm or Rubygems, which are nice tools for beginners but a complete pain in the ass for serious projects; essentially, you're disqualifying Golang for not reinventing DLL Hell like everyone else.

Generics, I get. Package management?


> which are nice tools for beginners but a complete pain in the ass for serious projects

I'd be interested in hearing more of your perspective here, because mine is the exact opposite. Newbie writing some small scripts? Don't worry about RubyGems. Writing a real project without Bundler? I think I'll pass...


I honestly don't care that much about package management: that's only symptomatic of the deeper issues with Go.

> Weird that in this case it's an equivalent of npm or Rubygems, which are nice tools for beginners but a complete pain in the ass for serious projects; essentially, you're disqualifying Golang for not reinventing DLL Hell like everyone else.

npm and RubyGems have two basic problems:

1. Curation of package sources.

2. People installing packages they shouldn't have.

Curation of package sources: notice that people don't complain nearly as much about dpkg/apt packages: they're curated much more carefully than npm/RubyGems packages (which is to say, they are curated).

People installing packages they shouldn't have: often the problems people run into with package management are self-inflicted. The adage is old: "Libraries from CPAN should not be treated as if they were part of the core." "Everyone else", as you say, is not experiencing DLL hell. I have zero difficulty with package management, because I use mature libraries and specify versions in my package configurations. There's no excuse to be experiencing DLL hell these days. The people who experience DLL hell are the ones who don't check in package configurations and then are surprised when the 0.1 alpha package they installed makes a breaking change and they don't know which version their code works with.

This is a problem in the Ruby community in particular because part of the draw of Ruby is "there's a package for everything". Newbie devs get caught up in the ecstatic frenzy of solving all their problems with gems without researching or testing those gems at all. But that's not a problem solved by Go's lack of a package manager, it's a problem solved by looking more carefully at what dependencies you introduce before introducing them.


I would classify godep as a package manager in the same vein as bundler, pip, npm, etc., and in my experience it is absolutely essential if you still want your Go code to compile a month after you wrote it.

I think the bit about source versus binary deployment is a bit of a red herring. It's not like any true Scotsman is deploying by running pip update in production either. As a developer you want to have code that compiles at all times. Having a binary you can't reproduce or change isn't going to be much comfort.

If you're going to use third party libraries there is no reliable way to judge the stability of either the libraries you use directly or all your transitive dependencies. Pull in enough dependencies and the chance of a conflict approaches one. Some of your dependencies will get updated and some of them won't, and soon enough you'll have a hard time finding any combination of versions that will compile.

Dependency management is not optional, but unfortunately Go makes it easy to hurt yourself by providing climbing holds without a safety harness. It isn't obvious to new programmers that they should use godep before they come back to an older project and it won't compile anymore.


>Pull in enough dependencies and the chance of a conflict approaches one. Some of your dependencies will get updated and some of them won't, and soon enough you'll have a hard time finding any combination of versions that will compile.

I'm not sure what you mean by "conflict". That's the nice thing about the way go packages work. There's no v1 of a package and v2 of a package. All packages are different. If one dependency uses foo.v1 and one uses foo.v2 ... they don't conflict. They're as different to go as foo.v1 and bar.v1.

>Some of your dependencies will get updated and some of them won't, and soon enough you'll have a hard time finding any combination of versions that will compile.

Again, this just doesn't happen. A dependency has one canonical location. If the code changes at that location, it changes for anyone that uses that code. Yes, it's possible this could break only part of your code, if only part of your code was relying on the old behavior/API... but that should be rare (and if it happens, it means you can't trust that third party to stay stable).

I think most people who complain about Go lacking versioning of packages are still stuck in the mindset of pip, npm, etc, where versions are fungible, and your code doesn't know exactly what version it might be working against. This is just not the way Go works. There's one and only one version of a dependency that your project sees. Anyone using that dependency uses the same version. If the dependency's code compiles on its own, it'll compile on your machine, too.
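
For example (hypothetical paths, just to illustrate): a single binary can import both side by side, and the compiler treats them as two unrelated packages, so there is nothing to conflict:

    import (
        foo1 "gopkg.in/someuser/foo.v1" // hypothetical path for the v1 API
        foo2 "gopkg.in/someuser/foo.v2" // hypothetical path for the v2 API
    )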


I have no idea what you are talking about.

There is nothing in Go that prevents people from making breaking changes to a package - and therefore breaking changes happen. I'll trust my real-world observations of broken dependencies over your assertion that they can't happen.

My point is if you're not tracking the version of your dependencies, it can be hard to even know what versions to go back to to get something that compiles. It may not be enough to just roll back the package with the breaking change, since you may have other dependencies that have been updated to depend on the new version.

The only way to reliably be able to compile your program in the future is to record the exact versions of all your dependencies.


I'm on board with this answer, and it makes sense to me, but it's competing in my head with the idea of always being able to do a clean build from master that could deploy whenever. Let's say you have a catastrophic failure and lose your binary repo and need to rebuild (and your backups, etc - it's an example) - if your dependencies have changed in the meanwhile, you can't redeploy to where you were. Obviously that scenario should be well prevented, but that doesn't feel like a legitimate excuse to not have some safeguard in place to make sure your build environment doesn't change on you.


"but that doesn't feel like a legitimate excuse to not have some safeguard in place to make sure your build environment doesn't change on you."

If your build process downloads anything from the Internet, your build environment can change on you. Networks can always fail. At best you can detect this occurring and have your build process scream and die, but it can't be fixed.

Note the complete lack of the word "Go" in the previous paragraph. If other languages provide package managers that make it easy to download code from the Internet during your build process, they are tempting you to do wrong things. You should be in personal possession of everything you need to do a full build, whatever that may be (source or otherwise). And, again, note the complete lack of the word "Go" in that sentence.

This has nothing to do with Go. If you are depending on package managers to download your code from external sites during a build, stop reading this right now and GO FIX THAT. Seriously. Someday it will bite you, hard.

And may Turing & Knuth help you if you're downloading packages with unpinned versions that can just upgrade to whatever they want, whenever they want.

Now I will actually use "Go" in a sentence. The fact that Go expects you to vendor your source I do not consider a problem with the language. The fact that it fails to ship with a blessed tool for doing so is a potentially legitimate criticism, but the fact that it fails to ship with a tool that begs you to do the easy, wrong thing is a feature, not a bug. I wouldn't be surprised if they eventually bless something out of the community. Godep seems to be the current leader. Personally I can't recommend or anti-recommend any of them; I've just been vendoring by hand since I'm only using a handful of libraries, and the cost/benefit hasn't worked out in favor of learning something to save me mere minutes of effort. I know where to find solutions if that ratio ever changes.


Since I can fairly reliably type "mvn compile" to build five year old Java projects, I think it's fair to say that the Go community is in a dependency hell of their own making.

Should every library author vendor their dependencies separately too?


I'm not in a dependency hell. I won't be if I add libraries, either, because there are tools for that.

"Should every library author vendor their dependencies separately too?"

Whatever it takes to produce your end-artifact should be under the control of the relevant organization. Again, let me emphasize, this has nothing to do with Go, and you violate it at your sole and singular peril. When the $PACKAGE site goes down due to DOS, security penetration, or sheer time-based neglect, being angry at me won't help you any.


What you may not get is that you're likely doing manually what a package manager does automatically. A program should do the work, not the programmer. Imagine there is a bug in the current package. Well, a package manager can invite you to download a recent version of the lib that has no breaking changes, automatically, with just a bit of metadata. Why would you not want to automate that kind of stuff?


A package manager can't run my tests and confirm that my code doesn't break with the newer version of the lib (just because it supposedly has no breaking changes doesn't mean it works.... there are a surprisingly large number of behaviors that consumers can rely on that go way beyond just the provided API).

So, all you're really automating is me going to look at the package's homepage and seeing if there's a new version. I'd still have to read the changelist to see if the bug that concerns me is fixed, what other changes are included, and run all my tests (and possibly write new ones). This is automating the easiest part of upgrading a dependency.


A lot of the work I do involves legacy Ruby software. Fortunately there has always been (well, since Ruby got popular) Rubygems which is equivalent to gopkg.in - I have been burned a couple of times though when authors have removed old libraries. For the last few years Bundler (http://bundler.io/) which uses Rubygems has been the standard way to lock versions of libraries, and before that vendoring was widely used.

When you pick up the maintenance of an old application you have to reinstall the dependencies from scratch if you aren't using vendoring or Bundler. Each method has their advantages and disadvantages, but both are much better than praying that the library author hasn't introduced a breaking change in a release since the library was last used.

Although it isn't the same when you are deploying Go applications, development has exactly the same issues as Ruby without locking the dependencies.


Please, can somebody explain to me how to manage the following situation without manually copying code or forking repos (in other words, we still need to be able to get updates for minor versions from the upstream repositories)?

    GOROOT/
        pkg
        src/
            lib1
            app1
            app2

And app1 requires lib1 version 3.x, app2 requires lib1 version 1.x. What can we do?

Also, on the whole of this page https://golang.org/cmd/go/ I can't find ANY info about how to fetch an exact version from a repo rather than the master branch. In the Go world, are all master branches for stable code only, not for development? And do all tools have only one version?


There's no such thing as lib1 version 3.x. lib1 is a single version. Every import path is a unique package. If you want to make a new version of lib1, you need to give it a different path. This can be as easy as manually inserting "v2" in the path, like /lib1/v2/ or /v2/lib1/

There's no way to use "go get" to get anything other than the head of the master branch... at least, not by default. If you use a service like http://gopkg.in then you can encode a version in the url/import path which can then redirect to a branch in github, so for example import "gopkg.in/natefinch/lumberjack.v2" will get the v2 branch of github.com/natefinch/lumberjack (gopkg.in does this with some git magic)
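
So for the lib1 example above, app1 and app2 would just import two different paths (hypothetical paths, purely for illustration); to the go tool they're unrelated packages, and each import still tracks whatever branch its path points at, so you keep getting minor updates from the repository:

    // in app1, which needs the 3.x API (hypothetical import path):
    import lib1 "gopkg.in/someuser/lib1.v3"

    // in app2, which needs the 1.x API (hypothetical import path):
    import lib1 "gopkg.in/someuser/lib1.v1"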


> There's no such thing as lib1 version 3.x. lib1 is a single version.

I don't agree with this part (or we are talking about different things), but thanks for the second part of reply. At least something :)


An example of a dependency-managing package manager for a compiled language is [CocoaPods](http://cocoapods.org) for Objective-C. It's really popular with iOS and OS X developers.

Or look at [cabal](https://www.haskell.org/cabal/) for Haskell. It's a package manager that doubles as build system.

It helps when you use external open source projects and want to update them to a new version. With Go you have to keep track of your external dependencies and their versions on your own.


Just to extend your list.

dub for D, http://code.dlang.org/

cargo for Rust, https://crates.io/

biicode for C and C++, https://www.biicode.com/

NuGet for C and C++, http://docs.nuget.org/consume/support-for-native-projects

OPAM for OCaml, https://opam.ocaml.org/


> You really don't need a package manager for Go. Honestly, the main reason [...] is because you're deploying source code to your production environment

Not sure if I would agree with that. A full-fledged package manager automates the process of installing, upgrading, configuring and removing software packages in a consistent manner. Dependency management and semantic versioning are an important part of providing this consistency. Whether you deploy source code or binaries makes no difference; a package manager is always good to have.


The Go tool does do installation and upgrading. There's not really any configuring, and it'd be hard to do removal in any reasonable manner given how GOPATH works. What the Go tool does not do is isolate you from breaking changes upstream. For that you need either someone you trust (which could be the package author or a copy of the code in gopkg.in) or a willingness to fix breaking changes yourself (which I do quite often).


We really don't need any more binary package managers - that should be the responsibility of the platform you're deploying on. dpkg/rpm/etc on Linux systems, MSI on Windows, APK on Android, etc.


One reason I love package managers is that as a developer I don't want to produce a dpkg/rpm/tgz/apk/msi/ipa/p5i/depot, multiplied by the different locations of include files, libraries, and documentation across said OSes.

Especially in commercial software.


>>A full-fledged package manager automates the process of installing, upgrading, configuring and removing software packages in a consistent manner.

That sounds like deb(dpkg)/rpm's job. In golang-land I just need to use vendoring + godep for repeatable builds.


You really need to look at Cargo in Rust to see what a good package manager can do and how it improves development.


So the "solution" to Go's package management is just statically linking every library you use?


We prefer to use version control, and vendor libraries in our project instead. This is language-agnostic, and allows us to have repeatable builds without extra tools.

I've written up a tutorial about it using git-subtree:

https://github.com/jamesgraves/example-go-app


Does this not wholly preclude the possibility of vendors shipping a library without source?


We normally exclude the entire pkg directory, but in that case you could just add selected files from there.


The closest is godep, which lets you take snapshots.

We're having lots of fun with this with Prometheus; quite a bit of work is ongoing to get it into Debian because of this, for example.


And do these package managers provide a full ecosystem where all Go libraries/modules are available like e.g. npm does?


All Go packages can be found on godoc.org and installed with "go get". And if you want to vendor the dependencies, use godep.


Thanks. And how can I define the version of the module which should be imported?


You can't. You are expected to vendor for dependable builds. This has upsides and downsides.

It completely sidesteps version-dependency hell and encourages you to think carefully about which dependencies you package. It does that by making you manually responsible for keeping dependencies up to date if you distribute other people's code with your own.

So, in answer to your original question: no, Go doesn't have a package manager with proper versioning (at least no official one). In practice, this is not as big an issue as you might think if you come from a language with such a package manager, and I haven't really found it as much of a burden as I thought I might.


Yes, you can. The URL is the version. If you want a different version, you need a different url.

For example, when I released v1 of my package "lumberjack" you imported it with

    import "github.com/natefinch/lumberjack"
When I updated it to v2, I had to change the url. Luckily there's http://gopkg.in to make this easy, so for v2 it's

    import "gopkg.in/natefinch/lumberjack.v2"
Note that the above url actually redirects to the v2 branch of the same github repo.


Sure it's possible with third party tools (in this case gopkg.in), and if all your required packages support this. As gopkg.in is a centralised service I'm not so keen on that solution (same problems and potential pitfalls as relying on say rubygems.org as a source of truth).

There are multiple ways to solve these problems (vendoring is one, your solution another, godep another path), but the go get tool doesn't explicitly allow for versioning in the way the parent requested and there is no official solution to this.


The point about gopkg.in being centralized is valid. You can run a copy yourself (it's open source), or you can use a more simplistic version that many people use which doesn't even require running a service, just serving some static HTML. This obviously only works for code you're maintaining. However, if you think code you're depending on isn't maintained by people you can trust not to break you, it's probably better to use something like godep to insulate you from them anyway, right?

The official solution to versioning is separate URLs. If you mean something more complicated like automatic resolution of different version specifications (e.g. ver >= 1.2), then no, nothing exists like that. But generally all you need is "project A uses version foo, and project B uses version bar", and Go supports that just fine.


> However, if you think code you're depending on isn't maintained by people you can trust not to break you, it's probably better to use something like godep to insulate you from them anyway, right?

That goes for most code at some point in time; as it evolves, the behaviour changes in subtle ways. I do think the Go team's insistence that packages should never break their API and remain effectively versionless is somewhat utopian.

> The official solution to versioning is separate URLs.

I remember reading an off-hand comment from someone on the Go team that you could just rename your package; however, I don't think that's a very good idea, and it doesn't play nicely with many popular hosts like GitHub. go get does have support for version tags on repos, but only for Go release versions, so it goes unused for package versioning, which is a shame. So there's no official tooling for it. At present the majority of Go code I've seen simply imports dependencies from GitHub or similar without versioning and hopes for the best. Unless the other package's author wants to version, that's all you can do unless you want to vendor (for large dependencies like net/html that can be impractical).

> If you mean something more complicated like automatic resolution of different version specifications (e.g. ver >= 1.2), then no, nothing exists like that.

That's what the parent was asking for, I think, and beyond that, something which applies to all packages. Personally I've chosen to minimise external dependencies and vendor them in version control. It's simple, and it puts me in control of updating and versioning for the code I use, not the package authors.


This strategy means you depend on the upstream to keep the version stable. In general, URLs do not imply a stable version.

Vendoring your deps is currently the only certain way to get stable versioning.


https://github.com/tools/godep/blob/master/Readme.md explains it.

It's not really a package manager, though you can get some of the same results. It's more pulling in exact build dependencies.



I don't have a ton of experience with Go, just some one-off internal tools development with only a few dependencies. That said, I chose to manage my dependencies manually by making a fork of the relevant repositories. Yeah, it would have been nice if the higher-order dependencies were automatically resolved, but the whole thing went smoothly, and it felt familiar since it had nothing to do with Go at all. Maybe the takeaway is that a language-agnostic dependency manager is what would best benefit Go?


And how do you follow minor version updates with your forked repos?


Your question was serious and not a troll, so I upvoted it. It seems like vendoring really is the "accepted upstream method" for solving this problem.

Even the Fedora packagers for golang apps are asking to relax the requirement on unbundling vendored things for golang, as there is no sane way to handle versions other than git commit hashes.


As far as I know the system is different: you pull your requirements from repos recursively. It's maybe slower, but in exchange it's more lightweight. For most cases that shouldn't be a deal-breaker. So what's the problem with that?

PS: I'm no Go developer either. But the way it pulls in requirements seemed like a nice idea to me.




