
> “Try different things” is the key, of course.

Programmer of 32 years here (started at 19).

I cannot “amen” this sentiment enough. But probably not in the way most will read it.

Most will interpret this at a macro level: Learn some Smalltalk! Now go learn Lisp! And then Clojure followed by Haskell; throw in some Java or C++ so you know what pain feels like! This is OK; it is good to be somewhat travelled in your journeys as a programmer.

But I find that there is a micro application that gets overlooked. You can do a lot of “try new things” right in the stack you’re in, without having to bust into strange new worlds. Most languages end up with many ways to do things. You can and should take time to explore those. Learn the conventions/idioms, but then push/challenge those.

I was afraid of C macros until I took time to really try some things with them. It didn’t mean I suddenly used them for everything, but overusing them for a bit helped me make better choices about them.

C pointers intimidated a peer until I forced him to forgo index-based for statements and use pointer math instead.

Smalltalk context objects were kind of “behind the scenes” magic until I decided I’d like to figure out how to implement goto in Smalltalk. After that, they opened up for me.

Python decorators are “magic” until you make a few of your own.
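
For instance, a toy decorator (the names here are mine, purely for illustration) takes the mystery out of the mechanism:

    import functools

    def shout(func):
        # A decorator is just a function that takes a function
        # and returns a replacement for it.
        @functools.wraps(func)  # preserve the wrapped function's name/docstring
        def wrapper(*args, **kwargs):
            return func(*args, **kwargs).upper()
        return wrapper

    @shout
    def greet(name):
        return f"hello, {name}"

    print(greet("world"))  # HELLO, WORLD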

The examples could go on and on.

Try new things.




> Try new things.

I broadly agree with you, but I have a minor side note: while trying new "new" things, don't forget about old "new" things. A technique from the '60s can be, and often will be, new to you today, as well as to the vast majority of today's programmers.

I find it incredibly sad that programming as a profession treats its history like a burden to be shunned in favor of yet another unconscious reiteration of a classical concept that gets fashionable in one part of the community for a year or a few, before slipping back into the huge pile of forgotten ideas and implementations.

Try new things, but look for them both on Twitter and in archives from many decades prior.


Indeed. And the original that I was replying to actually reads "Try different things". Which, as you point out, is even better than my "new".


> Python decorators are “magic” until you make a few of your own.

To add to this, you must try things to understand them not only so you can wield them, but so you know what not to do. Decorators are squarely in that latter category for me.

Things that feel magical (expressivity, structure, data models, abstractions...): there is an element of experience required to develop a robust understanding of not only their power but also their costs, which are often subtle and pernicious.


I really, really disagree with decorators not being worthwhile.

Decorators are a very "high" level of abstraction. Leave aside dependency injection, closures, currying, and all that jazz which worms its way in here.

Decorators are actually really excellent for libraries used by a ton of people.

They're exceptionally easy for end users to use.

The cost is that the designer has to (1) deal with fancy programming, and (2) really understand all the use cases of what they're decorating.

But suppose:

1. You're good enough to do fancy programming.

2. You're writing software that's popular with a lot of people.

Then decorators are really convenient for the people using them.

Right tool, right job.


> Decorators are a very "high" level of abstraction. Leave aside dependency injection, closures, currying, and all that jazz which worms its way in here.

I don't follow. They're magic. Magic (and fancy programming) is generally bad in a professional software engineering environment. If you're writing software for yourself, do whatever you want. But if you're writing software for your peers, dependency injection, closures, and currying (and all that jazz) are significantly more literate and easier to follow.

Decorators nearly universally have hidden side effects through hidden state.
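
A toy sketch of the kind of thing I mean (not from any particular library): a memoizer whose cache the call site never sees.

    def memoize(func):
        cache = {}  # hidden state: lives in the closure, invisible at the call site

        def wrapper(*args):
            if args not in cache:
                cache[args] = func(*args)
            return cache[args]
        return wrapper

    @memoize
    def slow_square(x):
        print("computing...")  # this side effect silently disappears on repeat calls
        return x * x

    slow_square(3)  # prints "computing...", returns 9
    slow_square(3)  # returns 9 from the hidden cache; prints nothing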


Hidden state is not bad. That's the entire premise of Object Oriented programming: we hide the state with encapsulation and only expose methods that act on that hidden state. Closures are exactly equivalent to objects. (Makes sense, given how JS uses closures to implement objects and how Java uses classes to implement lambdas...)

What's bad is global state, but in Python specifically (and in any language supporting modules and/or being class-only, like Java) the globals are actually confined to their own singleton instances (i.e. the modules or static members of classes).

Hiding state in a closure is exactly equivalent to hiding the state in private fields on an object. It's not magic. It's just an alternative formulation of the same idea. In Python, it's slightly less convenient to use due to some peculiarities around scopes, but it's a valid technique nevertheless. You seem to agree, saying that "closures and currying [...] are significantly more literate and easier to follow".
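
To make the equivalence concrete, here is the same hidden counter both ways (a toy example, names mine):

    class Counter:
        def __init__(self):
            self._count = 0  # state hidden behind a "private" field

        def increment(self):
            self._count += 1
            return self._count

    def make_counter():
        count = 0  # the same state, hidden in a closure instead

        def increment():
            nonlocal count  # one of the scope peculiarities mentioned above
            count += 1
            return count
        return increment

    c, f = Counter(), make_counter()
    assert c.increment() == f() == 1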

But then, I don't understand what your problem with decorators is. They're just convenient syntax for higher-order functions that often happen to also be closures. You said they are easy to follow; the same should be true for decorators, then.

Dependency injection, included in your list of easy to follow things, is often implemented with decorators in Python.
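
A minimal sketch of that pattern (the registry and all names here are hypothetical, not any particular framework's API):

    import functools

    _services = {"db": "postgres://localhost/dev"}  # hypothetical service registry

    def inject(name):
        def decorator(func):
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                # supply the dependency unless the caller passed one explicitly
                kwargs.setdefault(name, _services[name])
                return func(*args, **kwargs)
            return wrapper
        return decorator

    @inject("db")
    def fetch_users(db):
        return f"querying {db}"

    print(fetch_users())  # querying postgres://localhost/dev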

Basically, decorators are just a special syntax for the following pattern:

    def f(): pass
    f = some_higher_order_function(f)
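    # `f` is now whatever some_higher_order_function returned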
and nothing more. How complex or "magical" you'd like your `some_higher_order_function`s to be is your choice, but that has literally nothing to do with the decorator syntax...

I would agree if you singled out descriptors, metaclasses, or stack manipulation, but things like decorators or context managers are simple enough to be your everyday tool for writing Python code.
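
Context managers are the same kind of everyday tool; with `contextlib`, writing one is about as short as using one (a minimal sketch):

    import contextlib
    import time

    @contextlib.contextmanager
    def timed(label):
        # everything before the yield runs on entry, everything after on exit
        start = time.perf_counter()
        try:
            yield
        finally:
            print(f"{label}: {time.perf_counter() - start:.3f}s")

    with timed("work"):
        sum(range(10**6))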


> Hidden state is not bad.

Exactly

Tiers of information abstraction and interfaces are a very important part of programming.

> it's slightly less convenient to use due to some peculiarities around scopes

The cognitive load of closures and currying is pretty extreme for newer programmers.

I'm sure, given a certain level of experience and ability, people can stare at closures and not blink.

However, I'd be hesitant to use closures in a team of other developers. If you don't have an education in functional programming, a closure definition is quite a lot to dump on folk.

I tend to avoid decorators and the like in the majority of my applications, where I'm working on a shared codebase with coworkers.

All abstractions are a trade-off between "how easy is it for the writer" and "how easy is it for the reader."

Decorators have a "hurdle rate" of complexity for the writer, and need a sufficient number of readers to justify it.


> If you don't have an education in functional programming, a closure definition is quite a lot to dump on folk.

That's true, I agree. It's not good when you have to explain a piece of code to every single developer (or even a majority of them) that comes to the project.

It would be different in Ruby or JS, but Python developers tend to be more procedural and OO oriented, with FP being less widely known and used.

> All abstractions are a trade-off between "how easy is it for the writer" and "how easy is it for the reader."

Yes, definitely. If you can solve a problem with simpler tools, you should do that. If you can solve a problem in a way that's easier to follow, you should do that, even at the cost of some repetition. All abstractions leak, and if you can solve a problem without introducing yet another leaky abstraction, you should do that.

On the other hand, if your abstraction is tight enough not to leak too much, you can often get away with it by simply documenting how to use it. Huge masses of Python programmers use the Django ORM, for example, without ever caring about the metaclasses and descriptors that make it work. Many of the more junior devs don't even understand that the ORM is basically a builder for SQL queries. It's not ideal, and they sometimes introduce problems to the codebase, for example by doing N+1 queries where one query would suffice, and they become lost at the first sight of the `QuerySet.raw` method. Until they hit one of these, though, they can still add value to the project.
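
For example, with a hypothetical Book model holding a ForeignKey to Author (a Django ORM sketch, not runnable outside a project):

    # N+1: one query for the books, then one more per book for its author
    for book in Book.objects.all():
        print(book.author.name)

    # 1 query: select_related() JOINs the author table in up front
    for book in Book.objects.select_related("author"):
        print(book.author.name)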

> Decorators have a "hurdle rate" of complexity for the writer, and need a sufficient number of readers to justify it.

Agreed. The complexity of creating a closure that wraps a given function to add useful logic to it is definitely higher than that of simpler solutions, like the template method (for example). And while using a decorator has little syntactic overhead, there's a cognitive overhead should a new developer need to change the decorator's logic.

In other words: simple is best. No argument here :)


I want examples of terrible decorators now…



Excellent, thanks!


> overusing them for a bit helped me make better choices about them

This is my favorite learning method.


When I first learned TMP (C++ template metaprogramming), every program had to be disgustingly compile-time for a few weeks. Horrible stuff... but there's no better way to learn, IMO.



