
This is the main draw for me. Compared to Python, other languages (well, some) seem to require you to do so many things that just seem... unnecessary.

I couldn't believe all the junk Java required of me when I took a stab at it. It's like being confronted with some unhelpful bureaucrat: I know what I mean, they know what I mean, but they're damn well going to make me trudge through all the nonsense so that what I mean is turned into something the bureaucracy understands. The linked NLP examples are great for illustrating this.




You don't really appreciate Python until you try to program in something like Java again.

The following code is readable, and I have needed a variation of it in a program before:

    sorted([ord(c) for c in
        set('letters in this sentence') & set('and this one')], reverse=True)
Doing it in Java would be a chore now.
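For comparison, here is a rough sketch of the same computation in modern Java; streams make it shorter than the Java of this era, but it still wants a full class (class and method names here are mine):

```java
import java.util.*;
import java.util.stream.*;

public class CharIntersect {
    // Code points of characters common to both strings, sorted descending
    static List<Integer> sortedIntersection(String x, String y) {
        Set<Integer> a = x.chars().boxed()
                .collect(Collectors.toCollection(HashSet::new));
        Set<Integer> b = y.chars().boxed()
                .collect(Collectors.toCollection(HashSet::new));
        a.retainAll(b);  // set intersection, in place
        return a.stream()
                .sorted(Comparator.reverseOrder())
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(
            sortedIntersection("letters in this sentence", "and this one"));
        // [116, 115, 110, 105, 104, 101, 32]
    }
}
```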


Here's a translation of that to Clojure, since I'm learning that now:

    (use 'clojure.set)

    (reverse (sort (map int
                        (intersection (set "letters in this sentence")
                                      (set "and this one")))))
Edit: Take two...

    (use 'clojure.set)

    (->> (intersection (set "letters in this sentence")
                       (set "and this one"))
         (map int) sort reverse)
(experienced lispers, feel free to make this more idiomatic)


Here's one using a list comprehension to make it more closely resemble the Python example.

    (require ['clojure.set :as 's])

    (-> (for [c (s/intersection (set "letters in this sentence")
                                (set "and this one"))] (int c))
        sort
        reverse)


Not sure why we are posting alternative language implementations, but here is the Scala version :). You could say that this was an exercise for me in learning 2.8, which I am unfamiliar with. A for comprehension isn't needed because Scala already treats Strings as a sequence of characters.

    (Set("letters in this sentence":_* ) & Set("and this one":_* )).toSeq.sorted.reverse

Or you can take advantage of Scala's rich choice of collections.

    import scala.collection.immutable.TreeSet
    import scala.math.Ordering._

    TreeSet("letters in this sentence":_* )(Char.reverse) & Set("and this one":_* ) toSeq

The machinery Scala has in place to make the second example work is quite impressive, and I was pleased to see that it worked. Even though the intersection method returns a brand-new set, using implicit arguments it creates a set of the right type (TreeSet) with the right ordering function (Char.reverse), without any duplication of code in the standard library (like overriding '&' in every subclass).


An easier Ruby solution would be:

    ("letters in this sentence" + "and this one").chars.sort.reverse


Except you aren't using sets, so you get duplicate characters.

I can just as easily write ("letters in this sentence" + "and this one").toSeq.sorted.reverse in scala. You could tag the end of your ruby statement with .uniq and you would be fine. Alternatively, you could wrap the above scala statement in TreeSet(). These solutions process things in a significantly different way than was done by the OP, though.


    ("letters in this sentence".split(//) & "and this one".split(//)).sort.reverse.join

I'm just learning, but I think this performs it correctly with sets.


I find the python code much more readable.


IMO, both suck at readability, since you have to read inner expressions first. An example with a "pipe" would be clearer.


I'm not sure any decent Java developers would encourage starting a new project in Java anymore. This is exactly the sort of thing that Scala is just as good at.

Scala isn't as succinct, of course, but it gets much closer!


If you look at my Scala examples above, they are both more succinct than their Clojure and Python counterparts.


Doesn't python have sorted sets? Oh, what a pity. Does python only have one single implementation of sets? Too bad. Is the runtime behaviour of that implementation documented? Let's take a look ... I'll be back. This could take a while.
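For reference, the jab above is presumably at java.util, which does ship multiple Set implementations (HashSet, LinkedHashSet, TreeSet) with documented runtime behaviour. A minimal sketch of the sorted one:

```java
import java.util.*;

public class SortedSetDemo {
    public static void main(String[] args) {
        // TreeSet keeps its elements ordered: by natural order,
        // or by any Comparator you pass to the constructor.
        SortedSet<Integer> s = new TreeSet<>(Comparator.reverseOrder());
        s.addAll(Arrays.asList(3, 1, 2, 1));   // duplicates collapse
        System.out.println(s);                 // [3, 2, 1]
        System.out.println(s.first());         // 3, found in O(log n)
    }
}
```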


Python has sets as a built-in datatype. If someone needs a different implementation, there's nothing stopping them from writing one.


A couple of points.

1. I love python.

2. The OP was being a troll.

3. People need to be more honest about themselves and their favorite pet languages. I could write a sorted set in assembler too, if I wanted to. However, there have been several times that I've wanted a sorted set data type while coding in Python but found that none was easily available. I hate the "you can write your own!" counterpoint, because it is essentially meaningless unless you can prove that writing your own is as trivial as importing someone else's implementation.


I didn't mean to imply that lacking a sorted set datatype wasn't a pain in the ass; I was pointing out that the lack of multiple set implementations is a tradeoff that we make for all built-in datatypes that could support other operations but don't. (Personally, I would love to have the ability to pick a random element from a set. That would be great.)

By the way, in case anybody was wondering, here is a (relatively inefficient) way of making a sorted set in python:

http://code.activestate.com/recipes/576694/


Couldn't you just do the following?

  from random import choice
  choice(list(myset))
(I'm guessing you'd want to do it without having to convert it to a list first?)


That's what I ended up doing, but this was happening with a very large set, right in an inner loop. Luckily this was running in batch mode, so I could go get lunch or something instead of taking the time to make it faster.
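A common workaround for exactly this problem (not part of any standard library; the class name is mine) is to pair a hash map with an array, which gives O(1) add, remove, and random pick, at the cost of some extra memory. A Java sketch of the idea:

```java
import java.util.*;

// Sketch: a set supporting O(1) add, remove, and random pick,
// by keeping elements in an ArrayList and their indices in a HashMap.
public class RandomPickSet<T> {
    private final List<T> items = new ArrayList<>();
    private final Map<T, Integer> index = new HashMap<>();
    private final Random rng = new Random();

    public boolean add(T x) {
        if (index.containsKey(x)) return false;
        index.put(x, items.size());
        items.add(x);
        return true;
    }

    public boolean remove(T x) {
        Integer i = index.remove(x);
        if (i == null) return false;
        // Swap the last element into the vacated slot, then shrink.
        T last = items.remove(items.size() - 1);
        if (i < items.size()) {
            items.set(i, last);
            index.put(last, i);
        }
        return true;
    }

    // Assumes the set is non-empty.
    public T pick() {
        return items.get(rng.nextInt(items.size()));
    }

    public static void main(String[] args) {
        RandomPickSet<String> s = new RandomPickSet<>();
        s.add("a"); s.add("b"); s.add("c");
        s.remove("a");
        System.out.println(s.pick()); // "b" or "c", in O(1)
    }
}
```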


Java is, as I said, a toy language.

A language that needs you, the client programmer, to define getFoo() and setFoo() methods manually can't be right.


Java doesn't require you to define getFoo() and setFoo(). Some best practice advice suggests you do. But you can leave the vars public and just use the . operator, if you choose.

It seems like Java wasn't designed to be efficient for a three-person shop to hack in. It was meant to be efficient for a 40-person shop building code as a team that they can maintain.

That said, I dread going back to my Java projects and updating them, even though I tried to maintain best practices in order that updating be as simple as possible. I've been immersed in php and javascript lately, so we'll see what it's like to go back.


Java certainly was not designed to be the "enterprise language" it is now. It was only similar enough to Smalltalk to replace it. (And Smalltalk has this culture of "if it is repetitive, then extend the IDE to automate it," which to some extent influenced Java IDEs.)


My impression, although I'm not an insider and could be off base, is that it wasn't conceived to be an enterprise language, but its patterns were. I learned it when I started application programming for my startup (I had been doing scientific routines, i.e. FORTRAN and Mathematica), because I was curious about OO. A lot had changed since I hacked out a little C and assembler on my Mac+ many years ago, and the scale of the problems programmers and programming teams had to handle had changed significantly. It seemed OO became popular as a way to handle this scale of things. Java was presented as a solution for building maintainable apps, so I learned it.

I also tried to stay close to the OO paradigms (of course, I was just learning them), as I built this 50000 line beast called "Egorg." Now, I get to maintain and update it, so we'll see if it paid off.

On a side note, I have a real distaste for objects in Javascript. They seem clunky, almost like they were stuck on as an afterthought (I get a similar feel about Generics in Java). But the design of Javascript makes it quite easy to hammer out a script to do something cool. I'm worried about maintenance, though, because I feel like I'm evolving my own design patterns, so retracing my steps will likely be painful.

You are correct about the IDE thing. That does ease a large burden in Java.


> I feel like I'm evolving my own design patterns

I think that this is happening to many people who are doing a lot of 'non-trivial' JavaScript. The language is still very young and because the concepts that were chosen when creating the language are powerful and flexible (prototypal inheritance, objects can be accessed like hashes, first-class functions etc) there's always a million ways to code something.

OO-wise, I think you can divide people into those who use the `new` keyword and build objects that are more or less Java-like classes, and those who create objects and prototypes on the fly and probably use a lot more ideas from the functional and LISPy side of things.


I have somehow settled into the pattern of using new/Java-like classes for large objects that store a logical set of data and need to operate on it, and on-the-fly prototypes to group just data (or, perhaps, a trivial operation).

My worst crime by far (that I am aware of) is that I have no hesitation to use a global variable if it saves me from bending over backwards to get around it. I'm doing animations, and sometimes the algorithm just comes to me a lot faster if I let a var keep a global value and be used by lots of functions between calls to my animate() function (using window.setInterval() ).


Most of these patterns come from Smalltalk (which was probably the original language of complex enterprise OO systems). I think what made Java this kind of enterprise language is that it was reasonably similar to Smalltalk, so all these patterns and concepts could be reused, and on the other hand it is similar to "normal" programming languages, so the average programmer does not have problems with its workflow and IDE (as is the case with ST). On the other hand, most uses of design patterns in Java code (e.g. 90% of uses of the Observer pattern, especially in cases when it is called Listener) seem to me to be compensating for the fact that Java is not Smalltalk (/JavaScript/Ruby/Python/C#, or whatever else has first-class classes and methods).
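To see the Observer-as-Listener point in miniature: with first-class functions (Java 8+ shown here), the "pattern" collapses to a list of callbacks. Before lambdas, each hook needed its own named interface plus an anonymous inner class. A sketch (the Button name is made up):

```java
import java.util.*;
import java.util.function.Consumer;

public class Button {
    // The subject's entire Observer machinery: a list of callbacks.
    private final List<Consumer<String>> listeners = new ArrayList<>();

    public void addListener(Consumer<String> l) { listeners.add(l); }

    public void click() {
        for (Consumer<String> l : listeners) l.accept("clicked");
    }

    public static void main(String[] args) {
        Button b = new Button();
        // With first-class functions, registering a hook is one line.
        b.addListener(event -> System.out.println("got: " + event));
        b.click(); // prints: got: clicked
    }
}
```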

JavaScript is a language that is very different from anything else. Depending on how you look at it, it has either unconventional semantics (the majority view) or unconventional syntax (my view). Its object model is strong and often useful (you can emulate class-based inheritance with it, not the other way around), but I agree that it is to some extent an afterthought (although I have a strong feeling that our reasons are very different). Probably the largest problem of JS as a whole is the "Java" part of its name and the weird syntax deliberately designed to "look like Java" or "look normal," which simply does not match the underlying semantics.

Coming from a C/Unix background, I actually consider the IDE thing a burden in itself. I expect that there is some easily editable source that is transformed by a series of steps into the final program, but all changes happen only in the original input. Which is simply not the case in the Java/ST world, where you essentially need a pretty complex IDE. (On a side note: a few years ago, some Java IDE I was using for a quick experiment forbade me from saving a source file with a syntax error in it... WTF?)


LOL, I think I had the same IDE when I was first stumbling about with Java! It was definitely a WTF moment. It became useful for code expansion, refactoring, and unit testing, however. I'm now using vim to develop php and Javascript and I miss its convenience (on the other hand, I'm a lot better with sed now...). But using Firebug to find my errors is a bit slow.

What I find really cool about your comment is that you appear to have a deeper view of it all (meaning a deeper understanding of the underlying semantics), and I'm wondering how my view will change as I continue to work with it.


Interesting about objects in javascript. I have actually started feeling that prototype-based object orientation is less tacked on than its class-based cousin. The syntactical implementation is a bit strange (the 'function' keyword for objects, putting functions on the prototype object rather than the object itself), but the mechanism itself seems more generic and flexible. The implementation in Io is better I think, you should check that out before discounting prototype-based object systems outright.


Oh, I won't discount them (I'm not good enough to be a prima-donna about it), and it's nice to know that I will see how to use it to my advantage as I play with it more.


Urgh, I learned Java in school and recently had to use Java at work to do some stuff with a bad XML schema (think lots of "<parent><child><grandchild_1 /><grandchild_2 /><grandchild_3 /></child></parent>" wtf is element "child" for??). I used XStream for it, and started off trying to write the objects the "right" way, i.e. with private members for child nodes and getters and setters for them. However, having drunk deeply of the Python well, I got about 20 minutes into it before I realised that my time was better spent writing the actual logic of the app and did a :s/private/public/g. The pain from my ulcer faded almost immediately. :)

*I realise that pro Java coders use Eclipse to generate their getters and setters, but I have also drunk deeply of the vim kool-aid and find myself going to ridiculous lengths sometimes to avoid other editors or IDEs. I keep trying to navigate with the wrong keys and trying to save with :w and using :sp to open other files when I use them.


One thing to look out for here is this: if you are writing getters and setters for primitive data types like strings and ints and stuff you're doing it wrong. Instead of

  public Customer(String name, int custId)
do

  public Customer(CustomerData cd)
where CustomerData is an object. In Customer you would have getters and setters for CustomerData rather than each item that identifies your customer. This is for extensibility: As your app grows, some part of it may need other ways to identify the customer. Adding these ways won't break your Customer interface. That's the purpose of the rigor of the design standard.

Now what I do, which goes against stated policy, is keep the CustomerData variables (String name, int custId, and so on) public so I can set them easily. The only time I require getters and setters for primitive types (for example, name) is if I were to read in data from a user generated form that needed to be sanitized.

My apps are full of these customized data objects, and it's a technique which has saved my ass as I added features etc.
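A minimal sketch of the parameter-object idea described above, using the hypothetical Customer/CustomerData names and the public-field shortcut the poster admits to:

```java
// Parameter object: customer identifiers grouped into one class, so the
// Customer constructor never changes as new identifiers are added.
public class CustomerData {
    public String name;   // public, per the poster's against-policy shortcut
    public int custId;
    // Later: public String email; -- added here, Customer's interface intact.
}

class Customer {
    private final CustomerData data;   // the one private member

    public Customer(CustomerData cd) { data = cd; }

    public String describe() { return data.name + " (#" + data.custId + ")"; }
}

class Demo {
    public static void main(String[] args) {
        CustomerData cd = new CustomerData();
        cd.name = "Ada";
        cd.custId = 7;
        System.out.println(new Customer(cd).describe()); // Ada (#7)
    }
}
```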


> Now what I do, which goes against stated policy, is keep the CustomerData variables (String name, int custId, and so on) public so I can set them easily. The only time I require getters and setters for primitive types (for example, name) is if I were to read in data from a user generated form that needed to be sanitized.

It's funny, too - though this type of thing goes against best practices, the primary argument for always using getters/setters (some day you might need to validate input or do other sorts of logic, and it's a pain in the ass to put that in after the fact) is pretty much 100% shattered these days, because any IDE worth a damn makes it possible to switch from public field access to getters/setters in a couple of keystrokes ("Encapsulate field" and "Inline method" are typically all you need).

Though it's worth keeping in mind, most of the Java best practices tend to assume that you're writing code that other people will have to use, people whose codebases you won't have access to after you release your code, and that's a much more difficult context to program in. It requires a lot more bureaucratic nonsense to leave your API flexible when you have to worry about breaking other people's code with each edit you make, and extreme paranoia is more than justified when that's what you're worrying about.


ouch.

If you encapsulate all the data about a customer in a "CustomerData" object, do you really need a separate "Customer" object? Or do you mean, not the whole customer but larger bundles of information like addresses etc. that don't need any business logic or abstraction but are nice to have as one single bunch of data?

(I mean, there's a reason why C++ still has "struct"s even though OO zealots will tell you they come from the devil).


The latter. In my contrived example, I meant that "Customer" was the main object you would use, with methods etc. to work on customer information. Then the data that would pass between objects is encapsulated in its own (very simple) class.

Edit: I didn't parse the "ouch" the first time I read this. Sorry, I'm not trying to be an ass about it. But when I first started trying to figure out the "right" way to do it, I came across something (probably by Allen Holub) that confirmed something I was suspicious about: There is no sense in having a private variable if you are just going to expose it with public getters and setters (except for sanitation purposes). The real point was to pass in the data through data objects, and keep the private members unexposed. This also means (typically, at least for me) a lot fewer getters and setters. It's completely against the Java Beans "way," I think, but it works well in real life. I use Java Beans only for persistence, because I like the automatic JB-xml conversion. That is, I make data objects out of the data that I want to store into a Java Bean, and do the automatic xml storage and read thing. It took me a while to figure that out. My first app, I hand-coded the xml. Ugh.


I don't get what you mean by "toy language." Clearly some big, powerful, practical stuff has been written in it. Usually the phrase "toy language" is used to refer to those with little practical use. Java is unwieldy, but it has a rich ecosystem and compiles to very efficient code across a broad range of platforms.

If you want Java without the unwieldiness, try Groovy. It's Java with a ton of syntactic goodness added; for instance, the default variable scope is "create a getter and setter which I can optionally override." Groovy interoperates well with Java libraries. The downside is that it's a dynamic language, so there is a significant performance penalty.


As I said earlier.

"Secondly I don't mean to criticize Java. By calling it a toy language, I was simply referring to the fact that Java tends to make writing bad code difficult, and in doing this takes away some of the flexibility and power that you tend to associate with other languages."

Judging from the general reaction it seems I should not have used the term "toy language". My apologies if I ended up implying that Java has little practical use.


> Java tends to make writing bad code difficult

I've known people who could write bad code in Java with the utmost ease!

It'd probably be more accurate to say that Java deliberately limits its expressiveness, in order to make it harder for people to shoot themselves in the foot, and to make it easier for one programmer to understand what another has written.


> It'd probably be more accurate to say that Java deliberately limits its expressiveness, in order to make it harder for people to shoot themselves in the foot, and to make it easier for one programmer to understand what another has written.

That is exactly what I meant.

Bad programmers exist. They will write bad code, no matter what language they're made to use.


> Java tends to make writing bad code difficult

If this were actually the case, it would be a bonus point for Java. It isn't, however. Nor is it the case for Python, Ruby, or even BDSM languages like Ada.

How many times have you come (when searching for a used car or trying to reserve a table at a restaurant) to a slow, ugly, non-functional URL that throws an error message if you press the wrong button at the wrong time? Usually those URLs ended in ".jsp", ".asp[x]", ".cgi" or ".php".

I'd argue that the largest amount of bad code out there is in Cobol, followed by Java, C#, and then C/C++, PHP, and various BASIC dialects. That has nothing to do with how good those languages are, but rather with the fact that most code out there:

a) is created by IT departments or outsourcing firms staffed by mediocre or completely disinterested programmers (I've never been in such an environment, but my hair stands up every time I read horror stories on the dailywtf or proggit)

b) shouldn't have been created in the first place (there are commercial and open-source packages that do the job), if it weren't for NIH syndrome. There are tons of restaurants with their own order/registration forms despite the fact that OpenTable is widely known and available. There are tons of non-technology companies writing their own accounting systems even though there's a multi-billion-dollar-a-year industry around that type of software (one that employs programmers who understand not just J2EE but accounting itself as well).

c) is forced upon its users (due to the sunk cost fallacy) and thus isn't exposed to market forces

d) is written in those languages as they're most common and are either easy to learn (PHP, BASICs) or are widely taught in colleges or trade schools (Java, C#, C++)

Truth is, there is no magic language bullet. Some languages are more expressive, more pleasant to program in. Some produce faster code. Others are more scalable in the sense of making it possible to write whole systems in one language. Some are more "safe" in the sense of being less prone to garden variety security attacks, less likely to crash the entire machine when there's a bug (at the cost of imposing restrictive abstractions on the programmer).

No language, however, is a substitute for a team of competent and interested developers solving a relevant problem.


C++ feels incredibly bureaucratic to me too, though. Like separating declaration from definition into header files.

What do you think of ooc-lang's syntax? http://ooc-lang.org/


I think the header-file and source-file distinction was more of an architectural decision (so that you could implement an efficient compiling and linking system) than a language design decision.

Just an opinion, though.


Of course it was architectural, but that's precisely the problem: why are we still using a language designed for an age of computational scarcity?

another example: functions should be virtual by default.


> another example: functions should be virtual by default.

No no no!

C++ got this one right, and then C# did better ('virtual' to declare a virtual/overridable function in the base class, and 'override' to replace it in the subclass).

http://www.artima.com/intv/nonvirtual.html

Every time you say virtual in an API, you are creating a call back hook. As an OS or API framework designer, you've got to be real careful about that. You don't want users overriding and hooking at any arbitrary point in an API, because you cannot necessarily make those promises. And people may not fully understand the promises they are making when they make something virtual.



