...Sure, your computer can perform 10 billion floating point operations per second. But most of the time it’s not doing anything at all. Just like you...
This is a great rant. Nice emotional content, lots of technical details, the author qualifies his credentials, etc. Easily one of the better articles I've read in the past few weeks.
One of the things he mentions is the pain of setup -- something I've painfully watched develop over the years. Used to be you could go from a dead stop to programming something useful in about 10 seconds. Now, as he points out, it's not unusual to spend weeks digging around through vendor requirements, obscure dialects, rumor and configuration nightmares simply trying to get started. It's crazy. We've complicated the plumbing to the point nobody knows how to work the damn shower any more.
Ah, you youngsters. Trust me, it is FAR easier now to get started than any time in computing history. Before GitHub, you installed SVN or CVS on your own. Before that you accidentally overwrote your software once a month. Before Cloudant you installed MySQL on your own. Before that you wrote your own storage system. Before Amazon you bought a server and installed Linux on it. Before that you wrote customer databases using an Apple II with 48KB of memory and floppy drives with 120KB of space. Before even my time, programmers used the spin rate of storage drums to calculate where to place blocks of data so they could be read as fast as possible (hint: sequential block placement was not optimal). No whining - you've got it good. Go do great things.
I love this perspective. I think part of the problem is so many of us are stuck doing stuff that is far too damn complicated to be fun to build or fun to use. If you're burning out on hard stuff, do something easy for a change that puts a smile on someone's face. If you don't think you can make a cynical, stressed-out adult smile, make something that puts a smile on a kid's face - that's easy, and it's a way to remember the joy of building fun and creative things.
When I was a kid I loved computers and I saved my paper route money and bought Borland C++ and tried to learn C++ from the manual. I failed utterly and I had no one to teach me. Now kids can learn easier languages that do cooler things on cooler hardware and they have the Internet to learn from.
Great things are possible and computing has made great leaps forward. Enjoy it!
I have the luxury of a corporate job that allows me to create software applications for internal users that are much better than what they were using before - so much better that their daily enjoyment of their jobs is significantly improved. (And that says a lot more about how bad their old tools are than it says about how awesome a programmer I am.)
As challenging and frustrating as it can sometimes be for me to get the application stack working, it's nothing compared to the aggravation and sheer misery I'm eliminating from my users' workflow.
One group sits close enough to me that I can actually hear the reduction in swearing each time I replace a shitty old tool they have to use with something that was actually designed with their needs in mind. That's a pretty nice feeling.
Totally agree. I still remember the pain I had almost a decade ago installing something as simple as a Borland C++ IDE. What a nightmare that was for a newb like me, with zero resources online to dig through. Today, you want to set up an IDE or get iPad development going? There are TONS of resources with thousands of examples. Hitting an error when setting it up? Copy and paste that error into Google. 9.9 times out of 10 you'll find someone who hit that error before and has already solved it.
I'm just not buying into this "oh, life is so hard to set stuff up nowadays" routine...
I don't think the point of the article is to say "oh, life is so hard to set stuff up nowadays".
Rather, the point is that it is easier and more pragmatic to rush ahead with an imperfect technology stack and achieve great things than it is to spend the time and effort to reach "a better place".
I love how far technology has come, I love writing iOS software, and I appreciate the resources at my fingertips via Google. However, I have a similar range of experience as the article's author, and I completely agree that while great things are being done with technology as it is today, it's very sad that we've limited ourselves to reaching only where we are now rather than where we might have been.
Go build a new computer architecture, Zachary Morris. It seems you have the knowledge and the support. Put it on kickstarter and I'll give you money. Godspeed.
A new architecture isn't going to happen until it has to, which fortunately may be soon. The best opportunity for that is if and when we start building parallel microprocessors based on one of the newer transistor replacements, but engineers need to agree that compatibility with x86 isn't important, first.
Youngster? Gee, thanks! I remember writing my first program on a Commodore Pet with 4K of RAM. Used peek and poke statements to draw the game field on the screen and loaded and saved the whole thing on a cassette tape.
The problem here is that you can argue this either way, so let me clarify. When we do systems analysis, we first focus on "happy path" scenarios. You want something, there's no exceptions, you push a big red button and it shows up. Life is good.
For those kinds of behaviors, say provisioning an entire linux stack on AWS, things are rocking. Want to code? Type in a couple apt-gets and you are on your way. There is a huge section of happy path scenarios where things are truly much better than before -- easily enough that many startups can totally stay on the happy path and do awesomely. (And that's exactly what they should do.)
But, alas, because of the dozens of layers of abstraction, and dozens more API, tooling, and hardware configurations, if you are doing anything technically difficult you can easily leave the happy path and get lost in the woods. It's very easy to get into a situation where you are using a combination of a dozen things in a way that is almost unique. Yes, each thing is awesome and easy to use, but the scenario you find yourself in is not. It's not called the "sad path"; a better name would be the "terrible path of shame and destruction." Ever work inside a problem where you're dealing with multiple edge cases across several layers of a complex system with parts that are inscrutable? Ouch. I've plowed through complex and poorly written C++ frameworks line by line, and it's not as bad as that.
The author spun out one such scenario for writing iPad apps, but you could find a hundred of these cases easily. Boards are full of guys asking questions about combining A, B, C, D, E, F, and G into a situation where they are having problems, and responses are usually along the lines of "Hey, I know A, C, F, and G, and here's what worked." Sometimes that's helpful. Sometimes not.
I'm fairly new to linux, although I'm an old .NET/Win32/C++/COM hound. About a year ago, I thought about writing a mono app in F# to do functional reactive programming for the web as a side project. Basically Node. After poking around for a few days, I finally gave up. As far as I could tell, I couldn't find an entry point into a mono app such that I could load the app and hold it in memory, then re-enter it from Apache. (It probably could be done directly from the command line, but I didn't want to give up Apache.) I wanted the speed associated with re-entrancy.
Now perhaps somebody will reply and tell me the magic formula to make all of that happen. If so, awesome. But it was just too much bullshit. Finding stuff on Apache? Easy. Setting up a linux box? No problem. Loading up mono? Piece of cake. Getting F# up? A bit of a hassle, but I worked it out. Tying it all together in that unique combination, in a configuration not commonly used? A very painful thing. I'm not saying it was impossible. I freely admit being a wuss and giving up. But this kind of frustration is all too common these days.
Another example. I help teams a lot. Many teams I help are starting out on their project in a greenfield environment -- it's a new project and everybody is starting fresh. It's not unusual for them to spend a week just getting their environment set up. Now here's the deal: yes, you and I could cherry pick tools and such so that we could set them up in an hour or two, but many times the organizations they are part of have already made these choices for them. So no, it's not git. And no, it's not a plain-vanilla IDE. And so on. I really feel for these guys.
So sure, you can show all kinds of examples where things are a lot easier now than before, and I completely agree. But many folks do not live in that world.
I would venture that anyone who works in a large corporation is not in this world. Which is a loss; large established corporations have some of the most interesting problem spaces...
Luxury! In my time, my father would wake me up by cutting me in half, then I'd have to walk forty miles uphill through a blizzard to get to my computer and then I'd have to electrocute myself to turn it on and I'd interface with it using an abacus and punch cards!
(Off-topic, but if you want to rejuvenate the F#-as-node.js thing, look into FastCGI. It's still a bag of hurt, though - notably, mod_fcgi feels proof-of-concept-y and will not multiplex requests over a single connection. Other HTTP servers apparently do better.)
I got as far as mod_fcgi, but I couldn't figure out how to write a mono program to integrate with it. Found some nice C examples, but nothing for mono. I still think it would be a hell of a great side project -- lots of power and expressiveness in F# that JS doesn't have.
I assume you're aware of http://www.mono-project.com/FastCGI? I suppose you could just implement the FastCGI spec yourself, it's a bit fiddly but not that complicated.
Of course, this assumes you have plenty of time to spend on such projects. ;-)
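(If anyone does want to roll their own, the fiddly part is mostly the record framing, and that part is small. A rough sketch in Python - purely illustrative, not mono, and assuming conn is a socket the web server has already connected:)

    # Rough, illustrative sketch: read one FastCGI record off an accepted socket.
    # Header layout per the FastCGI spec: version, type, requestId,
    # contentLength, paddingLength, reserved (8 bytes, big-endian).
    import struct

    FCGI_HEADER_LEN = 8

    def read_exact(conn, n):
        buf = b""
        while len(buf) < n:
            chunk = conn.recv(n - len(buf))
            if not chunk:
                raise ConnectionError("web server closed the connection")
            buf += chunk
        return buf

    def read_record(conn):
        header = read_exact(conn, FCGI_HEADER_LEN)
        version, rec_type, request_id, content_len, padding_len, _ = \
            struct.unpack("!BBHHBB", header)
        content = read_exact(conn, content_len)
        read_exact(conn, padding_len)  # discard padding
        return rec_type, request_id, content

The request arrives as FCGI_PARAMS and FCGI_STDIN records, and the response goes back out as FCGI_STDOUT records; the rest is plumbing, which is presumably why mono's FastCGI server exists in the first place.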
We've complicated the plumbing to the point nobody knows how to work the damn shower any more.
This is probably the single most irritating thing to me as a developer. And the most frustrating part is that this aspect often turns a simple project or a curious dive into a new technology into a knock-down, drag-out brawl with configuration files and dependencies that in the end produces frustration instead of working code.
The full stack isn't under your control on Heroku.
Oh, so you have the ability, time and motivation to rewrite your firmware, operating system, network stack, database and web server and implement it on a CPU you designed yourself?
No? Then the full stack isn't under your control either. Get over it.
That's a different path entirely.
You know what? It's not a different path at all. The convenience of Heroku is no different in principle to using closed source hardware.
Look, I'm a freedom-zero[1] type of person. I run Ubuntu - not OSX - and I admin my own servers. BUT the chains of convenience of apt-get are much closer to the chains of convenience of something like Heroku than many would like to admit.
To get more control one should move to AWS. But wait, I don't control the network card on the server with AWS. Hmmm... I know! I'll get a colo. Now I can purchase and build my own machine and have control over the network card. But wait, I don't control the backup power system. Hmmm.... I know! I'll rent a building, get an internet backbone piped into it, buy some generators, and control the backup power system. But wait, I don't control the...
I guess what I'm getting at is, why is this an entirely different path? Why isn't using a service such as Heroku "the path" until you hit its limitations and need to spend a bit more time going to the next step in the process?
One could easily make the statement of "blah isn't under your control on blah" for anything. Where does it stop?
In practice, the biggest loss-of-control problem I have with Heroku is that it dictates my choice of programming language. Of course, choosing a Linux server also dictates some choices, like making it hard to run Windows-only software. But for me at least, the Linux-apps-only restriction feels less constraining than the Heroku-approved-languages-only restriction. I tend to experiment with new languages semi-frequently, and also have a bunch of code in languages that Heroku doesn't support (mainly Lisp).
At least you have documentation. In the bad-old-days before ubiquitous online documentation, you'd have to sort through whatever crap the vendor saw fit to give you. If you were very lucky, there was a mailing list you could post on, if you were very very lucky, your post to the list would actually get a response.
No, software installation today is an absolute breeze compared to what it used to be, no matter which platform you're on. On Windows, DLL hell is largely a solved problem. You can just double-click the InstallShield EXE and take a coffee break. On Linux, you can invoke the package manager for your system and have an even more seamless experience. Heck, even programming tools offer their own package management systems now. Want to install django? pip install django. Want to install rails? gem install rails. It's definitely easier than unzipping files, tweaking configuration settings, finding (and, in some cases, installing) dependencies, and then crossing your fingers and typing make.
Great analogy - every time I go to a hotel I have to figure out the damn shower; I'd never realised there were so many different concepts and designs just around shower controls.
...Sure, your car can go 150 miles per hour. But most of the time, it's not even being driven.
You need not use all of your capabilities all the time. My A/C can go to 50 degrees. Maybe there will come a day when I need that. Until then, I'll keep it above 70.
Look, here's a story: We just had our national holiday in Germany and therefore a long weekend. So I decided to code an update for an iPhone app I have. The app lets the user create funny pictures, so I thought it would be cool to have an online gallery with user-created pictures, where people could vote on the best ones and have a weekly top list. Now this is far from trivial: suddenly I need online storage, a database for users and votes, and a webservice to handle all of that. So I looked around, found Node.js, MongoDB, Heroku and S3, signed up, started reading, learning and coding. Two days later, the first version is done and working. How much did I know about this before, and how much does this stack cost me? Almost nothing.
So where's my point with this? First, this kind of story would have been impossible just a few years ago. And when you realize how it's now possible for a small developer to reach potentially tens of thousands of users without a big budget or control of the delivery channels, and how many powerful resources are now right at the fingertips of the average developer - how you just need to sit down, read and learn, and you can implement even the wildest ideas - then I can't help but think these times are great!
Coding is still hard, and despite advances in technology it may not have gotten much easier. But many things that used to be straight-out impossible can now, with the right commitment, be achieved from the comfort of your own four walls.
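(Just to make the scale concrete: the voting backend really is a handful of lines. Below is a rough sketch of its shape, in Python/Flask with MongoDB rather than the Node.js stack described above; MONGO_URL and the collection names are invented for illustration.)

    # Illustrative sketch only: roughly the shape of a picture-gallery vote API.
    import os
    from flask import Flask, jsonify
    from pymongo import MongoClient

    app = Flask(__name__)
    db = MongoClient(os.environ.get("MONGO_URL", "mongodb://localhost"))["gallery"]

    @app.route("/pictures/<picture_id>/vote", methods=["POST"])
    def vote(picture_id):
        # one atomic increment per vote
        db.pictures.update_one({"_id": picture_id}, {"$inc": {"votes": 1}})
        return jsonify(ok=True)

    @app.route("/top")
    def top():
        # weekly top list, reduced to "ten most-voted" for brevity
        best = db.pictures.find().sort("votes", -1).limit(10)
        return jsonify([{"id": p["_id"], "votes": p.get("votes", 0)} for p in best])

The images themselves would go straight to S3, and the whole thing deploys to Heroku with a git push; none of that needs much more than a config file.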
How would what you did be impossible a few years ago? The specific services and products that you listed might not have been available, but they are hardly required for something so simple. Any competent Perl developer could have done the same thing in a couple days a decade ago.
Of course a competent Perl developer could have, after all that is his domain, but he'd still need to come up with scalable hosting somewhere - and why exactly would he want to do that in the first place? As someone whose primary field isn't web development, it's amazing what one can put together in a weekend these days - if I had, say, a J2ME app a decade ago and wanted to do the same thing, I don't think I'd even try.
An excerpt from an article[1] I ran across a few weeks ago seems appropriate:
"The printer was the first drum printer that I had ever seen. It would print 1500 lines per minute alphanumeric, and 1800 lines per minute when only numeric values were being printed. It cost $243,450. Its reliability was somewhat suspect. I walked through the room that it was kept in every day for a year, and the only time that I ever saw it working was at a trade show in Los Angeles. The main reason that I went to the show was that I heard that the printer was there and working. I suspect that the printer was a strong contributor to the demise of Toni Schumans' career with Burroughs. Doug Bolitho was giving a plant tour to a group of potential customers one day and he somehow had the printer printing something. Toni walked into the room and loudly exclaimed "My God it's working". She left Burroughs shortly after the incident."
It's an excerpt from an autobiography. The article as a whole originally came up because the author worked for a summer with Donald Knuth - as in, Donald Knuth of the Art of Computer Programming, the quintessential tome of accurate, elegant, academic, truth-on-a-whiteboard computer science. Presumably Knuth sometimes walked by the non-operative printer some days in the morning too. His job at the company was to write a compiler, and reportedly it was a very good one.
Success and failure in the state-of-the-art have always coexisted. I sit here - as an iPhone programmer, who has recently had to deal with annoying provisioning issues, LLVM and GCC compilation problems, and advertising networks - and I can look through the window of my office to see our printer, which is located behind the water cooler, available over a wireless network, and can accurately print a requested piece of paper the first time I send a print command from my laptop. I think it cost $200 from the office store - in 2011 dollars, before accounting for inflation.
My office has a printer, yes, it's available over the wireless network. However, it cost... somewhere in the neighborhood of $10,000. I'm not entirely sure, we have a five year lease on the thing, and it costs something like $500/month to run. It's an okay printer, as long as you run Windows.
There is a postscript module for the printer, but it costs around $1000, so Mac/Linux machines are out of luck, and our vendor hasn't actually said when we can get such a postscript module installed. (And I get people in my office at least once a week asking how they can print from their MacBook.)
From a hardware perspective, we've come a long way. From a software perspective, it's a wonder that we're still using proprietary nonsense protocols for printing and scanning. And my organization is stuck with a 5 year lease. But even if we weren't stuck with a 5-year lease, it's a $10,000 printer that is incompatible with OS X.
That said, I think this article is over the top and mostly wrong. But it is a great jumping-off point to talk about the limitations of the jumble of incompatible technologies we find ourselves working with, and how we can make them better.
I wasn't sure where to chime in on these comments but this one resonated with me because I used postscript for a year at hp and it's a remarkable language. I think it's a shame that it's mostly unknown today and the opaque pdf standard has taken over.
The world has largely established that http is the way to query devices and control them. Device drivers are the spawn of the devil to me and completely unnecessary (thank you Microsoft). They may very well be the pinnacle of what I'm complaining about.
Printers could have a free wireless web interface where you upload any file type from tiff to doc and it "just works." I realize it's more complicated than that because of colorspaces and half toning and blah blah blah. But it shouldn't be.
And I shouldn't need any special software to save images from my scanner or take a snapshot with my webcam. I actually wrote a command line tool on the Mac to tell Quicktime to save a snapshot from the webcam as a file. That is pathetic and makes me want to hit myself over the head with a sledgehammer.
I don't think I've said anything earth shattering here but wake me up when any of this happens.
"Device drivers are the spawn of the devil to me and completely unnecessary (thank you Microsoft)."
We sort of had this before Windows. When you bought a printer you had to make sure it had an Epson FX mode to support WordStar, Diablo 630 to print your invoices, and IBM Proprinter to print your mainframe reports. It was a bloody mess.
The Windows printer driver model isn't great and MS appear to recognise this, but things were a lot harder when every application did its own thing.
No other industry has the features you seem to want. You can't just put any tires on your car and have it just work. You can't just throw any gas in it.
As for printers, PS was proposed as a standard so that printers "just worked". It cost money to license, not everyone jumped on board, and so it didn't establish itself as the dominant standard. PDF is based on PS and people could certainly standardize around it. The problems are the same as with PS, however.
Your webcam snapshot example: I assure you that you are not the first person to think of this feature. Apple does distribute Photo Booth with OS X, after all. Is your complaint then that they didn't cater to the incredibly small minority who want a command-line tool to do this? I don't see this as a reasonable complaint. It's certainly not something to hit yourself over the head with.
None of this has anything to do with computers or the state of the art. Your complaints seem to be that society is not catering to your specific needs quite enough, or that people are not working together quite enough. While I agree with you on the latter point, it's not worth getting so worked up over. It's always been this way and likely always will be.
Also the only decent-performance chips doing it were from Adobe, and they weren't cheap. At the time, this would have pretty much cut the low-end printer market to the ground. The high-end printer market, of course, pretty much all did PostScript if you asked them to.
I don't think he's complaining that there is no CLI utility by default, but rather that he can't write one other than by going through Quicktime, because instead of a widely accessible (e.g. HTTP) server, it has some proprietary driver.
Try using PCL. My wife's office had a printer that didn't have a postcript module option. Every technician said it was impossible to print from the Mac to it. I tried a couple of CUPS PCL drivers and it worked fine.
I completely agree with this sentiment. Look where automobiles were 100 years ago - nothing like the vehicles we use today; vehicles that are designed to squeeze every last MPG out of a gallon of gas, or can keep us from dying in a major car accident.
Computing will improve. Computing will always improve. I think rants like this are helpful to point out where we definitely can improve, today, to bring on the future - such as making the iPhone dev and release process easier ;-)
You are kidding, right? Cars have hardly changed at all; they're still the same petrol-powered devices with a hard shell. Sure, the car companies were forced to add in some extra safety mechanisms, but that doesn't mean much.
What? That's so wrong it's not even funny any more. If you look at the advances in efficiency (on all fronts - energy consumption, manufacturing, maintenance, ...), comfort, safety and basically all facets of personal transportation, it's nothing short of amazing. Today you can buy a new car for 4 months middle class wages and it'll be miles beyond anything you could buy just 25 years ago. (ok maybe not counting some things like leather seats, but apart from a few exceptions like that).
How often do cars break down now compared to 30, 50 or 80 years ago? I am pretty sure most of the early drivers were quite good mechanics, these days? Not so much. I'm sure Mr. Ford would find modern cars very foreign.
A chair, clothing and even houses haven't changed much over the years either. Most designs only change bits at a time, slowly morphing into unrecognisable things.
True, but a minor head overhaul would have been about 1/2 an hour with 'roadside tools' (say to replace a valve spring) possibly during your trip to grandma.
Today that same repair would be a couple of days in the shop, requiring tens of thousands of dollars of specialty tools. Of course those springs don't fail as often as they did back then (a combination of improved materials science and engineering) but when they do the fix is out of reach, even for a trained mechanic without access to a shop with all the required specialty tools.
I don't think anybody carries a valve spring compressor in their toolbox. Plus you'd run the risk of dropping the valve unless you were really careful, and good luck getting the keepers back in on the side of the road without a way to hold the valve up. Even then, who has the parts with them to make this a half-hour job? Cars are just as hard/easy to fix now as back then. Fuel injection is arguably easier to troubleshoot than a carb, given that the computers help.
It's more than that. Burroughs was the mind-blowing innovator of the mainframe era. Look up the Burroughs B5000, which Alan Kay has been raving about for years as still in some ways more advanced than anything that has come since (or at least, anything popular). We had a few threads about it on HN a month or two ago; one of them was the article the GP is linking to. I highly recommend reading the whole thing.
The thing that the authors of this rant and every other rant like it don't understand is that computer science is hard.
While tools like functional programming may indeed deliver on the promise of a 60% code reduction, they have a correspondingly higher barrier to learning. Evolving algorithms? Automatic programming? These problems become theoretically intractable so quickly it's not even funny (I currently do research on a very related problem). He wants compilers to just insert the semicolon for him? I'm glad mechanical engineers of the world don't have the same attitude about nails and screws!
Most of his complaints in truth have nothing to do with computer science at all. They have everything to do with sloppy engineering. There are all sorts of obvious reasons why computer engineering is sloppy. A few examples:
1) Developers for open source project usually are not paid. It's not surprising that documentation is weak.
2) Reliability and turn-key operation are expensive to develop and nobody wants to pay. I'm sure the author of the article doesn't either.
3) Bugs have lower impact. Screw up a building foundation and you might end up in jail. A clunky install process? Probably all that will happen is a scolding and a new issue in the tracker.
4) Things change so fast that standards can't keep up. The same goes for most other engineering frameworks that would solve many of the problems Morris complains about.
We've made and continue to make huge progress in the field of computer science. Computers have and continue to replace people in jobs all over the world. Morris should be happy they haven't replaced his job yet. Not working may sound nice, but having an income is also nice. That has nothing to do with computers.
Computers have made our lives easier. If I went back 10 years and told my younger self what I can do today with just my mobile phone, I doubt my younger self would even believe me.
The problem is not that progress is bad. It's that progress is moving too fast for engineering to keep up with. The state of the art is constantly changing.
> While tools like functional programming may indeed deliver on the promise of a 60% code reduction,[…]
Most code is badly written C++ that could have been written in, say, Ocaml, Haskell, or Lisp. This is easily a 10 fold reduction. And the guys at http://vpri.org/ are doing at least five times better than that (50 fold reduction compared to well written C).
On your point 3: bugs have lower accountability, meaning lower impact for those who commit them. If they have lower impact as well, that's because they forced us to distrust software.
(The downvote suggest I may have to clarify my statement. I suppose unqualified bashing isn't exactly a good idea.)
By the LOC count of currently maintained projects, most code is definitely badly written, and in a low-level or otherwise unsuitable language such as C++ (C++ is fast, but most of the time we shouldn't even care.) Most code I see (beware selection bias) is indeed badly written C++ (I even authored some of it). Now, if you count by project times end-user popularity, then you have a completely different story. It is a poor proxy for programmer's work quantity, however.
Regarding the 10 fold reduction, just know that I experienced first hand 5 fold reductions when doing my best with C++ and Ocaml respectively. A 10 fold reduction is not unrealistic at all when we talk about badly written C++. For instance…
…I currently write a plugin for a 2-million-line program. The exposed interface alone weighs in at 80,000 lines. Many classes have hundreds of methods, and the inheritance hierarchy is generally 5 levels deep, sometimes up to 8 or 9. And it actively deals with no fewer than 6 different types for handling strings (std::string, char*, old Pascal strings of 3 different types, and some String type that inherits from std::string). Oh, and it's slow. C++ didn't even help here. We are past the point where we can talk about technical debt: it's a Big Ball of Mud.
For such a program, I expect a 20-fold reduction. And I bet there are other Cthulhu abominations out there that make this program look like a little diamond.
Semicolons in a grammar are nothing like nails or screws. Nails and screws hold things together. The vast majority of the time, semicolons in code are only visual indicators of the end of a statement; another perfectly good visual indicator is a newline. Many languages that require semicolons could (with minor modifications to the grammar) be cleanly parsed without any sort of statement terminator. See the Lua grammar for an example: http://www.lua.org/manual/5.1/manual.html#8
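(A toy way to see the point, in Python rather than a real grammar: treat any newline outside brackets as the end of a statement and the semicolon has nothing left to do. This is a sketch of the idea, not a parser.)

    # Toy sketch: newlines terminate statements, except inside (), [] or {},
    # which is roughly the rule Lua-ish and Python-ish grammars get away with.
    def split_statements(source):
        statements, current, depth = [], [], 0
        for ch in source:
            if ch in "([{":
                depth += 1
            elif ch in ")]}":
                depth -= 1
            if ch == "\n" and depth == 0:
                stmt = "".join(current).strip()
                if stmt:
                    statements.append(stmt)
                current = []
            else:
                current.append(ch)
        tail = "".join(current).strip()
        if tail:
            statements.append(tail)
        return statements

    print(split_statements("x = f(1,\n      2)\ny = 3\n"))
    # -> ['x = f(1,\n      2)', 'y = 3']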
> He wants compilers to just insert the semicolon for him? I'm glad mechanical engineers of the world don't have the same attitude about nails and screws!
Yes! I guess we could learn a lot from other engineering disciplines.
It's always good to go back to the start and remember how much worse things have gotten since any arbitrary time in the past. Before complaining about this rant, go out and explore some of our history:
Yes. I was just talking to my prof./boss today who worked at Stanford, PARC and later Interval about moving some things to Dropbox, arguing that it's a better long-term solution since it's just *the file-system* and the system doesn't rely on any specific representation (such as a database) or syncing mechanism (can be swapped for rsync, git, AeroFS, etc.). Hearing this, he launched into a rant about distributed file-systems (like AFS) and how hierarchical file-systems are themselves a specific and ultimately transient technology. Working with him and another fellow who was hacking UNIX in the early 80's, you get a sense of a sort of terminal frustration from having witnessed the golden years of research in computers first-hand and the subsequent failure of most of those ideas to manifest.
Today's computing is built mostly on tools that were originally cheap hacks intended to be replaced. If you have not watched the videos above, for the sake of our future, please do.
Ward's Wiki is wonderful. It totally reminds me of the old days, when a website was nothing more than a collection of hypertext links.
The original rant really does make a good point about how accessible things once were. Gosh even being 24 I remember the days of screwing around with logo and qbasic and hacking serial ports with zterm. It's not that everything 'just worked' then (It didn't) but that if it didn't work you could still get something useful out of the machine. The layers of abstraction from the hardware have grown so much now that nothing makes sense.
Anyone who liked this article may like to know that Alan Kay and his team at the Viewpoints Research Institute are currently making a silver bullet. A real one. Of the kind that really, really hurts the Complexity Werewolf. They may not kill it, but the scars are already visible.
Take a look at their work, most notably the last STEPS progress report. They can use 50 times less code than well written, useful C code (like TCP). Compare with redundant, useless, badly written C++.
The secret is quite simple: you can write your own programming languages, and good code looks like a formal spec (to the point of being one, ideally). If you can't write good code, then write the language that will let you. How? Start here: http://www.tinlizzie.org/ometa/
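(To make "write the language that will let you" a little more concrete: the kernel of this style is tiny. The sketch below is a Python toy, nowhere near OMeta's actual machinery, just a hint of how little code a few parsing combinators take.)

    # Toy parsing combinators: each parser takes (text, position) and returns
    # (new_position, value) on success or None on failure.
    def char(c):
        def parse(s, i):
            return (i + 1, c) if i < len(s) and s[i] == c else None
        return parse

    def seq(*parsers):
        def parse(s, i):
            values = []
            for p in parsers:
                result = p(s, i)
                if result is None:
                    return None
                i, value = result
                values.append(value)
            return i, values
        return parse

    def alt(*parsers):
        def parse(s, i):
            for p in parsers:
                result = p(s, i)
                if result is not None:
                    return result
            return None
        return parse

    # A two-line "grammar": the string "ab" or the string "ac".
    rule = alt(seq(char("a"), char("b")), seq(char("a"), char("c")))
    print(rule("ac", 0))  # -> (2, ['a', 'c'])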
Other efforts similar in spirit but dramatically different in execution are Intentional Software and JetBrains MPS. I really need to find a weekend to try to parse through VPRI's work. Has anyone tried to make a digestible survey of the work they have done? I've glanced through their documentation but it seems quite sparse, and their stuff is a moving target.
Kay and Simonyi are both PARC alumni, the place that has brought us a lot of the progress we've seen so far. This should be a slap in the face that current generations are making Angry Birds and Farmville, and these guys feel it is still upon them to do anything in the way of progress. It's just shameful.
I don't think they are similar in spirit. These other projects are missing the radical vision that is by far the most important and guiding thing behind Kay's group: to recreate all of personal computing in 20,000 lines of code. Operating system and applications. That is so profound that, if they succeed (a big if) it can't help but change the world. Simonyi and the Jetbrains people aren't in the same galaxy.
(I agree with you about it being kind of hard to follow what they're doing, though.)
I think you're missing the bigger picture here. It's not about a specific application, it's about the notion that the way we write software is fundamentally broken. All of these projects agree that each problem domain can potentially see orders of magnitude reduction in both LOC and bugs if custom languages are designed that fit the domain more appropriately than a one-size-fits-all approach that's been the status quo.
People are going to look back at this as the 'golden age' of the web. Yes, things are a little screwy all over the place, but you can do ANYTHING you want now. NONE of the perfect software has been written. It's all waiting for improvement. You can do anything you want and profit from it.
And people just complain that life isn't easy enough. Psh.
That was what I thought he was aiming for too, but then he, out of nowhere, suddenly praised PHP. I'm 99% sure that I don't have to check out PHP again to see if there's really something about it that I've missed.
Yes, the PHP "language" sucks. But that isn't what matters: From a practical standpoint, it's good for getting things done.
The reason? Almost zero deployment complexity. You can take Joe Web Designer off the street, give him SFTP (or, more likely, FTP) credentials, and have him modifying your site in minutes.
I love PHP, but I'm in programming like a shade-tree mechanic: I just want a large enough hammer to make something work, I don't care how pretty the end result is.
I think the source of your frustration here is idealism. You're assuming, first of all, that things should be "better" and second of all that things could be better.
Software and technology in general is like a gigantic thriving petri dish. The natural world is terribly inefficient, too (I need how many sperm to germinate one lousy egg?!)
The simple fact is that I'm able to read your rant, think about it and reply whilst sitting in a cafe on a mobile phone. That is awesome progress.
The problems we deal with as engineers are the reason we get paid to do it. Compare this to the practice of music: the art is in the beauty of the song. The engineering is in getting 5 stoners on stage simultaneously and managing to get paid by the ephemeral promoter at the end of it. I'd rather wrangle with a broken FreeBSD port dependency in the comfort of my office listening to music of my choosing (streamed over the internet, and made by people using computers) than starve because of crop failure.
In every life we have some trouble, but when you worry you make it double.
This sounds vaguely like one of the many "We must have a Do-What-I-Mean language" posts I've been reading since at least the 80's.
Yes, many things in the "state of the art" could certainly be better. But the belief that that is only so because the powers that be want to make money is rather misguided. Building better tools and developing better techniques is hard work.
(I'm not even going to comment on the fact that the author in the same article demands a more formal basis of our craft and at the same time thinks PHP is the best language ever. I was slightly amused)
Sounds like he is bemoaning the technical debt problem, writ large. While you can't resolve this from the gate level on up overnight, you can create your own "technical oases" that are largely free(er) of technical debt than the crud they float upon.
Ya that is a good way to sum it up, thanks for the brevity. I'm finding that even my thinking has been clouded with complexity, which leads to my large rambling posts. I'm trying to imagine a world where everything is more elegant from the ground up, which is perhaps possible now that we have hindsight, and then apply that to my own work.
I really disagree with a lot of his solutions, and I think his argument is muddled and unclear, and I hate the idea of moving toward languages like PHP that cover for programming errors (that often mask logical errors you want the compiler to check) but I see where he's coming from.
My complaint is that a lot of the measures he proposes (self-modifying code? seriously?) are going to make the problem a lot worse.
Across the industry as a whole, 80 to 90 percent of software engineer time is spent cleaning up crap (sometimes one's own, but this is at least educational) that wasn't done right in the first place and, yeah, it's frustrating. You need to have an unusual degree of autonomy (effectively your own boss or at least CTO) to be able to create such "oases".
Ya I'm beginning to feel out of the loop compared to my peers. But I've been down so deep in some things that I'm profoundly aware of how flawed they are. I need smarter tools and was looking to things like self modifying code as a way to help programmers explore problem spaces and then maybe we could freeze the solutions when they perform properly. I definitely agree that maintaining code is fully 90% of this job.
You obviously know a lot about computing, certainly more than I do on the whole, but I'd recommend leveling up on PL design.
Self-modifying assembly code, in the bad old days, was not uncommon (as a performance optimization). It is, however, utterly unmaintainable.
I would recommend the book Types and Programming Languages. Also, you should spend a year in a strongly and statically typed language. Start with Ocaml or Haskell because they are "purer", then move to Scala if you want. This will give you some ideas on what infrastructural choices make feasible the development of software that isn't complete garbage.
Good advice. Regardless of language, I tend to write a lot of code in a functional manner, which throws off my business partner because he writes macro-style code (where the code is ever-evolving and is practically a media file). So he's about 10 times as prolific and I have a hard time demonstrating why my approach might be better until years down the road. On that note I bought O'Reilly's "Erlang Programming" by Cesarini & Thompson, but haven't read much yet. I'm hoping to find a purer language that can give me some leverage to do what I want with concurrency in the near future. Thanks for the advice on which order to go with.
Oh sorry, basically treating code as any other media file. "Game Scripting Mastery" by Varanese and LaMothe goes into a lot of detail about creating your own language and compiling it to I-code and then running the scripts in the game, the same way you would load images or sounds. My partner writes very expressive code with creative macros (he independently discovered iterators by defining a macro like SPRITE to mean sprite[count]). That might seem a little strange to a computer scientist, but look at what he did. He didn't have to explain how references or lists or anything else worked. If you understand arrays and #defines, then you can use iterators too. And he does things like that to me all of the time, writing one liners to replace my pages of "proper" code. The only problem is that he does it in c++ instead of javascript/python/lua so our games are in this constantly evolving state, so my top priority is adding a scripting library to our engine. Then again, that introduces the need to bind all of these functions and data. It will probably be a win though because we tend to work on large games.
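(For the flavor of what binding can look like at its most minimal, here is a toy Python sketch - hypothetical engine calls, nothing to do with our actual C++ engine - where the whole "binding" is just a dict of callables handed to the script.)

    # Toy sketch: expose a couple of hypothetical engine functions to a script.
    # In a real engine these would be C++ functions wrapped by the embedding API;
    # here they are plain Python stand-ins.
    def move_sprite(sprite_id, dx, dy):
        print(f"sprite {sprite_id} moved by ({dx}, {dy})")

    def play_sound(name):
        print(f"playing {name}")

    ENGINE_API = {"move_sprite": move_sprite, "play_sound": play_sound}

    def run_script(source, api=ENGINE_API):
        # the script sees only what we put in its namespace
        namespace = {"__builtins__": {}}
        namespace.update(api)
        exec(source, namespace)

    run_script('move_sprite(1, 5, 0)\nplay_sound("jump")')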
I'd like to buy Zack a drink, because all of this is obvious to those poor shmucks among us that handle the support end of things (also known as the "shit end").
I've been saying a lot of the same things for years. I'm tired of it now; I'm starting to give up, because it's obvious there isn't a programmer out there that gives a damn. You can try telling them that there's something wrong with software -- something fundamental, something in the process itself that has become horribly broken -- but they'd rather tell you why you're wrong than really listen to why you're frustrated.
Then if, as a user (or programmer) you complain about something, they say, "So build your own version." What, we're supposed to re-build the world? OK, fine. So you start building your own version of something. Then the response is, "Why are you reinventing the wheel? That's already been done."
But, what the fuck do I know? I spent a couple of hours in a meeting today with a client explaining why upgrading -- er, pardon, "migrating" -- from Joomla 1.5 to 1.7 wasn't going to happen without a lot of money involved. Then I went to another client and we fixed a broken Windows network stack and a handful of other stuff. Then I came home and unboxed the new laptop that I just bought because my old one couldn't run Firefox anymore -- even though it ran it just fine a few years ago. Other than the new laptop, this is pretty much my day -- all day, every day. Well, me and the other guy in my shop.
And nobody else seems to think there's anything strange about this stuff. So Samba's documentation is a mess and Samba 4 never got around to implementing allowed_users, which is absolutely necessary in a Windows AD environment? Pfaah, big deal, so what? So Western Digital's backup software interferes with Outlook in funny ways? Psh. Who cares, who's that going to affect? So everybody's decided to abandon sane software versioning altogether and make the support end of things even more nuts? Hah! Get with the program you support idiots, you're supposed to want to spend all day upgrading software and dealing with the inevitable fallout.
Besides, users that don't like upgrades are just morons, they just don't know what they like. As soon as they get used to the new version, they'll like it better, you'll see.
All of this is stuff I've personally heard, or seen in places like HN ... and not just a few times here and there.
Software used to be fun. I remember when it was, when it seemed like most things just worked, even though they didn't look pretty. I remember when it seemed like I could just open something up and start hacking on it without having to hold tens of thousands of lines of code in my head, spread over dozens of files. I remember when looking at someone else's code could actually teach me something, instead of making me want to cry.
I really don't like this industry much anymore. I guess that makes me a bad hacker or something.
I remember when it was, when it seemed like most things just worked, even though they didn't look pretty.
I remember spending hours moving around jumpers to set DMA and IRQ. Only to find out that the only free IRQ was #9 and bloody Packard Bells sometimes didn't have a frikkin' 9 because that batch of boards was $2 cheaper that month...
I remember when it wasn't pretty and it took ridiculous efforts to make it work even a little. Now I mostly jam a USB thing in the side and it goes. There has been much improvement.
I'm actually a little psyched by it all. This just happens to be a really hard problem that arrived at humanity's doorstep before we were really evolved enough to make much of it. It might mean we're ahead of the game, and it almost certainly means that the best is still yet to come.
I was thinking about this last night, the evolution problem - whether the only effective strategy would be to synthetically reintroduce tribal-era scarcity so we could all find our balance again, or would moving to a genuine post-scarcity world be something humankind could handle, or do we need to find some way to hack our minds to accept this new availability model in the same way that alcohol or cocaine short-circuit our reward pathways?
w.r.t. OP, should lines of code be restricted to manageable amounts by some kind of international regulatory fiat, or can they just abstract the problem away into utterly comprehensive libraries of functions, or do coders just need to man up and take it?
I think maybe we just need to grow into it. This takes time, and probably generations. The first generation that has truly never been without a globally interconnected computer as part of their life is about to come of age. Let's see what they can do.
There was a time when there were no humans who knew how to drive a car.
Yeah and at that time say there were dudes running around on the rigging of big 4-mast galleons like it was no big deal, and if you dropped one of them on a busy road they would be as horrified as I would be if someone dropped me in their place and told me to splice the mainbrace or whatever...
I believe he's talking about the whole lot of it, software and hardware. I think USBs suffer their own issues (see http://wiki.gbatemp.net/wiki/USB_Devices_Compatibility_List as a simple example; hint: see the "If the Device does not work" section).
This fits the OP's notion that no one really understands the crap underneath it all.
Which software was it exactly that was so great and just worked? MS-DOS? Windows 3.1? Windows Me? Some specifics would be nice, because across the board it seems everything is MUCH better now. It used to be that you would buy a computer game and 75% of the time it would not work out of the box.
"I really don't like this industry much anymore. I guess that makes me a bad hacker or something."
This is the 'little death.' Allow me to relate an anecdote.
I grew up loving computers, loving everything about them, how they were built, how they were programmed, how they did what they did. I started out just as the 'personal computer' revolution was getting started, it was glorious, Altair, IMSAI, SOL20, Heathkit. Lots of folks with their own take on what the PC should be. I have spent hours and hours and hours writing my own BIOS code, hacking ZCPR3, making an emacs clone work on CP/M, falling in love with the Amiga and suffering the incurable disease of incompetent management. By the time the late 90's rolled around I was starting to burn out. A lot of stupid things which didn't have to be that way, Microsoft always trying to make their version of something just a bit incompatible and only buildable with their tools. Etc. I was writing some code on a windows box and hating it. I yearned for a simple 'make foo'.
Then I met a 'kid' who was building stuff on Windows and he had the same wonder I had when I was that age, except he didn't complain about visual studio crap because he had been introduced to computers with this as the way to do it. Where I saw re-implementations of the wheel, done poorly, he didn't see anything, they were just the tools you had to use to get to the end point.
I realized with a start that I had lost my sense of 'wonder.' That childlike state where you ignore the fact that something is uncomfortable or irritating because you have so much amazement over the thing itself. And the truth is that if you use crappy tools for a while your muscle memory will figure out how to minimize the irritation. I looked around and saw that people I knew, people who were bright lights of leadership back in the day, were now stuck in an endless cycle of curmudgeonly rant because they too had lost their sense of wonder. I decided to start picking my battles more carefully. (which you can do in a hobby, not so much at work)
I found an editor that I could use everywhere (Visual Slick Edit, now VIM) so that I could have the same editing experience on all my platforms. I spent some time to understand the build system (make, gcc, java, python, etc) to get to a point where not only could I create my own environment I could keep it running across platforms, and began to develop my own set of APIs which I could link through into the underlying platform. The goal was reduce the friction between getting stuff done, and the tools to get things done. I recognized the reward comes in the running of the code and getting it to work as I wanted.
Then I can mostly ignore the crappy stuff. I can joke about how putting a character on screen used to be to monitor the transmitter buffer empty (TBE) flag on the serial port and then write the ASCII character when that flag was 'true', to something which spends thousands of cycles checking to see if this time I want my characters to go right to left, or which code points or font I should use to display them, or how they should be alpha blended into the background of what is on a screen somewhere. And when I come across something that is horribly, horribly broken like using WebCam's in Linux, I try to develop a durable API for talking to video which isn't cumbersome to use, or has feature stubs I'll be unlikely to use. I try to stay amazed that I can capture digitally on a piece of $20 hardware that which used to cost thousands, and in so doing keep my sense of wonder about what is possible.
So, of all the replies in this thread, this one spoke to me the most. I think you're right. Thank you for describing it so well.
I started out a little after you did, with BASIC on a Commodore Vic-20 and Commodore 64, then Logo, then HyperTalk, and so on. I used to have fun decompiling programs and poking them with MacsBug to make them dance for me, and I used to have fun writing my own toy operating system and generally just screwing around. And, for the most part, my tools were simple and reliable.
So that's what I compare everything now against, and it all seems less reliable and more complicated. MacsBug was a thing of wonder and beauty compared to the "debuggers" I have to deal with most often now -- Firebug and GDBp. And I've started working recently on building my own tools, which is sort of fun again, so maybe I'm sort of headed on the right track.
But anyway, thanks for describing it like you did.
I am the child, instilled with a sense of wonder about, well, everything. I got kind of a late start; I didn't run into computers until I was starting high school, when my parents gave me an old Pentium II that they had lying around. I'm 19 now, so that would have been 2005-2006-ish. A horribly underpowered machine. I was infatuated. I installed WinXP on it for the first time, I lay awake at night listening to the hard drive wail... an 8GB hard drive. It barely fit XP on it. Over the next few years I did everything I could think of with that box to make it run just a little bit better. I scavenged my house's parts drawer, googling serial numbers on the silicon to see just what this board did, and whether it would make my relic run faster. A few months later, I upgraded my computer for the first time... A Pentium III! A 20 GB hard drive! My world had opened up. I had to have my old data and parts, so I became familiar with the internals of the computer. Eventually I ran out of things to do in windows. I tried the linux thing I had heard about, burned myself a liveCD of ubuntu, gutsy gibbon. Wifi didn't work on the cheap usb dongle I was using, so it was back to windows for me. But I had gotten a taste. A taste of the terminal, and of an operating system that actually let me tinker with things. When ubuntu fixed the wireless driver for that dongle, I was ready to install it again (after a failed fedora install or two). After that it was just one software foray after another. Something broke, and I adventured deeper into the file system to fix it. I started to learn how to program. I learned about Arch Linux. I got a dual-core computer (!!). The rest is history.
I still learn all I can, but my focus has shifted a little. I'm learning to program properly, to make new tools instead of understanding existing ones. I'm learning C and python and haskell and lisp (and looking for more!) and loving every minute of it.
I find it a little bit disheartening, however, the lack of curiosity that my peers display, even my fellow CS majors. To me, the computer is a vastly complex system, just waiting to be explored, and thanks to open source software, I can! But a lot of my peers don't see things that way. I would hazard a guess that part of the reason is that the environment presented to them is not particularly interesting, particularly in a windows or mac environment, where a lot of the details are abstracted as far away as possible. For me the spark of curiosity was ignited when I had a relic of a machine that I needed to make run faster. I tried to squeeze the most out of that computer, and it taught me enough to whet my appetite. I feel like our youth, my peers included, have been and are being shortchanged by technology today. I'm not sure what there is to be done about it, but I know that it isn't right.
I can't speak for your peers, but for me, deadlines and emergencies have probably taken a lot of the fun out of it. It's one thing when you're trying to debug some driver issue on your own system and you're doing it as a hobby so whether it gets done today or next month is no big deal, versus, say, crossing yourself and muttering a prayer and rolling out an update following weeks of testing, only to have it conflict with something you didn't anticipate and the entire network goes down in flames while your phone melts.
Or, for a slightly less doomsday scenario, just a relatively simple problem affecting one person, but they're still trusting you to figure it out, and they're expecting you to do it sooner than later.
There's a point at which the fun really starts to go out of that. I've always been a bit of an adrenaline junkie, I've always worked well under pressure, but nowadays my favorite thing is quiet time in the sunshine digging about in the garden. (Oh god, that makes me sound old.)
Maybe this will never happen to you. I hope not! But I've been thinking a lot about what Chuck said, and about my frustrations with technology, and I think that maybe this has a lot to do with it.
You should have realized this years ago, and made your peace with it. I think I made my peace at about 23, after over 10 years of programming and wading through this crap (and this process was frankly like coming out of a mild depression). I'll reiterate what I said in the previous discussion about Ryan's rant:
"Just about everybody knows that all our software is imperfect crap on top of imperfect crap, from top to bottom. Everybody, when met with a new codebase above a certain size, thinks they could do better if they started over and did it "properly this time". Everybody can look at a simple thing like a submit form in a web browser, and sigh at the inefficiencies in the whole stack of getting what they type at the keyboard onto the wire in TCP frames, the massive amount of work and edifices of enormous complexity putting together the tooling and build systems and source control and global coordination of teams and the whole lot of it, soup to nuts, into a working system to do the most trivial of work.
"But this is not a new or interesting realization by any means. It's not hard to point almost anywhere in the system and think up better ways of doing it. Pointing it out without some prescription for fixing it is idle; and suggesting that it will be fixed by wholesale replacement by another complex system is, IMO, fantasy."
(Things probably only seemed like they worked in the old days because you hadn't realized what was going on in the sausage factory; or it was an almost useless piece of hardware whose software did almost nothing, owing to its simplicity.)
That's how programming started for a lot of us. Start playing with transistors, then discover 555s, 741s and discrete logic. Eventually discover microcontrollers and start coding in assembly and then C. 10 years down, find yourself managing a large team of developers and you absolutely hate it :)
And so you start using your programming skills to travel the world while working, see some amazing places, but discover to your dismay that things aren't any better no matter where you go.
But you continue writing your own stuff, releasing open source programs here and there, and then somewhere along the way you move into freelancing, which removes the protective layer between you and unreasonable customers.
Tiring of that, you move into iPhone apps and discover that you suck at marketing.
And then one day you find yourself living in San Francisco, founding a startup with a bunch of awesome partners, working insane hours, and having a BLAST trying to solve hard problems.
There is much joy and wonder in this world; you just need to look harder.
Every morning when I get in my car for the morning commute my Android device automatically connects to it via Bluetooth. From the steering wheel I press play and my personal music collection on Google Music's cloud starts streaming through the internet, to the 4G cell network, to my device in a car traveling 70mph, to the car via Bluetooth, and I hear the music through my car speakers.
"Everything is amazing and nobody is happy" - Louis CK
This is also why IT admins become fascists over the long haul. Turns out nothing is compatible with anything, especially turn-key software. Turns out that no vendors care to test with any other vendors. Turns out that everything is in some state of brokenness and the only sane solution is to limit complexity and variety and focus on the handful of technologies that we have time to test and make work.
This is why you get a PC without admin rights, set to autoupdate, with our default image, our security policies, and our unpopular defaults (IE9 in my case) but with an option for chrome for those who request it.
I have no idea how home users get by. I hear about their problems and it sounds like a nightmare of poorly written defaults (your average home laptop is a minefield of OEM junk and a malware magnet). I suspect this is why Apple is doing so well. They abstract away the complexity and actually make their products compatible with their OS and other offerings! Toss in the fact that OSX isn't as targeted by malware, and I can see the allure. On the corporate side, linux is the same deal. We are all comfortable with the command line tools we learned in high school/college/whenever and they haven't changed. It's like an old friend. Sure, it may not generally "just work" but it's easier to tinker with and fix than a closed system.
Well, I'm a software developer and I can relate to what you're saying. I certainly don't try to defend poor software, and often I find myself frustrated with the current state of software.
But, the reality of the situation is, you can't change the whole world all at once! Do what you can, plan to change the world and operate step by step. In the meantime, make your money, earn your living and deal with the current state of affairs while you're improving the world in your spare time! :D
You remember when most things "just worked"? When exactly was that?
I remember when you would buy a new mouse and sometimes it just wouldn't work and then you actually had to take it back to the store because you had something slightly different about your computer and there wasn't a released driver yet.
I remember when Blue Screen of Death was more than just a joke, and it was an actual thing that happened on a fairly regular basis.
You know how everyone thinks IE6 is teh devil? I remember when it came out and was awesome and it actually killed off the competitors for good with all its sweet features.
That's the world I grew up in, and the current state of things is MUCH better.
> I remember when it seemed like I could just open something up and start hacking on it without having to hold tens of thousands of lines of code in my head, spread over dozens of files.
Here's me telling you what's wrong instead of listening: what you're missing here is that we're doing vastly more than we were back then. If you want a pixelized bunch of text that does a simplistic task or two, I'm sure you could do it now in even fewer lines, and more readably, than you could back then.
But no one wants that from us anymore. You can't make a living doing that. You yourself were complaining about Samba not implementing a specific feature. I bet their code is already too big to meet your "software is fun" metric.
You must have used some pretty shitty IDEs then. I use IDEs for the productivity boost, owing to code completion, fast lookups, folding, integrated documentation, syntax highlighting, integrated build system, interactive graphical debugging, integrated source control, project based search, GUI tools, etc.
Of course, all IDEs have their design flaws (like any other piece of software), but the benefits are compelling.
I use IDEs too — sometimes — but there's a dark side to each of the things you're describing above. Here's the devil's-advocate argument.
If code completion is speeding you up, it means you're writing code with a lot of redundancy in it. But the slow part of programming isn't typing the code in; it's reading it later. If the stuff you're putting in your source code is redundant boilerplate that code completion can put in there for you, then it's going to make it hard to read the code later, when you need to find the part that you actually typed, because it has a bug in it, or because you need to change it.
If fast lookups (I assume you mean meta-point, quickly jumping to the definition of an identifier from a mention of the identifier?) are speeding you up, well, maybe you're getting into a codebase you don't understand very well, because you're just starting to work on it. (Although these days grep is pretty fast.) Or maybe your codebase is an overcomplicated nightmare with too many layers of indirection, costing you comprehensibility even despite the wonders of meta-point.
If folding speeds you up a lot, maybe your code is way too deeply nested, or you have a lot of irrelevant stuff mixed in with the stuff you're trying to read.
If integrated documentation speeds you up a lot, maybe you're programming to an API that's too big to memorize, which means you're going to make errors when you call it and then not be able to see them when you read the code.
Syntax highlighting is nice, but it matters most when your code is poorly laid out and unclear, and even then it doesn't make your code easy to understand.
If you're spending so much time in your debugger that it matters a lot to you whether the debugger is an "integrated graphical debugger" or not, you're wasting a lot of time. Debuggers are indispensable when you're exploring a codebase you don't understand at all. (Although, in that case, the best one is probably IDA Pro, not part of an IDE.) But on code you write yourself, use unit tests and code reviews to minimize the number of bugs you have to debug, and in many cases, logging is a much more efficient way to track down bugs than a debugger, because you can see a whole run at once, instead of a single point in the execution. As an extra-special bonus, logging lets you track down bugs from production that you can't figure out how to reproduce on your own box.
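To make that concrete, here's a minimal, hypothetical sketch of the logging style I mean, using Python's standard logging module; the apply_discount function and the "billing" logger name are made up for illustration:

```python
import logging

# Configure once, near program startup; in production this could point at a
# file or a log aggregator instead of stderr.
logging.basicConfig(
    level=logging.DEBUG,
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
)
log = logging.getLogger("billing")  # hypothetical subsystem name

def apply_discount(total, customer):
    """Hypothetical example: log the inputs and each decision point so a
    whole run can be reconstructed later, instead of stepping through it live."""
    log.debug("apply_discount called: total=%r customer=%r", total, customer)
    if total <= 0:
        log.warning("non-positive total %r for %r; returning unchanged", total, customer)
        return total
    rate = 0.10 if customer.get("loyal") else 0.0
    log.debug("discount rate chosen: %r", rate)
    result = round(total * (1 - rate), 2)
    log.info("apply_discount result: %r", result)
    return result

# One log line per event means a failed production run leaves behind a full
# trace you can read end to end, which a breakpoint session can't give you.
apply_discount(120.00, {"name": "Acme", "loyal": True})
```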
If IDE GUI tools are saving you time, any time at all, your GUI library sucks and you should use a better one. Also, you're probably making shitty GUIs, the clicky equivalent of the IRS 1040-EZ, joyless bureaucratic crap, because GUI tools don't help you when you're trying to construct things like kseg or Inkscape or the GIMP. HTML and CSS, or for that matter Tk, allow you to produce those soul-destroying forms with less effort than some crappy drag-and-drop IDE. That's why HTML has replaced Visual Basic in modern use.
As for project-based search, I keep each project in a directory of its own, and then I can use grep -r or ack to search the project.
As for integrated source control, I probably spend about five minutes per hour interacting with my source control system, except when it completely breaks (in which case IDE integration is generally pretty useless).
If your integrated build system is saving you time, then you have way too much complexity in your build scripts, which are a part of your code that doesn't add any functionality to your product.
In short, the lovely features you're describing are real productivity boosts — but each only exists to compensate for an even bigger productivity killer, and then they only rescue you partway.
⁂
Do I believe that? Well, halfway. I use M-/ in Emacs (code completion) pretty often. I use integrated documentation because there are some APIs where I never can quite remember the arguments, including sometimes things I've used for decades. I use meta-point in Emacs under some circumstances; it's pretty much a necessity for some codebases. (And I use it often when I'm exploring unfamiliar code.) I always turn on syntax highlighting in any editor, because whether or not it improves my productivity, it's pleasant. I don't use folding, but it's not uncommon that I wish for a hotkey to fold all the comments in a piece of code. I have F5 bound to 'recompile in Emacs for the times when I'm playing around rather than implementing something well-defined. I use GDB's Emacs integration. Also, I use the fuck out of Firebug.
Basically I think it's stupid to do stuff manually that could be automated. Compilation is a good example: by using a compiler, I automate endless hours of fiddling with register assignments. But I think that eliminating work is even better than automating it.
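To put the "eliminate rather than automate" point in code, here's a minimal, hypothetical Python sketch; the Point classes are invented for illustration, not taken from anything above. The first class is the kind of boilerplate an IDE's completion happily types for you; the second gets the same behaviour by removing the boilerplate altogether.

```python
from dataclasses import dataclass

# The kind of class an IDE's code completion happily helps you type out:
class PointVerbose:
    def __init__(self, x, y):
        self.x = x
        self.y = y

    def __repr__(self):
        return f"PointVerbose(x={self.x!r}, y={self.y!r})"

    def __eq__(self, other):
        return isinstance(other, PointVerbose) and (self.x, self.y) == (other.x, other.y)

# Eliminating the work instead of automating it: the same behaviour,
# with the boilerplate generated by the language rather than the editor.
@dataclass
class Point:
    x: float
    y: float

assert Point(1, 2) == Point(1, 2)
assert repr(Point(1, 2)) == "Point(x=1, y=2)"
```

Same reading-time argument as above: the second version has nothing to skim past when you come back to it later.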
I've gone to report GNOME or Ubuntu bugs, only to find that the problem already has a bug report from 5 years ago. The bugs are still open and unfixed, with a poor sap adding a "me too" comment every few months.
As a programmer I find this offensive; we work all day trying to make software good (enough).
As I see it the problem is with users: they think they should get programs, like firefox, for free.
Most programs you use on a daily basis are now larger than anything a single person could build in a lifetime, and they should be free?
No business constraints and a lack of money are what is keeping software down.
Nobody is paying anyone to 'reinvent the wheel' to be better.
You can pay someone to implement this, as well as any other missing AD features, then open source it so no-one has to implement it again.
The original devs that did much of the reverse-engineering work on Samba have since moved on to much more rewarding work. That work will essentially never be done again, nor be improved upon until financial incentives are introduced. Much of the work on Samba since has been bug fixes and pushing your food around on your plate.
We are not your servants, we are people. Give value, get value. Open source is not your custom-fit panacea.
>"So build your own version."..."Why are you reinventing the wheel? That's already been done."
Probably two different groups of people; the latter group is likely harried contributors to a project you're asking for help from, who don't have the benefit of the context for the work you're doing. It's a common thing on mailing lists and IRC.
>So Western Digital's backup software interferes with Outlook in funny ways?
We're dipping into two different pools of shitware for examples of bad software now.
>So everybody's decided to abandon sane software versioning altogether and make the support end of things even more nuts?
It's gotten more standard in some respects, thanks to SemVer. I can't speak for companies that have decided to treat it like a high score, much like the Linux distributions of the late 90s and early 00s.
>Software used to be fun.
Still fun for me, after ~15 years of coding, 5 of them professionally. Sounds like you're just grumpy and unwilling to invest in swatting any of the gnats flying in your face.
That, or you should take up Buddhism, seriously. I'm an atheist but there's some benefit to learning when you should and should not care about things.
Thank you (seriously) for demonstrating the part where I said, "but they'd rather tell you why you're wrong rather than really listen to why you're frustrated."
> ...as well as any other missing AD features...
This isn't exactly a feature; it's a core part of AD permissions. Samba 4 was developed for the purpose of taking on server roles in an AD environment.
> We are not your servants, we are people. Give value, get value. Open source is not your custom-fit panacea.
OK, I hear that a lot. And it's a fair criticism. Now, here's the other half of it: support people are not your janitors. Quit expecting us to spend hours digging through arcane documentation, followed by further hours troubleshooting things that you left half-finished, and then turn around and tell us to write it our own damn selves. Because, seriously, there just aren't enough hours in the day. I'd love to contribute more to open source, but first I have to get enough revenue in my business to support that, and before I can do that, I have to figure out how to fix my clients' technical issues without raping their pocketbooks. The harder my job gets, the less likely I am to contribute.
For example, I might be writing a Joomla migration tool right now to fix the stupid 1.5->1.7 issues, and I'd be happy to release it and even support it for as long as people need it, but first I have to figure out what the hell is wrong with the wireless drivers in Linux on the new laptop...
> We're dipping into two different pools of shitware for examples of bad software now.
Yeah, that was the point: examples of bad software cross all disciplines, all companies, all environments. If it was just one company that consistently produced crap software, it would be easy to say that there's probably something broken at that company. But when there are so many companies, and so many freelancers, and so many open source developers producing crap software -- there's probably some issue with software development itself.
> That or you should take up buddhism, seriously.
Eh, I appreciate that, really, but I don't want to stop caring. I want it to be better.
> support people are not your janitors. Quit expecting us to spend hours digging through arcane documentation, followed by further hours troubleshooting things that you left half-finished, and then turn around and tell us to write it our own damn selves. Because, seriously, there just aren't enough hours in the day.
Unless I'm missing something, if software were perfect then there wouldn't be a need for support people?
Yeah, and I think that would be OK. I'm not certain about it, but I think that a lot of the time and energy that goes into support could go into development instead, and most people would be happier. For example, clients that are paying me to fix things might pay me instead to make new things for them that would better fit their needs.
I'm not under any illusion that it would work that way for everyone. But I can't think of a single support-level person (whether consultant, phone, on-site, or otherwise) who's been doing it for more than a few years and who's really happy with it. It can be exceptionally frustrating to be the go-between for users and flaky systems. So, at the very least, it would eliminate a necessary evil that makes a lot of people unhappy.
It might also result in fewer jobs to go around. But I don't think so.
>It might also result in fewer jobs to go around. But I don't think so.
It would result in more wealth and efficiency, not more jobs. Capitalism says nothing about jobs, only wealth.
We don't really (in western society) have a mechanism for transferring wealth beyond trade, labor, and government fiat. In the absence of busywork created by government fiat, we're going to have a hard time, socially speaking, midwifing an increasingly efficient world where redundant jobs get replaced with technology and processes.
This will continue the trend of increasing wealth stratification, as whoever has control over the means of production will be subject to the will of others less and less, and will be able to keep more of their profits.
Incidentally, this means it'll be fantastic to be a programmer, and terrible to be a laborer.
If a novel solution isn't found, the best many could hope for is a service job or medieval-style patronage of arts as production and maintenance of product pipelines requires fewer humans.
You're missing that software which works perfectly still won't magically do what you want. Just because it works on a terminal server doesn't mean it will select and install itself on your cluster when you want it.
Just because it generates reports promptly without crashing every time doesn't mean it will know which reports to generate or which data to pull from, or how to include your new office statistics.
Just because your phone synchronises email when abroad without messing up doesn't mean it knows which email addresses to forward to which people.
Support people would then be primarily involved in using technology to help businesses do things, instead of primarily using technology to work around other technology's problems.
You brought up some great points there but couldn't those be generally classified as training? When I think of support I think of people who help the customer to analyze problems and fix them. If support people are spending time holding the customer's hand then I'd say it isn't fair to be tasked with support and training.
> support people are not your janitors. Quit expecting us to spend hours digging through arcane documentation, followed by further hours troubleshooting things that you left half-finished
I am amazed once again at some people's ability to take free stuff and complain that it isn't making them money fast enough.
Just take a few seconds to think about the value we all get out of the deep and broad Free Software stacks. Finding bugs and fixing docs is a part of that process. You're not entitled to any of it, but you're welcome to participate and partake of the fruits.
The problem is that writing the code yourself is nice, while tracing/solving bugs in the code of others and fixing docs sucks. Too many free software authors do the nice work and rely on others to do the dirty work. They write code, release it and claim victory, while their code is only barely usable. The problem is that something barely usable is better than something nonexistent, so it gets used and we are stuck with a suboptimal solution. In this respect, free software development constantly gets stuck in a local maximum.
You have to weigh the value of immediacy against permanently enshrined qualities and perform your own cost-benefit analysis.
This is called critical thinking; they introduce it around age 12 in most western societies. You don't have to pick a religion and stick with it; you can just make a judgment call on a case-by-case basis.
> Quit expecting us to spend hours digging through arcane documentation, followed by further hours troubleshooting things that you left half-finished
I am amazed once again at some people's ability to take free stuff and complain that it isn't making them money fast enough.
That complaint applies equally to commercial software.
CEO arrives in a foreign airport and calls us up because his Blackberry now shows email headers instead of email, including email in his inbox which previously had all the content. The carrier tells me everything is set up fine for roaming and he needs to reset his Blackberry by taking the battery and SIM out, then booting it with no SIM, then putting it back together properly. And if that doesn't work, he needs a new phone. And we couldn't talk him through it because that's his phone, and we couldn't email instructions, and anyway it was evening in an airport and he was in no mood for it.
This is a commercial device, praised for its business email handling, dealing with a well established, decades old protocol, from a large international carrier. Talk about a problem which just shouldn't happen, with a nonsensical solution.
And it's just one anecdotal example of every day workaround-finding. Wobbly software Jenga towers everywhere, and reboot-into-a-known-state is rule number 1.
>Eh, I appreciate that, really, but I don't want to stop caring. I want it to be better.
Corpses write no code. *shrugs*
> If it was just one company that consistently produced crap software, it would be easy to say that there's probably something broken at that company.
Anything below the top, say, 2-5% of software is guaranteed to be shit because all the programmers below the top 2-5% are shit. There is no grand movement or methodology to be had here.
It's a people problem.
>For example, I might be writing a Joomla migration tool right now to fix the stupid 1.5->1.7 issues, and I'd be happy to release it and even support it for as long as people need it, but first I have to figure out what the hell is wrong with the wireless drivers in Linux on the new laptop...
This is why I use Linux for workstations where it seems more comfortable, and my Mac for my mobile machine as it's more tolerant of network/display disruption. I'm not trying to troll or be a Mac fanboy here, I prefer working in Linux as I am simply more productive and it's my production OS. But, a laptop that is on the move plays to the strengths of OS X sufficiently that I am travelling right now and using my Mac instead of my Linux laptop.
You're going to have to accept that if you use the wrong software for the problem, you're going to keep getting poked in the eye. You're using bad hammers and complaining about how bad hammers are.
There's a distinction to be made, an important one.
>Now, here's the other half of it: support people are not your janitors.
Open source projects don't have "support people", they have contributors and devs that volunteer their time. I worked for a MOTU and volunteered in the #ubuntu channel on FreeNode for years. I've done thousands of man-hours of support. I know exactly how bad software is, and how bad the situation is.
I'm on your side here, but until you start solving the problems one at a time, nothing changes.
My apartment gets cleaned one trash bag at a time.
>The harder my job gets, the less likely I am to contribute.
Something's gotta give. Stuff like Joomla registers as shitware in the circles I go in.
Complaining about things like Joomla, PHP, Drupal, Outlook, etc. doesn't really register with me.
Might as well buy a Kia and complain about how terrible the state of automobiles is. The author had much stronger points than you have. You work with some terrible stuff, period.
>Anything below the top, say, 2-5% of software is guaranteed to be shit because all the programmers below the top 2-5% are shit. There is no grand movement or methodology to be had here.
>It's a people problem.
Disagree. I don't think 95-98% of programmers are idiots. However, a lot of programmers (even some very smart ones) are terrible architects and have no sense of the big picture. Moreover, a lot of projects start out well but turn to shit through neglect and departure from the original architecture.
95-98% of software architecture is deplorable, but that's not because our industry has 20+ idiots for every decently smart person. It's a lot more subtle than that. A big part of the problem is that most professional programming is so disconnected and often alienating that a lot of programmers never learn architecture and its importance; the only way to really learn it is to support your own creation and experience first-hand the consequences of your original decisions. A lot of programmers never have that experience.
In sum, I think the general shittiness of the software industry and of its architecture has a lot more to do with the fact that few programmers ever have the experiences that will make them any good than it does with a lack of talent. It takes 10,000 hours of deliberate practice to become good at something, but most of what most programmers do for work is not "deliberate practice"; it's repetitive drudge work that they often don't have the creative control to automate.
If you have 10,000 hours of practice in a particular software package or platform, congratulations, you are an expert at something that was obsolete five years ago. This is something that makes software development both a frustrating field and an interesting one. I always feel like I'm learning, but never feel I've really mastered anything.
I'm glad you brought up the point about software architecture because it's something I constantly find myself explaining: the difference between good architecture that survives many, many generations and forms of code re-use, vs. code-monkey output that is fit only for that individual programmer's or his client's one-time use. This is an especially important point to bring up when you encounter business people who wonder why we can't just outsource everything. Cheaply-outsourced code is too often the drudge code-monkey output we would find deplorable.
>Disagree. I don't think 95-98% of programmers are idiots.
I didn't say that.
> It takes 10,000 hours of deliberate practice to become good at something,
People need to stop rehashing this. 10,000 hours of deliberate practice is likely, but not necessarily, going to make you an excellent programmer. I know plenty of programmers in their 40s and older who have that much time in or more and who are, frankly, garbage.
Investing time is necessary, but insufficient, and there is no one grand unified number that defines all professions for what is necessary to become excellent for every individual.
I know a 60-something whose code output terrifies me and he has a great deal more than 10,000 hours invested in programming.
OTOH, one of the programmers I most deeply respect is a 50-year old woman.
It's hit or miss. In my experience, passion counts more than anything.
>but that's not because our industry has 20+ idiots for every decently smart person.
Work at a major insurance/something-not-directly-related-to-software company. That's an optimistic ratio.
Wait a minute, are you talking about people who spent 10,000 hours really trying to get better at their craft, and not just doing their work? It's like natural languages, where you can be immersed 20 years in a country and talk no better than you did 2 months after your arrival.
You're No-True-Scotsman'ing a subject/concept that was dead on arrival.
Stop pretending the 10,000 hours thing is a "real thing" or somehow fact.
It's not. Fucking stop it. It's the fantasy of a bad writer who makes up shit based on pure anecdote.
And to use your own bullshit against you: he never said it was deliberate practice in the book. He explicitly used the example of the Beatles, whose "10,000 hours" was them jamming in public and fiddling around privately, not hammering at chord progressions.
The 10,000 hours meme is bullshit. Stop propagating it.
The 10,000 hours meme (if it is that; it seems more of an observation to me) is not to be taken literally.
If you take it literally, then yes, it is nonsense: it is not that you are going to be bad at something until the 9,999th hour and then suddenly, boom, magic.
What it means - and what I've found to be very true - is that to get better at something you need to put in time and you need to practice your trade.
Nobody is born a 'great programmer'; sure, there are some differences in talent, but I've seen guys go from bad to mediocre to good to excellent just by plying their trade and learning their lessons. Some of the kids I taught a decade ago that were struggling with basic concepts now run circles around me. That's proof enough to me that there is some truth in the 10,000 hour rule.
You can disagree with the writer all you want but in practice he does seem to have a point.
And for all your aggressive use of language you so far do not seem to have one. If you want to show that something is not true you have to provide counterexamples, not simply jump up and down using foul language telling people to stop.
>People need to stop rehashing this. 10,000 hours of deliberate practice is likely, but not necessarily, going to make you an excellent programmer. I know plenty of programmers in their 40s and older who have that much time in or more and who are, frankly, garbage.
Sure. And I could spend 10,000 hours playing basketball and I'd never become good at it. I don't have the genes. The point about "10,000 hours" is not that anyone can become good. It's that this is the amount of time that it takes for a person with sufficient talent (which is uncommon but not outstandingly rare, as it might seem) to become great at something.
Also, 10,000 hours of inadequate or badly-structured practice is useless. Otherwise, five years of work would be enough, and for most people, it's not. Most of the things that software developers do for money don't make them better programmers and therefore don't count.
>It's hit or miss. In my experience, passion counts more than anything.
I agree. Passion, creativity, and courage are all important. It takes all three to figure out how to divert 10,000 hours away from what you're "supposed to do" and toward what will actually teach you something.
The point about 10,000 hours is that anyone can become good.
The point of Outliers, the Gladwell book, is to dismiss the idea that talent exists at all. "A person with sufficient talent", as you say, is a person who started onto decent amounts of practise at a young age, such that by the time the world notices them at age 8, 10 or 14, they're already surprisingly good and can make good use of professional coaching.
He used The Beatles travelling to Germany and having to perform for several hours every night for months as the turning point between them being a band and them being a good band.
He didn't say they were "jamming and fucking around", or anything of the sort. He also didn't say "chord progressions is the only practise that counts", to reply to your other comment.
He also didn't say "10,000 hours is a fact, 9,999 hours won't do"; it's an anecdote-fitting rule of thumb used to tell a story.
But if you can find plenty of people who are world class at what they do, and haven't done anything close to 10,000 hours of doing it, have at it.
>> Still fun for me, after ~15 years of coding, 5 of them professionally.
Lots of us did coding as a hobby for 10 years before getting out into the professional world. With 5 years under your belt, you're still a junior programmer and the world's your oyster. Let us know how much fun it is 15-20 years or more from now. (And when you do have to care about things more important than programming: family, health, aging parents, job security, etc., etc.)
Hey everyone thanks for your comments, I have learned a lot today.
For all I complain about the state of technology, it's remarkable that I am using it to communicate with some of the most amazing people in the world. You just can't make this stuff up.
I think that maybe it's time to stop taking the pain and do something about it. The only part I've never been able to figure out is how to earn enough income and maintain the independence required to work on the really big problems. Heck, maybe that IS the problem.
This morning I was going to get up and fix another computer to sell on eBay because I am living month to month since quitting my job in November to make the leap into living the dream. I've called in almost every favor and in all honesty am within a few weeks of applying for jobs. Maybe I should have blogged about the process earlier!
I feel like if I can't hack it, then there must be others out there just like me. I just didn't expect this post to resonate quite like it did. I wonder how many of us feel like that friend we all know whose band hasn't quite made it yet so they couch surf. The waste of talent is staggering, and all around us.
If you are interested in this issue, I set up a google doc here:
The first step to solving a problem is to understand exactly what the problem is and know the problem domain. I read your rant, and I'm afraid I did not come away understanding what the problem is.
What do you think the problem (or problems) is exactly?
Well you poked a stick in the hornet's nest for sure. Now we're all watching you :)
I'll give you my current high-level pet peeve for free:
While programming, it is so incredibly easy to miss something, to not be entirely thorough.
I think a lot of problems originate from the fact that, even for an experienced programmer, it is hard to get a feel for what a function actually does, including all the corner cases. Especially if someone else wrote it, or you wrote it a long time ago. Yes, you can read the program text, and encounter a lot of logical expressions, evaluations, etc. Fitting it all into your head can be difficult and takes quite some time. Code as text format is quite limited and slow to parse deeply in your brain.
I'm sure that it is possible for a tool (some kind of uber-IDE) to help with this, for the compiler to 'explore' what happens inside the function (as that's what a compiler does). But what it can't do is show it to you in a useful way.
Some vague ideas around this:
- test cases: a lot of work in writing test cases is simply mechanical. You want to evaluate all the code paths, verify assertions and function input/output. There are initiatives like (Microsoft) Pex that attempt this. If you have auto-generated unit test cases that explore the corner cases, you get more of a feel for what a program does and what might still be wrong.
- code verification: computers are incredibly fast, and can understand the code that we type instantly. With post/preconditions and assertions, it could be possible to find bugs and overlooked cases in the background while we type, by random/structured "brute force" fuzzing (let the monkey loose) or by modeling analytically (see the sketch after this list). Imagine the hours of debugging saved.
- graphical languages: pictures sometimes say more than words. Would also remove the "mental overhead" of worrying about specific superficial program syntax. Alternative representations of source code. Some kind of (intuitive) graphical visualization of software would be very useful. Both for building programs and looking at programs from a new perspective. This could also include intermediates between fully graphical and fully text, or simply looking at the program in another syntax.
- literate programming: why did this never go anywhere in the mainstream? The perspective/train of thought of the developer would be another input to the person maintaining the code. Code comments are part of the solution, but they're completely separate from the program. To be more useful there needs to be a deeper way to associate/interleave the train of thought, and even discussions between developers, with the code.
Anyway, there are probably lots of issues with these ideas, but I do feel that better tools of the trade could help. Things don't have to be completely automated, but the computer could certainly help more in filling in the blanks. It seems that with all the jobs 'we' automate, there is little focus on automating part of the job of the software developer (on the other hand, the state of the art does advance; high-level languages like Python and PHP have already helped a lot in making us more productive. Oh man, imagine if we had to write everything in C...)
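To make the "let the monkey loose" idea a bit more concrete, here's a minimal, hypothetical Python sketch: a throwaway clamp() function plus a loop of random inputs checked against explicit postconditions. Real tools like Pex (and property-based testing libraries) choose inputs far more cleverly; this is only meant to illustrate the shape of the idea.

```python
import random

def clamp(value, lo, hi):
    """Hypothetical function under test: clip value into the range [lo, hi]."""
    if value < lo:
        return lo
    if value > hi:
        return hi
    return value

def fuzz_clamp(iterations=10_000, seed=0):
    """Throw random inputs at clamp() and check the postconditions a
    human tester might forget to write explicit cases for."""
    rng = random.Random(seed)
    for _ in range(iterations):
        a, b = rng.uniform(-1e6, 1e6), rng.uniform(-1e6, 1e6)
        lo, hi = min(a, b), max(a, b)
        value = rng.uniform(-2e6, 2e6)
        result = clamp(value, lo, hi)
        # Postconditions: result stays in range, and is untouched if already in range.
        assert lo <= result <= hi, (value, lo, hi, result)
        if lo <= value <= hi:
            assert result == value, (value, lo, hi, result)

if __name__ == "__main__":
    fuzz_clamp()
    print("no postcondition violations found")
```

Even something this crude tends to surface the corner cases a hand-written test suite skips, which is the point of the bullets above.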
Computer science has utterly failed to tackle the real world problems, things like automating jobs so people don’t have to work, or working hand in hand with humans to explore solutions we have trouble seeing ourselves.
I disagree. Automation is happening every day. Your error is assuming that will lead to "people not having to work." It won't, and the reason for that is social and political, and has nothing to do with CS.
Yes, the state of the art is atrocious. It's hack piled upon hack piled upon hack. Mind you, things weren't massively better in the 50s, 60s, or (maybe?) the 70s.
I'd like to believe that a deep rethinking of computer systems in languages that aren't C-based and incorporate the academic OS research done in the last 30 years would produce some fantastic innovation. But that requires something like a Bell Labs willing to allow years of hacking for potentially 0 return.
Wouldn't any accomplished professional look at their own industry and feel this way? I can't imagine any medical doctor who would look at state of the art in health care and say, "Our industry is perfect!"
I think sometimes as developers, we need to cut ourselves a little slack.
Sure we need to continue to move the state of the art forward. But sometimes we do get stuff done even in spite of our industry's imperfections. And some people find software pretty useful even with two different versions of PNG image loading code.
Last year I purchased a new tractor for my farm to replace an aging machine. I ended up with one from the same company, only thirteen years newer. The new tractor is definitely much more comfortable to operate and has a nicer appearance to onlookers, but when you start to look at the details, the thing is not all that much different than the one it replaced.
For example, the old tractor had a couple of problems that would crop up when it was cold. Nothing serious, but definitely annoying. The new tractor exhibits the exact same problems in an identical fashion. Thirteen solid years of engineering and the only thing that really changed was some improvements to the user experience.
And that sounds a lot like the software industry – a new interface on top of the same old technology. With that, I imagine you are right that virtually all industries have the same problem.
It's important to step back and take a broad-based perspective. Are you complaining about something that saps you 5% (which can be annoying) or 90%? People complain about both, and it's not easy to tell which is the case just from the volume of complaint.
Across the software industry, we lose an incredible amount of time to the maintenance of bad code. The average professional programmer writes about 250 lines of new code per month. Most large software companies have zombie legacy systems that have ceased to grow but have one or more full-time developers only on maintenance.
So, yes, there is a problem. The way we are doing things, as an industry, is terrible. On the upside, this means that there's a lot of profit potential in improving engineering practices.
Instead of, say, adding a new feature to a large project, let's lay a new subway line in New York. Better make sure that it works with the current signaling system, and can accommodate every train that's on the lines now. Oh, and before you start digging, you better check that you're not cutting across existing power, water, steam and gas lines. Not to mention other tunnels, building foundations, mole people colonies, etc.
And the code I wrote for IBM mainframes using CICS and VSAM _still_ _runs_ today, as does the code for Unisys mainframes written in TIP, COBOL and DMS. Maybe I'll go back to using that. Hell, it runs more reliably than anything that tried to replace it! 8-))
But I really miss toggling instructions and data directly into memory on the PDP-11's control panel. Yeah, the state of the art is terrible indeed.
Good article. The app store stuff is spot on (coming from someone who's either directly or indirectly worked on over a dozen iOS apps.)
I tried talking a client through deploying his stuff to the app store. It took about a half hour to figure out which weird setting (overridden by his target settings) he had set differently.
Does stuff really need to be this difficult?
It's almost absurd.
Though I speak from a probably utterly uninformed + unqualified standpoint, this raises a few points for me:-
1. If x86 hardware is so terrible (and I have heard that the architecture really is bad many times), how come we don't have competing chips out there which are many, many times more efficient? I know ARM outperforms on the low-power front, but not in terms of perf to my knowledge. Do such chips exist? And if not, why not if this is true? Even with x86 backwards compatibility concerns, you could bring out a theoretical amazingly powerful chip and just port compilers to it to leverage it and gain some (possibly niche) market hold that way.
2. I think he is underestimating the vast complexity of computing, and deeply underplaying the techniques which have evolved to get us where we are. Yes, I have again heard that the Von Neumann architecture has shortcomings, but look at what computing has enabled us to do - it has changed the world + enabled amazing things which are evolving at breakneck speed. Again - if there really is an ideal alternative approach, why isn't it being pursued? I don't buy the conspiracy theory crap about vested interests holding it back. Again, you could do well in a niche if the alternatives really are that much better.
3. I think it is likely that as things evolve previous ideas/approaches will be overturned and old approaches thrown away, like any pursuit. As @deweller says, this is true of any field. Ask any professional and they will tell you how their industry is fucked in 100 different ways. It is frustrating to hear software engineers talk about how terrible immature + hackerish + crap it all is while assuming that things have some sort of crystalline beauty in other industries. I did Civil Engineering at (a good) university, and it turns out that most consultancies use linear models to model everything, then apply an arbitrary safety factor, resulting in hopelessly inefficient over-engineered structures all over the place + turning the job into something of an uncreative administrative role in some respects (with no disrespect intended for all the civil engineers out there).
4. It is my theory that any attempt to interface with the real world and actually do useful real things results in an enormous amount of uncertainty and chaos in which you still have to make decisions however imperfect and however negatively others will judge it going forward. I think that's true because the world is chaotic and complicated and imperfect, and to influence it means to be touched by that too. That doesn't excuse poor engineering or not caring or stupid decisions, etc. but it is a factor everywhere I think.
5. So change it :-) There are efforts afoot everywhere to struggle to improve things, and from the sounds of it the guy has perhaps not tried enough things. I feel Go is a great step in the right direction for a systems language which removes a lot of the crap you don't want to think about, as does C# (though it does potentially tie you to MS crap).
6. Programming is difficult, solving real problems is difficult, abstractions are nice but leak, and sometimes it's quite painful to have to put up with all the crap other people hacked together to make things work. But the value is in the end product - I don't care that my beautiful macbook air runs on a faulted architecture and some perhaps imperfect software, the beautiful ease of use is what matters to me. We mustn't lose sight of these things, though again this is not an excuse for crap code. There is definitely a balance to be struck.
7. Zack Morris is clearly a highly accomplished + competent hacker given what he's done and the obvious passion he writes with (to care enough to be disappointed + upset by these things is indicative), which I think has a bearing - the deeper your knowledge, the better your skill, the more faults you notice by dint of your intelligence + competence. There is a definite curse at play here.
I resonate with Zack's rant; we have apparently chewed on many of the same bugs.
A quick answer to @singular's questions:
1) Existing code - this trumps writing everything from scratch.
2) I don't agree. I believe the computation is straightforward; my belief is that what you perceive as 'progress' is mostly just 'go really really fast.' At the Vintage Computer Festival I showed a Microsoft engineer how I could install an RDBMS on VMS while four people were playing games and exploring VMS on four terminals connected to the machine, then fired up and ran the test code to show the software had installed and was usable. It did not impress him that I didn't reboot the system once, nor did the other four people using the system notice that I had installed a new capability available to everyone using the OS. Those are not design goals of a 'personal' OS like Windows/DOS/NT, although they could be. The stuff you learn in CS classes about architecture and layers and models and invariants can make for very robust systems.
3. My experience is that programmers program. Which is to say that they feel more productive when they produce 10,000 lines of code than when they delete 500 lines of code and re-organize another 500 lines. Unlike more 'physical goods' types of industries it is easier to push that stuff out into production. So more of it ends up in production.
4. Not sure where this was going.
5. This is something I like to believe in too: it's just code, so write new code. Hey, Linus did it, right? The challenge is that it will take 4-5 years for one person to get to the point where they can do 'useful' stuff. That is a lonely time. I wrote the moral equivalent of ChromeOS (although using eCos as the underlying task scheduler and some of the original Java VM as the UI implementation). Fun to write, but not something picked up by anyone. You get tired.
6. I'd take a look at eCos here, one of the cool things about that project was a tool (ecosconfig) which helped keep leaks from developing.
In the 'hard' world (say Civil Engineering) there are liability laws that provide a corrective force. In software it is so easy to just get something put together that kinda works that, unless you are more interested in the writing than the result, you may find that you're spending less time on structure and more on results.
1 - I agree that's a huge, massively important thing, but there are non-x86 processors in the world which find their niche (in ARM's case it's quite a huge niche), so surely if it is possible to develop a processor which is so much better than x86 then why don't they already exist? I am hardly very well informed on the processor market, so for all I know they do, though I'd be surprised.
2. Sure, I guess what I'm getting at is that we've done amazing things with what we've got, I'm by no means suggesting we shouldn't take a broader view and replace crap, or at least work towards it where market entrenchment makes things difficult. The point is, again, that if there exists such a plausible alternative to the Von Neumann architecture, then why aren't there machines out there taking advantage? Again you could probably fill a niche this way. I suppose, in answer to my own question here, that you would be fighting a losing battle against the rest of the hardware out there being reliant on V-N but still, I'd have assumed that something would exist :)
3. Yeah. But it's hard + often the harder path to do things right in any industry. Such is life, not that that excuses anything.
4. A sort of philosophical point. Feel free to ignore :-).
5. There is stuff out there that already exists too though. Go, OCaml, Haskell, F# are all really interesting languages which in their own ways tackle a lot of the accidental complexity problems out there. Plan 9 + inferno are very interesting OSes, though they are probably a little too niche to be all that useful in the real world. But yeah, understandable, fighting the tide is difficult.
6. Cool will take a look.
Yeah - one of the things that attracts me to software engineering is the relative freedom you get to be fully creative in solving a problem. However that cuts both ways it seems.
Yes, and it was hilarious. It stunned me that Intel couldn't figure out that AMD was going to eat their lunch when AMD figured out a way to extend the x86 architecture to 64 bits while retaining software portability. For years Intel had beaten challenger after challenger based on the juggernaut of their software base (680x0, MIPS, NS32032, SPARC, PowerPC, etc.) and yet their brash attempt to push Itanium by not extending x86 was counter to all those previous victories. Kind of like a general taking a battle plan known to work and ignoring it.
As we move into an 'I don't see why I should get a new machine' era of growth minimization, there is a window for folks like ARM to get in with 'all day computing.' But it will take someone extraordinary to make that happen. Look at the state of Linux on ARM to understand the power of a legacy architecture.
The x86 thing is just grousing by people who think aesthetics in the assembler are the definition of a clean architecture. Instruction decode for a modern x86 CPU is indeed a difficult problem when compared with cleaner architectures (though ARMv7 is hardly "clean" -- how many instruction sets does it support now? Five, I think?). Instruction decode, however, is one of the easiest problems to tackle on a CPU. It just happens to be the part that software people understand, so it's what they like to whine about.
You're absolutely right: if it could be done much better, there'd be an example in the market to prove it. Yet Intel is walking all over the market, and has been for the last 8 years or so.
Statements like this -- saying that x86 is either good enough or as good as anything could possibly be anyway -- sound to me like a lack of imagination, lack of perspective, or not wanting to stir up any cognitive dissonance given that market forces have caused x86 to dominate.
Would you also say that there probably couldn't exist a significantly better OS than Mac, Windows, or Linux, or else we'd know about it? I admit "better" can be hard to define; what would make a 10x better car, or IP protocol, for example? It strains the imagination, because what makes these things "good" is complicated. But ask anyone who was around during the early proliferation of computer architectures and operating systems, and they will tell you that what ended up winning out is markedly inferior to other things that could have been. Paths get chosen, decisions get made, billion-dollar fabs get built. The market doesn't pick the best technology.
It's like web standards -- accept them, but only defend them to a point. We may be stuck with CSS, but that doesn't mean it's any good or that it succeeded on its merits.
1. Well, GPUs crush x86 CPUs in some areas, so there is at least one competing technology that is a clear win. Also, Intel added both a GPU and a video decoder to their CPUs, but neither of them uses anything close to x86.
As to the rest of it, I think you can look at microwaves for a perfect example of terrible software in widespread use. You need to be able to select a cook time and possibly a power level, or set the clock. Yet most microwaves have such a terrible interface that few guests are embarrassed to ask how to get an unfamiliar one to work. And as long as it takes more effort to send it back than it takes to figure out the strange design, there is little reason to build a better system.
You are right that it's hard to compete with x86, but it's for a weird reason (beyond the economic might of behemoths like Intel). x86 has good code density, so it can do more in a few bytes than sparser instruction sets like RISC. In the late 90s, when computers started being memory-bandwidth limited, PowerPC lost out even though it was perhaps a more "modern" architecture. I've often wondered if someone would generalize code compression (I could swear there was something like that for ARM?). Oh, and I suppose I'm more of a ranter than a hacker; too many years of keeping it all inside, so now I kind of regurgitate it in these ramblings...
ARM has a second instruction set built into it called Thumb, which performs a subset of ARM operations in smaller instruction sizes. ARM is also an incredibly complex architecture which -- as someone who can effectively write x86 assembly in his sleep now -- I can barely wrap my mind around.
The core dilemma of computer science is this: conceptually simple systems are built on staggeringly complex abstractions; conceptually complex systems are built on simple abstractions. Which is to say, the more work your system does for you, the harder it was to build.
There are no stacks which are pure from head to toe. I guarantee you, old LISP Machine developers from Symbolics probably had a hard time designing their stuff as well.
There are other downsides to the ancient x86 instruction set than just a complicated decode step (which isn't all that complicated in transistor count). For example, think how much more efficient a compiler could be if it had 256 registers to work with. Or what if we could swap context in a cycle or two instead of the multi cycle ordeal that's needed now to go from user space to kernel space. It would finally make a micro kernel a viable option. Technically easy enough if you can start from scratch, but all software would need to be rewritten.
ARM thumb is a 2-operand (i.e. one of the registers is the destination) instruction set over 8 registers, just like i386. It has similar code density to x86, at the expense of fewer instructions per cycle. It does lack x86's fancy memory addressing modes though.
And I wouldn't say PPC lost. IBM has competitive CPUs in the market, they're just not in consumer devices. But they're just that: "competitive". They aren't much better (actually, pretty much nothing is better than Sandy Bridge right now).
I think this discussion may be going in the wrong direction. Arguing the merits of ARM and Power instructions over x86 just seems to be falling into the trap the article discusses - slightly different ways to keep doing the wrong thing.
To me TFA is about a reassessment of fundamental assumptions, and it's about exploration. It doesn't suggest concrete solutions because nobody knows what they are, but it does suggest that our efforts to better the art have been short-sighted. Right now the next Intel chip or ARM chip is just another target for compilation, just another language or library to fight with instead of solving real problems: old problems, not just the latest new/interesting/imagined ones.
(FWIW, this particular example doesn't excite me too much - If the future is DWIM, it almost certainly has to be done first in software, even if it is eventually supported by specialised hardware.)
1. It is a lie that people like to wank on about. The inefficient x86 portion of a CPU they talk about is less than 1% of the transistors in a modern processor. The first thing an Intel processor does when it receives an x86 opcode is translate it into a more efficient internal opcode that it actually executes. In fact, it could be claimed this has helped to improve efficiency, because front-side bus bandwidth is at a premium and opcodes that convey more information help conserve FSB bandwidth. Basically, if it wasn't a processor designer who told you that, I would take it with a grain of salt, because people (including myself) like to pontificate on topics which are adjacent to their domains of knowledge, but which they know little about.
I am no fan of x86; however, as I understand it, it's really not as bad as it seems. Modern CISC CPUs have turned into RISC-CISC hybrids; they have a CISC frontend/interpreter that drives a RISC backend. As a result, you get most of the best of both worlds.
We still have x86 because we don't like to change things. We like to be able to run windows 95 apps on a modern desktop, just like how we wanted to be able to use our original B&W TVs when colour and widescreen were introduced (although, since analogue signals are dying out, this will soon not be possible; hopefully interlacing will die out soon too). We like backwards compatibility, upgrading our chips while still retaining compatibility.
Also, x86 isn't as bad as it is made out to be either, even if most of that is just how well our compilers can optimise x86 code.
1. If x86 hardware is so terrible (and I have heard that the architecture really is bad many times), how come we don't have competing chips out there which are many, many times more efficient? I know ARM outperforms on the low-power front, but not in terms of perf to my knowledge. Do such chips exist?
What about IBM's Cell (the one used in PlayStation 3)? Or GPU related technologies, like nVidia CUDA?
for a better idea of why the state of the art is actually much less terrible than it appears to idealists.
"I believe the hard part of building software to be the specification, design, and testing of this conceptual construct, not the labor of representing it and testing the fidelity of the representation. We still make syntax errors, to be sure; but they are fuzz compared with the conceptual errors in most systems.
If this is true, building software will always be hard. There is inherently no silver bullet."
I'm a bit unconvinced, since specification, design, and testing could be fundamentally different and vastly improved from the way they are now, not just the area of syntax errors.
Whenever you manage to make a specification "simpler" while there are in fact a lot of implementation details underneath, you're giving away the chance to modify the behavior of the stuff that is implied.
E.g. think about why Google doesn't just buy Oracle servers instead of developing all their software infrastructure -- they wouldn't be able to provide the service they do, and they wouldn't earn anything. Whenever you decide that you "don't care", you didn't really remove complexity; you just decided to accept the way the system behaves by default, with all the limitations that come from that.
Idealists think that a lot of the details can be "abstracted away" just because they'd rather not think about them, but they are still there and they make the difference.
"Because they know what you are just starting to grasp. That it shouldn’t be like this. That computers have vastly underserved their users. Conceptually, mobile and casual interaction is the future of computing. But it has no formal basis."
"The good tools like functional programming and provably correct algorithms are either too esoteric or too expensive for the mainstream"
I was really with him there.
Then when he said:
"That’s a reason why one of my favorite languages is php. It just works. Screw up your variable typing? Who cares. Forget a symbol somewhere? Heck you still get some output and can see your error."
The part about pushing out an iPhone app is soul crushing. But the same can be said about web applications. Back then, a bit of HTML, CSS, and some forms processing and you were in business. Now you need to know about server-side vs client-side processing, AJAX, blocking, HTML5, CSS3, and all the quirks of the browsers. Then, after you actually get some version of a static page up, you are busy debugging your SQL models and why your many-to-many joins are taking forever. Throw in the myriad of libraries you have for both front-end and back-end code and you are in for a nightmare.
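On the "why are my many-to-many joins taking forever" part: in my experience that's very often just a missing index on the junction table. A minimal sqlite3 sketch, with all table and column names invented for illustration:

    # Minimal sketch of the usual many-to-many slowdown: without an index on
    # the junction table, every lookup scans the whole thing. Names invented.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE items     (id INTEGER PRIMARY KEY, title TEXT);
        CREATE TABLE tags      (id INTEGER PRIMARY KEY, name  TEXT);
        CREATE TABLE item_tags (item_id INTEGER, tag_id INTEGER);
    """)

    # This is the line people forget; without it the join degrades to a scan.
    db.execute("CREATE INDEX idx_item_tags_item ON item_tags(item_id)")

    rows = db.execute("""
        SELECT t.name
        FROM tags t
        JOIN item_tags it ON it.tag_id = t.id
        WHERE it.item_id = ?
    """, (42,)).fetchall()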
I don't really get the point of this article. It felt like he was just saying that everyone today is a moron. I find this attitude annoying. Get off your high horse. Yeah, there are bad programmers and some people are oblivious. In any profession, there are people who are going to suck at it. There are also people who are simply brilliant and are amazing programmers.
Then to act like the entire software stack is broken is silly to me. Sure there's inefficiencies. In any large complicated system, you can expect there to be room for improvement. And anytime you introduce layers of abstraction, inefficiency is bound to happen. But guess what? Not everybody has time to write code in assembly. The fact of the matter is, the software ecosystem has become so complex that people are forced to specialize in more and more specific subsets of it.
And all this jazz about "Oh boo hoo.. software hasn't solved world hunger and doesn't wipe my ass for me when I use the bathroom". Geez man. Calm down. These problems you're talking about, like figuring out AI, are incredibly hard. It's going to take a long, long time for them to be tackled fully. Software and computers have only been around for the blink of an eye in human existence.
Are computers inefficient and software complex and unwieldy? Sure, but I suspect the sense of fatigue and disappointment is due mostly to the cause he mentions last: the irrelevance (or even hostility) of most computer applications to a life well lived.
"if your grandparents can’t use it out of the box to do something real... then it fails"
The problem is not the difficulty of software, or its aesthetic decline. The problem is that the most important things for human happiness -- such as autonomy, integrity, feeling of connection to a world larger than us and love for other human beings -- are mostly ignored or eroded by technology rather than improved by it. This is not something new to computer technology, but it does seem to be focused and hastened by it.
When you're a kid, it's easy to be enthralled by the wonder of the machine. I certainly was. As an adult, you don't have that anymore. You need to feel like you're working for something worthwhile. All of that complexity would be worth managing if we understood it as part of a struggle for something of magnitude. The feeling that it's all crap comes, as much as anything, from this lack of ends.
Computing is in a bad shape partly because of momentum, and the momentum exists because of the nature of capitalism.
The momentum means we can never stop to make sure the software is as carefully written as it needs to be, and we can never stop to think of the big picture (all the other software we need to interoperate with).
And because of capitalism's inherent competitive nature, there is no concerted movement towards a unified goal in computing; competition leads to fragmentation, not unification. So we have a proliferation of operating systems, redundant software reinventing wheels, etc.
Open source is interesting in that regard. It's sort of the solution to a lot of the mess -- in theory, by making the source public, nobody should ever need to solve a problem more than once, and a specific piece of problem-solving code should evolve over time into the perfect solution -- but it's screwed by the competitive nature of people. (People don't just compete among themselves with their egos. They also compete against the status quo; what I like to call the "not invented by me" syndrome which drives people to create new stuff even though what they really ought to have done is to improve the old stuff, thus expanding the pile of legacy software even further.)
Why do we have both Python and Ruby? They are incredibly similar to each other. They are so similar it's silly. Sure, one's got whitespace-sensitive syntax, the other has runtime-extensible OO. But those are superficial differences, and nobody can objectively say that Python is better than Ruby, or vice versa. Do we really need both? And yet Matz and Guido are never going to join forces to work together on a common goal to create a single, superior technology.
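To make the "superficial differences" point concrete, here is the same throwaway task in Python, with an approximate Ruby rendering in the comments (the Ruby is from memory, so take it as roughly right). The shape of the program is nearly identical; only surface syntax differs.

    # The same everyday task in Python, with roughly equivalent Ruby in comments.
    words = ["state", "of", "the", "art"]      # words = ["state", "of", "the", "art"]

    longest = max(words, key=len)              # longest = words.max_by { |w| w.length }

    counts = {w: len(w) for w in words}        # counts = words.map { |w| [w, w.length] }.to_h

    for w, n in sorted(counts.items()):        # counts.sort.each do |w, n|
        print(f"{w}: {n}")                     #   puts "#{w}: #{n}"
                                               # end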
Open source ought to work less like capitalism. People need to work together to create the best, safest, most robust software imaginable.
> Computing is in a bad shape partly because of momentum, and the momentum exists because of the nature of capitalism.
This is true of any engineering discipline throughout history (read Henry Petroski for many examples). One example was the technology of building iron railroad bridges in the UK in the 1800s. As the railway network grew, bridges were needed, and the UK did not have enough forests left, so bridges were constructed from cast iron. However, at the time the science of metal fatigue did not exist, so many of these bridges failed, killing people in the process.
Clearly the state of the art of iron bridge building was horrible, yet bridges were built because people needed them.
Eventually the science caught up, and bridges don't fail as much anymore.
Maybe, but it's a shame you need two competing but very similar projects to accomplish this. If people had been competing to contribute the best code to the same project, we could have reduced the duplication.
Programming, at least the fundamentals, should be taught in elementary school just like any other language, but the teachers don't yet have the required familiarity, so we should start with them.
There's much to enjoy, agree with, and debate in this essay. But this is the most important line.
I was doing some consulting work at a public library in 1987, and they had several public access Macs. I set up Hypercard on them and started showing people, staff, patrons, kids, winos, how to use it.
In 1987 _lots_ of people had never sat down before a computer in their lives. I could spend an hour with someone and have them doing useful things with Hypercard. It was extremely approachable. And I don't just mean doing word processing or drawing pictures, though they were doing that too. They were creating software that solved real problems from their real life.
I started out by giving them floppy disks to save their "stacks" on, paid for out of my own pocket. We finally had to start selling them at the circulation desk.
The biggest conceptual problems, then as now, were in teaching people how to deal with files in a file system, on multiple volumes/disks, and in making people think in terms of versions of documents (actually applications in this case) and in backing them up.
Data loss from exposing the disks to magnetic fields was also a major problem, at least we seem to have conquered that problem...
A good, if not great, rant. I think the one point I'd like to make is I think the iPad represents the first opportunity in a long time to really fundamentally re-think computing. Yes, it comes with a lot of legacy garbage, and it's shackled by Apple itself, but it's different enough that it might just be the thing we need to have some fresh thinking.
The tragedy will be if we are so blinded by the past that we do nothing more than project the status quo onto it. If we can get back to our roots and remember what computing is fundamentally about, extending the capability of humanity, I think we can expect to see some amazing things over the next decade.
I still use computers; heck, the whole developed world uses computers now. But that's despite the way software/hardware have developed; it's a side effect of Moore's Law making things faster and cheaper.
Unix is the big thing in 2011? Really? Nooo.... well shit!
Funny that the OP mentions Hypercard. When it came out a number of very smart people said, "Hey, that's kind of neat!"
And it was. It was little better than a toy at first, but I saw street people sit down with it and _an hour later_ they were using it to solve problems, and were delighted. And let me tell you, in 1987 part of that hour was used to explain how to use a mouse, drag selections, double click, etc.
There are some true things in this rant, but I'm pretty sure about one thing: "provably correct algorithms" are never going to deliver us a computer to which "you should literally be able to tell it what you want it to do and it would do its darnedest to do a good job for you."
Natural Intelligence is the product of a billion years of crappy evolution, and thinking that the application of some clear, clean mathematical concepts will be able to recreate it is very naive.
I used to work on Clinical Trials management software. We had an IVR product written in PHP and PGSQL. Project designers would build a project workflow with a half-baked web "flow chart" and custom PGSQL function that called existing PGSQL functions.
I have a nascent dream of a language that could elegantly replace the workflow definitions with pure code that looked like Lisp or JSON. I'd like to work on a program like that.
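Purely as a sketch of the kind of thing I mean (every step name and rule here is invented), the workflow could be plain data plus a tiny interpreter, rather than a flow-chart tool driving stored procedures:

    # Hypothetical sketch: a trial-style workflow as plain data (it could just
    # as well be JSON or an s-expression) walked by a tiny interpreter.
    # All step names and rules are invented for illustration.
    import random

    workflow = [
        {"step": "identify",  "require": ["subject_id"]},
        {"step": "screen",    "check": lambda a: a.get("age", 0) >= 18},
        {"step": "randomize", "arms": ["placebo", "treatment"]},
    ]

    def run(workflow, answers):
        for node in workflow:
            if any(k not in answers for k in node.get("require", [])):
                return f"stopped at {node['step']}: missing data"
            if "check" in node and not node["check"](answers):
                return f"stopped at {node['step']}: screen failed"
            if "arms" in node:
                answers["arm"] = random.choice(node["arms"])
        return answers

    print(run(workflow, {"subject_id": "S-001", "age": 34}))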
Similar professional experience: I go back to having built rudimentary microprocessors with diode arrays and gates; creating my own microcode and ALU; wire-wrapping my own computers using the 8080, 6502 and other early processors; punching in my own mini-OS using a hex keypad; rolling my own Forth-based OS, editor and applications; and working with PALs and, later on, multi-million-gate FPGAs.
I find myself agreeing with the author on many points. The evolutionary progress in computing seems to have stalled a long time ago. The fact that we are still hand-coding loops and such things means that it is hard to move up into a higher level of consciousness, if you will. The paradigm shift we need is one where the programmer is able to think and work in problem space rather than being pushed down into verifying loop counts and semicolons every five minutes.
Back in college I experienced a mind-opening moment when a physics professor insisted that I enroll in a class he was teaching. The class was for a language called APL. I won't go into details here; look it up if interested. That class and that language changed my view of computing, and of how computing could work, forever. I was taking FORTRAN and C classes at the same time. The contrast between the languages was almost beyond description. While we were mired in do loops and other language mechanics in FORTRAN and C, we were actually solving real problems with APL in very short order... even writing a game or two, database applications, and doing some scientific computing. Programmer productivity and the ability to express and solve a problem simply could not be compared. APL felt like it was a century ahead of anything else.
APL lets you focus on the problem space. In a certain way it is like playing the piano: you don't think about frequencies and durations, you think about expressing ideas.
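For anyone who hasn't felt that contrast, here's a rough analogy in Python terms (not actual APL): the loop mindset manages indices and accumulators by hand; the array mindset states the whole-collection operation directly.

    # Loop mindset vs. array mindset, in Python terms (an analogy, not real APL).
    # Task: average of the positive values in a list.
    data = [3.0, -1.5, 4.0, 0.0, 2.5, -2.0]

    # "FORTRAN/C mindset": walk indices, keep running totals by hand.
    total, count = 0.0, 0
    for i in range(len(data)):
        if data[i] > 0:
            total += data[i]
            count += 1
    avg_loop = total / count

    # "APL mindset": describe the whole-collection operation directly.
    positives = [x for x in data if x > 0]
    avg_array = sum(positives) / len(positives)

    assert avg_loop == avg_array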
As people focused on languages like C++ (which were easier to grasp and use with the equipment available in those days), APL never became mainstream and, to some degree, did not evolve into what it could have become. Ironically, the machines we all have on our desks today provide an incredible environment for a language like APL in terms of available resources and speed.
I am not saying that APL is the end-all. What I am saying is that my path through this craft was altered in a non-trivial manner by being exposed to a very different set of ideas. I find myself longing to feel that way about the tools I have to use today, particularly when hitting the pavement with languages such as Objective-C and VHDL, which, despite their many supporters, are far, very, very far, from providing the kind of evolution and progress we so desperately need in computer science.
Software is just massively complex. Some day, someone will come along with the resources to redo large parts of the technology stack "correctly." If they do it right, it may even be an improvement. But make no mistake, the investment required will be massive.
So until we hit a major technical wall, why bother? Why optimize prematurely until we're bumping up against atoms?
Kudos for a beautiful rant; I really liked it and it resonated with me. But after it and all of the comments in here I remembered Jobs's saying that computers are "bicycles for the mind". Which means you need to pedal to get somewhere. As much as programmers would like automation and being lazy, it's still about pedaling.
> In any given 12 hour day of programming, I’d say less than a single hour goes to writing new code now.
For the longest time I've felt guilty of this, until I started seeing my coworkers committing code that I would later have to correct. Our company is such a mess right now.
In light of all these recent rants it would probably be interesting to track what Alan Kay and his team are doing at www.vpri.org. I am looking forward to seeing what they'll come up with.
Sometimes it's easy to forget that software development is a fundamentally human endeavor and will reflect all that is right and wrong with the humans involved.
This guy just sounds like your run-of-the-mill crappy programmer.
"I can't get things to work! It's the fault of The World - it couldn't possibly be me! I've had a game in the App Store!"
As a rule, the worse the programmer, the more convoluted his solutions to problems. Maybe instead of writing huge rants on his blog, he should re-evaluate the way he does things so the world doesn't seem so horrible. Or get a new job.
this article seems like a more educated and experienced manner of asking "mr. babbage, if you put the wrong numbers in, will the right answers come out" ...
"Most computers today, for all of their potential speed, are largely a mistake, based on the provenly unscalable Von Neumann architecture, controlled with one of the most shortsighted languages of all time, x86 assembly. They are almost unfathomably inefficient."
OTOH there are compilers (SBCL) which take that beautiful thing Zach probably likes (Lisp) and run efficiently on the stupid thing he seems to hate (Von Neumann machines)...
this is great news. b/c it means there's still (after how many yrs?) huge opportunity. think different.
code something simple (=terse) that follows common sense and just works and it's an island of consistent speed and reliability in a sea of crap and bloatware.
we need more djb-like coders, who do not mimic 1000's of other coding monkeys.
mcilroy said the hero is he who writes negative code. he's right. the world needs less code, not more.
if you can't handle that, then you're just contributing to keeping the "state of the art" terrible. but then people just keep paying for this crap so that's why it won't disappear. the idiots are rewarded for their "productivity" in producing saleable crapware.
In my opinion, the industry as a whole has gotten obsessed with new hotness over properly finishing what was started. I don't know whether that's always been true.
He has a problem with proprietary software. All of the problems are due to proprietary software, and his examples of Apple-produced closed software are just the tip of the iceberg.