I think it's more of a spiral, where you get to stage 5-7 in one area, find out there's a whole new world you never knew about, then end up back at stage 1 in that new, broader world.
I wrote my first computer program, a text-adventure game in BASIC, when I was 10 (stage 2). Learned Pascal and C as a pre-teen and played around a bunch on a VAX in 8th grade (3). Started working on a MUD in 9th as I moved to a new school (4). Found out it was too hard, gave it up, and took up electric guitar instead (not quite 5).
Then when I graduated from high school, I got my first real programming job (stage 1 & 2). And found out I could write GUI programs in Swing and Java just like the professionals (3). Ended up doing the GUI for the beta version of our product - except the company never shipped it and went out of business six months later (4). I blamed the failure on a lack of engineering process, so I started learning all I could about design patterns and XP and rigorous engineering processes (5).
Then in college, I found out about Lisp and functional programming (1). Many of the problems in that previous startup were because of too many bugs, so it had an immediate appeal to me (5 & 1). I wrote my first little programs in Haskell and Scheme (2), then started learning everything I could about compiler & interpreter implementation and wrote a little tutorial about how to implement Scheme in Haskell (3). Then I tried making my own programming language (4), which went nowhere, except I knew that (5) was not the answer. I'd like to think I've skipped to (6), though given that the programming language is still going nowhere, I'm not so sure about that.
The first stage of being a programmer is having heard of programmers but not being one yourself? In that case, I'm a first stage surgeon, concert pianist, and fighter pilot.
Only works if you can be sure about the behavior and specification of all of your inputs and outputs (not to mention the compiler). Part of getting to stage 6 is realizing that this won't often be the case.
Well, not very difficult if you're using a functional programming language, where pure and impure code are usually kept separate. Haskell code can be "proved" just like any other mathematical theorem.
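A minimal sketch of what "pure and impure are separate" means in practice (the function names here are just illustrative): a pure function's result depends only on its arguments, so you can reason about it equationally, while anything doing I/O is marked by the `IO` type and the compiler keeps the two apart.

```haskell
-- Pure: 'total' depends only on its input, so properties like
--   total (xs ++ ys) == total xs + total ys
-- can be checked by plain equational reasoning over foldr.
total :: [Int] -> Int
total = foldr (+) 0

-- Impure: the IO type marks 'report' as effectful; it cannot be
-- called from pure code, which is what keeps the worlds separate.
report :: [Int] -> IO ()
report xs = putStrLn ("total = " ++ show (total xs))

main :: IO ()
main = report [1, 2, 3]
```

Of course, as the parent comment notes, this only gets you so far: the proof is about `total`, not about whether `total` was the right thing to compute.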
I wasn't thinking primarily of pure vs. impure, but of user vs. computer. You generally cannot formally verify users, and they often want the dumbest, most illogical things. Like asking that null vs. singular vs. plural cases all be separated out, or that you should filter porn out of their results (what's porn to a computer? hell, what's porn to the Supreme Court?), or that you should render things brokenly if there's no doctype but do it correctly if there is.
One of my cubemates at Google is a project manager who used to work at Microsoft. She said that one of the biggest differences between the engineering cultures of the two companies was that engineers at Google are encouraged to make "product" decisions themselves, while Microsoft engineers basically stuck to code. If a spec was unclear on some point at Microsoft - say, what color of blue to use for the border on some dialog box - the engineer would throw it back over the wall to the PMs and wouldn't work on it until they got further clarification. While at Google, the answer was likely to be "Use your judgment, and we'll fix it if it's terribly bad." That implies some amount of knowledge about what users want, and you generally can't prove what users want.
Assuming that the compiler, any libraries you use, the hardware, and the interfaces to that hardware all behave correctly, yes, you can formally verify code.
It's the stage 5 programmers that scare me the most. They're the ones who want to rewrite everything from scratch because the current codebase doesn't contain enough patterns and it wasn't built using "proper" methodologies.
After you realize that managing people and others' assets is not fun anymore, you'll get back to your roots: start hacking again, and start using your connections to build your own business.
I think I've been stuck at '4' for the last 20 years or so :)
But seriously, the cycle is endless, there are always new depths of understanding waiting to be found and every couple of years you sort of go back to square #1, learn a new paradigm, only to find out after much hard work that the new is the same as the old only in different clothing.
I'm not sure where I am in this.
I recently discovered that my solutions had started to become too complete: too much complexity analysis, too much end-user friendliness.
And that a 10-line POS script could do the same, if only the users knew how to code.