Hacker News
MIT is a national treasure (cdixon.posterous.com)
525 points by hoag on March 23, 2011 | 150 comments



Honestly speaking, if he played around on an Apple II, this happened almost 30 years ago when the computer industry was still brand-new. Not to denigrate him, his achievements, or MIT, but the world is different now.

It's an awesome anecdote, and I am a big fan of MIT, but consider this my preemptive counter-argument to the inevitable, "Here, see, more proof of why you should drop out of high school!"

(though, after all's said and done, I do hope MIT is not too different from the MIT that accepted him back then)


In California, you can take the CHSPE

http://chspe.net/

to enroll in community college as young as 16. California has made it really easy to transfer from community colleges to UC schools. Transferring into UC Berkeley's computer science program is very doable, especially if you apply through its College of Letters and Science (which you want to do anyway, since they are a lot more lenient than the College of Engineering in terms of letting you take time off and choosing the classes that interest you).

Advantages of this route:

* You can skip 2 years of high school and graduate with a BA while your high school friends are finishing their second year of college.

* You save money living with your parents and going to community college for the first 2 years.

* Your chances of getting into a prestigious UC actually increase quite substantially.

This is roughly what I did and I've been extremely happy about it. Feel free to contact me for details/individual advice: [my username] at gmail dot com


Which leaves you with a BA instead of a BS, and at a less technical level than the alternative route would take you. This appears no different from CIS through Arts & Sciences versus CSE through Engineering at my alma mater.


I can assure you, after graduation virtually no one cares about BA vs BS. All they see is Berkeley.

The level of your technical skills is entirely up to you.


Honestly though, will this really mean anything a year after graduating? Sincere question. I don't have a CS degree (or any bachelors degree) though I am pursuing one. The drumbeat seems to be that the most important lessons one learns with a BS in CS is the algorithms classes. Would a BA not deliver those classes to its students?


As a recruiting manager, I never really made any distinction between BA/BS/B.Sc/whatever. A degree from a good school with good grades is good enough; it isn't critical what the degree is called exactly. CS isn't physics, where you need many years of study to actually be able to do something useful.


To add a frame of reference, Berkeley also gives a BA in Math.

The difference between the BA in CS and the BS in EECS (there is no BS in CS!) is that the BA in CS is more grounded in mathematics, algorithms, logic, etc., while the BS in EECS is more grounded in hardware/engineering. Nonetheless, both have to take courses in algorithms, computer architecture, machine structures, etc. The rest is up to you. You can do a BA with somewhat of a hardware focus, or a BS with a software focus. You can take whatever classes you'd like on top of that.

Otherwise, there's no difference. As for getting a job, getting a good internship matters far more than what your degree is called or where you went to school. A good school will help you find a good internship. Just for $deity's sake, do something technical over the summer vacations (instead of doing nothing or working at the mall).

For what it's worth, interview questions also tend to be more focused on discrete mathematics and algorithms than they are on systems and hardware, so a BA might actually prepare you better.

Finally, the education is for your own sake, not for the sake of your employer.


At Berkeley, the BA and the BS degrees have the same associated CS classes--the BS just requires you to take physics and electrical engineering.


And why couldn't you just take the CS classes that the BS requires anyway? The non-major classes are the only place where the BA/BS distinction is going to matter for CS. Do you have more background knowledge in the arts, or the sciences?


It certainly depends on the school, but at least as important as Algorithms, course-wise, are Architecture, Operating Systems, Networking, and in some fields, Numerical Analysis. I'd say a BA with those first three is just as good as most BSes. If you're looking to work in industry building real systems (not just gluing Java APIs together), you need at least a cursory understanding of those topics, which a BA may not provide.


It matters some. I've done some hiring, and in a technical field where the BS is common, a BA makes me a little skeptical: as if you couldn't handle, or avoided, the harder science classes. That might not be fair, and it wouldn't rule you out in my book, but it matters some.


You can't judge this simply based on the degree name. A lot depends on the school and how they do things. I'm not entirely sure but I think my CS degree is a BA. But I've taken enough high level math classes to have a BS in math (I had to pick either or, they wouldn't give me both), and I also took all the hard CS courses: Algorithms, OS, Compilers, Computer Graphics. If having a BA gives you pause for a candidate, I would strongly suggest you simply ask them what courses they took.


I had a BA because every first degree at my alma mater is a BA. I guarantee you it's at least the equal of any science course anywhere in the world.

(They give you an MA for free later, but that's a weird historical accident; I have a proper Master's and a PhD as well.)


  > I had a BA because every first degree at my alma mater
  > is a BA. I guarantee you it's at least the equal of any
  > science course anywhere in the world.
I assume it's a bit different for European universities. In general, the BA/BSc distinction tends to be a meaningful one in the US.

  > (They give you an MA for free later, but that's a weird
  > historical accident; I have a proper Masters' and a PhD
  > as well.)
Weird? Try cool. They're seriously considering abolishing that particular quirk, though.


At Berkeley, the BA and BS for Computer Science are effectively the same. All the core courses are the same, it's just the general requirements are different. The degree designation really doesn't matter. If I'm not mistaken, I believe Harvard also grants Bachelor of Arts degrees (they use AB instead of BA) for Computer Science.


The CS classes I take are the same. The only difference is what I'm taking in addition to my CS classes.


It really differs between universities. In some universities, there isn't a difference between BA and BS, while in others, there are big differences. And in some British universities (like the one I attend), the only choice is BA in computer science.


Figuring out a way around the standardized admissions process is the admissions process for nonstandard applicants.

This is no different than anything else in life, from job applications to pitching a client, and the world hasn't changed as much as you seem to imply.


> Figuring out a way around the standardized admissions process is the admissions process.

So true.


I'm only 25, yet I taught myself how to program on an Apple //e in elementary and middle school. Furthermore, today's computing environments are arguably less conducive to autodidactism. When the Apple II booted, it dropped you at a BASIC prompt.


I don't think I can agree.

I'm 29 and learned BASIC on an Apple II. While remembering what it was like makes me nostalgic for simpler times, I don't kid myself. It sucked.

I had to read out of scrappy library copies of Byte and Compute magazine. There were hours of trial and error. And worst of all, it all went away when you turned off the machine unless you had one of those notoriously unreliable tape machines.

Contrast that to my first experiences with Linux nearly a decade later. I had man pages, Usenet, IRC, and the web; better hardware, an operating system, developer tools, and free compilers! I learned about Perl and dynamic languages (I was into C at the time). I slowly started learning about how operating systems worked. It was a far more conducive environment for learning.

All of this information was right there on the computer. No more trudging down to the library or throwing away my money on thick technical manuals and trial-version compilers.

I'd say modern FOSS computing environments are the most conducive to learning. The Apple II was a toy compared to what we have now.


And, when you booted into GNU/Linux, you also had a web browser, chat programs, games, etc. The possibilities for distractions are numerous. Contrast that to a BASIC prompt. If you go to your computer to learn BASIC, as long as you are sitting in front of the computer and actively staring at it, you're either going to:

    a) get bored as hell by doing nothing
    b) type some stuff in from your magazine and execute it
        (thereby learning "something" along the way because of
        typos, pattern recognition, etc.; of course you'll /then/ get
        distracted, but you had to work a bit)
    c) write a program from scratch by exploration, or need.


For some value of conducive.

Distraction is a human problem, not a technological one.

You can just as easily use a typewriter instead of a computer to write your next novel in order to escape the distractions of Twitter, Wikipedia, et al. But you'll just find something else if you are prone to being distracted.

I measure "conduciveness" differently. For me it's the distance between what I know now and what I want to know. When I was growing up and typing in programs on an Apple II, the gap was really large. Getting information was hard, and it came in at a trickle. Access to technology was pretty limited; I had what came with my computer, and that was it for a long time. Conversely, modern *nix systems are largely self-documenting. There's no guesswork about what a particular C function does or how it is implemented; there's a man page for it. I have access to the complete source code of my operating system. If I want to learn anything, it's a few keystrokes away.

Discipline is the cure to distraction. Not blunt tools.


If anyone wants to try something that approximates a computer that boots into a BASIC prompt, try AVR programming or Arduino. It's either C or assembler and you've got to use mainly reference manuals. Arduino has more resources and thus may be more prone to distraction.


Either of these requires programming on a computer with a web browser, games, chat, etc. Programming embedded hardware does not solve the problem of working in a distracting environment. Self-discipline, determination, and perhaps other inner qualities are necessary, and always have been.


I think that was important for my actually learning to program circa 1978. If I were growing up today, it's very likely I would have been too distracted by all the other cool stuff I could do on the computer. I teach high school programming courses now, and that is something I have to battle.


Oh, you actually had an Apple II, my condolences :) The //e actually had disk drives (although I also had the tape deck experience thanks to a TI 99/4A). Anyway:

> I had to read out of scrappy library copies of Byte and Compute magazine. There were hours of trial and error.

That's a great way to build persistence. Many kids lack that skill.

> All of this information was right there on the computer. No more trudging down to the library or throwing away my money on thick technical manuals and trial-version compilers.

Mm, I remember library trips. Taught me how to actually look stuff up. I agree, when I first got Internet access, being able to download DJGPP changed my life.

But on the //e, I learned how to reverse-engineer. Taught myself assembly by reading hex/assembly listings and via trial and error.

I agree modern systems make it easier to learn. But the Apple //e made you learn.


Yeah, I grew up with second-hand everything. The school only had ICON machines (http://en.wikipedia.org/wiki/Unisys_ICON). It wouldn't be until '93 or so that I'd finally set eyes on a 286, and it was '94 before I got my own 386. I still had that Apple II up until then (but the school had gotten rid of the ICON machines around '91 or so and acquired decent PCs... the school had better computers than I did!). I "programmed" then, but it was really just "punch it in and mess around with the values."


I learned to program on modern FOSS computing environments and I must say that I think a simpler, more constrained learning environment would have helped me out a lot. It's obviously dependent on the student's learning style, but having complete knowledge of a simple system is much better for me than working knowledge of a complex system.


> I had to read out of scrappy library copies of Byte and Compute magazine. There were hours of trial and error.

Funny, I look back on the same experience fondly, except it was assembly on an 8088. The "a-ha" moments were exhilarating.


> today's computing environments are arguably less conducive to autodidactism

I believe the importance of this cannot be overstated.


When I was learning to program there was no Internet, and there were precious few technical books on the market. Often I had to resort to trial and error, where an error could easily mean watching my machine reboot, since there was little memory protection. Obviously I now see the world with different eyes, but it seems to me that today's computing world is much better suited to self-learning.


You're right in that there are more learning resources - ebooks and lectures galore! However, the complexity of software systems has increased quite a bit too. This complexity, and the ensuing layers of abstraction, are enemies of learning, IMHO.

Here's a personal anecdote... I remember having only a vague understanding of MFC for a number of years, until one day I came across a manual for MFC 1.0. For whatever reason, I went through the old book and came out with a strong understanding of the main class structure and the motivation behind the framework. I was a bit surprised that the tomes on the subject (and I had many of the latest at that time) were thick but lacked some of the essential ideas.


But web development, while very abstract, is very simple. Sure there are tons of frameworks, but you can still do great stuff with just css/html/jquery/server. And the amount of stuff on the web makes it that much easier.


Still, you have to understand a lot of very abstract stuff to get what _ means in jQuery.

There is a world of difference between that and 8-bit BASIC.

While learning 8-bit BASIC may not be the optimal path between a neophyte and a computer scientist, it strikes a nice balance between high and low level.


Just for debate's sake (because I tend to feel as you do), let me take the other side (although we could be arguing semantics around the word "simple"... so leaving that aside).

Is making a web page simple? Yes. Is making a web page do cool stuff simple? Yes. And it's way cooler than the Logo and BASIC (and worse) of my youth.

Is "web development" simple? Not in the slightest, as a generalization.

The web is the UI component of an application. That application could be as complex as any other - but instead of a simple visual toolkit for our GUI, we have to understand a bundle of technologies that is constantly changing (HTML, CSS, JS), plus the end-user platform (the browser) and all its quirks... now throw mobile in, now throw in that some understanding of the network itself is required.

Things are more open now, information is more readily available, and there is certainly a lot to choose from.

Further, there's the audience - you can now publish your apps to the world immediately; you don't need to get recognized and published on paper or through a publisher. So a simple, silly little game hacked up in javascript/html5 by a kid (yay, kid! You did more than I did at your age, keep it up!) has more impact and visibility than similar feats by similar kids had back in the 8-bit days, when they figured out how to make my CRT do things God didn't intend it to do, all for my amusement.


Web development as a full-blown software engineering profession is not simple. But what I meant is that I think I'd find it just as easy, if not easier, to teach a 12-year-old basic web development - and produce something they'd find interesting while being educational - as it would be to do this with 8-bit BASIC.

And the tools now are so much better. I remember as a kid using graph paper to draw sprites, and then converting the images to hex streams by hand. Not mentally challenging, but tedious and error-prone.
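That graph-paper workflow is easy to sketch today. Here is a hypothetical Python illustration (the original would have been done by hand, or in BASIC/assembly): each row of an 8x8 sprite becomes one byte, one bit per pixel.

```python
# An 8x8 sprite drawn on "graph paper": 'X' is a set pixel, '.' is clear.
sprite = [
    "..XXXX..",
    ".X....X.",
    "X.X..X.X",
    "X......X",
    "X.X..X.X",
    "X..XX..X",
    ".X....X.",
    "..XXXX..",
]

def row_to_byte(row):
    # Build the byte bit by bit; most significant bit = leftmost pixel.
    value = 0
    for pixel in row:
        value = (value << 1) | (pixel == "X")
    return value

# The hex stream that once had to be worked out by hand.
hex_stream = [f"{row_to_byte(r):02X}" for r in sprite]
print(" ".join(hex_stream))  # -> 3C 42 A5 81 A5 99 42 3C
```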


I think the web browser is the new place where you can teach yourself with immediate feedback. (I also learned from the Apple BASIC prompt, and simultaneously from the HP-41 command language.)


I think compilers today have much better error messages, and you can always turn to sites like Stack Overflow and Google. Back then, if it wasn't in some undecipherable manual, you had to figure it out by trial and error.

When I first learned to program, I recall struggling with run-on loops. I'm not sure if that is even really an issue with modern compilers. There were things that were just hard to figure out.

Today, I can just go to Google.


> Furthermore, today's computing environments are arguably less conducive to autodidactism.

You might be interested in this if you haven't seen it before:

http://viewsourcecode.org/why/hacking/theLittleCodersPredica...


To rekindle the nostalgia, headless Linux gives you a bash prompt.


This.

I, too, got into college without graduating high school, based on some software I wrote on an Apple II. It wasn't MIT, but that's kind of the point-- many schools are open to non-traditional students who impress them in some way. And back in those days, I imagine the bar was a lot lower in terms of what kind of code it took to impress (due to Moore's Law, as much as anything else...)


Guess I should have submitted my Apple II code to MIT in the mid-'80s... Too bad that, even if I did it now, they probably wouldn't be able to read my Apple II floppies (I am afraid I won't be able to read them anymore either - and I have a couple of Apple IIs in my collection)

:-/

How many of us wrote a window manager with 1K of 6502 code?


I did on the C64, prob more than 1K mind you.

It allowed you to create a bordered window of any size and colour, write text directly to it and when it was closed, as if by magic, the text that was behind it was still there! I think you could have 8 such windows open at the same time.

I was rather pleased with it at the time.
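The trick described above (remember what's behind the window, put it back on close) is a classic save-under. A hypothetical Python sketch of the idea, with the screen modeled as a plain character grid rather than C64 screen memory:

```python
class Screen:
    """A character-cell screen, like a C64 text display."""

    def __init__(self, w, h):
        self.w, self.h = w, h
        self.cells = [[" "] * w for _ in range(h)]

    def draw_window(self, x, y, w, h):
        # Save the cells the window will cover, then draw a '#' border
        # with a blank interior. The caller keeps the saved block.
        saved = [self.cells[y + r][x:x + w] for r in range(h)]
        for r in range(h):
            for c in range(w):
                edge = r in (0, h - 1) or c in (0, w - 1)
                self.cells[y + r][x + c] = "#" if edge else " "
        return saved

    def close_window(self, x, y, saved):
        # Restore the saved cells: the text behind is back, "as if by magic".
        for r, row in enumerate(saved):
            self.cells[y + r][x:x + len(row)] = row
```

With several saved blocks kept on a stack, closing windows in reverse order gives the "8 windows open at once" behavior the comment describes.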


Z80, using HEX only. That was great. And no internet needed, just some magazines and BBS printouts.


I confess I kind of cheated. I used William F. Luebbert's "What's Where in the Apple II" to find routines in ROM suitable to what I wanted to do.

You can grab yours at http://apple2scans.net/apple-ii/whats-where-in-the-apple/


This blog post makes it sound like he didn't fill out an application. However, I'm guessing he did and also sent in code as supplemental material (which is quite common).

College admissions is a game. I currently go to MIT and it's definitely more meritocratic than many other institutions of similar "prestige" but it's not an exception.

  High school students today optimize their grades and SATs   
  and after school activities. They speak French and  
  Chinese, play piano and paint abstract art.  They dance  
  around and play hockey and act like they help homeless 
  people.
It's ultra-competitive. That said, the number of kids who do this sort of thing is waning. I think people realize that being extremely good or having insane dedication to one thing or "theme" is key.

On the other hand this stereotype arose from an era where admissions committees were trying to create a class of well-rounded students. It still occurs to some degree. However, things are trending towards creating a well-rounded class instead (lots of highly specialized students).


"Here, see, more proof of why you should drop out of high school!"

I actually love these stories for a different reason. If your interests are specific, you find most classes boring, repetitive, and unhelpful, and you have passion for your interests - the school system fails you big time.

It pains me to see our society viscerally and religiously touting high school education when it doesn't satisfy a large minority of minds. If we can shed some light on that and possibly change the system for the better, fewer of us would be dropping out, and we could actually tout high school as useful because it actually would be.


Not to belittle those whom the school system is failing, but he wasn't in the school system. The blog post notes there was no school close enough for him to go to. It could be that, since the area was rural, the majority of the kids worked in agriculture after 8th grade, with a minority leaving to attend school in a nearby town.

Perhaps you could argue the school system did fail him because there wasn't one close by. However, I think your comment was meant in the spirit of schools that fail their students, like so many do in Cleveland (where I live).


>Honestly speaking, if he played around on an Apple II, this happened almost 30 years ago

The only Thomas Pinckney in the Alumni directory has a graduation year of 1996/1997 (two degrees, so it fits with the linked story).


Not quite 30, closer to 18. Tom appears to have been in the class of 97.

I played around on an Apple ][ and I was class of '94.

I suspect Tom and I just started in our early teens.


Actually, someone I know got into MIT after Grade 11 due to mathematics competition achievements/math work in general. That happened this year.


Agreed. My dad didn't have a high school degree- or even a GED- and he got a full scholarship to UMass Amherst. It wasn't that unusual.


The guy built the exokernel project at MIT. You work with the tools you have: an Apple II at the time, and later radically new OSes.


Speaking as someone who applied to MIT a few years ago, something like this is no longer possible, and the "rat race" description used for comparison is now in fact valid for MIT as well.

Nowhere in the application process do you have much opportunity to show your "software code" - everything is very formalized: you have to submit your grades and essays on specified topics, pass the SATs, and go through an interview (though the interviewer doesn't have to know anything about the discipline you want to study). Yes, you can describe your most interesting projects as part of your application, but if you read the admissions blogs and other MIT materials, it is quite clearly implied that unless you have near-perfect grades and/or near-perfect SAT scores, they won't even look at the project descriptions, essays, etc. Also, there is no way of knowing why you were accepted or rejected, because the whole process is 100% opaque to the outside world.

I still think MIT is awesome, and the admissions process probably has to look more or less the way it does because of the volume of applications they have to go through. But the post and some of the comments seem to leave the impression that the MIT admissions committee will look at every person as a "unique snowflake" to find the really outstanding candidates. In reality, the process has to be quite mechanical just to be manageable, and only after the initial 90% of applications get rejected can they scrutinize the remaining 10% in more detail. So if you want to get in, you have to "optimize grades and SAT", and "speaking French and Chinese, playing piano and painting abstract art" won't hurt either.


Bullshit.

They have the application process. You can ignore it. I did. I had shit grades in high school (not shit-by-MIT-standards, but actual shit). I took no time to do homework, and instead spent time reading math books and programming. By the time I applied, I had done research at a reputable university laboratory (although I did not get to the point of publishing), and had a good recommendation from a well-known professor there. I had also taken several advanced math classes at a state school.

I sent MIT a custom admission packet. I filled out their paperwork for biographical information, but mostly ignored it.

MIT rejected me early admission. I called to ask why. They told me that they liked my packet, but given my grades, they were concerned about my maturity and my ability to work hard. I got an extra recommendation from a professor teaching the math class I was taking, whom I had asked to explicitly let them know that I was mature and a hard worker. At that point, MIT took me.

I didn't even bother applying to Princeton, Harvard, Stanford, and the like, since I knew I had no chance.

Optimizing for grades is a bad and stupid strategy. If you're late in the game (high school), it's the only strategy with a chance of success. If you're circa elementary school, the best strategy is to ignore school. The time with no homework will let you learn math and science (which you can do much more quickly than school will teach you), and to have real accomplishments by the time you apply.

Real accomplishments ALWAYS trump grades, for anything, be it university admissions, jobs, or YCombinator. Grades are a proxy to see whether you are smart and can do useful things. Accomplishments are a direct measure.


Unfortunately if I encouraged my daughter to ignore school and pursue something she's passionate about, Child Protective Services would be here in short order.


I think "ignore school" is less about minimum mandatory attendance and more about where you put your heart.


That's exactly what I'm talking about too. I didn't even think about attendance. There are a lot of busybodies in this world who will think nothing of trying to ruin your life, or at least make it temporarily miserable, if they don't like what you're doing. ESPECIALLY if they don't like how you're raising your kid.


Your daughter is lucky to have a parent with a good sense of humor.


Thanks, though "good" is subjective. (Especially here on HN.)


All well and good, but I'm not sure doing "research at a reputable university laboratory" and having "a good recommendation from a well-known professor" can really be classed as bypassing the academic admissions system.


It's not, but it's certainly outside of the "standard" sequence. The useful lesson to extract is if anyone uses one metric (say, grades) as a proxy for another (say, ability to work and flourish in a university environment), and it's possible to score points on the second metric, do so. Even if there's no official channel for it, that's what people really care about and they'll immediately recognize it as such.

The reason that this sort of thing is so rare, I think, is that we don't know what metrics matter until we're well past the point of applying to something. Even if we're told, it's hard to overcome the expectation and pressure of doing well on the proxy metrics. For example, in college, I had no concept of what mattered for grad school: research. And even if someone sat me down and told me, I'm not sure if I would really understand it. It's a rare person that realizes it's even possible to obtain points in the metric that really matters.


That said, it was MIT's experience back when the SAT meant something (pre-1994) that class rank (i.e., grades) was one of the two best predictors of subsequent success.


Woah, slow down here. Optimizing for grades is NOT a bad or stupid strategy, especially when you're young. By doing well academically, you put yourself in a position to do well in the future. Sure, pursue your interests and shoot for real accomplishments--so that when you're applying to college with those real accomplishments you don't need to explain why you didn't get good grades. It's just putting yourself in the best possible position to be rewarded for your work.

Don't shortchange yourself. Do your best in school, but be pragmatic and work hard outside of it too.


> I took no time to do homework, and instead spent time reading math books and programming

vs.

>I got an extra recommendation from a professor teaching the math class I was taking, whom I had asked to explicitly let them know that I was mature and a hardworker...

These statements are irreconcilable. A hard worker does work that he/she is assigned, and does not make premature value judgements on the worthiness of said work. Being smart does not mean you're a hard worker. That was a pretty irresponsible recommendation by the professor, considering you didn't exhibit the qualities you claimed to.


> These statements are irreconcilable. A hard worker does work that he/she is assigned, and does not make premature value judgements on the worthiness of said work.

Nonsense. Work should always be in furtherance of some goal.

I don't want to hire the person who does exactly as I say, even if it's a dumb idea in the larger context. What's important is reaching the goal (ethically, legally &c).

While there are people who use anti-authoritarianism to justify laziness, that doesn't seem to be the case here.


Perhaps I was unclear. I had crappy grades in high school. High school homework was a pointless waste of time, so I didn't do it. I had good grades in 3 of 4 of my university math classes. Homework there was interesting and not a waste of my time, so I did it. When I hit MIT, my grades skyrocketed, because with a small number of exceptions, the classes were fun, and the problem sets were interesting and useful.

I did poorly in my first class -- I didn't realize this immediately. I don't believe I sent a university transcript to MIT admissions, but if I did, they would have seen one bad and one good grade at early admission, and one bad and two good at regular admission.


What? A hard worker is just somebody who works hard. (Note: on something, not on everything. Working hard on everything is impossible.)


Well, maybe the professor didn't know what his grades were like.


> A hard worker does work that he/she is assigned

Maybe. But a smart worker evaluates what's important and what's not. Formal education and learning are sometimes completely orthogonal to each other. You might be thinking "well, how do you know what's important?" and the answer is that it's a gamble and decisions like those are not for everyone. There is no virtue in simply doing what you're told and being passive in your own education.


The OP claimed that he procured a reference that attested to his "hard-workingness". It's perfectly ok to be a smart worker, and I don't disagree with your point. My point of contention is with the abuse of the reference system - what value does the reference system have if you simply ask for (and receive) character attributes you do not possess?

So basically he was admitted on traditional merit, by misleading application reviewers.


Yeah I guess it depends how you define "hard work". Certainly spending hours reading math books and programming is some sort of work.


If you don't mind sharing how long ago was this?


About a decade ago. I'm still an MIT affiliate, and admissions has not changed in this regard since. There have been a lot of other changes (the students now are better rounded and better looking than when I was a student), but this is not one of them.

MIT Admissions intentionally takes high risks. They admit a number of students at the extremes who may end up very bad or very good. The risks don't always pay off (we had a few real idiots), but the policy is actually quite sound. MIT's name comes from its most famous graduates -- the Feynmans, Aldrins, Metcalfes, and Kurzweils -- and the cost of having a number of bad or flaky graduates is rather low -- only the people who work with them ever hear about them.


"Grades are a proxy to see whether you are smart and can do useful things. Accomplishments are a direct measure."

We've always used this type of thinking in terms of hiring but you've got it nailed in a sentence. Fantastic.


Props to you for your determination, hard work and perseverance. I think it speaks very well of your then maturity when you look at the way you approached what is, for most young adults, a very intimidating process.


Yeah I agree with everything you said. I wish I had had the balls and/or forethought to do that when I applied. Kudos to you, sincerely.


That's not entirely correct. You don't need near perfect SATs/grades, especially at a place like MIT. I know plenty of people (myself included) who do not fit that bill. And there is plenty of opportunity to discuss your projects. I mailed MIT a packet of all the interesting work I thought I had done and they even have an optional essay where you can talk about "something you built" in their application. I think they really care about those pet projects and they are the best differentiator they have.


They have stats publicly available, so everyone can judge it themselves:

http://www.mitadmissions.org/topics/apply/admissions_statist...

If you look at percentage rates, the self-selection of the candidates might make it look like it is not so rigorous. But take a look at the absolute numbers, for example for the "SAT Reasoning Test Scores (Math)":

1172 / (1172 + 269 + 108 + 2) =~ 0.76

So 76% of the students enrolled have scores in the best 750-800 range.
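For what it's worth, the arithmetic checks out. A quick sketch, using the counts quoted above (only the 750-800 bucket is labeled in the comment; the exact ranges of the lower buckets are not given there):

```python
# SAT math score bucket counts for enrolled students, as quoted above.
# counts[0] (1172) is the 750-800 bucket; 269, 108, and 2 are the lower
# buckets, whose exact ranges aren't specified in the comment.
counts = [1172, 269, 108, 2]

total = sum(counts)            # 1551 enrolled students across these buckets
top_share = counts[0] / total  # fraction in the 750-800 range
print(f"{top_share:.2%}")      # prints "75.56%", i.e. roughly 76%
```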


The math portion of the SAT doesn't even cover precalculus (stops at Algebra II). Would you really expect someone who scored below 700 to have otherwise impressive credentials?

I'll agree that the more stellar applicants do tend to have high SAT scores but that's also in part because they are a) stellar and b) the SAT is a trivial test. I still disagree that SAT is a cutoff or benchmark. I highly doubt the admissions committee sees a 600 and throws the student out before reading the rest of their application to look for things that stand out. That said, there are tons of students with 800s that don't stand out at all and get weeded out.

Exceptional candidates will be exceptional regardless of their standardized testing scores.

Edit: Not trying to say that they admit all the exceptional people either. I'm absolutely positive plenty of people get screwed by the limited number of slots despite being more than qualified.


I think the following sums it up quite nicely:

"Of course you need good scores and good grades to get into MIT. But most people who apply to MIT have good grades and scores. Having bad grades or scores will certainly hurt you, but I'm sorry to say that having great grades and scores doesn't really help you - it just means that you're competitive with most of the rest of our applicants. MIT is very self-selecting in that regard."

From: http://www.mitadmissions.org/topics/apply/the_selection_proc...


I can agree with that (and with you for the most part). I just maintain that "near perfect" isn't part of the equation.

Edit: I never was really arguing against this (just that near perfect notion). There's a big difference between near-perfect and bad.


> Would you really expect someone who scored below 700 to have otherwise impressive credentials?

Errrr. I think I got a 680 and I'm pretty sure I got a 98% in Calc BC and a 5 on the AP. I also know a friend that got a 750 on the Math SAT, dropped out of Calc because he didn't understand it, and I had to tutor him.


I don't think I said "No one who gets below a 700 has impressive credentials". I was just trying to generalize that higher scoring students are more likely to have other impressive things about them.


Mens et Manus.


Look, the SATs are such ridiculously easy tests that scoring perfect or near-perfect on them doesn't say anything about you, only that you might be competent. Scoring low almost certainly means you aren't. SATs are only useful for distinguishing varying grades of mediocrity. That's why MIT asks for AMC/AIME scores.


I call BS to some extent, to both the original post and this comment.

I went to a major Ivy. I know firsthand of someone who got early admission without doing senior year (from an elite foreign school); in honors calculus we had a high school kid who ran circles around us (not a full-time degree student IIRC); in our CS program we had a Doogie Howser type who was admitted in his teens and last I heard was some kind of security guru.

I'm sure any top school has a small number of brilliant students with oddball backgrounds. In other countries you have to get the piece of paper, but in the US no one cares about a HS diploma. If you can show you've got the goods, it doesn't matter how you show it.

(Although MIT is more likely to care purely about ability to be a great engineer as opposed to a well-rounded human being - to that extent I can believe other schools would not have been receptive to a candidate based on code LOL)


MIT doesn't care purely "about ability to be a great engineer as opposed to a well-rounded human being"; some of the latter is required. And in my experience that isn't too hard to find: how many great engineers or scientists that you've known or heard of are interested only in their work and have no interest in, say, music? (Musical interest was common at MIT in the '80s, at least.)


I'd have to disagree too.

I just applied this year, and was incredibly pleased to get accepted. (woo!)

The thing is, my grades weren't stellar. Don't get me wrong, they were great, and I was ~15th in my class-- but people had better. Similarly, I only had a 2010 on the SATs after taking it twice. (To be fair though, I did have good AP scores)

Yet, with all that, I was the only one in my class to get in, out of 6. That group included the valedictorian and salutatorian -- heck, I was the only person not in the top 5%. And I'm the first person to get in from my school in at least the last 7 years.

And, for what it's worth, my extracurriculars were relatively flat: only one language (French), no instruments, no athletics, no arts.


Why do you think they accepted you?


1) Got rejected from MIT 2) Admissions process must be shitty 3) MIT is great but must admit shitty students instead of me 4) Read an article about a strong non-traditional 5) Equate that guy with self 6) Try to forget that self was just a weak traditional


I agree. It is very difficult to figure out where, and to whom, to show your code.

When I applied to undergrad 5 (or was it 6?) years ago I applied to CMU. I submitted with my application an "abstract" about some software I wrote. The software was really good. In fact it got me a job my first year in college (at a security consulting company). But, I don't think it got to the right people at CMU. They probably looked at a low score I got on a test and passed.

Maybe I should have tried MIT instead of CMU. I didn't know anything about colleges, so I only applied to three schools -- Case Western, CMU, and Northwestern -- and they were all near where I lived. Only when I was in college did I learn about other good schools out there.


Ed Fredkin has a somewhat more impressive story. He became an MIT professor without ever getting a degree — even an undergraduate degree. But by that point he'd invented a fundamental data structure (the radix tree or "trie"), worked at MIT for years on defense contracts, and made enough money off a high-tech startup to buy a small island in the Caribbean. Not metaphorically. He actually bought the island. He'd also been teaching at MIT for some time.

He's at CMU now.


"he never got he never got a high school degree"

This sentence tripped me up. I vividly remember some of the more boring classes where you end up staring at the clock, for some subjects I actually tried to put in the least effort possible to achieve 80%. I wish I had those years back to do follow something I really enjoyed doing.


It tripped me up too, but for the (to me more obvious) reason that the "he never got" is repeated.

Does that make sense to everyone but me? Or am I just being silly/obnoxious for complaining about (what I think is) an editing error in an article with an actual point?

Or is meant like "he never understood that he never got a high school degree"? I'm not a native speaker of English, and the missing "that" in the linked-to article makes this interpretation feel unlikely to me.

Confusing.


This is the problem that I had when I was applying to colleges: I used to ignore classes that bored me but were required and instead spent time that should have been spent on homework, etc. doing programming side projects and learning CS concepts.

When application season rolled around, I had to compete with candidates who had a much shallower understanding of their area of study, but a much stronger overall GPA, loads of random APs, etc. While I did mention my side projects and depth in my area of interest, I didn't think to submit code or the actual projects; I usually just mentioned them in the questions or essays (which I'm not certain anyone even reads). This led to quite a few rejections.

I'm at Georgia Tech now and doing well, because all my classes, more or less, are related to what I'm interested in. While I'm very happy here, I'm curious whether I would be as happy if I hadn't been accepted to Tech and were instead studying in a place without such an abundance of opportunity. I'm sure there are others in similar situations.


Well, the long school years are a way to structurally hide mass unemployment now that we don't have child labour. If youngsters could do meaningful work and pick up their skills along the way, things would look radically different -- especially given the turtle pace at which tuition normally proceeds. Sure, learning takes time to settle, but why not do something meaningful while it settles?


If you have talent, and work hard to cultivate it, then you can take advantage of the opportunities that are around you. Doesn't matter if you're at GA Tech or MIT, there are always more good opportunities than good people.


I definitely agree; it's just a matter of finding the right people.

Anecdote: attending some of the tech talks that companies give at Tech, there is always a gradient of people, from the people that are there for free food to the people that are there to ask questions and get something out of it. These tech talks are thinly veiled recruiting events, and the latter set of people are always approached afterwards for interviews or networking purposes.


Sure you're awesome, and everybody else sucks.

Georgia Tech admits 61% of all applicants .... just saying.


No idea where this came from (or if it's a good idea to feed the trolls; "hn_is_dumb" - really?), but Tech does have a policy of accepting most applicants and then failing out those who don't belong there. Only 29% of incoming freshmen graduate in 4 years.

I'm glad Tech has this policy because there are lots of bright people here that did the same thing I did in high school and they do well here for similar reasons.


This is rare, but not unheard of; I can think of fiveish people off the top of my head that were admitted to MIT and Caltech without a high school diploma. All of the cases I know of are kids who just decided to leave high school without finishing their requirements, and went directly into one of the tech schools a year early.

The blog post mentions that there 'was no place nearby to go to high school.' That's really the issue in play. All of the 'MIT a year early' people I know about made a case to admissions that they had exhausted all of the resources at their schools and the time for MIT was now. The tech schools don't discriminate against lack of opportunity. If you're perceived as not taking all of the opportunities presented to you, though, you're finished. The post mentions that he took some community college classes. This shows a desire to learn and an ability to take advantage of the resources available to him. If he hadn't gotten a high school diploma because he was just too cool to be bothered, I imagine that he would have had more of an uphill battle.


While this is fascinating, let's not forget that other people are NOT like Tom. Education is important and we shouldn't dismiss it if we have an opportunity to take it.


Philip Greenspun also entered MIT after dropping out of high school.


I honestly believe our education systems (those that I know about) most of the time end up holding some of the best people back.

The idea of degrees and diplomas where everything depends on scores achieved in a set of examinations is utterly counterproductive, in my opinion.

I hate it when I see people worrying about grades without understanding what those grades actually mean.


The vast majority of the "best people" cruise through our education systems. The few who truly don't fit the mold usually end up making it if they are genuinely exemplary. Low caliber people then hide behind these few outliers to try to rationalize their own lack of ability and academic mediocrity. "I suck at everything but I know I'm smart cuz <famous_person> also didn't like school." Everybody wants a medal. And when they don't get one, they blame everything but themselves.


I did not realize that! I was just reading some of his old blog posts this morning. He's pretty impressive.

http://philip.greenspun.com/school/tuition-free-mit.html http://philip.greenspun.com/careers/octopus.html


It's not so unusual, I was accepted into my Computer Science MSc (at Oxford) without a CS background - I did have a first class BA, but it was a joint honours in IT and Philosophy from a more-or-less unknown university (Lampeter).

Anyone who knows CS will know that IT is nothing like CS. I didn't have any A-Levels either. Masters degrees are a lot more forgiving, and I had some experience in software engineering.

(edit: this was year of 2009, and yes, I passed ;0) )


I don't know if this applies to the course you did but there used to be a lot of "conversion course" MSc courses in CS related areas in the UK.

These were very much targeted at people who did not have a background in CS.


yeah I think those courses do still exist, mine was a straight-up MSc though.


It's rare, but not unheard of even outside of CS.

A friend of mine is finishing up his degree at Brown University having never graduated from high school. Instead he joined up and did three tours in Iraq and Afghanistan as a pararescueman. The university has a program in place for students who experience an interruption (family, work, service, etc.) in their education.

http://www.brown.edu/Administration/Dean_of_the_College/advi...


The submitted blog post acclaims MIT as a "national treasure" because it admits applicants to its undergraduate degree programs who don't have a high school diploma (certificate of completion of secondary schooling). MIT is not alone in this policy. The Common Data Set Initiative

http://www.commondataset.org/

surveys United States colleges and universities each year about their admission policies. Question C3 asks if a high school diploma is required for undergraduate admission.

Harvard

http://www.provost.harvard.edu/institutional_research/Provos...

does not require a high school diploma for admission.

Neither does Princeton.

http://registrar.princeton.edu/university_enrollment_sta/com...

Nor does Yale require a high school diploma.

http://www.yale.edu/oir/cds.pdf

MIT has long reported that it does not require a high school diploma for admission.

http://web.mit.edu/ir/cds/2010/c.html

http://www.mitadmissions.org/topics/qanda/questions_and_answ...

There are other colleges that explicitly say in their Common Data Set filings that they do not require a high school diploma for admission. Moreover, homeschooling is widespread around the world,

http://learninfreedom.org/homeschool_growth.html

and all of the most famous and most desired colleges and universities have admitted homeschoolers,

http://learninfreedom.org/colleges_4_hmsc.html

who often have "home brew" transcripts (as my oldest son did when he applied for his undergraduate university studies last year).

Lacking a high school diploma issued by a government-operated school is not a barrier to admission to any of the better colleges or universities in the United States, if the applicant is well prepared for higher education study.

After edit: I'm amazed that this thread has not yet mentioned pg's essay "What You'll Wish You'd Known,"

http://paulgraham.com/hs.html

his advice to high school students about how to use their time meaningfully. High school students who take this advice to heart can get into a good college with good financial support if they want to, or pursue some other challenging personal goal if they would rather do that.


I don't think the point is whether the schools require a high school diploma as a stated prerequisite; it's more whether they actually admit people with no diploma on the basis of other work.


Many / most schools admit people with no diploma on the basis of their work. This is why many students apply as Juniors in high school and how early admission works. If universities required a high school diploma as a qualification, students wouldn't be able to apply until they actually completed high school. There are many cases of people entering college as early as 13 years old, and this is not limited to MIT.


Early admission has nothing to do with having a diploma. All undergraduate and graduate programs let you apply before you have finished the previous level because the application and acceptance process takes more than the time between schools.

In cases where diplomas are required, the acceptance letter will always contain language stating that they will retract your acceptance if your grades drop an unacceptable amount or you don't finish the degree you're working on. Then most schools require you to submit another set of transcripts, proof of graduation, etc. before you officially enroll.


of course you don't have the degree when you apply and get accepted. the whole point was the guy didn't have a high school to go to, wouldn't ever get a degree, and MIT was awesome enough not to care.


There were freshmen as young as 15 when I was at MIT, but as I remember it, they generally had graduated from high school. It's not that unusual to be 17 when you start college, so if you've skipped a year or two you could easily be 15 when you get to MIT (and generally, it's also fairly possible to take some extra classes and finish high school a year or two early.)

I mean, didn't you ever hear of Doogie Howser?


yes, thank you.


Anecdotally, Berkeley did not require a high school diploma for me. I applied, was accepted, and graduated without having attended a day of high school. It didn't even seem that they cared much; admissions called once to confirm that I wasn't going to submit a diploma and never mentioned it again. Although I haven't met them, I've heard of a few others from Berkeley that had also skipped high school.


Agreed - the article makes a dubious claim. Many universities regularly accept high school students that haven't yet gotten a diploma as part of their early admission process. All that is required are standardized test scores, a transcript showing what has been learned, and evidence of excellence.

The claim that MIT is somehow a national treasure and unique here seems to miss the point. I know this because I was accepted early admission to MIT and graduated from there.

Extracurricular activities and evidence of capabilities have always carried significant weight in MIT's decision making process, but this is also the case for many other schools. It's only the graduate school process that requires evidence of completion of lower education. Try getting into a PhD program without a high school diploma or an undergraduate degree. And no, I'm not talking about honorary doctorates.


Try getting into a PhD program without a high school diploma

Does any graduate program even ASK if an applicant has a certificate of completion of secondary schooling? Which one?

(Note that the submitted post claims that the author knows someone who received multiple offers of admission to graduate programs without possessing a high school diploma, which sounds very plausible to me indeed.)


Sorry - I misspoke. I meant to say that there's no need to ask for a high school diploma as a qualifier to get into a graduate program - graduate programs require undergraduate degrees with very few (any?) exceptions. Even if the undergraduate degree did not in turn require a secondary (high school) diploma, the undergraduate program was itself a qualifier. The point is that the main concept of this article -- that you don't need a high school diploma to get into many schools (MIT notwithstanding) -- is only relevant to undergraduate degrees, and even in that case, it is not particular to just MIT.


actually, MIT's Media Lab does not require PhD candidates to have an undergrad education.

Yes - the world renowned Media Lab.


After college, I went to the University of Chicago for math grad school. (I didn't finish the program--the Northwestern math department was too easy in those days, and my study skills were shot.) Three of my classmates had not finished college, and one had not finished high school either. That one had gone to MIT.


This is interesting to me. I'm a high school senior, and when I was applying to colleges, many admissions departments told me that since I have been homeschooled my entire life I don't have a good chance of getting in. One college even told me point blank that I shouldn't even attempt to apply.

I was baffled as to why. I have good grades, above-average SAT scores, and a solid academic foundation (I believe). On top of that, I've been working at a software company for the past year, founded multiple robotics teams in my area, and am now working on starting my own company.

Enough with the rant. The way I see it, if the college told me they weren't interested in me since I've been home schooled they're probably not worth my time to even apply.


One college even told me point blank that I shouldn't even attempt to apply.

Which one? I'll write to the college's admission office and tell them what they are missing (based on my acquaintance with dozens of homeschoolers in recent graduating classes).

See

http://learninfreedom.org/colleges_4_hmsc.html

for how much of an outlier such a college must be.


I too am curious about these schools. My three younger siblings were homeschooled through high school and attend/ed well regarded public and private schools. I think their hack was to get a G.E.D. but I'm not sure if that was even necessary in the end.


is this an onion piece about people who read legal documents and don't actually know how the world works? brilliant stuff.


MIT is a national treasure because of this: http://ocw.mit.edu/index.htm

OpenCourseWare is absolutely amazing. I'm using it to study SICP and then will continue with K&R. I didn't go to MIT, but I'll always feel indebted to it because of these amazing resources.


Although arguably implied, there is nothing in this blog post that explicitly states that MIT is alone amongst institutions of higher learning in accepting a student without a HS diploma. Rather, it is simply demonstrating a particular example of just such an unusual occasion.

I'm not sure why everyone is reading into it so much: it's just a "feel good" piece, really, illustrating how one student's practical skill set -- here, coding -- was sufficiently impressive to warrant a second look by one of the country's (best) universities. And, being a private school, they were willing (and able) to peel back their own red tape and allow admission notwithstanding his otherwise disqualifying credentials.

The point of the story is simply: here's a kid who was unqualified in the traditional, technical sense. But due to his obvious skill and intelligence in a particular field, a private school was willing to look past his technical disqualifications and, by its own prerogative, make an exception to its own rules.

This is most certainly why Berkeley and other public schools were unwilling to make an exception: they have less flexibility. (As someone who attended UCLA, I can attest personally to the stringent red tape of California's public university system.) That the blog throws public and private schools into the discussion demonstrates a remarkably cavalier oversight that misses the point entirely with respect to why, precisely, MIT -- a private school -- is the school that happened to grant the student the exception.


Once again, the title of the post and content overstate / misstate a point and belie the reality. Many high school students apply to schools like MIT without having a degree -- they get the degree when they actually graduate, by which time they have already been accepted or denied admission by schools like MIT. Speaking as an MIT graduate, and one that was accepted as part of the early admission process, not once did they ask in the application or in person whether or not I already had a high school degree. Of course I didn't - I'd get one when I graduated. When I applied, I was still a junior, and I applied early. All I needed were my SAT scores, a transcript (which the person in the article had as well), and evidence of excellence.

I don't understand the point of articles like this that breathlessly trumpet one thing while the reality is something else. Colleges everywhere regularly accept people who have not yet completed high school. This is not just MIT. To say that MIT is somehow unique here misses the point. And yes, I know, because I went to MIT.


My friend Ryan Lackey also got into MIT without a high school diploma at 16 (http://en.wikipedia.org/wiki/Ryan_Lackey). He later dropped out to become the CTO of Sealand.


To all the comments here - I don't think Chris is making a point against schooling. He's sort of implicitly making a point against resume padding.

Resume padding is not a healthy thing and such examples could enlighten a lot of high school students.


I think another way of looking at this is in parallel with Ben Horowitz's dictum, in the context of hiring, to look for strength rather than lack of weakness. I think in school admissions or hiring, there's a trade-off in the end of trying to find the best people in terms of the most impressive lack of weakness in any area versus the most impressive strength in at least one area. The more bureaucratic the process becomes, the more it tends to favor screening for lack of weakness.

There's a corresponding tension on the candidate side in how to prioritize the record of accomplishment one seeks to develop to prepare for a desired school admission or job, in how much effort to devote to shoring up any potential weaknesses versus how much effort to devote to pursuing capabilities and accomplishments centered on one's core competency. Sometimes that choice might take the form of whether to procrastinate less consequential pursuits to focus all one's effort on the most important thing one knows of to work on at that time, and sometimes that neglect of other pursuits can make a difference not just in degree but in kind, like pg talks about in "Good and Bad Procrastination".

The danger is that all college admissions processes are becoming homogenized and over-bureaucratized to the point of excessively screening for lack of weakness, to the point of never fairly considering a candidate's record of profound intellectual accomplishment because all the right boxes on their record aren't checked off, and that students are calibrating their intellectual pursuits accordingly. The glory of MIT in this example is that it avoided that over-bureaucracy of the process at least in this instance.

Obviously you can say well, any really gifted student should devote all necessary effort to a well-rounded education and SAT prep and extra-curricular activities as well as develop clear accomplishments in a core interest, and should be able to do well at it all. There's always a trade-off at some point though; and I think many of the greatest intellectual accomplishments have come from people who didn't consistently devote large chunks of their schedule to a diverse portfolio of widely varying subjects and activities.


When I was a sophomore at the 'Tute I became friendly with a frosh who was a little different. He was from Texas (as I am, but that is not germane) and was 24. He had pledged the co-ed frat next door to my dorm where I used to hang out a bit, and always to play pool at their Friday happy hours. His father was a senior executive at a well-known semiconductor manufacturer.

He was certifiable on many levels, but a very interesting guy. He was working at Draper Labs within a month of his arrival on campus doing who-knows-what with some-unknown-level security clearance.

He had applied to MIT from a Texas state penitentiary where he was serving a six-year sentence for robbing a series of pharmacies and related misdeeds. Once he finished there, he started a different sort of prison. ;)

I recently submitted an application for the summer funding round as a sole founder. My one good friend who has been living JavaScript and CSS for the last few years is busy with his own company, but I am sure this is a good spot to meet potential partners. I call my idea StratoShare, and it involves a gateway for providing a uniform access API across users' data aggregations. The gateway would also manage a sharing graph for each user that would include those of their various aggregators, but would be independent of them. Share once with each other for everywhere, and manage it all in one view.

If you have some Web app chops and are interested at all, please email jmichaeltindell@gmail.com and I'll send you a link to my application and video.


After that story, there's no source code? It would be interesting to see, at the very least.


Just goes to show you can still be successful without an MBA, a bachelor's, or a diploma. So many successful people missed some part of standard education so I guess we all should since those are the ones we keep on celebrating.

I can't tell if I'm being sarcastic or not...The idea is to avoid the typical route and focus on building and execution, where the real world is giving you a report card and not a school. If you're good enough, you'll get an honorary degree or be accepted without the standard credentials.


I went to MIT without a high school diploma (and a few years early); I got a great score on the SAT standardized test, good recommendations from a couple of HS teachers, MIT summer camp grad student/professor instructors, and a hacker job I'd had (via the Internet).

I don't think HS is actually a major factor in the MIT undergraduate admissions decision if you have a plausible reason for wanting to skip it.


Someone I know well was recruited on full scholarship to Cornell as a math student in 2007, even though he was a high school dropout.


There is some ridiculous story about how the guy who founded Burger King got into Cornell. Apparently the head of the mycology department had written some article 20 years earlier in a trade journal saying that anyone who did X, Y, and Z would get an automatic in to the university, and the guy came along and did these things even though he was otherwise completely unqualified.


In other news, one data point is enough.


sounds impressive but this is an exception, not the norm

there are many areas where certification/practice requires prerequisite qualifications, e.g. surgeon, attorney, airline pilot

if the course is highly competitive/lucrative, like say with IIT or AIIMS in India, expect litigation


While this story is awesome, it's really not that relevant today. During the Apple II days most programmers would have been self taught. Today, not so much.


"Software code?" Really?

This is the sort of thing that could have happened during a very small slice of highly unusual history. It certainly wouldn't happen these days.


the title should have been: "Student accepted to MIT without high school degree thanks to his software code"


MIT only enrolls geeks.


Sounds like a modern Forrest Gump.


Glaring typos in the third paragraph, with repeated phrases and bold text. Good story, though. Upvoted.


This should be an eye opener for all those bookish, SAT-scores-only people.



