Innovation and the Bell Labs Miracle (nytimes.com)
200 points by gkanai on Feb 26, 2012 | 53 comments



I was recently talking with Prof. Dave MacQueen about this (he ran a group at Bell Labs for around 20 years before the Lucent debacle). The most amazing thing was the management style of that lab. Once a year, you had to write an "I am smart" report (yes, the actual name) where you told management why you were doing good, smart things. Management would then meet on two separate days, once to ensure everyone was meeting the bar of doing good things and once to figure out how the money thing should be handled.

That's it. No assessment of "high-impact publications this year"; no assessment up the chain of how many $100MM businesses had been created by your group (thanks, Lucent!); no demo days to see if some product group is going to give you the "leave research and work on this or work on something else" ultimatum...


> The most amazing thing was the management style of that lab

I've worked for some years in academic research, in a position where the pioneering research of one guy led to a big laboratory with plenty of researchers in the lucky situation that they can work very freely on what they want. Coming from industry, I was completely baffled by this. This is probably the style in which the great ones work best. No corporate BS, just the work. What has been very puzzling is that, in this setting, the people are not that great at many things. Incompetence can be quite rampant among some people. Was there some kind of "quality control" for the work at Bell Labs? I now see the circles I'm working in moving towards a more classical "publish or perish" mindset, which produces safe and unambitious research. Sure, the incompetence needs to be purged from the system, but direct measurement very easily kills the long term for the short term. I wonder if there is any other alternative.


"Regrettably, we now use ["innovation"] to describe almost anything. It can describe a smartphone app or a social media tool; or it can describe the transistor or the blueprint for a cellphone system. The differences are immense."

I cannot agree with this more. We may not be able to re-create the environment of Bell Labs, but I'm hoping to see more in terms of actual science (and it will be at the nano-scale) in the future, rather than seeing so many people create yet another social app, claiming that it's "revolutionizing" an industry.


Exactly. Also, the word "technology" is now sometimes used to simply describe a certain piece of software middleware.

While some fascinating Bell Labs-like research is taking place at IBM and Microsoft, newer tech giants have opted to foster the creation of lots of competing startups to do research for them, so that they can spend money (through acquisition) only on the successful ideas, rather than develop their own large and expensive research departments. The result has been that startups often focus (in fact, they are strongly advised to focus - for example, in numerous blog posts that are widely popular here on HN) on ideas that can generate market value within a couple of years. They actually have no choice, because otherwise they lose their chances at investment and/or a lucrative exit.

This is not to say that most people working in startups are the same type of people who could have worked at Bell Labs - they aren't. Most of them are "simple" engineers well versed in current "technologies" but are uninterested or unable to break new frontiers (this is not meant as a negative statement). However, quite a few of them are capable and interested, but the money and the Silicon Valley game are simply too enticing.

I often feel angry at Google particularly, who've taken quite a few bright and inquisitive minds from more innovative companies, like Sun Microsystems (RIP), and turned them into application builders.

But the game has changed: money, and a lot of it, can be made from technology much faster now than before, and many minds that could have been working on true innovations are now designing social networks (which is an interesting research topic, but not THAT interesting, and there are certainly other less explored avenues that lead to truer innovation).


> "Most of them are "simple" engineers well versed in current "technologies" but are uninterested or unable to break new frontiers".

I'm glad you said this because I feel as if it needs to be reiterated much more often. I'm a CS student right now, and a lot of times I look around and see my university's CS program as a factory designed to turn out by-the-book software engineers fit for corporate consumption.

I really enjoy CS, but here's my anecdote: the other day I got a chance to tour a microfabrication lab, with tons of expensive equipment and a lot of knowledgeable people milling about (who I'm sure were nervous about us being there). That really instilled in me an appreciation of the complexities involved in real, true innovation. I love writing software, but I won't do it forever, because the vast majority of true innovation really does require a deep understanding of scientific foundations.

Also: the guy who showed us around worked for Bell Labs for awhile before coming to the university where I'm studying. His one pre-condition for accepting the university job: the $2 million, room-size, laser-equipped system that he built for detecting flaws in silicon wafers had to come with him.


Are you talking about science in general? There's enough science left to do on the macroscale, too.


Science isn't a finite spectrum, so of course there's much to be done in all fields. But this article specifically laments the fact that the term "innovation" is no longer strongly associated with substantive breakthroughs in relatively unknown areas.

Bell Labs pioneered many practical applications of microscience; nanoscience is the next level down that remains vastly unexplored and has very few practical applications associated with it (yet).


In defense of Google, I think their machine translation is highly underrated; it's simply incredible how good it is, at least for European languages. The same goes for their contributions to building large data centers and manipulating large amounts of data (MapReduce, and Bigtable was the first major NoSQL store).

Facebook is too young to tell.

Still, IBM is the gold standard for CS research - Microsoft appears to have many productive researchers, but I'm not sure they have made fundamental advances like virtual memory, hard disks, or relational databases (all of which came from IBM).


I thought virtual memory came from Burroughs.


I think it was the first commercial computer to support it, but much of the theoretical groundwork wasn't really figured out until '68 and '70:

http://dl.acm.org/citation.cfm?doid=363095.363141

http://dl.acm.org/citation.cfm?doid=356571.356573


That sounds like the old joke: "I can see that it works in practice, but will it work in theory?" If the Burroughs guys both invented it and commercialized it, then surely they should get the credit for it.

The story is in this wonderful memoir: http://news.ycombinator.com/item?id=2856567 (with the relevant passage quoted here: http://news.ycombinator.com/item?id=2928672).


I agree, I hadn't heard of the Burroughs guys until this post. The two papers I linked to were always the earliest ones I was aware of.

There's a surprising amount of stuff that gets built without understanding all of the theory behind it, though. So it doesn't surprise me that a "pre-theory" example of VM was built.

After all, we had the wheel for thousands of years before figuring out pi.


In fact it did. It didn't enter into "mainstream" consciousness until IBM "invented" it.


Having worked at Bell Labs, I gotta say the culture of innovation was paramount, best described by this statement from an old supervisor: "You should do what you feel is the right thing; it's my job to align that to the needs of the business." So you might spend weeks researching highly available protocols and data distribution techniques to get the one feature correct. It should be no surprise, then, that the old refrigerator-looking phone systems pioneered in the late 70s and 80s had a well-known five 9's of reliability.
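
For context, "five 9's" means 99.999% availability. A quick back-of-the-envelope sketch of the downtime budget that implies (assuming a plain 365-day year, so the figure is approximate):

    # Rough downtime budget implied by "five nines" (99.999%) availability.
    # Assumes a non-leap 365-day year; the result is approximate.
    minutes_per_year = 365 * 24 * 60            # 525,600 minutes
    availability = 0.99999                      # "five nines"
    downtime_minutes = minutes_per_year * (1 - availability)
    print(round(downtime_minutes, 2), "minutes of downtime per year")  # ~5.26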


I thought this op-ed was important because I agree with the author that companies like Google, Apple, and Facebook (etc.) get the bulk of the media's attention when none of what they do could have been done without the work that came out of Bell Labs. Real innovation is not done at Google or Facebook or Apple for that matter. Those companies are too tied to meeting quarterly results to invest in real innovation.

I think the other important implication is that the US doesn't have an entity like Bell Labs innovating today. And so we rest on the work that was done back then (cramming more transistors onto a smaller chip, etc.) vs. new innovation that would provide the platform for decades of future growth.


IBM has turned/is turning into the Bell Labs and PARC of yesteryear. They've all but vanished from the consumer's eye, yet they're still huge, doing and licensing all kinds of research and technology across the spectrum, from transistors to chips to software.


Do you have any real evidence to back that up? When Lou Gerstner came to IBM, he all but said their innovating days were over. Since the '90s IBM hasn't come up with any new products, instead focusing on acquisitions. Their patent flow is full of business-process patents.

And their technology products are clearly aimed at tying themselves into large government/business service contracts (see the "smarter cities" stuff).

Watson is a neat tech demo, but not much more. I've built parts of the technology in Watson independently.


IBM apparently still does a lot of interesting fundamental research. From their PR it looks like they've got a focus on materials science for storage and medicine, and they do interesting data-driven work around health, population genetics (the out-of-Africa hypothesis, as one example), etc.

http://www-03.ibm.com/press/us/en/pressreleases/finder.wss?t...

It might not be Xerox PARC stuff, and a lot of the PR looks like fluff, but it looks like there's real stuff going on there...


Oh come on.


To support your point: Last week I attended a talk about how they are turning the technology behind Watson from winning Jeopardy to advising your GP.


It's hard to spot innovation while it's happening. While the transistor revolutionized our world, I'm sure few people understood its implications in 1947. Similarly, investments that Google, Apple, Facebook, and IBM make now might seem either incremental (making an open OS for a phone) or gimmicky (driverless cars, HUDs), but that's how research coming out of Bell Labs might have been viewed 60 years ago without the benefit of hindsight.


Popular Science in 1948 had an article "This Capsule Challenges Vacuum Tube" on the germanium transistor calling it a device that may spark a revolution in electronics, and discussing various applications like much smaller radios, better TVs, and improved telephone transmission. They point out that the future success of the device depends on its cost, but mass-production might be possible. Computers are notably not mentioned. Link: http://books.google.com/books?id=YCcDAAAAMBAJ&lpg=PA117&...


Agreed. Let's not forget that Google has X labs, and while it's secretive, we at least know they will pave the way for augmented reality and self-driving cars. That's pretty innovative.


Well, we may have fewer entities like Bell Labs, but we still have the T.J. Watson Research Center.


I think of somewhere like DEKA as a small-scale modern Bell Labs.


At first, I really agreed with this piece. I think it's an awesome point that the overuse of the word "innovative" deflates its meaning, just like calling every programmer a "hacker" or saying "let's do this 'the hacker way!'" makes it less meaningful, or calling every member of the military a 'national hero' is maybe a little degrading to Purple Heart recipients.

But on a second reading, this article really uses a straw man. Comparing Facebook to a few cherry-picked innovations from Bell Labs is a little disingenuous for a few reasons.

The most obvious comes from the fact that there were a lot of people working at Bell Labs. See that first picture? Halls so long they ended at the vanishing point? There were a lot of ideas that never quite made a splash in the 70 years of Bell Labs' heyday.

The second part is that the innovations of Bell Labs may not have been lost, but transferred. There's still quite a lot of research done in this country, although most of it has slowly been shifting to universities. I'm not sure if this is better, but no one can say there isn't tons of cool stuff being done for the "understanding" and not the short-term profit. I think there should be more, but that's not the point. While we're cherry-picking, I might point out brain-machine interfaces that let monkeys (and soon humans) control prosthetic limbs directly from their brains, nano-scale machines a few molecules large, and quantum computers. These things are anything but short-term profit focused, and everything but falsely innovative.

Finally, the real straw man lies with the fact that Bell Labs has the benefit of hindsight. We now know the transistor was an incredibly useful invention. Before we knew what we could really do with computers, it wasn't quite so obvious. It might end up that Facebook doesn't turn out anything more innovative than extremely well-executed social networking, but that's going to take more than 8 years to find out.

Don't overuse "innovation", but don't become so narrow-mindedly awed by the past that you don't take part in the tumult of things that might (or might not) share the same sentences as the transistor when someone writes an article in 30 years wondering why we don't have the same pizzazz as those Silicon Valley entrepreneurs at Google.


> The most obvious comes from the fact that there were a lot of people working at Bell Labs.

There are a number of issues I'd raise with the piece, but the above is not one of them. There were a lot of people in a number of R&D labs, but measuring the output per person (examples: Nobels/papers/patents/citations per employee), I'd guess there was something special about the place.

Full disclosure - I'm a Bell alum, so perhaps I'm glorifying the past.


We are not necessarily going to find out easily how to recreate these occasional convergences of creativity and innovation. Bell was a huge monopoly that was very costly for the US, and maybe the innovation was worth that cost. Microsoft is maybe in a similar position, but not as productive. Google looks like the best bet right now for an innovation centre.

The other thing of course was that various antitrust settlements meant that Bell could not charge for many inventions, such as Unix. This made it more open than the modern paranoid corporation...

Must have been a great place to be.


My dad was a Bell alum, and I agree completely. It's rare to have this convergence of genius, creativity, and perhaps most importantly money - but when it happens, society is forever improved.


Could you tell us about project genesis at Bell Labs? How did projects originate, grow, and get killed or morphed?

Thank you.


> project genesis

It was a big company: I pretty much saw the entire scale, from formal defn to skunk project.


Thanks jpdoctor. Could you elaborate on what you mean by formal defn? Who defined them, out of which principles or goals? (I understand that this might have multiple answers)


Guess who. Ok, when it was still in NY.

"I went to Princeton to do graduate work, and in the spring I went once again to the Bell Labs in New York to apply for a summer job. I loved to tour the Bell Labs. Bill Shockley the guy who invented transistors, would show me around. I remember somebody’s room where they had marked a window: The George Washington Bridge was being built, and these guys in the lab were watching its progress. They had plotted the original curve when the main cable was first put up, and they could measure the small differences as the bridge was being suspended from it, as the curve turned into a parabola. It was just the kind of thing I would like to be able to think of doing. I admired those guys; I was always hoping I could work with them one day."


The fact that they had Shockley tour-guiding Richard Feynman shows that Bell Labs was more than just a big pile of patient capital.


Richard Feynman. An essay about his work on computing machines - http://longnow.org/essays/richard-feynman-connection-machine...


I have watched this more than a dozen times; I'm still awe-struck every time I watch it, and it barely scratches the surface of all that was created at such a magical place:

http://doc.cat-v.org/bell_labs/innovations_song/


So what happened to Bell Labs?

What were the reasons why it started to lose that magic?


I'm a Bell alum. First off: it really was an amazing place. The guy across the hall won the Nobel while I was there, but even going to lunch could result in learning about the state of the art in something or other. The data rate of average discussions was quite high; both parties usually had PhDs.

> What were the reasons why it started to lose that magic?

I could write a book on this. Much of the problem is that the metric-of-success changed and the organization was not prepared/couldn't make the transition. Measuring output in $$ might be good for biz, but is not good for certain types of R&D, nor is it good when it comes time for annual review. Eventually, many of the young capable people realized that they were going to make much more money by leaving Bell, and as the meltdown in 2001-3 showed, they weren't even going to sacrifice "the stability of a big company". (There was terrible turmoil around that time. The fact that developers were going to turn Holmdel into condos threw many alums for a loop. It was quite a loss.)


A very good book could be written on the topic; it is one of the saddest tragedies of the 20th century.

All my information is second hand, and it is a very complex topic, but basically when AT&T spun off Lucent and tried to more directly monetize the innovations they were making, the whole thing started to break down: management sucked, researchers started to bail out, and it became a vicious circle.


Ah. Thank you.

If anyone has any more information about the decline of Bell Labs, it would be deeply appreciated.



My guess is that what happened was that the benefits of the monopoly-like status waned.

We saw that here in DK when they privatized the state-controlled telco.

For a number of decades, I guess from the 30s up till around the eighties, those organisations were stable enough, with few enough "paradigm" shifts in technology, that it made sense to maintain an R&D division with a long outlook.

But after the internet revolution, it's my guess that they couldn't maintain the 10-year horizons they often had the luxury to work within. The competition is now constant and the "innovation cycles" have narrowed, which means that you will miss out on the revenue each paradigm creates.

I think this is ultimately why places like Bell Labs and Rank Xerox make no sense anymore.

The irony of all this is of course that in some sense it slows innovation down because each new trend is being sucked dry before the industry moves on.


From the article:

"Bell Labs (unlike today’s technology companies) had the luxury of serving a parent organization that had a large and dependable income ensured by its monopoly status."

Essentially, deregulation of the phone market was a major contributor to the decline.

http://en.wikipedia.org/wiki/Bell_System_divestiture

It's similar, in a way, to what a profitable company that is not a pure monopoly (like Google or Apple) can do.

Companies spend differently depending on competition or profitability; that's obvious. But what's not really obvious is how much money a company like Apple or Google can spend or waste AND still eke out that large profit. If you make money, you inevitably spend on things that you wouldn't if you were in a lower-margin or less certain industry.


My dad worked at Bell Labs in the 80s, and he's always telling me about the quality of the experience there.

I'm not sure if it was the politics, or the people, but he said there was an Apple or Google-like quality of innovation and inspiration. What's amazing though is that this innovation extended to basic research as well, and not just consumer products. He always tells me that when his team was using an early implementation of C++, a member simply called up Bjarne (creator of C++), and got an answer within seconds.

I agree with others about shifting research locations, but I think we have some fundamental problems with the attitudes of research. I've researched heavily outside of school for the past 4 years (thousands of hours, multiple publications) and the thing I've learned is that there's gotta be a better way to do this. Labs aren't the best place for industry and academia to talk to each other, because we'll always have conflicting motives.

Perhaps a move towards more technology incubation from university research - a Y-Combinator for universities, perhaps - is the future.


"Two of its researchers were awarded the first patent for a laser, and colleagues built a host of early prototypes."

I hear that most companies own the things that their employees create while working there. The above quote seems to imply that the patents were owned by the individual researchers. Did Bell Labs have a policy of researchers owning their inventions?


> Did Bell Labs have a policy of researchers owning their inventions?

No, Bell retained the ownership (and the revenue stream).



For insight into the computing side of Bell Labs, Doug McIlroy's retirement lecture is a great read: http://research.swtch.com/bell-labs.

A paper covering the earlier period is A History of Computing Research at Bell Laboratories (1937-1975): http://cm.bell-labs.com/cm/cs/cstr/99.pdf


A book on Bell Labs' innovations already exists: "Three Degrees Above Zero" by Jeremy Bernstein. Highly recommended. I have a copy I got used off of Amazon, but here it is for preview in Google Books.

http://books.google.com/books?id=N6s8AAAAIAAJ&dq=three+d...


I'm another Bell Labs alum. This article seemed banal and backward looking.

Think about what a "hothouse of innovation" is and was. Back then, you had to physically gather people together to get ideas flowing. Today, we can do it here, in chat rooms, on listservs, IRC channels, and blogs.


For all the genuine innovations that came out of Bell Labs, it's also the case that they killed inventions that would have threatened their telecommunications monopoly, such as the magnetic answering machine that was invented at Bell Labs in 1934!

http://gizmodo.com/5691604/how-ma-bell-shelved-the-future-fo...


The article never mentions the creation of Unix. How could any tech-loving journalist miss this point?!!


It's mentioned on the first page, second paragraph from the bottom: "Its computer scientists developed Unix and C, which form the basis for today’s most essential operating systems and computer languages."



