It works fine for me, too. I don't understand why anecdotes of problems are seen as interesting data points (for anyone other than the developers attempting to improve the software), especially in a community of technical people who presumably have a decent grasp of probability. This could be explained just by buggy hardware, or hard drive corruption, or any number of things. It's also not unique to Ubuntu.
See also:
1. My iPhone 4S crashed randomly when I got it.
2. Google "windows 7 freezes randomly" -> 74,000 hits
Not only that, accelerated 3D video hardware is as tricky as it gets. Hats off to the open source developers who attempt it, often having to reverse engineer some of the most complex hardware known to man.
Am I missing some context here? Why is a nouveau driver crash front page news?
Hmm, lots of anecdotes and complaining here. I've experienced no issues of the sort. Running perfectly smoothly here on 12.04, 64-bit. And I'm on a 4-year-old Pentium Dual Core machine, of all things.
I did not read through all of the posts, but it appears to be related to nVidia hardware. I'm guessing that you are not using an nVidia card.
On a side note, I gave up nVidia hardware altogether (about 4 years ago) because of endless issues with Linux (and Macs). This thread isn't encouraging.
Same here, but with an even older computer: AMD 64 4000+ with compiz turned off and running LXDE on the desktop. No issues so far and I haven't had any crashes at all.
I don't understand why they don't make the release cycle longer. Perhaps nine months or a year, or even just one additional month for QA. I first started using Ubuntu, and Linux, when 9.04 came out and it was great. Since then the quality has steadily declined. 9.10 was good too, but ever since 10.04 every release has sucked. They can't even do LTS releases right.
Sorry for ranting, but the Ubuntu team seems to fuck up time after time. Perhaps they should try to focus on making their system stable, instead of adding more crap.
Ubuntu is just a distribution. They don't directly hack on everything they ship, which is why they try to keep up with upstream as it releases new versions. They actually do this less aggressively than Fedora, which may or may not be a good thing. Maybe if they get really big they can afford to slow down.
To add my little Linux rant (as a stupid user):
I installed Ubuntu 12.04 and I had almost everything I needed after a 15 minute installation. I prepared mentally for a battle with Skype. It actually worked really well.
I toyed with gnome-language-selector and somehow my language got switched to Chinese. And then the two-hour battle began. No matter what I did, my GNOME and Unity sessions stayed in Chinese. The Ubuntu devs decided that I don't need to select the language at login, so I couldn't do it there, and fiddling with the various locales didn't convince anything except the shell that I don't want Chinese. Finally I opened gnome-language-selector again, the tool in which I previously couldn't select the language. It turns out I had to drag and drop the languages, not click them: I had to pick up English and drop it above Chinese for it to take effect.
I should have just reinstalled Ubuntu - it would have cost me less time. And I still have no idea why I got Chinese in the first place.
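For anyone else who gets stuck like this: I later learned (I think - treat this as a sketch rather than gospel) that the same fix can be done from a terminal with the standard Ubuntu locale tools:

    # Set the system-wide default language (written to /etc/default/locale)
    sudo update-locale LANG=en_US.UTF-8 LANGUAGE=en_US:en

    # Or override it just for your own account
    printf 'LANG=en_US.UTF-8\nLANGUAGE=en_US:en\n' >> ~/.pam_environment

    # Log out and back in for the new language to apply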
LTS releases are supposed to do this. The six-month releases can be considered, for all intents and purposes related to stability, "testing" releases. The LTS releases are supposed to be stable enough for large-scale rollouts, with three years of desktop and five years of server support. In fact, Canonical upped the ante with 12.04--five years for both the desktop and server versions. Of course, plenty of things are supposed to be.
If that were the case, then one would not expect any serious changes in the final LTS release relative to the one that precedes it, since the changes tested during the six-month period would be locked in as "this definitely worked". As that is obviously not the case, either the release process is flawed or your premise is incorrect. In fact, the cycle seems to be more "don't make us support every single release forever" than "let's take a year and a half to test these changes before committing to them" (which, to be clear, I'm fine with, although in practice it has meant that I always upgrade away from LTS anyway, as it is often less stable, in a "will it lock up on me" sense, than the subsequent releases).
First of all, an LTS is no real LTS prior to the .1 release. All LTS releases had serious issues at first; as far as I remember, even Dapper Drake did. Which is not even really harmful for an LTS, since corporations had no need to upgrade as soon as possible from the last LTS.
Furthermore, the same story about declining release quality was told about 7.04 (PulseAudio was the issue, if I remember correctly), about Edgy before that, and about the one after Feisty... Hoary was nice, but it wasn't that good.
12.04 got - apart from this bug now - really good reviews and felt good on my laptop, so condemning Canonical over one bug seems inappropriate.
> Which is not even really harmful for an LTS, since corporations had no need to upgrade as soon as possible from the last LTS.
Not only that, but Ubuntu doesn't give you the option to upgrade until .1 is available. Trying to upgrade my 10.04 boxes reports that there's no new distribution yet.
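If you really want to jump early anyway, the prompt policy lives in /etc/update-manager/release-upgrades, and as far as I know do-release-upgrade can be told to skip the wait (entirely at your own risk):

    # "lts" here means: only offer a new LTS, and only once its .1 release exists
    grep Prompt /etc/update-manager/release-upgrades

    # Force the upgrade before the .1 point release is out
    sudo do-release-upgrade -d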
Actually, I think they'd do a lot better by making the release cycle shorter. When Firefox changed their release cycle to be much much shorter, they got a significantly improved, much more stable product. The main reason was that instead of shoving everything that they wanted to release into it, features got released when they were mature enough to go in. I think this is something Ubuntu (and almost every other OSS project) could really learn from.
The real problem here is that the LTS cycle is synched with the six-month release cycle; so the LTS is simultaneously a "new" six-month release. It might make sense to say that the Even.04 releases are the "LTS beta" or maybe even the "LTS RC", and then make the real LTS release be Even.07 or so.
10.04 systems hang after 200+ days of uptime due to a kernel bug. Even if you install the latest kernel, which addresses the hang, processes still stop reporting any CPU usage after 200+ days.
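I can't be sure every affected box is hitting the same bug, but it's easy to check how close a given machine is and what kernel it's running before it gets there:

    # How long has the machine been up, and which kernel is it on?
    uptime
    uname -r

    # Pull in the current 10.04 kernel and reboot well before the 200-day mark
    sudo apt-get update && sudo apt-get install linux-image-generic
    sudo reboot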
11.04 used to frequently crash for me too on my Thinkpad T420. I started building my own 3.0 kernels and that got rid of the crashing problem. I've had no such problems since I upgraded to 12.04 over a month ago. Works perfectly here.
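In case anyone wants to try the same workaround, this is roughly the recipe (a sketch from memory - adjust the tarball version and package names to your setup):

    # Build dependencies
    sudo apt-get install build-essential libncurses5-dev fakeroot

    # Unpack a 3.0.x tarball from kernel.org and reuse the running kernel's config
    tar xf linux-3.0.<x>.tar.bz2 && cd linux-3.0.<x>
    cp /boot/config-$(uname -r) .config
    make oldconfig

    # Build Debian packages and install them
    make -j4 deb-pkg
    sudo dpkg -i ../linux-image-3.0*.deb ../linux-headers-3.0*.deb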
How is one supposed to go about debugging these freezes? None of the comments on the bug appear to offer any insight into the actual cause, or offer much help in narrowing down the possibilities.
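The best I've come up with so far is the usual suspects - happy to hear better ideas (hostnames below are placeholders):

    # Keep another machine ssh'd in with the kernel log streaming,
    # so anything printed right before the hang gets captured
    ssh me@the-freezing-box 'tail -f /var/log/kern.log'

    # After a forced reboot, check what X and the kernel logged last time
    less /var/log/Xorg.0.log.old
    less /var/log/kern.log

    # Enable magic SysRq, so during a freeze you can at least test whether
    # the kernel still responds (Alt+SysRq+s/u/b to sync, remount, reboot)
    echo 1 | sudo tee /proc/sys/kernel/sysrq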
I wanted to try out 12.04 (64-bit) as well, so I downloaded the Windows installer for Ubuntu (Wubi, or whatever it's called).
Everything seemed to work well, but every 15 minutes or so the computer freezes completely for a few minutes... and even longer when trying to install something new.
This is anecdotal as well, but +1 for the freezing: Ubuntu 12.04 is unusable on my machine. Even when it's idle, it lasts 12 hours at most before a hard hang, and the network is dead. I haven't tried to Update All The Things yet though, hoping that fixes it.
As EternalFury says, hardware support is tricky. Ultimately that is why I went back to Windows: I couldn't deal with total system crashes in the middle of coding or demoing, and maintaining a Windows VM was just a band-aid.
Weird. I've been using 12.04 since it was in beta and it's been stable for me. The only problem I've had lately has been with flash videos chewing up too much CPU and causing things to get a little hot.
12.04 has been the single worst release ever for me (I've used Ubuntu since 2006). The issues include the Nvidia binary driver crashing, the open source Nouveau driver losing the plot and drawing random content from the wrong windows, a two-minute wait booting a laptop while it waits for the network, Compiz window moving making life hard (but disabling it is even worse), the sound applet not showing up, my webcam microphone being repeatedly disabled for no reason, a massive increase in power consumption, and the list goes on.
The binary drivers or the nouveau ones? I've always used the binary ones, and Nvidia's driver quality has been why I've only bought Nvidia cards for many years. It has only been this release that caused me problems, and even then I finally solved them by downgrading the Nvidia driver version.
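For anyone needing to do the same downgrade, this is the rough procedure (the package name is the 12.04 one as I remember it; the exact version string is a placeholder for you to pick):

    # See which driver versions the archive still offers
    apt-cache policy nvidia-current

    # Install a specific older version and hold it so the next update doesn't undo the fix
    sudo apt-get install nvidia-current=<older-version>
    echo nvidia-current hold | sudo dpkg --set-selections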
I tried gtkperf to see what figures I get. I did note that the results depend on the size of the window quite a bit.
On my workstation with an 8800GT I got 3.46 seconds at default window size and 4.08 at 1920x1200 (using mutter window manager). On my 5 year old laptop with Intel 965 graphics I got 7.30 seconds (using metacity).
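For anyone who wants to compare numbers, this is more or less how I ran it (assuming I remember gtkperf's flags correctly):

    sudo apt-get install gtkperf

    # Run the whole suite unattended, 100 iterations per test
    gtkperf -a -c 100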
I switched to Chrome years ago and its tab switching is fine.
12.04's release quality is the poorest since I was introduced to Ubuntu at 8.04. First zenity problems, then apd and synaptic crashing several times a day, then freezes that force you to hard-reset the machine. It is overwhelming. I've switched to XFCE (Xubuntu) for now, but otherwise I'm mentally preparing myself for Fedora. So far no freezes with XFCE.
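If you want to try the same thing without reinstalling, the Xfce session can be added alongside Unity (this is the package name as I recall it):

    sudo apt-get install xubuntu-desktop
    # then choose the Xubuntu session from the gear icon on the login screen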
Because nothing could make the signal to noise ratio on that bug report worse than posting it to a news aggregator. Thank you, SandB0x. I'm sure everyone on Ubuntu's XOrg team appreciates it.
You make it sound as if it was posted to lipstick.com. HN is mostly made up of very technical people who are well capable of debugging and eliminating issues.
Reading the bug reports, I am pretty sure the signal-to-noise ratio will actually be better in the comments from HN folks.
32-bit: I had this for a while during testing but not with recent kernel versions (recycled HP workstation, Nvidia GT 520 graphics), and I'm seeing something similar on Debian Wheezy with the same kernel versions. I think we sometimes forget the OS is GNU/Linux and the rest is packaging and customisation (nice though that is).
On 11.10 I had proper black-screen kernel panics on the EeePC netbook, again only during testing.
I might run a live session on 64bit and see what happens.
Maybe this is me not paying attention much lately...
How can a bug like this end up being specific to a distro? Aren't the distros effectively just different package managers, different init conventions, and maybe some system management sauce?
The overwhelming majority of the operating system is just: kernel + modules, GNU, X, and a window manager, isn't it? So how can a bug like this be specific to Ubuntu?
Every distro uses different kernel configs/versions as well as different versions of libc and other such core packages. If the crash is specific to a certain kernel config option that is only enabled in one distro, then the crash would be distro-specific.
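This is easy to verify, since most distros ship the kernel config under /boot; something along these lines shows exactly which options differ (the second file is a placeholder for a config copied from another distro):

    # The running kernel's config
    ls /boot/config-$(uname -r)

    # Diff against a config copied over from another distro or kernel version
    diff /boot/config-$(uname -r) /tmp/config-from-other-distro | less

    # Or just check one suspect option on both machines
    grep CONFIG_PREEMPT /boot/config-$(uname -r)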
I'm running 12.04/amd64 perfectly on i3 and i5 machines.
Considering the bewildering variety of hardware out there and all the testing manufacturers do during the beta phase (no, just kidding - they do absolutely nothing to help), not that many people got hit by the bugs mentioned. A crash like this depends on a carefully aligned stack of bugs that's easily removed by changing one component.
Obviously, Ubuntu is not for some people. For all others, it works.
I made the mistake of importing Compiz settings that I was using in 10.04 and 11.04 in an attempt to make Ubuntu more usable. It made it completely unusable.
KDE with Nvidia's binary driver works for me. Everything with GNOME 2, GNOME 3, and Unity has been flickering whenever I move a window, and window decorations like to disappear.
Early adoption of LTS releases (or any release, really) has never been a good idea, and it's becoming an increasingly bad one.
I've also experienced locking problems in 12.04 that were very easy to recreate with a proprietary OpenGL-based program. We reverted back to 11.10 for now.
Different levels of regression. I have several Ubuntu-using friends - completely normal users - who constantly need help because an Ubuntu upgrade broke something.
I did a 10.04 to 12.04 upgrade on a server the other day. It also broke. A completely standard setup with no software installed except Apache, and no modifications whatsoever. Upon booting into 12.04, the screen just flickered. I looked at the log (over ssh) and it had to do with vesafb or something like that. I didn't bother investigating and just did a plain 12.04 install from the ISO.
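For the record, had I bothered to dig in, the usual workarounds I know of for framebuffer trouble like that are to keep mode setting and vesafb out of the way at boot - something like:

    # Add "nomodeset" to GRUB_CMDLINE_LINUX_DEFAULT, e.g.
    #   GRUB_CMDLINE_LINUX_DEFAULT="quiet nomodeset"
    sudoedit /etc/default/grub
    sudo update-grub

    # Or blacklist vesafb outright
    echo 'blacklist vesafb' | sudo tee -a /etc/modprobe.d/blacklist-framebuffer.conf
    sudo update-initramfs -u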
Seriously, that really does sum it up. Linux has made a lot of inroads, but it's nowhere near the stability of operating systems that spend their time "doing work" as their first priority.
Are you grouping operating systems into "linux" and "non-linux" categories, and asserting that the "linux" operating systems are nowhere near as stable?
If you compare with Solaris, AS/400, OS/390, AIX, FreeBSD, HP-UX, and a few others, Linux loses big time in the uptime department.
That doesn't make Linux less useful, it just highlights the maintenance required. People (MSFT mainly) use Windows with a high degree of success; they just reboot it a lot. I have a Linux system that needs to be rebooted every 3-4 days to maintain usability, and I can't do anything about it because the system itself is essentially closed.
I have a lot of customers who use Linux, and the ones who don't have problems mostly run CentOS. The Ubuntu users always seem to have problems, although I suspect that's usually due to it being developers running the systems rather than ops people. The ones who run every other Linux seem pointy-headed enough not to screw themselves over.
ProTip: If you break your box regularly, you are not a good sysadmin.