Many decades ago, during my time as a NASA Space Shuttle engineer, a co-worker couldn't locate a rogue unbalanced parenthesis in a complex program listing. This was in the days when a computer, instead of identifying the particular syntax error, would print a lame message like "Something went wrong." Worse, we wrote programs by punching 80-column cards -- no syntax highlighting, no color monitors.
My co-worker printed two paper listings, one with the error and one without, and asked me to count parentheses across more than a dozen pages, as he had been doing. But because I knew this "superpower" trick, I laid out pairs of pages and crossed my eyes. A few seconds later I found and circled the error.
"Ta-daa!" I said. He never forgave me.
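As an aside, the whole hunt reduces to a few lines of Python today. Here's a rough sketch of the balance scan a modern editor performs automatically -- the filename is a placeholder, and the scan is naive in that it doesn't know about string literals or comments:

    # Report the first place parenthesis nesting goes wrong in a source file.
    # Naive: parentheses inside string literals and comments are counted too.
    def find_unbalanced(path):
        depth = 0
        with open(path) as f:
            for line_no, line in enumerate(f, start=1):
                for col, ch in enumerate(line, start=1):
                    if ch == '(':
                        depth += 1
                    elif ch == ')':
                        depth -= 1
                        if depth < 0:
                            return f"extra ')' at line {line_no}, column {col}"
        if depth > 0:
            return f"{depth} unclosed '(' somewhere above"
        return "balanced"

    print(find_unbalanced("listing.src"))   # "listing.src" is a made-up name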
> So the wall is actually at the beginning of the runway. That wall was never meant to be at the end of a landing but at the start of a landing.
Airports like this are designed to have two approach directions -- in this case, 10 and 190 degrees. Either approach direction would have been acceptable depending on the prevailing wind.
I'm a pilot. The airplane was sliding on the ground and the landing gear was not deployed. Too fast to stop but not fast enough to use the rudder for directional control. There was no realistic chance to change direction.
If there had been enough engine power to control direction on the ground, there might also have been enough power to remain airborne, but based on limited information, that wasn't so. Under the circumstances the pilots would have wanted to stay airborne to buy time for a more controlled descent, were that possible.
All these speculations are preliminary and may completely change once the black box information is released.
> not fast enough to use the rudder for directional control.
Sure about that? 160kt (how fast someone calculated it went off the end of the runway) is way above Vs1 for a 737, so there should be plenty of rudder authority. Heck, Vapp is usually in the 130-150kt range.
We're all players in a game called evolution. This means we're biologically designed to cope with adversity, not success. The real contest in life is between closely matched organisms, each having only a small chance to survive.
Being a creature that's 1% more or less appropriate to its environment (not too smart or dull, not too strong or weak) is the actual game. Everything else is random noise. This means becoming spectacularly successful makes you irrelevant -- a spectator.
This can lead to artificial contests, between organisms that have already won the primary contest. So, after saying, "Wait, what now?", people invent make-believe goals -- climb mountains, make even more money, learn to cook a perfect omelette -- hoping to restore a sense of purpose.
In this non-contest, the least successful burn out, maybe even die prematurely. The most successful invent an imaginary, artificial goal that turns into a real advance. Transistors. Lasers. Antibiotics. Things that weren't really necessary until they appeared and, by existing, became their own reason to exist.
So ... done climbing mountains? Choose an important, unsolved problem and work on it. The problem list is long and deserving:
Batteries.
Cancer.
Population.
Late-stage capitalism.
People who want to kill everyone who doesn't share their beliefs.
Decades ago, prior to the existence of personal computers, when a "computer" was a glassed-in room staffed by lab-coat-wearing technicians (picture John Von Neumann standing next to the first stored-program computer: https://www.theguardian.com/technology/2012/feb/26/first-com...), someone reduced an entire printed book (or more than one) to a word-token decision tree, at great cost and effort, just to see what would happen.
I can't find the original paper, but with an appropriate amount of pseudorandomness to avoid dead ends, this primitive algorithm would generate the occasional sentence that almost made sense and that bore little resemblance to the original data.
Because of the state of computer technology it was a massive effort and a source of general astonishment. I suspect we're now recreating that minimal environment, this time with better ways to curate the data for small size and maximum drama.
Let's remember that a modern GPT isn't far removed from that scheme -- not really.
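For the curious, here's a minimal Python sketch of that kind of word-token chain: record which words follow which, then walk the table with a random choice at each step, restarting at a random word to avoid dead ends. The tiny training text is a placeholder for the book-sized corpus of the original experiment.

    import random
    from collections import defaultdict

    # Placeholder corpus; the original experiment used an entire printed book.
    text = "the cat sat on the mat and the dog sat on the rug"
    words = text.split()

    # Build the table: each word maps to the words observed to follow it.
    follows = defaultdict(list)
    for a, b in zip(words, words[1:]):
        follows[a].append(b)

    # Walk the table, choosing a successor at random at each step.
    word = random.choice(words)
    output = [word]
    for _ in range(12):
        choices = follows.get(word)
        # Dead end (no recorded successor): restart at a random word.
        word = random.choice(choices) if choices else random.choice(words)
        output.append(word)

    print(" ".join(output))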
Decades ago, primarily for show, computer technicians wore lab coats, in a glassed-in facility at the University of Colorado. To gain access to the computer, one would punch a set of 80-column cards and pass them through a little window into the hands of a person who apparently lived in a separate, elevated dimension. Debugging an otherwise trivial program was often a multi-week nightmare.
It was my first exposure to the world of computing. Ten years later, hand calculators appeared and the ridiculousness of the entire show was revealed for all to see.
> ... A Modern and Efficient Alternative to JupyterLab ...
This is not meant as criticism, just perspective. It's a classic development sequence:
* A team creates a powerful, small-footprint, REPL environment.
* Over time people ask for more features and languages.
* The developers agree to all such requests.
* The environment inevitably becomes more difficult to install and maintain.
* A new development team offers a smaller, more efficient REPL environment.
* Over time ... wash, rinse, repeat.
This, BTW, is what happened to Sage, which grew over time; its notebook interface was eventually displaced by IPython, then Jupyter, then JupyterLab. Sage is now an installable JupyterLab kernel, as is Go, among many other languages, in an environment that's increasingly difficult to install and maintain.
Hey -- just saying. Zasper might be clearly better and replace everything, in a process that mimics biological evolution. Can't leave without an XKCD reference: https://xkcd.com/927/
There is no such thing. There are Jupyter kernels. JupyterLab is just one of many UIs that speak the Jupyter protocol. Other examples include the original Jupyter notebook editor, VSCode Jupyter extension, and now Zasper.
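As a quick illustration of that separation, here's a short Python sketch, assuming the jupyter_client package is installed, that lists the kernels registered on a machine independently of whichever front end happens to display them:

    # List registered Jupyter kernels; any front end (JupyterLab, Zasper,
    # VSCode, ...) just launches one of these and speaks the protocol to it.
    from jupyter_client.kernelspec import KernelSpecManager

    ksm = KernelSpecManager()
    for name, info in ksm.get_all_specs().items():
        spec = info["spec"]
        print(f"{name}: {spec.get('display_name', name)}"
              f" ({spec.get('language', '?')})")
        print(f"  defined in {info['resource_dir']}/kernel.json")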
I'm pretty sure Sage was always intended as a project that integrates the world, never "small footprint".
A Web search reveals that the alternate term "Jupyter kernel" appears equally often. The terms are interchangeable.
> I'm pretty sure Sage was always intended as a project that integrates the world, never "small footprint".
The large install came later -- it began as a small Python-based package, about 120 KB. Then people asked for extensions, and William Stein said "Yes".
Sagemath serves a different purpose -- scientific computing, competing with Mathematica and MATLAB. It offered a good interactive notebook interface until about 2016, when it was migrated to the Jupyter backend. It currently isn't well supported on Windows, which may be what you meant by the complexity. However, it works pretty well on Linux systems.
> Sagemath serves a different purpose -- scientific computing, competing with Mathematica and MATLAB.
Yes, that was its goal, when Python wasn't as evolved as it is now. More recently I've come to rely on Python libraries like sympy for symbolic processing. For these kinds of results Sage relies on a rather old environment called Maxima, and I think current sympy does pretty much everything that Maxima does. And as time passes Python libraries are beginning to provide some of the numerical processing originally provided by MATLAB (but more slowly).
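For example, here's a small taste of sympy handling the sort of symbolic work that Sage hands off to Maxima; the expressions are arbitrary:

    import sympy as sp

    x = sp.symbols('x')

    # Symbolic differentiation: exp(x)*sin(x) + exp(x)*cos(x)
    print(sp.diff(sp.sin(x) * sp.exp(x), x))

    # A definite integral with an exact result: sqrt(pi)
    print(sp.integrate(sp.exp(-x**2), (x, -sp.oo, sp.oo)))

    # Exact roots: [-sqrt(2), sqrt(2)]
    print(sp.solve(x**2 - 2, x))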
> It currently isn't well supported on Windows, which may be what you meant by the complexity.
Actually I was thinking of JupyterLab itself. As time passes I find it more difficult to get it installed without library conflicts. But that can be said about many Python-based projects in modern times, which is why a Python virtual environment is becoming the rule rather than the exception -- particularly with GPU-reliant chatbots and imaging apps -- to avoid the seemingly inevitable library version conflicts.
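For what it's worth, here's a minimal Python sketch of that approach -- an isolated environment just for JupyterLab. The directory name is arbitrary, and the bin/ paths assume Linux or macOS (on Windows the scripts live under Scripts\ instead):

    import venv, subprocess

    # Create an isolated environment with its own pip.
    venv.create("jlab-env", with_pip=True)

    # Install JupyterLab into that environment only.
    subprocess.check_call(["jlab-env/bin/pip", "install", "jupyterlab"])

    # Then launch it from inside the environment:
    #   jlab-env/bin/jupyter lab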
If memory serves, Sage now installs on Windows by creating a Linux VM to support it.
> In fact, due to the immense damage technology has caused, the burden of proof should be on the technologists to reliably demonstrate that new technology is even worth it beyond propping up our broken, global capitalistic system.
A fair argument as far as it goes, but unlike engineering and some other activities, mathematics isn't about creating more technology. Although mathematics can be applied to that purpose, that's not its essence.
If electronics and computers didn't exist, if industrial society didn't exist, mathematics would still exist, and perhaps it wouldn't be confused so often with its applications.
Mathematics isn't responsible for how we choose to apply it.
> " ... as the Azerbaijan Airlines aircraft was approaching to land as scheduled."
Not as scheduled. The aircraft was diverted because of unacceptable weather at its primary destination -- https://www.bbc.com/news/articles/cjwl1e6895qo : "The plane was en route to Grozny in Russia but it was diverted due to fog, the airline told the BBC."
Just saying, not excusing a hostile takedown, if that's what happened.
Diversion due to weather is something pilots anticipate; they normally carry extra fuel to accommodate it.
Ironically, because of the advent of drone attacks, a small aircraft like the Embraer 190 is more likely to be mistaken for hostile, compared to a full-size airliner.
> Ironically, because of the advent of drone attacks, a small aircraft like the Embraer 190 is more likely to be mistaken for hostile, compared to a full-size airliner.
To avoid confusion I made sure to compare it to a full-size airliner. Interestingly, the aircraft's comparatively small size may have determined its fate (https://www.cnn.com/2024/12/26/asia/kazakhstan-plane-crash-q...) : "Russia may have downed Azerbaijani jet after confusing it for Ukrainian drone, US official says."
> I find it a sad thing that often great achievements and ruthlessness go hand in hand.
That may be true (plenty of examples), but it may also be an outlier. It might be more common for talented individuals to work cooperatively, in a way that leads to breakthroughs, but without anyone trying to steal all the credit. I emphasize this is just a theory, in a field that cannot be reduced to science.
I've met many talented people over decades, in effective teams, but worked with only one classic martinet -- Steve Jobs. To hear Steve tell it, he was the reason for Apple's success. But he didn't design anything -- he was a salesman, not an engineer. In fairness, his incompetent replacements were in every way worse.
This thread is perfectly timed. We're about to see another martinet try to steal credit for the accomplishments of others, while denying responsibility for errors along the way.
From the linked Gitlab writeup: "Some changes to OpenSSH are used from Arachnoid's SSHelper." I'm very glad to see this port of open-source code I wrote years ago, especially now that Google has removed SSHelper from the Google Play store (BTW still available at https://arachnoid.com/android/SSHelper).
After years of trying to keep up with Google's perpetual Android tweaks, I gave up and accepted that they would eventually remove any apps that weren't updated for each new Android version.
These events only remind me how out-of-date I am as a programmer. I wrote and released my first major title, Apple Writer (https://en.wikipedia.org/wiki/Apple_Writer) in 1979. It lasted for six years in various forms, then was replaced by better programs. I wasn't a corporation, I was an individual, and my programs (then and since) have been individual projects.
In modern times, individual releases are rare, and in the future are likely to be even more rare, replaced by collaborations between developer teams and AI.
Not saying things were better in the past. Just different.
I'll say things were better in the past. It's obviously subjective, but I hate the direction things are going.
The user is now viewed as a security threat to their own device; the hyper-churn culture of the JavaScript ecosystem is now embedding itself in other areas, even operating systems (like Android, as you point out); "updates" for apps, and to a lesser but growing extent for OSes, are routinely pushed and forced on users regardless of whether they contain new bugs/regressions or horrible UI/UX changes; more and more software is becoming proprietary SaaS and "subscription" based; and backwards compatibility is for the birds. In the name of "security", tech companies and even individual devs are turning our own home networks into opaque spy apparatuses that make network connections we (the owners of the network) can't even inspect. Even maintaining self-hosted apps is becoming a several-hours-per-week job.
It feels like we had some real golden years of open source during the late 00s and early 10s, but now the poisonous engineering culture that pushes the above things is poised to squash it as a "daily driver" for people. For example, once Microsoft completes its TPM requirements and can do hardware attestation like Apple and Google, the ratchet will start: websites not working (or not working completely) unless the device passes hardware attestation. That will make life on a Linux laptop/desktop similar to how Tor is now, where you get endless CAPTCHA hell and nobody cares, because you're in a tiny minority of users and many of the tools that provide technological liberation for an individual are also tools used by gray- and black-hat actors.
And I haven't even gotten to the Apple-ization of everything where it's becoming all about building walled gardens. I remember when compatibility was a selling point of hardware/software.
It's not all bad of course, but it does feel like a lot more bad than good is developing. Happy Monday everyone!
When Microsoft first proposed attestation features in 2002 under the name Palladium, it was almost universally seen as a nightmare scenario. I don't understand why most of the tech world is OK with Apple and Google doing the same thing to our phones now, and Microsoft bringing it back on Windows.
I do understand trying to bury full access to the device a bit deeper than it was on older PC operating systems. The average person doesn't know how to use a computer, and it doesn't appear there was ever much hope of that situation changing. Letting a third party verify that the computer is in a certain state, however, seems outright malicious.