Hacker News

But what happens if we hit the final plateau: the same processor speed (plus minor improvements) for decades to come?



We start optimizing software, and then we start optimizing requirements, and then computing is finally finished, the same way spoons are finally finished.


Are spoons really finished? I am sure plenty of people are designing better/cheaper spoons today. I love looking at simple, everyday objects and seeing how they evolved over time, like soda cans and water bottles. Even what may be the oldest tool, the knife, is constantly evolving: better steels, or even ceramic, folding mechanisms for pocket knives, handle materials, and of course all the industrial processes that get us usable $1 stainless steel knives.

Computers are the most complex objects man has created; there is no way they are ever going to be finished.


You can also optimize society, because every time a human gets in the loop, trillions of cycles are wasted, and people / software / platforms are really far from efficient.

Actually, if companies and software were designed differently (with teaching too, basically an ideal context), you could improve a lot of things by 10x factors just from removing the resistance and pain at the operator level.


This is a really good point you make.

A simple example for me is how the ATM replaced the bank teller, and the ATM in turn has been replaced by cards with chips in them. It's a subtle but huge change when magnified across society.


Are chips an issue?

Having worked in various administrations, the time / energy / resources wasted on old paper-based workflows is flabbergasting.

You'd think that after 50 years of mainstream computing they'd have some kind of adequate infrastructure, but it's really, really sad (they still have a paper inbox for internal mail routing errors).


In the software realm, computing will never be finished the way spoons are, because software is like prose. It's a language in which we write down our needs, wants, and desires, and the instructions to obtain them, and those are always shifting.

I could definitely see standard classical computer hardware becoming a commodity though.

There will also be room for horizontal expansion for a LONG time. If costs drop through the floor then we could see desktops and servers with hundreds or thousands of 1nm cores.


The hardware will be finished. But the food we eat (the software) will keep changing.


You optimize software, which means more time/money to write software for a given level of functionality. More co-design of hardware and software, including more use of ASICs/FPGAs/etc. And since things no longer get faster/better as easily, upgrade cycles are longer, and potentially less money flows into the companies creating hardware and software as a result. Maybe people start upgrading their phones every 10 years, like they do their cars.

We probably have a way to go yet, but the CMOS process-shrink curve was a pretty magical technology advancement that we may not see again soon.


To some extent yes, but there's a lot of low-hanging fruit.

Java is only just now getting value types, even though flattened value types are fundamental to getting good performance on modern CPUs. Much software is HTML and JavaScript, which are so far from having value types that it's not even worth thinking about. Simply recoding UIs from JS to Java/Kotlin with value types would result in a big win without much productivity loss.
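The cost that value types are meant to remove can be seen today with boxed numbers: each element of an `Integer[]` is a separate heap object reached through a pointer, whereas an `int[]` is flat and contiguous. A minimal sketch (the class name `BoxingDemo` is my own, not from any library) of the two layouts:

```java
// Sketch: flat (primitive) vs. boxed storage. The pointer chase per boxed
// element is exactly the indirection that flattened value types eliminate.
public class BoxingDemo {
    static long sumPrimitive(int[] a) {
        long s = 0;
        for (int v : a) s += v;     // values are contiguous in memory
        return s;
    }

    static long sumBoxed(Integer[] a) {
        long s = 0;
        for (Integer v : a) s += v; // each element is a pointer chase + unbox
        return s;
    }

    public static void main(String[] args) {
        int n = 1_000_000;
        int[] prim = new int[n];
        Integer[] boxed = new Integer[n];
        for (int i = 0; i < n; i++) { prim[i] = i; boxed[i] = i; }
        System.out.println(sumPrimitive(prim) == sumBoxed(boxed)); // prints true
    }
}
```

Both loops compute the same sum, but the boxed version touches scattered heap objects; with flattened value types, user-defined aggregates get the dense `int[]`-style layout instead.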


Transistor size is not the only factor in processing speed; architecture is also important. We will still be able to create specialised chips, like deep learning accelerators and such.



