
The 4080/4090 pages have the prices.

4080 12GB: $900

4080 16GB: $1200

4090: $1600




The specs make it seem like these wouldn't quite be worth it at launch.

Ampere specs: https://www.nvidia.com/en-us/geforce/graphics-cards/compare/

This looks like a smaller bump than Ampere. The 3090 seems cheaper and better than the 4080. As people have pointed out, what I'd really like is more / upgradeable RAM. Nvidia is trying not to compete with their higher-end cards, but ML tools are increasingly consumer.


I would like to see independent benchmarks for the 4090 before making judgements. The claimed "2-4x" over the 3090 Ti is not a small bump.


That's 2-4x in video games, when using DLSS with their new frame interpolation technique.

It kinda feels like cheating.


If it’s not visually distinguishable, it won’t matter. I hear recent versions of DLSS are very good at 4k despite earlier versions having some bad artifacts. I guess we’ll have to wait for the reviews :)


I guess it depends on your definition of distinguishable, but as someone who has dabbled with most ML frame interpolation techniques, this appears just as artifact-ridden. https://i.imgur.com/9U9HIpH.jpeg


Yeah, I'm going to call this a hard pass until I see third-party benchmarks, power draw numbers, and what AMD has with RDNA3.

This is starting to look like a repeat of GeForce FX vs. Radeon 9700 Pro all over again.





