Hacker News

Whatever you saw a decade ago, it definitely wasn’t this.

I do recommend you play with it. But if you don’t feel like signing up with their free account, here’s a screen recording of me asking it some random general knowledge questions and instructing it to use a different language in the response each time: https://youtu.be/XX2rbcrXblk




> Whatever you saw a decade ago, it definitely wasn’t this.

To be fair, the post might be from Schmidhuber, in which case it’s true he saw it 20 years ago.

I saw it 10 years ago and it was called MegaHAL, though it wasn’t this good.


I had to look up MegaHAL; apparently it was based at least in part on a hidden Markov model.

Using "it" in this way to refer to both that and GPT-family LLMs, or similarly saying "I remember being shown this software almost a decade ago" like the other commenter, is like saying "I remember seeing Kitty Hawk fly, and there's no way that can get someone across the Atlantic" when being offered a free seat on the first flight of the Concorde. (Actual human-level intelligence in this analogy is a spaceship we still haven't built; that's not where I'm going with this.)


It’s not clear to me that MegaHAL-style language models are less powerful than transformer models. The difference is the “largeness”, but that’s a hardware/dataset/budget detail.
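For context on what that model class looks like: MegaHAL-style generators are built on n-gram Markov chains (MegaHAL itself reportedly combined forward and backward 4th-order chains). Here is a minimal sketch of the basic idea, assuming a simple order-2 forward chain; the corpus and function names are illustrative, not MegaHAL's actual code:

```python
import random
from collections import defaultdict

def train(tokens, order=2):
    """Map each n-token context to the tokens observed to follow it."""
    model = defaultdict(list)
    for i in range(len(tokens) - order):
        context = tuple(tokens[i:i + order])
        model[context].append(tokens[i + order])
    return model

def generate(model, seed, length=10, rng=None):
    """Walk the chain from a seed context, sampling a next token each step."""
    rng = rng or random.Random(0)
    out = list(seed)
    for _ in range(length):
        choices = model.get(tuple(out[-len(seed):]))
        if not choices:
            break  # dead end: context never seen in training
        out.append(rng.choice(choices))
    return out

corpus = "the cat sat on the mat and the cat ran off the mat".split()
model = train(corpus, order=2)
print(" ".join(generate(model, ("the", "cat"), length=6)))
```

Such a chain only ever conditions on a fixed, short window of preceding tokens, whereas a transformer attends over the whole context; that fixed-window conditioning, not just scale, is a real architectural difference.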



