
"Intention" is what the final fraction of a bit gap in predictive performance feels like from the inside of your head.

A model near, but not at, the optimum has learned all the low-order correlations well, but still lacks the long-range ones. (Think of a detective novel where the clues are hidden thousands of words apart, in very slight tweaks of wording, like an object being 'red' rather than 'blue'.) As models descend towards the optimal prediction, 'intention' suddenly snaps into place. You can feel the difference in music between something like a char-RNN and a GPT-2 model: the latter sounds like it's "going somewhere". (When I generate Irish music with char-RNN, it definitely feels 'intention-less'; but when I generate with GPT-2-1.5b, for some pieces it suddenly feels like there's an actual coherent musical piece, which builds, develops a melody and theme, and closes 'as if' it were deliberately composed. Similarly for comparing GPT-2 stories to GPT-3: GPT-2 stories or poems typically meander; GPT-3 ones often meander too, but sometimes they come to an ending that feels planned and intended.)

Once this final gap is closed, it will just feel real. If you look at Alias-Free GAN (StyleGAN3) faces, there's no 'lack of intention' to the faces. They just look real.
