Hacker News

"It talks to me like a person"

Because it has a sample size of our collective human knowledge and language big enough to trick our brains into believing that.

As a parallel thought, it reminds me of a trick Derren Brown did. He picked the winning horse correctly across six races. The person he was picking for was obviously stunned, as was the audience watching it.

The reality, of course, is that people couldn't comprehend the extreme and tedious lengths he had to go to to make this happen. They started with around 7,000 people, filmed every one of them as if they were going to be the "one", and then the probability pyramid simply dropped people out round by round. It was such a vast undertaking of time and effort that we're biased towards believing there must be something really happening here.
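The pyramid arithmetic is easy to sketch. The figures below are illustrative, not the show's actual numbers: assuming six-horse races with each starter assigned a different horse per race, each round keeps roughly one in six, so a few thousand starters collapse to a single "perfect" predictor after five or six races.

```python
def survivors_per_round(n_start, horses_per_race=6, races=5):
    # Each round, only the people whose assigned horse won advance.
    # With picks spread evenly, that's one in `horses_per_race`.
    counts = [n_start]
    for _ in range(races):
        counts.append(counts[-1] // horses_per_race)
    return counts

# 6**5 = 7,776 starters guarantee exactly one survivor after five races.
print(survivors_per_round(7776))  # → [7776, 1296, 216, 36, 6, 1]
```

From any one survivor's point of view the streak looks miraculous; seen from the top of the pyramid, it was inevitable for someone.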

LLMs are currently a natural-language interface to a Microsoft Encarta-like system so unbelievably detailed and all-encompassing that we risk accepting that there's something more going on there. There isn't.






> Because it has a sample size of our collective human knowledge and language big enough to trick our brains into believing that.

Yes, it's artificial intelligence. It's not the real thing, it's artificial.


Again, it's not intelligence. It's a mirror that condenses our own intelligence and reflects it back to us, using probabilities at a scale that tricks us into believing there is something more than a big index and a clever search interface.

There is no meaningful interpretation of the word "intelligence", psychological or philosophical, that applies to what is going on. "Machine learning" is far more apt and far less misleading.

I watched the transition from "ML" to "AI" happen in academic papers and then pitch decks in real time. It was done to refill the well when investors were losing faith that ML could deliver on its promises. It was not driven by progress.


> our own intelligence

This doesn't make any more sense than calling LLMs "intelligence". There is no "our intelligence" beyond a concept or idea that you or someone else may have about the collective, which is an abstraction.

What we do each have is our own intelligence, and that intelligence is, and likely always will be, ineffable, no matter how science progresses. So my point is that you can't say your made-up, ill-defined concept is any more real than any other made-up, ill-defined concept.





