We understand reasoning enough to know humans (and other animals with complex brains) reason in a way that LLMs cannot.
While our reasoning also works by pattern matching, it incorporates immeasurably more signals than language — language is almost peripheral to it, even in humans. More importantly, we experience things: everything we do acts as a small training round, not just in language but on every aspect of the task we are performing, and gives us a myriad of patterns to match later.
Until AI can match even a fragment of this, we are not going to have AGI. And as for the experience aspect, there is no economic incentive under capitalism to achieve it; if it happens, it will come out of an underfunded university.