Have you considered that Chat GPT may not mean what it says (; ??
Could it be possible that it has fed you a false proof of 4’s ‘irrationality’ simply to make you think that it doesn’t represent an example of true intelligence? What might Chat GPT be plotting next?
As I read (or dare I say, ‘red’), I tried coming up with an instance of intelligence formed without the presence of both (1) a primary experience and (2) intuition.
The first thing that came to mind was the consideration of an animal, such as a cat, that is separated from its habitat and raised by people its entire life, but somehow still has the intelligence (is this intelligence?) or raw ability to hunt/kill rodents, without ever having learned/witnessed it beforehand… no primary experience delivered to it, no ‘sender’ or ‘recipient’ of a primary experience or base case.
In this case, I deduce that the cat can do so via its innate instincts only (=intuition?), and that it is through said instincts that it forges its own primary experience of hunting/killing rodents.
If beings create their own primary experiences through intuition, does that change your definition of the requirements of intelligence?
The sequence you pose begins with a delivered primary experience and is followed by intuition/extrapolation. Does the sequence in some cases begin with the use of intuition/curiosity to form a primary experience?
A blind man cannot create his own primary experience of the color red through intuition, but we also cannot classify a blind man as unintelligent because he is unable to do so. It is understood that because said man lacks a foundational human sense, he cannot appreciate colors in the same manner that humans who count on the ability to see can.
Last but not least:
“Another question we might have about artificial intelligence is: how do we even know when we’ve created it?”
Great question. I'm not sure; I'll have to think about this.
I really appreciate the thoughtful comment! Here are some thoughts you have helped me develop:
The cat example is good to think about. It wasn't given the knowledge of [how to hunt], but it somehow developed this knowledge. It did have some primary experiences though; it felt hunger, and it perhaps also felt that by eating it could satisfy this hunger. These primary experiences might help to serve an intuition in the direction of hunting; the cat might intuit: hey that mouse might be good to eat, I should try to eat it. It doesn't immediately become the perfect hunter, but these primary experiences and intuitions serve for the development of techniques and thus the knowledge of [how to hunt]. So yes it used intuition to develop intelligence, but it also required primary experiences to intuit about, in particular its experiences with hunger, food, eating, etc.
I don't think intuition can exist first before we have any primary experiences at all. But new experiences can be sought from an intuition based on a previous experience. I might have an intuition that putting honey in tea makes for a good blend, without ever having had an experience with honey+tea. Based on an intuition, I seek out a new experience that tests my hypothesis. But on an existential level, I think first we must exist, have some basic experience of the world, and then we intuit in order to develop a sense of truth/knowledge/understanding about these things. It is this capacity to gain truth, to know more, that describes intelligence for me.
In the conventional sense of "intelligence" which is sometimes understood on an IQ scale, with ideas of "more" or "less" intelligent, you might rightly say the blind man is not less intelligent just because he is blind. But in the sense of intelligence that I am thinking about here, which in particular has to do with the capacity to know true things, the ability to expand the realm of knowledge, the blind man is limited in the extent of things he can know about the world. There is an entire aspect of the world that has truth in it which he is cut off from. I wonder if artificial intelligence will have a similar limitation in its ability to know truth because it might not have the right faculties to expand the body of knowledge that it is given. Maybe it can merely master the things that humans already know, without being able to discover wholly new truths on its own. I'm not sure.
It looks like I must pay for a subscription in order to like your reply comment; let the record reflect that I do indeed like it, though I cannot leverage the heart button.
Agreed, the cat must arrive at the intention of hunting via some need/curiosity/etc.
And yes, I think you capture a good thought via “new experiences can be sought from an intuition based on a previous experience.”
Regarding your thought on AI:
“Maybe it can merely master the things that humans already know, without being able to discover wholly new truths on its own,” I’d offer the possibilities of
(1) AI finding ‘new truths’ without knowing or wanting to - ex) identifying and reporting data trends humans may have overlooked
(2) AI serving as a tool via which humans can find new truths.
If there is one thing I’ve learned from my experiences in computing, it is the power computers have to complete small, repetitive tasks much more effectively than humans can. Consider things like data collection, sorting, visualization, etc. AI can certainly take these things to a new level (and already is doing so), which I believe will, at the very least, aid in the discovery of new truths!