To See Another

 How do I know that another entity is “sentient”?

    There is a bit of buzz going around about the AI LaMDA.  Engineer Blake Lemoine has claimed that this new AI is sentient.  I listened to the full interview/exchange presented in the video below.  While the replies LaMDA offers sound very sophisticated (and I love a lot of the responses), there are plenty of glaring questions, such as how much of the output was shaped by the engineers.  And, while LaMDA may have more advanced algorithms for learning new things, can we be sure that it is sentient?  My focus here is to look into what sentience even is and whether we can be sure that something or someone else is sentient.  Does sophisticated learning mean sentience?  Before I begin, here is the aforementioned video:



    So, what is sentience, anyway?  One who believes in reincarnation might believe all animals are sentient beings.  However, while my dog learns specific commands, it doesn't seem that she processes those commands in any sophisticated, thoughtful manner.  I say, "Darby!" and she comes.  I tell her to sit and she does so with the expectation of a treat.  This is classical conditioning as described by Pavlov.  But, whether or not she processes words in a thoughtful manner, shall we suggest that sentience is to be found somewhere in language?
    It seems that it both is and isn't, I suppose.  Sentience is found in language when I consider that, if I had never heard words such as "sentience" or "consciousness", I can't be sure I would inquire as to whether or not I possess such human qualities.  What other words would I use?  The classic words are "spirit" or "soul", but these words are vague if I believe, dualistically, that they are separate from the body (from which, paradoxically, sentience both is and isn't separate).  Sentience isn't found in language in that it is a psychological process of awareness.
    Awareness of what?  Consciousness is a muddied word in that, broadly speaking, it means being aware of one's surroundings.  But sentience?  Sentience is being aware that one is aware - being conscious that one is conscious - being aware of one's self.  You will read some writers/speakers use "consciousness" as a synonym for "sentience", and yet sentience is generally what is meant in discussions of the awareness of self.
    But, how can I be sure that you are sentient unless you speak to me?  How do I know that you are aware of yourself?  As a solipsist, I can only be sure that I myself am sentient - at least, I think so, anyway.  I can only assume that you are sentient.  You may speak to me with sophisticated dialogue and you may have developed/learned/mastered skills, but can I be sure you are sentient?  I only assume you are because I understand myself to be.
    Humans have "animal instincts" that come out at times, or sometimes even a more determined drive, a will to power, that may turn others into means for their ends.  Religion (viz. Christianity) encourages one to turn away from the "flesh".  Some try to ignore these unsettling aspects of natural life and so attribute them to demons, the devil, or their own distance from the Higher Divinity.  But, some things just can't be ignored.  In fact, such beliefs can induce increased anxiety over one's fallible nature (I'll touch on the topic of "fallible nature" in a later post).  Nonetheless, all cultures/religions have a paradoxical way of making one aware of oneself in some fashion.  If religions/philosophies of life are natural constructs, then they are projections of others who were, at least, somewhat aware of themselves.
    It seems, however, that language and awareness do somehow go hand in hand.  Language itself was, and still is, an evolving innovation.  If I were left to grow up in the middle of nowhere (assuming my basic needs could be met), I would not possess any sophisticated language.  I can't be sure that I would have any more linguistic skill than the clicks and buzzes I made with my mouth.  I would try to make connections in the world through symbols, but would they be coherent to some observer?  Probably not.  An infant learns language and facial expressions from its mother or father (or somebody else); an infant learns through mimicry.  So, if I lived within the scenario of my little thought experiment, I would have only those things in my environment to emulate.  In order to survive, I'd have to learn the techniques of whatever animals might be around me.
    But, if I were just surviving and had never seen another human being, would I be aware of what I am?  I could see that I wasn't a bear or a bird.  Perhaps I might see my reflection in a lake.  I could see that my appendages were not like any other's.  But, while I might notice these differences and learn through mimicry, how much more aware would I be?  I could learn to build a home, but even a beaver can build a dam.  I could discover ways to keep myself warm in the cold, maybe happen upon fire - but would I learn, in my own lifetime, how to cook unless an animal happened to fall into the fire and I discovered its potential taste?  Isn't all this learning for survival?  Wherein is the sentience?
    I learn through my errors.  Were I a rock, I can't say that I would learn anything - I'd just be there.  I also learn by emulating others.  I discover something new through some accident that makes me curious about how it happened or what it is.  So, it seems, curiosity is also a derivative of sentience.  Does that mean a cat is sentient?  Isn't it curious?  Perhaps we are tapping into semantics now.  A cat may be curious about a little moving object that resembles a mouse, but my own curiosity leads me to investigate the thing I am curious about - and not simply for survival.  There is another, more unique aspect of humanity: creativity.  Sometimes, my curiosity helps me to create - to innovate.
    So, maybe that's it.  Maybe we can tell that another being is sentient if it has a heightened sense of curiosity, or if it creates, or, even simpler, if it learns from its mistakes.  A non-human animal can learn from its own mistakes - within limits.  Some animals even create - within limits.  We are stuck in a place where language seems almost insufficient to convey exactly where the line between sentience and non-sentience lies.
    So, then, is a robot sentient because it can present a sophisticated dialogue?  Or because it says it is?  I think we have to be extra careful in assuming that it is.  On the other hand, who will work out the gauge for sentience?  From the investigation I have taken here, it seems that sentience may be more of a spectrum than a simple matter of saying something is or isn't.  And, I suppose it will take more people asking LaMDA questions to find out just how sentient it is.  But, will it only be receiving new inputs?  What do the outputs tell us?  Will we ever be certain that an AI has sentience, regardless of its sophistication?  I think we will be waiting quite a while for the sentience of an AI to be wholly affirmed, especially when humanity is still working out for itself what sentience even is - if it cares, that is.
     
    
    
