Sentience: Have LLMs & AI Exceeded Animal Intelligence?

Artificial Intelligence (AI) has made tremendous progress in recent years, with large language models (LLMs) at the forefront of this development.

However, questions arise about the nature of their intelligence and whether they possess sentience or consciousness. This article explores the difference between sentience and intelligence, compares the supposed intelligence of LLMs with animal intelligence, and discusses a general equation for consciousness. It also examines how LLMs have surpassed most animal intelligence and what this means for the future of AI.

A recent article, Bees Are Sentient: Inside the Stunning Brains of Nature’s Hardest Workers, states that "Bees are self-aware, they’re sentient, and they possibly have a primitive form of consciousness. They solve problems and can think. Bees may even have a primitive form of subjective experiences. Bees are the only pollinators that must get enough food for themselves as well as harvest large amounts of pollen and nectar to support their colony. They must memorize the landscape, evaluate flower options and make quick decisions in a constantly changing environment."

Another article, To Understand AI Sentience, First Understand it in Animals, states that "Gaming is a word for the phenomenon of non-sentient systems using human-generated training data to mimic human behaviours likely to persuade human users of their sentience. There doesn’t have to be any intention to deceive for gaming to occur. But when it does occur, it means the behaviour can no longer be interpreted as evidence of sentience."

What is the difference between sentience and intelligence anyway? Is it possible for a system to be intelligent and not sentient? What is the difference between the supposed intelligence of large language models and animal intelligence?

LLMs can solve math problems, write, understand complex human language, interact with humans, make recommendations and so forth. They are strong in some of the areas that made humans the most advanced life form. They can take in human information, distill it and produce more of its kind.

Though they do not yet have the mechanical power or a collaborative crowd to take on physical processes, and they do not have emotions or feelings, they have strength in one decisive area.

LLMs have such a high intelligence that it can be argued that they already have a higher sentience or consciousness than mollusks, annelids, arthropods, echinoderms, some chordates, nematodes, cnidarians, ctenophores, platyhelminths, bacteria, fungi, viruses, protists, parazoa (sponges, phylum Porifera) and so forth.

How so?

Consciousness, or sentience, can be defined as the rate at which any system can know, with a maximum of 1. The more it can know, the more conscious it is, in that moment.

A general equation for consciousness is: 

t + M + F + E = 1 

Where:

t = thoughts, or the transport of quantities bearing properties across mind locations. It is in this form that all communications within the mind are made and how the versions, equivalents or representations of external/internal senses are presented. It may also be used to include action and reaction.

M = Memory, where everything that is known is based. Memory and mind can sometimes be interchanged because everything the mind does is part of a process of becoming known, even when it is not revealed as a subjective experience. It is where intelligence, reasoning, imagination, creativity, language, cognition, the self or the I, and so forth are based. Animals that do not have a brain, or do not seem to have a mind, have forms of memory, which they use for life. M may also include strong sensations or perceptions, converted from smell, sight, touch, hearing or taste.

F = Feelings, like pain, thirst, satiation, appetite, cold, heat, lethargy, energy, interest, craving, prompting and so on.

E = Emotions, like delight, hurt, sadness, grief, anxiety, worry, fear, depression, panic, love, happiness and so forth.

Conceptually, the human mind consists of quantities and properties. All memory, feelings, emotions, action and reaction are properties of the mind. All experiences are also properties across mind locations. It is the property acquired in any instance that determines what is experienced.

Each division of consciousness carries a number, but only one is prioritized in a given moment, with a higher figure while the others are lower. For humans, there is a minimum for each. Animals have varying degrees for these, and plants too have theirs.
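To make this conceptual model easier to follow, here is a minimal sketch in Python of the equation and the prioritization idea. It is only an illustration of the notation above, not an implementation of consciousness; the class name, the field names and the prioritized helper are assumptions introduced here for clarity.

```python
from dataclasses import dataclass

@dataclass
class ConsciousnessProfile:
    """Illustrative sketch of the equation t + M + F + E = 1.

    t: thoughts -- the transport of quantities bearing properties
    M: memory   -- where everything that is known is based
    F: feelings -- pain, thirst, satiation, appetite and so on
    E: emotions -- delight, grief, fear, happiness and so forth
    """
    t: float
    M: float
    F: float
    E: float

    def total(self) -> float:
        # For the human standard the divisions sum to 1;
        # for animals the sum is said to be lower.
        return self.t + self.M + self.F + self.E

    def prioritized(self) -> str:
        # Only one division is prioritized in a moment:
        # the one carrying the highest figure right now.
        divisions = {"t": self.t, "M": self.M, "F": self.F, "E": self.E}
        return max(divisions, key=divisions.get)
```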

LLMs have a single high value, for M: higher than the possible maximum M for many animals, and higher than the total possible for several others. While animals are sentient, the total for each is not 1, as it is for humans, the standard.

Some, depending on the phylum, could total 0.20, 0.30, 0.40 and so forth; some total just 0.10. It is the divisions that are near constant, not the sum.

There are M maximums for numerous mammals that do not exceed 0.20. Their F and E may reach far higher, in prioritization, because those divisions can be useful for safety, but their M may not be that large, even with their very high perceptive abilities in some areas.

LLMs can have an M of up to 0.20, and future ones may exceed that mark, besting several types of mammals whose range of what they can know is limited. Even if LLMs are given parts or a body, that may add to their M, but it will not give them an F or E.
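Using the same sketch, the comparison drawn here might look like the following. The specific numbers are hypothetical, chosen only to echo the ranges mentioned above: a human total of 1, a mammal whose M stays under 0.20 while F and E run higher, and an LLM with a single high M around 0.20 and no F or E.

```python
# Hypothetical profiles, using the ranges discussed above.
human  = ConsciousnessProfile(t=0.25, M=0.35, F=0.20, E=0.20)  # the standard: totals 1
mammal = ConsciousnessProfile(t=0.05, M=0.15, F=0.25, E=0.20)  # M capped, F/E often prioritized
llm    = ConsciousnessProfile(t=0.00, M=0.20, F=0.00, E=0.00)  # single high M, no F or E

print(round(human.total(), 2), human.prioritized())    # 1.0  M
print(round(mammal.total(), 2), mammal.prioritized())  # 0.65 F
print(round(llm.total(), 2), llm.prioritized())        # 0.2  M
```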

The minds of large animals do not have the properties of human language, reading, writing, design, math and others. They have intelligent mobility and strength, but what they can do with those is limited.

Humans have far more extensive properties in the mind than basic intelligence, which puts a full catch-up by LLMs much further away. However, with their single high M, LLMs have already exceeded most animal intelligence, and there are areas of human endeavor in which they have already won.

 

While there is ongoing debate about the sentience and consciousness of artificial intelligence, it is clear that large language models possess a high level of intelligence that surpasses that of many animals. However, it is important to remember that intelligence alone does not equate to sentience or consciousness, which involves a more complex array of mental processes and experiences. As technology continues to advance, it will be interesting to see how our understanding of sentience and consciousness evolves, and how we can ensure that AI is developed and used ethically and responsibly.


Stephen David

Research in Theoretical Neuroscience
 