Can artificially intelligent robots one day become sentient, like humans? Sentience is the ability to feel, perceive, and respond to the sensations of sight, hearing, touch, taste, and smell. How human can robots become? It is a matter of degree, a gradient.
Machine sensors can identify, detect, and process sensory input. For example, there are cameras for sight, vibration detectors for sound, pressure gauges for touch, gas chromatographs for smell, and volatile compound sensors for taste. Computer responses to these stimuli can be either explicitly programmed or learned from data through machine learning, a subset of artificial intelligence (AI).
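The distinction between a programmed response and a learned one can be sketched in a few lines of Python. The sensor names, thresholds, and routing function below are hypothetical illustrations, not a real robotics API:

```python
# Hypothetical mapping of hardware sensors to the five human senses.
SENSOR_TO_SENSE = {
    "camera": "sight",
    "vibration_detector": "hearing",
    "pressure_gauge": "touch",
    "gas_chromatograph": "smell",
    "volatile_compound_sensor": "taste",
}

def programmed_response(reading):
    """Hard-coded rule: react only when a reading crosses a fixed threshold."""
    return "react" if reading > 0.8 else "ignore"

def respond(sensor, reading, learned_model=None):
    """Route a sensor reading to either a learned model or a programmed rule."""
    sense = SENSOR_TO_SENSE[sensor]  # identify which sense this input represents
    if learned_model is not None:
        # Learned response: a model trained on past (reading, action) pairs
        # chooses the action without an explicitly programmed rule.
        return learned_model(sense, reading)
    return programmed_response(reading)
```

Calling `respond("camera", 0.9)` with no model falls back to the programmed rule; passing a trained model as `learned_model` switches the same input to the machine-learned path.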
Artificially intelligent machines may lack the biological parts, physiology, and biochemical systems that identify and regulate emotions, but they can emulate human sentience. The expression of human emotion is partly innate, inherited through genes, and partly learned. Emotions have two parts: the mental state and the accompanying somatic response. Computers can identify, label, and communicate sensory input. If environmental sensory inputs fall within a certain range, a computer can label the corresponding emotional state and even respond. That range can be derived either through hard-coding or through machine learning. The initial data and algorithms serve as the de facto DNA of artificially intelligent machines, the baseline of inherited machine sentience.
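As a minimal sketch of range-based labeling, the table below hard-codes bounds on a composite arousal score and maps them to emotion labels. The labels and bounds are invented for illustration; in the learned variant the text describes, these bounds would come from training data rather than being written by hand:

```python
# Hypothetical (low, high) bounds on a composite arousal score in [0, 1].
# Hard-coded here, but they could equally be fit by a machine learning model.
EMOTION_RANGES = {
    "calm": (0.0, 0.3),
    "alert": (0.3, 0.7),
    "alarm": (0.7, 1.0),
}

def label_emotion(arousal):
    """Return the emotion label whose range contains the arousal score."""
    for label, (lo, hi) in EMOTION_RANGES.items():
        if lo <= arousal < hi:
            return label
    return "unknown"  # input outside every defined range
```

The machine never feels anything, of course; it only labels a state when an input falls inside a range, which is exactly the emulation the paragraph describes.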
Like the human brain, AI can process sensory input both quickly and slowly. Sensory input that has already been learned and stored in the computer's memory can, in principle, be quickly retrieved and turned into actionable output. The slower path is the thinking part of machine learning, where novel inputs or combinations require additional processing. The degree of emulation depends on processing speed, algorithms, data, cloud-based computing, and memory capacity.
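This fast/slow split is essentially a cache in front of a slower computation. The sketch below is a toy illustration, with `slow_deliberation` as a hypothetical stand-in for the expensive "thinking" path:

```python
# Fast path: responses to already-learned stimuli, retrieved from memory.
learned_responses = {}

def slow_deliberation(stimulus):
    """Stand-in for the slower 'thinking' path: novel inputs
    require additional processing before an action is chosen."""
    return f"considered-response-to-{stimulus}"

def process(stimulus):
    """Dual-speed processing: retrieve if known, deliberate if novel."""
    if stimulus in learned_responses:          # fast: already learned
        return learned_responses[stimulus]
    action = slow_deliberation(stimulus)       # slow: novel input
    learned_responses[stimulus] = action       # store so next time is fast
    return action
```

The first encounter with a stimulus takes the slow path; every later encounter with the same stimulus is a fast lookup, mirroring the retrieval-versus-deliberation distinction in the paragraph above.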
In the human brain, sensory input from the body is relayed through the thalamus to the amygdala and the brain cortex. The amygdala is part of the limbic system, which regulates emotions and the fast "fight or flight" response. It signals the hypothalamus to release hormones that influence the autonomic nervous system (ANS). The sensory input routed to the brain cortex is processed at a more gradual rate. Thus both the human brain and AI process sensory input along dual pathways at different speeds.
So this leads us to seek a sharper delineation between man and machine. AI machines may approach human-like sentience, but they will never be alive in the sense of being conscious. Or will they? Can man recreate consciousness in machines? More likely, man will plug into the machine first.
Today we enhance our thinking and memory externally with smartphones, search engines, cloud computing, calculators, GPS, digital assistants, fitness trackers, and other technology. We boost our somatic capabilities with glasses, hearing aids, implants, and artificial limbs.
Inventor, author, futurist, and Google executive Ray Kurzweil predicts that by 2030 humans will connect the thinking part of the brain, the neocortex, to a cloud computing network. The question becomes: "How integrated will man be with the machine?" It is a matter of degree.