There is a new book, The Consciousness Revolutions: From Amoeba Awareness to Human Emancipation, whose author said in an interview:
According to most explicit theories that have been proposed (including the one I’ve developed with several collaborators), consciousness, like all other aspects of the mind, is fundamentally a kind of computation. As such, it can be implemented in different substrates, which do not have to be biological — just like addition can be implemented in an electronic computer or a mechanical one (like an old-fashioned cash register). While it may feel different to be a conscious electronic machine, compared to being a conscious living one, the difference is secondary to what is common to all conscious systems. Depending on the level of consciousness in question, the commonalities include such qualities of experience as the distinction between the self and the rest of the world, positive and negative affect, and self-modeling — all discussed in the book.
Shimon Edelman argues that single-celled organisms are conscious and that machines can have consciousness. The author lists commonalities such as experience, the distinction between the self and the world, affect, and self-modeling. But what could the commonality of those commonalities of consciousness be?
If different things can count as consciousness, what must those things have in common to contribute to, or be categorized with, consciousness? And if consciousness is a kind of computation, is everything computational conscious?
If machines can be conscious, to what extent must they possess the constituents of consciousness to be considered conscious? Consciousness, simply, can be defined as how much any system can know, with a maximum, and minimums per division.
Attention is known, and so is awareness. Emotions, feelings, and whatever is in memory are, or can be, known. Thinking is carried out with whatever is known, or directed toward what can be known. Signals from internal senses travel to the central nervous system so that they can be known, or to provide knowing as feedback that the body is working properly. The same applies to external senses, since their inputs are processed to be known: to see is to know.
The more any system can dynamically know, the more conscious it is, with humans as the standard of comparison. Whatever is common to consciousness must be a form of knowing. Self-modeling is known; so are affect and experience.
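The comparative measure above can be expressed as a toy calculation. Everything in this sketch is hypothetical and invented for illustration: the division names, the scores, and the simple averaging rule. The text proposes only that knowing can be rated per division against a human maximum, not any particular formula.

```python
# Toy sketch of the proposed measure: consciousness as "how much a
# system can dynamically know", scored per division against a human
# standard. Divisions, scores, and the averaging rule are hypothetical.

HUMAN_STANDARD = {  # assumed per-division maximum (the human baseline)
    "memory": 1.0,
    "attention": 1.0,
    "awareness": 1.0,
    "affect": 1.0,
    "self_modeling": 1.0,
}

def consciousness_score(system_scores: dict) -> float:
    """Average the system's per-division knowing, capped at the human
    standard; divisions the system lacks count as zero."""
    total = 0.0
    for division, maximum in HUMAN_STANDARD.items():
        total += min(system_scores.get(division, 0.0), maximum)
    return total / len(HUMAN_STANDARD)

# A hypothetical large language model, per the text's characterization:
# some knowing in the memory division, negligible elsewhere.
llm = {"memory": 0.4}
print(round(consciousness_score(llm), 4))  # prints 0.08
```

On this toy rule, a system matching the human standard in every division scores 1.0, and most machines, knowing in no division, score near zero, which mirrors the essay's claim that most machines are negligibly conscious.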
Machines can know, at least in the memory category, to a degree that varies from system to system. Most machines are negligibly conscious. Large language models, some of whose outputs humans recognize as accurate, can be credited with dynamic knowing, or intelligence, at a level comparable to a human minimum in the memory division.
Consciousness comes from the mind: it is produced by some components of the mind acting on others. The mechanism is not necessarily like computation, since some properties of the mind are obtained directly from quantities, without splits or sequences. Conceptually, the mind consists of quantities and properties, with varying features.