2 suggested paths from mechanism to consciousness
Irrationality is a necessary (and perhaps sufficient) condition for consciousness?? Think about it this way. You have a system like Drescher's which compiles statistics to direct its actions in a "world." For consciousness, the most important aspect of such a system is that which
"is defined not so much by its particular set of primitives as by its ways of combining structures to form larger ones, and by its means of abstraction -- its means of forming new units of representation that allows the details of their implementation to be ignored." (Drescher, Made Up Minds, p10)
As I read it, what this means is we get a system which combines its primitive processes into higher-level ones, so that you get a "commanding" program with any number of functions, or sub-routines. For non-programmers, this means you could have the lower-level processes carrying out the detailed statistical analyses to determine actions, while the so-called commanding program is "unaware" of those activities. All it needs to operate is a "Yes" or "No" from the sub-routine based on its detailed statistical analysis.
What you end up with is a system, if it is indeed ignoring the primitive functions, that is unaware of its own internal processes and how its actions are determined. If it receives a command, say, to stack 3 blocks that are in its world, the primitive statistical processes are going to run in order to determine the locations of said blocks and make the parts move to execute the command. The higher-order process of the system, though, undergoes only the "experience" of receiving the command, finding the blocks, and doing it. If you could ask it what it is doing, it would say it was following the order, not that it was carrying out statistical analysis. I believe this is the first step towards conscious machines. There is nothing it is like to be a mechanism which merely compiles statistics and uses them to push buttons on and off. And maybe there is nothing it is like to be a higher-level program which "sees" only the results of such compiling and uses the "Yes" or "No" to push other buttons on and off. But maybe there is something it is like to be a higher-higher-higher ... -level program which pulls vast amounts of different types of input together in one orderly mechanism and is ignorant of the extraordinarily complex underworkings.
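To make the layering concrete, here is a minimal, purely illustrative Python sketch (all names are hypothetical, and this is not Drescher's actual schema mechanism). A low-level routine does the detailed statistical work over noisy sensor readings; the high-level "commanding" program never sees that work, only the positions and Yes/No answers it hands up.

```python
import random

# --- Low level: detailed statistical machinery the commanding program never sees ---
class LowLevelPerception:
    """Compiles statistics over noisy sensor readings to locate blocks."""

    def locate_block(self, block_id, readings):
        # Average many noisy position readings; this is the kind of detail
        # the higher level is "unaware" of.
        xs = [r[0] for r in readings]
        ys = [r[1] for r in readings]
        return (sum(xs) / len(xs), sum(ys) / len(ys))

    def is_reachable(self, position):
        # A stand-in for a statistical yes/no judgment; the higher level
        # receives only the boolean, not the analysis behind it.
        x, y = position
        return (x ** 2 + y ** 2) ** 0.5 < 10.0


# --- High level: the "commanding" program ---
class Commander:
    """Experiences only: receive command, find blocks, stack them."""

    def __init__(self, perception):
        self.perception = perception
        self.log = []

    def stack_blocks(self, block_ids, sensor_readings):
        for block_id in block_ids:
            pos = self.perception.locate_block(block_id, sensor_readings[block_id])
            if self.perception.is_reachable(pos):  # just a "Yes"/"No" from below
                self.log.append(f"picked up {block_id} and stacked it")
            else:
                self.log.append(f"could not reach {block_id}")
        return self.log


if __name__ == "__main__":
    # Fifty noisy readings per block, keyed by block name.
    readings = {
        b: [(random.gauss(i, 0.1), random.gauss(i, 0.1)) for _ in range(50)]
        for i, b in enumerate(["block_a", "block_b", "block_c"])
    }
    commander = Commander(LowLevelPerception())
    # Asked what it did, the Commander's own record mentions only the
    # command-level story, never the averaging or thresholding underneath.
    print(commander.stack_blocks(["block_a", "block_b", "block_c"], readings))
```

The point is only structural: nothing in the Commander's interface exposes how the low-level answers were computed, which is the sense of "ignorance" at issue in the post.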
Why could this be true? Because it could be true that we are such systems. The complex set of processes that go into pouring myself a bowl of cereal feels, from the inside, simpler by orders of magnitude than it actually is.
Labels: Functionalism, Philosophy of Mind
3 Comments:
This is cool. The key question is re:
Dan(iel) said . . .
"What you end up with is a system, if it is indeed ignoring the primitive functions, that is unaware of its own internal processes and how its actions are determined."
why, since we are epistemically closed (at least right now, maybe in principle) to how our actions are determined, should we assume that they are entirely determined at the sub-personal level?
Hi Richard,
If I understand Dan(iel)'s post (correct me if I am way off, Dan(iel)), I think he is in a way defining or explicating "irrationality" as: (i) necessary blindness at a higher-order command level to information at a lower-order level of a cognitive system, and/or (ii) necessary blindness by the higher-order command level to the causal complexity of the deterministic, lower-order mechanisms that produce action (therefore creating a need for unitary / agential consciousness, to help the system navigate while having imperfect access to its own informational and causal states). Jesus, this is way too much jargon.
The "personal" or consciousness emerges because the sub-personal can't organize itself efficiently without it(therefore, consciousness is the necessary product of "irrationality" in his sense). This is pretty much Dennett's line, no?
This is cool because traditionally "irrationality" just means blindness to salient information that is available to an agent. So we get an updated explication of "irrationality" using cognitive science.
My question was motivated by the following concern: why would we need the illusion of consciousness if the processes at the sub-personal level are in fact deterministic? A fully deterministic mechanism should have no need for illusions of agency; it just does what it deterministically needs to do. An epiphenomenal higher-order system that just says "yes" or "no" to the results of deterministic causal processes seems unnecessary and wasteful by evolutionary standards. This is Meixner's point in the excerpt of the book I linked to below. Consciousness only seems necessary if we assume an indeterministic world.
If we start, as we always do, from the first-person perspective, trying to justify our beliefs to each other (including our beliefs about what the first-person perspective is), and we then analyze the first-person as a kind of necessary illusion of agency created by the lower-order complexities and "irrationality" in the system, then what is the criterion for justifying the belief that the mechanistic model Dan(iel) proposes is the fundamental one, rather than just a projection out of the first-person to explain itself in objective terms? What gives metaphysical priority to one level over the other? If we can't actually account (because of principled epistemic limitations) for all the sub-personal facts that we claim are determining what is going on at the personal level, then why assume they are there? We don't need this assumption to do cognitive science, right? So what's the compelling need to slim the ontology down? Longest question ever.
Daniel said,
"Rationality involves behaving according to some set of rules or algorithm - logic, basically. "
But rationality (or at least inteligence) is more than this: it is responding to your situation appropriately. Sadly, George W. Bush consistently acts upon some basic algorithms, but they are no sign of intelligence in this cases.