
The Higher-Order Thought Theory of Consciousness Is Correct

The higher-order thought (HOT) theory of consciousness is an attempt to give the conditions under which a mental state is conscious. It does not address whether consciousness can be reduced to a physical state and so takes no side in the ontological debate about the nature of consciousness. Rather, it tells us that a mental state is conscious if and only if that mental state (whether it is fundamentally physical or not) is the object of another thought. I will argue that there is a good argument for the higher-order thought theory of consciousness, namely Lycan’s simple argument. I will slightly modify this argument to render it valid and defend its initial premise against Dretske’s objections. I will then argue that the conclusion of the argument is best interpreted as supporting a higher-order thought theory of consciousness rather than a higher-order perception theory. I will thus be arguing that the higher-order thought theory of consciousness gives the correct conditions for a mental state to be conscious.

Some initial distinctions must be made in order to understand the higher-order thought theory of consciousness. The first distinction is between state consciousness and creature consciousness. State consciousness concerns whether a mental state is conscious (leaving open what would make a mental state conscious), whilst creature consciousness concerns whether a particular being is conscious. Creature consciousness can in turn be either transitive or intransitive. Intransitive consciousness is ‘just the creature’s being conscious’ whilst transitive consciousness is a ‘creature’s consciousness of something or other’ (Seager 1994: 270). The distinction between transitive and intransitive consciousness is thus the distinction between a being that is conscious of something or other and a being that is conscious but not conscious of anything in particular. This distinction does not presuppose that either exists or that either requires the other; it is merely a useful starting point for exploring the higher-order thought theory of consciousness. The idea behind the theory is, roughly, that state consciousness can be defined in terms of transitive creature consciousness. That is, the condition for a mental state’s being conscious is that one is conscious of something or other, and more specifically that one is conscious that one is in that mental state. It might be objected that this is circular because we are defining consciousness in terms of consciousness, but this objection misses the mark because we are defining state consciousness in terms of transitive consciousness. These are two different concepts, and the mere fact that both share the word ‘consciousness’ does nothing to show that they mean the same thing (and thus nothing to show that defining state consciousness in terms of transitive consciousness is circular). Higher-order thought theories are more specific still: they require that transitive consciousness of a mental state consist in that mental state’s being the object of another mental state if the former state is to be a conscious mental state. The higher-order thought theory of consciousness may be initially stated as:

(1) A mental state M is conscious for a person P (state consciousness) if and only if M is the object of another thought held by P.

Do we have any arguments for (1)? Why should we believe that a state is conscious when and only when it is the object of another thought? Lycan provides an argument for higher-order representation theories of consciousness, the family of theories on which a state is conscious if it is represented by another state. Although this argument does not itself establish the higher-order thought theory, I will argue later that the only alternative interpretation of its conclusion is implausible, and so we must accept the higher-order thought theory of consciousness. The argument runs as follows:

(2) ‘A conscious state is a mental state whose subject is aware of being in it. [Definition]’ (Lycan 2001:3)

(3) ‘The ‘of’ in (2) is the ‘of’ of intentionality; what one is aware of is an intentional object of the awareness.’ (2001:4)

(4) ‘Intentionality is representation; a state has a thing as its intentional object only if it represents that thing.’ (2001:4)

(5) ‘Awareness of a mental state is a representation of that state’ (2001:4)

Therefore,

(6) ‘A conscious state is a state that is itself represented by another of the subject’s mental states’ (2001:4).

Is this a valid argument? As Lycan notes, the only two inferences drawn are (5) and (6), so we only need to consider them in assessing the validity of the argument. (5) seems to be a valid inference from (3) and (4): (3) tells us that what we are aware of, the mental state, is an intentional object of our awareness, and (4) tells us that if a state has something as its intentional object then it represents that thing; together they imply that awareness of a ‘mental state is a representation of that state’. The second inference, from (2) and (5) to the conclusion (6), seems to be invalid. If ‘awareness of a mental state is a representation of that state’ (5) and a ‘conscious state is a mental state whose subject is aware of being in it’ (2), it does not follow that a conscious state is ‘represented by another of the subject’s mental states’ [my emphasis]. It is perfectly compatible with a conscious state’s being defined as one the subject is aware of being in, and with awareness of a mental state being a representation of that state (as validly follows from (3) and (4)), that no other mental state represents it. That is, there could be a mental state that represents itself: in that case, being aware of a mental state would entail that there is a representation of that state without there being a mental state distinct from it that represents it. So we could be aware of a mental state, and thus by (5) have a representation of that state, while that state represents itself. In order to make the inference valid we need an additional premise:

(7) ‘No mental state represents itself.’ (Gerken 2008:246)

If (7) is true then (6) does follow from (5) and (2): a conscious state is one its subject is aware of being in (2), awareness of a mental state is a representation of that state (5), and by (7) the state that represents the conscious mental state cannot be that state itself, so there must be another mental state that represents it.
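To make the structure of the repaired argument fully explicit, here is a schematic first-order rendering. The predicate letters C (‘is conscious’), A (‘is an awareness of’) and R (‘represents’) are my own shorthand rather than Lycan’s or Gerken’s notation, and the rendering treats the subject’s awareness as itself a mental state a, as premise (5) already implicitly does:

\[
\begin{aligned}
&(2')\quad \forall m\,\bigl(C(m) \leftrightarrow \exists a\, A(a,m)\bigr)\\
&(5')\quad \forall a\,\forall m\,\bigl(A(a,m) \rightarrow R(a,m)\bigr)\\
&(7')\quad \forall m\,\neg R(m,m)\\
&(6')\quad \forall m\,\bigl(C(m) \rightarrow \exists a\,\bigl(a \neq m \wedge R(a,m)\bigr)\bigr)
\end{aligned}
\]

Given C(m), (2′) yields some a with A(a,m); (5′) then gives R(a,m); if a were identical with m we would have R(m,m), contradicting (7′); so a ≠ m, which is what (6′) asserts. Without (7′) the case a = m cannot be ruled out, which is exactly the self-representation loophole described above.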

We have a valid argument, but how plausible are the premises? Premise (3) is a statement about intentionality. A state is an intentional state so long as it is about something. In our case the intentional state is the state of awareness of the mental state, and that mental state is the intentional object. We often say that we are aware of a tree in the garden or of a pain in our back (indicating intentional states), so there seems to be no good reason why we cannot be ‘aware’ of our own mental states in the same sense. Since intentionality can apply to mental states and there seems to be no other way of interpreting (2), (3) provides a compelling analysis of being aware that one is in a mental state. Premise (4) is plausible because intentionality is closely tied to representation. When we have the thought that it is raining we are having an intentional thought because it is about something (the rain, or a sense datum, depending on the theory of perception accepted), and this thought represents the rain because its content is about the rain. If x has a thought about y then it seems obvious that x’s thought represents y, and if x has a thought that represents y then it seems equally obvious that x’s thought is about y. Because the two seem inextricably linked, we can grant that intentionality is representation. Premise (7) seems plausible because we think of representation as having a relational structure whereby a thought represents something distinct from itself. The very nature of representation is one thing resembling or describing another, and on this picture nothing represents itself. Moreover, there do not seem to be any actual cases of mental states representing themselves (or at least not in the case of our awareness of our mental states). Having said this, I think that the initial premise, the definition, is the most likely candidate for suspicion. The rest of the argument seems obvious once the initial premise is granted. I will now consider an objection from Dretske.

The definition of a conscious mental state as a ‘mental state that the subject is aware of being in’ (Lycan 2001:3) does seem to be an initially plausible premise. How can we have a conscious state without being aware of anything? It seems that ‘if one is totally unaware of some mental state, that state is not a conscious state’ (Rosenthal 2002:407). (2) defines state consciousness in terms of transitive consciousness (that is, being aware of something). Nevertheless, Dretske has objected to defining state consciousness in terms of transitive consciousness because he believes that state consciousness does not entail transitive consciousness. That is, he believes that a subject can have a mental state that is conscious (state consciousness) without being transitively conscious of that mental state. To support this he gives an example in which we seem to have a conscious experience without being transitively conscious of it (that is, aware of it). Imagine seeing a Dalmatian on Tuesday and, while petting it, looking at all of the black spots on its face; on Wednesday we see the same dog (which we know because it has the same tag) and again we look at the spots on its face. In order to argue against (2) we can assume that both experiences are conscious in virtue of our being aware of them. The problem for (2) arises when we are told that on Wednesday the dog had an extra spot on its face. Ex hypothesi, on Wednesday we have a conscious experience of all of the spots on the dog’s face (because, in line with (2), we are aware of that experience), and we experienced a spot on Wednesday that we did not experience on Tuesday. So on Wednesday we had a conscious experience of a spot (because we are aware of that experience) that we did not have on Tuesday (because that spot did not exist on Tuesday and so we were not aware of any experience of it). We thus had an experience on Wednesday that consciously differed from Tuesday’s (because we were aware of a different experience), and so there was a conscious difference in our experiences. But if there was a conscious difference in our experiences then, if (2) is true, we must be aware that our experiences differed, because a conscious mental state is one that we are aware of. Yet it ‘does not follow from this conscious difference in my experiences that I am conscious of the difference’ (Byrne 1997:114). We can easily imagine seeing the dog on Tuesday and Wednesday and not noticing that the Wednesday Dalmatian had a spot it lacked on Tuesday. If this argument succeeds it shows that we can have a conscious experience (in this case a conscious difference in experiences) without being aware of that experience (in this case we are not aware of the conscious difference in experiences). That would conflict with (2), which stipulates that we must be aware of an experience for it to be a conscious experience. Whilst this argument may seem initially convincing, it makes a mistake in its reasoning. The mistake is the claim that because we have one conscious experience (of which we are aware) that differs from another conscious experience we have, we therefore have a ‘conscious difference’ in experiences, which by (2) would require awareness of the difference and so generate the purported problem. But this does not seem to be the case.
It is certainly true that (2) commits us to saying that we have two different conscious experiences (because we are aware of two different experiences), but it does not follow that we are aware that those two conscious experiences differ. We can perfectly well be aware of our experience of the dog on Tuesday and aware of our experience of the dog with an extra spot on Wednesday without being conscious of the fact that the extra spot is what makes them different. According to (2) we would only be conscious that they differed if we were aware of a mental state whose content was that one experience included an extra spot and the other did not, but clearly (2) does not require that such a mental state exist at all, and so it does not require that we are conscious that the two experiences differ. Whether we are conscious of the difference is a different issue from whether we have two different conscious experiences which differ. Thus we can maintain premise (2) in our argument for higher-order theories of consciousness.
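The point can be put schematically using the shorthand introduced earlier (again my own notation, not Dretske’s or Byrne’s). Letting e_T and e_W be the Tuesday and Wednesday experiences, what (2) commits us to is only

\[
\exists a_1\, A(a_1, e_T) \quad\text{and}\quad \exists a_2\, A(a_2, e_W),
\]

whereas being conscious of the difference would require a further state a_3 such that A(a_3, d), where d is a state whose content is that e_T and e_W differ in respect of the extra spot. Nothing in (2) entails that d, or an awareness a_3 of it, exists.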

So far our argument takes us to the conclusion that a mental state is conscious so long as it is represented by another mental state. There seem to be two ways in which a mental state can represent another mental state: we can ‘perceive’ a mental state, or we can have a thought about it. Higher-order thought theories of consciousness require that the representation which confers consciousness upon a mental state be the latter: they require that we have the thought that we are in a mental state in order for that state to be conscious. The distinction between having a thought about a mental state and perceiving it can be drawn by distinguishing between fact-awareness and thing-awareness. Thing-awareness is being aware of some object, whilst fact-awareness is being aware that something is the case. Fact-awareness is distinguished from thing-awareness by the ‘deployment of concepts’ (Dretske 1993:265), such as the concept of a red house. Clearly we do not need the concept of a red house to be thing-aware of a red house, but we do need it to be fact-aware of a red house: for instance, we need the concept ‘red house’ to be fact-aware that the red house is 13 metres high. Perception of a mental state and a thought about a mental state are then easily distinguished: the former is thing-awareness of that mental state and the latter is fact-awareness of it. So a mental state is conscious either in virtue of being perceived or in virtue of there being a thought about it. The problem for perceptual accounts of consciousness is that our higher-order representations do not seem to have the sensory qualities which are essential to perception. For example, when we have a first-order perception of a green apple we do not only perceive the green quality of the apple; we perceive the green apple in a certain way. Our perception of the apple has the sensory quality of greenness, which is ‘distinct from any property of the object we perceive’ (Rosenthal 2002:409). We should therefore expect a higher-order perception of our mental states to have sensory qualities of its own, distinct from the qualities of the states perceived. Yet our higher-order representations of our mental states do not seem to have their own distinct sensory qualities. For example, when we are aware of our perception of a dog (and thus, by the argument given earlier, represent that state), that awareness does not seem to have a quality of its own. Our awareness seems to be fact-awareness (we seem to believe that we have that perception of a dog), and fact-awareness does not come with its own sensory qualities. This leads us to reject the perceptual model of consciousness, because our awareness (and representation) of mental states lacks the essential quality of perception, and instead to accept the cognitive model, on which we have a conscious mental event if we have a thought to the effect that we are having such an experience. But not only do we have reasons for rejecting the perceptual account; we also have reasons for accepting the higher-order thought theory of consciousness. When we report that something is the case we are expressing our thought that something is the case (we have fact-awareness, and this involves the deployment of concepts, which are the essential feature of thoughts).
We often report our conscious mental states (without begging the question, these are mental states that we intuitively believe are conscious), and in doing so we must be expressing thoughts about those mental events. So we must have thoughts about our conscious mental states, because we often report them and reporting a mental event involves expressing a thought about it. If we have thoughts about our mental events then this supports the higher-order thought theory of consciousness, because this is exactly what the theory says is the case. Given that it is ‘unclear how I could have the ability to express some thought without actually having that thought’ (Byrne 1995:109), it seems that we must have higher-order thoughts about our mental states. If this is right, then we have a good argument that the representation of a mental state which confers consciousness on it is a thought about that mental state.

In conclusion, the higher-order thought theory is a correct account of consciousness. Lycan’s argument for higher-order representation theories of consciousness can be made valid by the addition of a very plausible premise, and the other premises are also highly plausible. The only premise open to serious doubt is the definition of a conscious state as one that its subject is aware of being in. I considered Dretske’s objection to this premise, which attempted to give an example of a mental state that is conscious but of which we are not aware, and found the example to be flawed. Dretske assumes that someone committed to (2) is also committed to saying that someone who has two different conscious experiences must be aware of the difference between them, but this is not the case: someone committed to (2) is not committed to saying that a subject with two different conscious experiences must be aware of the feature that distinguishes them. Dretske quite rightly argues that it does not follow from having conscious experiences that differ that we are conscious of the difference between them, but he is wrong to assume that (2) entails that we must be. I further argued that the best interpretation of Lycan’s conclusion is a higher-order thought theory of consciousness, because the perceptual account is flawed and the fact that we can report our mental states gives us good reason to believe that we have higher-order thoughts.
