
The Self

In the introductory lecture, I noted that the existence of the self creates a qualitative difference between social and nonsocial cognition.

The self lies at the center of mental life. As William James (1890/1981, p. 221) noted in the Principles of Psychology,

Every thought tends to be part of a personal consciousness.... It seems as if the elementary psychic fact were not thought or this thought or that thought but my thought, every thought being owned.... On these terms the personal self rather than the thought might be treated as the immediate datum in psychology. The universal conscious fact is not "feelings and thoughts exist" but "I think" and "I feel"....



In other words, conscious experience requires that a particular kind of connection be made between the mental representation of some current or past event, and a mental representation of the self as the agent or patient, stimulus or experiencer, of that event (Kihlstrom, 1995). It follows from this position that in order to understand the vicissitudes of consciousness, and of mental life in general, we must also understand how we represent ourselves in our own minds, how that mental representation of self gets linked up with mental representations of ongoing experience, how that link is preserved in memory, and how it is lost, broken, set aside, and restored.

James went on to distinguish between three aspects of selfhood:

  • The material self -- the body and, by extension, one's clothes, possessions, family, and home.
  • The social self -- the self as one is viewed by others (James noted that a person has as many social selves as there are people who recognize him).
  • The spiritual self -- one's inner psychological faculties, dispositions, and states of consciousness.


My Avatar, My Selfie, My Shelfie, My Self?

To James's list we might now add one's avatar -- the image chosen to represent oneself in video and online games like World of Warcraft (no, I'm not a gamer, unless you count cribbage and backgammon).  An avatar, after all, is nothing less than a digital representation of yourself.  And if you configure a close enough resemblance, you're dealing with what's known as a digital doppelganger (Magid, 1998; Chmielewski, 2005).  And, it turns out, watching your digital doppelganger behave in a game environment can actually have effects on your self-concept.  There are studies to be done here (hint, hint).


  • Sherry Turkle has suggested that online gamers create avatars in order to explore options for themselves in the relatively safe environment of the game.
  • Nick Yee, a researcher with Ubisoft, a company that produces video games, claims that if players are given more attractive-looking avatars, they will play more confidently.  And if their own face has been blended in with another player's avatar, they will become more positive about that avatar.  For more, see his book, The Proteus Paradox (2014).
Not to mention:
  • Selfies -- pictures of oneself, taken by oneself, typically with a cellphone camera, and posted to social networking sites
    • Selfies became popular with the introduction of devices like the iPhone 4, with a front-facing camera, in 2010.
    • The word itself was designated the "word of the year" by Oxford Dictionaries in 2013.
  • And now shelfies -- or pictures of one's own possessions, also posted to social networking sites.  Shelfies get their name from their appearance: objects artfully arranged on a shelf, a tray, or some other horizontal surface (see "Me, My Shelfie*, and I" by Dale Hrabi, Wall Street Journal, 04/26-27/2014).
    • Note that there's an alternative definition of shelfie, offered by Elaine Showalter in her essay, "Rise of the Shelfie" (Chronicle of Higher Education, 05/23/14) -- a special form of the "reading memoir" in which the author selects a shelf of books, sometimes at random, and then writes a book about his or her reactions to reading them.  Examples are:
      • The Year of Reading Proust: A Memoir in Real Time (1997) and From LEQ to LES: Adventures in Extreme Reading (2014), both by Phyllis Rose (the latter really exemplifies this version of the "shelfie" concept).

James was onto something with his concept of the material self: our possessions, like our avatars, are expressions of our selves. I'm the kind of person who buys a Subaru, and I'm not the kind of person -- whatever that is -- who would buy a Lamborghini, even if I could afford it.  At least some of our possessions are, quite literally, expressions of ourselves -- what we might call the material culture of the self.

There are studies to be done here (hint, hint).

One possibility, if you could get around the obvious confidentiality problem, would be a study of people's passwords, and why they chose them.  Yes, your passwords are supposed to be random numbers, letters, and symbols; but they aren't.  Yes, you're supposed to use different passwords for different sites; but you don't.  Yes, you're supposed to change them all the time; but you never do.  And it's not just a matter of laziness, or the demands on memory.  It's that our passwords are part of us, they represent something about ourselves.  For more on "keepsake" passwords, see "The Secret Life of Passwords" by Ian Urbina, who coined the term, New York Times Magazine, 11/23/2014.  Some excerpts:

[T]here is more to passwords than their annoyance.  In our authorship of them, in the fact that we construct them so that we (and only we) will remember them, they take on secret lives.  Many of our passwords are suffused with pathos, mischief, sometimes even poetry.  Often they have rich back stories....  These keepsake passwords, as I came to call them, are like tchotchkes of our inner lives....  Like a tattoo on a private part of the body, they tend to be intimate, compact, and expressive.

***

I asked Andy Miah, a professor of science communication and digital media at the University of Salford in England, for his thoughts on passwords, and he offered an anthropological outlook.  Keepsake passwords, he suggested, ritualize a daily encounter with personal memories that often have no place else to be recalled.  We engage them more frequently and more actively than we do, say, the framed photo on our desk.  "You lose that ritual," Miah said, "you lose an intimacy with yourself."

The Covid-19 pandemic offered another opportunity to study the material culture of the self.  During the lockdown, with many people working from home and social distancing strongly encouraged, friends and coworkers communicated via Skype, Zoom, and similar platforms.  This circumstance allowed people to look inside one another's homes for the first time; and, accordingly, many people deliberately arranged the backgrounds of their videos to make a statement about themselves -- by displaying favorite books, mementoes, and the like.  Not only that: Room Rater, an account on Twitter, emerged with screenshots of various backgrounds, such as those appearing on news programs, accompanied by brief comments and a 1-10 rating.  The trend was noted by popular magazines such as House Beautiful ("Room Rater is the Best Thing to Come Out of Lockdown" by Hadley Keller, June 2020).

Developmental psychologists (and parents) note that object attachment begins very early in life (think of your childhood teddy bear); and while attachment to specific objects (like that teddy bear) may drop off in later childhood, our attachment to certain objects remains very strong throughout life.  Our possessions help define who we are.  Russell Belk, a specialist in consumer behavior, calls this the extended self (e.g., Belk & Tian, 2005). To some extent, we are what we own -- so much so that we can be traumatized when we lose our possessions, or when they are taken from us.

Certainly we are reluctant to give them up.  In a classic study, Kahneman and his colleagues (1990) gave college students coffee mugs embossed with their college logo, and then allowed them to trade them in a kind of experimental marketplace.  Interestingly, they found that the students were very reluctant to sell their mugs -- even though they hadn't owned one before the experiment, and they hadn't paid anything for them in the first place.  Selling prices were very high, and buying offers were very low.  This is known as the endowment effect -- a reflection of loss aversion.  The subjects simply didn't want to lose something that they now owned. 
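To see how loss aversion could produce this asymmetry between buying and selling prices, here is a minimal sketch in Python.  The loss-aversion coefficient of about 2.25 comes from Tversky and Kahneman's later estimates; the dollar value assigned to the mug is purely hypothetical, not a figure from the original study.

```python
# Minimal sketch of how loss aversion produces an endowment effect.
# The lambda coefficient (~2.25) is Tversky & Kahneman's (1992) estimate;
# the mug's subjective value is a hypothetical number for illustration.

LOSS_AVERSION = 2.25   # losses loom ~2.25x larger than equivalent gains
mug_value = 3.00       # hypothetical subjective value of the mug, in dollars

# A buyer weighs acquiring the mug as a potential gain:
willingness_to_pay = mug_value

# A seller weighs parting with the mug as a loss, so demands more:
willingness_to_accept = LOSS_AVERSION * mug_value

print(f"WTP (buyers):  ${willingness_to_pay:.2f}")
print(f"WTA (sellers): ${willingness_to_accept:.2f}")
# Selling prices sit well above buying offers, so few trades occur --
# the pattern Kahneman et al. (1990) observed in their mug market.
```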

The self is one of the most prominent topics in personality and social psychology, as indicated by the listings in the indexes of recent textbooks.  Along much the same lines, Thagard and Wood (2015) identified some 80 self-related phenomena.






Nevertheless, the nature of the self has long been a problem for psychologists -- one expressed plaintively by Gordon Allport (1961, p. 128):

This puzzling problem arises when we ask, "Who is the I who knows the bodily me, who has an image of myself and sense of identity over time, who knows that I have propriate strivings?"  I know all these things and, what is more, I know that I know them.  But who is it who has this perspectival grasp...?  It is much easier to feel the self than to define the self.




One way to define the self is simply as one's mental representation of oneself (Kihlstrom & Cantor, 1984) -- a knowledge structure that represents those attributes of oneself of which one is aware.  As such, the self contains information about a variety of attributes:




Cognitive psychologists (like Anderson, 1995) generally identify two forms of mental representation: meaning-based and perception-based (both are discussed in more detail below).





From this perspective, the self is a knowledge structure just like any other knowledge structure stored in memory.  As such, we can view the self through the same lenses as we use to analyze other knowledge structures: as a concept, as an image, and as a memory structure.

The Self as a Concept

Within personality and social psychology, the self-concept is commonly taken as synonymous with self-esteem, but within the social-intelligence framework on personality (Cantor & Kihlstrom, 1987) the self-concept can be construed simply as one's concept of oneself, a concept no different, in principle, than one's concept of bird or fish. From this perspective, the analysis of the self-concept can be based on what cognitive psychology has to say about the structure of concepts in general (Smith & Medin, 1981).


Attributes of the Self

As a first pass, we may define the self as a list of attributes that are characteristic of ourselves, and which serve to differentiate us from other people.

One possible way to assess the self-concept is simply to give people an adjective checklist and have them indicate the degree to which each item is self-descriptive.  But such a procedure does not distinguish between those attributes which are shared with other people, and those that are truly distinctive about oneself.  Nor does it distinguish those attributes that are trivial from those that are truly critical to one's self-concept.

For that reason, Hazel Markus (1977) introduced the notion of the self-schema.  In her research, she presented her subjects with the usual sort of adjective checklist, but asked them to make two different ratings for each item: how descriptive the attribute was of themselves, and how important it was to their self-concept.




Markus then classified subjects as self-schematic for a particular attribute (e.g., dependence-independence) if they rated that attribute as both extremely descriptive of themselves (whether high or low, dependent or independent) and extremely important to their self-concepts.  Subjects were classified as aschematic for that attribute if they gave it moderate ratings on self-descriptiveness, and low ratings on self-importance.

Markus' notion of "self-schematicity" (pardon the neologism) was an important advance in the assessment of the self-concept, but it was not entirely satisfactory.  As Robert Dworkin and his colleagues noted, her method essentially confounds self-descriptiveness and self-importance.  Subjects who have a moderate standing on a particular trait (e.g., midway between dependence and independence) must be classified as aschematic for this trait, even if their moderate standing is extremely important to their self-concept.  This doesn't seem right.  The implication of Dworkin's argument is that the only rating that really matters, so far as the self-concept is concerned, is self-importance.
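To make the two classification rules concrete, here is a hedged sketch in Python.  The 11-point scales and the particular cutoffs are illustrative assumptions, not the exact values used by Markus or Dworkin.

```python
# Hedged sketch of Markus's (1977) classification logic and Dworkin's
# alternative.  Scales (1..11) and cutoffs are illustrative assumptions.

def markus_classification(descriptiveness: int, importance: int) -> str:
    """Classify a subject on one attribute (both ratings on 1..11)."""
    extreme = descriptiveness <= 4 or descriptiveness >= 8   # either pole
    moderate = 5 <= descriptiveness <= 7
    if extreme and importance >= 8:
        return "schematic"
    if moderate and importance <= 4:
        return "aschematic"
    return "unclassified"

def dworkin_classification(descriptiveness: int, importance: int) -> str:
    """Dworkin's point: only importance matters; a moderate standing can
    still be central to the self-concept."""
    return "schematic" if importance >= 8 else "aschematic"

# A subject squarely between dependence and independence (rating 6) who
# considers that moderation central to who they are (importance 10):
print(markus_classification(6, 10))   # no "schematic" slot for them
print(dworkin_classification(6, 10))  # -> "schematic"
```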


Of course, even the fact that subjects give high self-importance ratings to some attribute doesn't mean that it's really part of their self-concept.  Strictly speaking, the self-concept should focus on how people naturally, spontaneously, think about themselves.  Following this logic, William McGuire and his colleagues put forward the notion of the spontaneous self-concept -- in which subjects define themselves in their own terms.  In his procedure, McGuire essentially presents subjects with a blank piece of paper with the instruction "Tell me about yourself".  This results in a free listing of attributes that is entirely idiographic in nature -- that is, subjects are not forced to use terms chosen by the experimenter, or terms that might also be used by other subjects.  They are allowed to define their self-concepts freely.

In one study of sixth graders, McGuire and his colleagues performed a content analysis of the spontaneous self-concept.  Interestingly, self-esteem played a relatively minor role, accounting for only about 7% of the attributes listed. Habitual activities, and other people, played a much larger role in the way these children thought about themselves.

 

 

In his research, McGuire has been particularly interested in testing what he has called the distinctiveness postulate: people spontaneously think of themselves in terms of those features that distinguish them from the other people around them.

The distinctiveness postulate flows from McGuire's view of the self-concept as information.  The informational value of the self-concept is high if it contains information about how we differ from other people -- what makes us relatively unique.

In testing the distinctiveness postulate, McGuire and his colleagues classified their subjects (again, sixth graders) as typical or atypical on various objectively measured attributes, such as age or birthplace.  Across a large variety of such attributes, they found that children were more likely to mention a feature if they were atypical on that feature with respect to their classmates.  For example, left-handed children are more likely to mention handedness in their spontaneous self-concepts than are right-handed children.

 

An interesting feature of distinctiveness is that "atypicality" is not defined in the abstract, with respect to population statistics, but rather concretely, with respect to the immediate social context.  This was shown clearly by McGuire's analysis of the appearance of sex or gender in subjects' spontaneous self-descriptions.  The population is roughly 50% male and 50% female, so in the abstract gender can't be atypical.  But it turned out that gender was mentioned more frequently by sixth-graders who were atypical for gender with respect to the distribution of the sexes in their classrooms or households.  A computational sketch of this context-relative notion follows.
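A minimal sketch in Python of the idea that salience tracks local rarity.  The link from atypicality to mention probability is an illustrative assumption, not a fitted model of McGuire's data.

```python
# Sketch of McGuire's distinctiveness idea: a feature's salience depends
# on how rare it is in the immediate social context, not the population.

from collections import Counter

def atypicality(my_value, context_values):
    """Proportion of people in the local context who do NOT share my value.
    Higher = more distinctive = more likely to be spontaneously mentioned."""
    counts = Counter(context_values)
    share = counts[my_value] / len(context_values)
    return 1.0 - share

# Handedness in a typical classroom: a left-hander is highly atypical.
classroom = ["right"] * 27 + ["left"] * 3
print(atypicality("left", classroom))   # 0.9 -> likely to mention handedness
print(atypicality("right", classroom))  # 0.1 -> unlikely to mention it

# Gender: ~50/50 in the abstract, but a boy in a mostly-female household
# is locally atypical -- which is where McGuire found the effect.
household = ["female", "female", "female", "male"]
print(atypicality("male", household))   # 0.75
```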

Thus, we can redefine distinctiveness for purposes of exploring the distinctiveness postulate further: an attribute is distinctive to the extent that it is rare in one's immediate social context, not in the population at large.

 

More on Distinctiveness

There's lots more research to be done on McGuire's distinctiveness postulate.  For example:

  • For reasons of convenience, McGuire and his colleagues studied mostly elementary-school children, and mostly sixth-graders.  There's no reason to think that older (or younger) subjects would perform any differently, but university-based personality and social psychologists would probably want to focus on college and university students.  Determining "typicality" in such a heterogeneous population, of course, is no mean task -- which is probably why McGuire and his colleagues worked with schoolchildren to begin with.
  • Again for reasons of convenience, McGuire and his colleagues focused their analyses on objectively measurable characteristics such as age, birthplace, and eye color.  It would be really interesting to extend their analyses to the sorts of psychosocial characteristics -- traits, attitudes, and values -- typically of interest to psychologists.  Of course, measuring "typicality" on these sorts of constructs wouldn't be easy.  Nor could we guarantee that a term like "extraversion", appearing on a personality assessment, would also appear in subjects' spontaneous self-descriptions.

But here's one idea:  One could employ an adjective checklist such as Markus used, and use the self-descriptiveness ratings as a basis for assessing typicality, and the self-importance ratings as a proxy for the spontaneous self-descriptions.  The prediction is that subjects are likely to be self-schematic for psychosocial attributes in which they describe themselves as atypical.  Anybody who wants to do such a study: YOU READ IT HERE FIRST!
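Purely as a sketch of the proposed analysis, here is what that study might look like in Python.  The data are simulated, the scales are assumed, and the positive correlation is built in just to show the prediction; none of this comes from an actual experiment.

```python
# Hedged sketch of the proposed study: estimate each subject's typicality
# on an attribute from self-descriptiveness ratings, then test whether
# atypicality predicts self-importance.  All data are simulated.

import random
import statistics

random.seed(1)

# Simulated self-descriptiveness ratings (1..11 scale) for one attribute,
# e.g. extraversion, across a sample of subjects.
ratings = [random.gauss(6, 2) for _ in range(200)]
mean, sd = statistics.mean(ratings), statistics.stdev(ratings)

def atypicality(rating):
    """Distance from the sample mean in SD units: the typicality proxy."""
    return abs(rating - mean) / sd

# The distinctiveness prediction: importance rises with atypicality.
# Here that relationship is faked, just to show the intended analysis.
importance = [min(11, max(1, 3 + 2.5 * atypicality(r) + random.gauss(0, 1)))
              for r in ratings]

# Pearson correlation (statistics.correlation requires Python 3.10+).
r = statistics.correlation([atypicality(x) for x in ratings], importance)
print(f"atypicality-importance correlation: {r:.2f}")  # positive, by design
```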


The Structure of the Self-Concept

From the time of Aristotle until quite recently, concepts were characterized as proper sets: summary descriptions of entire classes of objects in terms of defining features which were singly necessary and jointly sufficient to identify an object as an instance of a category. Thus, the category birds includes warm-blooded vertebrates with feathers and wings, while the category fish includes cold-blooded vertebrates with scales and fins.

 

The Self-Concept as a Proper Set

In principle, at least, a classical proper-set structure could be applied to the self-concept.  Thus, the self-concept could be a summary description of oneself, whose defining features consist of those attributes that are singly necessary and jointly sufficient to distinguish oneself from all others.  This is possible in principle, but in practice it seems not terribly useful, as defining features would probably be restricted to the individual's birth date, place, and time, and the names of his or her parents.

 

The Self-Concept as a Prototype

But both philosophical considerations and the results of experiments in cognitive psychology have persuaded us not to think about concepts in terms of proper sets and defining features, but rather in terms of family resemblance, in which category members tend to share certain attributes, but there are no defining features as such. According to this view of categories as fuzzy sets, category instances are summarized in terms of a category prototype which possesses many, but not necessarily all, of the features which are characteristic of category membership.

In the late 1970s and early 1980s the idea that the self, too, was represented as a category prototype was popular, and some very interesting experiments were done based on this assumption (Rogers, 1981). But category prototypes are abstracted over many category instances.  How does one talk about family resemblance, or abstract a prototype, when there is only one member of the category -- oneself? The notion of self-as-prototype, taken literally, seems to imply that the self is not unitary or monolithic. We do not have just one self: rather, each of us must have several different selves, the characteristic features of which are represented in the self-as-prototype.

In fact, the idea that we have a multiplicity of selves can be traced to the very beginnings of social psychology.


Cooley (1902) defined the looking-glass self:

The self consists of whatever attributes are associated with first-person pronouns....  Each person possesses as many selves as there are significant others in his or her social environment.
The "looking-glass self" obviously has its roots in James' concept of the social self -- the self as one is viewed by others. 

Mead (1934) discussed the self in symbolic interactionism (an early version of social cognition):

A person has as many selves as there are social roles for him or her to play.


Taken to the extreme, the self-concept as a set of exemplars is exemplified (sorry) by dissociative identity disorder, formerly known as multiple personality disorder.  In this exceedingly rare psychiatric syndrome, the patient possesses two or more different identities, each associated with a different set of autobiographical memories.  The different identities, in turn, are separated by an interpersonality amnesia, which is typically asymmetrical.  For example, in the famous case of the Three Faces of Eve, there were three different "personalities", or identities -- Eve White, Eve Black, and Jane -- within a single Georgia housewife.

Link to a collection of articles on dissociative identity disorder and other dissociative disorders affecting memory and identity.

But one does not have to have a dissociative disorder to have a multiplicity of selves. Traditional personologists assume that behavior is broadly stable over time and consistent over space, and that this stability and consistency reflect traits which lie at the core of personality. Viewed cognitively, the self might be viewed as the mental representation of this core. But social psychologists have argued that behavior is extremely flexible, varying widely across time and place. 

Accordingly, a person might have a multiplicity of context-specific selves, representing what we are like in various situations, and reflecting our awareness of the contextual variability of our own behavior.  For example:

If so, then the self-concept should represent this context-specific variability, so that each of us possesses a repertoire of context-specific self-concepts -- a sense of what we are like in different classes of situations (Kihlstrom et al. 1995). The self-as-prototype might be abstracted from these contextual selves. Thus, we might begin to think about a hierarchy of selves, with more or less context-specific selves at lower levels, and a very abstract prototypical self at the highest level.



The Self-Concept as a Set of Exemplars

This is all well and good, but maybe there is not a prototype after all. Another trend in cognitive psychology has been to abandon the notion entirely that concepts are summary descriptions of category members (Medin, 1989). Rather, according to the exemplar view of categories, concepts are only a collection of instances, related to each other by family resemblance perhaps, and with some instances being in some sense more typical than others, but lacking a unifying prototype at the highest level. Some very clever experiments have lent support to the exemplar view, but as yet it has not found its way into research on the self-concept. Nevertheless, the general idea of the exemplar-based self-concept is the same as that of the context-specific self, only lacking hierarchical organization or any summary prototype.

Regardless of whether this "family" of context-specific selves is united by a summary prototype, or exists simply as a set of exemplars, it is possible for these multiple selves to come into conflict -- as when, speaking metaphorically, the angel on your right shoulder tells you to do one thing, and the devil on your left shoulder tells you to do something else.  T.C. Schelling, an economist, has suggested that these alternate selves can work against each other in a process he calls self-binding: one of your selves wants to eat that Little Debbie, the other wants to stay thin, and they duke it out in your head.


The Self-Concept as a Theory

The three views of categorization presented so far -- proper sets, fuzzy sets, and exemplars -- all assume that the heart of categorization is the judgment of similarity. That is, instances are grouped together into categories because they are in some sense similar to each other. But similarity is not the only basis for categorization. It has been proposed that categorization is also based on one's theory of the domain in question; or, at least, that people's theories place constraints on the dimensions which enter into their similarity judgments (Medin, 1989). 


Application of this theory-driven view of categorization to the self was anticipated more than 20 years ago by Epstein (1973, p. 407), who argued that the self-concept is

"a theory that the individual has unwittingly constructed about himself as an experiencing, functioning individual, and... part of a broader theory which he holds with respect to his entire range of significant experience".

Except that there's no reason to think that this theory is constructed unwittingly.  It may, in fact, be constructed very wittingly indeed, as the person generates a theory of how he came to possess his characteristic features, and how they relate to each other.


Epstein's views have not been translated into programmatic experimental research on the self, but we can perhaps see examples of theory-based construals of the self in the variety of "recovery" movements in American society today (Kaminer, 1992). Whether we are healing our wounded inner child, freeing our inner hairy man, dealing with codependency issues, or coping with our status as an adult child of alcoholics or a survivor of child abuse, what links us to others, and literally constitutes our definition of ourselves, is not so much a set of attributes as a theory of how we got the way we are. And what makes us similar to other people of our kind is not so much that they resemble us but that they went through the same kind of formative process, and hold the same theory about themselves as we do of ourselves. Dysfunctional or not, it may well be that we all have theories -- we might call them origin myths -- about how we became what we are, and these theories are important parts of our self-concept. Such self-theories could give unity to our context-specific multiplicity of selves, explaining why we are one kind of person in one situation, and another kind of person in another (Kihlstrom et al., 1995).

The Self and Identity Politics

For many people, a major part of their self-concept concerns their identity with some sociodemographic group -- usually, though not always, a minority group or other outgroup.  Partly, this reflects McGuire's distinctiveness postulate -- that people include in their self-concepts those features that tend to distinguish them from others, which infrequent features do almost by definition.  But since the 1970s, as a result of the civil rights and feminist movements in the United States and elsewhere, group identity has also taken on a political dimension, as social and political activists, influenced by Marxist notions of class analysis and consciousness-raising, organized around group identities.  See, for example, the work of Arthur Schlesinger, Jr., a historian who argued in The Disuniting of America that identity politics threatened to destroy the common culture that was, in his view, necessary for liberal democracy to thrive.

Identity politics can infect majorities and ingroups as well, as we can see in the predominantly white, Euro-centric "Tea Party" movement that came on the scene in the wake of the 2008 financial crisis.

As a result of identity politics, terms like "black", "feminist", and "LGBT" (for Lesbian-Gay-Bisexual-Transgender) stand for political identities as well as for features of one's own personal self-identity.  Accordingly, one's identification with these political movements can become incorporated into one's self-concept -- and that identification, in turn, becomes an important element in how one is perceived by others.

A dramatic example of this came in 2010, when President Barack Obama filled out his form for the U.S. Census.  Viewed objectively, Obama is of mixed race, and (beginning with the 2000 census) he could have identified himself as such on his census form.  But he didn't -- he checked only the box for "Black, African-American, or Negro".  Obama was criticized for this in some quarters, on the view that such an act betrayed his self-presentation as someone who is "post-racial".  

It's objectively true, of course, that Obama is the product of a mixed marriage, with a white mother and a black father.  But that's not the way he subjectively identifies himself.  Despite being raised by his white maternal grandparents, Obama has always cultivated an identity as a black person -- as indicated clearly in his memoir, Dreams From My Father.  And, as he quipped to David Letterman, he "actually was black before the election".  What makes Obama "post-racial", if anything, is that he is able to express pride in his black heritage without making whites uncomfortable.

Transgender people who are objectively male can identify themselves as female ("I'm a woman trapped in a man's body").  

Mixed-race people can identify themselves as black -- or white.  Even more than gender, race is a social construction.

And, for that matter, there are cases of transracial identification -- not just light-skinned Blacks "passing" for white (as discussed in the lectures on Social Categorization), but also white individuals who identify themselves as Black.  An interesting example is that of Rachel Dolezal, a white woman who served as president of the Spokane, Washington chapter of the National Association for the Advancement of Colored People.  Now, there's no problem with a white person being a member, or even an officer, of the NAACP.  The issue is that Dolezal identified herself as black.  Both her parents are white, with no African ancestry; interestingly, they adopted two Black children, Izaiah and Ezra.  When her deception was discovered, Dolezal argued that, although she was "biologically born white", she had long considered herself to be Black.


A similar story pertains to Andrea Smith, a Native American Studies scholar and Indigenous-rights activist whose claim of Cherokee ancestry, based on family stories, is disputed by the Cherokee Nation itself ("The Native Scholar Who Wasn't" by Sarah Viren, New York Times Magazine, 05/30/2021).  Although repeated attempts to trace her ancestry have failed to turn up any Indigenous ancestors, and she is not enrolled as a legal citizen of the Cherokee Nation, according to Viren she continues to identify as Cherokee.  Interestingly, her sister Justine, another Native activist, claims both Cherokee and Ojibwe heritage.  Although Smith has made notable contributions as both a scholar and an activist, she has been accused of "ethnic fraud" (or, in more gentle terms, "playing Indian" -- a term coined by the Native American historian Philip J. Deloria in a book with that title).

It's important to note, though, that neither of these cases is a simple one of someone claiming minority-group status in order to take advantage of affirmative action.  Neither of these individuals simply "talked the talk".  They both also "walked the walk", as activists on behalf of their putative groups.  Their identities, as Black or Indigenous, were put into relevant action.

In large part because of the Dolezal case, the topic of transracialism -- being biologically "white" but identifying oneself as Black -- has begun to be discussed by both psychologists and philosophers (see, e.g., "In Defense of Transracialism" by Rebecca Tuvel, Hypatia, 2017).  If biology is not destiny when it comes to gender, and everyone agrees that race is to a large extent a social construction, then why can't people be transracial in the same way that they can be transgender?

One contrary argument is that race, while not a strictly biological concept, is an ancestral one.  That is, to a large extent being Black entails having parents, grandparents, or great-grandparents who were Black.  The point is particularly applied to African-Americans, whose ancestry includes the heritage of slavery.  For this reason, some scholars consider Michelle Obama, whose ancestors were slaves, to be African-American in a way that Barack Obama, whose ancestors remained in Africa, is not (these same scholars use "Black" as a superordinate term for anyone of African ancestry).  It is also why some African-Americans and Hispanic-Americans object when affirmative-action positions are offered to candidates from Africa or Latin America (or Spain or Portugal, for that matter) who do not have ancestors who lived in the United States, or who did not share a history of racial or ethnic discrimination.  On the other hand, a similar point could be made about individuals who are transgender: you can be "a woman trapped in a man's body", but as a man you did not suffer the kinds of discrimination faced by those who were always biologically female.  It's complicated, and it will be interesting to see how the debate plays out.

Anyway, the basic point is that the self-concept concerns how we identify ourselves -- how we perceive ourselves, not how others perceive us.  (We can incorporate others' perceptions of us into our self-concepts, but that is not at all the same thing.)

Another recent turn in identity politics concerns intersectionality, a term coined by Kimberle Crenshaw (University of Chicago Legal Forum, 1989), a legal scholar, to label the situation in which an individual is a member of two or more subordinated groups.  For example, an African-American woman might find herself torn between her identity as a woman (distinguishing herself from other black people) and her identity as a black person (distinguishing herself from other women), and identify herself instead as a black woman -- a different category entirely.  In Crenshaw's argument, black women constitute a "multiply-burdened" class.  Intersectionality isn't just a matter of a subordinate category (black women) inheriting the features associated with two superordinate categories (blacks and women); rather, it has special features that inhere in the intersection itself.  Put bluntly, black women confront issues that are different from those that confront either black people in general or women in general.  Although the issue of intersectionality originally arose out of discussions of feminist theory, it's easy to imagine other points of intersection -- for example, a gay African-American man -- where a person possesses the "markers" of two or more different minority identities.

For more on intersectionality, see Intersectionality: A History by Ange-Marie Hancock (2016); Intersectionality by Sirma Bilge and Patricia Hill Collins (2017); and Intersectionality: Essential Writings, edited by Crenshaw herself (2018).

For more on social identity as it relates to politics, see "The Identity Illusion" by Stephen Holmes, New York Review of Books, 01/17/2019.  In this essay, Holmes reviews Identity: The Demand for Dignity and the Politics of Resentment by Francis Fukuyama, a political scientist famous for having declared that the end of the Cold War was also "the end of history", and The Lies That Bind: Rethinking Identity -- Creed, Country, Color, Class, Culture by Kwame Anthony Appiah, a political philosopher (and contributor to the "Ethicist" column in the New York Times Magazine).  Both authors, one coming from the political right and the other from the political left, caution that identity politics can quickly devolve into clannishness and tribalism -- seen, for example, in the Trump-era debate in the United States over immigration and cultural diversity -- and make it harder for people in a culturally diverse society to live together.

Fukuyama, once (but no longer) a prominent neoconservative, puts the principal blame for this situation on the political left, whose emphasis on multiculturalism, a "cult of diversity", and policies of "positive discrimination" in favor of various minority groups led to a backlash of nativism among white working-class Americans (never mind that most of them are just a generation or two away from being immigrants themselves).

Appiah, who has written on cosmopolitanism (including a wonderful book with that title) and is more sanguine about cultural diversity, presents a more balanced discussion of religious, national, racial, cultural, and class identity.  He argues that all forms of identity politics share the same basic flaw: the assumption of essentialism -- that there is an "inner something" shared by all members of an identity group which makes that group cohere and distinguishes it from other groups of its type.  So, for example, there would be something essential about being female (or male), black (or white or Asian), etc.  Instead, Appiah wants us to think of identity as little more than a label -- not as a representation of something real, but simply as a way of grouping people, with full acknowledgment that any such label oversimplifies a complex underlying reality.  If Appiah were a psychologist, he'd put it this way: group differences (as, for example, between men's and women's math abilities) are not as large as we think they are, and there is more variability within groups than there is between them.



"Person-First" or "Identity-First"?

Still, how others perceive us, and label us, is important.  This issue has come to the fore with the "disability rights" movement, and the objection of people who have various disabilities to being identified with their disabilities (a similar issue has been raised in racial, ethnic, and sexual minority communities as well).

One important question is how to refer to people with various disabilities.  Put bluntly, should we say that "Jack is a blind person" or "Jack is a person who is blind"?  Or substitute any label, including black, Irish, gay, or schizophrenic.

Dunn and Andrews (American Psychologist, 2015) have traced the evolution of models for conceptualizing disability -- some of which also apply to other ways of categorizing ourselves and others.  The current debate offers two main choices:

  • A "person-first" approach -- as in, "Jack is a person with a disability".  In this social model (Wright, 1991), disability is presented "as a neutral characteristic or attribute, not a medical problem requiring a cure, and not a representation of moral failing" (p. 258) -- or, it might also be said, as a chronic condition requiring rehabilitation.  Instead, disability itself is seen as a sort of social construction -- or, at least, a social categorization.
  • An "identity-first" approach -- as in, "Jack is a disabled person".  While this might seem a step backward, this minority model (Olkin & Peldger, 2003) "portrays disability as a neutral, or even positive, as well as natural characteristic of human attribute" (p. 259).  Put another way, disability confers minority -group status: it connotes disabled people, with their own culture, living "in a world designed for nondisabled people".

So it all depends on how you think about minority-group status -- that of other people, if you're the member of the majority; or your own, if you're a member of the minority (any minority). 


The Self as an Image

Our discussion of the self as concept and as story illustrates a strategy that we have found particularly useful in our work: beginning with some fairly informal, folk-psychological notion of the self-concept, we see what happens when we apply our technical understanding of what that form of self-knowledge looks like. Much the same attitude (or, if you will, heuristic) can be applied to another piece of ordinary language: the self-image.


Schilder (1938, p. 11) defined the self-image as
"The picture of our own body which we form in our mind, that is to say, the way in which the body appears to ourselves".
What follows from this?


First, there is the question of whether, in talking about our mental images of ourselves, we should be talking about mental images at all. Beginning in the 1970s, a fairly intense debate raged about the way in which knowledge is stored in the mind. At this point, most cognitive psychologists are comfortable distinguishing between two forms of mental representation: meaning-based and perception-based (Anderson, 1983). 

Perception-based representations represent the physical appearance of an object or event -- including the spatial relations among objects and features (up/down, left/right, back/front), and the temporal relations among objects and features (before/after).  Perception-based representations are analog representations, comprising our "mental images" of things.

Meaning-based representations store knowledge about the semantic relations among objects, features, and events, that is abstracted from perceptual detail, such as their meaning and category relations.  Meaning-based representations take the form of propositions -- primitive sentence-like units of meaning which omit concrete perceptual details. 

The self-concept is a meaning-based representation of oneself -- regardless of whether it is organized as a proper set, a fuzzy set, a set of exemplars, or a theory.  The self-as-story is also meaning-based.

The self-image is a perception-based representation of the self, which stores knowledge about our physical appearance.  

 

Perception-Based Representations of Other People

Perception-based representations are relatively unstudied in social cognition, but it is quite clear that we have them.

That we have perception-based representations of others makes it more likely that we have perception-based representations of ourselves as well. In fact, Head (1926) coined the term body schema to refer to our postural models of our own bodies -- models which allow us to maintain stability and adjust to our environment; models which are distorted in the classical experiments on prism adaptation. The fact that we can adjust our movements when our vision is distorted, and that these adjustments persist when objective stimulus conditions change, indicates that we have internal representations of our bodies, and their parts, which are independent of sensory stimulation.

 


Perception-Based Representations of Ourselves

In the laboratory, studies of the self-image qua image are very rare. One exception is a fascinating study by Mita, Dermer, and Knight (1977) on the mere exposure effect (Zajonc, 1968), in which subjects view a series of unfamiliar objects (e.g., nonsense polygons or Turkish words), and later make preference ratings of these same objects and others which had not been previously presented. On average, old objects tend to be preferred to new ones, and the extent of preference is correlated with the number of prior exposures. In Mita et al.'s (1977) experiment, subjects were presented with pairs of head-and-shoulder photographs of themselves and their friends, and asked which one they preferred. In each pair, one photo was the original, and the other was a left-right reversal.

Some sense of the procedure in Mita et al.'s experiment is given by considering these two photographs of Marilyn Monroe.  The left-hand photo is a true photograph of the actress, with the conspicuous beauty mark on her left cheek.  The right-hand photo is mirror-reversed, so that the beauty mark appears on her right cheek.  According to Zajonc's prediction, Monroe herself, if she were given the choice, would prefer the reversed photo on the right, because it reflects (sorry) the way she would see herself in the mirror.  But other people would prefer the original photo on the left, because that is the way they have seen her in movies, on television, and photographs.

 

The result was as predicted.  When viewing photos of their friends, subjects preferred the original view (that is, the view as seen through the lens of the camera); but when viewing photos of themselves, the same subjects preferred the left-right reversal (that is, the view as would be seen in a mirror). Thus, our preferences for pictures match the way we typically view ourselves and others. Mita took this as evidence for the mere exposure effect, which it is; but it is also evidence that we possess a highly differentiated self-image which preserves information about both visual details and the spatial relations among them.
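As a sketch of the logic of the analysis in a Mita-style experiment (the counts below are invented for illustration, not Mita et al.'s data):

```python
# Tally which member of each photo pair (true vs. mirror-reversed)
# subjects prefer, separately for self and friend targets.

def prefer_mirror_rate(choices):
    """choices: list of 'mirror' or 'true' picks for one target type."""
    return choices.count("mirror") / len(choices)

self_choices   = ["mirror"] * 20 + ["true"] * 8    # hypothetical data
friend_choices = ["mirror"] * 9  + ["true"] * 19

print(f"self:   {prefer_mirror_rate(self_choices):.2f} prefer mirror view")
print(f"friend: {prefer_mirror_rate(friend_choices):.2f} prefer mirror view")
# The crossover -- mirror view preferred only for one's own face -- is
# the signature of mere exposure operating on the self-image.
```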

 


Beyond the Face: Body Image

The Mita et al. study demonstrates clearly that we possess a highly detailed image of our own (and others') faces, but the point probably applies to the rest of our bodies as well.

A number of psychometric instruments have been devised for assessing aspects of the body image.


Draw-a-Person Test

Historically speaking, perhaps the most popular assessment method has been the Goodenough-Harris Draw-a-Person Test, in which the subject is asked to draw pictures of a man, a woman, and other figures.  The traditional assumption behind this projective technique is that the subject will project his own body image onto the drawings of other people.  But this is a dubious assumption, unproven at best.  Moreover, the assessment relies heavily on the unwarranted assumption that people can draw well (just try, in the privacy of your own home, to draw a picture of a man and a woman; then shred the results before any of your friends can see the products of your efforts!).

The DAP, like most projective tests, is a pretty crummy psychometric instrument, pretty much lacking anything resembling standardization, norms, reliability, or validity.

  

Body-Image Aberration Scale

Loren and Jean Chapman, two prominent schizophrenia researchers, developed a questionnaire method, the Body-Image Aberration Scale, for assessing body image aberrations (hence the title) in patients with schizophrenia and other psychoses, and in individuals hypothetically at risk for schizophrenia.  The scale consists of a number of subscales:

Unfortunately, this assessment of body image is entirely verbal, which is pretty much inconsistent with the goal of assessing perception-based representations of the self.

Note to readers prone to medical-student syndrome.  The BIAS is actually not a particularly good predictor of later psychosis (it was an interesting idea that didn't really work out -- not least because, as we'll see a little later, body-image aberrations don't seem to be characteristic of schizophrenia), so don't worry too much if you said "yes" to most or all of the sample items given above.  For details, see Chapman, Chapman, & Raulin, Journal of Abnormal Psychology, 1978.

 

Body-Image Assessment

By far the most popular instrument used in research, this consists of line drawings of males and females, clad in swimsuits, ranging from thin to not-so-thin.  The drawings are connected by a continuous scale, on which subjects indicate their current and their ideal body shapes.

In one study by Fallon & Rozin (1985), college student women showed a greater discrepancy between their current and ideal body shapes than did college student men.

 

 


A "generational" study by Rozin & Fallon (1988) compared college men and women to their mothers and fathers.  On average, mothers showed larger Current-Ideal discrepancies than their daughters -- and the fathers even more so!
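The discrepancy score behind both studies is simple.  Here is a sketch in Python; the 1-9 figure scale and the group means are invented for illustration, not taken from the published data.

```python
# Sketch of the Current-Ideal discrepancy score used with the
# Body-Image Assessment.  Ratings are positions on the thin-to-heavy
# scale of line drawings; the numbers below are hypothetical.

def current_ideal_discrepancy(current: float, ideal: float) -> float:
    """Positive = sees self as heavier than ideal (the typical direction)."""
    return current - ideal

# Hypothetical group means on an assumed 1-9 figure scale:
groups = {"college women": (5.6, 4.2),
          "college men":   (5.0, 4.9),
          "mothers":       (6.2, 4.4)}

for group, (current, ideal) in groups.items():
    print(f"{group}: discrepancy = "
          f"{current_ideal_discrepancy(current, ideal):+.1f}")
```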

 

  

Clinical Anomalies of the Self-Image

As with the self-concept, the self-image can be illustrated with clinical data. 

Patients in the acute stage of schizophrenia often complain of distortions in their perception of their own bodies. 

Some classic studies of body image in acute schizophrenia employed adjustable mirrors, such as used to be found (perhaps they still are) in amusement-park "fun houses".  These mirrors can be bent in three planes to produce distorted reflections of the person.  In a series of studies by Traub, Orbach, and their colleagues, subjects were presented with distorted reflections of themselves, produced by bending certain portions of the mirror concave or convex, and were asked to adjust the mirror until they looked right.


The general finding was that the patients' mirror images remained distorted even after adjustment -- suggesting that schizophrenics really did have distorted body images.  But then the investigators made the "mistake" of running a control condition, in which a rectangular picture frame was placed in the mirror, and the subjects were asked to adjust that image until it appeared normal.  The schizophrenics showed an aberration in perception of the frame, as well, suggesting that their perceptual aberration was not limited to their body image.

 

Much the same procedure can be used with computer "morphing" software.

In autotopagnosia (also known as body-image agnosia or somatotopagnosia), a neurological syndrome associated with focal lesions in the left parietal lobe, the patient can name body parts touched by the examiner, but cannot localize body parts on demand. 

In phantom limb pain, amputees perceive their lost arms and legs as if they were still there. 

In body dysmorphic disorder, the patient complains of bodily defects where there really aren't any. 

In eating disorders such as anorexia and bulimia, the sufferer sees fat where the body is objectively normal, lean, or even gaunt.

But, outside of the Traub-Orbach studies of body image in schizophrenia, little of this clinical folklore has been studied experimentally.


Body Image in Eating Disorders

One exception is in the study of eating-disordered women (eating-disordered men haven't been studied much, though they do exist).

A study by Zellner et al. (1989), using the BIA, found that eating-disordered women showed a bigger current-ideal discrepancy than non-eating-disordered women, or males in a comparison group.

 

 

Another BIA study, by Williamson et al. (1989), of women with bulimia, a special form of eating disorder, confirmed this difference.  Even when women with bulimia were statistically matched with non-bulimic women on actual weight, the bulimic women showed a much greater current-ideal discrepancy than the normals, and gave higher ratings of their current body size as well -- indicating not just that they have an exaggerated ideal body image, but that they have an exaggerated current body image as well.


Perhaps inspired by the Traub-Orbach studies with the adjustable-mirror paradigm, more recent investigators have made use of computer software for image morphing to study body image in normal and eating-disordered individuals.

One such study employed the Body-Image Assessment Software developed at the University of Barcelona by Letosa-Porta, Ferrer-Garcia, and their colleagues (2005).  They take numerous measurements of subjects' bodies, and use these to create a computer "avatar" that mimics the shape of the subject's body.  The subject is then asked to modify the image so that it corresponds to his/her real and ideal body images.  The discrepancy between the original, objective avatar and the subject's real body image is taken as a measure of perceptual distortion; the discrepancy between the subject's real and ideal body images is taken as a measure of body dissatisfaction.  A later study by Ferrer-Garcia et al. (2008) showed that women with a diagnosis of eating disorder, or at risk for ED, scored higher on both measures than women who were not at risk.
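A sketch of the two scores just described.  Treating each body as a short vector of measurements, and the particular measurements and numbers used here, are assumptions for illustration, not the software's actual internals.

```python
# Sketch of perceptual distortion (objective body vs. the avatar the
# subject adjusts to look 'real') and body dissatisfaction (the
# subject's 'real' vs. 'ideal' adjustments), as mean absolute difference.

def mean_abs_diff(a, b):
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

objective      = [92.0, 71.0, 98.0]    # e.g., bust/waist/hip, in cm
adjusted_real  = [96.0, 76.0, 103.0]   # avatar as subject thinks she looks
adjusted_ideal = [88.0, 66.0, 94.0]    # avatar as subject wants to look

perceptual_distortion = mean_abs_diff(objective, adjusted_real)
dissatisfaction       = mean_abs_diff(adjusted_real, adjusted_ideal)
print(f"perceptual distortion: {perceptual_distortion:.1f} cm")
print(f"body dissatisfaction:  {dissatisfaction:.1f} cm")
```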

Another approach employs the Adolescent Body Morphing Tool developed by Aleong et al. (2007) at the University of Montreal.  They first developed an Adolescent Body-Shape Database from front and side photographs of 160 male and female Canadian adolescents, who were photographed while dressed in a body suit and ski mask (to protect anonymity).  Then trained judges applied virtual tags to various points on the body image.  A factor analysis of various measurements yielded a multidimensional representation of the "average" adolescent body.  The resulting images can be morphed by increasing or decreasing the size of these dimensions by a certain percentage.


In a later study, Aleong et al. (2009) employed height, weight, and body-mass index to match each of 182 normal (i.e., non-eating-disordered) males and females to one of the images stored in the Adolescent Body-Shape Database, and then distorted the image, especially around the hips, thighs, and calves.  The subjects were then asked to return the image to "normal".  In psychophysical terms, the point of subjective equality, reached when the subject believed that the image had been returned to normal, is a measure of the accuracy of the body image: females were less accurate than males, but only with the side image.  The difference limen measured how much morphing was required for the subject to detect a difference from normal: females were more sensitive than males -- again, especially when viewing side images.
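To make the two psychophysical measures concrete, here is a sketch in Python using method-of-adjustment data.  The settings are invented, and the probable-error formula for the difference limen is one classical convention, not necessarily the authors' exact computation.

```python
# Sketch of the point of subjective equality (PSE) and difference
# limen (DL) from method-of-adjustment data.  Settings = percent morph
# remaining when the subject says the image is back to 'normal'.

import statistics

settings = [2.1, -1.4, 3.8, 0.9, 2.6, -0.3, 1.7, 2.2]  # % distortion left

# PSE: where the image subjectively looks normal.  Its distance from 0
# indexes the (in)accuracy of the body image (the constant error).
pse = statistics.mean(settings)

# DL: how much morphing is needed to notice a change; classically
# estimated as 0.6745 * SD of the settings (the 'probable error').
dl = 0.6745 * statistics.stdev(settings)

print(f"PSE = {pse:+.2f}%  (constant error of the body image)")
print(f"DL  = {dl:.2f}%   (sensitivity to distortion; smaller = keener)")
```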


But you don't have to suffer from mental illness in order to have a self-image that is wildly discrepant from objective reality. A clever demonstration of this took the form of the "Dove Beauty Sketches", an advertising campaign mounted by Dove, a unit of Unilever that makes a popular brand of bath soap, in 2013. Dove commissioned Gil Zamora, a forensic artist who has worked with the San Jose Police Department and the FBI, who prepared sketches of ordinary women (i.e., not movie stars or other celebrities) from their self-descriptions (the women sat behind a screen, so that Zamora was able to work only from their verbal descriptions). Then Zamora prepared another sketch, of the same women, based on verbal descriptions of them by a randomly selected perceiver. When the sketches were placed side by side, the sketch based on the observer's description was more attractive than the one based on the women's own descriptions. The moral of the exercise, according to Dove: "You're more beautiful than you think you are". ("Ad About Women's Self-Image Creates a Sensation" by Tanzina Vega, New York Times 04/19/2013). Link to a YouTube video of the Dove Real Beauty Sketches.


Virtual Reality and the Self-Image

How do we know what we look like?  We look in the mirror (which reverses left and right), and we look down toward our feet (which gives us a somewhat distorted view of what lies between), and we feel our hearts beating (but not our blood pressure).  Like video, only more so, new virtual-reality technology allows us to see ourselves as others see us, and also allows us to experience what it would be like to live in other bodies -- a process known as "VR embodiment".  VR embodiment also enables subjects to have "out of body experiences" in which the self (at least the conscious self) appears to leave the body.

For a discussion of the possibilities of VR embodiment, see "As Real As It Gets: Are We Already Living in Virtual Reality?" by Joshua Rothman, New Yorker, 04/02/2018, which discusses the work of Thomas Metzinger, Olaf Blanke, and Mel Slater and Mavi Sanchez-Vives, among others. In Being No One (2003) and other books, Metzinger, drawing on Philip Johnson-Laird's (1983) work on mental models in thinking, suggests that we carry around in our heads a number of images of our selves, some of which are based on direct sensory experience (like vision and audition), and some of which actually are "complex forms of virtual reality" produced by brain activity.



Interoception, Proprioception, and the Proto-Self

Two special senses contribute to the self-image:

  • Interoception provides information about the state of one's internal organs -- blood pressure, blood-glucose levels, salt concentrations, and the like.  Interoception is technically sensation, in the sense that it involves afference, or communication from the peripheral nervous system to the central nervous system, but interoception does not always give rise to conscious sensory experiences.
    • You know when you're hungry, but it's not because you're aware of your blood-sugar levels, or your caloric deficiencies.  You know you're hungry when you experience the stomach contractions known as hunger pangs.  But that's just a matter of feedback from the muscles in the stomach.  It's not really interoception -- though some writers call it that.
  • Proprioception refers to sensations concerning the position and motion of the body.  It consists of two special senses, each of which does give rise to conscious sensory experience.
    • Kinesthesis, or sensation of the motion of the body.
    • Equilibrium, the sensation of balance and position in space.

It's been suggested that difficulties with interoception are risk factors for eating disorders and body dysmorphic disorder.  That is, we can't sense blood pressure or cell-fluid levels directly, but we can appreciate them indirectly, from sensations of the heartbeat, hunger pangs, dryness of the lips, and so forth.

Antonio Damasio believes that the neural systems supporting interoception and proprioception constitute a primitive proto-self.  For details, see the following section on "The Self and Its Brain".


The Self as Memory


For most of its history the study of memory has been the study of verbal learning. And accordingly, many psychologists have come to think of memory as a set of words (or phrases or sentences), each representing a concept, joined to each other by associative links representing the relations between them, the whole kit and kaboodle forming an associative network of meaning-based knowledge (Anderson, 1983) -- Schank and Abelson's (1995) theory of knowledge as stories is explicitly opposed to this conventional thinking. It is also commonplace to distinguish between two broad types of verbal knowledge stored in memory (Tulving, 1983). Episodic memory is autobiographical memory for a person's concrete behaviors and experiences: each episode is associated with a unique location in space and time. Semantic memory is abstract, generic, context-free knowledge about the world. Almost by definition, episodic memory is part of the self-concept, because episodic memory is about the self: It is the record of the individual person's past experiences, thoughts, and actions. But semantic memory can also be about the self, recording information about physical and psychosocial traits of the sort that might be associated with the self-concept.

Within the verbal-learning tradition, knowledge about other people has been studied extensively in a line of research known as person memory (Hastie, Ostrom, Ebbesen, Wyer, Hamilton, & Carlston, 1980). Several different models of person memory have been proposed (Kihlstrom & Hastie, 1993), and some of these have been appropriated for the study of memory for one's self (Kihlstrom & Klein, 1994; Klein & Loftus, 1993). The simplest person-memory model is an associative network with labeled links. Each person (perhaps his or her name) is represented as a single node in the network, and knowledge about that person is represented as fanning out from that central node. The person-nodes are also connected to each other, to represent relationships among them, but that is another matter. The point is that in these sorts of models the various nodes are densely interconnected, so that each item of knowledge is associatively linked to lots of other items. In theory, the interconnections among nodes form the basis for associative priming effects, in which the presentation of one item facilitates the processing of an associatively related one.
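As a concrete illustration, here is a minimal sketch of such an associative network with labeled links -- a generic toy version of the idea, not any particular published model, with the "James Bartlett" facts invented for the example:

```python
# Toy associative network for person memory: each person is a node,
# and items of knowledge fan out from it over labeled links.

from collections import defaultdict

class AssociativeNetwork:
    def __init__(self):
        self.links = defaultdict(list)  # node -> [(label, neighbor), ...]

    def add_link(self, node, label, neighbor):
        self.links[node].append((label, neighbor))
        self.links[neighbor].append((label + "-of", node))  # backward link

    def fan(self, node):
        """Number of links fanning out of a node."""
        return len(self.links[node])

net = AssociativeNetwork()
net.add_link("James Bartlett", "trait", "intelligent")
net.add_link("James Bartlett", "behavior", "won the chess tournament")
print(net.fan("James Bartlett"))  # 2
```

In a model like this, associative priming amounts to residual activation on a neighbor node after its partner has been processed.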

Of course, knowledge about a person can build up pretty fast: consider how much we know about even our casual acquaintances. According to the spreading activation theory that underlies most associative network models of memory (Anderson, 1983), this creates a liability known as the fan effect: the more information you know about someone or something, the longer it takes to retrieve any particular item of information.
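To see why spreading activation produces a fan effect, suppose a fixed pool of activation at the source node is divided evenly among its outgoing links. A back-of-the-envelope sketch (the linear latency function and the constants are my simplification, not Anderson's exact equations):

```python
# Toy spreading-activation account of the fan effect: each fact gets
# less activation, and is verified more slowly, as fan grows.
# Constants are arbitrary illustration, not fitted parameters.

def verification_latency(fan, base_ms=500.0, scale_ms=100.0):
    activation_per_link = 1.0 / fan                   # activation is divided among links
    return base_ms + scale_ms / activation_per_link   # = base + scale * fan

for fan in (1, 2, 4, 8):
    print(f"fan = {fan}: {verification_latency(fan):.0f} ms")
```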

Is there any way around the fan effect in person memory? One possibility which has been suggested is that our knowledge about ourselves and others (especially those whom we know well) is organized in some way -- perhaps according to its trait implications. There is some evidence that organization does abolish the fan effect (Smith, Adams, & Schorr, 1978), but this evidence is rather controversial, and some have concluded that memory isn't really organized in this manner after all (Reder & Anderson, 1980). Nevertheless, a hierarchically organized memory structure is so sensible that many person-memory theorists, such as Hamilton and Ostrom, have adopted it anyway (Hastie et al., 1980; Kihlstrom & Hastie, 1993).

Can we generalize from person memory to the structure of memory for the self? Sure: the simplest expedient is simply to take a generic associative-network model of person memory, which has nodes representing knowledge about a person fanning out from a node representing that person him- or herself -- following the "James Bartlett" example discussed at length in the lecture supplement on Social Memory -- and substitute a "Self" node for the "Person" node.

Actually, there are three possibilities here, distinguished particularly with respect to the relations between behavioral and trait information, or episodic and semantic knowledge.

Research on person memory -- i.e., memory for other people -- suggests a model in which episodic knowledge is encoded independently of semantic knowledge -- or, put another way, in which knowledge of behaviors is represented separately from knowledge of traits.  If the self is a person just like any other, we would expect the representation of self in memory to have the same structure as memory representations of other people.

On the other hand, the self may be exceptional, in that memory for specific behavioral episodes might be organized by their trait implications.  In this model, nodes representing traits fan off the node representing the self, and nodes representing specific episodes which exemplify these traits fan off the trait-nodes.  This hierarchical model implies that retrieval has to pass through traits to access information about behaviors.  Thus, traits will be activated in the course of gaining access to information about behaviors.

Alternatively, Bem's self-perception theory denies that we retain any knowledge about our traits and attitudes (because we don't really have any traits or attitudes); when we are asked about our traits and attitudes, we infer what they might be from knowledge of relevant behaviors.  From this point of view, the self contains only episodic knowledge about experiences and behaviors; semantic knowledge about traits is known only indirectly, by inference.  One such inferential process would involve sampling autobiographical memory, and integrating the trait implications of the memories so retrieved.  In this computational model, retrieval must pass through behaviors in order to reach traits.  Put another way, nodes representing behaviors will be activated in the course of recovering -- or, put precisely, in the course of constructing -- information about traits.

An extensive series of studies by Klein and Loftus (1992) has produced a compelling comparative test of these models. These studies adapted for the study of the self the priming paradigm familiar from studies of language and memory, in which presentation of one item facilitates the processing of another, associatively related item. Subjects were presented with trait adjectives as probes, and performed one of three tasks. In the define task, they simply defined the word; in the describe task, they rated the degree to which the term described themselves; in the recall task, they remembered an incident in which they displayed behavior relevant to the trait. For each probe, two of these tasks were performed in sequence -- for example, describe might be followed by recall, or define by recall, or recall by describe. There were nine possible sequences, and the important data were the subjects' response latencies when asked the second question of each pair.

Because priming occurs as a function of overlap between the requirements of the initial task and the final task, systematic differences in response latencies will tell us whether activation passes through trait information on the way to behaviors, or vice-versa, or neither. When the two processing tasks were identical, there was a substantial repetition priming effect of the first one on the second. But when Klein and Loftus (1992) examined the effect of recall on describe, they saw no evidence of semantic priming compared to the effects of the neutral define task. Nor was there semantic priming when they examined the effect of describe on recall (again, compared to the effects of the neutral define task). Contrary to the hierarchical model, the retrieval of autobiographical memory does not automatically invoke trait information. And contrary to the self-perception model, retrieval of trait information does not automatically invoke memory for behavioral episodes. Because self-description and autobiographical retrieval do not prime each other, Klein and Loftus conclude that items of episodic and semantic self-knowledge must be represented independently of each other.
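To make the logic of this comparative test concrete, here is a small sketch -- my own illustration, not the authors' analysis code -- encoding what each model predicts about priming between the describe and recall tasks (relative to the neutral define baseline), and checking the observed null pattern against each:

```python
# Each entry: (does recall prime describe?, does describe prime recall?)
PREDICTIONS = {
    "independent trait and behavior knowledge":       (False, False),
    "hierarchical (behaviors filed under traits)":    (True, False),
    "computational (traits inferred from behaviors)": (False, True),
}

OBSERVED = (False, False)  # Klein & Loftus (1992): no priming in either direction

for model, prediction in PREDICTIONS.items():
    verdict = "consistent with the data" if prediction == OBSERVED else "ruled out"
    print(f"{model}: {verdict}")
```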


Self-Knowledge in Amnesia

Parallel findings have been obtained in case studies of amnesic patients' self-knowledge.  

The Case of K.C.  The first such study was, in fact, inspired by Klein's (Klein & Loftus, 1993) claim, based on his priming studies, that episodic self-knowledge is encoded independently of semantic self-knowledge.  In a commentary on Klein's paper, Tulving reported an experiment that he conducted with Patient K.C., who was rendered permanently amnesic as a result of a motorcycle accident at age 30.  K.C. is an especially interesting case of amnesia, because he has a complete anterograde and retrograde amnesia: he has no conscious recollection of anything he has done or experienced throughout his entire life, both before and after his accident.  Interestingly, K.C. also underwent a marked personality change as a result of his accident.  Whereas his "premorbid" personality was rather extraverted (he was injured riding a motorcycle, after all!), his "postmorbid" personality became rather introverted.  

Tulving was interested in whether K.C. had any knowledge of what he was like as a person, despite his dense amnesia in terms of explicit episodic memory.  Employing an adjective checklist supplied by Klein, Tulving asked K.C. and his mother to rate both their own and each other's personality.  K.C.'s self-ratings agreed well with his mother's ratings of him -- and, tellingly, they described his introverted postmorbid personality, even though he could consciously remember nothing he had ever done.



Of course, much of this apparent accuracy could be achieved if K.C. and his mother simply said good things about each other.  Accordingly, Tulving conducted a more rigorous test in which K.C. and his mother were asked to select the more descriptive adjective from pairs of adjectives that were matched for social desirability.  Even under these more stringent conditions, K.C.'s self-descriptions continued to converge with his mother's descriptions of him.



The Case of W.J.  Klein himself obtained similar findings from an 18 y/o college student who temporarily lost consciousness following a concussive blow to the head received when she fell out of bed during the second quarter of her freshman year (Klein, Loftus, & Kihlstrom, 1996).  Although a medical examination revealed no neurological abnormalities, W.J. did show an anterograde amnesia covering the 45 minutes that elapsed after her injury (very common in cases of concussion), as well as a traumatic retrograde amnesia that covered the preceding 6-7 months -- essentially, the entire period of time between her high-school graduation and her concussion (organic amnesias that are bounded by personally significant events are rare, but they do occur, and they are very intriguing).  This retrograde amnesia remitted over the next 11 days, but not before Klein was able to complete extensive memory and personality testing.  The personality testing was particularly interesting because W.J., like many college students, showed rather marked changes in personality as a college student, compared to what she had been like in high school.  

Memory testing of W.J. confirmed her retrograde amnesia.  Employing the Galton-Crovitz cued-recall technique, in which subjects are presented with a familiar word and asked to retrieve a related autobiographical memory, Klein et al. showed that she was much less likely than control subjects to produce memories from the most recent 12 months of her life, and much more likely to produce memories from earlier epochs.  After W.J.'s retrograde amnesia lifted, her performance on this task was highly similar to that of the controls.

Personality testing revealed a pattern of performance similar to that observed in patient K.C.: while still amnesic, W.J.'s self-ratings described what she was like in college -- the very period covered by her retrograde amnesia -- rather than reverting to her high-school personality.




Taken together, these neuropsychological case studies support the conclusions of Klein's priming studies of self-knowledge.  Amnesic patients, who lack autobiographical memory for their actions and experiences, nevertheless retain substantial knowledge of their own personalities.  This indicates that "semantic" trait information and "episodic" behavioral information are represented independently in memory.  

Semantic Self-Knowledge as Implicit Memory

Technically, the semantic self-knowledge preserved in these cases of amnesia accompanied by personality change probably reflects spared implicit memory for the patients' past personal experiences.

Another perspective on the relationship between memory and identity is provided by Encircling, a novel in three volumes by Carl Frode Tiller (2007; translated from the Norwegian by Barbara Haveland, 2015; reviewed in "The Possessed" by Clair Wills, New York Review of Books, 07/22/2021).  David, a Norwegian in his thirties who has lost most of his memory, places a newspaper advertisement asking people who knew him to write to him and fill in the gaps in his memory.  Vol. 1 covers his teenage years; Vol. 2, childhood; Vol. 3, young adulthood.  David's recent memories have been spared (so far), so in all there are nine different accounts of his life (as well as nine different perspectives on the recent social history of rural Norway).  The result is what Endel Tulving (and I) have called remembering by knowing -- abstract knowledge of the past, knowing the events of one's own life much as one can recount the major battles of the Civil War.  Because autobiographical memory is an important part of identity, David constructs (or maybe reconstructs) his self-concept at the same time.  On the jacket of Vol. 3, a blurb states that "Identity is not a monolith but a collage", built up of fragments.  On the other hand, David's therapist suggests that (quoting Wills quoting the book) "we are little more than 'situation-appropriate personas', with no coherent identity at all".  Wills herself has a different take: that Tiller "is not so much interested in how we are formed by the perspectives of other people as in how we are destroyed by them.  Again and again his characters battle to maintain a sense of self in their encounters with others, and again and again they lose the battle".

The Case of D.B.: Projecting Oneself into the Future

Another amnesic patient illustrates the same basic points.  Patient D.B. (Klein, Loftus, & Kihlstrom, 2002) was a 78-year-old retired man who went into cardiac arrest while playing basketball (this patient is not the D.B. of "blindsight" fame).  As a result, he experienced anoxic encephalopathy, or brain damage due to a lack of oxygen supply to the brain.  A CT scan revealed no neurological abnormalities, but upon recovery he displayed a pattern of memory loss similar to that of patient K.C.: a dense anterograde amnesia covering memories for episodes since his accident, plus a dense retrograde amnesia covering his entire life before the accident.  

D.B. was tested with both episodic and semantic versions of the Galton-Crovitz procedure.  When asked to recall cue-related personal experiences, D.B. showed a profound deficit compared to control subjects.  But when he was asked to recall cue-related historical events, his performance improved greatly.  Again, these results are consistent with the idea that amnesia affects episodic self-knowledge, but spares semantic knowledge.

In another part of the study, D.B. was asked to imagine the future as well as to remember the past, with respect to both personal experiences and historical events.  For example:



Compared to controls, D.B. displayed almost no knowledge of his past experiences, but also no ability to project himself into the future.  By contrast, he showed fairly good ability to recall public issues from the past, and to give reasonable predictions of issues that might emerge in the future.  

These findings on the experience of temporality are consistent with the conclusion that amnesia impairs episodic self-knowledge, but spares other forms of memory, including semantic knowledge about the self and semantic knowledge about the historical past.  But D.B.'s data also make the intriguing suggestion that the ability to imagine the future is related to the ability to remember the past.  Both are components of chronesthesia, or mental time travel -- defined by Tulving (2005) as the ability to project oneself, mentally, into the past and the future.

The Self as a Story: Autobiographical Memory

Both narrating the personal past and projecting the personal future entail storytelling -- which brings up yet another form of mental representation, knowledge as stories. Schank and Abelson (1995, p. 80) have asserted that "from the point of view of the social functions of knowledge, what people know consists almost exclusively of stories and the cognitive machinery necessary to understand, remember, and tell them". As they expand on the idea (p. 1):



  1. Virtually all human knowledge is based on stories constructed around past experiences;
  2. New experiences are interpreted in terms of old stories;
  3. The content of story memories depends on whether and how they are told to others, and these reconstituted memories form the basis of the individual's remembered self; and
  4. Shared story memories within social groups define particular social selves, which may bolster or compete with individual remembered selves.

Schank and Abelson concede that knowledge also can be represented as facts, beliefs, lexical items like words and numbers, and rule systems (like grammar), but they also argue that, when properly analyzed, non-story items of knowledge actually turn out to be stories after all; or, at least, they have stories behind them; or else, they turn out not to be knowledge (for example, they may constitute indexes used to organize and locate stories). Their important point is that from a functional standpoint, which considers how knowledge is used and communicated, knowledge tends to be represented as stories.

The idea of knowledge as stories, in turn, is congruent with Pennington and Hastie's (1993) story model of juror decision-making. According to Pennington and Hastie, jurors routinely organize the evidence presented to them into a story structure with initiating events, goals, actions, and consequences. According to Schank and Abelson (1995), each of us does the same sort of thing with the evidence of our own lives. From this point of view, the self consists of the stories we tell about ourselves -- stories which relate how we got where we are, and why, and what we have done, and what happened next. We rehearse these stories to ourselves to remind ourselves of who we are; we tell them to other people to encourage them to form particular impressions of ourselves; and we change the stories as our self-understanding, or our strategic self-presentation, changes. When stories aren't told, they tend to be forgotten -- a fact dramatically illustrated by Nelson's (1993) studies of the development of autobiographical memory in young children. Furthermore, when something new happens to us, the story we tell about it, to ourselves and to other people, reflects not only our understanding of the event, but our understanding of ourselves as participants in that event. Sometimes, stories are assimilated to our self-understanding; on other occasions, our self-understanding must accommodate itself to the new story. When this happens, the whole story of our life changes.

The twin ideas of self as memory, and of self as story, bring us inevitably to the topic of autobiographical memory (ABM) -- that is, a real person's memories for his or her own actions and experiences, as they occurred in the ordinary course of everyday living.


Memory and Identity

ABM is an obviously important aspect of the self, as it contains a record of the individual's own actions and experiences.

In his Essay Concerning Human Understanding (1690), the philosopher John Locke went so far as to assert that memory forms the basis for the individual's identity -- our sense that we are the same person now as we were at some previous time.  Locke's idea sounds reasonable, but other philosophers raised objections to his equation of identity with memory.



David Hume (in his Treatise of Human Nature, 1739), ever the radical empiricist (more radical than Locke, apparently), argued that the self, as an immaterial object, doesn't even exist.  But he agreed that the impression of identity is created by memory, including direct recollections of past experiences, as well as knowledge and beliefs about oneself based on inferences from one's memories.




For his part, the Scottish philosopher Thomas Reid argued (in "Of Mr. Locke's Account of Our Personal Identity", 1785) that the self was real enough, but that it couldn't be identified with memory.  He based his argument on what has come to be known as the Brave Officer Paradox:




Suppose a brave officer to have been flogged when a boy at school for robbing an orchard, to have taken a standard from the enemy in his first campaign, and to have been made a general in advanced life; suppose, also, which must be admitted to be possible, that, when he took the standard, he was conscious of his having been flogged at school, and that, when made a general, he was conscious of his taking the standard but had absolutely lost consciousness of the flogging.
Reid's point is a logical one, based on the principle of transitivity: if the general is the same person as the officer, and the officer is the same person as the boy, then the general must be the same person as the boy -- yet, on Locke's memory criterion, he is not, because the general does not remember the flogging.

That's right as a matter of logic, but it's not right psychologically.  Viewed objectively, of course, the identical person experienced the flogging, took the standard, and was made a general.  But viewed subjectively, from the Brave Officer's point of view, the flogging is not part of his identity, because it's not something that he remembers.  It's not something that he associates with himself -- or, perhaps, his self -- either as an episodic memory of a past experience or as a semantic memory about his personal history.  If the Brave Officer knew that he had been flogged, as a kind of semantic memory, even if he didn't remember the event itself as an episodic memory, that fact could still be part of his identity.  Reid didn't have the conceptual distinction between episodic and semantic memory, but the point still holds: as Reid stated the paradox, the general has no consciousness of the flogging at all; therefore, "having been flogged" isn't part of the Brave Officer's identity.

But now consider the more recent case of Millvina Dean -- who, before she died on May 31, 2009, was the last living survivor of the Titanic Disaster, on April 14, 1912, when the ship struck an iceberg on her maiden voyage and sank with great loss of life.  Among those lost were Dean's father, though her mother and older brother, then 2 years old, also survived.  Obviously, Dean had no memory of the sinking -- infantile and childhood amnesia took care of that; moreover, she didn't even learn about the sinking, or her involvement in it, until she was 8 years old.  But she did know that she was a survivor of the Titanic, and this knowledge was part of her identity.  This reminds us that the self, viewed as a memory, contains semantic as well as episodic self-knowledge; and also that autobiographical memory can be based on "knowing" as well as "remembering".
Note for Coincidence-Collectors: The day that Dean died, May 31, 2009, happened to be the 98th anniversary of the launching of the Titanic.  And her older brother died on April 14, 1992, which was the 80th anniversary of the shipwreck.  These are examples of what is sometimes called an anniversary reaction, in which people die on the anniversary of some important event in their lives.  The deaths of John Adams and Thomas Jefferson, who both died on July 4, 1826, the 50th anniversary of the signing of the Declaration of Independence, are two other examples.

For more about memory and identity, see "Memory and the Sense of Personal Identity" by Stanley B. Klein and Shaun Nichols (Mind, 2012).


Autobiographical Memory and Episodic Memory

Autobiographical memory is technically classified as episodic memory -- itself a subset of declarative memory, consisting of factual knowledge concerning events and experiences that have unique locations in space and time (two events can't occur at the same time in the same space).  Episodic memory is commonly studied with variations on the verbal-learning paradigm, which is explicitly intended as a laboratory analogue of autobiographical memory: each list, and each word on a list, constitutes a discrete event with a unique location in space and time.  And, as we'll see, ABM can be studied with variants on verbal-learning procedures.

ABM is episodic memory, as opposed to semantic memory or procedural knowledge, but ABM isn't just episodic memory -- there's more to it than a list of items studied at particular places and particular times (Kihlstrom, 2009).

I expand on these points below.


Self-Reference

Autobiographical memories are episodic memories, but they're not just episodic memories.  In an important essay, Alan Baddeley (1988) put his finger on the difference: "Autobiographical memory... is particularly concerned with the past as it relates to ourselves as persons" (p. 13).  To really qualify as autobiographical, a memory ought to have some auto in it, so that the self is really psychologically present -- in a way that it's not present in a memory like The hippie touched the debutante.

Taking a leaf from Fillmore's case-grammar approach to linguistics (Fillmore, 1968; see also Brown & Fish, 1983), it seems that in every autobiographical memory the self is represented in terms of its semantic role -- as the agent or patient, stimulus or experiencer, of the event in question.

The context in which the event occurred goes beyond time and place and other features of the external environment, and includes the person's internal mental state.  So autobiographical memory also represents the person's cognitive, motivational, and emotional state at the time of the event-- what the person was thinking at the time, what he or she wanted, and how he or she felt.  Although the person's emotional and motivational state provides important elements of context at the time of retrieval, the emotions present at the time the experience is initially encoded are also represented in memory, as evidenced by phenomena such as mood-congruent encoding and mood-dependent retrieval.

But what is self-reference reference to?  As discussed earlier, from a cognitive point of view, the self is, simply, one's mental representation of oneself -- no different, in principle, from the person's mental representation of other objects.  This mental representation can be variously construed as a concept (think of the "self-concept"), an image (now think of the "self-image"), or as a memory structure.  For present purposes, we can think of the self simply as a declarative knowledge structure that represents factual knowledge of oneself.  And, like all declarative knowledge, self-knowledge comes in two forms: meaning-based knowledge, which is abstract and verbalizable, and perception-based knowledge, such as the mental images that constitute the self-image.

Within the framework of a generic network model of memory (such as ACT), we can represent the self as a node in a memory network, with links to nodes representing other items of declarative self-knowledge.  Although this illustration focuses on verbal self-knowledge, it should be clear that the self, as a knowledge structure, contains both meaning-based and perception-based knowledge about the self.


Autobiographical memory can be of either kind.

For present purposes, we're going to focus on verbalized recollections.


Let's get concrete and see how a particular autobiographical memory would be represented in a generic associative network model of memory, such as John Anderson's ACT theory.  For this purpose, we'll take some passages from James's discussion of secondary (long-term) memory.



First, there is knowledge of the event itself.  The elementary concepts are themselves linked to other nodes representing related concepts.



Anderson's (1976) first statement of the ACT theory was built around a single sentence, A hippie touched a debutante. Of course, there was more to the book than that -- much more.
Then we add knowledge about the time and place in which the event occurred.

This act of touching took place at a particular place and time.
And finally, we add knowledge about the self as the agent or patient, stimulus or experiencer of the event.

The person himself saw the event occur (or he could be represented as the hippie, or she could be represented as the debutante).
In ACT, concepts are represented by nodes, and associative links represent the relationships between nodes, such as Subject (S), Object (O), and Relation (R).  
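Putting the three steps together, a schematic of the resulting structure might look like the sketch below. This is my own illustration of the general idea, not Anderson's notation, and the time and place values are invented:

```python
# ACT-style encoding of the example memory: a propositional core
# (Subject, Relation, Object), contextual links for time and place,
# and a link to the SELF node as experiencer of the event.

event = {
    "Subject": "hippie",
    "Relation": "touch",
    "Object": "debutante",
    "Time": "one afternoon in 1976",   # invented context
    "Place": "a park",                 # invented context
    "Experiencer": "SELF",             # the link that makes it autobiographical
}

# The SELF node, in turn, fans out to other items of self-knowledge,
# both episodic (events like the one above) and semantic (traits, etc.).
self_node = {"episodes": [event], "traits": ["curious"], "attributes": ["student"]}
```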


The Autobiographical Knowledge Base

Perhaps the most thorough cognitive analysis of ABM has been provided by Michael Conway and his colleagues (Conway, 1992, 1996; Conway & Pleydell-Pearce, 2000) in terms of what Conway calls the autobiographical knowledge base, which is presented in the form of a generic associative memory network (although without the operating computer simulation of ACT and similar formal models). 



In Conway's theory, individual ABMs are represented as nodes linked to various other elements in the network, and organized in various ways.  In particular, the knowledge base is organized hierarchically: lifetime periods (e.g., the college years) contain general events (e.g., repeated or extended activities), which in turn index event-specific knowledge -- the perceptual details of particular episodes.




What Conway calls the self-memory system reflects the conjunction of the autobiographical knowledge base with the working self.  This structure is analogous to working memory, and consists of an activated self-schema as well as current personal goals and current emotional state.

Conway and Pleydell-Pearce (2000) argue that ABMs are constructed (note the Bartlettian term) in two ways: through generative retrieval, in which the working self deliberately searches the knowledge base, elaborating cues until a satisfactory memory is assembled; and through direct retrieval, in which a highly specific cue spontaneously activates event-specific knowledge.


Temporal Organization of Memory

Autobiographical memory is not just about episodes, and it is not just about auto: it is also biographical.  It is not enough to construe autobiographical memory as memories of one's own experiences, thoughts, and actions, strung together arbitrarily as if they were items on a word-list.  Autobiographical memory is the story that the person tells about him- or herself -- or, at the very least, it is part of that story.  As such, we would expect autobiographical memory to have something like an Aristotelian plot structure (see his Poetics): an "arrangement of the incidents" into a chronological sequence.

A good example (if I may say so) is recall of the events of hypnosis, which typically begins at the beginning of the session and proceeds in more or less chronological succession to the end.  Kihlstrom (Kihlstrom & Evans, 1973; Kihlstrom, Evans, Orne, & Orne, 1980) found that the temporal sequencing of recall was disrupted during suggested posthypnotic amnesia.  The implication is that (1) episodic memories are marked with temporal tags; (2) episodic memories are organized according to the temporal relationships between them; and (3) the retrieval process enters the sequence at the beginning and follows it until the end.
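A minimal sketch of the retrieval scheme implied by these three points -- temporally tagged episodes, entry at the beginning of the sequence, and a forward pass to the end -- with the episodes invented for illustration:

```python
# Invented episodes from a hypothetical hypnosis session, each carrying
# a temporal tag; ordered recall enters at the beginning of the sequence
# and follows the tags forward to the end.

episodes = [("age-regression suggestion", 3),
            ("eyes closed on the target", 1),
            ("arm-levitation suggestion", 2),
            ("suggestion for posthypnotic amnesia", 4)]

def ordered_recall(episodes):
    return [event for event, tag in sorted(episodes, key=lambda e: e[1])]

print(ordered_recall(episodes))  # reported in chronological order
```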

It should be noted, however, that in the hypnotic case subjects are given a retrieval cue that specifies the beginning of the temporal sequence.  For example, the subject is asked to "recall everything that happened since you began looking at the target" (a fixation point used in the induction of hypnosis by eye closure).  If the instructions had been different -- for example, a request to recall "everything that happened while you were hypnotized" -- a somewhat different pattern of recall might be observed.

We can distinguish between two aspects of temporal organization: the internal organization of the elements within a single autobiographical memory, and the external organization of one autobiographical memory with respect to others.

For current purposes, we'll focus on external organization, or how one autobiographical memory is related to others.

Viewed across the lifespan, the distribution of memories shows a clear temporal gradient, summarized by Conway & Pleydell-Pearce (2000) as follows.  As a general rule, ABM shows the usual pattern of time-dependent forgetting: most ABMs are of very recent events, and the frequency of ABMs drops off progressively with the age of the memory.  This is just a recency effect, and needs no special explanation.  However, there are two somewhat unusual features of the distribution:

  • Infantile and childhood amnesia: people recall very few memories from the first several years of life.
  • The reminiscence bump: an elevated frequency of memories from roughly the second and third decades of life.
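As a rough quantitative gloss on the recency portion of the distribution (my illustration, not Conway's formulation), the frequency of word-cued memories is often well approximated by a power function of the age of the memory, of the sort Crovitz and Schiffman (1974) fit to their data:

```latex
% Frequency f of retrieved memories as a function of memory age t,
% with positive constants a and b estimated from the data:
f(t) = a \, t^{-b}
% equivalently, linear in log-log coordinates:
\log f(t) = \log a - b \log t
```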




Cascading Reminiscence Bumps in Memory for Music

Conway's explanation of the reminiscence bump is as good as anyone else has come up with, but it may not be quite right. Recently, Krumhansl and Zupnick (2013) found a peculiar pattern of reminiscence in memory for music.  They presented college students with 11 one-minute music clips, each containing extracts (actually, the choruses) of 10 "Top 100" popular songs from one of the 5-year periods spanning 1955-2009.  For each clip, the subjects were asked to report the percentage of individual songs they recognized.  They were also asked to rate the songs for likability, quality, and emotional response; and also to report whether they had any specific autobiographical memories for the songs they had heard.

  • The overall pattern of results showed a clear temporal gradient, such that memory was best for the clip containing the most recent songs ("Boom Boom Pow" by the Black Eyed Peas and "Poker Face" by Lady Gaga), and worst for the clip containing the oldest songs ("Cherry Pink and Apple Blossom White" by Perez Prado and "Rock Around the Clock" by Bill Haley & His Comets).  But remember, these subjects were college students, roughly 20 years old when the study was performed.
  • The recency effect was strongest, of course, for songs from the most recent periods of time.
  • Most interestingly, there was evidence of what might be called intergenerational transmission of reminiscence: clear "reminiscence bumps" for songs popular before the subjects were born.
    • One such bump occurs for music from the early 1980s, when the subjects' parents were in their 20s.
    • And there was yet another "reminiscence bump" for songs from the 1960s, when the subjects' parents were very young -- probably when their grandparents were in their 20s.
      • Alternatively, this bump might reflect the high quality of the popular music of that time, such that the songs were regarded as "classics".
  • Each of these intergenerational "bumps" was associated with an increase in personal memories associated with the music, and the older bumps were associated with feelings of nostalgia.
Although musical preference might be transmitted intergenerationally as a kind of semantic memory, this study revealed a clear episodic, and thus autobiographical, component, as the music in question evoked specific autobiographical memories.


Despite infantile and childhood amnesia and the reminiscence bump, the major feature of ABMs is the temporal gradient.  This is universally observed, no matter the means by which ABMs are sampled.

Employing the Crovitz-Robinson technique, in which memories are cued with common words, Robinson (1976) observed that the distribution of response latencies followed an inverted-U-shaped function.  Recall of very recent memories occurred relatively quickly; as the age of the memory increased, so did response latency.  The exception was the rather short latencies associated with very remote memories.  Robinson suggested that these very remote memories might be unrepresentative of ABMs in general: because they are highly salient, they are quickly retrieved.  Otherwise, the distribution suggests that the retrieval of ABMs follows a serial, backward, self-terminating search.  That is, the retrieval process begins with the most recent ABMs, and searches backward until it identifies the first memory that satisfies the requirements of the cue.
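Rendered as a sketch (my simplification, not Robinson's model), a serial, backward, self-terminating search looks like this; latency tracks the number of steps taken, i.e., the age of the retrieved memory, and the invented data stand in for a subject's dated memories:

```python
# Serial, backward, self-terminating search through autobiographical
# memory: start with the most recent memory, stop at the first match.

def backward_search(memories, cue):
    """memories: list of (days_ago, features) pairs."""
    steps = 0
    for days_ago, features in sorted(memories):  # most recent first
        steps += 1
        if cue in features:
            return days_ago, steps  # latency grows with steps taken
    return None, steps

memories = [(2,   {"coffee", "rain"}),
            (40,  {"concert", "friends"}),
            (400, {"graduation", "family"})]

print(backward_search(memories, "concert"))     # found on step 2
print(backward_search(memories, "graduation"))  # found on step 3: slower
```

The short latencies for salient remote memories would then reflect a different, faster route to retrieval, outside this serial search.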

The way to get around this problem is to present subjects with retrieval cues that constrain the time period from which the memory is to be retrieved.  This is what Chew (1979) did in an unpublished doctoral dissertation from Harvard University (see also Kihlstrom et al., 1988).  Chew found that cues with high imagery value produced shorter response latencies than did those with low imagery value, and that imagery value interacted with the epoch (remote or recent) from which the memory was drawn.  But still, memories drawn from the more recent epoch were associated with shorter response latencies than those drawn from the more remote one.  Moreover, within each epoch the distribution of memories was characterized by a reverse J-shape.  That is, fewer memories were retrieved from the early portions of each epoch.  These findings are consistent with the hypothesis that retrieval begins at the most recent end of a temporal period, and proceeds backwards in time until it reaches an appropriate memory.

At the same time, the idea that ABMs are organized along a single continuous chronological sequence seems unlikely to be true.  For one thing, it would be very inefficient: a single chronological chain might avoid the fan effect, but as subjects aged the memories would pile up, and a serial search through them would become extremely slow.  Rather, it seems likely that the chronological sequence of ABMs is organized into chunks representing smaller segments of time.  There's an analogy to the English alphabet.  Yes, it's a single temporal sequence, from A and B through L and M to X, Y, and Z.  But this large temporal string is also broken up into smaller chunks.

Consider, for example, a telephone keypad (or, for those old enough to remember, a rotary dial), which breaks the alphabet up into 3- or 4-letter strings corresponding to the numbers 2-9.




Or, better yet, "The Alphabet Song", sung almost universally by native English-speaking children as they learn their alphabet, which breaks the alphabet up into 6 or 7 chunks (depending on how you count).



Klahr et al. (1983), perhaps inspired by "The Alphabet Song", reported a study of an alphabetic search task that may well serve as a model for chunking in the temporal organization of autobiographical memory.  Klahr et al. presented their subjects with a randomly selected letter of the alphabet, and asked them simply to report the letter which immediately followed, or preceded, the probe.  Analysis of response latencies revealed a sawtooth pattern, suggesting that subjects searched through sub-strings, not the whole 26-letter string.

Based on these results, they developed a model of the alphabetic search task, implemented as ALPHA, an operating computer simulation.  In the model, the letters of the alphabet are represented in -- forgive me -- alphabetical order, but that ordered string is also subdivided into chunks roughly following the "Alphabet Song" learned by probably every English-speaking schoolchild.  In the model, the search enters the string at the beginning, but then branches until it finds the chunk, or subsidiary ordered string, which contains the probe item.  Search then proceeds through the chunk until it locates the probe.  ALPHA successfully mimicked the performance of real subjects on the alphabetic search task.
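Here is a minimal re-implementation of the chunked-search idea -- not the ALPHA code itself, and the chunk boundaries below follow one common way of singing the song, which may differ from Klahr et al.'s exact chunks:

```python
# Chunked alphabetic search: find the chunk containing the probe,
# then scan within the chunk for its successor.

CHUNKS = ["ABCD", "EFG", "HIJK", "LMNOP", "QRS", "TUV", "WXYZ"]

def next_letter(probe):
    steps = 0
    for chunk in CHUNKS:              # branch to the chunk holding the probe
        steps += 1
        if probe in chunk:
            i = chunk.index(probe)
            steps += i + 1            # scan within the chunk
            if i + 1 < len(chunk):
                return chunk[i + 1], steps
            j = CHUNKS.index(chunk)   # probe ends its chunk:
            return (CHUNKS[j + 1][0] if j + 1 < len(CHUNKS) else None), steps
    return None, steps

# Steps rise within a chunk and drop back at chunk boundaries,
# mimicking the sawtooth pattern in the human latencies.
for letter in "ABCDEFGH":
    print(letter, next_letter(letter))
```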


Along similar lines, Skowronski and his colleagues (2007) proposed that ABM is also organized into chunks.  In their experiments, subjects first listed and dated their ABMs for high school and college.  Then pairs of these memories were presented to subjects in a judgment of recency task -- in which, as its name implies, subjects are simply asked to say which of two remembered events was the more recent.  Skowronski et al. then divided the memories into various epochs: high school vs. college, year, quarter of the school year, freshman-sophomore vs. junior-senior, and academic year vs. summer.  On some trials, both memories were drawn from the same epoch; on other trials, the memories were drawn from different epochs.  Subjects were highly accurate on this task, making the correct choice 82.5% of the time.  When Skowronski et al. looked at response latencies, however, they found that, in general, response latencies were shorter for between-epoch judgments than for within-epoch judgments -- the exception being the division between the earlier and later years of high school or college.  They concluded that the chronological sequence of autobiographical memory was indeed divided into smaller temporal chunks, promoting a retrieval process not unlike that uncovered by Klahr et al. in the alphabetic search task.
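The retrieval logic Skowronski et al. propose can be sketched as follows (my illustration; the epoch labels and dates are invented): when two memories carry different epoch labels, their order can be read off the labels quickly, and only within an epoch is a slower, fine-grained comparison of dates required:

```python
# Judgment of recency with temporally chunked memories.

EPOCH_ORDER = ["high school: early", "high school: late",
               "college: early", "college: late"]

def judge_recency(mem_a, mem_b):
    """Each memory is (epoch, months_since_start); returns the more
    recent memory and a rough speed classification."""
    (ep_a, date_a), (ep_b, date_b) = mem_a, mem_b
    if ep_a != ep_b:  # between-epoch: compare chunk positions (fast)
        winner = mem_a if EPOCH_ORDER.index(ep_a) > EPOCH_ORDER.index(ep_b) else mem_b
        return winner, "fast (between-epoch)"
    # within-epoch: fall back on comparing dates (slow)
    return (mem_a if date_a > date_b else mem_b), "slow (within-epoch)"

print(judge_recency(("college: early", 20), ("high school: early", 5)))
```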

Although Skowronski et al. imposed the same chunking scheme on all of their subjects, it seems likely that there will be some idiosyncratic variation in this -- variation which, itself, reflects the vicissitudes of the life cycle.  A student who switched schools in the middle of high school might well, for example, differentiate between his freshman and sophomore years at Elmira Free Academy and his junior and senior years at Notre Dame.  A child whose parents divorced, or remarried, might use those events to mark out epochs in memory.

And for that matter, the markers might well be psychosocial in nature -- perhaps along the lines of Erik Erikson's "Eight Ages of Man".

The point here is that any temporal epochs in an individual's memory are likely to be individual, not universal, and themselves reflect his or her self-concept.  Put another way, the epochs which break up an individual's autobiographical memory are likely to be subjective, not objective, in character.  


The Plot Thickens: Causal Relations in Autobiographical Memory

Temporal organization is important in ABM, but it's not the whole story.  Autobiographical memories are episodic memories, but Aristotle argued (again in the Poetics) that purely episodic plots, in which the only thing binding individual episodes together is that they all involve the same person, are the worst kind of plots.  At the very least, in autobiographical memory, there ought to be some sense of beginning, middle, and end -- some sense of how individual episodes are related to each other in the flow of personal time.  The sequence of events makes a difference to their meaning.  In addition to a chronological ordering, then, the temporal organization of ABM may reflect the causal relations among events.  Of course, causes come before effects in time, but the point here is that the linkages among events recorded in ABM may not be merely chronological.  If the self is a story about oneself, then it's got to have all the elements of a real story.

Aristotle lists a number of causal relations that might be represented in a drama, and these might be recorded in ABM as well:

Along the same lines, but much more recently, David Pillemer (1998, 2001) provided a list of causal events in the individual's life story:






But plot does not simply involve a chronological organization of events: it also entails a causal organization of them -- an analysis of causality that makes a difference to the meaning of the events involved.





Memory and Character

Aristotle suggested that the events in a good drama should reveal something about character, and this is likely to be true of ABMs as well.  Not every remembered episode reveals our tragic flaws, not least because not every life is a tragedy: still, our memories say something about ourselves, and about the other people in the events we remember -- which is perhaps just another way of saying that they say something about us (McAdams, 1993).  This is a big subject, but in general we can distinguish two broad points of view.



Recollective Experience

So far, this discussion has concerned conscious autobiographical memory, raising the question of whether there are unconscious autobiographical memories as well.  Certainly implicit memories, which influence experience, thought, and action outside of conscious awareness and conscious control, are autobiographical in the narrow sense of being episodic (and thus declarative).  The studied item that gives rise to priming effects is, after all, an event in the subject's life.  But autobiographical memories are intrinsically self-referent, and implicit memories lack self-reference.  When I complete the stem ash--- with "ashcan" rather than "ashtray" because I read the former word on an earlier list of items, I am saying something about a word, but I am not saying something about myself.  The whole point of dissociations between explicit and implicit memory is that implicit memories represent an event in the objective past that is not part of autobiographical memory.  It follows, then, that autobiographical memory cannot be unconscious.  The rare exceptions that test this rule are found mostly in the functional amnesias of the dissociative disorders -- which are very special cases indeed.

Conscious recollection, in turn, comes in many forms.  Tulving (1985) distinguished between two primary varieties of recollective experience: remembering, which entails one's concrete awareness of oneself in the past, and knowing, or a more abstract knowledge of the past.  I clearly remember swimming across Lake Keuka (this was a sort of rite of passage for kids who were raised in upstate New York).  And I know that my parents took me to visit Santa's Workshop Village in North Pole, New York: that is part of my autobiographical memory, too, but I don't remember a thing about it.  It's just a fact about my life, and I know it because of family story-telling around the Thanksgiving table, photographs in my mother's scrapbook, and the like.
Millvina Dean, the last survivor of the Titanic disaster, died on May 31, 2009 -- interestingly, on the 98th anniversary of the ship's launching.  In her later years, especially after the release of James Cameron's movie, Titanic (1997), she enjoyed some degree of celebrity, but she had no personal recollection of the event -- not least because she was only nine weeks old when the ship went down, and she only learned that she had been on the ship at age eight.  She knew she was a Titanic survivor, and that fact played an important part in her life, but she had no recollections of the event at all.
It appears, however, that remembering and knowing do not exhaust the varieties of recollective experience.  At the very least, both "remembering" (viewed as full-fledged conscious recollection of an event as part of one's subjective autobiography) and "knowing" (viewed as retrieval from semantic memory of an event as part of one's objective biography) can be further distinguished from an intuitive "feeling" that something happened in the past.  The "feeling of knowing" state is well documented in studies of verbal learning and retrieval from semantic memory, but the same sort of feeling occurs in genuine autobiographical memory, as when we have a feeling that we have met someone somewhere before, but cannot say where or when.  I have a feeling that I saw Woody Allen's A Midsummer Night's Sex Comedy at the New Yorker Theatre in Manhattan in 1982, soon after its premiere, but -- with apologies to the friends who must have been with me at the time -- I don't actually remember it; and I know full well that Woody Allen movies premiered at the Beekman Theatre, not the New Yorker.  Perhaps the memory is, at least in part, the product of priming: I spent a lot of time in New York City in the early 1980s, and I'm a long-time subscriber to The New Yorker.

In addition, the controversy over recovered memories of child sexual abuse and other trauma -- what have been called "the memory wars" (Crews, 1995) -- suggests yet a fourth variety of recollective experience: believing that something happened, on the basis of something else you know (or, at least, think you do), in the absence of any personal recollection or independent evidence.  The belief may be wrong, of course, and the event may never have happened at all.  And it might be right.  I believe that I once met Edler Hawkins, an early civil rights pioneer, because he was a friend of my parents.  But I have no personal recollection of the event, nor is there any evidence in the documentary record.  This only underscores the fact that autobiographical memory is one's mental representation of one's own personal past -- and, like all mental representations, it may depart substantially from historical truth (Spence, 1982).

The varieties of recollective experience in autobiographical memory imply that there are many different sources of autobiographical memory: personal recollection, independent knowledge, intuition, and belief.  Just as Bartlett's reconstruction principle reminds us that remembering is more like telling a story than reading one, so the varieties of recollective experience in autobiographical memory remind us that there are at least two forms of personal story.  Autobiography, like biography, is objective and well documented, limited to recollections that can be verified and facts that can be sourced.  Memoir, on the other hand, is private and subjective, and includes recollections, inferences, and beliefs that cannot be verified.  In the final analysis, checking autobiographical memory against historical fact, the same way we check recall and recognition against the list of words that subjects actually studied, may miss the point of autobiographical memory (J.F. Kihlstrom, 2002).  It is our memories that guide our experience, thoughts, and actions, not the historical record.  But in the absence of independent corroboration, autobiographical memories should be viewed skeptically; and when they conflict with the historical record, something has got to give.  Arguably, history should trump memory.

Most research on ABM is on conscious recollections, reminding us (sorry) that there are several varieties of recollective experience observed in episodic memory: remembering, knowing, feeling, and believing.

This focus on the conscious experience of ABM raises the question of whether there are unconscious ABMs as well.  Certainly Freud thought so: the "reminiscences" that "hysterics" ostensibly "suffered from" were unconscious, because they had been repressed.  A more recent take on this formula is the trauma-memory argument, which holds that certain forms of mental illness and other problems in living are caused by repressed (or dissociated) memories of trauma, especially of childhood sexual abuse.  I discuss this problem at length in the "Memory" course lecture supplements on "Emotion and Memory".


The Function of Autobiographical Memory

Why should we have autobiographical memory?  The function of autobiographical memory is a legitimate question, but in asking it we should be careful to avoid what Gould and Lewontin (1979) called the adaptationist fallacy -- the assumption that every trait evolved because it was good for the species.  Some traits just happen, as accidents of common ancestry: we do not have two arms and two legs because that combination was particularly useful in the environment of evolutionary adaptation.  Apparently, we have two arms and two legs simply because we are descended from fish that had four fins -- and that is all there is to it.

As a cognitive faculty, memory enables organisms to learn from their experience, and that is what the philosopher John Searle would call its nonagentive function.  So far so good, but why do we not simply retain the knowledge acquired through learning?  Why do we have to remember the learning experience as well?  One answer has been offered by Klein and his associates, who have suggested that episodic memory places boundary conditions on the generic knowledge recorded in semantic memory-- much as knowledge of specific category instances supplements, constrains, and corrects knowledge of general category prototypes.  But this does not address the question of the function of autobiographical memory as I have described it here, as something more than a mere record of specific events-- indeed, as a narrative which includes both the chronological and causal relations among individual events.  

It seems almost tautological to say that the function-- the nonagentive function-- of autobiographical memory is to permit individuals to consciously remember individual episodes of past experience, thought, and action.  But that is what it is, and even that function only exists in a world in which people value the ability to remember the past-- if only, pace George Santayana, so that they will not be condemned to repeat it. 

So now let us ask what the agentive functions of autobiographical memory are.  Given the nature of autobiographical memory, what are the uses to which sentient beings put it?  Again, Baddeley got it right, at least to a first approximation.  After distinguishing autobiographical memory from episodic memory in general, he noted that 

"Autobiographical memory... is particularly concerned with the past as it relates to ourselves as persons....  [It] is important because it acts as a repository for those experiences that constitute one's self-concept.  If you lose contact with your past, then presumably you lose contact with much of yourself" (p. 13). 

Of course, there is more to the self-concept than autobiographical memory.  Viewed as a knowledge structure encoded in memory, the self includes not just episodic self-knowledge concerning the individual's past experiences, thoughts, and actions, but also semantic self-knowledge concerning the individual's more general traits, attitudes, and physical and demographic characteristics.  Semantic self-knowledge is generally spared in amnesic patients, supporting evidence from other paradigms that episodic and semantic self-knowledge are represented independently in memory.  Still, there is no doubt that autobiographical memory is an important component of the self: while semantic self-knowledge reminds us who we are, episodic self-knowledge reminds us how we got that way.  Autobiographical memory also records those episodes in which we were true to ourselves, and those in which we were not.  And by recording those episodes, it allows us to behave the same way the next time, or not -- it is up to us to determine how we will use what we remember.

But the function of autobiographical memory is not just intrapersonal: it is also interpersonal.  We do not simply rehearse our autobiographical memories to ourselves: we also share them with others, and this sharing of autobiographical memories in and of itself constitutes an important form of social interaction, binding the participants together.  In a singles bar, one of the most popular pick-up lines (so I am told) is "Come here often?".  Another is "Don't you hate places like this?".  Both are invitations to share our personal experiences with another, as an initial step toward finding common ground (surely some evolutionary psychologist will now propose that the function of autobiographical memory is to support mating activities).  The sharing of autobiographical memories is an important experience for both children and their parents-- one which, interestingly, is crucial for the development of memory itself. 
Alea and Bluck (2003, 2009) underscore the social function of autobiographical memory in their survey of the uses to which people put autobiographical memory.  Both younger and older adults, and men and women alike, reported thinking about the past, and talking about the past, in order to maintain social bonds.  And, in fact, the social functions of autobiographical memory-- introducing oneself, developing a closer relationship, strengthening a friendship, finding out what another person is like, helping someone, or getting help-- seem to outshine the action-directive and self-defining functions.  When we enter into an intimate relationship with another person, in a very real sense their autobiographical memories become our own, and vice versa.  And if the relationship ends, there typically ensues a kind of anterograde amnesia for what the other has been doing since the breakup-- and, perhaps, a retrograde amnesia as well. 

Amnesic patients still retain their semantic self-knowledge, but their anterograde amnesia, affecting autobiographical memory, must put severe constraints on their social relationships.  Can one even have a serious relationship with someone who lacks autobiographical memory?  Amnesics can acquire new preferences (Multhaup et al., 1994), but can they fall in love?  I have long lamented the fact that, for all the attention given to the memory functions of H.M. and other amnesic patients, so little attention has been given to their social relationships-- except by a science writer (Hilts, 1995).  And here I confess that I have always wanted to take a page from Rokeach's  The Three Christs of Ypsilanti (1964) and put three amnesics together, just to see how they got along.

  Even the worst autobiographical memories-- intrusive, vivid, unbidden memories of traumatic experiences-- can have a positive function.  Kraus et al. (2009) sometimes verge close to the adaptationist fallacy-- assuming that even traumatic memories, because they exist, must have adaptive value in the grand evolutionary scheme of things.  But they also make a pretty convincing case that traumatic memories prevent future harm, elicit social support, and enhance intimacy.  On the other hand, it is also true that these are uses to which trauma victims can put traumatic memories, if they choose to do so.  Other victims may use the same traumatic memories as reasons to avoid others or reject offers of support.

On the other hand, maybe intrusive traumatic memories have no intrinsically adaptive function: maybe they are what they are simply by virtue of high levels of beta-adrenergic activation, as discussed by Cahill and McGaugh (1996).  Being so deeply encoded, they are hard to forget, much less to repress (assuming that repression ever actually occurs at all); and, because they won't go away, some victims may choose to put them to positive use.  The recent announcement of the discovery of a drug which might one day erase traumatic memories, precisely by dampening beta-adrenergic activation, reminds us not only of the central role played by autobiographical memory in personal identity, but also of what Margalit (2002) calls the ethical obligation to remember the past.  Even if such a drug were found to have highly selective effects, so that it could erase a victim's memory of trauma (or, for that matter, of your faux pas at that party last weekend), it is not at all clear that it should be used.  People have a right to remember what happened to them, and they may also have an obligation to do so.

Autobiographical memory can also be put to economic use, enabling people to earn a living by writing memoirs -- literally "trading on memory" (Baxter, 1999; Hampl, 1999).  Perhaps the most amazing trend in the modern publishing world has been the proliferation of memoirs -- most of which are by people of whom we are totally ignorant, and most of which sell like hotcakes (Atlas, 1996).  To all appearances, the reading public prefers memoirs to works of fiction -- even if the memoirs themselves prove to have been fabricated.  William Grimes (2005), inspired by the idea that all of world literature consists of just a very few basic plots, has even offered a taxonomy of memoirs: the retired-statesman (or -bureaucrat) memoir; the military memoir; memoirs of traumatic childhood or substance abuse, illness or sexual exploitation, spiritual journey or show business; memoirs of food, ethnic identity, vanishing small-town America, bad jobs, or bad journeys.  Grimes also writes that the work of memoirists "may be as fundamental as breathing".  Arguably, this is because the recalling and telling of our personal stories is such a central part of both our sense of self and our relations with others.  But if Dr. Johnson was right that "No man but a blockhead ever wrote, except for money", we might just as well all pay each other for putting it down on paper.

An April 2008 press release from IBM predicted that "Forgetting will become a distant memory" with the development of "smart appliances" equipped with microphones, video cameras, and memory to record, store, analyze, and retrieve all the "details of everyday life" (IBM, 2008) -- apparently one further step toward the Great Singularity of man and machine (as envisioned by Ray Kurzweil, 2005).  At that point, presumably, autobiographical memory will have no function at all.  Or maybe not.  After all, if the Great Singularity happens, we will have had a rehearsal of sorts, in the replacement of the classical art of memory by cheap paper and moveable type.  Even then, memory retained its usefulness.  As Anthony Grafton (2008) notes: "As shelves groaned and notebooks swelled to bursting, memory remained the only thread that could lead one back through paper labyrinths to the facts and data that mattered".  In a world of artificial minds with infinite capacity for data storage, there will still be no substitute for human consciousness, with its capacity to remember what really matters, and forget what does not.


Memory and Memoir

Returning to "the marketing of memory": Ben Yagoda (2009), in his history of the memoir, notes that the first-person narrator has a long tradition in fiction, and was a major form for the early novel -- the narrator being reliable and omniscient (the unreliable narrator was largely a 20th-century invention).  There have been memoirs, too, at least since the time of St. Augustine, but in the 19th and early 20th centuries most memoirs were written about others, namely people who were more famous than the author -- My Life with Napoleon, by His Valet, or something like that.



Only in the late 20th century did the personal memoir, in which the author writes about him- or herself, emerge as a popular literary genre.  Interestingly, this was accompanied by an increase in first-person narratives in academic writing.  Whereas scientific reports traditionally were written in the third person ("The subjects for this experiment were 30 college undergraduates, divided into three groups of 10"), they are now frequently written in the first person ("I recruited 30 college undergraduates for the experiment, and divided them randomly into three groups of 10").  More tellingly, perhaps, literary studies, which used to feature dispassionate analyses of, say, the fiction of Dostoyevsky, now feature reports of what it was like to read Dostoyevsky.  Now, as James Atlas noted in a 1996 essay, the memoir has outstripped the novel as the most popular literary genre -- so much so that there is a market for just about any memoir, regardless of whether the author is famous, or even knew anyone who was famous -- or even whether the author is someone we would have liked to have known in real life.

As Bernard Cooper (1999) noted, the memoir exemplifies what might be called "the performance of self" -- a vehicle for putting one's private self on public display.  

The memoir is not the only performance of the self. 

Interestingly, however, a number of memoirs have been found to be substantially false ("fake memoirs"), which raises the question of the relationship between autobiographical memory and historical truth.  Fiction can be based on the author's experience, but we don't expect fiction to be true in the sense of fidelity to some historical record.  However, we do expect that histories and biographies, whatever interpretive point of view they might take, will be faithful to the historical record.  The same goes for autobiographies.  The situation is different with memoir, because it's obvious that someone can remember the past in a manner that departs substantially from the objective historical record.  The whole point of memoir is to relate one's memories, and we don't expect what's remembered to be true.  But we do expect that a memoir will be faithful to the writer's memories-- otherwise, it's fiction dressed up as memoir (which, of course, is what some novels were in the 18th and 19th centuries). 


"Flashbulb" Memories

Among our many autobiographical memories are what are called flashbulb memories.  In a groundbreaking paper, Brown and Kulik (1977) asked subjects to remember the circumstances under which they learned about a surprising, consequential, affect-laden event.  The classic example, for people of my generation, and for the subjects in Brown & Kulik's experiment, is the assassination of President John F. Kennedy.  Other events included in the survey were the assassinations of other public figures of the era, such as Martin Luther King, Jr., Robert F. Kennedy, and Medgar Evers.


For older Americans, the Japanese attack on Pearl Harbor would also serve.  For younger Americans, the terror attacks on the World Trade Center and the Pentagon.  

Brown & Kulik performed a content analysis on the subjects' responses, and found that they commonly contained six "canonical" categories of information:

  • the place in which the person heard the news;
  • the ongoing event -- what the person was doing at the time;
  • the informant who delivered the news;
  • the person's own affect;
  • the affect of others present; and
  • the immediate aftermath.

Brown and Kulik found that most of their subjects had a "flashbulb memory" for the assassination of President Kennedy.

They suggested that "flashbulb memories" represent a richly detailed, almost sensory memory of the circumstances in which a highly surprising, affect-laden, and consequential event occurred.  Such emotionality induces a great deal of overt and covert rehearsal, leading to a highly elaborate episodic memory trace.  They also invoked Livingston's (1967) "Now Print!" mechanism, by which it is as if the mind takes a picture of what is going on at the time of a surprising, consequential, affect-laden event.  Like Livingston, they suggested that such flashbulb memories might have evolutionary significance, in that they produce a prompt, enduring record of critical events in the organism's life.

Moreover, the rich detail in such memories seemed inconsistent with Bartlett's view that memory is reconstructive, and thus inaccurate, in nature.  In these cases, anyway, the memories appear to be reproduced "verbatim".

Following in the wake of Brown and Kulik's study, other investigators have studied flashbulb memories for various surprising, consequential, and affect-laden news events, including the 1986 Challenger disaster and the 9/11 terror attacks, both discussed below.


Characteristics of Flashbulb Memories

Some of these studies have investigated the characteristics of flashbulb memories in some detail.  For example, Mullane Swar, Glisky, and Kihlstrom (2002) performed a psychometric study of flashbulb memories for the Challenger disaster, which occurred in 1986.  In 1989, college-student subjects were asked to write down their memories of the first time they heard of the Challenger disaster.  These free-recall narratives were coded for the six "canonical" categories employed by Brown and Kulik, as well as three other categories that we thought might be important, such as the time at which the person heard the news.

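Narratives like these are coded by trained human raters, but the logic of the category coding is easy to illustrate.  Below is a minimal Python sketch of a keyword-based coder for the six canonical categories; the keyword lists and the sample narrative are hypothetical, invented purely for illustration, and no real study would rely on matching this crude.

```python
# Toy sketch of category coding for flashbulb-memory narratives.
# Category names follow Brown & Kulik (1977); the keyword lists and the
# sample narrative are hypothetical -- real studies use human raters.

CATEGORIES = {
    "place":         ["classroom", "dorm", "home", "car", "office"],
    "ongoing_event": ["studying", "eating", "walking", "watching", "class"],
    "informant":     ["told me", "heard", "radio", "tv", "roommate"],
    "own_affect":    ["shocked", "sad", "scared", "upset", "cried"],
    "others_affect": ["everyone", "others", "crying", "silent"],
    "aftermath":     ["afterward", "later", "went home", "canceled"],
}

def code_narrative(narrative: str) -> dict:
    """Mark which canonical categories appear in a free-recall narrative."""
    text = narrative.lower()
    return {cat: any(kw in text for kw in kws)
            for cat, kws in CATEGORIES.items()}

sample = ("I was in my dorm room studying when my roommate told me. "
          "I was shocked, and everyone on the hall went silent. "
          "Later we all went home early.")
codes = code_narrative(sample)
print(codes)
print("Categories present:", sum(codes.values()), "of", len(codes))
```

A coder like this would mark the sample narrative as containing all six categories; in actual studies, of course, the narratives are coded by human raters.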

Ratings of the free-recall narratives showed that most of these features (except for Time) occurred with a high frequency.  When subjects were specifically asked about each detail in a subsequent questionnaire, the frequencies shot up to almost 100% in all categories.

Most subjects' free-recall narratives included at least four of the features; and the distribution was even more skewed for the questionnaires.

The average subject's memory had more than four of the six "canonical" categories employed by Brown & Kulik -- and on the questionnaire, the average subject's memory had all six.  Flashbulb memories are, indeed, very rich representations of the event in question -- but are they accurate?


Accuracy of Flashbulb Memories

Some of these subsequent studies compared memories collected immediately after the event with those collected after some delay.  For example, Neisser and Harsch studied college students' flashbulb memories for the 1986 Challenger disaster (probably the most studied of all flashbulb memories).  When the subjects were retested in 1988, Neisser and Harsch found that many were highly confident that their memories were accurate; but in fact, most of the memories were quite inaccurate, and many bore no resemblance to the accounts that the same subjects had given shortly after the event occurred.

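The logic of these test-retest comparisons is simple to sketch.  In the toy Python example below, an immediate report and a delayed report of the same flashbulb memory are compared attribute by attribute; both reports are hypothetical, and the unweighted score is a simplification of the attribute-based scoring schemes such studies actually used.

```python
# Toy sketch of test-retest consistency scoring for a flashbulb memory.
# The attributes echo the canonical categories; both reports are
# hypothetical, and real studies used more elaborate (weighted) scoring.

ATTRIBUTES = ["place", "activity", "informant", "own_affect", "others_affect"]

immediate = {"place": "dorm lounge", "activity": "watching tv",
             "informant": "roommate", "own_affect": "shocked",
             "others_affect": "silent"}
delayed   = {"place": "cafeteria", "activity": "eating lunch",
             "informant": "roommate", "own_affect": "shocked",
             "others_affect": "crying"}

matches = [a for a in ATTRIBUTES if immediate[a] == delayed[a]]
score = len(matches) / len(ATTRIBUTES)
print("Consistent attributes:", matches)
print(f"Consistency score: {score:.2f}")    # 1.0 = perfectly consistent
```

A subject can be highly confident in the delayed report even when a score like this is near zero -- which is exactly the dissociation Neisser and Harsch observed.
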
Similar findings have been obtained by other investigators studying the Challenger disaster, and from studies of other flashbulb memories.  Apparently flashbulb memories are not as accurate as they appear to be.

Along the same lines, Larsen (1992) performed a "self-study" -- that is, a study of his own memories -- to compare the accuracy of recollections of personal and public events (there is a long history of such studies, beginning with one performed by Marigold Linton).

We can conclude from studies like these that, appearances to the contrary notwithstanding, flashbulb memories are not exceptions to the rule.  Flashbulb memories are very vivid, and remembered with a high degree of confidence, but like "ordinary" memories they are neither particularly complete nor particularly accurate; they are not immune to forgetting; and they are not produced by special mechanisms.  Flashbulb memories, like ordinary memories, are reconstructions, not reproductions.

Still, whether they are accurate or not, our memories, including our flashbulb memories, are -- well, they're our memories -- our mental representations of the past.  And because they typically represent surprising, consequential, emotional events, they're relevant to the question of emotion and memory.  

Regardless of whether they're accurate, flashbulb memories are our recollections of important events.  And "importance" is something that differs from group to group (remember that, in the Brown & Kulik study, relatively few white subjects had flashbulb memories for the assassination of Medgar Evers).  It is also something that differs from individual to individual.  I have a flashbulb memory for the time I first met my spouse, but I don't have flashbulb memories for everyone I ever met for the first time.  One uninvestigated aspect of flashbulb memories is memory for idiosyncratic events that are important to us as individuals, even if they're not important to the wider public or to history.  For that reason, our flashbulb memories -- not necessarily of 9/11 or the Challenger disaster, but the more "mundane" memories for things like our first kiss -- are important expressions of our personality.  And, it turns out, they are also important for social interaction.  So we'll take them up again in the lectures on Personality and Memory and on Social Memory.


Linking the Personal with the Public

But if flashbulb memories are not veridical representations of some past event, what are they?  Neisser (1992) noted that flashbulb memories are, first and foremost, stories that follow certain narrative conventions -- they indicate who did what, where, when, and why.  Rather than being "pictures" that the mind has taken of certain events, he suggested that flashbulb memories serve as benchmarks marking the intersection of private and public history.  As Neisser put it:

"They are the places where we line up our own lives with the course of history itself and say 'I was there'."

People's flashbulb memories of the 9/11 terror attacks provide some evidence for Neisser's hypothesis.  In 2002, the Pew Research Center for the People and the Press released the results of a survey which showed that 97% of Americans could remember exactly where they were or what they were doing the moment they heard of the attacks -- thus fulfilling the basic criterion for a flashbulb memory (Pew Center, 09/05/02).  The survey also asked respondents to describe "the biggest life event of the past year".  In past surveys, this question had elicited a fairly routine miscellany of births and deaths, marriages and divorces.  But in this case, fully 38% of those surveyed cited 9/11.  September 11, 2001, was certainly an important date for public history; but it was also, apparently, an important date in personal history -- an intersection of public and private marked by a "flashbulb" memory.  

9/11 may be a benchmark, and other flashbulb memories may represent benchmarks too, but apparently not all of them are benchmarks, as data from our study of flashbulb memories for the Challenger disaster show (Mullane Swar, Glisky, & Kihlstrom, 2002).  Before subjects reported their memories of the Challenger incident, we asked them to remember events that occurred in the surrounding 12-month period, from August 1, 1985 to July 31, 1986 (the Challenger disaster occurred on January 28, 1986).


In contrast to the Pew survey results concerning 9/11, none of the subjects reported the Challenger incident as a personal memory, unless they had been specifically reminded of it beforehand.  Actually, very few of them reported it as a historical memory either -- again, unless they had been specifically reminded.  Recall that virtually all of these subjects had a flashbulb memory for the Challenger incident.  Yet unless they were specifically reminded of it, the Challenger disaster did not occur as an event in the subjects' personal autobiographies.  Some flashbulb memories may be benchmarks, as Neisser suggested (and I agree that it's a good idea); but apparently not all of them are.

So What Does the Self Look Like?

Philosophers, neuroscientists, and even some psychologists often ask what "the self" looks like -- sometimes betraying a nervousness about matters like consciousness and agency, and raising the spectre of the homunculus -- "the little man in the head".

To this question, cognitive psychology offers four straightforward answers.  The self might be represented:

  • as a concept, summarizing the traits and other attributes that we believe characterize ourselves;
  • as a story, a narrative that organizes our autobiography;
  • as an image, a mental representation of our face and body; and
  • as an associative network of semantic and episodic self-knowledge in memory.

One feels a little like someone watching four blind people describe an elephant. But this is all right: why shouldn't self-knowledge be rich and multifaceted? In the end it's all neural connections anyway, but at the psychological level of analysis, which is where we as psychologists should be operating, there's no reason why we shouldn't consider different representational formats for different kinds of self-knowledge.

One thing is for sure: the self, which plays such an important role in social interaction, is also a knowledge structure represented in the mind of the actor. Far from being a mystical entity, it appears that we can study, and understand, the self using the conceptual and methodological tools of modern cognitive psychology -- bringing what we know about category structure, story understanding, image processing, and priming effects to bear on the self. And we can draw data for research on the self from a wide variety of sources: from conventional personality and social psychology, from conventional cognitive psychology, from cognitive neuropsychology, and from clinical psychology. So, both conceptually and empirically, the study of the self seems to serve a unifying function in psychology, bringing cognitive, social, personality, developmental, and clinical psychology together in common cause -- just as William James thought it would.

 

The Development of Selfhood

Development can be viewed in at least two ways: ontogenetically, in terms of changes in individual organisms across the life cycle from birth to death; and phylogenetically, in terms of changes in species across evolutionary time.


The Phylogenetic View

Locke viewed a sense of self as essential for personhood, but nonhuman animals may also have a sense of self. In a classic study, Gordon Gallup (1970) painted an odorless, nontoxic red mark on the foreheads of anesthetized chimpanzees. In the absence of a mirror, the chimps showed no awareness that their appearance had been altered. When exposed to their reflections in a mirror, however, the animals often examined the spot in the mirror, touched the spot on their foreheads, and then inspected and smelled their fingers. They appeared to recognize a discrepancy between what they thought they looked like and what they actually looked like -- suggesting, in the process, that they possessed at least a rudimentary self-image. The same effect has been found in some orangutans and bonobos, but not in gorillas (except, perhaps, the famous Koko), monkeys, or other primates, nor in non-primate species. However, it should be noted that not all chimpanzees pass the self-recognition test, and alternative means of testing may well reveal self-recognition in other species.


The Ontogenetic View

By the time they are 18-24 months old, most human infants also pass the mirror-recognition test. However, if the infants are shown a videotape of themselves after a delay as short as three minutes, most fail to recognize themselves on the monitor; most four-year-olds pass this more difficult test. By the age of two, then, human infants have at least a minimal sense of self, but it takes a while longer for them to develop a narrative sense of themselves as extended in time -- a sense that they are the same person now that they were a while ago. Similarly, children younger than four years old seem unable to recognize that their current knowledge and beliefs differ from those they held in the past.  Interestingly, age four is about the time that children achieve a capacity for episodic memory -- the ability to recognize that a current mental state is in fact a representation of a past mental state.


The Cultural View

There's also a third developmental approach, which might be called the cultural view -- that is, how the self has developed over the course of historical time.  Julian Jaynes (1976) famously argued that there was a time when humans didn't really have consciousness -- or, at the very least, didn't realize that they had it.  In much the same way, there may have been a time when we didn't really have a self-concept -- or, if we did have one, it didn't matter very much.

In cultural terms, the self, at least in Western culture, may have experienced a radical alteration around the time of the Renaissance.  As Jacob Burckhardt put it in The Civilization of the Renaissance in Italy (1860):

Man [previously] was conscious of himself only as a member of a race, people, party, family, or corporation -- only through some general category.  In Italy this veil first melted into air; an objective treatment and consideration of the state and of all things of this world became possible.  The subjective side at the same time asserted itself with corresponding emphasis; man became a spiritual individual, and recognized himself as such.

Burckhardt had the idea that this discovery of the self was uniquely a feature of the Italian Renaissance, and particularly of the Florentine Renaissance.  In fact, the Italians got the idea that portraits of contemporary individuals should be painted at all, and that such portraits should portray their subjects in all their individuality, from the artists of the Northern Renaissance -- especially in Flanders, and especially from the work of Jan van Eyck (1395-1441).  But more generally, Burckhardt argued that the Renaissance emphasis on "the subjective side", in which "man became a spiritual individual, and recognized himself as such", is central to European, and Western, culture.

Andrew Butterfield, reviewing The Renaissance Portrait from Donatello to Bellini (an exhibition at the Metropolitan Museum of Art in New York City, 12/21/2011-03/18/2012), notes:

Burckhardt's idea has had the most profound influence on the study of the Renaissance portrait....  Beginning in the late thirteenth century, several changes greatly stimulated the advance of portraiture....  So portraits had existed before, but it was only in the fifteenth century that independent images of actual persons other than rulers and religious figures began to be made in large numbers.  This new tendency started in Flanders, and then spread to Florence, where it reached unprecedented currency (reviewed in "They Clamor for Our Attention", New York Review of Books, 03/08/2012; see also "Faces Out of the Crowd" by Barry Schwabsky, The Nation, 03/26/2012).

Walter Benjamin, the 20th-century German philosopher and Marxist cultural critic famous for The Arcades Project (1927-1940), his study of Parisian bourgeois culture, gave a somewhat later date: sometime between 1830 and 1848, under France's July Monarchy.  It was at this point, Benjamin claimed, that individuals first became private, in the sense that their work lives were separated (Marx would have said "alienated") from their domestic lives, and the public domain separated from the private.  For the first time, as Merve Emre writes in an overview of the personal essay, which emerged at about the same time ("The Illusion of the First Person", New York Review of Books, 11/03/2022; see also Emre's contribution to the Oxford Companion to the Essay), a middle- or upper-class individual had the time and inclination to "probe what he believed to be his thoughts, lodged in his self, his mind, his body, and his home".

 

Pathologies of Selfhood

Whatever the findings in infants and animals, a sense of self is part and parcel of the conscious experience of all normal human adults. However, a number of pathological conditions appear to involve disruptions in self-recognition and self-awareness.

Whether these deficits in social cognition are limited to the sense of self, or extend to other people as well, is a topic of much current investigation.


Narcissism

Not to mention narcissism.  The DSM-5 defines narcissistic personality disorder as "a pervasive pattern of grandiosity (in fantasy or behavior), need for admiration, and lack of empathy".

Of course, the term itself had been around for longer than that. 

 

Self-Knowledge into Action

While cognitive psychology tends to study mind in the abstract, social psychology studies mind in action. Mental representations of self, others, and situations do not exist for themselves, but rather as guides to social behavior. How we behave towards others depends not just on how we perceive them, but also on how we perceive ourselves. Erving Goffman, E.E. Jones, and others have argued that people often engage in strategic self-presentation to shape others' impressions of them, in an attempt to gain or retain control over the social situation. Many social interactions are characterized by what Robert K. Merton would call a self-fulfilling prophecy -- in which, for example, a person who believes that another person is aggressive may treat him or her in a manner that evokes aggressive behavior that may not have occurred otherwise. A strong sense of self may promote strategic self-presentation, but it may also militate against others' self-fulfilling prophecies concerning oneself. If one does not define oneself as aggressive, perhaps one will be less likely to act aggressively, regardless of how he or she is treated. From a social-psychological perspective, then, the self is not just something that knows, and is known. It is also something that one does.


The Self and Its Brain

As philosophers and psychologists have become interested in the biological substrates of mental life, and brain-imaging techniques have permitted us to watch the brain in action, cognitive science has evolved into cognitive neuroscience.

Antonio Damasio has argued that the self consists of three different representational levels, each associated with a different brain system (see his book, The Feeling of What Happens: Body and Emotion in the Making of Consciousness, 1999).

  1. Interoception and proprioception give rise to a primitive sense of self which he calls the proto-self.  This is a primitive representation of the body, which monitors and controls basic physical functions such as homeostatic regulation.
  2. The core self generates our phenomenal awareness of ourselves in the here and now, including conscious emotional and motivational states. 
  3. The autobiographical self enables us to relate our current experiences to our past and future.

Taking cognitive neuropsychology as a model, Klein and Kihlstrom have argued that neuropsychological studies of brain-injured patients, and brain-imaging studies of normal subjects, may provide new solutions to old problems, and afford new theoretical insights, for personality and social psychologists as well. Consider, for example, the relation between self and memory. If, as Locke argued, our sense of self is intimately tied up with our recollection of our past, what is the sense of self for an amnesic patient?  H.M., the famous patient with the amnesic syndrome, cannot consciously remember anything that he did or experienced since the operation that destroyed his medial temporal lobes. Of course, H.M.'s amnesia is primarily anterograde in nature, and his sense of self may be confined to whatever memories he has from before his surgery. Moreover, Locke did not fully appreciate the distinction between episodic and semantic memory. Amnesic patients retain some ability to acquire new semantic knowledge, and this dissociation may permit their self-concepts to be based on "updated" semantic knowledge, even if they are lacking a complete record of autobiographical memory.

Such questions have not been asked of H.M. himself, but they have been asked of other patients. For example, the patient known as K.C., who suffered a severe head injury as a result of a motorcycle accident, has both a complete anterograde amnesia covering events since the accident, and a complete retrograde amnesia covering his life before the accident. K.C. has no autobiographical memory at all, but research by Endel Tulving reveals that he has a fairly accurate self-concept. The same accident that caused his amnesia also resulted in a profound personality change: the premorbid K.C. was quite extraverted, while the postmorbid K.C. is rather introverted. When asked to rate himself as he is now, K.C. rates himself as introverted, in agreement with his mother's ratings of him. Interestingly, his ratings of his premorbid personality do not agree with his mother's. K.C. has acquired semantic knowledge about himself, but he has not retained in episodic memory the experiences on which this self-knowledge is based; and his newly acquired semantic self-knowledge has effectively replaced that which he possessed before the accident.

Similar results were obtained by Klein and his colleagues in a study of W.J., a college freshman who suffered a temporary retrograde amnesia, covering the period since her high-school graduation, as a result of a concussive blow to the head. Asked to describe herself, W.J. showed a good appreciation of how she had changed since matriculating, as corroborated by her boyfriend's ratings of her. Findings such as these lend strength to the conclusion, based on experimental studies of priming, that semantic (trait) knowledge of the self is encoded independently of episodic (behavioral) knowledge.

Amnesic patients typically suffer damage to the hippocampus and related structures in the medial temporal lobes, leading to the conclusion that these structures constitute a module, or system, for encoding consciously accessible autobiographical memories. Is there a similar structure responsible for the sense of self? Recently, Craik and his colleagues (1999) used PET to image the brain while subjects rated themselves on a list of trait adjectives. As comparison tasks, subjects rated the Prime Minister of Canada on the same traits; they also judged the social desirability of each trait, and the number of syllables in each word. One analytic technique, statistical parametric mapping, revealed no differences in brain activation between the self- and other-ratings tasks. While this finding would be consistent with the proposition that the self is a person like any other, a partial least squares analysis showed significant self-other differences in the right and left medial frontal lobes, and the middle and inferior frontal gyri of the right hemisphere. Further studies of this sort are obviously in order.

So is the mental representation of the self located somewhere in the right hemisphere? Probably not. Self-referent processing may be performed by a module or system localized in the right frontal lobe, but control is critical in these conditions, and it may well be that other-referent processing is performed by the same system, provided that the other is well-liked and/or well-known. Although cognitive neuroscience has generally embraced a doctrine of modularity, the neural representation of individual items of declarative knowledge is distributed widely across the cerebral cortex. Self-reference may be localized, but self-knowledge is widely distributed over the same neural structures that represent knowledge of others.  I discuss this issue more in the lectures on Social Neuropsychology.


Is the Self a Person, Like Any Other?

Lying behind all four of these models of the self is the general idea that oneself is a person like anyone else, and represented accordingly. However, there may be both quantitative and qualitative differences between self-perception and the perception of other people.


The Self-Reference Effect

On the quantitative side, it seems obvious that we simply have more knowledge about ourselves than we do about other people.  Evidence for this proposition comes from studies of the self-reference effect in memory, by Rogers and his associates (1977).

Rogers made use of a popular procedure in the study of memory known as the depth-of-processing (DOP) paradigm, also known as the levels-of-processing (LOP) paradigm.  In DOP studies, subjects are presented with a list of words (or other material), about which they have to make one of a number of judgments -- for example:

  • orthographic judgments, concerning the physical appearance of the word (e.g., "Is the word printed in capital letters?");
  • acoustic judgments, concerning its sound (e.g., "Does the word rhyme with chair?"); or
  • semantic judgments, concerning its meaning (e.g., "Would the word fit into this sentence?").

A DOP study is typically presented as an experiment about language, not memory.  But at the conclusion of the study, the subjects are surprised with a test of recall or recognition.  The typical finding is that memory is best for items subject to semantic processing, and worst for items subject to orthographic processing, with items subject to acoustic processing somewhere in the middle.  The general interpretation of this result is that semantic processing creates a "deeper", more elaborate encoding of the item: contact between the item and the rich, elaborate network of semantic memory produces a more memorable trace.

For more details on the DOP/LOP paradigm and its implications, see the page on Memory in the lecture supplements on General Psychology; and also the page on Encoding in the lecture supplements on Human Learning and Memory.
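
To make the paradigm concrete, here is a toy simulation of a DOP experiment, written in Python.  The word list and the recall probabilities are invented for illustration -- they simply build in the standard prediction (semantic > acoustic > orthographic) rather than reproducing any actual data.

```python
import random

# Toy simulation of a depth-of-processing (DOP) experiment.
# The recall probabilities are hypothetical: they merely encode the
# standard prediction that "deep" semantic processing yields better
# memory than acoustic processing, which in turn beats orthographic.
RECALL_P = {"orthographic": 0.25, "acoustic": 0.45, "semantic": 0.75}
WORDS = ["table", "garden", "doctor", "river", "candle", "window",
         "mountain", "letter", "bottle", "street", "market", "pillow"]

def run_subject(rng):
    """Rotate orienting tasks across the word list, then simulate recall."""
    tasks = {word: list(RECALL_P)[i % 3] for i, word in enumerate(WORDS)}
    recalled = {w for w, t in tasks.items() if rng.random() < RECALL_P[t]}
    return {t: sum(1 for w in recalled if tasks[w] == t) for t in RECALL_P}

rng = random.Random(1977)
n_subjects = 500
per_task = len(WORDS) // 3          # 4 words per orienting task
totals = {t: 0 for t in RECALL_P}
for _ in range(n_subjects):
    for task, n_recalled in run_subject(rng).items():
        totals[task] += n_recalled

for task, total in totals.items():
    print(f"{task:>12}: {total / (per_task * n_subjects):.2f} recalled")
```

The surprise test is the crucial design feature: because subjects do not expect their memory to be tested, differences in recall can be attributed to the orienting tasks themselves rather than to deliberate memorization.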

To the standard DOP paradigm, Rogers et al. added a self-referent judgment.  They presented a number of trait adjectives and, in addition to the standard conditions, asked subjects to decide whether each adjective was self-descriptive.  The finding was that self-reference produced a huge increment in memory.  On the basis of this self-reference effect (SRE), and in line with the standard interpretation of the DOP, Rogers et al. suggested that the self was, perhaps, the largest knowledge structure in human memory.

The conclusion was quite provocative, and a number of investigators quickly stepped in to put it to the test.  In particular, Keenan and Baillet (1980) performed a number of studies in which they compared self-referent processing to the effects of processing information with respect to other people. Most critically, they found that processing with respect to the subject's best friend yielded a DOP effect roughly equivalent to that of self-reference.  Keenan and Baillet concluded that self-referent processing affords no privilege in memory.  The implication is that self-reference is no different from other-reference (provided, perhaps, that the other person is liked and well-known); or, put another way, that the representation of the self in memory is no richer, no more elaborate, than the representation of other people whom the individual knows well and likes.  (This sets aside the very interesting question of the self-reference effect in depressive individuals, who may not like themselves very much, or in people with personality disorders, who may not know themselves very well!)

But, as they say in late-night television ads, there's more!

Stanley Klein noticed a subtle problem with the standard SRE experiment: self-reference and organization are confounded.

The difference is important because the self-referent orienting condition encourages subjects to sort the items into two categories -- those that are self-referent and those that are not.  (The same is true, of course, for the "other-reference effect" documented by Keenan & Baillet.)  By contrast, the semantic orienting condition precludes such a dichotomous sorting.  The point is important because we know that organization at the time of encoding also improves memory.  In fact, elaboration and organization are the two primary principles governing the encoding phase of memory processing.

For more details on the Organization Principle and its relation to the Elaboration Principle, see the page on Memory in the lecture supplements on General Psychology; and also the page on Encoding in the lecture supplements on Human Learning and Memory.

The problem, then, is to unconfound self-reference and organization.  Klein accomplished this in an ingenious way.  For his experiment, he shifted the stimulus materials from trait adjectives to body parts.

In one pair of conditions, the orienting task entailed self-reference in one case but not the other; both tasks, however, discouraged categorization of the items.

In a second pair of conditions, again the orienting task entailed self-reference in one case but not the other; but both of these tasks encouraged categorization -- sorting the target items into two dichotomous categories.

Klein argued that the standard SRE experiment compared an organized self-referent condition with an unorganized semantic condition.  And, indeed, when you look at just those two conditions, you see a big SRE.

But when you look at all four conditions, you see that the SRE is matched by the semantic condition -- so long as the semantic condition is also organized.  In fact, virtually 99.44% of the variance in memory performance was accounted for by the organization factor, and virtually none by the self-reference factor.  As Klein suspected, the SRE is wholly an artifact of organizational activity, and has nothing to do with the self.

 

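The variance partitioning behind that conclusion can be illustrated with a back-of-the-envelope computation.  In the Python sketch below, the four cell means are hypothetical recall proportions, chosen only to mimic the pattern just described -- a large organization effect and a negligible self-reference effect -- and not taken from Klein's actual data.

```python
import numpy as np

# Back-of-the-envelope variance partitioning for a 2x2 design:
# reference (self vs. semantic) x organization (unorganized vs. organized).
# The cell means are hypothetical recall proportions, invented only to
# mimic the reported pattern: organization, not self-reference, carries
# essentially all of the effect.

#                   unorganized  organized
cells = np.array([[0.32,        0.61],     # self-referent encoding
                  [0.30,        0.60]])    # semantic encoding

grand = cells.mean()
ref_means = cells.mean(axis=1)             # marginal means: reference factor
org_means = cells.mean(axis=0)             # marginal means: organization factor

# Standard two-way decomposition of between-cell variability (2 levels each).
ss_ref = 2 * ((ref_means - grand) ** 2).sum()
ss_org = 2 * ((org_means - grand) ** 2).sum()
ss_int = ((cells - ref_means[:, None] - org_means[None, :] + grand) ** 2).sum()
ss_cells = ss_ref + ss_org + ss_int        # equals ((cells - grand)**2).sum()

for name, ss in [("self-reference", ss_ref),
                 ("organization", ss_org),
                 ("interaction", ss_int)]:
    print(f"{name:>14}: {100 * ss / ss_cells:5.1f}% of between-cell variance")
```

With numbers like these, the organization factor accounts for well over 99% of the between-cell variance, and self-reference for almost none -- which is the shape of Klein's result.
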
The bottom line is that the self may, indeed, be the largest knowledge structure in memory. Most of us do, after all, know more about ourselves than we do about others, and most of us probably care about ourselves more than we do about others as well.  But the self-reference effect doesn't provide any evidence for this proposition -- because, it turns out, the SRE has nothing to do with the self.


Actor-Observer Differences in Causal Attribution

With respect to qualitative differences between cognition of self and others, there is of course the large social psychological literature on actor-observer differences -- which is to say, self-other differences -- in causal attribution.

The first thing to be said about these two differences -- the actor-observer difference and the self-serving bias -- is that it never was clear that the biases were intrinsic to self-knowledge. Perhaps they apply to knowledge about others as well, so long as we like them (as we tend to like ourselves) and/or know them well (as we think we know ourselves).  This is the lesson of the self-reference effect.

But we now know that early studies of causal attribution were misleading, because they made an inappropriate distinction between the Person and the Environment -- and, more critically, because they attempted to fit causal explanation into an inappropriate Internal-External framework.  In fact, Malle's (2006) review indicated that there is no evidence for the Actor-Observer Difference, and precious little evidence for the self-serving bias either.

The Self as Object and the Self as Subject

So is it true that the self is "just another person" after all, and that there are no qualitative differences between self-perception and other-perception?  Not so fast.

There is one difference between self and other that is absolutely qualitative: While we have direct introspective access to the contents of our own minds -- our beliefs, feelings, and desires -- we can know the minds of others only indirectly -- from what they tell us, and from observing their behavior.

Knowledge of our own minds is direct (at least in part); 

Knowledge of other minds is (always) indirect.

So, in the final analysis, Cantor and I were perhaps too quick to offer a solution to Allport's problem.

The "puzzling problem" of the self is not that of self as object of knowledge.  Viewed as an object of knowledge, the self may indeed just be a mental representation of oneself, no different from our mental representations of other people.  (Though, frankly, I do think that the idea of the self as a mental representation of oneself was an awfully good idea.)

The "puzzling problem" of the self is, rather, that of self as knower.  Viewed as a subject, as the person who has self-knowledge, the problems are of how self-knowledge arises, how we know what we know about ourselves -- the little man inside the head that captures the experience of consciousness.

And consciousness is a very puzzling problem indeed.

 

Further Reading

Those interested in more details may wish to consult the following articles:

  • Kihlstrom, J.F., & Cantor, N. (1984). Mental representations of the self. In L. Berkowitz (Ed.), Advances in experimental social psychology. Vol. 17. New York: Academic Press.
  • Kihlstrom, J.F., Cantor, N., Albright, J.S., Chew, B.R., Klein, S.B., & Niedenthal, P.M. (1988). Information processing and the study of the self. In L. Berkowitz (Ed.), Advances in experimental social psychology. Vol. 21 (pp. 145-177). San Diego: Academic Press.
  • Kihlstrom, J.F., & Klein, S.B. (1994). The self as a knowledge structure. In R.S. Wyer & T.K. Srull (Eds.), Handbook of social cognition, 2nd Ed. (Vol. 1, pp. 153-208). Hillsdale, N.J.: Erlbaum.
  • Kihlstrom, J.F., & Schacter, D.L. (1995). Functional disorders of autobiographical memory. In A. Baddeley, B.A. Wilson, & F. Watts (Eds.), Handbook of memory disorders (pp. 337-364). London: Wiley.
  • Kihlstrom, J.F. (1996). Memory research: The convergence of theory and practice. In D. Hermann, M. Johnson, C. McEvoy, C. Hertzog, & P. Hertel (Eds.), Basic and applied memory: Theory in context (Vol. 1, pp. 5-25). Mahwah, N.J.: Erlbaum.
  • Kihlstrom, J.F. (1997). Consciousness and me-ness. In J. Cohen & J. Schooler (Eds.), Scientific approaches to consciousness (pp. 451-468). Mahwah, N.J.: Erlbaum.
  • Kihlstrom, J.F., & Klein, S.B. (1997). Self-knowledge and self-awareness. In J.G. Snodgrass & R.L. Thompson (Eds.), The self across psychology: Self-recognition, self-awareness, and the self-concept. Annals of the New York Academy of Sciences, 818, 5-17.
  • Kihlstrom, J.F., Marchese, L.A., & Klein, S.B. (1997). Situating the self in interpersonal space. In U. Neisser & D.A. Jopling (Eds.), The conceptual self in context: Culture, experience, self-understanding (pp. 154-175). New York: Cambridge University Press.
  • Kihlstrom, J.F., Beer, J.S., & Klein, S.B.  (2003).  Self and identity as memory.  In M.R. Leary & J. Tangney (Eds.), Handbook of self and identity (pp. 68-90).  New York: Guilford Press.
  • Kihlstrom, J.F., & Klein, S.B. (2003).  Self.  In L. Nadel (Gen. Ed.), Encyclopedia of Cognitive Science, Vol. 3, Article #A152 (CD-ROM).  London: Macmillan Reference.
  • Kihlstrom, J.F.  (2009).  "So that we might have roses in December": The functions of autobiographical memory.  Applied Cognitive Psychology, 23, 1179-1192.
  • Kihlstrom, J.F. (2012). Searching for the self in mind and brain. Social Cognition, 37(4), 367-379.

Really, what did you expect in a Lecture Supplement on "The Self", except a bunch of self-citations?

Author Tom Wolfe famously designated the 1970s the "Me Decade" (1976), at roughly the same time that the historian and social critic Christopher Lasch was writing about The Culture of Narcissism (1979).  But it wasn't just the 1970s.  The New York Times Magazine dubbed the entire second millennium of the Common Era the "Me Millennium" and devoted one of its six special "Millennium Issues" to the self, narcissism, and related issues.  Link to the October 17, 1999 issue of the New York Times Magazine online.  

 

This page last modified 07/18/2023.