
Esther Quintero

http://shankerblog.org/?p=9719

The research on implicit bias both fascinates and disturbs people. It’s pretty cool to realize that many everyday mental processes happen so quickly as to be imperceptible. But the fact that they are so automatic, and therefore outside of our conscious control, can be harder to stomach.

In other words, the invisible mental shortcuts that allow us to function can be quite problematic – and a real barrier to social equality and fairness – in contexts where careful thinking and decision-making are necessary. Accumulating evidence reveals that “implicit biases” are linked to discriminatory outcomes ranging from the seemingly mundane, such as poorer quality interactions, to the highly consequential, such as constrained employment opportunities and a decreased likelihood of receiving life-saving emergency medical treatments.

Two excellent questions about implicit bias came up during our last Good Schools Seminar on “Creating Safe and Supportive Schools.”

  • First, does one’s level of education moderate the magnitude of one’s implicit biases? In other words, wouldn’t it be reasonable to expect people with more education to have a broader repertoire of associations and, therefore, to be able to see beyond clichés and commonplaces?
  • Second, does the frequency of interaction with (and deep knowledge of) a person affect our reliance on stereotypic associations? Simply put, shouldn’t information that is individual-specific (e.g., Miguel loves Asimov novels) replace more generic information and associations (e.g., boys don’t like to read)?

The short (albeit simplified) answers are no and yes, respectively. Below, I elaborate on each, reflect on strategies that can help reduce the unintended ill effects of implicit biases, and touch on some implications for schools and educators.


Are More Educated People Less Biased?

To address this question, we first need to answer: Where do (social) biases come from? — which leads us into the fascinating topic of stereotypes. Stereotypes are cognitive associations between a group and a trait (or set of traits) – e.g., women and nurturing, men and leadership skills, African American males and aggression, etc. After frequent (and sometimes subtle) exposures from our social environments, these mental associations form automatically, even in the absence of conscious antipathies toward groups – see also Bargh 1999; Devine 1989; Gaertner & Dovidio 1986; Greenwald & Krieger 2006; Jost et al 2009.

What’s important here is our awareness of these associations that exist in our culture. It does not really matter that John has a Master’s degree or has traveled around the world. John could be the most knowledgeable person with the deepest egalitarian, non-essentialist beliefs – but what matters here is that John is also aware that most other people aren’t like him; that many others out there still believe that black men are more aggressive and more sexual or that women are more dependent, nurturing, and communal. Stereotypes operate implicitly (also here and here), regardless of our own race/gender, and even when our personal beliefs are completely to the contrary. In fact, many theorists have argued that implicit biases persist and are powerful determinants of behavior precisely because people lack personal awareness of them – meaning that they can occur despite conscious non-prejudiced attitudes and intentions.

So, to recap, it’s not primarily about education or knowledge, but perhaps a bit of the opposite: You would have to have lived under a rock all your life to claim true ignorance of the shared beliefs that exist in our society, or to claim that these beliefs don’t affect you in any way. At the risk of belaboring the point, let me also summarize three classic studies that explore the behavior of highly educated decision makers:

  • In an audit study of employer hiring behavior, researchers Bertrand and Mullainathan (2003) sent out identical resumes to real employers, varying only the perceived race of the applicants by using names typically associated with African Americans or whites. The study found that the “white” applicants were called back approximately 50 percent more often than the identically qualified “black” applicants. The researchers found that employers who identified as “Equal Opportunity Employer” discriminated just as much as other employers.
  • Steinpreis and colleagues (1999) examined whether university faculty would be influenced by the gender of the name on a CV when determining hireability and tenurability. Identical, fictitious CVs were submitted to real academics, varying only the gender of the applicants. Male and female faculty were significantly more likely to hire a potential male colleague than an equally qualified potential female colleague. In addition, they were more likely to positively evaluate the research, teaching, and service contributions of “male” job applicants than of “female” job applicants with the identical record. Faculty were four times as likely to write down cautionary comments when reviewing the CV of female candidates – comments included notes such as “we would have to see her job-talk”, or “I would have to see evidence that she had gotten these grants and publications on her own.”
  • Finally, Trix and Psenka (2003) examined over 300 letters of recommendation for successful candidates for medical school faculty positions. Letters written for female applicants differed systematically from those written for male applicants. Letters for women were shorter, had more references to their personal lives and had more hedges, faint praise, and irrelevancies (e.g., “It’s amazing how much she’s accomplished.”, “She is close to my wife.”).

Does Familiarity Weaken Implicit Bias?

The second question was about whether one should expect more conscious and structured decision making (and thus, fewer snap judgments) from teachers as the school year progresses, simply because they have more real information to go by, produced through their constant interaction with students (see here, minute 50:55). Great question.

We rely on stereotypes (and other cognitive shortcuts) more heavily when there are more unknowns to a situation – for example, when we don’t know a person well, when we are unfamiliar with the goals of the interaction, when the criteria for judgment are unclear or subjective, etc. This suggests that the more information we have about a person or situation, the less likely we are to automatically fill in potential (knowledge) gaps with more “generic” information (e.g., stereotypes). In fact, “individuating” – or gathering very specific information about a person’s background, tastes etc. — has been proposed as an effective “de-biasing” strategy. When you get to know somebody, you are more likely to base your judgments on the particulars of that person than on blanket characteristics, such as the person’s age, race, or gender. This suggests that teachers may be better positioned than other professionals (e.g., doctors who see patients sporadically for a few minutes at a time) to overcome potential biases.

I am not suggesting that teachers, because they are teachers, are immune to implicit biases – in fact there is some research documenting that they are not (see Kirwan Institute’s recent review, pp. 30-35). However, teachers may be better situated to combat these biases than professionals in other fields. Getting to know their students well is part of a teacher’s job description. Thus, a potential intervention aimed at breaking stereotypic associations might build on and support this aspect of teachers’ work – for example, by providing a structured and systematic way to gather information on students during the first weeks of the new school year.

In addition, teachers are well positioned to actually disrupt classroom (status) hierarchies, which can emerge among students based on characteristics such as race, gender, or academic ability. For example, by (authentically) praising a low status student for something specific that the student did well, the teacher can effectively raise the social standing of that student in the classroom. This, in turn, can elevate both the student’s confidence and self-assessment (i.e., what the student thinks he/she is capable of accomplishing) as well as his/her peers’ expectations (i.e., what other classmates think she/he is capable of). This is a powerful way of breaking stereotypic associations and equalizing learning conditions in the classroom.

In sum, formal education, in and of itself, is not enough to disrupt associations that are deeply embedded in the culture. We are all profoundly aware of these associations (even when we don’t share them) and for that reason alone, our thoughts and behaviors can be subtly and implicitly influenced by them. Individuation (or gathering information about the specific person in front of you) can, however, be an effective strategy to break automatic associations. This technique can help you see (or assign more weight to) the particulars of a person before you consider his/her age, class, gender, race, ethnicity, sexual orientation, etc.

In this respect, I noted that teachers may be better positioned than, say, doctors or judges, to individuate their students – and that this can facilitate more objective and less stereotypic judgments and more equitable classrooms. I also noted that there are strategies and tools that schools could take advantage of to support teachers’ natural desire to get to know their students well. In my next post, I will provide some additional ideas on how to do this, as well as more information on research-based strategies that have been shown to reduce implicit biases and what their implications might be for schools and educators.



A couple of weeks ago, a colleague asked a great question during the Shanker Institute’s Good Schools Seminar on “Creating Safe and Supportive Schools.” His question was motivated by a presentation on implicit bias by Kirwan Institute director Sharon Davies. The question was: Wouldn’t you expect more conscious, systematic decision-making (and fewer automatic, snap judgments) from teachers who, after all, see their students every day and get to know them well? (See here, minute 50:55.)

As I related in the previous post, individuating (or learning about the particulars of a person, his/her interests, skills, family, etc.) can be a very effective “de-biasing” tool.* So, how might we leverage and support teachers’ natural inclination to get to know students well? How might a potential de-biasing intervention build on this feature of teachers’ work?

The reason I ask this question is that cognitive biases come in all shapes and sizes – stereotypes are just one source. For example, we tend to remember and pay more attention to information that confirms our preexisting beliefs – a.k.a. “confirmation bias.” We also tend to give more weight to information that is presented to us earlier rather than later – a.k.a. the “primacy effect.” Very important too is the “fundamental attribution error”: the belief that, while our own actions can be explained by circumstances (i.e., I yelled at a colleague because I had a stressful day), others’ behaviors are explained by their personalities and dispositions (i.e., he yelled at a colleague because he is a bully). The list goes on and on – for an overview of heuristics and biases, see Thinking and Deciding and Thinking, Fast and Slow. My point is that tools and strategies that can guide and scaffold the way we gather and weigh information are essential if we hope to arrive at decisions that are more objective and less biased — think of structured analytic techniques as “reins for the (often unruly) mind.”

A teacher might be in a better position to get to know his/her students, but this process is still very complex; all kinds of biases such as those mentioned above could get in the way. For example, the primacy effect suggests that something a student does on the first day of class may have more influence on the teacher than subsequent student behavior. But if the teacher has a good way of documenting behavior throughout the year, then he/she might be less prone to remembering more vividly (and thus weighing disproportionately) what the student did or didn’t do that first day. So, the question becomes: What kinds of tools might help teachers collect, record, and evaluate information about their students in a way that is more systematic and less prone to error or bias?

This is not a trivial question and I don’t have all the answers, but let me offer a couple of thoughts. I was curious about ClassDojo, an app designed to help teachers collect and share data on student behavior. It made me wonder if teachers could use this or similar tools to collect information about students’ particular strengths, talents, and interests. I personally like apps because they can facilitate labor-intensive processes, such as gathering, aggregating, and sharing data, or even tasks that require additional expertise, such as data analysis. But let’s face it: low-tech approaches could be just as effective. For example, Getting To Know My Child is a (paper and pencil) mechanism that enables parents to share information with their child’s kindergarten teacher, including questions about the child’s background, health, abilities, preferences, and so forth.

Individuating is an important strategy because it sets the stage for a second, more complex but even more powerful type of intervention: Assigning competence to low status students, a strategy that takes advantage of the power of the teacher as an evaluator. “Assigning competence is a public statement that specifically recognizes the intellectual contribution a student has made to the group task” – more here. Students tend to believe and respect the evaluations that teachers make of them. Thus, if the teacher publicly commends a low status student for being strong on a particular (and real) ability, that student will tend to believe the evaluation. At the same time, the other students in the classroom are likely to accept the evaluation as valid. Once this happens, the expectations for the student’s competence – as well as his/her relative status in the classroom – can rise dramatically, which is likely to result in increased activity and influence of the low status student as well as increased success in future classroom tasks.

In Designing Groupwork (3rd edition, in press), Cohen and Lotan specified that an effective assignment of competence has three critical features: 1) evaluations must be public, 2) evaluations must be specific, referring to particular intellectual abilities and skills and 3) the abilities/skills of the low status student must be made relevant to the rest of his/her classmates. Thus, learning about students’ particular strengths and skills (as well as detecting when these abilities are being demonstrated) is a very important feature of effective status treatments.**


What additional strategies might help mitigate implicit biases? Experimental psychologist Patricia Devine has argued that biases are like “habits”; with effort and practice, they can be broken. According to Devine, three conditions need to be met for individuals to successfully counteract their biases:

  • Acknowledgement that we all harbor unconscious biases and motivation to change.
  • Attention to when stereotypical responses or assumptions are activated.
  • Time to practice strategies designed to break automatic associations.

Devine and her colleagues developed an eight-week, multi-faceted, prejudice habit-breaking intervention – also nicely summarized here. Participants were given a toolkit of five strategies and were asked to practice at least some of them on a weekly basis. After the intervention, participants self-reported increased concern about racial discrimination, and tested lower on implicit bias against African Americans than those in a control group. The strategies were:

  • Stereotype replacement: recognizing when one is responding to a situation or person in a stereotypical fashion, and actively substituting the biased response with an unbiased one.
  • Counter-stereotypic imagining: detecting one’s stereotypical responses and visualizing examples of people who are famous or known personally who prove the stereotype to be inaccurate.
  • Individuating: gathering specific information about a person, so that the particulars of that person replace generic notions based on group membership.
  • Perspective taking: adopting the perspective of a member of a stigmatized group. This strategy can be useful in assessing the emotional impact on individuals who are often being stereotyped in negative ways.
  • Increasing opportunity for positive contact: actively seeking out situations that expose us to positive examples of stereotyped groups.

In my previous post I noted that education – as in, someone’s general level of schooling – does not appear to be an effective “de-biasing” strategy or factor. However, specific training on the mechanisms of implicit bias can be a potent approach. As Correll and Benard explain in a research review of bias in hiring, exposing decision-makers to “systematic, well-designed research that documents the existence of biased processes is one of the most effective types of intervention. Research and case studies have shown that this type of training significantly reduces biases” (also here and here). Once aware of the existence of unconscious bias, individuals tend to be more careful in scrutinizing their own decisions, thereby avoiding the cognitive shortcuts that lead to biased decisions. Finally, Correll and Benard argue, “exposing people to systematic research on cognitive biases is more effective than alternative models of awareness training, such as those where participants examine their own, particular, biases (see here and here).” In several instances, the latter type of approach has actually been found to increase biases.***

I am a big believer in changing contexts (not individuals) and making it easier for people to “do the right thing” or, in our case, engage in structured thinking and ditch the tendency to respond to other people and situations automatically. So, in addition to training and practicing strategies that can help break the habit, what features of the local context may support structured decision-making?

Accountability: “Holding individuals accountable for their decisions has helped to reduce bias in hiring and promotion decisions,” argue Correll and Benard (see also here, here, here and here). “When managers know they will be required to justify their actions (particularly to an impartial authority), they tend to engage in more complex thought processes and fewer snap judgments (here and here).” A study by Foschi (1996) found that participants were less likely to hold women to a higher standard than men when they were required to explain their responses to a partner in a subsequent task. “Requiring those responsible for making decisions to explain those decisions to a disinterested third party helps preempt the introduction of bias into decision making.”

Transparency: When criteria are objective and explicit, it is easier to ensure that everybody is held to the same standard. Researchers Uhlmann and Cohen (2005) found that listing job requirements immediately prior to selecting a candidate constrained opportunities to use subjective criteria during candidate selection. Subjective criteria allow bias to be hidden because the standards by which decisions are made are unclear.

Time: Allowing sufficient time to make decisions is another important contextual element. In a field study, Ayres et al. (2004) found that African-American cab drivers received lower tips than white drivers. The authors concluded that decisions made quickly, when one is preoccupied with other things, can result in unconscious discrimination.****

Diversity & Messaging: As mentioned earlier, intergroup contact is one of the best researched means of reducing both explicit (here and here) and implicit bias (here and here). Exposure to a male pre-K teacher, to a black school principal, or to a female mathematician can help challenge and expand our assumptions (conscious or unconscious) about who is good and bad at certain tasks – and even the nature of those jobs and the skills required to do them well. This is important because it helps us break free of the strong cultural associations that often place needless limitations on the aspirations and achievements of women and minorities. Because of its importance, I will address the role of diversity and messaging as “institutional-level de-biasing strategies” in my next post.

*****

* I am not suggesting that teachers, because they are teachers, are immune to implicit biases – in fact there is research suggesting that they are not (see Kirwan Institute’s recent review, pp. 30-35).

** There are other important preconditions that must be met for status treatments to work. For example, multiple-ability tasks are a “necessary condition for teachers to be able to convince their students that there are different ways to be “smart.” Students who do not excel at paper-pencil tasks often do excel when academic content is presented in different ways. Tasks that require multiple abilities give teachers the opportunity to give credit to such students for their academic and intellectual accomplishments.” More here.

*** See also, Rudman et al 2001; Legault et al 2011; Mann & Kawakami 2012; Plant & Devine 2001; Valian 1999; Wilson & Brekke 1994. While the motivation to be non-prejudiced may lead to reduced discrimination (Plant, Devine & Peruche 2010), thinking of oneself as non-prejudiced may increase discrimination – e.g., instructing people to assert that they are objective decision-makers prior to a hiring decision increases gender discrimination (Uhlmann & Cohen 2007).

**** For more on the importance of time see Beattie, 2013; Bertrand, Chugh, & Mullainathan, 2005; Richards-Yellen, 2013.



The arguments for increasing the representation of people of color in teaching are often based around two broad rationales. First is the idea that, in a diverse, democratic society, teachers of color can serve as important role models for all children. The second idea is that teachers of color are particularly well suited to teaching students of color because they possess an inherent understanding of the culture and backgrounds of these learners.

I can think of at least two additional pro-diversity arguments that are relevant here, not only for schools but also for the broader landscape of work organizations. First, diversity can increase everyone’s sense of “fitting in” in a given setting; social belonging is a basic human need that can in turn predict a wide range of favorable outcomes. Second, diversity can do more than offer role models. Repeated exposure to male pre-K teachers or black, female high school principals can challenge and expand our thinking about who is or is not suited to certain tasks – and even the nature of those jobs and the skills required to do them. This is important to the much broader goal of fairness and equality because it contributes to disrupting strong stereotypic associations present in our culture that too often limit opportunities for people of color and women.

As I noted in the first two posts of my implicit bias series (here and here), intergroup contact is one of the best researched means of reducing explicit (here and here) and unconscious (racial, gender) bias (here and here). This post explains why and how faculty diversity can act as an institution-level “de-biasing” policy or strategy.


Stereotypes & Micro Aggressions: (More Than) “Comments That Sting”

A recent New York Times story called attention to “micro aggressions” or “brief and commonplace daily verbal, behavioral, or environmental indignities, whether intentional or unintentional, that communicate hostile, derogatory, or negative racial slights and insults.” The term might be relatively new, but the fundamental interactional status processes that it captures have been studied since the 1950s.*

There is a well-established body of theory and research documenting why, when and how automatic mental associations trigger unconscious behaviors that shape social situations that are often high stakes. We’ve known for a long time that women in work groups are more likely than men to be interrupted, and often report that their ideas are ignored or mistakenly credited to a male coworker. African Americans often feel that they have to perform twice as well as their white coworkers to be given the same level of recognition. Ideas often sound better when offered by someone perceived to be attractive.

As Shelley Correll and Cecilia Ridgeway (2003) explain:

What all of these observations have in common is that some members of a group seem to have real advantages that are denied to others. They have more opportunities to speak, their ideas are taken more seriously, and they have more influence over other group members. (…) These hierarchies of evaluation, influence, and participation are referred to as the ‘power and prestige structure’ or the ‘status structures’ of the group.

Various theories explain how these structures emerge and are maintained, and how they contribute to other aspects of social inequality.

I am not a huge fan of reinventing the wheel here, but perhaps the term “micro aggression” provides an additional, more accessible way to draw attention to the complex processes briefly outlined above. But, perhaps, given the numerous hostile reactions to the Times article, the problem is not one of simplification but precisely the opposite. Here’s one comment to the article that did capture the issues well:

Many commenters here seem to believe that “innocent” and inadvertent utterances that promote stereotypes should be forgiven because no offense was intended. I disagree. (…) There is also the victim-blaming argument echoed by bullies throughout time: “Toughen up,” which does nothing to address, for example, stereotype-fueled hiring bias. We can either make excuses for complacency, and ignore the harm that our collective contributions to stereotyping do to others, or we can try changing the societal status quo by objecting to such utterances, making people aware that some of the stereotypes they “inadvertently” perpetuate tacitly condone a society where a multitude of groups have fewer opportunities because of unconscious systemic bias.

The key here is that some of these micro aggressions occur in (and/or are salient in) high-stakes situations, such as job interviews, workgroup discussions, or even taking the SAT – in other words, they can subtly shape the results of these situations. So micro aggressions aren’t just annoyances. They have real consequences that transcend the specific moment in which the micro aggression occurs. In addition, their effect is cumulative; individually, they can be brushed off, but after a while it is difficult for anyone who hears them to remain immune to their underlying message. Finally, micro aggressions can be really subtle, making it harder to call out the perpetrator without appearing to overreact. They range from things like mistakenly introducing someone as someone else of the same race, or commenting on “how articulate” an African American is, to more consequential incidents like attributing a woman’s idea to the male coworker sitting next to her.

Personally, as a non-native speaker of English, I’ve experienced my share of these situations. I am routinely asked things like: “Did you know any English when you came to the U.S.?” I usually respond politely but am often tempted to say: “No, I somehow learned all my English while completing a doctoral program at an Ivy League school – I am that kind of genius.” Some people repeat something they just said, replacing a word they perceive as sophisticated with one they perceive as more colloquial. The underlying message of many of these is that having an accent somehow detracts from your general intelligence.

Why do these seemingly small things matter? Because they shape how we view ourselves, as well as how others perceive us and our abilities (step 3 in the figure below). Self- and third-party evaluations, in turn, affect our aspirations and decisions about what fields we want to pursue, the jobs we see ourselves holding, etc. (step 1). Finally, individuals in a society form collective, broadly shared beliefs about who does what based on what’s around them (step 2). Micro aggressions (stereotypes, status beliefs) contribute to processes in step 3 but shape things all the way up to macro-level forms of social stratification and inequality, such as the gender segregation of paid work. If we change things at step 1 — e.g., by changing the composition of workplaces like schools — we can help to disrupt this “vicious cycle.”


It’s Not About Changing People; It’s About Changing Contexts

Broadly shared beliefs change when enough experiences around us disconfirm and replace our unconscious, stereotypical associations – that is one of the reasons why we need more male pre-k teachers, more African American college professors, more female CEOs, and so forth. Beliefs also change when we expand the way we understand occupations and roles.

In Unlocking the Clubhouse, Margolis and Fisher (2002) examined the many influences contributing to the gender gap in the field of computing. Over a period of four years, the researchers interviewed more than 100 computer science students of both sexes at Carnegie Mellon University, conducted classroom observations, and held conversations with hundreds of college and high school faculty. A recurring theme from these interviews was that a computer scientist is a “geek,” someone who can’t stop thinking about computers, hacking, and so on. All participants alluded to this image at some point. When asked the extent to which they thought they fit that image, three-quarters of the men agreed, compared with only about one-third of the women. One conclusion stemming from the work is that the image we hold of computer scientists needs to be expanded and, frankly, portrayed more accurately. The authors concluded that it is not women who need to be “changed” but, rather, our understanding of what computer science as a discipline of study is all about.

In another experiment, researchers Murphy, Steele & Gross (2007) recruited male and female STEM majors and showed them a promotional video of a future STEM conference, depicting either an unbalanced ratio of men to women or a balanced ratio. Female students who viewed the unbalanced, majority male video exhibited more cognitive and physiological vigilance, and reported a lower sense of belonging and less desire to participate in the conference than did women who viewed the gender-balanced video. Men were unaffected by this situational cue. This suggests that we need to make contexts inviting for those we want to attract, and “engineer” situations so that everyone feels that they fit in.**

Feeling socially connected is a basic human need that can predict a wide range of favorable outcomes. Walton & Cohen (2007) showed that stigmatization can trigger “belonging uncertainty,” a state in which people are sensitive to information diagnostic of the quality of their social connections. The researchers showed that belonging uncertainty undermined the motivation and achievement of people whose group was negatively characterized in academic settings. Students were led to believe that they might have few friends in an intellectual domain; while white students were unaffected, black students displayed a drop in their sense of belonging and potential. Then the researchers designed an intervention that mitigated doubts about social belonging in college, which raised the academic achievement (e.g., college grades) of black students, but not of white students.

What’s important about the three studies I highlighted above is that it is possible to reverse these persistent social trends just by tweaking the characteristics of our local contexts. It’s not so much about patting women or other disadvantaged groups on the back or saying “toughen up” (as many of the comments on the Times story advised). Rather, the research suggests that we need to think carefully about what our organizations project, who needs to be recruited, how jobs and roles are characterized, and what images and symbols dominate our organizations.

A diverse teaching force can help erode stereotypes and biases by changing the interpersonal configuration of actors in a given setting, and allowing more stereotype-disconfirming experiences to permeate the society. Commitment to diversity sends a strong institutional message. Organizational policies, laws, and institutions have an expressive importance — they signal a consensus about what we value and desire as a society. These two factors combined can favorably shape the aspirations and behaviors of students and educators alike, the lenses through which they judge their own potential (and the potential of others), and, ultimately, their choices and accomplishments.

*****

* In addition to the status scholarship, which I discuss in the post, MIT professor Mary Rowe wrote about “micro-inequities” in 1973 defining them as “apparently small events which are often ephemeral and hard-to-prove, events which are covert, often unintentional, frequently unrecognized by the perpetrator, which occur wherever people are perceived to be ‘different.’”

** Recent research suggests that this needs to be done well or it can backfire. In an article published in the journal Group Processes & Intergroup Relations, researchers investigated the reactions produced by the “over-representation” of minority images in a flyer advertising a local university. The study found that white students felt more positively about a flyer that overrepresented the proportion of Asian students on their campus than about a flyer with more accurate depictions. However, students of Asian ethnicity (a stigmatized minority group in Australia) felt less favorable towards the advertisement that showed many Asian faces than toward a flyer that showed a more realistic number.