

 

 Abstract

Excerpted from: Chris Chambers Goodman, Shadowing the Bar: Attorneys' Own Implicit Bias, 28 Berkeley La Raza Law Journal 18 (2018) (209 Footnotes) (Full Document)

 

What is it that attorneys fail to see because of biases they do not believe they have? They are quicker to note explicit biases and try to counter them, yet so many do not even consider the possibility that they may have acted according to implicit biases. While explicit biases are less pronounced, less tolerated, and less often spoken, unconscious bias impacts decision making every day, and there are many open questions still to explore.

For instance, what are the driving factors behind “prosecutorial discretion,” in charging decisions generally and the death penalty specifically? When defense lawyers defend their clients, which “stock stories” do they use, which ones do they believe and which ones do they encourage in their efforts to convince the judges and juries whose decisions can be influenced by their own explicit and implicit biases? On the civil side, can attorneys become better at understanding why they may make certain strategic choices for one client and different strategic choices for another?

This article analyzes the implications of implicit bias in the legal profession, focusing on how the implicit biases of attorneys impact litigants. Part one summarizes research in the cognitive science field defining bias and explains some of the Harvard Implicit Association Tests (IAT). Part two describes some studies conducted on juror and judicial bias in the courtroom, as well as those dealing with the bias of attorneys in criminal cases. Using this background, part three provides an analysis of the impact of an attorney's implicit bias on her strategic decision making and conduct of civil litigation, its results for clients, and its impact on the justice system. Part three also provides an argument for the American Bar Association's (ABA) proposed rule for a negligence standard regarding ethical regulations intended to ensure that lawyers work harder to overcome biases to better serve justice. The conclusion in part four proposes a framework for interrupting biased behaviors.

 


Part One: Implicit Bias Research

A. Identifying Bias

Bias is the pre-judging of a person based on his or her perceived or actual status as a member of a particular group, without regard to that person's actual conduct or performance. Biases can be explicit or implicit.

Explicit biases are easier to identify and explain. People who have explicit biases will express those biases, verbally or in writing. They are aware of their biases and admit to having the bias if asked or challenged. They are deliberate in relying upon those biases and have an animus--a mental state similar to purpose or knowledge. For instance, a person may say, “African American men are more prone to violence than white or Asian men.” When faced with a situation requiring him to assess the potential violence of an approaching man, he will readily admit to this bias being a part of his thought process in deciding whether self-defense strategies are needed.

By contrast, implicit bias is unintentional from a mental state perspective: People do not know they are speaking or acting in a particular way because of the influence of a bias. For instance, when participants were primed with either pop or rap music and asked to evaluate a Black person's behavior, those primed with rap music rated the behavior as more aggressive and the individual as less intelligent. Similarly, a recent study asking participants to estimate height and weight based on photos of faces found people estimated that Black male faces belonged to taller and heavier bodies than white male faces, when in reality the white male faces were from bodies that were taller and heavier. These subjects did not intend to discriminate and were not conscious of the discrepancies between their estimates and reality. Still, those estimates can have an impact.

The implicit bias field has been developing over the past several decades. The research focuses on how human brains work to process information and make decisions. When people have time to evaluate information and make a decision, that decision can be more deliberate. Deliberate decisions are more thoughtful and purposeful, relying upon our analytical skills. By contrast, when making decisions quickly, people rely on intuition and on somewhat automatic processing by their brains. For instance, people know that a flame is hot and infer that a pot sitting on the stove over an open flame is hot. So, there is no need to think about whether or not to use an oven mitt when lifting the pot from the stove: people will reach for the oven mitt.

Similarly, lawyers may know the courthouse contains many white judges. So, when feeling time pressure and stress after arriving late at an unfamiliar courtroom, and upon seeing a Latina step into the room, the lawyer may ask her when the judge is expected to take the bench. This quick reaction may be based on an implicit bias: the lawyer may presume that the Latina is not the judge, or (more pejoratively) that she must be the clerk. If the attorney took the time to think about it, the attorney might not have made that assumption; but when faced with a quick decision, the brain defers to its well-worn paths to help process information quickly.

One well-worn path is that most judges are white men. Another well-worn path is that most people of color in the courtrooms are either parties or court employees. These are both true statements in the experience of many. And they both reveal potential implicit biases. Implicit bias is not easy to uncover, explain, or analyze. It is one's intuitive reaction in situations that generally are immediate or require fast-thinking and quick judgment calls. When confronted or simply asked, people are likely to reject the notion that they are biased and will deny that they behaved in a biased way.

The Harvard Implicit Association Tests are the “most well-known and highly regarded measure of implicit bias.” According to data from the Race IAT, eighty-eight percent of white Americans have implicit bias against African Americans. Further, forty-eight percent of African Americans have an implicit bias that favors white Americans. There is significant support for the conclusion that, “in the aggregate, implicit bias can have a substantial impact on perception, judgment, decision making, and behavior.”

While having bias is not necessarily the same as acting upon it, some people may call upon these biases when making decisions. For instance, these biases could influence someone's decision on whether a defendant is guilty or innocent. The next sub-section describes a selection of these studies.

B. The Stereotypic Association Between African Americans and Violence

Words can activate implicit biases. For instance, researchers often use a technique called “priming,” which involves exposing a test subject to some focused stimulus before the actual test occurs. The control group may have no priming or be primed with something intended to be neutral. In some IATs, when primed with “words associated with Blacks, such as slavery,” subjects were more likely to rate the ambiguous behaviors of a male as hostile even when the race of the male was not specifically identified. This particular study involved mock jurors, whom the researchers concluded were impacted in two ways: (1) they were more likely to consider ambiguous evidence as supporting guilt, and (2) they were more likely to believe that the defendant actually was guilty.

In addition, test subjects were more likely to interpret ambiguous behaviors as aggressive when Black actors rather than white actors were performing the action. Several studies have evaluated the association between African Americans and violence by analyzing whether a suspect's race influenced the participant's decision to fire a weapon at a suspect, as well as whether race influenced the time spent deliberating before making the decision as to whether or not to shoot.

In one study, the researchers developed a simple video game with twenty different backgrounds and eighty different target images. Ten African American men and ten white men posed as models for the target images. The models appeared in the video game multiple times in different scenarios and positions. Sometimes the targets were armed with guns and other times they were not armed but holding “no-gun” objects. When participants played the game, they encountered a slideshow of different backgrounds. An image of a man randomly appeared and the participants were prompted to make a decision as to whether to shoot this target, after having been told they needed to react quickly to shoot armed suspects. The results of the study show that potentially hostile targets were identified more quickly if they were African American, and participants were also more likely to miss an armed target if he was a white man.

A second study repeated the first study with shorter time frames in which to decide, thus further activating the brain's automatic shortcut processes. Researchers found that participants “set a lower threshold” for shooting African American targets, which can be interpreted to mean that they were more willing to shoot “less threatening” African American targets.

A third study tested whether participants used stereotypic associations between African Americans and violence to help them decide whether to shoot. The subjects were forty-eight undergraduates (twenty-six female and twenty-two male) playing the same video game. They also completed a questionnaire to examine whether they endorsed a negative stereotype of African Americans as dangerous or aggressive. The results suggested it was knowledge of the cultural stereotype rather than personal prejudice that influenced the decision to shoot. Knowing about stereotypes is more pervasive than subscribing to them, but if conduct is partially determined by this knowledge, rather than acknowledged prejudice, acting in response to it would not constitute purposeful discrimination. As intent or purpose is required for actionable state actor discrimination, the lack of intent negatively impacts the ability to effectively combat stereotypes in the justice system.

A fourth study in this group used the same video game parameters, and the participants included fifty-two adults (among them twenty-five African Americans and twenty-one white Americans) selected from bus stations, malls, and food courts. The results showed that the decision to shoot African Americans more quickly did not differ between white participants and African American participants.

The researchers' analysis of these studies led to four findings: (1) white participants made the correct decision to shoot an armed target more quickly if the target was African American; (2) white participants decided not to shoot an unarmed target more quickly if he was white; (3) the magnitude of bias varied with perceptions of the cultural stereotype and with levels of contact, but not with racial prejudice; and (4) a follow-up study showed that the levels of bias were the same among African American and white participants in a community sample. The researchers concluded:

In four studies, participants showed a bias to shoot African American targets more rapidly and more frequently than White [sic] targets. The implications of this bias are clear and disturbing. Even more worrisome is the suggestion that mere knowledge of the cultural stereotype, which depicts African Americans as violent, may produce Shooter Bias, and that even African Americans demonstrate the bias.

This evidence shows the stereotype has an impact on the behavior of people deciding whether or not deadly force is warranted. If students and community members demonstrate this bias, the next question is to consider those who are charged with enforcing the law, and who have more opportunities to make a decision about whether or not deadly force is justified. The next section evaluates some of these studies.

C. The Stereotypic Association Between African Americans and Crime

The media plays a large role in molding our stereotypic associations. For example, “regularly seeing images of Black but not white criminals in the media may lead even people with egalitarian values to treat an individual Black as if he has a criminal background or assume that a racially unidentified gang member is Black.” The association between Black people and crime, and the lack of such an association for white people, influences society at large and police officers in particular.

Another group of studies included police officers as subjects to analyze the influence of stereotypical associations on visual processing. These five studies aimed to identify whether a person's preconceived notion about a person or group of people influenced what that subject perceived when viewing certain images or objects.

Specifically, the first study “investigated (a) whether the association between Blacks and crime can shift the perceptual threshold for recognizing crime-relevant objects in an impoverished context and (b) whether these perceptual threshold shifts occur despite individual differences in explicit racial attitudes.” The participants were primed with Black male faces, white male faces, or no faces. In an unrelated task, they were shown images of objects with incomplete pixels, such that it was difficult to identify the object initially; and as more pixels were added, the resolution gradually enhanced. The subjects were asked to push a button at the point when they thought they could identify the object and write down their guess. The images included both “crime-relevant (e.g., a gun or a knife) and crime-irrelevant (e.g., a camera or a book)” objects.

The authors concluded that “black [sic] faces triggered a form of racialized seeing that facilitated the processing of crime-relevant objects.” Further, in comparing the participants who were primed with white faces and those who were not primed with any faces, the authors found that the mere priming with white faces actually inhibited the detection of crime-relevant objects. People are thrown off guard and have a more difficult time connecting crime objects with white faces, which may lead to both a lower expectation of danger from white actors and a lower percentage of deadly force engagements.

Participants in another study were all police officers, seventy-six percent of whom were white. First, they were primed with “crime words,” such as “violent, crime, stop, investigate, arrest, report, shoot, capture, chase, and apprehend.” Then, the participants were asked to look at photos of sixty Black male faces (with features stereotypically associated with Blacks) and sixty white male faces. Next, they were asked to participate in a surprise face-recognition task where they viewed images of Black “lineups” and white “lineups” and were asked to identify any faces they were shown during the previous task.

This study found the police officers were more likely to identify a face that was more “stereotypically Black” than the target they actually were shown when they were primed with crime words. “Priming police officers with [words associated with] crime caused them to remember Black faces in a manner that more strongly supports the association between Blacks and criminality.” The authors determined:

Researchers have highlighted the robustness and frequency of this stereotypic association by demonstrating its effects on numerous outcome variables, including people's memory for who was holding a deadly razor in a subway scene .... [¶] The mere presence of a Black man, for instance, can trigger thoughts that he is violent and criminal. Simply thinking about a Black person renders these concepts more accessible and can lead people to misremember the Black person as the one holding the razor.

From these studies, the authors drew five conclusions: (1) Black faces influence a participant's ability to detect degraded images of crime-relevant objects such as guns and knives; (2) showing crime-relevant objects to participants prompts them to visualize Black male faces--suggesting that the association of Black and criminality is bidirectional--i.e., when participants saw Black faces, they visualized violent objects and when they saw violent objects, they visualized Black faces; (3) these associations exist based on both positive and negative images because when the participants were exposed to positive stereotypical images involving Black people (basketball and athletics), the results were similar; (4) police officers associate the concept of crime with Black male faces and priming police officers with crime words or concepts “increases the likelihood that they will misremember a face as more stereotypically black [sic] than it actually was;” and (5) the more stereotypically Black a face appears, the more likely police officers are to report that the face looks criminal. The impact of this association suggests there is a strong bias among police officers themselves that Black people are more likely to be engaged in criminal activity than white people, which affects who is stopped, frisked, questioned, and detained in their community encounters.

Another IAT experiment identified a potential implicit racial bias in favor of guilt, despite the presumption of innocence. Professor Demetria Frank discusses the applications of cross-racial identifications and their unreliability, as well as the overrepresentation of Black people in the criminal justice system. From her evaluation of several studies, she concluded, “Whites are more likely to exhibit racial neutrality in decisions where race is a salient feature in the trial or when normative cues to avoid bias are strong.” When attention is called to bias, people are on guard and make the effort to be race-neutral.

It is imperative that lawyers and judges understand the implications of these and other research studies. Lawyers and judges are often forming their own opinions and making decisions based on the evidence they are given, which includes eyewitness testimony. If implicit bias is so strong it can cause actual eyewitnesses to incorrectly remember who was the perpetrator during a crime, then implicit bias not only affects the lawyers and judges directly but also indirectly as they develop strategies and arguments in reliance upon this testimony.

D. Critiquing Implicit Bias

There are some notable challenges made against the implicit bias testing regime within the scientific community. Those challenges include: defining what is and is not implicit bias; physical processing of information used to measure implicit bias; whether the samples are sufficiently generalizable; what the level of correlation says about causation; how predictive the measures can be; and the impact of changing levels of implicit bias. On the first issue, there are questions about whether the distinction between explicit and implicit biases is a spectrum rather than a bright line. In other words, are we really measuring what we think we are measuring?

On the physical processing issue, one instrumental critique is that younger people generally have quicker reflexes and are better at video games and sending text messages with one finger per hand than older adults, which could lead to tests revealing greater levels of bias in older people. Any critique about varying reflex times would necessarily undermine the perceived validity of the test results because part of the IAT relies upon measuring the difference in the amount of time it takes the subject to react when processing “two stimuli that are strongly associated (e.g., elderly and frail)” with “two stimuli that are less strongly associated (e.g. elderly and robust)” to test the existence and clarity of pathways.

In terms of subject sampling, many of the subjects are college students, who are overrepresented in the data, and the studies that have used other subjects recognize disparate outcomes for other groups. While participation in the IAT is open to all through its website, people self-select; and therefore extrapolations about their results may be misleading as to the general population.

On the issue of correlation and causation, critics question whether implicit bias governs people's actions. Even if the IAT tests accurately predict the existence of pathways that evidence implicit bias, they do not demonstrate that people act consistently with the implicit biases that the test measures. Some studies show that only about four percent of variance in discrimination-relevant criterion measures is predicted by the Black-white race IAT measures; but if implicit bias accounts for only about four percent of behaviors, over ninety-six percent remains, and that is no greater an impact on behavior than what some have found from explicit bias. Others found an effect size of discrimination closer to zero.

Researchers measure what people do (such as whom employers hire, or whom juries find to be guilty), but cannot measure what they are thinking when they actually engage in that conduct. Nor do researchers know whether subjects are more likely to engage in biased conduct based on their implicit bias score. The IAT finds that many people have a high level of favorability for white males and words associated with leadership roles, but it does not tell whether these subjects will actually give a hiring preference to white males when they have the opportunity to do so. We may know that certain individuals promoted white men in the past, but the IAT does not explain whether their preferences for associating white males with leadership caused, or played a role in, their decision to promote a white male. The issue is that “behavior toward black [sic] people, or white people in isolation, cannot be operationalized as discrimination ... since they fail to capture differential treatment. Hence, treating a black [sic] person badly is not discrimination per se; it only becomes discrimination if the treatment is worse than the treatment of an equivalent white individual.” Thus, the research does not explain whether people will act in accordance with their implicit preferences, nor whether their past acts were because of those implicit preferences.

On the issue of predictive validity, others also criticize the value of the IAT, noting “severe validity and reliability issues,” and stating that:

[T]he most important finding of the present study is that the current literature is uninformative as to whether the IAT can predict discrimination or not, as it turned out that too many studies failed to measure or provide evidence of discrimination actually occurring in the first place. Hence additional empirical work is needed.

They “strongly caution against” applying the IAT based on any assumption that it can or does predict discrimination. For instance, someone could show an amygdala reaction that equates with prejudice against Black people when tested, but actually treat Black and white people the same way in the real world. That same person could be explicitly biased against Black people, and still refrain from treating them differently in a particular situation.

In a study that evaluated changes in levels of implicit bias, researchers found that while some procedures changed levels of implicit bias, their impact was very small. For instance, “[p]rocedures that associate sets of concepts, invoked goals or motivations, or tax people's mental resources produce the largest changes in implicit bias, whereas procedures that induced threat, affirmation, or specific moods/emotions produce the smallest changes.” Appeals to fear, pride, and other emotions had the least impact on implicit bias measures. Conversely, “big picture” strategies addressing associations and goals are less tangible and had a larger impact in changing implicit bias levels. This study further notes that “even the procedures that produced robust effects on implicit bias had effect sizes that are ‘small,’ both by conventional standards and as compared to typical effect sizes in social psychology.”

While these procedures had some, albeit limited, impact on implicit biases, they had no significant impact on explicit biases and behaviors. The researchers were surprised to find “little to no evidence that the changes caused by procedures on explicit bias and behavior are mediated by changes in implicit bias.” Recognizing the limitations of their analysis--that most of the studies rely upon university student samples, which can differ significantly from the larger world--they noted the need for further research to better understand “changes in implicit biases and their role in explicit bias and behavior.” They concluded that it would be “more effective to rid the social environment of the features that cause biases on both behavioral and cognitive tasks, or equip people with strategies to resist the environment's biasing influence.”

Instead of recognizing and measuring implicit biases, researchers suggest eliminating environmental biases altogether (“biases in the air” so to speak) or training people to make bias-free decisions. In order to make an implicit bias-free decision, one has to notice bias, identify it, and then act consciously; and recognizing and measuring biases are important steps on that path. Other researchers, like Professors Greenwald and Banaji note that even taking these and other meta-analyses into account, “[t]his level of correlational predictive validity of IAT measures represents potential for discriminatory impacts with very substantial societal significance.” These apparently small impacts in individual cases can add up to a significant impact on how litigants are treated in the court system in general and by individual attorneys and judges in particular.

In the same journal, others reply that despite the additional research and analysis, “by current scientific standards, IATs possess only limited ability to predict ethnic and racial discrimination and, by implication, to explain discrimination by attributing it to unconscious biases.” The authors criticize Professors Greenwald and Banaji because they “focus on a set of implicit bias effects without considering the vast array of other realistic effects that could be competing with implicit bias in any setting.” The authors conclude with a word of caution noting:

[I]f one allows anything to grow unimpeded--be it money in the bank or an epidemic or the ripple effects of unconscious bias in a population--that phenomenon will eventually, with enough time, grow to gargantuan proportions. That is mathematically uncontestable. Whether the small effects of unconscious bias that are suggested as at least possible from these meta-analysis will in reality grow, be contained or disappear in complex, real-world social systems is a question that should be resolved through vigorous empirical testing, not computer simulations and thought experiments that, by their nature, must rely on strong yet untested assumptions.

So, does the IAT really help?

E. The Next Generation of Implicit Bias Research

In responding to some of these critiques, others note that too little attention has been focused on the way decision makers justify their decisions after the fact. It is not clear whether people make decisions and then explain how bias did not play a role in the decision, or make decisions and then create an explanation showing that bias did not play a role, even if bias actually did have an impact. Professor Kang and others note:

[B]roadly speaking, this research demonstrates that people frequently engage in motivated reasoning in selection decisions that we justify by changing merit criteria on the fly, often without conscious awareness. In other words, as between two plausible candidates that have different strengths and weaknesses, we first choose the candidate we like--a decision that may well be influenced by implicit factors--and then justify that choice by molding our merit standards accordingly.

The authors describe an experiment involving subjects evaluating finalists for a job as police chief, with one of each gender and different profiles that suggested either “book-smart” or “street-wise.” The subjects were asked to rank the candidates and then identify the factors that contributed to their ranking. Depending on which candidate they selected, the subjects ranked factors such as education and experience differently, leading the authors to conclude that “what counted as merit was redefined, in real time, to justify hiring the man.” When the man had more experience and less education, the subjects noted that experience was ranked more highly after the fact. When the man had more education and less experience, education was ranked more highly after the fact.

The next question was whether this post-hoc valuation of factors was done consciously to provide a cover story, or was merit “re-factored in a more automatic, unconscious, dissonance-reducing rationalization, which would be more consistent with an implicit bias story?” Further research tested this question. Participants evaluated college admissions decisions for African American and white candidates, with variations in their GPAs and in the number of Advanced Placement courses taken. When asked to identify which criterion was most important, the rankings of whether GPA was the most important factor changed depending on whether the white or the African American candidate had a higher GPA.

Even where the participants were not selecting who would be admitted--because admission decisions had already been made--and thought they were simply identifying the most important criteria, their assessments of value varied such that the white applicant satisfied the higher-valued criterion. The process of “reasoning from behavior to motives, as opposed to the folk-psychology assumption that the arrow of direction is from motives to behavior, is, in fact, consistent with a large body of contemporary psychological research.”

This outcome suggests the subjects are not consciously trying to justify what they know to be biased decisions, but rather that the bias is truly unconscious and the brain engages in a dissonance-reducing rationalization. These authors also studied how jurors evaluate attorneys and what implications those evaluations have for the client, which will be addressed in part two below.

 


Part Two: Implicit Bias in Criminal Cases

A. Prosecuting Attorneys and Prosecutorial Bias

Implicit bias can operate at many of the stages of a criminal case, given the choices prosecutors face along the way. Prosecutors have discretion when deciding whether to charge a person for a crime at all. Studies show prosecutors are more likely to charge Black suspects than white suspects in similar circumstances. For instance, on the issue of justified homicide and self-defense, the cell phone/weapon IAT results suggest that “prosecutors might be more likely to believe that the white victim was reaching for his cell phone, and thus, that the suspect acted unreasonably in shooting the deceased;” but when the victim is Black they are more likely to find that the suspect acted reasonably in discharging the weapon. The bias operates doubly here, because the white victim is more likely to be perceived as reaching for a cell phone whereas the Black victim is more likely to be perceived as reaching for a weapon, and the Black suspect is more likely to be perceived “as reacting unreasonably in discharging his weapon” against the white victim, whereas the white suspect is more likely to be perceived as “being in reasonable fear” of the Black victim, and therefore acting appropriately in discharging his weapon.

The prosecutor also has discretion as to what level of crime to charge each individual. The Black or Latino male “drug dealer” stereotype can impact whether a prosecutor files a simple possession charge or a possession “with intent to distribute” charge. Similarly, the decision to charge juveniles in adult court can be impacted by the race of the individual.

Determining the level (or even availability) of bail is impacted by presumed ties to the community, which often are based on race, ethnicity, and socioeconomic status. Justice Hyman notes that “these discretionary decisions are closely tied to the prosecutor's evaluation of the suspect's behavior and whether a suspect seems likely to be a future danger to society.” In short, “the perception of the defendant ... in turn, alters the perception of the seriousness of the crime.”

But when is the prosecution evaluating the suspect's behavior? If before arrest, such as in the decision whether to seek a warrant, the prosecutor is generally seeing things unfold through the lens of the police or investigator, who is interpreting and reporting the suspect's actions, influenced by his or her own biases. When the prosecutor interprets the officer's report, that interpretation is influenced by the prosecutor's own biases. Thus, two levels of bias could creep in before the prosecutor even makes a charging decision.

If the prosecutor's evaluation of the defendant's behavior is post-arrest, common and expected reactions to being in jail--where one must not show fear or remorse, act “hard” and most of all be silent and self-protective--may leave an unfavorable impression with the prosecutor. The same person likely behaves very differently out in society on bail than when in prison or jail.

On the issue of plea-bargaining, the power of “in-group favoritism” impacts how empathetic a person will be when seeing or hearing about another person experiencing pain. For instance, studies have shown people empathize more with a lighter-skinned person being subjected to pain than with a darker-skinned person being subjected to pain, particularly when the people being measured have lighter skin. Empathy can lead to lower sentences, and the evidence shows that white defendants also receive more favorable plea bargains than Black defendants.

In jury selection, peremptory challenges allow implicit racial bias to seep in when Black jurors are disproportionately stricken. Implicit bias seems to play a large role during the voir dire process, with peremptory strikes often reflecting an attorney's own unconscious stereotypes about which jurors to strike without stating a reason. For instance, prosecutors may strike African American jurors who live in the inner city, applying a stereotype about Black Lives Matter supporters, when any particular person in that group could be “tough on crime” based on crime's devastating impact on the community. The remaining jurors are likely to be those whom the attorney believes fit a favorable stereotype, and they too could actually hold counter-stereotypic views.

Thus, prosecutors will attempt to keep those perceived to be pro-law enforcement jurors (just as the defense may strike otherwise pro-defense jurors or keep pro-prosecution jurors), all because they assess the jurors based on what could be erroneous or inapplicable stereotypes. Even though the striking attorney can offer a nonracial reason for the strike, that reason may be covering for implicit bias as an after-the-fact, dissonance-reducing justification.

This potential for prejudice can continue into the conduct of the trial. The prosecutor hears and evaluates the testimony of the witnesses, and makes judgments about the credibility of those witnesses, which influence the substance of the prosecution's closing arguments. Black defendants are more often dehumanized by the words and phrasing used in the prosecution's closing arguments, such as referring to Black defendants as “animals.” Professor Frank cautions against references that implicate racial stereotypes. Courts “seldom find that such references deny nonwhite defendants a fair trial,” and therefore:

[U]ncharged acts admitted against non-White [sic] defendants likely assist jurors in filling any evidentiary gaps in the prosecution's case with implicit race associations. This phenomenon is consistent with studies demonstrating the jurors are more likely to convict racial minorities than non-White [sic] defendants when the evidence is ambiguous, versus weak or strong.

Prosecutors use language that is persuasive to jurors, and defense attorneys must employ counter-stereotype techniques in their efforts to vigorously represent their clients because judges are not likely to step in. Nevertheless, prosecutors, as representatives of the state, must first pursue justice rather than any individual's interest, and this asymmetry suggests a more compelling case for curbing implicit bias in prosecutors than in defense lawyers.

B. Criminal Defense Attorney Bias

Criminal defense attorneys also demonstrate implicit bias. Defense attorneys may be affected when primed with words or conduct associated with gangs or violence in their everyday practice. They may also be more likely to believe that ambiguous evidence relates to guilt or that the defendant is actually guilty in conformance with the stereotype. Defense attorneys may also attribute a higher level of violence to conduct allegedly committed by Black defendants than by white defendants. Defense lawyers also rely on stereotypes when they pick juries, making assumptions about jurors based on their race, and Lyon cautions:

[I]f we are using this process of elimination based on stereotypes, jurors will know it. And then we cannot get angry if the jurors return the favor by making the assumption that our young male minority client is guilty, a gang member, or otherwise dangerous and not deserving of respect.

For public defenders, implicit bias may particularly affect decision making, given the triage circumstances of a need for quick action with few resources. Triage begins immediately, when public defenders prioritize cases based on whether or not the state can prove its case beyond a reasonable doubt or whether the public defender believes the client is “factually innocent.”

In a triage situation, public defenders allocate their resources first to the cases they can or should win. If public defenders fall under the spell of generalizations or stereotypes, they may miscalculate the merits of a case and the odds of winning it, and thus decide to allocate fewer resources to what they perceive to be losing cases. For instance, public defenders who subscribe to a stereotype that those who do not look their questioner in the eye are lying may discount a witness's potential testimony or choose not to put that witness on the stand. Without that witness, who, while truthful, appears to the defense attorney to be untruthful, the case against the defendant may be stronger, resulting in a disservice to the client.

The prophecy becomes self-fulfilling, reinforcing the perception for similar clients or circumstances in the future. These implicit biases “can cause attorneys to treat stereotyped individuals in stereotype-consistent ways” and the attorneys' “unconscious negative expectations may produce perceptions and attributions consistent with them.” The client may sense this negative reaction and respond in a way that causes the attorney to confirm her opinion and “can create a vicious cycle of mutual distrust and dislike, adversely affecting the attorney's triage decisions. Spending time with a client can, of course, change these initial impressions. But, an unpleasant initial interaction may reduce the defender's desire to do so.”

One might think public defenders would be “better” on race issues, but as one former public defender learned, “there is no person without prejudices, myself included.” For instance, when Professor Andrea Lyon had a Black defense client who became hostile and raised his voice, she succumbed to the “angry Black man” stereotype and decided to walk away from the client. Only later did she realize that the client might have been suffering from developmental disabilities, acting “tough” to cover up the fact that he could not follow her conversation and did not understand the multiple options she was presenting to him. She attributes her decision in the moment in part to the concept of “race loyalty,” which occurs when people are accustomed to “assuming the best in people who match our race because of our desire to see our race (and ourselves) positively.” The converse is also true: assuming the worst in people whose race is different from one's own, which can impact how one represents criminal defendants.

Implicit bias may be even more of a concern in death penalty defenses, though little has been recorded about the attitudes of death penalty attorneys specifically. Taking the data sets from presentations made at training sessions for capital defense attorneys, researchers divided participants into three groups: habeas corpus lawyers, trial lawyers, and law students. For all three groups, they found statistically significant differences: participants more readily made the white-with-good-and-Black-with-bad IAT associations than the Black-with-good-and-white-with-bad associations.

Interestingly, subjects did better when associating their own race with “good.” Having more Black capital defense attorneys would likely lead to more comfortable associations of Black-with-good. Based on this study, diversity in this attorney pool can have an impact on the volume and level of bias made manifest. Given that forty-two percent of individuals on death row are Black, some states have enacted laws that give defendants a claim of racial discrimination in their post-conviction appeals; North Carolina, for example, enacted the Racial Justice Act in 2009. Professor Shelley Song suggests additional reforms in the criminal context. While her article specifically addresses death penalty cases, her suggestions are broadly applicable to criminal cases generally, and several also apply to civil cases.

C. Effects on Jurors

We know skin color may impact juror decision making. With otherwise identical scenarios, the darker the skin of the alleged perpetrator, the more likely jurors in mock situations are to find the alleged perpetrator guilty. The “violent Black man” stereotype discussed above can also operate as character evidence as Professor Mikah Thompson notes, referring to the nature of implicit bias as “unspoken evidence” often used at trial against African Americans. Various concepts of implicit bias come together to paint a picture of “stereotypical Blackness” that is often used against criminal defendants because “rather than offering inadmissible evidence of a Black defendant's character for violence, the government can instead offer evidence of the defendant's stereotypical Blackness, thereby playing upon the jurors' implicit biases to establish the guilt of the defendant.”

The George Zimmerman trial (for the killing of Black teenager Trayvon Martin in Florida) provides an illustration of the idea of “stereotypical Blackness.” The racialized criticism of prosecution witness Rachel Jeantel (as “dumb,” uneducated and not credible) provides insight into the way jurors and Zimmerman viewed Jeantel and also how they possibly viewed the victim. If people watching the televised trial thought Trayvon had a similar character to Jeantel because of their friendship and common color, then her “stereotypical Blackness” could have been passed on to him throughout the trial given that he was deceased and therefore not present. If no evidence was offered to “humanize” Trayvon Martin, all that remained was stereotypical Blackness, which could have acted as his character evidence. As such, the jurors likely would perceive him as threatening or violent and would perceive Zimmerman as being more reasonable in fearing for his life and using deadly force in self-defense. Hence, the acquittal.

Professor Thompson suggests that, in addition to being influenced by implicit bias, white decision-makers may be affected by the “transparency theory.” The transparency theory is “the tendency of whites not to think about whiteness.” In short, because whiteness is the norm, being a white person is not something that many white people tend to think about. They do not actually reflect on their whiteness unless they are in a situation that calls them to compare themselves to someone who is not a white person. This transparency phenomenon is dangerous for African Americans because it causes white people to disregard the existence and salience of white-specific norms for those who are not white people.

Professor Thompson examines Federal Rule of Evidence 404(a), which limits when character evidence (other than for credibility purposes) may be used against a defendant in a criminal trial. Although the rule is meant to protect defendants from having unduly prejudicial evidence admitted, she suggests implicit bias combined with the transparency theory creates a different kind of character evidence not regulated by Rule 404(a): evidence of “stereotypical Blackness,” which is just as prejudicial. Therefore, when African Americans do not assimilate to these norms or seem unwilling to assimilate, they will face discrimination from jurors, even if those jurors have good intentions. Further, a Black defendant's perceived failure to assimilate to white-specific norms can cause that defendant to fall into the category of “stereotypical Blackness,” which implicitly includes evidence “that he or she has a propensity for engaging in certain behavior.” That behavior, based on the IAT studies discussed above, is aggressive, violent and often criminal.

D. How Judges' Implicit Bias May Affect Rulings

A few studies identify implicit biases in judges. Judges are “just as susceptible as are jurors to three cognitive illusions that hinder accurate decision making: anchoring, hindsight bias and egocentric bias.” An IAT study showed:

[T]he white judges mostly showed a white preference while black [sic] judges showed no clear overall preference. When subliminally primed with black-associated words in a hypothetical vignette, judges who expressed a white preference on the IAT were more likely to impose harsher punishments on defendants in the story than those primed with neutral words. When race of the defendant was made explicit in a hypothetical vignette, black [sic] judges were appreciably more willing to convict the white defendant rather than the defendant identified as African-American.

Justice Hyman's article recalls a moment when an attorney saw an African American woman exiting the judge's chambers and exclaimed “you must be the law clerk.” It turned out the woman was not the law clerk but in fact the judge. The justice then questioned whether this incident would affect the way the judge viewed that attorney moving forward in the case and how the judge's rulings would be affected. If the judge's actions were somehow influenced moving forward, this incident would affect not only the attorney but ultimately the client being represented.

Even this informed judicial perspective reveals a deeper level of hidden bias. Why would the judge let a personal slight impact her rulings? Perhaps because the author's common sense tells him that some female judges would be offended by the assumption that they are not judges. Professors Kang and Banaji and others caution against the use of so-called “common sense” by judicial actors because theories and notions that were once considered common sense are less accurate now based on the literature and empirical research. For instance, many may have considered it “common sense” that men on average are better drivers than women, but automobile insurance companies have studied the data and give women (especially when compared to young men) lower rates because their risk of accidents is lower.

Why would his common sense tell him the judge might be offended? Is it because of the stereotype that African American women are “unforgiving,” “hard,” “cold,” and perhaps are more likely to hold grudges over real or perceived slights than white males or other females? Quite probably. And so, she is second-guessed when a young white male judge would get the benefit of the doubt that he would not hold any such grudge if he instead had been mistaken for the law clerk or an off-duty bailiff. Implicit bias has an impact.

Another recent study of over 200 sitting judges analyzed the strength of positive stereotypes of white people and Christians, along with both negative and positive stereotypes of Asian people and Jewish people. Using IAT tests, the study found most judges “displayed strong to moderate implicit bias against Asians relative to whites and against Jews relative to Christians.” The participants also completed a sentencing task, reading a file and making a sentencing decision where the race and religion of the defendant varied. Other findings include: (1) state judges gave longer sentences to white defendants than to Asian defendants; (2) anti-Jewish, pro-Christian biases predicted lesser sentences for Christian defendants; and (3) male judges showed stronger anti-Jewish biases than female judges.

The behavioral realism approach posits that judges need to keep up with empirical research if they are going to implicitly or explicitly base their decisions on theories about how people behave. Do judges even want to address the issue of implicit bias? Professor Eric Girvan argues, “the law-science gap exists and persists in significant part because judges believe they lack the ability to effectively remedy non-purposeful discrimination of the kind described by work on implicit bias and are unwilling to take steps necessary to develop ways to do so.” He criticizes judges for not doing a better job of “conforming their assumptions about human behavior to available social science.” He notes that a Westlaw search he conducted revealed more examples of cases in which a majority of the judges indicated awareness of the concept yet refused to alter anti-discrimination doctrine accordingly than of cases in which a majority of judges acknowledged evidence of implicit bias as justification for liability.

In other words, despite reasoning that implicit bias played a role in allegedly discriminatory behavior, the judges declined to extend the interpretation of antidiscrimination doctrine to permit or support a finding of liability. In one recent case, the U.S. Court of Appeals for the Fourth Circuit broke from this pattern, holding that “[i]nvidious discrimination steeped in racial stereotyping is no less corrosive of the achievement of equality than invidious discrimination rooted in other mental states.” The court's comment explained that while the circuit had “admonished district courts, albeit in unpublished, non-precedential decisions,” it is now “past the time when that admonishment should be given precedential force.” Recognizing the “real risk” that legitimate claims based on subtle “stereotyping or implicit bias” may be dismissed by judges substituting their own rationales, the court reversed the district court's dismissal. Professor Girvan concludes, “if the antidiscrimination [sic] law-science gap is caused primarily by a lack of judicial knowledge, ... the solution to it should be to inform judges of research supporting the psychological science of implicit bias and evidence of its applicability to the cases they are deciding.” Part four addresses these suggestions.

E. Implicit Bias in the Courthouse

In the courthouse, further research is needed on the impact of implicit bias on court personnel. Court workers may treat people differently based on what they look like and based on implicit biases. As Professor Debra Bassett notes:

Court clerks who accept court filings may unconsciously respond differently to individuals of different races leading them to provide more help to some individuals than to others. As a mundane example, suppose a litigant presents paperwork for filing, and the paperwork lacks a required two-hole punch across the top. The court clerk may reject the paperwork when offered by some individuals but in other cases may accept the paperwork and simply punch it themselves.

Differences in treatment can lead to real consequences for litigants--even impacting the outcome of their cases.

Another important facet of civil litigation is how few cases go to trial. We know that the attorney's assessments of the case and its settlement value play a large role in the recovery options for civil plaintiffs. Implicit biases of settlement judges, mediators, and other dispute resolution actors may exacerbate the prejudice to the fair administration of justice.

 


Part Three: Implicit Biases of Attorneys in Civil Litigation

A. The Negligence Standard

The ABA Model Rules of Professional Conduct define as misconduct when a lawyer “reasonably should know” she is discriminating on the grounds of race, gender, ethnicity, religion, and other categories, or when such actions are prejudicial to the administration of justice. The word “reasonably” invokes a negligence standard. When should an attorney reasonably know that her conduct is discriminatory? Do the implicit bias studies make it reasonable for attorneys to know that their conduct may be discriminating on the basis of race, ethnicity, or other prohibited categories? IAT research suggests that the answer is yes. Because we know implicit biases exist, there can be a finding of a prima facie “claim against any facile claim of colorblindness.”

The ABA model rule applies broadly to “conduct related to the practice of law,” and therefore can apply to how lawyers represent their clients, including which strategies they pursue and what evidence they present at trial. Strategies involve the attorneys' judgments about the facts of the case as well as the client's particular circumstances, and may be difficult to label as racially or ethnically discriminatory in an individual case. But, implicit bias in the application of certain strategies may violate the ABA rule against conduct that is “prejudicial to the administration of justice.” For instance, implicit bias that takes the form of “Black male as violent” stereotyping can be prejudicial to the administration of justice, particularly when it enhances the perceived dangerousness of a person based on race; and thereby justifies lethal force against that person or uses ambiguous evidence to support a finding of dangerousness or violent conduct.

California's Rules of Professional Conduct are arguably milder for attorneys than the ABA model rule, requiring actual unlawfulness, or knowingly permitting such conduct when applied to the behavior of another attorney or someone in the law office. As Professor Girvan notes: “Legal doctrine, however stagnated. It only recognizes and targets overt, explicit bias with the accompanied understanding that social desirability concerns might frequently prompt people to conceal it;” however, the Fourth Circuit case Woods v. Greensboro suggests some movement in this area. While acting on bias can constitute actionable discrimination, implicit bias generally is not seen as actionable because there is no purpose or intent motivating it. In addition, the California rule is limited to accepting and terminating client representation (as well as employment conditions not applicable to this article), and therefore would not implicate strategic decisions in the conduct of representation.

Next, this article examines some of the situations in which implicit biases can impact the attorney-client relationship and be prejudicial to the fair administration of justice. Attorneys “reasonably should know” about these various opportunities for implicit bias to result in differential treatment based on race, gender, ethnicity, and other protected classifications.

B. Implicit Bias in Communication

Communication is often influenced by implicit bias. Clients may be hesitant to communicate based on their own culture, the lawyer's culture, or either's perceptions. “Cultural differences often cause us to attribute different meaning to the same set of facts,” as law professors found in a course on cultural competence. They explain:

[E]ven in situations in which trust is established, students may experience cultural differences that significantly interfere with Lawyers' and clients' capacities to understand one another's goals, behaviors, and communications .... One important goal of cross-cultural training is to help students make isomorphic attributions, i.e., to attribute to behavior and communications that which is intended by the actor or speaker. Students were taught about the potential for misattribution [so they] can develop strategies for checking themselves and their interpretations.

For instance, when a lawyer asks a client to speak up if the client wants the lawyer to explain something she does not understand, or if the lawyer is not being clear, cultural barriers may preclude the client from responding truthfully. This instruction, while seeming appropriate on its face, does not necessarily take into consideration that in some cultures, it would be considered rude to suggest the lawyer is not being clear, and in others, it would be considered embarrassing to admit one does not understand. If the client is from such a culture, where it is disrespectful to imply that someone in a position of power is being unclear, she does not ask for an explanation; and when asked if she understands, she answers affirmatively. But she does not understand. Later, the lawyer may ask why the client is not going along with the strategy, not realizing that the client did not understand enough to participate in the strategy in the first place.

C. Credibility Assessments that Rely on Implicit Biases

On the issue of substantive credibility (whether we believe the story to be true), we are more likely to believe stories that make sense to us and less likely to believe those that do not make sense. Consider a case where a defendant's photo identification card is found at the scene of a crime. For those of us who carry our identification cards with us every day, it does not make sense that the card would end up at a place and time that we are not (unless we assert that the identification card was stolen). For those who do not carry an identification card each day and instead keep it in a drawer somewhere, the card could end up someplace else or be left somewhere for days without the defendant noticing. Those who carry identification cards are less likely to find the second story credible, though it may very well be true.

Similarly, cultural differences can impact the process of evaluating credibility (whether we believe the storyteller to be credible). For instance, there are people who tell stories in nonlinear ways. Some cultures have different orientations about time and space and “[l]awyers and clients who have different time and space orientations may have difficulty understanding and believing each other.” What will seem to an attorney to be lying or uncooperative, may be the respectful or appropriate way to provide context and background for the story in the client's culture. It is important to be cognizant of the difference between having an individualistic culture where “people are socialized to have individual goals and are praised for achieving those goals,” as opposed to a collective culture where:

[P]eople are socialized to think in terms of the group, to work for the betterment of the group, and to integrate individual and group goals. Collectivists use group membership to predict behavior. Because collectivists are accepted for who they are and accordingly feel less need to talk, silence plays a more important role in their communication style.

Those silences and apparent meanderings will be interpreted by some as meaning the storyteller is less credible. This interpretation likely applies to the attorney who is deciding how to represent the client, as well as to the judges and jurors who ultimately determine what testimony to believe.

D. Representation Strategies Developed by Implicit Bias

Cultural competence has an impact on attorneys' strategies for representing their clients. A strategy that seems to be the best one from the attorney's perspective may be rejected by the client based on the cultural meaning of such an admission. For instance, in a tort case involving the plaintiff's own potential negligence, an attorney may want to pursue a strategy that shows the plaintiff did not understand a written warning that was not verbally explained. In the plaintiff's culture, failure to understand could be a sign of lack of intelligence; and if intelligence and ability to read are highly prized in that culture, the client may be unwilling to advance that argument, even though doing so may be the most effective argument for the case. In fact, the argument might be particularly effective when jurors may apply the “unintelligent Black people” stereotype prevalent in the American media, even though the plaintiff's culture is Nigerian, where education is highly prized. “Availability bias” plays a role here, which operates when people base their assessment of how likely a particular fact is to be true on how quickly or clearly they can recall examples of that fact being true. If the attorney knows a lot of educated Black people, that adjective-noun pairing makes sense and is more acceptable. If she does not know many educated Black women, then that pairing is unlikely in her mind.

Implicit bias studies show that people routinely discount the amount of pain they perceive another is feeling if that person has a darker complexion than they have. In personal injury cases, attorneys may therefore discount the amount of pain they attribute to Black clients seeking damages, while being more realistic, or even augmenting, the amount of pain they attribute to their white clients. Of course, the attorneys are not solely responsible for the valuation because they may have doctors or other expert witnesses provide an assessment of the pain levels. These experts may also be influenced by implicit bias, availability bias, confirmation bias, and others. Nevertheless, the valuation has an impact on case strategy and, for contingency fee cases, similar to the public defender triage situations described in part two above, on the amount of time and other resources to be devoted to the case.

When entering a contract, some people rely upon a handshake deal and others require a written agreement. In examining what is “reasonable reliance” or what constitutes an “offer and acceptance,” different cultures may have different assessments. Attorneys who are not mindful of these differences might decline to accept what could be a winning case.

In employment discrimination and wrongful termination litigation, stereotypes about certain groups being uneducated, unintelligent or untrustworthy can impact the attorney's assessment of the strength of the claim or defense.

E. Promoting the Fair Administration of Justice

There are many contexts in which cultural understanding enhances the attorney-client relationship and promotes the fair administration of justice. Conversely, a lack of cultural understanding undermines the attorney-client relationship, as well as the fair administration of justice.

Given the depth of research, Continuing Legal Education (CLE) courses, and other resources on the existence and implications of implicit bias, attorneys “reasonably should know” they may be acting in ways “prejudicial to the fair administration of justice” due to implicit biases. Attorneys who are not mindful of its potential impact in their representation may be discriminating in their “conduct related to the practice of law.” Given the Fourth Circuit's recent recognition that subtle stereotyping and implicit bias can give rise to legitimate discrimination claims, attorneys should abide by the ABA proposed Model Rule 8.4 and make special efforts to reduce the impact of implicit bias in the practice of law. The next section identifies specific strategies to implement this rule.

 


Part Four: Reducing the Impact of Implicit Biases in Civil Cases

There are three basic steps to reduce reliance on implicit biases. First, become more aware of one's own thought processes. Second, develop a healthy concern for the consequences of implicit bias. Third, learn to replace biased reactions with nonbiased ones. Ways to implement each of these steps in the civil context are discussed below.

A. Notice Race

If attorneys try to be colorblind, they will fail, since most people do see color. Simple awareness, however, is not enough to reduce bias. Instead, attorneys should practice mindfulness, focus on thoughts outside themselves, and use:

[E]mpathy-related techniques like perspective-taking, which prompts people to consider the experiences of individuals who are different from themselves. Colorblindness is not the answer; noticing race will help. Adopting an identity-conscious perspective (e.g., accepting and considering different identities) rather than an identity-blind mindset (ignoring or denying stigmatized attributes such as race and gender) can reduce bias. Finally, deliberately setting pro-diversity goals has been found to enhance diversity-related attitudes and behaviors.

Attorneys and judges alike should aim to make race salient in the courtroom when dealing with cases that raise concerns about racial bias. Although parties may be hesitant to bring up race in a civil trial unless discrimination is at issue, for fear of being accused of playing the “race card,” race is often relevant. U.S. Supreme Court Justice Sonia Sotomayor provides a powerful example of this continuing problem in criminal cases:

In February 2013, [Justice Sotomayor] made race painfully salient for one federal prosecutor by publicly criticizing his racially stigmatizing questioning of an African American defendant charged with participating in a drug conspiracy .... Justice Sotomayor made race salient by highlighting the ways in which the prosecutor's remarks relied on racial stereotypes and prejudice. Her remarks will likely encourage the prosecutor in this case and attorneys in future cases to think twice before making similar comments that draw generalizations about individuals based on their race.

The effects of implicit biases likely go largely unrecognized and unacknowledged, particularly because some view American society as “post-racial,” and are thus even more likely to be ignored in civil cases.

Present methods of addressing bias may exacerbate implicit bias because they are directed primarily at explicit bias. Judges should be cautioned against dominating the jury selection process because jurors do not want to give biased answers or admit bias to a judge, but may be more likely to do so to lawyers. Therefore, Judge Bennett proposes that “the implicit bias of jurors can be better addressed by increased lawyer participation in voir dire, while the implicit bias of lawyers can then be curbed by eliminating peremptory strikes and only allowing strikes for cause.” Both suggestions are provocative.

Asking a question like “Can you be fair and impartial in this case?” is unhelpful because the question “does not begin to address implicit bias, which by its nature is not consciously known to the prospective juror. Thus, a trial judge schooled in the basics of implicit bias would be delusional to assume that this question adequately solves implicit bias.” Judge Bennett understands that jurors are likely to give him the answer they think he wants (and he is rather surprised whenever jurors admit they cannot be fair). Moreover, sometimes the questions are posed in a way that educates the jurors about what would be an appropriate response, and in these situations “the trial judge is probably the person in the courtroom least able to discover implicit bias by questioning jurors.”

B. Recognizing the Importance of Cross-Cultural Competency

To help attorneys and judges become more aware of the consequences of their implicit biases, some law schools offer a seminar in cross-cultural competence applying the “five habits for building cross-cultural competence” from an article by Professors Susan Bryant and Jean Koh Peters. These authors made a plea to enhance clinical education by increasing the cross-cultural competence of law students. They recognize that “lawyers and clients who do not share the same culture face special challenges in developing a trusting relationship in which genuine and accurate communication can occur.” In addition, they suggest that some version of the course curriculum could be offered as a CLE course for attorneys already in practice.

For instance, the fourth habit is described as “pitfalls, red flags, and remedies,” and it “encourages conscious attention to the process of communication,” such as asking questions that “explore how others who were close to the client might view the problem and how they or she might resolve it.” This habit has four focal points: “(1) scripts, especially those describing the legal process, (2) introductory rituals, (3) client's understanding and (4) culturally specific information about the client's problem.” Using scripts to promote better communication would foster greater understanding on both sides, and highlighting culturally specific information would better equip the attorney to address and mitigate the impacts of implicit biases in representing clients.

The fifth habit is the “camel's back,” which “proposes two ways to work with biases and stereotypes: (1) creating settings in which bias and stereotype are less likely to govern, and (2) promoting reflection and change with the goal of eliminating bias.” One way to put this fifth habit into practice is to align oneself with counter-stereotypes. In the criminal context, one author suggests having people close their eyes and imagine they are being attacked by a white man and a Black man comes to their rescue. Just that notion of switching the “violent-Black and savior-white” stereotype to “violent-white and savior-Black” can make a big difference in subsequent decisions and deliberations.

Similarly, priming with counter-stereotypical words and phrases can impact decision making. For instance, thinking about words like “educated” and “intelligent” or “hard-working” and “responsible” before considering an employment discrimination claim can activate a different part of the brain that puts a minority former employee in a better light than doing nothing, which might leave the brain to fall back upon availability bias and unstated stereotypes like “chronically unemployed” and “entitlement-seeker.”

During trial, the court should talk with jurors about the concepts of implicit bias and the transparency theory, so they can be “trained” in this area, similar to how police officers are “trained,” to try to combat implicit bias. For instance, to prepare jurors to be mindful of the consequences of their biases, Judge Bennett shows a video clip from a television show involving hidden cameras that captured bystanders' reactions to various situations with actors of varied races and genders, demonstrating how people respond differently. In addition, he gives a specific jury instruction on implicit biases at the beginning of each case.

Judge Bennett also notes judges do not have the same resources to address implicit bias in prospective jurors, and they do not have the same knowledge of the case to fully understand the impact of implicit bias in a particular situation. He shows a PowerPoint presentation about implicit bias and believes providing information upfront “may mitigate the effect of the bias.” In addition, he recommends the use of jury instructions, though he recognizes many of his colleagues are not receptive to the idea, “fearing that implicit biases will only be exacerbated if we call attention to them.” Judge Bennett uses the following jury instruction:

[A]s we discussed in jury selection, growing scientific research indicates each one of us has implicit biases, or hidden feelings, perceptions, fears and stereotypes in our subconscious. These hidden thoughts often impact how we remember what we see and hear and how we make important decisions. While it is difficult to control one's subconscious thoughts, being aware of these hidden biases can help counteract them. As a result, I ask you to recognize that all of us may be affected by implicit biases in the decisions that we make. Because you're making very important decisions in this case, I strongly encourage you to critically evaluate the evidence and resist any urge to reach a verdict influenced by stereotypes, generalizations, or implicit biases.

Some jurors will try to follow the rules and will decline to apply racial stereotypes and generalizations if specifically asked not to do so, and other studies have confirmed that noticing race actually makes a difference in race-salient cases. Still, few courts give jury instructions to counter stereotypes and prejudice.

C. Moving Toward More Deliberative Decisions

As Professor Blasi and others have noted, implicit biases can be changed. In order to make a conscious decision, the decision-maker has to have time to deliberate and not just react. Being overworked or rushed increases one's reliance upon the shortcuts of stereotypes and unconscious bias. To combat this, courts can try to reduce time pressures. Since so few civil cases go to trial, there should be plenty of time for civil trial attorneys to assess, evaluate, and check for biases.

Professor Bassett proposes a standardized training program for all lawyers and clients, jurors and witnesses, as well as all court personnel, which integrates three approaches from prominent psychological studies: “diversity education, educating individuals about unconscious bias, and appealing to individuals' beliefs in equality and fairness.” The education components increase awareness, and the appeals to fairness enhance concerns about the consequences of implicit biases. She notes Professor Gary Blasi's conclusion that “if our values include fairness and treating people as individuals, then anything that increases self-awareness should decrease our application of stereotypes.”

Studies note that judges “tend[] to favor intuitive rather than deliberative faculties,” which exacerbates the problem. Judges need to recognize that the decision they reach is not necessarily the right one. Others note:

Intuition is also the likely pathway by which undesirable influences, like the race, gender, or attractiveness of parties, affect the legal system. Today, the overwhelming majority of judges in America explicitly reject the idea that these factors should influence litigants' treatment in court but even the most egalitarian among us may harbor invidious mental associations.

One way to reduce implicit bias is to go back and question how a decision was reached, why it was reached, and what else might have impacted it. Having conversations with others can bring implicit biases to the foreground, but judges are often constrained from using this tool except in open court. Attorneys and judges should take the time to fill in informational gaps with actual information, rather than using stereotypes and implicit biases as shortcuts.

The authors of the Blinking on the Bench study explain that inducing judges to engage in more deliberative thought processes could result in less bias. They conclude:

“[W]e believe that most judges attempt to ‘reach their decisions' utilizing facts, evidence, and highly constrained legal criteria, while putting aside personal biases, attitudes, emotions, and other individuating factors.” Despite their best efforts, however, judges, like everyone else, have two cognitive systems for making judgments--the intuitive and the deliberative--and the intuitive system appears to have a powerful effect on judges' decision making. The intuitive approach might work well in some cases, but it can lead to erroneous and unjust outcomes in others. The justice system should take what steps it can to increase the likelihood that judges will decide cases in a predominantly deliberative, rather than a predominantly intuitive, way.

 


Conclusion

As the Fourth Circuit recently noted in Woods, “[i]nvidious discrimination steeped in racial stereotyping is no less corrosive of the achievement of equality than invidious discrimination rooted in other mental states.” While such discrimination can and does occur in the workplace, in business transactions, and in other facets of life, it is particularly pernicious when attorneys as officers of the court perpetuate this type of harm. This article analyzed the implications of attorneys' own implicit biases and how those biases impact their clients, jurors, and the fair administration of justice.

Actionable discrimination is difficult to prove because of the intent standard that the courts apply. But using a negligence standard for attorneys' actions in and outside of the courtroom could reduce the impact of implicit bias. If attorneys comply with the ABA Model Rule of Professional Conduct that imposes a negligence standard on whether lawyers “reasonably should know” when they are discriminating on the grounds of “race, sex, religion, national origin, ethnicity, disability, age, sexual orientation, gender identity, marital status or socioeconomic status,” then both the rights of litigants and the responsibilities of officers of the court would better serve the interests of justice.

 


Professor of Law, Pepperdine University School of Law; J.D., Stanford Law School; A.B., cum laude, Harvard College.