Our Dishonest Discourse About “The Hard R”

A controversy that began with this open letter asking Spotify to “take action against the mass-misinformation events which continue to occur on its platform” (with regard to COVID-19 and vaccines), and with musicians including Neil Young and Joni Mitchell pulling their music from the platform, took an interesting turn when India.Arie shared a supercut of Rogan not just using “the hard r” but calling black people apes, as part of explaining why she was pulling her music from Spotify. Joe Rogan has since apologized, and Spotify removed 70 podcast episodes in which he used the slur.

It is possible, if not likely, that I am being overly cynical about the sincerity of Rogan’s apology. My cynicism is animated at least in part by how often mea culpas for this sort of thing include phrases like this:
“It’s a video that’s made of clips taken out of context of me of 12 years of conversations on my podcast. It’s all smushed together and it looks f—— horrible, even to me”

and this:
“I never used it to be racist.”

and especially this:
“I do hope that this can be a teachable moment.”

This last quote in particular provides an opportunity to pivot to academic usage of “the hard r”. Randall Kennedy argued last year in favor of what might be called a pedagogical exception: the word being used in full for teaching purposes. His argument, however, includes skepticism of some students’ claims of being hurt by hearing the word used. Further, he argues that his race should not give him more leeway to use the n-word than his white colleagues have. Dr. John McWhorter, a professor of linguistics, wrote about the n-word on multiple occasions prior to the controversy over Rogan’s usage of it. Beyond pedagogy (in use) or virtue-signaling (in non-use), the question not being asked or adequately answered is why this debate only seems to persist around a slur that applies solely to black people (though there is a variant of it, applied to Middle Eastern people, that I first heard in the movie The Siege).

This quote by James Baldwin poses and answers that question more eloquently and bluntly:
“What white people have to do is try and find out in their own hearts why it is necessary to have a ‘nigger’ in the first place, because I’m not a nigger. I’m a man. But if you think I’m a nigger, it means you need it.”

A poem written in 1912 and attributed to H.P. Lovecraft provides another answer to the question of where the necessity for the word (and the idea behind it) comes from:
“When, long ago, the gods created Earth
In Jove’s fair image Man was shap’d at birth.
The beasts for lesser parts were next design’d;
Yet were they too remote from humankind.
To fill the gap, and join the rest to man,
Th’Olympian host conceiv’d a clever plan.
A beast they wrought, in semi-human figure,
Fill’d it with vice, and call’d the thing a NIGGER.”

In Lovecraft’s conception (and almost certainly in the conception of others who, like him, subscribed to the eugenics movement), black people were not fully human. We were beasts to be feared, objects and/or causes of lust, purveyors of vice, corrupters of innocence–but not human beings like everyone else. While Lovecraft is long dead, the sentiments behind his poetically-expressed contempt for black folks live on–not just in the body politic, but in some of its leaders as well. This is why the governor of an entire state can say publicly that Joe Rogan should not have apologized for using the n-word. I believe this to be at the heart of why the debate over the word continues.

To borrow from James Baldwin again, he expressed well how American society in his time used and viewed black people:
“That in a way black men were very useful for the American. Because, in a country so absolutely undefined – so amorphous – where there were no limits – no height really and no depth – there was one thing of which one could be certain. One knew where one was, by knowing where the Negro was. You knew that you were not on the bottom because the Negro was there.”
Though decades have passed since Baldwin spoke these words, it seems that America has yet to outgrow its need for black people to define where the bottom of society is, and the casual (if not unapologetic) usage of the n-word is just one manifestation of that broader sentiment. I maintain no illusions that this affliction is unique to the political right, or to libertarian ideology. Those on the political left are no shrinking violets when it comes to using “the hard r”.

It is very telling that many of the same people who rush to defend voices of dissent in other contexts lack the same concern when it comes to black people objecting to the use of a slur targeting them. The social norms against using slurs and stereotypes which attack Jews, or Italians, or the Irish, or Hispanics, or Asian people remain intact. You rarely (if ever) see people from those communities presuming to give out metaphorical hall passes for others to use slurs against them without consequences. Because black people are still not seen or treated as full citizens of this country, our opinions on “the contours of acceptable speech” lack the same weight as those of others. Too many people in this country apparently still prefer an older version of it where slurs against black people could be said without consequence. But that isn’t a version of America we’re returning to.

Social “Firsts” and the Supreme Court

A few days ago, Stephen Breyer announced his retirement from the Supreme Court of the United States at the end of the current term.  Because Joe Biden pledged to nominate a black woman to the nation’s highest court if he became president, he now has an opportunity to make good on that pledge.  Predictably, we began to hear and see a lot of high-minded (and hypocritical) commentary about how Biden should be choosing the “most-qualified” justice–regardless of their skin color.  Our attention span as a country is so short, we’ve already forgotten that Trump’s rise to the presidency was powered at least in part by publicizing a Federalist Society-authored list of high court nominees he would choose from if the opportunity presented itself.  We’ve already forgotten that Ronald Reagan promised to name a woman to the Supreme Court.

But the history of using the Supreme Court to accomplish social firsts stretches back much further than current commentary might suggest.  This thread by David Frum takes us all the way back to 1887, when President Grover Cleveland appointed Lucius Quintus Lamar to the high court in a bid to gain the support of conservative white southern Democrats for re-election.  Read Frum’s thread in full to get a complete sense of how unrepentant a Confederate Mr. Lamar was.  This dubious social first–the appointment of a traitor to the Union to the nation’s highest court–would prove very important for a reason not touched on in Mr. Frum’s thread.  1887 marked the year the US federal government fully abandoned Reconstruction–and the nation’s black citizens to decades of voter disenfranchisement, terrorism, property theft, murder, and Jim Crow laws.

No discussion of the Supreme Court and social firsts would be complete without mentioning Maryland’s own Thurgood Marshall.  He earned his undergraduate and law degrees from 2 HBCUs (graduating 1st in his class from Howard Law because the University of Maryland School of Law was still segregated).  Of the 32 cases he argued before the Supreme Court, Marshall won 29 and lost just 3.  He served as a federal appeals court judge on the Second Circuit for a number of years prior to becoming the nation’s first black solicitor general.  Some months of his tenure as an appeals court judge were served as a recess appointment because certain southern senators held up his official appointment, including the same segregationist James Eastland with whom Joe Biden recalled a civil relationship.  As solicitor general, he would win 14 cases on behalf of the government and lose just 5.  Among justices past and present, there may be none more successful at winning arguments before the Supreme Court prior to becoming a member of it.

Discussing the legal and rhetorical brilliance of Thurgood Marshall requires discussing his successor.  Few nominations to the high court better demonstrate the hypocrisy of many of today’s conservatives regarding “qualifications” (including those who oppose Trump) than the absence of such concerns when Clarence Thomas was nominated to the Supreme Court.  In contrast to the years Marshall served as an appellate judge and solicitor general, Thomas was an appellate judge on the DC Circuit for just 16 months.  Thomas graduated in the middle of his law school class at Yale, in contrast to Marshall’s 1st in class at Howard.  The White House and Senate Republicans apparently pressured the American Bar Association (ABA) to give Thomas a qualified rating even while attempting to discredit the ABA as partisan–and this was before Anita Hill’s interview with the FBI was leaked to the press and led to the re-opening of Thomas’ confirmation hearings.  The same GOP that loves to quote that one line from that one speech of Dr. Martin Luther King, Jr. could not have cared less about “the content of [Thomas’] character”.  They cared that he was both conservative and black.  The way the Senate treated Anita Hill during those re-opened confirmation hearings would, in retrospect, prove a preview of the treatment awaiting future black women appointees to federal roles.

How Thomas fared during his confirmation hearings almost certainly animated Republicans’ treatment of Lani Guinier after her nomination to become assistant attorney general for civil rights.  Her treatment by them, by conservative media, and by the White House that nominated her was utterly shameful.  Conservatives lied about her positions.  The same Joe Biden who had contributed to the poor treatment Anita Hill received before the Senate Judiciary Committee he chaired 2 years earlier reportedly “grew lukewarm about Guinier”.  President Clinton would ultimately withdraw the nomination in the face of lies and distortions about her writings.  His administration had apparently instructed her not to make any public statements until after he’d already decided to withdraw her nomination, enabling her opponents to smear her in the press and her “allies” to get cold feet about supporting her.  Particularly now, as a wave of anti-CRT legislation, book bans, and attacks on affirmative action gains traction around the country (and especially in light of Guinier’s recent death), it is important to remember that Guinier only got to make her case to the public in one interview with Ted Koppel–and the public received her views well.  She never got the Senate hearing that even Robert Bork got for his extreme views because Bill Clinton–her friend from Yale Law School–pulled her nomination instead.

Not even two weeks have passed since the annual hypocrisy-fest that is MLK Day, and a significant majority of Americans surveyed seem to have decided once again that black women should wait for what should be theirs.

The attacks on the first black woman Supreme Court nominee will be fierce (if Biden follows through on his commitment).

When it comes to the Supreme Court and credentialism however, perhaps the best example of the double standard that seems to exist for women generally is the brief nomination of Harriet Miers.  Conservatives in particular dragged this woman for her lack of elite education (she earned degrees in mathematics and law at Southern Methodist University).  Only in looking back did I learn that Harry Reid (Senate minority leader at the time) actually recommended Miers as the successor to O’Connor, and that other members of the Senate Judiciary Committee hoped to see nominees from outside the federal appellate court system.  Perhaps because Reid earned his law degree in George Washington University’s part-time program, he didn’t put as much stock in an Ivy League pedigree as he did in bringing the perspective of an experienced practicing lawyer to the Supreme Court.  Potential conflict of interest concerns raised by Miers’ relationship with President Bush and his staff might ultimately have sunk her nomination anyway had she not withdrawn it.  By contrast, Clarence Thomas has ruled in numerous cases where he had clear conflicts of interest with little or no criticism from his supporters on the political right.

Considering the sorts of cases which will soon come before the Supreme Court, we should remember that as an institution it has been used as often as a tool to remove and restrict rights as to grant them (if not more so).  The aforementioned appointment of Lucius Lamar is not the only time the Supreme Court has been used to undermine full citizenship for black people in the United States.  Before William Rehnquist became an associate justice (nominated by Nixon) and then Chief Justice (nominated by Reagan), he was a “poll watcher” in Arizona under the auspices of Operation Eagle Eye, a nationwide campaign by the Republican National Committee to suppress black votes.  This 2021 piece by Charles Pierce makes a convincing argument that Rehnquist tried to pass off his personal opposition to the ultimate outcome of Brown v Board of Education as that of the justice he clerked for, Robert H. Jackson.  In this memo, he defended Plessy v Ferguson as good law, and he likely lied about it in both of his Supreme Court confirmation hearings.  From the time he became one of Rehnquist’s law clerks to the time he replaced him as Chief Justice, John Roberts has had the Voting Rights Act in his sights as a law to be weakened (if not destroyed).

Contrary to the polls (and numerous previous demonstrations of an utter lack of spine), Lindsey Graham has emerged as a supporter of the idea of a black woman nominee to the Supreme Court.  Current US District Court judge J. Michelle Childs of South Carolina being a possible nominee certainly doesn’t hurt.  If the current shortlist is any indication, whichever black woman Biden selects from it will be at least as qualified as her colleagues were at the time of their selection–and likely more so.  It wouldn’t surprise me if Biden chose Breyer’s former clerk, Ketanji Brown Jackson, to succeed him.  But as a state university graduate myself, part of me hopes that someone with at least one degree from outside the Ivy League gets selected.

Two Tales of Tech Recruiting

In an industry that has had (and continues to have) persistent problems with how it hires and treats black people within its ranks, few things are worse than a black woman announcing on social media that she short-changed a candidate by $45,000 because “I personally don’t have the bandwidth to give lessons on salary negotiation”.

In 10 years as a manager staffing software engineering positions on multiple teams, I’ve worked with both contract recruiters and full-time recruiters, and none of them low-balled a candidate to whom I chose to extend an offer, because I intended to keep those folks for as long as I could. The alternative–losing good people to companies that can poach them simply by offering more money–meant not just losing their skills and having fewer people to divide the same amount of work between, but also my employer incurring costs trying to backfill the open position. Especially in a market where the competition for talented people grows ever more challenging, the last way any company should start a relationship with a new employee is by undervaluing them from the moment they join.

A position I only filled a couple of weeks ago had been open for two solid months before that. Rather than risk losing a good candidate over $10,000, I requested an exception to offer a larger signing bonus. With the exception granted, we made a best and final offer that he accepted. The onboarding process is going smoothly, and since we’re paying him what he’s actually worth based on the geography we’re in and what our competitors are offering, he will be harder to poach with just money.

Fortunately, there are good examples of recruiters doing well by the people they recruit.

Unlike the first Johnson, this one probably built a significant amount of goodwill and trust–not just between herself and the candidate, but between the candidate and the company she will be working for. In an industry where software engineers are encouraged to switch jobs every couple of years, this company has a good chance of growing this junior software engineer into a senior software engineer–perhaps even an engineering leader–because a recruiter put her best foot forward.

As sometimes happens on Twitter in cases like this, someone tagged the company Mercedes S. Johnson was recruiting on behalf of–and someone there responded requesting a DM with more information. The tweet that actually led me to this whole story was about doxxing, and how Ms. Johnson shouldn’t lose her job over the post. I’ve written about at-will employment and cancel culture before, and people have definitely lost their jobs for less than what this woman bragged on Twitter about doing. As of this writing, she was still defending her actions.

If you work in tech recruiting and the opportunity presents itself, choose to be a Briana instead of a Mercedes. Both the companies you hire for and the candidates you recruit for them will thank you.

MLK Day 2022

The third Monday in January is here, and once again people who oppose everything Dr. King stood for are abusing the one line they know from the I Have a Dream speech (because they don’t know any others) for their own political ends.  This annual whitewashing of King’s legacy succeeds only to the degree it does because the people doing the whitewashing don’t dare venture beyond the confines of that line in that speech; too much of what he wrote stands in direct opposition to their political aims.  This applies not just to the secular, but to the religious as well.

One of my cousins read his children Letter from Birmingham Jail yesterday.  This letter is where we find the phrase “Injustice anywhere is a threat to justice everywhere.”  It is also where we find this one: “Anyone who lives inside the United States can never be considered an outsider”.  You can be certain that none of the hypocrites quoting King today will quote that.  Decades after this letter was written, we’ve seen how this country continues to treat and talk about certain immigrants.  Decades after this letter was written, the segregation and police brutality of which King wrote in 1963 remain problems in this country.  Actually reading his letter reveals that direct action was chosen as a last resort, only after the local leaders with whom they negotiated broke their promises.

This passage from the letter is sadly relevant once again in the wake of GOP measures to make it harder for those in the electorate who oppose their program to cast votes:

An unjust law is a code inflicted upon a minority which that minority had no part in enacting or creating because it did not have the unhampered right to vote. Who can say that the legislature of Alabama which set up the segregation laws was democratically elected? Throughout the state of Alabama all types of conniving methods are used to prevent Negroes from becoming registered voters, and there are some counties without a single Negro registered to vote, despite the fact that the Negroes constitute a majority of the population. Can any law set up in such a state be considered democratically structured?

When you read the letter written by eight Alabama clergymen that King was responding to, the motivation for this paragraph becomes crystal-clear:

First, I must confess that over the last few years I have been gravely disappointed with the white moderate. I have almost reached the regrettable conclusion that the Negro’s great stumbling block in the stride toward freedom is not the White Citizens Councillor or the Ku Klux Klanner but the white moderate who is more devoted to order than to justice; who prefers a negative peace which is the absence of tension to a positive peace which is the presence of justice; who constantly says, “I agree with you in the goal you seek, but I can’t agree with your methods of direct action”; who paternalistically feels that he can set the timetable for another man’s freedom; who lives by the myth of time; and who constantly advises the Negro to wait until a “more convenient season.” Shallow understanding from people of good will is more frustrating than absolute misunderstanding from people of ill will. Lukewarm acceptance is much more bewildering than outright rejection.

One key insight among many in King’s five-and-a-half-page letter is the different ways in which the black community responded to the stubborn persistence of Jim Crow and segregation: adjusting to it, becoming desensitized to the problems of those blacks less secure economically and academically than themselves, or bitterness.  His warnings about what could happen if the nonviolent efforts for justice he championed were rejected would unfortunately come true–not just in the immediate wake of his assassination five years after this letter, but many times since in the wake of police violence resulting in the death of someone in custody (and/or acquittals in the rare court trials officers faced for such violence).

King’s decades-old criticism of the Christian church of his day should shame today’s Christian church:

The contemporary church is so often a weak, ineffectual voice with an uncertain sound. It is so often the arch supporter of the status quo. Far from being disturbed by the presence of the church, the power structure of the average community is consoled by the church’s often vocal sanction of things as they are.

But the judgment of God is upon the church as never before. If the church of today does not recapture the sacrificial spirit of the early church, it will lose its authentic ring, forfeit the loyalty of millions, and be dismissed as an irrelevant social club with no meaning for the twentieth century. I meet young people every day whose disappointment with the church has risen to outright disgust.

The hypocrites referencing King on this day are doing things like invoking his name in support of “All Lives Matter”, or to support their bans on Critical Race Theory (which could pretty easily prevent children in our public schools from actually learning anything about King’s letter).  Some More News has a hilarious, profane, and correct take on the annual whitewashing of King’s legacy.

1/6 and 9/11

Absent from much of the written commentary I’ve read about the insurrection at the US Capitol last year has been any mention of how much the nation’s response to the 9/11 attacks helped to pave the way to where we are now.  A friend sent me this piece by a Canadian professor which serves as a good example of what I mean.

Though he correctly identifies specific individuals and economic forces going back 40 years that transferred wealth upward even as they directed discontent (if not rage) about this state of affairs against poor and minority populations at home and “foreign aid” abroad, there is not a single mention of the nation’s response to the 9/11 attacks.  The nation’s lurch toward authoritarianism in the wake of those attacks was bipartisan.  Just a single member of Congress, Barbara Lee of California, voted against the open-ended 2001 Authorization for Use of Military Force; a separate authorization the following year would be used to invade Iraq on pretexts that proved false.  Large bipartisan majorities in the House and Senate drafted and approved the Patriot Act for George W. Bush to sign into law, and the Homeland Security Act that followed created the Department of Homeland Security.  George W. Bush’s administration engaged in warrantless surveillance of millions of Americans, extraordinary rendition of terrorism suspects, and torture of those same suspects.  Enemy combatant status was created out of thin air, as were the military tribunals in Guantanamo Bay, Cuba–all to deny people the rights they should have had under our Constitution.  The NYPD illegally surveilled Muslims both inside and outside New York City for over a decade after the attacks.  The LAPD tried and failed to create a similar surveillance program in 2007.

Thomas Homer-Dixon’s piece mentions Christians just twice: once as fertile soil for the seeds of white nationalist great replacement theory to take root and flourish, and again as a group that would be super-empowered in a second Trump administration.  He projects a rise in violence by vigilante paramilitary groups in the same sentence, though the use of Christian symbols and rhetoric by such groups has a history stretching back well over a century in the US.  The involvement of conservative Christian groups in the insurrection is much less surprising, however, when you look back at their response to 9/11.  When surveyed in 2009 by the Pew Research Center, a majority of white evangelical Protestants said that torture of terrorism suspects could sometimes or often be justified.  This belief was held both by majorities of Christians who attended church a few times a year or monthly, and by those who attended church weekly or more often.  Years after the original survey, you could even find a piece like this one in The Federalist quoting Bible passages and Thomas Aquinas to argue that Christians can support torture.

Not mentioned at all in the Homer-Dixon piece: the significant increase in anti-Muslim sentiment in the aftermath of the 9/11 attacks.  The first murder victim of a post-9/11 anti-Muslim hate crime turned out to be a naturalized American citizen, Balbir Singh Sodhi; the turban he wore in adherence to the Sikh faith was sufficient cause for a bigot to murder him.  Anti-Muslim sentiment would later take the form of the birther conspiracy, of which Donald Trump would become the most powerful cheerleader.  We have seen other anti-Muslim murders born of the ignorance of bigots (in Olathe, Kansas) as well as violent assaults.  We’ve also seen the political right demagogue Park51 into the “Ground Zero mosque”.  That same year (2010) saw the introduction of anti-Sharia bills in a significant majority of our 50 states.  The number of conservative professed Christians who believed (and perhaps still believe) the birther conspiracy is, in retrospect, perhaps one explanation for the ease with which the QAnon conspiracy spread within the same community.  Looking back a bit further, that community’s response to 9/11 might have revealed a predisposition to conspiracy theories more generally: in 2006, a division of the Presbyterian Church’s denominational publisher published a 9/11 conspiracy book.

There will certainly be more commentary about January 6th as this year progresses–particularly as more insurrectionists plead guilty to the crimes with which they’re charged or (finally) face trial.  But the absence of a full reckoning with how this country’s responses to 9/11 helped pave the way for 1/6 will prevent us from fully understanding that event–and might enable the next insurrection to succeed.

Is a College Degree Worth It?

Public discourse has turned (again) to the question of whether or not a college degree is “worth it”. I say again because in the tech industry, this question was asked about computer science (CS) degrees over a decade ago. I was prompted to revisit this blog post from over 14(!) years ago by Scott Hanselman’s response to a TikTok video claiming a computer science degree is never worth it:

 

Back in 2007, I was managing a team consisting mostly of what Tarver calls “street programmers”.  In that particular experience, Tarver was wrong about street programmers being superior to formally-trained CS graduates.  The members of my staff who consistently turned out the highest-quality code (which, not coincidentally, was also the best-tested and the least likely to require re-work) all had CS degrees.  In my next role, one of my colleagues was an Air Force veteran who was self-taught in software engineering.  He was one of the most skilled engineers I’ve worked with in my entire career, and he taught me a ton about the practice of continuous integration over a decade ago that I still use in my work today.

Re-reading Tarver’s post, I noticed that even he concedes that the combination of hands-on programming practice and a strong grasp of theory creates a superior programmer compared to one trained only in university or only on the job.  The other thing which struck me as odd in retrospect was the lack of any mention of summer internships.  Back in the early-to-mid 90s when I was earning my own computer science degree, the expectation was definitely that CS majors would complete at least one summer internship before they graduated, so they had at least a little experience with programming outside of coursework requirements.  I found an on-campus job where I worked during the semester, which at least had tasks that I could automate with scripts, as well as database work.  My summer internship with The Washington Post as a tech intern turned into a part-time job my last semester of undergrad and a full-time job offer at the end of the year.  So instead of a declarative statement such as “college is never worth it” or “college is always worth it”, a better answer to the question is more like “it depends”.

Quite a lot has changed since 2007 when it comes to the variety of ways available to learn programming.  There are lots of programming bootcamps now.  My current employer partners with one to train college graduates with degrees in fields other than computer science for entry-level software engineering roles with us.  Beyond instructor-led bootcamps, there is a wealth of online education options, both free and paid.  Having worked with engineers who came into the field via the bootcamp route at two different companies now, I’ve seen enough inconsistency in bootcamp graduates’ readiness for professional work that most require more oversight and supervision in entry-level positions than graduates of computer science programs do.

At least some of the discussion about the worth of college degrees (in CS or any other field) is a function of tuition continuing to increase at triple the rate of inflation, as it has for decades.  The total amount my parents spent on in-state tuition for my CS degree in the 90s might not even cover 2 years at the same school today.  A year of tuition at Carnegie Mellon University, my 1st-choice school back then, now costs at least triple the $24,000 it charged in 1992.  It might be possible to rationalize paying high tuition for a STEM degree with high long-term earning potential, but those high tuition rates apply regardless of major.

Another issue that discussions of whether or not college degrees are “worth it” consistently miss is how open different fields–and companies within those fields–are to hiring people without formal training.  Particularly in tech, that openness exists for white men in a way that it definitely does not for people of color.  Shawn Wildermuth’s documentary Hello World gets deep into why women and minorities tend not to pursue careers in software development.  Even with the credential of a college degree and experience, it can be very challenging to sustain a tech career–much less advance in one–if you don’t look like the people who make hiring and promotion decisions.

Count me in the camp of those who believe a CS degree is worth it.  I wouldn’t have the tech career I have today without it.

Thoughts on the Many Shades of Anti-Blackness

A friend shared the following tweet with me not long ago:

Whoever Jen Meredith is, she is hardly alone in sharing these sentiments.  Few routes to acceptance by the still-predominant culture in the United States are shorter and more reliable than implicit or explicit criticism of the black community in America, whose heritage here stretches back even before the founding of the country as we know it.  There have always been people who buy into the model minority myth.  The term “Asian” elides significant differences between that community’s various subcultures (and erases the parts of that very large community which don’t fit the immigrant success story, in exactly the way some white conservatives do).  People from the Philippines have meaningfully different backgrounds than those from South Korea, Pakistan, and Vietnam, to name a few examples.

Meredith is (obviously) sub-tweeting American blacks with her entire comment, but the “no ethnic leader” part in particular betrays a very specific ignorance about the history of black people in the United States.  Black people in this country have never had just one leader.  Malcolm X and Martin Luther King, Jr. are just the ones that recent history (the vast majority of which has not been written by black people) has acknowledged.  Less often noted are men like Marcus Garvey, who, while Jamaican (not American), nevertheless found a receptive audience among some black Americans (including the parents of Malcolm X).  A. Philip Randolph was no less important than either of those men.  The same can be said of Bayard Rustin, Fred Hampton, W.E.B. Du Bois, or Booker T. Washington.

Asians in the United States may not have had a singular figure that history chooses to recognize in this way (or a Cesar Chavez, like the Mexican-American community), but perhaps that’s in part because they haven’t really needed one.  This doesn’t mean they haven’t experienced racism in this country.  The federal government passed laws against Chinese immigration, and some Chinese were even lynched in California just as blacks were in the South.  Japanese-Americans were put in concentration camps and had their property taken.  But at least they had property to take, which could not be said of black Americans in many cases.  One Asian-American experience which may not be broadly known, but is emblematic of the subtleties of racism in this country, is that of the Mississippi Delta Chinese.  The entire project is well worth reading and listening to in full, but here is one part which stood out to me:

“After WWII, China was an ally to the United States and then the rules relaxed; I think it was in 1947 or 1948.  After the war, Chinese kids were allowed to attend white public schools, so that was the year that I started first grade.”

Isaac Woodard was just one of many black veterans of WWII who were attacked just for wearing their uniforms around this time.  Some black veterans fared even worse than Woodard.  The US military didn’t desegregate until 1948.  Over two decades would pass before schools in Yalobusha County, Mississippi (and the rest of the state) would finally desegregate.  At the same time that members of the Asian-American & Pacific Islander (AAPI) community were attending better-quality schools and building wealth, many black military veterans were being denied the benefits of the GI Bill.  Black people resorted to overpaying for housing via contracts, due to racist real estate covenants and redlining by the Federal Housing Administration.  All of this happened before you even get to the ways in which federal civil rights, voting rights, and fair housing legislation have been actively undermined or passively neglected from the Nixon administration forward.

When your experience (and your parents’ experience) of the United States doesn’t include the combination of chattel slavery, pogroms, property theft, terrorism, segregation, and other aspects of the black American experience, you’re bound to see this country differently. That’s why you can (unfortunately) hear some of the same anti-black American sentiments from black immigrants to this country. Particularly as someone who writes software for a living and leads teams of software engineers, I have more common experiences with my fellow church members, classmates, and co-workers from India, China, and the Philippines than I do with some black people with hundreds of years of heritage in this country.

Finally, it is exceedingly unwise to underestimate the growing political power of the Asian-American & Pacific Islander community. This movement with “no ethnic leader” (as Meredith claims) got federal legislation passed against Asian hate crimes—in our current political environment—when we still don’t have a federal law against lynching after over a century of attempts to pass one.  It’s all well and good to talk about having agency in one’s life.  I am doing my best as a parent to teach my own children the same lessons about making good choices that my parents taught me.  But criticisms of the American black community that fail to acknowledge how an unjust society increases the difficulty of making wise choices are dishonest.

Thoughts on Diversity in Tech

On April 28, I participated in a panel and Q & A on the intersection of race & technology.  My 2 co-panelists and I each had 15 minutes for a monologue regarding our personal experiences with how race and the tech industry intersect.  This post will excerpt my prepared remarks.

Excerpt of Prepared Remarks

How did I end up writing software for a living anyway?  I blame LEGOs, science fiction, and video games.  While I’ve never actually worked in the gaming industry, I’ve built software solutions for many others—newspapers, radio, e-commerce, government, healthcare, and finance.  Tech industry salaries, stocks, and stock options have given me a lifestyle that could accurately be called upper middle class, including home ownership and annual domestic and international travel for work and pleasure (at least before the pandemic).
For all the financial rewards the industry has had to offer though, “writing software while black” has meant being comfortable with being the only one (or one of two) for the majority of my career–going all the way back to my initial entry into the field.  As an undergraduate computer science (CS) major at the University of Maryland in the early to mid-nineties, I was on a first-name basis with all the other black CS majors in the department, because there were never more than 10-12 of us in the entire department during my 4 1/2 years there–on a campus with tens of thousands of students.  In that time, I only ever knew of one black graduate student in CS.  My instructor in discrete structures at the time was Hispanic.  Even at a school as large as the University of Maryland, when I graduated in the winter of 1996, I was the only black graduate from the computer science department.
Unlike law, medicine, engineering, or architecture, computer science is still a young enough field that the organizations which have grown up around it to support and affirm practitioners of color are much younger.  The National Society of Black Engineers, for example, was formed in 1975.  The Information Technology Senior Management Forum (ITSMF), an organization with the goal of increasing black representation at senior levels in tech management, was formed in 1996.  The oldest founding year I could find for any existing tech organization specifically supporting black coders (Black Girls Code) was 2011.  I’d already been a tech industry professional for 15 years at that point, and in every organization I’d worked for up to then, I was either the only black software engineer on staff, or 1 of 2.  It would be another 2 years before I would join a company where there was more than one other black person on staff in a software development role.
I’ve had project and/or people leadership responsibilities for 8-9 years of my over 20 years in tech.  As challenging as succeeding as an under-represented minority in tech has been, adding leadership responsibilities increased the scope of the challenge even more.  As rarely as I saw other black coders, black team leads were even scarcer until I joined my current company in 2017.  It basically took my entire career to find, but it is the only place I’ve ever worked where being black in tech is normal.  We regularly recruit from HBCUs.  We hire and promote black professionals in technical, analytical, managerial, and executive roles in tech.  There are multiple black women and men at the VP level here.  The diversity even extends to the board of directors–four of its members are black men, including the CEO of F5 Networks.
Perhaps most importantly–and contrary to the sorts of things we hear too often from people like James Damore and others about diversity requiring lower standards–this diverse workforce has helped build and maintain a high-performance culture.  This publicly-traded company is regularly in the top 25 of Fortune Magazine’s annual best places to work rankings.  This year–even in the midst of the pandemic–it jumped into the top 10 for the first time.
The company uses its size to the benefit of under-represented minorities in tech with business resource groups (BRGs).  Two of the BRGs I belong to have provided numerous opportunities to network with other black associates, and to recruit and be recruited for growth opportunities in other lines of business.  As a result, it’s the only company I’ve worked for in my entire career where I’ve had the ability to recruit black engineers to join my team.  These groups have even provided a safe space to vent and grieve regarding the deaths of unarmed black men and women at the hands of police officers.  When we learned that Ahmaud Arbery had been murdered, I had black coworkers I could talk about it with, from the individual contributor level all the way up to the VP level.  We were able to talk about George Floyd’s murder at the time, and in the aftermath of Derek Chauvin’s trial.  As long as these deaths have been happening, this is the only employer I’ve ever worked for where I know there is a like-minded community I can talk through such issues with, as well as sympathetic allies.
Not only has this company put millions of dollars into organizations like the Equal Justice Initiative (EJI), it also set up a virtual event for EJI’s founder, Bryan Stevenson, to speak to us and field our questions.  Ijeoma Oluo and Dr. Henry Louis Gates, Jr. have participated in corporate events as well.  The company is one of just three Palladium Partners with ITSMF.  I recently completed a program they created for us called the Leaders of Color Workshop, for the purpose of helping black managers advance within the organization.
All the good things I’ve shared don’t mean it’s a perfect employer (as if such a thing existed).  I found it necessary to transfer to a different department and line of business in order to find a manager interested in helping me advance my career.  Talking to my classmates in the most recent workshop revealed quite a few stories of experiences far more negative than mine, from people who have been part of the company much longer than I have.  There have also been at least a couple of instances of viral Medium posts from former employees whose experiences were far more negative than mine.  But at least in my experience, it’s been and continues to be a great place to be black in tech.
Because the majority of our workforce is women, and nearly a third of the staff comes from minority groups that are under-represented in tech, the company has done a pretty good job of avoiding the sort of missteps that can put you in the news for the wrong reasons.  Seemingly just in time for the discussion we’re about to have, the founders of Basecamp (the very opinionated makers of the product of the same name and the HEY email client, among other products) are taking their turns as the proverbial fish in a barrel, due to their decision to follow the example of Coinbase in disallowing discussions of politics and social causes at work.  So it was very interesting to read the open letter published to them by Jane Yang, one of their employees currently on medical leave.  She writes in some detail about the founders’ decision to exclude hate speech and harassment from the initial use restrictions policy for their products.  Read Jason Fried’s initial post and David Heinemeier Hansson’s follow-up for fuller context.
Basecamp is a small example (just 60 employees) and Coinbase a slightly larger one (1,200+ employees), but both are good proxies for many companies I’ve worked for, and for companies orders of magnitude larger like Facebook, Amazon, and Google, which have recently been in the news for discriminatory treatment of under-represented minorities in their workforces.  Their failure, and that of the tech industry at large, to seriously address the lack of diversity in recruiting and hiring practices has resulted, and will continue to result, in products that not only fail to adequately serve under-represented minorities, but actively cause harm.  In the same way monoculture in farming creates genetically uniform crops that are less resistant to disease and pests, monoculture in corporate environments leads to groupthink, and to more uniform, less innovative products with a higher risk of automating and perpetuating existing biases.
I recently watched Coded Bias, a documentary available on Netflix (and PBS) that highlighted the failings of existing facial recognition technology and the dangers it poses–to people of color in particular (because it tends to be far more inaccurate with darker-skinned people) but to people in general.  Were it not for the work of Joy Buolamwini, a black woman research assistant in computer science at MIT, we might not have learned about these flaws until much later–if at all.  These dangers extend beyond facial recognition technology to the application of algorithms and machine learning to everything from sentencing and parole determinations, hiring and firing decisions, to mortgage, loan, and other credit decisions.  Particularly as a bank employee, I’m much more conscious of the impact that my work and that of my team could potentially have on the lives of black and brown bank customers.  Even though it’s outside the scope of my current team’s usual work, I’ve begun making efforts to learn more about the ML and artificial intelligence spaces, and to raise concerns with my senior leadership whenever our use of ML and AI is a topic of discussion.  Despite all the challenges we face being in tech as under-represented minorities, or women, or both, it is vital that more of us get in and stay in tech–and continue to raise the concerns that would otherwise be ignored by today’s tech leaders.  Current and future tech products are quite likely to be worse if we don’t.
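The kind of disparity Buolamwini’s research surfaced can be checked with a very simple audit: compare a model’s error rate per demographic group instead of looking only at overall accuracy.  The function and records below are a fabricated toy example, not the methodology of any actual study:

```python
# Toy per-group error-rate audit for a classifier's predictions.
# All records here are fabricated for illustration.
from collections import defaultdict

def error_rates_by_group(records):
    """records: iterable of (group, predicted, actual) tuples."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

records = [
    ("lighter", 1, 1), ("lighter", 0, 0), ("lighter", 1, 1), ("lighter", 1, 0),
    ("darker", 1, 0), ("darker", 0, 1), ("darker", 1, 1), ("darker", 0, 1),
]
rates = error_rates_by_group(records)
gap = max(rates.values()) - min(rates.values())
print(rates)               # {'lighter': 0.25, 'darker': 0.75}
print(f"gap: {gap:.2f}")   # gap: 0.50
```

A single aggregate accuracy number can mask a large gap between groups, which is exactly the failure mode Coded Bias documents.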

False Unity and “Moving On” is Dangerous

Even before yesterday’s inauguration of Joe Biden and Kamala Harris as the new President and Vice President of the United States, there were calls for unity—even empathy—and not just from Joe Biden.  Such calls seemed very premature at the time, given the efforts of Trump and his allies to overturn the election result.  With the failure of those efforts, despite a literal assault on the entire legislative branch incited by Trump resulting in five dead, such calls for unity and healing look even more naive.

Too many so-called conservatives (and some of those further left on the political spectrum) would rather put unity ahead of accountability.  MAGA adherents and believers in the QAnon conspiracy theory invaded the US Capitol and delayed the legislative branch from executing its responsibility to certify the Electoral College results, at the urging of the president and his allies.  They may have been aided and abetted in this insurrectionist act by multiple members of the GOP in both the Senate and the House.  At least one shared Speaker Pelosi’s location on Twitter, as if to direct insurrectionists to her.  The wife of a Supreme Court justice may have funded the transportation to the Capitol for some of these insurrectionists.  Even the death toll, the damage to the US Capitol, and the risk to their own lives did not prevent some Republicans from voting against certification of the Electoral College tally once the Capitol was secured.

Placing unity before accountability too many times is what has led the country here.  Unity before accountability killed Reconstruction, subjecting black Americans to almost another century of domestic terrorism, property theft, and subjugation at the hands of whites.  The Nixon pardon, the Iran/Contra pardons, and the lack of accountability for those who engaged in torture and warrantless wiretapping of US citizens all placed unity before accountability.  All of these actions paved the way for President Trump to be acquitted despite clear evidence that he tried to shake down the president of Ukraine in exchange for the announcement of an investigation into Hunter Biden.

Less than a year has elapsed between the Senate’s acquittal of Trump on two impeachment charges and the insurrection on January 6.  Only a tiny number of GOP House members put their country ahead of their party in voting for a second impeachment.  A second acquittal for Trump seems likely–and we will live to regret it.

The Minimum Wage Debate is Too Narrow and Small

Recently I’ve found myself having variations of the same conversation on social media regarding the minimum wage.  Those to my political left have made statements such as “if your business would fail if you paid workers $15/hour you’re exploiting them.”  Those to my political right–some current or former business owners, some not–argue that minimum wage increases had a definite impact on their bottom line.

I have two problems with the first argument: (1) it oversimplifies and trivializes a very serious issue, and (2) these days, those making it tend to aim it at small business owners.  Worker exploitation is real, and conflating every employer who follows the law on pay and other facets of employment with actual exploiters harms the cause of combatting serious abuses.  The outgoing Trump administration has been trying to reduce the wages of H-2A workers.  Undocumented workers in sectors like agriculture, food, and home-based healthcare fare even worse.  In some cases, drug addiction treatment has turned thousands of people into little more than indentured servants, with complicity from judges and state regulators.  Until recently, large corporations like Wal-Mart and Amazon evaded accountability for low worker pay and mistreatment, despite having significant percentages of workers on food stamps and Medicaid and a high rate of worker injuries.

Another variation of the first argument takes a starting point in the past (like the 1960s), then says the minimum wage should be whatever inflation would have grown it to between then and today.  If you go back to when Dr. Martin Luther King, Jr. was alive (for example), the claim is that the minimum wage today “should” be $22/hour (a figure that actually comes from indexing to productivity growth; inflation alone yields something closer to $12).  You can pick any point in time and say what the minimum wage should be based on inflation, but that’s not the same as grappling honestly with how industries and the nature of work have changed in the half-century plus since the civil rights era.
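The back-of-the-envelope adjustment behind these arguments is just a ratio of price levels.  Here is a sketch using approximate CPI-U values; the exact figures should be checked against BLS data:

```python
# Inflation adjustment of a historical wage via a CPI ratio.
# The CPI values are approximations for illustration; consult BLS
# data for exact annual-average figures.

def adjust_for_inflation(wage_then: float, cpi_then: float, cpi_now: float) -> float:
    """Scale a historical wage by the ratio of price levels."""
    return wage_then * (cpi_now / cpi_then)

min_wage_1968 = 1.60   # federal minimum wage in 1968
cpi_1968 = 34.8        # approximate annual-average CPI-U, 1968
cpi_2021 = 271.0       # approximate annual-average CPI-U, 2021

print(f"${adjust_for_inflation(min_wage_1968, cpi_1968, cpi_2021):.2f}")  # ≈ $12.46
```

Inflation alone yields roughly $12-13; reaching figures like $22 requires indexing to something other than prices (such as productivity growth), which is a different and debatable choice of baseline.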

One challenge with the second argument is that the examples cited are typically restaurants or food services–businesses that operate at low margins and have high fixed costs, in addition to being labor-intensive.  Even in that sector, the impacts of a $15/hour minimum wage are not necessarily what you might expect.  But not every business is the restaurant business, and a single sector cannot set the parameters of debate for an issue that impacts the entire economy, if the broader society is to get a broadly beneficial result.

At this point in the discussion, someone usually brings up automation, followed by someone mentioning universal basic income (UBI).  What I have said in the past, and will continue to say, is that automation is coming regardless of what the federal government, states, and/or localities do with the minimum wage.  As someone who has written software for a living for over 20 years, the essence of my line of work is automating things.  Sometimes software augments what people do by taking over rote or repetitive aspects of their jobs and freeing them up to do more value-added work.  But if an entire job is rote or repetitive, software can and does eliminate it.  The combination of software and robots is what enables some manufacturers to produce so many goods without the large number of workers they would have needed in the past.

Talking about UBI enlarges the conversation, but even then it may not fully take on the nature of the relationship between government, business, and people.  We do not talk nearly often enough about how long the United States got by with a much less robust social safety net than other countries, because of how much responsibility employers used to take on for their employees.  Nor do we talk about the amount of additional control that arrangement gives employers over their employees–or the cracks in the system that can result from unemployment.  The usual response from the political right whenever there is any discussion of separating health care from employment is to cry “socialism”.  But the falseness of such charges is easily exposed.  Capitalism seems to be alive and well in South Korea, which has a universal healthcare system–a significant portion of which is privately funded.  Germany is another country where capitalism, universal healthcare, and private insurers seem to be co-existing just fine.

The conversation we need to have, as companies and their shareholders get richer, share fewer of those gains with their workers, and otherwise delegate responsibilities they used to keep as part of the social contract, is how the relationship between government, business, and people should change to reflect the current reality.  The rationale always given for taxing capital gains at a lower rate than wages has been that it encourages investment.  But as we’ve seen both in the pandemic and in the corporate response to the big tax cut in 2017, corporate execs mostly pocketed the gains for themselves or did stock buybacks to further inflate their per-share prices.  Far from sharing any of the gains with workers, some corporations laid off workers instead.  Given ample evidence that preferential tax treatment for capital gains does not result in more investment, the preference should end.  People of working age should not be solely dependent on an employer or Medicare for their healthcare.  A model where public and private insurance co-exist for those people, untied to employment, is where we should be headed as a society.

We need to think much harder than we have about what has to change, both to account for the deficiencies in our social safety net (which corporations will not fill), and for an economy on its way to eliminating entire fields that employ a lot of people today.  Bill Gates advocated in favor of a tax on robots years ago.  The challenges of funding UBI, and whether or not it’s possible to do that while maintaining the social safety net as it currently exists, need to be faced head-on.  Talking about the minimum wage alone–even as multiple states and localities increase it well beyond the federal minimum–is not enough.