By Sarah Montgomery
“The @AppleCard is such a fucking sexist program. My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple’s black box algorithm thinks I deserve 20x the credit limit she does. No appeals work,” tweeted user @dhh.
The man behind the Twitter handle is David Heinemeier Hansson, a software developer with a large social media following. When he and his wife, Jamie, applied for the new Apple Card, he received a much higher credit limit, even though his wife has a better credit score and they share assets.
Steve Wozniak, an Apple co-founder and current employee, said that he and his wife ran into a similar gap in the credit limits the card offered them.
The Apple Card, which launched in August, was developed by Apple and Goldman Sachs. According to a support page, the applicant’s credit score, credit report, and income level are fed into an algorithm that decides creditworthiness.
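Apple’s support page names the inputs but not the model itself. As a rough, hypothetical illustration of what “an algorithm that decides creditworthiness” can look like, the sketch below combines those same three inputs into a limit; the function name, weights, and cutoffs are invented for this example and are not the actual Apple Card model.

```python
# Hypothetical illustration only. The cutoff, weights, and heuristic below are
# invented; Apple and Goldman Sachs have not published their actual model.

def assign_credit_limit(credit_score: int, reported_debt: float, annual_income: float) -> float:
    """Combine the three disclosed inputs (score, report data, income) into a limit."""
    if credit_score < 600:                                   # invented decline cutoff
        return 0.0
    headroom = max(annual_income * 0.2 - reported_debt, 0)   # invented affordability heuristic
    score_multiplier = credit_score / 850                    # scale by score, 850 = max FICO
    return round(headroom * score_multiplier, -2)            # round to nearest $100

print(assign_credit_limit(credit_score=780, reported_debt=5_000, annual_income=90_000))
```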
In response to the media attention that Hansson’s tweet garnered, the New York State Department of Financial Services announced that it will be investigating the algorithm that Apple and Goldman Sachs use.
How can artificial intelligence share human biases? According to a CNN Business article, artificial intelligence “can quickly learn about a simple concept, but it is dependent on the data that us humans feed it, for better or worse.”
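To see how that dependence on human-supplied data can translate into a gender gap, consider a toy sketch: a model that never sees gender at all, trained on invented historical lending decisions in which a “proxy” feature happens to track gender. The data, field names, and numbers below are made up for illustration and have no connection to the actual Apple Card system.

```python
# Illustrative sketch only. It shows how a learning algorithm can absorb bias
# from historical data even when the protected attribute (gender) is never
# given to it directly: a correlated proxy feature carries the signal instead.

# Hypothetical past underwriting decisions. "primary_account_holder" is a proxy
# that, in this invented data, correlates with gender for historical reasons.
history = [
    # (income, credit_score, primary_account_holder, approved_limit, gender)
    (90_000, 780, 1, 20_000, "M"),
    (90_000, 790, 0,  4_000, "F"),
    (60_000, 700, 1, 12_000, "M"),
    (60_000, 710, 0,  2_500, "F"),
]

def fit_limit_by_proxy(rows):
    """Learn the average approved limit keyed only on the proxy feature.
    Gender is deliberately excluded, yet the learned limits still split along
    gender lines because the proxy tracks gender in the training data."""
    totals, counts = {}, {}
    for income, score, proxy, limit, _gender in rows:
        totals[proxy] = totals.get(proxy, 0) + limit
        counts[proxy] = counts.get(proxy, 0) + 1
    return {p: totals[p] / counts[p] for p in totals}

model = fit_limit_by_proxy(history)
print(model)   # {1: 16000.0, 0: 3250.0} -- the historical disparity is reproduced
```

Because the proxy feature carries the historical pattern, the learned limits reproduce the old disparity even though gender was never an input.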
Gender discrimination in the finance world is nothing new. Only in 1974, not even 50 years ago, did Congress pass the Equal Credit Opportunity Act (ECOA). The act requires that banks, credit card companies, and other lenders make credit equally available to all creditworthy customers. It also outlaws discrimination based on several personal characteristics, sex among them, so lenders may base creditworthiness decisions only on income, expenses, debts, credit history, and a limited set of other financial information.
“[I]f you weren’t a white male you were likely to be treated like a potential problem, not a potential customer [by lenders].”
Billy Fay of debt.org
Until ECOA was implemented, women routinely could not obtain credit in their own names; lenders could insist that a husband or father apply or co-sign, even if the woman in question was gainfully employed. Though the legislation was a major step in the right direction, the National Consumer Law Center argues that unfair credit discrimination is alive and well, particularly against women.
Following up on his original tweet, Hansson wrote: “I’m surprised that they even let her apply for a credit card without the signed approval of her spouse? I mean, can you really trust women with a credit card these days??!”