Apple launched its Mastercard-branded Apple Card nationwide in August. In the months since, the digital-first payment product has won fans for its easy integration into the iPhone and the broader Apple ecosystem, and it seemed to work more or less like any other credit card. Now, however, financial regulators want to know what's going on under the hood, amid accusations that the software determining the card's terms has a sexist streak.
Software developer and entrepreneur David Heinemeier Hansson took to Twitter late last week to complain about his wife Jamie Heinemeier Hansson's experience with the Apple Card.
"@AppleCard is such a fucking sexist program," began his lengthy thread. "My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple's black box algorithm thinks I deserve 20x the credit limit she does. No appeals work."
"It gets even worse," he added, sharing a screenshot showing $0 owed on a limit of, apparently, $57.24. "Even when she pays off her ridiculously low limit in full, the card won't approve any spending until the next billing period. Women apparently aren't good credit risks, even when they pay off the damn balance in advance and in full."
Talking to Apple's customer service did no good, he added, with representatives repeatedly deflecting blame onto the black box that makes the decisions. The customer service reps were "very nice, courteous people representing a completely broken and reprehensible system," Hansson said. "The first person was like, 'I don't know why, but I swear we're not discriminating, it's just THE ALGORITHM.' I kid you not. 'IT'S JUST THE ALGORITHM!'"
Several other men chimed in on Twitter with replies describing similar experiences: their wives, who on paper looked like the better credit risks, were given significantly less favorable terms on their Apple Cards than they were. One of the replies came from Apple co-founder Steve Wozniak, who tweeted that even though he and his wife hold only joint bank accounts and assets, his Apple Card got a limit 10 times higher than hers.
As Hansson's thread went viral and garnered media attention, Apple's VIP customer service representatives stepped in. They raised the credit limit on Jamie's card to match David's and launched an internal investigation.
Apple's VIP support staff are not the only ones interested in finding out whether the company's mysterious algorithm is behaving in a discriminatory way; regulators are now investigating as well.
Hansson's tweets caught the attention of Linda Lacewell, superintendent of the New York Department of Financial Services. "Here in New York State, we support innovation," Lacewell wrote in a blog post on Sunday, adding:
However, new technologies cannot leave certain consumers behind or entrench discrimination. We believe innovation can help solve many challenges, including making quality financial services more accessible and affordable. But this promise cannot be realized without public confidence. For innovation to deliver lasting and sustained value, consumers who use new products or services must be able to trust that they are being treated fairly.
No financial product or service offered in New York State may discriminate against protected groups. Those products include the Apple Card, which is backed by New York-based Goldman Sachs.
"We look at an individual's income and an individual's creditworthiness, which includes factors like personal credit scores, how much debt you have, and how that debt has been managed," Goldman Sachs said in a statement. "Based on these factors, it is possible for two family members to receive significantly different credit decisions. In all cases, we have not and will not make decisions based on factors like gender."
CNBC reports that Goldman was "aware of the potential issue" before the card launched in August but chose to go ahead anyway. The bank says it is still considering ways to offer shared accounts, such as allowing multiple cardholders on a single account or permitting co-signers.
The statement (and the potential for joint accounts or co-signers) does not specifically address why several users reported that their wives, in some cases literal millionaires, received significantly lower Apple Card credit limits and higher interest rates despite being the higher earner in the family, having the higher credit score, or both.
Unintended consequences

It is extremely unlikely that anyone at either Apple or Goldman Sachs sat down, twirled a mustache à la Snidely Whiplash, and said, "Aha! Let's treat women worse than men!" Doing so would be both morally and financially stupid, and no one is accusing the companies of doing it intentionally.
However, decisions made by algorithms have a way of reflecting good old-fashioned human prejudice, only with even less transparency. And it happens in almost every sector; examples abound.
About a year ago, Amazon had to stop using an AI hiring and recruiting tool after it turned out to penalize female candidates. In essence, the software looked at the company's existing successful workforce, which skews male, and concluded that being male must be a key predictor of success.
In 2015, ProPublica found that Asian American families were likely to be charged significantly more for SAT test-prep services. The algorithm that determined pricing was not explicitly built to discriminate by race; instead, it used ZIP codes, but it charged higher prices in neighborhoods that happened to be predominantly Asian.
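The ZIP-code case illustrates a general mechanism: a system can produce disparate outcomes by demographic group even when the protected attribute never appears as an input, because another input is correlated with it. The minimal Python sketch below uses entirely made-up ZIP codes, prices, and group labels (none of this is ProPublica's actual data) to show a pricing rule that keys only on ZIP code yet still yields different average prices by group:

```python
from collections import defaultdict

# Hypothetical population: (zip_code, demographic_group) pairs.
# ZIP "11355" is predominantly group A; ZIP "10001" is predominantly group B.
population = [
    ("11355", "A"), ("11355", "A"), ("11355", "A"), ("11355", "B"),
    ("10001", "B"), ("10001", "B"), ("10001", "B"), ("10001", "A"),
]

# The pricing rule sees only the ZIP code -- no demographic field at all.
ZIP_PRICE = {"11355": 8400, "10001": 6600}  # hypothetical course prices

def quote(zip_code: str) -> int:
    """Return the quoted price for a given ZIP code."""
    return ZIP_PRICE[zip_code]

# Yet the average price paid still differs sharply by group,
# because group membership correlates with ZIP code.
totals, counts = defaultdict(int), defaultdict(int)
for zip_code, group in population:
    totals[group] += quote(zip_code)
    counts[group] += 1

for group in sorted(totals):
    avg = totals[group] / counts[group]
    print(f"group {group}: average quoted price ${avg:.0f}")
# group A averages $7950; group B averages $7050, despite the rule
# containing no demographic input whatsoever.
```

The same logic applies to credit models: dropping "gender" from the feature list does not guarantee gender-neutral outcomes if other inputs (spending categories, employment history, account structure) correlate with it.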
Algorithmic entrenchment of systemic bias is also pervasive in the criminal justice system, where risk-scoring math tends to assign black defendants a higher likelihood of reoffending after serving their sentences than white defendants, as well as higher cash bail, despite evidence that the scores are unreliable and often incorrect.
"The formula was particularly likely to falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate of white defendants," ProPublica wrote in 2016. "White defendants were mislabeled as low risk more often than black defendants."

High-profile luck
The Hanssons are lucky in several ways. First, they sit at just about the top of the consumer spectrum. Jamie wrote in a statement today that she has been financially successful, independent of her husband, for many years. She does not currently hold a full-time job outside the home, instead caring for the couple's three children, she said, but "I am still a millionaire who contributes greatly to my household and pays off credit in full every month." Both Hanssons have repeatedly said publicly that her credit score is not only excellent but also higher than his.
In addition, David has a high profile in the tech and business worlds, with plenty of acquaintances and allies in the right places and more than 350,000 Twitter followers. He can make a stink that will both be seen and taken seriously. The Apple Card is a luxury good, and the Hanssons got such a strong response, in short, because they hold nearly every privilege in the book, and they are both keenly aware of it.
"This situation ... does not matter much to my livelihood," Jamie wrote in her statement, acknowledging, "This is not merely a story about sexism and the black boxes of credit algorithms, but about how rich people almost always get their way. Justice for another rich white woman is not justice at all."
Rather than being specifically about her, she wrote, it is a matter of principle: "We cannot bow to the algorithms. We cannot keep sliding into a Black Mirror world. Apple can and should be better than this. We should all be better than this."
"I hear the frustration of women and minorities who have been beating this drum loudly and publicly for years without this level of attention," she added. "I did not want to be the subject that sparked these fires, but I am glad they are burning."