Some Apple Card customers say the credit card's issuer, Goldman Sachs, is giving women far lower credit limits, even if they share assets and accounts with their spouse. But it's impossible to know if the Apple Card -- or any other credit card -- discriminates against women, because creditworthiness algorithms are notoriously opaque.
"It's such a mystery we are seeing," said Sara Rathner, travel and credit cards expert at NerdWallet. "Because we don't know exactly what those algorithms are looking for, it can be hard to say if there might be some bias built into them."
The New York Department of Financial Services is looking into the allegations of gender discrimination against users of the Apple Card. The allegations blew up on Twitter Saturday after tech entrepreneur David Heinemeier Hansson wrote that Apple Card offered him 20 times the credit limit it offered his wife, even though they have shared assets and she has a higher credit score.
Many other users voiced similar experiences — including Apple co-founder Steve Wozniak, who said his credit limit was 10 times that of his wife, despite the fact that they share all assets and accounts.
It's not known what led the algorithms to make the assessments they did.
"It's very important to underline that in these two very high-profile cases, we just don't have enough information about the specific applications to be able to make any hard and fast judgments about it," said Matt Schulz, chief industry analyst at CompareCards.com.
It's common practice for credit card issuers to use algorithms when they're making lending decisions, but there's very little transparency, Schulz said.
"Oftentimes, there is a perfectly legitimate explanation for why there could be discrepancies," he said. "But anytime there's a lack of transparency in these things, it shouldn't be surprising that people would raise some issues with these decisions."
How credit decisions are made
Credit decisions are based on a variety of factors, which makes it difficult to know which individual factor has the biggest effect. But things like income and spending habits can offer some leads about how the algorithms work.
For example, women tend to make less money than men, and because income is a significant factor in determining credit limits, it wouldn't be surprising for women to end up with lower credit limits than men, Schulz said.
"The simplest potential explanation for some of the stuff that came out this weekend would be a disparity in income," he said. "If one spouse makes a lot of money and the other spouse doesn't make as much, it doesn't matter how you file your taxes, or what your net worth is. The income you put on the application drives a lot of your credit limit."
Spending habits can also have a big impact on credit score and credit limit, which is another reason why spouses could end up with significantly different credit limits.
"If you have a person who has a high income, and spends a lot of money on a credit card on a regular basis, they may get a higher credit limit than somebody with a high income and a high credit score who just doesn't use their card all that often," Schulz said. "So that could be the reason why you see disparities between, you know, two individuals who otherwise seemed like they would be likely to get similar credit."
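Schulz's point can be made concrete with a toy model. The formula, weights, and numbers below are invented purely for illustration and bear no relation to Goldman Sachs' actual underwriting; the sketch only shows how a rule built on individual income, card usage, and credit score can hand spouses with fully shared assets very different limits:

```python
# Hypothetical sketch, NOT any issuer's real model: a toy credit-limit
# formula driven only by individual inputs, per Schulz's explanation.

def toy_credit_limit(individual_income, avg_monthly_spend, credit_score):
    """Assign a limit from individual inputs only (all weights assumed)."""
    base = individual_income * 0.15          # income drives most of the limit
    usage_bonus = avg_monthly_spend * 2      # heavy card users get more room
    score_multiplier = credit_score / 700    # scores above 700 scale it up
    return round((base + usage_bonus) * score_multiplier, -2)

# Two spouses with shared assets but different individual inputs.
# Note the lower earner has the HIGHER credit score, as in the Hansson case:
spouse_a = toy_credit_limit(individual_income=200_000,
                            avg_monthly_spend=8_000, credit_score=740)
spouse_b = toy_credit_limit(individual_income=40_000,
                            avg_monthly_spend=500, credit_score=780)
```

Under these assumed weights, the higher-earning, higher-spending spouse ends up with a limit several times larger, even though neither input reflects shared household wealth and the other spouse's credit score is better.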
In a statement to CNN Business, Goldman Sachs said that Apple Card does not offer supplemental cards, so customers cannot share a credit line under the account of a family member or another person.
"With Apple Card, your account is individual to you; your credit line is yours and you establish your own direct credit history," a Goldman Sachs spokesperson said in the statement.
"As with any other individual credit card, your application is evaluated independently," the spokesperson said. "We look at an individual's income and an individual's creditworthiness, which includes factors like personal credit scores, how much debt you have, and how that debt has been managed."
Based on these factors, it is possible for two family members to receive significantly different credit decisions, the company said.
"In all cases, we have not and will not make decisions based on factors like gender," the statement said.
Goldman Sachs also said it frequently hears from its customers that they would like to share their Apple Card with other members of their families, and that the company is looking to enable this in the future.
Bias in artificial intelligence
The New York Department of Financial Services confirmed the ongoing investigation Tuesday, saying in a statement that New York law prohibits discrimination against protected classes of individuals, including disparate impact and disparate treatment. The department will investigate whether New York law was violated and ensure all consumers are treated equally regardless of sex, the statement said.
"Any algorithm that intentionally or not results in discriminatory treatment of women or any other protected class violates New York law," a DFS spokesperson said in the statement. "DFS is troubled to learn of potential discriminatory treatment in regards to credit limit decisions reportedly made by an algorithm of Apple Card, issued by Goldman Sachs."
Artificial intelligence has been shown to be biased in a number of contexts, and gender bias has surfaced across a variety of industries. NerdWallet's Rathner said discrimination has been a problem in the financial services industry for decades, not just at credit card companies but also at lenders and banks.
"These sorts of stories have been going on for a long time," Rathner said. "The idea is that it sorts out the bias because it's a machine. But these codes are still written by humans, and humans are biased naturally."
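Rathner's point, that a machine can inherit bias even when it never sees gender, can be sketched in a few lines. Everything below is invented for illustration: the scoring rule is gender-blind, yet because one of its inputs (income, echoing the wage-gap explanation above) skews by group in this made-up data, group outcomes skew too. Regulators call this disparate impact:

```python
# Hypothetical sketch of "proxy" bias. The scoring rule never reads the
# gender field, yet group averages can still diverge through a correlated
# input. All names, weights, and numbers are invented.

def score(applicant):
    """Score from gender-blind inputs only (assumed weights)."""
    return applicant["income"] * 0.5 + applicant["years_of_history"] * 1000

applicants = [
    {"name": "A", "gender": "M", "income": 120_000, "years_of_history": 10},
    {"name": "B", "gender": "F", "income": 120_000, "years_of_history": 10},
    {"name": "C", "gender": "M", "income": 90_000,  "years_of_history": 12},
    {"name": "D", "gender": "F", "income": 60_000,  "years_of_history": 12},
]

# Identical inputs get identical scores: the rule itself is gender-blind...
assert score(applicants[0]) == score(applicants[1])

# ...but when incomes skew by group, average outcomes skew too.
men = [score(a) for a in applicants if a["gender"] == "M"]
women = [score(a) for a in applicants if a["gender"] == "F"]
avg_gap = sum(men) / len(men) - sum(women) / len(women)
```

The rule passes the naive fairness check (identical applicants get identical scores) while still producing a group-level gap, which is why the DFS statement targets discriminatory results "intentionally or not."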