
Threat of ‘postcode discrimination’ as credit scores skewed by where you live

Daniel Ziffer | ABC | 7 February 2022

https://www.abc.net.au/news/2022-02-07/threat-of-postcode-discrimination-in-credit-scores/100723574?fbclid=IwAR36VZplMBBVdFVzo7YDscYRWs92ab_vJNQ-7i9-SPHlscyAeKvbn7hkI9M

Where you live could determine if you can get a mortgage, with Australia’s biggest credit scoring company now applying postcode data when assessing applications.

Key points:

  • People’s credit scores are used by banks in determining if they should be approved for loans and credit cards
  • Market leader Equifax is using data about the credit risk of people in particular postcodes
  • The use of automated decision making (ADM) in finance is growing

Equifax holds information on almost 20 million Australians and the ‘credit scores’ it produces are used by banks in determining whether they will extend a home loan.

Equifax's now-confirmed use of postcode data is another barrier to social mobility – people improving their lives – because it makes it harder for people in poorer areas to access credit.

“It’s a bit like the rich get richer and the poor get poorer,” said Victoria Coster, founder of Credit Fix Solutions, a company that helps people get finance.

Victoria Coster, founder of Credit Fix Solutions, warns that residents of lower income postcodes may find it harder and more expensive to get loans. (ABC News: John Gunn)

“I think it is actually disgraceful that if I was from a poorer suburb, I could have a lower score and have to pay more for a home loan. It makes me sad.”

Now, or never

One of the elements now used in assessing if people are a good credit risk is how their neighbours behave when they have been lent money.

“Where you live now shouldn’t affect where you can buy a home in the future,” said Amy Pereira from consumer advocacy organisation CHOICE.

Amy Pereira is concerned people don’t understand the impact that ADM, or automated decision-making, plays in their lives. (ABC News: Daniel Irvine)

She is concerned the use of postcode data to determine if people should get credit could lead to discrimination.

“Using postcode data is problematic because where you live is often linked to other factors like your wealth, your income, your age, ethnicity, gender and many other qualities,” Ms Pereira said.

“We’re concerned that using postcode data to determine your creditworthiness really presents an invisible barrier.”

Equifax has only recently begun using “geodemographic data”, such as how people behave with credit in a particular area, in assessing credit scores for individuals.

In a statement, the company said the information is used “to form a small component of Equifax credit scores in limited circumstances, as it has been found based on statistical analysis to be a relevant factor in determining credit risk”.

Big data weighs decisions

It is part of a wider issue: the explosive growth in the use of automated decision-making, or ADM, where algorithms and artificial intelligence (AI) programs are used to make decisions instead of humans.

Tensions in that area occupy the mind of Lyria Bennett Moses, director of the UNSW Allens Hub for Technology, Law and Innovation.

Lyria Bennett Moses says there need to be definitions about what fairness is in AI systems before they are deployed.

“You have to ask: what systems are being used? How are they being used? How are they being tested? In what ways might systems be biased, and so forth?” she said.

This is not a theoretical discussion.

By using loyalty cards, online shopping and credit, consumers create reams of linked data about their spending and lifestyle.

That information is now going into credit decisions.

A credit score of over 700 is generally considered good, but mortgage brokers have told the ABC that algorithmic assessment of customers means a single transaction using a buy now, pay later (BNPL) service like Afterpay or Zip can see 50 to 100 points lopped off instantly. Australia’s corporate watchdog says one in five consumers using buy now, pay later are missing payments.

“There is the potential to use AI [artificial intelligence] systems in ways that are fair by criteria that we specify as fair,” Professor Moses said.

“We can actually specify what we mean by fair and what criteria systems need to satisfy before deployment.”
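
Her point can be made concrete. A minimal sketch follows, assuming one possible fairness criterion, demographic parity (roughly equal approval rates across groups), is the test a lender chooses to run before deployment. The criterion, the five per cent tolerance and the toy decisions are all illustrative assumptions, not drawn from Equifax or any real lender.

```python
# A minimal sketch of checking one candidate fairness criterion (demographic
# parity) before a credit model is deployed. The group labels, decisions and
# the 5% tolerance are illustrative assumptions, not any real lender's policy.

from collections import defaultdict

def approval_rates(decisions):
    """decisions: iterable of (group_label, approved: bool) pairs."""
    approved = defaultdict(int)
    total = defaultdict(int)
    for group, ok in decisions:
        total[group] += 1
        approved[group] += int(ok)
    return {group: approved[group] / total[group] for group in total}

def passes_demographic_parity(decisions, max_gap=0.05):
    """Return (passes, rates): fail if approval rates differ by more than max_gap."""
    rates = approval_rates(decisions)
    return max(rates.values()) - min(rates.values()) <= max_gap, rates

if __name__ == "__main__":
    # Toy decisions from a candidate scoring model, tagged with a demographic group.
    toy_decisions = (
        [("group_a", True)] * 80 + [("group_a", False)] * 20   # 80% approved
        + [("group_b", True)] * 60 + [("group_b", False)] * 40  # 60% approved
    )
    ok, rates = passes_demographic_parity(toy_decisions)
    print(rates)          # {'group_a': 0.8, 'group_b': 0.6}
    print("passes:", ok)  # False: a 20-percentage-point gap exceeds the tolerance
```

Demographic parity is only one of several competing definitions of fairness; the point of the sketch is simply that a criterion has to be written down before a system can be tested against it.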

But using postcode data adds another opaque layer – because the behaviour of your neighbours now affects decisions about you.

“One of the arguments against [using data from] postcodes is it will often correlate very highly with things like ethnicity or whether you’re Australian-born or an immigrant and so forth,” Professor Moses cautioned.

“So there’s a risk, particularly using something like postcodes, that you will ultimately be doing proxy discrimination for something that might be prohibited.”
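
That risk can also be illustrated with synthetic data. In the sketch below, the approval rule never sees the protected attribute at all; it only looks at a postcode-level historical default rate. Because the make-up of each postcode correlates with the attribute, approval rates still diverge by group. Every postcode, default rate and group share here is invented for the purpose of the illustration.

```python
# Illustration of proxy discrimination with synthetic data: the approval rule
# never sees the protected attribute, only a postcode-level default rate, yet
# approval rates still differ by group because postcode correlates with group.
# All postcodes, rates and group shares below are invented for this example.

import random

random.seed(0)

# Hypothetical postcode-level statistics: historical default rate and the share
# of residents belonging to "group_b" (standing in for any protected attribute).
POSTCODES = {
    "2000": {"default_rate": 0.02, "share_group_b": 0.10},
    "2166": {"default_rate": 0.09, "share_group_b": 0.60},
    "2770": {"default_rate": 0.12, "share_group_b": 0.70},
}

def approve(postcode):
    """Toy rule: decline anyone whose postcode's historical default rate exceeds 8%.
    The applicant's own group membership is never an input."""
    return POSTCODES[postcode]["default_rate"] <= 0.08

def simulate(n=10_000):
    approved = {"group_a": 0, "group_b": 0}
    total = {"group_a": 0, "group_b": 0}
    for _ in range(n):
        postcode = random.choice(list(POSTCODES))
        group = "group_b" if random.random() < POSTCODES[postcode]["share_group_b"] else "group_a"
        total[group] += 1
        approved[group] += int(approve(postcode))
    return {group: approved[group] / total[group] for group in total}

if __name__ == "__main__":
    # Approval rates diverge sharply by group even though group was never an input.
    print(simulate())
```

Dropping the protected attribute from the inputs is not enough on its own; a correlated postcode feature can carry much of the same information.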

US experience

The use of automated decision-making in finance has soared due to its lower cost. But it is not without its problems.

In 2019, the US financial regulator investigated Apple over what was described as its ‘sexist’ credit card, which used an algorithm that appeared to be biased against women.

Tech entrepreneur David Heinemeier Hansson complained that his Apple Card gave him a credit limit 20 times higher than his wife's. (Ironically, she had a better credit score.)

The issue was confirmed by Apple co-founder Steve Wozniak, who shares all bank accounts and assets with his wife, yet was given a card with a credit limit 10 times higher than hers.

In theory, automated decision-making should herald the end of discrimination in finance because it has the potential to remove human bias.

Across history, different ethnic and disadvantaged groups have struggled to gain access to finance.

But that assumes machines make better decisions and that is not necessarily true.

Not always smarter

Computers and algorithms rely on two things, according to Jeannie Paterson from the Centre for AI and Digital Ethics at the University of Melbourne.

Jeannie Paterson, from the Centre for AI and Digital Ethics at the University of Melbourne, is concerned about the explosion of ADM relating to financial decisions. (ABC News: Billy Draper)

There are the people who create them and the information that informs them.

“What we need to realise is that automatic decision-making doesn’t mean that there’s a superhuman machine, a machine there with superhuman powers that knows everything about us and can make an accurate prediction about every individual,” Professor Paterson said.

Humans use ‘rules of thumb’ and a matrix of factors to make decisions, such as deciding whether to lend money to people.

Automated decision-making is just a sophisticated statistical process that is only as good as the information that is fed into it.

Equifax is adding in data about people’s postcodes because it believes it’s a relevant factor in making a decision about credit scores.
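
To see how even a “small component” can matter at the margin, here is a toy scoring sketch in which individual behaviours carry almost all of the weight and a postcode adjustment shifts the result by a few tens of points. The starting score, weights and adjustments are invented and bear no relation to Equifax’s actual model; they merely echo the 20-to-30-point swings and the roughly 700 cut-off mentioned elsewhere in this article.

```python
# Toy illustration only: a score dominated by individual behaviours plus a small
# geodemographic (postcode) adjustment. Every weight and adjustment below is an
# invented assumption, not Equifax's formula. The point is that a "small
# component" can still move a marginal applicant across a lender's cut-off.

def toy_score(on_time_repayments, missed_payments, recent_enquiries,
              prior_defaults, postcode_adjustment=0):
    score = 600                       # arbitrary starting point
    score += 4 * on_time_repayments   # reward a history of paying on time
    score -= 25 * missed_payments     # penalise missed payments
    score -= 15 * recent_enquiries    # frequent credit applications lower the score
    score -= 100 * prior_defaults     # defaults weigh heavily
    score += postcode_adjustment      # small geodemographic component
    return max(0, min(1200, score))   # clamp to a typical score band

if __name__ == "__main__":
    # Identical individual behaviour, different postcode adjustment.
    behaviour = dict(on_time_repayments=35, missed_payments=0,
                     recent_enquiries=2, prior_defaults=0)
    print(toy_score(**behaviour, postcode_adjustment=+10))  # 720
    print(toy_score(**behaviour, postcode_adjustment=-20))  # 690
    # A 30-point difference, enough to land a marginal applicant on either side
    # of the roughly 700 threshold brokers describe later in the article.
```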

“There’s no transparency to the consumer, but there’s also potentially no transparency internally,” Professor Paterson noted.

“So it’s possible that the bank doesn’t know quite how the decisions are made either.

“If we automate a process that people don’t really understand, then – because we don’t know what’s happening – certain cohorts, certain groups of people could be much worse off.”

Not the only factor

Equifax insists using postcode data helps make better decisions.

In a statement, the company said it was an element in people’s credit scores, but far from the main one.

“Information such as an individual’s credit history and behaviours have a much more significant impact in determining an individual’s Equifax credit score,” it said.

“The biggest determining factor between two individuals with similar demographics would be their individual credit behaviours.”

Those “behaviours” include the types of credit they have applied for, the type of lenders they have used, how often they apply, how many open accounts they have and the limit of those accounts, whether they make monthly repayments on time and any previous defaults.

“It is important to note that Equifax Credit Scores only form part of the information a lender will use to assess a credit application,” the company’s statement continued.

“Each lender may apply their own lending criteria and policies, and in some cases their own scores, which is why some lenders may approve an application while others will not.”

Destiny by domicile

The explanation does not soothe Victoria Coster. For clients with marginal or problematic credit scores, a difference of 20 to 30 points because of where they live could be the difference between getting a loan and missing out.

“And what the brokers who refer to us tell us is that unless a consumer is sitting at (a credit score of) around 700 – which is pretty high – they can’t access the best interest rates or the more traditional lenders,” she said.

“And when it comes to things like personal loans, for example, you actually have no chance of getting finance across the line unless you have a very good credit score.

“These new AI systems that the credit reporting agencies have put in are unfair when it comes to demographic judgement.

“If you live in a poor area, you’re going to be penalised, versus if you live in a wealthy postcode area.”
