Universal default is a now-banned practice in the United States financial services industry whereby a creditor would change the terms of a loan from the normal terms to the default terms (i.e., the terms and rates given to borrowers who have missed payments on a loan) upon being informed that the customer had defaulted with another, unrelated lender, even though the customer had not defaulted with the first lender.[1]
Beginning with the deregulation of financial services in the mid-1990s, credit card companies began to include universal default language in their cardholder agreements. By the mid-2000s, approximately half of all US credit card-issuing banks had universal default language, although most did not enforce it regularly or systematically.
By 2003, Congress had begun considering bills to curb universal default and other abusive credit card practices. The Bush administration followed suit, with the Office of the Comptroller of the Currency issuing a stern advisory letter to the credit card industry regarding practices including universal default. In 2007, Citibank became the first bank to voluntarily eliminate its universal default provision.
Most forms of the practice were outlawed in the United States by the Credit Card Accountability Responsibility and Disclosure (CARD) Act of 2009.[1]
Under the theory and practice of risk-based pricing, the interest rate of a loan should reflect the risk of the borrower, so that customers who always pay on time do not subsidize those who default (or, alternatively, so that loans can be offered to a broader range of customers with a broader range of credit histories).
Usually, when an interest rate is risk-based, the risk premium (the amount charged extra for the risk) is set at the time the account is opened. However, this does not account for the fact that a borrower's risk of default may change later (and may in fact decrease).
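As a rough illustration of how a risk premium fixed at account opening works, the sketch below prices a loan as a base index rate plus a premium keyed to the borrower's credit score. All rates, tiers, and thresholds here are hypothetical, chosen only to show the mechanics.

```python
# Hypothetical illustration of risk-based pricing: the APR is a base
# index rate plus a risk premium fixed at account opening. All rates,
# tiers, and thresholds below are invented for the example.

PRIME_RATE = 0.0825  # assumed base index rate

# (minimum credit score, risk premium) tiers, highest scores first
RISK_TIERS = [
    (740, 0.05),  # low perceived risk: small premium
    (670, 0.09),
    (580, 0.14),
    (0, 0.19),    # high perceived risk: large premium
]

def priced_apr(credit_score: int) -> float:
    """Return the APR fixed at account opening for a given credit score."""
    for min_score, premium in RISK_TIERS:
        if credit_score >= min_score:
            return PRIME_RATE + premium
    raise ValueError("credit_score must be non-negative")

# The premium is set once, at opening: a later change in the borrower's
# actual risk (better or worse) is not reflected in the rate.
print(f"APR at opening, score 760: {priced_apr(760):.2%}")
print(f"APR at opening, score 600: {priced_apr(600):.2%}")
```

Universal default, in effect, revisited this one-time pricing decision, but only in the direction of higher rates.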
Thus, while lenders have increased credit limits and lowered rates for borrowers in good standing, reflecting a decreased perception of risk, lenders also began to raise rates for borrowers whom they later found to have defaulted with other lenders.
This practice generally occurs only with credit cards, which are among the few forms of consumer credit whose interest rate can adjust based not simply on an interest rate index but on the perceived risk of the customer, whether that risk has risen or fallen.
Rather than applying a specific increase to the risk premium, credit card issuers often switch the account to what is known as the default rate. This rate is usually the highest rate charged by the card, averaging 27.8%. In addition, it is charged on a first in, last out (FILO) basis.
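To make the size of that jump concrete, the sketch below compares one month's finance charge on an existing balance at an ordinary purchase APR and at the 27.8% average default rate cited above; the balance and the ordinary APR are hypothetical.

```python
# Hypothetical comparison of one month's finance charge at an ordinary
# purchase APR versus the 27.8% average default rate cited in the text.
# The balance and the ordinary APR are invented for the example.

BALANCE = 5_000.00      # assumed outstanding balance
ORDINARY_APR = 0.1299   # assumed purchase APR before repricing
DEFAULT_APR = 0.278     # average default rate cited above

def monthly_finance_charge(balance: float, apr: float) -> float:
    """Approximate one month's interest as balance * APR / 12."""
    return balance * apr / 12

before = monthly_finance_charge(BALANCE, ORDINARY_APR)
after = monthly_finance_charge(BALANCE, DEFAULT_APR)
print(f"At the ordinary APR: ${before:,.2f} per month")  # roughly $54
print(f"At the default APR:  ${after:,.2f} per month")   # roughly $116
```

Because the default rate applies to the existing balance rather than only to new charges, the monthly cost more than doubles in this example.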
Normally the default rate is charged when a customer fails to make a payment on a particular lender's credit card, but with universal default, the lender will charge the rate if the customer defaults elsewhere.
The practice of universal default has been criticized for many reasons, chiefly because it raises rates on customers who have never missed a payment with the lender imposing the increase.
The Credit CARD Act of 2009 prohibited retroactively raising any annual percentage rate, fee, or finance charge for reasons unrelated to the cardholder's behavior on that account. One intention of the law was to shield customers who had stayed current on their accounts from arbitrary rate increases.
However, the law did not prohibit all forms of universal default. Credit card companies have instead begun canceling outright the accounts of customers who are delinquent or in default with other creditors, even if the customer remains in good standing with the card issuer.[2][3]