Default Rate: It’s Not Just If a Loan Defaults, But When

I had a very interesting email exchange with RGF. (I emailed him after using his forum posts in articles last week).

Here is a portion of that email exchange on default rates over time and their effect on yield.

Default rate: It’s not just if a loan defaults, but when.

A lot of people, especially early in this game, foolishly took the interest rate, subtracted the annualized expected default rate, and thought this would give them their expected return. Some people got a little smarter, realized it’s a 3-year loan, multiplied the default rate by three, and subtracted that from the loan rate to predict expected return. This is not mathematically correct for several reasons. For one, if a loan has a 2% annual risk of default, the chance of it defaulting over a 3-year period is actually about 5.88% (1 − 0.98³). For another, it assumes a total loan loss, which only happens on a first-payment default, and it also does not consider the partial recovery on a sale of charged-off loans.
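To make the compounding point concrete, here is a quick check in Python (the 2% annual figure is just the example from the paragraph above):

```python
# A 2% annual default risk does not add up to 6% over three years;
# the loan has to survive each year in turn.
annual_default = 0.02
years = 3

naive = annual_default * years                   # 6.00%: the "multiply by three" shortcut
compounded = 1 - (1 - annual_default) ** years   # 5.88%: 1 - 0.98^3

print(f"naive: {naive:.2%}, compounded: {compounded:.2%}")
```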

Ok, the following is the best way I can think of to analyze this. I’m not convinced I’m doing this right; I’d be curious to see what a real stats person thinks.

If a person were equally likely to default at any point (this is almost certainly not true), you can calculate expected default loss by adding up the return of the 37 potential events times the probability of each event happening. The 37 events are default at any payment from 1 to 36, plus the event of full payoff (this is still a bit of a cheat, since it doesn’t consider partial payments, early payoff, time value of money, etc.).

So, let’s take a $1,000 loan at 12% with a 2% annual chance of default; it would have a payment of $33.21. There are 37 possible events. For default at payment 1 through 36, you take the probability of each event happening (.02/12, or .001666) times the amount you’ve received in payments up to that point (0 to 35 × $33.21). The expected value contribution of a first-payment default would be .001666 × $0 in payments made, or zero return. A 2nd-payment default contributes .001666 × $33.21 in payments made, or $.05535 to expected value. Using a spreadsheet and adding up the contribution of each event, the expected return on this investment would be $1158.70, while the 36 payments add up to $1195.56. So the 2% default risk means the expected return here is $36.86 lower than on a risk-free loan. If you figure you get, say, 18% of this $36.86 back in defaulted loan sales, then it’s a $30.22 loss, or around a 3.0% return reduction due to default risk, so your expected return is 9.0%. (There are other tiny considerations you could get bogged down in here, such as the opportunity cost of waiting months to get your 18% back from a loan sale, reinvestment delay on loans that pay but pay slower than the payment schedule, etc.)

I did the same calculation using a 20% loan rate and a 10% default rate (note the same 10% spread between loan rate and default rate); the expected return was $1131.52, so the same spread at higher risk appears to produce lower returns.
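Here is a small Python sketch of that spreadsheet calculation, following the simplified model in the email (a flat per-payment default probability, the lender keeping only the payments already made when a loan defaults, and an 18% recovery applied to the shortfall). The function and variable names are mine, and small rounding differences from the figures above are expected:

```python
def monthly_payment(principal, annual_rate, n_payments=36):
    """Standard amortizing payment for a fixed-rate installment loan."""
    r = annual_rate / 12
    return principal * r / (1 - (1 + r) ** -n_payments)

def expected_total_received(principal, annual_rate, annual_default, n_payments=36):
    """Expected sum of payments received under the email's simplified model:
    a flat per-payment default chance (annual rate / 12), where a default at
    payment k means only the k-1 payments already made are kept."""
    pmt = monthly_payment(principal, annual_rate, n_payments)
    p_default = annual_default / 12                # e.g. .02/12 = .001666 per payment
    expected = sum(p_default * (k - 1) * pmt       # the 36 "default at payment k" events
                   for k in range(1, n_payments + 1))
    expected += (1 - p_default * n_payments) * n_payments * pmt   # 37th event: full payoff
    return expected

# 12% loan, 2% annual default: roughly $1158.70 expected vs. $1195.56 risk-free.
ev = expected_total_received(1000, 0.12, 0.02)
risk_free = monthly_payment(1000, 0.12) * 36
shortfall = risk_free - ev                       # roughly $36.86
loss_after_recovery = shortfall * (1 - 0.18)     # roughly $30.22 with an 18% recovery
print(ev, risk_free, shortfall, loss_after_recovery)

# 20% loan, 10% annual default: roughly $1131.52 expected.
print(expected_total_received(1000, 0.20, 0.10))
```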

Note that even if loans were equally likely to default at any point, you’d still have your highest share of defaults at the first payment. This is because all loans “make it” to where the first payment is due. Not all loans make it to the next payment, so the pool is smaller, so you’d have the same percentage defaulting out of a smaller pool, and therefore a slightly smaller number of loans defaulting. At low default rates this is a very small effect: in the above example, .001666 (.1666%) default on the first payment, and (1 − .001666 × 35) × .001666, or .0015688 (.15688%), default on the last payment (about 6 percent less). Note that in my math above I assumed a steady risk of .001666 for default at every payment, which is technically wrong, but the effect on the return is not significant. It will matter more at higher risk rates.
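A quick check of that arithmetic (the exact survival factor is (1 − h)^35, which at this hazard rate is nearly identical to the 1 − 35h approximation used above):

```python
h = 0.02 / 12                     # per-payment default chance, .001666
first = h                         # every loan reaches the first payment
last_approx = (1 - 35 * h) * h    # the approximation above, about .00157
last_exact = (1 - h) ** 35 * h    # exact survival to the 36th payment, nearly the same
print(first, last_approx, last_exact, 1 - last_approx / first)   # last value: ~6% less
```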

However, and this is the big problem, people are not equally likely to default at any point. People are probably dramatically more likely to default early in the loan. There are many reasons for this, but the reasons don’t matter here; the pattern itself is very important. In the first example above (12% rate, 2% default), a full $35 of expected return (3.5%) comes from partial payments on defaulted loans. In the second example (20% rate, 10% default), $195 of the expected return came from partial payments on defaulted loans, against an expected gain of only $131! If default risk is heavily weighted towards the beginning of the loan, the 20% rate / 10% default loan may well have a near-zero or even negative expected return. For this example, if the default risk were 20% for the first year, 10% for the second year, and 0% for the third year, the expected return becomes negative.
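For illustration, here is one way to extend the sketch above to a per-payment default schedule. How the annual risk is spread within each year, plus recovery and survival adjustments, will move the result, so treat this as a demonstration of the mechanics rather than a reproduction of the exact conclusion above:

```python
def expected_total_received_varying(principal, annual_rate, p_default_by_payment):
    """Same event-by-event sum as before, but with a separate default
    probability for each payment number instead of one flat .001666."""
    n = len(p_default_by_payment)
    r = annual_rate / 12
    pmt = principal * r / (1 - (1 + r) ** -n)    # standard amortizing payment
    expected = sum(p_k * (k - 1) * pmt           # default at payment k: keep k-1 payments
                   for k, p_k in enumerate(p_default_by_payment, start=1))
    expected += (1 - sum(p_default_by_payment)) * n * pmt   # loan pays in full
    return expected

# One reading of the front-loaded scenario: the 20%/10%/0% yearly risk
# spread evenly across each year's 12 payments.
front_loaded = [0.20 / 12] * 12 + [0.10 / 12] * 12 + [0.0] * 12
print(expected_total_received_varying(1000, 0.20, front_loaded))
```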

What will be needed, in the long run, is not a single default rate, but the default rate at each payment by credit grade (you could throw more variables in here, such as loan size, autofund, etc.). When we have this (and we’ll need at least 3 years of data, obviously), you can calculate expected return as in the calculation above, replacing .001666 with the actual default rate by payment number.
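As a purely hypothetical sketch of what that step might look like once the data exists, here is one way to turn loan outcome records into a per-payment default rate; the record format and names are invented for illustration, and the resulting list could be fed straight into the per-payment calculator sketched earlier:

```python
def default_rate_by_payment(loans, n_payments=36):
    """Empirical default rate at each payment number.
    Each (hypothetical) record is (payments_made, defaulted): a defaulted loan
    with payments_made == 5 missed its 6th payment; a fully paid loan is (36, False)."""
    rates = []
    for k in range(1, n_payments + 1):
        # Loans that were still alive when payment k came due:
        at_risk = [(m, d) for (m, d) in loans if m >= k or (d and m == k - 1)]
        defaulted_here = [(m, d) for (m, d) in at_risk if d and m == k - 1]
        rates.append(len(defaulted_here) / len(at_risk) if at_risk else 0.0)
    return rates

# Tiny made-up sample, just to show the shape of the output.
sample = [(36, False)] * 90 + [(0, True), (1, True), (2, True), (5, True), (11, True),
          (14, True), (20, True), (22, True), (30, True), (33, True)]
print(default_rate_by_payment(sample)[:5])
```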
