Close-range shots and predicting future outcomes

Breaking news: Close range shots are more likely to go in. - Derek Leung

Shot differential is a common metric for assessing teams. Do we do better if we focus on the close-range shots?

A lot of today's hockey analysis relies on the importance of shot differential in predicting future success.

Team shooting and save percentages often run hot for a stretch, but that success is rarely sustainable over the long run, so shot differential is usually a better predictor of future results than goals or wins.

Recently, Michael Parkatti posted a really nice piece suggesting that in some cases, various measures that account for shot quality might do a slightly better job of predicting future results than simple shot differential.

My one hesitation was that he only looked at three specific dates in one season -- and the benefits of factoring in shot quality only showed up on one date. Given how little attention the community generally pays to shot quality, even that was newsworthy. Still, I wanted to repeat his work and bolster it a bit, to make sure we weren't reading too much into a quirk of a specific date in history.

Discrete tests of predictive power

When I see a new stat, I'm always looking for an assessment of its predictive value.

To do this, you look at the correlation between performance by that stat up through a certain point in time and wins or goal differential from that time on. If something is a good predictor, then it will have a high correlation to future results.
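As a concrete sketch, a split-sample test like that might look like the following (Python with pandas). The table layout and column names here -- team, game_number, shots_for, shots_against, points_earned -- are hypothetical, chosen just for illustration; they aren't the actual data structures used for this post.

```python
import pandas as pd

def predictive_r(games: pd.DataFrame, split_game: int,
                 for_col: str = "shots_for",
                 against_col: str = "shots_against") -> float:
    """Correlate each team's differential through `split_game`
    with the standings points it earned after that point.
    (All column names are assumptions for this sketch.)"""
    past = games[games["game_number"] <= split_game]
    future = games[games["game_number"] > split_game]

    # Differential accumulated up to the split point, per team
    past_diff = (past.groupby("team")[for_col].sum()
                 - past.groupby("team")[against_col].sum())

    # Standings points earned in the remaining games, per team
    future_points = future.groupby("team")["points_earned"].sum()

    # Pearson correlation between the two team-level series (aligned by team)
    return past_diff.corr(future_points)
```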

In this case, Michael did some of that analysis, and the result was very intriguing. He compared a variety of metrics, looking at the correlation between each metric and a team's point total in its remaining games.

He found that at a certain point in the season, a team's fraction of the shots from close range (within 25 feet) was a better predictor of its future outcomes than its overall shot differential (Corsi or Fenwick) was. He also found stronger correlations to future results from his Expected Goals metric, which weights each shot based on its location and type.

In other words, if you wanted to estimate on December 25, 2011 how many points each team would have in the remainder of the '11-12 season, you would have made better predictions by using Expected Goals or close-range shots than if you based them on Corsi or Fenwick.

However, December 25, 2011 was the only point in time for which he could make that claim. This analysis has the potential to make a big impact, and I wanted to make sure we were on solid footing. The trend from just the few dates he examined wasn't as clear as I would like:

[Figure: correlations with remaining-season points at the dates Michael examined]

From just this data, there's room for debate about both the value of the shot quality metrics and the value of focusing on situations where the score was close.

I found myself craving a more granular study, one that could show us in more detail how the value of shot quality evolves with sample size.

Daily tracking of predictions

Michael's Expected Goal numbers and the formula that produces them aren't published, so I couldn't work on that part of the analysis. But I could hope to repeat his assessment of the predictive value of shots from within 25 feet.

For starters, I used the NHL's play by play, which reports the distance of every shot on goal. I worked out the correlation between even strength shot differential (or ES close-range shot differential) and a team's future points -- not just for a single day, but following each game of the NHL season.
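In sketch form, the daily version just reruns that split-sample correlation at every game number of the season, once per shot definition. predictive_r is the hypothetical helper sketched above, and the es_games table and close_* column names are likewise assumptions for illustration only.

```python
def correlation_curve(games, for_col, against_col, n_games=82):
    """Correlation between the metric so far and future points, at each split point."""
    # Skip the first few games, where the 'past' sample is too tiny to mean much
    return [(g, predictive_r(games, g, for_col, against_col))
            for g in range(5, n_games)]

# Hypothetical usage -- one curve per shot definition:
# curve_all   = correlation_curve(es_games, "shots_for", "shots_against")
# curve_close = correlation_curve(es_games, "close_shots_for", "close_shots_against")
```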

So instead of just a table with correlations on a couple of days, I can produce a whole curve showing how the correlation changed over the course of the season:

[Figure: correlation with remaining-season points after each game of 2011-12 -- close-range shots (red) vs. all shots on goal (blue)]

In the plot above, the red curve shows the correlation when we focus on close-range shots, and the blue curve shows it when we include all shots on goal. Where the blue curve is higher (over the first ~250 games of the year), including all shots results in better predictions.

But we can also see that once the season is about 1/5 of the way done, focusing on just the close-range shots gives better predictions of future point totals than including all shots on goal. It's not just a single date; it held for the majority of the year.

This plot isn't exactly the same as Michael's data. Above, I used only shots on goal; his analysis also included shots that missed the net. [Edit: Originally I pulled missed shots from a different data stream. Post has been updated to use the NHL play by play for both types of shots.]
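For reference, the difference between the two samples comes down to which play-by-play events get counted. Here's a rough sketch, assuming the play-by-play has been parsed into a table with event and distance columns; the column names and event labels are assumptions for illustration, not a guarantee of the NHL feed's exact codes.

```python
SOG_EVENTS = {"SHOT", "GOAL"}                 # shots on goal only
UNBLOCKED_EVENTS = {"SHOT", "GOAL", "MISS"}   # also count shots that missed the net

def count_attempts(pbp, events, max_distance=None):
    """Count qualifying shot attempts, optionally restricted to close range."""
    mask = pbp["event"].isin(events)
    if max_distance is not None:
        mask &= pbp["distance"] <= max_distance
    return int(mask.sum())

# e.g. close-range attempts including misses:
# count_attempts(pbp, UNBLOCKED_EVENTS, max_distance=25)
```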

Here's the result when we also factor in shots that miss the net:

[Figure: 2011-12 correlation curves with missed shots included]

So far, so good. It seems the three dates he reported did represent a real trend. As one last step, I wanted to duplicate this analysis for a second year -- maybe 2011-12 was just a quirky season for whatever reason.

Here's the comparison for 2010-11, with just shots on goal in the left plot and all shots in the right:

[Figures: 2010-11 correlation curves -- shots on goal only (left), including missed shots (right)]

Uh-oh, that's backwards -- now the blue curve is on top all year long. In 2010-11, shots from under 25 feet were much less predictive than the full shot count -- the opposite of what we saw for 2011-12.

What about 2009-10?

[Figures: 2009-10 correlation curves -- shots on goal only (left), including missed shots (right)]

Again, shots under 25 feet look to be less predictive. Hmm. Maybe something changed recently; let's move a year ahead to the shortened 2012-13 season.

[Figures: 2012-13 correlation curves -- shots on goal only (left), including missed shots (right)]

All of the correlations are weaker because it was a shorter season -- by the time you had enough data to tell how good a team was, there wasn't much of a season left to predict.

Close-range shots did a bit better last year than in the older data, but I wouldn't have looked at this plot and concluded it was a better measure than simple shot differential.

So what does all that tell us?

I'm inclined to say that it doesn't look good for the shots-within-25-feet measure: across the four plots above, there are roughly 2600 points in time where the total shot count does better and only 900 or so where the close-range shot count does better.
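That tally is nothing fancy: at each split point across the four seasons' curves, just check which shot definition produced the higher correlation. A sketch, using the hypothetical curve format from earlier:

```python
def count_wins(curve_all, curve_close):
    """Tally the split points where each shot definition predicts better."""
    all_wins = sum(1 for (_, r_all), (_, r_close) in zip(curve_all, curve_close)
                   if r_all > r_close)
    close_wins = sum(1 for (_, r_all), (_, r_close) in zip(curve_all, curve_close)
                     if r_close > r_all)
    return all_wins, close_wins
```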

However, I wouldn't call anything settled. For one thing, we still don't know what we'd find if a refined, continuous shot quality function like Expected Goals were published so we could test it the way we tested using a sharp cutoff at 25 feet.

And maybe it's not a coincidence that shot quality metrics did better in the more recent years -- maybe as the shot differential craze has taken hold, some teams have changed strategy, while the teams that didn't change suddenly found it easier to amass players who drive shot location.

I see a lot of room for things to turn out differently going forward, and I'm excited to see where this goes. But looking at the complete body of data we have right now, I find myself unwilling to make predictions based on the close-range shot measures.

Postscript for statguys: Year-end correlations

There's one additional place where I struggled to get the same result as Michael. I think the issue is just a semantic error, but I can't be sure it isn't indicative of a deeper divide.

Michael's piece opens with a look at the correlation between teams' point totals in 2011-12 and their even strength (including 4-on-4, excluding empty net) performance by a variety of measures. Here's a sampling of what he reported (Expected Goals is his metric, which accounts for shot volume, location, and type):

Measure                  r^2
Corsi                    0.62
Fenwick                  0.65
Expected Goals           0.73
Goals                    0.79
Shots inside 25 feet     0.73

These numbers really surprised me -- all of those correlations looked really high, and I wanted to make sure I could repeat this result before I pressed on. I haven't seen anyone focus on this specific set of game states, but related analyses of even strength results have all suggested much weaker correlations.

Here's what we get for the relationship between 5-on-5 play (with empty net removed) and team point total using stats.hockeyanalysis.com:

[Figure: r^2 between 5-on-5 measures and team point totals, from stats.hockeyanalysis.com]

It looks like 2011-12 is right in line with previous studies (e.g. from JLikens, Hawerchuk, and PuckStopsHere), and quite a bit lower than what Michael reported. Granted, his data includes 4-on-4 and this plot doesn't, but adding in the 4-on-4 totals from Behind the Net only increases r^2 for goals to 0.64 -- still well short of Michael's 0.79.

It does stand out to me that his values are roughly the square root of mine. Since I can't think of any other explanation, I wrote this piece on the assumption that wherever he wrote r^2, he actually meant r -- the correlation coefficient, not the square of the correlation coefficient. Hopefully that's all it is; if that's not the case, then we have much bigger disagreements about the data.
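A quick sanity check of that hunch, using only the two goals numbers quoted in this post: if the reported 0.79 is really r rather than r^2, squaring it should land close to the 0.64 computed above.

```python
reported_value = 0.79   # Michael's reported "r^2" for goals
my_r_squared = 0.64     # r^2 for goals once 4-on-4 is added in

print(reported_value ** 2)   # 0.6241 -- close to 0.64, as expected if his value is r
```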
