Here is his graph:
Right off the bat, something seems fishy. Anyone who has taken a stats course should notice that this graph has way too much variance to have a near-perfect R^2. With an R^2 of 0.98, the points should barely deviate from the red line.
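As a quick sanity check on that intuition (a back-of-the-envelope sketch, not anything from the original graph): under a simple linear fit, the residual spread is roughly sqrt(1 - R^2) of the total spread in the y-variable, so an R^2 of 0.98 leaves only about 14% of the spread off the line.

```python
import math

# Rough rule of thumb for a linear fit: the residual standard deviation
# is about sqrt(1 - R^2) times the standard deviation of y.
# With R^2 = 0.98, the points should hug the fit line almost perfectly.
r2 = 0.98
residual_fraction = math.sqrt(1 - r2)
print(round(residual_fraction, 3))  # -> 0.141
```

A scatter with visibly large vertical spread around the line simply isn't consistent with that number.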
So, I did what anyone should do when they see something contrary to what they believe: I looked at it myself.
I took each defender's season from 2010-14 for anyone who played at least half the season (n=771):
We get a real, but far more modest, coefficient of determination of 0.3.
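The check itself is simple enough to sketch. The snippet below is illustrative only: the data is synthetic (the real analysis used the 771 actual defender seasons), and the variable names are made up, but the mechanics of regressing Corsi% on zone-start deployment and reading off R^2 are the same.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 771  # defender seasons, 2010-14, at least half a season played

# Synthetic stand-in data: deployment explains only part of Corsi%,
# with the rest driven by talent and noise.
zone_start_pct = rng.normal(50, 5, n)
corsi_pct = 45 + 0.1 * zone_start_pct + rng.normal(0, 0.75, n)

# Ordinary least-squares fit and coefficient of determination.
slope, intercept = np.polyfit(zone_start_pct, corsi_pct, 1)
pred = slope * zone_start_pct + intercept
ss_res = np.sum((corsi_pct - pred) ** 2)
ss_tot = np.sum((corsi_pct - np.mean(corsi_pct)) ** 2)
r_squared = 1 - ss_res / ss_tot
print(round(r_squared, 2))
```

An R^2 in this range says deployment matters, but leaves most of the variance in Corsi% to be explained by something else.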
What we learn is threefold:
1) Corsi%, as has been common belief for years, is affected by deployment AND talent
2) Some people will do anything to prove their point
3) Always re-check other people's work
When speaking on the eye test vs. statistics, Jonathan Willis said something incredibly smart and powerful:

In specific instances where eyeballs and analytics disagree, an intelligent man asks why the discrepancy exists and investigates further.

These are words to live by. When something seems counter-intuitive or someone brings an alternate opinion, do not automatically change or dismiss their ideas. Do the work yourself. Re-check. Re-evaluate. Investigate further.
EDIT: Some extra stuff
Something many of us in the analytical community already know, but don't talk much about, was brought up by my friend Garik on Twitter: there is reverse causation between zone starts and Corsi. Before Tyler Dellow's blog went down, Dellow wrote about this in relation to Brooks Orpik.
A lot of poor-Corsi defensemen are bad due to poor puck movement, especially on breakouts, as shown by the Zone Exit project. These players get trapped in their own zone and more often end up with their goalie covering the puck. They also tend to cause a lot more icings as they chip the puck out. Both kinds of stoppage produce defensive zone faceoffs, so poor possession play drives tough zone starts as much as the other way around.