Derek Jeter, Performance Management, And Yes . . . Data Management

As a lifelong Yankees fan (which makes me a pariah with many of my Red Sox Country-based Forrester coworkers in Cambridge, Mass.), I’ve been following with amusement the sports media frenzy around the New York Yankees' "not-so-public yet not-so-private" contract negotiations with their star shortstop, Derek Jeter. While I read these news snippets with the intent of escaping the exciting world of data management for just a brief moment, I couldn’t escape for long because both sides of the table bring up reams of data to defend their positions.  

According to media reports and analysis, the Yankees' ownership seems to be paying less attention to Jeter’s Hall of Fame-worthy career statistics, including a fantastic 2009 season, and his intrinsic value to the Yankees brand, and is instead focusing on Jeter’s arguably career-low 2010 on-field performance and his advancing age (36 is practically Medicare-eligible in baseball).

Jeter’s side of the negotiating table, on the other hand, points out that Jeter’s value to the Yankees is “immeasurable,” and that one off year shouldn’t be used to define his value to the team. They point out that Jeter, as team captain, is a major leader in the clubhouse and an excellent role model for younger players. He’s certainly among the most popular players the Yankees employ, and he influences boatloads of fans to attend games, watch the Yankees cable network, and generate significant licensing revenue. And of course they point out that Jeter is still an excellent player and that 2010 should be viewed as an anomaly, not the norm.

I’m not a baseball analyst (lucky for everyone), and I have no intention of joining the debate on whose point of view is correct or how much Jeter should earn, for how many years, etc. (That’s best discussed over a few beers, not on a blog, right?)

But I do think this situation offers a great example of a mission-critical business decision that must be made with the most effective use of trusted data and performance management metrics, so that the business remains competitive while managing its limited resources effectively. (Okay, the Yanks' resources are not as “limited” as the next team's, I know. Work with me here, I’m getting to my point!)

What’s interesting from a data management standpoint is that everyone has access to much of the same data (e.g., batting stats, fielding stats, post-season contribution, licensing revenue on “Derek Jeter” merchandise . . . not to mention career stats on practically every major league shortstop who has ever played the game), and unlike most enterprise data, everyone actually agrees that this data itself is accurate and trustworthy. What they can’t agree on is how to apply this trusted data to effectively evaluate past performance, predict future performance, and determine ROI contribution to the organization. Sound familiar?  

Large organizations invest heavily in human resource management apps meant to help measure employee performance, business performance solutions that measure financial performance, and business intelligence and predictive analytics solutions meant to provide insight into past trends and high-confidence models for anticipating future events and needs. But none of these investments will deliver on their promise if business process owners and strategic decision-makers can’t agree on what needs to be measured, how often, and how to leverage the results of that analysis to optimize their business and deliver the most value to the organization.

The cynic in me just views these back-and-forth negotiations as further proof that a truism told to me by a manager early in my career still rings true: “Statistics lie, and liars use statistics.” His (half-joking) point was that anyone can take raw data and make it support the story he wants to sell. But my subsequent professional experience has taught me that a well-governed set of data management, business intelligence, and performance management competencies, best practices, and tools can deliver significant value and insight to senior management, truly differentiating an organization from its competitors and producing positive business outcomes. The challenge is always going to be getting these efforts aligned.

Of course, data epiphanies aside, if I see Derek Jeter in another uniform next year I might have to start watching competitive basket-weaving to make up for the free time I’ll gain by dropping the Yanks like a bad habit. ;-)


A non-baseball parallel

Coming from the 'wrong' side of the Atlantic I'm not a baseball fan (I'm not even sure I understand the rules!), but I did get the message in your post.

Your post reminded me of a client a couple of years ago who had outsourced part of their operations. They shared some systems and data with the contractors, and each technical discipline held regular monthly performance meetings with its opposite number. Most of these were only mildly dysfunctional, but one in particular was memorable.

One contractor team provided the client team with its performance figures for the preceding four-week period in PDF format, so that no further analysis could be done on them. The client team produced its own version of the same performance stats; however, its four-week time frame differed from the contractor's by one week, and it used a slightly different calculation methodology. There was therefore no possible way for the two sets of performance measures to agree.
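As an illustrative sketch (all numbers and metric definitions here are invented, not from the actual engagement), here is how two teams scoring the same underlying daily data, over four-week windows offset by one week and with different formulas, are guaranteed to produce irreconcilable figures:

```python
# Hypothetical daily task-completion counts for 35 days (5 weeks).
daily_completed = [18, 22, 19, 25, 30, 28, 21, 24, 27, 26, 23, 29,
                   31, 20, 22, 25, 28, 30, 27, 24, 26, 23, 21, 29,
                   32, 28, 25, 27, 30, 26, 24, 22, 28, 31, 29]
PLANNED_PER_DAY = 30  # assumed daily plan, same for every day

# "Contractor" methodology: most recent 28 days, average tasks completed per day.
contractor_window = daily_completed[-28:]
contractor_metric = sum(contractor_window) / len(contractor_window)

# "Client" methodology: a 28-day window starting one week earlier,
# expressed as a percentage of plan.
client_window = daily_completed[-35:-7]
client_metric = 100 * sum(client_window) / (28 * PLANNED_PER_DAY)

# Same raw data, but different windows and different units/formulas,
# so the two reports can never be reconciled against each other.
print(f"Contractor reports: {contractor_metric:.1f} tasks/day")
print(f"Client reports: {client_metric:.1f}% of plan")
```

Even with identical, perfectly accurate source data, the one-week offset and the change of formula mean every meeting starts with two numbers that cannot match.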

The result was that the progress meetings were almost entirely devoted to arguing about the level of performance and whose calculation was correct. Precious little time was spent discussing what the performance data actually meant, identifying where improvement might be required, and agreeing on what the contractors needed to improve.

A year or so later the outsourcing arrangement ended acrimoniously, at large cost to both sides.

It is far better to agree on a single methodology and then discuss the meaning of the outputs, even if the measures are not perfect. This will at least support performance improvement activities.


Great practical example, thanks!

Hi Julian
Thanks for that really valuable example; it demonstrates the point I was trying to make extremely well. It's often not about the quality or availability of the data, but about agreeing on how to use it once you have it.
Thanks again!

Relying on the old measures is a mistake

Rob, great insights here.

I think another piece of the story is in the idea of the supposedly "unmeasurable." I always like to tell the story of the man looking for his dropped car keys under the lamp post, not because he's sure that's where they were dropped, but because "the light's better here." It illustrates that we rely on the metrics we have, whether or not they are the best ones. The Yankees can't measure those intangibles, so they focus on the BA, the OBP, or whatever. People lose their jobs, products aren't marketed or developed, and creative analytic ideas aren't followed up, all because we don't know how to measure something that would add insight. It's up to analysts to think out of the box and not be limited by the tools and data they already "know."


Hi Merv
Thanks for the comments. I agree completely; I thought the "immeasurable" comment from Jeter's camp was the best part of the story. If it can't be measured, you just have to trust me on it, right? :-)

decision analysis

Not sure it was a good idea for Jeter's agent to call a three-year, $45 million offer "insulting" . . . but nevertheless, it's an interesting situation where the measurable value to each party depends on the other (i.e., Jeter is worth less as a Twin).

Agree with you 100% on gaining agreement on how the data will be applied before the data is aggregated. Our clients often request market research to support a business decision, and we tell them to determine their go/no-go point before conducting the research; otherwise, individuals will just use the data to support their initial gut instincts.

Treasure trove of baseball statistics


I agree with you wholeheartedly that "how to apply" data is an important issue. One case I am familiar with is the confusion in identifying and differentiating strategic key performance indicators (KPIs) in scorecards from operational performance indicators (PIs) in dashboards.

Switching topics back to baseball, there is amazing use of baseball stats by sabermetricians. I have written a couple of articles about these if you can spare a few minutes. They are at:

I welcome dialog.


Gary Cokins, SAS

Great posts

Hi Gary
Thanks for your comments and the links to your posts. We're definitely on the same page, and I of course respect any data discussion that can morph into a baseball discussion - all goodness!
Thanks again