Bruce S. Old: The tennis data pioneer who interrogated Nazi scientists

The Longwood Club in nearby Brookline hosted the US doubles championships and Old took notes on what he watched.
Speaking to the Concord Oral History Program in 1993, he said: “I made many charts. Where they hit the serve, where they returned the serve, how they tried to open up the court to make winners possible, their whole tactics. After about four years of this, I was able to write the draft of a book on the game of doubles.”
After his requests for collaboration on the book were rejected by many of the day’s top doubles players, Old approached Bill Talbert, an American nine-time Grand Slam winner who agreed to take a look at his manuscript.
The text included advice on tactics with supporting data that included original metrics Old devised such as ‘shot potency’, which worked by ranking a shot’s effectiveness in winning a point.
Taking a second serve as the baseline, with a potency of 1.0, Old’s data valued a first serve at 1.2, meaning it was 1.2 times as likely to win a point as a second serve. A shot at the net rated 2.6, and an overhead smash 3.9, making the smash the most potent shot of all.
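To make the ratios concrete, here is a minimal sketch of how such a ranking could be applied. The potency values are those reported above; the comparison function is a hypothetical illustration, not Old's actual method.

```python
# Old's reported 'shot potency' values, with a second serve as the 1.0 baseline.
POTENCY = {
    "second serve": 1.0,
    "first serve": 1.2,
    "net shot": 2.6,
    "overhead smash": 3.9,
}

def relative_potency(shot_a: str, shot_b: str) -> float:
    """How many times as likely shot_a is to win a point, relative to shot_b."""
    return POTENCY[shot_a] / POTENCY[shot_b]

print(relative_potency("overhead smash", "second serve"))  # 3.9
```

Because every value is expressed against the same baseline, any two shots can be compared directly by taking the ratio of their potencies.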
Old and Talbert worked successfully on the book and Old would go on to have four more published on tennis. They sold well, but ultimately the sport just wasn’t ready to fully embrace the role of data analysis.
At that time tennis was going through a period of change, adjusting to amateurs and professionals playing together for the first time in the Open era from 1968, amid political upheaval as various tours, circuits and governing bodies vied for supremacy.
Through the 1970s a few independent researchers looked at tennis, but these were mostly academic studies. On court, the Cyclops system electronically monitored the service line at Wimbledon from 1980, and more recently Hawk-Eye has tracked the ball and allowed for player challenges.
The last of Old’s books in 1983 was a compilation of his singles and doubles tactics publications and included a foreword penned by Arthur Ashe.
By then he had enjoyed a successful amateur career of his own, winning 38 titles across singles, men’s doubles and mixed doubles, before a hip operation in 1980 ended his playing days at the age of 67. He continued to watch Wimbledon on television, but his own involvement with the game had ended. He died in 2003, just before his 90th birthday.
So where does the use of tennis data stand today?
Perhaps the three best male players of this generation – or indeed of any generation – have different approaches.
Rafael Nadal claims to have no interest in data and says he does not use it in his preparations. Novak Djokovic takes the opposite approach and, since 2013, has employed a personal data analyst. Roger Federer is said to be wary of using data but, according to reports, pays a hefty premium to a third-party data supplier for exclusive access to certain information.
Craig O’Shannessy worked with Djokovic from 2017 and the Serb won four Grand Slam tournaments during his employment.
The Australian’s approach was a new one. Rather than focus on the two traditional strands of ‘winners’ and ‘unforced errors’, he wanted his player to make the opponent succumb to ‘forced errors’.
“Winning matches isn’t about playing perfect tennis or getting one more ball back,” he told Tennis.com in 2020. “It’s about putting your opponent in places where he’s more likely to miss.”
O’Shannessy also emphasised that long rallies should not be the focus. When he discovered that 70% of points were won in rallies of four shots or fewer, he knew that the player who came out on top in the majority of these short exchanges would win the match – and they could do so by forcing the opponent to make mistakes.
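The logic behind that 70% figure is straightforward to sketch. The rally lengths below are invented for illustration; the code simply computes the share of points decided in four shots or fewer, the short exchanges O'Shannessy argued decide matches.

```python
# Hypothetical sample: number of shots played in each point of a match.
rally_lengths = [1, 2, 3, 4, 4, 2, 1, 5, 9, 3, 2, 4, 6, 1, 2, 3, 4, 2, 1, 3]

# Count the points decided within four shots and express them as a share.
short = sum(1 for length in rally_lengths if length <= 4)
share = short / len(rally_lengths)
print(f"{share:.0%} of points ended within four shots")
```

A player who wins the majority of points in that dominant short-rally bucket wins the match, which is why O'Shannessy prioritised forcing early errors over outlasting opponents in long rallies.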
But as might be expected in an individual sport, where each competitor is out there on their own, players and coaches tend to be tight-lipped on the subject of data analysis.
A frustration for fans and anyone interested in digging deeper into the sport is that most data is kept secret. Its ownership is spread across tennis’ overlapping governing bodies – the ATP Tour, the WTA, the ITF, the Grand Slam Board and the hundreds of organising committees for the various tournaments around the globe.
What is publicly known is limited, but that hasn’t stopped amateurs and enthusiasts doing some heavy digging.
Jeff Sackmann set up the Tennis Abstract website, which provides a wealth of tennis data. To date, more than 10,000 matches and six million shots have been logged.
Presented as the ‘Match Charting Project’, the site collates data from volunteers and provides a fascinating resource for public viewing.
While lacking the ball and player movement data that might be afforded by access to the Hawk-Eye system, it has progressed to include a number of metrics that go beyond rally lengths or service percentages.
‘Serve Impact’ ranks a serve’s influence on winning a point even if the receiver manages to return the ball, while ‘Return Depth Index’ shows the effect of where balls are returned to.
Drop shots, slices, forehand and backhand winners, passing shots, winners down the line and various aspects of play at the net are all catalogued too.
How many players and coaches use the ‘Match Charting Project’ to prepare for competition is unknown, but for those who cannot afford the fees charged by the big data companies it is likely an attractive option, and a growing number are searching for even a slight advantage.
Talbert certainly knew of data’s importance 70 years ago, as Old recalled about their first connection.
“I was in New York on business and I called Talbert, and his wife answered the phone,” he said.
“I gave her my name and said that I had left a notebook for her husband to read and I wondered what was happening. She said, ‘Thank God you called, he’s been sleeping with that damn notebook under his pillow every night because he’s so afraid he’ll lose it’.”
Rob Haywood is the author of the forthcoming book Many Impossible Things: The Ingenious Evolution of Football Data