Can an Apple Watch Detect a Hurling Strike?

· 997 words · 5 minute read

Part 1: Data Analysis.

About a year ago I started playing a bit of hurling. I’m still new to the sport and learning as I go, and the speed of play and the level of skill required are insane!

Some days I go outside to practise a few long strikes against the wall. On other days, I ask my son to join me so we can pass to each other.

After a while, I started wondering what kind of data I could get from hurling. My Apple Watch records the activity as running, but there is no hurling workout type. That made me think: maybe I can build an app that records my play and lets me extract some useful metrics from it. Making sense of all the IMU data would be a huge job, though. There are many different player activities during play, and it would be hard to tell what is what. The first simple activities we can probably parse out reliably are running and air striking; we can then build on top of that.

Running is actually very easy to spot on IMU graphs - the pattern is very clear - and it is something I will come back to later.

Fig 1. IMU data from Apple Watch - Running pattern.
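As a rough sketch of why the pattern is so clear: each running step shows up as a peak in the acceleration magnitude at a steady cadence. A minimal peak counter in Python - the 100 Hz sample rate, the 1.2 g threshold, and the synthetic data below are illustrative assumptions, not values from my recordings:

```python
import numpy as np

def step_cadence(acc, fs=100.0, min_mag=1.2):
    """Estimate running cadence (steps/s) from acceleration magnitude.

    acc: (N, 3) array of ax, ay, az in g; fs: sample rate in Hz.
    A step shows up as a magnitude peak above `min_mag` g.
    """
    mag = np.linalg.norm(acc, axis=1)
    # a sample counts as a peak if it exceeds both neighbours and the threshold
    peaks = (mag[1:-1] > mag[:-2]) & (mag[1:-1] > mag[2:]) & (mag[1:-1] > min_mag)
    n_steps = int(np.count_nonzero(peaks))
    duration = len(acc) / fs
    return n_steps / duration

# synthetic "running": a 3 Hz vertical bounce for 5 seconds at 100 Hz
t = np.arange(0, 5, 0.01)
acc = np.stack([0.1 * np.sin(2 * np.pi * 3 * t),
                0.1 * np.cos(2 * np.pi * 3 * t),
                1.0 + 0.8 * np.sin(2 * np.pi * 3 * t)], axis=1)
print(round(step_cadence(acc), 1))  # 3.0 steps/s
```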

I have a bit of experience with IMUs and did some prototyping about a decade ago, and I watched several videos on YouTube to refresh my knowledge. One thing in particular to know: a negative/positive acceleration spike is followed by a spike in the opposite direction as the motion decelerates. Keep this in mind while reading further.

Fig 2. IMU acceleration on graph.

The first thing to focus on is air striking. This activity potentially has a lot of interesting data points.

To collect data from the IMU, I created a very simple app for iOS and watchOS. It records acceleration, gyroscope, and gravity. For now, the data is just a CSV file; here is a sample:

timestamp,ax,ay,az,gx,gy,gz,grx,gry,grz
1773765939.470816,0.043493,0.177917,0.113206,-0.055432,0.104720,-0.119114,0.153025,-0.529662,-0.834291
...
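Parsing that format needs nothing beyond the standard library; a minimal reader sketch (the in-memory string below stands in for a real recording file):

```python
import csv
import io

def load_imu(fileobj):
    """Parse IMU CSV rows into a list of dicts of floats, keyed by column name."""
    reader = csv.DictReader(fileobj)
    return [{k: float(v) for k, v in row.items()} for row in reader]

sample = io.StringIO(
    "timestamp,ax,ay,az,gx,gy,gz,grx,gry,grz\n"
    "1773765939.470816,0.043493,0.177917,0.113206,"
    "-0.055432,0.104720,-0.119114,0.153025,-0.529662,-0.834291\n"
)
rows = load_imu(sample)
print(rows[0]["ax"])  # 0.043493
```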

I recorded some air strike data. The next step is to visualise it, get a feel for what it looks like, and try to identify patterns.

Fig 3. IMU Acceleration and Gyroscope data from Apple Watch - air strike.

Gravity also shows quite an interesting pattern. Looking at multiple recordings, I was able to get a sense of how it appears on the graph. One thing though: these are big strikes, meaning a large C-shaped swing. That was another decision I made early on: focus first on big C-shaped swings, and only from one side, from right to left. The idea is to narrow the scope as much as possible, get good at that, and then expand further.

Fig 4. IMU Gravity data from Apple Watch - air strike.
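One simple way to quantify a big C-shaped swing from the gravity columns is the total angle the gravity vector sweeps during the window - a large swing rotates the wrist far more than a small flick. This is a sketch of the idea, not the detection logic I actually use:

```python
import numpy as np

def gravity_sweep_deg(grav):
    """Total angle (degrees) swept by the gravity vector between samples.

    grav: (N, 3) array of grx, gry, grz (roughly unit vectors from the device).
    A big C-shaped swing should sweep a much larger angle than a small flick.
    """
    g = grav / np.linalg.norm(grav, axis=1, keepdims=True)
    dots = np.clip(np.sum(g[:-1] * g[1:], axis=1), -1.0, 1.0)
    return float(np.degrees(np.arccos(dots)).sum())

# synthetic example: gravity rotating 90 degrees in the x-z plane over 10 samples
theta = np.linspace(0, np.pi / 2, 10)
grav = np.stack([np.sin(theta), np.zeros_like(theta), -np.cos(theta)], axis=1)
print(round(gravity_sweep_deg(grav)))  # 90
```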

To visualise the data shown in Figs. 1, 2, and 3, I created a tool called imu-analysis-tool.

Looking at graphs is great, but visualising the motion in 3D seemed even more useful, so I built a small viewer with three.js. AI tools helped with research and some of the coding. Having a background in graphics, maths, and rendering definitely helped me make sense of the approach, experiment further, and review the code.

Video 1. IMU acceleration and gyroscope data translated into motion.

Here is a breakdown of the trajectory computation for the three.js viewer. There is a lot of code, and most of it exists to dampen drift from the IMU sensor; the rest integrates the acceleration into a trajectory.
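The real code lives in the three.js viewer, but the core idea fits in a few lines of Python: integrate acceleration into velocity and velocity into position, leaking a little velocity each step so integration errors do not grow without bound. The damping factor below is an illustrative stand-in for the viewer's actual drift handling, not its real value:

```python
import numpy as np

def integrate_trajectory(acc, dt=0.01, damping=0.98):
    """Dead-reckon a position track from acceleration alone.

    acc: (N, 3) linear acceleration in m/s^2 (gravity already removed).
    `damping` bleeds off a little velocity every step so IMU drift
    does not accumulate into a runaway trajectory.
    """
    vel = np.zeros(3)
    pos = np.zeros(3)
    track = []
    for a in acc:
        vel = vel * damping + a * dt   # integrate acceleration, bleed off drift
        pos = pos + vel * dt           # integrate velocity into position
        track.append(pos.copy())
    return np.array(track)

acc = np.zeros((100, 3))
acc[:10, 0] = 5.0                      # brief push along x, then nothing
track = integrate_trajectory(acc)
print(track[-1, 0] > track[9, 0])      # True: the point coasts on after the push
```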

Visualising in 3D helped me get a better sense of the acceleration, but I still was not sure at which moment the hurley was touching the ball, so I added audio recording to my watchOS app. Next, I improved playback in the IMU Analyzer tool and synced the audio with the playback of data from the CSV.

Interestingly, there is some lag that makes playback behave differently in Safari and Chrome. Maybe the code needs more optimisation, or it is audio buffering; I am not sure yet.

After reviewing about five recordings, I was able to get a better understanding of the point at which the impact happens.

Fig 5. Breakdown of what is what on the graph.

Still, I was not 100% sure that the impact happened at that exact point on the graph. A video recording confirmed that it does indeed happen around that point, and the jagged XYZ signal on the gyroscope graph is another confirmation.

Looking at the magnitude transform, where green is acceleration and purple is gyroscope data, makes the impact point easier to spot.

Peak jerk - the largest sample-to-sample change in acceleration, divided by the sample interval. Higher jerk usually means a sharper, more abrupt impact.

For this particular strike, peak jerk is 1283.4605 g/s.
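The computation behind that number follows directly from the definition above. A sketch, assuming acceleration in g at a known sample rate (the 100 Hz default and the synthetic data are assumptions, not my recording settings):

```python
import numpy as np

def peak_jerk(acc, fs=100.0):
    """Peak jerk in g/s: the largest sample-to-sample change in the
    acceleration magnitude, divided by the sample interval.
    acc is an (N, 3) array of ax, ay, az in g; fs is the sample rate in Hz.
    """
    mag = np.linalg.norm(acc, axis=1)
    return float(np.max(np.abs(np.diff(mag))) * fs)

t = np.arange(0, 1, 0.01)
smooth = np.stack([t, 0.5 * t, np.zeros_like(t)], axis=1)  # a gentle ramp
impact = smooth.copy()
impact[50] += 4.0                      # one-sample spike, like a ball strike
print(peak_jerk(impact) > peak_jerk(smooth))  # True
```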

Fig 6. Successful air strike - magnitude view highlighting a jagged impact signal.

It took many software iterations and many takes to get the best possible sync between the iPhone video and the IMU sensor data from the Apple Watch.
At first, I tried to record only successful hits, but I quickly realised that unsuccessful swings (when I miss the ball) are valuable samples too: they do not show the same jagged pattern on the graphs, which helps to parse out swings without impact.

Fig 7. Unsuccessful air strike.

The magnitude is smooth here as well, and it can be used to isolate just the hurley swing.
For this unsuccessful air strike, peak jerk is 199.1469 g/s.

Fig 8. Unsuccessful air strike - magnitude graph.
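With peak-jerk values this far apart, even a single threshold already separates the two example swings. The 600 g/s cutoff below is a made-up midpoint between the two strikes in this post, not a fitted value - a real cutoff would come from many more recordings:

```python
def classify_swing(peak_jerk_g_s, threshold=600.0):
    """Label a swing as a hit or a miss from its peak jerk (in g/s).

    The default threshold is a hypothetical midpoint between the two
    example strikes in this post (1283.46 g/s hit, 199.15 g/s miss).
    """
    return "hit" if peak_jerk_g_s >= threshold else "miss"

print(classify_swing(1283.4605))  # hit
print(classify_swing(199.1469))   # miss
```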

With this groundwork and a growing dataset, I am now on track to train a small neural network to recognise air strikes.

Stay tuned for part 2, where I will dive into model training and deploying it to my Apple Watch.

The short answer to the post title is yes. An early look at true positives and false positives already shows this is possible. Whether I can parse out specific air strike types is still unclear, but that is not the aim of this experiment.