The application ranks surfing conditions across beaches by analysing wave behaviour using live, low-resolution webcams.
Can you imagine having to sift through thousands of hours' worth of video footage to find 10 or 20 seconds' worth of valuable information?
It helps you pick the best spot to surf on the day, reporting on various metrics extracted from beach webcams in the UK, Ireland, France, Portugal, New Zealand, Australia and Bali.
The competition involved two robots per team of twelve developers in a round-robin tournament. We had to develop everything from scratch, including the vision system using OpenCV and the Arduino firmware in C++.
I delivered a live demo of Apache Beam (Dataflow), showing how to develop a pipeline to consume clickstream data using interactive notebooks.
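The demo's actual pipeline isn't reproduced here, but the kind of grouping a clickstream pipeline performs (Beam's session windows do this for you) can be sketched in plain Python. The event fields and the 30-minute session gap below are illustrative assumptions, not values from the demo.

```python
from datetime import datetime, timedelta

# A session ends when a user is idle for longer than SESSION_GAP
# (30 minutes here, purely as an example).
SESSION_GAP = timedelta(minutes=30)

def sessionize(events):
    """Group (user, ISO timestamp, page) clickstream events into
    per-user, gap-based sessions, like a session window would."""
    sessions = {}
    for user, ts, page in sorted(events, key=lambda e: (e[0], e[1])):
        t = datetime.fromisoformat(ts)
        user_sessions = sessions.setdefault(user, [])
        if user_sessions and t - user_sessions[-1]["end"] <= SESSION_GAP:
            user_sessions[-1]["end"] = t
            user_sessions[-1]["pages"].append(page)
        else:
            user_sessions.append({"start": t, "end": t, "pages": [page]})
    return sessions

events = [
    ("u1", "2024-05-01T10:00:00", "/home"),
    ("u1", "2024-05-01T10:10:00", "/results"),
    ("u1", "2024-05-01T12:00:00", "/home"),  # new session after a long gap
]
print(len(sessionize(events)["u1"]))  # -> 2
```

In Beam the same grouping would be expressed declaratively with `beam.WindowInto(window.Sessions(...))` rather than hand-rolled loops.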
In this article I'll walk you through the SQL statements that I used to preprocess GPS sport data for machine learning. The aim was to train a model to make predictions on future games.
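To give a flavour of this kind of preprocessing, here is a minimal sketch using Python's built-in sqlite3. The table layout and the derived speed feature are assumptions for illustration, not the article's actual schema or statements.

```python
import math
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE gps (player TEXT, ts REAL, x REAL, y REAL)")
con.executemany(
    "INSERT INTO gps VALUES (?, ?, ?, ?)",
    [("p1", 0.0, 0.0, 0.0), ("p1", 1.0, 3.0, 4.0), ("p1", 2.0, 3.0, 8.0)],
)

# Use a window function to pair each GPS fix with the previous one
# for the same player, ordered by time.
rows = con.execute(
    """
    SELECT player, ts,
           x - LAG(x) OVER w AS dx,
           y - LAG(y) OVER w AS dy,
           ts - LAG(ts) OVER w AS dt
    FROM gps
    WINDOW w AS (PARTITION BY player ORDER BY ts)
    """
).fetchall()

# Finish the speed calculation in Python to avoid depending on
# optional SQL math functions; skip the first fix (no previous row).
speeds = [math.hypot(dx, dy) / dt for _, _, dx, dy, dt in rows if dt]
print(speeds)  # -> [5.0, 4.0]
```

Features like these per-sample speeds are then easy to aggregate into the per-game inputs a model trains on.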
I ran a webinar showcasing how we built a Powered by Looker web application. Take some time out to see how we went about using Looker in a sports science and performance setting.
At Hacktheburgh in Edinburgh I built an algorithm that optimised booking accommodation for large groups in O(n) time using the Google Places API.
A presentation I gave on several predictive models I built to get the most out of live sport data. Some models performed better than others, but all gave predictive insights during live sporting events.
I spoke at an Apache Druid meetup about utilising the speed of Apache Druid on scout- and tracking-sourced sport data in real time. The presentation is available here: https://imply.io/resources/videos/.
A computer vision hackathon where I extracted 3D pose estimates from 2D UFC fight footage to derive new stats such as aggressiveness, pressing and use of the octagon.
I worked on a platform that could deliver audio generated directly from sports data at media speeds, so that it could be multiplexed with the live video stream. I used my own model for text generation and ChatGPT for audio, all from public data feeds sourced from betting sites.
In this article I'll take you through the transformations and latency optimisations I learnt whilst building a Spark Structured Streaming pipeline to deliver play-by-play data for the Ryder Cup.
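The core idea behind streaming transformations like these can be sketched without Spark: bucket timestamped play-by-play events into fixed event-time windows and aggregate per window. The event payloads and 60-second window size below are illustrative assumptions, not details from the pipeline.

```python
from collections import defaultdict

WINDOW_SECONDS = 60  # tumbling window size (illustrative)

def tumbling_counts(events):
    """Count events per fixed event-time window, keyed by window start.

    events: iterable of (epoch_seconds, payload) tuples.
    """
    counts = defaultdict(int)
    for ts, _payload in events:
        window_start = (ts // WINDOW_SECONDS) * WINDOW_SECONDS
        counts[window_start] += 1
    return dict(counts)

stream = [(10, "tee shot"), (45, "putt"), (70, "drive"), (130, "chip")]
print(tumbling_counts(stream))  # -> {0: 2, 60: 1, 120: 1}
```

In Spark Structured Streaming the same aggregation is expressed declaratively with `groupBy(window(col("ts"), "60 seconds"))`, with the engine handling incremental state and late data.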
Measuring player performance is hugely valuable in many areas of sports analytics: comparing a player's live performance against their yearly average gives a lot of insight into what will happen for the rest of that game or match.
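One simple way to express such a comparison is a z-score of the live value against the player's season distribution. The metric and the numbers below are made up for illustration; this is a minimal sketch, not the article's actual method.

```python
from statistics import mean, stdev

def live_vs_average(season_values, live_value):
    """Z-score of a live metric against the player's season distribution:
    how many standard deviations today's value sits from their average."""
    mu = mean(season_values)
    sigma = stdev(season_values)
    return (live_value - mu) / sigma

# Hypothetical per-game distances covered (km) for one player this season.
season = [9.8, 10.1, 10.4, 9.9, 10.2, 10.0]
z = live_vs_average(season, 11.0)
print(round(z, 2))  # a strongly above-average outing
```

A large positive z-score flags a player performing well above their yearly norm, which is the signal the rest of the analysis builds on.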
Many MLOps frameworks encourage architectures that won't offer the best performance and cost when it comes to deploying your model. In this article I describe three different ways you can deploy a model yourself that better suit how it's used.
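As one illustration of deploying without a heavyweight framework, a serialised model can be embedded directly in the serving process: load the artifact once at startup and score in-process, with no network hop to a separate model server. The stand-in model below is a placeholder, not one of the article's actual examples.

```python
import pickle
from io import BytesIO

# Stand-in "model": any object with a predict method pickles the same
# way a trained scikit-learn estimator would.
class ThresholdModel:
    def __init__(self, threshold):
        self.threshold = threshold

    def predict(self, values):
        return [1 if v >= self.threshold else 0 for v in values]

# "Train" and serialise once, offline (to bytes here; normally a file).
buffer = BytesIO()
pickle.dump(ThresholdModel(threshold=0.5), buffer)

# At serving time: load once at startup, keep in memory, score per request.
buffer.seek(0)
model = pickle.load(buffer)
print(model.predict([0.2, 0.7]))  # -> [0, 1]
```

This pattern trades horizontal scaling of a dedicated model service for zero serialisation and network overhead per request, which is often the right call for small models.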
In 2023 the team was tasked with making 100 live golf stats available via REST endpoints for the Open Championship.
Building on our existing data warehouse hosted in Apache Druid, we aimed to enhance its value by adding a data visualisation tool that could be offered to external clients.
In 2023, we were tasked with creating an engaging live commentary bot to enhance the watching experience.