Velon Procycling Road Code App
Usability Testing MVPv1 and Developing MVPv2

Background to Velon
Velon is a joint venture company created by 13 of the World Tour professional cycling teams. It exists to represent the needs of the teams and riders to the world governing body (UCI) and to the key race organisers who dominate the sport (for instance, Tour de France/Amaury Sports Org), and to create and commercialise products based on the aggregation of performance data.
Road Code App
In 2019, Velon launched a concept called Road Code, using data created directly from the bikes (non-biometric) within the race to showcase each rider's prowess relative to the other racers, based on their performances as climbers, sprinters, one-day and tour specialists. This provides a richer experience for the many 'ultra-fans' who usually have to rely on finish-line results and expert interpretation.
Velon had worked directly with their data management/front-end partner, EY, and had very limited access to a UI designer, producing an MVP that was pushed out on a limited basis in mid-2019.
The Brief
Whilst still skilling up in UX, I was invited through a networking connection to initiate, recruit for, conduct, analyse and debrief the research, and to produce a workflow for 2020. This started with usability testing on the live app, moderated both in person and remotely using Lookback.io, followed by establishing a panel of 'ultras' who agreed to offer their time and feedback on future iterations.
The Process
Discovery, Benchmarking and Open Listening
The Road Code App didn't have any obvious direct competitors, as the raw data was unique and proprietary.
However, within cycling media and fandom, PCS (Pro Cycling Stats) had built a reputation for accurate and rapidly updated information on the various tours, the teams, the individual riders and the races. It was really only comprehensible to the expert user, so as part of the competitive analysis I sat with some of these heavy users/experts as they gave me their personal tour of the site, allowing me to learn the terminology as well as some of the workarounds they used to reach the information most important to them.
Through 3 interviews, with a fan, a journalist and a gambler who used this site to place wagers, I was able to report back a range of insights that informed the next stage of the research.
Recruitment
I contacted 4-5 cycling clubs local to me, both directly and through two elite cycling shops that also acted as meeting points for Saturday and Sunday ride-outs. Whilst I was sceptical as to whether I'd get any kind of response, I received almost a dozen offers to participate with no incentive offered.
Velon also introduced me to pro-cycling team managers, elite riders and some freelance media (who agreed to participate without reporting on their involvement).
Interviews

I arranged to interview all participants at their homes, which meant evening and weekend interviews. I used my mobile usability testing 'lab' (see below) for all in-person sessions.
I conducted 8 interviews with 'ultra' fans and media, which I recorded with their permission, and used Lookback.io for the remote moderated sessions with the athlete and team experts.
Each session lasted around 30 minutes, with the first ten minutes spent on formalities, consent, an explanation of my role and the tech, plus 1-2 warm-up questions of no value to the study beyond relaxing the participant and ensuring they felt 'heard'.
The next ten minutes were recorded, during which each participant was asked to narrate a 'cold' opening of the App without any prompting or explanation. I held extended silences to allow them to focus their efforts.
They were then asked to perform two tasks discovering a specific rider's recent performance, narrating as they went and judging their own success.
The final ten minutes were an unrecorded open discussion of the App, ending with the request: 'I'd like you to feed back 2-3 things that you'd like to see in the next version.' No predictive questions were asked.
Debrief

My stakeholders, including the CEO of Velon, had (by their own admission) mixed capacity for receiving and acting on the research, so I produced a 2-minute edit of the recordings that captured the highs and lows of the interviews.
This had a dramatic effect and meant that the 20-minute presentation that followed was received positively and fully acknowledged by the team as giving clear direction.
Next Action
I can share my follow-up on a limited basis but do not have permission to 'publish' this information.