The evaluation process at the European Rover Challenge begins with registration, when teams submit the first documentation for their rover and drone projects. Documents such as the Proposal, the Preliminary Report, and the Final Report are delivered to the jury board over the project lifecycle, forming a comprehensive, multi-stage evaluation that assesses both the development process and the final performance.
The evaluation is conducted through two primary pillars: Continuous Assessment through Documentation and On-Site Performance Evaluation at the ERC Finals.
Throughout the competition cycle, teams are required to submit a series of technical and planning documents that are scored by the jury. This process evaluates the team’s engineering methodology, project management, and readiness. Key deliverables include:
- Proposal: submitted at registration, outlining the initial project concept.
- Preliminary Report: details the team's design progress and technical solutions.
- A 10-minute video demonstrating the rover's and drone's capabilities and the team's readiness for the competition.
- Task-specific reports, such as the Science Planning Report and the Droning Sub-Task Report, submitted before the finals to assess the team's strategic preparation for specific tasks.
The culmination of the project is the on-site finals, where teams are evaluated on the practical performance of their systems and their ability to operate under competition conditions. This evaluation includes:
- Pre-task checks: before each task, rovers and drones must undergo mandatory checks for weight and RF-communication compliance, overseen by judges. Teams must also submit required forms, such as the Traverse Planning Form, before starting the Navigation Task.
- Task execution: teams execute a series of complex tasks on the Mars Yard and in the Drone Cage. Performance in each task (Science: Exploration, Sampling, Astro-Bio; Navigation: Traverse, Droning; Maintenance; Probing) is observed and scored by judges according to detailed criteria outlined in the rules. Judges monitor execution, ensure safety, and record results.
- Post-task reporting: certain tasks, such as Science Exploration and Science Astro-Bio, require teams to analyze data and submit a final report within a strict deadline (e.g., 2.5 hours) after their run, testing their ability to perform rapid scientific analysis.
- Final presentation: teams deliver a 20-minute presentation to a panel of judges, followed by a 20-minute Q&A and discussion session. This task evaluates their project management, technical design, problem-solving skills, and outreach activities.
This entire process is overseen by an independent jury board, including Main Judges for each task and a Head of Jury, who ensure that scoring is fair, consistent, and compliant with the competition rules.