Real-time insights, real-time impact
#OrbitalAI Challenge
What’s the competition about?
Get ready to embark on an exciting challenge that explores the cutting-edge possibilities of in-orbit data processing and its potential to drive progress in Earth’s sustainability, business, industry, and science.
ESA’s next-generation Φsat-2 mission will deliver a platform for the in-flight uploading, deployment and updating of third-party AI models. This space mission is currently under development by a European Consortium led by Open Cosmos (UK), alongside CGI (Italy), Ubotica Technologies Ltd (Ireland), Ceiia (Portugal), KP Labs (Poland), Geo-K (Italy), Simera Innovate GmbH (Switzerland), and Deimos (Spain).
In parallel, Microsoft and Thales Alenia Space will demonstrate and validate in-orbit computing technologies and their potential aboard the International Space Station (ISS) with the IMAGIN-e mission (ISS Mounted Accessible Global Imaging Nod-e).
ESA’s vision for edge computing in space is to foster an ecosystem of Earth observation applications. This challenge is designed to align with that goal and is open to space, EO, and AI enthusiasts worldwide. It’s a once-in-a-lifetime opportunity to explore the potential of in-orbit data processing and contribute to the advancement of Earth’s sustainability.
Don’t miss out on this chance to be part of the challenge! Join today!
Pick your track
AI and Earth observation players, researchers, experts, and scientists from all around the world can participate in either of these two tracks:
- ESA Φsat-2 track: a 6U CubeSat carrying a multispectral camera in a 500 km sun-synchronous orbit.
- IMAGIN-e track (ISS Mounted Accessible Global Imaging Nod-e, in collaboration with Microsoft and Thales Alenia Space): featuring a hyperspectral camera on the ISS.
The journey
Registration is now open 🎉
The challenge is open worldwide to all EO and AI practitioners, from students and early-career professionals to researchers, engineers, and experts in the field.
Check the participant eligibility criteria to learn more about registration.
Phase I: The open competition
The first phase started in mid-February and ends in June. Participants will submit their best solution in the form of a Jupyter notebook, together with a PowerPoint presentation and a pitch video.
At the end of June, 10 teams per track will be selected for Phase II.
But that’s not all! More surprises are in store. Once you join and get started, you will be part of the “Swag race”, a competition to win space-themed, limited-edition branded goodies.
The race’s first three milestones are checkpoints along your journey to building a winning solution, each focusing on a different component of your project: team composition, idea concept, initial model description, and prototype.
Read more about the race here.
Phase II: Incubation & Production
The second phase starts in July and ends in October. During this phase, the selected teams will work on their solutions together with the Φsat-2 consortium and the Φ-lab team, or with Microsoft and Thales Alenia Space. The solutions will be tested and verified against the conditions of each mission’s payload.
At the end of the phase, the two best teams from each track will have the privilege of launching their solutions into space!
Resources and toolkits are at your disposal on the AI4EO platform throughout the competition.
Simulated data from the IMAGIN-e and Φsat-2 optical payloads is available as input for building your AI-powered applications.
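To give a flavour of how a Phase I notebook might consume such simulated imagery, here is a minimal Python sketch. The file name, band ordering, and the use of rasterio are assumptions made purely for illustration; the actual data format and access method are documented in the challenge toolkits on the AI4EO platform.

```python
# Minimal sketch: load a simulated multispectral scene and compute a simple
# vegetation index, as a stand-in for an onboard AI preprocessing step.
# NOTE: the file name and band indices below are hypothetical placeholders;
# consult the OrbitalAI toolkit for the real data specification.
import numpy as np
import rasterio

SCENE_PATH = "simulated_phisat2_scene.tif"  # hypothetical simulated product

with rasterio.open(SCENE_PATH) as src:
    red = src.read(3).astype("float32")  # assumed red band index
    nir = src.read(4).astype("float32")  # assumed near-infrared band index

# NDVI highlights vegetated areas; a lightweight computation of this kind
# could run at the edge before any data is downlinked.
ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)
print(f"Scene NDVI range: {ndvi.min():.2f} to {ndvi.max():.2f}")
```

The point of processing at the edge is exactly this kind of early data reduction: extracting an insight in orbit rather than downlinking full scenes.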
Excited to be part of a space mission? Join today and start your new adventure!
👉JOIN NOW
Watch the video!
Be part of a space mission
OrbitalAI makes you part of a space mission where you collaborate with leading experts in the field.
Explore edge computing in space
Change the paradigm of EO data flow and extract insights that will fuel ground-breaking solutions.
Address global challenges
You can contribute to solving critical issues and make a positive impact on the world by helping business, industry, and science.