Advancing Airfield Safety: Acubed Completes Data Collection
Acubed recently wrapped up a major data collection initiative at Dallas-Fort Worth International Airport (DFW), marking an important step in exploring how artificial intelligence (AI) and machine learning (ML) can enhance airport ground operations. This campaign was made possible through the partnership and support of the DFW Innovation Team, as part of our collaboration under our memorandum of understanding (MOU) with DFW. Using the Baron Flight Lab, we set out to capture critical video data to support advances in Vision-Based Learning (VBL) for Foreign Object Debris (FOD) detection, runway inspections and obstacle detection. Over three days and five taxi sessions across the north-south taxiways surrounding the terminal core, we captured 1.3 million images. As Acubed continues to collect large volumes of airport and runway data, efforts like this lay the foundation for smarter, safer and more efficient airfield operations powered by AI.
Strategic Data Collection for Enhanced Safety
The primary objective of this campaign was to capture detailed video data of aircraft and ground equipment interactions, particularly in congested zones prone to collisions. The effort prioritized capturing taxi data across as many taxiways as possible, with a strategic emphasis on areas where conflicts were most likely. Additional emphasis was placed on recording aircraft operating at lower speeds, especially during taxi operations near moving vehicles and ground service equipment, all while ensuring minimal disruption to regular airport activity.
Phased Approach to Data Acquisition
The operation was structured into three distinct phases to optimize data collection opportunities, as outlined below with corresponding examples (an illustrative sketch of these scenario categories follows the list):
- Phase I: Concentrated on congested gate areas and intersections
  - Aircraft-Ground Vehicle Interactions: Capture instances of aircraft taxiing from runways to gates, particularly at intersections with service vehicles such as fuel and catering trucks.
- Phase II: Targeted high-traffic zones
  - Aircraft Departures and Ground Vehicle Interactions: Capture aircraft taxiing to the runway while interacting with ground service vehicles, focusing on areas with narrow taxiways, limited maneuverability and increased potential for conflict.
- Phase III: Focused on operations near active runways to finalize data collection
  - Proximity to Active Runways: Observe aircraft taxiing near active runways and their interactions with ground equipment such as tugs and fuel trucks, with an emphasis on identifying potential FOD scenarios.
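As a purely illustrative aside, the scenario categories above could be encoded as a simple tagging scheme that travels with each session's frames for later annotation and model training. The minimal Python sketch below is a hypothetical example only; the phase names, scenario labels and `CaptureSession` structure are assumptions for illustration and do not reflect Acubed's actual data schema.

```python
from dataclasses import dataclass, field
from enum import Enum


class Phase(Enum):
    """Collection phases as described above (labels are illustrative)."""
    GATE_AREAS_AND_INTERSECTIONS = 1
    HIGH_TRAFFIC_ZONES = 2
    ACTIVE_RUNWAY_PROXIMITY = 3


class Scenario(Enum):
    """Target interaction scenarios used to tag captured frames."""
    AIRCRAFT_GROUND_VEHICLE_INTERACTION = "aircraft_ground_vehicle"
    DEPARTURE_TAXI_CONFLICT = "departure_taxi_conflict"
    ACTIVE_RUNWAY_FOD_CANDIDATE = "active_runway_fod_candidate"


@dataclass
class CaptureSession:
    """Metadata attached to one taxi session's video frames (hypothetical)."""
    session_id: str
    phase: Phase
    scenarios: list[Scenario] = field(default_factory=list)
    frame_count: int = 0


# Example: tagging one of the five DFW taxi sessions.
# frame_count is illustrative (roughly 1.3M images / 5 sessions on average).
session = CaptureSession(
    session_id="dfw-taxi-03",
    phase=Phase.HIGH_TRAFFIC_ZONES,
    scenarios=[Scenario.DEPARTURE_TAXI_CONFLICT],
    frame_count=260_000,
)
print(session.phase.name, [s.value for s in session.scenarios])
```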
Leveraging Flight Lab for Future Innovations
The data gathered in this campaign, and in future campaigns using Acubed’s new King Air Flight Lab, will significantly advance Acubed’s work on applying AI and ML technologies to improve airport ground operations. By analyzing the collected imagery, we will be able to refine Acubed’s detection systems for FOD, runway hazards and obstacles, which are critical to ensuring the safety of both aircraft and ground personnel. The data will also support further advances in automated systems and real-time decision-making capabilities for air traffic management, leading to more efficient operations across airports.
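To make the vision-based detection idea concrete, here is a minimal, generic sketch of running an off-the-shelf object detector over a single captured frame. It uses a COCO-pretrained model from torchvision purely as a stand-in; Acubed’s actual VBL models, training data and thresholds are not public, and the file path and confidence threshold below are hypothetical.

```python
import torch
import torchvision
from torchvision.io import read_image

# Generic, COCO-pretrained detector used purely as a stand-in for a
# purpose-built FOD/obstacle model; this is not Acubed's actual system.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

# Hypothetical path to one frame extracted from a taxi-session video.
frame = read_image("frames/dfw_taxi_session_03/000123.png").float() / 255.0

with torch.no_grad():
    detections = model([frame])[0]  # boxes, labels, scores for this frame

# Keep only confident detections; 0.8 is an arbitrary illustrative threshold.
keep = detections["scores"] > 0.8
for box, label, score in zip(detections["boxes"][keep],
                             detections["labels"][keep],
                             detections["scores"][keep]):
    print(f"label={label.item()} score={score.item():.2f} box={box.tolist()}")
```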
If you are interested in joining our team, check out our open roles here.