ArtIAMAS Industry Projects

The ArtIAMAS-MIPS Program supports projects within the ArtIAMAS capability research areas. Projects are a three-way partnership between (1) private sector companies that wish to pursue collaborative research of interest to the ARL ArtIAMAS program, (2) ArtIAMAS-eligible faculty from the University of Maryland College Park (UMD) and the University of Maryland Baltimore County (UMBC), and (3) an ARL researcher.

Perception for Maneuvering Legged Robots in Dense Unstructured Terrains with Booz Allen Hamilton

This project will develop novel perception technologies for the navigation of legged robots in complex terrains. The resulting technology will be integrated with the ARL Autonomy Stack. Research activities include: Develop supervised and semi-supervised learning algorithms; Develop metrics to classify terrain features; and Integrate the perception and navigation algorithms with a Boston Dynamics Spot. Anticipated Outcomes: novel perception algorithms that use deep neural networks to enable a Boston Dynamics Spot to traverse challenging terrain robustly.

Autonomous Planning and Navigation for Multi-Robot Ground and Aerial Collaborative Coordination and Decision Making with Kick Robotics

This project will develop software that provides an infrastructure for multi-robot coordination and will be integrated with ARL’s autonomy and simulation stacks. The team will enhance ARL’s ongoing internal effort to extend the autonomy stacks to support cooperative multi-robot systems, along with the integration of ARL simulations for both UAVs and UGVs. Anticipated Outcomes: a suite of software packages and simulation environments that will reside in the existing ARL autonomy stack to support multi-robot operations.

Virtual Physical Co-Simulations and Real-Time Collaborative Decision Making using Ground and Aerial Robots with Stormfish Scientific Corporation

This project seeks to combine information from physical and virtual environments to guide and optimize the behaviors, interactions, and maneuvers of a set of autonomous agents for deployment, planning, sensing, and coverage control. It will do so by emulating scenarios that are not always feasible to stage in real environments. Anticipated Outcomes: a dynamic scene graph of the hierarchical relationships among the shape and geometry of objects, agents, and structures; a reduction in the computing and communication overhead of virtual-physical simulations and experiments; a strategy to meet the low-latency and high-fidelity requirements of intelligent decision-making applications in contested environments; and augmented reality experiments involving multiple agents in a mixed virtual-physical environment, including generation of synthetic and real data for mapping and navigation.

Detecting Landmines in Contested Environments using Unmanned Ground Vehicles with MachineSense

This project seeks to develop novel computer vision and signal processing techniques for unmanned ground systems to detect landmines and explosive devices placed under bridges and roads in battlefield environments. The team plans to use mmWave radar to detect different types of metallic and non-metallic objects on the ground under varied environmental conditions, and to develop novel signal processing techniques for detecting landmines and similar artifacts in natural environments. The ARL partner has developed standoff IED, landmine, and UXO detection sensor systems for the Army. Anticipated Outcomes: collaboration with ARL to advance the proposed system and algorithms toward more precise and robust landmine detection.

Onboard Image Processing for Applied Aerial Autonomy with ModalAI

This project will improve the capabilities of ModalAI drones to support onboard perception and computation relevant to the applied aerial autonomy activities under the ArtIAMAS cooperative agreement with ARL. Research activities include: Improve Support for Onboard Neural Networks; Improve Support for Image Processing; and Provide Hardware Integration Support. Anticipated Outcomes: improved autonomous capabilities for ARL using ModalAI’s ARL-compatible flight systems with MAVericks, ARL’s aerial autonomy stack.