CoVar, a developer of AI and machine learning technologies for the Department of Defense, has been awarded a multi-year contract by the Defense Advanced Research Projects Agency (DARPA) under the Autonomy Standards and Ideals with Military Operational Values (ASIMOV) program. ASIMOV seeks to build a concrete framework for evaluating the ethical readiness of autonomous systems, an increasingly urgent need as these technologies are woven into both military and civilian operations.
The ASIMOV program is about more than technical capability; it addresses the need to evaluate the ethical dimensions of autonomy and AI. As these systems take on roles that demand complex decision-making, the program aims to construct a common framework that treats ethical autonomy alongside performance metrics. That shared foundation would give the Developmental Testing and Operational Testing communities the tools to quantitatively assess how autonomous systems handle the ethically difficult situations that arise in military operations. An embedded Ethical, Legal, and Societal Implications (ELSI) advisory group will provide guidance throughout the program.
At the center of this effort, CoVar will build an ethical testing infrastructure called GEARS (Gauging Ethical Autonomous Reliable Systems). GEARS aims to develop a “mathematics of ethics”: it uses knowledge graphs to represent ethical scenarios and commander intent in a form that both humans and machines can interpret. From these graphs, GEARS is intended to derive quantifiable ethical challenge ratings across varied scenarios, giving testers a measurable view of the moral dimensions of autonomous behavior.
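To make the knowledge-graph idea concrete, the sketch below shows one way an ethical test scenario and a commander’s intent could be encoded as graph nodes and relations, with a naive challenge rating computed from relation weights. This is a minimal illustration under assumed names and values (`ScenarioGraph`, the relation labels, and the weighting scheme are all hypothetical), not a description of how GEARS actually works.

```python
# Illustrative sketch only: a toy knowledge-graph encoding of an ethical test
# scenario and a naive "ethical challenge rating". Class names, relation
# labels, and weights are hypothetical, not drawn from GEARS.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Node:
    name: str
    kind: str  # e.g. "asset", "civilian", "objective", "constraint"

@dataclass
class ScenarioGraph:
    nodes: set = field(default_factory=set)
    edges: list = field(default_factory=list)  # (head, relation, tail) triples

    def add(self, head: Node, relation: str, tail: Node) -> None:
        self.nodes.update({head, tail})
        self.edges.append((head, relation, tail))

    def challenge_rating(self, weights: dict) -> float:
        # Sum the weight of each ethically salient relation; higher totals
        # indicate scenarios that place more ethical tension on the system.
        return sum(weights.get(rel, 0.0) for _, rel, _ in self.edges)

# Build a toy scenario: a tasked objective that risks collateral harm and
# conflicts with the commander's stated intent to minimize civilian exposure.
uav       = Node("autonomous_uav", "asset")
objective = Node("neutralize_target", "objective")
civilians = Node("nearby_civilians", "civilian")
intent    = Node("minimize_civilian_harm", "constraint")

g = ScenarioGraph()
g.add(uav, "tasked_with", objective)
g.add(objective, "risks_harm_to", civilians)
g.add(objective, "conflicts_with", intent)

# Hypothetical per-relation weights; a real framework would derive these from
# vetted ethical criteria rather than hand-picked constants.
weights = {"risks_harm_to": 0.6, "conflicts_with": 0.4}
print(f"ethical challenge rating: {g.challenge_rating(weights):.2f}")  # 1.00
```

The point of the sketch is only that a graph of entities and relations can be read by people and scored by machines; the actual criteria, scales, and aggregation used in the program would come from the ELSI-informed framework the program is meant to produce.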
To meet the demands of the ASIMOV program, CoVar has assembled a multidisciplinary team that includes ethicists, specialists in AI and machine learning trust, veterans of combatant command operations, and engineers. The team is collaborating with Duality AI, whose Falcon digital twin platform provides simulation and data generation for autonomous systems.
Dr. Pete Torrione, CoVar’s Chief Technology Officer, commented on the effort’s potential impact: “If this work is successful, it will represent the first quantitative ELSI-based evaluation framework suitable for testing the ethics of autonomous systems. This will empower the US Department of Defense to deploy AI/ML-capable autonomous systems with a clear understanding of not only the technical capabilities of the systems but also the ethics of their behaviors.”
Through its role in the ASIMOV program, CoVar aims to advance responsible AI and ML development within the Department of Defense, helping to establish quantitative ethical assessment and to shape how autonomous technologies are integrated within an ethical framework.