Aug 18, 2022
This content was originally published on BMNT's YouTube
Channel. You can find the original video here.
In this follow-up conversation to BMNT’s June panel "The Race
for Autonomy: Navigating a New Battlefield," A'ndre Gonawela talks
to Dr. David Broyles, Research Program Director at the Center for
Naval Analyses and co-host of "AI with AI," about the challenges
facing the Department of Defense when it comes to developing and
leveraging autonomous systems and capabilities. Dr. Broyles digs
into why he (like our prior panelists) believes the state of
autonomy today is ‘brittle’, and why the end goal for many is
‘general AI’ – the ability for artificial intelligence to behave
and adapt like human intelligence can. We discuss Dr. Broyles’
belief that an ‘AI Winter’ may be approaching, where momentum in
the development of systems is slowed or even halted. We then dig
into where the Department of Defense is on the racetrack,
dissecting the lingering confusion that underlies the differences
between unmanned systems and autonomous systems, and how we can
better equip DoD leaders in understanding how autonomous systems
can operate. Dr. Broyles highlights opportunities to build trust in
autonomous systems with the warfighter, in addition to addressing
the edge cases and ‘fat tails’ that can impede the success of autonomous systems.
You can read about our first panel here: https://www.bmnt.com/post/the-race-for-autonomy-is-here
Notes from Episode
- The general consensus is that the state of autonomy is brittle,
and still in its infancy when it comes to the DoD.
- The bigger debate in the AI community: the end state is general AI –
equivalent to human intelligence, adaptable to its environment, and
able to process things like a human can. What are the tools to get
there?
- Two camps that disagree with each other:
- Neural network/reward camp: employ larger neural networks, dump in
more data, apply more processing power, and use reward schemes.
- Symbolic logic camp: we need ways to encode information in
symbols that machines can manipulate at higher levels of abstraction.
- The community is still trying to figure out what is really needed
to make these systems work and to get rid of the bugs.
- AI Winter?
- There have been periods where momentum in AI development
stopped – the last one ended in the early 2000s, influenced by the
availability of graphical processing capabilities (large
computational power being dumped on the problem).
- Are we coming to the limits of the tools and capabilities we’ve
developed? The margins of incremental improvement are shrinking.
- AVs are a bellwether of progress – if progress isn’t delivered
in tangible ways, the market could lose interest, meaning less
investment.
- AI Summer?
- AlexNet winning the ImageNet image recognition competition in 2012
was the first real success of neural networks; it motivated the
community at large, with many developments from roughly 2014 through
2019. People were trying many different approaches.
- Where’s the DoD with developing/leveraging autonomous systems?
- It’s hard to pinpoint where they are on the racetrack.
- There is confusion between unmanned and autonomous systems – the
distinction can be communicated unclearly, and unmanned systems are
sometimes mistakenly described as autonomous when they aren’t.
- The first major step is for the DoD to employ more unmanned systems –
it’s been slow, but the CNO (Chief of Naval Operations) actually
incorporating uncrewed systems into force structure direction is a
significant step.
- Lots of little things are going on here and there, but nothing is
being coordinated in a big way. The CDAO (Chief Digital and
Artificial Intelligence Office, formerly the JAIC) is trying to play
a role here, but there are more ways in which it could step in.
- Ensuring trust for warfighters?
- You can either not have enough trust, or you can overtrust, and
the latter gets less attention – the example here is Tesla’s
Autopilot system being overtrusted and then getting involved in
accidents.
- Need to get autonomous systems into the hands of the
warfighters – biggest priority.
- Need to communicate the capabilities better to an operator,
need to ensure that the operator can have other cues and/or ways of
interacting with the system.
- Do our DoD leaders understand how autonomous systems can be
used/leveraged and how they work? Can we work to educate them?
- Area of high concern, and cyber discussions are indicative of
the difficulties that could be faced as senior leaders have taken
some time to embrace and understand the technologies.
- A very small number of senior leaders have a good idea of
what’s going on, and a larger number have staff who know what they’re
talking about, but there are issues with proposals promising to
develop tech that simply won’t happen.
- People in the approval chain may not understand that these things
are infeasible.
- Arming senior leaders with the key questions helps, but that’s a
band-aid – we need more people with a basic understanding of how
these technologies work. This does not necessarily mean hiring
computer scientists; the DoD can work internally to raise the floor
on the level of understanding – and these areas are slowly beginning
to come up to speed.
- Addressing edge cases?
- Fat tails – in the distribution of things you may run into, most
of the stuff will fall into a general bin, but there’ll be edge
cases far out in the tail.
- What happens if a plastic bag runs into a screen of an AV?
- Uber and others couldn’t just throw hundreds of millions of
hours of driving data at the problem to fix this.
- The proposed solution is general AI – we can’t throw all fat-tail
problems into the same bucket. Running simulations runs into the same
problem, and throwing more data at it won’t solve it. There really is
no good answer yet, and there’s been no good articulation of one.
- We’re trying to minimize the edge cases as best we can.
However, alternatives like smart roads and sensors can provide
added information to help prevent accidents or minimize disruptions.
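The "fat tails" point above can be sketched numerically: a heavy-tailed distribution produces extreme edge cases far more often than a light-tailed one, which is why more of the same driving data does not make rare events go away. A minimal sketch (the threshold and distribution parameters are illustrative assumptions, not from the episode):

```python
import random

random.seed(0)

N = 100_000
THRESHOLD = 10.0  # arbitrary "severe edge case" cutoff, chosen for illustration

# Light-tailed "scenario severity": Gaussian around a typical value.
light = [random.gauss(1.0, 1.0) for _ in range(N)]
# Heavy-tailed severity: Pareto with shape alpha=1.5 (a classic fat tail).
heavy = [random.paretovariate(1.5) for _ in range(N)]

# Count how many samples exceed the severity cutoff in each case.
light_edge = sum(x > THRESHOLD for x in light)
heavy_edge = sum(x > THRESHOLD for x in heavy)

print(f"light-tailed edge cases: {light_edge} / {N}")
print(f"heavy-tailed edge cases: {heavy_edge} / {N}")
```

Under the Gaussian, exceeding the cutoff is a nine-sigma event and essentially never happens; under the Pareto, roughly 3% of samples do. That gap is the fat-tail problem in miniature: the rare cases are not rare enough to ignore, but too varied to enumerate with more data alone.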
- Experimentation – what’s commercial doing that the DoD is not?
- Mechanics around how to do things are the primary thing that
can hinder experimentation.
- There’s a strange acquisition ecosystem that isn’t always
friendly to innovative ideas going through standard program office
channels.
- Policy Lagging Behind on Autonomous Systems?
- Some new technologies fall under clear regulation – and that is
fine as long as they don’t cause other problems – but because these
technologies are so wide-ranging, they can cause issues.
- You can forecast some of these things, but there’s always an
unexpected bit. Is there a general philosophy on how to handle
this? There’ll always be questions on privacy and safety.
- Is DoD adequately reaching out to small businesses?
- It is happening, but the biggest barrier (in his view) is DoD
contracting and being able to decipher postings, requirements,
forms, etc.
- Need to take a quantitative approach to assessing effectiveness.