Posted: 6 June 2017, 7:45 p.m. EDT
Participants in the panel discussion, "The Verification and Validation of Intelligent Machines," June 6 at the 2017 AIAA AVIATION Forum in Denver.
Panelists: Moderator Mike Francis, chief of advanced programs and senior fellow, Autonomous and Intelligent Systems, United Technologies Research Center; Noah Flood, aviation and autonomy consultant, Delta Air Lines; Fritz Langford, chief engineer, Autonomous Aerial Cargo/Utility System, Aurora Flight Sciences; Paul Nielsen, director and CEO, Software Engineering Institute, Carnegie Mellon University; Alessandro Pinto, project leader, Embedded Intelligence, United Technologies Research Center; Scott Strimple, director of training and education, The Drone Flight School
Duane Hyland, AIAA Communications
As machines driven by artificial intelligence interact with humanity more frequently, it’s critical we find a way to trust them and get them to trust us, a panel of experts said June 6 during the Demand for Unmanned session “The Verification and Validation of Intelligent Machines” at the
2017 AIAA AVIATION Forum in Denver.
Paul Nielsen, director and CEO of Carnegie Mellon University’s
Software Engineering Institute, said that when dealing with autonomous machines, the industry has to keep in mind that things change quickly.
“These machines will not be the same next month as they are now, and they will not be the same a year from now as they are today,” he said, adding there’s a real need for figuring out how to test them properly.
Alessandro Pinto, project leader of Embedded Intelligence with the
United Technologies Research Center, said that testing properly is “especially important as they will be operating in highly complex environments.”
The panelists agreed humans will still have a role in autonomous systems because machines can’t feel their way through things and tend to be myopic.
“Human nature allows us to make decisions based on gut feelings,” explained Scott Strimple, director of training and education at
The Drone Flight School. “We might not be able to explain why we did what we did, but it was the right thing to do.”
Cybersecurity and coding errors are other challenges panelists said the industry would have to address in the rise of autonomous systems. Nielsen pointed out that there is about “one defect for every 1,000 lines of code, meaning that in 24 million lines of code, you have 24,000 potential errors.” This makes guarding against injection of malicious code particularly important, he said.
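Nielsen’s figure follows directly from the rule of thumb he cites: at one defect per 1,000 lines, a 24-million-line codebase carries roughly 24,000 latent defects. A minimal sketch of that back-of-the-envelope estimate, assuming the one-defect-per-KLOC rate as a constant:

```python
# Rough defect-density estimate, assuming the "one defect per
# 1,000 lines of code" rule of thumb cited by Nielsen.
DEFECTS_PER_KLOC = 1  # assumed rule-of-thumb rate, not a measured value

def estimated_defects(lines_of_code: int) -> int:
    """Estimate latent defects from total lines of code."""
    return lines_of_code * DEFECTS_PER_KLOC // 1_000

print(estimated_defects(24_000_000))  # -> 24000
```

Real defect rates vary widely with process maturity and domain; the sketch only illustrates the scale of the verification problem the panel described.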
According to the panelists, humanity will be able to co-exist with increasingly autonomous machines, and through the development of strict architectures and operating parameters, the relationship will be a good one.