At the recent Federal Aviation Administration (FAA) Drone Symposium (co-hosted by AUVSI), FAA Deputy Regional Administrator Deb Sanning discussed the impact of autonomy and AI, human/machine integration, and strategies for gaining public trust in autonomous systems such as drones. Sanning was joined on the panel by Brendan Groves of Skydio; Taylor Lochrane, Deputy Director for Science and Technology at the Department of Transportation (DOT); Lauren Haertlein of Zipline; and Margaret Nagle of Wing. What did the panel have to say about this issue? In the aviation sector, "[a]utomation is making a meaningful impact in worker safety." For example, more than 30 state DOTs use drones for bridge inspections, which cuts time and costs while reducing the likelihood of dangerous (and even deadly) outcomes. While most would agree that using an autonomous drone for these inspections makes sense, questions about the safe and responsible use of AI and robotics still linger. The panel suggested that responsible autonomous drone use rests on two principles: 1) industry has an obligation to mitigate potential misuse of the technology; and 2) governments should be the final arbiter of appropriate conduct.

For drone manufacturers, drone operators, and drone software developers using AI and machine learning, the core concepts behind these points are to educate, listen, and respond. When drone companies communicate with the people of the cities and towns in which they operate, they can cultivate acceptance, build connections, and alleviate potential privacy concerns.

To promote widespread use of autonomous drones and vehicles, drone companies must engage stakeholders at all levels: not only the FAA and civil aviation authorities, but also mayors and community boards. Automation and societal acceptance of drones are connected: automation allows for scale, and scale allows for widespread value among a community.



Kathryn Rattigan is a member of the Business Litigation Group and the Data Privacy+ Cybersecurity Team. She concentrates her practice on privacy and security compliance under both state and federal regulations and advising clients on website and mobile app privacy and security compliance. Kathryn helps clients review, revise and implement necessary policies and procedures under the Health Insurance Portability and Accountability Act (HIPAA). She also provides clients with the information needed to effectively and efficiently handle potential and confirmed data breaches while providing insight into federal regulations and requirements for notification and an assessment under state breach notification laws. Prior to joining the firm, Kathryn was an associate at Nixon Peabody. She earned her J.D., cum laude, from Roger Williams University School of Law and her B.A., magna cum laude, from Stonehill College. She is admitted to practice law in Massachusetts and Rhode Island. Read her full rc.com bio here.