Active Imperception and Robotic Privacy

Abstract: In late July last year it came to light that iRobot Corp. intended to sell the maps that modern Roomba vacuum-cleaning robots build to help them navigate, causing a public furore among consumers. This situation and several others (e.g., nuclear inspection, use of untrusted cloud computing infrastructure) suggest that we might be interested in limiting what information a robot might divulge. How should we think about robotic privacy? In this talk I'll describe a few recent ways we've been examining the foundations of robotic privacy. These include a privacy-preserving tracking problem, where we'll look at how one might design estimators that are constrained so as to ensure they never know too much, and planning problems that must be solved subject to stipulations on the information divulged during plan execution. The idea is to have robots that obtain knowledge strictly on a need-to-know basis. I'll address the question of how to formulate and solve planning and filtering problems subject to such informational constraints.
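One rough way to picture an estimator that "never knows too much" is a filter whose belief is coarsened by construction, so it can only ever resolve a target's location up to a pre-agreed partition. The toy Python sketch below is purely illustrative and is not the formulation from the talk; the 1-D corridor, the two-region partition, and all names are assumptions invented for this example.

```python
# Hypothetical toy sketch (not the talk's method): a "conservative" estimator
# that deliberately coarsens its knowledge, so it can report which *region* a
# target is in but can never pin down the exact cell. The corridor model and
# partition below are illustrative assumptions.

CORRIDOR_CELLS = 8                             # cells 0..7 along a corridor
REGIONS = {0: range(0, 4), 1: range(4, 8)}     # privacy partition: left half, right half


def region_of(cell: int) -> int:
    """Map an exact cell to its coarse region label."""
    return 0 if cell < 4 else 1


class RegionOnlyFilter:
    """Maintains a set-valued belief that is always a union of whole regions.

    Because the belief is coarsened immediately on update, the filter never
    distinguishes cells within a region; the informational constraint is
    enforced by construction rather than by trusting the estimator.
    """

    def __init__(self):
        # Initially the target could be anywhere in the corridor.
        self.belief = set(range(CORRIDOR_CELLS))

    def update(self, observed_cell: int) -> None:
        # A raw reading would reveal the exact cell; keep only its region and
        # expand the belief back to that full region.
        self.belief = set(REGIONS[region_of(observed_cell)])

    def report(self):
        # The most the filter can ever divulge is a single region label,
        # or None if its belief still spans more than one region.
        regions = {region_of(c) for c in self.belief}
        return regions.pop() if len(regions) == 1 else None


if __name__ == "__main__":
    f = RegionOnlyFilter()
    for reading in [1, 2, 6, 5]:               # simulated exact sensor readings
        f.update(reading)
        print(f"reading={reading}  belief={sorted(f.belief)}  report=region {f.report()}")
```

The design choice worth noting is that the constraint lives in the filter's state space itself: no sequence of updates can ever produce a belief finer than the partition, which is one way to make "not knowing too much" a guarantee rather than a policy.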

Bio: Dylan Shell is an associate professor of Computer Science and Engineering at Texas A&M University. His research aims to synthesize and analyze complex, intelligent behaviour in robotic systems. His work has been funded by the National Science Foundation, the Department of Energy, and DARPA. He has received an NSF CAREER award, the Montague Teaching award, the George Bekey Service award, and multiple best reviewer awards. For the 2018-19 academic year he is very fortunate to be the Mary Shepard B. Upson Visiting Professor in Engineering at Cornell University.