Talk 1: 13:30-14:30
Speaker: Dr. Manoranjan Mohanty (University of Auckland)
Title: e-PRNU: Encrypted Domain PRNU-Based Camera Attribution for Preserving Privacy
Abstract: Photo Response Non-Uniformity (PRNU) noise-based source camera attribution is a popular digital forensic method. In this method, a camera fingerprint computed from a set of known images taken by the camera is matched against the noise extracted from an anonymous questioned image to determine whether the camera took that image. Extending this image-centric PRNU-based method to video, however, is not trivial: a number of video-specific challenges, such as video stabilization, must be addressed. In addition, the PRNU-based method presents a privacy challenge. Using the camera fingerprint (or the extracted noise), an adversary can identify the owner of the camera by matching the fingerprint against the noise of an image (or against a fingerprint computed from a set of images) crawled from a social media account. In this talk, we will first discuss a preliminary PRNU-based approach for stabilized videos. Then, we will discuss an encrypted-domain PRNU-based camera attribution framework that addresses the privacy concern. This approach encrypts the camera fingerprint using the partially homomorphic Boneh-Goh-Nissim (BGN) encryption scheme so that fingerprint matching can be performed without knowing the content of the fingerprint. To prevent privacy leakage from the content of the images used in the fingerprint calculation, the fingerprint is computed in a trusted environment, such as ARM TrustZone.
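As a rough illustration of the plaintext fingerprint-matching step the abstract describes (not the speaker's actual pipeline, and without the encryption layer), the sketch below uses synthetic NumPy arrays in place of real noise residuals: the fingerprint is estimated by averaging residuals from known images, and a query residual is matched via normalized correlation. All names and parameters here are illustrative assumptions.

```python
import numpy as np

def estimate_fingerprint(noise_residuals):
    # Toy fingerprint estimate: average of noise residuals from known
    # images (a stand-in for the maximum-likelihood estimator used in
    # real PRNU forensics).
    return np.mean(noise_residuals, axis=0)

def correlation(fingerprint, noise):
    # Normalized correlation between the fingerprint and a query residual.
    f = fingerprint - fingerprint.mean()
    n = noise - noise.mean()
    return float(np.sum(f * n) / (np.linalg.norm(f) * np.linalg.norm(n)))

rng = np.random.default_rng(0)
true_prnu = rng.normal(0, 1, (64, 64))                  # hypothetical sensor pattern
known = [true_prnu + rng.normal(0, 3, (64, 64)) for _ in range(50)]
fp = estimate_fingerprint(known)

same_cam = true_prnu + rng.normal(0, 3, (64, 64))        # residual from the same camera
other_cam = rng.normal(0, 3, (64, 64))                   # residual from another camera
print(correlation(fp, same_cam) > correlation(fp, other_cam))  # True
```

In the encrypted-domain framework, this correlation would instead be evaluated over BGN ciphertexts, which support the one multiplication and many additions the inner product requires.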
Talk 2: 14:30-15:30
Speaker: Prof. Chia-Mu Yu (National Chung Hsing University)
Title: Local Differential Privacy: State-of-the-art Privacy Notion for Distributed Environment
Abstract: Differential privacy (DP), a mathematical privacy notion that quantifies privacy loss, has become the de facto standard for privacy guarantees. An inherent assumption of DP is a trusted data curator that can reliably sanitize and then release the data. However, this assumption does not hold for many data collection applications. Recently, a variant of DP, local differential privacy (LDP), has been proposed to eliminate the assumption of a trusted curator. LDP thus applies particularly to privacy-preserving data collection, where an untrusted data aggregator must infer statistical information about the population but is not allowed to learn any individual's information. To comply with privacy regulations, LDP has been implemented in products from Apple, Google, and Microsoft to collect user statistics. In this talk, I will introduce the relevant privacy notions and recent advances on these topics.
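To make the local model concrete, here is a minimal sketch of randomized response, the classic LDP mechanism underlying deployed systems such as Google's RAPPOR: each user perturbs their own bit before sending it, and the untrusted aggregator debiases the sum to recover the population statistic. The epsilon value and population numbers are illustrative assumptions, not figures from the talk.

```python
import math
import random

def randomized_response(bit, epsilon):
    # Report the true bit with probability e^eps / (e^eps + 1),
    # otherwise flip it. This satisfies eps-local differential privacy,
    # so the aggregator never sees any individual's true value.
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return bit if random.random() < p else 1 - bit

def estimate_mean(reports, epsilon):
    # Debias the aggregate: E[report] = mean*(2p-1) + (1-p),
    # so invert that affine map to get an unbiased population estimate.
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)

random.seed(0)
eps = 1.0
true_bits = [1] * 3000 + [0] * 7000   # hypothetical population: 30% have the trait
reports = [randomized_response(b, eps) for b in true_bits]
print(estimate_mean(reports, eps))    # close to 0.30
```

The aggregator learns only the (noisy) population fraction; each individual report is deniable, which is exactly the guarantee the central-DP model cannot give without a trusted curator.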
Date: July 3, 2018 (Tue) 13:30 - 15:30