[Theory seminar] Jalaj Upadhyay

When:
November 30, 2016 @ 12:00 pm – 1:00 pm

Speaker: Jalaj Upadhyay

Affiliation: Penn State University

Title: Fast and Space-Optimal Differentially-Private Low-Rank Factorization in the General Turnstile Update Model

Abstract:

The problem of {\em low-rank factorization} of an m x n matrix A requires outputting a singular value decomposition: an m x k matrix U, an n x k matrix V, and a k x k diagonal matrix D such that U D V^T approximates the matrix A in the Frobenius norm. In this paper, we study releasing a differentially-private low-rank factorization of a matrix in the general turnstile update model. We give two differentially-private algorithms instantiated with respect to two levels of privacy. Both of our privacy levels are stronger than the privacy levels for this and related problems studied in previous works, namely those of Blocki {\it et al.} (FOCS 2012), Dwork {\it et al.} (STOC 2014), Hardt and Roth (STOC 2012, STOC 2013), and Hardt and Price (NIPS 2014). Our main contributions are as follows.
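To make the object being released concrete, here is a minimal non-private sketch of the rank-k factorization U D V^T described above, computed via a truncated SVD with numpy (the matrix sizes and random data are illustrative assumptions, not from the talk):

```python
import numpy as np

# Illustrative sizes and data (assumptions for this sketch only).
rng = np.random.default_rng(0)
m, n, k = 8, 6, 2
A = rng.standard_normal((m, n))

# Full SVD, then keep the top-k singular triples:
# U is m x k, V is n x k, D is k x k diagonal, and U D V^T
# is the best rank-k approximation of A in the Frobenius norm.
U_full, s, Vt_full = np.linalg.svd(A, full_matrices=False)
U = U_full[:, :k]      # m x k
D = np.diag(s[:k])     # k x k diagonal
V = Vt_full[:k, :].T   # n x k

A_k = U @ D @ V.T
err = np.linalg.norm(A - A_k, "fro")  # Frobenius approximation error
```

The private algorithms in the talk release such a factorization while the entries of A arrive as a stream of turnstile updates; this sketch only shows the target output, not the privacy mechanism.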

1. In our first level of privacy, we consider two matrices A and A' as neighboring if A - A' can be represented as an outer product of two unit vectors. Our private algorithm with respect to this privacy level incurs optimal additive error. We also prove a lower bound showing that the space required by this algorithm is optimal up to a logarithmic factor.
2. In our second level of privacy, we consider two matrices as neighboring if their difference has Frobenius norm at most 1. Our private algorithm with respect to this privacy level is computationally more efficient than our first algorithm and incurs optimal additive error.
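The two neighboring relations above can be checked numerically. In particular, an outer product u v^T of unit vectors always has Frobenius norm exactly ||u|| ||v|| = 1, so any pair of matrices that are neighbors under the first definition also satisfy the Frobenius-norm condition of the second. A small sketch (the vectors here are arbitrary illustrative data):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 5, 4

# First definition: A - A' = u v^T for unit vectors u and v.
u = rng.standard_normal(m)
u /= np.linalg.norm(u)  # normalize to a unit vector
v = rng.standard_normal(n)
v /= np.linalg.norm(v)
diff = np.outer(u, v)

# Its Frobenius norm is ||u|| * ||v|| = 1, which also meets the
# second definition's requirement ||A - A'||_F <= 1.
fro = np.linalg.norm(diff, "fro")
```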