BEGIN:VCALENDAR VERSION:2.0 PRODID:-//128.220.13.64//NONSGML kigkonsult.se iCalcreator 2.20// CALSCALE:GREGORIAN METHOD:PUBLISH X-FROM-URL:https://www.cs.jhu.edu/~mdinitz/theory X-WR-TIMEZONE:America/New_York BEGIN:VTIMEZONE TZID:America/New_York X-LIC-LOCATION:America/New_York BEGIN:STANDARD DTSTART:20231105T020000 TZOFFSETFROM:-0400 TZOFFSETTO:-0500 TZNAME:EST END:STANDARD BEGIN:DAYLIGHT DTSTART:20240310T020000 TZOFFSETFROM:-0500 TZOFFSETTO:-0400 TZNAME:EDT END:DAYLIGHT END:VTIMEZONE BEGIN:VEVENT UID:ai1ec-353@www.cs.jhu.edu/~mdinitz/theory DTSTAMP:20240319T074948Z CATEGORIES: CONTACT: DESCRIPTION:Speaker: Audra McMillan\nAffiliation: Apple\nTitle: Hiding among the clones: a simple and nearly optimal analysis of privacy amplification by shuffling\nAbstract:\nDifferential privacy (DP) is a model of privacy-preserving machine learning that has garnered significant interest in recent years due to its rigorous privacy guarantees. An algorithm is differentially private if the output is stable under small changes in the input database. While DP has been adopted in a variety of applications\, most applications of DP in industry actually satisfy a stronger notion called local differential privacy. In local differential privacy\, data subjects perturb their data before it reaches the data analyst. While this requires less trust\, it comes at a substantial cost to accuracy. Recent work of Erlingsson\, Feldman\, Mironov\, Raghunathan\, Talwar\, and Thakurta [EFMRTT19] demonstrated that random shuffling amplifies differential privacy guarantees of locally randomized data. Such amplification implies substantially stronger privacy guarantees for systems in which data is contributed anonymously [BEMMRLRKTS17] and has led to significant interest in the shuffle model of privacy [CSUZZ19\, EFMRTT19]. In this talk\, we will discuss a new result on privacy amplification by shuffling\, which achieves the asymptotically optimal dependence in the local privacy parameter.
Our result is based on a new proof strategy which is simpler than previous approaches\, and extends to a slightly weaker notion known as approximate differential privacy with nearly the same guarantees. Based on joint work with Vitaly Feldman and Kunal Talwar (https://arxiv.org/abs/2012.12803) DTSTART;TZID=America/New_York:20210324T120000 DTEND;TZID=America/New_York:20210324T130000 LOCATION:https://wse.zoom.us/j/91450299380 SEQUENCE:0 SUMMARY:[Theory Seminar] Audra McMillan URL:https://www.cs.jhu.edu/~mdinitz/theory/event/theory-seminar-audra-mcmillan/ X-COST-TYPE:free X-ALT-DESC;FMTTYPE=text/html:\\n\\n
\\nSpeaker: Audra McMillan
\nAffiliation: Apple
Title: Hiding among the clones: a simple and nearly optimal analysis of privacy amplification by shuffling
\nAbstract:
\nDifferential privacy (DP) is a model of privacy-preserving machine learning that has garnered significant interest in recent years due to its rigorous privacy guarantees. An algorithm is differentially private if the output is stable under small changes in the input database. While DP has been adopted in a variety of applications\, most applications of DP in industry actually satisfy a stronger notion called local differential privacy. In local differential privacy\, data subjects perturb their data before it reaches the data analyst. While this requires less trust\, it comes at a substantial cost to accuracy. Recent work of Erlingsson\, Feldman\, Mironov\, Raghunathan\, Talwar\, and Thakurta [EFMRTT19] demonstrated that random shuffling amplifies differential privacy guarantees of locally randomized data. Such amplification implies substantially stronger privacy guarantees for systems in which data is contributed anonymously [BEMMRLRKTS17] and has led to significant interest in the shuffle model of privacy [CSUZZ19\, EFMRTT19]. In this talk\, we will discuss a new result on privacy amplification by shuffling\, which achieves the asymptotically optimal dependence in the local privacy parameter. Our result is based on a new proof strategy which is simpler than previous approaches\, and extends to a slightly weaker notion known as approximate differential privacy with nearly the same guarantees. Based on joint work with Vitaly Feldman and Kunal Talwar (https://arxiv.org/abs/2012.12803)
Speaker: Maryam Negahbani
\nAffiliation: Dartmouth University
Title: “Revisiting Priority k-Center: Fairness and Outliers”
\nAbstract:
\nClustering is a fundamental unsupervised learning and facility location problem extensively studied in the literature. I will talk about a clustering problem called “priority k-center” introduced by Plesnik (in Disc. Appl. Math. 1987). Given a metric space on n points X\, with distance function d\, an integer k\, and a radius r_v for each point v in X\, the goal is to choose k points S as “centers” to minimize the maximum distance of a point v to S divided by r_v. For uniform r_v’s this is precisely the “k-center” problem\, where the objective is to minimize the maximum distance of any point to S. In the priority version\, points with smaller r_v are prioritized to be closer to S. Recently\, a special case of this problem was studied in the context of “individually fair clustering” by Jung et al.\, FORC 2020. This notion of fairness forces S to open a center in every “densely populated area” by setting r_v to be v’s distance to its (n/k)-th closest neighbor.
In this talk\, I show how to approximate priority k-center with outliers: when\, for a given integer z\, you are allowed to throw away z points as outliers and minimize the objective over the remaining points. We show there is a 9-approximation\, which is morally a 5\, if you have a constant number of distinct radii or if your radii are powers of 2. This is via an LP-aware reduction to min-cost max-flow and is general enough that it can handle matroid constraints on facilities (where instead of asking to pick at most k facilities\, you are asked to pick facilities that are independent in a given matroid). Things become quite interesting for priority knapsack-center with outliers\, where opening each center costs something and you have a limited budget to spend on your solution S. In this case\, we do not know how to solve the corresponding flow problem\, so we alter our reduction to reduce to a simpler problem we do know how to solve\, taking a hit of +5 in the approximation ratio. There are still many open problems in this work: in addition to solving the flow problem in the knapsack case\, the best LP integrality gap we know for priority k-center with outliers is 3.
\n END:VEVENT BEGIN:VEVENT UID:ai1ec-346@www.cs.jhu.edu/~mdinitz/theory DTSTAMP:20240319T074948Z CATEGORIES: CONTACT: DESCRIPTION:Speaker: Leonidas Tsepenekas\nAffiliation: University of Maryland\nTitle: Approximating Two-Stage Stochastic Supplier Problems\nAbstract:\nThe main focus of this talk will be radius-based (supplier) clustering in the two-stage stochastic setting with recourse\, where the inherent stochasticity of the model comes in the form of a budget constraint. Our eventual goal is to provide results in the most general distributional setting\, where there is only black-box access to the underlying distribution. To that end\, we follow a two-step approach. First\, we develop algorithms for a restricted version of the problem\, in which all possible scenarios are explicitly provided\; second\, we employ a novel scenario-discarding variant of the standard Sample Average Approximation (SAA) method\, in which we also crucially exploit structural properties of the algorithms developed for the first step of the framework. In this way\, we manage to generalize the results of the latter to the black-box model. Finally\, we note that the scenario-discarding modification to the SAA method is necessary in order to optimize over the radius.\nPaper: https://arxiv.org/abs/2008.03325 DTSTART;TZID=America/New_York:20210407T120000 DTEND;TZID=America/New_York:20210407T130000 LOCATION:https://wse.zoom.us/j/91450299380 SEQUENCE:0 SUMMARY:[Theory Seminar] Leonidas Tsepenekas URL:https://www.cs.jhu.edu/~mdinitz/theory/event/theory-seminar-leonidas-tsepenekas/ X-COST-TYPE:free X-ALT-DESC;FMTTYPE=text/html:\\n\\n\\nSpeaker: Leonidas Tsepenekas
\nAffiliation: University of Maryland
Title: Approximating Two-Stage Stochastic Supplier Problems
\nAbstract:
\nThe main focus of this talk will be radius-based (supplier) clustering in the two-stage stochastic setting with recourse\, where the inherent stochasticity of the model comes in the form of a budget constraint. Our eventual goal is to provide results in the most general distributional setting\, where there is only black-box access to the underlying distribution. To that end\, we follow a two-step approach. First\, we develop algorithms for a restricted version of the problem\, in which all possible scenarios are explicitly provided\; second\, we employ a novel scenario-discarding variant of the standard Sample Average Approximation (SAA) method\, in which we also crucially exploit structural properties of the algorithms developed for the first step of the framework. In this way\, we manage to generalize the results of the latter to the black-box model. Finally\, we note that the scenario-discarding modification to the SAA method is necessary in order to optimize over the radius.
Paper: https://arxiv.org/abs/2008.03325
\n END:VEVENT BEGIN:VEVENT UID:ai1ec-358@www.cs.jhu.edu/~mdinitz/theory DTSTAMP:20240319T074948Z CATEGORIES: CONTACT: DESCRIPTION:Speaker: Samson Zhou\nAffiliation: Carnegie Mellon University\nTitle: Tight Bounds for Adversarially Robust Streams and Sliding Windows via Difference Estimators\nAbstract:\nWe introduce difference estimators for data stream computation\, which provide approximations to F(v)-F(u) for frequency vectors v\,u and a given function F. We show how to use such estimators to carefully trade error for memory in an iterative manner. The function F is generally non-linear\, and we give the first difference estimators for the frequency moments F_p for p between 0 and 2\, as well as for integers p>2. Using these\, we resolve a number of central open questions in the adversarially robust streaming and sliding window models.\nFor both models\, we obtain algorithms for norm estimation whose dependence on epsilon is 1/epsilon^2\, which shows\, up to logarithmic factors\, that there is no overhead over the standard insertion-only data stream model for these problems. DTSTART;TZID=America/New_York:20210414T120000 DTEND;TZID=America/New_York:20210414T130000 SEQUENCE:0 SUMMARY:[Theory Seminar] Samson Zhou URL:https://www.cs.jhu.edu/~mdinitz/theory/event/theory-seminar-samson-zhou-3/ X-COST-TYPE:free X-ALT-DESC;FMTTYPE=text/html:\\n\\n\\nSpeaker: Samson Zhou
\nAffiliation: Carnegie Mellon University
Title: Tight Bounds for Adversarially Robust Streams and Sliding Windows via Difference Estimators
\nAbstract:
\nWe introduce difference estimators for data stream computation\, which provide approximations to F(v)-F(u) for frequency vectors v\,u and a given function F. We show how to use such estimators to carefully trade error for memory in an iterative manner. The function F is generally non-linear\, and we give the first difference estimators for the frequency moments F_p for p between 0 and 2\, as well as for integers p>2. Using these\, we resolve a number of central open questions in the adversarially robust streaming and sliding window models.
For both models\, we obtain algorithms for norm estimation whose dependence on epsilon is 1/epsilon^2\, which shows\, up to logarithmic factors\, that there is no overhead over the standard insertion-only data stream model for these problems.
\n END:VEVENT BEGIN:VEVENT UID:ai1ec-376@www.cs.jhu.edu/~mdinitz/theory DTSTAMP:20240319T074948Z CATEGORIES: CONTACT: DESCRIPTION: DTSTART;TZID=America/New_York:20230906T120000 DTEND;TZID=America/New_York:20230906T130000 SEQUENCE:0 SUMMARY:[Theory Seminar] Welcome Back / Introductions URL:https://www.cs.jhu.edu/~mdinitz/theory/event/theory-seminar-welcome-back-introductions/ X-COST-TYPE:free END:VEVENT BEGIN:VEVENT UID:ai1ec-372@www.cs.jhu.edu/~mdinitz/theory DTSTAMP:20240319T074948Z CATEGORIES: CONTACT: DESCRIPTION:Speaker: Zeyu Guo\nAffiliation: Ohio State University\nTitle: TBD\nAbstract: TBD DTSTART;TZID=America/New_York:20231025T120000 DTEND;TZID=America/New_York:20231025T130000 SEQUENCE:0 SUMMARY:[Theory Seminar] Zeyu Guo URL:https://www.cs.jhu.edu/~mdinitz/theory/event/theory-seminar-zeyu-guo/ X-COST-TYPE:free X-ALT-DESC;FMTTYPE=text/html:\\n\\n\\nSpeaker: Zeyu Guo
\nAffiliation: Ohio State University
Title: TBD
\nAbstract: TBD
\n END:VEVENT END:VCALENDAR