pith. machine review for the scientific record.

arxiv: 2605.08358 · v1 · submitted 2026-05-08 · 💻 cs.DS

Recognition: no theorem link

Online Matrix Factorization, Online Private Query Release, and Online Discrepancy Minimization

Aleksandar Nikolov, Haohua Tang, Jonathan Ullman


Pith reviewed 2026-05-12 01:33 UTC · model grok-4.3

classification 💻 cs.DS
keywords: online matrix factorization · differential privacy · statistical queries · discrepancy minimization · online algorithms · gamma-two norm

The pith

Online matrix factorization can match offline error bounds up to logarithmic factors for non-adaptive queries.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper shows how to answer sequences of statistical queries arriving one by one with differential privacy, using error that stays nearly as small as the best offline matrix factorization methods. This works when the queries are fixed in advance rather than chosen based on previous answers. The key is an online factorization algorithm that, as each new row of the query matrix arrives, extends the right factor and produces a matching left factor row while staying competitive with the optimal offline factorization. The same technique also produces an online discrepancy minimization algorithm whose discrepancy stays within logarithmic factors of the gamma-two norm of the matrix. The algorithm continues to work even when the total number of rows is not known ahead of time.
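The offline starting point described above, the factorization (or matrix) mechanism, can be sketched in a few lines. This is a minimal illustration, not the paper's online construction: the function name is ours, the noise scale `sigma` is left as a free parameter, and the calibration of `sigma` to the column norms of R required for (ε, δ)-differential privacy is elided.

```python
import numpy as np

def factorization_mechanism(L, R, x, sigma, rng):
    """Sketch of the offline factorization mechanism: release the query
    answers Qx = L R x privately as L(Rx + z) with i.i.d. Gaussian noise z.

    The error on query t is the Gaussian ell_t . z, whose standard deviation
    is ||ell_t||_2 * sigma, so the per-query error is governed by the row
    norms of L. Calibrating sigma for (eps, delta)-DP (it must scale with
    the max column norm of R) is omitted in this sketch.
    """
    z = rng.normal(0.0, sigma, size=R.shape[0])
    return L @ (R @ x + z)
```

With `sigma = 0` the mechanism returns the exact answers `L @ R @ x`, which is the sanity check that the factorization reproduces the query matrix.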

Core claim

An online algorithm maintains a factorization L_t R_t = Q_t by appending rows to R_t and outputting a new row ell_t satisfying ell_t R_t = q_t for each arriving row q_t; this factorization is competitive with the best offline factorization up to logarithmic factors even when the total number of rows is unknown in advance.
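A toy loop makes this interface concrete. The scheme below appends the residual of q_t as a new row of R_t whenever q_t is not already in the row span, which preserves the invariant ell_t R_t = q_t at every step. It is only an interface sketch under our own naming, not the paper's competitive construction, and it makes no attempt to control the factorization norms.

```python
import numpy as np

def online_factorize(rows, tol=1e-9):
    """Toy online factorization loop (NOT the paper's competitive algorithm).

    For each arriving row q_t, first try to express it as ell_t R_t using
    the rows already stored (least squares); only when the residual is
    nonzero append that residual as a fresh row of R_t and give it weight 1
    in ell_t. The invariant ell_t R_t = q_t holds exactly at every step.
    """
    rows = np.asarray(rows, dtype=float)
    R = np.empty((0, rows.shape[1]))
    ells = []  # ell_t vectors; ell_t has length = #rows of R at time t
    for q in rows:
        if R.shape[0] > 0:
            coef, *_ = np.linalg.lstsq(R.T, q, rcond=None)
        else:
            coef = np.zeros(0)
        residual = q - coef @ R
        if np.linalg.norm(residual) > tol:
            R = np.vstack([R, residual])   # append one new row to R_t
            coef = np.append(coef, 1.0)    # ...and use it with weight 1
        ells.append(coef)
    # Pad the ell_t to a common length to assemble the left factor L.
    L = np.zeros((len(ells), R.shape[0]))
    for t, e in enumerate(ells):
        L[t, :len(e)] = e
    return L, R
```

On the rows (1,0), (0,1), (1,1) the third row is already in the span, so R ends with two rows and L R reproduces the arrived matrix exactly.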

What carries the argument

The incremental online matrix factorization that builds L_t R_t by extending the right factor and producing a matching left row for each new query row, preserving competitiveness with the optimal offline factorization.

If this is right

  • Private release of statistical queries can be performed online with error nearly matching the offline gamma-two bound.
  • Online discrepancy minimization achieves discrepancy competitive with both the gamma-two norm and hereditary discrepancy up to log factors.
  • The algorithm requires no advance knowledge of the total number of queries or rows.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the authors make directly.

  • Similar incremental factorization ideas could apply to other online linear-algebra tasks that currently rely on offline matrix decompositions.
  • Removing the non-adaptive restriction would likely require either larger logarithmic factors or entirely different techniques.
  • The competitiveness guarantee suggests that online versions of other gamma-two-based algorithms may also be possible with only logarithmic overhead.

Load-bearing premise

The queries arrive in a fixed sequence chosen ahead of time and cannot change based on the answers already produced.

What would settle it

Compute the ratio of the online factorization cost to the optimal offline gamma-two norm on a fixed sequence of query rows; if the ratio grows faster than logarithmic in the number of rows for some matrix family, the competitiveness claim fails.
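Under stated assumptions, that check could be scripted as below. Two caveats: the balanced SVD split only upper-bounds the gamma-two norm (computing it exactly requires solving an SDP), and the cost measure used here, max row norm of L times max column norm of R, is the standard factorization-norm objective. All function names are ours.

```python
import numpy as np

def max_row_norm(M):
    return np.linalg.norm(M, axis=1).max()

def max_col_norm(M):
    return np.linalg.norm(M, axis=0).max()

def gamma2_upper_bound(Q):
    """Balanced SVD split Q = (U sqrt(S))(sqrt(S) V^T). This is only an
    upper bound on gamma_2(Q); the exact value requires an SDP."""
    U, s, Vt = np.linalg.svd(Q, full_matrices=False)
    L = U * np.sqrt(s)
    R = np.sqrt(s)[:, None] * Vt
    return max_row_norm(L) * max_col_norm(R)

def competitive_ratio(online_L, online_R, Q):
    """Ratio of the online factorization's cost to the offline proxy.
    The competitiveness claim fails if this grows super-logarithmically
    in the number of rows for some matrix family."""
    assert np.allclose(online_L @ online_R, Q)
    online_cost = max_row_norm(online_L) * max_col_norm(online_R)
    return online_cost / gamma2_upper_bound(Q)
```

For the identity matrix both factorizations are trivial and the ratio is 1; the interesting experiments would use structured query matrices (e.g. prefix sums) where offline and naive online factorizations diverge.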

Figures

Figures reproduced from arXiv: 2605.08358 by Aleksandar Nikolov, Haohua Tang, Jonathan Ullman.

Figure 1. The Meta-Algorithm Trans_{M⇌A}^m(x) for non-adaptive analysts
Figure 2. The Meta-Algorithm Trans_{M⇌A}^m(x) for adaptive analysts
Figure 3. Online SDP algorithm
Figure 4. A simple example of insertion
Figure 5. Combining Instances
Figure 6. Algorithm within each zone
Figure 7. The AboveThreshold algorithm AT_{B,τ,ε}(x)
Figure 8. Adaptive composition game Comp_{B,T,ε₀,δ₀}(x)
Figure 9. A competitive algorithm M_{ε,δ,m,β}(x)
Figure 10. The subroutine M^inner_{s,ε,δ,m,β}(x)
Figure 11. Inserting into data structure
Figure 12. Updating R_t
Figure 13. Computing ℓ_t
Figure 14. Computing ℓ_t
Figure 15. Insertion with scaling
read the original abstract

In this paper we consider several related online computation problems. First, we study answering sequences of statistical queries arriving online, and being answered immediately when they arrive with differential privacy. Known matrix factorization mechanisms can answer a set of statistical queries with error bounded by the $\gamma_2$ norm of their query matrix, but require that all queries are known in advance. We show that nearly the same error bounds can be achieved in the online setting for non-adaptively chosen queries. To do so, we give an online factorization algorithm that competitively matches the best offline factorization up to logarithmic factors. In the online matrix factorization problem, a new row $q_t$ of a matrix arrives at each time step $t$, and the algorithm needs to maintain a factorization $L_tR_t=Q_t$ such that at each time it appends some rows to $R_t$, and outputs a new row $\ell_t$ s.t. $\ell_tR_t=q_t$. Our algorithm maintains the competitiveness over this online process, even if the number of rows to arrive is unknown. As another application, we give an online discrepancy minimization algorithm that achieves discrepancy competitive against the $\gamma_2$ norm (and also against hereditary discrepancy) up to logarithmic factors.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

0 major / 3 minor

Summary. The paper presents an online matrix factorization algorithm for a matrix whose rows q_t arrive sequentially in a non-adaptive fashion. At each step the algorithm appends (possibly multiple) rows to the right factor R_t and emits a new row ell_t of the left factor L_t so that L_t R_t equals the prefix matrix Q_t seen so far. The central claim is that the resulting factorization norm remains competitive with the best offline gamma_2-norm factorization of the entire matrix, up to logarithmic factors, even when the horizon T is unknown in advance. The same technique yields an online differentially private query-release mechanism whose error is competitive with the offline gamma_2 bound and an online discrepancy-minimization algorithm whose discrepancy is competitive with both the gamma_2 norm and the hereditary discrepancy, again up to logarithmic factors.

Significance. If the claimed competitiveness holds, the result is significant because it removes the offline requirement that all queries be known in advance while incurring only logarithmic overhead, a standard price for online algorithms that use doubling epochs or potential-based resizing. The unified treatment of matrix factorization, private query release, and discrepancy minimization under a single online framework is a strength, and the explicit handling of unknown horizon distinguishes the work from many prior online results that assume a known T.

minor comments (3)
  1. [Abstract] The abstract states that the online factorization is competitive 'up to logarithmic factors' but does not exhibit the precise dependence on the dimension or on the gamma_2 norm; a short statement of the main theorem (including the exact log factors) should appear in the introduction.
  2. The description of the algorithm in the abstract mentions appending 'some rows' to R_t; the precise rule for deciding how many rows to append and the potential function used to control the growth should be stated explicitly in the algorithmic section.
  3. For the discrepancy application, it is not immediately clear whether the hereditary-discrepancy competitiveness follows directly from the gamma_2 bound or requires an additional argument; a short paragraph clarifying the relationship would improve readability.

Simulated Author's Rebuttal

0 responses · 0 unresolved

We thank the referee for the careful summary of our paper and the positive assessment of its significance. We are pleased that the referee recognizes the value of the online matrix factorization result and its unified applications to private query release and discrepancy minimization, as well as the handling of unknown horizon. We appreciate the recommendation for minor revision.

Circularity Check

0 steps flagged

No significant circularity identified

full rationale

The paper constructs an online matrix factorization algorithm that maintains a factorization L_t R_t = Q_t by appending rows to R_t and emitting ell_t such that the norm remains competitive with the offline gamma_2 norm of the full matrix up to logarithmic factors, even for unknown horizon T. This competitiveness is measured against an externally defined offline benchmark (gamma_2 norm and hereditary discrepancy), not redefined in terms of the online outputs or fitted parameters. The non-adaptive query assumption and online process are handled via standard techniques without self-referential definitions or load-bearing self-citations that reduce the central claim to its inputs. The applications to private query release and discrepancy minimization inherit the same independent comparison, rendering the derivation self-contained.

Axiom & Free-Parameter Ledger

0 free parameters · 1 axiom · 0 invented entities

Based solely on the abstract, the central claim rests on standard definitions of differential privacy, matrix norms like gamma_2, and online algorithm competitiveness; no free parameters, invented entities, or ad-hoc axioms are explicitly introduced in the summary.

axioms (1)
  • domain assumption Non-adaptive query arrival and standard differential privacy definitions
    The result is stated for non-adaptively chosen queries under differential privacy.

pith-pipeline@v0.9.0 · 5526 in / 1297 out tokens · 40537 ms · 2026-05-12T01:33:38.928359+00:00 · methodology

discussion (0)


Reference graph

Works this paper leans on

48 extracted references · 48 canonical work pages

  1. [1]

    Discrepancy minimization via a self-balancing walk

    Ryan Alweiss, Yang P. Liu, and Mehtaab Sawhney. Discrepancy minimization via a self-balancing walk. In Symposium on Theory of Computing, STOC, pages 14--20, 2021

  2. [2]

    Balancing vectors and Gaussian measures of n-dimensional convex bodies

    Wojciech Banaszczyk. Balancing vectors and Gaussian measures of n-dimensional convex bodies. Random Structures and Algorithms, 12(4):351--360, 1998

  3. [3]

    On a class of balancing games

    Imre Bárány. On a class of balancing games. J. Comb. Theory, Ser. A, 26(2):115--126, 1979

  4. [4]

    The Gram-Schmidt walk: a cure for the Banaszczyk blues

    Nikhil Bansal, Daniel Dadush, Shashwat Garg, and Shachar Lovett. The Gram-Schmidt walk: a cure for the Banaszczyk blues. Theory Comput., 15:Paper No. 21, 27, 2019

  5. [5]

    Practical privacy: the SuLQ framework

    Avrim Blum, Cynthia Dwork, Frank McSherry, and Kobbi Nissim. Practical privacy: the SuLQ framework. In Proceedings of the 24th ACM Symposium on Principles of Database Systems , PODS '05, pages 128--138. ACM, 2005

  6. [6]

    Matrix analysis , volume 169 of Graduate Texts in Mathematics

    Rajendra Bhatia. Matrix analysis , volume 169 of Graduate Texts in Mathematics . Springer-Verlag, New York, 1997

  7. [7]

    Decoupling via affine spectral-independence: Beck-Fiala and Komlós bounds beyond Banaszczyk

    Nikhil Bansal and Haotian Jiang. Decoupling via affine spectral-independence: Beck-Fiala and Komlós bounds beyond Banaszczyk, 2025

  8. [8]

    The design of competitive online algorithms via a primal-dual approach

    Niv Buchbinder and Joseph Naor. The design of competitive online algorithms via a primal-dual approach. Found. Trends Theor. Comput. Sci. , 3(2-3):front matter, 93--263, 2007

  9. [9]

    Decomposable searching problems I: static-to-dynamic transformation

    Jon Louis Bentley and James B. Saxe. Decomposable searching problems I: static-to-dynamic transformation. Journal of Algorithms, 1(4):301--358, 1980

  10. [10]

    On-line balancing of random inputs

    Nikhil Bansal and Joel H. Spencer. On-line balancing of random inputs. Random Structures Algorithms, 57(4):879--891, 2020

  11. [11]

    Make up your mind: The price of online queries in differential privacy

    Mark Bun, Thomas Steinke, and Jonathan Ullman. Make up your mind: The price of online queries in differential privacy. In Proceedings of the 28th Annual ACM-SIAM Symposium on Discrete Algorithms , SODA '17, pages 1306--1325, Philadelphia, PA, 2017. SIAM

  12. [12]

    Fingerprinting codes and the price of approximate differential privacy

    Mark Bun, Jonathan Ullman, and Salil Vadhan. Fingerprinting codes and the price of approximate differential privacy. In 46th Annual ACM Symposium on the Theory of Computing , STOC '14, pages 1--10, New York, NY, USA, 2014

  13. [13]

    Tight hardness results for minimizing discrepancy

    Moses Charikar, Alantha Newman, and Aleksandar Nikolov. Tight hardness results for minimizing discrepancy. In ACM-SIAM Symposium on Discrete Algorithms, SODA , pages 1607--1614, 2011

  14. [14]

    Private and continual release of statistics

    T-H Hubert Chan, Elaine Shi, and Dawn Song. Private and continual release of statistics. ACM Transactions on Information and System Security (TISSEC) , 14(3):26, 2011

  15. [15]

    Towards a constructive version of Banaszczyk's vector balancing theorem

    Daniel Dadush, Shashwat Garg, Shachar Lovett, and Aleksandar Nikolov. Towards a constructive version of Banaszczyk's vector balancing theorem. Theory Comput., 15:Paper No. 15, 58, 2019

  16. [16]

    Our data, ourselves: Privacy via distributed noise generation

    Cynthia Dwork, Krishnaram Kenthapadi, Frank McSherry, Ilya Mironov, and Moni Naor. Our data, ourselves: Privacy via distributed noise generation. In Advances in Cryptology - EUROCRYPT 2006, 25th Annual International Conference on the Theory and Applications of Cryptographic Techniques, St. Petersburg, Russia, May 28 - June 1, 2006, Proceedings , volume 40...

  17. [17]

    The modernization of statistical disclosure limitation at the U.S. Census Bureau

    Aref N. Dajani, Amy D. Lauger, Phyllis E. Singer, Daniel Kifer, Jerome P. Reiter, Ashwin Machanavajjhala, Simson L. Garfinkel, Scot A. Dahl, Matthew Graham, Vishesh Karwa, Hang Kim, Philip Lelerc, Ian M. Schmutte, William N. Sexton, Lars Vilhuber, and John M. Abowd. The modernization of statistical disclosure limitation at the U.S. census bureau, 2017. Pr...

  18. [18]

    Calibrating noise to sensitivity in private data analysis

    Cynthia Dwork, Frank McSherry, Kobbi Nissim, and Adam Smith. Calibrating noise to sensitivity in private data analysis. In Proceedings of the 3rd Conference on Theory of Cryptography , TCC '06, pages 265--284, Berlin, Heidelberg, 2006. Springer

  19. [19]

    Revealing information while preserving privacy

    Irit Dinur and Kobbi Nissim. Revealing information while preserving privacy. In PODS , pages 202--210. ACM , 2003

  20. [20]

    Privacy-preserving datamining on vertically partitioned databases

    Cynthia Dwork and Kobbi Nissim. Privacy-preserving datamining on vertically partitioned databases. In Annual International Cryptology Conference , pages 528--544. Springer, 2004

  21. [21]

    Differential privacy under continual observation

    Cynthia Dwork, Moni Naor, Toniann Pitassi, and Guy N. Rothblum. Differential privacy under continual observation. In Symposium on Theory of Computing (STOC) , pages 715--724. ACM , 2010

  22. [22]

    On the complexity of differentially private data release: efficient algorithms and hardness results

    Cynthia Dwork, Moni Naor, Omer Reingold, Guy N. Rothblum, and Salil P. Vadhan. On the complexity of differentially private data release: efficient algorithms and hardness results. In STOC , pages 381--390. ACM , 2009

  23. [23]

    The algorithmic foundations of differential privacy

    Cynthia Dwork and Aaron Roth. The algorithmic foundations of differential privacy. Found. Trends Theor. Comput. Sci. , 9(3-4):211--407, 2014

  24. [24]

    Boosting and differential privacy

    Cynthia Dwork, Guy N. Rothblum, and Salil P. Vadhan. Boosting and differential privacy. In FOCS , pages 51--60. IEEE , 2010

  25. [25]

    New efficient attacks on statistical disclosure control mechanisms

    Cynthia Dwork and Sergey Yekhanin. New efficient attacks on statistical disclosure control mechanisms. In Annual International Cryptology Conference , pages 469--480. Springer, 2008

  26. [26]

    Online semidefinite programming

    Noa Elad, Satyen Kale, and Joseph Naor. Online semidefinite programming. In 43rd International Colloquium on Automata, Languages, and Programming, volume 55 of LIPIcs. Leibniz Int. Proc. Inform., pages Art. No. 40, 13. Schloss Dagstuhl. Leibniz-Zent. Inform., Wadern, 2016

  27. [27]

    The power of factorization mechanisms in local and central differential privacy

    Alexander Edmonds, Aleksandar Nikolov, and Jonathan R. Ullman. The power of factorization mechanisms in local and central differential privacy. In Proceedings of the 52nd Annual ACM SIGACT Symposium on Theory of Computing, STOC 2020, Chicago, IL, USA, June 22-26, 2020, pages 425--438. ACM, 2020

  28. [28]

    Competitive online algorithms for resource allocation over the positive semidefinite cone

    Reza Eghbali, James Saunderson, and Maryam Fazel. Competitive online algorithms for resource allocation over the positive semidefinite cone. Math. Program. , 170(1):267--292, 2018

  29. [29]

    Iterative constructions and private data release

    Anupam Gupta, Aaron Roth, and Jonathan Ullman. Iterative constructions and private data release. In 9th IACR Theory of Cryptography Conference , TCC '12, pages 339--356, Taormina, Italy, 2012. Springer

  30. [30]

    Sphere packing numbers for subsets of the Boolean n-cube with bounded Vapnik-Chervonenkis dimension

    David Haussler. Sphere packing numbers for subsets of the Boolean n-cube with bounded Vapnik-Chervonenkis dimension. J. Combin. Theory Ser. A, 69(2):217--232, 1995

  31. [31]

    A simple and practical algorithm for differentially private data release

    Moritz Hardt, Katrina Ligett, and Frank McSherry. A simple and practical algorithm for differentially private data release. In Advances in Neural Information Processing Systems 25: 26th Annual Conference on Neural Information Processing Systems 2012. Proceedings of a meeting held December 3-6, 2012, Lake Tahoe, Nevada, United States. , pages 2348--2356, 2012

  32. [32]

    A multiplicative weights mechanism for privacy-preserving data analysis

    Moritz Hardt and Guy Rothblum. A multiplicative weights mechanism for privacy-preserving data analysis. In FOCS, pages 61--70, 2010

  33. [33]

    Optimal online discrepancy minimization

    Janardhan Kulkarni, Victor Reis, and Thomas Rothvoss. Optimal online discrepancy minimization. In Bojan Mohar, Igor Shinkar, and Ryan O'Donnell, editors, Proceedings of the 56th Annual ACM Symposium on Theory of Computing, STOC 2024, Vancouver, BC, Canada, June 24-28, 2024 , pages 1832--1840. ACM , 2024

  34. [34]

    Better Gaussian mechanism using correlated noise

    Christian Janos Lebeda. Better Gaussian mechanism using correlated noise. In Ioana Oriana Bercea and Rasmus Pagh, editors, 2025 Symposium on Simplicity in Algorithms, SOSA 2025, New Orleans, LA, USA, January 13-15, 2025, pages 119--133. SIAM, 2025

  35. [35]

    Optimizing linear counting queries under differential privacy

    Chao Li, Michael Hay, Vibhor Rastogi, Gerome Miklau, and Andrew McGregor. Optimizing linear counting queries under differential privacy. In Proceedings of the 29th ACM Symposium on Principles of Database Systems , PODS'10, pages 123--134. ACM, 2010

  36. [36]

    Complexity measures of sign matrices

    Nati Linial, Shahar Mendelson, Gideon Schechtman, and Adi Shraibman. Complexity measures of sign matrices. Combinatorica , 27(4):439--463, 2007

  37. [37]

    Discrepancy of set-systems and matrices

    László Lovász, Joel Spencer, and Katalin Vesztergombi. Discrepancy of set-systems and matrices. European Journal of Combinatorics, 7(2):151--160, 1986

  38. [38]

    Better differentially private approximate histograms and heavy hitters using the Misra-Gries sketch

    Christian Janos Lebeda and Jakub Tetek. Better differentially private approximate histograms and heavy hitters using the Misra-Gries sketch. SIGMOD Rec., 53(1):7--14, 2024

  39. [39]

    Optimizing error of high-dimensional statistical queries under differential privacy

    Ryan McKenna, Gerome Miklau, Michael Hay, and Ashwin Machanavajjhala. Optimizing error of high-dimensional statistical queries under differential privacy. Journal of Privacy and Confidentiality , 13(1), Aug. 2023

  40. [40]

    Optimal private halfspace counting via discrepancy

    S Muthukrishnan and Aleksandar Nikolov. Optimal private halfspace counting via discrepancy. In Proceedings of the forty-fourth annual ACM symposium on Theory of computing , pages 1285--1292. ACM, 2012

  41. [41]

    Factorization norms and hereditary discrepancy

    Jiří Matoušek, Aleksandar Nikolov, and Kunal Talwar. Factorization norms and hereditary discrepancy. Int. Math. Res. Not. IMRN, (3):751--780, 2020

  42. [42]

    Tight upper bounds for the discrepancy of half-spaces

    J. Matoušek. Tight upper bounds for the discrepancy of half-spaces. Discrete Comput. Geom., 13(3-4):593--601, 1995

  43. [43]

    General Gaussian noise mechanisms and their optimality for unbiased mean estimation

    Aleksandar Nikolov and Haohua Tang. General Gaussian noise mechanisms and their optimality for unbiased mean estimation. In ITCS, volume 287 of LIPIcs, pages 85:1--85:23. Schloss Dagstuhl - Leibniz-Zentrum für Informatik, 2024

  44. [44]

    The geometry of differential privacy: The small database and approximate cases

    Aleksandar Nikolov, Kunal Talwar, and Li Zhang. The geometry of differential privacy: The small database and approximate cases. SIAM J. Comput. , 45(2):575--616, 2016

  45. [45]

    Interactive privacy via the median mechanism

    Aaron Roth and Tim Roughgarden. Interactive privacy via the median mechanism. In STOC , pages 765--774. ACM , June 5--8 2010

  46. [46]

    Balancing games

    Joel Spencer. Balancing games. J. Comb. Theory, Ser. B , 23(1):68--74, 1977

  47. [47]

    High-dimensional probability , volume 47 of Cambridge Series in Statistical and Probabilistic Mathematics

    Roman Vershynin. High-dimensional probability , volume 47 of Cambridge Series in Statistical and Probabilistic Mathematics . Cambridge University Press, Cambridge, 2018. An introduction with applications in data science, With a foreword by Sara van de Geer

  48. [48]

    Differential privacy via wavelet transforms

    Xiaokui Xiao, Guozhang Wang, and Johannes Gehrke. Differential privacy via wavelet transforms. IEEE Trans. Knowl. Data Eng. , 23(8):1200--1214, 2011