In the classical setting, we aim at partitioning the vertices of a hypergraph $\mathcal{H} = (V, \mathcal{E})$ into two classes in such a way that ideally each hyperedge contains the same number of vertices in both classes. A partition into two classes can be represented by a coloring $\chi: V \to \{-1, +1\}$. We call $-1$ and $+1$ colors. The color classes $\chi^{-1}(-1)$ and $\chi^{-1}(+1)$ form the corresponding partition. For a hyperedge $E \in \mathcal{E}$, set $\chi(E) := \sum_{v \in E} \chi(v)$. The discrepancy of $E$ with respect to $\chi$ and the discrepancy of $\mathcal{H}$ are defined by $$\operatorname{disc}(E, \chi) := |\chi(E)|, \qquad \operatorname{disc}(\mathcal{H}) := \min_{\chi: V \to \{-1,+1\}} \max_{E \in \mathcal{E}} |\chi(E)|.$$ These notions as well as the term 'discrepancy' seem to have appeared for the first time in a paper of Beck. Earlier results on this problem include the famous lower bound on the discrepancy of arithmetic progressions by Roth and upper bounds for this problem and other results by Erdős and Spencer and by Sárközi. At that time, discrepancy problems were called quasi-Ramsey problems. To get some intuition for this concept, let us have a look at a few examples.
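To make the definitions concrete, here is a small Python sketch that computes $\chi(E)$ and determines $\operatorname{disc}(\mathcal{H})$ by exhausting all $2^{|V|}$ colorings. The function names and the toy hypergraph are my own choices, and the brute-force search is of course only feasible for very small instances.

```python
from itertools import product

def chi_of_edge(chi, edge):
    """chi(E): sum of the colors of the vertices in E."""
    return sum(chi[v] for v in edge)

def disc(vertices, edges):
    """Brute-force disc(H): minimum over all 2^|V| colorings chi of
    max_E |chi(E)|.  Only feasible for very small hypergraphs."""
    vertices = list(vertices)
    best = float("inf")
    for colors in product((-1, +1), repeat=len(vertices)):
        chi = dict(zip(vertices, colors))
        worst = max(abs(chi_of_edge(chi, E)) for E in edges)
        best = min(best, worst)
    return best

# Toy example: three hyperedges on four vertices.
V = range(4)
E = [{0, 1, 2}, {1, 2, 3}, {0, 3}]
print(disc(V, E))  # the discrepancy of this small hypergraph
```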
If all edges of $\mathcal{H}$ intersect trivially, i.e. $E_1 \cap E_2 = \emptyset$ for any two distinct edges $E_1, E_2 \in \mathcal{E}$, then the discrepancy is zero if all edges have even cardinality, and one if there is an edge of odd cardinality.
The other extreme is marked by the complete hypergraph $\mathcal{H}_n = ([n], 2^{[n]})$. In this case the discrepancy is $\lceil n/2 \rceil$: any 2-coloring has a color class of size at least $\lceil n/2 \rceil$, and this set is also an edge. On the other hand, any coloring with color classes of size $\lceil n/2 \rceil$ and $\lfloor n/2 \rfloor$ shows that the discrepancy is not larger than $\lceil n/2 \rceil$. It seems that the discrepancy reflects how chaotically the hyperedges of $\mathcal{H}$ intersect. Things are not that easy, however, as the following example shows.
Set $V := [2n]$ and $\mathcal{E} := \{E \subseteq V \mid |E \cap \{2i-1, 2i\}| \in \{0, 2\} \text{ for all } i \in [n]\}$, i.e. each edge contains every pair $\{2i-1, 2i\}$ either completely or not at all. Now $\mathcal{H} = (V, \mathcal{E})$ has many complicatedly intersecting edges, but discrepancy zero: the coloring with $\chi(2i-1) = +1$ and $\chi(2i) = -1$ for all $i \in [n]$ satisfies $\chi(E) = 0$ for every edge.
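For a quick sanity check of this pairing construction as described above (with $n = 3$; the variable names are mine), the following sketch enumerates all edges and verifies that the pairwise-opposite coloring balances every one of them.

```python
from itertools import combinations

n = 3
pairs = [(2 * i + 1, 2 * i + 2) for i in range(n)]   # {1,2}, {3,4}, {5,6}

# Every (non-empty) union of pairs is an edge: each pair is contained
# completely or not at all.
edges = [set().union(*chosen)
         for k in range(1, n + 1)
         for chosen in combinations(pairs, k)]

# Color the two vertices of each pair oppositely.
chi = {}
for a, b in pairs:
    chi[a], chi[b] = +1, -1

# Every edge is perfectly balanced, so the discrepancy is zero.
assert all(sum(chi[v] for v in E) == 0 for E in edges)
print(len(edges), "edges, all with chi(E) = 0")
```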
The last example shows that we cannot expect to determine the discrepancy by looking at a single parameter like the number of hyperedges. Still, the size of the hypergraph yields first upper bounds.
Theorem. $\operatorname{disc}(\mathcal{H}) \le \sqrt{2 n \ln(2m)}$,
with $n$ the number of vertices and $m$ the number of edges of $\mathcal{H}$. The proof is a simple application of the probabilistic method: Let $\chi: V \to \{-1, +1\}$ be a random coloring, i.e. we have $\Pr(\chi(v) = -1) = \Pr(\chi(v) = +1) = \tfrac{1}{2}$ independently for all $v \in V$. For every edge $E$, $\chi(E)$ is a sum of $|E|$ independent $-1, +1$ random variables. So we have the Chernoff bound $\Pr(|\chi(E)| > \lambda) < 2 e^{-\lambda^2/(2|E|)}$ for all $E \in \mathcal{E}$ and $\lambda > 0$. Put $\lambda := \sqrt{2 n \ln(2m)}$ for convenience. Then $$\Pr(\exists E \in \mathcal{E}: |\chi(E)| > \lambda) \le \sum_{E \in \mathcal{E}} \Pr(|\chi(E)| > \lambda) < 2 m \, e^{-\lambda^2/(2n)} = 1.$$ Since a random coloring with positive probability has discrepancy at most $\lambda$, in particular, there are colorings that have discrepancy at most $\lambda$. Hence $\operatorname{disc}(\mathcal{H}) \le \sqrt{2 n \ln(2m)}$.
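The calculation can also be checked empirically. The sketch below (a hypothetical instance with randomly chosen edges; the names are mine) draws uniform random colorings and records how often the largest imbalance $\max_E |\chi(E)|$ exceeds $\sqrt{2 n \ln(2m)}$; the union bound above shows this happens with probability less than one, and in practice it is rare.

```python
import math
import random

def random_coloring_excess(n, edges, trials=200, seed=1):
    """Fraction of uniform random colorings whose maximum edge imbalance
    exceeds the bound sqrt(2 n ln(2 m))."""
    rng = random.Random(seed)
    m = len(edges)
    bound = math.sqrt(2 * n * math.log(2 * m))
    exceed = 0
    for _ in range(trials):
        chi = [rng.choice((-1, +1)) for _ in range(n)]
        worst = max(abs(sum(chi[v] for v in E)) for E in edges)
        exceed += worst > bound
    return bound, exceed / trials

# A random test hypergraph: each of the m edges picks every vertex with prob. 1/2.
n, m = 200, 500
rng = random.Random(0)
edges = [[v for v in range(n) if rng.random() < 0.5] for _ in range(m)]

bound, fraction = random_coloring_excess(n, edges)
print(f"bound = {bound:.1f}, fraction of colorings exceeding it = {fraction:.2f}")
```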
Theorem. For any hypergraph $\mathcal{H}$ such that $m \ge n$ we have $\operatorname{disc}(\mathcal{H}) \le K \sqrt{n \ln(2m/n)}$ for an absolute constant $K$.
To prove this, a much more sophisticated approach using the entropy function was necessary. Of course this is particularly interesting for $m = O(n)$. In the case $m = n$, $\operatorname{disc}(\mathcal{H}) \le 6\sqrt{n}$ can be shown for $n$ large enough. Therefore, this result is usually known as 'Six Standard Deviations Suffice'. It is considered to be one of the milestones of discrepancy theory. The entropy method has seen numerous other applications, e.g. in the proof of the tight upper bound for the discrepancy of arithmetic progressions by Matoušek and Spencer or the upper bound in terms of the primal shatter function due to Matoušek.
Theorem. Assume that each vertex of $\mathcal{H}$ is contained in at most $t$ edges. Then $\operatorname{disc}(\mathcal{H}) \le 2t - 1$.
This result, the Beck–Fiala theorem, is due to Beck and Fiala. They bound the discrepancy solely in terms of the maximum degree $t$ of $\mathcal{H}$. It is a famous open problem whether this bound can be improved asymptotically. Beck and Fiala conjectured that $\operatorname{disc}(\mathcal{H}) = O(\sqrt{t})$, but little progress has been made so far in this direction. Bednarchak and Helm and Helm improved the Beck–Fiala bound in tiny steps to $2t - 3$. A corollary of Beck's paper – the first time the notion of discrepancy explicitly appeared – shows that the discrepancy is at most $C\sqrt{t}$ times logarithmic factors in $m$ and $n$, for some constant $C$. The latest improvement in this direction is due to Banaszczyk: $\operatorname{disc}(\mathcal{H}) = O(\sqrt{t \log n})$.
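The proof of the Beck–Fiala theorem is constructive, and its 'floating colours' iteration is easy to sketch. The following Python code is my own illustration, not code from the literature: it keeps the sums of all edges that still contain more than $t$ floating vertices constant by moving along a null-space direction (found via numpy's SVD) until another color gets fixed at $\pm 1$; in exact arithmetic the resulting coloring satisfies $|\chi(E)| \le 2t - 1$ for every edge.

```python
import numpy as np

def beck_fiala_coloring(n, edges):
    """Sketch of the floating-colours iteration behind the Beck-Fiala theorem.

    x starts at 0 and ends in {-1,+1}^n.  An edge is 'active' while it has more
    than t floating vertices (|x_v| < 1); the counting argument guarantees that
    there are fewer active edges than floating vertices, so a direction exists
    that keeps all active edge sums constant.
    """
    t = max(sum(1 for E in edges if v in E) for v in range(n))  # maximum degree
    x = np.zeros(n)
    eps = 1e-9
    while True:
        floating = [v for v in range(n) if abs(x[v]) < 1 - eps]
        if not floating:
            break
        fl = set(floating)
        active = [E for E in edges if sum(1 for v in E if v in fl) > t]
        # Constraints: keep sum_{v in E, v floating} x_v constant for active E.
        A = np.zeros((max(len(active), 1), len(floating)))
        for i, E in enumerate(active):
            for j, v in enumerate(floating):
                A[i, j] = 1.0 if v in E else 0.0
        _, s, vh = np.linalg.svd(A)
        rank = int((s > 1e-8).sum())
        y = vh[rank]                      # a unit vector in the null space of A
        # Walk along y until the first floating coordinate reaches -1 or +1.
        steps = [((1.0 if y[j] > 0 else -1.0) - x[v]) / y[j]
                 for j, v in enumerate(floating) if abs(y[j]) > 1e-12]
        alpha = min(steps)
        for j, v in enumerate(floating):
            x[v] += alpha * y[j]
    return np.sign(x + eps).astype(int)

# Hypothetical small instance with maximum degree t = 2.
edges = [{0, 1, 2}, {2, 3, 4}, {4, 5, 0}, {1, 3, 5}]
chi = beck_fiala_coloring(6, edges)
assert all(abs(sum(chi[v] for v in E)) <= 2 * 2 - 1 for E in edges)
print(chi)
```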