\documentclass{article}
% preamble
\def\npart{III}
\def\nyear{2023}
\def\nterm{Michaelmas}
\def\nlecturer{Prof B\'ela Bollob\'as}
\def\ncourse{Combinatorics}
\def\draft{Incomplete}
\input{header}
% and here we go!
\begin{document}
\maketitle
\tableofcontents
\clearpage
\setcounter{section}{-1}
\section{Introduction}
For a finite set $A$, we write $\abs A$ for its cardinality.
For a graph $G = (V, E)$ and $A, B \subseteq V$, we denote by $\Gamma(A) = \{b | \exists a \in A, a \sim b\}$ the set of neighbors of $A$ and by $e(A, B)$ the number of edges between $A$ and $B$.
\clearpage
\section{Basic Results}
\subsection{Chains, Antichains and Scattered Sets of Vectors}
\newlec
During WW2, Littlewood and Offord were interested in roots of polynomials with random coefficients. They came up with the following neat theorem.
\begin{thm}[Littlewood-Offord, 1943]
If $z_1, \dots, z_n \in \C$ with $\abs{z_i} \ge 1$, then, for any disk $D$ of radius $r$,
$$\#\{\eps \in \{-1, 1\}^n | \sum_i \eps_i z_i \in D\} \le c \log n \frac{2^n}{\sqrt n}$$
for some constant $c$ depending only on $r$.
\end{thm}
Upon seeing this theorem, Erd\H os immediately knew he could drastically improve the bound if the $z_i$ were real.
\begin{thm}[Erd\H os, 1945]
If $x_1, \dots, x_n \in \R$, $\abs{x_i} \ge 1$, then, for any interval $I$ of length $2$,
$$\#\{\eps \in \{-1, 1\}^n | \sum_i \eps_i x_i \in I\} \le \binom n{\floor{\frac n 2}}$$
\end{thm}
This is best possible, as we see by taking $x_1 = \dots = x_n = 1$.
Let $G$ be a bipartite graph with parts $U$ and $W$. A {\bf complete matching} from $U$ to $W$ is an injective function $f : U \to W$ such that $\for u \in U, u \sim f(u)$. \\
If $G$ has a complete matching, then certainly $\abs A \le \abs{\Gamma(A)}$ for all $A \subseteq U$. Surprisingly, this is enough.
\begin{nthm}[K\H onig-Egerv\'ary-Hall Theorem, Hall's Marriage Theorem]\label{thm:hall}
$$G \text{ has a complete matching } \iff \for A \subseteq U, \abs A \le \abs{\Gamma(A)}$$
\end{nthm}
\begin{proof}
Exercise
\end{proof}
Let $\mcF = (F_1, \dots, F_m)$ where the $F_i$ are finite sets. We say $a_1, \dots, a_m$ is a {\bf set of distinct representatives}, aka {\bf SDR}, if the $a_i$ are distinct and $\for i, a_i \in F_i$. Certainly, if $\mcF$ has an SDR, then $\abs I \le \abs{\bigcup_{i \in I} F_i}$ for all $I \subseteq [m]$.
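For instance, $\mcF = (\{1, 2\}, \{2, 3\}, \{1, 3\})$ has the SDR $1, 2, 3$, while $\mcF = (\{1\}, \{2\}, \{1, 2\})$ has no SDR: for $I = [3]$ we get $\abs I = 3 > 2 = \abs{\bigcup_{i \in I} F_i}$.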
\begin{nthm}
$$\mcF \text{ has an SDR} \iff \for I \subseteq [m], \abs I \le \abs{\bigcup_{i \in I} F_i}$$
\end{nthm}
\begin{proof}
Define a bipartite graph $G$ with parts $[m]$ and $\bigcup_i F_i$ by $i \sim a \iff a \in F_i$. For all $I \subseteq [m]$, $\abs I \le \abs{\bigcup_{i \in I} F_i} = \abs{\Gamma(I)}$, so Theorem \ref{thm:hall} applies.
\end{proof}
\begin{nthm}
If $G$ is a bipartite graph with parts $U$, $W$ such that $\deg(u) \ge \deg(w)$ for all $u \in U, w \in W$, then there is a complete matching from $U$ to $W$.
\end{nthm}
\begin{proof}
Find $d$ such that $\deg(u) \ge d \ge \deg(w)$ for all $u \in U, w \in W$. For all $A \subseteq U$, we have
$$d\abs A \le e(A, \Gamma(A)) \le d\abs{\Gamma(A)}$$
Hence $\abs A \le \abs{\Gamma(A)}$. We're done by Theorem \ref{thm:hall}.
\end{proof}
For $A \subseteq U, B \subseteq W$, define $w(A) = \frac{\abs A}{\abs U}, w(B) = \frac{\abs B}{\abs W}$.
Say a bipartite graph $G$ with parts $U, W$ is {\bf $(k, \ell)$-biregular} if
$\deg(u) = k, \deg(w) = \ell$ for all $u \in U, w \in W$.
\begin{nlemma}\label{lem:weight-biregular}
If $G$ is biregular with parts $U, W$ and $A \subseteq U$, then $w(A) \le w(\Gamma(A))$.
\end{nlemma}
\begin{proof}
First, $k\abs U = e(G) = \ell\abs W$. Second,
$$k\abs A = e(A, \Gamma(A)) \le \ell\abs{\Gamma(A)}$$
Dividing the inequality by the equality gives the result.
\end{proof}
\newlec
\begin{ncor}
Let $G$ be a $(k, \ell)$-biregular graph with parts $U, W$. If $k \ge \ell$ (or equivalently $\abs U \le \abs W$), then there is a complete matching from $U$ to $W$.
\end{ncor}
\begin{ncor}\label{cor:layer-inj}
If $\abs{s - \frac n 2} \le \abs{r - \frac n 2}$, then there exists an injection $f : X^{(r)} \hookrightarrow X^{(s)}$ such that either
\begin{itemize}
\item $r \le s$ and $A \subseteq f(A)$ for all $A \in X^{(r)}$
\item $s \le r$ and $f(A) \subseteq A$ for all $A \in X^{(r)}$
\end{itemize}
\end{ncor}
\begin{nthm}[Sperner, 1928]
Let $\mcA \subseteq \mcP(X)$ be an antichain. Then $\abs{\mcA} \le \binom n{\floor{\frac n 2}}$
\end{nthm}
\begin{proof}
A chain and an antichain can intersect in at most one element. If we manage to partition $\mcP(X)$ into $\binom n{\floor{\frac n 2}}$ chains, we win. \\
But we can repeatedly use Corollary \ref{cor:layer-inj} to construct matchings $X^{(0)}$ to $X^{(1)}$, $X^{(1)}$ to $X^{(2)}$, \dots, $X^{(\floor{\frac n 2} - 1)}$ to $X^{(\floor{\frac n 2})}$ and $X^{(n)}$ to $X^{(n - 1)}$, $X^{(n - 1)}$ to $X^{(n - 2)}$, \dots, $X^{(\floor{\frac n 2} + 1)}$ to $X^{(\floor{\frac n 2})}$, then ``stack'' the matchings together to make chains (an element of the middle layer not covered by any matching forms a chain on its own). Each chain passes through $X^{(\floor{\frac n 2})}$ exactly once, so we have made $\binom n {\floor{\frac n 2}}$ chains.
\end{proof}
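The bound is attained by the middle layer $X^{(\floor{\frac n2})}$ itself: for $n = 3$, the antichain $\{\{1\}, \{2\}, \{3\}\}$ has size $3 = \binom 3{\floor{\frac 32}}$.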
We can now understand the observation of Erd\H os (1945) about Littlewood-Offord (1943).
\begin{ncor}
Let $x_1, \dots, x_n \in \R$ be such that $\abs{x_i} \ge 1$. Then at most $\binom n{\floor{\frac n 2}}$ of the sums $\sum_i \eps_i x_i$, $\eps_i = \pm 1$ fall into the interior of an interval $I$ of length $2$.
\end{ncor}
\begin{proof}
WLOG $\for i, x_i \ge 1$. Set $F_\eps = \{i | \eps_i = 1\}$. $\{F_\eps | \sum_i \eps_i x_i \in I\}$ is an antichain (if $F_\eps \subsetneq F_{\eps'}$, then $\sum_i \eps'_i x_i \ge \sum_i \eps_i x_i + 2$, so both sums can't lie in $I$).
\end{proof}
\begin{defi}
A partial order $P$ is {\bf graded} if it has a partition $P_i$ such that
\begin{itemize}
\item if $x < y$, $x \in P_i$, $y \in P_j$, then $i < j$ (in particular each $P_i$ is an antichain)
\item if $x < y$, $x \in P_i$, $y \in P_j$, $i + 2 \le j$, then there exists $z$ such that $x < z < y$.
\end{itemize}
For $a \in P$, we call the unique $i$ for which $a \in P_i$ the {\bf grade} or {\bf rank} of $a$. \\
A graded order is {\bf regular} if for every $i$ there exists $p_i$ such that every $x \in P_i$ is less than exactly $p_i$ elements of $P_{i + 1}$. \\
For $A \subseteq P$, define $A_i = A \cap P_i$ and $w(A) = \sum_i \frac{\abs{A_i}}{\abs{P_i}}$.
\end{defi}
TODO: Insert picture
\begin{nthm}
Let $A$ be an antichain in a connected regular graded order $P$. Then $w(A) \le 1$.
\end{nthm}
\begin{proof}
The regularity condition means that for each $i$ the bipartite graph $G_i$ with parts $P_{i - 1}, P_i$ and $x \sim y \iff x < y$ is $(p_{i - 1}, q_i)$-biregular for some $q_i$. In particular, $w(A_i) \le w(\Gamma_{G_i}(A_i))$ by Lemma \ref{lem:weight-biregular}. \\
Now, write $r$ the maximal rank of an element of $A$ and define
$$B := (A \setminus A_r) \union \Gamma_{G_r}(A_r)$$
The fact that $A$ is an antichain means that $B$ is an antichain as well and $\Gamma_{G_r}(A_r)$ is disjoint from $A_{r - 1}$. Hence
\begin{align*}
w(A)
& = w(A_r) + w(A_{r - 1}) + \sum_{i < r - 1} w(A_i) \\
& \le w(\Gamma_{G_r}(A_r)) + w(A_{r - 1}) + \sum_{i < r - 1} w(A_i) \\
& = w(B_{r - 1}) + \sum_{i < r - 1} w(B_i) \\
& = w(B)
\end{align*}
We therefore have decreased the maximal rank without decreasing the weight. We can repeat the process until the antichain is contained in some $P_i$, in which case its weight is clearly at most $1$.
\end{proof}
\newlec
Consider maximal chains in our regular graded order. Say there are $M$ of them. Each $x \in P_h$ lies in the same number $m(x)$ of maximal chains, namely $m(x) = \frac M{\abs{P_h}}$.
\begin{proof}[Second proof]
No two elements of $A$ lie in the same maximal chain. Hence
$$M \ge \sum_{x \in A} m(x) = \sum_{x \in A} \frac M{\abs{P_{\rank(x)}}} = Mw(A)$$
\end{proof}
The following is a corollary of the above, but we provide a proof using Katona's circle method.
\begin{nthm}[Lubell-Yamamoto-Meshalkin Inequality]
If $\mcA \subseteq 2^{[n]}$ is an antichain, then $\sum_{A \in \mcA} {\binom n {\abs A}}^{-1} \le 1$
\end{nthm}
\begin{proof}
We say that $A \in 2^{[n]}$ is {\bf contained} in a permutation $\pi$ if $A = \{\pi_1, \dots, \pi_{\abs A}\}$. Every permutation contains at most one element of $\mcA$ (two sets contained in the same permutation are nested) and every $A \in \mcA$ is contained in $\abs A!(n - \abs A)!$ permutations. Hence $\sum_{A \in \mcA} \abs A!(n - \abs A)! \le n!$, and dividing by $n!$ gives the claim.
\end{proof}
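For instance, for $n = 3$ and $\mcA = \{\{1\}, \{2, 3\}\}$, the left-hand side is $\binom 31^{-1} + \binom 32^{-1} = \frac 23 \le 1$.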
We say a chain $C_i \subseteq C_{i + 1} \subseteq \dots \subseteq C_{n - i}$ in $2^{[n]}$ is {\bf symmetric} if $\abs{C_j} = j$ for all $i \le j \le n - i$.
\begin{egs}
$\{1\} \subseteq \{1, 4\} \subseteq \{1, 3, 4\} \subseteq \{1, 3, 4, 6\} \subseteq \{1, 3, 4, 5, 6\}$ and $\{2, 4, 5\}$ are symmetric chains in $2^{[6]}$. $\{2, 5, 6\} \subseteq \{2, 4, 5, 6\}$ is a symmetric chain in $2^{[7]}$ but not in any other $2^{[n]}$.
\end{egs}
\begin{nthm}[Partition into Symmetric Chains]
Every finite powerset can be partitioned into symmetric chains.
\end{nthm}
\begin{proof}
Induction on $n$:
\begin{itemize}
\item $\{\{\emptyset\}\}$ is a PSC for $n = 0$.
\item Assume we have a PSC for $n$. For every chain $\mcC = \{C_i, \dots, C_{n - i}\}$ in our PSC for $n$, add the following two chains to our PSC for $n + 1$ (omitting $\mcC''$ when it is empty, ie when $\mcC$ consists of a single set):
\begin{align*}
\mcC' & = \{C_i, \dots, C_{n - i}, C_{n - i} \cup \{n + 1\}\} \\
\mcC'' & = \{C_i \cup \{n + 1\}, \dots, C_{n - i - 1} \cup \{n + 1\}\}
\end{align*}
TODO: Insert figure
\end{itemize}
\end{proof}
The number of symmetric chains of length $n + 1 - 2i$ in a PSC is
$$\binom n i - \binom n{i - 1}$$
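For instance, starting from the PSC $\{\{\emptyset, \{1\}\}\}$ of $2^{[1]}$, the construction produces the PSC $\{\{\emptyset, \{1\}, \{1, 2\}\}, \{\{2\}\}\}$ of $2^{[2]}$: one chain of length $3$ (for $i = 0$) and $\binom 21 - \binom 20 = 1$ chain of length $1$ (for $i = 1$), accounting for all $2^2$ subsets.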
\begin{nthm}
Let $x_1, \dots, x_n$ be vectors of norm at least $1$ in a normed space. For $A \subseteq [n]$, set $x_A = \sum_{i \in A} x_i$. Let $\mcA \subseteq 2^{[n]}$ such that
$$\for A, B \in \mcA, \norm{x_A - x_B} < 1$$
Then $\abs\mcA \le \binom n{\floor{\frac n2}}$.
\end{nthm}
\begin{proof}
Call $\mcB \subseteq 2^{[n]}$ {\bf sparse} or {\bf scattered} if $\norm{x_A - x_B} \ge 1$ for all distinct $A, B \in \mcB$. $\mcA$ intersects every sparse family in at most one set, so we would be done if there existed a partition of $2^{[n]}$ into $\binom n{\floor{\frac n2}}$ sparse families. This is the next theorem.
\end{proof}
\newlec
\begin{nthm}[Kleitman]
$2^{[n]}$ has a partition into $\binom n{\floor{\frac n2}}$ sparse families.
\end{nthm}
\begin{proof}
Induction on $n$:
\begin{itemize}
\item $\{\{\empty\}\}$ is a sparse partition for $n = 0$.
\item Assume we have a sparse partition for $n$. Let $f$ be a support functional at $x_{n + 1}$ ($\for x, f(x) \le \norm x$, with equality if $x = x_{n + 1}$). For every sparse family $\mcD = \{D_1, \dots, D_k\}$ in our sparse partition for $n$, find $i$ maximising $f(x_{D_i})$ and add the following two sparse families to our sparse partition for $n + 1$:
\begin{align*}
\mcD' & = \mcD \cup \{D_i \cup \{n + 1\}\} \\
\mcD'' & = \{D_j \cup \{n + 1\} | j \ne i\}
\end{align*}
$\mcD''$ is clearly sparse. $\mcD'$ is also sparse because for all
$D \in \mcD$
\begin{align*}
\norm{x_{D_i \cup \{n + 1\}} - x_D}
& = \norm{x_{D_i} + x_{n + 1} - x_D} \\
& \ge f(x_{D_i} + x_{n + 1} - x_D) \\
& = f(x_{D_i}) - f(x_D) + \norm{x_{n + 1}} \\
& \ge 1
\end{align*}
The number of sparse families of size $n + 1 - 2i$ is again $\binom ni - \binom n{i - 1}$.
\end{itemize}
\end{proof}
\clearpage
\subsection{Around the Erd\H os-Ko-Rado theorem}
Erd\H os-Ko-Rado says that if $r \le \frac n2$ and $\mcA \subseteq X^{(r)}$ is intersecting, then $\abs{\mcA} \le \binom{n - 1}{r - 1}$.
\newlec
\begin{defi}[Katona's circle method]
We consider a cyclic order on $X$, namely a bijection $\pi : [n] \to X$, where $\abs X = n$, with $\pi$ and $\pi'$ identified if they differ by a cyclic shift. An {\bf arc} is a set of consecutive elements, namely a set of the form $\{\pi_a, \dots, \pi_b\}$ (with wrap-around allowed).
TODO: Insert picture
For $\pi$ a cyclic order and $\mcA$ a set family, we write $\mcA_\pi$ for the set of members of $\mcA$ that are $\pi$-arcs.
There are $(n - 1)!$ cyclic orders. For $0 < r < n$, every cyclic order $\pi$ supports exactly $n$ arcs of size $r$, and every set $A$ of size $r$ is an arc of exactly $r!(n - r)!$ cyclic orders.
\end{defi}
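As a sanity check, double counting the pairs $(A, \pi)$ with $A \in X^{(r)}$ an arc of the cyclic order $\pi$ (for $0 < r < n$) gives $\binom nr \cdot r!(n - r)! = n! = (n - 1)! \cdot n$, in agreement with the counts above.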
\begin{nlemma}\label{lem:arc-antichain}
Let $\mcA \subseteq X^{(\le \frac n2)}$ be an intersecting antichain of arcs in a cyclic order $\pi$. Then
$$\abs{\mcA} \le \min_{A \in \mcA} \abs A$$
\end{nlemma}
\begin{proof}
WLOG $\mcA$ is nonempty, $k \ge 2$ and the shortest arc in $\mcA$ is $A = \{\pi_1, \dots, \pi_k\}$. Every other arc in $\mcA$ meets $A$ without containing it or being contained in it, so it separates $\pi_i$ and $\pi_{i + 1}$ for some $i \in [k - 1]$. For each $i \in [k - 1]$, at most one arc of $\mcA$ separates $\pi_i$ and $\pi_{i + 1}$. Therefore $\abs{\mcA} - 1 \le k - 1$, namely $\abs{\mcA} \le k$.
\end{proof}
\begin{nthm}
Let $\mcA \subseteq X^{(\le \frac n2)}$ be an intersecting antichain. Then
$$\sum_{A \in \mcA} \binom{n - 1}{\abs A - 1}^{-1} \le 1$$
\end{nthm}
\begin{proof}
Consider the bipartite graph with parts $\mcA$ and the cyclic orders of $X$. Connect $A \in \mcA$ to a cyclic order $\pi$ if $A$ is a $\pi$-arc. Make the weight of the edge be $\abs A^{-1}$. Then every $A \in \mcA$ has total weight $(\abs A - 1)!(n - \abs A)!$ while each $\pi$ has weight
$$\sum_{A \in \mcA_\pi} \abs A^{-1} \le \frac{\abs{\mcA_\pi}}{\min_{A \in \mcA_\pi} \abs A} \le 1$$
by Lemma \ref{lem:arc-antichain}. So
$$\sum_{A \in \mcA} (\abs A - 1)!(n - \abs A)! \le (n - 1)!$$
Dividing by $(n - 1)!$ gives the claim.
\end{proof}
\begin{nthm}
Let $r \le \frac n2$ and $\mcA \subseteq X^{(\le r)}$ be an intersecting antichain. Then $\abs{\mcA} \le \binom{n - 1}{r - 1}$.
\end{nthm}
\begin{proof}
$$\frac{\abs{\mcA}}{\binom{n - 1}{r - 1}} \le \sum_{A \in \mcA} \binom{n - 1}{\abs A - 1}^{-1} \le 1$$
\end{proof}
$\mcA$ being intersecting is the same as $\abs{A \inter B} \ge 1$ for all $A, B \in \mcA$. We say $\mcA$ is {\bf $\ell$-intersecting} if $\abs{A \inter B} \ge \ell$ for all $A, B \in \mcA$.
We know the size of an intersecting antichain in $X^{(r)}$ is maximised by taking all sets of size $r$ containing a fixed element $x$. We would hope that the size of an $\ell$-intersecting antichain in $X^{(r)}$ is maximised by taking all sets of size $r$ containing a fixed set $R$ of size $\ell$ (which would give $\binom{n - \ell}{r - \ell}$ as an upper bound). This is however not true for small $n$. The reason is that it might be better to make $R$ a bit bigger than $\ell$ and in return ask for the sets in the antichain to intersect $R$ into a bit more than $\ell$ elements (rather than containing it).
Precisely, pick your favorite number $t$ and your favorite set $R$ of size $\ell + 2t$. Now define
$$\mcF_t = \{A \in X^{(r)} \mid \abs{A \inter R} \ge \ell + t\}$$
For every $A, B \in \mcF_t$,
$$\abs{A \inter B} \ge \abs{A \inter B \inter R} = \abs{A \inter R} + \abs{B \inter R} - \abs{(A \union B) \inter R} \ge 2\ell + 2t - (\ell + 2t) = \ell$$
So $\mcF_t$ is $\ell$-intersecting. On the other hand,
$$\abs{\mcF_t} > \binom{n - \ell}{r - \ell}$$
for values of $n$ on the same order of magnitude as $r$ and $\ell$.
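For instance, take $n = 8$, $r = 4$, $\ell = 2$, $t = 1$ and $\abs R = 4$: then $\abs{\mcF_1} = \binom 43\binom 41 + \binom 44\binom 40 = 17 > 15 = \binom{n - \ell}{r - \ell}$.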
\begin{nthm}
Let $1 \le \ell < r$ be fixed. If $n$ is large enough and $\mcA \subseteq X^{(r)}$ is an $\ell$-intersecting family, then $\abs{\mcA} \le \binom{n - \ell}{r - \ell}$.
\end{nthm}
\begin{proof}
We may assume $\mcA$ is maximal. In particular, find $A_1, A_2 \in \mcA$ such that
$$\abs{A_1 \inter A_2} = \ell$$
If $\abs{\Inter_{A \in \mcA} A} \ge \ell$, we're done. So find some $A_3 \in \mcA$ such that
$$\abs{A_1 \inter A_2 \inter A_3} < \ell$$
Set $U = A_1 \union A_2 \union A_3$. Then $\abs U \le 3r$ and every $A \in \mcA$ intersects $U$ in at least $\ell + 1$ elements: if $\abs{A \inter U} = \ell$, then $A \inter A_1 = A \inter U$, so $A \inter U \subseteq A_1$ and similarly $A \inter U \subseteq A_2, A_3$, contradicting $\abs{A_1 \inter A_2 \inter A_3} < \ell$. Hence
$$\abs\mcA \le \binom{3r}{\ell + 1}\binom n{r - (\ell + 1)} \le \binom{n - \ell}{r - \ell}$$
for large enough $n$.
\end{proof}
\newlec
What if $2r \ge n$ and $\mcF \subseteq X^{(r)}$ is such that $A \union B \ne X$ for all $A, B \in \mcF$? This is just Erd\H os-Ko-Rado again by taking complements: $\{A^c \mid A \in \mcF\} \subseteq X^{(n - r)}$ is an intersecting antichain with $2(n - r) \le n$, so
$$\abs{\mcF} = \abs{\{A^c \mid A \in \mcF\}} \le \binom{n - 1}{(n - r) - 1} = \binom{n - 1}r$$
What happens now if $kr \ge n$ and no $k$ sets in $\mcF$ have union $X$? For $k = 2$, this is Erd\H os-Ko-Rado. We can take $\mcF = (\{1\}^c)^{(r)}$, the $r$-sets avoiding the element $1$, to achieve
$$\abs\mcF = \binom{n - 1}r$$
so the best we can hope for is $\abs\mcF \le \binom{n - 1}r$.
\begin{nthm}[Frankl]\label{thm:k-unioning}
Let $k, r, n$ be such that $n \le kr$ and $\mcF \subseteq X^{(r)}$ be such that no union of $k$ sets in $\mcF$ is $X$. Then
$$\abs{\mcF} \le \binom{n - 1}r$$
This bound is sharp.
\end{nthm}
\begin{proof}
Every $F \in \mcF$ is an arc in exactly $r!(n - r)!$ cyclic orders. If we knew that each cyclic order supports at most $n - r$ elements of $\mcF$, we would get
$$\abs\mcF \, r!(n - r)! = \#\{(F, \pi) \mid F \in \mcF_\pi\} \le (n - 1)!(n - r)$$
namely $\abs\mcF \le \binom{n - 1}r$. So let's prove that. \\
Let $\pi$ be a cyclic order; WLOG $\mcF_\pi \ne \emptyset$, $\pi$ is the natural cyclic order on $[n]$ and $\doublesqbrack{n - r + 1, n} \in \mcF_\pi$. Define
$$K = \{\text{last element of } F \mid F \in \mcF_\pi\} \union \doublesqbrack{n + 1, kr}$$
For each $0 < j \le r$, we know that $\{j, j + r, \dots, j + (k - 1)r\} \not\subseteq K$: otherwise the arcs ending at the elements of this set that lie in $[n]$, together with $\doublesqbrack{n - r + 1, n}$, would be at most $k$ members of $\mcF$ whose union is $X$. Hence
$$\abs{\mcF_\pi} = \abs K + n - kr \le (k - 1)r + n - kr = n - r$$
\end{proof}
\begin{ncor}
Let $k, r, n$ be such that $r \le \frac{k - 1}k n$ and $\mcA \subseteq X^{(r)}$ be such that any $k$ sets in $\mcA$ have a common element. Then $\abs\mcA \le \binom{n - 1}{r - 1}$.
\end{ncor}
\begin{proof}
Set $\mcF = \{A^c \mid A \in \mcA\}$. Then Theorem \ref{thm:k-unioning} applies and gives
$$\abs\mcA = \abs\mcF \le \binom{n - 1}{n - r} = \binom{n - 1}{r - 1}$$
\end{proof}
\clearpage
\subsection{The Vertex Isoperimetric Problem in the Cube and the Kruskal-Katona Theorem}
We are interested in the isoperimetric problem in the discrete cube.
Define $Q_n$ to be the graph with vertices $2^{[n]}$ and
$$x \sim y \iff \abs{x \symdif y} = 1$$
$\abs{x \symdif y}$ is more generally the distance between $x$ and $y$ in $Q_n$. Given $A \subseteq Q_n$, write
$$N(A) = A \union \Gamma(A) = \{x \in Q_n \mid d(x, A) \le 1\}$$
the {\bf neighborhood} of $A$.
\begin{question}
How small can $\abs{N(A)}$ be when $\abs A = \alpha$?
\end{question}
Our life will be made incredibly easier here by the fact that the extremal $A$ are {\it nested}. That is to say we will find $I_0 \subseteq I_1 \subseteq \dots$ such that $\abs{I_\alpha} = \alpha$ and $\abs{N(A)} \ge \abs{N(I_\alpha)}$ for all $A$ of size $\alpha$.
This is to compare to the usual isoperimetric inequality in $\R^d$, where the extremal sets (ie balls) are also nested. In both cases, this means we can make any set look more and more like an extremal set by {\it thickening} or {\it compressing} it.
TODO: Insert potatoes
\newlec
A nested sequence of sets is equivalent to the sequence of initial segments of some order. What order do we want here?
For $x, y \in X^{(r)}$, say $x$ is less than $y$ in the {\bf lexicographic order}, written $x \lexlt y$, if $\min(x \symdif y) \in x$. The slogan is ``Keep small elements small''. For $n = 5, r = 3$, the lex order is
$$123 < 124 < 125 < 134 < 135 < 145 < 234 < 235 < 245 < 345$$
We extend the lex orders on the $X^{(r)}$ to the {\bf simplicial order} on $Q_n$ by
$$x < y \iff \abs x < \abs y \text{ or } (\abs x = \abs y \text{ and } x \lexlt y)$$
The simplicial order on $Q_5$ starts
$$\emptyset, 1, 2, 3, 4, 5, 12, 13, 14, 15, 23, \dots$$
\begin{defi}
For $A \subseteq Q_n$, define
\begin{align*}
A_+^{(i)} & = \{x \mid i \nin x, x \union \{i\} \in A\} \\
A_-^{(i)} & = \{x \mid i \nin x, x \in A\}
\end{align*}
The {\bf $i$-compression} $C_i(A)$ of $A$ is the set $B$ such that $B_+^{(i)}$ and $B_-^{(i)}$ are the initial segments of the simplicial order (on the subsets of $X \setminus \{i\}$) with $\abs{B_\pm^{(i)}} = \abs{A_\pm^{(i)}}$.
\end{defi}
\begin{nlemma}\label{lem:card-nhds-compress}
$$\abs{N(C_i(A))} \le \abs{N(A)}$$
\end{nlemma}
\begin{proof}
Write $B = C_i(A)$.
\begin{align*}
\abs{N(B)}
& = \abs{N(B)_+^{(i)}} + \abs{N(B)_-^{(i)}} \\
& = \abs{N\left(B_+^{(i)}\right) \union B_-^{(i)}} + \abs{N\left(B_-^{(i)}\right) \union B_+^{(i)}} \\
& = \max\left(\abs{N\left(B_+^{(i)}\right)}, \abs{B_-^{(i)}}\right) + \max\left(\abs{N\left(B_-^{(i)}\right)}, \abs{B_+^{(i)}}\right) \text{ by nestedness} \\
& = \max\left(\abs{N\left(B_+^{(i)}\right)}, \abs{A_-^{(i)}}\right) + \max\left(\abs{N\left(B_-^{(i)}\right)}, \abs{A_+^{(i)}}\right) \\
& \le \max\left(\abs{N\left(A_+^{(i)}\right)}, \abs{A_-^{(i)}}\right) + \max\left(\abs{N\left(A_-^{(i)}\right)}, \abs{A_+^{(i)}}\right) \text{ by induction on $n$, as $B_\pm^{(i)}$ are initial segments of sizes $\abs{A_\pm^{(i)}}$} \\
& \le \abs{N(A)_+^{(i)}} + \abs{N(A)_-^{(i)}} \\
& = \abs{N(A)}
\end{align*}
\end{proof}
\begin{nlemma}\label{lem:eventually-compress}
For every $A \subseteq Q_n$, there exists a {\bf compressed set} $B$ (namely $C_i(B) = B$ for all $i$) such that $\abs B = \abs A$ and $\abs{N(B)} \le \abs{N(A)}$.
\end{nlemma}
\begin{proof}
Repeatedly $i$-compress $A$ for various $i$. This must terminate since $i$-compressing, whenever it changes $A$, moves $A$ strictly down in the lex order on $Q_n^{(\abs A)}$ (the lex order on $\abs A$-element subsets of $Q_n$, with $Q_n$ itself ordered by the simplicial order).
\end{proof}
\newlec
We would be extremely lucky if the only compressed sets were initial segments of the simplicial order. We are merely incredibly lucky and there is exactly one exceptional set which is compressed but not an initial segment.
Define the {\bf half-space} $H_n = I_{2^{n - 1}}$, and the {\bf exceptional set} $E_n = (H_n \setminus \{\max H_n\}) \union \{\min H_n^c\}$, which is $H_n$ with its last element $\max H_n$ replaced by $\min H_n^c$, the next element in the simplicial order (which also happens to be the complement of $\max H_n$).
\begin{itemize}
\item If $n = 2k + 1$, then $H_n = X^{(\le k)}$ and
$$E_n = H_n \setminus \{\{k + 2, \dots, 2k + 1\}\} \union \{\{1, \dots, k + 1\}\}$$
\item If $n = 2k$, then $H_n = X^{(\le k - 1)} \union \{x \union \{1\} \mid x \in (\{1\}^c)^{(k - 1)}\}$ and
$$E_n = H_n \setminus \{\{1, k + 2, \dots, 2k\}\} \union \{\{2, \dots, k + 1\}\}$$
\end{itemize}
In both cases, $E_n$ is compressed but not an initial segment. We also notice that the element we remove is the complement of the element we add.
\begin{nlemma}\label{lem:exceptional-set}
If $A \subseteq Q_n$ is compressed but not an initial segment, then $A = E_n$.
\end{nlemma}
\begin{proof}
For every $x \nin A$ and $y \in A$ such that $x < y$, we see that $i \in x \iff i \nin y$ for all $i$, as otherwise $A$ wouldn't be $i$-compressed. Hence $x$ and $y$ are complements. In particular, such $x$ and $y$ are unique. Since $A$ is not an initial segment, such $x$ and $y$ exist. So
$$A = \text{initial segment} \setminus \{\text{last}\} \union \{\text{next}\}$$
where $\text{last}^c = \text{next}$. But there is only one element which follows its complement in the simplicial order. So $A = E_n$ as claimed.
\end{proof}
\begin{rmk}
$$\abs{N(E_n)} = \abs{N(H_n)} + \floor{\frac{n - 1}2}$$
\end{rmk}
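For instance, for $n = 3$ we have $H_3 = X^{(\le 1)}$ and $E_3 = \{\emptyset, \{1\}, \{2\}, \{1, 2\}\}$, with $\abs{N(H_3)} = 7$ and $\abs{N(E_3)} = 8$.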
\begin{nthm}[Vertex Isoperimetric Inequality]
If $A \subseteq Q_n$, then $\abs{N(A)} \ge \abs{N(I_{\abs A})}$.
\end{nthm}
\begin{proof}
Apply Lemmas \ref{lem:card-nhds-compress}, \ref{lem:eventually-compress} and \ref{lem:exceptional-set}; the exceptional set $E_n$ is handled by the preceding remark, since $\abs{E_n} = \abs{H_n}$ and $\abs{N(E_n)} \ge \abs{N(H_n)}$.
\end{proof}
\begin{defi}
For $\mcA \subseteq \mcP(X)$, define the {\bf lower shadow} of $\mcA$
$$\underline\partial\mcA = \{A \setminus \{a\} \mid a \in A \in \mcA\}$$
and the {\bf upper shadow} of $\mcA$
$$\overline\partial\mcA = \{A \union \{a\} \mid a \nin A \in \mcA\}$$
\end{defi}
TODO: Insert graded potato
\begin{question}
How small can $\abs{\underline\partial\mcA}$ be when $\mcA \subseteq X^{(r)}$ and $\abs\mcA = a$?
\end{question}
From Lemma \ref{lem:weight-biregular}, we know the bound $\abs{\underline\partial\mcA} \ge \frac r{n - r + 1}\abs\mcA$, but we can do much better now that we have the vertex isoperimetric inequality. Indeed,
$$N\left(X^{(\ge r + 1)} \union \mcA\right) = X^{(\ge r)} \union \underline\partial\mcA$$
and $X^{(\ge r)}$ is disjoint from $\underline\partial\mcA$.
TODO: Insert graded potato
\newlec
Recall we defined $x \lexlt y$ to mean $\min(x \symdif y) \in x$. Define the {\bf colexicographic order} by
$$x \colexlt y \iff \max(x \symdif y) \in y$$
The slogan this time is ``Keep big elements small''. The colex order for $n = 5, r = 3$ is
$$123, 124, 134, 234, 125, 135, 235, 145, 245, 345$$
The colexicographic order is dual to the lexicographic order under complementation and reversal of the ground set: writing $\rho(i) = n + 1 - i$,
$$x \colexlt y \iff \rho(x)^c \lexlt \rho(y)^c$$
In particular, the complements (after relabelling by $\rho$) of a colex initial segment of $X^{(r)}$ form a lex initial segment of $X^{(n - r)}$.
Since we have proved the vertex isoperimetric inequality for the lex order, we will reuse it rather than the colex order (hence we bound $\overline\partial$ instead of $\underline\partial$). Define
\begin{align*}
I_\alpha^{(r)} & = \text{initial segment of size $\alpha$ in $X^{(r)}$ in colex} \\
J_\alpha^{(r)} & = \text{initial segment of size $\alpha$ in $X^{(r)}$ in lex}
\end{align*}
We observe that initial segments of the simplicial order are of the form $X^{(\le r - 1)} \union J_a^{(r)}$.
\begin{nthm}[Kruskal 1963, Katona 1968]
If $\mcA \subseteq X^{(r)}$, then $\abs{\underline\partial\mcA} \ge \abs{\underline\partial I_{\abs\mcA}^{(r)}}$.
\end{nthm}
\begin{proof}
By taking complements and using the duality above, it suffices to prove the analogous statement for the upper shadow: $\abs{\overline\partial\mcA} \ge \abs{\overline\partial J_{\abs\mcA}^{(r)}}$ for every $\mcA \subseteq X^{(r)}$.
\begin{align*}
\abs{X^{(\le r)}} + \abs{\overline\partial\mcA}
& = \abs{X^{(\le r)} \union \overline\partial\mcA} \\
& = \abs{N\left(X^{(\le r - 1)} \union \mcA\right)} \\
& \ge \abs{N\left(X^{(\le r - 1)} \union J_{\abs\mcA}^{(r)}\right)} \text{ by the vertex isoperimetric inequality} \\
& = \abs{X^{(\le r)} \union \overline\partial J_{\abs\mcA}^{(r)}} \\
& = \abs{X^{(\le r)}} + \abs{\overline\partial J_{\abs\mcA}^{(r)}}
\end{align*}
So $\abs{\overline\partial\mcA} \ge \abs{\overline\partial J_{\abs\mcA}^{(r)}}$.
\end{proof}
\clearpage
\subsection{Sums of sets}
Let $G$ be an additive abelian group, $A, B \subseteq G$. Define
$$A + B = \{a + b \mid a \in A, b \in B\}$$
This is called {\bf pointwise addition} of sets or the {\bf Minkowski sum}.
\begin{nthm}[Brunn-Minkowski]
Let $A, B \subseteq \R^n$ be nonempty compact sets, and write $\abs\cdot$ for Lebesgue measure. Then
$$\abs{A + B}^{\frac 1n} \ge \abs A^{\frac 1n} + \abs B^{\frac 1n}$$
\end{nthm}
\begin{proof}[Sketch proof]
If our sets are nice, eg finite disjoint unions of rectangles, the question becomes combinatorial; the general case follows by approximation.
\end{proof}
If $A, B$ are finite nonempty sets of integers, say $A = \{a_1, \dots, a_k\}, a_1 < \dots < a_k$ and $B = \{b_1, \dots, b_\ell\}, b_1 < \dots < b_\ell$, then $\abs{A + B} \ge \abs A + \abs B - 1$ as
$$a_1 + b_1 < \dots < a_1 + b_\ell < \dots < a_k + b_\ell$$
are $k + \ell - 1$ distinct elements of $A + B$.
If $A, B$ are finite nonempty sets in a finite abelian group $G$ such that $\abs A + \abs B > \abs G$, then $A + B = G$. Indeed, for every $x \in G$, we have $\abs{x - A} + \abs B > \abs G$, so there must be some $b \in (x - A) \inter B$. Writing $b = x - a$ for some $a \in A$, we see that $x = a + b \in A + B$.
\newlec
\begin{nthm}[Cauchy 1813, Davenport 1935]\label{thm:cd}
If $p$ is prime and $A, B$ are finite nonempty sets in $\Z/p\Z$ such that $\abs A + \abs B \le p + 1$, then
$$\abs{A + B} \ge \abs A + \abs B - 1$$
\end{nthm}
\begin{proof}
Apply induction on $\abs A$.
\begin{itemize}
\item If $\abs A = 1$, say $A = \{a\}$, then
$$\abs{A + B} = \abs{a + B} = \abs B = \abs A + \abs B - 1$$
\item If $\abs A \ge 2$, we may assume $0 \in A \inter B$ and find some $a \ne 0$ such that $a \in A$. Since $a, 2a, \dots, pa$ are all distinct elements of $\Z/p\Z$, and $\abs B \le p + 1 - \abs A < p$, we can find some $k$ such that $ka \in B$ but $(k + 1)a \nin B$. Replacing $B$ by $B - ka$, we may assume $0 \in A \inter B$ and $a \in A \setminus B$, so that $1 \le \abs{A \inter B} < \abs A$. By the induction hypothesis and observing that $(A \union B) + (A \inter B) \subseteq A + B$,
$$\abs{A + B} \ge \abs{(A \union B) + (A \inter B)} \ge \abs{A \union B} + \abs{A \inter B} - 1 = \abs A + \abs B - 1$$
\end{itemize}
\end{proof}
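The bound is attained by arithmetic progressions with a common difference: in $\Z/7\Z$, $A = \{0, 1\}$ and $B = \{0, 1, 2\}$ give $A + B = \{0, 1, 2, 3\}$, of size $\abs A + \abs B - 1$.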
\begin{ncor}\label{cor:cd-n-ary}
Let $A_1, \dots, A_k$ be nonempty sets in $\Z/p\Z$ for $p$ prime. Then
$$\abs{A_1 + \dots + A_k} \ge \min(\abs{A_1} + \dots + \abs{A_k} - (k - 1), p)$$
\end{ncor}
\begin{proof}
Trivial induction on $k$ from Theorem \ref{thm:cd}.
\end{proof}
\begin{nthm}[Macbeath 1953, Kneser 1955]
Let $A, B$ be nonempty subsets of a finite abelian group $G$ such that $\abs A + \abs B \le \abs G$. Then $G$ has a proper subgroup $H$ such that
$$\abs{A + B} \ge \abs A + \abs B - \abs H$$
\end{nthm}
\begin{proof}
Induction on $\abs B$.
\begin{itemize}
\item If $\abs B = 1$, say $B = \{b\}$, then one can take $H = \{0\}$ so that
$$\abs{A + B} = \abs{A + b} = \abs A = \abs A + \abs B - \abs H$$
\item If $\abs B \ge 2$, we may assume $0 \in A \inter B$. We will have two cases. The first one is essentially trivial while the second one is the heart of the proof.
{\bf Case 1: $A + B - B = A$} \\
Set $H = \langle B\rangle$.
$$\abs{A + B} \ge \abs A \ge \abs A + \abs B - \abs H$$
and $H \ne G$ since $A + H = A \ne G$.
{\bf Case 2: $A + B - B \ne A$} \\
Find $a \in A$ and $b, c \in B$ such that $a + b - c \nin A$. Set $d = a - c$.
\begin{claim}
$$1 \le \abs{A \inter (B + d)} < \abs B$$
\end{claim}
\begin{proof}
$b + d \in B + d$ but $b + d = a + b - c \nin A$, so $\abs{A \inter (B + d)} < \abs{B + d} = \abs B$. On the other hand, $a = c + d \in A \inter (B + d)$.
\end{proof}
Hence by the induction hypothesis, find some strict subgroup $H$ such that
$$\abs{A \union (B + d) + A \inter (B + d)} \ge \abs{A \union (B + d)} + \abs{A \inter (B + d)} - \abs H$$
Since $A \union (B + d) + A \inter (B + d) \subseteq A + B + d$, we get
\begin{align*}
\abs{A + B}
& = \abs{A + B + d} \\
& \ge \abs{A \union (B + d) + A \inter (B + d)} \\
& \ge \abs{A \union (B + d)} + \abs{A \inter (B + d)} - \abs H \\
& = \abs A + \abs{B + d} - \abs H \\
& = \abs A + \abs B - \abs H \\
\end{align*}
\end{itemize}
\end{proof}
\newlec
\begin{nthm}[Erd\H os-Ginzburg-Ziv, 1961]
Let $a_1, \dots, a_{2n - 1} \in \Z/n\Z$. Then there exists a subsequence of length $n$ summing to $0$.
\end{nthm}
\begin{proof}~\\
{\bf Case 1: $n = p$ is prime} \\
Reorder so that $0 \le a_1 \le \dots \le a_{2p - 1} < p$ (viewing the $a_i$ as integers). If $a_i = a_{i + p - 1}$ for some $i$, then $a_i = a_{i + 1} = \dots = a_{i + p - 1}$ and these $p$ terms sum to $pa_i = 0$. Otherwise, set $A_i = \{a_i, a_{i + p - 1}\}$ for $i = 1, \dots, p - 1$ and $A_p = \{a_{2p - 1}\}$. Then Corollary \ref{cor:cd-n-ary} gives us
$$\abs{A_1 + \dots + A_p} \ge \min(2p - 1 - (p - 1), p) = p$$
Hence every $b \in \Z/p\Z$, in particular $0$, is the sum of a subsequence of length $p$.
{\bf Case 2: $n$ is composite} \\
Instead of looking at sequences in $\Z/n\Z$ which sum to $0$, look at sets in $\Z$ whose sum is divisible by $n$. Induction on the number of prime factors. \\
We know the result for $n$ a prime, so write $n = ab$ where $a, b < n$ and assume the result for $a$ and $b$. By the induction hypothesis for $b$, repeatedly pick disjoint sets $S_1, \dots, S_{2a - 1}$ of size $b$ whose sums are divisible by $b$. This is possible since after having picked $k \le 2a - 2$ such sets, there are $2n - 1 - kb \ge 2n - 1 - (2a - 2)b = 2b - 1$ elements left. Now, writing $\Sigma S$ for the sum of the elements of $S$, apply the induction hypothesis for $a$ to
$$\frac{\Sigma S_1}b, \dots, \frac{\Sigma S_{2a - 1}}b$$
to find $i_1, \dots, i_a$ such that
$$\frac{\Sigma S_{i_1}}b + \dots + \frac{\Sigma S_{i_a}}b$$
is divisible by $a$. But then $\Sigma S_{i_1} + \dots + \Sigma S_{i_a}$, a sum of $ab = n$ of the original elements, is divisible by $ab = n$.
\end{proof}
\begin{rmk}
This is sharp since no subsequence of length $n$ sums to $0$ in
$$\underbrace{0, \dots, 0}_{n - 1\text{ times}}, \underbrace{1, \dots, 1}_{n - 1\text{ times}}$$
\end{rmk}
\clearpage
\section{Polynomials in Combinatorics}
\subsection{Alon's Combinatorial Nullstellensatz}
Let $\F$ be a field, sometimes algebraically closed, often finite. Write
$$\F[X] = \F[X_1, \dots, X_n]$$
\begin{thm}[Hilbert's Nullstellensatz]
Let $\F$ be an algebraically closed field and $I$ be an ideal of $\F[X]$. Write
$$V(I) = \{x \in \F^n \mid \for f \in I, f(x) = 0\}$$
If $f \in \F[X]$ vanishes on $V(I)$ then $f^k \in I$ for some $k$.
\end{thm}
For finitely generated ideals, we have two versions.
\begin{thm}[Weak version of Hilbert's Nullstellensatz for finitely generated ideals]
Let $f_1, \dots, f_m \in \F[X]$ be polynomials with no common zeroes. Then
$$\langle f_1, \dots, f_m\rangle = \F[X]$$
\end{thm}
\begin{thm}[Strong version of Hilbert's Nullstellensatz for finitely generated ideals]
Let $f, f_1, \dots, f_m \in \F[X]$. If $f$ vanishes on $V(\langle f_1, \dots, f_m\rangle)$, then $f^k \in \langle f_1, \dots, f_m\rangle$ for some $k$, ie $f^k = g_1f_1 + \dots + g_mf_m$ for some $g_1, \dots, g_m \in \F[X]$.
\end{thm}
For some time it was thought that the strong version was harder than the weak version. But in fact they're equivalent.
\begin{proof}[Proof that weak version $\imp$ strong version]
Add a variable $X_0$ and a polynomial $f_0 = fX_0 - 1$. By construction, $f_0, \dots, f_m$ have no common zeroes. So
$$\langle f_0, \dots, f_m\rangle = \F[X_0, \dots, X_n]$$
In particular, we can find $g_0, \dots, g_m \in \F[X_0, \dots, X_n]$ such that
$$1 = g_0(fX_0 - 1) + g_1f_1 + \dots + g_m f_m$$
Substituting $X_0 = \frac 1f$ (working in the field of fractions), we get
$$1 = \sum_{i = 1}^m g_i\left(\frac 1f, X_1, \dots, X_n\right)f_i = \frac{\sum_{i = 1}^m h_if_i}{f^k}$$
for some integer $k$ and polynomials $h_i$, namely
$$f^k = \sum_{i = 1}^m h_if_i$$
\end{proof}
\newlec
\begin{nthm}[Alon's Combinatorial Nullstellensatz, 1995]
Let $\F$ be a field and $f \in \F[X_1, \dots, X_n]$ have degree $d = \sum_{i = 1}^n d_i$. Let $S_1, \dots, S_n \subseteq \F$ be such that $\abs{S_i} > d_i$. If the monomial $X_1^{d_1} \dots X_n^{d_n}$ has non-zero coefficient in the expansion of $f$, then $f$ is non-zero somewhere on $S_1 \times \dots \times S_n$.
\end{nthm}
\begin{proof}[Proof (Michałek)]
Note that
$$X^k = (X - t)(X^{k - 1} + tX^{k - 2} + \dots + t^{k - 1}) + t^k$$
Induction on $d$.
\begin{itemize}
\item Obvious for $d = 0$.
\item If $d \ge 1$, say $d_1 \ge 1$, pick $s_1 \in S_1$. Then $f = (X_1 - s_1)q(X) + r(X)$ where $\deg q \le d - 1$ and $r \in \F[X_2, \dots, X_n]$; moreover $X_1^{d_1 - 1}X_2^{d_2} \dots X_n^{d_n}$ appears in $q$ with the same non-zero coefficient as $X_1^{d_1} \dots X_n^{d_n}$ has in $f$. Assume that $f$ vanishes on $S_1 \times \dots \times S_n$. Then $r$ vanishes on $\{s_1\} \times S_2 \times \dots \times S_n$. But $r$ does not depend on $X_1$, so in fact $r$ vanishes on $S_1 \times \dots \times S_n$. Therefore $q$ is zero on $(S_1 \setminus \{s_1\}) \times S_2 \times \dots \times S_n$, which contradicts the induction hypothesis.
\end{itemize}
\end{proof}
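The hypothesis $\abs{S_i} > d_i$ cannot be dropped: the polynomial $f = \prod_{s \in S_1}(X_1 - s)$ contains the monomial $X_1^{\abs{S_1}}$ with coefficient $1$, yet vanishes identically on $S_1 \times \dots \times S_n$.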
\begin{nthm}[Chevalley-Warning]
Let $\F$ be a finite field and
$$f_1, \dots, f_m \in \F[X_1, \dots, X_n]$$
be such that $\sum_{i = 1}^m \deg f_i < n$. If the $f_i$ have a common zero, then they have another one.
\end{nthm}
\begin{proof}
Write $q$ the order of $\F$. Recall that for $z \in \F$
$$z^{q - 1} = \begin{cases*}
0 \text{ if } z = 0 \\
1 \text{ if } z \ne 0
\end{cases*}$$
Assume (by shifting) that the common zero of the $f_i$ is $0$. Define
$$f = \prod_{i = 1}^m (1 - f_i^{q - 1}) + \gamma \prod_{i = 1}^n \prod_{s \ne 0} (X_i - s)$$
where we choose $\gamma \ne 0$ such that $f(0) = 0$.
$$\deg f \le \max((n - 1)(q - 1), n(q - 1)) = n(q - 1)$$
and the monomial $X_1^{q - 1} \dots X_n^{q - 1}$ has coefficient $\gamma \ne 0$ in $f$ (the first product has degree less than $n(q - 1)$). So the Combinatorial Nullstellensatz (with $S_i = \F$) gives us $a$ such that $f(a) \ne 0$. Then $a \ne 0$ since $f(0) = 0$; hence the second product vanishes at $a$, so $\prod_{i = 1}^m (1 - f_i(a)^{q - 1}) \ne 0$, forcing $f_1(a) = \dots = f_m(a) = 0$.
\end{proof}
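For instance, since $X_1^2 + X_2^2 + X_3^2$ has degree $2 < 3$ and vanishes at $0$, every finite field contains a non-trivial solution of $x_1^2 + x_2^2 + x_3^2 = 0$.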
\begin{nthm}
Let $\F$ be a finite field of characteristic $p$. Let $f \in \F[X_1, \dots, X_n]$. If $\deg f < n$, then the number of zeroes of $f$ is a multiple of $p$.
\end{nthm}
\begin{proof}
Denote $q$ the order of $\F$ and $N(f)$ the number of zeroes of $f$. Working in $\F$, we have
$$N(f) \cdot 1 = \sum_x(1 - f(x)^{q - 1}) = - \sum_x f(x)^{q - 1}$$
since $\sum_x 1 = q^n \cdot 1 = 0$. Expand $f(x)^{q - 1}$ into monomials $\prod_i x_i^{e_i}$ of total degree at most $(n - 1)(q - 1)$. In each such monomial, some $e_i$ is not a positive multiple of $q - 1$ (else the total degree would be at least $n(q - 1)$), so $\sum_{t \in \F} t^{e_i} = 0$ and hence $\sum_x \prod_i x_i^{e_i} = \prod_i \sum_{t \in \F} t^{e_i} = 0$. Therefore $N(f) \cdot 1 = 0$ in $\F$, ie $p \mid N(f)$.
\end{proof}
\newlec
\begin{nthm}
Let $f_1, \dots, f_m \in \F_p[X_1, \dots, X_n]$ such that $\sum_i \deg f_i < n$. Then $p$ divides the number of common zeroes. In particular, if there is a common zero, then there is another one.
\end{nthm}
\begin{proof}
Let $z_1, \dots, z_k \in \F_p^n$ be the common zeroes and suppose $p \nmid k$. Define
$$f = \underbrace{\prod_{i = 1}^m (1 - f_i^{p - 1})}_g - \underbrace{\sum_{j = 1}^k \prod_{i = 1}^n (1 - (X_i - z_{i, j})^{p - 1})}_h$$
where $z_{i, j}$ denotes the $i$-th coordinate of $z_j$. $f$ vanishes everywhere on $\F_p^n$: for each $j$,
$$f(z_j) = g(z_j) - h(z_j) = 1 - 1 = 0$$
and if $z \ne z_1, \dots, z_k$, then
$$f(z) = g(z) - h(z) = 0 - 0 = 0$$
But $\deg g \le (p - 1)\sum_i \deg f_i < n(p - 1)$, so the coefficient of $X_1^{p - 1} \dots X_n^{p - 1}$ in $f$ comes from $h$ alone and equals $-(-1)^n k \ne 0$, contradicting Alon's Combinatorial Nullstellensatz.
\end{proof}
\clearpage
\subsection{Applications of Alon's Combinatorial Nullstellensatz}
\begin{nthm}[Cauchy-Davenport for primes]
Let $p$ be a prime and $A, B$ sets in $\F_p$ such that $1 \le \abs A \le \abs B$ and $\abs A + \abs B \le p + 1$. Then
$$\abs{A + B} \ge \abs A + \abs B - 1$$
\end{nthm}
\begin{proof}
Assume not. Then find $C$ of size $\abs A + \abs B - 2$ such that $A + B \subseteq C$. Define $f \in \F_p[X, Y]$ by
$$f(X, Y) = \prod_{c \in C}(X + Y - c)$$
Then $f$ vanishes on $A \times B$. But the coefficient of $X^{\abs A - 1}Y^{\abs B - 1}$ in $f$ is
$$\binom{\abs A + \abs B - 2}{\abs A - 1} \ne 0 \mod p$$
contradicting Alon's Combinatorial Nullstellensatz.
\end{proof}
\begin{nthm}[Erd\H os-Ginzburg-Ziv for primes]
Let $a_1, \dots, a_{2p - 1} \in \F_p$. Then there exists $I \in [2p - 1]^{(p)}$ such that $\sum_{i \in I} a_i = 0$.
\end{nthm}
\begin{proof}
Define $f_1 = \sum_{i = 1}^{2p - 1} X_i^{p - 1}, f_2 = \sum_{i = 1}^{2p - 1} a_iX_i^{p - 1}$. Then $\deg f_1 + \deg f_2 < 2p - 1$, so if $f_1$ and $f_2$ have a common zero, then they have another one. But $f_1(0) = f_2(0) = 0$, so find $z \ne 0$ another common zero. Define $I = \{i \mid z_i \ne 0\}$. Then
\begin{align*}
\abs I & = \sum_{i \in I} z_i^{p - 1} = f_1(z) = 0 \\
\sum_{i \in I} a_i & = \sum_{i \in I} a_i z_i^{p - 1} = f_2(z) = 0
\end{align*}
so $p \mid \abs I$. Since $I \ne \emptyset$ ($z \ne 0$) and $\abs I < 2p$, it must be that $\abs I = p$.
\end{proof}
Define
$$A \hatplus B = \{a + b \mid a \in A, b \in B, a \ne b\}$$
For $A$ an interval in $\Z$ with at least two elements, $\abs{A \hatplus A} = 2\abs A - 3$. The Erd\H os-Heilbronn conjecture says that this is best possible in $\Z/p\Z$ as well.
\newlec
\begin{nthm}[Alon-Nathanson-Ruzsa]
Let $A, B$ be sets in $\F_p$ such that $1 \le \abs A < \abs B$ and $\abs A + \abs B \le p + 2$. Then
$$\abs{A \hatplus B} \ge \abs A + \abs B - 2$$
\end{nthm}
\begin{proof}
Assume $\abs{A \hatplus B} \le \abs A + \abs B - 3$. Then find $C \supseteq A \hatplus B$ such that $\abs C = \abs A + \abs B - 3$. Set
$$f = (Y - X)\prod_{c \in C} (X + Y - c)$$
$f$ is identically zero on $A \times B$ and $\deg f = a + b - 2$, where $a = \abs A$ and $b = \abs B$. But the coefficient of $X^{a - 1}Y^{b - 1}$ in $f$ is
$$\binom{a + b - 3}{a - 1} - \binom{a + b - 3}{b - 1} = \frac{b - a}{a - 1}\binom{a + b - 3}{a - 2} \ne 0$$
(for $a = 1$ the coefficient is simply $1$),
contradicting Alon's Combinatorial Nullstellensatz.
\end{proof}
\begin{ncor}
Let $A, B$ be nonempty sets in $\F_p$ such that $\abs A + \abs B \le p + 3$. Then
$$\abs{A \hatplus B} \ge \abs A + \abs B - 3$$
\end{ncor}
\begin{proof}
WLOG $\abs A \le \abs B$. If $\abs A = 1$, say $A = \{a\}$, then $A \hatplus B = a + (B \setminus \{a\})$ has size at least $\abs B - 1 \ge \abs A + \abs B - 3$. So assume $2 \le \abs A$ and find $a \in A$. Then
$$\abs{A \hatplus B} \ge \abs{(A \setminus \{a\}) \hatplus B} \ge \abs{A \setminus \{a\}} + \abs B - 2 = \abs A + \abs B - 3$$
\end{proof}
\begin{nthm}
Let $p$ be a prime and $G$ be a graph of average degree $> 2p - 2$ and maximum degree at most $2p - 1$. Then $G$ has a $p$-regular subgraph.
\end{nthm}
\begin{proof}
Write $E$ the edges, $V$ the vertices of $G$, and $m = \abs E, n = \abs V$. Consider
$$f = \underbrace{\prod_{v \in V} \left(1 - \Big(\sum_{e \in E} 1_{v \in e} X_e\Big)^{p - 1}\right)}_g - \underbrace{\prod_{e \in E} (1 - X_e)}_h$$
We observe that $\deg g \le n(p - 1)$ and $\deg h = m$. By assumption, $n(p - 1) < m$, so $\deg f = m$. Also note that $\prod_{e \in E} X_e$ has coefficient $\pm 1 \ne 0$ in $f$ and
$$f(0) = g(0) - h(0) = 1 - 1 = 0$$
By Alon's Combinatorial Nullstellensatz (with $S_e = \{0, 1\}$ for every edge $e$), find $x \in \{0, 1\}^E$ such that $f(x) \ne 0$. By the previous calculation, $x \ne 0$. So set
$$F = \{e \in E \mid x_e \ne 0\} = \{e \in E \mid x_e = 1\}$$
Then $F$ is nonempty, so $h(x) = 0$ and hence $g(x) \ne 0$, which forces $\deg_F(v) \equiv 0 \pmod p$ for every vertex $v$. Since $\deg_F(v) \le 2p - 1$, we get $\deg_F(v) \in \{0, p\}$. The subgraph formed by $F$ (on the vertices it covers) is thus $p$-regular.
\end{proof}
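For $p = 2$, the theorem says that every graph with more edges than vertices and maximum degree at most $3$ contains a $2$-regular subgraph, ie a disjoint union of cycles.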
Going back to Erd\H os-Ginzburg-Ziv, what if we replace $\Z/p\Z$ by $\Z/p\Z \times \Z/p\Z$? How many elements do we need before we are sure that $p$ of them sum to $0$?
Certainly, $4p - 4$ is not enough since we can take
$$\underbrace{(0, 0), \dots, (0, 0)}_{p - 1\text{ times}}, \underbrace{(0, 1), \dots, (0, 1)}_{p - 1\text{ times}}, \underbrace{(1, 0), \dots, (1, 0)}_{p - 1\text{ times}}, \underbrace{(1, 1), \dots, (1, 1)}_{p - 1\text{ times}}$$
The content of Kemnitz's conjecture is that $4p - 3$ is enough.
\begin{nthm}
Let $v_1, \dots, v_{3p} \in \Z/p\Z \times \Z/p\Z$ sum to zero. Then some $p$ of them also sum to $0$.
\end{nthm}
\begin{proof}
Write $v_i = (a_i, b_i)$ and define
\begin{align*}
f_1 & = \sum_{i = 1}^{3p - 1} a_i X_i^{p - 1} \\
f_2 & = \sum_{i = 1}^{3p - 1} b_i X_i^{p - 1} \\
f_3 & = \sum_{i = 1}^{3p - 1} X_i^{p - 1}
\end{align*}
(we set $v_{3p}$ aside). Since $\deg f_1 + \deg f_2 + \deg f_3 = 3(p - 1) < 3p - 1$ and $0$ is a common zero, Chevalley-Warning gives a common zero $z \ne 0$. Set $I = \{i \mid z_i \ne 0\}$. As in the proof of Erd\H os-Ginzburg-Ziv for primes, $p \mid \abs I$ and $\sum_{i \in I} v_i = 0$. Since $1 \le \abs I \le 3p - 1$, either $\abs I = p$ and we are done, or $\abs I = 2p$ and the remaining $p$ indices give $\sum_{i \nin I} v_i = \sum_{i = 1}^{3p} v_i - \sum_{i \in I} v_i = 0$.
\end{proof}
\begin{nthm}[R\'onyai]
Let $p \ge 3$ be prime and $m = 4p - 2$. Let $v_1, \dots, v_m \in \Z/p\Z \times \Z/p\Z$. Then some subsequence of length $p$ sums to $0$.
\end{nthm}
\begin{proof}
We need a subsequence of length $p$ or of length $3p$ summing to $0$ (in the latter case the previous theorem produces $p$ terms among them summing to $0$). Suppose no such subsequence exists. Write $v_i = (a_i, b_i)$ and define
\begin{align*}
r & = \sum_{I \in [m]^{(p)}} \prod_{i \in I} X_i \\
f & = \left(\left(\sum_{i = 1}^m a_i X_i\right)^{p - 1} - 1\right)
\left(\left(\sum_{i = 1}^m b_i X_i\right)^{p - 1} - 1\right)
\left(\left(\sum_{i = 1}^m X_i\right)^{p - 1} - 1\right)
(r - 2) \\
g & = 2(X_1 - 1) \dots (X_m - 1) \\
h & = f - g
\end{align*}
For $c \in \{0, 1\}^m$, we see that
$$f(c) = g(c) =
\begin{cases}
2 & \text{ if } c = 0 \\
0 & \text{ if } c \ne 0
\end{cases}$$
For $f$ this holds since, writing $\abs c = \sum_i c_i$, the third factor $\left(\sum_{i = 1}^m X_i\right)^{p - 1} - 1$ vanishes at $c$ unless $\abs c = p, 2p$ or $3p$, and in those cases:
\begin{itemize}
\item If $\abs c = 2p$, the fourth factor is zero since $r(c) = \binom{2p}p \equiv 2 \pmod p$.
\item If $\abs c = p, 3p$, then one of the first two factors is zero because we assumed there was no sequence of length $p$ or $3p$ summing to zero.
\end{itemize}
Hence $h$ is identically zero on $\{0, 1\}^m$. But the coefficient of $X_1 \dots X_m$ in $h$ is $-2 \ne 0$ (as $\deg f = 4p - 3 < m$, only $g$ contributes). This contradicts Alon's Combinatorial Nullstellensatz.
\end{proof}
\clearpage
\section{Topology in Combinatorics}
\subsection{Kneser's Graph \& the LSB Theorem}
\newlec
Let $\abs X = n$. Define the {\bf Kneser graph} $KG(n, k)$ as the graph on vertices $X^{(k)}$ with
$$A \sim B \iff A \inter B = \emptyset$$
We will write $n = 2k + d$ for some $d \ge 0$.
\begin{question}
What is the chromatic number of $KG(n, k)$?
\end{question}
We know the independence number from Erd\H os-Ko-Rado.
\begin{align*}
\alpha(KG(n, k))
& = \text{maximum number of $k$-subsets without two disjoint sets} \\
& = \binom{n - 1}{k - 1}
\end{align*}
Hence
$$\chi(KG(n, k)) \ge \frac{\binom nk}{\alpha(KG(n, k))} = \frac nk$$
What is an upper bound?
TODO: Insert picture
If $A \inter [d + 1] \ne \emptyset$, then color $A$ with $\min(A \inter [d + 1])$. Give the rest color $d + 2$. So $d + 2$ colors are enough.
\begin{conjecture}[Kneser]
$$\chi(KG(n, k)) = d + 2$$
\end{conjecture}
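For instance, $KG(5, 2)$ is the Petersen graph ($k = 2$, $d = 1$), and its chromatic number is indeed $3 = d + 2$.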
This was proved by Lov\'asz in 1978, with a much simpler proof provided by B\'ar\'any the same year. Surprisingly, Greene gave an even simpler (basically trivial) proof in 2002.
\begin{nthm}[Lusternik-Schnirelmann 1930, Borsuk 1933]\label{thm:borsuk}~
\begin{enumerate}
\item Let $f : S^n \to \R^n$ be continuous. Then there exists $x \in S^n$ such that $f(x) = f(-x)$.
\item Let $V_1, \dots, V_{n + 1}$ form a closed cover of $S^n$. Then some $V_i$ contains a pair of antipodal points, ie there exist $i$ and $x$ such that $\pm x \in V_i$.
\end{enumerate}
\end{nthm}
\begin{nthm}[Greene]\label{thm:greene}
Let $S^n$ be covered with $n + 1$ sets, each of which is open or closed. Then one of the sets contains a pair of antipodal points.
\end{nthm}
\begin{proof}
Write $\mcU$ the open sets and $\mcV$ the closed sets.
{\bf Case 1: $\mcU = \emptyset$} \\
This is Theorem \ref{thm:borsuk}.
{\bf Case 2: $\mcV = \emptyset$} \\
We slightly shrink our sets to make them all closed. By the Lebesgue number lemma, find $\eps > 0$ such that every ball of radius $\eps$ is contained in some $U \in \mcU$. Define
$$\mcV' = \{\{x \mid d(x, U^c) \ge \eps\} \mid U \in \mcU\}$$
This also covers $S^n$ and every set in $\mcV'$ is a subset of some $U \in \mcU$, so we can apply Case 1 to find our antipodal points.
{\bf General case} \\
Suppose no set contains a pair of antipodal points. For each $V \in \mcV$, define
\begin{align*}
f_V : S^n & \to \R \\
x & \mapsto \max(d(x, V), d(-x, V))
\end{align*}
By assumption, $f_V(x) > 0$ for all $x$. Since $S^n$ is compact, find $\eps_V > 0$ such that $f_V(x) \ge \eps_V$ for all $x$. Then define
$$U_V = \{x \in S^n \mid d(x, V) < \eps_V\}$$
By construction, $U_V$ does not contain a pair of antipodal points either. Therefore,
$$\mcU' := \mcU \union \{U_V \mid V \in \mcV\}$$
is a counterexample to Case 2.
\end{proof}
\newlec
\begin{nthm}
For $n = 2k + d$, the chromatic number of the Kneser graph is exactly $d + 2$.
\end{nthm}
\begin{proof}[Proof (Greene)]
We already proved $\chi(KG(n, k)) \le d + 2$. Suppose $\chi(KG(n, k)) \le d + 1$, ie write
$$X^{(k)} = \Union_{i = 1}^{d + 1} \mcA_i$$
where each $\mcA_i$ is intersecting. Identify $X$ with a set of $n$ points on $S^{d + 1}$ in general position, ie no great $d$-sphere contains $d + 2$ of the points. Define
$$U_i = \{a \in S^{d + 1} \mid \exists s \in \mcA_i, \for x \in s, a \cdot x > 0\}, \qquad F = U_1^c \inter \dots \inter U_{d + 1}^c$$
The $U_i$ are open, $F$ is closed and $U_1 \union \dots \union U_{d + 1} \union F = S^{d + 1}$. Hence by Theorem \ref{thm:greene} we find that one of the sets contains a pair $\pm a$ of antipodal points.
\begin{itemize}
\item If it is $F$, then $\{x \mid a \cdot x > 0\}$ and $\{x \mid a \cdot x < 0\}$ each contain strictly less than $k$ elements. Therefore $\{x \mid a \cdot x = 0\}$ is a great $d$-sphere containing at least $n - 2(k - 1) = d + 2$ points. Contradiction.
\item If it is $U_i$ for some $i$, then $\mcA_i$ contains two disjoint sets. Contradiction.
\end{itemize}
\end{proof}
\printindex
\end{document}