ON-LINE LIST COLOURING OF RANDOM GRAPHS

ALAN FRIEZE, DIETER MITSCHE, XAVIER PÉREZ-GIMÉNEZ, AND PAWEŁ PRAŁAT

Abstract. In this paper, the on-line list colouring of binomial random graphs $G(n,p)$ is studied. We show that the on-line choice number of $G(n,p)$ is asymptotically almost surely asymptotic to the chromatic number of $G(n,p)$, provided that the average degree $d = p(n-1)$ tends to infinity faster than $(\log\log n)^{1/3}(\log n)^2 n^{2/3}$. For sparser graphs, we are slightly less successful; we show that if $d \ge (\log n)^{2+\varepsilon}$ for some $\varepsilon > 0$, then the on-line choice number is larger than the chromatic number by at most a multiplicative factor of $C$, where $C \in [2,4]$, depending on the range of $d$. Also, for $d = O(1)$, the on-line choice number is larger than the chromatic number by at most a multiplicative constant factor.

1. Introduction

The combinatorial game we study in this paper is played by two players, named Mr. Paint and Mrs. Correct, and is played on a finite, undirected graph in which each vertex is assigned a non-negative number representing the number of erasers at that particular vertex. We assume for simplicity that this number is initially the same for each vertex. In each round, first Mr. Paint selects a subset of the vertices and paints them all the same colour; he cannot use this colour in subsequent rounds. Mrs. Correct then has to erase the colour from some of the vertices in order to prevent adjacent vertices from having the same colour. Whenever the colour at a vertex is erased, the number of erasers at that vertex decreases by 1, but naturally, Mrs. Correct cannot erase the colour if she has no erasers left at that vertex. Vertices whose colours have not been erased can be considered as being permanently coloured and can be removed from the game. The game has two possible endings: (i) all vertices have been permanently coloured, in which case Mrs. Correct wins, or (ii) at some point of the game, Mr. Paint presents two adjacent vertices $u$ and $v$ and neither $u$ nor $v$ has any eraser left, in which case Mr. Paint wins. If, regardless of which sets she gets presented, there is a strategy for Mrs. Correct to win the game having initially $k-1$ erasers at each vertex, we say that the graph is $k$-paintable. The smallest $k$ for which the graph is $k$-paintable is called the paintability number of $G$, and is denoted by $\chi_P(G)$. Note that this parameter is indeed well defined: for any graph on $n$ vertices, $n-1$ erasers at each vertex always guarantee that Mrs. Correct wins, as she can always choose one vertex from a set presented to her and erase the colours on the remaining ones. This problem is also known as the on-line list colouring, and the corresponding graph parameter is also called the on-line choice number of $G$; see, for example, [16] and below for the relation to the (off-line) list colouring.

Let us recall a classic model of random graphs that we study in this paper. The binomial random graph $G(n,p)$ is defined as the probability space $(\Omega, \mathcal{F}, \Pr)$, where $\Omega$ is the set of all graphs with vertex set $\{1,2,\dots,n\}$, $\mathcal{F}$ is the family of all subsets of $\Omega$, and for every $G \in \Omega$,
$$\Pr(G) = p^{|E(G)|}(1-p)^{\binom{n}{2}-|E(G)|}.$$
This space may be viewed as the set of outcomes of $\binom{n}{2}$ independent coin flips, one for each pair $(u,v)$ of vertices, where the probability of success (that is, adding edge $uv$) is $p$. Note that $p = p(n)$ may (and usually does) tend to zero as $n$ tends to infinity.

(The first author is supported in part by NSF Grant CCF0502793. The fourth author is supported in part by NSERC.)
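To make the game just described concrete, here is a minimal Python sketch (illustrative, not part of the paper) of the trivial strategy mentioned above, in which Mrs. Correct keeps a single vertex of each presented set and erases the rest; this is why $n-1$ erasers per vertex always suffice. The graph sampler follows the coin-flip description of $G(n,p)$; all names (`sample_gnp`, `play_trivial_strategy`) are ours.

```python
import random

def sample_gnp(n, p, rng=random.Random(0)):
    """Sample a binomial random graph G(n,p) as a list of adjacency sets."""
    adj = [set() for _ in range(n)]
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:      # one coin flip per pair, success probability p
                adj[u].add(v)
                adj[v].add(u)
    return adj

def play_trivial_strategy(adj, erasers_per_vertex):
    """Mrs. Correct keeps one vertex of every presented set and erases the rest.

    Mr. Paint is simulated crudely: he repeatedly presents the set of all
    vertices that are not yet permanently coloured.  Returns True if Mrs.
    Correct wins (every vertex gets permanently coloured).  Note that this
    particular strategy never needs to look at the edges in adj."""
    n = len(adj)
    erasers = [erasers_per_vertex] * n
    uncoloured = set(range(n))
    while uncoloured:
        presented = set(uncoloured)          # Mr. Paint's move (one of many possible)
        keep = next(iter(presented))         # keeping a single vertex is always safe
        for v in presented - {keep}:
            if erasers[v] == 0:
                return False                 # cannot erase: Mr. Paint wins
            erasers[v] -= 1                  # erase the fresh colour at v
        uncoloured.remove(keep)              # keep becomes permanently coloured
    return True

if __name__ == "__main__":
    G = sample_gnp(50, 0.2)
    # n - 1 erasers per vertex guarantee a win, whatever Mr. Paint does.
    print(play_trivial_strategy(G, erasers_per_vertex=49))
```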
All asymptotics throughout are as $n \to \infty$ (we emphasize that the notations $o(\cdot)$ and $O(\cdot)$ refer to functions of $n$, not necessarily positive, whose growth is bounded; whereas $\Theta(\cdot)$ and $\Omega(\cdot)$ always refer to positive functions). We say that an event in a probability space holds asymptotically almost surely (or a.a.s.) if the probability that it holds tends to 1 as $n$ goes to infinity. We often write $G(n,p)$ when we mean a graph drawn from the distribution $G(n,p)$. Finally, for simplicity, we will write $f(n) \sim g(n)$ if $f(n)/g(n) \to 1$ as $n \to \infty$ (that is, when $f(n) = (1+o(1))g(n)$).

Now, we will briefly mention the relation to other known graph parameters. A proper colouring of a graph is a labelling of its vertices with colours such that no two adjacent vertices have the same colour. A colouring using at most $k$ colours is called a (proper) $k$-colouring. The smallest number of colours needed to colour a graph $G$ is called its chromatic number, and it is denoted by $\chi(G)$. Let $L_k$ be an arbitrary function that assigns to each vertex of $G$ a list of $k$ colours. We say that $G$ is $L_k$-list-colourable if there exists a proper colouring of the vertices such that every vertex is coloured with a colour from its own list. A graph is $k$-choosable if for every such function $L_k$, $G$ is $L_k$-list-colourable. The minimum $k$ for which a graph is $k$-choosable is called the list chromatic number, or the choice number, and is denoted by $\chi_L(G)$. Since the choices for $L_k$ contain the special case where each vertex is assigned the list of colours $\{1,2,\dots,k\}$, it is clear that a $k$-choosable graph also has a $k$-colouring, and so $\chi(G) \le \chi_L(G)$. It is also known that if a graph is $k$-paintable, then it is also $k$-choosable [13], that is, $\chi_L(G) \le \chi_P(G)$. Indeed, if there exists a function $L_k$ so that $G$ is not $L_k$-list-colourable, then Mr. Paint can easily win by fixing some permutation of all colours present in $L_k$ and presenting at the $i$-th step all vertices containing the $i$-th colour of the permutation on their lists (unless the vertex was already removed before). Finally, it was shown in [16] that the paintability of a graph $G$ on $n$ vertices is at most $\chi(G)\log n + 1$. (All logarithms in this paper are natural logarithms.) Combining all inequalities we get the following:
$$(1) \qquad \chi(G) \le \chi_L(G) \le \chi_P(G) \le \chi(G)\log n + 1.$$

It follows from the well-known results of Bollobás [4] and Łuczak [11] (see also McDiarmid [12]) that the chromatic number of $G(n,p)$ a.a.s. satisfies
$$(2) \qquad \chi(G(n,p)) \sim \frac{n\log(1/(1-p))}{2\log(np)},$$
for $np \to \infty$ and $p$ bounded away from 1. The study of the choice number of $G(n,p)$ was initiated in [1], where Alon proved that a.a.s. the choice number of $G(n,1/2)$ is $o(n)$. Kahn then showed (see [2]) that a.a.s. the choice number of $G(n,1/2)$ equals $(1+o(1))\chi(G(n,1/2))$. In [7], Krivelevich showed that this holds for $p \gg n^{-1/4}$, and Krivelevich, Sudakov, Vu, and Wormald [8] improved this to $p \gg n^{-1/3}$. On the other hand, Alon, Krivelevich, and Sudakov [3] and Vu [15] showed that for any value of $p$ satisfying $2 < np \le n/2$, the choice number is $\Theta(np/\log(np))$. Later, Krivelevich and Vu [10] generalized this to hypergraphs; they also improved the leading constants and showed that the choice number for $C \le np \le 0.9n$ (where $C$ is a sufficiently large constant) is at most a multiplicative factor of $2+o(1)$ away from the chromatic number, the best known factor for $p \le n^{-1/3}$.
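As a quick numerical illustration (ours, not the paper's) of the asymptotic formula (2) and of how much slack the trivial upper bound $\chi(G)\log n + 1$ from (1) leaves, one can evaluate both quantities for a few sample values of $n$ and $p$:

```python
import math

def chi_asymptotic(n, p):
    """Right-hand side of (2): the a.a.s. value of chi(G(n,p)), up to a 1+o(1) factor."""
    return n * math.log(1.0 / (1.0 - p)) / (2.0 * math.log(n * p))

for n, p in [(10**4, 0.5), (10**5, 0.01), (10**6, 0.001)]:
    chi = chi_asymptotic(n, p)
    # (1) only guarantees chi_P(G) <= chi(G) * log n + 1 in general;
    # the point of Theorem 1 below is that for dense G(n,p) the log n factor is not needed.
    print(f"n={n:>7}, p={p}: chi ~ {chi:10.1f}, chi*log(n)+1 ~ {chi * math.log(n) + 1:12.1f}")
```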
Our results below (see Theorem 1, Theorem 2, and Theorem 3) show that even for the on-line case, for a wide range of $p$, we can asymptotically match the best known constants of the off-line case. Moreover, if $np \ge \log^{\omega} n$ (for some function $\omega = \omega(n)$ tending to infinity as $n \to \infty$), then we get the same multiplicative factor of $2+o(1)$.

Our main results are the following theorems. The first one deals with dense random graphs.

Theorem 1. Let $\varepsilon > 0$ be any constant, and suppose that $(\log\log n)^{1/3}(\log n)^2 n^{-1/3} \ll p \le 1-\varepsilon$. Let $G \in G(n,p)$. Then, a.a.s.,
$$\chi_P(G) \sim \frac{n}{2\log_b(np)} \sim \chi(G),$$
where $b = 1/(1-p)$.

Note that if $p = o(1)$, then
$$\frac{n}{2\log_b(np)} = \frac{n\log(1/(1-p))}{2\log(np)} \sim \frac{np}{2\log(np)} = \Theta\left(\frac{np}{\log(np)}\right).$$
For constant $p$ it is not true that $\log(1/(1-p)) \sim p$, but the order is preserved, provided $p \le 1-\varepsilon$ for some $\varepsilon > 0$.

For sparser graphs we are less successful in determining the asymptotic behaviour of $\chi_P(G(n,p))$. Nevertheless, we can prove the following two theorems that determine the order of the graph parameter we study.

Theorem 2. Let $\varepsilon > 0$ be any constant, and suppose that
$$\frac{(\log n)^{2+\varepsilon}}{n} \le p = O\big((\log\log n)^{1/3}(\log n)^2 n^{-1/3}\big).$$
Let $G \in G(n,p)$. Then, a.a.s.,
$$\chi_P(G) = \Theta\left(\frac{np}{\log(np)}\right) = \Theta(\chi(G)).$$
Moreover, if $np = (\log n)^{C+o(1)}$, then a.a.s.
$$\chi(G) \le \chi_P(G) \le (1+o(1)) \cdot \begin{cases} 2\chi(G) & \text{if } C \to \infty \\ \frac{2C}{C-2}\,\chi(G) & \text{if } C \in [4,\infty) \\ 4\chi(G) & \text{if } C \in (2,4). \end{cases}$$

Finally, for very sparse graphs we have the following.

Theorem 3. Let $G \in G(n,p)$ with $p = O(1/n)$. Then, a.a.s., $\chi_P(G) = \Theta(1) = \Theta(\chi(G))$.

2. Preliminaries

Most of the time, we will use the following version of Chernoff's bound. Suppose that $X \in \mathrm{Bin}(n,p)$ is a binomial random variable with expectation $\mu = np$. If $0 < \delta < 1$, then
$$\Pr[X < (1-\delta)\mu] \le \exp\left(-\frac{\delta^2\mu}{2}\right),$$
and if $\delta > 0$,
$$\Pr[X > (1+\delta)\mu] \le \exp\left(-\frac{\delta^2\mu}{2+\delta}\right).$$
However, at some point we will need the following, stronger, version: for any $t \ge 0$, we have
$$(3) \qquad \Pr[X \ge \mu+t] \le \exp\left(-\mu\,\varphi\!\left(\frac{t}{\mu}\right)\right),$$
where $\varphi(x) = (1+x)\log(1+x) - x$. These inequalities are well known and can be found, for example, in [6].

Let $G = (V,E)$ be any graph. A set $S \subseteq V$ is called independent if no edge $e \in E$ has both endpoints in $S$. Denote by $\alpha(G)$ the independence number of $G$, that is, the size of a largest independent set of $G$. Let $k_0$ be defined as follows:
$$k_0 = k_0(n,p) = \max\left\{ k \in \mathbb{N} : \binom{n}{k}(1-p)^{\binom{k}{2}} \ge n^4 \right\}.$$
It is well known that $k_0$ is well defined (for all $n$ sufficiently large and provided that $p \le 1-\varepsilon$ for some $\varepsilon > 0$) and that $k_0 \sim 2\log_b(np)$ with $b = 1/(1-p)$. The following result was proved in [8].

Theorem 4 ([8]). Suppose $n^{-2/5}\log^{6/5} n \ll p \le 1-\varepsilon$ for some constant $\varepsilon > 0$. Let $G \in G(n,p)$. Then
$$\Pr(\alpha(G) < k_0) = \exp\left(-\Omega\left(\frac{n^2}{k_0^4\, p}\right)\right).$$
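For intuition about the definition of $k_0$, the following sketch (ours, not part of the paper) computes $k_0(n,p)$ directly from its definition, working in log scale to avoid overflow, and compares it with the asymptotics $k_0 \sim 2\log_b(np)$; the convergence is slow, so for moderate $n$ the agreement is only rough.

```python
import math

def k0(n, p):
    """Largest k with C(n,k) * (1-p)^C(k,2) >= n^4, computed in log scale."""
    target = 4 * math.log(n)
    best = 0
    for k in range(1, n + 1):
        value = (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
                 + math.comb(k, 2) * math.log(1 - p))
        if value >= target:
            best = k
        elif best > 0:
            break      # the expression is concave in k, so it never recovers
    return best

if __name__ == "__main__":
    for n, p in [(10**4, 0.5), (10**5, 0.05)]:
        b = 1.0 / (1.0 - p)
        approx = 2 * math.log(n * p) / math.log(b)   # k0 ~ 2 log_b(np), asymptotically
        print(n, p, k0(n, p), round(approx, 2))
```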
We obtain the following immediate corollary, which will be useful to deal with dense random graphs. In fact, this corollary is the bottleneck of the argument that prevents us from extending Theorem 1 to sparser graphs.

Corollary 5. Let $\varepsilon > 0$ be any constant and let $\omega = \omega(n) = o(\log n)$ be any function tending to infinity as $n \to \infty$. Suppose that
$$\omega(\log\log n)^{1/3}(\log n)^2 n^{-1/3} \le p \le 1-\varepsilon.$$
Let $G \in G(n,p)$. Then, a.a.s. every set $S \subseteq V(G)$ with $|S| = s \ge s_0 := n/(\omega\log^2 n)$ contains an independent set of size $k_0 = k_0(s,p) \sim k_0(n,p)$.

Proof. Fix a set $S \subseteq V(G)$ with $|S| = s \ge n/(\omega\log^2 n)$. First, let us note that
$$k_0 = k_0(s,p) \sim 2\log_b(sp) = 2\log_b(np)\left(1 - O\left(\frac{\log\log n}{\log n}\right)\right) \sim k_0(n,p).$$
Moreover, since $n/(\omega\log^2 n) \le s \le n$, we can easily verify that $p$ satisfies $s^{-2/5}\log^{6/5} s \ll p \le 1-\varepsilon$ (with room to spare in the lower bound). It follows immediately from Theorem 4 that the probability of not having an independent set of size $k_0$ in $S$ is at most
$$\exp\left(-\Omega\left(\frac{s^2}{k_0^4\, p}\right)\right) = \exp\left(-\Omega\left(\frac{s^2p^3}{\log^4 n}\right)\right) = \exp\left(-\Omega\left(\frac{snp^3}{\omega\log^6 n}\right)\right) = \exp\left(-\Omega\left(s\omega^2\log\log n\right)\right).$$
On the other hand, the number of sets of size $s$ can be bounded as follows:
$$\binom{n}{s} \le \left(\frac{ne}{s}\right)^s = \exp(s\log(ne/s)) \le \exp(3s\log\log n),$$
since $\log(ne/s) \le \log(e\omega\log^2 n) = (2+o(1))\log\log n$. Hence, the expected number of sets of size $s$ for which the desired property fails is $\exp(-\Omega(s\omega^2\log\log n))$. Summing over all $s \ge s_0$, the expected number of all such sets is $\exp(-\Omega(s_0\omega^2\log\log n)) = o(1)$. The claim holds by Markov's inequality. □

Given a graph $G$ and some fixed ordering of its vertices (for example, we may assume that it is the natural ordering $1,2,\dots,n$), the greedy colouring algorithm proceeds by scanning the vertices of $G$ in the given order and assigning the first available colour to the current vertex.
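The greedy colouring procedure just described is easy to write down; the following illustrative sketch (ours) uses the same adjacency-set representation as the earlier snippet:

```python
def greedy_colouring(adj, order=None):
    """Scan the vertices in the given order; give each the smallest colour not
    already used by a coloured neighbour.  Colours are 0, 1, 2, ..."""
    n = len(adj)
    order = range(n) if order is None else order
    colour = [None] * n
    for v in order:
        used = {colour[u] for u in adj[v] if colour[u] is not None}
        c = 0
        while c in used:
            c += 1
        colour[v] = c
    return colour

# Example usage (with sample_gnp from the earlier sketch):
# G = sample_gnp(1000, 0.1)
# print(max(greedy_colouring(G)) + 1)   # number of colours used by greedy
```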
We will use the following lemma due to [9] (see also [5]). However, since we use it in a slightly different setting, we point out a few small differences in the argument by providing a sketch of the proof. In fact, this lemma is the bottleneck of the argument that prevents us from extending Theorem 2 to sparser graphs.

Lemma 6. Let $\omega = \omega(n) := \log\log n$. Given any constant $0 < \varepsilon < 1$, let $G \in G(n,p)$ with $\log^{2+\varepsilon} n/n =: p_0 \le p = o(1/\log n)$. Then, a.a.s. every subgraph of $G$ of size $u \ge u_0 := n/(\omega\log^2 n)$ has an independent set of size at least $\varepsilon(1-\varepsilon)\log(np)/(3p)$.

Note that in the lemma we set $\omega = \log\log n$; other functions tending to infinity slower than $\log\log n$, constant, or even tending to $0$ not too quickly would clearly work as well, but would make the statement of the result weaker.

Sketch of the proof. We follow the notation of [9] and apply the greedy approach given there to each subgraph of size $u$. Let
$$\alpha_0 = \frac{(1-\varepsilon)\log(up)}{p} \qquad \text{and} \qquad t = \frac{up}{\log(up)}.$$
Note that
$$\alpha_0 \ge \frac{(1-\varepsilon)\log(np/(\omega\log^2 n))}{p} = \frac{(1-\varepsilon)(\log(np) - (2+o(1))\log\log n)}{p} \ge \frac{(1-\varepsilon)(\varepsilon+o(1))\log(np)}{(2+\varepsilon)p} \ge \frac{(1-\varepsilon)\varepsilon\log(np)}{3p},$$
since $np \ge \log^{2+\varepsilon} n$. (Note that, if $np \ge \log^C n$, then we get the better estimate $\alpha_0 \ge (1-\varepsilon)(1-2/C+o(1))\log(np)/p$; see Remark 7 below.) Moreover,
$$(1-p)^{\alpha_0} = \exp(-p\alpha_0(1+O(p))) = \exp(-(1-\varepsilon)\log(up)(1+O(p))) \sim \exp(-(1-\varepsilon)\log(up)) = (up)^{-1+\varepsilon},$$
since $p = o(1/\log n)$. For a fixed subgraph of size $u$, it follows from [9] that the probability that the algorithm fails to produce an independent set of size at least $\alpha_0$ is at most
$$\exp\left(-t(1-p)^{\alpha_0}u\varepsilon\right) = \exp\left(-\frac{(1+o(1))(up)(up)^{-1+\varepsilon}u\varepsilon}{\log(up)}\right) = \exp\left(-\frac{u^{1+\varepsilon}p^{\varepsilon}(\varepsilon+o(1))}{\log(up)}\right).$$
By taking a union bound over all
$$\binom{n}{u} \le \left(\frac{ne}{u}\right)^u = \exp(u\log(ne/u)) \le \exp(3u\log\log n)$$
sets of size $u$, the probability $P_u$ that there exists a subgraph of size $u$ for which the algorithm fails is at most
$$\exp\left(-u\left(\frac{(up)^{\varepsilon}(\varepsilon+o(1))}{\log(up)} - 3\log\log n\right)\right) \le \exp\left(-u\left(\frac{(u_0p_0)^{\varepsilon}(\varepsilon+o(1))}{\log(u_0p_0)} - 3\log\log n\right)\right),$$
as the function $f(x) = x^{\varepsilon}/\log x$ is increasing for large enough $x$. Since $u_0p_0 = \log^{\varepsilon} n/\omega = \log^{\varepsilon+o(1)} n$,
$$P_u \le \exp\left(-u\left(\frac{(\log n)^{(\varepsilon+o(1))\varepsilon}(\varepsilon+o(1))}{(\varepsilon+o(1))\log\log n} - 3\log\log n\right)\right) = \exp\left(-u\left((\log n)^{\varepsilon^2+o(1)} - 3\log\log n\right)\right) \le e^{-u}.$$
Summing over all $u_0 \le u \le n$, we see that the probability that the algorithm fails is at most $\sum_{u=u_0}^{n} e^{-u} = O(e^{-u_0}) = o(1)$, and the lemma follows. □

Remark 7. Lemma 6 is sufficient to determine the order of the on-line choice number for sparse random graphs. We comment on improvements in order to obtain the smallest leading constant in the upper bound. From the proof of the lemma it is clear that the bound on the size of the independent set can be improved for denser graphs. More precisely, for $np \ge \log^{\omega} n$ (for $\omega \to \infty$ but still $p = o(1/\log n)$), we can obtain independent sets of size at least $(1-2\varepsilon)\log(np)/p$ for any arbitrarily small $\varepsilon$, and so we are asymptotically just a factor $2+o(1)$ off from the bound of Corollary 5.

On the other hand, for $C$ being a constant larger than 2, and $np = \log^{C+o(1)} n$, we can obtain independent sets of size at least $(1-\varepsilon)(1-2/C+o(1))\log(np)/p$ (again, for any arbitrarily small $\varepsilon$), so by Lemma 6 we are off by a factor $2/(1-2/C)+o(1) = 2C/(C-2)+o(1)$. We will now show that for $\log^{2+\varepsilon} n/n \le p = o(n^{-5/6})$, however, the independence number obtained is at most a factor $4+o(1)$ off from the one obtained by Corollary 5. First, as shown above, given $u \ge n/(\omega\log^2 n)$, there are at most $\exp(3u\log\log n)$ sets of size $u$, and the expected number of edges induced by each such set is $\binom{u}{2}p$. By Chernoff's bound, the probability that the number of edges induced by one of them deviates by an additive $\binom{u}{2}p/\log\log n$ from its expected value is at most $\exp(-(1/\log\log n)^2u^2p/5)$. Hence, with probability at most
$$\exp\left(u\left(3\log\log n - \frac{(\log n)^{\varepsilon}(1/\log\log n)^2}{5\omega}\right)\right) \le \exp(-u),$$
there exists a set of size $u$ that does not satisfy the condition. Summing over all $n/(\omega\log^2 n) \le u \le n$, we see that a.a.s. every such set induces $(1+o(1))u^2p/2$ edges. On the other hand, note that for this range of $p$, the expected number of pairs of triangles sharing one vertex in $G$ is $\Theta(n^5p^6) = o(1)$, and thus by Markov's inequality, a.a.s. every vertex is in at most one triangle. It follows that there exists a set $D$ of edges which is a matching such that after removing $D$ the graph is triangle-free. Since in $G$ the average degree of every set of size $u$ is $up(1+o(1))$, the same also holds for $G \setminus D$. Since $G \setminus D$ is triangle-free, by Shearer's lemma [14], an independent set of size $(1+o(1))\log(np)/p$ can be found. By eliminating at most half of the set (one vertex for each edge in $D$), we obtain an independent set in the original graph. It follows that for every set of size $u$, there exists an independent set of size at least $\log(np)/((2+o(1))p)$, which is therefore at most a factor $4+o(1)$ off from the size obtained by Corollary 5.
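To see where the constant factors quoted in Theorem 2 come from: Corollary 5 produces independent sets of size roughly $2\log(np)/p$ when $p = o(1)$, while Remark 7 guarantees $(1-2/C)\log(np)/p$ for $np = \log^{C+o(1)}n$ with $C \ge 4$, and $\log(np)/((2+o(1))p)$ for $C \in (2,4)$. The ratio of the two sizes is the multiplicative gap; a tiny illustration (ours, not the paper's code):

```python
def factor(C):
    """Multiplicative gap between the benchmark ~2 log(np)/p (Corollary 5) and
    the independent sets guaranteed in Remark 7 for np = (log n)^(C+o(1))."""
    if C >= 4:
        return 2.0 / (1.0 - 2.0 / C)   # = 2C/(C-2); tends to 2 as C -> infinity
    return 4.0                          # triangle-removal argument, 2 < C < 4

for C in (2.5, 4, 6, 10, 100):
    print(C, factor(C))
```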
We will also need the following observation.

Lemma 8. Let $G \in G(n,p)$, for $10\log n/n \le p \le 1$, and set $s_0 = 10\log n/p$. Then, a.a.s. the following holds:
(i) every set $S \subseteq V(G)$ with $s_0 \le |S| \le n$ contains an independent set of size at least $1/(9p)$;
(ii) for every $i \in \mathbb{N}$ and every set $S \subseteq V(G)$ with $2^{-i}s_0 \le |S| < 2^{-i+1}s_0$, $S$ contains an independent set of size at least $i/(9p2^i)$.

Before we move to the proof of the lemma, let us note that the lower bound on $p$ is not used in the proof of part (i), which becomes a trivial statement if $p < 10\log n/n$. For part (ii), we could easily relax this bound to $p \ge 1/n^k$ (for any constant $k > 0$) by changing some of the constants in the statement. Furthermore, part (i) is trivially true for $p \ge 1/9$, and part (ii) is only non-trivial for those $i$ satisfying $9p2^i < i$.

Proof. For part (i), let us fix a set $S$ with $|S| = s$ satisfying $s_0 \le s \le n$. Denote by $E_S$ the set of edges induced by $S$. We have $\mathbb{E}|E_S| = \binom{s}{2}p \sim s^2p/2$. By Chernoff's bound, with probability at least $1 - \exp(-\mathbb{E}|E_S|/4)$, we have $|E_S| \le s^2p$. By taking a union bound over all $\binom{n}{s} \le \exp(s\log n)$ subsets of size $s$, with probability at most $\exp(s(\log n - sp/9))$ there exists a set of size $s$ not satisfying the condition. Since, by assumption, $s \ge s_0 = 10\log n/p$, this probability is at most $\exp(-s\log n/9)$, and so, summing over all $s \ge s_0$, the claim holds a.a.s. for all $s$ in the desired range. We may therefore assume that (deterministically) every subset of size $s \ge s_0$ satisfies $|E_S| \le s^2p$. Then, for any set $S$ in the desired range, at least $s/2$ vertices of $S$ have degree at most $4sp$ in the graph induced by $S$. By applying a greedy algorithm to these vertices, we can therefore always find an independent set of size at least $(s/2)/(4sp+1) \ge 1/(9p)$.

Part (ii) is proved in a similar way. Fix $i = i(n) \in \mathbb{N}$ such that
$$(4) \qquad 9p2^i < i$$
(since otherwise the statement is trivial), and a set $S \subseteq V(G)$ of size $s$ satisfying the following:
$$(5) \qquad 2^{-i}s_0 \le s < 2^{-i+1}s_0.$$
Since $s$ is a natural number, we must have $2^{-i+1}s_0 \ge 1$. Therefore, $2^i \le 2s_0 \le 2n$ (where we used that $p \ge 10\log n/n$), and thus
$$(6) \qquad i \le \log(2n)/\log 2 \le 2\log n,$$
for $n$ large enough. Combining (5), (4), the fact that $ps_0 = 10\log n$, and (6), we get
$$s \ge 2^{-i}s_0 > (9p/i)s_0 = (90/i)\log n \ge 45.$$
Note that for $s \ge 45$,
$$\mathbb{E}|E_S| = \binom{s}{2}p \ge \frac{s^2p}{2.1}.$$
It follows from the stronger version of Chernoff's bound (3) that
$$\Pr\left[|E_S| \ge \left(1+\frac{2^i}{i}\right)\mathbb{E}|E_S|\right] \le \exp\left(-\varphi(2^i/i)\,\mathbb{E}|E_S|\right) \le \exp\left(-\frac{2^i}{4}\cdot\frac{s^2p}{2.1}\right) \le \exp\left(-\frac{2^is^2p}{9}\right).$$
(Note that $\varphi(2^i/i) = (1+2^i/i)\log(1+2^i/i) - 2^i/i \sim (2^i/i)\log(2^i/i) \sim 2^i\log 2$ as $i \to \infty$. Moreover, it is straightforward to see that for every $i \in \mathbb{N}$, $\varphi(2^i/i) \ge 2^i/4$.)

As before, by a union bound we get that with probability at most $\exp(s(\log n - 2^isp/9))$ there exists a set of size $s$ not satisfying the condition. Since, by assumption, $s \ge 2^{-i}s_0 = 2^{-i}\cdot 10\log n/p$, this probability is at most $\exp(-s\log n/9)$, regardless of which precise interval $[2^{-i}s_0, 2^{-i+1}s_0)$ contains $s$. Summing the $\exp(-s\log n/9)$ bound we obtained over all values $45 \le s < s_0$ gives $o(1)$. Therefore, we may assume that (deterministically), for any $i \in \mathbb{N}$, every subset $S$ of size $s$ in the range (5) satisfies
$$|E_S| \le \left(1+\frac{2^i}{i}\right)\binom{s}{2}p \le \left(1+\frac{2^i}{i}\right)\frac{s^2p}{2} \le \frac{2^is^2p}{i}.$$
Arguing as before, this guarantees that we can always find an independent set of size at least
$$\frac{s/2}{4\cdot 2^isp/i + 1} \ge \frac{i}{9\cdot 2^ip}.$$
(At the last step, we used that $4\cdot 2^isp/i + 1 \le 4.2\cdot 2^isp/i$, which follows easily from (5), (6) and the definition of $s_0$.) The proof of the lemma is finished. □
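The extraction step used in the proof of part (i), keeping only the vertices of small induced degree and then choosing greedily among them, can be sketched as follows (illustrative code, ours, reusing the adjacency-set representation from the earlier snippets):

```python
def low_degree_independent_set(adj, S, degree_cap):
    """Greedy independent set among the vertices of S whose degree inside S is
    at most degree_cap (the step used in the proof of Lemma 8(i))."""
    S = set(S)
    low = [v for v in S if len(adj[v] & S) <= degree_cap]
    independent, blocked = [], set()
    for v in low:
        if v not in blocked:
            independent.append(v)
            blocked.add(v)
            blocked |= adj[v]          # neighbours of v can no longer be chosen
    return independent

# With |E_S| <= s^2 p, at least s/2 vertices of S have induced degree <= 4sp,
# so a call with degree_cap = 4sp returns at least (s/2)/(4sp+1) >= 1/(9p) vertices.
```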
3. Proof of Theorem 1

The lower bound follows immediately from (1). For the upper bound, we give a winning strategy for Mrs. Correct that a.a.s. requires only $(1+o(1))n/(2\log_b(np))$ erasers on each vertex, where $b = 1/(1-p)$. (Recall from (2) that a.a.s. $\chi(G) \sim n/(2\log_b(np))$.) We emphasize that most probabilistic statements hereafter in the proof will not refer to $G(n,p)$ but rather to the randomized strategy that Mrs. Correct uses to select each independent set, regardless of the strategy of Mr. Paint and assuming that $G$ deterministically satisfies the conclusions of Corollary 5 and Lemma 8. Since $p \gg (\log\log n)^{1/3}(\log n)^2 n^{-1/3}$, we can choose a function $\omega = o(\log n)$ tending to infinity with $n$ and such that $p \ge \omega(\log\log n)^{1/3}(\log n)^2 n^{-1/3}$. Note that our choice of $\omega$ satisfies the requirements of Corollary 5. Call a set $S \subseteq V(G)$ large if $|S| = s \ge n/(\omega\log^2 n)$, small if $s \le np/(\omega\log^2 n)$, and medium otherwise.

Whenever Mrs. Correct is presented with a large set, she can, by Corollary 5, find an independent set of size $k_0 = (2+o(1))\log_b(np)$; she accepts the colour on these vertices and uses erasers for all remaining vertices. Note that, trivially, at most
$$(7) \qquad \frac{n}{(2+o(1))\log_b(np)} = (1+o(1))\frac{n}{2\log_b(np)}$$
large sets can be presented to her before the end of the game, and hence, for all large sets, at most that many erasers on each vertex are needed.

Suppose now that a small set $S$ is presented to Mrs. Correct. Then, she chooses a random vertex $v \in S$ and accepts the colour on that vertex; for all other vertices of the presented set, erasers are used. (Note that this is clearly not an optimal strategy; Mrs. Correct could extend $\{v\}$ to a maximal independent set, but we do not use this in the argument and so we may pretend that a single vertex is accepted.) Let $X_v$ denote the random variable counting the number of erasers used when small sets containing $v$ are presented, before eventually $v$ gets a permanent colour (which does not necessarily happen when a small set is presented). We have
$$(8) \qquad \Pr\left[X_v \ge \frac{np}{\sqrt{\omega}\log n}\right] \le \left(1 - \frac{\omega\log^2 n}{np}\right)^{np/(\sqrt{\omega}\log n)} \le \exp(-\sqrt{\omega}\log n) = o(n^{-1}),$$
and thus, by a union bound, a.a.s. for all vertices the number of erasers used for small sets is at most
$$(9) \qquad \frac{np}{\sqrt{\omega}\log n} = o\left(\frac{n}{2\log_b(np)}\right),$$
and so is negligible.
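As a numerical sanity check of (8) (ours, for concrete values of $n$, $p$ and $\omega$), one can compare the geometric-type failure probability with $\exp(-\sqrt{\omega}\log n)$:

```python
import math

def small_set_bound(n, p, omega):
    """Compare the two sides of (8): (1 - q)^m <= exp(-sqrt(omega) * log n),
    where q = omega log^2(n) / (np) and m = np / (sqrt(omega) log n)."""
    q = omega * math.log(n) ** 2 / (n * p)
    m = n * p / (math.sqrt(omega) * math.log(n))
    lhs = (1.0 - q) ** m
    rhs = math.exp(-math.sqrt(omega) * math.log(n))
    return lhs, rhs

print(small_set_bound(n=10**6, p=0.05, omega=10))   # lhs <= rhs, both well below 1/n here
```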
Now, we are going to deal with medium sets. First, note that the size of a medium set is at least $np/(\omega\log^2 n) \ge 10\log n/p$ for the range of $p$ considered in this theorem and by our choice of $\omega$. Suppose that some medium set $S$ is presented during the game. Applying Lemma 8 repeatedly, Mrs. Correct can partition $S$ into independent sets of size $\lceil 1/(9p)\rceil$ and a remaining set $J$ of size at most $10\log n/p \le np/(\omega\log^2 n)$. The strategy is then the following: with probability $1/2$ she chooses (uniformly at random) one of the independent sets of size $\lceil 1/(9p)\rceil$, and with probability $1/2$ she chooses one vertex chosen uniformly at random from $J$ (as before, this is clearly a suboptimal strategy but convenient to analyze). Selected vertices keep the colour; for the others, erasers need to be used. We partition the vertices of $S$ into two groups: group 1 consists of the vertices belonging to independent sets, and group 2 consists of the vertices of $J$. Our goal is to show that each vertex $v$ appears in fewer than $3np/(\sqrt{\omega}\log n)$ medium sets before its colour is accepted, so that the total number of erasers needed to deal with these situations is negligible (see (9)).

Suppose that some vertex $v$ appears in at least $3np/(\sqrt{\omega}\log n)$ medium sets. Each time, the corresponding medium set is split into two groups, and we cannot control in which group $v$ ends up. However, by Chernoff's bound, with probability $1-o(n^{-1})$, at least $np/(\sqrt{\omega}\log n)$ times the group to which $v$ belongs is chosen. We condition on this event. Note that if group 2 is chosen and $v$ belongs to this group, the probability that $v$ is not selected is at most $1 - \omega\log^2 n/(np)$ (as in (8) for small sets). Similarly, if group 1 is chosen and $v$ belongs to this group, the probability that $v$ is not chosen is at most $1 - \omega\log^2 n/(9np)$, as each medium set has size at most $n/(\omega\log^2 n)$.

Denote by $Y_v$ the random variable counting the number of erasers used for vertex $v$ corresponding to medium sets, before eventually $v$ gets a permanent colour. As in (8),
$$\Pr\left[Y_v \ge \frac{3np}{\sqrt{\omega}\log n}\right] \le o(n^{-1}) + \left(1 - \frac{\omega\log^2 n}{9np}\right)^{np/(\sqrt{\omega}\log n)} \le o(n^{-1}) + \exp\left(-\frac{\sqrt{\omega}\log n}{9}\right) = o(n^{-1}),$$
and thus, as before, by a union bound, a.a.s. the desired bound on the number of erasers used for medium sets holds for all vertices.

Combining the bounds on the number of erasers used for large, medium, and small sets, we get that, regardless of the strategy used by Mr. Paint, Mrs. Correct uses at most $(1+o(1))n/(2\log_b(np))$ erasers for each vertex, and the theorem follows.

4. Proof of Theorem 2

As in the proof of Theorem 1, the lower bound follows immediately from (1), and so it remains to show that Mrs. Correct has a strategy that a.a.s. requires only $O(n/\log_b(np))$ erasers on each vertex (recall also (2)). We will use the same definitions of small, medium, and large sets as before, but we set here $\omega = \log\log n$; as pointed out right after Lemma 6, other choices of $\omega$ are possible, but our choice gives the strongest result (as we assume the weakest condition). The argument for small sets remains exactly the same; a.a.s. it is enough to have $o(n/\log_b(np))$ erasers on each vertex to deal with all small sets that Mr. Paint may present. To deal with large sets, by Lemma 6, since we aim for a statement that holds a.a.s., we may assume that whenever a large set is presented, Mrs. Correct can always find an independent set of size at least $\varepsilon(1-\varepsilon)\log(np)/(3p)$; she keeps the colour on these vertices and uses erasers for all remaining vertices. The number of large sets presented is therefore at most $3(\varepsilon(1-\varepsilon))^{-1}np/\log(np) = O(n/\log_b(np))$, as needed. This is enough to determine the order of the on-line choice number but, if one aims for a better constant, then Remark 7 implies that for $np = \log^{C+o(1)} n$ we are guaranteed to have an independent set of size $i = i(C)$, where
$$i = (1+o(1)) \cdot \begin{cases} \log(np)/p & \text{if } C \to \infty \\ \left(1-\frac{2}{C}\right)\log(np)/p & \text{if } C \in [4,\infty) \\ \frac{1}{2}\log(np)/p & \text{if } C \in (2,4). \end{cases}$$
We will show below that the contribution of medium sets is of negligible order, and hence the upper bound on the number of erasers needed for large sets will also apply (up to lower order terms) to the total number of erasers needed. As a result, we will get the following bounds: $(2+o(1))\chi(G)$ for $C \to \infty$, $(\frac{2C}{C-2}+o(1))\chi(G)$ for $C \in [4,\infty)$, and $(4+o(1))\chi(G)$ for $C \in (2,4)$.
The strategy for medium sets has to be substantially modified. Suppose that a medium set $S$ of size $s$ is presented at some point of the game. Recall that $np/(\omega\log^2 n) < s < n/(\omega\log^2 n)$, where $\omega = \log\log n$. Since we aim for a statement that holds a.a.s., we may assume that Mrs. Correct can partition $S$ in the following way: by applying Lemma 8(i) repeatedly, as long as at least $s_0 = 10\log n/p$ vertices remain, she can find independent sets of size $\lceil 1/(9p)\rceil$ and remove them from $S$. (Note that for sparse graphs it might happen that $s < s_0$, and so the lemma cannot be applied even once. In such a situation, we simply move on to the next step.) If the number of remaining vertices, $r$, satisfies $2^{-i}s_0 \le r < 2^{-i+1}s_0$ for some $i = i(n) \in \mathbb{N}$ and $r > np/(\omega\log^2 n)$, then by Lemma 8(ii), she can find an independent set of size $\lceil i/(9p2^i)\rceil$. Then, she removes that independent set, and continues iteratively with the remaining vertices of $S$. Note that there are clearly $M \le \log n/\log 2 = O(\log n)$ different sizes of independent sets corresponding to different values of $i$. Finally, if $r \le np/(\omega\log^2 n)$, she puts all the remaining vertices into a set $J$ (not necessarily independent), and stops partitioning.

We classify the vertices of $S$ into types, depending on the size of the set in the partition of $S$ to which they belong: more precisely, vertices from independent sets of size $\lceil 1/(9p)\rceil$ are of type-0; vertices from independent sets of size $\lceil i/(9p2^i)\rceil$ for some $1 \le i \le M$ are of type-$i$; and vertices from the last set $J$, of size at most $np/(\omega\log^2 n)$, are of type-$(M+1)$. Of course, these definitions depend on the particular time a set $S$ is presented by Mr. Paint. In particular, if a vertex is presented several times by Mr. Paint, each time it might be of a different type. Finally, we classify the vertices of $S$ into 3 groups: vertices of type-0 form group-0, vertices of type-1 up to type-$M$ form group-1, and vertices of type-$(M+1)$ form group-2.

Mrs. Correct now chooses one of the three groups, each with probability $1/3$. If group-0 is selected, she then picks uniformly at random one independent set within the group. If group-2 is selected, she picks uniformly at random a single vertex inside this group. Finally, if group-1 is selected, she first picks a random type. The probability that type-$i$ is selected is equal to
$$q_i := \frac{1/i}{\sum_{j=1}^{M} 1/j} \ge \frac{1/i}{\sum_{j=1}^{\lfloor\log n/\log 2\rfloor} 1/j} = \frac{1+o(1)}{i\log\log n}.$$
Then, within the selected type, an independent set is selected uniformly at random. If the group or type chosen by Mrs. Correct has no vertices in it, she picks no vertex to colour permanently, which is clearly a suboptimal strategy.

Our goal is to show that a.a.s. each vertex $v$ appears in fewer than
$$\frac{4np}{\sqrt{\omega}\log(np)} = o\left(\frac{n}{\log_b(np)}\right) = o(\chi(G))$$
medium sets before its colour is accepted. If this holds, then the number of erasers needed for each vertex (due to medium sets) is negligible.
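The partition of a presented medium set into independent sets of geometrically shrinking sizes, followed by the random choice of a group and a type, can be sketched schematically as follows. This is our illustration, not the paper's code; `find_independent_set(S, size)` is a hypothetical stand-in for the routine guaranteed by Lemma 8, and the sketch samples the type only among the non-empty types, whereas the proof samples $i$ among all of $1,\dots,M$ and accepts nothing if the chosen type happens to be empty.

```python
import math
import random

def partition_medium_set(S, n, p, omega, find_independent_set):
    """Partition a medium set S as in the proof of Theorem 2: peel off independent
    sets of size ceil(1/(9p)) while at least s0 = 10 log n / p vertices remain,
    then sets of size ceil(i/(9 p 2^i)) while more than np/(omega log^2 n) vertices
    remain, and dump the rest into J.  Assumes find_independent_set returns a set
    of the requested size, as Lemma 8 guarantees a.a.s."""
    s0 = 10 * math.log(n) / p
    threshold = n * p / (omega * math.log(n) ** 2)
    remaining = set(S)
    type0, typed = [], {}                     # typed[i] = list of type-i independent sets
    while len(remaining) >= s0:
        I = find_independent_set(remaining, math.ceil(1 / (9 * p)))
        type0.append(I)
        remaining -= I
    while len(remaining) > threshold:
        i = max(1, math.ceil(math.log2(s0 / len(remaining))))   # 2^-i s0 <= r < 2^-(i-1) s0
        I = find_independent_set(remaining, math.ceil(i / (9 * p * 2 ** i)))
        typed.setdefault(i, []).append(I)
        remaining -= I
    return type0, typed, remaining            # "remaining" plays the role of J

def choose_accepted_vertices(type0, typed, J, rng=random.Random(0)):
    """Pick the vertices whose colour Mrs. Correct accepts for this medium set."""
    group = rng.randrange(3)                  # each group chosen with probability 1/3
    if group == 0 and type0:
        return set(rng.choice(type0))                       # one random type-0 set
    if group == 2 and J:
        return {rng.choice(sorted(J))}                      # one random vertex of J
    if group == 1 and typed:
        types = sorted(typed)
        weights = [1.0 / i for i in types]                  # q_i proportional to 1/i
        i = rng.choices(types, weights=weights)[0]
        return set(rng.choice(typed[i]))
    return set()                                            # empty group/type: accept nothing
```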
This time, the probability that v is ≤ ≤ permanently coloured is at least i 1+o(1) ωlog(np)logn (11) q , i 100logn ≥ 100(logn)(loglogn) ≥ np where we used that ω = loglogn and np/(log(np)) log2+ε/2n. Finally, if group-2 is chosen and ≥ v belongs to this group, the probability that v is permanently coloured is at least 1 ωlog2n (12) = . np/(ωlog2n) np
