Probability Theory and Mathematical Statistics (Spring 2023)/Problem Set 1

From TCS Wiki
Zhangxy (talk | contribs)
<p>Without further notice, we are working on probability space <math>(\Omega,\mathcal{F},\mathbf{Pr})</math>.</p>
<h2 id="problem-1-classic-examples">Problem 1 (Classic examples: Symmetric 1D random walk)</h2>
<ul>
<li><p>[<strong>Symmetric 1D random walk (I)</strong>] Let <math>(X_i)_{i \in \mathbb{N}}</math> be i.i.d. random variables, where <math>X_i</math> is uniformly chosen from <math>\{-1,1\}</math>. Let <math>S_n = \sum_{i=1}^{n} X_i</math>. Prove that <math>\sum_{i=1}^n \mathbf{Pr}(S_i = 0) = +\infty</math>.</p>
</li>
<li><p>[<strong>Symmetric 1D random walk (II)</strong>] Let <math>(X_i)_{i \in \mathbb{N}}</math> be i.i.d. random variables, where <math>X_i</math> is uniformly chosen from <math>\{-1,1\}</math>. Let <math>S_n = \sum_{i=1}^{n} X_i</math> and <math>U = \min \{n \in \mathbb{N}_+ \mid S_n = 0\}</math> (<math>U = \infty</math> if the set is empty). Prove that <math>\mathbb{E}(U) = \sum_{k=1}^{+\infty} \mathbf{Pr}(U \ge k) = +\infty</math>.</p>
</li>
<li><p>[<strong>Symmetric 1D random walk (III)</strong>] Let <math>(X_i)_{i \in \mathbb{N}}</math> be i.i.d. random variables, where <math>X_i</math> is uniformly chosen from <math>\{-1,1\}</math>. Let <math>S_n = \sum_{i=1}^{n} X_i</math>. Find the probability <math>\mathbf{Pr}\left(\forall 1 \le i \le n, |S_i| \le m \right)</math>. (Hint: André's reflection principle)</p>
</li>
</ul>
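<p>Not part of the assignment, but the claims about <math>S_n</math> are easy to sanity-check numerically. The sketch below (function name and parameters are ours) estimates <math>\mathbf{Pr}(S_i = 0)</math> by simulation; for even <math>i</math> the exact value is <math>\binom{i}{i/2}2^{-i}</math>, e.g. <math>1/2</math> at <math>i=2</math> and <math>3/8</math> at <math>i=4</math>.</p>

```python
import random

def walk_zero_probs(n_steps, n_trials=20000, seed=0):
    """Monte Carlo estimates of Pr(S_i = 0) for i = 1..n_steps,
    where S_i is the symmetric 1D random walk after i steps."""
    rng = random.Random(seed)
    hits = [0] * (n_steps + 1)
    for _ in range(n_trials):
        s = 0
        for i in range(1, n_steps + 1):
            s += rng.choice((-1, 1))  # one uniform +/-1 step
            if s == 0:
                hits[i] += 1
    return [h / n_trials for h in hits[1:]]

probs = walk_zero_probs(10)
# probs[i-1] estimates Pr(S_i = 0): exactly 0 for odd i,
# and close to C(i, i/2) / 2^i for even i.
```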
<h2 id="problem-2-principle-of-inclusion-and-exclusion">Problem 2 (Principle of Inclusion and Exclusion)</h2>
<p>In the following tasks, for all <math>i \in \mathbb{N}</math>, <math>A_i</math> is an event.</p>
<ul>
<li>[<strong>Union bound</strong>] Prove that <math>\mathbf{Pr}\left( \bigcup_{i=1}^{+\infty} A_i \right) \le \sum_{i=1}^{+\infty} \mathbf{Pr}\left(A_i\right)</math>.</li>
<li>[<strong>Principle of Inclusion and Exclusion (PIE)</strong>] Prove that <math>\mathbf{Pr}\left( \bigcup_{i=1}^n A_i\right) = \sum_{\emptyset \neq S \subseteq [n]} (-1)^{|S|-1} \mathbf{Pr}\left( \bigcap_{i \in S} A_i \right)</math>, where <math>[n]=\{1,2,\ldots,n\}</math>.</li>
<li>[<strong>Derangements</strong>] Using PIE, find the number of permutations <math>p = (p_1,p_2,\ldots,p_n)</math> satisfying <math>p_i \neq i</math> for all <math>1 \le i \le n</math>. Here <math>p=(p_1,p_2,\ldots,p_n)</math> is a permutation if each integer from <math>1</math> to <math>n</math> appears exactly once in <math>p_1, p_2,\ldots, p_n</math>.</li>
<li>[<strong>Bonferroni&#39;s inequality and Kounias&#39;s inequality</strong>] Prove that
<math>
\sum_{i=1}^n \mathbf{Pr}(A_i) - \sum_{1 \le i< j \le n} \mathbf{Pr}(A_i \cap A_j)\le \mathbf{Pr}\left(\bigcup_{i=1}^n A_i\right) \le \sum_{i=1}^n \mathbf{Pr} \left( A_i\right) - \sum_{i=2}^n \mathbf{Pr}(A_1 \cap A_i).
</math>
(Hint: try using a Venn diagram to understand these inequalities)</li>
</ul>
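<p>A quick numerical check of the PIE identity (a sketch under our own naming, not part of the assignment): for the uniform measure on a finite set, PIE for probabilities is just PIE for cardinalities, which can be verified by enumeration.</p>

```python
from itertools import combinations

def union_size_pie(sets):
    """|A_1 ∪ ... ∪ A_n| computed via inclusion-exclusion:
    the sum over nonempty S ⊆ [n] of (-1)^(|S|-1) |∩_{i∈S} A_i|."""
    n = len(sets)
    total = 0
    for r in range(1, n + 1):
        for idx in combinations(range(n), r):
            inter = set.intersection(*(sets[i] for i in idx))
            total += (-1) ** (r - 1) * len(inter)
    return total

# Small example: the alternating sum reproduces the true union size.
A = [{1, 2, 3}, {2, 3, 4}, {4, 5}]
u = union_size_pie(A)  # equals len({1, 2, 3, 4, 5}) = 5
```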
<h2 id="problem-3-probability-space">Problem 3 (Probability space)</h2>
<ul>
<li><p>[<strong>Nonexistence of probability space</strong>] Prove that it is impossible to draw uniformly from the set of natural numbers <math>\mathbb{N}</math>, i.e. there exists no probability space <math>(\mathbb{N},2^{\mathbb{N}},\mathbf{Pr})</math> such that <math>\mathbf{Pr}(X = i) = \mathbf{Pr}(X = j)</math> for all <math>i, j \in \mathbb{N}</math>. Please clarify why your argument fails to prove that it is impossible to draw uniformly from the interval <math>[0,1]</math>, i.e. that there exists no probability space <math>([0,1],\mathcal{F},\mathbf{Pr})</math> such that for any interval <math>(l,r] \subseteq [0,1]</math>, <math>(l,r] \in \mathcal{F}</math> and <math>\mathbf{Pr}(X \in (l,r]) = r-l</math>. (Actually, such a probability measure exists and is called the Lebesgue measure.) </p>
</li>
<li><p>[<strong>Limit of events</strong>] Let <math>(A_i)_{i \in \mathbb{N}}</math> be a sequence of events.
Define <math>\limsup A_n = \bigcap_{n=1}^{+\infty} \bigcup_{k=n}^{+\infty} A_k</math> and <math>\liminf A_n = \bigcup_{n=1}^{+\infty} \bigcap_{k=n}^{+\infty} A_k</math>. Prove that </p>
<ul>
<li><math>\liminf A_n, \limsup A_n \in \mathcal{F}</math> and <math>\liminf A_n \subseteq \limsup A_n</math>;</li>
<li><math>\liminf A_n = \{\omega \in \Omega \mid \omega \in A_n \text{ for all but finitely many values of } n\}</math>;</li>
<li><math>\limsup A_n = \{\omega \in \Omega \mid \omega \in A_n \text{ for infinitely many values of } n\}</math>;</li>
<li>If <math>A = \liminf A_n = \limsup A_n</math>, then <math>\mathbf{Pr}(A_n) \rightarrow \mathbf{Pr}(A)</math> when <math>n</math> tends to infinity;</li>
<li>Furthermore, if <math>\bigcup_{k=n}^{+\infty} A_k</math> and <math>\bigcap_{k=n}^{+\infty} A_k</math> are independent for all <math>n</math> and <math>A = \liminf A_n = \limsup A_n</math>, then <math>\mathbf{Pr}(A)</math> is either <math>0</math> or <math>1</math>.</li>
</ul>
</li>
</ul>
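<p>To build intuition for <math>\limsup</math> and <math>\liminf</math> of events (a hypothetical illustration, not required by the problem): for a periodic sequence of sets the tail unions and intersections stabilize, so <math>\limsup A_n</math> reduces to the union over one period and <math>\liminf A_n</math> to the intersection.</p>

```python
def limsup_liminf(period):
    """For an (eventually) periodic sequence of finite sets A_1, A_2, ...
    given by one period, every tail union equals the union over the
    period and every tail intersection equals the intersection over it,
    so limsup and liminf collapse to those two sets."""
    limsup = set().union(*period)
    liminf = set(period[0]).intersection(*period)
    return limsup, liminf

# Alternating events: A_n = {0} for odd n, {1} for even n.
sup_, inf_ = limsup_liminf([{0}, {1}])
# limsup = {0, 1}: each point occurs infinitely often.
# liminf = set(): no point occurs in all but finitely many A_n.
```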
<h2 id="problem-4-conditional-probability">Problem 4 (Conditional probability)</h2>
<ul>
<li><p>[<strong>Symmetric 1D random walk (IV)</strong>] Let <math>(X_i)_{i \in \mathbb{N}_+}</math> be i.i.d. random variables, where <math>X_i</math> is uniformly chosen from <math>\{-1,1\}</math>. Let <math>S_n = \sum_{i=1}^{n} X_i</math>. Compute <math>\textbf{Pr}(S_4 = 4|S_8 = 6)</math>.</p>
</li>
<li><p>[<strong>Conditional version of the total probability</strong>] Let <math>C_1,\cdots, C_n</math> be disjoint events that form a partition of the sample space. Let <math>A</math> and <math>B</math> be events such that <math>\textbf{Pr}(B\cap C_i)>0</math> for all <math>i</math>. Show that
<math>\textbf{Pr}(A|B) = \sum_{i = 1}^n \textbf{Pr}(C_i|B)\cdot \textbf{Pr}(A|B\cap C_i)</math></p>
</li>
<li><p>[<strong>Bayes&#39; Law</strong>] There are <math>n</math> urns filled with black and white balls. Let <math>f_i</math> be the fraction of white balls in urn <math>i</math>. In stage 1 an urn is chosen uniformly at random (i.e., each urn has probability <math>1/n</math> of being chosen). In stage 2 a ball is drawn uniformly at random from the urn. Let <math>U_i</math> be the event that urn <math>i</math> was selected at stage 1. Let <math>W</math> denote the event that a white ball is drawn at stage 2, and <math>B</math> denote the event that a black ball is drawn at stage 2. </p>
<ul>
<li>Use Bayes&#39; Law to express <math>\textbf{Pr}(U_i|W)</math> in terms of <math>f_1,\cdots, f_n</math>.</li>
<li>Let&#39;s say there are three urns and urn 1 has 30 white and 10 black balls, urn 2 has 20 white and 20 black balls, and urn 3 has 10 white balls and 30 black balls. Compute <math>\textbf{Pr}(U_1|B)</math>, <math>\textbf{Pr}(U_2|B)</math> and <math>\textbf{Pr}(U_3|B)</math>.</li>
</ul>
</li>
</ul>
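<p>A sketch of the Bayes computation, with a made-up two-urn example that is deliberately <em>not</em> the urns from the problem: with a uniform prior over urns, the prior cancels and the posterior is proportional to each urn&#39;s likelihood of the observed color.</p>

```python
def posterior_urn(white_counts, black_counts, drew_white):
    """Posterior Pr(U_i | observation) when the urn is chosen uniformly.
    With f_i the white fraction of urn i, the likelihood of the draw is
    f_i (white) or 1 - f_i (black); the uniform prior 1/n cancels in the
    normalization, leaving likes[i] / sum(likes)."""
    fracs = [w / (w + b) for w, b in zip(white_counts, black_counts)]
    likes = [f if drew_white else 1 - f for f in fracs]
    z = sum(likes)  # normalizing constant
    return [l / z for l in likes]

# Hypothetical example (NOT the urns from the problem): urn 1 is all
# white (f_1 = 1), urn 2 is half white (f_2 = 1/2); a white ball is drawn.
post = posterior_urn([10, 5], [0, 5], drew_white=True)
# → [2/3, 1/3]
```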
<h2 id="problem-5-independence">Problem 5 (Independence)</h2>
<p>Consider <math>n</math> independent and identically distributed Bernoulli random variables <math>(X_1, X_2, \cdots, X_n)</math> corresponding to <math>n</math> identical Bernoulli trials, where each trial succeeds with probability <math>p</math> for <math>0&lt;p&lt;1</math>.</p>
<ul>
<li><p>[<strong>Limited independence</strong>] Let <math>n = 2</math>. Construct three events <math>A,B</math> and <math>C</math> in <math>\mathcal{F}</math> such that <math>A, B</math> and <math>C</math> are pairwise independent but are not (mutually) independent. You need to explain why your solution is right.</p>
</li>
<li><p>[<strong>Product distribution</strong>] Suppose someone has observed the outcomes of <math>n</math> trials, and she told you that precisely <math>k</math> of them succeeded, for <math>0< k< n</math>. Now you want to predict the outcome of the <math>(n-1)</math>-th trial while <math>p</math> is unknown. One way to estimate <math>p</math> is to find the <math>p</math> that makes the observed result most probable, namely to solve
  <math>\arg \max_{p\in(0,1)} \mathbf{Pr}_p [\text{precisely } k \text{ of the } n \text{ trials succeed}].</math></p>
<ul>
<li>Estimate <math>p</math> by maximizing the given objective function.</li>
<li>If someone tells you exactly which <math>k</math> of the <math>n</math> trials succeeded, would that help you estimate <math>p</math> more accurately? Why?</li>
</ul>
</li>
</ul>
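<p>The maximization can be sanity-checked numerically before doing it analytically. The sketch below (our own function names) evaluates the binomial likelihood on a grid over <math>(0,1)</math>; it is a numerical check, not the calculus the problem asks for.</p>

```python
from math import comb

def binomial_likelihood(p, n, k):
    """Pr_p[exactly k of n independent trials succeed]."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

def mle_grid(n, k, grid=1000):
    """Maximize the likelihood over an evenly spaced grid in (0, 1);
    a crude numerical stand-in for the analytic argmax."""
    ps = [(i + 1) / (grid + 1) for i in range(grid)]
    return max(ps, key=lambda p: binomial_likelihood(p, n, k))

p_hat = mle_grid(n=10, k=3)  # grid argmax for 3 successes in 10 trials
```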
<h2 id="problem-6-probabilistic-method">Problem 6 (Probabilistic method)</h2>
<p>A CNF formula over variables <math>x_1,\cdots, x_n</math> is a conjunction (AND) of clauses, where each clause is an OR of literals. A literal is either a variable <math>x_i</math> or its negation <math>\bar{x}_i</math>. A CNF formula is satisfiable if there is an assignment to the variables that makes the formula true. A <math>k</math>-CNF formula is a CNF formula in which each clause has exactly <math>k</math> literals (without repetition).</p>
<ul>
<li>[<strong>Satisfiability (I)</strong>] Let <math>F</math> be a <math>k</math>-CNF with fewer than <math>2^k</math> clauses. Use the probabilistic method to show that <math>F</math> must be satisfiable. </li>
<li>[<strong>Satisfiability (II)</strong>] Give a constructive proof of the same claim. That is, prove that <math>F</math> is satisfiable by showing how to construct an assignment that satisfies <math>F</math>. Your construction doesn&#39;t have to be efficient. By doing this, you may better understand the probabilistic method.</li>
<li>[<strong>Satisfiability (III)</strong>] Let <math>F</math> be a <math>k</math>-CNF with <math>m\geq 2^k</math> clauses. Show that there must exist an assignment satisfying at least <math>\lfloor m(1-1/2^k) \rfloor</math> clauses in <math>F</math>. (Hint: Consider overlaps of events in a Venn diagram.)</li>
</ul>
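<p>The expectation underlying these problems can be observed empirically (a hypothetical simulation, with our own encoding of clauses): under a uniformly random assignment, each clause of <math>k</math> distinct literals fails with probability exactly <math>2^{-k}</math>, so the average number of satisfied clauses over many random assignments should be close to <math>m(1-2^{-k})</math>.</p>

```python
import random

def random_kcnf(n_vars, m, k, rng):
    """A random k-CNF: each clause has k distinct variables, each
    possibly negated. A literal is (v, sign); sign True means x_v,
    sign False means its negation."""
    return [
        [(v, rng.random() < 0.5) for v in rng.sample(range(n_vars), k)]
        for _ in range(m)
    ]

def satisfied(clauses, assignment):
    """Number of clauses made true by a Boolean assignment."""
    return sum(
        any(assignment[v] == sign for v, sign in clause)
        for clause in clauses
    )

rng = random.Random(1)
k, n_vars, m = 3, 10, 20
F = random_kcnf(n_vars, m, k, rng)
trials = 5000
avg = sum(
    satisfied(F, [rng.random() < 0.5 for _ in range(n_vars)])
    for _ in range(trials)
) / trials
# avg should be close to m * (1 - 2**-k) = 17.5 for these parameters.
```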

Revision as of 16:33, 3 March 2023

  • The current problem set is not the final version.
  • Every solution must include a complete derivation; answers may be written in either Chinese or English.
  • We recommend typesetting your homework with LaTeX, Markdown, or similar tools.
