<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://tcs.nju.edu.cn/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Yqzhu</id>
	<title>TCS Wiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://tcs.nju.edu.cn/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Yqzhu"/>
	<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=Special:Contributions/Yqzhu"/>
	<updated>2026-04-27T03:36:41Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.45.3</generator>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)&amp;diff=13700</id>
		<title>概率论与数理统计 (Spring 2026)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)&amp;diff=13700"/>
		<updated>2026-04-23T09:47:02Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Assignments */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;&#039;&#039;&#039;概率论与数理统计&#039;&#039;&#039;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Probability Theory&#039;&#039;&#039; &amp;lt;br&amp;gt; &amp;amp; &#039;&#039;&#039;Mathematical Statistics&#039;&#039;&#039;&amp;lt;/font&amp;gt;&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =Instructor&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data2   = &#039;&#039;&#039;尹一通&#039;&#039;&#039;&lt;br /&gt;
|header3 = &lt;br /&gt;
|label3  = Email&lt;br /&gt;
|data3   = yinyt@nju.edu.cn  &lt;br /&gt;
|header4 =&lt;br /&gt;
|label4  = Office&lt;br /&gt;
|data4   = 计算机系 804&lt;br /&gt;
|header5 = &lt;br /&gt;
|label5  = &lt;br /&gt;
|data5   = &#039;&#039;&#039;刘景铖&#039;&#039;&#039;&lt;br /&gt;
|header6 = &lt;br /&gt;
|label6  = Email&lt;br /&gt;
|data6   = liu@nju.edu.cn  &lt;br /&gt;
|header7 =&lt;br /&gt;
|label7  = Office&lt;br /&gt;
|data7   = 计算机系 516&lt;br /&gt;
|header8 = Class&lt;br /&gt;
|label8  = &lt;br /&gt;
|data8   = &lt;br /&gt;
|header9 =&lt;br /&gt;
|label9  = Class meeting&lt;br /&gt;
|data9   = Wednesday, 9am-12pm&amp;lt;br&amp;gt;&lt;br /&gt;
仙Ⅱ-212&lt;br /&gt;
|header10=&lt;br /&gt;
|label10 = Office hour&lt;br /&gt;
|data10  = TBA &amp;lt;br&amp;gt;计算机系 804（尹一通）&amp;lt;br&amp;gt;计算机系 516（刘景铖）&lt;br /&gt;
|header11= Textbook&lt;br /&gt;
|label11 = &lt;br /&gt;
|data11  = &lt;br /&gt;
|header12=&lt;br /&gt;
|label12 = &lt;br /&gt;
|data12  = [[File:概率导论.jpeg|border|100px]]&lt;br /&gt;
|header13=&lt;br /&gt;
|label13 = &lt;br /&gt;
|data13  = &#039;&#039;&#039;概率导论&#039;&#039;&#039;（第2版·修订版）&amp;lt;br&amp;gt; Dimitri P. Bertsekas and John N. Tsitsiklis&amp;lt;br&amp;gt; 郑忠国 童行伟 译；人民邮电出版社 (2022)&lt;br /&gt;
|header14=&lt;br /&gt;
|label14 = &lt;br /&gt;
|data14  = [[File:Grimmett_probability.jpg|border|100px]]&lt;br /&gt;
|header15=&lt;br /&gt;
|label15 = &lt;br /&gt;
|data15  = &#039;&#039;&#039;Probability and Random Processes&#039;&#039;&#039; (4E) &amp;lt;br&amp;gt; Geoffrey Grimmett and David Stirzaker &amp;lt;br&amp;gt;  Oxford University Press (2020)&lt;br /&gt;
|header16=&lt;br /&gt;
|label16 = &lt;br /&gt;
|data16  = [[File:Probability_and_Computing_2ed.jpg|border|100px]]&lt;br /&gt;
|header17=&lt;br /&gt;
|label17 = &lt;br /&gt;
|data17  = &#039;&#039;&#039;Probability and Computing&#039;&#039;&#039; (2E) &amp;lt;br&amp;gt; Michael Mitzenmacher and Eli Upfal &amp;lt;br&amp;gt;   Cambridge University Press (2017)&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Probability Theory and Mathematical Statistics&#039;&#039; (概率论与数理统计) class of Spring 2026. Students who take this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= Announcement =&lt;br /&gt;
* TBA&lt;br /&gt;
&lt;br /&gt;
= Course info =&lt;br /&gt;
* &#039;&#039;&#039;Instructors&#039;&#039;&#039;: &lt;br /&gt;
:* [http://tcs.nju.edu.cn/yinyt/ 尹一通]：[mailto:yinyt@nju.edu.cn &amp;lt;yinyt@nju.edu.cn&amp;gt;]，计算机系 804 &lt;br /&gt;
:* [https://liuexp.github.io 刘景铖]：[mailto:liu@nju.edu.cn &amp;lt;liu@nju.edu.cn&amp;gt;]，计算机系 516 &lt;br /&gt;
* &#039;&#039;&#039;Teaching assistant&#039;&#039;&#039;:&lt;br /&gt;
** 鞠哲：[mailto:juzhe@smail.nju.edu.cn &amp;lt;juzhe@smail.nju.edu.cn&amp;gt;]，计算机系 426&lt;br /&gt;
** 祝永祺：[mailto:652025330045@smail.nju.edu.cn &amp;lt;652025330045@smail.nju.edu.cn&amp;gt;]，计算机系 426&lt;br /&gt;
* &#039;&#039;&#039;Class meeting&#039;&#039;&#039;:&lt;br /&gt;
** Wednesday: 9am-12pm, 仙Ⅱ-212&lt;br /&gt;
* &#039;&#039;&#039;Office hour&#039;&#039;&#039;: &lt;br /&gt;
:* TBA, 计算机系 804（尹一通）&lt;br /&gt;
:* TBA, 计算机系 516（刘景铖）&lt;br /&gt;
:* &#039;&#039;&#039;QQ group&#039;&#039;&#039;: 1090092561 (to request to join, provide your name, department, and student ID)&lt;br /&gt;
&lt;br /&gt;
= Syllabus =&lt;br /&gt;
The course consists of three parts:&lt;br /&gt;
* &#039;&#039;&#039;Classical probability theory&#039;&#039;&#039;: probability spaces, random variables and their numerical characteristics, multivariate and continuous random variables, limit theorems&lt;br /&gt;
* &#039;&#039;&#039;Probability and computing&#039;&#039;&#039;: concentration of measure, the probabilistic method, and selected topics on discrete stochastic processes&lt;br /&gt;
* &#039;&#039;&#039;Mathematical statistics&#039;&#039;&#039;: statistical inference concepts such as parameter estimation, hypothesis testing, Bayesian estimation, and linear regression&lt;br /&gt;
&lt;br /&gt;
For the first two parts, students are expected to master the basic concepts, to understand the key phenomena and the principles behind them, and to apply the methods flexibly to solve related problems. For the third part, students are expected to be familiar with the basic notions of mathematical statistics and with the typical statistical models and inference problems.&lt;br /&gt;
&lt;br /&gt;
Through this course, students should become fluent in the language of probability, learn to understand and model the world with probabilistic thinking, and command the mathematical tools of probability to analyze and solve problems in their own field.&lt;br /&gt;
&lt;br /&gt;
=== 教材与参考书 Course Materials ===&lt;br /&gt;
* &#039;&#039;&#039;[BT]&#039;&#039;&#039; 概率导论（第2版·修订版）(&#039;&#039;Introduction to Probability&#039;&#039;, 2nd edition), by Dimitri P. Bertsekas and John N. Tsitsiklis; translated by 郑忠国 and 童行伟; 人民邮电出版社 (2022).&lt;br /&gt;
* &#039;&#039;&#039;[GS]&#039;&#039;&#039; &#039;&#039;Probability and Random Processes&#039;&#039;, by Geoffrey Grimmett and David Stirzaker; Oxford University Press; 4th edition (2020).&lt;br /&gt;
* &#039;&#039;&#039;[MU]&#039;&#039;&#039; &#039;&#039;Probability and Computing: Randomization and Probabilistic Techniques in Algorithms and Data Analysis&#039;&#039;, by Michael Mitzenmacher, Eli Upfal; Cambridge University Press; 2nd edition (2017).&lt;br /&gt;
&lt;br /&gt;
=== 成绩 Grading Policy ===&lt;br /&gt;
* Grading: the course has several problem sets and one final exam. The final grade combines the homework scores and the final exam score.&lt;br /&gt;
* Late submissions: if, for a legitimate reason, you cannot finish an assignment on time, contact the instructors in advance and explain. Otherwise, late homework will not be accepted.&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
Academic integrity is the basic ethical baseline for every student and scholar engaged in academic work. This course will spare no effort to uphold the norms of academic integrity, and violations of this baseline will not be tolerated.&lt;br /&gt;
&lt;br /&gt;
Principle for completing homework: the work bearing your name must be your own contribution. Discussion is allowed while working on the assignments, provided all participants are at a comparable stage of completion. The execution of the key ideas and the writing of the solutions, however, must be done independently, and you must acknowledge everyone who took part in the discussion in your submission. Discussion and acknowledgment that follow these rules will not affect your score. No other form of collaboration is allowed, and in particular no "discussion" with classmates who have already finished the assignment.&lt;br /&gt;
&lt;br /&gt;
This course takes zero tolerance toward plagiarism. Direct copying of text from the work of others (publications, internet resources, other students&#039; homework, etc.), as well as copying of key ideas or key elements, counts as plagiarism under the interpretation of the [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]. A plagiarist&#039;s score will be cancelled. If mutual copying is discovered, &amp;lt;font color=red&amp;gt;both the copier and the student who was copied from will have their scores cancelled&amp;lt;/font&amp;gt;. Please actively prevent your homework from being copied by others.&lt;br /&gt;
&lt;br /&gt;
Academic integrity shapes a student&#039;s personal character and is vital to the proper functioning of the whole educational system. Committing academic misconduct for a few points not only makes you a cheater, it also devalues the honest efforts of others. Let us work together to maintain an environment of integrity.&lt;br /&gt;
&lt;br /&gt;
= Assignments =&lt;br /&gt;
*[[概率论与数理统计 (Spring 2026)/Problem Set 1|Problem Set 1]]  Please submit it to [mailto:pr2026_nju@163.com pr2026_nju@163.com] before class (9am UTC+8) on 2026/4/1, with the file named &#039;&amp;lt;font color=red &amp;gt;学号_姓名_A1.pdf&amp;lt;/font&amp;gt;&#039; (student ID_name_A1.pdf).&lt;br /&gt;
** [[概率论与数理统计 (Spring 2026)/第一次作业提交名单|Submission list for Problem Set 1]]&lt;br /&gt;
&lt;br /&gt;
*[[概率论与数理统计 (Spring 2026)/Problem Set 2|Problem Set 2]]  Please submit it to [mailto:pr2026_nju@163.com pr2026_nju@163.com] before class (9am UTC+8) on 2026/4/22, with the file named &#039;&amp;lt;font color=red &amp;gt;学号_姓名_A2.pdf&amp;lt;/font&amp;gt;&#039; (student ID_name_A2.pdf).&lt;br /&gt;
** [[概率论与数理统计 (Spring 2026)/第二次作业提交名单|Submission list for Problem Set 2]]&lt;br /&gt;
&lt;br /&gt;
*[[概率论与数理统计 (Spring 2026)/Problem Set 3|Problem Set 3]]  Please submit it to [mailto:pr2026_nju@163.com pr2026_nju@163.com] before class (9am UTC+8) on 2026/5/13, with the file named &#039;&amp;lt;font color=red &amp;gt;学号_姓名_A3.pdf&amp;lt;/font&amp;gt;&#039; (student ID_name_A3.pdf).&lt;br /&gt;
&lt;br /&gt;
= Lectures =&lt;br /&gt;
# [http://tcs.nju.edu.cn/slides/prob2026/Intro.pdf Course introduction]&lt;br /&gt;
# [http://tcs.nju.edu.cn/slides/prob2026/ProbSpace.pdf Probability space]&lt;br /&gt;
#* Reading: &#039;&#039;&#039;[BT] Chapter 1&#039;&#039;&#039; or &#039;&#039;&#039;[GS] Chapter 1&#039;&#039;&#039;&lt;br /&gt;
#* [[概率论与数理统计 (Spring 2026)/Entropy and volume of Hamming balls|Entropy and volume of Hamming balls]]&lt;br /&gt;
#* [[概率论与数理统计 (Spring 2026)/Karger&#039;s min-cut algorithm| Karger&#039;s min-cut algorithm]]&lt;br /&gt;
# [http://tcs.nju.edu.cn/slides/prob2026/RandVar.pdf Random variables]&lt;br /&gt;
#* Reading: &#039;&#039;&#039;[BT] Chapter 2&#039;&#039;&#039; or &#039;&#039;&#039;[GS] Chapter 2, Sections 3.1~3.5, 3.7&#039;&#039;&#039;&lt;br /&gt;
#* Reading: &#039;&#039;&#039;[MU] Chapter 2&#039;&#039;&#039;&lt;br /&gt;
#* [[概率论与数理统计 (Spring 2026)/Average-case analysis of QuickSort|Average-case analysis of &#039;&#039;&#039;&#039;&#039;QuickSort&#039;&#039;&#039;&#039;&#039;]]&lt;br /&gt;
# [http://tcs.nju.edu.cn/slides/prob2026/Deviation.pdf Moments and deviations]&lt;br /&gt;
#* Reading: &#039;&#039;&#039;[MU] Chapter 3&#039;&#039;&#039;&lt;br /&gt;
#* Reading: &#039;&#039;&#039;[BT] Sections 2.4, 4.2, 4.3, 5.1&#039;&#039;&#039; or &#039;&#039;&#039;[GS] Sections 3.3, 3.6, 7.3&#039;&#039;&#039;&lt;br /&gt;
#* [[概率论与数理统计 (Spring 2026)/Threshold of k-clique in random graph|Threshold of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-clique in random graph]]&lt;br /&gt;
#* [[概率论与数理统计 (Spring 2026)/Weierstrass Approximation Theorem|Weierstrass approximation]]&lt;br /&gt;
&lt;br /&gt;
= Concepts =&lt;br /&gt;
* [https://plato.stanford.edu/entries/probability-interpret/ Interpretations of probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/History_of_probability History of probability]&lt;br /&gt;
* Example problems:&lt;br /&gt;
** [https://dornsifecms.usc.edu/assets/sites/520/docs/VonNeumann-ams12p36-38.pdf von Neumann&#039;s Bernoulli factory] and other [https://peteroupc.github.io/bernoulli.html Bernoulli factory algorithms]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Boy_or_Girl_paradox Boy or Girl paradox]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Monty_Hall_problem Monty Hall problem]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bertrand_paradox_(probability) Bertrand paradox]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Hard_spheres Hard spheres model] and [https://en.wikipedia.org/wiki/Ising_model Ising model]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/PageRank &#039;&#039;PageRank&#039;&#039;] and stationary [https://en.wikipedia.org/wiki/Random_walk random walk]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Diffusion_process Diffusion process] and [https://en.wikipedia.org/wiki/Diffusion_model diffusion model]&lt;br /&gt;
*[https://en.wikipedia.org/wiki/Probability_space Probability space]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Sample_space Sample space]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Event_(probability_theory) Event] and [https://en.wikipedia.org/wiki/Σ-algebra &amp;lt;math&amp;gt;\sigma&amp;lt;/math&amp;gt;-algebra]&lt;br /&gt;
** Kolmogorov&#039;s [https://en.wikipedia.org/wiki/Probability_axioms axioms of probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Discrete_uniform_distribution Classical] and [https://en.wikipedia.org/wiki/Geometric_probability geometric probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Boole%27s_inequality Union bound]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Inclusion%E2%80%93exclusion_principle Inclusion-Exclusion principle]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Boole%27s_inequality#Bonferroni_inequalities Bonferroni inequalities]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Conditional_probability Conditional probability]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Chain_rule_(probability) Chain rule]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_total_probability Law of total probability]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bayes%27_theorem Bayes&#039; law]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Independence_(probability_theory) Independence] &lt;br /&gt;
** [https://en.wikipedia.org/wiki/Pairwise_independence Pairwise independence]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Random_variable Random variable]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Cumulative_distribution_function Cumulative distribution function]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Probability_mass_function Probability mass function]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Probability_density_function Probability density function]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Multivariate_random_variable Random vector]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Joint_probability_distribution Joint probability distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Conditional_probability_distribution Conditional probability distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Marginal_distribution Marginal distribution]&lt;br /&gt;
* Some &#039;&#039;&#039;discrete&#039;&#039;&#039; probability distributions&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bernoulli_trial Bernoulli trial] and [https://en.wikipedia.org/wiki/Bernoulli_distribution Bernoulli distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Discrete_uniform_distribution Discrete uniform distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Binomial_distribution Binomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Geometric_distribution Geometric distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Negative_binomial_distribution Negative binomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Hypergeometric_distribution Hypergeometric distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Poisson_distribution Poisson distribution]&lt;br /&gt;
** and [https://en.wikipedia.org/wiki/List_of_probability_distributions#Discrete_distributions others]&lt;br /&gt;
* Balls into bins model&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Multinomial_distribution Multinomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Birthday_problem Birthday problem]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Coupon_collector%27s_problem Coupon collector]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Balls_into_bins_problem Occupancy problem]&lt;br /&gt;
* Random graphs&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Erd%C5%91s%E2%80%93R%C3%A9nyi_model Erdős–Rényi random graph model]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Galton%E2%80%93Watson_process Galton–Watson branching process]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Expected_value Expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_the_unconscious_statistician Law of the unconscious statistician, &#039;&#039;LOTUS&#039;&#039;]&lt;br /&gt;
** [https://dlsun.github.io/probability/linearity.html Linearity of expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Conditional_expectation Conditional expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_total_expectation Law of total expectation]&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13684</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13684"/>
		<updated>2026-04-21T11:03:32Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Problem 4 (Chernoff bound vs k-th moment method) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every problem requires a solution with complete working; you may write in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or a similar tool.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework conscientiously and master the course material, the final exam will examine &amp;lt;font color=red&amp;gt;some problems drawn at random&amp;lt;/font&amp;gt; from the problem sets. Please take every assignment seriously and make sure you understand the ideas behind the solutions.&lt;br /&gt;
&lt;br /&gt;
*If a drawn homework problem is answered incorrectly, incompletely, or not at all in the exam, the homework score will be &amp;lt;font color=red&amp;gt;reduced&amp;lt;/font&amp;gt; according to the corresponding standard.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work in a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of the random variables involved are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,...,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;; show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of some random variable, and write down its probability mass function.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
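The random-sum variance identity stated in Variance (IV) can be sanity-checked numerically. A minimal Monte-Carlo sketch in Python, with illustrative distribution choices that are not part of the problem (N = 1 + Binomial(4, 1/2), and the X_i i.i.d. Bernoulli(0.3)):

```python
import random

random.seed(0)

# Monte-Carlo sanity check (illustrative parameters, not part of the problem) of
#   Var[X] = Var[X1]*E[N] + E[X1]^2*Var[N]   for X = sum_{i=1}^N X_i,
# with N = 1 + Binomial(4, 1/2) (positive and integer-valued) and
# X_i i.i.d. Bernoulli(p), independent of N.
p, trials = 0.3, 200_000

def sample_X():
    n = 1 + sum(random.getrandbits(1) for _ in range(4))              # draw N
    return sum(random.choices((0, 1), weights=(1 - p, p))[0] for _ in range(n))

xs = [sample_X() for _ in range(trials)]
mean = sum(xs) / trials
var_emp = sum((x - mean) ** 2 for x in xs) / trials

# Theory: E[N] = 3, Var[N] = 1, E[X1] = p, Var[X1] = p*(1 - p),
# so the identity predicts Var[X] = 3*p*(1 - p) + p**2 = 0.72 for p = 0.3.
var_formula = 3 * p * (1 - p) + p ** 2
print(var_emp, var_formula)
```

The empirical variance should agree with the formula up to sampling error; this is only an illustration, not a substitute for the proof.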
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
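The relative strength of these inequalities can be seen on a concrete example. A small Python sketch (illustrative numbers, not part of the assignment) evaluating the Markov, Chebyshev, and Chernoff bounds for X distributed as Bin(100, 1/2) and the event that X is at least 75:

```python
import math

# Illustrative comparison (not part of the assignment) of three tail bounds
# for X ~ Bin(n, 1/2) and the event {X at least a}:
#   Markov:    bound E[X]/a
#   Chebyshev: bound Var[X]/(a - mu)^2 on the deviation |X - mu|
#   Chernoff:  the bound from the last item, with a = (1 + delta)*mu
n, a = 100, 75
mu, var = n / 2, n / 4          # mean and variance of Bin(n, 1/2)
delta = a / mu - 1              # so that a = (1 + delta) * mu

markov = mu / a
chebyshev = var / (a - mu) ** 2
chernoff = (math.exp(delta) / (1 + delta) ** (1 + delta)) ** mu
print(markov, chebyshev, chernoff)
```

The three bounds decrease sharply in this order here, which is the usual picture: more moment information yields stronger tail bounds.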
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are pairwise distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks if &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (Chernoff bound vs k-th moment method) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbf{E}[e^{t|X|}]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \min_{t \ge 0} \frac{\mathbf{E}[e^{t|X|}]}{e^{t\delta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;&amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \frac{\mathbf{E}[|X|^k]}{\delta^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for each &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k \in \mathbb{Z}_{\ge 0}&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is at least as strong as the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
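Part (a) can be checked numerically on a small example. A Python sketch (a sanity check under illustrative parameters, not a proof) comparing the two optimized bounds for X distributed as Bin(20, 1/2), which is nonnegative, at threshold 15:

```python
import math

# Small numerical illustration of Problem 4(a): for X ~ Bin(n, 1/2), compare
# the optimized Chernoff bound with the best k-th moment bound at threshold
# delta. (Parameters are illustrative choices, not part of the problem.)
n, delta = 20, 15
pmf = [math.comb(n, x) * 0.5 ** n for x in range(n + 1)]   # Pr[X = x]

def chernoff_bound(steps=2000):
    # minimize E[e^{tX}] / e^{t*delta} over a grid of t in (0, 10]
    return min(
        sum(p * math.exp(t * x) for x, p in enumerate(pmf)) * math.exp(-t * delta)
        for t in (0.005 * i for i in range(1, steps + 1))
    )

def kth_moment_bound(kmax=40):
    # minimize E[X^k] / delta^k over k = 0, 1, ..., kmax
    return min(
        sum(p * x ** k for x, p in enumerate(pmf)) / delta ** k
        for k in range(kmax + 1)
    )

# Part (a) predicts the best k-th moment bound is never worse than Chernoff.
print(chernoff_bound(), kth_moment_bound())
```

On this example the best moment bound is indeed at least as strong as the optimized Chernoff bound, consistent with part (a).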
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13683</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13683"/>
		<updated>2026-04-21T11:01:35Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Problem 4 (Chernoff bound vs k-th moment method) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every problem requires a solution with complete working; you may write in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or a similar tool.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework conscientiously and master the course material, the final exam will examine &amp;lt;font color=red&amp;gt;some problems drawn at random&amp;lt;/font&amp;gt; from the problem sets. Please take every assignment seriously and make sure you understand the ideas behind the solutions.&lt;br /&gt;
&lt;br /&gt;
*If a drawn homework problem is answered incorrectly, incompletely, or not at all in the exam, the homework score will be &amp;lt;font color=red&amp;gt;reduced&amp;lt;/font&amp;gt; according to the corresponding standard.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work in a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of the random variables involved are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,...,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;; show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is a moment-generating function of a random variable, and write the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
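The identity in Variance (IV) admits a quick numerical sanity check. The sketch below is illustrative only and not part of the assignment; the particular choices (N uniform on {1,...,6}, each X_i Bernoulli with p = 0.3) are arbitrary, and a Bernoulli(p) draw is written as int(random.random() + p), which equals 1 with probability exactly p.

```python
import random

# Monte Carlo check (illustrative only) of the identity
#   Var[X] = Var[X_1] * E[N] + E[X_1]^2 * Var[N],  where X = X_1 + ... + X_N.
# Arbitrary choices: N uniform on {1,...,6}, X_i ~ Bernoulli(p).
random.seed(0)
p = 0.3
trials = 200_000

samples = []
for _ in range(trials):
    n = random.randint(1, 6)  # a positive integer-valued N
    # int(random.random() + p) equals 1 with probability p, else 0
    samples.append(sum(int(random.random() + p) for _ in range(n)))

mean = sum(samples) / trials
var = sum((x - mean) ** 2 for x in samples) / trials

# Theory: E[N] = 3.5, Var[N] = 35/12, E[X_1] = p, Var[X_1] = p * (1 - p).
theory = p * (1 - p) * 3.5 + p ** 2 * (35 / 12)
print(round(mean, 4), round(var, 4), round(theory, 4))
```

With these choices the empirical variance should land near the predicted value p(1−p)·3.5 + p²·35/12 = 0.9975.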
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
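The Chernoff bound above can also be observed empirically. This is an illustration, not a proof: the Poisson-trial parameters p_i and the choice δ = 0.5 below are arbitrary, and the empirical tail frequency is simply expected to sit below the bound (e^δ/(1+δ)^{1+δ})^μ.

```python
import math
import random

# Illustration only: empirical tail of a sum of independent Poisson trials
# versus the Chernoff bound (e^d / (1+d)^(1+d))^mu, with arbitrary p_i and d.
random.seed(1)
ps = [0.1, 0.3, 0.5, 0.2, 0.4] * 4  # twenty Poisson-trial parameters
mu = sum(ps)                        # mu = 6.0
d = 0.5
cutoff = (1 + d) * mu               # tail threshold (1 + d) * mu = 9.0
trials = 100_000

hits = 0
for _ in range(trials):
    # int(random.random() + p) equals 1 with probability p, else 0
    x = sum(int(random.random() + p) for p in ps)
    # min(1, int(x // cutoff)) is 1 exactly when x reaches the cutoff
    hits += min(1, int(x // cutoff))

empirical = hits / trials
bound = (math.exp(d) / (1 + d) ** (1 + d)) ** mu
print(round(empirical, 4), round(bound, 4))
```

Here μ = 6, so the bound evaluates to about 0.52, while the empirical tail frequency of Pr[X ≥ 9] comes out considerably smaller; the bound is loose at this scale but decays exponentially in μ.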
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose sums are distinct. Namely, &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct for all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks if &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
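For intuition, f(n) can be brute-forced for tiny n; this sketch is for exploration only and says nothing about the asymptotic upper bound the problem asks for.

```python
from itertools import combinations

# Brute force for intuition only: f(n) for tiny n, where f(n) is the largest
# m such that some m-subset of {1,...,n} has all 2^m subset sums distinct.

def has_distinct_subset_sums(xs):
    seen = set()
    for r in range(len(xs) + 1):
        for c in combinations(xs, r):
            s = sum(c)
            if s in seen:
                return False
            seen.add(s)
    return True

def f(n):
    for m in range(n, 0, -1):  # try larger sets first
        if any(has_distinct_subset_sums(c)
               for c in combinations(range(1, n + 1), m)):
            return m
    return 0

print([f(n) for n in range(1, 9)])  # e.g. {1,2,4} works for n = 4
```

A counting argument is visible here already: an m-set in [n] has 2^m subset sums lying in {0, 1, ..., mn}, so 2^m ≤ mn + 1 and hence f(n) ≤ log₂ n + log₂ log₂ n + O(1); the second moment method improves the second term by a factor of 2.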
&lt;br /&gt;
== Problem 4 (Chernoff bound vs k-th moment method) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbf{E}[e^{t|X|}]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \min_{t \ge 0} \frac{\mathbf{E}[e^{t|X|}]}{e^{t\delta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;k-th Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \frac{\mathbf{E}[|X|^k]}{\delta^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for each &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k \in \mathbb{Z}^{+}&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is at least as strong as the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13682</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13682"/>
		<updated>2026-04-21T10:58:56Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include the complete reasoning; answers may be written in Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage everyone to complete the regular homework conscientiously and master the course material, the final exam will &amp;lt;font color=red&amp;gt;randomly draw some of its problems&amp;lt;/font&amp;gt; from the homework. Please take every assignment seriously and make sure you understand the solution ideas.&lt;br /&gt;
&lt;br /&gt;
*If a drawn homework problem is answered incorrectly, incompletely, or not at all in the exam, the homework grade will be &amp;lt;font color=red&amp;gt;penalized&amp;lt;/font&amp;gt; according to the relevant criteria.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work in a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables involved are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,...,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;, show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is a moment-generating function of a random variable, and write the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose sums are distinct. Namely, &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct for all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks if &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (Chernoff bound vs k-th moment method) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbf{E}[e^{t|X|}]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \min_{t \ge 0} \frac{\mathbf{E}[e^{t|X|}]}{e^{t\delta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;k-th Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \frac{\mathbf{E}[|X|^k]}{\delta^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for each &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k \in \mathbb{Z}_{\ge 0}&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is at least as strong as the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13679</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13679"/>
		<updated>2026-04-21T10:54:54Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Problem 4 (Chernoff bound vs k-th moment method) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include the complete reasoning; answers may be written in Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage everyone to complete the regular homework conscientiously and master the course material, the final exam will &amp;lt;font color=red&amp;gt;randomly draw some of its problems&amp;lt;/font&amp;gt; from the homework. Please take every assignment seriously and make sure you understand the solution ideas.&lt;br /&gt;
&lt;br /&gt;
*If a drawn homework problem is answered incorrectly, incompletely, or not at all in the exam, the homework grade will be &amp;lt;font color=red&amp;gt;penalized&amp;lt;/font&amp;gt; according to the relevant criteria.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work in a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables involved are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,...,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;, show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is a moment-generating function of a random variable, and write the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are pairwise distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
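For intuition on Problem 3, f(n) can be computed by brute force for small n. The sketch below (Python; the helper names has_distinct_subset_sums and f are our own, not part of the problem) enumerates subsets of [n]. Powers of two always have distinct subset sums, which gives the matching lower bound f(n) ≥ ⌊log₂ n⌋ + 1.

```python
from itertools import combinations

def has_distinct_subset_sums(xs):
    """Check whether all 2^m subset sums of xs are pairwise distinct."""
    sums = set()
    for r in range(len(xs) + 1):
        for s in combinations(xs, r):
            t = sum(s)
            if t in sums:
                return False
            sums.add(t)
    return True

def f(n):
    """Largest m such that some m-subset of {1,...,n} has all subset sums distinct."""
    for m in range(n, 0, -1):
        if any(has_distinct_subset_sums(c)
               for c in combinations(range(1, n + 1), m)):
            return m
    return 0

# {1, 2, 4, 8} has distinct subset sums, and a simple pigeonhole argument
# rules out any 5-subset of [8], so f(8) = 4.
assert f(8) == 4
```

This brute force is only feasible for tiny n; the point of the problem is that the second moment method beats the naive pigeonhole upper bound f(n) ≤ log₂ n + log₂ log₂ n + O(1) by a factor of 1/2 on the second term.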
&lt;br /&gt;
== Problem 4 (Chernoff bound vs k-th moment method) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbf{E}[e^{t|X|}]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \min_{t \ge 0} \frac{\mathbf{E}[e^{t|X|}]}{e^{t\delta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;k-th Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \frac{\mathbf{E}[|X|^k]}{\delta^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for each &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k \in \mathbb{Z}_{\ge 0}&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13677</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13677"/>
		<updated>2026-04-21T10:51:59Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Problem 4 (Chernoff bound vs k-th moment method) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Each problem must be answered with a complete solution process; you may write in Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework carefully and master the course material solidly, the final exam will &amp;lt;font color=red&amp;gt;randomly draw some of its questions&amp;lt;/font&amp;gt; from the homework problems. Please take every assignment seriously and make sure you understand the solution ideas.&lt;br /&gt;
&lt;br /&gt;
*If a drawn homework problem is answered incorrectly, incompletely, or not at all in the exam, the homework will receive a &amp;lt;font color=red&amp;gt;deduction of points&amp;lt;/font&amp;gt; according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work on the probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout this problem set, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,\ldots,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when the individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, the random variables &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;; show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of a random variable, and write down the probability mass function of that random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;a \neq 0&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
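The identity in [Variance (IV)] above (a form of Wald's identity for the variance) can be verified exactly in one small discrete case. The sketch below (Python; the particular choice of N uniform on {1,2,3} and X_i i.i.d. Bernoulli(1/3) is our own, not part of the problem) enumerates all outcomes with exact rational arithmetic.

```python
from itertools import product
from fractions import Fraction

# Exact check (for one small discrete case, not a proof) of
#   Var[X] = Var[X_1]*E[N] + E[X_1]^2*Var[N],
# with N uniform on {1,2,3} and X_i i.i.d. Bernoulli(1/3), independent of N.
p = Fraction(1, 3)

def bern_pmf(x):
    return p if x == 1 else 1 - p

# Enumerate N and (X_1, X_2, X_3); only the first N of them are summed.
dist = {}
for n in (1, 2, 3):
    for xs in product((0, 1), repeat=3):
        pr = Fraction(1, 3)  # Pr[N = n]
        for x in xs:
            pr *= bern_pmf(x)
        s = sum(xs[:n])
        dist[s] = dist.get(s, Fraction(0)) + pr

EX = sum(s * pr for s, pr in dist.items())
EX2 = sum(s * s * pr for s, pr in dist.items())
var_X = EX2 - EX ** 2

EN, EN2 = Fraction(2), Fraction(14, 3)  # E[N], E[N^2] for uniform {1,2,3}
var_N = EN2 - EN ** 2                   # = 2/3
var_X1 = p * (1 - p)                    # Bernoulli variance = 2/9
assert var_X == var_X1 * EN + p ** 2 * var_N
```

Both sides evaluate to 14/27 here; the general proof conditions on N and uses the law of total variance.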
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are pairwise distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (Chernoff bound vs k-th moment method) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbf{E}[e^{t|X|}]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \min_{t \ge 0} \frac{\mathbf{E}[e^{t|X|}]}{e^{t\delta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;k-th Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \frac{\mathbf{E}[|X|^k]}{\delta^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for each &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt; and every fixed admissible &amp;lt;math&amp;gt;t&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k \in \mathbb{Z}_{\ge 0}&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound obtained from this fixed &amp;lt;math&amp;gt;t&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13676</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13676"/>
		<updated>2026-04-21T10:51:20Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Problem 4 (Chernoff bound vs k-th moment method) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Each problem must be answered with a complete solution process; you may write in Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework carefully and master the course material solidly, the final exam will &amp;lt;font color=red&amp;gt;randomly draw some of its questions&amp;lt;/font&amp;gt; from the homework problems. Please take every assignment seriously and make sure you understand the solution ideas.&lt;br /&gt;
&lt;br /&gt;
*If a drawn homework problem is answered incorrectly, incompletely, or not at all in the exam, the homework will receive a &amp;lt;font color=red&amp;gt;deduction of points&amp;lt;/font&amp;gt; according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work on the probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout this problem set, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,\ldots,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when the individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, the random variables &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;; show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of a random variable, and write down the probability mass function of that random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;a \neq 0&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are pairwise distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (Chernoff bound vs k-th moment method) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbf{E}[e^{t|X|}]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \min_{t \ge 0} \frac{\mathbf{E}[e^{t|X|}]}{e^{t\delta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;k-th Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \frac{\mathbf{E}[|X|^k]}{\delta^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for each &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt; and every fixed admissible &amp;lt;math&amp;gt;t&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k \in \mathbb{Z}_{\ge 0}&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound from this fixed &amp;lt;math&amp;gt;t&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13675</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13675"/>
		<updated>2026-04-21T10:48:36Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Problem 4 (Chernoff bound vs k-th moment method) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Each problem must be answered with a complete solution process; you may write in Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework carefully and master the course material solidly, the final exam will &amp;lt;font color=red&amp;gt;randomly draw some of its questions&amp;lt;/font&amp;gt; from the homework problems. Please take every assignment seriously and make sure you understand the solution ideas.&lt;br /&gt;
&lt;br /&gt;
*If a drawn homework problem is answered incorrectly, incompletely, or not at all in the exam, the homework will receive a &amp;lt;font color=red&amp;gt;deduction of points&amp;lt;/font&amp;gt; according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work on the probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout this problem set, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,\ldots,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when the individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, the random variables &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;; show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
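The identity in Variance (IV) can be sanity-checked by simulation. The sketch below is an editorial aid (the geometric and Bernoulli distribution choices, and the parameters, are arbitrary assumptions for illustration):

```python
import random

random.seed(0)
p_n, p_x = 0.3, 0.4  # arbitrary illustrative parameters

def sample_X():
    """Draw N ~ Geometric(p_n) on {1,2,...}, then X = sum of N Bernoulli(p_x) variables."""
    n = 1
    while random.random() >= p_n:
        n += 1
    return sum(random.random() < p_x for _ in range(n))

trials = 200_000
xs = [sample_X() for _ in range(trials)]
m = sum(xs) / trials
v = sum((x - m) ** 2 for x in xs) / trials

# Claimed identity: Var[X] = Var[X_1] E[N] + E[X_1]^2 Var[N], with
# E[N] = 1/p_n, Var[N] = (1-p_n)/p_n^2, E[X_1] = p_x, Var[X_1] = p_x(1-p_x).
theory = p_x * (1 - p_x) / p_n + p_x**2 * (1 - p_n) / p_n**2
print(round(v, 3), round(theory, 3))  # the two estimates should be close
```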
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of a random variable, and write down the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
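Covariance and correlation (III) can be verified mechanically by enumerating the four equally likely outcomes; a minimal Python check (an editorial aid, not a substitute for the written argument):

```python
from itertools import product
from fractions import Fraction

# X, Y independent Bernoulli(1/2): four equally likely outcomes.
outcomes = list(product([0, 1], repeat=2))
p = Fraction(1, 4)

def E(f):
    """Exact expectation of f(X, Y) over the uniform outcome space."""
    return sum(p * f(x, y) for x, y in outcomes)

U = lambda x, y: x + y          # U = X + Y
V = lambda x, y: abs(x - y)     # V = |X - Y|

cov = E(lambda x, y: U(x, y) * V(x, y)) - E(U) * E(V)
print(cov)  # 0, so U and V are uncorrelated

# Dependence: Pr[V = 1 | U = 0] = 0, yet Pr[V = 1] = 1/2.
print(E(lambda x, y: V(x, y) == 1))
```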
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
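As a numerical illustration of the Chernoff bound stated above (not part of any proof; n, p, and δ are arbitrary illustrative choices), one can compare the exact binomial tail with the bound:

```python
from math import comb, exp

# X = sum of n independent Bernoulli(p) trials; compare the exact tail
# Pr[X >= (1+delta)*mu] with the bound (e^delta / (1+delta)^(1+delta))^mu.
n, p, delta = 100, 0.3, 0.5
mu = n * p
threshold = (1 + delta) * mu  # 45.0
exact = sum(comb(n, j) * p**j * (1 - p) ** (n - j)
            for j in range(n + 1) if j >= threshold)
bound = (exp(delta) / (1 + delta) ** (1 + delta)) ** mu
print(exact, bound)  # the exact tail is below the bound
```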
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (Chernoff bound vs k-th moment method) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbf{E}[e^{t|X|}]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \min_{t \ge 0} \frac{\mathbf{E}[e^{t|X|}]}{e^{t\delta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;k-th Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \frac{\mathbf{E}[|X|^k]}{\delta^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for each &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt; and every fixed admissible &amp;lt;math&amp;gt;t&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k \in \mathbb{Z}^+&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
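A toy numerical illustration of the phenomenon in (a), assuming for concreteness that X is Bernoulli(q), so |X| = X and E[X^k] = q for every k ≥ 1 (the parameter choices are arbitrary; this is not the requested proof):

```python
from math import exp

# Compare the Chernoff bound (minimized over a grid of t >= 0)
# with the k-th moment bound for X ~ Bernoulli(q) and a given delta.
q, delta = 0.2, 1.0  # arbitrary illustrative choices
mgf = lambda t: 1 - q + q * exp(t)  # E[e^{tX}]

chernoff = min(mgf(t) / exp(t * delta) for t in (i / 100 for i in range(2001)))
kth = q / delta**1  # k-th moment bound; here it equals q for every k >= 1

print(kth, chernoff)  # the moment bound is at least as tight in this instance
```

Here the moment bound matches the true tail Pr[X ≥ 1] = q exactly, while every finite t gives a strictly larger Chernoff value.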
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13674</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13674"/>
		<updated>2026-04-21T10:45:54Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Problem 4 (Chernoff bound vs k-th moment method) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every problem must be answered with a complete solution process; you may write in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework conscientiously and master the course material, the final exam will examine &amp;lt;font color=red&amp;gt;a random selection of problems&amp;lt;/font&amp;gt; drawn from the homework. Please take every assignment seriously and make sure you understand the ideas behind each solution.&lt;br /&gt;
&lt;br /&gt;
*If a homework problem drawn in the exam is answered incorrectly, incompletely, or not at all, the homework will be &amp;lt;font color=red&amp;gt;penalized&amp;lt;/font&amp;gt; according to the relevant grading standards.&lt;br /&gt;
&lt;br /&gt;
== Assumption throughout Problem Set 3==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work in a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,...,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when the individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;, show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of a random variable, and write down the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (Chernoff bound vs k-th moment method) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbf{E}[e^{t|X|}]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \min_{t \ge 0} \frac{\mathbf{E}[e^{t|X|}]}{e^{t\delta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;k-th Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \frac{\mathbf{E}[|X|^k]}{\delta^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for each &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k \in \mathbb{Z}^+&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13672</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13672"/>
		<updated>2026-04-21T10:37:06Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Problem 4 (Chernoff bound vs k-th moment method) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every problem must be answered with a complete solution process; you may write in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework conscientiously and master the course material, the final exam will examine &amp;lt;font color=red&amp;gt;a random selection of problems&amp;lt;/font&amp;gt; drawn from the homework. Please take every assignment seriously and make sure you understand the ideas behind each solution.&lt;br /&gt;
&lt;br /&gt;
*If a homework problem drawn in the exam is answered incorrectly, incompletely, or not at all, the homework will be &amp;lt;font color=red&amp;gt;penalized&amp;lt;/font&amp;gt; according to the relevant grading standards.&lt;br /&gt;
&lt;br /&gt;
== Assumption throughout Problem Set 3==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work in a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,...,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when the individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;, show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of a random variable, and write down the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
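As an editorial aside (not part of the problem), f(n) can be computed by exhaustive search for very small n, which is a useful check on intuition; the function names below are ours.

```python
from itertools import combinations

def distinct_subset_sums(xs):
    # True iff all subset sums of xs are pairwise distinct.
    seen = set()
    for r in range(len(xs) + 1):
        for s in combinations(xs, r):
            t = sum(s)
            if t in seen:
                return False
            seen.add(t)
    return True

def f(n):
    # Largest m such that some m-subset of {1, ..., n} has distinct subset sums.
    # Exponential-time brute force; only sensible for very small n.
    for m in range(n, 0, -1):
        if any(distinct_subset_sums(c) for c in combinations(range(1, n + 1), m)):
            return m
    return 0

assert distinct_subset_sums((1, 2, 4))      # powers of two always work
assert not distinct_subset_sums((1, 2, 3))  # 1 + 2 = 3
assert f(4) == 3
```

Powers of two give the lower bound f(n) of at least floor(log2 n) + 1, which this brute force confirms for tiny n.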
&lt;br /&gt;
== Problem 4 (Chernoff bound vs k-th moment method) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbf{E}[e^{t|X|}]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \min_{t \ge 0} \frac{\mathbf{E}[e^{t|X|}]}{e^{t\delta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;k-th Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \frac{\mathbf{E}[|X|^k]}{\delta^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for each &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13671</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13671"/>
		<updated>2026-04-21T10:34:25Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Problem 4 (Chernoff bound vs k-th moment method) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include a complete derivation; answers may be written in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework conscientiously and master the course material, the final exam will &amp;lt;font color=red&amp;gt;randomly draw some of its questions&amp;lt;/font&amp;gt; from the homework problems. Please take every assignment seriously and make sure you understand the ideas behind each solution.&lt;br /&gt;
&lt;br /&gt;
*If a homework problem drawn in the exam is answered incorrectly, incompletely, or not at all, the homework score will be &amp;lt;font color=red&amp;gt;penalized&amp;lt;/font&amp;gt; according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumption throughout Problem Set 3==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work in the probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,\ldots,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when the individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive, integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;, show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is a moment-generating function of a random variable, and write the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
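Not part of the assignment, but the compound-sum variance identity in the Variance (IV) item above can be sanity-checked numerically. The distributions below are illustrative choices of ours: N uniform on {1, ..., 10} and X_i i.i.d. Bernoulli(0.4), independent of N.

```python
import random
import statistics

def sample_compound(rng):
    # One draw of X = X_1 + ... + X_N with N ~ Uniform{1,...,10} and
    # X_i ~ Bernoulli(0.4), the X_i being i.i.d. and independent of N.
    n = rng.randint(1, 10)
    return sum(0.4 > rng.random() for _ in range(n))

rng = random.Random(1)
samples = [sample_compound(rng) for _ in range(100_000)]

# Identity to check: Var[X] = Var[X_1]*E[N] + E[X_1]^2*Var[N]
#                           = 0.24 * 5.5 + 0.16 * 8.25 = 2.64
predicted = 0.24 * 5.5 + 0.16 * 8.25
assert 0.1 > abs(statistics.pvariance(samples) - predicted)
```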
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct; that is, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are pairwise distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks if &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (Chernoff bound vs k-th moment method) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbf{E}[\exp(t|X|)]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \min_{t \ge 0} \frac{\mathbf{E}[e^{t|X|}]}{e^{t\delta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;k-th Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \frac{\mathbf{E}[|X|^k]}{\delta^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for each &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13670</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13670"/>
		<updated>2026-04-21T10:30:53Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Problem 4 (k-th moment method vs Chernoff bound) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include a complete derivation; answers may be written in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework conscientiously and master the course material, the final exam will &amp;lt;font color=red&amp;gt;randomly draw some of its questions&amp;lt;/font&amp;gt; from the homework problems. Please take every assignment seriously and make sure you understand the ideas behind each solution.&lt;br /&gt;
&lt;br /&gt;
*If a homework problem drawn in the exam is answered incorrectly, incompletely, or not at all, the homework score will be &amp;lt;font color=red&amp;gt;penalized&amp;lt;/font&amp;gt; according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumption throughout Problem Set 3==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work in the probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,\ldots,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when the individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive, integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;, show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is a moment-generating function of a random variable, and write the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct; that is, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are pairwise distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks if &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (Chernoff bound vs k-th moment method) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbf{E}[\exp(t|X|)]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \min_{t \ge 0} \frac{\mathbf{E}[e^{t|X|}]}{e^{t\delta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;k-th Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \frac{\mathbf{E}[|X|^k]}{\delta^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for each &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13669</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13669"/>
		<updated>2026-04-21T10:30:31Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Problem 4(k-th moment method vs Chernoff bound) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include a complete derivation; answers may be written in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework conscientiously and master the course material, the final exam will &amp;lt;font color=red&amp;gt;randomly draw some of its questions&amp;lt;/font&amp;gt; from the homework problems. Please take every assignment seriously and make sure you understand the ideas behind each solution.&lt;br /&gt;
&lt;br /&gt;
*If a homework problem drawn in the exam is answered incorrectly, incompletely, or not at all, the homework score will be &amp;lt;font color=red&amp;gt;penalized&amp;lt;/font&amp;gt; according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumption throughout Problem Set 3==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work in the probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,\ldots,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when the individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive, integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;, show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is a moment-generating function of a random variable, and write the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
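A derived value of E[X^3] in Moments (II) can be checked against simulation, without assuming any closed form. The sketch below uses the convention that X ~ Geo(p) counts trials up to and including the first success (support {1, 2, ...}); p = 0.4 is an arbitrary test value, and the reference value comes from truncating the defining series.

```python
import random

random.seed(1)

# X ~ Geo(p), support {1, 2, ...}: trials until the first success.
p = 0.4

def sample_geo():
    k = 1
    while random.random() >= p:
        k += 1
    return k

trials = 200_000
emp = sum(sample_geo() ** 3 for _ in range(trials)) / trials

# Reference value: truncate the defining series sum_{k>=1} k^3 p (1-p)^{k-1};
# the tail beyond k = 400 is negligible for p = 0.4.
exact = sum(k ** 3 * p * (1 - p) ** (k - 1) for k in range(1, 400))
```

If your closed-form answer is correct, plugging p = 0.4 into it should match `exact` (about 58.75) and hence the simulation.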
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
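The claim in Covariance and correlation (III) can be verified by exact enumeration over the four equally likely outcomes of (X, Y) — a numerical check, not a substitute for the written argument:

```python
from fractions import Fraction
from itertools import product

# Exact enumeration: X, Y i.i.d. Bernoulli(1/2), four outcomes of prob 1/4.
p = Fraction(1, 4)
outcomes = list(product([0, 1], repeat=2))
S = [x + y for x, y in outcomes]        # X + Y
D = [abs(x - y) for x, y in outcomes]   # |X - Y|

ES = sum(p * s for s in S)
ED = sum(p * d for d in D)
cov = sum(p * s * d for s, d in zip(S, D)) - ES * ED   # covariance of S and D

# Dependence: compare Pr[S = 0, D = 0] with Pr[S = 0] * Pr[D = 0].
pS0D0 = sum(p for s, d in zip(S, D) if s == 0 and d == 0)
pS0 = sum(p for s in S if s == 0)
pD0 = sum(p for d in D if d == 0)
```

The covariance comes out exactly 0 (uncorrelated), yet Pr[S=0, D=0] = 1/4 differs from Pr[S=0]·Pr[D=0] = 1/8 (dependent).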
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
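The Chernoff bound in the last item can be illustrated numerically for the special case of i.i.d. fair coin flips (general Poisson trials allow distinct success probabilities; this is just one instance, with arbitrarily chosen n = 100 and delta = 0.5):

```python
from math import comb, exp

# X ~ Bin(100, 1/2): exact upper tail vs. the Chernoff bound at (1+delta)*mu.
n, pr, delta = 100, 0.5, 0.5
mu = n * pr                          # 50.0
threshold = (1 + delta) * mu         # 75.0

exact_tail = sum(comb(n, k) * pr ** k * (1 - pr) ** (n - k)
                 for k in range(n + 1) if k >= threshold)
chernoff = (exp(delta) / (1 + delta) ** (1 + delta)) ** mu
```

The exact tail is far smaller than the bound here, which is typical: the Chernoff bound is exponentially decaying but not tight.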
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; whose subset sums are all distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
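For intuition about f(n), the definition can be brute-forced for tiny n. The sketch below confirms f(8) = 4: {1, 2, 4, 8} is sum-distinct, while a 5-element subset of [8] would need 2^5 = 32 distinct subset sums inside {0, ..., 30}, which is impossible by counting.

```python
from itertools import combinations

def sum_distinct(xs):
    """True iff all 2^|xs| subset sums of xs are distinct."""
    sums = set()
    for r in range(len(xs) + 1):
        for c in combinations(xs, r):
            s = sum(c)
            if s in sums:
                return False
            sums.add(s)
    return True

def f(n):
    """Maximal size of a sum-distinct subset of {1, ..., n} (brute force)."""
    for m in range(n, 0, -1):
        if any(sum_distinct(c) for c in combinations(range(1, n + 1), m)):
            return m
    return 0
```

This exhaustive search is only feasible for very small n; the point of the problem is that the second moment method gives a bound for all n.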
&lt;br /&gt;
== Problem 4 (k-th moment method vs Chernoff bound) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbf{E}[\exp(t|X|)]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \min_{t \ge 0} \frac{\mathbf{E}[e^{t|X|}]}{e^{t\delta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;k-th Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \frac{\mathbf{E}[|X|^k]}{\delta^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for each &amp;lt;math&amp;gt;\delta&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
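One concrete instance of part (a) (an illustration under assumed choices, not part of the problem statement): take X ~ Pois(1), so |X| = X, E[X^k] is the k-th Bell number (Touchard's formula), and the MGF gives the Chernoff bound exp(e^t - 1 - t*delta), minimized at t = log(delta). For delta = 10, scanning small k already beats the optimized Chernoff bound:

```python
from math import exp, log

def bell_numbers(n):
    """Bell numbers B(0), ..., B(n) via the Bell triangle."""
    row, bells = [1], [1]
    for _ in range(n):
        new = [row[-1]]
        for v in row:
            new.append(new[-1] + v)
        row = new
        bells.append(row[0])
    return bells

delta = 10.0
# Chernoff for Pois(1): min_t exp(e^t - 1 - t*delta), attained at t = log(delta).
chernoff = exp(delta - 1 - delta * log(delta))
bells = bell_numbers(30)                 # bells[k] = E[X^k] for X ~ Pois(1)
moment = min(bells[k] / delta ** k for k in range(1, 31))
```

Here `moment` dips below `chernoff`, matching the claim of part (a) for this delta; part (b) asks why Chernoff is nonetheless the tool of choice in practice.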
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13668</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13668"/>
		<updated>2026-04-21T10:29:40Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Problem 4(k-th moment method vs Chernoff bound) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include the complete reasoning process; you may write in Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework conscientiously and master the course material, the final exam will include &amp;lt;font color=red&amp;gt;problems drawn at random&amp;lt;/font&amp;gt; from the assignments. Please take every assignment seriously and make sure you understand the ideas behind each solution.&lt;br /&gt;
&lt;br /&gt;
*If a drawn problem is answered incorrectly, incompletely, or not at all in the exam, the corresponding homework will be &amp;lt;font color=red&amp;gt;penalized&amp;lt;/font&amp;gt; according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work on the probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of the random variables involved are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,...,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when the individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be an integer-valued, positive random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;, and show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of some random variable, and write down the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; whose subset sums are all distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (k-th moment method vs Chernoff bound) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbf{E}[\exp(t|X|)]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \min_{t \ge 0} \frac{\mathbf{E}[e^{t|X|}]}{e^{t\delta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;k-th Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \frac{\mathbf{E}[|X|^k]}{\delta^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for each &amp;lt;math&amp;gt;\delta&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13667</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13667"/>
		<updated>2026-04-21T10:29:28Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Problem 4(k-th moment method vs Chernoff bound) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include the complete reasoning process; you may write in Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework conscientiously and master the course material, the final exam will include &amp;lt;font color=red&amp;gt;problems drawn at random&amp;lt;/font&amp;gt; from the assignments. Please take every assignment seriously and make sure you understand the ideas behind each solution.&lt;br /&gt;
&lt;br /&gt;
*If a drawn problem is answered incorrectly, incompletely, or not at all in the exam, the corresponding homework will be &amp;lt;font color=red&amp;gt;penalized&amp;lt;/font&amp;gt; according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work on the probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of the random variables involved are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,...,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when the individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be an integer-valued, positive random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;, and show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of some random variable, and write down the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; whose subset sums are all distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (k-th moment method vs Chernoff bound) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbf{E}[\exp(t|X|)]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \min_{t \ge 0} \frac{\mathbf{E}[e^{t|X|}]}{e^{t\delta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;k-th Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \frac{\mathbf{E}[|X|^k]}{\delta^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for each &amp;lt;math&amp;gt;\delta&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13666</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13666"/>
		<updated>2026-04-21T10:28:52Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Problem 4(k-th moment method vs Chernoff bound) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include the complete reasoning; you may write in Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework carefully and master the course material, the final exam will include &amp;lt;font color=red&amp;gt;problems randomly drawn&amp;lt;/font&amp;gt; from the assignments. Please take every assignment seriously and make sure you understand the solution ideas.&lt;br /&gt;
&lt;br /&gt;
*If a drawn homework problem is answered incorrectly, incompletely, or not at all in the exam, points will be &amp;lt;font color=red&amp;gt;deducted&amp;lt;/font&amp;gt; from the homework grade according to the grading criteria.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work in a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,...,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive, integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;, show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of some random variable, and write down the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;a\neq 0&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
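As a quick sanity check for the Moments (I) exercise (not part of the required solution), one can compare &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; numerically against the MGF of a hypothesized pmf; the pmf below is an assumption that the exercise asks you to justify, and agreement on sample points is evidence, not a proof.&lt;br /&gt;

```python
import math

# Hypothesized pmf (an assumption to verify, not given in the problem):
# Pr[X = 1] = 1/4, Pr[X = -1] = 1/2, Pr[X = 0] = 1/4.
pmf = {1: 0.25, -1: 0.5, 0: 0.25}

def G(t):
    # The candidate moment-generating function from the problem statement.
    return math.exp(t) / 4 + math.exp(-t) / 2 + 1 / 4

def mgf(t):
    # E[e^{tX}] computed directly under the hypothesized pmf.
    return sum(p * math.exp(t * x) for x, p in pmf.items())

# The two functions agree at every sample point (and G(0) = 1, as any MGF must).
for t in [-2.0, -0.5, 0.0, 0.7, 3.0]:
    assert abs(G(t) - mgf(t)) < 1e-12
```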
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose sums are distinct. Namely, &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct for all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039;s [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (k-th moment method vs Chernoff bound) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbf{E}[\exp(t|X|)]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \min_{t \ge 0} \frac{\mathbf{E}[e^{t|X|}]}{e^{t\delta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;k-th Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \frac{\mathbf{E}[|X|^k]}{\delta^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for each &amp;lt;math&amp;gt;\delta&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
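As a numerical illustration of part (a) (a sketch under assumed parameters, not the requested proof), one can compare the two optimized bounds for a concrete distribution; the choice of &amp;lt;math&amp;gt;X\sim\text{Bin}(20,1/2)&amp;lt;/math&amp;gt;, the threshold &amp;lt;math&amp;gt;\delta=15&amp;lt;/math&amp;gt;, and the grid over &amp;lt;math&amp;gt;t&amp;lt;/math&amp;gt; are all assumptions made for illustration.&lt;br /&gt;

```python
from math import comb, exp

n, p, delta = 20, 0.5, 15.0
# Exact binomial pmf: Pr[X = x] for x = 0..n.
pmf = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]

def chernoff(t):
    # E[e^{tX}] / e^{t*delta}, with the MGF computed exactly from the pmf.
    return sum(pr * exp(t * x) for x, pr in enumerate(pmf)) / exp(t * delta)

def moment_bound(k):
    # E[X^k] / delta^k, with the k-th moment computed exactly from the pmf.
    return sum(pr * x**k for x, pr in enumerate(pmf)) / delta**k

best_chernoff = min(chernoff(i / 100) for i in range(1, 500))
best_moment = min(moment_bound(k) for k in range(1, 61))

# The optimized moment bound is never worse than the optimized Chernoff bound.
assert best_moment <= best_chernoff
```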
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13665</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13665"/>
		<updated>2026-04-21T10:27:10Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Problem 4(k-th moment method vs Chernoff bound) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include the complete reasoning; you may write in Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework carefully and master the course material, the final exam will include &amp;lt;font color=red&amp;gt;problems randomly drawn&amp;lt;/font&amp;gt; from the assignments. Please take every assignment seriously and make sure you understand the solution ideas.&lt;br /&gt;
&lt;br /&gt;
*If a drawn homework problem is answered incorrectly, incompletely, or not at all in the exam, points will be &amp;lt;font color=red&amp;gt;deducted&amp;lt;/font&amp;gt; from the homework grade according to the grading criteria.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work in a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,...,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive, integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;, show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of some random variable, and write down the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;a\neq 0&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
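For the Variance (III) exercise, a brute-force check on a small instance can guide the general derivation; the instance &amp;lt;math&amp;gt;n=6, k=3&amp;lt;/math&amp;gt; and the closed forms asserted below are conjectures that the exercise asks you to derive, not given facts.&lt;br /&gt;

```python
from itertools import combinations
from statistics import mean, pvariance

n, k = 6, 3
# All equally likely k-subsets of {1, ..., n} and their sums.
sums = [sum(c) for c in combinations(range(1, n + 1), k)]

# Exact mean and (population) variance over the uniform choice of subset.
m = mean(sums)
v = pvariance(sums)

# Conjectured closed forms to be proven in general:
#   E[sum] = k(n+1)/2,  Var[sum] = k(n+1)(n-k)/12.
assert abs(m - k * (n + 1) / 2) < 1e-9
assert abs(v - k * (n + 1) * (n - k) / 12) < 1e-9
```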
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose sums are distinct. Namely, &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct for all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039;s [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (k-th moment method vs Chernoff bound) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbf{E}[\exp(t|X|)]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \min_{t \ge 0} \frac{\mathbf{E}[e^{t|X|}]}{e^{t\delta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;k-th Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \frac{\mathbf{E}[|X|^k]}{\delta^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for every &amp;lt;math&amp;gt;\delta&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13664</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13664"/>
		<updated>2026-04-21T10:27:00Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include the complete reasoning; you may write in Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework carefully and master the course material, the final exam will include &amp;lt;font color=red&amp;gt;problems randomly drawn&amp;lt;/font&amp;gt; from the assignments. Please take every assignment seriously and make sure you understand the solution ideas.&lt;br /&gt;
&lt;br /&gt;
*If a drawn homework problem is answered incorrectly, incompletely, or not at all in the exam, points will be &amp;lt;font color=red&amp;gt;deducted&amp;lt;/font&amp;gt; from the homework grade according to the grading criteria.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work in a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,...,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive, integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;, show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of some random variable, and write down the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbb{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (k-th moment method vs Chernoff bound) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbb{E}[\exp(t|X|)]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \min_{t \ge 0} \frac{\mathbb{E}[e^{t|X|}]}{e^{t\delta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;k-th Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \frac{\mathbb{E}[|X|^k]}{\delta^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for every &amp;lt;math&amp;gt;\delta&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13663</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13663"/>
		<updated>2026-04-21T10:26:37Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every problem&#039;s solution must include the complete derivation; you may write in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage everyone to complete the homework conscientiously and thoroughly master the course material, the final exam will &amp;lt;font color=red&amp;gt;randomly draw some of its questions&amp;lt;/font&amp;gt; from the homework problems. Please take every assignment seriously and make sure you understand the solutions.&lt;br /&gt;
&lt;br /&gt;
*If a drawn homework problem is answered incorrectly, incompletely, or not at all in the exam, the homework grade will be &amp;lt;font color=red&amp;gt;reduced&amp;lt;/font&amp;gt; according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work in a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout this problem set, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,\ldots,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y = X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb{E}[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive, integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt; and show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is a moment-generating function of a random variable, and write the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbb{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (k-th moment method vs Chernoff bound) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbb{E}[\exp(t|X|)]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \min_{t \ge 0} \frac{\mathbb{E}[e^{t|X|}]}{e^{t\delta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;k-th Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \frac{\mathbb{E}[|X|^k]}{\delta^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for every &amp;lt;math&amp;gt;\delta&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13662</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13662"/>
		<updated>2026-04-21T10:25:59Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every problem&#039;s solution must include the complete derivation; you may write in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage everyone to complete the homework conscientiously and thoroughly master the course material, the final exam will &amp;lt;font color=red&amp;gt;randomly draw some of its questions&amp;lt;/font&amp;gt; from the homework problems. Please take every assignment seriously and make sure you understand the solutions.&lt;br /&gt;
&lt;br /&gt;
*If a drawn homework problem is answered incorrectly, incompletely, or not at all in the exam, the homework grade will be &amp;lt;font color=red&amp;gt;reduced&amp;lt;/font&amp;gt; according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work in a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout this problem set, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,\ldots,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y = X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb{E}[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive, integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt; and show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is a moment-generating function of a random variable, and write the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbb{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (k-th moment method vs Chernoff bound) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbb{E}[\exp(t|X|)]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \min_{t \ge 0} \frac{\mathbb{E}[e^{t|X|}]}{e^{t\delta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;k-th Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge \delta] \le \frac{\mathbb{E}[|X|^k]}{\delta^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for every &amp;lt;math&amp;gt;\delta&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
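As a numerical illustration of what parts (a) and (b) are asking (a hedged sketch: the binomial distribution, the threshold, and the search grids below are illustrative choices, not part of the problem), one can evaluate both bounds directly for a nonnegative random variable:

```python
import math

# Compare the generic Chernoff bound min_t E[e^(tX)] / e^(ta) with the
# k-th moment bound min_k E[X^k] / a^k for X ~ Binomial(n, p) and the
# tail event that X is at least a. Parameters are illustrative only.
n, p, a = 20, 0.5, 15
pmf = [math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]
true_tail = sum(pmf[i] for i in range(a, n + 1))

# Chernoff: minimize the MGF bound over a grid of t values.
chernoff = min(
    ((1 - p + p * math.exp(t))**n) / math.exp(t * a)
    for t in [0.05 * j for j in range(1, 200)]
)

# k-th moment: minimize E[X^k] / a^k over integer k.
ks = list(range(1, 41))
moments = [sum(pmf[i] * i**k for i in range(n + 1)) for k in ks]
kth = min(m / a**k for k, m in zip(ks, moments))

print(true_tail, chernoff, kth)
```

The optimized k-th moment bound comes out no larger than the optimized Chernoff bound, which matches the claim of part (a); part (b) asks why the Chernoff bound is nevertheless preferred in practice.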
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13661</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13661"/>
		<updated>2026-04-21T10:24:18Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include the complete derivation; answers may be written in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework carefully and master the course material, the final exam will examine a &amp;lt;font color=red&amp;gt;random sample of problems&amp;lt;/font&amp;gt; drawn from the homework. Please take every assignment seriously and make sure you understand the solution ideas.&lt;br /&gt;
&lt;br /&gt;
*If a sampled homework problem is answered incorrectly, incompletely, or not at all in the exam, the homework will be &amp;lt;font color=red&amp;gt;penalized&amp;lt;/font&amp;gt; according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work with a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,\ldots,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when the individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive, integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
More precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, the variables &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt; and show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of a random variable, and write down the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are uncorrelated but not independent.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
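The random-sum identity in the Variance (IV) item above can be sanity-checked by simulation (a hedged sketch: the particular distributions chosen for N and the X_i below are illustrative assumptions, not part of the problem):

```python
import random

# Monte Carlo check of Var[X] = Var[X1]*E[N] + E[X1]^2*Var[N]
# with N uniform on {1,...,6} and X_i uniform on (0,1).
random.seed(0)
trials = 200_000
samples = []
for _ in range(trials):
    n = random.randint(1, 6)                               # draw N
    samples.append(sum(random.random() for _ in range(n))) # X = X_1 + ... + X_N

mean = sum(samples) / trials
var = sum((s - mean)**2 for s in samples) / trials

# Closed form: E[N] = 3.5, Var[N] = 35/12, E[X1] = 1/2, Var[X1] = 1/12.
predicted = (1 / 12) * 3.5 + (1 / 2)**2 * (35 / 12)
print(var, predicted)
```

The empirical variance lands close to the predicted value of about 1.021, as the identity requires.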
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
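The Chernoff bound stated in the last item can be checked against an exact binomial tail (a hedged sketch: the parameters n, p, and the values of delta below are illustrative choices, not part of the problem):

```python
import math

# Compare the bound (e^delta / (1+delta)^(1+delta))^mu with the exact
# upper tail of X ~ Binomial(n, p), a special case of Poisson trials.
n, p = 30, 0.2
mu = n * p
pmf = [math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]

results = []
for delta in [0.5, 1.0, 2.0]:
    cutoff = math.ceil((1 + delta) * mu)             # smallest integer at the threshold
    exact_tail = sum(pmf[i] for i in range(cutoff, n + 1))
    bound = (math.exp(delta) / (1 + delta)**(1 + delta))**mu
    results.append((delta, exact_tail, bound))
    print(delta, exact_tail, bound)
```

For each delta the exact tail probability sits below the bound, as the inequality to be proved asserts.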
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; whose subset sums are all distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are pairwise distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (k-th moment method vs Chernoff bound) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbb{E}[\exp(t|X|)]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge a] \le \min_{t&amp;gt;0} \frac{\mathbf{E}[e^{t|X|}]}{e^{ta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;k-th Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge a] \le \frac{\mathbf{E}[|X|^k]}{a^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for every &amp;lt;math&amp;gt;a&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function and apply the probabilistic method.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the (seemingly stronger) &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13660</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13660"/>
		<updated>2026-04-21T10:24:06Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include the complete derivation; answers may be written in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework carefully and master the course material, the final exam will examine a &amp;lt;font color=red&amp;gt;random sample of problems&amp;lt;/font&amp;gt; drawn from the homework. Please take every assignment seriously and make sure you understand the solution ideas.&lt;br /&gt;
&lt;br /&gt;
*If a sampled homework problem is answered incorrectly, incompletely, or not at all in the exam, the homework will be &amp;lt;font color=red&amp;gt;penalized&amp;lt;/font&amp;gt; according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work with a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,\ldots,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when the individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive, integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
More precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, the variables &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt; and show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of a random variable, and write down the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are uncorrelated but not independent.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; whose subset sums are all distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are pairwise distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (k-th moment method vs Chernoff bound) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbb{E}[\exp(t|X|)]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge a] \le \min_{t&amp;gt;0} \frac{\mathbf{E}[e^{t|X|}]}{e^{ta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;k-th Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[|X| \ge a] \le \frac{\mathbf{E}[|X|^k]}{a^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for every &amp;lt;math&amp;gt;a&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function and apply the probabilistic method.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the (seemingly stronger) &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13659</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13659"/>
		<updated>2026-04-21T10:23:46Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include the complete derivation; answers may be written in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework carefully and master the course material, the final exam will examine a &amp;lt;font color=red&amp;gt;random sample of problems&amp;lt;/font&amp;gt; drawn from the homework. Please take every assignment seriously and make sure you understand the solution ideas.&lt;br /&gt;
&lt;br /&gt;
*If a sampled homework problem is answered incorrectly, incompletely, or not at all in the exam, the homework will be &amp;lt;font color=red&amp;gt;penalized&amp;lt;/font&amp;gt; according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work with a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,\ldots,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when the individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive, integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
More precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, the variables &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt; and show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of a random variable, and write down the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
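Several of the warm-up claims above can be sanity-checked numerically. For Moments (I), matching the coefficients of G(t) against E[e^(tX)] suggests one candidate pmf (our reading, not given in the problem): mass 1/4 at 1, mass 1/2 at -1, mass 1/4 at 0. A minimal Python sketch verifies that this candidate reproduces G:

```python
import math

# Candidate pmf read off from G(t) = e^t/4 + e^(-t)/2 + 1/4,
# matching each exponential term against Pr(X = x) * e^(t*x).
pmf = {1: 0.25, -1: 0.5, 0: 0.25}

def G(t):
    return math.exp(t) / 4 + math.exp(-t) / 2 + 1 / 4

def mgf(t):
    # E[e^(tX)] under the candidate pmf
    return sum(p * math.exp(t * x) for x, p in pmf.items())

# A valid pmf: nonnegative masses summing to 1 ...
assert all(p >= 0 for p in pmf.values()) and math.isclose(sum(pmf.values()), 1.0)
# ... whose moment-generating function agrees with G on a grid of t values.
assert all(math.isclose(G(t), mgf(t)) for t in [-2.0, -0.5, 0.0, 0.5, 2.0])
```

This kind of check does not replace the proof that G is an MGF, but it catches sign and coefficient mistakes quickly.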
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
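For the Chebyshev tightness item, one standard extremal construction (our suggestion, not stated in the problem) places mass b²/(2a²) on each of +a and -a and the rest at 0. Exact rational arithmetic confirms it meets both requirements:

```python
from fractions import Fraction

# Hypothetical extremal example for fixed b at most a (here b = 2, a = 3;
# any admissible pair works the same way).
a, b = Fraction(3), Fraction(2)
p = b**2 / (2 * a**2)            # mass placed on each of +a and -a
pmf = {a: p, -a: p, Fraction(0): 1 - 2 * p}

second_moment = sum(q * x**2 for x, q in pmf.items())
tail = sum(q for x, q in pmf.items() if abs(x) >= a)

assert sum(pmf.values()) == 1     # a genuine probability distribution
assert second_moment == b**2      # E[X^2] = b^2 as required
assert tail == b**2 / a**2        # Chebyshev holds with equality
```

The same two-point-plus-atom pattern is the usual way to show Chebyshev-type inequalities cannot be improved without extra assumptions.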
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are pairwise distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
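For small n the quantity f(n) can be computed by brute force, which is a useful sanity check on any bound you prove. A short Python sketch (exponential time, so only for tiny n):

```python
from itertools import combinations

def distinct_subset_sums(xs):
    """True iff all 2^m subset sums of xs are pairwise distinct."""
    sums = set()
    for r in range(len(xs) + 1):
        for s in combinations(xs, r):
            sums.add(sum(s))
    return len(sums) == 2 ** len(xs)

def f(n):
    """Largest m such that some m-subset of {1,...,n} has distinct subset sums."""
    for m in range(n, 0, -1):
        if any(distinct_subset_sums(c) for c in combinations(range(1, n + 1), m)):
            return m
    return 0

# Powers of two give f(n) >= floor(log2 n) + 1, e.g. {1,2,4,8} shows f(8) >= 4;
# a counting argument (32 subset sums, but only 31 values in 0..30) rules out m = 5.
assert f(4) == 3 and f(8) == 4
```

Comparing these exact values against log2(n) + (1/2) log2 log2(n) for small n gives a feel for the slack in the second-moment bound.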
&lt;br /&gt;
== Problem 4 (k-th moment method vs Chernoff bound) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbb{E}[\exp(t|X|)]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\Pr[|X| \ge a] \le \min_{t&amp;gt;0} \frac{\mathbf{E}[e^{t|X|}]}{e^{ta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;kth-Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\Pr[|X| \ge a] \le \frac{\mathbf{E}[|X|^k]}{a^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for every threshold &amp;lt;math&amp;gt;a &amp;gt; 0&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function and apply the probabilistic method.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the (seemingly stronger) &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
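Part (a) can be explored numerically before proving it. As an illustration (our own example, not part of the problem), take X distributed as Pois(1), which is nonnegative so |X| = X; its MGF is exp(lam*(e^t - 1)) and its moments obey the standard recurrence E[X^(n+1)] = lam * sum over k of C(n,k) * E[X^k]. At threshold a = 10 the best k-th moment bound already beats the fully optimized Chernoff bound:

```python
import math

lam, a = 1.0, 10.0

# Moments of Pois(lam) via the recurrence E[X^(n+1)] = lam * sum_k C(n,k) E[X^k]
# (for lam = 1 these are the Bell numbers: 1, 1, 2, 5, 15, ...).
m = [1.0]
for n in range(40):
    m.append(lam * sum(math.comb(n, k) * m[k] for k in range(n + 1)))

# Optimized Chernoff bound: min_t exp(lam*(e^t - 1) - t*a); for a larger than
# lam the minimizer is t = ln(a/lam), giving the closed form below.
chernoff = math.exp(a - lam - a * math.log(a / lam))

# Best k-th moment bound over k = 1..40.
moment_best = min(m[k] / a**k for k in range(1, 41))

# The moment method wins at this threshold, as part (a) predicts in general.
assert chernoff > moment_best
```

Printing both bounds (roughly 8.1e-7 for Chernoff vs about 4.4e-7 for the best moment bound here) makes part (b)'s question sharper: the moment bound is tighter, yet harder to evaluate in closed form over a whole range of thresholds.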
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13658</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13658"/>
		<updated>2026-04-21T10:23:32Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Each problem must be answered with a complete solution process; answers may be written in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage everyone to complete the regular homework conscientiously and master the course content, the final exam will examine &amp;lt;font color=red&amp;gt;a random selection of problems&amp;lt;/font&amp;gt; drawn from the homework. Please take every assignment seriously and make sure you understand the solution ideas.&lt;br /&gt;
&lt;br /&gt;
*If a homework problem drawn in the exam is answered incorrectly, answered incompletely, or left unanswered, the homework will receive a &amp;lt;font color=red&amp;gt;score deduction&amp;lt;/font&amp;gt; according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work over the probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables involved are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,...,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive, integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;, show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of a random variable, and write down the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are pairwise distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (k-th moment method vs Chernoff bound) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbb{E}[\exp(t|X|)]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\Pr[|X| \ge a] \le \min_{t&amp;gt;0} \frac{\mathbf{E}[e^{t|X|}]}{e^{ta}}.&amp;lt;/math&amp;gt;&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;kth-Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\Pr[|X| \ge a] \le \frac{\mathbf{E}[|X|^k]}{a^k}.&amp;lt;/math&amp;gt;&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for every threshold &amp;lt;math&amp;gt;a &amp;gt; 0&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function and apply the probabilistic method.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the (seemingly stronger) &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13657</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13657"/>
		<updated>2026-04-21T10:23:08Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Each problem must be answered with a complete solution process; answers may be written in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage everyone to complete the regular homework conscientiously and master the course content, the final exam will examine &amp;lt;font color=red&amp;gt;a random selection of problems&amp;lt;/font&amp;gt; drawn from the homework. Please take every assignment seriously and make sure you understand the solution ideas.&lt;br /&gt;
&lt;br /&gt;
*If a homework problem drawn in the exam is answered incorrectly, answered incompletely, or left unanswered, the homework will receive a &amp;lt;font color=red&amp;gt;score deduction&amp;lt;/font&amp;gt; according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work over the probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables involved are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,...,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive, integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;, show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of a random variable, and write down the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are pairwise distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (k-th moment method vs Chernoff bound) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbb{E}[\exp(t|X|)]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\Pr[|X| \ge a] \le \min_{t&amp;gt;0} \frac{\mathbf{E}[e^{t|X|}]}{e^{ta}}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;kth-Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\Pr[|X| \ge a] \le \frac{\mathbf{E}[|X|^k]}{a^k}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for every threshold &amp;lt;math&amp;gt;a &amp;gt; 0&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function and apply the probabilistic method.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the (seemingly stronger) &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13656</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13656"/>
		<updated>2026-04-21T10:22:15Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include the complete derivation; you may write in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework carefully and master the course material, the final exam of this course will &amp;lt;font color=red&amp;gt;randomly draw some of its questions&amp;lt;/font&amp;gt; from the homework problems. Please take every assignment seriously and make sure you understand the ideas behind each solution.&lt;br /&gt;
&lt;br /&gt;
*If a homework problem drawn in the exam is answered incorrectly, incompletely, or not at all, the corresponding homework will be &amp;lt;font color=red&amp;gt;penalized&amp;lt;/font&amp;gt; according to the grading standards.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work on a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, the expectations of all random variables are assumed to be well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,\ldots,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be an integer-valued, positive random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;, show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of some random variable, and write down the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
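As a quick numerical sketch of the moment-generating-function technique behind the Moments problems above (illustrative only, not a solution to any item; the fair die and the helper names are my own choices): the k-th moment E[X^k] is the k-th derivative of M(t) = E[e^{tX}] at t = 0, which we can check against a finite-difference approximation.

```python
import math

# Numerical sketch (not a solution to any problem above): for a fair 6-sided
# die X, the k-th moment E[X^k] equals the k-th derivative of the moment
# generating function M(t) = E[e^{tX}] at t = 0.

def mgf(t):
    # M(t) = (1/6) * sum over x = 1..6 of e^{t x}
    return sum(math.exp(t * x) for x in range(1, 7)) / 6

def third_derivative_at_zero(f, h=1e-3):
    # Central finite-difference approximation of f'''(0), accurate to O(h^2).
    return (f(2 * h) - 2 * f(h) + 2 * f(-h) - f(-2 * h)) / (2 * h ** 3)

exact_third_moment = sum(x ** 3 for x in range(1, 7)) / 6   # 441/6 = 73.5
approx = third_derivative_at_zero(mgf)
print(exact_third_moment, approx)  # the two values agree to about 1e-3
```

The same pattern (differentiate the MGF k times at 0) is what the Moments (II) and (III) problems ask you to carry out exactly for the geometric and Poisson distributions.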
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
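As an illustrative sanity check of the Chernoff bound in the last item (a sketch, not part of the assignment; the parameter choices n = 100, p = 1/2, delta = 0.2 are arbitrary): the bound (e^delta / (1+delta)^(1+delta))^mu should dominate the empirical frequency of the event X at least (1+delta)mu for a sum of independent fair coin flips.

```python
import math
import random

# Illustrative sanity check (not part of the problem set): compare the Chernoff
# bound (e^delta / (1+delta)^(1+delta))^mu with the empirical tail frequency of
# the event X at least (1+delta)*mu, for X a sum of n independent Bernoulli(p)
# trials. The parameter choices below (n=100, p=1/2, delta=0.2) are arbitrary.

def chernoff_bound(mu, delta):
    # Right-hand side of the bound stated in the last item above.
    return (math.exp(delta) / (1 + delta) ** (1 + delta)) ** mu

def empirical_tail(n, p, delta, trials=10000, seed=0):
    # Monte Carlo estimate of the tail probability for X ~ Bin(n, p).
    rng = random.Random(seed)
    threshold = (1 + delta) * n * p
    hits = sum(
        sum(p > rng.random() for _ in range(n)) >= threshold
        for _ in range(trials)
    )
    return hits / trials

n, p, delta = 100, 0.5, 0.2
mu = n * p
print(chernoff_bound(mu, delta))   # roughly 0.39: the proved upper bound
print(empirical_tail(n, p, delta)) # roughly 0.03: the observed tail frequency
```

The gap between the two printed numbers is expected: the Chernoff bound is a worst-case guarantee, not a sharp estimate of the tail.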
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (k-th moment method vs. Chernoff bound) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbb{E}[\exp(t|X|)]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Chernoff Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;math&amp;gt;\Pr[|X| \ge a] \le \min_{t&amp;gt;0} \frac{\mathbf{E}[e^{t|X|}]}{e^{ta}}&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&#039;&#039;&#039;kth-Moment Bound:&#039;&#039;&#039;&lt;br /&gt;
:&amp;lt;math&amp;gt;\Pr[|X| \ge a] \le \frac{\mathbf{E}[|X|^k]}{a^k}&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for every &amp;lt;math&amp;gt;a &amp;gt; 0&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function and apply the probabilistic method.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the (seemingly stronger) &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13655</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13655"/>
		<updated>2026-04-21T10:18:32Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include the complete derivation; you may write in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework carefully and master the course material, the final exam of this course will &amp;lt;font color=red&amp;gt;randomly draw some of its questions&amp;lt;/font&amp;gt; from the homework problems. Please take every assignment seriously and make sure you understand the ideas behind each solution.&lt;br /&gt;
&lt;br /&gt;
*If a homework problem drawn in the exam is answered incorrectly, incompletely, or not at all, the corresponding homework will be &amp;lt;font color=red&amp;gt;penalized&amp;lt;/font&amp;gt; according to the grading standards.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work on a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, the expectations of all random variables are assumed to be well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,\ldots,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be an integer-valued, positive random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;, show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of some random variable, and write down the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (k-th moment method vs. Chernoff bound) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;  &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbb{E}[\exp(t|X|)]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. &lt;br /&gt;
&lt;br /&gt;
We can use two tail inequalities on &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;: the Chernoff bound and the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for every &amp;lt;math&amp;gt;a &amp;gt; 0&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. &lt;br /&gt;
&lt;br /&gt;
&amp;lt;i&amp;gt;(Hint: Consider the Taylor expansion of the moment generating function and apply the probabilistic method.)&amp;lt;/i&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br&amp;gt;&amp;lt;br&amp;gt;&lt;br /&gt;
&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the (seemingly stronger) &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13654</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13654"/>
		<updated>2026-04-21T10:17:01Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include the complete derivation; you may write in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework carefully and master the course material, the final exam of this course will &amp;lt;font color=red&amp;gt;randomly draw some of its questions&amp;lt;/font&amp;gt; from the homework problems. Please take every assignment seriously and make sure you understand the ideas behind each solution.&lt;br /&gt;
&lt;br /&gt;
*If a homework problem drawn in the exam is answered incorrectly, incompletely, or not at all, the corresponding homework will be &amp;lt;font color=red&amp;gt;penalized&amp;lt;/font&amp;gt; according to the grading standards.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work on a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, the expectations of all random variables are assumed to be well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,\ldots,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be an integer-valued, positive random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;, show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of some random variable, and write down the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
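The Chernoff bound above can be compared against exact binomial tails, since Bernoulli trials with equal parameters are a special case of independent Poisson trials. The Python sketch below is illustrative only; the parameters n, p and the thresholds are our arbitrary choices.

```python
import math

# Sanity-check the Chernoff bound for X ~ Binomial(n, p), a special
# case of a sum of independent Poisson trials with equal p_i.
n, p = 100, 0.3
mu = n * p  # = 30

def upper_tail(a):
    """Exact Pr[X >= a] for X ~ Binomial(n, p)."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(a, n + 1))

results = []
for a in (33, 45, 60):                 # integer thresholds (1 + delta) * mu
    delta = a / mu - 1
    bound = (math.exp(delta) / (1 + delta) ** (1 + delta)) ** mu
    exact = upper_tail(a)
    results.append((a, exact, bound))
    print(f"a={a}: exact tail {exact:.3e} <= Chernoff bound {bound:.3e}")
```

The exact tail is always below the bound, and the gap widens as the deviation parameter grows.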
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039;s [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
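For very small n, f(n) can be computed by brute force, which gives a sanity check on the bound above. The Python sketch below is illustrative only (exponential time, feasible only for tiny n); the helper names are ours.

```python
from itertools import combinations

def distinct_subset_sums(s):
    """True iff all 2^|s| subset sums of s are pairwise distinct."""
    sums = {0}
    for x in s:
        shifted = {t + x for t in sums}
        if sums & shifted:      # two different subsets would share a sum
            return False
        sums |= shifted
    return True

def f(n):
    """Brute-force f(n): the largest m admitting such a set inside [n]."""
    for m in range(n, 0, -1):
        if any(distinct_subset_sums(c) for c in combinations(range(1, n + 1), m)):
            return m

for n in (4, 8):
    print(n, f(n))   # powers of two {1, 2, 4, ...} already give f(n) >= floor(log2 n) + 1
```

For n = 8, for instance, a size-5 set is impossible by pigeonhole (32 subset sums would have to fit in {0, ..., 30}), so f(8) = 4, matching the brute force.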
&lt;br /&gt;
== Problem 4 (K-th moment method vs. Chernoff bound) ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbb{E}[\exp(t|X|)]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. We can use two tail inequalities on &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;: the Chernoff bound and the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound.&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for every &amp;lt;math&amp;gt;\delta&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. (Hint: Consider the Taylor expansion of the moment generating function and apply the probabilistic method.)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the (seemingly stronger) &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13653</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13653"/>
		<updated>2026-04-21T10:16:37Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include the complete derivation; you may write in Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework conscientiously and master the course content, the final exam will &amp;lt;font color=red&amp;gt;randomly sample some problems&amp;lt;/font&amp;gt; from the homework. Please take every assignment seriously and make sure you understand the solution ideas.&lt;br /&gt;
&lt;br /&gt;
*If a homework problem sampled on the exam is answered incorrectly, incompletely, or not at all, the homework score will be &amp;lt;font color=red&amp;gt;penalized&amp;lt;/font&amp;gt; according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumption throughout Problem Set 3==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work in a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables involved are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,...,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent, identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;; show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is a moment-generating function of a random variable, and write the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
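The random-sum variance identity in [Variance (IV)] can be verified exactly on toy distributions by conditioning on N. The Python sketch below is illustrative only; the two toy laws are our arbitrary choices.

```python
from fractions import Fraction
from itertools import product

# Exact check of Var[X] = Var[X1]*E[N] + E[X1]^2*Var[N] for the random
# sum X = X_1 + ... + X_N, using small toy distributions.
N_dist = {1: Fraction(1, 2), 2: Fraction(1, 2)}                     # law of N
X_dist = {0: Fraction(1, 3), 1: Fraction(1, 3), 3: Fraction(1, 3)}  # law of each X_i

def moments(dist):
    """Exact (mean, variance) of a finite discrete law {value: prob}."""
    m1 = sum(p * v for v, p in dist.items())
    m2 = sum(p * v * v for v, p in dist.items())
    return m1, m2 - m1 * m1

EN, VarN = moments(N_dist)
EX, VarX1 = moments(X_dist)

# Build the exact law of X by conditioning on N.
law = {}
for n, pn in N_dist.items():
    for xs in product(X_dist, repeat=n):
        pr = pn
        for x in xs:
            pr *= X_dist[x]
        s = sum(xs)
        law[s] = law.get(s, Fraction(0)) + pr

EXsum, VarXsum = moments(law)
print("Var[X] =", VarXsum, " formula gives:", VarX1 * EN + EX * EX * VarN)
```

As a by-product, the mean of the random sum also matches Wald's identity E[X] = E[X_1]·E[N].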
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbb{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
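Cantelli's inequality in Problem 2 is in fact tight: a suitable two-point distribution attains equality. The Python sketch below (illustrative only; the values of lambda and sigma^2 are arbitrary) verifies the mean, the variance, and the equality exactly.

```python
from fractions import Fraction

# Cantelli's inequality Pr[X >= L] <= s2 / (L^2 + s2) is attained by a
# two-point law: X = L with prob s2/(L^2 + s2), else X = -s2/L.
L, s2 = Fraction(2), Fraction(3)   # lambda and sigma^2 (toy values)
p_hi = s2 / (L * L + s2)           # Pr[X = L]
lo = -s2 / L                       # the other support point
dist = {L: p_hi, lo: 1 - p_hi}

mean = sum(p * v for v, p in dist.items())
var = sum(p * v * v for v, p in dist.items()) - mean * mean
tail = sum(p for v, p in dist.items() if v >= L)

print("mean:", mean, " variance:", var, " tail:", tail, " bound:", s2 / (L * L + s2))
```

So X has mean 0 and variance sigma^2, and Pr[X >= lambda] equals the Cantelli bound exactly, which is why no better constant is possible.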
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039;s [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (K-th moment method vs. Chernoff bound) ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable such that the moment generating function &amp;lt;math&amp;gt;\mathbb{E}[\exp(t|X|)]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt;t &amp;gt; 0&amp;lt;/math&amp;gt;. We can use two tail inequalities on &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;: the Chernoff bound and the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound.&lt;br /&gt;
&lt;br /&gt;
(a) Prove that for every &amp;lt;math&amp;gt;\delta&amp;lt;/math&amp;gt;, there is a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound is stronger than the Chernoff bound. (Hint: Consider the Taylor expansion of the moment generating function and apply the probabilistic method.)&lt;br /&gt;
&lt;br /&gt;
(b) Why do we still prefer to use the Chernoff bound rather than the (seemingly stronger) &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound in algorithmic analysis?&lt;br /&gt;
&lt;br /&gt;
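The phenomenon in part (a) can be observed numerically. The sketch below (illustrative only; the Poisson parameter, threshold, truncation point, and moment range are our arbitrary choices) compares the optimized Chernoff bound with the best k-th moment bound for a Poisson upper tail.

```python
import math

# Compare the optimized Chernoff bound with the best k-th moment bound
# for the upper tail Pr[X >= a] of X ~ Poisson(lam).  Moments are
# computed numerically from the pmf, truncated deep in the tail.
lam, a = 1.0, 10
xs = range(200)
pmf = [math.exp(-lam) * lam**x / math.factorial(x) for x in xs]

def moment(k):
    """Numerical E[X^k] for X ~ Poisson(lam)."""
    return sum(p * x**k for x, p in zip(xs, pmf))

# Chernoff: inf_t E[e^{tX}] e^{-ta} = exp(lam*(e^t - 1) - t*a), optimal at t = ln(a/lam).
t = math.log(a / lam)
chernoff = math.exp(lam * (math.exp(t) - 1) - t * a)

# k-th moment bound: Pr[X >= a] <= E[X^k] / a^k; take the best k in a range.
moment_best = min(moment(k) / a**k for k in range(1, 40))
print(f"Chernoff: {chernoff:.3e}   best k-th moment bound: {moment_best:.3e}")
```

The best moment bound comes out below the optimized Chernoff bound, as part (a) predicts, yet the Chernoff bound has a closed form while the moment bound requires searching over k, which hints at the answer to part (b).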
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13652</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13652"/>
		<updated>2026-04-21T03:20:54Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include the complete derivation; you may write in Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the homework conscientiously and master the course content, the final exam will &amp;lt;font color=red&amp;gt;randomly sample some problems&amp;lt;/font&amp;gt; from the homework. Please take every assignment seriously and make sure you understand the solution ideas.&lt;br /&gt;
&lt;br /&gt;
*If a homework problem sampled on the exam is answered incorrectly, incompletely, or not at all, the homework score will be &amp;lt;font color=red&amp;gt;penalized&amp;lt;/font&amp;gt; according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumption throughout Problem Set 3==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work in a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables involved are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,...,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent, identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;; show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is a moment-generating function of a random variable, and write the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbb{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039;s [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)&amp;diff=13651</id>
		<title>概率论与数理统计 (Spring 2026)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)&amp;diff=13651"/>
		<updated>2026-04-21T03:17:20Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Assignments */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;&#039;&#039;&#039;概率论与数理统计&#039;&#039;&#039;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Probability Theory&#039;&#039;&#039; &amp;lt;br&amp;gt; &amp;amp; &#039;&#039;&#039;Mathematical Statistics&#039;&#039;&#039;&amp;lt;/font&amp;gt;&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =Instructor&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data2   = &#039;&#039;&#039;尹一通&#039;&#039;&#039;&lt;br /&gt;
|header3 = &lt;br /&gt;
|label3  = Email&lt;br /&gt;
|data3   = yinyt@nju.edu.cn  &lt;br /&gt;
|header4 =&lt;br /&gt;
|label4  = office&lt;br /&gt;
|data4   = 计算机系 804&lt;br /&gt;
|header5 = &lt;br /&gt;
|label5  = &lt;br /&gt;
|data5   = &#039;&#039;&#039;刘景铖&#039;&#039;&#039;&lt;br /&gt;
|header6 = &lt;br /&gt;
|label6  = Email&lt;br /&gt;
|data6   = liu@nju.edu.cn  &lt;br /&gt;
|header7 =&lt;br /&gt;
|label7  = office&lt;br /&gt;
|data7   = 计算机系 516&lt;br /&gt;
|header8 = Class&lt;br /&gt;
|label8  = &lt;br /&gt;
|data8   = &lt;br /&gt;
|header9 =&lt;br /&gt;
|label9  = Class meeting&lt;br /&gt;
|data9   = Wednesday, 9am-12pm&amp;lt;br&amp;gt;&lt;br /&gt;
仙Ⅱ-212&lt;br /&gt;
|header10=&lt;br /&gt;
|label10 = Office hour&lt;br /&gt;
|data10  = TBA &amp;lt;br&amp;gt;计算机系 804（尹一通）&amp;lt;br&amp;gt;计算机系 516（刘景铖）&lt;br /&gt;
|header11= Textbook&lt;br /&gt;
|label11 = &lt;br /&gt;
|data11  = &lt;br /&gt;
|header12=&lt;br /&gt;
|label12 = &lt;br /&gt;
|data12  = [[File:概率导论.jpeg|border|100px]]&lt;br /&gt;
|header13=&lt;br /&gt;
|label13 = &lt;br /&gt;
|data13  = &#039;&#039;&#039;概率导论&#039;&#039;&#039;（第2版·修订版）&amp;lt;br&amp;gt; Dimitri P. Bertsekas and John N. Tsitsiklis&amp;lt;br&amp;gt; translated by 郑忠国 and 童行伟; 人民邮电出版社 (2022)&lt;br /&gt;
|header14=&lt;br /&gt;
|label14 = &lt;br /&gt;
|data14  = [[File:Grimmett_probability.jpg|border|100px]]&lt;br /&gt;
|header15=&lt;br /&gt;
|label15 = &lt;br /&gt;
|data15  = &#039;&#039;&#039;Probability and Random Processes&#039;&#039;&#039; (4E) &amp;lt;br&amp;gt; Geoffrey Grimmett and David Stirzaker &amp;lt;br&amp;gt;  Oxford University Press (2020)&lt;br /&gt;
|header16=&lt;br /&gt;
|label16 = &lt;br /&gt;
|data16  = [[File:Probability_and_Computing_2ed.jpg|border|100px]]&lt;br /&gt;
|header17=&lt;br /&gt;
|label17 = &lt;br /&gt;
|data17  = &#039;&#039;&#039;Probability and Computing&#039;&#039;&#039; (2E) &amp;lt;br&amp;gt; Michael Mitzenmacher and Eli Upfal &amp;lt;br&amp;gt;   Cambridge University Press (2017)&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Probability Theory and Mathematical Statistics&#039;&#039; (概率论与数理统计) class of Spring 2026. Students who take this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= Announcement =&lt;br /&gt;
* TBA&lt;br /&gt;
&lt;br /&gt;
= Course info =&lt;br /&gt;
* &#039;&#039;&#039;Instructor &#039;&#039;&#039;: &lt;br /&gt;
:* [http://tcs.nju.edu.cn/yinyt/ 尹一通]：[mailto:yinyt@nju.edu.cn &amp;lt;yinyt@nju.edu.cn&amp;gt;]，计算机系 804 &lt;br /&gt;
:* [https://liuexp.github.io 刘景铖]：[mailto:liu@nju.edu.cn &amp;lt;liu@nju.edu.cn&amp;gt;]，计算机系 516 &lt;br /&gt;
* &#039;&#039;&#039;Teaching assistant&#039;&#039;&#039;:&lt;br /&gt;
** 鞠哲：[mailto:juzhe@smail.nju.edu.cn &amp;lt;juzhe@smail.nju.edu.cn&amp;gt;]，计算机系 426&lt;br /&gt;
** 祝永祺：[mailto:652025330045@smail.nju.edu.cn &amp;lt;652025330045@smail.nju.edu.cn&amp;gt;]，计算机系 426&lt;br /&gt;
* &#039;&#039;&#039;Class meeting&#039;&#039;&#039;:&lt;br /&gt;
** 周三：9am-12pm，仙Ⅱ-212&lt;br /&gt;
* &#039;&#039;&#039;Office hour&#039;&#039;&#039;: &lt;br /&gt;
:* TBA, 计算机系 804（尹一通）&lt;br /&gt;
:* TBA, 计算机系 516（刘景铖）&lt;br /&gt;
:* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1090092561（申请加入需提供姓名、院系、学号）&lt;br /&gt;
&lt;br /&gt;
= Syllabus =&lt;br /&gt;
课程内容分为三大部分：&lt;br /&gt;
* &#039;&#039;&#039;经典概率论&#039;&#039;&#039;：概率空间、随机变量及其数字特征、多维与连续随机变量、极限定理等内容&lt;br /&gt;
* &#039;&#039;&#039;概率与计算&#039;&#039;&#039;：测度集中现象 (concentration of measure)、概率法 (the probabilistic method)、离散随机过程的相关专题&lt;br /&gt;
* &#039;&#039;&#039;数理统计&#039;&#039;&#039;：参数估计、假设检验、贝叶斯估计、线性回归等统计推断概念&lt;br /&gt;
&lt;br /&gt;
对于第一和第二部分，要求清楚掌握基本概念，深刻理解关键的现象与规律以及背后的原理，并可以灵活运用所学方法求解相关问题。对于第三部分，要求熟悉数理统计的若干基本概念，以及典型的统计模型和统计推断问题。&lt;br /&gt;
&lt;br /&gt;
经过本课程的训练，力求使学生能够熟练掌握概率的语言，并会利用概率思维来理解客观世界并对其建模，以及驾驭概率的数学工具来分析和求解专业问题。&lt;br /&gt;
&lt;br /&gt;
=== 教材与参考书 Course Materials ===&lt;br /&gt;
* &#039;&#039;&#039;[BT]&#039;&#039;&#039; 概率导论（第2版·修订版），[美]伯特瑟卡斯（Dimitri P. Bertsekas）、[美]齐齐克利斯（John N. Tsitsiklis）著，郑忠国 童行伟 译，人民邮电出版社（2022）。&lt;br /&gt;
* &#039;&#039;&#039;[GS]&#039;&#039;&#039; &#039;&#039;Probability and Random Processes&#039;&#039;, by Geoffrey Grimmett and David Stirzaker; Oxford University Press; 4th edition (2020).&lt;br /&gt;
* &#039;&#039;&#039;[MU]&#039;&#039;&#039; &#039;&#039;Probability and Computing: Randomization and Probabilistic Techniques in Algorithms and Data Analysis&#039;&#039;, by Michael Mitzenmacher, Eli Upfal; Cambridge University Press; 2nd edition (2017).&lt;br /&gt;
&lt;br /&gt;
=== 成绩 Grading Policy ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考试。最终成绩将由平时作业成绩和期末考试成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力地维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：署你名字的工作必须是你个人的贡献。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= Assignments =&lt;br /&gt;
*[[概率论与数理统计 (Spring 2026)/Problem Set 1|Problem Set 1]]  请在 2026/4/1 上课之前(9am UTC+8)提交到 [mailto:pr2026_nju@163.com pr2026_nju@163.com] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_A1.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
** [[概率论与数理统计 (Spring 2026)/第一次作业提交名单|第一次作业提交名单]]&lt;br /&gt;
&lt;br /&gt;
*[[概率论与数理统计 (Spring 2026)/Problem Set 2|Problem Set 2]]  请在 2026/4/22 上课之前(9am UTC+8)提交到 [mailto:pr2026_nju@163.com pr2026_nju@163.com] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_A2.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
&lt;br /&gt;
*[[概率论与数理统计 (Spring 2026)/Problem Set 3|Problem Set 3]]  请在 TBA 上课之前(9am UTC+8)提交到 [mailto:pr2026_nju@163.com pr2026_nju@163.com] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_A3.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
&lt;br /&gt;
= Lectures =&lt;br /&gt;
# [http://tcs.nju.edu.cn/slides/prob2026/Intro.pdf 课程简介]&lt;br /&gt;
# [http://tcs.nju.edu.cn/slides/prob2026/ProbSpace.pdf 概率空间]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第1章&#039;&#039;&#039; 或 &#039;&#039;&#039;[GS] Chapter 1&#039;&#039;&#039;&lt;br /&gt;
#* [[概率论与数理统计 (Spring 2026)/Entropy and volume of Hamming balls|Entropy and volume of Hamming balls]]&lt;br /&gt;
#* [[概率论与数理统计 (Spring 2026)/Karger&#039;s min-cut algorithm| Karger&#039;s min-cut algorithm]]&lt;br /&gt;
# [http://tcs.nju.edu.cn/slides/prob2026/RandVar.pdf 随机变量]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第2章&#039;&#039;&#039; 或 &#039;&#039;&#039;[GS] Chapter 2, Sections 3.1~3.5, 3.7&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 2&#039;&#039;&#039;&lt;br /&gt;
#* [[概率论与数理统计 (Spring 2026)/Average-case analysis of QuickSort|Average-case analysis of &#039;&#039;&#039;&#039;&#039;QuickSort&#039;&#039;&#039;&#039;&#039;]]&lt;br /&gt;
# [http://tcs.nju.edu.cn/slides/prob2026/Deviation.pdf 矩与偏差]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 3&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 章节 2.4, 4.2, 4.3, 5.1&#039;&#039;&#039; 或 &#039;&#039;&#039;[GS] Sections 3.3, 3.6, 7.3&#039;&#039;&#039;&lt;br /&gt;
#* [[概率论与数理统计 (Spring 2026)/Threshold of k-clique in random graph|Threshold of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-clique in random graph]]&lt;br /&gt;
#* [[概率论与数理统计 (Spring 2026)/Weierstrass Approximation Theorem|Weierstrass approximation]]&lt;br /&gt;
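The probabilistic proof behind the Weierstrass approximation note above evaluates the Bernstein polynomial B_n(f; x) = E[f(X/n)] with X distributed as Bin(n, x). A minimal numerical sketch follows; the target function sin and the degree n = 200 are illustrative choices, not taken from the course notes.

```python
import math

def bernstein(f, n, x):
    """Bernstein polynomial B_n(f; x) = E[f(X/n)] for X ~ Binomial(n, x)."""
    return sum(
        f(k / n) * math.comb(n, k) * x**k * (1 - x) ** (n - k)
        for k in range(n + 1)
    )

# Uniform error of the degree-200 Bernstein approximation to sin on [0, 1].
f = math.sin
err = max(abs(bernstein(f, 200, i / 100) - f(i / 100)) for i in range(101))
assert 0.01 > err  # the approximation error is uniformly small
```

By Chebyshev's inequality, X/n concentrates around x, so B_n(f; x) converges to f(x) uniformly for continuous f as n grows.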
&lt;br /&gt;
= Concepts =&lt;br /&gt;
* [https://plato.stanford.edu/entries/probability-interpret/ Interpretations of probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/History_of_probability History of probability]&lt;br /&gt;
* Example problems:&lt;br /&gt;
** [https://dornsifecms.usc.edu/assets/sites/520/docs/VonNeumann-ams12p36-38.pdf von Neumann&#039;s Bernoulli factory] and other [https://peteroupc.github.io/bernoulli.html Bernoulli factory algorithms]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Boy_or_Girl_paradox Boy or Girl paradox]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Monty_Hall_problem Monty Hall problem]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bertrand_paradox_(probability) Bertrand paradox]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Hard_spheres Hard spheres model] and [https://en.wikipedia.org/wiki/Ising_model Ising model]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/PageRank &#039;&#039;PageRank&#039;&#039;] and stationary [https://en.wikipedia.org/wiki/Random_walk random walk]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Diffusion_process Diffusion process] and [https://en.wikipedia.org/wiki/Diffusion_model diffusion model]&lt;br /&gt;
*[https://en.wikipedia.org/wiki/Probability_space Probability space]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Sample_space Sample space]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Event_(probability_theory) Event] and [https://en.wikipedia.org/wiki/Σ-algebra &amp;lt;math&amp;gt;\sigma&amp;lt;/math&amp;gt;-algebra]&lt;br /&gt;
** Kolmogorov&#039;s [https://en.wikipedia.org/wiki/Probability_axioms axioms of probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Discrete_uniform_distribution Classical] and [https://en.wikipedia.org/wiki/Geometric_probability geometric probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Boole%27s_inequality Union bound]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Inclusion%E2%80%93exclusion_principle Inclusion-Exclusion principle]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Boole%27s_inequality#Bonferroni_inequalities Bonferroni inequalities]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Conditional_probability Conditional probability]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Chain_rule_(probability) Chain rule]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_total_probability Law of total probability]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bayes%27_theorem Bayes&#039; law]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Independence_(probability_theory) Independence] &lt;br /&gt;
** [https://en.wikipedia.org/wiki/Pairwise_independence Pairwise independence]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Random_variable Random variable]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Cumulative_distribution_function Cumulative distribution function]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Probability_mass_function Probability mass function]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Probability_density_function Probability density function]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Multivariate_random_variable Random vector]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Joint_probability_distribution Joint probability distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Conditional_probability_distribution Conditional probability distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Marginal_distribution Marginal distribution]&lt;br /&gt;
* Some &#039;&#039;&#039;discrete&#039;&#039;&#039; probability distributions&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bernoulli_trial Bernoulli trial] and [https://en.wikipedia.org/wiki/Bernoulli_distribution Bernoulli distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Discrete_uniform_distribution Discrete uniform distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Binomial_distribution Binomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Geometric_distribution Geometric distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Negative_binomial_distribution Negative binomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Hypergeometric_distribution Hypergeometric distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Poisson_distribution Poisson distribution]&lt;br /&gt;
** and [https://en.wikipedia.org/wiki/List_of_probability_distributions#Discrete_distributions others]&lt;br /&gt;
* Balls into bins model&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Multinomial_distribution Multinomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Birthday_problem Birthday problem]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Coupon_collector%27s_problem Coupon collector]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Balls_into_bins_problem Occupancy problem]&lt;br /&gt;
* Random graphs&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Erd%C5%91s%E2%80%93R%C3%A9nyi_model Erdős–Rényi random graph model]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Galton%E2%80%93Watson_process Galton–Watson branching process]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Expected_value Expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_the_unconscious_statistician Law of the unconscious statistician, &#039;&#039;LOTUS&#039;&#039;]&lt;br /&gt;
** [https://dlsun.github.io/probability/linearity.html Linearity of expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Conditional_expectation Conditional expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_total_expectation Law of total expectation]&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13650</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13650"/>
		<updated>2026-04-21T03:10:42Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Problem 2 (Inequalities) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*每道题目的解答都要有完整的解题过程，中英文不限。&lt;br /&gt;
&lt;br /&gt;
*我们推荐大家使用LaTeX, markdown等对作业进行排版。&lt;br /&gt;
&lt;br /&gt;
*为督促大家认真完成平时作业、扎实掌握课程内容，本课程期末考试将从作业题目中&amp;lt;font color=red&amp;gt;随机抽取部分题目&amp;lt;/font&amp;gt;进行考查。请大家务必重视每一次作业，认真理解解题思路。&lt;br /&gt;
&lt;br /&gt;
*若考试中被抽取到的作业题目答错、答不完整或无法作答，将按照相关标准对作业进行&amp;lt;font color=red&amp;gt;扣分处理&amp;lt;/font&amp;gt;。&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work in a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,...,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive, integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
More precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of some random variable, and write down the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
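The last item above, Covariance and correlation (III), can be sanity-checked by exact enumeration over the four equally likely outcomes. This is a verification sketch only, not a substitute for the requested proof; the helper names E and pr are ad hoc.

```python
from itertools import product

# The four outcomes of two independent Bernoulli(1/2) variables, each with probability 1/4.
outcomes = list(product([0, 1], repeat=2))

def E(f):
    """Expectation of f(X, Y) under the uniform distribution on the outcomes."""
    return sum(f(x, y) for x, y in outcomes) / 4

def pr(event):
    """Probability of an event given as a predicate on (X, Y)."""
    return sum(event(x, y) for x, y in outcomes) / 4

# Uncorrelated: Cov(X+Y, |X-Y|) = E[(X+Y)|X-Y|] - E[X+Y] E[|X-Y|] = 0.
cov = E(lambda x, y: (x + y) * abs(x - y)) - E(lambda x, y: x + y) * E(lambda x, y: abs(x - y))
assert cov == 0

# Dependent: the joint probability of {X+Y = 0} and {|X-Y| = 0} differs from the product.
joint = pr(lambda x, y: x + y == 0 and abs(x - y) == 0)
assert joint != pr(lambda x, y: x + y == 0) * pr(lambda x, y: abs(x - y) == 0)
```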
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbb{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff Bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\mathbf{Pr}[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
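The Chernoff bound stated in Problem 2 can be sanity-checked numerically. Below is a minimal Monte Carlo sketch, not part of the assignment: it assumes a common success probability p = 0.3, and the values of n, delta, and the number of simulation runs are illustrative choices.

```python
import math
import random

# Hypothetical parameters for illustration (not from the problem set):
# n Bernoulli(p) trials, deviation parameter delta, and simulation runs.
n, p, delta, runs = 100, 0.3, 0.5, 20000
mu = n * p  # expectation of X = X_1 + ... + X_n

# The Chernoff bound from Problem 2: Pr[X is at least (1+delta)mu] is at most this.
bound = (math.exp(delta) / (1 + delta) ** (1 + delta)) ** mu

# Estimate the tail probability empirically.
hits = sum(
    sum(p > random.random() for _ in range(n)) >= (1 + delta) * mu
    for _ in range(runs)
)
empirical = hits / runs
assert bound >= empirical  # the observed tail frequency stays below the bound
```

For these parameters the bound is roughly 0.04 while the empirical tail frequency is far smaller, illustrating that the bound is valid but not tight.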
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; whose subset sums are all distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are pairwise distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039;s [https://www.erdosproblems.com/1 first open problem] asks if &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13649</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13649"/>
		<updated>2026-04-21T03:08:37Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Problem 2 (Inequalities) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*每道题目的解答都要有完整的解题过程，中英文不限。&lt;br /&gt;
&lt;br /&gt;
*我们推荐大家使用LaTeX, markdown等对作业进行排版。&lt;br /&gt;
&lt;br /&gt;
*为督促大家认真完成平时作业、扎实掌握课程内容，本课程期末考试将从作业题目中&amp;lt;font color=red&amp;gt;随机抽取部分题目&amp;lt;/font&amp;gt;进行考查。请大家务必重视每一次作业，认真理解解题思路。&lt;br /&gt;
&lt;br /&gt;
*若考试中被抽取到的作业题目答错、答不完整或无法作答，将按照相关标准对作业进行&amp;lt;font color=red&amp;gt;扣分处理&amp;lt;/font&amp;gt;。&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work in a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,...,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive, integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
More precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of some random variable, and write down the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbb{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff Bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\Pr[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; whose subset sums are all distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are pairwise distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13648</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13648"/>
		<updated>2026-04-21T03:08:16Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Problem 2 (Inequalities) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include the complete reasoning; answers may be written in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the regular homework carefully and master the course material, the final exam will examine &amp;lt;font color=red&amp;gt;a random selection of problems&amp;lt;/font&amp;gt; drawn from the assignments. Please take every assignment seriously and make sure you understand the solution ideas.&lt;br /&gt;
&lt;br /&gt;
*If an assignment problem selected for the exam is answered incorrectly, incompletely, or not at all, the homework will be &amp;lt;font color=red&amp;gt;penalized&amp;lt;/font&amp;gt; according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work with a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,\ldots,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, the variables &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;; show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is a moment-generating function of a random variable, and write the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
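As a quick numerical sanity check of the Variance (IV) identity above, the sketch below (an editorial illustration, not part of the assignment; the choices of N uniform on {1,2,3} and X_i ~ Bernoulli(0.3) are hypothetical) compares the empirical variance of X = X_1 + ... + X_N against Var[X_1]E[N] + E[X_1]^2 Var[N]:

```python
import random

# Monte-Carlo sketch of the Variance (IV) identity
#   Var[X] = Var[X_1]*E[N] + E[X_1]^2 * Var[N]  for  X = sum_{i=1}^N X_i,
# with N and the i.i.d. X_i independent. The distributions below
# (N uniform on {1,2,3}, X_i ~ Bernoulli(0.3)) are hypothetical choices.
random.seed(1)

p = 0.3
mean_x1, var_x1 = p, p * (1 - p)      # E[X_1], Var[X_1]
mean_n, var_n = 2.0, 2.0 / 3.0        # E[N], Var[N] for N uniform on {1,2,3}

n = 300_000
totals = []
for _ in range(n):
    N = random.randint(1, 3)          # sample N
    totals.append(sum(1 for _ in range(N) if random.random() < p))

mean = sum(totals) / n
var = sum((t - mean) ** 2 for t in totals) / n
predicted = var_x1 * mean_n + mean_x1 ** 2 * var_n   # = 0.48
assert abs(var - predicted) < 0.02, (var, predicted)
```

The empirical variance lands close to the predicted 0.48; this checks the identity on one concrete example but of course does not replace the proof.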
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff Bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,\ldots,X_n&amp;lt;/math&amp;gt; be independent Poisson trials. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;center&amp;gt;&amp;lt;math&amp;gt;\Pr[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&amp;lt;/center&amp;gt;&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
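Cantelli's inequality from Problem 2 can also be sanity-checked numerically. The sketch below (an editorial illustration, not part of the assignment) uses a hypothetical mean-zero two-point distribution with Pr[X=-1]=3/4 and Pr[X=3]=1/4, so sigma^2 = 3, and the bound sigma^2/(lambda^2+sigma^2) is tight at lambda = 3:

```python
import random

# Monte-Carlo sketch of Cantelli's inequality: for a mean-zero X with
# variance sigma^2,  Pr[X >= lam] <= sigma^2 / (lam^2 + sigma^2).
# The two-point distribution below is a hypothetical example chosen so
# that the bound holds with equality at lam = 3.
random.seed(0)

values, probs = [-1.0, 3.0], [0.75, 0.25]               # mean 0
sigma2 = sum(p * v * v for v, p in zip(values, probs))  # variance = 3.0

n = 200_000
samples = random.choices(values, weights=probs, k=n)
for lam in [1.0, 2.0, 3.0]:
    empirical = sum(1 for x in samples if x >= lam) / n
    bound = sigma2 / (lam * lam + sigma2)
    assert empirical <= bound + 0.01, (lam, empirical, bound)
```

At lam = 3 both the true probability and the bound equal 1/4, which illustrates that Cantelli's inequality cannot be improved in general.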
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose sums are distinct. Namely, &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct for all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13647</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13647"/>
		<updated>2026-04-21T03:05:17Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Problem 2 (Inequalities) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include the complete reasoning; answers may be written in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the regular homework carefully and master the course material, the final exam will examine &amp;lt;font color=red&amp;gt;a random selection of problems&amp;lt;/font&amp;gt; drawn from the assignments. Please take every assignment seriously and make sure you understand the solution ideas.&lt;br /&gt;
&lt;br /&gt;
*If an assignment problem selected for the exam is answered incorrectly, incompletely, or not at all, the homework will be &amp;lt;font color=red&amp;gt;penalized&amp;lt;/font&amp;gt; according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work with a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,\ldots,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, the variables &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;; show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is a moment-generating function of a random variable, and write the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff Bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,\ldots,X_n&amp;lt;/math&amp;gt; be independent Poisson trials such that &amp;lt;math&amp;gt;\mathbf{Pr}(X_i=1)=p_i&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;math&amp;gt;\Pr[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose sums are distinct. Namely, &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct for all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13646</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13646"/>
		<updated>2026-04-21T03:04:53Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Problem 2 (Inequalities) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include the complete reasoning; answers may be written in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the regular homework carefully and master the course material, the final exam will examine &amp;lt;font color=red&amp;gt;a random selection of problems&amp;lt;/font&amp;gt; drawn from the assignments. Please take every assignment seriously and make sure you understand the solution ideas.&lt;br /&gt;
&lt;br /&gt;
*If an assignment problem selected for the exam is answered incorrectly, incompletely, or not at all, the homework will be &amp;lt;font color=red&amp;gt;penalized&amp;lt;/font&amp;gt; according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumptions throughout Problem Set 3 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work with a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,\ldots,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, the variables &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;; show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is a moment-generating function of a random variable, and write the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chernoff Bound]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X_1,...,X_n&amp;lt;/math&amp;gt; be independent Poisson trials such that &amp;lt;math&amp;gt;\mathbf{Pr}(X_i)=p_i&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;X=\sum \limits_{i=1}^n X_i&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mu=\mathbb{E}[X]&amp;lt;/math&amp;gt;. Then for any &amp;lt;math&amp;gt;\delta&amp;gt;0&amp;lt;/math&amp;gt;,&lt;br /&gt;
::&amp;lt;math&amp;gt;\Pr[X\ge (1+\delta)\mu]\le\left(\frac{e^{\delta}}{(1+\delta)^{(1+\delta)}}\right)^{\mu}.&amp;lt;/math&amp;gt;&lt;br /&gt;
}}&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
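As a quick numerical sanity check of the Chernoff bound above (illustrative only, not part of the assignment), the bound can be compared with an empirical tail frequency. The success probabilities and the value of delta below are arbitrary choices, not taken from the problem text:

```python
import math
import random

# Monte Carlo sanity check of the Chernoff bound stated above.
# The probabilities p_i and delta are arbitrary illustrative choices.
random.seed(0)
p = [0.1, 0.3, 0.5, 0.2, 0.4] * 20        # success probabilities p_i
mu = sum(p)                                # mu = E[X] = 30
delta = 0.5

trials = 20_000
hits = 0
for _ in range(trials):
    x = sum(1 for pi in p if pi > random.random())
    if x >= (1 + delta) * mu:
        hits += 1

empirical = hits / trials
bound = (math.exp(delta) / (1 + delta) ** (1 + delta)) ** mu
assert bound >= empirical                  # the bound dominates the frequency
```

Here the bound evaluates to about 0.039, while the empirical frequency of the event that X is at least 45 is far smaller, consistent with the inequality.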
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
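For intuition about &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; (not part of the assignment), the quantity can be computed by brute force for small &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt;. The helper names below are our own illustrative choices, and the running time is exponential:

```python
from itertools import combinations

# Brute-force computation of f(n) for small n (exponential time; the
# helper names has_distinct_sums and f are illustrative, not from the text).
def has_distinct_sums(xs):
    """True iff all subset sums of xs are pairwise distinct."""
    sums = set()
    for r in range(len(xs) + 1):
        for s in combinations(xs, r):
            t = sum(s)
            if t in sums:
                return False
            sums.add(t)
    return True

def f(n):
    """Largest m such that some m-subset of {1,...,n} has distinct subset sums."""
    for m in range(n, 0, -1):
        for xs in combinations(range(1, n + 1), m):
            if has_distinct_sums(xs):
                return m
    return 0

# Powers of two {1, 2, 4, ...} always work, so f(n) is at least floor(log2 n) + 1.
assert f(4) == 3   # {1, 2, 4} works, but no 4-subset of [4] does
```

For instance, the witness for f(4) = 3 is {1, 2, 4}, matching the power-of-two lower bound.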
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13645</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13645"/>
		<updated>2026-04-21T02:59:33Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Problem 2 (Inequalities) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include a complete derivation; you may write in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the regular homework carefully and master the course material solidly, the final exam of this course will &amp;lt;font color=red&amp;gt;randomly sample some problems&amp;lt;/font&amp;gt; from the assignments. Please take every assignment seriously and make sure you understand the solution ideas.&lt;br /&gt;
&lt;br /&gt;
*If a sampled homework problem is answered incorrectly, incompletely, or not at all in the exam, points will be &amp;lt;font color=red&amp;gt;deducted&amp;lt;/font&amp;gt; from the homework grade according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumption throughout Problem Set 3==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work on a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of the random variables involved are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,\ldots,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when the individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of a random variable, and write down the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
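The last warm-up item above can be checked exactly by enumerating the four equally likely outcomes of the pair (X, Y); a minimal sketch, assuming independent fair Bernoulli X and Y as in the problem:

```python
from fractions import Fraction
from itertools import product

# Exact enumeration (a check, not a proof) that S = X + Y and T = |X - Y| are
# uncorrelated yet dependent for independent fair Bernoulli X and Y.
half = Fraction(1, 2)
outcomes = [(x, y, half * half) for x, y in product([0, 1], repeat=2)]

def E(f):
    """Expectation of f(X, Y) over the four equally likely outcomes."""
    return sum(p * f(x, y) for x, y, p in outcomes)

cov = E(lambda x, y: (x + y) * abs(x - y)) - E(lambda x, y: x + y) * E(lambda x, y: abs(x - y))
assert cov == 0   # Cov(S, T) = 0: uncorrelated

# Dependence: Pr[S = 0, T = 1] = 0 while Pr[S = 0] * Pr[T = 1] is positive.
p_joint = sum(p for x, y, p in outcomes if x + y == 0 and abs(x - y) == 1)
p_s0 = sum(p for x, y, p in outcomes if x + y == 0)
p_t1 = sum(p for x, y, p in outcomes if abs(x - y) == 1)
assert p_joint == 0 and p_s0 * p_t1 > 0   # joint law differs from the product
```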
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality (I)]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
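As a concrete check of Cantelli's inequality above (not part of the assignment), both sides can be evaluated exactly on a small example; the two-point distribution below is an arbitrary illustrative choice:

```python
from fractions import Fraction

# Exact check of Cantelli's inequality on a mean-zero two-point distribution:
# Pr[X = 3] = 1/4, Pr[X = -1] = 3/4 (an arbitrary illustrative example).
dist = {Fraction(3): Fraction(1, 4), Fraction(-1): Fraction(3, 4)}

mean = sum(p * x for x, p in dist.items())
assert mean == 0                               # E[X] = 0
var = sum(p * x**2 for x, p in dist.items())   # sigma^2 = 9/4 + 3/4 = 3

for lam in [Fraction(1), Fraction(2), Fraction(3)]:
    tail = sum(p for x, p in dist.items() if x >= lam)
    assert var / (lam**2 + var) >= tail        # Cantelli dominates the tail
```

At lambda = 3 the two sides coincide (both equal 1/4), which shows the bound can be attained by two-point distributions.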
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13644</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13644"/>
		<updated>2026-04-21T02:59:12Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Problem 2 (Inequalities) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include a complete derivation; you may write in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the regular homework carefully and master the course material solidly, the final exam of this course will &amp;lt;font color=red&amp;gt;randomly sample some problems&amp;lt;/font&amp;gt; from the assignments. Please take every assignment seriously and make sure you understand the solution ideas.&lt;br /&gt;
&lt;br /&gt;
*If a sampled homework problem is answered incorrectly, incompletely, or not at all in the exam, points will be &amp;lt;font color=red&amp;gt;deducted&amp;lt;/font&amp;gt; from the homework grade according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumption throughout Problem Set 3==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work on a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of the random variables involved are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,\ldots,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when the individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of a random variable, and write down the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
&amp;lt;strong&amp;gt;[Chebyshev&#039;s inequality (I)]&amp;lt;/strong&amp;gt; Fix &amp;lt;math&amp;gt;0 &amp;lt; b \le a&amp;lt;/math&amp;gt;. Construct a random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb{E}[X^2] = b^2&amp;lt;/math&amp;gt; for which &amp;lt;math&amp;gt;\mathbf{Pr}(|X| \ge a) = b^2/a^2&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose subset sums are distinct. Namely, the sums &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct over all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13643</id>
		<title>概率论与数理统计 (Spring 2026)/Problem Set 3</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/Problem_Set_3&amp;diff=13643"/>
		<updated>2026-04-21T02:52:30Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: Created page with &amp;quot;*每道题目的解答都要有完整的解题过程，中英文不限。  *我们推荐大家使用LaTeX, markdown等对作业进行排版。  *为督促大家认真完成平时作业、扎实掌握课程内容，本课程期末考试将从作业题目中&amp;lt;font color=red&amp;gt;随机抽取部分题目&amp;lt;/font&amp;gt;进行考查。请大家务必重视每一次作业，认真理解解题思路。  *若考试中被抽取到的作业题目答错、答不完整或无法作答，将按照...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*Every solution must include a complete derivation; you may write in either Chinese or English.&lt;br /&gt;
&lt;br /&gt;
*We recommend typesetting your homework with LaTeX, Markdown, or similar tools.&lt;br /&gt;
&lt;br /&gt;
*To encourage you to complete the regular homework carefully and master the course material solidly, the final exam of this course will &amp;lt;font color=red&amp;gt;randomly sample some problems&amp;lt;/font&amp;gt; from the assignments. Please take every assignment seriously and make sure you understand the solution ideas.&lt;br /&gt;
&lt;br /&gt;
*If a sampled homework problem is answered incorrectly, incompletely, or not at all in the exam, points will be &amp;lt;font color=red&amp;gt;deducted&amp;lt;/font&amp;gt; from the homework grade according to the relevant standards.&lt;br /&gt;
&lt;br /&gt;
== Assumption throughout Problem Set 3==&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work on a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\mathbf{Pr})&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of the random variables involved are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Throughout, &amp;lt;math&amp;gt;\log&amp;lt;/math&amp;gt; denotes the natural logarithm.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Warm-up Problems) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;X_1,X_2,\ldots,X_n&amp;lt;/math&amp;gt; be independent random variables, and suppose that &amp;lt;math&amp;gt;X_k&amp;lt;/math&amp;gt; is Bernoulli with parameter &amp;lt;math&amp;gt;p_k&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;Y= X_1 + X_2 + \dots + X_n&amp;lt;/math&amp;gt;. Show that, for &amp;lt;math&amp;gt;\mathbb E[Y]&amp;lt;/math&amp;gt; fixed, &amp;lt;math&amp;gt;\mathrm{Var}(Y)&amp;lt;/math&amp;gt; is maximized when &amp;lt;math&amp;gt;p_1 = p_2 = \dots = p_n&amp;lt;/math&amp;gt;. That is to say, the variation in the sum is greatest when the individuals are most alike.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Each member of a group of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; players rolls a (fair) 6-sided die. For any pair of players who throw the same number, the group scores &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; point. Find the mean and variance of the total score of the group.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        An urn contains &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls numbered &amp;lt;math&amp;gt;1, 2, \ldots, n&amp;lt;/math&amp;gt;. We select &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; balls uniformly at random &amp;lt;strong&amp;gt;without replacement&amp;lt;/strong&amp;gt; and add up their numbers. Find the mean and variance of the sum.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Variance (IV)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
      Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be a positive integer-valued random variable and let &amp;lt;math&amp;gt;\{X_i\}_{i=1}^{\infty}&amp;lt;/math&amp;gt; be independent and identically distributed random variables that are also independent of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;. &lt;br /&gt;
Precisely, for any finite subset &amp;lt;math&amp;gt;I \subseteq\mathbb{N}_+&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\{X_i\}_{i \in I}&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are mutually independent. Let &amp;lt;math&amp;gt;X = \sum_{i=1}^N X_i&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\textbf{Var}[X] = \textbf{Var}[X_1] \mathbb{E}[N] + \mathbb{E}[X_1]^2 \textbf{Var}[N]&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt; [&amp;lt;strong&amp;gt;Moments (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Show that &amp;lt;math&amp;gt;G(t) = \frac{e^t}{4} + \frac{e^{-t}}{2} + \frac{1}{4}&amp;lt;/math&amp;gt; is the moment-generating function of a random variable, and write down the probability mass function of this random variable.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Geo}(p)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;p \in (0,1)&amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Moments (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X\sim \text{Pois}(\lambda)&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;\lambda &amp;gt;0 &amp;lt;/math&amp;gt;. Find &amp;lt;math&amp;gt;\mathbb{E}[X^3]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\mathbb{E}[X^4]&amp;lt;/math&amp;gt;. &lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (I)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
        Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;|\rho|= 1&amp;lt;/math&amp;gt; if and only if &amp;lt;math&amp;gt;X=aY+b&amp;lt;/math&amp;gt; for some real numbers &amp;lt;math&amp;gt;a,b&amp;lt;/math&amp;gt;.&lt;br /&gt;
    &amp;lt;/li&amp;gt;&lt;br /&gt;
    &amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (II)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
    Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be discrete random variables with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;, variance &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;, and correlation &amp;lt;math&amp;gt;\rho&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbb{E}(\max\{X^2,Y^2\})\leq 1+\sqrt{1-\rho^2}&amp;lt;/math&amp;gt;. (Hint: use the identity &amp;lt;math&amp;gt;\max\{a,b\} = \frac{1}{2}(a+b+|a-b|)&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;[&amp;lt;strong&amp;gt;Covariance and correlation (III)&amp;lt;/strong&amp;gt;]&lt;br /&gt;
   Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; be independent Bernoulli random variables with parameter &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X+Y&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|X-Y|&amp;lt;/math&amp;gt; are dependent though uncorrelated.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Inequalities) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Reverse Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with bounded range &amp;lt;math&amp;gt;0 \le X \le U&amp;lt;/math&amp;gt; for some &amp;lt;math&amp;gt;U &amp;gt; 0&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}(X \le a) \le \frac{U-\mathbf{E}[X]}{U-a}&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; U&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Markov&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable. Show that for all &amp;lt;math&amp;gt;\beta \geq 0&amp;lt;/math&amp;gt; and all &amp;lt;math&amp;gt;x &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}(X\geq x)\leq \mathbb{E}(e^{\beta X})e^{-\beta x}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;&lt;br /&gt;
    &amp;lt;strong&amp;gt;[Cantelli&#039;s inequality]&amp;lt;/strong&amp;gt; Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a discrete random variable with mean &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; and variance &amp;lt;math&amp;gt;\sigma^2&amp;lt;/math&amp;gt;. Prove that for any &amp;lt;math&amp;gt;\lambda &amp;gt; 0&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2}{\lambda^2+\sigma^2}&amp;lt;/math&amp;gt;. (Hint: You may first show that &amp;lt;math&amp;gt;\mathbf{Pr}[X \ge \lambda] \le \frac{\sigma^2 + u^2}{(\lambda + u)^2}&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;u &amp;gt; 0&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt; &amp;lt;strong&amp;gt;[Union of events]&amp;lt;/strong&amp;gt;&lt;br /&gt;
Let &amp;lt;math&amp;gt;A_1,A_2,\ldots,A_n&amp;lt;/math&amp;gt; be events with &amp;lt;math&amp;gt;\mathbf{Pr}[A_i] &amp;gt; 0&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;1 \le i \le n&amp;lt;/math&amp;gt;. Define &amp;lt;math&amp;gt;a = \sum_{i=1}^n \mathbf{Pr}[A_i]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;b = \sum_{1 \le i&amp;lt;j\le n} \mathbf{Pr}[A_i \cap A_j]&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\mathbf{Pr}[A_1 \cup A_2 \cup \ldots \cup A_n] \ge \max\left\{2-\frac{a+2b}{a^2}, \frac{a^2}{a+2b}\right\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Probability meets distinct sums) ==&lt;br /&gt;
&amp;lt;ul&amp;gt;    &lt;br /&gt;
&lt;br /&gt;
Let &amp;lt;math&amp;gt;f(n)&amp;lt;/math&amp;gt; denote the maximal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that there exists a set of &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; distinct numbers &amp;lt;math&amp;gt;\{x_1,x_2,\ldots,x_m\}&amp;lt;/math&amp;gt;&lt;br /&gt;
in &amp;lt;math&amp;gt;[n] = \{1,2,\ldots,n\}&amp;lt;/math&amp;gt; all of whose sums are distinct. Namely, &amp;lt;math&amp;gt;\sum_{i \in S} x_i&amp;lt;/math&amp;gt; are distinct for all &amp;lt;math&amp;gt;S \subseteq \{1,2,\ldots,m\}&amp;lt;/math&amp;gt;.&lt;br /&gt;
Use the second moment method (i.e., Chebyshev&#039;s inequality) to show that &amp;lt;math&amp;gt;f(n) \le \log_2 n + \frac{1}{2} \log_2 \log_2 n + O(1)&amp;lt;/math&amp;gt;. (Remark: Erdős&#039; [https://www.erdosproblems.com/1 first open problem] asks whether &amp;lt;math&amp;gt;f(n) \le \log_2 n + C&amp;lt;/math&amp;gt; for some universal constant &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt;.)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/%E7%AC%AC%E4%B8%80%E6%AC%A1%E4%BD%9C%E4%B8%9A%E6%8F%90%E4%BA%A4%E5%90%8D%E5%8D%95&amp;diff=13596</id>
		<title>概率论与数理统计 (Spring 2026)/第一次作业提交名单</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/%E7%AC%AC%E4%B8%80%E6%AC%A1%E4%BD%9C%E4%B8%9A%E6%8F%90%E4%BA%A4%E5%90%8D%E5%8D%95&amp;diff=13596"/>
		<updated>2026-04-01T08:03:11Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;If you find any errors or omissions, please contact the TA by email promptly.&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
! Student ID !! Name&lt;br /&gt;
|-&lt;br /&gt;
| 241240046 || 黄嘉诚 &lt;br /&gt;
|-&lt;br /&gt;
| 241880173 || 陆知渔 &lt;br /&gt;
|-&lt;br /&gt;
| 241240064 || 黄昕宇 &lt;br /&gt;
|-&lt;br /&gt;
| 241240068 || 郑飞阳 &lt;br /&gt;
|-&lt;br /&gt;
| 241240036 || 唐愉兵 &lt;br /&gt;
|-&lt;br /&gt;
| 241850002 || 张子腾 &lt;br /&gt;
|-&lt;br /&gt;
| 241840199 || 陈诣涵 &lt;br /&gt;
|-&lt;br /&gt;
| 241240049 || 罗嘉恒 &lt;br /&gt;
|-&lt;br /&gt;
| 241870077 || 张辰曦&lt;br /&gt;
|-&lt;br /&gt;
| 241220095 || 王天祥&lt;br /&gt;
|-&lt;br /&gt;
| 241098114 || 于静涵&lt;br /&gt;
|-&lt;br /&gt;
| 241820122 || 商世雄&lt;br /&gt;
|-&lt;br /&gt;
| 241870240 || 杨学舟&lt;br /&gt;
|-&lt;br /&gt;
| 241240019 || 王祎泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240070 || 刘梦溪&lt;br /&gt;
|-&lt;br /&gt;
| 241880503 || 郑一鸣&lt;br /&gt;
|-&lt;br /&gt;
| 241870025 || 张科杰&lt;br /&gt;
|-&lt;br /&gt;
| 241840087 || 朱枻&lt;br /&gt;
|-&lt;br /&gt;
| 241880488 || 扶嘉年&lt;br /&gt;
|-&lt;br /&gt;
| 241220003 || 沈琪皓&lt;br /&gt;
|-&lt;br /&gt;
| 241240033 || 付雨彤&lt;br /&gt;
|-&lt;br /&gt;
| 241220004 || 张宸源&lt;br /&gt;
|-&lt;br /&gt;
| 241840052 || 李彦均&lt;br /&gt;
|-&lt;br /&gt;
| 241840173 || 刘明俊&lt;br /&gt;
|-&lt;br /&gt;
| 241870097 || 丁瀚铭&lt;br /&gt;
|-&lt;br /&gt;
| 231870061 || 朱俊杰&lt;br /&gt;
|-&lt;br /&gt;
| 241240048 || 康子凯&lt;br /&gt;
|-&lt;br /&gt;
| 241240061 || 周泽钰&lt;br /&gt;
|-&lt;br /&gt;
| 241840005 || 蔡云汉&lt;br /&gt;
|-&lt;br /&gt;
| 241240004 || 陈仝&lt;br /&gt;
|-&lt;br /&gt;
| 221180155 || 许云鹏&lt;br /&gt;
|-&lt;br /&gt;
| 241870230 || 闵文楷&lt;br /&gt;
|-&lt;br /&gt;
| 241240038 || 胡彦腾&lt;br /&gt;
|-&lt;br /&gt;
| 231250084 || 谢钦煌&lt;br /&gt;
|-&lt;br /&gt;
| 241240001 || 董清扬&lt;br /&gt;
|-&lt;br /&gt;
| 241220002 || 张瑞珉&lt;br /&gt;
|-&lt;br /&gt;
| 241240050 || 李柱锃&lt;br /&gt;
|-&lt;br /&gt;
| 241240041 || 李东泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240069 || 陈姝婷&lt;br /&gt;
|-&lt;br /&gt;
| 241240053 || 张家奇&lt;br /&gt;
|-&lt;br /&gt;
| 241240032 || 崔佳雪&lt;br /&gt;
|-&lt;br /&gt;
| 241840065 || 荣恒嬉&lt;br /&gt;
|-&lt;br /&gt;
| 241240029 || 谢骐泽&lt;br /&gt;
|-&lt;br /&gt;
| 241840078 || 张惠泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240051 || 何明航&lt;br /&gt;
|-&lt;br /&gt;
| 221830067 || 张笑&lt;br /&gt;
|-&lt;br /&gt;
| 241240035 || 周玟序&lt;br /&gt;
|-&lt;br /&gt;
| 241240008 || 张恒畅&lt;br /&gt;
|-&lt;br /&gt;
| 241240066 || 贺子铭&lt;br /&gt;
|-&lt;br /&gt;
| 241180041 || 赵瀚清&lt;br /&gt;
|-&lt;br /&gt;
| 241240028 || 冯时&lt;br /&gt;
|-&lt;br /&gt;
| 241240007 || 杨煦天&lt;br /&gt;
|-&lt;br /&gt;
| 241840240 || 李味鸿&lt;br /&gt;
|-&lt;br /&gt;
| 241870040 || 陆子一&lt;br /&gt;
|-&lt;br /&gt;
| 251275003 || 翟悦凯&lt;br /&gt;
|-&lt;br /&gt;
| 211830049 || 王杰&lt;br /&gt;
|-&lt;br /&gt;
| 241870032 || 赵益&lt;br /&gt;
|-&lt;br /&gt;
| 241220136 || 祁书轩&lt;br /&gt;
|-&lt;br /&gt;
| 241240017 || 江子林&lt;br /&gt;
|-&lt;br /&gt;
| 241240022 || 潘诚懿&lt;br /&gt;
|-&lt;br /&gt;
| 241276007 || 胡博&lt;br /&gt;
|-&lt;br /&gt;
| 241880325 || 刘孟阳&lt;br /&gt;
|-&lt;br /&gt;
| 241840067 || 赵思景&lt;br /&gt;
|-&lt;br /&gt;
| 241276008 || 袁颀沣&lt;br /&gt;
|-&lt;br /&gt;
| 241840113 || 曾睿鸣&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;br /&gt;
65 students in total.&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/%E7%AC%AC%E4%B8%80%E6%AC%A1%E4%BD%9C%E4%B8%9A%E6%8F%90%E4%BA%A4%E5%90%8D%E5%8D%95&amp;diff=13593</id>
		<title>概率论与数理统计 (Spring 2026)/第一次作业提交名单</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/%E7%AC%AC%E4%B8%80%E6%AC%A1%E4%BD%9C%E4%B8%9A%E6%8F%90%E4%BA%A4%E5%90%8D%E5%8D%95&amp;diff=13593"/>
		<updated>2026-04-01T04:50:15Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;If you find any errors or omissions, please contact the TA by email promptly.&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
! Student ID !! Name&lt;br /&gt;
|-&lt;br /&gt;
| 241240046 || 黄嘉诚 &lt;br /&gt;
|-&lt;br /&gt;
| 241880173 || 陆知渔 &lt;br /&gt;
|-&lt;br /&gt;
| 241240064 || 黄昕宇 &lt;br /&gt;
|-&lt;br /&gt;
| 241240068 || 郑飞阳 &lt;br /&gt;
|-&lt;br /&gt;
| 241240036 || 唐愉兵 &lt;br /&gt;
|-&lt;br /&gt;
| 241850002 || 张子腾 &lt;br /&gt;
|-&lt;br /&gt;
| 241840199 || 陈诣涵 &lt;br /&gt;
|-&lt;br /&gt;
| 241240049 || 罗嘉恒 &lt;br /&gt;
|-&lt;br /&gt;
| 241870077 || 张辰曦&lt;br /&gt;
|-&lt;br /&gt;
| 241220095 || 王天祥&lt;br /&gt;
|-&lt;br /&gt;
| 241098114 || 于静涵&lt;br /&gt;
|-&lt;br /&gt;
| 241820122 || 商世雄&lt;br /&gt;
|-&lt;br /&gt;
| 241870240 || 杨学舟&lt;br /&gt;
|-&lt;br /&gt;
| 241240019 || 王祎泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240070 || 刘梦溪&lt;br /&gt;
|-&lt;br /&gt;
| 241880503 || 郑一鸣&lt;br /&gt;
|-&lt;br /&gt;
| 241870025 || 张科杰&lt;br /&gt;
|-&lt;br /&gt;
| 241840087 || 朱枻&lt;br /&gt;
|-&lt;br /&gt;
| 241880488 || 扶嘉年&lt;br /&gt;
|-&lt;br /&gt;
| 241220003 || 沈琪皓&lt;br /&gt;
|-&lt;br /&gt;
| 241240033 || 付雨彤&lt;br /&gt;
|-&lt;br /&gt;
| 241220004 || 张宸源&lt;br /&gt;
|-&lt;br /&gt;
| 241840052 || 李彦均&lt;br /&gt;
|-&lt;br /&gt;
| 241840173 || 刘明俊&lt;br /&gt;
|-&lt;br /&gt;
| 241870097 || 丁瀚铭&lt;br /&gt;
|-&lt;br /&gt;
| 231870061 || 朱俊杰&lt;br /&gt;
|-&lt;br /&gt;
| 241240048 || 康子凯&lt;br /&gt;
|-&lt;br /&gt;
| 241240061 || 周泽钰&lt;br /&gt;
|-&lt;br /&gt;
| 241840005 || 蔡云汉&lt;br /&gt;
|-&lt;br /&gt;
| 241240004 || 陈仝&lt;br /&gt;
|-&lt;br /&gt;
| 221180155 || 许云鹏&lt;br /&gt;
|-&lt;br /&gt;
| 241870230 || 闵文楷&lt;br /&gt;
|-&lt;br /&gt;
| 241240038 || 胡彦腾&lt;br /&gt;
|-&lt;br /&gt;
| 231250084 || 谢钦煌&lt;br /&gt;
|-&lt;br /&gt;
| 241240001 || 董清扬&lt;br /&gt;
|-&lt;br /&gt;
| 241220002 || 张瑞珉&lt;br /&gt;
|-&lt;br /&gt;
| 241240050 || 李柱锃&lt;br /&gt;
|-&lt;br /&gt;
| 241240041 || 李东泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240069 || 陈姝婷&lt;br /&gt;
|-&lt;br /&gt;
| 241240053 || 张家奇&lt;br /&gt;
|-&lt;br /&gt;
| 241240032 || 崔佳雪&lt;br /&gt;
|-&lt;br /&gt;
| 241840065 || 荣恒嬉&lt;br /&gt;
|-&lt;br /&gt;
| 241240029 || 谢骐泽&lt;br /&gt;
|-&lt;br /&gt;
| 241840078 || 张惠泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240051 || 何明航&lt;br /&gt;
|-&lt;br /&gt;
| 221830067 || 张笑&lt;br /&gt;
|-&lt;br /&gt;
| 241240035 || 周玟序&lt;br /&gt;
|-&lt;br /&gt;
| 241240008 || 张恒畅&lt;br /&gt;
|-&lt;br /&gt;
| 241240066 || 贺子铭&lt;br /&gt;
|-&lt;br /&gt;
| 241180041 || 赵瀚清&lt;br /&gt;
|-&lt;br /&gt;
| 241240028 || 冯时&lt;br /&gt;
|-&lt;br /&gt;
| 241240007 || 杨煦天&lt;br /&gt;
|-&lt;br /&gt;
| 241840240 || 李味鸿&lt;br /&gt;
|-&lt;br /&gt;
| 241870040 || 陆子一&lt;br /&gt;
|-&lt;br /&gt;
| 251275003 || 翟悦凯&lt;br /&gt;
|-&lt;br /&gt;
| 211830049 || 王杰&lt;br /&gt;
|-&lt;br /&gt;
| 241870032 || 赵益&lt;br /&gt;
|-&lt;br /&gt;
| 241220136 || 祁书轩&lt;br /&gt;
|-&lt;br /&gt;
| 241240017 || 江子林&lt;br /&gt;
|-&lt;br /&gt;
| 241276007 || 胡博&lt;br /&gt;
|-&lt;br /&gt;
| 241880325 || 刘孟阳&lt;br /&gt;
|-&lt;br /&gt;
| 241840067 || 赵思景&lt;br /&gt;
|-&lt;br /&gt;
| 241276008 || 袁颀沣&lt;br /&gt;
|-&lt;br /&gt;
| 241840113 || 曾睿鸣&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;br /&gt;
64 students in total.&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)&amp;diff=13592</id>
		<title>概率论与数理统计 (Spring 2026)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)&amp;diff=13592"/>
		<updated>2026-04-01T04:22:56Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: /* Assignments */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;&#039;&#039;&#039;概率论与数理统计&#039;&#039;&#039;&amp;lt;br&amp;gt;&lt;br /&gt;
&#039;&#039;&#039;Probability Theory&#039;&#039;&#039; &amp;lt;br&amp;gt; &amp;amp; &#039;&#039;&#039;Mathematical Statistics&#039;&#039;&#039;&amp;lt;/font&amp;gt;&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =Instructor&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data2   = &#039;&#039;&#039;尹一通&#039;&#039;&#039;&lt;br /&gt;
|header3 = &lt;br /&gt;
|label3  = Email&lt;br /&gt;
|data3   = yinyt@nju.edu.cn  &lt;br /&gt;
|header4 =&lt;br /&gt;
|label4  = office&lt;br /&gt;
|data4   = Computer Science Building 804&lt;br /&gt;
|header5 = &lt;br /&gt;
|label5  = &lt;br /&gt;
|data5   = &#039;&#039;&#039;刘景铖&#039;&#039;&#039;&lt;br /&gt;
|header6 = &lt;br /&gt;
|label6  = Email&lt;br /&gt;
|data6   = liu@nju.edu.cn  &lt;br /&gt;
|header7 =&lt;br /&gt;
|label7  = office&lt;br /&gt;
|data7   = Computer Science Building 516&lt;br /&gt;
|header8 = Class&lt;br /&gt;
|label8  = &lt;br /&gt;
|data8   = &lt;br /&gt;
|header9 =&lt;br /&gt;
|label9  = Class meeting&lt;br /&gt;
|data9   = Wednesday, 9am-12pm&amp;lt;br&amp;gt;&lt;br /&gt;
仙Ⅱ-212&lt;br /&gt;
|header10=&lt;br /&gt;
|label10 = Office hour&lt;br /&gt;
|data10  = TBA &amp;lt;br&amp;gt;Computer Science Building 804 (尹一通)&amp;lt;br&amp;gt;Computer Science Building 516 (刘景铖)&lt;br /&gt;
|header11= Textbook&lt;br /&gt;
|label11 = &lt;br /&gt;
|data11  = &lt;br /&gt;
|header12=&lt;br /&gt;
|label12 = &lt;br /&gt;
|data12  = [[File:概率导论.jpeg|border|100px]]&lt;br /&gt;
|header13=&lt;br /&gt;
|label13 = &lt;br /&gt;
|data13  = &#039;&#039;&#039;概率导论&#039;&#039;&#039; (Introduction to Probability), 2nd revised edition &amp;lt;br&amp;gt; Dimitri P. Bertsekas and John N. Tsitsiklis&amp;lt;br&amp;gt; translated by 郑忠国 and 童行伟; Posts &amp;amp; Telecom Press (2022)&lt;br /&gt;
|header14=&lt;br /&gt;
|label14 = &lt;br /&gt;
|data14  = [[File:Grimmett_probability.jpg|border|100px]]&lt;br /&gt;
|header15=&lt;br /&gt;
|label15 = &lt;br /&gt;
|data15  = &#039;&#039;&#039;Probability and Random Processes&#039;&#039;&#039; (4E) &amp;lt;br&amp;gt; Geoffrey Grimmett and David Stirzaker &amp;lt;br&amp;gt;  Oxford University Press (2020)&lt;br /&gt;
|header16=&lt;br /&gt;
|label16 = &lt;br /&gt;
|data16  = [[File:Probability_and_Computing_2ed.jpg|border|100px]]&lt;br /&gt;
|header17=&lt;br /&gt;
|label17 = &lt;br /&gt;
|data17  = &#039;&#039;&#039;Probability and Computing&#039;&#039;&#039; (2E) &amp;lt;br&amp;gt; Michael Mitzenmacher and Eli Upfal &amp;lt;br&amp;gt;   Cambridge University Press (2017)&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Probability Theory and Mathematical Statistics&#039;&#039; (概率论与数理统计) class of Spring 2026. Students who take this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= Announcement =&lt;br /&gt;
* TBA&lt;br /&gt;
&lt;br /&gt;
= Course info =&lt;br /&gt;
* &#039;&#039;&#039;Instructor&#039;&#039;&#039;: &lt;br /&gt;
:* [http://tcs.nju.edu.cn/yinyt/ 尹一通]: [mailto:yinyt@nju.edu.cn &amp;lt;yinyt@nju.edu.cn&amp;gt;], Computer Science Building 804 &lt;br /&gt;
:* [https://liuexp.github.io 刘景铖]: [mailto:liu@nju.edu.cn &amp;lt;liu@nju.edu.cn&amp;gt;], Computer Science Building 516 &lt;br /&gt;
* &#039;&#039;&#039;Teaching assistant&#039;&#039;&#039;:&lt;br /&gt;
** 鞠哲: [mailto:juzhe@smail.nju.edu.cn &amp;lt;juzhe@smail.nju.edu.cn&amp;gt;], Computer Science Building 426&lt;br /&gt;
** 祝永祺: [mailto:652025330045@smail.nju.edu.cn &amp;lt;652025330045@smail.nju.edu.cn&amp;gt;], Computer Science Building 426&lt;br /&gt;
* &#039;&#039;&#039;Class meeting&#039;&#039;&#039;:&lt;br /&gt;
** Wednesday: 9am-12pm, 仙Ⅱ-212&lt;br /&gt;
* &#039;&#039;&#039;Office hour&#039;&#039;&#039;: &lt;br /&gt;
:* TBA, Computer Science Building 804 (尹一通)&lt;br /&gt;
:* TBA, Computer Science Building 516 (刘景铖)&lt;br /&gt;
:* &#039;&#039;&#039;QQ group&#039;&#039;&#039;: 1090092561 (to join, provide your name, department, and student ID)&lt;br /&gt;
&lt;br /&gt;
= Syllabus =&lt;br /&gt;
The course content is divided into three major parts:&lt;br /&gt;
* &#039;&#039;&#039;Classical probability theory&#039;&#039;&#039;: probability spaces, random variables and their numerical characteristics, multivariate and continuous random variables, limit theorems, etc.&lt;br /&gt;
* &#039;&#039;&#039;Probability and computing&#039;&#039;&#039;: concentration of measure, the probabilistic method, and selected topics on discrete stochastic processes&lt;br /&gt;
* &#039;&#039;&#039;Mathematical statistics&#039;&#039;&#039;: statistical inference concepts such as parameter estimation, hypothesis testing, Bayesian estimation, and linear regression&lt;br /&gt;
&lt;br /&gt;
For the first two parts, students are expected to firmly grasp the basic concepts, deeply understand the key phenomena and laws together with the principles behind them, and flexibly apply the methods learned to solve related problems. For the third part, students are expected to be familiar with the basic concepts of mathematical statistics, as well as the typical statistical models and statistical inference problems.&lt;br /&gt;
&lt;br /&gt;
Through the training of this course, students should become fluent in the language of probability, learn to use probabilistic thinking to understand and model the real world, and command the mathematical tools of probability to analyze and solve problems in their own field.&lt;br /&gt;
&lt;br /&gt;
=== 教材与参考书 Course Materials ===&lt;br /&gt;
* &#039;&#039;&#039;[BT]&#039;&#039;&#039; 概率导论 (Introduction to Probability), 2nd revised edition, by Dimitri P. Bertsekas and John N. Tsitsiklis; translated by 郑忠国 and 童行伟; Posts &amp;amp; Telecom Press (2022).&lt;br /&gt;
* &#039;&#039;&#039;[GS]&#039;&#039;&#039; &#039;&#039;Probability and Random Processes&#039;&#039;, by Geoffrey Grimmett and David Stirzaker; Oxford University Press; 4th edition (2020).&lt;br /&gt;
* &#039;&#039;&#039;[MU]&#039;&#039;&#039; &#039;&#039;Probability and Computing: Randomization and Probabilistic Techniques in Algorithms and Data Analysis&#039;&#039;, by Michael Mitzenmacher, Eli Upfal; Cambridge University Press; 2nd edition (2017).&lt;br /&gt;
&lt;br /&gt;
=== 成绩 Grading Policy ===&lt;br /&gt;
* Course grade: there will be several homework assignments and one final exam. The final grade will be determined by combining the homework scores with the final exam score.&lt;br /&gt;
* Late submissions: if, for a special reason, you cannot finish an assignment on time, contact the instructor in advance and give a valid justification; otherwise, late assignments will not be accepted.&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
Academic integrity is the most basic professional and ethical baseline for every student and scholar engaged in academic work. This course will spare no effort to uphold the norms of academic integrity, and violations of this baseline will not be tolerated.&lt;br /&gt;
&lt;br /&gt;
The principle for completing assignments: work that bears your name must be your own contribution. Discussion is allowed while working on an assignment, provided that all participants in the discussion are at the same stage of completion. However, the execution of key ideas and the writing of the assignment text must be done independently, and all discussion participants must be acknowledged in the assignment. Discussions and acknowledgements that follow these rules will not affect your score. No other form of collaboration is allowed, in particular &quot;discussing&quot; with classmates who have already finished the assignment.&lt;br /&gt;
&lt;br /&gt;
This course takes a zero-tolerance attitude toward plagiarism. When completing an assignment, directly copying text from the work of others (publications, internet materials, other people&#039;s assignments, etc.), as well as copying key ideas or key elements, will be regarded as plagiarism, following the interpretation of the [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]. Plagiarists will have their grades voided. If mutual copying is discovered, &amp;lt;font color=red&amp;gt;the grades of both the copier and the copied will be voided&amp;lt;/font&amp;gt;. Therefore, please take the initiative to prevent your assignments from being copied by others.&lt;br /&gt;
&lt;br /&gt;
Academic integrity shapes a student&#039;s personal character and also underpins the proper functioning of the entire educational system. Committing academic misconduct for a few points not only turns you into a cheater, but also devalues the honest efforts of others. Let us work together to maintain an environment of integrity.&lt;br /&gt;
&lt;br /&gt;
= Assignments =&lt;br /&gt;
*[[概率论与数理统计 (Spring 2026)/Problem Set 1|Problem Set 1]]  Please submit to [mailto:pr2026_nju@163.com pr2026_nju@163.com] before class (9am UTC+8) on 2026/4/1, with the filename &#039;&amp;lt;font color=red &amp;gt;学号_姓名_A1.pdf&amp;lt;/font&amp;gt;&#039; (i.e., student ID_name_A1.pdf).&lt;br /&gt;
** [[概率论与数理统计 (Spring 2026)/第一次作业提交名单|第一次作业提交名单]]&lt;br /&gt;
&lt;br /&gt;
= Lectures =&lt;br /&gt;
# [http://tcs.nju.edu.cn/slides/prob2026/Intro.pdf Course introduction]&lt;br /&gt;
# [http://tcs.nju.edu.cn/slides/prob2026/ProbSpace.pdf Probability space]&lt;br /&gt;
#* Reading: &#039;&#039;&#039;[BT] Chapter 1&#039;&#039;&#039; or &#039;&#039;&#039;[GS] Chapter 1&#039;&#039;&#039;&lt;br /&gt;
#* [[概率论与数理统计 (Spring 2026)/Entropy and volume of Hamming balls|Entropy and volume of Hamming balls]]&lt;br /&gt;
#* [[概率论与数理统计 (Spring 2026)/Karger&#039;s min-cut algorithm| Karger&#039;s min-cut algorithm]]&lt;br /&gt;
# [http://tcs.nju.edu.cn/slides/prob2026/RandVar.pdf Random variables]&lt;br /&gt;
#* Reading: &#039;&#039;&#039;[BT] Chapter 2&#039;&#039;&#039; or &#039;&#039;&#039;[GS] Chapter 2, Sections 3.1~3.5, 3.7&#039;&#039;&#039;&lt;br /&gt;
#* Reading: &#039;&#039;&#039;[MU] Chapter 2&#039;&#039;&#039;&lt;br /&gt;
&lt;br /&gt;
= Concepts =&lt;br /&gt;
* [https://plato.stanford.edu/entries/probability-interpret/ Interpretations of probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/History_of_probability History of probability]&lt;br /&gt;
* Example problems:&lt;br /&gt;
** [https://dornsifecms.usc.edu/assets/sites/520/docs/VonNeumann-ams12p36-38.pdf von Neumann&#039;s Bernoulli factory] and other [https://peteroupc.github.io/bernoulli.html Bernoulli factory algorithms]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Boy_or_Girl_paradox Boy or Girl paradox]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Monty_Hall_problem Monty Hall problem]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bertrand_paradox_(probability) Bertrand paradox]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Hard_spheres Hard spheres model] and [https://en.wikipedia.org/wiki/Ising_model Ising model]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/PageRank &#039;&#039;PageRank&#039;&#039;] and stationary [https://en.wikipedia.org/wiki/Random_walk random walk]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Diffusion_process Diffusion process] and [https://en.wikipedia.org/wiki/Diffusion_model diffusion model]&lt;br /&gt;
*[https://en.wikipedia.org/wiki/Probability_space Probability space]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Sample_space Sample space]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Event_(probability_theory) Event] and [https://en.wikipedia.org/wiki/Σ-algebra &amp;lt;math&amp;gt;\sigma&amp;lt;/math&amp;gt;-algebra]&lt;br /&gt;
** Kolmogorov&#039;s [https://en.wikipedia.org/wiki/Probability_axioms axioms of probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Discrete_uniform_distribution Classical] and [https://en.wikipedia.org/wiki/Geometric_probability geometric probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Boole%27s_inequality Union bound]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Inclusion%E2%80%93exclusion_principle Inclusion-Exclusion principle]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Boole%27s_inequality#Bonferroni_inequalities Bonferroni inequalities]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Conditional_probability Conditional probability]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Chain_rule_(probability) Chain rule]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_total_probability Law of total probability]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bayes%27_theorem Bayes&#039; law]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Independence_(probability_theory) Independence] &lt;br /&gt;
** [https://en.wikipedia.org/wiki/Pairwise_independence Pairwise independence]&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/%E7%AC%AC%E4%B8%80%E6%AC%A1%E4%BD%9C%E4%B8%9A%E6%8F%90%E4%BA%A4%E5%90%8D%E5%8D%95&amp;diff=13591</id>
		<title>概率论与数理统计 (Spring 2026)/第一次作业提交名单</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/%E7%AC%AC%E4%B8%80%E6%AC%A1%E4%BD%9C%E4%B8%9A%E6%8F%90%E4%BA%A4%E5%90%8D%E5%8D%95&amp;diff=13591"/>
		<updated>2026-04-01T04:21:38Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;If you find any errors or omissions, please contact the TA by email promptly.&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
! Student ID !! Name&lt;br /&gt;
|-&lt;br /&gt;
| 241240046 || 黄嘉诚 &lt;br /&gt;
|-&lt;br /&gt;
| 241880173 || 陆知渔 &lt;br /&gt;
|-&lt;br /&gt;
| 241240064 || 黄昕宇 &lt;br /&gt;
|-&lt;br /&gt;
| 241240068 || 郑飞阳 &lt;br /&gt;
|-&lt;br /&gt;
| 241240036 || 唐愉兵 &lt;br /&gt;
|-&lt;br /&gt;
| 241850002 || 张子腾 &lt;br /&gt;
|-&lt;br /&gt;
| 241840199 || 陈诣涵 &lt;br /&gt;
|-&lt;br /&gt;
| 241240049 || 罗嘉恒 &lt;br /&gt;
|-&lt;br /&gt;
| 241870077 || 张辰曦&lt;br /&gt;
|-&lt;br /&gt;
| 241220095 || 王天祥&lt;br /&gt;
|-&lt;br /&gt;
| 241098114 || 于静涵&lt;br /&gt;
|-&lt;br /&gt;
| 241820122 || 商世雄&lt;br /&gt;
|-&lt;br /&gt;
| 241870240 || 杨学舟&lt;br /&gt;
|-&lt;br /&gt;
| 241240019 || 王祎泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240070 || 刘梦溪&lt;br /&gt;
|-&lt;br /&gt;
| 241880503 || 郑一鸣&lt;br /&gt;
|-&lt;br /&gt;
| 241870025 || 张科杰&lt;br /&gt;
|-&lt;br /&gt;
| 241840087 || 朱枻&lt;br /&gt;
|-&lt;br /&gt;
| 241880488 || 扶嘉年&lt;br /&gt;
|-&lt;br /&gt;
| 241220003 || 沈琪皓&lt;br /&gt;
|-&lt;br /&gt;
| 241240033 || 付雨彤&lt;br /&gt;
|-&lt;br /&gt;
| 241220004 || 张宸源&lt;br /&gt;
|-&lt;br /&gt;
| 241840052 || 李彦均&lt;br /&gt;
|-&lt;br /&gt;
| 241840173 || 刘明俊&lt;br /&gt;
|-&lt;br /&gt;
| 241870097 || 丁瀚铭&lt;br /&gt;
|-&lt;br /&gt;
| 231870061 || 朱俊杰&lt;br /&gt;
|-&lt;br /&gt;
| 241240048 || 康子凯&lt;br /&gt;
|-&lt;br /&gt;
| 241240061 || 周泽钰&lt;br /&gt;
|-&lt;br /&gt;
| 241840005 || 蔡云汉&lt;br /&gt;
|-&lt;br /&gt;
| 241240004 || 陈仝&lt;br /&gt;
|-&lt;br /&gt;
| 221180155 || 许云鹏&lt;br /&gt;
|-&lt;br /&gt;
| 241870230 || 闵文楷&lt;br /&gt;
|-&lt;br /&gt;
| 241240038 || 胡彦腾&lt;br /&gt;
|-&lt;br /&gt;
| 231250084 || 谢钦煌&lt;br /&gt;
|-&lt;br /&gt;
| 241240001 || 董清扬&lt;br /&gt;
|-&lt;br /&gt;
| 241220002 || 张瑞珉&lt;br /&gt;
|-&lt;br /&gt;
| 241240050 || 李柱锃&lt;br /&gt;
|-&lt;br /&gt;
| 241240041 || 李东泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240069 || 陈姝婷&lt;br /&gt;
|-&lt;br /&gt;
| 241240053 || 张家奇&lt;br /&gt;
|-&lt;br /&gt;
| 241240032 || 崔佳雪&lt;br /&gt;
|-&lt;br /&gt;
| 241840065 || 荣恒嬉&lt;br /&gt;
|-&lt;br /&gt;
| 241240029 || 谢骐泽&lt;br /&gt;
|-&lt;br /&gt;
| 241840078 || 张惠泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240051 || 何明航&lt;br /&gt;
|-&lt;br /&gt;
| 221830067 || 张笑&lt;br /&gt;
|-&lt;br /&gt;
| 241240035 || 周玟序&lt;br /&gt;
|-&lt;br /&gt;
| 241240008 || 张恒畅&lt;br /&gt;
|-&lt;br /&gt;
| 241240066 || 贺子铭&lt;br /&gt;
|-&lt;br /&gt;
| 241180041 || 赵瀚清&lt;br /&gt;
|-&lt;br /&gt;
| 241240028 || 冯时&lt;br /&gt;
|-&lt;br /&gt;
| 241840240 || 李味鸿&lt;br /&gt;
|-&lt;br /&gt;
| 241870040 || 陆子一&lt;br /&gt;
|-&lt;br /&gt;
| 251275003 || 翟悦凯&lt;br /&gt;
|-&lt;br /&gt;
| 211830049 || 王杰&lt;br /&gt;
|-&lt;br /&gt;
| 241870032 || 赵益&lt;br /&gt;
|-&lt;br /&gt;
| 241220136 || 祁书轩&lt;br /&gt;
|-&lt;br /&gt;
| 241240017 || 江子林&lt;br /&gt;
|-&lt;br /&gt;
| 241276007 || 胡博&lt;br /&gt;
|-&lt;br /&gt;
| 241880325 || 刘孟阳&lt;br /&gt;
|-&lt;br /&gt;
| 241840067 || 赵思景&lt;br /&gt;
|-&lt;br /&gt;
| 241276008 || 袁颀沣&lt;br /&gt;
|-&lt;br /&gt;
| 241840113 || 曾睿鸣&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;br /&gt;
63 students in total.&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/%E7%AC%AC%E4%B8%80%E6%AC%A1%E4%BD%9C%E4%B8%9A%E6%8F%90%E4%BA%A4%E5%90%8D%E5%8D%95&amp;diff=13590</id>
		<title>概率论与数理统计 (Spring 2026)/第一次作业提交名单</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/%E7%AC%AC%E4%B8%80%E6%AC%A1%E4%BD%9C%E4%B8%9A%E6%8F%90%E4%BA%A4%E5%90%8D%E5%8D%95&amp;diff=13590"/>
		<updated>2026-04-01T04:21:30Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;If you find any errors or omissions, please contact the TA by email promptly.&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
! Student ID !! Name&lt;br /&gt;
|-&lt;br /&gt;
| 241240046 || 黄嘉诚 &lt;br /&gt;
|-&lt;br /&gt;
| 241880173 || 陆知渔 &lt;br /&gt;
|-&lt;br /&gt;
| 241240064 || 黄昕宇 &lt;br /&gt;
|-&lt;br /&gt;
| 241240068 || 郑飞阳 &lt;br /&gt;
|-&lt;br /&gt;
| 241240036 || 唐愉兵 &lt;br /&gt;
|-&lt;br /&gt;
| 241850002 || 张子腾 &lt;br /&gt;
|-&lt;br /&gt;
| 241840199 || 陈诣涵 &lt;br /&gt;
|-&lt;br /&gt;
| 241240049 || 罗嘉恒 &lt;br /&gt;
|-&lt;br /&gt;
| 241870077 || 张辰曦&lt;br /&gt;
|-&lt;br /&gt;
| 241220095 || 王天祥&lt;br /&gt;
|-&lt;br /&gt;
| 241098114 || 于静涵&lt;br /&gt;
|-&lt;br /&gt;
| 241820122 || 商世雄&lt;br /&gt;
|-&lt;br /&gt;
| 241870240 || 杨学舟&lt;br /&gt;
|-&lt;br /&gt;
| 241240019 || 王祎泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240070 || 刘梦溪&lt;br /&gt;
|-&lt;br /&gt;
| 241880503 || 郑一鸣&lt;br /&gt;
|-&lt;br /&gt;
| 241870025 || 张科杰&lt;br /&gt;
|-&lt;br /&gt;
| 241840087 || 朱枻&lt;br /&gt;
|-&lt;br /&gt;
| 241880488 || 扶嘉年&lt;br /&gt;
|-&lt;br /&gt;
| 241220003 || 沈琪皓&lt;br /&gt;
|-&lt;br /&gt;
| 241240033 || 付雨彤&lt;br /&gt;
|-&lt;br /&gt;
| 241220004 || 张宸源&lt;br /&gt;
|-&lt;br /&gt;
| 241840052 || 李彦均&lt;br /&gt;
|-&lt;br /&gt;
| 241840173 || 刘明俊&lt;br /&gt;
|-&lt;br /&gt;
| 241870097 || 丁瀚铭&lt;br /&gt;
|-&lt;br /&gt;
| 231870061 || 朱俊杰&lt;br /&gt;
|-&lt;br /&gt;
| 241240048 || 康子凯&lt;br /&gt;
|-&lt;br /&gt;
| 241240061 || 周泽钰&lt;br /&gt;
|-&lt;br /&gt;
| 241840005 || 蔡云汉&lt;br /&gt;
|-&lt;br /&gt;
| 241240004 || 陈仝&lt;br /&gt;
|-&lt;br /&gt;
| 221180155 || 许云鹏&lt;br /&gt;
|-&lt;br /&gt;
| 241870230 || 闵文楷&lt;br /&gt;
|-&lt;br /&gt;
| 241240038 || 胡彦腾&lt;br /&gt;
|-&lt;br /&gt;
| 231250084 || 谢钦煌&lt;br /&gt;
|-&lt;br /&gt;
| 241240001 || 董清扬&lt;br /&gt;
|-&lt;br /&gt;
| 241220002 || 张瑞珉&lt;br /&gt;
|-&lt;br /&gt;
| 241240050 || 李柱锃&lt;br /&gt;
|-&lt;br /&gt;
| 241240041 || 李东泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240069 || 陈姝婷&lt;br /&gt;
|-&lt;br /&gt;
| 241240053 || 张家奇&lt;br /&gt;
|-&lt;br /&gt;
| 241240032 || 崔佳雪&lt;br /&gt;
|-&lt;br /&gt;
| 241840065 || 荣恒嬉&lt;br /&gt;
|-&lt;br /&gt;
| 241240029 || 谢骐泽&lt;br /&gt;
|-&lt;br /&gt;
| 241840078 || 张惠泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240051 || 何明航&lt;br /&gt;
|-&lt;br /&gt;
| 221830067 || 张笑&lt;br /&gt;
|-&lt;br /&gt;
| 241240035 || 周玟序&lt;br /&gt;
|-&lt;br /&gt;
| 241240008 || 张恒畅&lt;br /&gt;
|-&lt;br /&gt;
| 241240066 || 贺子铭&lt;br /&gt;
|-&lt;br /&gt;
| 241180041 || 赵瀚清&lt;br /&gt;
|-&lt;br /&gt;
| 241240028 || 冯时&lt;br /&gt;
|-&lt;br /&gt;
| 241840240 || 李味鸿&lt;br /&gt;
|-&lt;br /&gt;
| 241870040 || 陆子一&lt;br /&gt;
|-&lt;br /&gt;
| 251275003 || 翟悦凯&lt;br /&gt;
|-&lt;br /&gt;
| 211830049 || 王杰&lt;br /&gt;
|-&lt;br /&gt;
| 241870032 || 赵益&lt;br /&gt;
|-&lt;br /&gt;
| 241220136 || 祁书轩&lt;br /&gt;
|-&lt;br /&gt;
| 241240017 || 江子林&lt;br /&gt;
|-&lt;br /&gt;
| 241276007 || 胡博&lt;br /&gt;
|-&lt;br /&gt;
| 241880325 || 刘孟阳&lt;br /&gt;
|-&lt;br /&gt;
| 241840067 || 赵思景&lt;br /&gt;
|-&lt;br /&gt;
| 241276008 || 袁颀沣&lt;br /&gt;
|-&lt;br /&gt;
| 241840113 || 曾睿鸣&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;br /&gt;
63 students in total.&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/%E7%AC%AC%E4%B8%80%E6%AC%A1%E4%BD%9C%E4%B8%9A%E6%8F%90%E4%BA%A4%E5%90%8D%E5%8D%95&amp;diff=13589</id>
		<title>概率论与数理统计 (Spring 2026)/第一次作业提交名单</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/%E7%AC%AC%E4%B8%80%E6%AC%A1%E4%BD%9C%E4%B8%9A%E6%8F%90%E4%BA%A4%E5%90%8D%E5%8D%95&amp;diff=13589"/>
		<updated>2026-04-01T04:20:41Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;If you find any errors or omissions, please contact the TA by email promptly.&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
! Student ID !! Name&lt;br /&gt;
|-&lt;br /&gt;
| 241240046 || 黄嘉诚 &lt;br /&gt;
|-&lt;br /&gt;
| 241880173 || 陆知渔 &lt;br /&gt;
|-&lt;br /&gt;
| 241240064 || 黄昕宇 &lt;br /&gt;
|-&lt;br /&gt;
| 241240068 || 郑飞阳 &lt;br /&gt;
|-&lt;br /&gt;
| 241240036 || 唐愉兵 &lt;br /&gt;
|-&lt;br /&gt;
| 241850002 || 张子腾 &lt;br /&gt;
|-&lt;br /&gt;
| 241840199 || 陈诣涵 &lt;br /&gt;
|-&lt;br /&gt;
| 241240049 || 罗嘉恒 &lt;br /&gt;
|-&lt;br /&gt;
| 241870077 || 张辰曦&lt;br /&gt;
|-&lt;br /&gt;
| 241220095 || 王天祥&lt;br /&gt;
|-&lt;br /&gt;
| 241098114 || 于静涵&lt;br /&gt;
|-&lt;br /&gt;
| 241820122 || 商世雄&lt;br /&gt;
|-&lt;br /&gt;
| 241870240 || 杨学舟&lt;br /&gt;
|-&lt;br /&gt;
| 241240019 || 王祎泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240070 || 刘梦溪&lt;br /&gt;
|-&lt;br /&gt;
| 241880503 || 郑一鸣&lt;br /&gt;
|-&lt;br /&gt;
| 241870025 || 张科杰&lt;br /&gt;
|-&lt;br /&gt;
| 241840087 || 朱枻&lt;br /&gt;
|-&lt;br /&gt;
| 241880488 || 扶嘉年&lt;br /&gt;
|-&lt;br /&gt;
| 241220003 || 沈琪皓&lt;br /&gt;
|-&lt;br /&gt;
| 241240033 || 付雨彤&lt;br /&gt;
|-&lt;br /&gt;
| 241220004 || 张宸源&lt;br /&gt;
|-&lt;br /&gt;
| 241840052 || 李彦均&lt;br /&gt;
|-&lt;br /&gt;
| 241840173 || 刘明俊&lt;br /&gt;
|-&lt;br /&gt;
| 241870097 || 丁瀚铭&lt;br /&gt;
|-&lt;br /&gt;
| 231870061 || 朱俊杰&lt;br /&gt;
|-&lt;br /&gt;
| 241240048 || 康子凯&lt;br /&gt;
|-&lt;br /&gt;
| 241240061 || 周泽钰&lt;br /&gt;
|-&lt;br /&gt;
| 241840005 || 蔡云汉&lt;br /&gt;
|-&lt;br /&gt;
| 241240004 || 陈仝&lt;br /&gt;
|-&lt;br /&gt;
| 221180155 || 许云鹏&lt;br /&gt;
|-&lt;br /&gt;
| 241870230 || 闵文楷&lt;br /&gt;
|-&lt;br /&gt;
| 241240038 || 胡彦腾&lt;br /&gt;
|-&lt;br /&gt;
| 231250084 || 谢钦煌&lt;br /&gt;
|-&lt;br /&gt;
| 241240001 || 董清扬&lt;br /&gt;
|-&lt;br /&gt;
| 241220002 || 张瑞珉&lt;br /&gt;
|-&lt;br /&gt;
| 241240050 || 李柱锃&lt;br /&gt;
|-&lt;br /&gt;
| 241240041 || 李东泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240069 || 陈姝婷&lt;br /&gt;
|-&lt;br /&gt;
| 241240053 || 张家奇&lt;br /&gt;
|-&lt;br /&gt;
| 241240032 || 崔佳雪&lt;br /&gt;
|-&lt;br /&gt;
| 241840065 || 荣恒嬉&lt;br /&gt;
|-&lt;br /&gt;
| 241240029 || 谢骐泽&lt;br /&gt;
|-&lt;br /&gt;
| 241840078 || 张惠泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240051 || 何明航&lt;br /&gt;
|-&lt;br /&gt;
| 221830067 || 张笑&lt;br /&gt;
|-&lt;br /&gt;
| 241240035 || 周玟序&lt;br /&gt;
|-&lt;br /&gt;
| 241240008 || 张恒畅&lt;br /&gt;
|-&lt;br /&gt;
| 241240066 || 贺子铭&lt;br /&gt;
|-&lt;br /&gt;
| 241180041 || 赵瀚清&lt;br /&gt;
|-&lt;br /&gt;
| 241240028 || 冯时&lt;br /&gt;
|-&lt;br /&gt;
| 241840240 || 李味鸿&lt;br /&gt;
|-&lt;br /&gt;
| 241870040 || 陆子一&lt;br /&gt;
|-&lt;br /&gt;
| 251275003 || 翟悦凯&lt;br /&gt;
|-&lt;br /&gt;
| 211830049 || 王杰&lt;br /&gt;
|-&lt;br /&gt;
| 241870032 || 赵益&lt;br /&gt;
|-&lt;br /&gt;
| 241220136 || 祁书轩&lt;br /&gt;
|-&lt;br /&gt;
| 241240017 || 江子林&lt;br /&gt;
|-&lt;br /&gt;
| 241276007 || 胡博&lt;br /&gt;
|-&lt;br /&gt;
| 241880325 || 刘孟阳&lt;br /&gt;
|-&lt;br /&gt;
| 241840067 || 赵思景&lt;br /&gt;
|-&lt;br /&gt;
| 241276008 || 袁颀沣&lt;br /&gt;
|-&lt;br /&gt;
| 241840113 || 曾睿鸣&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/%E7%AC%AC%E4%B8%80%E6%AC%A1%E4%BD%9C%E4%B8%9A%E6%8F%90%E4%BA%A4%E5%90%8D%E5%8D%95&amp;diff=13588</id>
		<title>概率论与数理统计 (Spring 2026)/第一次作业提交名单</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/%E7%AC%AC%E4%B8%80%E6%AC%A1%E4%BD%9C%E4%B8%9A%E6%8F%90%E4%BA%A4%E5%90%8D%E5%8D%95&amp;diff=13588"/>
		<updated>2026-03-31T14:22:07Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;If you find any errors or omissions, please contact the TA by email promptly.&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
! Student ID !! Name&lt;br /&gt;
|-&lt;br /&gt;
| 241240046 || 黄嘉诚 &lt;br /&gt;
|-&lt;br /&gt;
| 241880173 || 陆知渔 &lt;br /&gt;
|-&lt;br /&gt;
| 241240064 || 黄昕宇 &lt;br /&gt;
|-&lt;br /&gt;
| 241240068 || 郑飞阳 &lt;br /&gt;
|-&lt;br /&gt;
| 241240036 || 唐愉兵 &lt;br /&gt;
|-&lt;br /&gt;
| 241850002 || 张子腾 &lt;br /&gt;
|-&lt;br /&gt;
| 241840199 || 陈诣涵 &lt;br /&gt;
|-&lt;br /&gt;
| 241240049 || 罗嘉恒 &lt;br /&gt;
|-&lt;br /&gt;
| 241870077 || 张辰曦&lt;br /&gt;
|-&lt;br /&gt;
| 241220095 || 王天祥&lt;br /&gt;
|-&lt;br /&gt;
| 241098114 || 于静涵&lt;br /&gt;
|-&lt;br /&gt;
| 241820122 || 商世雄&lt;br /&gt;
|-&lt;br /&gt;
| 241870240 || 杨学舟&lt;br /&gt;
|-&lt;br /&gt;
| 241240019 || 王祎泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240070 || 刘梦溪&lt;br /&gt;
|-&lt;br /&gt;
| 241880503 || 郑一鸣&lt;br /&gt;
|-&lt;br /&gt;
| 241870025 || 张科杰&lt;br /&gt;
|-&lt;br /&gt;
| 241840087 || 朱枻&lt;br /&gt;
|-&lt;br /&gt;
| 241880488 || 扶嘉年&lt;br /&gt;
|-&lt;br /&gt;
| 241220003 || 沈琪皓&lt;br /&gt;
|-&lt;br /&gt;
| 241240033 || 付雨彤&lt;br /&gt;
|-&lt;br /&gt;
| 241220004 || 张宸源&lt;br /&gt;
|-&lt;br /&gt;
| 241840052 || 李彦均&lt;br /&gt;
|-&lt;br /&gt;
| 241840173 || 刘明俊&lt;br /&gt;
|-&lt;br /&gt;
| 241870097 || 丁瀚铭&lt;br /&gt;
|-&lt;br /&gt;
| 231870061 || 朱俊杰&lt;br /&gt;
|-&lt;br /&gt;
| 241240048 || 康子凯&lt;br /&gt;
|-&lt;br /&gt;
| 241240061 || 周泽钰&lt;br /&gt;
|-&lt;br /&gt;
| 241840005 || 蔡云汉&lt;br /&gt;
|-&lt;br /&gt;
| 241240004 || 陈仝&lt;br /&gt;
|-&lt;br /&gt;
| 221180155 || 许云鹏&lt;br /&gt;
|-&lt;br /&gt;
| 241870230 || 闵文楷&lt;br /&gt;
|-&lt;br /&gt;
| 241240038 || 胡彦腾&lt;br /&gt;
|-&lt;br /&gt;
| 231250084 || 谢钦煌&lt;br /&gt;
|-&lt;br /&gt;
| 241240001 || 董清扬&lt;br /&gt;
|-&lt;br /&gt;
| 241220002 || 张瑞珉&lt;br /&gt;
|-&lt;br /&gt;
| 241240050 || 李柱锃&lt;br /&gt;
|-&lt;br /&gt;
| 241240041 || 李东泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240069 || 陈姝婷&lt;br /&gt;
|-&lt;br /&gt;
| 241240053 || 张家奇&lt;br /&gt;
|-&lt;br /&gt;
| 241240032 || 崔佳雪&lt;br /&gt;
|-&lt;br /&gt;
| 241840065 || 荣恒嬉&lt;br /&gt;
|-&lt;br /&gt;
| 241240029 || 谢骐泽&lt;br /&gt;
|-&lt;br /&gt;
| 241840078 || 张惠泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240051 || 何明航&lt;br /&gt;
|-&lt;br /&gt;
| 221830067 || 张笑&lt;br /&gt;
|-&lt;br /&gt;
| 241240035 || 周玟序&lt;br /&gt;
|-&lt;br /&gt;
| 241240008 || 张恒畅&lt;br /&gt;
|-&lt;br /&gt;
| 241240066 || 贺子铭&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/%E7%AC%AC%E4%B8%80%E6%AC%A1%E4%BD%9C%E4%B8%9A%E6%8F%90%E4%BA%A4%E5%90%8D%E5%8D%95&amp;diff=13587</id>
		<title>概率论与数理统计 (Spring 2026)/第一次作业提交名单</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/%E7%AC%AC%E4%B8%80%E6%AC%A1%E4%BD%9C%E4%B8%9A%E6%8F%90%E4%BA%A4%E5%90%8D%E5%8D%95&amp;diff=13587"/>
		<updated>2026-03-31T14:06:43Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;If there are any errors or omissions, please contact the TA by email promptly.&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
! Student ID !! Name&lt;br /&gt;
|-&lt;br /&gt;
| 241240046 || 黄嘉诚 &lt;br /&gt;
|-&lt;br /&gt;
| 241880173 || 陆知渔 &lt;br /&gt;
|-&lt;br /&gt;
| 241240064 || 黄昕宇 &lt;br /&gt;
|-&lt;br /&gt;
| 241240068 || 郑飞阳 &lt;br /&gt;
|-&lt;br /&gt;
| 241240036 || 唐愉兵 &lt;br /&gt;
|-&lt;br /&gt;
| 241850002 || 张子腾 &lt;br /&gt;
|-&lt;br /&gt;
| 241840199 || 陈诣涵 &lt;br /&gt;
|-&lt;br /&gt;
| 241240049 || 罗嘉恒 &lt;br /&gt;
|-&lt;br /&gt;
| 241870077 || 张辰曦&lt;br /&gt;
|-&lt;br /&gt;
| 241220095 || 王天祥&lt;br /&gt;
|-&lt;br /&gt;
| 241098114 || 于静涵&lt;br /&gt;
|-&lt;br /&gt;
| 241820122 || 商世雄&lt;br /&gt;
|-&lt;br /&gt;
| 241870240 || 杨学舟&lt;br /&gt;
|-&lt;br /&gt;
| 241240019 || 王祎泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240070 || 刘梦溪&lt;br /&gt;
|-&lt;br /&gt;
| 241880503 || 郑一鸣&lt;br /&gt;
|-&lt;br /&gt;
| 241870025 || 张科杰&lt;br /&gt;
|-&lt;br /&gt;
| 241840087 || 朱枻&lt;br /&gt;
|-&lt;br /&gt;
| 241880488 || 扶嘉年&lt;br /&gt;
|-&lt;br /&gt;
| 241220003 || 沈琪皓&lt;br /&gt;
|-&lt;br /&gt;
| 241240033 || 付雨彤&lt;br /&gt;
|-&lt;br /&gt;
| 241220004 || 张宸源&lt;br /&gt;
|-&lt;br /&gt;
| 241840052 || 李彦均&lt;br /&gt;
|-&lt;br /&gt;
| 241840173 || 刘明俊&lt;br /&gt;
|-&lt;br /&gt;
| 241870097 || 丁瀚铭&lt;br /&gt;
|-&lt;br /&gt;
| 231870061 || 朱俊杰&lt;br /&gt;
|-&lt;br /&gt;
| 241240048 || 康子凯&lt;br /&gt;
|-&lt;br /&gt;
| 241240061 || 周泽钰&lt;br /&gt;
|-&lt;br /&gt;
| 241840005 || 蔡云汉&lt;br /&gt;
|-&lt;br /&gt;
| 241240004 || 陈仝&lt;br /&gt;
|-&lt;br /&gt;
| 221180155 || 许云鹏&lt;br /&gt;
|-&lt;br /&gt;
| 241870230 || 闵文楷&lt;br /&gt;
|-&lt;br /&gt;
| 241240038 || 胡彦腾&lt;br /&gt;
|-&lt;br /&gt;
| 231250084 || 谢钦煌&lt;br /&gt;
|-&lt;br /&gt;
| 241240001 || 董清扬&lt;br /&gt;
|-&lt;br /&gt;
| 241220002 || 张瑞珉&lt;br /&gt;
|-&lt;br /&gt;
| 241240050 || 李柱锃&lt;br /&gt;
|-&lt;br /&gt;
| 241240041 || 李东泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240069 || 陈姝婷&lt;br /&gt;
|-&lt;br /&gt;
| 241240053 || 张家奇&lt;br /&gt;
|-&lt;br /&gt;
| 241240032 || 崔佳雪&lt;br /&gt;
|-&lt;br /&gt;
| 241840065 || 荣恒嬉&lt;br /&gt;
|-&lt;br /&gt;
| 241240029 || 谢骐泽&lt;br /&gt;
|-&lt;br /&gt;
| 241840078 || 张惠泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240051 || 何明航&lt;br /&gt;
|-&lt;br /&gt;
| 221830067 || 张笑&lt;br /&gt;
|-&lt;br /&gt;
| 241240035 || 周玟序&lt;br /&gt;
|-&lt;br /&gt;
| 241240008 || 张恒畅&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/%E7%AC%AC%E4%B8%80%E6%AC%A1%E4%BD%9C%E4%B8%9A%E6%8F%90%E4%BA%A4%E5%90%8D%E5%8D%95&amp;diff=13586</id>
		<title>概率论与数理统计 (Spring 2026)/第一次作业提交名单</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/%E7%AC%AC%E4%B8%80%E6%AC%A1%E4%BD%9C%E4%B8%9A%E6%8F%90%E4%BA%A4%E5%90%8D%E5%8D%95&amp;diff=13586"/>
		<updated>2026-03-31T12:56:57Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;If there are any errors or omissions, please contact the TA by email promptly.&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
! Student ID !! Name&lt;br /&gt;
|-&lt;br /&gt;
| 241240046 || 黄嘉诚 &lt;br /&gt;
|-&lt;br /&gt;
| 241880173 || 陆知渔 &lt;br /&gt;
|-&lt;br /&gt;
| 241240064 || 黄昕宇 &lt;br /&gt;
|-&lt;br /&gt;
| 241240068 || 郑飞阳 &lt;br /&gt;
|-&lt;br /&gt;
| 241240036 || 唐愉兵 &lt;br /&gt;
|-&lt;br /&gt;
| 241850002 || 张子腾 &lt;br /&gt;
|-&lt;br /&gt;
| 241840199 || 陈诣涵 &lt;br /&gt;
|-&lt;br /&gt;
| 241240049 || 罗嘉恒 &lt;br /&gt;
|-&lt;br /&gt;
| 241870077 || 张辰曦&lt;br /&gt;
|-&lt;br /&gt;
| 241220095 || 王天祥&lt;br /&gt;
|-&lt;br /&gt;
| 241098114 || 于静涵&lt;br /&gt;
|-&lt;br /&gt;
| 241820122 || 商世雄&lt;br /&gt;
|-&lt;br /&gt;
| 241870240 || 杨学舟&lt;br /&gt;
|-&lt;br /&gt;
| 241240019 || 王祎泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240070 || 刘梦溪&lt;br /&gt;
|-&lt;br /&gt;
| 241880503 || 郑一鸣&lt;br /&gt;
|-&lt;br /&gt;
| 241870025 || 张科杰&lt;br /&gt;
|-&lt;br /&gt;
| 241840087 || 朱枻&lt;br /&gt;
|-&lt;br /&gt;
| 241880488 || 扶嘉年&lt;br /&gt;
|-&lt;br /&gt;
| 241220003 || 沈琪皓&lt;br /&gt;
|-&lt;br /&gt;
| 241240033 || 付雨彤&lt;br /&gt;
|-&lt;br /&gt;
| 241220004 || 张宸源&lt;br /&gt;
|-&lt;br /&gt;
| 241840052 || 李彦均&lt;br /&gt;
|-&lt;br /&gt;
| 241840173 || 刘明俊&lt;br /&gt;
|-&lt;br /&gt;
| 241870097 || 丁瀚铭&lt;br /&gt;
|-&lt;br /&gt;
| 231870061 || 朱俊杰&lt;br /&gt;
|-&lt;br /&gt;
| 241240048 || 康子凯&lt;br /&gt;
|-&lt;br /&gt;
| 241240061 || 周泽钰&lt;br /&gt;
|-&lt;br /&gt;
| 241840005 || 蔡云汉&lt;br /&gt;
|-&lt;br /&gt;
| 241240004 || 陈仝&lt;br /&gt;
|-&lt;br /&gt;
| 221180155 || 许云鹏&lt;br /&gt;
|-&lt;br /&gt;
| 241870230 || 闵文楷&lt;br /&gt;
|-&lt;br /&gt;
| 241240038 || 胡彦腾&lt;br /&gt;
|-&lt;br /&gt;
| 231250084 || 谢钦煌&lt;br /&gt;
|-&lt;br /&gt;
| 241240001 || 董清扬&lt;br /&gt;
|-&lt;br /&gt;
| 241220002 || 张瑞珉&lt;br /&gt;
|-&lt;br /&gt;
| 241240050 || 李柱锃&lt;br /&gt;
|-&lt;br /&gt;
| 241240041 || 李东泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240069 || 陈姝婷&lt;br /&gt;
|-&lt;br /&gt;
| 241240053 || 张家奇&lt;br /&gt;
|-&lt;br /&gt;
| 241240032 || 崔佳雪&lt;br /&gt;
|-&lt;br /&gt;
| 241840065 || 荣恒嬉&lt;br /&gt;
|-&lt;br /&gt;
| 241240029 || 谢骐泽&lt;br /&gt;
|-&lt;br /&gt;
| 241840078 || 张惠泽&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/%E7%AC%AC%E4%B8%80%E6%AC%A1%E4%BD%9C%E4%B8%9A%E6%8F%90%E4%BA%A4%E5%90%8D%E5%8D%95&amp;diff=13585</id>
		<title>概率论与数理统计 (Spring 2026)/第一次作业提交名单</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/%E7%AC%AC%E4%B8%80%E6%AC%A1%E4%BD%9C%E4%B8%9A%E6%8F%90%E4%BA%A4%E5%90%8D%E5%8D%95&amp;diff=13585"/>
		<updated>2026-03-31T11:54:29Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;If there are any errors or omissions, please contact the TA by email promptly.&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
! Student ID !! Name&lt;br /&gt;
|-&lt;br /&gt;
| 241240046 || 黄嘉诚 &lt;br /&gt;
|-&lt;br /&gt;
| 241880173 || 陆知渔 &lt;br /&gt;
|-&lt;br /&gt;
| 241240064 || 黄昕宇 &lt;br /&gt;
|-&lt;br /&gt;
| 241240068 || 郑飞阳 &lt;br /&gt;
|-&lt;br /&gt;
| 241240036 || 唐愉兵 &lt;br /&gt;
|-&lt;br /&gt;
| 241850002 || 张子腾 &lt;br /&gt;
|-&lt;br /&gt;
| 241840199 || 陈诣涵 &lt;br /&gt;
|-&lt;br /&gt;
| 241240049 || 罗嘉恒 &lt;br /&gt;
|-&lt;br /&gt;
| 241870077 || 张辰曦&lt;br /&gt;
|-&lt;br /&gt;
| 241220095 || 王天祥&lt;br /&gt;
|-&lt;br /&gt;
| 241098114 || 于静涵&lt;br /&gt;
|-&lt;br /&gt;
| 241820122 || 商世雄&lt;br /&gt;
|-&lt;br /&gt;
| 241870240 || 杨学舟&lt;br /&gt;
|-&lt;br /&gt;
| 241240019 || 王祎泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240070 || 刘梦溪&lt;br /&gt;
|-&lt;br /&gt;
| 241880503 || 郑一鸣&lt;br /&gt;
|-&lt;br /&gt;
| 241870025 || 张科杰&lt;br /&gt;
|-&lt;br /&gt;
| 241840087 || 朱枻&lt;br /&gt;
|-&lt;br /&gt;
| 241880488 || 扶嘉年&lt;br /&gt;
|-&lt;br /&gt;
| 241220003 || 沈琪皓&lt;br /&gt;
|-&lt;br /&gt;
| 241240033 || 付雨彤&lt;br /&gt;
|-&lt;br /&gt;
| 241220004 || 张宸源&lt;br /&gt;
|-&lt;br /&gt;
| 241840052 || 李彦均&lt;br /&gt;
|-&lt;br /&gt;
| 241840173 || 刘明俊&lt;br /&gt;
|-&lt;br /&gt;
| 241870097 || 丁瀚铭&lt;br /&gt;
|-&lt;br /&gt;
| 231870061 || 朱俊杰&lt;br /&gt;
|-&lt;br /&gt;
| 241240048 || 康子凯&lt;br /&gt;
|-&lt;br /&gt;
| 241240061 || 周泽钰&lt;br /&gt;
|-&lt;br /&gt;
| 241840005 || 蔡云汉&lt;br /&gt;
|-&lt;br /&gt;
| 241240004 || 陈仝&lt;br /&gt;
|-&lt;br /&gt;
| 221180155 || 许云鹏&lt;br /&gt;
|-&lt;br /&gt;
| 241870230 || 闵文楷&lt;br /&gt;
|-&lt;br /&gt;
| 241240038 || 胡彦腾&lt;br /&gt;
|-&lt;br /&gt;
| 231250084 || 谢钦煌&lt;br /&gt;
|-&lt;br /&gt;
| 241240001 || 董清扬&lt;br /&gt;
|-&lt;br /&gt;
| 241220002 || 张瑞珉&lt;br /&gt;
|-&lt;br /&gt;
| 241240050 || 李柱锃&lt;br /&gt;
|-&lt;br /&gt;
| 241240041 || 李东泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240069 || 陈姝婷&lt;br /&gt;
|-&lt;br /&gt;
| 241240053 || 张家奇&lt;br /&gt;
|-&lt;br /&gt;
| 241240032 || 崔佳雪&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/%E7%AC%AC%E4%B8%80%E6%AC%A1%E4%BD%9C%E4%B8%9A%E6%8F%90%E4%BA%A4%E5%90%8D%E5%8D%95&amp;diff=13582</id>
		<title>概率论与数理统计 (Spring 2026)/第一次作业提交名单</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%A6%82%E7%8E%87%E8%AE%BA%E4%B8%8E%E6%95%B0%E7%90%86%E7%BB%9F%E8%AE%A1_(Spring_2026)/%E7%AC%AC%E4%B8%80%E6%AC%A1%E4%BD%9C%E4%B8%9A%E6%8F%90%E4%BA%A4%E5%90%8D%E5%8D%95&amp;diff=13582"/>
		<updated>2026-03-31T05:42:56Z</updated>

		<summary type="html">&lt;p&gt;Yqzhu: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;If there are any errors or omissions, please contact the TA by email promptly.&lt;br /&gt;
&amp;lt;center&amp;gt;&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|-&lt;br /&gt;
! Student ID !! Name&lt;br /&gt;
|-&lt;br /&gt;
| 241240046 || 黄嘉诚 &lt;br /&gt;
|-&lt;br /&gt;
| 241880173 || 陆知渔 &lt;br /&gt;
|-&lt;br /&gt;
| 241240064 || 黄昕宇 &lt;br /&gt;
|-&lt;br /&gt;
| 241240068 || 郑飞阳 &lt;br /&gt;
|-&lt;br /&gt;
| 241240036 || 唐愉兵 &lt;br /&gt;
|-&lt;br /&gt;
| 241850002 || 张子腾 &lt;br /&gt;
|-&lt;br /&gt;
| 241840199 || 陈诣涵 &lt;br /&gt;
|-&lt;br /&gt;
| 241240049 || 罗嘉恒 &lt;br /&gt;
|-&lt;br /&gt;
| 241870077 || 张辰曦&lt;br /&gt;
|-&lt;br /&gt;
| 241220095 || 王天祥&lt;br /&gt;
|-&lt;br /&gt;
| 241098114 || 于静涵&lt;br /&gt;
|-&lt;br /&gt;
| 241820122 || 商世雄&lt;br /&gt;
|-&lt;br /&gt;
| 241870240 || 杨学舟&lt;br /&gt;
|-&lt;br /&gt;
| 241240019 || 王祎泽&lt;br /&gt;
|-&lt;br /&gt;
| 241240070 || 刘梦溪&lt;br /&gt;
|-&lt;br /&gt;
| 241880503 || 郑一鸣&lt;br /&gt;
|-&lt;br /&gt;
| 241870025 || 张科杰&lt;br /&gt;
|-&lt;br /&gt;
| 241840087 || 朱枻&lt;br /&gt;
|-&lt;br /&gt;
| 241880488 || 扶嘉年&lt;br /&gt;
|-&lt;br /&gt;
| 241220003 || 沈琪皓&lt;br /&gt;
|-&lt;br /&gt;
| 241240033 || 付雨彤&lt;br /&gt;
|-&lt;br /&gt;
| 241220004 || 张宸源&lt;br /&gt;
|-&lt;br /&gt;
| 241840052 || 李彦均&lt;br /&gt;
|-&lt;br /&gt;
| 241840173 || 刘明俊&lt;br /&gt;
|-&lt;br /&gt;
| 241870097 || 丁瀚铭&lt;br /&gt;
|-&lt;br /&gt;
| 231870061 || 朱俊杰&lt;br /&gt;
|-&lt;br /&gt;
| 241240048 || 康子凯&lt;br /&gt;
|-&lt;br /&gt;
| 241240061 || 周泽钰&lt;br /&gt;
|-&lt;br /&gt;
| 241840005 || 蔡云汉&lt;br /&gt;
|-&lt;br /&gt;
| 241240004 || 陈仝&lt;br /&gt;
|-&lt;br /&gt;
| 221180155 || 许云鹏&lt;br /&gt;
|-&lt;br /&gt;
| 241870230 || 闵文楷&lt;br /&gt;
|-&lt;br /&gt;
| 241240038 || 胡彦腾&lt;br /&gt;
|-&lt;br /&gt;
| 231250084 || 谢钦煌&lt;br /&gt;
|-&lt;br /&gt;
| 241240001 || 董清扬&lt;br /&gt;
|-&lt;br /&gt;
| 241220002 || 张瑞珉&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&amp;lt;/center&amp;gt;&lt;/div&gt;</summary>
		<author><name>Yqzhu</name></author>
	</entry>
</feed>