<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://tcs.nju.edu.cn/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Liumingmou</id>
	<title>TCS Wiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://tcs.nju.edu.cn/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Liumingmou"/>
	<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=Special:Contributions/Liumingmou"/>
	<updated>2026-05-02T11:49:37Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.45.3</generator>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13704</id>
		<title>高级算法 (Spring 2026)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13704"/>
		<updated>2026-04-26T16:01:42Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* 课后作业 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;高级算法 &lt;br /&gt;
&amp;lt;br&amp;gt;Advanced Algorithms&amp;lt;/font&amp;gt;&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =任课教师&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data2   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header3 = &lt;br /&gt;
|label3  = 电子邮件&lt;br /&gt;
|data3   = lmm@nju.edu.cn &lt;br /&gt;
|header4 =&lt;br /&gt;
|label4= 办公室&lt;br /&gt;
|data4= 南雍-西229&lt;br /&gt;
|header5 = &lt;br /&gt;
|label5  = &lt;br /&gt;
|header11 = 课程时间地点&lt;br /&gt;
|label11  = &lt;br /&gt;
|data11   = &lt;br /&gt;
|header12 =&lt;br /&gt;
|label12  = 教室&lt;br /&gt;
|data12   = 周一，9am-12pm&amp;lt;br&amp;gt;苏教B207&lt;br /&gt;
|header13 =&lt;br /&gt;
|label13  = Place&lt;br /&gt;
|data13   = &lt;br /&gt;
|header14 =&lt;br /&gt;
|label14  = 答疑时间&lt;br /&gt;
|data14   = 周五，2pm-5pm&amp;lt;br&amp;gt;南雍-西229&lt;br /&gt;
|header15 = 教材&lt;br /&gt;
|label15  = &lt;br /&gt;
|data15   = &lt;br /&gt;
|header16 =&lt;br /&gt;
|label16  = &lt;br /&gt;
|data16   = [[File:MR-randomized-algorithms.png|border|100px]]&lt;br /&gt;
|header17 =&lt;br /&gt;
|label17  = &lt;br /&gt;
|data17   = Motwani and Raghavan. &amp;lt;br&amp;gt;&#039;&#039;Randomized Algorithms&#039;&#039;.&amp;lt;br&amp;gt; Cambridge Univ Press, 1995.&lt;br /&gt;
|header18 =&lt;br /&gt;
|label18  = &lt;br /&gt;
|data18   = [[File:Approximation_Algorithms.jpg|border|100px]]&lt;br /&gt;
|header19 =&lt;br /&gt;
|label19  = &lt;br /&gt;
|data19   =  Vazirani. &amp;lt;br&amp;gt;&#039;&#039;Approximation Algorithms&#039;&#039;. &amp;lt;br&amp;gt; Springer-Verlag, 2001.&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Advanced Algorithms&#039;&#039; class of Spring 2026. Students who take this class should check this page periodically for content updates and new announcements.&lt;br /&gt;
&lt;br /&gt;
= 通知 =&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;(2026/3/2)&#039;&#039;&#039; 第一堂课&lt;br /&gt;
* &#039;&#039;&#039;(2026/3/23)&#039;&#039;&#039; 请大家&amp;lt;font color=red &amp;gt;&#039;&#039;&#039;严格遵守[https://jw.nju.edu.cn/_upload/article/files/ab/85/1f49e9e9434dbf23018539b4c338/a550b349-9828-4a05-a5ec-d088b442bee1.pdf 《关于本科生规范使用生成式人工智能工具的指导意见（试行）》]&#039;&#039;&#039;&amp;lt;/font&amp;gt;！&lt;br /&gt;
* &#039;&#039;&#039;(2026/4/6)&#039;&#039;&#039; 是清明节假期，停课一次。&lt;br /&gt;
&lt;br /&gt;
= 课程信息 =&lt;br /&gt;
* &#039;&#039;&#039;任课教师&#039;&#039;&#039;: &lt;br /&gt;
:* [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn 📧]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;助教&#039;&#039;&#039;: &lt;br /&gt;
** 王思齐：[mailto:siqi_wang@smail.nju.edu.cn 📧]&lt;br /&gt;
** 齐世毅：[mailto:1083951258@qq.com 📧]&lt;br /&gt;
* &#039;&#039;&#039;课程时间地点&#039;&#039;&#039;: &lt;br /&gt;
** 周一，9am-12pm，苏教B207&lt;br /&gt;
* &#039;&#039;&#039;答疑时间&#039;&#039;&#039;: 周五，2pm-5pm，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1083465754&lt;br /&gt;
&lt;br /&gt;
= 教学大纲 =&lt;br /&gt;
随着计算机算法理论的不断发展，现代计算机算法的设计与分析大量地使用非初等的数学工具以及非传统的算法思想。“高级算法”这门课程就是面向计算机算法的这一发展趋势而设立的。课程将针对传统算法课程未系统涉及、却在计算机科学各领域的科研和实践中扮演重要角色的高等算法设计思想和算法分析工具进行系统讲授。&lt;br /&gt;
&lt;br /&gt;
课程内容分为五大部分：&lt;br /&gt;
* 基于哈希的大数据算法&lt;br /&gt;
* 哈希表与面向大数据的现代计算场景&lt;br /&gt;
* 测度的集中与处理高维数据&lt;br /&gt;
* 最大流与线性规划&lt;br /&gt;
* 其他重要话题&lt;br /&gt;
&lt;br /&gt;
=== 先修课程 ===&lt;br /&gt;
* 必须：离散数学，概率论，线性代数。&lt;br /&gt;
* 推荐：算法设计与分析。&lt;br /&gt;
&lt;br /&gt;
=== 课程教材 ===&lt;br /&gt;
本门课较为前沿，大部分课程内容还没有进入任何教材。以下教材和参考书仅供参考。&lt;br /&gt;
* [[高级算法 (Fall 2024) / Course materials|&amp;lt;font size=3&amp;gt;教材和参考书&amp;lt;/font&amp;gt;]]&lt;br /&gt;
更多的内容也可以参考[[#Related_Online_Courses |其他学者的同类课程]]。&lt;br /&gt;
&lt;br /&gt;
=== 成绩 ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考核。最终成绩将由平时作业成绩和期末考核成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力地维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= 课后作业 =&lt;br /&gt;
Late policy: In general, we will accommodate late submission requests ONLY IF you made such requests ahead of time.&lt;br /&gt;
&lt;br /&gt;
请大家&amp;lt;font color=red &amp;gt;&#039;&#039;&#039;严格遵守[https://jw.nju.edu.cn/_upload/article/files/ab/85/1f49e9e9434dbf23018539b4c338/a550b349-9828-4a05-a5ec-d088b442bee1.pdf 《关于本科生规范使用生成式人工智能工具的指导意见（试行）》]&#039;&#039;&#039;&amp;lt;/font&amp;gt;！&lt;br /&gt;
&lt;br /&gt;
（如果你提交过作业之后想重新提交，可以通过更改“版本”字段换一个文件名提交）&lt;br /&gt;
*[[高级算法_(Spring_2026)/作业1|作业1]] 请在 2026/04/06 上课前（9am UTC+8）上传到 [https://box.nju.edu.cn/u/d/628bff0686e14f8cbf3f/ 南大云盘]（文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_版本.pdf&amp;lt;/font&amp;gt;&#039;）。&lt;br /&gt;
*[[高级算法_(Spring_2026)/作业2|作业2]] 请在 2026/05/11 上课前（9am UTC+8）上传到 [https://box.nju.edu.cn/u/d/475a895457d14d31ab24/ 南大云盘]（文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_版本.pdf&amp;lt;/font&amp;gt;&#039;）。&lt;br /&gt;
&lt;br /&gt;
= 课件及相关阅读资料 =&lt;br /&gt;
# [https://box.nju.edu.cn/f/980d814e4ad64285a640/ Fingerprinting]&lt;br /&gt;
#* Polynomial Identity Testing&lt;br /&gt;
#* Communication Complexity (Equality)&lt;br /&gt;
#* Application: Bipartite Perfect Matching, Checking Matrix Multiplication&lt;br /&gt;
#* Karp-Rabin Algorithm (string-searching), Lipton’s Algorithm (checking identity of multisets)&lt;br /&gt;
# [https://box.nju.edu.cn/f/fcf9e2b3051e443fb324/ Sketching]&lt;br /&gt;
#* Morris&#039; Algorithm&lt;br /&gt;
#** mean trick, median trick&lt;br /&gt;
#* Counting Distinct Elements: min sketch, Flajolet-Martin algorithm, bottom-&amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; algorithm, HyperLogLog&lt;br /&gt;
#* Heavy Hitter &amp;amp; Point Query: count-min sketch&lt;br /&gt;
#* 2nd Frequency Moments Estimator: count sketch&lt;br /&gt;
#** &amp;lt;math&amp;gt;\ell_2&amp;lt;/math&amp;gt; point query with count sketch&lt;br /&gt;
#* Approximate Membership: Bloom filter&lt;br /&gt;
# [https://box.nju.edu.cn/f/40f8ae1bc97c409aa485/ Hashing]&lt;br /&gt;
#* Load Balancing: maximum load, power of two choices&lt;br /&gt;
#* Perfect Hashing: birthday paradox, FKS perfect hashing&lt;br /&gt;
#* Modern Hash Table: Cuckoo hashing, succinct dictionaries&lt;br /&gt;
#* Hashing in Practice: Chernoff Bound with limited independence, tabulation hashing&lt;br /&gt;
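One topic listed above, the Bloom filter (under Sketching), fits in a few lines of code. The following Python sketch is purely illustrative and not part of the course materials; the seeded MD5 hashing is a hypothetical stand-in for the hash families discussed in class.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k seeded hash functions over a bit array of size m."""

    def __init__(self, m=1024, k=3):
        self.m, self.k = m, k
        self.bits = [False] * m

    def _indices(self, item):
        # Derive k indices from seeded MD5 digests (an illustrative stand-in
        # for the universal hash families used in the analysis).
        for seed in range(self.k):
            digest = hashlib.md5(f"{seed}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.m

    def add(self, item):
        for i in self._indices(item):
            self.bits[i] = True

    def might_contain(self, item):
        # Never reports an inserted item as absent; may report an
        # absent item as present (a false positive).
        return all(self.bits[i] for i in self._indices(item))
```

An inserted key is always reported present; after n insertions an absent key is misreported with probability roughly (1 - e^(-kn/m))^k.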
&lt;br /&gt;
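The balls-into-bins phenomena behind "Load Balancing: maximum load, power of two choices" above are easy to probe empirically. This simulation of the standard uniform balls-into-bins process is an illustrative sketch, not course code:

```python
import random

def max_load(n_balls, n_bins, d, seed=0):
    """Throw n_balls into n_bins; each ball samples d bins uniformly at
    random and joins the currently least-loaded of them. Returns the
    maximum load over all bins."""
    rng = random.Random(seed)
    loads = [0] * n_bins
    for _ in range(n_balls):
        choices = [rng.randrange(n_bins) for _ in range(d)]
        best = min(choices, key=loads.__getitem__)  # least-loaded sampled bin
        loads[best] += 1
    return max(loads)
```

For n balls into n bins, one choice (d = 1) gives a maximum load of order log n / log log n with high probability, while d = 2 already drops it to about log log n, the "power of two choices"; larger d improves only the constant factor, which is what the power of d choices question in 作业2 asks you to quantify.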
= Related Online Courses=&lt;br /&gt;
* [https://www.cs.columbia.edu/~andoni/advancedS24/index.html Advanced Algorithms] by Alexandr Andoni at Columbia University.&lt;br /&gt;
* [https://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15854-f21/www/ Advanced Approximation Algorithms] by Anupam Gupta at CMU.&lt;br /&gt;
* [http://people.csail.mit.edu/moitra/854.html Advanced Algorithms] by Ankur Moitra at MIT.&lt;br /&gt;
* [https://6.5210.csail.mit.edu/ Advanced Algorithms] by David Karger at MIT.&lt;br /&gt;
* [https://www.cs.cmu.edu/~dwoodruf/teaching/15851-spring24/ Algorithms for Big Data] by David Woodruff at CMU.&lt;br /&gt;
* [https://www.sketchingbigdata.org/fall20/lec/ Sketching Algorithms] by Jelani Nelson at UC Berkeley.&lt;br /&gt;
* [http://web.stanford.edu/class/cs168/index.html The Modern Algorithmic Toolbox] by Tim Roughgarden and Gregory Valiant at Stanford.&lt;br /&gt;
* [https://www.cs.princeton.edu/courses/archive/fall18/cos521/ Advanced Algorithm Design] by Pravesh Kothari and Christopher Musco at Princeton.&lt;br /&gt;
* [http://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15859-f11/www/ Linear and Semidefinite Programming (Advanced Algorithms)] by Anupam Gupta and Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://www.cs.cmu.edu/~odonnell/papers/cs-theory-toolkit-lecture-notes.pdf CS Theory Toolkit] by Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://cs.uwaterloo.ca/~lapchi/cs860/index.html Eigenvalues and Polynomials] by Lap Chi Lau at University of Waterloo.&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13703</id>
		<title>高级算法 (Spring 2026)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13703"/>
		<updated>2026-04-26T15:03:28Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* 课后作业 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;高级算法 &lt;br /&gt;
&amp;lt;br&amp;gt;Advanced Algorithms&amp;lt;/font&amp;gt;&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =任课教师&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data2   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header3 = &lt;br /&gt;
|label3  = 电子邮件&lt;br /&gt;
|data3   = lmm@nju.edu.cn &lt;br /&gt;
|header4 =&lt;br /&gt;
|label4= 办公室&lt;br /&gt;
|data4= 南雍-西229&lt;br /&gt;
|header5 = &lt;br /&gt;
|label5  = &lt;br /&gt;
|header11 = 课程时间地点&lt;br /&gt;
|label11  = &lt;br /&gt;
|data11   = &lt;br /&gt;
|header12 =&lt;br /&gt;
|label12  = 教室&lt;br /&gt;
|data12   = 周一，9am-12pm&amp;lt;br&amp;gt;苏教B207&lt;br /&gt;
|header13 =&lt;br /&gt;
|label13  = Place&lt;br /&gt;
|data13   = &lt;br /&gt;
|header14 =&lt;br /&gt;
|label14  = 答疑时间&lt;br /&gt;
|data14   = 周五，2pm-5pm&amp;lt;br&amp;gt;南雍-西229&lt;br /&gt;
|header15 = 教材&lt;br /&gt;
|label15  = &lt;br /&gt;
|data15   = &lt;br /&gt;
|header16 =&lt;br /&gt;
|label16  = &lt;br /&gt;
|data16   = [[File:MR-randomized-algorithms.png|border|100px]]&lt;br /&gt;
|header17 =&lt;br /&gt;
|label17  = &lt;br /&gt;
|data17   = Motwani and Raghavan. &amp;lt;br&amp;gt;&#039;&#039;Randomized Algorithms&#039;&#039;.&amp;lt;br&amp;gt; Cambridge Univ Press, 1995.&lt;br /&gt;
|header18 =&lt;br /&gt;
|label18  = &lt;br /&gt;
|data18   = [[File:Approximation_Algorithms.jpg|border|100px]]&lt;br /&gt;
|header19 =&lt;br /&gt;
|label19  = &lt;br /&gt;
|data19   =  Vazirani. &amp;lt;br&amp;gt;&#039;&#039;Approximation Algorithms&#039;&#039;. &amp;lt;br&amp;gt; Springer-Verlag, 2001.&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Advanced Algorithms&#039;&#039; class of Spring 2026. Students who take this class should check this page periodically for content updates and new announcements.&lt;br /&gt;
&lt;br /&gt;
= 通知 =&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;(2026/3/2)&#039;&#039;&#039; 第一堂课&lt;br /&gt;
* &#039;&#039;&#039;(2026/3/23)&#039;&#039;&#039; 请大家&amp;lt;font color=red &amp;gt;&#039;&#039;&#039;严格遵守[https://jw.nju.edu.cn/_upload/article/files/ab/85/1f49e9e9434dbf23018539b4c338/a550b349-9828-4a05-a5ec-d088b442bee1.pdf 《关于本科生规范使用生成式人工智能工具的指导意见（试行）》]&#039;&#039;&#039;&amp;lt;/font&amp;gt;！&lt;br /&gt;
* &#039;&#039;&#039;(2026/4/6)&#039;&#039;&#039; 是清明节假期，停课一次。&lt;br /&gt;
&lt;br /&gt;
= 课程信息 =&lt;br /&gt;
* &#039;&#039;&#039;任课教师&#039;&#039;&#039;: &lt;br /&gt;
:* [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn 📧]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;助教&#039;&#039;&#039;: &lt;br /&gt;
** 王思齐：[mailto:siqi_wang@smail.nju.edu.cn 📧]&lt;br /&gt;
** 齐世毅：[mailto:1083951258@qq.com 📧]&lt;br /&gt;
* &#039;&#039;&#039;课程时间地点&#039;&#039;&#039;: &lt;br /&gt;
** 周一，9am-12pm，苏教B207&lt;br /&gt;
* &#039;&#039;&#039;答疑时间&#039;&#039;&#039;: 周五，2pm-5pm，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1083465754&lt;br /&gt;
&lt;br /&gt;
= 教学大纲 =&lt;br /&gt;
随着计算机算法理论的不断发展，现代计算机算法的设计与分析大量地使用非初等的数学工具以及非传统的算法思想。“高级算法”这门课程就是面向计算机算法的这一发展趋势而设立的。课程将针对传统算法课程未系统涉及、却在计算机科学各领域的科研和实践中扮演重要角色的高等算法设计思想和算法分析工具进行系统讲授。&lt;br /&gt;
&lt;br /&gt;
课程内容分为五大部分：&lt;br /&gt;
* 基于哈希的大数据算法&lt;br /&gt;
* 哈希表与面向大数据的现代计算场景&lt;br /&gt;
* 测度的集中与处理高维数据&lt;br /&gt;
* 最大流与线性规划&lt;br /&gt;
* 其他重要话题&lt;br /&gt;
&lt;br /&gt;
=== 先修课程 ===&lt;br /&gt;
* 必须：离散数学，概率论，线性代数。&lt;br /&gt;
* 推荐：算法设计与分析。&lt;br /&gt;
&lt;br /&gt;
=== 课程教材 ===&lt;br /&gt;
本门课较为前沿，大部分课程内容还没有进入任何教材。以下教材和参考书仅供参考。&lt;br /&gt;
* [[高级算法 (Fall 2024) / Course materials|&amp;lt;font size=3&amp;gt;教材和参考书&amp;lt;/font&amp;gt;]]&lt;br /&gt;
更多的内容也可以参考[[#Related_Online_Courses |其他学者的同类课程]]。&lt;br /&gt;
&lt;br /&gt;
=== 成绩 ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考核。最终成绩将由平时作业成绩和期末考核成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力地维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= 课后作业 =&lt;br /&gt;
Late policy: In general, we will accommodate late submission requests ONLY IF you made such requests ahead of time.&lt;br /&gt;
&lt;br /&gt;
请大家&amp;lt;font color=red &amp;gt;&#039;&#039;&#039;严格遵守[https://jw.nju.edu.cn/_upload/article/files/ab/85/1f49e9e9434dbf23018539b4c338/a550b349-9828-4a05-a5ec-d088b442bee1.pdf 《关于本科生规范使用生成式人工智能工具的指导意见（试行）》]&#039;&#039;&#039;&amp;lt;/font&amp;gt;！&lt;br /&gt;
&lt;br /&gt;
（如果你提交过作业之后想重新提交，可以通过更改“版本号”字段换一个文件名提交）&lt;br /&gt;
*[[高级算法_(Spring_2026)/作业1|作业1]] 请在 2026/04/06 上课前（9am UTC+8）上传到 [https://box.nju.edu.cn/u/d/628bff0686e14f8cbf3f/ 南大云盘]（文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_版本.pdf&amp;lt;/font&amp;gt;&#039;）。&lt;br /&gt;
*[[高级算法_(Spring_2026)/作业2|作业2]] 请在 2026/05/11 上课前（9am UTC+8）上传到 [https://box.nju.edu.cn/u/d/475a895457d14d31ab24/ 南大云盘]（文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_版本.pdf&amp;lt;/font&amp;gt;&#039;）。&lt;br /&gt;
&lt;br /&gt;
= 课件及相关阅读资料 =&lt;br /&gt;
# [https://box.nju.edu.cn/f/980d814e4ad64285a640/ Fingerprinting]&lt;br /&gt;
#* Polynomial Identity Testing&lt;br /&gt;
#* Communication Complexity (Equality)&lt;br /&gt;
#* Application: Bipartite Perfect Matching, Checking Matrix Multiplication&lt;br /&gt;
#* Karp-Rabin Algorithm (string-searching), Lipton’s Algorithm (checking identity of multisets)&lt;br /&gt;
# [https://box.nju.edu.cn/f/fcf9e2b3051e443fb324/ Sketching]&lt;br /&gt;
#* Morris&#039; Algorithm&lt;br /&gt;
#** mean trick, median trick&lt;br /&gt;
#* Counting Distinct Elements: min sketch, Flajolet-Martin algorithm, bottom-&amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; algorithm, HyperLogLog&lt;br /&gt;
#* Heavy Hitter &amp;amp; Point Query: count-min sketch&lt;br /&gt;
#* 2nd Frequency Moments Estimator: count sketch&lt;br /&gt;
#** &amp;lt;math&amp;gt;\ell_2&amp;lt;/math&amp;gt; point query with count sketch&lt;br /&gt;
#* Approximate Membership: Bloom filter&lt;br /&gt;
# [https://box.nju.edu.cn/f/40f8ae1bc97c409aa485/ Hashing]&lt;br /&gt;
#* Load Balancing: maximum load, power of two choices&lt;br /&gt;
#* Perfect Hashing: birthday paradox, FKS perfect hashing&lt;br /&gt;
#* Modern Hash Table: Cuckoo hashing, succinct dictionaries&lt;br /&gt;
#* Hashing in Practice: Chernoff Bound with limited independence, tabulation hashing&lt;br /&gt;
&lt;br /&gt;
= Related Online Courses=&lt;br /&gt;
* [https://www.cs.columbia.edu/~andoni/advancedS24/index.html Advanced Algorithms] by Alexandr Andoni at Columbia University.&lt;br /&gt;
* [https://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15854-f21/www/ Advanced Approximation Algorithms] by Anupam Gupta at CMU.&lt;br /&gt;
* [http://people.csail.mit.edu/moitra/854.html Advanced Algorithms] by Ankur Moitra at MIT.&lt;br /&gt;
* [https://6.5210.csail.mit.edu/ Advanced Algorithms] by David Karger at MIT.&lt;br /&gt;
* [https://www.cs.cmu.edu/~dwoodruf/teaching/15851-spring24/ Algorithms for Big Data] by David Woodruff at CMU.&lt;br /&gt;
* [https://www.sketchingbigdata.org/fall20/lec/ Sketching Algorithms] by Jelani Nelson at UC Berkeley.&lt;br /&gt;
* [http://web.stanford.edu/class/cs168/index.html The Modern Algorithmic Toolbox] by Tim Roughgarden and Gregory Valiant at Stanford.&lt;br /&gt;
* [https://www.cs.princeton.edu/courses/archive/fall18/cos521/ Advanced Algorithm Design] by Pravesh Kothari and Christopher Musco at Princeton.&lt;br /&gt;
* [http://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15859-f11/www/ Linear and Semidefinite Programming (Advanced Algorithms)] by Anupam Gupta and Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://www.cs.cmu.edu/~odonnell/papers/cs-theory-toolkit-lecture-notes.pdf CS Theory Toolkit] by Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://cs.uwaterloo.ca/~lapchi/cs860/index.html Eigenvalues and Polynomials] by Lap Chi Lau at University of Waterloo.&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A2&amp;diff=13702</id>
		<title>高级算法 (Spring 2026)/作业2</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A2&amp;diff=13702"/>
		<updated>2026-04-26T15:00:09Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*每道题目的解答都要有完整的解题过程，中英文不限。&lt;br /&gt;
*我们推荐大家使用LaTeX, markdown等对作业进行排版。&lt;br /&gt;
*没有条件的同学可以用纸笔完成作业之后拍照。&lt;br /&gt;
*本门课的所有作业中如果出现了课程中（包括课件中和作业题目中）的算法或者概念或者符号，则&amp;lt;font color=&#039;red&#039;&amp;gt;&#039;&#039;&#039;禁止使用自己发明的说法或者符号&#039;&#039;&#039;&amp;lt;/font&amp;gt;，必须与课程内容保持一致。&lt;br /&gt;
&lt;br /&gt;
# 将 power of two choices 扩展到 power of &amp;lt;math&amp;gt;d&amp;lt;/math&amp;gt; choices。此时 maximum load 是多少？分析和证明你的结论。&lt;br /&gt;
# 计算 FKS perfect hashing 的空间开销的期望和方差，并使用切比雪夫不等式证明其空间开销以高概率不超过 &amp;lt;math&amp;gt;10n&amp;lt;/math&amp;gt;.&lt;br /&gt;
# 对于 FKS perfect hashing，如果调整第一层的桶的数目（不是 &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; 而是另一个供你自由调节的参数 &amp;lt;math&amp;gt;n&#039;&amp;lt;/math&amp;gt;），你可以得到空间复杂度更优秀的哈希表吗？分析并证明你的结论。&lt;br /&gt;
# 当 Cuckoo hashing 的负载率达到 &amp;lt;math&amp;gt;0.500001&amp;lt;/math&amp;gt; 时，插入失败的概率是多少？（Hint. Let &amp;lt;math&amp;gt;G(m, c/m)&amp;lt;/math&amp;gt; be the Erdős–Rényi random graph with &amp;lt;math&amp;gt;c&amp;gt;1&amp;lt;/math&amp;gt;. Then with probability &amp;lt;math&amp;gt;1-o(1)&amp;lt;/math&amp;gt;, there exists a connected component &amp;lt;math&amp;gt;H&amp;lt;/math&amp;gt; satisfying &amp;lt;math&amp;gt;|V(H)| \ge \Omega(m)&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|E(H)| - |V(H)| = \Omega(m)&amp;lt;/math&amp;gt;.）&lt;br /&gt;
# The Method of Four Russians 说到底就是分块并打表。使用 The Method of Four Russians，构造一个大小为 &amp;lt;math&amp;gt;O(n^{1/c}\log n)&amp;lt;/math&amp;gt; bits 的 pre-computed look-up table（其中 &amp;lt;math&amp;gt;c&amp;lt;/math&amp;gt; 是一个可以自由调节的参数）来在课堂上提到的大小为 &amp;lt;math&amp;gt;O(\log n)&amp;lt;/math&amp;gt; bits 的 Trie 上以 &amp;lt;math&amp;gt;O(c)&amp;lt;/math&amp;gt; 的时间快速查找 Trie 中是否有一个串是查询串的前缀，并返回对应的串在 Trie 中的字典序。&lt;br /&gt;
# Feistel cipher / Feistel permutation 是一种双射，按如下方式运行： &amp;lt;math&amp;gt;(x_1,x_2) \mapsto (f(x_1)\oplus x_2,x_2)&amp;lt;/math&amp;gt;。若假设所有的 &amp;lt;math&amp;gt;x_2&amp;lt;/math&amp;gt; 都不相同，且哈希函数 &amp;lt;math&amp;gt;f&amp;lt;/math&amp;gt; 是 &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-universal，则 Feistel cipher 可以把 &amp;lt;math&amp;gt;x_1\in\{0,1\}^t&amp;lt;/math&amp;gt; 以 &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-universal 的方式均匀随机地映射到 &amp;lt;math&amp;gt;\{0,1\}^t&amp;lt;/math&amp;gt; 中。请构造一个一般的 Feistel cipher，若所有的 &amp;lt;math&amp;gt;x_2&amp;lt;/math&amp;gt; 都不相同，则新的 Feistel cipher 将 &amp;lt;math&amp;gt;x_1\in[n]&amp;lt;/math&amp;gt; 以 &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-universal 的方式均匀随机地映射到 &amp;lt;math&amp;gt;[n]&amp;lt;/math&amp;gt; 中，并证明它是 &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-universal。&lt;br /&gt;
# Limited independence meets occupancy:&lt;br /&gt;
#* Occupancy problem 中如果使用的是 &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-universal hash function family 来放置小球，那么在课堂中讲过的两种情况下以至少 &amp;lt;math&amp;gt;1-o(1)&amp;lt;/math&amp;gt; 的概率 maximum load 分别是多少？&lt;br /&gt;
#* 构造一组两两独立的 hash function family，使得 &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls into &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; bins 的情况下 maximum load 很高，并证明你的结论。&lt;br /&gt;
# 课程中介绍的 tabulation hashing 被称为 simple tabulation hashing。 tabulation hashing 还有不少别的扩展和强化。尝试调查并介绍其中一些（你需要使用正规 citation 指出该算法的来源），并尝试解释该 tabulation hashing 克服了其他 tabulation hashing 的什么问题，有什么优点和缺点，尝试解释为什么。你也可以调查一个你喜欢的哈希函数。你的分数取决于你的解释的详细程度和准确程度。&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A2&amp;diff=13701</id>
		<title>高级算法 (Spring 2026)/作业2</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A2&amp;diff=13701"/>
		<updated>2026-04-26T14:55:40Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*每道题目的解答都要有完整的解题过程，中英文不限。&lt;br /&gt;
*我们推荐大家使用LaTeX, markdown等对作业进行排版。&lt;br /&gt;
*没有条件的同学可以用纸笔完成作业之后拍照。&lt;br /&gt;
*本门课的所有作业中如果出现了课程中（包括课件中和作业题目中）的算法或者概念或者符号，则&amp;lt;font color=&#039;red&#039;&amp;gt;&#039;&#039;&#039;禁止使用自己发明的说法或者符号&#039;&#039;&#039;&amp;lt;/font&amp;gt;，必须与课程内容保持一致。&lt;br /&gt;
&lt;br /&gt;
# 将 power of two choices 扩展到 power of &amp;lt;math&amp;gt;d&amp;lt;/math&amp;gt; choices。此时 maximum load 是多少？尝试分析和证明你的结论。&lt;br /&gt;
# 计算 FKS perfect hashing 的空间开销的期望和方差，并使用切比雪夫不等式证明其空间开销以高概率不超过 &amp;lt;math&amp;gt;10n&amp;lt;/math&amp;gt;.&lt;br /&gt;
# 对于 FKS perfect hashing，如果调整第一层的桶的数目（不是 &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; 而是另一个供你自由调节的参数 &amp;lt;math&amp;gt;n&#039;&amp;lt;/math&amp;gt;），你可以得到空间复杂度更优秀的哈希表吗？&lt;br /&gt;
# 当 Cuckoo hashing 的负载率超过 1/2 时，插入失败的概率是多少？（Hint. Let &amp;lt;math&amp;gt;G(m, c/m)&amp;lt;/math&amp;gt; be the Erdős–Rényi random graph with &amp;lt;math&amp;gt;c&amp;gt;1&amp;lt;/math&amp;gt;. Then with probability &amp;lt;math&amp;gt;1-o(1)&amp;lt;/math&amp;gt;, there exists a connected component &amp;lt;math&amp;gt;H&amp;lt;/math&amp;gt; satisfying &amp;lt;math&amp;gt;|V(H)| \ge \Omega(m)&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|E(H)| - |V(H)| = \Omega(m)&amp;lt;/math&amp;gt;.）&lt;br /&gt;
# The Method of Four Russians 说到底就是分块并打表。使用 The Method of Four Russians，构造一个大小为 &amp;lt;math&amp;gt;O(n^{1/c}\log n)&amp;lt;/math&amp;gt; bits 的 pre-computed look-up table（其中 &amp;lt;math&amp;gt;c&amp;lt;/math&amp;gt; 是一个可以自由调节的参数）来在课堂上提到的大小为 &amp;lt;math&amp;gt;O(\log n)&amp;lt;/math&amp;gt; bits 的 Trie 上以 &amp;lt;math&amp;gt;O(c)&amp;lt;/math&amp;gt; 的时间快速查找 Trie 中是否有一个串是查询串的前缀，并返回对应的串在 Trie 中的字典序。&lt;br /&gt;
# Feistel cipher / Feistel permutation 是一种双射，按如下方式运行： &amp;lt;math&amp;gt;(x_1,x_2) \mapsto (f(x_1)\oplus x_2,x_2)&amp;lt;/math&amp;gt;。若假设所有的 &amp;lt;math&amp;gt;x_2&amp;lt;/math&amp;gt; 都不相同，且哈希函数 &amp;lt;math&amp;gt;f&amp;lt;/math&amp;gt; 是 &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-universal，则 Feistel cipher 可以把 &amp;lt;math&amp;gt;x_1\in\{0,1\}^t&amp;lt;/math&amp;gt; 以 &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-universal 的方式均匀随机地映射到 &amp;lt;math&amp;gt;\{0,1\}^t&amp;lt;/math&amp;gt; 中。请构造一个一般的 Feistel cipher，若所有的 &amp;lt;math&amp;gt;x_2&amp;lt;/math&amp;gt; 都不相同，则新的 Feistel cipher 将 &amp;lt;math&amp;gt;x_1\in[n]&amp;lt;/math&amp;gt; 以 &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-universal 的方式均匀随机地映射到 &amp;lt;math&amp;gt;[n]&amp;lt;/math&amp;gt; 中，并证明它是 &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-universal。&lt;br /&gt;
# Chernoff bound with limited independence:&lt;br /&gt;
## occupancy problem 中如果使用的是 &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-universal hash function family 来放置小球，那么在课堂中讲过的两种情况下 w.h.p. maximum load 分别是多少？&lt;br /&gt;
## 构造一组两两独立的 hash function family，使得 &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; balls into &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; bins 的情况下 maximum load 很高，并证明你的结论。&lt;br /&gt;
# 课程中介绍的 tabulation hashing 被称为 simple tabulation hashing。 tabulation hashing 还有不少别的扩展和强化。尝试调查并介绍其中一些（你需要使用正规 citation 指出该算法的来源），并尝试解释该 tabulation hashing 克服了其他 tabulation hashing 的什么问题，有什么优点和缺点，尝试解释为什么。你也可以调查一个你喜欢的哈希函数。你的分数取决于你的解释的详细程度和准确程度。&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A2&amp;diff=13693</id>
		<title>高级算法 (Spring 2026)/作业2</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A2&amp;diff=13693"/>
		<updated>2026-04-21T12:18:04Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*每道题目的解答都要有完整的解题过程，中英文不限。&lt;br /&gt;
*我们推荐大家使用LaTeX, markdown等对作业进行排版。&lt;br /&gt;
*没有条件的同学可以用纸笔完成作业之后拍照。&lt;br /&gt;
*本门课的所有作业中如果出现了课程中（包括课件中和作业题目中）的算法或者概念或者符号，则&amp;lt;font color=&#039;red&#039;&amp;gt;&#039;&#039;&#039;禁止使用自己发明的说法或者符号&#039;&#039;&#039;&amp;lt;/font&amp;gt;，必须与课程内容保持一致。&lt;br /&gt;
&lt;br /&gt;
# 将 power of two choices 扩展到 power of &amp;lt;math&amp;gt;d&amp;lt;/math&amp;gt; choices。此时 maximum load 是多少？尝试分析和证明你的结论。&lt;br /&gt;
# 计算 FKS perfect hashing 的空间开销的期望和方差，并使用切比雪夫不等式证明其空间开销以高概率不超过 &amp;lt;math&amp;gt;10n&amp;lt;/math&amp;gt;.&lt;br /&gt;
# 对于 FKS perfect hashing，如果调整第一层的桶的数目（不是 &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; 而是另一个供你自由调节的参数 &amp;lt;math&amp;gt;n&#039;&amp;lt;/math&amp;gt;），你可以得到空间复杂度更优秀的哈希表吗？&lt;br /&gt;
# 当 Cuckoo hashing 的负载率超过 1/2 时，插入失败的概率是多少？（Hint. Let &amp;lt;math&amp;gt;G(m, c/m)&amp;lt;/math&amp;gt; be the Erdős–Rényi random graph with &amp;lt;math&amp;gt;c&amp;gt;1&amp;lt;/math&amp;gt;. Then with probability &amp;lt;math&amp;gt;1-o(1)&amp;lt;/math&amp;gt;, there exists a connected component &amp;lt;math&amp;gt;H&amp;lt;/math&amp;gt; satisfying &amp;lt;math&amp;gt;|V(H)| \ge \Omega(m)&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|E(H)| - |V(H)| = \Omega(m)&amp;lt;/math&amp;gt;.）&lt;br /&gt;
# The Method of Four Russians 说到底就是分块并打表。使用 The Method of Four Russians，构造一个大小为 &amp;lt;math&amp;gt;O(n^{1/c}\log n)&amp;lt;/math&amp;gt; bits 的 pre-computed look-up table（其中 &amp;lt;math&amp;gt;c&amp;lt;/math&amp;gt; 是一个可以自由调节的参数）来在课堂上提到的大小为 &amp;lt;math&amp;gt;O(\log n)&amp;lt;/math&amp;gt; bits 的 Trie 上以 &amp;lt;math&amp;gt;O(c)&amp;lt;/math&amp;gt; 的时间快速查找 Trie 中是否有一个串是查询串的前缀，并返回对应的串在 Trie 中的字典序。&lt;br /&gt;
# Feistel cipher / Feistel permutation 是一种双射，按如下方式运行： &amp;lt;math&amp;gt;(x_1,x_2) \mapsto (f(x_1)\oplus x_2,x_2)&amp;lt;/math&amp;gt;。若假设所有的 &amp;lt;math&amp;gt;x_2&amp;lt;/math&amp;gt; 都不相同，且哈希函数 &amp;lt;math&amp;gt;f&amp;lt;/math&amp;gt; 是 &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-universal，则 Feistel cipher 可以把 &amp;lt;math&amp;gt;x_1\in\{0,1\}^t&amp;lt;/math&amp;gt; 以 &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-universal 的方式均匀随机地映射到 &amp;lt;math&amp;gt;\{0,1\}^t&amp;lt;/math&amp;gt; 中。请构造一个一般的 Feistel cipher，若所有的 &amp;lt;math&amp;gt;x_2&amp;lt;/math&amp;gt; 都不相同，则新的 Feistel cipher 将 &amp;lt;math&amp;gt;x_1\in[n]&amp;lt;/math&amp;gt; 以 &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-universal 的方式均匀随机地映射到 &amp;lt;math&amp;gt;[n]&amp;lt;/math&amp;gt; 中，并证明它是 &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-universal。&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A2&amp;diff=13692</id>
		<title>高级算法 (Spring 2026)/作业2</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A2&amp;diff=13692"/>
		<updated>2026-04-21T12:08:42Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*每道题目的解答都要有完整的解题过程，中英文不限。&lt;br /&gt;
*我们推荐大家使用LaTeX, markdown等对作业进行排版。&lt;br /&gt;
*没有条件的同学可以用纸笔完成作业之后拍照。&lt;br /&gt;
*本门课的所有作业中如果出现了课程中（包括课件中和作业题目中）的算法或者概念或者符号，则&amp;lt;font color=&#039;red&#039;&amp;gt;&#039;&#039;&#039;禁止使用自己发明的说法或者符号&#039;&#039;&#039;&amp;lt;/font&amp;gt;，必须与课程内容保持一致。&lt;br /&gt;
&lt;br /&gt;
# 将 power of two choices 扩展到 power of &amp;lt;math&amp;gt;d&amp;lt;/math&amp;gt; choices。此时 maximum load 是多少？尝试分析和证明你的结论。&lt;br /&gt;
# 计算 FKS perfect hashing 的空间开销的期望和方差，并使用切比雪夫不等式证明其空间开销以高概率不超过 &amp;lt;math&amp;gt;10n&amp;lt;/math&amp;gt;.&lt;br /&gt;
# 对于 FKS perfect hashing 如果调整第一层的桶的数目（不是 &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; 而是另一个供你自由调节的参数 &amp;lt;math&amp;gt;n&#039;&amp;lt;/math&amp;gt;），你可以得到空间复杂度更优秀的哈希表吗？&lt;br /&gt;
# 当 Cuckoo hashing 的 负载率超过 1/2 时，插入失败的概率是多少？（Hint. Let &amp;lt;math&amp;gt; G(m, c/m)&amp;lt;/math&amp;gt; be Erdős–Rényi random graph with &amp;lt;math&amp;gt;c&amp;gt;1&amp;lt;/math&amp;gt;. Then with probability &amp;lt;math&amp;gt;1-o(1)&amp;lt;/math&amp;gt;, there exists a connected component &amp;lt;math&amp;gt;H&amp;lt;/math&amp;gt; satisfying &amp;lt;math&amp;gt;|V(H)| \ge \Omega(m)&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|E(H)| - |V(H)| = \Omega( m). &amp;lt;/math&amp;gt;）&lt;br /&gt;
# The Method of Four Russians 说到底就是分块并打表。使用 The Method of Four Russians, 构造一个大小为 &amp;lt;math&amp;gt; O(n^{1/c}\log n) &amp;lt;/math&amp;gt; bits 的 pre-computed look-up table（其中 &amp;lt;math&amp;gt;c&amp;lt;/math&amp;gt; 是一个可以自由调节的参数）来在课堂上提到的大小为  &amp;lt;math&amp;gt; O(\log n) &amp;lt;/math&amp;gt; bits 的 Trie 上以  &amp;lt;math&amp;gt; O(c) &amp;lt;/math&amp;gt; 的时间快速查找是否有一个 Trie 的串是查询串的前缀，并返回对应的串的在 Trie 中的字典序。&lt;br /&gt;
# Feistel cipher / Feistel permutation 是一种双射，按如下方式运行：&amp;lt;math&amp;gt;(x_1,x_2) \mapsto (f(x_1)\oplus x_2,x_2)&amp;lt;/math&amp;gt;。若假设所有的 &amp;lt;math&amp;gt;x_2&amp;lt;/math&amp;gt; 都不相同，且哈希函数 &amp;lt;math&amp;gt;f&amp;lt;/math&amp;gt; 是 &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-universal，则 Feistel cipher 可以把 &amp;lt;math&amp;gt;x_1\in\{0,1\}^t&amp;lt;/math&amp;gt; 以 &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-universal 的方式均匀随机地映射到 &amp;lt;math&amp;gt;\{0,1\}^t&amp;lt;/math&amp;gt; 中。请构造一个一般的 Feistel cipher，若所有的 &amp;lt;math&amp;gt;x_2&amp;lt;/math&amp;gt; 都不相同，则新的 Feistel cipher 将 &amp;lt;math&amp;gt;x_1\in[n]&amp;lt;/math&amp;gt; 以 &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-universal 的方式均匀随机地映射到 &amp;lt;math&amp;gt;[n]&amp;lt;/math&amp;gt; 中，并证明它是 &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-universal。&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
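The balls-into-bins process behind problem 1 in the entry above (power of d choices) can be illustrated with a small simulation. This is an illustrative aside, not part of the course page; the function name `max_load` and the use of Python's seeded `random` as a stand-in for truly random hashing are our own assumptions.

```python
import random

def max_load(n, d, seed=0):
    """Throw n balls into n bins; each ball probes d uniformly random
    bins and goes to the least loaded one. Returns the maximum load."""
    rng = random.Random(seed)
    bins = [0] * n
    for _ in range(n):
        choices = [rng.randrange(n) for _ in range(d)]
        best = min(choices, key=bins.__getitem__)  # lightest probed bin
        bins[best] += 1
    return max(bins)
```

Comparing `max_load(n, 1)` with `max_load(n, 2)` for growing n exhibits the drop from roughly log n / log log n to roughly log log n that the problem asks you to generalize to d choices.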
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A2&amp;diff=13642</id>
		<title>高级算法 (Spring 2026)/作业2</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A2&amp;diff=13642"/>
		<updated>2026-04-19T16:19:05Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*每道题目的解答都要有完整的解题过程，中英文不限。&lt;br /&gt;
*我们推荐大家使用LaTeX, markdown等对作业进行排版。&lt;br /&gt;
*没有条件的同学可以用纸笔完成作业之后拍照。&lt;br /&gt;
*本门课的所有作业中如果出现了课程中（包括课件中和作业题目中）的算法或者概念或者符号，则&amp;lt;font color=&#039;red&#039;&amp;gt;&#039;&#039;&#039;禁止使用自己发明的说法或者符号&#039;&#039;&#039;&amp;lt;/font&amp;gt;，必须与课程内容保持一致。&lt;br /&gt;
&lt;br /&gt;
# 将 power of two choices 扩展到 power of &amp;lt;math&amp;gt;d&amp;lt;/math&amp;gt; choices。此时 maximum load 是多少？尝试分析和证明你的结论。&lt;br /&gt;
# 计算 FKS perfect hashing 的空间开销的期望和方差，并使用切比雪夫不等式证明其空间开销以高概率不超过 &amp;lt;math&amp;gt;10n&amp;lt;/math&amp;gt;.&lt;br /&gt;
# 对于 FKS perfect hashing，如果调整第一层的桶的数目（不是 &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; 而是另一个供你自由调节的参数 &amp;lt;math&amp;gt;n&#039;&amp;lt;/math&amp;gt;），你可以得到空间复杂度更优秀的哈希表吗？&lt;br /&gt;
# 当 Cuckoo hashing 的负载率超过 1/2 时，插入失败的概率是多少？（Hint. Let &amp;lt;math&amp;gt;G(m, c/m)&amp;lt;/math&amp;gt; be an Erdős–Rényi random graph with &amp;lt;math&amp;gt;c&amp;gt;1&amp;lt;/math&amp;gt;. Then with probability &amp;lt;math&amp;gt;1-o(1)&amp;lt;/math&amp;gt;, there exists a connected component &amp;lt;math&amp;gt;H&amp;lt;/math&amp;gt; satisfying &amp;lt;math&amp;gt;|V(H)| \ge \Omega(m)&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|E(H)| - |V(H)| = \Omega(m)&amp;lt;/math&amp;gt;.）&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
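Problems 2 and 3 in the entry above concern the space of FKS perfect hashing. A minimal two-level sketch may help fix ideas; this is our own illustrative code (not from the course), with Python's built-in `hash` plus a random salt standing in for a truly random hash family, and bucket tables of size s squared retried until collision-free.

```python
import random

def fks_build(keys, seed=0):
    """Two-level FKS perfect hashing sketch: level one hashes n keys
    into n buckets; a bucket holding s keys gets a level-two table of
    size s*s whose salt is resampled until it is collision-free."""
    rng = random.Random(seed)
    n = len(keys)
    salt = rng.randrange(10**9)
    buckets = [[] for _ in range(n)]
    for k in keys:
        buckets[hash((salt, k)) % n].append(k)
    tables = []
    for b in buckets:
        size = len(b) * len(b)
        if size == 0:
            tables.append((0, []))
            continue
        while True:  # expected O(1) retries since size beats birthday bound
            s2 = rng.randrange(10**9)
            slots = [None] * size
            ok = True
            for k in b:
                i = hash((s2, k)) % size
                if slots[i] is not None:
                    ok = False
                    break
                slots[i] = k
            if ok:
                tables.append((s2, slots))
                break
    return salt, tables

def fks_lookup(table, key):
    """Worst-case O(1) membership query: one probe per level."""
    salt, tables = table
    s2, slots = tables[hash((salt, key)) % len(tables)]
    if slots:
        return slots[hash((s2, key)) % len(slots)] == key
    return False
```

The quantity the homework asks you to bound is exactly the total length of all the `slots` arrays, i.e. the sum of squared bucket sizes.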
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A2&amp;diff=13633</id>
		<title>高级算法 (Spring 2026)/作业2</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A2&amp;diff=13633"/>
		<updated>2026-04-15T14:28:50Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*每道题目的解答都要有完整的解题过程，中英文不限。&lt;br /&gt;
*我们推荐大家使用LaTeX, markdown等对作业进行排版。&lt;br /&gt;
*没有条件的同学可以用纸笔完成作业之后拍照。&lt;br /&gt;
*本门课的所有作业中如果出现了课程中（包括课件中和作业题目中）的算法或者概念或者符号，则&amp;lt;font color=&#039;red&#039;&amp;gt;&#039;&#039;&#039;禁止使用自己发明的说法或者符号&#039;&#039;&#039;&amp;lt;/font&amp;gt;，必须与课程内容保持一致。&lt;br /&gt;
&lt;br /&gt;
# 将 power of two choices 扩展到 power of &amp;lt;math&amp;gt;d&amp;lt;/math&amp;gt; choices。此时 maximum load 是多少？尝试分析和证明你的结论。&lt;br /&gt;
# 计算 FKS perfect hashing 的空间开销的期望和方差，并使用切比雪夫不等式证明其空间开销以高概率不超过 &amp;lt;math&amp;gt;10n&amp;lt;/math&amp;gt;.&lt;br /&gt;
# 当 Cuckoo hashing 的负载率超过 1/2 时，插入失败的概率是多少？（Hint. Let &amp;lt;math&amp;gt;G(m, c/m)&amp;lt;/math&amp;gt; be an Erdős–Rényi random graph with &amp;lt;math&amp;gt;c&amp;gt;1&amp;lt;/math&amp;gt;. Then with probability &amp;lt;math&amp;gt;1-o(1)&amp;lt;/math&amp;gt;, there exists a connected component &amp;lt;math&amp;gt;H&amp;lt;/math&amp;gt; satisfying &amp;lt;math&amp;gt;|V(H)| \ge \Omega(m)&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;|E(H)| - |V(H)| = \Omega(m)&amp;lt;/math&amp;gt;.）&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
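The Cuckoo hashing question in the entry above refers to the usual eviction loop; as an illustrative sketch of ours (not the course's reference implementation), two salted hash positions per key and a `max_kicks` cap on the displacement chain look like this:

```python
import random

class Cuckoo:
    """Single-array Cuckoo hash table sketch with two hash positions."""
    def __init__(self, m, seed=0):
        rng = random.Random(seed)
        self.m = m
        self.slots = [None] * m
        self.salts = (rng.randrange(10**9), rng.randrange(10**9))

    def _pos(self, key, i):
        # stand-in for two independent truly random hash functions
        return hash((self.salts[i], key)) % self.m

    def insert(self, key, max_kicks=200):
        """Place key, evicting residents to their alternate position;
        exhausting max_kicks signals failure (a dense/cyclic component
        in the cuckoo graph, which is what the homework hint analyzes)."""
        i = 0
        for _ in range(max_kicks):
            p = self._pos(key, i)
            if self.slots[p] is None:
                self.slots[p] = key
                return True
            self.slots[p], key = key, self.slots[p]  # evict resident
            i = 1 if self._pos(key, 0) == p else 0   # try its other position
        return False

    def lookup(self, key):
        # a stored key always sits at one of its two positions
        return self.slots[self._pos(key, 0)] == key or self.slots[self._pos(key, 1)] == key
```

Pushing the load factor past 1/2 makes `insert` start returning False frequently, which is the failure probability the problem asks you to quantify.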
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13632</id>
		<title>高级算法 (Spring 2026)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13632"/>
		<updated>2026-04-15T13:16:13Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* 课后作业 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;高级算法 &lt;br /&gt;
&amp;lt;br&amp;gt;Advanced Algorithms&amp;lt;/font&amp;gt;&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =任课教师&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data2   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header3 = &lt;br /&gt;
|label3  = 电子邮件&lt;br /&gt;
|data3   = lmm@nju.edu.cn &lt;br /&gt;
|header4 =&lt;br /&gt;
|label4= 办公室&lt;br /&gt;
|data4= 南雍-西229&lt;br /&gt;
|header5 = &lt;br /&gt;
|label5  = &lt;br /&gt;
|header11 = 课程时间地点&lt;br /&gt;
|label11  = &lt;br /&gt;
|data11   = &lt;br /&gt;
|header12 =&lt;br /&gt;
|label12  = 教室&lt;br /&gt;
|data12   = 周一，9am-12pm&amp;lt;br&amp;gt;苏教B207&lt;br /&gt;
|header13 =&lt;br /&gt;
|label13  = Place&lt;br /&gt;
|data13   = &lt;br /&gt;
|header14 =&lt;br /&gt;
|label14  = 答疑时间&lt;br /&gt;
|data14   = 周五，2pm-5pm&amp;lt;br&amp;gt;南雍-西229&lt;br /&gt;
|header15 = 教材&lt;br /&gt;
|label15  = &lt;br /&gt;
|data15   = &lt;br /&gt;
|header16 =&lt;br /&gt;
|label16  = &lt;br /&gt;
|data16   = [[File:MR-randomized-algorithms.png|border|100px]]&lt;br /&gt;
|header17 =&lt;br /&gt;
|label17  = &lt;br /&gt;
|data17   = Motwani and Raghavan. &amp;lt;br&amp;gt;&#039;&#039;Randomized Algorithms&#039;&#039;.&amp;lt;br&amp;gt; Cambridge Univ Press, 1995.&lt;br /&gt;
|header18 =&lt;br /&gt;
|label18  = &lt;br /&gt;
|data18   = [[File:Approximation_Algorithms.jpg|border|100px]]&lt;br /&gt;
|header19 =&lt;br /&gt;
|label19  = &lt;br /&gt;
|data19   =  Vazirani. &amp;lt;br&amp;gt;&#039;&#039;Approximation Algorithms&#039;&#039;. &amp;lt;br&amp;gt; Springer-Verlag, 2001.&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Advanced Algorithms&#039;&#039; class of spring 2026. Students who take this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= 通知 =&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;(2026/3/2)&#039;&#039;&#039; 第一堂课&lt;br /&gt;
* &#039;&#039;&#039;(2026/3/23)&#039;&#039;&#039; 请大家&amp;lt;font color=red &amp;gt;&#039;&#039;&#039;严格遵守[https://jw.nju.edu.cn/_upload/article/files/ab/85/1f49e9e9434dbf23018539b4c338/a550b349-9828-4a05-a5ec-d088b442bee1.pdf 《关于本科生规范使用生成式人工智能工具的指导意见（试行）》]&#039;&#039;&#039;&amp;lt;/font&amp;gt;！&lt;br /&gt;
* &#039;&#039;&#039;(2026/4/6)&#039;&#039;&#039; 是清明节假期，停课一次。&lt;br /&gt;
&lt;br /&gt;
= 课程信息 =&lt;br /&gt;
* &#039;&#039;&#039;任课教师&#039;&#039;&#039;: &lt;br /&gt;
:* [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn 📧]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;助教&#039;&#039;&#039;: &lt;br /&gt;
** 王思齐：[mailto:siqi_wang@smail.nju.edu.cn 📧]&lt;br /&gt;
** 齐世毅：[mailto:1083951258@qq.com 📧]&lt;br /&gt;
* &#039;&#039;&#039;课程时间地点&#039;&#039;&#039;: &lt;br /&gt;
** 周一，9am-12pm，苏教B207&lt;br /&gt;
* &#039;&#039;&#039;答疑时间&#039;&#039;&#039;: 周五, 2pm-5pm, 南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1083465754&lt;br /&gt;
&lt;br /&gt;
= 教学大纲 =&lt;br /&gt;
随着计算机算法理论的不断发展，现代计算机算法的设计与分析大量地使用非初等的数学工具以及非传统的算法思想。“高级算法”这门课程就是面向计算机算法的这一发展趋势而设立的。课程将针对传统算法课程未系统涉及、却在计算机科学各领域的科研和实践中扮演重要角色的高等算法设计思想和算法分析工具进行系统讲授。&lt;br /&gt;
&lt;br /&gt;
课程内容分为五大部分：&lt;br /&gt;
* 基于哈希的大数据算法&lt;br /&gt;
* 哈希表与面向大数据的现代计算场景&lt;br /&gt;
* 测度的集中与处理高维数据&lt;br /&gt;
* 最大流与线性规划&lt;br /&gt;
* 其他重要话题&lt;br /&gt;
&lt;br /&gt;
=== 先修课程 ===&lt;br /&gt;
* 必须：离散数学，概率论，线性代数。&lt;br /&gt;
* 推荐：算法设计与分析。&lt;br /&gt;
&lt;br /&gt;
=== 课程教材 ===&lt;br /&gt;
本门课较为前沿，大部分课程内容还没有进入任何教材。以下教材和参考书仅作为参考。&lt;br /&gt;
* [[高级算法 (Fall 2024) / Course materials|&amp;lt;font size=3&amp;gt;教材和参考书&amp;lt;/font&amp;gt;]]&lt;br /&gt;
更多的内容也可以参考[[#Related_Online_Courses |其他学者的同类课程]]。&lt;br /&gt;
&lt;br /&gt;
=== 成绩 ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考核。最终成绩将由平时作业成绩和期末考核成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力地维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= 课后作业 =&lt;br /&gt;
Late policy: In general, we will accommodate late submission requests ONLY IF you made such requests ahead of time.&lt;br /&gt;
&lt;br /&gt;
请大家&amp;lt;font color=red &amp;gt;&#039;&#039;&#039;严格遵守[https://jw.nju.edu.cn/_upload/article/files/ab/85/1f49e9e9434dbf23018539b4c338/a550b349-9828-4a05-a5ec-d088b442bee1.pdf 《关于本科生规范使用生成式人工智能工具的指导意见（试行）》]&#039;&#039;&#039;&amp;lt;/font&amp;gt;！&lt;br /&gt;
&lt;br /&gt;
*[[高级算法_(Spring_2026)/作业1|作业1]] 请在 2026/04/06 上课前（9am UTC+8）上传到 [https://box.nju.edu.cn/u/d/628bff0686e14f8cbf3f/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_版本.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[高级算法_(Spring_2026)/作业2|作业2]]（每次课后更新）作业提交时间待定。&lt;br /&gt;
&lt;br /&gt;
= 课件及相关阅读资料 =&lt;br /&gt;
# [https://box.nju.edu.cn/f/980d814e4ad64285a640/ Fingerprinting]&lt;br /&gt;
#* Polynomial Identity Testing&lt;br /&gt;
#* Communication Complexity (Equality)&lt;br /&gt;
#* Application: Bipartite Perfect Matching, Checking Matrix Multiplication&lt;br /&gt;
#* Karp-Rabin Algorithm (string-searching), Lipton’s Algorithm (checking identity of multisets)&lt;br /&gt;
# [https://box.nju.edu.cn/f/fcf9e2b3051e443fb324/ Sketching]&lt;br /&gt;
#* Morris&#039; Algorithm&lt;br /&gt;
#** mean trick, median trick&lt;br /&gt;
#* Counting Distinct Elements: min sketch, Flajolet-Martin algorithm, bottom-&amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; algorithm, HyperLogLog&lt;br /&gt;
#* Heavy Hitter &amp;amp; Point Query: count-min sketch&lt;br /&gt;
#* 2nd Frequency Moments Estimator: count sketch&lt;br /&gt;
#** &amp;lt;math&amp;gt;\ell_2&amp;lt;/math&amp;gt; point query with count sketch&lt;br /&gt;
#* Approximate Membership: Bloom filter&lt;br /&gt;
# [https://box.nju.edu.cn/f/40f8ae1bc97c409aa485/ Hashing]&lt;br /&gt;
#* Load Balancing: maximum load, power of two choices&lt;br /&gt;
#* Perfect Hashing: birthday paradox, FKS perfect hashing&lt;br /&gt;
#* Modern Hash Table: Cuckoo hashing, succinct dictionaries&lt;br /&gt;
#* Hashing in Practice: Chernoff Bound with limited independence, tabulation hashing&lt;br /&gt;
&lt;br /&gt;
= Related Online Courses=&lt;br /&gt;
* [https://www.cs.columbia.edu/~andoni/advancedS24/index.html Advanced Algorithms] by Alexandr Andoni at Columbia University.&lt;br /&gt;
* [https://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15854-f21/www/ Advanced Approximation Algorithms] by Anupam Gupta at CMU.&lt;br /&gt;
* [http://people.csail.mit.edu/moitra/854.html Advanced Algorithms] by Ankur Moitra at MIT.&lt;br /&gt;
* [https://6.5210.csail.mit.edu/ Advanced Algorithms] by David Karger at MIT.&lt;br /&gt;
* [https://www.cs.cmu.edu/~dwoodruf/teaching/15851-spring24/ Algorithms for Big Data] by David Woodruff at CMU.&lt;br /&gt;
* [https://www.sketchingbigdata.org/fall20/lec/ Sketching Algorithms] by Jelani Nelson at UC Berkeley.&lt;br /&gt;
* [http://web.stanford.edu/class/cs168/index.html The Modern Algorithmic Toolbox] by Tim Roughgarden and Gregory Valiant at Stanford.&lt;br /&gt;
* [https://www.cs.princeton.edu/courses/archive/fall18/cos521/ Advanced Algorithm Design] by Pravesh Kothari and Christopher Musco at Princeton.&lt;br /&gt;
* [http://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15859-f11/www/ Linear and Semidefinite Programming (Advanced Algorithms)] by Anupam Gupta and Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://www.cs.cmu.edu/~odonnell/papers/cs-theory-toolkit-lecture-notes.pdf CS Theory Toolkit] by Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://cs.uwaterloo.ca/~lapchi/cs860/index.html Eigenvalues and Polynomials] by Lap Chi Lau at University of Waterloo.&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A2&amp;diff=13631</id>
		<title>高级算法 (Spring 2026)/作业2</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A2&amp;diff=13631"/>
		<updated>2026-04-15T13:15:03Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: Created page with &amp;quot;*每道题目的解答都要有完整的解题过程，中英文不限。 *我们推荐大家使用LaTeX, markdown等对作业进行排版。 *没有条件的同学可以用纸笔完成作业之后拍照。 *本门课的所有作业中如果出现了课程中（包括课件中和作业题目中）的算法或者概念或者符号，则&amp;lt;font color=&amp;#039;red&amp;#039;&amp;gt;&amp;#039;&amp;#039;&amp;#039;禁止使用自己发明的说法或者符号&amp;#039;&amp;#039;&amp;#039;&amp;lt;/font&amp;gt;，必须与课程内容保持一致。  # 将 power of two ch...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*每道题目的解答都要有完整的解题过程，中英文不限。&lt;br /&gt;
*我们推荐大家使用LaTeX, markdown等对作业进行排版。&lt;br /&gt;
*没有条件的同学可以用纸笔完成作业之后拍照。&lt;br /&gt;
*本门课的所有作业中如果出现了课程中（包括课件中和作业题目中）的算法或者概念或者符号，则&amp;lt;font color=&#039;red&#039;&amp;gt;&#039;&#039;&#039;禁止使用自己发明的说法或者符号&#039;&#039;&#039;&amp;lt;/font&amp;gt;，必须与课程内容保持一致。&lt;br /&gt;
&lt;br /&gt;
# 将 power of two choices 扩展到 power of &amp;lt;math&amp;gt;d&amp;lt;/math&amp;gt; choices。此时 maximum load 是多少？尝试分析和证明你的结论。&lt;br /&gt;
# 考虑两种构造哈希表的基本模式：拉链法和基于桶的哈希表。&lt;br /&gt;
## 分别使用朴素的真随机哈希函数和 power of two choices 构造基于拉链法的哈希表。假设我们插入 n keys，哈希表有 n slots，那么每次查询的时间开销的期望和方差大约是多少？时间开销以高概率不会超过多少？&lt;br /&gt;
## 基于桶的哈希表按照如下方式运行：假设哈希表有 m slots，我们可以把哈希表分成 &amp;lt;math&amp;gt;m/c&amp;lt;/math&amp;gt; 个桶，每个桶容量为 &amp;lt;math&amp;gt;c&amp;lt;/math&amp;gt;。每次插入的时候选择一个桶，再到桶内选择一个空的 slot 放入。查询时则遍历桶内所有的 slots，删除则是简单移除对应的 key。&amp;lt;br/&amp;gt; 分别使用朴素的真随机哈希函数和 power of two choices 构造基于桶的哈希表。当你的负载率达到大约多少的时候，一次插入的失败概率会高于 &amp;lt;math&amp;gt;1/m^{10}&amp;lt;/math&amp;gt;?&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
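The bucket-based table described in problem 2.2 of the entry above (m slots split into m/c buckets of capacity c, insertion picking the emptier of two candidate buckets) can be sketched directly. This is our own illustrative code; the class name `BucketTable` and the salted built-in `hash` standing in for truly random hash functions are assumptions, not course material.

```python
import random

class BucketTable:
    """m slots grouped into m // c buckets, each holding up to c keys;
    insertion uses power of two choices over buckets."""
    def __init__(self, m, c, seed=0):
        self.c = c
        self.nbuckets = m // c
        self.buckets = [[] for _ in range(self.nbuckets)]
        rng = random.Random(seed)
        self.s1 = rng.randrange(10**9)
        self.s2 = rng.randrange(10**9)

    def _bucket(self, key, salt):
        return hash((salt, key)) % self.nbuckets

    def insert(self, key):
        # probe two buckets, place the key in the emptier one;
        # fail only when both candidate buckets are already full
        b1 = self.buckets[self._bucket(key, self.s1)]
        b2 = self.buckets[self._bucket(key, self.s2)]
        target = min(b1, b2, key=len)
        if len(target) == self.c:
            return False
        target.append(key)
        return True

    def lookup(self, key):
        # scan all slots of both candidate buckets
        b1 = self.buckets[self._bucket(key, self.s1)]
        b2 = self.buckets[self._bucket(key, self.s2)]
        return key in b1 or key in b2
```

The load factor at which `insert` first returns False with probability above 1/m^10 is the quantity the problem asks you to estimate for both one-choice and two-choice placement.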
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13597</id>
		<title>高级算法 (Spring 2026)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13597"/>
		<updated>2026-04-01T13:33:50Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* 通知 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;高级算法 &lt;br /&gt;
&amp;lt;br&amp;gt;Advanced Algorithms&amp;lt;/font&amp;gt;&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =任课教师&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data2   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header3 = &lt;br /&gt;
|label3  = 电子邮件&lt;br /&gt;
|data3   = lmm@nju.edu.cn &lt;br /&gt;
|header4 =&lt;br /&gt;
|label4= 办公室&lt;br /&gt;
|data4= 南雍-西229&lt;br /&gt;
|header5 = &lt;br /&gt;
|label5  = &lt;br /&gt;
|header11 = 课程时间地点&lt;br /&gt;
|label11  = &lt;br /&gt;
|data11   = &lt;br /&gt;
|header12 =&lt;br /&gt;
|label12  = 教室&lt;br /&gt;
|data12   = 周一，9am-12pm&amp;lt;br&amp;gt;苏教B207&lt;br /&gt;
|header13 =&lt;br /&gt;
|label13  = Place&lt;br /&gt;
|data13   = &lt;br /&gt;
|header14 =&lt;br /&gt;
|label14  = 答疑时间&lt;br /&gt;
|data14   = 周五，2pm-5pm&amp;lt;br&amp;gt;南雍-西229&lt;br /&gt;
|header15 = 教材&lt;br /&gt;
|label15  = &lt;br /&gt;
|data15   = &lt;br /&gt;
|header16 =&lt;br /&gt;
|label16  = &lt;br /&gt;
|data16   = [[File:MR-randomized-algorithms.png|border|100px]]&lt;br /&gt;
|header17 =&lt;br /&gt;
|label17  = &lt;br /&gt;
|data17   = Motwani and Raghavan. &amp;lt;br&amp;gt;&#039;&#039;Randomized Algorithms&#039;&#039;.&amp;lt;br&amp;gt; Cambridge Univ Press, 1995.&lt;br /&gt;
|header18 =&lt;br /&gt;
|label18  = &lt;br /&gt;
|data18   = [[File:Approximation_Algorithms.jpg|border|100px]]&lt;br /&gt;
|header19 =&lt;br /&gt;
|label19  = &lt;br /&gt;
|data19   =  Vazirani. &amp;lt;br&amp;gt;&#039;&#039;Approximation Algorithms&#039;&#039;. &amp;lt;br&amp;gt; Springer-Verlag, 2001.&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Advanced Algorithms&#039;&#039; class of spring 2026. Students who take this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= 通知 =&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;(2026/3/2)&#039;&#039;&#039; 第一堂课&lt;br /&gt;
* &#039;&#039;&#039;(2026/3/23)&#039;&#039;&#039; 请大家&amp;lt;font color=red &amp;gt;&#039;&#039;&#039;严格遵守[https://jw.nju.edu.cn/_upload/article/files/ab/85/1f49e9e9434dbf23018539b4c338/a550b349-9828-4a05-a5ec-d088b442bee1.pdf 《关于本科生规范使用生成式人工智能工具的指导意见（试行）》]&#039;&#039;&#039;&amp;lt;/font&amp;gt;！&lt;br /&gt;
* &#039;&#039;&#039;(2026/4/6)&#039;&#039;&#039; 是清明节假期，停课一次。&lt;br /&gt;
&lt;br /&gt;
= 课程信息 =&lt;br /&gt;
* &#039;&#039;&#039;任课教师&#039;&#039;&#039;: &lt;br /&gt;
:* [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn 📧]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;助教&#039;&#039;&#039;: &lt;br /&gt;
** 王思齐：[mailto:siqi_wang@smail.nju.edu.cn 📧]&lt;br /&gt;
** 齐世毅：[mailto:1083951258@qq.com 📧]&lt;br /&gt;
* &#039;&#039;&#039;课程时间地点&#039;&#039;&#039;: &lt;br /&gt;
** 周一，9am-12pm，苏教B207&lt;br /&gt;
* &#039;&#039;&#039;答疑时间&#039;&#039;&#039;: 周五, 2pm-5pm, 南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1083465754&lt;br /&gt;
&lt;br /&gt;
= 教学大纲 =&lt;br /&gt;
随着计算机算法理论的不断发展，现代计算机算法的设计与分析大量地使用非初等的数学工具以及非传统的算法思想。“高级算法”这门课程就是面向计算机算法的这一发展趋势而设立的。课程将针对传统算法课程未系统涉及、却在计算机科学各领域的科研和实践中扮演重要角色的高等算法设计思想和算法分析工具进行系统讲授。&lt;br /&gt;
&lt;br /&gt;
课程内容分为五大部分：&lt;br /&gt;
* 基于哈希的大数据算法&lt;br /&gt;
* 哈希表与面向大数据的现代计算场景&lt;br /&gt;
* 测度的集中与处理高维数据&lt;br /&gt;
* 最大流与线性规划&lt;br /&gt;
* 其他重要话题&lt;br /&gt;
&lt;br /&gt;
=== 先修课程 ===&lt;br /&gt;
* 必须：离散数学，概率论，线性代数。&lt;br /&gt;
* 推荐：算法设计与分析。&lt;br /&gt;
&lt;br /&gt;
=== 课程教材 ===&lt;br /&gt;
本门课较为前沿，大部分课程内容还没有进入任何教材。以下教材和参考书仅作为参考。&lt;br /&gt;
* [[高级算法 (Fall 2024) / Course materials|&amp;lt;font size=3&amp;gt;教材和参考书&amp;lt;/font&amp;gt;]]&lt;br /&gt;
更多的内容也可以参考[[#Related_Online_Courses |其他学者的同类课程]]。&lt;br /&gt;
&lt;br /&gt;
=== 成绩 ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考核。最终成绩将由平时作业成绩和期末考核成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力地维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= 课后作业 =&lt;br /&gt;
Late policy: In general, we will accommodate late submission requests ONLY IF you made such requests ahead of time.&lt;br /&gt;
&lt;br /&gt;
请大家&amp;lt;font color=red &amp;gt;&#039;&#039;&#039;严格遵守[https://jw.nju.edu.cn/_upload/article/files/ab/85/1f49e9e9434dbf23018539b4c338/a550b349-9828-4a05-a5ec-d088b442bee1.pdf 《关于本科生规范使用生成式人工智能工具的指导意见（试行）》]&#039;&#039;&#039;&amp;lt;/font&amp;gt;！&lt;br /&gt;
&lt;br /&gt;
*[[高级算法_(Spring_2026)/作业1|作业1]] 请在 2026/04/06 上课前（9am UTC+8）上传到 [https://box.nju.edu.cn/u/d/628bff0686e14f8cbf3f/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_版本.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
&lt;br /&gt;
= 课件及相关阅读资料 =&lt;br /&gt;
# [https://box.nju.edu.cn/f/980d814e4ad64285a640/ Fingerprinting]&lt;br /&gt;
#* Polynomial Identity Testing&lt;br /&gt;
#* Communication Complexity (Equality)&lt;br /&gt;
#* Application: Bipartite Perfect Matching, Checking Matrix Multiplication&lt;br /&gt;
#* Karp-Rabin Algorithm (string-searching), Lipton’s Algorithm (checking identity of multisets)&lt;br /&gt;
# [https://box.nju.edu.cn/f/fcf9e2b3051e443fb324/ Sketching]&lt;br /&gt;
#* Morris&#039; Algorithm&lt;br /&gt;
#** mean trick, median trick&lt;br /&gt;
#* Counting Distinct Elements: min sketch, Flajolet-Martin algorithm, bottom-&amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; algorithm, HyperLogLog&lt;br /&gt;
#* Heavy Hitter &amp;amp; Point Query: count-min sketch&lt;br /&gt;
#* 2nd Frequency Moments Estimator: count sketch&lt;br /&gt;
#** &amp;lt;math&amp;gt;\ell_2&amp;lt;/math&amp;gt; point query with count sketch&lt;br /&gt;
#* Approximate Membership: Bloom filter&lt;br /&gt;
# [https://box.nju.edu.cn/f/40f8ae1bc97c409aa485/ Hashing]&lt;br /&gt;
#* Load Balancing: maximum load, power of two choices&lt;br /&gt;
#* Perfect Hashing: birthday paradox, FKS perfect hashing&lt;br /&gt;
#* Modern Hash Table: Cuckoo hashing, succinct dictionaries&lt;br /&gt;
#* Hashing in Practice: Chernoff Bound with limited independence, tabulation hashing&lt;br /&gt;
&lt;br /&gt;
= Related Online Courses=&lt;br /&gt;
* [https://www.cs.columbia.edu/~andoni/advancedS24/index.html Advanced Algorithms] by Alexandr Andoni at Columbia University.&lt;br /&gt;
* [https://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15854-f21/www/ Advanced Approximation Algorithms] by Anupam Gupta at CMU.&lt;br /&gt;
* [http://people.csail.mit.edu/moitra/854.html Advanced Algorithms] by Ankur Moitra at MIT.&lt;br /&gt;
* [https://6.5210.csail.mit.edu/ Advanced Algorithms] by David Karger at MIT.&lt;br /&gt;
* [https://www.cs.cmu.edu/~dwoodruf/teaching/15851-spring24/ Algorithms for Big Data] by David Woodruff at CMU.&lt;br /&gt;
* [https://www.sketchingbigdata.org/fall20/lec/ Sketching Algorithms] by Jelani Nelson at UC Berkeley.&lt;br /&gt;
* [http://web.stanford.edu/class/cs168/index.html The Modern Algorithmic Toolbox] by Tim Roughgarden and Gregory Valiant at Stanford.&lt;br /&gt;
* [https://www.cs.princeton.edu/courses/archive/fall18/cos521/ Advanced Algorithm Design] by Pravesh Kothari and Christopher Musco at Princeton.&lt;br /&gt;
* [http://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15859-f11/www/ Linear and Semidefinite Programming (Advanced Algorithms)] by Anupam Gupta and Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://www.cs.cmu.edu/~odonnell/papers/cs-theory-toolkit-lecture-notes.pdf CS Theory Toolkit] by Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://cs.uwaterloo.ca/~lapchi/cs860/index.html Eigenvalues and Polynomials] by Lap Chi Lau at University of Waterloo.&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13558</id>
		<title>高级算法 (Spring 2026)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13558"/>
		<updated>2026-03-23T15:21:36Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* 通知 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;高级算法 &lt;br /&gt;
&amp;lt;br&amp;gt;Advanced Algorithms&amp;lt;/font&amp;gt;&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =任课教师&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data2   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header3 = &lt;br /&gt;
|label3  = 电子邮件&lt;br /&gt;
|data3   = lmm@nju.edu.cn &lt;br /&gt;
|header4 =&lt;br /&gt;
|label4= 办公室&lt;br /&gt;
|data4= 南雍-西229&lt;br /&gt;
|header5 = &lt;br /&gt;
|label5  = &lt;br /&gt;
|header11 = 课程时间地点&lt;br /&gt;
|label11  = &lt;br /&gt;
|data11   = &lt;br /&gt;
|header12 =&lt;br /&gt;
|label12  = 教室&lt;br /&gt;
|data12   = 周一，9am-12pm&amp;lt;br&amp;gt;苏教B207&lt;br /&gt;
|header13 =&lt;br /&gt;
|label13  = Place&lt;br /&gt;
|data13   = &lt;br /&gt;
|header14 =&lt;br /&gt;
|label14  = 答疑时间&lt;br /&gt;
|data14   = 周五，2pm-5pm&amp;lt;br&amp;gt;南雍-西229&lt;br /&gt;
|header15 = 教材&lt;br /&gt;
|label15  = &lt;br /&gt;
|data15   = &lt;br /&gt;
|header16 =&lt;br /&gt;
|label16  = &lt;br /&gt;
|data16   = [[File:MR-randomized-algorithms.png|border|100px]]&lt;br /&gt;
|header17 =&lt;br /&gt;
|label17  = &lt;br /&gt;
|data17   = Motwani and Raghavan. &amp;lt;br&amp;gt;&#039;&#039;Randomized Algorithms&#039;&#039;.&amp;lt;br&amp;gt; Cambridge Univ Press, 1995.&lt;br /&gt;
|header18 =&lt;br /&gt;
|label18  = &lt;br /&gt;
|data18   = [[File:Approximation_Algorithms.jpg|border|100px]]&lt;br /&gt;
|header19 =&lt;br /&gt;
|label19  = &lt;br /&gt;
|data19   =  Vazirani. &amp;lt;br&amp;gt;&#039;&#039;Approximation Algorithms&#039;&#039;. &amp;lt;br&amp;gt; Springer-Verlag, 2001.&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Advanced Algorithms&#039;&#039; class of spring 2026. Students who take this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= 通知 =&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;(2026/3/2)&#039;&#039;&#039; 第一堂课&lt;br /&gt;
* &#039;&#039;&#039;(2026/3/23)&#039;&#039;&#039; 请大家&amp;lt;font color=red &amp;gt;&#039;&#039;&#039;严格遵守[https://jw.nju.edu.cn/_upload/article/files/ab/85/1f49e9e9434dbf23018539b4c338/a550b349-9828-4a05-a5ec-d088b442bee1.pdf 《关于本科生规范使用生成式人工智能工具的指导意见（试行）》]&#039;&#039;&#039;&amp;lt;/font&amp;gt;！&lt;br /&gt;
&lt;br /&gt;
= 课程信息 =&lt;br /&gt;
* &#039;&#039;&#039;任课教师&#039;&#039;&#039;: &lt;br /&gt;
:* [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn 📧]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;助教&#039;&#039;&#039;: &lt;br /&gt;
** 王思齐：[mailto:siqi_wang@smail.nju.edu.cn 📧]&lt;br /&gt;
** 齐世毅：[mailto:1083951258@qq.com 📧]&lt;br /&gt;
* &#039;&#039;&#039;课程时间地点&#039;&#039;&#039;: &lt;br /&gt;
** 周一，9am-12pm，苏教B207&lt;br /&gt;
* &#039;&#039;&#039;答疑时间&#039;&#039;&#039;: 周五, 2pm-5pm, 南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1083465754&lt;br /&gt;
&lt;br /&gt;
= 教学大纲 =&lt;br /&gt;
随着计算机算法理论的不断发展，现代计算机算法的设计与分析大量地使用非初等的数学工具以及非传统的算法思想。“高级算法”这门课程就是面向计算机算法的这一发展趋势而设立的。课程将针对传统算法课程未系统涉及、却在计算机科学各领域的科研和实践中扮演重要角色的高等算法设计思想和算法分析工具进行系统讲授。&lt;br /&gt;
&lt;br /&gt;
课程内容分为五大部分：&lt;br /&gt;
* 基于哈希的大数据算法&lt;br /&gt;
* 哈希表与面向大数据的现代计算场景&lt;br /&gt;
* 测度的集中与处理高维数据&lt;br /&gt;
* 最大流与线性规划&lt;br /&gt;
* 其他重要话题&lt;br /&gt;
&lt;br /&gt;
=== 先修课程 ===&lt;br /&gt;
* 必须：离散数学，概率论，线性代数。&lt;br /&gt;
* 推荐：算法设计与分析。&lt;br /&gt;
&lt;br /&gt;
=== 课程教材 ===&lt;br /&gt;
本门课较为前沿，大部分课程内容还没有进入任何教材。以下教材和参考书仅作为参考。&lt;br /&gt;
* [[高级算法 (Fall 2024) / Course materials|&amp;lt;font size=3&amp;gt;教材和参考书&amp;lt;/font&amp;gt;]]&lt;br /&gt;
更多的内容也可以参考[[#Related_Online_Courses |其他学者的同类课程]]。&lt;br /&gt;
&lt;br /&gt;
=== 成绩 ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考核。最终成绩将由平时作业成绩和期末考核成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力地维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= 课后作业 =&lt;br /&gt;
Late policy: In general, we will accommodate late submission requests ONLY IF you made such requests ahead of time.&lt;br /&gt;
&lt;br /&gt;
请大家&amp;lt;font color=red &amp;gt;&#039;&#039;&#039;严格遵守[https://jw.nju.edu.cn/_upload/article/files/ab/85/1f49e9e9434dbf23018539b4c338/a550b349-9828-4a05-a5ec-d088b442bee1.pdf 《关于本科生规范使用生成式人工智能工具的指导意见（试行）》]&#039;&#039;&#039;&amp;lt;/font&amp;gt;！&lt;br /&gt;
&lt;br /&gt;
*[[高级算法_(Spring_2026)/作业1|作业1]] 请在 2026/04/06 上课前（9am UTC+8）上传到 [https://box.nju.edu.cn/u/d/628bff0686e14f8cbf3f/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_版本.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
&lt;br /&gt;
= 课件及相关阅读资料 =&lt;br /&gt;
# [https://box.nju.edu.cn/f/980d814e4ad64285a640/ Fingerprinting]&lt;br /&gt;
#* Polynomial Identity Testing&lt;br /&gt;
#* Communication Complexity (Equality)&lt;br /&gt;
#* Application: Bipartite Perfect Matching, Checking Matrix Multiplication&lt;br /&gt;
#* Karp-Rabin Algorithm (string-searching), Lipton’s Algorithm (checking identity of multisets)&lt;br /&gt;
# [https://box.nju.edu.cn/f/fcf9e2b3051e443fb324/ Sketching]&lt;br /&gt;
#* Morris&#039; Algorithm&lt;br /&gt;
#** mean trick, median trick&lt;br /&gt;
#* Counting Distinct Elements: min sketch, Flajolet-Martin algorithm, bottom-&amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; algorithm, HyperLogLog&lt;br /&gt;
#* Heavy Hitter &amp;amp; Point Query: count-min sketch&lt;br /&gt;
#* 2nd Frequency Moments Estimator: count sketch&lt;br /&gt;
#** &amp;lt;math&amp;gt;\ell_2&amp;lt;/math&amp;gt; point query with count sketch&lt;br /&gt;
#* Approximate Membership: Bloom filter&lt;br /&gt;
# [https://box.nju.edu.cn/f/40f8ae1bc97c409aa485/ Hashing]&lt;br /&gt;
#* Load Balancing: maximum load, power of two choices&lt;br /&gt;
#* Perfect Hashing: birthday paradox, FKS perfect hashing&lt;br /&gt;
#* Modern Hash Table: Cuckoo hashing, succinct dictionaries&lt;br /&gt;
#* Hashing in Practice: Chernoff Bound with limited independence, tabulation hashing&lt;br /&gt;
&lt;br /&gt;
= Related Online Courses=&lt;br /&gt;
* [https://www.cs.columbia.edu/~andoni/advancedS24/index.html Advanced Algorithms] by Alexandr Andoni at Columbia University.&lt;br /&gt;
* [https://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15854-f21/www/ Advanced Approximation Algorithms] by Anupam Gupta at CMU.&lt;br /&gt;
* [http://people.csail.mit.edu/moitra/854.html Advanced Algorithms] by Ankur Moitra at MIT.&lt;br /&gt;
* [https://6.5210.csail.mit.edu/ Advanced Algorithms] by David Karger at MIT.&lt;br /&gt;
* [https://www.cs.cmu.edu/~dwoodruf/teaching/15851-spring24/ Algorithms for Big Data] by David Woodruff at CMU.&lt;br /&gt;
* [https://www.sketchingbigdata.org/fall20/lec/ Sketching Algorithms] by Jelani Nelson at UC Berkeley.&lt;br /&gt;
* [http://web.stanford.edu/class/cs168/index.html The Modern Algorithmic Toolbox] by Tim Roughgarden and Gregory Valiant at Stanford.&lt;br /&gt;
* [https://www.cs.princeton.edu/courses/archive/fall18/cos521/ Advanced Algorithm Design] by Pravesh Kothari and Christopher Musco at Princeton.&lt;br /&gt;
* [http://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15859-f11/www/ Linear and Semidefinite Programming (Advanced Algorithms)] by Anupam Gupta and Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://www.cs.cmu.edu/~odonnell/papers/cs-theory-toolkit-lecture-notes.pdf CS Theory Toolkit] by Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://cs.uwaterloo.ca/~lapchi/cs860/index.html Eigenvalues and Polynomials] by Lap Chi Lau at University of Waterloo.&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13557</id>
		<title>高级算法 (Spring 2026)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13557"/>
		<updated>2026-03-23T15:21:02Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* 课后作业 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;高级算法 &lt;br /&gt;
&amp;lt;br&amp;gt;Advanced Algorithms&amp;lt;/font&amp;gt;&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =任课教师&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data2   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header3 = &lt;br /&gt;
|label3  = 电子邮件&lt;br /&gt;
|data3   = lmm@nju.edu.cn &lt;br /&gt;
|header4 =&lt;br /&gt;
|label4= 办公室&lt;br /&gt;
|data4= 南雍-西229&lt;br /&gt;
|header5 = &lt;br /&gt;
|label5  = &lt;br /&gt;
|header11 = 课程时间地点&lt;br /&gt;
|label11  = &lt;br /&gt;
|data11   = &lt;br /&gt;
|header12 =&lt;br /&gt;
|label12  = 教室&lt;br /&gt;
|data12   = 周一，9am-12pm&amp;lt;br&amp;gt;苏教B207&lt;br /&gt;
|header13 =&lt;br /&gt;
|label13  = Place&lt;br /&gt;
|data13   = &lt;br /&gt;
|header14 =&lt;br /&gt;
|label14  = 答疑时间&lt;br /&gt;
|data14   = 周五，2pm-5pm&amp;lt;br&amp;gt;南雍-西229&lt;br /&gt;
|header15 = 教材&lt;br /&gt;
|label15  = &lt;br /&gt;
|data15   = &lt;br /&gt;
|header16 =&lt;br /&gt;
|label16  = &lt;br /&gt;
|data16   = [[File:MR-randomized-algorithms.png|border|100px]]&lt;br /&gt;
|header17 =&lt;br /&gt;
|label17  = &lt;br /&gt;
|data17   = Motwani and Raghavan. &amp;lt;br&amp;gt;&#039;&#039;Randomized Algorithms&#039;&#039;.&amp;lt;br&amp;gt; Cambridge Univ Press, 1995.&lt;br /&gt;
|header18 =&lt;br /&gt;
|label18  = &lt;br /&gt;
|data18   = [[File:Approximation_Algorithms.jpg|border|100px]]&lt;br /&gt;
|header19 =&lt;br /&gt;
|label19  = &lt;br /&gt;
|data19   =  Vazirani. &amp;lt;br&amp;gt;&#039;&#039;Approximation Algorithms&#039;&#039;. &amp;lt;br&amp;gt; Springer-Verlag, 2001.&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Advanced Algorithms&#039;&#039; class of spring 2026. Students who take this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= 通知 =&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;(2026/3/2)&#039;&#039;&#039; 第一堂课&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= 课程信息 =&lt;br /&gt;
* &#039;&#039;&#039;任课教师&#039;&#039;&#039;: &lt;br /&gt;
:* [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn 📧]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;助教&#039;&#039;&#039;: &lt;br /&gt;
** 王思齐：[mailto:siqi_wang@smail.nju.edu.cn 📧]&lt;br /&gt;
** 齐世毅：[mailto:1083951258@qq.com 📧]&lt;br /&gt;
* &#039;&#039;&#039;课程时间地点&#039;&#039;&#039;: &lt;br /&gt;
** 周一，9am-12pm，苏教B207&lt;br /&gt;
* &#039;&#039;&#039;答疑时间&#039;&#039;&#039;: 周五, 2pm-5pm, 南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1083465754&lt;br /&gt;
&lt;br /&gt;
= 教学大纲 =&lt;br /&gt;
随着计算机算法理论的不断发展，现代计算机算法的设计与分析大量地使用非初等的数学工具以及非传统的算法思想。“高级算法”这门课程就是面向计算机算法的这一发展趋势而设立的。课程将针对传统算法课程未系统涉及、却在计算机科学各领域的科研和实践中扮演重要角色的高等算法设计思想和算法分析工具进行系统讲授。&lt;br /&gt;
&lt;br /&gt;
课程内容分为五大部分：&lt;br /&gt;
* 基于哈希的大数据算法&lt;br /&gt;
* 哈希表与面向大数据的现代计算场景&lt;br /&gt;
* 测度的集中与处理高维数据&lt;br /&gt;
* 最大流与线性规划&lt;br /&gt;
* 其他重要话题&lt;br /&gt;
&lt;br /&gt;
=== 先修课程 ===&lt;br /&gt;
* 必须：离散数学，概率论，线性代数。&lt;br /&gt;
* 推荐：算法设计与分析。&lt;br /&gt;
&lt;br /&gt;
=== 课程教材 ===&lt;br /&gt;
本门课较为前沿，大部分课程内容还没有进入任何教材。以下教材和参考书仅作为参考。&lt;br /&gt;
* [[高级算法 (Fall 2024) / Course materials|&amp;lt;font size=3&amp;gt;教材和参考书&amp;lt;/font&amp;gt;]]&lt;br /&gt;
更多的内容也可以参考[[#Related_Online_Courses |其他学者的同类课程]]。&lt;br /&gt;
&lt;br /&gt;
=== 成绩 ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考核。最终成绩将由平时作业成绩和期末考核成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力地维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= 课后作业 =&lt;br /&gt;
Late policy: In general, we will accommodate late submission requests ONLY IF you made such requests ahead of time.&lt;br /&gt;
&lt;br /&gt;
请大家&amp;lt;font color=red &amp;gt;&#039;&#039;&#039;严格遵守[https://jw.nju.edu.cn/_upload/article/files/ab/85/1f49e9e9434dbf23018539b4c338/a550b349-9828-4a05-a5ec-d088b442bee1.pdf 《关于本科生规范使用生成式人工智能工具的指导意见（试行）》]&#039;&#039;&#039;&amp;lt;/font&amp;gt;！&lt;br /&gt;
&lt;br /&gt;
*[[高级算法_(Spring_2026)/作业1|作业1]] 请在 2026/04/06 上课前（9am UTC+8）上传到 [https://box.nju.edu.cn/u/d/628bff0686e14f8cbf3f/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_版本.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
&lt;br /&gt;
= 课件及相关阅读资料 =&lt;br /&gt;
# [https://box.nju.edu.cn/f/980d814e4ad64285a640/ Fingerprinting]&lt;br /&gt;
#* Polynomial Identity Testing&lt;br /&gt;
#* Communication Complexity (Equality)&lt;br /&gt;
#* Applications: Bipartite Perfect Matching, Checking Matrix Multiplication&lt;br /&gt;
#* Karp-Rabin Algorithm (string-searching), Lipton’s Algorithm (checking identity of multisets)&lt;br /&gt;
# [https://box.nju.edu.cn/f/fcf9e2b3051e443fb324/ Sketching]&lt;br /&gt;
#* Morris&#039; Algorithm&lt;br /&gt;
#** mean trick, median trick&lt;br /&gt;
#* Counting Distinct Elements: min sketch, Flajolet-Martin algorithm, bottom-&amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; algorithm, HyperLogLog&lt;br /&gt;
#* Heavy Hitter &amp;amp; Point Query: count-min sketch&lt;br /&gt;
#* 2nd Frequency Moments Estimator: count sketch&lt;br /&gt;
#** &amp;lt;math&amp;gt;\ell_2&amp;lt;/math&amp;gt; point query with count sketch&lt;br /&gt;
#* Approximate Membership: Bloom filter&lt;br /&gt;
# [https://box.nju.edu.cn/f/40f8ae1bc97c409aa485/ Hashing]&lt;br /&gt;
#* Load Balancing: maximum load, power of two choices&lt;br /&gt;
#* Perfect Hashing: birthday paradox, FKS perfect hashing&lt;br /&gt;
#* Modern Hash Table: Cuckoo hashing, succinct dictionaries&lt;br /&gt;
#* Hashing in Practice: Chernoff Bound with limited independence, tabulation hashing&lt;br /&gt;
&lt;br /&gt;
= Related Online Courses=&lt;br /&gt;
* [https://www.cs.columbia.edu/~andoni/advancedS24/index.html Advanced Algorithms] by Alexandr Andoni at Columbia University.&lt;br /&gt;
* [https://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15854-f21/www/ Advanced Approximation Algorithms] by Anupam Gupta at CMU.&lt;br /&gt;
* [http://people.csail.mit.edu/moitra/854.html Advanced Algorithms] by Ankur Moitra at MIT.&lt;br /&gt;
* [https://6.5210.csail.mit.edu/ Advanced Algorithms] by David Karger at MIT.&lt;br /&gt;
* [https://www.cs.cmu.edu/~dwoodruf/teaching/15851-spring24/ Algorithms for Big Data] by David Woodruff at CMU.&lt;br /&gt;
* [https://www.sketchingbigdata.org/fall20/lec/ Sketching Algorithms] by Jelani Nelson at UC Berkeley.&lt;br /&gt;
* [http://web.stanford.edu/class/cs168/index.html The Modern Algorithmic Toolbox] by Tim Roughgarden and Gregory Valiant at Stanford.&lt;br /&gt;
* [https://www.cs.princeton.edu/courses/archive/fall18/cos521/ Advanced Algorithm Design] by Pravesh Kothari and Christopher Musco at Princeton.&lt;br /&gt;
* [http://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15859-f11/www/ Linear and Semidefinite Programming (Advanced Algorithms)] by Anupam Gupta and Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://www.cs.cmu.edu/~odonnell/papers/cs-theory-toolkit-lecture-notes.pdf CS Theory Toolkit] by Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://cs.uwaterloo.ca/~lapchi/cs860/index.html Eigenvalues and Polynomials] by Lap Chi Lau at University of Waterloo.&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13554</id>
		<title>高级算法 (Spring 2026)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13554"/>
		<updated>2026-03-19T17:55:33Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* 课件及相关阅读资料 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;高级算法 &lt;br /&gt;
&amp;lt;br&amp;gt;Advanced Algorithms&amp;lt;/font&amp;gt;&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =任课教师&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data2   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header3 = &lt;br /&gt;
|label3  = 电子邮件&lt;br /&gt;
|data3   = lmm@nju.edu.cn &lt;br /&gt;
|header4 =&lt;br /&gt;
|label4= 办公室&lt;br /&gt;
|data4= 南雍-西229&lt;br /&gt;
|header5 = &lt;br /&gt;
|label5  = &lt;br /&gt;
|header11 = 课程时间地点&lt;br /&gt;
|label11  = &lt;br /&gt;
|data11   = &lt;br /&gt;
|header12 =&lt;br /&gt;
|label12  = 教室&lt;br /&gt;
|data12   = 周一，9am-12pm&amp;lt;br&amp;gt;苏教B207&lt;br /&gt;
|header13 =&lt;br /&gt;
|label13  = Place&lt;br /&gt;
|data13   = &lt;br /&gt;
|header14 =&lt;br /&gt;
|label14  = 答疑时间&lt;br /&gt;
|data14   = 周五，2pm-5pm&amp;lt;br&amp;gt;南雍-西229&lt;br /&gt;
|header15 = 教材&lt;br /&gt;
|label15  = &lt;br /&gt;
|data15   = &lt;br /&gt;
|header16 =&lt;br /&gt;
|label16  = &lt;br /&gt;
|data16   = [[File:MR-randomized-algorithms.png|border|100px]]&lt;br /&gt;
|header17 =&lt;br /&gt;
|label17  = &lt;br /&gt;
|data17   = Motwani and Raghavan. &amp;lt;br&amp;gt;&#039;&#039;Randomized Algorithms&#039;&#039;.&amp;lt;br&amp;gt; Cambridge Univ Press, 1995.&lt;br /&gt;
|header18 =&lt;br /&gt;
|label18  = &lt;br /&gt;
|data18   = [[File:Approximation_Algorithms.jpg|border|100px]]&lt;br /&gt;
|header19 =&lt;br /&gt;
|label19  = &lt;br /&gt;
|data19   =  Vazirani. &amp;lt;br&amp;gt;&#039;&#039;Approximation Algorithms&#039;&#039;. &amp;lt;br&amp;gt; Springer-Verlag, 2001.&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Advanced Algorithms&#039;&#039; class of spring 2026. Students who take this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= 通知 =&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;(2026/3/2)&#039;&#039;&#039; 第一堂课&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= 课程信息 =&lt;br /&gt;
* &#039;&#039;&#039;任课教师&#039;&#039;&#039;: &lt;br /&gt;
:* [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn 📧]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;助教&#039;&#039;&#039;: &lt;br /&gt;
** 王思齐：[mailto:siqi_wang@smail.nju.edu.cn 📧]&lt;br /&gt;
** 齐世毅：[mailto:1083951258@qq.com 📧]&lt;br /&gt;
* &#039;&#039;&#039;课程时间地点&#039;&#039;&#039;: &lt;br /&gt;
** 周一，9am-12pm，苏教B207&lt;br /&gt;
* &#039;&#039;&#039;答疑时间&#039;&#039;&#039;: 周五, 2pm-5pm, 南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1083465754&lt;br /&gt;
&lt;br /&gt;
= 教学大纲 =&lt;br /&gt;
随着计算机算法理论的不断发展，现代计算机算法的设计与分析大量地使用非初等的数学工具以及非传统的算法思想。“高级算法”这门课程就是面向计算机算法的这一发展趋势而设立的。课程将针对传统算法课程未系统涉及、却在计算机科学各领域的科研和实践中扮演重要角色的高等算法设计思想和算法分析工具进行系统讲授。&lt;br /&gt;
&lt;br /&gt;
课程内容分为五大部分：&lt;br /&gt;
* 基于哈希的大数据算法&lt;br /&gt;
* 哈希表与面向大数据的现代计算场景&lt;br /&gt;
* 测度的集中与处理高维数据&lt;br /&gt;
* 最大流与线性规划&lt;br /&gt;
* 其他重要话题&lt;br /&gt;
&lt;br /&gt;
=== 先修课程 ===&lt;br /&gt;
* 必须：离散数学，概率论，线性代数。&lt;br /&gt;
* 推荐：算法设计与分析。&lt;br /&gt;
&lt;br /&gt;
=== 课程教材 ===&lt;br /&gt;
本门课较为前沿，大部分课程内容还没有进入任何教材。以下教材和参考书仅作为参考。&lt;br /&gt;
* [[高级算法 (Fall 2024) / Course materials|&amp;lt;font size=3&amp;gt;教材和参考书&amp;lt;/font&amp;gt;]]&lt;br /&gt;
更多的内容也可以参考[[#Related_Online_Courses |其他学者的同类课程]]。&lt;br /&gt;
&lt;br /&gt;
=== 成绩 ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考核。最终成绩将由平时作业成绩和期末考核成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力地维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= 课后作业 =&lt;br /&gt;
Late policy: In general, we will accommodate late submission requests ONLY IF you made such requests ahead of time.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
*[[高级算法_(Spring_2026)/作业1|作业1]] 请在 2026/04/06 上课前（9am UTC+8）上传到 [https://box.nju.edu.cn/u/d/628bff0686e14f8cbf3f/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_版本.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
&lt;br /&gt;
= 课件及相关阅读资料 =&lt;br /&gt;
# [https://box.nju.edu.cn/f/980d814e4ad64285a640/ Fingerprinting]&lt;br /&gt;
#* Polynomial Identity Testing&lt;br /&gt;
#* Communication Complexity (Equality)&lt;br /&gt;
#* Applications: Bipartite Perfect Matching, Checking Matrix Multiplication&lt;br /&gt;
#* Karp-Rabin Algorithm (string-searching), Lipton’s Algorithm (checking identity of multisets)&lt;br /&gt;
# [https://box.nju.edu.cn/f/fcf9e2b3051e443fb324/ Sketching]&lt;br /&gt;
#* Morris&#039; Algorithm&lt;br /&gt;
#** mean trick, median trick&lt;br /&gt;
#* Counting Distinct Elements: min sketch, Flajolet-Martin algorithm, bottom-&amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; algorithm, HyperLogLog&lt;br /&gt;
#* Heavy Hitter &amp;amp; Point Query: count-min sketch&lt;br /&gt;
#* 2nd Frequency Moments Estimator: count sketch&lt;br /&gt;
#** &amp;lt;math&amp;gt;\ell_2&amp;lt;/math&amp;gt; point query with count sketch&lt;br /&gt;
#* Approximate Membership: Bloom filter&lt;br /&gt;
# [https://box.nju.edu.cn/f/40f8ae1bc97c409aa485/ Hashing]&lt;br /&gt;
#* Load Balancing: maximum load, power of two choices&lt;br /&gt;
#* Perfect Hashing: birthday paradox, FKS perfect hashing&lt;br /&gt;
#* Modern Hash Table: Cuckoo hashing, succinct dictionaries&lt;br /&gt;
#* Hashing in Practice: Chernoff Bound with limited independence, tabulation hashing&lt;br /&gt;
&lt;br /&gt;
= Related Online Courses=&lt;br /&gt;
* [https://www.cs.columbia.edu/~andoni/advancedS24/index.html Advanced Algorithms] by Alexandr Andoni at Columbia University.&lt;br /&gt;
* [https://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15854-f21/www/ Advanced Approximation Algorithms] by Anupam Gupta at CMU.&lt;br /&gt;
* [http://people.csail.mit.edu/moitra/854.html Advanced Algorithms] by Ankur Moitra at MIT.&lt;br /&gt;
* [https://6.5210.csail.mit.edu/ Advanced Algorithms] by David Karger at MIT.&lt;br /&gt;
* [https://www.cs.cmu.edu/~dwoodruf/teaching/15851-spring24/ Algorithms for Big Data] by David Woodruff at CMU.&lt;br /&gt;
* [https://www.sketchingbigdata.org/fall20/lec/ Sketching Algorithms] by Jelani Nelson at UC Berkeley.&lt;br /&gt;
* [http://web.stanford.edu/class/cs168/index.html The Modern Algorithmic Toolbox] by Tim Roughgarden and Gregory Valiant at Stanford.&lt;br /&gt;
* [https://www.cs.princeton.edu/courses/archive/fall18/cos521/ Advanced Algorithm Design] by Pravesh Kothari and Christopher Musco at Princeton.&lt;br /&gt;
* [http://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15859-f11/www/ Linear and Semidefinite Programming (Advanced Algorithms)] by Anupam Gupta and Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://www.cs.cmu.edu/~odonnell/papers/cs-theory-toolkit-lecture-notes.pdf CS Theory Toolkit] by Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://cs.uwaterloo.ca/~lapchi/cs860/index.html Eigenvalues and Polynomials] by Lap Chi Lau at University of Waterloo.&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13553</id>
		<title>高级算法 (Spring 2026)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13553"/>
		<updated>2026-03-19T17:38:50Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* 课程教材 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;高级算法 &lt;br /&gt;
&amp;lt;br&amp;gt;Advanced Algorithms&amp;lt;/font&amp;gt;&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =任课教师&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data2   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header3 = &lt;br /&gt;
|label3  = 电子邮件&lt;br /&gt;
|data3   = lmm@nju.edu.cn &lt;br /&gt;
|header4 =&lt;br /&gt;
|label4= 办公室&lt;br /&gt;
|data4= 南雍-西229&lt;br /&gt;
|header5 = &lt;br /&gt;
|label5  = &lt;br /&gt;
|header11 = 课程时间地点&lt;br /&gt;
|label11  = &lt;br /&gt;
|data11   = &lt;br /&gt;
|header12 =&lt;br /&gt;
|label12  = 教室&lt;br /&gt;
|data12   = 周一，9am-12pm&amp;lt;br&amp;gt;苏教B207&lt;br /&gt;
|header13 =&lt;br /&gt;
|label13  = Place&lt;br /&gt;
|data13   = &lt;br /&gt;
|header14 =&lt;br /&gt;
|label14  = 答疑时间&lt;br /&gt;
|data14   = 周五，2pm-5pm&amp;lt;br&amp;gt;南雍-西229&lt;br /&gt;
|header15 = 教材&lt;br /&gt;
|label15  = &lt;br /&gt;
|data15   = &lt;br /&gt;
|header16 =&lt;br /&gt;
|label16  = &lt;br /&gt;
|data16   = [[File:MR-randomized-algorithms.png|border|100px]]&lt;br /&gt;
|header17 =&lt;br /&gt;
|label17  = &lt;br /&gt;
|data17   = Motwani and Raghavan. &amp;lt;br&amp;gt;&#039;&#039;Randomized Algorithms&#039;&#039;.&amp;lt;br&amp;gt; Cambridge Univ Press, 1995.&lt;br /&gt;
|header18 =&lt;br /&gt;
|label18  = &lt;br /&gt;
|data18   = [[File:Approximation_Algorithms.jpg|border|100px]]&lt;br /&gt;
|header19 =&lt;br /&gt;
|label19  = &lt;br /&gt;
|data19   =  Vazirani. &amp;lt;br&amp;gt;&#039;&#039;Approximation Algorithms&#039;&#039;. &amp;lt;br&amp;gt; Springer-Verlag, 2001.&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Advanced Algorithms&#039;&#039; class of spring 2026. Students who take this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= 通知 =&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;(2026/3/2)&#039;&#039;&#039; 第一堂课&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= 课程信息 =&lt;br /&gt;
* &#039;&#039;&#039;任课教师&#039;&#039;&#039;: &lt;br /&gt;
:* [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn 📧]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;助教&#039;&#039;&#039;: &lt;br /&gt;
** 王思齐：[mailto:siqi_wang@smail.nju.edu.cn 📧]&lt;br /&gt;
** 齐世毅：[mailto:1083951258@qq.com 📧]&lt;br /&gt;
* &#039;&#039;&#039;课程时间地点&#039;&#039;&#039;: &lt;br /&gt;
** 周一，9am-12pm，苏教B207&lt;br /&gt;
* &#039;&#039;&#039;答疑时间&#039;&#039;&#039;: 周五, 2pm-5pm, 南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1083465754&lt;br /&gt;
&lt;br /&gt;
= 教学大纲 =&lt;br /&gt;
随着计算机算法理论的不断发展，现代计算机算法的设计与分析大量地使用非初等的数学工具以及非传统的算法思想。“高级算法”这门课程就是面向计算机算法的这一发展趋势而设立的。课程将针对传统算法课程未系统涉及、却在计算机科学各领域的科研和实践中扮演重要角色的高等算法设计思想和算法分析工具进行系统讲授。&lt;br /&gt;
&lt;br /&gt;
课程内容分为五大部分：&lt;br /&gt;
* 基于哈希的大数据算法&lt;br /&gt;
* 哈希表与面向大数据的现代计算场景&lt;br /&gt;
* 测度的集中与处理高维数据&lt;br /&gt;
* 最大流与线性规划&lt;br /&gt;
* 其他重要话题&lt;br /&gt;
&lt;br /&gt;
=== 先修课程 ===&lt;br /&gt;
* 必须：离散数学，概率论，线性代数。&lt;br /&gt;
* 推荐：算法设计与分析。&lt;br /&gt;
&lt;br /&gt;
=== 课程教材 ===&lt;br /&gt;
本门课较为前沿，大部分课程内容还没有进入任何教材。以下教材和参考书仅作为参考。&lt;br /&gt;
* [[高级算法 (Fall 2024) / Course materials|&amp;lt;font size=3&amp;gt;教材和参考书&amp;lt;/font&amp;gt;]]&lt;br /&gt;
更多的内容也可以参考[[#Related_Online_Courses |其他学者的同类课程]]。&lt;br /&gt;
&lt;br /&gt;
=== 成绩 ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考核。最终成绩将由平时作业成绩和期末考核成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力地维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= 课后作业 =&lt;br /&gt;
Late policy: In general, we will accommodate late submission requests ONLY IF you made such requests ahead of time.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
*[[高级算法_(Spring_2026)/作业1|作业1]] 请在 2026/04/06 上课前（9am UTC+8）上传到 [https://box.nju.edu.cn/u/d/628bff0686e14f8cbf3f/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_版本.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
&lt;br /&gt;
= 课件及相关阅读资料 =&lt;br /&gt;
# [https://box.nju.edu.cn/f/980d814e4ad64285a640/ Fingerprinting]&lt;br /&gt;
#* Polynomial Identity Testing&lt;br /&gt;
#* Communication Complexity (Equality)&lt;br /&gt;
#* Applications: Bipartite Perfect Matching, Checking Matrix Multiplication&lt;br /&gt;
#* Karp-Rabin Algorithm (string-searching), Lipton’s Algorithm (checking identity of multisets)&lt;br /&gt;
# [https://box.nju.edu.cn/f/fcf9e2b3051e443fb324/ Sketching]&lt;br /&gt;
#* Morris&#039; Algorithm&lt;br /&gt;
#** mean trick, median trick&lt;br /&gt;
#* Counting Distinct Elements: min sketch, Flajolet-Martin algorithm, bottom-&amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; algorithm, HyperLogLog&lt;br /&gt;
#* Heavy Hitter &amp;amp; Point Query: count-min sketch&lt;br /&gt;
#* 2nd Frequency Moments Estimator: count sketch&lt;br /&gt;
#** &amp;lt;math&amp;gt;\ell_2&amp;lt;/math&amp;gt; point query with count sketch&lt;br /&gt;
#* Approximate Membership: Bloom filter&lt;br /&gt;
&lt;br /&gt;
= Related Online Courses=&lt;br /&gt;
* [https://www.cs.columbia.edu/~andoni/advancedS24/index.html Advanced Algorithms] by Alexandr Andoni at Columbia University.&lt;br /&gt;
* [https://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15854-f21/www/ Advanced Approximation Algorithms] by Anupam Gupta at CMU.&lt;br /&gt;
* [http://people.csail.mit.edu/moitra/854.html Advanced Algorithms] by Ankur Moitra at MIT.&lt;br /&gt;
* [https://6.5210.csail.mit.edu/ Advanced Algorithms] by David Karger at MIT.&lt;br /&gt;
* [https://www.cs.cmu.edu/~dwoodruf/teaching/15851-spring24/ Algorithms for Big Data] by David Woodruff at CMU.&lt;br /&gt;
* [https://www.sketchingbigdata.org/fall20/lec/ Sketching Algorithms] by Jelani Nelson at UC Berkeley.&lt;br /&gt;
* [http://web.stanford.edu/class/cs168/index.html The Modern Algorithmic Toolbox] by Tim Roughgarden and Gregory Valiant at Stanford.&lt;br /&gt;
* [https://www.cs.princeton.edu/courses/archive/fall18/cos521/ Advanced Algorithm Design] by Pravesh Kothari and Christopher Musco at Princeton.&lt;br /&gt;
* [http://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15859-f11/www/ Linear and Semidefinite Programming (Advanced Algorithms)] by Anupam Gupta and Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://www.cs.cmu.edu/~odonnell/papers/cs-theory-toolkit-lecture-notes.pdf CS Theory Toolkit] by Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://cs.uwaterloo.ca/~lapchi/cs860/index.html Eigenvalues and Polynomials] by Lap Chi Lau at University of Waterloo.&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13552</id>
		<title>高级算法 (Spring 2026)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13552"/>
		<updated>2026-03-19T17:30:26Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* 课后作业 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;高级算法 &lt;br /&gt;
&amp;lt;br&amp;gt;Advanced Algorithms&amp;lt;/font&amp;gt;&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =任课教师&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data2   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header3 = &lt;br /&gt;
|label3  = 电子邮件&lt;br /&gt;
|data3   = lmm@nju.edu.cn &lt;br /&gt;
|header4 =&lt;br /&gt;
|label4= 办公室&lt;br /&gt;
|data4= 南雍-西229&lt;br /&gt;
|header5 = &lt;br /&gt;
|label5  = &lt;br /&gt;
|header11 = 课程时间地点&lt;br /&gt;
|label11  = &lt;br /&gt;
|data11   = &lt;br /&gt;
|header12 =&lt;br /&gt;
|label12  = 教室&lt;br /&gt;
|data12   = 周一，9am-12pm&amp;lt;br&amp;gt;苏教B207&lt;br /&gt;
|header13 =&lt;br /&gt;
|label13  = Place&lt;br /&gt;
|data13   = &lt;br /&gt;
|header14 =&lt;br /&gt;
|label14  = 答疑时间&lt;br /&gt;
|data14   = 周五，2pm-5pm&amp;lt;br&amp;gt;南雍-西229&lt;br /&gt;
|header15 = 教材&lt;br /&gt;
|label15  = &lt;br /&gt;
|data15   = &lt;br /&gt;
|header16 =&lt;br /&gt;
|label16  = &lt;br /&gt;
|data16   = [[File:MR-randomized-algorithms.png|border|100px]]&lt;br /&gt;
|header17 =&lt;br /&gt;
|label17  = &lt;br /&gt;
|data17   = Motwani and Raghavan. &amp;lt;br&amp;gt;&#039;&#039;Randomized Algorithms&#039;&#039;.&amp;lt;br&amp;gt; Cambridge Univ Press, 1995.&lt;br /&gt;
|header18 =&lt;br /&gt;
|label18  = &lt;br /&gt;
|data18   = [[File:Approximation_Algorithms.jpg|border|100px]]&lt;br /&gt;
|header19 =&lt;br /&gt;
|label19  = &lt;br /&gt;
|data19   =  Vazirani. &amp;lt;br&amp;gt;&#039;&#039;Approximation Algorithms&#039;&#039;. &amp;lt;br&amp;gt; Springer-Verlag, 2001.&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Advanced Algorithms&#039;&#039; class of spring 2026. Students who take this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= 通知 =&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;(2026/3/2)&#039;&#039;&#039; 第一堂课&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= 课程信息 =&lt;br /&gt;
* &#039;&#039;&#039;任课教师&#039;&#039;&#039;: &lt;br /&gt;
:* [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn 📧]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;助教&#039;&#039;&#039;: &lt;br /&gt;
** 王思齐：[mailto:siqi_wang@smail.nju.edu.cn 📧]&lt;br /&gt;
** 齐世毅：[mailto:1083951258@qq.com 📧]&lt;br /&gt;
* &#039;&#039;&#039;课程时间地点&#039;&#039;&#039;: &lt;br /&gt;
** 周一，9am-12pm，苏教B207&lt;br /&gt;
* &#039;&#039;&#039;答疑时间&#039;&#039;&#039;: 周五, 2pm-5pm, 南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1083465754&lt;br /&gt;
&lt;br /&gt;
= 教学大纲 =&lt;br /&gt;
随着计算机算法理论的不断发展，现代计算机算法的设计与分析大量地使用非初等的数学工具以及非传统的算法思想。“高级算法”这门课程就是面向计算机算法的这一发展趋势而设立的。课程将针对传统算法课程未系统涉及、却在计算机科学各领域的科研和实践中扮演重要角色的高等算法设计思想和算法分析工具进行系统讲授。&lt;br /&gt;
&lt;br /&gt;
课程内容分为五大部分：&lt;br /&gt;
* 基于哈希的大数据算法&lt;br /&gt;
* 哈希表与面向大数据的现代计算场景&lt;br /&gt;
* 测度的集中与处理高维数据&lt;br /&gt;
* 最大流与线性规划&lt;br /&gt;
* 其他重要话题&lt;br /&gt;
&lt;br /&gt;
=== 先修课程 ===&lt;br /&gt;
* 必须：离散数学，概率论，线性代数。&lt;br /&gt;
* 推荐：算法设计与分析。&lt;br /&gt;
&lt;br /&gt;
=== 课程教材 ===&lt;br /&gt;
本门课较为前沿，大部分课程内容还没有进入任何教材。以下教材和参考书仅作为参考。&lt;br /&gt;
* [[高级算法 (Fall 2024) / Course materials|&amp;lt;font size=3&amp;gt;教材和参考书&amp;lt;/font&amp;gt;]]&lt;br /&gt;
&lt;br /&gt;
=== 成绩 ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考核。最终成绩将由平时作业成绩和期末考核成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力地维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= 课后作业 =&lt;br /&gt;
Late policy: In general, we will accommodate late submission requests ONLY IF you make the request ahead of time.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
*[[高级算法_(Spring_2026)/作业1|作业1]] 请在 2026/04/06 上课前（9am UTC+8）上传到 [https://box.nju.edu.cn/u/d/628bff0686e14f8cbf3f/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_版本.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
&lt;br /&gt;
= 课件及相关阅读资料 =&lt;br /&gt;
# [https://box.nju.edu.cn/f/980d814e4ad64285a640/ Fingerprinting]&lt;br /&gt;
#* Polynomial Identity Testing&lt;br /&gt;
#* Communication Complexity (Equality)&lt;br /&gt;
#* Application: Bipartite Perfect Matching, Checking Matrix Multiplication&lt;br /&gt;
#* Karp-Rabin Algorithm (string-searching), Lipton’s Algorithm (checking identity of multisets)&lt;br /&gt;
# [https://box.nju.edu.cn/f/fcf9e2b3051e443fb324/ Sketching]&lt;br /&gt;
#* Morris&#039; Algorithm&lt;br /&gt;
#** mean trick, median trick&lt;br /&gt;
#* Counting Distinct Elements: min sketch, Flajolet-Martin algorithm, bottom-&amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; algorithm, HyperLogLog&lt;br /&gt;
#* Heavy Hitter &amp;amp; Point Query: count-min sketch&lt;br /&gt;
#* 2nd Frequency Moments Estimator: count sketch&lt;br /&gt;
#** &amp;lt;math&amp;gt;\ell_2&amp;lt;/math&amp;gt; point query with count sketch&lt;br /&gt;
#* Approximate Membership: Bloom filter&lt;br /&gt;
&lt;br /&gt;
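A minimal runnable sketch of the Morris counter and the median trick named in the lecture list above (function names, parameters, and the base 2 are our own illustration, not taken from the slides):

```python
import random
import statistics

def morris_count(n_events, rng):
    # Morris' approximate counter: keep x roughly equal to log2(count);
    # on each event, increment x with probability 2 ** (-x),
    # and estimate the count as 2 ** x - 1.
    x = 0
    for _ in range(n_events):
        if 1.0 > rng.random() * (2 ** x):
            x += 1
    return 2 ** x - 1

def median_of_counters(n_events, copies=101, seed=0):
    # Median trick: the median of independent copies concentrates,
    # shrinking the failure probability of a constant-factor estimator.
    rng = random.Random(seed)
    return statistics.median(morris_count(n_events, rng) for _ in range(copies))
```

With `n_events = 10000` the median estimate is typically within a small constant factor of the true count; replacing the base 2 by a smaller base of the form 1 + α trades space for accuracy, as in the homework question on Morris' counter.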
= Related Online Courses=&lt;br /&gt;
* [https://www.cs.columbia.edu/~andoni/advancedS24/index.html Advanced Algorithms] by Alexandr Andoni at Columbia University.&lt;br /&gt;
* [https://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15854-f21/www/ Advanced Approximation Algorithms] by Anupam Gupta at CMU.&lt;br /&gt;
* [http://people.csail.mit.edu/moitra/854.html Advanced Algorithms] by Ankur Moitra at MIT.&lt;br /&gt;
* [https://6.5210.csail.mit.edu/ Advanced Algorithms] by David Karger at MIT.&lt;br /&gt;
* [https://www.cs.cmu.edu/~dwoodruf/teaching/15851-spring24/ Algorithms for Big Data] by David Woodruff at CMU.&lt;br /&gt;
* [https://www.sketchingbigdata.org/fall20/lec/ Sketching Algorithms] by Jelani Nelson at UC Berkeley.&lt;br /&gt;
* [http://web.stanford.edu/class/cs168/index.html The Modern Algorithmic Toolbox] by Tim Roughgarden and Gregory Valiant at Stanford.&lt;br /&gt;
* [https://www.cs.princeton.edu/courses/archive/fall18/cos521/ Advanced Algorithm Design] by Pravesh Kothari and Christopher Musco at Princeton.&lt;br /&gt;
* [http://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15859-f11/www/ Linear and Semidefinite Programming (Advanced Algorithms)] by Anupam Gupta and Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://www.cs.cmu.edu/~odonnell/papers/cs-theory-toolkit-lecture-notes.pdf CS Theory Toolkit] by Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://cs.uwaterloo.ca/~lapchi/cs860/index.html Eigenvalues and Polynomials] by Lap Chi Lau at University of Waterloo.&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A1&amp;diff=13551</id>
		<title>高级算法 (Spring 2026)/作业1</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A1&amp;diff=13551"/>
		<updated>2026-03-19T17:27:21Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*每道题目的解答都要有完整的解题过程，中英文不限。&lt;br /&gt;
*我们推荐大家使用LaTeX, markdown等对作业进行排版。&lt;br /&gt;
*没有条件的同学可以用纸笔完成作业之后拍照。&lt;br /&gt;
*本门课的所有作业中如果出现了课程中（包括课件中和作业题目中）的算法或者概念或者符号，则&amp;lt;font color=&#039;red&#039;&amp;gt;&#039;&#039;&#039;禁止使用自己发明的说法或者符号&#039;&#039;&#039;&amp;lt;/font&amp;gt;，必须与课程内容保持一致。&lt;br /&gt;
&lt;br /&gt;
# Karp-Rabin Algorithm 使用一个 equality testing 算法作为黑盒子以达成 &amp;lt;math&amp;gt;O(m+n)&amp;lt;/math&amp;gt; 的时间复杂度。分别讨论用本门课中介绍的其他 equality testing 算法替代 Karp-Rabin Algorithm 的 equality testing 算法能否达成 &amp;lt;math&amp;gt;O(m+n)&amp;lt;/math&amp;gt; 的时间复杂度，并自己找到一个没有在本门课中出现的 equality testing 算法（你需要使用正规 citation 指出该算法的来源），尝试用它替换 Karp-Rabin Algorithm 的 equality testing 算法，你需要尽可能完备地说明为什么能或者不可能达成 &amp;lt;math&amp;gt;O(m+n)&amp;lt;/math&amp;gt; 的时间复杂度。&lt;br /&gt;
# 解释 Morris’ counter 背后的直觉。尝试应用 &amp;lt;math&amp;gt;(1+\alpha)^{-X}&amp;lt;/math&amp;gt; 的思路得到更好的结果。&lt;br /&gt;
# 使用 马尔科夫不等式 和 切比雪夫不等式 完成 slides 中 Bottom-k algorithm 中未完成的分析（即 &amp;lt;math&amp;gt;\Pr[V\ge k]&amp;lt;/math&amp;gt; 和 &amp;lt;math&amp;gt;\Pr[W\le k]&amp;lt;/math&amp;gt;）。&lt;br /&gt;
# 使用 median trick 来优化 min sketch、Flajolet-Martin Algorithm 和 Bottom-k algorithm。&lt;br /&gt;
# 第二章中哪些算法是 linear sketch，哪些不是？为什么？如果由你来改进每个算法以提高它们的并行性，你会怎么做？&lt;br /&gt;
# 完成通过分治思想来借助point query计算Heavy hitter的算法细节（细化到伪代码级别），并说明你的算法的正确性。&lt;br /&gt;
# 设计一个支持删除的 filter （即设计一个可以维护多重集的数据结构），并分析其复杂度和正确性。（复杂度越低，得分越高。提醒：每一个微观结构都是有空间代价的，比如如果你使用了 n 个 int，那么你会使用 4n bytes 的空间）&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A1&amp;diff=13550</id>
		<title>高级算法 (Spring 2026)/作业1</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A1&amp;diff=13550"/>
		<updated>2026-03-19T17:11:08Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*每道题目的解答都要有完整的解题过程，中英文不限。&lt;br /&gt;
*我们推荐大家使用LaTeX, markdown等对作业进行排版。&lt;br /&gt;
*没有条件的同学可以用纸笔完成作业之后拍照。&lt;br /&gt;
*本门课的所有作业中如果出现了课程中（包括课件中和作业题目中）的算法或者概念或者符号，则&amp;lt;font color=&#039;red&#039;&amp;gt;&#039;&#039;&#039;禁止使用自己发明的说法或者符号&#039;&#039;&#039;&amp;lt;/font&amp;gt;，必须与课程内容保持一致。&lt;br /&gt;
&lt;br /&gt;
# Karp-Rabin Algorithm 使用一个 equality testing 算法作为黑盒子以达成 &amp;lt;math&amp;gt;O(m+n)&amp;lt;/math&amp;gt; 的时间复杂度。分别讨论用本门课中介绍的其他 equality testing 算法替代 Karp-Rabin Algorithm 的 equality testing 算法能否达成 &amp;lt;math&amp;gt;O(m+n)&amp;lt;/math&amp;gt; 的时间复杂度，并自己找到一个没有在本门课中出现的 equality testing 算法（你需要使用正规 citation 指出该算法的来源），尝试用它替换 Karp-Rabin Algorithm 的 equality testing 算法，你需要尽可能完备地说明为什么能或者不可能达成 &amp;lt;math&amp;gt;O(m+n)&amp;lt;/math&amp;gt; 的时间复杂度。&lt;br /&gt;
# 解释 Morris’ counter 背后的直觉。尝试应用 &amp;lt;math&amp;gt;(1+\alpha)^{-X}&amp;lt;/math&amp;gt; 的思路得到更好的结果。&lt;br /&gt;
# 使用 马尔科夫不等式 和 切比雪夫不等式 完成 slides 中 Bottom-k algorithm 中未完成的分析（即 &amp;lt;math&amp;gt;\Pr[V\ge k]&amp;lt;/math&amp;gt; 和 &amp;lt;math&amp;gt;\Pr[W\le k]&amp;lt;/math&amp;gt;）。&lt;br /&gt;
# 使用 median trick 来优化 min sketch 和 Flajolet-Martin Algorithm 和 Bottom-k algorithm。&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A1&amp;diff=13505</id>
		<title>高级算法 (Spring 2026)/作业1</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A1&amp;diff=13505"/>
		<updated>2026-03-15T08:38:17Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*每道题目的解答都要有完整的解题过程，中英文不限。&lt;br /&gt;
*我们推荐大家使用LaTeX, markdown等对作业进行排版。&lt;br /&gt;
*没有条件的同学可以用纸笔完成作业之后拍照。&lt;br /&gt;
*本门课的所有作业中如果出现了课程中（包括课件中和作业题目中）的算法或者概念或者符号，则&amp;lt;font color=&#039;red&#039;&amp;gt;&#039;&#039;&#039;禁止使用自己发明的说法或者符号&#039;&#039;&#039;&amp;lt;/font&amp;gt;，必须与课程内容保持一致。&lt;br /&gt;
&lt;br /&gt;
# Karp-Rabin Algorithm 使用一个 equality testing 算法作为黑盒子以达成 &amp;lt;math&amp;gt;O(m+n)&amp;lt;/math&amp;gt; 的时间复杂度。分别讨论用本门课中介绍的其他 equality testing 算法替代 Karp-Rabin Algorithm 的 equality testing 算法能否达成 &amp;lt;math&amp;gt;O(m+n)&amp;lt;/math&amp;gt; 的时间复杂度，并自己找到一个没有在本门课中出现的 equality testing 算法（你需要使用正规 citation 指出该算法的来源），尝试用它替换 Karp-Rabin Algorithm 的 equality testing 算法，你需要尽可能完备地说明为什么能或者不可能达成 &amp;lt;math&amp;gt;O(m+n)&amp;lt;/math&amp;gt; 的时间复杂度。&lt;br /&gt;
# 解释 Morris’ counter 背后的直觉。尝试应用 &amp;lt;math&amp;gt;(1+\alpha)^{-X}&amp;lt;/math&amp;gt; 的思路得到更好的结果。&lt;br /&gt;
# 使用 median trick 来优化 min sketch、Flajolet-Martin Algorithm 和 Bottom-k algorithm。&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A1&amp;diff=13504</id>
		<title>高级算法 (Spring 2026)/作业1</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A1&amp;diff=13504"/>
		<updated>2026-03-15T08:37:25Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*每道题目的解答都要有完整的解题过程，中英文不限。&lt;br /&gt;
*我们推荐大家使用LaTeX, markdown等对作业进行排版。&lt;br /&gt;
*没有条件的同学可以用纸笔完成作业之后拍照。&lt;br /&gt;
*本门课的所有作业中如果出现了课程中（包括课件中和作业题目中）的算法或者概念或者符号，则&amp;lt;font color=&#039;red&#039;&amp;gt;&#039;&#039;&#039;禁止使用自己发明的说法或者符号&#039;&#039;&#039;&amp;lt;/font&amp;gt;，必须与课程内容保持一致。&lt;br /&gt;
&lt;br /&gt;
# Karp-Rabin Algorithm 使用一个 equality testing 算法作为黑盒子以达成 &amp;lt;math&amp;gt;O(m+n)&amp;lt;/math&amp;gt; 的时间复杂度。分别讨论用本门课中介绍的其他 equality testing 算法替代 Karp-Rabin Algorithm 的 equality testing 算法能否达成 &amp;lt;math&amp;gt;O(m+n)&amp;lt;/math&amp;gt; 的时间复杂度，并自己找到一个没有在本门课中出现的 equality testing 算法（你需要使用正规 citation 指出该算法的来源），尝试用它替换 Karp-Rabin Algorithm 的 equality testing 算法，你需要尽可能完备地说明为什么能或者不可能达成 &amp;lt;math&amp;gt;O(m+n)&amp;lt;/math&amp;gt; 的时间复杂度。&lt;br /&gt;
# 解释 Morris’ counter 背后的直觉。尝试应用 &amp;lt;math&amp;gt;(1+\alpha)^{-X}&amp;lt;/math&amp;gt; 的思路得到更好的结果。&lt;br /&gt;
# 使用 median trick 来优化 min sketch 和 Flajolet-Martin Algorithm。&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A1&amp;diff=13494</id>
		<title>高级算法 (Spring 2026)/作业1</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A1&amp;diff=13494"/>
		<updated>2026-03-09T04:42:39Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*每道题目的解答都要有完整的解题过程，中英文不限。&lt;br /&gt;
*我们推荐大家使用LaTeX, markdown等对作业进行排版。&lt;br /&gt;
*没有条件的同学可以用纸笔完成作业之后拍照。&lt;br /&gt;
&lt;br /&gt;
# Karp-Rabin Algorithm 使用一个 equality testing 算法作为黑盒子以达成 &amp;lt;math&amp;gt;O(m+n)&amp;lt;/math&amp;gt; 的时间复杂度。分别讨论用本门课中介绍的其他 equality testing 算法替代 Karp-Rabin Algorithm 的 equality testing 算法能否达成 &amp;lt;math&amp;gt;O(m+n)&amp;lt;/math&amp;gt; 的时间复杂度，并自己找到一个没有在本门课中出现的 equality testing 算法（你需要使用正规 citation 指出该算法的来源），尝试用它替换 Karp-Rabin Algorithm 的 equality testing 算法，你需要尽可能完备地说明为什么能或者不可能达成 &amp;lt;math&amp;gt;O(m+n)&amp;lt;/math&amp;gt; 的时间复杂度。&lt;br /&gt;
# 解释 Morris’ counter 背后的直觉。尝试应用 &amp;lt;math&amp;gt;(1+\alpha)^{-X}&amp;lt;/math&amp;gt; 的思路得到更好的结果。&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13492</id>
		<title>高级算法 (Spring 2026)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13492"/>
		<updated>2026-03-08T12:51:00Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* 课件及相关阅读资料 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;高级算法 &lt;br /&gt;
&amp;lt;br&amp;gt;Advanced Algorithms&amp;lt;/font&amp;gt;&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =任课教师&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data2   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header3 = &lt;br /&gt;
|label3  = 电子邮件&lt;br /&gt;
|data3   = lmm@nju.edu.cn &lt;br /&gt;
|header4 =&lt;br /&gt;
|label4= 办公室&lt;br /&gt;
|data4= 南雍-西229&lt;br /&gt;
|header5 = &lt;br /&gt;
|label5  = &lt;br /&gt;
|header11 = 课程时间地点&lt;br /&gt;
|label11  = &lt;br /&gt;
|data11   = &lt;br /&gt;
|header12 =&lt;br /&gt;
|label12  = 教室&lt;br /&gt;
|data12   = 周一，9am-12pm&amp;lt;br&amp;gt;苏教B207&lt;br /&gt;
|header13 =&lt;br /&gt;
|label13  = Place&lt;br /&gt;
|data13   = &lt;br /&gt;
|header14 =&lt;br /&gt;
|label14  = 答疑时间&lt;br /&gt;
|data14   = 周五，2pm-5pm&amp;lt;br&amp;gt;南雍-西229&lt;br /&gt;
|header15 = 教材&lt;br /&gt;
|label15  = &lt;br /&gt;
|data15   = &lt;br /&gt;
|header16 =&lt;br /&gt;
|label16  = &lt;br /&gt;
|data16   = [[File:MR-randomized-algorithms.png|border|100px]]&lt;br /&gt;
|header17 =&lt;br /&gt;
|label17  = &lt;br /&gt;
|data17   = Motwani and Raghavan. &amp;lt;br&amp;gt;&#039;&#039;Randomized Algorithms&#039;&#039;.&amp;lt;br&amp;gt; Cambridge Univ Press, 1995.&lt;br /&gt;
|header18 =&lt;br /&gt;
|label18  = &lt;br /&gt;
|data18   = [[File:Approximation_Algorithms.jpg|border|100px]]&lt;br /&gt;
|header19 =&lt;br /&gt;
|label19  = &lt;br /&gt;
|data19   =  Vazirani. &amp;lt;br&amp;gt;&#039;&#039;Approximation Algorithms&#039;&#039;. &amp;lt;br&amp;gt; Springer-Verlag, 2001.&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Advanced Algorithms&#039;&#039; class of Spring 2026. Students taking this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= 通知 =&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;(2026/3/2)&#039;&#039;&#039; 第一堂课&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= 课程信息 =&lt;br /&gt;
* &#039;&#039;&#039;任课教师&#039;&#039;&#039;: &lt;br /&gt;
:* [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn 📧]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;助教&#039;&#039;&#039;: &lt;br /&gt;
** 王思齐：[mailto:siqi_wang@smail.nju.edu.cn 📧]&lt;br /&gt;
** 齐世毅：[mailto:1083951258@qq.com 📧]&lt;br /&gt;
* &#039;&#039;&#039;课程时间地点&#039;&#039;&#039;: &lt;br /&gt;
** 周一，9am-12pm，苏教B207&lt;br /&gt;
* &#039;&#039;&#039;答疑时间&#039;&#039;&#039;: 周五, 2pm-5pm, 南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1083465754&lt;br /&gt;
&lt;br /&gt;
= 教学大纲 =&lt;br /&gt;
随着计算机算法理论的不断发展，现代计算机算法的设计与分析大量地使用非初等的数学工具以及非传统的算法思想。“高级算法”这门课程就是面向计算机算法的这一发展趋势而设立的。课程将针对传统算法课程未系统涉及、却在计算机科学各领域的科研和实践中扮演重要角色的高等算法设计思想和算法分析工具进行系统讲授。&lt;br /&gt;
&lt;br /&gt;
课程内容分为五大部分：&lt;br /&gt;
* 基于哈希的大数据算法&lt;br /&gt;
* 哈希表与面向大数据的现代计算场景&lt;br /&gt;
* 测度的集中与处理高维数据&lt;br /&gt;
* 最大流与线性规划&lt;br /&gt;
* 其他重要话题&lt;br /&gt;
&lt;br /&gt;
=== 先修课程 ===&lt;br /&gt;
* 必须：离散数学，概率论，线性代数。&lt;br /&gt;
* 推荐：算法设计与分析。&lt;br /&gt;
&lt;br /&gt;
=== 课程教材 ===&lt;br /&gt;
本门课较为前沿，大部分课程内容还没有进入任何教材。以下教材和参考书仅作为参考。&lt;br /&gt;
* [[高级算法 (Fall 2024) / Course materials|&amp;lt;font size=3&amp;gt;教材和参考书&amp;lt;/font&amp;gt;]]&lt;br /&gt;
&lt;br /&gt;
=== 成绩 ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考核。最终成绩将由平时作业成绩和期末考核成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力地维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= 课后作业 =&lt;br /&gt;
Late policy: In general, we will accommodate late submission requests ONLY IF you make the request ahead of time.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
*[[高级算法_(Spring_2026)/作业1|作业1]]（每堂课之后更新）DDL to be announced.&lt;br /&gt;
&lt;br /&gt;
= 课件及相关阅读资料 =&lt;br /&gt;
# [https://box.nju.edu.cn/f/980d814e4ad64285a640/ Fingerprinting]&lt;br /&gt;
#* Polynomial Identity Testing&lt;br /&gt;
#* Communication Complexity (Equality)&lt;br /&gt;
#* Application: Bipartite Perfect Matching, Checking Matrix Multiplication&lt;br /&gt;
#* Karp-Rabin Algorithm (string-searching), Lipton’s Algorithm (checking identity of multisets)&lt;br /&gt;
# [https://box.nju.edu.cn/f/fcf9e2b3051e443fb324/ Sketching]&lt;br /&gt;
#* Morris&#039; Algorithm&lt;br /&gt;
#** mean trick, median trick&lt;br /&gt;
#* Counting Distinct Elements: min sketch, Flajolet-Martin algorithm, bottom-&amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; algorithm, HyperLogLog&lt;br /&gt;
#* Heavy Hitter &amp;amp; Point Query: count-min sketch&lt;br /&gt;
#* 2nd Frequency Moments Estimator: count sketch&lt;br /&gt;
#** &amp;lt;math&amp;gt;\ell_2&amp;lt;/math&amp;gt; point query with count sketch&lt;br /&gt;
#* Approximate Membership: Bloom filter&lt;br /&gt;
&lt;br /&gt;
= Related Online Courses=&lt;br /&gt;
* [https://www.cs.columbia.edu/~andoni/advancedS24/index.html Advanced Algorithms] by Alexandr Andoni at Columbia University.&lt;br /&gt;
* [https://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15854-f21/www/ Advanced Approximation Algorithms] by Anupam Gupta at CMU.&lt;br /&gt;
* [http://people.csail.mit.edu/moitra/854.html Advanced Algorithms] by Ankur Moitra at MIT.&lt;br /&gt;
* [https://6.5210.csail.mit.edu/ Advanced Algorithms] by David Karger at MIT.&lt;br /&gt;
* [https://www.cs.cmu.edu/~dwoodruf/teaching/15851-spring24/ Algorithms for Big Data] by David Woodruff at CMU.&lt;br /&gt;
* [https://www.sketchingbigdata.org/fall20/lec/ Sketching Algorithms] by Jelani Nelson at UC Berkeley.&lt;br /&gt;
* [http://web.stanford.edu/class/cs168/index.html The Modern Algorithmic Toolbox] by Tim Roughgarden and Gregory Valiant at Stanford.&lt;br /&gt;
* [https://www.cs.princeton.edu/courses/archive/fall18/cos521/ Advanced Algorithm Design] by Pravesh Kothari and Christopher Musco at Princeton.&lt;br /&gt;
* [http://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15859-f11/www/ Linear and Semidefinite Programming (Advanced Algorithms)] by Anupam Gupta and Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://www.cs.cmu.edu/~odonnell/papers/cs-theory-toolkit-lecture-notes.pdf CS Theory Toolkit] by Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://cs.uwaterloo.ca/~lapchi/cs860/index.html Eigenvalues and Polynomials] by Lap Chi Lau at University of Waterloo.&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13486</id>
		<title>高级算法 (Spring 2026)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13486"/>
		<updated>2026-03-03T16:40:15Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* 课后作业 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;高级算法 &lt;br /&gt;
&amp;lt;br&amp;gt;Advanced Algorithms&amp;lt;/font&amp;gt;&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =任课教师&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data2   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header3 = &lt;br /&gt;
|label3  = 电子邮件&lt;br /&gt;
|data3   = lmm@nju.edu.cn &lt;br /&gt;
|header4 =&lt;br /&gt;
|label4= 办公室&lt;br /&gt;
|data4= 南雍-西229&lt;br /&gt;
|header5 = &lt;br /&gt;
|label5  = &lt;br /&gt;
|header11 = 课程时间地点&lt;br /&gt;
|label11  = &lt;br /&gt;
|data11   = &lt;br /&gt;
|header12 =&lt;br /&gt;
|label12  = 教室&lt;br /&gt;
|data12   = 周一，9am-12pm&amp;lt;br&amp;gt;苏教B207&lt;br /&gt;
|header13 =&lt;br /&gt;
|label13  = Place&lt;br /&gt;
|data13   = &lt;br /&gt;
|header14 =&lt;br /&gt;
|label14  = 答疑时间&lt;br /&gt;
|data14   = 周五，2pm-5pm&amp;lt;br&amp;gt;南雍-西229&lt;br /&gt;
|header15 = 教材&lt;br /&gt;
|label15  = &lt;br /&gt;
|data15   = &lt;br /&gt;
|header16 =&lt;br /&gt;
|label16  = &lt;br /&gt;
|data16   = [[File:MR-randomized-algorithms.png|border|100px]]&lt;br /&gt;
|header17 =&lt;br /&gt;
|label17  = &lt;br /&gt;
|data17   = Motwani and Raghavan. &amp;lt;br&amp;gt;&#039;&#039;Randomized Algorithms&#039;&#039;.&amp;lt;br&amp;gt; Cambridge Univ Press, 1995.&lt;br /&gt;
|header18 =&lt;br /&gt;
|label18  = &lt;br /&gt;
|data18   = [[File:Approximation_Algorithms.jpg|border|100px]]&lt;br /&gt;
|header19 =&lt;br /&gt;
|label19  = &lt;br /&gt;
|data19   =  Vazirani. &amp;lt;br&amp;gt;&#039;&#039;Approximation Algorithms&#039;&#039;. &amp;lt;br&amp;gt; Springer-Verlag, 2001.&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Advanced Algorithms&#039;&#039; class of Spring 2026. Students taking this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= 通知 =&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;(2026/3/2)&#039;&#039;&#039; 第一堂课&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= 课程信息 =&lt;br /&gt;
* &#039;&#039;&#039;任课教师&#039;&#039;&#039;: &lt;br /&gt;
:* [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn 📧]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;助教&#039;&#039;&#039;: &lt;br /&gt;
** 王思齐：[mailto:siqi_wang@smail.nju.edu.cn 📧]&lt;br /&gt;
** 齐世毅：[mailto:1083951258@qq.com 📧]&lt;br /&gt;
* &#039;&#039;&#039;课程时间地点&#039;&#039;&#039;: &lt;br /&gt;
** 周一，9am-12pm，苏教B207&lt;br /&gt;
* &#039;&#039;&#039;答疑时间&#039;&#039;&#039;: 周五, 2pm-5pm, 南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1083465754&lt;br /&gt;
&lt;br /&gt;
= 教学大纲 =&lt;br /&gt;
随着计算机算法理论的不断发展，现代计算机算法的设计与分析大量地使用非初等的数学工具以及非传统的算法思想。“高级算法”这门课程就是面向计算机算法的这一发展趋势而设立的。课程将针对传统算法课程未系统涉及、却在计算机科学各领域的科研和实践中扮演重要角色的高等算法设计思想和算法分析工具进行系统讲授。&lt;br /&gt;
&lt;br /&gt;
课程内容分为五大部分：&lt;br /&gt;
* 基于哈希的大数据算法&lt;br /&gt;
* 哈希表与面向大数据的现代计算场景&lt;br /&gt;
* 测度的集中与处理高维数据&lt;br /&gt;
* 最大流与线性规划&lt;br /&gt;
* 其他重要话题&lt;br /&gt;
&lt;br /&gt;
=== 先修课程 ===&lt;br /&gt;
* 必须：离散数学，概率论，线性代数。&lt;br /&gt;
* 推荐：算法设计与分析。&lt;br /&gt;
&lt;br /&gt;
=== 课程教材 ===&lt;br /&gt;
本门课较为前沿，大部分课程内容还没有进入任何教材。以下教材和参考书仅作为参考。&lt;br /&gt;
* [[高级算法 (Fall 2024) / Course materials|&amp;lt;font size=3&amp;gt;教材和参考书&amp;lt;/font&amp;gt;]]&lt;br /&gt;
&lt;br /&gt;
=== 成绩 ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考核。最终成绩将由平时作业成绩和期末考核成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力地维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= 课后作业 =&lt;br /&gt;
Late policy: In general, we will accommodate late submission requests ONLY IF you make the request ahead of time.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
*[[高级算法_(Spring_2026)/作业1|作业1]]（每堂课之后更新）DDL to be announced.&lt;br /&gt;
&lt;br /&gt;
= 课件及相关阅读资料 =&lt;br /&gt;
# [https://box.nju.edu.cn/f/980d814e4ad64285a640/ Fingerprinting]&lt;br /&gt;
#* Polynomial Identity Testing&lt;br /&gt;
#* Communication Complexity (Equality)&lt;br /&gt;
#* Application: Bipartite Perfect Matching, Checking Matrix Multiplication&lt;br /&gt;
#* Karp-Rabin Algorithm (string-searching), Lipton’s Algorithm (checking identity of multisets)&lt;br /&gt;
&lt;br /&gt;
= Related Online Courses=&lt;br /&gt;
* [https://www.cs.columbia.edu/~andoni/advancedS24/index.html Advanced Algorithms] by Alexandr Andoni at Columbia University.&lt;br /&gt;
* [https://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15854-f21/www/ Advanced Approximation Algorithms] by Anupam Gupta at CMU.&lt;br /&gt;
* [http://people.csail.mit.edu/moitra/854.html Advanced Algorithms] by Ankur Moitra at MIT.&lt;br /&gt;
* [https://6.5210.csail.mit.edu/ Advanced Algorithms] by David Karger at MIT.&lt;br /&gt;
* [https://www.cs.cmu.edu/~dwoodruf/teaching/15851-spring24/ Algorithms for Big Data] by David Woodruff at CMU.&lt;br /&gt;
* [https://www.sketchingbigdata.org/fall20/lec/ Sketching Algorithms] by Jelani Nelson at UC Berkeley.&lt;br /&gt;
* [http://web.stanford.edu/class/cs168/index.html The Modern Algorithmic Toolbox] by Tim Roughgarden and Gregory Valiant at Stanford.&lt;br /&gt;
* [https://www.cs.princeton.edu/courses/archive/fall18/cos521/ Advanced Algorithm Design] by Pravesh Kothari and Christopher Musco at Princeton.&lt;br /&gt;
* [http://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15859-f11/www/ Linear and Semidefinite Programming (Advanced Algorithms)] by Anupam Gupta and Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://www.cs.cmu.edu/~odonnell/papers/cs-theory-toolkit-lecture-notes.pdf CS Theory Toolkit] by Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://cs.uwaterloo.ca/~lapchi/cs860/index.html Eigenvalues and Polynomials] by Lap Chi Lau at University of Waterloo.&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A1&amp;diff=13485</id>
		<title>高级算法 (Spring 2026)/作业1</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)/%E4%BD%9C%E4%B8%9A1&amp;diff=13485"/>
		<updated>2026-03-03T16:25:53Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: Created page with &amp;quot;*每道题目的解答都要有完整的解题过程，中英文不限。 *我们推荐大家使用LaTeX, markdown等对作业进行排版。 *没有条件的同学可以用纸笔完成作业之后拍照。  # Karp-Rabin Algorithm 使用一个 equality testing 算法作为黑盒子以达成 &amp;lt;math&amp;gt;O(m+n)&amp;lt;/math&amp;gt; 的时间复杂度。分别讨论用本门课中介绍的其他 equality testing 算法替代 Karp-Rabin Algorithm 的 equality testing 算法能否达成 &amp;lt;math&amp;gt;O(...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*每道题目的解答都要有完整的解题过程，中英文不限。&lt;br /&gt;
*我们推荐大家使用LaTeX, markdown等对作业进行排版。&lt;br /&gt;
*没有条件的同学可以用纸笔完成作业之后拍照。&lt;br /&gt;
&lt;br /&gt;
# Karp-Rabin Algorithm 使用一个 equality testing 算法作为黑盒子以达成 &amp;lt;math&amp;gt;O(m+n)&amp;lt;/math&amp;gt; 的时间复杂度。分别讨论用本门课中介绍的其他 equality testing 算法替代 Karp-Rabin Algorithm 的 equality testing 算法能否达成 &amp;lt;math&amp;gt;O(m+n)&amp;lt;/math&amp;gt; 的时间复杂度，并自己找到一个没有在本门课中出现的 equality testing 算法（你需要使用正规 citation 指出该算法的来源），尝试用它替换 Karp-Rabin Algorithm 的 equality testing 算法，你需要尽可能完备地说明为什么能或者不可能达成 &amp;lt;math&amp;gt;O(m+n)&amp;lt;/math&amp;gt; 的时间复杂度。&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%80_(Fall_2025)/Problem_Set_6&amp;diff=13453</id>
		<title>数据科学基础 (Fall 2025)/Problem Set 6</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%80_(Fall_2025)/Problem_Set_6&amp;diff=13453"/>
		<updated>2026-02-26T16:08:40Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*每道题目的解答都要有完整的解题过程，中英文不限。&lt;br /&gt;
&lt;br /&gt;
*我们推荐大家使用LaTeX, markdown等对作业进行排版。&lt;br /&gt;
&lt;br /&gt;
*没有条件的同学可以用纸笔完成作业之后拍照。&lt;br /&gt;
&lt;br /&gt;
==  Assumption throughout Problem Set 6 == &lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work with the probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\Pr)&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of all random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (LLN &amp;amp; CLT)==&lt;br /&gt;
* [&#039;&#039;&#039;Proportional betting&#039;&#039;&#039;] In each of a sequence of independent bets, a gambler either wins 30%, or loses 25% of her current fortune, each with probability &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Denoting her fortune after &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; bets by &amp;lt;math&amp;gt;F_n&amp;lt;/math&amp;gt;, show that &amp;lt;math&amp;gt;\mathbb E(F_n)\to\infty&amp;lt;/math&amp;gt; as &amp;lt;math&amp;gt;n \to\infty&amp;lt;/math&amp;gt;, while &amp;lt;math&amp;gt;F_n \to 0&amp;lt;/math&amp;gt; almost surely.&lt;br /&gt;
* [&#039;&#039;&#039;Entropy&#039;&#039;&#039;]  The interval &amp;lt;math&amp;gt;[0,1]&amp;lt;/math&amp;gt; is partitioned into &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; disjoint sub-intervals with lengths &amp;lt;math&amp;gt;p_1,p_2,\dots,p_n&amp;lt;/math&amp;gt;, and the entropy of this partition is defined to be &amp;lt;math&amp;gt;h = -\sum^n_{i=1} p_i \log p_i&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;X_1,X_2,\dots&amp;lt;/math&amp;gt; be independent random variables having the uniform distribution on &amp;lt;math&amp;gt;[0,1]&amp;lt;/math&amp;gt;, and let &amp;lt;math&amp;gt;Z_m^{(i)}&amp;lt;/math&amp;gt; be the number of &amp;lt;math&amp;gt;X_1,X_2,\dots,X_m&amp;lt;/math&amp;gt; that lie in the &amp;lt;math&amp;gt;i&amp;lt;/math&amp;gt;-th interval of the partition above. Show that &amp;lt;math&amp;gt;R_m =\prod^n_{i=1} p_i^{Z_m^{(i)}}&amp;lt;/math&amp;gt; satisfies &amp;lt;math&amp;gt;m^{-1}\cdot\log R_m \to -h&amp;lt;/math&amp;gt; almost surely as &amp;lt;math&amp;gt;m \to\infty&amp;lt;/math&amp;gt;.&lt;br /&gt;
* [&#039;&#039;&#039;Mobilizing a Supermajority&#039;&#039;&#039;] In a society of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; independent individuals, each person independently (i) attends the vote with probability &amp;lt;math&amp;gt;\tau&amp;lt;/math&amp;gt; and abstains with probability &amp;lt;math&amp;gt;1-\tau&amp;lt;/math&amp;gt;; (ii) if attending, votes &amp;quot;Yes&amp;quot; with probability &amp;lt;math&amp;gt;p&amp;lt;/math&amp;gt; and &amp;quot;No&amp;quot; with probability &amp;lt;math&amp;gt;1-p&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;A proposal is accepted if among all attendees, the fraction of Yes votes is at least a supermajority threshold &amp;lt;math&amp;gt;\theta \in (1/2,1)&amp;lt;/math&amp;gt; (e.g., &amp;lt;math&amp;gt;\theta = 2/3&amp;lt;/math&amp;gt;). A mobilization campaign may add &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; extra supporters who certainly attend and certainly vote Yes. Your goal is to determine the minimal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that the proposal passes with probability at least &amp;lt;math&amp;gt;1-\delta&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Concentration of measure)==&lt;br /&gt;
* [&#039;&#039;&#039;Tossing coins&#039;&#039;&#039;] We repeatedly toss a fair coin (with an equal probability of heads and tails). Let the random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be the number of throws required to obtain a total of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; heads. Show that &amp;lt;math&amp;gt;\Pr[X &amp;gt; 2n + \delta\sqrt{n\log n}]\leq n^{-\delta^2/6}&amp;lt;/math&amp;gt; for any real &amp;lt;math&amp;gt;0&amp;lt;\delta&amp;lt;\sqrt{\frac{4n}{\log n}}&amp;lt;/math&amp;gt;.&lt;br /&gt;
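The claimed tail bound can be sanity-checked empirically: the number of throws is a negative binomial random variable, and for moderate n the empirical tail should sit below the bound n^(-delta^2/6). A simulation sketch (names and parameters are mine, and this is evidence, not a proof):

```python
import math
import random

def throws_for_n_heads(n, rng):
    """Simulate fair-coin tosses until n heads appear; return the toss count."""
    tosses = heads = 0
    while heads < n:
        tosses += 1
        heads += rng.random() < 0.5
    return tosses

def empirical_tail(n, delta, trials=20000, seed=5):
    """Empirical estimate of Pr[X > 2n + delta*sqrt(n log n)],
    to compare against the claimed bound n**(-delta**2 / 6)."""
    rng = random.Random(seed)
    cut = 2 * n + delta * math.sqrt(n * math.log(n))
    return sum(throws_for_n_heads(n, rng) > cut for _ in range(trials)) / trials
```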
&lt;br /&gt;
* [&#039;&#039;&#039;&amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound&#039;&#039;&#039;] Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable with expectation &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; such that the moment generating function &amp;lt;math&amp;gt;\mathbb{E}[\exp(t|X|)]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt; t &amp;gt; 0 &amp;lt;/math&amp;gt;. We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt; X &amp;lt;/math&amp;gt;: &lt;br /&gt;
** Chernoff Bound:  &amp;lt;math&amp;gt;\Pr[|X| \geq \delta] \leq \min_{t \geq 0} {\mathbb{E}[e^{t|X|}]}/{e^{t\delta}}&amp;lt;/math&amp;gt;;&lt;br /&gt;
** &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;th-Moment Bound:  &amp;lt;math&amp;gt;\Pr[|X| \geq \delta] \leq {\mathbb{E}[|X|^k]}/{\delta^k}&amp;lt;/math&amp;gt;.&lt;br /&gt;
# Show that for each &amp;lt;math&amp;gt;\delta&amp;lt;/math&amp;gt;, there exists a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;th-moment bound is no weaker than the Chernoff bound. (Hint: Use the probabilistic method. Construct a distribution over the choices of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;, and show that the bound obtained in expectation is no weaker than the Chernoff bound.)&lt;br /&gt;
# Why would we still prefer the Chernoff bound to the (seemingly) stronger &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound?&lt;br /&gt;
&lt;br /&gt;
* [&#039;&#039;&#039;Densest induced subgraph in random graph&#039;&#039;&#039;] For a graph &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; on vertex set &amp;lt;math&amp;gt;[n] = \{1,2,\dots,n\}&amp;lt;/math&amp;gt;, define the average-degree density of an induced subgraph as &amp;lt;math&amp;gt;\mathrm{dens}(S) := \frac{e(S)}{|S|}&amp;lt;/math&amp;gt;, where &amp;lt;math&amp;gt;e(S)&amp;lt;/math&amp;gt; is the number of edges with both endpoints in &amp;lt;math&amp;gt;S&amp;lt;/math&amp;gt;. Define the density of the densest induced subgraph of &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; as &amp;lt;math&amp;gt;\mathrm{dens}(G) := \max_{S \subseteq [n], |S|\ge 2} \mathrm{dens}(S)&amp;lt;/math&amp;gt;. Show that, with probability at least &amp;lt;math&amp;gt;2/3&amp;lt;/math&amp;gt;, the densest induced subgraph in &amp;lt;math&amp;gt;G(n,1/2)&amp;lt;/math&amp;gt; satisfies &amp;lt;math&amp;gt;\mathrm{dens}(G(n,1/2)) \le \frac{n}{4} + O(n^{1/2})&amp;lt;/math&amp;gt;. More precisely, prove that there exists an absolute constant &amp;lt;math&amp;gt;C &amp;gt; 0&amp;lt;/math&amp;gt; such that &amp;lt;math&amp;gt;\Pr\big( \mathrm{dens}(G(n,1/2)) \le \frac{n}{4} + C n^{1/2} \big) \ge \frac{2}{3}&amp;lt;/math&amp;gt;.&lt;br /&gt;
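For building intuition on tiny graphs, dens(G) can be computed by brute force over all vertex subsets (exponential time, purely illustrative; the problem itself calls for a tail bound plus a union bound over subsets of G(n,1/2), not computation):

```python
import itertools

def dens(n_vertices, edges):
    """Brute-force dens(G) = max over subsets S with |S| >= 2 of e(S)/|S|.
    `edges` is a list of pairs (u, v) with vertices labelled 0..n_vertices-1."""
    best = 0.0
    for k in range(2, n_vertices + 1):
        for S in itertools.combinations(range(n_vertices), k):
            sset = set(S)
            e = sum(1 for u, v in edges if u in sset and v in sset)
            best = max(best, e / k)
    return best
```

For a triangle, the maximum 3/3 = 1 is attained by the whole vertex set, while every pair of vertices only achieves 1/2.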
&lt;br /&gt;
== Problem 3 (Random processes)==&lt;br /&gt;
&lt;br /&gt;
* [&#039;&#039;&#039;High-dimensional random walk&#039;&#039;&#039;] Consider an unbiased random walk over &amp;lt;math&amp;gt;\mathbb R^n&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;n&amp;gt;1&amp;lt;/math&amp;gt;. At each step, assuming we are at position &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; without loss of generality, for each dimension &amp;lt;math&amp;gt;i&amp;lt;/math&amp;gt;, we choose a movement &amp;lt;math&amp;gt;\delta_i\in\mathbb R&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb E [\delta_i]=0&amp;lt;/math&amp;gt; (i.e. unbiased) at random, then move to &amp;lt;math&amp;gt; X+\sum_i\delta_i&amp;lt;/math&amp;gt;. Prove that such an unbiased random walk in any number of dimensions, regardless of the distributions of the &amp;lt;math&amp;gt;\delta_i&amp;lt;/math&amp;gt;&#039;s, is an example of a martingale.&lt;br /&gt;
&lt;br /&gt;
* [&#039;&#039;&#039;Pólya’s urn&#039;&#039;&#039;] A bag contains red and blue balls, with initially &amp;lt;math&amp;gt;r&amp;lt;/math&amp;gt; red and &amp;lt;math&amp;gt;b&amp;lt;/math&amp;gt; blue where &amp;lt;math&amp;gt;rb&amp;gt;0&amp;lt;/math&amp;gt;. A ball is drawn from the bag, its color noted, and then it is returned to the bag together with a new ball of the same color. Let &amp;lt;math&amp;gt;R_n&amp;lt;/math&amp;gt; be the number of red balls after &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; such operations. Show that &amp;lt;math&amp;gt;Y_n = R_n/(n + r + b)&amp;lt;/math&amp;gt; is a martingale.&lt;br /&gt;
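The one-step martingale property of Y_n can be checked exactly with rational arithmetic. The sketch below (function names are mine) conditions on the current urn contents and verifies that the expected red fraction after one more draw equals the current one:

```python
from fractions import Fraction as F

def expected_next(red, blue):
    """E[next red fraction | urn holds `red` red and `blue` blue balls]:
    draw a ball, return it with one extra ball of the same colour, then
    take the red fraction of the enlarged urn."""
    total = red + blue
    p_red = F(red, total)
    return p_red * F(red + 1, total + 1) + (1 - p_red) * F(red, total + 1)

# one-step martingale check: the expectation equals the current red fraction
for r0, b0 in [(1, 1), (3, 2), (7, 11)]:
    assert expected_next(r0, b0) == F(r0, r0 + b0)
```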
&lt;br /&gt;
* [&#039;&#039;&#039;Optional stopping for the 1-D symmetric random walk&#039;&#039;&#039;] Let &amp;lt;math&amp;gt;S_n = a + \sum_{r=1}^n X_r&amp;lt;/math&amp;gt; be a simple symmetric random walk, where the &amp;lt;math&amp;gt;X_r&amp;lt;/math&amp;gt; are i.i.d. and take the values &amp;lt;math&amp;gt;\pm 1&amp;lt;/math&amp;gt; each with probability &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. The walk stops at the earliest time &amp;lt;math&amp;gt;T&amp;lt;/math&amp;gt; when it reaches either &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; or &amp;lt;math&amp;gt;K&amp;lt;/math&amp;gt;, where &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; K&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;&lt;br /&gt;
M_n = \sum_{r=0}^n S_r - \tfrac{1}{3} S_n^3&lt;br /&gt;
&amp;lt;/math&amp;gt; is a martingale, and deduce that &amp;lt;math&amp;gt;&lt;br /&gt;
\mathbb{E}\left( \sum_{r=0}^{T} S_r \right)&lt;br /&gt;
= \tfrac{1}{3} (K^2 - a^2) a + a.&lt;br /&gt;
&amp;lt;/math&amp;gt;&lt;br /&gt;
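The deduced identity is easy to sanity-check by simulation: with a = 2 and K = 4 the right-hand side equals (1/3)(16 - 4)·2 + 2 = 10. A rough Monte Carlo sketch (naming is mine, and this is numerical evidence, not a proof):

```python
import random

def mean_path_sum(a, K, trials=100000, seed=7):
    """Monte Carlo estimate of E[sum_{r=0}^T S_r] for the simple symmetric
    walk started at a and stopped on first hitting 0 or K."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        s, path_sum = a, a            # the r = 0 term is S_0 = a
        while 0 < s < K:
            s += 1 if rng.random() < 0.5 else -1
            path_sum += s             # includes the final term S_T
        total += path_sum
    return total / trials

# theory: E[sum_{r=0}^T S_r] = (K^2 - a^2) * a / 3 + a
```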
&lt;br /&gt;
* [&#039;&#039;&#039;Random walk on a graph&#039;&#039;&#039;] A particle performs a random walk on the vertex set of a connected graph &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;, which for simplicity we assume to have neither loops nor multiple edges. At each stage it moves to a neighbor of its current position, each such neighbor being chosen with equal probability. If &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; has &amp;lt;math&amp;gt;\eta&amp;lt;\infty&amp;lt;/math&amp;gt; edges, show that the stationary distribution is given by &amp;lt;math&amp;gt;\pi(v) = d_v/(2\eta)&amp;lt;/math&amp;gt;, where &amp;lt;math&amp;gt;d_v&amp;lt;/math&amp;gt; is the degree of vertex &amp;lt;math&amp;gt;v&amp;lt;/math&amp;gt;.&lt;br /&gt;
* [&#039;&#039;&#039;Reversibility versus periodicity&#039;&#039;&#039;] Can a reversible chain be periodic?&lt;br /&gt;
* [&#039;&#039;&#039;Metropolis–Hastings algorithm&#039;&#039;&#039;] At each step, being at state &amp;lt;math&amp;gt;x&amp;lt;/math&amp;gt;, the Glauber dynamics chooses a state uniformly at random from the states adjacent to &amp;lt;math&amp;gt;x&amp;lt;/math&amp;gt; together with &amp;lt;math&amp;gt;x&amp;lt;/math&amp;gt; itself, and moves to the chosen state. The Metropolis–Hastings algorithm generalizes the idea of Glauber dynamics. Let us assume that we have designed an irreducible state space for our Markov chain; now we want to construct a Markov chain on this state space with a stationary distribution &amp;lt;math&amp;gt;\pi_x = b(x)/B&amp;lt;/math&amp;gt;, where for all &amp;lt;math&amp;gt;x \in \Omega&amp;lt;/math&amp;gt; we have &amp;lt;math&amp;gt;b(x) &amp;gt; 0&amp;lt;/math&amp;gt; and such that &amp;lt;math&amp;gt;B =\sum_{x\in\Omega} b(x)&amp;lt;/math&amp;gt; is finite. &lt;br /&gt;
# For a finite state space &amp;lt;math&amp;gt;\Omega&amp;lt;/math&amp;gt; and neighborhood structure &amp;lt;math&amp;gt;\{N(x) \mid x \in\Omega\}&amp;lt;/math&amp;gt;, let &amp;lt;math&amp;gt;N = \max_{x\in\Omega} |N(x)|&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;M&amp;lt;/math&amp;gt; be any number such that &amp;lt;math&amp;gt;M \ge N&amp;lt;/math&amp;gt;. For all &amp;lt;math&amp;gt;x \in \Omega&amp;lt;/math&amp;gt;, let &amp;lt;math&amp;gt;\pi_x &amp;gt; 0&amp;lt;/math&amp;gt; be the desired probability of state &amp;lt;math&amp;gt;x&amp;lt;/math&amp;gt; in the stationary distribution. Consider a Markov chain where &amp;lt;math&amp;gt;P_{x,y} =&lt;br /&gt;
\begin{cases}(1/M) \min(1, \pi_y/\pi_x ) &amp;amp;\text{if $x \ne y$ and $y \in N(x)$},\\&lt;br /&gt;
0 &amp;amp;\text{if $x \ne y$ and $y \notin N(x)$},\\&lt;br /&gt;
1 - \sum_{y\ne x} P_{x,y} &amp;amp;\text{if $x = y$}\end{cases}&amp;lt;/math&amp;gt;. Assuming this chain is irreducible and aperiodic, verify that the stationary distribution is given by the probabilities &amp;lt;math&amp;gt;\pi_x&amp;lt;/math&amp;gt;. (Hint: Show the time-reversibility.)&lt;br /&gt;
# Let &amp;lt;math&amp;gt;S = \sum_{i=1}^\infty i^{-2} = \pi^2/6&amp;lt;/math&amp;gt;. Design a Markov chain based on the Metropolis–Hastings algorithm on the positive integers such that, in the stationary distribution, &amp;lt;math&amp;gt;\pi_i = 1/(S\cdot i^2)&amp;lt;/math&amp;gt;. The neighbors of any integer &amp;lt;math&amp;gt;i &amp;gt; 1&amp;lt;/math&amp;gt; for your chain should be only &amp;lt;math&amp;gt;i - 1&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;i + 1&amp;lt;/math&amp;gt;, and the only neighbor of &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; should be the integer &amp;lt;math&amp;gt;2&amp;lt;/math&amp;gt;.&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%80_(Fall_2025)/Problem_Set_5&amp;diff=13452</id>
		<title>数据科学基础 (Fall 2025)/Problem Set 5</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%80_(Fall_2025)/Problem_Set_5&amp;diff=13452"/>
		<updated>2026-02-26T15:35:46Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* Problem 2 (Continuous Random Variables) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*每道题目的解答都要有完整的解题过程，中英文不限。&lt;br /&gt;
&lt;br /&gt;
*我们推荐大家使用LaTeX, markdown等对作业进行排版。&lt;br /&gt;
&lt;br /&gt;
*没有条件的同学可以用纸笔完成作业之后拍照。&lt;br /&gt;
&lt;br /&gt;
== Assumption throughout Problem Set 5 ==&lt;br /&gt;
&amp;lt;p&amp;gt;Without further notice, we are working on probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\Pr)&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Without further notice, we assume that the expectations of random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (Random Graphs) ==&lt;br /&gt;
* [&#039;&#039;&#039;Triangle neighbors&#039;&#039;&#039;] Suppose that &amp;lt;math&amp;gt;p= O(1/n)&amp;lt;/math&amp;gt;. Prove that, with probability &amp;lt;math&amp;gt;1-o(1)&amp;lt;/math&amp;gt; as &amp;lt;math&amp;gt;n\to\infty&amp;lt;/math&amp;gt;, the Erdős–Rényi random graph &amp;lt;math&amp;gt;\mathbf{G}(n,p)&amp;lt;/math&amp;gt; does not contain a vertex which belongs to more than one triangle.&lt;br /&gt;
* [&#039;&#039;&#039;Isolated vertices&#039;&#039;&#039;] An isolated vertex is a vertex of degree &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be the random variable counting isolated vertices in the Erdős–Rényi random graph &amp;lt;math&amp;gt;\mathbf{G}(n,p)&amp;lt;/math&amp;gt;. Assume &amp;lt;math&amp;gt;p=(\log n+c)/n&amp;lt;/math&amp;gt; for some constant &amp;lt;math&amp;gt;c&amp;lt;/math&amp;gt;; show that &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; converges in distribution to the Poisson distribution with parameter &amp;lt;math&amp;gt;e^{-c}&amp;lt;/math&amp;gt; as &amp;lt;math&amp;gt;n\to\infty&amp;lt;/math&amp;gt;. (Hint: prove that the binomial moments &amp;lt;math&amp;gt;\mathbb E\left[\binom X k\right]&amp;lt;/math&amp;gt; converge, for every &amp;lt;math&amp;gt;k\in\mathbb N&amp;lt;/math&amp;gt;, to those of the Poisson distribution.)&lt;br /&gt;
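As a numeric companion to the isolated-vertices problem (a sketch under my own naming, not the requested proof): the expected number of isolated vertices is exactly n(1-p)^(n-1), since a vertex is isolated iff all of its n-1 potential edges are absent, and for p = (log n + c)/n this tends to e^(-c), the Poisson parameter:

```python
import math

def expected_isolated(n, c):
    """E[X] = n * (1 - p)^(n-1) for G(n, p) with p = (log n + c)/n.
    Each vertex is isolated iff its n - 1 potential edges are all absent."""
    p = (math.log(n) + c) / n
    return n * (1 - p) ** (n - 1)
```

For large n the value is already very close to e^(-c), matching the Poisson limit.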
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Problem 2 (Continuous Random Variables)==&lt;br /&gt;
* [&#039;&#039;&#039;Jointly continuous&#039;&#039;&#039;] &lt;br /&gt;
*#If &amp;lt;math&amp;gt;U&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;V&amp;lt;/math&amp;gt; are jointly continuous, show that &amp;lt;math&amp;gt;\Pr(U = V) = 0 &amp;lt;/math&amp;gt;.&lt;br /&gt;
*# Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be uniformly distributed on &amp;lt;math&amp;gt;(0, 1)&amp;lt;/math&amp;gt;, and let &amp;lt;math&amp;gt;Y = X&amp;lt;/math&amp;gt;. Then &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; are continuous, and &amp;lt;math&amp;gt;\Pr(X = Y ) = 1&amp;lt;/math&amp;gt;. Is there a contradiction here?&lt;br /&gt;
* [&#039;&#039;&#039;Distribution function&#039;&#039;&#039;] Can a function &amp;lt;math&amp;gt;F:\mathbb R\to [0,1]&amp;lt;/math&amp;gt; which is (i) nondecreasing, (ii) such that &amp;lt;math&amp;gt;\lim_{x\to-\infty}F(x)=0,\lim_{x\to+\infty}F(x)=1&amp;lt;/math&amp;gt;, (iii) continuous, and (iv) not differentiable at some point, be the cumulative distribution function (CDF) of some random variable? Is every such &amp;lt;math&amp;gt;F&amp;lt;/math&amp;gt; the cumulative distribution function of some random variable? If so, what random variable might it be? Justify your answer.&lt;br /&gt;
* [&#039;&#039;&#039;Density function&#039;&#039;&#039;] For what values of &amp;lt;math&amp;gt;C&amp;lt;/math&amp;gt; is &amp;lt;math&amp;gt;f(x)= C\cdot\exp(-x-e^{-x}), x \in \mathbb R&amp;lt;/math&amp;gt;, the density function of the ‘extreme-value distribution’, a probability density function?&lt;br /&gt;
* [&#039;&#039;&#039;iid&#039;&#039;&#039;] Let &amp;lt;math&amp;gt;\{X_r : r \ge 1\}&amp;lt;/math&amp;gt; be independent and identically distributed with distribution function &amp;lt;math&amp;gt;F&amp;lt;/math&amp;gt; satisfying &amp;lt;math&amp;gt;F(y)&amp;lt;1&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;y&amp;lt;/math&amp;gt;, and let &amp;lt;math&amp;gt;Y (y)= \min\{k : X_k &amp;gt;y\}&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;\lim_{y\to\infty} \Pr( Y (y)\le \mathbb E[Y (y)])= 1−e^{−1}&amp;lt;/math&amp;gt;.&lt;br /&gt;
* [&#039;&#039;&#039;Tails and moments&#039;&#039;&#039;] If &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; is a continuous random variable and &amp;lt;math&amp;gt;\mathbb E(X^r)&amp;lt;/math&amp;gt; exists, where &amp;lt;math&amp;gt;r \ge 1&amp;lt;/math&amp;gt; is an integer, show that &amp;lt;math&amp;gt;\int_0^\infty x^{r-1}\Pr(|X| &amp;gt;x)dx &amp;lt;\infty&amp;lt;/math&amp;gt;, and &amp;lt;math&amp;gt;x^r\cdot\Pr(|X| &amp;gt;x)\to 0&amp;lt;/math&amp;gt; as &amp;lt;math&amp;gt;x \to\infty&amp;lt;/math&amp;gt;. (&#039;&#039;&#039;Hint&#039;&#039;&#039;. You might need this: for non-negative &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\mathbb E(X^r)=\int_0^\infty rx^{r-1}\Pr(X&amp;gt;x)dx&amp;lt;/math&amp;gt;.)&lt;br /&gt;
* [&#039;&#039;&#039;Conditional expectation&#039;&#039;&#039;] Show that the conditional expectation &amp;lt;math&amp;gt;\psi(X)= \mathbb E(Y | X)&amp;lt;/math&amp;gt; satisfies &amp;lt;math&amp;gt;\mathbb E(\psi(X)g(X))=\mathbb E(Y\cdot g(X))&amp;lt;/math&amp;gt;, for any function &amp;lt;math&amp;gt;g&amp;lt;/math&amp;gt; for which both expectations exist.&lt;br /&gt;
* [&#039;&#039;&#039;Correlated? Independent?&#039;&#039;&#039;] Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be uniformly distributed on &amp;lt;math&amp;gt;[-1,1]&amp;lt;/math&amp;gt;. Are the random variables &amp;lt;math&amp;gt;Z_n = \cos(n\pi X), n =1,2,\dots&amp;lt;/math&amp;gt;, correlated? Are they independent? Explain your answers.&lt;br /&gt;
* [&#039;&#039;&#039;Aliasing method&#039;&#039;&#039;] A finite real vector is called a probability vector if it has non-negative entries with sum &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;. Show that a probability vector &amp;lt;math&amp;gt;\mathbf p&amp;lt;/math&amp;gt; of length &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; may be written in the form &amp;lt;math&amp;gt;\mathbf p=\frac 1{n}\sum^n_{r=1}\mathbf v_r&amp;lt;/math&amp;gt;, where each &amp;lt;math&amp;gt;\mathbf v_r&amp;lt;/math&amp;gt; is a probability vector with at most two non-zero entries. Describe a method, based on this observation, for sampling from &amp;lt;math&amp;gt;\mathbf p&amp;lt;/math&amp;gt; viewed as a probability mass function.&lt;br /&gt;
* [&#039;&#039;&#039;Stochastic domination&#039;&#039;&#039;] Let &amp;lt;math&amp;gt;X, Y&amp;lt;/math&amp;gt; be continuous random variables. Show that &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; dominates &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; stochastically if and only if &amp;lt;math&amp;gt;\mathbb{E}[f(X)]\geq \mathbb{E}[f(Y)]&amp;lt;/math&amp;gt; for any non-decreasing function &amp;lt;math&amp;gt;f&amp;lt;/math&amp;gt; for which the expectations exist.&lt;br /&gt;
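The decomposition in the aliasing-method problem above underlies Walker/Vose alias sampling: build n columns, each mixing at most two outcomes, then sample by picking a uniform column and one of its two outcomes. A minimal sketch (0-indexed outcomes; all names are mine):

```python
import random

def build_alias(p):
    """Alias tables for a probability vector p: prob[i] is the chance the
    i-th column keeps outcome i, alias[i] the outcome it otherwise yields."""
    n = len(p)
    prob = [x * n for x in p]                 # scale so the average entry is 1
    alias = [0] * n
    small = [i for i, x in enumerate(prob) if x < 1]
    large = [i for i, x in enumerate(prob) if x >= 1]
    while small and large:
        s, g = small.pop(), large.pop()
        alias[s] = g                          # g donates mass to fill column s
        prob[g] -= 1 - prob[s]
        (small if prob[g] < 1 else large).append(g)
    for i in small + large:                   # leftovers are full columns
        prob[i] = 1.0
    return prob, alias

def alias_sample(prob, alias, rng):
    i = rng.randrange(len(prob))              # uniform column, as in the decomposition
    return i if rng.random() < prob[i] else alias[i]
```

Each column corresponds to one of the two-point vectors v_r, so sampling takes O(1) time after O(n) preprocessing.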
&lt;br /&gt;
== Problem 3 (Continuous Distributions)==&lt;br /&gt;
* [&#039;&#039;&#039;Uniform Distribution (i)&#039;&#039;&#039;] Let &amp;lt;math&amp;gt;U&amp;lt;/math&amp;gt; be uniform on &amp;lt;math&amp;gt;[0,1]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;0 &amp;lt;q &amp;lt;1&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;X= 1 + \lfloor\ln U/\ln q\rfloor&amp;lt;/math&amp;gt; has a geometric distribution.&lt;br /&gt;
* [&#039;&#039;&#039;Uniform Distribution (ii)&#039;&#039;&#039;] Show that it cannot be the case that &amp;lt;math&amp;gt;U= X + Y&amp;lt;/math&amp;gt; where &amp;lt;math&amp;gt;U&amp;lt;/math&amp;gt; is uniformly distributed on &amp;lt;math&amp;gt;[0,1]&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; are independent and identically distributed. You should not assume that &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; are continuous variables.&lt;br /&gt;
* [&#039;&#039;&#039;Uniform Distribution (iii)&#039;&#039;&#039;] Disprove the existence of uniform distribution over &amp;lt;math&amp;gt;[a,+\infty)&amp;lt;/math&amp;gt; for any &amp;lt;math&amp;gt;a\in\mathbb R&amp;lt;/math&amp;gt;.&lt;br /&gt;
* [&#039;&#039;&#039;Exponential distribution (i)&#039;&#039;&#039;] Prove that exponential distribution is the only memoryless continuous random variable.&lt;br /&gt;
* [&#039;&#039;&#039;Exponential distribution (ii)&#039;&#039;&#039;] Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be exponentially distributed with parameter &amp;lt;math&amp;gt;\lambda&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; be the greatest integer not greater than &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;, and set &amp;lt;math&amp;gt;M = X - N&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;M&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt; are independent. Find the density function of &amp;lt;math&amp;gt;M&amp;lt;/math&amp;gt; and the distribution of &amp;lt;math&amp;gt;N&amp;lt;/math&amp;gt;.&lt;br /&gt;
* [&#039;&#039;&#039;Waiting for offers&#039;&#039;&#039;] I am selling my house, and have decided to accept the first offer exceeding ￥&amp;lt;math&amp;gt;K&amp;lt;/math&amp;gt;. Assuming that offers are independent random variables with common distribution function &amp;lt;math&amp;gt;F&amp;lt;/math&amp;gt;, find the expected number of offers received before I sell the house.&lt;br /&gt;
* [&#039;&#039;&#039;Geometric distribution&#039;&#039;&#039;] Prove that &amp;lt;math&amp;gt;\lfloor X\rfloor&amp;lt;/math&amp;gt; is a geometric random variable, and find its probability mass function, where &amp;lt;math&amp;gt;X\sim\mathrm{Exp}(\lambda)&amp;lt;/math&amp;gt;.&lt;br /&gt;
* [&#039;&#039;&#039;Poisson clocks&#039;&#039;&#039;] Prove that the Poisson point process formed by superposing &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; independent Poisson clocks, each of rate &amp;lt;math&amp;gt;\lambda&amp;lt;/math&amp;gt;, is equivalent to the process of a single clock of rate &amp;lt;math&amp;gt;\lambda k&amp;lt;/math&amp;gt;.&lt;br /&gt;
* [&#039;&#039;&#039;Poissonian bears&#039;&#039;&#039;] In a certain town at time &amp;lt;math&amp;gt;t = 0&amp;lt;/math&amp;gt; there are no bears. Brown bears and grizzly bears arrive as independent Poisson point processes &amp;lt;math&amp;gt;B&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; with respective intensities &amp;lt;math&amp;gt;\beta&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;\gamma&amp;lt;/math&amp;gt;.&lt;br /&gt;
*# Show that the first bear is brown with probability &amp;lt;math&amp;gt;\beta/(\beta+ \gamma)&amp;lt;/math&amp;gt;.&lt;br /&gt;
*# Find the probability that between two consecutive brown bears, there arrive exactly &amp;lt;math&amp;gt;r&amp;lt;/math&amp;gt; grizzly bears.&lt;br /&gt;
* [&#039;&#039;&#039;Bivariate normal distributions (i)&#039;&#039;&#039;] Let &amp;lt;math&amp;gt;f_{X,Y}(x,y)=\frac{1}{2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}}\exp(-\frac{1}{2}Q(x,y))&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;Q(x,y)=\frac{1}{1-\rho^2}\left[\left(\frac{x-\mu_1}{\sigma_1}\right)^2-2\rho\left(\frac{x-\mu_1}{\sigma_1}\right)\left(\frac{y-\mu_2}{\sigma_2}\right)+\left(\frac{y-\mu_2}{\sigma_2}\right)^2\right]&amp;lt;/math&amp;gt; be the density function of the random variable pair &amp;lt;math&amp;gt;(X, Y)&amp;lt;/math&amp;gt;. Find the means and variances of &amp;lt;math&amp;gt;X, Y&amp;lt;/math&amp;gt; and their covariance.&lt;br /&gt;
* [&#039;&#039;&#039;Bivariate normal distributions (ii)&#039;&#039;&#039;] Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; have the &amp;lt;math&amp;gt;N(0,1)&amp;lt;/math&amp;gt; distribution and let &amp;lt;math&amp;gt;a &amp;gt;0&amp;lt;/math&amp;gt;. Show that the random variable &amp;lt;math&amp;gt;Y&amp;lt;/math&amp;gt; given by &amp;lt;math&amp;gt;Y=&lt;br /&gt;
\begin{cases}X&amp;amp;\text{ if }|X| &amp;lt;a \\&lt;br /&gt;
-X &amp;amp;\text{ if } |X| \ge a&lt;br /&gt;
\end{cases}&amp;lt;/math&amp;gt; has the &amp;lt;math&amp;gt;N(0,1)&amp;lt;/math&amp;gt; distribution, and find an expression for &amp;lt;math&amp;gt;\rho(a)= \mathrm{cov}(X,Y )&amp;lt;/math&amp;gt; in terms of the density function &amp;lt;math&amp;gt;\varphi&amp;lt;/math&amp;gt; of &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;. Does the pair &amp;lt;math&amp;gt;(X,Y )&amp;lt;/math&amp;gt; have a bivariate normal distribution?&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13451</id>
		<title>高级算法 (Spring 2026)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13451"/>
		<updated>2026-02-23T17:23:29Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* 教学大纲 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;高级算法 &lt;br /&gt;
&amp;lt;br&amp;gt;Advanced Algorithms&amp;lt;/font&amp;gt;&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =任课教师&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data2   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header3 = &lt;br /&gt;
|label3  = 电子邮件&lt;br /&gt;
|data3   = lmm@nju.edu.cn &lt;br /&gt;
|header4 =&lt;br /&gt;
|label4= 办公室&lt;br /&gt;
|data4= 南雍-西229&lt;br /&gt;
|header5 = &lt;br /&gt;
|label5  = &lt;br /&gt;
|header11 = 课程时间地点&lt;br /&gt;
|label11  = &lt;br /&gt;
|data11   = &lt;br /&gt;
|header12 =&lt;br /&gt;
|label12  = 教室&lt;br /&gt;
|data12   = 周一，9am-12pm&amp;lt;br&amp;gt;苏教B207&lt;br /&gt;
|header13 =&lt;br /&gt;
|label13  = Place&lt;br /&gt;
|data13   = &lt;br /&gt;
|header14 =&lt;br /&gt;
|label14  = 答疑时间&lt;br /&gt;
|data14   = 周五，2pm-5pm&amp;lt;br&amp;gt;南雍-西229&lt;br /&gt;
|header15 = 教材&lt;br /&gt;
|label15  = &lt;br /&gt;
|data15   = &lt;br /&gt;
|header16 =&lt;br /&gt;
|label16  = &lt;br /&gt;
|data16   = [[File:MR-randomized-algorithms.png|border|100px]]&lt;br /&gt;
|header17 =&lt;br /&gt;
|label17  = &lt;br /&gt;
|data17   = Motwani and Raghavan. &amp;lt;br&amp;gt;&#039;&#039;Randomized Algorithms&#039;&#039;.&amp;lt;br&amp;gt; Cambridge Univ Press, 1995.&lt;br /&gt;
|header18 =&lt;br /&gt;
|label18  = &lt;br /&gt;
|data18   = [[File:Approximation_Algorithms.jpg|border|100px]]&lt;br /&gt;
|header19 =&lt;br /&gt;
|label19  = &lt;br /&gt;
|data19   =  Vazirani. &amp;lt;br&amp;gt;&#039;&#039;Approximation Algorithms&#039;&#039;. &amp;lt;br&amp;gt; Springer-Verlag, 2001.&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Advanced Algorithms&#039;&#039; class of spring 2026. Students who take this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= 通知 =&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;(2026/3/2)&#039;&#039;&#039; 第一堂课&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= 课程信息 =&lt;br /&gt;
* &#039;&#039;&#039;任课教师&#039;&#039;&#039;: &lt;br /&gt;
:* [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn 📧]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;助教&#039;&#039;&#039;: &lt;br /&gt;
** 王思齐：[mailto:siqi_wang@smail.nju.edu.cn 📧]&lt;br /&gt;
** 齐世毅：[mailto:1083951258@qq.com 📧]&lt;br /&gt;
* &#039;&#039;&#039;课程时间地点&#039;&#039;&#039;: &lt;br /&gt;
** 周一，9am-12pm，苏教B207&lt;br /&gt;
* &#039;&#039;&#039;答疑时间&#039;&#039;&#039;: 周五, 2pm-5pm, 南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1083465754&lt;br /&gt;
&lt;br /&gt;
= 教学大纲 =&lt;br /&gt;
随着计算机算法理论的不断发展，现代计算机算法的设计与分析大量地使用非初等的数学工具以及非传统的算法思想。“高级算法”这门课程就是面向计算机算法的这一发展趋势而设立的。课程将针对传统算法课程未系统涉及、却在计算机科学各领域的科研和实践中扮演重要角色的高等算法设计思想和算法分析工具进行系统讲授。&lt;br /&gt;
&lt;br /&gt;
课程内容分为五大部分：&lt;br /&gt;
* 基于哈希的大数据算法&lt;br /&gt;
* 哈希表与面向大数据的现代计算场景&lt;br /&gt;
* 测度的集中与处理高维数据&lt;br /&gt;
* 最大流与线性规划&lt;br /&gt;
* 其他重要话题&lt;br /&gt;
&lt;br /&gt;
=== 先修课程 ===&lt;br /&gt;
* 必须：离散数学，概率论，线性代数。&lt;br /&gt;
* 推荐：算法设计与分析。&lt;br /&gt;
&lt;br /&gt;
=== 课程教材 ===&lt;br /&gt;
本门课较为前沿，大部分课程内容还没有进入任何教材。以下教材和参考书仅作为参考。&lt;br /&gt;
* [[高级算法 (Fall 2024) / Course materials|&amp;lt;font size=3&amp;gt;教材和参考书&amp;lt;/font&amp;gt;]]&lt;br /&gt;
&lt;br /&gt;
=== 成绩 ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考核。最终成绩将由平时作业成绩和期末考核成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力地维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= 课后作业 =&lt;br /&gt;
Late policy: In general, we will accommodate late submission requests ONLY IF you made such requests ahead of time. &lt;br /&gt;
&lt;br /&gt;
= 课件及相关阅读资料 =&lt;br /&gt;
# [https://box.nju.edu.cn/f/980d814e4ad64285a640/ Fingerprinting]&lt;br /&gt;
#* Polynomial Identity Testing&lt;br /&gt;
#* Communication Complexity (Equality)&lt;br /&gt;
#* Application: Bipartite Perfect Matching, Checking Matrix Multiplication&lt;br /&gt;
#* Karp-Rabin Algorithm (string-searching), Lipton’s Algorithm (checking identity of multisets)&lt;br /&gt;
&lt;br /&gt;
= Related Online Courses=&lt;br /&gt;
* [https://www.cs.columbia.edu/~andoni/advancedS24/index.html Advanced Algorithms] by Alexandr Andoni at Columbia University.&lt;br /&gt;
* [https://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15854-f21/www/ Advanced Approximation Algorithms] by Anupam Gupta at CMU.&lt;br /&gt;
* [http://people.csail.mit.edu/moitra/854.html Advanced Algorithms] by Ankur Moitra at MIT.&lt;br /&gt;
* [https://6.5210.csail.mit.edu/ Advanced Algorithms] by David Karger at MIT.&lt;br /&gt;
* [https://www.cs.cmu.edu/~dwoodruf/teaching/15851-spring24/ Algorithms for Big Data] by David Woodruff at CMU.&lt;br /&gt;
* [https://www.sketchingbigdata.org/fall20/lec/ Sketching Algorithms] by Jelani Nelson at UC Berkeley.&lt;br /&gt;
* [http://web.stanford.edu/class/cs168/index.html The Modern Algorithmic Toolbox] by Tim Roughgarden and Gregory Valiant at Stanford.&lt;br /&gt;
* [https://www.cs.princeton.edu/courses/archive/fall18/cos521/ Advanced Algorithm Design] by Pravesh Kothari and Christopher Musco at Princeton.&lt;br /&gt;
* [http://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15859-f11/www/ Linear and Semidefinite Programming (Advanced Algorithms)] by Anupam Gupta and Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://www.cs.cmu.edu/~odonnell/papers/cs-theory-toolkit-lecture-notes.pdf CS Theory Toolkit] by Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://cs.uwaterloo.ca/~lapchi/cs860/index.html Eigenvalues and Polynomials] by Lap Chi Lau at University of Waterloo.&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13450</id>
		<title>高级算法 (Spring 2026)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13450"/>
		<updated>2026-02-23T17:22:59Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* 课件及相关阅读资料 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;高级算法 &lt;br /&gt;
&amp;lt;br&amp;gt;Advanced Algorithms&amp;lt;/font&amp;gt;&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =任课教师&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data2   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header3 = &lt;br /&gt;
|label3  = 电子邮件&lt;br /&gt;
|data3   = lmm@nju.edu.cn &lt;br /&gt;
|header4 =&lt;br /&gt;
|label4= 办公室&lt;br /&gt;
|data4= 南雍-西229&lt;br /&gt;
|header5 = &lt;br /&gt;
|label5  = &lt;br /&gt;
|header11 = 课程时间地点&lt;br /&gt;
|label11  = &lt;br /&gt;
|data11   = &lt;br /&gt;
|header12 =&lt;br /&gt;
|label12  = 教室&lt;br /&gt;
|data12   = 周一，9am-12pm&amp;lt;br&amp;gt;苏教B207&lt;br /&gt;
|header13 =&lt;br /&gt;
|label13  = Place&lt;br /&gt;
|data13   = &lt;br /&gt;
|header14 =&lt;br /&gt;
|label14  = 答疑时间&lt;br /&gt;
|data14   = 周五，2pm-5pm&amp;lt;br&amp;gt;南雍-西229&lt;br /&gt;
|header15 = 教材&lt;br /&gt;
|label15  = &lt;br /&gt;
|data15   = &lt;br /&gt;
|header16 =&lt;br /&gt;
|label16  = &lt;br /&gt;
|data16   = [[File:MR-randomized-algorithms.png|border|100px]]&lt;br /&gt;
|header17 =&lt;br /&gt;
|label17  = &lt;br /&gt;
|data17   = Motwani and Raghavan. &amp;lt;br&amp;gt;&#039;&#039;Randomized Algorithms&#039;&#039;.&amp;lt;br&amp;gt; Cambridge Univ Press, 1995.&lt;br /&gt;
|header18 =&lt;br /&gt;
|label18  = &lt;br /&gt;
|data18   = [[File:Approximation_Algorithms.jpg|border|100px]]&lt;br /&gt;
|header19 =&lt;br /&gt;
|label19  = &lt;br /&gt;
|data19   =  Vazirani. &amp;lt;br&amp;gt;&#039;&#039;Approximation Algorithms&#039;&#039;. &amp;lt;br&amp;gt; Springer-Verlag, 2001.&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Advanced Algorithms&#039;&#039; class of spring 2026. Students who take this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= 通知 =&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;(2026/3/2)&#039;&#039;&#039; 第一堂课&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= 课程信息 =&lt;br /&gt;
* &#039;&#039;&#039;任课教师&#039;&#039;&#039;: &lt;br /&gt;
:* [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn 📧]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;助教&#039;&#039;&#039;: &lt;br /&gt;
** 王思齐：[mailto:siqi_wang@smail.nju.edu.cn 📧]&lt;br /&gt;
** 齐世毅：[mailto:1083951258@qq.com 📧]&lt;br /&gt;
* &#039;&#039;&#039;课程时间地点&#039;&#039;&#039;: &lt;br /&gt;
** 周一，9am-12pm，苏教B207&lt;br /&gt;
* &#039;&#039;&#039;答疑时间&#039;&#039;&#039;: 周五, 2pm-5pm, 南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1083465754&lt;br /&gt;
&lt;br /&gt;
= 教学大纲 =&lt;br /&gt;
随着计算机算法理论的不断发展，现代计算机算法的设计与分析大量地使用非初等的数学工具以及非传统的算法思想。“高级算法”这门课程就是面向计算机算法的这一发展趋势而设立的。课程将针对传统算法课程未系统涉及、却在计算机科学各领域的科研和实践中扮演重要角色的高等算法设计思想和算法分析工具进行系统讲授。&lt;br /&gt;
&lt;br /&gt;
课程内容分为五大部分：&lt;br /&gt;
* 基于哈希的大数据算法&lt;br /&gt;
* 哈希表与面向大数据的现代计算场景&lt;br /&gt;
* 测度的集中与处理高维数据&lt;br /&gt;
* 最大流与线性/整数规划&lt;br /&gt;
* 其他重要话题&lt;br /&gt;
&lt;br /&gt;
=== 先修课程 ===&lt;br /&gt;
* 必须：离散数学，概率论，线性代数。&lt;br /&gt;
* 推荐：算法设计与分析。&lt;br /&gt;
&lt;br /&gt;
=== 课程教材 ===&lt;br /&gt;
本门课较为前沿，大部分课程内容还没有进入任何教材。以下教材和参考书仅作为参考。&lt;br /&gt;
* [[高级算法 (Fall 2024) / Course materials|&amp;lt;font size=3&amp;gt;教材和参考书&amp;lt;/font&amp;gt;]]&lt;br /&gt;
&lt;br /&gt;
=== 成绩 ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考核。最终成绩将由平时作业成绩和期末考核成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力地维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= 课后作业 =&lt;br /&gt;
Late policy: In general, we will accommodate late submission requests ONLY IF you make such requests ahead of time. &lt;br /&gt;
&lt;br /&gt;
= 课件及相关阅读资料 =&lt;br /&gt;
# [https://box.nju.edu.cn/f/980d814e4ad64285a640/ Fingerprinting]&lt;br /&gt;
#* Polynomial Identity Testing&lt;br /&gt;
#* Communication Complexity (Equality)&lt;br /&gt;
#* Application: Bipartite Perfect Matching, Checking Matrix Multiplication&lt;br /&gt;
#* Karp-Rabin Algorithm (string-searching), Lipton’s Algorithm (checking identity of multisets)&lt;br /&gt;
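The Karp-Rabin string search listed above can be illustrated with a minimal sketch. This is not the course's reference implementation; the base and modulus below are illustrative parameter choices, and a direct comparison confirms each hash hit to rule out false positives.

```python
# Minimal sketch of Karp-Rabin rolling-hash string search.
# base/mod are illustrative, not prescribed by the course.

def karp_rabin(text, pattern, base=256, mod=1_000_003):
    n, m = len(text), len(pattern)
    if m == 0 or m > n:
        return []
    # Hash the pattern and the first length-m window of the text.
    hp = ht = 0
    for i in range(m):
        hp = (hp * base + ord(pattern[i])) % mod
        ht = (ht * base + ord(text[i])) % mod
    high = pow(base, m - 1, mod)  # weight of the window's leading character
    matches = []
    for i in range(n - m + 1):
        # Verify candidate matches directly: equal hashes may be collisions.
        if ht == hp and text[i:i + m] == pattern:
            matches.append(i)
        if i != n - m:
            # Roll the hash: drop text[i], append text[i + m].
            ht = ((ht - ord(text[i]) * high) * base + ord(text[i + m])) % mod
    return matches
```

With a fixed modulus this is a Monte Carlo-style fingerprint in spirit; the explicit verification step makes the sketch deterministic at the cost of extra comparisons on collisions.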
&lt;br /&gt;
= Related Online Courses=&lt;br /&gt;
* [https://www.cs.columbia.edu/~andoni/advancedS24/index.html Advanced Algorithms] by Alexandr Andoni at Columbia University.&lt;br /&gt;
* [https://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15854-f21/www/ Advanced Approximation Algorithms] by Anupam Gupta at CMU.&lt;br /&gt;
* [http://people.csail.mit.edu/moitra/854.html Advanced Algorithms] by Ankur Moitra at MIT.&lt;br /&gt;
* [https://6.5210.csail.mit.edu/ Advanced Algorithms] by David Karger at MIT.&lt;br /&gt;
* [https://www.cs.cmu.edu/~dwoodruf/teaching/15851-spring24/ Algorithms for Big Data] by David Woodruff at CMU.&lt;br /&gt;
* [https://www.sketchingbigdata.org/fall20/lec/ Sketching Algorithms] by Jelani Nelson at UC Berkeley.&lt;br /&gt;
* [http://web.stanford.edu/class/cs168/index.html The Modern Algorithmic Toolbox] by Tim Roughgarden and Gregory Valiant at Stanford.&lt;br /&gt;
* [https://www.cs.princeton.edu/courses/archive/fall18/cos521/ Advanced Algorithm Design] by Pravesh Kothari and Christopher Musco at Princeton.&lt;br /&gt;
* [http://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15859-f11/www/ Linear and Semidefinite Programming (Advanced Algorithms)] by Anupam Gupta and Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://www.cs.cmu.edu/~odonnell/papers/cs-theory-toolkit-lecture-notes.pdf CS Theory Toolkit] by Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://cs.uwaterloo.ca/~lapchi/cs860/index.html Eigenvalues and Polynomials] by Lap Chi Lau at University of Waterloo.&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=Main_Page&amp;diff=13449</id>
		<title>Main Page</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=Main_Page&amp;diff=13449"/>
		<updated>2026-02-23T17:15:09Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* Home Pages for Courses and Seminars */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This is a course/seminar wiki run by the [http://tcs.nju.edu.cn theory group] in the Department of Computer Science and Technology at Nanjing University.&lt;br /&gt;
&lt;br /&gt;
== Home Pages for Courses and Seminars==&lt;br /&gt;
;Current semester&lt;br /&gt;
* [[高级算法 (Fall 2025)|高级算法 Advanced Algorithms (Fall 2025)]]&lt;br /&gt;
&lt;br /&gt;
* [[高级算法 (Spring 2026)|高级算法 Advanced Algorithms (Spring 2026, Suzhou)]]&lt;br /&gt;
&lt;br /&gt;
;Past courses&lt;br /&gt;
&lt;br /&gt;
* Advanced Algorithms: [[高级算法 (Spring 2025)|Spring 2025(Suzhou)]], [[高级算法 (Fall 2024)|Fall 2024]], [[高级算法 (Fall 2023)|Fall 2023]], [[高级算法 (Fall 2022)|Fall 2022]], [[高级算法 (Fall 2021)|Fall 2021]], [[高级算法 (Fall 2020)|Fall 2020]], [[高级算法 (Fall 2019)|Fall 2019]], [[高级算法 (Fall 2018)|Fall 2018]], [[高级算法 (Fall 2017)|Fall 2017]], [[随机算法 \ 高级算法 (Fall 2016)|Fall 2016]].&lt;br /&gt;
&lt;br /&gt;
*Algorithm Design and Analysis: [https://tcs.nju.edu.cn/shili/courses/2024spring-algo/ Spring 2024]&lt;br /&gt;
&lt;br /&gt;
* Combinatorics: [[组合数学 (Spring 2025)|Spring 2025]], [[组合数学 (Spring 2024)|Spring 2024]], [[组合数学 (Spring 2023)|Spring 2023]], [[组合数学 (Fall 2019)|Fall 2019]], [[组合数学 (Fall 2017)|Fall 2017]], [[组合数学 (Fall 2016)|Fall 2016]], [[组合数学 (Fall 2015)|Fall 2015]], [[组合数学 (Spring 2014)|Spring 2014]], [[组合数学 (Spring 2013)|Spring 2013]], [[组合数学 (Fall 2011)|Fall 2011]], [[Combinatorics (Fall 2010)|Fall 2010]].&lt;br /&gt;
&lt;br /&gt;
* Computational Complexity: [[计算复杂性 (Spring 2025)|Spring 2025]], [[计算复杂性 (Spring 2024)|Spring 2024]], [[计算复杂性 (Spring 2023)|Spring 2023]], [[计算复杂性 (Fall 2019)|Fall 2019]], [[计算复杂性 (Fall 2018)|Fall 2018]].&lt;br /&gt;
&lt;br /&gt;
* Foundations of Data Science: [[数据科学基础 (Fall 2025)|Fall 2025]], [[数据科学基础 (Fall 2024)|Fall 2024]]&lt;br /&gt;
&lt;br /&gt;
* Numerical Method: [[计算方法 Numerical method (Spring 2025)|Spring 2025]], [[计算方法 Numerical method (Spring 2024)|Spring 2024]], [[计算方法 Numerical method (Spring 2023)|Spring 2023]], [https://liuexp.github.io/numerical.html Spring 2022].&lt;br /&gt;
&lt;br /&gt;
* Probability Theory: [[概率论与数理统计 (Spring 2025)|Spring 2025]], [[概率论与数理统计 (Spring 2024)|Spring 2024]], [[概率论与数理统计 (Spring 2023)|Spring 2023]].&lt;br /&gt;
&lt;br /&gt;
* Quantum Computation: [[量子计算 (Spring 2022)|Spring 2022]], [[量子计算 (Spring 2021)|Spring 2021]], [[量子计算 (Fall 2019)|Fall 2019]].&lt;br /&gt;
&lt;br /&gt;
* Randomized Algorithms:  [[随机算法 (Fall 2015)|Fall 2015]], [[随机算法 (Spring 2014)|Spring 2014]], [[随机算法 (Spring 2013)|Spring 2013]], [[随机算法 (Fall 2011)|Fall 2011]], [[Randomized Algorithms (Spring 2010)|Spring 2010]].&lt;br /&gt;
&lt;br /&gt;
;Past seminars, workshops and summer schools&lt;br /&gt;
*计算理论之美暑期学校: [[计算理论之美 (Summer 2025)|2025]], [[计算理论之美 (Summer 2024)|2024]], [[计算理论之美 (Summer 2023)|2023]], [[计算理论之美 (Summer 2021)|2021]]&lt;br /&gt;
*[[Theory Seminar|理论计算机科学讨论班]]&lt;br /&gt;
*[[Study Group|理论计算机科学学习小组]]&lt;br /&gt;
*[[TCSPhD2020| 理论计算机科学优秀博士生论坛2020]]&lt;br /&gt;
*[[Quantum|量子算法与物理实现研讨会]]&lt;br /&gt;
*Theory Day: [[Theory@Suzhou 2025 | 2025 (Suzhou)]],  [[Theory@Nanjing 2019|2019]], [[Theory@Nanjing 2018|2018]], [[Theory@Nanjing 2017|2017]]&lt;br /&gt;
*[[\Delta Seminar on Logic, Philosophy, and Computer Science|Δ Seminar on Logic, Philosophy, and Computer Science]]&lt;br /&gt;
*[[近似算法讨论班 (Fall 2011)|近似算法 Approximation Algorithms, Fall 2011.]]&lt;br /&gt;
&lt;br /&gt;
; 其它链接&lt;br /&gt;
* [[General Circulation(Fall 2025)|大气环流 General Circulation of the Atmosphere, Fall 2025]]&lt;br /&gt;
* [[General Circulation(Fall 2024)|大气环流 General Circulation of the Atmosphere, Fall 2024]]&lt;br /&gt;
&lt;br /&gt;
* [[概率论 (Summer 2014)| 概率与计算 (上海交大 Summer 2014)]]&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13448</id>
		<title>高级算法 (Spring 2026)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13448"/>
		<updated>2026-02-23T17:13:36Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;高级算法 &lt;br /&gt;
&amp;lt;br&amp;gt;Advanced Algorithms&amp;lt;/font&amp;gt;&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =任课教师&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data2   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header3 = &lt;br /&gt;
|label3  = 电子邮件&lt;br /&gt;
|data3   = lmm@nju.edu.cn &lt;br /&gt;
|header4 =&lt;br /&gt;
|label4= 办公室&lt;br /&gt;
|data4= 南雍-西229&lt;br /&gt;
|header5 = &lt;br /&gt;
|label5  = &lt;br /&gt;
|header11 = 课程时间地点&lt;br /&gt;
|label11  = &lt;br /&gt;
|data11   = &lt;br /&gt;
|header12 =&lt;br /&gt;
|label12  = 教室&lt;br /&gt;
|data12   = 周一，9am-12pm&amp;lt;br&amp;gt;苏教B207&lt;br /&gt;
|header13 =&lt;br /&gt;
|label13  = Place&lt;br /&gt;
|data13   = &lt;br /&gt;
|header14 =&lt;br /&gt;
|label14  = 答疑时间&lt;br /&gt;
|data14   = 周五，2pm-5pm&amp;lt;br&amp;gt;南雍-西229&lt;br /&gt;
|header15 = 教材&lt;br /&gt;
|label15  = &lt;br /&gt;
|data15   = &lt;br /&gt;
|header16 =&lt;br /&gt;
|label16  = &lt;br /&gt;
|data16   = [[File:MR-randomized-algorithms.png|border|100px]]&lt;br /&gt;
|header17 =&lt;br /&gt;
|label17  = &lt;br /&gt;
|data17   = Motwani and Raghavan. &amp;lt;br&amp;gt;&#039;&#039;Randomized Algorithms&#039;&#039;.&amp;lt;br&amp;gt; Cambridge Univ Press, 1995.&lt;br /&gt;
|header18 =&lt;br /&gt;
|label18  = &lt;br /&gt;
|data18   = [[File:Approximation_Algorithms.jpg|border|100px]]&lt;br /&gt;
|header19 =&lt;br /&gt;
|label19  = &lt;br /&gt;
|data19   =  Vazirani. &amp;lt;br&amp;gt;&#039;&#039;Approximation Algorithms&#039;&#039;. &amp;lt;br&amp;gt; Springer-Verlag, 2001.&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Advanced Algorithms&#039;&#039; class of spring 2026. Students who take this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= 通知 =&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;(2026/3/2)&#039;&#039;&#039; 第一堂课&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= 课程信息 =&lt;br /&gt;
* &#039;&#039;&#039;任课教师&#039;&#039;&#039;: &lt;br /&gt;
:* [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn 📧]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;助教&#039;&#039;&#039;: &lt;br /&gt;
** 王思齐：[mailto:siqi_wang@smail.nju.edu.cn 📧]&lt;br /&gt;
** 齐世毅：[mailto:1083951258@qq.com 📧]&lt;br /&gt;
* &#039;&#039;&#039;课程时间地点&#039;&#039;&#039;: &lt;br /&gt;
** 周一，9am-12pm，苏教B207&lt;br /&gt;
* &#039;&#039;&#039;答疑时间&#039;&#039;&#039;: 周五, 2pm-5pm, 南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1083465754&lt;br /&gt;
&lt;br /&gt;
= 教学大纲 =&lt;br /&gt;
随着计算机算法理论的不断发展，现代计算机算法的设计与分析大量地使用非初等的数学工具以及非传统的算法思想。“高级算法”这门课程就是面向计算机算法的这一发展趋势而设立的。课程将针对传统算法课程未系统涉及、却在计算机科学各领域的科研和实践中扮演重要角色的高等算法设计思想和算法分析工具进行系统讲授。&lt;br /&gt;
&lt;br /&gt;
课程内容分为五大部分：&lt;br /&gt;
* 基于哈希的大数据算法&lt;br /&gt;
* 哈希表与面向大数据的现代计算场景&lt;br /&gt;
* 测度的集中与处理高维数据&lt;br /&gt;
* 最大流与线性/整数规划&lt;br /&gt;
* 其他重要话题&lt;br /&gt;
&lt;br /&gt;
=== 先修课程 ===&lt;br /&gt;
* 必须：离散数学，概率论，线性代数。&lt;br /&gt;
* 推荐：算法设计与分析。&lt;br /&gt;
&lt;br /&gt;
=== 课程教材 ===&lt;br /&gt;
本门课较为前沿，大部分课程内容还没有进入任何教材。以下教材和参考书仅作为参考。&lt;br /&gt;
* [[高级算法 (Fall 2024) / Course materials|&amp;lt;font size=3&amp;gt;教材和参考书&amp;lt;/font&amp;gt;]]&lt;br /&gt;
&lt;br /&gt;
=== 成绩 ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考核。最终成绩将由平时作业成绩和期末考核成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力地维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= 课后作业 =&lt;br /&gt;
Late policy: In general, we will accommodate late submission requests ONLY IF you make such requests ahead of time. &lt;br /&gt;
&lt;br /&gt;
= 课件及相关阅读资料 =&lt;br /&gt;
# Fingerprinting&lt;br /&gt;
#* Polynomial Identity Testing&lt;br /&gt;
#* Communication Complexity (Equality)&lt;br /&gt;
#* Application: Bipartite Perfect Matching, Checking Matrix Multiplication&lt;br /&gt;
#* Karp-Rabin Algorithm (string-searching), Lipton’s Algorithm (checking identity of multisets)&lt;br /&gt;
&lt;br /&gt;
= Related Online Courses=&lt;br /&gt;
* [https://www.cs.columbia.edu/~andoni/advancedS24/index.html Advanced Algorithms] by Alexandr Andoni at Columbia University.&lt;br /&gt;
* [https://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15854-f21/www/ Advanced Approximation Algorithms] by Anupam Gupta at CMU.&lt;br /&gt;
* [http://people.csail.mit.edu/moitra/854.html Advanced Algorithms] by Ankur Moitra at MIT.&lt;br /&gt;
* [https://6.5210.csail.mit.edu/ Advanced Algorithms] by David Karger at MIT.&lt;br /&gt;
* [https://www.cs.cmu.edu/~dwoodruf/teaching/15851-spring24/ Algorithms for Big Data] by David Woodruff at CMU.&lt;br /&gt;
* [https://www.sketchingbigdata.org/fall20/lec/ Sketching Algorithms] by Jelani Nelson at UC Berkeley.&lt;br /&gt;
* [http://web.stanford.edu/class/cs168/index.html The Modern Algorithmic Toolbox] by Tim Roughgarden and Gregory Valiant at Stanford.&lt;br /&gt;
* [https://www.cs.princeton.edu/courses/archive/fall18/cos521/ Advanced Algorithm Design] by Pravesh Kothari and Christopher Musco at Princeton.&lt;br /&gt;
* [http://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15859-f11/www/ Linear and Semidefinite Programming (Advanced Algorithms)] by Anupam Gupta and Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://www.cs.cmu.edu/~odonnell/papers/cs-theory-toolkit-lecture-notes.pdf CS Theory Toolkit] by Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://cs.uwaterloo.ca/~lapchi/cs860/index.html Eigenvalues and Polynomials] by Lap Chi Lau at University of Waterloo.&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13447</id>
		<title>高级算法 (Spring 2026)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13447"/>
		<updated>2026-02-23T17:13:24Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* 课程信息 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;高级算法 &lt;br /&gt;
&amp;lt;br&amp;gt;Advanced Algorithms&amp;lt;/font&amp;gt;&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =任课教师&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data2   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header3 = &lt;br /&gt;
|label3  = 电子邮件&lt;br /&gt;
|data3   = lmm@nju.edu.cn &lt;br /&gt;
|header4 =&lt;br /&gt;
|label4= 办公室&lt;br /&gt;
|data4= 南雍-西229&lt;br /&gt;
|header5 = &lt;br /&gt;
|label5  = &lt;br /&gt;
|header11 = 课程时间地点&lt;br /&gt;
|label11  = &lt;br /&gt;
|data11   = &lt;br /&gt;
|header12 =&lt;br /&gt;
|label12  = 教室&lt;br /&gt;
|data12   = 周一，9am-12pm&amp;lt;br&amp;gt;苏教B207&lt;br /&gt;
|header13 =&lt;br /&gt;
|label13  = Place&lt;br /&gt;
|data13   = &lt;br /&gt;
|header14 =&lt;br /&gt;
|label14  = 答疑时间&lt;br /&gt;
|data14   = 周五，2pm-5pm&amp;lt;br&amp;gt;南雍-西229&lt;br /&gt;
|header15 = 教材&lt;br /&gt;
|label15  = &lt;br /&gt;
|data15   = &lt;br /&gt;
|header16 =&lt;br /&gt;
|label16  = &lt;br /&gt;
|data16   = [[File:MR-randomized-algorithms.png|border|100px]]&lt;br /&gt;
|header17 =&lt;br /&gt;
|label17  = &lt;br /&gt;
|data17   = Motwani and Raghavan. &amp;lt;br&amp;gt;&#039;&#039;Randomized Algorithms&#039;&#039;.&amp;lt;br&amp;gt; Cambridge Univ Press, 1995.&lt;br /&gt;
|header18 =&lt;br /&gt;
|label18  = &lt;br /&gt;
|data18   = [[File:Approximation_Algorithms.jpg|border|100px]]&lt;br /&gt;
|header19 =&lt;br /&gt;
|label19  = &lt;br /&gt;
|data19   =  Vazirani. &amp;lt;br&amp;gt;&#039;&#039;Approximation Algorithms&#039;&#039;. &amp;lt;br&amp;gt; Springer-Verlag, 2001.&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Advanced Algorithms&#039;&#039; class of spring 2026. Students who take this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= 通知 =&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;(2026/3/2)&#039;&#039;&#039; 第一堂课&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= 课程信息 =&lt;br /&gt;
* &#039;&#039;&#039;任课教师&#039;&#039;&#039;: &lt;br /&gt;
:* [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn 📧]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;助教&#039;&#039;&#039;: &lt;br /&gt;
** 王思齐：[mailto:siqi_wang@smail.nju.edu.cn 📧]&lt;br /&gt;
** 齐世毅：[mailto:1083951258@qq.com 📧]&lt;br /&gt;
* &#039;&#039;&#039;课程时间地点&#039;&#039;&#039;: &lt;br /&gt;
** 周一，9am-12pm，苏教B207&lt;br /&gt;
* &#039;&#039;&#039;答疑时间&#039;&#039;&#039;: 周五, 2pm-5pm, 南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1083465754&lt;br /&gt;
&lt;br /&gt;
= 教学大纲 =&lt;br /&gt;
随着计算机算法理论的不断发展，现代计算机算法的设计与分析大量地使用非初等的数学工具以及非传统的算法思想。“高级算法”这门课程就是面向计算机算法的这一发展趋势而设立的。课程将针对传统算法课程未系统涉及、却在计算机科学各领域的科研和实践中扮演重要角色的高等算法设计思想和算法分析工具进行系统讲授。&lt;br /&gt;
&lt;br /&gt;
课程内容分为五大部分：&lt;br /&gt;
* 基于哈希的大数据算法&lt;br /&gt;
* 哈希表与面向大数据的现代计算场景&lt;br /&gt;
* 测度的集中与处理高维数据&lt;br /&gt;
* 最大流与线性/整数规划&lt;br /&gt;
* 其他重要话题&lt;br /&gt;
&lt;br /&gt;
=== 先修课程 ===&lt;br /&gt;
* 必须：离散数学，概率论，线性代数。&lt;br /&gt;
* 推荐：算法设计与分析。&lt;br /&gt;
&lt;br /&gt;
=== 课程教材 ===&lt;br /&gt;
本门课较为前沿，大部分课程内容还没有进入任何教材。以下教材和参考书仅作为参考。&lt;br /&gt;
* [[高级算法 (Fall 2024) / Course materials|&amp;lt;font size=3&amp;gt;教材和参考书&amp;lt;/font&amp;gt;]]&lt;br /&gt;
&lt;br /&gt;
=== 成绩 ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考核。最终成绩将由平时作业成绩和期末考核成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力地维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= 课后作业 =&lt;br /&gt;
Late policy: In general, we will accommodate late submission requests ONLY IF you make such requests ahead of time. &lt;br /&gt;
&lt;br /&gt;
= 课件及相关阅读资料 =&lt;br /&gt;
# Fingerprinting&lt;br /&gt;
#* Polynomial Identity Testing&lt;br /&gt;
#* Communication Complexity (Equality)&lt;br /&gt;
#* Application: Bipartite Perfect Matching, Checking Matrix Multiplication&lt;br /&gt;
#* Karp-Rabin Algorithm (string-searching), Lipton’s Algorithm (checking identity of multisets)&lt;br /&gt;
&lt;br /&gt;
= Related Online Courses=&lt;br /&gt;
* [https://www.cs.columbia.edu/~andoni/advancedS24/index.html Advanced Algorithms] by Alexandr Andoni at Columbia University.&lt;br /&gt;
* [https://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15854-f21/www/ Advanced Approximation Algorithms] by Anupam Gupta at CMU.&lt;br /&gt;
* [http://people.csail.mit.edu/moitra/854.html Advanced Algorithms] by Ankur Moitra at MIT.&lt;br /&gt;
* [https://6.5210.csail.mit.edu/ Advanced Algorithms] by David Karger at MIT.&lt;br /&gt;
* [https://www.cs.cmu.edu/~dwoodruf/teaching/15851-spring24/ Algorithms for Big Data] by David Woodruff at CMU.&lt;br /&gt;
* [https://www.sketchingbigdata.org/fall20/lec/ Sketching Algorithms] by Jelani Nelson at UC Berkeley.&lt;br /&gt;
* [http://web.stanford.edu/class/cs168/index.html The Modern Algorithmic Toolbox] by Tim Roughgarden and Gregory Valiant at Stanford.&lt;br /&gt;
* [https://www.cs.princeton.edu/courses/archive/fall18/cos521/ Advanced Algorithm Design] by Pravesh Kothari and Christopher Musco at Princeton.&lt;br /&gt;
* [http://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15859-f11/www/ Linear and Semidefinite Programming (Advanced Algorithms)] by Anupam Gupta and Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://www.cs.cmu.edu/~odonnell/papers/cs-theory-toolkit-lecture-notes.pdf CS Theory Toolkit] by Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://cs.uwaterloo.ca/~lapchi/cs860/index.html Eigenvalues and Polynomials] by Lap Chi Lau at University of Waterloo.&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13446</id>
		<title>高级算法 (Spring 2026)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13446"/>
		<updated>2026-02-23T17:10:32Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* 课程信息 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;高级算法 &lt;br /&gt;
&amp;lt;br&amp;gt;Advanced Algorithms&amp;lt;/font&amp;gt;&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =任课教师&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data2   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header3 = &lt;br /&gt;
|label3  = 电子邮件&lt;br /&gt;
|data3   = lmm@nju.edu.cn &lt;br /&gt;
|header4 =&lt;br /&gt;
|label4= 办公室&lt;br /&gt;
|data4= 南雍-西229&lt;br /&gt;
|header5 = &lt;br /&gt;
|label5  = &lt;br /&gt;
|header11 = 课程时间地点&lt;br /&gt;
|label11  = &lt;br /&gt;
|data11   = &lt;br /&gt;
|header12 =&lt;br /&gt;
|label12  = 教室&lt;br /&gt;
|data12   = 周一，9am-12pm&amp;lt;br&amp;gt;苏教B207&lt;br /&gt;
|header13 =&lt;br /&gt;
|label13  = Place&lt;br /&gt;
|data13   = &lt;br /&gt;
|header14 =&lt;br /&gt;
|label14  = 答疑时间&lt;br /&gt;
|data14   = 周五，2pm-5pm&amp;lt;br&amp;gt;南雍-西229&lt;br /&gt;
|header15 = 教材&lt;br /&gt;
|label15  = &lt;br /&gt;
|data15   = &lt;br /&gt;
|header16 =&lt;br /&gt;
|label16  = &lt;br /&gt;
|data16   = [[File:MR-randomized-algorithms.png|border|100px]]&lt;br /&gt;
|header17 =&lt;br /&gt;
|label17  = &lt;br /&gt;
|data17   = Motwani and Raghavan. &amp;lt;br&amp;gt;&#039;&#039;Randomized Algorithms&#039;&#039;.&amp;lt;br&amp;gt; Cambridge Univ Press, 1995.&lt;br /&gt;
|header18 =&lt;br /&gt;
|label18  = &lt;br /&gt;
|data18   = [[File:Approximation_Algorithms.jpg|border|100px]]&lt;br /&gt;
|header19 =&lt;br /&gt;
|label19  = &lt;br /&gt;
|data19   =  Vazirani. &amp;lt;br&amp;gt;&#039;&#039;Approximation Algorithms&#039;&#039;. &amp;lt;br&amp;gt; Springer-Verlag, 2001.&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Advanced Algorithms&#039;&#039; class of spring 2026. Students who take this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= 通知 =&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;(2026/3/2)&#039;&#039;&#039; 第一堂课&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= 课程信息 =&lt;br /&gt;
* &#039;&#039;&#039;任课教师&#039;&#039;&#039;: &lt;br /&gt;
:* [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn &amp;lt;lmm@nju.edu.cn&amp;gt;]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;助教&#039;&#039;&#039;: &lt;br /&gt;
** 王思齐&lt;br /&gt;
** 齐世毅&lt;br /&gt;
* &#039;&#039;&#039;课程时间地点&#039;&#039;&#039;: &lt;br /&gt;
** 周一，9am-12pm，苏教B207&lt;br /&gt;
* &#039;&#039;&#039;答疑时间&#039;&#039;&#039;: 周五, 2pm-5pm, 南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1083465754&lt;br /&gt;
&lt;br /&gt;
= 教学大纲 =&lt;br /&gt;
随着计算机算法理论的不断发展，现代计算机算法的设计与分析大量地使用非初等的数学工具以及非传统的算法思想。“高级算法”这门课程就是面向计算机算法的这一发展趋势而设立的。课程将针对传统算法课程未系统涉及、却在计算机科学各领域的科研和实践中扮演重要角色的高等算法设计思想和算法分析工具进行系统讲授。&lt;br /&gt;
&lt;br /&gt;
课程内容分为五大部分：&lt;br /&gt;
* 基于哈希的大数据算法&lt;br /&gt;
* 哈希表与面向大数据的现代计算场景&lt;br /&gt;
* 测度的集中与处理高维数据&lt;br /&gt;
* 最大流与线性/整数规划&lt;br /&gt;
* 其他重要话题&lt;br /&gt;
&lt;br /&gt;
=== 先修课程 ===&lt;br /&gt;
* 必须：离散数学，概率论，线性代数。&lt;br /&gt;
* 推荐：算法设计与分析。&lt;br /&gt;
&lt;br /&gt;
=== 课程教材 ===&lt;br /&gt;
本门课较为前沿，大部分课程内容还没有进入任何教材。以下教材和参考书仅作为参考。&lt;br /&gt;
* [[高级算法 (Fall 2024) / Course materials|&amp;lt;font size=3&amp;gt;教材和参考书&amp;lt;/font&amp;gt;]]&lt;br /&gt;
&lt;br /&gt;
=== 成绩 ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考核。最终成绩将由平时作业成绩和期末考核成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力地维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= 课后作业 =&lt;br /&gt;
Late policy: In general, we will accommodate late submission requests ONLY IF you make such requests ahead of time. &lt;br /&gt;
&lt;br /&gt;
= 课件及相关阅读资料 =&lt;br /&gt;
# Fingerprinting&lt;br /&gt;
#* Polynomial Identity Testing&lt;br /&gt;
#* Communication Complexity (Equality)&lt;br /&gt;
#* Application: Bipartite Perfect Matching, Checking Matrix Multiplication&lt;br /&gt;
#* Karp-Rabin Algorithm (string-searching), Lipton’s Algorithm (checking identity of multisets)&lt;br /&gt;
&lt;br /&gt;
= Related Online Courses=&lt;br /&gt;
* [https://www.cs.columbia.edu/~andoni/advancedS24/index.html Advanced Algorithms] by Alexandr Andoni at Columbia University.&lt;br /&gt;
* [https://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15854-f21/www/ Advanced Approximation Algorithms] by Anupam Gupta at CMU.&lt;br /&gt;
* [http://people.csail.mit.edu/moitra/854.html Advanced Algorithms] by Ankur Moitra at MIT.&lt;br /&gt;
* [https://6.5210.csail.mit.edu/ Advanced Algorithms] by David Karger at MIT.&lt;br /&gt;
* [https://www.cs.cmu.edu/~dwoodruf/teaching/15851-spring24/ Algorithms for Big Data] by David Woodruff at CMU.&lt;br /&gt;
* [https://www.sketchingbigdata.org/fall20/lec/ Sketching Algorithms] by Jelani Nelson at UC Berkeley.&lt;br /&gt;
* [http://web.stanford.edu/class/cs168/index.html The Modern Algorithmic Toolbox] by Tim Roughgarden and Gregory Valiant at Stanford.&lt;br /&gt;
* [https://www.cs.princeton.edu/courses/archive/fall18/cos521/ Advanced Algorithm Design] by Pravesh Kothari and Christopher Musco at Princeton.&lt;br /&gt;
* [http://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15859-f11/www/ Linear and Semidefinite Programming (Advanced Algorithms)] by Anupam Gupta and Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://www.cs.cmu.edu/~odonnell/papers/cs-theory-toolkit-lecture-notes.pdf CS Theory Toolkit] by Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://cs.uwaterloo.ca/~lapchi/cs860/index.html Eigenvalues and Polynomials] by Lap Chi Lau at University of Waterloo.&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13445</id>
		<title>高级算法 (Spring 2026)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E9%AB%98%E7%BA%A7%E7%AE%97%E6%B3%95_(Spring_2026)&amp;diff=13445"/>
		<updated>2026-02-23T17:05:17Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: Created page with &amp;quot;{{Infobox |name         = Infobox |bodystyle    =  |title        = &amp;lt;font size=3&amp;gt;高级算法  &amp;lt;br&amp;gt;Advanced Algorithms&amp;lt;/font&amp;gt; |titlestyle   =   |image        =  |imagestyle   =  |caption      =  |captionstyle =  |headerstyle  = background:#ccf; |labelstyle   = background:#ddf; |datastyle    =   |header1 =任课教师 |label1  =  |data1   =  |header2 =  |label2  =  |data2   = &amp;#039;&amp;#039;&amp;#039;刘明谋&amp;#039;&amp;#039;&amp;#039; |header3 =  |label3  = 电子邮件 |data3   = lmm@nju.edu.cn  |header4 = |label4=...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;高级算法 &lt;br /&gt;
&amp;lt;br&amp;gt;Advanced Algorithms&amp;lt;/font&amp;gt;&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =任课教师&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data2   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header3 = &lt;br /&gt;
|label3  = 电子邮件&lt;br /&gt;
|data3   = lmm@nju.edu.cn &lt;br /&gt;
|header4 =&lt;br /&gt;
|label4= 办公室&lt;br /&gt;
|data4= 南雍-西229&lt;br /&gt;
|header5 = &lt;br /&gt;
|label5  = &lt;br /&gt;
|header11 = 课程时间地点&lt;br /&gt;
|label11  = &lt;br /&gt;
|data11   = &lt;br /&gt;
|header12 =&lt;br /&gt;
|label12  = 教室&lt;br /&gt;
|data12   = 周一，9am-12pm&amp;lt;br&amp;gt;苏教B207&lt;br /&gt;
|header13 =&lt;br /&gt;
|label13  = Place&lt;br /&gt;
|data13   = &lt;br /&gt;
|header14 =&lt;br /&gt;
|label14  = 答疑时间&lt;br /&gt;
|data14   = 周五，2pm-5pm&amp;lt;br&amp;gt;南雍-西229&lt;br /&gt;
|header15 = 教材&lt;br /&gt;
|label15  = &lt;br /&gt;
|data15   = &lt;br /&gt;
|header16 =&lt;br /&gt;
|label16  = &lt;br /&gt;
|data16   = [[File:MR-randomized-algorithms.png|border|100px]]&lt;br /&gt;
|header17 =&lt;br /&gt;
|label17  = &lt;br /&gt;
|data17   = Motwani and Raghavan. &amp;lt;br&amp;gt;&#039;&#039;Randomized Algorithms&#039;&#039;.&amp;lt;br&amp;gt; Cambridge Univ Press, 1995.&lt;br /&gt;
|header18 =&lt;br /&gt;
|label18  = &lt;br /&gt;
|data18   = [[File:Approximation_Algorithms.jpg|border|100px]]&lt;br /&gt;
|header19 =&lt;br /&gt;
|label19  = &lt;br /&gt;
|data19   =  Vazirani. &amp;lt;br&amp;gt;&#039;&#039;Approximation Algorithms&#039;&#039;. &amp;lt;br&amp;gt; Springer-Verlag, 2001.&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Advanced Algorithms&#039;&#039; class of spring 2026. Students who take this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= 通知 =&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;(2026/3/2)&#039;&#039;&#039; 第一堂课&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
= 课程信息 =&lt;br /&gt;
* &#039;&#039;&#039;任课教师&#039;&#039;&#039;: &lt;br /&gt;
:* [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn &amp;lt;lmm@nju.edu.cn&amp;gt;]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;助教&#039;&#039;&#039;: &lt;br /&gt;
** 王思齐&lt;br /&gt;
** 齐世毅&lt;br /&gt;
* &#039;&#039;&#039;课程时间地点&#039;&#039;&#039;: &lt;br /&gt;
** 周一，9am-12pm，苏教B207&lt;br /&gt;
* &#039;&#039;&#039;答疑时间&#039;&#039;&#039;: 周五, 2pm-5pm, 南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;QQ群&#039;&#039;&#039;:&lt;br /&gt;
&lt;br /&gt;
= 教学大纲 =&lt;br /&gt;
随着计算机算法理论的不断发展，现代计算机算法的设计与分析大量地使用非初等的数学工具以及非传统的算法思想。“高级算法”这门课程就是面向计算机算法的这一发展趋势而设立的。课程将针对传统算法课程未系统涉及、却在计算机科学各领域的科研和实践中扮演重要角色的高等算法设计思想和算法分析工具进行系统讲授。&lt;br /&gt;
&lt;br /&gt;
课程内容分为五大部分：&lt;br /&gt;
* 基于哈希的大数据算法&lt;br /&gt;
* 哈希表与面向大数据的现代计算场景&lt;br /&gt;
* 测度的集中与处理高维数据&lt;br /&gt;
* 最大流与线性/整数规划&lt;br /&gt;
* 其他重要话题&lt;br /&gt;
&lt;br /&gt;
=== 先修课程 ===&lt;br /&gt;
* 必须：离散数学，概率论，线性代数。&lt;br /&gt;
* 推荐：算法设计与分析。&lt;br /&gt;
&lt;br /&gt;
=== 课程教材 ===&lt;br /&gt;
本门课较为前沿，大部分课程内容还没有进入任何教材。以下教材和参考书仅作为参考。&lt;br /&gt;
* [[高级算法 (Fall 2024) / Course materials|&amp;lt;font size=3&amp;gt;教材和参考书&amp;lt;/font&amp;gt;]]&lt;br /&gt;
&lt;br /&gt;
=== 成绩 ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考核。最终成绩将由平时作业成绩和期末考核成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力的维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= 课后作业 =&lt;br /&gt;
Late policy: In general, we will accommodate late submission requests ONLY IF you made such requests ahead of time. &lt;br /&gt;
&lt;br /&gt;
= 课件及相关阅读资料 =&lt;br /&gt;
# Fingerprinting&lt;br /&gt;
#* Polynomial Identity Testing&lt;br /&gt;
#* Communication Complexity (Equality)&lt;br /&gt;
#* Application: Bipartite Perfect Matching, Checking Matrix Multiplication&lt;br /&gt;
#* Karp-Rabin Algorithm (string-searching), Lipton’s Algorithm (checking identity of multisets)&lt;br /&gt;
&lt;br /&gt;
= Related Online Courses=&lt;br /&gt;
* [https://www.cs.columbia.edu/~andoni/advancedS24/index.html Advanced Algorithms] by Alexandr Andoni at Columbia University.&lt;br /&gt;
* [https://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15854-f21/www/ Advanced Approximation Algorithms] by Anupam Gupta at CMU.&lt;br /&gt;
* [http://people.csail.mit.edu/moitra/854.html Advanced Algorithms] by Ankur Moitra at MIT.&lt;br /&gt;
* [https://6.5210.csail.mit.edu/ Advanced Algorithms] by David Karger at MIT.&lt;br /&gt;
* [https://www.cs.cmu.edu/~dwoodruf/teaching/15851-spring24/ Algorithms for Big Data] by David Woodruff at CMU.&lt;br /&gt;
* [https://www.sketchingbigdata.org/fall20/lec/ Sketching Algorithms] by Jelani Nelson at UC Berkeley.&lt;br /&gt;
* [http://web.stanford.edu/class/cs168/index.html The Modern Algorithmic Toolbox] by Tim Roughgarden and Gregory Valiant at Stanford.&lt;br /&gt;
* [https://www.cs.princeton.edu/courses/archive/fall18/cos521/ Advanced Algorithm Design] by Pravesh Kothari and Christopher Musco at Princeton.&lt;br /&gt;
* [http://www.cs.cmu.edu/afs/cs.cmu.edu/academic/class/15859-f11/www/ Linear and Semidefinite Programming (Advanced Algorithms)] by Anupam Gupta and Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://www.cs.cmu.edu/~odonnell/papers/cs-theory-toolkit-lecture-notes.pdf CS Theory Toolkit] by Ryan O&#039;Donnell at CMU.&lt;br /&gt;
* [https://cs.uwaterloo.ca/~lapchi/cs860/index.html Eigenvalues and Polynomials] by Lap Chi Lau at University of Waterloo.&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%80_(Fall_2025)&amp;diff=13438</id>
		<title>数据科学基础 (Fall 2025)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%80_(Fall_2025)&amp;diff=13438"/>
		<updated>2025-12-27T16:18:08Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* Lectures */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;&#039;&#039;&#039;数据科学基础&#039;&#039;&#039;&amp;lt;br&amp;gt;&lt;br /&gt;
Foundations of Data Science&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =Instructor&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data5   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header6 = &lt;br /&gt;
|label6  = Email&lt;br /&gt;
|data6   = lmm@nju.edu.cn&lt;br /&gt;
|header7 =&lt;br /&gt;
|label7  = office&lt;br /&gt;
|data7   = 南雍-西229&lt;br /&gt;
|header8 = Class&lt;br /&gt;
|label8  = &lt;br /&gt;
|data8   = &lt;br /&gt;
|header9 =&lt;br /&gt;
|label9  = Class meeting&lt;br /&gt;
|data9   = 周五, 2pm-5pm &amp;lt;br/&amp;gt;苏教楼D202&lt;br /&gt;
|header10=&lt;br /&gt;
|label10 = Office hour&lt;br /&gt;
|data10  = 周四, 3pm-5pm&amp;lt;br/&amp;gt;南雍-西229&lt;br /&gt;
|header11= Textbook&lt;br /&gt;
|label11 = &lt;br /&gt;
|data11  = &lt;br /&gt;
|header12=&lt;br /&gt;
|label12 = &lt;br /&gt;
|data12  = [[File:概率导论.jpeg|border|100px]]&lt;br /&gt;
|header13=&lt;br /&gt;
|label13 = &lt;br /&gt;
|data13  = &#039;&#039;&#039;概率导论&#039;&#039;&#039;（第2版·修订版）&amp;lt;br&amp;gt; Dimitri P. Bertsekas and John N. Tsitsiklis&amp;lt;br&amp;gt; 郑忠国 童行伟 译；人民邮电出版社 (2022)&lt;br /&gt;
|header14=&lt;br /&gt;
|label14 = &lt;br /&gt;
|data14  = [[File:Probability_and_Computing_2ed.jpg|border|100px]]&lt;br /&gt;
|header15=&lt;br /&gt;
|label15 = &lt;br /&gt;
|data15  = &#039;&#039;&#039;Probability and Computing&#039;&#039;&#039; (2E) &amp;lt;br&amp;gt; Michael Mitzenmacher and Eli Upfal &amp;lt;br&amp;gt;   Cambridge University Press (2017)&lt;br /&gt;
|header16=&lt;br /&gt;
|label16 = &lt;br /&gt;
|data16  = [[File:Foundations_of_Data_Science.jpg|border|100px]]&lt;br /&gt;
|header17= &lt;br /&gt;
|label17 = &lt;br /&gt;
|data17  = &#039;&#039;&#039;Foundations of Data Science&#039;&#039;&#039; &amp;lt;br&amp;gt; Avrim Blum, John Hopcroft, Ravi Kannan &amp;lt;br&amp;gt;   Cambridge University Press (2020)&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Foundations of Data Science&#039;&#039; (数据科学基础) class of Fall 2025. Students who take this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= Announcement =&lt;br /&gt;
* 新学期第一堂课：2025年8月29日，苏教楼D202。&lt;br /&gt;
* 2025年11月7日因校运动会停课一次。&lt;br /&gt;
* 第五次作业的 Aliasing method 一题中应该是&amp;lt;math&amp;gt;\displaystyle{ \mathbf p=\frac 1{n}\sum^n_{r=1}\mathbf v_r }&amp;lt;/math&amp;gt;而不是 &amp;lt;math&amp;gt;\displaystyle{ \mathbf p=\frac 1{n{-1}}\sum^n_{r=1}\mathbf v_r }&amp;lt;/math&amp;gt;&lt;br /&gt;
* 12月19日的课调到12月28日&lt;br /&gt;
* 第六次作业的 Densest induced subgraph in random graph 一题中应该是&amp;lt;math&amp;gt;\frac n 4&amp;lt;/math&amp;gt;而不是 &amp;lt;math&amp;gt;\frac n 2&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Course info =&lt;br /&gt;
* &#039;&#039;&#039;Instructor &#039;&#039;&#039;: &lt;br /&gt;
** [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn &amp;lt;lmm@nju.edu.cn&amp;gt;]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;Teaching assistant&#039;&#039;&#039;:&lt;br /&gt;
** 梁梓豪：[mailto:zhliang@smail.nju.edu.cn 📧] 仙林校区计科楼北栋426 &lt;br /&gt;
** 周海刚：[mailto:hgzhou2003@outlook.com 📧] 仙林校区计科楼北栋410&lt;br /&gt;
** 欧丰宁：[mailto:oufn02@outlook.com 📧] 仙林校区计科楼北栋410&lt;br /&gt;
** 于逸潇：[mailto:yixiaoyu@smail.nju.edu.cn 📧] 仙林校区计科楼北栋410&lt;br /&gt;
** 缪天顺：[mailto:mtsmts2022@outlook.com 📧] 仙林校区计科楼北栋426 &lt;br /&gt;
* &#039;&#039;&#039;Class meeting&#039;&#039;&#039;:&lt;br /&gt;
** 周五：2pm-5pm，苏教楼D202&lt;br /&gt;
* &#039;&#039;&#039;Office hour&#039;&#039;&#039;: &lt;br /&gt;
:* 周四：3pm-5pm，南雍-西229（刘明谋）&lt;br /&gt;
:* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1019436733（申请加入需提供姓名、院系、学号）&lt;br /&gt;
&lt;br /&gt;
= Syllabus =&lt;br /&gt;
课程内容分为三大部分：&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;经典概率论&#039;&#039;&#039;：包括概率空间、随机变量及其数字特征、多维与连续随机变量&lt;br /&gt;
* &#039;&#039;&#039;概率与计算&#039;&#039;&#039;：包括测度集中现象，概率法，离散随机过程三部分&lt;br /&gt;
* &#039;&#039;&#039;数理统计&#039;&#039;&#039;：包括参数估计、假设检验、贝叶斯估计、方差分析、相关性及回归分析等统计推断内容。&lt;br /&gt;
&lt;br /&gt;
对于第一和第二部分，要求清楚掌握基本概念，深刻理解关键的现象与规律以及背后的原理，并可以灵活运用所学方法求解相关问题。对于第三部分，要求熟悉数理统计相关的基本概念，以及典型的统计模型、统计推断方法。&lt;br /&gt;
&lt;br /&gt;
经过本课程的训练，学生将能够掌握概率论和统计学的基本理论和方法，具备处理和分析实际数据的能力，为后续学习数据挖掘、机器学习、大数据技术等数据科学相关领域打下坚实基础。本课程采用课堂讲授、案例分析和课后练习相结合的教学方式，注重理论与实践相结合，培养学生运用所学知识解决实际问题的能力。通过本课程的学习，学生将能够具备扎实的数学基础，为未来从事数据科学研究和实践奠定坚实基础。&lt;br /&gt;
&lt;br /&gt;
=== 教材与参考书 Course Materials ===&lt;br /&gt;
* &#039;&#039;&#039;[BT]&#039;&#039;&#039; 概率导论（第2版·修订版），[美]伯特瑟卡斯（Dimitri P.Bertsekas）[美]齐齐克利斯（John N.Tsitsiklis）著，郑忠国 童行伟 译，人民邮电出版社（2022）。&lt;br /&gt;
* &#039;&#039;&#039;[MU]&#039;&#039;&#039; &#039;&#039;Probability and Computing: Randomization and Probabilistic Techniques in Algorithms and Data Analysis&#039;&#039;, by Michael Mitzenmacher, Eli Upfal; Cambridge University Press; 2nd edition (2017).&lt;br /&gt;
* &#039;&#039;&#039;[GS]&#039;&#039;&#039; &#039;&#039;Probability and Random Processes&#039;&#039;, by Geoffrey Grimmett and David Stirzaker; Oxford University Press; 4th edition (2020).&lt;br /&gt;
* &#039;&#039;&#039;[BHK]&#039;&#039;&#039; &#039;&#039;Foundations of Data Science&#039;&#039;, by Avrim Blum, John Hopcroft, and Ravindran Kannan; Cambridge University Press (2020).&lt;br /&gt;
&lt;br /&gt;
=== 成绩 Grading Policy ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考试。最终成绩将由平时作业成绩和期末考试成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力的维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= Assignments =&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 1|Problem Set 1]]  请在 2025/09/26 上课之前(14:00 UTC+8)使用邮件的附件功能提交到 [mailto:pr2024_nju@163.com pr2024_nju@163.com] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA1.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 2|Problem Set 2]]  请在 2025/10/03 14:00前(UTC+8)使用邮件的附件功能提交到 [mailto:pr2024_nju@163.com pr2024_nju@163.com] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA2.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 3|Problem Set 3]]  请在 2025/10/17 上课之前(14:00 UTC+8)上传到 [https://box.nju.edu.cn/u/d/e717e1b8eccd4c4fb889/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA3.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 4|Problem Set 4]]  请在 2025/10/31 上课之前(14:00 UTC+8)上传到 [https://box.nju.edu.cn/u/d/fb85c46de75f4095b326/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA4.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 5|Problem Set 5]]  请在 2025/11/21 上课之前(14:00 UTC+8)上传到 [https://box.nju.edu.cn/u/d/1243dac3190b4e1eb30b/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA5.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 6|Problem Set 6]]  请在 2025/12/26 14:00 UTC+8 前上传到 [https://box.nju.edu.cn/u/d/9302de38f13146eeb5e9/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA6.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
&lt;br /&gt;
= Lectures =&lt;br /&gt;
# [https://tcs.nju.edu.cn/wiki/images/1/1a/Intro%EF%BC%88%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%802025%EF%BC%89.pdf 课程简介]&lt;br /&gt;
#* [https://www.bilibili.com/video/BV1Vkz4YqEC9 Bertrand Paradox (贝特朗悖论)的视频]&lt;br /&gt;
# [https://tcs.nju.edu.cn/wiki/images/5/51/ProbSpace%EF%BC%88%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%802025%EF%BC%89.pdf 概率空间]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第1章&#039;&#039;&#039;&lt;br /&gt;
# [https://box.nju.edu.cn/f/732bad4060fc442789ab/ 随机变量]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第2章&#039;&#039;&#039; &lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 2&#039;&#039;&#039;&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/Volume of Hamming balls|Volume of Hamming balls]]&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/Average-case analysis of QuickSort|Average-case analysis of &#039;&#039;&#039;&#039;&#039;QuickSort&#039;&#039;&#039;&#039;&#039;]]&lt;br /&gt;
#* [https://www.bilibili.com/video/BV1ta411A7fp/ 高尔顿板（Galton board）视频] 和 [https://en.wikipedia.org/wiki/Galton_board 维基百科页面]&lt;br /&gt;
# [https://box.nju.edu.cn/f/89f212b7b6874c0e9097/ 偏差和矩]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 3&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 章节 2.4, 4.2, 4.3, 5.1&#039;&#039;&#039;&lt;br /&gt;
#* [[概率论与数理统计 (Spring 2024)/Threshold of k-clique in random graph|Threshold of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-clique in random graph]]&lt;br /&gt;
# [https://box.nju.edu.cn/f/1eca74dafe6c4d11a799/ 连续分布]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第3章, 和4.1节&#039;&#039;&#039; 或 &#039;&#039;&#039;[GS] Chapter 4&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapters 8, 9&#039;&#039;&#039;&lt;br /&gt;
#* [https://measure.axler.net/MIRA.pdf Measure, Integration &amp;amp; Real Analysis] by Sheldon Axler&lt;br /&gt;
# [https://box.nju.edu.cn/f/9a675bedb36243d19616/ 极限定理]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第5章&#039;&#039;&#039; &lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[GS] Sections 5.7~5.10, 7.1~7.5&#039;&#039;&#039;&lt;br /&gt;
# [https://box.nju.edu.cn/f/1049bd7f7974465cbc85/ 测度集中]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 4&#039;&#039;&#039; and &#039;&#039;&#039;Sections 13.1, 13.4~13.5&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[GS] Sections 5.11, 12.1~12.3, 7.8~7.9&#039;&#039;&#039;&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/Hoeffding&#039;s lemma|Hoeffding&#039;s lemma]]&lt;br /&gt;
# [https://box.nju.edu.cn/f/06617a7c88af456696de/ 随机过程]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第6章, 第7章&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 7, Sections 13.1~13.3&#039;&#039;&#039; or &#039;&#039;&#039;[GS] Chapter 6, Sections 12.4~12.5&#039;&#039;&#039;&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/OST and applications|OST and applications]]&lt;br /&gt;
# [https://box.nju.edu.cn/f/be7ade6440ea4462af3b/ 统计学与点估计]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第8章, 第9章&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Section 9.6~9.7&#039;&#039;&#039;&lt;br /&gt;
# [https://box.nju.edu.cn/f/5e1cb2f1d656460bb60c/ 假设检验]&lt;br /&gt;
# [https://box.nju.edu.cn/f/186d4b14d4744a858857/ 方差分析、简单线性回归、信息论初步]&lt;br /&gt;
&lt;br /&gt;
= Concepts =&lt;br /&gt;
* [https://plato.stanford.edu/entries/probability-interpret/ Interpretations of probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/History_of_probability History of probability]&lt;br /&gt;
* Example problems:&lt;br /&gt;
** [https://dornsifecms.usc.edu/assets/sites/520/docs/VonNeumann-ams12p36-38.pdf von Neumann&#039;s Bernoulli factory] and other [https://peteroupc.github.io/bernoulli.html Bernoulli factory algorithms]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Boy_or_Girl_paradox Boy or Girl paradox]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Monty_Hall_problem Monty Hall problem]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bertrand_paradox_(probability) Bertrand paradox]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Hard_spheres Hard spheres model] and [https://en.wikipedia.org/wiki/Ising_model Ising model]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/PageRank &#039;&#039;PageRank&#039;&#039;] and stationary [https://en.wikipedia.org/wiki/Random_walk random walk]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Diffusion_process Diffusion process] and [https://en.wikipedia.org/wiki/Diffusion_model diffusion model]&lt;br /&gt;
*[https://en.wikipedia.org/wiki/Probability_space Probability space]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Sample_space Sample space]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Event_(probability_theory) Event] and [https://en.wikipedia.org/wiki/Σ-algebra &amp;lt;math&amp;gt;\sigma&amp;lt;/math&amp;gt;-algebra]&lt;br /&gt;
** Kolmogorov&#039;s [https://en.wikipedia.org/wiki/Probability_axioms axioms of probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Discrete_uniform_distribution Classical] and [https://en.wikipedia.org/wiki/Geometric_probability geometric probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Boole%27s_inequality Union bound]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Inclusion%E2%80%93exclusion_principle Inclusion-Exclusion principle]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Boole%27s_inequality#Bonferroni_inequalities Bonferroni inequalities]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Conditional_probability Conditional probability]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Chain_rule_(probability) Chain rule]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_total_probability Law of total probability]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bayes%27_theorem Bayes&#039; law]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Independence_(probability_theory) Independence] &lt;br /&gt;
** [https://en.wikipedia.org/wiki/Pairwise_independence Pairwise independence]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Random_variable Random variable]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Cumulative_distribution_function Cumulative distribution function]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Probability_mass_function Probability mass function]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Probability_density_function Probability density function]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Multivariate_random_variable Random vector]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Joint_probability_distribution Joint probability distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Conditional_probability_distribution Conditional probability distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Marginal_distribution Marginal distribution]&lt;br /&gt;
* Some &#039;&#039;&#039;discrete&#039;&#039;&#039; probability distributions&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bernoulli_trial Bernoulli trial] and [https://en.wikipedia.org/wiki/Bernoulli_distribution Bernoulli distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Discrete_uniform_distribution Discrete uniform distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Binomial_distribution Binomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Geometric_distribution Geometric distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Negative_binomial_distribution Negative binomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Hypergeometric_distribution Hypergeometric distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Poisson_distribution Poisson distribution]&lt;br /&gt;
** and [https://en.wikipedia.org/wiki/List_of_probability_distributions#Discrete_distributions others]&lt;br /&gt;
* Balls into bins model&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Multinomial_distribution Multinomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Birthday_problem Birthday problem]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Coupon_collector%27s_problem Coupon collector]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Balls_into_bins_problem Occupancy problem]&lt;br /&gt;
* Random graphs&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Erd%C5%91s%E2%80%93R%C3%A9nyi_model Erdős–Rényi random graph model]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Galton%E2%80%93Watson_process Galton–Watson branching process]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Expected_value Expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_the_unconscious_statistician Law of the unconscious statistician, &#039;&#039;LOTUS&#039;&#039;]&lt;br /&gt;
** [https://dlsun.github.io/probability/linearity.html Linearity of expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Conditional_expectation Conditional expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_total_expectation Law of total expectation]&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%80_(Fall_2025)&amp;diff=13429</id>
		<title>数据科学基础 (Fall 2025)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%80_(Fall_2025)&amp;diff=13429"/>
		<updated>2025-12-25T13:17:04Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* Announcement */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;&#039;&#039;&#039;数据科学基础&#039;&#039;&#039;&amp;lt;br&amp;gt;&lt;br /&gt;
Foundations of Data Science&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =Instructor&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data5   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header6 = &lt;br /&gt;
|label6  = Email&lt;br /&gt;
|data6   = lmm@nju.edu.cn&lt;br /&gt;
|header7 =&lt;br /&gt;
|label7  = office&lt;br /&gt;
|data7   = 南雍-西229&lt;br /&gt;
|header8 = Class&lt;br /&gt;
|label8  = &lt;br /&gt;
|data8   = &lt;br /&gt;
|header9 =&lt;br /&gt;
|label9  = Class meeting&lt;br /&gt;
|data9   = 周五, 2pm-5pm &amp;lt;br/&amp;gt;苏教楼D202&lt;br /&gt;
|header10=&lt;br /&gt;
|label10 = Office hour&lt;br /&gt;
|data10  = 周四, 3pm-5pm&amp;lt;br/&amp;gt;南雍-西229&lt;br /&gt;
|header11= Textbook&lt;br /&gt;
|label11 = &lt;br /&gt;
|data11  = &lt;br /&gt;
|header12=&lt;br /&gt;
|label12 = &lt;br /&gt;
|data12  = [[File:概率导论.jpeg|border|100px]]&lt;br /&gt;
|header13=&lt;br /&gt;
|label13 = &lt;br /&gt;
|data13  = &#039;&#039;&#039;概率导论&#039;&#039;&#039;（第2版·修订版）&amp;lt;br&amp;gt; Dimitri P. Bertsekas and John N. Tsitsiklis&amp;lt;br&amp;gt; 郑忠国 童行伟 译；人民邮电出版社 (2022)&lt;br /&gt;
|header14=&lt;br /&gt;
|label14 = &lt;br /&gt;
|data14  = [[File:Probability_and_Computing_2ed.jpg|border|100px]]&lt;br /&gt;
|header15=&lt;br /&gt;
|label15 = &lt;br /&gt;
|data15  = &#039;&#039;&#039;Probability and Computing&#039;&#039;&#039; (2E) &amp;lt;br&amp;gt; Michael Mitzenmacher and Eli Upfal &amp;lt;br&amp;gt;   Cambridge University Press (2017)&lt;br /&gt;
|header16=&lt;br /&gt;
|label16 = &lt;br /&gt;
|data16  = [[File:Foundations_of_Data_Science.jpg|border|100px]]&lt;br /&gt;
|header17= &lt;br /&gt;
|label17 = &lt;br /&gt;
|data17  = &#039;&#039;&#039;Foundations of Data Science&#039;&#039;&#039; &amp;lt;br&amp;gt; Avrim Blum, John Hopcroft, Ravi Kannan &amp;lt;br&amp;gt;   Cambridge University Press (2020)&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Foundations of Data Science&#039;&#039; (数据科学基础) class of Fall 2025. Students who take this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= Announcement =&lt;br /&gt;
* 新学期第一堂课：2025年8月29日，苏教楼D202。&lt;br /&gt;
* 2025年11月7日因校运动会停课一次。&lt;br /&gt;
* 第五次作业的 Aliasing method 一题中应该是&amp;lt;math&amp;gt;\displaystyle{ \mathbf p=\frac 1{n}\sum^n_{r=1}\mathbf v_r }&amp;lt;/math&amp;gt;而不是 &amp;lt;math&amp;gt;\displaystyle{ \mathbf p=\frac 1{n{-1}}\sum^n_{r=1}\mathbf v_r }&amp;lt;/math&amp;gt;&lt;br /&gt;
* 12月19日的课调到12月28日&lt;br /&gt;
* 第六次作业的 Densest induced subgraph in random graph 一题中应该是&amp;lt;math&amp;gt;\frac n 4&amp;lt;/math&amp;gt;而不是 &amp;lt;math&amp;gt;\frac n 2&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Course info =&lt;br /&gt;
* &#039;&#039;&#039;Instructor &#039;&#039;&#039;: &lt;br /&gt;
** [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn &amp;lt;lmm@nju.edu.cn&amp;gt;]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;Teaching assistant&#039;&#039;&#039;:&lt;br /&gt;
** 梁梓豪：[mailto:zhliang@smail.nju.edu.cn 📧] 仙林校区计科楼北栋426 &lt;br /&gt;
** 周海刚：[mailto:hgzhou2003@outlook.com 📧] 仙林校区计科楼北栋410&lt;br /&gt;
** 欧丰宁：[mailto:oufn02@outlook.com 📧] 仙林校区计科楼北栋410&lt;br /&gt;
** 于逸潇：[mailto:yixiaoyu@smail.nju.edu.cn 📧] 仙林校区计科楼北栋410&lt;br /&gt;
** 缪天顺：[mailto:mtsmts2022@outlook.com 📧] 仙林校区计科楼北栋426 &lt;br /&gt;
* &#039;&#039;&#039;Class meeting&#039;&#039;&#039;:&lt;br /&gt;
** 周五：2pm-5pm，苏教楼D202&lt;br /&gt;
* &#039;&#039;&#039;Office hour&#039;&#039;&#039;: &lt;br /&gt;
:* 周四：3pm-5pm，南雍-西229（刘明谋）&lt;br /&gt;
:* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1019436733（申请加入需提供姓名、院系、学号）&lt;br /&gt;
&lt;br /&gt;
= Syllabus =&lt;br /&gt;
课程内容分为三大部分：&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;经典概率论&#039;&#039;&#039;：包括概率空间、随机变量及其数字特征、多维与连续随机变量&lt;br /&gt;
* &#039;&#039;&#039;概率与计算&#039;&#039;&#039;：包括测度集中现象，概率法，离散随机过程三部分&lt;br /&gt;
* &#039;&#039;&#039;数理统计&#039;&#039;&#039;：包括参数估计、假设检验、贝叶斯估计、方差分析、相关性及回归分析等统计推断内容。&lt;br /&gt;
&lt;br /&gt;
对于第一和第二部分，要求清楚掌握基本概念，深刻理解关键的现象与规律以及背后的原理，并可以灵活运用所学方法求解相关问题。对于第三部分，要求熟悉数理统计相关的基本概念，以及典型的统计模型、统计推断方法。&lt;br /&gt;
&lt;br /&gt;
经过本课程的训练，学生将能够掌握概率论和统计学的基本理论和方法，具备处理和分析实际数据的能力，为后续学习数据挖掘、机器学习、大数据技术等数据科学相关领域打下坚实基础。本课程采用课堂讲授、案例分析和课后练习相结合的教学方式，注重理论与实践相结合，培养学生运用所学知识解决实际问题的能力。通过本课程的学习，学生将能够具备扎实的数学基础，为未来从事数据科学研究和实践奠定坚实基础。&lt;br /&gt;
&lt;br /&gt;
=== 教材与参考书 Course Materials ===&lt;br /&gt;
* &#039;&#039;&#039;[BT]&#039;&#039;&#039; 概率导论（第2版·修订版），[美]伯特瑟卡斯（Dimitri P.Bertsekas）[美]齐齐克利斯（John N.Tsitsiklis）著，郑忠国 童行伟 译，人民邮电出版社（2022）。&lt;br /&gt;
* &#039;&#039;&#039;[MU]&#039;&#039;&#039; &#039;&#039;Probability and Computing: Randomization and Probabilistic Techniques in Algorithms and Data Analysis&#039;&#039;, by Michael Mitzenmacher, Eli Upfal; Cambridge University Press; 2nd edition (2017).&lt;br /&gt;
* &#039;&#039;&#039;[GS]&#039;&#039;&#039; &#039;&#039;Probability and Random Processes&#039;&#039;, by Geoffrey Grimmett and David Stirzaker; Oxford University Press; 4th edition (2020).&lt;br /&gt;
* &#039;&#039;&#039;[BHK]&#039;&#039;&#039; &#039;&#039;Foundations of Data Science&#039;&#039;, by Avrim Blum, John Hopcroft, and Ravindran Kannan; Cambridge University Press (2020).&lt;br /&gt;
&lt;br /&gt;
=== 成绩 Grading Policy ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考试。最终成绩将由平时作业成绩和期末考试成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力的维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= Assignments =&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 1|Problem Set 1]]  请在 2025/09/26 上课之前(14:00 UTC+8)使用邮件的附件功能提交到 [mailto:pr2024_nju@163.com pr2024_nju@163.com] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA1.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 2|Problem Set 2]]  请在 2025/10/03 14:00前(UTC+8)使用邮件的附件功能提交到 [mailto:pr2024_nju@163.com pr2024_nju@163.com] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA2.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 3|Problem Set 3]]  请在 2025/10/17 上课之前(14:00 UTC+8)上传到 [https://box.nju.edu.cn/u/d/e717e1b8eccd4c4fb889/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA3.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 4|Problem Set 4]]  请在 2025/10/31 上课之前(14:00 UTC+8)上传到 [https://box.nju.edu.cn/u/d/fb85c46de75f4095b326/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA4.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 5|Problem Set 5]]  请在 2025/11/21 上课之前(14:00 UTC+8)上传到 [https://box.nju.edu.cn/u/d/1243dac3190b4e1eb30b/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA5.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 6|Problem Set 6]]  请在 2025/12/26 14:00 UTC+8 前上传到 [https://box.nju.edu.cn/u/d/9302de38f13146eeb5e9/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA6.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
&lt;br /&gt;
= Lectures =&lt;br /&gt;
# [https://tcs.nju.edu.cn/wiki/images/1/1a/Intro%EF%BC%88%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%802025%EF%BC%89.pdf 课程简介]&lt;br /&gt;
#* [https://www.bilibili.com/video/BV1Vkz4YqEC9 Bertrand Paradox (贝特朗悖论)的视频]&lt;br /&gt;
# [https://tcs.nju.edu.cn/wiki/images/5/51/ProbSpace%EF%BC%88%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%802025%EF%BC%89.pdf 概率空间]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第1章&#039;&#039;&#039;&lt;br /&gt;
# [https://box.nju.edu.cn/f/732bad4060fc442789ab/ 随机变量]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第2章&#039;&#039;&#039; &lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 2&#039;&#039;&#039;&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/Volume of Hamming balls|Volume of Hamming balls]]&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/Average-case analysis of QuickSort|Average-case analysis of &#039;&#039;&#039;&#039;&#039;QuickSort&#039;&#039;&#039;&#039;&#039;]]&lt;br /&gt;
#* [https://www.bilibili.com/video/BV1ta411A7fp/ 高尔顿板（Galton board）视频] 和 [https://en.wikipedia.org/wiki/Galton_board 维基百科页面]&lt;br /&gt;
# [https://box.nju.edu.cn/f/89f212b7b6874c0e9097/ 偏差和矩]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 3&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 章节 2.4, 4.2, 4.3, 5.1&#039;&#039;&#039;&lt;br /&gt;
#* [[概率论与数理统计 (Spring 2024)/Threshold of k-clique in random graph|Threshold of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-clique in random graph]]&lt;br /&gt;
# [https://box.nju.edu.cn/f/1eca74dafe6c4d11a799/ 连续分布]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第3章, 和4.1节&#039;&#039;&#039; 或 &#039;&#039;&#039;[GS] Chapter 4&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapters 8, 9&#039;&#039;&#039;&lt;br /&gt;
#* [https://measure.axler.net/MIRA.pdf Measure, Integration &amp;amp; Real Analysis] by Sheldon Axler&lt;br /&gt;
# [https://box.nju.edu.cn/f/9a675bedb36243d19616/ 极限定理]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第5章&#039;&#039;&#039; &lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[GS] Sections 5.7~5.10, 7.1~7.5&#039;&#039;&#039;&lt;br /&gt;
# [https://box.nju.edu.cn/f/1049bd7f7974465cbc85/ 测度集中]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 4&#039;&#039;&#039; and &#039;&#039;&#039;Sections 13.1, 13.4~13.5&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[GS] Sections 5.11, 12.1~12.3, 7.8~7.9&#039;&#039;&#039;&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/Hoeffding&#039;s lemma|Hoeffding&#039;s lemma]]&lt;br /&gt;
# [https://box.nju.edu.cn/f/06617a7c88af456696de/ 随机过程]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第6章, 第7章&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 7, Sections 13.1~13.3&#039;&#039;&#039; or &#039;&#039;&#039;[GS] Chapter 6, Sections 12.4~12.5&#039;&#039;&#039;&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/OST and applications|OST and applications]]&lt;br /&gt;
# [https://box.nju.edu.cn/f/be7ade6440ea4462af3b/ 统计学与点估计]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第8章, 第9章&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Sections 9.6~9.7&#039;&#039;&#039;&lt;br /&gt;
# [https://box.nju.edu.cn/f/5e1cb2f1d656460bb60c/ 假设检验]&lt;br /&gt;
&lt;br /&gt;
= Concepts =&lt;br /&gt;
* [https://plato.stanford.edu/entries/probability-interpret/ Interpretations of probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/History_of_probability History of probability]&lt;br /&gt;
* Example problems:&lt;br /&gt;
** [https://dornsifecms.usc.edu/assets/sites/520/docs/VonNeumann-ams12p36-38.pdf von Neumann&#039;s Bernoulli factory] and other [https://peteroupc.github.io/bernoulli.html Bernoulli factory algorithms]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Boy_or_Girl_paradox Boy or Girl paradox]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Monty_Hall_problem Monty Hall problem]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bertrand_paradox_(probability) Bertrand paradox]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Hard_spheres Hard spheres model] and [https://en.wikipedia.org/wiki/Ising_model Ising model]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/PageRank &#039;&#039;PageRank&#039;&#039;] and stationary [https://en.wikipedia.org/wiki/Random_walk random walk]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Diffusion_process Diffusion process] and [https://en.wikipedia.org/wiki/Diffusion_model diffusion model]&lt;br /&gt;
*[https://en.wikipedia.org/wiki/Probability_space Probability space]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Sample_space Sample space]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Event_(probability_theory) Event] and [https://en.wikipedia.org/wiki/Σ-algebra &amp;lt;math&amp;gt;\sigma&amp;lt;/math&amp;gt;-algebra]&lt;br /&gt;
** Kolmogorov&#039;s [https://en.wikipedia.org/wiki/Probability_axioms axioms of probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Discrete_uniform_distribution Classical] and [https://en.wikipedia.org/wiki/Geometric_probability geometric probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Boole%27s_inequality Union bound]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Inclusion%E2%80%93exclusion_principle Inclusion-Exclusion principle]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Boole%27s_inequality#Bonferroni_inequalities Bonferroni inequalities]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Conditional_probability Conditional probability]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Chain_rule_(probability) Chain rule]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_total_probability Law of total probability]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bayes%27_theorem Bayes&#039; law]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Independence_(probability_theory) Independence] &lt;br /&gt;
** [https://en.wikipedia.org/wiki/Pairwise_independence Pairwise independence]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Random_variable Random variable]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Cumulative_distribution_function Cumulative distribution function]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Probability_mass_function Probability mass function]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Probability_density_function Probability density function]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Multivariate_random_variable Random vector]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Joint_probability_distribution Joint probability distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Conditional_probability_distribution Conditional probability distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Marginal_distribution Marginal distribution]&lt;br /&gt;
* Some &#039;&#039;&#039;discrete&#039;&#039;&#039; probability distributions&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bernoulli_trial Bernoulli trial] and [https://en.wikipedia.org/wiki/Bernoulli_distribution Bernoulli distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Discrete_uniform_distribution Discrete uniform distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Binomial_distribution Binomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Geometric_distribution Geometric distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Negative_binomial_distribution Negative binomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Hypergeometric_distribution Hypergeometric distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Poisson_distribution Poisson distribution]&lt;br /&gt;
** and [https://en.wikipedia.org/wiki/List_of_probability_distributions#Discrete_distributions others]&lt;br /&gt;
* Balls into bins model&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Multinomial_distribution Multinomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Birthday_problem Birthday problem]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Coupon_collector%27s_problem Coupon collector]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Balls_into_bins_problem Occupancy problem]&lt;br /&gt;
* Random graphs&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Erd%C5%91s%E2%80%93R%C3%A9nyi_model Erdős–Rényi random graph model]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Galton%E2%80%93Watson_process Galton–Watson branching process]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Expected_value Expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_the_unconscious_statistician Law of the unconscious statistician, &#039;&#039;LOTUS&#039;&#039;]&lt;br /&gt;
** [https://dlsun.github.io/probability/linearity.html Linearity of expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Conditional_expectation Conditional expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_total_expectation Law of total expectation]&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%80_(Fall_2025)&amp;diff=13428</id>
		<title>数据科学基础 (Fall 2025)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%80_(Fall_2025)&amp;diff=13428"/>
		<updated>2025-12-25T13:16:34Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* Announcement */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;&#039;&#039;&#039;数据科学基础&#039;&#039;&#039;&amp;lt;br&amp;gt;&lt;br /&gt;
Foundations of Data Science&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =Instructor&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data5   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header6 = &lt;br /&gt;
|label6  = Email&lt;br /&gt;
|data6   = lmm@nju.edu.cn&lt;br /&gt;
|header7 =&lt;br /&gt;
|label7  = office&lt;br /&gt;
|data7   = 南雍-西229&lt;br /&gt;
|header8 = Class&lt;br /&gt;
|label8  = &lt;br /&gt;
|data8   = &lt;br /&gt;
|header9 =&lt;br /&gt;
|label9  = Class meeting&lt;br /&gt;
|data9   = 周五, 2pm-5pm &amp;lt;br/&amp;gt;苏教楼D202&lt;br /&gt;
|header10=&lt;br /&gt;
|label10 = Office hour&lt;br /&gt;
|data10  = 周四, 3pm-5pm&amp;lt;br/&amp;gt;南雍-西229&lt;br /&gt;
|header11= Textbook&lt;br /&gt;
|label11 = &lt;br /&gt;
|data11  = &lt;br /&gt;
|header12=&lt;br /&gt;
|label12 = &lt;br /&gt;
|data12  = [[File:概率导论.jpeg|border|100px]]&lt;br /&gt;
|header13=&lt;br /&gt;
|label13 = &lt;br /&gt;
|data13  = &#039;&#039;&#039;概率导论&#039;&#039;&#039;（第2版·修订版）&amp;lt;br&amp;gt; Dimitri P. Bertsekas and John N. Tsitsiklis&amp;lt;br&amp;gt; 郑忠国 童行伟 译；人民邮电出版社 (2022)&lt;br /&gt;
|header14=&lt;br /&gt;
|label14 = &lt;br /&gt;
|data14  = [[File:Probability_and_Computing_2ed.jpg|border|100px]]&lt;br /&gt;
|header15=&lt;br /&gt;
|label15 = &lt;br /&gt;
|data15  = &#039;&#039;&#039;Probability and Computing&#039;&#039;&#039; (2E) &amp;lt;br&amp;gt; Michael Mitzenmacher and Eli Upfal &amp;lt;br&amp;gt;   Cambridge University Press (2017)&lt;br /&gt;
|header16=&lt;br /&gt;
|label16 = &lt;br /&gt;
|data16  = [[File:Foundations_of_Data_Science.jpg|border|100px]]&lt;br /&gt;
|header17= &lt;br /&gt;
|label17 = &lt;br /&gt;
|data17  = &#039;&#039;&#039;Foundations of Data Science&#039;&#039;&#039; &amp;lt;br&amp;gt; Avrim Blum, John Hopcroft, Ravi Kannan &amp;lt;br&amp;gt;   Cambridge University Press (2020)&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Foundations of Data Science&#039;&#039; (数据科学基础) class of Fall 2025. Students who take this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= Announcement =&lt;br /&gt;
* 新学期第一堂课：2025年8月29日，苏教楼D202。&lt;br /&gt;
* 2025年11月7日因校运动会停课一次。&lt;br /&gt;
* 第五次作业的 Aliasing method 一题中应该是&amp;lt;math&amp;gt;\displaystyle{ \mathbf p=\frac 1{n}\sum^n_{r=1}\mathbf v_r }&amp;lt;/math&amp;gt;而不是 &amp;lt;math&amp;gt;\displaystyle{ \mathbf p=\frac 1{n-1}\sum^n_{r=1}\mathbf v_r }&amp;lt;/math&amp;gt;&lt;br /&gt;
* 第六次作业的 Densest induced subgraph in random graph 一题中应该是&amp;lt;math&amp;gt;\frac n 4&amp;lt;/math&amp;gt;而不是 &amp;lt;math&amp;gt;\frac n 2&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Course info =&lt;br /&gt;
* &#039;&#039;&#039;Instructor &#039;&#039;&#039;: &lt;br /&gt;
** [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn &amp;lt;lmm@nju.edu.cn&amp;gt;]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;Teaching assistant&#039;&#039;&#039;:&lt;br /&gt;
** 梁梓豪：[mailto:zhliang@smail.nju.edu.cn 📧] 仙林校区计科楼北栋426 &lt;br /&gt;
** 周海刚：[mailto:hgzhou2003@outlook.com 📧] 仙林校区计科楼北栋410&lt;br /&gt;
** 欧丰宁：[mailto:oufn02@outlook.com 📧] 仙林校区计科楼北栋410&lt;br /&gt;
** 于逸潇：[mailto:yixiaoyu@smail.nju.edu.cn 📧] 仙林校区计科楼北栋410&lt;br /&gt;
** 缪天顺：[mailto:mtsmts2022@outlook.com 📧] 仙林校区计科楼北栋426 &lt;br /&gt;
* &#039;&#039;&#039;Class meeting&#039;&#039;&#039;:&lt;br /&gt;
** 周五：2pm-5pm，苏教楼D202&lt;br /&gt;
* &#039;&#039;&#039;Office hour&#039;&#039;&#039;: &lt;br /&gt;
:* 周四：3pm-5pm，南雍-西229（刘明谋）&lt;br /&gt;
:* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1019436733（申请加入需提供姓名、院系、学号）&lt;br /&gt;
&lt;br /&gt;
= Syllabus =&lt;br /&gt;
课程内容分为三大部分：&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;经典概率论&#039;&#039;&#039;：包括概率空间、随机变量及其数字特征、多维与连续随机变量&lt;br /&gt;
* &#039;&#039;&#039;概率与计算&#039;&#039;&#039;：包括测度集中现象，概率法，离散随机过程三部分&lt;br /&gt;
* &#039;&#039;&#039;数理统计&#039;&#039;&#039;：包括参数估计、假设检验、贝叶斯估计、方差分析、相关性及回归分析等统计推断内容。&lt;br /&gt;
&lt;br /&gt;
对于第一和第二部分，要求清楚掌握基本概念，深刻理解关键的现象与规律以及背后的原理，并可以灵活运用所学方法求解相关问题。对于第三部分，要求熟悉数理统计相关的基本概念，以及典型的统计模型、统计推断方法。&lt;br /&gt;
&lt;br /&gt;
经过本课程的训练，学生将能够掌握概率论和统计学的基本理论和方法，具备处理和分析实际数据的能力，为后续学习数据挖掘、机器学习、大数据技术等数据科学相关领域打下坚实基础。本课程采用课堂讲授、案例分析和课后练习相结合的教学方式，注重理论与实践相结合，培养学生运用所学知识解决实际问题的能力。通过本课程的学习，学生将能够具备扎实的数学基础，为未来从事数据科学研究和实践奠定坚实基础。&lt;br /&gt;
&lt;br /&gt;
=== 教材与参考书 Course Materials ===&lt;br /&gt;
* &#039;&#039;&#039;[BT]&#039;&#039;&#039; 概率导论（第2版·修订版），[美]伯特瑟卡斯（Dimitri P. Bertsekas）、[美]齐齐克利斯（John N. Tsitsiklis）著，郑忠国 童行伟 译，人民邮电出版社（2022）。&lt;br /&gt;
* &#039;&#039;&#039;[MU]&#039;&#039;&#039; &#039;&#039;Probability and Computing: Randomization and Probabilistic Techniques in Algorithms and Data Analysis&#039;&#039;, by Michael Mitzenmacher, Eli Upfal; Cambridge University Press; 2nd edition (2017).&lt;br /&gt;
* &#039;&#039;&#039;[GS]&#039;&#039;&#039; &#039;&#039;Probability and Random Processes&#039;&#039;, by Geoffrey Grimmett and David Stirzaker; Oxford University Press; 4th edition (2020).&lt;br /&gt;
* &#039;&#039;&#039;[BHK]&#039;&#039;&#039; &#039;&#039;Foundations of Data Science&#039;&#039;, by Avrim Blum, John Hopcroft, and Ravindran Kannan; Cambridge University Press (2020).&lt;br /&gt;
&lt;br /&gt;
=== 成绩 Grading Policy ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考试。最终成绩将由平时作业成绩和期末考试成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力地维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= Assignments =&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 1|Problem Set 1]]  请在 2025/09/26 上课之前(14:00 UTC+8)使用邮件的附件功能提交到 [mailto:pr2024_nju@163.com pr2024_nju@163.com] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA1.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 2|Problem Set 2]]  请在 2025/10/03 14:00前(UTC+8)使用邮件的附件功能提交到 [mailto:pr2024_nju@163.com pr2024_nju@163.com] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA2.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 3|Problem Set 3]]  请在 2025/10/17 上课之前(14:00 UTC+8)上传到 [https://box.nju.edu.cn/u/d/e717e1b8eccd4c4fb889/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA3.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 4|Problem Set 4]]  请在 2025/10/31 上课之前(14:00 UTC+8)上传到 [https://box.nju.edu.cn/u/d/fb85c46de75f4095b326/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA4.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 5|Problem Set 5]]  请在 2025/11/21 上课之前(14:00 UTC+8)上传到 [https://box.nju.edu.cn/u/d/1243dac3190b4e1eb30b/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA5.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 6|Problem Set 6]]  请在 2025/12/26 14:00 UTC+8 前上传到 [https://box.nju.edu.cn/u/d/9302de38f13146eeb5e9/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA6.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
&lt;br /&gt;
= Lectures =&lt;br /&gt;
# [https://tcs.nju.edu.cn/wiki/images/1/1a/Intro%EF%BC%88%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%802025%EF%BC%89.pdf 课程简介]&lt;br /&gt;
#* [https://www.bilibili.com/video/BV1Vkz4YqEC9 Bertrand Paradox (贝特朗悖论)的视频]&lt;br /&gt;
# [https://tcs.nju.edu.cn/wiki/images/5/51/ProbSpace%EF%BC%88%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%802025%EF%BC%89.pdf 概率空间]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第1章&#039;&#039;&#039;&lt;br /&gt;
# [https://box.nju.edu.cn/f/732bad4060fc442789ab/ 随机变量]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第2章&#039;&#039;&#039; &lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 2&#039;&#039;&#039;&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/Volume of Hamming balls|Volume of Hamming balls]]&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/Average-case analysis of QuickSort|Average-case analysis of &#039;&#039;&#039;&#039;&#039;QuickSort&#039;&#039;&#039;&#039;&#039;]]&lt;br /&gt;
#* [https://www.bilibili.com/video/BV1ta411A7fp/ 高尔顿板（Galton board）视频] 和 [https://en.wikipedia.org/wiki/Galton_board 维基百科页面]&lt;br /&gt;
# [https://box.nju.edu.cn/f/89f212b7b6874c0e9097/ 偏差和矩]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 3&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 章节 2.4, 4.2, 4.3, 5.1&#039;&#039;&#039;&lt;br /&gt;
#* [[概率论与数理统计 (Spring 2024)/Threshold of k-clique in random graph|Threshold of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-clique in random graph]]&lt;br /&gt;
# [https://box.nju.edu.cn/f/1eca74dafe6c4d11a799/ 连续分布]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第3章, 和4.1节&#039;&#039;&#039; 或 &#039;&#039;&#039;[GS] Chapter 4&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapters 8, 9&#039;&#039;&#039;&lt;br /&gt;
#* [https://measure.axler.net/MIRA.pdf Measure, Integration &amp;amp; Real Analysis] by Sheldon Axler&lt;br /&gt;
# [https://box.nju.edu.cn/f/9a675bedb36243d19616/ 极限定理]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第5章&#039;&#039;&#039; &lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[GS] Sections 5.7~5.10, 7.1~7.5&#039;&#039;&#039;&lt;br /&gt;
# [https://box.nju.edu.cn/f/1049bd7f7974465cbc85/ 测度集中]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 4&#039;&#039;&#039; and &#039;&#039;&#039;Sections 13.1, 13.4~13.5&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[GS] Sections 5.11, 12.1~12.3, 7.8~7.9&#039;&#039;&#039;&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/Hoeffding&#039;s lemma|Hoeffding&#039;s lemma]]&lt;br /&gt;
# [https://box.nju.edu.cn/f/06617a7c88af456696de/ 随机过程]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第6章, 第7章&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 7, Sections 13.1~13.3&#039;&#039;&#039; or &#039;&#039;&#039;[GS] Chapter 6, Sections 12.4~12.5&#039;&#039;&#039;&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/OST and applications|OST and applications]]&lt;br /&gt;
# [https://box.nju.edu.cn/f/be7ade6440ea4462af3b/ 统计学与点估计]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第8章, 第9章&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Sections 9.6~9.7&#039;&#039;&#039;&lt;br /&gt;
# [https://box.nju.edu.cn/f/5e1cb2f1d656460bb60c/ 假设检验]&lt;br /&gt;
&lt;br /&gt;
= Concepts =&lt;br /&gt;
* [https://plato.stanford.edu/entries/probability-interpret/ Interpretations of probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/History_of_probability History of probability]&lt;br /&gt;
* Example problems:&lt;br /&gt;
** [https://dornsifecms.usc.edu/assets/sites/520/docs/VonNeumann-ams12p36-38.pdf von Neumann&#039;s Bernoulli factory] and other [https://peteroupc.github.io/bernoulli.html Bernoulli factory algorithms]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Boy_or_Girl_paradox Boy or Girl paradox]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Monty_Hall_problem Monty Hall problem]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bertrand_paradox_(probability) Bertrand paradox]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Hard_spheres Hard spheres model] and [https://en.wikipedia.org/wiki/Ising_model Ising model]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/PageRank &#039;&#039;PageRank&#039;&#039;] and stationary [https://en.wikipedia.org/wiki/Random_walk random walk]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Diffusion_process Diffusion process] and [https://en.wikipedia.org/wiki/Diffusion_model diffusion model]&lt;br /&gt;
*[https://en.wikipedia.org/wiki/Probability_space Probability space]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Sample_space Sample space]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Event_(probability_theory) Event] and [https://en.wikipedia.org/wiki/Σ-algebra &amp;lt;math&amp;gt;\sigma&amp;lt;/math&amp;gt;-algebra]&lt;br /&gt;
** Kolmogorov&#039;s [https://en.wikipedia.org/wiki/Probability_axioms axioms of probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Discrete_uniform_distribution Classical] and [https://en.wikipedia.org/wiki/Geometric_probability geometric probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Boole%27s_inequality Union bound]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Inclusion%E2%80%93exclusion_principle Inclusion-Exclusion principle]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Boole%27s_inequality#Bonferroni_inequalities Bonferroni inequalities]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Conditional_probability Conditional probability]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Chain_rule_(probability) Chain rule]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_total_probability Law of total probability]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bayes%27_theorem Bayes&#039; law]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Independence_(probability_theory) Independence] &lt;br /&gt;
** [https://en.wikipedia.org/wiki/Pairwise_independence Pairwise independence]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Random_variable Random variable]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Cumulative_distribution_function Cumulative distribution function]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Probability_mass_function Probability mass function]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Probability_density_function Probability density function]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Multivariate_random_variable Random vector]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Joint_probability_distribution Joint probability distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Conditional_probability_distribution Conditional probability distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Marginal_distribution Marginal distribution]&lt;br /&gt;
* Some &#039;&#039;&#039;discrete&#039;&#039;&#039; probability distributions&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bernoulli_trial Bernoulli trial] and [https://en.wikipedia.org/wiki/Bernoulli_distribution Bernoulli distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Discrete_uniform_distribution Discrete uniform distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Binomial_distribution Binomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Geometric_distribution Geometric distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Negative_binomial_distribution Negative binomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Hypergeometric_distribution Hypergeometric distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Poisson_distribution Poisson distribution]&lt;br /&gt;
** and [https://en.wikipedia.org/wiki/List_of_probability_distributions#Discrete_distributions others]&lt;br /&gt;
* Balls into bins model&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Multinomial_distribution Multinomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Birthday_problem Birthday problem]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Coupon_collector%27s_problem Coupon collector]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Balls_into_bins_problem Occupancy problem]&lt;br /&gt;
* Random graphs&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Erd%C5%91s%E2%80%93R%C3%A9nyi_model Erdős–Rényi random graph model]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Galton%E2%80%93Watson_process Galton–Watson branching process]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Expected_value Expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_the_unconscious_statistician Law of the unconscious statistician, &#039;&#039;LOTUS&#039;&#039;]&lt;br /&gt;
** [https://dlsun.github.io/probability/linearity.html Linearity of expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Conditional_expectation Conditional expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_total_expectation Law of total expectation]&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%80_(Fall_2025)/Problem_Set_6&amp;diff=13427</id>
		<title>数据科学基础 (Fall 2025)/Problem Set 6</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%80_(Fall_2025)/Problem_Set_6&amp;diff=13427"/>
		<updated>2025-12-25T13:15:46Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* Problem 3 (Concentration of measure) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*每道题目的解答都要有完整的解题过程，中英文不限。&lt;br /&gt;
&lt;br /&gt;
*我们推荐大家使用LaTeX, markdown等对作业进行排版。&lt;br /&gt;
&lt;br /&gt;
*没有条件的同学可以用纸笔完成作业之后拍照。&lt;br /&gt;
&lt;br /&gt;
==  Assumption throughout Problem Set 6 == &lt;br /&gt;
&amp;lt;p&amp;gt;Without further notice, we are working on probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\Pr)&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Without further notice, we assume that the expectation of random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (LLN &amp;amp; CLT)==&lt;br /&gt;
* [&#039;&#039;&#039;Proportional betting&#039;&#039;&#039;] In each of a sequence of independent bets, a gambler either wins 30%, or loses 25% of her current fortune, each with probability &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Denoting her fortune after &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; bets by &amp;lt;math&amp;gt;F_n&amp;lt;/math&amp;gt;, show that &amp;lt;math&amp;gt;\mathbb E(F_n)\to\infty&amp;lt;/math&amp;gt; as &amp;lt;math&amp;gt;n \to\infty&amp;lt;/math&amp;gt;, while &amp;lt;math&amp;gt;F_n \to 0&amp;lt;/math&amp;gt; almost surely.&lt;br /&gt;
* [&#039;&#039;&#039;Entropy&#039;&#039;&#039;]  The interval &amp;lt;math&amp;gt;[0,1]&amp;lt;/math&amp;gt; is partitioned into &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; disjoint sub-intervals with lengths &amp;lt;math&amp;gt;p_1,p_2,\dots,p_n&amp;lt;/math&amp;gt;, and the entropy of this partition is defined to be &amp;lt;math&amp;gt;h=-\sum^n_{i=1} p_i \log p_i&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;X_1,X_2,\dots&amp;lt;/math&amp;gt; be independent random variables having the uniform distribution on &amp;lt;math&amp;gt;[0,1]&amp;lt;/math&amp;gt;, and let &amp;lt;math&amp;gt;Z_m^{(i)}&amp;lt;/math&amp;gt; be the number of the &amp;lt;math&amp;gt;X_1,X_2,\dots,X_m&amp;lt;/math&amp;gt; which lie in the &amp;lt;math&amp;gt;i&amp;lt;/math&amp;gt;-th interval of the partition above. Show that &amp;lt;math&amp;gt;R_m =\prod^n_{i=1} p_i^{Z_m^{(i)}}&amp;lt;/math&amp;gt; satisfies &amp;lt;math&amp;gt;m^{-1}\cdot\log R_m \to -h&amp;lt;/math&amp;gt; almost surely as &amp;lt;math&amp;gt;m \to\infty&amp;lt;/math&amp;gt;.&lt;br /&gt;
* [&#039;&#039;&#039;Mobilizing a Supermajority&#039;&#039;&#039;] In a society of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; independent individuals, each person independently (i) attends the vote with probability &amp;lt;math&amp;gt;\tau&amp;lt;/math&amp;gt; and abstains with probability &amp;lt;math&amp;gt;1-\tau&amp;lt;/math&amp;gt;; (ii) if attending, votes &amp;quot;Yes&amp;quot; with probability &amp;lt;math&amp;gt;p&amp;lt;/math&amp;gt; and &amp;quot;No&amp;quot; with probability &amp;lt;math&amp;gt;1-p&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;A proposal is accepted if among all attendees, the fraction of Yes votes is at least a supermajority threshold &amp;lt;math&amp;gt;\theta \in (1/2,1)&amp;lt;/math&amp;gt; (e.g., &amp;lt;math&amp;gt;\theta = 2/3&amp;lt;/math&amp;gt;). A mobilization campaign may add &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; extra supporters who certainly attend and certainly vote Yes. Your goal is to determine the minimal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that the proposal passes with probability at least &amp;lt;math&amp;gt;1-\delta&amp;lt;/math&amp;gt;.&lt;br /&gt;
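A minimal Python sketch of the &#039;&#039;&#039;Proportional betting&#039;&#039;&#039; dynamics above (the parameter values — 1000 bets, 2000 trials — are our own illustrative choices): the exact expectation &amp;lt;math&amp;gt;\mathbb E(F_n)=1.025^n&amp;lt;/math&amp;gt; diverges, yet nearly every simulated trajectory ends below its starting fortune.&lt;br /&gt;

```python
import random

def simulate_fortunes(n_bets=1000, n_trials=2000, seed=0):
    """Simulate proportional betting: every bet multiplies the current
    fortune by 1.3 (win 30%) or 0.75 (lose 25%), each with probability 1/2."""
    rng = random.Random(seed)
    finals = []
    for _ in range(n_trials):
        f = 1.0
        for _ in range(n_bets):
            f *= 1.3 if 0.5 > rng.random() else 0.75
        finals.append(f)
    return finals

# E[F_n] = (0.5*1.3 + 0.5*0.75)^n = 1.025^n diverges, while the per-bet
# log-factor has negative mean (log 1.3 + log 0.75)/2, so F_n -> 0 a.s.
finals = simulate_fortunes()
mean_exact = 1.025 ** 1000                       # exact E[F_n] at n = 1000
frac_below_start = sum(1.0 > f for f in finals) / len(finals)
```

With these settings the vast majority of trajectories finish below the initial fortune, matching the almost-sure convergence &amp;lt;math&amp;gt;F_n \to 0&amp;lt;/math&amp;gt;.&lt;br /&gt;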
&lt;br /&gt;
== Problem 3 (Concentration of measure)==&lt;br /&gt;
* [&#039;&#039;&#039;Tossing coins&#039;&#039;&#039;] We repeatedly toss a fair coin (with an equal probability of heads and tails). Let the random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be the number of throws required to obtain a total of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; heads. Show that &amp;lt;math&amp;gt;\Pr[X &amp;gt; 2n + \delta\sqrt{n\log n}]\leq n^{-\delta^2/6}&amp;lt;/math&amp;gt; for any real &amp;lt;math&amp;gt;0&amp;lt;\delta&amp;lt;\sqrt{\frac{4n}{\log n}}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
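For intuition on the &#039;&#039;&#039;Tossing coins&#039;&#039;&#039; bound above, the following Python sketch compares the empirical tail probability with the claimed bound &amp;lt;math&amp;gt;n^{-\delta^2/6}&amp;lt;/math&amp;gt; (the choices &amp;lt;math&amp;gt;n=100&amp;lt;/math&amp;gt;, &amp;lt;math&amp;gt;\delta=1.5&amp;lt;/math&amp;gt; and the trial count are our own).&lt;br /&gt;

```python
import math
import random

def tosses_until_n_heads(n, rng):
    """Toss a fair coin until n heads have appeared; return the total tosses."""
    tosses = heads = 0
    while n > heads:
        tosses += 1
        heads += 0.5 > rng.random()
    return tosses

def empirical_tail(n=100, delta=1.5, trials=10000, seed=0):
    """Estimate Pr[X > 2n + delta*sqrt(n log n)] by direct simulation."""
    rng = random.Random(seed)
    threshold = 2 * n + delta * math.sqrt(n * math.log(n))
    hits = sum(tosses_until_n_heads(n, rng) > threshold for _ in range(trials))
    return hits / trials

p_emp = empirical_tail()
p_bound = 100 ** (-(1.5 ** 2) / 6)    # the claimed bound n^(-delta^2 / 6)
```

In this setting the empirical tail comes out roughly an order of magnitude below the bound, consistent with the inequality being valid but loose.&lt;br /&gt;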
* [&#039;&#039;&#039;&amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound&#039;&#039;&#039;] Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable with expectation &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; such that moment generating function &amp;lt;math&amp;gt;\mathbf{E}[\exp(t|X|)]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt; t &amp;gt; 0 &amp;lt;/math&amp;gt;. We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt; X &amp;lt;/math&amp;gt;: &lt;br /&gt;
** Chernoff Bound:  &amp;lt;math&amp;gt;\Pr[|X| \geq \delta] \leq \min_{t \geq 0} {\mathbb{E}[e^{t|X|}]}/{e^{t\delta}}&amp;lt;/math&amp;gt;;&lt;br /&gt;
** &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;th-Moment Bound:  &amp;lt;math&amp;gt;\Pr[|X| \geq \delta] \leq {\mathbb{E}[|X|^k]}/{\delta^k}&amp;lt;/math&amp;gt;.&lt;br /&gt;
# Show that for each &amp;lt;math&amp;gt;\delta&amp;lt;/math&amp;gt;, there exists a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;th-moment bound is no weaker than the Chernoff bound. (Hint: Use the probabilistic method. Construct a distribution over the choices of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;, and show that the expected &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;th-moment bound is no weaker than the Chernoff bound.)&lt;br /&gt;
# Why would we still prefer the Chernoff bound to the (seemingly) stronger &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound?&lt;br /&gt;
&lt;br /&gt;
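The comparison in part 1 can also be seen numerically for a concrete distribution. The sketch below assumes |X| ~ Exp(1), so that E[exp(t|X|)] = 1/(1-t) for 0 &lt;= t &lt; 1 and E[|X|^k] = k!; this particular choice of distribution (and the function names) is an illustrative assumption, not part of the problem.

```python
import math

def chernoff_bound(delta, grid=100000):
    """Optimized Chernoff bound min over t in [0, 1) of E[e^{t|X|}] / e^{t*delta}
    for |X| ~ Exp(1), where E[e^{t|X|}] = 1/(1 - t)."""
    return min(math.exp(-t * delta) / (1.0 - t)
               for t in (i / grid for i in range(grid)))

def kth_moment_bound(delta, k_max=100):
    """Best k-th moment bound min over k in {1,...,k_max} of E[|X|^k] / delta^k,
    where E[|X|^k] = k! for |X| ~ Exp(1)."""
    return min(math.factorial(k) / delta ** k for k in range(1, k_max + 1))
```

At delta = 10, for instance, the best k-th moment bound comes out a constant factor smaller than the optimized Chernoff bound, consistent with the claim of part 1.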
* [&#039;&#039;&#039;Densest induced subgraph in random graph&#039;&#039;&#039;] For a graph &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; on vertex set &amp;lt;math&amp;gt;[n] = \{1,2,\dots,n\}&amp;lt;/math&amp;gt;, define the average-degree density of an induced subgraph as &amp;lt;math&amp;gt;\mathrm{dens}(S) := \frac{e(S)}{|S|}&amp;lt;/math&amp;gt;, where &amp;lt;math&amp;gt;e(S)&amp;lt;/math&amp;gt; is the number of edges with both endpoints in &amp;lt;math&amp;gt;S&amp;lt;/math&amp;gt;. Define the densest induced subgraph of &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; as &amp;lt;math&amp;gt;\mathrm{dens}(G) := \max_{S \subseteq [n], |S|\ge 2} \mathrm{dens}(S)&amp;lt;/math&amp;gt;. Show that, with probability at least &amp;lt;math&amp;gt;2/3&amp;lt;/math&amp;gt;, the densest induced subgraph in &amp;lt;math&amp;gt;G(n,1/2)&amp;lt;/math&amp;gt; satisfies &amp;lt;math&amp;gt;\mathrm{dens}(G(n,1/2)) \le \frac{n}{4} + O(n^{1/2})&amp;lt;/math&amp;gt;. More precisely, prove that there exists an absolute constant &amp;lt;math&amp;gt;C &amp;gt; 0&amp;lt;/math&amp;gt; such that &amp;lt;math&amp;gt;\Pr\big( \mathrm{dens}(G(n,1/2)) \le \frac{n}{4} + C n^{1/2} \big) \ge \frac{2}{3}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
== Problem 4 (Random processes)==&lt;br /&gt;
&lt;br /&gt;
* [&#039;&#039;&#039;High-dimensional random walk&#039;&#039;&#039;] Consider an unbiased random walk over &amp;lt;math&amp;gt;\mathbb R^n&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;n&amp;gt;1&amp;lt;/math&amp;gt;. At each step, assuming we are at position &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; without loss of generality, for each dimension &amp;lt;math&amp;gt;i&amp;lt;/math&amp;gt;, we choose a movement &amp;lt;math&amp;gt;\delta_i\in\mathbb R&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb E [\delta_i]=0&amp;lt;/math&amp;gt; (i.e. unbiased) at random, then move to &amp;lt;math&amp;gt; X+\sum_i\delta_i&amp;lt;/math&amp;gt;. Prove that such an unbiased random walk, in any number of dimensions and regardless of the distributions of the &amp;lt;math&amp;gt;\delta_i&amp;lt;/math&amp;gt;&#039;s, is an example of a martingale.&lt;br /&gt;
&lt;br /&gt;
* [&#039;&#039;&#039;Pólya’s urn&#039;&#039;&#039;] A bag contains red and blue balls, with initially &amp;lt;math&amp;gt;r&amp;lt;/math&amp;gt; red and &amp;lt;math&amp;gt;b&amp;lt;/math&amp;gt; blue where &amp;lt;math&amp;gt;rb &amp;gt;0 &amp;lt;/math&amp;gt;. A ball is drawn from the bag, its color noted, and then it is returned to the bag together with a new ball of the same color. Let &amp;lt;math&amp;gt;R_n &amp;lt;/math&amp;gt; be the number of red balls after &amp;lt;math&amp;gt;n &amp;lt;/math&amp;gt; such operations. Show that &amp;lt;math&amp;gt;Y_n = R_n/(n + r + b) &amp;lt;/math&amp;gt; is a martingale.&lt;br /&gt;
&lt;br /&gt;
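The martingale property forces E[Y_n] = r/(r+b) for every n, and for small cases this can be verified exactly by propagating the distribution of R_n with rational arithmetic. An illustrative sanity check, not a substitute for the proof:

```python
from fractions import Fraction

def polya_distribution(r, b, n):
    """Exact distribution of R_n, the number of red balls after n draws,
    starting from r red and b blue balls; returns {red_count: probability}."""
    dist = {r: Fraction(1)}
    for step in range(n):
        total = r + b + step                    # balls in the bag before this draw
        new = {}
        for red, pr in dist.items():
            # draw a red ball: a red copy is added
            new[red + 1] = new.get(red + 1, Fraction(0)) + pr * Fraction(red, total)
            # draw a blue ball: red count unchanged
            new[red] = new.get(red, Fraction(0)) + pr * Fraction(total - red, total)
        dist = new
    return dist

def expected_Y(r, b, n):
    """E[Y_n] with Y_n = R_n / (n + r + b), computed exactly."""
    return sum(pr * Fraction(red, n + r + b)
               for red, pr in polya_distribution(r, b, n).items())
```

Since Y_n is a martingale with Y_0 = r/(r+b), expected_Y(r, b, n) returns exactly r/(r+b) for every n.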
* [&#039;&#039;&#039;Optional stopping 1-D symmetric random walk&#039;&#039;&#039;] Let &amp;lt;math&amp;gt;S_n = a + \sum_{r=1}^n X_r&amp;lt;/math&amp;gt; be a simple symmetric random walk. The walk stops at the earliest time &amp;lt;math&amp;gt;T&amp;lt;/math&amp;gt; when it reaches either &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; or &amp;lt;math&amp;gt;K&amp;lt;/math&amp;gt;, where &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; K&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;&lt;br /&gt;
M_n = \sum_{r=0}^n S_r - \tfrac{1}{3} S_n^3&lt;br /&gt;
&amp;lt;/math&amp;gt; is a martingale, and deduce that &amp;lt;math&amp;gt;&lt;br /&gt;
\mathbb{E}\left( \sum_{r=0}^{T} S_r \right)&lt;br /&gt;
= \tfrac{1}{3} (K^2 - a^2) a + a.&lt;br /&gt;
&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
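The deduced identity can be sanity-checked by simulation. This is an illustrative Monte Carlo sketch; the values a = 2, K = 5 are assumptions chosen for the check, for which the formula gives (1/3)(25 - 4)*2 + 2 = 16.

```python
import random

def mean_path_sum(a, K, trials=100000, seed=1):
    """Monte Carlo estimate of E[sum_{r=0}^{T} S_r] for a simple symmetric
    random walk started at a and stopped on first hitting 0 or K."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        s = a
        path_sum = a                      # include S_0 = a
        while 0 < s < K:
            s += 1 if rng.random() < 0.5 else -1
            path_sum += s                 # includes the final S_T as well
        total += path_sum
    return total / trials
```

With a fixed seed the estimate is reproducible and should land close to the value (1/3)(K^2 - a^2)a + a predicted by optional stopping.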
* [&#039;&#039;&#039;Random walk on a graph&#039;&#039;&#039;] A particle performs a random walk on the vertex set of a connected graph &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;, which for simplicity we assume to have neither loops nor multiple edges. At each stage it moves to a neighbor of its current position, each such neighbor being chosen with equal probability. If &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; has &amp;lt;math&amp;gt;\eta&amp;lt;\infty&amp;lt;/math&amp;gt; edges, show that the stationary distribution is given by &amp;lt;math&amp;gt;\pi(v) = d_v/(2\eta)&amp;lt;/math&amp;gt;, where &amp;lt;math&amp;gt;d_v&amp;lt;/math&amp;gt; is the degree of vertex &amp;lt;math&amp;gt;v&amp;lt;/math&amp;gt;.&lt;br /&gt;
* [&#039;&#039;&#039;Reversibility versus periodicity&#039;&#039;&#039;] Can a reversible chain be periodic?&lt;br /&gt;
* [&#039;&#039;&#039;Metropolis–Hastings algorithm&#039;&#039;&#039;] To sample a state, the Glauber dynamics in each step chooses a state uniformly at random from the states adjacent to the current state &amp;lt;math&amp;gt;x&amp;lt;/math&amp;gt; together with &amp;lt;math&amp;gt;x&amp;lt;/math&amp;gt; itself, and moves to the chosen state. The Metropolis-Hastings algorithm generalizes the idea of Glauber dynamics. Suppose we have designed an irreducible state space for our Markov chain; we now want to construct a Markov chain on this state space with a stationary distribution &amp;lt;math&amp;gt;\pi_x = b(x)/B&amp;lt;/math&amp;gt;, where &amp;lt;math&amp;gt;b(x) &amp;gt; 0&amp;lt;/math&amp;gt; for all &amp;lt;math&amp;gt;x \in \Omega&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;B =\sum_{x\in\Omega} b(x)&amp;lt;/math&amp;gt; is finite. &lt;br /&gt;
# For a finite state space &amp;lt;math&amp;gt;\Omega&amp;lt;/math&amp;gt; and neighborhood structure &amp;lt;math&amp;gt;\{N(x) \mid x \in\Omega\}&amp;lt;/math&amp;gt;, let &amp;lt;math&amp;gt;N = \max_{x\in\Omega} |N(x)|&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;M&amp;lt;/math&amp;gt; be any number such that &amp;lt;math&amp;gt;M \ge N&amp;lt;/math&amp;gt;. For all &amp;lt;math&amp;gt;x \in \Omega&amp;lt;/math&amp;gt;, let &amp;lt;math&amp;gt;\pi_x &amp;gt; 0&amp;lt;/math&amp;gt; be the desired probability of state &amp;lt;math&amp;gt;x&amp;lt;/math&amp;gt; in the stationary distribution. Consider a Markov chain where &amp;lt;math&amp;gt;P_{x,y} =&lt;br /&gt;
\begin{cases}(1/M) \min(1, \pi_y/\pi_x ) &amp;amp;\text{if $x \ne y$ and $y \in N(x)$},\\&lt;br /&gt;
0 &amp;amp;\text{if $x \ne y$ and $y \notin N(x)$},\\&lt;br /&gt;
1 - \sum_{y\ne x} P_{x,y} &amp;amp;\text{if $x = y$}\end{cases}&amp;lt;/math&amp;gt;. Assuming this chain is irreducible and aperiodic, verify that the stationary distribution is given by the probabilities &amp;lt;math&amp;gt;\pi_x&amp;lt;/math&amp;gt;. (Hint: Show the time-reversibility.)&lt;br /&gt;
# Let &amp;lt;math&amp;gt;S = \sum_{i=1}^\infty i^{-2} = \pi^2/6&amp;lt;/math&amp;gt;. Design a Markov chain based on the Metropolis-Hastings algorithm on the positive integers such that, in the stationary distribution, &amp;lt;math&amp;gt;\pi_i = 1/(S\cdot i^2)&amp;lt;/math&amp;gt;. The neighbors of any integer &amp;lt;math&amp;gt;i - 1&amp;lt;/math&amp;gt; for your chain should be only &amp;lt;math&amp;gt;i - 1&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;i + 1&amp;lt;/math&amp;gt;, and the only neighbor of &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; should be the integer &amp;lt;math&amp;gt;2&amp;lt;/math&amp;gt;.&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%80_(Fall_2025)&amp;diff=13425</id>
		<title>数据科学基础 (Fall 2025)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%80_(Fall_2025)&amp;diff=13425"/>
		<updated>2025-12-04T17:45:16Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;&#039;&#039;&#039;数据科学基础&#039;&#039;&#039;&amp;lt;br&amp;gt;&lt;br /&gt;
Foundations of Data Science&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =Instructor&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data5   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header6 = &lt;br /&gt;
|label6  = Email&lt;br /&gt;
|data6   = lmm@nju.edu.cn&lt;br /&gt;
|header7 =&lt;br /&gt;
|label7  = Office&lt;br /&gt;
|data7   = 南雍-西229&lt;br /&gt;
|header8 = Class&lt;br /&gt;
|label8  = &lt;br /&gt;
|data8   = &lt;br /&gt;
|header9 =&lt;br /&gt;
|label9  = Class meeting&lt;br /&gt;
|data9   = 周五, 2pm-5pm &amp;lt;br/&amp;gt;苏教楼D202&lt;br /&gt;
|header10=&lt;br /&gt;
|label10 = Office hour&lt;br /&gt;
|data10  = 周四, 3pm-5pm&amp;lt;br/&amp;gt;南雍-西229&lt;br /&gt;
|header11= Textbook&lt;br /&gt;
|label11 = &lt;br /&gt;
|data11  = &lt;br /&gt;
|header12=&lt;br /&gt;
|label12 = &lt;br /&gt;
|data12  = [[File:概率导论.jpeg|border|100px]]&lt;br /&gt;
|header13=&lt;br /&gt;
|label13 = &lt;br /&gt;
|data13  = &#039;&#039;&#039;概率导论&#039;&#039;&#039;（第2版·修订版）&amp;lt;br&amp;gt; Dimitri P. Bertsekas and John N. Tsitsiklis&amp;lt;br&amp;gt; 郑忠国 童行伟 译；人民邮电出版社 (2022)&lt;br /&gt;
|header14=&lt;br /&gt;
|label14 = &lt;br /&gt;
|data14  = [[File:Probability_and_Computing_2ed.jpg|border|100px]]&lt;br /&gt;
|header15=&lt;br /&gt;
|label15 = &lt;br /&gt;
|data15  = &#039;&#039;&#039;Probability and Computing&#039;&#039;&#039; (2E) &amp;lt;br&amp;gt; Michael Mitzenmacher and Eli Upfal &amp;lt;br&amp;gt;   Cambridge University Press (2017)&lt;br /&gt;
|header16=&lt;br /&gt;
|label16 = &lt;br /&gt;
|data16  = [[File:Foundations_of_Data_Science.jpg|border|100px]]&lt;br /&gt;
|header17= &lt;br /&gt;
|label17 = &lt;br /&gt;
|data17  = &#039;&#039;&#039;Foundations of Data Science&#039;&#039;&#039; &amp;lt;br&amp;gt; Avrim Blum, John Hopcroft, Ravi Kannan &amp;lt;br&amp;gt;   Cambridge University Press (2020)&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Foundations of Data Science&#039;&#039; (数据科学基础) class of Fall 2025. Students who take this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= Announcement =&lt;br /&gt;
* 新学期第一堂课：2025年8月29日，苏教楼D202。&lt;br /&gt;
* 2025年11月7日因校运动会停课一次。&lt;br /&gt;
* 第五次作业的Aliasing method 一题中应该是&amp;lt;math&amp;gt;\displaystyle{ \mathbf p=\frac 1{n}\sum^n_{r=1}\mathbf v_r }&amp;lt;/math&amp;gt;而不是 &amp;lt;math&amp;gt;\displaystyle{ \mathbf p=\frac 1{n{-1}}\sum^n_{r=1}\mathbf v_r }&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Course info =&lt;br /&gt;
* &#039;&#039;&#039;Instructor &#039;&#039;&#039;: &lt;br /&gt;
** [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn &amp;lt;lmm@nju.edu.cn&amp;gt;]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;Teaching assistant&#039;&#039;&#039;:&lt;br /&gt;
** 梁梓豪：[mailto:zhliang@smail.nju.edu.cn 📧] 仙林校区计科楼北栋426 &lt;br /&gt;
** 周海刚：[mailto:hgzhou2003@outlook.com 📧] 仙林校区计科楼北栋410&lt;br /&gt;
** 欧丰宁：[mailto:oufn02@outlook.com 📧] 仙林校区计科楼北栋410&lt;br /&gt;
** 于逸潇：[mailto:yixiaoyu@smail.nju.edu.cn 📧] 仙林校区计科楼北栋410&lt;br /&gt;
** 缪天顺：[mailto:mtsmts2022@outlook.com 📧] 仙林校区计科楼北栋426 &lt;br /&gt;
* &#039;&#039;&#039;Class meeting&#039;&#039;&#039;:&lt;br /&gt;
** 周五：2pm-5pm，苏教楼D202&lt;br /&gt;
* &#039;&#039;&#039;Office hour&#039;&#039;&#039;: &lt;br /&gt;
:* 周四：3pm-5pm，南雍-西229（刘明谋）&lt;br /&gt;
:* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1019436733（申请加入需提供姓名、院系、学号）&lt;br /&gt;
&lt;br /&gt;
= Syllabus =&lt;br /&gt;
课程内容分为三大部分：&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;经典概率论&#039;&#039;&#039;：包括概率空间、随机变量及其数字特征、多维与连续随机变量&lt;br /&gt;
* &#039;&#039;&#039;概率与计算&#039;&#039;&#039;：包括测度集中现象，概率法，离散随机过程三部分&lt;br /&gt;
* &#039;&#039;&#039;数理统计&#039;&#039;&#039;：包括参数估计、假设检验、贝叶斯估计、方差分析、相关性及回归分析等统计推断内容。&lt;br /&gt;
&lt;br /&gt;
对于第一和第二部分，要求清楚掌握基本概念，深刻理解关键的现象与规律以及背后的原理，并可以灵活运用所学方法求解相关问题。对于第三部分，要求熟悉数理统计相关的基本概念，以及典型的统计模型、统计推断方法。&lt;br /&gt;
&lt;br /&gt;
经过本课程的训练，学生将能够掌握概率论和统计学的基本理论和方法，具备处理和分析实际数据的能力，为后续学习数据挖掘、机器学习、大数据技术等数据科学相关领域打下坚实基础。本课程采用课堂讲授、案例分析和课后练习相结合的教学方式，注重理论与实践相结合，培养学生运用所学知识解决实际问题的能力。通过本课程的学习，学生将能够具备扎实的数学基础，为未来从事数据科学研究和实践奠定坚实基础。&lt;br /&gt;
&lt;br /&gt;
=== 教材与参考书 Course Materials ===&lt;br /&gt;
* &#039;&#039;&#039;[BT]&#039;&#039;&#039; 概率导论（第2版·修订版），[美]伯特瑟卡斯（Dimitri P.Bertsekas）[美]齐齐克利斯（John N.Tsitsiklis）著，郑忠国 童行伟 译，人民邮电出版社（2022）。&lt;br /&gt;
* &#039;&#039;&#039;[MU]&#039;&#039;&#039; &#039;&#039;Probability and Computing: Randomization and Probabilistic Techniques in Algorithms and Data Analysis&#039;&#039;, by Michael Mitzenmacher, Eli Upfal; Cambridge University Press; 2nd edition (2017).&lt;br /&gt;
* &#039;&#039;&#039;[GS]&#039;&#039;&#039; &#039;&#039;Probability and Random Processes&#039;&#039;, by Geoffrey Grimmett and David Stirzaker; Oxford University Press; 4th edition (2020).&lt;br /&gt;
* &#039;&#039;&#039;[BHK]&#039;&#039;&#039; &#039;&#039;Foundations of Data Science&#039;&#039;, by Avrim Blum, John Hopcroft, and Ravindran Kannan; Cambridge University Press (2020).&lt;br /&gt;
&lt;br /&gt;
=== 成绩 Grading Policy ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考试。最终成绩将由平时作业成绩和期末考试成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力的维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= Assignments =&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 1|Problem Set 1]]  请在 2025/09/26 上课之前(14:00 UTC+8)使用邮件的附件功能提交到 [mailto:pr2024_nju@163.com pr2024_nju@163.com] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA1.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 2|Problem Set 2]]  请在 2025/10/03 14:00前(UTC+8)使用邮件的附件功能提交到 [mailto:pr2024_nju@163.com pr2024_nju@163.com] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA2.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 3|Problem Set 3]]  请在 2025/10/17 上课之前(14:00 UTC+8)上传到 [https://box.nju.edu.cn/u/d/e717e1b8eccd4c4fb889/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA3.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 4|Problem Set 4]]  请在 2025/10/31 上课之前(14:00 UTC+8)上传到 [https://box.nju.edu.cn/u/d/fb85c46de75f4095b326/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA4.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 5|Problem Set 5]]  请在 2025/11/21 上课之前(14:00 UTC+8)上传到 [https://box.nju.edu.cn/u/d/1243dac3190b4e1eb30b/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA5.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 6|Problem Set 6]]  请在 2025/12/26 14:00 UTC+8 前上传到 [https://box.nju.edu.cn/u/d/9302de38f13146eeb5e9/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA6.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
&lt;br /&gt;
= Lectures =&lt;br /&gt;
# [https://tcs.nju.edu.cn/wiki/images/1/1a/Intro%EF%BC%88%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%802025%EF%BC%89.pdf 课程简介]&lt;br /&gt;
#* [https://www.bilibili.com/video/BV1Vkz4YqEC9 Bertrand Paradox (贝特朗悖论)的视频]&lt;br /&gt;
# [https://tcs.nju.edu.cn/wiki/images/5/51/ProbSpace%EF%BC%88%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%802025%EF%BC%89.pdf 概率空间]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第1章&#039;&#039;&#039;&lt;br /&gt;
# [https://box.nju.edu.cn/f/732bad4060fc442789ab/ 随机变量]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第2章&#039;&#039;&#039; &lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 2&#039;&#039;&#039;&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/Volume of Hamming balls|Volume of Hamming balls]]&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/Average-case analysis of QuickSort|Average-case analysis of &#039;&#039;&#039;&#039;&#039;QuickSort&#039;&#039;&#039;&#039;&#039;]]&lt;br /&gt;
#* [https://www.bilibili.com/video/BV1ta411A7fp/ 高尔顿板（Galton board）视频] 和 [https://en.wikipedia.org/wiki/Galton_board 维基百科页面]&lt;br /&gt;
# [https://box.nju.edu.cn/f/89f212b7b6874c0e9097/  ‎偏差和矩]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 3&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 章节 2.4, 4.2, 4.3, 5.1&#039;&#039;&#039;&lt;br /&gt;
#* [[概率论与数理统计 (Spring 2024)/Threshold of k-clique in random graph|Threshold of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-clique in random graph]]&lt;br /&gt;
# [https://box.nju.edu.cn/f/1eca74dafe6c4d11a799/ 连续分布]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第3章, 和4.1节&#039;&#039;&#039; 或 &#039;&#039;&#039;[GS] Chapter 4&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapters 8, 9&#039;&#039;&#039;&lt;br /&gt;
#* [https://measure.axler.net/MIRA.pdf Measure, Integration &amp;amp; Real Analysis] by Sheldon Axler&lt;br /&gt;
# [https://box.nju.edu.cn/f/9a675bedb36243d19616/ 极限定理]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第5章&#039;&#039;&#039; &lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[GS] Sections 5.7~5.10, 7.1~7.5&#039;&#039;&#039;&lt;br /&gt;
# [https://box.nju.edu.cn/f/1049bd7f7974465cbc85/ 测度集中]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapters 4&#039;&#039;&#039; and &#039;&#039;&#039;Sections 13.1, 13.4~13.5&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[GS] Sections 5.11, 12.1~12.3, 7.8~7.9&#039;&#039;&#039;&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/Hoeffding&#039;s lemma|Hoeffding&#039;s lemma]]&lt;br /&gt;
# [https://box.nju.edu.cn/f/06617a7c88af456696de/ 随机过程]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第6章, 第7章&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU]  Chapters 7, Sections 13.1~13.3&#039;&#039;&#039; or &#039;&#039;&#039;[GS] Chapters 6, Sections 12.4~12.5&#039;&#039;&#039;&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/OST and applications|OST and applications]]&lt;br /&gt;
# [https://box.nju.edu.cn/f/be7ade6440ea4462af3b/ 统计学与点估计]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第8章, 第9章&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Section 9.6~9.7&#039;&#039;&#039;&lt;br /&gt;
# [https://box.nju.edu.cn/f/5e1cb2f1d656460bb60c/ 假设检验]&lt;br /&gt;
&lt;br /&gt;
= Concepts =&lt;br /&gt;
* [https://plato.stanford.edu/entries/probability-interpret/ Interpretations of probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/History_of_probability History of probability]&lt;br /&gt;
* Example problems:&lt;br /&gt;
** [https://dornsifecms.usc.edu/assets/sites/520/docs/VonNeumann-ams12p36-38.pdf von Neumann&#039;s Bernoulli factory] and other [https://peteroupc.github.io/bernoulli.html Bernoulli factory algorithms]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Boy_or_Girl_paradox Boy or Girl paradox]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Monty_Hall_problem Monty Hall problem]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bertrand_paradox_(probability) Bertrand paradox]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Hard_spheres Hard spheres model] and [https://en.wikipedia.org/wiki/Ising_model Ising model]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/PageRank &#039;&#039;PageRank&#039;&#039;] and stationary [https://en.wikipedia.org/wiki/Random_walk random walk]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Diffusion_process Diffusion process] and [https://en.wikipedia.org/wiki/Diffusion_model diffusion model]&lt;br /&gt;
*[https://en.wikipedia.org/wiki/Probability_space Probability space]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Sample_space Sample space]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Event_(probability_theory) Event] and [https://en.wikipedia.org/wiki/Σ-algebra &amp;lt;math&amp;gt;\sigma&amp;lt;/math&amp;gt;-algebra]&lt;br /&gt;
** Kolmogorov&#039;s [https://en.wikipedia.org/wiki/Probability_axioms axioms of probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Discrete_uniform_distribution Classical] and [https://en.wikipedia.org/wiki/Geometric_probability geometric probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Boole%27s_inequality Union bound]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Inclusion%E2%80%93exclusion_principle Inclusion-Exclusion principle]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Boole%27s_inequality#Bonferroni_inequalities Bonferroni inequalities]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Conditional_probability Conditional probability]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Chain_rule_(probability) Chain rule]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_total_probability Law of total probability]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bayes%27_theorem Bayes&#039; law]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Independence_(probability_theory) Independence] &lt;br /&gt;
** [https://en.wikipedia.org/wiki/Pairwise_independence Pairwise independence]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Random_variable Random variable]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Cumulative_distribution_function Cumulative distribution function]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Probability_mass_function Probability mass function]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Probability_density_function Probability density function]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Multivariate_random_variable Random vector]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Joint_probability_distribution Joint probability distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Conditional_probability_distribution Conditional probability distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Marginal_distribution Marginal distribution]&lt;br /&gt;
* Some &#039;&#039;&#039;discrete&#039;&#039;&#039; probability distributions&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bernoulli_trial Bernoulli trial] and [https://en.wikipedia.org/wiki/Bernoulli_distribution Bernoulli distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Discrete_uniform_distribution Discrete uniform distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Binomial_distribution Binomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Geometric_distribution Geometric distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Negative_binomial_distribution Negative binomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Hypergeometric_distribution Hypergeometric distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Poisson_distribution Poisson distribution]&lt;br /&gt;
** and [https://en.wikipedia.org/wiki/List_of_probability_distributions#Discrete_distributions others]&lt;br /&gt;
* Balls into bins model&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Multinomial_distribution Multinomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Birthday_problem Birthday problem]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Coupon_collector%27s_problem Coupon collector]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Balls_into_bins_problem Occupancy problem]&lt;br /&gt;
* Random graphs&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Erd%C5%91s%E2%80%93R%C3%A9nyi_model Erdős–Rényi random graph model]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Galton%E2%80%93Watson_process Galton–Watson branching process]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Expected_value Expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_the_unconscious_statistician Law of the unconscious statistician, &#039;&#039;LOTUS&#039;&#039;]&lt;br /&gt;
** [https://dlsun.github.io/probability/linearity.html Linearity of expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Conditional_expectation Conditional expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_total_expectation Law of total expectation]&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%80_(Fall_2025)&amp;diff=13424</id>
		<title>数据科学基础 (Fall 2025)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%80_(Fall_2025)&amp;diff=13424"/>
		<updated>2025-12-04T17:34:35Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* Assignments */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;&#039;&#039;&#039;数据科学基础&#039;&#039;&#039;&amp;lt;br&amp;gt;&lt;br /&gt;
Foundations of Data Science&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =Instructor&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data5   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header6 = &lt;br /&gt;
|label6  = Email&lt;br /&gt;
|data6   = lmm@nju.edu.cn&lt;br /&gt;
|header7 =&lt;br /&gt;
|label7  = Office&lt;br /&gt;
|data7   = 南雍-西229&lt;br /&gt;
|header8 = Class&lt;br /&gt;
|label8  = &lt;br /&gt;
|data8   = &lt;br /&gt;
|header9 =&lt;br /&gt;
|label9  = Class meeting&lt;br /&gt;
|data9   = 周五, 2pm-5pm &amp;lt;br/&amp;gt;苏教楼C204&lt;br /&gt;
|header10=&lt;br /&gt;
|label10 = Office hour&lt;br /&gt;
|data10  = 周四, 3pm-5pm&amp;lt;br/&amp;gt;南雍-西229&lt;br /&gt;
|header11= Textbook&lt;br /&gt;
|label11 = &lt;br /&gt;
|data11  = &lt;br /&gt;
|header12=&lt;br /&gt;
|label12 = &lt;br /&gt;
|data12  = [[File:概率导论.jpeg|border|100px]]&lt;br /&gt;
|header13=&lt;br /&gt;
|label13 = &lt;br /&gt;
|data13  = &#039;&#039;&#039;概率导论&#039;&#039;&#039;（第2版·修订版）&amp;lt;br&amp;gt; Dimitri P. Bertsekas and John N. Tsitsiklis&amp;lt;br&amp;gt; 郑忠国 童行伟 译；人民邮电出版社 (2022)&lt;br /&gt;
|header14=&lt;br /&gt;
|label14 = &lt;br /&gt;
|data14  = [[File:Probability_and_Computing_2ed.jpg|border|100px]]&lt;br /&gt;
|header15=&lt;br /&gt;
|label15 = &lt;br /&gt;
|data15  = &#039;&#039;&#039;Probability and Computing&#039;&#039;&#039; (2E) &amp;lt;br&amp;gt; Michael Mitzenmacher and Eli Upfal &amp;lt;br&amp;gt;   Cambridge University Press (2017)&lt;br /&gt;
|header16=&lt;br /&gt;
|label16 = &lt;br /&gt;
|data16  = [[File:Foundations_of_Data_Science.jpg|border|100px]]&lt;br /&gt;
|header17= &lt;br /&gt;
|label17 = &lt;br /&gt;
|data17  = &#039;&#039;&#039;Foundations of Data Science&#039;&#039;&#039; &amp;lt;br&amp;gt; Avrim Blum, John Hopcroft, Ravi Kannan &amp;lt;br&amp;gt;   Cambridge University Press (2020)&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Foundations of Data Science&#039;&#039; (数据科学基础) class of Fall 2025. Students who take this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= Announcement =&lt;br /&gt;
* 新学期第一堂课：2025年8月29日，苏教楼D202。&lt;br /&gt;
* 2025年11月7日因校运动会停课一次。&lt;br /&gt;
* 第五次作业的Aliasing method 一题中应该是&amp;lt;math&amp;gt;\displaystyle{ \mathbf p=\frac 1{n}\sum^n_{r=1}\mathbf v_r }&amp;lt;/math&amp;gt;而不是 &amp;lt;math&amp;gt;\displaystyle{ \mathbf p=\frac 1{n{-1}}\sum^n_{r=1}\mathbf v_r }&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
= Course info =&lt;br /&gt;
* &#039;&#039;&#039;Instructor &#039;&#039;&#039;: &lt;br /&gt;
** [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn &amp;lt;lmm@nju.edu.cn&amp;gt;]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;Teaching assistant&#039;&#039;&#039;:&lt;br /&gt;
** 梁梓豪：[mailto:zhliang@smail.nju.edu.cn 📧] 仙林校区计科楼北栋426 &lt;br /&gt;
** 周海刚：[mailto:hgzhou2003@outlook.com 📧] 仙林校区计科楼北栋410&lt;br /&gt;
** 欧丰宁：[mailto:oufn02@outlook.com 📧] 仙林校区计科楼北栋410&lt;br /&gt;
** 于逸潇：[mailto:yixiaoyu@smail.nju.edu.cn 📧] 仙林校区计科楼北栋410&lt;br /&gt;
** 缪天顺：[mailto:mtsmts2022@outlook.com 📧] 仙林校区计科楼北栋426 &lt;br /&gt;
* &#039;&#039;&#039;Class meeting&#039;&#039;&#039;:&lt;br /&gt;
** 周五：2pm-5pm，苏教楼C204&lt;br /&gt;
* &#039;&#039;&#039;Office hour&#039;&#039;&#039;: &lt;br /&gt;
:* 周四：3pm-5pm，南雍-西229（刘明谋）&lt;br /&gt;
:* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1019436733（申请加入需提供姓名、院系、学号）&lt;br /&gt;
&lt;br /&gt;
= Syllabus =&lt;br /&gt;
课程内容分为三大部分：&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;经典概率论&#039;&#039;&#039;：包括概率空间、随机变量及其数字特征、多维与连续随机变量&lt;br /&gt;
* &#039;&#039;&#039;概率与计算&#039;&#039;&#039;：包括测度集中现象，概率法，离散随机过程三部分&lt;br /&gt;
* &#039;&#039;&#039;数理统计&#039;&#039;&#039;：包括参数估计、假设检验、贝叶斯估计、方差分析、相关性及回归分析等统计推断内容。&lt;br /&gt;
&lt;br /&gt;
对于第一和第二部分，要求清楚掌握基本概念，深刻理解关键的现象与规律以及背后的原理，并可以灵活运用所学方法求解相关问题。对于第三部分，要求熟悉数理统计相关的基本概念，以及典型的统计模型、统计推断方法。&lt;br /&gt;
&lt;br /&gt;
经过本课程的训练，学生将能够掌握概率论和统计学的基本理论和方法，具备处理和分析实际数据的能力，为后续学习数据挖掘、机器学习、大数据技术等数据科学相关领域打下坚实基础。本课程采用课堂讲授、案例分析和课后练习相结合的教学方式，注重理论与实践相结合，培养学生运用所学知识解决实际问题的能力。通过本课程的学习，学生将能够具备扎实的数学基础，为未来从事数据科学研究和实践奠定坚实基础。&lt;br /&gt;
&lt;br /&gt;
=== 教材与参考书 Course Materials ===&lt;br /&gt;
* &#039;&#039;&#039;[BT]&#039;&#039;&#039; 概率导论（第2版·修订版），[美]伯特瑟卡斯（Dimitri P.Bertsekas）[美]齐齐克利斯（John N.Tsitsiklis）著，郑忠国 童行伟 译，人民邮电出版社（2022）。&lt;br /&gt;
* &#039;&#039;&#039;[MU]&#039;&#039;&#039; &#039;&#039;Probability and Computing: Randomization and Probabilistic Techniques in Algorithms and Data Analysis&#039;&#039;, by Michael Mitzenmacher, Eli Upfal; Cambridge University Press; 2nd edition (2017).&lt;br /&gt;
* &#039;&#039;&#039;[GS]&#039;&#039;&#039; &#039;&#039;Probability and Random Processes&#039;&#039;, by Geoffrey Grimmett and David Stirzaker; Oxford University Press; 4th edition (2020).&lt;br /&gt;
* &#039;&#039;&#039;[BHK]&#039;&#039;&#039; &#039;&#039;Foundations of Data Science&#039;&#039;, by Avrim Blum, John Hopcroft, and Ravindran Kannan; Cambridge University Press (2020).&lt;br /&gt;
&lt;br /&gt;
=== 成绩 Grading Policy ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考试。最终成绩将由平时作业成绩和期末考试成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力的维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= Assignments =&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 1|Problem Set 1]]  请在 2025/09/26 上课之前(14:00 UTC+8)使用邮件的附件功能提交到 [mailto:pr2024_nju@163.com pr2024_nju@163.com] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA1.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 2|Problem Set 2]]  请在 2025/10/03 14:00前(UTC+8)使用邮件的附件功能提交到 [mailto:pr2024_nju@163.com pr2024_nju@163.com] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA2.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 3|Problem Set 3]]  请在 2025/10/17 上课之前(14:00 UTC+8)上传到 [https://box.nju.edu.cn/u/d/e717e1b8eccd4c4fb889/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA3.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 4|Problem Set 4]]  请在 2025/10/31 上课之前(14:00 UTC+8)上传到 [https://box.nju.edu.cn/u/d/fb85c46de75f4095b326/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA4.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 5|Problem Set 5]]  请在 2025/11/21 上课之前(14:00 UTC+8)上传到 [https://box.nju.edu.cn/u/d/1243dac3190b4e1eb30b/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA5.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 6|Problem Set 6]]  请在 2025/12/26 14:00 UTC+8 前上传到 [https://box.nju.edu.cn/u/d/9302de38f13146eeb5e9/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA6.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
&lt;br /&gt;
= Lectures =&lt;br /&gt;
# [https://tcs.nju.edu.cn/wiki/images/1/1a/Intro%EF%BC%88%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%802025%EF%BC%89.pdf 课程简介]&lt;br /&gt;
#* [https://www.bilibili.com/video/BV1Vkz4YqEC9 Bertrand Paradox (贝特朗悖论)的视频]&lt;br /&gt;
# [https://tcs.nju.edu.cn/wiki/images/5/51/ProbSpace%EF%BC%88%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%802025%EF%BC%89.pdf 概率空间]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第1章&#039;&#039;&#039;&lt;br /&gt;
# [https://box.nju.edu.cn/f/732bad4060fc442789ab/ 随机变量]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第2章&#039;&#039;&#039; &lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 2&#039;&#039;&#039;&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/Volume of Hamming balls|Volume of Hamming balls]]&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/Average-case analysis of QuickSort|Average-case analysis of &#039;&#039;&#039;&#039;&#039;QuickSort&#039;&#039;&#039;&#039;&#039;]]&lt;br /&gt;
#* [https://www.bilibili.com/video/BV1ta411A7fp/ 高尔顿板（Galton board）视频] 和 [https://en.wikipedia.org/wiki/Galton_board 维基百科页面]&lt;br /&gt;
# [https://box.nju.edu.cn/f/89f212b7b6874c0e9097/ 偏差和矩]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 3&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 章节 2.4, 4.2, 4.3, 5.1&#039;&#039;&#039;&lt;br /&gt;
#* [[概率论与数理统计 (Spring 2024)/Threshold of k-clique in random graph|Threshold of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-clique in random graph]]&lt;br /&gt;
# [https://box.nju.edu.cn/f/1eca74dafe6c4d11a799/ 连续分布]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第3章, 和4.1节&#039;&#039;&#039; 或 &#039;&#039;&#039;[GS] Chapter 4&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapters 8, 9&#039;&#039;&#039;&lt;br /&gt;
#* [https://measure.axler.net/MIRA.pdf Measure, Integration &amp;amp; Real Analysis] by Sheldon Axler&lt;br /&gt;
# [https://box.nju.edu.cn/f/9a675bedb36243d19616/ 极限定理]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第5章&#039;&#039;&#039; &lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[GS] Sections 5.7~5.10, 7.1~7.5&#039;&#039;&#039;&lt;br /&gt;
# [https://box.nju.edu.cn/f/1049bd7f7974465cbc85/ 测度集中]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 4&#039;&#039;&#039; and &#039;&#039;&#039;Sections 13.1, 13.4~13.5&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[GS] Sections 5.11, 12.1~12.3, 7.8~7.9&#039;&#039;&#039;&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/Hoeffding&#039;s lemma|Hoeffding&#039;s lemma]]&lt;br /&gt;
# [https://box.nju.edu.cn/f/06617a7c88af456696de/ 随机过程]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第6章, 第7章&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 7, Sections 13.1~13.3&#039;&#039;&#039; or &#039;&#039;&#039;[GS] Chapter 6, Sections 12.4~12.5&#039;&#039;&#039;&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/OST and applications|OST and applications]]&lt;br /&gt;
# [https://box.nju.edu.cn/f/be7ade6440ea4462af3b/ 统计学与点估计]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第8章, 第9章&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Sections 9.6~9.7&#039;&#039;&#039;&lt;br /&gt;
# [https://box.nju.edu.cn/f/5e1cb2f1d656460bb60c/ 假设检验]&lt;br /&gt;
&lt;br /&gt;
= Concepts =&lt;br /&gt;
* [https://plato.stanford.edu/entries/probability-interpret/ Interpretations of probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/History_of_probability History of probability]&lt;br /&gt;
* Example problems:&lt;br /&gt;
** [https://dornsifecms.usc.edu/assets/sites/520/docs/VonNeumann-ams12p36-38.pdf von Neumann&#039;s Bernoulli factory] and other [https://peteroupc.github.io/bernoulli.html Bernoulli factory algorithms]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Boy_or_Girl_paradox Boy or Girl paradox]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Monty_Hall_problem Monty Hall problem]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bertrand_paradox_(probability) Bertrand paradox]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Hard_spheres Hard spheres model] and [https://en.wikipedia.org/wiki/Ising_model Ising model]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/PageRank &#039;&#039;PageRank&#039;&#039;] and stationary [https://en.wikipedia.org/wiki/Random_walk random walk]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Diffusion_process Diffusion process] and [https://en.wikipedia.org/wiki/Diffusion_model diffusion model]&lt;br /&gt;
*[https://en.wikipedia.org/wiki/Probability_space Probability space]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Sample_space Sample space]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Event_(probability_theory) Event] and [https://en.wikipedia.org/wiki/Σ-algebra &amp;lt;math&amp;gt;\sigma&amp;lt;/math&amp;gt;-algebra]&lt;br /&gt;
** Kolmogorov&#039;s [https://en.wikipedia.org/wiki/Probability_axioms axioms of probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Discrete_uniform_distribution Classical] and [https://en.wikipedia.org/wiki/Geometric_probability geometric probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Boole%27s_inequality Union bound]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Inclusion%E2%80%93exclusion_principle Inclusion-Exclusion principle]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Boole%27s_inequality#Bonferroni_inequalities Bonferroni inequalities]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Conditional_probability Conditional probability]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Chain_rule_(probability) Chain rule]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_total_probability Law of total probability]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bayes%27_theorem Bayes&#039; law]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Independence_(probability_theory) Independence] &lt;br /&gt;
** [https://en.wikipedia.org/wiki/Pairwise_independence Pairwise independence]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Random_variable Random variable]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Cumulative_distribution_function Cumulative distribution function]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Probability_mass_function Probability mass function]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Probability_density_function Probability density function]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Multivariate_random_variable Random vector]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Joint_probability_distribution Joint probability distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Conditional_probability_distribution Conditional probability distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Marginal_distribution Marginal distribution]&lt;br /&gt;
* Some &#039;&#039;&#039;discrete&#039;&#039;&#039; probability distributions&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bernoulli_trial Bernoulli trial] and [https://en.wikipedia.org/wiki/Bernoulli_distribution Bernoulli distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Discrete_uniform_distribution Discrete uniform distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Binomial_distribution Binomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Geometric_distribution Geometric distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Negative_binomial_distribution Negative binomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Hypergeometric_distribution Hypergeometric distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Poisson_distribution Poisson distribution]&lt;br /&gt;
** and [https://en.wikipedia.org/wiki/List_of_probability_distributions#Discrete_distributions others]&lt;br /&gt;
* Balls into bins model&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Multinomial_distribution Multinomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Birthday_problem Birthday problem]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Coupon_collector%27s_problem Coupon collector]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Balls_into_bins_problem Occupancy problem]&lt;br /&gt;
* Random graphs&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Erd%C5%91s%E2%80%93R%C3%A9nyi_model Erdős–Rényi random graph model]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Galton%E2%80%93Watson_process Galton–Watson branching process]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Expected_value Expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_the_unconscious_statistician Law of the unconscious statistician, &#039;&#039;LOTUS&#039;&#039;]&lt;br /&gt;
** [https://dlsun.github.io/probability/linearity.html Linearity of expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Conditional_expectation Conditional expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_total_expectation Law of total expectation]&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%80_(Fall_2025)/Problem_Set_6&amp;diff=13423</id>
		<title>数据科学基础 (Fall 2025)/Problem Set 6</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%80_(Fall_2025)/Problem_Set_6&amp;diff=13423"/>
		<updated>2025-12-04T17:30:26Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: Created page with &amp;quot;*每道题目的解答都要有完整的解题过程，中英文不限。  *我们推荐大家使用LaTeX, markdown等对作业进行排版。  *没有条件的同学可以用纸笔完成作业之后拍照。  ==  Assumption throughout Problem Set 6 ==  &amp;lt;p&amp;gt;Without further notice, we are working on probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\Pr)&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;  &amp;lt;p&amp;gt;Without further notice, we assume that the expectation of random variables are well-defined.&amp;lt;/p&amp;gt;  == Problem 1...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;*每道题目的解答都要有完整的解题过程，中英文不限。&lt;br /&gt;
&lt;br /&gt;
*我们推荐大家使用LaTeX, markdown等对作业进行排版。&lt;br /&gt;
&lt;br /&gt;
*没有条件的同学可以用纸笔完成作业之后拍照。&lt;br /&gt;
&lt;br /&gt;
==  Assumption throughout Problem Set 6 == &lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we work in a probability space &amp;lt;math&amp;gt;(\Omega,\mathcal{F},\Pr)&amp;lt;/math&amp;gt;.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;p&amp;gt;Unless stated otherwise, we assume that the expectations of random variables are well-defined.&amp;lt;/p&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Problem 1 (LLN &amp;amp; CLT)==&lt;br /&gt;
* [&#039;&#039;&#039;Proportional betting&#039;&#039;&#039;] In each of a sequence of independent bets, a gambler either wins 30%, or loses 25% of her current fortune, each with probability &amp;lt;math&amp;gt;1/2&amp;lt;/math&amp;gt;. Denoting her fortune after &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; bets by &amp;lt;math&amp;gt;F_n&amp;lt;/math&amp;gt;, show that &amp;lt;math&amp;gt;\mathbb E(F_n)\to\infty&amp;lt;/math&amp;gt; as &amp;lt;math&amp;gt;n \to\infty&amp;lt;/math&amp;gt;, while &amp;lt;math&amp;gt;F_n \to 0&amp;lt;/math&amp;gt; almost surely.&lt;br /&gt;
* [&#039;&#039;&#039;Entropy&#039;&#039;&#039;] The interval &amp;lt;math&amp;gt;[0,1]&amp;lt;/math&amp;gt; is partitioned into &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; disjoint sub-intervals with lengths &amp;lt;math&amp;gt;p_1,p_2,\dots,p_n&amp;lt;/math&amp;gt;, and the entropy of this partition is defined to be &amp;lt;math&amp;gt;h = -\sum^n_{i=1} p_i \log p_i&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;X_1,X_2,\dots&amp;lt;/math&amp;gt; be independent random variables having the uniform distribution on &amp;lt;math&amp;gt;[0,1]&amp;lt;/math&amp;gt;, and let &amp;lt;math&amp;gt;Z_m^{(i)}&amp;lt;/math&amp;gt; be the number of the &amp;lt;math&amp;gt;X_1,X_2,\dots,X_m&amp;lt;/math&amp;gt; which lie in the &amp;lt;math&amp;gt;i&amp;lt;/math&amp;gt;-th interval of the partition above. Show that &amp;lt;math&amp;gt;R_m =\prod^n_{i=1} p_i^{Z_m^{(i)}}&amp;lt;/math&amp;gt; satisfies &amp;lt;math&amp;gt;m^{-1}\cdot\log R_m \to -h&amp;lt;/math&amp;gt; almost surely as &amp;lt;math&amp;gt;m \to\infty&amp;lt;/math&amp;gt;.&lt;br /&gt;
* [&#039;&#039;&#039;Mobilizing a Supermajority&#039;&#039;&#039;] In a society of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; independent individuals, each person independently (i) attends the vote with probability &amp;lt;math&amp;gt;\tau&amp;lt;/math&amp;gt; and abstains with probability &amp;lt;math&amp;gt;1-\tau&amp;lt;/math&amp;gt;; (ii) if attending, votes &amp;quot;Yes&amp;quot; with probability &amp;lt;math&amp;gt;p&amp;lt;/math&amp;gt; and &amp;quot;No&amp;quot; with probability &amp;lt;math&amp;gt;1-p&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;A proposal is accepted if among all attendees, the fraction of Yes votes is at least a supermajority threshold &amp;lt;math&amp;gt;\theta \in (1/2,1)&amp;lt;/math&amp;gt; (e.g., &amp;lt;math&amp;gt;\theta = 2/3&amp;lt;/math&amp;gt;). A mobilization campaign may add &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; extra supporters who certainly attend and certainly vote Yes. Your goal is to determine the minimal &amp;lt;math&amp;gt;m&amp;lt;/math&amp;gt; such that the proposal passes with probability at least &amp;lt;math&amp;gt;1-\delta&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
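The proportional-betting dichotomy in Problem 1 can be checked numerically: the one-step expectation factor of the fortune exceeds 1, while the per-bet drift of the logarithm of the fortune is negative. A minimal Python sketch (the starting fortune of 1, the random seed, and the horizon of 10000 bets are illustrative assumptions, not part of the problem):

```python
import math
import random

random.seed(0)

# One-step expectation factor of the fortune: 0.5*1.30 + 0.5*0.75 = 1.025,
# so E[F_n] = 1.025**n diverges (starting fortune F_0 = 1 is assumed).
growth = 0.5 * 1.30 + 0.5 * 0.75

# Per-bet drift of log F_n: 0.5*log(1.30) + 0.5*log(0.75) is negative, so by
# the strong law of large numbers (1/n)*log F_n converges to this drift,
# and hence F_n tends to 0 almost surely.
drift = 0.5 * math.log(1.30) + 0.5 * math.log(0.75)

def simulate(n_bets):
    """Simulate one trajectory of the fortune, starting from 1."""
    f = 1.0
    for _ in range(n_bets):
        f *= 1.30 if random.getrandbits(1) else 0.75
    return f

f_final = simulate(10_000)
print(growth, drift, f_final)
```

With these parameters the simulated fortune after 10000 bets is astronomically small even though its expectation is astronomically large, which is the point of the exercise.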
== Problem 2 (Concentration of measure)==&lt;br /&gt;
* [&#039;&#039;&#039;Tossing coins&#039;&#039;&#039;] We repeatedly toss a fair coin (with an equal probability of heads and tails). Let the random variable &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be the number of throws required to obtain a total of &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; heads. Show that &amp;lt;math&amp;gt;\Pr[X &amp;gt; 2n + \delta\sqrt{n\log n}]\leq n^{-\delta^2/6}&amp;lt;/math&amp;gt; for any real &amp;lt;math&amp;gt;0&amp;lt;\delta&amp;lt;\sqrt{\frac{4n}{\log n}}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
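For the tossing-coins bound, note that X exceeds m exactly when fewer than n heads occur among the first m tosses, so the tail probability equals an exact binomial sum and can be compared with the claimed bound. A sketch with illustrative values n = 50 and delta = 1 (these values, and taking log as the natural logarithm, are assumptions for the sake of the example):

```python
import math

# Pr[X > m] = Pr[fewer than n heads in the first m tosses]
#           = Pr[Bin(m, 1/2) is at most n - 1], computed exactly.
def tail(n, m):
    return sum(math.comb(m, h) for h in range(n)) / 2**m

# Illustrative parameters (assumptions): n = 50, delta = 1.
n, delta = 50, 1.0
m = math.ceil(2 * n + delta * math.sqrt(n * math.log(n)))
bound = n ** (-delta**2 / 6)
exact = tail(n, m)
print(exact, bound)
```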
* [&#039;&#039;&#039;&amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound&#039;&#039;&#039;] Let &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt; be a random variable with expectation &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; such that the moment generating function &amp;lt;math&amp;gt;\mathbf{E}[\exp(t|X|)]&amp;lt;/math&amp;gt; is finite for some &amp;lt;math&amp;gt; t &amp;gt; 0 &amp;lt;/math&amp;gt;. We can use the following two kinds of tail inequalities for &amp;lt;math&amp;gt; X &amp;lt;/math&amp;gt;: &lt;br /&gt;
** Chernoff Bound:  &amp;lt;math&amp;gt;\Pr[|X| \geq \delta] \leq \min_{t \geq 0} {\mathbb{E}[e^{t|X|}]}/{e^{t\delta}}&amp;lt;/math&amp;gt;;&lt;br /&gt;
** &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;th-Moment Bound:  &amp;lt;math&amp;gt;\Pr[|X| \geq \delta] \leq {\mathbb{E}[|X|^k]}/{\delta^k}&amp;lt;/math&amp;gt;.&lt;br /&gt;
# Show that for each &amp;lt;math&amp;gt;\delta&amp;lt;/math&amp;gt;, there exists a choice of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such that the &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;th-moment bound is no weaker than the Chernoff bound. (Hint: Use the probabilistic method. Construct a distribution over all &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;th-moment bounds, and show that the expected bound is no weaker than the Chernoff bound.)&lt;br /&gt;
# Why would we still prefer the Chernoff bound to the (seemingly) stronger &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-th moment bound?&lt;br /&gt;
&lt;br /&gt;
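The comparison in part 1 can also be observed numerically on a concrete distribution. In the sketch below, X is a sum of 20 independent Rademacher variables and delta = 10; these choices, and the finite grid over t, are illustrative assumptions rather than part of the problem:

```python
import math

# Illustrative distribution (an assumption, not part of the problem):
# X is a sum of n = 20 independent Rademacher (+1/-1) variables, so
# Pr[X = 2h - n] = C(n, h) / 2**n for h = 0, ..., n.
n = 20
pmf = {2 * h - n: math.comb(n, h) / 2**n for h in range(n + 1)}

def chernoff(delta):
    """Grid approximation of the optimized Chernoff bound on Pr[|X| at least delta]."""
    best = 1.0  # t = 0 gives the trivial bound 1
    for i in range(1, 500):
        t = 0.01 * i
        mgf = sum(p * math.exp(t * abs(x)) for x, p in pmf.items())
        best = min(best, mgf / math.exp(t * delta))
    return best

def moment(delta, kmax=40):
    """Best k-th moment bound over even k up to kmax: E[|X|**k] / delta**k."""
    return min(sum(p * abs(x)**k for x, p in pmf.items()) / delta**k
               for k in range(2, kmax + 1, 2))

delta = 10.0
cb, mb = chernoff(delta), moment(delta)
print(cb, mb)
```

On this example the best moment bound is at most the optimized Chernoff bound, as part 1 predicts.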
* [&#039;&#039;&#039;Densest induced subgraph in random graph&#039;&#039;&#039;] For a graph &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; on vertex set &amp;lt;math&amp;gt;[n] = \{1,2,\dots,n\}&amp;lt;/math&amp;gt;, define the average-degree density of an induced subgraph as &amp;lt;math&amp;gt;\mathrm{dens}(S) := \frac{e(S)}{|S|}&amp;lt;/math&amp;gt;, where &amp;lt;math&amp;gt;e(S)&amp;lt;/math&amp;gt; is the number of edges with both endpoints in &amp;lt;math&amp;gt;S&amp;lt;/math&amp;gt;. Define the densest induced subgraph density of &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; as &amp;lt;math&amp;gt;\mathrm{dens}(G) := \max_{S \subseteq [n], |S|\ge 2} \mathrm{dens}(S)&amp;lt;/math&amp;gt;. Show that, with probability at least &amp;lt;math&amp;gt;2/3&amp;lt;/math&amp;gt;, the densest induced subgraph in &amp;lt;math&amp;gt;G(n,1/2)&amp;lt;/math&amp;gt; satisfies &amp;lt;math&amp;gt;\mathrm{dens}(G(n,1/2)) \le \frac{n}{2} + O(n^{1/2})&amp;lt;/math&amp;gt;. More precisely, prove that there exists an absolute constant &amp;lt;math&amp;gt;C &amp;gt; 0&amp;lt;/math&amp;gt; such that &amp;lt;math&amp;gt;\Pr\big( \mathrm{dens}(G(n,1/2)) \le \frac{n}{2} + C n^{1/2} \big) \ge \frac{2}{3}&amp;lt;/math&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Problem 3 (Random processes)==&lt;br /&gt;
&lt;br /&gt;
* [&#039;&#039;&#039;High-dimensional random walk&#039;&#039;&#039;] Consider an unbiased random walk over &amp;lt;math&amp;gt;\mathbb R^n&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;n&amp;gt;1&amp;lt;/math&amp;gt;. At each step, assuming we are at position &amp;lt;math&amp;gt;X&amp;lt;/math&amp;gt;, we choose for each dimension &amp;lt;math&amp;gt;i&amp;lt;/math&amp;gt; a random movement &amp;lt;math&amp;gt;\delta_i\in\mathbb R&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;\mathbb E [\delta_i]=0&amp;lt;/math&amp;gt; (i.e. unbiased), then move to &amp;lt;math&amp;gt; X+\sum_i\delta_i&amp;lt;/math&amp;gt;. Prove that such an unbiased random walk, in any number of dimensions and regardless of the distributions of the &amp;lt;math&amp;gt;\delta_i&amp;lt;/math&amp;gt;&#039;s, is a martingale.&lt;br /&gt;
&lt;br /&gt;
* [&#039;&#039;&#039;Pólya’s urn&#039;&#039;&#039;] A bag contains red and blue balls, with initially &amp;lt;math&amp;gt;r&amp;lt;/math&amp;gt; red and &amp;lt;math&amp;gt;b&amp;lt;/math&amp;gt; blue where &amp;lt;math&amp;gt;rb &amp;gt; 0&amp;lt;/math&amp;gt;. A ball is drawn from the bag, its color noted, and then it is returned to the bag together with a new ball of the same color. Let &amp;lt;math&amp;gt;R_n&amp;lt;/math&amp;gt; be the number of red balls after &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; such operations. Show that &amp;lt;math&amp;gt;Y_n = R_n/(n + r + b)&amp;lt;/math&amp;gt; is a martingale.&lt;br /&gt;
&lt;br /&gt;
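For Pólya's urn, the martingale property reduces to a one-step identity, which the following sketch verifies in exact rational arithmetic (the ranges of R and T below are small illustrative test values):

```python
from fractions import Fraction

# With R red balls out of T = n + r + b in total and Y = R/T, one drawing
# step gives E[Y'] = (R/T)*(R+1)/(T+1) + (1 - R/T)*R/(T+1), which should
# equal R/T exactly -- that is the martingale property of Y_n.
def one_step_expectation(R, T):
    p_red = Fraction(R, T)
    return p_red * Fraction(R + 1, T + 1) + (1 - p_red) * Fraction(R, T + 1)

for T in range(2, 30):
    for R in range(1, T):
        assert one_step_expectation(R, T) == Fraction(R, T)
print("one-step martingale identity verified")
```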
* [&#039;&#039;&#039;Optional stopping 1-D symmetric random walk&#039;&#039;&#039;] Let &amp;lt;math&amp;gt;S_n = a + \sum_{r=1}^n X_r&amp;lt;/math&amp;gt; be a simple symmetric random walk. The walk stops at the earliest time &amp;lt;math&amp;gt;T&amp;lt;/math&amp;gt; when it reaches either &amp;lt;math&amp;gt;0&amp;lt;/math&amp;gt; or &amp;lt;math&amp;gt;K&amp;lt;/math&amp;gt;, where &amp;lt;math&amp;gt;0 &amp;lt; a &amp;lt; K&amp;lt;/math&amp;gt;. Show that &amp;lt;math&amp;gt;&lt;br /&gt;
M_n = \sum_{r=0}^n S_r - \tfrac{1}{3} S_n^3&lt;br /&gt;
&amp;lt;/math&amp;gt; is a martingale, and deduce that &amp;lt;math&amp;gt;&lt;br /&gt;
\mathbb{E}\left( \sum_{r=0}^{T} S_r \right)&lt;br /&gt;
= \tfrac{1}{3} (K^2 - a^2) a + a.&lt;br /&gt;
&amp;lt;/math&amp;gt;&lt;br /&gt;
&lt;br /&gt;
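One way to sanity-check the stated value of the expected sum is first-step analysis: the candidate function h(x) = x + x(K^2 - x^2)/3 should satisfy h(x) = x + (h(x-1) + h(x+1))/2 for every interior state x, with boundary values h(0) = 0 and h(K) = K. The following sketch verifies this in exact arithmetic for small K (the range of K is an arbitrary test choice):

```python
from fractions import Fraction

# Candidate value h(x) = x + x*(K**2 - x**2)/3 for the expected total
# E[sum_{r=0}^T S_r] when the walk starts at S_0 = x.
def h(x, K):
    return Fraction(x) + Fraction(x * (K * K - x * x), 3)

# First-step analysis: for interior x (strictly between 0 and K) the value
# must satisfy h(x) = x + (h(x-1) + h(x+1))/2, with h(0) = 0 and h(K) = K.
for K in range(2, 20):
    assert h(0, K) == 0 and h(K, K) == K
    for x in range(1, K):
        assert h(x, K) == x + (h(x - 1, K) + h(x + 1, K)) / 2
print("first-step equations verified")
```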
* [&#039;&#039;&#039;Random walk on a graph&#039;&#039;&#039;] A particle performs a random walk on the vertex set of a connected graph &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;, which for simplicity we assume to have neither loops nor multiple edges. At each stage it moves to a neighbor of its current position, each such neighbor being chosen with equal probability. If &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; has &amp;lt;math&amp;gt;\eta&amp;lt;\infty&amp;lt;/math&amp;gt; edges, show that the stationary distribution is given by &amp;lt;math&amp;gt;\pi(v) = d_v/(2\eta)&amp;lt;/math&amp;gt;, where &amp;lt;math&amp;gt;d_v&amp;lt;/math&amp;gt; is the degree of vertex &amp;lt;math&amp;gt;v&amp;lt;/math&amp;gt;.&lt;br /&gt;
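The stationary distribution pi(v) = d_v/(2 eta) can be verified directly on a small example by checking that pi P = pi; the 4-vertex graph below is an arbitrary illustrative choice:

```python
import math

# A small connected simple graph (illustrative choice): a path 0-1-2-3
# plus the chord 1-3. Degrees: d_0 = 1, d_1 = 3, d_2 = 2, d_3 = 2.
edges = [(0, 1), (1, 2), (2, 3), (1, 3)]
nbrs = {}
for u, v in edges:
    nbrs.setdefault(u, []).append(v)
    nbrs.setdefault(v, []).append(u)

eta = len(edges)
pi = {v: len(nbrs[v]) / (2 * eta) for v in nbrs}

# Stationarity: (pi P)(v) equals the sum over neighbors u of pi(u)/d_u,
# which is the sum over u adjacent to v of (d_u/(2 eta)) * (1/d_u) = d_v/(2 eta).
piP = {v: sum(pi[u] / len(nbrs[u]) for u in nbrs[v]) for v in nbrs}
for v in nbrs:
    assert math.isclose(piP[v], pi[v], abs_tol=1e-12)
print(pi)
```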
* [&#039;&#039;&#039;Reversibility versus periodicity&#039;&#039;&#039;] Can a reversible chain be periodic?&lt;br /&gt;
* [&#039;&#039;&#039;Metropolis–Hastings algorithm&#039;&#039;&#039;] To sample a state, for each state &amp;lt;math&amp;gt;x&amp;lt;/math&amp;gt;, the Glauber dynamics uniformly chooses a state among the adjacent states of &amp;lt;math&amp;gt;x&amp;lt;/math&amp;gt; together with state &amp;lt;math&amp;gt;x&amp;lt;/math&amp;gt; itself at random in each step, and moves to the chosen state. The Metropolis-Hastings algorithm generalizes the idea of Glauber dynamics. Let us assume that we have designed an irreducible state space for our Markov chain; now we want to construct a Markov chain on this state space with a stationary distribution &amp;lt;math&amp;gt;\pi_x = b(x)/B&amp;lt;/math&amp;gt;, where for all &amp;lt;math&amp;gt;x \in \Omega&amp;lt;/math&amp;gt; we have &amp;lt;math&amp;gt;b(x) &amp;gt; 0&amp;lt;/math&amp;gt; and such that &amp;lt;math&amp;gt;B =\sum_{x\in\Omega} b(x)&amp;lt;/math&amp;gt; is finite. &lt;br /&gt;
# For a finite state space &amp;lt;math&amp;gt;\Omega&amp;lt;/math&amp;gt; and neighborhood structure &amp;lt;math&amp;gt;\{N(x) \mid x \in\Omega\}&amp;lt;/math&amp;gt;, let &amp;lt;math&amp;gt;N = \max_{x\in\Omega} |N(x)|&amp;lt;/math&amp;gt;. Let &amp;lt;math&amp;gt;M&amp;lt;/math&amp;gt; be any number such that &amp;lt;math&amp;gt;M \ge N&amp;lt;/math&amp;gt;. For all &amp;lt;math&amp;gt;x \in \Omega&amp;lt;/math&amp;gt;, let &amp;lt;math&amp;gt;\pi_x &amp;gt; 0&amp;lt;/math&amp;gt; be the desired probability of state &amp;lt;math&amp;gt;x&amp;lt;/math&amp;gt; in the stationary distribution. Consider a Markov chain where &amp;lt;math&amp;gt;P_{x,y} =&lt;br /&gt;
\begin{cases}(1/M) \min(1, \pi_y/\pi_x ) &amp;amp;\text{if $x \ne y$ and $y \in N(x)$},\\&lt;br /&gt;
0 &amp;amp;\text{if $x \ne y$ and $y \notin N(x)$},\\&lt;br /&gt;
1 - \sum_{y\ne x} P_{x,y} &amp;amp;\text{if $x = y$}\end{cases}&amp;lt;/math&amp;gt;. Assuming this chain is irreducible and aperiodic, verify that the stationary distribution is given by the probabilities &amp;lt;math&amp;gt;\pi_x&amp;lt;/math&amp;gt;. (Hint: Show time-reversibility.)&lt;br /&gt;
# Let &amp;lt;math&amp;gt;S = \sum_{i=1}^\infty i^{-2} = \pi^2/6&amp;lt;/math&amp;gt;. Design a Markov chain based on the Metropolis-Hastings algorithm on the positive integers such that, in the stationary distribution, &amp;lt;math&amp;gt;\pi_i = 1/(S\cdot i^2)&amp;lt;/math&amp;gt;. The neighbors of any integer &amp;lt;math&amp;gt;i &amp;gt; 1&amp;lt;/math&amp;gt; for your chain should be only &amp;lt;math&amp;gt;i - 1&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;i + 1&amp;lt;/math&amp;gt;, and the only neighbor of &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt; should be the integer &amp;lt;math&amp;gt;2&amp;lt;/math&amp;gt;.&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=Main_Page&amp;diff=13422</id>
		<title>Main Page</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=Main_Page&amp;diff=13422"/>
		<updated>2025-12-03T23:32:26Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* Home Pages for Courses and Seminars */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This is a course/seminar wiki run by the [http://tcs.nju.edu.cn theory group] in the Department of Computer Science and Technology at Nanjing University.&lt;br /&gt;
&lt;br /&gt;
== Home Pages for Courses and Seminars==&lt;br /&gt;
;Current semester&lt;br /&gt;
* [[高级算法 (Fall 2025)|高级算法 Advanced Algorithms (Fall 2025)]]&lt;br /&gt;
&lt;br /&gt;
* [[数据科学基础 (Fall 2025)|数据科学基础 Foundations of Data Science (Fall 2025)]]&lt;br /&gt;
&lt;br /&gt;
;Past courses&lt;br /&gt;
&lt;br /&gt;
* Advanced Algorithms: [[高级算法 (Spring 2025)|Spring 2025(Suzhou)]], [[高级算法 (Fall 2024)|Fall 2024]], [[高级算法 (Fall 2023)|Fall 2023]], [[高级算法 (Fall 2022)|Fall 2022]], [[高级算法 (Fall 2021)|Fall 2021]], [[高级算法 (Fall 2020)|Fall 2020]], [[高级算法 (Fall 2019)|Fall 2019]], [[高级算法 (Fall 2018)|Fall 2018]], [[高级算法 (Fall 2017)|Fall 2017]], [[随机算法 \ 高级算法 (Fall 2016)|Fall 2016]].&lt;br /&gt;
&lt;br /&gt;
*Algorithm Design and Analysis: [https://tcs.nju.edu.cn/shili/courses/2024spring-algo/ Spring 2024]&lt;br /&gt;
&lt;br /&gt;
* Combinatorics: [[组合数学 (Spring 2025)|Spring 2025]], [[组合数学 (Spring 2024)|Spring 2024]], [[组合数学 (Spring 2023)|Spring 2023]], [[组合数学 (Fall 2019)|Fall 2019]], [[组合数学 (Fall 2017)|Fall 2017]], [[组合数学 (Fall 2016)|Fall 2016]], [[组合数学 (Fall 2015)|Fall 2015]], [[组合数学 (Spring 2014)|Spring 2014]], [[组合数学 (Spring 2013)|Spring 2013]], [[组合数学 (Fall 2011)|Fall 2011]], [[Combinatorics (Fall 2010)|Fall 2010]].&lt;br /&gt;
&lt;br /&gt;
* Computational Complexity: [[计算复杂性 (Spring 2025)|Spring 2025]], [[计算复杂性 (Spring 2024)|Spring 2024]], [[计算复杂性 (Spring 2023)|Spring 2023]], [[计算复杂性 (Fall 2019)|Fall 2019]], [[计算复杂性 (Fall 2018)|Fall 2018]].&lt;br /&gt;
&lt;br /&gt;
* Foundations of Data Science: [[数据科学基础 (Fall 2024)|Fall 2024]]&lt;br /&gt;
&lt;br /&gt;
* Numerical Method: [[计算方法 Numerical method (Spring 2025)|Spring 2025]], [[计算方法 Numerical method (Spring 2024)|Spring 2024]], [[计算方法 Numerical method (Spring 2023)|Spring 2023]], [https://liuexp.github.io/numerical.html Spring 2022].&lt;br /&gt;
&lt;br /&gt;
* Probability Theory: [[概率论与数理统计 (Spring 2025)|Spring 2025]], [[概率论与数理统计 (Spring 2024)|Spring 2024]], [[概率论与数理统计 (Spring 2023)|Spring 2023]].&lt;br /&gt;
&lt;br /&gt;
* Quantum Computation: [[量子计算 (Spring 2022)|Spring 2022]], [[量子计算 (Spring 2021)|Spring 2021]], [[量子计算 (Fall 2019)|Fall 2019]].&lt;br /&gt;
&lt;br /&gt;
* Randomized Algorithms:  [[随机算法 (Fall 2015)|Fall 2015]], [[随机算法 (Spring 2014)|Spring 2014]], [[随机算法 (Spring 2013)|Spring 2013]], [[随机算法 (Fall 2011)|Fall 2011]], [[Randomized Algorithms (Spring 2010)|Spring 2010]].&lt;br /&gt;
&lt;br /&gt;
;Past seminars, workshops and summer schools&lt;br /&gt;
*计算理论之美暑期学校: [[计算理论之美 (Summer 2025)|2025]], [[计算理论之美 (Summer 2024)|2024]], [[计算理论之美 (Summer 2023)|2023]], [[计算理论之美 (Summer 2021)|2021]]&lt;br /&gt;
*[[Theory Seminar|理论计算机科学讨论班]]&lt;br /&gt;
*[[Study Group|理论计算机科学学习小组]]&lt;br /&gt;
*[[TCSPhD2020| 理论计算机科学优秀博士生论坛2020]]&lt;br /&gt;
*[[Quantum|量子算法与物理实现研讨会]]&lt;br /&gt;
*Theory Day: [[Theory@Suzhou 2025 | 2025 (Suzhou)]],  [[Theory@Nanjing 2019|2019]], [[Theory@Nanjing 2018|2018]], [[Theory@Nanjing 2017|2017]]&lt;br /&gt;
*[[\Delta Seminar on Logic, Philosophy, and Computer Science|Δ Seminar on Logic, Philosophy, and Computer Science]]&lt;br /&gt;
*[[近似算法讨论班 (Fall 2011)|近似算法 Approximation Algorithms, Fall 2011.]]&lt;br /&gt;
&lt;br /&gt;
; 其它链接&lt;br /&gt;
* [[General Circulation(Fall 2025)|大气环流 General Circulation of the Atmosphere, Fall 2025]]&lt;br /&gt;
* [[General Circulation(Fall 2024)|大气环流 General Circulation of the Atmosphere, Fall 2024]]&lt;br /&gt;
&lt;br /&gt;
* [[概率论 (Summer 2014)| 概率与计算 (上海交大 Summer 2014)]]&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=Theory@Suzhou_2025&amp;diff=13417</id>
		<title>Theory@Suzhou 2025</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=Theory@Suzhou_2025&amp;diff=13417"/>
		<updated>2025-11-30T02:27:34Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* Program */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:2025 SuZhou Theory Day poster.png|200px|thumb|活动海报]]&lt;br /&gt;
==General Information ==&lt;br /&gt;
[[File:苏教楼D.png|thumb|苏教楼D在图中红星处]]&lt;br /&gt;
*&#039;&#039;&#039;&amp;lt;font size=4&amp;gt;Sunday, Nov 30, 2025: 09:00 -- 18:00.&amp;lt;/font&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
* &amp;lt;font size=4&amp;gt;&#039;&#039;&#039;Location&#039;&#039;&#039;: 南京大学苏州校区&amp;lt;/font&amp;gt;&lt;br /&gt;
* &amp;lt;font size=4&amp;gt;&#039;&#039;&#039;Venue&#039;&#039;&#039;: 苏教楼D202&amp;lt;/font&amp;gt;&lt;br /&gt;
[https://zcc.nju.edu.cn/DFS//file/2024/09/20/202409201037042506uv3mq.pdf 苏州校区地图]&lt;br /&gt;
&lt;br /&gt;
==Announcement==&lt;br /&gt;
&amp;lt;font size=4 color=red&amp;gt;&#039;&#039;&#039;因学校政策调整，需要在公众号上登记入校。&#039;&#039;&#039;&amp;lt;/font&amp;gt;&lt;br /&gt;
*关注微信公众号“南京大学信息门户”，点击门户首页左下角“i校园”-“访客通行”&lt;br /&gt;
*审核人单位：智能软件与工程学院。审核人姓名：石会&lt;br /&gt;
*如开车进校，请将车牌号填写到“随行车辆车牌”处。&lt;br /&gt;
&lt;br /&gt;
==Speakers (in alphabetic order)==&lt;br /&gt;
* [http://staff.ustc.edu.cn/~xuechen1989/ 陈雪]（中国科学技术大学）&lt;br /&gt;
* [https://zengfenghuang.github.io/ 黄增峰]（复旦大学）&lt;br /&gt;
* [https://www.shaofengjiang.cn/ 姜少峰]（北京大学）&lt;br /&gt;
* [https://chaoxu.prof/ 许超]（电子科技大学）&lt;br /&gt;
* [https://chihaozhang.com/ 张驰豪]（上海交通大学）&lt;br /&gt;
* [https://scholar.google.com/citations?user=TydhZfgAAAAJ  张瀚文]（哥本哈根大学）&lt;br /&gt;
* [https://zhangty12.github.io/ 张天翼]（南京大学）&lt;br /&gt;
&lt;br /&gt;
== Join us==&lt;br /&gt;
&#039;&#039;&#039;不需注册&#039;&#039;&#039;。&lt;br /&gt;
本次活动涵盖近似算法、图算法、计算几何、理论机器学习、概率与采样算法、流与分布式算法在内的多个主题，欢迎所有对理论计算机科学感兴趣的同学和老师前来参加。&amp;lt;br/&amp;gt;&lt;br /&gt;
请 [https://docs.qq.com/form/page/DS0JxdW5yZHZPYWtF &#039;&#039;&#039;简单填写问卷&#039;&#039;&#039;] 用于统计参会人数，以便准备茶歇的食物和调整报告厅。&lt;br /&gt;
&lt;br /&gt;
== Program ==&lt;br /&gt;
:{|border=&amp;quot;2&amp;quot; width=&amp;quot;100%&amp;quot; cellspacing=&amp;quot;4&amp;quot; cellpadding=&amp;quot;3&amp;quot; rules=&amp;quot;all&amp;quot; style=&amp;quot;margin:1em 1em 1em 0; border:solid 1px #AAAAAA; border-collapse:collapse;empty-cells:show;&amp;quot; &lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#A7C1F2&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Workshop Program&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;width: 140px;&amp;quot; align=&amp;quot;center&amp;quot;|09:00 - 09:50&lt;br /&gt;
|style=&amp;quot;width: 180px;&amp;quot; align=&amp;quot;center&amp;quot;|张天翼&amp;lt;br/&amp;gt;&lt;br /&gt;
南京大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Approximate Light Spanners in Planar Graphs&amp;lt;/font&amp;gt;&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: Althöfer 等人（DCG 1993）提出了贪心生成子图，并证明了对于任意带权平面图 &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;，其贪心 &amp;lt;math&amp;gt;(1+\epsilon)&amp;lt;/math&amp;gt;-生成子图的总权重至多为 &amp;lt;math&amp;gt;\left(1 + \frac{2}{\epsilon}\right) \cdot w(\mathrm{MST}(G))&amp;lt;/math&amp;gt;，其中 &amp;lt;math&amp;gt;w(\mathrm{MST}(G))&amp;lt;/math&amp;gt; 表示图 &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; 的最小生成树 &amp;lt;math&amp;gt;\mathrm{MST}(G)&amp;lt;/math&amp;gt; 的权重。该界在存在性意义上是紧的：存在某些平面图 &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;，使得其任意 &amp;lt;math&amp;gt;(1+\epsilon)&amp;lt;/math&amp;gt;-生成子图的权重至少为 &amp;lt;math&amp;gt;\left(1 + \frac{2}{\epsilon}\right) \cdot w(\mathrm{MST}(G))&amp;lt;/math&amp;gt;。&amp;lt;br/&amp;gt;然而，从近似算法的角度来看，即使是双标准（bicriteria）近似，贪心生成子图 的权重近似因子也基本上达到了上述存在性下界：存在某些平面图 &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;，使得对于任意满足 &amp;lt;math&amp;gt;1 \leq x = O(\epsilon^{-1/2})&amp;lt;/math&amp;gt; 的参数，其贪心 &amp;lt;math&amp;gt;(1 + x\epsilon)&amp;lt;/math&amp;gt;-生成子图 的权重为 &amp;lt;math&amp;gt;\Omega\left(\frac{1}{\epsilon \cdot x^2} \cdot w(G_{\mathrm{opt},\epsilon})\right)&amp;lt;/math&amp;gt;，其中 &amp;lt;math&amp;gt;G_{\mathrm{opt},\epsilon}&amp;lt;/math&amp;gt; 是图 &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; 的权重最小的 &amp;lt;math&amp;gt;(1+\epsilon)&amp;lt;/math&amp;gt;-生成子图。&amp;lt;br/&amp;gt;尽管在过去三十年中，关于生成子图的近似算法的研究层出不穷，但目前仍不存在任何（即使是双标准）近似算法，能够在带权平面图上构造出优于上述存在性下界的轻量生成子图。&amp;lt;br/&amp;gt;作为本文的主要贡献，我们提出了一种在平面图上的动态规划算法，可在任意带权平面图 &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; 中构造一个 &amp;lt;math&amp;gt;\left(1 + \epsilon \cdot 2^{O(\log^* 1/\epsilon)}\right)&amp;lt;/math&amp;gt;-生成子图，其总权重为 &amp;lt;math&amp;gt;O(1) \cdot w(G_{\mathrm{opt},\epsilon})&amp;lt;/math&amp;gt;。此外，我们也证明了精确求解最小平面生成子图是NP难的。&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;background: silver;&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Coffee Break (09:50 – 10:10)&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|10:10 – 11:00&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|姜少峰&amp;lt;br/&amp;gt;&lt;br /&gt;
北京大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Local Search for Clustering in Almost-linear Time&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We propose the first local search algorithm for Euclidean clustering that attains an &amp;lt;math&amp;gt;O(1)&amp;lt;/math&amp;gt;-approximation in almost-linear time. Specifically, for Euclidean k-Means, our algorithm achieves an &amp;lt;math&amp;gt;O(c)&amp;lt;/math&amp;gt;-approximation in &amp;lt;math&amp;gt;\tilde{O}(n^{1 + 1 / c})&amp;lt;/math&amp;gt; time, for any constant &amp;lt;math&amp;gt;c \ge 1&amp;lt;/math&amp;gt;, maintaining the same running time as the previous (non-local-search-based) approach [la Tour and Saulpic, arXiv&#039;2407.11217] while improving the approximation factor from &amp;lt;math&amp;gt;O(c^{6})&amp;lt;/math&amp;gt; to &amp;lt;math&amp;gt;O(c)&amp;lt;/math&amp;gt;. The algorithm generalizes to any metric space with sparse spanners, delivering efficient constant approximation in &amp;lt;math&amp;gt;\ell_p&amp;lt;/math&amp;gt; metrics, doubling metrics, Jaccard metrics, etc.&amp;lt;br/&amp;gt; This generality derives from our main technical contribution: a local search algorithm on general graphs that obtains an &amp;lt;math&amp;gt;O(1)&amp;lt;/math&amp;gt;-approximation in almost-linear time. We establish this through a new &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;-swap local search framework featuring a novel swap selection rule. At a high level, this rule “scores” every possible swap, based on both its modification to the clustering and its improvement to the clustering objective, and then selects those high-scoring swaps. To implement this, we design a new data structure for maintaining approximate nearest neighbors with amortized guarantees tailored to our framework.&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|11:05 – 11:55&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|陈雪&amp;lt;br/&amp;gt;&lt;br /&gt;
中国科学技术大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Algorithms for Sparse LPN and LSPN Against Low-noise&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We consider sparse variants of the classical Learning Parities with Noise (LPN) problem. Our main contribution is a new algorithmic framework that provides learning algorithms against low noise for both the Learning Sparse Parities with Noise (LSPN) problem and the sparse LPN problem. Unlike previous approaches for LSPN and sparse LPN, this framework has a simple structure and runs in polynomial space. Let &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; be the dimension, &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; the sparsity, and &amp;lt;math&amp;gt;\eta&amp;lt;/math&amp;gt; the noise rate.&amp;lt;br/&amp;gt;As a fundamental problem in computational learning theory, LSPN assumes the hidden parity is &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-sparse. While a simple enumeration algorithm takes &amp;lt;math&amp;gt;{n \choose k}=O((n/k)^k)&amp;lt;/math&amp;gt; time, previously known results still need &amp;lt;math&amp;gt;{n \choose k/2} = \Omega((n/k)^{k/2})&amp;lt;/math&amp;gt; time for any noise rate &amp;lt;math&amp;gt;\eta&amp;lt;/math&amp;gt;. Our framework provides an LSPN algorithm that runs in time &amp;lt;math&amp;gt;O((\eta \cdot n/k)^k)&amp;lt;/math&amp;gt; for any noise rate &amp;lt;math&amp;gt;\eta&amp;lt;/math&amp;gt;, which improves the state of the art for LSPN whenever &amp;lt;math&amp;gt;\eta \in ( k/n,\sqrt{k/n})&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;The sparse LPN problem is closely related to the classical problem of refuting random &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-CSPs and has been widely used in cryptography as a hardness assumption. Unlike standard LPN, it samples random &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-sparse vectors. 
Because the number of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-sparse vectors is &amp;lt;math&amp;gt;{n \choose k} &amp;lt; n^k&amp;lt;/math&amp;gt;, sparse LPN admits polynomial-time learning algorithms when &amp;lt;math&amp;gt;m&amp;gt;n^{k/2}&amp;lt;/math&amp;gt;. However, much less is known about learning algorithms for constant &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;, say &amp;lt;math&amp;gt;3&amp;lt;/math&amp;gt;, with &amp;lt;math&amp;gt;m&amp;lt;n^{k/2}&amp;lt;/math&amp;gt; samples, apart from the Gaussian elimination algorithm, which runs in time &amp;lt;math&amp;gt;e^{\eta n}&amp;lt;/math&amp;gt;. Our framework provides a learning algorithm running in &amp;lt;math&amp;gt;e^{O(\eta \cdot n^{\frac{\delta+1}{2}})}&amp;lt;/math&amp;gt; time given &amp;lt;math&amp;gt;\delta \in (0,1)&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;m \approx n^{1+(1-\delta)\cdot \frac{k-1}{2}}&amp;lt;/math&amp;gt; samples. This improves upon previous learning algorithms. For example, in the classical setting of &amp;lt;math&amp;gt;k=3&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;m=n^{1.4}&amp;lt;/math&amp;gt;, our algorithm is faster than previous approaches for any &amp;lt;math&amp;gt;\eta&amp;lt;n^{-0.7}&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;Based on joint work with Wenxuan Shu (USTC) and Zhaienhe Zhou (USTC).&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;background: silver;&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Lunch Break  (12:00 - 14:00)&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|14:00 – 14:50&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|张瀚文&amp;lt;br/&amp;gt;&lt;br /&gt;
哥本哈根大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Minimum Star Partitions of Simple Polygons in Polynomial Time &amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We devise a polynomial-time algorithm for partitioning a simple polygon P into a minimum number of star-shaped polygons. Whether such an algorithm exists has been an open question for more than forty years, raised repeatedly, including in O’Rourke’s book &#039;&#039;Art Gallery Theorems and Algorithms&#039;&#039;. Previously known algorithms handle only special cases, such as monotone rectilinear polygons or the setting where Steiner points are disallowed, and fall far short of the general case. The covering variant, which allows the star-shaped pieces to overlap (the famous art gallery problem), was shown in 2018 to be ∃ℝ-complete and is therefore likely harder than NP-complete problems. Beyond its theoretical value, star partitioning also has practical applications such as CNC pocket milling, robot motion planning, and shape parameterization.&amp;lt;br/&amp;gt;In this talk, I will focus on the intuition, thinking, and discoveries behind our solution, for an immersive account of our entire journey through this research.&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|14:55 – 15:45&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|许超&amp;lt;br/&amp;gt;&lt;br /&gt;
电子科技大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: An Optimal Algorithm for the Stacker Crane Problem on Fixed Topologies&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: The Stacker Crane Problem (SCP) is a variant of the Traveling Salesman Problem. In SCP, pairs of pickup and delivery points are designated on a graph, and a crane must visit these points to move objects from each pickup location to its respective delivery point. The goal is to minimize the total distance traveled. SCP is known to be NP-hard, even on trees. The only positive results, in terms of polynomial-time solvability, apply to graphs that are topologically equivalent to a path or a cycle. We propose an algorithm that is optimal for each fixed topology, running in near-linear time. This is achieved by demonstrating that the problem is fixed-parameter tractable (FPT) when parameterized by both the cycle rank and the number of branch vertices.&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;background: silver;&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Coffee Break (15:45 – 16:15)&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|16:15 – 17:05&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|黄增峰&amp;lt;br/&amp;gt;&lt;br /&gt;
复旦大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Simple and Optimal Algorithms for Heavy Hitters and Frequency Moments in Distributed Models&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We consider the problems of distributed heavy hitters and frequency moments in both the coordinator model and the distributed tracking model. We present simple and optimal algorithms for heavy hitters and frequency moments estimation in these distributed models. For &amp;lt;math&amp;gt;\ell_p&amp;lt;/math&amp;gt; heavy hitters in the coordinator model, our algorithm requires only one round and uses &amp;lt;math&amp;gt;\tilde{O}(k^{p-1}/\epsilon^p)&amp;lt;/math&amp;gt; bits of communication. For &amp;lt;math&amp;gt;p &amp;gt; 2&amp;lt;/math&amp;gt;, this is the first near-optimal result. By combining our algorithm with the standard recursive sketching technique, we obtain a near-optimal two-round algorithm for &amp;lt;math&amp;gt;F_p&amp;lt;/math&amp;gt; in the coordinator model, matching a significant result from recent work by Esfandiari et al. (STOC 2024). Our algorithm and analysis are much simpler and have better cost with respect to logarithmic factors. Due to the simplicity of our heavy hitter algorithms, we manage to adapt them to the distributed tracking model with only a &amp;lt;math&amp;gt;\mathrm{polylog}(n)&amp;lt;/math&amp;gt; increase in communication. This presents the first near-optimal algorithm for heavy hitters. By applying the recursive sketching technique, we also provide the first near-optimal algorithm for &amp;lt;math&amp;gt;F_p&amp;lt;/math&amp;gt; in the distributed tracking model for all &amp;lt;math&amp;gt;p \geq 2&amp;lt;/math&amp;gt;. Even for &amp;lt;math&amp;gt;F_2&amp;lt;/math&amp;gt;, our result improves upon the bounds established by Cormode, Muthukrishnan, and Yi (SODA 2008) and Woodruff and Zhang (STOC 2012), nearly matching the existing lower bound for the first time.&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|17:10 – 18:00&lt;br /&gt;
&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|张驰豪&amp;lt;br/&amp;gt;&lt;br /&gt;
上海交通大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Sampling from non-log-concave distributions&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: Sampling from a d-dimensional distribution &amp;lt;math&amp;gt;\mu&amp;lt;/math&amp;gt; with density &amp;lt;math&amp;gt;p_{\mu}(x) \propto e^{-V(x)}&amp;lt;/math&amp;gt; is a central problem in many areas, including theoretical computer science, statistical physics, and machine learning. It is well-known that when the potential function &amp;lt;math&amp;gt;V(x)&amp;lt;/math&amp;gt; is &#039;&#039;convex&#039;&#039; (or equivalently, when &amp;lt;math&amp;gt;p_{\mu}&amp;lt;/math&amp;gt; is &#039;&#039;log-concave&#039;&#039;), or more generally, when &amp;lt;math&amp;gt;\mu&amp;lt;/math&amp;gt; satisfies good isoperimetric inequalities, efficient sampling algorithms exist in various computational models. A common belief is that the sampling task becomes more difficult when &amp;lt;math&amp;gt;V(x)&amp;lt;/math&amp;gt; is &#039;&#039;non-convex&#039;&#039;. On the other hand, data-based algorithms (e.g., denoising diffusion probabilistic models) developed in the machine learning community are very successful in practice when dealing with highly non-log-concave distributions (such as in image generation), and provide new insights into designing efficient sampling algorithms. &amp;lt;br/&amp;gt; In this talk, we will start with a general tight (exponential) sampling complexity bound for any &#039;&#039;non-log-concave&#039;&#039; distribution &amp;lt;math&amp;gt;\mu&amp;lt;/math&amp;gt; satisfying mild regularity conditions. Then, we will show how a common strengthening of these regularity conditions leads to an efficient (polynomial) sampling algorithm. Finally, we will discuss future directions for understanding the complexity of sampling from general distributions.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Getting to The Campus==&lt;br /&gt;
*Campus entry: &amp;lt;strike&amp;gt;Off-campus guests may register at the campus gate by giving security the name of the event.&amp;lt;/strike&amp;gt; &amp;lt;font color=red&amp;gt;[[Media:智软院访客预约流程.pdf|&#039;&#039;&#039;Due to a change in university policy, visitors must register for campus entry via the WeChat official account.&#039;&#039;&#039;]]&amp;lt;/font&amp;gt;&lt;br /&gt;
**Follow the WeChat official account “南京大学信息门户”, then tap “i校园”-“访客通行” at the bottom left of the portal home page&lt;br /&gt;
**Reviewing unit: 智能软件与工程学院 (School of Intelligent Software and Engineering); reviewer name: 石会&lt;br /&gt;
**If you are driving onto campus, enter your license plate number in the “随行车辆车牌” field.&lt;br /&gt;
*High-speed rail (G/D trains)&lt;br /&gt;
**苏州站 (Suzhou Station): about 30 minutes to campus by taxi (outside rush hours), roughly ¥30. Alternatively, take express bus line 3, or the metro with a transfer to the tram, about 2 hours in total.&lt;br /&gt;
**苏州新区站 (Suzhou New District Station): about 25 minutes by taxi, roughly ¥25. Tram line 2 is also an option, about 1 hour.&lt;br /&gt;
*By air&lt;br /&gt;
**无锡硕放机场 (WUX): about 30 minutes to campus by taxi; since the trip crosses city limits, the driver may add a return/empty-trip surcharge, for a total of about ¥80. A ride-share, if available, is cheaper. Public transport takes about 2 hours.&lt;br /&gt;
**上海虹桥机场 (SHA): we recommend transferring at Hongqiao Railway Station to a high-speed train to Suzhou Station or Suzhou New District Station. Note that the last high-speed train from Shanghai Hongqiao to Suzhou usually departs at 21:42. Taking a taxi directly from Hongqiao to Suzhou is not recommended (expensive), nor are ride-shares, which are usually unlicensed taxis.&lt;br /&gt;
&lt;br /&gt;
== Accommodation Suggestion ==&lt;br /&gt;
￥￥￥ 苏州科技城源宿酒店&amp;lt;br/&amp;gt;&lt;br /&gt;
￥￥ 南大国际学术交流中心 (on-campus hotel, good value for money)、苏州科技城万达美华酒店、全季苏州科技城酒店、苏州高新区科技城亚朵酒店&amp;lt;br/&amp;gt;&lt;br /&gt;
￥ 格林豪泰苏州市科技城商务酒店、宜必思尚品苏州科技城酒店、如家精选-苏州乐园高新区科技城店&lt;br /&gt;
&lt;br /&gt;
== Lunch &amp;amp; Supper ==&lt;br /&gt;
[[File:苏州校区食堂（2025）.png|thumb|The four dining halls on the Suzhou campus are marked by red stars]]&lt;br /&gt;
* The Suzhou campus currently has four dining halls: the 科创大厦 dining hall and dining halls 16, 17, and 18. Vegetable dishes cost ¥2-3 and meat dishes ¥4-8, payable directly via Alipay or WeChat Pay. The 国际学术交流中心 (International Academic Exchange Center) also offers pricier food.&lt;br /&gt;
* Nearby commercial areas include 东渚镇, 文体中心, 丰茂里, 时尚水岸星悦荟, and 星悦里.&lt;br /&gt;
* Food delivery is also an option; orders are dropped off at the delivery lockers or shelves at the campus gate.&lt;br /&gt;
&lt;br /&gt;
== Getting Around ==&lt;br /&gt;
* 大阳山国家森林公园 &amp;amp; 植物园 (Dayang Mountain National Forest Park &amp;amp; Botanical Garden): forest trails plus temples; the hike takes 60–90 minutes depending on fitness; best views on clear autumn and winter days.&lt;br /&gt;
* 树山生态村 (Shushan Eco-Village): country trails, tea plantations, and farmhouse cuisine; a top pick for group dinners and walking tours.&lt;br /&gt;
* 太湖湿地/西山 (Taihu wetlands / Xishan): more convenient by car; wetland and lake scenery.&lt;br /&gt;
* Classical gardens in the old city: take a taxi to 平江路/山塘街 in the evening for the night views, or visit the Suzhou Museum or 拙政园 (Humble Administrator&#039;s Garden) during the day.&lt;br /&gt;
&lt;br /&gt;
== About Suzhou Campus ==&lt;br /&gt;
The Suzhou Campus of Nanjing University is located in Taihu Science and Technology Town in Suzhou High-tech District, at the intersection of the “Around-Taihu Science and Innovation Circle” and the “Shanghai–Nanjing Industrial Innovation Belt,” and is positioned as the university&#039;s main base for growing its new engineering disciplines. Guided by the principles of “national strategy, world-class standards, strong alliances, and matching of needs,” the campus focuses on critical “bottleneck” problems in artificial intelligence, next-generation information technology, new energy, advanced manufacturing, and life and health, strengthens its “new engineering” programs, and promotes the integration of the humanities, sciences, engineering, and medicine alongside coordinated development across government, industry, academia, and research.&lt;br /&gt;
&lt;br /&gt;
== Contact ==&lt;br /&gt;
刘明谋： lmm@nju.edu.cn&amp;lt;br/&amp;gt;&lt;br /&gt;
[[Media:2025年Suzhou Theory Day邀请函.pdf|邀请函.pdf]]&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=Theory@Suzhou_2025&amp;diff=13416</id>
		<title>Theory@Suzhou 2025</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=Theory@Suzhou_2025&amp;diff=13416"/>
		<updated>2025-11-30T02:22:34Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* Program */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:2025 SuZhou Theory Day poster.png|200px|thumb|Event poster]]&lt;br /&gt;
==General Information ==&lt;br /&gt;
[[File:苏教楼D.png|thumb|苏教楼D is marked by the red star on the map]]&lt;br /&gt;
*&#039;&#039;&#039;&amp;lt;font size=4&amp;gt;Sunday, Nov 30, 2025: 09:00 -- 18:00.&amp;lt;/font&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
* &amp;lt;font size=4&amp;gt;&#039;&#039;&#039;Location&#039;&#039;&#039;: 南京大学苏州校区 (Nanjing University Suzhou Campus)&amp;lt;/font&amp;gt;&lt;br /&gt;
* &amp;lt;font size=4&amp;gt;&#039;&#039;&#039;Venue&#039;&#039;&#039;: 苏教楼D202&amp;lt;/font&amp;gt;&lt;br /&gt;
[https://zcc.nju.edu.cn/DFS//file/2024/09/20/202409201037042506uv3mq.pdf Suzhou campus map]&lt;br /&gt;
&lt;br /&gt;
==Announcement==&lt;br /&gt;
&amp;lt;font size=4 color=red&amp;gt;&#039;&#039;&#039;Due to a change in university policy, visitors must register for campus entry via the WeChat official account.&#039;&#039;&#039;&amp;lt;/font&amp;gt;&lt;br /&gt;
*Follow the WeChat official account “南京大学信息门户”, then tap “i校园”-“访客通行” at the bottom left of the portal home page&lt;br /&gt;
*Reviewing unit: 智能软件与工程学院 (School of Intelligent Software and Engineering); reviewer name: 石会&lt;br /&gt;
*If you are driving onto campus, enter your license plate number in the “随行车辆车牌” field.&lt;br /&gt;
&lt;br /&gt;
==Speakers (in alphabetical order)==&lt;br /&gt;
* [http://staff.ustc.edu.cn/~xuechen1989/ 陈雪]（中国科学技术大学）&lt;br /&gt;
* [https://zengfenghuang.github.io/ 黄增峰]（复旦大学）&lt;br /&gt;
* [https://www.shaofengjiang.cn/ 姜少峰]（北京大学）&lt;br /&gt;
* [https://chaoxu.prof/ 许超]（电子科技大学）&lt;br /&gt;
* [https://chihaozhang.com/ 张驰豪]（上海交通大学）&lt;br /&gt;
* [https://scholar.google.com/citations?user=TydhZfgAAAAJ  张瀚文]（哥本哈根大学）&lt;br /&gt;
* [https://zhangty12.github.io/ 张天翼]（南京大学）&lt;br /&gt;
&lt;br /&gt;
== Join us==&lt;br /&gt;
&#039;&#039;&#039;No registration required.&#039;&#039;&#039;&lt;br /&gt;
This event covers topics including approximation algorithms, graph algorithms, computational geometry, theoretical machine learning, probabilistic and sampling algorithms, and streaming and distributed algorithms. All students and faculty interested in theoretical computer science are welcome to attend.&amp;lt;br/&amp;gt;&lt;br /&gt;
Please [https://docs.qq.com/form/page/DS0JxdW5yZHZPYWtF &#039;&#039;&#039;fill out this short survey&#039;&#039;&#039;] so we can estimate attendance, prepare coffee-break refreshments, and adjust the lecture hall accordingly.&lt;br /&gt;
&lt;br /&gt;
== Program ==&lt;br /&gt;
:{|border=&amp;quot;2&amp;quot; width=&amp;quot;100%&amp;quot; cellspacing=&amp;quot;4&amp;quot; cellpadding=&amp;quot;3&amp;quot; rules=&amp;quot;all&amp;quot; style=&amp;quot;margin:1em 1em 1em 0; border:solid 1px #AAAAAA; border-collapse:collapse;empty-cells:show;&amp;quot; &lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#A7C1F2&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Workshop Program&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;width: 140px;&amp;quot; align=&amp;quot;center&amp;quot;|09:00 - 09:50&lt;br /&gt;
|style=&amp;quot;width: 180px;&amp;quot; align=&amp;quot;center&amp;quot;|张天翼&amp;lt;br/&amp;gt;&lt;br /&gt;
南京大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Approximate Light Spanners in Planar Graphs&amp;lt;/font&amp;gt;&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: Althöfer et al. (DCG 1993) introduced the greedy spanner and proved that, for any weighted planar graph &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;, the greedy &amp;lt;math&amp;gt;(1+\epsilon)&amp;lt;/math&amp;gt;-spanner has total weight at most &amp;lt;math&amp;gt;\left(1 + \frac{2}{\epsilon}\right) \cdot w(\mathrm{MST}(G))&amp;lt;/math&amp;gt;, where &amp;lt;math&amp;gt;w(\mathrm{MST}(G))&amp;lt;/math&amp;gt; denotes the weight of a minimum spanning tree &amp;lt;math&amp;gt;\mathrm{MST}(G)&amp;lt;/math&amp;gt; of &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;. This bound is existentially tight: there are planar graphs &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; in which every &amp;lt;math&amp;gt;(1+\epsilon)&amp;lt;/math&amp;gt;-spanner has weight at least &amp;lt;math&amp;gt;\left(1 + \frac{2}{\epsilon}\right) \cdot w(\mathrm{MST}(G))&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;From the perspective of approximation algorithms, however, the weight approximation factor of the greedy spanner essentially matches this existential lower bound, even as a bicriteria approximation: there are planar graphs &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; such that, for any parameter &amp;lt;math&amp;gt;1 \leq x = O(\epsilon^{-1/2})&amp;lt;/math&amp;gt;, the greedy &amp;lt;math&amp;gt;(1 + x\epsilon)&amp;lt;/math&amp;gt;-spanner has weight &amp;lt;math&amp;gt;\Omega\left(\frac{1}{\epsilon \cdot x^2} \cdot w(G_{\mathrm{opt},\epsilon})\right)&amp;lt;/math&amp;gt;, where &amp;lt;math&amp;gt;G_{\mathrm{opt},\epsilon}&amp;lt;/math&amp;gt; is a minimum-weight &amp;lt;math&amp;gt;(1+\epsilon)&amp;lt;/math&amp;gt;-spanner of &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;Despite a long line of work on approximation algorithms for spanners over the past three decades, no approximation algorithm (not even a bicriteria one) has been known to construct light spanners in weighted planar graphs that beat this existential lower bound.&amp;lt;br/&amp;gt;As our main contribution, we present a dynamic-programming algorithm on planar graphs that, given any weighted planar graph &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;, constructs a &amp;lt;math&amp;gt;\left(1 + \epsilon \cdot 2^{O(\log^* 1/\epsilon)}\right)&amp;lt;/math&amp;gt;-spanner of total weight &amp;lt;math&amp;gt;O(1) \cdot w(G_{\mathrm{opt},\epsilon})&amp;lt;/math&amp;gt;. We also prove that computing a minimum-weight planar spanner exactly is NP-hard.&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;background: silver;&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Coffee Break (09:50 – 10:10)&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|10:10 – 11:00&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|姜少峰&amp;lt;br/&amp;gt;&lt;br /&gt;
北京大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Local Search for Clustering in Almost-linear Time&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We propose the first local search algorithm for Euclidean clustering that attains an &amp;lt;math&amp;gt;O(1)&amp;lt;/math&amp;gt;-approximation in almost-linear time. Specifically, for Euclidean k-Means, our algorithm achieves an &amp;lt;math&amp;gt;O(c)&amp;lt;/math&amp;gt;-approximation in &amp;lt;math&amp;gt;\tilde{O}(n^{1 + 1 / c})&amp;lt;/math&amp;gt; time, for any constant &amp;lt;math&amp;gt;c \ge 1&amp;lt;/math&amp;gt;, maintaining the same running time as the previous (non-local-search-based) approach [la Tour and Saulpic, arXiv&#039;2407.11217] while improving the approximation factor from &amp;lt;math&amp;gt;O(c^{6})&amp;lt;/math&amp;gt; to &amp;lt;math&amp;gt;O(c)&amp;lt;/math&amp;gt;. The algorithm generalizes to any metric space with sparse spanners, delivering efficient constant approximation in &amp;lt;math&amp;gt;\ell_p&amp;lt;/math&amp;gt; metrics, doubling metrics, Jaccard metrics, etc.&amp;lt;br/&amp;gt; This generality derives from our main technical contribution: a local search algorithm on general graphs that obtains an &amp;lt;math&amp;gt;O(1)&amp;lt;/math&amp;gt;-approximation in almost-linear time. We establish this through a new &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;-swap local search framework featuring a novel swap selection rule. At a high level, this rule “scores” every possible swap, based on both its modification to the clustering and its improvement to the clustering objective, and then selects those high-scoring swaps. To implement this, we design a new data structure for maintaining approximate nearest neighbors with amortized guarantees tailored to our framework.&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|11:05 – 11:55&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|陈雪&amp;lt;br/&amp;gt;&lt;br /&gt;
中国科学技术大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Algorithms for Sparse LPN and LSPN Against Low-noise&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We consider sparse variants of the classical Learning Parities with Noise (LPN) problem. Our main contribution is a new algorithmic framework that provides learning algorithms against low noise for both the Learning Sparse Parities with Noise (LSPN) problem and the sparse LPN problem. Unlike previous approaches for LSPN and sparse LPN, this framework has a simple structure and runs in polynomial space. Let &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; be the dimension, &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; the sparsity, and &amp;lt;math&amp;gt;\eta&amp;lt;/math&amp;gt; the noise rate.&amp;lt;br/&amp;gt;As a fundamental problem in computational learning theory, LSPN assumes the hidden parity is &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-sparse. While a simple enumeration algorithm takes &amp;lt;math&amp;gt;{n \choose k}=O((n/k)^k)&amp;lt;/math&amp;gt; time, previously known results still need &amp;lt;math&amp;gt;{n \choose k/2} = \Omega((n/k)^{k/2})&amp;lt;/math&amp;gt; time for any noise rate &amp;lt;math&amp;gt;\eta&amp;lt;/math&amp;gt;. Our framework provides an LSPN algorithm that runs in time &amp;lt;math&amp;gt;O((\eta \cdot n/k)^k)&amp;lt;/math&amp;gt; for any noise rate &amp;lt;math&amp;gt;\eta&amp;lt;/math&amp;gt;, which improves the state of the art for LSPN whenever &amp;lt;math&amp;gt;\eta \in ( k/n,\sqrt{k/n})&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;The sparse LPN problem is closely related to the classical problem of refuting random &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-CSPs and has been widely used in cryptography as a hardness assumption. Unlike standard LPN, it samples random &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-sparse vectors. 
Because the number of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-sparse vectors is &amp;lt;math&amp;gt;{n \choose k} &amp;lt; n^k&amp;lt;/math&amp;gt;, sparse LPN admits polynomial-time learning algorithms when &amp;lt;math&amp;gt;m&amp;gt;n^{k/2}&amp;lt;/math&amp;gt;. However, much less is known about learning algorithms for constant &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;, say &amp;lt;math&amp;gt;3&amp;lt;/math&amp;gt;, with &amp;lt;math&amp;gt;m&amp;lt;n^{k/2}&amp;lt;/math&amp;gt; samples, apart from the Gaussian elimination algorithm, which runs in time &amp;lt;math&amp;gt;e^{\eta n}&amp;lt;/math&amp;gt;. Our framework provides a learning algorithm running in &amp;lt;math&amp;gt;e^{O(\eta \cdot n^{\frac{\delta+1}{2}})}&amp;lt;/math&amp;gt; time given &amp;lt;math&amp;gt;\delta \in (0,1)&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;m \approx n^{1+(1-\delta)\cdot \frac{k-1}{2}}&amp;lt;/math&amp;gt; samples. This improves upon previous learning algorithms. For example, in the classical setting of &amp;lt;math&amp;gt;k=3&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;m=n^{1.4}&amp;lt;/math&amp;gt;, our algorithm is faster than previous approaches for any &amp;lt;math&amp;gt;\eta&amp;lt;n^{-0.7}&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;Based on joint work with Wenxuan Shu (USTC) and Zhaienhe Zhou (USTC).&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;background: silver;&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Lunch Break  (12:00 - 14:00)&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|14:00 – 14:50&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|张瀚文&amp;lt;br/&amp;gt;&lt;br /&gt;
哥本哈根大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Minimum Star Partitions of Simple Polygons in Polynomial Time &amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We devise a polynomial-time algorithm for partitioning a simple polygon P into a minimum number of star-shaped polygons. Whether such an algorithm exists has been an open question for more than forty years, raised repeatedly, including in O’Rourke’s book &#039;&#039;Art Gallery Theorems and Algorithms&#039;&#039;. Previously known algorithms handle only special cases, such as monotone rectilinear polygons or the setting where Steiner points are disallowed, and fall far short of the general case. The covering variant, which allows the star-shaped pieces to overlap (the famous art gallery problem), was shown in 2018 to be ∃ℝ-complete and is therefore likely harder than NP-complete problems. Beyond its theoretical value, star partitioning also has practical applications such as CNC pocket milling, robot motion planning, and shape parameterization.&amp;lt;br/&amp;gt;In this talk, I will focus on the intuition, thinking, and discoveries behind our solution, for an immersive account of our entire journey through this research.&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|14:55 – 15:45&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|许超&amp;lt;br/&amp;gt;&lt;br /&gt;
电子科技大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: An Optimal Algorithm for the Stacker Crane Problem on Fixed Topologies&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: The Stacker Crane Problem (SCP) is a variant of the Traveling Salesman Problem. In SCP, pairs of pickup and delivery points are designated on a graph, and a crane must visit these points to move objects from each pickup location to its respective delivery point. The goal is to minimize the total distance traveled. SCP is known to be NP-hard, even on trees. The only positive results, in terms of polynomial-time solvability, apply to graphs that are topologically equivalent to a path or a cycle. We propose an algorithm that is optimal for each fixed topology, running in near-linear time. This is achieved by demonstrating that the problem is fixed-parameter tractable (FPT) when parameterized by both the cycle rank and the number of branch vertices.&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;background: silver;&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Coffee Break (15:45 – 16:15)&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|16:15 – 17:05&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|张驰豪&amp;lt;br/&amp;gt;&lt;br /&gt;
上海交通大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Sampling from non-log-concave distributions&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: Sampling from a d-dimensional distribution &amp;lt;math&amp;gt;\mu&amp;lt;/math&amp;gt; with density &amp;lt;math&amp;gt;p_{\mu}(x) \propto e^{-V(x)}&amp;lt;/math&amp;gt; is a central problem in many areas, including theoretical computer science, statistical physics, and machine learning. It is well-known that when the potential function &amp;lt;math&amp;gt;V(x)&amp;lt;/math&amp;gt; is &#039;&#039;convex&#039;&#039; (or equivalently, when &amp;lt;math&amp;gt;p_{\mu}&amp;lt;/math&amp;gt; is &#039;&#039;log-concave&#039;&#039;), or more generally, when &amp;lt;math&amp;gt;\mu&amp;lt;/math&amp;gt; satisfies good isoperimetric inequalities, efficient sampling algorithms exist in various computational models. A common belief is that the sampling task becomes more difficult when &amp;lt;math&amp;gt;V(x)&amp;lt;/math&amp;gt; is &#039;&#039;non-convex&#039;&#039;. On the other hand, data-based algorithms (e.g., denoising diffusion probabilistic models) developed in the machine learning community are very successful in practice when dealing with highly non-log-concave distributions (such as in image generation), and provide new insights into designing efficient sampling algorithms. &amp;lt;br/&amp;gt; In this talk, we will start with a general tight (exponential) sampling complexity bound for any &#039;&#039;non-log-concave&#039;&#039; distribution &amp;lt;math&amp;gt;\mu&amp;lt;/math&amp;gt; satisfying mild regularity conditions. Then, we will show how a common strengthening of these regularity conditions leads to an efficient (polynomial) sampling algorithm. Finally, we will discuss future directions for understanding the complexity of sampling from general distributions.&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|17:10 – 18:00&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|黄增峰&amp;lt;br/&amp;gt;&lt;br /&gt;
复旦大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Simple and Optimal Algorithms for Heavy Hitters and Frequency Moments in Distributed Models&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We consider the problems of distributed heavy hitters and frequency moments in both the coordinator model and the distributed tracking model. We present simple and optimal algorithms for heavy hitters and frequency moments estimation in these distributed models. For &amp;lt;math&amp;gt;\ell_p&amp;lt;/math&amp;gt; heavy hitters in the coordinator model, our algorithm requires only one round and uses &amp;lt;math&amp;gt;\tilde{O}(k^{p-1}/\epsilon^p)&amp;lt;/math&amp;gt; bits of communication. For &amp;lt;math&amp;gt;p &amp;gt; 2&amp;lt;/math&amp;gt;, this is the first near-optimal result. By combining our algorithm with the standard recursive sketching technique, we obtain a near-optimal two-round algorithm for &amp;lt;math&amp;gt;F_p&amp;lt;/math&amp;gt; in the coordinator model, matching a significant result from recent work by Esfandiari et al. (STOC 2024). Our algorithm and analysis are much simpler and have better cost with respect to logarithmic factors. Due to the simplicity of our heavy hitter algorithms, we manage to adapt them to the distributed tracking model with only a &amp;lt;math&amp;gt;\mathrm{polylog}(n)&amp;lt;/math&amp;gt; increase in communication. This presents the first near-optimal algorithm for heavy hitters. By applying the recursive sketching technique, we also provide the first near-optimal algorithm for &amp;lt;math&amp;gt;F_p&amp;lt;/math&amp;gt; in the distributed tracking model for all &amp;lt;math&amp;gt;p \geq 2&amp;lt;/math&amp;gt;. Even for &amp;lt;math&amp;gt;F_2&amp;lt;/math&amp;gt;, our result improves upon the bounds established by Cormode, Muthukrishnan, and Yi (SODA 2008) and Woodruff and Zhang (STOC 2012), nearly matching the existing lower bound for the first time.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Getting to The Campus==&lt;br /&gt;
*Campus entry: &amp;lt;strike&amp;gt;Off-campus guests may register at the campus gate by giving security the name of the event.&amp;lt;/strike&amp;gt; &amp;lt;font color=red&amp;gt;[[Media:智软院访客预约流程.pdf|&#039;&#039;&#039;Due to a change in university policy, visitors must register for campus entry via the WeChat official account.&#039;&#039;&#039;]]&amp;lt;/font&amp;gt;&lt;br /&gt;
**Follow the WeChat official account “南京大学信息门户”, then tap “i校园”-“访客通行” at the bottom left of the portal home page&lt;br /&gt;
**Reviewing unit: 智能软件与工程学院 (School of Intelligent Software and Engineering); reviewer name: 石会&lt;br /&gt;
**If you are driving onto campus, enter your license plate number in the “随行车辆车牌” field.&lt;br /&gt;
*High-speed rail (G/D trains)&lt;br /&gt;
**苏州站 (Suzhou Station): about 30 minutes to campus by taxi (outside rush hours), roughly ¥30. Alternatively, take express bus line 3, or the metro with a transfer to the tram, about 2 hours in total.&lt;br /&gt;
**苏州新区站 (Suzhou New District Station): about 25 minutes by taxi, roughly ¥25. Tram line 2 is also an option, about 1 hour.&lt;br /&gt;
*By air&lt;br /&gt;
**无锡硕放机场 (WUX): about 30 minutes to campus by taxi; since the trip crosses city limits, the driver may add a return/empty-trip surcharge, for a total of about ¥80. A ride-share, if available, is cheaper. Public transport takes about 2 hours.&lt;br /&gt;
**上海虹桥机场 (SHA): we recommend transferring at Hongqiao Railway Station to a high-speed train to Suzhou Station or Suzhou New District Station. Note that the last high-speed train from Shanghai Hongqiao to Suzhou usually departs at 21:42. Taking a taxi directly from Hongqiao to Suzhou is not recommended (expensive), nor are ride-shares, which are usually unlicensed taxis.&lt;br /&gt;
&lt;br /&gt;
== Accommodation Suggestion ==&lt;br /&gt;
￥￥￥ 苏州科技城源宿酒店&amp;lt;br/&amp;gt;&lt;br /&gt;
￥￥ 南大国际学术交流中心（校内酒店，性价比高）、苏州科技城万达美华酒店、全季苏州科技城酒店、苏州高新区科技城亚朵酒店&amp;lt;br/&amp;gt;&lt;br /&gt;
￥ 格林豪泰苏州市科技城商务酒店、宜必思尚品苏州科技城酒店、如家精选-苏州乐园高新区科技城店&lt;br /&gt;
&lt;br /&gt;
== Lunch &amp;amp; Supper ==&lt;br /&gt;
[[File:苏州校区食堂（2025）.png|thumb|苏州校区的四个食堂在图中红星处]]&lt;br /&gt;
* 苏州校区内现有科创大厦食堂、第16、17、18食堂共四个食堂，素菜2-3元、荤菜4-8元，可直接用支付宝或微信支付。此外，国际学术交流中心也提供更为昂贵的食物。&lt;br /&gt;
* 学校附近有：东渚镇、文体中心、丰茂里、时尚水岸星悦荟、星悦里等几个商业区。&lt;br /&gt;
* 也可以选择外卖，会送至校门口的外卖柜或外卖架上。&lt;br /&gt;
&lt;br /&gt;
== Getting Around ==&lt;br /&gt;
* 大阳山国家森林公园 &amp;amp; 植物园：层林步道＋寺庙人文，爬 60–90 分钟视体力安排；秋冬晴天观景佳。&lt;br /&gt;
* 树山生态村：乡野步道、茶园与农家菜，团队晚餐/走读首选。&lt;br /&gt;
* 太湖湿地/西山方向：自驾更便捷，观湿地与湖景线。&lt;br /&gt;
* 古城园林：傍晚可打车去平江路/山塘街逛夜景，或白天参观苏州博物馆/拙政园。&lt;br /&gt;
&lt;br /&gt;
== About Suzhou Campus ==&lt;br /&gt;
南京大学苏州校区位于苏州高新区太湖科技城，地处“环太湖科创圈”与“沿沪宁产业创新带”的黄金交汇点，被定位为南大发展壮大新工科的主阵地。立足“国家战略、世界一流、强强联合、需需结合”，南大苏州校区聚焦人工智能、新一代信息技术、新能源、先进制造、生命健康等领域“卡脖子”问题，强化“新工科”建设，促进文理工医交叉融合，政产学研协调发展。&lt;br /&gt;
&lt;br /&gt;
== Contact ==&lt;br /&gt;
刘明谋： lmm@nju.edu.cn&amp;lt;br/&amp;gt;&lt;br /&gt;
[[Media:2025年Suzhou Theory Day邀请函.pdf|邀请函.pdf]]&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%80_(Fall_2025)&amp;diff=13415</id>
		<title>数据科学基础 (Fall 2025)</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%80_(Fall_2025)&amp;diff=13415"/>
		<updated>2025-11-29T17:04:12Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* Lectures */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{Infobox&lt;br /&gt;
|name         = Infobox&lt;br /&gt;
|bodystyle    = &lt;br /&gt;
|title        = &amp;lt;font size=3&amp;gt;&#039;&#039;&#039;数据科学基础&#039;&#039;&#039;&amp;lt;br&amp;gt;&lt;br /&gt;
Foundations of Data Science&lt;br /&gt;
|titlestyle   = &lt;br /&gt;
&lt;br /&gt;
|image        = &lt;br /&gt;
|imagestyle   = &lt;br /&gt;
|caption      = &lt;br /&gt;
|captionstyle = &lt;br /&gt;
|headerstyle  = background:#ccf;&lt;br /&gt;
|labelstyle   = background:#ddf;&lt;br /&gt;
|datastyle    = &lt;br /&gt;
&lt;br /&gt;
|header1 =Instructor&lt;br /&gt;
|label1  = &lt;br /&gt;
|data1   = &lt;br /&gt;
|header2 = &lt;br /&gt;
|label2  = &lt;br /&gt;
|data5   = &#039;&#039;&#039;刘明谋&#039;&#039;&#039;&lt;br /&gt;
|header6 = &lt;br /&gt;
|label6  = Email&lt;br /&gt;
|data6   = lmm@nju.edu.cn&lt;br /&gt;
|header7 =&lt;br /&gt;
|label7  = Office&lt;br /&gt;
|data7   = 南雍-西229&lt;br /&gt;
|header8 = Class&lt;br /&gt;
|label8  = &lt;br /&gt;
|data8   = &lt;br /&gt;
|header9 =&lt;br /&gt;
|label9  = Class meeting&lt;br /&gt;
|data9   = 周五, 2pm-5pm &amp;lt;br/&amp;gt;苏教楼C204&lt;br /&gt;
|header10=&lt;br /&gt;
|label10 = Office hour&lt;br /&gt;
|data10  = 周四, 3pm-5pm&amp;lt;br/&amp;gt;南雍-西229&lt;br /&gt;
|header11= Textbook&lt;br /&gt;
|label11 = &lt;br /&gt;
|data11  = &lt;br /&gt;
|header12=&lt;br /&gt;
|label12 = &lt;br /&gt;
|data12  = [[File:概率导论.jpeg|border|100px]]&lt;br /&gt;
|header13=&lt;br /&gt;
|label13 = &lt;br /&gt;
|data13  = &#039;&#039;&#039;概率导论&#039;&#039;&#039;（第2版·修订版）&amp;lt;br&amp;gt; Dimitri P. Bertsekas and John N. Tsitsiklis&amp;lt;br&amp;gt; 郑忠国 童行伟 译；人民邮电出版社 (2022)&lt;br /&gt;
|header14=&lt;br /&gt;
|label14 = &lt;br /&gt;
|data14  = [[File:Probability_and_Computing_2ed.jpg|border|100px]]&lt;br /&gt;
|header15=&lt;br /&gt;
|label15 = &lt;br /&gt;
|data15  = &#039;&#039;&#039;Probability and Computing&#039;&#039;&#039; (2E) &amp;lt;br&amp;gt; Michael Mitzenmacher and Eli Upfal &amp;lt;br&amp;gt;   Cambridge University Press (2017)&lt;br /&gt;
|header16=&lt;br /&gt;
|label16 = &lt;br /&gt;
|data16  = [[File:Foundations_of_Data_Science.jpg|border|100px]]&lt;br /&gt;
|header17= &lt;br /&gt;
|label17 = &lt;br /&gt;
|data17  = &#039;&#039;&#039;Foundations of Data Science&#039;&#039;&#039; &amp;lt;br&amp;gt; Avrim Blum, John Hopcroft, Ravi Kannan &amp;lt;br&amp;gt;   Cambridge University Press (2020)&lt;br /&gt;
|belowstyle = background:#ddf;&lt;br /&gt;
|below = &lt;br /&gt;
}}&lt;br /&gt;
&lt;br /&gt;
This is the webpage for the &#039;&#039;Foundations of Data Science&#039;&#039; (数据科学基础) class of Fall 2025. Students who take this class should check this page periodically for content updates and new announcements. &lt;br /&gt;
&lt;br /&gt;
= Announcement =&lt;br /&gt;
* 新学期第一堂课：2025年8月29日，苏教楼D202。&lt;br /&gt;
* 2025年11月7日因校运动会停课一次。&lt;br /&gt;
* 第五次作业的Aliasing method 一题中应该是&amp;lt;math&amp;gt;\displaystyle{ \mathbf p=\frac{1}{n}\sum^n_{r=1}\mathbf v_r }&amp;lt;/math&amp;gt;而不是&amp;lt;math&amp;gt;\displaystyle{ \mathbf p=\frac{1}{n-1}\sum^n_{r=1}\mathbf v_r }&amp;lt;/math&amp;gt;。&lt;br /&gt;
&lt;br /&gt;
= Course info =&lt;br /&gt;
* &#039;&#039;&#039;Instructor &#039;&#039;&#039;: &lt;br /&gt;
** [https://liumingmou.github.io 刘明谋]：[mailto:lmm@nju.edu.cn &amp;lt;lmm@nju.edu.cn&amp;gt;]，南雍-西229&lt;br /&gt;
* &#039;&#039;&#039;Teaching assistant&#039;&#039;&#039;:&lt;br /&gt;
** 梁梓豪：[mailto:zhliang@smail.nju.edu.cn 📧] 仙林校区计科楼北栋426 &lt;br /&gt;
** 周海刚：[mailto:hgzhou2003@outlook.com 📧] 仙林校区计科楼北栋410&lt;br /&gt;
** 欧丰宁：[mailto:oufn02@outlook.com 📧] 仙林校区计科楼北栋410&lt;br /&gt;
** 于逸潇：[mailto:yixiaoyu@smail.nju.edu.cn 📧] 仙林校区计科楼北栋410&lt;br /&gt;
** 缪天顺：[mailto:mtsmts2022@outlook.com 📧] 仙林校区计科楼北栋426 &lt;br /&gt;
* &#039;&#039;&#039;Class meeting&#039;&#039;&#039;:&lt;br /&gt;
** 周五：2pm-5pm，苏教楼C204&lt;br /&gt;
* &#039;&#039;&#039;Office hour&#039;&#039;&#039;: &lt;br /&gt;
:* 周四：3pm-5pm，南雍-西229（刘明谋）&lt;br /&gt;
:* &#039;&#039;&#039;QQ群&#039;&#039;&#039;: 1019436733（申请加入需提供姓名、院系、学号）&lt;br /&gt;
&lt;br /&gt;
= Syllabus =&lt;br /&gt;
课程内容分为三大部分：&lt;br /&gt;
&lt;br /&gt;
* &#039;&#039;&#039;经典概率论&#039;&#039;&#039;：包括概率空间、随机变量及其数字特征、多维与连续随机变量&lt;br /&gt;
* &#039;&#039;&#039;概率与计算&#039;&#039;&#039;：包括测度集中现象，概率法，离散随机过程三部分&lt;br /&gt;
* &#039;&#039;&#039;数理统计&#039;&#039;&#039;：包括参数估计、假设检验、贝叶斯估计、方差分析、相关性及回归分析等统计推断内容。&lt;br /&gt;
&lt;br /&gt;
对于第一和第二部分，要求清楚掌握基本概念，深刻理解关键的现象与规律以及背后的原理，并可以灵活运用所学方法求解相关问题。对于第三部分，要求熟悉数理统计相关的基本概念，以及典型的统计模型、统计推断方法。&lt;br /&gt;
&lt;br /&gt;
经过本课程的训练，学生将能够掌握概率论和统计学的基本理论和方法，具备处理和分析实际数据的能力，为后续学习数据挖掘、机器学习、大数据技术等数据科学相关领域打下坚实基础。本课程采用课堂讲授、案例分析和课后练习相结合的教学方式，注重理论与实践相结合，培养学生运用所学知识解决实际问题的能力。通过本课程的学习，学生将能够具备扎实的数学基础，为未来从事数据科学研究和实践奠定坚实基础。&lt;br /&gt;
&lt;br /&gt;
=== 教材与参考书 Course Materials ===&lt;br /&gt;
* &#039;&#039;&#039;[BT]&#039;&#039;&#039; 概率导论（第2版·修订版），[美]伯特瑟卡斯（Dimitri P.Bertsekas）[美]齐齐克利斯（John N.Tsitsiklis）著，郑忠国 童行伟 译，人民邮电出版社（2022）。&lt;br /&gt;
* &#039;&#039;&#039;[MU]&#039;&#039;&#039; &#039;&#039;Probability and Computing: Randomization and Probabilistic Techniques in Algorithms and Data Analysis&#039;&#039;, by Michael Mitzenmacher, Eli Upfal; Cambridge University Press; 2nd edition (2017).&lt;br /&gt;
* &#039;&#039;&#039;[GS]&#039;&#039;&#039; &#039;&#039;Probability and Random Processes&#039;&#039;, by Geoffrey Grimmett and David Stirzaker; Oxford University Press; 4th edition (2020).&lt;br /&gt;
* &#039;&#039;&#039;[BHK]&#039;&#039;&#039; &#039;&#039;Foundations of Data Science&#039;&#039;, by Avrim Blum, John Hopcroft, and Ravindran Kannan; Cambridge University Press (2020).&lt;br /&gt;
&lt;br /&gt;
=== 成绩 Grading Policy ===&lt;br /&gt;
* 课程成绩：本课程将会有若干次作业和一次期末考试。最终成绩将由平时作业成绩和期末考试成绩综合得出。&lt;br /&gt;
* 迟交：如果有特殊的理由，无法按时完成作业，请提前联系授课老师，给出正当理由。否则迟交的作业将不被接受。&lt;br /&gt;
&lt;br /&gt;
=== &amp;lt;font color=red&amp;gt; 学术诚信 Academic Integrity &amp;lt;/font&amp;gt;===&lt;br /&gt;
学术诚信是所有从事学术活动的学生和学者最基本的职业道德底线，本课程将不遗余力的维护学术诚信规范，违反这一底线的行为将不会被容忍。&lt;br /&gt;
&lt;br /&gt;
作业完成的原则：&#039;&#039;&#039;署你名字的工作必须是你个人的贡献，任何不是由你完成的部分都必须明确标注&#039;&#039;&#039;，特别是由AI生成的部分，否则就涉嫌抄袭。在完成作业的过程中，允许讨论，前提是讨论的所有参与者均处于同等完成度。但关键想法的执行、以及作业文本的写作必须独立完成，并在作业中致谢（acknowledge）所有参与讨论的人。符合规则的讨论与致谢将不会影响得分。不允许其他任何形式的合作——尤其是与已经完成作业的同学“讨论”。&lt;br /&gt;
&lt;br /&gt;
本课程将对剽窃行为采取零容忍的态度。在完成作业过程中，对他人工作（出版物、互联网资料、其他人的作业等）直接的文本抄袭和对关键思想、关键元素的抄袭，按照 [http://www.acm.org/publications/policies/plagiarism_policy ACM Policy on Plagiarism]的解释，都将视为剽窃。剽窃者成绩将被取消。如果发现互相抄袭行为，&amp;lt;font color=red&amp;gt; 抄袭和被抄袭双方的成绩都将被取消&amp;lt;/font&amp;gt;。因此请主动防止自己的作业被他人抄袭。&lt;br /&gt;
&lt;br /&gt;
学术诚信影响学生个人的品行，也关乎整个教育系统的正常运转。为了一点分数而做出学术不端的行为，不仅使自己沦为一个欺骗者，也使他人的诚实努力失去意义。让我们一起努力维护一个诚信的环境。&lt;br /&gt;
&lt;br /&gt;
= Assignments =&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 1|Problem Set 1]]  请在 2025/09/26 上课之前(14:00 UTC+8)使用邮件的附件功能提交到 [mailto:pr2024_nju@163.com pr2024_nju@163.com] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA1.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 2|Problem Set 2]]  请在 2025/10/03 14:00前(UTC+8)使用邮件的附件功能提交到 [mailto:pr2024_nju@163.com pr2024_nju@163.com] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA2.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 3|Problem Set 3]]  请在 2025/10/17 上课之前(14:00 UTC+8)上传到 [https://box.nju.edu.cn/u/d/e717e1b8eccd4c4fb889/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA3.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 4|Problem Set 4]]  请在 2025/10/31 上课之前(14:00 UTC+8)上传到 [https://box.nju.edu.cn/u/d/fb85c46de75f4095b326/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA4.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
*[[数据科学基础 (Fall 2025)/Problem Set 5|Problem Set 5]]  请在 2025/11/21 上课之前(14:00 UTC+8)上传到 [https://box.nju.edu.cn/u/d/1243dac3190b4e1eb30b/ 南大云盘] (文件名为&#039;&amp;lt;font color=red &amp;gt;学号_姓名_25FA5.pdf&amp;lt;/font&amp;gt;&#039;).&lt;br /&gt;
&lt;br /&gt;
= Lectures =&lt;br /&gt;
# [https://tcs.nju.edu.cn/wiki/images/1/1a/Intro%EF%BC%88%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%802025%EF%BC%89.pdf 课程简介]&lt;br /&gt;
#* [https://www.bilibili.com/video/BV1Vkz4YqEC9 Bertrand Paradox (贝特朗悖论)的视频]&lt;br /&gt;
# [https://tcs.nju.edu.cn/wiki/images/5/51/ProbSpace%EF%BC%88%E6%95%B0%E6%8D%AE%E7%A7%91%E5%AD%A6%E5%9F%BA%E7%A1%802025%EF%BC%89.pdf 概率空间]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第1章&#039;&#039;&#039;&lt;br /&gt;
# [https://box.nju.edu.cn/f/732bad4060fc442789ab/ 随机变量]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第2章&#039;&#039;&#039; &lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 2&#039;&#039;&#039;&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/Volume of Hamming balls|Volume of Hamming balls]]&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/Average-case analysis of QuickSort|Average-case analysis of &#039;&#039;&#039;&#039;&#039;QuickSort&#039;&#039;&#039;&#039;&#039;]]&lt;br /&gt;
#* [https://www.bilibili.com/video/BV1ta411A7fp/ 高尔顿板（Galton board）视频] 和 [https://en.wikipedia.org/wiki/Galton_board 维基百科页面]&lt;br /&gt;
# [https://box.nju.edu.cn/f/89f212b7b6874c0e9097/ 偏差和矩]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 3&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 章节 2.4, 4.2, 4.3, 5.1&#039;&#039;&#039;&lt;br /&gt;
#* [[概率论与数理统计 (Spring 2024)/Threshold of k-clique in random graph|Threshold of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-clique in random graph]]&lt;br /&gt;
# [https://box.nju.edu.cn/f/1eca74dafe6c4d11a799/ 连续分布]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第3章和4.1节&#039;&#039;&#039; 或 &#039;&#039;&#039;[GS] Chapter 4&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapters 8, 9&#039;&#039;&#039;&lt;br /&gt;
#* [https://measure.axler.net/MIRA.pdf Measure, Integration &amp;amp; Real Analysis] by Sheldon Axler&lt;br /&gt;
# [https://box.nju.edu.cn/f/9a675bedb36243d19616/ 极限定理]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第5章&#039;&#039;&#039; &lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[GS] Sections 5.7~5.10, 7.1~7.5&#039;&#039;&#039;&lt;br /&gt;
# [https://box.nju.edu.cn/f/1049bd7f7974465cbc85/ 测度集中]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 4&#039;&#039;&#039; and &#039;&#039;&#039;Sections 13.1, 13.4~13.5&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[GS] Sections 5.11, 12.1~12.3, 7.8~7.9&#039;&#039;&#039;&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/Hoeffding&#039;s lemma|Hoeffding&#039;s lemma]]&lt;br /&gt;
# [https://box.nju.edu.cn/f/06617a7c88af456696de/ 随机过程]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第6章, 第7章&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Chapter 7, Sections 13.1~13.3&#039;&#039;&#039; or &#039;&#039;&#039;[GS] Chapter 6, Sections 12.4~12.5&#039;&#039;&#039;&lt;br /&gt;
#* [[数据科学基础 (Fall 2024)/OST and applications|OST and applications]]&lt;br /&gt;
# [https://box.nju.edu.cn/f/be7ade6440ea4462af3b/ 统计学与点估计]&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[BT] 第8章, 第9章&#039;&#039;&#039;&lt;br /&gt;
#* 阅读：&#039;&#039;&#039;[MU] Sections 9.6~9.7&#039;&#039;&#039;&lt;br /&gt;
# [https://box.nju.edu.cn/f/5e1cb2f1d656460bb60c/ 假设检验]&lt;br /&gt;
&lt;br /&gt;
= Concepts =&lt;br /&gt;
* [https://plato.stanford.edu/entries/probability-interpret/ Interpretations of probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/History_of_probability History of probability]&lt;br /&gt;
* Example problems:&lt;br /&gt;
** [https://dornsifecms.usc.edu/assets/sites/520/docs/VonNeumann-ams12p36-38.pdf von Neumann&#039;s Bernoulli factory] and other [https://peteroupc.github.io/bernoulli.html Bernoulli factory algorithms]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Boy_or_Girl_paradox Boy or Girl paradox]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Monty_Hall_problem Monty Hall problem]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bertrand_paradox_(probability) Bertrand paradox]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Hard_spheres Hard spheres model] and [https://en.wikipedia.org/wiki/Ising_model Ising model]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/PageRank &#039;&#039;PageRank&#039;&#039;] and stationary [https://en.wikipedia.org/wiki/Random_walk random walk]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Diffusion_process Diffusion process] and [https://en.wikipedia.org/wiki/Diffusion_model diffusion model]&lt;br /&gt;
*[https://en.wikipedia.org/wiki/Probability_space Probability space]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Sample_space Sample space]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Event_(probability_theory) Event] and [https://en.wikipedia.org/wiki/Σ-algebra &amp;lt;math&amp;gt;\sigma&amp;lt;/math&amp;gt;-algebra]&lt;br /&gt;
** Kolmogorov&#039;s [https://en.wikipedia.org/wiki/Probability_axioms axioms of probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Discrete_uniform_distribution Classical] and [https://en.wikipedia.org/wiki/Geometric_probability geometric probability]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Boole%27s_inequality Union bound]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Inclusion%E2%80%93exclusion_principle Inclusion-Exclusion principle]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Boole%27s_inequality#Bonferroni_inequalities Bonferroni inequalities]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Conditional_probability Conditional probability]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Chain_rule_(probability) Chain rule]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_total_probability Law of total probability]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bayes%27_theorem Bayes&#039; law]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Independence_(probability_theory) Independence] &lt;br /&gt;
** [https://en.wikipedia.org/wiki/Pairwise_independence Pairwise independence]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Random_variable Random variable]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Cumulative_distribution_function Cumulative distribution function]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Probability_mass_function Probability mass function]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Probability_density_function Probability density function]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Multivariate_random_variable Random vector]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Joint_probability_distribution Joint probability distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Conditional_probability_distribution Conditional probability distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Marginal_distribution Marginal distribution]&lt;br /&gt;
* Some &#039;&#039;&#039;discrete&#039;&#039;&#039; probability distributions&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Bernoulli_trial Bernoulli trial] and [https://en.wikipedia.org/wiki/Bernoulli_distribution Bernoulli distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Discrete_uniform_distribution Discrete uniform distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Binomial_distribution Binomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Geometric_distribution Geometric distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Negative_binomial_distribution Negative binomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Hypergeometric_distribution Hypergeometric distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Poisson_distribution Poisson distribution]&lt;br /&gt;
** and [https://en.wikipedia.org/wiki/List_of_probability_distributions#Discrete_distributions others]&lt;br /&gt;
* Balls into bins model&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Multinomial_distribution Multinomial distribution]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Birthday_problem Birthday problem]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Coupon_collector%27s_problem Coupon collector]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Balls_into_bins_problem Occupancy problem]&lt;br /&gt;
* Random graphs&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Erd%C5%91s%E2%80%93R%C3%A9nyi_model Erdős–Rényi random graph model]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Galton%E2%80%93Watson_process Galton–Watson branching process]&lt;br /&gt;
* [https://en.wikipedia.org/wiki/Expected_value Expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_the_unconscious_statistician Law of the unconscious statistician, &#039;&#039;LOTUS&#039;&#039;]&lt;br /&gt;
** [https://dlsun.github.io/probability/linearity.html Linearity of expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Conditional_expectation Conditional expectation]&lt;br /&gt;
** [https://en.wikipedia.org/wiki/Law_of_total_expectation Law of total expectation]&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=Theory@Suzhou_2025&amp;diff=13412</id>
		<title>Theory@Suzhou 2025</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=Theory@Suzhou_2025&amp;diff=13412"/>
		<updated>2025-11-29T03:37:19Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* Getting to The Campus */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:2025 SuZhou Theory Day poster.png|200px|thumb|活动海报]]&lt;br /&gt;
==General Information ==&lt;br /&gt;
[[File:苏教楼D.png|thumb|苏教楼D在图中红星处]]&lt;br /&gt;
*&#039;&#039;&#039;&amp;lt;font size=4&amp;gt;Sunday, Nov 30, 2025: 09:00 -- 18:00.&amp;lt;/font&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
* &amp;lt;font size=4&amp;gt;&#039;&#039;&#039;Location&#039;&#039;&#039;: 南京大学苏州校区&amp;lt;/font&amp;gt;&lt;br /&gt;
* &amp;lt;font size=4&amp;gt;&#039;&#039;&#039;Venue&#039;&#039;&#039;: 苏教楼D202&amp;lt;/font&amp;gt;&lt;br /&gt;
[https://zcc.nju.edu.cn/DFS//file/2024/09/20/202409201037042506uv3mq.pdf 苏州校区地图]&lt;br /&gt;
&lt;br /&gt;
==Announcement==&lt;br /&gt;
&amp;lt;font size=4 color=red&amp;gt;&#039;&#039;&#039;因学校政策调整，需要在公众号上登记入校。&#039;&#039;&#039;&amp;lt;/font&amp;gt;&lt;br /&gt;
*关注微信公众号“南京大学信息门户”，点击门户首页左下角“i校园”-“访客通行”&lt;br /&gt;
*审核人单位：智能软件与工程学院。审核人姓名：石会&lt;br /&gt;
*如开车进校，请将车牌号填写到“随行车辆车牌”处。&lt;br /&gt;
&lt;br /&gt;
==Speakers (in alphabetic order)==&lt;br /&gt;
* [http://staff.ustc.edu.cn/~xuechen1989/ 陈雪]（中国科学技术大学）&lt;br /&gt;
* [https://zengfenghuang.github.io/ 黄增峰]（复旦大学）&lt;br /&gt;
* [https://www.shaofengjiang.cn/ 姜少峰]（北京大学）&lt;br /&gt;
* [https://chaoxu.prof/ 许超]（电子科技大学）&lt;br /&gt;
* [https://chihaozhang.com/ 张驰豪]（上海交通大学）&lt;br /&gt;
* [https://scholar.google.com/citations?user=TydhZfgAAAAJ  张瀚文]（哥本哈根大学）&lt;br /&gt;
* [https://zhangty12.github.io/ 张天翼]（南京大学）&lt;br /&gt;
&lt;br /&gt;
== Join us==&lt;br /&gt;
&#039;&#039;&#039;不需注册&#039;&#039;&#039;。&lt;br /&gt;
本次活动涵盖近似算法、图算法、计算几何、理论机器学习、概率与采样算法、流与分布式算法在内的多个主题，欢迎所有对理论计算机科学感兴趣的同学和老师前来参加。&amp;lt;br/&amp;gt;&lt;br /&gt;
请 [https://docs.qq.com/form/page/DS0JxdW5yZHZPYWtF &#039;&#039;&#039;简单填写问卷&#039;&#039;&#039;] 用于统计参会人数，以便准备茶歇的食物和调整报告厅。&lt;br /&gt;
&lt;br /&gt;
== Program ==&lt;br /&gt;
:{|border=&amp;quot;2&amp;quot; width=&amp;quot;100%&amp;quot; cellspacing=&amp;quot;4&amp;quot; cellpadding=&amp;quot;3&amp;quot; rules=&amp;quot;all&amp;quot; style=&amp;quot;margin:1em 1em 1em 0; border:solid 1px #AAAAAA; border-collapse:collapse;empty-cells:show;&amp;quot; &lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#A7C1F2&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Workshop Program&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;width: 140px;&amp;quot; align=&amp;quot;center&amp;quot;|09:00 - 09:50&lt;br /&gt;
|style=&amp;quot;width: 180px;&amp;quot; align=&amp;quot;center&amp;quot;|张天翼&amp;lt;br/&amp;gt;&lt;br /&gt;
南京大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Approximate Light Spanners in Planar Graphs&amp;lt;/font&amp;gt;&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: Althöfer 等人（DCG 1993）提出了贪心生成子图，并证明了对于任意带权平面图 &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;，其贪心 &amp;lt;math&amp;gt;(1+\epsilon)&amp;lt;/math&amp;gt;-生成子图的总权重至多为 &amp;lt;math&amp;gt;\left(1 + \frac{2}{\epsilon}\right) \cdot w(\mathrm{MST}(G))&amp;lt;/math&amp;gt;，其中 &amp;lt;math&amp;gt;w(\mathrm{MST}(G))&amp;lt;/math&amp;gt; 表示图 &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; 的最小生成树 &amp;lt;math&amp;gt;\mathrm{MST}(G)&amp;lt;/math&amp;gt; 的权重。该界在存在性意义上是紧的：存在某些平面图 &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;，使得其任意 &amp;lt;math&amp;gt;(1+\epsilon)&amp;lt;/math&amp;gt;-生成子图的权重至少为 &amp;lt;math&amp;gt;\left(1 + \frac{2}{\epsilon}\right) \cdot w(\mathrm{MST}(G))&amp;lt;/math&amp;gt;。&amp;lt;br/&amp;gt;然而，从近似算法的角度来看，即使是双标准（bicriteria）近似，贪心生成子图 的权重近似因子也基本上达到了上述存在性下界：存在某些平面图 &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;，使得对于任意满足 &amp;lt;math&amp;gt;1 \leq x = O(\epsilon^{-1/2})&amp;lt;/math&amp;gt; 的参数，其贪心 &amp;lt;math&amp;gt;(1 + x\epsilon)&amp;lt;/math&amp;gt;-生成子图 的权重为 &amp;lt;math&amp;gt;\Omega\left(\frac{1}{\epsilon \cdot x^2} \cdot w(G_{\mathrm{opt},\epsilon})\right)&amp;lt;/math&amp;gt;，其中 &amp;lt;math&amp;gt;G_{\mathrm{opt},\epsilon}&amp;lt;/math&amp;gt; 是图 &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; 的权重最小的 &amp;lt;math&amp;gt;(1+\epsilon)&amp;lt;/math&amp;gt;-生成子图。&amp;lt;br/&amp;gt;尽管在过去三十年中，关于生成子图的近似算法的研究层出不穷，但目前仍不存在任何（即使是双标准）近似算法，能够在带权平面图上构造出优于上述存在性下界的轻量生成子图。&amp;lt;br/&amp;gt;作为本文的主要贡献，我们提出了一种在平面图上的动态规划算法，可在任意带权平面图 &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; 中构造一个 &amp;lt;math&amp;gt;\left(1 + \epsilon \cdot 2^{O(\log^* 1/\epsilon)}\right)&amp;lt;/math&amp;gt;-生成子图，其总权重为 &amp;lt;math&amp;gt;O(1) \cdot w(G_{\mathrm{opt},\epsilon})&amp;lt;/math&amp;gt;。此外，我们也证明了精确求解最小平面生成子图是NP难的。&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;background: silver;&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Coffee Break (09:50 – 10:15)&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|10:15 – 11:05&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|姜少峰&amp;lt;br/&amp;gt;&lt;br /&gt;
北京大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Local Search for Clustering in Almost-linear Time&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We propose the first local search algorithm for Euclidean clustering that attains an &amp;lt;math&amp;gt;O(1)&amp;lt;/math&amp;gt;-approximation in almost-linear time. Specifically, for Euclidean k-Means, our algorithm achieves an &amp;lt;math&amp;gt;O(c)&amp;lt;/math&amp;gt;-approximation in &amp;lt;math&amp;gt;\tilde{O}(n^{1 + 1 / c})&amp;lt;/math&amp;gt; time, for any constant &amp;lt;math&amp;gt;c \ge 1&amp;lt;/math&amp;gt;, maintaining the same running time as the previous (non-local-search-based) approach [la Tour and Saulpic, arXiv:2407.11217] while improving the approximation factor from &amp;lt;math&amp;gt;O(c^{6})&amp;lt;/math&amp;gt; to &amp;lt;math&amp;gt;O(c)&amp;lt;/math&amp;gt;. The algorithm generalizes to any metric space with sparse spanners, delivering efficient constant approximation in &amp;lt;math&amp;gt;\ell_p&amp;lt;/math&amp;gt; metrics, doubling metrics, Jaccard metrics, etc.&amp;lt;br/&amp;gt; This generality derives from our main technical contribution: a local search algorithm on general graphs that obtains an &amp;lt;math&amp;gt;O(1)&amp;lt;/math&amp;gt;-approximation in almost-linear time. We establish this through a new &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;-swap local search framework featuring a novel swap selection rule. At a high level, this rule “scores” every possible swap, based on both its modification to the clustering and its improvement to the clustering objective, and then selects those high-scoring swaps. To implement this, we design a new data structure for maintaining approximate nearest neighbors with amortized guarantees tailored to our framework.&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|11:10 – 12:00&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|陈雪&amp;lt;br/&amp;gt;&lt;br /&gt;
中国科学技术大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Algorithms for Sparse LPN and LSPN Against Low-noise&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We consider sparse variants of the classical Learning Parities with random Noise (LPN) problem. Our main contribution is a new algorithmic framework that provides learning algorithms against low noise for both the Learning Sparse Parities (LSPN) problem and the sparse LPN problem. Different from previous approaches for LSPN and sparse LPN, this framework has a simple structure and runs in polynomial space. Let &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; be the dimension, &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; denote the sparsity, and &amp;lt;math&amp;gt;\eta&amp;lt;/math&amp;gt; be the noise rate.&amp;lt;br/&amp;gt;As a fundamental problem in computational learning theory, Learning Sparse Parities with Noise (LSPN) assumes the hidden parity is &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-sparse. While a simple enumeration algorithm takes &amp;lt;math&amp;gt;{n \choose k}=O((n/k)^k)&amp;lt;/math&amp;gt; time, previously known results still need &amp;lt;math&amp;gt;{n \choose k/2} = \Omega((n/k)^{k/2})&amp;lt;/math&amp;gt; time for any noise rate &amp;lt;math&amp;gt;\eta&amp;lt;/math&amp;gt;. Our framework provides an LSPN algorithm that runs in time &amp;lt;math&amp;gt;O((\eta \cdot n/k)^k)&amp;lt;/math&amp;gt; for any noise rate &amp;lt;math&amp;gt;\eta&amp;lt;/math&amp;gt;, which improves the state of the art for LSPN whenever &amp;lt;math&amp;gt;\eta \in ( k/n,\sqrt{k/n})&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;The sparse LPN problem is closely related to the classical problem of refuting random &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-CSP and has been widely used in cryptography as a hardness assumption. Different from the standard LPN, it samples random &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-sparse vectors. Because the number of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-sparse vectors is &amp;lt;math&amp;gt;{n \choose k} &amp;lt; n^k&amp;lt;/math&amp;gt;, sparse LPN admits polynomial-time learning algorithms when &amp;lt;math&amp;gt;m&amp;gt;n^{k/2}&amp;lt;/math&amp;gt;. However, much less is known about learning algorithms for constant &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; (e.g., &amp;lt;math&amp;gt;3&amp;lt;/math&amp;gt;) and &amp;lt;math&amp;gt;m&amp;lt;n^{k/2}&amp;lt;/math&amp;gt; samples, except for the Gaussian elimination algorithm, which runs in &amp;lt;math&amp;gt;e^{\eta n}&amp;lt;/math&amp;gt; time. Our framework provides a learning algorithm running in &amp;lt;math&amp;gt;e^{O(\eta \cdot n^{\frac{\delta+1}{2}})}&amp;lt;/math&amp;gt; time given &amp;lt;math&amp;gt;\delta \in (0,1)&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;m \approx n^{1+(1-\delta)\cdot \frac{k-1}{2}}&amp;lt;/math&amp;gt; samples. This improves upon previous learning algorithms. For example, in the classical setting of &amp;lt;math&amp;gt;k=3&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;m=n^{1.4}&amp;lt;/math&amp;gt;, our algorithm would be faster than previous approaches for any &amp;lt;math&amp;gt;\eta&amp;lt;n^{-0.7}&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;Based on joint work with Wenxuan Shu (USTC) and Zhaienhe Zhou (USTC).&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;background: silver;&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Lunch Break  (12:00 - 14:00)&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|14:00 – 14:50&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|张瀚文&amp;lt;br/&amp;gt;&lt;br /&gt;
哥本哈根大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Minimum Star Partitions of Simple Polygons in Polynomial Time &amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: 我们设计了一种多项式时间算法，用于将简单多边形P划分为最少个数的星形多边形。这样的算法是否存在的问题已被提出超过四十年之久并多次重复，包括在O’Rourke的著作《美术馆定理与算法》中。之前已知的算法只能处理一些特殊情况，例如多边形是单调的直边多边形，或者不允许使用斯坦纳点的情况，都远不足以处理最普遍的例子。而允许星型子部分重叠的覆盖变体——即著名的美术馆问题，在2018年被证明属于∃ℝ完全类，因此很可能比NP问题更难。除了理论价值外，星型多边形划分也可以应用在数控型腔铣削、机器人路径规划、形状参数化等实际场景中。&amp;lt;br/&amp;gt;在这个报告中，我会着重讲解我们求解这个问题时的直觉、思考和发现，沉浸式体验我们在这项研究中的全部经历。&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|14:55 – 15:45&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|许超&amp;lt;br/&amp;gt;&lt;br /&gt;
电子科技大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: An Optimal Algorithm for the Stacker Crane Problem on Fixed Topologies&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: The Stacker Crane Problem (SCP) is a variant of the Traveling Salesman Problem. In SCP, pairs of pickup and delivery points are designated on a graph, and a crane must visit these points to move objects from each pickup location to its respective delivery point. The goal is to minimize the total distance traveled. SCP is known to be NP-hard, even on trees. The only positive results, in terms of polynomial-time solvability, apply to graphs that are topologically equivalent to a path or a cycle. We propose an algorithm that is optimal for each fixed topology, running in near-linear time. This is achieved by demonstrating that the problem is fixed-parameter tractable (FPT) when parameterized by both the cycle rank and the number of branch vertices.&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;background: silver;&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Coffee Break (15:45 – 16:15)&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|16:15 – 17:05&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|张驰豪&amp;lt;br/&amp;gt;&lt;br /&gt;
上海交通大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Sampling from non-log-concave distributions&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: Sampling from a d-dimensional distribution &amp;lt;math&amp;gt;\mu&amp;lt;/math&amp;gt; with density &amp;lt;math&amp;gt;p_{\mu}(x) \propto e^{-V(x)}&amp;lt;/math&amp;gt; is a central problem in many areas, including theoretical computer science, statistical physics, and machine learning. It is well-known that when the potential function &amp;lt;math&amp;gt;V(x)&amp;lt;/math&amp;gt; is &#039;&#039;convex&#039;&#039; (or equivalently, when &amp;lt;math&amp;gt;p_{\mu}&amp;lt;/math&amp;gt; is &#039;&#039;log-concave&#039;&#039;), or more generally, when &amp;lt;math&amp;gt;\mu&amp;lt;/math&amp;gt; satisfies good isoperimetric inequalities, efficient sampling algorithms exist in various computational models. A common belief is that the sampling task becomes more difficult when &amp;lt;math&amp;gt;V(x)&amp;lt;/math&amp;gt; is &#039;&#039;non-convex&#039;&#039;. On the other hand, data-based algorithms (e.g., denoising diffusion probabilistic models) developed in the machine learning community are very successful in practice when dealing with highly non-log-concave distributions (such as in image generation), and provide new insights into designing efficient sampling algorithms. &amp;lt;br/&amp;gt; In this talk, we will start with a general tight (exponential) sampling complexity bound for any &#039;&#039;non-log-concave&#039;&#039; distribution &amp;lt;math&amp;gt;\mu&amp;lt;/math&amp;gt; satisfying mild regularity conditions. Then, we will show how a common strengthening of these regularity conditions leads to an efficient (polynomial) sampling algorithm. Finally, we will discuss future directions for understanding the complexity of sampling from general distributions.&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|17:10 – 18:00&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|黄增峰&amp;lt;br/&amp;gt;&lt;br /&gt;
Fudan University&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Simple and Optimal Algorithms for Heavy Hitters and Frequency Moments in Distributed Models&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We consider the problems of distributed heavy hitters and frequency moments in both the coordinator model and the distributed tracking model, and present simple and optimal algorithms for both problems in these models. For &amp;lt;math&amp;gt;\ell_p&amp;lt;/math&amp;gt; heavy hitters in the coordinator model, our algorithm requires only one round and uses &amp;lt;math&amp;gt;\tilde{O}(k^{p-1}/\epsilon^p)&amp;lt;/math&amp;gt; bits of communication. For &amp;lt;math&amp;gt;p &amp;gt; 2&amp;lt;/math&amp;gt;, this is the first near-optimal result. By combining our algorithm with the standard recursive sketching technique, we obtain a near-optimal two-round algorithm for &amp;lt;math&amp;gt;F_p&amp;lt;/math&amp;gt; in the coordinator model, matching a significant result from recent work by Esfandiari et al. (STOC 2024). Our algorithm and analysis are much simpler and have better cost with respect to logarithmic factors. Due to the simplicity of our heavy hitter algorithms, we manage to adapt them to the distributed tracking model with only a &amp;lt;math&amp;gt;\mathrm{polylog}(n)&amp;lt;/math&amp;gt; increase in communication. This yields the first near-optimal algorithm for heavy hitters in the distributed tracking model. By applying the recursive sketching technique, we also provide the first near-optimal algorithm for &amp;lt;math&amp;gt;F_p&amp;lt;/math&amp;gt; in the distributed tracking model for all &amp;lt;math&amp;gt;p \geq 2&amp;lt;/math&amp;gt;. Even for &amp;lt;math&amp;gt;F_2&amp;lt;/math&amp;gt;, our result improves upon the bounds established by Cormode, Muthukrishnan, and Yi (SODA 2008) and Woodruff and Zhang (STOC 2012), nearly matching the existing lower bound for the first time.&lt;br /&gt;
|}&lt;br /&gt;
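A concrete, classical reference point for the heavy-hitters talk above: the Misra-Gries summary is the textbook single-site subroutine for finding frequent items. This sketch is illustrative only and is not the algorithm of the talk; in the coordinator model one can imagine each site maintaining such a summary and shipping its counters to the coordinator.

```python
def misra_gries(stream, k):
    """Classic Misra-Gries summary.

    Keeps at most k-1 counters; every item whose true frequency exceeds
    len(stream)/k is guaranteed to survive as a candidate heavy hitter.
    """
    counters = {}
    for x in stream:
        if x in counters:
            counters[x] += 1
        elif k - 1 > len(counters):
            counters[x] = 1
        else:
            # Decrement every counter; drop counters that reach zero.
            for key in list(counters):
                counters[key] -= 1
                if counters[key] == 0:
                    del counters[key]
    return counters

# 'a' occurs 6 times out of 10, more than n/k = 5, so it must survive.
print(sorted(misra_gries(list("aaaaaabcde"), 2)))  # → ['a']
```

A second pass over the data (or a second round of communication) can then verify the exact counts of the surviving candidates.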
&lt;br /&gt;
== Getting to the Campus ==&lt;br /&gt;
*Campus entry: &amp;lt;strike&amp;gt;Off-campus guests may register at the gate by giving the security staff the name of the conference.&amp;lt;/strike&amp;gt; &amp;lt;font color=red&amp;gt;[[Media:智软院访客预约流程.pdf|&#039;&#039;&#039;Due to a change in university policy, visitors must now register for campus entry via the WeChat official account.&#039;&#039;&#039;]]&amp;lt;/font&amp;gt;&lt;br /&gt;
**Follow the WeChat official account “南京大学信息门户”, then tap “i校园” - “访客通行” at the bottom left of the portal homepage.&lt;br /&gt;
**Approver affiliation: 智能软件与工程学院 (School of Intelligent Software and Engineering); approver name: 石会.&lt;br /&gt;
**If driving onto campus, enter your plate number under “随行车辆车牌”.&lt;br /&gt;
*High-speed rail&lt;br /&gt;
**苏州站 (Suzhou Railway Station): about 30 minutes to campus by taxi outside rush hour, roughly ¥30; or take Express Line 3, or the metro plus the tram, about 2 hours in total.&lt;br /&gt;
**苏州新区站 (Suzhou Xinqu Station): about 25 minutes by taxi, roughly ¥25; or Tram Line 2, about 1 hour.&lt;br /&gt;
*By air&lt;br /&gt;
**无锡硕放机场 (Sunan Shuofang International Airport, WUX): about 30 minutes by taxi; because the trip crosses city limits, drivers may add a return/empty-trip fee, about ¥80 in total. Ridesharing, if available, is cheaper. Public transit takes about 2 hours.&lt;br /&gt;
**上海虹桥机场 (Shanghai Hongqiao Airport, SHA): transfer at Hongqiao Railway Station to a high-speed train to 苏州站 or 苏州新区站. Note that the last high-speed train from Shanghai Hongqiao to Suzhou usually departs at 21:42. Taking a taxi directly from Hongqiao to Suzhou is not recommended (it is expensive), nor is ridesharing, which usually only turns up unlicensed cabs.&lt;br /&gt;
&lt;br /&gt;
== Accommodation Suggestion ==&lt;br /&gt;
￥￥￥ 苏州科技城源宿酒店&amp;lt;br/&amp;gt;&lt;br /&gt;
￥￥ 南大国际学术交流中心 (on-campus hotel, good value)、苏州科技城万达美华酒店、全季苏州科技城酒店、苏州高新区科技城亚朵酒店&amp;lt;br/&amp;gt;&lt;br /&gt;
￥ 格林豪泰苏州市科技城商务酒店、宜必思尚品苏州科技城酒店、如家精选-苏州乐园高新区科技城店&lt;br /&gt;
&lt;br /&gt;
== Lunch &amp;amp; Supper ==&lt;br /&gt;
[[File:苏州校区食堂（2025）.png|thumb|The four canteens on the Suzhou campus are marked by red stars]]&lt;br /&gt;
* The Suzhou campus currently has four canteens: the 科创大厦 canteen and Canteens 16, 17, and 18. Vegetable dishes cost ¥2-3 and meat dishes ¥4-8, payable directly with Alipay or WeChat Pay. The International Academic Exchange Center (国际学术交流中心) also offers pricier food.&lt;br /&gt;
* Commercial areas near the campus include 东渚镇, 文体中心, 丰茂里, 时尚水岸星悦荟, and 星悦里.&lt;br /&gt;
* Food delivery is also an option; orders are left at the delivery lockers or shelves by the campus gate.&lt;br /&gt;
&lt;br /&gt;
== Getting Around ==&lt;br /&gt;
* 大阳山 National Forest Park &amp;amp; Botanical Garden: forested trails plus temples; plan 60-90 minutes for the climb depending on fitness; best views on clear autumn and winter days.&lt;br /&gt;
* 树山生态村 (Shushan eco-village): country trails, tea plantations, and farmhouse cuisine; a top choice for group dinners and walks.&lt;br /&gt;
* Taihu wetlands / 西山 area: more convenient by car; wetland and lake scenery.&lt;br /&gt;
* Old-town gardens: take a taxi in the evening to 平江路 or 山塘街 for the night views, or visit the Suzhou Museum and 拙政园 (the Humble Administrator&#039;s Garden) during the day.&lt;br /&gt;
&lt;br /&gt;
== About Suzhou Campus ==&lt;br /&gt;
The Nanjing University Suzhou Campus is located in Taihu Science City, Suzhou High-tech District, at the intersection of the around-Taihu science and innovation circle and the Shanghai-Nanjing industrial innovation belt, and is positioned as NJU&#039;s main base for developing its new engineering disciplines. Oriented toward national strategy and world-class standards, and built on strong partnerships that match needs with needs, the campus focuses on bottleneck problems in artificial intelligence, next-generation information technology, new energy, advanced manufacturing, and life and health; strengthens its “new engineering” programs; and promotes the integration of the humanities, sciences, engineering, and medicine along with coordinated development among government, industry, academia, and research.&lt;br /&gt;
&lt;br /&gt;
== Contact ==&lt;br /&gt;
刘明谋： lmm@nju.edu.cn&amp;lt;br/&amp;gt;&lt;br /&gt;
[[Media:2025年Suzhou Theory Day邀请函.pdf|邀请函.pdf]]&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=Theory@Suzhou_2025&amp;diff=13411</id>
		<title>Theory@Suzhou 2025</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=Theory@Suzhou_2025&amp;diff=13411"/>
		<updated>2025-11-29T03:37:09Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: /* Announcement */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:2025 SuZhou Theory Day poster.png|200px|thumb|Event poster]]&lt;br /&gt;
==General Information ==&lt;br /&gt;
[[File:苏教楼D.png|thumb|苏教楼D is marked by the red star]]&lt;br /&gt;
*&#039;&#039;&#039;&amp;lt;font size=4&amp;gt;Sunday, Nov 30, 2025: 09:00 -- 18:00.&amp;lt;/font&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
* &amp;lt;font size=4&amp;gt;&#039;&#039;&#039;Location&#039;&#039;&#039;: 南京大学苏州校区&amp;lt;/font&amp;gt;&lt;br /&gt;
* &amp;lt;font size=4&amp;gt;&#039;&#039;&#039;Venue&#039;&#039;&#039;: 苏教楼D202&amp;lt;/font&amp;gt;&lt;br /&gt;
[https://zcc.nju.edu.cn/DFS//file/2024/09/20/202409201037042506uv3mq.pdf Suzhou campus map]&lt;br /&gt;
&lt;br /&gt;
==Announcement==&lt;br /&gt;
&amp;lt;font size=4 color=red&amp;gt;&#039;&#039;&#039;Due to a change in university policy, visitors must register for campus entry via the WeChat official account.&#039;&#039;&#039;&amp;lt;/font&amp;gt;&lt;br /&gt;
*Follow the WeChat official account “南京大学信息门户”, then tap “i校园” - “访客通行” at the bottom left of the portal homepage.&lt;br /&gt;
*Approver affiliation: 智能软件与工程学院 (School of Intelligent Software and Engineering); approver name: 石会.&lt;br /&gt;
*If driving onto campus, enter your plate number under “随行车辆车牌”.&lt;br /&gt;
&lt;br /&gt;
==Speakers (in alphabetical order)==&lt;br /&gt;
* [http://staff.ustc.edu.cn/~xuechen1989/ 陈雪] (University of Science and Technology of China)&lt;br /&gt;
* [https://zengfenghuang.github.io/ 黄增峰] (Fudan University)&lt;br /&gt;
* [https://www.shaofengjiang.cn/ 姜少峰] (Peking University)&lt;br /&gt;
* [https://chaoxu.prof/ 许超] (University of Electronic Science and Technology of China)&lt;br /&gt;
* [https://chihaozhang.com/ 张驰豪] (Shanghai Jiao Tong University)&lt;br /&gt;
* [https://scholar.google.com/citations?user=TydhZfgAAAAJ 张瀚文] (University of Copenhagen)&lt;br /&gt;
* [https://zhangty12.github.io/ 张天翼] (Nanjing University)&lt;br /&gt;
&lt;br /&gt;
== Join us==&lt;br /&gt;
&#039;&#039;&#039;No registration is required.&#039;&#039;&#039;&lt;br /&gt;
The event covers topics including approximation algorithms, graph algorithms, computational geometry, theoretical machine learning, probability and sampling algorithms, and streaming and distributed algorithms. All students and faculty interested in theoretical computer science are welcome to attend.&amp;lt;br/&amp;gt;&lt;br /&gt;
Please [https://docs.qq.com/form/page/DS0JxdW5yZHZPYWtF &#039;&#039;&#039;fill out a short questionnaire&#039;&#039;&#039;] so that we can estimate attendance, prepare food for the coffee breaks, and adjust the lecture hall.&lt;br /&gt;
&lt;br /&gt;
== Program ==&lt;br /&gt;
:{|border=&amp;quot;2&amp;quot; width=&amp;quot;100%&amp;quot; cellspacing=&amp;quot;4&amp;quot; cellpadding=&amp;quot;3&amp;quot; rules=&amp;quot;all&amp;quot; style=&amp;quot;margin:1em 1em 1em 0; border:solid 1px #AAAAAA; border-collapse:collapse;empty-cells:show;&amp;quot; &lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#A7C1F2&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Workshop Program&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;width: 140px;&amp;quot; align=&amp;quot;center&amp;quot;|09:00 - 09:50&lt;br /&gt;
|style=&amp;quot;width: 180px;&amp;quot; align=&amp;quot;center&amp;quot;|张天翼&amp;lt;br/&amp;gt;&lt;br /&gt;
Nanjing University&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Approximate Light Spanners in Planar Graphs&amp;lt;/font&amp;gt;&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: Althöfer et al. (DCG 1993) introduced the greedy spanner and proved that for any edge-weighted planar graph &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;, its greedy &amp;lt;math&amp;gt;(1+\epsilon)&amp;lt;/math&amp;gt;-spanner has total weight at most &amp;lt;math&amp;gt;\left(1 + \frac{2}{\epsilon}\right) \cdot w(\mathrm{MST}(G))&amp;lt;/math&amp;gt;, where &amp;lt;math&amp;gt;w(\mathrm{MST}(G))&amp;lt;/math&amp;gt; denotes the weight of a minimum spanning tree &amp;lt;math&amp;gt;\mathrm{MST}(G)&amp;lt;/math&amp;gt; of &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;. This bound is existentially tight: there are planar graphs &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; whose every &amp;lt;math&amp;gt;(1+\epsilon)&amp;lt;/math&amp;gt;-spanner has weight at least &amp;lt;math&amp;gt;\left(1 + \frac{2}{\epsilon}\right) \cdot w(\mathrm{MST}(G))&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;From the viewpoint of approximation algorithms, however, the weight approximation factor of the greedy spanner essentially matches this existential lower bound, even in the bicriteria sense: there are planar graphs &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; such that, for every parameter &amp;lt;math&amp;gt;1 \leq x = O(\epsilon^{-1/2})&amp;lt;/math&amp;gt;, the greedy &amp;lt;math&amp;gt;(1 + x\epsilon)&amp;lt;/math&amp;gt;-spanner has weight &amp;lt;math&amp;gt;\Omega\left(\frac{1}{\epsilon \cdot x^2} \cdot w(G_{\mathrm{opt},\epsilon})\right)&amp;lt;/math&amp;gt;, where &amp;lt;math&amp;gt;G_{\mathrm{opt},\epsilon}&amp;lt;/math&amp;gt; is a minimum-weight &amp;lt;math&amp;gt;(1+\epsilon)&amp;lt;/math&amp;gt;-spanner of &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;Despite the abundance of work on approximation algorithms for spanners over the past three decades, no approximation algorithm (even a bicriteria one) was known that constructs, on weighted planar graphs, light spanners beating this existential lower bound.&amp;lt;br/&amp;gt;As our main contribution, we give a dynamic-programming algorithm on planar graphs that constructs, in any weighted planar graph &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;, a &amp;lt;math&amp;gt;\left(1 + \epsilon \cdot 2^{O(\log^* 1/\epsilon)}\right)&amp;lt;/math&amp;gt;-spanner of total weight &amp;lt;math&amp;gt;O(1) \cdot w(G_{\mathrm{opt},\epsilon})&amp;lt;/math&amp;gt;. We also prove that computing a minimum-weight planar spanner exactly is NP-hard.&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;background: silver;&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Coffee Break (09:50 – 10:15)&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|10:15 – 11:05&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|姜少峰&amp;lt;br/&amp;gt;&lt;br /&gt;
Peking University&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Local Search for Clustering in Almost-linear Time&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We propose the first local search algorithm for Euclidean clustering that attains an &amp;lt;math&amp;gt;O(1)&amp;lt;/math&amp;gt;-approximation in almost-linear time. Specifically, for Euclidean k-Means, our algorithm achieves an &amp;lt;math&amp;gt;O(c)&amp;lt;/math&amp;gt;-approximation in &amp;lt;math&amp;gt;\tilde{O}(n^{1 + 1 / c})&amp;lt;/math&amp;gt; time, for any constant &amp;lt;math&amp;gt;c \ge 1&amp;lt;/math&amp;gt;, maintaining the same running time as the previous (non-local-search-based) approach [la Tour and Saulpic, arXiv:2407.11217] while improving the approximation factor from &amp;lt;math&amp;gt;O(c^{6})&amp;lt;/math&amp;gt; to &amp;lt;math&amp;gt;O(c)&amp;lt;/math&amp;gt;. The algorithm generalizes to any metric space with sparse spanners, delivering efficient constant-factor approximation in &amp;lt;math&amp;gt;\ell_p&amp;lt;/math&amp;gt; metrics, doubling metrics, Jaccard metrics, etc.&amp;lt;br/&amp;gt; This generality derives from our main technical contribution: a local search algorithm on general graphs that obtains an &amp;lt;math&amp;gt;O(1)&amp;lt;/math&amp;gt;-approximation in almost-linear time. We establish this through a new &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;-swap local search framework featuring a novel swap selection rule. At a high level, this rule “scores” every possible swap, based on both its modification to the clustering and its improvement to the clustering objective, and then selects those high-scoring swaps. To implement this, we design a new data structure for maintaining approximate nearest neighbors with amortized guarantees tailored to our framework.&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|11:10 – 12:00&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|陈雪&amp;lt;br/&amp;gt;&lt;br /&gt;
University of Science and Technology of China&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Algorithms for Sparse LPN and LSPN Against Low-noise&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We consider sparse variants of the classical Learning Parities with random Noise (LPN) problem. Our main contribution is a new algorithmic framework that provides learning algorithms against low noise for both the Learning Sparse Parities (LSPN) problem and the sparse LPN problem. Different from previous approaches for LSPN and sparse LPN, this framework has a simple structure and runs in polynomial space. Let &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; be the dimension, &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; denote the sparsity, and &amp;lt;math&amp;gt;\eta&amp;lt;/math&amp;gt; be the noise rate.&amp;lt;br/&amp;gt;As a fundamental problem in computational learning theory, Learning Sparse Parities with Noise (LSPN) assumes the hidden parity is &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-sparse. While a simple enumeration algorithm takes &amp;lt;math&amp;gt;{n \choose k}=O((n/k)^k)&amp;lt;/math&amp;gt; time, previously known results still need &amp;lt;math&amp;gt;{n \choose k/2} = \Omega((n/k)^{k/2})&amp;lt;/math&amp;gt; time for any noise rate &amp;lt;math&amp;gt;\eta&amp;lt;/math&amp;gt;. Our framework provides an LSPN algorithm that runs in time &amp;lt;math&amp;gt;O((\eta \cdot n/k)^k)&amp;lt;/math&amp;gt; for any noise rate &amp;lt;math&amp;gt;\eta&amp;lt;/math&amp;gt;, which improves the state of the art for LSPN whenever &amp;lt;math&amp;gt;\eta \in ( k/n,\sqrt{k/n})&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;The sparse LPN problem is closely related to the classical problem of refuting random &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-CSPs and has been widely used in cryptography as a hardness assumption. Different from standard LPN, it samples random &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-sparse vectors. Because the number of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-sparse vectors is &amp;lt;math&amp;gt;{n \choose k} &amp;lt; n^k&amp;lt;/math&amp;gt;, sparse LPN admits polynomial-time learning algorithms when &amp;lt;math&amp;gt;m&amp;gt;n^{k/2}&amp;lt;/math&amp;gt;. However, much less is known about learning algorithms for constant &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;, such as &amp;lt;math&amp;gt;3&amp;lt;/math&amp;gt;, with &amp;lt;math&amp;gt;m&amp;lt;n^{k/2}&amp;lt;/math&amp;gt; samples, apart from the Gaussian elimination algorithm, which takes &amp;lt;math&amp;gt;e^{\eta n}&amp;lt;/math&amp;gt; time. Our framework provides a learning algorithm in &amp;lt;math&amp;gt;e^{O(\eta \cdot n^{\frac{\delta+1}{2}})}&amp;lt;/math&amp;gt; time given &amp;lt;math&amp;gt;\delta \in (0,1)&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;m \approx n^{1+(1-\delta)\cdot \frac{k-1}{2}}&amp;lt;/math&amp;gt; samples. This improves on previous learning algorithms. For example, in the classical setting of &amp;lt;math&amp;gt;k=3&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;m=n^{1.4}&amp;lt;/math&amp;gt;, our algorithm is faster than previous approaches for any &amp;lt;math&amp;gt;\eta&amp;lt;n^{-0.7}&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;Based on joint work with Wenxuan Shu (USTC) and Zhaienhe Zhou (USTC).&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;background: silver;&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Lunch Break  (12:00 - 14:00)&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|14:00 – 14:50&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|张瀚文&amp;lt;br/&amp;gt;&lt;br /&gt;
University of Copenhagen&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Minimum Star Partitions of Simple Polygons in Polynomial Time &amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We design a polynomial-time algorithm that partitions a simple polygon P into a minimum number of star-shaped polygons. Whether such an algorithm exists has been asked for more than forty years and repeated many times, including in O&#039;Rourke&#039;s book &#039;&#039;Art Gallery Theorems and Algorithms&#039;&#039;. Previously known algorithms handle only special cases, such as monotone rectilinear polygons or the setting where Steiner points are disallowed, and fall far short of the general case. The covering variant in which the star-shaped pieces may overlap, namely the famous art gallery problem, was shown in 2018 to be ∃ℝ-complete, and is therefore likely harder than NP. Beyond its theoretical value, star partition also has practical applications such as CNC pocket milling, robot motion planning, and shape parameterization.&amp;lt;br/&amp;gt;In this talk, I will focus on the intuition, reasoning, and discoveries that led us to the solution, offering an immersive account of our experience in this research.&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|14:55 – 15:45&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|许超&amp;lt;br/&amp;gt;&lt;br /&gt;
University of Electronic Science and Technology of China&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: An Optimal Algorithm for the Stacker Crane Problem on Fixed Topologies&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: The Stacker Crane Problem (SCP) is a variant of the Traveling Salesman Problem. In SCP, pairs of pickup and delivery points are designated on a graph, and a crane must visit these points to move objects from each pickup location to its respective delivery point. The goal is to minimize the total distance traveled. SCP is known to be NP-hard, even on trees. The only positive results, in terms of polynomial-time solvability, apply to graphs that are topologically equivalent to a path or a cycle. We propose an algorithm that is optimal for each fixed topology, running in near-linear time. This is achieved by demonstrating that the problem is fixed-parameter tractable (FPT) when parameterized by both the cycle rank and the number of branch vertices.&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;background: silver;&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Coffee Break (15:45 – 16:15)&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|16:15 – 17:05&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|张驰豪&amp;lt;br/&amp;gt;&lt;br /&gt;
Shanghai Jiao Tong University&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Sampling from non-log-concave distributions&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: Sampling from a d-dimensional distribution &amp;lt;math&amp;gt;\mu&amp;lt;/math&amp;gt; with density &amp;lt;math&amp;gt;p_{\mu}(x) \propto e^{-V(x)}&amp;lt;/math&amp;gt; is a central problem in many areas, including theoretical computer science, statistical physics, and machine learning. It is well-known that when the potential function &amp;lt;math&amp;gt;V(x)&amp;lt;/math&amp;gt; is &#039;&#039;convex&#039;&#039; (or equivalently, when &amp;lt;math&amp;gt;p_{\mu}&amp;lt;/math&amp;gt; is &#039;&#039;log-concave&#039;&#039;), or more generally, when &amp;lt;math&amp;gt;\mu&amp;lt;/math&amp;gt; satisfies good isoperimetric inequalities, efficient sampling algorithms exist in various computational models. A common belief is that the sampling task becomes more difficult when &amp;lt;math&amp;gt;V(x)&amp;lt;/math&amp;gt; is &#039;&#039;non-convex&#039;&#039;. On the other hand, data-based algorithms (e.g., denoising diffusion probabilistic models) developed in the machine learning community are very successful in practice when dealing with highly non-log-concave distributions (such as in image generation), and provide new insights into designing efficient sampling algorithms. &amp;lt;br/&amp;gt; In this talk, we will start with a general tight (exponential) sampling complexity bound for any &#039;&#039;non-log-concave&#039;&#039; distribution &amp;lt;math&amp;gt;\mu&amp;lt;/math&amp;gt; satisfying mild regularity conditions. Then, we will show how a common strengthening of these regularity conditions leads to an efficient (polynomial) sampling algorithm. Finally, we will discuss future directions for understanding the complexity of sampling from general distributions.&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|17:10 – 18:00&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|黄增峰&amp;lt;br/&amp;gt;&lt;br /&gt;
Fudan University&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Simple and Optimal Algorithms for Heavy Hitters and Frequency Moments in Distributed Models&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We consider the problems of distributed heavy hitters and frequency moments in both the coordinator model and the distributed tracking model, and present simple and optimal algorithms for both problems in these models. For &amp;lt;math&amp;gt;\ell_p&amp;lt;/math&amp;gt; heavy hitters in the coordinator model, our algorithm requires only one round and uses &amp;lt;math&amp;gt;\tilde{O}(k^{p-1}/\epsilon^p)&amp;lt;/math&amp;gt; bits of communication. For &amp;lt;math&amp;gt;p &amp;gt; 2&amp;lt;/math&amp;gt;, this is the first near-optimal result. By combining our algorithm with the standard recursive sketching technique, we obtain a near-optimal two-round algorithm for &amp;lt;math&amp;gt;F_p&amp;lt;/math&amp;gt; in the coordinator model, matching a significant result from recent work by Esfandiari et al. (STOC 2024). Our algorithm and analysis are much simpler and have better cost with respect to logarithmic factors. Due to the simplicity of our heavy hitter algorithms, we manage to adapt them to the distributed tracking model with only a &amp;lt;math&amp;gt;\mathrm{polylog}(n)&amp;lt;/math&amp;gt; increase in communication. This yields the first near-optimal algorithm for heavy hitters in the distributed tracking model. By applying the recursive sketching technique, we also provide the first near-optimal algorithm for &amp;lt;math&amp;gt;F_p&amp;lt;/math&amp;gt; in the distributed tracking model for all &amp;lt;math&amp;gt;p \geq 2&amp;lt;/math&amp;gt;. Even for &amp;lt;math&amp;gt;F_2&amp;lt;/math&amp;gt;, our result improves upon the bounds established by Cormode, Muthukrishnan, and Yi (SODA 2008) and Woodruff and Zhang (STOC 2012), nearly matching the existing lower bound for the first time.&lt;br /&gt;
|}&lt;br /&gt;
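For readers new to light spanners, the greedy construction of Althöfer et al. that the opening talk uses as its baseline admits a very short implementation. This is a minimal Python sketch under the standard definition, not the dynamic-programming algorithm of the talk; `spanner_distance` is a helper name introduced here, and the distance test is a plain bounded Dijkstra.

```python
import heapq

def spanner_distance(adj, u, v, limit):
    # Bounded Dijkstra from u: returns the u-v distance if it is at most
    # `limit`, and float("inf") otherwise.
    dist = {u: 0.0}
    pq = [(0.0, u)]
    while pq:
        d, x = heapq.heappop(pq)
        if d > dist.get(x, float("inf")):
            continue  # stale queue entry
        if x == v:
            return d
        for y, w in adj.get(x, ()):
            nd = d + w
            if dist.get(y, float("inf")) > nd and limit >= nd:
                dist[y] = nd
                heapq.heappush(pq, (nd, y))
    return float("inf")

def greedy_spanner(edges, eps):
    # Greedy (1+eps)-spanner (Althöfer et al., DCG 1993): scan edges by
    # nondecreasing weight; keep (u, v, w) only if the spanner built so far
    # has no u-v path of length at most (1+eps)*w.
    adj = {}
    kept = []
    for u, v, w in sorted(edges, key=lambda e: e[2]):
        if spanner_distance(adj, u, v, (1 + eps) * w) > (1 + eps) * w:
            kept.append((u, v, w))
            adj.setdefault(u, []).append((v, w))
            adj.setdefault(v, []).append((u, w))
    return kept

# The heavy edge of this triangle is redundant and gets dropped.
print(greedy_spanner([(0, 1, 1.0), (1, 2, 1.0), (0, 2, 2.0)], 0.1))
# → [(0, 1, 1.0), (1, 2, 1.0)]
```

The talk's point is that while the greedy spanner's weight bound is existentially tight, it can be far from the instance-optimal weight, which motivates the bicriteria approximation question.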
&lt;br /&gt;
== Getting to the Campus ==&lt;br /&gt;
*Campus entry: &amp;lt;strike&amp;gt;Off-campus guests may register at the gate by giving the security staff the name of the conference.&amp;lt;/strike&amp;gt; &amp;lt;font color=red&amp;gt;[[Media:智软院访客预约流程.pdf|&#039;&#039;&#039;Due to a change in university policy, visitors must now register for campus entry via the WeChat official account.&#039;&#039;&#039;]]&amp;lt;/font&amp;gt;&lt;br /&gt;
**Follow the WeChat official account “南京大学信息门户”, then tap “i校园” - “访客通行” at the bottom left of the portal homepage.&lt;br /&gt;
**Approver affiliation: 智能软件与工程学院 (School of Intelligent Software and Engineering); approver name: 石会.&lt;br /&gt;
*High-speed rail&lt;br /&gt;
**苏州站 (Suzhou Railway Station): about 30 minutes to campus by taxi outside rush hour, roughly ¥30; or take Express Line 3, or the metro plus the tram, about 2 hours in total.&lt;br /&gt;
**苏州新区站 (Suzhou Xinqu Station): about 25 minutes by taxi, roughly ¥25; or Tram Line 2, about 1 hour.&lt;br /&gt;
*By air&lt;br /&gt;
**无锡硕放机场 (Sunan Shuofang International Airport, WUX): about 30 minutes by taxi; because the trip crosses city limits, drivers may add a return/empty-trip fee, about ¥80 in total. Ridesharing, if available, is cheaper. Public transit takes about 2 hours.&lt;br /&gt;
**上海虹桥机场 (Shanghai Hongqiao Airport, SHA): transfer at Hongqiao Railway Station to a high-speed train to 苏州站 or 苏州新区站. Note that the last high-speed train from Shanghai Hongqiao to Suzhou usually departs at 21:42. Taking a taxi directly from Hongqiao to Suzhou is not recommended (it is expensive), nor is ridesharing, which usually only turns up unlicensed cabs.&lt;br /&gt;
&lt;br /&gt;
== Accommodation Suggestion ==&lt;br /&gt;
￥￥￥ 苏州科技城源宿酒店&amp;lt;br/&amp;gt;&lt;br /&gt;
￥￥ 南大国际学术交流中心 (on-campus hotel, good value)、苏州科技城万达美华酒店、全季苏州科技城酒店、苏州高新区科技城亚朵酒店&amp;lt;br/&amp;gt;&lt;br /&gt;
￥ 格林豪泰苏州市科技城商务酒店、宜必思尚品苏州科技城酒店、如家精选-苏州乐园高新区科技城店&lt;br /&gt;
&lt;br /&gt;
== Lunch &amp;amp; Supper ==&lt;br /&gt;
[[File:苏州校区食堂（2025）.png|thumb|The four canteens on the Suzhou campus are marked by red stars]]&lt;br /&gt;
* The Suzhou campus currently has four canteens: the 科创大厦 canteen and Canteens 16, 17, and 18. Vegetable dishes cost ¥2-3 and meat dishes ¥4-8, payable directly with Alipay or WeChat Pay. The International Academic Exchange Center (国际学术交流中心) also offers pricier food.&lt;br /&gt;
* Commercial areas near the campus include 东渚镇, 文体中心, 丰茂里, 时尚水岸星悦荟, and 星悦里.&lt;br /&gt;
* Food delivery is also an option; orders are left at the delivery lockers or shelves by the campus gate.&lt;br /&gt;
&lt;br /&gt;
== Getting Around ==&lt;br /&gt;
* 大阳山 National Forest Park &amp;amp; Botanical Garden: forested trails plus temples; plan 60-90 minutes for the climb depending on fitness; best views on clear autumn and winter days.&lt;br /&gt;
* 树山生态村 (Shushan eco-village): country trails, tea plantations, and farmhouse cuisine; a top choice for group dinners and walks.&lt;br /&gt;
* Taihu wetlands / 西山 area: more convenient by car; wetland and lake scenery.&lt;br /&gt;
* Old-town gardens: take a taxi in the evening to 平江路 or 山塘街 for the night views, or visit the Suzhou Museum and 拙政园 (the Humble Administrator&#039;s Garden) during the day.&lt;br /&gt;
&lt;br /&gt;
== About Suzhou Campus ==&lt;br /&gt;
The Nanjing University Suzhou Campus is located in Taihu Science City, Suzhou High-tech District, at the intersection of the around-Taihu science and innovation circle and the Shanghai-Nanjing industrial innovation belt, and is positioned as NJU&#039;s main base for developing its new engineering disciplines. Oriented toward national strategy and world-class standards, and built on strong partnerships that match needs with needs, the campus focuses on bottleneck problems in artificial intelligence, next-generation information technology, new energy, advanced manufacturing, and life and health; strengthens its “new engineering” programs; and promotes the integration of the humanities, sciences, engineering, and medicine along with coordinated development among government, industry, academia, and research.&lt;br /&gt;
&lt;br /&gt;
== Contact ==&lt;br /&gt;
刘明谋： lmm@nju.edu.cn&amp;lt;br/&amp;gt;&lt;br /&gt;
[[Media:2025年Suzhou Theory Day邀请函.pdf|邀请函.pdf]]&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=Theory@Suzhou_2025&amp;diff=13410</id>
		<title>Theory@Suzhou 2025</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=Theory@Suzhou_2025&amp;diff=13410"/>
		<updated>2025-11-29T03:27:52Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:2025 SuZhou Theory Day poster.png|200px|thumb|Event poster]]&lt;br /&gt;
==General Information ==&lt;br /&gt;
[[File:苏教楼D.png|thumb|苏教楼D is marked by the red star]]&lt;br /&gt;
*&#039;&#039;&#039;&amp;lt;font size=4&amp;gt;Sunday, Nov 30, 2025: 09:00 -- 18:00.&amp;lt;/font&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
* &amp;lt;font size=4&amp;gt;&#039;&#039;&#039;Location&#039;&#039;&#039;: 南京大学苏州校区&amp;lt;/font&amp;gt;&lt;br /&gt;
* &amp;lt;font size=4&amp;gt;&#039;&#039;&#039;Venue&#039;&#039;&#039;: 苏教楼D202&amp;lt;/font&amp;gt;&lt;br /&gt;
[https://zcc.nju.edu.cn/DFS//file/2024/09/20/202409201037042506uv3mq.pdf Suzhou campus map]&lt;br /&gt;
&lt;br /&gt;
==Announcement==&lt;br /&gt;
&amp;lt;font size=4 color=red&amp;gt;&#039;&#039;&#039;Due to a change in university policy, visitors must register for campus entry via the WeChat official account.&#039;&#039;&#039;&amp;lt;/font&amp;gt;&lt;br /&gt;
*Follow the WeChat official account “南京大学信息门户”, then tap “i校园” - “访客通行” at the bottom left of the portal homepage.&lt;br /&gt;
*Approver affiliation: 智能软件与工程学院 (School of Intelligent Software and Engineering); approver name: 石会.&lt;br /&gt;
&lt;br /&gt;
==Speakers (in alphabetical order)==&lt;br /&gt;
* [http://staff.ustc.edu.cn/~xuechen1989/ 陈雪] (University of Science and Technology of China)&lt;br /&gt;
* [https://zengfenghuang.github.io/ 黄增峰] (Fudan University)&lt;br /&gt;
* [https://www.shaofengjiang.cn/ 姜少峰] (Peking University)&lt;br /&gt;
* [https://chaoxu.prof/ 许超] (University of Electronic Science and Technology of China)&lt;br /&gt;
* [https://chihaozhang.com/ 张驰豪] (Shanghai Jiao Tong University)&lt;br /&gt;
* [https://scholar.google.com/citations?user=TydhZfgAAAAJ 张瀚文] (University of Copenhagen)&lt;br /&gt;
* [https://zhangty12.github.io/ 张天翼] (Nanjing University)&lt;br /&gt;
&lt;br /&gt;
== Join us==&lt;br /&gt;
&#039;&#039;&#039;No registration is required.&#039;&#039;&#039;&lt;br /&gt;
The event covers topics including approximation algorithms, graph algorithms, computational geometry, theoretical machine learning, probability and sampling algorithms, and streaming and distributed algorithms. All students and faculty interested in theoretical computer science are welcome to attend.&amp;lt;br/&amp;gt;&lt;br /&gt;
Please [https://docs.qq.com/form/page/DS0JxdW5yZHZPYWtF &#039;&#039;&#039;fill out a short questionnaire&#039;&#039;&#039;] so that we can estimate attendance, prepare food for the coffee breaks, and adjust the lecture hall.&lt;br /&gt;
&lt;br /&gt;
== Program ==&lt;br /&gt;
:{|border=&amp;quot;2&amp;quot; width=&amp;quot;100%&amp;quot; cellspacing=&amp;quot;4&amp;quot; cellpadding=&amp;quot;3&amp;quot; rules=&amp;quot;all&amp;quot; style=&amp;quot;margin:1em 1em 1em 0; border:solid 1px #AAAAAA; border-collapse:collapse;empty-cells:show;&amp;quot; &lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#A7C1F2&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Workshop Program&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;width: 140px;&amp;quot; align=&amp;quot;center&amp;quot;|09:00 - 09:50&lt;br /&gt;
|style=&amp;quot;width: 180px;&amp;quot; align=&amp;quot;center&amp;quot;|张天翼&amp;lt;br/&amp;gt;&lt;br /&gt;
Nanjing University&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Approximate Light Spanners in Planar Graphs&amp;lt;/font&amp;gt;&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: Althöfer et al. (DCG 1993) introduced the greedy spanner and proved that for any edge-weighted planar graph &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;, its greedy &amp;lt;math&amp;gt;(1+\epsilon)&amp;lt;/math&amp;gt;-spanner has total weight at most &amp;lt;math&amp;gt;\left(1 + \frac{2}{\epsilon}\right) \cdot w(\mathrm{MST}(G))&amp;lt;/math&amp;gt;, where &amp;lt;math&amp;gt;w(\mathrm{MST}(G))&amp;lt;/math&amp;gt; denotes the weight of a minimum spanning tree &amp;lt;math&amp;gt;\mathrm{MST}(G)&amp;lt;/math&amp;gt; of &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;. This bound is existentially tight: there are planar graphs &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; whose every &amp;lt;math&amp;gt;(1+\epsilon)&amp;lt;/math&amp;gt;-spanner has weight at least &amp;lt;math&amp;gt;\left(1 + \frac{2}{\epsilon}\right) \cdot w(\mathrm{MST}(G))&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;From the viewpoint of approximation algorithms, however, the weight approximation factor of the greedy spanner essentially matches this existential lower bound, even in the bicriteria sense: there are planar graphs &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; such that, for every parameter &amp;lt;math&amp;gt;1 \leq x = O(\epsilon^{-1/2})&amp;lt;/math&amp;gt;, the greedy &amp;lt;math&amp;gt;(1 + x\epsilon)&amp;lt;/math&amp;gt;-spanner has weight &amp;lt;math&amp;gt;\Omega\left(\frac{1}{\epsilon \cdot x^2} \cdot w(G_{\mathrm{opt},\epsilon})\right)&amp;lt;/math&amp;gt;, where &amp;lt;math&amp;gt;G_{\mathrm{opt},\epsilon}&amp;lt;/math&amp;gt; is a minimum-weight &amp;lt;math&amp;gt;(1+\epsilon)&amp;lt;/math&amp;gt;-spanner of &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;Despite the abundance of work on approximation algorithms for spanners over the past three decades, no approximation algorithm (even a bicriteria one) was known that constructs, on weighted planar graphs, light spanners beating this existential lower bound.&amp;lt;br/&amp;gt;As our main contribution, we give a dynamic-programming algorithm on planar graphs that constructs, in any weighted planar graph &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;, a &amp;lt;math&amp;gt;\left(1 + \epsilon \cdot 2^{O(\log^* 1/\epsilon)}\right)&amp;lt;/math&amp;gt;-spanner of total weight &amp;lt;math&amp;gt;O(1) \cdot w(G_{\mathrm{opt},\epsilon})&amp;lt;/math&amp;gt;. We also prove that computing a minimum-weight planar spanner exactly is NP-hard.&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;background: silver;&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Coffee Break (09:50 – 10:15)&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|10:15 – 11:05&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|姜少峰&amp;lt;br/&amp;gt;&lt;br /&gt;
Peking University&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Local Search for Clustering in Almost-linear Time&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We propose the first local search algorithm for Euclidean clustering that attains an &amp;lt;math&amp;gt;O(1)&amp;lt;/math&amp;gt;-approximation in almost-linear time. Specifically, for Euclidean k-Means, our algorithm achieves an &amp;lt;math&amp;gt;O(c)&amp;lt;/math&amp;gt;-approximation in &amp;lt;math&amp;gt;\tilde{O}(n^{1 + 1 / c})&amp;lt;/math&amp;gt; time, for any constant &amp;lt;math&amp;gt;c \ge 1&amp;lt;/math&amp;gt;, maintaining the same running time as the previous (non-local-search-based) approach [la Tour and Saulpic, arXiv:2407.11217] while improving the approximation factor from &amp;lt;math&amp;gt;O(c^{6})&amp;lt;/math&amp;gt; to &amp;lt;math&amp;gt;O(c)&amp;lt;/math&amp;gt;. The algorithm generalizes to any metric space with sparse spanners, delivering efficient constant-factor approximation in &amp;lt;math&amp;gt;\ell_p&amp;lt;/math&amp;gt; metrics, doubling metrics, Jaccard metrics, etc.&amp;lt;br/&amp;gt; This generality derives from our main technical contribution: a local search algorithm on general graphs that obtains an &amp;lt;math&amp;gt;O(1)&amp;lt;/math&amp;gt;-approximation in almost-linear time. We establish this through a new &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;-swap local search framework featuring a novel swap selection rule. At a high level, this rule “scores” every possible swap, based on both its modification to the clustering and its improvement to the clustering objective, and then selects those high-scoring swaps. To implement this, we design a new data structure for maintaining approximate nearest neighbors with amortized guarantees tailored to our framework.&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|11:10 – 12:00&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|陈雪&amp;lt;br/&amp;gt;&lt;br /&gt;
University of Science and Technology of China&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Algorithms for Sparse LPN and LSPN Against Low-noise&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We consider sparse variants of the classical Learning Parities with random Noise (LPN) problem. Our main contribution is a new algorithmic framework that provides learning algorithms against low-noise for both Learning Sparse Parities (LSPN) problem and sparse LPN problem. Different from previous approaches for LSPN and sparse LPN, this framework has a simple structure and runs in polynomial space. Let &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; be the dimension, &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; denote the sparsity, and &amp;lt;math&amp;gt;\eta&amp;lt;/math&amp;gt; be the noise rate.&amp;lt;br/&amp;gt;As a fundamental problem in computational learning theory, Learning Sparse Parities with Noise (LSPN) assumes the hidden parity is &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-sparse. While a simple enumeration algorithm takes &amp;lt;math&amp;gt;{n \choose k}=O((n/k)^k)&amp;lt;/math&amp;gt; time, previously known results still need &amp;lt;math&amp;gt;{n \choose k/2} = \Omega((n/k)^{k/2})&amp;lt;/math&amp;gt; time for any noise rate &amp;lt;math&amp;gt;\eta&amp;lt;/math&amp;gt;. Our framework provides a LSPN algorithm runs in time &amp;lt;math&amp;gt;O((\eta \cdot n/k)^k)&amp;lt;/math&amp;gt; for any noise rate &amp;lt;math&amp;gt;\eta&amp;lt;/math&amp;gt;, which improves the state-of-the-art of LSPN whenever &amp;lt;math&amp;gt;\eta \in ( k/n,\sqrt{k/n})&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;The sparse LPN problem is closely related to the classical problem of refuting random &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-CSP and has been widely used in cryptography as the hardness assumption. Different from the standard LPN, it samples random &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-sparse vectors. 
Because the number of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-sparse vectors is &amp;lt;math&amp;gt;{n \choose k} &amp;lt; n^k&amp;lt;/math&amp;gt;, sparse LPN admits polynomial-time learning algorithms when &amp;lt;math&amp;gt;m&amp;gt;n^{k/2}&amp;lt;/math&amp;gt;. However, much less is known for constant &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such as &amp;lt;math&amp;gt;3&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;m&amp;lt;n^{k/2}&amp;lt;/math&amp;gt; samples, apart from the Gaussian elimination algorithm, which takes &amp;lt;math&amp;gt;e^{\eta n}&amp;lt;/math&amp;gt; time. Our framework provides a learning algorithm running in &amp;lt;math&amp;gt;e^{O(\eta \cdot n^{\frac{\delta+1}{2}})}&amp;lt;/math&amp;gt; time given &amp;lt;math&amp;gt;\delta \in (0,1)&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;m \approx n^{1+(1-\delta)\cdot \frac{k-1}{2}}&amp;lt;/math&amp;gt; samples. This improves on previous learning algorithms. For example, in the classical setting of &amp;lt;math&amp;gt;k=3&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;m=n^{1.4}&amp;lt;/math&amp;gt;, our algorithm is faster than previous approaches for any &amp;lt;math&amp;gt;\eta&amp;lt;n^{-0.7}&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;Based on joint work with Wenxuan Shu (USTC) and Zhaienhe Zhou (USTC).&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;background: silver;&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Lunch Break  (12:00 - 14:00)&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|14:00 – 14:50&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|张瀚文&amp;lt;br/&amp;gt;&lt;br /&gt;
哥本哈根大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Minimum Star Partitions of Simple Polygons in Polynomial Time &amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We design a polynomial-time algorithm that partitions a simple polygon P into a minimum number of star-shaped polygons. Whether such an algorithm exists had been asked for more than forty years and repeated many times, including in O’Rourke’s book &#039;&#039;Art Gallery Theorems and Algorithms&#039;&#039;. Previously known algorithms handle only special cases, such as monotone rectilinear polygons or the setting where Steiner points are not allowed, and fall far short of the fully general case. The covering variant, which allows the star-shaped pieces to overlap (the famous art gallery problem), was shown in 2018 to be ∃ℝ-complete, and is therefore likely even harder than NP problems. Beyond its theoretical value, star partitioning also has practical applications such as CNC pocket milling, robot path planning, and shape parameterization.&amp;lt;br/&amp;gt;In this talk, I will focus on the intuition, reasoning, and discoveries behind our solution, giving an immersive account of our whole journey through this research.&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|14:55 – 15:45&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|许超&amp;lt;br/&amp;gt;&lt;br /&gt;
电子科技大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: An Optimal Algorithm for the Stacker Crane Problem on Fixed Topologies&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: The Stacker Crane Problem (SCP) is a variant of the Traveling Salesman Problem. In SCP, pairs of pickup and delivery points are designated on a graph, and a crane must visit these points to move objects from each pickup location to its respective delivery point. The goal is to minimize the total distance traveled. SCP is known to be NP-hard, even on trees. The only positive results, in terms of polynomial-time solvability, apply to graphs that are topologically equivalent to a path or a cycle. We propose an algorithm that is optimal for each fixed topology, running in near-linear time. This is achieved by demonstrating that the problem is fixed-parameter tractable (FPT) when parameterized by both the cycle rank and the number of branch vertices.&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;background: silver;&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Coffee Break (15:45 – 16:15)&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|16:15 – 17:05&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|张驰豪&amp;lt;br/&amp;gt;&lt;br /&gt;
上海交通大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Sampling from non-log-concave distributions&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: Sampling from a d-dimensional distribution &amp;lt;math&amp;gt;\mu&amp;lt;/math&amp;gt; with density &amp;lt;math&amp;gt;p_{\mu}(x) \propto e^{-V(x)}&amp;lt;/math&amp;gt; is a central problem in many areas, including theoretical computer science, statistical physics, and machine learning. It is well-known that when the potential function &amp;lt;math&amp;gt;V(x)&amp;lt;/math&amp;gt; is &#039;&#039;convex&#039;&#039; (or equivalently, when &amp;lt;math&amp;gt;p_{\mu}&amp;lt;/math&amp;gt; is &#039;&#039;log-concave&#039;&#039;), or more generally, when &amp;lt;math&amp;gt;\mu&amp;lt;/math&amp;gt; satisfies good isoperimetric inequalities, efficient sampling algorithms exist in various computational models. A common belief is that the sampling task becomes more difficult when &amp;lt;math&amp;gt;V(x)&amp;lt;/math&amp;gt; is &#039;&#039;non-convex&#039;&#039;. On the other hand, data-based algorithms (e.g., denoising diffusion probabilistic models) developed in the machine learning community are very successful in practice when dealing with highly non-log-concave distributions (such as in image generation), and provide new insights into designing efficient sampling algorithms. &amp;lt;br/&amp;gt; In this talk, we will start with a general tight (exponential) sampling complexity bound for any &#039;&#039;non-log-concave&#039;&#039; distribution &amp;lt;math&amp;gt;\mu&amp;lt;/math&amp;gt; satisfying mild regularity conditions. Then, we will show how a common strengthening of these regularity conditions leads to an efficient (polynomial) sampling algorithm. Finally, we will discuss future directions for understanding the complexity of sampling from general distributions.&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|17:10 – 18:00&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|黄增峰&amp;lt;br/&amp;gt;&lt;br /&gt;
复旦大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Simple and Optimal Algorithms for Heavy Hitters and Frequency Moments in Distributed Models&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We consider the problems of distributed heavy hitters and frequency moments in both the coordinator model and the distributed tracking model. We present simple and optimal algorithms for heavy hitters and frequency moment estimation in these distributed models. For &amp;lt;math&amp;gt;\ell_p&amp;lt;/math&amp;gt; heavy hitters in the coordinator model, our algorithm requires only one round and uses &amp;lt;math&amp;gt;\tilde{O}(k^{p-1}/\epsilon^p)&amp;lt;/math&amp;gt; bits of communication. For &amp;lt;math&amp;gt;p &amp;gt; 2&amp;lt;/math&amp;gt;, this is the first near-optimal result. By combining our algorithm with the standard recursive sketching technique, we obtain a near-optimal two-round algorithm for &amp;lt;math&amp;gt;F_p&amp;lt;/math&amp;gt; in the coordinator model, matching a significant result from recent work by Esfandiari et al. (STOC 2024). Our algorithm and analysis are much simpler and incur fewer logarithmic factors in the cost. Owing to the simplicity of our heavy hitter algorithms, we are able to adapt them to the distributed tracking model with only a &amp;lt;math&amp;gt;\mathrm{polylog}(n)&amp;lt;/math&amp;gt; increase in communication, yielding the first near-optimal heavy hitter algorithm in that model. By applying the recursive sketching technique, we also provide the first near-optimal algorithm for &amp;lt;math&amp;gt;F_p&amp;lt;/math&amp;gt; in the distributed tracking model for all &amp;lt;math&amp;gt;p \geq 2&amp;lt;/math&amp;gt;. Even for &amp;lt;math&amp;gt;F_2&amp;lt;/math&amp;gt;, our result improves upon the bounds established by Cormode, Muthukrishnan, and Yi (SODA 2008) and Woodruff and Zhang (STOC 2012), nearly matching the existing lower bound for the first time.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Getting to The Campus==&lt;br /&gt;
*Campus entry: &amp;lt;strike&amp;gt;Off-campus guests may register at the gate by giving security staff the name of the event.&amp;lt;/strike&amp;gt; &amp;lt;font color=red&amp;gt;[[Media:智软院访客预约流程.pdf|&#039;&#039;&#039;Due to a change in university policy, visitors must now register for campus entry via the WeChat official account.&#039;&#039;&#039;]]&amp;lt;/font&amp;gt;&lt;br /&gt;
**Follow the WeChat official account “南京大学信息门户”, then open “i校园” and “访客通行” at the bottom left of the portal home page&lt;br /&gt;
**Approving unit: 智能软件与工程学院 (School of Intelligent Software and Engineering). Approver: 石会&lt;br /&gt;
*High-speed rail&lt;br /&gt;
**苏州站 (Suzhou Railway Station): about 30 minutes to campus by taxi (outside rush hours), roughly ¥30. Alternatively, take Express Line 3, or the metro with a transfer to the tram, about 2 hours in total.&lt;br /&gt;
**苏州新区站 (Suzhou New District Station): about 25 minutes by taxi, roughly ¥25. You can also take Tram Line 2, about 1 hour.&lt;br /&gt;
*By air&lt;br /&gt;
**无锡硕放机场 (WUX): about 30 minutes to campus by taxi; because the trip crosses city boundaries, drivers may add a return/empty-trip charge, for a total of roughly ¥80. A ride-share (顺风车), if you can get one, is cheaper. Public transport is also possible, about 2 hours.&lt;br /&gt;
**上海虹桥机场 (SHA): we recommend transferring at the adjacent Hongqiao Railway Station to a high-speed train to 苏州站 or 苏州新区站. Note that the last high-speed train from Shanghai Hongqiao to Suzhou usually departs at 21:42. Taking a taxi directly from Hongqiao to Suzhou is not recommended (it is expensive), and neither is ride-sharing, since you will usually only get an unlicensed taxi.&lt;br /&gt;
&lt;br /&gt;
== Accommodation Suggestion ==&lt;br /&gt;
￥￥￥ 苏州科技城源宿酒店&amp;lt;br/&amp;gt;&lt;br /&gt;
￥￥ 南大国际学术交流中心（on-campus hotel, good value）、苏州科技城万达美华酒店、全季苏州科技城酒店、苏州高新区科技城亚朵酒店&amp;lt;br/&amp;gt;&lt;br /&gt;
￥ 格林豪泰苏州市科技城商务酒店、宜必思尚品苏州科技城酒店、如家精选-苏州乐园高新区科技城店&lt;br /&gt;
&lt;br /&gt;
== Lunch &amp;amp; Supper ==&lt;br /&gt;
[[File:苏州校区食堂（2025）.png|thumb|The four canteens on the Suzhou campus are marked with red stars]]&lt;br /&gt;
* The Suzhou campus has four canteens: the 科创大厦 canteen and Canteens 16, 17, and 18. Vegetable dishes cost ¥2-3 and meat dishes ¥4-8, payable directly with Alipay or WeChat Pay. The 国际学术交流中心 also serves more expensive food.&lt;br /&gt;
* Commercial areas near the campus include 东渚镇, 文体中心, 丰茂里, 时尚水岸星悦荟, and 星悦里.&lt;br /&gt;
* Food delivery is also an option; orders are dropped off at the delivery lockers or racks by the campus gate.&lt;br /&gt;
&lt;br /&gt;
== Getting Around ==&lt;br /&gt;
* 大阳山国家森林公园 &amp;amp; 植物园: layered forest trails plus temple sights; the hike takes 60-90 minutes depending on fitness, with great views on clear autumn and winter days.&lt;br /&gt;
* 树山生态村: country trails, tea plantations, and farmhouse cuisine; a top choice for a group dinner or outing.&lt;br /&gt;
* 太湖湿地 / 西山: more convenient by car; routes with wetland and lake views.&lt;br /&gt;
* Old-town gardens: in the evening, take a taxi to 平江路 or 山塘街 for the night views, or visit 苏州博物馆 (Suzhou Museum) or 拙政园 (the Humble Administrator&#039;s Garden) during the day.&lt;br /&gt;
&lt;br /&gt;
== About Suzhou Campus ==&lt;br /&gt;
The Suzhou campus of Nanjing University is located in 太湖科技城 in Suzhou High-tech District, at the intersection of the Taihu science-and-innovation circle and the Shanghai-Nanjing industrial innovation belt, and is positioned as the university&#039;s main base for growing its new engineering disciplines. Oriented toward national strategy, world-class standards, alliances of strengths, and the matching of needs, the campus targets bottleneck problems in fields such as artificial intelligence, next-generation information technology, new energy, advanced manufacturing, and life and health, strengthens its “new engineering” programs, and promotes the cross-disciplinary integration of the humanities, sciences, engineering, and medicine, with coordinated development across government, industry, academia, and research.&lt;br /&gt;
&lt;br /&gt;
== Contact ==&lt;br /&gt;
刘明谋： lmm@nju.edu.cn&amp;lt;br/&amp;gt;&lt;br /&gt;
[[Media:2025年Suzhou Theory Day邀请函.pdf|邀请函.pdf]]&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=Theory@Suzhou_2025&amp;diff=13409</id>
		<title>Theory@Suzhou 2025</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=Theory@Suzhou_2025&amp;diff=13409"/>
		<updated>2025-11-29T03:27:29Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[File:2025 SuZhou Theory Day poster.png|200px|thumb|Event poster]]&lt;br /&gt;
==General Information ==&lt;br /&gt;
[[File:苏教楼D.png|thumb|苏教楼D is marked with a red star]]&lt;br /&gt;
*&#039;&#039;&#039;&amp;lt;font size=4&amp;gt;Sunday, Nov 30, 2025: 09:00 -- 18:00.&amp;lt;/font&amp;gt;&#039;&#039;&#039;&lt;br /&gt;
* &amp;lt;font size=4&amp;gt;&#039;&#039;&#039;Location&#039;&#039;&#039;: 南京大学苏州校区&amp;lt;/font&amp;gt;&lt;br /&gt;
* &amp;lt;font size=4&amp;gt;&#039;&#039;&#039;Venue&#039;&#039;&#039;: 苏教楼D202&amp;lt;/font&amp;gt;&lt;br /&gt;
[https://zcc.nju.edu.cn/DFS//file/2024/09/20/202409201037042506uv3mq.pdf Suzhou campus map]&lt;br /&gt;
&lt;br /&gt;
==Announcement==&lt;br /&gt;
&amp;lt;font color=red&amp;gt;&#039;&#039;&#039;Due to a change in university policy, visitors must now register for campus entry via the WeChat official account.&#039;&#039;&#039;&amp;lt;/font&amp;gt;&lt;br /&gt;
*Follow the WeChat official account “南京大学信息门户”, then open “i校园” and “访客通行” at the bottom left of the portal home page&lt;br /&gt;
*Approving unit: 智能软件与工程学院 (School of Intelligent Software and Engineering). Approver: 石会&lt;br /&gt;
&lt;br /&gt;
==Speakers (in alphabetic order)==&lt;br /&gt;
* [http://staff.ustc.edu.cn/~xuechen1989/ 陈雪]（中国科学技术大学）&lt;br /&gt;
* [https://zengfenghuang.github.io/ 黄增峰]（复旦大学）&lt;br /&gt;
* [https://www.shaofengjiang.cn/ 姜少峰]（北京大学）&lt;br /&gt;
* [https://chaoxu.prof/ 许超]（电子科技大学）&lt;br /&gt;
* [https://chihaozhang.com/ 张驰豪]（上海交通大学）&lt;br /&gt;
* [https://scholar.google.com/citations?user=TydhZfgAAAAJ  张瀚文]（哥本哈根大学）&lt;br /&gt;
* [https://zhangty12.github.io/ 张天翼]（南京大学）&lt;br /&gt;
&lt;br /&gt;
== Join us==&lt;br /&gt;
&#039;&#039;&#039;No registration is required&#039;&#039;&#039;.&lt;br /&gt;
This event covers a range of topics, including approximation algorithms, graph algorithms, computational geometry, theoretical machine learning, probability and sampling algorithms, and streaming and distributed algorithms. All students and faculty interested in theoretical computer science are welcome to attend.&amp;lt;br/&amp;gt;&lt;br /&gt;
Please [https://docs.qq.com/form/page/DS0JxdW5yZHZPYWtF &#039;&#039;&#039;fill out this short survey&#039;&#039;&#039;] so that we can estimate the number of participants, prepare refreshments, and choose an appropriately sized lecture hall.&lt;br /&gt;
&lt;br /&gt;
== Program ==&lt;br /&gt;
:{|border=&amp;quot;2&amp;quot; width=&amp;quot;100%&amp;quot; cellspacing=&amp;quot;4&amp;quot; cellpadding=&amp;quot;3&amp;quot; rules=&amp;quot;all&amp;quot; style=&amp;quot;margin:1em 1em 1em 0; border:solid 1px #AAAAAA; border-collapse:collapse;empty-cells:show;&amp;quot; &lt;br /&gt;
|-&lt;br /&gt;
|bgcolor=&amp;quot;#A7C1F2&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Workshop Program&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;width: 140px;&amp;quot; align=&amp;quot;center&amp;quot;|09:00 - 09:50&lt;br /&gt;
|style=&amp;quot;width: 180px;&amp;quot; align=&amp;quot;center&amp;quot;|张天翼&amp;lt;br/&amp;gt;&lt;br /&gt;
南京大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Approximate Light Spanners in Planar Graphs&amp;lt;/font&amp;gt;&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: Althöfer et al. (DCG 1993) introduced the greedy spanner and showed that for any weighted planar graph &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;, the total weight of its greedy &amp;lt;math&amp;gt;(1+\epsilon)&amp;lt;/math&amp;gt;-spanner is at most &amp;lt;math&amp;gt;\left(1 + \frac{2}{\epsilon}\right) \cdot w(\mathrm{MST}(G))&amp;lt;/math&amp;gt;, where &amp;lt;math&amp;gt;w(\mathrm{MST}(G))&amp;lt;/math&amp;gt; denotes the weight of a minimum spanning tree &amp;lt;math&amp;gt;\mathrm{MST}(G)&amp;lt;/math&amp;gt; of &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;. This bound is existentially tight: there are planar graphs &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; in which every &amp;lt;math&amp;gt;(1+\epsilon)&amp;lt;/math&amp;gt;-spanner has weight at least &amp;lt;math&amp;gt;\left(1 + \frac{2}{\epsilon}\right) \cdot w(\mathrm{MST}(G))&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;However, from the perspective of approximation algorithms, even as a bicriteria approximation, the weight approximation factor of the greedy spanner essentially matches this existential lower bound: there are planar graphs &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt; such that, for any parameter satisfying &amp;lt;math&amp;gt;1 \leq x = O(\epsilon^{-1/2})&amp;lt;/math&amp;gt;, the greedy &amp;lt;math&amp;gt;(1 + x\epsilon)&amp;lt;/math&amp;gt;-spanner has weight &amp;lt;math&amp;gt;\Omega\left(\frac{1}{\epsilon \cdot x^2} \cdot w(G_{\mathrm{opt},\epsilon})\right)&amp;lt;/math&amp;gt;, where &amp;lt;math&amp;gt;G_{\mathrm{opt},\epsilon}&amp;lt;/math&amp;gt; is a minimum-weight &amp;lt;math&amp;gt;(1+\epsilon)&amp;lt;/math&amp;gt;-spanner of &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;Despite a long line of work on approximation algorithms for spanners over the past three decades, no approximation algorithm (even a bicriteria one) was known that constructs, in weighted planar graphs, light spanners beating this existential lower bound.&amp;lt;br/&amp;gt;As our main contribution, we give a dynamic-programming algorithm on planar graphs that constructs, in any weighted planar graph &amp;lt;math&amp;gt;G&amp;lt;/math&amp;gt;, a &amp;lt;math&amp;gt;\left(1 + \epsilon \cdot 2^{O(\log^* 1/\epsilon)}\right)&amp;lt;/math&amp;gt;-spanner of total weight &amp;lt;math&amp;gt;O(1) \cdot w(G_{\mathrm{opt},\epsilon})&amp;lt;/math&amp;gt;. We also show that computing a minimum-weight planar spanner exactly is NP-hard.&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;background: silver;&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Coffee Break (09:50 – 10:15)&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|10:15 – 11:05&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|姜少峰&amp;lt;br/&amp;gt;&lt;br /&gt;
北京大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Local Search for Clustering in Almost-linear Time&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We propose the first local search algorithm for Euclidean clustering that attains an &amp;lt;math&amp;gt;O(1)&amp;lt;/math&amp;gt;-approximation in almost-linear time. Specifically, for Euclidean k-Means, our algorithm achieves an &amp;lt;math&amp;gt;O(c)&amp;lt;/math&amp;gt;-approximation in &amp;lt;math&amp;gt;\tilde{O}(n^{1 + 1 / c})&amp;lt;/math&amp;gt; time, for any constant &amp;lt;math&amp;gt;c \ge 1&amp;lt;/math&amp;gt;, maintaining the same running time as the previous (non-local-search-based) approach [la Tour and Saulpic, arXiv&#039;2407.11217] while improving the approximation factor from &amp;lt;math&amp;gt;O(c^{6})&amp;lt;/math&amp;gt; to &amp;lt;math&amp;gt;O(c)&amp;lt;/math&amp;gt;. The algorithm generalizes to any metric space with sparse spanners, delivering efficient constant approximation in &amp;lt;math&amp;gt;\ell_p&amp;lt;/math&amp;gt; metrics, doubling metrics, Jaccard metrics, etc.&amp;lt;br/&amp;gt; This generality derives from our main technical contribution: a local search algorithm on general graphs that obtains an &amp;lt;math&amp;gt;O(1)&amp;lt;/math&amp;gt;-approximation in almost-linear time. We establish this through a new &amp;lt;math&amp;gt;1&amp;lt;/math&amp;gt;-swap local search framework featuring a novel swap selection rule. At a high level, this rule “scores” every possible swap, based on both its modification to the clustering and its improvement to the clustering objective, and then selects those high-scoring swaps. To implement this, we design a new data structure for maintaining approximate nearest neighbors with amortized guarantees tailored to our framework.&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|11:10 – 12:00&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|陈雪&amp;lt;br/&amp;gt;&lt;br /&gt;
中国科学技术大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Algorithms for Sparse LPN and LSPN Against Low-noise&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We consider sparse variants of the classical Learning Parities with random Noise (LPN) problem. Our main contribution is a new algorithmic framework that provides low-noise learning algorithms for both the Learning Sparse Parities with Noise (LSPN) problem and the sparse LPN problem. Unlike previous approaches for LSPN and sparse LPN, this framework has a simple structure and runs in polynomial space. Let &amp;lt;math&amp;gt;n&amp;lt;/math&amp;gt; be the dimension, &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; the sparsity, and &amp;lt;math&amp;gt;\eta&amp;lt;/math&amp;gt; the noise rate.&amp;lt;br/&amp;gt;As a fundamental problem in computational learning theory, LSPN assumes the hidden parity is &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-sparse. While a simple enumeration algorithm takes &amp;lt;math&amp;gt;{n \choose k}=O((n/k)^k)&amp;lt;/math&amp;gt; time, previously known results still need &amp;lt;math&amp;gt;{n \choose k/2} = \Omega((n/k)^{k/2})&amp;lt;/math&amp;gt; time for any noise rate &amp;lt;math&amp;gt;\eta&amp;lt;/math&amp;gt;. Our framework provides an LSPN algorithm that runs in time &amp;lt;math&amp;gt;O((\eta \cdot n/k)^k)&amp;lt;/math&amp;gt; for any noise rate &amp;lt;math&amp;gt;\eta&amp;lt;/math&amp;gt;, improving the state of the art for LSPN whenever &amp;lt;math&amp;gt;\eta \in ( k/n,\sqrt{k/n})&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;The sparse LPN problem is closely related to the classical problem of refuting random &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-CSPs and is widely used in cryptography as a hardness assumption. Unlike standard LPN, it samples random &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-sparse vectors.
Because the number of &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt;-sparse vectors is &amp;lt;math&amp;gt;{n \choose k} &amp;lt; n^k&amp;lt;/math&amp;gt;, sparse LPN admits polynomial-time learning algorithms when &amp;lt;math&amp;gt;m&amp;gt;n^{k/2}&amp;lt;/math&amp;gt;. However, much less is known for constant &amp;lt;math&amp;gt;k&amp;lt;/math&amp;gt; such as &amp;lt;math&amp;gt;3&amp;lt;/math&amp;gt; with &amp;lt;math&amp;gt;m&amp;lt;n^{k/2}&amp;lt;/math&amp;gt; samples, apart from the Gaussian elimination algorithm, which takes &amp;lt;math&amp;gt;e^{\eta n}&amp;lt;/math&amp;gt; time. Our framework provides a learning algorithm running in &amp;lt;math&amp;gt;e^{O(\eta \cdot n^{\frac{\delta+1}{2}})}&amp;lt;/math&amp;gt; time given &amp;lt;math&amp;gt;\delta \in (0,1)&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;m \approx n^{1+(1-\delta)\cdot \frac{k-1}{2}}&amp;lt;/math&amp;gt; samples. This improves on previous learning algorithms. For example, in the classical setting of &amp;lt;math&amp;gt;k=3&amp;lt;/math&amp;gt; and &amp;lt;math&amp;gt;m=n^{1.4}&amp;lt;/math&amp;gt;, our algorithm is faster than previous approaches for any &amp;lt;math&amp;gt;\eta&amp;lt;n^{-0.7}&amp;lt;/math&amp;gt;.&amp;lt;br/&amp;gt;Based on joint work with Wenxuan Shu (USTC) and Zhaienhe Zhou (USTC).&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;background: silver;&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Lunch Break  (12:00 - 14:00)&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|14:00 – 14:50&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|张瀚文&amp;lt;br/&amp;gt;&lt;br /&gt;
哥本哈根大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Minimum Star Partitions of Simple Polygons in Polynomial Time &amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We design a polynomial-time algorithm that partitions a simple polygon P into a minimum number of star-shaped polygons. Whether such an algorithm exists had been asked for more than forty years and repeated many times, including in O’Rourke’s book &#039;&#039;Art Gallery Theorems and Algorithms&#039;&#039;. Previously known algorithms handle only special cases, such as monotone rectilinear polygons or the setting where Steiner points are not allowed, and fall far short of the fully general case. The covering variant, which allows the star-shaped pieces to overlap (the famous art gallery problem), was shown in 2018 to be ∃ℝ-complete, and is therefore likely even harder than NP problems. Beyond its theoretical value, star partitioning also has practical applications such as CNC pocket milling, robot path planning, and shape parameterization.&amp;lt;br/&amp;gt;In this talk, I will focus on the intuition, reasoning, and discoveries behind our solution, giving an immersive account of our whole journey through this research.&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|14:55 – 15:45&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|许超&amp;lt;br/&amp;gt;&lt;br /&gt;
电子科技大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: An Optimal Algorithm for the Stacker Crane Problem on Fixed Topologies&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: The Stacker Crane Problem (SCP) is a variant of the Traveling Salesman Problem. In SCP, pairs of pickup and delivery points are designated on a graph, and a crane must visit these points to move objects from each pickup location to its respective delivery point. The goal is to minimize the total distance traveled. SCP is known to be NP-hard, even on trees. The only positive results, in terms of polynomial-time solvability, apply to graphs that are topologically equivalent to a path or a cycle. We propose an algorithm that is optimal for each fixed topology, running in near-linear time. This is achieved by demonstrating that the problem is fixed-parameter tractable (FPT) when parameterized by both the cycle rank and the number of branch vertices.&lt;br /&gt;
|-&lt;br /&gt;
|style=&amp;quot;background: silver;&amp;quot; align=&amp;quot;center&amp;quot; colspan=&amp;quot;3&amp;quot; |&#039;&#039;&#039;Coffee Break (15:45 – 16:15)&#039;&#039;&#039;&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|16:15 – 17:05&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|张驰豪&amp;lt;br/&amp;gt;&lt;br /&gt;
上海交通大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Sampling from non-log-concave distributions&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: Sampling from a d-dimensional distribution &amp;lt;math&amp;gt;\mu&amp;lt;/math&amp;gt; with density &amp;lt;math&amp;gt;p_{\mu}(x) \propto e^{-V(x)}&amp;lt;/math&amp;gt; is a central problem in many areas, including theoretical computer science, statistical physics, and machine learning. It is well-known that when the potential function &amp;lt;math&amp;gt;V(x)&amp;lt;/math&amp;gt; is &#039;&#039;convex&#039;&#039; (or equivalently, when &amp;lt;math&amp;gt;p_{\mu}&amp;lt;/math&amp;gt; is &#039;&#039;log-concave&#039;&#039;), or more generally, when &amp;lt;math&amp;gt;\mu&amp;lt;/math&amp;gt; satisfies good isoperimetric inequalities, efficient sampling algorithms exist in various computational models. A common belief is that the sampling task becomes more difficult when &amp;lt;math&amp;gt;V(x)&amp;lt;/math&amp;gt; is &#039;&#039;non-convex&#039;&#039;. On the other hand, data-based algorithms (e.g., denoising diffusion probabilistic models) developed in the machine learning community are very successful in practice when dealing with highly non-log-concave distributions (such as in image generation), and provide new insights into designing efficient sampling algorithms. &amp;lt;br/&amp;gt; In this talk, we will start with a general tight (exponential) sampling complexity bound for any &#039;&#039;non-log-concave&#039;&#039; distribution &amp;lt;math&amp;gt;\mu&amp;lt;/math&amp;gt; satisfying mild regularity conditions. Then, we will show how a common strengthening of these regularity conditions leads to an efficient (polynomial) sampling algorithm. Finally, we will discuss future directions for understanding the complexity of sampling from general distributions.&lt;br /&gt;
|-&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|17:10 – 18:00&lt;br /&gt;
|align=&amp;quot;center&amp;quot;|黄增峰&amp;lt;br/&amp;gt;&lt;br /&gt;
复旦大学&lt;br /&gt;
|&lt;br /&gt;
:&amp;lt;font size=3&amp;gt;&#039;&#039;&#039;Title&#039;&#039;&#039;: Simple and Optimal Algorithms for Heavy Hitters and Frequency Moments in Distributed Models&amp;lt;/font&amp;gt;&lt;br /&gt;
&lt;br /&gt;
:&#039;&#039;&#039;Abstract&#039;&#039;&#039;: We consider the problems of distributed heavy hitters and frequency moments in both the coordinator model and the distributed tracking model. We present simple and optimal algorithms for heavy hitters and frequency moment estimation in these distributed models. For &amp;lt;math&amp;gt;\ell_p&amp;lt;/math&amp;gt; heavy hitters in the coordinator model, our algorithm requires only one round and uses &amp;lt;math&amp;gt;\tilde{O}(k^{p-1}/\epsilon^p)&amp;lt;/math&amp;gt; bits of communication. For &amp;lt;math&amp;gt;p &amp;gt; 2&amp;lt;/math&amp;gt;, this is the first near-optimal result. By combining our algorithm with the standard recursive sketching technique, we obtain a near-optimal two-round algorithm for &amp;lt;math&amp;gt;F_p&amp;lt;/math&amp;gt; in the coordinator model, matching a significant result from recent work by Esfandiari et al. (STOC 2024). Our algorithm and analysis are much simpler and incur fewer logarithmic factors in the cost. Owing to the simplicity of our heavy hitter algorithms, we are able to adapt them to the distributed tracking model with only a &amp;lt;math&amp;gt;\mathrm{polylog}(n)&amp;lt;/math&amp;gt; increase in communication, yielding the first near-optimal heavy hitter algorithm in that model. By applying the recursive sketching technique, we also provide the first near-optimal algorithm for &amp;lt;math&amp;gt;F_p&amp;lt;/math&amp;gt; in the distributed tracking model for all &amp;lt;math&amp;gt;p \geq 2&amp;lt;/math&amp;gt;. Even for &amp;lt;math&amp;gt;F_2&amp;lt;/math&amp;gt;, our result improves upon the bounds established by Cormode, Muthukrishnan, and Yi (SODA 2008) and Woodruff and Zhang (STOC 2012), nearly matching the existing lower bound for the first time.&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Getting to The Campus==&lt;br /&gt;
*Campus entry: &amp;lt;strike&amp;gt;Off-campus guests may register at the gate by giving security staff the name of the event.&amp;lt;/strike&amp;gt; &amp;lt;font color=red&amp;gt;[[Media:智软院访客预约流程.pdf|&#039;&#039;&#039;Due to a change in university policy, visitors must now register for campus entry via the WeChat official account.&#039;&#039;&#039;]]&amp;lt;/font&amp;gt;&lt;br /&gt;
**Follow the WeChat official account “南京大学信息门户”, then open “i校园” and “访客通行” at the bottom left of the portal home page&lt;br /&gt;
**Approving unit: 智能软件与工程学院 (School of Intelligent Software and Engineering). Approver: 石会&lt;br /&gt;
*High-speed rail&lt;br /&gt;
**苏州站 (Suzhou Railway Station): about 30 minutes to campus by taxi (outside rush hours), roughly ¥30. Alternatively, take Express Line 3, or the metro with a transfer to the tram, about 2 hours in total.&lt;br /&gt;
**苏州新区站 (Suzhou New District Station): about 25 minutes by taxi, roughly ¥25. You can also take Tram Line 2, about 1 hour.&lt;br /&gt;
*By air&lt;br /&gt;
**无锡硕放机场 (WUX): about 30 minutes to campus by taxi; because the trip crosses city boundaries, drivers may add a return/empty-trip charge, for a total of roughly ¥80. A ride-share (顺风车), if you can get one, is cheaper. Public transport is also possible, about 2 hours.&lt;br /&gt;
**上海虹桥机场 (SHA): we recommend transferring at the adjacent Hongqiao Railway Station to a high-speed train to 苏州站 or 苏州新区站. Note that the last high-speed train from Shanghai Hongqiao to Suzhou usually departs at 21:42. Taking a taxi directly from Hongqiao to Suzhou is not recommended (it is expensive), and neither is ride-sharing, since you will usually only get an unlicensed taxi.&lt;br /&gt;
&lt;br /&gt;
== Accommodation Suggestion ==&lt;br /&gt;
￥￥￥ 苏州科技城源宿酒店&amp;lt;br/&amp;gt;&lt;br /&gt;
￥￥ 南大国际学术交流中心（on-campus hotel, good value）、苏州科技城万达美华酒店、全季苏州科技城酒店、苏州高新区科技城亚朵酒店&amp;lt;br/&amp;gt;&lt;br /&gt;
￥ 格林豪泰苏州市科技城商务酒店、宜必思尚品苏州科技城酒店、如家精选-苏州乐园高新区科技城店&lt;br /&gt;
&lt;br /&gt;
== Lunch &amp;amp; Supper ==&lt;br /&gt;
[[File:苏州校区食堂（2025）.png|thumb|The four canteens on the Suzhou campus are marked with red stars]]&lt;br /&gt;
* The Suzhou campus has four canteens: the 科创大厦 canteen and Canteens 16, 17, and 18. Vegetable dishes cost ¥2-3 and meat dishes ¥4-8, payable directly with Alipay or WeChat Pay. The 国际学术交流中心 also serves more expensive food.&lt;br /&gt;
* Commercial areas near the campus include 东渚镇, 文体中心, 丰茂里, 时尚水岸星悦荟, and 星悦里.&lt;br /&gt;
* Food delivery is also an option; orders are dropped off at the delivery lockers or racks by the campus gate.&lt;br /&gt;
&lt;br /&gt;
== Getting Around ==&lt;br /&gt;
* 大阳山国家森林公园 &amp;amp; 植物园: layered forest trails plus temple sights; the hike takes 60-90 minutes depending on fitness, with great views on clear autumn and winter days.&lt;br /&gt;
* 树山生态村: country trails, tea plantations, and farmhouse cuisine; a top choice for a group dinner or outing.&lt;br /&gt;
* 太湖湿地 / 西山: more convenient by car; routes with wetland and lake views.&lt;br /&gt;
* Old-town gardens: in the evening, take a taxi to 平江路 or 山塘街 for the night views, or visit 苏州博物馆 (Suzhou Museum) or 拙政园 (the Humble Administrator&#039;s Garden) during the day.&lt;br /&gt;
&lt;br /&gt;
== About Suzhou Campus ==&lt;br /&gt;
The Suzhou campus of Nanjing University is located in 太湖科技城 in Suzhou High-tech District, at the intersection of the Taihu science-and-innovation circle and the Shanghai-Nanjing industrial innovation belt, and is positioned as the university&#039;s main base for growing its new engineering disciplines. Oriented toward national strategy, world-class standards, alliances of strengths, and the matching of needs, the campus targets bottleneck problems in fields such as artificial intelligence, next-generation information technology, new energy, advanced manufacturing, and life and health, strengthens its “new engineering” programs, and promotes the cross-disciplinary integration of the humanities, sciences, engineering, and medicine, with coordinated development across government, industry, academia, and research.&lt;br /&gt;
&lt;br /&gt;
== Contact ==&lt;br /&gt;
刘明谋： lmm@nju.edu.cn&amp;lt;br/&amp;gt;&lt;br /&gt;
[[Media:2025年Suzhou Theory Day邀请函.pdf|邀请函.pdf]]&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=File:%E6%99%BA%E8%BD%AF%E9%99%A2%E8%AE%BF%E5%AE%A2%E9%A2%84%E7%BA%A6%E6%B5%81%E7%A8%8B.pdf&amp;diff=13408</id>
		<title>File:智软院访客预约流程.pdf</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=File:%E6%99%BA%E8%BD%AF%E9%99%A2%E8%AE%BF%E5%AE%A2%E9%A2%84%E7%BA%A6%E6%B5%81%E7%A8%8B.pdf&amp;diff=13408"/>
		<updated>2025-11-29T03:22:21Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: 2025年11月智软院访客预约流程&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
Visitor reservation procedure for 智能软件与工程学院 (November 2025)&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
	<entry>
		<id>https://tcs.nju.edu.cn/wiki/index.php?title=File:2025%E5%B9%B4Suzhou_Theory_Day%E9%82%80%E8%AF%B7%E5%87%BD.pdf&amp;diff=13407</id>
		<title>File:2025年Suzhou Theory Day邀请函.pdf</title>
		<link rel="alternate" type="text/html" href="https://tcs.nju.edu.cn/wiki/index.php?title=File:2025%E5%B9%B4Suzhou_Theory_Day%E9%82%80%E8%AF%B7%E5%87%BD.pdf&amp;diff=13407"/>
		<updated>2025-11-26T13:50:26Z</updated>

		<summary type="html">&lt;p&gt;Liumingmou: Liumingmou uploaded a new version of File:2025年Suzhou Theory Day邀请函.pdf&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Summary ==&lt;br /&gt;
Invitation letter for the 2025 Suzhou Theory Day&lt;/div&gt;</summary>
		<author><name>Liumingmou</name></author>
	</entry>
</feed>