mathematicalmonk @UCcAtD_VYwcYwVbTdvArsm7w@youtube.com

91.6K subscribers

Videos about math, at the graduate level or upper-level undergraduate level.


13:20  (ML 2.5) Generalizations for trees (CART)
14:06  (ML 2.4) Growing a classification tree (CART)
24:22  (IC 5.14) Finite-precision arithmetic coding - Decoder
17:37  (IC 5.13) Finite-precision arithmetic coding - Encoder
07:52  (IC 5.12) Finite-precision arithmetic coding - Setup
31:37  (IC 5.11) Finite-precision arithmetic coding - Rescaling
24:20  (IC 5.10) Generalizing arithmetic coding to non-i.i.d. models
19:08  (IC 5.9) Computational complexity of arithmetic coding
32:18  (IC 5.8) Near optimality of arithmetic coding
19:37  (IC 5.7) Decoder for arithmetic coding (infinite-precision)
25:03  (IC 5.4) Why the interval needs to be completely contained
16:29  (IC 5.6) Encoder for arithmetic coding (infinite-precision)
23:23  (IC 5.5) Rescaling operations for arithmetic coding
26:58  (IC 5.3) Arithmetic coding - Example #2
29:05  (IC 5.2) Arithmetic coding - Example #1
06:48  (IC 5.1) Arithmetic coding - introduction
23:46  (IC 4.12) Optimality of Huffman codes (part 7) - existence
09:07  (IC 4.13) Not every optimal prefix code is Huffman
14:09  (IC 4.11) Optimality of Huffman codes (part 6) - induction
22:19  (IC 4.10) Optimality of Huffman codes (part 5) - extension lemma
18:18  (IC 4.9) Optimality of Huffman codes (part 4) - extension and contraction
08:01  (IC 4.8) Optimality of Huffman codes (part 3) - sibling codes
21:51  (IC 4.7) Optimality of Huffman codes (part 2) - weak siblings
14:37  (IC 4.6) Optimality of Huffman codes (part 1) - inverse ordering
11:57  (IC 4.5) An issue with Huffman coding
06:32  (IC 4.4) Weighted minimization with Huffman coding
12:23  (IC 4.3) B-ary Huffman codes
13:43  (IC 4.2) Huffman coding - more examples
12:31  (IC 4.1) Huffman coding - introduction and example
15:36  (IC 3.10) Relative entropy as the mismatch inefficiency
15:49  (IC 3.9) Source coding theorem (optimal lossless compression)
08:30  (IC 3.8) Entropy of i.i.d. random variables
13:25  (IC 3.7) Block codes for compression
03:57  (IC 3.6) Example - entropy as a lower bound
13:57  (IC 3.5) Bounds on optimal expected length
01:44  (IC 3.4) Remark - an alternate proof
17:49  (IC 3.3) Entropy as a lower bound on expected length (part 3)
13:06  (IC 3.2) Entropy as a lower bound on expected length (part 2)
15:20  (IC 3.1) Entropy as a lower bound on expected length (part 1)
12:32  (IC 2.11) Kraft-McMillan - proof sketch for (b)
20:53  (IC 2.10) Kraft-McMillan - examples for (b)
18:20  (IC 2.9) Kraft-McMillan - proof of (a)
13:46  (IC 2.8) Kraft-McMillan inequality - statement
13:14  (IC 2.7) Expected codeword length
05:37  (IC 2.6) Prefix codes - remarks and what's next
15:50  (IC 2.5) Prefix codes
16:00  (IC 2.4) Decoding - prefix versus non-prefix
14:37  (IC 2.3) Symbol codes - definition and examples
13:52  (IC 2.2) Symbol codes - terminology and notation
05:28  (IC 2.1) A puzzle on weighing coins
15:46  (IC 1.5) Examples of source-encoder-channel pipelines
18:12  (IC 1.6) A different notion of "information"
12:33  (IC 1.4) Source-channel separation
11:39  (IC 1.2) Applications of Compression codes
29:10  (IC 1.3) Applications of Error-correcting codes
14:40  (IC 1.1) Information theory and Coding - Outline of topics
13:43  (ML 17.5) Importance sampling - introduction
19:27  (ML 19.11) GP regression - model and inference
14:30  (ML 19.10) GP regression - the key step
19:45  (ML 19.9) GP regression - introduction