|
87 | 87 | " * Note: Solving this gives a table like in my intro to entropy notes.md (Sending X signal and receiving Y)\n", |
88 | 88 | " * Here X is either 0.01 or 0.5 (the distribution sent) and Y is 0 or 1 (the signal received)\n", |
89 | 89 | "\n", |
| 90 | +"**Second Simple Problem**\n", |
| 91 | +"* Suppose true p is 2/2_000 and n=1_000\n", |
| 92 | +"* Naive Approach\n", |
| 93 | +" * k ~ Binomial(1_000, 0.001) ~ Poisson(1): about 37% of the time you'll see k/n = 1/1_000, guess 2/2_000, and be right\n",
| 94 | +" * About 37% of the time you'll see k/n = 0/1_000, guess 0/2_000, and be 100% low on event frequency (low by 1/1_000)\n",
| 95 | +" * About 18% of the time you'll see k/n = 2/1_000, guess 4/2_000, and be 100% high on event frequency (high by 1/1_000; the remaining ~8% of the time k >= 3 and you overshoot by even more)\n",
| 96 | +" * Is this the asymmetry we are trying to correct for in the naive k/n approach?\n",
| 97 | +"* Note that for the above case the rate 2/2_000 is the same as 1/1_000\n",
| 98 | +"* Consider the 1/2_000 case too: with n = 1_000, k/n can only land on multiples of 1/1_000, so it can never equal 1/2_000 exactly\n",
| 99 | +"\n", |
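The outcome frequencies above can be sanity-checked by simulation (a minimal sketch, assuming `numpy` is available; `p_true` and `n` are the values from this problem, and the trial count is an arbitrary choice):

```python
import numpy as np

# Values from the problem above; trial count is an arbitrary choice
p_true, n, trials = 2 / 2_000, 1_000, 100_000

rng = np.random.default_rng(0)
k = rng.binomial(n, p_true, size=trials)  # one draw of k per simulated experiment

for val in range(4):
    seen = np.mean(k == val)  # fraction of experiments observing k = val
    naive = val / n           # the naive estimate k/n for that outcome
    print(f'k={val}: seen {seen:.3f} of the time, naive guess {naive}')
```

The printed fractions come out near e^-1 ~ 0.37 for k = 0 and k = 1, and near e^-1/2 ~ 0.18 for k = 2, matching the Poisson(1) approximation.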
90 | 100 | "Also Review\n", |
91 | 101 | "* Comment about Bayes: try to understand it and how it fits in. Explain the problem (if any) with starting from p = 0.5 and doing Bayes from there\n",
92 | 102 | "\n", |
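One concrete framing of the Bayes question, as a sketch under an assumed uniform Beta(1, 1) prior (prior mean 0.5, i.e. one reading of "starting with p = 0.5"; this is not necessarily what the original comment meant). The posterior mean is then the Laplace rule (k + 1)/(n + 2), which can be compared against the naive k/n:

```python
from fractions import Fraction

n = 1_000
for k in (0, 1, 2):
    naive = Fraction(k, n)            # plain k/n
    laplace = Fraction(k + 1, n + 2)  # posterior mean under a uniform Beta(1, 1) prior
    print(f'k={k}: naive {float(naive):.6f} vs Laplace {float(laplace):.6f}')
```

Unlike k/n, the Laplace estimate is never exactly 0, which changes the k = 0 case in the problem above.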
|