due Tue Jan 24
- Expect to continue the ideas here for at least another week, with coding assignments to implement some of them coming soon.
- Take a first pass at reading chapters 1 through 3 of the Biggs textbook, with an eye toward understanding the definition and intuition behind the terms
- alphabet, string, message, word
- uniquely decodable
- kraft-mcmillan number
- optimal code
- information entropy
- huffman code
- Give a specific example of each of the preceding terms.
- For another take on the notion of "information entropy", start reading part 1 of Shannon's "A Mathematical Theory of Communication", the first paper to propose this concept.
- Similar material is in chapters 4 & 5 of MacKay's book.
- Follow up any of these on Wikipedia or other online sources as needed.
due Tue Jan 31
- (This is too much ... see what you have time for.)
- Finish reading through chapter 3 in text. Start reading chapter 4, including LZW.
- Read about joint probability in MacKay's book and/or other sources.
- On page 33, do exercises 3.6, 3.7, and 3.8.
- Do exercise 3.13 on page 40.
- In a computer language of your choice, start writing a program that calculates the information entropy of some input text. You should have at least a working prototype by Tuesday; it will be further tested and refined for next week's assignment. The program should
- Count the number of each symbol (i.e. character).
- From that, construct the probabilities p(i).
- From that, and assuming for now (incorrectly, probably) that the character probabilities don't depend on other characters, calculate the entropy. Decide what base to use, and discuss why you chose that one.
- Choose some text sources, and test your program. (You can if you like generate random text with different characteristics to test, or use, say, Moby Dick.)
- Try compressing those text files with any of the standard utilities (gzip etc). Is there any connection between the entropy and how much the file can be compressed, and if so, what exactly?
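The counting and summing steps above can be sketched in a few lines of Python. This is a minimal prototype, not the required design; the function name `entropy` and the choice of base 2 (bits per character) are one possible set of choices:

```python
from collections import Counter
from math import log2

def entropy(text):
    """Order-0 entropy in bits per character, assuming each character
    is independent of the others."""
    counts = Counter(text)
    n = len(text)
    # H = -sum_i p(i) * log2 p(i)
    return -sum((c / n) * log2(c / n) for c in counts.values())
```

Base 2 makes the result directly comparable to a compressed file size in bits; natural log (nats) or base 10 would only rescale the answer by a constant factor.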
due Tue Feb 7
- implement a Markov-2 (or higher) entropy model, extending the last coding assignment. The choice of what text to use (including randomly generated examples) is up to you.
- implement a lossless compression algorithm of your choice; be ready to discuss and show in class.
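One way to set up the Markov-2 model is to condition each character on the two characters before it and average the per-context entropies. This is a sketch under that interpretation, not the required design; the name `markov_entropy` is my own:

```python
from collections import Counter, defaultdict
from math import log2

def markov_entropy(text, order=2):
    """Conditional entropy H(next char | previous `order` chars), in bits."""
    ctx_counts = defaultdict(Counter)
    for i in range(order, len(text)):
        ctx_counts[text[i - order:i]][text[i]] += 1
    total = len(text) - order
    h = 0.0
    for ctx, nxt in ctx_counts.items():
        n = sum(nxt.values())
        # entropy of the next-character distribution for this context
        h_ctx = -sum((c / n) * log2(c / n) for c in nxt.values())
        h += (n / total) * h_ctx     # weight by how often the context occurs
    return h
```

For fully periodic text like "abababab" the conditional entropy drops to 0 even though the order-0 entropy is 1 bit/char, which is the kind of gap this assignment is meant to expose.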
due Tue Feb 14
- Read up on noise, as given in the references in my Feb 10 notes.
- Calculate all the interesting quantities including mutual information and channel capacity for at least two models:
- the binary symmetric channel, with X=(0,1), Y=(0,1), P(1|1)=P(0|0)=1-f, P(1|0)=P(0|1)=f, where f is a fractional error rate like 0.01 or 0.1. (This is one of the sample problems in MacKay.)
- the 3 symbol X=(1,2,3) Y=(a,b,c) example we started in class.
- If you're really having a good time, write some code to automate this calculation.
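For the binary symmetric channel these quantities have closed forms, so the "automate this" step can be quite small. A sketch, assuming a general input distribution P(x=1) = p1; the function names are mine:

```python
from math import log2

def h2(p):
    """Binary entropy function, in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_mutual_information(f, p1=0.5):
    """I(X;Y) for the binary symmetric channel with error rate f
    and input distribution P(x=1) = p1."""
    py1 = p1 * (1 - f) + (1 - p1) * f    # P(Y=1)
    return h2(py1) - h2(f)               # I(X;Y) = H(Y) - H(Y|X)

def bsc_capacity(f):
    """Capacity is achieved at the uniform input: C = 1 - H2(f)."""
    return 1 - h2(f)
```

For f = 0.1 the capacity 1 - H2(0.1) comes out to about 0.53 bits per channel use.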
due Tue Feb 21
error correcting codes - basics
- Read chapters 6 and 7 in Biggs (what we've been doing) and start 8 (what we'll do next week)
- Do 6.6, i.e. prove that the Hamming distance d(x,y) satisfies d(x,x)=0, d(x,y)=d(y,x), and d(x,z) <= d(x,y) + d(y,z).
- Do 6.15, 6.16, 6.17
- Do 7.8, 7.11
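For 6.6, a quick numeric sanity check (evidence, not a proof) is easy to run; `hamming_distance` here is just a helper name I picked:

```python
from itertools import product

def hamming_distance(x, y):
    """Number of positions in which the equal-length words x and y differ."""
    assert len(x) == len(y)
    return sum(a != b for a, b in zip(x, y))

# spot-check the three metric properties over all binary words of length 4
words = ["".join(bits) for bits in product("01", repeat=4)]
assert all(hamming_distance(x, x) == 0 for x in words)
assert all(hamming_distance(x, y) == hamming_distance(y, x)
           for x in words for y in words)
assert all(hamming_distance(x, z) <= hamming_distance(x, y) + hamming_distance(y, z)
           for x in words for y in words for z in words)
```

The exhaustive check over length-4 words is a useful way to catch a wrong proof sketch before writing it up.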
due Tue Feb 28
linear codes, hamming codes
- Read chapters 8 and 9, or similar material elsewhere.
- Do problems 8.7, 8.10, 8.11, 8.13
- What are (n, k, delta) for a Hamming code with m=5? Give an example of a correct code word. Give an example of a word that contains an error, and specify where the error is.
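One compact way to see the single-error correction, useful when building your m=5 example: for a binary Hamming code of length 2^m - 1, the 1-indexed positions of a codeword's 1-bits XOR to zero, and a single bit flip makes that XOR equal the position of the error. A sketch under that convention (the list-of-bits representation and function names are my own choices):

```python
from functools import reduce

def syndrome(word):
    """XOR together the 1-indexed positions of the 1-bits of `word`
    (a list of 0/1 values).  It is 0 for a Hamming codeword; after a
    single bit flip it equals the 1-indexed position of that bit."""
    return reduce(lambda acc, i: acc ^ i,
                  (i for i, bit in enumerate(word, 1) if bit), 0)

def correct_single_error(word):
    """Fix at most one bit error in place; return the 1-indexed error
    position (0 if no error was detected)."""
    pos = syndrome(word)
    if pos:
        word[pos - 1] ^= 1
    return pos
```

For m=5 the code length is 2^5 - 1 = 31; as a smaller worked case, [1,1,1,0,0,0,0] is a valid length-7 (m=3) codeword since 1^2^3 = 0, and flipping bit 5 produces syndrome 5.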
due Thu Mar 8
cyclic and other linear codes
- Browse through the codes listed in the Feb 28 notes : Cyclic, Turbo, Golay, etc.
- Pick one to study further, and do so.
- Describe how it works, and give a specific (can be small) example including data to be sent, the codeword, and an explanation of what errors can be fixed, and how.
- Come to class Thursday ready to do a 20 min talk on what you know about it.
due Tue Apr 3
- Play around with one of the historical ciphers described in the text, at least writing a program to encode/decode text, and preferably doing some cryptanalysis of a simple cipher. Possibilities include anything from chapter 11 in the text :
- Do a frequency analysis of a simple substitution cipher.
- Write a brute force search decoder for the correct text that recognizes when the sentence is in English.
- Explore Hill's system and plaintext attacks.
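The frequency-analysis option can start from a simple letter count. A sketch (the function name is mine); comparing its output against standard English letter frequencies, where 'e' runs around 12-13%, suggests candidate substitutions:

```python
from collections import Counter

def letter_frequencies(ciphertext):
    """Relative frequency of each letter in the text, ignoring case
    and non-letter characters."""
    letters = [c for c in ciphertext.lower() if c.isalpha()]
    total = len(letters)
    counts = Counter(letters)
    return {c: counts[c] / total for c in counts}
```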
- Read chapter 12 in the text and do exercises 12.10 and 12.11.
- Get a command-line openssl on your computer, if it isn't already. Use it to encode the text "This is the plain text that you should encode to make Jim happy." with the key "quick_brown_foxes_are_foxy" in aes-128-cbc and base64.
- So ... what is aes-128-cbc? Be specific.
- Did you use a salt? Is your code the same each time, or different?
- Is all of the key required? Should it be longer? Will any other key give the same result?
due Tue Apr 10
- Read chapter 13, on RSA. Use any other sources (e.g. Wikipedia) you like to explore this topic, and come to class ready to finish our discussion.
- Do exercises 13.4 and 13.6 in the text, both on page 211.
- Explain (at least) or implement (better) the "repeated squaring" algorithm for efficiently calculating A^B mod C when A, B, C are big integers. (This is described in section 13.3 in the text, and wikipedia's "exponentiation by squaring" article.)
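A sketch of repeated squaring (this is also what Python's built-in three-argument pow(A, B, C) computes):

```python
def power_mod(a, b, c):
    """Compute a**b % c with O(log b) multiplications by scanning the
    bits of the exponent, squaring the base at each step."""
    result = 1
    a %= c
    while b > 0:
        if b & 1:             # this bit of the exponent is set
            result = result * a % c
        a = a * a % c         # square the base for the next bit
        b >>= 1
    return result
```

Because every intermediate product is reduced mod c, the numbers involved never grow beyond roughly c^2, even when a, b, c are hundreds of digits long; that is what makes RSA-sized exponentiation feasible.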
- Play around with RSA via openssl, doing things like :
- Generate a public/private RSA key pair.
- Upload your public key to pgp.mit.edu.
- Verify that you can encrypt & decrypt a message with your keys.
- Create a signed message for someone else. (My public key has id 29A4465F at pgp.mit.edu).
due Tue Apr 17
final project proposal
- Pick a topic for your final project, which can be related to anything we've done this term : entropy, error correction, compression, or crypto.
- Due in its final form Fri May 4
- Be ready to present it on Tues May 1
- This should be some combo of a research and/or coding project, but at least some of it should be an implementation of something (probably code, possibly math).
- Be specific, and include some ideas for sources and other materials.
due Tue May 1
- Present your final project to the class, on the last day of classes.
due Fri May 4
submit final project
- a place for Jim to record the overall semester grade