Office Hours: Tue and Thu, 3:45pm - 4:45pm. Lindley 201D.
Security is a subtle and error-prone discipline. The other part of the course focuses on mathematically rigorous techniques for analyzing security mechanisms and verifying security properties. (These analyses will typically make simplifying assumptions about the strength of cryptographic primitives, to avoid getting "bogged down" in cryptography.) This is a relatively recent and rapidly growing research area. Readings for this part of the course will be a selection of recent papers. The analyses will include manual methods (such as Burrows, Abadi, and Needham's logic of authentication) and mechanized analysis techniques, which may be based on temporal-logic model checking, process algebra, program dependence analysis, symbolic computation, etc. Familiarity with these topics is not a prerequisite (reading assignments and lectures will introduce them as needed), but interest in them, and in security, is!
For students who took B649 before: The overlap with last year's incarnation of B649 is small (about 15%). A student is allowed to take B649 multiple times, with the instructor's permission, provided the total credit hours do not exceed 6. (B649 was 3 credits last year and is 3 credits again this year.)
There are no formal prerequisites. The requirements are willingness to maintain a fast pace and to read and understand research articles.
If you have any questions about the course or the suitability of your background, please contact the instructor.
Textbook
For the introduction to security, we will use the following textbook:
Other decent textbooks that might be helpful are:
The remainder of the readings will be research articles.
Syllabus
Class | Topic |
1 | Introduction |
2-4 | Cryptography: private-key and public-key cryptosystems, hashes, message digests, digital signatures, secret sharing |
5-6 | Authentication: passwords, authentication tokens, one-time passwords, strong authentication |
7-8 | Key Establishment |
9 | Key Management |
10-12 | Authentication Logics: BAN and Abadi-Tuttle |
13-15 | Woo and Lam's Model of Authentication; Automated protocol verification by state exploration |
16,18-19 | Secure Distributed Storage and Retrieval; Rabin's Information Dispersal Algorithm |
17 | Midterm |
20,22 | Authentication Tokens (Smart Cards) Redux |
21 | Hiring talk: Tandy Warnow |
23-25 | Secure Reliable Multicast |
26 | High-Bandwidth Encryption with Low-Bandwidth Smart Cards |
27-28 | Untraceability; Hidden Automorphism Model |
Component | Weight |
Project | 45% |
Homework | 40% |
Midterm exam | 15% |
Item | Out Of | Mean | Stddev | Histogram |
Homework 1 | 10 | 9.8 | 0.4 | |
Homework 2 | 10 | 7.9 | 2.0 | |
Homework 3 | 15 | 14.6 | 0.4 | |
Homework 4 | 15 | 12.8 | 1.7 | |
Homework 5 | 10 | 9.6 | 0.6 | |
Homework 6 | 7 | 5.6 | 0.9 | |
Midterm | 30 | 18.1 | 4.2 | histogram |
Project | 100 | 91 | 3.6 | |
Course | 4.3 | 3.8 | 0.36 | |
Project: If you did not receive comments on your project by email, please contact the instructor.
Project: For implementation projects, if your system requires any set-up or installation, please describe the necessary steps in a README file (or in your final report).
Project Presentations: LH102 seems more comfortable and convenient than LH115, so we will use it instead.
Project Presentations: You might find it helpful to use overhead transparencies, the whiteboard, or "on-line slides" during your presentation. If you use on-line "slides", it would probably be better to make slides that can be displayed from a Sun, unless someone else knows how to use the PC in LH102. Making "slides" in HTML (with a large font) and displaying them with Netscape works reasonably well. Framemaker (/usr/local/bin/Frame) and SLiTeX are also available on the Suns.
Secret-Ballot Protocol: The link should work now.
Projects: The deadline for final reports has been extended slightly to 1pm on Saturday, i.e., just before the presentations. You can submit your final report electronically or as hardcopy. If you anticipate having difficulty meeting this deadline, please discuss the situation with me as soon as possible. In any case, skipping some of the presentations to work on your project would be a bad idea.
Project Presentations: It's a very good idea to rehearse your presentation (including the demo, if you implemented something) a few times, and to check that your presentation takes 10 or 11 minutes (leave a minute or two for questions). For people who did implementations, in addition to demonstrating your system, discuss (briefly) the aspects you think are most interesting, for example: the assumptions your system relies on, the guarantees it provides, justification for design decisions, unexpected difficulties you encountered, performance issues, features or optimizations that could be added, etc. If you have questions about what to include or not include in your presentation, just ask; if you would like to show me your demo in advance, send email suggesting a time.
Secret-Ballot Protocol: Here is pseudo-code for the Secret-Ballot Protocol, taken from my slide. This pseudo-code corresponds to Figure 1.5 in the photocopied excerpt from Merritt's thesis. It seems to me that the pseudo-code on page 53 of that excerpt was intended to correspond to Figure 1.5 but does not.
Project: People using Java for the project might want to look at Avalanche.
Schedule: The project presentations seem like a fitting way to close the semester. There will be no classes after that.
Optional Homework: Here is Homework 7. It is optional. It is intended to give interested students something to think about. Since it is optional, solutions will not be posted. Of course, if you would like to discuss your thoughts or answers, feel free to stop by my office (preferably during office hours, if possible). Anyone who happens to be inclined to submit written answers to any of these problems is welcome to do so.
Project Presentations: Here is the schedule for 25 April. Abbreviations: FSE = forward secrecy for email; SDSR = secure distributed storage and retrieval.
TIME | TEAM MEMBERS | TOPIC |
1:00 | Akman and Onalan | FSE |
1:12 | Ahmed | Reductions for authen. protocols |
1:24 | Ishkov | FSE |
1:36 | Chauhan and Govindaraju | SDSR |
1:48 | Lu and Zhou | FSE |
2:00 | Deuskar and Shetty | SDSR |
2:12 | Savintsev | FSE |
2:24 | Yang and Zhang | Anonymity |
2:36 | Heitz | FSE |
2:48 | EVERYONE | RELAXATION |
3:06 | Stuckey | FSE |
3:18 | Shanbhag | SDSR |
3:30 | Singh and Tangirala | FSE |
3:42 | Balasubramaniam | Type systems for security |
3:54 | Unnikrishnan and Vlas | FSE |
4:06 | Vinke | Anonymity |
4:18 | Ramakrishnan and Sharma | FSE |
4:30 | The End |
Optional Reading: The following paper was discussed in lecture on 14 Apr:
Project Presentations: Teams of two are free to share the talking or have one member do it. For teams of two that did implementation projects, it will probably be most efficient if one member sets up the demo while the other member starts describing the project.
Project Presentations: As mentioned in lecture, the project presentations will be held in Lindley 115 in the morning or afternoon on Apr 25 or 26. If you will be unable to attend at some (or all) of those times, please let me know by tomorrow afternoon. The presentations will be approximately 12 minutes each, so they will collectively take approximately 3 hours. I propose having two 1.5-hour sessions, with a break in between.
Reading: The following paper was distributed in lecture on 2 Apr:
Project Presentations: After reading all of the project proposals, I became convinced that people would benefit more from hearing about each other's projects and presenting their own work than from taking an exam. So, instead of a final exam, each team will give a short presentation of its project. I will reserve a room with a projection system, so teams that did implementations can demonstrate their software. The presentations will be held shortly after the projects are due. An exact time and place will be scheduled as soon as possible. Due to this change, the total weight of the project (including the presentation) increases by 10%, and the weight of homeworks increases by 5%.
News: Ronald Rivest has posted an interesting note: Chaffing and Winnowing: Confidentiality without Encryption.
Course Evaluation: Please fill out the on-line course evaluation. Click here to access the convenient web interface.
SDSR: The security group at IBM T. J. Watson Research Center that wrote the paper on SDSR also implemented such a system. The implementation is IBM proprietary, but a paper describing the implementation is available: Design and Implementation of a Secure Distributed Data Repository. A slightly revised and extended version of the original paper is also available: Secure Distributed Storage and Retrieval.
Project: For the Forward Secrecy for Email project, when stating the security guarantees provided by your system, be careful about where data is stored. For example, a user might log into one machine (e.g., bone) to read email, but the /usr/spool/mail files might be stored on a different machine (e.g., whale). If your system involves a "server" process, that process might conceivably run on a third machine. If you need to assume that some of these machines are the same, then state this assumption explicitly and provide some justification.
~stoller/security/sharing/ (see the comments in disperse.c).
Project: For the Forward Secrecy for Email project, if your system uses regular email as the transport mechanism, then you might want to consider using procmail (for more information, see man procmail). If you base your system on pgp (instead of RSAREF), you might also need to use des.
Project: In proposals for implementation-oriented projects, I recommend dividing your project into 3 or 4 phases. This helps you keep track of your progress and helps ensure that, if you don't finish everything by the due date, at least you finish a well-defined piece with some functionality.
Project: The revised due date for the project proposal is Mar 29. It would be a shame if people missed the Horizon Day because of this. People who attend at least 2 Horizon Day talks receive an extension until 5pm on Mar 30. Just write on top of your proposal the names of 2 talks you attended.
Reading: The following paper was distributed in lecture on 24 Mar:
Project: Two links in the bibliography section of the project web page were broken; now they work.
Project: Information on the B649 Course Project is now available.
Midterm: A histogram of midterm scores has been added to the table of grading statistics.
Midterm: If you have questions about the grading of the midterm, please ask the appropriate person (if possible): Yu Ma for problems 1,2,5; Scott for problems 3,4,6. This will help ensure consistency of grading.
Midterm: Remember that the midterm will be open book, open notes, and open B649 handouts! Bring those things with you to the exam! B649 handouts include articles, homeworks, etc.
Erratum: In the notes on Woo and Lam's Model of Authentication Protocols, I just corrected a typo: in (E5), the occurrence of Subst should be an occurrence of subst.
Homework: Here are Solutions to Homework 6.
Clarification: For hw6, problem 2, please do not send binary data in email; encode it as ASCII text. There are various ways to do this (including a command-line option to pgp).
Erratum: In the notes on Woo and Lam's Model of Authentication Protocols, I just corrected some typos. In (E4), both occurrences of Subst should be occurrences of subst. In section 3, paragraph 2, the occurrence of vs should be an occurrence of V.
Clarification: In hw6, if you find that you need "low-level" functions that do not have much to do with the model of protocols itself (e.g., the pattern-matcher mentioned in a previous clarification), then you are free to describe the behavior of those functions in clear English, instead of writing pseudo-code for them. (To me, this is already part of the meaning of the word "pseudo-code", as opposed to "code", but I'm not sure whether everyone sees it that way.)
Clarification: In the notes on Woo and Lam's Model of Authentication Protocols, I just corrected an omission in condition (E6). The conjunct prin(id'')=x was missing from the definition of old.
Clarification: Is Z able to execute BI, EI, BR, and ER statements? According to Woo and Lam's definition of execution (in their paper), the answer is yes. According to the definition of execution in the lecture notes, the answer is no. Letting Z execute those statements is harmless, but I would argue that it is unnecessary. First, note that Z executing those statements does not change the possible future behaviors of any principal, including Z. Those statements are included only so that correspondence properties can talk about them. Thus, the question is: can modifying an execution by inserting or removing such statements executed by Z affect whether the execution satisfies a correspondence property? Recall that a correspondence property always specifies which two principals are being considered. It would be silly for either of those principals to be Z, so let's prohibit that. Then, the answer to the question is no.
Clarification: In hw6, to ensure that the number of reachable states is finite, you need to assume that the set Name is finite.
Reading: The following paper was distributed in lecture on 3 Mar:
Midterm: The midterm will cover Topics I-VII (I knew those Roman numerals would be useful some day) and the associated readings.
Clarification: In hw6, I recommend structuring your algorithm into a transition function (for the automaton) and a straightforward program that finds all the reachable states of an automaton, given the initial state and the transition function. (Wow! It's a higher-order program! :-)
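The suggested structure can be sketched as follows. This is only an illustration of the recommended decomposition (the names and the toy automaton are made up, not part of the homework): a generic reachability procedure that takes the initial state and a transition function as arguments, which is what makes it a higher-order program.

```python
# Generic reachability: takes the initial state and a transition function.
# States must be hashable.

def reachable_states(initial, transitions):
    """Return the set of all states reachable from `initial`,
    where transitions(s) yields the successor states of s."""
    seen = {initial}
    frontier = [initial]
    while frontier:
        state = frontier.pop()
        for succ in transitions(state):
            if succ not in seen:
                seen.add(succ)
                frontier.append(succ)
    return seen

# Toy stand-in for the protocol automaton's transition function:
# a counter over 0..4 that can increment (capped at 4) or reset to 0.
def toy_transitions(n):
    return {min(n + 1, 4), 0}

print(sorted(reachable_states(0, toy_transitions)))  # → [0, 1, 2, 3, 4]
```

The point of the decomposition is that `reachable_states` never changes; only the transition function encodes the protocol model.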
Midterm: Don't forget that the midterm will be held in lecture on March 10. The midterm will be open book, open notes, and open B649 handouts. The only book that you may open during the exam is the course textbook. On March 10, my office hour will be 10:15am-11:15am instead of 3:45pm-4:45pm.
Clarification: In hw6, your algorithm will need to test whether a Receive statement matches a Send statement. This involves pattern-matching between the terms in the two statements. You don't need to write pseudo-code for this pattern-matching. Just assume there is a function "matchmaker", specified as follows. For a ground term t and a "pattern" p (i.e., a term with some variables, presumably all undefined), matchmaker(p,t) returns a substitution s for the variables in p such that p[s]=t if such a substitution exists, or returns some special value if no such substitution exists.
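A minimal sketch of the matchmaker specification above, under assumed representations that are not part of the homework: terms are nested tuples of atoms, and variables are marked with a leading "?". The function returns a substitution (a dict) s with p[s]=t, or None when no such substitution exists.

```python
def is_var(x):
    """Illustrative convention: variables are strings beginning with '?'."""
    return isinstance(x, str) and x.startswith("?")

def matchmaker(pattern, term, subst=None):
    """Match ground term `term` against `pattern`; return a substitution
    for the variables in `pattern`, or None if matching fails."""
    subst = dict(subst or {})
    if is_var(pattern):
        if pattern in subst:                    # variable already bound
            return subst if subst[pattern] == term else None
        subst[pattern] = term                   # bind the variable
        return subst
    if isinstance(pattern, tuple):              # compound term / concatenation
        if not isinstance(term, tuple) or len(pattern) != len(term):
            return None
        for p_i, t_i in zip(pattern, term):
            subst = matchmaker(p_i, t_i, subst)
            if subst is None:
                return None
        return subst
    return subst if pattern == term else None   # ground atoms must be equal

# Example: match {?x, B}?k-style structure against a ground term.
print(matchmaker(("enc", "?k", ("?x", "B")),
                 ("enc", "Kab", ("Na", "B"))))
# → {'?k': 'Kab', '?x': 'Na'}
```

Note that a variable occurring twice in a pattern must match the same subterm both times, which the bound-variable check enforces.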
Clarification: I changed hw6 and the notes on Woo and Lam's model slightly, to allow variable-arity concatenations and disallow directly-nested concatenations. This simplifies generation of the set of terms possibly sent by Z.
Lecture Notes: Here are notes on Woo and Lam's Model of Authentication Protocols. These notes follow the presentation in lecture very closely, though I couldn't resist making a few minor improvements. For example, I renamed "Sys" to "Name", omitted Var_c, introduced notation for public and private keys, and combined and re-ordered some of the requirements in the definition of executions.
Homework: Here are Solutions to Homework 5.
Clarification: Message Formatting Assumption 1.3.1 states that field boundaries in a message are not misinterpreted. Message tampering is orthogonal to this assumption. If the adversary modifies a message m, then the result is a different message m1, and the assumption then is that field boundaries in m1 are not misinterpreted, though they may differ arbitrarily from the field boundaries in m. So, including at the beginning of each message a header containing a list of the fields and their lengths is a simple and sufficient implementation. Using fixed-length fields (with padding as needed) is also simple and sufficient. Explicit field separators are unnecessarily complicated.
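The header-based implementation described above can be sketched as follows (the function names and header layout are illustrative, not prescribed by the assumption): each message carries a field count and per-field lengths, so the receiver recovers exactly the boundaries recorded in whatever message it actually received.

```python
import struct

def pack_fields(fields):
    """Encode a list of byte-string fields, prefixed by a header giving
    the number of fields and each field's length."""
    header = struct.pack("!H", len(fields))                       # field count
    header += b"".join(struct.pack("!I", len(f)) for f in fields) # lengths
    return header + b"".join(fields)

def unpack_fields(msg):
    """Recover the fields using the boundaries recorded in the header."""
    (n,) = struct.unpack_from("!H", msg, 0)
    lengths = [struct.unpack_from("!I", msg, 2 + 4 * i)[0] for i in range(n)]
    offset, fields = 2 + 4 * n, []
    for length in lengths:
        fields.append(msg[offset:offset + length])
        offset += length
    return fields

msg = pack_fields([b"Alice", b"Bob", b"nonce123"])
print(unpack_fields(msg))  # → [b'Alice', b'Bob', b'nonce123']
```

If an adversary replaces the message, the receiver still parses unambiguously according to the (possibly different) header of the received message, which is exactly what the assumption requires.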
Clarification: What do the formulas "P sees {X}K" and "P believes (P sees {X}K)" mean? The former means that P saw the sequence of bits which is {X}K, but P does not necessarily recognize that sequence of bits as being message X encrypted with K. The latter means that P does recognize that sequence of bits as being message X encrypted with K.
Homework: For Homework 6, you will need the instructor's PGP public key.
Homework: Homework 6 was distributed in lecture on Thursday.
Reading: Lectures this week will be based on:
Homework: Homework 5 (revised) was distributed in lecture on Thursday.
Homework: Here are Solutions to Homework 4.
Homework: Please discard the version of Homework 5 that was distributed on Tuesday. A revised version will be distributed on Thursday. Sorry for the inconvenience (and waste of paper).
Reading: For Homework 5, you will probably want to read Sections 1-4 of:
Homework: Here are Solutions to Homework 3.
Reading: The first article to read is:
Homework: If you have questions about the grading of problem 2 on homework 2 (or other homework problems), you should email the grader or look for her in LH130.
News Group: A link to the course newsgroup has been added above, for the convenience of people who read news using their browser.
Homework: Homework 3 was distributed in lecture on Tuesday.
Homework: Grading statistics for Homework 2 have been added above.
Grading of Homework: Questions about grading of homeworks should be addressed first to the grader (Yu Ma) and then (if questions persist) to the instructor.
Homework: Information about homework submission has been added above. (This information was already mentioned during a lecture.)
Homework: Another clarification to problem 1 on homework 2. Having either or both machines generate random values to use as secret session keys is fine. The non-trivial part is getting both machines to agree to use the same secret. Essentially, the previous clarification says: you cannot assume that both users type in the same secret session key.
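One standard way for two machines to agree on a fresh secret without either user typing one is a Diffie-Hellman exchange with per-session exponents. The sketch below is only an illustration of that idea under made-up parameters, not necessarily the intended homework solution, and on its own it does not authenticate the two parties.

```python
import secrets

P = 2 ** 127 - 1   # illustrative modulus (a Mersenne prime); real systems use larger, vetted groups
G = 3              # illustrative base

def dh_keypair():
    """Generate a fresh per-session secret exponent and the public value G^x mod P."""
    x = secrets.randbelow(P - 2) + 1
    return x, pow(G, x, P)

a_secret, a_public = dh_keypair()   # machine A; publics are exchanged over the network
b_secret, b_public = dh_keypair()   # machine B

# Each side combines its own secret exponent with the other's public value.
a_key = pow(b_public, a_secret, P)
b_key = pow(a_public, b_secret, P)
assert a_key == b_key               # both machines now share a session key
```

Because the secret exponents exist only for the session and are discarded afterward, recording the exchanged public values does not let an eavesdropper recover the session key later.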
Homework: Clarifications to problem 1 on homework 2. If you assume that users type some secret information at the beginning of each session, and that this secret information is deleted from the machine at the end of the session, then achieving perfect forward secrecy is trivial. You should assume the user does not enter any new secrets. Also, you should assume (as usual) that the attacker (Eve) knows the encryption algorithms, the protocol you devised, the message formats, etc.
Reading: Kaufman et al., chapters 7-9.
Grading: Grading statistics for hw1 and tentative weights for course grades have been added above.
Homework: Homework 2 was distributed in lecture on 22 Jan.
Late Homework Policy: Normally, solutions will be posted on the web page within 1 day of the due date of each homework. Homeworks submitted after solutions have been posted will receive no credit, for obvious reasons. Homeworks that are submitted late (but before solutions are posted) will receive a penalty of PI-2% per day (or fraction thereof) of lateness, where PI is the prime interest rate, as published in the Wall Street Journal.
Textbook: I heard the bookstore ran out of copies of the textbook. If you don't have a copy, ask the bookstore to order one for you. I heard it will take a week or so for the book to arrive after that. So, for the next week or so, we will put a copy on 2-hour reserve in Swain Library. It should be there by lunchtime on Thursday.
On Reserve in Swain: I have placed the Handbook of Applied Cryptography, by A. J. Menezes, P. C. van Oorschot, and S. A. Vanstone, on 2-hour reserve in Swain Library. It is clear, well-organized, and comprehensive. You might find it useful as a reference.
DES: [I revised this info, because things changed.] DES is available as /usr/local/bin/des; see man des for details. That program does not support triple DES, so I ftp'd a newer version from here and installed the executable as /u/stoller/bin/des; for documentation, just run it with no arguments.
Homework: Homework 1 was distributed in lecture on 15 Jan. I wrote "(cleverly?)" in problem 2 to suggest that some of those choices require more cleverness than others.
Clarification: Someone asked about "wire-tapping" of fiber optic cable. According to one reference: "Fiber optic cables can be tapped through bending, nicking, or placing in high-refractive liquids, which may cause some light to escape."
Reading: Kaufman et al., chapters 1-5. You aren't responsible for knowing the details of how DES, IDEA, RSA, etc., work, but you should understand the basic techniques and assumptions that are involved.
Newsgroup: There is now a course newsgroup: ac.csci.b649. The newsgroup is mainly for you to post questions and answers to other people's questions. I might also post answers in the newsgroup, but I will make general announcements on this web page.