Computer Science 60
Study Guide for the Final Exam
Spring 2014

Exam Reminders

The final exam is a take-home exam. You may spend up to 3 hours on it.

You may use up to two 8.5 by 11 sheets of paper with anything you want on both sides of the sheets.

You may also use your favorite editors for composing code, just as with the midterm exam.


Below is a study guide to help you prepare for the final exam. In addition to using this guide, we strongly recommend carefully reviewing the class homework assignments and your notes. The best way to study is to re-do and/or re-think the homework problems on paper; this will simulate the exam experience. The exam does include all of the languages (and topics) we've covered in CS60.

Study Guide

  • Java
    • What is a constructor and when is it invoked?
    • What does this refer to in a Java class?
    • What do private, protected, and public mean and where is it appropriate to use each of them?
    • What are accessor ("getter") methods and why are they important?
    • What is static and where and why would it be used?
    • What is inheritance, why is it useful, and how does it work?
    • Within an inheritance hierarchy, how does Java decide which out of several overridden methods to actually invoke for an Object?
    • How are the terms extends, super, and implements used in Java?
    • You should be comfortable with box-and-arrow diagrams to illustrate how Java organizes references and data in computer memory. 
    • When or why would you choose Java (vs Scheme, Prolog or Python) as an implementation language? (Note: the intended answer, at least, is not never!)
    • What is the merge technique and under what conditions can it be used?
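To review how Java decides which overridden method to invoke, here is a minimal sketch of dynamic dispatch (the class names are invented for illustration, not from class): the method that runs is chosen by the object's actual runtime class, not by the declared type of the reference.

```java
// Dynamic dispatch: the runtime class of the object, not the declared
// type of the reference, determines which overridden method runs.
class Animal {
    public String speak() { return "..."; }
}

class Dog extends Animal {
    @Override
    public String speak() { return "woof"; }   // overrides Animal's speak
}

public class DispatchDemo {
    public static void main(String[] args) {
        Animal a = new Dog();          // declared type Animal, runtime class Dog
        System.out.println(a.speak()); // prints "woof", not "..."
    }
}
```

A box-and-arrow diagram of this situation would show the reference `a` (typed `Animal`) pointing to a `Dog` object in memory, which is why `Dog`'s method wins.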


  • Data Structures
    • What is the difference between an abstract data type and a data structure?
    • How do Java interfaces facilitate the relationship between abstract data types and data structures?
    • What operations do the Lists, Queues, Stacks, and Heaps data structures support? (Deques, by the way, are double-ended queues.)
    • You should be comfortable with the implementation of the above structures using linked lists of cells.
    • What are the differences between non-destructive lists or data structures ("open" structures) and destructive lists or data structures ("closed" structures)?
    • How do breadth-first search and depth-first search work? How are queues and stacks used in each of these algorithms? How can recursion be used to implement depth-first search without explicitly using a stack?
    • What are the advantages of BFS and DFS over each other?
    • How do linked lists, binary trees, and heaps work?
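To review the linked-cells style of implementation mentioned above, here is a minimal sketch of a stack built from a linked list of cells (the class and field names are illustrative, not from class):

```java
// A stack of ints built from a linked list of cells: push and pop
// both work at the head of the list, so each operation is O(1).
public class CellStack {
    private static class Cell {
        int value;
        Cell next;
        Cell(int value, Cell next) { this.value = value; this.next = next; }
    }

    private Cell top = null;   // head of the linked list; null means empty

    public boolean isEmpty() { return top == null; }

    public void push(int value) { top = new Cell(value, top); }

    public int pop() {
        if (top == null) throw new IllegalStateException("empty stack");
        int value = top.value;
        top = top.next;        // unlink the old head cell
        return value;
    }
}
```

Pushing 1, 2, 3 and then popping yields 3, 2, 1 (LIFO). A queue implementation would instead keep references to both ends of the cell chain so that enqueue and dequeue work at opposite ends.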


  • Logic Programming
    • Prolog "programs" are a collection of facts and rules. Make sure that you understand all of the examples that we saw in class and on homework.
    • Why does Prolog sometimes produce multiple identical answers to a query?
    • What is the fundamental algorithm that Prolog uses in seeking bindings that satisfy predicates?
    • What is "unification"? What are the differences among Prolog's =, ==, and is operators? Similarly, what are the differences among the \+, \=, and \== operators?
    • Why does the order in which Prolog clauses are placed sometimes matter to the inference of variable/value bindings?
    • What types of problems is Prolog well-suited for?
    • You should feel comfortable composing a Prolog solution to a (moderately-sized) problem similar to the ones we did in class and on the HW.
    • What does it mean for a logical statement (that may include variables) to be satisfiable or unsatisfiable or a tautology?
    • You should understand an algorithm that can determine which of the three categories above a particular logical statement falls under.
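One such algorithm is brute-force truth-table enumeration: try every assignment of truth values to the variables. Here is a hedged Java sketch (the interface and names are my own, not from class) for formulas over a fixed number of Boolean variables:

```java
import java.util.function.Predicate;

// Classify a propositional formula -- given as a predicate over an
// assignment of truth values -- by enumerating all 2^n assignments.
public class TruthTable {
    public static String classify(Predicate<boolean[]> formula, int numVars) {
        boolean sawTrue = false, sawFalse = false;
        for (int mask = 0; mask < (1 << numVars); mask++) {
            boolean[] assignment = new boolean[numVars];
            for (int i = 0; i < numVars; i++)
                assignment[i] = ((mask >> i) & 1) == 1;   // bit i of mask = variable i
            if (formula.test(assignment)) sawTrue = true; else sawFalse = true;
        }
        if (!sawTrue)  return "unsatisfiable";  // false under every assignment
        if (!sawFalse) return "tautology";      // true under every assignment
        return "satisfiable";                   // true under some, false under others
    }
}
```

For example, `classify(v -> v[0] || !v[0], 1)` returns "tautology", while `classify(v -> v[0] && !v[0], 1)` returns "unsatisfiable". Note that this takes time exponential in the number of variables.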


  • Functional Programming in Scheme
    • Make sure that you understand and feel comfortable with Scheme's syntax and basic functions like first, rest, etc.
    • You should be comfortable with the differences in the assumptions that append, list, and cons make about their inputs.
    • Make sure that you can write basic Scheme functions like those developed in lecture and in the assignments. In particular, recursion is the secret to all happiness in functional programming -- you may want to review decomposing problems recursively.
    • What are anonymous functions and where are they useful? How is lambda used in Scheme?
    • What is tail recursion and how can it improve performance?
    • You should be familiar with the Scheme higher-order functions such as map, foldl, and foldr. You certainly don't need to memorize them, but you should feel like you could implement them and other higher-order functions from a definition using recursion.
    • What are tree and graph structures? What does it mean if a graph is acyclic? directed? Is a family tree a tree or a graph?
    • How can objects such as trees and graphs be encoded as lists? Make sure that you feel comfortable writing Scheme functions to manipulate such objects.
    • You should be able to create a use-it-or-lose-it algorithm that leverages recursion to solve a problem in Racket (or Prolog or Java, for that matter).
    • What is mutual recursion and when is it useful?
    • What types of problems is Scheme well-suited for?
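The recursive decomposition behind foldr is not Scheme-specific. As a cross-language sketch (written in Java, the course's other implementation language; the class name is my own), here is foldr built from exactly the recursion you would write in Scheme: combine the first element with the fold of the rest, and let the empty list yield the initial value.

```java
import java.util.List;
import java.util.function.BiFunction;

// foldr via Scheme-style recursion: (f first (foldr f init rest)),
// with the empty list as the base case returning init.
public class Fold {
    public static <A, B> B foldr(BiFunction<A, B, B> f, B init, List<A> xs) {
        if (xs.isEmpty()) return init;                   // base case: empty list
        A first = xs.get(0);                             // Scheme's (first xs)
        List<A> rest = xs.subList(1, xs.size());         // Scheme's (rest xs)
        return f.apply(first, foldr(f, init, rest));     // combine, recurring on rest
    }
}
```

For example, `foldr((x, acc) -> x + acc, 0, List.of(1, 2, 3))` computes 1 + (2 + (3 + 0)) = 6. Note that this version is not tail recursive; foldl can be written tail recursively by accumulating as you go.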


  • Finite Automata
    • What is a deterministic finite automaton (DFA)? What does the name mean? What is the definition of a regular language?
    • Feel comfortable building a DFA for a given regular language. Remember that the states can encode meaningful information about what the machine has seen so far.
    • What does "distinguishable" mean? What does "pairwise distinguishable" mean?
    • What does the Distinguishability Theorem state? How can it be used to prove that a specific DFA has the minimum possible number of states for the language that it accepts?
    • What does the Nonregular Language Theorem state? How can it be used to prove that a given language is not regular? How did we prove this theorem?
    • What is an NFA? What does it mean for an NFA to accept a string?
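To connect the formal definition to something concrete, here is a small sketch of a DFA simulator for one particular regular language: binary strings containing an even number of 1s. The table representation is my own choice; note how each state encodes what the machine has seen so far (even vs. odd count).

```java
// Simulate a 2-state DFA over the alphabet {0, 1} that accepts strings
// with an even number of 1s. State 0 = "even so far" (accepting, and
// the start state); state 1 = "odd so far".
public class EvenOnesDFA {
    // TRANSITION[state][symbol]: reading 0 keeps the state, reading 1 flips it
    private static final int[][] TRANSITION = { {0, 1}, {1, 0} };

    public static boolean accepts(String input) {   // input assumed to be 0s and 1s
        int state = 0;                              // start in the "even" state
        for (char c : input.toCharArray()) {
            int symbol = c - '0';                   // '0' -> 0, '1' -> 1
            state = TRANSITION[state][symbol];
        }
        return state == 0;                          // accept iff we end in state 0
    }
}
```

For example, `accepts("1001")` is true (two 1s) and `accepts("1011")` is false (three 1s); the empty string is accepted, since zero is even.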


  • Computability and Turing Machines
    • What is a Turing Machine? How does it operate? You should feel comfortable building a Turing Machine that will accept simple languages. What is the Church-Turing Hypothesis? (Answer: it's the belief that Turing Machines can compute anything that any machine can compute, i.e., that Turing Machines are computationally complete. It's near-universally believed among computer scientists.)
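The table-driven mechanics of a Turing Machine (state, head position, read/write/move per step) can be sketched in a few lines of Java. The simulator and the example machine below (which flips every bit and halts at the blank) are my own illustration, not a machine from class; it uses a Java `record`, so it needs Java 16 or later, and it assumes the transition table is total for every (state, symbol) pair it can reach.

```java
import java.util.HashMap;
import java.util.Map;

// A tiny table-driven Turing Machine simulator. Each transition maps
// "state,symbol" to (next state, symbol to write, head move).
public class TinyTM {
    record Action(String nextState, char write, int move) {}  // move: +1 right, -1 left, 0 stay

    public static String run(Map<String, Action> delta, String start, String input) {
        StringBuilder tape = new StringBuilder(input + "_");  // '_' marks blank tape
        int head = 0;
        String state = start;
        while (!state.equals("halt")) {
            Action a = delta.get(state + "," + tape.charAt(head));
            tape.setCharAt(head, a.write());   // write, then move, then change state
            head += a.move();
            state = a.nextState();
        }
        return tape.toString();
    }

    public static void main(String[] args) {
        Map<String, Action> flip = new HashMap<>();
        flip.put("s,0", new Action("s", '1', +1));   // flip 0 -> 1, move right
        flip.put("s,1", new Action("s", '0', +1));   // flip 1 -> 0, move right
        flip.put("s,_", new Action("halt", '_', 0)); // blank: halt
        System.out.println(run(flip, "s", "0110"));  // prints 1001_
    }
}
```

A machine that accepts a language would work the same way, but would end in an accepting or rejecting halt state rather than producing output on the tape.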


  • Algorithm Analysis
    • Two useful formulae for analyzing algorithms are the formula for the arithmetic progression:
       1 + 2 + 3 + ... + N = N(N+1)/2 = O(N^2)
      and the formula for the exponential/geometric progression, in which the last term dominates. Usually we use x = 2 here, but any x > 1 works:
       x^0 + x^1 + ... + x^N = (x^(N+1) - 1) / (x-1) = O(x^N)
      You might want to write these on your sheet. Note that the geometric formula is the same as saying
       1 + 2 + 4 + 8 + ... + N/2 + N = 2N - 1 = O(N)
      just with a difference in what N refers to (in the former case, N is the number of terms; in the latter case, N is the size of the largest term).
    • Make sure that you understand the algorithm analysis examples we saw in class and on homework.
    • How is big-O defined? How is it used (less formally) in practice?
    • Be sure you feel familiar with defining a recurrence relation based on a short piece of code -- and then can "unwind" that to find the big-O running time of that code.
    • Remind yourself how to keep track of the work done in loops (especially nested loops) in order to estimate the big-O running time of those structures.
    • How did we prove that every comparison-based sorting algorithm requires at least n log n comparisons (asymptotic worst-case running time)? How can BucketSort be faster?!
    • What is the difference between "tractable" and "intractable" problems? (Answer: tractable means polynomial time; intractable means exponential time.) Why do we care? (Tractable problems are doable; intractable problems are not doable for large input sizes.)
    • How can divide and conquer (or "use-it-or-lose-it") help build recursive algorithms to solve optimization problems?
    • How can memoization and dynamic programming make those recursive algorithms run much faster? You should feel comfortable with the UIOLI and DP examples from class including the Floyd-Warshall algorithm.
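As a small reminder of how memoization transforms a recursive algorithm's running time, here is a sketch using Fibonacci (a standard illustration, not necessarily one of the class examples): the naive recursion repeats subproblems exponentially often, while caching each answer makes the computation linear in n.

```java
import java.util.HashMap;
import java.util.Map;

// Memoized Fibonacci: without the cache the recursion tree has
// exponentially many nodes; with it, each value of n is computed
// only once, so the running time is O(n).
public class MemoFib {
    private static final Map<Integer, Long> cache = new HashMap<>();

    public static long fib(int n) {
        if (n <= 1) return n;                  // base cases: fib(0) = 0, fib(1) = 1
        Long cached = cache.get(n);
        if (cached != null) return cached;     // reuse an already-solved subproblem
        long result = fib(n - 1) + fib(n - 2);
        cache.put(n, result);                  // remember the answer for next time
        return result;
    }
}
```

Dynamic programming is the same idea run "bottom-up": fill in a table of subproblem answers in order (smallest first) instead of caching on the way back down the recursion.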


  • Fundamental Algorithms
    • How do MergeSort, InsertionSort, HeapSort, BucketSort, and BogoSort work?  What are the worst-case bounds on each?  What are the advantages and disadvantages of each?
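As one worked example, here is a sketch of MergeSort on int arrays (the class name is my own). The merge step is where the O(n log n) worst-case bound comes from: there are about log n levels of splitting, and merging does O(n) work at each level.

```java
import java.util.Arrays;

// MergeSort: split the array in half, sort each half recursively, then
// merge the two sorted halves. Worst case O(n log n); this version is
// not in-place, since it allocates new arrays for the halves.
public class MergeSortDemo {
    public static int[] mergeSort(int[] a) {
        if (a.length <= 1) return a;                       // base case: already sorted
        int mid = a.length / 2;
        int[] left  = mergeSort(Arrays.copyOfRange(a, 0, mid));
        int[] right = mergeSort(Arrays.copyOfRange(a, mid, a.length));
        return merge(left, right);
    }

    private static int[] merge(int[] left, int[] right) {
        int[] out = new int[left.length + right.length];
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length)        // repeatedly take the smaller head
            out[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        while (i < left.length)  out[k++] = left[i++];     // drain whichever side remains
        while (j < right.length) out[k++] = right[j++];
        return out;
    }
}
```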


  • High-level design and testing/debugging strategies
    • What constitutes a good set of test cases? (Coverage of a problem's and an implementation's "corner cases" or "edge cases".)
    • Be comfortable breaking a problem down to a level where implementation would be straightforward.
    • Given a piece of (potentially complicated) code, be able to suggest checks that would reveal potential bugs in the code.