Computer Science 121 - Project 2

Design, Review, Implementation and Testing

Fall 2009

Project Description

  1. Introduction
  2. Overview and Grading
  3. Project Details
  4. Rules and Submission

1. Introduction

The first project went from concept through proposal. The second project deals with the design and implementation phases of a project:

What you choose to design and build is pretty much up to you. You may choose to continue building components you started as part of the greater game, some part of your original game proposal, or something entirely different. The only constraints I intend to impose on your choice are to ensure that it is of sufficient complexity to require meaningful design and test analysis, where considered decisions must be made, and where reviews are likely to add significant value. I suspect that these requirements can be met by the game (story arcs) design ... but if someone wants to do this, we will have some details to work out.

It is strongly encouraged, but not actually required, that you carry these activities out in teams (e.g. 4-5 person groups) where your individual components will combine to implement some larger and more interesting functionality. From a learning perspective, working on a team project will require you to manage evolving specifications and integration with work done by other people. A team of people working on related components provides a natural panel of well informed people to review your designs and test plans. It will also help you by providing other people to pace your efforts and help you with problems.

If you prefer to complete these activities as an individual, please come speak to me about your ideas.

As with the first project, the primary goal is to give you real experience with the skills, processes and work-products that we will be discussing in this class, but we are also looking to develop your abilities to intelligently plan a group endeavor, effectively manage it, and critically evaluate the completed effort. These skills will serve you well in clinic, and (more importantly) in the larger projects that you will surely prosecute in your careers to follow (whether commercial or academic).

2. Overview and Grading

In the first project we gave you due dates for each of the major deliverables. In the second project, you will be responsible for deciding how to spread the required work over the available time. The only externally imposed dates are for your initial proposal and project plan, and for your final report delivery. Everything in between is under your control.

The following table describes the general model we will use to compute your grades for this project.
  General Class          Deliverable/Process              Form                  Type   Value
  Plans (40%)            initial proposal                 report                team    5%
                         Component Design                 report                indiv  10%
                         Design Updates                   report                indiv   5%
                         Component Test Plan              report                indiv  10%
                         Test Plan Updates                report                indiv   5%
                         Integration & System Test Plan   report                team    5%
  Review (25%)           Review notes                     report                indiv  10%
                         Review process                   process               team    5%
                         Review Reports                   report                team   10%
  Implementation (15%)   Implementation                   code, tests, results  indiv  10%
                         Integration and System Test      code, tests, results  team    5%
  Management (20%)       Plan & Issue Management          report, diary         team    5%
                         On-Time, Communication           checkins, diary       indiv   5%
                         Post Mortem                      report                team   10%

If you have work that you completed for the greater game, and would like to have graded and averaged into your final grade, come and talk to me about it.

3. Project Details

3.1 Design Documents

The majority of the work involved in project 2 will be the development and review of designs and test plans. Each student will prepare a component design to satisfy provided specifications, package it for design review, revise it in accordance with the results of that review, and get the revisions approved by the review team.

You will find that preparing designs and test plans for review will:

  1. force you to more deeply consider your designs and testing plans.
  2. greatly reinforce the elements of good design.
  3. effectively exercise both your envisioning and your descriptive skills.
  4. provide you with detailed feedback on where your descriptions were unclear.

Reviewing plans developed by others will:

  1. develop your ability to read descriptions of complex software.
  2. exercise your skills at critically analyzing proposed systems.
  3. force you to learn how to clearly articulate concerns and issues.
  4. give you many demonstrations of the difficulty and importance of clear descriptions.
  5. give you the opportunity to learn design and communications techniques from others.

For all of these reasons, every student will be required to prepare a design plan and a test plan for review, and to participate in a graded design review process.

3.1.1 Initial Proposal

Each team will:

The only requirements I would like to put on this work are:

  1. each student must have a component complex enough to be worth designing, having that design reviewed, developing a test plan for, and having that test plan reviewed.
  2. the modules developed by the various team members must, in some sense, combine to provide some demonstrable and testable higher level functionality (which means that they must exchange calls, data or services).

Beyond meeting these requirements, you will not be graded on the amount of work you have to do to implement this project.

If you would like to meet the project 2 requirements in a different way (e.g. with work you did for the greater game, where the notions of implementation and integration are not well defined) come and talk to me about your ideas first.

These initial proposals will be graded on the basis of:

The grading for the estimates and schedule is discussed under section 3.4.1 (management).

3.1.1+ Architecture and Component Specifications

Between the time you develop your concept, and the time you have component specifications to which you will design, you must develop an architecture and component specifications. Had we not been working on the game for the last few weeks, you would have prepared and reviewed these as graded deliverables. As a concession to time, I am eliminating these as graded deliverables ... but you will still have to spend some time working up, reviewing, and refining an architecture in order to develop component specifications. You can shorten this step, but you cannot eliminate it (and I would encourage you not to try to do so :-)

3.1.2 Component Design

Each student will prepare a design document for the component that they will define, implement and test. The design presentation should address two goals:

  1. present the detailed interface definitions, component specifications, and proposed implementation in sufficient detail to permit reviewers to examine their correctness (and raise issues) before you commit to a particular implementation.
  2. describe the interfaces, component specifications, data structures, algorithms, and proposed plan of implementation in sufficient detail to enable a developer (familiar with the domain and tools) to sit down and implement it (without having to solve additional problems).

A single document can almost always serve both purposes. While rationale is not (technically) considered part of a design, an overview of the problems and considerations that have guided the design process can be extremely valuable to both reviewers and implementers. In some cases, the actual work to be done may be trivial ... but the analysis that led to the definition of that work is what must be carefully scrutinized.

What must be specified?

How completely specified?
You are preparing a design to be reviewed by the other students in your team. Describe things in sufficient detail to enable them to understand it, but don't waste your time or theirs with things that should be obvious to all of you.

You are hoping that they can help you find the kinds of design mistakes that you are likely to make. This means that you must elaborate your design in sufficient detail to give them a chance to see your "mistakes". You need not show the details of obvious or trivial algorithms. This is not a "code review" but a "design review".
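For a concrete (if artificial) sense of this level of detail, a design might present an interface as a documented stub: signatures, contracts, invariants, and error behavior are explicit, while routine bookkeeping is left to the implementer. The Inventory component below is entirely hypothetical (sketched in Python, which may or may not be your team's language); it is not part of any assigned specification.

```python
# Illustrative sketch only: a hypothetical game-inventory component,
# specified at design-review depth. Contracts and error behavior are
# spelled out; trivial implementation detail is kept to a minimum.

class InventoryError(Exception):
    """Raised when an inventory operation violates its preconditions."""

class Inventory:
    """A bounded collection of items carried by a player.

    Invariant: the total weight of held items never exceeds capacity.
    """

    def __init__(self, capacity):
        """capacity: maximum total weight (must be positive)."""
        if capacity <= 0:
            raise InventoryError("capacity must be positive")
        self.capacity = capacity
        self._items = {}  # item name -> weight

    def add(self, name, weight):
        """Add an item; raise InventoryError if the name is already
        present or the addition would exceed capacity."""
        if name in self._items:
            raise InventoryError("duplicate item: " + name)
        if self.total_weight() + weight > self.capacity:
            raise InventoryError("over capacity")
        self._items[name] = weight

    def total_weight(self):
        """Return the sum of the weights of all held items."""
        return sum(self._items.values())
```

A reviewer can check the capacity invariant and the error contract against the component specification without having to read any nontrivial code.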

3.1.3 Design Updates

You will revise your design to respond to all of the must-fix issues raised in the design review, until you have final approval from the review team. (Note: if no revisions are required, these are free points.)

The grading of these updates will be based on

3.1.4 Component Test Plan

Considering the component design, you will prepare a risk analysis, and propose a comprehensive testing plan (goals, risks, testing strategy, suites, test cases, automation tools) for your component. You will submit this test plan for review, revise it in accordance with that review, and get the revisions approved by the review team.

Deciding on an appropriate plan is not at all a simple thing:

Therefore, test plans often begin with a risk analysis: looking at the proposed design, assess how likely each area is to harbor problems and how costly those problems would be. These assessments are the rational (expectancy/cost) basis for your testing plan. Areas identified as simple and low risk can be covered by a small number of simple test cases. For areas where complexity and risk are high, you will be able to propose (and justify the adequacy of) a testing plan that directly addresses the risks (as best you understand them). Your goal is a test plan that is demonstrably adequate but not wasteful.

This risk analysis will be graded (~20%) on the basis of the completeness of your analysis (covering the most likely problems), the depth and reasonableness of your assessments, and whether or not you derive useful conclusions about where you need to focus your testing efforts.

After your risk analysis, I would like to see a high-level testing strategy, in which you lay out:

Your testing strategy will be graded (~20%) on the basis of its completeness (does it cover all component functionality?), the confidence it is likely to yield about the correctness of your component, the practicality of your approach, and the clarity with which you argue for the correctness of this proposal.

I would then like to see detailed test plans (~40%), where you

Your detailed testing plans will be graded on the basis of completeness, response to your risk analysis, congruence with your testing strategy, the clarity with which you describe the assertions to be tested, how functionality will be exercised, how you will ascertain correctness, and the reasonableness of your automation plan.

All of this must be packaged for review (~10%) with

And the final 10% of your grade will be based on the overall practicality of this plan and the confidence it inspires.

How much detail?
Specifying test cases in too much detail takes a great deal of time (to fill out some standardized form) while doing little to improve the likelihood that the test plan will be correctly implemented. Specifying test cases in insufficient detail leaves it unclear what test cases need to be implemented and what each needs to do ... increasing the work that has to be done by the implementer (to interpret the instructions) and leaving room for important tests to be "missed" in the interpretation. You do not want to make either of these mistakes.
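To make the trade-off concrete, here is a rough sketch of a single automated test case written at about the level of detail a plan entry might call for. The score_total function and the plan-entry number "T-7" are invented for illustration; neither comes from any actual specification.

```python
# Sketch only: a hypothetical component function and one automated test
# case for it. "Plan entry T-7" is an invented test-plan identifier.
import unittest

def score_total(points):
    """Sum a list of per-level scores; any negative entry is invalid."""
    if any(p < 0 for p in points):
        raise ValueError("negative score")
    return sum(points)

class ScoreTotalTests(unittest.TestCase):
    # Plan entry T-7 (hypothetical): "the total of an empty list is 0;
    # any negative entry raises ValueError." The plan names the
    # assertions; the mechanics (fixtures, runner) are left to the code.

    def test_empty_list_totals_zero(self):
        self.assertEqual(score_total([]), 0)

    def test_negative_entry_rejected(self):
        with self.assertRaises(ValueError):
            score_total([10, -1])

if __name__ == "__main__":
    unittest.main(exit=False)
```

Each test method corresponds to one assertion named in the plan, so a reviewer can check coverage against the plan without reading fixture code.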

NOTE: the design document can be made required reading for all test plan reviewers, so there is no need to repeat information already well covered there.

3.1.5 Test Plan Updates

You will revise your test plan to respond to all of the must-fix issues raised in the review, until you have final approval from the review team. (Note: if no revisions are required, these are free points.)

The grading of these updates will be based on

3.1.6 Integration & System Test Plan

In addition to your individual component design and test plans, each team will be asked to write an integration and system test plan:

Your integration and system testing plan will be graded on the basis of:

3.2 Reviews

Every student will be required to take part in at least one graded design/test review process (similar to the architectural review performed in class).

3.2.1 Review notes

Prior to each review, each of the reviewers will be required to study the plans to be reviewed, and prepare written comments. These will be graded on the basis of:

3.2.2 Review process

Each team will conduct a formal review of each submitted design or test plan. One of these reviews will be observed and graded (as a process).

3.2.3 Review Reports

Each review (not merely the graded one) will produce a written report. It should be written by the person who acted as the scribe for the review. The entire review team will receive a grade for it, AND the scribe will receive a primary-authorship work quality assessment as well. Thus, you probably want to have different people act as scribe for each review.

The report will be graded on the basis of:

3.3 Implementation

Each member of the team will implement their component design and test plan. Then the components will be combined (as described by the integration plan), tested (as described by the system testing plan), and demonstrated for the professor and/or grutors.

3.3.1 Implementation

Each implementation will be graded on the basis of:

3.3.2 Integration and System Test

The final integration will be graded on

3.4 Management

3.4.1 Plan & Issue Management

Each team will submit, along with their initial proposal, an initial plan for the work to be done in project 2. This plan should include:

As the project proceeds, the task list and definitions will probably change, and issues will arise. As issues arise, they should be documented and the plan revised (all under Subversion control).

This management work will be graded on the basis of:

3.4.2 On-Time, Issue & Status reporting

Each individual will be graded on:

3.4.3 Post Mortem

You will, as a team, review all aspects of project 2 (both your activities in the greater-game project, and these more limited activities). One of you will then summarize that process into a post-mortem analysis report. It should specifically address:

You will be graded on the basis of:

Note that if you make no mistakes, you will not be able to earn any points on the post-mortem. Fortunately, no teams have ever found it necessary to deliberately make mistakes in order to have something to analyze.