

Table of contents

  1. Overview
  2. Default project
  3. Milestones and evaluations
    1. Progress report (5%)
    2. Writeup (35%)
    3. Short presentation (10%)
  4. Miscellaneous project policy


Your project will be one of two options:

  1. Custom project. You will carry out a research project of your choice on topics covered in the course. If you prefer to do a purely theoretical project – either exploring extensions to existing proofs or writing a comprehensive survey – please contact the course staff.
  2. Default project. You will implement a method covered in the course and apply it to a distribution shift problem. We provide more information on the default project below.

You are highly encouraged to form groups of up to 3 people.

We will provide $50 of compute credits per student. Please use them wisely! We will release instructions on how to claim Google Cloud credits on Ed soon. For setting up Google Cloud, you might find the CS321N Tutorial helpful.

Default project

In the default project, you will:

  • Pick a distribution shift problem and an appropriate method, both of which are covered in the course
  • Implement the method
  • Evaluate the method on the distribution shift/dataset of choice and compare with a relevant baseline
  • (Encouraged) Run systematic experiments to investigate why the method works or does not work in your setting.
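As a rough sketch of the implement/evaluate/compare loop above, the following toy example shows the kind of source-vs-target comparison we have in mind. The nearest-centroid "model" and the synthetic Gaussian data are deliberately simple stand-ins of our own, not a course-provided baseline or benchmark:

```python
import numpy as np

# Hypothetical skeleton: train on source data, then report accuracy on both
# a source split and a shifted target split. A real project would swap in
# an actual method and dataset and add a relevant baseline for comparison.

rng = np.random.default_rng(0)

def fit_centroids(X, y):
    """A minimal 'model': one class centroid per label."""
    classes = np.unique(y)
    return classes, np.stack([X[y == c].mean(axis=0) for c in classes])

def predict(model, X):
    classes, centroids = model
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=-1)
    return classes[d.argmin(axis=1)]

def accuracy(model, X, y):
    return float((predict(model, X) == y).mean())

# Synthetic source data: two well-separated Gaussian classes
X_src = np.concatenate([rng.normal(-2, 1, (200, 2)), rng.normal(2, 1, (200, 2))])
y_src = np.array([0] * 200 + [1] * 200)

# Shifted target data: class 0 has drifted toward the decision boundary
X_tgt = np.concatenate([rng.normal(0.5, 1, (100, 2)), rng.normal(3, 1, (100, 2))])
y_tgt = np.array([0] * 100 + [1] * 100)

baseline = fit_centroids(X_src, y_src)
print(f"source acc: {accuracy(baseline, X_src, y_src):.2f}, "
      f"target acc: {accuracy(baseline, X_tgt, y_tgt):.2f}")
```

The gap between the two printed accuracies is exactly the kind of degradation under shift that your experiments should quantify and then try to explain.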

While you are free to work on any distribution shift problem and method covered in the course, as well as any dataset of your choice, we provide some defaults below as suggestions:

  • Dataset. Our default datasets include the WILDS benchmark and a few standard domain adaptation benchmarks, and we describe them here.
  • Method. We have selected several methods covered in the course:
    • Importance weighting for label shift (confusion matrix and/or EM)
    • Optimal transport for domain adaptation
    • Domain adversarial neural networks (DANN)
    • Self-training for domain adaptation
    • Invariant risk minimization (IRM)
    • Distributionally robust optimization (DRO)
    • Certified adversarial robustness via randomized smoothing
    • Robust self-training (RST) for adversarial robustness
    • SEVER
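Of the listed defaults, the first is simple enough to sketch end to end. Below is a minimal, hypothetical illustration of the confusion-matrix estimator for label shift, which solves C w = μ for the importance weights w = q(y)/p(y); the function name and the toy data are ours, not from the course materials:

```python
import numpy as np

# Hypothetical sketch of confusion-matrix label-shift estimation, assuming
# a fixed classifier's hard predictions on labeled source validation data
# and on unlabeled target data. The toy data below is synthetic.

def estimate_label_shift_weights(val_preds, val_labels, target_preds, num_classes):
    """Solve C w = mu for importance weights w = q(y) / p(y)."""
    # C[i, j] = fraction of source validation examples with prediction i
    # and true label j
    C = np.zeros((num_classes, num_classes))
    for pred, label in zip(val_preds, val_labels):
        C[pred, label] += 1
    C /= len(val_preds)

    # mu[i] = fraction of target examples predicted as class i
    mu = np.bincount(target_preds, minlength=num_classes) / len(target_preds)

    # Solve the linear system; clip negatives from finite-sample noise
    w = np.linalg.solve(C, mu)
    return np.clip(w, 0.0, None)

# Toy demo: a perfect classifier on balanced source data, with the target
# skewed toward class 0, so w = q(y)/p(y) = [1.5, 0.5]
w = estimate_label_shift_weights(
    val_preds=np.array([0, 0, 1, 1]),
    val_labels=np.array([0, 0, 1, 1]),
    target_preds=np.array([0, 0, 0, 1]),
    num_classes=2,
)
print(w)  # → [1.5 0.5]
```

In a full project, these weights would reweight the per-example training loss on the source data before retraining or recalibrating the classifier.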

Milestones and evaluations

The project follows a fairly standard structure and is evaluated on the three parts below.

Progress report (5%)

Halfway through the class, each group is expected to submit a short progress report of 1-2 pages describing the proposed project plan as well as any partial progress. The progress report is due May 1 at midnight. It should be submitted via Gradescope as a PDF, and any math should be typeset in LaTeX. Please have one member submit on behalf of the entire group, and list all group members in your submission.

Default project. The progress report should include a formal statement of the distribution shift problem and the algorithm, descriptions of dataset(s) and task(s) considered, and a preliminary literature survey. Please also report any preliminary results.

Custom project. For custom projects, the progress report should be like a project proposal. It should describe the motivation, formal problem statement, preliminary literature survey, your approach, and any preliminary results.

We will evaluate the progress report on completeness; it should include all the above components in sufficient detail.

Writeup (35%)

At the end of the quarter, each group is expected to write an 8-page paper on the project. This will be the main evaluation, and it should be submitted via Gradescope as a PDF by June 5 at midnight. Please make sure the paper is submitted by the same member who submitted the progress report, on behalf of the entire group.

The write-up should resemble papers published at machine learning conferences, including many papers discussed in class, such as this paper. To this end, the write-up should include the following:

  • Abstract: Summarize the project in one paragraph.
  • Introduction: Describe the motivation, problem, and key results at high level.
  • Related work: Provide a literature survey on relevant existing work.
  • Set-up: Formally state the problem and define any relevant notation.
  • Approach: Motivate and describe your approach and algorithm.
  • Empirical results: Describe the experimental set-up (model, dataset, hyperparameters), empirical results, and interpretations. What are your key findings?
  • Discussion and conclusions: Discuss any remaining questions and interesting future directions.
  • References: Include all references. References do not count toward the 8-page limit.

While students are expected to follow the above structure for the default project, write-ups for custom projects can deviate from it to better fit the research project. The write-up should be 6-8 pages, excluding references, and you may include an appendix in addition if you wish. Please follow the NeurIPS formatting guidelines and use the appropriate style files.

Lastly, the following are due along with the write-up (also on June 5 at midnight):

  • Code. Students are also expected to submit code for their project. Please upload your code to GitHub, Bitbucket, Google Drive, Dropbox, etc. and include a link in the final write-up. The code does not have to run out-of-the-box, but it should include everything needed to reproduce the results, along with a README file documenting what commands you ran.
  • Contributions. Please respond to a quick questionnaire on each member's contributions to the project. This is just to ensure everyone contributes meaningfully. Please make sure that each member of your project group submits this form; we will share it on Ed later in the quarter.

We will evaluate the write-up on the correctness, clarity, and completeness of the entire write-up, as well as the following dimensions:

  • Problem, set-up, and approach. For the default project, is the chosen method appropriate for the distribution shift problem? For custom projects, are the problem and approach clearly motivated?
  • Results. Does the project investigate a problem or a method in a thorough, systematic, and technically sound fashion? Are the claims directly supported by evidence? For the default project, does the project go beyond simply applying the method to the dataset, for example by comparing with relevant baselines and investigating why the method works or fails?
  • Related work. Does the related work cover the major lines of work related to the project? Where relevant, it should include not only work prior to the proposed method but also relevant follow-up work.
  • Novelty of proposed ideas and findings (extra credit).

Short presentation (10%)

Students will give a short 5-10 minute presentation of their work in class on the last day. The presentation should distill the project to convey three points:

  • Problem, motivation, and approach. Summarize the problem, motivation, and approach/method.
  • Claims. Describe your findings and their significance.
  • Evidence. Present the concrete results that support your findings.

The goal of the presentation is to convey the important high-level ideas and takeaways of your project, rather than all the details. Presenters are encouraged to share their screen (e.g., present with slides), and should be prepared to take questions. The evaluation will be based on both the content (largely following the write-up evaluation criteria) and the presentation itself.

Miscellaneous project policy

Using the same project for CS329D and another class. While the projects can be related and share a codebase, we do not allow submitting an identical project for another class. If you are working on related projects for two courses, you should first make sure that you follow the guidelines for the CS329D project. Second, if any part of the project is done for another course, please clearly indicate which part of the project was done for CS329D and which part was not, both in the progress report and in the contributions questionnaire that accompanies the final write-up.

Collaborating with students in another course. This is allowed if (1) the overall project fits the CS329D project guidelines, and (2) your contributions are directly related to the main objectives of the default project / the course topics for the custom project. Please demonstrate that the latter is the case by clearly stating your and your teammates’ contributions in the contributions questionnaire that accompanies the final write-up.