Timeline

A suggested year-long EE schedule with CS-specific milestones. Adapt the months to your school’s calendar; your supervisor will give you the dates that actually apply.

Important. The specific deliverables in the table below – a 5+ source bibliography in March, a 25-source bibliography by May, a 4-paragraph outline in June, and a final school deadline in November – are example school deadlines, not IB requirements. The IB requires the essay, the RPF (with three reflection sessions and the 500-word statement), and that the work is spread across roughly 40 hours over an extended period. Everything else on the calendar is your school’s process.


Suggested calendar

Month | Student milestone | Reflection session
January | EE kick-off. Begin reading and exploring broad topic areas. | –
February | Develop research question. Begin focused research. | First reflection (initial)
March | Submit research proposal and annotated bibliography (5+ sources). | –
April | Add five more sources to the annotated bibliography. Continue research. | –
May | Continue research; grow the annotated bibliography to ~25 sources. | –
June (before summer) | Submit detailed outline (minimum 4 paragraphs). | –
July–August | Continue research. Begin drafting the essay. | –
September | Full draft due. | Interim reflection
October | Revise based on feedback. Submit final draft. | Final reflection (viva voce)
November | Final essay submission. Complete reflective statement on the RPF. | –

Key student deadlines

  • Research proposal and initial annotated bibliography (5+ sources): March
  • 25-source annotated bibliography: May
  • Detailed outline: June (before the summer break)
  • Full draft: September
  • Final version: October
  • Final submission: November

CS-specific milestones

These are the technical steps that map onto the calendar above. They are not separate deadlines, but they are the things that tend to slip when students underestimate them.

When | CS-specific task
Feb–Mar | Identify your dataset(s) or design your experimental setup.
Mar–Apr | Set up your development environment. Install required libraries. Allow at least two weeks for ML/data-pipeline projects – dependency installs, GPU access, and dataset downloads routinely take longer than expected.
Apr–May | Run pilot experiments. Verify the methodology actually produces useful data.
May–Jun | Complete primary data collection or experimentation.
Jun–Jul | Begin analysis of results.
Jul–Aug | Draft the methodology, results, and discussion sections.
Sep | Complete a draft with all sections, including evaluation.
Oct | Final revisions. Make sure code is in the appendix, not the body.
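One low-cost way to front-load the Mar–Apr environment setup is a dependency check you can run on day one. The package list below is a hypothetical example for an ML project; substitute whatever your own methodology actually requires.

```python
import importlib.util

# Hypothetical dependency list for an ML/data-pipeline EE project;
# replace with the libraries your own methodology requires.
DEPENDENCIES = ["numpy", "pandas", "sklearn", "matplotlib"]

def missing_packages(names):
    """Return the package names that are not importable in this environment."""
    return [n for n in names if importlib.util.find_spec(n) is None]

if __name__ == "__main__":
    missing = missing_packages(DEPENDENCIES)
    if missing:
        print("Install before proceeding:", ", ".join(missing))
    else:
        print("Environment ready.")
```

Running this in a fresh environment tells you immediately which installs to budget time for, instead of discovering them mid-experiment.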

Why pilot experiments matter

The most painful place to discover your methodology does not work is in September, with a draft due. A small pilot in April – one dataset, a few trials, the metrics you intend to use – will surface problems while you still have time to fix them.

Common things a pilot reveals:

  • The dataset is too small or unrepresentative for the comparison you wanted to make.
  • The metric you chose does not actually distinguish the algorithms or models in question.
  • Run times are too long to support the number of trials you planned.
  • The library you wanted to use does not support the configuration you need.

If any of those surface in September, you are unlikely to have time to recover. If they surface in April, they are just project notes you act on.
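The run-time concern above is easy to quantify during the pilot itself. The sketch below times a few pilot trials and extrapolates to the full experiment; run_once is a hypothetical stand-in for one trial of your own experiment.

```python
import time

def pilot_estimate(run_once, n_planned_trials, n_pilot=3):
    """Time n_pilot trials and extrapolate the cost of the full experiment.

    run_once is a stand-in for a single trial of your experiment
    (hypothetical here; substitute your own function).
    """
    start = time.perf_counter()
    for _ in range(n_pilot):
        run_once()
    per_trial = (time.perf_counter() - start) / n_pilot
    return per_trial, per_trial * n_planned_trials / 3600  # (seconds, hours)

# Example: a toy trial that just does some arithmetic.
seconds_per_trial, total_hours = pilot_estimate(
    lambda: sum(i * i for i in range(10_000)), n_planned_trials=500
)
print(f"~{seconds_per_trial:.4f} s/trial, ~{total_hours:.2f} h total")
```

If the projected total is longer than the time you have before the May–June data-collection window, you know in April, not September, that the trial count has to shrink.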


© EduCS.me — A resource hub for Computer Science education
