CS453 Automated Software Testing, Spring 2023

Lectures

Time: 10:30-11:45, Mondays and Wednesdays Location: N1 114

Lecturer

Shin Yoo shin.yoo@kaist.ac.kr Office: E3-1 Room 2405

Communication

All class announcements, as well as Q&A, will take place in a dedicated Slack workspace. You are required to join cs453spring2023.slack.com if you want to continue with this course. It is strongly recommended that you install either a desktop or a mobile client to get notifications. Email the lecturer or one of the TAs for the invitation link if you do not have one. When you sign up, please set your username to your real full name in English, followed by “(your student id number)”. For example, “Shin Yoo (20201234)”.

Syllabus

This course is concerned with a broad range of software testing techniques, with a heavy emphasis on automation, tools, and frameworks, as well as the research outputs behind them. The topics will include, but are not limited to: black-box testing/combinatorial testing, random testing, concepts of coverage, structural testing, mutation testing, regression testing, testability transformation, and automated debugging.

Prerequisite

  • Strong programming skills: you are required to actively contribute to group and individual projects, which involve serious implementation work. There will also be a number of hands-on sessions where we will program together during class.
  • Unix/Linux-savvy: you should be familiar with the usual build tools and Unix/Linux command line environments.
  • Git-aware: knowing how to use Git is mandatory for this course. First, we will use GitHub Classroom for coursework. Second, you will be required to submit a GitHub repository as part of your project deliverables.
  • Ideally, CS350 Introduction to Software Engineering.

Evaluation

Please note that, unlike in previous years, we will have a final exam instead of a mid-term exam. Also, there are no participation points.

  • Coursework: 40%
  • Project: 30%
  • Final Exam: 30%

Teaching Assistant

  • To be announced.

References

We do not have a textbook per se; the course will be based on slides and other reading material that are deemed appropriate. However, if you want to get a broader sense of some of the topics covered by this course, I recommend the following books and publications.

Lecture Schedule

Assignment 0: GitHub Classroom Onboarding

You need to get familiar with GitHub Classroom: create a GitHub account if you do not have one, and learn the basics of Git. The assignment invitation link is here.

Assignment 1: Introduction to Metaprogramming

You will learn how to manipulate Python code using the ast module. This assignment takes up 5% of the total course grade. The assignment invitation link is here.
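To give a taste of what the ast module enables, here is a minimal sketch of AST-based metaprogramming; the rewrite itself (replacing calls to print with calls to a hypothetical log function) is just an illustrative example, not part of the assignment spec:

```python
import ast

# Source code we want to transform (requires Python 3.9+ for ast.unparse).
source = "print(1 + 2)"
tree = ast.parse(source)

class RenamePrint(ast.NodeTransformer):
    """Rename every reference to `print` into `log` (a made-up name)."""
    def visit_Name(self, node):
        if node.id == "print":
            return ast.copy_location(ast.Name(id="log", ctx=node.ctx), node)
        return node

# Apply the transformer and repair line/column info on the new nodes.
new_tree = ast.fix_missing_locations(RenamePrint().visit(tree))
print(ast.unparse(new_tree))  # log(1 + 2)
```

The same parse-transform-unparse pipeline underlies most of the tools built later in the course: coverage instrumentation and mutation both boil down to visiting and rewriting AST nodes.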

Assignment 2: Python Coverage Profiler

Your task is to write a coverage profiler for Python that can measure statement, branch, and condition coverage. This assignment takes up 15% of the total course grade. The assignment invitation link is here.

Assignment 3: Mutation Testing

Your task is to write a full mutation testing tool that mutates the given Python code, executes the given test cases against the generated mutants, and finally produces kill matrices. This assignment takes up 10% of the total course grade. The assignment link is here.

Assignment 4: Delta Debugging

Your task will be to implement a delta debugging tool that minimises an error-revealing input. First, we will implement linear and recursive DD for a fake input. Subsequently, we will move on to Hierarchical Delta Debugging for Python programs (i.e., working with ASTs). The assignment link is here.
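For orientation, here is a compact sketch of the classic ddmin-style reduction loop: split the input into chunks and greedily remove chunks as long as the failure persists. The failure predicate below (the input "fails" iff it contains both X and Y) is a fake oracle for illustration; Hierarchical Delta Debugging applies the same idea to AST subtrees instead of flat character sequences:

```python
def ddmin(data, fails):
    """Minimise `data` while the predicate `fails` still holds."""
    n = 2  # current number of chunks
    while len(data) >= 2:
        chunk = len(data) // n
        reduced = False
        for i in range(0, len(data), chunk):
            candidate = data[:i] + data[i + chunk:]  # drop one chunk
            if fails(candidate):
                data, n, reduced = candidate, max(n - 1, 2), True
                break
        if not reduced:
            if n >= len(data):  # already at finest granularity
                break
            n = min(n * 2, len(data))  # refine: smaller chunks
    return data

# Fake error-revealing condition: input fails iff it has both X and Y.
fails = lambda s: "X" in s and "Y" in s
print(ddmin("aXbcYd", fails))  # prints XY
```

The result is 1-minimal with respect to chunk removal: no single remaining character can be deleted without making the failure disappear.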

Project Aim

All teams should develop and/or implement an automated software testing technique based on an idea discussed during the course. I would encourage teams to pursue a novel idea, but a faithful reproduction of a state-of-the-art technique with solid evaluation would also do. If you are uncertain about your team’s idea, I will be happy to discuss it.

Proposal

All teams will give a presentation on the 1st and 3rd of May to explain their project topics. I expect three things to be described clearly in the talk:

  • A testing problem the team aims to solve
  • The technique the team is proposing
  • An evaluation plan that shows the proposed technique works and is competitive

Team Project Deliverables

Each team should submit the following:

  • the team report
  • the implementation: a public repository link in the report (e.g., a GitHub or Bitbucket repo)

The team report should include:

  • a precise description of the problem you attempted to solve
  • a clear description of how you tried to solve the problem
  • the results of an experimental before-and-after comparison: in other words, what benefits did your solution bring?

Additionally, each individual member should submit a separate individual report via KLMS:

  • details of what you have contributed to the project
  • peer assessment of your team members (yourself not included): use a 10-point scale to evaluate each of your teammates, and write a clear justification for each score.

The submission deadline is 20th June, 6pm, GMT+9. The following is a submission checklist:

  • Make sure you have put your group report (as pdf) and your presentation slides (as pdf) in your public project repository; make sure that we can find them easily (i.e., don’t hide them in an obscure folder).
  • Make sure one student from your team has submitted the repo link (plain text) on KLMS.
  • Make sure you have submitted your individual report (as pdf) on KLMS.

The final presentation dates for teams have been announced in the schedule section. Each team will have up to 15 minutes. If your team is scheduled on the earlier date, you can simply report your progress up to that point, with a clear plan for the remaining work.

Teams

Form your teams by 7th April: write down the member names in the Google Sheet document (the link will be available in the Slack workspace). Teams can have either four or five members.

Examples from the previous years

Below, I’ve picked a few projects from 2019 that I thought were interesting.

Paper List