Quality in context. Software process activities. The Waterfall model. The Prototyping model. Evolutionary development. The Spiral model. The Iterative Development Process (IDP).
Principles of the Agile Manifesto. Extreme Programming (XP) values, principles and practices. XP practices include stories, cycles, slack, small releases, pair programming, test-first programming, incremental design, continuous integration, ten-minute build, informative workspace.
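As a minimal sketch of the test-first practice listed above (the `Stack` class and its test are hypothetical examples, not from the lecture): the failing test is written first and acts as the specification, and only then is the simplest code written to make it pass.

```python
# Step 1 (red): write the test before the code it specifies.
def test_stack():
    s = Stack()
    s.push(1)
    s.push(2)
    assert s.pop() == 2      # LIFO order
    assert s.pop() == 1
    assert s.empty()

# Step 2 (green): write the simplest code that makes the test pass.
class Stack:
    def __init__(self):
        self._items = []
    def push(self, x):
        self._items.append(x)
    def pop(self):
        return self._items.pop()
    def empty(self):
        return not self._items

test_stack()   # the test now passes; refactor with it as a safety net
```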
Foundations of Scrum – empiricism, lean thinking, transparency. Scrum values. Scrum team members including product owner, scrum master, developers. Scrum framework activities including sprint, daily scrum, sprint review and sprint retrospective. Scrum artifacts including product backlog, sprint backlog and incremental releases. The Scrum metric of velocity.
Black box methods – output coverage testing. Exhaustive output testing. Output partitioning. Handling multiple input/output streams/files. Black box methods at different levels. Gray box testing. Black box unit testing. Test harnesses and stubs. Assertions in test automation, tools. Black box class testing (interface / object oriented testing). Traces. Implementing assertions. Black box integration testing.
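A small illustration of output partitioning with assertions as the test oracle; `letter_grade` is a hypothetical unit under test, not from the lecture. One input is chosen per output class so that every possible output is produced at least once, without exhaustively testing all inputs.

```python
def letter_grade(score):
    """Hypothetical unit under test: map a 0-100 score to a letter grade."""
    if score >= 80:
        return "A"
    if score >= 70:
        return "B"
    if score >= 60:
        return "C"
    if score >= 50:
        return "D"
    return "F"

# Output partitioning: one representative input per output class,
# with an assertion as the oracle for each.
cases = {95: "A", 75: "B", 65: "C", 55: "D", 10: "F"}
for score, expected in cases.items():
    assert letter_grade(score) == expected, (score, expected)

# Every possible output value has now been produced at least once.
assert set(cases.values()) == {"A", "B", "C", "D", "F"}
```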
Role and kinds of white box testing. Code injection. Implementation – source, executable and sampling. White box static analysis. Code coverage methods. Statement analysis methods: statement coverage, basic block coverage.
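A rough sketch of how statement (line) coverage can be measured by injecting instrumentation at the interpreter level, here via CPython's `sys.settrace` hook; `absval` is a hypothetical unit under test. Real coverage tools work similarly but handle branches, modules, and reporting.

```python
import sys

def trace_lines(func, *args):
    """Run func(*args) and record which of its line numbers execute."""
    executed = set()
    code = func.__code__
    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is code:
            executed.add(frame.f_lineno)
        return tracer
    sys.settrace(tracer)
    try:
        func(*args)
    finally:
        sys.settrace(None)
    return executed

def absval(x):          # hypothetical unit under test
    if x < 0:
        x = -x          # executed only for negative inputs
    return x

neg = trace_lines(absval, -5)
pos = trace_lines(absval, 5)
# The negation statement is covered only by the negative-input test:
assert len(neg) > len(pos)
```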
Regression testing: purpose, method. Establishing and maintaining a regression test set. Observable artifacts: choosing, maintaining, normalizing, differencing. Version signatures. Regression test harnesses. A regression testing example: the TXL interpreter. Regression test organization, signatures and differencing for the TXL interpreter. Kinds of observations: functionality, performance, and internal diagnostic. Advantages and disadvantages of regression testing.
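A minimal sketch of normalizing and differencing observable artifacts, with hypothetical output formats: volatile details such as timestamps and memory addresses are normalized away so that only genuine behavioural differences survive the comparison with the stored (golden) output.

```python
import re

def normalize(output):
    """Normalize volatile details (timestamps, memory addresses) so
    that outputs of equivalent runs compare equal."""
    output = re.sub(r"\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}", "<TIME>", output)
    output = re.sub(r"0x[0-9a-fA-F]+", "<ADDR>", output)
    return output

def regress(expected, actual):
    """Diff normalized actual output against the stored golden output;
    return the differing line pairs (zip assumes equal line counts --
    a real harness would use a proper diff)."""
    exp = normalize(expected).splitlines()
    act = normalize(actual).splitlines()
    return [(e, a) for e, a in zip(exp, act) if e != a]

golden = "run at 2024-01-02 10:00:00\nresult = 42\nhandle 0x7f3a\n"
latest = "run at 2025-03-04 11:22:33\nresult = 41\nhandle 0x9c1b\n"
# Only the real behavioural change survives normalization:
assert regress(golden, latest) == [("result = 42", "result = 41")]
```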
Quality vs. Security. Testing (penetration testing, fuzzing) and static analysis for security. A case study on cybersecurity in connected autonomous vehicles (CAVs).
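A toy random-input fuzzing harness in the spirit of the fuzzing topic above; `parse_record` is a hypothetical unit under test with a deliberately planted robustness bug. Expected validation errors are filtered out, and any other exception is reported as a finding.

```python
import random
import string

def parse_record(text):
    """Hypothetical unit under test: parse a 'name:age' record.
    Planted bug: peeking at text[0] raises IndexError on empty input."""
    if text[0] == "#":           # skip comment lines
        return None
    name, age = text.split(":")
    return name, int(age)

def fuzz(target, runs=2000, seed=1):
    """Feed random short strings to target; rejecting bad input with
    ValueError is fine, but any other exception is a robustness defect."""
    rng = random.Random(seed)
    findings = []
    for _ in range(runs):
        s = "".join(rng.choice(string.printable)
                    for _ in range(rng.randint(0, 10)))
        try:
            target(s)
        except ValueError:
            pass                           # clean rejection: expected
        except Exception as exc:           # unexpected crash: a finding
            findings.append((s, type(exc).__name__))
    return findings

findings = fuzz(parse_record)
# Each entry in findings (e.g. the IndexError on empty input) is a
# robustness defect to investigate.
```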
Using static analysis techniques to assess software quality and detect faults. Static analysis fault detection tools: Lint, FindBugs and CodeSurfer Path Inspector. A case study of the SCRUB tool at NASA JPL.
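A Lint-style fault detector can be sketched in a few lines over Python's `ast` module; the bare `except: pass` check below is a common real lint rule, though the tools named above are far more sophisticated. The analysis inspects the source without executing it.

```python
import ast

def find_empty_excepts(source):
    """Static check: flag bare 'except: pass' handlers, which silently
    swallow every error (a common real-world fault). Returns the line
    numbers of offending handlers."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.ExceptHandler)
                and node.type is None                  # bare except
                and len(node.body) == 1
                and isinstance(node.body[0], ast.Pass)):
            findings.append(node.lineno)
    return findings

code = """\
try:
    risky()
except:
    pass
"""
assert find_empty_excepts(code) == [3]   # handler starts on line 3
```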
Reviews, walkthroughs and inspections. Inspection in the software process. Code review techniques: checklists, paraphrasing, walkthroughs, and other lightweight code review practices. A discussion on bias in code review at Google.
In this lecture I will discuss recent research conducted by Riddhi More and me on detecting and classifying flaky tests. Flaky tests are tests that may pass or fail on different runs without any change to the code.
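A minimal sketch of rerun-based flakiness detection (a generic approach, not the specific method of the research mentioned above): rerun a test on unchanged code and flag it as flaky if its verdict ever varies. The "tests" here are hypothetical stand-ins, and `flaky_test` simulates nondeterminism with a call counter so the demo is reproducible.

```python
calls = {"n": 0}

def flaky_test():
    """Fails on every third run even though the code never changes,
    simulating order-dependent flakiness deterministically."""
    calls["n"] += 1
    return calls["n"] % 3 != 0

def stable_test():
    """Always passes."""
    return True

def rerun_classify(test, runs=30):
    """Rerun the test on unchanged code; if its pass/fail verdict
    differs across runs, classify it as flaky."""
    verdicts = {test() for _ in range(runs)}
    return "flaky" if len(verdicts) > 1 else "stable"

assert rerun_classify(stable_test) == "stable"
assert rerun_classify(flaky_test) == "flaky"
```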