February 28 – March 2, 2012
The limited short-term memory of human beings causes smart programmers to make dumb mistakes. Moreover, the effects of the faults thus introduced can manifest in unexpected ways during software testing or after deployment. Locating the cause of an error from an observed symptom is a difficult, time-consuming, and largely skill-dependent task (an 'art form'). The cost of finding and fixing software defects is so high that companies routinely ship products with known defects. Given the pervasiveness of software in modern society, lowering the cost of software debugging has enormous economic and social implications.
To date, researchers have developed many computer-assisted debugging techniques to reduce the human skill and effort the task requires. These techniques include variants of program slicing, comparing the execution spectra of passing and failing tests, and systematically narrowing the difference between a passing and a failing input to pinpoint the trigger for an error (e.g., delta debugging). Despite these impressive advances, current automated techniques are nowhere close to matching the skill of an expert human. Experts leverage many sources of information as they go about the business of debugging: expected invariants in the program being debugged, knowledge of previous bugs and their resolutions, trust in the correctness of stable code, and so on. Indeed, it is quite a pleasure to observe experts 'artfully' inserting effective breakpoints in an interactive debugger.
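To make the delta-debugging idea concrete, here is a minimal sketch of a simplified input-minimization loop in the spirit of Zeller's ddmin. The `fails` predicate is a caller-supplied stand-in for "run the program and check whether the error still occurs"; the toy failure condition below is purely illustrative.

```python
def ddmin(data, fails):
    """Shrink a failing input (a list) to a smaller input that still fails.

    Simplified variant of delta debugging: repeatedly split `data` into
    chunks and try removing one chunk at a time, keeping any reduced
    input on which `fails` still holds.
    """
    n = 2  # current number of chunks
    while len(data) >= 2:
        chunk = max(len(data) // n, 1)
        subsets = [data[i:i + chunk] for i in range(0, len(data), chunk)]
        reduced = False
        for i in range(len(subsets)):
            # Complement of the i-th chunk: everything except that chunk.
            complement = [x for j, s in enumerate(subsets) if j != i for x in s]
            if fails(complement):
                data = complement      # smaller input still triggers the bug
                n = max(n - 1, 2)
                reduced = True
                break
        if not reduced:
            if n >= len(data):
                break                  # chunks are single elements; done
            n = min(n * 2, len(data))  # refine the granularity
    return data

# Toy example: the "program" fails whenever the input contains both '<' and '>'.
minimal = ddmin(list("a<b>c"), lambda s: '<' in s and '>' in s)
```

Running this reduces the five-character failing input to just the two characters that trigger the failure.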
This workshop will examine several challenging questions.
First, debugging the behavior of a program for a problematic test case requires understanding of the program’s intended behavior, which is often informal in the programmer’s mind. How can we bring the program’s intended behavior, or specification, into the debugging process? How can debugging leverage implicit or explicit contracts in a program? This also brings in the exciting prospect of combining debugging with specification-mining methods, which seek to discover a program’s intended behavior via program analysis. In the proposed workshop, we will discuss potential synergies between specifications and debugging.
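As a small illustration of how explicit contracts can aid debugging, the sketch below encodes a precondition and postcondition as assertions (the function and its contract are hypothetical examples, not from any particular tool). A violated assertion stops execution at the first broken assumption, rather than at a distant crash site.

```python
def average(values):
    # Precondition as an explicit, machine-checkable contract: without it,
    # an empty input would surface later as a ZeroDivisionError far from
    # the real mistake in the caller.
    assert len(values) > 0, "precondition violated: values must be non-empty"
    result = sum(values) / len(values)
    # Postcondition: the mean must lie between the extremes; a violation
    # here localizes the fault to this function rather than its callers.
    assert min(values) <= result <= max(values), "postcondition violated"
    return result
```

Specification-mining tools aim to discover contracts like these automatically, so that debugging can check them even when the programmer never wrote them down.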
Second, automated program-verification techniques have enjoyed significant progress in recent years. However, their role in automated debugging, and more generally the ways in which their power can complement human skill, have not been well studied. Can automated verification techniques determine the root cause of a bug? Can they validate a candidate program repair? Introducing automated verification techniques into debugging methods is another exciting topic we seek to discuss in the workshop.
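To illustrate the repair-validation question in miniature, the sketch below checks a hypothetical candidate repair against a declarative specification by exhaustive enumeration over a bounded input domain. A real verifier would establish the property symbolically for all inputs; this bounded check is only an assumption-laden stand-in for that step.

```python
def max_spec(a, b, result):
    # Declarative specification of max: the result is one of the inputs
    # and is at least as large as both.
    return result in (a, b) and result >= a and result >= b

def candidate_max(a, b):
    # A hypothetical candidate repair emitted by some repair tool.
    return a if a >= b else b

def bounded_verify(impl, bound=20):
    # Bounded verification: check the specification on every integer pair
    # in [-bound, bound]^2, returning a counterexample if one exists.
    for a in range(-bound, bound + 1):
        for b in range(-bound, bound + 1):
            if not max_spec(a, b, impl(a, b)):
                return (a, b)  # counterexample: reject the repair
    return None                # no counterexample within the bound
```

A repair that passes the check is accepted (up to the bound); a buggy one, such as `lambda a, b: a`, is rejected with a concrete counterexample, which itself is a useful debugging artifact.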
Third, a human resolving a bug rarely looks at a snapshot of the program in isolation from the repository where the program lives. He or she typically looks at the history of the program as it has evolved: e.g., which changes might have introduced the bug, or how similar bugs were resolved in the past. Can debugging leverage the rapidly emerging area of mining software repositories?
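One simple way history aids debugging is bisection over revisions, the idea behind `git bisect`. The sketch below assumes an ordered history whose oldest revision is good and newest is bad, with a single good-to-bad transition; `is_bad` stands in for "check out the revision and run the failing test".

```python
def first_bad_revision(revisions, is_bad):
    # Binary search over an ordered revision history: maintain the
    # invariant that revisions[lo] is good and revisions[hi] is bad,
    # halving the suspect range with each test run.
    lo, hi = 0, len(revisions) - 1
    while lo + 1 < hi:
        mid = (lo + hi) // 2
        if is_bad(revisions[mid]):
            hi = mid
        else:
            lo = mid
    return revisions[hi]  # the first revision where the bug appears

# Toy history of 10 revisions; the bug was introduced at revision 6.
history = list(range(10))
culprit = first_bad_revision(history, lambda rev: rev >= 6)
```

With n revisions, the culprit change is found in O(log n) test runs instead of n, which is why this history-aware tactic is a staple of expert debugging.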
These sub-topics have all seen rapid advances within their own communities, but their connection to software debugging has not been examined fully, and certainly not at a common venue. This workshop aims to fill that gap.