Physicians are sometimes accused of ignoring or even burying their mistakes. In fact, there is a long history of reviewing problems in formats such as the Morbidity and Mortality Conference. However, in seeking to find out who did what wrong, these conferences frequently become “name and blame” sessions in which mental agility and the protection of ego and reputation are sometimes more in evidence than honest efforts to identify and address problems. The commercial airline industry has devised a system in which voluntary self-reporting of “non-metal-bending” incidents provides personal learning opportunities while protecting the pilot from blame, humiliation, and other forms of punishment. Because incident data are voluminous and generally available, they are a rich source of learning through which pilots, organizations, manufacturers, and regulators can reap greater benefit from their collective experience with rare but dangerous events. This model challenges us to find ways to link assessment with outcomes analysis through individual and community processes of learning.
The Aviation Safety Reporting System, one of the central critical incident reporting systems in US aviation, was in fact designed by a physician, Charles Billings, who has pointed out that medicine is a far bigger and more complex system than aviation.11
Furthermore, critical incident data are characterized by many of the same kinds of “squishiness” that complicate the development of non‐MCQ assessment tools. They are of limited use in developing traditional statistical indicators such as accident rates, but they are immensely valuable for identifying new clusters of problems that can then be investigated in depth using more rigorous methods.
Like critical incident reporting systems, assessments introduced around patient safety themes will benefit from being clearly framed in terms of improvement rather than punishment, with plentiful feedback of the type that is usually missing from high-stakes tests. They will benefit even more from incorporating growing knowledge of the ways in which real patient safety problems remain thorny, complex, and resistant to simple solutions. Observers of “high reliability organizations” have argued that operating safely in dangerous systems depends on cultivating “requisite variety” in cognitive and social structures to match the complexity of the work environment.12,13
In the same way, future assessments for safety will have to reflect the richness and complexity of real clinical work. This will be an increasing challenge as the pace of technological change in medicine accelerates.