About speaker
John P. Thomas, Executive Director, Engineering Systems Laboratory Safety and Cybersecurity Group at MIT
About the talk
There are problems with software and software testing. We are reminded of this every time we experience accidents and losses in software-enabled systems: hacking incidents, financial losses, autonomous vehicle crashes, airplane accidents, and other losses in which software plays a role.
At MIT’s Department of Aeronautics and Astronautics, we work with a systems approach to safety. We assume that accidents and losses can be caused not only by component failures but by unsafe interactions of system components that have not failed. These ‘components’ include humans.
How do we identify potential flaws, select the most critical test cases, perform targeted testing of highly complex software, and find the important test cases that involve human interactions?
How do we engineer the role of testers and management? Testers and test leadership obviously impact safety, but how is that impact captured when we analyze safety? How do we take into account the human actions and beliefs of testers, testing managers, and the operators of the systems we are building? How do we account for the safety-critical decisions they make and ensure they are receiving adequate feedback to make the correct decisions? This keynote will introduce the Systems-Theoretic Process Analysis (STPA) methodology and show how these factors can be addressed in modern testing.
Alright, thank you. 00:03 I'm just going to say a few words before I go through the slides. 00:09 I'm really fascinated by engineering mistakes; that's what has driven my career so far at MIT. 00:13 That's my primary research area. 00:18 We've been developing techniques to help engineers recognize and prevent their mistakes. 00:21 We also need techniques to help testers figure out what to test, of course. 00:27 Engineering mistakes are right at the heart of what we do, right? 00:32
A lot of my work happens to be in safety-critical and security-critical systems, where these mistakes are the most costly. 00:36 One of the first things people ask after an accident is: how in the world did that get through testing? 00:41 Why didn't they test that? 00:51 You see it in one accident, and then in a second accident, and the same question comes up in the third and the fourth, and in hundreds of accidents, and you start to think. 00:53
It wasn't just one stupid tester here; we've got a broader issue. 01:03 You start to see a pattern. 01:11 It's often that we have smart people trying to do their job, but they're handicapped with methods that don't target the kind of problem we have, without the proper methods to find that critical thing to test before an accident. 01:13
And from a leadership perspective, the procedures, methods, resources, and techniques that we didn't put in place did not set them up for success. 01:27 Not that we were malicious; everyone is trying to do their job, but it's what we don't know that gets us, both for testers and engineers and for leaders. 01:36 So we really need techniques that can help us do a better job managing complexity. 01:47
I'm including the complexity of the technical system; that's a big problem. 01:52 But what about the social side? 01:56 Sociotechnical systems: we need techniques to analyze those too. I've seen a lot of test hazard analyses and engineering safety analyses. 01:58