Marta Kwiatkowska - Research Overview
Computing infrastructure has become indispensable in our society, with examples ranging from online banking to intelligent vehicles and electronic medical equipment. Software faults in such systems can have disastrous consequences. My research is concerned with developing modelling and automated verification techniques that can guarantee the stable, safe, secure, timely, reliable and resource-efficient operation of computing systems. Model checking is a verification technique that can establish whether certain properties, usually expressed in temporal logic, hold for a system model. The models describe how systems move between states by executing actions, and are typically represented as state-transition graphs. Model checking explores all system executions and, unlike testing, amounts to a mathematical proof. This task is inherently challenging in view of potentially infinite state spaces and the undecidability of the underlying problem. In conventional model checking, the properties of interest are qualitative, such as safety (e.g. "the car will never stall while driving autonomously" for intelligent vehicles).
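A hedged illustration, with atomic propositions invented purely for exposition: in temporal logic the safety requirement above can be written as

    G !(autonomous & stalled)

i.e. along every execution, the car is never simultaneously in autonomous driving mode and stalled.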
The distinctive aspect of my work is its focus on probabilistic and quantitative verification techniques, with a recent shift towards synthesis from quantitative specifications. I led the development of the internationally leading probabilistic symbolic model checker PRISM, which has been applied to several real-world case studies, with unusual behaviour discovered in some of these. PRISM systematically explores a model of a system against properties such as "what is the worst-case time to deliver an update from the sensor monitoring the distance to the car ahead?" (for an intelligent vehicle) and "what is the expected power consumption during the first hour of operation?" (in a body-sensor network). More information is available from the PRISM website, the PRISM tool paper, and this book.
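As a minimal sketch of how such quantitative queries look in practice (the model, label and reward structure below are illustrative inventions, not drawn from an actual case study), consider a sensor that retries sending an update, with each attempt failing with probability 0.1:

    dtmc

    const int MAX_TRIES = 3;

    module sensor
      tries : [0..MAX_TRIES] init 0;   // attempts made so far
      sent  : bool init false;         // has the update been delivered?

      // each attempt succeeds with probability 0.9 and fails with probability 0.1
      [send] !sent & tries < MAX_TRIES -> 0.9 : (sent'=true) + 0.1 : (tries'=tries+1);
      // terminal states (delivered, or out of attempts) simply loop
      []     sent | tries = MAX_TRIES  -> true;
    endmodule

    label "delivered" = sent;

    rewards "power"
      [send] true : 5;   // each transmission attempt costs 5 units of energy
    endrewards

Against this model, PRISM evaluates queries such as P=? [ F "delivered" ] (the probability that the update is eventually delivered) and R{"power"}=? [ F ("delivered" | tries=MAX_TRIES) ] (the expected energy consumed before the protocol terminates).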
Probability plays a very important role in the modelling and functioning of complex systems: it is used as a symmetry breaker in distributed coordination algorithms, and for analysing performance and Quality of Service properties. Stochastic process calculi and the corresponding verification and analysis methods are being used to describe and reason about wireless networks, security protocols and biological organisms. PRISM has proved useful in an exceptionally wide range of fields, including distributed and cloud computing, wireless networks, security, robotics, quantum computing, game theory, biology and nanotechnology.
Research themes
My research can be roughly grouped into the following themes:
- Safety and trust for mobile autonomous robots
As autonomous systems become widely adopted by the general public, potentially increasing the risk to society, we must ensure that the mobile robots we design are safe (e.g. the software does not crash while performing a parking manoeuvre) and their decisions trustworthy (e.g. the rescue robot is able to precisely identify the location of a victim), without impacting time-to-market deadlines. This theme is centred on developing methods that provide strong safety guarantees for mobile autonomous robots, for example self-driving cars and care robots. Current research is addressing safety verification for neural networks (see this paper) and formalisation and model checking for social trust (see this paper).
This work is being carried out as part of the EPSRC Programme Grant on Mobile Autonomy, see here.
See my CAV 2017 keynote [YouTube] entitled "Safety verification for neural networks" (slides in PDF are here) and this lecture at the Hay Festival entitled "When to trust a robot...".
To raise awareness of the issue of trust, I co-organised workshops on Social Trust in Autonomous Robots and Morality and Social Trust in Autonomous Robots. The next edition of the workshop will be held at FLoC in Oxford, see here.
- Quantitative probabilistic model checking and synthesis from quantitative specifications
This theme concentrates on formulating novel and powerful models, logics, algorithms, data structures and computational techniques to improve the efficiency of quantitative verification and synthesis, and to further develop the probabilistic model checker PRISM and its extension PRISM-games. More detail can be found in the invited lecture on Quantitative Verification: Models, Techniques and Tools [Slides, PDF file] given at the ESEC/FSE 2007 conference, two tutorial-style book chapters on model checking for discrete- and continuous-time Markov chains and Markov decision processes, as well as this paper for stochastic games.
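A hedged sketch of the kind of strategy-synthesis query supported by the PRISM-games extension, in its rPATL-style property syntax (the player, label and reward names are invented for illustration):

    // does the controller have a strategy that reaches the goal with
    // probability at least 0.95, whatever the environment player does?
    <<controller>> P>=0.95 [ F "goal" ]

    // synthesise a controller strategy minimising the expected time to the goal
    <<controller>> R{"time"}min=? [ F "goal" ]

Checking such a formula not only answers the query but, when the answer is positive, also yields a witnessing strategy, which is the sense in which verification here shades into synthesis.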
The 2011 Marktoberdorf Summer School Tutorial on Advances in Probabilistic Model Checking summarises the progress made in this field and future challenges.
For recent work, see the invited lecture on Model checking and strategy synthesis for mobile autonomy: from theory to practice [Slides and video] given at the 2016 Uncertainty in Computation workshop at the Simons Institute, the ATVA 2014 paper that employs machine learning algorithms to improve performance of state space exploration, the CACM 2012 paper on quantitative runtime verification, as well as this paper on synthesis via quotient.
This research was supported by the ERC Advanced Grant VERIWARE: From Software Verification to 'Everyware' Verification (see the press release here) and the recently completed Predictable Software Systems project, part of the LSCITS collaboration, described in this CACM paper.
The International Conference on Quantitative Evaluation of SysTems (QEST) is the main conference in the area.
- Software quality assurance for medical devices
This theme concerns sensor-based wearable or implantable devices such as cardiac pacemakers, and has concentrated on developing model-based techniques for automated verification and synthesis of cardiac pacemaker software, mainly employing MATLAB Stateflow/Simulink and techniques that combine Monte Carlo simulation with constraint solving. Recent work includes parameter synthesis for pacemaker software (in this EMSOFT 2014 paper), quantitative verification of energy usage (see this paper), invariant verification of non-linear hybrid automata networks (in this CAV 2014 paper) and the development and in silico analysis of rate-adaptive pacemaker models (see this paper).
This work is now concerned with developing personalisation technologies for affective disorders and is supported by the AFFECTech project. Previously it was carried out as part of the ERC Proof of Concept Grant VERIPACE and ERC Advanced Grant VERIWARE.
You can find out more about my work on pacemaker verification, as well as, more generally, the application of verification to ubiquitous computing, from the 2012 Milner Lecture entitled "Sensing everywhere: on quantitative verification for ubiquitous computing" at this link [YouTube] (use this link for slides in PDF) and this invited paper.
- Modelling and verification of biological and engineered systems
This theme aims to apply techniques developed in computer science, such as process calculi and model checking, to model not only biological systems, but also synthetic, engineered devices built from molecules, for example gates made from DNA and molecular walkers. When applied to biological systems, the models thus derived can be used to obtain new quantitative predictions in silico, which can guide the design and selection of in vitro experiments (see this paper and, for general motivation, also this paper in the Journal of Cell Science). More recently, we have applied probabilistic model checking to DNA computation, see this paper (joint work with Luca Cardelli and Andrew Phillips at Microsoft), where we were able to automatically detect a flaw in a DNA transducer design, and investigated the computational potential of molecular walkers, see this paper (joint work with Andrew Turberfield, Biophysics). We have also improved the performance and scalability of PRISM (see this paper on fast adaptive uniformisation) and analysed the reliability of computing devices implemented using such molecular walkers.
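A minimal sketch of how such a molecular system is typically encoded for PRISM, namely as a continuous-time Markov chain with mass-action rates (species names, counts and rate constants are invented for illustration):

    ctmc

    const int N = 100;          // initial number of input strands
    const double k = 0.0003;    // per-strand conversion rate

    module dna_reaction
      input   : [0..N] init N;
      product : [0..N] init 0;

      // a single irreversible step: input -> product, at rate k*input
      [] input > 0 -> k*input : (input'=input-1) & (product'=product+1);
    endmodule

Time-bounded queries such as P=? [ F<=T product=N ], for a constant time bound T, then give the probability that all strands have been converted within T time units; techniques such as fast adaptive uniformisation target exactly this kind of transient analysis.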
This work is being carried out as part of the ERC Advanced Grant VERIWARE, a Microsoft studentship and the recently completed project Autonomous Ubiquitous Sensing, funded by the Oxford Martin School.
The 2013 Marktoberdorf Summer School Tutorial on Probabilistic Model Checking for Biology is an introduction to this topic.
Synthetic molecular devices can greatly benefit from computer-aided design and verification tools. To encourage verification researchers to contribute, I co-organised the 2014 Workshop on Verification of Engineered Molecular Devices and Programs (VEMDP). This year's edition will be held at FLoC in Oxford, see here.
- Modelling and verification for ubiquitous computing
This theme is centred on applications of probabilistic model checking, mainly using the PRISM toolset, to model and analyse ubiquitous computing systems. Earlier work involved analysing the performance of wireless protocols, security protocols and embedded systems, for example Bluetooth and IEEE 802.15.4 ZigBee (see this paper). Current research is addressing quantitative software verification for collaborative/competitive stochastic protocols (see the PRISM-games extension and this FMSD 2013 paper), strategy synthesis for autonomous driving from multi-objective specifications (see this paper) and compositional controller synthesis (see the CONCUR 2014 paper).
This work is being carried out as part of the ERC Advanced Grant VERIWARE and the recently completed projects Autonomous Ubiquitous Sensing, funded by the Oxford Martin School, and the EU Framework 7 project CONNECT-IP: Emergent Connectors for Eternal Software Intensive Networked Systems.
See my 2012 Milner Lecture entitled "Sensing everywhere: on quantitative verification for ubiquitous computing" at this link [YouTube] (slides in PDF are here) and this overview paper.
To broaden this field, I co-organised this Dagstuhl Seminar on Model-driven Algorithms and Architectures for Self-Aware Computing Systems.