FUN2MODEL - News

Research Opportunities: We are seeking exceptional researchers interested in formulating theories, models and algorithms for probabilistic verification and synthesis to enable robust AI. If this sounds like you, and you hold a PhD in computer science, mathematics or a related discipline, together with post-qualification research experience, please contact Marta (marta.kwiatkowska@cs.ox.ac.uk) to learn more about opportunities on the FUN2MODEL project.
28-29 October 2024: Marta Kwiatkowska is co-organiser of a Royal Society Discussion Meeting entitled “Beyond the symbols vs signals debate” at the Royal Society in London. Follow this link for full details.
15 October 2024: Marta Kwiatkowska gives a mini-course on “Probabilistic verification for neural networks” at Probability in Computer Science (PICS) in Copenhagen, Denmark. Follow this link for more details.
11 September 2024: Marta Kwiatkowska gives a keynote on “Adversarial robustness certification for neural networks: progress and challenges” at the Formal Methods 2024 conference, Milan, Italy. Follow this link for more details.
August 2024: We are hiring two Research Associates on the FUN2MODEL project! For full details click here. The posts are full-time, fixed-term and start as soon as possible. Applications close on 12th September 2024.
16 July 2024: Marta Kwiatkowska gives a keynote on “Adversarial robustness certification for neural networks: progress and challenges” at the Games and Equilibria in System Design and Analysis workshop, Simons Institute, Berkeley, USA. Follow this link for more details.
May 2024: We are hiring two Research Associates on the FUN2MODEL project! For full details click here. The posts are full-time, fixed-term and start as soon as possible. Applications close on 6th June 2024.
May 2024: A postdoc position is available now at Oxford on the FAIR project. Applications close on 22nd May 2024. Follow this link for more details.
May 2024: Marta Kwiatkowska was interviewed about AI (in Polish) by Maciej Kawecki. Follow this link to access the interview.
11 April 2024: Marta Kwiatkowska, Gethin Norman and Dave Parker have won the 2024 ETAPS Test-of-Time Tool Award for PRISM. For more details, see here.

22 February 2024: Marta Kwiatkowska gives a keynote on “Strategy synthesis for stochastic games with neural perception mechanisms” at the CSL conference, Naples, Italy. Follow this link for more details.
15 February 2024: Marta Kwiatkowska gives an online presentation at the NIST AI Metrology Colloquium Series entitled “Safety and robustness for deep learning with provable guarantees”. Follow this link for more details.
2 February 2024: Marta Kwiatkowska gives a keynote on “When to Trust AI…” at the first Danish Digitalization, Data Science and AI conference (D3A), Nyborg, Denmark. Follow this link for more details.
November 2023: PRISM-games 3.2 is now out, including symbolic model checking of turn-based stochastic games, correlated/fair equilibria and more. Further information here.
October 2023: A postdoc position is available now at Oxford on the FAIR project. Applications close on 28th November 2023. Follow this link for more details.
October 2023: We are hiring two Research Associates on the FUN2MODEL project! For full details click here. The posts are full-time, fixed-term and start as soon as possible. Applications close on 6th November 2023.
September 2023: Dave Parker is giving a keynote talk at QEST / CONFEST 2023 on "Multi-Agent Verification and Control with Probabilistic Model Checking". There is an accompanying paper here.
July 2023: PRISM version 4.8 is now available, including support for robust verification of uncertain models (interval Markov decision processes and Markov chains), improved strategy/policy generation and much more. More information here.
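To illustrate the idea behind robust verification of interval models, here is a minimal Python sketch of robust value iteration for a small, hypothetical interval MDP. This is only an illustration of the general technique, not PRISM's implementation or modelling syntax: an adversary resolves each probability interval in the worst case, and we compute the reachability probability that can still be guaranteed.

```python
# Illustrative sketch only (not PRISM's code): robust value iteration for a
# hypothetical interval MDP, computing the maximal probability of reaching a
# target that is guaranteed against worst-case resolution of the intervals.

def worst_case_expectation(intervals, values):
    """Minimise sum_s p(s)*values[s] over distributions p with
    lo[s] <= p(s) <= hi[s] and sum_s p(s) = 1 (feasibility assumed)."""
    lo = {s: l for s, (l, _) in intervals.items()}
    hi = {s: h for s, (_, h) in intervals.items()}
    p = dict(lo)                         # start from the lower bounds...
    slack = 1.0 - sum(lo.values())
    for s in sorted(intervals, key=lambda s: values[s]):
        add = min(hi[s] - lo[s], slack)  # ...and push the remaining mass onto
        p[s] += add                      # the cheapest successors first
        slack -= add
    return sum(p[s] * values[s] for s in intervals)

def robust_value_iteration(imdp, target, eps=1e-8):
    """imdp: state -> action -> {successor: (lo, hi)}; returns max-min Pr(F target)."""
    v = {s: (1.0 if s in target else 0.0) for s in imdp}
    while True:
        v_new = {
            s: 1.0 if s in target else
               (max(worst_case_expectation(d, v) for d in imdp[s].values())
                if imdp[s] else 0.0)
            for s in imdp
        }
        if max(abs(v_new[s] - v[s]) for s in imdp) < eps:
            return v_new
        v = v_new

# Toy example: from s0, the single action reaches the goal with probability
# somewhere in [0.4, 0.7] and fails otherwise.
imdp = {
    "s0":   {"a": {"goal": (0.4, 0.7), "fail": (0.3, 0.6)}},
    "goal": {},
    "fail": {},
}
print(robust_value_iteration(imdp, target={"goal"}))  # s0 -> 0.4 in the worst case
```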
April 2023: Marta Kwiatkowska has been elected as a member of the American Academy of Arts and Sciences. Founded in 1780, the Academy honours excellence and convenes leaders from every field of human endeavour to examine new ideas, address issues of importance to the United States and the world, and work together, as expressed in its charter, ‘to cultivate every art and science which may tend to advance the interest, honour, dignity, and happiness of a free, independent, and virtuous people.’ Current members represent today’s innovative thinkers in every field and profession, including more than two hundred and fifty Nobel and Pulitzer Prize winners. Marta has been invited to a formal induction in September, to be held in Cambridge, MA, where the Academy’s headquarters are located. More information and a list of current Academy members are available here: American Academy of Arts and Sciences 2023. Congratulations Marta!
November 2022 - Award: Marta Kwiatkowska receives the 2021 Van Wijngaarden Award from CWI in Amsterdam, at a ceremony postponed from last year due to Covid. Information with pictures is at: Van Wijngaarden Award 2021, and a video of the Soiree van Wijngaarden awards is here: Soiree van Wijngaarden. Congratulations Marta!
October 2022 - Papers: Two papers accepted at NeurIPS-2022!
  • In this paper we study learning models where the learner is given more power through the use of local queries, and give the first distribution-free algorithms that perform robust empirical risk minimization (ERM) for this notion of robustness.
  • In this paper, we propose a robust anytime learning approach for Markov decision processes based on a Bayesian approach to learning probability intervals.
September 2022 - Event: Marta Kwiatkowska gives Basser seminar at the University of Sydney speaking on Safety and Robustness for Deep Learning with Provable Guarantees.
September 2022 - Event: Marta Kwiatkowska gives an invited lecture at the QEST 2022 conference speaking on Robustness Guarantees for Bayesian Neural Networks. See here for the invited paper.
August 2022 - Award: The paper 'Tractable Uncertainty for Structure Learning', co-authored by Benjie Wang, Matthew Wicker and Marta Kwiatkowska as part of FUN2MODEL, wins a best paper award at the 5th Workshop on Tractable Probabilistic Modeling @ UAI 2022. The paper presents a tractable representation of distributions over orderings for the purposes of causal discovery. Congratulations! Here is the link to the award announcement.
August 2022 - Event: Marta Kwiatkowska gives an invited lecture at the MFCS 2022 conference speaking on Probabilistic Model Checking for Strategic Equilibria-Based Decision Making: Advances and Challenges. See here for the invited paper.
July 2022 - Event: Marta Kwiatkowska gives a keynote lecture at the Highlights 2022 conference speaking on Probabilistic Model Checking for Strategic Equilibria-Based Decision Making: Advances and Challenges.
May 2022 - Papers: Two papers accepted at ICML'22!
  • In this paper we present Tractable Uncertainty for STructure learning (TRUST), a framework for approximate posterior inference that relies on probabilistic circuits as the representation of our posterior belief.
  • In this paper, we analyze the learning dynamics of temporal difference algorithms to gain novel insight into the tension between these two objectives.
May 2022 - Paper: Paper accepted at UAI'22: we present novel techniques for neuro-symbolic concurrent stochastic games, a recently proposed modelling formalism to represent a set of probabilistic agents operating in a continuous-space environment using a combination of neural network based perception mechanisms and traditional symbolic methods.
April 2022 - Papers: Three papers accepted at IJCAI 2022!
  • Here we consider the problem of certifying the individual fairness (IF) of feed-forward neural networks (NNs). In particular, we work with the ε-δ-IF formulation.
  • In this paper we address the fundamental problem in adversarial machine learning of quantifying how much training data is needed in the presence of evasion attacks, within the framework of PAC learning, focusing on the class of decision lists.
  • In this paper we develop a method to quantify the robustness of decision functions with respect to credal Bayesian networks, formal parametric models of the environment where uncertainty is expressed through credal sets on the parameters.
February 2022 - Award: The paper, 'Sampling-based robust control of autonomous systems with non-Gaussian noise', co-authored by Dave Parker as part of FUN2MODEL, wins a distinguished paper award at AAAI-22. The paper builds on new interval MDP model checking in PRISM. Congratulations!
February 2022 - Event: Marta gives an invited lecture at AAAI-22 speaking on Safety and Robustness for Deep Learning with Provable Guarantees.
February 2022 - Event: Marta gives a tutorial at the Logic of Probabilistic Programming Conference.
Jan 2022 - Paper: Paper accepted at TACAS'22, showing how we implement algorithms for both normal form games and the more complex case of multi-player concurrent stochastic games with temporal logic specifications.
December 2021 - Event: Marta gives an invited talk at the Interpretability, safety, and security in AI event, held at the Alan Turing Institute.
November 2021 - News: The Global Partnership on Artificial Intelligence (GPAI) - Responsible AI Working Group, of which Marta is a member, presents two interim progress reports at the 2nd GPAI Summit held in Paris. The first report is titled 'Climate Change and AI: Recommendations for Government Action' and proposes an actionable roadmap on how AI can be responsibly developed, used and governed to take on the fight against climate change. The second report, titled 'Responsible AI for Social Media Governance', focuses on developing understanding of social media users' relationships with harmful online content.
November 2021 - Event: Marta participates in a panel discussion at COP26 titled ‘AI for Climate Action'. [Video courtesy of the 'German Pavilion@COP26'.]
November 2021 - Event: Marta gives keynote lecture at Machine Learning in Poland 2021 titled "Safety and Robustness for Deep Learning with Provable Guarantees" and participates in a panel discussion titled 'Women in Machine Learning in Poland'.
November 2021 - News: As a member of the Global Partnership on Artificial Intelligence (GPAI) - Responsible AI Working Group, Marta presents a preliminary report at the UN Climate Change Conference COP26 titled ‘Climate Change and AI: Recommendations for Government Action'. See here for more information.
October 2021 - Welcome: We are pleased to welcome researcher, Matthew Wicker, to the FUN2MODEL project. Matthew will focus on probabilistic verification and synthesis for Bayesian neural networks.
October 2021 - News: Marta is named a Turing Fellow by the Alan Turing Institute - the UK's national institute for data science and AI.
September 2021 - Event: Marta gives keynote lecture at ECML PKDD 2021 titled "Safety and Robustness for Deep Learning with Provable Guarantees".
September 2021 - Award: Marta has been awarded the Van Wijngaarden Award 2021 for Computer Science in recognition of her numerous and highly significant contributions to preventing software faults. The five-yearly award also recognised mathematician, Susan Murphy, for her work in improving decision making in health. The Van Wijngaarden Award was established by CWI, and is named after former CWI director, Aad van Wijngaarden. Congratulations Marta!
September 2021 - Welcome: We are delighted that Senior Research Fellow, Gethin Norman, has joined the FUN2MODEL project. Gethin specialises in the modelling formalisms, theories, logics and algorithms that underpin the PRISM model checker.
August 2021 - Event: Dave Parker gives an invited tutorial at EXPRESS/SOS 2021 on "Probabilistic Verification of Concurrent Autonomous Systems".
July 2021 - Event: Marta lectures at ACDL 2021.
May 2021 - Paper: accepted at UAI 2021! Here we propose a framework to provide safety certification for given control policies, and synthesize control policies that improve the certification bounds.
May 2021 - Event: Marta was awarded the prestigious 2019 BCS Lovelace Medal for her research in probabilistic and quantitative verification, and delivered the 2020 Lovelace Lecture, “Probabilistic model checking for the data-rich world”. A recording can be found here.

The BCS Lovelace Lecture was sponsored by The Ada Lovelace Institute

May 2021 - Event: Marta gives a keynote speech at SAC 2021 titled "Advances and Challenges in Quantitative Verification for Adaptive Systems".
May 2021 - Event: Marta gives a keynote speech at FSEN 2021 titled "Safety and Robustness for Deep Learning with Provable Guarantees".
April 2021 - Papers: three papers accepted at IJCAI 2021!
  • Here we build on abduction-based explanations for machine learning and develop a method for computing local explanations for neural network models in natural language processing (NLP).

  • Here we demonstrate provable guarantees on the robustness of decision rules, paving the way towards provably causally robust decision-making systems.

  • Here we introduce the first method for verifying the time-unbounded safety of neural networks controlling dynamical systems.

April 2021 - Welcome: We warmly welcome DPhil student Emanuele La Malfa to the FUN2MODEL project. Emanuele's work will focus on robustness and explainability for natural language processing (NLP).
March 2021 - Software Release: PRISM 4.7 is now available, including support for POMDPs, improved accuracy reporting and more.
March 2021 - Welcome: We warmly welcome researcher, Rui Yan, to the FUN2MODEL project. Rui's work will focus on probabilistic verification and synthesis, including Bayesian and psychological games.
December 2020 - News: Marta becomes a Fellow of the European Laboratory for Learning and Intelligent Systems (ELLIS). ELLIS Fellows advance science, provide strategic advice and leadership, and act as ambassadors of ELLIS.
November 2020 - Event: Dave Parker gives a keynote talk at iFM 2020 titled "Verification with Stochastic Games: Advances and Challenges".
October 2020 - Event: Marta is a panelist at the Royal Society's Briefing for Making Europe a Leader in AI: in conversation with Venki Ramakrishnan, Antoine Petit and Martin Stratmann.
October 2020 - Welcome: We are delighted that graduate student, Elias Benussi, has joined the FUN2MODEL project. Elias will focus on fairness in AI.
October 2020 - Paper: accepted at FORMATS 2020! Here we propose MOSAIC, an algorithm for measuring the safety of deep reinforcement learning controllers in stochastic settings.
October 2020 - Paper: accepted at EMNLP 2020! Here we focus on robustness of text classification against word substitutions.
September 2020 - Event: Marta gives a webinar on 'Safety and robustness for deep learning with provable guarantees' in the ICE-TCS Reykjavik University series.
September 2020 - Event: Marta gives a keynote speech at ASE 2020 on 'Safety and robustness for deep learning with provable guarantees'.
September 2020 - Event: Marta gives a plenary talk at DNA26 on 'Probabilistic verification and synthesis for reliable molecular circuit designs'.
September 2020 - Event: Marta gives a keynote speech at KR 2020 on 'Probabilistic model checking for strategic equilibria-based decision making'.
August 2020 - Welcome: We are delighted to have researcher, Gabriel Santos, join the FUN2MODEL project. Gabriel will focus on strategic reasoning and game-theoretic techniques in AI.
August 2020 - Paper: accepted at QEST 2020! Here we propose multi-coalitional verification techniques for concurrent stochastic games.
July 2020 - Event: Marta gives a webinar on "When to trust a self-driving car" at The National Academy of Sciences, India (NASI) - Delhi Chapter, attended by over 440 participants. Please see here to watch a recording.
June 2020 - Event: Marta has been invited to sit on the Global Partnership on Artificial Intelligence (GPAI) Working Group on Responsible AI, nominated by the European Commission.

GPAI is an international and multi-stakeholder initiative to guide the responsible development and use of artificial intelligence consistent with human rights, fundamental freedoms, and shared democratic values, as reflected in the OECD Recommendation on AI.

Please see here for more details.

May 2020 - Paper: accepted at UAI 2020! Here we show how to compute worst-case adversarial guarantees for Bayesian Neural Networks (BNNs).
May 2020 - Paper: by Clare Lyle accepted at ICML 2020! Here we consider the problem of learning abstractions that generalize in block MDPs, families of environments with a shared latent state space, and dynamics structure over that latent space, but varying observations.
April 2020 - Welcome: We are delighted to welcome researcher, Andrea Patane, to the project. Andrea brings with him expertise in safety verification of Bayesian models and the role played by uncertainty in adversarial prediction settings. He will focus on data-driven perception modelling and cognitive reasoning, for integration within agent-based models.
April 2020 - Paper: accepted at CAV 2020!

Here we present a major new release of the PRISM-games model checker, featuring multiple significant advances in its support for verification and strategy synthesis of stochastic games.

Download the tool and case studies here.

March 2020 - Event: Marta becomes a member of the Royal Society’s Working Group on Digital Technology and the Planet which aims to help the UK play a leading role in data-enabled innovation and the adoption of digital technologies to tackle climate change.
February 2020 - Paper: by Min Wu accepted at CVPR 2020! Here we consider the robustness of deep neural networks on videos.
January 2020 - Paper: accepted at AISTATS 2020! Here we show how to compute worst-case adversarial guarantees for classification with Gaussian processes.
January 2020 - Event: Marta gives a keynote speech at ERTS 2020 titled 'Safety verification for deep neural networks with provable guarantees'.
January 2020 - Event: Marta is an invited speaker at VMCAI 2020.
January 2020 - Software Release: PRISM-games 3.0 is now available, providing concurrent stochastic games, equilibria, real-time models and many new examples. More information here.
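As a rough illustration of the kind of computation involved in verifying concurrent stochastic games (a sketch of the underlying idea only, not PRISM-games' algorithm or API; numpy and scipy are assumed to be available), the core step of zero-sum value iteration is solving a one-shot matrix game at each state, which can be done with a small linear program:

```python
# Illustrative sketch only: value and optimal mixed strategy of a one-shot
# zero-sum matrix game via linear programming -- the per-state subproblem in
# value iteration for zero-sum concurrent stochastic games.
import numpy as np
from scipy.optimize import linprog

def matrix_game_value(M):
    """Value of the zero-sum game with payoff matrix M (row player maximises)."""
    M = np.asarray(M, dtype=float)
    n_rows, n_cols = M.shape
    # Variables: x_1..x_n (row player's mixed strategy) and v (the game value).
    # Minimise -v subject to: v - x^T M[:, j] <= 0 for every column j,
    #                         sum(x) = 1, x >= 0, v unbounded.
    c = np.concatenate([np.zeros(n_rows), [-1.0]])
    A_ub = np.hstack([-M.T, np.ones((n_cols, 1))])
    b_ub = np.zeros(n_cols)
    A_eq = np.concatenate([np.ones(n_rows), [0.0]]).reshape(1, -1)
    bounds = [(0, None)] * n_rows + [(None, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=bounds)
    return res.x[-1], res.x[:-1]

# Matching pennies (payoffs = probability the row player wins):
# the value is 0.5, achieved by the uniform mix over both actions.
value, strategy = matrix_game_value([[1.0, 0.0], [0.0, 1.0]])
print(value, strategy)
```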
October 2019 - Welcome: We warmly welcome graduate student, Emanuele La Malfa, who joins the project as an Associate Member and will formally join in April 2021. Emanuele is working on robustness and explainability of AI, with specific focus on natural language processing (NLP) models.
October 2019 - Welcome: We are delighted that graduate student, Benjie Wang, has joined the FUN2MODEL project. Benjie is particularly interested in causal modelling as a means to enhance the robustness and explainability of deep learning.
October 2019 - Welcome: We are excited to appoint postdoctoral researcher, Luca Laurenti, to the project. Luca will work on developing probabilistic verification and synthesis methods for deep learning, with a particular focus on Bayesian neural networks.
1st October 2019 - The Fun Begins! FUN2MODEL kicks off and work begins as we aim to make major advances in the quest towards provably robust and beneficial AI.
March 2019 - Announcement: We are delighted to announce that Professor Marta Kwiatkowska has been awarded a highly competitive European Research Council Advanced Investigators Grant for a new five-year project FUN2MODEL.

Our objectives are to develop novel probabilistic verification and synthesis techniques to guarantee safety, robustness and fairness for complex decisions based on machine learning, formulate a comprehensive, compositional game-based modelling framework for reasoning about systems of autonomous agents and their interactions, and evaluate the techniques on a variety of case studies.

This is the second ERC Advanced Grant awarded to Marta Kwiatkowska: from 2010 until 2016 she held the grant VERIWARE. Please see the press release here and here for more details.