The AISEC project
AI Secure and Explainable by Construction
Motivation
AI applications have become pervasive: from mobile phones and home appliances to stock markets, autonomous cars, robots, and drones. Each application domain comes with a rich set of requirements such as legal policies, safety and security standards, company values, or simply public perception.
As AI takes over a wider range of tasks, we gradually approach the time when security laws, or policies, ultimately akin to Isaac Asimov's "Three Laws of Robotics", will need to be established for all working AI systems. A near-homophone of Asimov's first name, the AISEC project aims to build a sustainable, general-purpose, multi-domain methodology and development environment for policy-to-property, secure- and explainable-by-construction development of complex AI systems.
Methodology
This project will employ types, supported by lightweight verification methods such as SMT solvers, to create and deploy a novel framework for documenting, implementing, and developing policies for complex deep learning systems. Types will serve as a unifying mechanism for embedding security and safety contracts directly into the programs that implement AI. The project will produce an integrated development environment with infrastructure catering for different domain experts: from lawyers and security experts to verification experts and the system engineers designing complex AI systems. It will be built, tested, and used in collaboration with industrial partners in two key AI application areas: autonomous vehicles and natural language interfaces (chatbots).
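To give a flavour of the contract-by-construction idea, here is a minimal illustrative sketch in Haskell. It is not the project's framework, and all names in it are hypothetical: a safety policy such as "steering commands must stay within 30 degrees of centre" is encoded in a type whose only public constructor enforces the bound, so any downstream code holding a value of that type gets the guarantee by construction.

module SafeSteering
  ( SteeringAngle        -- abstract type: constructor is not exported
  , mkSteeringAngle
  , angleDegrees
  , fromNetworkOutput
  ) where

-- The contract: a SteeringAngle always lies within [-30, 30] degrees.
-- Because the constructor is hidden, the check cannot be bypassed.
newtype SteeringAngle = SteeringAngle { angleDegrees :: Double }
  deriving (Show)

-- Smart constructor: the only public way to build a SteeringAngle.
mkSteeringAngle :: Double -> Maybe SteeringAngle
mkSteeringAngle d
  | abs d <= 30.0 = Just (SteeringAngle d)
  | otherwise     = Nothing

-- Hypothetical use: clamp a raw neural-network output so that it
-- satisfies the contract before it reaches any actuator code.
fromNetworkOutput :: Double -> SteeringAngle
fromNetworkOutput raw = SteeringAngle (max (-30.0) (min 30.0 raw))

In the project itself, richer dependent and refinement types combined with SMT-backed checking would allow such bounds to be stated and verified statically, rather than enforced at run time as in this sketch.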
Participating universities
Heriot-Watt University
Heriot-Watt's Lab for AI and Verification is leading the project, alongside the NLP lab
University of Edinburgh
The University of Edinburgh's Security and Privacy research group is co-hosting the project
University of Strathclyde
Strathclyde’s Mathematically Structured Programming group is co-hosting the project
Academic partners
University of St Andrews
Dr Edwin Brady is a project partner
The Hebrew University of Jerusalem
Dr Guy Katz is a project partner
Boston University
Dr Marco Gaboardi is a project partner
Leiden University
Dr Henning Basold is a project partner
Secondment partners
IT University of Copenhagen
Natalia Slusarz -- internship
Hebrew University of Jerusalem
Omri Isac -- two internships
Schlumberger Limited
Ben Coke -- internship
Industrial partners
Funding
Funded by UKRI EPSRC
Engineering and Physical Sciences Research Council
This is an ambitious proposal that can make a good contribution to the verification of AI systems.
EPSRC Peer Review
I want to see this proposal funded and to see what the investigators can achieve against their objectives. The project isn't the final answer in work towards secure AI systems, but it looks like a very important piece of the jigsaw puzzle.
EPSRC Peer Review
Design
The AISEC character and logo were designed by Anna Komendantskaya as a reinterpretation of the covers of Asimov's books from the 1960s and 1970s.