Architecture-Preserving Provable Repair of Deep Neural Networks

Tao, Zhe and Nawas, Stephanie and Mitchell, Jacqueline and Thakur, Aditya V.
Proc. ACM Program. Lang. (PLDI), 2023

Deep neural networks (DNNs) are becoming increasingly important components of software, and are considered the state-of-the-art solution for a number of problems, such as image recognition. However, DNNs are far from infallible, and incorrect behavior of DNNs can have disastrous real-world consequences. This paper addresses the problem of architecture-preserving V-polytope provable repair of DNNs. A V-polytope defines a convex bounded polytope using its vertex representation. V-polytope provable repair guarantees that the repaired DNN satisfies the given specification on the infinite set of points in the given V-polytope. An architecture-preserving repair only modifies the parameters of the DNN, without modifying its architecture. The repair has the flexibility to modify multiple layers of the DNN, and runs in polynomial time. It supports DNNs with activation functions that have some linear pieces, as well as fully-connected, convolutional, pooling, and residual layers. To the best of our knowledge, this is the first provable repair approach that has all of these features. We implement our approach in a tool called APRNN. Using MNIST, ImageNet, and ACAS Xu DNNs, we show that it has better efficiency, scalability, and generalization compared to PRDNN and REASSURE, prior provable repair methods that are not architecture preserving.


@article{PLDI2023,
  author = {Tao, Zhe and Nawas, Stephanie and Mitchell, Jacqueline and Thakur, Aditya V.},
  title = {Architecture-Preserving Provable Repair of Deep Neural Networks},
  year = {2023},
  issue_date = {June 2023},
  publisher = {ACM},
  address = {New York, NY, USA},
  volume = {7},
  number = {PLDI},
  url = {https://doi.org/10.1145/3591238},
  doi = {10.1145/3591238},
  journal = {Proc. ACM Program. Lang.},
  month = jun,
  articleno = {124},
  numpages = {25},
  keywords = {Bug fixing, Deep Neural Networks, Synthesis, Repair}
}