Provable Editing of Deep Neural Networks using Parametric Linear Relaxation

Zhe Tao and Aditya V. Thakur
Advances in Neural Information Processing Systems: Annual Conference on Neural Information Processing Systems (NeurIPS), 2024

Ensuring that a DNN satisfies a desired property is critical when deploying DNNs in safety-critical applications. Efficient methods exist to verify whether a DNN satisfies a property, as seen in the annual DNN verification competition (VNN-COMP). However, the problem of provably editing a DNN to satisfy a property remains challenging. We present PREPARED, the first efficient technique for provable editing of DNNs. Given a DNN from the VNN-COMP benchmarks and a property it violates, PREPARED provably edits the DNN to satisfy the property within 45 seconds. PREPARED is efficient because it relaxes the NP-hard provable-editing problem to solving a linear program. The key contribution is the novel notion of Parametric Linear Relaxation, which enables PREPARED to construct tight output bounds of the DNN that are parameterized by the new parameters. We demonstrate that PREPARED is more efficient and effective than prior DNN-editing approaches i) on the VNN-COMP benchmarks, ii) in editing CIFAR10 and TinyImageNet image-recognition DNNs and BERT sentiment-classification DNNs for local robustness, and iii) in training a DNN to model a geodynamics process while satisfying physics constraints.
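
To give a concrete flavor of the key idea, here is an illustrative toy, not the paper's actual PREPARED algorithm: for a single linear neuron y = w.x + b over an input box l <= x <= u, the worst-case output is sum_j max(w_j*l_j, w_j*u_j) + b, which is convex piecewise-linear in the parameters. Introducing auxiliary variables therefore turns the requirement "provably y <= 0 on the box" into linear constraints on the new parameters, so a minimal edit can be found with an off-the-shelf LP solver. The sketch below uses SciPy with made-up numbers; all names and the single-neuron setup are assumptions for illustration only.

import numpy as np
from scipy.optimize import linprog

# Toy setting (illustrative, not from the paper): a single linear neuron
# y = w @ x + b, an input box l <= x <= u, and the property
# "y <= 0 for every x in the box".
w0 = np.array([1.0, 0.5])   # original weights
b0 = 0.3                    # original bias
l  = np.array([-1.0, 0.0])  # lower corner of the input box
u  = np.array([ 1.0, 2.0])  # upper corner of the input box
n  = len(w0)

# Over the box, max_x w_j * x_j = max(w_j*l_j, w_j*u_j), which is convex
# piecewise-linear in w_j.  Introducing t_j >= w_j*l_j and t_j >= w_j*u_j
# makes the sound upper bound  sum_j t_j + b  linear in the *new*
# parameters -- the "bound parameterized by new parameters" idea in
# miniature.
#
# Variable layout: [w (n), b (1), t (n), d (n), db (1)], where
# d_j >= |w_j - w0_j| and db >= |b - b0| encode the L1 edit distance.
N = 3 * n + 2
iw, ib, it, id_, idb = 0, n, n + 1, 2 * n + 1, 3 * n + 1

c = np.zeros(N)
c[id_:id_ + n] = 1.0   # minimize sum_j d_j ...
c[idb] = 1.0           # ... + db  (total L1 change to the parameters)

A, rhs = [], []
for j in range(n):
    # t_j >= corner * w_j for both box corners, as corner*w_j - t_j <= 0
    for corner in (l[j], u[j]):
        row = np.zeros(N); row[iw + j] = corner; row[it + j] = -1.0
        A.append(row); rhs.append(0.0)
    # |w_j - w0_j| <= d_j, split into two linear inequalities
    row = np.zeros(N); row[iw + j] = 1.0;  row[id_ + j] = -1.0
    A.append(row); rhs.append(w0[j])
    row = np.zeros(N); row[iw + j] = -1.0; row[id_ + j] = -1.0
    A.append(row); rhs.append(-w0[j])

# Provable property: the parameterized upper bound sum_j t_j + b <= 0.
row = np.zeros(N); row[it:it + n] = 1.0; row[ib] = 1.0
A.append(row); rhs.append(0.0)

# |b - b0| <= db, split into two linear inequalities
row = np.zeros(N); row[ib] = 1.0;  row[idb] = -1.0; A.append(row); rhs.append(b0)
row = np.zeros(N); row[ib] = -1.0; row[idb] = -1.0; A.append(row); rhs.append(-b0)

res = linprog(c, A_ub=np.array(A), b_ub=np.array(rhs),
              bounds=[(None, None)] * N)
w_new, b_new = res.x[iw:iw + n], res.x[ib]
print("edited weights:", w_new, "edited bias:", b_new)
# Sanity check: worst-case output over the box is now provably <= 0.
print("worst case:", np.maximum(w_new * l, w_new * u).sum() + b_new)

PREPARED's Parametric Linear Relaxation generalizes this style of parameter-dependent bounding to deep, nonlinear networks, where the exact provable-editing problem is NP-hard.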


@inproceedings{NeurIPS2024,
  author = {Tao, Zhe and Thakur, Aditya V.},
  year = {2024},
  title = {Provable Editing of Deep Neural Networks using Parametric Linear Relaxation},
  booktitle = {Advances in Neural Information Processing Systems: Annual Conference on Neural Information Processing Systems (NeurIPS)}
}