The rise of cloud computing led to a paradigm shift in technological services, enabling enterprises to delegate their data analytics tasks to third-party (cloud) servers. Machine Learning as a Service (MLaaS) is one such offering, which allows stakeholders to perform machine learning tasks on a cloud platform. Unfortunately, the advantage of outsourcing these computationally intensive operations comes at a high cost in terms of privacy exposure. The goal is therefore to design customized ML algorithms that preserve the privacy of the processed data by design. Advanced cryptographic techniques such as fully homomorphic encryption or secure multi-party computation enable the execution of some operations over encrypted data and can therefore be considered as potential building blocks for such algorithms. Yet, for some operations, they incur high computational and/or communication costs. In this talk, we will analyze the tension between ML techniques and the relevant cryptographic tools. We will further give an overview of existing solutions addressing both privacy and security.
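To make the idea of computing over encrypted data concrete, the following sketch implements the additively homomorphic Paillier cryptosystem with toy parameters. This is an illustration of the general principle only (Paillier supports addition of plaintexts via multiplication of ciphertexts), not one of the fully homomorphic or multi-party constructions discussed in the talk; the tiny primes used here are for demonstration and offer no security.

```python
import math
import random

def lcm(a, b):
    return a * b // math.gcd(a, b)

def keygen(p, q):
    """Paillier key generation with g = n + 1 (a common simplification)."""
    n = p * q
    lam = lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)          # modular inverse of lambda mod n
    return (n,), (lam, mu, n)     # (public key), (private key)

def encrypt(pub, m):
    (n,) = pub
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:    # r must be invertible mod n
        r = random.randrange(1, n)
    # c = g^m * r^n mod n^2, with g = n + 1
    return (pow(1 + n, m, n2) * pow(r, n, n2)) % n2

def decrypt(priv, c):
    lam, mu, n = priv
    n2 = n * n
    L = (pow(c, lam, n2) - 1) // n   # L(x) = (x - 1) / n
    return (L * mu) % n

# Toy primes for illustration only; real deployments use ~1024-bit primes.
pub, priv = keygen(17, 19)
c1, c2 = encrypt(pub, 7), encrypt(pub, 12)

# The server can add the plaintexts without ever decrypting:
# multiplying ciphertexts mod n^2 corresponds to adding plaintexts mod n.
c_sum = (c1 * c2) % (pub[0] ** 2)
print(decrypt(priv, c_sum))  # 19
```

A cloud server holding only `pub` can thus aggregate encrypted values (e.g., sums in a statistics or ML pipeline) while the data owner alone, holding `priv`, can read the result; richer operations such as arbitrary multiplications require fully homomorphic schemes, at significantly higher cost.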
Privacy and security for artificial intelligence
Invited talk at the University of Luxembourg, 1 July 2021, Luxembourg, Luxembourg
© EURECOM. Personal use of this material is permitted. The definitive version of this paper was published as an invited talk at the University of Luxembourg, 1 July 2021, Luxembourg, Luxembourg, and is available at:
PERMALINK: https://www.eurecom.fr/publication/6592