
This year’s program seeks to highlight the challenges to privacy posed by the widespread adoption of machine learning and artificial intelligence technologies. One motivation for this focus stems from the goals and provisions of the European General Data Protection Regulation (GDPR), including its requirements for privacy and data protection by design, for providing notices and information about the logic of automated decision-making, and for privacy management and accountability structures in organizations that process personal data. Interpreting and operationalizing these requirements for systems that employ machine learning and artificial intelligence technologies is a daunting task.

As engineering is asked to play a larger role in privacy governance, software developers need tools for understanding, systematizing, and embedding privacy into systems and workflows. This work also requires greater engagement with design, legal, and public policy departments, and methods and tools for bridging privacy work across these communities are essential to success. Furthermore, research that focuses on techniques and tools to aid the translation of legal and normative concepts into system requirements is of great value.

Organizations also need tools for systematically evaluating whether systems fulfill users’ privacy needs and requirements and for providing the necessary technical assurances. Methods that support organizations and engineers in developing (socio-)technical systems that address these requirements are of increasing value in responding to the societal challenges associated with privacy.

In this context, privacy engineering research is emerging as an important topic. Engineers are increasingly expected to build and maintain privacy-preserving and data-protection-compliant systems in ICT domains such as health, energy, transportation, social computing, law enforcement, and public services, and on infrastructures such as cloud, grid, or mobile computing. While there is consensus on the benefits of an engineering approach to privacy, concrete proposals for models, methods, techniques, and tools that support engineers and organizations in this endeavor are few and in need of immediate attention. Also of great relevance are the development and evaluation of approaches that go beyond the one-size-fits-all mantra and that attend to the ever-evolving practice of software engineering in agile service environments across different domains.

To fill this gap, the International Workshop on Privacy Engineering covers all aspects of privacy engineering, ranging from its theoretical foundations, engineering approaches, and support infrastructures to its practical application in projects of varying scale across the software ecosystem.

Specifically, we are seeking the following kinds of papers:
1) technical papers that illustrate the engineering or application of a novel formalism, method or other research finding (e.g., engineering a privacy enhancing protocol) with preliminary evaluation;
2) experience and practice papers that describe a case study, challenge or lessons learned in a specific domain;
3) early evaluations of tools and other infrastructure that support engineering tasks in privacy requirements, design, implementation, testing, etc.;
4) interdisciplinary studies or critical reviews of existing privacy engineering concepts, methods, tools and frameworks;
5) vision papers that take a clear position informed by evidence based on a thorough literature review.