Data quality in data infrastructures of migration and border control
Image credit: STS Italia

Abstract
Migrants’ identity data does not always fit into the standard categories inscribed in information systems. These constraints (and possibilities) in standards and interfaces may be one reason why data entered into databases can eventually be found to be of low quality. It is therefore no surprise that data quality is frequently identified by policy-makers as an important obstacle to the effective operation of information systems in migration and border control. For example, the EU information systems of Justice and Home Affairs are currently undergoing significant changes to address data quality, a supposed prerequisite for making the systems interoperable. However, despite the wide range of research on data infrastructures of migration and border control in Europe, little research has so far dealt with data quality, and many questions remain. What types of frictions occur during the processes of establishing the identities of migrants? How are practices for processing vulnerable populations shaped by databases containing information that is not always complete, accurate, or reliable? And how do data quality mechanisms and technologies shape relations between different actors? As part of the ERC-funded “Processing Citizenship” project, I present results from my fieldwork conducted at a company that develops technologies for dealing with data frictions in the processes of migration and border control. My research on the use of these technologies at a migration and asylum agency of a Northern European Member State shows a sample of the kinds of data frictions that can occur during the processing of vulnerable populations, and how these frictions are mediated by such technologies. These findings contribute to STS research on data infrastructures, the role of data quality mechanisms in socio-technical architectures, and data frictions in migration and border security.
Location
Online conference

Authors
Wouter Van Rossem is a researcher working at the intersection of social science and computer science. He previously worked on the European Research Council (ERC) funded project Processing Citizenship, where he investigated how data infrastructures for population processing co-produce citizens, Europe, and territory. He completed his PhD at the University of Twente in the Netherlands and continues to work on publications stemming from that project. In addition to his academic work, he brings a diverse background as a software engineer, having worked at various companies and at the European Commission’s Joint Research Centre in Italy. This combination of theoretical and hands-on experience reflects his keen interest in exploring the intricate interconnections between technology and society.