After the COVID-19 pandemic halted many asylum procedures across Europe, new technologies are now reviving these systems. From lie detection tools tested at the border to systems that verify documents and transcribe screening interviews, a wide range of technologies is being deployed in asylum applications. This article explores how these technologies have reshaped the way asylum procedures are conducted. It shows how asylum seekers are transformed into obliged yet hindered techno-users: they are asked to comply with a series of techno-bureaucratic steps and to keep up with capricious changes in criteria and deadlines. This obstructs their capacity to understand these systems and to pursue their right to protection.
It also shows how these technologies are embedded in refugee governance: they facilitate the ‘circuits of financial-humanitarianism’ that operate through a proliferation of dispersed technological requirements. These requirements deepen asylum seekers’ socio-legal precarity by hindering their access to the channels of protection. The article further argues that analyses of securitization and victimization should be coupled with insight into the disciplinary mechanisms of these technologies, through which migrants are turned into data-generating subjects disciplined by their reliance on technology.
Drawing on Foucault’s notion of power/knowledge and on regional knowledge, the article contends that these systems have an inherent obstructiveness. They have a double effect: while they help to expedite the asylum process, they also make it harder for refugees to navigate these systems. Asylum seekers are placed in a ‘knowledge deficit’ that leaves them vulnerable to flawed decisions by non-governmental actors and to ill-informed and unreliable narratives about their circumstances. Moreover, these systems pose new risks of ‘machine mistakes’ that may result in incorrect or discriminatory outcomes.