Coercion without contact: New technologies and the boundaries of torture
DOI: https://doi.org/10.7146/torture.v36i1.167351

Keywords: Torture

Abstract
Introduction: This editorial revisits "Internet and communications as elements for CIDT and torture" (Pérez-Sales & Serra, 2020) in light of generative AI, biometric surveillance, spyware, automated inference, neurotechnology, and platform-based coercion. It asks how new technologies reshape the boundaries between coercion, cruel, inhuman or degrading treatment, and torture.

Methodology: The paper uses conceptual analysis, typology-building, and normative human rights interpretation, informed by a purposive interdisciplinary review.

Results: The editorial proposes a three-layer framework organised around the human need under attack, the method through which harm is produced, and the site where coercion occurs. It identifies three overlapping domains: mental-directed interventions, social control, and social influence. Across scenarios including technologically assisted interrogation, protest policing, e-carceration, border governance, armed conflict, and digital authoritarianism, the analysis shows that technologically mediated coercion may produce severe suffering without direct physical contact. Harms may arise through surveillance, exposure, radical uncertainty, reputational destruction, isolation, automated exclusion, manipulation of perception, and induced vulnerability. The paper argues that severity should be assessed cumulatively and contextually, including impacts on agency, identity, relational life, collective belonging, and conditions of existence. It also highlights the difficulty of attribution when states, companies, platforms, vendors, data brokers, and automated systems jointly produce coercive environments. Existing human rights frameworks remain relevant but require doctrinal refinement, stronger accountability tools, and better methods for documenting diffuse, opaque, and collective harms.

Conclusion: Torture and ill-treatment do not end where screens begin; technological mediation requires updated legal, clinical, and evidentiary frameworks.
References
Amnesty International. (2024). Recording dissent: Camera surveillance at peaceful protests in the Netherlands. https://www.amnesty.org/en/documents/eur35/8469/2024/en/
Avis, N., Marciniak, L., & Sapignoli, M. (Eds.). (2024). States of surveillance: Ethnographies of new technologies in policing and justice. Routledge. https://doi.org/10.4324/9781003413547
Bachelet, M. (2018, November 14). Human rights in a new era [Speech]. Office of the United Nations High Commissioner for Human Rights. https://www.ohchr.org/en/statements-and-speeches/2018/11/human-rights-new-era
Beduschi, A. (2021). International migration management in the age of artificial intelligence. Migration Studies, 9(3), 576–596. https://doi.org/10.1093/migration/mnaa003
Bublitz, J.-C. (2013). My mind is mine!? Cognitive liberty as a legal concept. In E. Hildt & A. Franke (Eds.), Cognitive enhancement (pp. 233–264). Springer. https://doi.org/10.1007/978-94-007-6253-4_19
Christl, W. (2017). Corporate surveillance in everyday life. Cracked Labs.
Council of Europe. (2022). Pegasus spyware and its implications on human rights. https://rm.coe.int/pegasus-spyware-report-en/1680a6f5d8
Council of Europe. (2024). Council of Europe Framework Convention on Artificial Intelligence and Human Rights, Democracy and the Rule of Law (CETS No. 225). https://rm.coe.int/1680afae3c
Crowther, N., & McGregor, L. (2022). A digital cage is still a cage: Human rights and new and emerging technologies in social care. University of Essex, Human Rights, Big Data and Technology Project. https://repository.essex.ac.uk/33020/
Dore-Horgan, E., Ligthart, S., Meynen, G., & Kellmeyer, P. (Eds.). (2026). Cambridge handbook on human rights for the mind: Emerging technologies, law and philosophy. Cambridge University Press.
Dulka, A. (2023). The use of artificial intelligence in international human rights law. Stanford Technology Law Review, 26, 316–366. https://law.stanford.edu/wp-content/uploads/2023/08/Publish_26-STLR-316-2023_The-Use-of-Artificial-Intelligence-in-International-Human-Rights-Law8655.pdf
Elbatanouny, H., Elbatanouny, M., Sallam, M., Rida, I., & Evans, A. (2025). A comprehensive analysis of deception detection methods using machine learning. Expert Systems with Applications, 263, Article 125702. https://doi.org/10.1016/j.eswa.2024.125702
European Commission. (2025). Guidelines on prohibited artificial intelligence practices established by Regulation (EU) 2024/1689 (AI Act). AI Act Service Desk. https://ai-act-service-desk.ec.europa.eu/sites/default/files/2025-08/guidelines_on_prohibited_artificial_intelligence_practices_established_by_regulation_eu_20241689_ai_act_english_ied3r5nwo50xggpcfmwckm3nuc_112367-1.PDF
European Commission. (2026a). Frequently asked questions. AI Act Service Desk. https://ai-act-service-desk.ec.europa.eu/en/faq
European Commission. (2026b). Navigating the AI Act. Shaping Europe’s digital future. https://digital-strategy.ec.europa.eu/en/faqs/navigating-ai-act
European Parliament. (2024, March 13). Harmonised rules on use of artificial intelligence in the European Union (EU AI Act) [Legislative resolution]. https://www.europarl.europa.eu/doceo/document/TC1-COD-2021-0106_EN.pdf
European Union Agency for Fundamental Rights. (2023). Surveillance by intelligence services: Fundamental rights safeguards and remedies in the EU (2023 update of the 2017 report). https://fra.europa.eu/en/publication/2023/surveillance-update#country-related
European Union. (2024). Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence (Artificial Intelligence Act). EUR-Lex. https://eur-lex.europa.eu/eli/reg/2024/1689/oj/eng
Freedom House. (2018). Freedom on the Net 2018: The rise of digital authoritarianism. https://freedomhouse.org/report/freedom-net/2018/rise-digital-authoritarianism
Głowacka, D., Youngs, R., Pintea, A., & Wołosik, E. (2021). Digital technologies as a means of repression and social control. European Parliament. https://www.europarl.europa.eu/RegData/etudes/STUD/2021/653636/EXPO_STU%282021%29653636_EN.pdf
High-Value Detainee Interrogation Group. (2016). Interrogation: A review of the science. Federal Bureau of Investigation. https://www.fbi.gov/file-repository/hig-report-interrogation-a-review-of-the-science-september-2016.pdf/view
Human Rights Watch & Harvard Law School International Human Rights Clinic. (2025). A hazard to human rights: Autonomous weapons systems and digital decision-making. Human Rights Watch. https://www.hrw.org/report/2025/04/28/a-hazard-to-human-rights/autonomous-weapons-systems-and-digital-decision-making
Ienca, M., & Andorno, R. (2017). Towards new human rights in the age of neuroscience and neurotechnology. Life Sciences, Society and Policy, 13, Article 5. https://doi.org/10.1186/s40504-017-0050-1
Ienca, M. (2021). The human right to cognitive liberty. Cambridge University Press.
International Committee of the Red Cross. (2019). Artificial intelligence and machine learning in armed conflict: A human-centred approach. https://www.icrc.org/sites/default/files/document_new/file_list/ai_and_machine_learning_in_armed_conflict-icrc.pdf
International Committee of the Red Cross. (2024). International humanitarian law and the challenges of contemporary armed conflicts. ICRC. https://shop.icrc.org/international-humanitarian-law-and-the-challenges-of-contemporary-armed-conflicts
International Committee of the Red Cross. (2026). Autonomous weapon systems and international humanitarian law: Selected issues (Position paper). https://www.icrc.org/sites/default/files/2026-03/4896_002_Autonomous_Weapons_Systems_-_IHL-ICRC.pdf
International Network of Civil Liberties Organizations, Physicians for Human Rights, & Omega Research Foundation. (2023). Lethal in disguise 2: How crowd-control weapons impact health and human rights. PHR. https://phr.org/our-work/resources/lethal-in-disguise-2/
International Organization for Migration. (2021). World migration report 2022: Chapter on artificial intelligence, migration and mobility. https://publications.iom.int/system/files/pdf/wmr_2022_book_eng_1.pdf
Joint declaration on freedom of peaceful assembly and of association and misuse of digital technologies. (2023, September 15). https://www.ohchr.org/sites/default/files/documents/issues/trafficking/statements/20230915-jd-foaa-digital-technologies.pdf
Kalodanis, K. (2025). High-risk AI systems — lie detection application. Future Internet, 17(2), Article 54. https://doi.org/10.3390/fi17020054
Krishnan, A. (2016). Attack on the brain: Neurowars and neurowarfare. Space and Defense, 9(1), 4–18. https://doi.org/10.32873/uno.dc.sd.09.01.1110
Krishnan, A. (2018). Military neuroscience and the coming age of neurowarfare. Routledge. https://doi.org/10.4324/9781315595429
Ligthart, S. (2022). Coercive brain-reading in criminal justice: An analysis of European human rights law. Cambridge University Press.
Ligthart, S. (2024). Towards a human right to psychological continuity? Reflections on the rights to personal identity, self-determination, and personal integrity. European Convention on Human Rights Law Review, 5(2), 199–229. https://doi.org/10.1163/26663236-bja10092
Ligthart, S., Bublitz, C., Douglas, T., Forsberg, L., & Meynen, G. (2022). Rethinking the right to freedom of thought: A multidisciplinary analysis. Human Rights Law Review, 22(4), Article ngac028. https://doi.org/10.1093/hrlr/ngac028
Ligthart, S., Douglas, T., Bublitz, C., Kooijmans, T., & Meynen, G. (2021). Forensic brain-reading and mental privacy in European human rights law: Foundations and challenges. Neuroethics, 14(2), 191–203. https://doi.org/10.1007/s12152-020-09438-4
Lubbers, E. (2015). Undercover research: Corporate and police spying on activists. An introduction to activist intelligence as a new field of study. Surveillance & Society, 13(3–4), 338–353. https://doi.org/10.24908/ss.v13i3/4.5371
Malek, S., Hearn, D., Fahy, T., Tully, J., & Exworthy, T. (2023). Legal and human rights issues in the use of electronic monitoring (using GPS “tracking” technology) in forensic mental health settings in the UK. Medicine, Science and the Law, 63(4), 309–315. https://doi.org/10.1177/00258024231174820
Manek, J., Galán-Santamarina, A., & Pérez-Sales, P. (2022). Torturing environments and multiple injuries in Mexican migration detention. Humanities and Social Sciences Communications, 9(1). https://doi.org/10.1057/s41599-022-01252-y
Marin, M., Kalaitzis, F., & Price, B. (2020, July 6). Using artificial intelligence to scale up human rights research: A case study on Darfur. Citizen Evidence Lab. https://citizenevidence.org/2020/07/06/using-artificial-intelligence-to-scale-up-human-rights-research-a-case-study-on-darfur/
McCarthy-Jones, S. (2019). The autonomous mind: The right to freedom of thought in the twenty-first century. Frontiers in Artificial Intelligence, 2, Article 19. https://doi.org/10.3389/frai.2019.00019
McEvoy, M., Corney, N., & Haar, R. J. (2024). State violence against protesters: Perspectives and trends in use of less lethal weapons. Torture Journal, 34(1). https://doi.org/10.7146/torture.v34i1.144275
McKay, C. (2021). The carceral automaton: Digital prisons and technologies of detention. International Journal for Crime, Justice and Social Democracy, 10(4), 100–119. https://doi.org/10.5204/IJCJSD.2137
Melgaço, L., & Monaghan, J. (Eds.). (2021). Protests in the information age: Social movements, digital practices and surveillance. Routledge. https://doi.org/10.4324/9780429467639
Noriega, M. (2020). The application of artificial intelligence in police interrogations. Futures, 117, Article 102510. https://doi.org/10.1016/j.futures.2019.102510
Office of the United Nations High Commissioner for Human Rights, & University of Essex. (2023). Digital border governance: A human rights-based approach. https://www.ohchr.org/sites/default/files/2023-09/Digital-Border-Governance-A-Human-Rights-Based-Approach.pdf
Office of the United Nations High Commissioner for Human Rights. (2022). The right to privacy in the digital age (A/HRC/51/17). https://www.ohchr.org/en/documents/thematic-reports/ahrc5117-right-privacy-digital-age
Office of the United Nations High Commissioner for Human Rights. (2024a). Human rights compliant uses of digital technologies by law enforcement in the context of peaceful protests. https://www.ohchr.org/sites/default/files/2024-03/Toolkit-law-enforcement-Component-on-Digital-Technologies.pdf
Office of the United Nations High Commissioner for Human Rights. (2024b). Trends in digital rights across the Middle East and North Africa (MENA) region: Expansion of digital surveillance and impacts on journalists and human rights defenders. https://www.ohchr.org/sites/default/files/documents/issues/civicspace/resources/civic-space-tech-brief-surveillance-trends-middle-east-north-africa-1-en.pdf
Office of the United Nations High Commissioner for Human Rights. (2025). Call for input for the HRC62 thematic report on the impact of digital and AI-assisted surveillance technologies on human rights, especially the rights to freedom of peaceful assembly and of association. https://www.ohchr.org/en/calls-for-input/2025/call-input-hrc62-thematic-report-impact-digital-and-ai-assisted-surveillance
Office of the United Nations High Commissioner for Human Rights. (2026). Tech brief: Data privacy, discrimination & AI. https://www.ohchr.org/sites/default/files/documents/issues/civicspace/resources/brief-data-privacy-ai-report-rev.pdf
Omega Research Foundation & Amnesty International. (2023). My eye exploded: The global abuse of kinetic impact projectiles. Amnesty International. https://www.amnesty.org/en/documents/act30/6384/2023/en/
Pérez-Sales, P. (2017). Psychological torture: Definition, evaluation and measurement. Routledge. https://doi.org/10.4324/9781315616940
Pérez-Sales, P. (2026). Torturing environments: Psychological, clinical and legal dimensions. Routledge. https://doi.org/10.4324/9781003639350
Pérez-Sales, P., & Serra, L. (2020). Internet and communications as elements for CIDT and torture: Initial reflections in an unexplored field. Torture Journal, 30(1), 5–22. https://doi.org/10.7146/torture.v30i1.120593
Polyakova, A., & Meserole, C. (2019). Exporting digital authoritarianism: The Russian and Chinese models. Brookings Institution. https://www.brookings.edu/wp-content/uploads/2019/08/FP_20190827_digital_authoritarianism_polyakova_meserole.pdf
Roberts, T., & Oosterom, M. (2025). Digital authoritarianism: A systematic literature review. Information Technology for Development, 31(4), 860–884. https://doi.org/10.1080/02681102.2024.2425352
Ruggie, J. G. (2011). Guiding principles on business and human rights: Implementing the United Nations "protect, respect and remedy" framework (A/HRC/17/31). United Nations Human Rights Council. https://www.ohchr.org/sites/default/files/documents/publications/guidingprinciplesbusinesshr_en.pdf
Shiner, B. (2025). The right to freedom of thought and the prohibition of torture, or cruel, inhuman or degrading treatment or punishment: Examining the relationship in the case of the coercive and interrogational use of neurotechnology. In E. Dore-Horgan, S. Ligthart, G. Meynen, & P. Kellmeyer (Eds.), Cambridge handbook on human rights for the mind: Emerging technologies, law and philosophy (pp. 1–26). Cambridge University Press.
Special Rapporteur on the Promotion and Protection of Human Rights and Fundamental Freedoms while Countering Terrorism. (2025). Protecting human rights while using artificial intelligence in counter-terrorism and security settings [Position paper]. https://www.ohchr.org/sites/default/files/documents/issues/terrorism/sr/un-sr-ct-ai-position-paper-dec-2025.pdf
Special Rapporteur on the Rights to Freedom of Peaceful Assembly and of Association. (2024). Model protocol for law enforcement officials to promote and protect human rights in the context of peaceful protests (A/HRC/55/60). United Nations. https://www.ohchr.org/en/documents/thematic-reports/ahrc5560-model-protocol
Stark, E. (2007). Coercive control: How men entrap women in personal life. Oxford University Press.
Teo, S. A. (2022). How artificial intelligence systems challenge the conceptual foundations of the human rights legal framework. Nordic Journal of Human Rights, 40(1), 216–234. https://doi.org/10.1080/18918131.2022.2073078
Tesink, V., Douglas, T., Forsberg, L., Ligthart, S., & Meynen, G. (2024). Right to mental integrity and neurotechnologies: Implications of the extended mind thesis. Journal of Medical Ethics, 50(10), 656–663. https://doi.org/10.1136/JME-2023-109645
UNESCO, & Office of the United Nations High Commissioner for Human Rights. (2026). Protecting critical voices: Guidance for human rights impact assessment on digital platforms. https://www.ohchr.org/sites/default/files/documents/issues/civicspace/protecting-critical-voices-en.pdf
UNESCO. (2024). Recommendation on the ethics of artificial intelligence (updated version). https://www.unesco.org/en/articles/recommendation-ethics-artificial-intelligence
United Nations Human Rights Council Advisory Committee. (2024). Impact, opportunities and challenges of neurotechnology with regard to the promotion and protection of all human rights (A/HRC/57/61). https://www.ohchr.org/sites/default/files/documents/hrbodies/hrcouncil/sessions-regular/session57/A-HRC-57-61-Etext-accessible.pdf
Ünver, A. (2024). Artificial intelligence (AI) and human rights: Using AI as a weapon of repression and its impact on human rights. European Parliament, Subcommittee on Human Rights. https://www.europarl.europa.eu/RegData/etudes/IDAN/2024/754450/EXPO_IDA%282024%29754450_EN.pdf
Woodlock, D. (2017). The abuse of technology in domestic violence and stalking. Violence Against Women, 23(5), 584–602.
Yuste, R., Goering, S., Agüera y Arcas, B., et al. (2017). Four ethical priorities for neurotechnologies and AI. Nature, 551, 159–163. https://doi.org/10.1038/551159a
Zuboff, S. (2019). The age of surveillance capitalism. PublicAffairs. https://www.publicaffairsbooks.com/titles/shoshana-zuboff/the-age-of-surveillance-capitalism/9781610395694/
License
Copyright (c) 2026 Torture Journal

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
We accept that some authors (e.g. government employees in some countries) are unable to transfer copyright. The Creative Commons Licence Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) covers both the Torture Journal and the IRCT web site. The publisher will not put any limitation on the personal freedom of the author to use material contained in the paper in other works which may be published, provided that acknowledgement is made to the original place of publication.