Algorithmic governmentality and new punitive practices

Authors

  • Augusto Jobim do Amaral, Universidade de Coimbra
  • Felipe da Veiga Dias, Universidad de Sevilla
  • Ana Clara Santos Elesbão, Pontifícia Universidade Católica do Rio Grande do Sul

DOI:

https://doi.org/10.24215/25251678e549

Keywords:

algorithms, big data, punitive practices, social control

Abstract

Given the new sociotechnical context in which algorithms and data configure new forms of social control, this work examines the way in which new punitive practices take shape, seeking to answer the question: how do algorithms configure new punitive practices? Through a literature review, the research explores how networked computer systems operate as a multifaceted method of categorization and social classification whose objective is to manage populations by influencing people, channeling conducts, and determining opportunities. This multifaceted method works by estimating probabilities in order to anticipate the future and intervene in it, revealing a new algorithmic mode of management that operates through a screening mechanism which condemns the present to the anticipated future, placing both in the same field of experience and action for the subjects involved. This new mode of management reinforces inequalities and restricts and conditions opportunities according to private or governmental interests, favoring the fortunate and punishing the less fortunate, and it encodes past injustices in automated scoring systems that behave like self-fulfilling prophecies, with the effect of systematic discrimination. In this way, algorithms help to create the very environment that justifies their assumptions, producing widespread harm and establishing a punitive dynamic that expansively encompasses every instance of human life in which algorithmic predictions can be employed.

Author Biographies

  • Augusto Jobim do Amaral, Universidade de Coimbra

    Post-doctorate in Political Philosophy from the Università degli Studi di Padova, Italy. PhD in History of Thought from the Universidade de Coimbra, Portugal, and PhD in Criminal Sciences from PUCRS. Professor in the Graduate Program in Criminal Sciences and in the Graduate Program in Philosophy at PUCRS. http://orcid.org/0000-0003-0874-0583. E-mail: guto_jobim@hotmail.com

  • Felipe da Veiga Dias, Universidad de Sevilla

    Post-doctorate in Criminal Sciences from PUC/RS. PhD in Law from the Universidade de Santa Cruz do Sul (UNISC), with a doctoral exchange period at the Universidad de Sevilla (Spain). Professor in the Graduate Program in Law (Master's) at Faculdade Meridional (IMED). Professor in the Law program at Faculdade Meridional (IMED), Passo Fundo, RS, Brazil. Coordinator of the research group "Criminologia, Violência e Controle". Lawyer.

  • Ana Clara Santos Elesbão, Pontifícia Universidade Católica do Rio Grande do Sul

    Master in Criminal Sciences from the Graduate Program in Criminal Sciences at the Pontifícia Universidade Católica do Rio Grande do Sul, with research funded by CAPES. Bachelor of Juridical and Social Sciences from the Law School of the Pontifícia Universidade Católica do Rio Grande do Sul. Member of the research group Criminologia, Cultura Punitiva e Crítica Filosófica.

References

ALI, Muhammad; et al (2019). “Discrimination through optimization: How Facebook’s ad delivery can lead to skewed outcomes” in arXiv.org. Available at: https://arxiv.org/abs/1904.02095.

AMARAL, Augusto Jobim (2020). Prologue. Algoritarismos. Tirant Lo Blanch, São Paulo.

ANGWIN, Julia; et al (2016). “Machine Bias” in ProPublica. Available at: https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing.

BENJAMIN, Ruha (2019). Race after technology: Abolitionist Tools for the New Jim Code. Polity Press, Cambridge.

BRAYNE, Sarah (2021). Predict and Surveil: Data, Discretion, and the Future of Policing. Oxford University Press, New York.

BROWNE, Simone (2015). Dark Matters: on the surveillance of blackness. Duke University Press, Durham and London.

BRUNO, Fernanda (2016). “Rastrear, classificar e performar” in Ciência e Cultura, São Paulo, v. 68, n. 1, pp. 34-38.

BUOLAMWINI, Joy; GEBRU, Timnit (2018). “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification” in Proceedings of Machine Learning Research, v. 81, pp. 77-91.

CARRERA, Fernanda. “Racismo e sexismo em bancos de imagens digitais: análise de resultados de busca e atribuição de relevância na dimensão financeira/profissional” in SILVA, Tarcízio (ed.) Comunidades, algoritmos e ativismos digitais: olhares afrodiaspóricos. LiteraRUA, São Paulo.

CARRERA, Fernanda; CARVALHO, Denise (2019). “Algoritmos racistas: uma análise da hiper-ritualização da solidão da mulher negra em bancos de imagens digitais” in Galáxia: Revista do Programa de Pós-Graduação em Comunicação e Semiótica, n. 43, pp. 99-114. Available at https://revistas.pucsp.br/index.php/galaxia/article/view/41614.

CITRON, Danielle Keats; PASQUALE, Frank (2014). “The Scored Society: due process for automated predictions” in Washington Law Review, v. 89, n. 1, pp. 1-33. Available at https://digitalcommons.law.umaryland.edu/fac_pubs/1431/.

ENSIGN, Danielle; et al (2018). “Runaway feedback loops in predictive policing” in Proceedings of Machine Learning Research, v. 81, pp. 1-12. Available at http://proceedings.mlr.press/v81/ensign18a.html.

EUBANKS, Virginia (2018). Automating Inequality: How high-tech tools profile, police, and punish the poor. St. Martin's Press, New York.

FACE Recognition (2017). Electronic Frontier Foundation. Available at: https://www.eff.org/pt-br/pages/face-recognition.

FACE Recognition Vendor Test (2020). NIST. Available at https://www.nist.gov/programs-projects/face-recognition-vendor-test-frvt.

GARVIE, Clare; FRANKLE, Jonathan (2016). “Facial Recognition software might have a racial bias problem” in The Atlantic. Available at https://www.theatlantic.com/technology/archive/2016/04/the-underlying-bias-of-facial-recognition-systems/476991/.

GARVIE, Clare; et al (2016). “The perpetual line-up: unregulated police face recognition in America” in Georgetown Law Center on Privacy and Technology. Available at https://www.perpetuallineup.org/conclusion.

HARCOURT, Bernard (2015). Exposed: desire and disobedience in the digital age. Harvard University Press, Cambridge, London.

JURNO, Amanda; D’ANDREA, Carlos (2018). “Algoritmos e cosmopolíticas: a política de censura à nudez no Facebook e o regime de poder dos algoritmos” in PISEAGRAMA. Available at https://piseagrama.org/algoritmos-e-cosmopoliticas/.

LYON, David (2005). “Surveillance as social sorting: computer codes and mobile bodies” in LYON, David (ed.), Surveillance as social sorting: privacy, risk, and digital discrimination. Routledge, New York, London.

MAYER-SCHÖNBERGER, Viktor; CUKIER, Kenneth (2013). Big data: a revolution that will transform how we live, work, and think. Houghton Mifflin Harcourt Books and Media, Boston, New York.

NAKAMURA, Lisa (2007). Digitizing race: visual cultures of the Internet. University of Minnesota Press, Minneapolis, London.

NIST Study Evaluates Effects of Race, Age, Sex on Face Recognition Software (2020). NIST. Available at https://www.nist.gov/news-events/news/2019/12/nist-study-evaluates-effects-race-age-sex-face-recognition-software.

NOBLE, Safiya Umoja (2012). “Missed connections: what search engines say about women” in Bitch Media, n. 54, pp. 36-41. Available at https://safiyaunoble.com/missed-connections-search-engines-say-women-spring-2012/.

NOBLE, Safiya Umoja (2014). “Teaching Trayvon” in The Black Scholar, v. 44, n. 1, pp. 12-29. Available at http://www.jstor.org/stable/10.5816/blackscholar.44.1.0012.

NOBLE, Safiya Umoja (2016). “A Future for Intersectional Black Feminist Technology Studies” in Scholar and Feminist Online. Available at https://sfonline.barnard.edu/traversing-technologies/safiya-umoja-noble-a-future-for-intersectional-black-feminist-technology-studies/.

NOBLE, Safiya Umoja (2018). Algorithms of oppression: how search engines reinforce racism. New York University Press, New York.

NOBLE, Safiya Umoja (2019). “Google Search: Hyper-visibility as a Means of Rendering Black Women and Girls Invisible” in InVisible Culture: An Electronic Journal for Visual Culture, v. 19. Available at: http://ivc.lib.rochester.edu/google-search-hyper-visibility-as-a-means-of-rendering-black-women-and-girls-invisible/.

O’NEIL, Cathy (2016). Weapons of math destruction: how big data increases inequality and threatens democracy. Crown Publishers, New York.

OSOBA, Osonde; WELSER, William (2017). An Intelligence in Our Image: the risks of bias and errors in artificial intelligence. RAND Corporation, Santa Monica.

PASQUALE, Frank (2015). The black box society: the secret algorithms that control money and information. Harvard University Press, Cambridge, London.

REY, Emmy (2020). “What PredPol is and what PredPol is not” in PredPol. Available at: https://www.predpol.com/whatispredpol/.

RITCHIE, Marnie (2020). “Fusing race: the phobogenics of racializing surveillance” in Surveillance & Society, v. 18, n. 1, pp. 12-29. Available at https://ojs.library.queensu.ca/index.php/surveillance-and-society/article/view/13131.

ROUVROY, Antoinette; BERNS, Thomas (2015). “Governamentalidade algorítmica e perspectivas de emancipação: o díspar como condição de individuação pela relação?” in Revista Eco Pós: Tecnopolíticas e Vigilância, v. 18, n. 2, pp. 36-56. Available at https://revistaecopos.eco.ufrj.br/eco_pos/article/view/2662.

SILVA, Tarcízio (2019). “Racismo Algorítmico em Plataformas Digitais: microagressões e discriminação em código” in Simpósio Internacional LAVITS: assimetrias e (in)visibilidades: vigilância, gênero e raça, 6. Electronic proceedings. Available at http://lavits.org/anais-do-vi-simposio-internacional-lavits-assimetrias-e-invisibilidades-vigilancia-genero-e-raca/?lang=pt.

SILVA, Tarcízio (2019). “Teoria racial crítica e comunicação digital: conexões contra a dupla opacidade” in Congresso Brasileiro de Ciências da Comunicação, 42. Electronic proceedings.

SILVA, Tarcízio (2020). “Visão computacional e racismo algorítmico: branquitude e opacidade no aprendizado de máquina” in Revista da Associação Brasileira de Pesquisadores/as Negros/as (ABPN), v. 12, n. 31, pp. 428-448. Available at http://abpnrevista.org.br/revista/index.php/revistaabpn1/article/view/744?fbclid=IwAR1uD0ab3TQcKGAsEuaVemudHgBWe-Ep5aKIdDplG-9tN59Jf8b3BuHt5kQ.

SINGER, Natasha; METZ, Cade (2019). “Many facial-recognition systems are biased, says U.S. study” in The New York Times. Available at https://www.nytimes.com/2019/12/19/technology/facial-recognition-bias.html.

SPEICHER, Till; et al (2018). “Potential for discrimination in online targeted advertising” in Proceedings of Machine Learning Research, v. 81, pp. 1-15. Available at http://proceedings.mlr.press/v81/speicher18a.html.

SWEENEY, Latanya (2013). “Discrimination in Online Ad Delivery” in arXiv.org. Available at https://arxiv.org/abs/1301.6822.

THE Allegheny Family Screening Tool (2021). Allegheny County. Available at https://www.alleghenycounty.us/Human-Services/News-Events/Accomplishments/Allegheny-Family-Screening-Tool.aspx.

UGWUDIKE, Pamela (2020). “Digital prediction technologies in the justice system: the implications of a ‘race-neutral’ agenda” in Theoretical Criminology, v. 14, n. 3, pp. 482-501. Available at https://journals.sagepub.com/doi/abs/10.1177/1362480619896006.

WACHTER-BOETTCHER, Sara (2017). Technically Wrong: sexist apps, biased algorithms, and other threats of toxic tech. W.W. Norton & Company, New York, London.

ZARSKY, Tal Z (2014). “Understanding discrimination in the scored society” in Washington Law Review, v. 89, n. 4, pp. 1375-1412. Available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2550248.

Published

2021-09-21

Section

Sección Especial: Derecho Crítico

How to Cite

Jobim do Amaral, A., da Veiga Dias, F., & Santos Elesbão, A. C. (2021). Algorithmic governmentality and new punitive practices. Derechos En Acción, 20(20), 549. https://doi.org/10.24215/25251678e549