Error 417 Expectation Failed

Error 406 Tech Fascism Not Acceptable

Call For Projects

Error 417 Expectation Failed is looking for 10 scores against tech fascism. Artists, curators, and collectives worldwide are invited to apply with project ideas, instructions, how-tos, interventions and practices that engage with the contemporary condition. This call for resistance is not about global fixes or sweeping victories — it’s about finding ways to challenge the current systems with the means available: misdirection, opting out, and pushing back. Respond to Tech Fascism with Error 406 Not Acceptable.

  • Grants from €1,500 to €7,000
  • Deadline call for projects: 17 June 2025
  • Jury: Hito Steyerl, Nora O' Murchú and Sam Lavigne
  • Deadline for the work: 30 November 2025

What we are looking for

The aim of this call for projects is to reflect on, resist, and ultimately undermine the power dynamics imposed by tech fascism. Projects should be developed and realised in your specific context and must be implemented as a score, a simple set of instructions that can be shared online.

Tech Fascism Not Acceptable

From surveillance systems and algorithmic decision-making to the emerging influence of AI, authoritarian technologies are not just tools — they are systems of control, exclusion, and manipulation. We must therefore ask: What conditions do we want to live in? How can we resist tech fascism?

Essays

Get some context on the topic by checking out our commissioned essays:

«Oh Man» by Ana Teixeira Pinto
«Refusing Tech Fascism» by tante

Online Exhibition

The online exhibition takes place at the end of 2025 and consists of:

  • Your score.
  • A realisation of your score within your practice that can be published online in a suitable format.

Deadline for the work is 30 November 2025.

Jury

The jury consists of Hito Steyerl, Nora O' Murchú and Sam Lavigne as well as representatives of Error 417 Expectation Failed.

Who can apply?

The application is open worldwide to artists, collectives, curators, arts initiatives, exhibition spaces, online platforms, and developers of software and hardware — basically anyone exploring the possibilities of contemporary, networked technologies as art and their significance for society.

We encourage proposals from people who are historically underrepresented in art and technology spaces.

FAQ

When will the jury make their decision?

The jury will meet at the beginning of July, and grant recipients will be informed by mid-July 2025.

Do project recipients need to do a final report or accounting?

No, we will not ask you for final reports, receipts or financial statements.

Is it important for my project to begin and finish within the (time) frame of Error 406?

No. However, please do not apply with a project that is already fully finished.

Is it okay if my project also receives funding from other sources?

Sure! For us, this is not a problem. However, please make sure that your other funding sources allow you to receive multiple grants and to exhibit your work with us.

Can I apply anonymously?

The names of the awarded projects and applicants will be published in public communication and announcements for Error 417 Expectation Failed. If you prefer to stay anonymous, please let us know via email: error417@expectation.fail.

Apply here  

What Are We Looking For?

The aim of the program Error 406 [Tech Fascism] Not Acceptable is to reflect on, resist, and ultimately undermine the power dynamics imposed by tech fascism. Projects should be developed and realised in your specific context and must be implemented as a score, a simple set of instructions that can be shared online.

We support artistic interventions that embrace political activism, community engagement, and the development of tools that serve the public good rather than protecting elitist power structures. The proposed artistic projects can be tactical, performative, or metaphoric, among other approaches. We are looking for daily acts of resistance, interventions, how-tos on bugging an opera or disturbing surveillance infrastructures, a guide for renaming the Gulf of Mexico on Google Maps, a computer program, a single line of code, a tutorial, event scores, a dérive, or a recipe. The works should be implemented as a simple set of instructions that can be shared online and realised by you in your specific context.
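
As a loose illustration of the format, a score delivered as "a computer program" might be no more than a few lines that anyone can run and adapt. The sketch below is a hypothetical example, not a template or an official requirement: a tiny Python web server that answers every request with HTTP 406 Not Acceptable, the status code this programme takes its name from (the handler name, port, and response text are placeholders).

    # Hypothetical score sketch: a minimal web server that refuses every request
    # with HTTP 406 Not Acceptable. Uses only the Python standard library.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class NotAcceptableHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Decline the request: status 406, plain-text explanation in the body.
            self.send_response(406)
            self.send_header("Content-Type", "text/plain; charset=utf-8")
            self.end_headers()
            self.wfile.write(b"406 Tech Fascism Not Acceptable\n")

    if __name__ == "__main__":
        # Port 8406 is an arbitrary placeholder; visit http://localhost:8406 after starting.
        HTTPServer(("0.0.0.0", 8406), NotAcceptableHandler).serve_forever()

Run it and the only thing it will ever serve is a refusal.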

Projects do not have to be fully completed within the (time) frame of Error 406. We support open-ended formats, research, procedural and exploratory works which include the possibility of failure instead of forcing preconceived outcomes.

Please apply by 17 June 2025. You can apply for funding on a sliding scale from €1,500 to €7,000, according to the scope and complexity of your project. The grant recipients will be selected by an international jury, presented in an online exhibition, and given access to a mentoring session with a member of the jury.

Tech Fascism Not Acceptable

From surveillance systems and algorithmic decision-making to the emerging influence of AI, authoritarian technologies are not just tools — they are systems of control, exclusion, and manipulation. We must therefore ask:

How does tech fascism affect us? How can we refuse, intervene in, or sabotage fascist systems? How can practices of civil disobedience be shared and brought to the mainstream? What role can art play in these acts of resistance? Which email address can I send my complaint to? 
¯\_(ツ)_/¯

Authoritarian and fascist technologies are deeply embedded in the very systems we interact with daily, from communication platforms, to algorithmic decision making, surveillance and behavioural prediction systems or emerging AI tools. By imposing exclusionary ideologies and elitist mindsets while standardising ways of thought, expression and participation, technology becomes a tool for reinforcing hierarchies — one that links technological proficiency with social worth, often along racial or class lines. It creates systems that determine who gets to participate and who gets left out, with power and privilege tightly interwoven in the algorithms that run it all. In these systems, minorities are represented inadequately or not at all, limiting their ability to make themselves heard. Tech fascism is a power structure that limits our agency.

The concept of the networked individual, a development towards totally individualised nodes 'streamlined' by networked technologies, has allowed accelerated individualism (capitalism) and fascism to merge even further. This perspective sees technology not as a mere tool but as a marker of human worth. Tech fascism utilises technology not just for innovation and progress but as a tool for political manipulation and control, deeply affecting the fabric of democracy and human rights.

So the question is, how can we refuse tech fascism? The potential for resistance isn’t about easy fixes or sweeping victories — it’s about finding ways to challenge the system in imperfect ways. Misdirection, opting out, and pushing back on the assumptions that drive these systems are some of the ways we might start to carve out space for ourselves in order to build something that better aligns with collective needs rather than serving the accumulation of wealth and power.

The Jury

Hito Steyerl

Photo: Leon Kahane

Hito Steyerl is a filmmaker and author.

Nora O’ Murchú

Photo: Nora O’ Murchú

Nora O’ Murchú is a curator and researcher whose work explores how digital infrastructures—software, algorithms, and networks—reshape contemporary culture and politics. Drawing on queer-feminist and postcapitalist theory, her projects examine how technology can reinforce extractivist and authoritarian systems, while also revealing cracks for collective action and dissent.

She has curated exhibitions, residencies, and public programmes at Akademie Schloss Solitude, LABoral, and the Seoul Museum of Art, and served as Artistic Director of transmediale—Europe’s leading festival for art and digital culture—from 2020 to 2024. Her practice questions the boundaries between art and technology, asking how we might reclaim space for agency and collaboration amid the accelerating reconfigurations of techno-social life and the illusions of techno-solutionism. She currently serves as a Professor in the Department of Computer Science and Information Systems at the University of Limerick in Ireland.

https://www.noraomurchu.com

Sam Lavigne

Sam Lavigne, Training Poses, 2018

Sam Lavigne is an artist and educator whose work deals with data, surveillance, cops, natural language processing, and automation. He is a Creative Capital grantee and a recipient of the Pioneer Works Working Artist Fellowship and the Brown Institute’s Magic Grant. He is currently an Assistant Professor of Synthetic Media and Algorithmic Justice at the Parsons School of Design.

https://lav.io