From FoAM's writings behind the algorithms
From: Digital Art and The Glitch:
Glitch as a component of the creative process, where the software/hardware/wetware conflicts play a necessary (even if unexpected) part in the generation of the artwork. The ongoing pursuit of (artistic) development becomes a performance in itself. Here the glitch is a driving force for a play between the human and the machine, while the outcome is simply a fleeting target that can be adjusted and even radically changed on the fly.

We rarely come in contact with the world behind the screen, the reams of code, the protocols and puppet strings: the information which creates this landscape for us. The world behind the screen is suffocating under the burden of the interface. The interface has nested itself on the surface of the screen and appears to be paralysed in that position, not allowing the general user to discover the layers behind it. The interface must be violated: scratched and cracked, or pulled out of the screen into physical space, so that its borders become elastic and transparent, revealing the world behind.

Understanding the chaotic laws of hypercapitalism means understanding that information is anything but absolute, objective or stable. Noise interferes with perfect reproduction, the presumed goal. It interferes with translation from code to code. Viruses, errors and incompatibilities: these insert change into fragile patterns, the web page we are trying to download, the program we are trying to run. Noise is the enemy of the seamless interface. “All that is not information, not redundancy, not form and not restraints is noise, the only possible source of new patterns.” – Bateson

Glitch as a process: where the participants do not represent, but perform, infecting each other's worlds with alien loopholes, (mis)interpretations and errors. The glitch as last resort without a shopping mall attached to it. A forest of live wires.

Noise undermines the illusion of objectness created by the point'n'click interface. The space in between the interfaces is not a concern. And it is exactly in this space where d-art can function as a subversion of the polished make-believe future of the information era.

The code usually hidden from the computer user seems to wriggle its way to the surface of the screen. Code stripped of functionality, code drawn from the innards of computer hardware and set loose across the surface of the screen, code as an end in itself. Miles from the glossy, java-enabled, flash-heavy “interactivity” of the web. It deploys the hidden codes and glyphs of Internet protocol as central aesthetic components, a protocol which, despite the liberatory rhetoric that surrounds it, is put to work in the world enabling largely banal content.

What we need to build is not yet another gate to the 'age of access', but a media de-tox chamber, where we can sweat out the unnecessary icons, protocols and constraining interaction methods… and above all, where we can wash away the content that has polluted both physical and virtual reality before they had a chance to evolve and grow closely together. “we serve no content” – jodi

It lives on the other side of the web, the side filled with incompatibilities and error messages, machines speaking to machines in a language we don't speak. When the standardised interface has been stripped away and the machine spits out all its complaints, that's when we start to interact, trying to comprehend both the machine processes and the brain-waves of its creators.
Here our worlds truly meet, and we engage in clumsy negotiations, trying to translate each other's worlds into each other's languages. Translation becomes a living system, through which the two realities warp into a hybrid, tangled one. The technology alone does not engage with our world. We ask questions and the technology answers, but it answers with something which we fed into it… and we fed into it a rigid, mapped and marked structure of reality, one which does not respond to the living world.

The machine is _not_ a colonial being. Humans are. Especially humans striving for power. Money. Speed. Even culture is not a safe haven any more. Reduced to bits, interchangeable and reproducible, a lot of digital art and culture is just a pretty name for a new commodity: cultural capital.

The mainstream of digital design still thinks of interaction in terms of presentation. However, strange and unexpected interfaces can draw the participant into the generative process of the work (the construction of the now). The one interface that many people don't know how to operate any more is their own body.

Is it broken, or is it meant to sound like that? What we have been calling the “glitch” is usually the enemy. It is the enemy of a stable system, of a seamless user-interface, of the point'n'click universe. The glitch also draws our attention to the arbitrary and constructed nature of information itself. What happens to life in our embodied actual when the object of our investigations becomes the virtual replicator?
From: SutChwon:
a glue layer between indeterminate components. a framework for developing frameworks. (…) it should be decentralised (peer-to-peer), adaptable, flexible. above all, it should focus on doing what computers do well (crunching data) and, as a result, enabling the people who are using it to do what they do well (whatever that may be).

The networked aspect of these projects can frequently be described as a struggle of connections and configurations fuelled by the “ecstasy of communication at a distance”. often, it is the network itself which is the object of fascination, the performer, the stage, the invisible centre of attention focusing geographically dispersed centres of attention.

to be a useful tool, there should definitely be usable, readable interfaces to this slippery layer of glue. ideally, such interfaces would enhance existing ones on their own terms (e.g. a text-based programming interface, or a timeline editing system). in this sense, it is automation to facilitate both transparency (disappearing into current systems) and a wider range of vision (extending the scope and interoperability of the system by increasing what is visible).

the main purpose of this format for data description is to enable otherwise unreadable data to be used. this makes it possible for participants in the network to exchange data without necessarily knowing if the other participants can understand it. since a protocol is essentially a description of a state machine and the magic words to change it, or elicit a response from it, protocol descriptions could be implemented (at least skeletally) in some kind of finite state machine emulator which has assignable connections to other software. the traditional response to data arriving in an unknown format usually fits into two categories: 1) output junk, or 2) crash. the n[tp]M model would facilitate the exchange of unknown or arbitrary data with arbitrary protocols by establishing a protocol for the exchange of abstract descriptions of formats and protocols.
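as a minimal sketch of the last two points, the following Python illustrates a protocol description run by a skeletal finite state machine emulator, so that peers could exchange descriptions rather than hard-coded parsers. all names here (ProtocolDescription, FSMEmulator, the toy 'greeting' protocol) are hypothetical illustrations, not part of SutChwon or the n[tp]M model itself.

```python
# a sketch of "a protocol is essentially a description of a state machine
# and the magic words to change it". the class and protocol names below are
# invented for illustration; nothing here is actual SutChwon code.

from dataclasses import dataclass, field


@dataclass
class ProtocolDescription:
    """an abstract, exchangeable description of a protocol:
    states, plus the 'magic words' that move between them."""
    initial: str
    # transitions: (state, input word) -> (next state, response word)
    transitions: dict = field(default_factory=dict)


class FSMEmulator:
    """skeletal emulator that can run any ProtocolDescription,
    so participants exchange descriptions instead of parsers."""

    def __init__(self, description: ProtocolDescription):
        self.description = description
        self.state = description.initial

    def feed(self, word: str) -> str:
        key = (self.state, word)
        if key not in self.description.transitions:
            # instead of the two traditional responses to unknown data
            # (output junk, or crash), report the unknown word and stay put
            return f"?unknown '{word}' in state '{self.state}'"
        self.state, response = self.description.transitions[key]
        return response


# a toy protocol description, as it might arrive from another peer
greeting = ProtocolDescription(
    initial="idle",
    transitions={
        ("idle", "HELLO"): ("open", "HI"),
        ("open", "DATA"): ("open", "ACK"),
        ("open", "BYE"): ("idle", "GOODBYE"),
    },
)

fsm = FSMEmulator(greeting)
for word in ["HELLO", "DATA", "NOISE", "BYE"]:
    print(word, "->", fsm.feed(word))
```

feeding it an unrecognised word ("NOISE") shows a third response beyond junk or crash: the emulator names what it could not understand and keeps its state.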
From: The Animist Paradox:
Based on three assumptions: 1) that we are living in a “robot cargo cult”, 2) that animism and materialism are incompatible, and 3) that we need ways of better understanding contextually sensitive human-machine complexes if we are to successfully design semi-autonomous artefacts. We are building machines to live up to our expectations of how they should behave, based on what we have been told they can do. Pop culture, cinema and science fiction are thick with description, yet our artefacts are often created in “imitative rituals that are conducted without understanding the underlying cause of a phenomenon.” The animist position suggests ethics as an essential part of robotic development, while a materialist perspective suggests a flexible, rewritable moral software as something independent. This suggests a series of shifts, from…
accuracy to quirks
repeatability to consistency
error to ambiguity
processing to play
control to emergence
mechanism to dynamism
fragility to redundancy
mono- to multi-
From: Multiple Translations | Entangled Aphasia:
Instead of 'content', we propose that 'context' should be the driving force behind future media productions. We understand 'context' to be the interrelated conditions in which an event occurs, a framework or a setting that actively transforms the objects and actions that it encompasses. Context is not a passive or neutral repository of related content parts, but a generative potential that can bring an 'intermedia' environment into existence.

Following this line of thought, the term 'author', often also labelled 'content provider', should be replaced by the term 'context provider': an entity (or group of entities) facilitating the generation of coherent and responsive environments, in which media are not containers but actuators of interconnected events. In this case, the terms 'origin' and 'creation' (usually attributed to the notion of authorship) are distributed between the facilitators, the entities experiencing the environment, and a range of computational subsystems, protocols and devices responsible for bringing the environment into being.

The future of media tools does not lie in developing more specialised tools for representation, but in supporting the emergence of tools that enable deep interconnection, that enhance complex relationships of multiple components or subsystems, and that allow more generative or evolving media communication, transformation and translation.

“The interesting thing for me is that changing something in one of the domains (audio, video, graphics) automatically affects the output of the other domain. I hope the outcome of such an experiment would be a very “elastic” system that can be pushed into various extreme/frantic states that could change according to real-world sensor data or internet-sucked analytical / web-user-based data and then re-develops into its sleeping state after stopping firing data at it.”

Ideally the intermedia tools would encompass multiple platforms, and be scalable from a text-only interface to an immersive Virtual Environment. The media should be scaled or transformed by the environment in which they are experienced. The final media output should be the result of negotiations between internal rules associated with each element (or group of elements) and the external conditions/obstacles created by the environment.

High-dimensional geometries can be quite easily represented in a computer and, if carefully structured, could contain a large amount of implicit information about the contents of the space. Manipulations and traversals of this geometry would be the computer's equivalent of ‘My Very Elderly Mother Just Served Us Pistachio Nuts’.
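As a loose sketch of this last idea (and only a sketch: the 8-dimensional feature space and the media elements below are invented for illustration), a greedy traversal of points in a high-dimensional space can be read as a mnemonic: an ordering whose sequence implicitly encodes the proximity structure of the space, much as the planetary mnemonic encodes the order of the planets.

```python
# a sketch of a traversal of a high-dimensional space acting as a mnemonic.
# the 'media_space' and its 8-dimensional coordinates are hypothetical.

import math


def distance(a, b):
    """Euclidean distance between two points of any dimension."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def traverse(space, start):
    """greedy nearest-neighbour walk through all points: a 'sentence'
    whose word order carries implicit information about proximity."""
    path = [start]
    remaining = {k: v for k, v in space.items() if k != start}
    while remaining:
        here = space[path[-1]]
        nearest = min(remaining, key=lambda k: distance(here, remaining[k]))
        path.append(nearest)
        del remaining[nearest]
    return path


# hypothetical media elements placed in an 8-dimensional feature space
media_space = {
    "drone":   (0.1, 0.9, 0.2, 0.0, 0.3, 0.1, 0.8, 0.2),
    "pulse":   (0.2, 0.8, 0.1, 0.1, 0.4, 0.2, 0.7, 0.3),
    "texture": (0.9, 0.1, 0.7, 0.8, 0.2, 0.9, 0.1, 0.6),
    "glyph":   (0.8, 0.2, 0.8, 0.7, 0.1, 0.8, 0.2, 0.5),
    "field":   (0.5, 0.5, 0.5, 0.4, 0.6, 0.5, 0.5, 0.4),
}

# the resulting ordering groups near neighbours together, so the sequence
# itself is implicit information about the contents of the space
print(traverse(media_space, "drone"))
```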
From: Advanced Error Engineering: