k2.2 is a voice-topological interface for gestural navigation in linguistic space. The hand (or any other part of the body) serves as the speech organ during the articulation process. The device is operated by sensing the position of the hands in space, their relative height, and other parameters, which are mapped to jaw and tongue position in the mouth as well as to pitch and rhythm, thus forming a gesturally navigable phonetic space. Because the device is based on infrared sensors, it can even be operated through the glass of a window. Phoneme production follows phonetic laws. A spoken language is produced through the implementation of musical scales and speech rhythms whose context of meaning is characterized not by the conveyance of information but by the abstraction of the voice in tonal linguistic space.
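The mapping described above can be sketched in code. The following is an illustrative sketch only, not the actual k2.2 implementation: it assumes normalized infrared sensor readings (0.0–1.0), a hypothetical pentatonic scale standing in for the "musical scales" mentioned in the text, and standard reference formant values for the vowels /i/ and /a/; all function names, ranges, and the choice of scale are assumptions.

```python
# Illustrative sketch (NOT the actual k2.2 firmware): map normalized
# infrared sensor readings onto phonetic parameters.

# A pentatonic scale (semitone offsets) stands in for the "musical
# scales" the text mentions; pitch is quantized to it.
PENTATONIC = [0, 2, 4, 7, 9]

def quantize_to_scale(height, scale=PENTATONIC, base_midi=48, octaves=2):
    """Map a normalized hand height (0.0-1.0) to the nearest note of
    the scale, returned as a MIDI note number."""
    steps = len(scale) * octaves
    idx = min(int(height * steps), steps - 1)
    octave, degree = divmod(idx, len(scale))
    return base_midi + 12 * octave + scale[degree]

def hand_to_vowel_formants(openness):
    """Interpolate between a closed /i/ and an open /a/ formant pair
    (in Hz) from a normalized jaw-openness value (0.0 = closed,
    1.0 = open). Reference values follow common phonetics tables."""
    f1 = 270 + openness * (730 - 270)      # F1 rises as the jaw opens
    f2 = 2290 - openness * (2290 - 1090)   # F2 falls toward /a/
    return round(f1), round(f2)
```

A synthesis backend would then drive a vocal-tract or formant synthesizer with these pitch and formant values on every sensor frame; that part is omitted here.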
Topology Of Voice: The voice as a private expression in talking and singing typically also produces other articulations in various extremities of the human body. The imitation of speech by machines was and is a focus of past and present research. The separation of meaning from sonic appearance has long been seen in art history, though mostly as the performed production of composed voice(-less) arias. In the most recent technical developments, the production of meaning is the main focus of data-processing applications and circuits. In contrast to this approach, the talking machines of Michael Markert focus on the expression and experience of vocal sonic qualities, and on the production of vocoded sounds that try to go beyond mere mimicry of the human voice. Articulatory-topological phonetics deals with the speech process — parts of the body serve as speech organs during the articulation process. It is therefore historically linked to Kempelen’s motif of voice generation for the voiceless: speech generation for the speechless.
k2.2 is a Phonetic Kemp Inc. production.
Michael Markert is a media artist specializing in programming and electronics. He lives in Nuremberg, Germany. His research into intuitive musical interfaces began with a diploma in Multimedia / Communications Design. Since then he has developed various interactive sensory devices, which he has used for installations and as musical instruments. The focus of his work is exploring harmonic musical control through intuitive and interactive realtime sensory processing, thereby overruling hierarchic receptional mechanisms in art. Since 2005 he has been a member of the Urban Research Institute for Public Art and Urban Ethology (Intermedia), founded by Georg Winter. In 2008 he graduated with a second diploma from the College of Fine Arts Nuremberg, where he currently teaches; he also teaches at the Bauhaus University Weimar / Faculty of Media. http://www.audiocommander.de