We are developing an adaptive and programmable media-flow ARchitecture for Interactive Arts (ARIA) to enable real-time control of audio, video, and lighting on an intelligent stage. The stage is being equipped with a matrix of floor sensors for object localization, microphone arrays for sound localization and beamforming, and a motion-capture system. ARIA provides an interface for specifying intended mappings from sensory inputs to audio-visual responses. Based on these specifications, the sensory inputs are streamed, filtered, and fused to actuate a controllable projection system, a surround-sound system, and a lighting system. The actuated responses occur in real time and must satisfy the QoS requirements of live performance. In this paper, we present the ARIA quality-adaptive architecture. We model the basic information unit as a data object, consisting of a meta-data header and an object payload, streamed between nodes in the system, and we model media-stream processing as a directed acyclic network. We define performance metrics for output precision, resource consumption, and end-to-end delay. The filters and fusion operators are being implemented by quality-aware signal-processing algorithms. The appropriate node behavior is chosen at run time to achieve the QoS requirements and to adapt to the properties of the input objects. For this purpose, ARIA utilizes a two-phase approach: static pre-optimization followed by dynamic run-time adaptation.
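To make the data model concrete, the following is a minimal sketch of the ideas above: a data object carrying a meta-data header and a payload, streamed through nodes of a directed acyclic processing network. All class names, header fields, and the fusion operator here are our illustrative assumptions, not ARIA's actual API.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class DataObject:
    # Meta-data header: quality attributes such as precision (hypothetical fields).
    header: Dict[str, float]
    payload: bytes

@dataclass
class Node:
    """A filter or fusion operator node in the media-stream DAG (sketch)."""
    name: str
    operator: Callable[[List[DataObject]], DataObject]
    downstream: List["Node"] = field(default_factory=list)

    def process(self, inputs: List[DataObject]) -> DataObject:
        out = self.operator(inputs)
        for child in self.downstream:  # forward the result along DAG edges
            child.process([out])
        return out

# Example fusion operator (assumed): average input precisions, merge payloads.
def fuse(objs: List[DataObject]) -> DataObject:
    precision = sum(o.header["precision"] for o in objs) / len(objs)
    return DataObject({"precision": precision}, b"".join(o.payload for o in objs))

# Two sensory inputs (e.g. floor-sensor and microphone-array readings) fused
# into one object whose header reflects the combined output precision.
floor = DataObject({"precision": 0.8}, b"floor")
mic = DataObject({"precision": 0.6}, b"mic")
fused = Node("fusion", fuse).process([floor, mic])
print(round(fused.header["precision"], 3))  # → 0.7
```

In a fuller model, each node would expose several operator implementations of different cost and quality, and the two-phase optimizer would select among them at run time to meet the end-to-end delay and precision targets.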