Merging Places: A Real-Time Distributed Live Reverberation Chamber
Royal College of Music in Stockholm, Department of Composition and Conducting.
Royal College of Music in Stockholm, Department of Composition and Conducting. ORCID iD: 0000-0003-1958-8484
(Chalmers University of Technology)
Number of Authors: 4
2024 (English)
In: 2024 16th International Conference on Quality of Multimedia Experience (QoMEX), 2024
Conference paper, Oral presentation with published abstract (Refereed)
Abstract [en]

We present Auxtrument, a prototype instrument that allows audiences to experience the acoustic qualities of remote locations. Using the metaphor of a reverberation chamber, artists send signals from the mixer's bus or aux via the Auxtrument's network connections from a concert hall to a few different locations. At each location the signal is played through loudspeakers and captured, colored by the acoustics and noises of the place, via a stereo or ambisonics microphone. The signal is sent back and played for the audience in a surround sound system conveying the spatial qualities of the places. The Auxtrument allows us to merge and layer the different locations in the concert hall. However, this arrangement places great demands on the network. The audio signals need to be high-resolution to preserve the inherent quality of the sounds, which creates large streams. The system must be equipped to work over different types of networks, and to enable any location, the system must work with mobile devices. After ruling out several commercial and open-source solutions, we built the Auxtrument with web technologies: mainly node.js, WebSockets, WebRTC, and WebAudio.
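This record does not include implementation details, but as a rough sketch of the kind of node.js/WebSocket signal relay the abstract describes, the following illustrates forwarding binary audio frames between a concert-hall sender and remote locations. All specifics here (the ws package, the port, the room query parameter, and the raw-frame message format) are illustrative assumptions, not the authors' actual Auxtrument implementation.

// Minimal sketch, assuming the "ws" package and a raw-binary frame format.
const { WebSocket, WebSocketServer } = require('ws');

const wss = new WebSocketServer({ port: 8080 });
const rooms = new Map(); // room name -> set of connected sockets

wss.on('connection', (socket, request) => {
  // A client declares its role via ?room=..., e.g. the concert-hall mixer
  // send joins "hall" and a remote location joins "site-1", "site-2", ...
  const room = new URL(request.url, 'ws://localhost').searchParams.get('room') || 'hall';
  if (!rooms.has(room)) rooms.set(room, new Set());
  rooms.get(room).add(socket);

  socket.on('message', (frame, isBinary) => {
    // The hall's aux send fans out to every remote site; each site's
    // microphone return (colored by its acoustics) goes back to the hall.
    const targets = room === 'hall'
      ? [...rooms].filter(([name]) => name !== 'hall').flatMap(([, peers]) => [...peers])
      : [...(rooms.get('hall') || [])];
    for (const peer of targets) {
      if (peer !== socket && peer.readyState === WebSocket.OPEN) {
        peer.send(frame, { binary: isBinary });
      }
    }
  });

  socket.on('close', () => rooms.get(room).delete(socket));
});

In practice the remote clients would run in a browser, capturing and playing audio with WebAudio and streaming it over WebRTC or WebSockets, as the abstract notes; this sketch only shows the server-side fan-out.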

Place, publisher, year, edition, pages
2024.
Keywords [en]
merged reality, IoT, distributed performance
National Category
Music
Identifiers
URN: urn:nbn:se:kmh:diva-5652
DOI: 10.1109/QoMEX61742.2024.10598285
ISBN: 979-8-3503-6158-2 (electronic)
ISBN: 979-8-3503-6159-9 (print)
OAI: oai:DiVA.org:kmh-5652
DiVA, id: diva2:1923228
Conference
2024 16th International Conference on Quality of Multimedia Experience (QoMEX)
Projects
IRESAP
Funder
Knowledge Foundation
Available from: 2024-12-20 Created: 2024-12-20 Last updated: 2024-12-20

Open Access in DiVA

fulltext (101 kB), 22 downloads
File information
File name: FULLTEXT01.pdf
File size: 101 kB
Checksum: SHA-512
1e9673a249b1143938a9f9af49bbf20cb631192709fb390568732109aa540d4149b5d03940c98c71686b9ab8c807bceb38350ecc998703e50ae8ef374ffae66a
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text

Search in DiVA

By author/editor
Lindell, Rikard
Frisk, Henrik
By organisation
Department of Composition and Conducting
Music

Search outside of DiVA

Google
Google Scholar
Total: 22 downloads
The number of downloads is the sum of all downloads of full texts. It may include, for example, previous versions that are now no longer available.
