Learning by Means of an Interactive Multimodal Environment

Zanolla S • Rodà A • Canazza S • Foresti GL
2012
  • conference object

Abstract
A novel digital musical interface based on sound source localization using a microphone array is presented. It allows a performer to plan and conduct the expressivity of a performance by controlling an audio processing module in real time through the spatial movement of a sound source (i.e., voice, traditional musical instruments, and sounding mobile devices). The prototype interface consists of an adaptive parameterized Steered Response Power Phase Transform (SRP-PHAT) with a Zero-Crossing Rate (ZCR) threshold and a Kalman filter that provides a more accurate estimate and tracking of the source position when the source is moving. Real-time software based on a Max external object was developed to test the system in a real-world, moderately reverberant and noisy environment, focusing on the performance of pseudo-periodic sounds in a multisource scenario.
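To make the abstract's pipeline concrete, below is a minimal illustrative sketch of the two post-processing stages it mentions: a ZCR gate that accepts localization estimates only for pseudo-periodic (low-ZCR) frames, and a constant-velocity Kalman filter that smooths and tracks the estimated 2-D source position. All names, state dimensions, and parameter values are assumptions for illustration, not the authors' implementation or the actual SRP-PHAT front end.

```python
# Sketch only: ZCR gating + constant-velocity Kalman filter for smoothing
# 2-D source position estimates. Hypothetical parameters and structure.
import numpy as np

def zcr(frame: np.ndarray) -> float:
    """Fraction of sign changes in an audio frame (zero-crossing rate)."""
    signs = np.sign(frame)
    signs[signs == 0] = 1
    return float(np.mean(signs[:-1] != signs[1:]))

class ConstantVelocityKF:
    """Kalman filter with state [x, y, vx, vy] and position-only measurements."""
    def __init__(self, dt: float = 0.05, q: float = 1e-2, r: float = 5e-2):
        self.x = np.zeros(4)                                   # state estimate
        self.P = np.eye(4)                                     # state covariance
        self.F = np.eye(4); self.F[0, 2] = self.F[1, 3] = dt   # motion model
        self.H = np.zeros((2, 4)); self.H[0, 0] = self.H[1, 1] = 1.0
        self.Q = q * np.eye(4)                                 # process noise
        self.R = r * np.eye(2)                                 # measurement noise

    def predict(self) -> None:
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z: np.ndarray) -> np.ndarray:
        # Fuse the (noisy) localization measurement z = [x, y].
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                                      # smoothed position

def track(frames, raw_positions, zcr_max: float = 0.3) -> np.ndarray:
    """Accept a localization estimate only when the frame looks pseudo-periodic
    (low ZCR); otherwise propagate the prediction alone."""
    kf = ConstantVelocityKF()
    out = []
    for frame, z in zip(frames, raw_positions):
        kf.predict()
        if zcr(np.asarray(frame)) < zcr_max:
            out.append(kf.update(np.asarray(z, dtype=float)))
        else:
            out.append(kf.x[:2].copy())   # skip the measurement, keep prediction
    return np.array(out)
```

In this sketch the ZCR threshold plays the role described in the abstract of rejecting frames unlikely to come from a pseudo-periodic source, while the Kalman filter keeps the trajectory coherent when individual SRP-PHAT estimates are noisy or missing.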
WOS
WOS:000416743800015
Archive
http://hdl.handle.net/11390/865695
info:eu-repo/semantics/altIdentifier/scopus/2-s2.0-84944883562
Rights
metadata only access
Subjects
  • Interactive Multimoda...

Views
2
Acquisition date
Apr 19, 2024