Workshops

Workshops A

A-1 - Introduction To Music Information Retrieval Using Essentia.js

Tuesday 6th Starting At 18:30 (CEST) – 180 Minutes

Albin Correya (Universitat Pompeu Fabra); Dmitry Bogdanov (Universitat Pompeu Fabra); Luis Joglar-Ongay (SonoSuite & Universitat Pompeu Fabra); Jorge Marcos-Fernández (Universitat Pompeu Fabra)

Web Audio is an intrinsic part of the next generation of applications for multimedia content creators, designers, researchers, music tutors, artists, and consumers. New advances in web audio and software for audio analysis, music information retrieval (MIR), and machine learning open up exciting possibilities. We have recently released Essentia.js, based on one of the most popular MIR libraries for native platforms. We have also created various pre-trained deep learning models for inference with TensorFlow.js. In this tutorial, we introduce the key concepts in MIR and cover the basics of using the library for music and audio analysis. We will show example use cases and assist the participants in building their own MIR web applications.
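
As a rough illustration of the kind of analysis covered in the tutorial, the sketch below decodes an audio file with the standard Web Audio API and runs key and tempo estimation with Essentia.js. The exact Essentia.js calls (Essentia, EssentiaWASM, arrayToVector, KeyExtractor, PercivalBpmEstimator) follow the library's published examples but should be read as assumptions, not the tutorial's actual material; the file path is hypothetical.

```ts
import { Essentia, EssentiaWASM } from 'essentia.js'; // assumed package entry points

// Decode an audio file with the standard Web Audio API.
const audioContext = new AudioContext();
const response = await fetch('audio/example.mp3'); // hypothetical file path
const audioBuffer = await audioContext.decodeAudioData(await response.arrayBuffer());

// Wrap the Essentia WASM backend (constructor usage assumed from the Essentia.js docs).
const essentia = new Essentia(EssentiaWASM);

// Convert the first channel to an Essentia vector and run two example algorithms.
const signal = essentia.arrayToVector(audioBuffer.getChannelData(0));
const key = essentia.KeyExtractor(signal);           // e.g. { key, scale, strength, ... }
const tempo = essentia.PercivalBpmEstimator(signal); // e.g. { bpm }

console.log(`Estimated key: ${key.key} ${key.scale}, tempo: ${tempo.bpm} BPM`);
```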

A-2 - Browser-Based Collaborative Live Coding With Glicol: A Graph-Oriented Live Coding Language Written In Rust

Tuesday 6th Starting At 18:30 (CEST) – 180 Minutes

Qichao Lan and Alexander Refsum Jensenius (RITMO – Department of Musicology – University of Oslo)

In this workshop, participants will be invited to try out Glicol, a graph-oriented live coding language written in Rust.

Participants will get familiar with the syntax of Glicol, as well as its browser-based environment developed with WebAssembly, AudioWorklet, and SharedArrayBuffer. The browser-based interface also introduces a new form of interaction for collaborative live coding. After that, participants can brainstorm new features together and learn how to customise the language. In addition, there will be a scheduled live coding performance with Glicol at the conference, and workshop participants can choose to join as co-performers.
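
The engine architecture mentioned above (a WebAssembly DSP core running inside an AudioWorklet, with a SharedArrayBuffer for communication between the UI and audio threads) can be pictured with the generic sketch below. This is not Glicol's actual code: the processor file name, buffer layout, and handshake are hypothetical, and it only outlines how such an engine is commonly wired up.

```ts
// Generic wiring of a WASM-powered AudioWorklet engine (not Glicol's actual implementation).
const audioContext = new AudioContext();

// Load a worklet processor that instantiates the WASM DSP engine on the audio thread.
await audioContext.audioWorklet.addModule('engine-processor.js'); // hypothetical file

// A SharedArrayBuffer lets the UI thread hand live-coded text to the audio thread
// without postMessage latency (requires cross-origin isolation headers).
const shared = new SharedArrayBuffer(4096);
const flag = new Int32Array(shared, 0, 1); // 0 = idle, >0 = number of pending bytes
const payload = new Uint8Array(shared, 4); // UTF-8 encoded code string

const engineNode = new AudioWorkletNode(audioContext, 'engine-processor', {
  processorOptions: { shared }, // the processor polls this buffer for new code
});
engineNode.connect(audioContext.destination);

// Send a line of live-coded text to the engine (naive handshake, for illustration only).
function runCode(code: string): void {
  const bytes = new TextEncoder().encode(code);
  payload.set(bytes);
  Atomics.store(flag, 0, bytes.length); // signal the audio thread that new code is ready
}
```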

A-3 - Creating Telematic Musical Performances Through Max And The Web With Collab-Hub

Tuesday 6th Starting At 18:30 (CEST) – 180 Minutes

Nick Hwang (The Media Arts & Game Development Program, University of Wisconsin-Whitewater); Anthony T. Marasco (School of Music, The University of Texas Rio); Eric Sheffield (Department of Music and Theater Arts, SUNY Broome)

This workshop serves as an introduction to building remote/local networked audiovisual performances and pedagogical tools using Collab-Hub, a package for remote collaboration based on Node.js and implemented both within Max and as a web-based interface. Collab-Hub is a system built for sharing data that eliminates the need for collaborators to know their own or each other's IP addresses. It has applications in many performance paradigms, including telematic performance, laptop orchestra, mixed ensemble with digital elements, distributed control, net-to-physical interaction, and more.
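
To picture the relay pattern Collab-Hub builds on (named data shared through a Node.js server, so peers never exchange IP addresses), here is a generic sketch using socket.io directly. The server URL, event names, and payload shape are hypothetical and do not reflect Collab-Hub's actual API; it only illustrates how shared control data can drive a Web Audio parameter.

```ts
import { io } from 'socket.io-client';

// A simple Web Audio chain to be driven by remote control data.
const audioContext = new AudioContext();
const filterNode = new BiquadFilterNode(audioContext, { type: 'lowpass' });
filterNode.connect(audioContext.destination);

// Connect to a shared relay server; peers only know the server, never each other's IPs.
const socket = io('https://example-relay-server.org'); // hypothetical server URL

// Publish a named control value (e.g. a fader position) to everyone in a room.
function sendControl(name: string, value: number): void {
  socket.emit('control', { room: 'wac-demo', name, value }); // event/payload shape is illustrative
}

// React to controls shared by other collaborators.
socket.on('control', (msg: { name: string; value: number }) => {
  if (msg.name === 'cutoff') {
    filterNode.frequency.setTargetAtTime(msg.value, audioContext.currentTime, 0.05);
  }
});
```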

Workshops B

B-1 - Live Streaming Platform As An Instrument, Experience And Pastime

Wednesday 7th Starting At 15:00 (CEST) – 90 Minutes

Louis Foster

This paper introduces a design and the implementation of a proof of concept for a sonic cyberspace. The purpose is to explore new media and find potential in our existing technology and infrastructure. The central themes of this cyberspace are collective collaboration and documenting the process of developing speculative creativity platforms. It is found that some streaming technology, such as Icecast, is not suitable for more complex use cases. The paper proposes an appropriation of modern streaming protocols and discusses the potential of incorporating out-of-band metadata to explore unique applications of this design. The paper discusses how the attitude towards composition transforms when the ability to dominate the experience is countered by randomness. Additionally, the design suggests that only the creative experience can have no latency as well as a certainty of realness, questioning the relevance of real-time and live streaming for performance and collaboration in music.

B-2 - Creating And Developing Distributed Music Application Using The Soundworks Framework

Wednesday 7th Starting At 17:00 (CEST) – 90 Minutes

Benjamin Matuszewski (STMS Ircam-CNRS-Sorbonne Université)

This workshop will give participants the opportunity to learn the basics of soundworks, a full-stack JavaScript framework for developing distributed and synchronized web-audio applications both in the browser and on embedded hardware. After a short presentation of the framework's possibilities, architecture, and ecosystem, the workshop will propose a hands-on session around the implementation of a simple application. The proposed application will be designed to focus on a number of key features of the framework, such as the distributed state management and the plug-in system, in particular through the implementation and usage of a synchronized scheduling system. The workshop will conclude with a discussion and Q&A session.
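
To give a flavour of the distributed state management mentioned above, here is a minimal client-side sketch. The schema name, fields, and exact calls (Client, stateManager.attach, subscribe, set) are assumptions based on soundworks' public documentation and may differ by version; this is not the workshop's actual material.

```ts
import { Client } from '@soundworks/core/client';

// `config` is normally injected by the soundworks application template (assumed here).
declare const config: any;

// Minimal soundworks client that attaches to a shared state published by the server.
const client = new Client(config);
await client.start();

// Attach to a hypothetical 'globals' state registered on the server side.
const globals = await client.stateManager.attach('globals');

// React to synchronized updates coming from any other client or the server.
globals.subscribe((updates: Record<string, unknown>) => {
  if ('tempo' in updates) {
    console.log('new shared tempo:', updates.tempo);
  }
});

// Propagate a local change to every connected client.
await globals.set({ tempo: 120 });
```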

B-3 - Understanding And Visualizing Speech Quality

Wednesday 7th Starting At 15:00 (CEST) – 90 Minutes

Jayson DeLancey and Joan Serrà
Dolby.io (Gold Sponsor)

In this workshop, we describe building a project that integrates Web Audio with REST APIs. We highlight an approach for quantifying audio quality: a semi-supervised algorithm that helps assess changes in speech-based audio quality.
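
A generic outline of this Web Audio + REST integration might look like the sketch below: capture a short speech clip in the browser, then post it to an analysis endpoint. The endpoint URL, authorization scheme, and response fields are purely hypothetical placeholders, not the actual Dolby.io API.

```ts
// Record a short clip with the MediaStream Recording API, then post it to a
// speech-quality analysis endpoint (URL and response shape are hypothetical).
const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
const recorder = new MediaRecorder(stream);
const chunks: Blob[] = [];

recorder.ondataavailable = (event) => chunks.push(event.data);
recorder.onstop = async () => {
  const clip = new Blob(chunks, { type: recorder.mimeType });

  const response = await fetch('https://api.example.com/v1/quality', { // placeholder endpoint
    method: 'POST',
    headers: { Authorization: 'Bearer <token>' },
    body: clip,
  });

  const report = await response.json();
  console.log('estimated speech quality score:', report.score); // field name is illustrative
};

recorder.start();
setTimeout(() => recorder.stop(), 5000); // capture five seconds of speech
```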

B-4 - Presentation And Tutorial On WebAudio Modules 2.0, A Standard For Interoperable WebAudio Plug-Ins

Wednesday 7th Starting At 15:00 (CEST) – 180 Minutes

Michel Buffa (Université Côte d’Azur); Shihong Ren (Université Jean Monnet – Saint-Étienne); Steven Yi; Owen Campbell; Jari Kleimola (Web Audio Modules); Stéphane Letz (Grame); Hugo Mallet (53js)

In the past, two standards for WebAudio plug-ins existed, with a certain degree of compatibility: WAP (for WebAudio Plugins) and WAM (for WebAudio Modules). Such plugins could be used in different hosts, including a commercial online DAW (AmpedStudio.com).

The two were related: some authors worked on both projects, and WAMs were a particular case of WAPs, but the situation was somewhat confusing.

All the people involved (Jari Kleimola and Oliver Larkin from WebAudioModules.org, engineers from the online DAW AmpedStudio.com, Michel Buffa and Shihong Ren, Steven Yi from Csound, Stéphane Letz and Yann Orlarey from the FAUST DSL team, and 53JS.com, a small French company) decided to merge and unify their work in early 2020.

Now comes WebAudio Modules 2.0 (aka WAM 2.0), the unification of the previous standards. It comes in the form of a Git repository with an SDK and many examples of plugins written in JavaScript and TypeScript (some using the React framework), as well as plugins written in FAUST and in Csound.
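
As a hint of what hosting an interoperable WAM 2.0 plugin looks like, here is a minimal host-side sketch. It follows the patterns shown in the public WAM examples, but the exact SDK entry points (initializeWamHost, createInstance, createGui) and the plugin URL should be treated as assumptions rather than a definitive usage of the standard.

```ts
import { initializeWamHost } from '@webaudiomodules/sdk'; // assumed SDK entry point

const audioContext = new AudioContext();

// A host "group" identifies which plugin instances belong to this host.
const [hostGroupId] = await initializeWamHost(audioContext);

// Dynamically import a plugin bundle; a WAM exposes a WebAudioModule factory as its default export.
const { default: PluginFactory } = await import('https://example.com/my-wam/index.js'); // hypothetical URL
const plugin = await PluginFactory.createInstance(hostGroupId, audioContext);

// The plugin exposes a regular AudioNode, so it patches into any Web Audio graph.
const source = new OscillatorNode(audioContext);
source.connect(plugin.audioNode).connect(audioContext.destination);
source.start();

// Optionally mount the plugin's GUI (a DOM element) into the host page.
const gui = await plugin.createGui();
document.body.appendChild(gui);
```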