graduate_seminars [2016/04/22 18:44] – created mtlocal
~~NOTOC~~

====== Graduate Seminars Taught by Music Technology Staff ======

===== MUMT 501: Digital Audio Signal Processing =====

**Philippe Depalle**

Discrete-time signal processing concepts and techniques. Discrete-time Fourier transform and series, linear time-invariant systems, digital filtering, spectral analysis of discrete-time signals, and the z-transform.

Prerequisite:

===== MUMT 605: Digital Sound Synthesis and Audio Processing =====

**CRN: 4082**

**Prof. Philippe Depalle**

Most digital sound synthesis methods and audio processing techniques are based on the spectral representation of sound signals. This seminar starts with a theoretical and practical study of spectral representation,

===== MUMT 610: Interactive Music Systems =====

**CRN: 5347**

**Kojiro Umezaki and Bruce Pennycook**

This seminar explores topics central to the research areas of machine listening and machine composition from technical and aesthetic perspectives.

Techniques including rule-based, symbolic processing and network-based,

Machine listening topics will include beat tracking, meter induction, key induction, score following, segmentation,

Five special presentations will be given by Bruce Pennycook on the following topics:

  * Principles and aesthetics of interactive music
  * Impact of interactive music on performance practice
  * Impact of interactive music on compositional methods
  * Systems design (composer/
  * New directions and possibilities:

There will be one examination based on the principal readings, one paper, an individual presentation,

===== MUMT 616: Timbre as a Form-Bearing Element in Music =====

**Prof. Stephen McAdams**

Music theoretic, performance-related,

This seminar is open to graduate students in music theory, composition,

===== MUMT 617: Cognitive Dynamics of Music Listening =====

**Prof. Stephen McAdams**

Music theoretic, performance-related,

Prerequisites:

===== MUMT 618: Computational Modeling of Musical Acoustic Systems =====

**Prof. Gary Scavone**

This seminar will focus on methods for discrete-time modeling of musical acoustic systems. Topics to be covered will include discretization techniques, lumped vs. distributed system characterizations,

===== MUMT 619: Input Devices for Musical Expression =====

**Prof. Marcelo Wanderley**

Review of basic technologies used in the design of input devices for musical expression. Discussion of the most common types of electronic sensors and associated conditioning circuits, with examples of their application in several gestural controllers presented in the literature. Students should have some prior knowledge of analog electronics.

===== MUMT 620: Human-Computer Interaction - Gestural Control of Sound Synthesis =====

**Prof. Marcelo Wanderley**

Computers have long been able to synthesize high-quality sound in real time. The question nowadays is how to play the computer as a real-time instrument. To answer this question, the analysis of performer gestures and the design of digital musical instruments are essential steps towards defining the interaction possibilities between the performer and the machine. This seminar presents the basic notions of human-computer interaction (HCI) in complex, multi-parametric contexts such as computer music and interactive live performance.

Specifically, the seminar will cover:

  * The review of basic topics on sound synthesis with respect to real-time control.
  * The analysis of the existing literature on the design of input devices in HCI and possible applications of this knowledge to the design and evaluation of new interfaces for musical expression.
  * Possibilities regarding gestural acquisition:
  * Mapping strategies relating gestural variables to synthesis variables and their influence on instrument expressiveness.

===== MUMT 621: Music Information Acquisition =====

**Ichiro Fujinaga**

This seminar will investigate the current research activities in the area of music information acquisition,
Each student will be expected to present various music information acquisition,
Potential topics include: Themefinder,

===== MUMT 622: Time-Frequency and Parametric Representations of Sounds =====

**Prof. Philippe Depalle**

This seminar presents current research trends in time-frequency representations and parametric modeling in the context of music and audio applications. A specific focus is placed on the analysis of sounds using parametric methods. Students should have prior knowledge of sound analysis and resynthesis techniques and of digital signal processing.

===== MUGS 695: Special Topic Seminar: Digital Musical Instruments - Technology, Performance and Composition =====

**Profs. Sean Ferguson and Marcelo Wanderley**

Until the beginning of the 20th century, the design of musical instruments relied upon mechanical systems and the acoustical properties of tubes, strings, and membranes. With the advent of electricity,

This course will focus on systems that use the computer as the sound-generating device, a choice that offers the flexibility of a general-purpose architecture able to implement different synthesis techniques. An instrument that uses computer-generated sound is known as a digital musical instrument and consists of a control surface driving in real time the parameters of a synthesis algorithm implemented in the computer. The synthesis parameters are controlled using input devices, or gestural controllers,

As a consequence,

This course will deal with the various issues relating to new digital musical instruments in the following three areas: a) Music Technology, b) Performance, and c) Composition. Course work will consist of collaborative projects involving students in Composition,

===== MUMT 609: Music, Media and Technology Project =====

**Any Music Technology Professor**

Independent Music Technology project. Students will prepare a statement of objectives, a comprehensive project design and a schedule of work, and will undertake the project on appropriate music technology platforms.