MySong: Automatic Accompaniment for Vocal Melodies
by Stanford
Video Lecture 15 of 18
Date Added: August 30, 2009

Lecture Description

May 9, 2008 lecture by Dan Morris for the Stanford University Human-Computer Interaction Seminar (CS547).

MySong is a system that automatically chooses chords to accompany a vocal melody. A user with no musical experience can create a song with instrumental accompaniment just by singing into a microphone, and can experiment with different styles and chord patterns through interactions designed to be intuitive for non-musicians. Dan Morris describes how MySong works, discusses results from a recent usability study, and plays audio examples demonstrating that non-musicians can indeed use the system as a powerful creative tool.
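The talk itself covers how the chord-selection model works; as rough context, the following is a minimal sketch of melody-conditioned chord choice in the spirit of the hidden-Markov-model formulation MySong is built on: each measure's sung pitch classes are scored against a small chord vocabulary, and Viterbi decoding picks the most likely chord sequence. The chord set, probability tables, and function names (choose_chords, emission_prob) are invented for illustration and are not taken from the talk or from MySong's trained model.

import numpy as np

# Toy chord vocabulary: each chord is described by its pitch classes.
# These definitions and the probability tables below are illustrative
# placeholders, not the statistics MySong actually learns from data.
CHORDS = {
    "C":  {0, 4, 7},
    "F":  {5, 9, 0},
    "G":  {7, 11, 2},
    "Am": {9, 0, 4},
}
NAMES = list(CHORDS)

def emission_prob(chord, melody_pitches):
    """Score how well one measure of sung notes fits a chord: the fraction
    of sung pitch classes that are chord tones, smoothed to stay nonzero."""
    tones = CHORDS[chord]
    hits = sum(p % 12 in tones for p in melody_pitches)
    return (hits + 0.5) / (len(melody_pitches) + 1.0)

def choose_chords(measures, transition, initial):
    """Viterbi decoding: pick the most likely chord sequence, one chord
    per measure, given per-measure lists of sung MIDI pitch numbers."""
    n, k = len(measures), len(NAMES)
    score = np.full((n, k), -np.inf)
    back = np.zeros((n, k), dtype=int)
    for j, c in enumerate(NAMES):
        score[0, j] = np.log(initial[j]) + np.log(emission_prob(c, measures[0]))
    for t in range(1, n):
        for j, c in enumerate(NAMES):
            prev = score[t - 1] + np.log(transition[:, j])
            back[t, j] = int(np.argmax(prev))
            score[t, j] = prev[back[t, j]] + np.log(emission_prob(c, measures[t]))
    # Trace back the highest-scoring chord path.
    path = [int(np.argmax(score[-1]))]
    for t in range(n - 1, 0, -1):
        path.append(back[t, path[-1]])
    return [NAMES[j] for j in reversed(path)]

if __name__ == "__main__":
    # One list of MIDI pitches per measure, e.g. from pitch-tracking a vocal take.
    melody = [[60, 64, 67], [65, 69, 72], [67, 71, 62], [60, 64, 60]]
    uniform = np.full((4, 4), 0.25)  # placeholder chord-transition table
    print(choose_chords(melody, uniform, [0.25] * 4))

A trained system would replace the uniform transition table and the ad hoc emission score with statistics learned from real songs; the placeholders above exist only to keep the example self-contained.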

Course Description

CS 547: Human-Computer Interaction Seminar (Seminar on People, Computers, and Design) is a Stanford University course that features weekly speakers on topics related to human-computer interaction design. The seminar is organized by the Stanford HCI Group, which works across disciplines to understand the intersection between humans and computers. This playlist consists of seminar talks recorded during the 2007-2008 academic year.
