Goals and purpose
Driscoll's initial goal for Haile was to combine auditory input and robotics to create a musical experience that would lead to further human–robot interaction. The ultimate goal was a robot that could translate live music into an acoustic performance, one that could reproduce and transcend human capabilities. Haile was not designed to replace human musicians, but to accompany them with expressive playing. These goals led Weinberg to pursue an acoustic musical experience: his earlier experiments had failed to incorporate the visual and auditory aspects associated with acoustic music. Haile's functional drumming arms add musical cues that other robot performances lack, namely visibly bouncing drumsticks and live, acoustic sound. Weinberg also observed that other attempts at percussion-playing robots were limited in the variety of beats they could produce. Haile is not only preloaded with individual beats but is also programmed to identify pitch, rhythm, and patterns, allowing it to improvise and play different beats every time rather than simply mimic what it hears.

Design
Form
Haile's design was modeled to match the natural feel of a Native American pow wow (a Native American gathering), so it was made out of wood rather than metal.

Perception
Haile uses a microphone mounted on the drum to detect rhythms played by a human in real time. The robot identifies tempo and beats, allowing it to play along with the other player. It can also adjust to the human's changes in volume, tempo, or beat, allowing it to switch between accompanying and lead playing. Weinberg and his team first developed the robot's low-level perception abilities, which include detecting hit onsets, pitch, amplitude, and density. In terms of sound, a hit refers to a distinct change in both volume and sound quality. Once the outside music is captured, the sound is analyzed by a number of components, called perception modules, each of which detects a certain aspect of the sound:

* Pitch - detects hits and changes in frequency and translates them into pitches
* Beat - processes onsets and determines rhythms and tempo
* Amplitude - recognizes changes in volume to determine when to assume a leading or following role
* Density - detects changes in rhythmic complexity at a given tempo, also helping Haile assume a leading or following role

(Weinberg, Gil; Driscoll, Scott. "The Interactive Robotic Percussionist". delivery.acm.org. Retrieved November 9, 2014.)
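As an illustration of the hit-detection idea above, here is a minimal sketch, not Haile's actual implementation (which used its own perception modules): a hit is flagged whenever a frame's energy jumps well above the preceding frame's. The function name, frame size, and thresholds are hypothetical.

```python
import numpy as np

def detect_hits(signal, frame_size=512, threshold_ratio=2.0):
    """Flag frames whose energy jumps well above the previous frame's.

    A 'hit' here is a sudden rise in short-time energy, a crude stand-in
    for the distinct change in volume described in the text.
    """
    n_frames = len(signal) // frame_size
    energies = np.array([
        np.sum(signal[i * frame_size:(i + 1) * frame_size] ** 2)
        for i in range(n_frames)
    ])
    hits = []
    for i in range(1, n_frames):
        # Require both a relative jump and a minimum absolute energy,
        # so silence-to-silence transitions are not flagged.
        if energies[i] > threshold_ratio * energies[i - 1] and energies[i] > 1e-3:
            hits.append(i)
    return hits
```

A downstream beat module could then turn the spacing between detected hit frames into onset times and a tempo estimate.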
Arm mechanics

Haile's arms are driven by two separate mechanisms.

Playing
Haile's system adopts a leader-follower model, using tempo and beat changes to determine who the current leader is. Haile recognizes when a new leader emerges based on musical changes (tempo, volume, beat, etc.). The robot has two modes of play:

* As a follower, Haile first analyzes the external music, then matches and maintains its tempo, allowing the human player to play more complicated rhythms. Haile can also tell when the other player begins to play louder or more quickly, which pushes it into the submissive role. When the human begins to play basic rhythms at a steady tempo, the robot takes the lead.
* As a leader, Haile improvises a rhythm with its right arm, building on rhythms produced earlier by the human, while its left arm detects and maintains the other player's tempo.
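The leader-follower decision above can be sketched as a simple rule: the human is treated as leader whenever the tempo shifts, the volume rises, or the rhythm becomes denser, and the robot leads otherwise. This is an illustrative sketch only; the function name and the threshold values are assumptions, not Haile's actual parameters.

```python
def choose_role(tempo_change, volume, density,
                tempo_tolerance=0.05, volume_threshold=0.6,
                density_threshold=0.5):
    """Return 'follow' when the human is asserting leadership
    (shifting tempo, playing louder, or playing denser rhythms),
    and 'lead' when the human settles into a basic, steady pattern.

    All inputs are normalized 0..1 except tempo_change, a fractional
    change in detected tempo since the last analysis window.
    """
    human_leading = (
        abs(tempo_change) > tempo_tolerance
        or volume > volume_threshold
        or density > density_threshold
    )
    return 'follow' if human_leading else 'lead'
```

For example, a steady, quiet, simple pattern (`choose_role(0.0, 0.3, 0.2)`) hands the lead to the robot, while a sudden accelerando (`choose_role(0.1, 0.3, 0.2)`) pushes it back into the follower role.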
Challenges

Some of the challenges Weinberg faced with Haile's programming involved distinguishing between different, simultaneous sounds. Initially, the analysis algorithms were unable to pick out softer, more subtle notes amidst louder sounds, and the inability to filter out ambient noise prevented Haile from working properly. After considerable adjustment, the filters and input hardware were tuned to differentiate between various volumes of music while ignoring interfering noise. Because Haile was designed to play in either a leading or a following role, early detection algorithms limited the human's ability to lead: the robot could detect changes in the music it heard but would respond only to changes in tempo. This flaw allowed the human to lead only as long as he or she kept speeding up or slowing down. To better model human musical interaction, Weinberg added volume and note-density sensing to help the robot decide who was leading. These additions gave the human longer periods of leadership, giving Haile more opportunity to build upon what it heard.
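The noise-filtering problem described above can be illustrated with an adaptive gate: rather than a fixed threshold, each frame's energy is compared against a rolling estimate of the ambient noise floor, so soft but distinct notes still register while steady background noise does not. This is a hypothetical sketch, not the filter Haile actually used; the window size and margin are illustrative.

```python
import numpy as np

def gate_onsets(energies, window=8, margin=3.0):
    """Accept only energy peaks that exceed a rolling noise-floor
    estimate (the median of the preceding `window` frames) by a
    fixed margin.

    Using the median makes the floor robust to the occasional loud
    hit inside the window, so steady ambient noise raises the floor
    but isolated notes do not.
    """
    accepted = []
    for i in range(window, len(energies)):
        noise_floor = np.median(energies[i - window:i])
        # max(..., 1e-6) avoids a zero floor in complete silence.
        if energies[i] > margin * max(noise_floor, 1e-6):
            accepted.append(i)
    return accepted
```

Under this scheme a soft note still passes the gate in a quiet room, because the floor adapts downward, while the same note would be rejected over loud ambient noise.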