
Noise and Buttons


Welcome to the Noise and Buttons official project page!

Here, you can explore what the project is, how it works, future plans, and links to project resources to try for yourself!




About - Noise and Buttons


     I am autistic and, as such, an advocate and supporter of autistic and neurodivergent people everywhere.  This deeply ingrained commitment drives my work in music, spanning from scoring for picture to music production and songwriting.  Over the years, I have actively engaged in advocacy efforts, using social media platforms to educate global audiences about the neurodiversity movement, which gained significant traction during the COVID-19 pandemic.  However, my advocacy extends beyond neurodiversity to encompass the broader disabled community, which comprises millions of individuals with physical, mental, and neurological disabilities.  My dedication to this cause led me to work closely with disabled students across various age groups, from preschool to high school.  It was during this time that I was inspired to develop a forward-thinking application concept.

     One day, I was substituting for a teacher, and the students had a library class as their special subject of the day.  As the librarian went about his lesson, I observed a remarkable innovation in the classroom.  A giant red USB push button, serving as a spacebar, was connected to a computer running a simple flash game.  The game’s objective was for an animal to jump over never-ending obstacles by pressing the space bar.  There was no question about the aim, as the button acted as a simple alternative control in place of the more complicated computer keyboard layout.  This setup enabled all students, regardless of their motor skills, neurological differences, or learning preferences, to participate in the game.  This seemingly straightforward adaptation sparked a thought within me: could a complex synthesizer be controlled with the same simplicity as push buttons?  The answer to this question was “yes”.  Sensory-friendly devices could potentially bridge the gap in accessing subjects, such as music education, that often present barriers to disabled individuals.  With this revelation, I envisioned creating a musical instrument that prioritized simplicity and accessibility.  By eliminating unnecessary complexity and empowering users to learn the fundamental aspects of music-making, I aimed to make music education more inclusive and accessible to all.

What is Noise and Buttons?

     Noise and Buttons is a pioneering music education project that aims to make music education more accessible for neurodivergent students by integrating sensory-friendly hardware and software interfaces.  While the program is inclusive and open to all, its primary focus is on catering to the needs of the neurodivergent community.  At the heart of Noise and Buttons is a sandbox-styled application featuring a user-friendly music engine controlled by four buttons.  These buttons enable users to manipulate four musical elements: tempo, rhythm, timbre, and harmony.  Each button is color-coded red, yellow, green, or blue, corresponding to tempo, rhythm, timbre, and harmony, respectively.  Through simple button presses, users can explore an extensive array of musical sounds, facilitating hours of exploration.

     Complementing the app is the Noise and Buttons Musical Learning Program, a comprehensive curriculum developed to guide students in learning the various aspects of music.  The program incorporates interactive exercises within the app and musical examples to reinforce concepts.  It comprises slideshows designed for classroom instruction and an eight-part video series, with two videos dedicated to each aspect of music.  Collectively, these components serve one unifying goal: to make entry-level music education more accessible for everyone, eliminating the fear of where to start, and hopefully leading students to continue studying the musical arts.

From Prototype to Web App:

     After studying technology in music at the Oberlin Conservatory, I became accustomed to the DSP programming environment Max MSP.  Developed by Cycling ‘74, Max allows programmers to write interactive real-time patchers with a graphical user interface.  Functions appear as nodes and can be connected by digital patch cables through inlets and outlets in a similar manner to modular synthesizers.  Max can also be connected to any outside hardware and software with ease.  I knew that with this technology, I could bring the Noise and Buttons app to fruition.  From there, I built a complete prototype of the app as a Max patch.

     I used the prototype to add and test features for four years starting in 2019, when the idea presented itself.  The app functioned as designed on my own computer; however, one significant portion of the program was still missing: the hardware.

     When designing the prototype app in Max MSP, I was fascinated with how seamless it was to interface with hardware devices through Max.  In conjunction with software development, I built a hardware button box out of wood and arcade buttons.  This box is designed specifically to work with the app through USB, and is treated as a type of game controller, making it compatible with industry-standard controller mapping.  Pictured below is the completed hardware button box prototype:

     As the scope of the project expanded and I began to test it with students exhibiting neurological differences such as autism and ADHD, it was apparent that I needed a platform beyond my computer running Max to host this technology.  Fortunately, in early 2023, Cycling ‘74 released a revolutionary tool within Max called RNBO.  RNBO allows projects to be exported to targets such as C++, JavaScript, VSTs, and hardware devices like Raspberry Pi.  

     Among these options, the most widely accessible target was exporting the project as a web app with JavaScript, enabling users from around the world to access it directly from their web browsers.  Upon writing a RNBO patch and selecting the JavaScript target, RNBO generates a .json file.  This file, along with instructions from Cycling ’74’s RNBO API, enables seamless integration of the patch into a web development environment.  From here, it is up to the programmer to write code that can control the patch.  At this step, I referred to the prototype to rewrite all functions in JavaScript syntax.  Additionally, I rebuilt the user interface with HTML5 and CSS.  Currently, the app is hosted as an open-source project on GitHub, accessible via a GitHub Pages link.  Thanks to this innovative technology, Noise and Buttons has become readily available to anyone worldwide.

How it Works - Noise and Buttons

Noise and Buttons' Core:

     The Noise and Buttons application is a web application that runs on any desktop or mobile browser.  As stated earlier, the app consists of four push buttons that are each color-coded and control the four major aspects of music: tempo, rhythm, timbre, and harmony.  The data generated by the user interacting with the buttons is sent to the generative synthesizer core of the app, which was constructed in Max MSP with the RNBO export target.  The intuitive programming environment within Max allowed me to develop the three-voice synthesizer and audio processing effects, exported as JavaScript code for the web application target.

     Internally, the Noise and Buttons synth operates similarly to conventional synthesizers, with its unique functionality stemming from external data inputs that enable control through simple button commands.  Beginning with pitch control, the synth stores a list of notes from a specific scale, ready to be fed into the oscillators.  When signaled from the synth’s built-in step sequencer, the system selects random numbers corresponding to indices from the scale list.  This occurs three times simultaneously, assigning each voice a random note to play in synchrony with the sequencer’s rhythms.  These notes are then fed into an amplifier that is opened with the signal of an envelope, triggered whenever a new note is selected from the scale.  All of this is clocked by one unifying tempo parameter, which allows the synth to play tight music with ease.  For example, eighth notes played at a tempo of 60 beats per minute will be evenly triggered twice per beat, or twice per second.  Finally, the signal travels into a tap delay, vibrato, and low-pass filter that are controlled by the green timbre button.  All of this is paired with a kick drum sample that keeps the beat.  The kick can be activated or deactivated by the user as desired.
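The timing arithmetic above can be sketched in a few lines of JavaScript (the function name here is illustrative, not taken from the app's source):

```javascript
// Compute the interval between note triggers from a tempo and a
// subdivision count. At 60 BPM, one beat lasts 1000 ms, so eighth
// notes (2 subdivisions per beat) fire every 500 ms.
function noteIntervalMs(bpm, subdivisionsPerBeat) {
  const beatMs = 60000 / bpm;          // milliseconds per beat
  return beatMs / subdivisionsPerBeat; // milliseconds per note
}

console.log(noteIntervalMs(60, 2)); // 500 (two notes per second)
```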

Redefining the Step Sequencer:

     The rhythm control mechanism in the Noise and Buttons synth is facilitated by a step sequencer, which was designed to overcome the limitations of traditional sequencer designs.  On devices such as the Roland TR-808, a fixed number of steps limits the sequencer to straight rhythms.  However, since I aimed to create an app capable of producing complex polyrhythms, the traditional design proved inadequate.

     The primary challenge arose when users sought to play rhythms that fell between the discrete steps of a traditional sequencer.  The solution I devised allows for an infinite number of sub-divisions.  The Noise and Buttons step sequencer has only four steps.  Every step is divided into a specified number of sub-steps, spaced evenly within the step, with delay times computed from the number of sub-steps and the tempo.  This innovative approach enables any division of steps, such as quintuplets, providing users with unprecedented rhythmic flexibility.  Three of these sequencers run in parallel, one for each synth voice.

    The implementation of this system is realized through lists of “on” and “off” signals, which dictate the activation status of every sub-step.  Like a normal sequencer, this results in triggers only being fired when a sub-step is on.  Overall, this robust design combines the rhythmic accuracy of a standard step sequencer with the flexibility to handle polyrhythms, resulting in a versatile and powerful tool for musical expression.
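As a minimal sketch of this design (the names and the exact data shape are my own illustrations, not the app's actual identifiers), a sub-step sequencer can be reduced to a binary pattern and a tick function:

```javascript
// Each of the four main steps is divided into `subStepsPerStep` equal
// slices, and a binary list marks which slices fire a trigger.
function makeSequencer(pattern, bpm, subStepsPerStep) {
  // Each main step lasts one beat; sub-steps divide it evenly.
  const subStepMs = (60000 / bpm) / subStepsPerStep;
  let index = 0;
  return {
    subStepMs,
    // Advance one sub-step; returns true when this slice is "on".
    tick() {
      const fire = pattern[index] === 1;
      index = (index + 1) % pattern.length;
      return fire;
    },
  };
}

// A 4-step pattern with 2 sub-steps per step: fire on every step,
// plus an extra off-beat sub-step inside step three.
const seq = makeSequencer([1, 0, 1, 0, 1, 1, 1, 0], 120, 2);
const fired = [];
for (let i = 0; i < 8; i++) fired.push(seq.tick());
console.log(fired); // [true, false, true, false, true, true, true, false]
```

Because the sub-step length is derived from the tempo, changing the pattern's length or density never breaks the sequencer's lock to the clock.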

The Multipurpose Push-Button:

     When conceptualizing the idea of using push buttons to control complicated music, I recognized the versatility of the push-button interface.  The most common interaction involves a quick press and release, which sends an event to the computer to execute a function.  Quick-press actions are triggered on release of the button, which accommodates the second common interaction: the long press, also known as press-and-hold.  During a long press, the quick-press function is replaced by a continuous press-and-hold function, which persists until the user releases the button.  This allows for precise control of quick-press parameters, as with the tempo button.  Alternatively, a button can use this implementation to control two separate parameters of its musical aspect.  For instance, the rhythm button can be pressed quickly to produce new rhythms, or held down to swiftly increase and decrease the attack and decay of each note that is played, ultimately allowing articulations to be simulated through the same button.

     The system works by setting a press-and-hold state to false unless the button is pressed for more than 250 milliseconds.  When this occurs, the press-and-hold state becomes true, and the quick-press function on release is ignored.  Each button has its own press-and-hold state variable, allowing multiple buttons to be pushed in different ways at once.  This system allows each button to double its functionality, creating more avenues to control the synth.
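A simplified sketch of this per-button logic in JavaScript (the class and handler names are illustrative; the real app flips the hold state with a timer while the button is still down, whereas this sketch decides on release for brevity):

```javascript
const HOLD_THRESHOLD_MS = 250; // threshold separating quick press from hold

class Button {
  constructor({ onQuickPress, onHold }) {
    this.onQuickPress = onQuickPress;
    this.onHold = onHold;
    this.pressedAt = null; // per-button state: when the press began
  }
  press(timeMs) {
    this.pressedAt = timeMs;
  }
  release(timeMs) {
    const heldFor = timeMs - this.pressedAt;
    this.pressedAt = null;
    if (heldFor > HOLD_THRESHOLD_MS) {
      this.onHold(heldFor); // press-and-hold: the quick press is ignored
    } else {
      this.onQuickPress();
    }
  }
}

const events = [];
const tempoButton = new Button({
  onQuickPress: () => events.push("quick"),
  onHold: (ms) => events.push(`held ${ms}ms`),
});
tempoButton.press(0);
tempoButton.release(100);  // under 250 ms: quick press
tempoButton.press(1000);
tempoButton.release(1400); // over 250 ms: press-and-hold
console.log(events); // ["quick", "held 400ms"]
```

Because each `Button` instance carries its own state, several buttons can be quick-pressed and held simultaneously without interfering with one another.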

How do the Buttons Affect the Synth?

     The buttons affect all of the tempo, rhythm, timbre, and harmony parameters of the synth.  To consolidate the parameters for use with four buttons, each aspect has a grouping of parameters.  Tempo and the delay time are synced together and therefore considered part of the tempo group.  Voice rhythms and articulations fall into the rhythm group.  All audio processing effect parameters are members of the timbre group.  Pitches and the enabling/disabling of synth voices belong in the harmony group.  These groupings are fundamental for realizing the full potential of the four buttons.

     Starting with tempo, a simple quick press will result in a random tempo change from five options: 50, 65, 85, 115, and 140 beats per minute.  On a press-and-hold action, the tempo will sweep between the minimum and maximum values of 50 and 140 BPM until released, reversing direction whenever it reaches either bound.  The tempo value is vital, as the entire synth runs on a clock that relies on tempo to control its speed.  This value is also given to the delay effect, syncing its speed with the clock.
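The press-and-hold tempo sweep can be sketched as a bounded ramp that reverses at the edges (the 5 BPM step size here is an assumption for illustration; the names are my own):

```javascript
const TEMPO_MIN = 50;  // BPM bounds from the app's design
const TEMPO_MAX = 140;

// Advance the tempo one tick in the current direction, reversing
// whenever a bound is reached.
function stepTempo({ bpm, direction }, step = 5) {
  let next = bpm + direction * step;
  if (next >= TEMPO_MAX) { next = TEMPO_MAX; direction = -1; }
  else if (next <= TEMPO_MIN) { next = TEMPO_MIN; direction = 1; }
  return { bpm: next, direction };
}

// Holding the button repeatedly applies the step:
let state = { bpm: 135, direction: 1 };
state = stepTempo(state); // hits 140, direction flips
console.log(state); // { bpm: 140, direction: -1 }
state = stepTempo(state);
console.log(state); // { bpm: 135, direction: -1 }
```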


     The rhythm button controls the note value of all three synthesizer voices.  With a quick press, a random note value per voice is selected.  The note value selections are sent to the sequencers in the form of binary lists that activate or deactivate sub-steps.  Because each sequencer is tempo synced, this system prevents rhythms from de-syncing.  Executing a long press triggers a script where the attack and decay times for every note are adjusted incrementally.  First, the attack time increases and decreases.  Once the attack time completes its cycle, the decay time increases and decreases, then triggers the attack increment cycle again.  The pattern will run indefinitely until the button is released.
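The long-press articulation script can be sketched as a looping generator: the attack time ramps up and back down, then the decay time does the same, repeating until release (the value range and step size here are assumptions for illustration):

```javascript
// Yields one { param, value } adjustment per tick, cycling attack
// up/down, then decay up/down, indefinitely.
function* articulationCycle(min = 0, max = 100, step = 50) {
  while (true) {
    for (const param of ["attack", "decay"]) {
      // ramp up, then back down
      for (let v = min + step; v <= max; v += step) yield { param, value: v };
      for (let v = max - step; v >= min; v -= step) yield { param, value: v };
    }
  }
}

const cycle = articulationCycle();
const first = [];
for (let i = 0; i < 4; i++) first.push(cycle.next().value);
console.log(first);
// [ { param: "attack", value: 50 }, { param: "attack", value: 100 },
//   { param: "attack", value: 50 }, { param: "attack", value: 0 } ]
// ...after which the decay ramp begins.
```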

     The timbre button serves to control the synth’s audio processing effects.  These effects are delay, vibrato, and a low-pass filter.  A quick press triggers random adjustments to key parameters of the effects, including the delay wet mix, delay feedback, vibrato speed, and vibrato depth.  Notably, the vibrato effect is directly tied to the wet mix of the delay, affecting only the wet signal while the dry signal remains unaffected.  For more nuanced control, the press-and-hold action runs a script where the cutoff frequency for the filter slowly rises and falls.  While this is happening, the resonance bounces back and forth rapidly.  This creates a distinctive “inchworm” movement between the two parameters.


     The harmony button is responsible for the pitch content of the synth.  When pressed quickly, the app chooses a scale list, and then breaks it down into three sub-lists for the three voices.  Next, a random number generator for each voice selects a random note from each sub-list when triggered from the step sequencer.  The pitches are then converted to frequencies and fed into the synthesizer’s three voices.  A long press will activate or deactivate the audio output of the voices through an order of different on/off combinations.  The app will cycle in order through seven different combinations of voice stackings until the button is released.
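The harmony path can be sketched in JavaScript using the standard MIDI-to-frequency conversion (A4 = MIDI note 69 = 440 Hz); the even three-way split and the helper names are my own illustrations of the scheme described above:

```javascript
// Split a scale (as MIDI note numbers) into sub-lists, one per voice.
function splitScale(scale, voices = 3) {
  const size = Math.ceil(scale.length / voices);
  const lists = [];
  for (let v = 0; v < voices; v++) {
    lists.push(scale.slice(v * size, (v + 1) * size));
  }
  return lists;
}

// Standard MIDI-to-frequency conversion.
function midiToFreq(note) {
  return 440 * Math.pow(2, (note - 69) / 12);
}

// Pick a random note from a voice's sub-list and convert it,
// as happens on each trigger from the step sequencer.
function randomNoteFreq(subList) {
  const note = subList[Math.floor(Math.random() * subList.length)];
  return midiToFreq(note);
}

// C major across roughly two octaves, split low/middle/high:
const scale = [48, 50, 52, 53, 55, 57, 59, 60, 62, 64, 65, 67];
const [low, middle, high] = splitScale(scale);
console.log(midiToFreq(69)); // 440
```

Splitting the scale into registers guarantees that the low, middle, and high voices never collide on the same octave, even though each note choice is random.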

The User Interface:

     Noise and Buttons prioritizes user-friendliness, reflected in its intuitive user interface featuring an easy-to-interpret layout and engaging visuals that provide real-time feedback on the music's behavior.  The front-end user interface for the web application is organized into three main sections: the global control panel, the information panel, and the button control panel.

     The global control panel houses a range of controls and settings crucial for managing the application's functionality.  At the top of the panel is a drop-down menu offering users a choice between input devices: keyboard/mouse control, or physical button-box control.  Immediately below the input device options are two volume sliders.  The top slider governs the main volume of the app, while the bottom slider specifically controls the volume of the kick drum.  In addition to volume controls, the global panel features other essential controls and visual elements.  A stopwatch is available for users to see the time they have spent using the app.  The stopwatch starts when the app is turned on and pauses when it is shut off, resuming when switched on again.  A “reset” button will set it back to zero.  Another prominent feature of the global panel is the capability to record audio directly from the app to an audio file on the user's computer.  As a safety measure, the app is initially set to an 'off' state upon loading.  This allows users to adjust their speaker volume or accommodate headphones before activating the application, ensuring a comfortable and safe listening experience.

     The information panel is the focal point of interaction within Noise and Buttons, providing users with comprehensive insights into the changes occurring in the music as they manipulate various parameters.  To facilitate easy comprehension, the panel is organized into distinct columns, each dedicated to one of the four fundamental aspects of music.  Each column is also split into two rows, differentiating quick versus long button presses.  

     In the tempo column, users can observe the current tempo in beats per minute (BPM) displayed in the top row.  Additionally, the bottom row shows the direction the tempo will increment when long pressed.  The rhythm column presents three note values that signify the rhythm played by the low, middle, and high voices, respectively.  The bottom row houses two sliders that display the attack and decay speeds of each note.  In the timbre column, users can visually interpret complex aspects of the sound.  The top row displays delay feedback and volume using a single slider, with the shade of green representing the wet/dry mix of the delay and the position of the slider knob indicating the feedback level. The bottom row features two sliders for adjusting the cutoff frequency and resonance of the low-pass filter, labeled as 'brightness' for ease of understanding.  Finally, the harmony column provides information about the scale currently playing through the synth in the top row.  The bottom row shows the state of the three voices regarding their audio output (a blue circle meaning the voice is sounding, and a black circle being the opposite).  These controls are designed to be easy to understand for anyone using the app regardless of musical background or learning styles.

     The button control panel is the front-end interface for managing the intricate controls of the synthesizer.  The four buttons are arranged from red to blue and align with their respective columns on the information panel.  Users can easily control these buttons with a click of the mouse or trackpad.  Additionally, users can opt for keyboard control to manipulate multiple buttons simultaneously.  By default, the buttons are mapped to the numbers one through four on the keyboard, facilitating efficient control.  In instances where users prefer the physical button box, keyboard and mouse control remain active alongside the button box as an additional input device.

Teaching Materials - Noise and Buttons

The Slideshow Presentations:                                    

     At the start of each class session, I utilized slideshow presentations designed to introduce concepts about music to the students, with the first four focusing on the four musical aspects that the app controls.  The following slideshows are subsequent lessons exploring the application of these aspects across different musical genres.

Each slideshow maintains a standard layout to provide a sense of routine, recognizing that many neurodivergent students benefit from a predictable series of events.  Each slideshow begins with “Musical Brain Exercises”.  These are designed to engage students in thought-provoking activities by connecting real-life situations to the concept taught in the lesson.  In the tempo slideshow, for example, students are presented with scenarios prompting them to consider how fast or slow a person might move in different situations.  The first question goes as follows:

“Sam wants to get from their house to the park.  They have nothing to do today.  How do they get there?”

The options for multiple choice are A: “They walk or roll slowly”, or B: “They run or roll quickly”.  The correct answer for this question is A.  Another question follows:

“Joel is playing tag with their friend.  They get tagged and they are now ‘it’.  They want to reach their friend as soon as possible.  How do they get there?”

     The same options are available for this question, with the answer being B.  I used these questions and scenarios to give an emotional connection to the concept of speed.  In this case, students could associate slow movement with more relaxed situations, and fast movement with more chaotic ones.  The same can apply to music, as a fast song may be perceived as more energetic than a slower one.  After the musical brain exercise, students are prepared to learn about the musical concept, and thus, the next slides define the concept.  For this example, I will continue to refer to the tempo slideshow.

     In my lesson plan approach, I aimed to provide clear and straightforward definitions that accurately conveyed musical concepts while remaining accessible to my students.  Therefore, tempo is defined as “how fast or slow a beat is”, capturing the essence of tempo in a concise manner.  The beat of a song is defined as “what keeps everyone together when playing music”.  These definitions are simple, yet understandable and accurate.

     In addition to providing definitions, each slideshow presentation is supplied with audio examples to further illustrate the concepts being taught.  To enhance comprehension and accommodate differing learning styles, I wrote a simple song with variations to showcase different musical techniques.  On the tempo presentation, the same song is demonstrated in different tempos side-by-side.  This allows auditory learners to grasp the concept easily.  Moreover, this approach is instrumental in developing critical listening skills, as students can learn to discern and analyze musical nuances across different renditions of the same song.

     Along with conveying musical concepts, I prioritized fostering emotional connections to music in my lesson plan design.  After explaining a particular topic, I encouraged my students to explore how different musical examples make them feel.  This open-ended discussion allowed students to express their diverse emotional reactions to the music.

For the tempo slideshow, students can share responses ranging from associating fast music with happiness to associating slow music with sadness.  Some might share more nuanced perspectives, and some might admit uncertainty about their emotional responses.  The discourse this creates in the classroom is key to empowering students to explore their creativity while learning musical concepts.  Naturally, students will have drastically differing answers, which is why every slideshow includes a slide addressing this directly.


     In the subsequent sections of the slideshows, various musical examples are introduced to further illustrate the concepts discussed, allowing students to explore music based on emotional descriptors.  Students are encouraged to select songs they want to hear based on their emotional responses, fostering personal engagement and connection to the material.  Also, to reinforce the point about differing emotions, examples are present that showcase different combinations of playing techniques and emotions.  For instance, in the tempo section, songs labeled as “fast and happy” and “slow and sad” are featured.  Additionally, there are examples like “slow and happy” songs, emphasizing that tempo and emotion are not inherently tied to each other.  A song can evoke a range of emotions, such as happiness, sadness, anger, or fear, regardless of its tempo.

     The slideshows following the first four delve into different well-known genres of music, including classical, jazz, rock, pop, and electronic, among others.  The layout of these remains consistent with the musical aspect slideshows but substitutes some elements for more genre-relevant activities.  In the musical examples section for these genre-focused slideshows, there are two contrasting examples for each aspect of music.  For instance, when viewing tempo, there is a fast and slow classical piece to compare side by side.  For harmony, there is a piece in a major key versus a minor key.  Alongside musical examples, there are musical instrument demonstrations for the students pertaining to the genre taught.  For instance, in the classical genre section, orchestral string instruments are discussed and demonstrated, while other examples include the saxophone for jazz, the theremin for electronic music, and the electric guitar for rock.

     The slideshows for the Noise and Buttons Musical Learning Program allow the concepts controllable by the app to be further understood and taught, while continuing to employ the accessible approach of the entire project.

Video Lessons:

     As part of the Noise and Buttons Musical Learning Program, I developed a series of eight video lessons, with two videos dedicated to each musical aspect.  These videos are hosted on my YouTube channel, offering accessibility to a wide audience.  Each video is designed to be interactive, allowing viewers to engage directly with the content.  The first video in each pair serves as an introductory overview of the topic, accompanied by a short exercise that allows the viewer to get familiar with what is being taught in a hands-on environment.  The second video is used as a tool to measure how well the concept was retained by offering a slightly more challenging exercise.  The first video’s musical examples are derived from the Noise and Buttons app, while the second one contains a full version of the original song.  Throughout both videos, the parameters of the musical aspect being taught dynamically change, and viewers are encouraged to match these changes as closely as possible.  Visual aids are provided on-screen to assist viewers in tracking these changes.  Based on testimony from my students, it is apparent that these videos were instrumental in the learning of the four musical aspects.  Below, I have included the full transcripts of all eight videos, complete with cues for audio playback.

Conclusion on Teaching Methodology:

     The materials of the Noise and Buttons Musical Learning Program are versatile and adaptable, suitable for both individual and classroom settings.  However, their true potential is realized in a classroom environment.  The ultimate objective of the program is straightforward: to equip students with a comprehensive understanding of music through the four key aspects introduced by the Noise and Buttons app and accompanying lessons.

Future Plans - Noise and Buttons

Noise and Buttons is currently available as a web app.  I plan to deploy the app to more platforms, and as I add new features, modes, and functionality, I will create new teaching material to pair with the app in the Noise and Buttons Musical Learning Program.  Below is a list of updates that are in the works as of April 2024:

  1. Deploy the app in its current form as a desktop and mobile application

  2. Add features for visually and hearing impaired people to allow further accessibility to the app and its content.

  3. Create teaching materials within the application so users can get all of their content in one place.

  4. Create new modes that utilize the same button control as the current generative synthesizer.
