AUDIOBEADS

A novel visual & musical programming language for teaching basic computer science concepts in the browser


The Project

AudioBeads is my undergraduate thesis project, which I worked on from October 2015 to May 2016. Since my undergraduate degree is in Computer Science, this is not your typical user experience project; however, I was already interested in the field at the time, so I lobbied to conduct a human-computer interaction project. This shows especially in the Evaluation section below. 

 

The Problem

In recent years, there has been a drive to revitalize computer science education, for example through the creation of novel methods and tools for teaching computing concepts.

It is well known that teaching programming to novices is difficult: it requires abstract reasoning and the ability to maintain a mental model of the abstract machine whose behavior one is trying to program. Another major barrier of text-based programming is its unfamiliar, unforgiving syntax. This unnatural mode of expression creates a gap between the mental representation of a programming solution and the way it must be conveyed to the computer, a gap novices find difficult to bridge. 

Visual programming languages offer a good alternative to traditional programming languages as initial programming environments. Their syntax-free nature and their ability to embed instructional analogies into the visual representation mean that important programming constructs, such as variables, loops, or parallelism, can be introduced to novice programmers through abstraction.  

The aim of this project was to determine whether a beads-on-string metaphor for programming could be used in a 2D environment. The metaphor was created by a team at Microsoft Research Cambridge working on Torino, a tangible programming language for visually impaired children. I had personally helped lay the foundations for Torino through a project called "Live Coding for Blind Children" in my second year of undergraduate studies. 

 

The Process

Below is an overview of the process followed for the design, implementation, and evaluation of AudioBeads. 

 

What is AudioBeads?

AudioBeads is a visual programming language I created and implemented. It uses the Web Audio API to produce digital music as program output. It is targeted at individuals with no programming experience and is intended to serve as a tool for teaching basic computing and programming concepts. 
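
For readers unfamiliar with the Web Audio API, the sketch below shows how a single note can be synthesized in the browser. It is purely illustrative of the API that AudioBeads builds on, not the AudioBeads implementation itself, and the MIDI-style pitch numbering is my own assumption.

    // Illustrative only: the kind of Web Audio call that could sit underneath
    // a "play note" instruction. Not the actual AudioBeads implementation.
    const context = new AudioContext();

    // Convert a MIDI-style pitch number (60 = middle C) to a frequency in Hz.
    function midiToFrequency(pitch) {
      return 440 * Math.pow(2, (pitch - 69) / 12);
    }

    function playNote(pitch, durationInSeconds) {
      const oscillator = context.createOscillator();
      oscillator.frequency.value = midiToFrequency(pitch);
      oscillator.connect(context.destination);
      oscillator.start();
      oscillator.stop(context.currentTime + durationInSeconds);
    }

    playNote(60, 1); // one second of middle C

This mirrors the single-note program shown in the interface screenshot further down, which plays pitch 60 for 1 second.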

Designing AudioBeads

To be successful as a pedagogical musical tool, AudioBeads needed to include a few basic programming constructs and allow users to write musically interesting programs. To scope the language, I used Sonic Pi, a successful music programming environment originally designed to support computing education in English schools, as inspiration.

After careful analysis, I decided that the system should allow users to:

  • Play single notes
  • Play pre-defined sound samples
  • Add sound effects
  • Introduce silences in their composition

It should also provide support for loops, conditionals, and threading. This scope kept the language small enough for beginners while still covering a variety of programming constructs: sequences, iteration, conditionals, randomization, event-driven dynamic modifications, parallelism, variables, and parameters. 
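
To give a feel for what such programs amount to, here is a hypothetical JavaScript sketch of a sequence containing notes, a silence, and a loop, scheduled against the Web Audio clock. The helper playNoteAt and the overall structure are my own illustration, not the AudioBeads runtime.

    // Hypothetical sketch: scheduling a looped sequence of notes and a rest
    // against the Web Audio clock. Not the actual AudioBeads runtime.
    const context = new AudioContext();

    function playNoteAt(startTime, pitch, duration) {
      const oscillator = context.createOscillator();
      oscillator.frequency.value = 440 * Math.pow(2, (pitch - 69) / 12);
      oscillator.connect(context.destination);
      oscillator.start(startTime);
      oscillator.stop(startTime + duration);
    }

    // Loop twice over: note 60 for 0.5 s, a 0.25 s silence, note 67 for 0.5 s.
    let time = context.currentTime;
    for (let i = 0; i < 2; i++) {
      playNoteAt(time, 60, 0.5);
      time += 0.5;
      time += 0.25; // the silence simply advances the clock
      playNoteAt(time, 67, 0.5);
      time += 0.5;
    }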

The visual design of the notation was guided by the Cognitive Dimensions of Notations (CDN) framework, a set of heuristics for analyzing the usability of information artifacts such as programming languages. I identified the crucial dimensions for a successful introductory programming environment as follows:

  • High visibility
  • High role-expressiveness
  • Consistency of the notation
  • Low resistance to change
  • Minimal premature commitment
  • Minimal number of hidden dependencies
  • High closeness of representation to domain
  • Low diffuseness of the language
  • Progressive evaluation

The beads-on-string metaphor naturally prompted a design of 2D beads connected by lines. I decided that each bead should represent a single instruction, and that constructs encapsulating more than one instruction should look different to help learners discriminate between the two. Based on the constraints laid out above, I settled on a notation composed of 6 main elements. 

Elements of the AudioBeads language

Screenshot of the web browser interface for coding in AudioBeads. The program here plays a single note for 1 second, at pitch 60.

Example of a more complex program in AudioBeads

 

Evaluating AudioBeads

The main goals for the evaluation of AudioBeads were:

  • To assess the usability of both its interface and its notation
  • To assess whether its purpose as an educational tool is justified

To do so, I designed a controlled, task-based experiment in which participants learned the basics of AudioBeads and went through a series of exercises of increasing difficulty. To protect the internal validity of the experiment by keeping guidance to a minimum, I designed a workbook that allowed participants to work through the tasks as independently as possible. 

The workbook contained five sections as follows:

  • Sections 1 & 2 together contained 6 tasks, each introducing a new element of the language
  • Section 3 contained 2 independent programming tasks 
  • Section 4 consisted of 2 "Observe & Predict" tasks and 2 "Listen & Predict" tasks, in which participants were presented with an AudioBeads program and had to either explain its output out loud or pick the correct musical output out of 3 pre-recorded samples
  • Section 5 consisted of 4 tasks in which participants were presented with short JavaScript code snippets using the Web Audio API and had to try to infer their behavior (a representative sketch follows this list)
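
The exact snippets from the study materials are not reproduced here; the following is only a representative sketch of the kind of short Web Audio program participants were asked to reason about.

    // Representative of the Section 5 snippets (not the actual study material):
    // what does this program sound like?
    const context = new AudioContext();
    const oscillator = context.createOscillator();
    const gain = context.createGain();

    oscillator.frequency.value = 220; // A3
    gain.gain.value = 0.5;            // half volume

    oscillator.connect(gain);
    gain.connect(context.destination);

    oscillator.start();
    oscillator.stop(context.currentTime + 2); // a quiet two-second tone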

At various points in the booklet, participants had to answer checkpoint questions and state their confidence levels. 

For each task, the number of clicks and deletions, the time to completion, and the task outcome were recorded automatically by the interface, by hand, and through screen video capture software. To assess the overall usability of the interface, participants completed the System Usability Scale questionnaire, and to assess the usability of the notation, they completed the Cognitive Dimensions of Notations questionnaire. Statistical analysis was performed using RStudio.

The results showed that the system’s usability was rated above average on the System Usability Scale, although more highly by male participants than by female participants. They also suggested that the notation fulfilled the goals set in the design phase: good visibility and role-expressiveness, consistency, low premature commitment, and low resistance to change.

In terms of educational value, the experiment showed that AudioBeads taught participants basic programming elements; in most cases, they were then able to confidently explain the behavior of new AudioBeads programs. Finally, and perhaps most remarkably, participants who had never programmed before were able to successfully analyze very basic JavaScript programs, relying solely on what they had learned from using the system.

Although not statistically significant in most cases, gender differences were observed: male users were generally more confident in their programming abilities than their female counterparts, and marginally more successful. These results suggested that women found the beads-on-string metaphor somewhat less usable and instructive than men, but they were nonetheless encouraging, as they indicated that the metaphor worked as a mental model for programming irrespective of gender. 

You can read my dissertation and the full evaluation report here.