Precode
52 Products in 52 Weeks

Week 10: Creme - scratching a 40-year itch with live coding music

Making electronic music shouldn't require a £500 DAW, years of music theory, or a complex setup involving Haskell compilers and SuperCollider. Yet for anyone curious about algorithmic music—the kind that generates itself through code—the barrier to entry has been absurdly high.

This week I built Creme, a browser-based live coding music environment. Open a tab, write some code, hear music. No installation. No configuration. Just pure creative expression.

A 40-year itch

I've been into computers and dance music for over 40 years. This wasn't a business opportunity or a gap in the market. It was a question: could I build this with Claude Code?

No product launch planned. No SaaS ambitions. No monetisation strategy. Just pure curiosity and the joy of building something for its own sake.

The live coding scene has produced incredible tools—TidalCycles, Sonic Pi, Strudel—but they all have friction. TidalCycles needs Haskell and SuperCollider. Sonic Pi requires a download. Even Strudel, which runs in the browser, left me wondering if I could take a different architectural approach.

What Creme does

Creme lets you create music by writing patterns. Here's a complete drum loop:

d1(s("bd*4").bank("RolandTR909"))
d2(s("[~ sn]*2").bank("RolandTR909"))
d3(s("hh*8").gain(0.4))

That's it. Three lines. A kick on every beat, snare on the backbeat, hi-hats running continuously. Change any line while it's playing—the new pattern hot-swaps in without missing a beat.

The mini-notation is where the magic happens. Square brackets group steps into a subdivision; commas inside them stack sounds simultaneously (chords). Asterisks multiply (repetition). Angle brackets alternate between values. You can nest these infinitely to create complex polyrhythms that would take hours to program in a traditional DAW:

note("[c3*3, e3*5, g3*7]").slow(4)

Three notes, each repeating at different rates, creating ever-shifting patterns that never quite repeat the same way.

Beyond the pattern language, Creme includes 54+ audio effects (filters, reverb, delay, distortion, compression), multiple synthesis methods, 80+ sample categories loaded on demand from a CDN, and an AI assistant that can generate, explain, or improve your patterns.

How we built it

The stack is TypeScript throughout, organised as a monorepo with six packages: core pattern engine, mini-notation parser, scheduler, transpiler, Web Audio engine, and the web application itself.

Three architectural decisions defined the build:

Pattern-as-query model. Instead of storing a finite sequence of events, patterns are pure functions that, when queried for a time span, return the events within it. This means patterns can be infinitely generative—they produce events on demand rather than from a stored list. It also enables hot-swapping without disruption.
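
The idea can be sketched in a few lines of TypeScript. This is a minimal illustration of the query model, not Creme's actual types: a pattern is just a pure function from a time span to the events inside it.

```typescript
// A pattern is a pure function: query a time span, get back the events in it.
type Event = { start: number; end: number; value: string };
type Pattern = (begin: number, end: number) => Event[];

// A pattern that plays `value` once per cycle. Nothing is stored: events are
// computed on demand for whatever span the scheduler asks about, which is why
// such a pattern can be infinitely generative and swapped out at any time.
const pure = (value: string): Pattern => (begin, end) => {
  const events: Event[] = [];
  for (let cycle = Math.floor(begin); cycle < end; cycle++) {
    if (cycle >= begin) events.push({ start: cycle, end: cycle + 1, value });
  }
  return events;
};

const kick = pure("bd");
console.log(kick(0, 2)); // two events, one per cycle
```

Because the scheduler only ever asks "what happens between now and now + lookahead?", replacing the function mid-playback changes the answer for future spans without touching anything already scheduled.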

Fraction-based timing. Polyrhythms require exact rational arithmetic. When you're dividing beats into thirds and sixteenths simultaneously, floating point accumulation errors will eventually cause timing drift. Using Fraction.js for all time calculations eliminated this entirely.
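
The drift problem is easy to demonstrate. Creme uses Fraction.js; the tiny rational type below is just for illustration of why exact arithmetic matters.

```typescript
// Floating point: accumulate ten tenths of a cycle and you don't land on 1.
let t = 0;
for (let i = 0; i < 10; i++) t += 0.1;
console.log(t === 1); // false — t is 0.9999999999999999

// Exact rationals: the same accumulation lands on exactly 1/1.
type Frac = { n: number; d: number };
const gcd = (a: number, b: number): number => (b === 0 ? a : gcd(b, a % b));
const add = (a: Frac, b: Frac): Frac => {
  const n = a.n * b.d + b.n * a.d, d = a.d * b.d, g = gcd(n, d);
  return { n: n / g, d: d / g };
};

let ft: Frac = { n: 0, d: 1 };
for (let i = 0; i < 10; i++) ft = add(ft, { n: 1, d: 10 });
console.log(ft); // { n: 1, d: 1 } — exactly one cycle, no drift
```

One wrong tenth of a millisecond is inaudible; the same error compounding over thousands of cycles is not.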

Fire-and-forget audio. Rather than pooling and reusing audio voices (the traditional approach), each event creates its own independent node graph that cleans itself up when done. This sounds wasteful but enables true polyphonic parameters—every single note can have completely different effects—and dramatically simplified the codebase.
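
The lifecycle can be modelled with stand-in nodes. This is a toy sketch (the real engine builds Web Audio node graphs; none of these names are Creme's actual API) — the point is that voices share no state, so per-note parameters can never interfere.

```typescript
type StubNode = { label: string; live: boolean };

function playEvent(params: { gain: number; cutoff: number }) {
  // Fresh nodes for every event: no pooling, no reuse, no shared state.
  const graph: StubNode[] = [
    { label: `source(gain=${params.gain})`, live: true },
    { label: `filter(cutoff=${params.cutoff})`, live: true },
  ];
  const end = () => graph.forEach(n => (n.live = false)); // self-cleanup
  return { graph, end };
}

// Two overlapping voices with completely different per-note parameters:
const a = playEvent({ gain: 0.8, cutoff: 400 });
const b = playEvent({ gain: 0.2, cutoff: 8000 });
a.end(); // a's graph tears itself down; b is untouched
```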

The result: 106-165ms end-to-end latency from code change to sound. Fast enough that it feels immediate when you're performing.

What worked well

The pattern-as-query model proved extraordinarily powerful. Once the core was working, composing patterns became intuitive. Want to reverse a pattern? Wrap it in rev(). Want it faster? Use fast(2). These transformations compose cleanly because they're all pure functions.
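
With patterns as pure functions of the query span, transformations reduce to span arithmetic. A simplified single-cycle sketch (not Creme's implementation):

```typescript
type Ev = { start: number; end: number; value: string };
type Pat = (begin: number, end: number) => Ev[];

// A one-cycle sequence: kick in the first half, snare in the second.
const seq: Pat = (b, e) =>
  [
    { start: 0, end: 0.5, value: "bd" },
    { start: 0.5, end: 1, value: "sn" },
  ].filter(ev => ev.start >= b && ev.start < e);

// fast(): query a scaled-up span, then scale the events back down.
const fast = (factor: number, p: Pat): Pat => (b, e) =>
  p(b * factor, e * factor).map(ev => ({
    value: ev.value,
    start: ev.start / factor,
    end: ev.end / factor,
  }));

// rev(): mirror the query span, then mirror the events (one cycle only).
const rev = (p: Pat): Pat => (b, e) =>
  p(1 - e, 1 - b).map(ev => ({
    value: ev.value,
    start: 1 - ev.end,
    end: 1 - ev.start,
  }));

const flipped = rev(seq)(0, 1).sort((x, y) => x.start - y.start);
console.log(flipped.map(ev => `${ev.value}@${ev.start}`)); // ["sn@0", "bd@0.5"]
```

Both return a plain `Pat`, so `rev(fast(2, seq))` needs no special casing — composition falls out of the types.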

Using a PEG parser (Peggy) for the mini-notation was the right call. Grammar-based parsing meant the syntax was precisely defined, and adding new operators was straightforward. Manual parsing would have become unmaintainable quickly.

Starting with comprehensive documentation from day one—over 5,000 lines across 12 files—forced clarity of thought. When you have to explain what something does, you quickly discover if you actually understand it yourself.

What was harder than expected

Browser timing is genuinely difficult. JavaScript's event loop makes no guarantees about when code runs. The scheduler needed drift compensation—continuously tracking the difference between expected and actual time, then adjusting accordingly. Without this, playback would gradually fall out of sync.
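
The core of the fix is to size each timeout from an ideal timeline rather than from when the previous tick actually fired. A lookahead-style sketch (illustrative, not Creme's scheduler):

```typescript
// Compute the next timeout so the tick lands on the ideal grid, regardless of
// how late the previous tick actually fired.
function nextDelay(idealNext: number, now: number, interval: number) {
  const target = idealNext + interval; // where the following tick belongs
  return { target, delay: Math.max(0, target - now) };
}

// Simulate a runtime that fires every tick 5ms late. The delays shrink to
// compensate, so the error stays bounded instead of accumulating.
let ideal = 0;       // ideal time of the tick that just fired
let clock = 0;       // simulated wall clock
const interval = 25; // ms between ticks
for (let i = 0; i < 4; i++) {
  const { target, delay } = nextDelay(ideal, clock, interval);
  clock += delay + 5; // each tick arrives 5ms late
  ideal = target;
}
console.log(ideal);         // 100 — four ticks on the ideal grid
console.log(clock - ideal); // 5 — bounded, not 4 × 5 = 20
```

Naively waiting `interval` ms after each late tick would accumulate the full 20ms of error over those four ticks; anchoring to the ideal timeline keeps it at a constant 5ms.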

Web Audio's node graph model is powerful but verbose. Building 54 effects meant a lot of boilerplate connecting gain nodes, filters, and analysers. Fire-and-forget simplified the mental model but increased the total code needed.

I underestimated sample library complexity. The TidalCycles Dirt-Samples library has 80+ categories with wildly inconsistent naming. When a user writes s("arpy"), the system needs to find the right samples, load them on demand, and handle the case where they're not yet available—all without blocking audio playback.
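
The non-blocking part can be sketched with a two-map cache. Names like `loadBank` are hypothetical, not Creme's API: the first request for a bank starts a background fetch and returns nothing, so that event is skipped rather than stalling audio; later events find the cache populated.

```typescript
type Sample = { name: string; data: number[] };

const cache = new Map<string, Sample[]>();
const pending = new Map<string, Promise<Sample[]>>();

function getSamples(bank: string, loadBank: (b: string) => Promise<Sample[]>) {
  const hit = cache.get(bank);
  if (hit) return hit; // already loaded: play immediately
  if (!pending.has(bank)) {
    // First miss: start loading in the background, exactly once per bank.
    pending.set(
      bank,
      loadBank(bank).then(samples => {
        cache.set(bank, samples);
        pending.delete(bank);
        return samples;
      })
    );
  }
  return undefined; // not ready yet: skip this event, never block
}

// Usage: a fake loader standing in for a CDN fetch.
const fakeLoad = async (b: string) => [{ name: `${b}:0`, data: [] }];
console.log(getSamples("arpy", fakeLoad)); // undefined — first hit, loading
```

The `pending` map also deduplicates: a pattern triggering `s("arpy")` eight times a cycle starts one fetch, not eight.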

What I'd do differently

Start with TypeScript strict mode from the beginning. I enabled it partway through, and it caught several issues that earlier adoption would have prevented outright.

Build out the test suite earlier. Pattern logic is pure and testable, but I wrote most tests after the fact rather than alongside the code.

The AI integration came late in the build. Adding it earlier would have helped with documentation and example generation throughout development.

Why this matters

This wasn't a client project. It wasn't an MVP Sprint. It was week 10 of my 52 Products challenge—building one new product every week for a year.

But the approach is identical to how I work with clients. Constrained timelines force decisions. You can't overthink architecture when you've got a week to ship. You have to pick a direction, commit, and discover what works through building.

Creme isn't a business. It's a creative tool I wanted to exist, built to scratch a 40-year itch. Sometimes that's the best reason to build anything.

Try it

Creme is live at creme.music. No signup required—open it and start typing:

note("c3 e3 g3 c4").s("sine").slow(2)

Press play. That's algorithmic music. You just made it.

Following the 52 Products challenge? You can track progress on the Precode site. And if you're thinking about validating or building a product idea yourself, get in touch—this is exactly the kind of work I do in UX Sprints and MVP Sprints.