The ever-inventive YACHT used AI to deconstruct their entire back catalogue and then painstakingly put the whole lot back together to create ‘Chain Tripping’, an album of the year contender that has to be heard to be believed

“When you cut into the present, the future leaks out,” observed William Burroughs, discussing the creative cut-up techniques pioneered by Brion Gysin, continued by Burroughs himself and later adopted, to great effect, by the likes of David Bowie.

The roots of those experiments lie in rudimentary technology, sometimes as simple as literally cutting up and reassembling words on paper, at other times manipulating reel-to-reel tape. Staying true to their spirit in the age of artificial intelligence proved to be an altogether more complex and, at times, literally painful process. ‘Chain Tripping’, the latest album from the Los Angeles-based (although they originally hail from Portland, Oregon) electronic pop conceptualists YACHT, is the fruit of that labour, the result of taking those ideas as far as modern advances allow them to be stretched.

It is a remarkable record, both in the story of its construction and its otherworldly sound. It couldn’t have been made by humans alone, but is almost more human as a result.

YACHT – it’s an acronym, not just shouty caps for the sake of it, standing for Young Americans Challenging High Technology – used two central questions as a starting point for the album’s creation.

“We made ‘Chain Tripping’ by stringing a handful of disparate artificial intelligence processes together with a very specific goal in mind: to make new YACHT songs,” explains the band’s multi-instrumentalist Jona Bechtolt. “That raised two questions immediately: how do we define a YACHT song? And how do we work with AI when we’re not programmers?”

The answer to the first involved breaking themselves down, piece by painstaking piece. They paid someone sourced via the freelancers-for-hire site Fiverr to notate their entire 82-song back catalogue as MIDI information. Then they went through every song and chose their favourite patterns in two- and 16-bar chunks, until they’d collected together the best drum patterns, keyboard lines, vocal melodies and guitar riffs from their 17-year career.

“This data didn’t necessarily define us, but it was a start,” says Bechtolt.
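For a sense of what that slicing stage might look like in code, here is a minimal sketch, not YACHT’s actual tooling, that cuts a transcribed MIDI file into bar-aligned chunks using the open-source pretty_midi library; the file name and the default two-bar window are illustrative assumptions.

```python
import pretty_midi

def bar_chunks(path, bars_per_chunk=2):
    """Slice a notated MIDI file into short, bar-aligned note patterns."""
    pm = pretty_midi.PrettyMIDI(path)
    downbeats = pm.get_downbeats()  # the start time of every bar, in seconds
    chunks = []
    for i in range(0, len(downbeats) - bars_per_chunk, bars_per_chunk):
        start, end = downbeats[i], downbeats[i + bars_per_chunk]
        notes = [
            (note.pitch, note.start - start, note.end - start)
            for inst in pm.instruments
            for note in inst.notes
            if start <= note.start < end
        ]
        chunks.append(notes)
    return chunks

# e.g. two-bar patterns from one transcribed song, ready for hand-picking
# patterns = bar_chunks('some_yacht_song.mid', bars_per_chunk=2)
```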

“We then ran pairs of patterns through an open-source latent space interpolation model, an AI system that allowed us to navigate the high-dimensional mathematical space between melodies,” explains singer Claire L Evans (third member and all-round collaborator Rob Kieswetter was otherwise engaged on the day we chatted). “By repeatedly running our back catalogue through this model, we generated hundreds of new weird patterns. We worked through all this melodic data, found what we loved most and assigned each melody a role: as a bassline or a vocal, a guitar riff, a keyboard part. Drum patterns worked in the same way. We then pieced these fragments into songs, one by one, and then learned and performed each song live.”
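The “high-dimensional mathematical space” Evans mentions is what machine learning people call a latent space: each melody is encoded as a vector, points along the line between two vectors are sampled, and each point is decoded back into notes. The sketch below is purely conceptual, with placeholder encode and decode functions standing in for the real open-source model (Google’s Magenta tools, which come up later), plus a toy example at the end just to show the mechanics.

```python
import numpy as np

def interpolate_melodies(melody_a, melody_b, encode, decode, steps=8):
    """Walk the straight line between two melodies' latent vectors."""
    z_a, z_b = encode(melody_a), encode(melody_b)  # points in high-dimensional space
    new_patterns = []
    for t in np.linspace(0.0, 1.0, steps):
        z = (1.0 - t) * z_a + t * z_b              # a point partway between the two
        new_patterns.append(decode(z))             # decoded back into a new pattern
    return new_patterns

# Toy stand-ins: "melodies" as lists of MIDI pitches, the "model" as a plain
# numpy cast -- nothing like the real thing, purely to show the mechanics.
toy_encode = lambda m: np.asarray(m, dtype=float)
toy_decode = lambda z: [int(round(p)) for p in z]
print(interpolate_melodies([60, 62, 64, 65], [67, 65, 64, 62], toy_encode, toy_decode))
```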

Simple, right? But learning to play some of these generated melodies was harder than you might expect.

“Of course, they have no relationship to the human body,” says Evans. “A seemingly simple two-bar drum loop took Jona 30 minutes to learn.”

For the lyrics, they worked with creative technologist Ross Goodwin to build a text-generating model trained on a massive corpus of song lyrics, based on two million words from songs they’d previously written, as well as songs they love, and grew up listening to.

“The idea was to replicate the influences already mixed up in our heads,” continues Evans. “Ross’ model generated thousands of lines of text, which we printed out on one giant continuous sheet of dot-matrix printer paper and brought into the studio with us.

“We literally sat on the floor for two weeks, highlighting the best lines from this endless document of lyrical nonsense. Sometimes it blew us away. The model was programmed to also name the songs, and, once trained, could generate infinite songs at varying temperatures.”

Temperature, for the uninitiated, is a concept in machine learning that determines how much risk the model takes when generating output.

“Low-temperature models don’t take many risks,” explains Bechtolt, “and high-temperature models are all over the map. In the case of our lyrics model, low temperature results in lizard-brain, punk-rock lyrics: think “I want your mind” over and over across seven pages. High temperature gives us nonsense words – telepathetic, foodsteps, commuscle…”
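For the technically minded, the mechanics are straightforward: the model’s raw scores are divided by the temperature before being turned into probabilities, so a low temperature makes the likeliest word win almost every time, while a high temperature flattens the odds and lets strange outliers through. A minimal sketch with made-up numbers:

```python
import numpy as np

def sample_with_temperature(logits, temperature=1.0, rng=None):
    """Pick one token index from a model's raw scores at a given temperature."""
    rng = rng or np.random.default_rng()
    scaled = np.asarray(logits, dtype=float) / temperature
    probs = np.exp(scaled - scaled.max())   # softmax, shifted for numerical stability
    probs /= probs.sum()
    return rng.choice(len(probs), p=probs)

# At 0.2 the top-scoring word wins almost every time ("I want your mind",
# over and over); at 2.0 long-shot words like "telepathetic" start to surface.
scores = [2.5, 1.0, 0.2]  # made-up scores for three candidate words
print(sample_with_temperature(scores, temperature=0.2))
print(sample_with_temperature(scores, temperature=2.0))
```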

To cope with the sheer volume of material these systems generated, and to ensure the finished product would feel like YACHT, they created a strict set of rules.

“We couldn’t add a single note or word to the generated material, but we could remove and arrange things however we wanted,” says Evans. “It was essentially a cut-up method, which has a long history in both the avant-garde and in pop music. Our muses were William Burroughs and Brion Gysin, who pioneered the cut-up, and David Bowie, who used custom software in the 90s to generate lyrical fragments. We were attracted to the idea of taking a super low-tech approach to working with really hi-tech systems. Ultimately, technology is only as interesting as what you do with it.”

Evans admits that they went into the project with zero programming knowledge and only the vaguest sense of what was possible. But, she says, YACHT have always appreciated that technology and art are deeply interrelated, often pushing each other forward.

“We had a sense that machine learning might be like the synthesiser: an opportunity for music to evolve in unexpected ways. We wanted to understand what was happening, and we always learn by making.”

However, they needed help to get their heads around the technology, which, especially when they started experimenting a few years back, was not particularly friendly to non-programmers.

They were fortunate to connect with collaborators early on, building a dream team of “research scientists, gonzo coder poets, visual artists and designers” who share their fascination with machine aesthetics.

The aforementioned Ross Goodwin, a former Obama speechwriter and creative technologist, trained the text-generating model on their influences, handing them a 10,000-word document to work from. Machine learning research scientists at Google Brain’s Magenta project tweaked their open-source models for YACHT to experiment with in the studio, while Dadabots, a duo of creative technologist-producers, pointed a raw audio model called SampleRNN at stems of previous YACHT songs, generating hundreds of samples to act as a sonic library of source material.


Even the album’s artwork, typography, band photos and music videos are all showcases for collaborations with different artists working in the machine learning field. No easy task in itself since, as Evans points out, the disciplines of art and science remain very separate, especially when it comes to ways of working.

“There are very, very few people on this Earth who have both the cognitive capacity to code at the level necessary to work with machine learning and the specific madness it takes to make interesting art,” she says. “It’s just too much brain for one person. So making music this way often requires collaboration between technologists and musicians, either in the form of close collaboration, with the technologist building models to execute ideas, cluing the artist into what’s possible, adapting and refining models in an iterative, conversational process, or more abstractly, with the technologist building tools that can be adapted by artists to suit their purposes.

“We’ve done it both ways. At their best, these collaborations force both parties to move away from what’s comfortable, face their assumptions, learn new languages, and find common ground. Artists deal in the subjective, and engineers deal in the quantifiable, and so even the simplest things can prompt long philosophical discussion. Defining ‘success’ for example. What makes an AI’s output interesting, or good? Scientists want metrics, something they can feed back into the system to improve it. All we can do is respond to what these systems generate and make judgments about what speaks to us. Sometimes tools don’t do what they are meant to do at all, but they do something totally different, and maybe better.”


By feeding their entire back catalogue into the process, we suggest, they’ve created an alternative to a greatest hits album. A sort of greatest hits they never wrote, maybe?

“You’re right,” reckons Bechtolt, “but instead of it being the greatest hits we never wrote, it’s more like the greatest hits that could never exist under capitalism, or in this dimension, for that matter.”

In actual fact, the results, far from being a homogenised YACHT product that simply rolled off the production line, were so much weirder than anything they’d ever dare to write.

“Quite often, the stuff these AI models generated barely hung together,” says Evans.

“Sentences led in one direction and then took a sharp turn sideways. We’d get 10 pages of repetitive lyrics and then one bonkers digression. Notes seemed to form a melody before falling down the stairs completely. Meaning, when it emerged, emerged out of nowhere and knocked us sideways. Honestly, there was nothing cold about it. We taught a machine how to write songs by feeding it only the songs we love, and its surrealist attempts to emulate that data, our influences, were a constant joy.”

It’s true. The parts, both lyrical and musical, that were plucked out for use in the songs definitely have a weird poetry to them. From the album’s opening line “Welcome to your pleasure” on ‘(Downtown) Dancing’, with its inverted ESG-style bassline and instinct-defying onbeat cowbells, to the dub/synthpop hybrid ‘Sad Money’ and its chorus of “I’m gonna have sad money”, a new way of expressing meaning seems to be taking shape.

‘Loud Light’, for instance, offers up a priceless chorus: “I’m so in love / I can feel it in my car”, a line that arguably no human would ever write, but one that has an uncanny grasp of the wayward language of pop. Contrary to what you might think, Bechtolt says, the process wasn’t about discovering “crazy” moments. Quite the opposite in fact. It was more about being shocked at the sheer beauty of a melody, or being delighted at the wonkiness of a drum pattern. He points to the instrumental break a minute and a half into the album’s third track ‘Scatterhead’ as a good example.

“That was the first melody that really surprised us,” he says. “It was a loose, aimless, over-long digression that breaks all the rules and yet somehow still works. It’s the melody that convinced us this whole project was worthwhile. A melody like that is worth building a house around.”

The hardest part was trying to bypass the highly instinctual, intuitive creative process that was second nature to them.

“We tend to lean on what feels natural and fun for our brains and bodies,” says Bechtolt. “The specific process we imposed on ourselves for this album was not natural, and it often wasn’t fun at all. Since these AI models have no relationship to the human body, and even less relationship to their very specific musical limitations, some of the parts are really weird and hard to perform. Learning the bassline for ‘(Downtown) Dancing’ nearly dislocated my shoulder!”

He acknowledges it is very much a case of no pain, no gain.

“We’re not alone in our belief that the good stuff lives in the space between ambition and means,” he says. “Our favourite music was made by people struggling to just get through the songs.”

The proof is in the pudding, of course, and ‘Chain Tripping’ proves beyond doubt that all that pain, both mental and physical, was well worth it. Far from feeling cold and emotionless, it has a feel of vivid hyper-reality. The whole process has also helped to break YACHT out of the habits of a lifetime, something that will stay with them and can only be a good thing for their future creativity.

“It’s been extremely challenging,” says Bechtolt in conclusion, “but for every habit we break, we make space for something new. AI is a strange mirror that helps us to see ourselves. When used in this way, we genuinely believe it makes us better.”

‘Chain Tripping’ is out on DFA
