What Does Titan Sound Like?
That’s the question I’ve been living with for months now. Not the literal sound—we know Titan’s thick atmosphere would carry sound differently than Earth’s, and that the methane rain would create its own acoustic signature. I’m talking about the emotional sound. The sonic texture of William Falck’s transformation. The frequency of a father’s certainty colliding with a son’s doubt. The rhythm of humanity standing before something incomprehensibly vast and ancient.
This is where the music side of Fall of the Titans comes alive, and it’s the part of this project that’s most purely me. No AI collaboration here—just me, my studio, and the sounds trying to escape my head.
The Studio Setup
Let me walk you through where the magic (and frustration, and late-night breakthroughs) happens:
Reason 13 is my DAW of choice. I know some producers swear by Ableton or FL Studio, but Reason’s rack-based workflow just clicks with how my brain works. There’s something about virtually patching cables between devices that feels right for creating the interconnected, layered soundscapes I’m after. Plus, Reason’s native instruments are criminally underrated.
My Korg R3 sits within arm’s reach. This little black beauty is my go-to for generating raw sound material that I then mangle and reshape in Reason. There’s a tactility to twisting physical knobs that you just don’t get from clicking a mouse. When I need that perfect pad sound or a growling bass that has organic movement, the R3 is where I start.
The Nektar Impact LX61+ is my main MIDI controller, and honestly, it’s become an extension of my hands at this point. The key action is perfect for the kind of dramatic, cinematic playing the soundtrack demands.
The Virtual Arsenal
Here’s where things get interesting. While the hardware provides the foundation, my VST collection is where I sculpt the signature sound of Fall of the Titans:
Serum 2 – This is my secret weapon for the Robot Core sounds. Those massive, metallic, almost-alive tones that represent the Cores themselves? That’s Serum. The wavetable synthesis lets me create sounds that feel mechanical but not lifeless, technological but somehow conscious. Perfect for a story about AI and human consciousness merging.
Sylenth – My workhorse for leads and pads. When I need something that cuts through the mix with crystal clarity, Sylenth delivers. A lot of the emotional melodic content—the themes that represent characters like Audencia or the tragedy of the Soul Prisms—comes from Sylenth patches I’ve spent hours refining.
Synthmaster 1, 2, and 3 – These are my texture generators. When I need a sound that’s weird, unexpected, or slightly unsettling (hello, Etheral presence), Synthmaster is where I dig. The modulation capabilities are insane, and I can create evolving soundscapes that feel alive and alien.
Reason’s Native Instruments – Thor, Europa, Grain—these often get overlooked, but they’re phenomenal. Europa especially has become crucial for creating the atmospheric beds that sit beneath the action. That sense of vast Martian landscapes? Europa pads with carefully crafted reverb chains.
Kontakt Libraries – I’m sitting on a collection of libraries that I’ll detail more when we get to individual chapter releases. Orchestral elements, ethnic instruments, found-sound percussion—Kontakt is the bridge between electronic and organic. When I need something to ground the synthetic sounds in human emotion, Kontakt provides it.
The Compositional Philosophy
Here’s how I approach creating music for this story:
Each Chapter Has a Sonic Identity
Every chapter of Fall of the Titans has a corresponding musical track, and each needs its own sonic personality while still fitting the album’s overall sound.
For example, Chapter 17 “Doomsday Clock” needed to feel like an inevitable countdown: mounting tension, the slow realization that everything is about to change. So the track builds with layers of rhythmic elements—ticking percussion, pulsing bass, rising synth lines—all converging toward a moment of release that never quite comes. The tension just… escalates.
Compare that to Chapter 9 “Love Me Until I’m a Ghost,” which needed to capture Audencia’s farewell, the weight of motherhood and sacrifice, digital consciousness fading. That track is all atmosphere and emotion—lots of space, reverb-drenched pads, a simple but devastating melody that fragments and distorts as the track progresses, mimicking her consciousness degrading.
Character Themes as Motifs
I’m a huge believer in leitmotif—recurring musical themes associated with characters or concepts. It’s pure Star Wars / Lord of the Rings / Gundam influence, and it works.
Falck has a theme that’s evolved across both albums. In the first story, it was simpler, more uncertain. In Part II, it’s become more complex, layered with dissonant elements that represent his Divergent transformation. You can hear his humanity and his evolution wrestling in the harmony.
Ares has a theme that’s all cold certainty—fourths and fifths, no warmth, very industrial. When you hear it, you know immediately: this is authority without mercy.
The Etherals don’t have a traditional theme. Instead, they have a texture—crystalline tones, frequencies that sit just at the edge of the human hearing range, sounds that make you slightly uncomfortable without knowing why. They’re not meant to be understood, just felt.
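If that shorthand feels abstract, here’s a tiny, throwaway Python sketch of the two ideas: an Ares-style stack of bare fourths and fifths, plus a faint partial parked near the edge of the human hearing range for that Etheral shimmer. To be clear, nothing on the album is made this way (it all happens in Reason and on the R3), and the specific frequencies and filename are just illustrative choices.

```python
# A throwaway illustration, not how the album is produced. It renders an
# "Ares-style" interval stack (root, fourth, fifth, octave) plus a faint
# near-ultrasonic partial at ~17.5 kHz for the "Etheral" shimmer.
# The frequencies and output filename are illustrative, not from the album.
import wave
import numpy as np

SR = 44100                                   # sample rate in Hz
DUR = 4.0                                    # clip length in seconds
t = np.linspace(0, DUR, int(SR * DUR), endpoint=False)

root = 110.0                                 # A2
stack = [root, root * 4 / 3, root * 3 / 2, root * 2]   # root, 4th, 5th, octave
signal = sum(np.sin(2 * np.pi * f * t) for f in stack) / len(stack)

# Barely-there shimmer near the edge of hearing.
signal += 0.05 * np.sin(2 * np.pi * 17500.0 * t)

# Short fades so the clip doesn't click at the edges.
fade = int(0.05 * SR)
signal[:fade] *= np.linspace(0, 1, fade)
signal[-fade:] *= np.linspace(1, 0, fade)

# Write a 16-bit mono WAV using only the standard library.
pcm = np.int16(signal / np.max(np.abs(signal)) * 0.8 * 32767)
with wave.open("ares_etheral_sketch.wav", "wb") as wf:
    wf.setnchannels(1)
    wf.setsampwidth(2)
    wf.setframerate(SR)
    wf.writeframes(pcm.tobytes())
```

Play the result quietly on decent headphones: the bare fourths and fifths read as cold on their own, and most adults can’t consciously pick out the 17.5 kHz component, which is exactly that “felt, not understood” quality.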
The Album Must Flow
While each track needs to work with its chapter, the album as a whole needs to flow as a musical journey. All acts combined will be a double LP experience—you should be able to put on headphones, close your eyes, and travel through the entire emotional and narrative arc.
This means careful attention to key relationships between tracks, dynamic pacing (you can’t be at 11 the whole time), and creating moments of breath between the intensity.
The Production Process
Here’s how a typical track comes together:
Step 1: The Emotional Core
I start by reading the chapter multiple times until I can feel its emotional center. What is this chapter about underneath the plot? Chapter 20 “The Wulf at the Door” isn’t just about Krueger switching sides—it’s about the moment when certainty cracks and you have to choose between comfortable lies and painful truth.
Once I have that feeling, I sit at the keyboard and just… play. No planning, no structure, just improvising until something emerges that captures the emotion. This is usually piano or a simple synth, very raw. I record everything because the magic often happens when you’re not trying.
Step 2: The Foundation
From those improvised sessions, I identify the core elements—a chord progression, a melodic fragment, a rhythmic pattern. This becomes the foundation. I’ll build it out in Reason, choosing sounds that match the emotional texture. Is this chapter cold? Warm? Organic? Mechanical? The sound palette follows the feeling.
For combat-heavy chapters, I start with rhythm and build up. For emotional moments, I start with harmony and melody. The structure follows function.
Step 3: The Layers
This is where the magic happens and where hours disappear. I layer sound upon sound, always asking: does this serve the emotion? Does this add to the narrative?
I’m not afraid of density when it’s called for. The assault on Titan should sound massive, overwhelming, chaotic. But I’m also not afraid of space. The moment when the Etherals arrive and every Soul Prism goes dark? That needs silence, or near-silence, because the absence of sound is just as powerful as its presence.
The Korg R3 gets a workout here. I’ll record passes of modulated pads, filtered noise sweeps, distorted leads—raw material that I then chop up, time-stretch, reverse, and layer in Reason. Some of my favorite sounds are happy accidents: I was going for X, got Y instead, and Y turned out to be perfect for a section I hadn’t even written yet.
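None of that mangling happens in code (it’s Reason’s slicing, stretching, and reversing tools doing the work), but for anyone who thinks better in code, here’s roughly what chop, time-stretch, reverse, and layer look like done offline in Python. The filename is hypothetical, and the sketch assumes the librosa and soundfile libraries are available.

```python
# A rough offline sketch of the mangling described above, not the actual
# workflow. "r3_pad_take.wav" is a hypothetical Korg R3 recording; librosa
# and soundfile are assumed to be installed.
import numpy as np
import librosa
import soundfile as sf

y, sr = librosa.load("r3_pad_take.wav", sr=None)   # keep the native sample rate

# Chop: grab a one-second slice from the middle of the take.
start = len(y) // 2
chop = y[start:start + sr]

# Time-stretch: slow the slice to half speed (rate < 1.0 stretches it out).
stretched = librosa.effects.time_stretch(chop, rate=0.5)

# Reverse: flip the stretched slice end to end.
backwards = stretched[::-1]

# Layer: stack the forward and reversed versions, the reversed one quieter,
# then normalize the peak so nothing clips.
layered = stretched + 0.6 * backwards
layered /= max(np.max(np.abs(layered)), 1e-9)

sf.write("r3_pad_mangled.wav", layered, sr)
```

The point isn’t the code; it’s that each of those operations is simple on its own, and the character comes from stacking them in odd orders and keeping the accidents.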
Step 4: The Arrangement
Once I have all my sounds and ideas, I arrange them into a structure that serves both the music and the narrative. This is where I’m thinking about:
- Pacing – When do we build? When do we release?
- Dynamics – Loud and quiet aren’t just volume levels, they’re emotional states
- Space – Where can the listener breathe?
- Payoff – If I’ve been building tension, where’s the release?
I’ll often have the chapter text open while arranging, making sure musical moments align with narrative beats. When Falck makes a crucial choice, the music should acknowledge it. When horror strikes, the sound should make you feel it.
Step 5: The Mix
Mixing is where good tracks become great or fall apart. I’m not a mastering engineer, but I’ve learned enough to get my tracks to a professional level.
I use a combination of Reason’s built-in effects and external plugins. IK Multimedia’s compressors and mixing tools are constantly in my chain. Reason’s reverbs create the space and atmosphere—there are entire sections of tracks that are just reverb tails, creating this sense of vast emptiness that’s perfect for Mars and deep space.
The challenge with the Fall of the Titans sound is balancing clarity with complexity. There’s a lot happening in these tracks, but every element needs to be heard and felt. That means careful EQ work, surgical compression, and constant A/B testing against professional reference tracks.
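For anyone curious what “surgical compression” actually does under the hood, here’s a bare-bones downward compressor in Python: an envelope follower feeding a static gain computer. It’s a teaching sketch with made-up parameter values, not anything I mix with; the real plugins (Reason’s devices, IK Multimedia’s tools) add knee shaping, look-ahead, makeup gain, and far better level detection.

```python
# A minimal downward compressor for illustration only: a one-pole envelope
# follower (level detector) driving a static gain computer. Parameter values
# are arbitrary defaults, not settings from the album mixes.
import numpy as np

def compress(x, sr, threshold_db=-18.0, ratio=4.0, attack_ms=5.0, release_ms=80.0):
    # One-pole smoothing coefficients; a smaller coefficient tracks faster.
    attack = np.exp(-1.0 / (sr * attack_ms / 1000.0))
    release = np.exp(-1.0 / (sr * release_ms / 1000.0))

    env = 0.0
    out = np.zeros_like(x)
    for i, level in enumerate(np.abs(x)):
        coeff = attack if level > env else release
        env = coeff * env + (1.0 - coeff) * level      # smoothed signal level
        level_db = 20.0 * np.log10(max(env, 1e-9))

        # Gain computer: every dB over the threshold is cut by (1 - 1/ratio) dB.
        over = max(0.0, level_db - threshold_db)
        gain_db = -over * (1.0 - 1.0 / ratio)
        out[i] = x[i] * 10.0 ** (gain_db / 20.0)
    return out

# Example: tame a loud burst in a synthetic tone.
sr = 44100
t = np.linspace(0, 1, sr, endpoint=False)
sig = 0.2 * np.sin(2 * np.pi * 220.0 * t)
sig[sr // 2:sr // 2 + 2000] *= 4.0                     # sudden loud transient
tamed = compress(sig, sr)
```

The “surgical” part in a real mix is mostly about where you point this: a narrow band, a single layer, a few dB of reduction at most, rather than squashing the whole track.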
The Kontakt Libraries (A Teaser)
I’m saving the detailed breakdown for when we release individual chapters, but here’s a taste of what’s in the arsenal:
- Orchestral libraries for the grand, cinematic moments
- Ethnic percussion that adds human texture to electronic soundscapes
- Vocal libraries for the eerie, wordless choirs that accompany Etheral presences
- Found-sound collections that I mangle into percussion and texture
The key is never using these libraries “straight.” Everything gets processed, distorted, reversed, time-stretched. I want organic sounds that feel technological, human elements that sound alien. It’s all about that blur between categories, which is what the story is about too.
The Challenges
Let me be honest about what’s hard:
Creative block – Some chapters flow immediately. Others? I’ll spend a week going nowhere, scrapping idea after idea. Chapter 21 “The Science of Suffering” fought me for two weeks before I finally found the right approach.
Technical limitations – I know what I want to hear in my head, but sometimes I lack the technical skill to realize it. I’m constantly learning new production techniques, new sound design tricks. Every album is an education.
Balancing music and narrative – Sometimes I’ll create something that sounds amazing but doesn’t serve the story. Killing your darlings is hard when your darling is a perfect four-minute electronic symphony that took two days to craft.
The pursuit of perfect – I’m a perfectionist, which is both a strength and a curse. I can tweak a snare drum sound for an hour. At some point, you have to say “it’s done” and move on, even when part of you wants to keep refining forever.
Why No AI in Music?
People ask why I use AI for writing but not music. The honest answer is: the tools aren’t there yet for what I need. Current AI music generators can’t match the specificity, the emotional nuance, the careful sound design that this project demands.
But it’s also because music is my native language. I’ve been making electronic music for years. It’s where I’m most confident, most fluent. I don’t need collaboration there—I know how to speak through synthesizers and samplers and effects chains.
Writing? That’s a language I’m still learning, where collaboration helps me reach higher. Music? That’s where I can already say what I need to say.
What’s Coming
Right now I’m deep in production for Acts II, III and IV. The tracks are in various stages:
- Some are 90% done, just need final mixing touches
- Others are at the “rough idea” stage, foundations laid but miles to go
- A few are still just sketches, waiting for the right moment of inspiration
My goal is to have 15 chapters completed before I start weekly releases. Each chapter, each track, needs to meet the standard I’ve set. This isn’t about quantity—it’s about creating something that resonates, that matters, that lives up to the story burning inside me.
The Ultimate Goal
When someone experiences Fall of the Titans II, I want the music and narrative to be inseparable. Reading the chapter while listening to its track should create something greater than either alone. The music should make you feel what the characters feel. The story should make you understand what the music is saying.
That’s the goal. That’s what I’m working toward in these late-night studio sessions, tweaking reverb tails and layering synth pads and searching for the perfect kick drum sound.
Because in the end, this isn’t just about making music or telling a story. It’s about creating an experience—a journey to Mars, into the depths of consciousness and evolution, to the frozen moon of Titan where humanity faces its reckoning.
And that experience needs to sound as powerful as it reads.
The sound of war is taking shape. Soon, you’ll hear it too.
— Millennium Falck

