RE Engine May Be the Most Flexible Proprietary Engine Out There

This is a post I’ve had in the drafts folder for some time. I’ve been spending a lot of time with the RE Engine lately, thanks to working on Season 5 – Resident Evil for Chapter Select, so I decided it was time to crank this one out.

Holy smokes Capcom has a dope game engine!

The RE Engine made its debut with Resident Evil 7: biohazard in 2017. The “RE” would imply Resident Evil, but it actually stands for “Reach for the Moon Engine.” Capcom has certainly done that. A new engine made sense for a reinvention of the franchise: the shift to a first-person perspective and a fully playable VR version of the game dictated a need for new tech.

Since RE7’s release, Capcom has turned the Resident Evil engine into the Capcom Engine. There are currently 17 games built on the RE Engine. The real dope part is how diverse that catalog is and how many different platforms are supported.

While the bulk are third-person action/shooters, the RE Engine also supports first-person games, VR, 2D platformers, retro-emulated arcade games, fighting games, action games, and online multiplayer games. The list of supported hardware is even longer:

  • Xbox One
  • PlayStation 4 / PSVR
  • Nintendo Switch
  • Xbox Series consoles
  • PlayStation 5 / PSVR2
  • PC
  • Apple Silicon-based Mac computers
  • Cloud-streaming platforms

The spread here is wide. And with that spread comes a slew of technical capabilities. My PS5 version of RE7 has options for ray tracing and 120 fps, while the Switch cloud version varies. Resident Evil Village hits Apple Silicon Macs next week. I am curious how it performs on my M1 iMac compared to a maxed-out M1 Ultra and a beefy PC.

Outside of spooky time games, Capcom making this the backbone of Street Fighter 6 is serious. The style is off the charts, and early impressions indicate the gameplay is solid.

Compared to the likes of Ubisoft’s Snowdrop and EA’s Frostbite, the RE Engine just seems better. RE Engine appears to be more flexible than Snowdrop, which is primarily used for online shooters, open-world RPGs, and Rabbids games (Mario + Rabbids does look beautiful). Frostbite is notorious for being an FPS engine stretched to cover sports games and massive RPGs. Credit where it is due, though: while writing this, I checked what engine NFS Unbound uses, and it is reportedly Frostbite. The new Need for Speed has a unique style to it, not far off from Street Fighter 6 actually. Graffiti is so hot right now.

With a new update for Resident Evil Village out soon1 and the ground-up remake of Resident Evil 4 on the way, I am going to be spending a fair chunk of time with Capcom’s game engine. It’s almost as exciting as the games themselves.


1. A game that supports first-person, third-person, and VR!

Naughty Dog on Learning PC Development

How Uncharted: Legacy of Thieves Collection’s PC launch speaks to Naughty Dog’s present and future by Christian Gyrling for PlayStation Blog

I always told my PC-gamer friends that when I can play Uncharted on PC, that’s when I’ll build one. Well, that day has finally come. I’m not in the market to build a PC right now, but I guess my bluff has been called.

To celebrate the launch of Uncharted: Legacy of Thieves Collection on PC, one of Naughty Dog’s vice presidents took to the PS Blog to share insight into bringing their games to PC.

We also knew this PC release wasn’t going to be a one-off. As you may already know, The Last of Us Part I is in development for PC following its successful launch on PlayStation 5, and with this being, ahem, Uncharted territory for us, we knew we had a lot to learn about bringing our games to PC. But we were also determined to bring our careful consideration for every aspect of our games to this new version.

And The Last of Us Part I won’t be the last Naughty Dog PC game either. I suspect all future Naughty Dog games, and all PlayStation Studios games for that matter, will come to PC at some point in their lives.

First and foremost, we learned, particularly through our partnership, what it takes to bring our own engine to parity to deliver on PC hardware. Uncharted 4 and Uncharted: The Lost Legacy are already beautiful games in their own right, and we wanted to maintain that quality in the PC release. But we still wanted to provide flexibility in fine-tuning an experience PC players expect, and so it was important for us to support more cinematic resolutions as well as specific PC graphical features.

Naughty Dog’s engine is their own secret sauce, one they’ve kept refining since Jak & Daxter on the PS2. The jump to PS3 was notoriously difficult for the team. Since that console generation, Naughty Dog has prioritized adapting their engine to new hardware as soon as possible; see The Last of Us Remastered on PS4, The Last of Us Part II‘s PS5-specific patch, Uncharted: Legacy of Thieves Collection on PS5, and The Last of Us Part I. All of these projects have brought Naughty Dog’s engine1 into the ninth console generation, and now it’s on PC.

The PC space is also one that offers users a ton of flexibility in hardware specs, controls, and more. As we’ve primarily only ever had to focus on considering one or two system specs in the past, this was simply eye-opening for us. Having primarily developed for console controllers, we had to learn about the preferences and flexibility keyboard and mouse controls offer, and we found that we had to re-evaluate certain game mechanics to fit the new input methods.

This article reiterates that PS5 is Naughty Dog’s primary focus. So I am curious how bringing games to PC now will impact design. When you go from two SKUs to so many, how does that enable or hinder game design?

We also needed to account for the variability of PC hardware as it pertains to data loading, and so we reworked our engine to add a “safety valve” of sorts to ensure a smooth gameplay experience across various PC specs. This isn’t something we’ve had to worry about since the Jak and Daxter days, when we added an animation of Jak stumbling if data was loading in too slowly.

Apparently you had subtle safety valves even back then. What an immersive way to deal with technical limitations: never pull the player out of the world and the game. Jak and Daxter‘s original goal was one seamless world, and this stumble animation was one way Naughty Dog achieved that.
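
Just to illustrate the idea, here’s a minimal sketch of what a streaming “safety valve” like that could look like in engine code. Everything here is hypothetical; Naughty Dog hasn’t published their implementation, and the names and numbers below are made up.

```cpp
#include <iostream>

// Hypothetical sketch of a streaming "safety valve": if the data for the
// region ahead of the player hasn't finished loading, stall the player
// in-fiction (a stumble animation) rather than popping a loading screen.
struct StreamingManager {
    float meters_resident = 0.0f;  // how far ahead assets are already loaded

    // True when everything within `lookahead` meters of the player's
    // predicted path is resident in memory.
    bool region_ready(float lookahead) const {
        return meters_resident >= lookahead;
    }
};

enum class PlayerState { Running, Stumbling };

PlayerState update_player(const StreamingManager& streaming,
                          float lookahead_meters = 30.0f) {
    if (!streaming.region_ready(lookahead_meters)) {
        // Streaming is behind: slow the player down diegetically instead of
        // letting them outrun the loaded world.
        return PlayerState::Stumbling;
    }
    return PlayerState::Running;
}

int main() {
    StreamingManager slow_disc{12.0f};  // throughput lagging behind the player
    StreamingManager fast_ssd{120.0f};  // plenty of runway already loaded

    std::cout << (update_player(slow_disc) == PlayerState::Stumbling ? "stumble" : "run") << '\n';
    std::cout << (update_player(fast_ssd) == PlayerState::Running ? "run" : "stumble") << '\n';
}
```

Modern Naughty Dog games presumably do something far more sophisticated, but the principle from the quote is the same: the engine buys itself time without breaking the fiction.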

We’re excited to be offering The Last of Us Part I on PC in the future, and know that, moving forward, adding PC development to the way we develop games, which in no way undermines the importance of PlayStation 5 as our primary platform, will continue to benefit our team in the long run.

As PlayStation enters the PC realm, I expect the standalone The Last of Us multiplayer game to launch on PC day and date with the PS5 version. A big tell will be when God of War: Ragnarök comes to PC. With the groundwork laid for Sony Santa Monica’s engine and pipeline, how long will Jim Ryan and PlayStation hold onto console exclusivity? When will Spider-Man 2 swing onto PC? When will PlayStation bring Nixxes in to co-develop PC versions of titles for same-day launches? That’s the future for PlayStation Studios in one way or another.


1. I wish I knew the name they call their engine…

Chapter Select: Season 4, Episode 5 – Fast & Furious 6

Photo and design by Max Roberts

Listen up! We’re heading to London, baby! Max Roberts, Logan Moore, and wrestling move expert Mario Rivera hit the streets to find out if ghosts are real as the Fast Family goes out to save one of their own. Does Justin Lin land the finale of his trilogy, or does he run out of runway?

Download (43MB)

RSS Feed | Overcast | Apple Podcasts | Spotify | YouTube

Fast & Furious 6

Rotten Tomatoes – 71% critics and 84% audience


This episode was originally recorded on September 5, 2022.

@ChapterSelect

Max’s Twitter @MaxRoberts143

Logan’s Twitter @MooreMan12

Mario’s Twitter @ThatMarioRivera

Researcher, Editor, and Producer – Max Roberts

Hosted by Logan Moore & Max Roberts

Photo and Art designed by Max Roberts.

I Bought a Pixel 7

Don’t tell Tim Apple, but I bought a Pixel 7. This is my first Android phone.1 I’ve always wanted to have one lying around. It’d offer a peek at what I am missing outside the walled Apple orchard. I could keep a better pulse on the Android and Google world. Curiosity drove this desire. So with an absurdly good trade-in offer, I decided to let my beloved iPhone Xs Max go to the great Best Buy in the sky.2 Really, I sacrificed it for science.

The phone showed up yesterday. I set it up fresh. Then I asked myself, “What the heck am I going to do with this thing?”

I spent the night and morning digging into my ideas as to why I even bought the Pixel. Now that it’s here, it’s time to start experimenting and finding where it fits in my life.

The anchor of bringing an Android flagship into my personal technology fleet was utility.

When I watch the Made by Google events and Google I/O, there are usually a few features that leave me gobsmacked in one way or another. Google shows off software I want in my life. The prime example is live transcription. You see the demos on stage or in reviews and you can’t help but think there’s some sort of illusion happening before your eyes. My brain leaps right to interviews and my podcasts.

I tested this feature out just this morning by playing a few minutes of Ben Thompson’s interview with Mark Zuckerberg and Satya Nadella about partnering in the Metaverse. The transcription is almost realtime. With decent grammar.3 When Google releases the update to their Recorder app that adds speaker labels, I plan to use it to make passable transcripts of Chapter Select and The Max Frequency Podcast. It may not be a perfect transcription, but the out-of-the-box accuracy and ease of use make offering this type of resource affordable to me. It is magic.

Like I just talked about with Casey Liss as I prepared to become a parent, the best camera is the one you have on you. That’s a lesson I learned in photojournalism at UCF (one of many). I upgraded to the iPhone 14 Pro Max this year so I could have the best iPhone camera in time for my daughter’s arrival. We also bought a big ol’ fancy camera lens. This trade-in deal seemed like the perfect opportunity to add the legendary Pixel camera to my camera tool belt.

For years, I’ve heard MKBHD talk about Google’s punchy colors, and I’ve seen the camera comparisons. Each manufacturer has their own spin on camera priorities and computational photography. I finally get to have two of the major players on deck.

I may be ride or die with Apple, but I do my best to stay on top of the major goings-on with Samsung, Google, and interesting Android devices. Generally, that involves watching MKBHD videos to keep a steady pulse. Keeping this eye on “the other side” has helped me see where Apple falls short and where it stays ahead. More so, the hardware and software are fascinating. Foldables only exist in Android Land for now. There are under-glass fingerprint sensors, super-fast charging options, and wild camera arrays. Experimentation is still in full swing over the walls. While Google’s flagship is hardly full-fledged experimentation compared to the rest of the market, the Pixel 7 feels that way to me as a near-decade-long iPhone user.

So right now, my plan for the Pixel is two-fold: 1) use it as a tool for podcasting and recording and 2) have a better understanding of the Google and Android ecosystem. I’m looking at the Pixel with a laser focus. What can I do here to enhance my work? I don’t need Twitter or messaging or fitness on my Pixel. It won’t grow in those areas of my walled orchard. But I can use it to prune distractions in production. I can enrich what I make. The Pixel isn’t here to disrupt my workflow, but strengthen it. We’ll see how this experiment pans out.

And now for some quick, initial impressions of the Pixel 7 experience.

  • I prefer the iPhone 14 Pro Max always-on display. I like the colors and the widgets. If Apple was late to the always-on game so they could achieve this vision, I’m glad they waited.
  • The fingerprint scanner is bright. It’s slightly slower than what I am used to (aka Touch ID), but them’s the breaks with optical sensors. Still a neat feature.
  • Face unlock, not so much. With no depth sensors, it’s physically not as secure as my fingerprint or Face ID. Having dual biometric access is an odd balance. I just default to the fingerprint. Sometimes the face works first. I may just disable Face Unlock.
  • The camera is snappy. There are these helpful little videos when you enter a new spot of the UI. HDR10 video is limited to 30 fps.
  • Apple Music is nice to have, but it’s weird to have a slice of Apple UI within the confines of Android. Is there an Apple TV app?
  • This phone is slippery. As MKBHD would say, it’s a glass sandwich. I have to buy a case.
  • The gestures are not natural, and I’m not sure if that’s four years of iOS gestures or just out-of-touch design. If you even graze the left side of the screen, it reacts like a Back button. If I try to swipe open the lefthand panel in Discord, it backs out of the app. I turned the sensitivity of this feature down as low as it would go. It will certainly take some getting used to.
  • Why are the Chrome controls up top? I don’t want to stretch to the upper right corner to access my tabs. Please. Help.

1. It can’t make phone calls, so is my Pixel 7 even a “phone?”

2. The Xs Max was/is valued around ~$185, but the Pixel 7 promotions brought that up to $475. Best Buy also threw in a $100 gift card. After tax, the Pixel 7 cost me ~$60, which seemed worth the price of my curiosity.

3. Unlike that sentence.

Dedicated Hardware

While chilling in the hospital during a hurricane after having a baby, I was reading Ben Thompson’s free weekly Stratechery newsletter about Nvidia and their current position.

Moreover, Nvidia is, as with ray-tracing, backing up DLSS with dedicated hardware to make it much more performant. These new approaches, matched with dedicated cores on Nvidia’s GPUs, make Nvidia very well-placed for an entirely new paradigm in not just gaming but immersive 3D experiences generally (like a metaverse).

Nvidia in the Valley by Ben Thompson

Editing in WordPress’ iOS app is atrocious (plus I was busy with a new human), so the draft sat on the server. Then this morning, I was watching MKBHD’s iPhone 14 Plus video and he mentioned a similar trait.

But what’s really happening now is companies are developing their own silicon with special sections of their system on a chip designed to accelerate certain functions that they think will be a priority for their users.

These observations on dedicated hardware in custom silicon reminded me of the PS5. Here’s what I wrote about Sony’s ninth generation console in 2020.

The through line for the whole talk was how customized the PS5 actually is. The SSD is custom. The CPU and GPU are custom. The I/O board is custom. Beyond parts you’d need to build a gaming device, Sony has developed and incorporated their own custom silicon to aid those cornerstone components. There is a custom flash controller for the SSD, which helps prioritize and free up lanes for information to go through. There is a custom “Kraken” decompressor. Kraken is a compression tool that is supposed to be popular amongst many game developers. This custom decompressor unpacks that format with the equivalent power of nine Zen 2 cores. These are not chips you can buy off the shelf and slap into a PC.

All of these custom components surround the cornerstone chips that make game consoles possible. Instead of forcing developers to conform to a custom standard, this hardware seems to alleviate hardware workloads and assist developers. It strikes me as the inverse of the PS3 and its Cell architecture. It took developers quite a bit of time to adapt to the Cell processor; it was notoriously tricky to work with. The PS5 is using a custom AMD Zen 2 processor, which seems to be an industry standard…

…I thought the most custom element was the 3D audio tech that Mark detailed. Sony wanted to offer great audio for all players, not just those with fancy sound systems or headphones. So they went ahead and built custom hardware to help create 3D audio from any set of speakers (eventually). Headphones are the gold standard due to one speaker per ear, but Mark even talked about generating 3D audio from TV speakers. With it included in every single PS5, that gives all players and all devs the opportunity to experience/use 3D audio. It reminds me of the leap from standard definition to HD, but for our ears.
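
Since neither my old post nor Cerny’s talk spells out what “alleviating workloads” looks like in code, here’s a rough sketch of the pattern a dedicated decompression block enables: instead of burning CPU threads unpacking assets, the game hands compressed blocks to the unit and gets told when the data lands in memory. Every name below is hypothetical; Sony’s actual I/O API isn’t public, and the real Kraken codec ships as part of RAD Game Tools’ Oodle suite.

```cpp
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

// Hypothetical sketch of why a dedicated decompression unit matters.
// None of these names are Sony's; the PS5's real I/O APIs are not public.
struct CompressedBlock {
    std::vector<uint8_t> bytes;  // Kraken-compressed asset data
};

// Option A: decompress on the CPU, tying up worker threads per block.
std::vector<uint8_t> decompress_on_cpu(const CompressedBlock& block) {
    // A real codec would do heavy work here, burning CPU cycles.
    return block.bytes;  // placeholder: pretend the data is now unpacked
}

// Option B: hand the block to dedicated hardware and move on. The unit
// signals completion once the unpacked data is sitting in memory.
using CompletionCallback = void (*)(const uint8_t* data, std::size_t size);

void submit_to_hw_decompressor(const CompressedBlock& block,
                               CompletionCallback on_done) {
    // In this sketch we "complete" immediately; on real hardware the CPU
    // would be free to run gameplay code while the unit churns.
    on_done(block.bytes.data(), block.bytes.size());
}

int main() {
    CompressedBlock level_chunk{{0xDE, 0xAD, 0xBE, 0xEF}};

    auto cpu_copy = decompress_on_cpu(level_chunk);
    std::printf("CPU path unpacked %zu bytes\n", cpu_copy.size());

    submit_to_hw_decompressor(level_chunk, [](const uint8_t*, std::size_t size) {
        std::printf("HW path delivered %zu bytes without blocking a core\n", size);
    });
}
```

The point isn’t the code itself; it’s that the second path frees the general-purpose cores to run the actual game, which is exactly the “alleviation” Sony was selling.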

Almost two years into the PS5’s lifecycle, we can see the payoffs of leaning into custom hardware alleviation. The raw loading speed of games, 3D audio, and the DualSense haptics all come together to enhance the game experience while providing developers with power. 3D audio itself is a standout mechanic when implemented right.

[In Returnal] I heard wild alien creatures whipping around me in the level. I instinctively turned toward the sound and stopped immediately on the enemy. 

This was more than standard surround sound pointing me in a direction. I locked in on the enemy with my ears before I did with my eyes or gun. In a fight-or-flight scenario, my ears did their survival job.

Then I got thinking about PSVR2 again…

When you combine these [haptic feedback and adaptive trigger] elements with the PS5’s Tempest audio engine, PSVR 2 has incredible potential to really put users in a place.

PlayStation’s pursuit of immersion this generation is off to a stellar start and promises to be a transformative addition to gameplay. I can’t wait to feel, hear, and see more.

We are in an era where big tech companies understand both their needs and their consumers (be that developers or customers). These companies then leverage their R&D and pour immense resources into developing custom hardware to make the experience better.

It’s not just PlayStation. Look at Apple with the Mac’s leap to Apple Silicon; the performance smokes Intel and AMD. Nintendo uses a custom Nvidia chip in the Switch, and there have been plenty of rumors that the next console from the Big N will implement Nvidia’s DLSS to achieve a 4K image. Microsoft brought back the proprietary memory card!

The tune all of these companies are marching to is custom hardware to alleviate and streamline processes. It doesn’t sound like the songs of the past, like the Cell processor or the Trash Can Mac Pro. Big Tech seems to have found a groove and isolated what their platforms need to flourish without being cornered for years by one decision. The song of custom sounds good now; hopefully it someday goes out on a high note instead of a whimper.

Chapter Select: Season 4, Episode 4 – Fast Five

Photo and design by Max Roberts

It is time to assemble the team. Two precision podcasters – Max Roberts and Logan Moore – don’t crack under the pressure of discussing this transformative film. Can the Fast Family pull off the greatest heist of all and become a billion-dollar franchise?

Download (36MB)

RSS Feed | Overcast | Apple Podcasts | Spotify | YouTube

Fast Five

Rotten Tomatoes – 78% critics and 83% audience


This episode was originally recorded on August 30, 2022.

@ChapterSelect

Max’s Twitter @MaxRoberts143

Logan’s Twitter @MooreMan12

Researcher, Editor, and Producer – Max Roberts

Hosted by Logan Moore & Max Roberts

Photo and Art designed by Max Roberts.