A weird take on frame rate: How PS5’s first “40 fps” game works, runs

This Ratchet & Clank: Rift Apart image was captured while running in the game's new 40 fps mode. It offers the same resolution, image quality, and ray-tracing fidelity as the prior 30 fps "quality" mode. (credit: Sony / Insomniac)

When it comes to action-filled video games, frame rates matter, and until recently, conventional "frames per second" wisdom has landed on either 30 fps or 60 fps. Thirty, the rate seen in most standard TV broadcasts, is fine for slower, cinematic games, while frantic battles and twitchy fights benefit from the higher rate, which looks smoother and reduces button-tap latency.
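To put rough numbers on that tradeoff, the short Python sketch below (illustrative arithmetic only, not anything drawn from the game or Sony's tooling) converts each rate into how long a single frame sits on screen, which also bounds how long a button press can wait before the next frame reflects it.

```python
# Illustrative arithmetic only -- not code from the game or the PS5 SDK.
def frame_time_ms(fps: float) -> float:
    """How long each rendered frame stays on screen, in milliseconds."""
    return 1000.0 / fps

print(f"30 fps -> {frame_time_ms(30):.1f} ms per frame")  # 33.3 ms
print(f"60 fps -> {frame_time_ms(60):.1f} ms per frame")  # 16.7 ms
```

Halving the frame time from roughly 33 ms to 17 ms is a big part of why 60 fps feels noticeably snappier.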

This week, a surprising new number enters the conversation: 40 fps, a rate previously unattainable thanks largely to TV display standards. It comes courtesy of a new patch for this month's Ratchet & Clank: Rift Apart on PlayStation 5, which already includes a 60 fps "performance" option. So why would anyone pick 40 fps instead? And how does it work?

HDMI standards, menu picking, and math

Recent titles from Insomniac Games, particularly Marvel's Spider-Man and the 2016 Ratchet & Clank remake, launched on PS4 with a 30 fps lock, meant to guarantee higher pixel counts and more detailed shadow and level-of-detail (LoD) settings. Both games eventually got PS5 versions with 60 fps support, since they could leverage the newer hardware's power. As a native PS5 game, this month's R&C:RA launched with both 30 and 60 fps modes on day one, and its menus asked which you preferred: more pixels and higher image quality, or more frames?
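The new 40 fps patch splits that difference: at 25 ms per frame, it sits exactly halfway between the 30 fps quality mode (33.3 ms) and the 60 fps performance mode (16.7 ms). As for why such a rate was previously off the table, this section's nod to HDMI standards and math points at divisibility: a frame rate only paces evenly when the display's refresh rate is a whole-number multiple of it. The sketch below (a hypothetical helper, assuming the mode leans on a 120 Hz HDMI output rather than a standard 60 Hz signal) shows how 40 fps judders on a 60 Hz TV but maps cleanly onto 120 Hz, with each frame held for exactly three refreshes.

```python
# Hypothetical pacing check -- an assumption-laden sketch, not PS5 system code.
def refreshes_per_frame(display_hz: int, target_fps: int) -> float:
    """How many display refreshes each rendered frame would need to cover."""
    return display_hz / target_fps

for hz in (60, 120):
    for fps in (30, 40, 60):
        span = refreshes_per_frame(hz, fps)
        pacing = "even pacing" if span.is_integer() else "uneven pacing (judder)"
        print(f"{fps} fps on a {hz} Hz display: {span:g} refreshes per frame -> {pacing}")
```

Under that assumption, 40 fps gets the same even cadence on a 120 Hz signal that 30 and 60 fps have always had at 60 Hz, which would let the mode keep the quality preset's image settings while shaving roughly 8 ms off every frame.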
