NVIDIA shows off neural texture compression tech that shrinks texture memory from 6.5GB to 970MB

The new technology uses neural networks to decode textures but comes with a performance cost.

(Image via NVIDIA: ivy-covered stone villa at sunset)
TL;DR
  • NVIDIA's Neural Texture Compression demo showed VRAM usage dropping from 6.5GB to 970MB by using neural networks to decode textures.
  • The "On Sample" mode that saves the most VRAM can cost around 30% performance in testing.
  • A beta SDK is available to developers, but the timeline for adoption in shipping games is uncertain, and cross-platform support remains unclear.

NVIDIA demonstrated Neural Texture Compression at a developer presentation, showcasing a technique that dramatically reduces how much VRAM textures consume. In the demo, a texture set that normally takes up 6.5GB of video memory was compressed down to roughly 970MB.

The technology works by storing textures in a compact format and using small neural networks to reconstruct them during gameplay. Instead of keeping full-resolution textures in VRAM like traditional compression methods, NTC converts them into what developers call a “latent representation” that takes up far less space.
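To make the idea concrete, here is a minimal sketch of that decode step. This is not NVIDIA's actual NTC code (the real networks, channel counts, and weights are the SDK's, not shown here); it just illustrates storing a low-channel latent grid and running a tiny neural network per texel instead of keeping a full-resolution texture resident.

```python
import numpy as np

# Conceptual sketch only, not the NTC SDK: a texture held as a compact
# "latent" grid plus a small MLP that reconstructs RGBA texels on demand.
rng = np.random.default_rng(0)

H, W, LATENT_C = 256, 256, 8  # latent grid with 8 channels (size assumed for illustration)
latents = rng.standard_normal((H, W, LATENT_C)).astype(np.float32)

# A tiny two-layer MLP; NTC's networks are likewise small enough to run per sample.
W1 = rng.standard_normal((LATENT_C, 16)).astype(np.float32) * 0.1
b1 = np.zeros(16, dtype=np.float32)
W2 = rng.standard_normal((16, 4)).astype(np.float32) * 0.1
b2 = np.zeros(4, dtype=np.float32)

def decode_texel(u: int, v: int) -> np.ndarray:
    """Reconstruct one RGBA texel from the latent grid when it is sampled."""
    z = latents[v, u]                             # fetch the compact latent vector
    h = np.maximum(z @ W1 + b1, 0.0)              # ReLU hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid -> color in [0, 1]

texel = decode_texel(128, 128)
print(texel.shape)  # (4,) — one RGBA value decoded without a full-res texture in VRAM
```

The key point is that only the latent grid and the (tiny) network weights need to live in memory; full-resolution texels exist only transiently, at the moment they are sampled.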

NVIDIA outlined two different modes for developers to use. The first is “On Load” mode, which decompresses textures when a game loads them. This shrinks install sizes and patch downloads but doesn’t save much VRAM at runtime. The second is “On Sample” mode, which keeps textures compressed in VRAM and decodes them on the fly when needed. This is where the big memory savings happen.
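The runtime difference between the two modes comes down to what stays resident in VRAM. A hypothetical back-of-the-envelope comparison, using the demo's own 6.5GB and 970MB figures (the function names here are illustrative, not the SDK's API):

```python
# Rough illustration of the two NTC modes' memory trade-off.
FULL_RES_BYTES = 6_500 * 1024**2  # ~6.5 GB: the traditionally compressed texture set
LATENT_BYTES   =   970 * 1024**2  # ~970 MB: the same set as NTC latents

def vram_on_load() -> int:
    # "On Load": latents are decoded back to GPU-native formats at load time.
    # Disk size and patch downloads shrink, but runtime VRAM is full size again.
    return FULL_RES_BYTES

def vram_on_sample() -> int:
    # "On Sample": latents stay resident; texels are decoded in-shader per sample.
    return LATENT_BYTES

savings = 1 - vram_on_sample() / vram_on_load()
print(f"runtime VRAM saved by On Sample: {savings:.0%}")  # ~85%
```

That roughly 85% reduction is why "On Sample" is the headline mode, and also why its per-sample decode cost matters so much.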

But those VRAM savings come with a catch. Testing from developers shows the “On Sample” mode can hit performance hard. Some demos showed around a 30% performance drop compared to traditional texture compression. One benchmark comparison showed a scene running at 220 fps with “On Load” mode dropping to 170 fps when switched to “On Sample” mode.
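Those two figures are closer than they look once you account for how the cost is measured. Working through the quoted benchmark numbers:

```python
# The 220 -> 170 fps benchmark, expressed two ways.
fps_on_load, fps_on_sample = 220, 170

fps_drop = 1 - fps_on_sample / fps_on_load        # fewer frames per second
frametime_cost = fps_on_load / fps_on_sample - 1  # more milliseconds per frame

print(f"fps drop: {fps_drop:.1%}, frame-time cost: {frametime_cost:.1%}")
# fps drop: 22.7%, frame-time cost: 29.4%
```

A 220-to-170 fps drop is about 23% fewer frames but about 29% more frame time, so the "around 30%" figure is plausibly the frame-time cost of the same result.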

The catch nobody talks about

Rolling this out in the real world is trickier than the demo numbers suggest. Developers would likely need to use NTC selectively, applying it only to certain textures while keeping traditional compression for UI elements, text, and stylized art that might not work well with neural reconstruction.

Visual quality is another question mark. Textures are already stored in lossy formats on GPUs using compression methods like BC7, so NTC is essentially replacing one lossy method with another. The difference is in the artifacts and how the quality holds up under different lighting and viewing conditions.

Cross-platform support remains unclear. While some developers mentioned that DirectX 12 features could enable broader compatibility beyond NVIDIA hardware, there’s no confirmation on whether AMD or Intel GPUs will support it. Console implementation is even murkier.
