Nvidia's Jensen Huang gave a sneak peek at what the trillion-dollar GPU company has planned for future iterations of Deep Learning Super Sampling (DLSS). During a Q&A session at Computex 2024 (reported by More Than Moore), Huang answered questions about DLSS, saying that in the future we'll see textures and objects that are generated entirely by AI. Huang also said that AI NPCs will be generated entirely by DLSS.
Generating in-game assets with DLSS should improve game performance on RTX GPUs: offloading work to the tensor cores reduces demand on the shader (CUDA) cores, freeing up resources and improving frame rates. Huang explained that DLSS would auto-generate textures and objects and improve their quality, much as DLSS upscales frames today.
The next iteration of DLSS technology may be on the horizon: Nvidia is already working on a new texture compression technology that uses trained neural networks to significantly improve texture quality while keeping the video memory (VRAM) requirements of modern games in check. Where traditional texture compression techniques top out at roughly 8x compression ratios, Nvidia's neural-network-based compression can reach ratios of up to 16x.
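To put those ratios in perspective, a quick back-of-the-envelope calculation shows what doubling the compression ratio means for VRAM. This is a simple arithmetic sketch, not Nvidia's actual compression scheme; the function name and the RGBA8 example texture are our own illustration.

```python
def compressed_size_mib(width, height, bytes_per_pixel=4, ratio=1):
    """Size in MiB of a texture after compression at the given ratio.

    Defaults assume an uncompressed RGBA8 texture (4 bytes per pixel);
    this is only illustrative arithmetic, not Nvidia's algorithm.
    """
    return width * height * bytes_per_pixel / ratio / (1024 ** 2)

# A single 4K (4096x4096) RGBA8 texture:
print(compressed_size_mib(4096, 4096))            # uncompressed → 64.0 MiB
print(compressed_size_mib(4096, 4096, ratio=8))   # traditional 8x → 8.0 MiB
print(compressed_size_mib(4096, 4096, ratio=16))  # neural 16x → 4.0 MiB
```

Across the hundreds of textures in a modern game, halving per-texture size at equal (or better) quality is why the technique matters for VRAM-constrained GPUs.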
This technology fits Huang's explanation of how DLSS improves object fidelity: since in-game objects are essentially textures wrapped around 3D geometry, better texture compression directly improves how those objects look.
What's most interesting about Huang's vision for DLSS is in-game asset generation. Nvidia's existing DLSS 3 frame generation tech generates frames between real frames, improving performance. Asset generation goes a step further: in-game assets would be created entirely from scratch by DLSS. (DLSS would still need to be told where assets should be placed in the game world and which assets to render, but it would generate them on its own.)
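The "frames between real frames" idea can be illustrated with a toy blend between two rendered frames. Real DLSS 3 frame generation is far more sophisticated, using motion vectors and a dedicated optical flow accelerator; this naive linear interpolation is only a sketch of the concept, with a function name of our own invention.

```python
def interpolate_frame(frame_a, frame_b, t=0.5):
    """Linearly blend two frames (flat lists of pixel values) at time t in [0, 1].

    A toy illustration of generating an intermediate frame between two
    real frames; actual frame generation reconstructs motion, not a blend.
    """
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

# The midpoint between two tiny two-pixel "frames":
print(interpolate_frame([0, 100], [100, 200]))  # → [50.0, 150.0]
```

Every generated frame shown in place of a rendered one is a frame the shader cores never had to draw, which is where the performance gain comes from.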
Huang also spoke about the future of DLSS and NPCs. Not only does he expect DLSS to generate in-game assets; he also envisions it generating NPCs. He gave the example of six characters in a video game, two of whom are real, while the remaining four are entirely AI-generated.
This is a callback to Nvidia ACE, which was demoed in 2023. ACE is in-game LLM technology that responds to player interactions with characters, giving NPCs their own dialogue and responses and bringing them to life. Nvidia believes ACE (or a future form of it) will play a key role in PC gaming and become an integral part of DLSS.
This isn't the first time we've heard about DLSS's potential future: the tech giant has publicly stated that it expects PC gaming to eventually be entirely AI-rendered, replacing traditional 3D graphics rendering. For now, generating certain in-game assets is a step toward the AI-generated future Nvidia envisions.