The Painkiller RTX remaster demonstrates how small development teams can use generative AI to modernize thousands of legacy game textures at scale. Using PBRFusion, a custom-trained AI model, the team converted low-resolution textures into physically based rendering (PBR) materials with normal, roughness, and height maps.
14-minute read · From developer.nvidia.com
Table of contents
- What’s your professional background and current role?
- What made you want to become an RTX Remix modder, and what brought you to Painkiller?
- You’re among the early adopters to use generative AI to rebuild textures and materials at scale. How did you use models like PBRFusion to convert low-resolution assets into high-quality PBR materials?
- What got you interested in generative AI, and what motivated you to fine-tune a model for RTX Remix?
- Why was it important for your texture pipeline to blend AI-generated outputs with traditional hand-crafted work, rather than relying on a single approach?
- How did you maintain a consistent style and quality bar across more than 35 levels while integrating AI-generated content?
- The materials and textures now react much more realistically to light. How did you rethink your material, texture, and lighting workflows to achieve that result across so many environments?
- Full-scene path tracing, volumetric lighting, and advanced techniques all work together in Painkiller RTX. How did you combine these systems to shape the game, and what did each contribute that you couldn’t get from more traditional rendering?
- For developers inspired by Painkiller RTX who want to take a first step toward similar visuals, which features or workflows would you recommend experimenting with first?
- How do you think generative AI has changed modding and game development, and what tools are you looking forward to next?