Mobile, Graphics, and Gaming blog
March 10, 2026

New neural technologies set to join the Neural Graphics Development Kit

Arm showcases Neural Frame Rate Upscaling (NFRU) at GDC, a neural graphics technique that generates frames to improve smoothness in mobile games

By Annie Tallund


Neural graphics on mobile are moving quickly.

Following the announcement of Arm neural technology in August 2025 that will add dedicated neural accelerators to future Arm GPUs, we have worked to move neural techniques from experimental demonstrations toward new capabilities. At Arm’s developer summit at GDC Festival of Gaming, we are introducing Neural Frame Rate Upscaling (NFRU), and opening the Early Access Program for developers who want to explore it as part of the Neural Graphics Development Kit, ahead of general availability.

The Neural Graphics Development Kit was released alongside the Arm neural technology announcement. Since then, we have been committed to helping developers deliver next-generation gaming experiences on mobile. NFRU is a continuation of the work that resulted in Neural Super Sampling (NSS), the first application released as part of the Neural Graphics Development Kit, but focuses on frames in addition to pixels.

If you are building advanced lighting pipelines on mobile, this is the moment to start evaluating how neural frame generation fits into your roadmap. AI-driven graphics are advancing fast, and integration takes time. Teams that begin experimenting early will be better positioned as the next generation of GPU hardware lands.

In this blog post, you will learn what NFRU is, how we showcased it at GDC, and how you can join us on the journey towards next-generation mobile gaming experiences.

What is Neural Frame Rate Upscaling (NFRU)?

NFRU is Arm’s approach to neural frame generation for real-time graphics. At a high level, it uses neural inference to synthesize intermediate frames between traditionally rendered ones, increasing perceived smoothness. Leveraging motion data and temporal information already present in the pipeline, NFRU can generate additional frames efficiently.
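Arm has not published NFRU's internals, but the general idea of synthesizing an intermediate frame from motion data can be illustrated with a deliberately simplified NumPy sketch. Everything here is hypothetical: the function name, the nearest-neighbor warp, and the simple blend stand in for what would, in practice, be a trained network running on a neural accelerator.

```python
import numpy as np

def interpolate_frame(prev, nxt, motion, t=0.5):
    """Illustrative midpoint-frame synthesis (not Arm's algorithm):
    warp the previous frame along per-pixel motion vectors, then
    blend with the next frame.

    prev, nxt: (H, W) grayscale frames
    motion:    (H, W, 2) per-pixel motion in pixels (dy, dx)
    t:         temporal position of the generated frame in [0, 1]
    """
    h, w = prev.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Sample the previous frame part-way along the motion vectors
    # (nearest-neighbor gather, clamped at the image border).
    sy = np.clip(ys + t * motion[..., 0], 0, h - 1).astype(int)
    sx = np.clip(xs + t * motion[..., 1], 0, w - 1).astype(int)
    warped = prev[sy, sx]
    # Blend with the next frame to soften disocclusion artifacts.
    return (1 - t) * warped + t * nxt
```

A real implementation would additionally handle occlusions, sub-pixel sampling, and confidence weighting, which is exactly where the learned component earns its keep.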

NFRU is designed to work alongside advanced rendering features such as ray-traced lighting and complex shading. As scene complexity increases, frame time becomes the dominant bottleneck. Neural frame generation provides another lever: it allows the GPU to skip rendering an entire frame and instead generate it using a neural workload that costs only a fraction as much. This can unlock up to double the frame rate, resulting in smoother, more responsive gameplay.
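The "fraction of the cost" claim is easy to see with back-of-the-envelope arithmetic. The frame times below are assumptions chosen for illustration, not measured NFRU numbers:

```python
# Hypothetical per-frame costs: render every other frame fully,
# generate the frames in between with a cheaper neural workload.
render_ms = 28.0   # assumed cost of a fully rendered frame
neural_ms = 4.0    # assumed cost of a neurally generated frame

pair_ms = render_ms + neural_ms        # two displayed frames
effective_fps = 2 * 1000.0 / pair_ms   # 62.5 fps
baseline_fps = 1000.0 / render_ms      # ~35.7 fps rendering every frame

print(f"{baseline_fps:.1f} fps -> {effective_fps:.1f} fps")
```

As long as the generated frame is much cheaper than a rendered one, the effective frame rate approaches double the baseline, which is where the "up to 2x" figure comes from.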

The technology is built with dedicated neural acceleration in mind. As neural accelerators become integrated into Arm GPUs, machine learning workloads can run alongside traditional graphics tasks, unlocking new performance possibilities for real-time rendering.

Project Buzz: an end-to-end neural graphics case study

At GDC, NFRU is not shown in isolation.

It is embedded within Project Buzz, a production-quality reference game built through technical collaboration with the games studio Sumo Digital. The reference game makes full use of Arm’s neural graphics technologies, including NFRU.

Project Buzz is designed to demonstrate what becomes possible when neural acceleration is integrated directly into the rendering pipeline. The goal is to prove that it is possible to create content with real-time lighting techniques and scene complexity that would otherwise be out of reach within a mobile footprint.

Project Buzz evaluates neural techniques across the entire frame pipeline, from lighting and geometry, through to final image reconstruction and frame delivery.

Neural frame generation in a production Unreal pipeline

In his GDC session, “A pragmatic path to 60Hz on mobile with ray-traced lighting and neural graphics”, Sergio Alapont, Principal Engineer at Arm, covers how these technologies move from theory to production.

The session walks through how NFRU is deployed in a live Unreal Engine environment, using Project Buzz as a case study.

Sergio says: “Mobile teams are constantly trading visual ambition against frame rate, thermals, and battery life. Neural graphics changes that equation. Instead of cutting features, we can reallocate GPU budget using neural acceleration to maintain quality while staying within real production constraints.”

The focus is on engineering realities. How do you structure the frame? How does neural frame generation integrate with motion vectors and optical flow? How do these systems coexist with ray-traced lighting and other advanced rendering features without destabilizing the pipeline?

Sergio breaks down how NFRU doubles perceived frame rate through neural interpolation while maintaining power efficiency. Crucially, that capability is delivered through the Neural Graphics Development Kit and Unreal Engine plugins, making adoption practical rather than theoretical.

Sergio continues: “Frame generation and neural denoising are not research experiments anymore. They are tools that can ship if they are integrated correctly into real engines and real content.”

And for teams wondering when to start exploring neural workflows, Sergio’s advice is direct:

“If you are serious about high-end mobile graphics, you need to start experimenting with neural workflows now. The studios that build intuition around frame structuring, motion data, and neural scheduling today will be the ones ready when this hardware becomes mainstream.”

From development kit to engine: making neural graphics shippable

While Sergio’s session focuses on rendering trade-offs inside a production pipeline, Arm’s Willen Yang looks at the broader toolbox that makes neural graphics practical to adopt.

In “Neural Graphics in Practice: Arm’s SDK for Next-gen Game Development”, Willen introduces Arm’s neural technology stack as a developer-ready pathway for bringing neural rendering techniques into real-world mobile game development. A key theme is architectural alignment: NPU-class neural accelerators embedded directly inside Arm GPUs enable efficient, whole-tensor workloads that integrate cleanly with the GPU’s execution and memory model.

Willen says: “Replacing a compute shader with a neural accelerator dispatch is straightforward by design.”

The session walks through the Neural Graphics Development Kit, including Arm’s ML extensions for Vulkan, the Neural Graphics SDK for game engines, and ready-to-use Unreal Engine plugins. Rather than requiring bespoke research integrations, these tools are designed to fit into familiar engine workflows and existing rendering pipelines.

Willen then connects the toolkit to concrete techniques. NSS replaces traditional shader-based upscaling with a lightweight kernel-prediction network, delivering higher image quality while reducing GPU workload. NFRU builds on that foundation, using neural interpolation and optical flow to double perceived frame rate while staying within mobile power budgets.
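The kernel-prediction idea behind NSS can be sketched in a few lines. In this purely illustrative version, a network (not shown) is assumed to have predicted a normalized 3x3 filter kernel per pixel; applying those kernels is then a simple gather-and-weight pass. The function name and shapes are assumptions, not the SDK's API:

```python
import numpy as np

def apply_predicted_kernels(img, kernels):
    """Apply per-pixel 3x3 filter kernels, as a kernel-prediction
    network would output them (illustrative, not NSS itself).

    img:     (H, W) input image
    kernels: (H, W, 3, 3) per-pixel kernels, each summing to 1
    """
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros((h, w), dtype=float)
    # Accumulate each of the 9 taps, weighted by its predicted kernel entry.
    for dy in range(3):
        for dx in range(3):
            out += kernels[..., dy, dx] * padded[dy:dy + h, dx:dx + w]
    return out
```

Predicting small filter kernels instead of raw pixels is a common way to keep a reconstruction network lightweight while constraining its output to plausible local combinations of the input.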

Willen continues: “Neural technology delivers console-class visual quality on mobile within power budget.”

Case studies from Infold Games (Unity) and Unreal Engine integrations demonstrate that adoption is feasible today with modest engineering effort. The session closes by outlining the roadmap and early access opportunities, encouraging developers to begin experimenting now so they are prepared as neural acceleration becomes available across GPU generations.

Willen concludes: “Developers need to start experimenting now with the Neural Graphics Development Kit, so they are ready to use AI to power their game titles on mobile with neural accelerators, delivering higher visual quality with lower power and GPU load.”

What we showed at GDC

At the Arm developer summit at GDC, NFRU is presented as part of a broader shift in how neural graphics is integrated on future Arm GPUs. It appears within Project Buzz, our end-to-end showcase of neural technologies operating across a complete rendering pipeline, and is explored in depth through dedicated technical sessions focused on real engine deployment and SDK integration.

Alongside these demonstrations, we are opening the NFRU Early Access Program, giving developers hands-on access to tools ahead of general availability.

Start exploring now

Waiting until tools are broadly available, or until hardware is already in market, can compress integration timelines and limit how deeply these techniques are understood inside a team. Frame structuring, motion data handling, and neural workload scheduling require iteration.

The NFRU Early Access Program is an opportunity to begin that work now: to evaluate neural frame generation within your own pipeline, provide feedback on tooling and integration, and build internal expertise ahead of broader release.

If you are interested in participating, we invite you to apply.

Apply for the Neural Frame Rate Upscaling Early Access Program

More updates coming soon

Neural techniques are moving quickly from experimental features to foundational components of modern rendering pipelines. As dedicated neural accelerators become part of future Arm GPUs, the question shifts from whether studios will adopt neural workflows to when. We will be sharing more detailed technical content on both Project Buzz and NFRU throughout 2026. Watch this space for updates.



Re-use is permitted for informational, non-commercial, or personal use only.
