NVIDIA announces Project G-Assist, a new AI assistant for owners of NVIDIA RTX graphics cards. Rather than helping you glue cheese to your pizza or solve other IRL problems, it uses AI to help automate optimizing your system and game settings. It is also extensible: developers are invited to write plugins to enhance its capabilities. Word is: "G-Assist can provide real-time diagnostics and recommendations to alleviate system bottlenecks, improve power efficiency, optimize game settings, overclock your GPU, and much more." The good news is that it does not sound like the dystopian privacy nightmare that is the signature element of many AI assistants: "Unlike massive cloud-hosted AI models that require online access and paid subscriptions, G-Assist runs on your GeForce RTX GPU. This means it is responsive, free to use, and can run offline." This announcement has details on this and other new features that are now live in the NVIDIA App. If there's bad news, it's in the disclaimer, which admits to the potential hazards of both AI models and software still at version 0.1:
G-Assist is an automated system powered by AI models and designed to facilitate configuration of your system’s hardware and software settings and to provide information regarding NVIDIA’s GeForce products for your personal, noncommercial use. G-Assist is a pre-release feature and may not be fully functional, may contain errors or design flaws, and may have reduced or different security, privacy, availability and reliability standards relative to commercial versions of NVIDIA offerings.
AI models generate responses and outputs based on complex algorithms and machine learning techniques, and those responses or outputs may be inaccurate, harmful, biased or indecent. By using G-Assist, you assume the risk of any harm caused by any response or output from it. NVIDIA will not be responsible for any actions, losses, or damages suffered as a result of use of G-Assist or its output. We recommend that you verify the information before relying on it.
Beamer wrote on Mar 26, 2025, 11:22:
As someone who's long past the years when I'd change any graphics card settings outside of in-game, and who doesn't even bother with in-game half the time, it feels like this would probably be helpful to me.
jacobvandy wrote on Mar 25, 2025, 20:01:
As mentioned in the blurb, this is a custom small language model (SLM) designed to run locally on your GPU, so it does not require an internet connection to function. Whether or not they're collecting data related to how it's interacting with their other software is another matter, but I highly doubt they're capturing every input and response like a cloud-based AI service would.
They mention it's based on an 8B-parameter large language model (LLM), which normally runs in about 4-6GB of VRAM, but since this is meant to run alongside your game and they don't need all the 'knowledge' a generalized model would include, I'm guessing they trimmed that down to 1-2GB or possibly even less. So that gets loaded in and waits for you to enter a query, at which point it will momentarily divert GPU resources to generating the response.
Apparently that can include charts or graphs of frametimes, system latency, power consumption, and other diagnostics like a lot of folks use third-party tools to monitor, which actually sounds pretty nifty... Except with this you can just look at that and then tell it to make changes for you, such as adjusting the frame limiter or power target or enabling/disabling V-Sync or G-Sync (perhaps even while still in-game?). If they can expand that to include checking or changing DLSS status, presets, and overrides, that would be a HUGE quality of life change compared to what is usually involved with that now (i.e. toggling the dev mode on-screen indicator via registry key and re-launching the game). It'd also be pretty great to be able to change graphics settings on the fly, without accessing the menu, but that might require some plugin support from the game developer side.
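The VRAM figures in the comment above are easy to sanity-check with back-of-the-envelope arithmetic: weight storage scales linearly with parameter count and bits per weight. A minimal sketch (the function name and the quantization levels are illustrative assumptions, not anything NVIDIA has published about G-Assist's model):

```python
def weight_vram_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate weight-only storage in GB (10^9 bytes).

    Ignores KV cache and activation overhead, which add to the
    real footprint at inference time.
    """
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9


# Weight footprint of an 8B-parameter model at common precisions:
for bits in (16, 8, 4):
    print(f"8B model @ {bits}-bit: ~{weight_vram_gb(8, bits):.0f} GB")
```

At 4-bit quantization an 8B model needs roughly 4 GB for weights alone, consistent with the 4-6GB figure above; reaching a 1-2GB footprint would additionally require pruning or distilling the model down, as the commenter speculates.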
El Pit wrote on Mar 25, 2025, 12:19:
G-Assist - Nvidia helps you to find the right spot where "gaming" becomes real fun!
Still waiting for G-Dominate, a service where the AI plays the games for you. Maybe in a few months, Nvidia?
Disk Space Required:
› System Assistant: 6.5 GB
› Voice Commands: 3 GB
GPU:
› GeForce RTX 30, 40, and 50 Series Desktop GPUs with 12GB VRAM or Higher