With zero fanfare NVIDIA has released a new mobile graphics chip today, the GeForce GTX 965M.
Based on the 28nm Maxwell GM204 core and positioned just below the existing GTX 970M, the GTX 965M has 1024 CUDA cores (compared to the 970M's 1280) and a narrower 128-bit memory interface (vs. 192-bit on the 970M). Its base clock is slightly faster at 944 MHz, plus unspecified Boost headroom.
Compared with the flagship GTX 980M, which boasts 1536 CUDA cores and a 256-bit GDDR5 memory interface, the new GTX 965M will be a significantly lower performer, but NVIDIA is marketing it toward 1080p mobile gaming. At a lower cost to OEMs, the 965M should help create some less expensive 1080p gaming notebooks as the new GPU is adopted.
The chip features proprietary NVIDIA Optimus and Battery Boost support, and is GameStream, ShadowPlay, and GameWorks ready.
Specs from NVIDIA:
- CUDA Cores: 1024
- Base Clock: 944 MHz + Boost
- Memory Clock: 2500 MHz
- Memory Type: GDDR5
- Memory Interface Width: 128-bit
- Memory Bandwidth: 80 GB/sec
- DirectX API: 12
- OpenGL: 4.4
- OpenCL: 1.1
- Display Resolution: Up to 3840×2160
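As a sanity check, the quoted 80 GB/sec falls out of the memory clock and bus width directly. A minimal sketch, assuming the listed 2500 MHz is the GDDR5 data clock (transferring on both edges for an effective 5 Gbps per pin):

```python
# Rough GDDR5 bandwidth check against the GTX 965M's listed specs.
# Assumption: NVIDIA's "2500 MHz" memory clock is the GDDR5 data clock,
# which transfers on both edges for an effective 5 Gbps per pin.

memory_clock_mhz = 2500                                # from NVIDIA's spec list
effective_gbps_per_pin = memory_clock_mhz * 2 / 1000   # double data rate -> 5.0 Gbps
bus_width_bits = 128                                   # memory interface width

bandwidth_gb_s = effective_gbps_per_pin * bus_width_bits / 8
print(f"{bandwidth_gb_s:.0f} GB/s")                    # -> 80 GB/s, matching the spec sheet
```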
More information on this new mobile GPU can be found via the source link.
I just looked into it and it seems that none of NVIDIA’s mobile chips support G-Sync. Is this a software issue or a hardware one? Any word from NVIDIA on that? Seems like the mobile environment would be an incredible space for G-Sync, both for the generally lower performance (below 60 fps) at native resolutions and for power usage… assuming that refreshing the screen below 60Hz results in lower power consumption.
The G-Sync controller hardware is too big for mobile devices.
But the whole idea is possible with eDP 1.4 connectivity. VESA Adaptive-Sync will work with several new notebooks, and even some notebooks from the past with a firmware update.
Where do you think Nvidia got the idea for G-Sync? From eDP standards.
In eDP 1.2+ the GPU is the scaler and controls the power to the display.
Would G-Sync only make sense in a less compact laptop chassis? The extra physical space, interconnects, power draw, and perhaps thermal load are all barriers to including the G-Sync module itself. If there is not space behind the display panel, then maybe a manufacturer could do away with an optical drive and re-purpose the space it would otherwise have occupied?
DP Adaptive Sync would seem to be a much simpler and more pragmatic alternative, but then GPU choices are limited. nVidia has officially made no commitment to DP Adaptive Sync, and it is likely not to be supported across nVidia parts as a whole. While nVidia is toeing the line on G-Sync or bust for now… maybe Adaptive Sync is possible with the hardware and they are just holding back on driver support until the market decides which will catch on? Hopefully, laptop vendors start including displays with scalers that provide DP Adaptive Sync support so all GPUs (even Intel) can have the option of variable refresh through open standards.
I thought that some form of frame rate synchronization was already built into laptop LCDs to save on power. I have read posts on other forums saying this is true, and that laptops have their own special circuitry and OEM custom pathways to their internal LCDs. So maybe it’s the DP and VGA standards at the time of the laptop’s manufacture that determine what it can sync with on an external monitor, but internally, with the built-in LCD, there may be some form of OEM custom adaptive sync going on.
Imho, variable refresh rate displays have been out for some time, and vertical display synchronization depends on how fast the graphics processing unit can transfer data to the display hardware. If FreeSync cannot handle the high data throughput that G-Sync can, then the subject is moot and not relevant. As I have a 120Hz non-G-Sync display, a laptop with a 120Hz non-G-Sync display, and a desktop with an ASUS Swift 1440p G-Sync monitor, I can only say NVIDIA are sucking us consumers dry. If I ever get around to getting another GTX 980 for my desktop and rebuilding it, then maybe with two GTX 980s I will see some G-Sync in action, but that’s where it stands for now.
Intel invented variable refresh and NVIDIA milked it, but congratulations to NVIDIA for having the vision: http://www.phoronix.com/scan.php?page=news_item&px=MTU0Njg
Intel did not invent it, but they did innovate upon it. These technologies are all based on the pre-existing vertical blanking interval (VBLANK), which has been around for decades and has been utilized in several different ways.
Intel has been exploring VBLANK for power saving – Panel Self Refresh and Dynamic Refresh Rate Switching, for example. NVIDIA was smart enough to apply VBLANK methods to gaming and AMD followed suit.
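For readers unfamiliar with the mechanism, here is a toy model (not any vendor's actual implementation) of how stretching the vertical blanking interval lets the refresh rate track the GPU's frame times within a panel's supported range:

```python
# Toy model of VBLANK-based variable refresh (illustrative only):
# the display holds off scanout by stretching the vertical blanking
# interval until the GPU finishes a frame, within the panel's bounds.

MIN_INTERVAL = 1 / 144   # fastest the panel can refresh (seconds)
MAX_INTERVAL = 1 / 30    # slowest before the panel must refresh anyway

def scanout_interval(render_time: float) -> float:
    """Return the time between refreshes for a frame that took
    `render_time` seconds to render."""
    # Clamp the wait into the panel's supported range; a frame slower
    # than MAX_INTERVAL forces a repeat refresh of the previous frame.
    return min(max(render_time, MIN_INTERVAL), MAX_INTERVAL)

for ms in (5, 12, 20, 40):
    t = ms / 1000
    print(f"render {ms:>2} ms -> refresh every {scanout_interval(t)*1000:.1f} ms "
          f"({1/scanout_interval(t):.0f} Hz)")
```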
It’s a great technology that is finally getting its opportunity to shine.