NVIDIA GeForce GTX 650 Ti BOOST Review

 

Features - Kepler GPU Technologies


Back when the GeForce GTX 660 launched, it was hailed as "Kepler for the masses."  Following the lineage of popular cards like the GTX 560 Ti before it, it represented what NVIDIA considered the real sweet spot for price-to-performance.  However, as the dust settles on the core GTX 600 series product stack, NVIDIA still sees an opportunity to convert gamers who are holding out for the next product cycle before upgrading.  Take a GTX 660, make it a bit cheaper, and still deliver playable frame rates at "High Quality" settings at 1080p, and you hit the combination of price and performance the majority of gamers are really looking for.  This is where the GTX 650 Ti BOOST comes into play.

 

The GTX 650 Ti Boost uses the same GK106 chip as the GTX 660, but with 768 CUDA cores (down from 960) and 64 texture units (down from 80).  This, along with a price adjustment across the entire lineup, puts the GTX 660 right at the magic $200 mark and the GTX 650 Ti Boost at a mere $170, which is sure to grab a lot of gamers' attention.  Given that 40% of gamers are using a GPU with DirectX 10 or older, the GTX 650 Ti Boost's DirectX 11 capability is a major marketing point for NVIDIA. Other drool-worthy technologies carried down from its older brother, the GTX 660, include NVIDIA PhysX, Adaptive VSync, GPU Boost, and TXAA Anti-Aliasing.

It's now much harder to say which card, the GTX 650 Ti Boost or the GTX 660, rules the bang-for-your-buck segment; both offer very high gaming performance at a price attainable by just about anyone hungry for an upgrade. The sub-$200 market is where most gamers feel comfortable shopping for new graphics hardware, and I haven't seen an offering quite like the GTX 650 Ti Boost from NVIDIA's stack in... ever?  Before I started reviewing hardware, I was an every-other-generation upgrader, and many gamers are the same.  NVIDIA sees droves of them still rocking GTX 460s, or even cards a few generations back like the 9800 GT and GTX 260, who will finally see big enough gains to justify the jump to a new card at a mere $170 suggested retail price (read: competition will likely drive this even lower).  So what we expect to see is gamers upgrading to play current-gen games at high quality at 1080p, by far the most popular monitor resolution nowadays, for only around $150.  That's fantastic, and it could set a new template for product stacks in the future, which is a great thing for gamers.

 

Microsoft DirectX 11

A discussion of DX11 really boils down to two things: tessellation and displacement mapping. Put simply, a displacement map stores height information to give texture to the surfaces it is applied to. Limitations arise when a surface lacks enough vertices to depict complex detail. Tessellation slices and dices polygons into finer pieces, delivering highly detailed characters that border on film-like realism. And because tessellation is programmable in DirectX 11, developers can tackle challenging graphics problems like bump mapping, smoothing, object popping, and artwork scaling. In a nutshell, gamers will enjoy all the details of their favorite games both far away and close up. A more detailed explanation of DX11 tessellation can be found on the GeForce website.
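To make the displacement half of that concrete, here's a minimal Python sketch of the core idea: each new vertex that tessellation creates gets pushed along its surface normal by a height sampled from the displacement map. The function names and the tiny height map are purely illustrative, not part of the DirectX 11 API.

# Illustrative sketch of displacement mapping: offset a tessellated
# vertex along its normal by a height read from the displacement map.
# All names here are hypothetical, not DirectX 11 calls.

def sample(height_map, u, v):
    """Nearest-neighbor sample of a 2D height map at texture coords (u, v)."""
    rows, cols = len(height_map), len(height_map[0])
    x = min(int(u * cols), cols - 1)
    y = min(int(v * rows), rows - 1)
    return height_map[y][x]

def displace_vertex(position, normal, height_map, u, v, scale=1.0):
    """Push a vertex along its normal by the sampled height."""
    h = sample(height_map, u, v)  # height in [0, 1]
    return tuple(p + n * h * scale for p, n in zip(position, normal))

# Tessellation supplies the extra vertices; displacement gives them depth.
height_map = [[0.0, 0.5], [0.5, 1.0]]
print(displace_vertex((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), height_map, 0.9, 0.9))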

NVIDIA PhysX

Until we achieve something akin to the Star Trek holodeck, gamers will continue to demand a higher sense of realism from their virtual environments. Flags that don't flutter in a gust and walls that don't crumble from a shotgun blast are dead giveaways of a faked world. PhysX represents NVIDIA's best effort to pull the wool over our eyes by providing environments, objects, and figures that respond dynamically to the actions of the player and the game's AI.

NVIDIA Adaptive VSync

Vertical Synchronization, or VSync, is the time-honored prescription for the screen tearing that occurs when an application's frame rate exceeds the refresh rate of your monitor. But there's no such thing as a free lunch: with traditional VSync, frame rates that dip below the refresh rate get locked to a fraction of it (60 fps dropping straight to 30 fps, for example), causing noticeable stuttering. Just as it sounds, Adaptive VSync selectively enables and disables the VSync frame rate lock, eliminating those sharp dips in performance while still preventing tearing at high frame rates.
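The decision rule itself is simple enough to sketch. The Python toy below models the behavior described above, with a hypothetical set_vsync() hook standing in for the driver's internal control; the real logic lives inside NVIDIA's driver, not in application code.

# Toy model of the Adaptive VSync rule: lock to the refresh rate
# only when the GPU can sustain it. set_vsync() is a stand-in hook.

REFRESH_RATE_HZ = 60

def adaptive_vsync(current_fps, set_vsync):
    if current_fps >= REFRESH_RATE_HZ:
        set_vsync(True)   # fast enough: cap frames to prevent tearing
    else:
        set_vsync(False)  # too slow: unlock to avoid the stutter of
                          # dropping straight to 30 fps

# Example: a dip below 60 fps releases the lock.
adaptive_vsync(72, lambda on: print("VSync", "on" if on else "off"))
adaptive_vsync(48, lambda on: print("VSync", "on" if on else "off"))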

GPU Boost

Graphics cards are designed to operate within a specific wattage called the Thermal Design Power (TDP). However, a great number of popular games consume only a fraction of the TDP rating, leaving a great amount of potential untapped. GPU Boost dynamically raises the clock speed of the GPU based on the application currently running, putting every available watt to work without crossing the TDP threshold. All of this is accomplished via real-time hardware monitoring, so there's no need for application profiles or updated drivers when the latest great title hits the shelves. One thing overclockers need to keep in mind is that, with GPU Boost, their graphics card has two effective clock speeds: a base clock and a boost clock. Together these define a frequency envelope, with the boost clock being the speed the GPU will typically run at and the base clock the lowest speed we'll see under load. Thus there are two approaches to overclocking a card equipped with GPU Boost: increasing the base clock to induce a correspondingly higher boost clock, or manually raising the power target.
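As a rough illustration of that envelope, the sketch below models a boost governor in Python: step the clock up while measured board power sits under the TDP target, and back off toward the base clock when it doesn't. The 980 MHz base clock and 134 W TDP match the card's published figures, but the step size and power readings are made up for the example; the real algorithm runs in hardware.

# Illustrative boost governor: climb while under the power target,
# settle back (never below base) when over it. Step size is a guess.

BASE_CLOCK_MHZ = 980   # card's rated base clock
BOOST_STEP_MHZ = 13    # one boost bin (illustrative)
TDP_WATTS = 134        # card's rated board power

def next_clock(clock_mhz, measured_watts):
    if measured_watts < TDP_WATTS:
        return clock_mhz + BOOST_STEP_MHZ
    return max(clock_mhz - BOOST_STEP_MHZ, BASE_CLOCK_MHZ)

# A light workload leaves headroom, so the clock climbs; a heavy one
# pushes power over the target and the clock settles back down.
clock = BASE_CLOCK_MHZ
for watts in (110, 115, 120, 140, 138):
    clock = next_clock(clock, watts)
    print(f"{watts} W -> {clock} MHz")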

TXAA Anti-Aliasing

Anti-aliasing is particularly important to the makers of CG films and game engines, ensuring that the audience isn't distracted by jagged edges. TXAA's sophisticated filters smooth those edges at a quality comparable to 4x MSAA from the previous generation. You also won't notice jagged lines crawling in front of a slow-moving camera, thanks to TXAA's use of "jitter" sampling.
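Here's a toy Python illustration of that jitter idea: nudge the sample position by a small sub-pixel offset each frame, then blend the result into a history buffer so edge pixels average out instead of crawling. The offsets and blend weight are placeholders, not NVIDIA's actual TXAA filter.

# Toy temporal jitter: a rotating sub-pixel offset per frame, plus a
# simple exponential blend of new samples into the accumulated history.

JITTER_OFFSETS = [(0.25, 0.25), (-0.25, 0.25), (0.25, -0.25), (-0.25, -0.25)]

def jittered_sample(frame_index, pixel_x, pixel_y):
    """Sub-pixel sample position for this frame."""
    jx, jy = JITTER_OFFSETS[frame_index % len(JITTER_OFFSETS)]
    return pixel_x + jx, pixel_y + jy

def temporal_blend(history_color, current_color, weight=0.1):
    """Fold the new sample into the history so edges average out over time."""
    return tuple(h * (1 - weight) + c * weight
                 for h, c in zip(history_color, current_color))

print(jittered_sample(3, 100, 64))
print(temporal_blend((0.8, 0.8, 0.8), (0.2, 0.2, 0.2)))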
