It is very likely that NVIDIA has more Turing-based GeForce GTX graphics cards up its sleeve, and rumor has it that two of the upcoming models are the GeForce GTX 1660 (non-Ti) and GeForce GTX 1650. As for the latter, a new listing in the 3DMark benchmark database may have revealed some key specs.
Frequent leaker and Twitter user @TUM_APISAK posted a screenshot of the entry, which shows the GeForce GTX 1650 paired with an Intel Core i7-9750H processor. Of bigger interest than the testbed, though, are the clockspeeds and RAM: the listing indicates the GeForce GTX 1650 wields 4GB of what is presumably GDDR5 memory. Past leaks peg the memory bus at 128-bit, and with a 2,000MHz memory clock (an 8Gbps effective data rate, given GDDR5's quad-pumped signaling), that would give the card 128GB/s of memory bandwidth.
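For readers who want to sanity-check the bandwidth math, a quick sketch follows. It assumes GDDR5's quad-pumped signaling (a 2,000MHz memory clock works out to an 8Gbps effective rate per pin); the function name is ours, purely for illustration.

```python
def memory_bandwidth_gbs(bus_width_bits: int, effective_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes times effective data rate."""
    return (bus_width_bits / 8) * effective_rate_gbps

# 128-bit bus at 8Gbps effective (2,000MHz GDDR5 clock, quad-pumped)
bandwidth = memory_bandwidth_gbs(128, 8.0)
print(f"{bandwidth:.0f} GB/s")  # 128 GB/s, matching the figure above
```

The same formula explains why doubling the bus width to 256-bit at the same data rate would yield 256GB/s.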
Source: Twitter via @TUM_APISAK
As for the GPU, it's listed as having a 1,395MHz base clock and 1,560MHz boost clock. Unfortunately, 3DMark does not dig deeper into a card's specs, so there is no mention of the specific GPU model or related details, such as how many CUDA cores and texture units might be present.
That said, we can reasonably surmise that this will be yet another Turing card that lacks RT and Tensor cores, which are what give GeForce RTX series cards their real-time ray tracing and Deep Learning Super Sampling (DLSS) mojo. NVIDIA rightly recognized that gamers at large are waiting for both features to be more widely supported before investing in the necessary hardware. That is why the GeForce GTX 1660 Ti exists: it lacks those features and is the least expensive Turing card on the market.
Going by past leaks and rumors, the GeForce GTX 1650 will debut on April 30 and cost $179. Preceding its launch, NVIDIA is expected to unveil the GeForce GTX 1660 on March 15, priced at $229.