Meet The ZOTAC Gaming GeForce GTX 1650 Super

Since this latest GTX 1650 series launch is a virtual one like the others, the board partners are once again stepping up to the plate to provide samples. For the GTX 1650 Super launch we received Zotac's Gaming GeForce GTX 1650 Super, a fairly straightforward entry-level card for the series.

GeForce GTX 1650 Super Card Comparison
                   GeForce GTX 1650 Super       Zotac Gaming
                   (Reference Specification)    GeForce GTX 1650 Super
Core Clock         1530MHz                      1530MHz
Boost Clock        1725MHz                      1725MHz
Memory Clock       12Gbps GDDR6                 12Gbps GDDR6
VRAM               4GB                          4GB
GPU Power Limit    100W                         100W
Length             N/A                          6.24 inches
Width              N/A                          2-Slot
Cooler Type        N/A                          Open Air, Dual Fan
Price              $159                         $209

For their sole GTX 1650 Super card, Zotac has opted to keep things simple, not unlike their regular GTX 1650 cards. In particular, Zotac has designed the card to maximize compatibility, even going as far as advertising it as compatible with 99% of systems. The end result is that, rather than building a large card that may not fit everywhere, Zotac has gone with a relatively small 6.2-inch card that would be right at home in a Mini-ITX system build.

Fittingly, there is no factory overclock to speak of here. With GPU and memory speeds identical to NVIDIA's reference specifications, Zotac's card is as close as you can get to an actual reference card, which suits our generalized look at the GeForce GTX 1650 Super as a whole.
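
Claims like this are easy enough to spot-check in software. Below is a minimal sketch that reads the card's clocks through NVIDIA's NVML API via the pynvml Python bindings; it assumes pynvml is installed and that the card sits at GPU index 0, which is a hypothetical choice. Keep in mind that NVML's "max clock" reflects the top of the card's boost table rather than the advertised 1725MHz boost clock, and that the memory clock is reported as a real clock rather than the 12Gbps effective data rate.

```python
# Minimal sketch: read a card's clocks via NVML (pynvml bindings).
# Assumes pynvml is installed and the card is GPU index 0 (hypothetical).
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    name = pynvml.nvmlDeviceGetName(handle)  # bytes on older pynvml versions

    # Current clocks, in MHz
    core_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    mem_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)

    # Top of the boost table; typically higher than the advertised boost clock
    max_core_mhz = pynvml.nvmlDeviceGetMaxClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)

    print(f"{name}: core {core_mhz}MHz (max {max_core_mhz}MHz), memory {mem_mhz}MHz")
finally:
    pynvml.nvmlShutdown()
```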

Digging down, we start with Zotac's cooler. The company often shifts between single and dual fan designs in this segment of the market, and for the GTX 1650 Super they've settled on a dual fan design. Given the overall small size of the card, the fans are equally small, with a diameter of just 65mm each. This is something to keep in mind for our noise testing, as small fans are often a liability there. Meanwhile, the fans are fed by a single 2-pin power connector, so there isn't any advanced PWM fan control or even RPM monitoring available for the fans. In this respect the card is quite basic, but that's typical for an NVIDIA xx50 series card.
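
Incidentally, whether a card exposes fan telemetry at all is easy to probe with the same NVML API. The sketch below, again assuming pynvml and GPU index 0, asks for the fan speed and catches the error NVML raises when the reading isn't supported; on a card with a 2-pin fan header like this one, expecting a "Not Supported" error is a reasonable assumption, though driver behavior can vary.

```python
# Minimal sketch: probe for fan speed reporting via NVML (pynvml bindings).
# Assumes pynvml is installed and the card is GPU index 0 (hypothetical).
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    try:
        # Reported as a percentage of max speed, not a raw RPM value
        speed_pct = pynvml.nvmlDeviceGetFanSpeed(handle)
        print(f"Fan speed: {speed_pct}%")
    except pynvml.NVMLError as err:
        # Cards without fan monitoring typically land here ("Not Supported")
        print(f"No fan telemetry exposed: {err}")
finally:
    pynvml.nvmlShutdown()
```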

Underneath the fans is an aluminum heatsink that runs most of the length of the card. With a TDP of just 100 Watts, and no option to further increase the power limit, there's no need for heatpipes or the like here. That said, the heatsink's base is big enough that Zotac has been able to cover both the GPU and the GDDR6 memory, bridging the latter via thermal pads. The fins are arranged vertically, so the card tends to push air out of the top and bottom.
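
The locked power limit can likewise be confirmed from software: NVML exposes a card's minimum, default, and maximum power-management limits, and on a card with no adjustment headroom all three should come back as the same value (100W here). A minimal sketch under the same pynvml/GPU-index-0 assumptions:

```python
# Minimal sketch: read a card's power-limit range via NVML (pynvml bindings).
# Assumes pynvml is installed and the card is GPU index 0 (hypothetical).
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    # NVML reports power values in milliwatts
    min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
    default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
    print(f"Power limit: {default_mw / 1000:.0f}W "
          f"(adjustable range {min_mw / 1000:.0f}W to {max_mw / 1000:.0f}W)")
finally:
    pynvml.nvmlShutdown()
```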

The small PCB housing the GPU and related components is otherwise unremarkable. Zotac has done a good job here seating such a large GPU without requiring a larger PCB. As we usually see for such short cards, the VRM components have been moved up to the front of the board. The MOSFETs themselves are covered with a small aluminum heatsink, though with most of the airflow from the fans blocked by the primary heatsink, I don’t expect the VRMs are getting much in the way of airflow.

For power, the card relies on a 6-pin external PCIe power cable, as well as PCIe slot power. The power connector is inverted (that is, the tab is on the inside of the card), which helps to keep it clear of the shroud, but may catch system builders (or video card editors) off-guard the first time they install the card.

Finally, for display I/O we're looking at the same configuration we've seen on most GTX 1650 cards: a DisplayPort, an HDMI port, and a DL-DVI-D port. While DVI ports have long been banished from higher-end products, there are still a lot of DVI monitors out there, particularly in China, where NVIDIA's xx50 cards tend to dominate. The tradeoff, as always, is that the DVI port takes up space that could otherwise be filled by more DisplayPorts, so you're only going to be able to drive up to two modern monitors with Zotac's GTX 1650 Super. Of course, one could argue that a DL-DVI port shouldn't even be necessary, as this lower-end card isn't likely to be driving a 1440p DL-DVI display, but I suspect this is a case where simplicity wins the day.

Comments

  • WetKneeHouston - Monday, January 20, 2020

    I got a 1650 Super over the 580 because it's more power efficient, and anecdotally I've experienced better stability with Nvidia's driver ecosystem.
  • yeeeeman - Friday, December 20, 2019

    It is as if AMD didn't have a 7nm GPU, but a 14nm one.
  • philosofool - Friday, December 20, 2019

    Can we not promote the idea, invented by card manufacturers, that everyone who isn't targeting 60fps and high settings is making a mistake? Please publish some higher resolution numbers for those of us who want that knowledge. Especially at the sub-$200 price point, many people are primarily using their computers for things other than games and gaming is a secondary consideration. Please let us decide which tradeoffs to make instead of making assumptions.
  • Dragonstongue - Friday, December 20, 2019

    100% agreed on this.

    It's up to consumers themselves how, where, and why they use the device as they see fit, be it gaming, streaming, "mundane" uses like watching videos, or even emulation and "creation" purposes.

    IMO it's the same BS crud smartphone makers use(d) to ditch 3.5mm jacks: "customers do not want them anymore, and with limited space we had no choice."

    So instead of adjusting the design to keep the 3.5mm jack AND a large enough battery, they remove the jack and limit the battery size. ~95% are fully sealed so you cannot replace the battery, and nearly all of them these days are "glass" that is pretty by design but also stupidly easy to break, so you have no choice but to pay for a very costly repair and/or buy a new one.

    With GPUs they CAN make sure there is a DL-DVI connector, a full-size HDMI, and a DP port (with maybe one mini DP).

    Instead they seem to "not bother," citing silly reasons: "it is impossible / customers no longer want this."

    As you point out, the consumer decides the use case. Provide the best possible product, give the best possible no-BS review/test data, and we consumers will see (or not see) the value and decide with the WALLET whether it is worth it.

    That would likely save much $$$$$$$$$$ and consumer <3, by virtue of people not buying something they will regret in the first place.

    Hell, I am gaming on a Radeon 7870 with a 1440p 144Hz monitor (it only runs at 60Hz, since the card doesn't fully support anything higher). I still manage to game on it "just fine," maybe not with everything at ultra, but comfortably (for me) at high-to-medium "tweaked" settings.

    Amazing how long these last when they are built properly and don't get the crap kicked out of them... that, plus not having hundreds to thousands to spend every year or so (which is most people these days), should mean more to these mega corps: build it right, and upgrades will happen when they are really needed, instead of the card ending up in the e-waste bin in a few months' time.
  • timecop1818 - Friday, December 20, 2019

    DVI? No modern card should have that garbage connector. Just let it die already.
  • Korguz - Friday, December 20, 2019

    yea ok sure... so you still want the vga connector instead ???
  • Qasar - Friday, December 20, 2019

    dvi is a lot more useful than the VGA connector that monitors STILL come with, and yet we STILL see those on new monitors. no modern monitor should have that garbage connector
  • The_Assimilator - Saturday, December 21, 2019

    No VGA. No DVI. DisplayPort and HDMI, or GTFO.
  • Korguz - Sunday, December 22, 2019

    vga.. dead connector, limited use case, mostly business... dvi.. still useful, especially in KVMs... haven't seen a DisplayPort KVM.. and the HDMI KVM died a few months after i got it.. but the DVI KVMs i have still work fine. each of the 3 (dvi, hdmi and display port) still has its uses..
  • Spunjji - Monday, December 23, 2019

    DisplayPort KVMs exist. More importantly, while it's trivial to convert a DisplayPort output to DVI for a KVM, you simply cannot fit the required bandwidth for a modern high-res DP monitor through a DVI port.

    DVI ports are large, low-bandwidth and have no place on a modern GPU.
