Power Consumption and Frequency Ramps

On the box, both processors are listed as having 65 W TDPs. With its Zen-based hardware, AMD has been relatively good at staying around that official on-the-box value, even during turbo. In the last generation, AMD introduced a feature called PPT, or Package Power Tracking.

  1. For 105 W processors, PPT is 142 W
  2. For 65 W processors, PPT is 88 W
  3. For 45 W processors, PPT is 60 W

This allows the processor to raise its power limit, assuming it isn’t breaching thermal or current limits, and consequently raise its frequency. As a result, while we see 65 W on the box, real-world power consumption during most tasks is likely to be nearer 88 W, unless the current or thermal limits are hit first.
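
Numerically, those PPT limits all work out to roughly 1.35 times the on-the-box TDP. Below is a minimal sketch of that relationship; the 1.35 multiplier is simply inferred from the figures above, not an official AMD formula.

```python
# Approximate AMD socket power limits (PPT) from the on-the-box TDP.
# The ~1.35x multiplier is inferred from the published figures
# (142/105, 88/65); it is not an official AMD formula.

def approx_ppt(tdp_watts: float, multiplier: float = 1.35) -> int:
    """Return the approximate Package Power Tracking (PPT) limit in watts."""
    return round(tdp_watts * multiplier)

for tdp in (105, 65, 45):
    print(f"{tdp} W TDP -> ~{approx_ppt(tdp)} W PPT")
# 105 W TDP -> ~142 W PPT
# 65 W TDP  -> ~88 W PPT
# 45 W TDP  -> ~61 W PPT (AMD lists 60 W)
```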

As a new element to our testing, we are recording power over a number of benchmarks in our suite, rather than just a simple peak power test.
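
For readers who want to log power the same way at home, a rough sketch is below. It assumes a Linux system where an hwmon driver exposes a cumulative package energy counter in microjoules (the amd_energy driver works along these lines); the exact sysfs path is an assumption and will differ from machine to machine.

```python
# Minimal sketch: log package power once per second while a benchmark runs.
# Assumes an hwmon driver exposes a cumulative package energy counter in
# microjoules; the path below is an assumption and varies by system.
import time

ENERGY_FILE = "/sys/class/hwmon/hwmon3/energy1_input"  # assumed path
SAMPLE_S = 1.0  # seconds between samples

def read_energy_uj() -> int:
    # Cumulative energy in microjoules (counter wraparound ignored for brevity).
    with open(ENERGY_FILE) as f:
        return int(f.read().strip())

def log_power(duration_s: float) -> None:
    prev = read_energy_uj()
    start = time.time()
    while time.time() - start < duration_s:
        time.sleep(SAMPLE_S)
        now = read_energy_uj()
        watts = (now - prev) / 1e6 / SAMPLE_S  # microjoules -> joules -> W
        prev = now
        print(f"{time.time() - start:6.1f} s  {watts:6.2f} W")

if __name__ == "__main__":
    log_power(60.0)  # run this alongside the benchmark of interest
```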

AMD Ryzen 3 3300X

For the faster chip, we saw a peak power of around 80 W in both of our tests.

With yCruncher, which is somewhat of a periodic load, the power consumption dropped over time to nearer 75 W.

3DPM makes its idle steps between loads more obvious, running 10 seconds on and then 10 seconds idle. The peak power here was almost the same, coming in just under the ~80 W seen above.
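
To reproduce that on/off power pattern without 3DPM itself, a simple scripted burst load is enough. The sketch below just alternates 10 seconds of busy work on every core with 10 seconds of idle, mirroring the cadence described above; it is an illustrative stand-in, not the actual 3DPM workload.

```python
# Sketch of a bursty load: 10 s of busy work on every core, then 10 s idle,
# mirroring the on/off cadence described for 3DPM.
import multiprocessing as mp
import time

def spin(stop_time: float) -> None:
    # Busy-loop doing throwaway arithmetic until the deadline passes.
    x = 0
    while time.time() < stop_time:
        x = (x * 31 + 7) % 1_000_003

def burst_cycle(cycles: int = 6, on_s: float = 10.0, off_s: float = 10.0) -> None:
    for _ in range(cycles):
        deadline = time.time() + on_s
        workers = [mp.Process(target=spin, args=(deadline,)) for _ in range(mp.cpu_count())]
        for w in workers:
            w.start()
        for w in workers:
            w.join()
        time.sleep(off_s)  # idle phase between bursts

if __name__ == "__main__":
    burst_cycle()
```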

In both of these graphs, the package power when idle is around 16-17 W. Looking back through the data, only 0.3 W of that was actually attributed to the cores, with the rest going to the IO die, the memory controllers, and the Infinity Fabric. That’s still pretty substantial for an idle load.

At low loads, the power per core was around 14 W, while at full load it was slightly less, depending on the test. That is somewhat below the ~20 W per core we see from the high-end Zen 2 processors, but these chips only boost to 4.3 GHz, not 4.7 GHz+, so this is about in line with what we expect.
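
As a sanity check, the per-core figure is roughly what you get by subtracting the idle uncore baseline from the package power and dividing by the active cores. The sketch below uses the approximate numbers above; it ignores the fact that IO die and Infinity Fabric power also rise under load, which is why the sensor-reported per-core values come in a little below this naive split.

```python
# Crude per-core power split: package power minus the idle uncore baseline,
# divided by active cores. This ignores the extra IO die / Infinity Fabric
# power drawn under load, so it is only an upper bound on power per core.
IDLE_UNCORE_W = 16.0  # idle package power, almost none of which is the cores

def naive_watts_per_core(package_w: float, active_cores: int) -> float:
    return (package_w - IDLE_UNCORE_W) / active_cores

print(naive_watts_per_core(80.0, 4))  # ~16 W/core upper bound at full load on the 3300X
```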

On our frequency ramp test, the Ryzen 3 3300X went from idle to its peak frequency within 17 milliseconds, or approximately one frame at 60 Hz.

One of the new features with Ryzen 3000 is CPPC2 support, which AMD claims reduces idle-to-turbo ramping from 30 milliseconds to 2 milliseconds. We’re seeing something in the middle of that, despite having all the updates applied. That being said, the jump up to the peak frequency (we measured 4350 MHz, +50 MHz over the turbo on the box) is effectively instantaneous, rather than stepping up through a range of intermediate frequencies.
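
For the curious, the basic idea behind a ramp measurement is to sit at idle, kick off a busy thread, and poll the reported core frequency at fine intervals until it reaches its peak. The sketch below does this on Linux via scaling_cur_freq; the sysfs path, polling interval, and target frequency are assumptions, and this is a simplified stand-in rather than the methodology used for the graphs here.

```python
# Simplified frequency-ramp sketch: start a busy thread from idle and time
# how long the reported core frequency takes to reach a target peak.
import threading
import time

FREQ_FILE = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq"  # kHz, assumed path

def read_mhz() -> float:
    with open(FREQ_FILE) as f:
        return int(f.read()) / 1000.0

def spin(seconds: float) -> None:
    # Single-threaded busy loop to trigger the boost algorithm.
    end = time.time() + seconds
    x = 0
    while time.time() < end:
        x = x * 3 + 1

def measure_ramp(target_mhz: float, sample_ms: float = 0.5, timeout_s: float = 2.0) -> float:
    # Pinning the load to cpu0 (so the frequency we read matches the loaded
    # core) is left out for brevity.
    t = threading.Thread(target=spin, args=(timeout_s,))
    start = time.perf_counter()
    t.start()
    while read_mhz() < target_mhz and time.perf_counter() - start < timeout_s:
        time.sleep(sample_ms / 1000.0)
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    t.join()
    return elapsed_ms

if __name__ == "__main__":
    print(f"Idle-to-peak ramp: ~{measure_ramp(4300.0):.1f} ms")
```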

AMD Ryzen 3 3100

Given that the TDP number on the side of the box also says 65 W, any reasonable user would assume that this chip’s power consumption would be the same, right? Regular readers will know that this isn’t always the case.

In our yCruncher test, because the turbo frequency is lower than the 3300X’s, the voltage can also be lower, and thus so is the power. Our history of testing Zen 2 has shown that these cores get very efficient at lower frequencies, to the point where this processor doesn’t even break the 65 W threshold during yCruncher.
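
The underlying reasoning is the usual dynamic power relationship, P ∝ C·V²·f: a lower turbo clock permits a lower voltage, and the voltage term counts twice. The sketch below illustrates the scaling with the 3100’s 3.9 GHz turbo against the 3300X’s 4.3 GHz; the voltages are illustrative round numbers, not measured values for these chips.

```python
# Dynamic CPU power scales roughly as P ∝ C * V^2 * f, so a lower turbo
# clock that permits a lower voltage saves power twice over. The voltages
# below are illustrative assumptions, not measured values for these chips.
def relative_dynamic_power(volts: float, ghz: float, ref_volts: float, ref_ghz: float) -> float:
    return (volts ** 2 * ghz) / (ref_volts ** 2 * ref_ghz)

# e.g. 3.9 GHz at an assumed ~1.25 V versus 4.3 GHz at an assumed ~1.35 V
print(f"{relative_dynamic_power(1.25, 3.9, 1.35, 4.3):.2f}x the dynamic power")  # ~0.78x
```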

Similarly, the 3DPM peaks are also lower, barely reaching 55 W during an AVX2 workload.

On the frequency ramp side, we see another instance of a 16-17 ms transition.

Summary

For peak power across all of our testing, the Ryzen 3 3300X hit a maximum of 80 W and the Ryzen 3 3100 went to 62 W. Compared with the Core i7-7700K, at 91 W TDP and 95 W peak, and in light of most of the results on the next few pages, AMD comes out as the more efficient option.

Comments

  • Death666Angel - Sunday, May 10, 2020 - link

    No. Official AMD support and motherboard manufacturer support are two different things. As stated in the article.
  • lmcd - Sunday, May 10, 2020 - link

    I misread the paragraph below it, but in general it's weird for AMD to put out a diagram quite that misleading. The ASRock AB350 was ~$120 when I bought it and is ASRock supported for the 3900X -- surely a decent percentage of boards can support most Zen 2 processors barring power constraints for the 16 core if a cheap budget build can?
  • alufan - Monday, May 11, 2020 - link

    Not true, the AM4 socket will support all Ryzen chips; however, not all features are available on all boards, such as gen 4, as this is a specific development that was not available when the 1 series launched. Also, the limitation is in the power system of the board, not in AMD's specs.

    "CHIPSET FEATURES: Note that not all processors are supported on every chipset, and support may require a BIOS upgrade. See your motherboard manufacturer’s website for compatibility"

    I have a 3 series running in my A320 media PC in my lounge; I updated the BIOS and it works fine. However, I suspect if I tried a 3900 it would not have the power circuitry to support it. The other issue is that the BIOS chips in some of the older boards cannot store enough information to allow all the chips to be used, so strictly speaking the issue is with the board supplier.
  • trenzterra - Sunday, May 10, 2020 - link

    I'm still stuck on the i5-6600K which I built back in 2016. Thought it would serve me well for many years to come given the state of Intel and AMD at that point in time, and that my previous i5-2400 lasted me a good number of years while still being competitive. Now barely four years later it's obsoleted by a 100 dollar CPU lol.
  • lmcd - Sunday, May 10, 2020 - link

    It's far from obsolete, even if it's regularly beaten. I'm still using my Sandy-E processor when I'm unopposed to simultaneously running a space heater -- it's just a question of whether you need the latest and greatest.
  • watzupken - Sunday, May 10, 2020 - link

    Actually, looking at the performance of these 4-core chips, I can't wait to see an APU with it. Even the 4-core APU will be great for everyday usage, without a graphics card. I just hope they give the 4-core version a decent graphics option, rather than a Vega 6.
  • TexasBard79 - Monday, May 11, 2020 - link

    A very good review, quite in line with the others. Ryzen 3 3300X is a nasty game-changer.
  • TheJian - Tuesday, May 12, 2020 - link

    Please stop running tests that appeal to less than 5% of your audience (and I think I'm being generous here). Crysis on cpu? Who cares? What does it prove I can do today? Dwarf fortress?? WTF? Quit wasting your time and ours. AI ETH tests? What for (farms do this now)? How many tests do you need that show NOTHING to any of us?

    People should get the point. You are irrelevant at some point if you keep posting crap nobody cares to read. Ask toms hardware :) Oh, wait, you guys are toms. ;)

    How about testing 20 games at 1080p where everyone plays. :) Is it too difficult to ask a few pros to make a script for photoshop/premier/AE to test under AMD/NV (cuda vs. OpenCL or whatever is faster on AMD)? It is almost like you guys seek benchmarks that nobody could possibly find useful IRL.

    "provide a good idea on how instruction streams are interpreted by different microarchitectures."
    Your PHD project tells me how these cpus will run in WHICH PRO APP? Why not just test a PRO APP IRL? Bah...fake news. Not sure why, AMD wins everything right now. Why hunt for fake tests that mean nothing? How many people use Agisoft instead of PhotoshopCC for $10 a month?

    Still ripping at crap modes nobody would actually use. Again tells us nothing about what we REALLY do usually. Only a retard uses FAST settings in handbrake for anything but a 15fps training vid.

    "We are currently in the middle of revisiting our CPU gaming benchmarks" and upgrading to 2080ti. Can't happen soon enough, please make sure you test games that sell over 1mil ON PC or don't bother. If the sell poorly or are poorly rated, there is no point in testing them. Test what people PLAY, at settings people really use. 720p low shows what to a person who will NEVER play below 1080p? Oh wait, I just described 99% of your audience, as I'm quite sure they haven't played 720p in ages. So much wasted testing. Stop testing 4k and test more 1080p/1440p (1440p still almost useless, wake me at 10%).

    "Some of these new benchmarks provide obvious talking points, others are just a bit of fun. Most of them are so new we’ve only run them on a few processors so far. It will be interesting to hear your feedback!"

    Please quit wasting your time. It feels like all your benchmarks are "for fun" as I'm not much smarter after coming here. Off to a site that tests a dozen games and some real world stuff some of us actually use (techpowerup for example...games galore, 10 REAL games tested). THIS is how you give a well rounded idea of a cpu/gpu perf. YOU TEST REAL STUFF, instead of your PHD crap or agisoft junk. People use adobe, and play games that SELL. This isn't complicated people.

    Might as well jump off the roof with your cpu and tell us how fast you hit the ground. Just a useless as your benchmarks. Are they benchmarks if nobody uses them? Or is it just more "fun" crap tests that tell us nothing useful? If you are NOT helping me make a more informed decision (useful info) about buying the reviewed product, you have failed. A good review is chock full of useful info related to how we actually use the product, not a bunch of crap nobody cares about or use IRL.

    https://store.steampowered.com/app/975370/Dwarf_Fo...
    The devs make 3K a month from it. This is not exactly played by the world if it pulls down $35K a year. Why even bother testing this crap? Are we all going to go back to pixel crap graphics tomorrow? Heck no. Wake up. Those games (and the shite monitors we had then) are why I needed lasik...ROFL.
  • Spunjji - Tuesday, May 12, 2020 - link

    "Only a retard uses"
    And that's about where I realised you weren't really making a comment so much as farting into a piece of voice recognition software.
  • Meteor2 - Tuesday, August 4, 2020 - link

    I wonder if even one single person ever read that comment
