AMD's Trinity: An HTPC Perspective

by Ganesh T S on 9/27/2012 11:00 AM EST

  • Marlin1975 - Thursday, September 27, 2012 - link

    Later, when you have access, can you do the same test with the lower-end dual-core 65W Trinity?

    I think that would be the best HTPC Trinity if it also keeps up.

    But looks good for a HTPC/Light gaming rig.
  • coder543 - Thursday, September 27, 2012 - link

    gotta agree. The A10 would not be my choice of processor for an HTPC. I would go with something lower cost and lower wattage... but maybe other people enjoy transcoding videos on their HTPCs.
  • ddrum2000 - Thursday, September 27, 2012 - link

    I partially disagree (personal preference). I'd like to see the 65W A10-5700 reviewed as opposed to the A10-5800K, since a 65W part makes much more sense for an HTPC than a 100W part. By extension, the A8-5500 would be interesting as well, though I'm curious how much of a difference the number of Radeon cores makes in terms of HTPC usage.
  • coder543 - Thursday, September 27, 2012 - link

    That's what we said. How do you disagree?
  • Silent Rage - Thursday, September 27, 2012 - link

    You said, "The A10 would not be my choice of processor for an HTPC."

    He said, "I'd like to see the 65W A10-5700 reviewed as opposed to the A10-5800K since a 65W part makes much more sense for an HTPC than a 100W", hence the partial disagreement.
  • MonkeyPaw - Thursday, September 27, 2012 - link

    I transcode on my HTPC, but I just use Quick Sync on my i3 with HD 3000 graphics. I use ArcSoft Media Converter 7 and rip HD TV recordings down to a manageable size to play on my Iconia tablet. Considering that it only takes 20-30 minutes to take a 1080p show down to 720p at 1/6 the original file size, I can't complain about the results. Intel offers an HD 4000 i3, and that would be my HTPC CPU of choice if I had to buy today.
  • Arbie - Thursday, September 27, 2012 - link

    The features you are testing are never obvious from a spec sheet, so a targeted hands-on review like this is very important. At least it is to me, because my next laptop choice will be based on its capabilities for media viewing and gaming. And battery life, followed by weight.

    Thanks!
  • coder543 - Thursday, September 27, 2012 - link

    This was a desktop review. The Trinity mobile reviews happened months ago.
  • stimudent - Sunday, September 30, 2012 - link

    I'm glad that AnandTech has explained to us that this is a staged release and has offered its review based around that by looking at past performance. This is better reporting, not the immature, biased reporting being done by Tech Report.
    If Intel did this, it's almost a sure thing TechReport.com would not have said a thing about a staged release and would have gone ahead with its review the same way AnandTech did here.
  • ChronoReverse - Thursday, September 27, 2012 - link

    Isn't 23.977 what you'd actually want rather than 23 Hz? I can't think of when you'd want exactly 23 Hz (whereas 24 Hz, 25 Hz and 30 Hz are all useful), and 23.976 is what you'd want for telecined material.
  • ganeshts - Thursday, September 27, 2012 - link

    Hmmm.. all vendors tag 23.976 Hz as 23 Hz in the monitor / GPU control panel settings. So, when I set the panel to 23 Hz, I am actually expecting 23.976 Hz. However, this platform gives me 23.977 Hz which is a departure from the usually accurate AMD cards that I have seen so far.
  • ChronoReverse - Thursday, September 27, 2012 - link

    23.977 and 23.976 are so close that they're basically the same (the error in the measuring tools would be as large as the difference). I'd only be concerned if it were 23.970.

    In any case, from looking at the screenshots in the gallery, the only frequency looking rather off is 60Hz (although my AMD card has always given similar lower than 60Hz results anyway).
  • ganeshts - Thursday, September 27, 2012 - link

    Note that these figures are in Hz, not MHz, so a 0.001 Hz difference is not measurement noise. In fact, madVR's statistics report refresh rates accurate to six decimal places (as the screenshots show).

    To read more on why the 0.001 Hz difference matters for SOME people, look this up: http://www.anandtech.com/show/4380/discrete-htpc-g...

    In short, with the 0.001 Hz difference, the renderer might need to repeat a frame every ~17 minutes. I am NOT saying that this is a serious issue for everyone, but there are some readers who do care about this (as evidenced by the range of opinions expressed in this thread: http://www.avsforum.com/t/1333324/lets-set-this-st...
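
    A rough sketch of that arithmetic, for anyone who wants to check it (plain Python; the helper name below is just illustrative, not anything from madVR):

        # How long until a source/display refresh-rate mismatch forces the
        # renderer to repeat (or drop) a frame: one full frame of drift
        # accumulates after 1 / |difference| seconds.
        def seconds_per_glitch(source_fps, display_hz):
            return 1.0 / abs(display_hz - source_fps)

        # 23.976 fps material on a panel actually refreshing at 23.977 Hz:
        print(seconds_per_glitch(23.976, 23.977) / 60)  # ~16.7 minutes per repeated frame

        # 24.000 fps material on a 23.976 Hz panel:
        print(seconds_per_glitch(24.000, 23.976))       # ~41.7 seconds per repeated frame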
  • ChronoReverse - Thursday, September 27, 2012 - link

    That thread on avsforum is talking about 24FPS playback where if you got 23.97x instead, it's a stutter about every 42 seconds which is terrible and clearly not acceptable (to my eye anyway).

    Still, I do admit that even a single stutter every 17 minutes is noticeable.

    Also, I had misread that part of the review a bit; for some reason I had the impression it was saying AMD's performance had diminished, when it's still about the same, +/- 0.002 Hz.
  • jeremyshaw - Thursday, September 27, 2012 - link

    Wasn't AMD's first APU Brazos, not Llano? Or was it too small to really count!?
  • ganeshts - Thursday, September 27, 2012 - link

    Technically correct, but it didn't compete at the same level as the Clarkdale / Arrandale / Sandy Bridge lineup :)
  • jamawass - Thursday, September 27, 2012 - link

    "The video industry is pushing 4K and it makes more sense to a lot of people compared to the 3D push. 4K will see a much faster rate of adoption compared to 3D, but Trinity seems to have missed the boat here. AMD's Southern Islands as well as NVIDIA's Kepler GPUs support 4K output over HDMI, but Trinity doesn't have 4K video decode acceleration or 4K display output over HDMI."
    Although this statement is technically correct, it has no real-world relevance. At this time, people who can afford 4K TVs (if there are any commercially available ones at this time) won't be messing around with cheap HTPCs. It's an inconsequential statement made just to detract from AMD's overall superiority with this product in the HTPC market.
    If I were in AMD's shoes, why would I dedicate resources to a nonexistent market? Has anyone actually tested NVIDIA's or Intel's 4K output over HDMI to see whether it actually works? In the early days of HDCP, all the video card manufacturers were claiming compliance, but real-world compatibility was a different matter.

  • ganeshts - Thursday, September 27, 2012 - link

    I had the same caveat in the Ivy Bridge HTPC review. Surprised you didn't notice that, but you notice this :) Ivy Bridge doesn't support 4K over HDMI yet.

    Anyways, yes, we have tested 4K output from both NVIDIA and AMD. When the AMD 7750 was released, we didn't have access to a 4K display, but things changed when the GT 640 was released:

    http://www.anandtech.com/show/5969/zotac-geforce-g...

    I don't have a sample image ready for the 7750, but I can assure you that it works as well as NVIDIA's, and I have personally tested it. In fact, AMD was the first to offer 4K output over HDMI.
  • JNo - Saturday, September 29, 2012 - link

    More importantly, do you have any 4K films to watch? No. Will you in the immediate future? No. Even then, when will *most* new films coming out be available in 4K? Probably in 5 years' time, when you'd build a new HTPC anyway.

    The 4K thing is absolutely irrelevant at this point (unlike 3D I'd argue because you can go into plenty of shops and buy actual 3D media).

    After hi-def came out, hardware (TVs) was available quickly, but it took a *long* time before there was plenty of 1080p material (note the use of the word 'plenty'). Hell, most people I know are still watching stuff in SD. Laughably, 4K isn't even close to being out yet, let alone the content.

    The whole thing's a red herring right now and for a long while.
  • Cotita - Thursday, September 27, 2012 - link

    I'm not sure I'd go for an A10.

    Even an A4-3420 would do pretty much the same.

    Heck, if I don't care about HD Flash or Silverlight, even an E-350 is enough.
  • Beenthere - Thursday, September 27, 2012 - link

    There is an appropriate CPU/APU model for every budget these days. Virtually any current APU/CPU will perform just fine for 98% of consumers. Most consumers buy what fits their needs and budget, not the overpriced, overhyped top-of-the-line models.

    AMD's new Trinity APUs and Vishera desktop FX processors offer more performance for less, which is good for consumers.
  • silverblue - Thursday, September 27, 2012 - link

    We don't know about Vishera, not yet anyway. We don't know what the improvements over Bulldozer will yield as a whole, only what a couple of benchmarks showed in a brief Tom's comparison between Trinity and Zambezi. There are plenty of scenarios to consider.
  • Beenthere - Thursday, September 27, 2012 - link

    Yes, some of us do know the results... Comparing Trinity to Vishera is incorrect; Vishera is to be compared to Zambezi.

    AMD has hit their projected 10-15% gains for Vishera compared to Zambezi. Some people already know the results but the NDA doesn't expire for a few weeks so they can't print them yet. Most folks will be happy with Vishera except the haters.
  • silverblue - Thursday, September 27, 2012 - link

    I'd find it hard to believe you were personally under NDA (please prove me wrong). I also believe the gains were per clock, which should theoretically, given the assumption you stated, result in a slightly larger performance gap between the 8150 and the 8350 as the latter has a higher base clock and is more likely to hit max turbo speed.

    Like I said though, two benchmarks in the public domain aren't gospel, regardless of whether we're comparing Vishera OR Trinity to Zambezi. Remember that L3 cache doesn't always help, but when it does, the gains can be significant, meaning the A10-5800K could occasionally be outperformed by a similarly clocked 41x0 CPU, but the flip side is that it could occasionally perform on par with a similarly clocked 43x0 CPU.
  • Death666Angel - Thursday, September 27, 2012 - link

    "AMD was a little late in getting to the CPU - GPU party. Their first endeavour, the Llano APU"
    Aren't Zacate and Ontario APUs? They were released in 01/2011, half a year before Llano. Or aren't you counting low power APUs? :)
    Thanks for the article!
  • Death666Angel - Thursday, September 27, 2012 - link

    Mea culpa, didn't read the comments before posting my own. :) Disregard.
  • EnzoFX - Thursday, September 27, 2012 - link

    What's the big deal with 4K at THIS moment? There are no 4K TVs out, are there? By the time they're out, or by the time they're actually affordable for a decent number of consumers, we will have had several generations of new APUs.
  • Denithor - Tuesday, October 2, 2012 - link

    http://www.lg.com/us/tvs/lg-84LM9600-led-tv
  • Allio - Thursday, September 27, 2012 - link

    These HTPC-perspective articles are consistently some of the most useful and interesting content that AT puts up. As far as I can tell, there really aren't any other tech sites that delve this deep into this kind of functionality - most reviews settle for playing a 1080p Blu-ray and posting a screenshot of the CPU usage in Task Manager. While it may be only a relatively small audience for whom this stuff is relevant, we are a very interested audience, and I personally appreciate every detail and statistic included. Thanks Ganesh!
  • wharris1 - Thursday, September 27, 2012 - link

    At this point, it seems 4K is mostly marketing hype. I'll link this article: http://reviews.cnet.com/8301-33199_7-57491766-221/...
    For typical AnandTech readers (probably much more technically gifted than me), I also recall reading a similar article/post on avsforums explaining that for any display size under ~100 inches, the 4K standard is hard to justify. Also, while I know that future-proofing is sound, there is very little content, or ability to play back said content, at that resolution. As a previous poster mentioned, by the time 4K becomes a standard, the current platforms will seem antiquated. Anyway, AnandTech is the best tech site around by far; I read it every morning.
  • Denithor - Tuesday, October 2, 2012 - link

    Which explains 1600p on 30" monitors, right?

    Granted, most of us don't sit 3-4' from our TV, but I know that even on my 50" set, 1080p barely cuts it (text is hard to read sometimes, and fuzzy if I zoom in enough to read easily).
  • OCedHrt - Thursday, September 27, 2012 - link

    Meaning, setting it to 16-235 means to discard 0-15 and 236-255 and expand the remainder to full RGB.

    Obviously I don't have a Trinity setup so I'm just speculating, but on my HD6400 there is a different parameter on the display configuration section to tweak screen output range - which I set to RGB full range.
  • ganeshts - Thursday, September 27, 2012 - link

    I think you are referring to the pixel format output which is YCbCr 4:4:4 / YCbCr 4:2:2 / RGB Limited / RGB Full

    The dynamic range aspect is orthogonal to the pixel format output over HDMI.

    The screenshot posted is that of a video playing in the background. Sorry if that wasn't clear. I am not sure about AMD's terminology here, but any user setting the dynamic range to 16-235 would expect NOT to see values 0 - 15 and 236 - 255.
  • OCedHrt - Friday, September 28, 2012 - link

    Yes I was referring to pixel format output. I use RGB Full. I was under the impression that YCbCr cannot display the ranges 0-15 and 236-255 but I think I might be wrong on this one. It is YV12 / YUY2 colorspaces that lack these ranges.

    And what you're saying about dynamic range is exactly what I'm saying is happening. If you select 16-235, then 0-15 and 236-255 from the video are filtered out and the remainder is expanded back to 0-255. Thus, a video decoded to YV12 / YUY2 space and played on a full-range display would have greyish blacks or whites without selecting the 16-235 range. Meaning, the wording on AMD's UI is correct; it's just that the whole idea behind it is confusing.
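
    To make that concrete, here is a minimal sketch of the usual 16-235 to 0-255 expansion (illustrative Python only, not AMD's actual driver behaviour; out-of-range values are clipped first, i.e. discarded):

        def limited_to_full(value):
            # Clip to the 16-235 'video level' range (0-15 and 236-255 are discarded),
            # then stretch the remainder over the full 0-255 range.
            clipped = min(max(value, 16), 235)
            return round((clipped - 16) * 255 / (235 - 16))

        print(limited_to_full(16))   # 0   -> video black maps to full-range black
        print(limited_to_full(235))  # 255 -> video white maps to full-range white
        print(limited_to_full(128))  # 130 -> mid-grey gets stretched slightly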
  • superccs - Friday, September 28, 2012 - link

    Have all of our expectations of their new hardware dropped considerably? I am an AMD fan as much as the rest of you, but it just seems like we are trying so hard to find their stuff useful.
  • CeriseCogburn - Saturday, October 13, 2012 - link

    Good deal, another fanboy zombied out for years by the marketing hatred and hype useful idiots collective has shown a glimmer of light, hope that the slave mind can break free from the dirty chains.

    The new test is this: Would you put up with this crap from any other company or vendor?

  • Hardcore69 - Friday, September 28, 2012 - link

    - HTPC box: No point. A G540 + GTX 650 if you really want MadVR and 23.976.

    - Office box: No point. A G540 is enough for a basic everyday system

    - Gaming box: No point. A dedicated card is still the answer for 1080p High/Ultra gaming i.e. real PC gaming.

    Well? APUs are rather pointless. All this accelerated media crap: HD 1000 can do that too.
  • Medallish - Friday, September 28, 2012 - link

    - HTPC box that's passively cooled: an A10-5700 would work great in there and be a nice upgrade!

    - Office/Workstation Box: GPU acceleration can make a lot of difference, not to mention people have different needs.

    - Gaming box: for someone who wants to game but doesn't want to shell out the money needed to get 1080p Ultra graphics, or, as I see it, a gaming starter kit.

    Well? APUs have plenty of point if you're not an out-of-touch Intel fanatic. Also, did you even read the review? There was encoding and decoding that the APU did really well.

    Btw, I have a passively cooled HTPC and a laptop I use for office work, both based on APUs (currently Llano; the HTPC is getting a Trinity upgrade, though), and I wouldn't want them any other way.
  • ssj3gohan - Monday, October 1, 2012 - link

    Passively cooling a 130W box? Really?

    I'd like to see AMD trying a bit harder to keep their power consumption down, because in the end the reason for me to choose an i5-3570K was that, like AMD, it offered 'enough' GPU power, but at a much lower maximum power. My computer runs at well under 10W idle and about 75W max (OCCT + Furmark), more like 45W in normal use. I wouldn't be able to get near that kind of power consumption with equally-featured Trinity parts (aside from the lower CPU performance, which isn't really a big deal, tbh).

    (by the way, my 5.9W core i5-computer: http://ssj3gohan.tweakblogs.net/blog/8217/fluffy2-...
  • Medallish - Friday, October 5, 2012 - link

    Yup, I've been working on my own little HTPC project (although not as cool as yours :D). The Streacom FC5-OD is surprisingly good at cooling down even a 100W APU. Right now I'm using a 3870K; I'm planning on getting the A10-5700 ASAP, and the final touch I plan on adding is a 6670, connected to the opposite cooling ribs. However, right now I'm running into a PSU limit, which I plan on countering by getting a slightly better PSU (a 250W car-PC PSU instead of a 150W picoPSU).

    But yeah, despite the slightly higher load power, the fact is that at idle, and most likely on average, AMD has really brought down power consumption with Trinity. I like your setup, though, and will probably borrow a few ideas from it.
  • Oxford Guy - Friday, September 28, 2012 - link

    4K strikes me as being completely unnecessary. 1080p is enough resolution.
  • brookheather - Friday, September 28, 2012 - link

    Is this a typo? "Intel and NVIDIA offer 50 Hz, 59 Hz and 60 Hz settings which are exactly double of the above settings" - 59 is not double 29 - did you mean 58?
  • ganeshts - Friday, September 28, 2012 - link

    Nope :) 29 Hz is 'control panel speak' for 29.97 Hz, and 59 Hz is 'control panel speak' for 59.94 Hz. So, if you have a file at 29.97 fps, it can be played back at 59.94 Hz without any dropped frames or uneven repetition, since each frame just has to be 'painted' twice at that refresh rate.
  • cjs150 - Friday, September 28, 2012 - link

    This is exactly the standard of article I read AT for.

    I remain completely bewildered that chip manufacturers cannot get the frame rates right. It may be an odd frame rate, but it is a standard that has remained the same forever.

    However, the problem for AMD remains the TDP of the processors. Heat has to be dealt with, usually by fans, and that means noise. An HTPC needs to be as close to silent as possible.

    A TDP of 65W is simply too high. You can (as I have) buy a ridiculously overpowered i7-3770T, which has a TDP of 45W. AMD need to reduce the TDP to no more than 35-45W. At that point there are various HTPC cases which can cool that completely passively.

    Overall, this is yet another step toward the ideal HTPC, but we are still short of the promised land.
  • wwwcd - Saturday, September 29, 2012 - link

    The i7-3770T is too expensive compared to the Trinity models, and its video is much weaker. For people on a budget, it would not be the choice.
  • cjs150 - Saturday, September 29, 2012 - link

    I agree that the i7-3770T is too expensive at the moment compared to AMD alternatives, but it does not have video weaknesses; check out the review on AnandTech.

    The refresh rate is close to the correct rate, but close is not good enough; it should be spot on.

    There is still a lot of work to be done to get to an ideal HTPC CPU. Both AMD and Intel are close. If anything, AMD has slightly better video, but, as I said, the TDP is too high.

    Of course, the other option is something like the Raspberry Pi; unfortunately, whilst the hardware is promising, the software still needs a lot of work.
  • Burticus - Friday, September 28, 2012 - link

    Put one of these on a mini-ITX board and cram it into something the size of the Shuttle HX61 that I just got, and I'm interested. I am so spoiled by having a small, silent, cool HTPC that I will never go back to anything louder or bigger than a 360.
  • LuckyKnight - Saturday, September 29, 2012 - link

    AMD is missing a market here: working 23.976 Hz output with a 35W TDP for a passively cooled case. That would be my choice, if it existed.

    Shame Intel can't get 23.976 to work properly, despite their alleged promise!
  • Esskay02 - Saturday, September 29, 2012 - link

    "Intel started the trend of integrating a GPU along with the CPU in the processor package with Clarkdale / Arrandale. The GPU moved to the die itself in Sandy Bridge. Despite having a much more powerful GPUs at its disposal (from the ATI acquisition), AMD was a little late in getting to the CPU - GPU party."

    According to my reading, it was AMD, not Intel, that first talked about and initiated the APU (CPU + GPU). Intel saw the threat, used its manpower and resources, and came out with a CPU + GPU chip first.
