Even in gaming, Intel drivers are still more stable. Everything else depends on architecture optimizations for each game, and Intel, just like nVidia, is pretty good at securing partnerships.
That's a big minus for AMD imo. As someone who owned an RX 480, RX 580 and RX 590, I dreaded every new update from AMD, since more often than not it would break at least one game I played.
Looking at your comments - AMD GPU drivers are bad, AMD chipset drivers are bad... ever think it's just the person behind the screen that has no clue? Go buy a gaming console or an OEM pre-designed system. It's clear you are not up to the task of maintaining a computer.
What's awful about them? I have no issues. I will say that on my laptop the Radeon settings panel doesn't work, the Nvidia control panel absolutely sucks, and on mobile even more so - at least it starts, but I don't know what to do in it because it's mostly empty. Intel's GPU control panel on laptops I like: it works. Their backend driver is rough, but it's okay and I see them making great strides. I'm happy, and not having my control panel isn't the biggest loss, but it would be nice.
Chipset drivers for me have mostly just worked; even when I don't install them, my system has just worked too... never really understood them either way - Windows just takes care of you, really. Intel has some crusty GPU drivers, but I prefer them over Nvidia's. So on Windows laptops, Intel GPU drivers are by far the best imho; for Linux, AMD; for Windows desktop, AMD.
I've had games crash for months on Nvidia, so it's not like they're perfect, and a stupid GeForce Experience, an MS Store download for the control panel, and a control panel from 1999 doesn't cut it. Fix that and they'll be the Windows desktop #1.
Actually, AMD chipset drivers are fine; the issue is folks don't do a clean install and delete the old installation drivers from the folder - that, and the fact that MS constantly meddles with the drivers via Windows Update. This is a decent release from Intel and no doubt it will briefly go to the top in some game titles; however, AMD already makes a Ryzen with RDNA2 built in, and I can't see it taking a lot of effort to expand that to mainstream. I imagine that is the next iteration anyway.
It's been 11-12 years since I fooled around with AMD (Bulldozer era) - I fell for the 6-core hype and found that the throughput on everything from their implementation of USB to SATA III resulted in benchmarks that were anywhere from 25-40% below Intel. I've been sorely tempted by the Ryzen stuff, especially the brand new Renoir APUs - but your comment made me think back - are they really that bad at the chipset level?
@way2funni - No, they're really not that bad at the chipset level. The Bulldozer era was a horror show that has luckily long-since ended. vladx regularly makes claims against AMD and/or in favour of Intel/Nvidia that are unsupportable by facts - they like to mix it up with a few valid points every now and again, but you can assume that any decidedly negative statement is false.
For context, I just (last Thursday) built a brand-new AMD (Ryzen 3 3100, RX 580 - my budget is LOW) system and have had no problems whatsoever with drivers. I moved over from an Intel/Nvidia system. The CPU wasn't even supposed to be supported out-of-the-box by the crappy old A320 board I used, but it still managed to POST and get me to the BIOS screen where I could flash an update.
@way2funni No problem! For what it's worth, from what I've heard, the mobile Ryzen stuff can be more of a fiddle in terms of chipset / GPU drivers. I have no experience there though, so I can't comment from a personal perspective.
AMD chipsets never had issues, wth are you talking about? Clearly you're trying to find any excuse to smash AMD to the ground. But let me remind you (considering the level of stupidity in your comments), you must make peanuts while AMD makes billions, and someone at AMD has the IQ to fabricate those monster CPUs. Maybe you should get a job at AMD and be their engineer instead.
I have used AMD, Intel and nVidia GPUs at various points for decades; they have all been fine, and they have all had "rough spots".
E.g., with the older GMA X3100 drivers, Intel's GPU would take a performance hit whenever there was a 2D overlay.
nVidia had driver cheats during the FX days.
AMD had frame pacing issues during the early GCN days.
But we also need to keep in mind that GPU drivers these days have more lines of code than most operating system kernels, which is absolutely nuts to think about, it's going to take time to get stuff right, mistakes will be made. AMD, nVidia and Intel are all the same in this regard.
This brand loyalty/fanboyism is just hilarious and not beneficial to the consumer.
Currently running a Radeon RX 580 in a PC and it's rock solid, no issues, running a Geforce 1660 in another rig, again... Rock solid, no issues. Running an Intel compute stick in yet another rig, rock solid, no issues.
@StevoLincolnite: I have owned dozens of GPUs from both AMD/ATI and nVidia since the 90's, and there hasn't been a GPU or CPU launch from AMD/ATI without GPU/chipset driver issues. Even the great Athlon generation (I had an Athlon XP 2500+ Barton at the time) had issues with memory stability.
Meanwhile Intel chipset/iGPU drivers since the 90's have been rock solid, while nVidia had very few gens with problems.
The Athlon XP's memory stability would have been a chipset issue. AMD didn't make chipsets for their systems at that time. Well, they did have the AMD 750 (Athlon Thunderbird (B) gen chipset) and 760 chipset (Thunderbird (C) generation), but those were early in the socket's lifetime and were not common on the market. With that in consideration, since you specified the Barton 2500+, your choices were an SiS (746FX or 748), VIA KT333/400, or the coveted but UNSTABLE nForce 2. I owned one of those, and I know AMD did not make the chipset for that platform. Also remember the memory controller was part of the chipset, so VIA would have been in control of any memory issues. Most blame would fall on VIA, SiS, or Nvidia, not AMD.
As a note, there are multiple reports of Intel issues. Not sure how you missed out on those. Like the early self-destructing SATA ports on the first-gen i7 motherboards. I also remember platform issues with one of Intel's ****well-E based platforms dealing with early firmware problems. I know AMD's Zen platform had issues, but Intel is certainly not without theirs.
Nvidia has also had plenty of problems; their chipsets (AMD 64 era) caused lots of problems when moving to the Vista OS, being a major cause of crashes for the Vista platform. They have had multiple GPUs recalled for self-destructing in notebooks due to overheating issues.
I should clarify the nForce 2 statement. That chipset was known for its performance, but it was by far not the most stable platform of that era. The VIA platform was slower, but from what I could tell it was much more stable.
@Luminar: As previously stated, I also owned an RX 480, RX 580 and RX 590, so I'm not living in the past at all. I also tested a 5700 XT, which had lots of graphics artifacting in games. My point is, with almost no exception, AMD and ATI have had driver issues since before the new millennium.
@vladx, maybe you consistently underpower your systems with shitty PSUs... then yes, all your setups will be unstable. I concur with @StevoLincolnite
Just accept it dude, Intel is going down no matter what they come up with. Stop writing lies; I had Durons and Athlon XPs and never experienced the "memory instabilities" you talk about.
@Samus - It's no more true for AMD than it is for Nvidia, and I can say this having *personally* owned cards from both since the TNT2. The Rage Maxx and original Radeon had legitimately wonky drivers, and ever since then fanboys have copy/pasted the same "AMD drivers suck" nonsense, pointing to whatever latest wrinkle pops up as evidence, whilst keenly ignoring tidbits like Nvidia literally destroying some of their customers' cards with one driver release (it wasn't actually a big deal, didn't affect many cards, but god damn if that had been AMD...)
It's all so, SO tedious. The truth is that Nvidia hold most of the market and are considered default, so every time someone "switches sides" they balk at every little bump in the road. Meanwhile Nvidia's drivers seem utterly incapable of figuring out how multiple displays work until you explicitly tell them, seem to need manual prodding to use the correct GPU in the system for video acceleration, and incorporate tons of other little annoyances that most Nvidia users don't really notice because they've never tried anything else.
I remember people raging at ATI drivers when the Radeon was first launched in 2000. I didn't buy one till the 8000 series and have owned maybe 20 and installed a few hundred (about the same for NV cards), and honestly? I've never really experienced these alleged "driver issues". There have been a few cards that turned out to be duds (from both companies), but I blame that on the manufacturer. I think the only card I ever really had driver issues with was 3dfx, and I friggin loved that card because when it worked it was awesome as hell - I just struggled a lot to get it working on all my games. But it wasn't made by Nvidia or AMD/ATI, so there is that.
I had an R9 280 and it served me well before I replaced it with an Nvidia 1070. Before the 280, I was running a Radeon 6870 and that GPU served me well for a few years.
The 1070 serves me well too, save for recent drivers freezing Planetside 2... so I have rolled back and am running the May drivers.
I found their drivers were OK, but the software that came with them was often system-wrecking. If you do some tricks to only install the drivers, it is relatively stable. That being said... it is ridiculous to have to do that.
Then you don't play games like World of Tanks or World of Warships. At least when I played on AMD cards 2 years ago, the drivers followed a repeated pattern of breaking one or both of those games and then fixing the bugs introduced in the previous update.
@vladx - There it is! "more often than not it would break at least one game I played" resolves into "I (allegedly) had problems with these 2 games by this one shitty developer".
Having played both of those games at various times and with various GPUs, I can't say I've been able to mirror your experience. But who knows, maybe I'm just installing it wrong. 😂
Which demanding gaming scenarios would those be? I don't really see many people playing high-fps AAA games while streaming or recording simultaneously with a multi-monitor setup on an Intel iGPU.
I have a laptop with Core i7 5500U. Intel's drivers are rubbish. Connecting the laptop to one (or 2) external monitors is a painful experience. I also have a laptop with Ryzen 4800HS *and* 2060-MaxQ. Flawless, despite having both AMD and nVidia drivers.
@Spunjii: Everyone here knows you're a rabid AMD fanboi who can't accept facts, I didn't talk about performance but stability and issues occuring in every generation of their products which only happens to AMD. In those matters, Intel drivers are indeed a hundred times better than AMD's especially if you include Linux drivers as well not just Windows.
Ha! So rabid that last week was the first time in *fourteen years* that I've built a system based on an AMD CPU for my own personal use. So utterly frothing, indeed, that since Maxwell launched, 4 out of 6 of my gaming systems have used Nvidia GPUs, up to and including a pair of GTX 980s.
You can project at me all you want - it's clear from other replies that most people see through you.
I had an Intel Skylake i7 6700 that could not come out of sleep due to Intel drivers. Had to shut down rather than sleep for what seemed to be a year. I would say intel drivers are just as wonky.
Yes, like drivers for the crappy Sandy Bridge GMA HD 2000 3D, which aren't available directly from Intel for Windows 10, and the crappy low-end Nvidia GT 710, which also isn't available for Windows 10.
This new Intel is too much for us and now Nvidia has a 3 year lead on graphics. I feel the days of Sledgehammer are back now. Kind of a dark cloud for us fans.
Intel actually has already stepped up, with help from AMD. Going by dollars to performance, the 10700 ($300 at MC) compares well to the 3700x ($270 at MC), and the 10100 ($100 at MC) is a better buy than the 3100 ($100)/3300x ($150) Ryzens after factoring in the cost of the motherboard and RAM. The 10700 actually went up $10 at MC this past week; I'm assuming it was because it's a good option against the 3700x for people who don't care to OC.
The new A-series mobos bring AMD back into it, but then the 3100/3300x lose overclocking/performance, and the 3300x still costs noticeably more overall.
I'm not seeing that though. The 10700 and 3700x are both rated at 65 watts, and despite being 7nm, the thermals on the 3700x aren't much better. I see plenty of posts about AMD's high temps, and after checking, I found that chips like the 3700x pull 200 watts for short bursts. In fact, going the opposite way, like many of us SFF/ITX builders do, Intel chips undervolt very well; AMD seems to have a short leash there. None of these points (thermals, performance, brand) are reason enough to go one way or another, so for me, cost factors as high as anything else.
The new A series helped AMD's value proposition a lot. X chips don't need to be OC'd since they are mostly maxed out, and B mobos are priced high like their X boards. With their new CPUs on the horizon, AMD could be the clear-cut choice, but again, it will depend on cost.
"Intels undervolt very well" - sure, if you're happy to give up most of that Turbo performance - at which point you might as well disable PBO on the AMD chips and see how they stay neatly within their TDP for a fairly minor sacrifice in peak performance.
Kinda seems like you're either severely misinformed or extremely dishonest.
You don't need to touch Turbo rates to do that; undervolting just involves applying a negative voltage offset, as opposed to overclocking, which usually requires a positive offset. Just goes to show how little you know about CPUs, and especially those from Intel.
Bloody hell vladx, I know how undervolting works 😂 I've been doing it with a 6700HQ for 3 years, and that does indeed undervolt very well, mostly because the stock voltages are crazy-high and it only turbos up to 3.5Ghz.
The fact remains that you're limited in how much you can undervolt the highly-tuned 10 series without giving up stability at those 5Ghz+ peak turbo clocks.
"The fact remains that you're limited in how much you can undervolt the highly-tuned 10 series without giving up stability at those 5Ghz+ peak turbo clocks."
@Spunjji: I own a notebook with 10th gen Intel CPU and guess what, the undervolting values for my i7-10875H are -0.12V cache and -0.165V core which is just as good as previous generations.
Uhh... you can't even OC/XMP the RAM on Intel motherboards outside of the Z-series, which cripples Intel's performance on them. And I don't think AMD still costs noticeably more.
Certainly not in the UK. I literally just bought a 3100 system so I have the numbers. For AMD: 3100 is £95, 3300X is £120. An A320 motherboard that takes them is £45. For Intel: 10100 is £119. Cheapest H410 board is £60 but lacks an M.2 slot.
You "lose" the OC on the AMD systems by going with such a cheap motherboard, but you never had it in the first place in the Intel system, so that's a bit of a red herring.
The 3300X easily wins on performance, while the 3100 is great value for people like me who literally have no spare money for a system and suddenly find themselves needing to build one to replace something that died.
Power consumption != TDP. You seem confused. These chips beat AMD chips that consume far more power in many benchmarks.
AMD does not compete in this segment. There are zero laptops with AMD chips in them in this weight class with Wifi6, TB4, etc and TGL has better single core and GPU performance than ANY AMD chip. This chip is one of the best chips out there and I say this as an AMD system owner, stock holder, and as a tech enthusiast and engineer.
Yep I bought a Renoir laptop recently and now thinking if I should have waited for TGL instead, if anything, for the futureproofing in terms of TB4/USB4 and the huge step up in graphics performance. Perhaps as a consolation I guess the 4750U is still slightly faster in terms of multithreaded performance...
Need to wait till the review units appear. AMD always touts Cinebench, when less than 1% of consumers use it, while Intel uses SYSmark (closer to real-life applications). But how close is this to real-life usage? We need AnandTech's help to validate.
No, vladx - an artificial benchmark suite composed of cherry-picked *segments* of real-world applications overseen by Intel is not "closer to real-life" than a single benchmark based on unaltered software that a few people actually use. https://en.wikipedia.org/wiki/BAPCo_consortium
If they had convincing real-world numbers then they'd quote the real-world numbers. Evidently they do not, so they've resorted to playing a rigged game.
Sadly, I am not. Of course, power consumption and TDP are totally different. However, do you have bonafide evidence showing the Renoir chips consume far more power in ___ benchmark(s)? You seem to be the confused one here. I have gone through the full press release. I suggest you do likewise. One part of the document that is a dead giveaway is on page 82. Pl1 shows a tremendous performance increase in pushing the PL1 ceiling from 15W to 25W, on the order of 33-40%. That brings me again to my point: they are increasing the default power draw to give the illusion of a huge generational leap. Let's not kid ourselves here. Yes, Intel fixed some problems, but they are still in a world of hurt and are massively behind the competition.
Power consumption does not matter. If a CPU spends 2 seconds consuming 50W to complete a task vs. 5 seconds for the same task on another CPU at 25W, which consumed more power?
Of course, energy equals power times time. As to your hypothetical scenario: neither, because the Renoir part consumes less energy over time (power times time) than either Intel part you first mention in that scenario. ;)
No, eek2121 is an obvious shill and you're an obvious confederate. His "clear pin point answer", as you put it, was a fabrication that tried to re-frame the issue.
The *real* comparison would be: if one CPU spends 2 seconds at 50W and the other spends 3.5 seconds at 25W, which consumed more power? But he had to fudge the numbers to make a bogus point.
"Power consumption does not matter. If a CPU spends 2 seconds consuming 50W to complete a task vs. 5 seconds for the same task on another CPU at 25W, which consumed more power?"
That would only be true if the 50W CPU was more than 2.5x faster, and it isn't.
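As a quick back-of-the-envelope sketch of the energy arithmetic being argued above (Python; the wattages and completion times are the hypothetical figures quoted in these comments, not measurements of any real CPU):

```python
# Energy = power x time. All figures below are the hypothetical ones quoted in
# the comments above, not measured values.
scenarios = {
    "50 W chip, 2.0 s (original hypothetical)": (50, 2.0),
    "25 W chip, 5.0 s (original hypothetical)": (25, 5.0),
    "25 W chip, 3.5 s ('real' comparison)":     (25, 3.5),
}

for name, (watts, seconds) in scenarios.items():
    joules = watts * seconds
    print(f"{name}: {watts} W x {seconds} s = {joules:.1f} J")

# Break-even on energy is when the time ratio equals the power ratio
# (50/25 = 2x here); the original hypothetical assumed a 2.5x speedup.
```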
Also, don't forget there is no product here and no benchmarks. No release date or price yet either. Also up in the air is whether or not Intel can keep up with demand once released. They have "fibbed and faltered" on all fronts lately.
"Power consumption does not matter." What kind of idiocy is this logic? If you're in the mobile market, sure as hell power consumption matters. You have a limited battery that feeds that processor, as well the temperature constraints.
Shill idiocy. They don't have to be right, they just have to repeatedly post false information. Few people bother with the comments, and almost nobody really reads the bickering after the first few posts.
Why post a Twitter post from June...when Anandtech confirmed it's 15 W in today's official article? Sometimes, the rumor mill has genuinely taken over the PC hardware commenting meta.
I trust the engineering documentation that has been provided to OEMs/SIs for the last year more. That is what all these upcoming Tiger Lake-U systems have been built around. Unless Ian has received confirmation saying otherwise? This wouldn't be the first time a typo or a mistaken assumption (not speaking ill of Ian) has slipped in.
Ah, interesting. Intel is omitting the default TDP here. Like the leaks stated numerous times before, the base clocks they have been quoting have been based on their new 28-watt default TDP.
Why do you need to defend AMD so vigorously? Be open, and judge both sides of the offering... there should be plenty of reviews coming up; make your judgement then, not now.
@Gondalf, silver_eagle and any other shill accounts that rely on lies and projection: AMD quote base clocks at 15W. Their turbo clocks are, indeed, at some level above 25W - as their specs indicate. Up until now Intel have been quoting all of their base clocks at 28W. Their other base clock numbers are for 12W, so they have "mysteriously" chosen TDP levels that make a direct comparison impossible, and the clocks at 12W are frankly pathetic. Their turbo clocks are at *50W* which, I'd hope we can all agree, is a silly-high number for a thin-and-light device and is probably only ever going to be seen for a few seconds at a time.
Independent testing has shown that actual shipping laptops with Renoir have remarkably stable performance over time for a modern processor with boost capabilities, despite some of the designs having absolute dog-shit thermal solutions. This is not likely to be the case at all for TGL. No amount of interpretation or rationalization will alter these basic facts.
@Spunjji... Go read the detailed Ryzen 4700U reviews: on full CPU load it has a power draw of 57W. Yes, the apple of your eye, the 15W 8-core 4700U, takes 57W at an all-core frequency of 2.5GHz. Go read the real-world independent review of the IdeaPad 7 from Notebookcheck. You delusional AMD fanboys go to all possible levels to spread fake news.
@RedOnlyFan - lol. You're quoting power consumption for *the entire system* under an artificial parasitic GPU *and* CPU load as if it's CPU-only. If you weren't such a mendacious twerp you'd have included the link: https://www.notebookcheck.net/The-Ryzen-7-4800U-is... The same review shows the i7-8565U system drawing 53.6W under the same loads, and 49W for the i7-1065G7 notebook. I wonder why you avoided mentioning that.
I wish you shills would quit projecting at me, it's bloody annoying.
Yup, Hardware Unboxed showed this via Intel's spec sheets. The higher clocks (base and boost) are only at the higher TDP, so they're more difficult to compare to the stated Ice Lake values, and the same goes for Renoir.
It is the Intel U 15W line, my friend. Even AMD says up to 25W TDP and frequencies ""UP TO"" X or Y. So if you have a 15W laptop, your Renoir runs much slower than the advertised clock speeds. If the OEM sets Renoir at 25W, hopefully it will run at its 1.8GHz base and 4.2GHz single-core turbo (plus chipset power).
So basically both CPUs at 25/28W yield ---> AMD: 8 cores, 1.8GHz base and 4.2GHz turbo. Intel: 4 cores, 3GHz base and 4.8GHz turbo.
So the IPC matters here, and both processors look very good. A sweet parity, with a clear advantage for Intel in peak clock speed.
Of course it does. That's still nearly half that of a 50W boost. I imagine many current ultrabook power bricks would struggle under such an instantaneous power draw.
Now, back to Intel. Note they omit the default TDP in the just published ARK listing. Going from a 1.3 GHz base to a 3.0 GHz base doesn't seem very likely. Plus, the last few months of leaks have asserted that the official base clock is based on a 28-watt TDP:
EDIT: Vindicated! It says it right there in the ARK listing. The TDP-Up (28-watt) base frequency is 3.0 GHz. No magic bullet solution here, I am afraid.
HP uses 65W adapters for its Spectre series. If 50W is what PL2 is going to be, I'm sure that manufacturers will include a 65W adapter for Tiger Lake. No big deal.
@tamz_msc "most ultrabooks come with 65W power adaptors" - firstly, no, not all. Secondly, there are other components in a system besides the CPU - add in 10W for the rest of the device's power consumption and account for conversion losses in the VRMs and you're easily approaching peak load for a 65W adaptor. Not a great situation, really.
Luckily, there is literally no "ultrabook" design out there that can dissipate a 50W thermal load for more than a few seconds. 30W? Sure, maybe for a few minutes. 50W? Nope. So in reality you're just going to get mediocre performance and a hot device. Yay! :|
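To make the adapter-headroom arithmetic above concrete, here's a tiny sketch (Python). The 50W turbo figure is the one discussed in this thread; the ~10W "rest of system" draw and ~90% conversion efficiency are illustrative assumptions, not measured numbers:

```python
# Rough wall-power estimate for a thin-and-light under peak CPU turbo.
cpu_peak_w = 50.0        # turbo figure discussed above
rest_of_system_w = 10.0  # screen, SSD, RAM, WiFi, etc. -- assumed, not measured
conversion_eff = 0.90    # assumed combined VRM/charger efficiency

wall_draw_w = (cpu_peak_w + rest_of_system_w) / conversion_eff
adapter_w = 65.0

print(f"Estimated draw at the wall: {wall_draw_w:.1f} W")
print(f"Headroom on a {adapter_w:.0f} W adapter: {adapter_w - wall_draw_w:.1f} W")
# With these (admittedly rough) assumptions the peak draw is already right at,
# or slightly above, the adapter's rating.
```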
To quote: "As with most modern CPUs, the Ryzen 7 quickly ramps up well past its target thermal design power, hitting around 30 Watts draw at the start, but as the test goes on, that value falls back to around 18 Watts."
So that's compared with a CPU that runs 50W at turbo and 28W at its quoted base clocks in order to provide comparable performance to Renoir in real-world applications. WOW.
Holy crap, this was the comparison I was looking for. So it seems Intel's upcoming 2021 11th-gen is barely competing with its own 10th-gen/9th-gen/8th-gen offerings of the past. And they have nothing to compete with AMD's current 2020 Renoir. Also, I think we will get only a mild upgrade from AMD's mobile lineup next year (still stuck on Vega, and 7nm).
So by the time Intel finally has a solution for Renoir, it's going to be late 2022. And at that time AMD will probably make a decent upgrade for their next next-gen mobile chipsets: +5nm EUV lithography, RDNA-2 iGPU, Zen3+ CPU architecture. The only question that remains is, will laptops using AMD chipsets finally get TB3/TB4/USB4 support?
It sounds like I'm beating a dead horse here, but there's still a good case to be made for a high-bandwidth connection: just think of using your laptop as a regular ultrabook while out and about, and when coming home you can "dock it" next to the TV (or on a desk). Now suddenly you have many more ports, external storage, beefier active cooling for the laptop's CPU, and a really fast external GPU connected. It would be like the "Nintendo Switch", but for laptops/Steam.
@Kangal I'd say you're taking it a bit far with all that. Intel and Tiger Lake are definitely *competitive* with AMD (in much the same way that Raven Ridge was competitive with Kaby Lake R), it's just interesting to see the extent to which they're bending or breaking their own prior design constraints in order to reach that level of competitiveness.
It'll be interesting to see what Cezanne brings. If they add Zen 3, DDR5 and maybe bump the Vega CUs a little then it should easily reclaim the top spot for overall power/performance from Tiger Lake. Whether or not it will end up in any decent notebook designs is another matter entirely.
I beg to differ. If Intel is ready to sacrifice thermal headroom just to squeeze out more performance, well, that's not much of an improvement. It will always come at the cost of heat generated and power used up, two enemies of "mobile" devices. Maybe in a thicker laptop with large fans and a vapour chamber, a fast charger in the box, and a huge battery... maybe in such a scenario it's not too shabby. But then using that as a comparison against a thinner laptop with cheaper/conventional cooling, a regular charger, and a smaller battery... well, Intel WOULD make that comparison. To me, it is erroneous, sort of like comparing a large sedan to a small sedan and saying that thanks to "innovation" it can travel farther, with no one commenting on the huge fuel-tank size discrepancy.
To me this feels like an evolutionary iteration. You just have to cut through all the jargon to see it; it's almost obligatory for Intel, Nvidia, AMD, Apple and the like to throw these claims out in presentations to impress their shareholders. The iteration is sort of like Intel going from their Core i7-6900K to their Core i9-9900K. Both 8-core/16-thread, and both using the same Skylake architecture - just one is more polished than the other. The Core i7-1185G7 (laptop flagship) feels like an incremental improvement over the i7-1065G7, i7-10610U, i7-8665U, or even the i7-8550U from 3+ years ago. And looking at the Core i7-1160G7 (ultrabook flagship), it too isn't too far off from their i7-10510Y, but a notable step up from the likes of the i7-8500Y.
So my prediction from 6 months ago was kinda accurate. AMD will eventually eat into Intel's laptop (24W) dominance, but Intel is still the king when it comes to the Ultrabook (16W) market. And the tablet (8W) market is practically dead, with Android dominating the cheap-end, and iOS dominating the luxury-end. AMD still has another 1-2 iterations until they can get the recipe quite right/optimised for Ultrabooks, but I won't hold my breath, since the company is stretched very thin R&D wise.
CPU-wise, it definitely is just an iteration. The innovation falls entirely on the GPU side and, supposedly, improvements to the manufacturing process. They clearly didn't get as much of a benefit as they wanted from that, though, hence the silly power levels at 4.8Ghz - but we don't yet know how that looks in terms of "average" usage (as these are clearly never going to spend long at that speed in any realistic scenarios).
I think Van Gogh might be the one to watch in the 7-15W TDP arena. 4 Zen 2 cores (that can actually run close to their boost clocks) and RDNA 2 graphics might well be a healthy competitor to Intel's efforts.
@blppt: Single-core CPU performance also becomes largely irrelevant when you game with an iGPU and are very much limited by that, which will be the case for the vast majority of systems these CPUs go into.
@RedOnlyFan: Intel have designed a chip whose primary performance advantages only really show up in scenarios where you won't notice them. Bold strategy!
Even games these days consistently tax a bunch of cores. Go ahead, run some modern demanding titles on a single core. Sure, one or two threads get hit the hardest, but when you're loading up a bunch of cores you are no longer hitting the quoted single-core peak turbo figures. The fewer cores and sustained watts you have to play with, the lower your sustained turbo.
Intel is slightly faster at games for a couple of reasons. One, their architecture is faster at those sorts of workloads. IPC isn't fixed across all code. So even IF AMD has reached "IPC parity" on average, Intel still can have an IPC advantage in some workloads. It even varies by game, actually. On average I'd say they're pretty close. The other factor is frequency - Intel can typically hold higher clocks.
Intel and AMD both build some really good chips these days, they both have their positives and negatives. Overall I'd say gaming performance of CPUs is a non-issue for anything but ultra high end rigs, because you're going to be limited by your GPU first. Even a cheap hexacore CPU can easily keep a $500+ GPU fed at the kinds of settings you're going to run on said GPU. You don't game at 720p on low on a monster GPU, outside of exaggerated benchmarks.
Most of them still rely on single-core performance - things that need an instant response, like opening an application such as Photoshop for its main functionality, or basic Windows features, to name a few.
Power consumption does indeed not equal TDP, but the graph on page 1 literally shows these processors consuming 50W at peak turbo. AFAIK Renoir boosts up to 35W but mostly stays around 25W. You'd really hope that a 28-50W CPU could beat a 15-35W design in peak performance and single-threaded tasks.
As for the rest of your waffle, it's hardly surprising that there are no AMD laptops in this weight class with Intel-exclusive (TB4) and/or Intel-sponsored (WiFi 6) features. We already know Intel's back to their old "corporate sponsorship" tricks with "Evo" - they refuse to publish the spec, but it's pretty clear that it's about "persuading" manufacturers to single-source from them and/or reserve their best design for systems with Intel CPUs, a g a i n.
Laptops run at whatever power/thermal levels the OEM built them to operate at (normally this isn't something you can mess with). Desktop chips are run at the default settings for the mobo being tested and OCed as high as they'll go with a solid cooling setup.
Laptops decide, just like Intel motherboards, unfortunately. Dell's XPS 13 is 45 W PL2, while HP's Spectre x360 13t is closer to 35 W PL2.
PL2 (and Tau, I presume) should be noted in each review, but laptop OEMs are reticent to share. However, thermals & battery life easily expose manufacturers who don't do their homework & try to stuff in obscene PL2s with no way to handle that power.
This is the turbo boost scenario, which happens for fractions of a second while still maintaining and even exceeding battery life. Did you get it?? Why the lol?
Nobody's in a muddle. It's very clear that in most designs this chip will never sustain its quoted turbo clocks, and that in designs where it sits at the quoted base clocks it will likely run very hot.
"Intel only ever changes its logo when there is a big shift inside the industry" i.e. Intel only changes its logo when it's lagging behind the competition
To be fair, the 2006 brand change came right before Conroe, which was a pretty momentous shift for Intel. It was the start of practically ten years of woe for AMD.
I think AMD is far, far better led now than they were in 2006, and their chances of getting caught flat footed are much, much lower than back then, but it's hard to deny that Intel is more confident in Tiger Lake than anything they've released since Zen 2 came out.
The only big shift in the industry has been Apple dropping Intel. I would think Intel are seeing Apple as their main threat, and are looking to do a big marketing push to counter.
I don't feel like they're any more confident in Tiger than they were with Ice Lake. They appear to be making similarly inflated claims prior to release, which if anything suggests an underlying lack of confidence in the actual product.
If they are so confident, then why did they mention AMD or their products in their Tiger Lake presentation? https://www.youtube.com/watch?v=aFHBgb9SY1Y It's from Gamers Nexus: they mentioned AMD 12 times, the 4800U 8 times, and "competition" and derivatives 27 times. How many times did Nvidia mention AMD or Intel in its Ampere keynote? Zero. Steve Burke, in the video, is not nice to Intel at all about this Tiger Lake presentation. Intel also denounced the use of benchmarks, and said only imitators use benchmarks. Quite funny overall :-)
That's actually a pretty solid way to measure confidence. Intel never used to talk about AMD's mobile products, just how much better their own stuff was than the previous stuff. 😬
Obviously. AMD in 2006 was s**t; it was on the verge of death. When you compare AMD today with AMD even 2 years ago you get mind-bending improvement figures.
"Mind bending improvement" from 2 years ago when AMD were competitive in price/performance with Skylake derivatives would be a pretty impressive feat, especially as Intel still rely on those for their desktop and server products. Are you sure you meant to post that? 😂
Looking forward to the first actual tests. I am due for a new laptop, so it's good if Intel and AMD have at it - better for us. Question: does Renoir support AV1 decode?
Yes, I saw that, too. A bit unfortunate; having AV1 decode in an ASIC makes watching AV1 material a lot less battery-hungry. Still, I look forward to an actual test; Ian, if you can, let us know whether that AV1 playback works well or not.
Well, sometimes there are unintentional mistakes in the articles. In this article there are some: "The top 12-25 W processors are technically known as 'UP3' processors.." - 25W or 28W?
OK, naysayers. Here is a first party source: Tiger Lake-U's 3.0 GHz base clock is ONLY for a 28-watt, "TDP-Up" configuration. No magic bullet with SuperFin, I am afraid.
@Hifihedgehog please stop being a die-hard AMD fan... there is no benefit to being a die-hard fan. Be open to both offerings, and argue with good facts, not speculation, when there aren't enough Intel TGL laptops on the market yet. My advice is to look at the actual product and draw conclusions.
Look who's talking; you comment on every Intel article here like a crazy ex-girlfriend. I would tell you to use your brain, but I know it would just be a waste of time for a rabid fanboi like yourself.
According to NotebookCheck (no idea where they got the benchmark, so salt required) https://www.notebookcheck.net/Intel-Core-i7-1165G7... the Intel Core i7-1165G7 (15W TDP) scores 2530 points in Cinebench R20 CPU (Multi Core), whereas the AMD Ryzen 7 PRO 4750U (15W TDP) is at 2992.5.
That's an assumption. And it's wrong, actually. Per the just published ARK listing, the 3.0 GHz base that Intel quoted in their PR release is for a 28-watt TDP (TDP-Up). More interesting is how they omit the default TDP listing entirely in ARK. Wonder why? ;)
Unless I am missing something, you are mistaking it for the Ice Lake part (Core i7-1065G7) I also put in the comparison. It probably swapped columns with the Tiger Lake part after hitting the arrows.
Obfuscation is the name of the game. This way they get a hard reset, instead of having to add a +, then delete the + a few months later and erase all records of previous processes that may or may not have had the same name. 😬
Stop quoting a single number (TDP). Start quoting the range. It’s highly deceptive to quote 15W and post that as a graphic when the chip draws more than double in real-world use.
The people who said “power consumption doesn’t matter” are wrong for many reasons, including performance-per-decibel which is important with portable form factors.
The end result matters. Find me a 2lb laptop that gets 12 hours of battery life with a Renoir chip in it that has Thunderbolt, Wifi6, etc. and we will talk.
@Hifihedgehog don't die for AMD... be open-minded to both brands. There is no use being a die-hard AMD fan when all the information points to TGL beating Renoir by a margin in performance, battery life, and features... Graphics & Gaming, Thunderbolt 4 (4x 4K monitor support), WiFi 6, AI algorithms for workflow & Photoshop image processing, and battery life...
Weird how you just show up to this one article feeling all certain about precisely how this unreleased chip will perform in unreleased notebook designs.
@Spunjji I think your statements squarely apply to Hifihedgehog. He is the die-hard AMD fan who has pooh-poohed Intel since the beginning, without even seeing an actual device. Poor argument from him.
@silver_eagle Wow. All HiFiHedgehog has done is point out inaccuracies in other people's comments, I've not seen him actually *advocate* for anything. Swing and a miss.
"Find me a laptop with a Renoir chip in it that has Intel branding on it. Bet you can't. Checkmate, fanboy 🤡" ^ This is almost literally what you're saying here. If you set arbitrary constraints that are only fulfilled in total by one vendor then yeah, congratulations, you "win".
For some, there is no benefit; I am one of those. I couldn't care less if a notebook has it or not, I will never use it. But the Intel fanbois say it's a must-have feature, and a deal breaker if it doesn't have it.
@Qasar, just because you don't care for Thunderbolt doesn't mean other users don't care. Plenty of users are using Thunderbolt, as it can enable external discrete GPU connectivity, supercharging your laptop into an even greater 2K/4K gaming machine, while the Xe graphics should work great for 1080p gaming.
i said " some " and i am one of those. i guess you glazed over that part. most of those where i work, also dont seem to care about TB support as well. while you may want it, keep in mind others may not
@silver_eagle - "supercharge your laptop into even greater 2k/4k gaming machine" - sure, if you feel like paying over the odds to get a laptop with 4 lanes on the Thunderbolt 3 ports, then $400 for an enclosure to put your $500 GPU into, all to get $300 GPU performance out of the whole setup.
I was a big convert to the idea of eGPU back in the day, but it's still too expensive and the performance impact is still too high. As a result, in performance-per-dollar terms it's still cheaper to build a separate mini-ITX gaming rig.
But why miss out on great future-proofing (like real future-proofing, not the AMD FX kind)? With Tiger Lake you get it for free. If you don't like faster data transfer and connecting multiple devices with just one cable... sure, go live the cables-hanging-out life.
RedOnlyFan, I just have no use for TB, that's all. Currently I have 4 USB cables attached to my comp: keyboard, mouse, a USB 3 hub, and a 2-drive USB 3 dock that I don't use very often. That's it; I doubt TB would help reduce those whopping 4 cables at all. Not everyone needs or has a use for TB, so it may not be a feature that some people look for, and they won't pass on everything just because it doesn't have it.
Having to straw-man me that hard just to get a reply in is what's pathetic here, vladx. Feel free to point to where I said - or even implied - that there's "no benefit to having Thunderbolt 3".
Why did they call other CPU manufacturers "imitators" in the announcement? Is this a kick at AMD or Apple? It felt so unprofessional, but still intentional, the way he said it.
AMD is definitely an imitator. Intel has spent hundreds of millions of dollars researching and developing the ultrabook ecosystem since the invasion of ARM tablets, with tons of R&D money poured in to enable the x86 ecosystem... Isn't AMD an imitator, just sitting and riding on the success of Intel's ultrabooks? AMD just needs to spend energy and money on the CPU, while Intel has to spend a whole lot of effort and resources on the CPU, the platform and ecosystem enablement.
In other words, Intel creates while AMD rides for free on Intel's effort. And the AMD fanboys don't seem to get it or appreciate this, and poo-poo Intel.
No, most of that has ultimately been on system OEMs.
That Intel decided to help them is wholly on them. Intel have benefited greatly from it, but it's not theirs to control. And Intel hardly pushed boundaries, as there was always going to be a desire for thinner and lighter computers.
@Tams80, you may need to read up on the news (lots of it, especially from Taiwanese and Chinese OEMs) about how Intel spent hundreds of millions on researching and enabling the laptop ecosystem. I read news that Intel would build thin-and-light form factors as reference designs for all the OEMs, and then the OEMs would add their own tweaks and secret sauce. I read in Chinese news that Intel even directly helped source and identify the quality parts that could help board-level design, and shared them with OEMs.
All these efforts from Intel to enable the ecosystem go unread/unseen/unheard by AMD fanboys... AMD can just borrow from Intel's pioneering efforts and success.
Intel doesn't deserve the poo-poo and harsh words from AMD fanboys.
silver_eagle, " AMD can just borrow on Intel pioneering efforts and success. " oh like how intel was practically forced to adopt AMD64 so they could also have 64 bit capable cpus ? or how intel came out with the in die memory contoller AFTER amd did ? to fair, amd AND intel have borrowed/coppied/imitated each other in various ways over the years.
Again, it was going to happen one way or another. Intel just sped it up.
And if they chose to help third parties, who could and eventually did take that help for use with a competitor(s), well, that's not AMD "cheating", "imitating" or whatever. Intel got their exclusivity. Now the OEMs can do as they wish, and they always have been big enough to do so; they weren't going to look a gift horse in the mouth.
And finally, your choice of language suggests that you shouldn't be participating in the comments until you grow up.
No, unlike AMD, Intel works with software vendors, Microsoft, display manufacturers, and OEMs, and innovates and develops the end devices. AMD, after decades of sleeping, wakes up and starts copying Intel. They couldn't even come up with a different naming scheme... they even copied the 3/5/7/9 series, eww. Intel as a platform has innovated more: Thunderbolt, WiFi 6 by default, NVMe SSDs, 2.5G Ethernet as a default, PCIe 4 (AMD uses active cooling on desktop motherboards for PCIe 4 chipsets). Intel has single-handedly led the industry as a whole.
Didn't they only invent the word "ultrabook"? Pretty sure Apple released the MacBook Air, which in essence was the first ultra-compact laptop design as we know it today.
Sort-of correct - the MacBook Air was itself a bit of a clone of prior Sony Vaio designs, though, right down to the chiclet keyboard. Turns out everything's based on everything else.
The MacBook Air range was also the first place where Apple's dissatisfaction with Intel started to show. They kept having their designs held back and/or hampered by Intel not meeting design targets, right up to the latest models that throttle massively under load.
So you're claiming that Intel pushed the industry forwards by... *checks notes* copy/pasting Apple's MacBook designs, then "encouraging" OEMs to single-source all the components in order to get marketing kick-backs that just happen to lock their competitors out. Sure.
Yet that never stopped them before. It's almost like they only have to make those sorts of decisions when: a) They can't yield high enough to satisfy demand for both markets, and b) Their new designs fail to outperform the old ones when TDP limits are relaxed.
Maybe there will be general availability of 12th-gen laptop parts a bit earlier than desktop parts, but 12th-gen desktop is going to be Alder Lake-S, and that'll be 10nm.
Why are they releasing lower-power chips before high-power ones? Intel already has the mobile segment on lock despite the efficiency gains of Ryzen. They need to go to 10nm for high-end desktop and servers NOW if they want to stop AMD from eating their lunch in the performance market.
Making 4-core dies is easier than making 8-core dies (and especially 28-core dies), and they didn't start rethinking things in a chiplet-style direction for server parts until after Ice Lake-SP taped out?
It's not that Intel doesn't know how to glue things... they have a better and bigger glue stick. There are trade-offs that come with chiplets; if the customer doesn't want it, they don't make it. In fact, Intel has a 56-core, 2-die glued Xeon.
One reason is that processors for mobile fetch more $$$, once you factor area/number of chips per wafer into account. The next big question is if Intel can move these advances into their Xeon line before EPYC takes their lunch money.
Nope. As stated: area/number of chips per wafer, and failure rate. AFAIK, Intel is still not using chiplet tech to produce their chips. So the most reasonable, high-success-rate option is mobile processors.
AMD had such success with Ryzen due to the chiplet design. E.g. the 3600 is 2x4 cores with one fused off on each core complex; that means you technically only needed to get 3 of 4 cores to come out working within a small die area.
Successfully getting, e.g., 6 adjacent cores to all be defect-free is a lot harder. Also take into account that each wafer has limited space, with the edges usually being wasted.
That's why you start with mobile and work your way up in CPUs.
Graphics cards are the opposite: you can fuse off lots of cores and it's still fine, as they don't need to be adjacent. As long as you meet general area requirements, you bin for speed and you get your chip.
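A toy yield model to illustrate the point made above about why a salvageable 3-of-4 core complex is so much easier to harvest than one die needing many adjacent good cores. The per-core defect probability is a made-up illustrative number, not real foundry data (Python):

```python
from math import comb

p_good = 0.90  # assumed probability that any one core is defect-free (illustrative only)

# Monolithic-style requirement: all 6 cores on the die must be good.
monolithic_6 = p_good ** 6

# Chiplet/CCX-style requirement (3600-like): each 4-core CCX only needs 3 of 4
# good cores, and the die has two such CCXs.
ccx_3_of_4 = sum(comb(4, k) * p_good**k * (1 - p_good)**(4 - k) for k in (3, 4))
salvageable_die = ccx_3_of_4 ** 2

print(f"P(6 adjacent good cores)             = {monolithic_6:.3f}")
print(f"P(a 4-core CCX yields >= 3 good)     = {ccx_3_of_4:.3f}")
print(f"P(a 2x4-core die is sellable as 6c)  = {salvageable_die:.3f}")
```

Even with this crude independent-defect assumption, the salvage-friendly layout is sellable far more often, which is the yield advantage being described above.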
It's just market requirements; there is more demand for thin and ultralight notebooks. The high-end desktop market, or the desktop market in general, is very small compared to notebooks. So follow the money. No, AMD and Intel compete in different segments even though they all look the same from the outside. There are some things the glued EPYCs are good at; there are some things only Intel can do. It's only the RGB gaming desktop YouTubers that cry pathetically over a few fps and act like the sky has fallen.
"Intel decided to stop doing things they had done with every generation prior to 14nm because actually that was always a bad idea and they should never have been doing it. It's definitely not because of problems with their processes."
Hopefully I don't get in trouble for posting a URL to another site, but I noticed that Intel 10700 manufactured on 14nm+++ runs cooler than everything including 65 watt 7nm AMD processors. https://www.techpowerup.com/review/intel-core-i7-1... Why do the AMD processors run so hot I wonder?
I also noticed that the 14nm Intel processor only runs 2 watts more than the 7nm AMD processor in the stress test. It seems that Intel has team red beat on efficiency. As an AMD SuperFan, I am saddened by this turn of events. If they beat us that badly with their 14nm process, how much of a beating are we going to be taking with this fancy new 10nm Superfin stuff? I am worried.
Temperature can be different because of the thermal interface, or because of thermal density (smaller chips are hotter at the surface); it doesn't matter - in the end, cooling is limited by watts, not temperature.
Did you actually look at the review you listed? AMD beats Intel in efficiency by a decent margin, and if we look at max boost frequency AMD is far ahead. Same for power consumption: that part can eat 250W at max boost.
Yes the article clearly says that Intel is much more efficient. The only way it isn't more efficient is if it is overclocked. The stock chip on 14nm always seems to beat the 8 core 7nm AMD in performance and efficiency. I can't believe a 14nm chip runs so much cooler than the 7nm chips and at higher performance and clocks. This is making me quite nervous and scared.
Turbo isn't overclocking, Mr Obvious Shill. The USEFUL page of that article ( https://www.techpowerup.com/review/intel-core-i7-1... ) clearly says that Intel are more efficient on single-thread workloads and less efficient on multi-core workloads when running at the same performance level. They only come close to (and still lose to) AMD for multi-threaded efficiency when the chip runs at base clocks, at which point the chip is also slower than the AMD competition.
And guess what: even when your primary task is single-threaded, you can bet your background-tasks aren't!
Funny that you seem to have ignored the turbo figures that sit at the precise opposite end of the chart, meaning that to actually match AMD on performance Intel's 14nm+++ CPU has to pull *114W more power*.
Thanks for sharing your "concerns", though, three kids in an Intel-branded trenchcoat! Uh, I mean, "AMDSuperFan" 😉😉😉
For someone who has repeatedly called me obsessed, a fanboy and a shill, you really did just go through this whole thread and reply to me with snarky bullshit. Here you're white knighting for the most obvious troll since Trump ran for president.
I'm happy to admit I post on here a bunch. I'm a tech-head and it bugs me when people post lies and nonsense. What's your excuse for defending a liar?
"However the underlying clock-for-clock performance improvements are minimal ..."
Surely their dual ring bus, cache updates, PCIE4, memory speed updates, Thunderbolt 4 contribute something more than "minimal" improvements to the performance.
The old mainframes had pretty low performance but absolutely huge I/O, enabling them to do data runs that were impossible for a long time with home computers.
Even though a few years later home computers were approaching mainframe speeds, home computers still didn’t have the I/O power to compete (arrays of memory, racks of disk space, ultra-high speed drum scanners and readers, high speed printers and cutters). This was all before cloud computing and thunderbolt came along.
In the article the i7-1160G7 has a base frequency of 1200 MHz. Here https://www.intel.co.uk/content/www/uk/en/products... the Configurable TDP-up Frequency of the i7-1160G7 is 2.10GHz at a Configurable TDP-up of 15W, and the Configurable TDP-down Frequency of the i7-1160G7 is 900MHz at a Configurable TDP-down of 7W.
So the 1200MHz from the article is probably for 12W, because:
Here https://ark.intel.com/content/www/us/en/ark/produc... the Configurable TDP-up Frequency of the i7-1165G7 is 2.80GHz at a Configurable TDP-up of 28W, and the Configurable TDP-down Frequency of the i7-1165G7 is 1.20GHz at a Configurable TDP-down of 12W.
So for these i7 4-core/8-thread parts we have these "base" (guaranteed) frequencies: 2.80GHz at 28W, 2.10GHz at 15W, 900MHz at 7W.
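For easier side-by-side reading, here are just the configurable-TDP figures quoted in this comment collected in one place (Python; these are the values from the Intel/ARK pages linked above, nothing new):

```python
# (part, configurable TDP in watts) -> quoted guaranteed frequency in GHz,
# per the ARK pages linked in the comment above.
quoted_base_clocks = {
    ("i7-1165G7", 28): 2.80,  # TDP-up
    ("i7-1165G7", 12): 1.20,  # TDP-down
    ("i7-1160G7", 15): 2.10,  # TDP-up
    ("i7-1160G7", 7):  0.90,  # TDP-down
}

for (part, tdp_w), ghz in sorted(quoted_base_clocks.items(), key=lambda kv: -kv[0][1]):
    print(f"{part}: {ghz:.2f} GHz guaranteed at {tdp_w} W")
```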
As far as I can tell here, Intel has combined a smaller process with a higher tdp, while maintaining a core count limited to 4.
8 cores is arguably excessive for most laptop usage. Unless you're using it for compiling, encoding or a similar task. So, this tradeoff may well be worth it. But, this announcement is PR heavy and obviously shows these as yet unavailable processors in the best possible light. Nobody should think AMD will be unable to compete.
This contrasts heavily with the recent nvidia ampere announcements, as the rtx 3000 series specs indisputably indicate a huge increase in performance from an already dominant company. Perhaps AMD can catch up there, but I doubt it.
Renoir will look a little better once independent Tiger Lake reviews come out. And then AMD will leap over Tiger Lake with Cezanne/Van Gogh. But Intel will probably leap over Cezanne a few months later, and so on.
Coming from Intel in 2020, yup, likely to be delayed. It doesn't matter until you see mass production with good yields going to laptop OEMs at a good price (i.e. something selling in good volume so they can lower the price). You can have the best paper launch, but if you can't make enough of it to lower the price, people will buy your competitor's much cheaper, slightly worse product any day.
Why am I duty-bound to be nice to trolls, and why do you care? The obvious answer would be because you're a troll too, but now I'm wondering if it's a little more personal than that.
And anyone that doesn't agree with your own delusional thinking is a troll - classic Intel fanboy logic. Like before, what's your point? The fact is, you hate anything AMD. Others have said this to you already; no need to keep showing it, most already know.
I'm not sure I buy that their CPU will go from 1.2GHz at 12W to 2.1GHz at 15W, then take another 13W to get to 2.8GHz. I get that there are shoulders to the voltage curves, but that still seems fishy.
That's what I was thinking - you can't just use Evo, but then did Samsung ever manage to trademark it themselves? I doubt Intel would slip up like that here.
This reminds me of the Centrino campaign Intel ran all those years ago. No one knew what that meant, and I doubt this will gain traction, either. I feel like their advertising is saying, "This year, our products are better than last year, trust us!"
Enough people knew that it meant "buy this for fast laptop".
Didn't matter whether or not it was true (although back then it mostly was).
Most of what it's actually about will happen behind the scenes - minimum purchase quantities, kickbacks for hitting sales metrics, design assistance in exchange for agreed numbers of design wins, that sort of thing.
That word brings back memories of Centrino advertisements on TV back in the day, with that soothing "Intel Inside" sound effect, which seemed to say to you, unconsciously, "Intel == speed" and "Intel is the way to go."
There's a typo on page 1: "At the top end is Intel’s Core i7-1165G7, a quad core processor with hyperthreading and the full 12 MB of L3 cache." This should be the 1185.
...it's going to be interesting to see how this actually works, because typically this is the regime where thermal limits get blown all to hell. There's a reason that AMD/Intel hold mobile clocks down.
From what I've read, the performance tests were all single-threaded tasks; the Ryzen 4700U would kill it in multithreaded tasks. Also, I saw a review on PCWorld of a Lenovo laptop with a Ryzen 4800U that beat the performance of a Dell laptop with an i7-10875H and a GeForce GTX 1650 Ti. If someone would put the Ryzen 4800H in a laptop, it would be even better!
P.S. I've used ATI / AMD video cards exclusively for over 20 years and never had an issue with drivers. Any problems I ever had were solved with patches or mods for the game itself.
Lol at the Intel fanbois creaming themselves over an unbenchmarked CPU. Look, it's simple: AMD makes the chips in the consoles, they have added RDNA2 to those chips, and they have a very good CPU in the mobile space right now that beats pretty much any current (i.e. released) Intel mobile part. I admit I hate Intel since I lost a CPU and was denied a warranty claim in the early Core 2 Duo days, but for all the fanbois, I hope they do have a decent offering, because it will force AMD back to the table and we the consumers will benefit. But consider for a second what will happen to this Intel CPU if AMD adds the latest Navi cores as fitted to the consoles, and then also consider whether you think AMD doesn't already have this option waiting in the wings. The trouble is, AMD is a whole generation ahead of Intel at the moment, and we saw how long it took AMD to make up that kind of ground.
Unfortunately for us AMD fans, Intel will always have more money to hire better designers and engineers. So Intel is way back on top. We can't cry over spilled milk. I guess it is good to have competition. We will be back!
Intel's business model is move to a rural town and build up a work force there, they won't pay you top dollar and they basically say take it or leave it. They have been hiring 2nd to 3rd rated engineers for a decade already.
I have also Sharron! I am running a Radeon 7900 series card and it is really great. It's plenty fast enough for all of my gaming needs. I am very very concerned about this new NVidia technology of the 3900 sending us AMD fans back to the stone ages. The 3900 seems much faster than my 7900 in all of the stats. We just don't have the ability to compete at this level. It is better for us fans to be able to buy the new AMD products in the bargain bin.
I doubt they're getting paid for a post that bad. The usual paid posts are along the lines of "muh drivers" and "oh look here's a link to an Intel PR video, fascinating stuff".
It's hard to get excited about a 4c/8t laptop unless gaming is not a primary concern. (Guess I'd need to see this CPU paired with the best mobile GPUs later to gauge it's minimum FPS potential, given that desktops often have issues with 4c/8t minimums, excluding the excellent R3-3300X, anyway)
For a start, it only has 4 lanes of PCIe 4.0 - so according to the rules hastily constructed by the community to explain away the dearth of Renoir gaming laptops, it's LiTeRaLlY iMpOsSiBlE for this to be paired with a high-end GPU.
Realistically, though, they're going to keep on throwing their 14nm+++ CPUs at gaming laptops until they can yield enough 10nm product to curl out a few 8-core TGLs into the market.
All that matters is real world performance. Tell me how fast a CPU renders a video in Premiere/DaVinci, or how fast it applies filters to images in Photoshop/GiMP, etc. And then the next question is, how long the battery lasts while doing the before mentioned tests. A comparison, which is not that easy as battery-sizes do vary alot unfortunately.
They demonstrated it right. Can render a 4k video on premier 2.5x faster than ryzen 8c, can do preset apply on same number of images 2x faster than any other laptop cpu. Any Athena device is 10+ hrs battery.
I thought this was supposed to be a "launch"? Where are the review units for the press - to be honest I expected to see charts full with performance numbers from anandtech - instead I have to believe what Intel presents here.
Regardless of the stupidity of this "launch" - Impressive GPU performance, kudos Intel!
It's a new high for iGPUs, which is to be commended, but they appear to have solved the problem by throwing die area at it - they need significantly more area (and transistors) just to beat AMD in synthetics. It's not a particularly promising sign for their future dGPU efforts.
What do you mean by "launch" this was supposed to be only a public announcement of the 11th gen laptop soc. The final laptops will be launched with pricing and availability by the oems. This is the standard industry procedure.
DP 2.0 has the same bandwidth as TB3, but instead of using 2 lanes up and 2 lanes down concurrently uses all 4 lanes to push the data in one direction.
Think of it as if both sides of a two lane motorway switched to having all traffic go in the same direction - you've got the same total throughput, but twice as much going one way at the cost of losing the return route.
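A toy lane-bandwidth calculation for that analogy, in Python. It assumes 20 Gb/s per high-speed lane, which matches DP 2.0's UHBR20 rate; the exact per-lane rate on the Thunderbolt side differs slightly, so treat the numbers as illustrative:

```python
# Compare a bidirectional link (2 lanes each way) with a unidirectional
# DP 2.0-style link (all 4 lanes toward the display).
PER_LANE_GBPS = 20  # assumed per-lane rate
LANES = 4

bidirectional = {"to_device": 2 * PER_LANE_GBPS, "from_device": 2 * PER_LANE_GBPS}
one_way       = {"to_device": LANES * PER_LANE_GBPS, "from_device": 0}

print("2 up / 2 down:", bidirectional)  # 40 Gb/s each direction
print("All 4 one way:", one_way)        # 80 Gb/s toward the display, nothing back
```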
Thunderbolt only supports 40Gb/s, for full DisplayPort 2.0 specs you need 80Gb/s. I'd rather they don't begin offering partial support and make it into a shitshow like with USB Type-C.
"It's a new high for iGPUs, which is to be commended, but they appear to have solved the problem by throwing die area at it - they need significantly more area (and transistors) just to beat AMD in synthetics. It's not a particularly promising sign for their future dGPU efforts."
We do not know how much die space the top-end Xe LP graphics uses on Tiger Lake, but I will not be surprised if it uses a significant amount. That is probably one of the reasons why this is stuck at 4 cores max. The other reason could be the power required to sustain the high clockspeed - there simply isn't enough left over for more than 4 cores alongside the top-end Xe LP graphics. These are just my opinions.
I've previously done a rough calculation based on an Intel-provided die shot, and to summarise, if their claimed GPU performance comes true then they're beating Vega 8 by around 25% but at a cost of a 33% larger die-area. We probably already know as much as we ever will about that - Intel aren't in the habit of discussing these things publicly anymore.
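Working that estimate through in Python; both figures are rough estimates from a die shot and vendor performance claims, not measurements:

```python
# Performance-per-area implied by the rough numbers above.
perf_vs_vega8 = 1.25   # ~25% faster than Vega 8 (claimed)
area_vs_vega8 = 1.33   # ~33% more die area (estimated)

perf_per_area = perf_vs_vega8 / area_vs_vega8
print(f"Relative performance per unit area vs Vega 8: {perf_per_area:.2f}")
# ~0.94, i.e. slightly *behind* Vega 8 once area is taken into account.
```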
Actually after reading through the article, this supposed "amazing" SuperFin and high clockspeed sounds like hot air to me. At 12W, the base speed did not improve over Ice Lake U, which is almost as low. The high base clockspeed that Intel is boasting in their marketing requires a base of 28W. To reach the boost speed, it goes up to 50W in theory. Looking forward to the independent reviews to see how this performs.
Intel has, at secret camps around the world, trained a crew of "muddy the waters" experts. The valedictory event is a lobotomy, after which they spawn to do battle on the forums. Lately they've been programmed to magnify irrelevancies, make exaggerated or baseless claims, hide assumptions, quote discredited benchmarks, and generally derail or confuse any news favorable to AMD or unfavorable to Intel.
Doubts are sown and manured - and logic is no impediment. My favorite is a recent assertion that Intel processors are much more secure because so many vulnerabilities have already been identified in them!
Chaitanya - Wednesday, September 2, 2020 - link
If only drivers can keep up.

vladx - Wednesday, September 2, 2020 - link
You're confusing Intel with AMD, Intel drivers are solid.

ksec - Wednesday, September 2, 2020 - link
He is likely referring to Gaming.

vladx - Wednesday, September 2, 2020 - link
Even in gaming, Intel drivers are still more stable. Everything else depends on architecture optimizations for each game and Intel just like nVidia is pretty good at securing partnerships.Mr Perfect - Wednesday, September 2, 2020 - link
I believe they're talking about frequency of driver updates, not quality of drivers.

vladx - Wednesday, September 2, 2020 - link
That's a big minus for AMD imo, as someone who owned a RX480, RX580 and a RX590 I dreaded every new update from AMD since more often than not it would break at least one game I played.Fritzkier - Wednesday, September 2, 2020 - link
Have you ever tried to game with an Intel iGPU? You talk like you've used Intel GPUs for gaming for years... And no, CPUs don't need drivers, so they're not included.
vladx - Wednesday, September 2, 2020 - link
CPUs don't need drivers but chipsets do, and in AMD's case chipset drivers are awful.

duploxxx - Thursday, September 3, 2020 - link
looking at your comments - amd gpu drievrs are bad - amd chipset drivers are bad.... ever think of its just the person behind the screen that has no clue.... go buy a gaming console or an OEM based predesigned system. its clear you are not up to the task to maintain a computer.ihatepaidtrolls - Thursday, September 3, 2020 - link
you have clearly made up your mind. you can shut up now. we all know you hate amd

oleyska - Thursday, September 3, 2020 - link
what's awful about them ?I have no issues, I will say on my laptop the radeon settings panel doesn't work, the nvidia control panel sucks horsecock, and on mobile even more so.. but atleast it starts but don't know what do to in it cause it's mostly empty..
Intel's control panel for gpu on laptops I like, it works, their backend driver sucks but it's ok and I see them making great strides.
I'm happy, and not the biggest loss of not having my control panel but would be nice.
Chipset drivers for me have mostly just worked, even if I don't install them my system have just worked too... never really understood them either way.. windows just takes care of you really..
Intel have some crusty gpu drivers, however I prefer them over nvidia.
so on windows laptop intel drivers for gpu is by far the best imho.
for linux AMD.
For windows desktop amd.
I've had games crash for months on nvidia so it's not like they're perfect, and a stupid geforce experience, ms store to download control panel and a control panel from 1999 doesn't cut it, fix that and they'll be windows desktop #1
alufan - Thursday, September 3, 2020 - link
Actually AMD chipset drivers are fine the issue is folks dont do a clean install and delete the old installation drivers from the folder, that and the fact MS constantly meddles with the drivers with Windows update.This is a decent release from Intel and no doubt it will briefly go to the top on some games titles, however AMD already makes a Ryzen with the rdna2 built in, cant see it taking a lot of effort to expand that to mainstream, I imagine that is the next iteration anyway
WaltC - Friday, September 4, 2020 - link
No trouble from chipset drivers--either. None.

Fritzkier - Friday, September 4, 2020 - link
What the hell is a chipset driver...?

5080 - Saturday, September 5, 2020 - link
Here is an example of AMDs chipset drivers: https://www.amd.com/en/support/chipsets/amd-socket...

way2funni - Sunday, September 6, 2020 - link
it's been 11-12 years since I fooled around with AMD (Bulldozer era) - I fell for the 6 core hype and found that the throughput on everything from their implementation of USB to SATA III resulted in benchmarks were anywhere from 25 -40% below intel. I've been sorely tempted at the RYZEN stuff , especially the brand new renoir stuff APU's - but your comment made me think back - are they really that bad at the chipset level?Spunjji - Monday, September 7, 2020 - link
@way2funni - No, they're really not that bad at the chipset level. The Bulldozer era was a horror show that has luckily long-since ended. vladx regularly makes claims against AMD and/or in favour of Intel/Nvidia that are unsupportable by facts - they like to mix it up with a few valid points every now and again, but you can assume that any decidedly negative statement is false.For context, I just (last Thursday) built a brand-new AMD (Ryzen 3 3100, RX 580 - my budget is LOW) system and have had no problems whatsoever with drivers. I moved over from an Intel/Nvidia system, for context. The CPU wasn't even supposed to be supported out-of-the-box by the crappy old A320 board I used, but it still managed to POST and get me to the BIOS screen where I could flash an update.
way2funni - Tuesday, September 8, 2020 - link
thank you for taking time out to reply. appreciated.

Spunjji - Wednesday, September 9, 2020 - link
@way2funni No problem! For what it's worth, from what I've heard, the mobile Ryzen stuff can be more of a fiddle in terms of chipset / GPU drivers. I have no experience there though, so I can't comment from a personal perspective.alphatech - Friday, September 25, 2020 - link
Onions and mustard please lol
AMD chipsets never had issues wth are you talking about.
Clearly trying to find any excuse to smash AMD to the ground.
But let me remind you (concidering the level of stupidness in your comments) you must make peanuts while AMD makes billions and someone at AMD has the IQ to fabricate those monster CPU's. Maybe you should get a job at AMD and be their engineer instead.
RedOnlyFan - Wednesday, September 2, 2020 - link
They are referring to drivers and BIOS updates for Intel. AMD's BIOS/drivers have been a bit janky.

StevoLincolnite - Wednesday, September 2, 2020 - link
I have used AMD, Intel and nVidia GPU's at various points for decades, they have all been fine, they have all had "rough spots".
I.E. With the older GMA x3100 drivers Intel's GPU would take a performance hit whenever there was a 2D overlay.
nVidia had driver cheats during the FX days.
AMD had frame pacing during the early GCN days.
But we also need to keep in mind that GPU drivers these days have more lines of code than most operating system kernels, which is absolutely nuts to think about, it's going to take time to get stuff right, mistakes will be made.
AMD, nVidia and Intel are all the same in this regard.
This brand loyalty/fanboyism is just hilarious and not beneficial to the consumer.
Currently running a Radeon RX 580 in a PC and it's rock solid, no issues, running a Geforce 1660 in another rig, again... Rock solid, no issues.
Running an Intel compute stick in yet another rig, rock solid, no issues.
vladx - Wednesday, September 2, 2020 - link
@StevoLincolnite: I have owned dozens of GPUs from both AMD/ATI and nVidia since the 90's and there hasn't been a GPU or CPU launch from AMD/ATI without GPU/chipset driver issues. Even the great Athlon generation (I had an Athlon XP 2500+ Barton at the time) had issues with memory stability.
Meanwhile Intel chipset/iGPU drivers since the 90's have been rock solid, while nVidia had very few gens with problems.
caqde - Wednesday, September 2, 2020 - link
The Athlon XP's memory stability would have been a chipset issue. AMD didn't make chipset's for their systems at that time. Well they has the AMD 750 (Athlon Thunderbird (B) gen chipset) and 760 chipset (Thunderbird (C) generation), but those were early in the sockets lifetime and were not common on the market. With that in consideration you were specifying the Barton 2500+ so your choices were an Sis (746FX or 748), VIA KT333/400, or the coveted but UNSTABLE NFORCE 2 . I owned one of those. And I know AMD did not make the chipset for that platform. Also remember the memory controller was part of the chipset. VIA would have been in control of any memory issues. Most blame would be on VIA, SiS, or Nvidia not AMD.As a note there are multiple reports of Intel issues. Not sure how you missed out on those. Like the early self destructing SATA ports on the first gen i7 motherboards. I also remember platform issues with one of Intel's ****well-E based platforms dealing with early firmware problems. I know AMD's Zen platform had issues, but Intel is certainly not without theirs.
Nvidia has also had plenty of problems, their chipsets (AMD 64 era) cause lots of problems when moving to the Vista OS being a major cause of crashes for the Vista platform. They have had multiple GPU's that had been recalled for self destructing in notebooks due to overheating issues.
caqde - Wednesday, September 2, 2020 - link
I should clarify the Nforce 2 statement. That chipset was known for its performance, but it was by far not the most stable platform of that era. The VIA platform was slower, but from what I could tell it was much more stable.

alufan - Thursday, September 3, 2020 - link
Had a DFI Lanparty Mobo with the Nforce it was dog rough for the first few months then they fixed it but BSODs were common even at idle

5080 - Saturday, September 5, 2020 - link
And the nforce chipset burned out on certain notebooks from HP.Luminar - Thursday, September 3, 2020 - link
Talk about living 10 to 20 years in the past.vladx - Thursday, September 3, 2020 - link
@Luminar: As previously stated, I also owned a RX480, RX580 and RX590 so I'm not living in the past at all. Also tested a 5700 XT which had lots of gfx artifacting in games. My point is, with almost no exception AMD and ATI have had driver issues since before the new millennium.

ihatepaidtrolls - Thursday, September 3, 2020 - link
@vladx clearly a pebcak issue

brunis.dk - Thursday, September 3, 2020 - link
@vladx, maybe you consistently underpower your systems with shitty PSU's .. then yes.. all your setups will be unstable. I concur with @StevoLincolnite

vladx - Thursday, September 3, 2020 - link
Shitty PSUs that only affect AMD every generation of their products? Right, nice try....

ihatepaidtrolls - Thursday, September 3, 2020 - link
just accept it dude, intel is going down no matter what they come up with. stop writing lies, i had durons and athlon xps and never experienced "memory instabilities" that you talk about

29a - Thursday, September 3, 2020 - link
I get the impression you don't know how to build and maintain a system very well.vladx - Thursday, September 3, 2020 - link
@29a: I've built my first system at 13 y.o., I have probably more experience than most reviewers.Fritzkier - Friday, September 4, 2020 - link
Anyway, you clearly never uses Intel iGPU drivers. It's bad. Even the client is using Windows 10 UWP for some reason.gescom - Thursday, September 3, 2020 - link
Intel drivers were unbelievably bad at anything opengl. Never understood why they couldn't fix these things for years. Completely useless.dotjaz - Sunday, September 6, 2020 - link
@vladx you hate AMD we got it, now shut your pie hole.Samus - Thursday, September 3, 2020 - link
AMD drivers have always steered me away from their cards. Everyone I know, every generation of card, has some sort of problem.
It's ridiculous because they make all the game consoles so the games are designed for their GPU's!
Spunjji - Friday, September 4, 2020 - link
@Samus - It's no more true for AMD than it is for Nvidia, and I can say this having *personally* owned cards from both since the TNT2. The Rage Maxx and original Radeon had legitimately wonky drivers and ever since then fanboys have copy/pasted the same "AMD drivers suck" nonsense, pointing to whatever latest wrinkle pops up as evidence, whilst keenly ignoring tidbits like Nvidia literally destroying some of their customer's cards with one driver release (it wasn't actually a big deal, didn't affect many cards, but god damn if that had been AMD...)It's all so, SO tedious. The truth is that Nvidia hold most of the market and are considered default, so every time someone "switches sides" they balk at every little bump in the road. Meanwhile Nvidia's drivers seem utterly incapable of figuring out how multiple displays work until you explicitly tell them, seem to need manual prodding to use the correct GPU in the system for video acceleration, and incorporate tons of other little annoyances that most Nvidia users don't really notice because they've never tried anything else.
just4U - Sunday, September 6, 2020 - link
I remember people raging on ATI drivers when the Radeon was first launched in 2000.. I didn't buy one till the 8000 series and have owned maybe 20 and installed a few 100.. (bout the same for NV cards..) and honestly? I've never really experienced these alleged "driver issues" There have been a few cards that turned out to be duds (from both companies..) but I blame that on the manufacturer.. I think the only card I ever really had issues with Drivers wise.. was 3dfx, and I friggin loved that card because when it worked it was awesome as hell .. just I struggled alot to get it working on all my games but it wasn't made by Nvidia or Amd/Ati. So there is that.Zagor Te Nay - Monday, September 7, 2020 - link
I had R9 280 and it served me well, before I replaced it with Nvidia 1070. Before 280, I was running Radeon 6870 and that GPU served me well for a few years.1070 serves me well too, save for recent drivers freezing Planetside 2... so I have rolled back and running May drivers.
JTWrenn - Thursday, September 3, 2020 - link
I found their drivers were ok, but the software that came with it was often system wrecking. If you do some tricks to only install drivers, it is relatively stable. That being said....it is ridiculous to have to do that.WaltC - Friday, September 4, 2020 - link
Owned all of those products and never had a driver update "break" a game--ever. What a load of garbage...;)vladx - Friday, September 4, 2020 - link
Then you don't play games like World of Tanks or World of Warships, at least when I played on AMD cards 2 years ago it followed a repeated pattern of breaking one or both of those games and then fixing bugs introduced in previous update.Spunjji - Monday, September 7, 2020 - link
@vlax - There it is! "more often than not it would break at least one game I played" resolves into "I (allegedly) had problems with these 2 games by this one shitty developer".Having played both of those games at various times and with various GPUs, I can't say I've been able to mirror your experience. But who knows, maybe I'm just installing it wrong. 😂
Irata - Thursday, September 3, 2020 - link
Which demanding gaming scenarios would that be ? Don‘t really see many playing high fps AAA games while streaming or recording simultaneously with a multi monitor setup on Intel iGPU.yankeeDDL - Thursday, September 3, 2020 - link
I have a laptop with Core i7 5500U. Intel's drivers are rubbish.Connecting the laptop to one (or 2) external monitors is a painful experience. I also have a laptop with Ryzen 4800HS *and* 2060-MaxQ. Flawless, despite having both AMD and nVidia drivers.
Spunjji - Friday, September 4, 2020 - link
---citation needed---

Flunk - Friday, September 4, 2020 - link
Stable? Maybe... although I've had problems before. Their 3D drivers are junk; developers have to write around their numerous bugs.

Spunjji - Monday, September 7, 2020 - link
You're talking complete bollocks. 🤷♂️ This is to be expected, but worth pointing out all the same.Spunjji - Friday, September 4, 2020 - link
Only a fool or a liar would claim that Intel have better GPU drivers than AMD and then die on a hill defending it.Based on prior posts, I'm going with "both".
vladx - Friday, September 4, 2020 - link
@Spunjii: Everyone here knows you're a rabid AMD fanboi who can't accept facts, I didn't talk about performance but stability and issues occuring in every generation of their products which only happens to AMD. In those matters, Intel drivers are indeed a hundred times better than AMD's especially if you include Linux drivers as well not just Windows.Qasar - Friday, September 4, 2020 - link
@vladx and everyone here knows you are a rabid nvidia fanboi, rabid anti-amd user, what's your point??

dotjaz - Sunday, September 6, 2020 - link
pot kettle, pick one

Spunjji - Monday, September 7, 2020 - link
Ha! So rabid that last week was the first time in *fourteen years* that I've built a system based on an AMD CPU for my own personal use. So utterly frothing, indeed, that since Maxwell launched, 4 out of 6 of my gaming systems have used Nvidia GPUs, up to and including a pair of GTX 980s.You can project at me all you want - it's clear from other replies that most people see through you.
realbabilu - Monday, September 7, 2020 - link
No amd was solid, their old RX550 driver is still available for windows 10.justareader - Friday, September 18, 2020 - link
I had an Intel Skylake i7 6700 that could not come out of sleep due to Intel drivers. Had to shut down rather than sleep for what seemed to be a year. I would say intel drivers are just as wonky.realbabilu - Monday, September 7, 2020 - link
Yes, like the crappy 3D drivers for Sandy Bridge GMA HD 2000 that are not available directly from Intel for Windows 10, and also the crappy low-end Nvidia GT710 that likewise has no Windows 10 drivers available.

AMDSuperFan - Tuesday, September 8, 2020 - link
This new Intel is too much for us and now Nvidia has a 3 year lead on graphics. I feel the days of Sledgehammer are back now. Kind of a dark cloud for us fans.Spunjji - Wednesday, September 9, 2020 - link
🙄

JfromImaginstuff - Wednesday, September 2, 2020 - link
Wow this article came out quick (20 minutes after) but anyway, nice to see Intel FINALLY at least try to step their game against AMD.drothgery - Wednesday, September 2, 2020 - link
Of course, it took AMD well over 10 years to step up their game against Intel ...

Dug - Wednesday, September 2, 2020 - link
When are you going to step up your game?

Spunjji - Friday, September 4, 2020 - link
Why bother when you can get paid "over $190 per hour" for low-effort trolling :|

Showtime - Wednesday, September 2, 2020 - link
Intel actually has already stepped up with help from AMD. Going by dollars to performance, the 10700 ($300 at MC) compares well to the 3700x ($270 at MC), and the 10100 ($100 at MC) is a better buy than the 3100 ($100)/3300x($150) Ryzens after factoring in cost of the motherboard, and RAM. 10700 actually went up $10 at MC this past week. I'm assuming it was because of it being a good option to the 3700x for people who don't care to OC.The new A series mobo's brings the AMD's back, but then the 3100/3300x lose ocing/performance, and the 3300x still cost noticeably more overall.
goatfajitas - Wednesday, September 2, 2020 - link
"Going by dollars to performance"... True, but going by heat and power, no not at all.Showtime - Wednesday, September 2, 2020 - link
I'm not seeing that though. 10700, and 3700x are both rated at 65 watt, and despite being 7nm the thermals on the 3700x isn't much better. I see plenty of posts about AMD's high temps, and after checking, I found that chips like the 3700x pull 200 watt for short bursts. In fact, going the opposite way, like many of us SFF/itx builders do, Intels undervolt very well. AMD seems to have a short leash there. None of these points (Thermals, performance, brand) are reason enough to go one way or another, so for me, cost factors as high as anything else.The new A series helped AMD's value proposition a lot. X chips don't need to be OC'd since they are mostly maxed, and B mobo's are priced high like their X boards. With their new CPU's on the horizon, AMD could be the clear cut choice, but again, it will depend on cost.
tamsysmm - Wednesday, September 2, 2020 - link
I'd really love to see that reference for 3700X pulling 200W (w/o OC of course).calc76 - Thursday, September 3, 2020 - link
Yea, even a 3900x (12/24) with PBO manual running P95 small all core load only uses ~ 160 watts.Eulytaur - Thursday, September 3, 2020 - link
Huh? Where did you get the 200 watt 3700X from?Spunjji - Friday, September 4, 2020 - link
Bullshit 😂 You're claiming that the 3700X does what the 10700 does:https://www.techpowerup.com/review/intel-core-i7-1...
"Intels undervolt very well" - sure, if you're happy to give up most of that Turbo performance - at which point you might as well disable PBO on the AMD chips and see how they stay neatly within their TDP for a fairly minor sacrifice in peak performance.
Kinda seems like you're either severely misinformed or extremely dishonest.
vladx - Friday, September 4, 2020 - link
You don't need to touch Turbo rates to do that; undervolting just involves putting a negative voltage offset, as opposed to overclocking, which usually requires a positive offset. Just goes to show how little you know about CPUs and especially those from Intel.

Alexvrb - Sunday, September 6, 2020 - link
You don't need to touch them... they'll touch themselves.

Spunjji - Monday, September 7, 2020 - link
Bloody hell vladx, I know how undervolting works 😂 I've been doing it with a 6700HQ for 3 years, and that does indeed undervolt very well, mostly because the stock voltages are crazy-high and it only turbos up to 3.5Ghz.
The fact remains that you're limited in how much you can undervolt the highly-tuned 10 series without giving up stability at those 5Ghz+ peak turbo clocks.
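For what it's worth, the power saving from a fixed-clock undervolt can be sketched from the usual dynamic-power relation P ≈ C·V²·f. A minimal Python illustration with purely placeholder voltage and capacitance values; it ignores leakage and says nothing about the stability margin being argued over here:

```python
# Dynamic CPU power scales roughly with C * V^2 * f, so a voltage offset at a
# fixed clock reduces power quadratically. Values below are placeholders.
def dynamic_power(c_eff, volts, freq_hz):
    return c_eff * volts ** 2 * freq_hz

C_EFF = 1.0e-9                      # effective switched capacitance (arbitrary)
FREQ = 4.0e9                        # clock held constant at 4.0 GHz
V_STOCK, V_OFFSET = 1.10, -0.10     # illustrative stock voltage and offset

p_stock = dynamic_power(C_EFF, V_STOCK, FREQ)
p_undervolted = dynamic_power(C_EFF, V_STOCK + V_OFFSET, FREQ)
print(f"Dynamic power reduction: {(1 - p_undervolted / p_stock) * 100:.1f}%")  # ~17%
```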
vladx - Saturday, September 12, 2020 - link
"The fact remains that you're limited in how much you can undervolt the highly-tuned 10 series without giving up stability at those 5Ghz+ peak turbo clocks."@Spunjji: I own a notebook with 10th gen Intel CPU and guess what, the undervolting values for my i7-10875H are -0.12V cache and -0.165V core which is just as good as previous generations.
Spunjji - Monday, September 7, 2020 - link
I note that you had precisely nothing to say about that fanboy's other bullshit claims that I called out...Fritzkier - Wednesday, September 2, 2020 - link
Uhh... you can't even OC/XMP the RAM on Intel Motherboard outside of Z-series. It'll cripple Intel performance on it. I don't think AMD still costs noticeable more.Spunjji - Friday, September 4, 2020 - link
Certainly not in the UK. I literally just bought a 3100 system so I have the numbers.For AMD: 3100 is £95, 3300X is £120. An A320 motherboard that takes them is £45.
For Intel: 10100 is £119. Cheapest H410 board is £60 but lacks an M.2 slot.
You "lose" the OC on the AMD systems by going with such a cheap motherboard, but you never had it in the first place in the Intel system, so that's a bit of a red herring.
The 3300X easily wins on performance, while the 3100 is great value for people like me who literally have no spare money for a system and suddenly find themselves needing to build one to replace something that died.
Hifihedgehog - Wednesday, September 2, 2020 - link
"At 50W"lol. Told you there was a catch to all this.
Tamz_msc - Wednesday, September 2, 2020 - link
Renoir also boosts well past 15W.eek2121 - Wednesday, September 2, 2020 - link
Power consumption != TDP. You seem confused. These chips beat AMD chips that consume far more power in many benchmarks.AMD does not compete in this segment. There are zero laptops with AMD chips in them in this weight class with Wifi6, TB4, etc and TGL has better single core and GPU performance than ANY AMD chip. This chip is one of the best chips out there and I say this as an AMD system owner, stock holder, and as a tech enthusiast and engineer.
trenzterra - Wednesday, September 2, 2020 - link
Yep I bought a Renoir laptop recently and now thinking if I should have waited for TGL instead, if anything, for the futureproofing in terms of TB4/USB4 and the huge step up in graphics performance. Perhaps as a consolation I guess the 4750U is still slightly faster in terms of multithreaded performance...silver_eagle - Thursday, September 3, 2020 - link
Need to wait till the review units appear. AMD always tout about Cinebench, when only <1% of consumer use this. While Intel use Sysmark (closer to real life application), but how close is this to real life usage, we need Anandtech help to validateSpunjji - Friday, September 4, 2020 - link
Sysmark is not really any "closer to real life" than Cinebench, tbh.vladx - Friday, September 4, 2020 - link
Much closer than Cinebench, that's for sure.Spunjji - Monday, September 7, 2020 - link
No, vladx - an artificial benchmark suite composed of cherry-picked *segments* of real-world applications overseen by Intel is not "closer to real-life" than a single benchmark based on unaltered software that a few people actually use.https://en.wikipedia.org/wiki/BAPCo_consortium
If they had convincing real-world numbers then they'd quote the real-world numbers. Evidently they do not, so they've resorted to playing a rigged game.
Tams80 - Friday, September 4, 2020 - link
"It's not a benchmark, we swear!".It is a benchmark too.
Hifihedgehog - Wednesday, September 2, 2020 - link
> Power consumption != TDP. You seem confused.Sadly, I am not. Of course, power consumption and TDP are totally different. However, do you have bonafide evidence showing the Renoir chips consume far more power in ___ benchmark(s)? You seem to be the confused one here. I have gone through the full press release. I suggest you do likewise. One part of the document that is a dead giveaway is on page 82. Pl1 shows a tremendous performance increase in pushing the PL1 ceiling from 15W to 25W, on the order of 33-40%. That brings me again to my point: they are increasing the default power draw to give the illusion of a huge generational leap. Let's not kid ourselves here. Yes, Intel fixed some problems, but they are still in a world of hurt and are massively behind the competition.
eek2121 - Wednesday, September 2, 2020 - link
Power consumption does not matter. If a CPU spends 2 seconds consuming 50W to complete a task vs. 5 seconds for the same task on another CPU at 25W, which consumed more power?

Hifihedgehog - Wednesday, September 2, 2020 - link
Of course, energy equals power times time. As to your hypothetical scenario: neither, because the Renoir part consumes less energy over time (power times time) than either Intel part you first mention in that scenario. ;)

silver_eagle - Thursday, September 3, 2020 - link
@Hifihedgehog seems to have lost the point here. eek2121 has given a clear, pin-point answer.
Spunjji - Friday, September 4, 2020 - link
No, eek2121 is an obvious shill and you're an obvious confederate. His "clear pin point answer", as you put it, was a fabrication that tried to re-frame the issue.
The *real* comparison would be: if one CPU spends 2 seconds at 50W and the other spends 3.5 seconds at 25W, which consumed more power? But he had to fudge the numbers to make a bogus point.
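Working the arithmetic for both versions of the hypothetical in Python (energy = power x time; all the figures come from the comments above, not from any measurement):

```python
# Energy used in each version of the race-to-idle hypothetical.
scenarios = {
    "original (2 s @ 50 W vs 5 s @ 25 W)":   [(50, 2.0), (25, 5.0)],
    "reframed (2 s @ 50 W vs 3.5 s @ 25 W)": [(50, 2.0), (25, 3.5)],
}

for name, runs in scenarios.items():
    joules = [power * seconds for power, seconds in runs]
    print(f"{name}: {joules} J")

# original: [100.0, 125.0] -> the 50 W chip finishes having used *less* energy
# reframed: [100.0, 87.5]  -> the 25 W chip uses less energy
```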
Alexvrb - Sunday, September 6, 2020 - link
Exactly. He was using hypothetical BS numbers. Let's see the AT efficiency testing!

goatfajitas - Wednesday, September 2, 2020 - link
"Power consumption does not matter. If a CPU spends 2 seconds consuming 50W to complete a task vs. 5 seconds for the same task on another CPU at 25W, which consumed more power?"That would only be true if the 50w CPU was more than 2.5x faster and it isnt.
Also, don't forget there is no product here and no benchmarks. No release date or price yet either. Also up in the air is whether or not Intel can keep up with demand once released. They have "fibbed and faltered" on all fronts lately.
tamalero - Wednesday, September 2, 2020 - link
"Power consumption does not matter." What kind of idiocy is this logic?If you're in the mobile market, sure as hell power consumption matters. You have a limited battery that feeds that processor, as well the temperature constraints.
Spunjji - Monday, September 7, 2020 - link
Shill idiocy. They don't have to be right, they just have to repeatedly post false information. Few people bother with the comments, and almost nobody really reads the bickering after the first few posts.Gigaplex - Wednesday, September 2, 2020 - link
That depends on how much power they draw once they get back to idle.Hifihedgehog - Wednesday, September 2, 2020 - link
Also, the new default TDP is 28 watts:https://twitter.com/_rogame/status/126981912051152...
ikjadoon - Wednesday, September 2, 2020 - link
Why post a Twitter post from June...when Anandtech confirmed it's 15 W in today's official article? Sometimes, the rumor mill has genuinely taken over the PC hardware commenting meta.>These processors have a nominal TDP of 15 W.
Hifihedgehog - Wednesday, September 2, 2020 - link
I trust the engineering documentation more that has been provided to OEMs/SIs for the last year. That is what all these upcoming Tiger Lake-U systems have been built around. Unless Ian has received confirmation saying otherwise? This wouldn't be the first time a typo or a mistaken assumption (not speaking ill of Ian) has slipped in.Gondalf - Wednesday, September 2, 2020 - link
More like AMD?? nominal 15W but defaul 25W + chipset power in shipped laptops??They are on parity :).
Irata - Thursday, September 3, 2020 - link
AMD does not use chipsets in mobile - it‘s one APU that includes everything.Otoh, look at the Tiger Lake module and how many chips do you see ?
Hifihedgehog - Wednesday, September 2, 2020 - link
Ah, interesting. Intel is omitting the default TDP here. Likely the leaks stated numerous times before, the base clocks they have been quoting has been based on their new 28-watt default TDP.https://ark.intel.com/content/www/us/en/ark/compar...
Hifihedgehog - Wednesday, September 2, 2020 - link
*Like the leaks stated numerous times before, these massively increased base clocks they have been quoting have been based on this 28-watt TDP.silver_eagle - Thursday, September 3, 2020 - link
why do you need to defend AMD so vigorously? Be open, and judge both side of the offering...there should be plenty of reviews coming up, make your judgement then, not nowvladx - Friday, September 4, 2020 - link
@silver_eagle: That's what rabid fanbois like him and Spunjji do, they reason with their emotions instead of using any critical thinking.Gondalf - Thursday, September 3, 2020 - link
Even AMD's clock speeds are at 25W + chipset power. Forget the advertised clock numbers at 15W.
All advertised clock speeds are UP TO, AKA at 25W (plus something more).
Spunjji - Friday, September 4, 2020 - link
@Gondalf, silver_eagle and any other shill accounts that rely on lies and projection:AMD quote base clocks at 15W. Their turbo clocks are, indeed, at some level above 25W - as their specs indicate.
Up until now Intel have been quoting all of their base clocks at 28W. Their other base clock numbers are for 12W, so they have "mysteriously" chosen TDP levels that make a direct comparison impossible, and the clocks at 12W are frankly pathetic. Their turbo clocks are at *50W* which, I'd hope we can all agree, is a silly-high number for a thin-and-light device and is probably only ever going to be seen for a few seconds at a time.
Independent testing has shown that actual shipping laptops with Renoir have remarkably stable performance over time for a modern processor with boost capabilities, despite some of the designs having absolute dog-shit thermal solutions. This is not likely to be the case at all for TGL. No amount of interpretation or rationalization will alter these basic facts.
RedOnlyFan - Sunday, September 6, 2020 - link
@spunjji.. Go read the Ryzen 4700U detailed reviews: on full CPU load it has a power draw of 57W. Yes, the apple of your eye, the 15W 8c 4700U, takes 57W at an all-core frequency of 2.5GHz. Go read the real-world independent review of the IdeaPad 7 from notebookcheck. You delusional AMD fanboys go to all possible levels to spread fake news.

Spunjji - Monday, September 7, 2020 - link
@RedOnlyFan - lol. You're quoting power consumption for *the entire system* under an artificial parasitic GPU *and* CPU load as if it's CPU-only. If you weren't such a mendacious twerp you'd have included the link:https://www.notebookcheck.net/The-Ryzen-7-4800U-is...
The same review shows the i7-8565U system drawing 53.6W under the same loads, and 49W for the i7-1065G7 notebook. I wonder why you avoided mentioning that.
I wish you shills would quit projecting at me, it's bloody annoying.
Irata - Thursday, September 3, 2020 - link
Yup, Hardware Unboxed showed this via Intel‘s spec sheets. The higher clocks (base and boost) are only at the higher TDP, so more difficult to compare to both Ice Lake stated values, same for Renoir.Gondalf - Thursday, September 3, 2020 - link
It is the Intel U 15W line my friend.Even AMD say up to 25W TDP and frequencies ""UP TO"" X or Y. So if you have a 15W laptop your Renoir run much slower of the advetised clock speeds. If OEM set Renoir at 25W hopefully it will run at 1.8Ghz base and 4.2 Ghz single core turbo (plus chipset power).
So basically both cpus at 25/28W yield---> AMD: 8 cores 1.8Ghz base and 4.2Ghz turbo. Intel: 4 cores 3 Ghz base and 4.8Ghz turbo.
So the IPC matters here, and both processes looks very good. A sweet parity, with a clear advantage of Intel in peak clock speed.
Superfin is faster than TSMC offering.
Spunjji - Friday, September 4, 2020 - link
Anandtech did in fact confirm that it's 28W with a 15W TDP-down:https://twitter.com/IanCutress/status/130147060313...
"Nominal TDP" is meaningless when all their performance figures are quoted based on the higher number. They pulled the same nonsense with ICL.
Tamz_msc - Wednesday, September 2, 2020 - link
>However, do you have bonafide evidence showing the Renoir chips consume far more power in ___ benchmark(s)?
Yes they do. Read Anandtech's review of the Acer Swift with the 4700U. It consumes up to 30W despite having a 15W TDP.
Hifihedgehog - Wednesday, September 2, 2020 - link
"up to 30W"Of course, it does. That's still nearly half that of a 50W boost. I imagine many current ultrabook power bricks would struggle under such a instantaneous power draw.
Tamz_msc - Wednesday, September 2, 2020 - link
Most ultrabooks come with 65W power adapters. You're grasping at straws.

Hifihedgehog - Wednesday, September 2, 2020 - link
Hardly.
Actually, many ultrabooks use stock 45-watt adapters. Here is a first-party 45-watt adapter for Dell XPS ultrabooks:
https://www.amazon.com/Charger-Latitude-5290-2in1-...
Now, back to Intel. Note they omit the default TDP in the just published ARK listing. Going from a 1.3 GHz base to a 3.0 GHz base doesn't seem very likely. Plus, the last few months of leaks have asserted that the official base clock is based on a 28-watt TDP:
https://ark.intel.com/content/www/us/en/ark/compar...
Hifihedgehog - Wednesday, September 2, 2020 - link
EDIT: Vindicated! It says it right there in the ARK listing. The TDP-Up (28-watt) base frequency is 3.0 GHz. No magic bullet solution here, I am afraid.

Tamz_msc - Wednesday, September 2, 2020 - link
HP uses 65W adapters for its Spectre series. If 50W is what PL2 is going to be, I'm sure that manufacturers will include a 65W adapter for Tiger Lake. No big deal.

Spunjji - Friday, September 4, 2020 - link
@tamz_msc "most ultrabooks come with 65W power adaptors" - firstly, no, not all. Secondly, there are other components in a system besides the CPU - add in 10W for the rest of the device's power consumption and account for conversion losses in the VRMs and you're easily approaching peak load for a 65W adaptor. Not a great situation, really.Luckily, there is literally no "ultrabook" design out there can dissipate a 50W thermal load for more than a few seconds. 30W? Sure, maybe for a few minutes. 50W? Nope. So in reality you're just going to get mediocre performance and a hot device. Yay! :|
smilingcrow - Wednesday, September 2, 2020 - link
30W is 60% of 50W which is still dramatically less so no need to exaggerate as it undermines a sense of unbiased reporting.

Spunjji - Friday, September 4, 2020 - link
To quote: "As with most modern CPUs, the Ryzen 7 quickly ramps up well past its target thermal design power, hitting around 30 Watts draw at the start, but as the test goes on, that value falls back to around 18 Watts."So that's compared with a CPU that runs 50W at turbo and 28W at its quoted base clocks in order to provide comparable performance to Renoir in real-world applications. WOW.
Hifihedgehog - Friday, September 4, 2020 - link
Bingo.Hifihedgehog - Friday, September 4, 2020 - link
And all while stuck on the "quad core" standard.Kangal - Saturday, September 5, 2020 - link
Holy crap, this was the comparison I was looking for.So it seems Intel's 2021 upcoming 11th-gen is barely competing with it's 10th-gen/9th-gen/8th-gen offerings of the past. And they have nothing to compete with AMD's 2020 current Renoir. Also I think we will get only a mild upgrade from AMD's mobile for next year (still stuck on Vega, and 7nm).
So by the time Intel finally has a solution for Renoir, it's going to be late 2022. And at that time AMD will probably make a decent upgrade for their next next-gen mobile chipsets: +5nm EUV lithography, RDNA-2 iGPU, Zen3+ CPU architecture. The only question that remains is, will laptops using AMD chipsets finally get TB3/TB4/USB4 support?
It sounds like I'm beating a dead horse here, but there's still a Good Case to be made for High-Bandwidth Connection:
Just think of using your laptop as a regular Ultrabook while out and about, and when coming home you can "dock it" next to the TV (or on a Desk). Now suddenly you have much more ports, external storage, beefier active cooling for the laptops CPU, and a really fast external GPU connected. It would be like the "Nintendo Switch" but for laptops/steam.
Spunjji - Monday, September 7, 2020 - link
@Kangal I'd say you're taking it a bit far with all that. Intel and Tiger Lake are definitely *competitive* with AMD (in much the same way that Raven Ridge was competitive with Kaby Lake R), it's just interesting to see the extent to which they're bending or breaking their own prior design constraints in order to reach that level of competitiveness.It'll be interesting to see what Cezanne brings. If they add Zen 3, DDR5 and maybe bump the Vega CUs a little then it should easily reclaim the top spot for overall power/performance from Tiger Lake. Whether or not it will end up in any decent notebook designs is another matter entirely.
Kangal - Monday, September 7, 2020 - link
I beg to differ.If Intel is ready to sacrifice thermal headroom just to squeeze out more performance, well, that's not much of an improvement. It will always come at the cost of heat generated and power used up, two enemies of "mobile" devices. Maybe in a thicker laptop with large fans and a vapour-chamber, a fast charger in the box, and a huge battery... maybe in such a scenario it's not too shabby. But then using that as a comparison to a thinner laptop with cheaper/conventional cooling, regular charger, and smaller battery. Well, Intel WOULD make that comparison. To me, it is erroneous, sort of like comparing a large sedan to another small sedan, to saying thanks to "innovation" it can travel longer, with no-one commenting on the huge fuel tank size discrepancy.
To me this feels like an evolutionary iteration. Just have to cut through all the jargon to see it, it's almost obligatory that Intel, Nvidia, AMD, Apple and the like to throw these out in the presentations to impress their shareholders. The iteration is sort of like Intel going from their Core i7-6900k chipset to their Core i9-9900k. Both 8-core/16-thread, and both using the same Skylake architecture. Just one is more polished than the other. The Core i7-1185G7 (laptop flagship) feels like an incremental improvement over the i7-1065G7, i7-10610U, i7-8665U, or even the i7-8550U from +3 years ago. And looking at the Core i7-1160G7 (ultrabook flagship) it too isn't too far off from their i7-10510Y, but a notable step up from the likes of the i7-8500Y.
So my prediction from 6 months ago was kinda accurate. AMD will eventually eat into Intel's laptop (24W) dominance, but Intel is still the king when it comes to the Ultrabook (16W) market. And the tablet (8W) market is practically dead, with Android dominating the cheap-end, and iOS dominating the luxury-end. AMD still has another 1-2 iterations until they can get the recipe quite right/optimised for Ultrabooks, but I won't hold my breath, since the company is stretched very thin R&D wise.
Spunjji - Tuesday, September 8, 2020 - link
CPU-wise, it definitely is just an iteration. The innovation falls entirely on the GPU side and, supposedly, improvements to the manufacturing process. They clearly didn't get as much of a benefit as they wanted from that, though, hence the silly power levels at 4.8Ghz - but we don't yet know how that looks in terms of "average" usage (as these are clearly never going to spend long at that speed in any realistic scenarios).I think Van Gogh might be the one to watch in the 7-15W TDP arena. 4 Zen 2 cores (that can actually run close to their boost clocks) and RDNA 2 graphics might well be a healthy competitor to Intel's efforts.
Marlin1975 - Wednesday, September 2, 2020 - link
"These chips beat AMD chips that consume far more power in many benchmarks."And what are those? I have not seen any real benchmarks with true power usage for these chips. Let alone compared to AMD chips of equal footing.
eek2121 - Wednesday, September 2, 2020 - link
They beat all AMD chips at single core performance.eek2121 - Wednesday, September 2, 2020 - link
by all i mean desktop, HEDT, server, everything.tamalero - Wednesday, September 2, 2020 - link
How many things still rely on single core performance?Very few products are single core nowadays.
blppt - Wednesday, September 2, 2020 - link
Basically all games, which is why Intel is always at the top of game cpu benchmarks.It doesn't matter much when you game at 1440p or 4k, though, the GPU tends to level everything out.
Spunjji - Friday, September 4, 2020 - link
@blppt: Single-core CPU performance also becomes largely irrelevant when you game with an iGPU and are very much limited by that, which will be the case for the vast majority of systems these CPUs go into.@RedOnlyFan: Intel have designed a chip whose primary performance advantages only really show up in scenarios where you won't notice them. Bold strategy!
Alexvrb - Sunday, September 6, 2020 - link
Even games these days consistently tax a bunch of cores. Go ahead, run some modern demanding titles on a single core. Sure, one or two threads get hit the hardest, but when you're loading up a bunch of cores you are no longer hitting the quoted single core peak turbo figures. The less cores and sustained watts you have to play with, the lower your sustained turbo.Intel is slightly faster at games for a couple of reasons. One, their architecture is faster at those sorts of workloads. IPC isn't fixed across all code. So even IF AMD has reached "IPC parity" on average, Intel still can have an IPC advantage in some workloads. It even varies by game, actually. On average I'd say they're pretty close. The other factor is frequency - Intel can typically hold higher clocks.
Intel and AMD both build some really good chips these days, they both have their positives and negatives. Overall I'd say gaming performance of CPUs is a non-issue for anything but ultra high end rigs, because you're going to be limited by your GPU first. Even a cheap hexacore CPU can easily keep a $500+ GPU fed at the kinds of settings you're going to run on said GPU. You don't game at 720p on low on a monster GPU, outside of exaggerated benchmarks.
Spunjji - Monday, September 7, 2020 - link
@Alexvrb bringing the sense back into the chat. 👆

RedOnlyFan - Wednesday, September 2, 2020 - link
Most of them still rely on single core performance: things that need an instant response, like opening an application, Photoshop for its main functionality, basic Windows features, to name a few.

Alexvrb - Sunday, September 6, 2020 - link
If you really think that, try disabling all but one core. Shouldn't lose performance in most cases, if most things rely on single core performance.Marlin1975 - Wednesday, September 2, 2020 - link
You said "these" chips beat AMD and use less power. Where do you see that?You keep making claims yet have not backed a single one up.
Spunjji - Friday, September 4, 2020 - link
If you keep incessantly talking about it, maybe eventually somebody will care?JayNor - Wednesday, September 2, 2020 - link
Info on the benchmarks...https://edc.intel.com/content/www/us/en/products/p...
Spunjji - Friday, September 4, 2020 - link
Power consumption does indeed not equal TDP, but the graph on page 1 literally shows these processors consuming 50W at peak turbo. AFAIK Renoir boosts up to 35W but mostl stays around 25W. You'd really hope that a 28-50W CPU could beat a 15-35W design in peak performance and single-threaded tasks.As for the rest of your waffle, it's hardly surprising that there are no AMD laptops in this weight class with Intel-exclusive (TB4) and/or Intel-sponsored (WiFi 6) features. We already know Intel's back to their old "corporate sponsorship" tricks with "Evo" - they refuse to publish the spec, but it's pretty clear that it's about "persuading" manufacturers to single-source from them and/or reserve their best design for systems with Intel CPUs, a g a i n.
vFunct - Wednesday, September 2, 2020 - link
So when sites like Anandtech run benchmarks, are they run at max power? Like the 50W listed here?Or are benchmark power consumption all over the place?
DanNeely - Wednesday, September 2, 2020 - link
Laptops run at whatever power/thermal levels the OEM built them to operate at (normally this isn't something you can mess with). Desktop chips are run at the default settings for the mobo being tested and OCed as high as they'll go with a solid cooling setup.ikjadoon - Wednesday, September 2, 2020 - link
Laptops decide, just like Intel motherboards, unfortunately. Dell's XPS 13 is 45 W PL2, while HP's Spectre x360 13t is closer to 35 W PL2.PL2 (and Tau, I presume) should be noted in each review, but laptop OEMs are reticent to share. However, thermals & battery life easily expose manufacturers who don't do their homework & try to stuff in obscene PL2s with no way to handle that power.
silver_eagle - Thursday, September 3, 2020 - link
This is the turbo boost scenario that happened at fraction of seconds while still maintaining and exceeding in battery life. Did you get it?? Why the lol?Meteor2 - Thursday, September 3, 2020 - link
I think he/she thinks TGL chips just sit there humming along at 50W.Tbh I can barely it's 2020 and people are still getting in a muddle about throttling and thermals in mobile devices.
Spunjji - Friday, September 4, 2020 - link
Nobody's in a muddle. It's very clear that in most designs this chip will never sustain its quoted turbo clocks, and that in designs where it sits at the quoted base clocks it will likely run very hot.eek2121 - Wednesday, September 2, 2020 - link
Looking forward to the reviews. I need a new laptop, but thus far I haven’t found anything appealing from either side.trenzterra - Wednesday, September 2, 2020 - link
"Intel only ever changes its logo when there is a big shift inside the industry" i.e. Intel only changes its logo when it's lagging behind the competitionDrumsticks - Wednesday, September 2, 2020 - link
To be fair, the 2006 brand change came right before Conroe, which was a pretty momentous shift for Intel. It was the start of practically ten years of woe for AMD.I think AMD is far, far better led now than they were in 2006, and their chances of getting caught flat footed are much, much lower than back then, but it's hard to deny that Intel is more confident in Tiger Lake than anything they've released since Zen 2 came out.
Meteor2 - Thursday, September 3, 2020 - link
The only big shift in the industry has been Apple dropping Intel. I would think Intel are seeing Apple as their main threat, and are looking to do a big marketing push to counter.Spunjji - Friday, September 4, 2020 - link
I don't feel like they're any more confident in Tiger than they were with Ice Lake. They appear to be making similarly inflated claims prior to release, which if anything suggests an underlying lack of confidence in the actual product.RedOnlyFan - Sunday, September 6, 2020 - link
But they look more confident than amd are in there gpu market.Qasar - Monday, September 7, 2020 - link
if they are so confident, then why did they mention amd or their products in their tiger lake presentation: https://www.youtube.com/watch?v=aFHBgb9SY1Y its from gamers nexus.mentioned amd 12 times, 4800U 8 times, and competition and derivatives 27 times. how many times in nvidia mention amd or intel in its ampere keynote ?? zero. Steve burke in the video, is not nice to intel at all about this tiger lake presentation. intel also denounces the use of benchmarks, and said only imitators use benchmarks. quite funny over all :-)
Spunjji - Monday, September 7, 2020 - link
That's actually a pretty solid way to measure confidence. Intel never used to talk about AMD's mobile products, just how much better their own stuff was than the previous stuff. 😬Spunjji - Monday, September 7, 2020 - link
I'd be interested in your rationale for that claim.RedOnlyFan - Sunday, September 6, 2020 - link
Obviously. Amd 2006 was s**t it was on the verge of death. When you compare amd today and amd even 2 years ago you will have mind bending improvement figures.Spunjji - Monday, September 7, 2020 - link
"Mind bending improvement" from 2 years ago when AMD were competitive in price/performance with Skylake derivatives would be a pretty impressive feat, especially as Intel still rely on those for their desktop and server products. Are you sure you meant to post that? 😂jjjag - Wednesday, September 2, 2020 - link
any word on the die size or transistor count of the 4/8/96 or the 2/4/48? Or are all the current product de-featured 4/8/96 die?IanCutress - Wednesday, September 2, 2020 - link
2/4/48 is binned 4/8/96 siliconDesierz - Wednesday, September 2, 2020 - link
PCIe 4.0?IanCutress - Wednesday, September 2, 2020 - link
Says it in the first picture.Spunjji - Friday, September 4, 2020 - link
4 whole lanes of it :DvFunct - Wednesday, September 2, 2020 - link
Really wish they had a 45W TDP part.Solendore - Wednesday, September 2, 2020 - link
It's coming early next year :) - 8 core, 45W class parteek2121 - Wednesday, September 2, 2020 - link
I still have yet to find a solid source on an 8 core Tiger Lake part.JayNor - Wednesday, September 2, 2020 - link
perhaps poke in the linux sources. Tiger Lake-H is already there.ballsystemlord - Wednesday, September 2, 2020 - link
Too bad there are no processors listed with hyperthreading disabled. I was so looking forward to their artificial product segmentation. ;)Luminar - Thursday, September 3, 2020 - link
Hyperthreading uses more power and can hurt performance.Meteor2 - Thursday, September 3, 2020 - link
Not in a laptopAchaios - Wednesday, September 2, 2020 - link
Is this yet another paper launch seeing as how #IntelCPUShortages and all?Also nice touch with the 10nm+++++++.
eastcoast_pete - Wednesday, September 2, 2020 - link
Looking forward to the first actual tests. I am due for a new laptop, so it's good if Intel and AMD have at it - better for us. Question: does Renoir support AV1 decode?
nandnandnand - Wednesday, September 2, 2020 - link
No: https://en.wikipedia.org/wiki/Video_Core_Next
eastcoast_pete - Wednesday, September 2, 2020 - link
Yes, I saw that, too. Bit unfortunate; having AV1 decode in an ASIC makes watching AV1 material a lot less battery-hungry. Still, I look forward to an actual test; Ian, if you can, let us know if that AV1 playback works well or not.
nandnandnand - Wednesday, September 2, 2020 - link
My guess is that VCN 3.0 will have AV1 decode, and that it will be in RDNA2 GPUs in a couple of months, and Cezanne and Van Gogh APUs (early?) next year.
Meteor2 - Thursday, September 3, 2020 - link
Let's hope so.
minde - Wednesday, September 2, 2020 - link
When with vPro?
vlado08 - Wednesday, September 2, 2020 - link
i7-1185G7 base frequency of 3000 MHz at what TDP? 12W, 15W or 28W?
Hifihedgehog - Wednesday, September 2, 2020 - link
28W. It's the new default: https://twitter.com/_rogame/status/126981912051152...
ikjadoon - Wednesday, September 2, 2020 - link
Do I trust a random Twitter screenshot from three months ago or the AnandTech article here?
>These processors have a nominal TDP of 15 W
vlado08 - Wednesday, September 2, 2020 - link
Well, sometimes there are unintentional mistakes in the articles. In this article there are some: "The top 12-25 W processors are technically known as ‘UP3’ processors.."
25W or 28W?
Hifihedgehog - Wednesday, September 2, 2020 - link
OK, naysayers. Here is a first-party source: Tiger Lake-U's 3.0 GHz base clock is ONLY for a 28-watt, "TDP-Up" configuration. No magic bullet with SuperFin, I am afraid. https://ark.intel.com/content/www/us/en/ark/compar...
vlado08 - Wednesday, September 2, 2020 - link
Thank you.
smilingcrow - Wednesday, September 2, 2020 - link
Good call.
silver_eagle - Thursday, September 3, 2020 - link
https://www.youtube.com/watch?v=8Kv4QF1_t-o
@Hifihedgehog please cease being a die-hard AMD fan... there is no benefit to being a die-hard fan. Be open to both offerings, and argue with good facts, not speculation, when there aren't enough Intel TGL laptops in the market yet. My advice is to look at the actual product and draw conclusions.
Spunjji - Friday, September 4, 2020 - link
"If I call the other guy a die-hard AMD fan then perhaps nobody will notice how actualyl I haven't yet made a single coherent point."Shoo. Begone. Away with you.
vladx - Friday, September 4, 2020 - link
Look who's talking, you comment on every Intel article here like a crazy ex-girlfriend. I would tell you to use your brain, but I know it would just be a waste of time for a rabid fanboi like yourself.
Qasar - Friday, September 4, 2020 - link
Just like your own comments on anything AMD, says the rabid Intel/Nvidia fan, vladx.
vladx - Friday, September 4, 2020 - link
I rarely comment on this website. Nice try "poisoning the well", Qasar.
Qasar - Friday, September 4, 2020 - link
And when you do, it's usually negative in some way towards AMD.
Spunjji - Monday, September 7, 2020 - link
@Qasar no sense in trying to get coherent responses out of vladx. He just lies and then accuses other people of doing what he does. 😬🤷‍♂️
Spunjji - Friday, September 4, 2020 - link
Maybe you should have read the graph in this very article that clarified this point..? Here's a reminder:
https://twitter.com/IanCutress/status/130182758102...
boredsysadmin - Wednesday, September 2, 2020 - link
According to NotebookCheck (no idea where they got the benchmark, so salt required): https://www.notebookcheck.net/Intel-Core-i7-1165G7...
The Intel Core i7-1165G7 (15W TDP) scores 2530 points in Cinebench R20 - CPU (Multi Core), where the AMD Ryzen 7 PRO 4750U (15W TDP) is at 2992.5.
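For what those third-party numbers are worth, here is a tiny points-per-nominal-watt calculation, assuming (generously) that both chips really were held to their 15 W rating:

```python
# Points-per-nominal-watt using the NotebookCheck scores quoted above.
# Treat with the same salt as the comment suggests; 15 W is the nominal TDP,
# not the measured draw of either chip.
scores = {"Core i7-1165G7": 2530.0, "Ryzen 7 PRO 4750U": 2992.5}
NOMINAL_TDP_W = 15

for chip, score in scores.items():
    print(f"{chip}: {score / NOMINAL_TDP_W:.0f} Cinebench R20 points per nominal watt")
```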
Hifihedgehog - Wednesday, September 2, 2020 - link
That's an assumption. And it's wrong, actually. Per the just-published ARK listing, the 3.0 GHz base that Intel quoted in their PR release is for a 28-watt TDP (TDP-Up). More interesting is how they omit the default TDP listing entirely in ARK. Wonder why? ;) https://ark.intel.com/content/www/us/en/ark/compar...
smilingcrow - Wednesday, September 2, 2020 - link
They don't omit it, but you have to click a few arrows and then it says 15W 1.3GHz. Very sneaky.
Hifihedgehog - Wednesday, September 2, 2020 - link
Unless I am missing something, you are mistaking it for the Ice Lake part (Core i7-1065G7) I also put in the comparison. It probably swapped columns with the Tiger Lake part after hitting the arrows.
smilingcrow - Wednesday, September 2, 2020 - link
You are correct, again! :)
Spunjji - Friday, September 4, 2020 - link
Oh wow. They don't even quote a base frequency for TGL - just the "configurable TDP-up" and "TDP-down" figures. That's kind of hilarious.
Oxford Guy - Wednesday, September 2, 2020 - link
Intel trying to punish people for mocking its ++++ labeling by inventing an extremely cumbersome replacement.
Spunjji - Friday, September 4, 2020 - link
Obfuscation is the name of the game. This way they get a hard reset, instead of having to add a +, then delete the + a few months later and erase all records of previous processes that may or may not have had the same name. 😬
Oxford Guy - Wednesday, September 2, 2020 - link
Stop quoting a single number (TDP). Start quoting the range. It’s highly deceptive to quote 15W and post that as a graphic when the chip draws more than double in real-world use. The people who said “power consumption doesn’t matter” are wrong for many reasons, including performance-per-decibel, which is important with portable form factors.
eek2121 - Wednesday, September 2, 2020 - link
It doesn’t matter. The end result matters. Find me a 2 lb laptop that gets 12 hours of battery life with a Renoir chip in it that has Thunderbolt, WiFi 6, etc., and we will talk.
Hifihedgehog - Wednesday, September 2, 2020 - link
Thunderbolt is the only thing you will be hard-pressed to find in a Renoir offering. Everything else isn't terribly hard, though.
smilingcrow - Wednesday, September 2, 2020 - link
Once USB 4.0 is supported by AMD I can see TB support being much less of an issue.
silver_eagle - Thursday, September 3, 2020 - link
@Hifihedgehog don't die for AMD... be open-minded to both brands. There is no use being a die-hard AMD fan when all information points to TGL beating Renoir by a margin in performance, battery life and features... graphics & gaming, Thunderbolt 4 (4x 4K monitor support), WiFi 6, AI algorithms for workflow & Photoshop image processing, and battery life....
t.s. - Thursday, September 3, 2020 - link
..and double the price. Yep, nope. Thanks anyway. Oh, btw, people will believe it when there are reviews/benchmarks, which are next to non-existent for TGL.
silver_eagle - Thursday, September 3, 2020 - link
Double the price? Are you sure? This is very misleading....
Spunjji - Friday, September 4, 2020 - link
Weird how you just show up to this one article feeling all certain about precisely how this unreleased chip will perform in unreleased notebook designs.
silver_eagle - Friday, September 4, 2020 - link
@Spunjji I think your statements squarely refer to Hifihedgehog. He is the die-hard AMD fan who has poo-pooed Intel since the beginning, without even seeing an actual device. Poor argument from him.
Spunjji - Monday, September 7, 2020 - link
@silver_eagle Wow. All HiFiHedgehog has done is point out inaccuracies in other people's comments; I've not seen him actually *advocate* for anything. Swing and a miss.
Spunjji - Friday, September 4, 2020 - link
"Find me a laptop with a Renoir chip in it that has Intel branding on it. Bet you can't. Checkmate, fanboy 🤡"^ This is almost literally what you're saying here. If you set arbitrary constraints that are only fulfilled in total by one vendor then yeah, congratulations, you "win".
vladx - Friday, September 4, 2020 - link
Pretending there's no benefit to having Thunderbolt 3, what a pathetic hypocrite you are, Spunjji.
Qasar - Friday, September 4, 2020 - link
For some, there is no benefit; I am one of those. I couldn't care less if a notebook has it or not, I will never use it. But the Intel fanbois say it's a must-have feature, and a deal-breaker if it doesn't have it.
silver_eagle - Friday, September 4, 2020 - link
@Qasar, the fact that you don't care for Thunderbolt doesn't mean other users don't care. Plenty of users are using Thunderbolt, as it can enable external discrete GPU connectivity, supercharging your laptop into an even greater 2K/4K gaming machine, while the Xe graphics should work great for 1080p gaming.
Qasar - Friday, September 4, 2020 - link
I said "some" and I am one of those. I guess you glazed over that part. Most of those where I work also don't seem to care about TB support. While you may want it, keep in mind others may not.
Spunjji - Monday, September 7, 2020 - link
@silver_eagle - "supercharge your laptop into even greater 2k/4k gaming machine" - sure, if you feel like paying over the odds to get a laptop with 4 lanes on the Thunderbolt 3 ports, then $400 for an enclosure to put your $500 GPU into, all to get $300 GPU performance out of the whole setup. I was a big convert to the idea of eGPU back in the day, but it's still too expensive and the performance impact is still too high. As a result, in performance-per-dollar terms it's still cheaper to build a separate mini-ITX gaming rig.
RedOnlyFan - Sunday, September 6, 2020 - link
But why miss out on great, real future-proofing (not the AMD FX kind) when with Tiger Lake you get it for free? If you don't like faster data transfer, or connecting multiple devices with just one cable... sure, go live the cables-hanging-out life.
Qasar - Monday, September 7, 2020 - link
RedOnlyFan, I just have no use for TB, that's all. Currently I have 4 USB cables attached to my comp: keyboard, mouse, a USB 3 hub, and a 2-drive USB 3 dock that I don't use very often. That's it; I doubt TB would help reduce those whopping 4 cables at all. Not everyone needs or has a use for TB, so it may not be a feature that some people look for, or a reason to pass on everything just because it doesn't have it.
Spunjji - Monday, September 7, 2020 - link
Having to straw-man me that hard just to get a reply in is what's pathetic here, vladx. Feel free to point to where I said - or even implied - that there's "no benefit to having Thunderbolt 3".
RedOnlyFan - Sunday, September 6, 2020 - link
You will not find it. Don't bother searching.
Spunjji - Monday, September 7, 2020 - link
Yes, because they asked for a crocoduck. It's a well-known tactic amongst cultists and the terminally dishonest.
Bluetooth - Wednesday, September 2, 2020 - link
Why did they call other CPU manufacturers "imitators" in the announcement? Is this a kick at AMD or Apple?
Bluetooth - Wednesday, September 2, 2020 - link
Why did they call other CPU manufacturers "imitators" in the announcement? Is this a kick at AMD or Apple? This felt so unprofessional, but still intentional, the way he said it.
Spunjji - Friday, September 4, 2020 - link
Jackassery is all they have left.
silver_eagle - Friday, September 4, 2020 - link
AMD is definitely an imitator. Intel has spent hundreds of millions of dollars researching and developing the ultrabook ecosystem since the invasion of ARM tablets, with tons of R&D money poured in to enable the x86 ecosystem.... Isn't AMD an imitator by just sitting and riding on the success of Intel's ultrabook push? AMD just needs to spend energy and money on the CPU while Intel has to spend a whole lot of effort and resources on CPU, platform and ecosystem enabling. In other words, Intel creates while AMD rides for free on Intel's effort. And the AMD fanboys don't seem to get it and appreciate this, and poo-poo Intel.
Tams80 - Friday, September 4, 2020 - link
No, most of that has ultimately been on system OEMs. That Intel decided to help them is wholly on Intel. Intel have benefited greatly from it, but it's not theirs to control. And Intel hardly pushed boundaries, as there was always going to be a desire for thinner and lighter computers.
The only one being a fanboy here is you.
silver_eagle - Friday, September 4, 2020 - link
@Tams80, you may need to read up on the news (lots of it, especially around Taiwanese and Chinese OEMs) about how Intel spent hundreds of millions researching and enabling the laptop ecosystem. I read the news where Intel will build thin-and-light form factors as reference designs for all the OEMs, then the OEMs will add their own tweaks and secret sauce.
I read in Chinese news that Intel even directly helps to source and identify quality parts that can help board-level design, and shares them with OEMs.
All these efforts from Intel to enable the ecosystem go unread/unseen/unheard by AMD fanboys... AMD can just borrow Intel's pioneering efforts and success.
Intel doesn't deserve the poo-poo and harsh words from AMD fanboys.
Qasar - Friday, September 4, 2020 - link
silver_eagle, " AMD can just borrow on Intel pioneering efforts and success. " oh like how intel was practically forced to adopt AMD64 so they could also have 64 bit capable cpus ? or how intel came out with the in die memory contoller AFTER amd did ? to fair, amd AND intel have borrowed/coppied/imitated each other in various ways over the years.Tams80 - Saturday, September 5, 2020 - link
Again, it was going to happen one way or another. Intel just sped it up.And if they chose to help third parties, who could and did eventually did take that help for use with a competitor(s); well that's not AMD "cheating", "imitating" or whatever. Intel got their exclusivity. Now the OEMs can do as they wish, and they are always have been big enough to do so; they weren't going to look a gift-horse in the mouth.
And finally, your choice a language suggests that you should be participating in the comments until you grow up.
Spunjji - Monday, September 7, 2020 - link
You seem to have confused "help the ecosystem" with "buy themselves market share".
RedOnlyFan - Sunday, September 6, 2020 - link
No, unlike AMD, Intel works with software vendors, Microsoft, display manufacturers, and OEMs, and innovates and develops the end devices. AMD, after decades of sleeping, wakes up and starts copying Intel. They couldn't even come up with a different model-numbering scheme.. they even copied the 3, 5, 7, 9 series, eww. Intel as a platform has innovated more.. Thunderbolt, WiFi 6 by default, NVMe SSDs, 2.5G Ethernet as a default, PCIe 4 (AMD uses active cooling on desktop motherboards for PCIe 4 chipsets). Intel has single-handedly led the industry as a whole.
Spunjji - Monday, September 7, 2020 - link
So AMD "copied" Intel's numbering scheme... that Intel "copied" from BMW. 😂 However will they survive. Your copy/paste of Intel marketing materials is quite amusing, but that PCIe 4 claim really has a wonderful air of special pleading about it.
dudedud - Friday, September 4, 2020 - link
Didn't they only invent the word "ultrabook"? Pretty sure Apple released the MacBook Air, which in essence was the first ultra-compact laptop design as we know it today.
Spunjji - Monday, September 7, 2020 - link
Sort-of correct - the MacBook Air was itself a bit of a clone of prior Sony Vaio designs, though, right down to the chiclet keyboard. Turns out everything's based on everything else. The MacBook Air range was also the first place where Apple's dissatisfaction with Intel started to show. They kept having their designs held back and/or hampered by Intel not meeting design targets, right up to the latest models that throttle massively under load.
Spunjji - Monday, September 7, 2020 - link
So you're claiming that Intel pushed the industry forwards by... *checks notes* copy/pasting Apple's MacBook designs, then "encouraging" OEMs to single-source all the components in order to get marketing kick-backs that just happen to lock their competitors out. Sure.
shabby - Wednesday, September 2, 2020 - link
So how many 10nm laptop generations will they release before a desktop part comes out?
vladx - Wednesday, September 2, 2020 - link
2022 most likely for desktop, maybe late 2021 for server CPUs.
eek2121 - Wednesday, September 2, 2020 - link
Alder Lake is coming for desktop. Next year.
Spunjji - Friday, September 4, 2020 - link
Projected for "second half" of 2021, so it probably depends on yields. Expect an OEM-only release first, like AMD had to do with desktop Renoir.
Meteor2 - Thursday, September 3, 2020 - link
Maybe one more, this time next year? Alder Lake will be a desktop CPU built on 10 nm and Intel has stated a 2H 2021 release window. Given that laptops outsell desktops by 50% and benefit more from 10 nm, it's not too surprising.
Spunjji - Friday, September 4, 2020 - link
Yet that never stopped them before. It's almost like they only have to make those sorts of decisions when:
a) They can't yield high enough to satisfy demand for both markets, and
b) Their new designs fail to outperform the old ones when TDP limits are relaxed.
drothgery - Thursday, September 3, 2020 - link
Current plan is none, I think. Maybe there will be general availability of 12th gen laptop parts a bit earlier than desktop parts, but 12th gen desktop is going to be Alder Lake S, and that'll be 10 nm.
DejayC - Wednesday, September 2, 2020 - link
Why are they releasing lower-power chips before high-power ones? Intel already has the mobile segment on lock despite the efficiency gains on Ryder. They need to go to 10nm for high-end desktop and servers NOW if they want to stop AMD from eating their lunch in the performance market.
DejayC - Wednesday, September 2, 2020 - link
*Ryzen
drothgery - Wednesday, September 2, 2020 - link
Making 4-core dies is easier than making 8-core dies (and especially 28-core dies), and they didn't start rethinking in a chiplet-style direction for server parts until after Ice Lake SP taped out?
RedOnlyFan - Sunday, September 6, 2020 - link
It's not that Intel doesn't know how to glue things.. they have a better and bigger glue stick. There are trade-offs that come with chiplets; if the customer doesn't want it, they don't make it. In fact Intel has a 56-core, 2-die glued Xeon.
JayNor - Wednesday, September 2, 2020 - link
Ice Lake Server is 10nm ... later this year. Also their first discrete GPU, DG1.
eastcoast_pete - Wednesday, September 2, 2020 - link
One reason is that processors for mobile fetch more $$$ once you factor die area/number of chips per wafer into account. The next big question is whether Intel can move these advances into their Xeon line before EPYC takes their lunch money.
shabby - Wednesday, September 2, 2020 - link
Almost sounds like mobile CPUs are the beta test for a new process.
t.s. - Thursday, September 3, 2020 - link
Nope. As stated: area/number of chips per wafer, and failure rate. AFAIK, Intel is still not using chiplet tech for producing their chips. So the most reasonable, high-success-rate option is mobile procs.
joejohnson293 - Thursday, September 3, 2020 - link
Because low yields prevent them from making 10nm desktop/server parts.
joejohnson293 - Thursday, September 3, 2020 - link
Or, for that matter, 8-core Tiger Lake - I highly doubt they can release 8-core Tiger Lake early next year..
RSAUser - Thursday, September 3, 2020 - link
AMD had such success with Ryzen due to the chiplet design. E.g. the 3600 is 2x4 cores with one fused off on each core complex; that means you technically only needed to get 3 of 4 cores working successfully within a small die area. Successfully getting e.g. 6 adjacent cores to all be defect-free is a lot harder. Also take into account that each wafer has limited space, with the edges usually being wasted.
That's why you start with mobile and work your way up in CPUs.
With graphics cards it's the opposite, since you can fuse off lots of cores and it's still fine, as they don't need to be adjacent as long as you meet general area requirements; then you bin for speed and you get your chip.
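A toy model of why that kind of binning helps, assuming (purely for illustration) an independent 90% per-core yield - real defect statistics depend on die area and defect density, not just core count:

```python
from math import comb

def p_at_least(good_needed: int, cores: int, p_core: float) -> float:
    """Probability that at least `good_needed` of `cores` cores are defect-free,
    assuming independent per-core yield p_core (a big simplification)."""
    return sum(comb(cores, k) * p_core**k * (1 - p_core)**(cores - k)
               for k in range(good_needed, cores + 1))

p = 0.90  # hypothetical per-core yield
print(f"4-core die, all 4 good:         {p_at_least(4, 4, p):.1%}")
print(f"4-core die, 3+ good (binnable): {p_at_least(3, 4, p):.1%}")
print(f"8-core monolithic, all 8 good:  {p_at_least(8, 8, p):.1%}")
```

Being able to sell the 3-of-4 dies as a lower SKU is what pushes the usable yield up relative to a larger monolithic die.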
RedOnlyFan - Sunday, September 6, 2020 - link
It's just the market requirements; there is more demand for thin and ultralight notebooks. High-end desktop, or the desktop market in general, is very small compared to notebooks. So follow the money. Now, AMD and Intel compete in different segments even though they all look the same from the outside. There are some things the glued Epycs are good at, there are some things only Intel can do. It's only the RGB gaming desktop YouTubers that cry pathetically for a few fps and act like the sky has fallen.
Spunjji - Monday, September 7, 2020 - link
"Intel decided to stop doing things they had done with every generation prior to 14nm because actually that was always a bad idea and they should never have been doing it. It's definitely not because of problems with their processes."Okay, Red.
AMDSuperFan - Wednesday, September 2, 2020 - link
Hopefully I don't get in trouble for posting a URL to another site, but I noticed that the Intel 10700, manufactured on 14nm+++, runs cooler than everything including 65-watt 7nm AMD processors. https://www.techpowerup.com/review/intel-core-i7-1... Why do the AMD processors run so hot, I wonder? I also noticed that the 14nm Intel processor only runs 2 watts more than the 7nm AMD processor in the stress test. It seems that Intel has team red beat on efficiency. As an AMD SuperFan, I am saddened by this turn of events. If they beat us that badly with their 14nm process, how much of a beating are we going to be taking with this fancy new 10nm SuperFin stuff? I am worried.
Alistair - Wednesday, September 2, 2020 - link
A lot of people failed high school physics. Heat and temperature are not the same thing; the temperature is irrelevant, the only thing that matters is how many watts...
Alistair - Wednesday, September 2, 2020 - link
Temperature can be different because of the thermal interface, or because of thermal area (smaller chips are hotter at the surface); it doesn't matter, in the end cooling is limited by watts, not temperature.
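A toy model of that point, with made-up numbers purely to show the shape of the effect (real dies, TIMs and coolers are far messier):

```python
# The cooler ultimately has to move watts, but a smaller die concentrates those
# watts and runs hotter at the surface for the same power. Illustrative numbers only.
AMBIENT_C = 25.0
R_HEATSINK = 0.30   # C/W, heatsink-to-ambient (same cooler for both chips)

def die_temp(power_w: float, area_mm2: float, r_per_mm2: float = 20.0) -> float:
    """Steady-state die temperature; die-to-heatsink resistance shrinks with contact area."""
    r_die = r_per_mm2 / area_mm2
    return AMBIENT_C + power_w * (R_HEATSINK + r_die)

for name, area in [("larger 14nm die", 150.0), ("smaller 7nm chiplet", 75.0)]:
    print(f"{name}: ~{die_temp(power_w=65.0, area_mm2=area):.0f} C at the same 65 W")
```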
hojnikb - Thursday, September 3, 2020 - link
Did you actually look at the review you listed? AMD beats Intel in efficiency by a decent margin, and if we look at max boost frequency, AMD is far away. Same for power consumption; that part can eat 250W at max boost.
AMDSuperFan - Thursday, September 3, 2020 - link
Yes, the article clearly says that Intel is much more efficient. The only way it isn't more efficient is if it is overclocked. The stock chip on 14nm always seems to beat the 8-core 7nm AMD in performance and efficiency. I can't believe a 14nm chip runs so much cooler than the 7nm chips and at higher performance and clocks. This is making me quite nervous and scared.
Spunjji - Friday, September 4, 2020 - link
Turbo isn't overclocking, Mr Obvious Shill. The USEFUL page of that article ( https://www.techpowerup.com/review/intel-core-i7-1... ) clearly says that Intel are more efficient on single-thread workloads and less efficient on multi-core workloads when running at the same performance level. They only come close to (and still lose to) AMD for multi-threaded efficiency when the chip runs at base clocks, at which point the chip is also slower than the AMD competition. And guess what: even when your primary task is single-threaded, you can bet your background tasks aren't!
Spunjji - Friday, September 4, 2020 - link
Funny that you seem to have ignored the turbo figures that sit at the precise opposite end of the chart, meaning that to actually match AMD on performance Intel's 14nm+++ CPU has to pull *114W more power*. Thanks for sharing your "concerns", though, three kids in an Intel-branded trenchcoat! Uh, I mean, "AMDSuperFan" 😉😉😉
vladx - Friday, September 4, 2020 - link
Oh boohoo, are you sad he stole your nickname?
Spunjji - Monday, September 7, 2020 - link
For someone who has repeatedly called me obsessed, a fanboy and a shill, you really did just go through this whole thread and reply to me with snarky bullshit. Here you're white-knighting for the most obvious troll since Trump ran for president. I'm happy to admit I post on here a bunch. I'm a tech-head and it bugs me when people post lies and nonsense. What's your excuse for defending a liar?
JayNor - Wednesday, September 2, 2020 - link
"However the underlying clock-for-clock performance improvements are minimal ..."Surely their dual ring bus, cache updates, PCIE4, memory speed updates, Thunderbolt 4 contribute something more than "minimal" improvements to the performance.
hojnikb - Thursday, September 3, 2020 - link
IO will have no impact on clock-for-clock performance.
Tomatotech - Saturday, September 5, 2020 - link
The old mainframes had pretty low performance but absolutely huge I/O, enabling them to do data runs that were impossible for a long time with home computers. Even though a few years later home computers were approaching mainframe speeds, home computers still didn’t have the I/O power to compete (arrays of memory, racks of disk space, ultra-high-speed drum scanners and readers, high-speed printers and cutters). This was all before cloud computing and Thunderbolt came along.
vlado08 - Wednesday, September 2, 2020 - link
In the article the i7-1160G7 has a base frequency of 1200 MHz. Here: https://www.intel.co.uk/content/www/uk/en/products...
The Configurable TDP-up Frequency of i7-1160G7 is 2.10GHz at Configurable TDP-up 15W
The Configurable TDP-down Frequency of i7-1160G7 is 900MHz at Configurable TDP-down 7W
So the 1200 MHz from the article is probably for 12W, because:
Here https://ark.intel.com/content/www/us/en/ark/produc...
The Configurable TDP-up Frequency of i7-1165G7 is 2.80GHz at Configurable TDP-up 28W
The Configurable TDP-down Frequency of i7-1165G7 is 1.20GHz at Configurable TDP-down 12W
So for the i7 4 cores/8 threads we have these "base" (guaranteed) frequencies:
2.80GHz at 28W
2.10GHZ at 15W
900MHz at 7W
vlado08 - Wednesday, September 2, 2020 - link
Edit: So for the i7 4 cores/8 threads we have these "base" (guaranteed) frequencies:
2.80GHz at 28W
2.10GHz at 15W
1.20GHz at 12W
900MHz at 7W
So for the i5 4 cores/8 threads we have these "base" (guaranteed) frequencies:
2.40GHz at 28W
1.80GHz at 15W
900MHz at 12W
800MHz at 7W
AMDSuperFan - Wednesday, September 2, 2020 - link
This is much better than AMD and now I am worried.
quorm - Wednesday, September 2, 2020 - link
Lol at comparing next-gen Intel, with a power consumption halfway between current U and H, to current-gen Ryzen U. Who cares.
Alistair - Wednesday, September 2, 2020 - link
Base clocks and TDP are all meaningless anyway; you need actual power consumption. I have a base of 2GHz but it always runs at 2.6GHz, that's 30 percent more than guaranteed.
quorm - Wednesday, September 2, 2020 - link
As far as I can tell here, Intel has combined a smaller process with a higher TDP, while maintaining a core count limited to 4. 8 cores is arguably excessive for most laptop usage, unless you're using it for compiling, encoding or a similar task, so this tradeoff may well be worth it. But this announcement is PR-heavy and obviously shows these as-yet-unavailable processors in the best possible light. Nobody should think AMD will be unable to compete.
This contrasts heavily with the recent nvidia ampere announcements, as the rtx 3000 series specs indisputably indicate a huge increase in performance from an already dominant company. Perhaps AMD can catch up there, but I doubt it.
nandnandnand - Thursday, September 3, 2020 - link
8 cores is excessive for a smartphone. Renoir will look a little better once independent Tiger Lake reviews come out. And then AMD will leap over Tiger Lake with Cezanne/Van Gogh. But Intel will probably leap over Cezanne a few months later, and so on.
Beany2013 - Thursday, September 3, 2020 - link
It's really *not* excessive if you're using a laptop as a desktop replacement doing heavy office-based multitasking. You know, like most of the western world has been doing for the last six months....
Alistair - Monday, September 7, 2020 - link
It isn't excessive, not when Intel wants the same amount of money (or more) for half the cores.
PandaBear - Saturday, September 5, 2020 - link
Coming from Intel in 2020, yup, likely to be delayed. It doesn't matter until you see mass-production, good-yield supply to laptop OEMs at a good price (i.e. something selling in good volume so they can lower the price). You can have the best paper launch, but if you can't make enough of it to lower the price, people will buy your competitor's much cheaper, slightly worse product any day.
Spunjji - Friday, September 4, 2020 - link
I rarely say this on here, but please do shut the hell up.
vladx - Friday, September 4, 2020 - link
That's not nice, are you a nazi as well?
Spunjji - Monday, September 7, 2020 - link
Why am I duty-bound to be nice to trolls, and why do you care? The obvious answer would be because you're a troll too, but now I'm wondering if it's a little more personal than that.
vladx - Saturday, September 12, 2020 - link
Yes, everyone who doesn't agree with your delusional thinking is a troll. Classic AMD fanboi logic.
Qasar - Saturday, September 12, 2020 - link
And anyone that doesn't agree with your own delusional thinking is a troll, classic Intel fanboy logic. Like before, what's your point? The fact is, you hate anything AMD; others have said this to you already, no need to keep showing it, most already know.
vladx - Monday, September 14, 2020 - link
Except I didn't call Spunjji that, he's a fanboi, not a troll.
Spunjji - Friday, September 4, 2020 - link
I'm not sure I buy that their CPU will go from 1.2GHz at 12W to 2.1GHz at 15W, then take another 13W to get to 2.8GHz. I get that there are shoulders to the voltage curves, but that still seems fishy.
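One rough way to sanity-check those points is a simple switching-power model, P ≈ P_floor + k·V²·f, with a fixed platform/leakage floor. The constants and voltages below are invented purely for illustration; Intel publishes the frequencies, not the voltages:

```python
# Sanity check of the frequency/power points quoted above, assuming
# P = P_floor + k * V^2 * f. Voltages and constants are guesses, not Intel data.
def package_power(freq_ghz: float, volts: float, k: float = 6.0, p_floor_w: float = 9.0) -> float:
    return p_floor_w + k * volts**2 * freq_ghz

points = [("TDP-down, 1.2 GHz", 1.2, 0.60),
          ("nominal,  2.1 GHz", 2.1, 0.70),
          ("TDP-up,   2.8 GHz", 2.8, 1.05)]
for label, f, v in points:
    print(f"{label} at a guessed {v:.2f} V -> ~{package_power(f, v):.1f} W")
```

With a largish fixed floor and a steep voltage bump for the last ~700 MHz, the quoted spread is at least arithmetically plausible, though it implies the 28 W point sits well past the efficiency sweet spot.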
AbRASiON - Wednesday, September 2, 2020 - link
Is this processor going to end up in any Intel NUC?
zamroni - Wednesday, September 2, 2020 - link
If natively reduced to 4 cores and a bit lower clocks, Ryzen 4000 should also be able to reach 7W.
Spunjji - Friday, September 4, 2020 - link
That's Van Gogh's job.
PaulHoule - Wednesday, September 2, 2020 - link
Did they get permission to use the name from Samsung?
GreenReaper - Thursday, September 3, 2020 - link
That's what I was thinking - you can't just use Evo, but then did Samsung ever manage to trademark it themselves? I doubt Intel would slip up like that here.
vladx - Friday, September 4, 2020 - link
You can't trademark something as generic as Evo.
Qasar - Friday, September 4, 2020 - link
Did Mitsubishi use the word Evo first? If so, then maybe they have the trademark to it? :-)
Tchamber - Wednesday, September 2, 2020 - link
This reminds me of the Centrino campaign Intel ran all those years ago. No one knew what that meant, and I doubt this will gain traction, either. I feel like their advertising is saying, "This year, our products are better than last year, trust us!"
Spunjji - Friday, September 4, 2020 - link
Enough people knew that it meant "buy this for a fast laptop". Didn't matter whether or not it was true (although back then it mostly was).
Most of what it's actually about will happen behind the scenes - minimum purchase quantities, kickbacks for hitting sales metrics, design assistance in exchange for agreed numbers of design wins, that sort of thing.
PandaBear - Saturday, September 5, 2020 - link
I remember that name, it just means they have Intel network and graphics and chipset and CPU. A bunch of BS to be honest, like a certified pre-owned car.
GeoffreyA - Saturday, September 5, 2020 - link
That word brings back memories of Centrino advertisements on TV back in the day, with that soothing "Intel Inside" sound effect, which seemed to say to you, unconsciously, "Intel == speed" and "Intel is the way to go."
yeeeeman - Thursday, September 3, 2020 - link
Aaand no review samples. Aaand no more than 4 cores. How can they ask for trust if they can't even deliver a laptop to test???
silver_eagle - Thursday, September 3, 2020 - link
All should watch this to get to know more about Intel 11th Gen Core, directly from the Intel Newsroom: https://www.youtube.com/watch?v=8Kv4QF1_t-o
silver_eagle - Thursday, September 3, 2020 - link
Gaming performance starts @ 44 mins (vs ICL, vs AMD, & Nvidia MX350).
arashi - Thursday, September 3, 2020 - link
I see Intel Marketing is doing OT today.
Spunjji - Friday, September 4, 2020 - link
"I bring this to you direct from Intel PR, allegedly apropos of nothing". K.
hanselltc - Thursday, September 3, 2020 - link
Other publications are noting the base clock to be rated for the 28W configuration, but you note the nominal TDP to be 15W -- what's going on?
IanCutress - Thursday, September 3, 2020 - link
Article has been updated. Intel didn't make it clear...
nirolf - Thursday, September 3, 2020 - link
There's a typo on page 1: "At the top end is Intel’s Core i7-1165G7, a quad core processor with hyperthreading and the full 12 MB of L3 cache." This should be the 1185.
Sahrin - Thursday, September 3, 2020 - link
4.8GHz in a mobile part?...it's going to be interesting to see how this actually works, because typically this is the regime where thermal limits get blown all to hell. There's a reason that AMD/Intel hold mobile clocks down.
Spunjji - Friday, September 4, 2020 - link
50W 😂
RedOnlyFan - Sunday, September 6, 2020 - link
What's so funny about that? 50W is a very normal power consumption for a laptop.
Spunjji - Monday, September 7, 2020 - link
Not in ultraportables with a 15W "nominal" TDP, Red.
Silma - Thursday, September 3, 2020 - link
11th generation Intel Core 45-watt NOT COMING NEXT.
Sharrold66 - Thursday, September 3, 2020 - link
From what I've read, the performance tests were all single-threaded tasks. The Ryzen 4700U would kill it in multithreaded tasks. Also, I saw a review on PCWorld of a Lenovo laptop with a Ryzen 4800U that beat the performance of a Dell laptop with an i7-10875H and a GeForce GTX 1650 Ti. If someone would put the Ryzen 4800H in a laptop, it would be even better!
P.S. I've used ATI/AMD video cards exclusively for over 20 years and never had an issue with drivers. Any problems I ever had were solved with patches or mods for the game itself.
alufan - Thursday, September 3, 2020 - link
Lol at the Intel fanbois creaming themselves over an unbenchmarked CPU. Look, it's simple: AMD make the chips in the consoles, they have added RDNA2 to those chips, and they have a very good CPU in the mobile space right now that beats pretty much any current (i.e. released) Intel mobile part. I admit I hate Intel since I lost a CPU and was denied a warranty claim in the early Core 2 Duo days, but for all the fanbois, I hope they do have a decent offering, because it will force AMD back to the table and we the consumers will benefit. But consider for a second what will happen to this Intel CPU if AMD adds the latest Navi cores as fitted to the consoles, and then also consider whether you think AMD doesn't already have this option waiting in the wings. The trouble is AMD is a whole generation ahead of Intel at the moment, and we saw how long it took AMD to make up that kind of ground.
AMDSuperFan - Thursday, September 3, 2020 - link
Unfortunately for us AMD fans, Intel will always have more money to hire better designers and engineers. So Intel is way back on top. We can't cry over spilled milk. I guess it is good to have competition. We will be back!
PandaBear - Saturday, September 5, 2020 - link
Intel's business model is to move to a rural town and build up a workforce there; they won't pay you top dollar and they basically say take it or leave it. They have been hiring 2nd- to 3rd-rate engineers for a decade already.
AMDSuperFan - Thursday, September 3, 2020 - link
I have also, Sharron! I am running a Radeon 7900 series card and it is really great. It's plenty fast enough for all of my gaming needs. I am very, very concerned about this new Nvidia technology of the 3900 sending us AMD fans back to the stone ages. The 3900 seems much faster than my 7900 in all of the stats. We just don't have the ability to compete at this level. It is better for us fans to be able to buy the new AMD products in the bargain bin.
Spunjji - Friday, September 4, 2020 - link
🤖
quorm - Friday, September 4, 2020 - link
It's a weird bit he's doing where he pretends to be a worried AMD fan.
Spunjji - Monday, September 7, 2020 - link
I'm currently wondering whether it's a vladx sockpuppet. Whatever the case, they're way too obvious to be your standard Intel paid shill.
GeoffreyA - Saturday, September 5, 2020 - link
Worry not, sir or lady, I am sure your cheques from Intel and Nvidia are on their way. You'll be able to upgrade from any AMD-era stone-age equipment.
Spunjji - Monday, September 7, 2020 - link
I doubt they're getting paid for a post that bad. The usual paid posts are along the lines of "muh drivers" and "oh look, here's a link to an Intel PR video, fascinating stuff".
MDD1963 - Thursday, September 3, 2020 - link
It's hard to get excited about a 4c/8t laptop unless gaming is not a primary concern. (Guess I'd need to see this CPU paired with the best mobile GPUs later to gauge its minimum FPS potential, given that desktops often have issues with 4c/8t minimums, excluding the excellent R3-3300X, anyway.)
Spunjji - Friday, September 4, 2020 - link
This one's not likely to be paired up that way. For a start, it only has 4 lanes of PCIe 4.0 - so according to the rules hastily constructed by the community to explain away the dearth of Renoir gaming laptops, it's LiTeRaLlY iMpOsSiBlE for this to be paired with a high-end GPU.
Realistically, though, they're going to keep on throwing their 14nm+++ CPUs at gaming laptops until they can yield enough 10nm product to curl out a few 8-core TGLs into the market.
jrs77 - Friday, September 4, 2020 - link
All this talk about TDP, power draw, etc... WHO CARES?!
All that matters is real-world performance. Tell me how fast a CPU renders a video in Premiere/DaVinci, or how fast it applies filters to images in Photoshop/GIMP, etc.
And then the next question is how long the battery lasts while doing the aforementioned tests. A comparison which is not that easy, as battery sizes vary a lot, unfortunately.
RedOnlyFan - Sunday, September 6, 2020 - link
They demonstrated it right. It can render a 4K video in Premiere 2.5x faster than an 8-core Ryzen, and can apply presets to the same number of images 2x faster than any other laptop CPU. Any Athena device gets 10+ hrs of battery.
ceisserer - Friday, September 4, 2020 - link
I thought this was supposed to be a "launch"? Where are the review units for the press? To be honest, I expected to see charts full of performance numbers from AnandTech - instead I have to believe what Intel presents here. Regardless of the stupidity of this "launch" - impressive GPU performance, kudos Intel!
Spunjji - Friday, September 4, 2020 - link
It's a new high for iGPUs, which is to be commended, but they appear to have solved the problem by throwing die area at it - they need significantly more area (and transistors) just to beat AMD in synthetics. It's not a particularly promising sign for their future dGPU efforts.
PandaBear - Saturday, September 5, 2020 - link
Yes, I was looking at it and realized that too. It is like some of the earlier-gen Ryzen G parts having half the die area for the iGPU.
RedOnlyFan - Sunday, September 6, 2020 - link
What do you mean by "launch"? This was supposed to be only a public announcement of the 11th gen laptop SoC. The final laptops will be launched with pricing and availability by the OEMs. This is the standard industry procedure.
onewingedangel - Friday, September 4, 2020 - link
Has it been confirmed whether or not the integrated Thunderbolt 4 ports will support DisplayPort 2.0? Under display pipes it lists DP 1.4, but if TB4 is a superset of USB4, wouldn't it support DisplayPort Alt Mode 2.0?
vladx - Friday, September 4, 2020 - link
Thunderbolt 4 has the exact same bandwidth as Thunderbolt 3, so it can't support DisplayPort 2.0, which has a bandwidth of 80Gb/s.
onewingedangel - Friday, September 4, 2020 - link
DP 2.0 has the same total bandwidth as TB3, but instead of using 2 lanes up and 2 lanes down concurrently, it uses all 4 lanes to push the data in one direction. Think of it as if both sides of a two-lane motorway switched to having all traffic go in the same direction - you've got the same total throughput, but twice as much going one way, at the cost of losing the return route.
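The lane arithmetic behind that analogy, using rough per-lane signalling rates and ignoring encoding overhead:

```python
# Both interfaces drive four ~20 Gb/s lane pairs; they just point them in
# different directions. Raw rates only - encoding overhead is ignored here.
LANES = 4
GBPS_PER_LANE = 20

tb3_per_direction = (LANES // 2) * GBPS_PER_LANE   # 2 lanes each way
dp20_one_direction = LANES * GBPS_PER_LANE          # all 4 lanes toward the display

print(f"Thunderbolt 3/4: {tb3_per_direction} Gb/s each way (bidirectional)")
print(f"DisplayPort 2.0 (UHBR20): {dp20_one_direction} Gb/s, one way only")
```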
vladx - Friday, September 4, 2020 - link
Thunderbolt only supports 40Gb/s; for the full DisplayPort 2.0 spec you need 80Gb/s. I'd rather they don't begin offering partial support and make it into a shitshow like with USB Type-C.
Spunjji - Monday, September 7, 2020 - link
This is something on which we unequivocally agree. USB Type-C is a nightmare standard.
wow&wow - Friday, September 4, 2020 - link
10nm for 4-core and 2-core chips, SuperFun transistors and EVil?
minde - Sunday, September 6, 2020 - link
Intel, when will you release a U series with vPro? Ice Lake did not have it; Tiger Lake?
Sortmyit - Sunday, September 6, 2020 - link
That's one shiny head, can I beat on it like a drum with a drumstick?
watzupken - Monday, September 7, 2020 - link
"It's a new high for iGPUs, which is to be commended, but they appear to have solved the problem by throwing die area at it - they need significantly more area (and transistors) just to beat AMD in synthetics. It's not a particularly promising sign for their future dGPU efforts."While we do not know how much die space the top end Xe LP graphic uses on the Tiger Lake, but I will not be surprise it uses a signficant amount of die space. Which is probably one of the reasons why this is stuck at 4 cores max. Other reason could be due to power requirement to sustain the high clockspeed and simply not enough for more than 4 cores with the top end XE LP graphic. These are just my opinions.
Spunjji - Tuesday, September 8, 2020 - link
I've previously done a rough calculation based on an Intel-provided die shot, and to summarise, if their claimed GPU performance comes true then they're beating Vega 8 by around 25% but at the cost of a 33% larger die area. We probably already know as much as we ever will about that - Intel aren't in the habit of discussing these things publicly anymore.
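A quick ratio check on those two estimates (both are the commenter's rough figures, not published die measurements):

```python
# Performance-per-area sketch using the estimates quoted above.
perf_ratio = 1.25   # claimed Xe LP lead over Vega 8 in synthetics
area_ratio = 1.33   # claimed die-area cost relative to Vega 8

perf_per_area = perf_ratio / area_ratio
print(f"Performance per unit of iGPU area: {perf_per_area:.2f}x Vega 8 "
      f"(~{1 - perf_per_area:.0%} worse per mm^2)")
```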
watzupken - Monday, September 7, 2020 - link
Actually, after reading through the article, this supposed "amazing" SuperFin and high clockspeed sounds like hot air to me. At 12W, the base speed did not improve over Ice Lake U, which was almost as low. The high base clockspeed that Intel is boasting about in their marketing requires a base of 28W. To reach the boost speed, it goes up to 50W in theory. Looking forward to the independent reviews to see how this performs.
throAU - Tuesday, September 8, 2020 - link
"Launches". Guessing this will be like the 1.3GHz Pentium 3 of old: launched on paper, in reality not shipping for 6 months at least.
Arbie - Tuesday, September 8, 2020 - link
Puzzled by crazy posts? Intel has, at secret camps around the world, trained a crew of "muddy the waters" experts. The valedictory event is a lobotomy, after which they spawn to do battle on the forums. Lately they've been programmed to magnify irrelevancies, make exaggerated or baseless claims, hide assumptions, quote discredited benchmarks, and generally derail or confuse any news favorable to AMD or unfavorable to Intel.
Doubts are sown and manured - and logic is no impediment. My favorite is a recent assertion that Intel processors are much more secure because so many vulnerabilities have already been identified in them!
That's how it's done - and it works!
Spunjji - Wednesday, September 9, 2020 - link
Oh, for the days when people did it for free as a hobby 😑
six_tymes - Saturday, September 12, 2020 - link
does anyone know when these will be out on a DDR5 platform?