The Bay Trail Preview: Intel Atom Z3770 Tested
by Anand Lal Shimpi & Brian Klug on September 11, 2013 12:00 PM EST
Earlier this year Intel unveiled Silvermont, its first true architectural update to Atom since its introduction in 2008. I won’t rehash Silvermont’s architecture here, but it’s designed to make Intel genuinely competitive in the ultra mobile space. Leveraging Intel’s first 22nm SoC process and a very low-power, efficient microarchitecture, Silvermont aims squarely at the latest Krait cores from Qualcomm and ARM’s Cortex A15.
Today Intel takes the next step forward, introducing the first tablet SoC based on Silvermont: Bay Trail.
Bay Trail takes up to four Silvermont cores, and for the first time in an ultra mobile Intel SoC pairs them with Intel’s own graphics IP. That’s right, rather than using a GPU block from Imagination Technologies, Bay Trail leverages the same GPU architecture as Ivy Bridge.
The first Bay Trail tablets will be shipping by the end of the year, across both Android and Windows 8.1. Intel expects Bay Trail to show up in tablets and 2-in-1s priced below $599, with everything above $599 falling under Haswell’s jurisdiction.
Bay Trail & Branding
Bay Trail, like all Atom platforms before it, will be available in multiple form factors. Unlike the Atoms of yesterday however, the SoC will carry Pentium and Celeron branding when used in notebooks and desktops. Intel didn’t disclose much about its Silvermont plans in other form factors beyond some basic naming:
Basically notebooks ship under the Pentium N3000 & Celeron N2000 series, while desktops will carry Pentium J2000 & Celeron J1000 branding. All Pentium SKUs seem to be quad-core, while Celeron SKUs will be available in both dual and quad-core versions.
Thankfully Intel shied away from introducing the same complexity with its tablet focused Bay Trail parts. All Bay Trail tablet SKUs carry Atom branding. There’s the quad-core Z3700 series and the dual-core Z3600 series.
Although Intel offers both dual and quad-core Bay Trail SKUs, they are both based on the same single physical design. In other words, dual-core Bay Trail parts are just die harvested quad-core parts. Intel isn’t disclosing die size or transistor counts, which is ironic (and disappointing) given that Apple just disclosed both (or at least relative magnitude of one) for its A7 SoC.
Internally, the Bay Trail design is pretty nice. There are either two or four cores enabled, each pair with a shared 1MB L2 cache (2MB total for a quad-core part). Intel is following the unfortunate lead of everyone else in the mobile industry and advertising max turbo frequencies exclusively.
Thankfully Intel hasn’t yet decided to obfuscate max non-turbo frequencies:
| Bay Trail Turbo Speeds | | | | | | |
|---|---|---|---|---|---|---|
| Max turbo frequency | 2.39GHz | 2.41GHz | 1.86GHz | 1.83GHz | 2.0GHz | 2.0GHz |
| Max non-turbo frequency | 1.46GHz | 1.5GHz | 1.33GHz | 1.33GHz | 1.33GHz | 1.33GHz |
In general you’re looking at 1.33GHz - 1.46GHz max non-turbo frequencies, with Bay Trail being able to turbo up to anywhere between 1.83GHz and 2.41GHz depending on SKU.
Although the core architecture is 64-bit in design, there will be no OS support for 64-bit Bay Trail at launch. Windows 8.1 with Connected Standby appears to still be 32-bit only, and obviously Android is 32-bit only at this point as well.
The memory interface is fairly ridiculous by mobile standards. You either get two 64-bit LPDDR3 channels (128-bit total width) or a single 64-bit DDR3L channel. In the case of the former, that’s the same memory bus width as Apple’s A5X/A6X line of SoCs as well as the standard Core i3/i5/i7 parts. Max supported memory frequency is 1066MHz in dual-channel LPDDR3 mode, or 1333MHz in single-channel DDR3L mode. The only benefit to the latter is really cost, as Bay Trail will purportedly show up in some very cheap devices.
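To put those memory configurations in perspective, the peak theoretical bandwidths work out as follows. This is a quick sketch that assumes the quoted MHz figures are effective transfer rates (MT/s), which is how such specs are typically reported:

```python
def peak_bandwidth_gbps(channels, bus_bits, mtps):
    """Peak theoretical bandwidth in GB/s:
    channels x bus width (bytes) x effective transfer rate."""
    return channels * (bus_bits // 8) * mtps * 1e6 / 1e9

# Dual-channel LPDDR3-1066, 2 x 64-bit
lpddr3 = peak_bandwidth_gbps(2, 64, 1066)
# Single-channel DDR3L-1333, 1 x 64-bit
ddr3l = peak_bandwidth_gbps(1, 64, 1333)
print(f"LPDDR3: {lpddr3:.1f} GB/s, DDR3L: {ddr3l:.1f} GB/s")
```

So the dual-channel LPDDR3 configuration offers roughly 17.1 GB/s against about 10.7 GB/s for single-channel DDR3L, despite the latter’s higher per-channel clock.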
The GPU is Intel’s own Gen7 graphics core, a cut down implementation of what we first saw in Ivy Bridge. I suppose it’s premature to expect Merrifield, Bay Trail’s smartphone counterpart, to also use Intel’s own graphics core but it’s clear this is the direction Intel is headed in - and away from licensing IP from Imagination Technologies.
Rather than 16 EUs in the Ivy Bridge GT2 configuration (HD 4000), Bay Trail’s HD Graphics core ships with 4. The 4 EUs are otherwise effectively identical to what we found in Ivy Bridge. The GPU can dynamically scale frequency and share power between itself and the CPU cores. Minimum GPU frequency on Bay Trail is 311MHz, with a max GPU frequency of 667MHz (or 688MHz for the DDR3L SKUs).
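As a rough sanity check of where that puts Bay Trail relative to HD 4000, one can compare EU count multiplied by max clock. Note the 1150MHz HD 4000 clock below is an assumed figure typical of desktop Ivy Bridge parts, not something stated in this article, and this ignores any bandwidth or architectural differences:

```python
# Rough theoretical shader throughput comparison: EUs x max clock (MHz).
# The 1150MHz HD 4000 clock is an assumption (common desktop Ivy Bridge
# figure), not a number from the article.
bay_trail_gpu = 4 * 667     # 4 EUs at 667MHz
hd4000_gpu = 16 * 1150      # 16 EUs at an assumed 1150MHz
ratio = bay_trail_gpu / hd4000_gpu
print(f"Bay Trail ~ {ratio:.1%} of HD 4000 theoretical throughput")
```

Under those assumptions Bay Trail’s GPU lands at roughly 15% of HD 4000’s theoretical throughput, which is about what you’d expect from a tablet-class cut of the same architecture.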
Intel is quick to point out that Bay Trail’s GPU supports DirectX 11 and OpenGL ES 3.0. Unfortunately this support list appears limited to Windows. Under Android, it’s unclear whether or not Bay Trail will ship with anything above OpenGL ES 2.0 support. The same goes for GPU accelerated Renderscript. Bay Trail supports up to 2560 x 1440 displays over eDP1.3/DP1.2, or 1080p over HDMI. Panel Self Refresh is also supported.
Video encode and decode blocks also shifted away from Imagination in Bay Trail. Both IP blocks are custom from Intel now. The ISP (Image Signal Processor) is from Silicon Hive (an Intel acquisition).
Comments
Nagorak - Wednesday, September 11, 2013 - linkSamsung already used the old Atom in their Galaxy Tab 3 10.1, and they make their own ARM licensed cores in-house. It's not going to take much to get these vendors to switch. If the performance is there, and the price is competitive, plenty will make the switch. These OEMs design electronics as their business, it's not going to be a huge difficulty for them to make designs with Atom cores instead of ARM cores. And considering X86 works with both Windows and Android, I don't see why having a higher compatibility base is somehow a negative.
Dentons - Wednesday, September 11, 2013 - linkKeep in mind that ARM allows mobile device manufacturers a large, competitive marketplace from which to purchase CPUs.
Were tablet manufacturers to spurn ARM, they'd drop themselves right back into Intel's high-margin, nearly monopolistic arms.
So why have Samsung and Asus released Android devices featuring Intel CPUs? Both Samsung and Asus purchase a lot of expensive Intel chips for their laptops. It would be less than surprising were they to have been compensated with discounts for having released Intel powered Android devices.
Another huge problem for X86 Android is software support. Nearly all Android applications are compiled for the ARM instruction set. The hundreds of thousands of existing Android apps *WILL NOT RUN* on Intel powered Android devices. At best, they need recompilation, at worst, rewriting. Moving hundreds of thousands of ARM compiled apps to X86 is a heavy lift. Intel has a recompilation service, but it's only able to do so much.
The bottom line is that Intel is just now, finally releasing a CPU competitive with ARM. ARM has a massive lead. A larger lead than Intel has ever had in the PC market. To convince manufacturers to relinquish ARM for X86, Intel doesn't just need minimally better technology, they need far better technology and equal or better pricing.
Right now, Intel's technology is not that much better than ARM, and their pricing? Unless they decide to sell below cost, they'll likely never beat ARM pricing.
jwcalla - Wednesday, September 11, 2013 - linkI agree with much of this but I think you're a bit off on the Android applications aspect. The vast majority of Android apps are written in Java so there's no incompatibility with x86 there. For the native apps, recompiling to x86 is somewhat trivial since Android is a Linux OS. Third, it seems that Intel's ARM-to-x86 ISA translation program works pretty well.
h-jumbo - Thursday, September 12, 2013 - linkYes - and to further correct Dentons' comment on Android compatibility, Intel has a binary translator for Android that will convert ARM ISA to x86 on the fly. It works amazingly well. If x86 gets more traction in Android devices, it will be used less and less as app developers compile for x86 in addition to ARM (which is trivial).
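The ARM-vs-x86 question in this thread only applies to apps that bundle native libraries; pure-Java apps run unmodified. As an illustration, here is a small sketch that inspects an APK (which is just a zip archive) to see which native ABIs it ships libraries for. The `shipped_abis` helper is hypothetical, written for this discussion:

```python
import zipfile

def shipped_abis(apk):
    """Return the set of native ABIs an APK ships libraries for.

    APKs are zip files; native libraries live under lib/<abi>/.
    A pure-Java app (the common case in 2013) has no lib/ entries,
    returns an empty set, and runs unmodified on x86 Android.
    """
    abis = set()
    with zipfile.ZipFile(apk) as z:
        for name in z.namelist():
            parts = name.split("/")
            if len(parts) >= 3 and parts[0] == "lib":
                abis.add(parts[1])
    return abis
```

An app reporting only `armeabi-v7a` here would need either a rebuild for x86 or Intel's on-the-fly binary translation to run on Bay Trail.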
JPForums - Thursday, September 12, 2013 - link
Wow, you just talked about Intel killing off WinRT and then moved on to talking about a lack of applications for Windows. You can't have it both ways. Either legitimize WinRT as a competitor and bash it for a lack of applications or (more realistically) dismiss WinRT and accept that Windows has more applications, of higher quality and more fully featured, than any app store. Since when did a fully featured application become inferior to an app? How many (software) things can you do on a tablet that you can't do with a Windows PC or even OSX? Let's even throw Linux in there for kicks and grins.
Also consider that the biggest advantage of maintaining a process lead is cost. Yes a new process costs more than an old process, especially when applying new techniques like double masking and tri-gates, but the bulk of the cost is still in the silicon. The exact same chip fabricated on a smaller process generally means lower cost due to the ability to fit more chips per wafer. Intel maintains the highest margins in the industry because they also maintain the lowest cost per comparable chip. I'd imagine that Intel will give these chips a price tag to match (or slightly exceed) their level of performance compared to their ARM competition. Unfortunately, as you said, simply being competitive isn't enough to justify a rapid switch over of an ARM dominated market. They are going to need to offer something their competitors don't have or eat significantly into their margins. That said, they will get some design wins simply by being competitive and being Intel. Uptake in the Windows market will help as manufacturers could conceivably use the exact same or very similar hardware to power both a Windows and an Android tablet, saving cost. This could fuel a slow long term takeover, but like you, I don't see a sudden switch.
lmcd - Wednesday, September 11, 2013 - linkSo in that whole power comparison versus Jaguar where was the graphics disclaimer? I think the pounding seen here easily warrants the TDP difference.
eanazag - Wednesday, September 11, 2013 - linkThe 3DMark extreme bench scores look suspect. I think the graphs are swapped.
Clearly AMD's graphics in Kabini smoke Intel's products and many other non-Ivy Bridge GPUs. I wonder how much of a power hit Kabini took to produce that. Meaning, would a comparably performing GPU to Intel's make the power to performance ratio more favorable at maybe 3.5W max under load? I don't think that cutting Kabini's GPU down to even Bay Trail's level would beat Intel's power consumption, but I suspect it would be closer.
I am irritated they don't just call it Atom on desktop and laptop. Clearly trying to get out of the netbook Atom stigma. Whatever. Ultimately, I am still disappointed that Atom doesn't do enough on the GPU side. It still leaves me as a consumer having to make a choice between graphics intensive and CPU intensive workloads. And the fact AMD's Kabini is even close on CPU performance is a weak showing because Intel has a mature process node advantage on virtually everyone now.
PEJUman - Wednesday, September 11, 2013 - linkmore importantly, can someone explain how Anand was able to run Android with x86 Kabini and x86 Ivy Bridge on the Android GPU bench? since when can x86 processors run ARM natively? if not, did Intel actually let Anand use their Android pre-beta port on the other 2 platforms?
Jaybus - Wednesday, September 11, 2013 - linkAndroid has run on x86 for a long time. Android is Linux-based.
PEJUman - Wednesday, September 11, 2013 - linkyeah but I was under the impression it uses a VM and a custom build on top of that. Anand was able to run the GPU bench (I assumed this meant a native x86 build), with a 4.2.2 build at that. Isn't Android built for ARM only?