I suspect this is almost certainly the case. I wonder if it drops down below 10w if you turn off the Kinect (which I would never do myself)?
I also hope Sony updates their software - the Xbox stays in the 16-18W range when downloading updates, whereas the PS4 jumps up to 70W when in standby and downloading an update (and still takes 30 seconds to start up!).
It seems that the PS4's extremely high standby/download power draw is due to the ARM co-processor not being up to the task. It was supposed to handle basic I/O and the other background features, but apparently it wasn't spec'd sufficiently, forcing Sony to keep the main CPU powered on to handle that work. The rumor is that they will "soon" release a new revision with a more powerful ARM core that is up to the job, which should allow powering down the x86 CPU completely, as per the original plan. (Either that, or reworking the "standby" software so that the existing ARM core can handle it would also do the trick.)
I believe MS is now also rumored to "soon" release a revision of the Xbone, although what that might entail is unknown. An SKU without the Kinect could allow them to drop the price $100 to better compete with PS4.
Incidentally, the Xbone seems to be running HOTTER than the PS4, so MS' design certainly cannot be said to be a more efficient cooling design - it's more that they have more open space, which isn't being used as efficiently as in the PS4's design. The temp differential is also probably down to MS' last-minute decision to give the APU a 5% clock speed bump.
I'm looking forward to the 'in depth' article covering each. As far as performance is applicable in actual use scenarios, i.e. games, I'm interested to get the real low-down... The vast similarity in most aspects really narrows the number of factors to consider, so the actual differentiating factors and their implications really should be able to be addressed comprehensively.
Like Anand says, I don't think memory throughput is a gross differentiator per se - or at least we could say that Xbone's ESRAM can be equivalent under certain plausible scenarios, even if it is less flexible than GDDR5 and thus restricts possible development paths. For cross-platform titles at least, that isn't really a factor IMHO.
The ROP difference is probably the major factor in any delta in frame buffer resolution, but PS4's +50% compute unit advantage still remains a factor in future exploitability... And if one wants to look at future exploitability, then addressing the GPU and the PS4 features applicable to that is certainly necessary. I have seen discussion of GPGPU approaches which can achieve 'traditional graphics' tasks more efficiently than a conventional pure-GPU approach, so this is directly applicable to 'pure graphics' itself, as well as to the other applications of GPGPU - game logic/controls like collisions, physics, audio raycasting, etc.
When assessing both platforms' implications for future developments, I just don't see anything on Xbone's side that presents much of a unique architectural advantage that isn't portable to PS4 without serious degradation, while the reverse very much is the case. While cross-platform games of course will not truly leverage architectural advantages that allow for 'game changing' differences, PS4's CU, ROP, and GPGPU queue advantages should pretty consistently allow for 'turning up the quality dial' on any cross-platform title... And to the extent that the techniques for exploiting them become widely used, we could in fact see some 'standardized' design approaches which exploit e.g. GPGPU techniques in ways easily 'bolted on' to a generic cross-platform design... Again, that's not going to change the ultimate game experience, but it is another vector to increase the qualitative experience. Certainly even in the first release games there are differences in occlusion techniques, and this is almost certainly without significant exploitation of GPGPU.
If Xbone's resolution is considered satisfactory, I do wonder what the PS4 can achieve at the same resolution while utilizing the extra CU and GPGPU capacity to achieve actually unique differences, not just a similar experience at higher resolution (i.e. 1080 vs. 900). If 900 upscaled is considered fine, what can be done if that extra horsepower is allocated elsewhere instead of to increasing the pixel count?
I love my Moto X. The X8 8-core setup means each core has its own job - 1 core for always listening and 1 core for active notifications. Very easy on the battery, which is why it is one of the best battery-life phones right now.
That's not entirely true either. The Moto X uses the Qualcomm MSM8960Pro. The SoC is a dual-core processor with an Adreno 320 GPU, which has 4 cores. Adding the 2 co-processors equals 8; hence Motorola's "X8" marketing speak.
I know it might be due to constraints right now, but can you test power consumption when it's being used to pass TV through? I would like to know how viable an option that is, considering you need 3 pieces of electronics for it to run (TV, Xbox One and cable box). I'm guessing it's going to be similar to idle consumption... which, if it is, is WAY too high just to add a bit of voice control/fantasy sports overlay. Just a pity they didn't integrate a similar experience into the Media Center part of Windows.
Worrying about power consumption, especially when it's just passing cable through, is really just looking for problems. The added cost from power is going to be ridiculously small, and if it's really a concern you would be better off ditching cable for things like Netflix. If you are actually worried about the generation of the power, you would save more by just watching one less show every day.
I would say that it's not, really. They're advertising this as a useful feature, but if it's adding 50% to your overall TV watching power draw that's a pretty significant concern from various perspectives.
Let's be pessimistic and assume it actually did add 50% of the power draw, which it almost certainly doesn't. From what I can find, LCDs draw only around 70 watts, which means your Xbox would be drawing 35 watts. Even if you left your TV on 24/7 all year, at the average the EIA lists for this year of 12.07 cents per kilowatt-hour that only comes to $37.01. That's a pathetic amount to be worrying about when you are about to drop $400-500 just on a system to play games and then $60 per game. This is not a reasonable concern; this is looking for faults to complain about.
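For anyone who wants to check the math, here's a quick sketch (the 35 W figure is the pessimistic hypothetical from above, not a measurement):

```python
# Back-of-the-envelope yearly cost of a hypothetical 35 W continuous draw,
# using the ~12.07 cents/kWh average rate cited above.
watts = 35
hours_per_year = 24 * 365                      # 8760 hours, i.e. running 24/7
rate_per_kwh = 0.1207                          # USD per kilowatt-hour

kwh_per_year = watts * hours_per_year / 1000   # ~306.6 kWh
cost = kwh_per_year * rate_per_kwh             # ~$37.01

print(f"{kwh_per_year:.1f} kWh/year -> ${cost:.2f}/year")
```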
Will you be going into any of the media streaming capabilities of the different platforms? I've heard that Sony has abandoned almost all DLNA abilities and doesn't even play 3-D Blu-ray discs? (WTF) Is Microsoft going that route as well or have they expanded their previous offerings? Being able to play MKV Blu-ray rips would be interesting...
Also, what's the deal with 4K and HDMI? As I understand it, the new consoles use HDMI 1.4a, so that means only 4K at 24Hz (30Hz max), so no one is going to be gaming at 4K, but it would allow for 4K movie downloads.
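For context, a rough sketch of why HDMI 1.4a tops out around 4K24/30: the limit is the TMDS pixel clock (this assumes the commonly cited 340 MHz ceiling for HDMI 1.4 and the standard CEA-861 total frame sizes with blanking; actual link budgets vary):

```python
# Rough check of why HDMI 1.4a caps 4K at ~30 Hz.
HDMI_14_MAX_TMDS_MHZ = 340   # commonly cited TMDS clock limit for HDMI 1.4

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    # Pixel clock = total pixels per frame (active + blanking) * refresh rate
    return h_total * v_total * refresh_hz / 1e6

# 3840x2160 with standard blanking is a 4400x2250 total frame.
print("4K30:", pixel_clock_mhz(4400, 2250, 30), "MHz")   # 297 MHz -> fits under 340
print("4K60:", pixel_clock_mhz(4400, 2250, 60), "MHz")   # 594 MHz -> exceeds 340
```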
I've spent the last couple years investing heavily into PC gaming (Steam, GOG, etc.) after a long stint of mostly console gaming. A lot of my friends who used to be exclusive console gamers have also made the switch recently. They're all getting tired of being locked into proprietary systems and the lack of customization. I've hooked a bunch of them up with $100 i3/C2Q computers on Craigslist and they toss in a GTX 670 or 7950 (whatever their budget allows) and they're having a blast getting maxed (or near maxed) 1080p gaming with less money spent on hardware and games. Combined with XBMC, Netflix via any browser they prefer, it's definitely a lot easier for non-enthusiasts to get into PC gaming now (thanks big Picture Mode!). Obviously, there's still a long way to go to get the level of UI smoothness/integration of a single console, but BPM actually does a pretty good job switching between XBMC, Hyperspin, Chrome, and all that.
Except they aren't 100+GB. The 4K movies Sony is offering through its service are 40-60GB for the video with a couple audio tracks. You forget that most Blu-ray video files are in the 20-30GB range, only a handful even get close to 45GB. And that's using H.264, not H.265 which offers double the compression without sacrificing PQ.
Don't measure other peoples' sanity based upon your own. I download multiple 15-25GB games per month via Steam without even thinking about it. 4K video downloads are happening now and will likely continue with or without your blessing. :/
The thing is, 4K is roughly 4x the pixels of 1080p. Therefore, a 4K video at the appropriate bit-rate will be ~4x the size of the 1080p version. So yes, a 4K movie should be about 80-120 GB.
Now the scaling won't be perfect given that we aren't requiring 4x the audio, but the audio on 1080p BRs is a small portion relative to the video.
The 60 GB 4k video will probably be an improvement over a 30 GB 1080p video, but the reality is that it is a bitrate starved version, sort of like 1080p netflix vs 1080p BR.
The thing is, what format is available to deliver full-bitrate 4K? Quad-layer BRs? Probably not. Digital downloads... not with internet caps. Still, I'm in no rush; my 23 TB (and growing... always growing) media server would be quickly overwhelmed either way. Also, I just bought a Panasonic TC60ST60, so I'm locked into 1080p for the next 5 years to ride out this 4K transition until TVs are big enough or I can install a projector.
When you say "full bitrate 4k", do you even know what you're saying? RED RAW? Uncompressed you're talking many Gbps and several TBs of storage for a feature-length film. DCI 4K? Hundreds of GBs. Sony has delivered Blu-ray-level picture quality (no visible artifacts) at 4K under 60GB; it's real and it looks excellent. Is it 4K DCI? Of course not, but Blu-ray couldn't match 2K DCI either. There are factors beyond bit rate at play.
Simple arithmetic cannot be used to relate bit rate to picture quality, especially when using different codecs... or even the same codec! At the same bit rate, launch DVDs look like junk compared to modern DVDs. Blu-ray discs today far outshine most launch Blu-ray discs at the same bit rate. There's more to it than just the bit rate.
You certainly know what I meant by "full bitrate" when I spent half my post describing what I meant. Certainly not uncompressed video, that is ridiculous.
There is undoubtedly room for improvement using the same codec to achieve more efficient encoding of BR video. I've seen significant decreases in bitrate accomplished with negligible impact to image quality with H.264 re-encodes of BR video. That said, to this day these improvements rarely appear on production BR discs; they instead come from videophile enthusiasts.
If what you're saying is that all production studios (not just Sony) have gotten their act together and are more efficiently encoding BR video, then that's great news! Then when ripping BRs I won't have to re-encode the video more efficiently because they were too lazy to do it in the first place!
If this is the case, then yes, 60 GB is probably sufficient to produce artifact-free UHD; however, this practice is contrary to the way BR production has been done since the beginning, and I'd be surprised if everyone follows suit. Yes, BR PQ/bitrate has been improving over the years, but not to the level of a completely artifact-free UHD feature-length movie in 60GB.
Still, 60 GB is both too large for dual-layer BRs and far too large for the current state of the internet (with download caps). I applaud Sony for offering an option for the 4K enthusiast, but I'm still unclear as to what the long-term game plan will be. I assume a combination of maintaining efficient encoding practices and H.265 will enable them to squeeze UHD content onto a dual-layer BR? I hope (and prefer) that viable download options appear, but that is mostly up to ISPs removing their download caps, unfortunately.
Overall, it's interesting, but still far from accessible. The current extreme effort required to get UHD content and the small benefit (unless you have a massive UHD projector setup) really limits the market. I'm saying this as someone who went to seemingly extreme lengths to get HD content (720p/1080p) prior to the existence of HDDVD and BR. Of course consumers blinded by marketing will still buy 60" UHD TVs and swear they can see the difference sitting 10 - 15+ ft away, but mass adoption is certainly years away. Higher PQ display technology is far more interesting (to me).
You're assuming that all 45 GB of the data is video, when usually at least half of that is audio. Audio standards haven't changed - still DTS-MA, TrueHD, etc. Typically the actual video portion of a movie is around 10GB, so we're talking closer to the 60GB number that was mentioned above.
The video portion of a BR is by far the bulk, and audio is certainly not half the data. For example, Man of Steel has 21.39 GB of video, and the English 7.1 DTS-HD track is 5.57 GB. The entire BR is 39 GB; the remainder is a bunch of extra video features plus some other DTS audio tracks for other languages. So keeping the bitrate-to-pixel ratio the same, 4x scaling gives us ~85 GB of video. To fit within 60GB, the video portion could only be 54.5GB, to leave room for the DTS-HD audio (English only). That would be ~64% of the bitrate/pixel for UHD compared to 1080p, assuming 4x scaling and the same codec and encoding efficiency (worked numbers below). Perhaps in some cases you can get away with less bitrate per pixel given the sheer number of pixels, but it certainly seems on the bitrate-starved side to me. Even if video without noticeable artifacts is possible for a 2h20min movie (20 min of credits, so ~2h) like Man of Steel, a longer movie or one that is more difficult to encode without artifacts (grainy, dark) would struggle.
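Here are those numbers worked through, assuming the same codec and the same bits per pixel, which is the big caveat as discussed in the rest of this post:

```python
# Scaling the Man of Steel Blu-ray numbers quoted above to UHD,
# assuming the same codec and the same bits per pixel (a rough upper bound).
video_1080p_gb = 21.39
audio_dts_hd_gb = 5.57
disc_budget_gb = 60.0

video_uhd_gb = video_1080p_gb * 4                      # ~85.6 GB at 1080p bits/pixel
budget_for_video = disc_budget_gb - audio_dts_hd_gb    # ~54.4 GB left for video
ratio = budget_for_video / video_uhd_gb                # ~0.64 of the 1080p bits/pixel

print(f"UHD video at 1080p quality: {video_uhd_gb:.1f} GB")
print(f"Video budget in a 60 GB package: {budget_for_video:.1f} GB "
      f"({ratio:.0%} of the 1080p bitrate per pixel)")
```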
Keep in mind, that is JUST the core video/audio. We've thrown out all the other video that would normally come with a BR (which is fine by me - just give me the core video and audio and I'm happy). If they insist on keeping the extra features on a UHD BR release, they would certainly have to include them on a separate disc, since even an average-length movie would struggle to fit on a 50GB disc. To fit a BR with just video and English DTS-HD audio, we are talking 52% of the bitrate/pixel for UHD compared to 1080p. We would certainly need H.265 encoding in that case.
So I would probably concede that UHD without artifacts in only 60GB is possible for shorter films, or if you can get away with less bitrate/pixel due to the higher resolution. For longer films and/or difficult-to-encode films, I could see this going up towards 100 GB. Putting more effort into encoding efficiency and switching to H.265 will certainly be important steps towards making this possible.
For what it's worth, Blu-ray is far beyond the sweet spot in bit rate. Take a UHD video clip, resize it to 1080p and compress it to Blu-ray size. Now compress the UHD video to Blu-ray size and watch them both on a UHD TV. The UHD clip will look far better than the 1080p clip; at 1080p the codec is resolution-starved - it has plenty of bandwidth but not enough resolution to make an optimal encoding. The other part is that if you have a Blu-ray disc, it doesn't hurt to use it. Pressing the disc costs the same whether the video is 40GB or 30GB, and it could only get slightly better, whereas if you're streaming video it matters. Hell, even cartoons are often 20GB when you put them on a Blu-ray...
I pay 29.99 for my 150/30 Mbit connection with 3 TB of traffic. My average download volume was around 450 GB over the last few months and I sit close enough to my 60" screen (which isn’t 4k - yet) to notice a difference between the two resolutions.
So yes, I would absolutely buy/rent 4K movies if Sony could offer them at a decent bitrate. I would even buy a PS4 for that sole purpose.
You sit closer than 7 ft (4 ft optimal) to your 60" TV? This must be in a bedroom, office, or a tiny apartment. I live in what I consider a small apartment and I still sit 10 ft away. Perhaps you just put your couch in the center of the room so that it is really close to your TV? Either way, this is not most people's setup. Average seating distances are far greater than 7 ft. UHD TVs will need to be ~100+" for any benefit to be apparent to regular consumers.
You must also live in Europe or Asia to get an internet rate like that. I pay $45/mo for 45/4 Mbit with a 300GB cap - although it's unlimited between 2am - 8am, which I take full advantage of.
We've got three rows of seating in our home theater. 115" 1080p projection with seating at approximately 7', 11', and 15'. I choose my seating positions based completely upon my audio configuration which is calibrated to the room's acoustic strengths, not upon one-size-fits-all visual acuity seating calculators. We generally sit in the front row when we don't have guests. It's immersive without being nauseating. Pixels are visible in the back row with good eyesight, so I'm anxiously awaiting a 4K upgrade, but probably not until laser projection becomes affordable.
We've got Comcast Business Class 50/10 for $99/mo. No cap and 50/10 are the guaranteed minimum speeds, unlike the residential service which has a cap (sort of) and sells you a max speed instead of a minimum. Comcast BC also has a $59 plan with no cap that is 12/3, but we wanted more speed. Still can't get gigabit fiber... :-(
Sweet setup! You definitely have the screen real estate and seating arrangement to take advantage of 4k. I'd like a similar setup when I move on from apartment style living to a house. Awesome Internet setup too. I could get unlimited as well, and did for a while, but I realized I could pay half as much and get away without hitting my cap by downloading during "happy hours", but that takes some planning.
I've been anxiously waiting for laser projection systems as well... Will they ever come or is it vaporware? Hopefully that is what my next TV purchase will be.
More BS from the anti-4K crowd. I'm sitting 8 feet from my TV right now. In fact it's difficult to sit further away in your standard apartment living room. For a 60 inch TV, 4K resolution is recommended for anything 8 feet or closer. For an 85 inch TV it's 11 feet. For a 100 inch screen it's 13 feet.
I'm hardly the anti-4K crowd. I think 4K is great, I just think it is only great when properly implemented. This means that 4K TVs should start at 60", since only very few people sit close enough to begin to see the difference. 8 ft is about the optimal distance for 1080p at 60"; if you really want to take advantage of 4K, you'd sit at 4 ft for a 60" set.
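For anyone curious where those distances come from, they fall out of the usual 1-arcminute-per-pixel visual acuity rule of thumb (a simplification; real perception is messier):

```python
import math

# Distance at which one pixel subtends 1 arcminute (the usual acuity rule of thumb),
# i.e. roughly the farthest distance at which the extra resolution is still visible.
def max_useful_distance_ft(diagonal_in, horizontal_pixels, aspect=16/9):
    width_in = diagonal_in * aspect / math.sqrt(aspect**2 + 1)   # panel width
    pixel_pitch_in = width_in / horizontal_pixels                # one pixel's width
    one_arcmin = math.radians(1 / 60)
    return pixel_pitch_in / math.tan(one_arcmin) / 12            # inches -> feet

print("60in 1080p:", round(max_useful_distance_ft(60, 1920), 1), "ft")  # ~7.8 ft
print("60in 4K:   ", round(max_useful_distance_ft(60, 3840), 1), "ft")  # ~3.9 ft
```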
PS3 didn't launch with DLNA support, either. I'm guessing it will get patched in at some point.
As for the rest of it, I'm guessing they made a guess that 4K won't really catch on during the lifespan of these systems, which seems like a fairly safe bet to me.
And with only 16 ROPs, Microsoft has trouble even pushing 1080p gaming. It seems that they targeted 720p gaming, which is fine with me since most of the time TVs aren't big enough for this to matter. Microsoft did target 4K video, though, and they designed the video decode blocks specifically to handle this load. It will likely be high-resolution but low-bitrate video, which in most cases is not an improvement over 1080p at a high bitrate.
2005? The consoles then being well spec'd? I disagree - they were mostly pretty great, but I recall very distinctly thinking 512MiB of RAM was pretty poor.
Shame that it can only use that ESRAM bandwidth on a total of 1/256th of the system's memory... so you need to account for that in your sums. I.e., it's useless for most things except small data areas that are accessed a lot (framebuffer, z-buffer, etc).
Except you just said it... You store what's used the most, and you get to realize a huge benefit from it. It's the same theory as a cache, but it gives programmers finer control over what gets stored there. Giving the developers the ability to choose what they want to put in the super low-latency, high bandwidth eSRAM is really a good idea too.
Computer architecture is mainly about making the common case fast, or in other words, making the things that are done the most the fastest operations in the system. In this case, accessing the z-buffer, etc. is done constantly, making it a good candidate for optimization via placing it in a lower latency, higher bandwidth storage space.
LOL. No. The majority of things that actually affect quality and frame rate are going to be larger than the ESRAM. 192GB/s to the ENTIRE 8GB vs. 204GB/s for a dinky slice of it... It's painfully obvious what the bottlenecks will be. Oh, and I forgot the whole PS4-running-a-7850 vs. the XB1's 7770 thing... and the 8GB of RAM vs. 5 true GB of RAM (the 3 OSes take up 3GB).
With that said, get the console that your friends will play, or that has the games you want... Anyone pretending the XB1 is better in raw power is deluding themselves (it's hardly even close).
I'm simply describing how the eSRAM should work, given that this should be a traditional PC architecture. Nowhere did I comment on which is the more powerful console. I really don't feel I'm qualified in saying which is faster, but the GPU seems to indicate it's the PS4, as you rightly said.
Now, it is true that the PS4 has larger bandwidth to main memory. My point was that if the eSRAM has a good hit rate, let's say 80%, you'll see an effective speed of 0.8*204 = 163GB/s. This is a horrible measure, as it's just theoretically what you'll see, not accounting for overhead.
The other difference is that GDDR5's timings make it higher latency than traditional DDR3, and it will be an order of magnitude higher in latency than the eSRAM in the XB1. Now, that's not to say that it will make a big difference in games because memory access latency can be hidden by computing something else while you wait, but still. My point being that the XB1 likely won't be memory bandwidth bound. That was literally my only point. ROP/memory capacity/shader bound is a whole other topic that I'm not going to touch with a 10-foot pole without more results from actual games.
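As a toy illustration of that "effective speed" idea - with completely made-up hit rates, and glossing over whether eSRAM and DDR3 traffic can actually overlap - a blended figure might look something like this:

```python
# Toy model of "effective" bandwidth if some fraction of traffic is
# serviced by the eSRAM and the rest falls through to DDR3.
# Hit rates here are invented for illustration; real behavior depends
# entirely on how a title lays out its data.
ESRAM_GBPS = 204
DDR3_GBPS = 68

def blended_bandwidth(esram_hit_rate):
    return esram_hit_rate * ESRAM_GBPS + (1 - esram_hit_rate) * DDR3_GBPS

for hit in (0.5, 0.8, 0.95):
    print(f"hit rate {hit:.0%}: ~{blended_bandwidth(hit):.0f} GB/s (theoretical peak)")
```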
But yes, buy the console your friends play, or buy the one with the exclusives you want.
It's not even close to a traditional PC architecture. I mean, it totally is, if you completely ignore the eSRAM and custom silicon on the die.
Test after test after test has shown that latency makes practically zero impact on performance, and that the increased speed and bandwidth of GDDR5 is much more important, at least when it comes to graphics (just compare any graphics card that has both a DDR3 and a GDDR5 variant). Latency isn't that much greater for GDDR5, anyway.
The eSRAM is only accessible via the GPU, so anything in it that the CPU needs has to be copied to DDR anyway. Further, in order to even use the eSRAM, you still have to put the data in there, which means it's coming from that slow-ass DDR3. The only way you'll get eSRAM bandwidth 80% of the time is if 80% of your RAM access is a static 32 MB of data. Obviously that's not going to be the majority of your graphics data, so you're not going to get anywhere near 80%.
The most important part here is that in order for anyone to actually use the eSRAM effectively, they're going to have to do the work. Sony's machine is probably going to be more developer-friendly because of this. I can see how the eSRAM could help, but I don't see how it could possibly alleviate the DDR3 bottleneck. All of this is probably a moot point anyway, since the eSRAM seems to be tailored more towards all the multimedia processing stuff (the custom bits on the SoC) and has to be carefully optimized for developers to even use it anyway (nobody is going to bother to do this on cross-platform games).
I'm sorry to burst your bubble, and sorry to butt in, but you are wrong about the eSRAM only being available to the GPU. If you look at the Digital Foundry interview with the Microsoft Xbox One architects and the Hot Chips diagram, IT SHOWS AND THEY SAID that the CPU has access to the eSRAM as well.
Yes, latency has very little impact on graphics workloads due to the ability to hide the latency by doing other work. Which is exactly what I said in my comment, so I'm confused as to why you're bringing it up...
As far as the CPU getting access, I was under the impression that the XB1 and PS4 both have unified memory access, so the GPU and CPU share memory. If that's the case, then yes, the CPU does get access to the eSRAM.
As far as the hit rate on that eSRAM, if the developer optimizes properly, then they should be able to get significant benefits from it. Cross-platform games, as you rightly said, likely won't be optimized to use the eSRAM as effectively, so they won't realize much of a benefit.
And yes, you do incur a set of misses in the eSRAM corresponding to first accesses. That's assuming the XB1's prefetcher doesn't request the data from memory before you need it.
A nontrivial number of accesses from a GPU are indeed static. Things like the frame buffer and z-buffer are needed by every separate rendering thread, and hence may well be useful. 32MB is also a nontrivial amount when it comes to caching textures as well. Especially if the XB1 compresses the textures in memory and decodes them on the fly. If I recall correctly, that's actually how most textures are stored by GPUs anyway (compressed and then uncompressed on the fly as they're needed). I'm not saying that's definitely the case, because that's not how every GPU works, but still. 32MB is enough for the frame buffers at a minimum, so maybe that will help more than you think; maybe it will help far less than I think. It's incredibly difficult to tell how it will perform given that we know basically nothing about it.
To actually say if eSRAM sucks, we need to know how often you can hit in the eSRAM. To know that, we need to know lots of things we have no clue about: prefetcher performance, how the game is optimized to make use of the eSRAM, etc.
In general though, I do agree that the PS4 has more raw GPU horsepower and more raw memory bandwidth exposed to naive developers. My only point that I made was that the XB1 likely won't be that far off in memory bandwidth compared to the PS4 in games that properly optimize for the platform.
There's a whole other thing about CPUs being very latency sensitive, etc., that I won't go into because I don't know nearly enough about it, but I think there's going to be a gap in CPU performance as well because things that are optimized to work on the XB1's CPU aren't going to perform the same on the PS4's, especially if they're using the CPU to decompress textures (which is something the 360 did).
And with that, I reiterate: buy the console your friends buy or the one with the exclusives you want to play. Or if you're really into the Kinect or something.
Also, not saying the guy above you isn't an idiot for adding the two together. The effective rate Anand quotes takes into account approximately how often you go to the eSRAM vs. going all the way out to main memory. The dude above you doesn't get it.
That's intensely stupid - you're saying that because something is traditional it has to be better. That's a silly argument, and it's not even true. The consoles you mentioned all have embedded RAM, but all the others from the same generations don't.
At this point, arguing that the Xbox One is more powerful or even equivalently powerful is just trolling. The Xbox One and PS4 have very similar hardware, the PS4 just has more GPU units and a higher-performing memory subsystem.
Flunk, if you're saying right now that the PS4 is more powerful, then obviously you base your info on the current spec sheet and not on the architectural design. What you don't understand is that what's underlying all that new architectural design - which has to be learned at the same time it's being used - will only improve exponentially in the future. The PS4 is straightforwardly a PC with a little mod in the CPU to take better advantage of the GPU, but it's pretty much a straightforward old design, or better said a "current architecture GPU design". That's the reason many say it's easier to program for than the Xbox One. But right now that "weaker system" you so much swear by has a couple of games designed for it from the ground up that are being claimed to be the most technically advanced-looking games on the market - and you can guess which ones I'm talking about - games that even Sony's in-house 1st-party title "KSF" can't compete with in looks. I'm not saying KSF isn't awesome looking, it is, but even compared to Crysis 3 it pales in comparison. So the PS4 is supposed to be easier to develop for, supposed to be more powerful and called a supercomputer, yet when looking for that power gap in 1st-party games that had the time to invest in its power, the "weaker system" with the harder-to-develop-for architecture shows a couple of games that trounce what the "superior machine" was able to show. Hmmm. Hopefully for you, time will tell and the games will tell the whole story!
Calling people names? Haha. How utterly silly for you to say the two different RAM types can be added for a total of 274GB/s. Hey guys it looks like I now have 14400 RPM hard drives now too!
Traditional cache-based architectures rely on all requests being serviced by the cache. This is slightly different, though. I'd be wary of adding both together, as there's no evidence that the SoC is capable of simultaneously servicing requests to both main memory and the eSRAM in parallel. Microsoft's marketing machine adds them together, but the marketing team doesn't know what the hell it's talking about. I'd wait for someone to reverse engineer exactly how this thing works before saying one way or the other, I suppose.
It's entirely possible that Microsoft decided to let the eSRAM and main memory be accessed in parallel, but I kind of doubt it. There'd be so little return on the investment required to get that to work properly that it's not really worth the effort. I think it's far more likely that all memory requests get serviced as usual, but if the address is inside a certain range, the access is thrown at the eSRAM instead of the main memory. In this case, it'd be as dumb to add the two together as it would be to add cache bandwidth in a consumer processor like an i5/i7 to the bandwidth from main memory. But I don't know anything for sure, so I guess I can't say you don't get it (since no one currently knows how the memory controller is architected).
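To put that guess in concrete terms, here's a sketch of the address-window idea (the addresses and routing are illustrative only - nobody outside Microsoft knows how the real memory controller is wired):

```python
# Sketch of the "address window" idea: each request is routed to whichever
# pool owns the address, so the two bandwidths never simply add together.
# Sizes/addresses here are illustrative, not the real Xbox One memory map.
ESRAM_BASE = 0x8000_0000
ESRAM_SIZE = 32 * 1024 * 1024   # 32 MB

def route_request(address):
    if ESRAM_BASE <= address < ESRAM_BASE + ESRAM_SIZE:
        return "eSRAM"   # fast, but only for this 32 MB window
    return "DDR3"        # everything else goes out to main memory

print(route_request(0x8000_1000))   # eSRAM
print(route_request(0x0123_4567))   # DDR3
```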
smartypnt4's description of the eSRAM is very much how a typical cache works in a PC, such as L1, L2, L3. It should also be mentioned that L2 cache is almost always SRAM. Invariably, this architecture is just like a typical CPU architecture, because that's what AMD Jaguar is. Requests to addresses that aren't in the cached range get forwarded to the SDRAM controller. There is no way Microsoft redesigned the memory controller - that would require changing the base architecture of the APU.
Parallel RAM access only exists in systems where there is more than one memory controller or the memory controller spans multiple channels. People who start adding bandwidths together don't understand computer architecture. These APUs are based on existing x86 architectures, with some improvements (look up AMD Trinity). These APUs are not like the previous gen, which used IBM POWER cores and was largely different.
But Microsoft's chip isn't an APU, it's an SoC. There's silicon on the chip that isn't at all part of the Jaguar architecture. The 32 MB of eSRAM is not L2, Jaguar only supports L2 up to 2 MB per four cores. So it's not "just like a typical CPU architecture."
What the hell does Trinity have to do with any of this? Jaguar has nothing to do with Trinity.
Actually, and I apologize again for butting in, but if you read the Digital Foundry interview with the Microsoft Xbox One architects, they heavily modified that GPU and it is a DUAL PIPELINE GPU! So your theory is not really far from the truth! The interview: http://www.eurogamer.net/articles/digitalfoundry-t...
Plus, to add: the idea of adding the DDR3 to the eSRAM is kind of acceptable because, unlike the PS4's simple, straight architecture with its one pool of GDDR5, you have 4 modules of DDR3 running at 60-65GB/s which can each be used for specific simultaneous requests. That makes it a lot more advanced and more like future DDR4 in behavior, and kills the bottleneck that people who don't understand it think it has. It's new tech, people, and it will take some time to learn its advantages, but it's not hard to program. It's a system designed to have fewer errors, be more effective, and perform way better than supposedly higher-flops GPUs, because it can achieve the same performance with fewer resources! Hope you guys can understand a little; I'm not trying to offend anyone!
All the other consoles you mentioned (apart from the PS2) are based on IBM PowerPC chips; you are comparing their setup to x86 on the new consoles - silly boy.
The PS4 has a more powerful GPU since it has more compute cores. What this means is that while the Xbone has a slightly higher clock speed, there are more compute cores to do the work on the PS4, so the work can be split up and done faster. Also, while the GPU might be able to read from both pools of memory at one time, that doesn't mean the RAM bandwidth is (60GB/s + 200GB/s) or whatever the numbers are.
"Microsoft has claimed publicly that actual bandwidth to the eSRAM is somewhere in the 140 - 150GB/s range, which is likely equal to the effective memory bandwidth (after overhead/efficiency losses) to the PS4’s GDDR5 memory interface. The difference being that you only get that bandwidth to your most frequently used data on the Xbox One."
"The difference being that you only get that bandwidth to your most frequently used data on the Xbox One."
No. This is effective bandwidth to the eSRAM only after protocol overhead, nothing more.
Ah so graphics card manufacturers can replace GDDR5 with cheap low frequency DDR3 on all of their boards and get equal/greater performance so long as they add a little chunk of SDRAM to the chip... Good to know man, thanks for that brilliant analysis. They should have come to you years ago to tap your knowledge of memory subsystems. Just think of all the money AMD and NVIDIA could have saved by doing so.
Well, in theory they can... but it would cost nVidia/AMD MORE money, as the GPU die would be bigger and thus have fewer shader cores. So it's not a good solution for a discrete GPU, but it IS a decent solution in SOME cases - see Crystalwell, for instance. Honestly, I would say I think the PS4's setup is better - simple and fast - versus MS's more complex setup (and they ended up with a bigger die too, lol).
Hey noob, it doesn't work that way. SRAM is not equivalent to high-speed GDDR5. This has been well established already. You do get some boost at some points, but it's not covering every area of performance the way GDDR5 is.
Newb talk? No, you can't add them together. Let me tell you why, in technical terms.
ESRAM is meant to be a cache, and what a cache does is hold some data that you're going to need a lot (say there's some instruction code or other code/data that needs to be read frequently). You put that data in the ESRAM, and it gets read 10+ times before being swapped for some other data. What you're saying makes it seem like we can constantly write and read from the ESRAM. That's not how it works.
tl;dr: You can't add them together because you should only use the ESRAM 1/10th as often as the main DDR3 RAM that the Xbox One has. So your argument is invalid; don't say things that you don't know about.
He's indeed wrong, but I'd be willing to bet good money your hit rate on that eSRAM is way higher than 10% if it's used as a cache. Usual last level caches have a hit rate getting into the 80% range due to prefetching, and large ones like this have even higher hit rates.
If it's hardware mapped like the article indicates(aka not a global cache, but more like it was on the 360), it won't hit quite as often with a naive program, but a good developer could ensure that the bulk of the memory accesses hit that eSRAM and not main memory.
You actually only get about 100GB/s READ or 100GB/s WRITE... The best-case scenario on the XBox One is 68GB/s + 100GB/s - still NOT matching the PS4's capabilities for reading/writing ANY memory... and only in certain situations where you are streaming from both memory systems.
Xbone PEAKS below PS4's AVERAGE memory performance.
Huh? Actually you are wrong. The Xbox One uses 8GB of DDR3 RAM at 2133 MHz for 68.3 GB/s of bandwidth, but also adds an extra 32 MB of ESRAM for 102 GB/s of embedded memory bandwidth. The PS4 uses 8GB of GDDR5 RAM at 5500 MHz for 176 GB/s of bandwidth.
I don't get this constant worrying about power usage on non-mobile devices. They plug into a wall, and as long as it's not some obscene (300+W) amount of draw, I don't care, damn it... Heat can be an issue, but I'm personally not even remotely concerned that it might cost me $3 more a year in power to use my $400 PS4. If I was, I shouldn't be buying a PS4 or Xbox One, let alone games for them.
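For reference, those headline figures just fall out of bus width times transfer rate (assuming the widely reported 256-bit memory interfaces on both consoles; 5500 MT/s on a 256-bit bus works out to 176 GB/s by this math):

```python
# Peak bandwidth = bus width (bytes) * transfer rate (GT/s).
# Both consoles are widely reported to use 256-bit memory interfaces.
def peak_bandwidth_gbps(bus_bits, transfer_mts):
    return bus_bits / 8 * transfer_mts / 1000

print("Xbox One DDR3-2133 :", peak_bandwidth_gbps(256, 2133), "GB/s")  # ~68.3
print("PS4 GDDR5 5500 MT/s:", peak_bandwidth_gbps(256, 5500), "GB/s")  # 176.0
```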
It doesn't matter if you aren't concerned, the EPA is.
Vote against Democrats if you don't like it.
Seriously, from what I understand, regulations in the EU in particular influenced these boxes, and I'm sure a power-hog machine was out of the question due to the general climate of "green" propaganda nonsense.
He sounds like he is either A. a retard or B. he is talking out of his ass or C. trolling. Either way, ignore him.
The power consumption regulations are there for more than just "green". We currently have a problem of growing energy needs, and where we're going to get that power from and how it's going to be transported are big questions.
What people don't realize is that the power grid's infrastructure is designed with a peak load in mind, and due to implementation and cost limitations you can't just "build more", as most Americans seem to think. 1 million consoles sold at launch - think about that in terms of power consumption, and remember, this legislation doesn't just apply to consoles.
Also, I don't understand the whole "anti-green" view. I don't see how it's bad; even if you don't think global climate change is real (which it is, btw - it's fact in every meaning of the word), do you really think dumping all those exhaust fumes into the atmosphere is good for you or something? How would you like to wear a gas mask/air filter when you go outside? See what's happening in China right now (smog) because of the massive amounts of coal being burned.
tl;dr: Power consumption affects more than "green"; it affects infrastructure durability and limits, and large upgrades are extremely costly and time-consuming. So please do some research instead of trying to act like a smartass. Also, burning lots of fossil fuels can make you sick - see China's smog issue.
"...going to get that power from and how it's going to be transported are big questions." "...Also, I don't understand the whole "anti-green" view,..." deleted 'why' "...you like to wear a gas mask/air filter when you go outside?"
I think Anand is covering it more as a curiosity. High-power PCs with much better capabilities consume similar amounts at idle, so a specifically designed piece of hardware should be optimized MUCH better. But it isn't, and neither is the PS4. Odd.
My only issue is: WHY, Microsoft, DID YOU KILL MEDIA CENTER and throw all your focus on the Xbox??? I would kill to have an HTPC with HDMI-in and voice command.
I'm not sure this is entirely true as I remember them saying that they had it at the reveal. Also there is some conflicting information on the internet about that. I would bet that the chip that they are using has it and even if it's not available at launch I would hope they didn't take the traces out of the header to save money. I will certainly test in my lab when I get mine and keep you guys up-to-date.
I completely agree. You still cannot buy a DVR!!! Tivo - look it up, you have to pay for their service. $20/month for a DVR from the cable company... we've gone backwards from the VCR in many ways. It makes me want to build my own HTPC (with CableCard and maybe SiliconDust HD HomeRun) but there are still lots of nagging issues doing that - HD recording, MythTV and XBMC integration, IR blaster remotes...
Look at SiliconDust's HDHomeRun Prime. I know it isn't supported at launch, but I will bet money that it will be supported in later updates. I have it now running on Windows Media Center (both Windows 7 and 8) and love it.
XBMC doesn't support "copy once" material, so it's not even an option for many - though if it did, I would switch in a minute, due to the fact that it is still being developed.
XBMC has no DVR functionality. Maybe combined with MythTV you can do it but that integration has just been done recently with an XBMC add-on, otherwise you are dealing with the two programs independently - much better to watch things in XBMC but you have to go into MythTV to do recording.
Because nobody made an off-the-shelf, plug-and-play HTPC. Since MS is making hardware now, I don't know why they didn't try to repackage Media Center as a Windows 8 app and make another attempt. The whole world is fighting for your TV; Microsoft has been here since 2005, and somehow they called it quits (for the PC) and put all their eggs in the Xbox basket.
How expensive would it be to offer two options instead of one? I know a good deal of enthusiasts who would kill for a $2k HTPC with full XBone capabilities. It would cut the grass from under the Steam Box's feet too.
So wait... that IGN article where they stated that the PS4 has a 2.75GHz clock is false? 'Cause this could explain the faster response times and higher power usage, since the GPUs are not THAT different. I don't think that all of that 20W-30W power difference is GPU only.
Okay, "max frequency of 2.75GHz"... either way, that could explain a lot (including the overheating problems some people are having now).
Excuse me, I'm new here so.... I'm sorry if we're not supposed to post links or anything for that matter. The IGN review is called "Playstation 4 Operating Temperature Revealed".
I would be glad if someone could clear this up for me, since this Anand review states that the PS4 runs at 1.6GHz.
The worst part is, as I tweeted you, as recently as weeks before launch MS was strongly considering enabling the two redundant CUs, but chose not to. My own reliable sources told me this, and it was also somewhat referenced by MS engineers in a Digital Foundry article.
Anyway, I strongly wish they had; 1.5 teraflops just would have felt so much better, even if on paper it's a small increase.
MS was so dumb not to beef up the hardware more; charging $499 for essentially an HD 7770 GPU in nearly 2014 I find sad.
Hell, my ancient 2009 HD 4890, factory overclocked to 950MHz, has more flops in practice, even if the 7770/XO GPU is probably faster due to being more advanced.
Think about that: the 4890 is a 5-year-old GPU. The XO is a brand new console expected to last 7+ years. So sad I don't even wanna think about it.
Ahh well, the sad thing is that by the looks of your comparison vids, MS will very likely get away with it. Even in the 720P vs 1080P Ghosts comparison there is not much difference (and I imagine over time the XO will close the resolution gap to something more like 900P vs 1080P).
One of the most interesting parts of your article, though, was the speculation that the XO is ROP-limited. Not something I hadn't heard before, but still interesting. Shortsighted on MS' part if so.
Overall it feels like, as usual, MS is misguided: focusing on live TV when it's probably slowly fading away (if not for that pesky sports problem...), and on other things that seem cute and cool but half-assed (voice recognition, Snap, Skype, etc. etc.).
Yet for all that I can still see them doing well, mostly because Sony is even more incompetent. If they were up against Samsung or Apple they would already be dead in consoles, but fortunately for them they are not; they are up against Sony, who loses pretty much every market they are in.
I think if the XO struggles, a nice move would be a rebrand as a Kinect-less, games-focused machine at $299. At that price it'd arguably be a nice buy, and the cheap DDR3 base should enable it. But if it sells OK at $499 with Kinect, and it probably will, we'll probably never get a chance to find out.
I agree for the most part, but 14, or even 18, CUs isn't going to be enough to really make a big difference. I think the sad part technology-wise is how not one of the 3 major console gaming companies this time around focused on pushing the horsepower or even doing anything very innovative. Don't get me wrong, I for one don't think graphics are primarily what makes a good game, but since the days of Atari -> NES, this really feels like the smallest technological bump (was gonna say "leap", but that just doesn't seem to apply) from gen to gen. What makes it worse is that the last gen lasted longer than any before it. You know the rise of the dirt-cheap phone/tablet/FB/freemium game had something to do with it...
Having actual CPU resources, a unified GPU architecture with desktops (and many mobile SoCs), and tons of RAM are all big differences over the last generation's introduction.
The Xbox builds on that by adding in co-processors that allow lots of difficult stuff to happen in real time without affecting overall performance.
Thank god people didn't think like this when computers first started with switches and paper tape. Remember, we have to start somewhere to move the technology forward. I want the Jarvis computer from Iron Man! You don't get there by making a console that can just play games. You get there by making a console that can play games and has voice recognition and gestures and... People get used to interacting with new input sources, and then you find yourself in a situation where you say, "how did I ever live without this?" You guys sound like I did in the 80s when Microsoft was coming out with this stupid GUI crap: "You will have to rip the command line from my cold dead fingers!" Where would we be today if everyone thought like me? Where would the Internet be if it was just a command line? I for one applaud Microsoft for trying to expand the gaming market, not just for hardcore gamers but for people like my girl too. I know the PS4 might have more power in terms of compute performance, but that is not what games are about; it's about story line, immersiveness (made-up word), and to some extent graphics. Truth is, there is really no difference between 1080 and 720 on a big screen - remember, people, this is not a PC monitor - and the X1 can do 1080p. I'm looking forward to what both systems can offer in this next generation, but I'm more interested in the X1 due to its forward-thinking aspects. Only time will tell, though.
Rule of thumb is that you need a 10x increase in power to get a 100% increase in visual fidelity. Look at the 360 vs the One: 6x the power and maybe games look 50% better. So we are talking about the PS4 looking maybe 5% better than the Xbox One. In this gen, it really is about who has the exclusives you want.
And if you are looking out 5+ years, you have to take into account Xbox's cloud initiative. Have you used OnLive? I can play Borderlands 2 on an Intel Atom. If MS puts the $ behind it, those 8 pitiful CPU cores could be used just to power the OS and a cloud terminal. It's the only way these consoles can keep up with midrange PCs.
Interesting that you use numbers referring to visual fidelity, when it is a non-quantifiable, perceptual quality.
Also, there is no such rule of thumb regarding it, but what is known is that in certain games, like CoD: Ghosts, due to certain choices the XB1 is able to pump less than half the pixels that the PS4 can.
If you believe in the cloud for that kind of gaming, Sony bought Gaikai, and it is a project that started sooner than the MS counterpart - heck, the MS counterpart hasn't even been named.
How do the noise levels of the consoles compare? According to other reviews they both seem to be fairly quiet, which is great, but is there a noticeable difference between them?
I'm wondering the same - I've seen lots of people point out the fact that the Xbox One is designed to be bigger, but more cool and quiet. However, I haven't seen any confirmation that it is in fact more quiet than the PS4.
Yes, the standby power of the XBone and PS4 bothers me too. I often leave my TV and Consoles untouched for weeks, so the only sensible thing is to put them on a Master/Slave powerstrip which cuts them off the grid when the TV isn’t on.
Of course that defeats the entire standby background downloads, but in the case of Sony, I have to wonder why they put a whole proprietary ARM SoC* (with 2GB of DDR3 RAM) on the board for "low power standby and background downloads" and then end up with unbelievable 70W figures.
This is essentially a mobile phone without a display, I don’t think it should use more than 3 Watt idle with the HD spun down.
My only explanation is that they couldn't get the ARM software/OS side of things wrapped up in time for the launch, so for now they use the x86 CPU for background downloads even though it was never intended to do that.
Correction, the SoC only has access to 2Gb (= 256 MB) of DDR3 RAM.
However, I found a document that seems to confirm that the ARM Subsystem did not work as planned and Sony currently uses the APU for all standby/background tasks.
Maybe somebody who is fluent in Japanese could give us a short abstract of the part that talks about the subsystem.
Hey Anand, did you see the Wii U GPU die shots? How many shaders do you think are in there? I think it's almost certainly 160 at this point, but there are a few holdouts saying 320 which seems impossible with the shader config/size. They are basing that off the clusters being a bit bigger than normal shader cores, but that could be down to process optimization.
There is a guy on NeoGAF with access to official Wii U documentation that more or less confirms 160 shaders, even though it's never explicitly stated (for example, it refers to 32 ALUs, which would be VLIW5 in this case, meaning 32 x 5 = 160 shaders). Combine that with the die evidence and it's clear. 8 TMUs and 8 ROPs also.
Some people will never accept it, but there's no real doubt for me personally; it's 160.
A 1:1 ROP to TMU ratio is quite strange. TMUs are usually double. Not doubling it is another weird limitation. Nintendo sure does have a lot of head scratchers in there.
For the first time since the Atari 2600, I actually want a particular console: the Xbox One. I don't even play console games. I want it for the voice control for the apps. I watch Hulu Plus and Netflix instead of TV, and this would make it easier to watch than using a Win7 PC like I do right now. It also uses less power than the PC I use right now. In addition, I would like to have the Skype app so I could talk to certain family members face to face, sort of, who are too far away for me to visit. The games don't attract me to the console so much as the other uses.
I agree with you. The $500 price doesn't even scare me that much, because my young girls and wife will probably like the casual Kinect games.
However, Microsoft has made a HUGE mistake with requiring a $60 yearly subscription. This isn't 2008 anymore. I can get a Roku streamer for $50 that will play Netflix and Hulu for years to come. Kinect really appeals to the casual gamer, and I'd get one myself for the streaming and the occasional console game session, but the $60-a-year charge that can't be canceled easily (as I found on my 360) makes the XO a non-starter for me.
Hubb1e, I hear you, but how do they make up for the cost of the console? How much do you think a sensor like the Kinect 2 would cost on the PC? How do they continue to make money to make the network better, so things like voice recognition improve, and to make the investments to get the network closer to the end user to reduce latency? I'm willing to pay $60/year (one night out with the family at the movies/dinner) to get a better experience.
Couldn't you get a $35 Chromecast dongle for your TV? Or does your TV not have a USB port? It just seems so odd shelling out $500 for watching TV. Heck, you could probably spend $500 and get a fairly decent smart TV with Skype, Hulu, Netflix, and Amazon Prime!
Seems like voice command is the only reason he can't do something like that, or just continue using his Win7 PC or an XBMC. I'm still leaning toward making a new APU based HTPC for XBMC myself.
So, based on the numbers shown here, it looks like the PS4's GPU is roughly on par with a Radeon HD 7850 (more shaders, but slightly lower clock). Meanwhile, the XB1's GPU is considerably weaker, with performance falling somewhere between a 7770 and 7790. Considering that this is a game console we're talking about (notwithstanding Microsoft's attempt to position it as a do-everything set-top box), that's going to hurt the XB1 a lot.
I just don't see any real advantage to the *consumer* in Microsoft's design decisions here, regardless of supply chain considerations, and I think Anandtech should have been more pro-active in calling them out on this.
The right question to ask is whether both cards can do 1080p gaming. Remember, these aren't PCs where people are running games at much higher resolutions than 1920x1080 on multiple monitors.
Take a 7850 and a 7770, put them next to each other with the frame rate locked to 60 fps, sit back 6 feet and play an FPS. Tell me which is which. Maybe a 5% difference in visual fidelity.
Lol, no. By the way, what will you do if a game is heavy enough that it only runs at 720p30 on the PS4 - at what resolution will you run it on the XB1? ...Yep, it will be noticeably different.
With the PS4 offering up such a more powerful system, the argument turned to the Xbox One's eSRAM and "cloud power" to equalize things. Even with Microsoft boosting clocks, the Xbox One simply does not deliver gameplay graphics the way the PS4 has now been demonstrated to do.
The PS4 graphics look much better. In COD: Ghosts it almost looks like the PS4 is a half-generation ahead of the Xbox One. This actually makes sense, with the PS4 offering 50% more GPU cores and 100% more ROPs.
Considering the PS4 is $100 cheaper, and with the bundled Kinect being a non-starter, the decision seems easy.
The troubling piece is that both systems are dropping features that previous-gen systems had, like Blu-ray 3D.
heh, half generation? Do you have visual problems?
Looking at all the Anand evidence, pics and YouTube videos, you're quibbling over a 1% visual difference, seriously. It's shocking how little difference there is in COD for example, and that's a full 720 vs 1080 split! I expect in the future the Xone will close that gap to 900 vs 1080 and the like.
I would say even the average *gamer* won't be able to tell much difference, let alone your mom.
Hell, half the time it's hard to spot much difference between the "current" and "next" gen versions at a glance, let alone between the PS4/Xone versions.
I'd say that, sad as it is, MS won that war. Their box will be perceived as "good enough". I've already seen reviews today touting Forza 5 as the best looking next gen title on any console, and the like.
All you really need is ports. Mult plat devs are already showing all effects and textures will be the same, the only difference might be resolution (even then games like NFS Rivals and NBA 2K are 1080P on Xone).
Then you'll get to exclusives, where PS4 could stretch it's lead if it has one. However these are the vast vast minority of games (and even then I'd argue early exclusives prove nothing)
I hate what Ms did going low power, it was stupid. But they'll probably get away with it because, Sony.
You trolling? You are the visually impaired if you don't see the difference! Just look at the screenshots and if you have a low resolution screen zoom them in and see the difference. The difference is like playing a game on very high settings(ps4) to medium(xbone) on PC.
"MS won that war. Their box will be perceived as "good enough"." hehehehe you're an obvious troll or a blind fanboy, no one says that the loser won a battle because he was good enough
You say the Forza 5 is the best looking next gen title, then you go on talking about ps4 exclusives prove nothing?
The actual graphics are not the top priority, xbone could have the same graphics as the ps4 but the most important thing is to keep the framerates above and atleast 60 at all times.
You and I must have watched different videos. There is a pronounced "shimmering" effect on the Xbox One, caused by weaker anti-aliasing. It's far more distracting than a mere 1%. In every video the PS4 image looks more solid and consistent. I'm less than an average gamer and I can see the difference immediately.
Microsoft simply didn't "bring it" this time, and when you're in a tough competitive situation like game consoles you really can't afford not to. I really don't want to buy a "good enough" console. Thank you, but no thanks.
I really didn't see much difference between the two. If I tried really hard I could see some more detail on the PS4, and it had a little less "shimmering" effect. In actual use on a standard 50" TV, sitting the normal 8-10 feet away, I doubt there will be much difference. Shit, most people don't even realize their TV is set to stretch 4:3 content, and they have it set to torch mode because the "colors pop" better. It's probably going to come down to price and Kinect, and in this case an extra $100 is a lot of extra money to pay. $449 would have been a better price, but we'll see, since there is plenty of time for MS to lower prices for their console after early adopters have paid the premium price.
Yes, you usually win a war with weaker hardware, bundled with generally unwanted accessories, with pre-orders significantly worse than the competitor's, even on local US turf. /s
Here in NZ, all the chains I have checked have the PS4 pre-sold until late January to mid-February. Coincidentally, every shop tried to sell me an XO instead. "We have plenty of those," they said.
Great win for XO. They will own shop shelves in the next 2 - 3 months, at least ;)
The weaker console almost always wins the war. Sega always had a hardware edge on Nintendo. Same with everything vs Gameboy. PS1 vs Dreamcast? Wii vs PS3 and 360. DS vs Vita.
He's going in the right direction but misses the real reason why.
Actually, on Digital Foundry, MS admitted that the usable GPU bandwidth in real-world scenarios was 140-150GB/s, while the developers of PS4 games have reported usable bandwidths of ~170GB/s.
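For context, here is a rough back-of-the-envelope sketch of the theoretical peaks those "usable" figures are being measured against. The bus widths, transfer rates and eSRAM numbers below are the commonly reported launch specs and MS-quoted figures, so treat them as assumptions rather than measurements:

```python
# Rough theoretical peak bandwidth sketch using commonly reported launch specs.
def peak_gb_s(bus_width_bits, transfer_rate_mt_s):
    """Peak bandwidth in GB/s = bus width in bytes * transfers per second."""
    return (bus_width_bits / 8) * transfer_rate_mt_s / 1000

ps4_gddr5 = peak_gb_s(256, 5500)   # ~176 GB/s theoretical, ~170 GB/s reported usable
xb1_ddr3  = peak_gb_s(256, 2133)   # ~68 GB/s theoretical
xb1_esram = 109                    # GB/s per direction (MS-quoted), ~204 GB/s combined peak

print(f"PS4 GDDR5 peak : {ps4_gddr5:.0f} GB/s")
print(f"XB1 DDR3 peak  : {xb1_ddr3:.0f} GB/s")
print(f"XB1 eSRAM      : ~{xb1_esram} GB/s per direction, ~204 GB/s combined peak")
```

On those numbers, the 140-150GB/s "real world" eSRAM figure sits well below the combined peak, which is presumably the point being made above.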
The 9% GPU boost is useful until you remember that you need to set aside power for Snap, and that you are running three OSes.
SNES vs. Genesis might be debatable, but not N64 vs. PSone? What planet do you live on? The PSone was crap. The only thing that made it what it became was the use of CDs and Final Fantasy 7!
The question is: Why? And the answer probably isn't "it failed because it was the best hardware."
This is the first generation where social lock-in is going to affect purchase decisions right from the start... Most people will end up buying the console their friends have so they can do multiplayer. Since both consoles are going to sell out for the next few months the question this time around might be: Who can make them faster?
Total rubbish. If you could mimic the controllers or use a third party identical controller and do a blind test most people would be unable to detect any graphical differences. Most of it is pixel peeping where you take snapshots and compare.
It's all nonsense, either platform will play games that look roughly the same - Ryse is said to be probably the best _looking_ game on either platform, and it's a One game.
Sorry - this line of thinking of yours is a fail. Game quality will depend on the developers, not slight differences in peak performance.
This generation is less about hardware and more about software - and Microsoft is _miles_ ahead of Sony as a software company.
I wonder, if you did a double-blind test, whether anyone could pick the PS4 over the Xbox One. Maybe AnandTech should run that test. Hell, add the Wii U in there too. I don't think people would like what they see. The human eye is designed to see contrast and frame rate over resolution.
Those idle power consumption numbers are awful, especially when it's supposed to have low-power Jaguar CPUs on board. Consoles are really pieces of junk.
That's not really the best comparison, though. Kabini, which uses the same Jaguar cores as the PS4 and XB1, has very good power consumption figures at both idle and load. AMD's mid-range GPUs like the 7790 and 7850 equal or beat Nvidia's solutions in terms of performance/watt.
Bulldozer was an inefficient design, no doubt about it. Piledriver was a bit better and Steamroller should be better still. But none of that is being used here.
I really think this is a case of MS and Sony failing to add the necessary code to take advantage of the silicon. I think they had so many things to do to get these systems working that idle power consumption fell into the "let's do that later" category, which greatly simplifies everything from the initial coding of the OS to testing and validation. Anand thought that maybe the silicon for turning off cores wasn't there. I doubt that, and I think it will be coming with a patch in the 3-12 month timeframe.
Sorry for the offtopic, Anand, but since you mentioned cutting the cord a few years ago... care to share with us your avenue of choice (as in streaming services, set-top boxes and whatnot)?
Pretty sure Webkit has a multi-threaded JS engine. And if the XBone restricts CPU time for apps along with core counts, that could explain some more of it.
Plus, the Xbox is running Internet Explorer, which tends to lose all the JavaScript benchmarks. It's not likely important anyway; JavaScript benchmarks don't say much about real-world browsing.
It may be that MS has simply decided that two cores is enough CPU horsepower to run all OS functions and doesn't even bother letting the OS touch any more cores, even outside of a game. Two Jaguar cores at 1.75 GHz really isn't half bad, so it could make sense.
Microsoft has stated that there are two standby modes: one in which Kinect is listening for the command "Xbox On", and another where you turn that feature off. If you turn the "Xbox On" feature off, they have stated that standby power consumption drops to just 0.5W (although given that they also said it burns just 7W with the feature turned on, I have to wonder).
Could you test the power consumption with the "Xbox On" feature turned off?
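To put those claimed standby figures in perspective, a quick annual-cost sketch; the 12 cents/kWh electricity rate is just an assumed round number for a US household, not anything from the article:

```python
# Annual energy use and cost of an always-plugged-in console in standby,
# at an assumed electricity rate of 12 cents per kWh.
RATE_PER_KWH = 0.12  # USD, assumed

def standby_per_year(watts, rate=RATE_PER_KWH):
    kwh = watts * 24 * 365 / 1000
    return kwh, kwh * rate

for label, watts in [('"Xbox On" listening mode (7 W claimed)', 7.0),
                     ('feature disabled (0.5 W claimed)', 0.5)]:
    kwh, dollars = standby_per_year(watts)
    print(f"{label}: {kwh:.1f} kWh/year, about ${dollars:.2f}/year")
```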
Considering the internet browser is not an important component of a console, whereas it's hugely important on a smartphone, it's pretty understandable, really.
Microsoft is just beyond stupid. It's nothing to produce GDDR5. It costs basically the same amount of money to produce 50 million GDDR5 chips as 50 million DDR3 chips. That is the whole point of making a gaming console in the first place: you get massive volume discounts on all your parts. Only a fool would buy an Xbox; there is absolutely no reason to. It's not like Microsoft is going to have that much exclusivity.
The predicted price and availability of GDDR5 was highly questionable at the time MSFT needed to commit to the decision. Sony gambled and it happened to work out for them. A 6 month to 1 year delay or an extra $100-200 for the console would have been devastating if it had gone the other way, no?
Sony's gamble paid off an now MSFT looks foolish, which is a shame for all of us.
I had also heard that Sony had decided to go with 4GB of GDDR5 but chose to double that when MS announced 8GB. Half the RAM in the Sony box would have hurt its ability to take advantage of its better hardware.
I thought the image quality differences would be more subtle. But watching the COD: Ghosts video side-by-side, you can see there is a more pronounced "shimmering" in the image on the Xbox One. Microsoft screwed up - I didn't spend $1500 on an HDTV to look at crappy images. For me the choice is clear - the PS4 wins this round. If enough people avoid the Xbox One, next year there won't be exclusive titles to miss out on.
Thanks for confirming my suspicions that it's likely going to be a good 12-24 months before we'll need to buy one of these new systems. Call me old fashioned, but I like to wait for that "killer" app before I upgrade to new hardware.
Funny that you point out it's not in depth, but then actually go WAY more in depth than anyone else yet has! Great article. I'm shocked PS4 has 2x the ROPS. I was assuming either 0 or 50% more.
I hate that the hard drive on One is sealed...wish you could disable the 5 or 15 minute video caching too, for noise and hard drive longevity reasons. Makes me wonder if throwing an SSD in a PS4 even makes sense.
I forget where but I read an article that stated for 1080p/60fps at the GPU's clock they only needed 2 more ROPs, or 18 total, but you couldn't selectively choose to add 2 more - it was 16 or 32.
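For anyone who wants to sanity-check that kind of claim, here is a very rough theoretical pixel-fill sketch using the widely reported ROP counts and GPU clocks (853MHz/16 ROPs for the Xbox One, 800MHz/32 ROPs for the PS4). It assumes one pixel per ROP per clock and ignores blending, overdraw and bandwidth, so it's an upper bound, not a real-world figure:

```python
# Theoretical peak ROP fill rate vs. the raw pixel output demand of 1080p60.
def peak_fill_gpix_s(rops, clock_mhz):
    # Best case: one pixel written per ROP per clock cycle.
    return rops * clock_mhz * 1e6 / 1e9

xb1 = peak_fill_gpix_s(16, 853)         # ~13.6 Gpix/s
ps4 = peak_fill_gpix_s(32, 800)         # ~25.6 Gpix/s
final_1080p60 = 1920 * 1080 * 60 / 1e9  # ~0.12 Gpix/s of *final* output

print(f"XB1 peak fill ~{xb1:.1f} Gpix/s, PS4 peak fill ~{ps4:.1f} Gpix/s")
print(f"1080p60 final output is only ~{final_1080p60:.2f} Gpix/s; the rest of the budget")
print("gets eaten by overdraw, multiple render passes, blending and AA resolves.")
```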
Quick Q I haven't seen addressed - are XBO games all compatible with the 1?
I have to admit I am a bit surprised by the relatively weak hardware in both new consoles. In previous gens they were roughly on par with high-end gaming PCs; here it seems like they are more like a mainstream rig at best. If these go as long between generations again, I see bad things happening: A) console games will fall far behind PCs graphically, and B) PCs will be hampered by console graphics on multi-plat titles.
The best part of all this for PC gaming seems to be the sudden and very welcome arrival of x64-capable executables for games that use more than 3 GB of RAM on a regular basis. I didn't expect the transition to happen so suddenly, but then bam, there we were with BF4 and Call of Duty with x64-capable executables. And Watch Dogs, whenever it arrives.
That these gaming systems are already well surpassed by mid-range gaming PC's is also pretty nice in terms of ensuring ports run reasonably well for the near term. Kinda sucks for those buying into these systems for $500 or $400 (or both!) since you could easily build a PC out of the one you likely already own that would surpass them. This has never been more true than this generation and never been so easy to verify, either, but it's a nice boon for those of us who are PC gamers already.
It also opens the door for Steam to make their own value argument with SteamOS and Steam Machines.
I agree. Now if only they let Call of Duty PC gamers play against their console counterparts in multiplayer, aaahahaha, no mouse, sorry your soldier has a hangover today and has to turn slowly.
"Support for external storage is apparently on its way, as the Xbox One doesn’t allow end user upgrades of the internal 500GB hard drive. I have to say that I prefer Sony’s stance on this one."
What IS Sony's stance on this one? I have no idea, haven't heard anything.
My understanding is that the HDD in the PS4 is user-replaceable, but there's no external storage at the moment. Whether they'll introduce external storage in the future, and whether there's a limit to HDD size (with the current firmware), I don't know.
What the..?! The Xbox and PS4 just got released and are already reviewed? I've come to this website every day for a month looking for the MacBook Retina 15-inch review as I hold off on buying one, yet AnandTech never posts any review? This is weird, don't they review MacBooks all the time?
I would expect that review sometime later this week. Anand admitted somewhere that he was putting the 15" Mac at the bottom of the pile of things to review.
Now I only need Klug to tell me how great my decision to buy a Nexus 5 was....
"If Sony’s price tag didn’t nerf the PS3 last round, it’s entirely possible that Microsoft’s Kinect bundle and resulting price hike won’t do the same for the Xbox One this time."
Wat? Not sure what you're trying to say here. Sony's price tag last gen hurt their sales. It's entirely possible that Microsoft's high price tag won't hurt their sales this round. That's my guess, but it's worded very strangely. Also, if that is what you're saying I don't agree. The fact that it's an extra $100 for something the vast majority of people consider not only useless but intrusive can only hurt them further.
For my part I'm done with Microsoft. They fucked up Windows 8 to the point that it's unusable. They fucked up Xbox Live by banning everyone who has fun. (trash talks) and they built a console with sub-par hardware in the hopes that a fast cache would compensate. We won't notice the inferiority early on, but in 2-4 years it will become obvious that the PS4 is vastly superior.
I'm really a PC gamer now, and don't expect to have enough time to also be into consoles. But if that does happen down the road, I'll be going Sony-only for the first time ever. The only other console of theirs I have is the PS2, to play a PS1 game I like and a handful of PS2 exclusives I wanted to try, Shadow of the Colossus being the primary one. But I got that, and games/accessories, off eBay for $70.
"I have to say that I prefer Sony’s stance on this one." -> Seeing how I don't really follow the console market apart from superficial reading of some news, what is Sony's stand here? :)
i've read the ars review as well as others and have gathered some interesting info. from what i've gathered, everything is "rosy..." without quibbling on aesthetics and personal/subjective choices, such as the design of the console/controller and side-by-side video game comparisons, which are hard to decipher on youtube anyway--i am really against the grain... firmly against the grain of sony and/or microsoft adopting such measly hardware inside these consoles. i can forgo a big/fat controller and box (XBO) if the hardware, i feel, has some muscle in it. or hide the more elegant PS4 under a shelf and behind some stuff to make it quieter and/or add a fan to keep it cool and from bricking, if the inside of it had some oomph! i mean, come on, the jaguar cpu is a tablet cpu and the gpus are like cheap $100 gpus. not only cheap, but these gpus aren't even the recently released gpus from AMD found in the R9 290X.
the pc people seem to be praising this shift to inferior/cheap hardware solutions as if it is a "good thing" for the industry just because the architecture is the same as their $3000 pc's. give me a break.
please explain to me why this is good for games, for the gamers and for the industry, if the tech is not moving forward but semi backwards?
in 2006, ppl complained that the PS3 cost too much. well, the tech in the PS3 at that time didn't exist! $600 wasn't too much, and add in blu-ray, which at that time was an infant!!! an infant! now, the PS4 is a more agreeable price point, but the inside of the machine is parts from a year ago. why is this good?
developers are saying they can squeeze more out of the consoles than their pc counterparts, as if to say, "Yes! these consoles don't have the oomph of higher end pc's, but games will run better on them anyway because we can optimize the games better on consoles... yada-yada-yada."
the most upsetting part about all of this is that the games, visually and innovatively speaking, will suffer. and yes, the graphics are better and the new consoles are more powerful than their predecessors, but they're not that much more powerful. GTA V for ps4 will look better, yes. BF4 looks almost like a high end PC--yes. but this is probably where it will end, graphically speaking. i mean, the graphical ceiling will be hit a lot quicker this gen than last gen, i think. so, by next year, the best graphical masterpiece possible for either console will already be here. correct me if i'm wrong. and if developers can't squeeze everything out of these consoles by next year, then something is wrong with the developers or these consoles or whatever, since developers should already know how to develop for an x86 console, since it is the same architecture as PCs, which have been around since 1980 or whatever. i just don't see any games in the future that will be mind blowing, and that is my greatest fear.
but, really, i'm just a little upset that... 1) they went with x86. 2) the x86 they went with isn't that powerful
What would you say would have been a better alternative to x86? I personally find the change to x86 fine, but the gimping of the hardware... well i definitely agree with you there.
well, glad you asked. if i were to build my dream console, i would build it to match and exceed the fastest intel/amd cpu out there in terms of raw performance for gaming/graphics applications. at least 8 cores, of course. and a full RISC cpu like the cell processor in the ps3--or rather its successor, or if that doesn't exist, i'd have them build a cpu from the ground up like what apple is doing with arm cpus. in this case, if it can't beat the fastest mainstream cpu from intel/amd in terms of raw performance, then i'd add more cores to it. so maybe a 16-core sony/arm RISC cpu that is 64-bit. i know RISC cpus are more scalable, so adding more cores will give it that edge in terms of raw performance. and then i would add 8GB of XDR3 RAM, which i think is in development and will be faster than GDDR5 (i think). this is for bandwidth and for future-proofing this system to meet its 6-10 year life cycle. the GPU would have to be discrete, and i'd probably ask nvidia instead of AMD to make one specifically for this dream console, since nvidia has better power-efficiency cards. this dream nvidia gpu would be like the upcoming maxwell gpu that is not even out yet. that is how advanced my dream console is. and even though it's not an x86 machine and is risc-based, developing for this dream console will be the same as developing for anything else. the 8GB of XDR3 ram is shared between the cpu and gpu, btw. what else am i missing? yeah, maybe the console will be bigger than the ps4, but it will be as sleek and have a higher price point. but you certainly can't say the inside of this dream console is just slapped together.
Oh, the sound chip is also discrete. Adds cost. But whatever.
Blu-ray standard.
The I/O of the machine is standard, so SATA 3 or whatever.
Wi-Fi is 802.11ac standard.
Maybe the price will be $899. But that is XDR3 RAM, a 16-core RISC sony/arm CPU, an nvidia Maxwell GPU, a dedicated sound card, Wi-Fi ac, and a 1TB 7200rpm HDD.
well, the price is a little steep, but the tech inside is state of the art, emerging, nonexistent technology as of right now. maybe the console wouldn't have made launch this year, but maybe next fall with all of the specs i mentioned. considering how fast apple updates their iphone hardware, and the buzz around ARM and MIPS getting back into the RISC cpu race, i don't think it's inconceivable that an electronics giant like sony, in partnership with ARM or MIPS, could have co-developed a fast, multi-core RISC cpu that can compete with a desktop intel i7 or a future AMD steamroller cpu. or maybe even samsung and sony, since samsung also makes cpus and they have a fab. i don't know. i am sort of pulling this out of my butt. but it's a dream console, after all. and my knowledge of the cpu marketplace is nonexistent, so i've got nothing to go by except google searches about these things.
someone also mentioned that a cpu is fundamentally different from a gpu, and they're right. a cpu isn't as fast as a gpu and a gpu isn't as fast as a cpu at certain tasks. but what bridges those gaps is a RISC cpu built from the ground up, sort of like the cell processor, but obviously more powerful, that can do cpu tasks well and gpu tasks well. my proposal of a maxwell gpu for this dream console is also important, since nvidia is incorporating an ARM chip in their upcoming 800 series of gpus to do what gpus can't do. so the maxwell version in this dream console could forgo that arm chip, because there are already 16 of those cores in the proposed RISC cpu of my dream console. my dream console is basically a video/graphics powerhouse where the cpu and the gpu talk to each other synchronously or asynchronously, but aren't dependent on each other. and the XDR3 memory controller feeding it is also part of this, to give it massive bandwidth. also, since the cpu and gpu are co-developed and built with this application in mind, the entire console will only pull around 200 watts at load. maybe less.
i know i'm dreaming, and it will never happen. well, it will eventually happen, but i was hoping it would happen sooner, and in a console. why? consoles are great platforms for divergent/emerging tech. or should be. sort of like what apple is doing with iphone and ipad hardware, but obviously much more powerful. much, much, much more powerful, since consoles don't have to be as small as an iphone or ipad.....
just wanted to add that what i'm proposing is a CPU and GPU that are both CPU and GPU, if that makes sense. so, theoretically, the cpu can be a gpu and the gpu can be a cpu, so it's like having a dual gpu setup such as found in pcs. and/or dual cpus, or possibly more.
Haha. You want to build a RISC CPU from the ground up to be more powerful than an Intel i7, use ram that isn't even available yet, and use a graphics core that hasn't been finished yet? I'm not saying that's impossible, but it would be more expensive than the whole Manhattan Project to build the first nuclear bomb.
x86 chips are already available and relatively fast, Jaguar chips are easily scalable to new processes, and DDR3 and GDDR5 RAM are already in full volume production. Graphics are just a matter of adding more blocks, and the minor differences in relative power consumption of AMD vs Nvidia are a moot point, as Nvidia is incapable of creating an APU with a decent CPU in it.
I love the idea of an APU in these boxes because it makes so much sense, but my ideal box would have had 7870-class graphics performance coupled with a 6-core CPU based on AMD's new Steamroller core running above 3GHz.
For RAM I would have taken the old approach they used to do with the 780G chipset and used 1GB of GDDR5 for the GPU coupled with 8GB of DDR3 accessible by either the GPU or the CPU.
Power consumption would have been similar to the XB360 on initial launch but they should have been able to build that within the size of this XBone.
wow. your dream console might not even be that much more powerful than what is already in the ps4. and your vram configuration is worse, since it only has 1GB of GDDR5.
i swung for the fences with my specs because it is all made up of dreams.
I wonder how the rise of DDR3 prices has affected MS? I'm sure they purchased contracts at fixed prices a while ago, but going forward it seems DDR3 prices aren't much better than GDDR5 right now. The cost savings may not have been worth it looking at the current marketplace.
i'm so tired of these companies putting a big cache on the chip's die to negate poorly chosen memory interfaces.
apple did it with the A7 in the iphone and ipad, and now Microsoft is doing it with the XBone.
Just spend the die space on a beefy memory interface and call it a day. Sure, the memory interface is going to take up more space on the chip, but that's better than wasting even MORE space on eSRAM/cache.
Apple could have just put in a beefy memory controller and called it a day; instead they put in 4 MB of cache, which takes tons of die space and serves as a stopgap solution.
Microsoft could have just gone with GDDR5 and called it a day, but instead went with DDR3 and wasted tons of die space on eSRAM.
sigh, just beef these things up and call it a day, especially if these are going to be on the market for the next 8 years.
While your complaints are valid, Apple will probably address them within 12 months with the A8, so I don't see it as that big of a problem for them. On the X1 side, they gambled wrong and we're kind of stuck with it until ~2020.
I wonder how much of the cross-platform difference is just that, due to time constraints, the Xbox didn't get optimized very well. Unfortunately it looks slightly harder to code for.
I'll be curious to see how this goes moving forward. Do games like Forza 5 also have the aliasing problems? Other reviews have just said that it looks great.
Also - Anand - you've outdone yourself. Your preview is better than most reviews I've seen.
Why is everyone abbreviating "box" and not "one"? Like XbOne or XbO or Xb1. And the capitalization. All I read when I see this is "X Bone". I'll just call it that now. I guess there are difficulties with confusion with the original Xbox? The Scion xB? lol. I have an xB and owners call the current model the xB2.
I don't care. Why should I? The only thing that goes on in my living room is playing games and watching TV. So even in the unlikely event that the Kinect camera is feeding somebody (NSA? Microsoft interns? Who exactly am I supposed to be afraid of again?) a 24/7 feed of my living room and somebody is actually looking at it, big whoop.
I'm not planning on purchasing either console, btw. Just irritated by the tin-foil hat brigade pretending it's reasonable to be scared by the Kinect.
Oh, and not to mention that if that is actually taking place, it'll be found out pretty quickly and there'll be a huge backlash against Microsoft. The huge potential for negative press and lost sales for absolutely no gain makes me pretty sure it's not going on, though.
Microsoft, Google, Sony, and any other corporation out there have absolutely zero right to intrude on my privacy, whether I am or am not doing anything "wrong." You, my friend, will not know what you've lost until it is truly gone.
I recently built a Steam box. With a 360 controller/wireless adapter and Steam Big Picture set to launch on startup, it's a surprisingly console-like experience. Works much better than I had expected, frankly. My motivation to plunk down cash for the new consoles is now very low.
Anand, just wondering if the Xbox One controller works with a Windows based PC (as per the 360 controller)? Would be great if you could try that out and let us know :)
The wireless XBOX 360 controller required a special USB receiver to work with a PC, and that took a few years to be released. I don't know if XBOX One controllers are compatible with the 360 wireless controller receiver or if a new one is required. I actually liked the wired XBOX 360 controller for certain PC games, and I'm curious to know if Microsoft will make wired XBOX One controllers.
There is a lot of discussion about the memory bandwidth issues, but what I want to know is how latency affects the performance picture. That SRAM latency might be an order of magnitude lower, even if the SRAM itself is small. What workloads are more latency-dependent, such that the Xbox design might have a performance advantage?
It is important to understand that GPUs work in a fundamentally different way to CPUs. The main difference when it comes to memory access is how they deal with latency.
CPUs require cache to hide memory access latency. If the required instructions/data are not in cache there is a large latency penalty and the CPU core sits there doing nothing useful for hundreds of clock cycles. For this reason CPU designers pay close attention to cache size and design to ensure that cache hit rates stay north of 99% (on any modern CPU).
GPUs do it differently. Any modern GPU has many thousands of threads in flight at once (even if it has, for example, only 512 shader cores). When a memory access is needed, it is queued up and attended to by the memory controller in a timely fashion, but there is still a latency of hundreds of clock cycles to consider. So what the GPU does is switch to a different group of threads and process those other threads while it waits for the memory access to complete.
In fact, whenever the needed data is not available, the GPU will switch thread groups so that it can continue to do useful work. If you consider that any given frame of a game contains millions of pixels, and that GPU calculations need to be performed for each and every pixel, then you can see how there would almost always be more threads waiting to switch over to. By switching threads instead of waiting and doing nothing, GPUs effectively hide memory latency very well. But they do it in a completely different way to a CPU.
Because a GPU has many thousands of threads in flight at once, and each thread group is likely at some point to require some data fetched from memory, the memory bandwidth becomes a much more important factor than memory latency. Latency can be hidden by switching thread groups, but bandwidth constraints limit the overall amount of data that can be processed by the GPU per frame.
This is, in a nutshell, why all modern pc graphics cards at the mid and high end use GDDR5 on a wide bus. Bandwidth is king for a GPU.
The Xbox One attempts to offset some of its apparent lack of memory bandwidth by storing frequently used buffers in eSRAM. The eSRAM has a fairly high effective bandwidth, but its size is small. It still remains to be seen how effectively it can be used by talented developers. But you should not worry about its latency. Latency is really not important to the GPU.
I hope this helps you to understand why everyone goes on and on about bandwidth. Sorry if it is a little long-winded.
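As a purely conceptual toy model of the thread-group switching described above (the wavefront counts, memory latency and instruction mix below are made-up illustrative numbers, nothing like real GPU scheduling granularity):

```python
# Toy model: a shader core keeps issuing ALU work from whichever wavefronts are ready,
# so a 300-cycle memory access only stalls it when *no* wavefront has work left.
from collections import deque

def alu_utilization(num_wavefronts, cycles=20000, mem_latency=300, work_per_load=10):
    ready = deque(range(num_wavefronts))
    waiting = {}          # wavefront id -> cycle at which its memory request completes
    busy = 0
    for cycle in range(cycles):
        for wf, done_at in list(waiting.items()):   # wake wavefronts whose data arrived
            if cycle >= done_at:
                ready.append(wf)
                del waiting[wf]
        if ready:
            wf = ready.popleft()                    # issue one ALU instruction
            busy += 1
            if busy % work_per_load == 0:           # this wavefront now needs memory
                waiting[wf] = cycle + mem_latency
            else:
                ready.append(wf)
        # else: nothing is ready, so the core stalls this cycle (latency no longer hidden)
    return busy / cycles

for n in (2, 8, 64):
    print(f"{n:3d} wavefronts in flight -> ALU busy {alu_utilization(n):.0%} of cycles")
```

With only a couple of wavefronts the core sits idle most of the time; with dozens in flight the same 300-cycle latency all but disappears, which is the bandwidth-over-latency argument in miniature.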
Anand, when MS initially talked about the Xbox One OS design from their description it certainly sounded like the Xbox OS (i.e. the gaming OS) was just a VM running on top of a hypervisor. Given that, then in theory that VM could be modified to be made runnable on say a Windows Desktop PC or potentially even a Tablet.
With one in hand now, is there anything that can be done to shed some light on that possibility?
To me the most intriguing aspect of XB1 is the OS if it truly is just a VM because that could open up some really interesting possibilities down the road.
This. Was the Xbox 360 on an x86 CPU? No. But the Xbone is. Therefore it seems logical to consider that if there is a possibility of somehow "extracting" the actual VM from the XBone, it could be made to run on a normal Windows PC with much less modification and hassle than an Xbox 360 VM, because there's no need to worry about the difference in architecture. Basically, I perceive that the biggest deterrent to making an "emulator" of the XBone (via a VM) is some form of software or hardware DRM. The Mac has a similar mechanism in Mac OS which will not let you install that OS on a regular PC because the regular PC doesn't have some extra chip that the boot code of the OS install disc looks for. As we all know, this was quite successfully cracked and Hackintoshes are plentiful. OK, so Microsoft is not Apple and they may come down hard on anyone releasing an XBone emulator, but that doesn't mean it can't be done. It would seem much easier to produce an emulator for a console that uses, basically, almost off-the-shelf parts.
Good lord, the Xbone falls short. The embedded SRAM is irrelevant; trading outright 3D strength for faster operations tied to a subsystem is a failing strategy dating back to the PSX and Sega Saturn.
Looks like PS4 wins not only in hardware specs, but graphics visuals. The only difference maker between the two seems to be game titles. I would have bought the Playstation 4 if Gran Turismo 6 was coming out for it but nope they released it for the PS3, bummer. I have Forza 2, 3, 4 for X360 and will not get Forza 5 after how Turn10 turned Forza 4 into a cash cow with DLC cars.
Exactly, it is huge failure on the MS side and I suspect many a game developer will eventually reveal just how limiting their decision has been. Overall for the two consoles that I would consider to be a modern investment of 3 to 5 years, these are pretty pathetic hardware examples. Current gen PC's are already way ahead and the difference will only continue to surpass these consoles.
Actually, what's wrong with you? It's pretty common knowledge that ROPs are huge consumers of memory bandwidth in a GPU, and with the Xbone having half of them, memory bandwidth becomes far less of an issue.
Less of an issue at a given performance level. Your performance becomes gated by the ROPs instead, so it's still a bloody stupid design decision for a "next gen" console.
Frankly, I'm disappointed in both of them in an age where PCs are moving to 2560x1440 as a standard, 120Hz, and G-Sync. These consoles are simply already dated, even more so than the Xbox 360 and PS3 were at release. Good on the upgrades, but I simply can't see buying one over a PC I can build for around $500. (To be fair, it would cost you closer to $700 if you buy pre-made, but I'll point out that almost everyone already has a PC. $500 for a PC and $400 for a console means spending more money, not less, for less capability; it only makes sense if you need two different pieces of hardware so one person in the family can use one while another uses something else.)
The only thing consoles offer is an existing community. If all your friends play on an Xbox or PlayStation, it is hard to buy a PC instead. However, that isn't a plus, it is a minus, because it sets apart gamers that want to play together. It polarizes those gamers that are emotionally attached to one or the other, and that is just bad for everyone. The good news is that Microsoft is talking about making it so PC players can play with Xbone players - but how is that going to affect the quality of the PC versions? Are they going to have to be capped in terms of game responsiveness and frame rates in order to level the playing field?
Don't get me wrong; I'm not bashing console players themselves. And I get the attraction to cool hardware, I'm even tempted a bit myself, just because "cool hardware", despite the limitations involved. And there's the whole playing-with-others thing; having both consoles would mean I didn't have to exclude people I want to game with. But I'd feel like I'd be supporting a part of gaming that I really believe is bad for gamers in this day and age, so I won't be buying a console.
(And, don't give me any freakin tired, old arguments about controllers and a "different experience". It simply is not true, you can use any console controller on a PC. There is absolutely, categorically nothing you can do on a console that you can't do on a PC, except connect with exclusive communities and play exclusive games. Exclusive communities are bad for gamers as a whole, exclusive games are bad for gamers, too. Crappy hardware is bad for everyone.)
Sorry about the emotion in the last paragraph, but it irritates me that some console players have to make up excuses for their decision. If you decide to buy a console, that's all good, but don't cut your nose off to spite yourself by purchasing one for reasons that simply aren't true.
That's very true, but then they've always lagged PC gaming. The closed proprietary system is a double-edged sword. SDKs designed for a specific system can eke every last drop out of said system, but then it's basically set in stone. I honestly don't think most people's eyes are attuned to the blur without G-Sync, but they will notice true 1080p gaming. They all (PC, PS4, Xbone) still serve their roles. The Xbone just happens to veer off into Netflix territory a little too hard.
I agree, I’m not as excited about consoles as I used to be. What I am really excited about is SteamOS.
Most reasonably priced gaming PCs have the potential to compete with this generation of consoles if Valve (somehow, magically) manages to bring down the overhead using Linux. Plus you get the community. And a controller that at least has the potential to work better than anything we have used so far (see the Civ5 on Steam Controller demo). And holiday sales. Upgradable hardware. Heck, I can even see myself dual-booting SteamOS on a MBP with the Steam Controller to play the latest and greatest games at almost equal quality to the "next-gen" consoles, but completely mobile.
First off, 1440p, G-Sync and 120 Hz are all technologies that cost $250+ for the monitor alone and really demand another $300 on the GPU, so they are not comparable to the PS4 or Xbox. Secondly, how can you build a gaming rig for $500? $100 is the Windows license. Another $100 gets a PSU, a case and a Blu-ray drive (but a really cheap case and PSU). Another $100 needs to be spent on a HDD and RAM. Now we are at $300 and don't have a mobo, CPU or GPU. A good CPU and CPU cooler costs $150, even for a cheaper CPU (with a stock cooler, the console would be much quieter than the desktop). At least $50 needs to be spent on a mobo. This leaves you with only $50 of your $500 budget for a GPU. As you can see, this leaves you with a system that underperforms the consoles. I would also argue that a $500 system needs to cheap out on components, leaving you with worse build quality than a console, which is more similar to a premium SFF PC (which costs a premium compared to full-sized). Also, this cost analysis doesn't include a monitor or peripherals, so if you don't have a PC or only have a laptop, that is at least another $150 (many more people have TVs, and fewer people have monitors sitting around now that laptops have been the majority of PC sales over the past five years).
PC gaming is superior, but as long as developers leave out the local multiplayer elements of their console counterpart, a console will always have a spot in my home. You know, gaming in the living room with actual friends. I'd hook up my gaming PC to my TV and get some controllers, but there are basically no PC games that offer any decent local multiplayer options.
What about the noise of both new consoles? Anand is not commenting on that in the article, but after my experience with a Xenon 360 this is really important to me.
It's funny how PC hardware reviews obsess over tiny differences in memory bandwidth, shader throughput and clock speeds, yet the PS4 having 40% greater shader throughput and 160% more memory bandwidth just doesn't seem to matter...
Did you read the article? It was pretty clear, and it even pointed out where these things make real-world differences. Maybe you thought they'd outright denounce the XB1 for it?
Those "obsessions" in the PC-sphere are academic exercises to underline the differences between otherwise very similar pieces of silicon. Good GPU reviews (and good PC builders) focus on actual game performance and overall experience, incl. power and noise.
And of course it matters that the PS4 has a better GPU. It's just that native 1080p vs upscaled 720p (+AA) isn't a world of difference when viewed from 8-10 feet away (don't take my word for it, try for yourself).
But like Anand states in the article, things might get interesting when PS4 devs use this extra power to do more than just bump up the res. I, for one, would trade 1080p for better effects @ 60fps.
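For what it's worth, the 40% / 160% figures quoted a few comments up roughly check out against the commonly reported launch specs, as the quick sketch below shows. Note that it deliberately counts only main memory for the Xbox One (no eSRAM), which is exactly the point people argue about:

```python
# Quick check of the quoted deltas using widely reported launch specs.
ps4_tflops = 1152 * 2 * 800e6 / 1e12    # 1152 shaders * 2 ops/clk (FMA) * 800 MHz ≈ 1.84
xb1_tflops = 768 * 2 * 853e6 / 1e12     # 768 shaders * 2 ops/clk * 853 MHz ≈ 1.31
ps4_bw, xb1_ddr3_bw = 176.0, 68.3       # GB/s, main memory only (eSRAM excluded)

print(f"Shader throughput: PS4 ~{ps4_tflops / xb1_tflops - 1:.0%} higher")    # ~41%
print(f"Main memory bandwidth: PS4 ~{ps4_bw / xb1_ddr3_bw - 1:.0%} higher")   # ~158%
```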
Great comparison of both products! Has anyone else heard of Why Remote though? I heard it has face and hand gesture recognition and apparently integrates with different types of social media and streaming apps. It seems pretty cool, I'm looking forward to seeing them at the upcoming CES convention!
Wow and I thought the Xbox One was just significantly handicapped in both memory bandwidth and GPU cores. Now I learn about this magical third thing called ROP where the Xbox One literally has only half that of the PS4 and it noticeably affects perceived resolution and is even lower than the standard AMD configuration for proper 1080p output. More nails in the Microsoft coffin.
If you want to talk exclusive games and variety, the PS4 has more than enough bald-headed space marine games and yet-another-space-marine-FPS-OMG-oversaturation to satiate any Halo desires, if you even had one to begin with. What you won't find on the Xbox One, however, is all the exclusive Japanese-made games, because let's face it, the Xbox is gonna sell poorly in Japan regardless, and that means no incentive to even make a half-assed port for the Xbox. This means all the JRPG fans and quirky Japanese adventure and indie games are not coming to Xbox, just like last gen.
And Microsoft just opened a Scroogled store selling more anti-Google paraphernalia, a continuation of their asinine and low-brow tactics and culture. They continue to be nothing but assholes day in and day out. They may have curbed their evil-corporation ambitions after the backlash from their Xbox mind-control "features", but they show no sign of letting up anywhere else. I didn't think I could care much about tech companies, as they are all in it for the money, but Microsoft continues to be the most morally reprehensible one around. A company not worth supporting or saving. To be shunned. It helps that all their recent products have been absolute out-of-touch flops, from Windows Phone to Windows RT and 8. Ditto the Xbox power grab.
Drama queen. This shit just doesn't matter in consoles unless you're a fanboy of one side or another. What matters is how good the game plays when it's done and in your hands.
Have to agree with you on the Japanese exclusives. They either take forever to get ported or don't get ported at all, unless it's a big title. I never got a PS3, but the PS4 seems like a good place to start and hopefully there'll be more indie stuff from Japan as well. I'm just waiting for a limited edition console to be released before getting one! Though using a Japanese PSN account is a bit of a pain sometimes.
However, I don't think the PS4 has that many bald headed space marines ;)
I agree, there's always someone crying about power costs. If the $5 a year in power is that big of a deal, then you probably shouldn't be spending $500 on an Xbox and $60 a year on Xbox Live.
Or alternatively they might care for the environment. Multiply all that "wasted" power by everyone and it adds up. This is doubly true when the apparent tasks this power is used on don't really require it.
I actually feel this generation is pretty bad for innovation. The PS3 and 360 made sense at the time. They were very fast machines for the money. Sony sold PS3s at a loss for years. MS, I dunno.
I feel like time has kind of caught up with that kind of console. What's the use of building a whole new OS when these machines are x86 and fast enough to run Linux? Why focus on all kinds of closed proprietary online features when all that has been done before - and better - by volunteers building freeware? You can build a PC that's comparable performance-wise and competitive on price with these machines if you rip some parts out of an old one and replace the PSU/mobo/CPU/gfx. Everyone can find a battered old PC that you can screw new parts into. People throw the things away if they get a little slow.
Then you can have the power of running what you want on the machine you paid for. Complete control. It'll save money in the long run.
It is a shame hearing about the reported use of SATAII and the lack of 802.11ac from both consoles.
Given that some game titles are over 40GB in size, something tells me that'll need to be addressed with the inevitable Xbox One Slim and PS4 Slim models that'll come out about 2-3 years from now.
Especially Microsoft's odd stance on not allowing the hard drive to be removed. The PS4's wireless limitations are an odd decision given the stigma of their PSN network being slow and unresponsive; any help from the hardware to be at current standards and future-proof would have been welcome.
Excluding China obviously, some of the fastest broadband infrastructures in the world (e.g. South Korea) are based in Asia. I would think that they would have at least taken Microsoft's route of making dual-band 802.11n connections available.
It's weird that even Sony's standard phones connect and download to the internet using WiFi faster than their flagship console. It makes little sense.
Disappointed I'll have to wait this gen out for 2-3 years. By then, hopefully the ripple effect of SteamOS, Steam Controller, G-Sync, Mantle, Oculus Rift, 4K gaming, and so on will be evident enough to even consider buying either console outside of exclusives.
*Any help from the hardware to at least accommodate common American wireless speeds and be a bit more future-proof would have helped improve the perception of PSN for Sony.
In the Gravity demo, 0:02 in, it was interesting to see the difference in the astronaut falling.
To me, it appeared that the 360 had higher contrast, but there were also other inconsistencies. A black bar ran across the leg of another astronaut in the scene -- I suspect this was debris -- but more notably the 360's face shield was blacked out, whereas the XB1 showed the astronaut's full face.
In terms of quality, due to the higher contrast, it actually seemed like the 360 won out there. However, as expressed, in all the other scenes despite brighter lighting, the XB1 had much better detail and noticeable edges -- the 360 was much softer and less defined.
What I don't understand is the naming convention. Why XB1? It's not the first Xbox.
All this talk about specs, and even the "higher-spec console loses the war" nonsense, is so stupid, just stop.
You guys here on AnandTech need to realize that you live in your own little bubble, and while you may know a lot about the consoles, the casual consumer market (which makes up most people) has different priorities. So why did Nintendo beat its competitors with the Wii while having horrible specs? The experience.
Yes, there is a performance difference between the PS4 and the XO, but what really matters is how the console feels and whether it does what people want it to do. This is where the Wii comes in (the Wii U was a flop because they actually went backwards in this regard). Most of the console market is made up of casual gamers. Casual gamers like to invite their friends over and have a LAN party or party game, play with their family (this includes younger audiences), watch movies together and play music at times. The Wii dominated the market because of its new control interface(s) that added the missing piece for this market; it was extremely versatile and made playing it all that more fun than the other consoles.
This is why Nintendo didn't really beef up the Wii U, they just added the extra power to allow for more advanced and precise gesture computation.
So why isn't the Wii U dominating again? Well, for starters, most people who have a Wii are satisfied with it and are not out to buy a new one; the Wii U doesn't add anything spectacular that would make the majority of its target market want to upgrade.
The reason the higher-spec console ended up losing is that when the company developed the console, they focused their resources on performance and as a result cut back on the usability and experience aspect. But that isn't necessarily always the case; it all depends on what the focus experience of the console is (the market) and how well polished that experience is.
If Microsoft wants to win the war it needs to pander to the needs of the casual market. Not to say it should copy Nintendo, but it has another market: the all-in-one market, that is to say, make the XO a future PVR, set-top box, and media/streaming centre. Replace the HTPC with a low-cost alternative. Most decent HTPCs fall into the $500-$700 range for those who want some light gaming too. The XO would absolutely destroy this market with the proper hardware and software support, being a console for mid-to-high-end gaming while still being a multimedia powerhouse that does a multitude of things. This includes voice recognition, a killer feature if done right.
If I could say "latest episode of the walking dead" or some other show and it worked, then gg Sony, you just got rolled.
@AnandTech: Fix your forum/comment software, not having an edit button is really annoying
The Wii dominated sales at first; they captured a market of casual gamers that otherwise wouldn't have bought a console. That market didn't buy many games (low attach rate), and they grew tired of the Wii, what with all the smartphone and Facebook games etc. Wii sales slumped, and in the end the X360 and PS3 each surpassed it in total by 2012. For us hardcore gamers who also are Nintendo fans, the Wii was bought, but it then left a bad taste in our mouths. The outstanding titles were few and far between, and the rest was shovelware. True motion control never really materialized in many games; most just made use of a "waggle" gimmick. The Wii U comes out, casual gamers have already moved on, and the hardcores are reluctant to jump into another gimmick "tablet" just for the Nintendo software.
Disclosure: As a big N fan, I bought a Wii U for the Nintendo first-party titles. Others like me are the only people buying this thing.
Thanks for the mini-review, much appreciated! Some interesting technical information no doubt.
Personally I'm more keen on the PS4, primarily due to having good experiences with Sony equipment in general as well as the price. We currently have a Sony BluRay player (the BDP-470S) and I'd have loved to replace it with a gaming-capable alternative (that also does Netflix) but alas that's unlikely unless Sony can squeeze in CD, MP3 and most importantly DLNA support in the machine.
Anyway, I'm also concerned about the sound levels of the machines as I have quite sensitive ears and I find even my current BluRay player to be something of a hair dryer when playing back discs. BD discs in particular.
That was an awesome mini review! One of the best reviews I've read about these new titans.
I'm really surprised that 8-year-old 360 hardware is as close as it is! A tad old now, but a great book to read on the old hardware is "The Race for a New Game Machine". This book really shows how MS pulled some fast ones on Sony and ended up with the better plan.
This time it looks like Sony really kept everything under wraps better and has at least a slight upper hand. There is no way MS can make its hardware better/faster at this point. Good enough? Maybe... time will tell.
It would make sense to provide a revised wireless adapter option in the future that plugs into the aux port (with a pass-through). It baffles me that it is not 5GHz wireless-N or the new AC standard.
"but after talking with Ryan Smith (AT’s Senior GPU Editor) I’m now wondering if memory bandwidth isn’t really the issue here." So what are you wondering after speaking with him? That it is the ROP's being halved?
It appears my xBox One does use HDMI-CEC. During the setup it tested my TV (Samsung) and cable box (DirecTV) and both are being controlled without an IR blaster. Perhaps this was added in a final update?
4K will not happen with this new generation of consoles. 4K video takes too much bandwidth for streaming. A new high capacity disc is still being developed to store 4K movies. The current consoles don't have enough power for 4K games. Be glad you have 1080p.
Hi Anand, can you help me out? I'm looking for a UPS to connect my PS4, Xbox One, PS3 and a 42" LCD TV. Which would be the proper one for all the power consumed? Thanks in advance!
elerick - Wednesday, November 20, 2013 - link
Thanks for the power consumption measurements. Could the Xbox One standby power be higher due to power on/off voice commands?
mutantsushi - Saturday, November 23, 2013 - link
If Xbone's resolution is considered satisfactory, I do wonder what PS4 could achieve at the same resolution by using the extra CU and GPGPU capacity for genuinely unique differences, rather than just a similar experience at a higher resolution (i.e. 1080p vs. 900p). If 900p upscaled is considered fine, what could be done if that extra horsepower were allocated elsewhere instead of to a higher pixel count?
errorr - Wednesday, November 20, 2013 - link
That is still ridiculous considering the Moto X and some other upcoming phones can do the same thing at orders of magnitude less power draw.
Teknobug - Wednesday, November 20, 2013 - link
I love my Moto X. The X8 "8-core" processor means each core has its own job: one core for always listening and one core for active notifications. Very easy on the battery, which is why it is one of the best battery-life phones right now.
uditrana - Thursday, November 21, 2013 - link
Do you even know what you are talking about?
blzd - Thursday, November 21, 2013 - link
Moto X is dual core. X8 is a co-processor setup, not eight cores.
errzone - Monday, November 25, 2013 - link
That's not entirely true either. The Moto X uses the Qualcomm MSM8960. The SoC is a dual-core processor with an Adreno 320 GPU, which has 4 cores. Adding the 2 co-processors equals 8; hence Motorola's "X8" marketing speak.
Roy2001 - Wednesday, December 11, 2013 - link
Kinect?
dillingerdan - Wednesday, November 20, 2013 - link
I know it might be due to constraints right now, but can you test power consumption when it's being used to pass TV through? I would like to know how viable an option that is, considering you need 3 pieces of electronics for it to run (TV, Xbox One and "cable box"). I'm guessing it's going to be similar to idle consumption... which, if it is, is WAY too high just to add a bit of voice control/fantasy sports overlay. Just a pity they didn't integrate a similar experience into the Media Centre part of Windows.
brickmaster32000 - Saturday, November 23, 2013 - link
Worrying about power consumption, especially just as it passes cable through, is really just looking for problems. The added cost from power is going to be ridiculously small. If it's really a concern, you would be better off ditching cable for things like Netflix, and if you are actually worried about how the power is generated, you would save more power by just watching one less show every day.
Spunjji - Tuesday, November 26, 2013 - link
I would say that it's not, really. They're advertising this as a useful feature, but if it's adding 50% to your overall TV-watching power draw, that's a pretty significant concern from various perspectives.
brickmaster32000 - Tuesday, November 26, 2013 - link
Let's be pessimistic and assume it actually did add 50% of the power draw, which it almost certainly doesn't. From what I can find, LCDs draw only around 70 watts, which means your Xbox would be drawing 35 watts. Even if you left your TV on 24/7 all year, at the average rate the EIA lists for this year of 12.07 cents per kilowatt-hour, that only comes to $37.01. That's a pathetic amount to be worrying about when you are about to drop $400-500 just on a system to play games and then $60 per game. This is not a reasonable concern; this is looking for faults to complain about.
tipoo - Wednesday, November 20, 2013 - link
Will you be doing one for the PS4, Anand? And will there be a Maxi to this Mini review? :P Thanks!
tipoo - Wednesday, November 20, 2013 - link
Also, any guess on the Wii U shader count based off the die shot?
http://www.conductunbecoming.ie/wp-content/uploads...
bill5 - Wednesday, November 20, 2013 - link
Protip: it's 160 shaders. Wii U fans don't like this, but it is fact.
tipoo - Wednesday, November 20, 2013 - link
I think so too. Also 8 ROPs and TMUs.
Wolfpup - Wednesday, November 20, 2013 - link
160 is known for sure? And I assume this is... what, AMD's DirectX 10 part? I can barely remember what the norm was for that...
tipoo - Wednesday, November 20, 2013 - link
Something like a Radeon 4600 series, yeah.
nathanddrews - Wednesday, November 20, 2013 - link
Nice article, thanks for posting!
Will you be going into any of the media streaming capabilities of the different platforms? I've heard that Sony has abandoned almost all DLNA abilities and doesn't even play 3-D Blu-ray discs? (WTF) Is Microsoft going that route as well or have they expanded their previous offerings? Being able to play MKV Blu-ray rips would be interesting...
Also, what's the deal with 4K and HDMI? As I understand it, the new consoles use HDMI 1.4a, so that means only 4K at 24Hz (30Hz max), so no one is going to be gaming at 4K, but it would allow for 4K movie downloads.
I've spent the last couple of years investing heavily into PC gaming (Steam, GOG, etc.) after a long stint of mostly console gaming. A lot of my friends who used to be exclusive console gamers have also made the switch recently. They're all getting tired of being locked into proprietary systems and the lack of customization. I've hooked a bunch of them up with $100 i3/C2Q computers on Craigslist and they toss in a GTX 670 or 7950 (whatever their budget allows) and they're having a blast getting maxed (or near-maxed) 1080p gaming with less money spent on hardware and games. Combined with XBMC, and Netflix via any browser they prefer, it's definitely a lot easier for non-enthusiasts to get into PC gaming now (thanks, Big Picture Mode!). Obviously, there's still a long way to go to get the level of UI smoothness/integration of a single console, but BPM actually does a pretty good job switching between XBMC, Hyperspin, Chrome, and all that.
SunLord - Wednesday, November 20, 2013 - link
A 4K movie is at least 100+GB, and that's just for one movie... No one sane is going to be downloading them, or at least not more than 2 a month.
nathanddrews - Wednesday, November 20, 2013 - link
Except they aren't 100+GB. The 4K movies Sony is offering through its service are 40-60GB for the video with a couple of audio tracks. You forget that most Blu-ray video files are in the 20-30GB range; only a handful even get close to 45GB. And that's using H.264, not H.265, which offers double the compression without sacrificing PQ.
Don't measure other people's sanity based upon your own. I download multiple 15-25GB games per month via Steam without even thinking about it. 4K video downloads are happening now and will likely continue with or without your blessing. :/
3DoubleD - Wednesday, November 20, 2013 - link
The thing is, 4K is roughly 4x the pixels of 1080p. Therefore, a 4K video at the appropriate bit-rate will be ~4x the size of the 1080p version. So yes, a 4K movie should be about 80-120 GB.
Now the scaling won't be perfect given that we aren't requiring 4x the audio, but the audio on 1080p BRs is a small portion relative to the video.
The 60 GB 4K video will probably be an improvement over a 30 GB 1080p video, but the reality is that it is a bitrate-starved version, sort of like 1080p Netflix vs 1080p BR.
The thing is, what format is available to deliver full-bitrate 4K? Quad-layer BRs? Probably not. Digital downloads... not with internet caps. Still, I'm in no rush; my 23 TB (and growing... always growing) media server would be quickly overwhelmed either way. Also, I just bought a Panasonic TC60ST60, so I'm locked into 1080p for the next 5 years to ride out this 4K transition until TVs are big enough or I can install a projector.
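A rough back-of-the-envelope sketch of that scaling argument, in Python (the 25 GB video and 5 GB audio figures are illustrative assumptions, not numbers from any actual release):

# Estimate a UHD file size by scaling the 1080p video portion by the pixel ratio.
PIXEL_RATIO = (3840 * 2160) / (1920 * 1080)  # = 4.0

def estimate_uhd_size_gb(video_1080p_gb, audio_gb):
    """Scale only the video; the audio tracks stay the same size."""
    return video_1080p_gb * PIXEL_RATIO + audio_gb

print(estimate_uhd_size_gb(25, 5))  # ~105 GB at the same bits per pixel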
nathanddrews - Wednesday, November 20, 2013 - link
When you say "full bitrate 4K", do you even know what you're saying? RED RAW? Uncompressed, you're talking many Gbps and several TBs of storage for a feature-length film. DCI 4K? Hundreds of GBs. Sony has delivered Blu-ray-level picture quality (no visible artifacts) at 4K under 60GB; it's real and it looks excellent. Is it 4K DCI? Of course not, but Blu-ray couldn't match 2K DCI either. There are factors beyond bit rate at play.
Simple arithmetic cannot be used to relate bit rate to picture quality, especially when using different codecs... or even the same codec! Using the same bit rate, launch DVDs look like junk compared to modern DVDs. Blu-ray discs today far outshine most launch Blu-ray discs at the same bit rate. There's more to it than just the bit rate.
3DoubleD - Wednesday, November 20, 2013 - link
You certainly know what I meant by "full bitrate" when I spent half my post describing what I meant. Certainly not uncompressed video; that would be ridiculous.
There is undoubtedly room for improvement using the same codec to achieve more efficient encoding of BR video. I've seen significant decreases in bitrate accomplished with negligible impact to image quality with H.264 encoding of BR video. That said, to this day these improvements rarely appear on production BR discs; they're instead made by videophile enthusiasts.
If what you're saying is that all production studios (not just Sony) have gotten their act together and are more efficiently encoding BR video, then that's great news! Now when ripping BRs I don't have to re-encode the video more efficiently because they were too lazy to do it in the first place!
If this is the case, then yes, 60 GB is probably sufficient to produce artifact-free UHD; however, this practice is contrary to the way BR production has been done since the beginning, and I'd be surprised if everyone follows suit. Yes, BR PQ/bitrate efficiency has been improving over the years, but not to the level of a completely artifact-free UHD feature-length movie in 60 GB.
Still, 60 GB is both too large for dual-layer BRs and far too large for the current state of the internet (with download caps). I applaud Sony for offering an option for the 4K enthusiast, but I'm still unclear as to what the long-term game plan will be. I assume a combination of maintaining efficient encoding practices and H.265 will enable them to squeeze UHD content onto a dual-layer BR? I hope (and would prefer) that viable download options appear, but that is mostly up to ISPs removing their download caps, unfortunately.
Overall, it's interesting, but still far from accessible. The current extreme effort required to get UHD content and the small benefit (unless you have a massive UHD projector setup) really limit the market. I'm saying this as someone who went to seemingly extreme lengths to get HD content (720p/1080p) prior to the existence of HD DVD and BR. Of course consumers blinded by marketing will still buy 60" UHD TVs and swear they can see the difference sitting 10-15+ ft away, but mass adoption is certainly years away. Higher-PQ display technology is far more interesting (to me).
Keshley - Wednesday, November 20, 2013 - link
You're assuming that all 45 GB of the data is video, when usually at least half of that is audio. Audio standards haven't changed - still DTS-MA, TrueHD, etc. Typically the actual video portion of a movie is around 10 GB, so we're talking closer to the 60GB number that was mentioned above.
3DoubleD - Wednesday, November 20, 2013 - link
The video portion of a BR is by far the bulk, and audio is certainly not half the data. For example, Man of Steel has 21.39 GB of video and the English 7.1 DTS-HD track is 5.57 GB. The entire BR is 39 GB; the remainder is a bunch of extra video features, plus some other DTS audio tracks for other languages. So keeping the bitrate-to-pixel ratio the same, 4x scaling gives us ~85 GB. To fit within 60GB, the video portion could only be 54.5GB to leave room for DTS-HD audio (English only). That would be ~64% of the bitrate/pixel for UHD compared to 1080p, assuming 4x scaling and the same codec and encoding efficiency. Perhaps in some cases you can get away with less bitrate per pixel given the sheer number of pixels, but it certainly seems on the bitrate-starved side to me. Even if video without noticeable artifacts is possible for a 2h20min movie (20 min of credits, so ~2h) like Man of Steel, a longer movie or one that is more difficult to encode without artifacts (grainy, dark) would struggle.
Keep in mind, that is JUST the core video/audio. We've thrown out all the other video that would normally come with a BR (which is fine by me; just give me the core video and audio and I'm happy). If they insist on keeping the extra features on a UHD BR release, they would certainly have to include them on a separate disc, since even an average-length movie would struggle to fit on a 50GB disc. To fit a BR with just video and English DTS-HD audio, we are talking 52% bitrate/pixel for UHD compared to 1080p. We would certainly need H.265 encoding in that case.
So I would probably concede that artifact-free UHD in only 60GB is possible for shorter films, or if you can get away with less bitrate/pixel due to the higher resolution. For longer films and/or difficult-to-encode films, I could see this going up towards 100 GB. Putting more effort into encoding efficiency and switching to H.265 will certainly be important steps towards making this possible.
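Spelling out that arithmetic in Python (using the Man of Steel figures quoted above; the 60 GB total is the assumed release size):

video_1080p_gb = 21.39   # Man of Steel main video stream
audio_gb = 5.57          # English 7.1 DTS-HD track

uhd_video_same_quality = video_1080p_gb * 4        # ~85.6 GB at the same bits per pixel
uhd_video_budget = 60 - audio_gb                   # ~54.4 GB left for video in a 60 GB release
ratio = uhd_video_budget / uhd_video_same_quality  # ~0.64, i.e. ~64% of the bits per pixel
print(round(uhd_video_same_quality, 1), round(uhd_video_budget, 1), round(ratio, 2))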
Kjella - Friday, November 22, 2013 - link
For what it's worth, BluRay is far beyond the sweet spot in bit rate. Take a UHD video clip, resize it to 1080p and compress it to BluRay size. Now compress the UHD video to BluRay size and watch them both on a UHDTV. The UHD clip will look far better than the 1080p clip; at 1080p the codec is resolution-starved. It has plenty of bandwidth but not enough resolution to make an optimal encoding. The other part is that if you have a BluRay disc, it doesn't hurt to use it. Pressing the disc costs the same if the video is 40GB total instead of 30GB and it could only get slightly better, while if you're streaming video it matters. Hell, even cartoons are often 20GB when you put them on a BluRay...
ydeer - Thursday, November 21, 2013 - link
I pay 29.99 for my 150/30 Mbit connection with 3 TB of traffic. My average download volume was around 450 GB over the last few months, and I sit close enough to my 60" screen (which isn't 4K - yet) to notice a difference between the two resolutions.
So yes, I would absolutely buy/rent 4K movies if Sony could offer them at a decent bitrate. I would even buy a PS4 for that sole purpose.
3DoubleD - Thursday, November 21, 2013 - link
You sit closer than 7 ft (4 ft optimal) to your 60" TV? This must be in a bedroom, office, or a tiny apartment. I live in what I consider a small apartment and I still sit 10 ft away. Perhaps you just put your couch in the center of the room so that it is really close to your TV? Either way, this is not most people's setup. Average seating distances are far greater than 7 ft. UHD TVs will need to be ~100+" for any benefit to be apparent to regular consumers.
You must also live in Europe or Asia to get an internet rate like that. I pay $45/mo for 45/4 Mbit with a 300GB cap - although it's unlimited between 2am - 8am, which I take full advantage of.
nathanddrews - Thursday, November 21, 2013 - link
We've got three rows of seating in our home theater. 115" 1080p projection with seating at approximately 7', 11', and 15'. I choose my seating positions based completely upon my audio configuration which is calibrated to the room's acoustic strengths, not upon one-size-fits-all visual acuity seating calculators. We generally sit in the front row when we don't have guests. It's immersive without being nauseating. Pixels are visible in the back row with good eyesight, so I'm anxiously awaiting a 4K upgrade, but probably not until laser projection becomes affordable.
We've got Comcast Business Class 50/10 for $99/mo. No cap and 50/10 are the guaranteed minimum speeds, unlike the residential service which has a cap (sort of) and sells you a max speed instead of a minimum. Comcast BC also has a $59 plan with no cap that is 12/3, but we wanted more speed. Still can't get gigabit fiber... :-(
3DoubleD - Friday, November 22, 2013 - link
Sweet setup! You definitely have the screen real estate and seating arrangement to take advantage of 4k. I'd like a similar setup when I move on from apartment style living to a house. Awesome Internet setup too. I could get unlimited as well, and did for a while, but I realized I could pay half as much and get away without hitting my cap by downloading during "happy hours", but that takes some planning.
I've been anxiously waiting for laser projection systems as well... Will they ever come or is it vaporware? Hopefully that is what my next TV purchase will be.
douglord - Thursday, November 21, 2013 - link
More BS from the anti-4K crowd. I'm sitting 8 feet from my TV right now. In fact it's difficult to sit further away in your standard apartment living room. For a 60-inch TV, 4K resolution is recommended for anything 8 feet or closer. For an 85-inch TV it's 11 feet. For a 100-inch screen it's 13 feet.
3DoubleD - Friday, November 22, 2013 - link
I'm hardly the anti-4K crowd. I think 4K is great, I just think it is only great when properly implemented. That means 4K TVs should start at 60", since only very few people sit close enough to begin to see the difference. 8 ft is about the optimal distance for 1080p at 60"; if you really want to take advantage of 4K you'd sit at 4 ft from a 60" set.
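For reference, the usual acuity math behind those distance figures, sketched in Python (it assumes 20/20 vision resolving roughly one arcminute per pixel and a 16:9 panel; real thresholds vary by person and content):

import math

def max_useful_distance_ft(diagonal_in, horizontal_pixels, aspect=16/9):
    """Farthest distance at which adjacent pixels are still ~1 arcminute apart."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_pitch_in = width_in / horizontal_pixels
    return pixel_pitch_in / math.radians(1 / 60) / 12  # inches -> feet

print(round(max_useful_distance_ft(60, 1920), 1))  # ~7.8 ft for 1080p at 60 inches
print(round(max_useful_distance_ft(60, 3840), 1))  # ~3.9 ft for 4K at 60 inches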
A5 - Wednesday, November 20, 2013 - link
PS3 didn't launch with DLNA support, either. I'm guessing it will get patched in at some point.
As for the rest of it, I'm guessing they're betting that 4K won't really catch on during the lifespan of these systems, which seems like a fairly safe bet to me.
Hubb1e - Wednesday, November 20, 2013 - link
And with only 16 ROPs, Microsoft has trouble even pushing 1080p gaming. It seems that they targeted 720p gaming, which is fine with me since most of the time TVs aren't big enough for this to matter. Microsoft did target 4K video though, and they designed the video decode blocks specifically to handle this load. It will likely be high-resolution but low-bitrate video, which in most cases is not an improvement over 1080p at a high bitrate.
piroroadkill - Wednesday, November 20, 2013 - link
2005? The consoles then being well spec'd? I disagree - they were mostly pretty great, but I recall very distinctly thinking 512MiB of RAM was pretty poor.
airmantharp - Wednesday, November 20, 2013 - link
It was horrific, and the effects of that decision still haunt us today.
bill5 - Wednesday, November 20, 2013 - link
Of course it matters. Here the Xbox One has an edge, with an awesome 204 GB/s of eSRAM bandwidth plus 68 GB/s of DDR bandwidth, for a total of 272 GB/s.
And yes, you can add them together, so don't even start that noob talk.
psychobriggsy - Wednesday, November 20, 2013 - link
Shame that it can only use that ESRAM bandwidth on a total of 1/256th of the system's memory... so you need to account for that in your sums. I.e., it's useless for most things except small data areas that are accessed a lot (framebuffer, z-buffer, etc).
smartypnt4 - Wednesday, November 20, 2013 - link
Except you just said it... You store what's used the most, and you get to realize a huge benefit from it. It's the same theory as a cache, but it gives programmers finer control over what gets stored there. Giving the developers the ability to choose what they want to put in the super low-latency, high bandwidth eSRAM is really a good idea too.
Computer architecture is mainly about making the common case fast, or in other words, making the things that are done the most the fastest operations in the system. In this case, accessing the z-buffer, etc. is done constantly, making it a good candidate for optimization via placing it in a lower latency, higher bandwidth storage space.
cupholder - Thursday, November 21, 2013 - link
LOL. No. The majority of things that actually affect quality and frame rate are going to be larger in size than the eSRAM. 192 GB/s for the ENTIRE 8GB vs. 204 GB/s for a dinky slice of it... It's painfully obvious what the bottlenecks will be. Oh, and I forgot the whole PS4 running a 7850 compared to the XB1's 7770... Oh, and the 8GB of RAM vs. 5 true GB of RAM (3 OSes take up 3GB).
With that said, get the console that your friends will play, or the one that has the games you want... Anyone pretending the XB1 is better in raw power is deluding themselves (it's hardly even close).
smartypnt4 - Friday, November 22, 2013 - link
I'm simply describing how the eSRAM should work, given that this should be a traditional PC architecture. Nowhere did I comment on which is the more powerful console. I really don't feel I'm qualified to say which is faster, but the GPU seems to indicate it's the PS4, as you rightly said.
Now, it is true that the PS4 has larger bandwidth to main memory. My point was that if the eSRAM has a good hit rate, let's say 80%, you'll see an effective speed of 0.8*204 = 163GB/s. This is a horrible measure, as it's just theoretically what you'll see, not accounting for overhead.
The other difference is that GDDR5's timings make it higher latency than traditional DDR3, and it will be an order of magnitude higher in latency than the eSRAM in the XB1. Now, that's not to say that it will make a big difference in games because memory access latency can be hidden by computing something else while you wait, but still. My point being that the XB1 likely won't be memory bandwidth bound. That was literally my only point. ROP/memory capacity/shader bound is a whole other topic that I'm not going to touch with a 10-foot pole without more results from actual games.
But yes, buy the console your friends play, or buy the one with the exclusives you want.
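A minimal sketch of that effective-bandwidth idea in Python. The comment above uses the simpler 0.8 x 204 figure; this version also counts the misses that fall back to DDR3. The hit rate is a made-up parameter, and real behaviour depends on scheduling and overhead:

def effective_bandwidth_gbps(hit_rate, fast=204.0, slow=68.0):
    """Weight each pool's peak bandwidth by the fraction of accesses it serves."""
    return hit_rate * fast + (1 - hit_rate) * slow

for hr in (0.5, 0.8, 0.95):
    print(hr, round(effective_bandwidth_gbps(hr), 1))
# 0.5 -> 136.0, 0.8 -> 176.8, 0.95 -> 197.2 GB/s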
rarson - Saturday, November 23, 2013 - link
It's not even close to a traditional PC architecture. I mean, it totally is, if you completely ignore the eSRAM and custom silicon on the die.
Test after test after test after test has shown that latency makes practically zero impact on performance, and that the increased speed and bandwidth of GDDR5 is much more important, at least when it comes to graphics (just compare any graphics card that has a DDR3 and GDDR5 variant). Latency isn't that much greater for GDDR5, anyway.
The eSRAM is only accessible via the GPU, so anything in it that the CPU needs has to be copied to DDR anyway. Further, in order to even use the eSRAM, you still have to put the data in there, which means it's coming from that slow-ass DDR3. The only way you'll get eSRAM bandwidth 80% of the time is if 80% of your RAM access is a static 32 MB of data. Obviously that's not going to be the majority of your graphics data, so you're not going to get anywhere near 80%.
The most important part here is that in order for anyone to actually use the eSRAM effectively, they're going to have to do the work. Sony's machine is probably going to be more developer-friendly because of this. I can see how the eSRAM could help, but I don't see how it could possibly alleviate the DDR3 bottleneck. All of this is probably a moot point anyway, since the eSRAM seems to be tailored more towards all the multimedia processing stuff (the custom bits on the SoC) and has to be carefully optimized for developers to even use it anyway (nobody is going to bother to do this on cross-platform games).
4thetimebeen - Saturday, November 23, 2013 - link
I'm sorry to burst your bubble, and I'm sorry to butt in, but you are wrong about the eSRAM only being available to the GPU. If you read the Digital Foundry interview with the Microsoft Xbox One architects and look at the Hot Chips diagram, IT SHOWS AND THEY SAID that the CPU has access to the eSRAM as well.
smartypnt4 - Monday, November 25, 2013 - link
Yes, latency has very little impact on graphics workloads due to the ability to hide the latency by doing other work. Which is exactly what I said in my comment, so I'm confused as to why you're bringing it up...
As far as the CPU getting access, I was under the impression that the XB1 and PS4 both have unified memory access, so the GPU and CPU share memory. If that's the case, then yes, the CPU does get access to the eSRAM.
As far as the hit rate on that eSRAM, if the developer optimizes properly, then they should be able to get significant benefits from it. Cross-platform games, as you rightly said, likely won't get optimized to use the eSRAM as effectively, so they won't realize much of a benefit.
And yes, you do incur a set of misses in the eSRAM corresponding to first accesses. That's assuming the XB1's prefetcher doesn't request the data from memory before you need it.
A nontrivial number of accesses from a GPU are indeed static. Things like the frame buffer and z-buffer are needed by every separate rendering thread, and hence may well be useful. 32MB is also a nontrivial amount when it comes to caching textures as well. Especially if the XB1 compresses the textures in memory and decodes them on the fly. If I recall correctly, that's actually how most textures are stored by GPUs anyway (compressed and then uncompressed on the fly as they're needed). I'm not saying that's definitely the case, because that's not how every GPU works, but still. 32MB is enough for the frame buffers at a minimum, so maybe that will help more than you think; maybe it will help far less than I think. It's incredibly difficult to tell how it will perform given that we know basically nothing about it.
To actually say if eSRAM sucks, we need to know how often you can hit in the eSRAM. To know that, we need to know lots of things we have no clue about: prefetcher performance, how the game is optimized to make use of the eSRAM, etc.
In general though, I do agree that the PS4 has more raw GPU horsepower and more raw memory bandwidth exposed to naive developers. My only point that I made was that the XB1 likely won't be that far off in memory bandwidth compared to the PS4 in games that properly optimize for the platform.
There's a whole other thing about CPUs being very latency sensitive, etc., that I won't go into because I don't know nearly enough about it, but I think there's going to be a gap in CPU performance as well because things that are optimized to work on the XB1's CPU aren't going to perform the same on the PS4's, especially if they're using the CPU to decompress textures (which is something the 360 did).
And with that, I reiterate: buy the console your friends buy or the one with the exclusives you want to play. Or if you're really into the Kinect or something.
Andromeduck - Wednesday, November 27, 2013 - link
163 GB/s and hogging the main memory bandwidth - that data doesn't just magically appear.
smartypnt4 - Wednesday, November 20, 2013 - link
Also, not saying the guy above you isn't an idiot for adding the two together. The effective rate Anand quotes takes into account approximately how often you go to the eSRAM vs. going all the way out to main memory. The dude above you doesn't get it.
bill5 - Wednesday, November 20, 2013 - link
Yes I do get it, dork. Small caches of high-speed memory are the norm in console design: PS2, GameCube, Wii, X360, Wii U, on and on.
The GPU can read from both pools at once, so technically they can be added, even if it's not exactly the same thing.
On peak bandwidth, the XOne definitely has an advantage over the PS4, especially on a per-FLOP basis, since it's feeding a weaker GPU to begin with.
Flunk - Wednesday, November 20, 2013 - link
That's intensely stupid - you're saying that because something is traditional it has to be better. That's a silly argument, and it's not even true: the consoles you mentioned all have embedded RAM, but all the others from the same generations don't.
At this point, arguing that the Xbox One is more powerful or even equivalently powerful is just trolling. The Xbox One and PS4 have very similar hardware; the PS4 just has more GPU units and a higher-performing memory subsystem.
4thetimebeen - Saturday, November 23, 2013 - link
Flunk, right now if you're saying the PS4 is more powerful, then obviously you're basing your info on the current spec sheet and not on the architectural design. What you don't understand is that what's underlying all that new architectural design - which has to be learned at the same time it's being used - will only improve exponentially in the future. The PS4 is pretty much a straightforward PC machine with a little mod in the CPU to take better advantage of the GPU, but otherwise it's a straightforward old design, or better said, a "current architecture GPU design". That's the reason many say it's easier to program for than the Xbox One. But right now that "weaker system that you so much swear and affirm the Xbox One is" has a couple of games, designed for it from the ground up, being claimed as the most technically advanced-looking games on the market right now - you can guess which I'm talking about - games that even Sony's in-house 1st-party title ("KSF") can't compete with in looks. I'm not saying it's not awesome looking, it actually is, but even compared to Crysis 3 it falls short. So the PS4 is supposed to be easier to develop for, supposed to be more powerful and called a supercomputer, but when looking for that power gap in 1st-party games that had the time to invest in its power, the "weaker system" with the hardest-to-develop architecture shows a couple of games that trounce what the "superior machine" was able to show. Hmmm, hopefully for you, time will tell and the games will tell the whole story!
Owls - Wednesday, November 20, 2013 - link
Calling people names? Haha. How utterly silly of you to say the two different RAM types can be added for a total of 274GB/s. Hey guys, it looks like I have 14,400 RPM hard drives now too!
smartypnt4 - Wednesday, November 20, 2013 - link
Traditional cache-based architectures rely on all requests being serviced by the cache. This is slightly different, though. I'd be wary of adding both together, as there's no evidence that the SoC is capable of simultaneously servicing requests to both main memory and the eSRAM in parallel. Microsoft's marketing machine adds them together, but the marketing team doesn't know what the hell it's talking about. I'd wait for someone to reverse engineer exactly how this thing works before saying one way or the other, I suppose.
It's entirely possible that Microsoft decided to let the eSRAM and main memory be accessed in parallel, but I kind of doubt it. There'd be so little return on the investment required to get that to work properly that it's not really worth the effort. I think it's far more likely that all memory requests get serviced as usual, but if the address is inside a certain range, the access is thrown at the eSRAM instead of the main memory. In this case, it'd be as dumb to add the two together as it would be to add cache bandwidth in a consumer processor like an i5/i7 to the bandwidth from main memory. But I don't know anything for sure, so I guess I can't say you don't get it (since no one currently knows how the memory controller is architected).
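As a toy illustration of that address-range idea in Python (the base address, window size, and routing are purely hypothetical - nothing here reflects how the actual Xbox One memory controller is wired):

ESRAM_BASE = 0x0000_0000          # hypothetical window
ESRAM_SIZE = 32 * 1024 * 1024     # 32 MB

def route_access(addr):
    """Steer a physical address to one pool or the other - never both at once."""
    if ESRAM_BASE <= addr < ESRAM_BASE + ESRAM_SIZE:
        return "eSRAM"
    return "DDR3"

print(route_access(0x0010_0000))  # eSRAM
print(route_access(0x4000_0000))  # DDR3

Under that model, each access is served by exactly one pool, which is why the peak numbers don't simply add.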
hoboville - Thursday, November 21, 2013 - link
smartypnt4's description of eSRAM is very much how typical cache works in a PC, such as L1, L2, L3. It should also be mentioned that L2 cache is almost always SRAM. Invariably, this architecture is just like typical CPU architecture, because that's what AMD Jaguar is. Calls to cache that aren't in the cache address range get forwarded to the SDRAM controller. There is no way Microsoft redesigned the memory controller. That would require changing the base architecture of the APU.
Parallel RAM access only exists in systems where there is more than one memory controller or the memory controller is spanned across multiple channels. People who start adding bandwidth together don't understand computer architectures. These APUs are based on existing x86 architectures, with some improvements (look up AMD Trinity). These APUs are not like the previous gen, which used IBM POWER cores, which are largely different.
rarson - Saturday, November 23, 2013 - link
But Microsoft's chip isn't an APU, it's an SoC. There's silicon on the chip that isn't at all part of the Jaguar architecture. The 32 MB of eSRAM is not L2, Jaguar only supports L2 up to 2 MB per four cores. So it's not "just like a typical CPU architecture."
What the hell does Trinity have to do with any of this? Jaguar has nothing to do with Trinity.
4thetimebeen - Saturday, November 23, 2013 - link
Actually - and I apologize for butting in again - if you read the Digital Foundry interview with the Microsoft Xbox One architects, they heavily modified that GPU and it is a DUAL PIPELINE GPU! So your theory is not really far from the truth!
The interview:
http://www.eurogamer.net/articles/digitalfoundry-t...
4thetimebeen - Saturday, November 23, 2013 - link
Plus, to add: the idea of adding the DDR3 to the eSRAM is kind of acceptable because, unlike the PS4's simple, straightforward design with one pool of GDDR5, you have 4 modules of DDR3 running at 60-65 GB/s, and each can be used for specific simultaneous requests. That makes it a lot more advanced, more like a future DDR4 way of behaving, and kills the bottleneck that people who don't understand think it has. It's a new tech, people, and it will take some time to learn its advantages, but it's not hard to program. It's a system designed to have fewer errors, be more effective, and perform way better than supposedly higher-FLOPS GPUs because it can achieve the same performance with fewer resources! Hope you guys can understand a little; not trying to offend anyone!
melgross - Wednesday, November 20, 2013 - link
You really don't understand this at all, do you?
fourthletter - Wednesday, November 20, 2013 - link
All the other consoles you mentioned (apart from the PS2) are based on IBM PowerPC chips; you are comparing their setup to x86 on the new consoles - silly boy.
CubesTheGamer - Wednesday, November 20, 2013 - link
You... you are a special kind of idiot.
The PS4 has a more powerful GPU since it has more compute cores. What this means is that while the Xbone has a slightly higher clock speed, there are more compute cores to do the work on the PS4, so the work can be split up and done faster. Also, while the GPU might be able to read from both pools of memory at one time, that doesn't mean the RAM bandwidth is (60GB/s + 200GB/s) or whatever the numbers are.
Egg - Wednesday, November 20, 2013 - link
"Microsoft has claimed publicly that actual bandwidth to the eSRAM is somewhere in the 140 - 150GB/s range, which is likely equal to the effective memory bandwidth (after overhead/efficiency losses) to the PS4’s GDDR5 memory interface. The difference being that you only get that bandwidth to your most frequently used data on the Xbox One.""The difference being that you only get that bandwidth to your most frequently used data on the Xbox One."
No. This is effective bandwidth to the eSRAM only after protocol overhead, nothing more.
editorsorgtfo - Wednesday, November 20, 2013 - link
Uhhh, no you can't... Are you serious?
bill5 - Wednesday, November 20, 2013 - link
The GPU can read both pools at once. This is a fact. You can, period. No arguing this, it's a fact.
It's not the same as a single pool, but you can add them.
szimm - Wednesday, November 20, 2013 - link
You do realize that telling someone they are not allowed to argue something will only make them much more eager to do just that?
melgross - Wednesday, November 20, 2013 - link
No, you can't. Well, maybe YOU can, but the systems can't.
Owls - Wednesday, November 20, 2013 - link
Please provide some technical insight as to how you can magically add the two together to get your ridiculous throughput. We'll wait.
Or don't, since you are clearly astroturfing for MS.
Wolfpup - Wednesday, November 20, 2013 - link
Good grief, we've got fanbois on AnandTech too? LOL. Umm... the specs are right there. One quite obviously does not "have an edge".
editorsorgtfo - Wednesday, November 20, 2013 - link
Ah, so graphics card manufacturers can replace GDDR5 with cheap low-frequency DDR3 on all of their boards and get equal/greater performance so long as they add a little chunk of SRAM to the chip... Good to know man, thanks for that brilliant analysis. They should have come to you years ago to tap your knowledge of memory subsystems. Just think of all the money AMD and NVIDIA could have saved by doing so.
extide - Wednesday, November 20, 2013 - link
Well, in theory they can... but it would cost nVidia/AMD MORE money, as the GPU die would be bigger and thus have fewer shader cores. So it's not a good solution for a discrete GPU, but it IS a decent solution in SOME cases - see Crystalwell, for instance. Honestly, I think the PS4's setup is better, simple and fast, versus MS's more complex setup (and they ended up with a bigger die too, lol).
djboxbaba - Wednesday, November 20, 2013 - link
You have the cutest name ever, bra - "wolfpup". Kudos ^^
melgross - Wednesday, November 20, 2013 - link
Hey noob, it doesn't work that way. SRAM is not equivalent to high-speed GDDR5. This has been well established already. You do get some boost, at some points, but it's not covering every area of performance the way GDDR5 is.
CubesTheGamer - Wednesday, November 20, 2013 - link
Newb talk? No, you can't add them together. Let me tell you why, in technical terms.
ESRAM is meant to be a cache, and what a cache does is hold some data that you're going to need a lot (let's say there's some instruction code or some other code/data that needs to be read frequently). You put that data in the ESRAM, and it gets read 10+ times before being swapped for some other data. What you're saying makes it seem like we can constantly write and read from the ESRAM. That's not how it works.
tl;dr: You can't add them together because you should only be using the ESRAM about 1/10th as often as the main DDR3 RAM that the Xbox One has. So your argument is invalid, and don't say things that you don't know about.
smartypnt4 - Thursday, November 21, 2013 - link
He's indeed wrong, but I'd be willing to bet good money your hit rate on that eSRAM is way higher than 10% if it's used as a cache. Usual last-level caches have a hit rate getting into the 80% range due to prefetching, and large ones like this have even higher hit rates.
If it's hardware-mapped like the article indicates (aka not a global cache, but more like it was on the 360), it won't hit quite as often with a naive program, but a good developer could ensure that the bulk of the memory accesses hit that eSRAM and not main memory.
Da W - Friday, November 22, 2013 - link
XBone is ROP-bound. Will you stop bitching around with your geometry and bandwidth? It's all about ROPs!
MadMan007 - Wednesday, November 20, 2013 - link
I can add $2 and a whore together too, that doesn't make it good.
looncraz - Thursday, November 21, 2013 - link
You actually only get about 100GB/s READ or 100GB/s WRITE... The best-case scenario on the XBox One is 68GB/s + 100GB/s - still NOT matching the PS4's capabilities for reading/writing ANY memory... and only in certain situations where you are streaming from both memory systems.
Xbone PEAKS below PS4's AVERAGE memory performance.
daverasaro - Saturday, November 23, 2013 - link
Huh? Actually you are wrong. The Xbox One uses 8GB of DDR3 RAM at 2133 MHz for 68.3 GB/s of bandwidth, but also adds an extra 32 MB of ESRAM for 102 GB/s of embedded memory bandwidth. The PS4 uses 8GB of GDDR5 RAM at 5500 MHz for 170.6 GB/s of bandwidth.
daverasaro - Saturday, November 23, 2013 - link
32GB*
SunLord - Wednesday, November 20, 2013 - link
I don't get this constant worrying about power usage on non-mobile devices. They plug into a wall, and as long as it's not some obscene (300+W) amount of draw, I don't care, damn it... Heat can be an issue, but I'm personally not even remotely concerned that it might cost me $3 more a year in power to use my $400 PS4. If I were, I shouldn't be buying a PS4 or Xbox One, let alone games for them.
bill5 - Wednesday, November 20, 2013 - link
It doesn't matter if you aren't concerned, the EPA is.
Vote against Democrats if you don't like it.
Seriously, from what I understand, regulations in the EU in particular influenced these boxes, and I'm sure a power-hog machine was out of the question due to the general climate of "green" propaganda nonsense.
A5 - Wednesday, November 20, 2013 - link
The 360 got so hot it melted its own solder pads, despite sounding like a damn jet engine.
There are plenty of engineering reasons to reduce power consumption.
JDG1980 - Wednesday, November 20, 2013 - link
Exactly how will voting against Democrats in the US stop the European Union from imposing additional energy regulations?
blitzninja - Saturday, November 23, 2013 - link
He sounds like he is either A. a retard or B. he is talking out of his ass or C. trolling. Either way, ignore him.
The power consumption regulations are there for more than just "green". We currently have a problem of growing energy needs and where we're going to get that power form is a big question.
What people don't realize is that the power grid's infrastructure is designed with a peak load in mind, and due to implementation and cost limitations you can't just "build more" as most Americans seem to think. 1 million consoles sold at launch - think about that in terms of power consumption, and remember, this legislation doesn't just apply to consoles.
Also, I don't understand why the whole "anti-green" view. I don't see how it's bad, even if you don't think global climate change is real (which it is, btw; it's fact in every meaning of the word). Do you really think dumping all those exhaust fumes into the atmosphere is good for you or something? How would you like to weak a gas mask/air filter when you go outside? See what's happening in China right now (smog) because of the massive amounts of coal being burned.
tl;dr: Power consumption affects more than "green"; it affects infrastructure durability and limitations, and large upgrades are extremely costly and time-consuming. So please do some research instead of trying to act like a 'smartass'. Also, burning lots of fossil fuels can make you sick - see the China smog issue.
blitzninja - Saturday, November 23, 2013 - link
Some typo corrections, typing on my phone:"...going to get that power from and how it's going to be transported are big questions."
"...Also, I don't understand the whole "anti-green" view,..." deleted 'why'
"...you like to wear a gas mask/air filter when you go outside?"
evonitzer - Wednesday, November 20, 2013 - link
I think Anand is covering it more as a curiosity. High-power PCs with much better capabilities consume similar amounts at idle, so a specifically designed piece of hardware should be optimized MUCH better. But it isn't, and neither is the PS4. Odd.
Da W - Wednesday, November 20, 2013 - link
My only issue is: WHY, Microsoft, DID YOU KILL MEDIA CENTER and throw all your focus on the Xbox??? I would kill to have an HTPC with HDMI-in and voice command.
althaz - Wednesday, November 20, 2013 - link
If the XBox One had a tuner (or four) and more codec support, it would be an amazing media centre. As it is, it's a bit inconsistent.
A5 - Wednesday, November 20, 2013 - link
Yeah. Not being able to do DVR stuff on the XBone makes it kind of a deal-killer.
andrewaggb - Wednesday, November 20, 2013 - link
I was genuinely surprised they didn't integrate a set-top box + PVR in at least one SKU of the One and market it to cable providers.
airmantharp - Wednesday, November 20, 2013 - link
And no HDMI-CEC? What the hell's with that?
Gigaplex - Wednesday, November 20, 2013 - link
I don't get it. Other sites I've read state that it does support HDMI-CEC.
mikeisfly - Thursday, November 21, 2013 - link
I'm not sure this is entirely true, as I remember them saying that they had it at the reveal. Also, there is some conflicting information on the internet about that. I would bet that the chip they are using has it, and even if it's not available at launch, I would hope they didn't take the traces out of the header to save money. I will certainly test in my lab when I get mine and keep you guys up to date.
Da W - Friday, November 22, 2013 - link
Even worse. They made the software for some IP-TV provider (Mediaroom), and they are getting rid of it.
mikato - Monday, November 25, 2013 - link
I completely agree. You still cannot buy a DVR!!! TiVo - look it up, you have to pay for their service. $20/month for a DVR from the cable company... we've gone backwards from the VCR in many ways. It makes me want to build my own HTPC (with CableCARD and maybe a SiliconDust HDHomeRun) but there are still lots of nagging issues doing that - HD recording, MythTV and XBMC integration, IR blaster remotes...
mikeisfly - Thursday, November 21, 2013 - link
Look at Silicon Dust's HDHomeRun Prime. I know it isn't supported at launch, but I will bet money that it will be supported in later updates. I have it now running on Windows Media Center (both Windows 7 and 8) and love it.
Flunk - Wednesday, November 20, 2013 - link
If you're annoyed by the lack of Windows Media Center, XBMC is a good and totally free alternative.
Da W - Wednesday, November 20, 2013 - link
It's HDMI-in I lack.
mikeisfly - Thursday, November 21, 2013 - link
XBMC doesn't support "copy once" material, so it's not even an option for many - though if it did, I would switch in a minute due to the fact that it is still being developed.
mikato - Monday, November 25, 2013 - link
XBMC has no DVR functionality. Maybe combined with MythTV you can do it, but that integration has only been done recently with an XBMC add-on; otherwise you are dealing with the two programs independently - it's much better to watch things in XBMC, but you have to go into MythTV to do recording.
melgross - Wednesday, November 20, 2013 - link
Because nobody used Media Center.
Da W - Friday, November 22, 2013 - link
Because nobody made an off-the-shelf, plug-and-play HTPC. Since MS is making hardware now, I don't know why they didn't try to repackage Media Center as a Windows 8 app and make another try. The whole world is fighting for your TV; Microsoft has been here since 2005, and somehow they called it quits (for the PC) and put all their eggs in the Xbox basket.
How expensive would it be to offer two options instead of one? I know a good deal of enthusiasts who would kill for a $2k HTPC with full XBone capabilities. It would cut the grass under the Steambox's feet too.
taikamya - Wednesday, November 20, 2013 - link
So wait... that IGN review where they stated that the PS4 has a 2.75GHz clock is false? 'Cause that could explain the faster response times and higher power usage, since the GPUs are not THAT different. I don't think that all of that 20W-30W power difference is GPU only.
Okay, "max frequency of 2.75GHz"... either way, that could explain a lot (including the overheating problems some people are having now).
http://goo.gl/Fd6xJY
taikamya - Wednesday, November 20, 2013 - link
Excuse me, I'm new here so... I'm sorry if we're not supposed to post links or anything for that matter. The IGN review is called "Playstation 4 Operating Temperature Revealed".
I would be glad if someone could clear this up for me, since this Anand review states that the PS4 runs at 1.6GHz.
althaz - Wednesday, November 20, 2013 - link
It runs at 1.6GHz; IGN are incorrect.
A5 - Wednesday, November 20, 2013 - link
Don't go to IGN for technical information. Or anything, really. They're just plain wrong on this.
cupholder - Thursday, November 21, 2013 - link
Yeah, double the ROPs = not THAT different.
Each of my 770s are totally the same as a Titan... Totally.
bill5 - Wednesday, November 20, 2013 - link
14 CUs, yes, it does have 14, with two there for redundancy.
The worst part is, as I tweeted you, as recently as weeks from launch MS was strongly considering enabling the two redundant CUs, but chose not to. Both of my own reliable sources told me this, and it was also somewhat referenced by MS engineers in a Digital Foundry article.
Anyway, I strongly wish they had; 1.5 teraflops just would have felt so much better, even if on paper it's only a small increase.
MS was so dumb not to beef up the hardware more; charging $499 for essentially an HD 7770 GPU in nearly 2014 I find sad.
Hell, my ancient 2009 HD 4890, factory overclocked to 950MHz, has more FLOPS in practice, even if the 7770/XO GPU is probably faster due to being more advanced.
Think about that: the 4890 is a 5-year-old GPU. The XO is a brand-new console expected to last 7+ years. So sad I don't even wanna think about it.
Ahh well, the sad thing is, by the looks of your comparison vids, MS will very likely get away with it. Even in the 720p vs 1080p Ghosts comparison there is not much difference (and I imagine over time the XO will close the resolution gap to something more like 900p vs 1080p).
One of the most interesting parts of your article, though, was the speculation that the XO is ROP-limited. Not something I hadn't heard before, but still interesting. Shortsighted on MS' part if so.
Overall it feels like, as usual, MS is misguided: focus on live TV when it's probably slowly fading away (if not for that pesky sports problem...), and other things that seem cute and cool but half-assed (voice recognition, Snap, Skype, etc etc etc).
Yet for all that I can still see them doing well, mostly because Sony is even more incompetent. If they were up against Samsung or Apple they would already be dead in consoles, but fortunately for them they are not; they are up against Sony, who loses pretty much every market they are in.
I think if the XO struggles it would be nice to rebrand it as a Kinect-less, games-focused machine at $299. For that it'd arguably be a nice buy, and the cheap DDR3 base should enable it. But if it sells OK at $499 with Kinect, and it probably will, we'll probably never get a chance to find out.
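For what that "1.5 teraflops" would have meant, here's the standard GCN arithmetic in Python (the CU counts and clocks are the widely reported figures, taken as assumptions here):

def gcn_tflops(compute_units, clock_mhz):
    """64 shaders per GCN CU, 2 FLOPS per shader per clock (FMA)."""
    return compute_units * 64 * 2 * clock_mhz * 1e6 / 1e12

print(round(gcn_tflops(12, 853), 2))  # Xbox One as shipped: ~1.31 TFLOPS
print(round(gcn_tflops(14, 853), 2))  # with both redundant CUs enabled: ~1.53 TFLOPS
print(round(gcn_tflops(18, 800), 2))  # PS4: ~1.84 TFLOPS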
djboxbaba - Wednesday, November 20, 2013 - link
It really is sad.. Good post.
augiem - Wednesday, November 20, 2013 - link
I agree for the most part, but 14, or even 18 CUs isn't going to be enough to really make a big difference. I think the sad part technology-wise is how not one of the 3 major console gaming companies this time around focused on pushing the horsepower or even doing anything very innovative. Don't get me wrong, I for one don't think graphics is primarily what makes a good game, but since the days of Atari -> NES, this really feels like the smallest technological bump (was gonna say "leap", but that just doesn't seem to apply) from gen to gen. What makes it worse is the last gen lasted longer than any before it. You know the rise of the dirt-cheap phone/tablet/FB/freemium game had something to do with it...
airmantharp - Wednesday, November 20, 2013 - link
Having actual CPU resources, a unified GPU architecture with desktops (and many mobile SoCs), and tons of RAM are all big differences over the last generation's introduction.
The Xbox expounds on that by adding in co-processors that allow for lots of difficult stuff to happen in real-time without affecting overall performance.
mikeisfly - Thursday, November 21, 2013 - link
Thank god people didn't think like this when computers first started with switches and paper tape. Remember, we have to start somewhere to move the technology forward. I want the Jarvis computer from Iron Man! You don't get there by making a console that can play games. You get there by making a console that can play games and has voice recognition and gestures and...
People get used to interacting with new input sources, and then you find yourself in a situation where you say, how did I ever live without this? You guys sound like I did in the 80s when Microsoft was coming out with this stupid GUI crap: "You will have to rip the command line from my cold dead fingers!" Where would we be today if everyone thought like me? Where would the Internet be if it was just a command line? I for one applaud Microsoft for trying to expand the gaming market, not just for hardcore gamers but for people like my girl too. I know the PS4 might have more power in terms of compute performance, but that is not what games are about; it's about story line, immersiveness (made-up word), and to some extent graphics. Truth is there is really no difference between 1080p and 720p on a big screen - remember, people, this is not a PC monitor. And the X1 can do 1080p. I'm looking forward to what both systems can offer in this next generation, but I'm more interested in the X1 due to its forward-thinking aspects. Only time will tell though.
douglord - Thursday, November 21, 2013 - link
Rule of thumb is you need a 10x increase in power to get a 100% increase in visual fidelity. Look at 360 vs One: 6x the power and maybe games look 50% better. So we are talking about the PS4 looking 5% better than Xbox One. In this gen, it really is about who has the exclusives you want.
And if you are looking out 5+ years, you have to take into account Xbox's cloud initiative. Have you used OnLive? I can play Borderlands 2 on an Intel Atom. If MS puts the $ behind it, that pitiful 8-core CPU could be used just to power the OS and a cloud terminal. It's the only way these consoles can keep up with midrange PCs.
Revdarian - Sunday, November 24, 2013 - link
Interesting that you use numbers referring to visual fidelity, when it is a non-quantifiable, perceptual quality.
Also, there is no such rule of thumb regarding it, but what is known is that in certain games, like CoD: Ghosts, due to certain choices the XB1 is able to pump out less than half the pixels that the PS4 can.
If you believe in the cloud for that kind of gaming, Sony has bought Gaikai, and it is a project that started sooner than the MS counterpart; heck, the MS counterpart hasn't even been named.
RubyX - Wednesday, November 20, 2013 - link
How do the noise levels of the consoles compare?
According to other reviews they both seem to be fairly quiet, which is great, but is there a noticeable difference between them?
szimm - Wednesday, November 20, 2013 - link
I'm wondering the same - I've seen lots of people point out the fact that the Xbox One is designed to be bigger, but cooler and quieter. However, I haven't seen any confirmation that it is in fact quieter than the PS4.
bill5 - Wednesday, November 20, 2013 - link
15W standby seems a bit high.
Let's say you leave it on standby 24/7, as you would; that's 360 watt-hours a day, almost 11 kWh a month. I pay ~10 cents per kWh in general, so about $1.10/month.
That could add up to $60+ over 5 years - more if the EPA enforces more regulations raising the cost of electricity, as they typically are doing.
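The arithmetic, spelled out in Python (the 15 W draw and $0.10/kWh rate are just the figures assumed in the comment above):

standby_watts = 15
rate_per_kwh = 0.10  # dollars

kwh_per_month = standby_watts * 24 * 30 / 1000     # ~10.8 kWh
cost_per_month = kwh_per_month * rate_per_kwh      # ~$1.08
print(round(kwh_per_month, 1), round(cost_per_month, 2), round(cost_per_month * 60, 2))
# 10.8 kWh, ~$1.08/month, ~$64.80 over 5 years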
ydeer - Thursday, November 21, 2013 - link
Yes, the standby power of the XBone and PS4 bothers me too. I often leave my TV and consoles untouched for weeks, so the only sensible thing is to put them on a master/slave power strip which cuts them off the grid when the TV isn't on.
Of course that defeats the entire point of standby background downloads, but in the case of Sony, I have to wonder why they put a whole proprietary ARM SoC* (with 2GB of DDR3 RAM) on the board for "low power standby and background downloads" and then end up with unbelievable 70W figures.
This is essentially a mobile phone without a display; I don't think it should use more than 3 watts idle with the HD spun down.
My only explanation is that they couldn't get the ARM software/OS side of things wrapped up in time for the launch, so for now they use the x86 CPU for background downloads even though it was never intended to do that.
* http://www.ifixit.com/Teardown/PlayStation+4+Teard...
ydeer - Thursday, November 21, 2013 - link
Correction: the SoC only has access to 2Gb (= 256 MB) of DDR3 RAM.
However, I found a document that seems to confirm that the ARM subsystem did not work as planned and Sony currently uses the APU for all standby/background tasks.
Maybe somebody who is fluent in Japanese could give us a short abstract of the part that talks about the subsystem.
http://translate.google.com/translate?u=http%3A//p...
tipoo - Wednesday, November 20, 2013 - link
Hey Anand, did you see the Wii U GPU die shots? How many shaders do you think are in there? I think it's almost certainly 160 at this point, but there are a few holdouts saying 320, which seems impossible with the shader config/size. They are basing that off the clusters being a bit bigger than normal shader cores, but that could be down to process optimization.
http://www.conductunbecoming.ie/wp-content/uploads...bill5 - Wednesday, November 20, 2013 - link
There is a guy on NeoGAF with access to official Wii U documentation that more or less confirms 160 shaders, even though it's never explicitly stated (for example, it refers to 32 ALUs, which would be VLIW5 in this case, meaning 160 shaders). Combine that with the die evidence and it's clear. 8 TMUs and 8 ROPs also.
Some people will never accept it, but there's no real doubt for me personally: it's 160.
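The arithmetic behind that count, sketched in Python (the 550 MHz clock is the commonly cited figure for the Wii U GPU, assumed here rather than confirmed):

alus = 32          # ALU clusters referenced in the documentation above
vliw_width = 5     # VLIW5: 5 shader slots per cluster
shaders = alus * vliw_width            # 160
gflops = shaders * 2 * 550e6 / 1e9     # 2 FLOPS per shader per clock at ~550 MHz
print(shaders, round(gflops))          # 160, ~176 GFLOPS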
tipoo - Wednesday, November 20, 2013 - link
A 1:1 ROP to TMU ratio is quite strange. TMUs are usually double. Not doubling it is another weird limitation. Nintendo sure does have a lot of head scratchers in there.
djboxbaba - Wednesday, November 20, 2013 - link
Wii U is almost decade-old hardware packaged for today.
tipoo - Thursday, November 21, 2013 - link
The PowerPC 750 it's based on is from the 1998 iMac G3 :P
But I know, that's like saying the Core 2 is based on the Pentium 3.
dgingeri - Wednesday, November 20, 2013 - link
For the first time since the Atari 2600, I actually want a particular console: the Xbox One. I don't even play console games. I want it for the voice control for the apps. I watch Hulu Plus and Netflix instead of TV, and this would make it easier to watch than using a Win7 PC like I do right now. It also uses less power than the PC I use right now. In addition, I would like to have the Skype app so I could talk to certain family members face to face, sort of, who are too far away for me to visit. The games don't attract me to the console so much as the other uses.
Hubb1e - Wednesday, November 20, 2013 - link
I agree with you. The $500 price doesn't even scare me that much because my young girls and wife will probably like the casual Kinect games.
However, Microsoft has made a HUGE mistake with requiring a $60 yearly subscription. This isn't 2008 anymore. I can get a Roku streamer for $50 that will play Netflix and Hulu for years to come. Kinect really appeals to the casual gamer and I'd get one myself for the streaming and the occasional console game session, but the $60 a year charge that can't be canceled easily (as I found on my 360) makes the XO a non-starter for me.
mikeisfly - Thursday, November 21, 2013 - link
Hubb1e, I hear you, but how do they make up for the cost of the console? How much do you think a sensor like the Kinect 2 would cost on the PC? How do they continue to make money to make the network better so things like voice recognition improve, and to make the investments to get the network closer to the end user to reduce latency? I'm willing to pay $60/year (one night out with the family at the movies/dinner) to get a better experience.
Owls - Wednesday, November 20, 2013 - link
Couldn't you get a $35 Chromecast dongle for your TV? Or does your TV not have a USB port? It just seems so odd shelling out $500 for watching TV. Heck, you could probably spend $500 and get a fairly decent smart TV with Skype, Hulu, Netflix, and Amazon Prime!
mikato - Monday, November 25, 2013 - link
Seems like voice command is the only reason he can't do something like that, or just continue using his Win7 PC or an XBMC. I'm still leaning toward making a new APU-based HTPC for XBMC myself.
JDG1980 - Wednesday, November 20, 2013 - link
So, based on the numbers shown here, it looks like the PS4's GPU is roughly on par with a Radeon HD 7850 (more shaders, but slightly lower clock). Meanwhile, the XB1's GPU is considerably weaker, with performance falling somewhere between a 7770 and 7790. Considering that this is a game console we're talking about (notwithstanding Microsoft's attempt to position it as a do-everything set-top box), that's going to hurt the XB1 a lot.
I just don't see any real advantage to the *consumer* in Microsoft's design decisions here, regardless of supply chain considerations, and I think Anandtech should have been more pro-active in calling them out on this.
mikeisfly - Thursday, November 21, 2013 - link
The right question to ask is whether both cards can do 1080p gaming. Remember, these aren't PCs where people are running games at much higher resolutions than 1920x1080 on multiple monitors.
douglord - Thursday, November 21, 2013 - link
Take a 7850 and a 7770 and put them next to each other with the frame rate locked to 60 fps. Sit back 6 feet and play an FPS. Tell me which is which. Maybe a 5% difference in visual fidelity.
Revdarian - Sunday, November 24, 2013 - link
Lol no. By the way, what will you do if a game is heavy enough to run at 720p30 on the PS4? At what resolution will you run it on the XB1?... Yep, it will be noticeably different.
jeffrey - Wednesday, November 20, 2013 - link
With the PS4 offering up such a more powerful system, the argument turned to Xbox One's eSRAM and "cloud power" to equalize things. Even with Microsoft boosting clocks, the Xbox One simply does not deliver gameplay graphics the way the PS4 has now been demonstrated to do.
The PS4 graphics look much better. In COD: Ghosts it almost looks like the PS4 is a half-generation ahead of the Xbox One. This actually makes sense with the PS4 offering 50% more GPU cores and 100% more ROPs.
Considering the PS4 is $100 cheaper and the bundled Kinect is a non-starter, the decision seems easy.
The troubling piece is that both systems are dropping features that previous-gen systems had, like Blu-ray 3D.
bill5 - Wednesday, November 20, 2013 - link
Heh, half a generation? Do you have visual problems?
Looking at all the Anand evidence, pics and YouTube videos, you're quibbling over a 1% visual difference, seriously. It's shocking how little difference there is in COD for example, and that's a full 720 vs 1080 split! I expect in the future the Xone will close that gap to 900 vs 1080 and the like.
I would say even the average *gamer* won't be able to tell much difference, let alone your mom.
Hell, half the time it's hard to spot much difference between "current" and "next" gen versions at a glance, let alone between the PS4/Xone versions.
I'd say that, sad as it is, MS won that war. Their box will be perceived as "good enough". I've already seen reviews today touting Forza 5 as the best looking next gen title on any console, and the like.
All you really need is ports. Multiplatform devs are already showing all effects and textures will be the same; the only difference might be resolution (and even then, games like NFS Rivals and NBA 2K are 1080p on the Xone).
Then you'll get to exclusives, where the PS4 could stretch its lead if it has one. However, these are the vast, vast minority of games (and even then I'd argue early exclusives prove nothing).
I hate that MS went low power; it was stupid. But they'll probably get away with it because, Sony.
Philthelegend - Wednesday, November 20, 2013 - link
You trolling? You're the one who's visually impaired if you don't see the difference! Just look at the screenshots, and if you have a low resolution screen zoom them in and see the difference. The difference is like playing a game on very high settings (PS4) versus medium (Xbone) on PC.
"MS won that war. Their box will be perceived as "good enough"." hehehehe you're an obvious troll or a blind fanboy, no one says that the loser won a battle because he was good enough
You say Forza 5 is the best looking next gen title, then you go on to say PS4 exclusives prove nothing?
The actual graphics are not the top priority; the Xbone could have the same graphics as the PS4, but the most important thing is to keep the frame rate at or above 60 at all times.
TEAMSWITCHER - Wednesday, November 20, 2013 - link
You and I must have watched different videos. There is a pronounced "shimmering" effect on the Xbox One - caused by weaker anti-aliasing. It's far more distracting than a mere 1%. In every video the PS4 image looks more solid and consistent. I'm less than an average gamer and I can see the difference immediately.
Microsoft simply didn't "Bring It" this time, and when you're in a tough competitive situation like game consoles you really can't afford not to. I really don't want to buy a "Good Enough" console. Thank you, but no thanks.
Hubb1e - Wednesday, November 20, 2013 - link
I really didn't see much difference between the two. If I tried really hard I could see some more detail in the PS4 and it had a little less "shimmering" effect. In actual use on a standard 50" TV sitting the normal 8-10 feet away, I doubt there will be much difference. Shit, most people don't even realize their TV is set to stretch 4:3 content, and they have it set to torch mode because the "colors pop" better. It's probably going to come down to price and Kinect, and in this case an extra $100 is a lot of extra money to pay. $449 would have been a better price, but we'll see, since there is plenty of time for MS to lower prices for their console after early adopters have paid the premium.
Kurge - Wednesday, November 20, 2013 - link
Fail. All of that has more to do with the developers than the hardware.
nikon133 - Wednesday, November 20, 2013 - link
Yes, you usually win a war with weaker hardware, bundled with generally unwanted accessories, which pre-orders significantly worse than the competitor, even on local US turf. /s
Here in NZ, all the chains I have checked have the PS4 pre-sold until late January to mid-February. Coincidentally, every shop tried to sell me an XO instead. "We have plenty of those," they said.
Great win for XO. They will own shop shelves in the next 2 - 3 months, at least ;)
douglord - Thursday, November 21, 2013 - link
The weaker console almost always wins the war. Sega always had a hardware edge on Nintendo. Same with everything vs Gameboy. PS1 vs Dreamcast? Wii vs PS3 and 360. DS vs Vita.
xgerrit - Thursday, November 21, 2013 - link
"The weaker console almost always wins the war." You're the first person I've seen suggest the Wii U is going to win this generation... interesting.
blitzninja - Saturday, November 23, 2013 - link
He's going in the right direction but lacks the real reason why.
You guys here on AnandTech need to realize that you live in your own little bubble, and while you may know a lot about the consoles, the casual consumer market (which makes up most people) has different priorities. So why did Nintendo beat its competitors with the Wii while having horrible specs? The experience.
Yes, there is a performance difference between the PS4 and the XO, but what really matters is how the console feels and whether it does what people want it to do. This is where the Wii comes in (the Wii U was a flop because they actually went backwards in this regard). Most of the console market is made up of casual gamers. Casual gamers like to invite their friends over and have a LAN party or party game, play with their family (this includes younger audiences), watch movies together and play music at times. The Wii dominated the market because its new control interface(s) added the missing piece for this market; it was extremely versatile and made playing it all the more fun than the other consoles.
This is why Nintendo didn't really beef up the Wii U, they just added the extra power to allow for more advanced and precise gesture computation.
So why isn't the Wii U dominating again? Well, for starters, most people who have a Wii are satisfied with it and are not out to buy a new one; the Wii U doesn't add anything spectacular that would make the majority of its target market want to upgrade.
The reason the higher-spec console ended up losing is that when the company developed the console, they focused their resources on performance and as a result cut back on the usability and experience aspect. But that isn't necessarily always the case; it all depends on what the focus experience of the console is and how well polished that experience is.
If Microsoft wants to win the war it needs to pander to the needs of the casual market. Not to say it should copy Nintendo, but it has another market: the all-in-one. That is to say, make the XO a future PVR, set-top box, and media/streaming centre. Replace the HTPC with a low cost alternative. Most decent HTPCs fall into the $500-$700 range for those who want some light gaming too. The XO would absolutely destroy this market with the proper hardware and software support, being a console for mid-to-high end gaming while still being a multimedia powerhouse that does a multitude of things. This includes voice recognition, a killer feature if done right. If I could say "latest episode of The Walking Dead" or some other show and it worked, then gg Sony, you just got rolled.
ydeer - Thursday, November 21, 2013 - link
"I'd say that, sad as it is, MS won that war. Their box will be perceived as "good enough"."This ranks very high on my list of "most hillarious console war comments 2014".
douglord - Thursday, November 21, 2013 - link
The jump in TFlops gen to gen is usually 10x+. 50% more is not a big deal.
bill5 - Wednesday, November 20, 2013 - link
By the way, the Xone has a few spec advantages too: 9% more CPU speed, 7% more geometry setup, and 54% more peak GPU bandwidth.
Revdarian - Sunday, November 24, 2013 - link
Actually, on Digital Foundry MS admitted that the usable GPU bandwidth in real-world scenarios was 140-150GB/s, while developers of PS4 games have reported usable bandwidths of ~170GB/s.
That 9% is useful until you remember that you need to set aside power for Snap, and that you are running 3 OSes.
Da W - Wednesday, November 20, 2013 - link
The best hardware has always lost the war. Genesis, N64, Xbox, PS3...
Death666Angel - Wednesday, November 20, 2013 - link
Genesis wasn't superior to the SNES, nor was the N64 to the PlayStation. Xbox and PS3 I agree.
Da W - Wednesday, November 20, 2013 - link
SNES vs Genesis might be debatable, but not N64 vs PSone? What planet do you live on? The PSone was crap. The only thing that made it what it became was the use of CDs and Final Fantasy 7!
djboxbaba - Wednesday, November 20, 2013 - link
Crap? PSone had the greatest game library in the history of consoles... what planet are you on?
kyuu - Wednesday, November 20, 2013 - link
Its game library is irrelevant. They're talking about the hardware.
nikon133 - Wednesday, November 20, 2013 - link
PS3 outsold X360 globally...
kyuu - Wednesday, November 20, 2013 - link
Don't forget the Saturn and Dreamcast.
xgerrit - Thursday, November 21, 2013 - link
The question is: Why? And the answer probably isn't "it failed because it was the best hardware."
This is the first generation where social lock-in is going to affect purchase decisions right from the start... Most people will end up buying the console their friends have so they can do multiplayer. Since both consoles are going to sell out for the next few months, the question this time around might be: Who can make them faster?
Kurge - Wednesday, November 20, 2013 - link
Total rubbish. If you could mimic the controllers or use a third party identical controller and do a blind test most people would be unable to detect any graphical differences. Most of it is pixel peeping where you take snapshots and compare.
It's all nonsense, either platform will play games that look roughly the same - Ryse is said to be probably the best _looking_ game on either platform, and it's a One game.
Sorry - this line of thinking of yours is a fail. Game quality will depend on the developers, not slight differences in peak performance.
This generation is less about hardware and more about software - and Microsoft is _miles_ ahead of Sony as a software company.
mikeisfly - Thursday, November 21, 2013 - link
I wonder, if you did a double-blind test, whether anyone could pick the PS4 over the Xbox One. Maybe AnandTech should run that test. Hell, add the Wii U in there too. I don't think people would like what they see. The human eye is designed to see contrast and frame rate over resolution.
hoboville - Wednesday, November 20, 2013 - link
Very interesting read, I wish I understood more about the importance of more CUs vs clock speed.
kallogan - Wednesday, November 20, 2013 - link
Those idle power consumption numbers are awful, especially when it's supposed to have low power Jaguar CPUs on board. Consoles are really pieces of junk.
A5 - Wednesday, November 20, 2013 - link
AMD is pretty bad at power consumption. See: Bulldozer, R9 290, etc.
JDG1980 - Wednesday, November 20, 2013 - link
That's not really the best comparison, though. Kabini, which uses the same Jaguar cores as the PS4 and XB1, has very good power consumption figures at both idle and load. AMD's mid-range GPUs like the 7790 and 7850 equal or beat Nvidia's solutions in terms of performance/watt.
Bulldozer was an inefficient design, no doubt about it. Piledriver was a bit better and Steamroller should be better still. But none of that is being used here.
Hubb1e - Wednesday, November 20, 2013 - link
I really think this is a case of MS and Sony failing to add the necessary code to take advantage of the silicon. I think they had so many things to do to get these systems working that idle power consumption fell into the "let's do that later" category, which greatly simplifies everything from the initial coding of the OS to the testing and validation. Anand thought that maybe the silicon for turning off cores wasn't there. I doubt that, and I think it will be coming with a patch in the 3-12 month timeframe.
mikato - Monday, November 25, 2013 - link
Agree, and I don't know why Anand thought AMD didn't make that available. No reason to remove it that I know of.
kallogan - Wednesday, November 20, 2013 - link
A powerful PC with a quad core i7 and a GTX Titan can idle below 30W. Gosh, these are really prehistoric devices. Not green.
kyuu - Wednesday, November 20, 2013 - link
Source please? I don't doubt it idles lower than either console, but 30W seems pretty low to me.
ydeer - Thursday, November 21, 2013 - link
30W is low, but not out of the realm of possibility.
The HardOCP Haswell test system with 16GB RAM and two SSDs used 32W idle. (http://www.hardocp.com/article/2013/06/01/intel_ha...
A Titan would add less than 10W to that because the iGPU would be completely disabled. (http://www.techpowerup.com/reviews/nvidia/geforce_...
So maybe not "less than 30W", but 35W idle should be absolutely possible for a Haswell/Titan machine.
ananduser - Wednesday, November 20, 2013 - link
Sorry for the off-topic, Anand, but since you mentioned cutting the cord a few years ago... care to share with us your avenue of choice (as in streaming services, set top boxes and whatnot)?
tipoo - Wednesday, November 20, 2013 - link
The PS4 browser being twice as fast is a surprise, since the CPUs are so close. Do we know the official PS4 CPU clock yet?
bill5 - Wednesday, November 20, 2013 - link
It's 1.6 GHz vs 1.75 GHz on the Xone. Anand speculates the PS4 is using more cores for the OS.
tipoo - Wednesday, November 20, 2013 - link
The PS4 clock was speculation though, not official. More cores would not change JavaScript scores, which are mostly single-threaded.
A5 - Wednesday, November 20, 2013 - link
Pretty sure WebKit has a multi-threaded JS engine. And if the XBone restricts CPU time for apps along with core counts, that could explain some more of it.
andrewaggb - Wednesday, November 20, 2013 - link
Plus the Xbox is running Internet Explorer, which tends to lose all the JavaScript benchmarks. It's not likely important anyways, JavaScript benchmarks do not
tipoo - Wednesday, November 20, 2013 - link
IE11 tends to win SunSpider a lot. Seems they don't have the most modern IE rendering engine in there.
Hubb1e - Wednesday, November 20, 2013 - link
It may be that MS has simply decided that two cores is enough CPU horsepower to run all OS functions and doesn't even bother letting the OS touch any more cores, even when outside of a game. Two Jaguar cores at 1.75 GHz really isn't half bad, so it could make sense.
ShapeGSX - Wednesday, November 20, 2013 - link
Microsoft has stated that there are two standby modes: one in which Kinect is listening for the command "Xbox On", and another where you turn that feature off. If you turn the "Xbox On" feature off, they have stated that standby power consumption drops to just 0.5W (although given that they also said it burns just 7W with the feature turned on, that makes me wonder).
Could you test the power consumption with the "Xbox On" feature turned off?
Death666Angel - Wednesday, November 20, 2013 - link
How would that differ from the "Off" state Anand tested?
darkich - Wednesday, November 20, 2013 - link
Holy smoke... my phone has more than 4X better JavaScript browsing performance than the Xbox One!
That's just disgraceful on the console's part, and inexcusable for Microsoft.
kyuu - Wednesday, November 20, 2013 - link
Considering the internet browser is not an important component of a console, whereas it's hugely important on a smartphone, it's pretty understandable, really.
Stuka87 - Wednesday, November 20, 2013 - link
*WHY* are the comparison videos uploaded at 360P!?!
Stuka87 - Wednesday, November 20, 2013 - link
Hmm, seems they are 360P when viewed on YouTube, but HD is available if watching the embedded version. Strange.
Hubb1e - Wednesday, November 20, 2013 - link
I was able to see the 4K versions actually, which was pretty cool. First time I've actually seen a 4K video from YouTube.
Shadowmaster625 - Wednesday, November 20, 2013 - link
Microsoft is just beyond stupid. It's nothing to produce GDDR5. It costs basically the same amount of money to produce 50 million GDDR5 chips vs 50 million DDR3 chips. That is the whole point of making a gaming console in the first place: you get massive volume discounts on all your parts. Only a fool would buy an Xbox, there is absolutely no reason to... it's not like Microsoft is going to have that much exclusivity.
tipoo - Wednesday, November 20, 2013 - link
A few dollars extra across say 80 million units is a lot. Do I wish they used GDDR5? Yes. But their decisions are based on their own cost analysis.
Tyns - Wednesday, November 20, 2013 - link
The predicted price and availability of GDDR5 was highly questionable at the time MSFT needed to commit to the decision. Sony gambled and it happened to work out for them. A 6 month to 1 year delay or an extra $100-200 for the console would have been devastating if it had gone the other way, no?
Sony's gamble paid off and now MSFT looks foolish, which is a shame for all of us.
Hubb1e - Wednesday, November 20, 2013 - link
I had also heard that Sony had decided to go with 4GB of GDDR5 but decided to double that when MS announced 8GB. Half the RAM on the Sony box would have hurt its ability to take advantage of its better hardware.
TEAMSWITCHER - Wednesday, November 20, 2013 - link
I thought the image quality differences would be more subtle. But watching the COD: Ghosts video side-by-side you can see there is more pronounced "shimmering" in the image on the Xbox One. Microsoft screwed up - I didn't spend $1500 on an HDTV to look at crappy images. For me the choice is clear - the PS4 wins this round. If enough people avoid the Xbox One, next year there won't be exclusive titles to miss out on.
Kurge - Wednesday, November 20, 2013 - link
COD is badly coded, and what you don't see in the video is the frame rate choking of the PS4.
What now?
Revdarian - Sunday, November 24, 2013 - link
Actually DF found out that CoD on the PS4 was running too fast, and that was the issue.
Flunk - Wednesday, November 20, 2013 - link
Thanks for confirming my suspicions that it's likely going to be a good 12-24 months before we'll need to buy one of these new systems. Call me old-fashioned, but I like to wait for that "killer" app before I upgrade to new hardware.
Wolfpup - Wednesday, November 20, 2013 - link
Funny that you point out it's not in depth, but then actually go WAY more in depth than anyone else yet has! Great article. I'm shocked the PS4 has 2x the ROPs. I was assuming either 0 or 50% more.
Wolfpup - Wednesday, November 20, 2013 - link
I hate that the hard drive on One is sealed...wish you could disable the 5 or 15 minute video caching too, for noise and hard drive longevity reasons. Makes me wonder if throwing an SSD in a PS4 even makes sense.
Tyns - Wednesday, November 20, 2013 - link
I forget where but I read an article that stated for 1080p/60fps at the GPU's clock they only needed 2 more ROPs, or 18 total, but you couldn't selectively choose to add 2 more - it was 16 or 32.
tipoo - Wednesday, November 20, 2013 - link
And TMUs too.
Icehawk - Wednesday, November 20, 2013 - link
Quick question I haven't seen addressed - are Xbox 360 games all compatible with the One?
I have to admit I am a bit surprised by the relatively weak hardware in both new consoles. In previous gens they were roughly on par with high-end gaming PCs; here it seems like they are more like a mainstream rig at best. If these go as long between generations again I see bad things happening: A) console games will fall far behind graphically vs PCs, and B) PCs will be hampered by console graphics on multi-plat titles.
Owls - Wednesday, November 20, 2013 - link
No, it's not.
swilli89 - Wednesday, November 20, 2013 - link
Wait, what? Why is this out before ANY PlayStation 4 article?
kyuu - Wednesday, November 20, 2013 - link
I'm guessing Sony didn't give Anand a PS4 for pre-launch review. Otherwise I have no idea.
kyuu - Wednesday, November 20, 2013 - link
I mean at least not as early as they got the XBone, since obviously they do have a PS4.
HisDivineOrder - Wednesday, November 20, 2013 - link
The best part of all this for PC gaming seems to be the sudden and very welcome arrival of x64-capable executables for games that use more than 3 GB of RAM on a regular basis. I didn't expect the transition to happen so suddenly, but then bam, there we were with BF4 and Call of Duty with x64-capable executables. And Watch Dogs, whenever it arrives.
That these gaming systems are already well surpassed by mid-range gaming PCs is also pretty nice in terms of ensuring ports run reasonably well for the near term. Kinda sucks for those buying into these systems for $500 or $400 (or both!) since you could easily build a PC out of the one you likely already own that would surpass them. This has never been more true than this generation, and never been so easy to verify either, but it's a nice boon for those of us who are PC gamers already.
It also opens the door for Steam to make their own value argument with SteamOS and Steam Machines.
djboxbaba - Wednesday, November 20, 2013 - link
Let me rephrase your first statement for you: The best part of all this IS PC gaming. haha :)
mikato - Monday, November 25, 2013 - link
I agree. Now if only they let Call of Duty PC gamers play against their console counterparts in multiplayer. Aaahahaha, no mouse, sorry, your soldier has a hangover today and has to turn slowly.
Hrel - Wednesday, November 20, 2013 - link
"Support for external storage is apparently on its way, as the Xbox One doesn’t allow end user upgrades of the internal 500GB hard drive. I have to say that I prefer Sony’s stance on this one."
What IS Sony's stance on this one? I have no idea, haven't heard anything.
nikon133 - Wednesday, November 20, 2013 - link
My understanding is that the HDD on the PS4 is user-replaceable, but there's no external storage at the moment. Will they introduce external storage in the future, and is there a limit to HDD size (with current firmware)? I don't know.
Owls - Wednesday, November 20, 2013 - link
2TB
Commentfairy - Wednesday, November 20, 2013 - link
What the..?! The Xbox and PS4 just got released and they're already reviewed? I come to this website every day for a month looking for the 15-inch MacBook Retina review as I hold off buying one, yet AnandTech never posts any review? This is weird, don't they review MacBooks all the time...??
Hrel - Wednesday, November 20, 2013 - link
It's not the full review; that's forthcoming. If you had read it you'd know that.
I would assume Mac product reviews are forthcoming as well. Not that reasonable people care.
errorr - Wednesday, November 20, 2013 - link
I would expect that review sometime later this week. Anand admitted somewhere that he was putting the 15" Mac at the bottom of the pile of things to review.
Now I only need Klug to tell me how great my decision to buy a Nexus 5 was....
Hrel - Wednesday, November 20, 2013 - link
"If Sony’s price tag didn’t nerf the PS3 last round, it’s entirely possible that Microsoft’s Kinect bundle and resulting price hike won’t do the same for the Xbox One this time."Wat? Not sure what you're trying to say here. Sony's price tag last gen hurt their sales. It's entirely possible that Microsoft's high price tag won't hurt their sales this round. That's my guess, but it's worded very strangely. Also, if that is what you're saying I don't agree. The fact that it's an extra $100 for something the vast majority of people consider not only useless but intrusive can only hurt them further.
For my part I'm done with Microsoft. They fucked up Windows 8 to the point that it's unusable. They fucked up Xbox Live by banning everyone who has fun. (trash talks) and they built a console with sub-par hardware in the hopes that a fast cache would compensate. We won't notice the inferiority early on, but in 2-4 years it will become obvious that the PS4 is vastly superior.
I'm really a PC gamer now, and don't expect to have enough time to also be into consoles. But if that does happen down the road I'll be going Sony only for the first time ever. Only other console of their I have is the PS2. To play a PS1 game I like and a handful of PS2 exclusives I wanted to try. Shadows of Colossus being the primary one. But I got that, and games/accesories, off ebay for $70.
Hubb1e - Wednesday, November 20, 2013 - link
lol complaining about getting banned for trash talking...
Death666Angel - Wednesday, November 20, 2013 - link
"I have to say that I prefer Sony’s stance on this one." -> Seeing how I don't really follow the console market apart from superficial reading of some news, what is Sony's stance here? :)
Mugur - Wednesday, November 20, 2013 - link
You can easily replace the HDD in the PS4, just like on the PS3.
peterfares - Wednesday, November 20, 2013 - link
Glad to hear it works fine without Kinect. I won't be plugging mine in.
epyclytus - Wednesday, November 20, 2013 - link
Hi, I've read the Ars review as well as others and have gathered some interesting info. From what I've gathered, everything is "rosy..." Without quibbling over aesthetics and personal/subjective choices, such as the design of the console/controller and video game comparison side-by-sides, which are hard to decipher on YouTube anyway, I am really against the grain... firmly against the grain of Sony and/or Microsoft adopting such measly hardware inside these consoles. I can forgo a big/fat controller and box (XBO) if the hardware, I feel, has some muscle in it. Or hide the more elegant PS4 under a shelf and behind some stuff to make it quieter and/or add a fan to keep it cool and from bricking, if the inside of it had some oomph! I mean, come on, the Jaguar CPU is a tablet CPU and the GPUs are like cheap $100 GPUs. Not only cheap, but these GPUs aren't even the recently released GPUs from AMD found in the R9 290X.
The PC people seem to be praising this shift to inferior/cheap hardware as if it is a "good thing" for the industry just because the architecture is the same as their $3000 PCs. Give me a break.
Please explain to me why this is good for games, for gamers and for the industry, if the tech is not moving forward but semi-backwards.
In 2006, people complained that the PS3 cost too much. Well, the tech in the PS3 at that time didn't exist! $600 wasn't too much, and add in Blu-ray, which at that time was an infant! Now the PS4 is at a more agreeable price point, but the inside of the machine is parts from a year ago. Why is this good?
Developers are saying they can squeeze more out of the consoles than their PC counterparts, as if to say, "Yes! These consoles don't have the oomph of higher end PCs, but games will run better on them anyway because we can optimize games better on consoles... yada-yada-yada."
The most upsetting part about all of this is that the games, visually and innovatively speaking, will suffer. And yes, the graphics are better and the new consoles are more powerful than their predecessors, but they're not that much more powerful. GTA V for PS4 will look better, yes. BF4 looks almost like a high end PC, yes. But this is probably where it will end, graphically speaking. I mean, the graphical wall will be hit a lot quicker this gen than last gen, I think. So by next year, the best graphical masterpiece possible on either console will already be here. Correct me if I'm wrong. And if developers can't squeeze everything out of these consoles by next year, then something is wrong with the developers or these consoles or whatever, since developers should already know how to develop for an x86 console given it's the same architecture as PCs, which have been around since 1980 or whatever. I just don't see any games in the future that will be mind blowing, is my greatest fear.
But really, I'm just a little upset that... 1) they went with x86, and 2) the x86 they went with isn't that powerful.
Good review though.
djboxbaba - Wednesday, November 20, 2013 - link
What would you say would have been a better alternative to x86? I personally find the change to x86 fine, but the gimping of the hardware... well, I definitely agree with you there.
epyclytus - Wednesday, November 20, 2013 - link
Well, glad you asked. If I were to build my dream console, I would build it to meet and exceed the fastest Intel/AMD CPU out there in terms of raw performance for gaming/graphics applications. At least 8 cores, of course, and a full RISC CPU like the Cell processor in the PS3, or rather its successor; if that doesn't exist, I'd have them build a CPU from the ground up like what Apple is doing with ARM CPUs. In this case, if it can't beat the fastest mainstream CPU from Intel/AMD in terms of raw performance, then I'd add more cores to it. So maybe a 16-core Sony/ARM RISC CPU that is 64-bit. I know RISC CPUs are more scalable, so adding more cores will give it that edge in raw performance. Then I would add 8GB of XDR3 RAM, which I think is in development and will be faster than GDDR5 (I think). This is for bandwidth and for future-proofing the system to meet its 6-10 year life cycle. The GPU would have to be discrete, and I would probably ask Nvidia instead of AMD to make one specifically for this dream console, since Nvidia has more power-efficient cards. This dream Nvidia GPU would be like the upcoming Maxwell GPU that isn't even out yet. That is how advanced my dream console is. And even though it's not an x86 machine and is RISC-based, developing for this dream console would be the same as developing for anything else. The 8GB of XDR3 RAM is shared between the CPU and GPU, btw. What else am I missing? Yeah, maybe the console would be bigger than the PS4, but it would be as sleek and have a higher price point. But you can't argue that the inside of this dream console is just slapped together.
Oh, the sound chip is also discrete. Adds cost. But whatever.
Blu-ray standard.
The I/O of the machine is standard, so SATA 3 or whatever.
Wi-Fi is 802.11ac standard.
Maybe the price will be $899. But that is XDR3 RAM, a 16-core RISC Sony/ARM CPU, an Nvidia Maxwell GPU, a dedicated sound card, Wi-Fi ac, and a 1TB 7200rpm HDD.
flyingpants1 - Wednesday, November 20, 2013 - link
Great, congrats. And it would be utterly pointless because no one would buy it.
epyclytus - Wednesday, November 20, 2013 - link
Well, the price is a little steep, but the tech inside is state of the art, emerging, nonexistent technology as of right now. Maybe the console wouldn't have made launch this year, but maybe next fall with all of the specs I mentioned. Considering how fast Apple updates their iPhone hardware and the buzz around ARM and MIPS getting back into the RISC CPU race, I don't think it's inconceivable that an electronics giant like Sony, in partnership with ARM or MIPS, could have co-developed a fast, multi-core RISC CPU that can compete with a desktop Intel i7 or a future AMD Steamroller CPU. Or maybe even Samsung and Sony, since Samsung also makes CPUs and they have a fab. I don't know; I am sort of pulling this out of my butt. But it's a dream console, after all, and my knowledge of the CPU marketplace is nonexistent, so I've got nothing to go by except Google searches about these things.
Someone also mentioned that a CPU is fundamentally different from a GPU, and they're right. A CPU isn't as fast as a GPU, and a GPU isn't as fast as a CPU, on certain tasks. But what bridges those gaps is a RISC CPU built from the ground up, sort of like the Cell processor but obviously more powerful, that can do CPU tasks well and GPU tasks well. My proposal for a Maxwell GPU in this dream console is also important, since Nvidia is incorporating an ARM chip in their upcoming 800 series of GPUs to do what GPUs can't do. So the Maxwell version of this dream console would forgo that ARM chip, because there are already 16 of those cores in the proposed RISC CPU of my dream console. My dream console is basically a video/graphics powerhouse where the CPU and GPU talk to each other synchronously or asynchronously but aren't dependent on each other, and the XDR3 memory controller feeding them is also part of this, to give it massive bandwidth. Also, since the CPU and GPU are co-developed and built with this application in mind, the entire console would only pull around 200 watts at load. Maybe less.
I know I'm dreaming, and it will never happen. Well, it will eventually happen, but I was hoping it would happen sooner and in a console. Why? Consoles are great platforms for diverging/emerging tech. Or should be. Sort of like what Apple is doing with iPhone and iPad hardware, but obviously much more powerful. Much, much, much more powerful, since consoles don't have to be as small as an iPhone or iPad...
/end babble
/end dream
epyclytus - Thursday, November 21, 2013 - link
Just wanted to add that what I'm proposing is a CPU and GPU that are both CPU and GPU, if that makes sense. So, theoretically, the CPU can be a GPU and the GPU can be a CPU, so it's like having a dual GPU setup such as found in PCs, and/or dual CPUs, or possibly more.
I know. Advanced stuff....
Hubb1e - Wednesday, November 20, 2013 - link
Haha. You want to build a RISC CPU from the ground up to be more powerful than an Intel i7, use RAM that isn't even available yet, and use a graphics core that hasn't been finished yet? I'm not saying that's impossible, but it would be more expensive than the whole Manhattan Project to build the first nuclear bomb.
x86 chips are already available and relatively fast, Jaguar chips are easily scalable to new processes, and DDR3 and GDDR5 RAM are already in full volume production. Graphics are just a matter of adding more blocks, and the minor difference in relative power consumption of AMD vs Nvidia is a moot point, as Nvidia is incapable of creating an APU with a decent CPU in it.
I love the idea of an APU in these boxes because it makes so much sense, but my ideal box would have been 7870-type graphics performance coupled with a 6-core CPU based on AMD's new Steamroller core running above 3 GHz.
For RAM I would have taken the old approach they used to do with the 780G chipset and used 1GB of GDDR5 for the GPU coupled with 8GB of DDR3 accessible by either the GPU or the CPU.
Power consumption would have been similar to the XB360 on initial launch but they should have been able to build that within the size of this XBone.
epyclytus - Wednesday, November 20, 2013 - link
Wow. Your dream console might not even be that much more powerful than what is already in the PS4. And your VRAM configuration is worse, since it only has 1GB of GDDR5.
I swung for the fences with my specs because it is all made up of dreams.
Subyman - Wednesday, November 20, 2013 - link
I wonder how the rise of DDR3 prices has affected MS? I'm sure they purchased contracts at fixed prices a while ago, but going forward it seems DDR3 prices aren't much better than GDDR5 right now. The cost savings may not have been worth it looking at the current marketplace.
Morawka - Wednesday, November 20, 2013 - link
I'm so tired of these companies putting a big cache on the chip's die to make up for poorly chosen memory interfaces.
Apple did it with the A7 in the iPhone and iPad, and now Microsoft is doing it with the XBone.
Just spend the die space on a beefy memory interface and call it a day. Sure, the memory interface is going to take up more space on the chip, but it's better than wasting even MORE space on eSRAM/cache.
Apple could have just put in a beefy memory controller and called it a day; instead they put in 4 MB of cache, which takes tons of die space and serves as a stopgap solution.
Microsoft could have just gone with GDDR5 and called it a day, but instead went with DDR3 and wasted tons of die space on eSRAM.
Sigh, just beef these things up and call it a day, especially if they are going to be on the market for the next 8 years.
blacks329 - Saturday, November 30, 2013 - link
While your complaints are valid, Apple will probably address them within 12 months with the A8, so I don't see it as that big of a problem for them. On the X1 side, they gambled wrong and we're kind of stuck with it until ~2020.
Braumin - Wednesday, November 20, 2013 - link
I wonder how much of the cross-platform comparison is just that, due to time constraints, the Xbox didn't get optimized very well. Unfortunately it looks slightly harder to code for.
I'll be curious to see how this goes moving forward. Do games like Forza 5 also have the aliasing problems? Other reviews have just said that it looks great.
Also - Anand - you've outdone yourself. Your preview is better than most reviews I've seen.
GTVic - Wednesday, November 20, 2013 - link
"One" or "the One" is not a good shorthand/nickname. I prefer XBone or X-Bone.piroroadkill - Wednesday, November 20, 2013 - link
I thought it was Xbox 180 after all their U-Turns...djboxbaba - Wednesday, November 20, 2013 - link
hahaha awesomemikato - Monday, November 25, 2013 - link
Why is everyone abbreviating "box" and not "one"? Like XbOne or XbO or Xb1. And the capitalization. All I read when I see this is "X Bone". I'll just call it that now. I guess there are difficulties with confusion with the original Xbox? The Scion xB? lol. I have an xB and owners call the current model the xB2.prophet001 - Wednesday, November 20, 2013 - link
"Those concerned about their privacy will be happy to know that Kinect isn’t required for use."As opposed to those people who don't care about a video camera watching their living room 24 hours a day.
My word people. Wake up.
kyuu - Wednesday, November 20, 2013 - link
I don't care. Why should I? The only thing that goes on in my living room is playing games and watching TV. So even in the unlikely event that the Kinect camera is feeding somebody (NSA? Microsoft interns? Who exactly am I supposed to be afraid of again?) a 24/7 feed of my living room and somebody is actually looking at it, big whoop.
I'm not planning on purchasing either console, btw. Just irritated by the tin-foil hat brigade pretending it's reasonable to be scared by the Kinect.
kyuu - Wednesday, November 20, 2013 - link
Oh, and not to mention that if that is actually taking place, it'll be found out pretty quickly and there'll be a huge backlash against Microsoft. The huge potential for negative press and lost sales for absolutely no gain makes me pretty sure it's not going on, though.
prophet001 - Thursday, November 21, 2013 - link
How sad.
Microsoft, Google, Sony, and any other corporation out there have absolutely zero right to my privacy, whether I am or am not doing anything "wrong." You, my friend, will not know what you've lost until it is truly gone.
mikato - Monday, November 25, 2013 - link
I don't think it will be a problem (see kyuu), but I really disagree with your "nothing to hide" attitude.
http://en.wikipedia.org/wiki/Nothing_to_hide_argum...
Floew - Wednesday, November 20, 2013 - link
I recently built a Steam box. With a 360 controller/wireless adapter and Steam Big Picture set to launch on startup, it's a surprisingly console-like experience. Works much better than I had expected, frankly. My motivation to plunk down cash for the new consoles is now very low.
Quidam67 - Wednesday, November 20, 2013 - link
Anand, just wondering if the Xbox One controller works with a Windows-based PC (as per the 360 controller)? Would be great if you could try that out and let us know :)
The Von Matrices - Wednesday, November 20, 2013 - link
The wireless Xbox 360 controller required a special USB receiver to work with a PC, and that took a few years to be released. I don't know if Xbox One controllers are compatible with the 360 wireless controller receiver or if a new one is required. I actually liked the wired Xbox 360 controller for certain PC games, and I'm curious to know if Microsoft will make wired Xbox One controllers.
Quidam67 - Sunday, November 24, 2013 - link
Targeted to work with PCs in 2014, apparently: http://www.polygon.com/2013/8/12/4615454/xbox-one-...
errorr - Wednesday, November 20, 2013 - link
There is a lot of discussion about the memory bandwidth issues, but what I want to know is how latency affects the performance picture. The SRAM latency might be an order of magnitude lower even if the SRAM is small. What workloads are latency-dependent enough that the Xbox design might have a performance advantage?
khanov - Wednesday, November 20, 2013 - link
It is important to understand that GPUs work in a fundamentally different way to CPUs. The main difference when it comes to memory access is how they deal with latency.
CPUs require cache to hide memory access latency. If the required instructions/data are not in cache there is a large latency penalty and the CPU core sits there doing nothing useful for hundreds of clock cycles. For this reason CPU designers pay close attention to cache size and design to ensure that cache hit rates stay north of 99% (on any modern CPU).
GPUs do it differently. Any modern GPU has many thousands of threads in flight at once (even if it has, for example, only 512 shader cores). When a memory access is needed, it is queued up and attended to by the memory controller in a timely fashion, but there is still the latency of hundreds of clock cycles to consider. So what the GPU does is switch to a different group of threads and process those other threads while it waits for the memory access to complete.
In fact, whenever the needed data is not available, the GPU will switch thread groups so that it can continue to do useful work. If you consider that any given frame of a game contains millions of pixels, and that GPU calculations need to be performed for each and every pixel, then you can see how there would almost always be more threads waiting to switch over to. By switching threads instead of waiting and doing nothing, GPUs effectively hide memory latency very well. But they do it in a completely different way to a CPU.
Because a GPU has many thousands of threads in flight at once, and each thread group is likely at some point to require some data fetched from memory, the memory bandwidth becomes a much more important factor than memory latency. Latency can be hidden by switching thread groups, but bandwidth constraints limit the overall amount of data that can be processed by the GPU per frame.
This is, in a nutshell, why all modern pc graphics cards at the mid and high end use GDDR5 on a wide bus. Bandwidth is king for a GPU.
The Xbox One attempts to offset some of its apparent lack of memory bandwidth by storing frequently used buffers in eSRAM. The eSRAM has a fairly high effective bandwidth, but its size is small. It still remains to be seen how effectively it can be used by talented developers. But you should not worry about its latency. Latency is really not important to the GPU.
I hope this helps you to understand why everyone goes on and on about bandwidth. Sorry if it is a little long-winded.
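To put rough numbers on the latency-hiding idea above, here is a minimal back-of-the-envelope sketch in Python; the latency and per-thread-group work figures are illustrative assumptions, not specs of either console:
# Little's-law-style estimate of how much parallel work a GPU needs resident
# to hide memory latency. All figures are illustrative assumptions.
MEMORY_LATENCY_CYCLES = 400   # assumed round-trip cost of a DRAM access
ALU_CYCLES_PER_ACCESS = 20    # assumed ALU work a thread group does between accesses

# Thread groups a compute unit must keep resident so that while one group
# waits on memory, the scheduler always has another with ALU work ready.
groups_needed = -(-MEMORY_LATENCY_CYCLES // ALU_CYCLES_PER_ACCESS)  # ceiling division
print(f"~{groups_needed} resident thread groups per CU to cover a miss")

# The bandwidth side of the argument: one 4-byte pass over a 1080p frame at
# 60 fps is already about half a gigabyte per second, and a real engine does
# the equivalent of many such passes (G-buffers, shadow maps, texture fetches).
bytes_per_frame = 1920 * 1080 * 4
print(f"~{bytes_per_frame * 60 / 1e9:.2f} GB/s for a single 4-byte pass at 1080p60")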
F00L1Sh - Friday, November 22, 2013 - link
I found this explanation very helpful.
beefgyorki - Wednesday, November 20, 2013 - link
Anand, when MS initially talked about the Xbox One OS design, from their description it certainly sounded like the Xbox OS (i.e. the gaming OS) was just a VM running on top of a hypervisor. Given that, in theory that VM could be modified to be made runnable on, say, a Windows desktop PC or potentially even a tablet.
With one in hand now, is there anything that can be done to shed some light on that possibility?
To me the most intriguing aspect of XB1 is the OS if it truly is just a VM because that could open up some really interesting possibilities down the road.
flyingpants1 - Wednesday, November 20, 2013 - link
What do you mean "just a VM", don't you realise the Xbox 360 OS was running in a VM too?Elooder2 - Thursday, November 21, 2013 - link
This. Was Xbox360 on an x86 CPU? No. But Xbone is. Therefore it seems logical to consider that if there is a possibility of somehow "extracting" the actual VM from the XBone, it could be made to run on a normal Windows PC with much less modification and hassle than the Xbox360 VM because there's no need to worry about the difference in architecture. Basically, I perceive that the biggest deterrent to making an "emulator" of the XBone (via a VM) is some form of software or hardware DRM. The Mac has a similar mechanism in Mac OS which will not let you install that OS on a regular PC because the regular PC doesn't have some extra chip that the boot code of the OS install disc looks for. As we all know, this was quite successfully cracked and Hackintoshes are plentiful. Ok, so Microsoft is not Apple and they may go down on anyone releasing an XBone emulator, but it doesn't mean it can't be done. It would seem much easier to produce an emulator for a console that uses, basically, almost, off-the-shelf parts.PliotronX - Wednesday, November 20, 2013 - link
Good lord the Xbone falls short. The embedded SRAM is irrelevant, trading outright strength in 3D for faster operations tied to the subsystem is a failing strategy dating back to the PSX and Sega Saturn.Teknobug - Wednesday, November 20, 2013 - link
Looks like PS4 wins not only in hardware specs, but graphics visuals. The only difference maker between the two seems to be game titles. I would have bought the Playstation 4 if Gran Turismo 6 was coming out for it but nope they released it for the PS3, bummer. I have Forza 2, 3, 4 for X360 and will not get Forza 5 after how Turn10 turned Forza 4 into a cash cow with DLC cars.warezme - Wednesday, November 20, 2013 - link
Exactly, it is huge failure on the MS side and I suspect many a game developer will eventually reveal just how limiting their decision has been. Overall for the two consoles that I would consider to be a modern investment of 3 to 5 years, these are pretty pathetic hardware examples. Current gen PC's are already way ahead and the difference will only continue to surpass these consoles.Homeles - Wednesday, November 20, 2013 - link
Actually, what's wrong with you? It's pretty common knowledge that ROPs are huge consumers of memory bandwidth in a GPU, and with the Xbone having half of them, memory bandwidth becomes far less of an issue.Get educated.
Spunjji - Tuesday, November 26, 2013 - link
Less of an issue at a given performance level. Your performance becomes gated by the ROPs instead, so it's still a bloody stupid design decision for a "next gen" console.
Sabresiberian - Wednesday, November 20, 2013 - link
Frankly, I'm disappointed in both of them in an age where PCs are moving to 2560x1440 as a standard, 120Hz, and G-Sync. These consoles are simply already dated, even more so than at the release of the Xbox 360 and PS3. Good on the upgrades, but I simply can't see buying one over a PC I can build for around $500. (To be fair, it would cost you closer to $700 if you buy pre-made, but I'll point out that almost everyone already has a PC. $500 for a PC and $400 for a console means spending more money, not less, for less capability; it only makes sense if you need 2 different pieces of hardware so one person in the family can use one while the other uses something else.)
The only thing consoles offer is existing community. If all your friends play on an Xbox or PlayStation, it is hard to buy a PC instead. However, that isn't a plus, it is a minus, because it sets apart gamers that want to play together. It polarizes those gamers that are emotionally attached to one or the other, and that is just bad for everyone. The good news is that Microsoft is talking about making it so PC players can play with Xbone players - but how is that going to affect the quality of the PC versions? Are they going to have to be capped in terms of game responsiveness and frame rates in order to level the playing field?
Don't get me wrong; I'm not bashing console players themselves. And I get the attraction to cool hardware, I'm even tempted a bit myself, just 'cause "cool hardware", despite the limitations involved. And there's the whole playing-with-others thing; having both consoles would mean I didn't have to exclude people I want to game with. But I'd feel like I'd be supporting a part of gaming that I really believe is bad for gamers in this day and age, so I won't be buying a console.
(And, don't give me any freakin tired, old arguments about controllers and a "different experience". It simply is not true, you can use any console controller on a PC. There is absolutely, categorically nothing you can do on a console that you can't do on a PC, except connect with exclusive communities and play exclusive games. Exclusive communities are bad for gamers as a whole, exclusive games are bad for gamers, too. Crappy hardware is bad for everyone.)
Sabresiberian - Wednesday, November 20, 2013 - link
Sorry about the emotion in the last paragraph, but it irritates me that some console players have to make up excuses for their decision. If you decide to buy a console, that's all good, but don't cut your nose off to spite yourself by purchasing one for reasons that simply aren't true.
Sabresiberian - Wednesday, November 20, 2013 - link
"yourself" is a typo, should be "your face". :)
PliotronX - Thursday, November 21, 2013 - link
That's very true, but then they've always lagged PC gaming. The closed proprietary system is a double-edged sword. SDKs designed for a specific system can eke every last drop out of said system, but then it's basically set in stone. I honestly don't think most people's eyes are attuned to the blur without G-Sync, but they will notice true 1080p gaming. They (PC, PS4, Xbone) all still serve their roles. The Xbone just happens to veer off into Netflix territory a little too hard.
ydeer - Thursday, November 21, 2013 - link
I agree, I’m not as excited about consoles as I used to be. What I am really excited about is SteamOS.
Most reasonably priced gaming PCs have the potential to compete with this generation of consoles if Valve (somehow, magically) manages to bring down the overhead using Linux.
Plus you get the community. And a controller that at least has the potential to work better than anything we have used so far (see Civ5 on Steam Controller demo). And holiday sales. Upgradable hardware.
Heck, I can even see myself dual booting SteamOS on a MBP with the Steam Controller to play the latest and greatest games at almost equal quality to "next-gen" consoles, but completely mobile.
Please Valve, don't mess this up.
Wall Street - Thursday, November 21, 2013 - link
First off, 1440p, G-Sync and 120 Hz are all technologies that cost $250+ for the monitor alone and really demand another $300 on the GPU, so they are not comparable to the PS4 or Xbox One.
Secondly, how can you build a gaming rig for $500? $100 is the Windows license. Another $100 gets a PSU, a case and a Blu-ray drive (and a really cheap case and PSU at that). Another $100 needs to be spent on a HDD and RAM. Now we are at $300 and don't have a motherboard or CPU. A good CPU and CPU cooler costs $150, even for a cheaper CPU (with a stock cooler, the console would be much quieter than the desktop). At least $50 needs to be spent on a motherboard. That leaves you almost nothing of your $500 budget for a GPU. As you can see, this leaves you with a system that underperforms the consoles. I would also argue that a $500 system needs to cheap out on components, leaving you with worse build quality than a console, which is more similar to a premium SFF PC (which costs a premium relative to full-sized). Also, this cost analysis doesn't include a monitor or peripherals, so if you don't have a PC or only have a laptop, that is at least another $150 (many more people have TVs, and fewer people have monitors sitting around now that laptops have been a majority of PC sales over the past five years).
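Tallying those line items makes the budget squeeze easy to see; here is a quick Python sketch using the ballpark late-2013 prices from the comment above (the commenter's rough figures, not researched quotes):
# Quick tally of the rough $500 build sketched above; the prices are the
# commenter's ballpark figures, not researched quotes.
build = {
    "Windows license": 100,
    "PSU + case + Blu-ray drive": 100,
    "HDD + RAM": 100,
    "CPU + cooler": 150,
    "Motherboard": 50,
}
budget = 500
spent = sum(build.values())
print(f"Non-GPU parts: ${spent}, leaving ${budget - spent} of a ${budget} budget for the GPU")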
Hixbot - Sunday, November 24, 2013 - link
PC gaming is superior, but as long as developers leave out the local multiplayer elements of their console counterpart, a console will always have a spot in my home. You know, gaming in the living room with actual friends. I'd hook up my gaming PC to my TV and get some controllers, but there are basically no PC games that offer any decent local multiplayer options.mikato - Monday, November 25, 2013 - link
They do but you need to have all the computers in the same room. Pain in the butt, but we do it a couple times a year.Lonesloane - Thursday, November 21, 2013 - link
What about the noise of both new consoles? Anand is not commenting on that in the article, but after my experience with a Xenon 360 this is really important to me. Could you add that information?
JimmiG - Thursday, November 21, 2013 - link
It's funny how PC hardware reviews obsess over tiny differences in memory bandwidth, shader throughput and clock speeds, yet the PS4 having 40% greater shader throughput and 160% more memory bandwidth just doesn't seem to matter...blzd - Thursday, November 21, 2013 - link
Did you read the article? It was pretty clear and even pointed out that it makes real-world differences. Maybe you thought they'd outright denounce the XB1 for it?IKeelU - Thursday, November 21, 2013 - link
Those "obsessions" in the PC-sphere are academic exercises to underline the differences between otherwise very similar pieces of silicon. Good GPU reviews (and good PC builders) focus on actual game performance and overall experience, incl. power and noise.And of course it matters that the PS4 is has a better GPU. It's just that native 1080p vs upscaled 720p (+AA) isn't a world of difference when viewed from 8-10 feet away (don't take my word for it, try for yourself).
But like Anand states in the article, things might get interesting when PS4 devs use this extra power to do more than just bump up the res. I, for one, would trade 1080p for better effects @ 60fps.
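To put rough numbers on the deltas being argued over here, below is a back-of-the-envelope sketch using the commonly cited launch specs (18 CUs at 800MHz and 176GB/s GDDR5 for the PS4, 12 CUs at 853MHz and 68GB/s DDR3 for the Xbox One); treat the exact figures as approximations.

```python
# Back-of-the-envelope comparison of the launch GPU specs quoted above.
# Figures are the commonly cited ones; treat them as approximations.

def gflops(cus, clock_mhz, shaders_per_cu=64, ops_per_clock=2):
    """Peak single-precision throughput of a GCN GPU block, in GFLOPS."""
    return cus * shaders_per_cu * ops_per_clock * clock_mhz / 1000.0

ps4_gflops = gflops(cus=18, clock_mhz=800)   # ~1843 GFLOPS
xb1_gflops = gflops(cus=12, clock_mhz=853)   # ~1310 GFLOPS

ps4_bw_gbs = 176.0   # unified GDDR5
xb1_bw_gbs = 68.3    # main DDR3 pool (the 32MB ESRAM is much faster, but tiny)

print(f"Shader throughput: PS4 ~{ps4_gflops / xb1_gflops - 1:.0%} higher")
print(f"Main memory bandwidth: PS4 ~{ps4_bw_gbs / xb1_bw_gbs - 1:.0%} higher")
# => roughly +41% compute and +158% bandwidth, i.e. the "40%" and "160%"
#    figures quoted in the comment above.
```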
chelsea2889 - Thursday, November 21, 2013 - link
Great comparison of both products! Has anyone else heard of Why Remote though? I heard it has face and hand gesture recognition and apparently integrates with different types of social media and streaming apps. It seems pretty cool, I'm looking forward to seeing them at the upcoming CES convention!greywolf0 - Thursday, November 21, 2013 - link
Wow, and I thought the Xbox One was just significantly handicapped in both memory bandwidth and GPU cores. Now I learn about this magical third thing called ROPs, of which the Xbox One literally has only half as many as the PS4; it noticeably affects perceived resolution and is even lower than the standard AMD configuration for proper 1080p output. More nails in the Microsoft coffin.

If you want to talk exclusive games and variety, the PS4 has more than enough bald-headed space marine games and yet-another-space-marine-FPS oversaturation to satiate any Halo desires, if you even had one to begin with. What you won't find on the Xbox One, however, is all the exclusive Japanese-made games, because let's face it, the Xbox is gonna sell poorly in Japan regardless, and that means no incentive to even make a half-assed port for the Xbox. This means all the JRPGs and quirky Japanese adventure and indie games are not coming to Xbox, just like last gen.
And Microsoft just opened a Scroogled store selling more anti-Google paraphernalia, a continuation of their asinine and low-brow tactics and culture. They continue to be nothing but assholes day in and day out. They may have curbed their evil corporation ambitions with the backlash from their Xbox mind-control "features", but they show no sign of letting up anywhere else. I didn't think I could care much about tech companies, as they are all in it for money, but Microsoft continues to be the most morally reprehensible one around. A company not worth supporting or saving. To be shunned. It helps that all their recent products have been absolute, out-of-touch flops, from Windows Phone to Windows RT and 8. Ditto the Xbox power grab.
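For what it's worth, the ROP gap mentioned above is easy to put in pixel terms. The sketch below uses the widely reported counts and clocks (32 ROPs at 800MHz vs 16 at 853MHz), so take the results as rough theoretical peaks rather than measured numbers.

```python
# Rough pixel-fillrate comparison implied by the ROP counts discussed above.
# ROP counts and clocks are the widely reported launch figures.

def fillrate_gpix(rops, clock_mhz):
    """Peak pixels written per second, in gigapixels (1 pixel per ROP per clock)."""
    return rops * clock_mhz / 1000.0

ps4 = fillrate_gpix(rops=32, clock_mhz=800)     # ~25.6 Gpix/s
xb1 = fillrate_gpix(rops=16, clock_mhz=853)     # ~13.6 Gpix/s

frame_1080p = 1920 * 1080                       # ~2.07 Mpix per 1080p frame
budget_60fps = frame_1080p * 60 / 1e9           # Gpix/s needed to touch each pixel once

print(f"PS4: ~{ps4 / budget_60fps:.0f} ROP writes per pixel per frame at 1080p60")
print(f"XB1: ~{xb1 / budget_60fps:.0f} ROP writes per pixel per frame at 1080p60")
# Plenty in theory either way, but overdraw, blending, MSAA and multiple
# render targets eat into that headroom fast -- which is where a halved
# ROP count can start to show up as a lower native resolution.
```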
UltraTech79 - Thursday, November 21, 2013 - link
>More nails in the Microsoft coffin.

Drama queen. This shit just doesn't matter in consoles unless you're a fanboy of one side or another. What matters is how good the games play when they're done and in your hands.
immanuel_aj - Friday, November 22, 2013 - link
Have to agree with you on the Japanese exclusives. They either take forever to get ported or don't get ported at all, unless it's a big title. I never got a PS3, but the PS4 seems like a good place to start and hopefully there'll be more indie stuff from Japan as well. I'm just waiting for a limited edition console to be released before getting one! Though using a Japanese PSN account is a bit of a pain sometimes. However, I don't think the PS4 has that many bald-headed space marines ;)
jonjonjonj - Thursday, November 21, 2013 - link
i agree there's always someone crying about power costs. if the $5 a year in power is that big of a deal then you probably shouldn't be spending $500 on an xbox and $60 a year on xbox live.tuxfool - Friday, November 22, 2013 - link
Or alternatively they might care about the environment. Multiply all that "wasted" power by everyone and it adds up. This is doubly true when the tasks that power is apparently being used for don't really require it.maxpwr - Friday, November 22, 2013 - link
Both "next generation" systems are increadibly weak and outdated. Not enough performance for Oculus Rift, let alone 4K displays.cheshirster - Friday, November 22, 2013 - link
Please, stop your squarephobia.Origin64 - Friday, November 22, 2013 - link
I actually feel this generation is pretty bad for innovation. The PS3 and 360 made sense at the time. They were very fast machines for the money. Sony sold PS3s at a loss for years; MS, I dunno.

I feel like time has kind of caught up with that kind of console. What's the use of building a whole new OS when these machines are x86 and fast enough to run Linux? Why focus on all kinds of closed, proprietary online features when all that has been done before - and better - by volunteers building freeware? You can build a PC that's comparable performance-wise and competitive on price with these machines if you rip some parts out of an old one and replace the PSU/mobo/CPU/gfx. Everyone can find a battered old PC that you can screw new parts into. People throw the things away if they get a little slow.
Then you can have the power of running what you want on the machine you paid for. Complete control. It'll save money in the long run.
PhatTran - Friday, November 22, 2013 - link
My sincere advice is you should buy a PS4 now. Why? You can see here: http://lovingtheclassicsreviewsite.net/trend/6-rea...lilkwarrior - Friday, November 22, 2013 - link
It is a shame hearing about the reported use of SATA II and the lack of 802.11ac in both consoles. Given that some game titles are over 40GB in size, something tells me that'll need to be addressed with the inevitable Xbox One Slim and PS4 Slim models that'll come out about 2-3 years from now.
Especially Microsoft's odd stance on not allowing the hard drive to be removed. The PS4's wireless limitations are sort of an odd decision given the stigma of their PSN network being slow and unresponsive; any help from the hardware to be at current standards and future-proof.
Excluding China, obviously, some of the fastest broadband infrastructures in the world (e.g. South Korea) are based in Asia. I would think that they would have at least taken Microsoft's route of making dual-band 802.11n connections available.
It's weird that even Sony's standard phones connect to the internet and download over WiFi faster than their flagship console can. It makes little sense.
Disappointed I'll have to wait this gen out for 2-3 years. By then, hopefully the ripple effect of SteamOS, Steam Controller, G-Sync, Mantle, Oculus Rift, 4K gaming, and so on will be evident enough to even consider buying either console outside of exclusives.
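Since the 40GB figure keeps coming up, here is a rough sketch of what those interface choices mean in practice; the link throughputs below are typical real-world ballparks (my assumptions, not measurements of either console).

```python
# Rough transfer-time math for a 40GB game over the interfaces discussed above.
# Throughputs are typical real-world ballparks, not measurements of either console.

GAME_GB = 40
game_bits = GAME_GB * 8e9  # bits to move

links_mbps = {
    "802.11n 2.4GHz (congested, real-world)":  40,
    "802.11n 5GHz dual-band (real-world)":    150,
    "802.11ac (real-world)":                  400,
    "Gigabit Ethernet":                       940,
}

for name, mbps in links_mbps.items():
    hours = game_bits / (mbps * 1e6) / 3600
    print(f"{name:<42} ~{hours:.1f} h")

# SATA II (~250 MB/s usable) vs SATA III (~550 MB/s) matters far less here:
# a typical 5400rpm laptop-class HDD tops out around 100 MB/s anyway, so the
# drive, not the SATA revision, is the bottleneck for installs.
```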
lilkwarrior - Friday, November 22, 2013 - link
*any help from the hardware to at least accommodate common American wireless speeds and be a bit more future-proof would have helped improve the perception of PSN for Sony.vol7ron - Friday, November 22, 2013 - link
In the Gravity demo, about 0:02 in, it was interesting to see the difference in the astronaut falling. To me, it appeared that the 360 had higher contrast, but there were also other inconsistencies. A black bar ran across the leg of another astronaut in the scene -- I suspect this was debris -- but more notably the 360's face shield was blacked out, whereas the XB1 showed the astronaut's full face.
In terms of quality, due to the higher contrast, it actually seemed like the 360 won out there. However, as expressed, in all the other scenes despite brighter lighting, the XB1 had much better detail and noticeable edges -- the 360 was much softer and less defined.
What I don't understand is the naming convention. Why XB1? It's not the first XBox,
Blaster1618 - Friday, November 22, 2013 - link
No Minecraft yet on PS4, that's a deal breaker for our family. I want to drive a McLaren P1.blitzninja - Saturday, November 23, 2013 - link
All this talk about specs and even the "higher-spec console loses the war" nonsense is so stupid, just stop.

You guys here on AnandTech need to realize that you live in your own little bubble, and while you may know a lot about the consoles, the casual consumer market (which is most people) has different priorities. So why did Nintendo beat its competitors with the Wii despite having horrible specs? The experience.
Yes, there is a performance difference between the PS4 and the XO, but what really matters is how the console feels and whether it does what people want it to do. This is where the Wii comes in (the Wii U was a flop because they actually went backwards in this regard). Most of the console market is made up of casual gamers. Casual gamers like to invite their friends over and have a LAN party or play party games, play with their family (this includes younger audiences), watch movies together and play music at times. The Wii dominated the market because its new control interface(s) added the missing piece for this market; it was extremely versatile and made playing it all the more fun compared to the other consoles.
This is why Nintendo didn't really beef up the Wii U; they just added the extra power to allow for more advanced and precise gesture computation.
So why isn't the Wii U dominating again? Well, for starters, most people who have a Wii are satisfied with it and aren't out to buy a new one; the Wii U doesn't add anything spectacular that would make the majority of its target market want to upgrade.
The reason the higher-spec console ended up losing is that when the company developed the console, they focused their resources on performance and as a result cut back on the usability and experience aspects. But that isn't necessarily always the case; it all depends on what the focus experience of the console is (the market) and how well polished that experience is.
If Microsoft wants to win the war it needs to pander to the needs of the casual market. Not to say it should copy Nintendo, but it has another market: the all-in-one market. That is to say, make the XO a future PVR, set-top box, and media/streaming centre, and replace the HTPC with a low-cost alternative. Most decent HTPCs fall into the $500-$700 range for those who want some light gaming too. The XO would absolutely destroy this market with the proper hardware and software support, being a console for mid-to-high-end gaming while still being a multimedia powerhouse that does a multitude of things. This includes voice recognition, a killer feature, if done right.
If I could say "latest episode of the walking dead" or some other show and it worked, then gg Sony, you just got rolled.
@AnandTech: Fix your forum/comment software, not having an edit button is really annoying
Hixbot - Sunday, November 24, 2013 - link
The Wii dominated sales at first; they captured a market of casual gamers that otherwise wouldn't have bought a console. That market didn't buy many games (low attach rate), and they grew tired of the Wii, what with all the smartphone and Facebook games etc. Wii sales slumped, and in the end the x360 and PS3 were each outselling it by 2012.
Us hardcore gamers who are also Nintendo fans bought the Wii, but it left a bad taste in our mouths. The outstanding titles were few and far between, and the rest was shovelware. True motion control never really materialized in many games; most just made use of a "waggle" gimmick.
Wii-u comes out, casual gamers have already moved on, and the hardcores are reluctant to jump into another gimmick "tablet" just for the Nintendo software.
Disclosure: As a big N fan, I bought a wii-u for the Nintendo 1st party titles. Others like me are the only people buying this thing.
Exodite - Saturday, November 23, 2013 - link
Thanks for the mini-review, much appreciated! Some interesting technical information no doubt. Personally I'm more keen on the PS4, primarily due to having good experiences with Sony equipment in general as well as the price. We currently have a Sony BluRay player (the BDP-470S) and I'd have loved to replace it with a gaming-capable alternative (that also does Netflix) but alas that's unlikely unless Sony can squeeze in CD, MP3 and most importantly DLNA support in the machine.
Anyway, I'm also concerned about the sound levels of the machines as I have quite sensitive ears and I find even my current BluRay player to be something of a hair dryer when playing back discs. BD discs in particular.
ol1bit - Saturday, November 23, 2013 - link
That was an awesome mini review! One of the best reviews I've read about these new titans. I'm really surprised that the 8-year-old 360 hardware is as close as it is! A tad old now, but a great book to read on the old hardware is "The Race for a New Game Machine". This book really shows how MS pulled some fast ones on Sony and ended up with the better plan.
This time it looks like Sony really kept everything under wraps better and has at least a slight upper hand. There is no way MS can make its hardware better/faster at this point. Good enough? Maybe... Time will tell.
nerd1 - Saturday, November 23, 2013 - link
It's funny that those 'next gen' consoles are actually on par with some gaming laptops and nothing better. PC is the best gaming console again.kryteris - Sunday, November 24, 2013 - link
It would make sense to provide a revised wireless adapter option in the future that plugs in (w/ an out) to the aux port. Baffles me that it is not 5GHz wireless N or the new AC standard.kryteris - Sunday, November 24, 2013 - link
"but after talking with Ryan Smith (AT’s Senior GPU Editor) I’m now wondering if memory bandwidth isn’t really the issue here." So what are you wondering after speaking with him? That it is the ROP's being halved?Broo2 - Monday, November 25, 2013 - link
It appears my Xbox One does use HDMI-CEC. During setup it tested my TV (Samsung) and cable box (DirecTV) and both are being controlled without an IR blaster. Perhaps this was added in a final update?NoSoMo - Wednesday, November 27, 2013 - link
Why no review of the 4K features? Certainly Anand makes more than enough from the site to get a decent one.MrCoyote - Friday, December 6, 2013 - link
4K will not happen with this new generation of consoles. 4K video takes too much bandwidth for streaming. A new high capacity disc is still being developed to store 4K movies. The current consoles don't have enough power for 4K games. Be glad you have 1080p.Serjonis - Tuesday, December 24, 2013 - link
Hi Anand, can you help me out? I'm looking for a UPS ("no-break") to connect my PS4, Xbox One, PS3 and a 42" LCD TV. Which one would be appropriate for the total power consumed? Thanks in advance!
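Not speaking for Anand, but the sizing math is simple enough to sketch. The wattages below are rough typical-load guesses for each device (check the rating labels on your own units), and the 0.6 power factor is a conservative assumption for inexpensive consumer UPS ratings.

```python
# Rough UPS ("no-break") sizing for the setup described above.
# Wattages are rough typical-load guesses; check the labels on your own gear.

loads_watts = {
    "PS4 (gaming)":       140,
    "Xbox One (gaming)":  120,
    "PS3 slim (gaming)":  100,
    '42" LCD TV':         110,
}

total_w = sum(loads_watts.values())   # ~470 W
power_factor = 0.6                    # conservative for cheap consumer UPSes
headroom = 1.25                       # leave ~25% margin

needed_va = total_w / power_factor * headroom
print(f"Total load: ~{total_w} W -> look for a UPS rated ~{needed_va:.0f} VA or more")
# => roughly a 1000 VA class unit; realistically you'd only run one console
#    plus the TV at a time, which relaxes this a fair bit.
```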