Decision Support benchmark: Nieuws.be

Operating System: Windows 2008 Enterprise RTM (64-bit)
Software: SQL Server 2008 Enterprise x64 (64-bit)
Benchmark software: vApus + real-world "Nieuws.be" database
Database size: > 100 GB
Typical error margin: 1-2%


The Flemish/Dutch Nieuws.be site is one of the newest Web 2.0 websites, launched in 2008. It gathers news from many different sources and lets readers completely personalize their view of all this news. Needless to say, the Nieuws.be site sits on top of a pretty large database: more than 100 GB and growing. The database consists of a few hundred separate tables, which have been carefully optimized by our lab (the Sizing Server Lab).

Almost all of the load on the database consists of selects (99%), and about 5% of those are stored procedures. Network traffic averages 6.5 MB/s and peaks at 14 MB/s, so our Gigabit network connection still has plenty of headroom. The Disk Queue Length (DQL) is around 2 in the first round of tests, but we only report the results of the subsequent rounds, where the database is in a steady state. We measured a DQL close to 0 during these tests, so there is no tangible interference from the hard disks.
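As a quick sanity check on those network numbers, a back-of-the-envelope calculation shows how much headroom the link really has (a sketch; the 1 Gbit/s link speed and the measured rates are the only inputs, all taken from the text above):

```python
# Back-of-the-envelope headroom check for the benchmark's network link.
LINK_MB_S = 1000 / 8          # 1 Gbit/s ~= 125 MB/s theoretical maximum
avg_mb_s, peak_mb_s = 6.5, 14.0

avg_util = avg_mb_s / LINK_MB_S    # utilization at the average rate
peak_util = peak_mb_s / LINK_MB_S  # utilization even at the peak
print(f"average: {avg_util:.0%}, peak: {peak_util:.0%}")
# -> average: 5%, peak: 11%
```

Even at peak, the link is barely over a tenth utilized, which is why the network can be ruled out as a bottleneck.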

We now use a new, even heavier log. As the Nieuws.be application became more popular and more complex, the database grew and the queries became more complex too. The results are therefore no longer comparable to our previous ones: they follow a similar pattern, but the absolute numbers are much lower.

Nieuws.be MS SQL Server 2008 - New Heavy log!

Pretty amazing performance here. And while AMD gets a pat on the back, it is the hard-working people of the Microsoft SQL Server team we should send our kudos to. Our calculations show that SQL Server gains about 80% more performance from an extra 12 cores, which is simply awesome scaling. One result of this scaling is that, for once, you can see which CPUs have real cores and which have virtual (Hyper-Threading) cores: the 12-core Opteron 6174 outperforms the best Xeon by 20%. People running transactional databases should go for the Intel CPUs, while data miners should consider the latest Opteron. AMD and Intel have chosen completely opposite architectures, and the result is that the differences between the software categories are dramatic. Profile your software before you make a choice! It has never been so important.
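To make that scaling claim concrete, here is a minimal sketch. The throughput scores below are hypothetical; only the roughly 80% gain from 12 extra cores comes from our measurements:

```python
# Scaling-efficiency sketch. The ~80% gain from 12 extra cores is the
# article's figure; the throughput scores used below are hypothetical.
def relative_gain(score_12_cores: float, score_24_cores: float) -> float:
    """Fractional performance gained by doubling the core count."""
    return score_24_cores / score_12_cores - 1.0

gain = relative_gain(100.0, 180.0)   # hypothetical benchmark scores
print(f"gain from 12 extra cores: {gain:.0%}")   # -> 80%
```

Perfect linear scaling would be a 100% gain; getting 80% of that on a decision-support database workload is what makes the result remarkable.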

58 Comments

  • Accord99 - Monday, March 29, 2010 - link

    The X5670 is 6-core.
  • JackPack - Tuesday, March 30, 2010 - link

    LOL. Based on price?

    Sorry, but you do realize that the majority of these 6-core SKUs will be sold to customers where the CPU represents a small fraction of the system cost?

    We're talking $40,000 to $60,000 for a chassis and four fully loaded blades. A couple hundred dollars difference for the processor means nothing. What's important is the performance and the RAS features.
  • JohanAnandtech - Tuesday, March 30, 2010 - link

    Good post. Indeed, many enthusiasts don't fully understand how the IT world works. Some parts of the market are very price sensitive and will notice a few hundred dollars more (like HPC, rendering, and web hosting), as the price per server is low. A large part of the market won't care at all. If you are paying $30K for a software license, you are not going to notice a few hundred dollars on the CPUs.
  • Sahrin - Tuesday, March 30, 2010 - link

    If that's true, then why did you benchmark the slower parts at all? If it only matters in HPC, then why test it on databases? Why would the IDMs spend time and money binning CPUs?

    Responding with "product differentiation and IDM/OEM price spreads" simply means that it *does* matter from a price perspective.
  • rbbot - Saturday, July 10, 2010 - link

    Because those of us with applications running on older machines need comparisons against older systems in order to determine whether it is worth migrating existing applications to a new platform. Personally, I'd like to see more comparisons to even older kit in the 2-3 year range that more people will be upgrading from.
  • Calin - Monday, March 29, 2010 - link

    Some programs were licensed per physical processor chip, others per logical core. Is this still correct, and if so, could you explain it based on the software used for benchmarking?
  • AmdInside - Monday, March 29, 2010 - link

    Can we get any Photoshop benchmarks?
  • JohanAnandtech - Monday, March 29, 2010 - link

    I have to check, but I doubt that anything besides a very exotic operation is going to scale beyond 4-8 cores. These CPUs are not made for Photoshop IMHO.
  • AssBall - Tuesday, March 30, 2010 - link

    Not sure why you would be running Photoshop on a high-end server.
  • Nockeln - Tuesday, March 30, 2010 - link

    I would recommend trying to apply some advanced filters on a 200+ GB file.

    Especially with the new higher-megapixel cameras, I could easily see how some professionals would fork over the cash if this reduces the time they have to spend in front of the screen waiting on things to process.

