
Post by Rob Collie

Best and Fastest Computers for Power Pivot, DAX, and Power BI - WE WANT TO KNOW!

Looking for the Fastest Desktops, Laptops, Tablets, and Servers!

“What’s a good computer for running Power Pivot?”  That’s a question we get asked all the time.  And we do have lots of advice – things that make a good CPU, RAM considerations, etc. – but that always translates to a hunting license of sorts. 

It would be much nicer for us to say, “here are three models in your price range that we recommend.”

And to be comfortable doing that, we need to specifically test for speed.

So if you have five minutes, please take a moment and run a quick test.  We provide a benchmark workbook and instructions.

The Links!

Excel 2010 Benchmark Workbook

Excel 2013 Benchmark Workbook

Submit Your Results Here

The Results!

We will leave this open for a week or so, then summarize the results back here on the blog, along with some very specific recommendations on particular hardware models in various price ranges and form factors.

Rob Collie

One of the original engineering leaders behind Power BI and Power Pivot during his 14-year career at Microsoft, Rob Collie founded a consulting company in 2013 that is 100% devoted to “the new way forward” made possible by Power BI and its related technologies. Since 2013, PowerPivotPro has rapidly grown to become the leading firm in the industry, pioneering an agile, results-first methodology never before seen in the Business Intelligence space. A sought-after public speaker and author of the #1-selling Power BI book, Rob and his team would like to help you revolutionize your business and your career.

This Post Has 17 Comments
  1. The issue with the download links is that the final parameter of the URL, i.e. dl=0, causes Dropbox to try to render the files in the browser. That hangs Firefox or Chrome. If you change it to dl=1, the files download directly.

  2. Done! Nice test, I’m interested in the results. But why did you choose not to ask about the RAM amount in the systems?

  3. Agree with Bas on memory (if the user is running 64-bit)… I’m assuming you can tell 64-bit from the version number?

  4. I found this VBA code at http://stackoverflow.com/questions/198409/how-do-you-test-running-time-of-vba-code. It sets up a millisecond timer in VBA. Here’s a link to the VBA code I used to let the computer time itself.

    https://onedrive.live.com/redir?resid=e47c22701ccc5c4b!40001&authkey=!AB2oCOAPEHExAu8&ithint=file%2ctxt

    Just paste that into a VBA module. Macro1 runs the first test and Macro2 runs the second test. The time in milliseconds will appear in a MsgBox after the run. A second MsgBox shows the start and end times in hh:mm:ss as a rough check figure.

    Both macros click the Start Here slicer, then Click Here to Prime the Engine, then Test 1 or Test 2. The timer starts before the test and ends after it.

    I’m not really sure how the millisecond timer works. If anyone has some insight into this, I would love to hear it.

    I added PtrSafe to the function so that it will work on 64-bit.

    Also, I’m fairly new with VBA and programming in general, so most of the code was done with the macro recorder.

    Cheers!
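    [Editor’s note: the millisecond timer in the Stack Overflow thread above is typically built on the Windows QueryPerformanceCounter API, which is what the PtrSafe keyword makes safe to declare in 64-bit Office. A minimal sketch of that pattern — an illustration, not the commenter’s exact code:]

    ```vba
    ' High-resolution timer for VBA, built on the Windows performance counter.
    ' Currency is a 64-bit scaled integer, so it can hold the counter value;
    ' the scaling cancels out when we divide counts by frequency.
    #If VBA7 Then
        Private Declare PtrSafe Function QueryPerformanceCounter Lib "kernel32" (lpPerformanceCount As Currency) As Long
        Private Declare PtrSafe Function QueryPerformanceFrequency Lib "kernel32" (lpFrequency As Currency) As Long
    #Else
        Private Declare Function QueryPerformanceCounter Lib "kernel32" (lpPerformanceCount As Currency) As Long
        Private Declare Function QueryPerformanceFrequency Lib "kernel32" (lpFrequency As Currency) As Long
    #End If

    ' Returns the current time in seconds, with sub-millisecond resolution.
    Public Function MicroTimer() As Double
        Dim counts As Currency, freq As Currency
        QueryPerformanceCounter counts
        QueryPerformanceFrequency freq
        If freq <> 0 Then MicroTimer = counts / freq
    End Function

    ' Hypothetical usage: time one benchmark step and report it in ms.
    Sub TimeOneTest()
        Dim t0 As Double
        t0 = MicroTimer
        ' ... run Test 1 or Test 2 here ...
        MsgBox Format$((MicroTimer - t0) * 1000#, "0.0") & " ms"
    End Sub
    ```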

  5. RAM-wise, it’s just much harder for people to report on the type of RAM and its speed. I wanted to keep the survey short, and we can always just recommend “get the fastest RAM package” on any machine we end up recommending.

    CPUs are *FAR* more finicky than RAM, and very often the “better” CPUs underperform relative to “budget” offerings. And you can’t rely on clock speed as a good indicator either.

    So this is primarily aimed at CPUs. Since we’re also asking for the make and model of the computer itself, we can kinda reverse-engineer the RAM specs, but again, the CPU is the most sensitive variable.

    (Also, we deliberately sized the benchmark data set to fit into either 32- or 64-bit systems. There are pros and cons to that approach, but we’re just looking for a “coarse” bucketing of CPUs, and so far the results appear to be VERY decisive in that regard. Let’s see if the trend holds.)

  6. Really interested in hearing back on the results of this test. A coworker and I were just discussing today whether a RAM upgrade might improve the performance of a heavy Power Query manipulation model refresh. We didn’t consider CPU performance as much, mainly because we don’t have the ability to get CPU upgrades where we work. But if the results of this survey are conclusive in that regard, we may be able to make a business case. On this particular workbook, the “free” memory in Performance Monitor sits at 0 for pretty much the whole Power Query portion of the refresh, though there is still “available” memory. We are working with 8 GB of RAM.
    I’m curious whether anyone has tuning advice to improve performance on an existing setup, or any thoughts on the benefits we might see upgrading to 12 or 16 GB? We have tuned the SQL queries about as much as we can, given the disparate nature of our enterprise’s dataset.
    Look forward to hearing the results!
    Kellan

  7. The two cores of my laptop are oddly running at different speeds, so I entered the higher one. I also found that running multiple times gave very different results – the first time I ran Test 1 I got over 7 seconds; subsequent times it was under 5.

  8. 1st try: 7.2/3.2; 2nd try: 4.0 and 2.3. You didn’t specify whether only the first runs were relevant, but as Jeff pointed out, the first test would not be enough to “fire up” the CPU, I guess 🙂 Or maybe it went faster because the results were stored in memory.

    1. Sorry for the spam, I didn’t find an edit button. I restarted the file and made sure the CPU was running at a higher clock rate; results went down to 2.7 and 1.7. Still, the CPU was only running at 2.7 out of 3.4 GHz. The file doesn’t seem “heavy” enough for a representative test; it depends a lot on the current CPU clock rate. Will be interesting to see the results anyway 🙂

      1. I think we will do this again at some point too. As hardware advances, we need to continually refresh our recommendations.

        A learning process for sure. But the results are still pretty damn clear regardless. We can absolutely make recommendations based on what we are seeing.

    2. Yeah, the engine can cache results, so the second run-through is not particularly valid.

      I debated making the instructions more complex, clearly specifying the “first run only,” but instructions are always dicey as they get longer. So I decided to just identify outlier results in the data and exclude them. Not perfect, but good enough.

  9. Are the results in now? Not sure if on a different post somewhere or if you are still collecting data.

  10. Hey Rob,

    Can you please share the configuration of the system you use for your BI workshops? I have been interested to know this for a long time. Please don’t disappoint us!

    Also, can you write a post on the best system config, or a particular system model/brand that you’d suggest, for those of us riding the BI wave?

    Bob
