
Today I am going to spend some time trying to convince (and demonstrate to) Excel Power Pivot users that it is easy and valuable to learn some simple SQL code.  SQL is one of the easiest languages I have ever taught myself and you can do it too.  The reasons SQL is easy to learn include:

  • It has a very simple syntax that uses English language expressions.
  • You can start with simple short queries and get more complex only when you need to.
  • When you need to do something more complex, the Internet is just a quick Google away.
  • The chance of getting something wrong is relatively low (for these simple queries anyhow – see warning at the end).

Why You Should Care about SQL

If you use Power Pivot for Excel and you directly source your data from a database, learning some basic SQL will help you get significantly more value from the process of loading your data.   With some SQL skills, you can add significant flexibility to your data acquisition and loading processes into Power Pivot.  The same opportunity exists with Power BI but with a different UI experience – I will not cover the Power BI UI in this post.

Using SQL will:

  • Allow you to import a non-contiguous subset of records into your data tables, filtering out unwanted data – something that is otherwise difficult to achieve with the standard UI.
  • Help you significantly improve the compression, and hence performance, of your larger reports.
  • Allow you to import smaller lookup tables based on the subset of records in your data table(s).
  • Allow you to store these loading variations in a bespoke workbook rather than in the source database.

But enough of that – let’s get into it.

The Power Pivot Import Dialogue Box

One of the first dialogue boxes you will ever see in Power Pivot for Excel is this one.

image

When I first saw the 2 options in this dialogue, it was very clear to me.  I would ALWAYS use option 1 and I would NEVER use option 2.  Now I know better and I am going to share with you why I now routinely use option 2. 

Note: You can only use SQL if you have a supported database as a source.  You can’t write SQL code to selectively import CSV data for example.

Changing the Import Approach of a Loaded Table

I want to start with a data model that is already built, loaded with data, and up and running.  The sample I am using is a typical Adventure Works report that I built some time ago for another blog post.  You can download the files I used from here if you want to play along.  Make sure you unzip the files into a folder – don’t use the files directly from the zipped folder or the next steps may not work.

The reason I am doing this is that this is my most common use case – I have a report (typically large, maybe 200MB) and someone wants a variation.

Reconnect The Database

If you are following along with the samples, the first task is to reconnect the Power Pivot Workbook to the Access Database.  To do this, go to the Power Pivot Window, select Home\Existing Connections and then select the Access connection (#1 below) and then click Edit (#2).

image

Then just browse to the location of the Access database (it is in the zip file with the Excel workbook), select the Access database, Click Open\Save\Close.  This reconnects the Excel file with the database.

The Basic SQL Syntax

The basic SQL syntax is simply as follows:

SELECT one or more columns

FROM a table

WHERE a set of selection rules apply

It can be a lot more complex than this, but the point is that this is enough to get started.
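If you want to experiment with this syntax outside Power Pivot, any SQL engine will do. Here is a minimal sketch using Python’s built-in sqlite3 module as a stand-in database, with a made-up Sales table (the table and values are illustrative, not the Adventure Works data):

```python
import sqlite3

# Build a tiny in-memory database with a hypothetical Sales table
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Sales (ProductKey INTEGER, Amount REAL)")
con.executemany("INSERT INTO Sales VALUES (?, ?)",
                [(214, 10.0), (312, 25.0), (214, 5.0)])

# SELECT one or more columns FROM a table WHERE a set of rules apply
rows = con.execute(
    "SELECT ProductKey, Amount FROM Sales WHERE ProductKey = 214"
).fetchall()
# Only the two rows for product 214 come back; product 312 is filtered out
```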

Use Case 1 – Import a Subset of a Data Table

Let’s assume that you want to refresh a workbook so it only contains data for the top 10 selling products.  First create a copy of the existing workbook (the one that contains all your data) so you don’t change or corrupt your master copy.  Then use the new workbook (still containing all the data) to find the list of product codes you want in your new workbook.  In my example I want the top 10 selling products, so it is easiest to do this in a pivot table as shown below.

image

Once you have this list of values (product keys), you could go into the Power Pivot window, click on Design\Table Properties, and bring up the Import Wizard dialogue.  From there you could manually select the product codes you need as shown below.

image

Now if you just want to re-import the data for 1 or 2 products, or if you need to import a contiguous range of product codes, then this is a pretty easy process.  But if you want to select a non-contiguous set of data like in this example, it is a lot more work. And it gets worse the more items in the list you need to import.  But this problem is easy to solve with some simple SQL.  Here is how I do it.

The first step is to deselect all product codes and then select any 1 item in the list – click OK.  It doesn’t matter which one as it is going to be changed anyway.

image

Then in the same dialogue box, switch from Table Preview to Query Editor as shown below.

image

You will then see the SQL code generated by the wizard (shown below).

image

The SQL code is not very well formatted and is hard to read, so the next thing I like to do is format the SQL code.  I normally use the following website but you can use any online service you like. http://www.sql-format.com/

I copy and paste the SQL code from the Power Pivot window above and then paste it into the formatting service and then click Format.  It gives me something that looks like this

image

I then copy and paste the code back into the dialogue box in Power Pivot, and it looks like this below.

image

Now the reason I selected any single product from the table preview editor earlier in the process is because this forced the table preview editor to write some more of the SQL code for me.  See how it added a WHERE clause?  If there is no WHERE clause, then the query will simply bring back all the records.

The last part of the code is as follows

FROM [Sales]
WHERE ([ProductKey] = 214)

The next step is to replace this WHERE clause with a list of product keys.  I am using an IN statement to do this as follows:

FROM [Sales]
WHERE [ProductKey] IN (312, 310, 313, 314, 311, 361, 353, 363, 359, 357)

Note that I have removed some unnecessary extra brackets from the second line of the original code.  You don’t need to be a programmer to work out how this works.  It basically says “bring back the columns I have asked for from the Sales table where the product key is in this list of values”.  The reason the list of columns is so long (in this case) is because I selected a subset of columns when originally importing my data.  If I had selected every column, it would simply say SELECT * FROM [Sales].
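To see the IN filter in action, here is a small sketch using Python’s built-in sqlite3 as a stand-in for the source database (the table and values are made up for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Sales (ProductKey INTEGER, Amount REAL)")
con.executemany("INSERT INTO Sales VALUES (?, ?)",
                [(214, 10.0), (312, 25.0), (310, 7.5), (999, 3.0)])

# WHERE ... IN (...) keeps only the rows whose key appears in the list
top_products = (312, 310, 313, 314, 311, 361, 353, 363, 359, 357)
placeholders = ", ".join("?" for _ in top_products)
rows = con.execute(
    f"SELECT ProductKey, Amount FROM Sales "
    f"WHERE ProductKey IN ({placeholders})",
    top_products,
).fetchall()
# Only products 312 and 310 match; 214 and 999 are filtered out
```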

To get the list of product codes, you can use the normal Excel tricks to concatenate them into a single string separated by commas.  To do this, I normally get my list of values into a Pivot Table, sort the pivot based on sales (in this case given I am after the top 10 products), copy the product codes from the Pivot Table and then paste them as values somewhere in the spreadsheet.

image

Once you have done this, you can manually combine them into the IN statement needed using Concatenate or similar.  Once I had done this a few times I figured some VBA code would be better.  I wrote this VBA code (below) that allows you to select/highlight a range of cells containing the data (product numbers saved as values in this case) and then run the VBA code to create the IN statement.  Note this range of values can’t be the pivot table, it has to be values in cells.

Here is the code if you want to use it (one version for text values, one for numbers).

Sub CombineCommasSQLCodeText()
    ' Combine the selected cells into a single SQL IN clause,
    ' wrapping each value in single quotes (for text values).
    ' The result is written into the first cell of the selection.
    Dim Cell As Range, mySelection As Range
    Set mySelection = Selection
    For Each Cell In mySelection
        If Cell.Address <> mySelection.Address Then
            If Cell.Address <> mySelection.Range("A1").Address Then
                If Cell.Value <> "" Then
                    ' Append this value to the running list in the first cell
                    mySelection.Range("A1").Value = mySelection.Range("A1").Value & "', '" & Cell.Value
                    Cell.Clear
                End If
            End If
        End If
    Next Cell
    ' Wrap the combined list in the IN ('...') syntax
    mySelection.Range("A1").Value = "in ('" & mySelection.Range("A1").Value & "')"
End Sub

Sub CombineCommasSQLCode()
    ' Same as above, but for numeric values (no quotes needed).
    Dim Cell As Range, mySelection As Range
    Set mySelection = Selection
    For Each Cell In mySelection
        If Cell.Address <> mySelection.Address Then
            If Cell.Address <> mySelection.Range("A1").Address Then
                If Cell.Value <> "" Then
                    ' Append this value to the running list in the first cell
                    mySelection.Range("A1").Value = mySelection.Range("A1").Value & ", " & Cell.Value
                    Cell.Clear
                End If
            End If
        End If
    Next Cell
    ' Wrap the combined list in the IN (...) syntax
    mySelection.Range("A1").Value = "in (" & mySelection.Range("A1").Value & ")"
End Sub
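If you prefer to build the IN list outside of Excel, the same concatenation can be sketched in a few lines of Python. This is a hypothetical helper mirroring the two VBA versions above (numbers emitted bare, text single-quoted):

```python
def make_in_clause(values):
    """Build a SQL IN (...) clause from a list of values.

    Numbers are emitted bare; anything else is single-quoted,
    with embedded quotes doubled (standard SQL escaping).
    """
    parts = []
    for v in values:
        if isinstance(v, (int, float)):
            parts.append(str(v))
        else:
            parts.append("'" + str(v).replace("'", "''") + "'")
    return "IN (" + ", ".join(parts) + ")"

print(make_in_clause([312, 310, 313]))       # IN (312, 310, 313)
print(make_in_clause(["Bikes", "Helmets"]))  # IN ('Bikes', 'Helmets')
```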

After running the VBA or manually concatenating the values you will have this:

IN (312, 310, 313, 314, 311, 361, 353, 363, 359, 357)

Now all you need to do is copy the IN statement and add it to the SQL code in the Query Import dialogue.  Paste it in, click validate (to make sure you haven’t made any mistakes) and then save.

image

The Sales table is then re-imported with a subset of records.

image

Use Case 2 – Reduce the Size of a Lookup Table

Now that the Sales table contains a small list of 10 products, it is not necessary to bring in every product from the products table.  It is easy to fix this using the exact same approach as above – in fact you can use the exact same WHERE clause (assuming the name of the product code column is the same in both the Sales and the Products tables).

  • Go to the products table
  • Click on Table Properties
  • Change the product key filter to select a single product
  • Switch to the Query Editor
  • Format the Text with an online formatter
  • Copy the formatted SQL code back to Power Pivot
  • Reuse the IN statement from before to limit the selected products to be the same ones in your sales table.

You will end up with this.  Validate the formula and then save.

image
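The idea behind this use case can be sketched with Python’s sqlite3 as a stand-in database (the table and product names are illustrative): the same IN list that filtered the Sales table also trims the lookup table down to matching products only.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Products (ProductKey INTEGER, ProductName TEXT)")
con.executemany("INSERT INTO Products VALUES (?, ?)",
                [(312, "Road-150"), (310, "Road-250"), (999, "Unsold item")])

# Reuse the exact same IN list that filters the Sales table
rows = con.execute(
    "SELECT ProductKey, ProductName FROM Products "
    "WHERE ProductKey IN (312, 310, 313, 314, 311, 361, 353, 363, 359, 357)"
).fetchall()
# Only products 312 and 310 survive; product 999 is filtered out
```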

Use Case 3 – Sort Data on Load to Improve Compression

As I covered in my last blog post, you can gain significant compression improvements if your data tables are a lot larger than 2 million rows.  To gain the benefits, you will need to sort the data on load.  To do this, simply do the following.

  • Go to your data table
  • Switch to the Query Editor
  • Format the query if you like
  • Add an ORDER BY clause to the end.

See the example below.  Now in my test workbook there are only a couple of thousand rows, so this won’t make any difference to my workbook.  However, if you have millions of rows, this can make a big difference (read the previous blog post I mentioned above to find out why).  You will need to work out which column to sort by in your own data, with some trial and error and an understanding of the cardinality of your data.

image
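As a sketch of what ORDER BY does (again using sqlite3 as a stand-in engine with made-up rows), the clause simply groups identical key values together on load, which is what helps the compression engine:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE Sales (ProductKey INTEGER, OrderDate TEXT, Amount REAL)"
)
con.executemany("INSERT INTO Sales VALUES (?, ?, ?)",
                [(312, "2016-01-03", 5.0),
                 (310, "2016-01-01", 9.0),
                 (312, "2016-01-02", 7.0)])

# Appending ORDER BY to the query sorts the rows as they are loaded,
# clustering repeated keys together
rows = con.execute(
    "SELECT ProductKey, OrderDate, Amount FROM Sales ORDER BY ProductKey"
).fetchall()
keys = [r[0] for r in rows]  # repeated 312s now sit next to each other
```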

Use Case 4 – Reduce the Size of an Indirect Lookup Table

This next example is not beginner SQL and will not work with Access, but it is a good demo of what can be done.  You can copy the pattern I show below or invest some time learning some more complex SQL.  Keep in mind that I am an Excel user with no formal SQL training, and I can now write this code quickly and easily when needed.  If I can learn it, other Excel users can learn it too.  Basically, what this code does is:

  • Finds a list of customer codes that have purchased the products in question
  • Creates a temporary table of these customer codes
  • Uses the temporary table to extract just those customers into the customer table.

I use this pattern to retrieve a subset of dimension tables from a SQL Server DB all the time.  If your data source is SQL Server you would do this as follows (same general process as before).

Go into the Customer table and copy the SQL code (just copy what is there), then format the code with an online formatter.

image

This time I have left it in the formatting tool to show you what I am doing.  If I am doing this in real life, I typically complete this step in SSMS, but the process shown below is fine too.

Now go into the Sales table and copy the SQL code from there too.  Paste it before the SQL from the Customer table in the editor.  You will have something like this.

image

Then I edit the first SQL statement (Sales) so that all it does is create a list of Customer Numbers that have purchased this product (shown below).

image

Then I turn this code snippet into a temporary table that I can reuse in my query.  There are various ways of doing this, but here is what I do (shown below).  This gives me a temporary table called ‘temp’ that contains all the customer numbers I need in my Power Pivot Workbook.

image

Then I change the second part of the SQL statement to just extract those customers as follows.

image

Copy the code back to the customer table in Power Pivot, validate and save.
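Putting the steps together, the WITH … INNER JOIN pattern looks roughly like the sketch below. I’ve used Python’s sqlite3 as a stand-in engine (SQLite also supports WITH); the table and column names are illustrative, not the exact Adventure Works schema.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Sales (ProductKey INTEGER, CustomerKey INTEGER)")
con.execute("CREATE TABLE Customers (CustomerKey INTEGER, FullName TEXT)")
con.executemany("INSERT INTO Sales VALUES (?, ?)",
                [(312, 1), (310, 2), (999, 3), (312, 1)])
con.executemany("INSERT INTO Customers VALUES (?, ?)",
                [(1, "Alice"), (2, "Bob"), (3, "Carol")])

# WITH defines the temporary 'temp' table of distinct customer keys;
# the INNER JOIN then keeps only the customers that appear in it
rows = con.execute("""
    WITH temp AS (
        SELECT DISTINCT CustomerKey FROM Sales
        WHERE ProductKey IN (312, 310)
    )
    SELECT Customers.CustomerKey, Customers.FullName
    FROM Customers
    INNER JOIN temp ON Customers.CustomerKey = temp.CustomerKey
""").fetchall()
# Carol (key 3) only bought product 999, so she is filtered out
```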

Consolidation Case – Bespoke Workbooks

All of these examples have one thing in common.  The code to extract a different set of data is saved inside the Query Editor in a duplicate copy of a Power Pivot workbook.  If you need the same query over and over, then it would be much better to ask someone to create a query/view for you in the database itself.  However, for those times when you want to build an ad hoc report, I think it is much better to store your queries in the workbook itself.  It is accessible to you (the report author) and doesn’t pollute the source database with one-off queries that will rarely be reused.  So the end state of the steps outlined above is a bespoke workbook that contains all the information for the specific bespoke need, without polluting your SQL Server DB.

A Warning to Finish

One thing to be aware of – you can do bad things to your database server if you write bad SQL.  You can literally bring the SQL DB to a standstill.  If you do this, it is possible that someone who administers the server will come after you.  Worst case, they may ban you from refreshing your workbooks against the server.  So I suggest you start out simple and make sure you know what you are doing before you get too tricky.

I would love to hear from others about what other SQL tricks they use with Power Pivot.  I have thought about creating parameter tables to store the list of values and then use Power Query to manage it all, but I am not an expert on Query Folding so I have not gone down this path (yet anyway).

Matt Allington

Matt Allington is a Microsoft MVP specialising in Power Pivot, Power BI and Power Query consulting and training, based in Sydney, Australia. Visit Matt's blog here.

This Post Has 39 Comments
  1. I love your final point and it reminds me of when I started tinkering with SQL in Power Pivot. I tried to use our corporate MRP system to build a nice calendar table that tied each day to a fiscal month, calendar month, calendar week, etc. I felt this was a fairly simple request for 3-4 years prior (really only like what, 1600 lines of data and 7 columns??). The run would take forever and I just gave up on it after a few tries. A week later an email had filtered down to me with the subject “Nasty Query”, and someone who had an off-conversation with me linked the query detail in the email with me mentioning that I was trying to do this. Apparently my query was bogging the system down significantly and the IT folks asked me what I wanted so they could just pull it for me. I hung that email up as a badge of honor :).

    1. That’s so funny. I actually originally wrote “someone will come after you” but decided to tone it down a bit. And if you write queries over a live system rather than a data warehouse, the potential problems are magnified in their impact. 🙂

  2. Matt- Good stuff as usual. I am a huge fan of using SQL to pull data into PowerPivot for the very reasons you highlight. I would like to point out that you CAN use SQL when importing both Excel and CSV data. I blogged about it some time back here: https://brentpearce.wordpress.com/2013/12/04/powerpivot-csvs-the-sql-way/

    Truthfully I am using “CSVs the SQL Way” less and less as Power Query matures and offers other side benefits (like parametrized queries) – but sometimes it is still the best solution.

    1. Thanks Brent, I didn’t know you could do that. The schema.ini setup looks like some effort, and I agree that Power Query is probably a more user-friendly way for CSV files now.

  3. Matt – Nice article. SQL and PowerPivot are a phenomenally powerful combination. I think you can do Case 4 using Access though (or accessing Access through PowerPivot. Here is the code:

    SELECT DISTINCT
    ( Customers.CustomerKey ),
    Customers.GeographyKey,
    Customers.FullName,
    Customers.BirthDate,
    Customers.MaritalStatus,
    Customers.Gender,
    Customers.YearlyIncome,
    Customers.Occupation,
    Customers.HouseOwnerFlag,
    Customers.NumberCarsOwned,
    Customers.AddressLine1,
    Customers.AddressLine2,
    Customers.Phone
    FROM Sales
    INNER JOIN Customers
    ON Sales.CustomerKey = Customers.CustomerKey
    WHERE Sales.ProductKey IN ( 312, 310, 314, 311, 361, 353, 363, 359, 357 );

    Note that I changed [Name] to [FullName] as “Name” is a reserved word in Access.

    1. Nice. Like I said, there are lots of ways to do these things. I found the WITH pattern on the Internet, learnt it, and now use it all the time. When I realised it didn’t work with Access I didn’t bother to try to find an alternative, as personally I don’t use Access anymore. I was forced to move to SQL Server Developer Edition for file size reasons. Once you make the move, you would never touch Access again without a very good reason. For anyone that is reasonably capable in Access and is considering making that move to SQL Server Dev or Express, I recommend you bite the bullet and do it.

      1. I would love to work with SQL server or Oracle or some other tool more robust than Access, but I am not a database administrator (thank goodness) and do not have access to these tools (no pun intended). Consequently, I have learned to function within the constraints of the Access/Excel paradigm by becoming proficient in SQL and VBA (and now learning DAX and M). I bet this applies to a lot of us “data gurus” out there. Thanks again for informative article.

        1. Any chance your company will let you install SQL server Express on your laptop? That is what I did (I have developer edition now). It is a bit daunting to move from Access to SQL server on your PC – even the install process scared me. But seriously it is worth it.

          1. Thanks for the suggestion. I work for state government and am somewhat constrained by a standard software configuration and rigid IT policies, but this sounds intriguing. I will investigate SQL Server Express and see if I can get approval for a software exception. You never know, I have managed to get R and Python installed. 🙂

          2. This is definitely the way to go, if you’re able to.

            One approach I have recently started using is pulling data from a reporting server through Power Query, and using lookup tables and such in the excel data model to format the output parameters I am looking for. Then, loading the PowerPivot to a server on my PC in an SSAS database, and then creating the reports in a new excel file using a connection to the SSAS server.

            The result is most of the heavy processing being done on the servers, while my Excel files are less than 50KB each and tied to the data (for regular updating).

  4. I’m a huge proponent of utilizing SQL with Excel and PowerPivot. I work to analyze close to a million rows of data daily and I just did a compression test by sorting my data and it saved me over 46%

    The memory size of my model (708,429 Rows) was 295,642 KB and after I sorted it by the largest row it went to 159,004 KB. A savings of 136,638 KB or 46.22%!!

    The standard way I develop a query is to use SSMS (SQL Server Mgmt Studio) to test my queries there and pull down the data first, and then use the “Write a Query that will specify the data to import” option. I also like to format the query to help make it a little more usable. I’ve used the SQL formatter Matt mentions, but one of the better add-ins I have is RedGate’s SQL Prompt.

    If you’re able to learn and teach yourself DAX and PowerPivot, Power Query, Power BI, etc… you can definitely teach yourself SQL as well. Having this skill will only increase your ability to bend and analyze any type of data any way you choose.

  5. Because your article makes it so easy (thank you), I will try to devote some time to learning this. However, be aware that since I have stuffed so much new stuff into my brain over the last 6 years (mainly due to Rob), something else will have to go. I hope that it will not be something I really need to remember. 🙂

  6. I work with Excel 2010, and as I’m sure you are aware, the data model does not play well with Power Query. To get around that, I used a slightly different approach. Instead of trying to “Load To” the data model by checking the field – “Add this to the data model” (this is greyed out in 2010) – in other words to push the data, I went into the data model to pull the data. Although I’m relatively new to PQ, I have been using connections for a while within Excel. Within the data model, I went to Design/Existing Connections. Within the “Existing Connections”, I initially found two types of data source connections – “Local Connections” and “Workbook Connections”. The “Workbook Connections” I found were apparently connections made between PQ and the source data. I double clicked the connection and up popped an SQL Statement showing an SQL statement apparently established within PQ – displayed as SELECT * FROM [Append2]. I hit finish, and the info imported into the data model.

    Now here is the SQL trick I found very helpful since I’m using 2010. When I went back into the “Existing Connections” a second time, a new type of connection was listed, “PowerPivot Data Connections”. When I double clicked this connection, the “Table Import Wizard” opened up allowing me to choose the “Write a query that will specify the data to import”. I backed out, grabbed the SQL statement from the original query against PQ (SELECT * FROM [Append2]). That is – select all from the Power Query connection called Append 2. I went back into the Existing Connections, double clicked the PowerPivot Connection and pulled a look up table against the PowerPivot Data Connection by slightly changing the SQL. Select DISTINCT[Code] FROM [Append2] Order By [Code]. To add look up tables for reason codes and stores, I simply connected to the PowerPivot Data Connection again, removed [Code] from the SQL above and replaced with [RsnCode] and [Store].

    That is DISTINCT[RsnCode] FROM [Append2] Order By [Code]
    and separately as DISTINCT[Store] FROM [Append2] Order By [Code].

    If and WHEN the connection fails due to a change within Power Query, you can delete and simply reestablish the connection. And since your original file and look up tables within the data model all pull off the same connection, you need to reestablish ONE not MANY connections. Sorry if I made any errors above, I’m rushing to leave on time for work. Thanks for your post Matt!! It was very helpful.

  7. I use SQL + Excel/PP everyday! It’s the best! Especially when the user needs to pass a date to the query and have the data refresh in the dashboard.

      1. The process is a bit interesting (and I have only done it with Excel 2013). It may take me a minute, but I can type up the steps and post them here if you’d like.

  8. Hey Matt,

    Not sure if this would work in Access, but you could use a subquery in SQL Management Studio as an alternative way to attack Use Case 4 so you don’t need a temp table:

    SELECT
    [Customers].[CustomerKey],
    [Customers].[GeographyKey],
    [Customers].[Name],
    [Customers].[BirthDate],
    [Customers].[MaritalStatus],
    [Customers].[Gender],
    [Customers].[YearlyIncome],
    [Customers].[NumberChildrenAtHome],
    [Customers].[Occupation],
    [Customers].[HouseOwnerFlag],
    [Customers].[NumberCarsOwned],
    [Customers].[AddressLine1],
    [Customers].[AddressLine2],
    [Customers].[Phone]
    FROM [Customers]
    WHERE [Customers].[CustomerKey] IN
    (SELECT [CustomerKey] FROM [Sales] WHERE [ProductKey] IN (312, 310, 313, 314, 311, 361, 353, 363, 359, 357))

    You could also return a subquery in the main SELECT to do something like this (in my experience, subqueries perform better than JOINS):

    SELECT
    [Customers].[CustomerKey],
    [Customers].[GeographyKey],
    [Customers].[Name],
    [Customers].[BirthDate],
    [Customers].[MaritalStatus],
    [Customers].[Gender],
    [Customers].[YearlyIncome],
    [Customers].[NumberChildrenAtHome],
    [Customers].[Occupation],
    [Customers].[HouseOwnerFlag],
    [Customers].[NumberCarsOwned],
    [Customers].[AddressLine1],
    [Customers].[AddressLine2],
    [Customers].[Phone],
    (SELECT COUNT(DISTINCT [ProductKey]) FROM [Sales] WHERE [CustomerKey]=[Customers].[CustomerKey]) [NumberOfProductsSold]
    FROM [Customers]

  9. Hi Matt,
    no need to worry about query folding in Power Query. It will almost always work, provided you follow the rules that I’ve linked to in my blog post (which is primarily about avoiding a bug when using long parameter tables):

    http://www.thebiccountant.com/2015/09/17/filter-sql-server-queries-with-excel-tables-query-folding-limitations-and-bug-workaround/

    The most important rule is: don’t write the SQL statements by hand in the editor, but use M for the filtering instead. So you actually don’t have to use SQL statements at all if you want to query your SQL Server – just M (one less language to learn :-))

    Another main advantage is that you substantially reduce the risk of “writing bad SQL” (actually, I currently cannot think of how bad SQL would be created with M, but who knows..)

    Being able to use parameter tables in Excel where the user could actually adjust the filters without editing any code is one of the coolest features this method offers in my eyes. If doing so, you should also use this trick:

    http://www.thebiccountant.com/2015/10/31/tip-for-parameter-tables-in-power-query-and-power-bi/

    It nicely improves codewriting and readability when working with multiple different parameters.

    1. Thanks Imke for the links and advice. I have definitely been reluctant to use the parameter techniques (which I do use) because of my lack of understanding about query folding. It is good to know what to do to make it work.

    2. Hi Imke,

      I love that parameter technique in Power Query and love to use it in Excel. However, I’ve wondered if it’s possible to do the same thing in Power BI Desktop. Can you think of a way?

      Tim

      1. I can think of two. You could create a parameter table, given you can now add data directly in a table in Power BI Desktop. But I guess you can’t then edit that in the Power BI service, right? Another way is to use a disconnected table with a list of the possible values and then use a slicer to select the one that applies.

  10. Thanks Matt. I did think of those options, but neither is ideal. The disconnected table slicer is probably the best, but it doesn’t work for user-entered values. I’m hoping for something like parameters in SSRS at some point.

    1. Totally agree that Power BI falls short here (at least at the moment, but it looks as if this is in planning (if my understanding is correct): https://ideas.powerbi.com/forums/265200-power-bi-ideas/suggestions/7141377-allow-runtime-parameters-in-power-view-and-or-powe).

      In addition to Matt’s suggestions: If you’re really desperate you could add a hyperlink in your report that links to an external parameter table (Excel, SharePoint..whatever) that’s linked to your model (but you should be prepared to stand that kind of laughter you don’t normally want to hear)

      sq 😉

  11. Agree! And wouldn’t it make sense to get a MS Certification in SQL if one works with lots of data models frequently? Anyone here have the 2012 Microsoft SQL Server certification?

  12. Hi,
    I agree that SQL queries are a good solution.
    Too bad the MS Query tool is not available in Power Pivot…
    Danny

  13. I use SQL to bring data into powerpivot all the time. I usually don’t use order by, but today I was trying to sort my data by an ID column, and it’s not sorting. It seems to be doing some sort of grouping that I can’t quite figure out. ever run into a case where it is blatantly not sorting?

    My query is simple because first I created a view in SSMS, so it’s Select * from myView order by [id]

    1. How do you know it is not sorting on load? The order you see in the data view is not necessarily the sort order it was loaded with. Is your ID column a table key? If so, I would not sort by that anyway, as there would be no advantage of doing so.

  14. Matt,

    As this is the first time I learn SQL code, I copied your whole script of last snapshot in this blog post and tried to validate it but got the following error message:
    Invalid SQL statement; expected ‘DELETE’, ‘INSERT’, ‘PROCEDURE’, ‘SELECT’, or ‘UPDATE’.
    What’s wrong with it?  Could you please help me fix it?  Thanks.

      1. I was referring to the script for case 4, starting line #1 “WITH temp..”, ending line #26 “INNER JOIN temp” and line #27 “ON customers.customerkey = temp.customerkey”.  The Access version of the Adventure Works database was connected and ran without problems for the scenarios from case #1 through #3.

        1. OK. Sorry, this is an old post and not top of mind familiar to me. The WITH statement only exists in T-SQL when you are using SQL Server, it doesn’t exist in Access. Access is a consumer database tool and is not fully featured in SQL. I assume I must have loaded my Adventure Works DB to my SQL Server for this article as I was explaining how I use SQL for my work with customers when I use a SQL Server DB. Sorry for any confusion.

  15. Matt – I’m late to the party here but this is great information! In your example you sort by a single column, would ordering across the board increase performance that much more or is this really only helpful on 1 or 2 columns with low cardinality? Thanks!

    1. Sorting by multiple columns has the potential to improve performance, but it does depend on how well the engine found the best sort order originally. You just have to test and try it out.
