PerformancePoint and ADOMD.NET

I was recently configuring PerformancePoint in SharePoint 2013 and had deployed the SQL Server 2012 AdventureWorksDW Multidimensional cube for testing purposes. It had been a while since I configured PerformancePoint, so thankfully I had written this post previously. As luck would have it, I ran into pretty much every roadblock mentioned in that post…and one additional “detour” that I had almost forgotten about (that also had a new twist).

If you’re in PerformancePoint Dashboard Designer and trying to create a data source that connects to SSAS, but seeing nothing show up in your Database dropdown (as seen below…sorta), you might need to install ADOMD.NET 11.


Not only install it, but install it on all servers that run PerformancePoint, AND you’ll need to adjust the Web.config file to tell PerformancePoint to use it. This is of course assuming you’ve set everything else up properly. I’m taking for granted that you’ve already done stuff like:

  • Associate an Unattended Service Account to the PerformancePoint Service Application and granted the appropriate users the right to use the unattended account in the Secure Store Service Application
  • Double checked that the service account you’re using for your Unattended Service Account actually has access to at least one db on the SSAS instance you’re trying to connect to

If none of that sounds familiar, check out my post on secure store for Visio Services or my posts on using data refresh with Excel Services…configuring PerformancePoint should be pretty similar to those.

Note that I wasn’t getting any error messages, but instead the database dropdown was just empty. If you’re getting any type of error on the screen shown above, your problem is most likely something else and not ADOMD.NET. Still with me? Is your database dropdown still mocking you at the very moment with its emptiness? Hopefully this will cure your PerformancePoint woes.

So….let’s do this

  1. Go download ADOMD.NET 11 already, will ya? At the time of this writing you can find it here:
  2. Install it on the first farm server that’s running PerformancePoint. (Something like next, next, finish.)
  3. Once ADOMD.NET 11 is installed, navigate to \Program Files\Microsoft Office Servers\15.0\WebServices\PpsMonitoringServer
  4. Locate and make a backup copy of the web.config file found there. (Just in case you mess this all up…..I certainly don’t trust you.)
  5. Open the web.config file and scroll on down to the very bottom. Once you get there you should see something like this:
  6. Update the oldVersion attribute in the bindingRedirect element for the ADOMD.NET assembly
  7. Update the newVersion attribute to match the ADOMD.NET 11 version you just installed
  8. Save that bad boy
  9. Repeat for every server running PerformancePoint
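
For reference, the section you’re editing at the bottom of web.config is an assembly binding redirect for the ADOMD.NET client. The sketch below shows the general shape of it, but the version numbers are assumptions on my part — match the oldVersion and newVersion values against what’s actually in your file and what you just installed:

```xml
<runtime>
  <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
    <dependentAssembly>
      <!-- Redirect requests for the older ADOMD.NET client to the newly
           installed version. The version numbers here are illustrative;
           verify them against your own environment before saving. -->
      <assemblyIdentity name="Microsoft.AnalysisServices.AdomdClient"
                        publicKeyToken="89845dcd8080cc91" culture="neutral" />
      <bindingRedirect oldVersion="10.0.0.0" newVersion="11.0.0.0" />
    </dependentAssembly>
  </assemblyBinding>
</runtime>
```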

Completing the above had my database dropdown populating with an abundance of easily selectable db’s for as far as the eye could see. Some say it’s still populating new and wonderful connection options to this very day (results may vary).

I don’t recall having to do an iisreset or anything…although I think I might have closed Dashboard Designer and reopened it after making the changes. Hopefully this helps!

Using PerformancePoint 2013 and SSAS roles without Kerberos (configuring EffectiveUserName)

If you’ve spent any time with PerformancePoint, you’re probably at least somewhat familiar with the different authentication options that are available when creating a data source. Any time you create a new data source in PerformancePoint Dashboard Designer, you’re confronted with 3 choices.

  1. Unattended Service Account
  2. Use a stored account
  3. Per-user Identity


With the first two, there is also an option to pass the authenticated user’s name in the connection string using the CustomData property. I’ve covered how to use that option previously here:

What most people REALLY want is the ability to create and leverage roles in their Analysis Services cube for different sets of users. If you want different users to authenticate to different roles within your SSAS cube, you’ll need to use the “Per-user Identity” option when creating your data source. The bad news is that the “Per-user Identity” option comes with its own set of obstacles you need to overcome before it works.

The main problem with using “Per-user Identity” is the “double hop” issue. If you are unfamiliar with the double hop issue, I highly suggest you watch the following video where it is explained in very simple terms, but the short version is simply that, by default, credentials can’t be delegated more than one server away (one hop).

There are a few ways to get around the double hop issue. Most commonly, an administrator will configure Kerberos in the environment, which allows the delegation of credentials between computers more than one hop away. This is a great option if you have the appropriate security credentials (I believe you need domain admin rights) and know how to configure Kerberos. For the uninitiated, Kerberos is not an easy setup.

HOWEVER, if configuring Kerberos in your environment sounds like more than you’re ready to take on, there’s good news! In PerformancePoint 2013 we can leverage the new “EffectiveUserName” functionality that allows us to use the “Per-user Identity” option without having to configure Kerberos! AND it’s pretty easy to set up. In order to use the EffectiveUserName functionality we’ll need to accomplish the following:

  1. In SharePoint Central Administration, you need to enable the “Use the EffectiveUserName connection string property instead of Windows delegation” option for the PerformancePoint service application.
  2. The application pool account that is running PerformancePoint will need to be given admin rights on the SSAS instance you want to connect to.
  3. The “Claims to Windows Token Service” will need to be running in SharePoint Central Administration.
  4. Create a role in your cube that gives your test user(s) the correct access to the cube…otherwise they’ll just be denied access.
  5. When creating your PerformancePoint data sources, choose the “Per-user Identity” option.

For those still needing a little more guidance, the remainder of this post will walk through accomplishing the 5 requirements above and ensuring that the EffectiveUserName functionality is working in our environment.

STEP 1 - Enable the “Use the EffectiveUserName connection string property instead of Windows delegation” option.

To enable this property, you’ll need to perform the following actions:

  1. Open SharePoint Central Administration
  2. Click on the “Manage service applications” link (under the “Application Management” heading)
  3. Locate and click on the PerformancePoint service application
  4. Click on the “PerformancePoint Service Application Settings” link
  5. Locate the checkbox labeled “Use the EffectiveUserName connection string property instead of Windows delegation” and place a check in it.
  6. Scroll down to the bottom of the screen and click the “OK” button

STEP 2 - Grant the PPS application pool account admin rights on the SSAS instance

In this step we need to first find out what application pool the PerformancePoint service application is using, and what account that application pool is running under. Once we know what account is being used, we simply need to add it as an administrator on the SSAS instance we’re trying to connect to. First let’s find what app pool and account is being used by following the steps below:

  1. Open SharePoint Central Administration
  2. Click on the “Manage service applications” link (under the “Application Management” heading)
  3. Locate and highlight (do not click on the actual link/text) the PerformancePoint service application
  4. Click the “Properties” button on the ribbon at the top of the screen
  5. Scroll down to the bottom of the page until you see the “Application Pool” Section
  6. Make note of the application pool being used; we’ll need to remember this name while determining what account this app pool is running as
  7. Click the “Cancel” button at the bottom of the screen


Now that we know what application pool PPS is using, next up we’re going to determine what account the app pool is running as. Typically we can’t just look in IIS, because the actual name of the app pool is probably a GUID rather than the name we saw listed previously. There are a few different ways we can still see which account the app pool is using: by issuing a PowerShell command, or by using Central Administration. To use PowerShell, simply open up a PowerShell window and issue the following commands.

Add-PSSnapin Microsoft.SharePoint.PowerShell
Get-SPServiceApplicationPool | Select-Object Name, ProcessAccountName


PowerShell should return the name of each application pool and a related “ProcessAccountName”. It’s the “ProcessAccountName” that you’ll want to add as an administrator to your SSAS instance. If PowerShell is a bit too much for you, then you can find the account being used by following these steps in Central Administration.

  1. Open SharePoint Central Administration
  2. Click the “Configure service accounts” link (under the “Security” heading)
  3. Locate the application pool in the first dropdown (*note the application pool name may be prefixed by “Service Application Pool - “)

  4. Make note of the Account listed at the bottom of the screen
  5. Click the “Cancel” button at the bottom of the screen

Now that we have our account, all we need to do is add it to our SSAS instance as an administrator. To do that complete the following steps:

  1. Open SQL Server Management Studio
  2. Connect to your Analysis Services (SSAS) instance
  3. In the Object Explorer, right click on the instance and select “Properties” to open the “Analysis Server Properties” window
  4. Select the Security page
  5. Use the “Add” button to add the app pool account
  6. Click “OK”

STEP 3 - Ensure that “Claims to Windows Token Services” (c2wts) is running

If you’re getting errors similar to the following:

“Dashboard Designer could not connect to the specified data source. Verify that either the current user or Unattended Service Account has read permissions to the data source, depending on your security configuration. Also verify that all required connection information is provided and correct.”

Chances are c2wts isn’t running in your environment. Either way, we’ll need it to be running for our EffectiveUserName to work. To ensure c2wts is running, follow these steps:

  1. Open SharePoint Central Administration
  2. Click on the “Application Management” heading
  3. Click on the “Manage services on server” link
  4. Locate the “Claims to Windows Token Service” service, and ensure that its status is “Started”. If it is not, click the “Start” link next to it.
  5. If you had to start the service, run an iisreset or reboot the server
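
If you prefer to script it, the steps above can be sketched in PowerShell. This is a hedged sketch: it assumes you’re running it in an elevated SharePoint Management Shell on a server in the farm.

```powershell
# Load the SharePoint snap-in if it isn't loaded already
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Find the Claims to Windows Token Service instance on each server in the farm
$c2wts = Get-SPServiceInstance |
    Where-Object { $_.TypeName -eq "Claims to Windows Token Service" }

# Show which servers have it and whether it's online
$c2wts | Select-Object Server, Status

# Start any instances that aren't online yet
$c2wts | Where-Object { $_.Status -ne "Online" } | Start-SPServiceInstance
```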

STEP 4 - Create an SSAS role for your test user

Now that our SharePoint farm and data source are configured to use the EffectiveUserName, you’ll have to grant users access to your cube by creating a role in the SSAS database and adding your users to that role. In our case we’re going to add the user “contoso\test” to the role. If you don’t have a test account to use, either create one or have your Active Directory administrator create one. To create a simple role with read access, follow these steps:

  1. Open SQL Server Management Studio
  2. Connect to your SSAS instance
  3. Click the “+” next to the SSAS database that you want to give users access to
  4. Right click on the “Roles” folder and select “New Role…”
  5. Select the “Membership” page on the “Create Role” screen
  6. Use the “Add” button to add your users/test user

  7. Select the “Cubes” page on the “Create Role” screen
  8. Locate the cube you want to grant the user access to and select “Read” for the “Access” Dropdown

  9. Click the “OK” Button
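
As an alternative to clicking through SSMS, member additions can be scripted with the Analysis Services PowerShell cmdlets. This is a sketch under assumptions: the server, database, and role names below are placeholders from this example, and the role itself must already exist, since the cmdlet only adds members to it.

```powershell
# Ships with the SQL Server PowerShell components
Import-Module SQLASCMDLETS

# Add the test user to an existing role on the SSAS database.
# "localhost", "AdventureWorksDW", and "ReadRole" are placeholder names.
Add-RoleMember -Server "localhost" -Database "AdventureWorksDW" `
    -RoleName "ReadRole" -MemberName "contoso\test"
```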

STEP 5 - Set your PerformancePoint Datasources to use the “Per-user Identity” option

Now that we have everything configured, and our roles created, all that’s left to do is set our PerformancePoint data sources to use the “Per-user Identity” option when connecting to our cube. Complete the following steps to configure your data sources:

  1. In SharePoint, open the “PerformancePoint Dashboard Designer”
  2. Note who you are logged into the Dashboard Designer as. This information is displayed at the very bottom left hand corner of the Dashboard Designer. This account must be in the SSAS role we created earlier in order to be able to authenticate to the cube.

  3. Create a new “Analysis Services” data source, or open an existing one
  4. Enter the appropriate server, database, and cube information
  5. Choose “Per-user Identity” as the Authentication method

  6. Click the “Test Data Source” button

If all has gone well, you should see that your connection has been successful.


That’s really all there is to it. Now you can add additional SSAS roles and have all your AD users/group(s) authenticate into the proper role when they visit your PPS dashboard. Quite an easy setup when compared to Kerberos. Good luck and happy reporting!

How to Auto-Generate a Time Dimension

One thing almost every data warehouse has in common is a Time dimension.  There’s almost always a need to view aggregated data by day, month, quarter, year, etc.  In addition to the standards, you might also have a fiscal calendar that’s slightly different than the standard calendar.  Typically a decision is made to generate the Time dimension starting at a point in time the business feels the data will be relevant to the warehouse, and then populate it out to a point in time that’s in the distant future (I usually do about 10 years for starters).

Now you could spend a lot of time writing the SQL script that is going to create and populate your Time dimension.  However, you can also use SQL Server Analysis Services (SSAS) to do the bulk of the work for you (even if you don’t actually plan on using SSAS…or if SSAS is a little intimidating to you).

This walkthrough will guide you through using one of the built in wizards to generate and populate a Time dimension inside your SQL Server database, with very little effort.  In this walkthrough I’ll be using SQL Server 2012, but the steps are very similar in SQL Server 2008.  So let’s get to it:

1.       First things first, we’ll need to create a new Analysis Services project in SQL Server Data Tools (or BIDS in 2008).  If you’re using 2012, make sure to select the “Multidimensional and Data Mining” project, and not the “Tabular Project”.  Name it whatever you want and click OK.


2.      Once our project is created you should see several folders in your “Solution Explorer” (right hand side of your screen most likely).  Right click on the Dimensions folder as shown below and then left click on “New Dimension…”.


3.      Make sure to select “Generate a time table in the data source” and then click next.


4.      On this screen we get to choose the start and end date ranges for the dates we want to populate our time dimension with.  In addition we get to choose what additional attributes we want to include in our Time dimension, such as Quarter, Half Year, etc.  Make your selections, and then click next.


5.      If you have Fiscal or reporting calendars that are offset from the standard calendar, you can choose to include additional attributes with those offset values in your calendar.  In this example we’re not going to use any of these, but it’s important you know they’re available.  Click next.


6.      On this screen I’ve renamed the dimension from Time to DimTime just because I’m in the habit of always starting my dimension table names with Dim.  Feel free to rename your dimension whatever you want here.  YOU MUST MAKE SURE TO CHECK “Generate schema now” BEFORE YOU CLICK FINISH!  That’s why I highlighted it in yellow all pretty like.


7.      Now we’ve kicked off a new wizard (by selecting “Generate schema now” in our previous screen and clicking finish) to generate our schema.  Click next on the intro screen.


8.      Now we need to create a data source that points to the database that we want our dimension table created in.  Click new, and walk through the data source creation wizard.  It’s exactly like the data source wizard that you find in SSRS, SSIS, or SSAS.  Once you’ve created your data source, select it in the “Data source” dropdown and click next.


9.      You can tell the wizard to create or not create various constraints on the table when it gets created.  For this example we’re going to leave all options checked (default).  Just make sure the “Populate time table(s)” dropdown has “Populate” selected and then click next. 


10.  This screen contains several options that should all be pretty self-explanatory.  We’re going to leave them all as their default values and click next.


11.  Aww yeaaaah!  The finish screen!  Your screen should look something similar to mine, just review it and as long as nothing looks out of place go ahead and click finish.


12.  The schema generation process might take a few minutes.  I’d guess the wider the range of the start and end dates you picked at the beginning of the wizard, the longer it will take.  Ultimately you want to see “Generation Completed Successfully” down at the bottom left in green.


13.  Now if I open up a query window in SQL Server Management Studio (SSMS) to the same database that I pointed my data source to during the walkthrough, I see that I now have a table named DimTime (though yours will be named whatever you named it; it will be named “Time” if you didn’t specify anything).  Running a quick select * statement shows me that it is indeed populated. 
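
If you’d rather check from PowerShell than SSMS, something like the following works.  The server and database names are placeholders for your own environment, and the table name assumes you called the dimension DimTime like I did:

```powershell
# Requires the SqlServer module (Install-Module SqlServer)
Import-Module SqlServer

# Sanity check that the wizard created and populated the table.
# "localhost" and "MyDataWarehouse" are placeholders - use your own values.
Invoke-Sqlcmd -ServerInstance "localhost" -Database "MyDataWarehouse" `
    -Query "SELECT TOP 10 * FROM dbo.DimTime;"
```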



Well, that was pretty easy…way more so than writing our own scripts to generate and populate this table.  I hope everyone found this useful.  I also recently enabled comments on the blog, so feel free to leave a comment if this did or didn’t work out for you.

How to configure dynamic security in Analysis Services and Performance Point 2010 without Kerberos

Also titled – “how can you make it so users only see data that belongs to them…without configuring Kerberos”

By the end of this walkthrough you’ll hopefully have a pretty good understanding of how to implement dynamic security in a Performance Point 2010 dashboard connected to analysis services. Security in general can be a pretty tricky subject, and if you don’t understand the basics it’s easy to get lost/confused. I’ll try my best to make this as simple as possible with lots of screen shots.

If you’re not familiar with terms like “double hop”, and why you might need Kerberos configured, you should go watch this video; it’s pretty easy to follow, and will probably make the rest of the walkthrough easier to understand. (Although, you do not need Kerberos configured to implement this.)

Let’s start off by taking a look at our data source options in PerformancePoint. There are 3 different options when you create a new data source in PerformancePoint, as shown below in the highlighted yellow area.


Let’s take a minute to talk about each one and how it works.

1. “Unattended Service Account” – This is by far the easiest to implement. In this scenario you configure a service account in the “Secure Store Service Application” in SharePoint and all PerformancePoint users use those credentials when connecting to your data source.

2. “Unattended Service Account and add authenticated user name in connection string” – This method is the same as the first, with one exception. In addition to using the service account, it also passes the current user’s name, which can then be retrieved later using the CUSTOMDATA() function in MDX.

3. “Per-user Identity” – This option always connects to your SSAS data source as the currently logged in user. HOWEVER, you’ll almost certainly need to have Kerberos configured on your domain in order to make this work…and Kerberos configuration is indeed no small task…even getting the security needed to configure Kerberos in the first place can be a nightmare. The only way you won’t need Kerberos is if your SharePoint installation resides completely on one server, AND your SSAS instance and cube also happen to be on that server…and that shouldn’t be the case. It should be noted, though, that implementing Kerberos is a best practice, and if this option is available to you, it is more secure and recommended.

In order to implement dynamic security we’ll be using the 2nd option, “Unattended Service Account and add authenticated user name in connection string”. Once you’ve got your data source configured as such, go ahead and create an analytics grid report and place this MDX in it. (This is assuming you’re using the [Adventure Works DW 2008R2] SSAS project. If you’re not, I’ve highlighted the important parts below; just replace the non-highlighted parts with something that is relevant to your cube.)

WITH MEMBER [Measures].[MyUsername] AS CUSTOMDATA()
SELECT
  { [Measures].[MyUsername] } ON 0,
  NON EMPTY { [Employee].[Title].Members } ON 1
FROM [Adventure Works]

If all has gone well thus far, the results of this should show “MyUsername” with a value of the account you’re currently logged in as…something like “domain\username”. Now, in order to implement dynamic security, this user name will need to physically exist in our cube data somewhere, so that we can compare/match to it. There are a few different ways you could go about it, but for the sake of simplicity in this example, we’re going to say that there’s a dimension called DimOpportunity that has an attribute called “Account Owner” that contains our user IDs. Our goal is:

User should only see opportunities that belong to them based on who they’re logged in as.

In order to make that happen we’ll need to create a new role in our cube, and apply some “allowed sets” to the dimension that passes in our user id and makes the comparison. To create a new role in your cube you’ll want to

1. Connect to Analysis Services with SQL Server Management Studio (SSMS).

2. Select the database your cube is in, expand it, and then right click on the “Roles” folder and select “New Role…”


On the general tab of your new role, you can assign the role a name. I’m just going to leave the name as “Role” since this is just an example, but you’ll obviously want to give it a more meaningful name.


After you’ve named your role, click on the “Cubes” page as shown below, and change your Access and Local Cube/Drillthrough values to match what’s shown below (cube name has been blacked out to protect the innocent). You’ll get some additional prompts asking if you want to enable dimensions for read…yes, you do.


Next you’ll want to select the “Dimension Data” page on the left hand side, then select the dimension you want to apply the security to. (remember, this dimension will need to have the actual user names in one of its attributes, so we can compare to it) In my case I’m comparing to Dim Opportunity.


After we’ve selected our dimension, you’ll need to click on the “Advanced” tab and then select the attribute that contains our user name from the Attribute drop down box. In my case that attribute is Account Owner. We then need to create an allowed member set. Allowed member sets grant access to only the attributes that have been specifically assigned…all other attributes not specified in the set are denied. Since we want the user to see only rows that they are assigned to, we can use the STRTOMEMBER() and CUSTOMDATA() functions to define the named set like so:

{strtomember("[Dim Opportunity].[Account Owner].&[" + CUSTOMDATA() + "]")}


If we want this limitation to apply to other dimensions hanging off our fact, we’ll also need to make sure to check the “Enable Visual Totals” checkbox, as shown above.

We’re almost done! Next we’ll need to tell our PerformancePoint data source to use our new security enabled role we’ve created when it retrieves data. To do this we simply open the data source in PerformancePoint Dashboard Designer, and type our role name in the box titled “Role:”. Since our Role was simply named “Role”, that’s what you see in the screenshot below.
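
Before wiring it into a dashboard, you can sanity-check the role outside PerformancePoint. Analysis Services supports Roles and CustomData as connection string properties, so a connection like the sketch below will hit the cube as that role with a chosen CUSTOMDATA() value (the server, catalog, and user values here are placeholders from this example). In SSMS you can supply the same two properties on the “Additional Connection Parameters” tab of the connect dialog.

```
Provider=MSOLAP;Data Source=MySSASServer;Initial Catalog=Adventure Works DW 2008R2;Roles=Role;CustomData=contoso\test
```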


If all has gone well, you should now be able to log into your dashboard and view the data filtered based on who you are. Hopefully you’ve found this walkthrough helpful; if so, gimme some stars. You can find some additional reading on the subject at the below link.