How to fix Reporting Services broken images when hosting reports in a PerformancePoint dashboard.

I’ve been looking for the solution to this problem on and off for months, and finally stumbled upon the answer.

The scenario was this:

I placed an SSRS report into a PPS dashboard. The SSRS report contains nothing but a gauge and a table with some related data. For some reason the gauge image only shows up when I have the “show toolbar” option checked; otherwise the gauge appears as a broken image or red X. In either case, the data for the table always shows up. If I navigate to the report server directly and view the report I have no problems; it only seems to happen from within PPS (both in Dashboard Designer and once deployed).

Apparently when Reporting Services is configured to authenticate via “Trusted Account” mode, you need to specify the URL of the Report Server proxy rather than just the URL of the Report Server.

So, for example, when configuring the “Report Server URL” in your PerformancePoint SSRS report, you would typically enter it as

http://server/reportserver

However, if you’re experiencing the problem described above, try referencing it with _vti_bin included in the URL, like so:

http://server/_vti_bin/reportserver

That seemed to do the trick for me. Hope this saves someone some frustration.

 

How to configure dynamic security in Analysis Services and Performance Point 2010 without Kerberos

Also titled – “how can you make it so users only see data that belongs to them…without configuring Kerberos”

By the end of this walkthrough you’ll hopefully have a pretty good understanding of how to implement dynamic security in a PerformancePoint 2010 dashboard connected to Analysis Services. Security in general can be a pretty tricky subject, and if you don’t understand the basics it’s easy to get lost or confused. I’ll try my best to make this as simple as possible, with lots of screenshots.

If you’re not familiar with terms like “double hop” and why you might need Kerberos configured, you should go watch this video: http://go.microsoft.com/fwlink/?LinkId=218209. It’s pretty easy to follow and will probably make the rest of the walkthrough easier to understand (although you do not need Kerberos configured to implement this).

Let’s start off by taking a look at our data source options in PerformancePoint. There are three different options when you create a new data source in PerformancePoint, as shown below in the highlighted yellow area.

[Screenshot: PerformancePoint data source editor showing the three authentication options]

Let’s take a minute to talk about each one and how it works

1. “Unattended Service Account” – This is by far the easiest to implement. In this scenario you configure a service account in the “Secure Store Service Application” in SharePoint and all PerformancePoint users use those credentials when connecting to your data source.

2. “Unattended Service Account and add authenticated user name in connection string” – This method is the same as the first, with one exception: in addition to using the service account, it also passes the current user’s name, which can then be retrieved later using the CUSTOMDATA() function in MDX (see the example connection string after this list).

3. “Per-user Identity” – This option always connects to your SSAS data source as the currently logged in user. However, you’ll almost certainly need Kerberos configured on your domain in order to make this work, and Kerberos configuration is no small task; even getting the permissions needed to configure Kerberos in the first place can be a nightmare. The only way you won’t need Kerberos is if your SharePoint installation resides completely on one server, AND your SSAS instance and cube also happen to be on that server, which shouldn’t be the case. It should be noted, though, that implementing Kerberos is a best practice, and if this option is available to you it is the more secure and recommended choice.
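To make option 2 a little more concrete, the connection string PerformancePoint builds for the data source ends up looking roughly like the line below. The server and catalog names here are just placeholders; the important part is the CustomData property carrying the authenticated user’s name:

Provider=MSOLAP;Data Source=MyOlapServer;Initial Catalog=Adventure Works DW 2008R2;CustomData=DOMAIN\jsmith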

In order to implement dynamic security we’ll be using the second option, “Unattended Service Account and add authenticated user name in connection string”. Once you’ve got your data source configured as such, go ahead and create an Analytic Grid report and place this MDX in it. (This assumes you’re using the [Adventure Works DW 2008R2] SSAS project; if you’re not, just replace the cube, dimension, and measure names with ones that are relevant to your cube.)

WITH MEMBER [Measures].[MyUsername] AS CUSTOMDATA()
SELECT
  { [Measures].[MyUsername] } ON 0,
  NON EMPTY { [Employee].[Title] } ON 1
FROM [Adventure Works]

If all has gone well thus far, the results should show “MyUsername” with a value of the account you’re currently logged in as, something like “domain\username”. Now, in order to implement dynamic security, this user name will need to physically exist in our cube data somewhere, so that we can compare and match against it. There are a few different ways you could go about it, but for the sake of simplicity in this example, we’re going to say that there’s a dimension called DimOpportunity that has an attribute called “Account Owner” that contains our user IDs. Our goal is:

Users should only see opportunities that belong to them, based on who they’re logged in as.

In order to make that happen, we’ll need to create a new role in our cube and apply an “allowed set” to the dimension, comparing against the user ID that gets passed in. To create a new role in your cube, you’ll want to:

1. Connect to Analysis Services with SQL Server Management Studio (SSMS).

2. Select the database your cube is in, expand it, and then right-click on Roles → New Role.

[Screenshot: SSMS Object Explorer showing Roles → New Role]

On the General tab of your new role you can assign the role a name. I’m just going to leave the name as “Role” since this is only an example, but you’ll obviously want to give it a more meaningful name.

[Screenshot: Role designer, General page with the role name]

After you’ve named your role, click on the “Cubes” page as shown below, and change your Access and Local Cube/Drillthrough values to match what’s shown (the cube name has been blacked out to protect the innocent). You’ll get some additional prompts asking if you want to enable dimensions for read access; yes, you do.

[Screenshot: Role designer, Cubes page with the Access and Local Cube/Drillthrough settings]

Next, select the “Dimension Data” page on the left-hand side, then select the dimension you want to apply the security to. (Remember, this dimension will need to have the actual user names in one of its attributes so we can compare against it.) In my case I’m comparing against Dim Opportunity.

[Screenshot: Role designer, Dimension Data page with the Dim Opportunity dimension selected]

After we’ve selected our dimension, click on the “Advanced” tab and then select the attribute that contains our user name from the Attribute drop-down box; in my case that attribute is Account Owner. We then need to create an allowed member set. Allowed member sets grant access to only the members that have been specifically included; all other members of the attribute are denied. Since we want the user to see only rows that they are assigned to, we can use the STRTOMEMBER() and CUSTOMDATA() functions to define the allowed set like so:

{strtomember("[Dim Opportunity].[Account Owner].&[" + CUSTOMDATA() + "]")}

[Screenshot: Dimension Data, Advanced tab with the allowed member set and the Enable Visual Totals checkbox]
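One thing to watch out for: if CUSTOMDATA() returns a value that doesn’t exist as an Account Owner key, STRTOMEMBER can throw an error rather than simply returning nothing. A slightly more defensive allowed set (just a sketch, and it assumes the Account Owner member name, not just the key, holds the domain\user string) filters the members instead, so an unknown user simply gets an empty set:

{ FILTER( [Dim Opportunity].[Account Owner].[Account Owner].MEMBERS,
          [Dim Opportunity].[Account Owner].CURRENTMEMBER.NAME = CUSTOMDATA() ) }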

If we want measure values sliced by other dimensions hanging off our fact to reflect only the allowed opportunities (rather than the unfiltered totals), we’ll also need to make sure the “Enable Visual Totals” checkbox is checked, as shown above.

We’re almost done! Next we need to tell our PerformancePoint data source to use the new security-enabled role we’ve created when it retrieves data. To do this, simply open the data source in PerformancePoint Dashboard Designer and type the role name in the box titled “Role:”. Since our role was simply named “Role”, that’s what you see in the screenshot below.

[Screenshot: PerformancePoint data source with the Role field set to “Role”]
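Before testing through the dashboard itself, it can be handy to verify the role directly. In SQL Server Management Studio you can impersonate the role and a specific user by opening an MDX query window against the cube and entering something like the following on the Additional Connection Parameters tab of the connection dialog (the role and user values here are just examples):

Roles=Role;CustomData=DOMAIN\jsmith

Running the CUSTOMDATA() query from earlier under that connection should return the user name you passed in, and queries against Dim Opportunity should return only that user’s opportunities.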

If all has gone well, you should now be able to log into your dashboard and view the data filtered based on who you are. Hopefully you’ve found this walkthrough helpful; if so, gimme some stars. You can find some additional reading on the subject at the link below.

http://www.sqlmag.com/article/sql-server-analysis-services/protect-udm-with-dimension-data-security-part-2