FIM and FIPS or FIPS and FIM

Hi, Tomek here, finally with a post ;). The end of the world didn’t happen, so there is no excuse to stay silent here any longer and it is time to finally write an initial entry.

This time it will be a quick troubleshooting entry about an issue we have come across a few times, so it may well affect others too. The topic is FIM and FIPS (Federal Information Processing Standard), and the problems these settings might cause in security locked-down environments. As usual, this knowledge comes from some real-world learning experience, so I hope this post will save others some time on that learning curve.

When we were deploying some basic FIM elements on production servers, we found out during post-deployment UAT that our setup was not working and was throwing Access Denied errors in the authorization phase of some workflows. A quick look at the details of a denied request showed us the cause:

This implementation is not part of the Windows Platform FIPS validated cryptographic algorithms.

We had gone through UAT with this solution in the test and pre-production environments and the error never appeared there, which pointed to a difference in configuration. A quick search showed that this issue can occur on systems configured with the following GPO setting:

“System cryptography: Use FIPS compliant algorithms for encryption, hashing, and signing”

Which translates into the following registry entries:

  • On Windows 2003: HKLM\System\CurrentControlSet\Control\Lsa\FIPSAlgorithmPolicy
  • On Windows 2008 and later: HKLM\System\CurrentControlSet\Control\Lsa\FIPSAlgorithmPolicy\Enabled
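For reference, a sketch of the relevant value as it could be exported to a .reg file on Windows 2008 and later (a DWORD of 1 enforces the FIPS policy, 0 disables it):

```
Windows Registry Editor Version 5.00

; 1 = FIPS algorithm policy enforced, 0 = not enforced
[HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Lsa\FIPSAlgorithmPolicy]
"Enabled"=dword:00000000
```

Remember that if the setting comes from a GPO, a local registry change will be reverted at the next policy refresh – the GPO itself has to be adjusted.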

Quick tests in a lab environment confirmed that enabling either of the entries above causes FIM workflows to fail with this error message, and disabling it resolves the problem.

Recently we were updating the same environment to FIM 2010 R2 and adding reporting support. While deploying the SCSM components on new servers (you can read about the FIM 2010 R2 reporting infrastructure here), we found out that SCSM setup was failing at the finalization stage:

This wasn’t obvious from the setup log file at first glance, but in the end it turned out to be caused by exactly the same setting, this time affecting the new servers deployed for the System Center components of the FIM reporting platform.

This isn’t actually FIM specific; it is a known issue affecting the .NET environment on FIPS-compliant systems. There is a bunch of articles on this issue in the .NET context:

The solution for us was to disable this setting in the GPO affecting the FIM servers, which resolved the problem. If that is not possible in your environment, you can use the links above to make the necessary configuration changes without disabling these policies; however, I have personally not tested those solutions with FIM (if you do, please use the comments or e-mail me with the results).
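For .NET applications whose configuration file you control, the workaround most of those articles describe is the enforceFIPSPolicy runtime element (available since .NET 2.0 SP1). A sketch – again, I have not verified this against the FIM binaries myself:

```xml
<configuration>
  <runtime>
    <!-- Allows this application to use managed algorithms that are not
         FIPS-validated even when the system-wide FIPS policy is enabled -->
    <enforceFIPSPolicy enabled="false"/>
  </runtime>
</configuration>
```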


Actually, while writing this article I found KB article 935434, which describes a fix for the .NET 3.0 environment that might also be a solution – if you have access to Microsoft support, it might be worth giving it a try.

The conclusions from this are:

  • Consult your Security / AD / GPO team to check whether the environment in which you are about to deploy FIM is configured with FIPS-compliant settings, and work out your solution together with those teams.
  • Always make sure that your dev / staging environments are as close to production as possible. It will make your life easier, and in case of problems it will speed up troubleshooting.

FIM Reporting – architecture overview

Hi, Adam here again with the FIM reporting series. In the first post I covered the basic concepts of FIM reporting included in FIM 2010 R2. After a short break I’m getting back to the topic, this time with a brief description of the reporting architecture and data flow.

With the FIM 2010 R2 release Microsoft set out to supply a feature responsible for tracking and reporting changes made to FIM resources. The solution wasn’t built from scratch; it is based on Microsoft System Center. More precisely, the data flow and data storage are handled by System Center.


FIM Reporting can be deployed in two kinds of topologies. Depending on the size of the production environment, Microsoft recommends one solution for small/medium deployments and another for large ones. More detailed information can be found here.

Briefly, in a small or medium size deployment FIM and System Center Service Manager (SCSM) are hosted on one machine, while System Center Data Warehouse (SCDW) is hosted on a second one (see figure 1).

For large deployments the whole environment is hosted on three servers; the difference is that the FIM Service and SCSM are separated onto dedicated servers (see the figure below).

Schema update and Data Flow

As I mentioned before, System Center Service Manager is used to handle the data flow from FIM to the System Center Data Warehouse. In general the whole process consists of the following steps:

Data Warehouse schema update

In order to extend the Data Warehouse schema to log new FIM resources, an MP definition has to be supplied. Management Packs are not a new feature built for FIM Reporting – they are part of the System Center solution. Management Pack files will be described in the third post; for a more precise description I refer you to the TechNet documentation.

  • Ready-made Management Pack files describing the Data Warehouse extension are imported (using PowerShell) into the System Center Management service.
  • Based on the MP files, appropriate schema changes are made in the System Center Data Warehouse databases.
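As an illustration, the import step might look like this with the community SMLets module – a sketch only, as the cmdlet and parameter names come from SMLets and the MP file path is a made-up placeholder; use whatever tooling your SCSM deployment ships with:

```powershell
# Assumes the community SMLets module is installed on the SCSM management server
Import-Module SMLets

# Import a Management Pack file describing the Data Warehouse extension
# (the path below is a hypothetical example)
Import-SCSMManagementPack -Fullname 'C:\MPs\FIM.Reporting.Extension.mp'

# Verify that the pack is now listed among installed Management Packs
Get-SCSMManagementPack | Where-Object { $_.Name -like '*FIM*' }
```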

Data Flow

The Data Flow process is responsible for transferring FIM data from the FIM Service database all the way to the DWDataMart database. It takes place in the following manner:

  • Data is transferred from the FIM Service database to System Center Management. The appropriate bindings (connections between FIM resources and DW fields) are described in the MP definitions.
  • The transferred data then passes sequentially through the DWStagingAndConfig, DWRepository and DWDataMart databases, where it is subjected to complex transform and load operations. These are out of the scope of this blog series (also because I have no hands-on System Center experience and didn’t try to deep-dive into this topic); for more details about the System Center Data Warehouse ETL process I refer you to the TechNet documentation.
  • Based on the data transferred to the Data Warehouse mart we can build custom reports, which we can store on the Report Server. It is also possible to upload reports to the System Center reports repository by using an MP.

The whole process is presented in the diagram below (click to enlarge):

This short description is intended to give a simple overview of the FIM 2010 R2 Reporting architecture. For a more detailed and exhaustive description I suggest referring to these TechNet articles:

If you are interested in this topic stay tuned … we will get back to it soon with more details on how to get this service running and doing what we want with our data.

Cover image (cc)


In my previous post I explained how you can execute any query in FIM and see the results. Remember how cumbersome that was?

Today I’m introducing our next internal project that has just been open sourced: FimExplorer. Here is what it looks like (bear in mind that it’s not supposed to be pretty ;) ):

And here is what it can do:

  • run any XPath query against FIM
    • you can choose which attributes will be fetched; all are fetched by default
  • find objects by ID
  • display results in a grid
  • display single object information in a dialog (double-click a grid row)
  • navigate through references (click an ID link)
  • export displayed results to XML (this produces the same output as the FIM migration cmdlets: Export-FIMConfig / ConvertTo-FIMResource / ConvertFrom-FIMResource – read MSDN for more info)
  • import objects from XML (generated by FimExplorer or the FIM cmdlets) and show them in the grid; this can be useful for “offline” analysis
  • it does not need to run on a machine with FIM installed
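To illustrate the XML compatibility mentioned above, this is roughly how the equivalent export/import looks with the FIM migration cmdlets – a sketch, assuming the FIMAutomation snap-in is available and with the query adjusted to your needs:

```powershell
# The FIMAutomation snap-in ships with the FIM Service
Add-PSSnapin FIMAutomation

# Export all Person objects and serialize them to an XML file
# (a file FimExplorer should be able to import for offline analysis)
Export-FIMConfig -CustomConfig "/Person" -OnlyBaseResources |
    ConvertFrom-FIMResource -File .\persons.xml

# Deserialize the file back into resource objects
$resources = ConvertTo-FIMResource -File .\persons.xml
```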

You can find the code on GitHub and CodePlex. A compiled, ready-to-run version is also available (on CodePlex). Of course you are more than welcome to send contributions via pull requests.

Before running the application you need to modify its configuration file (Predica.FimExplorer.exe.config). Just replace the initial settings with your own FIM URL, username and password.
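As an illustration only – the snippet below shows the general shape of such settings, but the key names are hypothetical placeholders; use the keys actually present in Predica.FimExplorer.exe.config:

```xml
<appSettings>
  <!-- Hypothetical key names - check the shipped config file for the real ones -->
  <add key="FimUrl" value="http://fim-server:5725" />
  <add key="FimUserName" value="CONTOSO\fim-admin" />
  <add key="FimPassword" value="..." />
</appSettings>
```

(5725 is the default FIM Service client port.)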

Important (at least for those who will explore code of this project):

It uses our FimCommunication library under the hood, referenced as a Git submodule. So after you clone the FimExplorer repository, make sure to also run the “git submodule init” and “git submodule update” commands to download it!

Executing any XPath queries in FIM, the hard way

Being able to execute any query in FIM and get the results can be crucial when developing FIM-based solutions. Analysis based on real data can be very helpful during debugging, as well as when designing FIM objects to accomplish a certain task.

<sarcasm mode=”on”>
Fortunately Microsoft knows this is the case and provided an easy way to send XPath to FIM and get the result set.
</sarcasm mode=”off”>

Of course the above statement is not really true. In order to get XPath query results in FIM you need to:

  • Go to Search Scopes configuration

  • Start creating a new Search Scope and fill the first screen with any data:

  • Enter the query on the 2nd screen and “TAB away” – move focus out of the field. The query will then be executed and you will be presented with the results in a standard FIM grid:

And this works. It’s not the most comfortable solution, but it gets the job done. However, it adds massive overhead when all you need is a quick query or data check.
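For completeness, a few examples of the kind of queries typically checked this way, in FIM’s XPath dialect (attribute names depend on your schema; the GUID below is the well-known built-in administrator ObjectID):

```
/Person[AccountName = 'jdoe']
/Group[Type = 'Security']
/Person[starts-with(DisplayName, 'Adam')]
/*[ObjectID = 'fb89aefa-5ea1-47f1-8890-abe7797d6497']
```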

Quick Editor’s Note (from Tomek)


Of course … there is still PowerShell which can be used here.


Probably the most annoying thing here is that when you make a mistake in the query or its syntax, or the query simply isn’t one that FIM can crunch, all you get is this error message:

Now it is up to you to figure out what’s wrong, which in the case of complex queries can sometimes be problematic.

That is all I have to share for now. An alternative solution to this problem will be presented in my next post. And yes, it will be another Predica open source project!

Happy querying!