Monday, December 26, 2011

Property Sets for Permissions in AD and AD LDS

A while back I needed to set up property sets in AD LDS to grant permissions to many of the attributes on the person object all at once. As I reviewed the TechNet documentation on AD property sets, I realized that it doesn’t tell you what object type property sets are, how to create a property set, or how to assign an attribute to a property set. The MSDN documentation on property sets lets you see which attributes were included in which property sets in the different versions of AD, and it hints that property sets are part of control access rights. Finally, there is some more MSDN documentation on control access rights that starts to spell it out:

  • For defining property sets, to enable controlling access to a subset of an object's attributes, rather than just to the individual attributes. Using the standard access rights, a single ACE can grant or deny access to all of an object's attributes or to a single attribute. Control access rights provide a way for a single ACE to control access to a set of attributes. For example, the user class supports the Personal-Information property set that includes attributes such as street address and telephone number. Property set rights are created on controlAccessRight objects by setting the validAccesses attribute to contain both the ACTRL_DS_READ_PROP (16) and the ACTRL_DS_WRITE_PROP (32) access rights.

This illustrates the first goal of my post: property sets exist in AD as controlAccessRight objects. But it still doesn’t tell us where in AD they live. In fact, they live in the CN=Extended-Rights container inside the Configuration partition (not the Schema partition):


Digging deeper into the MSDN docs on Creating Control Access Rights illustrates how you link attributes to a property set:

If you define a control access right for a property set, use the rightsGUID of the controlAccessRight object to identify the properties in the set. Every property is defined by an attributeSchema object in the Active Directory schema. The attributeSecurityGUID property of an attributeSchema object identifies the property set, if any, that the property belongs to. Be aware that the attributeSecurityGUID property is single-valued and stores the GUID in binary format (octet string syntax).

Another goal of this post is to help by making this a little more visual. When you create a property set, you must first generate a GUID and place it in the rightsGUID attribute on the controlAccessRight object. To assign an attribute to a property set, you place this same GUID in the attributeSecurityGUID attribute on the attributeSchema object (in the Schema partition). Remember that an attribute can only belong to one property set.
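To make this concrete, here is a hedged LDIF sketch. The DNs and names are hypothetical, and the GUID and its Base64 form are the example values used later in this post; a real controlAccessRight entry may also need additional attributes such as displayName. The first entry creates the property set under CN=Extended-Rights (validAccesses 48 = 16 + 32, read plus write property, per the quoted docs; appliesTo is the schemaIDGuid of the user class). The second entry stamps an attribute into the set via attributeSecurityGUID, which must hold the GUID in binary (Base64 in LDIF).

```
dn: CN=My-Property-Set,CN=Extended-Rights,CN=Configuration,DC=example,DC=com
changetype: add
objectClass: controlAccessRight
rightsGuid: 8c4ac332-975f-4717-ad7b-ba4a4e968fff
validAccesses: 48
appliesTo: bf967aba-0de6-11d0-a285-00aa003049e2

dn: CN=My-Attribute,CN=Schema,CN=Configuration,DC=example,DC=com
changetype: modify
replace: attributeSecurityGUID
attributeSecurityGUID:: MsNKjF+XF0ete7pKTpaP/w==
-
```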


Take a look at the following:

Instructions on how to assign permissions to someone using a Property Set

For information on how to get the GUIDs into the right forms, see my post

GUIDs to Octets, GUIDs to Base64 strings and back again

Suppose I generate a GUID of 8c4ac332-975f-4717-ad7b-ba4a4e968fff by running the following PowerShell command line:

[System.Guid]::NewGuid()

Don’t worry if your GUID is different from mine; it should be! If it isn’t, let me know, because I think I’ll partner with you for the lottery (aka a tax on the mathematically impaired).

Some attributes (like attributeSecurityGUID), when edited through ADSI Edit, require you to convert the GUID to an octet string (byte-swapped for little-endian systems; Intel processors are little endian): 32c34a8c5f971747ad7bba4a4e968fff

Which you can do with this one line of PowerShell:

[System.String]::Join('', ((new-object system.guid('8c4ac332-975f-4717-ad7b-ba4a4e968fff')).ToByteArray() | ForEach-Object { $_.ToString('x2') }))

Then, if you want to put this in an LDIF file, you must Base64-encode the value so that it looks like: MsNKjF+XF0ete7pKTpaP/w==

You can do that with this one line of PowerShell:

[System.Convert]::ToBase64String((new-Object system.Guid("8c4ac332-975f-4717-ad7b-ba4a4e968fff")).ToByteArray())

To convert from the Base64 string to the GUID use this line of PowerShell:

new-Object -TypeName System.Guid -ArgumentList(, ( ([System.Convert]::FromBase64String("MsNKjF+XF0ete7pKTpaP/w==")) ) )
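For completeness, here is a sketch of going the other way, from the octet string back to a GUID. It assumes a 32-hex-character octet string and uses the same comma trick as above to pass the byte array to the GUID constructor as a single argument:

```powershell
# Sketch: convert an octet string (32 hex characters) back to a GUID
$octet = '32c34a8c5f971747ad7bba4a4e968fff'
$bytes = for ($i = 0; $i -lt $octet.Length; $i += 2) { [System.Convert]::ToByte($octet.Substring($i, 2), 16) }
New-Object -TypeName System.Guid -ArgumentList (, [byte[]]$bytes)
# yields 8c4ac332-975f-4717-ad7b-ba4a4e968fff
```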

FYI – I chose to express all of these in PowerShell, as opposed to C#, because many readers are not C# developers and I still wanted to give everyone the ability to do these transforms without the complexity of compiling code or downloading an executable.

Thanks to John Geitzen, whose reply to someone else’s question helped me see how to make the correct call to pass the array as a whole parameter to the GUID constructor instead of it getting splatted.

Thanks to Poshololic whose comment on this post showed how to do the Guid to Octet conversion in one line.

Monday, November 28, 2011

Referenced by Other works and Sale at Lulu

I was pleasantly surprised today to find three other books referencing FIM Best Practices Volume 1. Thanks to a Lulu sale, you can get it at 25% off until 12/14/2011 with coupon code BUYMYBOOK305 ($50 max savings; coupon expires December 14, 2011). Today only, it is 30% off with code CYBERMONDAY305.

All three have an identical blurb about FIM and reference FIM Best Practices Volume 1 as additional material.

  • User Provisioning: High-impact Strategies - What You Need to Know: Definitions, Adoptions, Impact, Benefits, Maturity, Vendors, by Kevin Roebuck (Jun 7, 2011)
    Excerpt - Page 138: "... TechNet Wiki [7] FIM Best Practices Volume 1: Introduction ..."
  • Run Book Automation: What You Need to Know For IT Operations Management, by Michael Johnson (May 3, 2011)
    Excerpt - Page 74: "... Microsoft TechNet Wiki [7] FIM Best Practices Volume 1 ..."
  • Federated Id Management: High-impact Strategies - What You Need to Know: Definitions, Adoptions, Impact, Benefits, Maturity, Vendors, by Kevin Roebuck (Jun 7, 2011)
    Excerpt - Page 148: "... TechNet Wiki [7] FIM Best Practices Volume 1: Introduction ..."

Oddly, the blurb lists the license for FIM as Shareware; I hadn’t thought that FIM would fit the definition of Shareware.

All three appear to start with an introductory paper, and then contain a compilation of articles on various related technologies.

Wednesday, November 16, 2011

What the %_ is the deal with wildcards in FIM Queries in the latest hotfix?

OK, I am not actually swearing, nor are those substitute words. Rather, % and _ are two characters that, until hotfix rollup package (build 4.0.3594.2), could be used to perform some much-needed and cool searches for sets, search scopes, groups, and 3rd-party client queries against FIM, such as querying for the presence of string attributes.

I am sure what happened is that someone created a resource with an underscore in the name and then couldn’t search for it. Hence the fix. However, it wasn’t broken; we need this functionality. Furthermore, simply enclosing the wildcard character in [] would cause it to be evaluated as a literal.

The secret, as I previously blogged, is that FIM takes what you type in (on some searches) and passes it as the right-hand parameter of the T-SQL LIKE operator. Ergo, whatever wildcards you can use with LIKE, you can use here. Was this a form of SQL injection? Perhaps, but I tested it for other kinds of SQL injection, such as adding a single quote and other commands, and those don’t work. So it wasn’t a vulnerability, but a feature. Undocumented? Sure, but needed.
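This is what made presence tests possible. As a sketch (Department is just a hypothetical attribute name here), a set filter such as:

```
/Person[Department = '%']
```

reached SQL as the right-hand side of LIKE, so the bare % matched any non-empty value of the attribute, which is effectively an "is present" test.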

Using Wildcard Characters As Literals

You can use the wildcard pattern matching characters as literal characters. To use a wildcard character as a literal character, enclose the wildcard character in brackets. The following table shows several examples of using the LIKE keyword and the [ ] wildcard characters.
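The examples table itself did not survive in this post, but the idea, per the SQL Server LIKE documentation (discounts is the sample table used there), looks like this:

```sql
-- LIKE '5[%]'  matches the literal string 5%
-- LIKE '[_]n'  matches _n
-- LIKE '[[]'   matches [
-- For example, find rows whose comment contains the literal string "30%":
SELECT * FROM discounts WHERE comment LIKE '%30[%]%';
```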

The problem with this hotfix is that it destroys our ability to build sets and queries that test for presence of values in string attributes. This will break many of the implementations of FIM that I and my team have done. We need a mechanism for detecting nulls in the attributes in the FIM Service database so that we can create sets based on the presence or absence of attributes.

Some might say that we can use DREs to accomplish this too, but calculating sets of objects that have a DRE is non-trivial: it requires creating an Outbound Sync Rule, creating a set of DRE objects, and then another set of objects whose DRL has members in the first set. Worst of all, this approach only applies to attributes in the connector space and their matching attribute in the Metaverse, requires a few syncs, and cannot be applied to attributes that exist only in the FIM Service but not in the Metaverse.

Another alternative would be to create an IsPresent function in the XPath queries, but please ensure that it works on all attribute types.

Preference of fixes (in decreasing order of desirability):

1) Roll back this portion of the fix so that we can still use the wildcards in queries, teach/document how to have the wildcards treated as literals, and add an IsPresent function.

2) If we can’t do that, then I would prefer to see an IsPresent function in the XPath.

3) If we can’t do that either, still allow the wildcards in the queries, but provide a way to escape them.

Official text from hotfix rollup package (build 4.0.3594.2):

Issue 2
Revised the FIM "Query and Sets" features to correctly treat percent signs, underscores, and opening brackets as literals instead of as SQL wildcard characters.
The approved character sets for strings that are used in FIM attribute values are defined in the attribute and binding schema in the FIM service. The syntax for representing an XPath filter is documented on MSDN in the "FIM XPath Filter Dialect" article.

Some customers may have included characters that SQL defines as query wildcard characters, such as the percent character, in FIM searches and Set filters. In this case, the customers intended FIM to treat the characters as SQL wildcard characters. This is not a documented or supported feature of the product. In some cases, customers may be able to achieve the intended functionality by removing the wildcard and by using a “contains” query/filter instead.
Existing Set resources that have filters that contain SQL wildcard characters may not continue to function as the filters functioned before this hotfix was applied. Also, a filter that contains wildcard characters and that continued to function as expected after the hotfix was applied may function differently if the administrator later updates the filter definition.
Customers who used characters that SQL defined as query wildcard characters must check and revise their Set filters either before or after they upgrade to this hotfix. Customers should consider the impact of Set membership changes on Set transition MPRs. And, customers may want to temporarily disable MPRs or update workflow definitions while they change their Set filters to avoid unintentionally triggering provisioning or deprovisioning operations during Set definition maintenance.

FIM 2010 hotfix available (4.0.3594.2)

Microsoft released a new hotfix (KB 2520954) at the end of October with some key fixes in it, as well as one item, which I will blog about next, that prevents me from loading it on most implementations until it is addressed.


Component: Workflow Engine (FIM Service)
Official description: Assume that you perform an operation that accesses the SQL database when the Microsoft SQL Server connection pooling feature is enabled in the FIM server; for example, you run a query or a request. If the operation times out for any reason, a future operation on the same thread may fail until that thread is removed from the SQL connection pool. An error message that resembles the following is displayed in the FIM Service Application event log, in the RequestStatusDetails property for a request, or in the WorkflowStatusDetails property of a workflow instance: "Cannot enlist in the transaction because a local transaction is in progress on the connection." Additionally, the time stamp is the same as the time when the operation fails.
Comments: An operation on a thread that makes a SQL call that times out poisons the thread, and all future operations on the thread fail. This could have led to other problems that were hard to reproduce. Kudos on this one.

Component: Sync Engine
Official description: An ExpectedRuleEntry (ERE) object is associated to a child synchronization rule of a Metaverse object. If the ERE object has a Remove action, deprovisioning of the object is also triggered. This behavior causes the deletion of the Metaverse object.
Comments: Much-needed fix to ensure that deprovisioning doesn’t fire incorrectly.

Component: Sync Engine
Official description: Fixes many "Export not reimported" errors that might occur because of errors in SQL.
Comments: Hallelujah! We see a fair amount of those. I would like to see more detail on that one.

Component: Sync Engine
Official description: Improves the performance of all Sync Engine operations. Note: this change involves an extensive upgrade to the sync database. This upgrade can take a long time, depending on your hardware. A progress bar is displayed during the database upgrade.
Comments: Plan for a long update window, and be sure to back the database up first. This also sounds like a future blog article, to look a little deeper at the changes.

Feature 2
Official description: The FIM 2010 Active Directory Management Agent (AD MA) does not honor the preferred domain controller list when passwords are exported. This is an issue for customers who require password changes to flow to a specific set of domain controllers. This hotfix rollup package changes the AD MA to use the preferred domain controller list first. If the preferred domain controller list does not exist, the domain controller locator service will identify a domain controller for password export operations. Additionally, you can still force password operations to use the primary domain controller by setting the following registry value:

UsePDCForPasswordOperations (REG_DWORD, 1 = True, 0 = False)

This hotfix rollup package also updates the AD MA so that a trust relationship with the configured Active Directory forest is not required to export passwords to that forest.
Comments: This will be very helpful in large environments. Prior to this, all password operations in FIM targeted the PDC Emulator, which incidentally introduced a single point of failure. I also applaud the elimination of the need for a trust to do password exports!

Feature 3
Official description: Adds the ability to filter objects before they are imported into the AD MA connector space.
Comments: Another big win for large environments where we need to ignore large portions of the domain!

Component: Sets and Query (FIM Service)
Official description: Fixes an issue that would sometimes cause incorrect Set calculations, which resulted in lots of set corrections. Also revised the Sets Correction job so that it does not change special sets that are maintained by another system maintenance job.
Comments: Thank you!

Component: FIM MA
Official description: Fixes an issue in which the FIM synchronization service configuration for synchronization rules and codeless provisioning was not correctly written to the FIM Service database.
Comments: Seen this one. Glad to have a fix.

Component: FIM Service
Official description: Fixes an issue in which unexpected data in the FIM Service database could result in the FIM MA causing the Synchronization service to fail during import, and a stopped-server error occurred.
Comments: Seen this one too.

Issue 4
Official description: Some ExpectedRuleEntry objects and DetectedRuleEntry objects in FIM 2010 can become "orphaned" over time. When a DetectedRuleEntry object is not referenced in the DetectedRulesList of any object in the system, that object is determined to be orphaned. Similarly, when an ExpectedRuleEntry object is not referenced in the ExpectedRulesList of any object in the system, that object is also determined to be orphaned.
Comments: Once more, thank you.

TEC 2012 call for papers open for 2 more days

The Experts Conference call for papers is still open until Nov 18th.

For general info:

I have attended and spoken at this conference since 2007. I love it. It is a great experience with loads of great in-depth technical training by top experts on Directory & Identity, as well as SharePoint, Exchange, Virtualization & Cloud, and the PowerShell Deep Dive. Also come and learn about the inside joke dealing with the rubber chicken.

Awesome FIM Case Study

Microsoft recently published a case study about our work (Ensynch, now Insight) at Grand Canyon University, implementing a FIM 2010-based identity management solution.
The document is available for download directly from

O Blog how I have neglected thee

Ok so I have neglected my blog a bit. You all saw the news that Ensynch is now part of Insight.

Wow, the same day we announced the merger (9/19), I was also given word that my uncle and cousin had died in a plane crash, which wound up making the regional news.

Later my kids earned their trip to Disneyland, so we took them in early October.

One of our architects had to disengage from a big project (for personal reasons) and I needed to step in and play that role too.

Phew! Life has been crazy busy. On reflection, I just want to say how precious life and family are to me, and hopefully to each one of you.

Friday, September 23, 2011

Big news–Insight + Ensynch

Insight to acquire Ensynch.
As my colleague Rebecca Croft said:
We are very excited about the union of Insight and Ensynch and the benefits that it will bring to our clients. Both companies are focused on helping our clients find innovative, cost effective solutions to address business needs. Bringing Ensynch into the Insight organization will offer clients more robust software services, particularly around Microsoft Enterprise Agreements, as well as improved services delivery, enhanced virtualization and cloud capabilities and solution-focused approach to software sales. This acquisition will further simplify our clients’ ability to acquire, procure, implement and manage IT solutions across their technology environment.
For more information, read the press release, or contact me with any questions.

Get 15% off of FIM Best Practices Volume 1

Through Sept 26th, get 15% off FIM Best Practices Volume 1 at

Use the following code at checkout OKTOBERFEST305

Tuesday, September 6, 2011

Get 20% off FIM Best Practices Volume 1


Buy FIM Best Practices Volume 1 in Soft Cover or E-Book

Enter coupon code SEPTEMBER305 at checkout and receive 20% off your order. The maximum savings for this offer is $100. Offer expires on September 9 at 11:59 PM

Thursday, September 1, 2011

Calling a stored procedure in an ADFS claims rule

After you have set up your SQL attribute claims store in ADFS, if you want to use it (and in fact test it) you must set up a claims rule that makes use of it. To do this, create a claim using a custom rule, which allows you to employ the claims rule language.

The following TechNet entry is a good start, as it illustrates how to enter a SQL query and even a stored procedure.

SQL Query:

c:[Type == ""]

=> issue(store = "SQLClaims", types = (""), query = "SELECT myID from employees where @myp={0}", param = c.Value);

Stored Procedure:

c:[Type == ""]

=> issue(store = "SQLClaims", types = (""), query = "EXEC dbo.test @myp={0}", param = c.Value);


Note that the parameter {0} is not surrounded by single quotes.

One may ask: what gets passed in as the parameter? The incoming claim value, of course. In this case, the email address, as defined in the c:[Type == ""] clause.

One might also ask what happens if I make a query or stored procedure that returns more than one value? Your claims transformation rule adds all the resulting values to the token as claims of the same type.

One might also ask what happens if my query or stored procedure returns more than one column? An error results and the whole process fails.
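Putting those answers together, here is a minimal sketch of what a dbo.test stored procedure could look like. The employees table and myID column come from the query example above; the mail column and parameter size are assumptions for illustration:

```sql
CREATE PROCEDURE dbo.test
    @myp nvarchar(256)  -- receives the incoming claim value (here, the email address)
AS
BEGIN
    SET NOCOUNT ON;
    -- Return exactly one column: each resulting row becomes a claim of the
    -- configured type; returning a second column would make the rule fail.
    SELECT myID
    FROM dbo.employees
    WHERE mail = @myp;
END
```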

Troubleshooting SQL Attribute Stores with ADFS

Several others have shown how to define SQL attribute stores with ADFS.

Note that when entering the connection string there is no validation or feedback to the administrator. If there is a problem, you usually won’t see it until you set up a claims rule that uses the store and get an error, so make certain to carefully build and test your connection string. Remember that if you use integrated authentication to connect to the SQL Server, the connection will run under the context of your ADFS service account, so you will need to grant your ADFS service account permissions to the SQL Server and database.



For example, you might get event 149:

During processing of the Federation Service configuration, the attribute store 'SQLClaims' could not be loaded. 
Attribute store type: Microsoft.IdentityServer.ClaimsPolicy.Engine.AttributeStore.Sql.SqlAttributeStore, Microsoft.IdentityServer.ClaimsPolicy

User Action
If you are using a custom attribute store, verify that the custom attribute store is configured using AD FS 2.0 Management snap-in.

Additional Data
POLICY3906: Could not parse the parameter as a valid connection string.

Thursday, July 28, 2011

Using FIM Best Practices Volume 1 to study for the FIM exam

OK, so info on the exam and its list of covered items is provided here.

For fun, I thought I would map the exam’s domain objectives to items in FIM Best Practices Volume 1.

The book helps with items in area 1, Planning a FIM Implementation and Installing FIM.

Objective (Chapter)
1. Planning a FIM Implementation and Installing FIM
1.1 Plan and design FIM topology: Chapters 4 and 5
1.2 Install the FIM Service and the FIM Portal: Chapters 6 and 7
1.3 Upgrade Microsoft Identity Integration Server (MIIS)/Microsoft Identity Lifecycle Manager (ILM) to FIM 2010: not covered
1.4 Deploy and manage client components: Chapters 6 and 7
1.5 Implement disaster recovery for FIM 2010: Chapter 9 (partially)

Beta Exam for FIM available until Aug 4th

Beta exam 71-158, TS: Forefront Identity Manager 2010, Configuring

So in a short while we should see some folks who are actually Microsoft Certified Technology Specialists (MCTS) for FIM!

Tuesday, June 28, 2011

FIM Bug for multi-valued strings that need approval

I think I found a bug in FIM version 4.0.3576.2; take a look:

It appears that when you have a multi-valued string attribute, and you add more than one value at a time, and approval is required to create the object or to update the attribute, the request will fail. In the event log you will see an error (UnwillingToPerformException … CREATE UNIQUE INDEX statement terminated because a duplicate key was found for the object).






Log Name: Forefront Identity Manager

Source: Microsoft.ResourceManagement

Date: 6/27/2011 6:33:52 PM

Event ID: 3

Task Category: None

Level: Error

Keywords: Classic

User: N/A

Computer: fimserver


Microsoft.ResourceManagement.WebServices.Exceptions.UnwillingToPerformException: Other ---> System.Data.SqlClient.SqlException: Reraised Error 50000, Level 16, State 1, Procedure ReRaiseException, Line 37, Message: Reraised Error 50000, Level 16, State 1, Procedure ReRaiseException, Line 37, Message: Reraised Error 1505, Level 16, State 1, Procedure ReEvaluateRequestOutputString, Line 53, Message: The CREATE UNIQUE INDEX statement terminated because a duplicate key was found for the object name 'dbo.#reevaluateRequestOutputStringRemovalCandidate______________________________________________________________________0000000175F1' and the index name 'IX_ReEvaluateRequestRequestOutputStringRemovalCandidate_ObjectKey_ObjectTypeKey_AttributeKey_ValueString'. The duplicate key value is (23564, 32698, 32655, 0).

Uncommittable transaction is detected at the end of the batch. The transaction is rolled back.

at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)

at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)

at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)

at System.Data.SqlClient.SqlDataReader.ConsumeMetaData()

at System.Data.SqlClient.SqlDataReader.get_MetaData()

at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString)

at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async)

at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, DbAsyncResult result)

at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)

at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)

at System.Data.SqlClient.SqlCommand.ExecuteReader()

at Microsoft.ResourceManagement.Data.DataAccess.ProcessRequest(RequestType request)

--- End of inner exception stack trace ---





So I looked up the stored procedure mentioned in the error message.

We can see that this stored procedure can get called from a lot of places and so can be an issue in many spots.

The problem is found in the text of the ReEvaluateRequestOutputString stored procedure, excerpted below with my comments added inside /* */:


[requestOriginal].[ObjectKey] AS N'ObjectKey',

[requestOriginal].[ObjectTypeKey] AS N'ObjectTypeKey',

[requestOriginal].[AttributeKey] AS N'AttributeKey',

[requestOriginal].[ValueString] AS N'ValueString',

[requestOriginal].[Deleted] AS N'Deleted'

INTO #reevaluateRequestOutputStringRemovalCandidate

FROM [fim].[RequestOutputString] AS [requestOriginal]


[requestOriginal].[RequestKey] = @originalRequestKey






/* Which results in:

Note that the last two rows will cause a problem with the next command because they have the same values in the ObjectKey, ObjectTypeKey, AttributeKey, and Deleted columns.

Yet adding two values to a multi-valued string is a legal operation. */

CREATE UNIQUE CLUSTERED INDEX [IX_ReEvaluateRequestRequestOutputStringRemovalCandidate_ObjectKey_ObjectTypeKey_AttributeKey_ValueString]

ON #reevaluateRequestOutputStringRemovalCandidate







/* Resulting error:

Msg 1505, Level 16, State 1, Line 26

The CREATE UNIQUE INDEX statement terminated because a duplicate key was found for the object name 'dbo.#reevaluateRequestOutputStringRemovalCandidate______________________________________________________________________0000000178D0' and the index name 'IX_ReEvaluateRequestRequestOutputStringRemovalCandidate_ObjectKey_ObjectTypeKey_AttributeKey_ValueString'. The duplicate key value is (23623, 32698, 32655, 0).

The statement has been terminated. */

Then the whole transaction rolls back and the request fails.


Thursday, June 23, 2011

SQL Extensible Management Agents That Scale (Rebecca Croft)

Rebecca, a fellow Ensynchian, presented at TEC 2011 on the limitations of the standard out-of-the-box SQL Management Agent and how she overcame them by writing a very fast eXtensible Management Agent (XMA).

First attempt: use a SqlDataReader to read the data (really fast) and write one row at a time to the AVP file (but that gets slow when dealing with large data sets).

Second attempt: use the T-SQL “FOR XML” clause to transform the data to XML, and then use an XSLT to transform it to LDIF.

So the XMA executes a T-SQL statement to export the data as XML, applies an XSLT to transform it to LDIF, and then returns the LDIF file to the FIM Synchronization Service.
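As a rough sketch of the second approach (the table and column names here are hypothetical, not from Rebecca's actual XMA):

```sql
-- Export the rows as one XML document; an XSLT then reshapes this into LDIF
SELECT EmployeeID, GivenName, Surname, Mail
FROM dbo.Person
FOR XML PATH('person'), ROOT('people');
```

The result is a single XML document with one person element per row, which is straightforward to walk with XSLT templates.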

She even showed off a wizard to create the XMA for us. When it completed successfully, she received a spontaneous round of applause.

Friday, June 17, 2011

RCDC Editor

As previously discussed, the RCDC is a very powerful tool for customizing FIM without writing your own front-end and web client. There are several drawbacks to the RCDC. The worst is that you have to export the RCDC to an XML file, open it in your favorite XML editor, modify it by hand, load it back into the FIM Portal, and then run iisreset. All of which means that mistakes are quite painful, as it can take several minutes to discover each one. Worse if you made more than one change. Ugh!
So thanks to my friends over at OCG there is an RCDC editor. While not perfect it can shave hours off your time to edit RCDC’s.
You get an almost-WYSIWYG editor that saves you from making many simple mistakes. If I need to tweak something simple I might forgo it, but then again I have lots of (painful) experience tweaking the RCDC by hand. At $775 per project, I can get an editor that makes life much simpler. No brainer!
The UI is good but not perfectly intuitive. I found several “bugs” only to discover that I needed to learn just a bit more about the tool.
You will need to run a PowerShell command to export the FIM configuration, and install the software, before you can use it at all. After activating the license you can save the RCDCs as XML. Then, yes, you still have to load the RCDC manually and run iisreset. Nonetheless, this is still much easier.
While you are still learning what the RCDC can do, this remains an iterative process. Creating an RCDC for a new FIM resource type is now a 2-8 hour job instead of an 8-32 hour job.
The Resultant Rights Editor is a nice bonus that allows you to setup scenarios (who is accessing what resource and which attributes to include) so that you can see what control will be visible, and enabled for the different users.
Three complaints (with paraphrased responses from Tools4FIM):
1) When I purchased the tool, the purchase was for a one-year license – I wasn’t warned anywhere about the one-year limit until I was completing the purchase. I didn’t spot it in the EULA when I installed the demo version (yes, I did read it; if I missed it please let me know). In my opinion, limit the license by time or by project, not both. (You get licensed for one project – one FIM install – for one year.)
Apparently, the one-year bit is that you only have one year to activate the license; the license itself has no time limit. I love it when my complaints are resolved before I make them.
2) Sometimes I may want to add a control that doesn’t really have anything to do with an attribute, yet the tool forces me to name the control after the attribute.
-- Next version (due out Q3) and they intend to allow those that purchase now access to that new version without penalty.
3) When adding/editing UocDropDownList controls it doesn’t let me set the Caption (what the user sees) for the Option differently than the Value (what the computer sees). It lets me set a hint but not the Caption.
Yes, I did read the help file, which tells me that I can do it but not how. So I do think that is a bug.
-- Aha! They told me that there is a way to do this when you modify the Value:
<edited 6/23/2011>
The Option Value is the value that the computer will see (e.g. SA) and the Constant is the caption that the user will see (e.g. Admin Account).
So it looks like this:
Wish list:
1) Copy all of the XML data sources (like regions, or other custom data sources) so that I can manage them centrally and copy them into other RCDCs. That way if a new country is born tomorrow, or next year, I only have one place to update the list of countries and their codes instead of several different RCDCs for several different resource types.
2) Let me copy all of the settings from one control to another. Sometimes I want four text boxes one right after another with all of the same settings, just different bindings.
-- Next version, and they promised to name the feature after me.
3) An XML editor (or a link to one) so that I can open the RCDC up for the manual tweaks that the RCDC Editor doesn’t do yet.
-- They are thinking about it for the next version.
4) Add the ability to run the export PowerShell script from the tool (probably also need to let me configure the exact command line).
-- Already planning on it in the next version
5) Add the ability to upload the RCDC right from the tool, optionally running iisreset.
-- Already planning on it in the next version
6) Home Page and Nav Bar editing, especially with the Resultant Rights Evaluator.
7) FIM Portal style sheet editing
8) Ability to diff with previous versions (whether stored by the Editor or by querying the FIM Service for recent requests to modify the RCDC).
Bottom line: This is a tool that I as a FIM implementer can’t live without, especially at the $775 price.

Tuesday, June 14, 2011

RCDC Requiring another field

Ok I just had to blog this.

I created a custom resource type in FIM for resource mailboxes (Room and Equipment) with accompanying RCDCs. Based on a Boolean attribute I hide or show a tab of info about Room resources on the edit and view RCDCs. (You can’t do that on the create RCDC because the object doesn’t yet exist.)

But I would like room number, on the hidden tab, to be required when the tab is visible and not when it isn’t. Obviously I can’t do that on create because the object doesn’t yet exist, so I can’t reference the Boolean attribute. So I just set the required property to true and figured it would either work or not. It does not work. The tab stays hidden until I click Finish, and then the tab is revealed and it insists on input for the field: “The required field cannot be empty”.


Isn’t that just weird?

However, by binding the required property of the control to the same Boolean attribute that the tab’s visible property uses, we avoid the issue. The tab stays hidden when the resource is not a room, but is shown when it is a room, and the fields are then required.
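To make the binding concrete, here is a rough sketch of what such a control might look like in RCDC XML. This is a hypothetical fragment, not from my actual RCDC: the attribute names (isRoomResource, RoomNumber) are example names, and you should verify the exact grammar against an RCDC you have exported from your own FIM Portal.

```xml
<!-- Hypothetical sketch: isRoomResource and RoomNumber are example attribute names -->
<my:Control my:Name="RoomNumber" my:TypeName="UocTextBox" my:Caption="Room number">
  <my:Properties>
    <!-- Bind Required to the same Boolean attribute the tab's Visible property uses -->
    <my:Property my:Name="Required" my:Value="{Binding Source=object, Path=isRoomResource, Mode=OneWay}"/>
    <my:Property my:Name="Text" my:Value="{Binding Source=object, Path=RoomNumber, Mode=TwoWay}"/>
  </my:Properties>
</my:Control>
```

The key idea is simply that Required is bound rather than hard-coded to true, so it evaluates to false whenever the tab is hidden.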

Monday, May 23, 2011

FIM 2010 R2 News

At Tech Ed Atlanta Brjann Brekkan and Mark Wahl discussed FIM 2010 R2 in a public forum – so here is a lot of info that is now in the public forum.

Mark covered the new items that will come out in R2:

  1. Web-based Password Reset (no need for a domain-joined computer, no need to install the Password Reset Client, no need for ActiveX, support for Firefox)
    1. Although for integration with the GINA (the login screen) you still need to install the FIM Password Reset Client
    2. Have the ability to mark QA gates as executing for everyone or only those coming through the extranet.
    3. Considering adding Captcha or OTP gates to phones
  2. Reporting
    1. Depends on System Center Data Warehouse (SCDW)
      1. But no separate licensing is required for SCDW
    2. Reports
      1. Membership Change Reports
      2. Object History
        1. Users
        2. Groups
        3. Sets
        4. Requests
        5. Policy Rules (MPRs)
  3. MA development – the EZ MA – Andreas Kjellman covered this one at TEC

Monday, May 16, 2011

Behind the scenes of RoomResources–Custom Properties

While using FIM and PowerShell to manage Exchange 2010 I was following along with a wonderful article on resource mailboxes that left me wondering a few things.

1) Exactly how is the data stored in the msExchResourceDisplay and msExchResourceSearchProperties attributes?

2) How is it stored with multiple custom properties?

3) Is manipulating those AD attributes sufficient or is PowerShell storing something in the Exchange Data store?

Here are the answers:


1) msExchResourceDisplay = “Room,FlatScreenTV”. It appears to be a single-valued string with commas.

msExchResourceSearchProperties at first blush appears to be a single-valued string with semicolons; however, further examination reveals it to be a multi-valued attribute.


2) What happens when multiple Resource Custom Properties are set?


msExchResourceDisplay = “Room,FlatScreenTV,Whiteboard”


So the new value is simply added to the old ones.
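The steps above can be sketched in code. This is a minimal illustration of keeping the two attributes in sync when adding a custom property, based on the storage layouts observed above (a single-valued comma-separated string versus a true multi-valued attribute); the function name and shapes are my own, not an Exchange or FIM API:

```python
def add_custom_property(display_value, search_values, new_prop):
    """Add a resource custom property to both Exchange attributes.

    display_value: the msExchResourceDisplay string, e.g. "Room,FlatScreenTV"
    search_values: the msExchResourceSearchProperties multi-value, as a list
    Returns the updated (display_string, search_list) pair.
    """
    parts = display_value.split(",") if display_value else []
    if new_prop not in parts:
        parts.append(new_prop)          # append to the comma-separated string
    search = list(search_values)
    if new_prop not in search:
        search.append(new_prop)         # append to the multi-valued attribute
    return ",".join(parts), search

display, search = add_custom_property("Room,FlatScreenTV", ["FlatScreenTV"], "Whiteboard")
# display -> "Room,FlatScreenTV,Whiteboard"
```

As the experiments below show, forgetting either half of this pairing leaves the resource half-broken (invisible in the Address Book, or invisible to search).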

3) Is manipulating these AD attributes sufficient?



Now the reveal:


It works!

4) Well I came up with another question – What happens if the AD programmer forgets to manipulate both attributes?


If it is missing from Display but is in Search


Then it isn’t visible in the Address Book but a search returns it as a result:


5) So what if I put MountedProjector back in Display but it is missing from Search?


It shows up but a search for MountedProjector reveals nothing:


Whereas a search for Whiteboard finds it:


Wednesday, April 20, 2011

Using FIM to manage BPOS/Office 365

Carol presented a solution to a very thorny problem – how to overcome the lack of delegation in BPOS. In BPOS a user is either an admin or a user, so she used FIM to provide the delegation. Very detailed, very complete solution. She illustrated some of the scripts she has posted on her blog.

Well done Carol!

FIM 2010 reporting using SQL Server Reporting Services (Jeremy and Craig)

Jeremy and Craig had an interesting shoot-out showing off their differing versions of reporting from FIM. Jeremy has an “agent” that he uses to pull the data out of FIM and store it in SQL, after which doing SSRS reports is not terribly difficult. Craig’s approach was to start off by creating a generic SSRS Data Processing extension for PowerShell, and then adjust it to pull data from FIM. Both approaches look very slick. Afterwards they explained how their efforts actually turned out to be quite complementary. Two thumbs up, gentlemen!

Cloud computing single sign-on. Making ADFS work with Google and Salesforce (Nikita Ryumin)

This TEC session on the Directory Services track was short but sweet illustrating how to connect ADFS to Google and SalesForce.

Tuesday, April 19, 2011

Desktop Virtualization and Identity Management

I did a lunch time presentation in partnership with Jonathan Sander. We presented how we can use Quest VWorkspace and Quest One Identity Manager to build a corporate store (we code named it VIPER) to provide a dynamic desktop experience.

Creating Authentication Activities in FIM (Ikrima Elhassan)

This session at TEC was quite interesting. Ikrima presented quite a lot of material about how to extend FIM with your own authentication activities, demonstrating a OTP password reset approach.

Code is available at


Hey readers, our Identity Practice at Ensynch is keeping us very busy. We would like to have more Identity consultants as part of our team. Come work with me and the rest of our fantastically talented Identity Team.

We are looking for people with experience in Forefront Identity Manager 2010 and people with experience in ADFS 2.0. We are looking for both Full Time Employees as well as people interested in being contractors for us.

Travel requirement: Depends on where you live and ranges from 15%-60%

Locations: Ideally New York/New Jersey (travel between 15%-30%)

Southern California (travel between 20%-50%)

Any other place in the continental US with access to a major airport (Travel between 50%-60%)

Shoot me an email at DLundell


Monday, April 18, 2011

Designing and Implementing RBAC Solutions with FIM 2010 Group Management

After I introduced Brad Turner and turned the time over to him, he showed off some really cool FIM extensions to enable RBAC. He even showed how it fits the NIST RBAC definitions even through level 3.

The key design decision was to extend the Set and Group objects. The Set then functions as a role. This allows for both explicit and criteria based membership. A new object type for a Role Membership allows for the user’s membership in a role to expire at an individual time.

FIM Best Practices: Sizing Your FIM Installation

I had a lot of fun presenting this session. Largely based on chapter 5 in volume 1 I showed how to decide on your High availability approach, how that impacts your topology choice, and then how to estimate your scale, load, and complexity points. Then based on those factors figure out how big to make your SQL Server that hosts the FIM service database.

In the middle I did enjoy putting in a plug for our Ensynch sponsored green, dishwasher safe water bottles, as I took a drink of my fruit punch Gatorade mix.

I received lots of great questions and got to see lots of familiar faces.

Can PXEs Fly? FIM and SCCM Integration (Rob Allen)

I was looking forward to this one, but got called away. I hope to look at the slides soon.

Creating Management Agents with the new EZMA (Andreas Kjellman)

At TEC 2011, Andreas Kjellman of Microsoft, who “owns” the FIM synchronization engine, showed off the upcoming EZMA framework.

The problem:

The existing eXtensible Management Agent (XMA) does not have a call-based import method; we are limited to using GUIDs as the initial anchors; and we don’t have partitions in an XMA.


The solution: the EZMA – which, IMO, will actually be a little harder to write than an XMA, but will allow the developer to do much more to make the FIM admin’s life easier.

Some of the new features:

Call-based import that you can batch! So, just like with an AD MA run profile step (see the figure), we can configure the batch size and it will actually have an impact, and you can also choose a partition to process.


The call-based export is modified so that you can batch it too. So instead of calling ExportEntry for each csentry object, you will get the ExportEntries method, which receives a collection of csentry objects that have pending exports.
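To illustrate the shape of that change, here is a small sketch of the batching idea – one call per batch of pending exports instead of one call per object. The names here are illustrative only, not the actual EZMA interface:

```python
def export_entries(csentries, batch_size):
    """Yield pending exports in batches of batch_size.

    Mimics the EZMA's ExportEntries call shape: the connector receives a
    collection of csentry objects per call rather than one at a time.
    Purely illustrative; not the real interface.
    """
    batch = []
    for entry in csentries:
        batch.append(entry)
        if len(batch) == batch_size:
            yield batch                 # hand one full batch to the target system
            batch = []
    if batch:
        yield batch                     # final partial batch

batches = list(export_entries(["cs1", "cs2", "cs3", "cs4", "cs5"], 2))
# batches -> [["cs1", "cs2"], ["cs3", "cs4"], ["cs5"]]
```

Batching like this matters when the target system has a bulk API (or per-call overhead), since the connector can then commit many objects in one round trip.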

The schema, partitions and hierarchy can be discovered programmatically.

Custom anchors – that aren’t GUIDs.

Even better support for custom parameters (of different data types)

Finally, the ability to do a full export! Which is great when you have a target that doesn’t store state. However, you must decide at design time which type of exports your MA will be executing. You can choose either delta or full, but not both.


The XMA will still be supported.

The EZMA is more of a developer activity than the XMA was. Your dev will need to learn new interfaces, but should need to know a little less about the internal workings of the sync engine.

Bottom Line

Good move, because now we can write EZMAs that are as fully functional as anything the product group does.

Files, FIM, and PowerShell (James Booth)

James Booth, former Microsoft Group Program Manager for MIIS (the precursor to FIM), presented on using PowerShell to process files in preparation for consumption by FIM.

James points out that “In the beginning, it was all files.” These call-based MAs are the new kids on the block. He also said that at Microsoft in 2000 the philosophy was “XML is the answer, now what is your question?”

James has posted his new commandlets to GitHub 

Commandlet – Description
Import-DirectoryCredential – Imports directory credentials from a file created using Export-DirectoryCredential, and returns a custom PowerShell object.
Export-DirectoryCredential – Exports directory credentials to a file (the counterpart to Import-DirectoryCredential).
Import-LDIF – Imports directory information from an LDIF file, and writes custom PowerShell objects to the pipeline.
Export-LDIF – Exports directory information from the pipeline to an LDIF file.

He also covered escaping the special characters in DN components.
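Escaping DN components is worth a quick illustration, since a single unescaped comma in a display name will corrupt a constructed DN. This is a minimal sketch of the escaping rules from RFC 4514 (special characters anywhere in the value; `#` and space only at the start, space also at the end); in production you would prefer a library routine where one is available:

```python
def escape_dn_component(value):
    """Escape an RDN attribute value per RFC 4514 (minimal sketch)."""
    escaped = []
    for i, ch in enumerate(value):
        if ch in ',+"\\<>;=':
            escaped.append("\\" + ch)               # special anywhere in the value
        elif ch == "#" and i == 0:
            escaped.append("\\#")                    # special only at the start
        elif ch == " " and (i == 0 or i == len(value) - 1):
            escaped.append("\\ ")                    # special at start or end
        else:
            escaped.append(ch)
    return "".join(escaped)

dn = "CN=" + escape_dn_component("Smith, John") + ",OU=Users,DC=example,DC=com"
# dn -> "CN=Smith\\, John,OU=Users,DC=example,DC=com"
```

Piping file data through a function like this before building DNs is exactly the kind of “munging” James described.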

James also talked about “munging” the data by piping the data through other functions to transform the data.

He also cautioned against thinking that PowerShell is the only way to do something.

Saturday, April 16, 2011

TEC 2011–FIM Workflows Deep dive

I am already in Las Vegas, prepping to assist my fellow Ensynch coworkers Joe Zamora and Rebecca Croft as they lead an awesome, value-packed pre-conference workshop tomorrow (Sunday) morning from 8 AM to 12 PM (noon). Jerry Camel and Brad Turner will also be around to assist.

There are so many good sessions to attend this time here are some of the ones I am looking forward to:

Monday morning gets the FIMsters off to a great start with a choice of two great sessions:

1) Andreas Kjellman of Microsoft presenting on Creating Management Agents with the new EZMA. Apparently “in the next few months a new” – and to me very exciting – “development framework for creating management agents will be released.” This is a great one to send a developer to, so they can prep for using the EZMA, which sounds as though it will make the XMA obsolete.

2) James Booth, formerly of Microsoft, now of Boothbilt, makes his return to speaking at TEC as he presents on Files, FIM, and Powershell. I am looking forward to learning how James has made use of PowerShell to automate certain chores in maintaining FIM. Man, I love collaborating with that guy!

Then in the post-lunch sleepiness we have two exciting speakers to keep us awake: Craig Martin, FIM MVP, speaking on FIM PowerShell Deep Dive, and Rob Allen with his cleverly named Can PXEs Fly? FIM and SCCM Integration. Which one to choose?

Immediately following Craig’s session I am delivering: FIM Best Practices: Sizing Your FIM Installation. Hopefully it will be a beaut!

In the same time slot, another FIM MVP, Carol Wapshere (of Miss MIIS fame), will be speaking on Head in the clouds – navigating the identity pitfalls of a complex cloud migration. At that altitude it sounds like someone got a bloody nose. Nonetheless, I am sure that Carol’s clear and direct style of speech will help others avoid the nosebleed.

Another “bloody nose” session will also be going on at the same time as mine: AD FS Troubleshooting in the Wild – Cookies and Tokens and Fiddler, Oh My! by Laura Hunter and Brian Puhl. Those MS IT masters of disaster, err I mean ADFS, are at it again.

Winding up the day, and setting the stage for the Quest-sponsored TEC party, is Brad Turner, showing off some really slick FIM add-ons from one of our latest projects; I will be there to lead things off for Designing and Implementing RBAC Solutions with FIM 2010 Group Management. Also in the same hour, Brian Komar shows off some work from a recent project (thanks for leading that one, Brian!) illustrating Simplifying Certificate Enrollment to non-Windows Computers.

To Gil Kirkpatrick, Christine McDermott, Stella De Jean Lowe, and all of the other folks at Quest involved in TEC, I say you have put together what looks to be an amazing set of pre-cons and first-day sessions! Look for my reviews on Monday as the day goes; I am going to try and blog it as we go. I suppose that’s really a Twitter kind of thing, but then again has anyone ever known me to limit myself to 140 characters?

Wednesday, April 13, 2011

Making Sense of the Cloud



National Roadshow Series:  2 High Value Sessions in 1 Business Focused Technology Briefing from Leading Industry Experts at Ensynch and Microsoft

It’s time to make sense of the plethora of rhetoric around the term "Cloud." It's time to cut through the hype and figure out how to leverage the latest Dynamic Private Cloud and Public Cloud technologies and provide real value to your business.
Why Attend?
Learn how organizations worldwide are realizing tremendous business value as they begin to migrate portions of their business to securely provide IT as a service through private and public cloud solutions.  Unlike many product-focused technology events, this event is focused on business use cases and solutions.  You will leave this event having gained real value and perspective that you can immediately apply to your business's information technology strategy and roadmap.


Complimentary Steakhouse Lunch  will be served at noon to all attendees present.
We recommend attending both event sessions, but you can choose to attend the one that will provide you with the most value.

Session 1- Building your Business Cloud (10:30 AM)
How to Makeover your Infrastructure by providing IT as a Service through Virtualization, Next Generation Management, and Automation solutions from Microsoft.

Session 1- Intended Audience:
Business focused IT Executives and Leaders, Infrastructure and Desktop Management Directors and Managers, Identity Management and Security Directors and Managers.
Products Relevant in Session 1:
Windows 7, Windows Server 2008 R2, System Center Suite, Windows Server Hyper-V, Microsoft Desktop Optimization Pack, Forefront Identity Manager 2010, Windows Intune, Quest vWorkspace, Quest One Identity Manager, Active Directory Federation Services (ADFS)

Lunch Break: (12:00 PM)
Complimentary steakhouse lunch will be served.
Session 2- Consuming your Business Cloud (1:00 PM)
How to makeover your productivity and business intelligence platforms powered by SharePoint 2010 and SQL to enable One Place to View the Facts, Collaborate, and Provide a Consumer Cloud Experience at Work. 

Session 2- Intended Audience:
Business and IT Executives, SharePoint/Collaboration/Web Directors and Managers, Business Intelligence Leaders and Stakeholders.
Products Relevant in Session 2:
SharePoint 2010, Office 365, Lync 2010, SQL Server 2008 R2, SQL Azure, Windows Azure

Choose Your City. RSVP Today.

April 27, 2011 - Irvine, CA  (Ruth's Chris Steakhouse)


April 28, 2011 - San Diego, CA (Donovan's Steakhouse)


May 4, 2011 - Parsippany, NJ  (Ruth's Chris Steakhouse)


May 5, 2011 - New York, NY (Park Avenue-Spring Restaurant)


May 11, 2011 - Phoenix, AZ (Donovan's Steakhouse)





Tuesday, March 15, 2011

EBook of Vol 1 is now available

After listening to many pleas for an e-book version of FIM Best Practices Volume 1, I have relented and created an e-book version. List price is $22.00, but here is a 10% discount for the next week, bringing it to $19.80.

Most of the requests were about speed of delivery and searchability, but the one that got me was a request based on eyesight, made by Bill Singh. So you can all thank him for there being an e-book of Volume 1.

So here it is as a PDF. With no DRM. That’s right, no DRM. I hate it when I get PDFs with DRM, so I am making this available with no DRM, protected only by copyright and your sense of honor. I am trusting all of you, my readers, to behave appropriately. So you can have this on your iPad, your iPod, your computer, or any other device you have that uses PDF – but not your neighbor’s device.

You can purchase the e-book through Lulu.


Or from the regular link click on the File Download.


Wednesday, March 2, 2011

Webinar: Cloud’s Silver Lining: Identity Management

ensynch logo

Business Insights Webcast: 
The Cloud's Silver Lining: Identity Management

main image

Join Us for an Informative Webcast on the Value of IDA in the Cloud
- Part 2 in a Series of Webcasts from Microsoft FIM MVP David Lundell -

Identity Management is a critical component to realizing the true value of the Cloud.

Solutions from Microsoft, including Forefront Identity Manager (FIM), Active Directory Federation Services (AD FS), and Microsoft Forefront Unified Access Gateway (Forefront UAG), allow you to get the most out of your cloud applications (such as Office 365, BPOS, and other Software as a Service (SaaS) solutions) while enabling a seamless transition in managing the identities of your users.

If you are planning to migrate or deploy applications to the Cloud, you must first address provisioning and Single Sign-On (SSO) in order to enable a seamless transition.

Delivered by Microsoft FIM MVP David Lundell, this webcast will provide insight on how to save money and make your business more agile with the Cloud, by ensuring you have a successful Identity Management strategy.

Presenter: David Lundell, Microsoft MVP, Forefront Identity Manager, Ensynch

David is the author of FIM Best Practices Volume 1 (David Lundell, 2010), a Microsoft Most Valuable Professional (MVP) for Microsoft Forefront Identity Manager 2010, a virtual technical specialist for Microsoft in the identity space, and the Identity Management Practice Director for Ensynch. David frequently speaks at the Directory Experts Conference and the Experts Conference, and holds an MBA and numerous other technical certifications.

When: March 9th, 2011


Time: 10:00-11:00 AM Pacific



Monday, February 14, 2011

FIM Training back—on May 23-26 in Phoenix

Last week I taught a group of students 50382A Implementing Forefront Identity Manager 2010, and referenced FIM Best Practices Volume 1 to supplement. It was a great bunch, full of humor. We even had one gentleman fly all the way from Australia to attend my class. I felt quite honored.

Well due to popular demand we are going to run it again May 23-May 26 (M-Th) once more in downtown Phoenix.

Register by emailing, providing your contact info and which class and date you want to attend. You will then be contacted to complete the registration. The cost of the course is $1895 USD.

The courseware is very good, with quite solid labs. The first lab provides a great overview experience of FIM, much like the first lab of the ILM 2 Beta pre-conference workshop I did at DEC 2008 in Chicago, only even sweeter and focused just on FIM. Then in lab 2 you start off with a fresh environment and build the solution from there, setting up Management Agents, configuring join/projection rules and attribute flow rules, and learning about the sync engine. Module 3 delves deeper into the Sync Engine; I showed the ILM 2 (FIM) flow chart with screenshots and code snippets that Brad and I did in 2009. Then with Module 4 we start into managing users through the FIM Portal, and then into Management Policy Rules and Sets (here I diverted to cover some more graphical slides around MPRs and Sets). Module 5 covers inbound and outbound sync rules, showing how to replace the classic rules with declarative rules from the portal. With Module 6 we dived into the management of credentials – password sync and password reset. In Module 7 we run through group management, and in Module 8 we covered many other items, including automating the Sync Engine.

This course is intended for Systems Engineers, Developers, Architects, and Project Leaders who need to gain a good understanding of how Forefront Identity Manager 2010 can be applied to manage identity information across a number of directories or databases. It is also suitable for those who simply want to review the technology in some depth.

After completing this course, students will be able to:

  • Understand FIM concepts and components.
  • Identify appropriate FIM scenarios.
  • Manage users, groups, and passwords using FIM.
  • Synchronize identity data across systems, such as Active Directory and HR.
  • Understand the issues involved in loading data (initial load, backup, and disaster recovery).
  • Configure security for different levels of user.
  • Manage password self-service reset and synchronization.
  • Automate run cycles.
  • Handle sets, simple workflows, and management policy rules (MPRs).