Wednesday, December 24, 2014
'Twas the night before Christmas
Not an identity was stirring, not even a Passport .NET
The user accounts requests were submitted with care
Hoping that their access would soon be there
The users were nestled all snug in their beds
While visions of being able to do their jobs danced in their heads
The servers and computers were in sleep mode
Awaiting someone to move a mouse and send the wake up code
An urgent email pinging my iPhone created a vibration
I sprang to my Surface to see what was the perturbation.
Opening up Windows 8.1, I signed in to the computer
I ran AD Users and Computers and Event Viewer
User accounts had been created and added to groups
All while I had slept after eating my soups
As I looked at my network, what should appear?
But a brand new Identity Management System so nice and clear
On Sync Engine, on Management Agent! Now MPRs and Workflows!
On Metaverse on Sync Rules! On PowerShell and Data flows!
To the web service! To Self Service Password Resets!
Provision, Deprovision and Synchronize all the sets!
Ok, ok so maybe I am just a bit eager for the release of Microsoft Identity Manager (due out 1st half of 2015).
Friday, December 12, 2014
Speaking at 2015 Redmond Summit (Jan 27-29 '15)
This summit is put on by my friends at Oxford Computer Group.
I will be speaking on Password Sync vs. ADFS. Then the next day I will speak on the Business track about How Identity Management Impacts the Bottom Line.
See you there
Thursday, December 4, 2014
What AD Attributes are indexed? ANR? Tuple? PowerShell
Write-Host "Tuple Index Enabled Attributes"
# searchFlags bit 32 (0x20) = tuple index
Get-ADObject -SearchBase ((Get-ADRootDSE).schemaNamingContext) -SearchScope OneLevel -LDAPFilter "(searchFlags:1.2.840.113556.1.4.803:=32)" -Property objectClass, name, whenChanged, whenCreated, lDAPDisplayName | Out-GridView
Write-Host "ANR Enabled Attributes"
# searchFlags bit 4 (0x4) = Ambiguous Name Resolution (ANR)
Get-ADObject -SearchBase ((Get-ADRootDSE).schemaNamingContext) -SearchScope OneLevel -LDAPFilter "(searchFlags:1.2.840.113556.1.4.803:=4)" -Property objectClass, name, whenChanged, whenCreated, lDAPDisplayName | Out-GridView
Write-Host "Indexed Enabled Attributes"
# searchFlags bit 1 (0x1) = indexed
Get-ADObject -SearchBase ((Get-ADRootDSE).schemaNamingContext) -SearchScope OneLevel -LDAPFilter "(searchFlags:1.2.840.113556.1.4.803:=1)" -Property objectClass, name, whenChanged, whenCreated, lDAPDisplayName | Out-GridView
The above script is something I use to quickly see what is indexed in an AD environment.
Friday, October 24, 2014
SQL Maintenance for FIM and any other databases
Friday, October 3, 2014
Mistaken Identity
Ironic.
A few years before that, I visited a client whose VP of HR had his account disabled when they let the janitor go. Again, the same last name, but this time also the same first name.
What went wrong?
In both cases the AD account was linked to the wrong employee record.
How did that happen?
In the first example they had been diligently entering the employeeID into the employeeID field in AD long before Identity Management. The helpdesk had a tool to query the HR database to look up an employee ID. Apparently, the day this PM had been hired, HR was a little slow or the helpdesk made a mistake. Either way, they plugged the wrong employeeID into his AD account. So when the other gentleman was termed, the script they ran (this was before we turned on FIM) disabled the PM's account too.
Garbage in, garbage out. While FIM was not the "perpetrator" it would have done the same thing acting on the wrong data.
In the VP of HR example, the initial joining was done using MIIS (a FIM predecessor) based on first name and last name. Somehow in the intervening years no one noticed that the wrong job title had been pushed into AD.
So how can you avoid this? You can't entirely, but you can reduce the number of occurrences. The first step is to understand the data you are given. The second step is to question the validity of the data -- especially if a human was involved. If the whole process has been automated, then any errors should be consistent throughout. A firm hiring George Cludgy (instead of Clooney) would have that data flow from HR out to AD and everywhere else with the correct employeeID. The name itself might be wrong, but at least it would be consistent. However, if a human gets involved to do data entry, even though they look it up, you have a chance for errors. So you can't take the presence of an employeeID in AD for granted. You must question its validity and confirm it.
I prefer to get dumps of HR and AD and use PowerShell to match them up. Just kidding -- while PowerShell actually can do some matching, this really is a job for SQL.
Then, by running queries in my database before setting up FIM, I can get a good idea of the matches and non-matches. I can then get the client to confirm the matches and fix the non-matches.
Steps:
1) Look at and understand the data
2) Question its validity
Did humans input the data?
3) Export from AD using csvde
4) Get an export of the employees
5) Load 3 and 4 into a SQL database
6) Write some queries joining based on employeeID (if present)
7) Look at the matches and come up with some additional ways to verify, such as including first name and last name
8) Use a nickname database to handle the David vs. Dave issues.
9) Use Fuzzy lookups from SSIS to generate possible matches.
10) Get the client to validate your matches, especially the possible matches
11) Get the client to work on the non-matches (these accounts may end up getting disabled if no match can be found)
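As a sketch of the kinds of queries I mean in steps 6, 7 and 11: the table and column names below (HR_Employees, AD_Accounts, and their columns) are illustrative placeholders for whatever your csvde and HR exports loaded in steps 3-5, not from any particular system.

```sql
-- Step 6: join on employeeID to find the matches
SELECT hr.EmployeeID, hr.FirstName, hr.LastName, ad.sAMAccountName
FROM HR_Employees AS hr
INNER JOIN AD_Accounts AS ad
    ON ad.employeeID = hr.EmployeeID;

-- Step 7: candidate matches on name for accounts missing an employeeID
-- (these still need human confirmation -- remember garbage in, garbage out)
SELECT hr.EmployeeID, hr.FirstName, hr.LastName, ad.sAMAccountName
FROM HR_Employees AS hr
INNER JOIN AD_Accounts AS ad
    ON ad.sn = hr.LastName
   AND ad.givenName = hr.FirstName
WHERE ad.employeeID IS NULL;

-- Step 11: non-matches -- AD accounts with no corresponding employee record
SELECT ad.sAMAccountName, ad.employeeID
FROM AD_Accounts AS ad
LEFT JOIN HR_Employees AS hr
    ON hr.EmployeeID = ad.employeeID
WHERE hr.EmployeeID IS NULL;
```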
Tuesday, September 16, 2014
Phoenix MVP Roadshow Transform the DataCenter Wed Sept 24 4 PM-8PM
I will be presenting on why we want to get to Active Directory based on Windows Server 2012 R2 and how to get there. My fellow MVPs will be covering the rest of the agenda. I also created an IT Clue game to play in small groups where the objective is to figure out who stole the data and how it could have been prevented.
Presented by: MVP David Lundell, MVP Jason Helmick, MVP Rory Monaghan, MVP Tom Ziegmann
Agenda:
4:00 – 4:30 | Registration and Welcome/Dinner (post/share whoppers, challenges, and questions through Twitter and paper)
4:30 – 5:00 | IT Clue game – in small groups
5:00 – 5:35 | To Upgrade or not to Upgrade?
§ Why you really need to upgrade from Windows Server 2003 or 2008! (Server Platform)
§ Demo: Combating Configuration Drift with PowerShell
§ Desired State Configuration Q&A
§ Why you really need to upgrade your Active Directory from Windows Server 2003 or 2008 to 2012 R2!
§ Q&A
5:50 – 6:00 | 10 minute Break
6:00 – 7:00 | Upgrading to Windows Server 2012 R2
§ How to upgrade from Windows Server 2003
§ How to upgrade from Windows Server 2008
§ Q&A
§ How to upgrade AD from Windows Server 2003
§ How to upgrade AD from Windows Server 2008
§ Q&A
7:00 – 8:00 | Datacenter - Dealing with Application Compatibility and Delivery
§ Discussion and Demos for strategizing Application Migration
§ Discussion and Demos of App-V for Application Delivery
IT Clue game -- someone stole the data
Wrap up
ADUC Common Queries: Days Since Last Logon
LastLogon is not replicated, so to really get it you have to query every single DC. So I was reasonably certain that the query didn't use LastLogon but rather LastLogonTimestamp, which was created "to help identify inactive computer and user accounts." Assuming default settings, "the lastLogontimeStamp will be 9-14 days behind the current date."
However, I couldn't find any documentation confirming that so I had to test it. For all I knew it could have been querying all the DC's to get an accurate LastLogon.
Sure enough the account showed up. Conclusion: ADUC's Days Since Last Logon query is using the LastLogonTimeStamp as I expected.
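If you want the same answer from PowerShell instead of ADUC, a sketch like the following will do it (assuming the ActiveDirectory module; the 90-day cutoff is just illustrative). Keep in mind the caveat above: the replicated value can lag real logons by 9-14 days.

```powershell
Import-Module ActiveDirectory

# Accounts whose replicated lastLogonTimestamp is older than 90 days.
# With default settings this value can be 9-14 days behind the true last logon.
$cutoff = (Get-Date).AddDays(-90)

Get-ADUser -Filter 'lastLogonTimestamp -lt $cutoff' -Properties lastLogonTimestamp |
    Select-Object Name,
        @{Name = "LastLogonTimestamp"; Expression = { [DateTime]::FromFileTime($_.lastLogonTimestamp) }} |
    Sort-Object LastLogonTimestamp
```

The AD module's -Filter handles the DateTime-to-file-time conversion for you; for an exact LastLogon you would still have to query every DC and take the maximum.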
Friday, July 4, 2014
Happy Independence Day -- Using PowerShell for Reporting
Now I could have taken the data and imported it into SQL and then busted out some awesome queries in no time flat. But my buddy Craig Martin keeps insisting how awesome this PowerShell stuff is. So I decided to give it a try; plus, if I can get it to work, then it will be faster to run this repeatedly from PowerShell rather than needing to import it into SQL Server. I am actually a big believer in using the right tool for the job. Otherwise you end up blaming the tool for failing you when you should have picked a different tool, one better suited for your task.
When working in a language of which I am not yet the master, I like to start small and build, so that I don't create 15 billion places to troubleshoot my code. So we start with Get-ADComputer. I made certain that my filter, searchbase, searchscope and properties give me what I want:
Get-ADComputer -filter * -searchscope subtree -SearchBase "OU=Workstations,DC=Domain,dc=com" -Resultsetsize 4000 -Properties whenCreated
whenCreated gives me the complete date and time, but I want to group and count by day. So I needed to transform the whenCreated to a date with no time. The .Date property will work for that, but I struggled with how to get it into the pipeline for further processing. Eventually I discovered that I can use the @ symbol to denote a hash table and tell the Select-Object commandlet to transform it with an expression and give the result a new name. (Thanks Don Jones)
Get-ADComputer -filter * -searchscope subtree -SearchBase "OU=Workstations,DC=Domain,dc=com" -Resultsetsize 4000 -Properties whencreated | Select-Object -Property Name,@{Name="DateCreated"; Expression = {$_.WhenCreated.Date}}
I later discovered I could do the same thing with the Group-Object commandlet which simplifies the command set. So I tack on: | Group-Object @{Expression = {$_.WhenCreated.Date}} -NoElement
to get:
Get-ADComputer -filter * -searchscope subtree -SearchBase "OU=Workstations,DC=Domain,dc=com" -Resultsetsize 4000 -Properties whenCreated | Group-Object @{Expression = {$_.WhenCreated.Date}} -NoElement
But then, when sorting, if I want a true sort by date rather than a textual sort, I once again need to use an expression, because the Group-Object commandlet has transformed my DateTime values into strings. So I tack on:
| Sort-Object @{Expression = {[datetime]::Parse($_.Name) }}
So all together with a little message at the beginning:
Write-Host "Daily totals of computer migrations"
Get-ADComputer -filter * -searchscope subtree -SearchBase "OU=Workstations,DC=Domain,dc=com" -Resultsetsize 4000 -Properties whencreated | Group-Object @{Expression = {$_.WhenCreated.Date}} -NoElement | Sort-Object @{Expression = {[datetime]::Parse($_.Name) }}
Tuesday, July 1, 2014
8 Time MVP
Looking forward to the ongoing journey with this product set and the wonderful friends I have made along the way: product group members (past and present), MVPs (past and present), readers (book, blog, Twitter) and other Identity Management professionals.
Tuesday, June 24, 2014
Projects and Heisenberg's Uncertainty Principle
The old saying goes "a watched pot never boils," especially if you keep sticking a new thermometer into a heating pot of water every two seconds. Observations change the system. Frequent observations can change it even more.
On a project, when you get asked for status (or position), it alters your velocity. If you get asked often enough, your velocity slows and then halts, which isn't the kind of change leaders are looking for.
An article in the Wall Street Journal reveals that even interruptions as short as two seconds can lead to errors.
So observation affects the system. That doesn't mean that we can go without measuring, just that leaders, managers and project managers all need to keep in mind that the demand for constant updates alters the velocity (usually slowing) of the people in the system.
Thursday, May 1, 2014
To Farm, or not to Farm, that is the question --
- Whether 'tis nobler in the mind to suffer
- the slings and arrows of outrageous fortune
- Or to take Farms against a sea of patches
- and by opposing end them? To die, to sleep --
Today I will be "moderating" the debate about using SharePoint Farms vs. Stand-Alone as the foundation for the FIM Portal. In this corner we have Paul Williams of Microsoft sharing knowledge from his hard-fought victories with FIM and painful experiences with Farms. In the other corner we have Spencer Harbar, SharePoint MVP, applying his years of SharePoint expertise to the FIM world, providing a definitive guide to installing the FIM 2010 R2 SP1 portal on SharePoint 2013.
Spencer points out that "farm deployment aspects are generally not well understood by FIM practitioners which leads to a number of common deployment and operational challenges." Point conceded. I saw much of the same thing with regards to MIIS and ILM when it came to deploying SQL Server.
Spencer argues that "the general idea is to build a small dedicated SharePoint instance purely for the purposes of hosting the FIM Portal [and FIM Service] and nothing else (although it could also host the Password Registration and Reset web sites)" and that by deploying a farm instead of Stand-Alone, the "craptastic demoware default configuration," you can avoid "a bunch of unnecessary goop." Note: Assuming Spencer knows, but just to clarify for everyone, the Password Portals use IIS and do not need or use SharePoint.
An example of the "unnecessary goop" is the SharePoint Search Service Application, which when installed then requires us to turn off the SharePoint Indexing job. A benefit of avoiding "a bunch of stuff we don’t want" is that it "minimizes the attack surface available." Minimizing attack surface is a good thing.
Spencer opines that "the Standalone Server Type... is evil, pure and simple." He also decries the use of "Embedded SQL."
Paul shares some compelling experience based evidence to think about using Stand-Alone instead of a farm, stating that a farm gives you a "serious headache when patching ... more operational maintenance" (more SQL Server Databases to manage instead of the "optimised WID[embedded SQL] files that are totally expendable") "and more complexity around backup and restore (or rebuild) and patching SharePoint itself" not to mention when you need to "[modify] web.config."
Patching needs to be explored further. According to Paul, you must "patch ... nodes sequentially," which "takes quite a bit longer than a standalone node" because "the FIM installer isn't optimised for a farm." A farm would normally let you "deploy the solution pack once," but instead we have a single installer for the FIM Service and Portal, so the patch runs the same way on every node. Since you need to patch the FIM Service on each node, you must run the patch on each node, which will also see the FIM Portal, "retract the solution pack and deploy the new one," which in turn causes "all application pools related to SharePoint to be recycled." Since "the retraction and redeployment is global (to SharePoint)," that means "that downtime affects all nodes in the farm – you can't drop one out of the NLB array, patch, add back, drop the next, etc." Whereas if you do Stand-Alone you can "drop one out of the NLB array, patch, add back, drop the next, etc."
I know that with some of the pre-R2 updates I have been able to run the patch on the first node of the farm, installing FIM Service and Portal, and then on the second node just installing the patch for the FIM Service, since the Portal bits had already been updated. I need to double check whether this is still the case (since then most of our installs have been stand-alone).
Paul continues with the woes of the Language packs that they "comprise some file system files for the FIM Service and SharePoint solution packs," which for a Farm means repeated downtime for the whole farm as each node is Language Packed. If you need language packs then a farm is still bad news for downtime even if the method I have used still applies for the Service Pack and hotfixes.
| Pros for SharePoint Stand-alone for FIM | Pros for SharePoint Farm for FIM |
| --- | --- |
| Setup is simple (it creates the first site collection, plus database and site for you) | Can get a much smaller attack surface by not installing "unnecessary goop" |
| Don't have to have a separate SQL instance (which you must make highly available to avoid a single point of failure) to manage, back up, etc. | Avoid the overhead of running the Windows Internal Database/SQL Express Edition (aka Embedded SQL) on each node (overhead that we haven't seen cause FIM performance issues) |
| Can patch one server at a time without taking down the whole NLB array of FIM servers (also each node is faster to patch) | Can deploy pure SharePoint items and CSS files once instead of to each node |
Perhaps there are ways to get the most of the best of both worlds.
- Install one Single Server SharePoint Farm for each FIM Portal node
- Upside: You avoid the painful patching process and Language Pack process
- Upside: Done right, you have the smaller attack surface (you would get complete control)
- Downside: More complex installation, but you could use the very complete scripts from Spencer to do this
- Downside: Shoot where do I want to put all those databases? I could put them on the SQL Server that will host the other FIM databases
- Separate the FIM Service from the FIM Portal
- Upside: This way the patching and language packs that impact the portal should only need to be done once, though you still have downtime for the whole farm.
- Upside: Smaller attack surface
- Upside: Pure SharePoint items and CSS files get deployed and configured only once
- Downside: more vm's/machines to manage and more FIM Server licenses to buy
- Install Stand-Alone and find a way to reduce the "attack surface" by eliminating some of the "unnecessary goop"
- This has most of the upsides and few of the downsides if we can find a way to do it
- Spencer: This is where I would love to have your expert opinion: How to reduce the attack surface on SharePoint Stand-Alone.
Conclusion: Initially, working with Brad Turner, I went with Farms, but when I saw the Language Pack issues I thought Stand-Alone. Also, when trying to keep it simple for non-SharePoint admins, I thought Stand-Alone. As always there are trade-offs, and I want to see more discussion before we settle on a single answer or even a definitive decision tree for which one to choose. For now I lean towards each FIM Portal and Service node having its own SharePoint Stand-Alone instance, but I would love to advance the state of the art with better security and possibly performance.
All: Give me your thoughts on one vs. the other or on the additional options.
Note: Ross Currie also provides a guide resulting from his hard fought battle to get FIM on SharePoint 2013
Note: Paul and Spencer AFAIK have never actually carried out a debate on this topic.
Wednesday, April 30, 2014
MIM's the word -- New name for FIM
Last week the product group announced the new name for FIM, and MIM's the word: Microsoft Identity Manager.
Of course, as a good futurist I had made enough guesses that I got this one right, even though, as an honest man, I must admit I also had it wrong -- Azure is not part of the name.
Fortunately, they didn't go with APE nor AILMENT, nor MIME, nor MIAMI, nor MICE, nor MAIM, nor WIMP. MIM's the word!
Hopefully, many of my readers have been entertained by my speculation. It has been fun. So now back to real work ... what will it be called in the release after the next one?
Hmm...
Hybrid Identity Manager (HIM) -- Too sexist
Hybrid Identity Provisioning Engine (HIPE) -- Hype -- nah
Hybrid Identity Access Engine (HIAE) -- pronounced Hi yah! I could go for that one!
Friday, April 18, 2014
Mailbag: Learning FIM, SQL and IIS
First think for a moment about your best learning styles for technology. Do you need to read the concepts and architecture first and then do it? Do you need to watch a video and then read, and then do it? Do you need to try it and then go back and read? Do you need an instructor? Sometimes you have to learn through experimentation. In the early days of ILM 2 Beta there wasn't much info so we had to experiment. Brad Turner and I spent many days in a lab configuring and trying things out to see what was the best practice.
Fortunately there are a fair number of videos, articles, virtual labs and classes about all three subjects. In general I find the virtual labs to be a great way to get some quick hands-on lab knowledge without having to labor endlessly to set up your own lab. Not that you won't get something out of that experience. But sometimes you need to pick up tidbits or try something out before deciding you need to set up a more permanent lab to experiment with.
FIM, SQL and IIS rely on Windows Server, Active Directory and Networking. It is surprising how many issues get resolved through knowledge of basic networking and its troubleshooting tools. Understand how client applications use DNS to find what they are looking for and SPNs to authenticate through Kerberos. If you are shaky or want a refresher I encourage people to start with those topics.
For FIM I would start with the Ramp Up training. It provides you with video, lab manuals and the virtual labs. Of course I also recommend my book. There is also another FIM book by Kent Nordstrom. Beyond that here is a great list of resources: http://social.technet.microsoft.com/wiki/contents/articles/399.forefront-identity-manager-resources.aspx#Learning_FIM_TwentyTen
SQL: this is more in the context of what you need to know about SQL to support FIM. Start with a presentation I gave a few years ago at The Experts Conference on Care and Feeding of the Databases, as it gives you some perspective on what you need to know about SQL to support FIM: configuring overall memory for SQL, TempDB configuration, index management, backups, transaction logs, and recovery models. The last chapter of FIM Best Practices Volume 1 covers how to intelligently automate your SQL maintenance.
If you want to start learning SQL queries try http://www.ilmbestpractices.com/files/I_Dream_in_SQL.zip or take the Microsoft course
IIS: Again this is in the context of what you need to know about IIS to support FIM.
Overview of IIS 8 (Windows 8 and Windows Server 2012) http://technet.microsoft.com/en-us/library/hh831725.aspx
Overview of IIS 7 http://technet.microsoft.com/en-us/library/cc753734(v=WS.10).aspx
Great post comparing how IIS 6 through 8 deal with SSL.
Intro to IIS 8 virtual lab http://go.microsoft.com/?linkid=9838455
Thursday, April 17, 2014
New name for FIM?
Azure Identity Manager (AIM) -- I would be ok with this
Azure Role Based Access Manager (ARBAM) -- Explosive sounding name
Azure Provisioning Engine (APE) -- Please no!!
Azure Identity Technology (AIT) -- pronounced 8 or aight. Nah.
Azure Identity Sync Lifecycle Engine (AISLE) -- Certainly when people walk down the aisle they have an identity changing event.
Azure Identity Lifecycle Management Engine Next Technology (AILMENT). I really hope not we want to cure ailments not install one for you.
My Official guess -- Azure Identity Enhancements (AIE)
Unless we have already seen the new product name -- Azure Active Directory Premium (AADP).
Maybe the on-premise version will have a slightly different name
Azure Active Directory Premium On Premises Edition (AAD POPE)
The above has been pure speculation. I have no inside knowledge on the name.
Hints of FIM's Future: Azure Active Directory (AAD) Sync
But then, yesterday, I watched Andreas Kjellman present at the FIM user group
Andreas unveiled the AADSync, the Azure Active Directory Sync that will replace DirSync to sync from your Active Directory to the cloud. I finally got it! My crystal ball wasn't broken!
AADSync is built on the next generation of the Sync Engine. 80% of the scenarios for syncing with Azure (Office365) will be handled with a wizard, including Multi-Forest. For more advanced scenarios you will be able to use a significantly upgraded function library to do "declarative provisioning" with sync rules. In fact no code for rules extensions will be permitted.
What does this mean for FIM?
I speculate that eventually FIM will follow this path. Since this next version seems to support the same connector framework, I think we will continue to see connector development as well as continued cloud capabilities a la Azure Access Enhancements and Azure AD Premium.
Thanks to the user group sponsor -- the FIM team, hosted by Carol Wapshere for putting it together and eventually providing the recording found here: http://thefimteam.com/fim-team-user-group/
AADSync is available now in Preview.
Wednesday, April 16, 2014
Good RID(ance, I mean issuance)
In short, most of you would be more likely to encounter this in a test or dev environment where you destroy and create many, many users as part of your testing with FIM.
So Windows Server 2012 to the rescue.
1) It adds a bit: you can now unlock that bit and have 31 bits for the RID, or about 2 billion RIDs.
2) You get warnings in the event log whenever you consume 10% of the space left since your last warning.
3) Now there is a safety mechanism: you can't increase the RID Block Size to higher than 15,000. Previously there was no limit, and you could have allocated the entire RID space in one transaction to one domain controller.
4) There are also brakes. When you are within 1 percent of having only 10% of your global RID space left, you get warned, and there is also an artificial ceiling so that you can fix whatever is chewing up your RIDs before you run out.
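To keep an eye on RID consumption yourself, you can read the rIDAvailablePool attribute from the domain's RID Manager object. This is a sketch of that well-known approach, assuming the ActiveDirectory module and connectivity to your own domain:

```powershell
Import-Module ActiveDirectory

# Report how many RIDs have been issued in the domain.
$domainDN   = (Get-ADDomain).DistinguishedName
$ridManager = Get-ADObject "CN=RID Manager$,CN=System,$domainDN" -Properties rIDAvailablePool

# rIDAvailablePool is a single 64-bit value:
#   high 32 bits = total size of the global RID pool
#   low 32 bits  = next RID to be allocated (i.e., RIDs issued so far)
[int64]$pool   = $ridManager.rIDAvailablePool
[int64]$total  = [math]::Floor($pool / [math]::Pow(2, 32))
[int64]$issued = $pool - ($total * [math]::Pow(2, 32))

"RIDs issued: {0:N0} of {1:N0}" -f $issued, $total
```

You can get similar information (plus per-DC pool details) with dcdiag /test:ridmanager /v.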
In short good RID(ance I mean issuance).