Performance implications of a lot of Sets

I’ve been creating lots of Sets lately. Lots and lots. I created over 400 of them and there are around 40k objects in the system.

At TEC I found out that no one else seems to have created quite so many Sets so here are my observations.

The FIMService database grew 10-20%

I wish I could give a more accurate figure but I wasn’t looking out for a big growth in the DB. I noticed it had jumped in size the next time I replicated the DB to my test server and there hadn’t been any other big increase in object numbers.

The FIM MA really slowed down

I was doing around 3500 exports through the FIM MA per hour. Everyone knows the FIM MA is not the fastest, but it slowed down so much it started running into the next hour’s sync cycle.

Discussing this with David Lundell made me understand that each change to an object in the FIM Portal causes a re-evaluation of all possible Sets, to see whether the object has now fallen into (or out of) the scope of any of them. No wonder it slowed down!
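To get a feel for why that hurts, here's a toy model of the behaviour (my own illustration, not FIM's actual engine; the Set names and attributes are made up):

```python
# Toy model of criteria-based Set re-evaluation (illustrative only).
# Each attribute change on an object forces the object to be checked
# against every Set's filter, so the total work grows as
# (changes per hour) x (number of Sets).

def evaluate_sets(obj, set_filters):
    """Return the names of the Sets whose filter the object matches."""
    return [name for name, predicate in set_filters.items() if predicate(obj)]

set_filters = {
    "Paris users": lambda o: o.get("city") == "Paris",
    "BPOS Active": lambda o: o.get("bposStatus") == "Active",
    "Has forwarding address": lambda o: bool(o.get("forwardingAddress")),
}

user = {"city": "Paris", "bposStatus": "Active", "forwardingAddress": ""}

# One change to the object => the whole filter list is walked again.
user["forwardingAddress"] = "user@contoso.com"
matches = evaluate_sets(user, set_filters)

# At 400 Sets and 3500 changes per hour that is 1.4 million filter
# evaluations per hour, which is why the FIM MA exports slowed down.
```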

As it happens the vast majority of those hourly changes were “Last Contact Time” attributes, which it has now been agreed only need updating daily. So I’m back to a low number of hourly exports and the problem, while not solved, is at least side-stepped.

But there are advantages

The reasons I have all these Sets are as follows:

  1. I want to be able to delegate very specifically which Administrators can modify which Users, and
  2. I want to target my MPRs to exact scenarios.

So for example I end up with a lot of Sets named things like “Paris BPOS Active with forwarding address”.
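For reference, a criteria-based Set like that is defined by an XPath filter over the Portal schema. A sketch of what such a filter might look like (simplified, and `BPOSStatus`/`ForwardingAddress` are invented attribute names, not my real schema):

```xml
<!-- Simplified: the real Filter element carries additional namespace
     declarations. starts-with(Attr, '%') is the usual FIM idiom for
     "this attribute has a value". -->
<Filter Dialect="http://schemas.microsoft.com/2006/11/XPathFilterDialect">
  /Person[City = 'Paris' and BPOSStatus = 'Active'
          and starts-with(ForwardingAddress, '%')]
</Filter>
```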

And in fact this part of it is working great. IT Admins only have access to change things under the right circumstances, for the right users, and triggering the right Workflows.

Why don’t you use “Relative to Resource”?

At TEC I was asked why I didn’t cut down the number of Sets by using “Relative to Resource”. While I’d still need a “BPOS Active with forwarding address” Set, perhaps this would reduce the need for a separate version of the Set for each delegation zone.

Unfortunately I don’t see how it would be possible. Users may be administered by multiple IT admins – including their own local support person/team, a backup support person/team from another site, and then the global administrators. All these possible people would have to be stamped on each user object and then kept up to date somehow. I honestly think that would be much harder to work with than using Sets, and would probably lead to performance problems of its own.

And how do I keep all these Sets up to date?

See my earlier post: A script to create Sets and MPRs from Templates.

8 Replies to “Performance implications of a lot of Sets”

  1. Thanks for posting this. Indeed, after your presentation I was very curious to see how this was impacting performance.

    Just to clarify: it is only criteria-based Sets that are evaluated with every object change. So one way to preserve the desired flexibility without impacting performance would be to change those Sets to manual (which means deleting and recreating them), and then running a script on a periodic basis (hourly?) to update the membership. That way the query to update the Set only runs once an hour instead of every time an object is updated.

  2. Would this idea also work:
    Create an additional object type which contains the non-critical attributes (i.e. the attributes that are not relevant for Sets). Create a reference to that object from the Person object, so you can still reach the values.

  3. Frank,
    I had to think for a while about what you’re asking here. I think I get it now – you suggest putting some attributes in another resource type so I can update them whenever I want without triggering a Set re-evaluation? Unfortunately this won’t work. In my environment I have a lot of “devices” which are not members of any Set, except I guess “All Devices”. They were getting updated through the Sync service quite a lot, and I had to make changes there because the FIM MA got so slow. It appeared that the device objects were being re-evaluated against all the Sets every time, even though the Sets are almost all for the ‘person’ resource type.

  4. Interesting.
    For those 3500 exports, can you share the HW configuration you used? I will be doing around 20K for galsync and am wondering about the time that will be consumed for exports and imports to complete.

  5. Are you exporting to the FIM Portal? This post was specifically about creating hundreds of sets in the portal, and should not be relevant to a straight galsync scenario.

  6. There are plenty of people doing syncs on hundreds of thousands of objects. Once you get the deltas going it will be fine.
