When our interconnection of various services through midPoint, the SCIM v1 protocol and the ConnId framework was ready, we proceeded to testing. While looking around for services supporting the SCIM protocol, we stumbled upon two quite popular ones: Salesforce and Slack. Both support the SCIM 1.1 specification, but each in its own way, with a couple of behaviors we view as shortcomings.
Let’s take a look at the Slack service and go through a couple of special handling procedures that had to be implemented so we could work with the service effectively.
The issues mostly lie in schema inconsistencies. The service provides “Schemas/Users” and “Schemas/Groups” endpoints in which the schema definitions for both resources lie. Yet a handful of attributes, including mandatory ones, are not present in the definition. We fixed this by injecting a couple of attributes, inspired by the SCIM 1.1 core resource representations, into the provided schemas, and it appears to have done the trick. The injected attributes:
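The injection step can be sketched as a simple merge of hand-written attribute definitions into the schema JSON returned by the provider. This is a minimal sketch, not our actual connector code; the sample attributes and the flat dictionary layout are illustrative, not Slack’s exact output.

```python
# Inject attribute definitions (modeled on the SCIM 1.1 core schema)
# into the schema JSON returned by the provider.
# The sample attributes below are illustrative placeholders.

def inject_missing_attributes(schema, injected):
    """Add each injected attribute unless the provider already defines it."""
    present = {attr["name"] for attr in schema.get("attributes", [])}
    for attr in injected:
        if attr["name"] not in present:
            schema.setdefault("attributes", []).append(attr)
    return schema

# Schema as the provider might return it, missing a mandatory attribute.
provider_schema = {
    "name": "User",
    "attributes": [
        {"name": "userName", "type": "string", "required": True},
    ],
}

# Hand-written definition of the missing attribute, inspired by the
# SCIM 1.1 core resource representation.
core_attributes = [
    {"name": "id", "type": "string", "required": True, "readOnly": True},
]

patched = inject_missing_attributes(provider_schema, core_attributes)
```

Existing definitions are left untouched, so injection only fills the gaps and cannot override what the provider declares.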
Another issue was missing sub-attributes. Specifically, the “formatted” sub-attribute of the “name” attribute is absent from the JSON object representation returned by the service provider. This is another inconsistency with the schema representation returned by querying the schema endpoints, because the “formatted” sub-attribute is present in the schema JSON response for the “Users” resource.
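One way to tolerate the missing sub-attribute is to synthesize “formatted” from the name components the provider does return. The fallback policy below is our own sketch, not something mandated by the SCIM specification:

```python
def ensure_formatted(name):
    """Fill in the 'formatted' sub-attribute if the provider omitted it,
    by joining the components that are present."""
    if "formatted" not in name:
        parts = [name.get(p) for p in ("givenName", "familyName")]
        name["formatted"] = " ".join(p for p in parts if p)
    return name

# A 'name' object as returned by the service, without 'formatted'.
name = ensure_formatted({"givenName": "Jane", "familyName": "Doe"})
```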
As mentioned before, some attributes like “addresses”, “phoneNumbers” or “emails” are divided by their “type” sub-attribute, which specifies the kind of information they provide. An issue was found with the “emails” attribute. The “type” sub-attribute has a property which contains a couple of canonical values. In this case, as defined in the schema returned by the provider, the values are the following:
The problem is that only the type value “work” is considered valid and accepted in a query to the service. Moreover, a returned JSON object does not contain the “type” sub-attribute at all.
This may also cause some warnings in our identity management solution, because the connector generates schema attributes for each canonical value, and these attributes are considered mandatory in the provided schema. It should not limit any type of usage, though.
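In practice this means every outbound “emails” value has to be coerced to the one type the service accepts. A minimal sketch of such a normalization (the helper name and default are our illustration, not part of the connector’s API):

```python
def normalize_emails(emails, accepted_type="work"):
    """Coerce every email entry to the single 'type' value the
    service actually accepts, leaving other sub-attributes intact."""
    return [dict(entry, type=accepted_type) for entry in emails]

emails = normalize_emails([
    {"value": "jane@example.com", "type": "home"},
    {"value": "j.doe@example.com"},  # entry with no type at all
])
```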
Issues are also present with query filters. The majority of filters work only for a handful of attributes: mostly the “userName” attribute in the “Users” resource, the “displayName” attribute in the “Groups” resource, and the “id” attribute in both resources. The “EndsWith” filter does not work at all (an HTTP 500 code is returned).
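A connector can guard against these limitations by rejecting unsupported combinations before they ever reach the service. The support matrix below reflects the behavior we observed, not an official specification, and the helper itself is an illustrative sketch:

```python
# Filter attributes that worked per resource in our tests (illustrative).
SUPPORTED_ATTRIBUTES = {
    "Users": {"userName", "id"},
    "Groups": {"displayName", "id"},
}
# SCIM operator 'ew' (endsWith) returned HTTP 500 in our tests.
BROKEN_OPERATORS = {"ew"}

def build_filter(resource, attribute, operator, value):
    """Build a SCIM filter string, refusing combinations the
    service is known to reject."""
    if operator in BROKEN_OPERATORS:
        raise ValueError(f"operator {operator!r} is broken on this service")
    if attribute not in SUPPORTED_ATTRIBUTES.get(resource, set()):
        raise ValueError(f"filtering on {attribute!r} is not supported")
    return f'{attribute} {operator} "{value}"'

f = build_filter("Users", "userName", "eq", "jdoe")
```

Failing fast on the client side turns an opaque HTTP 500 into a clear, local error message.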
Some cases of error reporting tend to be a bit too general. The HTTP status code 400 (“Bad Request”) is used in the majority of error responses to a prohibited or unexpected query made by the user (e.g. a 409 “Conflict” would be a better response when updating a unique attribute to a value already present in a different account).
The second tested service was Salesforce, where a couple of workarounds also had to be implemented to patch some flaws we found.
First, the schema inconsistencies. A couple of attributes which are mandatory for account creation had the “readOnly” flag set to “true”, which prohibited assigning a value to those attributes. We had to change the schema definition which we got from the service provider to allow mutability of those attributes.
We also had to set the “multiValued” flag of some attributes to “true”, which enables multiple values for a single attribute.
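Both fixes amount to rewriting flags in the downloaded schema before it is handed to the connector. A sketch of that post-processing step, with placeholder attribute names rather than Salesforce’s actual ones:

```python
def relax_schema(schema, writable=(), multivalued=()):
    """Flip the readOnly/multiValued flags on the named attributes
    of a downloaded schema definition."""
    for attr in schema.get("attributes", []):
        if attr["name"] in writable:
            attr["readOnly"] = False
        if attr["name"] in multivalued:
            attr["multiValued"] = True
    return schema

# Placeholder schema: a mandatory attribute wrongly marked read-only.
schema = relax_schema(
    {"attributes": [
        {"name": "userName", "readOnly": True, "multiValued": False},
    ]},
    writable={"userName"},
    multivalued={"userName"},
)
```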
Issues were also found with a couple of filters. The “EndsWith” filter does not work (an HTTP 400 code is returned), nor does the “Contains all values” filter (a 500 code is returned). To comply with some requirements of our identity management solution, we had to implement a workaround for the “Contains all values” filter. In our use case the filter arrived with only one value, so we switch it to a “Contains” filter evaluating the single value present in the original filter. Considering our implementation, this is an equivalent operation.
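The workaround can be sketched as follows; the SCIM “co” (contains) operator is real, but the helper function and attribute name are our illustration:

```python
def rewrite_contains_all(attribute, values):
    """Fall back from a 'contains all values' filter to a plain SCIM
    'contains' (co) filter when exactly one value is present.
    With a single value the two filters match the same resources."""
    if len(values) != 1:
        raise ValueError("workaround only applies to single-value filters")
    return f'{attribute} co "{values[0]}"'

f = rewrite_contains_all("displayName", ["staff"])
```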
Despite some issues we encountered, the SCIM connector was, from our point of view, a success. Thank you for reading this blog, and if you have any suggestions for improving this feature or this blog, we are happy to hear them.