5 Developing Identity Connectors Using Java
This chapter is a tutorial that walks through the procedures necessary to develop an identity connector using the Identity Connector Framework (ICF) and the Oracle Identity Manager metadata. It includes information about important ICF classes and interfaces, the connector bundle, the connector server, and code samples for implementing a flat file identity connector and creating Oracle Identity Manager metadata for user provisioning and reconciliation processes.
This chapter contains the following sections:
5.1 Introduction to Flat File Connector Development
To develop a flat file connector, you must first implement the Configuration interface and then implement the Connector class.
Before beginning, you must prepare input and output modules that represent all flat file connector operations. These might include some or all of the following:
-
Read the column names of the flat file and prepare metadata information.
-
Add a record to the flat file with the corresponding column values separated by the specified delimiter.
-
Delete a record from the flat file based on the UID value.
-
Perform search operations on the flat file.
This tutorial is focused on identity connector development, and therefore, these preparations are not discussed in detail.
Note:
The following supporting classes are used for file input and output handling during identity connector operations:
-
org.identityconnectors.flatfile.io.FlatFileIOFactory
-
org.identityconnectors.flatfile.io.FlatFileMetadata
-
org.identityconnectors.flatfile.io.FlatFileParser
-
org.identityconnectors.flatfile.io.FlatFileWriter
See Supporting Classes for File Input and Output Handling for the implementations of the input and output handling supporting classes.
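As an illustration of the first preparation listed above, the following is a minimal, self-contained sketch of reading the header line of a delimited flat file and preparing column metadata. The class and method names here are illustrative only and are not part of the connector source:

```java
import java.util.Arrays;
import java.util.List;
import java.util.regex.Pattern;

public class FlatFileHeaderSketch {

    /**
     * Splits the header line on the configured delimiter and returns the
     * column names in their stored order. Pattern.quote ensures that the
     * delimiter is treated literally, not as a regular expression.
     */
    public static List<String> readColumnNames(String headerLine, String delimiter) {
        return Arrays.asList(headerLine.split(Pattern.quote(delimiter)));
    }

    public static void main(String[] args) {
        // A header such as "accountid,firstname,lastname,changeNumber"
        List<String> columns =
                readColumnNames("accountid,firstname,lastname,changeNumber", ",");
        System.out.println(columns);
    }
}
```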
5.2 Developing a Flat File Connector
Developing a flat file connector involves implementing the AbstractConfiguration, PoolableConnector, and AbstractFilterTranslator classes, and creating the connector bundle JAR file.
This section describes the high-level procedure to develop a flat file connector along with code samples. It contains the following topics:
5.2.1 Overview of Developing a Flat File Connector
Developing a flat file connector involves implementing the AbstractConfiguration, PoolableConnector, and AbstractFilterTranslator classes, and creating the connector bundle JAR file.
To develop a flat file connector:
-
Implement the configuration class for the Flat File Connector by extending the org.identityconnectors.framework.spi.AbstractConfiguration base class.
See Implementation of AbstractConfiguration for a sample implementation of the configuration class.
See The org.identityconnectors.framework.spi.Configuration Interface for more information.
-
Create the connector class for the Flat File Connector by implementing the org.identityconnectors.framework.spi.Connector interface.
See Implementation of PoolableConnector for a sample implementation of the PoolableConnector class.
-
This connector supports only the ContainsAllValuesFilter operation. Implement the ContainsAllValuesFilter operation. See Implementation of AbstractFilterTranslator for a sample implementation of the AbstractFilterTranslator&lt;T&gt; class.
-
Create the connector bundle JAR. The MANIFEST.MF file must contain the following entries:
-
ConnectorBundle-FrameworkVersion
-
ConnectorBundle-Name
-
ConnectorBundle-Version
See The MANIFEST.MF File for the contents of the MANIFEST.MF file.
-
Update the connector bundle JAR as created in step 4. To do so:
-
Extract the connector bundle JAR into any desired location.
-
Create a lib directory in the directory in which you extracted the JAR.
-
Add the dependent third-party JARs into the lib directory.
-
Re-create the JAR file from the entire directory.
Note:
The MANIFEST.MF file must contain the entries listed in step 4.
5.2.2 Implementation of AbstractConfiguration
The AbstractConfiguration base class can be extended to implement the configuration class for a Flat File Connector.
The following is a sample implementation of the AbstractConfiguration class:
package org.identityconnectors.flatfile;

import java.io.File;

import org.identityconnectors.flatfile.io.FlatFileIOFactory;
import org.identityconnectors.framework.common.exceptions.ConfigurationException;
import org.identityconnectors.framework.spi.AbstractConfiguration;
import org.identityconnectors.framework.spi.ConfigurationProperty;

/**
 * Class for storing the flat file configuration
 */
public class FlatFileConfiguration extends AbstractConfiguration {

    /*
     * Storage file name
     */
    private File storeFile;

    /*
     * Delimiter used
     */
    private String textFieldDelimeter;

    /*
     * Unique attribute field name
     */
    private String uniqueAttributeName = "";

    /*
     * Change attribute field name. Should be numeric
     */
    private String changeLogAttributeName = "";

    public File getStoreFile() {
        return storeFile;
    }

    public String getTextFieldDelimeter() {
        return textFieldDelimeter;
    }

    public String getUniqueAttributeName() {
        return uniqueAttributeName;
    }

    public String getChangeLogAttributeName() {
        return changeLogAttributeName;
    }

    /**
     * Set the store file
     * @param storeFile
     */
    @ConfigurationProperty(order = 1, helpMessageKey = "USER_ACCOUNT_STORE_HELP",
            displayMessageKey = "USER_ACCOUNT_STORE_DISPLAY")
    public void setStoreFile(File storeFile) {
        this.storeFile = storeFile;
    }

    /**
     * Set the text field delimiter
     * @param textFieldDelimeter
     */
    @ConfigurationProperty(order = 2, helpMessageKey = "USER_STORE_TEXT_DELIM_HELP",
            displayMessageKey = "USER_STORE_TEXT_DELIM_DISPLAY")
    public void setTextFieldDelimeter(String textFieldDelimeter) {
        this.textFieldDelimeter = textFieldDelimeter;
    }

    /**
     * Set the field whose values will be considered as unique attributes
     * @param uniqueAttributeName
     */
    @ConfigurationProperty(order = 3, helpMessageKey = "UNIQUE_ATTR_HELP",
            displayMessageKey = "UNIQUE_ATTR_DISPLAY")
    public void setUniqueAttributeName(String uniqueAttributeName) {
        this.uniqueAttributeName = uniqueAttributeName;
    }

    /**
     * Set the field name where the change number should be stored
     * @param changeLogAttributeName
     */
    @ConfigurationProperty(order = 4, helpMessageKey = "CHANGELOG_ATTR_HELP",
            displayMessageKey = "CHANGELOG_ATTR_DISPLAY")
    public void setChangeLogAttributeName(String changeLogAttributeName) {
        this.changeLogAttributeName = changeLogAttributeName;
    }

    @Override
    public void validate() {
        // Validate that the file exists and is usable
        boolean validFile = (this.storeFile.exists()
                && this.storeFile.canRead()
                && this.storeFile.canWrite()
                && this.storeFile.isFile());
        if (!validFile)
            throw new ConfigurationException("User store file not valid");

        // Validate that there is a field matching the unique attribute field name
        // Validate that there is a field matching the change attribute field name
        FlatFileIOFactory.getInstance(this); // Initialization does the validation
    }
}
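The file checks performed by validate() can be exercised on their own with a plain java.io.File. The following is a minimal, self-contained sketch (the class name is illustrative) that mirrors what "usable store file" means in the sample above:

```java
import java.io.File;
import java.io.IOException;

public class StoreFileValidationSketch {

    /** Mirrors the file checks performed by FlatFileConfiguration.validate(). */
    public static boolean isValidStoreFile(File storeFile) {
        return storeFile.exists()
                && storeFile.canRead()
                && storeFile.canWrite()
                && storeFile.isFile();
    }

    public static void main(String[] args) throws IOException {
        // A fresh temporary file passes all four checks
        File tempStore = File.createTempFile("useraccounts", ".csv");
        tempStore.deleteOnExit();
        System.out.println(isValidStoreFile(tempStore));

        // A nonexistent path fails the exists() check
        System.out.println(isValidStoreFile(new File("/no/such/store/file.csv")));
    }
}
```

A directory also fails validation because isFile() returns false, which is why the connector cannot be pointed at a folder.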
5.2.3 Implementation of PoolableConnector
The org.identityconnectors.framework.spi.Connector interface is implemented to create the connector class for a Flat File Connector.
The following code sample implements the CreateOp, DeleteOp, SearchOp, and UpdateOp interfaces and therefore supports the create, delete, search, and update operations. The FlatFileMetadata, FlatFileParser, and FlatFileWriter classes are supporting classes. Their implementation is not shown because they do not belong to the ICF.
package org.identityconnectors.flatfile;

import java.util.HashSet;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.Set;

import org.identityconnectors.flatfile.io.FlatFileIOFactory;
import org.identityconnectors.flatfile.io.FlatFileMetadata;
import org.identityconnectors.flatfile.io.FlatFileParser;
import org.identityconnectors.flatfile.io.FlatFileWriter;
import org.identityconnectors.framework.api.operations.GetApiOp;
import org.identityconnectors.framework.common.exceptions.AlreadyExistsException;
import org.identityconnectors.framework.common.exceptions.ConnectorException;
import org.identityconnectors.framework.common.objects.Attribute;
import org.identityconnectors.framework.common.objects.AttributeInfo;
import org.identityconnectors.framework.common.objects.AttributeInfoBuilder;
import org.identityconnectors.framework.common.objects.ConnectorObject;
import org.identityconnectors.framework.common.objects.ConnectorObjectBuilder;
import org.identityconnectors.framework.common.objects.ObjectClass;
import org.identityconnectors.framework.common.objects.OperationOptions;
import org.identityconnectors.framework.common.objects.ResultsHandler;
import org.identityconnectors.framework.common.objects.Schema;
import org.identityconnectors.framework.common.objects.SchemaBuilder;
import org.identityconnectors.framework.common.objects.Uid;
import org.identityconnectors.framework.common.objects.filter.AbstractFilterTranslator;
import org.identityconnectors.framework.common.objects.filter.FilterTranslator;
import org.identityconnectors.framework.spi.Configuration;
import org.identityconnectors.framework.spi.ConnectorClass;
import org.identityconnectors.framework.spi.PoolableConnector;
import org.identityconnectors.framework.spi.operations.CreateOp;
import org.identityconnectors.framework.spi.operations.DeleteOp;
import org.identityconnectors.framework.spi.operations.SchemaOp;
import org.identityconnectors.framework.spi.operations.SearchOp;
import org.identityconnectors.framework.spi.operations.UpdateOp;

/**
 * The main connector class
 */
@ConnectorClass(configurationClass = FlatFileConfiguration.class, displayNameKey = "FlatFile")
public class FlatFileConnector implements SchemaOp, CreateOp, DeleteOp,
        UpdateOp, SearchOp<Map<String, String>>, GetApiOp, PoolableConnector {

    private FlatFileConfiguration flatFileConfig;
    private FlatFileMetadata flatFileMetadata;
    private FlatFileParser flatFileParser;
    private FlatFileWriter flatFileWriter;
    private boolean alive = false;

    @Override
    public Configuration getConfiguration() {
        return this.flatFileConfig;
    }

    @Override
    public void init(Configuration config) {
        this.flatFileConfig = (FlatFileConfiguration) config;
        FlatFileIOFactory flatFileIOFactory =
                FlatFileIOFactory.getInstance(flatFileConfig);
        this.flatFileMetadata = flatFileIOFactory.getMetadataInstance();
        this.flatFileParser = flatFileIOFactory.getFileParserInstance();
        this.flatFileWriter = flatFileIOFactory.getFileWriterInstance();
        this.alive = true;
        System.out.println("init called: Initialization done");
    }

    @Override
    public void dispose() {
        this.alive = false;
    }

    @Override
    public Schema schema() {
        SchemaBuilder flatFileSchemaBldr = new SchemaBuilder(this.getClass());
        Set<AttributeInfo> attrInfos = new HashSet<AttributeInfo>();
        for (String fieldName : flatFileMetadata.getOrderedTextFieldNames()) {
            AttributeInfoBuilder attrBuilder = new AttributeInfoBuilder();
            attrBuilder.setName(fieldName);
            attrBuilder.setCreateable(true);
            attrBuilder.setUpdateable(true);
            attrInfos.add(attrBuilder.build());
        }

        // Supported class and attributes
        flatFileSchemaBldr.defineObjectClass(
                ObjectClass.ACCOUNT.getDisplayNameKey(), attrInfos);
        System.out.println("schema called: Built the schema properly");
        return flatFileSchemaBldr.build();
    }

    @Override
    public Uid create(ObjectClass arg0, Set<Attribute> attrs,
            OperationOptions ops) {
        System.out.println("Creating user account " + attrs);
        assertUserObjectClass(arg0);
        try {
            FlatFileUserAccount accountRecord = new FlatFileUserAccount(attrs);

            // Assert that the uid is there
            assertUidPresence(accountRecord);

            // Create the user
            this.flatFileWriter.addAccount(accountRecord);

            // Return the uid
            String uniqueAttrField = this.flatFileConfig
                    .getUniqueAttributeName();
            String uniqueAttrVal = accountRecord
                    .getAttributeValue(uniqueAttrField);
            System.out.println("User " + uniqueAttrVal + " created");
            return new Uid(uniqueAttrVal);
        } catch (Exception ex) {
            // If the account exists
            if (ex.getMessage().contains("exists"))
                throw new AlreadyExistsException(ex);

            // For all other causes
            System.out.println("Error in create " + ex.getMessage());
            throw ConnectorException.wrap(ex);
        }
    }

    @Override
    public void delete(ObjectClass arg0, Uid arg1, OperationOptions arg2) {
        final String uidVal = arg1.getUidValue();
        this.flatFileWriter.deleteAccount(uidVal);
        System.out.println("Account " + uidVal + " deleted");
    }

    @Override
    public Uid update(ObjectClass arg0, Uid arg1, Set<Attribute> arg2,
            OperationOptions arg3) {
        String accountIdentifier = arg1.getUidValue();

        // Fetch the account
        FlatFileUserAccount accountToBeUpdated = this.flatFileParser
                .getAccount(accountIdentifier);

        // Update
        accountToBeUpdated.updateAttributes(arg2);
        this.flatFileWriter
                .modifyAccount(accountIdentifier, accountToBeUpdated);
        System.out.println("Account " + accountIdentifier + " updated");

        // Return the new uid
        String newAccountIdentifier = accountToBeUpdated
                .getAttributeValue(this.flatFileConfig.getUniqueAttributeName());
        return new Uid(newAccountIdentifier);
    }

    @Override
    public FilterTranslator<Map<String, String>> createFilterTranslator(
            ObjectClass arg0, OperationOptions arg1) {
        // TODO: Create a fine-grained filter translator
        // Return a dummy object as it is not applicable here.
        // All processing happens in executeQuery
        return new AbstractFilterTranslator<Map<String, String>>() {
        };
    }

    @Override
    public ConnectorObject getObject(ObjectClass arg0, Uid uid,
            OperationOptions arg2) {
        // Return the matching record
        String accountIdentifier = uid.getUidValue();
        FlatFileUserAccount userAcc = this.flatFileParser
                .getAccount(accountIdentifier);
        ConnectorObject userAccConnObject = convertToConnectorObject(userAcc);
        return userAccConnObject;
    }

    /*
     * (non-Javadoc)
     * This is the search implementation.
     * The Map passed as the query here will map to all the records with
     * matching attributes.
     *
     * The record will be filtered out if any of the matching attributes are
     * not found.
     *
     * @see
     * org.identityconnectors.framework.spi.operations.SearchOp#executeQuery
     * (org.identityconnectors.framework.common.objects.ObjectClass,
     * java.lang.Object,
     * org.identityconnectors.framework.common.objects.ResultsHandler,
     * org.identityconnectors.framework.common.objects.OperationOptions)
     */
    @Override
    public void executeQuery(ObjectClass objectClass,
            Map<String, String> matchSet, ResultsHandler resultHandler,
            OperationOptions ops) {
        System.out.println("Inside executeQuery");

        // Iterate over the records and handle them individually
        Iterator<FlatFileUserAccount> userAccountIterator = this.flatFileParser
                .getAccountIterator(matchSet);
        while (userAccountIterator.hasNext()) {
            FlatFileUserAccount userAcc = userAccountIterator.next();
            ConnectorObject userAccObject = convertToConnectorObject(userAcc);
            if (!resultHandler.handle(userAccObject)) {
                System.out.println("Not able to handle " + userAcc);
                break;
            }
        }
    }

    private void assertUserObjectClass(ObjectClass arg0) {
        if (!arg0.equals(ObjectClass.ACCOUNT))
            throw new UnsupportedOperationException(
                    "Only user account operations supported.");
    }

    private void assertUidPresence(FlatFileUserAccount accountRecord) {
        String uniqueAttrField = this.flatFileConfig.getUniqueAttributeName();
        String uniqueAttrVal = accountRecord.getAttributeValue(uniqueAttrField);
        if (uniqueAttrVal == null) {
            throw new IllegalArgumentException("Unique attribute not passed");
        }
    }

    private ConnectorObject convertToConnectorObject(FlatFileUserAccount userAcc) {
        ConnectorObjectBuilder userObjBuilder = new ConnectorObjectBuilder();

        // Add the attributes
        List<String> attributeNames = this.flatFileMetadata
                .getOrderedTextFieldNames();
        for (String attributeName : attributeNames) {
            String attributeVal = userAcc.getAttributeValue(attributeName);
            userObjBuilder.addAttribute(attributeName, attributeVal);
            if (attributeName.equals(this.flatFileConfig
                    .getUniqueAttributeName())) {
                userObjBuilder.setUid(attributeVal);
                userObjBuilder.setName(attributeVal);
            }
        }
        return userObjBuilder.build();
    }

    @Override
    public void checkAlive() {
        if (!alive)
            throw new RuntimeException("Connection not alive");
    }
}
5.2.4 Implementation of AbstractFilterTranslator
The org.identityconnectors.framework.common.objects.filter.AbstractFilterTranslator<T> class is implemented to define the filter operation.
The following is a sample implementation of org.identityconnectors.framework.common.objects.filter.AbstractFilterTranslator<T> that defines the filter operation:
package org.identityconnectors.flatfile.filteroperations;

import java.util.HashMap;
import java.util.Map;

import org.identityconnectors.framework.common.objects.Attribute;
import org.identityconnectors.framework.common.objects.filter.AbstractFilterTranslator;
import org.identityconnectors.framework.common.objects.filter.ContainsAllValuesFilter;

public class ContainsAllValuesImpl extends
        AbstractFilterTranslator<Map<String, String>> {

    @Override
    protected Map<String, String> createContainsAllValuesExpression(
            ContainsAllValuesFilter filter, boolean not) {
        Map<String, String> containsAllMap = new HashMap<String, String>();
        Attribute attr = filter.getAttribute();
        containsAllMap.put(attr.getName(), attr.getValue().get(0).toString());
        return containsAllMap;
    }
}
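The translator above reduces a ContainsAllValuesFilter to a plain Map of attribute names to expected values; a record then matches when it carries every entry of that map. The following self-contained sketch (a hypothetical helper, not part of the connector source) illustrates the matching semantics that executeQuery applies to each record:

```java
import java.util.HashMap;
import java.util.Map;

public class ContainsAllMatchSketch {

    /** Returns true if the record contains every attribute/value pair of matchSet. */
    public static boolean matches(Map<String, String> record,
            Map<String, String> matchSet) {
        for (Map.Entry<String, String> expected : matchSet.entrySet()) {
            if (!expected.getValue().equals(record.get(expected.getKey()))) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        Map<String, String> record = new HashMap<String, String>();
        record.put("accountid", "jdoe");
        record.put("firstname", "John");

        Map<String, String> matchSet = new HashMap<String, String>();
        matchSet.put("accountid", "jdoe");
        System.out.println(matches(record, matchSet)); // matches on accountid

        matchSet.put("firstname", "Jane");
        System.out.println(matches(record, matchSet)); // fails on firstname
    }
}
```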
5.2.5 The MANIFEST.MF File
The MANIFEST.MF file is used to create the connector bundle JAR file.
The following shows the contents of the MANIFEST.MF file:
Manifest-Version: 1.0
Ant-Version: Apache Ant 1.7.0
Created-By: 14.1-b02 (Sun Microsystems Inc.)
ConnectorBundle-FrameworkVersion: 1.0
ConnectorBundle-Name: org.identityconnectors.flatfile
ConnectorBundle-Version: 1.0
Build-Number: 609
Subversion-Revision: 4582
5.3 Supporting Classes for File Input and Output Handling
The supporting classes for file input and output handling are FlatFileIOFactory, FlatFileMetadata, FlatFileParser, FlatFileWriter, FlatFileLineIterator, FlatFileUserAccount, AccountConversionHandler, and Messages.properties.
This section shows the implementation of the following supporting classes for file input and output handling:
5.3.1 Implementation of the FlatFileIOFactory Supporting Class
The following code sample shows the implementation of the FlatFileIOFactory supporting class:
package org.identityconnectors.flatfile.io;

import org.identityconnectors.flatfile.FlatFileConfiguration;

public class FlatFileIOFactory {

    private FlatFileMetadata flatFileMetadata;
    private FlatFileConfiguration flatFileConfig;

    /**
     * Provides an instance of the factory
     * @param fileConfig Configuration bean for the flat file
     */
    public static FlatFileIOFactory getInstance(FlatFileConfiguration fileConfig) {
        return new FlatFileIOFactory(fileConfig);
    }

    /**
     * Made private to avoid public instantiation. Encourages use of getInstance
     * @param fileConfig
     */
    private FlatFileIOFactory(FlatFileConfiguration fileConfig) {
        this.flatFileConfig = fileConfig;
        this.flatFileMetadata = new FlatFileMetadata(flatFileConfig);
        System.out.println("Metadata set");
    }

    /**
     * Returns the metadata instance
     * @return
     */
    public FlatFileMetadata getMetadataInstance() {
        return this.flatFileMetadata;
    }

    /**
     * Returns the FlatFileParser instance
     * @return
     */
    public FlatFileParser getFileParserInstance() {
        return new FlatFileParser(this.flatFileMetadata, this.flatFileConfig);
    }

    /**
     * Returns the FlatFileWriter instance
     * @return
     */
    public FlatFileWriter getFileWriterInstance() {
        return new FlatFileWriter(this.flatFileMetadata, this.flatFileConfig);
    }
}
5.3.2 Implementation of the FlatFileMetadata Supporting Class
The following code sample shows the implementation of the FlatFileMetadata supporting class:
package org.identityconnectors.flatfile.io;

import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.StringTokenizer;

import org.identityconnectors.flatfile.FlatFileConfiguration;

/**
 * This class contains all the metadata related information, for example, the
 * ordering of columns and the number of columns.
 *
 * @author harsh
 */
public class FlatFileMetadata {

    private FlatFileConfiguration fileConfig;
    private List<String> orderedTextFieldNames;
    private String changeLogFieldName;
    private String uniqueAttributeFiledName;

    /**
     * Instantiates the class with the file configuration.
     * Made package private to encourage instantiation from the factory class
     * @param fileConfig
     */
    FlatFileMetadata(FlatFileConfiguration fileConfig) {
        /*
         * Ideally you should not take a connector-specific configuration class
         * in flat file resource classes. Change this if it has to go to
         * production. Probably make another configuration class for the flat
         * file with the same signatures.
         */
        this.fileConfig = fileConfig;
        initializeMetadata();
        validateConfigProps();
    }

    /**
     * Returns the text field names in the order of their storage
     *
     * @return
     */
    public List<String> getOrderedTextFieldNames() {
        return this.orderedTextFieldNames;
    }

    /**
     * Returns the number of columns
     */
    public int getNumberOfFields() {
        int numberOfTextFields = this.orderedTextFieldNames.size();
        return numberOfTextFields;
    }

    /**
     * Specifies whether the number of tokens differs from the standard length
     * of the metadata
     * @param countTokens
     * @return
     */
    public boolean isDifferentFromNumberOfFields(int countTokens) {
        return (getNumberOfFields() != countTokens);
    }

    /**
     * Reads the header line and sets the metadata
     */
    private void initializeMetadata() {
        // Read the file.
        File recordsStore = this.fileConfig.getStoreFile();
        try {
            BufferedReader storeFileReader = new BufferedReader(new FileReader(
                    recordsStore.getAbsolutePath()));

            // Read the header line
            String headerString = storeFileReader.readLine();

            // Tokenize the headerString
            StringTokenizer tokenizer = new StringTokenizer(headerString,
                    fileConfig.getTextFieldDelimeter());
            this.orderedTextFieldNames = new ArrayList<String>();
            while (tokenizer.hasMoreTokens()) {
                String header = tokenizer.nextToken();
                this.orderedTextFieldNames.add(header);
            }
            System.out.println("Columns read - " + this.orderedTextFieldNames);
        } catch (IOException e) {
            throw new RuntimeException("Unable to read the store file");
        }

        // Store the change log and unique attribute field names
        this.changeLogFieldName = fileConfig.getChangeLogAttributeName();
        this.uniqueAttributeFiledName = fileConfig.getUniqueAttributeName();
    }

    /**
     * Validates that the attribute names in the config props object are
     * present in the column names
     *
     * @throws RuntimeException
     *             if validation fails
     */
    private void validateConfigProps() {
        // Check if the change log field name is present
        if (!this.orderedTextFieldNames.contains(this.changeLogFieldName))
            throw new RuntimeException("Change log field name "
                    + this.changeLogFieldName + " not found in the store file");

        // Check if the unique attribute field name is present
        if (!this.orderedTextFieldNames.contains(this.uniqueAttributeFiledName))
            throw new RuntimeException("Unique attribute field name "
                    + this.uniqueAttributeFiledName
                    + " not found in the store file");
    }
}
5.3.3 Implementation of the FlatFileParser Supporting Class
The following code sample shows the implementation of the FlatFileParser supporting class:
package org.identityconnectors.flatfile.io;

import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Map;

import org.identityconnectors.flatfile.FlatFileConfiguration;
import org.identityconnectors.flatfile.FlatFileUserAccount;
import org.identityconnectors.flatfile.utils.AccountConversionHandler;

public class FlatFileParser {

    private File recordsStore;
    private FlatFileConfiguration fileConfig;
    private FlatFileMetadata metadata;
    private AccountConversionHandler accountConverter;

    /**
     * Instantiates the parser class. Made package private to encourage
     * instantiation from the factory class
     *
     * @param metadata
     * @param fileConfig
     */
    FlatFileParser(FlatFileMetadata metadata, FlatFileConfiguration fileConfig) {
        this.fileConfig = fileConfig;
        this.recordsStore = fileConfig.getStoreFile();
        this.accountConverter = new AccountConversionHandler(metadata,
                fileConfig);
        this.metadata = metadata;
    }

    /**
     * Returns all accounts in the file
     *
     * @return
     */
    public List<FlatFileUserAccount> getAllAccounts() {
        try {
            BufferedReader userRecordReader = new BufferedReader(
                    new FileReader(recordsStore.getAbsolutePath()));
            String recordStr;

            // Skip the headers
            userRecordReader.readLine();

            // Loop over the records and make a list of objects
            List<FlatFileUserAccount> allAccountRecords =
                    new ArrayList<FlatFileUserAccount>();
            while ((recordStr = userRecordReader.readLine()) != null) {
                try {
                    FlatFileUserAccount accountRecord = accountConverter
                            .convertStringRecordToAccountObj(recordStr);
                    allAccountRecords.add(accountRecord);
                } catch (RuntimeException e) {
                    System.out.println("Invalid entry " + e.getMessage());
                }
            }
            userRecordReader.close();
            return allAccountRecords;
        } catch (IOException e) {
            throw new RuntimeException("Unable to read the store file");
        }
    }

    /**
     * Gets the account matching the account identifier
     *
     * @param accountIdentifier
     * @return
     */
    public FlatFileUserAccount getAccount(String accountIdentifier) {
        /*
         * Reading all account details is not efficient or scalable; this is
         * just a sample.
         */
        // Iterate over all records and check for the matching account
        Map<String, String> matchSet = new HashMap<String, String>();
        matchSet.put(fileConfig.getUniqueAttributeName(), accountIdentifier);
        for (FlatFileUserAccount userRecord : getAllAccounts()) {
            if (userRecord.hasMatchingAttributes(matchSet))
                return userRecord;
        }

        // No match found
        return null;
    }

    /**
     * Returns all records with matching attributes. If more than one attribute
     * is passed, all the attributes are checked
     *
     * @param matchSet
     *            Checks if all provided attributes are matched
     */
    public List<FlatFileUserAccount> getAccountsByMatchedAttrs(
            Map<String, String> matchSet) {
        /*
         * Reading all account details is not efficient or scalable; this is
         * just a sample.
         */
        // Iterate over all records and check for matching accounts
        List<FlatFileUserAccount> matchingRecords =
                new ArrayList<FlatFileUserAccount>();
        for (FlatFileUserAccount userRecord : getAllAccounts()) {
            if (userRecord.hasMatchingAttributes(matchSet))
                matchingRecords.add(userRecord);
        }
        return matchingRecords;
    }

    /**
     * Returns the records that fall after the specified change number. This
     * function helps in checking the working of sync
     *
     * @param changeNumber
     *            the change number for the last search
     */
    public List<FlatFileUserAccount> getUpdatedAccounts(int changeNumber) {
        /*
         * Reading all account details is not efficient or scalable; this is
         * just a sample.
         */
        // Iterate over all records and check for matching accounts
        List<FlatFileUserAccount> matchingRecords =
                new ArrayList<FlatFileUserAccount>();
        String changeLogAttrName = fileConfig.getChangeLogAttributeName();
        for (FlatFileUserAccount userRecord : getAllAccounts()) {
            int recordChangeNumber = userRecord
                    .getChangeNumber(changeLogAttrName);
            if (recordChangeNumber >= changeNumber)
                matchingRecords.add(userRecord);
        }
        return matchingRecords;
    }

    /**
     * Returns an iterator that iterates over the records. This is provided for
     * dynamic retrieval of records
     *
     * @param matchSet
     *            Filters the records by matching the given attributes. Use null
     *            or an empty set to avoid filtering
     * @return
     */
    public Iterator<FlatFileUserAccount> getAccountIterator(
            Map<String, String> matchSet) {
        Iterator<FlatFileUserAccount> recordIterator = new FlatFileLineIterator(
                this.metadata, this.fileConfig, matchSet);
        return recordIterator;
    }

    /**
     * Gives the next change number. The logic is the maximum of the existing
     * change numbers + 1
     * @return
     */
    public int getNextChangeNumber() {
        int maximumChangeNumber = 0;
        /*
         * Reading all account details is not efficient or scalable; this is
         * just a sample.
         */
        // Iterate over all records and track the maximum change number
        String changeLogAttrName = fileConfig.getChangeLogAttributeName();
        for (FlatFileUserAccount userRecord : getAllAccounts()) {
            int changeNumber = userRecord.getChangeNumber(changeLogAttrName);
            if (changeNumber >= maximumChangeNumber) {
                maximumChangeNumber = changeNumber + 1;
            }
        }
        return maximumChangeNumber;
    }
}
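The getNextChangeNumber() logic above amounts to "maximum existing change number plus one". The following stripped-down version over plain ints (illustrative only, not part of the connector source) isolates that logic:

```java
public class NextChangeNumberSketch {

    /**
     * Returns the next change number: one more than the largest existing
     * change number, or 0 when there are no records yet.
     */
    public static int nextChangeNumber(int[] existingChangeNumbers) {
        int maximum = 0;
        for (int changeNumber : existingChangeNumbers) {
            if (changeNumber >= maximum) {
                maximum = changeNumber + 1;
            }
        }
        return maximum;
    }

    public static void main(String[] args) {
        System.out.println(nextChangeNumber(new int[] {1, 5, 3})); // 6
        System.out.println(nextChangeNumber(new int[] {}));        // 0
    }
}
```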
5.3.4 Implementation of the FlatFileWriter Supporting Class
The following code sample shows the implementation of the FlatFileWriter supporting class:
package org.identityconnectors.flatfile.io;

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;

import org.identityconnectors.flatfile.FlatFileConfiguration;
import org.identityconnectors.flatfile.FlatFileUserAccount;
import org.identityconnectors.flatfile.utils.AccountConversionHandler;

/**
 * Class for write operations on files
 *
 * @author Harsh
 */
public class FlatFileWriter {

    private File recordsStore;
    private FlatFileParser recordParser;
    private FlatFileConfiguration fileConfig;
    private AccountConversionHandler accountConverter;

    /**
     * Initializes the writer with the configuration. Made package private to
     * encourage use of the factory class for global instantiation
     *
     * @param metadata
     * @param fileConfig
     */
    FlatFileWriter(FlatFileMetadata metadata, FlatFileConfiguration fileConfig) {
        this.fileConfig = fileConfig;
        this.recordsStore = fileConfig.getStoreFile();
        recordParser = new FlatFileParser(metadata, fileConfig);
        accountConverter = new AccountConversionHandler(metadata, fileConfig);
    }

    /**
     * Appends the user record at the end of the store file
     *
     * @param accountRecord
     */
    public void addAccount(FlatFileUserAccount accountRecord) {
        try {
            BufferedWriter userRecordWriter = new BufferedWriter(
                    new FileWriter(this.recordsStore.getAbsolutePath(), true));

            // Set the latest changelog number
            int latestChangeNumber = recordParser.getNextChangeNumber();
            accountRecord.setChangeNumber(fileConfig
                    .getChangeLogAttributeName(), latestChangeNumber);

            // Validate that the same account id doesn't exist
            String accountUid = accountRecord.getAttributeValue(fileConfig
                    .getUniqueAttributeName());
            FlatFileUserAccount accountByAccountId = recordParser
                    .getAccount(accountUid);
            if (accountByAccountId != null)
                throw new RuntimeException("Account " + accountUid
                        + " already exists");

            // Write the user record in a formatted way
            String userRecordAsStr = accountConverter
                    .convertAccountObjToStringRecord(accountRecord);
            userRecordWriter.write("\n" + userRecordAsStr);

            // Close the output stream
            userRecordWriter.close();
        } catch (IOException e) { // Catch exception if any
            throw new RuntimeException("Unable to write to the store file");
        }
    }

    /**
     * Removes the entry for the respective account identifier
     *
     * @param accountIdentifier
     */
    public void deleteAccount(String accountIdentifier) {
        String blankRecord = "";
        this.modifyAccountInStore(accountIdentifier, blankRecord);
    }

    /**
     * Updates the entry with the respective account identifier
     *
     * @param accountIdentifier
     * @param updatedAccountRecord
     * @return new accountIdentifier
     */
    public String modifyAccount(String accountIdentifier,
            FlatFileUserAccount updatedAccountRecord) {
        // Frame a record string and update it back to the file
        int nextChangeNumber = recordParser.getNextChangeNumber();
        String changeNumberFieldName = fileConfig.getChangeLogAttributeName();
        updatedAccountRecord.setChangeNumber(changeNumberFieldName,
                nextChangeNumber);
        String newRecordAsStr = accountConverter
                .convertAccountObjToStringRecord(updatedAccountRecord);

        // Update the file
        this.modifyAccountInStore(accountIdentifier, newRecordAsStr);

        // Return the new UID
        String uniqueAttrFieldName = fileConfig.getUniqueAttributeName();
        String newAccountIdentifier = updatedAccountRecord
                .getAttributeValue(uniqueAttrFieldName);
        return newAccountIdentifier;
    }

    /**
     * Returns the complete flat file as a string.
     *
     * @return
     */
    private String getCompleteFlatFileAsStr() {
        try {
            BufferedReader userRecordReader = new BufferedReader(
                    new FileReader(recordsStore.getAbsolutePath()));
            String recordStr;

            // Loop over the records and build the string
            StringBuilder flatFileStr = new StringBuilder();
            while ((recordStr = userRecordReader.readLine()) != null) {
                if (!recordStr.isEmpty())
                    flatFileStr.append(recordStr + "\n");
            }
            userRecordReader.close();
            return flatFileStr.toString();
        } catch (IOException e) {
            throw new RuntimeException("Unable to read the store file");
        }
    }

    /**
     * Updates the account with the new record. This can also be used for
     * delete
     *
     * @param accountIdentifier
     * @param updatedRecord
     */
    private void modifyAccountInStore(String accountIdentifier,
            String updatedRecord) {
        try {
            // Load the complete flat file
            String completeFlatFile = this.getCompleteFlatFileAsStr();

            // Construct the string to be removed and replace it with the
            // updated record. String.replace is used (rather than replaceAll)
            // so that delimiter characters in the record are not treated as
            // regular expression metacharacters.
            FlatFileUserAccount accountToBeRemoved = recordParser
                    .getAccount(accountIdentifier);
            String updatableString = accountConverter
                    .convertAccountObjToStringRecord(accountToBeRemoved);
            String updatedFlatFile = completeFlatFile.replace(updatableString,
                    updatedRecord);

            // Rewrite the file
            BufferedWriter userRecordWriter = new BufferedWriter(
                    new FileWriter(this.recordsStore.getAbsolutePath(), false));
            userRecordWriter.write(updatedFlatFile);

            /*** debug ***/
            System.out.println("Old string " + updatableString);
            System.out.println("New string " + updatedRecord);
            System.out.println("New file - " + updatedFlatFile);
            /******/

            // Close the output stream
            userRecordWriter.close();
        } catch (IOException e) { // Catch exception if any
            throw new RuntimeException("Unable to write to the store file");
        }
    }
}
5.3.5 Implementation of the FlatFileLineIterator Supporting Class
The following code sample shows the implementation of the FlatFileLineIterator supporting class:
```java
package org.identityconnectors.flatfile.io;

import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.util.Iterator;
import java.util.Map;

import org.identityconnectors.flatfile.FlatFileConfiguration;
import org.identityconnectors.flatfile.FlatFileUserAccount;
import org.identityconnectors.flatfile.utils.AccountConversionHandler;

/**
 * Iterator class to fetch the records dynamically during search operations.
 * This is needed to prevent VM overloading when all records are stored in
 * memory
 *
 * @author admin
 */
public class FlatFileLineIterator implements Iterator<FlatFileUserAccount> {

    private File recordsStore;
    private AccountConversionHandler accountConverter;
    private FlatFileUserAccount nextRecord;
    private BufferedReader userRecordReader;
    private Map<String, String> attrConstraints;

    /**
     * Making it package private to prevent global initialization
     *
     * @param metadata
     * @param fileConfig
     * @param attributeValConstraints
     *            Iterator will apply this constraint and filter the result
     */
    FlatFileLineIterator(FlatFileMetadata metadata,
            FlatFileConfiguration fileConfig,
            Map<String, String> attributeValConstraints) {
        this.recordsStore = fileConfig.getStoreFile();
        this.accountConverter = new AccountConversionHandler(metadata,
                fileConfig);
        this.attrConstraints = attributeValConstraints;

        initializeReader();
        this.nextRecord = readNextValidRecord();
    }

    private void initializeReader() {
        try {
            userRecordReader = new BufferedReader(new FileReader(recordsStore
                    .getAbsolutePath()));

            // Skip headers
            userRecordReader.readLine();

        } catch (IOException io) {
            throw new IllegalStateException("Unable to read "
                    + recordsStore.getName());
        }
    }

    @Override
    public boolean hasNext() {
        return (nextRecord != null);
    }

    @Override
    public FlatFileUserAccount next() {
        FlatFileUserAccount currentRecord = this.nextRecord;
        this.nextRecord = readNextValidRecord();
        return currentRecord;
    }

    @Override
    public void remove() {
        // Nothing to do here
    }

    /**
     * Returns next valid record. This happens after applying the constraints.
     *
     * @return
     */
    private FlatFileUserAccount readNextValidRecord() {
        try {
            FlatFileUserAccount userAccObj = null;
            String recordStr;
            // Match the constraints or read next line
            do {
                System.out.println("Before record string");
                recordStr = getNextLine();

                // No more records ??
                if (recordStr == null)
                    return null;

                userAccObj = accountConverter
                        .convertStringRecordToAccountObj(recordStr);
            } while (!userAccObj.hasMatchingAttributes(attrConstraints));
            return userAccObj;
        } catch (Exception e) {
            System.out.println("Error reading record: " + e.getMessage());
            e.printStackTrace();
            return null;
        }
    }

    private String getNextLine() throws IOException {
        String nextLine = userRecordReader.readLine();

        // No records ??
        if (nextLine == null) {
            this.userRecordReader.close();
            return null;
        }

        if (nextLine.trim().isEmpty()) {
            return getNextLine();
        }

        return nextLine;
    }
}
```
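The prefetch pattern used by FlatFileLineIterator (read one record ahead so hasNext() can answer without consuming input, skipping the header and blank lines along the way) can be exercised with an in-memory reader. The following self-contained sketch mirrors that structure with plain strings instead of FlatFileUserAccount objects:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.io.UncheckedIOException;
import java.util.Iterator;

public class LineIteratorDemo implements Iterator<String> {

    private final BufferedReader reader;
    private String next;

    public LineIteratorDemo(BufferedReader reader) {
        this.reader = reader;
        try {
            reader.readLine();           // skip the header line
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        this.next = readNextValid();     // prefetch one record
    }

    // Returns the next non-blank line, closing the reader at end of file.
    private String readNextValid() {
        try {
            String line;
            do {
                line = reader.readLine();
                if (line == null) {      // no more records
                    reader.close();
                    return null;
                }
            } while (line.trim().isEmpty());
            return line;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    @Override
    public boolean hasNext() {
        return next != null;
    }

    @Override
    public String next() {
        String current = next;
        next = readNextValid();          // prefetch the following record
        return current;
    }

    public static void main(String[] args) {
        String data = "header\nalice,1\n\nbob,2\n";
        LineIteratorDemo it =
                new LineIteratorDemo(new BufferedReader(new StringReader(data)));
        while (it.hasNext()) {
            System.out.println(it.next());
        }
    }
}
```

Because only one record is held in memory at a time, a search over a very large flat file does not load the whole file, which is the point the class javadoc makes about VM overloading.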
5.3.6 Implementation of the FlatFileUserAccount Supporting Class
The following code sample shows the implementation of the FlatFileUserAccount supporting class:
```java
package org.identityconnectors.flatfile;

import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

import org.identityconnectors.framework.common.objects.Attribute;

/**
 * Object representing a user entity
 *
 * @author admin
 */
public class FlatFileUserAccount {

    /*
     * Mandatory attribute names
     */
    private Set<String> mandatoryAttrNames = new HashSet<String>();

    /*
     * Attributes making the account
     */
    private Map<String, String> attributes = new HashMap<String, String>();

    /**
     * Instantiates the attribute value map
     *
     * @param mandatoryAttributeNames
     *            Names of the attributes that are necessary
     * @param attributeValMap
     *            Name value map for the attributes.
     * @throws IllegalStateException
     *             If mandatory attributes are not found in attribute val map
     */
    public FlatFileUserAccount(Set<String> mandatoryAttributeNames,
            Map<String, String> attributeValMap) {
        // Check if mandatory attribute values are passed
        Set<String> attrValuesKeySet = attributeValMap.keySet();
        if (!attrValuesKeySet.containsAll(mandatoryAttributeNames))
            throw new IllegalStateException("Mandatory attributes missing");

        // Initialize
        this.mandatoryAttrNames = mandatoryAttributeNames;
        this.attributes = attributeValMap;
    }

    /**
     * Instantiates the attribute value map.
     * Considers all attributes to be mandatory
     *
     * @param attributeValMap
     */
    public FlatFileUserAccount(Map<String, String> attributeValMap) {
        this.mandatoryAttrNames = attributeValMap.keySet();
        this.attributes = attributeValMap;
    }

    /**
     * Instantiates the attribute value map
     *
     * @param attrs
     */
    public FlatFileUserAccount(Set<Attribute> attrs) {
        for (Attribute attr : attrs) {
            String attrName = attr.getName();
            // Consider first value. Multivalued not supported
            String attrVal = (String) attr.getValue().get(0);
            this.attributes.put(attrName, attrVal);
        }
    }

    /**
     * Updates the set of attributes. If new attributes are present, they are
     * added. If old attributes are present in the parameter set, their values
     * are updated.
     *
     * @param updatedAttributeValMap
     */
    public void updateAttributes(Map<String, String> updatedAttributeValMap) {
        this.attributes.putAll(updatedAttributeValMap);
    }

    /**
     * Updates the set of attributes.
     *
     * @param upatedAttributes
     */
    public void updateAttributes(Set<Attribute> upatedAttributes) {
        Map<String, String> updatedAttributeValMap = new HashMap<String, String>();
        for (Attribute attr : upatedAttributes) {
            String attrName = attr.getName();
            // Consider first value. Multivalued not supported
            String attrVal = (String) attr.getValue().get(0);
            updatedAttributeValMap.put(attrName, attrVal);
        }
        this.attributes.putAll(updatedAttributeValMap);
    }

    /**
     * Deletes the attributes with given name.
     *
     * @param attributeKeys
     *            Set of the attribute names that are needed
     * @throws UnsupportedOperationException
     *             if delete for mandatory attributes is attempted
     */
    public void deleteAttributes(Set<String> attributeKeys) {
        // Check if mandatory attributes are not there.
        for (String attrKey : attributeKeys) {
            if (this.mandatoryAttrNames.contains(attrKey))
                throw new UnsupportedOperationException(
                        "Delete for mandatory attributes not supported. Try update");
            // Not deleting here as it might result inconsistent
        }

        // Remove the attributes
        for (String attrKey : attributeKeys) {
            this.attributes.remove(attrKey);
        }
    }

    /**
     * Gets the attribute of a given name
     *
     * @param attributeName
     * @return
     * @throws IllegalArgumentException
     *             if attribute is not there for a given name
     */
    public String getAttributeValue(String attributeName) {
        return this.attributes.get(attributeName);
    }

    /**
     * Returns the current set of attributes
     *
     * @return
     */
    public Map<String, String> getAllAttributes() {
        return this.attributes;
    }

    /**
     * Returns true if all passed attributes are matching for this object
     *
     * @param attrValMap
     * @return
     */
    public boolean hasMatchingAttributes(Map<String, String> attrValMap) {
        boolean noFilterSupplied = (attrValMap == null) || (attrValMap.isEmpty());
        if (noFilterSupplied)
            // No filter. Everything matches
            return true;

        // Iterate to match attributes one by one
        Set<String> keySet = attrValMap.keySet();
        for (String attrName : keySet) {
            String objAttrVal = this.attributes.get(attrName);
            String passedValue = attrValMap.get(attrName);

            if (!objAttrVal.equals(passedValue))
                // This attribute is not same
                return false;
        }

        // All attributes are same
        return true;
    }

    /**
     * Returns the change log number
     *
     * @param changeLogAttrName
     *            attribute representing the number
     * @return
     */
    public int getChangeNumber(String changeLogAttrName) {
        String changeNumStr = this.attributes.get(changeLogAttrName);
        int changeNumber = 0;

        try {
            changeNumber = Integer.parseInt(changeNumStr);
        } catch (Exception e) {
            System.out.println("Not a valid change log number "
                    + changeLogAttrName + " :" + changeNumStr);
        }

        return changeNumber;
    }

    /**
     * Sets the given attribute with a new value
     *
     * @param attrName
     * @param attrVal
     */
    public void setAttribute(String attrName, String attrVal) {
        this.attributes.put(attrName, attrVal);
    }

    /**
     * Updates the changelog number
     *
     * @param changeLogAttrName
     * @param newChangeNumber
     */
    public void setChangeNumber(String changeLogAttrName, int newChangeNumber) {
        String changeNumberValStr = "" + newChangeNumber;
        this.attributes.put(changeLogAttrName, changeNumberValStr);
    }

    @Override
    public String toString() {
        // Just print the attributes
        return this.attributes.toString();
    }
}
```
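The filtering behavior that hasMatchingAttributes provides can be seen in isolation in the following self-contained sketch. Note one hedged improvement over the listing above: comparing filter-value-first avoids a NullPointerException when the filter names an attribute the account does not have:

```java
import java.util.HashMap;
import java.util.Map;

public class AttributeFilterDemo {

    // Mirrors FlatFileUserAccount.hasMatchingAttributes: a null or empty
    // filter matches everything; otherwise every filter entry must equal
    // the corresponding account attribute.
    public static boolean matches(Map<String, String> attributes,
            Map<String, String> filter) {
        if (filter == null || filter.isEmpty())
            return true;                       // no filter, everything matches
        for (Map.Entry<String, String> entry : filter.entrySet()) {
            // Filter-value-first comparison: safe when the attribute is absent
            if (!entry.getValue().equals(attributes.get(entry.getKey())))
                return false;
        }
        return true;
    }

    public static void main(String[] args) {
        Map<String, String> account = new HashMap<>();
        account.put("AccountID", "jdoe");
        account.put("FirstName", "John");

        Map<String, String> filter = new HashMap<>();
        filter.put("FirstName", "John");

        System.out.println(matches(account, filter));   // matching filter
        filter.put("AccountID", "other");
        System.out.println(matches(account, filter));   // mismatching filter
    }
}
```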
5.3.7 Implementation of the AccountConversionHandler Supporting Class
The following code sample shows the implementation of the AccountConversionHandler supporting class:
```java
package org.identityconnectors.flatfile.utils;

import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.StringTokenizer;

import org.identityconnectors.flatfile.FlatFileConfiguration;
import org.identityconnectors.flatfile.FlatFileUserAccount;
import org.identityconnectors.flatfile.io.FlatFileMetadata;

/**
 * Class for the utility functions
 *
 * @author Admin
 */
public class AccountConversionHandler {

    private FlatFileConfiguration fileConfig;
    private FlatFileMetadata metadata;

    /**
     * Instantiates the handler class. But needs the configuration
     *
     * @param metadata
     * @param fileConfig
     */
    public AccountConversionHandler(FlatFileMetadata metadata,
            FlatFileConfiguration fileConfig) {
        this.fileConfig = fileConfig;
        this.metadata = metadata;
    }

    /**
     * Converts string records to the user account objects.
     *
     * @param accountRecord
     * @return
     * @throws RuntimeException
     *             If string is not formatted as per accepted standards
     */
    public FlatFileUserAccount convertStringRecordToAccountObj(
            String accountRecord) {

        StringTokenizer tokenizer = new StringTokenizer(accountRecord,
                fileConfig.getTextFieldDelimeter());

        // Assert number of columns matching with number of tokens
        if (metadata.isDifferentFromNumberOfFields(tokenizer.countTokens()))
            throw new RuntimeException(
                    "Number of tokens doesn't match number of columns");

        // Get the attributes
        List<String> attrNames = metadata.getOrderedTextFieldNames();
        Map<String, String> attrValMap = new HashMap<String, String>();

        // Number of tokens are same. Same loop will work
        for (String attrName : attrNames) {
            String attrVal = "";
            if (tokenizer.hasMoreTokens())
                attrVal = tokenizer.nextToken();

            attrValMap.put(attrName, attrVal);
        }

        // Assumption : All attributes are mandatory for user.
        // Change with the change in assumption
        Set<String> mandatoryAttributeNames = attrValMap.keySet();
        FlatFileUserAccount userAccountRecordObj = new FlatFileUserAccount(
                mandatoryAttributeNames, attrValMap);
        return userAccountRecordObj;
    }

    /**
     * Converts account objects to storable string records
     *
     * @param accountObj
     * @return
     */
    public String convertAccountObjToStringRecord(
            FlatFileUserAccount accountObj) {
        StringBuilder strRecord = new StringBuilder();

        // Build the string record from the object
        List<String> attrNames = metadata.getOrderedTextFieldNames();
        int index = 0;
        for (String attrName : attrNames) {
            String attrVal = accountObj.getAttributeValue(attrName);
            strRecord.append(attrVal);

            // Add delimeter
            if (index < attrNames.size() - 1) {
                strRecord.append(fileConfig.getTextFieldDelimeter());
                index++;
            } else {
                // Record ended
                String newLineCharacter = "\n";
                strRecord.append(newLineCharacter);
                break;
            }
        }
        return strRecord.toString();
    }

    /**
     * Asserts if given object is not null
     *
     * @param message
     * @param obj
     */
    public void assertNotNull(String message, Object obj) {
        if (obj == null)
            throw new RuntimeException(message);
    }
}
```
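The parsing and formatting logic of AccountConversionHandler can be exercised in isolation. The following self-contained sketch mirrors the record round trip with hard-coded columns and delimiter (both invented for illustration); note that it uses String.split(delim, -1) rather than StringTokenizer, which would silently drop empty values:

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class DelimiterRoundTripDemo {

    static final String DELIM = ",";
    static final List<String> COLUMNS =
            Arrays.asList("AccountID", "FirstName", "LastName");

    // Record string -> ordered attribute map
    // (analogue of convertStringRecordToAccountObj)
    public static Map<String, String> parse(String record) {
        String[] tokens = record.split(DELIM, -1);   // -1 keeps empty fields
        if (tokens.length != COLUMNS.size())
            throw new RuntimeException(
                    "Number of tokens doesn't match number of columns");
        Map<String, String> attrs = new LinkedHashMap<>();
        for (int i = 0; i < tokens.length; i++) {
            attrs.put(COLUMNS.get(i), tokens[i]);
        }
        return attrs;
    }

    // Attribute map -> record string
    // (analogue of convertAccountObjToStringRecord)
    public static String format(Map<String, String> attrs) {
        return String.join(DELIM,
                COLUMNS.stream().map(attrs::get).toArray(String[]::new));
    }

    public static void main(String[] args) {
        Map<String, String> attrs = parse("jdoe,John,Doe");
        System.out.println(attrs);
        System.out.println(format(attrs)); // jdoe,John,Doe
    }
}
```

A record such as "jdoe,,Doe" round-trips with an empty FirstName, which the StringTokenizer-based listing above would instead reject as having too few tokens.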
5.3.8 Contents of the Messages.properties Supporting File
The following sample shows the contents of the Messages.properties supporting file:
```properties
USER_ACCOUNT_STORE_HELP=File in which user account will be stored
USER_ACCOUNT_STORE_DISPLAY=User Account File
USER_STORE_TEXT_DELIM_HELP=Text delimeter used for separating the columns
USER_STORE_TEXT_DELIM_DISPLAY=Text Field Delimeter
UNIQUE_ATTR_HELP=The name of the attribute which will act as unique identifier
UNIQUE_ATTR_DISPLAY=Unique Field
CHANGELOG_ATTR_HELP=The name of the attribute which will act as changelog
CHANGELOG_ATTR_DISPLAY=Changelog Field
```
5.4 Uploading the Identity Connector Bundle to Oracle Identity Governance Database
The identity connector bundle must be available to ICF in the Oracle Identity Governance database.
Follow the list of sections in order to integrate the ICF identity connector with Oracle Identity Manager. Some of the procedures include configuration by using the Oracle Identity Manager Design Console.
5.4.1 Registering the Connector Bundle with Oracle Identity Governance
The connector bundle must be available to the Connector Server local to Oracle Identity Manager.
Following is the procedure to accomplish this:
5.4.2 Creating Basic Identity Connector Metadata
The connector metadata configuration is needed for both provisioning and reconciliation.
The following set of procedures in this section are completed by using the Oracle Identity Manager Design Console:
5.4.2.1 Creating the IT Resource Type Definition
An IT resource type definition is the representation of a resource's connection information. The configuration parameters in the IT resource type definition should be matched with the configuration parameters of the connector bundle. The values of the parameters in the IT resource will be set in the bundle configuration.
Note:
You may include parameters that the bundle configuration does not use; they have no negative effect on bundle operations.
5.4.2.2 Creating the Resource Object
The resource object is the Oracle Identity Manager representation of a resource. The connector bundle is tied to the resource object.
5.4.2.3 Creating Lookups
Separate lookups must be defined for the different objects supported by the connector bundle. These lookups can contain provisioning- and reconciliation-related information for those objects. The Main Configuration Lookup is the root of the object-specific lookups, as it contains pointers to those lookups. The following sections contain information on how to create the lookups.
5.4.2.3.1 Creating the Main Configuration Lookup
The Configuration Lookup (as defined in Creating the IT Resource Type Definition) holds connector bundle configurations that are not counted as connection information. If a configuration parameter is not found in the IT Resource Type Definition, Oracle Identity Manager will look in the Configuration Lookup. The main Configuration Lookup contains bundle properties and bundle configurations. Bundle Property parameters are mandatory as they are needed for identifying the correct bundle. Bundle configurations that are not defined as part of the IT resource type definition (discussed in Creating the IT Resource Type Definition) can be declared here.
Note:
The values for Code Key should match exactly as illustrated. The values for Decode are specific to the connector bundle.
To create the main configuration lookup:
5.4.2.3.2 Creating Object Type Configuration Lookup
An object type configuration lookup contains the parameters specific to a particular object type. An object type is an entity over which an identity connector operates; it is mapped to an ICF ObjectClass. In Creating the Main Configuration Lookup, the User Configuration Lookup is referenced, so User is the object type, mapped in this case to ObjectClass.ACCOUNT. (Roles and UserJobData are two other object types.) The object type name must match an ObjectClass name supported by the identity connector bundle. The User object type is mapped to the predefined ObjectClass.ACCOUNT, and the Group object type is mapped to the predefined ObjectClass.GROUP. If the identity connector supports multiple objects, repeat this step for each of them.
Note:
Because these use cases cover only the basic functionality, the configuration is kept to the mandatory attribute.
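The User-to-ObjectClass.ACCOUNT and Group-to-ObjectClass.GROUP mappings described above rely on ICF's predefined object class names, __ACCOUNT__ and __GROUP__; any other object type name is used as-is (for example, new ObjectClass("Roles")). The following self-contained sketch illustrates the naming convention with a helper method invented for illustration:

```java
import java.util.HashMap;
import java.util.Map;

public class ObjectTypeMappingDemo {

    // ICF's predefined object classes use special names: ObjectClass.ACCOUNT
    // is "__ACCOUNT__" and ObjectClass.GROUP is "__GROUP__". Other object
    // type names map to an ObjectClass of the same name.
    public static String toObjectClassName(String objectType) {
        Map<String, String> predefined = new HashMap<>();
        predefined.put("User", "__ACCOUNT__");
        predefined.put("Group", "__GROUP__");
        return predefined.getOrDefault(objectType, objectType);
    }

    public static void main(String[] args) {
        System.out.println(toObjectClassName("User"));   // __ACCOUNT__
        System.out.println(toObjectClassName("Roles"));  // Roles
    }
}
```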
To create the object type configuration lookup:
5.4.3 Creating Provisioning Metadata
To configure Oracle Identity Manager for flat file provisioning, you create the provisioning metadata, which involves creating a process form, adapters, a process definition, and a provisioning attribute mapping lookup.
The following sections should be followed in order to configure Oracle Identity Manager for flat file provisioning.
5.4.3.1 Creating a Process Form
A process form is used as the representation of object attributes on Oracle Identity Manager.
This section describes process forms and how to create them. It contains the following topics:
5.4.3.1.2 About Process Forms
A process form is used as the representation of object attributes in Oracle Identity Manager. It lets the user set object attribute values before they are passed to the connector bundle for an operation.
There are no restrictions on the attributes defined in the process form; the form is simply the channel for the attributes that need to be passed to the identity connector. In general, define a form field for each attribute that the identity connector supports.
Note:
It is good practice to have a one-to-one mapping of the identity connector attributes.
The form should include a field for the IT resource, associated with the respective IT resource type definition. The variable type of each field should map to the type of the corresponding object attribute.
5.4.3.1.3 Attributes in the Connector Schema
Table 5-1 lists the attributes defined in the connector schema.
Table 5-1 Form Designer Fields
| Name | Variant | Field Label | Field Type |
|---|---|---|---|
| UD_FLAT_FIL_FIRSTNAME | String | First Name | TextField |
| UD_FLAT_FIL_UID | String | Universal ID | TextField |
| UD_FLAT_FIL_CHANGENO | String | Change Number | TextField |
| UD_FLAT_FIL_MAILID | String | Email ID | TextField |
| UD_FLAT_FIL_SERVER | long | Server | ITResource |
| UD_FLAT_FIL_LASTNAME | String | Last Name | TextField |
| UD_FLAT_FIL_ACCOUNTID | String | Account ID | TextField |
| UD_FLAT_FIL_RETURN | String | Return ID | TextField |
Note:
The flat file column names are FirstName, ChangeNo, EmailID, Server, LastName, and AccountID.
5.4.3.2 Creating Adapters
An adapter has to be created for all operations supported by the connector bundle, including Create, Update, and Delete.
To create the adapter:
-
Log in to the Oracle Identity Manager Design Console.
-
Click Adapter Factory under Development Tools.
-
Create a new adapter and add FFCreateUser as the Adapter Name.
-
Add Process Task as the Adapter Type.
-
Save the adapter.
-
Click the Variable List tab and add the following variables, as shown in Figure 5-7.
-
objectType with Type String and Mapped as Resolve at runtime.
-
processInstanceKey with Type long and Mapped as Resolve at runtime.
-
itResourceFieldName with Type String and Mapped as Resolve at runtime.
Figure 5-7 Adapter Factory Variable List in Design Console
Description of "Figure 5-7 Adapter Factory Variable List in Design Console" -
-
Add a Java functional task to the adapter by following this sub procedure, as shown in Figure 5-8.
-
Click the Adapter Tasks tab.
-
Select the adapter and click Add.
-
Select Java from the task options.
-
Select icf-oim-intg.jar from the API source.
-
Select oracle.iam.connectors.icfcommon.prov.ICProvisioningManager as the API Source.
-
Select createObject as the method for the task.
-
Save the configurations.
-
Map the variables (previously added to the Variables List) against the appropriate method inputs and outputs.
-
Map the configuration parameters against the appropriate method inputs and outputs.
Database Reference maps to Database Reference (Adapter References) and Return Variable maps to Return Variable (Adapter Variables).
Figure 5-8 Adapter Factory in Design Console
Description of "Figure 5-8 Adapter Factory in Design Console" -
-
Save and build the adapter.
5.4.3.3 Creating a Process Definition
Process Definition defines the behavior of the connector bundle for a particular operation. Every operation has a corresponding task associated with it.
The following procedure will configure the process definition and integration of the process task for the Create operation:
5.4.3.4 Creating a Provisioning Attribute Mapping Lookup
Provisioning Attribute Mapping Lookup contains mappings of Oracle Identity Manager fields to identity connector bundle attributes.
This section describes the following topics about the Provisioning Attribute Mapping Lookup:
5.4.3.4.1 About Provisioning Attribute Mapping Lookup
Provisioning Attribute Mapping Lookup contains mappings of Oracle Identity Manager fields to identity connector bundle attributes. In the Provisioning Attribute Mapping Lookup:
-
Code keys are Field Labels of the process form.
-
Decodes are identity connector bundle attributes.
-
Child form attributes can be configured as embedded objects in inputs.
-
The identity connector's provisioning operation returns the UID in response. This can be set in a form field by coding it against the identity connector bundle attribute.
5.4.3.4.2 Creating a Provisioning Attribute Mapping Lookup
To create a Provisioning Attribute Mapping Lookup:
5.4.3.4.3 Field Flags Used in the Provisioning Attributes Map
For provisioning attributes mapping, the following field flags can be appended to the code key:
-
LOOKUP: This must be specified for all fields whose values are obtained by running a lookup reconciliation job. The values obtained from lookup reconciliation job have IT Resource Name/Key appended to it. Specifying this flag helps ICF integration to remove the appended value just before passing them onto the bundle. For example, the code key for a field with label Database whose value is obtained by running a lookup reconciliation job looks similar to Database[LOOKUP].
Note:
The LOOKUP flag can be specified for both Provisioning and Reconciliation Attribute Map. For provisioning, IT Resource Name/IT Resource Key prefix must be removed. For reconciliation, IT Resource Name/IT Resource Key prefix must be added.
-
IGNORE: This must be specified for all fields whose values are to be ignored and not sent to bundle. For example, the code key for a field with label Database whose value need not be sent to bundle looks similar to Database[IGNORE].
-
WRITEBACK: This must be specified for all fields whose values need to be written back into the process form right after the create or update operation. Adding this flag makes the ICF integration layer call ICF Get API to get values of attributes marked with the WRITEBACK flag. For example, the code key for a field with label Database whose value needs to be written back to the process form right after create/update looks similar to Database[WRITEBACK]. For this to work, the connector must implement the GetApiOp interface and provide an implementation for the ConnectorObject getObject(ObjectClass objClass,Uid uid,OperationOptions options) API. This API searches the target for the account whose Uid is equal to the passed in Uid, and builds a connector object containing all the attributes (and their values) that are to be written back to process form.
Note:
If the connector does not implement the GetApiOp interface, then the WRITEBACK flag does not work and an error is generated.
-
DATE: This must be specified for fields whose type need to be considered as Date, without which the values are considered as normal strings. For example, the code key for a field with label Today whose value needs to be displayed in the date format looks similar to Today[DATE].
-
PROVIDEONPSWDCHANGE: This must be specified for all fields that need to be provided to the bundle(target) when a password update happens. Some targets expect additional attributes to be specified on every password change. Specifying the PROVIDEONPSWDCHANGE flag, tells ICF integration to send all the extra fields or attributes whenever a password change is requested. For example, the code key for a field with label Extra Attribute Needed for Password Change whose value needs to be provided to bundle(target) while password update looks similar to Extra Attribute Needed for Password Change[PROVIDEONPSWDCHANGE].
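The flag convention above appends the flag in square brackets to the code key, for example Database[LOOKUP] or Today[DATE]. As an illustration only (this is not the actual ICF integration parser), the following self-contained sketch splits such a code key into its field name and flag:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class CodeKeyFlagDemo {

    // Matches "<field>[<FLAG>]"; group 1 is the field, group 2 the flag.
    private static final Pattern KEY = Pattern.compile("^(.*?)\\[(\\w+)\\]$");

    // Splits a code key such as "Database[LOOKUP]" into {field, flag};
    // a key with no flag comes back with an empty flag string.
    public static String[] split(String codeKey) {
        Matcher m = KEY.matcher(codeKey);
        if (m.matches()) {
            return new String[] { m.group(1), m.group(2) };
        }
        return new String[] { codeKey, "" };
    }

    public static void main(String[] args) {
        String[] parts = split("Database[WRITEBACK]");
        System.out.println(parts[0] + " / " + parts[1]); // Database / WRITEBACK
        parts = split("First Name");
        System.out.println(parts[0] + " / " + parts[1]); // First Name /
    }
}
```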
5.4.4 Creating Reconciliation Metadata
You can configure the reconciliation of records from the flat file. You can use the target reconciliation as an example; trusted reconciliation can also be configured in a similar fashion.
Perform the procedures in the listed order.
5.4.4.1 Creating a Reconciliation Scheduled Task
By default, reconciliation uses a Search operation on the connector bundle. This operation is invoked with a scheduled task configured using Oracle Identity Manager. This procedure is comprised of the following subprocedures:
5.4.4.2 Creating a Reconciliation Profile
A reconciliation profile defines the structure of the object attributes during reconciliation. The reconciliation profile should contain all the attributes that have reconciliation support.
To create a reconciliation profile:
5.4.4.3 Setting a Reconciliation Action Rule
A Reconciliation Action Rule defines the behavior of reconciliation. In this procedure, define the expected action when a match is found. This procedure assumes you are logged into the Oracle Identity Manager Design Console.
5.4.4.4 Creating Reconciliation Mapping
The reconciliation mapping has to be done in the process definition. This is to map the supported reconciliation fields (from resource object) to the process form fields. This mapping is needed only for configuring target reconciliation.
To create a reconciliation mapping:
5.4.4.5 Field Flags Used in the Reconciliation Attributes Map
For reconciliation attributes mapping, the following field flags can be appended to the code key:
-
TRUSTED: This must be specified in the Recon Attribute Map for the field that represents the status of the account. This flag must be specified only for trusted reconciliation. If this is specified, then the status of the account is either Active or Disabled. Otherwise, the status is either Enabled or Disabled. For example, the code key for a field with label Status whose value needs to be either Active/Disabled must look similar to Status[TRUSTED].
-
DATE: In Recon Attribute Map, this must be specified for fields whose type need to be considered as Date. For example, the code key for a field with label Today whose value needs to be displayed in the date format must look similar to Today[DATE].
5.5 Provisioning a Flat File Account
Provisioning a Flat File account involves creating an IT resource of type Flat File with IT resource and Lookup.FF.Configuration parameters.
The flat file connector is ready to work. Now, the user needs to log in to Oracle Identity Manager and create an IT resource (target) using the following procedure.
-
Create IT resource of type "Flat File".
-
Provide the IT resource parameters as appropriate.
-
Provide the configuration parameters in Lookup.FF.Configuration as appropriate.
5.7 Configuring the Java Connector Server with SSL for Oracle Identity Governance
You can configure SSL for the Java Connector Server by providing the key store credentials in the ConnectorServer.properties file.
Note:
The JAVA_HOME value should be set to a JDK 17 installation.

5.8 Configuring the Java Connector Server without SSL for Oracle Identity Governance
Note:
The JAVA_HOME value should be set to a JDK 17 installation.

5.9 Upgrading the Java Connector Server
In the 12.2.1.3.1 version of the Connector Server pack, you can select the protocol for SSL communication between Oracle Identity Manager and the Java Connector Server by using the connectorserver.protocol property. The supported values of this property are TLSv1.2 and TLSv1.3, denoting the TLS 1.2 and TLS 1.3 protocols respectively. The default value is TLSv1.2.
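Putting this together, a ConnectorServer.properties fragment might look as follows. Only connectorserver.protocol is named in this section; the remaining property names and values are typical of the Java Connector Server distribution and are shown here as assumptions to be verified against the template file shipped with your connector server:

```properties
# Hypothetical ConnectorServer.properties fragment -- verify property
# names against the template shipped with your connector server.
connectorserver.port=8759
connectorserver.key=<hashed key set with the /setKey option>
connectorserver.usessl=true
connectorserver.protocol=TLSv1.2
```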
Note:
The JAVA_HOME value should be set to a JDK 17 installation.