389-LDAP Red Hat connector
The 389-LDAP Red Hat connector is used for reading and writing user and group information to and from the customer's 389-LDAP directory. It is important to notice that the 389-LDAP connector is available only for the Efecte IGA solution; a separate agreement is needed to use it in other Efecte solutions.
The 389-LDAP connector requires configuration according to the customer's use case. In principle, the configuration has three (3) steps:
- Configure connector - enables the connector to establish a connection to the directory
- Configure scheduled task - used when data is read from the directory
- Configure event task - used when data is written to the directory
In addition, workflow orchestration nodes are required to be configured.
2024.2 and newer version instructions
General functionalities
Connectors - general functionalities
This article describes the general functionalities for managing native connectors in the solution. All Native Connectors are managed from the same connector management UI.
Notice that there is a separate description for each connector, where connector-specific functionalities and configuration instructions are described in detail.
To access connector management, the user needs admin-level permissions to the customer's Platform configuration. When access is granted correctly, the Connectors tab is visible and the user can manage connectors.

Left menu
Connector management is divided into four tabs:

- Overview - for creating and updating Native Connectors. The Admin User can see their status, type and how many scheduled or event tasks are associated with them.
- Authentication - for creating and updating authentication tasks. A provisioning task for authentication is needed so that Secure Access can define which of the customer's end users can access the Matrix42 Core, Pro and IGA login page.
- Logs - for downloading Native Connector and Secure Access logs from UI.
- Settings - general settings for Native Connectors and Secure Access, including environment type for logging and monitoring.
Connectors overview tab
From the Overview page, the user can easily and quickly see the status of all connectors.

Top bar:
- Status for Native Connectors (EPE)
  - Green text indicates that Native Connectors is online. All the needed services are up and running.
  - Red text indicates that there is a problem with Native Connectors; not all of the services are running.
- Status for Secure Access (ESA)
  - Green text indicates that Secure Access is online. All the needed services are up and running.
  - Red text indicates that there is a problem with Secure Access; not all of the services are running.
- The Native Connectors version number is displayed in the top right corner.
Top bar for list view:

- New connector - opens a new window for adding and configuring a new connector
- Remove connector(s) - workflow references are calculated and a pop-up window appears to confirm the removal (notice that calculating references can take several seconds)
- Export - Admin can export one or more connectors (and tasks) from the environment. Usually used for exporting connectors (and tasks) from test to prod. Native Connectors secret information is password protected.
- Import - Admin can import one or more connectors (and tasks) to the environment. Usually used for importing connectors (and tasks) from test to prod.
  - Admin cannot import from the old EPE UI (older than 2024.1) to the new one. Source and target environments must have the same version.
  - Import will fail if the configuration (templates, attributes) is not the same - for example, when some attribute is missing.
  - If you import something with the same connector details, it will be merged under the existing connector.
- Refresh - Admin can refresh the connectors view by clicking the button.
List view for overview

- Select connector(s) - Select one connector by clicking the check box in front of the connector row, or select all connectors by clicking the check box in the header row.
- Id - Automatically generated unique ID of the connector. Cannot be edited or changed.
- Status - indicates scheduled task status
  - Green check mark - Task was executed without any errors
  - Red cross - Task was executed, but there have been errors
  - Grey clock - Task has not been executed yet, waiting for scheduling
  - Orange - one of the tasks has a problem
  - No value - Scheduled-based task is missing
- Name - Connector name added in connector settings. Unique name of the connector holding the configuration for one data source.
- Type - indicates the target/source system
- Scheduled - shows how many scheduled tasks are configured
- Event - shows how many event tasks are configured
- Manage
  - Pen icon - opens connector settings (double-clicking the connector row also opens settings)
  - Paper icon - copies the connector
  - Stop icon - workflow references are calculated and a pop-up window appears to confirm the removal
- Search - User can search for a connector by entering the search term in the corresponding field. The Id, Status, Name, Type, Scheduled and Event fields can be searched.
Scheduled task information (click the arrow in front of the connector)
When clicking the arrow at the beginning of the connector row, all related scheduled and event tasks are shown.

Top bar for scheduled tasks
- New task - opens the configuration page for a new task
- Remove task(s) - removes the selected task(s) from the system; they cannot be recovered.
- Refresh - refreshes the scheduled-based tasks view
- Search - the user can search for a task by entering the search term in the corresponding field for Id, name, enabled, or extract/load status.
List view for scheduled tasks

- Select task(s) - Select the task(s) to be deleted by ticking the check box in the list view.
- Id - Unique ID of the task. Generated automatically and cannot be changed.
- Name - Task name added in task settings; unique name of the task.
- Enabled - Displays whether the task is scheduled or not
  - Green check mark - Task is scheduled
  - Red cross - Task is not scheduled
- Extract status - Displays the status of data extraction from the target directory/system
  - Green check mark - data was extracted successfully
  - Red cross - data was extracted with error(s) or the extraction failed
  - Clock - Task is waiting for execution
- Load status - Displays the status of the data load from the JSON file to the customer's solution
  - Green check mark - data was imported successfully to the customer's solution
  - Red cross - data was imported with error(s) or the import failed
- Manage
  - Pen icon - opens task settings in its own window (double-clicking the task row also opens task settings)
  - Paper icon - copies the task
  - Clock icon - opens the task history view
  - Stop icon - removes the task; a pop-up window opens to confirm the removal
Scheduled task history view
By clicking the clock icon in the scheduled task row, the scheduling history is shown.

Top bar for view history
- Refresh - refreshes the scheduled task status. This doesn't affect the task; it only updates the UI to show the latest information about the task run.
List view for scheduled task history
- The color of the row indicates the status
  - Green - task executed successfully
  - Red - an error happened during execution
- Execution ID - unique ID for the scheduled task row
- Extract planned time - when the next extract from the directory/application is scheduled to happen
- Extract complete time - when the extract was completed
- Extract status - status of fetching data from the directory/application
- Load start time - when the next load to the customer's solution is scheduled to happen
- Load complete time - when the load was completed
- Load status - status of loading the information to the customer's solution
List view for scheduled task status
- Actual start time - timestamp for the actual start
- Users file - JSON file containing user information read from the directory/application
- Group file - JSON file containing group information read from the directory/application
- Generic file - JSON file containing generic information read from the directory/application
- Extract info - detailed information about reading data from the directory/application
- Load info - detailed information about loading the data to the customer's Matrix42 Core, Pro and IGA solution
Edit window for scheduled task
The configuration of a scheduled task can be opened by clicking the pen icon or by double-clicking the task row.
The left menu and attributes vary according to the selected options, so more detailed instructions for editing tasks can be found in the connector description. The common functionalities for all scheduled tasks are described below.

Saving the task
If mandatory information is missing from the task, hovering the mouse over the Save button will show which attributes are still empty.
Top bar for edit scheduled task
- Run task manually - the admin can run the task manually outside of the defined scheduling
- Stop task - the admin can stop a scheduled-based task which is currently running. The task is stopped and its status changes to stopped. It waits in this state until the next scheduled time occurs.
- Clear data cache - the data cache for the next provisioning of users and groups will be cleared. This means that the next run is executed as a first-time run.
  - By default, the cache is cleared every day at 00:00 UTC
  - If you want to clear the cache at a different time, you have to specify a different value in the host file 'custom.properties'.
- The EPE cache is also cleared when EPE is restarted, when the whole environment is restarted, or when EPE mappings have been changed.
Event task information
When clicking the arrow at the beginning of the connector row, all related scheduled and event tasks are shown.

Top bar for event tasks
- New task - opens the configuration page for a new event task
- Remove task(s) - removes the selected task(s); a pop-up window appears to confirm the removal
- Refresh - refreshes the event tasks view
- Show workflow references - calculates task-related workflow relations and statuses. This is very useful if you don't know which workflows use the event-based tasks.
List view for event tasks
- Select task(s) - Select the task(s) to be deleted by ticking the check box in the list view.
- Id - Unique ID of the task. Generated automatically and cannot be changed.
- Name - Task name added in task settings; unique name of the task.
- Workflow relations
  - Question mark shows a pop-up window with detailed information about the reference
- Workflow status
  - Not used - No relations to a workflow
  - In use - Workflow(s) attached to the task; the task cannot be removed
- Manage
  - Pen icon - opens task settings in its own window
  - Paper icon - copies the task
  - Stop icon - removes the task; a pop-up window appears to confirm the removal
Edit window for event task
The configuration of an event task can be opened by clicking the pen icon or by double-clicking the task row.
The left menu and attributes vary according to the selected options, so more detailed instructions for editing tasks can be found in the connector description. The common functionalities for all event tasks are described below.

Edit event task window
- Task usage, editable? - this appears when editing an existing task; changing the task usage type will break workflows
- Mappings type, editable? - this appears when editing an existing task; changing the mappings type will break workflows
Saving the task
If mandatory information is missing from the task, hovering the mouse over the Save button will show which attributes are still empty.
Authentication tab
Authentication for the Matrix42 Core, Pro and IGA solutions is configured from the Authentication tab. Notice that only some of the connectors (directory connectors) support authentication, so it is not possible to create authentication tasks for all available connectors.

Top bar for authentication
- New connector - opens a new window for configuring a new connector (notice that not all connectors support authentication)
- Remove connector(s) - removes the selected connector(s); a pop-up window appears to confirm the removal
- Export - the user can export one or more tasks from the environment. Usually used for exporting tasks from test to prod. EPE connectors are password protected.
  - Note that the Realm for authentication tasks is not exported; you need to set it manually after importing.
- Import - the user can import one or more tasks to the environment. Usually used for importing tasks from test to prod.
- Refresh - refreshes the authentication tasks view
List view for authentication overview
- Select connector(s) - Select one connector by clicking the check box in front of the connector row, or select all connectors by clicking the check box in the header row.
- Id - Automatically generated unique ID of the connector. Cannot be edited or changed.
- Name - Connector name added in connector settings. Unique name of the connector holding the configuration for one data source.
- Type - indicates the target/source system
- Count - shows how many authentication tasks are configured
- Manage
  - Pen icon - opens authentication task settings in its own window
  - Paper icon - copies the task
  - Stop icon - removes the selected task
Authentication task information
When clicking the arrow at the beginning of the connector row, all related authentication tasks are shown.

Top bar for authentication overview
- Create new task - opens the configuration page for a new authentication task
- Remove task(s) - removes the selected task(s); a pop-up window appears to confirm the removal
- Refresh - refreshes the authentication tasks view
List view for authentication overview
- Select task(s) - Select the task(s) to be deleted by ticking the check box in the list view.
- Id - Unique ID of the task. Generated automatically and cannot be changed.
- Name - Task name added in task settings; unique name of the task.
- Manage
  - Pen icon - opens task settings in its own window (double-clicking the task row also opens the settings window)
  - Paper icon - copies the task
  - Stop icon - removes the task; a pop-up window appears to confirm the removal
Logs tab
The Logs tab is for downloading Native Connector and Secure Access logs from the UI for detailed troubleshooting.

- epe-master logs - contain warning, debug and error level messages about Native Connectors and information about how long task actions have taken.
- epe-worker-ad logs - contain the extract data status of the Active Directory connector (what Native Connector is loading to the customer's Matrix42 Core, Pro and IGA solution). If the selection is empty, the directory is not in use in this environment.
- epe-worker-azure logs - contain the extract data status of Entra ID (what Native Connector is loading to the customer's Matrix42 Core, Pro and IGA solution). If the selection is empty, the directory is not in use in this environment.
- epe-worker-ldap logs - contain the extract data status of LDAP (what Native Connector is loading to the customer's Matrix42 Core, Pro and IGA solution). If the selection is empty, the directory is not in use in this environment.
- epe-launcher logs - contain information about EPE launches.
- datapump-itsm logs - contain information about data export to the customer's Matrix42 Core, Pro and IGA solution.
- esa logs - contain information about Secure Access authentication.
Settings tab
The Settings tab is used for monitoring environments with connectors.

- Environment type - selection is mandatory; the information is used, for example, for defining alert criticality.
  - Test - select this when your environment is used as a testing environment
  - Prod - select this when your environment is used as a production environment
  - Demo - select this when your environment is used as a demo or training environment
  - Dev - select this when your environment is used as a development environment
What do we monitor?
- Failures in scheduled-based provisioning (extracting data, exporting data to ESM, outdated certificates, incorrect passwords, incorrect search base/filter, incorrect mappings, etc.)
- Failures in event-based provisioning (failure to write to AD/Azure, etc.)
- Event-based provisioning - which connectors are used for writing data towards applications/directories.
- ESA: more than ten failed login attempts for one user in the past 3 days
Data migrations
Do not click “Migrate attribute mappings” or “Migrate workflows”, if not requested to do so by Matrix42.
Configure 389-LDAP connector
This chapter describes the configuration instructions for the 389-LDAP connector so that it can connect to the customer's 389-LDAP directory.
To access connector management, the user needs permissions to the Efecte Platform configuration.
1. Open the Efecte administration area (a gear symbol).
2. Open connectors view.
3. Choose new connector

4. Select Data Source type to be 389-LDAP Red Hat

5. Fill in the information related to the customer's 389-LDAP
- Connector name - give your connector a friendly name (the name can be changed afterwards)
- 389-LDAP host - host address used when connecting to the customer's 389-LDAP
- 389-LDAP port - port number used when connecting to the customer's 389-LDAP
- 389-LDAP username - service account name used for reading and writing data to/from the customer's 389-LDAP
- 389-LDAP password - password for the service account

6. Fill in the WebAPI user information
- WebAPI user - select the correct WebAPI user, which is used when writing data from the customer's 389-LDAP to the customer's Efecte solution
- WebAPI password - password for the WebAPI user

7. Save the connector information
- Press Test connection to validate that the port and host information is set correctly
- Press Test authentication to validate that the 389-LDAP user and password (service account) information is set correctly

8. The customer's Efecte solution is now able to connect to the customer's 389-LDAP.
- The next step is to configure a scheduled task for reading data or an event task for writing data.
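The Test connection step above validates basic reachability of the configured host and port. Outside the product, a quick pre-check of the same thing can be sketched with the Python standard library (the hostname below is a placeholder, not a value from this document):

```python
import socket

def check_ldap_port(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds.

    This mirrors only the 'Test connection' check (host/port
    reachability). It does not validate the service-account bind,
    which the 'Test authentication' button performs.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (hypothetical host): standard LDAP port is 389, LDAPS is 636
# reachable = check_ldap_port("ldap.example.com", 389)
```

If this returns False, check firewalls and the host/port values before troubleshooting credentials.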
General guidance for scheduled tasks
How to Create New Scheduled Task to import data
For configuring a scheduled-based provisioning task, you will need access to the Administration / Connectors tab.
1. Open the Administration area (a cogwheel symbol).
2. Open Connectors view.
3. Choose the connector for the scheduled-based task and select New Task.
Note! If the connector has not been created, you must first choose New connector and after that New task.

4. Continue with connector specific instructions: Native Connectors
Should I use Incremental, Full or Both?
A scheduled task can be of either Incremental or Full type.
Do not import permissions with AD and LDAP incremental tasks
The incremental task has an issue with permissions importing. At the moment it is recommended not to import group memberships with an incremental scheduled task.
On the Microsoft Active Directory and OpenLDAP connectors, remove this mapping on the incremental task:

Setting on Scheduled tasks:

Incremental type is supported only for the Microsoft Active Directory, LDAP and Microsoft Graph API (formerly known as Entra ID) connectors.
Incremental type means that Native Connectors (EPE) fetches data from the source system using changed-timestamp information, so it fetches only data which has been changed or added after the previous incremental task run.
When an Incremental type task is run for the very first time, it does a full fetch (and marks the current timestamp in the EPE database). Thereafter, the task uses that timestamp to ask the data source for data that has changed since then (and EPE then updates the timestamp in the EPE database for the next task run). Clearing the task cache doesn't affect this timestamp, so an Incremental task is always incremental after the first run.
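The first-run / watermark behaviour described above can be illustrated with a small sketch. The `fetch_all` / `fetch_since` source interface is invented for illustration; the real EPE implementation is internal:

```python
from datetime import datetime, timezone

class IncrementalTask:
    """Sketch of the incremental fetch logic: the first run fetches
    everything and stores a watermark timestamp; later runs ask the
    source only for records changed since that watermark."""

    def __init__(self, source):
        self.source = source    # object with fetch_all()/fetch_since(ts)
        self.last_run = None    # persisted in the EPE database in the real product

    def run(self):
        now = datetime.now(timezone.utc)
        if self.last_run is None:
            records = self.source.fetch_all()           # first run: full fetch
        else:
            records = self.source.fetch_since(self.last_run)
        self.last_run = now     # update the watermark for the next run
        return records
```

Note that clearing the task cache does not reset `last_run`, which is why an incremental task stays incremental after its first run.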
Full type is supported for all Connectors.
Full type import fetches always all data (it's configured to fetch) from source system, on every run.
Both Full and Incremental type tasks also use the task cache in EPE, which makes certain imports faster and lighter for the M42 system.
By default, the task cache is cleared at midnight UTC. When the cache is cleared, the next import runs without using the cache to decide whether fetched data should be pushed to ESM; all fetched data is pushed to ESM. After that, subsequent task runs (until the next time the cache is cleared) use the EPE cache to determine whether fetched data needs to be pushed to ESM or not.
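The cache-based push decision can be sketched as follows: the task remembers a hash per record, and only records whose content changed since the previous run are pushed to ESM. This is an illustration of the idea only, not the actual EPE cache implementation:

```python
import hashlib
import json

def changed_records(fetched: dict[str, dict], cache: dict[str, str]) -> dict[str, dict]:
    """Return only the fetched records whose content hash differs from
    the hash remembered from earlier runs. After a cache clear the
    cache dict is empty, so everything is pushed."""
    to_push = {}
    for key, record in fetched.items():
        digest = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()
        ).hexdigest()
        if cache.get(key) != digest:
            to_push[key] = record
            cache[key] = digest   # remember for the next run
    return to_push
```

This is why the first run after a cache clear pushes everything, while later runs push only actual changes.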
You can configure at what time of day task cache is emptied, by changing global setting in EPE datapump configuration:
/opt/epe/datapump-itsm/config/custom.properties
which is by default set to: clearCacheHours24HourFormat=0
You can also clear the cache several times a day, but this needs to be considered carefully, as it has an impact on overall performance: EPE will push changes to ESM that probably are already there. Example (do not add spaces to the attribute value): clearCacheHours24HourFormat=6,12
After changing this value, reboot the EPE datapump container to take the change into use.
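As a sanity check, the expected value format (comma-separated 24-hour values, no spaces) can be validated with a small helper. This is an illustrative validator only, not part of EPE:

```python
def parse_clear_cache_hours(value: str) -> list[int]:
    """Parse a clearCacheHours24HourFormat value such as "0" or "6,12".

    Returns the hours (0-23, 24-hour clock) at which the task cache is
    cleared. Raises ValueError for spaces or out-of-range hours, since
    the setting must not contain spaces.
    """
    if value != value.strip() or " " in value:
        raise ValueError("value must not contain spaces")
    hours = [int(part) for part in value.split(",")]
    for h in hours:
        if not 0 <= h <= 23:
            raise ValueError(f"hour out of range: {h}")
    return hours
```

For example, `parse_clear_cache_hours("6,12")` accepts the documented example, while `"6, 12"` (with a space) is rejected.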
Recommendations:
Always have a Full type scheduled task by default.
If you want to fetch changes to data already fetched by the full task more frequently than you can run the full task, also add an incremental task. Usually an incremental task is not needed.
Recommended Scheduling Sequence
The recommended scheduling sequence depends on how much data is read from the customer's system/directory to the Matrix42 Core, Pro or IGA solution, and on whether the import is Incremental or Full.
Examples for scheduling,
| Total amount of users | Total amount of groups | Full load sequence | Incremental load sequence |
| < 500 | < 1000 | Every 30 minutes if partial load is not used; four (4) times a day if partial load is used | Every 10 minutes |
| < 2000 | < 2000 | Every 60 minutes if partial load is not used; four (4) times a day if partial load is used | Every 15 minutes |
| < 5000 | < 3000 | Every four (4) hours if partial load is not used; twice a day if partial load is used | Every 15 minutes |
| < 10 000 | < 5000 | Maximum imports twice a day, no matter if partial load is or is not used | Every 30 minutes |
| < 50 000 | < 7000 | Maximum import once a day, no matter if partial load is or is not used | Every 60 minutes |
| Over 50 000 | Over 7000 | There might be a need for another EPE-worker, please contact the Product Owner | Separately evaluated |
Please note that if several tasks run at the same time, you may need more EPE-workers. The tasks should be scheduled at different times according to the table above. However, if more than 6 tasks run at the same time, the number of EPE-workers should be increased. It is best practice not to schedule tasks to run at the same time, if possible.
Recommendations related to performance
If the amount of data to be imported is over 10 000, consider these things:
Adjust the log level of ESM and DATAPUMP to ERROR level, to lower the amount of logging during task runs.
Have as few as possible automations starting immediately for imported datacards (listeners, handlers, workflows), as those make ESM take longer to handle new datacards.
Set removed accounts and entitlements status removed/disabled
With this functionality, you can set the account and entitlement status to e.g. Deleted or Disabled when the account or entitlement is removed from the source system. Starting from version 2025.3 you can also set the status on generic objects (not only on accounts/identities and entitlements/groups).
For version 2025.3 and newer
In version 2025.3 these settings were moved from the properties files to the task UI. You can now also set these settings for generic objects, which was not possible before this version.
There is a separate configuration for each scheduled task, and for all mapping types. Here is an example of this configuration on a task:

For version 2025.2 and older
This functionality is available for Full type scheduled tasks.
The settings are in the datapump Docker configuration file. To change the parameter values, you need to set them in the /opt/epe/datapump-itsm/config/custom.properties file.
Configuration
To enable the disabling functionality, the datapump config should have these parameters set to true:
disable.unknown.esm.users=true
disable.unknown.esm.groups=true
Those two parameters are false by default in versions 2024.2 and 2025.1. In 2025.2 and newer versions they are true by default.
Next are these parameters:
personTemplateStatusCodeAttributeKey=accountStatus
personTemplateStatusAttributeDisabledValueKey=Deleted
groupTemplateStatusCodeAttributeKey=status
groupTemplateStatusAttributeDisabledValueKey=5 - Removed
The first two parameters should point to the DatacardHiddenState attribute in the User template, and tell which value should be sent there when the user is deleted.
By default it is accountStatus with the value 5 - Removed on the IGA Account template.
All of these need to match the attribute configuration:

The same applies to the next two parameters, but for Groups.
If you need to change those parameters in the properties file, make the changes in the Datapump container in the file /opt/epe/datapump-itsm/config/custom.properties. Those changes will then survive a container reboot and will be copied on reboot to /opt/epe/datapump-itsm/config/application.properties.
Description
Tasks save their __taskid__ (shown as the Task Id mapping in the UI) to the datacards. It is then used to determine whether a datacard was added by this task, in case there are multiple tasks with different sets of users.
This field was previously used as datasourceid, but since we moved to the model where a connector can have multiple tasks, that identifier cannot be used anymore; that is why the field was repurposed as taskid instead.
Taking users as an example: when the task runs, ESM is asked for the list of users that have this task's taskid in the Task Id mapping field and do not have the personTemplateStatusAttributeDisabledValueKey value in the personTemplateStatusCodeAttributeKey attribute.
This result is then compared to what the task fetched, and the datacards of users that were not fetched have their personTemplateStatus attribute set to the value specified in the config (5 - Removed by default).
The example log below shows the described process and informs that one user was removed.

The same applies to groups, but the groupTemplateStatus attributes are used instead.
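The comparison described above boils down to a set difference. The sketch below illustrates the idea with hypothetical account identifiers; the real product compares datacards via the Task Id mapping, not plain strings:

```python
def find_removed(esm_active_ids: set[str], fetched_ids: set[str]) -> set[str]:
    """Sketch of the removed-account detection described above:
    datacards that carry this task's taskid and are still active in
    ESM, but were not present in the latest full fetch, get their
    status attribute set to the configured 'disabled' value."""
    return esm_active_ids - fetched_ids

# Example: two accounts known to ESM for this task, one missing from the fetch
esm_side = {"jdoe", "asmith"}
ldap_side = {"jdoe"}
to_disable = find_removed(esm_side, ldap_side)  # {"asmith"} -> status "5 - Removed"
```

This also shows why the feature only works with full fetch tasks: an incremental fetch would report most accounts as "missing" even though they still exist in the source.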
Notes
- The feature works only with full fetch scheduled tasks.
- There is no support for generic templates yet; only identity and access.
- When migrating from previous versions where datasourceid was still used, the task needs to run at least once to set its taskid in the datacards first.
- EPE identifies disabled users or groups as ones that were removed from the AD; at present we do not support statuses related to the entity being active or not.
- EPE does not enable users back on its own.
- If more than one task fetches the same users or groups, the taskid in the datacard may be overwritten depending on which task ran last. It is suggested that multiple full type tasks do not fetch the same user or group.
- Always make configuration file changes to custom.properties. Do not change only application.properties, as those changes are lost on container reboot if you have not made the same changes to custom.properties.
Configure scheduled task for reading data
The 389-LDAP connector is used to read user and group information from the customer's 389-LDAP directory, and it is configured from the Efecte platform by creating a scheduled-based task.
How to create new task for Scheduled Provisioning
For configuring a scheduled-based task, you will need access to the Efecte Platform configuration console.
Note! If the connector has not been created, you must first create a new connector; after that you are able to create new tasks.
1. Open the Efecte administration area (a gear symbol).
2. Open connectors view.
3. Choose the connector for which the scheduled-based task is configured.
4. Select New task under the correct connector.

5. Define the scheduling for the task (whether and how the scheduled task should be run periodically). Choose the scheduling sequence, which depends on how much data is read to the customer's Efecte solution.

The recommended scheduling sequence depends on how much data is read from the customer's directory to the Efecte solution, and on whether the import is a partial or full load.
Example scheduling for reading groups and user accounts:
| Total amount of users | Total amount of groups | Full load sequence | Partial load sequence |
| < 500 | < 1000 | Every 30 minutes if partial load is not used; four (4) times a day if partial load is used | Every 10 minutes |
| < 2000 | < 2000 | Every 60 minutes if partial load is not used; four (4) times a day if partial load is used | Every 15 minutes |
| < 5000 | < 3000 | Every four (4) hours if partial load is not used; twice a day if partial load is used | Every 15 minutes |
| < 10 000 | < 5000 | Maximum imports twice a day, no matter if partial load is or is not used | Every 30 minutes |
| < 50 000 | < 7000 | Maximum import once a day, no matter if partial load is or is not used | Every 60 minutes |
| Over 50 000 | Over 7000 | There might be a need for another EPE-worker, please contact the Product Owner | Separately evaluated |
6. Fill in Task Details
- Fill in a unique task name for the scheduled-based task. Notice that the name cannot be changed afterwards.
- Task usage indicates whether the task is used for reading data, writing data or authentication. Note that if the event type is changed afterwards, it can break the workflows.
- Mappings type depends on what type of information is read from the directory
  - Identity and access rights - used when user account and group information is read from the directory
  - Single (identity only) - used when only user account information is read from the directory
  - Single (access right only) - used when only group information is read from the directory
  - Generic (one template) - used when generic information is read from the directory, usually other than users/groups

7. Fill in the filtering details
- Fetch data
  - Full - all information is read according to the defined filtering
  - Incremental - only changed information is delivered to the Efecte solution

Fill in 389-LDAP userBase / 389-LDAP userFilter. This filter is commonly set with the OU path from which users are read, and it also includes all sub-OUs in the import, but it enables several other filtering possibilities as well.
Examples of other commonly used user filters:
- The following example finds user objects: (&(objectCategory=person)(objectClass=user))
- The following example finds user, computer and contact objects: (objectClass=person)
- The following example finds user and contact objects: (objectCategory=person)
- The following example finds all enabled user objects: (&(objectCategory=person)

Fill in 389-LDAP groupBase / 389-LDAP groupFilter. This filter is commonly set to the OU path from which groups are read. All sub-OUs will be included in the import (example: OU=Finance,DC=adtest,DC=local). Users will be ignored if they exist directly in one of the defined <OU>s and also exist in the subtree of one of the already defined <OU>s.
Examples of other commonly used group filters:
- The following example finds group objects: (objectCategory=group)
- The following example finds group objects that have a value in cn: (&(objectCategory=group)(cn=*))
- The following example finds group objects that have a value in their description: (&(objectCategory=group)(description=*))
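To make the filter semantics concrete, here is a minimal evaluator for the simple filter shapes shown above (equality, presence `=*`, and `&` conjunction). It is a teaching sketch only; real LDAP servers implement the full RFC 4515 grammar (|, !, substrings, extensible matching):

```python
def matches(entry: dict, flt: str) -> bool:
    """Evaluate a simple LDAP filter (equality, presence '=*', and
    '&' conjunction only) against a dict of attribute values."""
    flt = flt.strip()
    assert flt.startswith("(") and flt.endswith(")")
    inner = flt[1:-1]
    if inner.startswith("&"):
        # split the conjunction into its top-level parenthesised parts
        parts, depth, start = [], 0, None
        for i, ch in enumerate(inner[1:], 1):
            if ch == "(":
                if depth == 0:
                    start = i
                depth += 1
            elif ch == ")":
                depth -= 1
                if depth == 0:
                    parts.append(inner[start:i + 1])
        return all(matches(entry, p) for p in parts)
    attr, _, value = inner.partition("=")
    if value == "*":                      # presence filter
        return attr in entry and entry[attr] not in (None, "")
    return str(entry.get(attr, "")).lower() == value.lower()
```

For instance, an entry with objectCategory=group but no description attribute matches (objectCategory=group) but not (&(objectCategory=group)(description=*)).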

8. Fill in the failure information
These are optional settings for failure handling: if a scheduled task fails, it can create a datacard in Efecte ESM that displays the error. If the failure settings are defined, the administrator does not need to manually check the status of scheduled tasks.
- Failure template - Select the template of the datacard which will be created in case of any errors during provisioning (connection to data sources, timeouts, etc.)
- Failure folder - Select the folder where the failure datacard is stored.
- Failure attribute - Select the attribute in the Failure template where the error information should be stored.

8. Fill in Identity Mappings
Users are imported to the IGA Account template, and it is mandatory to set the target folder, datasource id, and unique values, which are used for identifying users between the customer's 389-LDAP and the Efecte solution.
- Target template - Select a template to define attribute mappings.
- Target folder - Select a folder from the list of folders. The list is narrowed down to match compatibility with the selected template.
- Attribute mappings
- Efecte template attribute - the attribute in Efecte to which the directory attribute is mapped.
- Directory attribute - the attribute in the directory that is mapped to Efecte.
- It is possible to set additional attributes, which are read from user accounts in 389-LDAP, by choosing New attribute.
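Conceptually, the attribute mappings behave like a dictionary that translates directory attributes into template attributes. A small Python sketch with hypothetical mapping names (the real mappings are defined in the connector UI):

```python
# Hypothetical mapping: Efecte template attribute -> directory attribute
IDENTITY_MAPPINGS = {
    "account_name": "uid",
    "first_name": "givenName",
    "last_name": "sn",
    "email": "mail",
}

def map_directory_entry(entry):
    """Translate one directory entry into Efecte template attribute values.
    Attributes missing from the entry are simply skipped."""
    return {
        efecte_attr: entry[dir_attr]
        for efecte_attr, dir_attr in IDENTITY_MAPPINGS.items()
        if dir_attr in entry
    }

entry = {"uid": "jdoe", "givenName": "John", "sn": "Doe", "mail": "jdoe@example.com"}
print(map_directory_entry(entry))
```

Adding a "New attribute" in the UI corresponds to adding one more key/value pair to such a mapping.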

9. Fill in Access Rights Mappings
Groups are read to the IGA Entitlement template, and it is mandatory to set the target folder, datasource id, and unique values, which are used for identifying groups between the customer's 389-LDAP and the Efecte solution.
- Target template - Select a template to define attribute mappings.
- Target folder - Select a folder from the list of folders. The list is narrowed down to match compatibility with the selected template.
- Attribute mappings
- Efecte template attribute - the attribute in Efecte to which the directory attribute is mapped.
- Directory attribute - the attribute in the directory that is mapped to Efecte.
- It is possible to set additional attributes, which are read from groups in the directory, by choosing New attribute.

10. Save the provisioning task with the Save button. If any required settings are missing, the Save button is greyed out and displays what is missing.

11. You have now configured a scheduled-based connector task for 389-LDAP data read.
- You can wait until the task starts based on its schedule, or
- Run task manually - clicking the button schedules the task to start immediately. This is useful for test runs, or when you don't want to change the schedule settings but want to run the task now.
Configure event-based task for writing data
Event tasks are used when writing data towards the customer's 389-LDAP. These Efecte Provisioning Engine capabilities are available only for the Efecte IGA solution, and an IGA license is required to use them.
All configuration related to the Efecte Provisioning Engine and event-based provisioning tasks is done in the Efecte Service Management platform.
- Open the Efecte Platform configuration view (a gear symbol).
- Open the Connectors view.
- Choose the connector that will use the event task.
- Select "New task" under the correct connector.

- Fill in Task Details
- Task name - Give a name for the task; it will be displayed in the Connectors view. It is good practice to name a task in a way that describes its purpose.
- Task usage - Indicates whether the task is used for reading or writing data. It can be changed afterwards, but this is not recommended if the event task is in use, as it will break workflows.
- Mappings type - Depends on what type of information is read from the directory; the identity mappings selections are displayed based on this setting.
- Identity and access rights - used when user account and group information is read from the directory
- Single (identity only) - used when only user account information is read from the directory
- Single (access right only) - used when only group information is read from the directory
- Generic (one template) - used when generic information is read from the directory, usually data other than user or group information

6. Fill in Security Information
This information is needed if a user creation process in a workflow is attached to the task.
- Password for first login - Either a default password that is the same for all users, or a random password generated in the workflow, which is usually unique.
- Default password / type in the box below
- Default password - Type a default password that is the same for all users and matches the directory's password requirements. Note that the number of characters shown does not represent the actual length of the password.
- Random password / generate in the workflow
- Generated password
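If the random-password option is used, the workflow typically generates the value before the user is created. A minimal Python sketch of such generation (the helper name and character policy are illustrative assumptions, not part of the product):

```python
import secrets
import string

def generate_first_login_password(length=16):
    """Generate a random password guaranteed to contain a lowercase letter,
    an uppercase letter, and a digit (adjust to the directory's password policy)."""
    alphabet = string.ascii_letters + string.digits
    while True:
        password = "".join(secrets.choice(alphabet) for _ in range(length))
        if (any(c.islower() for c in password)
                and any(c.isupper() for c in password)
                and any(c.isdigit() for c in password)):
            return password

print(generate_first_login_password())
```

Using the `secrets` module rather than `random` matters here, because the value is a credential and must come from a cryptographically secure source.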

7. Fill in Filtering info
When data is written to the customer directory, filtering is needed so the connector knows to which OUs the information is written.
Fill in 389-LDAP userBase / 389-LDAP userFilter *. This filter is commonly set together with the OU path from which users are read.

Fill in 389-LDAP groupBase / 389-LDAP groupFilter *. This filter is commonly set to the OU path from which groups are read.

8. Define identity mappings
- Target template - Select a template to define attribute mappings.
- Target folder - Select a folder from the list of folders. The list is narrowed down to match compatibility with the selected template.
- Attribute mappings
- Efecte template attribute - the attribute in Efecte to which the directory attribute is mapped.
- Directory attribute - the attribute in the directory that is mapped to Efecte.
- Add new attribute - It is possible to set additional attributes, which are read from user accounts in 389-LDAP, by choosing the New attribute button.

9. Define Access Rights mappings
- Target template - Select a template to define access rights mappings.
- Target folder - Select a folder from the list of folders. The list is narrowed down to match compatibility with the selected template.
- Attribute mappings
- Efecte template attribute - the attribute in Efecte to which the directory attribute is mapped.
- Directory attribute - the attribute in the directory that is mapped to Efecte.
- Add new attribute - It is possible to set additional attributes, which are read from groups in 389-LDAP, by choosing the New attribute button.

10. Save the provisioning task with the Save button.
If any mandatory information is missing, you cannot save the task, and the Save button shows the missing information.

11. The next step is to configure the workflow to use this event-based task. From the workflow engine in the Efecte Platform, it is possible to execute provisioning activities towards the customer's directory services. This means that any of the available orchestration node activities can be run at any point of the workflow.
Workflow references are shown in the connector overview page.

2024.1 and older version instructions
Configure Scheduled-based Task
For configuring a scheduled-based provisioning task, you need access to the Efecte Platform configuration console.
1. Open the Efecte Administration area (a gear symbol).
2. Open IGA view
3. Choose Add a new task for Scheduled-based Provisioning

4. Choose 389-LDAP Red Hat from the Add a new task list
5. Fill in unique name for the provisioning task
6. Choose WebAPI user and type in password
7. Select the Failure Template of the data card that will be created in case of any errors during provisioning (connection to data sources, timeouts, etc.)

8. Choose the scheduling sequence, which depends on how much data is read into the customer's Efecte solution.

9. Fill in the Properties section, where the connection information is defined along with the filters for reading user and group information from the customer's 389-LDAP.

10. Users are imported to the IGA Account template, and it is mandatory to set the target folder, datasource id, and unique values, which are used for identifying users between the customer's 389-LDAP and the Efecte solution. It is possible to set additional attributes, which are read from user accounts in 389-LDAP, by choosing Add property.

11. Groups are read to the IGA Entitlement template, and it is mandatory to set the target folder, datasource id, and unique values, which are used for identifying groups between the customer's 389-LDAP and the Efecte solution. It is possible to set additional attributes, which are read from groups in 389-LDAP, by choosing Add property.

12. Save the provisioning task from the top bar.
13. You have now configured a scheduled-based provisioning task, and you can
- Test connection
- Test Authentication
- Run task manually

14. If the task is executed manually (Run task manually) or runs according to its schedule, the task status can be reviewed under the Extract / Load Status tab.

Configure Event-based Task
Event-based provisioning tasks are used when writing data towards 389-LDAP. These Efecte Provisioning Engine capabilities are available only for the Efecte IGA solution, and an IGA license is required to use them.
All configuration related to the Efecte Provisioning Engine and event-based provisioning tasks is done in the Efecte Service Management platform.
1. Choose 389-LDAP Red Hat from the Add a new task list
2. Select Provisioning type Event-based provisioning
3. Select the mappings type: Single (Identity only), Single (Access Rights only) or Identity & Access Rights.
With "Identity only", only the Users' mappings are available; with "Access Rights only", only the Groups' mappings are available.
4. Fill in unique name for the provisioning task

5. Fill in the Properties section, where the connection information is defined along with the filters for user and group information from the customer's 389-LDAP.
Note! It is always recommended to secure the connection between the Efecte Provisioning Engine and the directory services.

6. Define attribute mappings.
- It is possible to set additional attributes by choosing Add custom property.
- In the event-based provisioning task, it is possible to define which attribute information in the IGA solution is written to the directory.
- There can be several provisioning tasks for different purposes, for example one for creating users and one for removing users from the directory.

7. Save provisioning task from the top bar
8. You have now configured event-based provisioning task and you can
- Test connection
- Test Authentication
9. The next step is to configure the workflow to use this event-based task. From the workflow engine in the Efecte Service Management platform, it is possible to execute provisioning activities towards the customer's directory services. This means that any of the available activities can be run at any point of the workflow.
Workflow activities
Add User to Group

In the illustration above, the Person attribute configuration should point to the template where the orchestration node finds the user's data (usually IGA Account). The Role attribute needs to be configured to define where the orchestration node finds the available roles (the directory groups to which the user should be added). There might be a single attribute group or multiple attribute groups configured in the Role attribute. The list of available registered directory tasks is fetched from the EPE master.
Exception handling:
- The result of the node will be in the "Completed" state only when all of the user's group memberships are updated successfully. If, for example, the user is successfully removed from 5 out of 6 groups, the result of the node will be in the "Exception" state.
- Hence the mapping for the distinguishedName JSON field is required for both Identity and Access Right. If the mapping is not found, the orchestration node will end in an "Exception" state.
- An attempt to remove a user from a group to which they do not belong will end in a failure.
- An attempt to add a user to a group to which they already belong will end in a failure.
- Details about successfully and unsuccessfully updated group memberships can be found in the logs.
- Provisioning and group membership exceptions are optional properties on this workflow node. Admins can configure these properties to define where exceptions are written if any occur during the provisioning actions.
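The all-or-nothing result state described above can be sketched as a simple rule: the node completes only when every membership update succeeds. Illustrative Python, not product code:

```python
def node_result(update_results):
    """Mirror the node's exception handling: 'Completed' only if every
    group-membership update succeeded, otherwise 'Exception'."""
    return "Completed" if all(update_results.values()) else "Exception"

# e.g. the user was updated in 5 of 6 groups -> the whole node is in Exception
results = {
    "CN=G1,OU=Groups,DC=example,DC=local": True,
    "CN=G2,OU=Groups,DC=example,DC=local": True,
    "CN=G3,OU=Groups,DC=example,DC=local": True,
    "CN=G4,OU=Groups,DC=example,DC=local": True,
    "CN=G5,OU=Groups,DC=example,DC=local": True,
    "CN=G6,OU=Groups,DC=example,DC=local": False,
}
print(node_result(results))  # Exception
```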
Create user

In the illustration above, the Identity Attribute Mappings are populated from the provisioning tasks. Admins choose the correct directory "Target" and can view the Identity Mappings that are configured for the selected directory task. In this orchestration view, admins are not allowed to change any mappings; they are presented as a visual aid. If any changes to the mappings are needed, they must be made in the Provisioning task configuration view.
The create user orchestration node reads attributes from the data cards in question and executes an LDAP command to the 389-LDAP directory.
Create user activity notes:
- There are two ways to create a password for a new user's first login:
- Define a "Default" password in the Provisioning Task configuration view. That password will only be used by users when they log in to the system for the first time.
- Generate a random password in the workflow and select the attribute on the Identity Mapping data card to which it is written.
- It is possible to choose whether the password must be changed at the first login. Administrators can make this selection directly from the User Creation orchestration node of the workflow.
- There are different ways to provide the password for the first login to the end user. Depending on the customer's needs, it is possible to use workflow functionalities to send the password directly to the end user via email or SMS. Another option is to send the password for the first login to the manager, who provides it to the end user. The EPE orchestration node itself DOES NOT provide that functionality; it needs to be defined elsewhere.
- Provisioning exception is an optional property on this workflow node. Admins can configure this property to define where exceptions are written if any occur during the provisioning actions.
Create Group

In the illustration above, the Access Rights Attribute Mappings are populated from the provisioning tasks. Admins choose the correct 389-LDAP "Target" and can view the Access Rights Mappings that are configured for the selected 389-LDAP task. In this orchestration view, admins are not allowed to change any mappings; they are presented as a visual aid. If any changes to the mappings are needed, they must be made in the Provisioning task configuration view.
The create group orchestration node reads attributes from the data cards in question and executes an LDAP command to 389-LDAP.
Provisioning exception is an optional property on this workflow node. Admins can configure this property to define where exceptions are written if any occur during the provisioning actions.
Delete Group

In the illustration above, administrators can choose the correct 389-LDAP "Target". The delete group orchestration node reads attributes from the data cards in question and executes an LDAP command to 389-LDAP.
'Role group attribute' should contain the group's distinguishedName.
Provisioning exception is an optional property on this workflow node. Admins can configure this property to define where exceptions are written if any occur during the provisioning actions.
Delete User

In the illustration above, administrators can choose the correct 389-LDAP "Target". The delete user orchestration node reads attributes from the data cards in question and executes an LDAP command to 389-LDAP.
'Person Attribute' should contain the user's distinguishedName.
Provisioning exception is an optional property on this workflow node. Admins can configure this property to define where exceptions are written if any occur during the provisioning actions.
Remove User from Group

In the illustration above, administrators can choose the correct 389-LDAP "Target". The remove user from group orchestration node reads attributes from the data cards in question and executes an LDAP command to 389-LDAP.
Person attribute* should contain the user's distinguishedName.
Role attribute* should contain the group's distinguishedName.
Provisioning exception is an optional property on this workflow node. Admins can configure this property to define where exceptions are written if any occur during the provisioning actions.
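When distinguishedName values for these attributes are assembled from data-card fields, special characters in the RDN values must be escaped per RFC 4514, or the LDAP command will fail on names such as "Smith, John". An illustrative Python sketch (not part of the connector):

```python
def escape_rdn_value(value):
    """Escape characters that are special in an RDN attribute value (RFC 4514):
    , + " \\ < > ; plus a leading '#' or space and a trailing space."""
    specials = ',+"\\<>;'
    escaped = "".join("\\" + c if c in specials else c for c in value)
    if escaped.startswith("#") or escaped.startswith(" "):
        escaped = "\\" + escaped
    if escaped.endswith(" "):
        escaped = escaped[:-1] + "\\ "
    return escaped

# A display name containing a comma becomes a safe RDN value
print("CN=" + escape_rdn_value("Smith, John") + ",OU=Users,DC=example,DC=local")
```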
Verify Group


In the illustration above, the Access Rights Attribute Mappings are populated from the provisioning tasks. Administrators choose the correct 389-LDAP from "Target" and can view which Access Rights Mappings are configured for the selected 389-LDAP task. In this orchestration view, you are not allowed to change any mappings; they are presented only as a visual aid. If the attribute mappings need to change, those attributes must be defined in the provisioning task configuration view in order for them to be changed in the orchestration node.
Within the Access Rights Mappings panel, admins can provide an "IF" expression, which forms an LDAP query to verify whether the group exists. It is possible to select as many attributes from the data card as needed to confirm the uniqueness of a group. When the action takes place, those attributes are read from the data card in question and compared to the appropriate 389-LDAP attributes according to the "Target*" directory configuration. Admins can also choose "equal" or "not equal" to the corresponding 389-LDAP attribute by changing the "IF" expression. The "Save result*" field defines where the result of the LDAP query is saved: "true" if the group was found, "false" otherwise.
Administrators can check the Include OU subtree property on this orchestration node to verify whether the group exists in the defined Organization Unit subtree. If the administrator doesn't select this option, the orchestration node will only check the specific OU defined in the configuration.
Provisioning exception is an optional property on this workflow node. Admins can configure this property to define where exceptions are written if any occur during the provisioning actions.
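The way such an "IF" expression turns attribute comparisons into an LDAP search filter can be sketched as follows, with values escaped per RFC 4515. The helper names and the (attribute, operator, value) shape are illustrative assumptions, not product code:

```python
def escape_filter_value(value):
    """Escape characters that are special in LDAP search filters (RFC 4515)."""
    table = {"\\": "\\5c", "*": "\\2a", "(": "\\28", ")": "\\29", "\x00": "\\00"}
    return "".join(table.get(c, c) for c in value)

def build_verify_filter(conditions):
    """Build an AND filter from (attribute, operator, value) tuples,
    where operator is 'equal' or 'not equal', mirroring the IF expression."""
    clauses = []
    for attr, op, value in conditions:
        clause = "(" + attr + "=" + escape_filter_value(value) + ")"
        if op == "not equal":
            clause = "(!" + clause + ")"
        clauses.append(clause)
    return clauses[0] if len(clauses) == 1 else "(&" + "".join(clauses) + ")"

print(build_verify_filter([("cn", "equal", "Finance Admins"),
                           ("description", "not equal", "deprecated")]))
```

The query either matches an entry ("true" written to the Save result attribute) or matches nothing ("false"), which is exactly the branching information the workflow needs.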
Verify User

In the illustration above, the Identity Attribute Mappings are populated from the provisioning tasks. Administrators choose the correct 389-LDAP from "Target" and can view which Identity Mappings are configured for the selected 389-LDAP task. In this orchestration view, you are not allowed to change any mappings; they are presented only as a visual aid. If the attribute mappings need to change, those attributes must be defined in the provisioning task configuration view in order for them to be changed in the orchestration node.
Within the Identity Mappings panel, admins can provide an "IF" expression, which forms an LDAP query to verify whether the user exists. It is possible to select as many attributes from the Person data card as needed to confirm the uniqueness of a user. When the action takes place, those attributes are read from the data card in question and compared to the appropriate 389-LDAP attributes according to the "Target*" 389-LDAP configuration. Admins can also choose "equal" or "not equal" to the corresponding 389-LDAP attribute by changing the "IF" expression. The "Save result*" field defines where the result of the LDAP query is saved: "true" if the user was found, "false" otherwise.
The key point in understanding this node's mechanics is that while forming the IF expression, the admin uses the template's attributes, but the values read from them are translated (mapped) to the proper 389-LDAP attributes according to the Identity Mapping configuration and passed to 389-LDAP as a search query.
Administrators can check the Include OU subtree property on this orchestration node to verify whether the user exists in the defined Organization Unit subtree. If the administrator doesn't select this option, the orchestration node will only check the specific OU defined in the configuration.
Provisioning exception is an optional property on this workflow node. Admins can configure this property to define where exceptions are written if any occur during the provisioning actions.
Read User's data

In the illustration above, the Identity Attribute Mappings are populated from the provisioning tasks. Administrators choose the correct 389-LDAP from "Target" and can view which Identity Mappings are configured for the selected 389-LDAP task. In this orchestration view, you are not allowed to change any mappings; they are presented only as a visual aid. If the attribute mappings need to change, those attributes must be defined in the provisioning task configuration view in order for them to be changed in the orchestration node.
Provisioning exception is an optional property on this workflow node. Admins can configure this property to define where exceptions are written if any occur during the provisioning actions.
Run provisioning task

Administrators choose the correct 389-LDAP from "Target". In this orchestration view, you are not allowed to change any mappings. If the attribute mappings need to change, those attributes must be defined in the provisioning task configuration view.
Provisioning exception is an optional property on this workflow node. Admins can configure this property to define where exceptions are written if any occur during the provisioning actions.