Generic REST API Connector
Learn how to easily connect and interact with multiple REST APIs using a customizable and versatile Generic REST API Connector.
The Generic REST API connector is part of the Provisioning Engine Native Connectors. In this first version it is used for writing data to the customer's third-party solutions that offer a REST API. It can be used with all Matrix42 Core, Pro and IGA solutions that use Connector management (EPE).
The 2025.2 release introduced this new Generic REST API connector with event-based tasks.
The 2025.3 release introduced scheduled tasks support for this Generic REST API connector.
Fair Usage Policy
The connector may be used without restriction under normal operating conditions. However, if usage results in significantly increased load requiring additional cloud resources (e.g., memory or processing capacity), additional charges may apply.
Using this connector for creating, updating, or removing user accounts or group objects for external systems requires an active IGA license.

The Generic REST API Connector requires configuration according to the customer's use case. In principle, configuration has three steps:
- Configure the connector - enables the connector to establish a connection to the solution offering the REST API
- Configure an event task - used when data is written (event-based calls from a workflow orchestration node) to the solution offering the REST API
- Configure workflow orchestration nodes to use the event-based tasks
General functionalities
Connectors - general functionalities
This article describes the general functionalities for managing native connectors in the solution. All Native Connectors are managed from the same connector management UI.
Note that there is a separate description for each connector, where connector-specific functionalities and configuration instructions are described in detail.
To access connector management, the user needs admin-level permissions to the customer's Platform configuration. When access is granted correctly, the Connectors tab is visible and the user can manage connectors.

Left menu
Connector management is divided into four tabs:

- Overview - for creating and updating Native Connectors. The admin user can see their status, type and how many scheduled or event tasks are associated with them.
- Authentication - for creating and updating authentication tasks. A provisioning task for authentication is needed so that Secure Access can define which of the customer's end users can access the Matrix42 Core, Pro and IGA login page.
- Logs - for downloading Native Connector and Secure Access logs from the UI.
- Settings - general settings for Native Connectors and Secure Access, including the environment type for logging and monitoring.
Connectors overview tab
From the overview page, the user can quickly see the status of all connectors.

Top bar:
- Status for Native Connectors (EPE)
  - Green text indicates that Native Connectors is online; all the needed services are up and running.
  - Red text indicates that there is a problem with Native Connectors; not all services are running.
- Status for Secure Access (ESA)
  - Green text indicates that Secure Access is online; all the needed services are up and running.
  - Red text indicates that there is a problem with Secure Access; not all services are running.
- The Native Connectors version number is displayed in the top right corner.
Top bar for list view:

- New connector - opens a new window for adding and configuring a new connector
- Remove connector(s) - workflow references are calculated and a pop-up window appears to confirm the removal (note that calculating references can take several seconds)
- Export - the admin can export one or more connectors (and tasks) from the environment. Usually used for exporting connectors (and tasks) from test to prod. Native Connectors secret information is password protected.
- Import - the admin can import one or more connectors (and tasks) to the environment. Usually used for importing connectors (and tasks) from test to prod.
  - The admin cannot import from the old EPE UI (older than 2024.1) to the new one. Source and target environments must have the same version.
  - Import will fail if the configuration (templates, attributes) is not the same - for example, when some attribute is missing.
  - If you import something with the same connector details, it will be merged under the existing connector.
- Refresh - the admin can refresh the connectors view by clicking the button.
List view for overview:

- Select connector(s) - select one connector by clicking the check box in front of the connector row, or select all connectors by clicking the check box in the header row
- Id - automatically generated unique ID of the connector. Cannot be edited or changed.
- Status - indicates the scheduled task status
  - Green check mark - task executed without any errors
  - Red cross - task executed, but there have been errors
  - Grey clock - task not executed yet, waiting for scheduling
  - Orange - one of the tasks has a problem
  - No value - scheduled task is missing
- Name - connector name added to the connector settings. Unique name of the connector holding the configuration for one data source.
- Type - indicates target / source system
- Scheduled - informs how many scheduled tasks are configured
- Event - informs how many event tasks are configured
- Manage
  - Pen icon - opens the connector settings (double-clicking the connector row also opens the settings)
  - Paper icon - copies the connector
  - Stop icon - removes the connector; workflow references are calculated and a pop-up window appears to confirm the removal
- Search - the user can search for a connector by entering the search term in the corresponding field. The Id, Status, Name, Type, Scheduled and Event fields can be searched.
Scheduled task information (click the arrow in front of the connector)
When clicking the arrow at the beginning of the connector row, all related scheduled and event tasks are shown.

Top bar for scheduled tasks
- New task - opens the configuration page for a new task
- Remove task(s) - removes the selected task(s) from the system; they cannot be recovered anymore
- Refresh - refreshes the scheduled tasks view
- Search - the user can search for a task by entering the search term in the corresponding field for Id, name, enabled, and extract/load status
List view for scheduled tasks

- Select task(s) - select the task(s) to be deleted from the list view by ticking the check box.
- Id - unique ID of the task. Generated automatically and cannot be changed.
- Name - task name added to the task settings; unique name of the task.
- Enabled - displays whether the task is scheduled
  - Green check mark - task is scheduled
  - Red cross - task is not scheduled
- Extract status - displays the status of data extraction from the target directory/system
  - Green check mark - data extracted successfully
  - Red cross - extraction finished with error(s) or failed
  - Clock - task is waiting for execution
- Load status - displays the status of the data export from the JSON file to the customer's solution
  - Green check mark - data imported successfully into the customer's solution
  - Red cross - import finished with error(s) or failed
- Manage
  - Pen icon - opens the task settings in its own window (double-clicking the task row also opens the task settings)
  - Paper icon - copies the task
  - Clock icon - opens the task history view
  - Stop icon - removes the task; a pop-up window opens to confirm the removal
Scheduled task history view
By clicking the clock icon in the scheduled task row, the history for scheduling is shown.

Top bar for view history
- Refresh - refreshes the scheduled task status. This doesn't affect the task; it only updates the UI to show the latest information about the task run.
List view for scheduled task history
- Color of the row indicates the status
  - Green - task executed successfully
  - Red - an error happened during execution
- Execution ID - unique ID for the scheduled task execution
- Extract planned time - when the next extract from the directory/application is scheduled to happen
- Extract complete time - when the extract was completed
- Extract status - status of fetching data from the directory/application
- Load start time - when the next load to the customer's solution is scheduled to happen
- Load complete time - when the load was completed
- Load status - status of loading the information to the customer's solution
List view for scheduled task status
- Actual start time - timestamp for the actual start
- Users file - JSON file containing user information read from the directory/application
- Group file - JSON file containing group information read from the directory/application
- Generic file - JSON file containing generic information read from the directory/application
- Extract info - detailed information about reading data from the directory/application
- Load info - detailed information about loading the data into the customer's Matrix42 Core, Pro and IGA solution
Edit window for scheduled task
The configuration for a scheduled task can be opened by clicking the pen icon or double-clicking the task row.
The left menu and attributes vary according to the selected options, so more detailed instructions for editing tasks can be found in the connector description; the common functionalities for all scheduled tasks are described below.

Saving the task
If mandatory information is missing from the task, hovering the mouse over the save button shows which attributes are still empty.
Top bar for edit scheduled task
- Run task manually - the admin can run the task manually outside of the defined scheduling
- Stop task - the admin can stop a scheduled task which is currently running. The task is stopped and its status changes to stopped. It waits in this state until the next scheduled time occurs.
- Clear data cache - the data cache for the next provisioning of users and groups will be cleared. This means that the next run is executed as a first-time run.
  - By default, the cache is cleared every day at 00:00 UTC.
  - If you want to clear the cache at a different time, specify a different value in the host file 'custom.properties'.
  - The EPE cache is also cleared when EPE is restarted, the whole environment is restarted, or the EPE mappings have been changed.
Event task information
When clicking the arrow at the beginning of the connector row, all related scheduled and event tasks are shown.

Top bar for event tasks
- New task - opens the configuration page for a new event task
- Remove task(s) - removes the selected task(s); a pop-up window appears to confirm the removal
- Refresh - refreshes the event tasks view
- Show workflow references - calculates task-related workflow relations and statuses. This is very useful if you don't know which workflows use the event-based tasks.
List view for event tasks
- Select task(s) - select the task(s) to be deleted from the list view by ticking the check box.
- Id - unique ID of the task. Generated automatically and cannot be changed.
- Name - task name added to the task settings; unique name of the task.
- Workflow relations
  - The question mark shows a pop-up window with detailed information about the reference
- Workflow status
  - Not used - no relations to a workflow
  - In use - workflow(s) attached to the task; the task cannot be removed
- Manage
  - Pen icon - opens the task settings in its own window
  - Paper icon - copies the task
  - Stop icon - removes the task; a pop-up window appears to confirm the removal
Edit window for event task
The configuration for an event task can be opened by clicking the pen icon or double-clicking the task row.
The left menu and attributes vary according to the selected options, so more detailed instructions for editing tasks can be found in the connector description; the common functionalities for all event tasks are described below.

Edit event task window
- Task usage, editable? - this appears when editing an existing task; changing the task usage type will break workflows
- Mappings type, editable? - this appears when editing an existing task; changing the mappings type will break workflows
Saving the task
If mandatory information is missing from the task, hovering the mouse over the save button shows which attributes are still empty.
Authentication tab
Authentication for Matrix42 Core, Pro and IGA solutions is configured from the Authentication tab. Note that only some of the connectors (directory connectors) support authentication, so it's not possible to create authentication tasks for all available connectors.

Top bar for authentication
- New connector - opens a new window for configuring a new connector (note that not all connectors support authentication)
- Remove connector(s) - removes the selected connector(s); a pop-up window appears to confirm the removal
- Export - the user can export one or more tasks from the environment. Usually used for exporting tasks from test to prod. EPE connectors are password protected.
  - Note that the Realm for authentication tasks is not exported; you need to set it manually after importing.
- Import - the user can import one or more tasks to the environment. Usually used for importing tasks from test to prod.
- Refresh - refreshes the authentication tasks view
List view for authentication overview
- Select connector(s) - select one connector by clicking the check box in front of the connector row, or select all connectors by clicking the check box in the header row
- Id - automatically generated unique ID of the connector. Cannot be edited or changed.
- Name - connector name added to the connector settings. Unique name of the connector holding the configuration for one data source.
- Type - indicates the target/source system
- Count - informs how many authentication tasks are configured
- Manage
  - Pen icon - opens the authentication task settings in its own window
  - Paper icon - copies the task
  - Stop icon - removes the selected task
Authentication task information
When clicking the arrow at the beginning of the connector row, all related scheduled and event tasks are shown.

Top bar for authentication overview
- Create new task - opens the configuration page for a new authentication task
- Remove task(s) - removes the selected task(s); a pop-up window appears to confirm the removal
- Refresh - refreshes the authentication tasks view
List view for authentication overview:
- Select task(s) - select the task(s) to be deleted from the list view by ticking the check box.
- Id - unique ID of the task. Generated automatically and cannot be changed.
- Name - task name added to the task settings; unique name of the task.
- Manage
  - Pen icon - opens the task settings in its own window (double-clicking the task row also opens the settings window)
  - Paper icon - copies the task
  - Stop icon - removes the task; a pop-up window appears to confirm the removal
Logs tab
The Logs tab is for downloading Native Connector and Secure Access logs from the UI for detailed troubleshooting.

- epe-master logs - contain warning, debug and error level messages about Native Connectors, and information about how long task actions have taken.
- epe-worker-ad logs - contain the extract data status of the Active Directory connector (what Native Connector is loading to the customer's Matrix42 Core, Pro and IGA solution). If the selection is empty, the directory is not in use in this environment.
- epe-worker-azure logs - contain the extract data status of Entra ID (what Native Connector is loading to the customer's Matrix42 Core, Pro and IGA solution). If the selection is empty, the directory is not in use in this environment.
- epe-worker-ldap logs - contain the extract data status of LDAP (what Native Connector is loading to the customer's Matrix42 Core, Pro and IGA solution). If the selection is empty, the directory is not in use in this environment.
- epe-launcher logs - contain information about EPE launches
- datapump-itsm logs - contain information about data export to the customer's Matrix42 Core, Pro and IGA solution.
- esa logs - contain information about Secure Access authentication.
Settings tab
The Settings tab is used for monitoring environments with connectors.

- Environment type - a mandatory selection; the information is used, for example, to define alert criticality.
  - Test - select this when your environment is used as a testing environment
  - Prod - select this when your environment is used as a production environment
  - Demo - select this when your environment is used as a demo or training environment
  - Dev - select this when your environment is used as a development environment
What is monitored?
- Failures in scheduled-based provisioning (extracting data, exporting data to ESM, outdated certificates, incorrect passwords, incorrect search base/filter, incorrect mappings, etc.)
- Failures in event-based provisioning (failing to write to AD/Azure, etc.)
- Event-based provisioning - which connectors are used for writing data towards applications/directories
- ESA: more than ten failed login attempts for one user in the past 3 days
Data migrations
Do not click “Migrate attribute mappings” or “Migrate workflows”, if not requested to do so by Matrix42.
Configure connector
This chapter describes the configuration instructions for the connector to be able to connect to a REST API.
For configuring provisioning, you will need access to the Platform configuration console.
1. Open the Administration area (a gear symbol).
2. Open Connectors view.
3. Choose "New connector"

4. Select Data Source type to be Generic REST API

5. Give the connector a name and add the connection settings:
- Connector name - give your connector a friendly name (the name can be changed afterwards)
- Host url - the base URL for the REST API. This can be used as a prefix for the final URL to be called.
- REST Connector type - Generic or Google. For Google use Google, and for all other types of REST APIs use General.

Select the correct authentication method, based on the system you are connecting to:

- Basic Auth - basic authentication using username/password as a base64-encoded authentication header
- Bearer Token - uses a constant bearer token authentication header
- Disabled - for public APIs without authentication (in the 2025.3 release and newer)
- Auth Header - for a constant secret header (in the 2025.3 release and newer)
- Dynamic Token - for the OAuth 2.0 Client Credentials flow and similar flows (in the 2025.3 release and newer)
  - Example of a Dynamic Token configuration: https://docs.efecte.com/configure-connectors/ms-ado-connector
  - When you are using Dynamic Token authentication, you also need to set your Login Content type as a header. For example, if your Login Content type is application/json, add this header to the connector:
Content-Type application/json
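To illustrate what the authentication methods do behind the scenes, here is a minimal Python sketch of the headers and token request involved. This is an illustration only, not the connector's actual code; the function names and the token URL are hypothetical, and the flow assumed is a standard OAuth 2.0 Client Credentials exchange:

```python
import base64
import urllib.parse

def build_token_request(token_url, client_id, client_secret,
                        content_type="application/x-www-form-urlencoded"):
    """Build the pieces of an OAuth 2.0 Client Credentials token request.

    A Dynamic Token connector performs an equivalent call to obtain a
    short-lived access token before calling the actual API.
    """
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    })
    # The Login Content type must match this header (see the note above).
    headers = {"Content-Type": content_type}
    return token_url, headers, body

def bearer_header(access_token):
    """Header sent on subsequent API calls once a token is obtained."""
    return {"Authorization": f"Bearer {access_token}"}

def basic_auth_header(username, password):
    """Equivalent of the Basic Auth method: base64-encoded username:password."""
    raw = f"{username}:{password}".encode("utf-8")
    return {"Authorization": "Basic " + base64.b64encode(raw).decode("ascii")}
```

The Bearer Token method corresponds to a constant `bearer_header` value, while Dynamic Token refreshes the token via the token request as needed.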
Set the correct pagination, based on the 3rd party system you are connecting to with this connector.
The Generic REST API connector supports 5 different types of pagination. These are used by scheduled tasks:

- Disabled - the API does not support pagination and returns all data in a single response.
- Start & Offset - the API uses start and offset parameters. On each request, the connector automatically increases start by the amount of offset until all data is retrieved.
- Page Increment - the API uses page numbering. On each request, the connector automatically increments the page number by one until no more results are returned.
- Link Token - the response includes a URL for the next page. The connector automatically follows that URL until no next link is provided.
- Link Attribute Token - the response includes a next-page token. The connector automatically sends that token as a query parameter in the next request. This is used, for example, by Google.
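The pagination loop the connector runs can be sketched as follows. This is a simplified, hypothetical Python illustration (the real connector drives actual HTTP calls); it shows the Start & Offset and Link Token strategies, together with a safety threshold that bounds the number of calls:

```python
def fetch_all_start_offset(fetch_page, offset=100, safety_threshold=50):
    """Start & Offset pagination: keep increasing 'start' by 'offset'
    until the API returns an empty page or the safety threshold is hit."""
    items, start = [], 0
    for _ in range(safety_threshold):   # prevents infinite loops
        page = fetch_page(start=start, limit=offset)
        if not page:
            break
        items.extend(page)
        start += offset
    return items

def fetch_all_link_token(fetch_url, first_url, safety_threshold=50):
    """Link Token pagination: follow the 'next' URL included in each
    response until no next link is provided."""
    items, url = [], first_url
    for _ in range(safety_threshold):
        page = fetch_url(url)           # page: {"data": [...], "next": url or None}
        items.extend(page["data"])
        url = page.get("next")
        if not url:
            break
    return items
```

Page Increment works like Start & Offset, except the page number is incremented by one instead of the start index growing by the offset.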
6. Fill in the WebAPI user information
- WebAPI user - select the correct WebAPI user which is used when writing data from the external system to Matrix42 Core, Pro and IGA solutions
- WebAPI password - password for the WebAPI user

7. Save the connector information.
8. Add the external REST API system's root HTTPS certificate to be trusted by Connector management (EPE). This can be done only by Matrix42: Add certificate.
9. The Matrix42 Core, Pro and IGA solution is now able to connect to the external REST API.
- The next step is to configure a scheduled task for reading data, or an event task for writing data and actions towards the REST API.
General guidance for scheduled tasks
How to Create New Scheduled Task to import data
For configuring a scheduled provisioning task, you will need access to the Administration / Connectors tab.
1. Open the Administration area (a cogwheel symbol).
2. Open Connectors view.
3. Choose the connector for the scheduled task and select New Task.
Note! If the connector is not created yet, you have to choose New connector first and after that New task.

4. Continue with connector specific instructions: Native Connectors
Should I use Incremental, Full or Both?
A scheduled task can be either Incremental or Full type.
Do not import permissions with an AD or LDAP incremental task
The incremental task has an issue with importing permissions. At the moment it is recommended not to import group memberships with an incremental scheduled task.
On the Microsoft Active Directory and OpenLDAP connectors, remove this mapping on the incremental task:

Setting on Scheduled tasks:

The Incremental type is supported only for the Microsoft Active Directory, LDAP and Microsoft Graph API (formerly known as Entra ID) connectors.
Incremental type means that Native Connectors (EPE) fetches data from the source system using changed-timestamp information, so it fetches only data which has changed or been added since the previous incremental task run.
When an Incremental type task runs for the very first time, it does a full fetch (and marks the current timestamp in the EPE database). Thereafter, the task uses that timestamp to ask the data source for data that changed since then (and EPE then updates the timestamp in the EPE database for the next task run). Clearing the task cache doesn't affect this timestamp, so an Incremental task is always incremental after the first run.
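The first-run and timestamp behaviour described above can be sketched like this. It is a simplified Python illustration with invented names (`fetch_all`, `fetch_changed_since`); in the real connector the timestamp is persisted in the EPE database:

```python
from datetime import datetime, timezone

class IncrementalTask:
    """Sketch of the incremental fetch cycle: a full fetch on the first
    run, then only changes since the stored timestamp on later runs."""

    def __init__(self, source):
        self.source = source
        self.last_sync = None   # persisted in the EPE database in reality

    def run(self):
        if self.last_sync is None:
            data = self.source.fetch_all()                  # very first run
        else:
            data = self.source.fetch_changed_since(self.last_sync)
        # The timestamp is updated after each run; clearing the task cache
        # does NOT reset it, so later runs stay incremental.
        self.last_sync = datetime.now(timezone.utc)
        return data
```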
The Full type is supported for all connectors.
A Full type import always fetches all data (that it's configured to fetch) from the source system, on every run.
Both Full and Incremental type tasks also use the task cache in EPE, which makes certain imports faster and lighter for the M42 system.
By default, that task cache is cleared at midnight UTC. When the cache is cleared, the next import after that runs without using the cache to determine whether fetched data should be pushed to ESM; all fetched data is pushed to ESM. After that, subsequent task runs, until the next time the cache is cleared, use the EPE cache to determine whether fetched data needs to be pushed to ESM or not.
You can configure at what time of day the task cache is emptied by changing a global setting in the EPE datapump configuration:
/opt/epe/datapump-itsm/config/custom.properties
which is by default set to: clearCacheHours24HourFormat=0
You can also clear the cache multiple times a day, but this needs careful consideration, as it has an impact on overall performance: EPE will push changes to ESM that probably are already there. Example (do not add spaces to the attribute value): clearCacheHours24HourFormat=6,12
After changing this value, reboot the EPE datapump container to apply the change.
Recommendations:
Always have a Full type scheduled task by default.
If you want to fetch changes to data already covered by the full task more frequently than you can run the full task, also add an incremental task. Usually an incremental task is not needed.
Recommended Scheduling Sequence
The recommended scheduling sequence depends on how much data is read from the customer's system/directory into the Matrix42 Core, Pro or IGA solution, and whether the import is Incremental or Full.
Examples for scheduling:
| Total amount of users | Total amount of groups | Full load sequence | Incremental load sequence |
|---|---|---|---|
| < 500 | < 1000 | Every 30 minutes if partial load is not used; four (4) times a day if partial load is used | Every 10 minutes |
| < 2000 | < 2000 | Every 60 minutes if partial load is not used; four (4) times a day if partial load is used | Every 15 minutes |
| < 5000 | < 3000 | Every four (4) hours if partial load is not used; twice a day if partial load is used | Every 15 minutes |
| < 10 000 | < 5000 | Maximum imports twice a day, whether or not partial load is used | Every 30 minutes |
| < 50 000 | < 7000 | Maximum import once a day, whether or not partial load is used | Every 60 minutes |
| Over 50 000 | Over 7000 | There might be a need for another EPE-worker; please contact the Product Owner | Separately evaluated |
Please note that if several tasks run at the same time, you may need more EPE-workers. The tasks should be scheduled at different times and can be completed according to the table above. However, if there are more than 6 tasks running at the same time, the number of EPE-workers should be increased. It's best practice not to schedule tasks to run at the same time, if possible.
Recommendations related to performance
If the amount of data to be imported is over 10 000, consider these things:
- Adjust the log level of ESM and DATAPUMP to ERROR level, to lower the amount of logging during a task run.
- Have as few automations as possible starting immediately for imported datacards (listeners, handlers, workflows), as those make ESM take longer to handle new datacards.
Set removed accounts and entitlements status removed/disabled
With this functionality, you can set the account and entitlement status to e.g. Deleted or Disabled when an account or entitlement is removed from the source system. Starting from version 2025.3, you can also set a status on generic objects (not only on accounts/identities and entitlements/groups).
For version 2025.3 and newer
In version 2025.3, these settings were moved from properties files to the Task UI. You can now also set these settings for Generic objects, which was not possible before this version.
There is a separate configuration for each scheduled task, and for all mapping types. Here is an example of this configuration on a task:

For version 2025.2 and older
This functionality is available for Full type scheduled tasks.
The settings are in the datapump Docker container's configuration file. To change the parameter values, set them in the /opt/epe/datapump-itsm/config/custom.properties file.
Configuration
To enable the disabling functionality, the datapump config should have these parameters set to true:
disable.unknown.esm.users=true
disable.unknown.esm.groups=true
These two parameters are false by default in the 2024.2 and 2025.1 versions. In 2025.2 and newer versions they are true by default.
Next are these parameters:
personTemplateStatusCodeAttributeKey=accountStatus
personTemplateStatusAttributeDisabledValueKey=Deleted
groupTemplateStatusCodeAttributeKey=status
groupTemplateStatusAttributeDisabledValueKey=5 - Removed
The first two attributes should point to the DatacardHiddenState attribute in the User template, and tell which value should be sent there when the user is deleted.
By default it's accountStatus and the value 5 - Removed on the IGA Account template.
All these needs to match with the attribute configuration:

The same applies to the next two parameters, but for groups.
If you need to change these parameters in the properties file, make the changes in the datapump container in the file /opt/epe/datapump-itsm/config/custom.properties. Those changes will then survive a container reboot and are copied on reboot to /opt/epe/datapump-itsm/config/application.properties.
Description
Tasks save their taskid (shown as the Task Id mapping in the UI) to the datacards; it is then used to determine whether a datacard was added by this task, in case there are multiple tasks with different sets of users.
This field was previously used as datasourceid, but since we moved to the model where a connector can have multiple tasks, its identifier cannot be used anymore; that's why the field was repurposed as taskid instead.
Taking users as an example: when the task runs, ESM is asked for the list of users that have the task's taskid in the Task Id mapping field and do not have the personTemplateStatusAttributeDisabledValueKey value in the personTemplateStatusCodeAttributeKey attribute.
This result is then compared to what the task fetched, and the datacards of users that were not fetched have their personTemplateStatus attribute set to the value specified in the config - 5 - Removed by default.
Example log below shows described process and informs that one user was removed.

The same applies to groups, but the groupTemplateStatus attributes are used instead.
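As a rough illustration of the comparison described above, here is a hypothetical Python sketch. The attribute names follow the configuration defaults, but the function and the datacard field names (`taskId`, `id`) are illustrative, not part of the product:

```python
def mark_removed(esm_users, fetched_ids, task_id,
                 status_attr="accountStatus", disabled_value="5 - Removed"):
    """Mark datacards owned by this task that were NOT present in the
    latest full fetch with the configured 'removed' status value."""
    changed = []
    for user in esm_users:
        if user.get("taskId") != task_id:
            continue                        # added by a different task
        if user.get(status_attr) == disabled_value:
            continue                        # already marked as removed
        if user["id"] not in fetched_ids:
            user[status_attr] = disabled_value
            changed.append(user["id"])
    return changed
```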
Notes
- The feature works only with full fetch scheduled tasks.
- No support for generic templates yet; only identity and access.
- When migrating from previous versions where datasourceid was still used, the task needs to run at least once to set its taskid in the datacards first.
- EPE identifies disabled users or groups as ones that were removed from the AD; at present we do not support statuses related to the entity being active or not.
- EPE does not enable users back on its own.
- If more than one task fetches the same users or groups, the taskid in the datacard may be overwritten depending on which task ran last. It is suggested that multiple full type tasks do not fetch the same user or group.
- Always make configuration file changes in custom.properties; do not change only application.properties, as those changes are lost on container reboot if you have not made the same changes in custom.properties.
Configure scheduled task for reading data
The 2025.3 version added support for reading data with scheduled tasks.
The Generic REST API connector can read data from 3rd party solutions exposing a REST API by creating a scheduled provisioning task.
How to create new task for Scheduled Provisioning
For configuring a scheduled provisioning task, you will need access to the Platform configuration console.
Note! If the connector is not created yet, you have to create a new connector first; after that you are able to create new tasks.
1. Open the Platform configuration view (a gear symbol).
2. Open Connectors view.
3. Choose the connector for which the scheduled task is configured
4. Select New task under the correct connector

5. Define the scheduling for the task (whether and how the scheduled task should run periodically). Choose the scheduling sequence, which depends on how much data is read into the Matrix42 Core, Pro and IGA solution.

The recommended scheduling sequence depends on how much data is read from the customer's directory into the Matrix42 Core, Pro and IGA solution, and whether the import is a partial or full load.
Example scheduling for reading data:
| Total amount of users | Total amount of groups | Full load sequence | Partial load sequence |
|---|---|---|---|
| < 500 | < 1000 | Every 30 minutes if partial load is not used; four (4) times a day if partial load is used | Every 10 minutes |
| < 2000 | < 2000 | Every 60 minutes if partial load is not used; four (4) times a day if partial load is used | Every 15 minutes |
| < 5000 | < 3000 | Every four (4) hours if partial load is not used; twice a day if partial load is used | Every 15 minutes |
| < 10 000 | < 5000 | Maximum imports twice a day, whether or not partial load is used | Every 30 minutes |
| < 50 000 | < 7000 | Maximum import once a day, whether or not partial load is used | Every 60 minutes |
| Over 50 000 | Over 7000 | There might be a need for another EPE-worker; please contact the Product Owner | Separately evaluated |
6. Fill in the task details
- Fill in a unique task name for the scheduled task
- Task usage indicates whether the task is used for reading data, writing data, or authentication. Note that if the type is changed afterwards, it can break workflows or integrations.
- Mappings type depends on what type of information is read from the directory; only Generic is supported for the Generic REST API connector
  - Generic (one template) - used when generic information is read from the directory

7. Fill in the task parameters
- Query - actual API url endpoint to be called
-
Sub Queries - optional. Can be used if there is a need to fetch data related to Query, from another API endpoint.
Sub queries support only basic first level variable from main query resultset. Subqueries don't support jsonpath syntax.
Example of supported subquery (this expects to have userid attribute on first level on json response): /groups/{userid}
Example of NOT supported subquery which tries to use jsonpath syntax: /groups/{users[0].userid} -
Query headers - Custom headers added to final query during data extraction.
Connector management (EPE) adds these headers automatically, so no need to explicitly add these:“Content-Type”, “…”"Authorization", “…” - Value Marker - can be empty. Value depends on JSON data structure returned by Query API. For example if whole data is inside resultset json object, set this value marker to resultset.
- Error Marker - can be empty. Value depends on JSON error message structure returned by Query API. For example if whole error message is inside errors json object, set this value marker to errors.
- Safety Threshold for API calls - safety threshold to prevent infinite loops. Set this 2 times larger than current calls needed for this task to fetch all data. If fetching all your data by this task, e.g. 50 paginated calls are needed, set this value to 100.
-
Unique attribute - set unique attribute name (case sensitive) from JSON resultset. Usually ID, id or guid etc. Every row must have unique value in this attribute. You also need to add this attribute to mapping table in the end of task configuration. Note! Scheduled task can't fetch data which doesn't contain this unique attribute for every object.
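The Value Marker and Unique attribute settings above can be illustrated with a small sketch. All field names here (resultset, id, name) are just examples following the descriptions above, not values the connector requires:

```python
import json

# Example Query API response: the rows are wrapped inside a "resultset"
# object, so Value Marker would be set to "resultset".
response_body = """
{
  "resultset": [
    {"id": "a1", "name": "Alice"},
    {"id": "b2", "name": "Bob"}
  ]
}
"""

data = json.loads(response_body)

# Value Marker: unwrap the rows from the "resultset" object
rows = data["resultset"]

# Unique attribute: every row must have a unique value in this attribute
# (here "id"); rows missing it cannot be processed by the scheduled task.
ids = [row["id"] for row in rows]
assert len(ids) == len(set(ids)), "unique attribute values must not repeat"
```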

8. Fill in failure information
Optional settings for failure handling: if the scheduled task fails, it can create a data card in ESM that displays the error. If failure settings are defined, the administrator does not need to manually check the status of scheduled tasks.
- Failure template - select a template for the data card which will be created in case of any errors during provisioning (connection to data sources, timeouts, etc.)
- Failure folder - select the folder where the failure data card is stored.
- Failure attribute - select the attribute in the Failure Template where the error information should be stored. Select a text type attribute.

9. Add Generic mappings
In the mappings section you configure which attribute from the JSON message is read to which attribute on the Matrix42 Core, Pro and IGA datacard.
- Target template - select a template to define attribute mappings
- Target folder - select a folder from a list of folders. The list is narrowed down to match compatibility with the selected Template.
- Data Source Type mapping - optional. If it is set, the connector's type is written to that attribute.
- Task Id mapping - the task id number is written to this attribute.
- Set value to datacard for object deleted from source system - this functionality is activated by setting the checkbox on. When an object that was previously read from the 3rd party system into the solution is deleted from the 3rd party system (meaning this task's query doesn't return it anymore), the scheduled task notices that it was deleted and writes the value you choose to the selected attribute on that datacard. This can, for example, be used to set a Status attribute to Deleted (as shown in the example screenshot below).

- Attribute mappings
- External attribute - which attribute from the 3rd party system API is read from the JSON body
- Local attribute - the attribute in Matrix42 Core, Pro and IGA that the external attribute is mapped to
- It is possible to add more attribute mappings by choosing New attribute
- JSONPath expressions supported and not supported in the mappings table external attribute:
https://docs.efecte.com/configure-connectors/jsonpath-mappings-for-generic-rest-api-connector
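Conceptually, the mappings table pairs an external attribute from the JSON body with a local datacard attribute. A rough sketch, in which all attribute names are made up for illustration:

```python
import json

# A hypothetical JSON row returned by the 3rd party API
row = json.loads('{"id": "a1", "givenName": "Alice", "familyName": "Doe"}')

# External attribute -> local attribute, as rows in the mappings table.
# Only first-level attribute names are used here; see the linked article
# for which JSONPath expressions are supported.
mappings = {
    "id": "external_id",      # the unique attribute must also be mapped
    "givenName": "first_name",
    "familyName": "last_name",
}

# Build the datacard values the scheduled task would write
datacard = {local: row[external] for external, local in mappings.items()}
```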

10. Save the provisioning task with the Save button.
If some required attributes are missing, the Save button is displayed as grey and it will show what is missing from the settings.

11. You have now configured a scheduled connector task for Generic REST API read
- You can now wait until the task is started based on its scheduling, or
- Run the task manually - by clicking the "Run Task" button on top of the task edit window, the task is scheduled to start immediately. This is useful for test runs, or if you don't want to change the schedule settings but want to run the task now.

Example of manual task run starting message:

If the task is executed manually (Run Task) or runs according to its scheduling, the task status can be reviewed from the Scheduled tasks list Manage column by clicking the "View history" button.

Configure event-based task
The event task is used when writing data (or making other event-based REST API calls from a workflow) towards the customer's solution offering a generic REST API. These Connector Management capabilities are available for all Matrix42 Core, Pro and IGA solutions.
Examples for event-based task usages:
- Add account
- Update person
- Activate account
- Create ticket
- Inactivate device
All configuration related to Connector Management and event-based provisioning tasks is done in the Matrix42 Core, Pro or IGA Connectors view.
1. Open the Matrix42 Core, Pro or IGA configuration view (a gear symbol).
2. Open connectors view
3. Choose the connector which will use the event task
4. Select "New task" under the correct connector

5. Fill in task settings
- Task name - give a name for the task; it will be displayed in the connectors view.
- It is good practice to name a task in a way that describes its purpose, for example [Template name]:[Activity], e.g. IGA Service request: Add account
- Task usage indicates whether the task is used for reading data or writing data. It can be changed afterwards, but this is not recommended if the event task is in use, because it will break workflows.
- Mappings type depends on what type of information is read from the directory; the identity mappings selections are displayed based on this setting.
- Generic (one template) - used when generic information is read from the directory. This can be any type of data.

Warning about changing task usage or mappings type when event task in use:

6. Fill in task parameters
- Query - the API endpoint which this task will call. This can be left empty and given in the Workflow orchestration node. You can give the full endpoint address, like https://apiurl.com/endpointtocall, or just the end of the URL. If you give just the end of the URL, for example endpointtocall, the final URL will be concatenated from all of these: connector host URL + task query + orchestration node URL.
- Query headers - custom headers added to the final query to the REST API. See the target/source system API documentation for which headers it requires.
- Date Attribute formatter - used to format dates when provisioning a date type attribute to the target system. You can select a format from the list or add your own custom format. The field on the right shows an example of the formatted date.
- DateTime Attribute formatter - used to format datetimes when provisioning a datetime type attribute to the target system. You can select a format from the list or add your own custom format. The field on the right shows an example of the formatted datetime.
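The URL concatenation rule for the Query parameter can be sketched as follows. The host and path values are made up for illustration; only the concatenation order (connector host URL + task query + orchestration node URL) comes from the description above:

```python
# Made-up example parts for the concatenation rule
connector_host_url = "https://apiurl.com"
task_query = "/api/v1"
orchestration_node_url = "/endpointtocall"

# Final URL = connector host URL + task query + orchestration node URL
final_url = connector_host_url + task_query + orchestration_node_url
# final_url is now "https://apiurl.com/api/v1/endpointtocall"
```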

7. Define generic mappings
- Target template - select a template to define attribute mappings
- Target folder - select a folder from a list of folders. The list is narrowed down to match compatibility with the selected Template.
- Attribute mappings (Note! In most cases there is no need to set these. The data used in an event task is configured on the Workflow Orchestration node.)
- Efecte template attribute - the attribute in the Efecte directory that the directory attribute is mapped to.
- Directory attribute - which attribute from the directory is mapped to Efecte
- Add new attribute - it is possible to add more attributes by choosing the New attribute button.

8. Save task
Save the provisioning task with the "Save" button.
If any mandatory information is missing, you are not able to save the task and the Save button will show the missing information.

9. Configure Workflow to use this event-based task with Orchestration node
The next step is to configure the workflow to use this event-based task.
From the workflow engine in the Platform, it is possible to execute provisioning activities towards the customer's directory services. This means that any of the available orchestration node activities can be run at any point of the workflow.
Workflow references are shown on the connector overview page (you can refresh workflow references with the "Show Workflow references" button):

Configure Workflow Orchestration node activities
Generic REST call
Add an Orchestration node to your workflow and configure the following things.
Name - set a descriptive name
Description - set a description that explains the purpose
Orchestrate - select Provisioning Engine
Data Source - select Generic REST API
Activity - select Generic REST call
Target - select your event-based task (you can use the same task for many orchestration nodes)
Action - select the action based on the REST API you are calling (check the API documentation for the correct action). Supported actions are: POST, PUT, PATCH, DELETE and GET
REST URL - the API endpoint you are calling. This can be the full URL or a suffix of the URL. If you use a suffix, the actual URL to call will be concatenated from all of these: connector host URL + task query + orchestration node URL. You can use template attribute variables similarly to the email node, using dollar syntax like $attribute_code$ and $reference:attribute_code$.
Note! If you have spaces or $ signs in your URL, remember to URL encode those: $ is %24, space is %20.
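A minimal sketch of the dollar-syntax substitution with URL encoding. The attribute value and URL are made up, and the real substitution is done by the workflow engine; this only illustrates why a space ends up as %20 and $ as %24:

```python
from urllib.parse import quote

# Hypothetical datacard attribute value resolved for $attribute_code$
attributes = {"attribute_code": "John Doe"}

url_template = "https://apiurl.com/users?name=$attribute_code$"

# Percent-encode the value before substitution: space becomes %20,
# and a literal $ would become %24.
value = quote(attributes["attribute_code"], safe="")
final_url = url_template.replace("$attribute_code$", value)
# final_url is now "https://apiurl.com/users?name=John%20Doe"
```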
Rest Body - the JSON body for the API call. Can be empty if the API you are calling does not need a body. See body details in the API documentation. You can use template attribute variables similarly to the email node, using dollar syntax like $attribute_code$ and $reference:attribute_code$.
For example, this body has two variables, $firstname$ and $lastname$:
{"name": {
"givenName": "$firstname$",
"familyName": "$lastname$"}}

In case you need to create a complicated JSON body for the API call, you can construct the full JSON first into some datacard attribute, and then only have that attribute code in the body, like this: $generated_json_body_attributecode$
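A sketch of generating such a JSON body with the json library. The attribute values are made up; in a workflow script they would come from the datacard (e.g. this.get("firstname")), and the result would be stored with this.set into the attribute referenced as $generated_json_body_attributecode$:

```python
import json

# Hypothetical attribute values; in a workflow script these would come
# from the datacard, e.g. this.get("firstname") and this.get("lastname")
firstname = "John"
lastname = "Doe"

body = {"name": {"givenName": firstname, "familyName": lastname}}

# Serialize the structure into the JSON string that would be stored
# into the generated_json_body_attributecode attribute
generated_json = json.dumps(body)
```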

Rest Response Attribute - the resultset JSON of the API call will be written to this attribute. Not all APIs return any response JSON; see details in the API documentation. Note! This doesn't contain the actual HTTP code (e.g. 200, 401, 500); it contains only the resultset message if the API returned any.
How to read a value from JSON in a workflow
Use the json library to read a value from JSON.
This example uses OnPremisesExtensionAttributes, but the same approach can be used for all kinds of JSON messages from the Microsoft Graph API and other REST APIs.
To read one specific value from JSON which looks like this:
{ "extensionAttribute1": "test data1", "extensionAttribute2": null, "extensionAttribute3": null, "extensionAttribute4": "EXT", "extensionAttribute5": null, "extensionAttribute6": null, "extensionAttribute7": null, "extensionAttribute8": null, "extensionAttribute9": "HR functions", "extensionAttribute10": "100", "extensionAttribute11": null, "extensionAttribute12": null, "extensionAttribute13": null, "extensionAttribute14": "test5", "extensionAttribute15": "M365_E5" }
You can do it easily with the following code in a workflow script.
Example (the example uses two ESM attributes: onPremisesExtensionAttributes and extensionAttribute14code):
import json
if onPremisesExtensionAttributes:
    _data = this.get("onPremisesExtensionAttributes")
    _obj = json.loads(_data)
    _value = _obj["extensionAttribute14"]
    this.set("extensionAttribute14code", _value)
Remember to always test that the code selects the correct data from JSON for your use case, and if not, make the needed adjustments.
Provisioning exception - if there is an issue with the REST API call, the error will be written to this attribute.

Exception handling:
- Provisioning exception is an optional property on this workflow node. Admins can configure this property so that any exceptions occurring during the provisioning actions are written to it.
- The result of a node will be in the "Completed" state if the REST API HTTP response code is 200 or over and under 400.
- Details about successful/unsuccessful REST API calls can be found in the logs.
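The success rule above can be expressed as a simple predicate (a sketch; the actual check is performed by the Provisioning Engine):

```python
def node_completed(http_status: int) -> bool:
    # The node result is "Completed" when the response code is 200 or
    # over and under 400, i.e. any 2xx or 3xx response.
    return 200 <= http_status < 400

assert node_completed(200) and node_completed(302)
assert not node_completed(401) and not node_completed(500)
```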
Connectors created using this Generic REST API Connector
Microsoft Azure DevOps: https://docs.efecte.com/configure-connectors/ms-ado-connector
Google: https://docs.efecte.com/configure-connectors/google-connector
Raynet: https://docs.efecte.com/configure-connectors/raynet-connector
Matrix42 Enterprise and Matrix42 Software Asset Management (SAM): https://docs.efecte.com/configure-connectors/m42enterprise-sam-connector
Troubleshoot
In this chapter, troubleshooting options are described:
- In case a failure template is used, check the corresponding data card
- Check the scheduled task history from connector management
- Check the Native Connectors logs: from the Connectors tab, open Logs in the left menu. See mainly debug.log, *worker*.log and datapump.log
Restrictions
Event-based tasks restrictions
- Emails are not supported
- Attachments are not supported
Scheduled tasks restrictions
- Date type attributes are not supported (you need to map those to a string type attribute)
- DateTime type attributes are not supported (you need to map those to a string type attribute)
- Emails are not supported
- Attachments are not supported