5 Set Up Catalog Import and Export

Import Catalog Definitions

Use this topic to understand how you can import third-party catalogs into the Launch application.

You can import third-party catalogs either through APIs or through the UI, using JSON format. The import can also be a single ZIP file containing multiple JSON files. All imported entities are assigned to a single project to manage the publish process. The import process is flexible and enables you to import one or more entities in a sequence determined by the dependencies between the entities.

You can import catalog definitions in the following ways:

  • One at a time: Enables you to import in bulk, one entity type at a time. For example, a set of product specifications, product lines, simple offers, bundle offers, price lists, and so on. However, you must ensure that the referenced entities are either in the same project or already in a lifecycle status of Active or Launched. For example, you could choose to bulk import product specifications while ensuring that their dependent service specifications and usage specifications are in place. The same applies to the hierarchy: when importing product offers, their associated product specifications must already be in the application.

  • The entire structure in one go: For example, a Package type offer along with its bundles, both commercial and service bundles, simple offers, price lists, terms, product lines and category associations.

The sequence in which the catalog entities are imported is as follows:

  1. Project
  2. Category
  3. Catalog
  4. Balance Element
  5. Price List
  6. Tax Service Provider
  7. Attributes
  8. Customer Profile Specification
  9. Custom Profile Specification
  10. Service Specification
  11. Usage Specification
  12. Product Specification
  13. Product Line
  14. Product Offer Price
  15. Product Offer
  16. Pricing Logic Algorithm
  17. Pricing Constraints
  18. Product Rule
  19. Promotion
  20. Entitlement
  21. PriceTag
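As an illustration, the sequence above can be encoded so that a mixed set of entities is ordered before you build import files. The API-style resource names here are taken from the export resource-type list later in this chapter; the helper itself is hypothetical:

```python
# Hypothetical helper: the resource names below mirror the documented import
# sequence, using the API-style spellings listed under Export by Resource Types.
IMPORT_SEQUENCE = [
    "project", "category", "catalog", "balanceElement", "pricelist",
    "taxServiceProvider", "attribute", "customerProfileSpecification",
    "customProfileSpecification", "serviceSpecification", "usageSpecification",
    "productSpecification", "productLine", "productOfferingPrice",
    "productOffering", "pricingLogicAlgorithmSpecification", "constraint",
    "productRule", "promotion", "entitlement", "priceTag",
]

def order_for_import(resource_types):
    """Sort the given resource types into the documented import sequence."""
    rank = {name: i for i, name in enumerate(IMPORT_SEQUENCE)}
    return sorted(resource_types, key=lambda r: rank[r])
```

For example, `order_for_import(["productOffering", "project", "productSpecification"])` puts the project first, then the product specifications, then the offers.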

Getting Ready for Import

Before you begin an import, it's recommended that you increase the number of import threads using the ORA_ATC_IMPORT_THREAD_COUNT profile value. This profile option determines the number of parallel threads that run during the import. The default value is 10.

You can use the following criteria to configure the profile option:

  • Number of ESS servers configured on your instance: You can get this information by contacting Oracle Support.

  • Other active processes: Other application processes that may be simultaneously running on this server and utilizing the same ESS servers.

If the Launch Cloud Service import is the only process running on the instance, you can configure the number of threads up to 10 times the number of ESS servers. If there are other processes likely to be running simultaneously, reduce the number of threads proportionally.
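As a rough sketch of this sizing guidance (the proportional scaling rule below is an assumption, not a documented formula):

```python
def recommended_thread_count(ess_servers, other_active_processes=0):
    """Sizing sketch for ORA_ATC_IMPORT_THREAD_COUNT: up to 10 threads per
    ESS server when the import is the only workload, scaled down when other
    processes share the same ESS servers (the even-share split is assumed)."""
    max_threads = ess_servers * 10
    return max(1, max_threads // (other_active_processes + 1))
```

With 2 ESS servers and no other workload this suggests 20 threads; with one other concurrent process, 10.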

Here's how you can set the profile options for import:

  1. Go to Navigator > My Enterprise > Setup and Maintenance.

  2. In Setup and Maintenance, click Tasks > Search, and enter Manage Administrator Profile Value in the Search field.

  3. On the Manage Administrator Profile Value page, search for ORA_ATC_IMPORT_THREAD_COUNT in the profile option code.

  4. Set the profile value for Profile Level to Site and change the profile value to the appropriate number.

  5. Click Save and Close.

How to prepare data files for import through APIs and UI

As the schema of the import file is based on TMF 620, the third-party catalog import file must be prepared in the TMF JSON schema with Oracle-shipped extensions. The template schema has resources that are supported by the import job. Each resource is an array of records that must adhere to the seeded schemas for Launch Cloud Service. Use the template schema to download the seeded schema files that are referenced within each resource. See REST API Reference for Launch Cloud Service for a sample template and payload.

Here is a list of some of the key attributes present in the importJob resource:

  • id: Identifier of the import job.

  • @type: Indicates the type of job.

  • contentType: Indicates the content type of the file used in the import job.

  • status: Indicates the status of the triggered import job.

  • href: Indicates the URL of the triggered import job.

  • url: URL of the file containing the import data.

  • errorLog: Indicates the error summary of the triggered job.

  • errorLogUrl: Indicates the error log file URL. The error file shows the error messages and corresponding IDs of the failed entities, with summary information.

  • fileName: Indicates the name of the imported file.

  • importSummary: Indicates the success summary of the triggered job, in terms of each successfully imported resource and its success count.

  • path: Indicates the path of the job request file.

  • createdBy: Indicates the submitter of the job request.

  • creationDate: Indicates the time when the job was triggered.

  • completionDate: Indicates the time when the job was completed.
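A small sketch of how a client might condense these attributes from a parsed GET-by-ID response (the helper name is hypothetical):

```python
def summarize_import_job(job):
    """Condense an importJob response dict into a short status summary using
    the status, errorLog, and importSummary attributes described above."""
    summary = job.get("importSummary", {})
    counts = {r["name"]: r["count"] for r in summary.get("resources", [])}
    return {
        "status": job.get("status"),
        "total": summary.get("totalImportObjects", 0),
        "counts": counts,
        "has_errors": bool(job.get("errorLog")),
    }
```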

What you must know

  • Multiple JSON files can be archived and imported as a single ZIP file.

  • The productOfferingPrice subresource within the productOffering resource only supports references to the top-level productOfferingPrice resource, so you must provide only the reference when associating a productOfferingPrice. If the price isn't already present in the application, its complete structure must be included in the top-level productOfferingPrice resource array in the same data file.

  • If an existing Launch Cloud Service top-level entity is imported, then the entity is updated. If the entity doesn't exist, then a new entity will be created as version 1.0.

  • If you intend to publish the model to Buying, Catalogs and Categories should be a part of the same initiative. If there are multiple initiatives, none of the reference initiatives should have categories. Initiatives must be in the In Design state until all reference initiatives are imported successfully and published to Buying in the same order.
  • The terms project and initiative are used interchangeably.

  • The lifecycle status value for all the imported records is set to In design.

How to use a project for import

Every import job uses one top-level project. If the top-level project isn't provided, the import job implicitly creates a top-level project with ID Import_<Timestamp> and name Import Job <ImportJob_ID> and associates all the imported objects to this project. This top-level project gets imported with the In design lifecycle status, with all the other entities in the import file associated to it as project items. The project references within all the imported objects are optional and default to the top-level project used by the import job. If project references are provided, they must refer to the single top-level project provided in the same input file.
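The implicit project described above can be sketched as follows; the exact timestamp format inside Import_<Timestamp> is an assumption:

```python
from datetime import datetime

def default_top_level_project(import_job_id, now=None):
    """Sketch of the project an import job creates when no top-level project
    is supplied: ID Import_<Timestamp>, name Import Job <ImportJob_ID>.
    The %Y%m%d%H%M%S timestamp rendering is an assumption."""
    now = now or datetime.now()
    return {
        "id": "Import_" + now.strftime("%Y%m%d%H%M%S"),
        "name": f"Import Job {import_job_id}",
        "lifecycleStatus": "In design",
    }
```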

Note:

For Zip file imports, the top-level project can be provided in any one of the JSON files contained in the Zip file for import. Ensure that only one file contains the top-level project.
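A hypothetical pre-flight check for this rule: scan the JSON files inside the ZIP and report which ones carry a top-level project, so you can confirm there is exactly one before submitting the job.

```python
import io
import json
import zipfile

def files_with_top_level_project(zip_bytes):
    """Return the names of JSON files inside an import ZIP that contain a
    top-level project entry; exactly one such file is expected."""
    names = []
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        for name in zf.namelist():
            if not name.endswith(".json"):
                continue
            payload = json.loads(zf.read(name))
            # Each data file is assumed to be a JSON object keyed by resource.
            if isinstance(payload, dict) and payload.get("project"):
                names.append(name)
    return names
```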

How to initiate an import

Through APIs

You must use the TMF product catalog management import endpoint which involves the following:

  • Creating payloads using the import template. For a sample schema template and payload, see REST API Reference for Launch Cloud Service.

  • Using the curl command to initiate the import process:

    • Header:

      • Content-Type: Use multipart/form-data.

      • Authorization: Use standard basic authorization, such as an encoded user name and password.

    • Form: primaryFile: Use the import data file which you have prepared.

Here's a sample of the curl command that you must use to initiate the import process:

curl --location --request POST \
'https://<hostName>/crmRestApi/atcProductCatalog/11.13.18.05/tmf-api/productCatalogManagement/v4/importJob' \
--header 'Content-Type: multipart/form-data' \
--header 'Authorization: <ID>' \
--form 'primaryFile=@"<import data file>"'

Through UI

  1. Go to Administration > Job Management > Import Jobs.

  2. Click Create Import Job and select the payload prepared for import.

  3. Click Import.

For a sample request payload on initiating the import API, see REST API Reference for Launch Cloud Service.

How to review the import status

After you initiate the import job, you can check the import status and error scenarios, if any, through the API. The Oracle Enterprise Scheduler job status can be checked from the Status field of the GET by ID response. The errorLog attribute in the importJob GET by ID response indicates the error summary of the triggered job; use it to identify the failure and fix the input data file. The importSummary indicates the successfully imported records, including the total number of objects successfully imported along with the name and count of each successfully imported resource. To retrieve the import job status, use the following curl command:

curl --location --request GET \
'https://<hostName>/crmRestApi/atcProductCatalog/11.13.18.05/tmf-api/productCatalogManagement/v4/importJob/<ID>' \
--header 'Content-Type: application/json' \
--header 'Authorization: <ID>'

Here's a sample of a successful response:

{
  "id": 149312,
  "@type": "ImportJobOracle",
  "contentType": "application/json",
  "status": "SUCCEEDED",
  "fileName": "testPayload.json",
  "createdBy": "booth",
  "path": "",
  "url": "v1/importFile/testPayload.json",
  "errorLog": "",
  "importSummary": {
    "id": 149312,
    "totalImportObjects": 11,
    "resources": [
      {
        "name": "project",
        "count": 1
      },
      {
        "name": "productOffering",
        "count": 10
      }
    ]
  },
  "errorLogUrl": "",
  "creationDate": "2021-04-07 14:28:00.195",
  "completionDate": "2021-04-07 14:29:26.382",
  "href": "https://hostName/crmRestApi/atcProductCatalog/11.13.18.05/tmf-api/productCatalogManagement/v4/importJob/149312"
}

For a successful response, the importSummary lists out the success summary. To verify a successful import and get summary, use the following curl command:

curl --location --request GET \
'https://<hostName>/crmRestApi/atcProductCatalog/11.13.18.05/tmf-api/productCatalogManagement/v4/importJob/<ID>' \
--header 'Authorization: <ID>'

Here's a sample of a failed response:

{
    "id": 146623,
    "@type": "ImportJobOracle",
    "contentType": "application/json",
    "status": "ERROR",
    "fileName": "samplePayload.json",
    "createdBy": "booth",
    "path": "",
    "url": "v1/importFile/samplePayload.json",
    "errorLog": "Errors importing data for resource productOfferingPrice. 3 error(s) occurred.\n\nErrors importing data for resource productOffering. 5 error(s) occurred.\n\nErrors importing data for resource constraint. 3 error(s) occurred.\n\nErrors importing data for resource productOfferingPrice. 5 error(s) occurred.\n\nErrors importing data for resource productOffering. 6 error(s) occurred.\n, 6 record(s) rolled back.",
    "importSummary": {
        "id": 146623,
        "totalImportObjects": 110,
        "resources": [
            {
                "name": "project",
                "count": 1
            },
            {
                "name": "category",
                "count": 4
            },
            {
                "name": "productLine",
                "count": 4
            },
            {
                "name": "catalog",
                "count": 1
            },
            {
                "name": "pricelist",
                "count": 2
            },
            {
                "name": "productSpecification",
                "count": 13
            },
            {
                "name": "productOfferingPrice",
                "count": 46
            },
            {
                "name": "productOffering",
                "count": 39
            }
        ]
    },
    "errorLogUrl": "v1/importFile/146623.log",
    "creationDate": "2021-04-06 10:47:01.564",
    "completionDate": "2021-04-06 10:56:44.045",
    "href": "https://hostName/crmRestApi/atcProductCatalog/11.13.18.05/tmf-api/productCatalogManagement/v4/importJob/146623"
}

In the failure response, the errorLog lists the error summary as well as the total rolled-back records. Records that were processed successfully are indicated in the importSummary. The full URL to download the error log in this case is: https://hostName/crmRestApi/atcProductCatalog/11.13.18.05/v1/importFile/146623.log.
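As a sketch, a client can combine the host with errorLogUrl as shown above, and pull the rolled-back record count out of the errorLog summary text (both helper names are hypothetical; the message wording matches the sample response):

```python
import re

def error_log_details(host, job):
    """Build the full error-log download URL from errorLogUrl and extract the
    'N record(s) rolled back' count from the errorLog summary text."""
    url = ""
    if job.get("errorLogUrl"):
        url = (f"https://{host}/crmRestApi/atcProductCatalog/11.13.18.05/"
               f"{job['errorLogUrl']}")
    m = re.search(r"(\d+) record\(s\) rolled back", job.get("errorLog", ""))
    return url, int(m.group(1)) if m else 0
```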

Export Catalog Definition

Use this topic to learn the different ways you can export catalog entities from the Launch application.

You can export product catalog entities either through REST APIs or through the UI (applicable only to initiatives) and retrieve the exported data files from a known location. Based on the number of records, they are exported into a single JSON file or a ZIP file containing multiple JSON files, which can later be used for importing into another Launch Cloud Service environment.

Note:

While exporting a large number of objects from the Launch application, there is a risk of increased memory usage and file size. To mitigate this, use the ORA_ATC_EXPORT_IN_MEMORY_COUNT profile option to configure the maximum number of objects that can be held in memory. The default value is 1000.

When the number of objects in memory reaches this limit, the objects are flushed to intermediate files and the memory is cleared. This results in multiple files that are zipped into a single ZIP file at the end.
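The spill-to-file behavior can be sketched as a simple batching generator; this is an illustration of the documented limit, not the actual implementation:

```python
def chunk_for_export(objects, in_memory_count=1000):
    """Accumulate objects until the ORA_ATC_EXPORT_IN_MEMORY_COUNT limit is
    reached, emit that batch (one file's worth of records), and clear memory
    before continuing; a final partial batch is emitted at the end."""
    batch = []
    for obj in objects:
        batch.append(obj)
        if len(batch) == in_memory_count:
            yield batch
            batch = []
    if batch:
        yield batch
```

Exporting 2,500 objects with the default limit would produce three files (1000, 1000, and 500 records) zipped together.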

Here are the ways to export catalogs:

Export by Resource Types

The export job allows exporting specific resource types using the resourceType attribute. Following are some of the primary TMF and non-TMF object types that can be specified when submitting export jobs:

  • productOffering

  • productOfferingPrice

  • productLine

  • constraint

  • pricingLogicAlgorithmSpecification

  • promotion

  • productSpecification

  • serviceSpecification

  • usageSpecification

  • customerProfileSpecification

  • customProfileSpecification

  • taxServiceProvider

  • pricelist

  • balanceElement

  • catalog

  • category

  • productRule

  • attribute

  • entitlement

  • priceTag

Additionally, here are some of the product offering resource sub-types available to you.

  • package

  • commercial_bundle

  • service_bundle

  • service

  • device

The following export options are also supported for a resource-based export. However, it's only the latest version of the resource that gets exported.

  • Name: Allows exporting objects in a specific resource type based on name attribute. This attribute also supports % search to export all objects based on a name pattern.

  • Lifecycle Status: Allows exporting objects across multiple resource types based on a valid lifecycle status attribute.

  • Last Update: Allows exporting objects filtered based on their lastUpdate attribute for a specified date range, whenever name attribute isn't used.

    "lastUpdate": {
        "startDateTime": "yyyy-MM-dd'T'HH:mm:ss.SSSZ",
        "endDateTime": "yyyy-MM-dd'T'HH:mm:ss.SSSZ"
    }

    For a sample payload with lastUpdate, see REST API Reference for Launch Cloud Service.
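A hedged sketch of building the lastUpdate window in the documented yyyy-MM-dd'T'HH:mm:ss.SSSZ pattern; rendering SSS as milliseconds and Z as a numeric UTC offset is an assumption:

```python
from datetime import datetime, timezone

def fmt_tmf(dt):
    """Render an aware datetime as yyyy-MM-dd'T'HH:mm:ss.SSSZ, assuming
    milliseconds for SSS and a numeric offset (e.g. +0000) for Z."""
    millis = f"{dt.microsecond // 1000:03d}"
    return dt.strftime("%Y-%m-%dT%H:%M:%S.") + millis + dt.strftime("%z")

def last_update_window(start, end):
    """Build the lastUpdate block used in export-job payloads."""
    return {"lastUpdate": {"startDateTime": fmt_tmf(start),
                           "endDateTime": fmt_tmf(end)}}
```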

Export by References

You can use the exportReferenced attribute to indicate whether the references present in exported resources must also be exported from the application. This results in multiple resource types being exported, so that they can be imported into another Launch Cloud Service environment.

Export by Project

You can export the resources associated to an initiative using the project attribute available in the export job. Initiatives in In design status aren't supported for export.

Export by File Name Prefix

You can provide a specific file name prefix for the file containing the exported objects in the fileNamePrefix attribute. The exported file's name is then set to <fileNamePrefix>_<ExportJob_ID>. If the fileNamePrefix isn't provided, then the exported file's name is defaulted to export_<ExportJob_ID>.
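The naming rule, including the documented 30-character prefix limit, can be sketched as:

```python
def exported_file_name(export_job_id, file_name_prefix=None):
    """Apply the documented naming rule: <fileNamePrefix>_<ExportJob_ID>,
    falling back to export_<ExportJob_ID> when no prefix is given. The
    prefix length is limited to 30 characters."""
    if file_name_prefix:
        if len(file_name_prefix) > 30:
            raise ValueError("fileNamePrefix is limited to 30 characters")
        return f"{file_name_prefix}_{export_job_id}"
    return f"export_{export_job_id}"
```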

What you must know

  • For Export by Resource Types

    • Name

      • Must be a value or pattern (with search %) already present in the application.

      • Attribute isn't supported when exporting more than one resource type.

      • Attribute isn't supported when exporting for a date range specified using the lastUpdate attribute.

    • Lifecycle Status

      • Must be a value configured in the application.

      • Attribute isn't supported for exact name match (without search %).

    • ResourceType values can be only one of the supported values.

    • ResourceType doesn't support a TMF object type and its corresponding sub type together when creating an export job.

    • Project attribute isn't supported.

  • For Export by Project

    • Initiatives in In design status aren't supported for export.

    • The initiative name or ID is required and must be a value already present in the application.

    • The resourceType attribute isn't supported.

    • The lifecycleStatus attribute isn't supported.

    • The lastUpdate attribute isn't supported.

  • For Export by File Name Prefix: The fileNamePrefix is optional but the length is limited to 30 characters.

  • A Zip file is generated if there are more than 1000 objects that need to be exported.
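The constraints above can be checked client-side before submitting an export job; this validator is a hypothetical convenience, not part of the API:

```python
def validate_export_options(opts):
    """Check an export-job options dict against the documented constraints.
    Returns a list of violation messages (empty when the combination is valid)."""
    errors = []
    if opts.get("project"):
        # Export by project excludes resourceType, lifecycleStatus, lastUpdate.
        for banned in ("resourceType", "lifecycleStatus", "lastUpdate"):
            if opts.get(banned):
                errors.append(f"{banned} isn't supported with export by project")
    else:
        resource_types = opts.get("resourceType") or []
        if opts.get("name") and len(resource_types) > 1:
            errors.append("name isn't supported with more than one resource type")
        if opts.get("name") and opts.get("lastUpdate"):
            errors.append("name isn't supported together with lastUpdate")
        if opts.get("lifecycleStatus") and opts.get("name") and "%" not in opts["name"]:
            errors.append("lifecycleStatus isn't supported with an exact name match")
    if len(opts.get("fileNamePrefix", "")) > 30:
        errors.append("fileNamePrefix is limited to 30 characters")
    return errors
```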

How to initiate an export

Through API

You must use the TMF product catalog management exportJob endpoint which involves the following:

  • Creating payloads using the export template. For sample schema files, see REST API Reference for Launch Cloud Service.

  • Using the REST endpoints:

    • Method: POST

    • URL: https://hostName/crmRestApi/atcProductCatalog/11.13.18.05/tmf-api/productCatalogManagement/v4/exportJob

    • Header:

      • Content-Type: Use application/json.

      • Authorization: Use standard basic authorization; provide an encoded user name and password.

Through UI

  1. Go to Initiatives.

  2. Select and view the initiative that you want to export.

  3. Click Export.

    The export job is initiated and the Job ID is notified to you.

Track the job on the Administration > Export Jobs page, which lists all the export jobs that have been triggered. You can monitor the progress of each export job on this page. Once a job succeeds, its view page provides download links for the exported content.

How to review the export status

To verify the export job status, do a GET by ID API call on the export endpoint using the following curl command and check the response field status.


curl --location --request GET 'https://hostName/crmRestApi/atcProductCatalog/11.13.18.05/tmf-api/productCatalogManagement/v4/exportJob/<exportJobId>' \
--header 'Content-Type: application/json' \
--header 'Authorization: <ID>'

Here's the response for the curl command:

{
  "id": <ID>,
  "@type": "ExportJobOracle",
  "href": "https://hostName/crmRestApi/atcProductCatalog/11.13.18.05/tmf-api/productCatalogManagement/v4/exportJob/<ID>",
  "status": "RUNNING",
  "creationDate": "2021-01-21 07:11:38.641",
  "completionDate": "",
  "exportOptions": {
    "exportReferenced": true,
    "filenamePrefix": "exp",
    "resourceType": [
      "productOffering"
    ],
    "id": "TestProductOfferId123",
    "name": "Test Product Offer"
  },
  "createdBy": "booth",
  "path": "productOffering"
}

In this GET by ID export API response, the status field indicates the current job status.

For a successful response, the exportSummary lists the success summary. To verify a successful export and get the summary, use the following curl command:

curl --location --request GET 'https://hostName/crmRestApi/atcProductCatalog/11.13.18.05/tmf-api/productCatalogManagement/v4/exportJob/<exportJobId>' \
--header 'Content-Type: application/json' \
--header 'Authorization: <ID>'

You can check the exportSummary parameter in the response to easily identify the summary of the successfully exported resources. The exportSummary shows totalExportedObjects as well as a resource-wise breakdown of each exported resource.

Troubleshoot Catalog Import Errors

Use this topic to understand how to troubleshoot some of the errors that occur during an import job. These errors may occur when the input data file used for import is incorrect.

Use the error information provided by the errorLog and errorLogUrl attributes in the importJob GET by ID response to identify and address the issue in the input data file. The errorLog attribute provides a summary of the errors that have occurred in the import job, while the errorLogUrl gives an error log file URL that shows the exact resource ID, resource name, and the error message. A new import job can then be submitted with the fixed data file.

Here's the list of potential errors along with some troubleshooting tips.

Table 5-1 Troubleshoot Catalog Import Errors

Error: Wrong JSON format of the input data file
Description: The records in the input data file must follow specific JSON formatting. Incorrect or invalid formatting results in errors.
Troubleshooting: Refer to the instructions in the How to prepare data files for import section to resolve this error.

Error: Wrong attributes used in the input payload resource
Description: Using unknown attributes that aren't part of Launch Cloud Service entities results in errors.
Troubleshooting: Correct the individual records to use only the supported attributes.

Error: Records rolled back
Description: Import runs in multiple batches. Any error in a batch causes all records in that batch to be rolled back. Here's a sample error message, with the resulting rollback records, from the log file:

jobId: 102465|id: 500TextMessage|name: 500 Text Message|resource: productOffering|errorMessage: <Actual Error for this record>|status: FAILED|
The following record(s) were rolled back due to 1 error(s) in the Import Batch sub job:-
jobId: 102465|id: SpotifyMusic|name: Spotify Music|resource: productOffering|status: ROLLBACK|
jobId: 102465|id: ModCaseClearM15M25M35|name: Moderna Case Clear - M15/M25/M35|resource: productOffering|status: ROLLBACK|

Troubleshooting: Check the log file, correct the errors, and rerun the import. When the error is fixed, the rollback records are resolved automatically.

Error: Schema validation failures
Description: The records are validated against the schema for the base resource types of each resource mentioned in the schema template. Error messages appear when the validation fails.
Troubleshooting: Provide the required attributes for all the resources and subresources, in the correct format for attribute values.

Error: Reference ID validation
Description: Any reference to an existing or new top-level catalogManagement resource that isn't provided in the same input file is validated based on the ID value passed in the reference attributes. The error message appears in the following format: The <reference resource name> with id <reference resource id> referenced within <parent resource> with id <parent resource id> and name <parent resource name> doesn't exist in the system.

Any reference to a new top-level catalogManagement resource that's provided in the same input file is supplied with an empty ID value and is validated based on the name value passed in the reference attributes. The error message appears in the following format: The <reference resource name> with name <reference resource name> referenced within <parent resource> with id <parent resource id> and name <parent resource name> isn't present in the input file.

If the mandatory project reference in any of the resources is different from the top-level project in the same input file, the error message appears in the following format: The Project with id <project id> and version <project version> referenced within <parent resource> with name <parent resource name> and id <parent resource id> isn't present in the input file.

Troubleshooting: For an existing or new top-level catalogManagement resource that isn't provided in the same input file, either fix the reference ID or include the top-level record in the same input file. For a new top-level catalogManagement resource that's provided in the same input file, ensure that a unique name value is provided in the top-level and reference resources. If the referred top-level project doesn't exist in the input file as a separate top-level resource, add it; otherwise, correct the incorrect reference to refer to the existing top-level project.

Error: ID null or empty validation
Description: When the ID isn't present in the payload, it's generated from the name in the base item details, stripped of all special characters and spaces so that only letters and numbers remain. For example, for a base item named summer offer 123, the generated ID is summeroffer123.
Troubleshooting: The ID must not be an empty string for a top-level resource.

Error: Name validation
Description: Name is a mandatory and unique attribute for top-level resources; when it's null or empty, an error message appears.
Troubleshooting: Provide a name for all the top-level resources.

Error: Version validation
Description: A version number must be provided for the entities that are imported.
Troubleshooting: When the version is null or empty, a default version 1.0 is created.
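The ID-generation rule described in the table above can be sketched as:

```python
import re

def generate_id(name):
    """Generate an ID from a base item name by dropping everything except
    letters and digits, per the documented rule."""
    return re.sub(r"[^A-Za-z0-9]", "", name)
```

For example, the documented base item named summer offer 123 yields summeroffer123.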

Troubleshoot Catalog Export Errors

Here's how you can troubleshoot export job and validation errors:

Table 5-2 Troubleshoot Catalog Export Errors

Error: Connection failure
Description: The generated log file states the reason for the failure. A connection failure means that the export job should be tried again after some time. In case of a Launch exception, the error is usually related to a specific resource ID that isn't found in the application; this is the most common form of error.
Troubleshooting: In case of a connection failure, retry the export job after some time.

Error: Application error
Description: Application errors, such as database downtime or other issues, can't be predicted. Also, sometimes no error has occurred at all, but the job isn't completed.
Troubleshooting: Check the scheduled processes for the error reason. The job may be waiting to be scheduled, or the error may be caused by an Oracle Enterprise Scheduler failure. To check the details of the export job and sub job logs, go to scheduled processes, find the specific job, and download it. Verify the job log and the sub job IDs.

Migrate Catalog Definitions

Use this topic to understand how you can migrate catalog definitions from one Launch instance to another.

All migrated entities are assigned to a single project to manage the publish process. The migration process is flexible: you can migrate entities individually, or migrate an entity along with its references using the migrate with references option. When you select migrate with references, all the references of the entities are migrated and included as part of the same initiative. You can select any entity separately, or an entire initiative, for migration. When you select an initiative, all its project items are migrated along with it.

You can migrate catalog definitions in the following ways:

  • One at a time: Enables you to migrate one entity at a time. For example, a product specification, product line, simple offer, bundle offer, price list, and so on. However, you must ensure that the referenced entities are either in the same project or already in a lifecycle status of Active or Launched. For example, you could choose to migrate a product specification while ensuring that its dependent service specifications and usage specifications are in place. The same applies to the hierarchy: when migrating a product offer, its associated product specifications must already be in the application.
  • The entire structure in one go, using the migrate with references option: For example, a Package type offer along with its bundles (both commercial and service bundles), simple offers, price lists, terms, product lines, and category associations.

This is the sequence in which you must migrate catalog entities if you migrate them one at a time:

  1. Project
  2. Catalog
  3. Category
  4. Balance Element
  5. Price List
  6. Tax Service Provider
  7. Attributes
  8. Customer Profile Specification
  9. Custom Profile Specification
  10. Service Specification
  11. Usage Specification
  12. Product Specification
  13. Product Line
  14. Product Offer Price
  15. Product Offer
  16. Pricing Logic Algorithm
  17. Pricing Constraints
  18. Product Rule
  19. Promotion
  20. Entitlement
  21. PriceTag

Here are the steps to migrate:

  1. Go to Administration > Job Management > Migration Jobs.
  2. Click Create Migration Job and select launch as the Source.
  3. Select the entity type to migrate.
  4. Optionally, select the migrate with references option.
  5. Add the query parameter and select ID and its value for the entity to be migrated.
  6. Provide your configured source's preselection key in X-Source-Preselection.
  7. Click Submit.

You can now track the migration job from the migration job landing page.

Set Up the Integration

To set up this integration, ensure that you've registered the source Launch instance and validated the integration between the applications.

For more information, see the article CX Industries Framework (Doc ID 2720527.1) on My Oracle Support.