A workflow that exports data from ICM and makes it available to SPARQUE AI is a frequently needed configuration. This guide shows one way to automate this workflow. The following steps are necessary (find more detailed instructions for these steps in the main part of this document; the attached ZIP file sparqueExportAndProcessC… contains the files mentioned in this guide):
create a product export job
create a catalog export job (for each catalog)
create a file destination in Azure DevOps
create a transport configuration
create a process chain for automatic export
create all job configurations via DBInit
configure SPARQUE to read from export destination
For background information, see Concept - Intershop-SPARQUE Integration.
To automate the export of all products, a job to run the product export needs to be created.
# Name of job configuration
RunProductExport.Name=RunProductExport
RunProductExport.Description=RunProductExport
#RunProductExport.Date=2010.11.01 at 00:00:00
#RunProductExport.Interval=1440
RunProductExport.PipelineName=ProcessImpexJob
RunProductExport.PipelineStartNode=Start
RunProductExport.EnableJob=true
RunProductExport.ApplicationSite=inSPIRED-Site
RunProductExport.ApplicationURLIdentifier=inTRONICS
# add custom attributes (keypair with AttributeName<Number> = AttributeValue<Number>)
RunProductExport.AttributeName1=DomainName
RunProductExport.AttributeValue1=inSPIRED-inTRONICS
RunProductExport.AttributeName2=ExportDirectory
RunProductExport.AttributeValue2=sparque
RunProductExport.AttributeName3=JobName
RunProductExport.AttributeValue3=ProcessCatalogImpex
RunProductExport.AttributeName4=ProcessPipelineName
RunProductExport.AttributeValue4=ProcessProductExport
RunProductExport.AttributeName5=ProcessPipelineStartNode
RunProductExport.AttributeValue5=Export
RunProductExport.AttributeName6=SelectedFile
RunProductExport.AttributeValue6=exportFromProcessChain.xml
Catalogs need to be exported separately, one export per catalog. This can also be done via a job configuration similar to the following:
# Name of job configuration
RunCatalogCamerasExport.Name=RunCatalogCamerasExport
RunCatalogCamerasExport.Description=RunCatalogCamerasExport
#RunCatalogCamerasExport.Date=2010.11.01 at 00:00:00
#RunCatalogCamerasExport.Interval=1440
RunCatalogCamerasExport.PipelineName=ProcessImpexJob
RunCatalogCamerasExport.PipelineStartNode=Start
RunCatalogCamerasExport.EnableJob=true
RunCatalogCamerasExport.ApplicationSite=inSPIRED-Site
RunCatalogCamerasExport.ApplicationURLIdentifier=inTRONICS
# add custom attributes (keypair with AttributeName<Number> = AttributeValue<Number>)
RunCatalogCamerasExport.AttributeName1=DomainName
RunCatalogCamerasExport.AttributeValue1=inSPIRED-inTRONICS
RunCatalogCamerasExport.AttributeName2=ExportDirectory
RunCatalogCamerasExport.AttributeValue2=sparque
RunCatalogCamerasExport.AttributeName3=CatalogID
RunCatalogCamerasExport.AttributeValue3=Cameras-Camcorders
RunCatalogCamerasExport.AttributeName4=ProcessPipelineName
RunCatalogCamerasExport.AttributeValue4=ProcessCatalogExport
RunCatalogCamerasExport.AttributeName5=ProcessPipelineStartNode
RunCatalogCamerasExport.AttributeValue5=Export
RunCatalogCamerasExport.AttributeName6=SelectedFile
RunCatalogCamerasExport.AttributeValue6=exportCameras.xml
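Each additional catalog gets its own job configuration of the same shape, with a unique configuration name and the respective CatalogID. A minimal sketch for a hypothetical Specials catalog (the CatalogID and file name are assumptions and must match the actual catalog):

# analogous to RunCatalogCamerasExport above; Description/Date/Interval omitted for brevity
RunCatalogSpecialsExport.Name=RunCatalogSpecialsExport
RunCatalogSpecialsExport.PipelineName=ProcessImpexJob
RunCatalogSpecialsExport.PipelineStartNode=Start
RunCatalogSpecialsExport.EnableJob=true
RunCatalogSpecialsExport.ApplicationSite=inSPIRED-Site
RunCatalogSpecialsExport.ApplicationURLIdentifier=inTRONICS
RunCatalogSpecialsExport.AttributeName1=DomainName
RunCatalogSpecialsExport.AttributeValue1=inSPIRED-inTRONICS
RunCatalogSpecialsExport.AttributeName2=ExportDirectory
RunCatalogSpecialsExport.AttributeValue2=sparque
RunCatalogSpecialsExport.AttributeName3=CatalogID
RunCatalogSpecialsExport.AttributeValue3=Specials
RunCatalogSpecialsExport.AttributeName4=ProcessPipelineName
RunCatalogSpecialsExport.AttributeValue4=ProcessCatalogExport
RunCatalogSpecialsExport.AttributeName5=ProcessPipelineStartNode
RunCatalogSpecialsExport.AttributeValue5=Export
RunCatalogSpecialsExport.AttributeName6=SelectedFile
RunCatalogSpecialsExport.AttributeValue6=exportSpecials.xml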
Go to https://portal.azure.com.
Create a storage account or use an existing one.
Create a new file share; in this example it is called sparque.
Create an access key; it will be needed in the next step.
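These portal steps can also be scripted. A minimal sketch using the Azure CLI, assuming an existing resource group (the storage account and resource group names are placeholders):

# create the storage account
az storage account create --name <storageaccount> --resource-group <resourcegroup> --sku Standard_LRS
# create the file share used as export destination
az storage share create --name sparque --account-name <storageaccount>
# list the access keys; one of them goes into the transport configuration below
az storage account keys list --account-name <storageaccount> --resource-group <resourcegroup>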
For the full transport, a transport configuration like the following needs to be created:
domain=inSPIRED-inTRONICS
process.id=SparqueTransport
process.displayname=SparqueTransport
process.type=AZURE
location.local=<path to shared file system>/sites/inSPIRED-inTRONICS-Site/units/inSPIRED-inTRONICS/impex/export/sparque
account.key=<previously created access key>
account.name=<storage account name>
file.share=<previously created file share, e.g. sparque>
process.direction=PUSH
process.delete=0
The transport can be automated via a job as well:
ExecuteSparqueTransport.Name=ExecuteSparqueTransport
ExecuteSparqueTransport.Description=ExecuteSparqueTransport
#ExecuteSparqueTransport.Date=2010.11.01 at 00:00:00
#ExecuteSparqueTransport.Interval=1440
ExecuteSparqueTransport.PipelineName=FileTransportJob
ExecuteSparqueTransport.PipelineStartNode=Start
ExecuteSparqueTransport.EnableJob=true
# add custom attributes (keypair with AttributeName<Number> = AttributeValue<Number>)
ExecuteSparqueTransport.AttributeName1=TransportProcessID
ExecuteSparqueTransport.AttributeValue1=SparqueTransport
The process chain combines all previous export jobs and the transport job. The timeouts should be adjusted per project. Also, depending on the number of products and categories, the exports may run faster if executed concurrently (see the sketch after the following chain definition). Find more details about all options of process chains here: Concept - Process Chains (valid to 11.x).
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<chain xmlns="http://www.intershop.com/xml/ns/enfinity/6.4/core/processchain" name="Chain 1" timeout="90">
  <sequence name="Chain 1.1 - Sequence" timeout="90">
    <job job="RunProductExport" domain="inSPIRED-inTRONICS" name="Chain 1.1.1 - Job" timeout="60"/>
    <job job="RunCatalogCamerasExport" domain="inSPIRED-inTRONICS" name="Chain 1.1.2 - Job" timeout="60"/>
    <!-- more catalog exports, e.g.
    <job job="RunCatalogSpecialsExport" domain="inSPIRED-inTRONICS" name="Chain 1.1.3 - Job" timeout="60"/>
    -->
    <job job="ExecuteSparqueTransport" domain="inSPIRED-inTRONICS" name="Chain 1.1.4 - Job" timeout="30"/>
  </sequence>
</chain>
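If the exports should run concurrently, as mentioned above, the exports can be wrapped in a concurrent block inside the sequence, so that only the transport waits for all of them. A sketch, assuming the concurrent element described in Concept - Process Chains (names and timeouts are placeholders):

<sequence name="Chain 1.1 - Sequence" timeout="90">
  <!-- all exports run in parallel; the block finishes when the last export is done -->
  <concurrent name="Chain 1.1.1 - Concurrent Exports" timeout="60">
    <job job="RunProductExport" domain="inSPIRED-inTRONICS" name="Product Export" timeout="60"/>
    <job job="RunCatalogCamerasExport" domain="inSPIRED-inTRONICS" name="Cameras Export" timeout="60"/>
  </concurrent>
  <!-- the transport starts only after all exports have finished -->
  <job job="ExecuteSparqueTransport" domain="inSPIRED-inTRONICS" name="Transport" timeout="30"/>
</sequence>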
A process chain can be triggered manually in the back office; for full automation, create a job configuration for this step as well.
# Name of job configuration
ExecuteSparqueProcessChain.Name=ExecuteSparqueProcessChain
ExecuteSparqueProcessChain.Description=ExecuteSparqueProcessChain
#ExecuteSparqueProcessChain.Date=2010.11.01 at 00:00:00
#ExecuteSparqueProcessChain.Interval=1440
ExecuteSparqueProcessChain.PipelineName=ExecuteProcessChain
ExecuteSparqueProcessChain.PipelineStartNode=Start
ExecuteSparqueProcessChain.EnableJob=true
# add custom attributes (keypair with AttributeName<Number> = AttributeValue<Number>)
ExecuteSparqueProcessChain.AttributeName1=XmlFileName
ExecuteSparqueProcessChain.AttributeValue1=system\\config\\cluster\\ExportAndTransportProducts.xml
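To run the whole chain on a schedule instead of on demand, the commented Date/Interval attributes can be enabled. The interval is given in minutes, so 1440 corresponds to a daily run; the start date below is only an example:

# run the chain every night at 02:00, starting from the given date (example values)
ExecuteSparqueProcessChain.Date=2024.01.01 at 02:00:00
ExecuteSparqueProcessChain.Interval=1440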
Job configurations and transport configurations can easily be created during DBInit with the help of the preparers PrepareTransportConfiguration and PrepareJobConfigurations:
Class1500 = com.intershop.component.transport.dbinit.PrepareTransportConfiguration \
            com.intershop.demo.responsive.dbinit.data.job.TransportConfiguration
Class1505 = com.intershop.beehive.core.dbinit.preparer.job.PrepareJobConfigurations \
            inSPIRED-inTRONICS \
            com.intershop.demo.responsive.dbinit.data.job.JobConfigurations
After running the export jobs and the transport job, the exported files are available in the created file share. SPARQUE.AI can use this file share as the basis for a dataset. To use this function, configure a dataset source of type Fetch a file from URL and enter the path to the file share together with the access key. SPARQUE.AI can then fetch data from that source.
Example: https://<storageaccount>.file.core.windows.net/<filesharename>/<exportfile>?sv=<access key>
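A query string starting with sv= is typically a shared access signature (SAS) token. If a SAS is used instead of the plain account key, it can, for example, be generated with the Azure CLI; a sketch with placeholder names and an assumed expiry date:

# generate a read/list SAS token for the sparque file share (expiry date is an example)
az storage share generate-sas --name sparque --account-name <storageaccount> --permissions rl --expiry 2026-12-31 --https-only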