This PowerShell script allows you to upload files from a folder to Aprimo DAM. You can execute the script manually or schedule it with a service such as Windows Task Scheduler. The script does not run in the Azure environment but on a local server.

For the upload, the script will use the MO REST API (for authentication) and the Aprimo DAM REST API.
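The two calls can be sketched in PowerShell as follows. Note that the token endpoint path, header names, and response property below are placeholders and are not verified against the actual APIs; consult the MO REST API authentication documentation for the exact contract.

```powershell
# Placeholder values corresponding to the app.config keys described below.
$endpointUri      = "https://yourtenant.dam.aprimo.com/api"      # endpointUri (placeholder)
$uploadserviceUri = "https://yourtenant.dam.aprimo.com/uploads"  # uploadserviceUri (placeholder)
$clientId         = "F6ZSR71-H4OD"                               # clientId
$authorization    = "Basic <base64 token>"                       # authorization

# 1. Authenticate against the MO REST API to obtain an access token.
#    The path and the response property name are placeholders.
$token = Invoke-RestMethod -Method Post -Uri "$endpointUri/oauth/create-native-token" `
    -Headers @{ "client-id" = $clientId; "Authorization" = $authorization }

# 2. Upload a file to the Aprimo DAM REST API using the token.
Invoke-RestMethod -Method Post -Uri $uploadserviceUri `
    -Headers @{ "Authorization" = "Bearer $($token.accessToken)" } `
    -InFile "C:\hotfolder\image.jpg" -ContentType "application/octet-stream"
```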

Transfers occur over HTTP and will be limited by the upload bandwidth you have available. If you have concerns about ingesting large sets (>10GB) of assets, please contact your Aprimo representative to discuss ingestion options.


Prerequisites

PowerShell 4 or higher.


Use cases

You can use the script to upload files in bulk to Aprimo DAM. For instance, you could use it for a one-time bulk upload or for automatic uploads from a specific folder.

Bulk ingestion

In situations where a (structured) folder contains a large number of assets, you don’t want to upload the files by hand. Instead, you can execute the script and let it run until all files are uploaded.

Note that files are uploaded one by one. If you need more parallelism, you will need to divide your files over multiple folders and run the script multiple times with different paths.

Additionally, this script only supports adding files as new records. Updating existing records or adding files as new versions is not supported.

Hot folder

You may want to monitor a folder and automatically upload any files or folders that are added to it. In that case, you can set up a scheduled task to run the script every hour (or more often, if preferred). All files that are copied to the folder or any of its subfolders will be uploaded the next time the script runs. Note that the folder could also be on a shared drive in your local network, which allows users to copy files to it.
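One way to set up such a scheduled task is with the built-in ScheduledTasks cmdlets. The paths and parameters below are examples; adjust them to your setup.

```powershell
# Run the hot folder script every hour via Windows Task Scheduler.
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument '-NoProfile -File "C:\scripts\Aprimo.HotFolderService.ps1" -Path "C:\hotfolder" -ActionOnSuccess DeleteFile'
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) `
    -RepetitionInterval (New-TimeSpan -Hours 1)
Register-ScheduledTask -TaskName "Aprimo hot folder upload" -Action $action -Trigger $trigger
```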

Tip: when uploading metadata along with the files, it’s recommended to place the metadata CSV file in the upload folder before copying over the files to be uploaded. Otherwise, the script may start importing files before the metadata file is present and create records without metadata.

See Metadata Upload for more information.


Configuration

In order for the script to work, you have to add the Aprimo DAM URL, the MO client ID, and the authentication token to the app.config file.

The client ID and token must be set up in MO upfront. See MO REST API authentication for more information.

Once you have the necessary information, you can fill out the app.config file:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <appSettings>
    <add key="endpointUri" value="" />
    <add key="uploadserviceUri" value="" />
    <add key="authorization" value="Basic ZHV0Y2htZW46Y2E4KsIeK7KsaWIxNDhjZjg5MmZhZWQ4MmYwNjM5MzE=" />
    <add key="clientId" value="F6ZSR71-H4OD" />
    <add key="logDir" value="C:\scriptlog" />
    <add key="logLevel" value="Info" />
  </appSettings>
</configuration>

The app.config file also contains two keys for logging:

  1. logDir: The destination folder for the log files.
  2. logLevel: The log level. Valid values are None, Error, Warn, Info, Debug.

Script execution

A sample call for the script:

 Aprimo.HotFolderService.ps1 -Path C:\hotfolder -Classification 76f6824d83fc4c7d9440a8630103419c -ClassifySubFolders -ActionOnSuccess DeleteFile -FailedFilesFolder C:\failed -MetaDataFile "C:\HotFolder\metadata.csv" -MetaDataSchemaName "example1"

The Aprimo.HotFolderService.ps1 command accepts the following parameters:

  • Path: the path to the folder to scan
  • Classification: (Optional) the Id (guid) of the root classification to link the new records to, typically a waiting room or temporary classification. If you specify a classification here, make sure it doesn’t overlap with any classifications that are added via the default value of synchronized Classification List fields!
  • ClassifySubFolders: (Optional) add this parameter if you want to create a classification structure based on the folder structure of the path you specified. This structure will be created under the classification you specify, and the new records will be linked to the respective classification.
  • ActionOnSuccess: indicates what should happen to the file after it has been uploaded successfully. Possible values:
    • Nothing: leaves the file in place
    • DeleteFile: deletes the file
    • DeleteBase: deletes all files with the same base name (i.e. filename without extension). This is useful if you provide metadata in separate files, one for each file that should be uploaded. See Metadata Upload for more information.
  • FailedFilesFolder: (Optional) when specified, indicates where to move files that failed to upload
  • MetaDataFile: (Optional) specifies the location of a CSV file that contains the metadata of the files, if the metadata file is not located in the same folder as the files themselves. See Metadata Upload for more information.
  • MetaDataSchemaName: (Optional) the schema to be used for parsing the metadata, if you also want to upload metadata. See Metadata Upload for more information.

How it works

The script works as follows:

  • Upon execution, the script scans one root folder (Path parameter) and all subfolders for files.
  • For every detected file, a record is created
    • in the root classification if the Classification parameter is specified
    • or in a subclassification that matches the file’s parent folder name if the ClassifySubFolders parameter is specified. The subclassification will be created if it doesn’t exist yet.
    • or as a draft record if no Classification parameter is specified. In that case, the ClassifySubFolders parameter is ignored and the draft record will appear on the My uploads page.
  • If the MetaDataFile parameter is specified, Aprimo DAM will try to map the metadata in the specified CSV file to fields/properties on the record. See Metadata Upload for more information.
  • Files that have been processed are removed or stay in the folder (ActionOnSuccess parameter).
  • When a file fails to upload, it stays in the folder by default. It can optionally be moved to a failure folder (FailedFilesFolder parameter).

Metadata upload

The script offers three ways to associate metadata with the files you upload:

  • Pass the path to the metadata file as a parameter when executing the script (see Script execution).
  • Add a metadata file for each file you want to upload. Their base name should be identical.
  • Add one metadata file for all files in a folder.
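A generic metadata file might look like the example below. The column names here are purely illustrative; the actual columns depend on the metadata schema you configure (see Metadata Upload).

```csv
filename,Title,Description
image1.jpg,Summer campaign,Hero shot for the summer campaign
image2.jpg,Summer campaign,Alternate hero shot
```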

The script uses the following logic:

  1. Check if the path of the metadata file is passed as a parameter to the script.
  2. If not, check whether the uploaded file has a metadata file with the same base name.
  3. If not, check whether the folder contains a generic metadata file and whether the filename of the uploaded file appears in it. If more than one generic metadata file is found, no record is created and the file is placed in the Failed Files folder.
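The lookup order above can be sketched as follows. This is a simplified illustration, not the actual implementation: the function name is hypothetical, and the real script additionally checks whether the uploaded file’s name appears inside the generic metadata file.

```powershell
# Hypothetical helper illustrating the metadata resolution order for one file.
function Resolve-MetadataFile([string]$FilePath, [string]$MetaDataFile) {
    # 1. A metadata file passed as a script parameter wins.
    if ($MetaDataFile) { return $MetaDataFile }

    # 2. Otherwise look for a CSV with the same base name next to the file.
    $candidate = [System.IO.Path]::ChangeExtension($FilePath, ".csv")
    if (Test-Path $candidate) { return $candidate }

    # 3. Otherwise look for exactly one generic CSV file in the same folder.
    $generic = @(Get-ChildItem -Path (Split-Path $FilePath) -Filter *.csv)
    if ($generic.Count -gt 1) { throw "More than one generic metadata file found." }
    if ($generic.Count -eq 1) { return $generic[0].FullName }
    return $null
}
```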

See Metadata Upload for more information regarding the format, file type, schema etc. to use for metadata upload.