Create a data feed
When creating a data feed, you provide Adobe with:

- The information about the destination where you want raw data files to be sent
- The data you want to include in each file
Create and configure a data feed
1. Log in to experiencecloud.adobe.com using your Adobe ID credentials.
2. Select the 9-square icon in the upper-right, then select Analytics.
3. In the top navigation bar, go to Admin > Data feeds.
4. Select Add.
   A page displays with three main categories: Feed information, Destination, and Data column definitions.
5. In the Feed information section, complete the following fields:
   | Field | Function |
   | --- | --- |
   | Name | The name of the data feed. Must be unique within the selected report suite, and can be up to 255 characters in length. |
   | Report suite | The report suite the data feed is based on. If multiple data feeds are created for the same report suite, they must have different column definitions. Only source report suites support data feeds; virtual report suites are not supported. |
   | Email when complete | The email address to be notified when a feed finishes processing. The email address must be properly formatted. |
   | Feed interval | Select Daily for backfill or historical data. Daily feeds contain a full day’s worth of data, from midnight to midnight in the report suite’s time zone. Select Hourly for continuing data (Daily is also available for continuing feeds if you prefer). Hourly feeds contain a single hour’s worth of data. |
   | Delay processing | Wait a given amount of time before processing a data feed file. A delay can give offline devices in mobile implementations an opportunity to come online and send data, or accommodate your organization’s server-side processes for managing previously processed files. In most cases, no delay is needed. A feed can be delayed by up to 120 minutes. |
   | Start & end dates | The start date indicates the date when you want the data feed to begin. To immediately begin processing data feeds for historical data, set this date to any date in the past when data was being collected. The start and end dates are based on the report suite’s time zone. |
   | Continuous feed | This checkbox removes the end date, allowing a feed to run indefinitely. When a feed finishes processing historical data, it waits for data to finish collecting for a given hour or day. Once the current hour or day concludes, processing begins after the specified delay. |

6. In the Destination section, in the Type drop-down menu, select the destination where you want the data to be sent.
>[!NOTE]
>Consider the following when configuring a report destination:
>
>- We recommend using a cloud account for your report destination. Legacy FTP and SFTP accounts are available, but are not recommended.
>- Cloud accounts are associated with your Adobe Analytics user account. Other users cannot use or view cloud accounts that you configure.
Use any of the following destination types when creating a data feed. For configuration instructions, expand the destination type. (Additional legacy destinations are also available, but are not recommended.)
Amazon S3

You can send feeds directly to Amazon S3 buckets. This destination type requires only your Amazon S3 account and the location (bucket).
Adobe Analytics uses cross-account authentication to upload files from Adobe Analytics to the specified location in your Amazon S3 instance.
To configure an Amazon S3 bucket as the destination for a data feed:
1. In the Adobe Analytics admin console, in the Destination section, select Amazon S3.
2. Select Select location.
   The Amazon S3 Export Locations page is displayed.
3. (Conditional) If you previously added an Amazon S3 account and location:
   1. Select the account from the Select account drop-down menu.
   2. Select the location from the Select location drop-down menu.
   3. Select Save > Save.
      The destination is now configured to send data to the Amazon S3 location that you specified.
4. (Conditional) If you have not previously added an Amazon S3 account:
   1. Select Add account, then specify the following information:
      | Field | Function |
      | --- | --- |
      | Account name | A name for the account. This can be any name you choose. |
      | Account description | A description for the account. |
      | Role ARN | You must provide a Role ARN (Amazon Resource Name) that Adobe can use to gain access to the Amazon S3 account. To do this, create an IAM permission policy for the source account, attach the policy to a user, and then create a role for the destination account. For specific information, see this AWS documentation. |
      | User ARN | The User ARN (Amazon Resource Name) is provided by Adobe. You must attach this user to the policy you created. |
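The cross-account setup described above hinges on the role trusting the Adobe-provided user. As a hedged sketch only (the account ID and user name below are placeholders, not real Adobe values; use the actual User ARN shown in the Adobe interface), the role’s trust policy might look like this:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::123456789012:user/adobe-data-feeds"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```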
   2. Select Add location, then specify the following information:

      | Field | Function |
      | --- | --- |
      | Name | A name for the location. |
      | Description | A description for the location. |
      | Bucket | The bucket within your Amazon S3 account where you want Adobe Analytics data to be sent. Ensure that the User ARN that was provided by Adobe has the `s3:PutObject` permission in order to upload files to this bucket. This permission allows the User ARN to upload initial files and overwrite files for subsequent uploads. |
      | Prefix | The folder within the bucket where you want to put the data. Specify a folder name, then add a forward slash after the name to create the folder. For example, `folder_name/` |
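To illustrate the `s3:PutObject` requirement above, here is a sketch of a permission policy scoped to the bucket and prefix. The bucket name and prefix are placeholders for your own values, and your security team may want to scope or extend this differently:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::example-feed-bucket/folder_name/*"
    }
  ]
}
```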
   3. Select Create > Save.
      The destination is now configured to send data to the Amazon S3 location that you specified.
Azure RBAC

You can send feeds directly to an Azure container using RBAC authentication. This destination type requires an Application ID, Tenant ID, and Secret.
To configure an Azure RBAC account as the destination for a data feed:
1. If you haven’t already, create an Azure application that Adobe Analytics can use for authentication, then grant access permissions in access control (IAM).
   For information, refer to the Microsoft Azure documentation about how to create an Azure Active Directory application.
2. In the Adobe Analytics admin console, in the Destination section, select Azure RBAC.
3. Select Select location.
   The Azure RBAC Export Locations page is displayed.
4. (Conditional) If you previously added an Azure RBAC account and location:
   1. Select the account from the Select account drop-down menu.
   2. Select the location from the Select location drop-down menu.
   3. Select Save > Save.
      The destination is now configured to send data to the Azure RBAC location that you specified.
5. (Conditional) If you have not previously added an Azure RBAC account:
   1. Select Add account, then specify the following information:
      | Field | Function |
      | --- | --- |
      | Account name | A name for the Azure RBAC account. This name displays in the Select account drop-down field and can be any name you choose. |
      | Account description | A description for the Azure RBAC account. This description displays in the Select account drop-down field. |
      | Application ID | Copy this ID from the Azure application that you created. In Microsoft Azure, this information is located on the Overview tab within your application. For more information, see the Microsoft Azure documentation about how to register an application with the Microsoft identity platform. |
      | Tenant ID | Copy this ID from the Azure application that you created. In Microsoft Azure, this information is located on the Overview tab within your application. For more information, see the Microsoft Azure documentation about how to register an application with the Microsoft identity platform. |
      | Secret | Copy the secret from the Azure application that you created. In Microsoft Azure, this information is located on the Certificates & secrets tab within your application. For more information, see the Microsoft Azure documentation about how to register an application with the Microsoft identity platform. |
   2. Select Add location, then specify the following information:

      | Field | Function |
      | --- | --- |
      | Name | A name for the location. This name displays in the Select location drop-down field and can be any name you choose. |
      | Description | A description for the location. This description displays in the Select location drop-down field. |
      | Account | The Azure storage account. |
      | Container | The container within the account you specified where you want Adobe Analytics data to be sent. Ensure that you grant permissions to upload files to the Azure application that you created earlier: the Application ID that you specified when configuring the Azure RBAC account must be granted the `Storage Blob Data Contributor` role in order to access the container. For more information, see Azure built-in roles. |
      | Prefix | The folder within the container where you want to put the data. Specify a folder name, then add a forward slash after the name to create the folder. For example, `folder_name/` |
   3. Select Create > Save.
      The destination is now configured to send data to the Azure RBAC location that you specified.
Azure SAS

You can send feeds directly to an Azure container using SAS authentication. This destination type requires an Application ID, Tenant ID, Key vault URI, Key vault secret name, and Secret.
To configure Azure SAS as the destination for a data feed:
1. If you haven’t already, create an Azure application that Adobe Analytics can use for authentication.
   For information, refer to the Microsoft Azure documentation about how to create an Azure Active Directory application.
2. In the Adobe Analytics admin console, in the Destination section, select Azure SAS.
3. Select Select location.
   The Azure SAS Export Locations page is displayed.
4. (Conditional) If you previously added an Azure SAS account and location:
   1. Select the account from the Select account drop-down menu.
   2. Select the location from the Select location drop-down menu.
   3. Select Save > Save.
      The destination is now configured to send data to the Azure SAS location that you specified.
5. (Conditional) If you have not previously added an Azure SAS account:
   1. Select Add account, then specify the following information:
      | Field | Function |
      | --- | --- |
      | Account name | A name for the Azure SAS account. This name displays in the Select account drop-down field and can be any name you choose. |
      | Account description | A description for the Azure SAS account. This description displays in the Select account drop-down field. |
      | Application ID | Copy this ID from the Azure application that you created. In Microsoft Azure, this information is located on the Overview tab within your application. For more information, see the Microsoft Azure documentation about how to register an application with the Microsoft identity platform. |
      | Tenant ID | Copy this ID from the Azure application that you created. In Microsoft Azure, this information is located on the Overview tab within your application. For more information, see the Microsoft Azure documentation about how to register an application with the Microsoft identity platform. |
      | Key vault URI | The path to the SAS token in Azure Key Vault. To configure Azure SAS, you need to store a SAS token as a secret using Azure Key Vault. For information, see the Microsoft Azure documentation about how to set and retrieve a secret from Azure Key Vault.<br><br>After the key vault URI is created:<br>1. Add an access policy on the Key Vault in order to grant permission to the Azure application that you created.<br>2. Make sure the Application ID has been granted the `Key Vault Certificate User` built-in role in order to access the key vault URI. For more information, see Azure built-in roles.<br><br>For information about access policies, see the Microsoft Azure documentation about how to assign a Key Vault access policy. |
      | Key vault secret name | The secret name you created when adding the secret to Azure Key Vault. In Microsoft Azure, this information is located in the Key Vault you created, on the Key Vault settings pages. For information, see the Microsoft Azure documentation about how to set and retrieve a secret from Azure Key Vault. |
      | Secret | Copy the secret from the Azure application that you created. In Microsoft Azure, this information is located on the Certificates & secrets tab within your application. For more information, see the Microsoft Azure documentation about how to register an application with the Microsoft identity platform. |
   2. Select Add location, then specify the following information:

      | Field | Function |
      | --- | --- |
      | Name | A name for the location. This name displays in the Select location drop-down field and can be any name you choose. |
      | Description | A description for the location. This description displays in the Select location drop-down field. |
      | Container | The container within the account you specified where you want Adobe Analytics data to be sent. Make sure that the SAS token you stored in the secret specified in the Key vault secret name field has the `Write` permission, which allows the SAS token to create files in your Azure container. If you want the SAS token to also overwrite files, make sure that it has the `Delete` permission. For more information, see Blob storage resources in the Azure Blob Storage documentation. |
      | Prefix | The folder within the container where you want to put the data. Specify a folder name, then add a forward slash after the name to create the folder. For example, `folder_name/` |
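For orientation, a container-scoped SAS URL carrying Write and Delete permissions typically includes `sp=wd` in its query string. The following is a hedged sketch with placeholder values in angle brackets; the exact parameters (service version, expiry, signature) depend on how you generate the token in Azure:

```text
https://<storage-account>.blob.core.windows.net/<container>?sv=2022-11-02&sr=c&sp=wd&se=2026-01-01T00:00:00Z&sig=<signature>
```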
   3. Select Create > Save.
      The destination is now configured to send data to the Azure SAS location that you specified.
Google Cloud Platform

You can send feeds directly to Google Cloud Platform (GCP) buckets. This destination type requires only your GCP account name and the location (bucket) name.
Adobe Analytics uses cross-account authentication to upload files from Adobe Analytics to the specified location in your GCP instance.
To configure a GCP bucket as the destination for a data feed:
1. In the Adobe Analytics admin console, in the Destination section, select Google Cloud Platform.
2. Select Select location.
   The GCP Export Locations page is displayed.
3. (Conditional) If you previously added a GCP account and location:
   1. Select the account from the Select account drop-down menu.
   2. Select the location from the Select location drop-down menu.
   3. Select Save > Save.
      The destination is now configured to send data to the GCP location that you specified.
4. (Conditional) If you have not previously added a GCP account:
   1. Select Add account, then specify the following information:
      | Field | Function |
      | --- | --- |
      | Account name | A name for the account. This can be any name you choose. |
      | Account description | A description for the account. |
      | Project ID | Your Google Cloud project ID. See the Google Cloud documentation about getting a project ID. |
   2. Select Add location, then specify the following information:

      | Field | Function |
      | --- | --- |
      | Principal | The Principal is provided by Adobe. You must grant this principal permission to receive feeds. |
      | Name | A name for the location. |
      | Description | A description for the location. |
      | Bucket | The bucket within your GCP account where you want Adobe Analytics data to be sent. Ensure that you have granted either of the following roles to the Principal provided by Adobe:<br>- `roles/storage.objectCreator`: Use this role if you want to limit the Principal to only creating files in your GCP account. Important: If you use this role with scheduled reporting, you must use a unique file name for each new scheduled export. Otherwise, the report generation fails because the Principal does not have access to overwrite existing files.<br>- (Recommended) `roles/storage.objectUser`: Use this role if you want the Principal to have access to view, list, update, and delete files in your GCP account. This role allows the Principal to overwrite existing files for subsequent uploads, without the need to generate unique file names for each new scheduled export.<br><br>For information about granting permissions, see Add a principal to a bucket-level policy in the Google Cloud documentation. |
      | Prefix | The folder within the bucket where you want to put the data. Specify a folder name, then add a forward slash after the name to create the folder. For example, `folder_name/` |
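As a sketch of the recommended binding above, a bucket-level IAM policy fragment granting `roles/storage.objectUser` might look like the following. The member shown is a placeholder; the actual Principal is the one Adobe displays in the interface:

```json
{
  "bindings": [
    {
      "role": "roles/storage.objectUser",
      "members": [
        "serviceAccount:adobe-provided-principal@example-project.iam.gserviceaccount.com"
      ]
    }
  ]
}
```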
   3. Select Create > Save.
      The destination is now configured to send data to the GCP location that you specified.

7. In the Data column definitions section, select the latest All Adobe Columns template in the drop-down menu, then complete the following fields:
   | Field | Function |
   | --- | --- |
   | Remove escaped characters | When collecting data, some characters (such as newlines) can cause issues. Check this box if you would like these characters removed from feed files. |
   | Compression format | The type of compression used. Gzip outputs files in `.tar.gz` format. Zip outputs files in `.zip` format. |
   | Packaging type | Select Multiple files for most data feeds. This option paginates your data into uncompressed 2 GB chunks. (If Multiple files is selected and uncompressed data for the reporting window is less than 2 GB, one file is sent.) Selecting Single file outputs the `hit_data.tsv` file in a single, potentially massive file. |
   | Manifest | Whether Adobe should deliver a manifest file to the destination when no data is collected for a feed interval. If you select Manifest File, you receive a manifest file similar to the following when no data is collected:<br>`Datafeed-Manifest-Version: 1.0`<br>`Lookup-Files: 0`<br>`Data-Files: 0`<br>`Total-Records: 0` |
   | Column templates | When creating many data feeds, Adobe recommends creating a column template. Selecting a column template automatically includes the specified columns in the template. Adobe also provides several templates by default. |
   | Available columns | All available data columns in Adobe Analytics. Click Add all to include all columns in a data feed. |
   | Included columns | The columns to include in a data feed. Click Remove all to remove all columns from a data feed. |
   | Download CSV | Downloads a CSV file containing all included columns. |
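The manifest sample above uses a simple `Key: value` line format. Here is a minimal sketch of parsing it, for example to detect an empty delivery interval; the function name is illustrative, not part of any Adobe SDK:

```python
def parse_manifest(text: str) -> dict:
    """Parse a data feed manifest's 'Key: value' lines into a dict of strings."""
    fields = {}
    for line in text.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")  # split on the first colon only
            fields[key.strip()] = value.strip()
    return fields

# The empty-interval manifest shown in the table above.
sample = """Datafeed-Manifest-Version: 1.0
Lookup-Files: 0
Data-Files: 0
Total-Records: 0"""

manifest = parse_manifest(sample)
# An empty interval: no data files and no records were delivered.
is_empty = int(manifest["Data-Files"]) == 0 and int(manifest["Total-Records"]) == 0
print(is_empty)  # True
```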
8. Select Save in the top-right.
Historical data processing begins immediately. When data finishes processing for a day, the file is sent to the destination that you configured.
For information about how to access the data feed and to get a better understanding of its contents, see Data feed contents - overview.
Legacy destinations

The following sections provide configuration details for each of the legacy destinations:
FTP
Data feed data can be delivered to an Adobe or customer-hosted FTP location. Requires an FTP host, username, and password. Use the path field to place feed files in a folder. Folders must already exist; feeds throw an error if the specified path does not exist.
Use the following information when completing the available fields:
- Host: Enter the desired FTP destination URL. For example, ftp://ftp.omniture.com.
- Path: Can be left blank.
- Username: Enter the username to log in to the FTP site.
- Password and confirm password: Enter the password to log in to the FTP site.
SFTP
SFTP support for data feeds is available. Requires an SFTP host, username, and the destination site to contain a valid RSA or DSA public key. You can download the appropriate public key when creating the feed.
S3
You can send feeds directly to Amazon S3 buckets. This destination type requires a Bucket name, an Access Key ID, and a Secret Key. See Amazon S3 bucket naming requirements within the Amazon S3 docs for more information.
The user you provide for uploading data feeds must have the following permissions:
- `s3:GetObject`
- `s3:PutObject`
- `s3:PutObjectAcl`
>[!NOTE]
>For each upload to an Amazon S3 bucket, Analytics adds the bucket owner to the BucketOwnerFullControl ACL, whether or not the bucket has a policy that requires it. For more information, see “What is the BucketOwnerFullControl setting for Amazon S3 data feeds?”
The following 16 standard AWS regions are supported (using the appropriate signature algorithm where necessary):
- us-east-2
- us-east-1
- us-west-1
- us-west-2
- ap-south-1
- ap-northeast-2
- ap-southeast-1
- ap-southeast-2
- ap-northeast-1
- ca-central-1
- eu-central-1
- eu-west-1
- eu-west-2
- eu-west-3
- eu-north-1
- sa-east-1
Azure Blob
Data feeds support Azure Blob destinations. Requires a container, account, and a key. Azure automatically encrypts the data at rest. When you download the data, it is decrypted automatically. See Create a storage account within the Microsoft Azure docs for more information.