Export data with the Firehose

Fresh Relevance can export real-time ecommerce data through FTP or other methods for your third-party ESP, CRM, or analytics package.

The Firehose module is required to use this feature.

There are several methods by which the data can reach your systems:

  • Secure FTP

  • AWS S3

  • Webhook

Fresh Relevance doesn’t store the data. This is a real-time feed and when it's gone, it's gone.

Best practice is to configure the Firehose for your account as early as possible. This allows you to control how and where the data is sent.

Since October 1st 2020, the system automatically exports Firehose data to a fallback configuration - this is an AWS S3 bucket managed by Fresh Relevance.


Set up your data export

1. Create an export channel

  1. Expand the User menu and go to Settings > Exports.

  2. Select Data Export Channels.

  3. Under Set up a new channel, expand the drop-down menu and select from:

    • Export Firehose to Webhook

    • Export to FTP

    • Export to AWS S3

  4. Select CREATE EXPORT CHANNEL.

  5. Enter the required details for the channel type you have chosen, then select SAVE.

2. Configure the Firehose

Fallback Configuration

The system automatically detects client accounts which have the Firehose enabled but no Firehose channel set up. It configures the Firehose to send data in JSON format to an S3 bucket managed by Fresh Relevance, in a folder for each client account: {website ID}/Firehose.

The configuration process creates the following:

  • A System Firehose Channel configured with the following options: JSON output, catchup on identify, and send non-identified person records.

  • A System Export Channel sending to S3.

  • Signal routing to send all normal signals to the Firehose.

If you later decide that you want the old data sent by the Firehose, TechOps can provide access to your folder within the S3 bucket. Contact Support to arrange this.

This access allows you to download the JSON files. The old data can’t be replayed through the Firehose and is provided as-is, in JSON format, without any field mapping.

Data in the Fresh Relevance managed S3 bucket is retained for 36 months or until your contract with Fresh Relevance ends.
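
If you are granted access to the fallback bucket, a short script can download your folder's JSON files. The following is a minimal sketch using boto3; the bucket name and website ID are placeholders, and the credentials are assumed to come from whatever access TechOps provides for your account.

# Minimal sketch: download fallback Firehose JSON files from the
# Fresh Relevance managed S3 bucket once TechOps has granted access.
# The bucket name and website ID below are placeholders; use the values
# provided by Support for your account.
import boto3

BUCKET = "example-freshrelevance-firehose-fallback"  # placeholder
PREFIX = "w123456/Firehose/"                         # {website ID}/Firehose

s3 = boto3.client("s3")  # credentials come from your environment or profile

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        key = obj["Key"]
        local_name = key.rsplit("/", 1)[-1]  # keep the filename, drop the folder prefix
        if local_name:                       # skip the folder placeholder object, if any
            s3.download_file(BUCKET, key, local_name)
            print("downloaded", key)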

To configure the Firehose:

  1. Go to Data > Firehose.

  2. Select CONFIGURE FIREHOSES.

  3. Expand the Set up a new Firehose drop-down menu and select Send Firehose Data to an Export Channel, then select CREATE NEW FIREHOSE.

  4. Select the Export Channel that you set up in Step 1.

  5. Expand the drop-down menu and select whether or not to use field mapping.

    • Use Field Mapping (CSV: required, JSON: optional).
      Required when CSV is selected in the export configuration. If JSON is selected in the export configuration, mapped fields are exported under a key mapped_fields.

    • Do not use field mapping (JSON only).

      If selected, no field mapping is run for the signals exported. Ensure JSON is selected in the export configuration.
      To configure a new mapping, select Field Mapping.

  6. To receive notification of errors with the export, enter an email address in the Error Alert Email field.

  7. For Firehose Minutes Between Files, enter the number of minutes between uploads from the staging area to the FTP server.

  8. For Filename Pattern, enter the pattern for the exported filenames, for example, firehose-%Y-%m-%d-%H-%M-%S-%f, which produces firehose-YYYY-MM-DD-hh-mm-ss-ffff (see the sketch after this list). Leave blank to use the default format YYYY-MM-DD-hh-mm-ss-ffff.
    Learn more about date formatting in this Python documentation.

  9. To include unidentified records in the export, select Include Anonymous Records.

  10. To pause the export job, select Is Service Paused?.

  11. Select Enable Firehose Catch Up to ensure that signals which have been captured for an anonymous session are sent again when the session is identified.
    By default, only the most recent 30 signals are stored and are available to be sent.

  12. Select Send Person Identified Signals to include Person Identified (pi) signals in the Firehose data.

  13. Select SAVE.
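
The Filename Pattern in step 8 uses Python strftime directives. A quick sketch of how the example pattern expands:

# Quick check of how a Filename Pattern expands. The pattern below is the
# example from step 8; the directives are standard Python strftime codes.
from datetime import datetime, timezone

pattern = "firehose-%Y-%m-%d-%H-%M-%S-%f"
print(datetime.now(timezone.utc).strftime(pattern))
# Produces something like: firehose-2019-04-03-13-22-21-458000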

If you want to export Query Collection Data that came from URL query collection, first check the configuration of Query Collection Fields. To do this, expand the User menu and go to Settings > Website Settings > Query Collection Fields.

If you want to reduce the data volume by exporting fewer rows, for example, if you don't want product browsed data, contact Fresh Relevance Support for assistance.


Firehose destinations

Secure FTP

Fresh Relevance exports real-time data to your FTP server, saving it in new files named with the current date-time. You then configure your third-party system to load the data from there.

For further details, contact the technical support team responsible for your FTP server and the system that you are integrating with.

Fresh Relevance supports three versions: FTP, SFTP, and FTPES. Your technical support team should know which you need.

Once you’ve set up Fresh Relevance to export to your FTP server, wait a while and then look at the data. Each data line should have a tms_type (or type, depending on how you mapped it) with one of the following values:

  • Product Browsed (pd)

  • Product Carted (c)

  • Product Purchased (t)

  • Product Abandoned (b for cart abandon, or ba for browse abandon)

  • Custom Signal (cu)

along with the other fields you chose to map. Complete the Fresh Relevance setup so that it exports the data you want, with the field names you want, before you start loading data into your third-party system.

Learn more in the section Firehose data format.
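
As a quick way to check the exported data, the sketch below tallies the rows in one exported CSV file by tms_type. The filename is a placeholder, and it assumes a CSV export with a header row; change tms_type to type (or whatever name you mapped it to) if needed.

# Count the rows in one exported CSV file by signal type, to confirm the
# expected tms_type values (pd, c, t, b, ba, cu) are arriving.
import csv
from collections import Counter

counts = Counter()
with open("firehose-export.csv", newline="") as f:   # placeholder filename
    for row in csv.DictReader(f):
        counts[row.get("tms_type", "")] += 1

for signal_type, count in counts.most_common():
    print(signal_type or "(blank)", count)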

Send data using JSON files

Data can be sent to the Firehose using JSON files. Here’s a sample file containing a single record for a cart abandon (b) signal:


Sample file

[
  {
    "signal_id": {
      "$oid": "5ca4b38d32b4954a1119bb59"
    },
    "did": "auto90079",
    "pid": "5b5b3924390cf45d2d03a808",
    "cart": {
      "sbr": "test",
      "dont_check_marketing_pressure": true,
      "curr": "GBP",
      "qty": 2,
      "p": [
        {
          "img": "https://s3-eu-west-1.amazonaws.com/tms-demo-store-images/product/test/canon-powershot-a630-8mp-digital-camera-with-4x-optical-zoom-2.jpg",
          "up": "$139.99",
          "uv": 139.99,
          "prid": "canon-powershot-a630-8mp-digital-camera-with-4x-optical-zoom",
          "qty": 1,
          "u": "http://demostore.triggeredmessaging.com/canon-powershot-a630-8mp-digital-camera-with-4x-optical-zoom.html",
          "n": "Shiny Camera (Test Product)",
          "desc": "Lorem ipsum dolor sit amet, consectetur adipiscing elit. Praesent molestie pulvinar tortor a suscipit. Donec iaculis placerat lectus, sed aliquam metus tempor a. Integer non tincidunt nulla. Nam nec nibh."
        },
        {
          "img": "https://s3-eu-west-1.amazonaws.com/tms-demo-store-images/product/test/universal-camera-charger.jpg",
          "up": "$19.00",
          "prid": "universal-camera-charger",
          "qty": 1,
          "u": "http://demostore.triggeredmessaging.com/universal-camera-charger.html",
          "n": "Universal Camera Charger (Test Product)",
          "desc": "Lorem ipsum dolor sit amet, consectetur adipiscing elit. Praesent molestie pulvinar tortor a suscipit. Donec iaculis placerat lectus, sed aliquam metus tempor a. Integer non tincidunt nulla. Nam nec nibh."
        }
      ],
      "dt": {
        "$date": "2019-04-03T13:22:21.458Z"
      },
      "cp": "\u00a3158.99",
      "cv": 158.99
    },
    "dt": {
      "$date": "2019-04-03T13:22:21.527Z"
    },
    "_id": {
      "$oid": "5ca4b38e32b49549cce9bbc3"
    },
    "type": "b",
    "evdt": {
      "$date": "2019-04-03T13:22:21.458Z"
    },
    "unique_id": {
      "$oid": "5ca4b38d32b4954a1119bb59"
    }
  }
]
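
IDs and dates in these files arrive wrapped in extended-JSON objects ($oid and $date, as in the sample). The following is a small sketch, assuming a local copy of such a file, that prints the signal type, event date, and carted products:

# Read a Firehose JSON file shaped like the sample above and print the
# signal type, event date, and carted products.
import json
from datetime import datetime

with open("firehose-sample.json") as f:   # placeholder filename
    records = json.load(f)

for record in records:
    signal_type = record.get("type")                   # e.g. "b" for cart abandon
    event_date = record.get("evdt", {}).get("$date")   # ISO 8601 string
    when = datetime.fromisoformat(event_date.replace("Z", "+00:00")) if event_date else None
    print(signal_type, when)
    for product in record.get("cart", {}).get("p", []):
        print("  ", product.get("prid"), "x", product.get("qty"), product.get("up"))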

Send data using CSV files

To configure the CSV files sent by the Firehose:

  1. Go to Data > Firehose.

  2. Select CONFIGURE FIREHOSES.

  3. To create a new Firehose, choose the Firehose type, then select CREATE NEW FIREHOSE.

  4. To edit an existing field mapping, select the highlighted entry in the Field Mapping column.

  5. Set what fields in the system get loaded to the Firehose by FTP.

The following example is for a FastStats installation, but yours may vary. Learn more in the section Firehose data format.

Records exported by the Firehose may not be in date order, and records from around the same time may be found in more than one file. This is because of the way data is processed by multiple servers.

Also, the order of fields is not fixed and may change at any time without warning, so always import data using the field names in the first row.
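
In practice this means loading the files by column name and, if ordering matters to you, sorting the combined records on the date field after loading. A minimal sketch, assuming a mapped date column called date (use your own field name):

# Load several exported CSV files by column name (never by position) and
# sort the combined rows by date. "date" is a placeholder for whatever
# field name your mapping uses.
import csv
import glob

paths = sorted(glob.glob("firehose-*.csv"))
rows = []
for path in paths:
    with open(path, newline="") as f:
        rows.extend(csv.DictReader(f))   # rows keyed by the header row

rows.sort(key=lambda row: row.get("date", ""))   # ISO date-times sort correctly as strings
print(len(rows), "records loaded from", len(paths), "files")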

Webhooks

The data can also be sent to a Webhook. The JSON structure documented above is sent as the body of the request.
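
As an illustration only (Fresh Relevance doesn't require any particular framework), here is a minimal receiver sketch using Flask. The /firehose path is a placeholder; the POST body is the JSON shown in the sample above, so it parses as a list of signal records.

# Minimal webhook receiver sketch. The /firehose path is a placeholder;
# the request body is the JSON structure documented above.
from flask import Flask, request

app = Flask(__name__)

@app.route("/firehose", methods=["POST"])
def receive_firehose():
    records = request.get_json(force=True)   # a list of signal records
    for record in records:
        print(record.get("type"), record.get("_id", {}).get("$oid"))
    return "", 204   # acknowledge receipt

if __name__ == "__main__":
    app.run(port=8080)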


Firehose data format

There’s a standard file format required for sending browse and transactional data from Fresh Relevance to external systems. This is normally done by uploading files at regular intervals to an FTP server.

Fresh Relevance supports three versions: FTP, SFTP, and FTPES. Your technical support team should know which you need.

Some typical examples of usage are:

  • Exporting browse and transactional data to a CRM platform or ESP for marketing or other communication purposes.

  • Exporting to a marketing analytics system/SCV.

The file export interval can be set to a value in minutes, for example, every 10 minutes or every day. By default, the system uploads a file every 30 minutes.

File format

A succession of flat files is exported, with names corresponding to the date-time.

The files contain CSV data, with one record per line item. For example, if a cart containing five products is purchased, there will be five purchase records, each for one product. Values which don't change between line items are repeated on each record.

All data is optional and can be mapped to field names of your choice.
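
To illustrate the one-record-per-line-item rule, the sketch below flattens a purchase signal (shaped like the JSON sample earlier) into one CSV row per product, repeating the cart-level value on every row. The output field names follow the mapped names mentioned later in this article but are illustrative, not a fixed mapping.

# Illustration of the one-record-per-line-item rule: a purchase signal with
# two products becomes two CSV rows, with cart-level values repeated.
import csv
import sys

signal = {
    "type": "t",
    "cart": {
        "cv": 158.99,
        "p": [
            {"prid": "canon-powershot-a630", "qty": 1, "uv": 139.99},
            {"prid": "universal-camera-charger", "qty": 1, "uv": 19.00},
        ],
    },
}

writer = csv.DictWriter(
    sys.stdout,
    fieldnames=["tms_type", "item_prid", "item_qty", "item_up", "cart_price"],
)
writer.writeheader()
for item in signal["cart"]["p"]:
    writer.writerow({
        "tms_type": signal["type"],
        "item_prid": item["prid"],
        "item_qty": item["qty"],
        "item_up": item["uv"],
        "cart_price": signal["cart"]["cv"],   # repeated on every line item
    })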

Standard signal types

The following types of signals can be enabled as standard for output in Firehose data.

Contact Support to check which signals are being generated in your account and which are enabled for output in Firehose.

Signal Type | Signal Name | Signal Description
pd | Product Browse | Signal fired when a shopper visits a product details page.
c | Cart Visit | Signal fired when a shopper visits the cart page.
p | Checkout Visit | Signal fired when a shopper visits the checkout page.
t | Purchase Complete | Signal fired when a shopper has completed a purchase.
ba | Browse Abandon | Signal fired when a shopper has abandoned a browse session without anything in their cart.
b | Cart Abandon | Signal fired when a shopper has abandoned a cart.
cu | Custom | Custom signal generated for other page types or interactions, for example, a wishlist page.
se | Session Expired | Signal fired when a session expires.
pcv | Price Drop | Signal fired for every person included on a price drop segment build.
pcs | Back in Stock | Signal fired for every person included on a back-in-stock segment build.
pcls | Low Stock | Signal fired when a product is low in stock.
pi | Person Identified | Signal fired when a new person has been identified.
rt | Returned Purchase | Signal fired for any products returned.
cc | Cart Changed | Signal fired whenever someone adds or removes products from their shopping cart.

Attribution signal types

The following types of signals are generated for internal reporting purposes and are not likely to be useful in Firehose data. However, they can be enabled for Firehose output if you want.

Contact Support to enable these.

Signal Type | Signal Name | Signal Description
ia | Internal Attribution | Signal fired when a purchase is attributed to onsite content. Should not normally be sent to Firehose.
ea | External Attribution | Signal fired when a purchase is attributed to offsite content. Should not normally be sent to Firehose.

ESP signal types

The following types of signals can only be generated for supported ESPs. They allow the transfer of email-based events to Fresh Relevance.

Currently only Acoustic's UBX platform is supported. Contact Support if you want to enable these for output in Firehose.

Signal Type | Signal Name | Signal Description
es | Email Send | Signal fired when an external service reports an email send.
eo | Email Open | Signal fired when an external service reports an email open.
ec | Email Click | Signal fired when an external service reports an email clickthrough.
cs | Click Stream | Signal fired when an external service reports an email clickstream event.
email_attachment | Email Attachment | Signal fired when an external service reports a recipient has opened an email attachment.
ebl | Email Block | Signal fired when an external service reports an email has been blocked.
ebo | Email Bounce | Signal fired when an external service reports an email has bounced, including the type (soft/hard).
eco | Email Conversion | Signal fired when an external service reports an email conversion.
eoo | Email Opt-Out | Signal fired when an external service reports a contact has opted out through an email unsubscribe link.
er | Email Restriction | Signal fired when an external service reports an email send has been restricted.
esu | Email Suppressed | Signal fired when an external service reports an email send has been suppressed.
rae | Email Reply Abuse | Signal fired when an external service reports an email has been reported for abuse.
ro | Email Reply Other | Signal fired when an external service reports a generic reply event.

Data

Common data:

  • Email address

  • First name

  • Last name

  • Any available field from the Person record

  • Date

  • Signal ID - the ID unique to each signal. Where multiple products are present in the signal, the same signal ID is used for multiple records (V2 only).

  • Unique ID - the ID unique to each record exported. This is based on the signal ID, but has the row number appended for signals containing multiple products (V2 only).

Additional data for Product Browsed records:

  • Item product ID (item_prid)

  • Item Price (item_up)

  • Regular item price

  • Item tax

  • Comma-delimited list of categories for the product; the field is enclosed in double quotes.

  • Reviews

  • product_opt - Variable data selected by the user, for example, shoe size: product.opt.size = 9.

  • product_ex - Fixed custom product data specific to a client's account, for example, brand (product.ex.brand); it should always be the same value.

Additional data for Carted or Abandoned records:

  • Campaign source

  • Campaign ID

  • Campaign link

  • cart_price

  • Shipping

Additional data for Purchase records:

  • Order date

  • order_id
    The same for all products in an order.

  • Item product ID (item_prid)

  • Item price (item_up)

  • Item name (item_n)

  • Quantity (item_qty)

  • cart_price

  • Shipping

  • Campaign source

  • Campaign ID

  • Campaign link

Additional data for Session Expired (SE) records:

  • Session length in seconds (signal.len_secs)

  • Session first start date/time (signal.fdt)

  • Session expiry date/time (signal.edt)

  • Session last activity date/time (signal.ldt)

  • First URL (signal.first_url)

  • Last URL (signal.last_url)

  • Signals in this session (signals)

  • Number of product details signals in this session (signals.pd|length)

  • Number of purchase complete signals in this session (signals.t|length)

  • Number of custom signals in this session (signals.cu|length)

  • Number of cart abandon signals in this session (signals.b|length)

  • Number of browse abandon signals in this session (signals.ba|length)

  • Number of cart signals in this session (signals.c|length)

Date/times are normally output in ISO format, for example, 2018-02-22 13:23:03. For integration purposes, it's sometimes useful to format the datetime in w3c format, for example, 2018-02-22T13:23:03.123221+00:00.

To do this, apply a filter to the field mapping, like this:

{{ signal.fdt | datetime_w3c }}
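
For reference, the two formats above correspond roughly to the following in Python (a sketch only; the actual datetime_w3c filter runs inside the Fresh Relevance field mapping):

# Rough Python equivalents of the two date/time formats mentioned above.
from datetime import datetime, timezone

dt = datetime(2018, 2, 22, 13, 23, 3, 123221, tzinfo=timezone.utc)

print(dt.strftime("%Y-%m-%d %H:%M:%S"))   # 2018-02-22 13:23:03
print(dt.isoformat())                     # 2018-02-22T13:23:03.123221+00:00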

Product interaction data

Information about the products browsed or clicked during a session is also passed in Session Expired records, but only for channels other than FTP.

Additional data for product interaction:

  • SmartBlock seen (person_product_seen)

  • SmartBlock clicked (person_product_clicked)

These data collections include the product ID, Slot ID, and SmartBlock ID for each product that a visitor has interacted with.

For click data to be successfully exported, you must use the default JSON schema.
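
If you receive Session Expired records as JSON (for example, over the webhook channel), the sketch below prints whatever is present in these two collections. It assumes only that the keys above appear on the record and hold a list of entries; the exact entry layout isn't documented here, so the entries are printed raw.

# Inspect SmartBlock interaction data on a Session Expired record delivered
# as JSON. Assumes the person_product_seen / person_product_clicked keys
# hold lists of entries; entries are printed raw rather than guessing at
# their internal key names.
import json

with open("session-expired-record.json") as f:   # placeholder filename
    record = json.load(f)

for key in ("person_product_seen", "person_product_clicked"):
    for entry in record.get(key, []):
        print(key, entry)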
