Hotel customer data can be ingested from property management systems such as Oracle Opera using Hapi and the Opera Exchange Interface (OXI). This streaming import integration uses Hapi instead of requiring hotels to host their own on-premises server.

Hapi is a data integration platform that streamlines extracting, cleaning, and normalizing event data imported into Treasure Data.


Requirements and Limitations

  • Each request should have a Content-Length of 998 KB or less (see the sketch after this list).

  • There is a limit of 2 Source creations per account per connector type. 

  • Duplicate events are allowed.

  • A database must be created before creating a new Source.

  • At least one table must be created before creating a new Source. Any additional tables needed must be created before they can be used in an endpoint.

  • The expected target latency for data to be available in Plazma is between 5 and 10 minutes.
  • TD users are required to use a write-only API key to create a Hapi Streaming Connector.
  • There is a delay of a few seconds before a newly created Source starts streaming data.
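
As a point of reference for the request-size limit above, here is a minimal Python sketch (illustration only, not part of the integration itself) that checks whether a JSON-serialized event fits within 998 KB before it is sent. The sample event fields are hypothetical, and KB is interpreted as 1024 bytes.

import json

MAX_CONTENT_LENGTH = 998 * 1024  # 998 KB request-body limit (KB taken as 1024 bytes)

def within_size_limit(event: dict) -> bool:
    """Return True if the JSON-serialized event fits within the request size limit."""
    body = json.dumps(event).encode("utf-8")
    return len(body) <= MAX_CONTENT_LENGTH

# Hypothetical reservation event used only for illustration.
sample_event = {"reservation_id": "R-12345", "hotel_code": "HOTEL01", "status": "CONFIRMED"}
print(within_size_limit(sample_event))  # True for small events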

Import from Hapi via TD Console

Create an Authentication

Your first step is to create a new authentication with a set of credentials.

1. Select Integrations Hub.
2. Select Catalog.
3. Search for your integration in the Catalog, hover your mouse over the icon, and select Create Authentication.
4. The New Authentication modal opens.
5. Ensure that the Credentials tab is selected, and then enter the credential information for the integration.

New Authentication Fields

  • Api Id (required): The Api Id for this connection.
  • TD Api Key (required): The TD write-only API key used for access to the database. See Getting Your API Keys.

6. Select Continue.
7. Enter a name for your authentication, and select Done. 

Transfer Your Data to Treasure Data

1. Open TD Console.
2. Navigate to Integrations Hub > Authentications.
3. Locate your new authentication and select New Source.

Connection

  • Data Transfer Name: The name you define for your transfer.
  • Authentication: The name of the authentication that is used for the transfer.

1. Type a source name in the Data Transfer Name field.
2. Select Next.

Define Data Settings

1. The Create Source page opens with the Data Settings tab selected.
2. Edit the following parameters:

  • Datastore: Plazma is the available option.
  • Tags (optional): Tags can be used to find this source.
  • Database: Specify the database within Treasure Data into which you want to import the data.
  • Table: Specify the table within the database where you would like the data placed.

3. Select Create.
4. Create the tables needed for each endpoint from Hapi (see the sketch after this list). Examples are:
  • reservation
  • inventory
  • profile
  • block
  • room_stay
  • rate
  • guest
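
The database and endpoint tables can also be created programmatically instead of through the console. Below is a minimal sketch using the td-client Python library (tdclient); the database name td_hapi_demo and the TD_API_KEY environment variable are assumptions, the key must be allowed to create databases and tables, and the method names should be verified against your installed td-client version.

import os

import tdclient

DATABASE = "td_hapi_demo"  # assumed name; use the database selected in Data Settings
# Example endpoint tables from the list above.
TABLES = ["reservation", "inventory", "profile", "block", "room_stay", "rate", "guest"]

with tdclient.Client(apikey=os.environ["TD_API_KEY"]) as client:
    try:
        client.create_database(DATABASE)
    except tdclient.errors.AlreadyExistsError:
        pass  # the database already exists
    for table in TABLES:
        try:
            client.create_log_table(DATABASE, table)
        except tdclient.errors.AlreadyExistsError:
            pass  # the table already exists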


Copy the Source Id

The Source Id (a UUID v4) is issued when a Source is created. To prevent misuse, do not disclose the Source Id to unauthorized persons. You need the Source Id to register the endpoint with Hapi.

  1. After creating the Source, you are automatically taken to the Sources listing page.

  2. Search for the source you created.

  3. Click "..." in the same row and select Copy Unique ID. This Unique ID is the Source Id required when registering an endpoint with Hapi.
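
If you want a quick sanity check that the value you copied parses as the Source Id (a UUID v4) rather than some other identifier, the short sketch below can be used; the example value is hypothetical.

import uuid

# Hypothetical value; paste the Unique ID copied from the Sources list here.
source_id = "123e4567-e89b-42d3-a456-426614174000"

parsed = uuid.UUID(source_id)  # raises ValueError if the string is not a valid UUID
print(parsed.version)  # expected: 4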

Register Your Endpoint with Hapi

To complete the endpoint registration, you need the TD Api key (see Getting Your API Keys) and the Source Id (see Copy the Source Id). You need a secure channel with Hapi to transfer these two secrets, using the following example configuration.

  • Replace <TD Api Key> and <Source Id> with your TD Api key and Source Id, respectively.
  • reservation, inventory, profile, block, room_stay, rate, guest are the example tables created in Define Data Settings.
App id: Hapi
Api key: <TD Api Key>
Auth endpoint: https://hapi-in-streaming-development.treasuredata.com/v1/authenticate
Message Type endpoints:
RESERVATION: https://hapi-in-streaming-development.treasuredata.com/v1/task/<Source Id>/table/reservation
INVENTORY: https://hapi-in-streaming-development.treasuredata.com/v1/task/<Source Id>/table/inventory
PROFILE: https://hapi-in-streaming-development.treasuredata.com/v1/task/<Source Id>/table/profile
BLOCK: https://hapi-in-streaming-development.treasuredata.com/v1/task/<Source Id>/table/block
ROOM_STAY: https://hapi-in-streaming-development.treasuredata.com/v1/task/<Source Id>/table/room_stay
RATE: https://hapi-in-streaming-development.treasuredata.com/v1/task/<Source Id>/table/rate
GUEST: https://hapi-in-streaming-development.treasuredata.com/v1/task/<Source Id>/table/guest
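
The configuration above can also be assembled programmatically so that the secrets are never pasted by hand, and then shared with Hapi over the secure channel. The sketch below only formats the text; the environment variable names TD_WRITE_ONLY_API_KEY and TD_SOURCE_ID are assumptions, and the endpoint host is taken from the development example above.

import os

# Both values are secrets; share them with Hapi only over a secure channel.
TD_API_KEY = os.environ["TD_WRITE_ONLY_API_KEY"]  # write-only TD API key
SOURCE_ID = os.environ["TD_SOURCE_ID"]            # Unique ID copied from the Sources list

BASE = "https://hapi-in-streaming-development.treasuredata.com/v1"
# Message types mapped to the example tables created in Define Data Settings.
MESSAGE_TYPES = {
    "RESERVATION": "reservation",
    "INVENTORY": "inventory",
    "PROFILE": "profile",
    "BLOCK": "block",
    "ROOM_STAY": "room_stay",
    "RATE": "rate",
    "GUEST": "guest",
}

config_lines = [
    "App id: Hapi",
    f"Api key: {TD_API_KEY}",
    f"Auth endpoint: {BASE}/authenticate",
    "Message Type endpoints:",
]
config_lines += [
    f"{message_type}: {BASE}/task/{SOURCE_ID}/table/{table}"
    for message_type, table in MESSAGE_TYPES.items()
]
print("\n".join(config_lines))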

Data Ingestion

Event data ingestion into TD begins when the first event is triggered after the streaming data source is created.
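
Once events have started flowing (allow for the 5 to 10 minute Plazma latency noted in Requirements and Limitations), you can confirm that data has arrived with a simple count query. Below is a minimal sketch using the td-client Python library; the database name td_hapi_demo, the reservation table, and the TD_API_KEY environment variable (an API key with query permission) are assumptions carried over from the earlier sketches.

import os

import tdclient

DATABASE = "td_hapi_demo"  # assumed database name from Define Data Settings

with tdclient.Client(apikey=os.environ["TD_API_KEY"]) as client:
    # Count the events that have landed in the example reservation table.
    job = client.query(DATABASE, "SELECT COUNT(1) AS events FROM reservation", type="presto")
    job.wait()
    for row in job.result():
        print(row)  # e.g. [42]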

