Batch upload

The batch upload flow consists of two steps:

  1. Create a table schema

  2. Upload data as rows that match the created table schema

Create table schema

Method: POST

Path:

https://[client].datainsider.co/api/databases/[client]_db/tables

Parameter | Type | Description
tbl_name | string | name of the table
db_name | string | name of the database that contains the table
api_key | string | API key that has database permission
columns | array<column> | columns of the table. Each column carries a class_name field, an enum that specifies the column's data type. Supported data types: bool, int32, uint32, int64, uint64, string, double, datetime.

Sample request:

curl --request POST \
  --url https://[client].datainsider.co/api/databases/<db_name>/tables \
  --header 'Content-Type: application/json' \
  --data '{
    "api_key": "cccccccc-14a1-4eb1-8964-000000000000",
    "tbl_name": "user_transactions",
    "columns": [
      {
        "class_name": "uint64",
        "name": "trans_id",
        "display_name": "Transaction Id",
        "is_nullable": true
      },
      {
        "class_name": "string",
        "name": "from_user",
        "display_name": "From User",
        "is_nullable": true
      },
      {
        "class_name": "string",
        "name": "to_user",
        "display_name": "To User",
        "is_nullable": true
      },
      {
        "class_name": "double",
        "name": "amount",
        "display_name": "Amount",
        "is_nullable": true
      },
      {
        "class_name": "datetime",
        "name": "at_time",
        "display_name": "At Time",
        "is_nullable": true
      },
      {
        "class_name": "bool",
        "name": "is_success",
        "display_name": "Is Success",
        "is_nullable": true
      }
    ]
}'

Sample response:

{
  "name": "user_transactions",
  "db_name": "ingestion",
  "organization_id": 0,
  "display_name": "user_transactions",
  "columns": [
    {
      "class_name": "uint64",
      "name": "trans_id",
      "display_name": "Transaction Id",
      "is_nullable": true,
      "is_encrypted": false
    },
    {
      "class_name": "string",
      "name": "from_user",
      "display_name": "From User",
      "is_nullable": true,
      "is_encrypted": false
    },
    {
      "class_name": "string",
      "name": "to_user",
      "display_name": "To User",
      "is_nullable": true,
      "is_encrypted": false
    },
    {
      "class_name": "double",
      "name": "amount",
      "display_name": "Amount",
      "is_nullable": true,
      "is_encrypted": false
    },
    {
      "class_name": "datetime",
      "name": "at_time",
      "display_name": "At Time",
      "input_as_timestamp": false,
      "input_formats": [],
      "is_nullable": true,
      "is_encrypted": false
    },
    {
      "class_name": "bool",
      "name": "is_success",
      "display_name": "Is Success",
      "is_nullable": true,
      "is_encrypted": false
    }
  ],
  "primary_keys": [],
  "partition_by": [],
  "order_bys": [],
  "table_type": "default",
  "temporary": false
}
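
The same request can also be sent from code instead of curl. The snippet below is a minimal, unofficial sketch assuming the Python requests library; the endpoint, api_key field, and column definitions come from the sample above, while the create_table helper, the placeholder host, and the error handling are illustrative assumptions, not part of the API.

import requests

BASE_URL = "https://[client].datainsider.co"      # placeholder host, replace [client]
DB_NAME = "[client]_db"                           # database that will contain the table
API_KEY = "cccccccc-14a1-4eb1-8964-000000000000"  # API key that has database permission

def create_table(tbl_name, columns):
    # Hypothetical helper: POST the table schema and return the created schema as JSON.
    resp = requests.post(
        f"{BASE_URL}/api/databases/{DB_NAME}/tables",
        headers={"Content-Type": "application/json"},
        json={"api_key": API_KEY, "tbl_name": tbl_name, "columns": columns},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

columns = [
    {"class_name": "uint64", "name": "trans_id", "display_name": "Transaction Id", "is_nullable": True},
    {"class_name": "string", "name": "from_user", "display_name": "From User", "is_nullable": True},
    {"class_name": "string", "name": "to_user", "display_name": "To User", "is_nullable": True},
    {"class_name": "double", "name": "amount", "display_name": "Amount", "is_nullable": True},
    {"class_name": "datetime", "name": "at_time", "display_name": "At Time", "is_nullable": True},
    {"class_name": "bool", "name": "is_success", "display_name": "Is Success", "is_nullable": True},
]
print(create_table("user_transactions", columns))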

Batch upload data

Method: POST

Path:

https://[client].datainsider.co/api/ingestion/batch

Parameter | Type | Description
tbl_name | string | name of the table
db_name | string | name of the database that contains the table
api_key | string | API key that has database permission
records | array<array<object>> | data as a JSON array of rows; each row is an array of values in the same order as the columns of the created table schema. Nullable fields may be null.

Sample request:

curl --request POST \
  --url https://[client].datainsider.co/api/ingestion/batch \
  --header 'Content-Type: application/json' \
  --data '{
    "api_key": "cccccccc-14a1-4eb1-8964-000000000000",
    "db_name": "ingestion",
    "tbl_name": "user_transactions",
    "records": [
      [1, "John", "Marry", 123.45, "2020-01-08 00:01:00", true],
      [2, "John", "Alex", 768.90, "2020-01-09 12:30:30", false],
      [3, "Vi", "Tom", 123123.2, "2020-01-11 12:59:59", true],
      [4, "Thuan", "Harley", 2134.0, "2020-01-11 23:00:06", false]
    ]
}'

Sample response:

{
  "total_records": 4,
  "total_invalid_records": 0,
  "total_invalid_fields": 0,
  "total_skipped_records": 0,
  "total_inserted_records": 4,
  "total_failed_records": 0
}
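
The batch call can likewise be issued programmatically. The sketch below is an unofficial illustration assuming the Python requests library; the /api/ingestion/batch endpoint, the request parameters, and the response counters are taken from this page, while the upload_batch helper, the placeholder host, and the example rows (including None values for nullable columns) are assumptions for demonstration.

import requests

BASE_URL = "https://[client].datainsider.co"      # placeholder host, replace [client]
API_KEY = "cccccccc-14a1-4eb1-8964-000000000000"  # API key that has database permission

def upload_batch(db_name, tbl_name, records):
    # Hypothetical helper: POST one batch of rows and return the ingestion counters.
    resp = requests.post(
        f"{BASE_URL}/api/ingestion/batch",
        headers={"Content-Type": "application/json"},
        json={
            "api_key": API_KEY,
            "db_name": db_name,
            "tbl_name": tbl_name,
            "records": records,
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()

# Each row follows the column order of the created schema:
# trans_id, from_user, to_user, amount, at_time, is_success.
# Columns created with is_nullable=true accept null (None in Python).
rows = [
    [5, "Anna", None, 50.0, "2020-01-12 08:15:00", True],
    [6, "Tom", "Vi", 99.9, None, False],
]

result = upload_batch("ingestion", "user_transactions", rows)
print(result["total_inserted_records"], "inserted,",
      result["total_failed_records"], "failed")

Check total_invalid_records and total_skipped_records in the response to spot rows that did not match the table schema.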