Bipost API

The API runs on AWS and consists of four main stages, described below.

1. Create/Alter Schemas

  • If it does not already exist, the database is created with:

    • Name: the one you specified in the Factor BI Console.

    • Encoding: cp1252 West European (latin1)

    • Collation: latin1_spanish_ci

  • Tables are created with the full set of fields found in the source database.

  • Tables are altered to match the source schemas. Columns are never deleted.

  • Fields may appear in a different position than in the source database.

  • Only the fields specified in customData.json will be populated.
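You can confirm the resulting encoding and collation from information_schema; a minimal sketch, where 'mydatabase' is a placeholder for the name you specified in the Factor BI Console:

```sql
-- Verify the encoding and collation of the database created in stage 1.
-- 'mydatabase' is a placeholder for your actual database name.
SELECT default_character_set_name, default_collation_name
FROM information_schema.SCHEMATA
WHERE schema_name = 'mydatabase';
-- Expected: latin1, latin1_spanish_ci
```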


2. Initial Statement

After schemas are created or altered, and before new data is loaded, you can have a query run against your database.

This is very useful if you need to delete, truncate, or make any other changes before data is loaded.

Currently, all desired statements must be included in a stored procedure named spPostInitial, which must not take parameters.

Please send us an email to activate this stored procedure for a given Service No.

Example:

DELIMITER $$
DROP PROCEDURE IF EXISTS `spPostInitial`$$
CREATE PROCEDURE `spPostInitial`()
postinitial:BEGIN

  INSERT INTO logPostInitial (message) VALUES ('auto');

  TRUNCATE TABLE `mytesttable`;
  TRUNCATE TABLE `bipostlog`;
  TRUNCATE TABLE `movtipo`;

END$$
DELIMITER ;

3. Load Data

Data loading is performed by Aurora with the REPLACE option: rows with a matching primary key are replaced, and the rest are inserted.

You can verify which tables were loaded by querying the aurora_s3_load_history table like this:
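REPLACE semantics can be illustrated on a toy table; a minimal sketch (the table and values are hypothetical):

```sql
-- Toy illustration of REPLACE semantics on a table with a primary key.
CREATE TABLE demo (id INT PRIMARY KEY, val VARCHAR(20));

REPLACE INTO demo (id, val) VALUES (1, 'first');   -- no row with id 1 yet: plain insert
REPLACE INTO demo (id, val) VALUES (1, 'second');  -- id 1 exists: old row is replaced
REPLACE INTO demo (id, val) VALUES (2, 'third');   -- new key: plain insert

-- demo now contains (1, 'second') and (2, 'third').
```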

select * from mysql.aurora_s3_load_history where file_name regexp 'mytablename';

Optionally convert load_timestamp to your local time, e.g.: CONVERT_TZ(load_timestamp,'UTC','America/Mexico_City')
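Putting the two together, a minimal sketch ('mytablename' is a placeholder, as above; named time zones require the MySQL time zone tables, which Aurora provides):

```sql
-- List recent loads for a table, with load_timestamp converted to local time.
-- 'mytablename' is a placeholder; adjust the time zone as needed.
SELECT file_name,
       CONVERT_TZ(load_timestamp, 'UTC', 'America/Mexico_City') AS local_time
FROM mysql.aurora_s3_load_history
WHERE file_name REGEXP 'mytablename'
ORDER BY load_timestamp DESC;
```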


4. Final Statement

After new data is loaded into your Aurora-MySQL database, and before the download process back to your on-premises database begins, you can have a query run.

This is very handy if you need to execute several routines and, for example, populate new tables.

It is also very useful to prepare tables with data sets for the downloading process back to on-premises.

Currently, all desired statements must be included in a stored procedure named spPostFinal, which must not take parameters.

Please send us an email to activate this stored procedure for a given Service No.

Example:

DELIMITER $$
DROP PROCEDURE IF EXISTS `spPostFinal`$$
CREATE PROCEDURE `spPostFinal`()
postfinal:BEGIN

  REPLACE INTO dateInfo (tag, cDate)
  SELECT 'yesterday', fnDateInfo('yesterday',fnServiceDate());

  REPLACE INTO ymInfo (tag, y, m)
  SELECT 'current', YEAR(fnDateInfo('yesterday',fnServiceDate())), MONTH(fnDateInfo('yesterday',fnServiceDate()));

  call spPopulateMyOtherTables;

  INSERT INTO logPostFinal (message) VALUES ('auto');

END$$
DELIMITER ;