Influent builds transaction flow visualizations from your source relational databases. There are two types of data you can connect to Influent:
|Transaction||Details about each transaction, such as the source, the destination, the amount, and the date and time at which it occurred.||Using a SQL script in the source code, add a set of tables to your database that summarize the transactions and entities in your dataset.|
|Entity||Details about each entity that sends or receives transactions, such as names, locations, and other identifying attributes.||Insert the entity details you want to expose into the EntitySummary table created by the SQL script.|
The following sections describe the process of transforming your source transaction and entity data for use with Influent.
For ease of integration with conventional relational databases, Influent provides adapters for:
- Microsoft SQL Server
- Oracle Database
- HyperSQL Database (HSQLDB)
For Influent to read your transaction data, you must use the appropriate DataViewTables script (located in the influent-spi/src/main/dataviews directory) to add a set of tables to your source database.
The following sections describe the process of transforming your transaction data for use with Influent:
Before you execute any commands in the SQL script, ensure that your database contains a table with the following information for every transaction:
|Source ID||varchar||Unique ID of the entity that sent each transaction|
|Destination ID||varchar||Unique ID of the entity that received each transaction|
|Amount||float||Value of each transaction (e.g., in USD)|
|Date and Time||datetime||Date and time at which each transaction occurred|
|Transaction ID||bigint or int||Unique ID of each transaction|
NOTE: If your source and destination IDs are not strings, you can edit the SQL script to specify the correct data type. See the following section for more details.
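For reference, a minimal source table that satisfies these requirements might look like the following sketch. All table and column names here are illustrative; match them to your own schema.

```sql
-- Hypothetical raw transaction table (names are illustrative).
create table RawTransactions (
    TransactionId bigint primary key,     -- Transaction ID
    SourceId      varchar(100) not null,  -- Source ID of the sending entity
    DestId        varchar(100) not null,  -- Destination ID of the receiving entity
    Amount        float not null,         -- Transaction value (e.g., in USD)
    TransactionDt datetime not null       -- Date and time of the transaction
);
```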
The DataViewTables scripts in the influent-spi/src/main/dataviews directory enable you to add Influent tables to your source database.
To run the DataViewTables script
- Connect to the server on which your source database is located.
- Open the DataViewTables script that corresponds to your relational database management system.
- Find and replace all instances of the placeholder YOUR_RAW_DATA with the name of the table that contains your source transaction data.
- Find and replace all instances of the following placeholder column names to reflect the schema of your source data, where:
|Placeholder||Type||Contains|
|[source_id]||varchar||Unique source IDs|
|[dest_id]||varchar||Unique destination IDs|
|[amount]||float||Amount (value) of the transactions|
|[dt]||datetime||Date and time at which the transactions occurred|
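To illustrate, assuming a hypothetical raw table named RawTransactions with columns SourceId, DestId, Amount, and TransactionDt, a fragment of the script would change as follows after find and replace (the exact statements in your script will differ):

```sql
-- Before (as shipped in the script):
--   select [source_id], [dest_id], [amount], [dt] from YOUR_RAW_DATA
-- After (placeholders replaced; all names here are illustrative):
select SourceId, DestId, Amount, TransactionDt
from RawTransactions;
```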
- If the entity IDs in your source data are not strings, find and replace all instances of the Influent entity ID data type varchar(100) with the appropriate type from your source data.
- If your source data contains multiple entity types, you must create a unique EntitySummary table for each type. Modify the following statement, appending the entity type to each table name (e.g., EntitySummaryLender and EntitySummaryBorrower).
create table EntitySummary (
    EntityId varchar(100) primary key,
    IncomingLinks int not null,
    UniqueIncomingLinks int not null,
    OutgoingLinks int not null,
    UniqueOutgoingLinks int not null,
    NumLinks int,
    MaxTransaction float,
    AvgTransaction float,
    StartDate datetime,
    EndDate datetime
    -- additional type specific columns added here --
);
- If you created multiple EntitySummary tables, modify the insert into EntitySummary statement so that each of your distinct entity types is inserted into the correct table by uncommenting and editing the following line:
-- where EntityId like 'myType.%' --
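For a hypothetical "lender" entity type whose IDs begin with the prefix lender., the edited per-type insert might look like this sketch; the table name and ID prefix are illustrative, and the column and select lists should be kept exactly as they appear in your script's insert statement:

```sql
-- Illustrative sketch only: the target table name changes for each type,
-- and the commented-out type filter is uncommented and edited.
insert into EntitySummaryLender (EntityId, IncomingLinks, UniqueIncomingLinks,
    OutgoingLinks, UniqueOutgoingLinks, NumLinks, MaxTransaction,
    AvgTransaction, StartDate, EndDate)
select ...  -- the aggregation select from the script, unchanged
where EntityId like 'lender.%';
```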
- Scroll to the Summary Stats Table section and edit the SummaryValue field of the About record insertion, which contains a description of your dataset that is presented to users at login.
insert into DataSummary (SummaryOrder, SummaryKey, SummaryLabel, SummaryValue, UnformattedNumeric, UnformattedDatetime)
values (
    1,
    'InfoSummary',
    'About',
    'Some interesting description of your dataset can be written here.',
    null,
    null
);
- Review the SummaryValue field for the remaining Summary Stats insertions. By default, this field omits formatting marks such as commas and currency signs that you may want to add manually.
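One way to add such formatting is to update the field after the insertions run. This is only an illustration: the SummaryKey value below is a placeholder, and you should substitute the keys actually inserted by your script.

```sql
-- Hypothetical example: replace an unformatted summary value with a
-- formatted one. 'MaxTransactionSummary' is a placeholder key.
update DataSummary
set SummaryValue = '$1,234,567.00'
where SummaryKey = 'MaxTransactionSummary';
```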
- Run the SQL script to create the tables. We recommend you execute the script one statement at a time to facilitate troubleshooting if you encounter errors.
After you execute the DataViewTables script, your database should contain the following additional table sets:
|Entity||Entities in your dataset. Stored in six tables, each with a different time-based aggregation scheme. Used to build aggregate flow diagrams and time series charts on entities.|
|Flow||Transactions in your dataset, with each record representing one or more transactions between a specific source and destination entity. Stored in six tables, each with a different time-based aggregation scheme. Used to build aggregate flow diagrams (by time) and highlighted sub-sections of time series charts on entities.|
For more information on any of the new Influent tables, see the Database Reference Guide.
Influent allows users to investigate identifying attributes of individual entities involved in your transaction data. Entity searches can be enabled by indexing the EntitySummary table (created by the DataViewTables SQL script) with a plugin search platform such as Apache Solr.
By default, EntitySummary contains an EntityId column that can serve as a quick search field. Any additional entity details you want to make available in your app should be inserted into EntitySummary before indexing.
At a minimum, we recommend you insert the following columns into EntitySummary (if available):
- Name: A friendly, searchable name that users can easily reference. This field can also serve as a field on which entities are clustered.
- Location: An address, country code or other data that can be geocoded. This field can serve as a field on which entities are clustered.
To add entity details to EntitySummary
- Add the appropriate columns to the EntitySummary table.
- Insert the additional entity details into the new EntitySummary columns by joining on the EntityId column.
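The two steps above might look like the following sketch, assuming a hypothetical RawEntities source table with EntityId, Name, and CountryCode columns. Note that the update ... from join syntax shown is SQL Server-style; Oracle and HSQLDB would use merge or a correlated subquery instead.

```sql
-- 1. Add the new search columns (names and sizes are illustrative).
alter table EntitySummary add Name varchar(200);
alter table EntitySummary add Location varchar(200);

-- 2. Populate the new columns by joining on EntityId.
update es
set es.Name = r.Name,
    es.Location = r.CountryCode
from EntitySummary es
join RawEntities r on r.EntityId = es.EntityId;
```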
To index your entity and transaction data to enable enhanced search capabilities, see the Search Configuration topic.