LUSID’s record capture framework enables you to build custom integrations that ingest data into LUSID from your own SFTP servers or external APIs. These self-built integrations function similarly to standard LUSID integrations, allowing you to process data from sources that don’t have dedicated integrations.
This integration feature is ideal for:
Ingesting trades from external APIs without dedicated LUSID integrations
Processing data from your own SFTP servers
Triggering custom workflows based on incoming data
Building flexible, configurable data pipelines
Prerequisites
Before getting started, ensure you have:
Contacted your FINBOURNE representative to enable the custom integrations framework in your LUSID domain
An SFTP server or API endpoint with authentication credentials
Supported data types and methods
Currently, you can map and ingest the following entities into LUSID via the custom integrations framework:
Instruments
Portfolio properties
Holdings
Transactions
Quotes
Relational datasets
Setting up the integration
Step 1: Create an integration instance
An integration instance is a configuration of data points to import from an external SFTP or API.
Navigate to Integrations > Dashboard.
Click Create instance and select Record capture framework from the available integrations.

Specify the following in the Integration details tab:
Name: A friendly name for this integration instance
Description: A detailed description of the instance
Go to the Integration settings tab and specify the following:
Run date: Optional; defines the date for which the integration should run, defaulting to yesterday’s date (the most recent close of business, or COB). You should only populate this field for ad hoc reruns.
File date: Optional; defines the date for which LUSID should choose a file, defaulting to today’s date. You should only populate this field for ad hoc reruns.
Drive root folder: The location of your control file in LUSID Drive.
Control filename: The name of your JSON configuration file that tells LUSID how to process the data in a custom integration file.
Additional parameters: The following additional parameters are currently supported:
row_limit:<integer>: Limit the number of rows processed from the input file, for example row_limit:20
API configuration: If importing data from an API, specify the following:
API Base URL
API Username
API Password
SFTP configuration: If importing data via SFTP, provide the SFTP configuration details
Click Save to create the integration instance.
Step 2: Supply configuration files to LUSID Drive
The record capture framework uses two types of configuration files stored in LUSID Drive to define mappings for your custom integration:
Control file
The control file defines how LUSID should transform data retrieved from an external source. Each control file must contain an array of FileControlSpec entries in JSON format specifying:
Which Drive folder to store the retrieved file in
What to name the retrieved file
The location of mapping files in Drive
How to transform the source data
For example:
{
"FileReference": "positions-file",
"FileCategory": "primary",
"SftpServer": 1,
"SftpSourceFolder": "/acme/positions/Data",
"DriveSubfolder": "COBDATE:yyyy-MM-dd",
"FileNameMask": "POSITIONS_[COBDATE:yyyyMMdd]_*.csv",
"RenameAs": "positions.csv",
"DataReaderType": "CsvReader",
"MappingFile": "positions-mapping.json",
"ImportTypes": ["Instruments", "Holdings"]
}
See Appendix A for the full list of control file fields with detailed descriptions.
Mapping file
The mapping file defines how data fields in the source file map to LUSID entities and, if necessary, how LUSID should transform values. Each mapping file can contain the following components in JSON format:
The source columns from your input data
Value sources that allow you to combine or transform values from multiple columns
Map specific source values to target values
Conditional value sources that enable conditional logic for mapping, including:
Instrument type selectors
Identifiers that reference value maps
Load controls that define processing rules for each entity type
Data quality checks that use LUSID filtering syntax to skip rows with specific characteristics
Plug-in architecture that allows for custom hard-coded behaviour beyond the standard configuration options
For example:
{
"CsvSeparator": "Comma",
"ColumnSpecs": [
{ "ColumnHeader": "ISIN", "ColumnReference": "IsinColumn" },
{ "ColumnHeader": "Quantity", "ColumnReference": "QtyColumn" }
],
"ValueSources": [
{
"Name": "IsinValue",
"ColumnReferences": ["IsinColumn"]
}
],
"InstrumentsLoadControl": {
"CreateMissingInstruments": true,
"Identifiers": [
{
"IdentifierType": "Isin",
"IdentifierValueSourceName": "IsinValue",
"IsUnique": true
}
]
}
}
See Appendix B for the full list of mapping file fields with detailed descriptions and more examples.
Running the integration
See how to run the integration.
Monitoring and troubleshooting integration runs
Logs for LUSID’s record capture framework function like other LUSID integration logs.
See how to monitor integrations and troubleshoot general issues.
Appendix A: Control file fields
LUSID’s record capture framework uses a control file to define which source files to process and how to retrieve them. Each entry in the control file is a FileControlSpec that specifies configuration such as:
File retrieval
File processing
Exception handling tasks in the Workflow Service
Naming the FileControlSpec
The following fields allow you to define the name and categorisation of this FileControlSpec:
FileReference
Mandatory; a unique name you can use to reference this FileControlSpec in other jobs.
FileCategory
Category of the FileControlSpec (e.g. primary, auxiliary)
SFTP file retrieval parameters
If the integration retrieves files via SFTP, the following fields allow you to specify the source of the file to retrieve and how LUSID should transform it:
SftpServer
1-based index indicating which SFTP server to use (defined in system configuration).
SftpSourceFolder
The source folder path in the SFTP server (e.g. /acme/data/transactions).
DriveSubfolder
Mandatory; the LUSID Drive location to load files into. Supports {{COBDATE:format}} or {{FILEDATE:format}} patterns, for example COBDATE:yyyy-MM-dd
FileNameMask
Mandatory; a matching pattern to match files on (e.g. TRANSACTIONS_[COBDATE:yyyyMMdd]_*.csv). Uses COBDATE/FILEDATE substitution.
RenameAs
A name for the file once loaded into LUSID.
API file retrieval parameters
If the integration retrieves data from an API, the following fields allow you to specify details for the API LUSID should retrieve data from:
ApiRetrievalType
An enum value for the type of API retrieval out of the following:
None
Enfusion
ReportPath
The path to append to the API URL for retrieving reports, for example /api/holdings.
ReportFilter
A filter to apply to the report request URL, for example &valueDate=10/28/2026.
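Taken together, the API retrieval fields of a FileControlSpec might look like the following sketch (the field values shown are illustrative assumptions, not a real configuration):

```json
{
  "FileReference": "holdings-report",
  "FileCategory": "primary",
  "ApiRetrievalType": "Enfusion",
  "ReportPath": "/api/holdings",
  "ReportFilter": "&valueDate=10/28/2026",
  "DriveSubfolder": "COBDATE:yyyy-MM-dd",
  "RenameAs": "holdings.csv",
  "DataReaderType": "CsvReader",
  "MappingFile": "holdings-mapping.json",
  "ImportTypes": ["Holdings"]
}
```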
File import parameters
The following fields allow you to specify import configuration details:
DataReaderType
An enum value for the type of reader out of the following:
CsvReader
MappingFile
The name of the mapping file for this integration in Drive.
ImportTypes
An array of any of the following data types the files in this integration can import:
Instruments
Holdings
Portfolios (properties to existing portfolios)
Transactions
Quotes
RelationalDatasets
MaxIngestionFailures
A limit for the number of ingestion failures that can occur for the file before LUSID stops the import.
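As a sketch, a FileControlSpec fragment combining these import parameters might look like this (the limit of 10 and the file names are illustrative assumptions):

```json
{
  "DataReaderType": "CsvReader",
  "MappingFile": "transactions-mapping.json",
  "ImportTypes": ["Transactions"],
  "MaxIngestionFailures": 10
}
```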
Exception handling parameters
The following fields allow you to specify how LUSID should handle data quality check failures in runs of the integration:
WorkflowTaskScope and WorkflowTaskCode
The scope and code of a task definition in LUSID’s workflow service that LUSID should kick off a run of for DQ check failures.
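For example, a FileControlSpec might reference a task definition like the following sketch (the scope and code shown are illustrative assumptions, not real task definitions):

```json
{
  "FileReference": "positions-file",
  "WorkflowTaskScope": "data-quality",
  "WorkflowTaskCode": "dq-failure-review"
}
```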
LUSID data upsert parameters
The following fields allow you to specify how the integration should configure certain LUSID data points:
Scope
Specify a portfolio scope to only upsert holdings data to portfolios in the specified scope. This can be useful if you want to target a specific portfolio scope for holdings data imports, for example:
{
"FileReference": "Holdings-Finbourne-Examples",
"MappingFile": "holdings-mapping.json",
"ImportTypes": ["Holdings"],
"Scope": "Finbourne-Examples"
}
If no scope is specified, the integration processes all portfolios that match all other criteria, regardless of their scope.
CutLabel
Specify a LUSID cut label to use as the EffectiveAt datetime for all the data the integration upserts. Defaults to 00:00 UTC.
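For example, to upsert data as at a cut label rather than midnight UTC, you might add the following to a FileControlSpec (the cut label shown is an illustrative assumption; it must exist in your domain):

```json
{
  "FileReference": "holdings-file",
  "ImportTypes": ["Holdings"],
  "CutLabel": "LondonClose"
}
```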
Third party integration parameters
The following fields allow you to specify fields required for certain third party integrations:
BloombergRefreshInstanceId
Optionally, specify an integration instance ID to populate the value of the Bloomberg Data Licence Per Security Instrument/LUSIDConfig-Bloomberg-DLPerSecurity/RefreshReferenceData property.
Appendix B: Mapping file fields
LUSID’s record capture framework uses a mapping file to define how data from source files should be parsed, validated, and loaded into LUSID.
The integration guides data through the mapping file configuration in the following order:
Reads the file and applies ColumnSpecs
Validates data using DQChecks
If InstrumentsLoadControl is specified, creates/updates instruments
If PortfoliosLoadControl is specified, updates portfolio properties
If HoldingsLoadControl is specified, upserts holdings
If TransactionsLoadControl is specified, upserts transactions
If QuotesLoadControl and/or RelationalDatasetsLoadControl is specified, upserts quotes and/or relational datasets
CsvSeparator
Specify one of the following options for the delimiter used in CSV import files:
Comma: Comma-separated (,) values (default)
Pipe: Pipe-separated (|) values
None: Uses default behaviour
ColumnSpecs
Specify an array of the columns in your source file and how to reference them in this mapping file. Each list item should contain:
ColumnHeader: The column name as it appears in the source file
ColumnReference: A reference name ending in Column to use throughout the mapping file
For example:
"ColumnSpecs": [
{
"ColumnHeader": "Instrument ID",
"ColumnReference": "InstrumentIdColumn"
},
{
"ColumnHeader": "Price",
"ColumnReference": "PriceColumn"
}
]
ValueSources
Specify an array of how to extract and transform values from columns in import files. Each list item should contain a unique reference Name ending in Value to use throughout the mapping file, and either:
Value: Hardcode a specific value, for example { "Name": "InstrumentScopeValue", "Value": "default" }
ColumnReferences: A list of the columns that the value should be sourced from
For ColumnReferences you can additionally specify the following:
ValueConversion
Specify one of the following options to tell LUSID how to combine multiple columns:
None: Takes the value from the first ColumnReference:
{ "Name": "SedolValue", "ColumnReferences": ["SedolColumn"], "ValueConversion": "None" }
Concatenate: Joins multiple ColumnReference values together using the specified ConcatenationSeparator:
{ "Name": "FullNameValue", "ColumnReferences": ["FirstNameColumn", "LastNameColumn"], "ValueConversion": "Concatenate", "ConcatenationSeparator": " " }
FirstNonEmpty: Uses the first non-empty ColumnReference value:
{ "Name": "PreferredIdValue", "ColumnReferences": ["IsinColumn", "SedolColumn", "ClientIdColumn"], "ValueConversion": "FirstNonEmpty" }
ConcatenationSeparator
The separator to use when ValueConversion is set to Concatenate; defaults to a space (" ")
ValueMapName
Specify the name of a ValueMap to use for the value transformation, for example:
{
"Name": "TransactionTypeValue",
"ColumnReferences": ["TxnTypeColumn"],
"ValueMapName": "TransactionTypeMap"
}
Note
If the column value is not a key in the specified ValueMap, the framework throws an error. You should specify a DQChecks check that validates InMapKeys before applying the map, or ensure your ValueMap includes all possible input values.
MultiplierValueSourceName
For numeric values, use this field to specify one of the defined ValueSources or ConditionalValueSources to multiply the numeric value by, for example:
{
"Name": "ScaledQuantityValue",
"ColumnReferences": ["QuantityColumn"],
"MultiplierValueSourceName": "ContractSizeValue"
}
This can be useful for things such as:
Converting contract quantities to actual units
Scaling prices by exchange rates
Calculating total consideration
Applying percentage conversions
Note
The base value and multiplier value must be parseable as decimals.
LUSID returns the multiplied result as a string.
LUSID throws an error if parsing fails, detailing which value failed.
IsCurrencyIdentifier
Set to true to add the prefix CCY_ to each value, for example:
{
"Name": "CurrencyInstrumentIdValue",
"ColumnReferences": ["CurrencyCodeColumn"],
"IsCurrencyIdentifier": true
}
With the above example, LUSID might transform an original value of USD to CCY_USD.
RowIdentificationSourceName
Specify which ValueSources or ConditionalValueSources to use for identifying rows in logging and error messages.
ConditionalValueSources
Allows you to specify an array of complex logic for value transformations based on conditions.
Each list item should contain:
A unique reference Name ending in Value to use throughout the mapping file
A list of ordered LogicalExpressions containing:
Condition: A LUSID filtering syntax expression to evaluate the value source against. It must use the format <ValueSource/ConditionalValueSource> <Operator> <Specific value/ValueSource/ConditionalValueSource>.
Supported operators: In addition to the operators supported by standard LUSID filtering syntax, the integration framework supports the InMapKeys and InMapValues operators.
Result: The value to return if the condition is met; this can be any of the following:
A specific value
One of the defined ValueSources
One of the defined ConditionalValueSources
Always include an empty condition ("Condition": "") as the last LogicalExpression; this ensures unexpected values are handled.
For example:
{
"Name": "InstrumentTypeSelectorValue",
"LogicalExpressions": [
{
"Condition": "AssetClassValue eq \"Equity\"",
"Result": "Equity"
},
{
"Condition": "AssetClassValue eq \"Cash\"",
"Result": "SimpleInstrument"
},
{
"Condition": "",
"Result": "SimpleInstrument"
}
]
}
ValueMaps
Allows you to specify an array of key-value pairs for mapping specific values. Each list item should contain:
A unique reference Name ending in Map
A dictionary of Map key-value pairs
For example:
"ValueMaps": [
{
"Name": "TransactionTypeMap",
"Map": {
"BUY": "Buy",
"SELL": "Sell",
"DIVIDEND": "StockDividend"
}
},
{
"Name": "FrequencyMap",
"Map": {
"Monthly": "1M",
"Quarterly": "3M",
"Semi-Annual": "6M",
"Annual": "1Y"
}
}
]
Note
If the column value is not a key in the specified value map, the integration throws an error. Use a DQCheck with an InMapKeys operator to validate before applying the map, or ensure your value map includes all possible input values.
You can set up a ConditionalValueSource to check a value exists in a value map using the InMapKeys or InMapValues operators, for example:
{ "Condition": "RawTypeValue InMapKeys TransactionTypeMap", "Result": "Pass" }
DQChecks
Allows you to specify how the integration should validate data before processing in LUSID. You can define filters for how to present the result of a DQ check on a row in the logs. Each list item should contain:
A unique reference Name
A list of ordered LogicalExpressions to evaluate for each row of data, containing:
Condition: A LUSID filtering syntax expression to evaluate the value source against, as with ConditionalValueSources
Result: The value to return in the logs if the Condition is met; this can be any of the following:
Pass: No error recorded in the logs
Fail: Logged as an error and the row marked as failed
SkipWithMessage: Row is skipped and recorded in the logs as skipped
SkipSilently: Row is skipped but nothing is recorded in the logs
Optionally, a Message to log if a check fails
Note
If no condition matches, the integration sets the Result value to Fail.
Note
If a check hits a SkipWithMessage or SkipSilently result, the integration stops any further DQ checking for that row of data.
For example:
"DQChecks": [
{
"Name": "RequireInstrumentId",
"Message": "Instrument ID is required",
"LogicalExpressions": [
{
"Condition": "InstrumentIdValue neq \"\"",
"Result": "Pass"
},
{
"Condition": "",
"Result": "Fail"
}
]
},
{
"Name": "SkipClosedPositions",
"Message": "Skipping closed positions",
"LogicalExpressions": [
{
"Condition": "QuantityValue eq \"0\" AND StatusValue eq \"Closed\"",
"Result": "SkipWithMessage"
},
{
"Condition": "",
"Result": "Pass"
}
]
}
]
Load controls
Each entity type has its own load control section defining how the entity should be processed. You only need to include sections for the entity types you specified in your control file ImportTypes list.
InstrumentsLoadControl
Required when ImportTypes includes Instruments.
Field | Required? | Description |
|---|---|---|
| InstrumentTypeSelectorSourceName | | |
| UnderlyingInstrumentHandlingSourceName | | |
| BloombergDataRefreshSourceName | | |
| CreateMissingInstruments | | |
| Identifiers | | A list of identifier types to extract from data; at least one must include IsUnique: true |
| InstrumentDefControls | | Type-specific instrument definitions. Supported instrument types include SimpleInstrument and Equity |
| CommonProperties | | A list of properties to apply to all instrument types |
For example:
"InstrumentsLoadControl": {
"InstrumentTypeSelectorSourceName": "InstrumentTypeSelectorValue",
"UnderlyingInstrumentHandlingSourceName": "UnderlyingHandlerValue",
"BloombergDataRefreshSourceName": "BloombergDataRefreshValue",
"CreateMissingInstruments": true,
"Identifiers": [
{
"IdentifierType": "Isin",
"IdentifierValueSourceName": "IsinValue",
"IsUnique": true
},
{
"IdentifierType": "Sedol",
"IdentifierValueSourceName": "SedolValue",
"IsUnique": false
},
{
"IdentifierType": "ClientInternal",
"IdentifierValueSourceName": "ClientIdValue",
"IsUnique": false
}
],
"InstrumentDefControls": [
{
"InstrumentType": "SimpleInstrument",
"InstrumentNameSourceName": "InstrumentNameValue",
"DomesticCurrencySourceName": "CurrencyValue",
"ForeignCurrencySourceName": "ForeignCcyValue",
"SimpleInstrumentTypeSourceName": "SimpleTypeValue",
"Properties": []
},
{
"InstrumentType": "Equity",
"InstrumentNameSourceName": "InstrumentNameValue",
"DomesticCurrencySourceName": "CurrencyValue",
"Properties": [
{
"LusidPropertyName": "Instrument/default/Sector",
"ValueSourceName": "SectorValue"
}
]
}
],
"CommonProperties": [
{
"LusidPropertyName": "Instrument/default/AssetClass",
"ValueSourceName": "AssetClassValue"
}
]
}
PortfoliosLoadControl
Required when ImportTypes includes Portfolios.
Note
All portfolios you wish to include in integration runs must be decorated with the following properties:
Portfolio/rcf-config/LinkedFundCode matching the FundCode value in your source data
Portfolio/rcf-config/IsIncluded set to true
Field | Required? | Description |
|---|---|---|
| FundCodeSourceName | | |
| Properties | | A list of portfolio properties to apply to all portfolios specified; each list item should contain a LusidPropertyName and a ValueSourceName |
For example:
"PortfoliosLoadControl": {
"FundCodeSourceName": "FundCodeValue",
"Properties": [
{
"LusidPropertyName": "Portfolio/default/Strategy",
"ValueSourceName": "StrategyValue"
},
{
"LusidPropertyName": "Portfolio/default/Manager",
"ValueSourceName": "ManagerValue"
}
]
}
HoldingsLoadControl
Required when ImportTypes includes Holdings.
Note
All portfolios you wish to include in holdings integration runs must be decorated with the following properties:
Portfolio/rcf-config/LinkedFundCode matching the FundCode value in your source data
Portfolio/rcf-config/IsIncluded set to true
Portfolio/rcf-config/LoadPositions set to true
Field | Required? | Description |
|---|---|---|
| FundCodeSourceName | | |
| EffectiveDateSourceName | | |
| CurrencySourceName | | |
| CostSourceName | | |
| QuantitySourceName | | |
| TaxlotPriceSourceName | | |
| SubHoldingKeys | | A list of SHKs with a LusidPropertyName and a ValueSourceName for each |
| Properties | | A list of holding properties to apply to all portfolios specified; each list item should contain a LusidPropertyName and a ValueSourceName |
| Filter | | |
| Aggregator | | Specify KeyValueSources and SumColumns to combine multiple rows into a single holding |
For example:
"HoldingsLoadControl": {
"FundCodeSourceName": "FundCodeValue",
"EffectiveDateSourceName": "EffectiveDateValue",
"CurrencySourceName": "CurrencyValue",
"CostSourceName": "CostValue",
"QuantitySourceName": "UnitsValue",
"TaxlotPriceSourceName": "PriceValue",
"SubHoldingKeys": [
{
"LusidPropertyName": "Transaction/default/Strategy",
"ValueSourceName": "StrategyValue"
}
],
"Properties": [
{
"LusidPropertyName": "Holding/default/MarketValue",
"LusidPropertyDataType": "number",
"ValueSourceName": "MarketValueUSDValue"
}
],
"Filter": [
{
"Name": "ExcludeZeroQuantity",
"LogicalExpressions": [
{
"Condition": "QuantityValue eq \"0\"",
"Result": "SkipSilently"
},
{
"Condition": "",
"Result": "Pass"
}
]
}
],
"Aggregator": {
"KeyValueSources": ["FundCodeValue", "InstrumentIdValue", "StrategyValue"],
"SumColumns": ["QuantityColumn", "CostColumn"]
}
}
TransactionsLoadControl
Required when ImportTypes includes Transactions.
Note
All portfolios you wish to include in transactions integration runs must be decorated with the following properties:
Portfolio/rcf-config/LinkedFundCode matching the FundCode value in your source data
Portfolio/rcf-config/IsIncluded set to true
Portfolio/rcf-config/LoadTransactions set to true
Field | Required? | Description |
|---|---|---|
| FundCodeSourceName | | |
| TransactionIdSourceName | | |
| TransactionTypeSourceName | | |
| TransactionDateSourceName | | |
| SettlementDateSourceName | | |
| UnitsSourceName | | |
| TotalConsiderationCurrencySourceName | | |
| TotalConsiderationAmountSourceName | | |
| TransactionPriceSourceName | | |
| Properties | | A list of transaction properties to apply to all transactions; each list item should contain a LusidPropertyName and a ValueSourceName |
| SubHoldingKeys | | A list of SHKs with a LusidPropertyName and a ValueSourceName for each |
| Filter | | |
| Aggregator | | Specify KeyValueSources and SumColumns to combine multiple rows into a single transaction |
For example:
"TransactionsLoadControl": {
"FundCodeSourceName": "FundCodeValue",
"TransactionIdSourceName": "TransactionIdValue",
"TransactionTypeSourceName": "TransactionTypeValue",
"TransactionDateSourceName": "TradeDateValue",
"SettlementDateSourceName": "SettlementDateValue",
"UnitsSourceName": "UnitsValue",
"TotalConsiderationCurrencySourceName": "CurrencyValue",
"TotalConsiderationAmountSourceName": "ConsiderationValue",
"TransactionPriceSourceName": "PriceValue",
"Properties": [
{
"LusidPropertyName": "Transaction/default/Broker",
"ValueSourceName": "BrokerValue"
}
],
"Filter": [
{
"Name": "OnlyEquities",
"LogicalExpressions": [
{
"Condition": "AssetClassValue eq \"Equity\"",
"Result": "Pass"
},
{
"Condition": "",
"Result": "SkipSilently"
}
]
}
],
"Aggregator": {
"KeyValueSources": ["FundCodeValue", "InstrumentIdValue", "StrategyValue"],
"SumColumns": ["QuantityColumn", "CostColumn"]
}
}
QuotesLoadControl
Required when ImportTypes includes Quotes.
Field | Required? | Description |
|---|---|---|
| ScopeSourceName | | |
| EffectiveDateSourceName | | |
| ProviderSourceName | | |
| InstrumentIdTypeSourceName | | |
| InstrumentIdSourceName | | |
| QuoteTypeSourceName | | |
| FieldSourceName | | |
| PriceValueSourceName | | |
| CurrencySourceName | | |
| Filter | | |
For example:
"QuotesLoadControl": {
"ScopeSourceName": "QuoteScopeValue",
"EffectiveDateSourceName": "QuoteDateValue",
"ProviderSourceName": "ProviderValue",
"InstrumentIdTypeSourceName": "IdTypeValue",
"InstrumentIdSourceName": "InstrumentIdValue",
"QuoteTypeSourceName": "QuoteTypeValue",
"FieldSourceName": "FieldValue",
"PriceValueSourceName": "PriceValue",
"CurrencySourceName": "CurrencyValue",
"Filter": []
}
RelationalDatasetsLoadControl
Required when ImportTypes includes RelationalDatasets.
Field | Required? | Description |
|---|---|---|
| RelationalDatasetDefinitionScope | | A string containing the scope for the relational dataset definition |
| RelationalDatasetDefinitionCode | | A string containing the code for the relational dataset definition |
| SeriesScopeSourceName | | |
| EntityTypeSourceName | | |
| EntityIdentifierScopeSourceName | | |
| Identifiers | | A list of identifier types to extract from data; at least one must include IsUnique: true |
| EffectiveAtSourceName | | |
| ValueFieldsMap | | Map numeric/value dataset field names to ValueSources |
| MetaDataFieldsMap | | Map metadata dataset field names to ValueSources |
| Filter | | |
For example:
"RelationalDatasetsLoadControl": {
"RelationalDatasetDefinitionScope": "my-scope",
"RelationalDatasetDefinitionCode": "my-dataset-code",
"SeriesScopeSourceName": "SeriesScopeValue",
"EntityTypeSourceName": "EntityTypeValue",
"EntityIdentifierScopeSourceName": "EntityIdScopeValue",
"Identifiers": [
{
"IdentifierType": "ClientInternal",
"IdentifierValueSourceName": "RecordIdValue",
"IsUnique": true
}
],
"EffectiveAtSourceName": "EffectiveDateValue",
"ValueFieldsMap": {
"Amount": "AmountValue",
"Price": "PriceValue"
},
"MetaDataFieldsMap": {
"Description": "DescriptionValue",
"Category": "CategoryValue"
},
"Filter": []
}
Examples
Example A: Simple equity holdings file mapping
Let’s imagine you want to load the following equity holdings from a CSV file:
ISIN,Name,Quantity,Price,Currency,Fund,Date
GB0002374006,Tesco PLC,1000,2.50,GBP,FUND001,2025-03-10
US0378331005,Apple Inc,500,150.00,USD,FUND001,2025-03-10
You might create the following mapping file:
{
"CsvSeparator": "Comma",
"ColumnSpecs": [
{ "ColumnHeader": "ISIN", "ColumnReference": "IsinColumn" },
{ "ColumnHeader": "Name", "ColumnReference": "NameColumn" },
{ "ColumnHeader": "Quantity", "ColumnReference": "QtyColumn" },
{ "ColumnHeader": "Price", "ColumnReference": "PriceColumn" },
{ "ColumnHeader": "Currency", "ColumnReference": "CcyColumn" },
{ "ColumnHeader": "Fund", "ColumnReference": "FundColumn" },
{ "ColumnHeader": "Date", "ColumnReference": "DateColumn" }
],
"RowIdentificationSourceName": "RowIdValue",
"ValueSources": [
{
"Name": "RowIdValue",
"ColumnReferences": ["IsinColumn", "FundColumn"],
"ValueConversion": "Concatenate",
"ConcatenationSeparator": "_"
},
{
"Name": "IsinValue",
"ColumnReferences": ["IsinColumn"]
},
{
"Name": "InstrumentNameValue",
"ColumnReferences": ["NameColumn"]
},
{
"Name": "UnitsValue",
"ColumnReferences": ["QtyColumn"]
},
{
"Name": "PriceValue",
"ColumnReferences": ["PriceColumn"]
},
{
"Name": "CurrencyValue",
"ColumnReferences": ["CcyColumn"]
},
{
"Name": "FundCodeValue",
"ColumnReferences": ["FundColumn"]
},
{
"Name": "DateValue",
"ColumnReferences": ["DateColumn"]
},
{
"Name": "CostValue",
"ColumnReferences": ["PriceColumn"],
"MultiplierValueSourceName": "UnitsValue"
}
],
"DQChecks": [
{
"Name": "RequireIsin",
"Message": "ISIN is required",
"LogicalExpressions": [
{ "Condition": "IsinValue neq \"\"", "Result": "Pass" },
{ "Condition": "", "Result": "Fail" }
]
},
{
"Name": "RequireQuantity",
"Message": "Quantity must be greater than zero",
"LogicalExpressions": [
{ "Condition": "UnitsValue gt \"0\"", "Result": "Pass" },
{ "Condition": "", "Result": "Fail" }
]
}
],
"InstrumentsLoadControl": {
"CreateMissingInstruments": true,
"Identifiers": [
{
"IdentifierType": "Isin",
"IdentifierValueSourceName": "IsinValue",
"IsUnique": true
}
],
"InstrumentDefControls": [
{
"InstrumentType": "Equity",
"InstrumentNameSourceName": "InstrumentNameValue",
"DomesticCurrencySourceName": "CurrencyValue",
"Properties": []
}
],
"CommonProperties": []
},
"HoldingsLoadControl": {
"FundCodeSourceName": "FundCodeValue",
"EffectiveDateSourceName": "DateValue",
"CurrencySourceName": "CurrencyValue",
"CostSourceName": "CostValue",
"QuantitySourceName": "UnitsValue",
"TaxlotPriceSourceName": "PriceValue",
"SubHoldingKeys": [],
"Properties": []
}
}
Example B: Complex multi-asset transaction file mapping
{
"CsvSeparator": "Pipe",
"ColumnSpecs": [
{ "ColumnHeader": "TxnId", "ColumnReference": "TxnIdColumn" },
{ "ColumnHeader": "AssetType", "ColumnReference": "AssetTypeColumn" },
{ "ColumnHeader": "Identifier", "ColumnReference": "IdColumn" },
{ "ColumnHeader": "TxnType", "ColumnReference": "TxnTypeColumn" },
{ "ColumnHeader": "TradeDate", "ColumnReference": "TradeDateColumn" },
{ "ColumnHeader": "SettleDate", "ColumnReference": "SettleDateColumn" },
{ "ColumnHeader": "Units", "ColumnReference": "UnitsColumn" },
{ "ColumnHeader": "Price", "ColumnReference": "PriceColumn" },
{ "ColumnHeader": "CCY", "ColumnReference": "CcyColumn" },
{ "ColumnHeader": "Portfolio", "ColumnReference": "PortColumn" }
],
"RowIdentificationSourceName": "TxnIdentifierValue",
"ValueSources": [
{
"Name": "TxnIdentifierValue",
"ColumnReferences": ["TxnIdColumn"]
},
{
"Name": "AssetTypeValue",
"ColumnReferences": ["AssetTypeColumn"]
},
{
"Name": "IdValue",
"ColumnReferences": ["IdColumn"]
},
{
"Name": "TxnTypeRawValue",
"ColumnReferences": ["TxnTypeColumn"]
},
{
"Name": "TradeDateValue",
"ColumnReferences": ["TradeDateColumn"]
},
{
"Name": "SettleDateValue",
"ColumnReferences": ["SettleDateColumn"]
},
{
"Name": "UnitsValue",
"ColumnReferences": ["UnitsColumn"]
},
{
"Name": "PriceValue",
"ColumnReferences": ["PriceColumn"]
},
{
"Name": "CcyValue",
"ColumnReferences": ["CcyColumn"]
},
{
"Name": "PortValue",
"ColumnReferences": ["PortColumn"]
},
{
"Name": "StandardTxnTypeValue",
"ColumnReferences": ["TxnTypeColumn"],
"ValueMapName": "TxnTypeMap"
},
{
"Name": "TotalConsiderationValue",
"ColumnReferences": ["PriceColumn"],
"MultiplierValueSourceName": "UnitsValue"
}
],
"ConditionalValueSources": [
{
"Name": "InstrumentTypeSelectorValue",
"LogicalExpressions": [
{
"Condition": "AssetTypeValue eq \"EQ\"",
"Result": "Equity"
},
{
"Condition": "AssetTypeValue eq \"CASH\"",
"Result": "SimpleInstrument"
},
{
"Condition": "",
"Result": "SimpleInstrument"
}
]
}
],
"ValueMaps": [
{
"Name": "TxnTypeMap",
"Map": {
"B": "Buy",
"S": "Sell",
"DIV": "StockDividend"
}
}
],
"DQChecks": [
{
"Name": "ValidTxnType",
"Message": "Transaction type must be valid",
"LogicalExpressions": [
{ "Condition": "TxnTypeRawValue InMapKeys TxnTypeMap", "Result": "Pass" },
{ "Condition": "", "Result": "Fail" }
]
},
{
"Name": "RequirePrice",
"Message": "Price must be greater than zero",
"LogicalExpressions": [
{ "Condition": "PriceValue gt \"0\"", "Result": "Pass" },
{ "Condition": "", "Result": "Fail" }
]
}
],
"InstrumentsLoadControl": {
"InstrumentTypeSelectorSourceName": "InstrumentTypeSelectorValue",
"CreateMissingInstruments": true,
"Identifiers": [
{
"IdentifierType": "ClientInternal",
"IdentifierValueSourceName": "IdValue",
"IsUnique": true
}
],
"InstrumentDefControls": [
{
"InstrumentType": "SimpleInstrument",
"InstrumentNameSourceName": "IdValue",
"DomesticCurrencySourceName": "CcyValue",
"SimpleInstrumentTypeSourceName": "AssetTypeValue",
"Properties": []
},
{
"InstrumentType": "Equity",
"InstrumentNameSourceName": "IdValue",
"DomesticCurrencySourceName": "CcyValue",
"Properties": []
}
],
"CommonProperties": []
},
"TransactionsLoadControl": {
"FundCodeSourceName": "PortValue",
"TransactionIdSourceName": "TxnIdentifierValue",
"TransactionTypeSourceName": "StandardTxnTypeValue",
"TransactionDateSourceName": "TradeDateValue",
"SettlementDateSourceName": "SettleDateValue",
"UnitsSourceName": "UnitsValue",
"TotalConsiderationCurrencySourceName": "CcyValue",
"TotalConsiderationAmountSourceName": "TotalConsiderationValue",
"TransactionPriceSourceName": "PriceValue",
"Properties": []
}
}
Best practice
Follow naming conventions
Always append suffixes to distinguish reference types:
ColumnReference: Append "Column" → "IsinColumn", "PriceColumn"
ValueSourceName: Append "Value" → "IsinValue", "PriceValue"
ConditionalValueSourceName: Descriptive name with "Value" → "InstrumentTypeSelectorValue"
This makes it immediately clear what type of reference you're working with.
Validate early with DQ checks
Put critical validation at the beginning of your DQChecks array:
"DQChecks": [
{
"Name": "RequireIdentifier",
"Message": "Instrument identifier is required",
"LogicalExpressions": [
{ "Condition": "IsinValue neq \"\"", "Result": "Pass" },
{ "Condition": "", "Result": "Fail" }
]
},
{
"Name": "ValidateBeforeMapping",
"Message": "Transaction type must be valid",
"LogicalExpressions": [
{ "Condition": "TxnTypeValue InMapKeys TransactionTypeMap", "Result": "Pass" },
{ "Condition": "", "Result": "Fail" }
]
}
]
Reuse ValueSources
Define ValueSources once and reference them multiple times:
"ValueSources": [
{
"Name": "IsinValue",
"ColumnReferences": ["IsinColumn"]
}
]
Then use "IsinValue" in:
DQChecks
ConditionalValueSources
Load controls
Other ValueSources
Use ValueMaps for transformations
Always use ValueMaps instead of multiple conditional checks:
"ValueMaps": [
{
"Name": "TransactionTypeMap",
"Map": {
"B": "Buy",
"S": "Sell",
"D": "Dividend"
}
}
],
"ValueSources": [
{
"Name": "StandardTxnTypeValue",
"ColumnReferences": ["TxnTypeColumn"],
"ValueMapName": "TransactionTypeMap"
}
]
Instead of multiple conditional checks like this:
"ConditionalValueSources": [
{
"Name": "StandardTxnTypeValue",
"LogicalExpressions": [
{ "Condition": "TxnTypeRawValue eq \"B\"", "Result": "Buy" },
{ "Condition": "TxnTypeRawValue eq \"S\"", "Result": "Sell" },
{ "Condition": "TxnTypeRawValue eq \"D\"", "Result": "Dividend" }
]
}
]
Always include default cases
Every ConditionalValueSource should have a default case (empty condition):
{
"Name": "InstrumentTypeSelectorValue",
"LogicalExpressions": [
{
"Condition": "AssetClassValue eq \"Equity\"",
"Result": "Equity"
},
{
"Condition": "",
"Result": "SimpleInstrument"
}
]
}
This prevents errors when encountering unexpected values.
Test incrementally
Build your mapping file in stages:
Start with the basic structure: ColumnSpecs and simple ValueSources
Add DQ checks: Validate required fields
Add one LoadControl: Test with a small file
Add complexity: ConditionalValueSources, ValueMaps, additional LoadControls
Add advanced features: Filters, Aggregators, sub-holding keys
Use descriptive names
Make names self-documenting:
Good: "InstrumentTypeSelectorValue", "SettlementDaysCalculatorValue", "ValidTransactionType" (DQ check)
Avoid: "TypeValue", "Value1", "Check1"
Document complex logic
For sophisticated ConditionalValueSources, add comments in your source control commit messages explaining the business logic.
Common snippets
Multi-identifier fallback
Use the first available identifier from multiple options:
{
"Name": "BestIdentifierValue",
"ColumnReferences": ["IsinColumn", "SedolColumn", "ClientIdColumn"],
"ValueConversion": "FirstNonEmpty"
}
Conditional property setting
Only set a property when certain conditions are met:
{
"Name": "MaturityDateValue",
"ColumnReferences": ["MaturityColumn"]
},
{
"Name": "MaturityPropertyValue",
"LogicalExpressions": [
{
"Condition": "AssetClassValue eq \"Bond\"",
"Result": "MaturityDateValue"
},
{
"Condition": "",
"Result": ""
}
]
}
Then reference "MaturityPropertyValue" in your properties; it returns an empty string for non-bonds.
Composite key for row identification
Create a unique identifier by combining multiple fields:
{
"Name": "RowIdValue",
"ColumnReferences": ["FundColumn", "IsinColumn", "TradeDateColumn"],
"ValueConversion": "Concatenate",
"ConcatenationSeparator": "|"
}
Use this as your RowIdentificationSourceName for clear logging.
Nested conditional logic
Build reusable logic blocks and compose them:
{
"Name": "USSettlementDaysValue",
"LogicalExpressions": [
{ "Condition": "AssetClassValue eq \"Equity\"", "Result": "3" },
{ "Condition": "", "Result": "1" }
]
},
{
"Name": "UKSettlementDaysValue",
"LogicalExpressions": [
{ "Condition": "AssetClassValue eq \"Equity\"", "Result": "2" },
{ "Condition": "", "Result": "1" }
]
},
{
"Name": "SettlementDaysValue",
"LogicalExpressions": [
{ "Condition": "CountryValue eq \"US\"", "Result": "USSettlementDaysValue" },
{ "Condition": "CountryValue eq \"UK\"", "Result": "UKSettlementDaysValue" },
{ "Condition": "", "Result": "1" }
]
}
Validate before mapping
Always validate that a value exists in a ValueMap before applying it:
"DQChecks": [
{
"Name": "ValidateTransactionType",
"Message": "Transaction type must be valid",
"LogicalExpressions": [
{ "Condition": "RawTxnTypeValue InMapKeys TransactionTypeMap", "Result": "Pass" },
{ "Condition": "", "Result": "Fail" }
]
}
],
"ValueSources": [
{
"Name": "StandardTxnTypeValue",
"ColumnReferences": ["TxnTypeColumn"],
"ValueMapName": "TransactionTypeMap"
}
]
Conditional multiplier for scaling
Use ConditionalValueSources to dynamically scale values:
{
"Name": "ContractMultiplierValue",
"LogicalExpressions": [
{ "Condition": "AssetTypeValue eq \"Future\"", "Result": "100" },
{ "Condition": "AssetTypeValue eq \"Option\"", "Result": "1000" },
{ "Condition": "", "Result": "1" }
]
},
{
"Name": "ActualQuantityValue",
"ColumnReferences": ["RawQuantityColumn"],
"MultiplierValueSourceName": "ContractMultiplierValue"
}
Troubleshooting tips
Row failing DQ check unexpectedly
Check the order of LogicalExpressions: the first matching condition wins
Verify ValueSource names are spelled correctly (case-sensitive)
Check for empty string vs missing value handling
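For example, ordering the expressions incorrectly can make every row fail. In the sketch below (reusing the IsinValue source from the earlier examples), the empty condition is a catch-all, so placing it first means it matches every row before the real check is ever evaluated:

```json
"LogicalExpressions": [
  { "Condition": "", "Result": "Fail" },
  { "Condition": "IsinValue neq \"\"", "Result": "Pass" }
]
```

Swap the two entries so the specific condition is evaluated first and the catch-all comes last.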
Instrument not matched
Verify at least one Identifier has IsUnique: true
Check that identifier values are being extracted correctly
Confirm the instrument exists in LUSID with that identifier
Conditional not working
You cannot mix AND/OR in a single condition
Verify ValueSource names exist and are spelled correctly
Check quotes around literal values in conditions
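Because AND/OR cannot be combined in one condition, one workaround is to chain conditionals using the nested pattern from the "Nested conditional logic" snippet above. The names here are illustrative:

```json
{
  "Name": "EquityFlagValue",
  "LogicalExpressions": [
    { "Condition": "AssetClassValue eq \"Equity\"", "Result": "Yes" },
    { "Condition": "", "Result": "No" }
  ]
},
{
  "Name": "IsUSEquityValue",
  "LogicalExpressions": [
    { "Condition": "CountryValue eq \"US\"", "Result": "EquityFlagValue" },
    { "Condition": "", "Result": "No" }
  ]
}
```

IsUSEquityValue resolves to "Yes" only when both conditions hold, emulating CountryValue eq "US" AND AssetClassValue eq "Equity".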
Property not set
Verify the property key format is correct (Domain/Scope/Code)
Check LusidPropertyDataType for numeric properties (use number)
For date properties, use date and provide a DateFormats array
Confirm the ValueSource returns a non-empty value
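As a rough sketch only (the exact property schema in your control file may differ; PropertyKey and ValueSourceName are assumed field names here), a date property entry might combine these settings:

```json
{
  "PropertyKey": "Instrument/MyScope/MaturityDate",
  "ValueSourceName": "MaturityDateValue",
  "LusidPropertyDataType": "date",
  "DateFormats": ["yyyy-MM-dd"]
}
```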
ValueMap lookup failing
Validate the value is in the map keys before applying, using DQChecks
Verify key spelling matches exactly (case-sensitive)
Ensure all possible source values are mapped