Data Reconciliation Definition
Data reconciliation is the process of verifying data during a migration: the target data is compared with the original source data to confirm that the migration architecture has transferred it accurately. Mistakes are likely to be present in the mapping and transformation logic, and failures such as broken transformations and runtime errors can lead to data loss or invalid data. This in turn produces a range of problems: incorrect values, missing records or values, incorrectly formatted values, duplicate records, broken relationships across tables or systems, and so on.
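The comparison described above can be sketched in a few lines. This is a minimal, illustrative example, assuming each record carries a unique "id" key; the function and field names are hypothetical, not taken from any particular migration tool:

```python
# Minimal sketch of field-by-field migration reconciliation.
# Assumes every record has a unique key field ("id" by default).

def reconcile(source, target, key="id"):
    """Compare source and target record sets and report discrepancies."""
    src = {r[key]: r for r in source}
    tgt = {r[key]: r for r in target}

    report = {
        "missing_in_target": sorted(src.keys() - tgt.keys()),
        "unexpected_in_target": sorted(tgt.keys() - src.keys()),
        "field_mismatches": [],
    }
    # For records present on both sides, compare every field value.
    for k in src.keys() & tgt.keys():
        for field in src[k]:
            if tgt[k].get(field) != src[k][field]:
                report["field_mismatches"].append(
                    (k, field, src[k][field], tgt[k].get(field))
                )
    return report

source = [{"id": 1, "amount": 100}, {"id": 2, "amount": 250}]
target = [{"id": 1, "amount": 100}, {"id": 2, "amount": 25}, {"id": 3, "amount": 7}]
print(reconcile(source, target))
```

Here the report would flag record 3 as unexpected in the target and record 2 as a field mismatch (250 became 25), i.e. exactly the classes of defect listed above.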
Data reconciliation and gross error detection are commonly used methods for improving the accuracy of measured data so that it satisfies the plant model. These methods have been widely applied in petrochemical plants, mineral industries, refineries, and similar operations to deliver accurate plant-wide accounting. By deploying data reconciliation, these industries also aim to achieve higher profitability in plant operations.
In a modern chemical plant or refinery, numerous variables such as temperatures, flow rates, pressures, levels, and compositions are regularly measured and recorded for process control, process economic evaluation, and online optimization. Data acquisition systems and computers facilitate the gathering and processing of these massive volumes of data. Collecting data by computer makes it possible to sample at a greater frequency and helps eliminate the errors that occur during manual recording. This growing mass of data can then be subjected to systematic checking and treatment to further improve its reliability and accuracy, because process measurements are corrupted by small errors during the processing, measurement, and transmission of the measured signals.
The total error in a measurement, defined as the difference between the measured value and the true value of a variable, comprises two types of error: gross error and random error. Gross errors occur due to nonrandom events such as instrument failure, sensor corrosion, and miscalibration. Random errors occur when inherent fluctuations arise unpredictably in the readings of the measurement apparatus or in the experimenter’s interpretation of those readings.
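A simple way to screen for gross errors of the kind described above is a robust z-score test on repeated readings. The sketch below uses the median and the median absolute deviation (MAD) rather than the mean and standard deviation, since a large gross error inflates the standard deviation and can mask itself; the readings and the 3.5 threshold are assumptions chosen for illustration:

```python
import statistics

def flag_gross_errors(readings, threshold=3.5):
    """Flag readings whose robust (median/MAD) z-score exceeds the threshold."""
    med = statistics.median(readings)
    mad = statistics.median(abs(x - med) for x in readings)
    # 0.6745 rescales MAD to be comparable to a standard deviation
    # for normally distributed data.
    return [x for x in readings if 0.6745 * abs(x - med) / mad > threshold]

# Ten flow readings; 152.0 plays the role of a miscalibration-style gross error.
readings = [100.1, 99.8, 100.3, 99.9, 100.0, 100.2, 99.7, 100.1, 152.0, 100.0]
print(flag_gross_errors(readings))  # → [152.0]
```

Random errors, by contrast, show up as the small scatter around 100 in these readings and cannot be removed by screening; they are the target of the reconciliation step discussed next.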
The data reconciliation process aims to correct measurement errors that arise from random errors. The principal difference between data reconciliation and other techniques is that it explicitly makes use of process model constraints, obtaining estimates of the process variables by adjusting the measurements so that the estimates satisfy the constraints. For data reconciliation to be effective, gross errors should first be detected and eliminated.
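The constraint-based adjustment described above is classically posed as a weighted least-squares problem: find estimates x that minimize the variance-weighted distance to the measurements m subject to linear balance constraints A·x = 0, with the closed-form solution x = m − Σ·Aᵀ·(A·Σ·Aᵀ)⁻¹·A·m. The sketch below applies this to a single splitter mass balance (F1 = F2 + F3); the flow values and unit variances are assumptions for the example, not taken from any particular plant:

```python
import numpy as np

def reconcile_measurements(m, A, variances):
    """Adjust measurements m minimally (variance-weighted) so that A @ x = 0."""
    S = np.diag(variances)  # measurement covariance matrix
    # Closed-form solution of the constrained weighted least-squares problem.
    correction = S @ A.T @ np.linalg.solve(A @ S @ A.T, A @ m)
    return m - correction

# Splitter balance: F1 - F2 - F3 = 0
A = np.array([[1.0, -1.0, -1.0]])
m = np.array([101.9, 68.5, 32.1])  # raw measured flows; imbalance of 1.3
x = reconcile_measurements(m, A, variances=[1.0, 1.0, 1.0])
print(x)      # reconciled flows
print(A @ x)  # balance residual, now ~0
```

With equal variances the 1.3 imbalance is spread evenly across the three flows, so each measurement moves by about 0.43 and the reconciled estimates satisfy the balance exactly, which is precisely the "adjust measurements to satisfy the constraint" behavior described above.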
Conventional approaches to data reconciliation have often relied on simple record counts to track whether the expected number of records migrated, largely because field-by-field validation requires processing substantial volumes of data. Today’s data migration solutions deliver comparative reconciliation capabilities and data prototyping functionality that allow full-volume data reconciliation testing.
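One common way to make full-volume testing tractable, sketched below under assumed names and field choices, is to compare per-record digests rather than raw fields: each side hashes its records, and only the compact digests need to be matched. This goes beyond a bare record count without shipping every field value across systems:

```python
import hashlib

def row_digest(record, fields):
    """Deterministic digest of the selected fields of one record."""
    # Field order and the "|" separator are fixed so both sides hash identically.
    payload = "|".join(str(record[f]) for f in fields)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def diff_keys(source, target, key, fields):
    """Return keys whose records exist on both sides but hash differently."""
    src = {r[key]: row_digest(r, fields) for r in source}
    tgt = {r[key]: row_digest(r, fields) for r in target}
    return sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k])

source = [{"id": 1, "amount": 100}, {"id": 2, "amount": 250}]
target = [{"id": 1, "amount": 100}, {"id": 2, "amount": 205}]
print(diff_keys(source, target, key="id", fields=["amount"]))  # → [2]
```

A record count alone would report both sides as equal here (two records each), while the digest comparison catches the transposed amount in record 2.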
MiFID II data reconciliation: A practical guide
White Paper By: Duco
Data risk is a growing challenge in the financial industry, given the innumerable processes that must be taken care of before data is reported to the regulators. It is extremely important to stay compliant and maintain data quality for the Markets in Financial Instruments Directive II (MiFID II) during data reconciliation. Duco Cube, with its powerful and flexible reconciliation platform...
MiFID II / MiFIR Transaction Reporting: A Practical Guide
White Paper By: Duco
One of the main criticisms of the original MiFID was that national regulators did not enforce the directive with the same zeal across Europe. The list of financial instruments covered has been extended to almost all instruments traded in European markets – with particular emphasis on the OTC derivatives market that was previously out of scope for MiFID I. The issue with making this...
Transaction Reporting – what’s changing?
White Paper By: AutoRek
Transaction Reporting is one of the key priorities for regulators. Some are already warning that there will be no latitude for non-compliance, including late reporting. The aim of Transaction Reporting is to assist EU regulators in the detection and investigation of suspected market abuse. By implementing a robust, automated financial control regime, investment firms will ensure readiness...
Break Down Data Silos to Boost Your Competitive Advantage
White Paper By: Deskera
Eliminating data silos with cloud-based business software can improve productivity by providing more accurate data, delivering better visibility, and reducing IT costs. Having different IT systems that can't communicate wreaks havoc across the organization, resulting in lower productivity, inaccurate and inconsistent data, poor insight, and higher costs. What you will learn in this...
10 Critical Insights into Application Programming Interface to Leverage the Value of Your Data
White Paper By: OpenDataSoft
Understanding and offering Application Programming Interface (API) can be a challenge for business users. Over the past few years, Application Programming Interface has become a vector for the development of businesses and public institutions that cannot be ignored. This whitepaper unravels the ten critical insights to help you get the most out of Application Programming Interface in your...
The 10 Essential Steps to Open Your Data
White Paper By: OpenDataSoft
OpenDataSoft came up with a smart and innovative approach that boosts the creation of new urban services, providing deep insight into the potential of an Open Data program, the benefits of Open Data, and the Open Data market value. By transforming data into APIs, data visualizations, and real-time monitoring, it makes it easy for government organizations and business users to publish, share and re-use data with...