Automating a Data Pipeline to Perform Incremental Loads


The client's database system is secure and cannot be accessed from outside the organization, so we needed to extract data from multiple tables across different databases. The data is then uploaded and normalized in an Azure SQL staging server, exported in pipe-delimited format, and uploaded to Azure Blob Storage, where the machine learning team uses it for their analysis.

Solution Overview


Generated BCP queries using ID and date fields to extract only the latest data from multiple tables across different databases.
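The watermark-based extraction described above can be sketched as follows. This is a minimal illustration, not the actual queries used: the table name, the `Id`/`ModifiedDate` column names, and the server/database names are all assumptions.

```python
def build_bcp_command(table, last_id, last_loaded_at, out_path,
                      server="SQLSRV01", database="ClientDB"):
    """Build a bcp command that extracts only rows newer than the
    last-loaded watermark (an ID plus a date field)."""
    query = (
        f"SELECT * FROM {database}.dbo.{table} "
        f"WHERE Id > {last_id} OR ModifiedDate > '{last_loaded_at}'"
    )
    # bcp "queryout" exports the result of an ad-hoc query to a file;
    # -c writes character data and -t sets the field terminator.
    return (f'bcp "{query}" queryout {out_path} '
            f'-S {server} -T -c -t "|"')

cmd = build_bcp_command("Orders", 10500, "2023-01-01", "orders.dat")
```

Persisting the highest ID and date seen in each run lets the next run pick up only new or changed rows.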


Built a data pipeline that extracts and uploads the data to SQL Server, creating a package per table to perform the incremental load.
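The per-table incremental load amounts to an upsert: rows whose key already exists are updated, and new rows are inserted. A minimal in-memory sketch of that merge logic, with the key column name assumed to be `id`:

```python
def incremental_load(target, incoming, key="id"):
    """Merge incoming rows into the target table (a list of dicts):
    update rows whose key already exists, insert new ones."""
    index = {row[key]: i for i, row in enumerate(target)}
    for row in incoming:
        if row[key] in index:
            target[index[row[key]]] = row   # update existing row
        else:
            target.append(row)              # insert new row
    return target

staging = [{"id": 1, "name": "old"}]
batch = [{"id": 1, "name": "new"}, {"id": 2, "name": "added"}]
result = incremental_load(staging, batch)
```

In SQL Server the same pattern is typically expressed with a `MERGE` statement or an update-then-insert pair inside each table's package.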


The uploaded data is then normalized and loaded into the staging database.
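Normalization here means splitting wide, denormalized rows into separate related tables and de-duplicating repeated entities. A small sketch with hypothetical customer/order columns (the real schema is not shown in this write-up):

```python
def normalize(rows):
    """Split denormalized staging rows into separate customer and
    order tables, de-duplicating customers by id."""
    customers, orders = {}, []
    for r in rows:
        customers[r["customer_id"]] = {"id": r["customer_id"],
                                       "name": r["customer_name"]}
        orders.append({"order_id": r["order_id"],
                       "customer_id": r["customer_id"],
                       "amount": r["amount"]})
    return list(customers.values()), orders

rows = [
    {"order_id": 1, "customer_id": 7, "customer_name": "Acme", "amount": 10},
    {"order_id": 2, "customer_id": 7, "customer_name": "Acme", "amount": 5},
]
custs, ords = normalize(rows)
```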


Configured automated SQL jobs to export the normalized data as pipe-delimited files and upload them to Azure Blob Storage for the machine learning team's analysis.
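The pipe-delimited export can be sketched with the standard-library `csv` module; the field names are illustrative.

```python
import csv
import io

def export_pipe_delimited(rows, fieldnames):
    """Serialize rows to a pipe-delimited string, the format the
    ML team consumes from blob storage."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames,
                            delimiter="|", lineterminator="\n")
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

data = export_pipe_delimited([{"id": 1, "name": "Acme"}], ["id", "name"])
# The resulting file would then be uploaded to Azure Blob Storage,
# e.g. with the azure-storage-blob package's BlobClient.upload_blob.
```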



Built an automated data pipeline to perform incremental uploads on a weekly basis on both the development and staging servers.
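The weekly cadence is normally handled by the scheduler itself (e.g. SQL Server Agent), but the due-date check behind it is simple to state; a sketch, assuming runs are tracked by a last-run timestamp:

```python
from datetime import datetime, timedelta

def is_weekly_run_due(last_run, now):
    """Return True when at least seven days have passed since the
    previous incremental load (the weekly cadence used here)."""
    return now - last_run >= timedelta(days=7)

due = is_weekly_run_due(datetime(2023, 1, 1), datetime(2023, 1, 9))
```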


The ML team then performs competitor analysis using the exported, normalized data.

