SQL Server Integrator Controller & Supervisor
Automatically keep your SQL Server data replicas up to date with live data.
Ultra fast data imports and automated management synchronization via AWS-DMS.
An ideal solution for Data Warehouse environments
Ultra-Fast, Parallel and Simultaneous Bulk Data Transfers
The Power of SQL Server's BCP Technology
SQL Server's BCP technology is the fastest way to transfer massive amounts of data in SQL Server.
You can read about it in Microsoft's documentation by clicking here.
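As a rough, hypothetical illustration of what a BCP bulk transfer involves (all server, database, table, and file names below are placeholders, not part of the product), an export/import command pair could be assembled like this:

```python
# Hypothetical sketch: assembling bcp command lines for a bulk
# export/import pair. All names and paths are placeholders.

def bcp_out_cmd(db, table, datafile, server):
    """Export a table to a native-format data file with bcp."""
    return ["bcp", f"{db}.dbo.{table}", "out", datafile,
            "-S", server, "-T", "-n"]  # -T: trusted connection, -n: native format

def bcp_in_cmd(db, table, datafile, server):
    """Load a native-format data file into the destination table."""
    return ["bcp", f"{db}.dbo.{table}", "in", datafile,
            "-S", server, "-T", "-n",
            "-b", "100000"]  # -b: rows per committed batch

export_cmd = bcp_out_cmd("Sales", "StockMovements", "/data/stock.dat", "SRC-SQL01")
import_cmd = bcp_in_cmd("Sales", "StockMovements", "/data/stock.dat", "DST-SQL01")
print(" ".join(export_cmd))
```

Native format (-n) avoids character conversion on both sides, which is one reason BCP transfers are so fast.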
Ultra-Speed Data Transfers
SQL Data Trans moves information between servers using SQL Server's BCP technology, the very mechanism Microsoft implemented to transfer bulk data between servers in the fastest way possible.
SQL Data Trans configures, starts and supervises as many simultaneous BCP processes as are needed at any given time, transferring your information ultra-fast and maximising the availability of your data at the destination.
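The supervision idea described above can be sketched as a bounded pool of concurrent transfers. This is a minimal illustration, not the product's implementation: the dummy transfer() function stands in for launching and waiting on a real bcp process, and the table names are invented.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def transfer(table):
    # In a real setup this would spawn a `bcp ... out` / `bcp ... in`
    # pair via subprocess and return its exit code.
    return (table, 0)  # 0 = success

tables = ["Customers", "Orders", "StockMovements", "Invoices"]
MAX_PARALLEL = 2  # tune to your network and disk capacity

results = {}
with ThreadPoolExecutor(max_workers=MAX_PARALLEL) as pool:
    futures = [pool.submit(transfer, t) for t in tables]
    for f in as_completed(futures):
        table, code = f.result()
        results[table] = code  # supervisor records each outcome

print(results)
```

Capping the pool size is what lets the degree of parallelism be matched to the infrastructure rather than flooding it.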
Configure as many source servers as you want and as many tables and table groups as you need, define your migration policies for each table and group, and SQL Data Trans will do the rest of the work for you at incredible speed.
Don't let disk speed be your bottleneck
You can configure the number of simultaneous transfers you want to perform, making full use of the capacity of your network and the other elements of your infrastructure.
Define several different disks so that they can be used simultaneously, maximising the bandwidth of your network and minimising the copying time of the tables.
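One simple way to picture the multi-disk idea is round-robin assignment of data files across mount points, so that several disks are written to at once. A hypothetical sketch (the mount points and table names are placeholders):

```python
from itertools import cycle

# Placeholder mount points for three separate physical disks.
disks = cycle(["/mnt/disk1/bcp", "/mnt/disk2/bcp", "/mnt/disk3/bcp"])

def datafile_for(table):
    """Assign the next disk, in round-robin order, to this table's data file."""
    return f"{next(disks)}/{table}.dat"

paths = [datafile_for(t) for t in ["Customers", "Orders", "Stock", "Invoices"]]
print(paths)
```

Spreading the files this way keeps a single disk's throughput from becoming the bottleneck of the whole transfer.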
The system will take care of everything necessary: creating the destination tables, creating indexes and primary keys, verifying the data afterwards, and so on.
Transfer in bulk only what is necessary
Use filters to transfer only the data you want from each table, or even to carry out an initial bulk transfer followed by incremental bulk transfers. With this functionality, the system preserves the destination table and any data that does not match the filter you define for the transfer, making massive incremental migrations possible.
Think, for example, of a table containing the stock movements of the last five years plus the current ones, where you can guarantee that records whose creation date, stored in a field of the table, is older than one month before today's date will never be modified, deleted or added. For such a table it makes no sense to transfer all the data every time you want to update it with a bulk transfer; it is far more convenient to transfer only the records created, say, in the last month and a half.
In this situation, the system will delete the destination data from the last month and a half and stream only that data from the source server to the destination, loading it in bulk, so that the time interval in which the data is unavailable is minimal.
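The incremental-window logic described above can be sketched as a pair of generated statements: a DELETE that clears the window at the destination, and a filtered SELECT that re-exports only that window from the source. This is an illustration under assumed names (dbo.StockMovements, CreatedAt are hypothetical), not the product's actual SQL:

```python
from datetime import date, timedelta

def incremental_statements(table, date_col, days_back, today=None):
    """Build the destination DELETE and source export query for a window
    of the last `days_back` days."""
    today = today or date.today()
    cutoff = (today - timedelta(days=days_back)).isoformat()
    delete_sql = f"DELETE FROM {table} WHERE {date_col} >= '{cutoff}'"
    export_sql = f"SELECT * FROM {table} WHERE {date_col} >= '{cutoff}'"
    return delete_sql, export_sql

# 45 days ~ the "month and a half" window from the example above.
d, e = incremental_statements("dbo.StockMovements", "CreatedAt", 45,
                              today=date(2024, 6, 1))
print(d)
print(e)
```

The export query could then be handed to bcp's queryout mode, while the DELETE runs on the destination just before the bulk load, keeping the unavailability window as short as possible.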