How do I insert a DataTable with more than 100,000 records into MySQL?

I’m connecting via ODBC to a MySQL database hosted in the cloud. To load the complete table I use the Insert activity (UiPath database), but it is taking too long, approximately 7 hours. I also tried a loop inserting the records one by one, but by my calculations that would take even longer. Is there a faster and safer way to load data of this magnitude?

Any comment will help.

Sorry for the rough writing, haha.

I’m from Chile, regards!!

Do you only need to run this once, or will you be loading this number of items into MySQL every day? What is the origin of your records?

A DataTable with this volume of records will be slow.

This process runs every day and inserts roughly the same amount of data. I am getting the data from a .txt file, but I’m working with it as a DataTable.

I would try to go the other way.

Search MySQL’s “LOAD DATA INFILE”; I believe it will be your best option.
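For reference, a minimal sketch of what that statement looks like. The table name `my_table`, the file path, and the tab delimiter are assumptions; adjust them to match your .txt layout:

```sql
-- Assumed: a tab-delimited .txt file already uploaded to the MySQL server,
-- and a target table my_table whose columns match the file's columns.
LOAD DATA INFILE '/var/lib/mysql-files/data.txt'
INTO TABLE my_table
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;  -- skip the header row, if the file has one
```

Note that the `secure_file_priv` server variable may restrict which directory the server is allowed to read files from.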

Just like Jorge is saying, I would go for the LOAD DATA INFILE option. Working with a large set like yours, I wouldn’t even consider using UiPath for that.

Plan A:
import the file from the MySQL server itself:

Downsides of LOAD DATA LOCAL INFILE / LOAD DATA INFILE:

  • it is disabled by default.
  • there are restrictions on LOAD DATA LOCAL and LOAD DATA.
  • user restrictions (extra permissions are needed).
  • security: most of the time this setting is disabled for both remote and local usage. Consider talking to your cloud admins, or check the restrictions and options of your cloud MySQL.
  • the file needs to be present on the MySQL server (so you need to upload the file).


Upsides:

  • a lot faster than running queries one by one.
  • can be automated by running a cronjob.

Use of UiPath:

  • upload the file to the MySQL server. Of course that can be automated too, but that is up to you.

Plan B:
split your file into smaller ones, so it uses less of your robot’s memory. I don’t know whether this will be faster or not, but it’s worth a try if Plan A is not an option.
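A minimal sketch of the splitting idea in Python (the function name, chunk size, and output naming are assumptions, not anything UiPath provides; you could invoke something like this as a script, or replicate the same loop with UiPath’s file activities):

```python
def split_file(path, lines_per_chunk=50000):
    """Split a large text file into numbered chunk files.

    Returns the list of chunk file names it wrote.
    """
    chunks = []
    with open(path, encoding="utf-8") as src:
        chunk_index = 0
        buffer = []
        for line in src:
            buffer.append(line)
            if len(buffer) == lines_per_chunk:
                chunk_name = f"{path}.part{chunk_index}"
                with open(chunk_name, "w", encoding="utf-8") as out:
                    out.writelines(buffer)
                chunks.append(chunk_name)
                chunk_index += 1
                buffer = []
        if buffer:  # write the final, possibly smaller, chunk
            chunk_name = f"{path}.part{chunk_index}"
            with open(chunk_name, "w", encoding="utf-8") as out:
                out.writelines(buffer)
            chunks.append(chunk_name)
    return chunks
```

Each chunk can then be inserted (or loaded) separately, so the robot never holds the whole file in memory at once.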

I’m going to try it that way and I’ll let you know how it goes.

Oh thanks, I’ll try Plan A and let you know how it goes. Thanks again for your answer and your time!

I have a similar challenge.
I have a dataset running into millions of records, provided by the client on a local machine in .DEL and Excel files.
I need to load the records into a database (SQL Server) on a different server (on the same network).
=> I am currently using a stored procedure to read in the files (which is very fast); the files are on the same machine as my UiPath Studio. The only challenge is that the remote database can’t access the local drive where the files are located (after I shipped my solution to the UAT environment).
=> How can UiPath be used to handle the file read directly (UiPath can connect to the database!)?
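One common workaround for the UAT issue, assuming the files can be placed on a network share that the SQL Server service account is allowed to read (the share path, file name, and table name below are assumptions for illustration):

```sql
-- Assumed: \\fileserver\imports is a share readable by the SQL Server service account.
BULK INSERT dbo.MyTable
FROM '\\fileserver\imports\data.del'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2  -- skip the header row, if present
);
```

UiPath would then only need to copy the file to the share (Copy File activity) and run the statement through its existing database connection, so the database never has to reach into your local drive.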