I need a program that reads a BIG CSV file (3 GB or larger) and inserts the data into a database. I am looking for the fastest approach for both steps (reading the big CSV file and inserting into the database). My suggestion for the fastest approach is to use the Node.js stream concept, but if you have a better approach than Node.js, that would also be appreciated.
The above-mentioned process (reading the big CSV file and inserting into the database) should be set up as a scheduled job so that it runs at a specified time.
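For the scheduling part, this is a stdlib-only sketch of what I mean by "run at a specified time": compute the delay until the next occurrence of a daily wall-clock time and re-arm after each run. In practice a cron library (e.g. node-cron) or an external scheduler such as SQL Server Agent or OS cron might be simpler; the hour/minute values here are just examples.

```javascript
// Sketch: run a job daily at a fixed local time using only setTimeout.
function msUntil(hour, minute, now = new Date()) {
  const next = new Date(now);
  next.setHours(hour, minute, 0, 0);
  if (next <= now) next.setDate(next.getDate() + 1); // time already passed today
  return next - now;
}

function scheduleDaily(hour, minute, job) {
  setTimeout(async () => {
    try {
      await job(); // e.g. the CSV-load process
    } catch (err) {
      console.error('scheduled job failed:', err);
    }
    scheduleDaily(hour, minute, job); // re-arm for the next day
  }, msUntil(hour, minute));
}
```

Usage would be something like `scheduleDaily(2, 0, () => loadCsv(...))` to run the import at 02:00 every night.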
Note that I am using Microsoft SQL Server as the database.
I was thinking about the Node.js stream concept (Read Stream / Write Stream). But, as far as I know, we cannot use the Node.js stream concept (Read Stream / Write Stream) with Microsoft SQL Server. If that is the case, what would be the best approach?
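For reference, this is how I imagine the per-batch insert could look if the `mssql` package's bulk API works the way I understand it (the table name `dbo.ImportedRows`, the NVARCHAR column types, and the assumption that every column can be loaded as text are all mine; please correct me if the API differs):

```javascript
// Sketch: insert one batch of rows via the `mssql` package's bulk API,
// which should use SQL Server's bulk-copy path instead of per-row INSERTs.
async function insertBatch(pool, header, rows) {
  const sql = require('mssql'); // lazy require; `pool` is an already-connected ConnectionPool
  const table = new sql.Table('dbo.ImportedRows'); // hypothetical target table
  table.create = false; // assume the table already exists
  for (const col of header) {
    table.columns.add(col, sql.NVarChar(sql.MAX), { nullable: true });
  }
  for (const row of rows) {
    table.rows.add(...row);
  }
  await pool.request().bulk(table); // one round trip per batch
}
```

If bulk loading like this is possible, I would plug `insertBatch` into the streaming reader so batches are written while the file is still being read.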