Loading large amounts of data into an Oracle SQL database

I was wondering if anyone has experience with what I am about to do. I have several CSV files, each about 1 GB in size, that I need to upload to an Oracle database. Although most of my work after the load will be read-only, I will need to apply updates from time to time. Basically, I just need a good tool for loading several rows of data at a time into my DB.

Here is what I have found so far:

  • I could use SQL*Loader to do most of the work.

  • I could use bulk insert commands.

  • Some kind of batch insert.

Using a prepared statement with batching might also be a good idea. I guess I am wondering what everyone thinks is the fastest way to do this insert. Any tips?
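On the prepared-statement route, the main win is binding many rows per round trip with executemany instead of issuing one INSERT per row. A minimal sketch of that pattern, using only the standard library (sqlite3 stands in for Oracle here; the table name, columns, and chunk size are made up, and with python-oracledb the structure is the same except that Oracle uses :1, :2 bind placeholders rather than ?):

```python
import csv
import io
import sqlite3

def chunks(rows, size):
    """Yield successive lists of at most `size` rows."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

# sqlite3 is only a stand-in for Oracle; with python-oracledb you would
# call oracledb.connect(...) and the rest of the pattern is identical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (sym TEXT, px REAL)")

# A tiny in-memory CSV standing in for the 1 GB file on disk.
csv_data = io.StringIO("AAA,1.5\nBBB,2.5\nCCC,3.5\n")
reader = csv.reader(csv_data)

# One prepared INSERT, many rows per call: far fewer round trips.
for batch in chunks(reader, 2):
    conn.executemany("INSERT INTO prices VALUES (?, ?)", batch)
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM prices").fetchone()[0])  # prints 3
```

In practice you would tune the chunk size (thousands of rows per call is common) and commit once per chunk or once at the end, not per row.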

+3
3 answers

I would be very surprised if you could roll your own utility that beats SQL*Loader direct path loads. Oracle built this utility specifically for this purpose; the chance of writing something more efficient is close to zero. There is also Parallel Direct Path Load, which lets you run multiple direct path load processes simultaneously.

From the manual:

Instead of filling a bind array buffer and passing it to the Oracle database with a SQL INSERT statement, a direct path load uses the direct path API to pass the data to be loaded to the load engine in the server. The load engine builds a column array structure from the data passed to it.

The load engine uses the column array structure to format Oracle data blocks and build index keys. The newly formatted database blocks are written directly to the database (multiple blocks per I/O request, using asynchronous writes if the host platform supports asynchronous I/O).

Internally, multiple buffers are used for the column arrays. While one buffer is being filled, one or more buffers are being written if asynchronous I/O is available on the host platform. Overlapping computation with I/O increases load performance.

In short, it is hard to imagine beating a Direct Path Load for this.
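As a sketch of what this looks like in practice (the file names, table, and columns below are made up for illustration), a SQL*Loader control file maps the CSV onto the target table:

```
-- load.ctl: tells SQL*Loader how to map the CSV onto the table
LOAD DATA
INFILE 'prices.csv'
APPEND
INTO TABLE prices
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(symbol, price, trade_date DATE "YYYY-MM-DD")
```

and the load is then run with direct path enabled:

```
sqlldr userid=scott/tiger control=load.ctl log=load.log direct=true
```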

+5

We regularly load dbf files this way.

sqlldr is script-driven: you write a control file describing the data, much like a SQL script, and then run the tool against it. It takes a little setting up, but sqlldr itself is fast.

Out of curiosity, how will the updates arrive? If only some rows change, a plain load will not be enough on its own.

0

You might be able to create an external table over the CSV files and load them by SELECTing from the external table into another table. This method may not be faster, but it can be less of a hassle than SQL*Loader, especially when you have criteria for an UPDATE (you can use a MERGE).
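A sketch of the external-table route (the directory name, file name, table, and columns are all illustrative; the DIRECTORY object must already exist and point at the folder holding the CSV):

```
CREATE TABLE prices_ext (
  symbol     VARCHAR2(10),
  price      NUMBER,
  trade_date DATE
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    (symbol, price, trade_date CHAR(10) DATE_FORMAT DATE MASK "YYYY-MM-DD")
  )
  LOCATION ('prices.csv')
);

-- Plain load, direct path via the APPEND hint:
INSERT /*+ APPEND */ INTO prices SELECT * FROM prices_ext;

-- Or, when some rows must be updated rather than inserted:
MERGE INTO prices p
USING prices_ext e ON (p.symbol = e.symbol AND p.trade_date = e.trade_date)
WHEN MATCHED THEN UPDATE SET p.price = e.price
WHEN NOT MATCHED THEN INSERT (symbol, price, trade_date)
  VALUES (e.symbol, e.price, e.trade_date);
```

The MERGE at the end is what makes this route attractive for the periodic updates mentioned in the question: insert and update logic live in one statement.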

0

Source: https://habr.com/ru/post/1748905/

