
List:       python-db-sig
Subject:    [DB-SIG] Re: cx_Oracle cursor.executemanyprepared()
From:       pf_moore () yahoo ! co ! uk (Paul Moore)
Date:       2004-04-30 15:04:48
Message-ID: 8ygd9tik.fsf () yahoo ! co ! uk

"Orr, Steve" <sorr@rightnow.com> writes:

> I need to develop a fast data insert routine for query result sets
> coming from the MySQLdb python module. I'm thinking about iterating
> through a MySQL result set (which is a list of tuples), say 1,000 or
> 10,000 rows at a time, followed by mass inserts of those rows into
> Oracle committing every 1,000 or 10,000 rows. 

You do know that for an Oracle database, doing a single commit at the
end is better than multiple commits as you go through the loop? You
need enough rollback (undo) space to do it, but if you commit inside
the loop you risk ORA-01555 (snapshot too old) errors and having to
restart a partially completed load...
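
Something along these lines is what I have in mind. A rough sketch
only, assuming cx_Oracle's standard DB-API executemany() and made-up
connection details, table and column names:

    import MySQLdb
    import cx_Oracle

    BATCH = 10000   # rows per round trip; tune to taste

    # Connection details, table and column names are placeholders.
    my_conn = MySQLdb.connect(host="mhost", user="me", passwd="x", db="src")
    ora_conn = cx_Oracle.connect("scott/tiger@orcl")
    my_cur = my_conn.cursor()
    ora_cur = ora_conn.cursor()

    my_cur.execute("SELECT id, name, created FROM src_table")
    insert_sql = "INSERT INTO dest_table (id, name, created) VALUES (:1, :2, :3)"

    while True:
        rows = my_cur.fetchmany(BATCH)          # list of tuples from MySQLdb
        if not rows:
            break
        ora_cur.executemany(insert_sql, rows)   # one batched insert per fetch

    ora_conn.commit()   # single commit at the end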

Other thoughts for the load - use direct-path inserts (with the
/*+ APPEND */ hint) and NOLOGGING tables to reduce redo generation. If
you can afford a flat file of the data, external tables (or SQL*Loader
in earlier versions of Oracle) are useful.
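
If the flat file route is an option, the external-table load can still
be driven from the Python script. A sketch (table, column and
external-table names are made up, and the external-table DDL itself is
left out):

    import cx_Oracle

    conn = cx_Oracle.connect("scott/tiger@orcl")
    cur = conn.cursor()

    # NOLOGGING only pays off for direct-path operations like the one below.
    cur.execute("ALTER TABLE dest_table NOLOGGING")

    # /*+ APPEND */ gives a direct-path insert on INSERT ... SELECT, which is
    # why pairing it with an external table over the flat file works nicely.
    cur.execute("""
        INSERT /*+ APPEND */ INTO dest_table (id, name, created)
        SELECT id, name, created FROM ext_src_table
        """)

    conn.commit()   # a direct-path insert must be committed before the
                    # session can read the table again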

Paul.
-- 
This signature intentionally left blank

