Eric on 2 Apr 2010 12:20:15 -0700


Re: [PLUG] postgres data loading


What kind of errors are causing you problems?  If the errors are
frequently of the same type (for example: characters in an integer
field, or an invalid date) then perhaps you could pre-process the data
with a script (suggest: Perl).  I have frequently had to import data
from disparate sources and have almost always had to pre-process the
data to guard against this kind of thing.  Typically, I throw the
records that cannot be automatically corrected into a separate "bad"
file so that they can be manually scanned and then re-processed.
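The post suggests Perl; as one minimal sketch of the same idea (in Python,
purely for illustration -- the pipe-delimited layout and the integer/date
fields are assumptions, not anything from the original data):

```python
import datetime

def clean_record(fields):
    """Validate one pipe-separated record assumed to hold an integer ID
    and an ISO date.  Returns cleaned fields, or None if unfixable."""
    rec_id, date_str = fields[0].strip(), fields[1].strip()
    # Strip stray non-digit characters from the integer field.
    digits = "".join(ch for ch in rec_id if ch.isdigit())
    if not digits:
        return None
    # Reject dates that do not parse as YYYY-MM-DD.
    try:
        datetime.date.fromisoformat(date_str)
    except ValueError:
        return None
    return [digits, date_str]

def preprocess(lines):
    """Split records into (good, bad) lists; 'bad' plays the role of
    the separate file of records held for manual inspection."""
    good, bad = [], []
    for line in lines:
        cleaned = clean_record(line.split("|"))
        if cleaned is None:
            bad.append(line)
        else:
            good.append("|".join(cleaned))
    return good, bad
```

The "good" output can then be fed to COPY FROM in one shot, while the
"bad" list is written out for a human to look at.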


John Karr wrote:
> I'm working on a database that involves periodic replacement of a
> 10-million-record database from an external source.
> The PostgreSQL bulk load (COPY FROM) is very sensitive to errors and
> aborts the entire operation on a single bad record, so my import script
> adds records one at a time so that I can deal with or ignore failures.
> On my test machine it takes nearly a day to load the database this way.
> I tried using transactions, but only got about a 30% speed boost, at
> the cost of a single error crashing the whole batch.
> Any ideas on how to do this load faster? Normally this load would occur
> about once every 3 months, but in times of peak activity it could be
> weekly, and it wouldn't be acceptable to have the application down for
> a day for every load.
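A common compromise between the two approaches described above (fast but
fragile batches versus slow but tolerant row-at-a-time inserts) is to load
in batches and fall back to single rows only when a batch fails.  This is
a generic sketch, not anything from the thread; the two insert callables
stand in for whatever database calls the real script would make:

```python
def load_with_fallback(rows, insert_batch, insert_row, batch_size=1000):
    """Insert rows in large batches for speed; when a batch fails,
    retry that batch row by row so a single bad row does not sink
    the other rows.  Both callables are expected to raise on error.
    Returns the rows that could not be inserted (the "bad" records)."""
    bad = []
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        try:
            insert_batch(batch)      # fast path: one transaction per batch
        except Exception:
            for row in batch:        # slow path: isolate the bad row(s)
                try:
                    insert_row(row)
                except Exception:
                    bad.append(row)
    return bad
```

Since most batches succeed, the per-row cost is only paid for the rare
batch that actually contains a bad record.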

#  Eric Lucas
#                "Oh, I have slipped the surly bonds of earth
#                 And danced the skies on laughter-silvered wings...
#                                        -- John Gillespie Magee Jr

Philadelphia Linux Users Group         --
Announcements -
General Discussion  --