John Karr on 2 Apr 2010 11:55:50 -0700


[PLUG] postgres data loading

I'm working on a database that involves periodically replacing a table of
about 10 million records with data from an external source. 

The postgres bulk load (COPY FROM) is very sensitive to errors: a single bad
row aborts the entire operation. So my import script adds records one at a
time, which lets me deal with or ignore individual failures. 
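The row-at-a-time approach described above might look something like the
following sketch (assuming a psycopg2-style connection; the `items` table and
its columns are made-up names for illustration, not from the original post):

```python
# Hypothetical sketch of a row-at-a-time loader: each row gets its own
# transaction, so one bad row cannot abort the rest of the load.
# Table/column names ("items", id, name) are illustrative assumptions.
def load_rows(conn, rows, log=print):
    """Insert rows one per transaction; return (loaded, failed) counts."""
    loaded = failed = 0
    cur = conn.cursor()
    for row in rows:
        try:
            cur.execute("INSERT INTO items (id, name) VALUES (%s, %s)", row)
            conn.commit()          # one commit per row: safe but slow
            loaded += 1
        except Exception as exc:   # psycopg2 raises DataError and friends
            conn.rollback()        # discard only the failed row
            log("skipping bad row %r: %s" % (row, exc))
            failed += 1
    return loaded, failed
```

The per-row commit is exactly what makes this safe and exactly what makes it
slow: every row pays the full transaction-commit cost.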

On my test machine it takes nearly a day to load the database this way. I
tried batching inserts inside transactions, but only got about a 30% speed
boost, at the cost of a single error aborting the whole batch. 
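One way to keep most of the batch speed-up without losing a whole batch to a
single error is a SAVEPOINT around each row inside a larger transaction: an
error rolls back only to the savepoint, not the batch. A hedged sketch, again
assuming a psycopg2-style connection and a made-up `items` table:

```python
# Batched loader with per-row SAVEPOINTs (illustrative sketch).
# A bad row is rolled back to its savepoint; the batch commits normally.
def load_rows_batched(conn, rows, batch_size=1000, log=print):
    loaded = failed = 0
    cur = conn.cursor()
    for i, row in enumerate(rows, 1):
        cur.execute("SAVEPOINT row_sp")
        try:
            cur.execute("INSERT INTO items (id, name) VALUES (%s, %s)", row)
            cur.execute("RELEASE SAVEPOINT row_sp")
            loaded += 1
        except Exception as exc:
            cur.execute("ROLLBACK TO SAVEPOINT row_sp")  # undo this row only
            log("skipping bad row %r: %s" % (row, exc))
            failed += 1
        if i % batch_size == 0:
            conn.commit()          # one commit per batch, not per row
    conn.commit()                  # flush the final partial batch
    return loaded, failed
```

Savepoints add some per-row overhead of their own, so the right batch size is
something to measure rather than guess.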

Any ideas on how to do this load faster? Normally this load would occur
about once every 3 months, but in times of peak activity it could be weekly,
and it wouldn't be acceptable to have the application down for a day for
every load.
Philadelphia Linux Users Group