RE: Advice on inserting multiple rows into MySQL
It might be quicker to insert the header record, get the insert_id, then
write the detail records (with the id field) to another flat file that you
can load with the MySQL bulk loader (LOAD DATA INFILE) after all the
header records are inserted.
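For what it's worth, here is a minimal sketch of that approach. The table and column names (`header`, `detail`, `name`, `header_id`, `value`), the connection parameters, and the tab-delimited input format are all made up for illustration; `$dbh->{mysql_insertid}` and LOAD DATA LOCAL INFILE are real DBD::mysql / MySQL features.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect("dbi:mysql:test", "user", "pass",
                       { RaiseError => 1 });

open my $detail_fh, '>', 'detail.txt' or die "detail.txt: $!";

my $insert_header = $dbh->prepare("INSERT INTO header (name) VALUES (?)");
while (my $line = <>) {
    chomp $line;
    my ($name, @details) = split /\t/, $line;
    $insert_header->execute($name);
    my $id = $dbh->{mysql_insertid};    # id of the header row just inserted
    # Write the detail rows, tagged with the header id, for later bulk loading.
    print $detail_fh "$id\t$_\n" for @details;
}
close $detail_fh;

# One bulk load for all detail rows once every header is in place.
$dbh->do("LOAD DATA LOCAL INFILE 'detail.txt' "
       . "INTO TABLE detail (header_id, value)");
```

The win is that the detail table sees one bulk load instead of millions of individual INSERTs, at the cost of a second pass over the data.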
> -----Original Message-----
> From: Walt Mankowski [mailto:waltman@pobox.com]
> Sent: Friday, August 08, 2003 1:11 PM
> To: phl@lists.pm.org
> Subject: Advice on inserting multiple rows into MySQL
>
>
> I'm working on a project where I need to insert several gigabytes of
> data into mysql. The input is a flat file. Each record of the flat
> file generates one record in a master table, and 1-20 records in a
> secondary table. I need to insert the master records one at a time in
> order to fetch the mysql_insertid, but I'd like to insert the 1-20
> other records all at once to save time.
>
> The SQL I want to generate is pretty straightforward:
>
> INSERT INTO t (c1, c2) VALUES (v11, v12), (v21, v22), ...;
>
> but I'm having a hard time deciding how to best translate that into
> efficient DBI code. Ideally I'd like to prepare the insert statement
> once, and then execute it repeatedly inside my main loop. Inserting
> one row at a time, the code would look something like this:
>
> my $insert = $dbh->prepare("INSERT INTO t (c1, c2) VALUES (?, ?)");
> while (<>) {
>     ...
>     $insert->execute($v1, $v2);
>     ...
> }
>
> However, I can't find any examples that show how to do that if I want
> to insert more than one row at a time.
>
> I can think of a few possibilities:
>
> * prepare() an array of insert statement handles, one for each number
>   of rows, and execute the one I want.
>
> * prepare_cached() the insert statement on the fly, and let DBI take
> care of making sure each one only gets compiled once.
>
> * use execute_array(), although I'm having a hard time figuring out
> how to use it from the DBI docs.
>
> Does anyone have any advice? It seems like I can't be the only one
> who's ever run into this problem.
>
> Thanks.
>
> Walt
>
>
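For what it's worth, the prepare_cached() option can be sketched like this, using the same table t and columns c1/c2 as the example above. Building the SQL from the row count means each distinct row count compiles only once; the helper names here are made up, and this is untested against a real server.

```perl
use strict;
use warnings;

# Build the multi-row INSERT text for a given number of rows, e.g.
# build_insert_sql(2) gives "INSERT INTO t (c1, c2) VALUES (?, ?), (?, ?)".
sub build_insert_sql {
    my ($nrows) = @_;
    return "INSERT INTO t (c1, c2) VALUES "
         . join(", ", ("(?, ?)") x $nrows);
}

# Insert all of one record's detail rows in a single execute().
# @rows is a list of [c1, c2] array refs; prepare_cached() returns the
# already-compiled handle when it has seen this SQL string before.
sub multi_insert {
    my ($dbh, @rows) = @_;
    my $sth = $dbh->prepare_cached(build_insert_sql(scalar @rows));
    $sth->execute(map { @$_ } @rows);
}
```

Since the record generates at most 20 detail rows, the cache holds at most 20 statement handles, which keeps this cheap.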
-
**Majordomo list services provided by PANIX <URL:http://www.panix.com>**
**To Unsubscribe, send "unsubscribe phl" to majordomo@lists.pm.org**