Walt Mankowski on Fri, 8 Aug 2003 16:10:46 -0400


Advice on inserting multiple rows into MySQL


I'm working on a project where I need to insert several gigabytes of
data into MySQL.  The input is a flat file.  Each record of the flat
file generates one record in a master table and 1-20 records in a
secondary table.  I need to insert the master records one at a time so
I can fetch the mysql_insertid for each, but I'd like to insert the
1-20 secondary records all at once to save time.
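
To make that concrete, here's a rough sketch of the per-record flow I
have in mind.  The table and column names are made up, the DSN and
credentials are placeholders, and read_record() just stands in for my
flat-file parsing:

  use strict;
  use DBI;

  my $dbh = DBI->connect("DBI:mysql:database=test", "user", "pass",
                         { RaiseError => 1 });

  my $ins_master = $dbh->prepare("INSERT INTO master (name) VALUES (?)");

  while (my $rec = read_record()) {
    # The master row has to go in by itself so I can grab its
    # auto_increment id via DBD::mysql's mysql_insertid attribute.
    $ins_master->execute($rec->{name});
    my $master_id = $dbh->{mysql_insertid};

    # ...then insert the record's 1-20 secondary rows, ideally all
    # at once.
  }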

The SQL I want to generate is pretty straightforward:

  INSERT INTO t (c1, c2) VALUES (v11, v12), (v21, v22), ...;

but I'm having a hard time deciding how best to translate that into
efficient DBI code.  Ideally I'd like to prepare the insert statement
once and then execute it repeatedly inside my main loop.  Inserting
one row at a time, the code would look something like this:

  my $insert = $dbh->prepare("INSERT INTO t (c1, c2) VALUES (?, ?)");
  while (<>) {
    ...
    $insert->execute($v1, $v2);   # $v1, $v2 parsed from the current line
    ...
  }

However, I can't find any examples that show how to do that if I want
to insert more than one row at a time.

I can think of a few possibilities:

* prepare() an array of insert statement handles, one for each possible
  number of rows, and execute the appropriate one (rough sketch below).

* prepare_cached() the insert statement on the fly, and let DBI take
  care of making sure each distinct statement only gets prepared once
  (sketch below).

* use execute_array(), although I'm having a hard time figuring out
  from the DBI docs how to use it (my best guess is sketched below).
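
For the first option, this is roughly what I'm picturing.  It assumes
$dbh is connected as above, and insert_rows() is just a name I made up:

  my @sth;    # $sth[$n] holds a handle that inserts exactly $n rows

  sub insert_rows {
    my ($dbh, @rows) = @_;    # each element of @rows is a [c1, c2] pair
    my $n = @rows;

    # Prepare a statement with $n sets of placeholders the first time
    # we see a batch of this size, then reuse it.
    $sth[$n] ||= $dbh->prepare(
      "INSERT INTO t (c1, c2) VALUES " . join(", ", ("(?, ?)") x $n)
    );
    $sth[$n]->execute(map { @$_ } @rows);
  }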
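
For the second option, prepare_cached() would let DBI track the
handles for me, since its cache is keyed on the SQL text; with 1-20
rows per batch, at most 20 distinct statements would ever get cached:

  sub insert_rows {
    my ($dbh, @rows) = @_;    # each element of @rows is a [c1, c2] pair
    my $n = @rows;

    my $sql = "INSERT INTO t (c1, c2) VALUES "
            . join(", ", ("(?, ?)") x $n);

    # prepare_cached() returns the same handle the next time it sees
    # this exact SQL string, so each batch size is prepared only once.
    my $sth = $dbh->prepare_cached($sql);
    $sth->execute(map { @$_ } @rows);
  }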
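
And for the third option, my best reading of the DBI docs is something
like the code below, binding each column as an array ref.  One caveat:
as far as I can tell, unless the driver supplies its own bulk
implementation, DBI's default execute_array() just calls execute()
once per tuple behind the scenes, so it might not save much over the
one-row-at-a-time loop:

  sub insert_rows {
    my ($dbh, @rows) = @_;    # each element of @rows is a [c1, c2] pair

    my $sth = $dbh->prepare("INSERT INTO t (c1, c2) VALUES (?, ?)");

    # Column-wise binding: one array ref per placeholder; @status
    # collects the per-tuple results.
    my @c1 = map { $_->[0] } @rows;
    my @c2 = map { $_->[1] } @rows;

    my $tuples = $sth->execute_array(
      { ArrayTupleStatus => \my @status },
      \@c1, \@c2,
    );
    warn "only inserted $tuples of " . @rows . " rows\n"
      if defined $tuples && $tuples != @rows;
  }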

Does anyone have any advice?  It seems like I can't be the only one
who's ever run into this problem.

Thanks.

Walt
