gabriel rosenkoetter on Mon, 5 May 2003 21:59:05 -0400
[No need to Cc me on replies that go to the list too, Ed. I do read the mailing list.]

On Mon, May 05, 2003 at 03:50:40PM -0400, Edmund Goppelt wrote:

> Given how easy it is to produce a SQL dump, the City's suggestion
> seems both petty and harmful to the public.

I've been trying really hard to keep a straight, Devil's-advocate face on this, if only to help you reason through what kind of arguments the city'd attempt, but I'm out of steam.

> I'm concerned that if I follow their suggestion that their web site
> would probably crash or become unavailable to the public for long
> periods of time. At six queries a second, it would take me an entire
> day to download all 440,000 properties.

Do you have usage data on that web site? Will you really be inconveniencing that many users if it goes down? It might be worth doing (say, mid-morning on a Monday, so there's plenty of workday lag time for it to get fixed efficiently; I'd avoid any big crunches on the city's IT department that you can reasonably know about, in the same spirit), if only to prove your point.

> Why go through all this rigamarole? Why not just take the five
> minutes required to do a SQL dump and let me ftp the data from the
> City's ftp site?

Show them what happens if you play their game! At least, that's what I'd do. And document just how responsive the site was and wasn't from various points on the Internet, and how long a span of time it was inaccessible (if it was), and so on. I hate to advise being petty, but it's the kind of thing that may be necessary to present the situation clearly to the judge.

(Incidentally, constructing your own DB with those 440,000 records and getting some timing and load data for queries against *it* would be beneficial in proving that the city's being silly. Even if you have to build it by separate requests initially, you can refresh it with SQL dumps later, right?)

-- 
gabriel rosenkoetter
gr@eclipsed.net
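[Editor's note: a quick sketch, not part of the original message, that checks the quoted figure (440,000 records at six queries per second really is about a day) and illustrates the incidental suggestion of timing queries against a locally built database. The schema, column names, and dummy data here are invented for illustration; they are not the City's actual tables.]

```python
import sqlite3
import time

# Sanity check on the quoted figure: pulling 440,000 property records
# at six web queries per second.
RECORDS = 440_000
hours = RECORDS / 6 / 3600
print(f"440,000 records at 6 queries/sec: about {hours:.1f} hours")

# Rough local-DB comparison, per the parenthetical suggestion: load the
# same number of (dummy) rows into SQLite and time one indexed lookup,
# to show how cheap direct queries against your own copy would be.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE properties (id INTEGER PRIMARY KEY, address TEXT)")
db.executemany(
    "INSERT INTO properties VALUES (?, ?)",
    ((i, f"{i} Market St") for i in range(RECORDS)),
)
db.commit()

start = time.perf_counter()
row = db.execute(
    "SELECT address FROM properties WHERE id = ?", (123456,)
).fetchone()
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"Indexed lookup returned {row[0]!r} in {elapsed_ms:.3f} ms")
```

Refreshing such a local copy from a periodic SQL dump (as the message suggests) would replace the slow scrape entirely.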