Michael Grabenstein on Thu, 2 Nov 2000 16:29:36 -0500 (EST) |
Ok, here is one that is stumping me. I have written a subroutine that uses regular expressions to parse XML. For a particularly large XML file, a simple program calling it from the command line takes 1.5 to 1.9 seconds. Parsing the same XML file through the same library from a CGI takes 15 to 20 seconds.

I have run the CGI under both an Apache web server and a Netscape web server with the same basic results. Running the command-line version at roughly the same time on the same machine yields the time difference stated above. The timestamps are taken at basically the same locations in the command-line and CGI versions, and they only measure time actually spent parsing. Watching these things run, it is evident that the times are not wrong (i.e., not a decimal-point problem in the time calculation). I have also printed each tag encountered along with a timestamp, and it shows a nice linear progression through all the tags in both the command-line and the CGI version. (IOW: my pattern is not getting caught in a recursive loop, etc.)

I was prepared for the CGI to be slower than the command-line version, but not by a factor of 10. Any ideas?

Thanks,
Mike

--
kill -9'em All and let Root sort'em out --From Slashdot
Opinions, Flames, Irritations are solely mine.
Useful, Productive information... the company claims.
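[Editor's note: the per-tag timestamping described above can be sketched roughly as follows. The original code was most likely Perl; this is a hypothetical Python equivalent, and the names (`scan_tags`, `TAG_RE`) and the exact regex are illustrative, not the author's.]

```python
import re
import time

# Illustrative tag-matching pattern: captures an optional leading slash
# and the tag name, then skips attributes up to the closing '>'.
TAG_RE = re.compile(r"<(/?)([A-Za-z_][\w.:-]*)[^>]*>")

def scan_tags(xml_text):
    """Scan xml_text with a regex, recording a timestamp at every tag.

    Returns a list of (tag_name, seconds_since_scan_start) pairs, which
    is the kind of per-tag trace the post describes: a linear progression
    through the tags shows the pattern is not stuck in a pathological loop.
    """
    stamps = []
    start = time.perf_counter()
    for match in TAG_RE.finditer(xml_text):
        stamps.append((match.group(2), time.perf_counter() - start))
    return stamps

if __name__ == "__main__":
    sample = "<root><item id='1'>x</item><item id='2'>y</item></root>"
    for name, elapsed in scan_tags(sample):
        print(f"{elapsed:.6f}s  {name}")
```

Running the same trace once from the shell and once from the CGI would let the per-tag timings be compared directly, which is essentially the experiment described in the post.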