[18051] in Perl-Users-Digest
Perl-Users Digest, Issue: 211 Volume: 10
daemon@ATHENA.MIT.EDU (Perl-Users Digest)
Mon Feb 5 00:05:55 2001
Date: Sun, 4 Feb 2001 21:05:07 -0800 (PST)
From: Perl-Users Digest <Perl-Users-Request@ruby.OCE.ORST.EDU>
To: Perl-Users@ruby.OCE.ORST.EDU (Perl-Users Digest)
Message-Id: <981349506-v10-i211@ruby.oce.orst.edu>
Content-Type: text
Perl-Users Digest Sun, 4 Feb 2001 Volume: 10 Number: 211
Today's topics:
Re: * Perl vs. Java in high-traffic Website * km0762@my-deja.com
Re: book <uri@sysarch.com>
Re: book (Rob - Rock13.com)
Re: book <georg_a_k@gnuage.com>
Re: book <uri@sysarch.com>
Comparing File and URL size <whataman@home.com>
Does anyone use File::Spec please? <james@NOSPAM.demon.co.uk>
Re: Help with simple Unix command script to delete some (Pat Smith)
Re: JAPH (John McNamara)
Re: JAPH <godzilla@stomp.stomp.tokyo>
Re: LWP on win32 <bart.lateur@skynet.be>
Re: Modules/Constants. <dorsettest@uk.insight.com>
Re: Modules/Constants. (Abigail)
Re: Modules/Constants. (Garry Williams)
Radical readdir suggestion <ldo@geek-central.gen.new_zealand>
Re: Radical readdir suggestion (Martien Verbruggen)
search and extract ray_maharaj@my-deja.com
Re: search and extract <fxn@retemail.es>
Re: swatch.pl - watch your website's traffic in real-ti <peter.sundstrom@eds.com>
Re: swatch.pl - watch your website's traffic in real-ti (Rob - Rock13.com)
Re: Text::Template Problem (OTR Comm)
This is driving me nuts and I need a guru <gaverth@home.com>
Re: This is driving me nuts and I need a guru <bart.lateur@skynet.be>
Digest Administrivia (Last modified: 16 Sep 99) (Perl-Users-Digest Admin)
----------------------------------------------------------------------
Date: Sun, 04 Feb 2001 23:23:49 GMT
From: km0762@my-deja.com
Subject: Re: * Perl vs. Java in high-traffic Website *
Message-Id: <95koa2$jd8$1@nnrp1.deja.com>
Are you talking about Perl/CGI or mod_perl ?
I guess 99% of the CPU time will be spent in Postgres anyway, so
the choice of Perl vs. Java doesn't really matter.
regards,
-Klaus
In article <Xns903D779016001cuthereretohdplanetc@195.186.1.107>,
cut_here.retoh@dplanet.ch (Reto Hersiczky) wrote:
>
> ! Need *your* opinion !
>
> "Some guys explain to a customer: Websites from <that> kind of traffic
> meet the edge of running dynamic pages with Perl. They sugguest to
> make future developments in pure Java."
>
> Technical Conditions:
>
> - Apache Webserver
> - Sun Solaris
> - Postgres database, ~ 15 Tables, < 50000 rows
> - Web traffic approximately 25000 page views per hour
>
> I assume the reason for their advice lies in the fact that it is easier
> to protect intellectual property with servlets than by deploying a
> script whose code the customer is able to read.
>
> * Please reply with your opinion!
> I'd prefer a CC to my mail address. Remove "cut_here." prefix.
>
> Thanks,
> Reto
> "guilty as another hacker of the Perl republic"
>
------------------------------
Date: Mon, 05 Feb 2001 00:32:31 GMT
From: Uri Guttman <uri@sysarch.com>
Subject: Re: book
Message-Id: <x7u26aq89r.fsf@home.sysarch.com>
>>>>> "gak" == georg a k <georg_a_k@gnuage.com> writes:
gak> a quicky book that was helpful is:
gak> VISUAL QUICKSTART GUIDE - PERL and CGI
gak> their various guides cover many areas of computers and are
gak> well-stocked on the shelves of barnes&noble
does B&N stocking them make it a good book?
>> Perl Black Book by Steven Holzner (CoriolisOpenPress)
have you reviewed this? have you seen other perl books to compare it to?
the black books are not considered high quality perl books. in fact, one
good thing to look for is that the author is active in the perl
community. i have not seen that author active in any perl
circles. wouldn't a perl author find it useful to get tech reviewing,
feedback, etc from the perl community? that alone is a major reason to
downgrade a book. some of these are written by publisher hacks who know
enough perl to write a book and not enough to know how bad a book they
wrote.
uri
--
Uri Guttman --------- uri@sysarch.com ---------- http://www.sysarch.com
SYStems ARCHitecture, Software Engineering, Perl, Internet, UNIX Consulting
The Perl Books Page ----------- http://www.sysarch.com/cgi-bin/perl_books
The Best Search Engine on the Net ---------- http://www.northernlight.com
------------------------------
Date: 5 Feb 2001 03:01:19 GMT
From: rob_13@excite.com (Rob - Rock13.com)
Subject: Re: book
Message-Id: <903EEAFB3rock13com@207.91.5.10>
myriam chagal <mchagal@total.net>:
>please recommend a good perl book
Learning Perl, Programming Perl, Perl Cookbook, Perl documentation.
(Not necessarily in that order:-)
--
Rob - http://rock13.com/
Web Stuff: http://rock13.com/webhelp/
------------------------------
Date: Sun, 4 Feb 2001 22:23:05 -0500
From: "georg_a_k" <georg_a_k@gnuage.com>
Subject: Re: book
Message-Id: <95l66c06rt@enews4.newsguy.com>
three people in our office used both these books as primers, including
myself.
true, but all reviews are subjective; the VISUALs are cheap enough to help
someone new determine if they "want to go there". we did, and the black
book was the most informative of those on the shelf at the time.
being a long time programmer, i would much prefer an author, not a
programmer (we hated documentation; write a book???).
if you have a suggestion, Uri, let's hear it, and I may try it and email
you my opinions... enough said!
as to generally categorizing black books, the same can be said of o'reilly.
I've liked some and hated others.
"Uri Guttman" <uri@sysarch.com> wrote in message
news:x7u26aq89r.fsf@home.sysarch.com...
> >>>>> "gak" == georg a k <georg_a_k@gnuage.com> writes:
>
> gak> a quicky book that was helpful is:
> gak> VISUAL QUICKSTART GUIDE - PERL and CGI
>
> gak> their various guides cover many areas of computers and are
> gak> well-stocked on the shelves of barnes&noble
>
> does B&N stocking them make it a good book?
>
> >> Perl Black Book by Steven Holzner (CoriolisOpenPress)
>
> have you reviewed this? have you seen other perl books to compare it to?
> the black books are not considered high quality perl books. in fact, one
> good thing to look for is that the author is active in the perl
> community. i have not seen that author active in any perl
> circles. wouldn't a perl author find it useful to get tech reviewing,
> feedback, etc from the perl community? that alone is a major reason to
> downgrade a book. some of these are written by publisher hacks who know
> enough perl to write a book and not enough to know how bad a book they
> wrote.
>
> uri
>
> --
> Uri Guttman --------- uri@sysarch.com ---------- http://www.sysarch.com
> SYStems ARCHitecture, Software Engineering, Perl, Internet, UNIX Consulting
> The Perl Books Page ----------- http://www.sysarch.com/cgi-bin/perl_books
> The Best Search Engine on the Net ---------- http://www.northernlight.com
------------------------------
Date: Mon, 05 Feb 2001 04:42:22 GMT
From: Uri Guttman <uri@sysarch.com>
Subject: Re: book
Message-Id: <x7r91drb9t.fsf@home.sysarch.com>
>>>>> "gak" == georg a k <georg_a_k@gnuage.com> writes:
gak> three people in our office used both these books as primers,
gak> including myself.
gak> true, but all reviews are subjective; the VISUALs are cheap
gak> enough to help someone new determine if they "want to go there".
gak> we did, and the black book was the most informative of those on
gak> the shelf at the time.
most informative? by what definition? the camel is a much more
informative book than the black perl book by a long shot.
gak> being a long time programmer, i would much prefer an author, not a
gak> programmer (we hated documentation; write a book???).
or a programmer who can write? i wouldn't want a book by a writer who
can't program. that is what most of the hack publishers use.
gak> if you have a suggestion, Uri, let's hear it, and I may try it and
gak> email you my opinions... enough said!
plenty of other books to suggest. you haven't stated your needs since
only a few are general purpose like the camel. mastering regular
expressions, effective perl programming, object oriented perl are 3 of
the more useful other perl books (and only 1 is o'reilly).
gak> as to generally categorizing black books, the same can be said of
gak> o'reilly. I've liked some and hated others.
i never knocked black books in total. the perl one is not very good and
the author is not known in the perl community. that doesn't mean much in
some ways and it means a lot in others. the perl community is a very
active one and many people in it are connected via many threads. so if
an author of a perl book wanted tech reviews, help, code
checking, etc., he could get it. by not being involved in the community,
they mark themselves as someone who doesn't care about perl itself but
only about satisfying their publisher's book needs.
<jeopardectomy>
and your perl book opinions are downgraded when you jeopardy quote entire
posts.
uri
--
Uri Guttman --------- uri@sysarch.com ---------- http://www.sysarch.com
SYStems ARCHitecture, Software Engineering, Perl, Internet, UNIX Consulting
The Perl Books Page ----------- http://www.sysarch.com/cgi-bin/perl_books
The Best Search Engine on the Net ---------- http://www.northernlight.com
------------------------------
Date: Mon, 05 Feb 2001 04:39:59 GMT
From: "What A Man !" <whataman@home.com>
Subject: Comparing File and URL size
Message-Id: <3A7E2F0B.D6061C5F@home.com>
How do I compare a local file's size with the size of the document at a URL?
I am using LWP::Simple to "getstore($URL, $tmpfile);"
I want to compare the size of the URL file to the size of
the tmpfile, to make sure that the download was
successful. If the sizes are not the same, then I want to
exit() with the correct RC_MODIFIED message; and not
continue to the next process.
I've tried "is_error" without success. I either get an
"ok" returned when the file was not successfully
downloaded, or I get a "200",depending on what method I
use. I've also tried examples from LWP::User Agent, and
reviewed the other LWP docs.
Is there a simple way to do this?
Thanks,
Dennis
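One possibility, offered as an untested sketch: in list context,
LWP::Simple's head() returns the Content-Length the server advertises,
which can be checked against the stored file. Note that Content-Length
is not guaranteed to be present, especially for dynamically generated
pages.

use LWP::Simple;

# HEAD request first; $length is the server's Content-Length, if any
my ($type, $length) = head($URL)
    or die "HEAD request failed for $URL\n";

my $rc = getstore($URL, $tmpfile);
die "download failed with status $rc\n" unless is_success($rc);

# compare the advertised size against what actually landed on disk
my $got = -s $tmpfile;
die "size mismatch: expected $length, got $got\n"
    if defined $length and $got != $length;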
------------------------------
Date: Mon, 5 Feb 2001 03:58:56 +0000
From: James Taylor <james@NOSPAM.demon.co.uk>
Subject: Does anyone use File::Spec please?
Message-Id: <ant0503560e6fNdQ@oakseed.demon.co.uk>
I'm writing a Perl program to take a directory containing a website
in a kind of "source" form and then (via various template files and
automatic hierarchic navbar creation) to output a directory containing
the complete site ready for upload.
A major part of the program is that it has to calculate relative paths
between all the pages, the graphics, and the navbar items, making sure
that they are written into the HTML in the standard Unix format. At the
same time it has to know where all the source and destination files are by
native absolute pathname, and be able to convert between the two freely.
The program will run on RISC OS during the initial development of
sites, but then I also wish to be able to give it to my clients so
that they can update their sites themselves - and they could be
running Windows, Linux, MacOS or anything else. Obviously Perl is
nicely portable, but I have a problem with how to perform all the
pathname manipulations and conversions in a platform independent
manner. It's rather a headache and your assistance would be much
appreciated.
I've had a look at File::Spec and children but it does not seem to have
the necessary functionality, and to be honest, I'm not sure how to convert
back and forth between the unknown "native" format and Unix format.
Specific problems I've had are in the use of File::Spec->canonpath()
which does not seem to perform the tidying operations it promises.
I don't know where to start because the treatment of pathnames is so
central to the whole thing. Can someone please explain to me what I
need to understand? Thanks.
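The relevant pieces appear to be File::Spec's rel2abs/abs2rel together
with splitdir, re-joining through File::Spec::Unix for the forward-slash
form. A rough, untested sketch (variable names invented; abs2rel's
behaviour differs across platforms and File::Spec versions, and splitdir
strictly expects a directory portion, so treat this only as a starting
point):

use File::Spec;
use File::Spec::Unix;

# make the target relative to the directory of the referring page
# (both given as native absolute paths) ...
my $rel = File::Spec->abs2rel($target_native, $page_dir_native);

# ... then re-express the relative path with forward slashes for HTML
my @parts = File::Spec->splitdir($rel);
my $href  = File::Spec::Unix->catdir(@parts);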
--
James Taylor <james (at) oakseed demon co uk>
Based in Hammersmith, London, UK.
PGP key available ID: 3FBE1BF9
Fingerprint: F19D803624ED6FE8 370045159F66FD02
------------------------------
Date: 05 Feb 2001 03:46:41 GMT
From: quasimojo321@aol.com (Pat Smith)
Subject: Re: Help with simple Unix command script to delete some temp files...
Message-Id: <20010204224641.15257.00000803@ng-ct1.aol.com>
Dr Lehmann,
I have never subscribed to a usenet mailgroup before, so I may not be up on
etiquette. If (as your previous message seems to indicate) you are running
on a UNIX-based platform, then most likely a simple shell script put into a
crontab will fit your needs rather nicely.
You will want to begin this script thusly...
#!/usr/bin/ksh
#
# /dirname would equal the exact path to the directory you wished to operate on.
# You may type "man find" at the command line to get a rundown of what the
# rest of this script is doing.
#
find /dirname -type f -name "*.dat" -mtime +1 -exec rm -f {} \;
#
find /dirname -type f -name "*.gif" -mtime +1 -exec rm -f {} \;
#
# EOF
If you vi this simple script and simply include it in root's crontab to run
once a day (say at 01:00), you should achieve the desired effect. I would
highly recommend reading the man pages of these various (and very powerful)
commands.
A very wise man once told me that not only will unix allow you to shoot
yourself in the foot, it will helpfully take three quarters of your leg as
well!! Read those man pages!
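Since this is a Perl group, the same job in Perl, as an untested sketch
using the standard File::Find module:

#!/usr/bin/perl -w
use strict;
use File::Find;

# remove *.dat and *.gif files more than one day old under /dirname
find(sub {
    return unless -f and /\.(?:dat|gif)$/;
    if (-M _ > 1) {
        unlink $_ or warn "can't unlink $File::Find::name: $!\n";
    }
}, '/dirname');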
------------------------------
Date: Sun, 04 Feb 2001 23:14:44 GMT
From: jmcnamara@cpan.org (John McNamara)
Subject: Re: JAPH
Message-Id: <3a7de06e.4097825@news1.eircom.net>
On Sat, 03 Feb 2001 09:41:08 -0800, "Godzilla!"
<godzilla@stomp.stomp.tokyo> wrote:
>My signature file is most unique, perhaps the first
>of its kind
Perhaps not. The obfuscation notwithstanding, the self-replication
idea was demonstrated here before, by Abigail:
http://www.deja.com/threadmsg_ct.xp?AN=546839854&fmt=text
John McNamara
--
------------------------------
Date: Sun, 04 Feb 2001 16:59:18 -0800
From: "Godzilla!" <godzilla@stomp.stomp.tokyo>
Subject: Re: JAPH
Message-Id: <3A7DFAE6.F0701166@stomp.stomp.tokyo>
John McNamara wrote:
(snippage)
> Godzilla! wrote:
> >My signature file is most unique, perhaps the first
> >of its kind
> Perhaps not. The obfuscation notwithstanding, the self-replication
> idea was demonstrated here before, by Abigail:
> http://www.deja.com/threadmsg_ct.xp?AN=546839854&fmt=text
Cute program affording single instance replication
within .pl files with no exponential growth. It is
safe and docile, very conformist in nature. I like
Abigail's script.
So tell me, why are you boys investing so much time
and effort into negative commentary about my unique
JAPH script, when you don't do this to others, those
who bend the knee and are accepted?
A rhetorical question begging no answer, of course.
Godzilla!
------------------------------
Date: Sun, 04 Feb 2001 23:10:35 GMT
From: Bart Lateur <bart.lateur@skynet.be>
Subject: Re: LWP on win32
Message-Id: <hgnr7t0o4ck1bguqje4n8i3v36liq7o93j@4ax.com>
Regy wrote:
>#!perl/bin/perl
>use LWP::Simple;
>
>print (get $ARGV[0]);
>
>This works perfectly when I used it to download a webpage, using a linux
>machine but does not return anything on a winnt machine. There are no
>errors either. Any ideas as to why?
You're likely not getting through. Are you connected to the internet? Do you
need to use a proxy, by any chance? If so, set the HTTP_PROXY
environment variable to that server/port pair, in the format
"http://proxy.domain.net:8080" (without the quotes).
Oh, you can also try getstore() instead of get(), and print out the
numeric response code. If it's some error code, then at least you have a
clue what is going on.
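A minimal sketch of that second suggestion (the output file name is just
an example):

use LWP::Simple;

# getstore() returns the numeric HTTP status, unlike get(),
# which returns plain undef on any failure
my $rc = getstore($ARGV[0], 'page.html');
print "response code: $rc\n";
print is_success($rc) ? "saved page.html\n" : "request failed\n";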
--
Bart.
------------------------------
Date: Mon, 05 Feb 2001 00:26:22 +0000
From: Kelly Dorset <dorsettest@uk.insight.com>
Subject: Re: Modules/Constants.
Message-Id: <3A7DF32E.FE64C3E5@uk.insight.com>
>
> Incidentally, I've just realised that having the constants in a separate
> file and using them isn't working :( Whereabouts in the module do
> they need to be declared?
>
To clarify (managed to get rid of using Deja :) ), I get this error:
Argument "CONSTANT" isn't numeric in pack at include/COMSINCLUDE.pm
Could this be because I'm using the include twice, as previously
mentioned?
k
--
------------------------------
Date: 5 Feb 2001 01:02:44 GMT
From: abigail@foad.org (Abigail)
Subject: Re: Modules/Constants.
Message-Id: <slrn97rutk.vb5.abigail@tsathoggua.rlyeh.net>
Kelly Dorset (dorsettest@uk.insight.com) wrote on MMDCCXV September
MCMXCIII in <URL:news:3A7DF32E.FE64C3E5@uk.insight.com>:
``
`` >
`` > Incidentally, I've just realised that having the constants in a separate
`` > file and using them isn't working :( Whereabouts in the module do
`` > they need to be declared?
`` >
`` To clarify (managed to get rid of using Deja :) ), I get this error:
``
`` Argument "CONSTANT" isn't numeric in pack at include/COMSINCLUDE.pm
``
`` Could this be because I'm using the include twice, as previously
`` mentioned?
No.
Abigail
--
package Just_another_Perl_Hacker; sub print {($_=$_[0])=~ s/_/ /g;
print } sub __PACKAGE__ { &
print ( __PACKAGE__)} &
__PACKAGE__
( )
------------------------------
Date: Mon, 05 Feb 2001 04:33:42 GMT
From: garry@zvolve.com (Garry Williams)
Subject: Re: Modules/Constants.
Message-Id: <G2qf6.4462$Sn3.43834@eagle.america.net>
On Mon, 05 Feb 2001 00:26:22 +0000, Kelly Dorset
<dorsettest@uk.insight.com> wrote:
>> Incidentally, I've just realised that having the constants in a separate
>> file and using them isn't working :( Whereabouts in the module do
>> they need to be declared?
>>
>To clarify (managed to get rid of using Deja :) ),
Hooray! Your posts will likely be more visible now.
>I get this error:
>
>Argument "CONSTANT" isn't numeric in pack at include/COMSINCLUDE.pm
That's a warning.
>Could this be because I'm using the include twice, as previously
>mentioned?
If you mean that you are not exporting the constants and that you
include the constants module in two different files, and the two files
define different packages, yes.
That was nobull's point.
The second time the module is use'd the compiler knows it's already
loaded, so it just calls its import method in the new file. Hmm. If
it doesn't export anything, there will be no names defined in the
second file's name space.
You would see this more clearly if you used strict.
The only way around this is to explicitly make everything the same
name space. That's probably a Bad Thing. If you do that, why divide
things into separate files?
By the way, have you considered use strict?
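To illustrate the exporting route (module and constant names are
invented for the example), a constants module along these lines should
work:

# MyConstants.pm
package MyConstants;
use strict;
use Exporter;
use vars qw(@ISA @EXPORT);
@ISA    = qw(Exporter);
@EXPORT = qw(WIDTH HEIGHT);

# use constant defines subs, and subs export like anything else
use constant WIDTH  => 640;
use constant HEIGHT => 480;

1;

Every file that says "use MyConstants;" then gets WIDTH and HEIGHT in
its own package, and pack "NN", WIDTH, HEIGHT sees real numbers instead
of the bareword "CONSTANT".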
--
Garry Williams
------------------------------
Date: Mon, 05 Feb 2001 16:00:54 +1300
From: Lawrence D'Oliveiro <ldo@geek-central.gen.new_zealand>
Subject: Radical readdir suggestion
Message-Id: <ldo-BD3CE2.16005405022001@news.wave.co.nz>
What is the use of readdir returning the "." and ".." entries? Has
anybody ever written a Perl script that depended on these entries being
returned in order to work?
If not, is there any reason why the semantics of readdir should not be
changed so it never returns "." and ".."? It would simplify the many
scripts that currently have to check for and skip these entries.
Note that readdir already works this way on platforms where directories
do not have such entries (eg MacOS).
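For reference, the boilerplate being complained about is the familiar
idiom (a minimal sketch):

opendir DIR, $some_dir or die "can't opendir $some_dir: $!";
my @entries = grep { $_ ne '.' and $_ ne '..' } readdir DIR;
closedir DIR;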
------------------------------
Date: Mon, 05 Feb 2001 03:13:29 GMT
From: mgjv@tradingpost.com.au (Martien Verbruggen)
Subject: Re: Radical readdir suggestion
Message-Id: <slrn97s6gk.64e.mgjv@verbruggen.comdyn.com.au>
On Mon, 05 Feb 2001 16:00:54 +1300,
Lawrence D'Oliveiro <ldo@geek-central.gen.new_zealand> wrote:
> What is the use of readdir returning the "." and ".." entries? Has
Because readdir returns all the directory entries. . and .. are
directory entries. It's just how the file system works.
> If not, is there any reason why the semantics of readdir should not be
> changed so it never returns "." and ".."? It would simplify so many
> scripts, that currently have to check for, and skip these entries.
I don't see it as a problem. Breaking the way readdir works, and
making exceptions for 'special' entries, is IMO a bad thing. . and ..
are part of the way the file system works. They should be there.
> Note that readdir already works this way on platforms where directories
> do not have such entries (eg MacOS).
But readdir presumably still returns _all_ entries in the directory,
or equivalent, right?
Martien
--
Martien Verbruggen |
Interactive Media Division | Make it idiot proof and someone will
Commercial Dynamics Pty. Ltd. | make a better idiot.
NSW, Australia |
------------------------------
Date: Sun, 04 Feb 2001 23:43:00 GMT
From: ray_maharaj@my-deja.com
Subject: search and extract
Message-Id: <95kpe4$jvf$1@nnrp1.deja.com>
I have a large file (500K) which I need to search...and extract matched
lines out of....
I have matched the lines to an array that holds the elements that I
wish to search for....
I can print the matched lines to STDOUT ....what I would like to do is
print the matched lines to an out file and remove these matched lines
from the original file....
I have included a portion of my code that does some of this...I would
gladly welcome your help....
@poppats = map { qr/\b$_\b/i } @popstates;
@whole   = <$input>;
foreach $line (@whole) {
    foreach $patobj (@poppats) {
        if ($line =~ /$patobj/) {
            print $line;
        }
    }
}
------------------------------
Date: Mon, 05 Feb 2001 02:00:59 +0100
From: F. Xavier Noria <fxn@retemail.es>
Subject: Re: search and extract
Message-Id: <u8ur7t0iku2iurapunqknacl2rttmm60l9@4ax.com>
On Sun, 04 Feb 2001 23:43:00 GMT, ray_maharaj@my-deja.com wrote:
; I have a large file (500K) which I need to search...and extract matched
; lines out of....
; I have matched the lines to an array that holds the elements that I
; wish to search for....
; I can print the matched lines to STDOUT ....what I would like to do is
; print the matched lines to an out file and remove these matched lines
; from the original file....
How about using the -i switch? If you weren't aware of -i, read
about it in
perldoc perlrun
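For example (untested; pattern and file names are invented): under -i a
plain print rewrites the original file in place, while a handle opened
in a BEGIN block collects the matched lines:

perl -i.bak -ne 'BEGIN { open OUT, "> matched.txt" or die $! }
    /\bNY\b/i ? print OUT $_ : print' bigfile.txt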
-- fxn
------------------------------
Date: Mon, 5 Feb 2001 11:29:43 +1300
From: "Peter Sundstrom" <peter.sundstrom@eds.com>
Subject: Re: swatch.pl - watch your website's traffic in real-time
Message-Id: <95kl4o$38v$1@hermes.nz.eds.com>
"James Thornton" <james_thornton@my-deja.com> wrote in message
news:95kf36$d6v$1@nnrp1.deja.com...
> swatch.pl -- Site Watch is a PERL script that provides a means for you
> to watch your website's traffic in real-time.
>
> http://www.jamesthornton.com/code/swatch.txt
Are you aware of the other swatch
http://www.stanford.edu/~atkins/swatch/
It's been around for about 7 years. I'd suggest you choose another name to
avoid confusion.
------------------------------
Date: 5 Feb 2001 03:05:07 GMT
From: rob_13@excite.com (Rob - Rock13.com)
Subject: Re: swatch.pl - watch your website's traffic in real-time
Message-Id: <903EE7B2Crock13com@207.91.5.10>
James Thornton <james_thornton@my-deja.com>:
>swatch.pl -- Site Watch is a PERL script that provides a means for
>you to watch your website's traffic in real-time.
The least you could do is crosspost, so my killfile could filter this
thing out.
--
Rob - http://rock13.com/
Web Stuff: http://rock13.com/webhelp/
------------------------------
Date: Mon, 05 Feb 2001 00:58:43 GMT
From: otrcomm***NO-SPAM***@wildapache**NO-SPAM***.net (OTR Comm)
Subject: Re: Text::Template Problem
Message-Id: <3a7dfa7a.146795931@news.wildapache.net>
On Sun, 04 Feb 2001 22:44:32 GMT, Bart Lateur <bart.lateur@skynet.be>
wrote:
>OTR Comm wrote:
>
>>my @items = qw(This is Cool);
>>
>>my $result = $text_template->fill_in(HASH => \%items);
>
>@items and %items is not the same variable.
>
>>but if I use:
>>
>>my $result = $text_template->fill_in(HASH => {items => ["This", "is",
>>"cool"]});
>>
>>then the results are as expected
>
>Well, then you need to use "{ items => \@items }" instead of "\%items".
Thanks, that did it!
Murrah Boswell
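For the archive, the working combination in one piece (the template
file name is invented):

use Text::Template;

my @items = qw(This is Cool);

# the template sees @items because the key "items" maps to an
# array reference
my $template = Text::Template->new(TYPE => 'FILE', SOURCE => 'page.tmpl')
    or die "couldn't create template: $Text::Template::ERROR";
my $result = $template->fill_in(HASH => { items => \@items });
print $result;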
------------------------------
Date: Sun, 04 Feb 2001 23:20:26 GMT
From: "Tim Gaverth" <gaverth@home.com>
Subject: This is driving me nuts and I need a guru
Message-Id: <_slf6.65678$B6.17421359@news1.rdc1.md.home.com>
My primary occupation is to maintain and advance a single web based
application. It's actually pretty big. It is housed on a dual cpu Sun E3500
with Solaris 2.6, and uses perl5.005_3, Sybase ASE 11.5 (with a single
engine, admittedly not optimized for SMP architecture), dblib, Netscape
Directory Server 3, and Netscape Enterprise 4. Until a week or so ago, the
perl part was pretty much straight perl, with a script and corresponding .pm
for each screen. Since last July, we've been converting the old app to an
object oriented perl model, and added a glob of new functionality as
requested by our customer. We developed this on a single cpu Sun Sparc 20,
Sol2.6, same everything else. We also deployed the new oo version to a
separate single cpu Sun Ultra 5, same software setup, for customer testing
prior to deployment to E3500. Development environment had a max of 5
concurrent users, testing had maybe 10, production is about 30 concurrent
web users. The perl code uses only 2 db logins, one for read-only, another
for r/w. The httpd user is a separate user and doesn't have a db login. Users,
authenticated through NS Directory Server, don't have individual db logins.
I think that covers the setup and background. Oh, we built perl with
gcc2.8.1 without the "usethreads" config option, and sybase is configured to
use tcpip connections only, albeit only from localhost.
The problem is, on the production server (E3500 SMP) only, users are
getting http server error messages, and the error log shows "attempt to
initiate sql server operation with results pending". This occurs globally,
on all screens of the app, and it happens sporadically. By that, I mean a
user can click a submit button, get the error, go back, and it works. I've
been able to determine, through the correlation of several different logs on
the server, that the error occurs when one user's process (from a screen,
call it "screen4") is storing (insert) a new record to the db (which fires a
db trigger for a couple of "update" operations, and another user (separate
app login, separate workstation) submits the form from "screen4" on their
browser. It looks like the whole setup has forgotten that it's a
multi-user system.
I've seen this particular error before, when we didn't write the code
properly in a method or function with nested db handles. I really don't
think that's the problem in this case, as I've spent several days ensuring
those are correct. I've also seen it when we didn't write the code properly
in a method or function so that the receiving variable/hash/array didn't
fully process the return from the db. I've gone as far as to loop through
the results from every call to the db, even if only a single value is
returned. Still get the error.
I guess my questions are: Does this sound absolutely ridiculous? Is it
possible that Solaris is (improperly) threading the processes, so that a
user's process may be concurrently processing the hash returned by the db
call and opening a new db handle? Have you ever heard of anything like this?
Is Solaris/SMP/web/sybase/oo perl a bad combination?
--
Aloha,
Tim
------------------------------
Date: Mon, 05 Feb 2001 01:37:21 GMT
From: Bart Lateur <bart.lateur@skynet.be>
Subject: Re: This is driving me nuts and I need a guru
Message-Id: <ku0s7tccmv86ommtrvh8u7cve81cj43do3@4ax.com>
Tim Gaverth wrote:
>The problem is, on the production server (E3500 SMP) only, users are
>getting http server error messages, and the error log shows "attempt to
>initiate sql server operation with results pending". This occurs globally,
>on all screens of the app, and it happens sporadically. By that, I mean a
>user can click a submit button, get the error, go back, and it works.
Could it be a concurrency problem? Two users using the same database
access route at the same time?
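If it is the classic dblib complaint, it usually means a handle was
reused before its previous results were fully drained. The usual
Sybase::DBlib pattern (sketched from memory and untested against this
setup; query and credentials are placeholders):

use Sybase::DBlib;

my $dbh = new Sybase::DBlib 'user', 'password';
$dbh->dbcmd('select * from some_table');
$dbh->dbsqlexec;

# drain *every* result set before the handle is reused; rows left
# pending are what trigger "results pending" on the next operation
while ($dbh->dbresults != NO_MORE_RESULTS) {
    while (my @row = $dbh->dbnextrow) {
        # process @row
    }
}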
--
Bart.
------------------------------
Date: 16 Sep 99 21:33:47 GMT (Last modified)
From: Perl-Users-Request@ruby.oce.orst.edu (Perl-Users-Digest Admin)
Subject: Digest Administrivia (Last modified: 16 Sep 99)
Message-Id: <null>
Administrivia:
The Perl-Users Digest is a retransmission of the USENET newsgroup
comp.lang.perl.misc. For subscription or unsubscription requests, send
the single line:
subscribe perl-users
or:
unsubscribe perl-users
to almanac@ruby.oce.orst.edu.
| NOTE: The mail to news gateway, and thus the ability to submit articles
| through this service to the newsgroup, has been removed. I do not have
| time to individually vet each article to make sure that someone isn't
| abusing the service, and I no longer have any desire to waste my time
| dealing with the campus admins when some fool complains to them about an
| article that has come through the gateway instead of complaining
| to the source.
To submit articles to comp.lang.perl.announce, send your article to
clpa@perl.com.
To request back copies (available for a week or so), send your request
to almanac@ruby.oce.orst.edu with the command "send perl-users x.y",
where x is the volume number and y is the issue number.
For other requests pertaining to the digest, send mail to
perl-users-request@ruby.oce.orst.edu. Do not waste your time or mine
sending perl questions to the -request address; I don't have time to
answer them even if I knew the answer.
------------------------------
End of Perl-Users Digest V10 Issue 211
**************************************