[30810] in Perl-Users-Digest


Perl-Users Digest, Issue: 2055 Volume: 11

daemon@ATHENA.MIT.EDU (Perl-Users Digest)
Sun Dec 14 11:09:47 2008

Date: Sun, 14 Dec 2008 08:09:10 -0800 (PST)
From: Perl-Users Digest <Perl-Users-Request@ruby.OCE.ORST.EDU>
To: Perl-Users@ruby.OCE.ORST.EDU (Perl-Users Digest)

Perl-Users Digest           Sun, 14 Dec 2008     Volume: 11 Number: 2055

Today's topics:
    Re: I need help with PDF::API2 to make a PDF file navig <hjp-usenet2@hjp.at>
    Re: MIME::Lite -- Add attribute to part's Content-Type <jerry@ieee.org>
        new CPAN modules on Sun Dec 14 2008 (Randal Schwartz)
    Re: Noob trying to understand simple Perl grep statemen (Greg Bacon)
    Re: Noob trying to understand simple Perl grep statemen <tim@burlyhost.com>
    Re: Noob trying to understand simple Perl grep statemen (Greg Bacon)
        opening a file <rihad@mail.ru>
    Re: opening a file <dmwREMOVEUPPERCASE@coder.cl>
    Re: opening a file <klaus03@gmail.com>
    Re: Processing Multiple Large Files <1usa@llenroc.ude.invalid>
    Re: Processing Multiple Large Files <tim@burlyhost.com>
    Re: Processing Multiple Large Files sln@netherlands.com
    Re: Processing Multiple Large Files sln@netherlands.com
    Re: Processing Multiple Large Files <tim@burlyhost.com>
        Digest Administrivia (Last modified: 6 Apr 01) (Perl-Users-Digest Admin)

----------------------------------------------------------------------

Date: Sun, 14 Dec 2008 11:47:26 +0100
From: "Peter J. Holzer" <hjp-usenet2@hjp.at>
Subject: Re: I need help with PDF::API2 to make a PDF file navigation aide
Message-Id: <slrngk9p1v.bnq.hjp-usenet2@hrunkner.hjp.at>

On 2008-12-11 22:30, Ted Byers <r.ted.byers@gmail.com> wrote:
> I have used this, along with a couple other of the PDF modules, to
> create what are now rather large PDF files.  They are a bit tedious to
> scroll through, so what I want to do now is create something like a
> table of contents that is always displayed in a narrow strip along the
> left margin that allows the reader to simply select an item in that
> window and have the corresponding page appear in the main window.  But
> unlike a conventional table of contents, it would never appear at the
> beginning of the document in the main window (and unlike an index, it
> would never appear at the end of the document).
>
> What is the normal term for this 'object' in the PDF documentation?

I think in PDF::API2 it's called an "outline", although elsewhere the
term "bookmark" is more common.

> How do I create it (or is it always present even if not visible, in a
> PDF file created by new('filename.pdf')), and how do I specify that
> this page, but not that page, should be linked to it with this title?

perldoc PDF::API2
perldoc PDF::API2::Outlines
perldoc PDF::API2::Outline

Unfortunately, the docs are very terse.
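For what it's worth, here is a minimal sketch of that API, going by the
PDF::API2::Outline docs (untested; 'filename.pdf' and the section titles
are placeholders):

```perl
use strict;
use warnings;
use PDF::API2;   # CPAN module

# Open an existing document and add one outline ("bookmark") entry per
# page.  Entries show up in the viewer's navigation panel and jump to
# the destination page when clicked.
my $pdf = PDF::API2->open('filename.pdf');

my $outlines = $pdf->outlines;           # root of the outline tree
for my $i (1 .. $pdf->pages) {
    my $item = $outlines->outline;       # add a child entry
    $item->title("Section $i");          # label shown in the panel
    $item->dest($pdf->openpage($i));     # page the entry jumps to
}

$pdf->saveas('with-outline.pdf');
```

Whether the panel is visible on open is a viewer preference;
$pdf->preferences(-outlines => 1) before saving may request it, though
viewers vary.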

	hp


------------------------------

Date: Sat, 13 Dec 2008 21:11:37 -0800 (PST)
From: Jerry Krinock <jerry@ieee.org>
Subject: Re: MIME::Lite -- Add attribute to part's Content-Type
Message-Id: <cee05145-4364-4075-a27b-c1cb40e3c758@s9g2000prm.googlegroups.com>

On Dec 13, 4:00 am, Todd Wade <waveri...@gmail.com> wrote:
> Just looking at what you've got here, I'd try:
>   Type => 'application/zip; x-mac-auto-archive=yes'

Thanks for the great guess, Todd.  It WORKS.

Can anyone interpret that documentation [1] to say that Todd's trick
is supported, or suggest a supported method?  Otherwise, I'll submit a
bug on the documentation, asking to please officially support Todd's
trick.

Jerry Krinock

Looking at the documentation further, it appears that the 'attr'
method [2] is provided to set attributes in header fields, but I can't
get it to work.

First Attempt: I invoked attr on the whole message, as in the example
given in the documentation.  This doesn't make sense, because I want
this attribute applied only to one part.  But I tried it anyhow:

    $msg->attach(
        Type     => 'application/zip',
        Path     => $myPath,
        Filename => "MyScript.app.zip"
    ) ;
    $msg->attr("x-mac-auto-archive" => "yes") ;

Result: Failed.  x-mac-auto-archive does not appear anywhere in the
message.

Second Attempt: Assuming that attach() would return the part (which is
undocumented), I tried to invoke 'attr' on the part.  At least this
makes sense:

    my $fileAttachmentPart = $msg->attach(
        Type     => 'application/zip',
        Path     => $myScriptPath,
        Filename => "MyScript.app.zip"
    ) ;
    $fileAttachmentPart->attr("x-mac-auto-archive" => "yes");

Result: Failed.  I get the attribute as a header field,
    X-Mac-Auto-Archive: yes
which does not have the desired effect on Apple's Mail.app.
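For comparison, Todd's workaround in full, sketched (addresses and the
zip path are made up; untested):

```perl
use strict;
use warnings;
use MIME::Lite;   # CPAN module

# The extra attribute is passed as part of the Type string itself, so
# it lands verbatim in the part's Content-Type header rather than as a
# separate X- header.
my $msg = MIME::Lite->new(
    From    => 'me@example.com',
    To      => 'you@example.com',
    Subject => 'script attached',
    Type    => 'multipart/mixed',
);
$msg->attach(
    Type     => 'application/zip; x-mac-auto-archive=yes',
    Path     => '/tmp/MyScript.app.zip',
    Filename => 'MyScript.app.zip',
);
print $msg->as_string;
```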


REFERENCES

[1] http://search.cpan.org/~rjbs/MIME-Lite-3.023/lib/MIME/Lite.pm#Content_types

[2] attr ATTR,[VALUE]
Instance method. Set MIME attribute ATTR to the string VALUE. ATTR is
converted to all-lowercase. This method is normally used to set/get
MIME attributes:

    $msg->attr("content-type"         => "text/html");
    $msg->attr("content-type.charset" => "US-ASCII");
    $msg->attr("content-type.name"    => "homepage.html");

This would cause the final output to look something like this:
    Content-type: text/html; charset=US-ASCII; name="homepage.html"


------------------------------

Date: Sun, 14 Dec 2008 05:42:25 GMT
From: merlyn@stonehenge.com (Randal Schwartz)
Subject: new CPAN modules on Sun Dec 14 2008
Message-Id: <KBuqIp.107H@zorch.sf-bay.org>

The following modules have recently been added to or updated in the
Comprehensive Perl Archive Network (CPAN).  You can install them using the
instructions in the 'perlmodinstall' page included with your Perl
distribution.

Acme-CPANAuthors-Taiwanese-0.02
http://search.cpan.org/~gugod/Acme-CPANAuthors-Taiwanese-0.02/
We are Taiwanese CPAN Authors! 
----
Apache2-Mojo-0.003
http://search.cpan.org/~uvoelker/Apache2-Mojo-0.003/
mod_perl2 handler for Mojo 
----
App-ZofCMS-Plugin-SplitPriceSelect-0.0101
http://search.cpan.org/~zoffix/App-ZofCMS-Plugin-SplitPriceSelect-0.0101/
plugin for generating a <select> for "price range" out of arbitrary range of prices. 
----
App-ZofCMS-Plugin-SplitPriceSelect-0.0102
http://search.cpan.org/~zoffix/App-ZofCMS-Plugin-SplitPriceSelect-0.0102/
plugin for generating a <select> for "price range" out of arbitrary range of prices. 
----
App-ZofCMS-Plugin-YouTube-0.0102
http://search.cpan.org/~zoffix/App-ZofCMS-Plugin-YouTube-0.0102/
CRUD-type plugin to manage YouTube videos 
----
App-ZofCMS-Plugin-YouTube-0.0103
http://search.cpan.org/~zoffix/App-ZofCMS-Plugin-YouTube-0.0103/
CRUD-type plugin to manage YouTube videos 
----
Archive-Tar-1.42
http://search.cpan.org/~kane/Archive-Tar-1.42/
module for manipulations of tar archives 
----
Catalyst-View-TT-XHTML-1.002
http://search.cpan.org/~bobtfish/Catalyst-View-TT-XHTML-1.002/
A sub-class of the standard TT view which serves application/xhtml+xml content if the browser accepts it. 
----
Config-Multi-0.07
http://search.cpan.org/~taro/Config-Multi-0.07/
load multiple config files. 
----
DBD-Pg-2.11.7
http://search.cpan.org/~turnstep/DBD-Pg-2.11.7/
PostgreSQL database driver for the DBI module 
----
Devel-Tokenizer-C-0.08
http://search.cpan.org/~mhx/Devel-Tokenizer-C-0.08/
Generate C source for fast keyword tokenizer 
----
Finance-Bank-SCSB-TW-0.10
http://search.cpan.org/~gugod/Finance-Bank-SCSB-TW-0.10/
Check Taiwan SCSB bank info 
----
Finance-Bank-SCSB-TW-0.11
http://search.cpan.org/~gugod/Finance-Bank-SCSB-TW-0.11/
Check Taiwan SCSB bank info 
----
FormValidator-LazyWay-0.01
http://search.cpan.org/~taro/FormValidator-LazyWay-0.01/
Yet Another Form Validator 
----
FormValidator-LazyWay-0.02
http://search.cpan.org/~taro/FormValidator-LazyWay-0.02/
Yet Another Form Validator 
----
Guard-0.1
http://search.cpan.org/~mlehmann/Guard-0.1/
safe cleanup blocks 
----
Guard-0.5
http://search.cpan.org/~mlehmann/Guard-0.5/
safe cleanup blocks 
----
IPC-PubSub-0.29
http://search.cpan.org/~alexmv/IPC-PubSub-0.29/
Interprocess Publish/Subscribe channels 
----
Kephra-0.4.0_5
http://search.cpan.org/~lichtkind/Kephra-0.4.0_5/
cross-platform GUI text editor along Perl-like paradigms 
----
Lingua-Identify-0.21
http://search.cpan.org/~ambs/Lingua-Identify-0.21/
Language identification 
----
Lingua-JA-Yomi-0.01
http://search.cpan.org/~mash/Lingua-JA-Yomi-0.01/
convert English into Japanese katakana 
----
Log-Message-0.02
http://search.cpan.org/~kane/Log-Message-0.02/
A generic message storing mechanism; 
----
Module-Loaded-0.02
http://search.cpan.org/~kane/Module-Loaded-0.02/
mark modules as loaded or unloaded 
----
MojoX-Fixup-XHTML-0.01
http://search.cpan.org/~fayland/MojoX-Fixup-XHTML-0.01/
serves application/xhtml+xml content for Mojo 
----
Nagios-Plugin-0.30
http://search.cpan.org/~tonvoon/Nagios-Plugin-0.30/
A family of perl modules to streamline writing Nagios plugins 
----
Net-Amazon-S3-ACL-0.0.1_00
http://search.cpan.org/~polettix/Net-Amazon-S3-ACL-0.0.1_00/
work with Amazon S3 Access Control Lists 
----
Net-Amazon-S3-ACL-0.0.1_01
http://search.cpan.org/~polettix/Net-Amazon-S3-ACL-0.0.1_01/
work with Amazon S3 Access Control Lists 
----
Net-OAuth-0.14
http://search.cpan.org/~kgrennan/Net-OAuth-0.14/
OAuth protocol support 
----
POE-Devel-Benchmarker-0.01
http://search.cpan.org/~apocal/POE-Devel-Benchmarker-0.01/
Benchmarking POE's performance ( acts more like a smoker ) 
----
POE-Devel-Benchmarker-0.02
http://search.cpan.org/~apocal/POE-Devel-Benchmarker-0.02/
Benchmarking POE's performance ( acts more like a smoker ) 
----
POE-Devel-Benchmarker-0.03
http://search.cpan.org/~apocal/POE-Devel-Benchmarker-0.03/
Benchmarking POE's performance ( acts more like a smoker ) 
----
Package-Constants-0.02
http://search.cpan.org/~kane/Package-Constants-0.02/
List all constants declared in a package 
----
Parse-Marpa-0.222000
http://search.cpan.org/~jkegl/Parse-Marpa-0.222000/
Generate Parsers from any BNF grammar 
----
Passwd-Unix-0.45
http://search.cpan.org/~strzelec/Passwd-Unix-0.45/
----
Rose-HTML-Objects-0.600
http://search.cpan.org/~jsiracusa/Rose-HTML-Objects-0.600/
Object-oriented interfaces for HTML. 
----
Syntax-Highlight-Perl6-0.023
http://search.cpan.org/~azawawi/Syntax-Highlight-Perl6-0.023/
a Perl 6 syntax highlighter 
----
Syntax-Highlight-Perl6-0.024
http://search.cpan.org/~azawawi/Syntax-Highlight-Perl6-0.024/
a Perl 6 syntax highlighter 
----
Text-MicroTemplate-0.01
http://search.cpan.org/~kazuho/Text-MicroTemplate-0.01/
----
Tk-Airports-0.06
http://search.cpan.org/~reneeb/Tk-Airports-0.06/
A widget to select airports 
----
Tk-Airports-0.061
http://search.cpan.org/~reneeb/Tk-Airports-0.061/
A widget to select airports 
----
VirtualBox-Manage-0.0.0
http://search.cpan.org/~apeiron/VirtualBox-Manage-0.0.0/
an API for managing VirtualBox VMs 
----
WWW-Translate-Apertium-0.08
http://search.cpan.org/~enell/WWW-Translate-Apertium-0.08/
Open source machine translation 
----
Win32-FileNotify-0.2
http://search.cpan.org/~reneeb/Win32-FileNotify-0.2/
Monitor file changes 
----
Wx-Perl-Dialog-0.04
http://search.cpan.org/~szabgab/Wx-Perl-Dialog-0.04/
Abstract dialog class for simple dialog creation 
----
Wx-Perl-PodEditor-0.03
http://search.cpan.org/~reneeb/Wx-Perl-PodEditor-0.03/
A RichText Ctrl for creating Pod 


If you're an author of one of these modules, please submit a detailed
announcement to comp.lang.perl.announce, and we'll pass it along.

This message was generated by a Perl program described in my Linux
Magazine column, which can be found on-line (along with more than
200 other freely available past column articles) at
  http://www.stonehenge.com/merlyn/LinuxMag/col82.html

print "Just another Perl hacker," # the original

--
Randal L. Schwartz - Stonehenge Consulting Services, Inc. - +1 503 777 0095
<merlyn@stonehenge.com> <URL:http://www.stonehenge.com/merlyn/>
Smalltalk/Perl/Unix consulting, Technical writing, Comedy, etc. etc.
See http://methodsandmessages.vox.com/ for Smalltalk and Seaside discussion


------------------------------

Date: Sat, 13 Dec 2008 22:09:53 -0600
From: gbacon@hiwaay.net (Greg Bacon)
Subject: Re: Noob trying to understand simple Perl grep statement
Message-Id: <1dOdnVtvJr-MGtnUnZ2dnUVZ_rjinZ2d@posted.hiwaay2>

In article <86skp05d7o.fsf@blue.stonehenge.com>,
    Randal L. Schwartz <merlyn@stonehenge.com> wrote:

: I still regret inventing JAPHs, which encouraged obfuperl, and in
: some sense was the motivation for Perl golf.  All three items
: today harm Perl's public perception more than they help, and I'm
: sad for that.

Don't be a puritan. Perl is fun, and that's great! Snobs who look
down their noses at jam sessions aren't worth worrying about.

Greg
-- 
If you love the state, you necessarily love war.
    -- Karen Kwiatkowski, Lt. Col. USAF (ret.)


------------------------------

Date: Sat, 13 Dec 2008 20:25:52 -0800
From: Tim Greer <tim@burlyhost.com>
Subject: Re: Noob trying to understand simple Perl grep statement
Message-Id: <lV%0l.2600$iY3.1516@newsfe14.iad>

Greg Bacon wrote:

> In article <86skp05d7o.fsf@blue.stonehenge.com>,
>     Randal L. Schwartz <merlyn@stonehenge.com> wrote:
> 
> : I still regret inventing JAPHs, which encouraged obfuperl, and in
> : some sense was the motivation for Perl golf.  All three items
> : today harm Perl's public perception more than they help, and I'm
> : sad for that.
> 
> Don't be a puritan. Perl is fun, and that's great! Snobs who look
> down their noses at jam sessions aren't worth worrying about.
> 
> Greg

I'm not defending Randal (I'm certain he can handle himself), but I
don't think you got what he meant.  One of the most played-out excuses
for why Perl isn't seen in a good light by the (ignorant portion of
the) public is the obfuperl aspect.  Also, Perl wouldn't be nearly as
well known, fun, or useful without people like him.  I don't think he
has anything to regret, but it is a valid point.
-- 
Tim Greer, CEO/Founder/CTO, BurlyHost.com, Inc.
Shared Hosting, Reseller Hosting, Dedicated & Semi-Dedicated servers
and Custom Hosting.  24/7 support, 30 day guarantee, secure servers.
Industry's most experienced staff! -- Web Hosting With Muscle!


------------------------------

Date: Sun, 14 Dec 2008 07:49:47 -0600
From: gbacon@hiwaay.net (Greg Bacon)
Subject: Re: Noob trying to understand simple Perl grep statement
Message-Id: <gtSdncWFiOxmk9jUnZ2dnUVZ_jGdnZ2d@posted.hiwaay2>

In article <lV%0l.2600$iY3.1516@newsfe14.iad>,
    Tim Greer  <tim@burlyhost.com> wrote:

: I'm not defending Randal (I'm certain he can handle himself), but I
: don't think you got what he meant by what he said.

I understood him just fine, thanks.

:                                                    One of the most
: played out excuses of why Perl isn't seen in a good light with the
: (ignorant portion of the) public, is because of the obfuperl
: aspect. [...]

Worrying about people who *choose* to be ignorant, as with any
form of bigotry, is a complete waste of time. Don't try to teach
a pig to dance, as the saying goes.

Greg
-- 
The common man will lose. He always loses when fraud is legalized by the
government. The common man wins only when markets are free, contracts are
enforced, and fraud is prosecuted.
    -- Gary North


------------------------------

Date: Sun, 14 Dec 2008 05:36:55 -0800 (PST)
From: rihad <rihad@mail.ru>
Subject: opening a file
Message-Id: <7725e24a-bc4e-44b2-91b3-04b0f1542f97@r37g2000prr.googlegroups.com>

This function is from a daemon (long-lived process):

sub foo($) {
        my ($command) = shift;

       BEGIN {
                open(FOO, ">/var/tmp/foo") or return;
                my $old_fh = select(FOO);
                $| = 1;
                select($old_fh);
      }

        print FOO "$command\n";

       END {
                close(FOO);
       }
}

/var/tmp/foo is a FIFO (mkfifo). It also has a reader daemon (not
shown).
Now when foo('blah-blah-blah') is called, nothing gets written to the
FIFO. Even if I call it several times in succession. I need to stop
the Perl daemon to see them finally output. If I get rid of the BEGIN/
END surroundings, everything works, but opening and closing the fifo
every time is a performance hit for me. Maybe I'm not using autoflush
and select in BEGIN properly? Please help, as I'm not experienced with
Perl.

Perl v5.8.8
FreeBSD 7.0


------------------------------

Date: Sun, 14 Dec 2008 11:29:16 -0300
From: Daniel Molina Wegener <dmwREMOVEUPPERCASE@coder.cl>
Subject: Re: opening a file
Message-Id: <7KydnemJKJvmhdjUnZ2dnUVZ_g6dnZ2d@giganews.com>


rihad <rihad@mail.ru>
on Sunday 14 December 2008 10:36
wrote in comp.lang.perl.misc:


> This function is from a daemon (long-lived process):
> 
> sub foo($) {
>         my ($command) = shift;
> 
>        BEGIN {
>                 open(FOO, ">/var/tmp/foo") or return;
>                 my $old_fh = select(FOO);
>                 $| = 1;
>                 select($old_fh);
>       }
> 
>         print FOO "$command\n";
> 
>        END {
>                 close(FOO);
>        }
> }
> 
> /var/tmp/foo is a FIFO (mkfifo). It also has a reader daemon (not
> shown).
> Now when foo('blah-blah-blah') is called, nothing gets written to the
> FIFO. Even if I call it several times in succession. I need to stop
> the Perl daemon to see them finally output. If I get rid of the BEGIN/
> END surroundings, everything works, but opening and closing the fifo
> every time is a performance hit for me. Maybe I'm not using autoflush
> and select in BEGIN properly? Please help, as I'm not experienced with
> Perl.

  Did you try using:

open(FOO, ">> /var/tmp/foo") or return;

  Instead of:

open(FOO, "> /var/tmp/foo") or return;

> 
> Perl v5.8.8
> FreeBSD 7.0

Best regards,
-- 
 .O. | Daniel Molina Wegener   | FreeBSD & Linux
 ..O | dmw [at] coder [dot] cl | Open Standards
 OOO | http://coder.cl/        | FOSS Developer



------------------------------

Date: Sun, 14 Dec 2008 06:58:07 -0800 (PST)
From: Klaus <klaus03@gmail.com>
Subject: Re: opening a file
Message-Id: <92cfb17a-50b4-4bc6-86b2-2bffb1b0bd9b@d36g2000prf.googlegroups.com>

On Dec 14, 2:36 pm, rihad <ri...@mail.ru> wrote:

> Now when foo('blah-blah-blah') is called, nothing gets written

> open(FOO, ">/var/tmp/foo") or return;
open FOO, '>', '/var/tmp/foo' or die "Can't open > /var/tmp/foo because $!";

> print FOO "$command\n";
print FOO "$command\n" or die "Can't print FOO '$command' because $!";

> close(FOO);
close(FOO) or die "Can't close FOO because $!";
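Putting those pieces together, here's a sketch of the open-once
behaviour the OP was after: BEGIN and END blocks run at compile time
and at program exit, not per call, so the persistent handle has to
live in ordinary code.  A plain temp file stands in for the FIFO, and
the path is made up for the example:

```perl
use strict;
use warnings;

my $path = "/tmp/foo-demo.$$";   # stand-in for the OP's /var/tmp/foo

my $fh;                          # kept open for the life of the daemon
sub foo {
    my ($command) = @_;
    unless ($fh) {               # open once, on first use
        open $fh, '>', $path or die "Can't open $path: $!";
        select((select($fh), $| = 1)[0]);   # autoflush this handle
    }
    print {$fh} "$command\n" or die "Can't write to $path: $!";
}

foo('blah-blah-blah');
foo('second command');
```

With autoflush on, each print reaches the file immediately, without
reopening.  One FIFO-specific caveat: opening a FIFO for writing
blocks until a reader has it open, so the first call can hang if the
reader daemon is down.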

--
Klaus


------------------------------

Date: Sun, 14 Dec 2008 02:23:06 GMT
From: "A. Sinan Unur" <1usa@llenroc.ude.invalid>
Subject: Re: Processing Multiple Large Files
Message-Id: <Xns9B73D989478AAasu1cornelledu@127.0.0.1>

xhoster@gmail.com wrote in news:20081213192853.428$mr@newsreader.com:

> "Peter J. Holzer" <hjp-usenet2@hjp.at> wrote:
>> On 2008-12-12 13:09, A. Sinan Unur <1usa@llenroc.ude.invalid> wrote:
>> >
 ...
>> >> In fact, I should probably have used threads on Windows. Anyway,
>> >> I'll boot into Linux and see if the returns there are greater.
>> >
>> > Hmmm ... I tried it on ArchLinux using perl from the repository on
>> > the exact same hardware as the Windows tests:
>> >
>> > [sinan@archardy large]$ time perl process.pl 0
>> >
>> > real    0m29.983s
>> > user    0m29.848s
>> > sys     0m0.073s
>> >
>> > [sinan@archardy large]$ time perl process.pl 2
>> >
>> > real    0m15.281s
>> > user    0m29.865s
>> > sys     0m0.077s
>> >
>> > with no changes going to 4, 8, 16 or 20 max instances. Exact same
>> > program and data on the same hardware, yet the no fork version was
>> > 40% faster.
>>
>> Where do you get this 40% figure from? As far as I can see the
>> forking version is almost exactly 100% faster (0m15.281s instead of
>> 0m29.983s) than the non-forking version.
> 
> 
> I assumed he was comparing Linux to Windows, not within linux.

A very astute observation ;-)

My purpose was to show to the OP how to test if forking etc could 
provide performance gains. I did not think so (I did say "you are going 
to run into IO bottlenecks before you run into CPU bottlenecks").

I was still astonished by the fact that the exact same Perl program, 
with the exact same data, on the exact same hardware, being run under 
the latest available perl binary for each platform, was faster in 
ArchLinux than in Windows XP. 
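For anyone following along, the pattern being timed (a forked worker
pool capped at N children) can be sketched like this; the jobs and the
temp directory are stand-ins I made up:

```perl
use strict;
use warnings;

my @jobs = (1 .. 4);          # stand-ins for the large files
my $max  = 2;                 # like "max instances" in the timings above
my $dir  = "/tmp/forkdemo.$$";
mkdir $dir or die "mkdir $dir: $!";

my %kids;
for my $job (@jobs) {
    # Block until a child exits whenever the pool is full.
    delete $kids{ wait() } while keys(%kids) >= $max;

    defined(my $pid = fork()) or die "fork: $!";
    if ($pid == 0) {          # child: do the "processing", then exit
        open my $fh, '>', "$dir/$job" or die "open: $!";
        print {$fh} $job * $job, "\n";
        close $fh;
        exit 0;
    }
    $kids{$pid} = 1;          # parent: remember the child
}
1 while wait() != -1;         # reap whatever is still running

# Gather the children's results in the parent.
my $sum = 0;
for my $job (@jobs) {
    open my $fh, '<', "$dir/$job" or die "open: $!";
    $sum += <$fh>;
    close $fh;
}
print "$sum\n";               # 1 + 4 + 9 + 16
```

As the timings show, this only helps while the work is CPU-bound; once
the disks are the bottleneck, extra children just contend for IO.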

Sinan

-- 
A. Sinan Unur <1usa@llenroc.ude.invalid>
(remove .invalid and reverse each component for email address)

comp.lang.perl.misc guidelines on the WWW:
http://www.rehabitation.com/clpmisc/


------------------------------

Date: Sat, 13 Dec 2008 20:29:46 -0800
From: Tim Greer <tim@burlyhost.com>
Subject: Re: Processing Multiple Large Files
Message-Id: <%Y%0l.2610$iY3.1655@newsfe14.iad>

sln@netherlands.com wrote:

> If the files are to be read across a real 1 Gigabit network, just
> reading the data will take about 10 minutes (I think, or about 600
> seconds). A Gigabit ethernet, theoretically can transmit 100
> MB/second, if its cache is big enough. But that includes packetizing
> data and protocol ack/nak's. So, in reality, it's about 50 MB/second.

By all accounts, this aspect is probably irrelevant, as there was no
mention of needing to transfer the data across a network.  If this is
the case, I'm hoping the OP mentions it and any other aspect that could
play a potential role.  Still, I'm pretty certain they mean they'll
process the data on the system the data resides on, or else they'll
transfer it to another system and then process the data there.
Otherwise, there are certainly other aspects to consider.
-- 
Tim Greer, CEO/Founder/CTO, BurlyHost.com, Inc.
Shared Hosting, Reseller Hosting, Dedicated & Semi-Dedicated servers
and Custom Hosting.  24/7 support, 30 day guarantee, secure servers.
Industry's most experienced staff! -- Web Hosting With Muscle!


------------------------------

Date: Sun, 14 Dec 2008 06:34:27 GMT
From: sln@netherlands.com
Subject: Re: Processing Multiple Large Files
Message-Id: <n6a9k4po4ak82omtq4oqf9i8dgslhgapca@4ax.com>

On Sat, 13 Dec 2008 20:29:46 -0800, Tim Greer <tim@burlyhost.com> wrote:

>sln@netherlands.com wrote:
>
>> If the files are to be read across a real 1 Gigabit network, just
>> reading the data will take about 10 minutes (I think, or about 600
>> seconds). A Gigabit ethernet, theoretically can transmit 100
>> MB/second, if its cache is big enough. But that includes packetizing
>> data and protocol ack/nak's. So, in reality, it's about 50 MB/second.
>
>By all accounts, this aspect is probably irrelevant, as there was no
>mention of needing to transfer the data across a network.  If this is
>the case, I'm hoping the OP mentions it and any other aspect that could
>play a potential role.  Still, I'm pretty certain they mean they'll
>process the data on the system the data resides on, or else they'll
>transfer it to another system and then process the data on the system
>the data is then (now) on.  Otherwise, there are certainly other
>aspects to consider, to be sure.
 OP:
"Hi,

I analyzing some netwokr log files. There are around ...
"

sln



------------------------------

Date: Sun, 14 Dec 2008 08:06:28 GMT
From: sln@netherlands.com
Subject: Re: Processing Multiple Large Files
Message-Id: <lgf9k4dk3fshrdv6tpqvhi2drmogljk9rv@4ax.com>

On Sun, 14 Dec 2008 06:34:27 GMT, sln@netherlands.com wrote:

>On Sat, 13 Dec 2008 20:29:46 -0800, Tim Greer <tim@burlyhost.com> wrote:
>
>>sln@netherlands.com wrote:
>>
>>> If the files are to be read across a real 1 Gigabit network, just
>>> reading the data will take about 10 minutes (I think, or about 600
>>> seconds). A Gigabit ethernet, theoretically can transmit 100
>>> MB/second, if its cache is big enough. But that includes packetizing
>>> data and protocol ack/nak's. So, in reality, it's about 50 MB/second.
>>
>>By all accounts, this aspect is probably irrelevant, as there was no
>>mention of needing to transfer the data across a network.  If this is
>>the case, I'm hoping the OP mentions it and any other aspect that could
>>play a potential role.  Still, I'm pretty certain they mean they'll
>>process the data on the system the data resides on, or else they'll
>>transfer it to another system and then process the data there.
>>Otherwise, there are certainly other aspects to consider.
> OP:
>"Hi,
>
>I analyzing some netwokr log files. There are around ...
>"
>
>sln

In Chinese, this translates to "I got your US job files, 
no need to keep your workers, fire those bastards and join
the Communist revolution"

sln



------------------------------

Date: Sun, 14 Dec 2008 00:26:48 -0800
From: Tim Greer <tim@burlyhost.com>
Subject: Re: Processing Multiple Large Files
Message-Id: <cr31l.6$a01.0@newsfe03.iad>

sln@netherlands.com wrote:

> "Hi,
> 
> I analyzing some netwokr log files. There are around ...
> "
> 

I didn't get the impression that meant the large preexisting logs needed
to be transfered or read over the network as they were processed, but
people have done stranger things, I suppose. :-)
-- 
Tim Greer, CEO/Founder/CTO, BurlyHost.com, Inc.
Shared Hosting, Reseller Hosting, Dedicated & Semi-Dedicated servers
and Custom Hosting.  24/7 support, 30 day guarantee, secure servers.
Industry's most experienced staff! -- Web Hosting With Muscle!


------------------------------

Date: 6 Apr 2001 21:33:47 GMT (Last modified)
From: Perl-Users-Request@ruby.oce.orst.edu (Perl-Users-Digest Admin) 
Subject: Digest Administrivia (Last modified: 6 Apr 01)
Message-Id: <null>


Administrivia:

#The Perl-Users Digest is a retransmission of the USENET newsgroup
#comp.lang.perl.misc.  For subscription or unsubscription requests, send
#the single line:
#
#	subscribe perl-users
#or:
#	unsubscribe perl-users
#
#to almanac@ruby.oce.orst.edu.  

NOTE: due to the current flood of worm email banging on ruby, the smtp
server on ruby has been shut off until further notice. 

To submit articles to comp.lang.perl.announce, send your article to
clpa@perl.com.

#To request back copies (available for a week or so), send your request
#to almanac@ruby.oce.orst.edu with the command "send perl-users x.y",
#where x is the volume number and y is the issue number.

#For other requests pertaining to the digest, send mail to
#perl-users-request@ruby.oce.orst.edu. Do not waste your time or mine
#sending perl questions to the -request address, I don't have time to
#answer them even if I did know the answer.


------------------------------
End of Perl-Users Digest V11 Issue 2055
***************************************

