

To perl or not to perl?

daemon@ATHENA.MIT.EDU (Noah Friedman)
Thu Nov 19 14:37:27 1992

From: friedman@gnu.ai.mit.edu (Noah Friedman)
To: adamm@inset.com
Cc: bblisa@inset.com
Reply-To: friedman@gnu.ai.mit.edu
In-Reply-To: <9211191525.AA16763@garlic.inset.com> (adamm@inset.com)
Date: Thu, 19 Nov 92 13:49:11 EST

>I would like to start a rational discussion of perl versus the traditional
>tools.

   Good luck. :-)

   Before I say any more, let me just comment that I don't particularly
like perl.  I don't want to argue about why: it's a value judgement about
how much inconsistency, redundancy, and imitation of "natural language" a
computer language should or should not have, and that's somewhat
orthogonal to the question of perl's functionality.  I'd stop using it in
a second if something else had as much expressive power without all the
icky syntactic nightmares (and the GNU Project may come up with
something...we're thinking about it in the background).  The only
observation I will make on this point is that if you are careful, you can
make perl programs readable and maintainable.  In the long term, that's
more important to me than how fast you can write the program initially,
and sometimes (depending on what it does) than how fast the program
itself runs.

>The traditional tools (sh, sed, awk, tr, etc.) are (mostly) built
>on ``the UNIX philosophy'' of ``one job, one tool.'' The advantage
>is that small tools can be combined in many different ways to solve
>many different problems. Another advantage is that small scripts
>that use only a few small tools do not incur large start-up
>overheads. 

   Actually, if you consider the overhead from running exec all the time
to start up lots of different programs in traditional shell scripts, the
single startup time cost for perl probably wins, even for relatively small
scripts.  Furthermore, if you run a lot of scripts (like I do), the
interpreter will likely be in swap already, or even partially paged in,
and start up that much faster.
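   To put a rough number on that intuition, here is a sketch of the
comparison (the input and the commands are mine, purely for illustration,
not from the original discussion): a shell loop that execs a fresh `expr'
per line, versus a single awk process doing all the work in one startup.

```shell
#!/bin/sh
# Illustration only: sum 1000 numbers two ways.
seq 1 1000 > /tmp/nums.$$

# Traditional-tools style: the loop forks and execs `expr' once per
# line -- a thousand process startups.
total=0
while read n; do
    total=`expr $total + $n`
done < /tmp/nums.$$
echo "shell loop: $total"

# Single-interpreter style: one process startup, period.
awk '{ sum += $1 } END { print "awk: " sum }' /tmp/nums.$$

rm -f /tmp/nums.$$
```

Both print 500500, but timing them with `time' shows the per-exec cost
dominating the loop version; perl's one-time startup amortizes the same
way awk's does here.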

>            However, the syntax of each of the tools is drastically
>different from the others, and such lack of uniformity is both
>annoying and frustrating, in that you have to remember so many
>different syntaxes.

   Perl has similar problems, actually.  In order to merge all that
functionality from different programs, their syntaxes were roughly
incorporated into perl's language.  That means you must still be familiar
with the essentials of how to use sed and regexps, how to use certain
awk-like constructs, and so on.  The syntax of perl feels much less uniform
to me than anything else I've ever used.  It's much more powerful than,
say, awk; but then other languages like C and Scheme don't have this
consistency problem, and they're just as powerful as perl, or more so.
Larry Wall freely admits perl is a "pidgin" language.

>Perl is also more difficult to extend as the source code is quite large
>and complicated.  Yes, you could extend it via perl scripts, but again,
>that forces you to run perl to use the new tool.  Using the traditional
>approach, you simply write a small C program to solve the new problem.

   This is the same problem mentioned earlier.  The rationale for
perl is that you don't (usually) have to run any other programs at all.
You can write modular programs in perl the same as you can in C.  It has
functions and so on.  I have a small library of routines I've written for
several interpreters, including bash and perl, which I reuse quite often.
If perl is what you have to run in the first place, extending it in its
runtime language is probably easier than modifying the C source and won't
run critically slower for typical applications.

--
Send mail for the `bblisa' mailing list to `bblisa@inset.com'.
Mail administrative requests to `bblisa-request@inset.com'.
