[prog] Sample implementations of UNIX utilities.

Meredydd Luff meredydd at everybuddy.com
Sun Dec 29 12:22:20 EST 2002


OK, ante upping somewhat here - once again, the opinions expressed within are 
my own, and not necessarily those of my employ-oops, wrong disclaimer. But 
you get the idea.

On Friday 27 December 2002 21:07, Robert J. Hansen wrote:
> > masochistic, I do think C is very relevant indeed to modern programming.
> > For
>
> Oh, I never said it wasn't.  :)  At some point every programmer needs to
> know how to break the silicon to his/her will, and the best way to do
> that is to learn Assembly.  But since we're long past the point where
> universities routinely teach Assembly, C will do in a pinch--you lose
> the entire notion of registers [*], you lose most of the direct
> connection to the CPU, but at least you get direct memory addressing and
> pointer fun.
Yep, I'm a case in point. I don't even know what a register *is*, and I call 
myself a developer. If it weren't for what I was taught with C, I'd know 
nothing about the innards of a machine. Though, to be fair, I haven't come 
within spitting distance of university yet, and none of what I know was 
taught at school, so that's a reflection on nobody's Comp Sci course :^)

> > one, a "portable assembly language" is a very good way of teaching how
> > computers actually *work*.
>
> Or, alternately, it's a lousy way of teaching how computers actually
> work.  The lambda calculus is, IMO, the best mathematical construct we
> have right now to describe the Universal Turing Machine--thus, I think
> languages which emphasize the lambda calculus are very good ways of
> teaching how computers work.
Oh, a theoretician are we? You know, that explains a lot - your examples of 
"real-world" applications seem to be mostly mathematical problems, and you're 
completely right when you say that C sucks at pure maths, especially next to 
languages like ML (I know that much). But your original post wasn't talking 
about mathematical problem-solving, it was talking about systems programming, 
which is far closer to C's strong suits. And about the lambda calculus - that 
describes, as you say, how a Universal Turing Machine works. Not what your 
average person will call "a computer". What most people mean by "a computer" 
is the hunk of hardware sitting on your desk, and you learn far more about 
how *it* works by learning a low-level language than by studying the lambda 
calculus. Not that the mathematical stuff is irrelevant or anything - the 
theory is great fun, but you can't say that C is no good for most systems 
programming just because it's not a highly theoretical language.

> On the other hand, if you want to know how a given _implementation_ of a
> Turing Machine works... then yeah.  You have to get gritty with the
> silicon.  The problem is this knowledge is generally pretty
> nonportable.  Take a coder who's worked on IA32 his/her entire life,
> drop their best code onto something with a truly _bizarre_ memory layout
> (PPC = ABCD, Intel = DCBA, old PDPs were BADC, if I recall!) and see how
> many errors get flagged.
Yeah, actually - I think there ought to be public access to the amazing 
Debian automated compile farm, because I'd really like to be able to check 
that I have in fact written stuff portably. But as Kathryn was pointing out, 
Good Style is important whatever language you're using. Yes, it's easy to be 
carelessly machine-dependent in C, but it's also easy to be carelessly 
platform-dependent in Java. And though I don't know about, say, LISP, once 
you use what you yourself called "messy" constructs to talk to the rest of 
the system (which is exactly what you need to do if you want to do systems 
programming, as opposed to solving abstract mathematical problems), I should 
imagine it's quite hard to *avoid* making yourself platform-dependent through 
the calls you use, or at the very least tying yourself down to one particular 
implementation.
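
Actually, here's a toy example of my own (nothing from your post, just to 
make the point concrete) of the kind of carelessly machine-dependent code 
we're both talking about. It reads an integer's bytes straight out of 
memory, so its output silently changes with the byte order of whatever 
machine it runs on:

    #include <cstdio>

    int main() {
        // Assume a 32-bit unsigned int (true on x86 and PPC alike).
        unsigned int word = 0x0A0B0C0D;

        // Reading the bytes directly is exactly the sort of "direct
        // memory addressing" that doesn't port: the order they come
        // out in depends on the CPU's endianness.
        const unsigned char *bytes =
            reinterpret_cast<const unsigned char *>(&word);
        for (int i = 0; i < 4; ++i)
            std::printf("%02X ", static_cast<unsigned>(bytes[i]));
        std::printf("\n");
        // Big-endian (PPC):    0A 0B 0C 0D
        // Little-endian (x86): 0D 0C 0B 0A
        return 0;
    }

Nothing about that code looks wrong on the machine you wrote it on - which 
is rather the point.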

> > Hmm...having read that link, I'm interested in why you call it a "rant,"
>
> Mostly because I've received quite a few emails from morons since it's
> gone up talking about how C is the talismanic ne plus ultra of
> computation.  :)
I rather liked that quotation Jenn (I think it was) gave, about how belief 
in the One True Language would be called a childhood disease of programmers 
if so many adults didn't suffer from it too <g>

> Honestly, while I think C is a great teaching tool, I think the
> open-source world is in an unhealthy relationship with it.  There are
> too many young punks (and punkettes)
I think "punk" is a non-gender-specific word. Rather like "geek" :-P
> who think that they know C and
> Perl, and thus they know Everything Worth Knowing about programming.  We
> need to open our eyes to other ways of thinking about computation, other
> ways of solving problems.
See Jenn's point about how that happens all the time, in all languages and 
indeed all walks of life, but computing especially. I think this is more to 
do with it being the first thing people encounter when they start out, and 
therefore the thing they stick to. I suppose it's also easier to do with C 
than with many other languages, because you *can* do *anything* in C without 
being too obviously unreasonable. Ugh, that last sentence *felt* clunky - 
did you get what I was trying to say?

> > know about (I don't know much in the way of what (s)he calls "theoretical
>
> He.  :)
Yes, sorry - I didn't realise it was yours until you mentioned it a little 
further down the thread :)

> > have in C... :-) It must be something in me, really - I *enjoy* the
> > sensation of being able to tell my machine *exactly* what to do, and no
> > messing. Why
>
> Please note that my rant was only about practical programming--when
> you've got a specific goal which you need to accomplish in the shortest
> time possible while still meeting your reliability and safety targets.
> If you're just programming for fun, hell, write in INTERCAL.  If it's
> for fun, go to town with it.  :)
True. I picture the area of programming that any one language is good at as 
a coloured blob on the field of all programming, one that gets fainter 
around the edges. I often work even in C's faint edges for pleasure's sake, 
but I still think its nice bold area, where it's definitely practical, is 
larger than you think it is.

> > memory. It also lends an understanding, as I have mentioned earlier,
> > about what's actually going on - also explains why you feel that
> > performance hit when you use a GC'ed and memory protected language like
> > Java (you're right in
>
> As an interesting aside, I saw an IEEE article a while ago which showed
> a GC algorithm which could outperform most manual memory management.  As I
> recall, it applied only to FORTRAN, but... the upshot is it's not a hard
> and fast rule that GC _must_ impose a performance penalty.  Except in Java,
> where I think they actually specify that the GC algorithm must suck.
Yeah, that sounds about my opinion of Java GC. Not that it makes Java an 
unusable language or anything - I quite like it, in fact. And they seem to 
have rediscovered the meaning of the word "efficiency" when they wrote the 
KVM for J2ME (for those who haven't been immersed in Sun marketing-speak, 
that's the Kilobyte Virtual Machine for Java 2 Micro Edition - it's what runs 
the Java games in newer mobile phones, and it's good fun).

About that better-than-manual automatic GC, by the way - how does that work? 
How can an automated collector be faster than a program which explicitly 
free()s each block the moment it's done with it? Does the IEEE archive these 
articles? I suspect that my father's long since thrown away that issue...
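
If I had to hazard a guess at the mechanism (and this is pure speculation on 
my part - I haven't seen the article), it'd be that a collector doesn't have 
to pay the bookkeeping cost of free() once per object: it can reclaim a whole 
region of dead objects in one go. Something like this, in spirit:

    #include <cstdlib>
    #include <cstddef>

    // Manual style: one free() per object, so the deallocation
    // bookkeeping is paid n times over.
    void manual_style(std::size_t n) {
        for (std::size_t i = 0; i < n; ++i) {
            int *p = static_cast<int *>(std::malloc(sizeof(int)));
            if (!p) return;
            *p = static_cast<int>(i);
            std::free(p); // deallocation cost, once per object
        }
    }

    // Region style - the trick I imagine a fast collector exploits:
    // allocate by bumping a pointer through one big block, then
    // release the whole block with a single free().
    struct Region {
        char *base, *next, *end;
        explicit Region(std::size_t bytes)
            : base(static_cast<char *>(std::malloc(bytes))),
              next(base), end(base + bytes) {}
        void *alloc(std::size_t bytes) {
            if (next + bytes > end) return 0; // out of space
            void *p = next;
            next += bytes;
            return p;
        }
        ~Region() { std::free(base); } // one free for everything
    };

    int main() {
        manual_style(1000);

        Region r(1000 * sizeof(int));
        for (std::size_t i = 0; i < 1000; ++i) {
            int *p = static_cast<int *>(r.alloc(sizeof(int)));
            if (p) *p = static_cast<int>(i);
        }
        return 0; // the whole region is freed wholesale here
    }

But that's a guess - I'd love to see how the real thing does it.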

> (In reality, Java puts very few requirements on the GC algorithm--it's
> very much open to the implementor.  But forgive me for thinking Java GC
> is bletcherous.)
Forgiven!

*snip*
> Sure, the skills are valuable.  I'm not suggesting anyone should say
> "bah, why learn C, it's useless".  But a skill that's just as important
> is being able to choose the right tool for the job, and being able to
> use that tool effectively.
Agreed. But you started off suggesting that the students on this computing 
course should learn something more high-level _instead of_ C, which shrinks 
the toolbox as much as doing C at the expense of all higher-level languages.

> > to the "low-level drivers and VM code" shoebox that you seem to believe
> > is all C is good for ;-)
>
> I don't think C is good only in those areas.  Off the top of my head,
> there's the embedded space and FFI, both of which are _large_ problem
> domains.  But most of us aren't going to spend our careers working
> there.  :)
But you still don't acknowledge that it's ever a reasonable language for 
ordinary systems programming. You still shoebox it into "times when 
performance is important", and then go on, in the face of quite a bit of 
evidence, to say that realtime computing, low-level drivers, virtual machines 
and the like are the only times where the performance advantage is even worth 
looking at.

> > Oh yes, and while I'm at it - about that list of write vs run times for
> > the mathematical algorithm whose name I can't remember (d'oh): that's a
> > tad on
> > the misleading side. Of *course* the C++ one is going to take bloody
> > *ages* to do, if you're doing weird things with constructors so as to do
> > most of the compilation at compile-time. I call unfair test! :-P
>
> It's not at all unfair to use the strengths of a language.  C++ _rocks_
> for numeric programming thanks to template metaprogramming.  Take a look
> at the Blitz project, which just blows my mind in a dozen different
> ways.  We've spent fifty years optimizing FORTRAN compilers to be the
> absolute best, bar none, at numeric programming and here comes Blitz and
> blows FORTRAN away.

I still call unfair test - you went and used the most advanced, 
performance-chasing tools in your toolbox for the C++ version, and then 
pointed at it and said, "look, it's not worth the hassle to use C++ as a 
systems language, look how much effort it takes to write!"


> Read Chapter 24 of Stroustrup's "The C++ Programming Language".  He
> makes a very good point, one which I wish more of my managers
> understood, that programmers and programming languages aren't identical,
> and that by forcing everyone in a department into the same
> lowest-common-denominator you wind up wasting (a) the best talent in
> your office and (b) the best tools in your toolbox.
Was this in reference to the last paragraph still? If so, I take your point, 
but once you're taking the time to use a very powerful tool properly, you're 
comparing apples and oranges when you put that next to other languages.

> > Right, I've just spewed a whole load of stuff, and I'm not quite sure how
> > comprehensible it is to people here, and whether anyone will agree.
> > Thoughts?
>
> Oh, I disagree--but I certainly find your comments to be interesting.
Likewise, and I hope we can carry on in that spirit. Looking back, this mail 
seems to be a bit heated, but I assure you it's not meant that way!

Meredydd
-- 

MSN Developer, Everybuddy project
http://www.everybuddy.com/

MSN:     blip109 at hotmail.com
AIM:     blip109
Yahoo:   modula7


