Dennis Ritchie -- An Appreciation
Steve Summit
2011-10-18 00:00:45 UTC
[I haven't posted here in quite some time, but I should
definitely post this here. It's also on the web at
http://www.eskimo.com/~scs/dmr.html .]

I'm a programmer, and just about always have been.
My favorite programming language is still C, and my favorite
operating system is still Unix. Dennis Ritchie was, of course,
jointly responsible for both. So I have definitely lost a
personal hero and, to the extent that I can claim I've learned
from his work, a mentor as well.

It's been said that Unix killed research in operating systems.
I find I don't mind, because Unix is just about perfect.
It's said that you have to keep updating your skills in the tech
world, but I've been programming professionally in C and Unix for
more than 30 years now, and I don't expect to have to switch anytime
soon. In a field that does tend to burn down and reincarnate
itself at least once every five years or so, those two wonderful
little programming systems have proved remarkably durable.
(And they *are* little, which is one of their underappreciated charms.)

Just about everybody of a certain era in programming probably
considers Dennis a hero. The tech world being a bit more
gregarious and less stratified than (say) Hollywood, Dennis was
delightfully approachable. It was always a thrill to see a post
from dmr in a Usenet newsgroup, the more so if it was in response
to one of your own posts, the more so if he agreed with you.
And if you got an email out of the blue -- well, that was
*really* one to be treasured. But you didn't have to wait; any
random hacker out there on the net could send an email to dmr,
and he'd often reply. (I know this because he once thanked me --
another email to treasure! -- for being able to save time by
simply pointing supplicants to the comp.lang.c FAQ list I'd
compiled.)

Random reminiscence: it's a USENIX conference, sometime in the
mid-90's. There's a session on copyright and other intellectual
property issues, and as always happens when computer types
discuss this topic, there are a bunch of flamboyant statements
being made about how copyrights and patents on software are
Evil, information wants to be free, etc., etc. One commentator,
objecting to the possibility that too-strict copyrights might
stifle progress, solemnly opines that he doesn't want to be
stuck using 20 year old software. But sitting right in front
of me happens to be Dennis Ritchie, who calls out in a rather
commanding voice, "But you all do!"

I'd like to say I'll miss him not only as a mentor but as a
personal friend, but I only met him once or twice, so I can't
honestly say that. But I can say this: every time I simply type

r = read(fd, buf, 13);

to read 13 bytes from a file without worrying about its record
structure, Dennis Ritchie lives. Every time I pipe something to
grep rather than having to eyeball it for a pattern I'm looking
for, Dennis Ritchie lives. Most importantly, every time I have
the pleasure of writing (or using!) a software tool that's
wondrously small and simple, that does one job and does it well,
Dennis Ritchie lives.

In fact, that's not a bad epitaph. Dennis Ritchie: he did one
job, and he did it well.

Steve Summit
2011-10-13
Uno
2011-10-18 11:20:54 UTC
Post by Steve Summit
<snip>
I usually read you as a hard copy.

Cheers,
--
Uno
Dann Corbit
2011-10-18 21:37:51 UTC
C is close enough to the hardware to allow me to avoid writing assembly
in order to keep a program fast. Yet C is abstract enough to write
complicated ideas in a symbolic way in order to make the code easy to
maintain.

C is the mother of the modern OO languages like C++ and Java.

My first programming language was Fortran IV. My second programming
language was PL/1. But C (while the 3rd programming language that I
learned) was the first programming language that I loved.

And I am really, really hard to please.

There are some programming giants. Donald Knuth, W. Richard Stevens,
and Dennis Ritchie top my list. How about yours?
Ian Collins
2011-10-18 21:52:06 UTC
Post by Dann Corbit
There are some programming giants. Donald Knuth, W. Richard Stevens,
and Dennis Ritchie top my list. How about yours?
Alan Turing.
--
Ian Collins
James
2011-10-18 23:47:21 UTC
"Dann Corbit" <***@connx.com> wrote in message news:***@aioe.org...
[...]
Post by Dann Corbit
There are some programming giants. Donald Knuth, W. Richard Stevens,
and Dennis Ritchie top my list. How about yours?
Grace Hopper
Nick Keighley
2011-10-19 07:02:21 UTC
Post by James
[...]
There are some programming giants.  Donald Knuth, W. Richard Stevens,
and Dennis Ritchie top my list.  How about yours?
Grace Hopper
Dijkstra
Bill Davy
2011-10-19 07:57:07 UTC
Post by Dann Corbit
C is close enough to the hardware to allow me to avoid writing assembly
in order to keep a program fast. Yet C is abstract enough to write
complicated ideas in a symbolic way in order to make the code easy to
maintain.
C is the mother of the modern OO languages like C++ and Java.
My first programming language was Fortran IV. My second programming
language was PL/1. But C (while the 3rd programming language that I
learned) was the first programming language that I loved.
And I am really, really hard to please.
There are some programming giants. Donald Knuth, W. Richard Stevens,
and Dennis Ritchie top my list. How about yours?
That "Programming Pearls" guy.
Bill Wulf (for "The Design of an Optimizing Compiler")
Aho, Weinberger, Kernighan (AWK)
Kernighan (ditroff, and much else)
Per Brinch Hansen
Tony Hoare
Malcolm McLean
2011-10-19 09:13:11 UTC
There are some programming giants.  Donald Knuth, W. Richard Stevens,
and Dennis Ritchie top my list.  How about yours?
Ada Lovelace was not only the first programmer, she also understood
what programming was and what a computer could and couldn't do. So
she's got to be on the list.

I'd also add Cooley and Tukey for the fast Fourier transform.
Kleuskes & Moos
2011-10-19 13:41:00 UTC
There are some programming giants.  Donald Knuth, W. Richard Stevens,
and Dennis Ritchie top my list.  How about yours?
Ada Lovelace was not only the first programmer, she also understood what
programming was and what a computer could and couldn't do. So she's got
to be on the list.
I'd also add Cooley and Tukey for the fast Fourier transform.
Niklaus Wirth wasn't mentioned yet. Alain Colmerauer and David Warren for
their work on Prolog and the WAM.

Nick Keighley
2011-10-20 07:04:34 UTC
Post by Malcolm McLean
There are some programming giants.  Donald Knuth, W. Richard Stevens,
and Dennis Ritchie top my list.  How about yours?
Ada Lovelace was not only the first programmer, she also understood
what programming was and what a computer could and couldn't do. So
she's got to be on the list.
there's some doubt about these claims. Some think she was just a ghost
writer for Babbage. She was rather over-rated as a mathematician too.
Post by Malcolm McLean
I'd also add Cooley and Tukey for the fast Fourier transform.
Jorgen Grahn
2011-10-19 15:32:18 UTC
Post by Dann Corbit
C is close enough to the hardware to allow me to avoid writing assembly
in order to keep a program fast. Yet C is abstract enough to write
complicated ideas in a symbolic way in order to make the code easy to
maintain.
C is the mother of the modern OO languages like C++ and Java.
My first programming language was Fortran IV. My second programming
language was PL/1. But C (while the 3rd programming language that I
learned) was the first programming language that I loved.
And I am really, really hard to please.
There are some programming giants. Donald Knuth, W. Richard Stevens,
and Dennis Ritchie top my list.
I don't know Stevens as a programmer, more as an explorer/teacher.
He's on my top lists, but not this one.
Post by Dann Corbit
How about yours?
The rest of Bell Labs, up to and including Stroustrup.

/Jorgen
--
// Jorgen Grahn <grahn@ Oo o. . .
\X/ snipabacken.se> O o .
Patrick Scheible
2011-10-19 16:57:17 UTC
Post by Dann Corbit
C is close enough to the hardware to allow me to avoid writing assembly
in order to keep a program fast. Yet C is abstract enough to write
complicated ideas in a symbolic way in order to make the code easy to
maintain.
C is the mother of the modern OO languages like C++ and Java.
My first programming language was Fortran IV. My second programming
language was PL/1. But C (while the 3rd programming language that I
learned) was the first programming language that I loved.
My exposure was Basic, Pascal, C, then various assembly languages and
other languages. C was my favorite for a long time. And while I have
other favorites now, they were invented after C.
Post by Dann Corbit
And I am really, really hard to please.
There are some programming giants. Donald Knuth, W. Richard Stevens,
and Dennis Ritchie top my list. How about yours?
Here are a few more:

Robert Sedgewick, for clear and cogent explanations of algorithms.
Ralph Griswold (RIP), creator of Snobol and Icon.
Daniel Murphy, primary creator of TOPS-20, the first modern timesharing
OS.

-- Patrick
Frederick Williams
2011-10-19 19:46:41 UTC
Post by Dann Corbit
There are some programming giants. Donald Knuth, W. Richard Stevens,
and Dennis Ritchie top my list. How about yours?
There is one of those whom I would exclude. I would add (in no
particular order) Tony Hoare, Charles Moore, Bertrand Meyer and John
McCarthy.
--
When a true genius appears in the world, you may know him by
this sign, that the dunces are all in confederacy against him.
Jonathan Swift: Thoughts on Various Subjects, Moral and Diverting
88888 dihedral
2011-10-19 20:06:10 UTC
I recommend Winograd's Fourier transform programs for 1-D DFT, which are not covered by Oppenheim and Schafer's textbooks, and the Nussbaumer polynomial-transform programs for multidimensional DFT.
Kaz Kylheku
2011-10-19 20:49:17 UTC
Post by Dann Corbit
There are some programming giants. Donald Knuth, W. Richard Stevens,
and Dennis Ritchie top my list. How about yours?
Doug Smith: Very nice work on the game Lode Runner for the Apple II circa 1980.

Philip Greenspun: http://philip.greenspun.com/narcissism/resume

This new resume doesn't have a lot of the details of the cool exploits that I remember
from a prior version. But I didn't know that Greenspun was one of the brains
behind PA-RISC.

``Helped architect, simulate and design prototype of HP's Precision
Architecture RISC computer. The prototype took two man-years to complete and
ran at VAX 11/780 speed in June 1983. This architecture became the basis of
HP's computer product line for 15 years and then became the basis for the
64-bit generation of Intel processors.''

That was in 1982-1983. He was only born in 1963. :)
Roberto Waltman
2011-10-19 23:04:19 UTC
Post by Dann Corbit
...
There are some programming giants. Donald Knuth, W. Richard Stevens,
and Dennis Ritchie top my list. How about yours?
Not sure whether to call them "programmers", but ...

Add Niklaus Wirth, Edsger Dijkstra, Per Brinch Hansen, Chuck Moore,
Alan Perlis (even if only for the humor,) Alan Kay.
--
Roberto Waltman

[ Please reply to the group,
return address is invalid ]
lovecreatesbeauty
2011-10-20 14:43:14 UTC
Post by Dann Corbit
C is close enough to the hardware to allow me to avoid writing assembly
in order to keep a program fast.  Yet C is abstract enough to write
complicated ideas in a symbolic way in order to make the code easy to
maintain.
C is the mother of the modern OO languages like C++ and Java.
I think C is just enough.
Post by Dann Corbit
There are some programming giants.  Donald Knuth, W. Richard Stevens,
and Dennis Ritchie top my list.  How about yours?
Dennis Ritchie is the one on my list.
Frederick Williams
2011-10-21 15:56:22 UTC
Post by Dann Corbit
C is the mother of the modern OO languages like C++
We mustn't hold that against it.
Post by Dann Corbit
and Java.
--
When a true genius appears in the world, you may know him by
this sign, that the dunces are all in confederacy against him.
Jonathan Swift: Thoughts on Various Subjects, Moral and Diverting
sadsailor
2011-10-21 21:37:41 UTC
Post by Dann Corbit
C is the mother of the modern OO languages like C++
There is some truth in that, but you have to parse out the tidbit that is
true:

C is a mutha!
Kaz Kylheku
2011-10-21 17:17:44 UTC
Post by Dann Corbit
C is close enough to the hardware to allow me to avoid writing assembly
in order to keep a program fast. Yet C is abstract enough to write
complicated ideas in a symbolic way in order to make the code easy to
maintain.
C is the mother of the modern OO languages like C++ and Java.
I would say that Pascal, Algol, BCPL and PL/1 are the parents of C,
and C++ is just a dialect of C which adds Simula 67 to the parentage.

The core distinctive C++ features are very similar to Simula, even
in the terminology. Virtual functions, access specifiers like protected,
etc.

``Simula 67 introduced objects, classes, subclasses, virtual methods,
coroutines, discrete event simulation, and featured garbage collection.''
[Wikipedia]

With no coroutines or garbage collection, C++ looks like a step backwards.
Roberto Waltman
2011-10-21 17:30:01 UTC
I would say ... C++ is just a dialect of C which adds Simula 67 to the parentage.
What a coincidence, Bjarne Stroustrup says the same:

"C++ was designed to provide Simula’s facilities for program
organization together with C’s efficiency and flexibility for systems
programming"

--
Roberto Waltman

[ Please reply to the group,
return address is invalid ]
Malcolm McLean
2011-10-22 14:57:16 UTC
Post by Kaz Kylheku
The core distinctive C++ features are very similar to Simula, even
in the terminology. Virtual functions, access specifiers like protected,
etc.
``Simula 67 introduced objects, classes, subclasses, virtual methods,
coroutines, discrete event simulation, and featured garbage collection.''
[Wikipedia]
With no coroutines or garbage collection, C++ looks like a step backwards.
A language is a compromise between theoretical purity, efficiency, and
compatibility.

C++ could be learnt in a day by anyone who knew C. It only added five
or six keywords. You can also very easily call routines written in C, or
fiddle about with them to turn them into C++. It had a very catchy
name. It could be implemented without too much fuss as a front end to
C, and it was about as efficient as C. Coroutines and garbage
collection were presumably rejected for this reason.

It's very easy to propose a new language, and it's moderately easy to
implement one. What's hard is to get a substantial number of users.
Dennis Ritchie and Bjarne Stroustrup succeeded brilliantly, but Stroustrup
wouldn't have succeeded without leveraging Ritchie's achievement,
arguably unfairly (if C had been a brand of soda then doubtless
Ritchie would have sued for billions, but the computer industry
doesn't work like that).
Bradley K. Sherman
2011-10-22 15:51:27 UTC
Post by Malcolm McLean
...
It's very easy to propose a new language, and it's moderately easy to
implement one. What's hard is to get a substantial number of users.
Dennis Ritchie and Bjarne Stroustrup succeeded brilliantly, but Stroustrup
wouldn't have succeeded without leveraging Ritchie's achievement,
arguably unfairly (if C had been a brand of soda then doubtless
Ritchie would have sued for billions, but the computer industry
doesn't work like that).
Correction: the computer industry *didn't* work like that.

--bks
Kaz Kylheku
2011-10-22 18:19:42 UTC
Post by Malcolm McLean
It's very easy to propose a new language, and it's moderately easy to
implement one. What's hard is to get a substantial number of users.
Getting users is a matter of historic accident, or of appealing to the masses
using factors which are either nontechnical, or in fact are technical
negatives.

The programming language with the most users is most probably Basic. Most
programmers can hardly name two other languages besides the one they (think
they) know; they did not make a decision that could be called informed.

MS-DOS had a substantial number of users. Q.E.D.

Eat shit; a billion flies can't be wrong?
Ian Collins
2011-10-22 19:00:58 UTC
Post by Kaz Kylheku
Post by Malcolm McLean
It's very easy to propose a new language, and it's moderately easy to
implement one. What's hard is to get a substantial number of users.
Getting users is a matter of historic accident, or of appealing to the masses
using factors which are either nontechnical, or in fact are technical
negatives.
Languages gain users through necessity to use a platform or
technology (.NET and JavaScript[1] for example), utility (C, C++ and
popular scripting languages), or specialised niches (Lisp and friends).
The ones that live on are in the latter two categories. How many
people will still be using .NET once its owners get bored and move on?

Languages that live on are those whose evolution isn't driven by one vendor.
Post by Kaz Kylheku
The programming language with the most users is most probably Basic. Most
programmers can hardly name two other languages besides the one they (think
they) know; they did not make a decision that could be called informed.
MS-DOS had a substantial number of users. Q.E.D.
How many people use it now that its owners are bored and have moved on?

[1] JavaScript should probably be in the third category.
--
Ian Collins
Frederick Williams
2011-10-23 14:39:22 UTC
Post by Kaz Kylheku
The programming language with the most users is most probably Basic.
Which of its one hundred mutually incompatible varieties?
--
When a true genius appears in the world, you may know him by
this sign, that the dunces are all in confederacy against him.
Jonathan Swift: Thoughts on Various Subjects, Moral and Diverting
Nick Keighley
2011-10-23 12:55:31 UTC
Post by Malcolm McLean
Post by Kaz Kylheku
The core distinctive C++ features are very similar to Simula, even
in the terminology. Virtual functions, access specifiers like protected,
etc.
``Simula 67 introduced objects, classes, subclasses, virtual methods,
coroutines, discrete event simulation, and featured garbage collection.''
[Wikipedia]
With no coroutines or garbage collection, C++ looks like a step backwards.
A language is a compromise between theoretical purity, efficiency, and
compatibility.
C++ could be learnt in a day by anyone who knew C.
When? Stroustrup 1st edition? I've been at it a decade or so and I don't
consider myself completely familiar with C++. Template meta-programming,
anyone?
Post by Malcolm McLean
It only added five or six keywords.
a damn sight more than that!

this, public, private, protected, class, virtual, template, exception,
catch, try, operator...
Post by Malcolm McLean
You can also very easily call routines written in C, or
fiddle about with them to turn them into C++. It had a very catchy
name. It could be implemented without too much fuss as a front end to
C, and it was about as efficient as C. Coroutines and garbage
collection were presumably rejected for this reason.
It's very easy to propose a new language, and it's moderately easy to
implement one. What's hard is to get a substantial number of users.
Dennis Ritchie and Bjarne Stroustrup succeeded brilliantly, but Stroustrup
wouldn't have succeeded without leveraging Ritchie's achievement,
arguably unfairly (if C had been a brand of soda then doubtless
Ritchie would have sued for billions, but the computer industry
doesn't work like that).
thank <deity>

imagine if mathematicians could sue... The Greeks would own the world.
Ben Bacarisse
2011-10-23 20:30:35 UTC
<snip>
Post by Nick Keighley
Post by Malcolm McLean
C++ could be learnt in a day by anyone who knew C.
When? Stroustrup 1st edition? I've been at it a decade or so and I don't
consider myself completely familiar with C++. Template meta-programming,
anyone?
Agreed.
Post by Nick Keighley
Post by Malcolm McLean
It only added five or six keywords.
a damn sight more than that!
this, public, private, protected, class, virtual, template, exception,
catch, try, operator...
... new, delete, true, false, friend, typename, typeid, using,
namespace, mutable, export, reinterpret_cast, const_cast, static_cast,
dynamic_cast, explicit, throw...

...and it significantly changed the meaning of auto.

I know you were not offering a full list (and I doubt I have either) but
the point is better made if the list is fuller. The journey from C to
C++ has become quite an adventure. In the days of cfront, it was not
much more than a stroll in the park.

<snip>
--
Ben.
Ben Pfaff
2011-10-23 21:27:25 UTC
Post by Nick Keighley
Post by Malcolm McLean
It only added five or six keywords.
a damn sight more than that!
this, public, private, protected, class, virtual, template, exception,
catch, try, operator...
I don't think that "exception" is a keyword in C++.
--
int main(void){char p[]="ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz.\
\n",*q="kl BIcNBFr.NKEzjwCIxNJC";int i=sizeof p/2;char *strchr();int putchar(\
);while(*q){i+=strchr(p,*q++)-p;if(i>=(int)sizeof p)i-=sizeof p-1;putchar(p[i]\
);}return 0;}
James Kuyper
2011-10-23 14:28:05 UTC
On 10/22/2011 10:57 AM, Malcolm McLean wrote:
...
C++ could be learnt in a day by anyone who knew C. ...
Perhaps it could, under the right circumstances - science fictional
concepts like machines or drugs that implant information directly in
your head come to mind. I doubt, however, that such speed learning is
the norm in the real world.

I was already very familiar with C when I first learned C++. I
immediately recognized the value of many of the new features offered by
C++. I have a long history of rapidly acquiring new computer languages,
and the amount of time it took me to learn C++ is an example of that -
but the amount of time was a lot longer than a single day. It took
longer than that for me just to finish reading a detailed description of
the new features.
... It only added five
or six keywords.
I'm curious: which of the following C++ keywords are the "five or six"
you were thinking of? When did you first learn of the keyword status of
any of the other 26 keywords in this list? I've heard that 'const',
'volatile' and 'inline' were only added to C after borrowing them from
C++, though I can't personally vouch for the truth of that assertion.
None of the other items on this list has ever been a C keyword (though
the asm keyword is marked in Annex J as a common C extension).

asm
bool
catch
class
const
const_cast
dynamic_cast
explicit
export
false
friend
inline
mutable
namespace
new
operator
private
protected
public
reinterpret_cast
static_assert
static_cast
template
this
throw
true
try
typeid
typename
using
virtual
volatile
wchar_t
--
James Kuyper
Malcolm McLean
2011-10-23 15:36:31 UTC
...
C++ could be learnt in a day by anyone who knew C. ...
Perhaps it could, under the right circumstances.
I learnt it in a day. I bought a little book describing C++ for C
programmers, and read it on the bus. By the time the bus journey had
ended, I'd read the book and knew C++.

However that was when it was still quite new, before it had ballooned
into what it is now.
Charles Richmond
2011-10-24 04:23:41 UTC
Post by Malcolm McLean
...
C++ could be learnt in a day by anyone who knew C. ...
Perhaps it could, under the right circumstances.
I learnt it in a day. I bought a little book describing C++ for C
programmers, and read it on the bus. By the time the bus journey had
ended, I'd read the book and knew C++.
However that was when it was still quite new, before it had ballooned
into what it is now.
Almost anyone can pick up an acorn, but *not* after it grows into
a 200-foot oak tree!!!
--
+<><><><><><><><><><><><><><><><><><><>+
| Charles Richmond ***@aquaporin4.com |
+<><><><><><><><><><><><><><><><><><><>+
Robert Wessel
2011-10-23 17:35:40 UTC
On Sun, 23 Oct 2011 10:28:05 -0400, James Kuyper
Post by James Kuyper
...
C++ could be learnt in a day by anyone who knew C. ...
Perhaps it could, under the right circumstances - science fictional
concepts like machines or drugs that implant information directly in
your head come to mind. I doubt, however, that such speed learning is
the norm in the real world.
I was already very familiar with C when I first learned C++. I
immediately recognized the value of many of the new features offered by
C++. I have a long history of rapidly acquiring new computer languages,
and the amount of time it took me to learn C++ is an example of that -
but the amount of time was a lot longer than a single day. It took
longer than that for me just to finish reading a detailed description of
the new features.
... It only added five
or six keywords.
I'm curious: which of the following C++ keywords are the "five or six"
you were thinking of? When did you first learn of the keyword status of
any of the other 26 keywords in this list? I've heard that 'const',
'volatile' and 'inline' were only added to C after borrowing them from
C++, though I can't personally vouch for the truth of that assertion.
None of the other items on this list has ever been a C keyword (though
the asm keyword is marked in Annex J as a common C extension).
<snip>
const and volatile were keywords in C89. wchar_t is a bit of a mix -
it was a standard type in C89 (with one of the TCs), but not actually
a keyword. inline was added to C99 (although it was a common
extension).
James Kuyper
2011-10-23 18:01:44 UTC
Post by Robert Wessel
On Sun, 23 Oct 2011 10:28:05 -0400, James Kuyper
...
Post by Robert Wessel
... I've heard that 'const',
'volatile' and 'inline' were only added to C after borrowing them from
C++, though I can't personally vouch for the truth of that assertion.
...
Post by Robert Wessel
const and volatile were keywords in C89.
The rumored borrowing that I was talking about above was said to have
occurred before standardization of C89, but I would assume that it was
sometime after "C with Classes" evolved into "C++", in late 1983.
Post by Robert Wessel
... wchar_t is a bit of a mix -
it was a standard type in C89 (with one of the TCs), but not actually
a keyword. inline was added to c99 (although it was a common
extension).
Correct.
--
James Kuyper
Richard Damon
2011-10-23 18:34:02 UTC
Post by James Kuyper
...
C++ could be learnt in a day by anyone who knew C. ...
Perhaps it could, under the right circumstances - science fictional
concepts like machines or drugs that implant information directly in
your head come to mind. I doubt, however, that such speed learning is
the norm in the real world.
I was already very familiar with C when I first learned C++. I
immediately recognized the value of many of the new features offered by
C++. I have a long history of rapidly acquiring new computer languages,
and the amount of time it took me to learn C++ is an example of that -
but the amount of time was a lot longer than a single day. It took
longer than that for me just to finish reading a detailed description of
the new features.
I could see a person spending a day reading a tutorial on C++ for C
programmers and come away with enough to write some basic C++ programs.
They may be using more "C with Classes" than full C++. After all, it
probably only takes an hour or two to learn how to move from C to C that
is compatible with C++ (using prototypes, no implicit cast from void*,
avoid keywords like class, etc.). At that point you can almost claim to
be writing C++; add in a few basics like member functions and
inheritance, a few basic template rules, and you are then writing C++.
Maybe even have time for some simple stream I/O and it even starts to
look like basic C++.

In one day they aren't using much of the STL, namespaces (except adding a
"using namespace std;") or other advanced features.
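For instance, a minimal sketch of that "C that is compatible with C++"
step (the helper name is made up; this compiles as both C and C++):

#include <stdlib.h>

/* full prototype, and an explicit cast where C would convert void*
   implicitly -- C++ requires the cast */
double *make_buffer(size_t n)
{
    return (double *) malloc(n * sizeof(double));
}

int main(void)
{
    double *buf = make_buffer(100);
    if (buf == NULL)
        return 1;
    free(buf);
    return 0;
}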
Joe Pfeiffer
2011-10-23 19:49:13 UTC
Post by Richard Damon
<snip>
I could see a person spending a day reading a tutorial on C++ for C
programmers and come away with enough to write some basic C++
programs. They may be using more "C with Classes" than full C++. After
all, it probably only takes an hour or two to learn how to move from C
to C that is compatible with C++ (using prototypes, no implicit cast
from void*, avoid keywords like class, etc.). At that point you can
almost claim to be writing C++; add in a few basics like member
functions and inheritance, a few basic template rules, and you are
then writing C++. Maybe even have time for some simple stream I/O and
it even starts to look like basic C++.
In one day they aren't using much of the STL, namespaces (except adding a
"using namespace std;") or other advanced features.
I had the experience of having students ask if they could turn in
assignments written in C++ instead of C, and receiving C programs with
iostreams.
James Kuyper
2011-10-23 22:14:57 UTC
Post by Richard Damon
Post by James Kuyper
...
C++ could be learnt in a day by anyone who knew C. ...
Perhaps it could, under the right circumstances - science fictional
concepts like machines or drugs that implant information directly in
your head come to mind. I doubt, however, that such speed learning is
the norm in the real world.
...
Post by Richard Damon
I could see a person spending a day reading a tutorial on C++ for C
programmers and come away with enough to write some basic C++ programs.
They may be using more "C with Classes" than full C++. After all, it
probably only takes an hour or two to learn how to move from C to C that
is compatible with C++ (using prototypes, no implicit cast from void*,
avoid keywords like class, etc.). At that point you can almost claim to
be writing C++; add in a few basics like member functions and
inheritance, a few basic template rules, and you are then writing C++.
Maybe even have time for some simple stream I/O and it even starts to
look like basic C++.
In one day they aren't using much of the STL, namespaces (except adding a
"using namespace std;") or other advanced features.
A statement that "I've learned C++" would not be justified by that level
of understanding.
--
James Kuyper
Malcolm McLean
2011-10-24 08:59:03 UTC
Post by James Kuyper
Post by Richard Damon
In one day they aren't using much of the STL, namespaces (except adding a
"using namespace std;") or other advanced features.
A statement that "I've learned C++" would not be justified by that level
of understanding.
There weren't any templates or namespaces, and exception handling was
widely unimplemented.

Basically you were using C++ once you had an object hierarchy, and for
that you just needed "class" and "public", "protected" and "private" together
with the base membership syntax. That was fundamentally it. The
operator-overloaded iostream library made programs look very
different to C, but was ultimately a distraction. inline, slash-slash
comments, operator and references were just minor tweaks. The
difficult one was "virtual".
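A minimal sketch of that early-C++ subset (the shape classes are made
up for illustration):

#include <iostream>

class Shape {
public:
    virtual double area() const { return 0.0; }  // "virtual": the difficult one
    virtual ~Shape() {}
};

class Square : public Shape {                    // base membership syntax
public:
    Square(double s) : side(s) {}
    double area() const { return side * side; }
private:
    double side;
};

int main()
{
    Square sq(2.0);
    Shape *p = &sq;                   // run-time polymorphism through the base
    std::cout << p->area() << "\n";   // prints 4, not 0
    return 0;
}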
James Kuyper
2011-10-24 10:25:25 UTC
[restoring clipped attribution:]
...
Post by Malcolm McLean
Post by James Kuyper
Post by Richard Damon
In one day they aren't using much of the STL, namespaces (except adding a
"using namespace std;") or other advanced features.
A statement that "I've learned C++" would not be justified by that level
of understanding.
There weren't any templates or namespaces, and exception handling was
widely unimplemented.
Basically you were using C++ once you had an object hierarchy, and for
that you just needed "class" and "public", "protected" and "private" together
with the base membership syntax. That was fundamentally it. The
operator-overloaded iostream library made programs look very
different to C, but was ultimately a distraction. inline, slash-slash
comments, operator and references were just minor tweaks. The
difficult one was "virtual".
So Charles Richmond's acorn analogy is entirely apt.
--
James Kuyper
Jorgen Grahn
2011-10-24 14:43:31 UTC
Post by Malcolm McLean
Post by James Kuyper
Post by Richard Damon
In one day they aren't using much of the STL, namespaces (except adding a
"using namespace std;") or other advanced features.
A statement that "I've learned C++" would not be justified by that level
of understanding.
There weren't any templates or namespaces, and exception handling was
widely unimplemented.
Basically you were using C++ once you had an object hierarchy,
The kind of perverted C++ I hate the most is C code squeezed into an
unnatural inheritance hierarchy. I wasn't around in the early days,
but I get the impression that virtual inheritance was the C++ feature
stuck in the minds of many C programmers switching.

I find run-time polymorphism one of the less useful features of C++. I
design a lot of classes, but there are very few "is-a" relationships
between them.

/Jorgen
--
// Jorgen Grahn <grahn@ Oo o. . .
\X/ snipabacken.se> O o .
Malcolm McLean
2011-10-24 16:01:31 UTC
Post by Jorgen Grahn
The kind of perverted C++ I hate the most is C code squeezed into an
unnatural inheritance hierarchy.  I wasn't around in the early days,
but I get the impression that virtual inheritance was the C++ feature
stuck in the minds of many C programmers switching.
I find run-time polymorphism one of the less useful features of C++. I
design a lot of classes, but there are very few "is-a" relationships
between them.
That was partly why I stopped using C++.

Unless you're defining an object hierarchy, you're not doing object-
oriented design. Instead you're encapsulating structures.

MYFILE *myfopen(const char *path, const char *mode);
void myfclose(MYFILE *stream);
int myfput(MYFILE *fp, int ch);

etc. is a perfectly legitimate idiom, but you don't gain much by going
to C++ and making MYFILE a class.

I've seen a few very well designed object-oriented programs, notably
3D Studio Max, but they're very much the exception. An object-oriented
design is difficult to get right. C++ also makes it harder than it
should be to define base classes designed for other programmers to
inherit from. Largely it's a documentation issue.


--
Visit my website
http://www.malcolmmclean.site11.com/www
André Gillibert
2011-10-24 16:23:55 UTC
Post by Malcolm McLean
I've seen a few very well designed object-oriented programs, notably
3D Studio Max, but they're very much the exception. An object-oriented
design is difficult to get right. C++ also makes it harder than it
should be to define base classes designed for other programmers to
inherit from. Largely it's a documentation issue.
And, of course, the fact that the most basic and standard C++ library,
iostreams, is poorly documented and has a weird design (e.g. two seek
methods), doesn't help C++ programmers keep a sane mind.

Even the Qt library abuses inheritance. For example, QProcess (which
represents a child process) is derived from QIODevice (which
represents an I/O stream, like a file descriptor on UNIX), with a
setReadChannel method to select whether the QIODevice reads from
the stderr or stdout of the child process.
--
André Gillibert
Ian Collins
2011-10-24 19:02:14 UTC
Post by Malcolm McLean
Post by Jorgen Grahn
The kind of perverted C++ I hate the most is C code squeezed into an
unnatural inheritance hierarchy. I wasn't around in the early days,
but I get the impression that virtual inheritance was the C++ feature
stuck in the minds of many C programmers switching.
I find run-time polymorphism one of the less useful features of C++. I
design a lot of classes, but there are very few "is-a" relationships
between them.
That was partly why I stopped using C++.
Unless you're defining an object hierarchy, you're not doing object-
oriented design. Instead you're encapsulating structures.
That's a rather strange reason for not using a language, considering C++
isn't just an OO language.
Post by Malcolm McLean
MYFILE *myfopen(const char *path, const char *mode);
void myfclose(MYFILE *stream);
int myfput(MYFILE *fp, int ch);
etc. is a perfectly legitimate idiom, but you don't gain much by going
to C++ and making MYFILE a class.
As soon as the resource (be it a file, or memory or whatever) needs some
form of management, then it certainly does make sense. I'd happily just
stick to C + smart pointers if I had to.
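For instance, a rough sketch of the "C + smart pointers" idea, using a
C++11/TR1-era shared_ptr with a custom deleter to manage a plain C
FILE* (the file name is made up):

#include <cstdio>
#include <memory>

int main()
{
    std::FILE *raw = std::fopen("data.txt", "r");
    if (!raw)
        return 1;
    std::shared_ptr<std::FILE> fp(raw, std::fclose);  // closed on every path
    /* ... use fp.get() with the usual stdio calls ... */
    return 0;
}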
--
Ian Collins
Nick Keighley
2011-10-25 07:44:00 UTC
Post by Malcolm McLean
Post by Jorgen Grahn
The kind of perverted C++ I hate the most is C code squeezed into an
unnatural inheritance hierarchy.  I wasn't around in the early days,
but I get the impression that virtual inheritance was the C++ feature
stuck in the minds of many C programmers switching.
I find run-time polymorphism one of the less useful features of C++. I
design a lot of classes, but there are very few "is-a" relationships
between them.
That was partly why I stopped using C++.
Unless you're defining an object hierarchy, you're not doing object-
oriented design. Instead you're encapsulating structures.
I seem to find myself naturally finding is-a relations. The Open-
Closed Principle, as usually implemented, relies on polymorphism. Do I
work in weird domains or am I in a thought straitjacket?

Graphical editor: every widget is derived from a base class.
Mobile radio: each "end user" type is derived from a common base
class, and an end user can be anything from a handset to a despatcher
to a complete telephony network.
Post by Malcolm McLean
MYFILE *myfopen(const char *path, const char *mode);
void myfclose(MYFILE *stream);
int myfput(MYFILE *fp, int ch);
etc. is a perfectly legitimate idiom, but you don't gain much by going
to C++ and making MYFILE a class.
until you want MYFILE to be a string, or a comms link, or a database, or...
Post by Malcolm McLean
I've seen a few very well designed object-oriented programs, notably
3D Studio Max, but they're very much the exception. An object-oriented
design is difficult to get right. C++ also makes it harder than it
should be to define base classes designed for other programmers to
inherit from. Largely it's a documentation issue.
maybe true
Jorgen Grahn
2011-10-25 21:24:31 UTC
Post by Malcolm McLean
Post by Jorgen Grahn
The kind of perverted C++ I hate the most is C code squeezed into an
unnatural inheritance hierarchy.  I wasn't around in the early days,
but I get the impression that virtual inheritance was the C++ feature
stuck in the minds of many C programmers switching.
I find run-time polymorphism one of the less useful features of C++. I
design a lot of classes, but there are very few "is-a" relationships
between them.
That was partly why I stopped using C++.
Unless you're defining an object hierarchy, you're not doing object-
oriented design. Instead you're encapsulating structures.
I take it you think encapsulating structures is inferior to defining
object hierarchies? I don't. Encapsulating structures means building
new types, documenting them, deciding what operations they should
support and which ones should be explicitly forbidden, defining
invariants and so on.
Post by Malcolm McLean
MYFILE *myfopen(const char *path, const char *mode);
void myfclose(MYFILE *stream);
int myfput(MYFILE *fp, int ch);
etc. is a perfectly legitimate idiom, but you don't gain much by going
to C++ and making MYFILE a class.
That's not C++. The C++ version would not use pointers, or have
superfluous "myf" prefixes on the function names (C++ supports
overloading).
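Something like this, say -- a rough pre-C++11 sketch, with "MyFile" and
its members made up for illustration:

#include <cstdio>
#include <stdexcept>

class MyFile {
public:
    explicit MyFile(const char *path, const char *mode)
        : fp(std::fopen(path, mode))
    {
        if (!fp)
            throw std::runtime_error("cannot open file");
    }
    ~MyFile() { std::fclose(fp); }    // closed automatically at scope exit
    int put(int ch) { return std::fputc(ch, fp); }
private:
    std::FILE *fp;
    MyFile(const MyFile &);           // non-copyable (pre-C++11 idiom)
    MyFile &operator=(const MyFile &);
};

int main()
{
    MyFile f("out.txt", "w");         // no pointers, no "myf" prefixes
    f.put('x');
    return 0;                         // closed by the destructor
}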

/Jorgen
--
// Jorgen Grahn <grahn@ Oo o. . .
\X/ snipabacken.se> O o .
Malcolm McLean
2011-10-26 07:58:37 UTC
Post by Jorgen Grahn
I take it you think encapsulating structures is inferior to defining
object hierarchies?  I don't. Encapsulating structures means building
new types, documenting them, deciding what operations they should
support and which ones should be explicitly forbidden, defining
invariants and so on.
It's not really object-oriented design. And C++ doesn't buy you much.
You can encapsulate objects in C, and it's a design paradigm I use
heavily.

You're doing object-oriented design when your objects have
relationships to each other: in C++, class hierarchies; in other
languages, interfaces.

--
BMP, GIF, JPEG file formats, all described in Basic Algorithms
http://www.malcolmmclean.site11.com/www
Ian Collins
2011-10-26 10:21:25 UTC
Post by Malcolm McLean
Post by Jorgen Grahn
I take it you think encapsulating structures is inferior to defining
object hierarchies? I don't. Encapsulating structures means building
new types, documenting them, deciding what operations they should
support and which ones should be explicitly forbidden, defining
invariants and so on.
It's not really object-oriented design. And C++ doesn't buy you much.
You can encapsulate objects in C, and it's a design paradigm I use
heavily.
Why are you so hung up on object-oriented design? C++ isn't just an OO
language (a read of the standard library will show you that).
--
Ian Collins
Jorgen Grahn
2011-10-29 06:56:19 UTC
Post by Malcolm McLean
Post by Jorgen Grahn
I take it you think encapsulating structures is inferior to defining
object hierarchies?  I don't. Encapsulating structures means building
new types, documenting them, deciding what operations they should
support and which ones should be explicitly forbidden, defining
invariants and so on.
It's not really object-oriented design.
(reordered)
Post by Malcolm McLean
You're doing object-oriented design when your objects have
relationships to each other: in C++, class hierarchies; in other
languages, interfaces.
I accept that object-oriented programming (as Stroustrup defines it,
at least) includes run-time polymorphism. But let's say I do my design,
look for places where an "is-a" relationship is the best solution --
and I don't find any. Am I suddenly not doing object-oriented design?

Because that's what happens to me in practice. Less than 5% of my
classes do run-time polymorphism.
Post by Malcolm McLean
And C++ doesn't buy you much.
I work in both languages and I can assure you: it *does* make a big
difference. Not the inheritance, but all the other features.
Post by Malcolm McLean
You can encapsulate objects in C, and it's a design paradigm I use
heavily.
So do I, and it's definitely workable -- but not having RAII, operator
overloading, function overloading, templates and container types
really hurts.

/Jorgen
--
// Jorgen Grahn <grahn@ Oo o. . .
\X/ snipabacken.se> O o .
Markus Wichmann
2011-10-29 19:18:44 UTC
Post by Jorgen Grahn
So do I, and it's definitely workable -- but not having RAII, operator
overloading, function overloading, templates and container types
really hurts.
Oh, please. At least two of the above tend to make code unreadable. I
mean, if you see the following line, for instance:

x = y++;

and the compiler doesn't warn about this line at the highest warning
level, then in C we have a pretty limited set of possibilities, namely:

- x and y can both be arithmetic types
- x and y can both be pointers to the same type
- x can be a void* and y can be a typed pointer
- x or y can be preprocessor macros, though in this case that would
really be bad... oh wait, there's at least one instance of it in my
system library:

#define errno (*__errno_location())

C++ adds:
- x or y can be class instances, in which case:
- there might be a typeof y::operator++() which is then called
- there might be an operator++(typeof y) defined somewhere else
- there might be typeof x::operator=(typeof y&)
- there might be operator=(typeof x&, typeof y&)
- typeof x might have an implicit constructor taking a typeof y
- typeof x might have an implicit constructor taking whatever
typeof y::operator++() or operator++(typeof y) return
- I don't know exactly about this one, but operator++(typeof y) might
even return a class instance that has an operator(typeof x)(), IIRC

- even if they aren't class instances, we have:
- x might be a reference to typeof y
- y might be a reference to something
- x might even reference y, which means that this line invokes
undefined behaviour
- or y might reference x, which yields the same

As you can see, C++ blows this line all the way from "something will be
incremented and something else will store the old value" to "no friggin
idea what this line does! Might be anything. And I wouldn't even know
where to start looking".

So far I've seen only one useful application of operator overloading and
references, and that was a typesafe printf() implementation (which
basically has the compiler choose the correct functions according to the
argument type and those functions then check whether the conversion
specification on the format string was correct).

Templates are yet another pothole for the learning programmer: every
instantiation creates an entirely new and unrelated set of routines and
class variables. I mean, if I have:

class A { static int count; };

then there's only one A::count in the whole program. If I have

template <class T> class A {static int count; };

suddenly there's no bound to the number of A::counts in the program,
because there is _no_ A::count, but instead A<int>::count,
A<long>::count, etc.
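A quick demonstration of that (made up, but it compiles):

#include <iostream>

template <class T> class A {
public:
    static int count;
};

template <class T> int A<T>::count = 0;   // one definition *per* A<T>

int main()
{
    A<int>::count = 3;
    A<long>::count = 7;                   // independent of A<int>::count
    std::cout << A<int>::count << " " << A<long>::count << "\n";   // 3 7
    return 0;
}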
Ciao,
Markus
nroberts
2011-10-30 17:43:53 UTC
Post by Markus Wichmann
Post by Jorgen Grahn
So do I, and it's definitely workable -- but not having RAII,
Not having scope-tied destructors hurts. There are a whole lot of
methods for simplifying code through tying cleanup to scope that can't
be done without them. (The gcc compiler has an extension, the cleanup
attribute, that allows this in C.) Of course it is quite obviously
possible to go without these things in C, especially since you're
probably not using exceptions (setjmp/longjmp is available but pretty
rarely used); it just makes things easier and more straightforward in
some people's opinions.

Some people like being able to create a variable that is guaranteed to
have a function called on it so you can fill that function with
important things like closing file handles, releasing resources,
etc... That way you don't have to remember to do so for every error
condition, in the right order, and only for those parts you've
initialized, etc... Other people find the hiding of these details in an
interface like that makes code harder to understand and prefer the
verbosity of having to clean up properly for different conditions
within the block that deals with those conditions. Neither
perspective is wrong.
Post by Markus Wichmann
Post by Jorgen Grahn
operator
overloading, function overloading, templates and container types
really hurts.
Oh, please. At least two of the above tend to make code unreadable. I
mean, if you see the following line, for instance:
x = y++;
and the compiler doesn't warn about this line at the highest warning
level, then in C we have a pretty limited set of possibilities, namely:
- x and y can both be arithmetic types
- x and y can both be pointers to the same type
- x can be a void* and y can be a typed pointer
- x or y can be preprocessor macros, though in this case that would
really be bad... oh wait, there's at least one instance of it in my
system library:
#define errno (*__errno_location())
C++ adds:
- x or y can be class instances, in which case:
- there might be a typeof y::operator++() which is then called
- there might be an operator++(typeof y) defined somewhere else
- there might be typeof x::operator=(typeof y&)
- there might be operator=(typeof x&, typeof y&)
- typeof x might have an implicit constructor taking a typeof y
- typeof x might have an implicit constructor taking whatever
  typeof y::operator++() or operator++(typeof y) return
- I don't know exactly about this one, but operator++(typeof y) might
  even return a class instance that has an operator(typeof x)(), IIRC
All of these features are important and have their uses. They can all
be used badly. Poor design is poor design in C, C++, or any other
language. Consider this function:

void function_that_outputs_blob(blob* b);

If that function does something other than outputting a blob, whatever
that means, then people reading code that uses it are not going to know
what that code does. This is exactly the same for using standard
function names like operator++ to mean something other than the
standard use or having conversion operators for types to which
conversion makes no sense.
Post by Markus Wichmann
- x might be a reference to typeof y
- y might be a reference to something
- x might even reference y, which means that this line invokes
  undefined behaviour
- or y might reference x, which yields the same
These latter cases can all occur in C if the line is changed to:

*x += (*y)++;
Post by Markus Wichmann
As you can see, C++ blows this line all the way from "something will be
incremented and something else will store the old value" to "no friggin
idea what this line does! Might be anything. And I wouldn't even know
where to start looking"
Only if you're an idiot. I'm sorry, but it's simply true. If you
can't look at declarations of your x and y variables to see what type
they are then you are quite fucked in whatever language you choose to
be programming in.

You might of course be tempted to say something like, "But I shouldn't
have to go looking at the type to know what that line is doing." This
is of course a good argument but it is a good argument for variable
names that are more informative than 'x' and 'y'. Variable names need
to have some semantic meaning that explains what the variable is used
for. When that is done well you generally know what can be done with
the variable, which is more important than what type it is (which is
something I can let the compiler worry about).
Post by Markus Wichmann
So far I've seen only one useful application of operator overloading and
references, and that was a typesafe printf() implementation (which
basically has the compiler choose the correct functions according to the
argument type and those functions then check whether the conversion
specification on the format string was correct).
Which is a pretty excellent example of great use of templates.

If that's the only example that you've seen though then you're
obviously not looking at a lot of C++.
Post by Markus Wichmann
Templates are yet another pothole for the learning programmer: every
instantiation creates an entirely new and unrelated set of routines and
class variables.
As well it should. A list of widgets is not the same as a list of
blobbets. If they were then you'd instantiate the same template to
hold them both and your argument is moot.

C programmers tend to bring up this bloat "issue" a lot. It's not an
issue. In C++ you write a template to create new types. Templates
are like the preprocessor on steroids and fulfill many, but not all,
of the tasks originally reserved for macros. While I can write a
template for a container that works with any type in C++, in C I'm
stuck having to use void* or macros. The former might be a good
solution that doesn't introduce bloat, but it does require a lot of
care and may also introduce an unnecessary level of obscurity; it's
also no different from instantiating std::list<void*>. The other
option, macros, can get you a lot of the same thing, but they're MUCH
harder to use and are inherently unsafe.
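A tiny made-up illustration of the difference:

#include <iostream>

#define MAX_MACRO(a, b) ((a) > (b) ? (a) : (b))   // may evaluate an argument twice

template <typename T>
T max_tmpl(T a, T b) { return a > b ? a : b; }    // type-checked, evaluated once

int main()
{
    int i = 1;
    int m = MAX_MACRO(i++, 0);   // increments i twice: the classic macro trap
    int t = max_tmpl(i, 5);      // no such surprise
    std::cout << i << " " << m << " " << t << "\n";   // prints "3 2 5"
    return 0;
}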

If you want to write generic code that operates at the static type
level then you really need templates. Macros can almost get you there
but at a very high cost.

The one thing that can be said against templates is that they can be
hard to learn. The syntax isn't exactly optimal and the requirements
of when, where, and how to use the typename and template keywords can
be confusing. For example:

struct has_typedef { typedef int some_typedef; };

template < typename T >
struct example
{
    // 'typename' is required: T::some_typedef is a dependent name
    typedef typename T::some_typedef local_typedef;
    template < typename U > static void fun() {}
};

template < typename T >
void call_it()
{
    // 'template' is required: fun is a member template of a dependent type
    example<T>::template fun<double>();
}

int main() { call_it<has_typedef>(); }

It's kind of confusing and ugly. Bloated? Depends on how much of the
code depends on the template's parameters. If none of it does, then yeah,
it's going to be unnecessarily bloated. If most of it does then no,
it's going to be about as "bloated" as it has to be.

Of course, the generic programming paradigm, which is what templates
are pretty much about, is often foreign to someone stuck in the C
world and unwilling to expand their knowledge. This is no different
from any other tool though. If you don't want to learn these things,
that's fine, but your willing ignorance isn't a good argument.
Post by Markus Wichmann
class A { static int count; };
then there's only one A::count in the whole program. If I have
template <class T> class A {static int count; };
suddenly there's no bound to the number of A::counts in the program,
because there is _no_ A::count, but instead A<int>::count,
A<long>::count, etc.
As well it should. If you mean to count the number of times a
particular class is instantiated then you SHOULD have a different
count for each class. Templates are not classes. They are
*templates* for creating classes...and other things. If you want to
count the number of times any class built from that template is
instantiated then you need to account for that with a different
structure. Perhaps something like so:

struct count_these { static int count; };
int count_these::count = 0; /* one shared definition for every A<T> */
template < typename T > struct A { static count_these our_count; };
template < typename T > count_these A<T>::our_count;

This also more accurately reflects the fact that you are counting many
different things in the same bucket.
Malcolm McLean
2011-10-30 08:07:42 UTC
Permalink
Post by Jorgen Grahn
Post by Malcolm McLean
You're doing object-oriented design when your objects have
relationships to each other, in C++ class hierarchies, in other
languages interfaces.
I accept that object-oriented programming (as Stroustrup defines it,
at least) includes run-time polymorphism.  But let's say I do my design,
look for places where a "is-a" relationship is the best solution --
and I don't find any. Am I suddenly not doing object-oriented design?
Because that's what happens to me in practice. Less than 5% of my
classes do run-time polymorphism.
Let's say you examine your program, and find that there are no logical
subroutines you can break out of the main loop. Are you still doing
structured programming?
Post by Jorgen Grahn
Post by Malcolm McLean
And C++ doesn't buy you much.
I work in both languages and I can assure you: it *does* make a big
difference. Not the inheritance, but all the other features.
I don't find it does.

I find that typically my programs are structured as arrays of arrays
of arrays, mainly of structures at the upper levels. Where I need more
complex structures, quite often special links are needed for
performance, or for the algorithm. So in both cases the use of fancy
containers doesn't make much sense. I'm not saying it wouldn't help if
there was a way of avoiding the little realloc dance when resizing a C
array, but this small benefit doesn't justify all the complexity in
interfacing that use of containers introduces. I know that in the C++
stl you can do everything with iterators, the problem is that most
programmers aren't disciplined enough to use the system.

--
Basic Algorithms: in C
http://www.malcolmmclean.site11.com/www
Ian Collins
2011-10-30 08:22:11 UTC
Permalink
Post by Malcolm McLean
Post by Jorgen Grahn
Post by Malcolm McLean
And C++ doesn't buy you much.
I work in both languages and I can assure you: it *does* make a big
difference. Not the inheritance, but all the other features.
I don't find it does.
I find that typically my programs are structured as arrays of arrays
of arrays, mainly of structures at the upper levels. Where I need more
complex structures, quite often special links are needed for
performance, or for the algorithm. So in both cases the use of fancy
containers doesn't make much sense. I'm not saying it wouldn't help if
there was a way of avoiding the little realloc dance when resizing a C
array, but this small benefit doesn't justify all the complexity in
interfacing that use of containers introduces. I know that in the C++
stl you can do everything with iterators, the problem is that most
programmers aren't disciplined enough to use the system.
Where is "all the complexity in interfacing" introduced when using a
vector of whatever in place of a raw array? As you state, the
complexity is significantly reduced by automatic memory management.

You seem to do a good job of avoiding answering direct questions, so I'm
curious to see if you reply.
--
Ian Collins
Malcolm McLean
2011-10-30 11:41:11 UTC
Permalink
Post by Ian Collins
Where is "all the complexity in interfacing" introduced when using a
vector of whatever in place of a raw array?  As you state, the
complexity is significantly reduced by automatic memory management.
In C, typically functions that operate on a list will take a pointer
to an array of structures, and a count. There's then a loop from 0 to
count stepping through the array.

In C++ standard template library, the standard way is to pass in two
iterators, one to the start and one to the end of the sequence. The
iterator is then incremented until it equals the end point.
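
To make the two conventions concrete (a sketch, names invented):

#include <stddef.h>

/* C convention: pointer plus count */
double sum_c(const double *a, size_t n)
{
    double s = 0;
    for (size_t i = 0; i < n; i++)
        s += a[i];
    return s;
}

/* STL convention: a pair of iterators */
template <typename It>
double sum_cpp(It first, It last)
{
    double s = 0;
    for (; first != last; ++first)
        s += *first;
    return s;
}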

The C++ way is considerably more complicated, both in terms of what
the compiler is actually doing, and to use. The proof is that it's
very common to see functions written "incorrectly".

--
MiniBasic - a complete simple Basic interpreter, in C
http://www.malcolmmclean.site11.com/www
James Kuyper
2011-10-30 12:40:39 UTC
Permalink
Post by Malcolm McLean
Post by Ian Collins
Where is "all the complexity in interfacing" introduced when using a
vector of whatever in place of a raw array? As you state, the
complexity is significantly reduced by automatic memory management.
In C, typically functions that operate on a list will take a pointer
to an array of structures, and a count. There's then a loop from 0 to
count stepping through the array.
In C++ standard template library, the standard way is to pass in two
iterators, one to the start and one to the end of the sequence. The
iterator is then incremented until it equals the end point.
You can write C that way, too; and the C++ standard library also
contains counted versions of most of the functions that take a range
delimited by two iterators. Example:

std::copy_n(in_array, size, out_array)
--
James Kuyper
nroberts
2011-10-30 17:56:34 UTC
Permalink
Post by Malcolm McLean
Post by Ian Collins
Where is "all the complexity in interfacing" introduced when using a
vector of whatever in place of a raw array?  As you state, the
complexity is significantly reduced by automatic memory management.
Couple things on this...
Post by Malcolm McLean
In C, typically functions that operate on a list will take a pointer
to an array of structures, and a count. There's then a loop from 0 to
count stepping through the array.
Why would a C developer write a function that operates on a list but
takes an array?
Post by Malcolm McLean
In C++ standard template library, the standard way is to pass in two
iterators, one to the start and one to the end of the sequence. The
iterator is then incremented until it equals the end point.
This is if you're writing function templates that need to operate on
generic containers. If you are writing a function that operates on a
particular instance you generally simply pass the container by
reference and use its members to get the information you need to
perform your algorithm.
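
A sketch of that simpler case (an invented function, deliberately
non-generic):

#include <vector>

double average(const std::vector<double>& v)
{
    double s = 0;
    for (std::vector<double>::size_type i = 0; i < v.size(); ++i)
        s += v[i];
    return v.empty() ? 0 : s / v.size();
}

No iterator-pair plumbing needed; the container carries its own size.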
Post by Malcolm McLean
The C++ way is considerably more complicated, both in terms of what
the compiler is actually doing, and to use. The proof is that it's
very common to see functions written "incorrectly".
Please explain the former claim. What is it that the compiler is doing
that is more complicated here? I would tend to intuit that it would be
exactly the opposite as the compiler would generally translate the
index version into the iterator version.

As to more complicated to use, I just don't see it. Can you give an
example of an incorrectly written function?
Nick Keighley
2011-10-30 14:56:53 UTC
Permalink
Post by Malcolm McLean
Post by Jorgen Grahn
Post by Malcolm McLean
You're doing object-oriented design when your objects have
relationships to each other, in C++ class hierarchies, in other
languages interfaces.
I accept that object-oriented programming (as Stroustrup defines it,
at least) includes run-time polymorphism.  But let's say I do my design,
look for places where a "is-a" relationship is the best solution --
and I don't find any. Am I suddenly not doing object-oriented design?
Because that's what happens to me in practice. Less than 5% of my
classes do run-time polymorphism.
Let's say you examine your program, and find that there are no logical
subroutines you can break out of the main loop. Are you still doing
structured programming?
Post by Jorgen Grahn
Post by Malcolm McLean
And C++ doesn't buy you much.
I work in both languages and I can assure you: it *does* make a big
difference. Not the inheritance, but all the other features.
I don't find it does.
I find that typically my programs are structured as arrays of arrays
of arrays, mainly of structures at the upper levels.
what? How can your *program* be structured as "arrays of arrays...".
My programs are hierarchies of function calls. Nearly a tree but most
likely a DAG. Did you mean your data structures are structs containing
arrays? Seems very limited.
Post by Malcolm McLean
Where I need more
complex structures, quite often special links are needed for
performance, or for the algorithm. So in both cases the use of fancy
containers doesn't make much sense.
the STL containers have pretty good performance.
Post by Malcolm McLean
I'm not saying it wouldn't help if
there was a way of avoiding the little realloc dance when resizing a C
array, but this small benefit doesn't justify all the complexity in
interfacing that use of containers introduces.
*what* complication.

int a [10];
int b [10];
a[0] = 27;
a[1] = a[2];
memcpy (b, a, 10);

std::vector<int> v1(10);
std::vector<int> v2(10);
v1[0] = 27;
v1[1] = v1[2];
memcpy (&v2[0], &v1[0], 10);

I don't see the problem.
Post by Malcolm McLean
I know that in the C++
stl you can do everything with iterators, the problem is that most
programmers aren't disciplined enough to use the system.
but you don't have to. Why does it matter if they don't?
Malcolm McLean
2011-10-30 15:13:22 UTC
Permalink
Post by Nick Keighley
Post by Malcolm McLean
I know that in the C++
stl you can do everything with iterators, the problem is that most
programmers aren't disciplined enough to use the system.
but you don't have to. Why does it matter if they don't?
Think of the data elements as plugs and the functions as sockets.

Not every plug will fit every socket. As you multiply plugs and
sockets, the system gradually becomes harder and harder to understand,
use, and change.

That's why the standard template library tries to be a sort of adapter
that will hold any plug. But the system only works if everyone writes
all their functions to take iterators. In practice, people don't, for
a variety of reasons. So you end up with an even more complicated
welter of plugs and sockets than you had before.

--
C programming resources: Basic Algorithms now available in both print
and electronic editions
http://www.malcolmmclean.site11.com/www
Ben Bacarisse
2011-10-30 18:05:31 UTC
Permalink
Post by Malcolm McLean
Post by Nick Keighley
Post by Malcolm McLean
I know that in the C++
stl you can do everything with iterators, the problem is that most
programmers aren't disciplined enough to use the system.
but you don't have to. Why does it matter if they don't?
Think of the data elements as plugs and the functions as sockets.
Not every plug will fit every socket. As you multiply plugs and
sockets, the system gradually becomes harder and harder to understand,
use, and change.
That's why the standard template library tries to be a sort of adapter
that will hold any plug. But the system only works if everyone writes
all their functions to take iterators. In practice, people don't, for
a variety of reasons. So you end up with an even more complicated
welter of plugs and sockets than you had before.
How does C solve this problem of having a great many data types and badly
written (i.e. overly specific) function interfaces?

<snip>
--
Ben.
Malcolm McLean
2011-10-30 18:38:21 UTC
Permalink
Post by Ben Bacarisse
How does C solve this problem of having a great many data types and badly
written (i.e. overly specific) function interfaces?
It doesn't really solve the problem. One of the most dangerous
features of C is the ability to typedef a basic type to something
like, say DWORD. Then you find yourself rewriting perfectly good code,
just because someone decided to put DWORDs where they really meant
"int", and the code no longer runs under the particular operating
system where DWORDs are used.

But it alleviates it, because it's easy to write a function that
operates on lists (a list is an ordered collection, usually of like
items) as taking an array and a count. It's hard to do anything
fancier, like wrapping the list into a structure with a "length"
member, creating a linked list, or semi-hardcoding the length of the
array with a preprocessor define. So the plugs might not fit the
sockets, but at least all the sockets are set up in a similar way.

Once you start allowing containers, that simplicity goes.

--
Lots of programming resources
http://www.malcolmmclean.site11.com/www
Ian Collins
2011-10-30 18:56:51 UTC
Permalink
Post by Malcolm McLean
Post by Ben Bacarisse
How does C solve this problem of having a great many data types and badly
written (i.e. overly specific) function interfaces?
It doesn't really solve the problem. One of the most dangerous
features of C is the ability to typedef a basic type to something
like, say DWORD. Then you find yourself rewriting perfectly good code,
just because someone decided to put DWORDs where they really meant
"int", and the code no longer runs under the particular operating
system where DWORDs are used.
Conditionally typedef depending on the platform?
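
Something like this sketch (the Windows branch matches the usual
definition; the fallback is an assumption for illustration):

#if defined(_WIN32)
typedef unsigned long DWORD;  /* the definition the Windows headers use */
#else
typedef unsigned int DWORD;   /* stand-in on other platforms */
#endif

Then code written against DWORD keeps compiling off Windows.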
Post by Malcolm McLean
But it alleviates it, because it's easy to write a function that
operates on lists (a list is an ordered collection, usually of like
items) as taking an array and a count. It's hard to do anything
fancier, like wrapping the list into a structure with a "length"
member, creating a linked list, or semi-hardcoding the length of the
array with a preprocessor define. So the plugs might not fit the
sockets, but at least all the sockets are set up in a similar way.
Once you start allowing containers, that simplicity goes.
Once again, how? Is

void f( std::vector<int>& );

more complex than

void f( int*, size_t );

?

With a container, the size information is embedded, no need to pass a
count (which you would have to manually track). From where I sit, the
container removes at least two bits of complexity: you don't have to
track a size and you don't have to worry about a null pointer.
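
The two interfaces above, sketched as implementations (bodies invented
for illustration):

#include <vector>
#include <cstddef>

void f(std::vector<int>& v)        /* size travels with the data */
{
    for (std::size_t i = 0; i < v.size(); ++i)
        v[i] = 0;
}

void f(int* p, std::size_t n)      /* every caller must track n ... */
{
    if (!p) return;                /* ... and guard against null */
    for (std::size_t i = 0; i < n; ++i)
        p[i] = 0;
}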
--
Ian Collins
Malcolm McLean
2011-10-31 08:11:38 UTC
Permalink
Once again, how?  Is
void f( std::vector<int>& );
more complex than
void f( int*, size_t );
?
With a container, the size information is embedded, no need to pass a
count (which you would have to manually track).  From where I sit, the
container removes at least two bits of complexity: you don't have to
track a size and you don't have to worry about a null pointer.
You have to use std::vectors everywhere with the C++ version, if you
want to call f. You have to use arrays of ints everywhere with the C
version if you want to do the same.

The C situation will happen, the C++ situation won't.

Snippets don't really illustrate the problem very well. Computer
programs begin to break down when complexity gets beyond a certain
level, and a human can no longer keep track of all the variables and
types within the program.

Tagging an array with a size member is a genuine advantage of a
vector. The cost is that it becomes harder to see what the compiler is
doing. Another problem is that, often, code ends up being written with
vect.size() rather than N, which makes the expressions unreadable.
Loss of the null pointer isn't such an advantage. I've seen C++ code
with "null objects" to get round the problem that sometimes things are
missing and we need to express "not there".

--
C programming resources
http://www.malcolmmclean.site11.com/www
Ian Collins
2011-10-31 08:34:28 UTC
Permalink
Post by Malcolm McLean
Post by Ian Collins
Once again, how? Is
void f( std::vector<int>& );
more complex than
void f( int*, size_t );
?
With a container, the size information is embedded, no need to pass a
count (which you would have to manually track). From where I sit, the
container removes at least two bits of complexity: you don't have to
track a size and you don't have to worry about a null pointer.
You have to use std::vectors everywhere with the C++ version, if you
want to call f. You have to use arrays of ints everywhere with the C
version if you want to do the same.
The C situation will happen, the C++ situation won't.
Eh? If you want to call a function that requires a specific type, you
have to use that type irrespective of the language.
Post by Malcolm McLean
Snippets don't really illustrate the problem very well. Computer
programs begin to break down when complexity gets beyond a certain
level, and a human can no longer keep track of all the variables and
types within the program.
Which is where encapsulation is the programmer's friend; it helps to
reduce the complexity exposed to the programmer.
Post by Malcolm McLean
Tagging an array with a size member is a genuine advantage of a
vector. The cost is that it becomes harder to see what the compiler is
doing.
In all but the most basic situations, the programmer lost track of what
the compiler is doing when compilers grew optimisers.
Post by Malcolm McLean
Another problem is that, often, code ends up being written with
vect.size() rather than N, which makes the expressions unreadable.
You've lost me there, what is N? If it is the size of an array,
doesn't all the additional code used to track it do far more to make the
code hard to follow? container.size() is very idiomatic in C++, so its
use doesn't cause any readability issues. If you do want to simplify an
expression, just add a "const size_t n = vect.size();" before the
expression. This is no worse than keeping tabs on an array's length -
you just start keeping tabs at the last moment!
Post by Malcolm McLean
Loss of the null pointer isn't such an advantage. I've seen C++ code
with "null objects" to get round the problem that sometimes things are
missing and we need to express "not there".
You'll see kludges in any language! Null objects are an abomination and
anyone using them should be tarred and feathered by their peers.
--
Ian Collins
Ben Bacarisse
2011-10-30 19:46:32 UTC
Permalink
Post by Malcolm McLean
Post by Ben Bacarisse
How does C solve this problem of having a great many data types and badly
written (i.e. overly specific) function interfaces?
It doesn't really solve the problem. One of the most dangerous
features of C is the ability to typedef a basic type to something
like, say DWORD. Then you find yourself rewriting perfectly good code,
just because someone decided to put DWORDs where they really meant
"int", and the code no longer runs under the particular operating
system where DWORDs are used.
But it alleviates it, because it's easy to write a function that
operates on lists (a list is an ordered collection, usually of like
items) as taking an array and a count. It's hard to do anything
fancier, like wrapping the list into a structure with a "length"
member, creating a linked list, or semi-hardcoding the length of the
array with a preprocessor define. So the plugs might not fit the
sockets, but at least all the sockets are set up in a similar way.
Once you start allowing containers, that simplicity goes.
Best just to say I disagree, in that this does not match my experience
with C++. In my limited experience, as the types proliferate, C++
starts to win out big time.

Some code might clarify things. You might want to sketch a situation
with lots of "plugs and sockets" which gets more complex when written in
C++.
--
Ben.
Malcolm McLean
2011-10-30 15:33:58 UTC
Permalink
Post by Nick Keighley
what? How can your *program* be structured as "arrays of arrays...".
My programs are hierarchies of function calls. Nearly a tree but most
likely a DAG. Did you mean your data structures are structs containing
arrays? Seems very limited.
The data in the program, not the code.

Most things naturally fall into arrays of arrays. For instance a
protein consists of an array of atoms, each of which has an element
type and an x y z position. The atoms are grouped into amino acid
residues. The residues are grouped into chains. Then you might be
working on more than one protein.
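
In code, that layout is just nesting (type names invented as a sketch):

#include <stddef.h>

struct Atom    { char element[3]; double x, y, z; };
struct Residue { Atom *atoms;       size_t natoms;  };
struct Chain   { Residue *residues; size_t nres;    };
struct Protein { Chain *chains;     size_t nchains; };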

That's what most data is like. Arrays are by far the most commonly
used data structure in C. They're the only one which has explicit
syntactical support.

That's not to say you never use other structures; there's a case for
representing the bonds between atoms as a graph, for instance. You
might want to do that for some applications.
--
Fuzzy Logic Trees - interpretable machine learning
http://www.malcommclean.site11.com/www
Ben Bacarisse
2011-10-30 18:00:03 UTC
Permalink
Post by Malcolm McLean
Post by Nick Keighley
what? How can your *program* be structured as "arrays of arrays...".
My programs are hierarchies of function calls. Nearly a tree but most
likely a DAG. Did you mean your data structures are structs containing
arrays? Seems very limited.
<snip>
Post by Malcolm McLean
That's what most data is like. Arrays are by far the most commonly
used data structure in C. They're the only one which has explicit
syntactical support.
What makes this true of arrays and not of structs? I.e. I don't see how
you define a data structure with explicit syntactical support.

<snip>
--
Ben.
nroberts
2011-10-30 18:00:00 UTC
Permalink
Post by Nick Keighley
Post by Malcolm McLean
Post by Jorgen Grahn
Post by Malcolm McLean
You're doing object-oriented design when your objects have
relationships to each other, in C++ class hierarchies, in other
languages interfaces.
I accept that object-oriented programming (as Stroustrup defines it,
at least) includes run-time polymorphism.  But let's say I do my design,
look for places where a "is-a" relationship is the best solution --
and I don't find any. Am I suddenly not doing object-oriented design?
Because that's what happens to me in practice. Less than 5% of my
classes do run-time polymorphism.
Let's say you examine your program, and find that there are no logical
subroutines you can break out of the main loop. Are you still doing
structured programming?
Post by Jorgen Grahn
Post by Malcolm McLean
And C++ doesn't buy you much.
I work in both languages and I can assure you: it *does* make a big
difference. Not the inheritance, but all the other features.
I don't find it does.
I find that typically my programs are structured as arrays of arrays
of arrays, mainly of structures at the upper levels.
what? How can your *program* be structured as "arrays of arrays...".
My programs are hierarchies of function calls. Nearly a tree but most
likely a DAG. Did you mean your data structures are structs containing
arrays? Seems very limited.
Post by Malcolm McLean
Where I need more
complex structures, quite often special links are needed for
performance, or for the algorithm. So in both cases the use of fancy
containers doesn't make much sense.
the STL containers have pretty good performance.
Post by Malcolm McLean
I'm not saying it wouldn't help if
there was a way of avoiding the little realloc dance when resizing a C
array, but this small benefit doesn't justify all the complexity in
interfacing that use of containers introduces.
*what* complication.
   int a [10];
   int b [10];
   a[0] = 27;
   a[1] = a[2];
   memcpy (b, a, 10);
   std::vector<int> v1(10);
   std::vector<int> v2(10);
   v1[0] = 27;
   v1[1] = v1[2];
   memcpy (&v2[0], &v1[0], 10);
Why would you do that?? You're much better off using assignment and
letting the implementation decide if memcpy is even the best thing to
do! You're getting no benefit by using it directly and only introduce
extra complexity and confusion by using the C API here.
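
That is, a sketch of what was meant:

#include <vector>

int main()
{
    std::vector<int> v1(10);
    v1[0] = 27;
    v1[1] = v1[2];

    std::vector<int> v2 = v1;  /* copies all 10 ints, size included */
    return 0;
}

The element count and element size become the implementation's problem,
which is exactly the point being made above.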
Post by Nick Keighley
I don't see the problem.
Post by Malcolm McLean
I know that in the C++
stl you can do everything with iterators, the problem is that most
programmers aren't disciplined enough to use the system.
but you don't have to. Why does it matter if they don't?
io_x
2011-10-30 18:09:14 UTC
Permalink
Post by Malcolm McLean
Post by Malcolm McLean
You're doing object-oriented design when your objects have
relationships to each other, in C++ class hierarchies, in other
languages interfaces.
<cut>
Post by Malcolm McLean
Where I need more
complex structures, quite often special links are needed for
performance, or for the algorithm. So in both cases the use of fancy
containers doesn't make much sense.
the STL containers have pretty good performance.
Post by Malcolm McLean
I'm not saying it wouldn't help if
there was a way of avoiding the little realloc dance when resizing a C
array, but this small benefit doesn't justify all the complexity in
interfacing that use of containers introduces.
*what* complication.

int a [10];
int b [10];
a[0] = 27;
a[1] = a[2];
memcpy (b, a, 10);

#io_x
#the complication is that 'a' and 'b' are 10 ints and you copy 10 chars
#i don't think you meant to copy just the first 10 chars.
#i would write memcpy(b,a, 10*sizeof(int));

std::vector<int> v1(10);
std::vector<int> v2(10);
v1[0] = 27;
v1[1] = v1[2];
memcpy (&v2[0], &v1[0], 10);

#here, as above
#i would not write memcpy with a class when i don't know
#what is inside it; possibly
#memcpy(&v2[0], &v1[0], 10*sizeof(v1[0]));
#is ok, but i would only do it if i implemented vector<> myself, or
#if the debugger has access to the library code
#so i can step into what happens and check the compiler's translation is ok.
#i don't say i'm right in that, nor that i'm a good programmer

I don't see the problem.
Post by Malcolm McLean
I know that in the C++
stl you can do everything with iterators, the problem is that most
programmers aren't disciplined enough to use the system.
but you don't have to. Why does it matter if they don't?
Malcolm McLean
2011-10-30 18:51:48 UTC
Permalink
Post by Nick Keighley
   std::vector<int> v1(10);
   std::vector<int> v2(10);
   v1[0] = 27;
   v1[1] = v1[2];
   memcpy (&v2[0], &v1[0], 10);
I don't see the problem.
This is classic badly-written C++. As always, the issue is hard to
illustrate in a snippet or toy example, but bites you in real code.

std::vector<int> grocery_ids(10);

/* deep down in the gubbins */
memcpy(&grocery_ids[0], toptensellers, 10 * sizeof(int));

Ten years later, the business expands. We have been giving each line
of groceries an id, but now we've reached the 2 billion mark.

the first line becomes

std::vector<BIGNUM> grocery_ids(10);

Now most of the code has been written using iterators, and is robust
to this. But your little routine, hidden away deep in the gubbins, is
now a bug.

--
Lots of C programming resources
http://www.malcolmmclean.site11.com/www
Ben Bacarisse
2011-10-30 20:36:45 UTC
Permalink
Post by Malcolm McLean
Post by Nick Keighley
   std::vector<int> v1(10);
   std::vector<int> v2(10);
   v1[0] = 27;
   v1[1] = v1[2];
   memcpy (&v2[0], &v1[0], 10);
I don't see the problem.
This is classic badly-written C++. As always, the issue is hard to
illustrate in a snippet or toy example, but bites you in real code.
std::vector<int> grocery_ids(10);
/* deep down in the gubbins */
memcpy(&grocery_ids[0], toptensellers, 10 * sizeof(int));
I agree it's bad C++ but it's bad C as well. You need

memcpy(&grocery_ids[0], toptensellers, 10 * sizeof grocery_ids[0]);

to avoid the most obvious problems down the line. If the two arrays
end up having different element types, the bug is much more serious than
just this memcpy. Here again, C++ offers a simple solution (use the
"counted" form of std::copy) where C does not.
Post by Malcolm McLean
Ten years later, the business expands. We have been giving each line
of groceries an id, but now we've reached the 2 billion mark.
the first line becomes
std::vector<BIGNUM> grocery_ids(10);
Now most of the code has been written using iterators, and is robust
to this. But your little routine, hidden away deep in the gubbins, is
now a bug.
I'm still struggling to see where C++ adds problems rather than offering
solutions.
--
Ben.
Richard Damon
2011-10-24 16:16:13 UTC
Permalink
Post by Jorgen Grahn
Post by Malcolm McLean
Post by James Kuyper
Post by Richard Damon
In one day they aren't using much of STL, namespaces (except adding a
using namespace std;) or other advanced features.
A statement that "I've learned C++" would not be justified by that level
of understanding.
There weren't any templates or namespaces, and exception handling was
widely non-implemented.
Basically you were using C++ once you had an object hierarchy,
The kind of perverted C++ I hate the most is C code squeezed into an
unnatural inheritance hierarchy. I wasn't around in the early days,
but I get the impression that virtual inheritance was the C++ feature
stuck in the minds of many C programmers switching.
I find run-time polymorphism one of the less useful features of C++. I
design a lot of classes, but there are very few "is-a" relationships
between them.
/Jorgen
It probably depends a lot on the type of programs you write; I find I use
it a lot. It also is fundamental to some parts of C++, for example
iostreams (the generic use of streams for insertion and the separation
of streams and buffers as implemented very much depends on
polymorphism). Also exceptions would be a lot less friendly without
polymorphism.

I do agree that using inheritance just because you might get some code
sharing can lead to very bad code, and I've seen my share of "forced"
class hierarchy, but there are a lot of good uses for run-time
polymorphism.
Jorgen Grahn
2011-10-25 20:30:07 UTC
Permalink
Post by Richard Damon
Post by Jorgen Grahn
Post by Malcolm McLean
Post by James Kuyper
Post by Richard Damon
In one day they aren't using much of STL, namespaces (except adding a
using namespace std;) or other advanced features.
A statement that "I've learned C++" would not be justified by that level
of understanding.
There weren't any templates or namespaces, and exception handling was
widely non-implemented.
Basically you were using C++ once you had an object hierarchy,
The kind of perverted C++ I hate the most is C code squeezed into an
unnatural inheritance hierarchy. I wasn't around in the early days,
but I get the impression that virtual inheritance was the C++ feature
stuck in the minds of many C programmers switching.
I find run-time polymorphism one of the less useful features of C++. I
design a lot of classes, but there are very few "is-a" relationships
between them.
It probably depends a lot on the type of programs you write; I find I use
it a lot.
Yes -- I was about to add such a reservation to my posting.
I think my programming is quite varied, but I do no GUI programming
for example.
Post by Richard Damon
It also is fundamental to some parts of C++, for example
iostreams (the generic use of streams for insertion and the separation
of streams and buffers as implemented very much depends on
polymorphism).
It's great that you can treat strings as files, but apart from that
the polymorphism is pretty much under the hood.
Post by Richard Damon
Also exceptions would be a lot less friendly without
polymorphism.
Yes, that's another -- they more or less need it.
Post by Richard Damon
I do agree that using inheritance just because you might get some code
sharing can lead to very bad code, and I've seen my share of "forced"
class hierarchy, but there are a lot of good uses for run-time
polymorphism.
There *are* good uses, and it's a feature that belongs in C++, but
it's IMHO overused.

/Jorgen
--
// Jorgen Grahn <grahn@ Oo o. . .
\X/ snipabacken.se> O o .
Bill Cunningham
2011-10-24 19:52:29 UTC
Permalink
Post by Dann Corbit
C is close enough to the hardware to allow me to avoid writing
assembly in order to keep a program fast. Yet C is abstract enough
to write complicated ideas in a symbolic way in order to make the
code easy to maintain.
C is the mother of the modern OO languages like C++ and Java.
My first programming language was Fortran IV. My second programming
language was PL/1. But C (while the 3rd programming language that I
learned) was the first programming language that I loved.
And I am really, really hard to please.
There are some programming giants. Donald Knuth, W. Richard Stevens,
and Dennis Ritchie top my list. How about yours?
Charles Babbage and Ada Lovelace.
Bill Cunningham
2011-10-25 01:40:51 UTC
Permalink
Post by Bill Cunningham
Post by Dann Corbit
C is close enough to the hardware to allow me to avoid writing
assembly in order to keep a program fast. Yet C is abstract enough
to write complicated ideas in a symbolic way in order to make the
code easy to maintain.
C is the mother of the modern OO languages like C++ and Java.
My first programming language was Fortran IV. My second programming
language was PL/1. But C (while the 3rd programming language that I
learned) was the first programming language that I loved.
And I am really, really hard to please.
There are some programming giants. Donald Knuth, W. Richard Stevens,
and Dennis Ritchie top my list. How about yours?
Charles Babbage and Ada Lovelace.
And don't forget Jim Gosling.
Fritz Wuehler
2011-10-26 04:13:19 UTC
Permalink
Post by Bill Cunningham
Post by Dann Corbit
C is the mother of the modern OO languages like C++ and Java.
Oh it's a mother alright, but the OO comes from Simula and elsewhere. Go
read some Bjarne and find out the truth.
Post by Bill Cunningham
Post by Dann Corbit
My first programming language was Fortran IV. My second programming
language was PL/1. But C (while the 3rd programming language that I
learned) was the first programming language that I loved.
You're one sick puppy. PL/I (not 1, ya big MORON!) stomps C flat.
Post by Bill Cunningham
Post by Dann Corbit
And I am really, really hard to please.
Not that you're hard to please, you're just dopey.
Post by Bill Cunningham
And don't forget Jim Gosling.
Yeah, the guy who wrote Emacs! Hey Stallman, eat shit!
Nick Keighley
2011-10-26 08:34:47 UTC
Permalink
PL/I [...] stomps C flat.
which is why magazines used to feature articles that started "So, what
does this [apparently innocuous] line of PL/I actually do? You might
be surprised at the answer!"

Its tendency to allow almost anything to be assigned to anything else
led to quite astonishing results.

"PL/I: the best of both FORTRAN *and* COBOL"
Bill Cunningham
2011-10-26 13:47:38 UTC
Permalink
Post by Fritz Wuehler
Post by Bill Cunningham
Post by Dann Corbit
C is the mother of the modern OO languages like C++ and Java.
Oh it's a mother alright, but the OO comes from Simula and elsewhere.
Go read some Bjarne and find out the truth.
Post by Bill Cunningham
Post by Dann Corbit
My first programming language was Fortran IV. My second
programming language was PL/1. But C (while the 3rd programming
language that I learned) was the first programming language that I
loved.
You're one sick puppy. PL/I (not 1, ya big MORON!) stomps C flat.
Post by Bill Cunningham
Post by Dann Corbit
And I am really, really hard to please.
Not that you're hard to please, you're just dopey.
Post by Bill Cunningham
And don't forget Jim Gosling.
Yeah, the guy who wrote Emacs! Hey Stallman, eat shit!
True but I kinda meant Java.

Bill
Fritz Wuehler
2011-10-27 02:54:07 UTC
Permalink
Post by Bill Cunningham
Post by Fritz Wuehler
Post by Bill Cunningham
And don't forget Jim Gosling.
Yeah, the guy who wrote Emacs! Hey Stallman, eat shit!
True but I kinda meant Java.
Yeah but Java borrowed/stole heavily from other stuff also, notably
Modula-3 and the same foundations C++ was based on (Simula for example). I
don't think Java was Gosling's biggest achievement, although it certainly
brought him the most fame and fortune. Before Java he did some real
inventing.
luser- -droog
2011-10-27 18:27:43 UTC
Permalink
On Oct 26, 9:54 pm, Fritz Wuehler
Post by Fritz Wuehler
Post by Fritz Wuehler
Post by Bill Cunningham
And don't forget Jim Gosling.
Yeah, the guy who wrote Emacs! Hey Stallman, eat shit!
    True but I kinda meant Java.
Yeah but Java borrowed/stole heavily from other stuff also, notably
Modula-3 and the same foundations C++ was based on (Simula for example). I
don't think Java was Gosling's biggest achievement although it certainly
made him the most fame and fortune. Before Java he did do some real
invention.
Is that a veiled reference to NeWS? The greatest Window System before
or since!
AK
2011-10-20 07:01:02 UTC
Permalink
Post by Steve Summit
[I haven't posted here in quite some time, but I should
definitely post this here.  It's also on the web at http://www.eskimo.com/~scs/dmr.html.]
I'm a programmer, and just about always have been.
My favorite programming language is still C, and my favorite
operating system is still Unix.  Dennis Ritchie was, of course,
jointly responsible for both.  So I have definitely lost a
personal hero and, to the extent that I can claim I've learned
from his work, a mentor as well.
Same here. In fact, I'm quite pissed at the press in my country, who
gave a front page tribute to Steve Jobs and not even a mention about
Dennis Ritchie.
Post by Steve Summit
<snip>
Amen...
lovecreatesbeauty
2011-10-20 14:50:36 UTC
Permalink
Post by AK
Same here. In fact, I'm quite pissed at the press in my country, who
gave a front page tribute to Steve Jobs and not even a mention about
Dennis Ritchie.
I read it on Google News and New York Times reported it.
Walter Banks
2011-10-20 15:07:45 UTC
Permalink
Post by AK
Post by Steve Summit
<snip>
Same here. In fact, I'm quite pissed at the press in my country, who
gave a front page tribute to Steve Jobs and not even a mention about
Dennis Ritchie.
Dennis Ritchie is on a short list of individuals who had a huge impact
on computing as we know it. The press unfortunately finds it difficult
to explain how the tools we use are essential for the applications we
write.

Steve Jobs set a standard for computing from a user perspective.

w..
Kaz Kylheku
2011-10-20 16:31:58 UTC
Permalink
Post by AK
Post by Steve Summit
<snip>
Same here. In fact, I'm quite pissed at the press in my country, who
gave a front page tribute to Steve Jobs and not even a mention about
Dennis Ritchie.
Why would they start mentioning Ritchie when he died, if they didn't mention
him while he was alive?

Steve Jobs made news while he was alive, so of course (that's why) he made news
when he died.

When an athlete sets a new world record in the 100m hurdles, we do not honor
the inventor of the hurdle.

Whatever people achieve in C is /in spite/ of the language.
Malcolm McLean
2011-10-20 18:13:06 UTC
Permalink
Post by Kaz Kylheku
Whatever people achieve in C is /in spite/ of the language.
Oh rubbish. There are many, many computer languages out there.
University computer scientists are almost obliged to release a new one
as a career development point. C was the one which stuck, and gave
birth to C++ and Java, the other two popular languages.

--
MiniBasic - how to write a script interpreter (my career development
point)
http://www.malcolmmclean.site11.com/www
88888 Dihedral
2011-10-20 18:43:51 UTC
Permalink
I recommend Bill Gates for GW-BASIC and Guido for Python!
Jorgen Grahn
2011-10-21 08:35:58 UTC
Permalink
Post by Malcolm McLean
Post by Kaz Kylheku
Whatever people achieve in C is /in spite/ of the language.
Oh rubbish.
I was going to write a much longer response, but your first sentence
sums it up, really.

/Jorgen
--
// Jorgen Grahn <grahn@ Oo o. . .
\X/ snipabacken.se> O o .
Nick Keighley
2011-10-21 09:45:33 UTC
Permalink
Post by Jorgen Grahn
Post by Malcolm McLean
Post by Kaz Kylheku
Whatever people achieve in C is /in spite/ of the language.
Oh rubbish.
I was going to write a much longer response, but your first sentence
sums it up, really.
quite. "Why Pascal Isn't My Favoutite Language" (or something like
that) is worth a read
André Gillibert
2011-10-24 16:41:52 UTC
Permalink
Post by Nick Keighley
Post by Jorgen Grahn
Post by Malcolm McLean
Post by Kaz Kylheku
Whatever people achieve in C is /in spite/ of the language.
Oh rubbish.
I was going to write a much longer response, but your first sentence
sums it up, really.
quite. "Why Pascal Isn't My Favoutite Language" (or something like
that) is worth a read
The original Pascal language was a teaching tool, although derivatives
from the 1980s and 1990s (Object Pascal, Delphi) became complete
programming languages.

The C programming language was designed as a practical system
programming tool, and so was a good candidate to become popular
for system programming.

IMO, details of the language are not an important part of its success.

If Dennis Ritchie had designed his language by adding the needed
features (explicit pointer arithmetic, dynamic memory allocation,
recursion) to Fortran, then UNIX and the rest of the world would
probably be using his Fortran derivative.

And, if Simula had a faster linker, C++ may have never been created
by Bjarne Stroustrup ...
--
André Gillibert
Jorgen Grahn
2011-10-25 14:06:33 UTC
Permalink
On Mon, 2011-10-24, André Gillibert wrote:
...
Post by André Gillibert
If Dennis Ritchie had designed his language by adding the needed
features (explicit pointer arithmetic, dynamic memory allocation,
recursion) to Fortran, than, UNIX and the rest of the world, would
probably be using his Fortran derivative.
And, if Simula had a faster linker, C++ may have never been created
by Bjarne Stroustrup ...
IIRC, there were other fundamental problems[1] which made Simula unusable
for him. He liked the language, but ran into performance problems when
doing some specific job with it.

He disliked core parts of C, but based C++ off it because ... I guess
for the same reasons we're using it 20 years later.

/Jorgen

[1] Unless the Simula linker is a runtime thing and we're really
talking about the same problem.
--
// Jorgen Grahn <grahn@ Oo o. . .
\X/ snipabacken.se> O o .
88888 Dihedral
2011-10-25 17:13:12 UTC
Permalink
I think that calling OS-related services is always subject to exceptions
and failures. Checking I/O streams with exceptions is a burden!
Charles Richmond
2011-10-21 02:58:47 UTC
Permalink
Post by Kaz Kylheku
[snip...] [snip...]
[snip...]
Same here. In fact, I'm quite pissed at the press in my country, who
gave a front page tribute to Steve Jobs and not even a mention about
Dennis Ritchie.
Why would they start mentioning Ritchie when he died, if they didn't mention
him while he was alive?
Both Dennis Ritchie and Ken Thompson appeared on the national news
broadcasts in the U.S., when President Clinton awarded both the National
Medal of Technology. So dmr was *not* totally ignored by the reporters.
--
+<><><><><><><><><><><><><><><><><><><>+
| Charles Richmond ***@aquaporin4.com |
+<><><><><><><><><><><><><><><><><><><>+
Charles Richmond
2011-10-21 02:52:27 UTC
Permalink
Post by AK
Post by Steve Summit
<snip>
Same here. In fact, I'm quite pissed at the press in my country, who
gave a front page tribute to Steve Jobs and not even a mention about
Dennis Ritchie.
"Thunder is good, thunder is impressive;
but it is lightning that does the work."
-- Mark Twain
--
+<><><><><><><><><><><><><><><><><><><>+
| Charles Richmond ***@aquaporin4.com |
+<><><><><><><><><><><><><><><><><><><>+
io_x
2011-10-23 08:32:35 UTC
Permalink
Post by Charles Richmond
"Thunder is good, thunder is impressive;
but it is lightning that does the work."
L'ape è piccola tra gli esseri alati,
ma il suo prodotto ha il primato fra i dolci sapori.
Sir {11: 3}
Nick Keighley
2011-10-23 12:58:57 UTC
Permalink
Post by io_x
Post by Charles Richmond
"Thunder is good, thunder is impressive;
but it is lightning that does the work."
L'ape è piccola tra gli esseri alati,
ma il suo prodotto ha il primato fra i dolci sapori.
                        Sir {11: 3}
<google translate>

"The bee is small among flying creatures,
but its product has primacy among the sweet flavors."

cute
lovecreatesbeauty
2011-10-20 14:40:21 UTC
Permalink
Post by Steve Summit
[I haven't posted here in quite some time, but I should
definitely post this here.  It's also on the web at http://www.eskimo.com/~scs/dmr.html.]
R.I.P Dennis.
88888 Dihedral
2011-10-25 17:19:25 UTC
Permalink
There was no unit testing in C a long time ago. Programs written with
DEBUG, TESTING and FINAL modes in C need a lot of macros and defines!
Of course, in the final mode the testing and debugging parts won't be
needed! I stumbled into the same problem in C++, too!
Nick Keighley
2011-10-26 08:53:00 UTC
Permalink
could you leave some context in your posts?
Post by 88888 Dihedral
There was no unit test in C long time ago.
it's not very hard to implement a basic unit test facility in C. Even
the humble assert() will do at a pinch.
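
For instance, a minimal sketch (the function under test is invented):

#include <assert.h>

static int add(int a, int b) { return a + b; }

static void test_add(void)
{
    assert(add(2, 2) == 4);
    assert(add(-1, 1) == 0);
}

int main(void)
{
    test_add();
    return 0;  /* reaching here means every assertion held */
}

Compile without -DNDEBUG and a failed case aborts with file and line.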
Post by 88888 Dihedral
Programs written with DEBUG, TESTING and FINAL modes in C needs a lot macros and defines!
I don't agree that it has to be that way. And how do other languages
work around this alleged problem?

unit tests, for instance, can be in a standalone module
Post by 88888 Dihedral
Of course, in the final mode the testing and debugging parts wont be needed!
this only applies to programs that are never modified and hence known
to be bug free
Post by 88888 Dihedral
I stumbled into the same problem in C++, TOO!
probably because you were using the same programmer
88888 Dihedral
2011-10-26 21:40:41 UTC
Permalink
I test extremely, thus I have to recompile a lot and fast for my obj lib and package to grow fast!
Nick Keighley
2011-10-27 07:18:00 UTC
Permalink
I test extremely, thus I have to recompile a lot and fast for my obj lib and package to grow fast!  
CONTEXT!!!!!

is that a response to my post or just a disjointed stream of
consciousness?