SCOUG-Programming Mailing List Archives

Date: Sun, 16 Feb 2003 13:37:08 PST
From: "Gregory W. Smith" <gsmith@well.com >
Reply-To: scoug-programming@scoug.com
To: scoug-programming@scoug.com
Subject: SCOUG-Programming: In retrospect

Content-Type: text/plain

> If you ever read some of the early glowing reports about open
> source, one thing touted was the availability of thousands of
> beta testers. I always enjoy it when a liability becomes a
> benefit...or when someone doesn't fully comprehend the
> implications of his statements.
>
> Open source relies on individual contributors who must
> somehow have confidence in their source contributions. When
> you base that confidence on the random testing of others,
> hoping somehow that you can achieve with a shotgun what
> you cannot achieve with surgical precision, you're in trouble.
> Part of that trouble lies in someone else like you making a
> change to source not completely verified. Thus when an error
> occurs did it occur on your version or on a more recent one?
>
> I've had open source advocates brag about the daily releases
> of Linux and Mozilla as if each release marked a step forward.
> If you have a thousand people beta testing a release and you
> have a new release daily, something ought to tell you that
> the one time delay far exceeds the other, that you have a
> serious problem in error detection, correction, and
> synchronization. True changes will not occur at that rate,
> thus you are spinning your wheels engaged in error correction
> due to the inefficiencies of your error detection technique, i.e.
> the multitude of beta testers and the unpredictable delay in
> reporting test results.

Ow! Ow! Ow! My brain hurts! Maybe being an engineer and not a
programmer has something to do with it. Or maybe having worked in
software QA for a firm that specializes in chemical process design
and control software has something to do with it. The "problem"
with lots of beta testers has nothing to do with the different
timing of builds and reports. As far as I know, the open source
projects log and track their bugs--just as we dumb engineers used
to do. (We had to; the former IBMers in the company made us.
That is why the company hired real CS people to keep us engineers
in line.)

Anyway, we had regular builds to whack on. And, it is true, when
we found a problem, it might have been introduced in one of the
prior builds. Going back to find where the problem was introduced
was not the point. The point from then on was to log the bug,
track the bug, and verify the programmer's assertion that he had
fixed it ... or to confront management about downgrading the bug
from immediate or serious to something that would not stop release.
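
A minimal sketch, not part of the original message, of what "verify
the programmer's assertion" can look like in practice: a regression
test named after the logged bug that stays in the regular build's
test suite. Python, the bug number, and the routine under test are
all assumptions for illustration.

# Minimal sketch (assumed Python; bug number and routine are
# hypothetical).  Once a bug is logged, a test that reproduces it
# goes into the build's test suite; the fix counts as verified only
# when this test passes, and it keeps running in every later build
# so the bug cannot quietly return.
import unittest

def saturation_temp_c(pressure_atm):
    # Stand-in for the routine under test.
    return 100.0 if abs(pressure_atm - 1.0) < 1e-9 else 120.2

class TestBug1234(unittest.TestCase):
    """Regression test for logged bug #1234: wrong boiling point
    reported at two atmospheres."""

    def test_boiling_point_at_two_atmospheres(self):
        self.assertAlmostEqual(saturation_temp_c(2.0), 120.2, places=1)

if __name__ == "__main__":
    unittest.main()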

As for comprehensive testing of data, it is a bit difficult to
enumerate and have the compiler check ranges when your variable
is a floating point number between -40 Celcius and 250 Celcius.
Checking in range is one thing, but having a comprehensive test
of the whole range is another. But being an engineer, I know
that when that variable applies to water, I should pay special
attention to the range between -1 and 1. And I should ckeck
around the tripple point. And when the opeating pressure is
atmospheric, we need to watch between 99 and 101. And when
the pressure is two atmospheres, I should check for funky
behavior around 120, In fact, I can go the steam tables and
generate a range of test data--and we had a lot of it. But
a large set of regression test data does not make a
comprehensive enumeration.
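
A minimal sketch, not part of the original message, of the kind of
test-point selection described above: instead of trying to enumerate
the whole -40 to 250 Celsius range, cluster test temperatures around
the physically interesting points for water. Python is assumed, the
transition temperatures are the usual handbook values, and the
routine under test is hypothetical.

# Minimal sketch (assumed Python).
CRITICAL_TEMPS_C = [
    0.0,     # freezing point of water
    0.01,    # triple point
    100.0,   # boiling point at one atmosphere
    120.2,   # boiling point at roughly two atmospheres
]
OFFSETS = [0.5, 0.1, 0.01]   # probe just below and above each point

def boundary_test_points(critical_temps, offsets):
    """Return sorted test temperatures clustered around each
    physically interesting temperature."""
    points = set()
    for t in critical_temps:
        points.add(t)
        for eps in offsets:
            points.add(t - eps)
            points.add(t + eps)
    return sorted(points)

for temp_c in boundary_test_points(CRITICAL_TEMPS_C, OFFSETS):
    # property_of_water(temp_c) would be the routine under test;
    # the expected values would come from the steam tables.
    print(temp_c)

None of this makes the test comprehensive, which is the point above;
it only concentrates the effort where the physics says the behavior
changes.
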
--
Gregory W. Smith (WD9GAY) gsmith@well.com

=====================================================

To unsubscribe from this list, send an email message
to "steward@scoug.com". In the body of the message,
put the command "unsubscribe scoug-programming".

For problems, contact the list owner at
"rollin@scoug.com".

=====================================================


The Southern California OS/2 User Group
P.O. Box 26904
Santa Ana, CA 92799-6904, USA

Copyright 2001 the Southern California OS/2 User Group. ALL RIGHTS RESERVED.

SCOUG, Warp Expo West, and Warpfest are trademarks of the Southern California OS/2 User Group. OS/2, Workplace Shell, and IBM are registered trademarks of International Business Machines Corporation. All other trademarks remain the property of their respective owners.