SCOUG-Programming Mailing List Archives
Return to [ 01 | January | 2005 ]
Greg,
"Can you say "red herring?" So if you can't 'natively express'
something, it can't be done at all?
Balderdash. Express what you want in terms of something
else that you can 'natively express.'..."
If C cannot express bit strings, then it makes no difference
what an intermediate language can produce: if you can't
provide the cause, the source, you can't get the effect, the
code. I leave it to you to examine what goes wrong, and what
differences arise, when you implement fixed-point decimal
numbers on binary integer registers. Again, it does no good to
have an intermediate language that supports variable-precision
decimal or binary data if the source language can't express it.
But you can use a library. How interesting. What language is
it written in? Obviously, if the source language can't express
the operation, you can't use it to write the library routine
either. Now you have at least two languages, the application
source and the library source, as input to the intermediate
language.
For this reason "special" subroutines that compensate for
missing "native" language support occur as an adjunct to
compilation: they play no part within it and thus remain
unknown to the compiler and, more importantly, to the
compiler writer and the intermediate code generator.
It's not a "red herring" that PL/I has "builtin" functions
corresponding to the libraries of other languages. The builtin
functions are part of the language, thus known to the compiler
and the compiler writer, and thus capable of producing the
best possible intermediate code.
Now if you have paid attention to the horde of comments I
have made on this subject, you will remember that I propose a
language which uses only one source library. The language
source itself is in that library. The tool source is in that
library, written in the same language as the language source.
All the application source is in that same library, written in the
same language as the language and the tool. Source. Source.
Source. One language, capable of expressing anything in any
other language with equal facility and effort.
One language. One tool. One source library. No possibility of
unresolvable incompatibility between libraries or library
routines. The point is not to create problems but to resolve
them. We have too many programming languages. We have
too many tools. We have too many incompatible libraries.
Depending upon your threshold, we have too many to master.
Your point about the floating-point precision of the S/360
doesn't pertain to PL/I alone, but to every programming
language written for that hardware. PL/I, however, does allow
variable precision, specified by the programmer rather than by
the implementation, something not present elsewhere. Granted,
it requires extra software instructions and is thus less efficient
than certain fixed precisions, but it adheres to the PL/I design
principle of letting the programmer determine the
implementation.
Look, before it was replaced by the current "optimizing"
compiler, the F-level PL/I compiler changed its internal
implementation drastically: it shifted from interpretive runtime
library routines operating on "dope vectors" to compile-time
optimization. Having very intimate familiarity with the
evolution of the F-level compiler, I have to say your
assertions are "dated".
Apparently no one has extrapolated the effect of significant
increases in programmer productivity. It means needing fewer
programmers to produce the same or greater output. Increase
it far enough, and you will not only have to rely on "good"
programmers but, within that group, on the "better" ones,
until only the "best" remain.
So if you take the best, and demand does not exceed their
supply, then your tool, your Developer Assistant, can make
even the best better, i.e. even more productive. You can, for
example, further "allow people to do what software cannot
and software what people need not". If the software can
detect the habits, the programming patterns, of a "best"
developer, then it can "anticipate" a set of likely results
dynamically. Presenting them "on the side" gives the
developer a simple method of selection and thus provides
"live" reuse.
Thus, if everything is in source in one source library, the tool
can detect the "patterns" across all applications: patterns in
their structure, patterns in their detail. It can then, under the
direction of a "best" developer, modify and re-apply patterns
globally. Doesn't it seem strange that we have extremely
sophisticated pattern-recognition routines and yet we continue
to "detect" patterns in software manually?
Intentionally, during our second pass through several
programming languages, including Python, we can consider
what makes a program good, along with what it takes to make
it better on the way toward best. I'm sure that initially it will
come down to a matter of taste and style, some of which we
have expressed in this thread. That's what separates the
"good" programmer from the "better" and eventually leads to
the "best": not simply pronouncements of personal views, but
their effectiveness in actual application, along with the cost in
effort and time. I'm willing to let the marketplace decide.
=====================================================
To unsubscribe from this list, send an email message
to "steward@scoug.com". In the body of the message,
put the command "unsubscribe scoug-programming".
For problems, contact the list owner at
"rollin@scoug.com".
=====================================================
The Southern California OS/2 User Group
P.O. Box 26904
Santa Ana, CA 92799-6904, USA
Copyright 2001 the Southern California OS/2 User Group. ALL RIGHTS
RESERVED.
SCOUG, Warp Expo West, and Warpfest are trademarks of the Southern California OS/2 User Group.
OS/2, Workplace Shell, and IBM are registered trademarks of International
Business Machines Corporation.
All other trademarks remain the property of their respective owners.