vicist wrote:People who enjoy the challenge of programming but are not that good at m/c must either use a basic compiler to achieve the speed they need or give the routines to someone willing to convert them into pure m/c, which is not always practical or possible.
If the basics are there, why not take such a project as incentive to master the finer aspects of machine code?
And what's wrong with teaming up with other people? A good deal of the nicer releases in the last few years here at Denial were by no means single-person acts.
I won't deny that programming in machine language is more difficult than doing so in BASIC. It also requires a different mindset and the willingness to distill a problem down to its smallest details. Ultimately, the resulting code avoids the implied overhead, which a BASIC compiler simply has no chance to eliminate:
Take the simple assignment statement "X=X+1". The higher-level language necessarily has to assume that the variable X is stored in some memory addresses. The interpreter first needs to scan the variable list for a match; compiled object code may be luckier here, with the address of the variable encoded as a constant. Then the original contents of X are transferred to a dedicated area in the zeropage for arithmetic processing. The literal "1" is converted to the internal number format - CBM BASIC redoes this every time the literal is encountered, whereas compilers can once again pre-process all literals. A routine in the interpreter is called to do the addition - no difference between interpreter and compiler here. Finally, the result is transferred from the zeropage back to main memory. There's no decisive difference between (16-bit) integers and floats: CBM BASIC adds int<->float conversion to the overhead, and compilers might supply extra, faster routines for 16-bit integers, provided the programmer can give the compiler hints that what he meant with X really is supposed to be an integer.
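To make the difference tangible: even under the best conditions - the compiler knows X is a 16-bit integer held at a fixed pair of addresses (the labels XLO/XHI below are hypothetical) - the compiled code for "X=X+1" still has to go through memory, roughly like this:

Code: Select all
INC XLO    ; increment the low byte of X in memory
BNE skip   ; did it wrap to 0, i.e. carry into the high byte?
INC XHI    ; yes: propagate the carry
.skip

Three instructions and at least one read-modify-write memory access - and that's the optimistic case, before variable lookup and number conversion are even considered.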
But all this is dwarfed by what can happen when the above statement is embedded in a larger context. Suppose this "X=X+1" is part of the following routine, which is supposed to clear the screen:
Code: Select all
1 X=0
2 POKE7680+X,32
3 POKE7933+X,32
4 X=X+1
5 IFX<253THEN2
It's difficult to imagine how a BASIC compiler is supposed to infer that it is sufficient to hold X in a CPU register! The whole processing described in the longer paragraph above just collapses into a single machine language instruction: INX! The whole routine then looks like this in assembly:
Code: Select all
LDX #0        ; index starts at 0
LDA #32       ; screen code for a blank
.loop
STA $1E00,X   ; first half of the screen (7680+X)
STA $1EFD,X   ; second half (7933+X)
INX
CPX #$FD      ; 253 bytes per half
BCC loop
...
Anyway, we've gone a long way from doing UDGs with larger RAM expansions to the (assumed) benefits of BASIC compilers.
In the context of user defined graphics, BASIC compilers add a restriction which should not be overlooked. Suppose the programmer is already beyond the stage of having a working/good/efficient (albeit still a bit slow) implementation of the program logic in interpreted BASIC, then goes on to create a nice graphic design, and only then realises that these two goals - getting a compiled (fast) version, and having user defined graphics - are mutually incompatible. He has every right to feel conned.
This doesn't happen with machine language: there, user defined graphics simply add to the capabilities a program is supposed to show.