Help converting numbers onto Char in ML

Basic and Machine Language

jalavera
Vic 20 Newbie
Posts: 14
Joined: Wed Aug 05, 2009 4:49 am

Help converting numbers onto Char in ML

Postby jalavera » Sun Mar 19, 2017 1:58 am

Hi!

I'm programming a little in ML, but I don't know how to convert a byte value in a memory location into a string of digits I can write to the screen.

For example, where BASIC would use PEEK(nnnn), in ML we can use LDA nnnn. But how can I print the accumulator value to the screen? I know the KERNAL $FFD2 routine, but it prints the character for that code, not the number as a string.
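
To illustrate (65 is just an arbitrary value):

Code: Select all

   LDA #65      ; value to display
   JSR $FFD2    ; CHROUT prints the character with code 65, i.e. "A",
                ; not the digits "6" and "5"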

Thanks!

srowe
Vic 20 Afficionado
Posts: 448
Joined: Mon Jun 16, 2014 3:19 pm

Re: Help converting numbers onto Char in ML

Postby srowe » Sun Mar 19, 2017 3:06 am

Depending on what base you want, it's either quite easy or a little complicated.

For printing in hex, each byte simply maps onto a pair of characters to display. So $1234 comes out as "1" ($31), "2" ($32), "3" ($33) and "4" ($34). Nibble values above 9 have to be handled specially, as there's a 'gap' between "9" ($39) and "A" ($41). Using a block of 4 shifts to move the top nibble into the bottom nibble, you can do something like

Code: Select all

PRINTHEX16
   LDA STAL+1      ; get high byte
   JSR PRINTHEX8      ; output as hex digits
   LDA STAL      ; get low byte
PRINTHEX8
   PHA         ; save value
   LSR         ; shift top nibble ..
   LSR
   LSR
   LSR         ; .. into bottom nibble
   JSR LAD82      ; output most significant nibble as hex digit
   PLA         ; restore value
   AND #$0F      ; mask off top nibble
LAD82   ORA #'0'
   CMP #'9'+1
   BCC LAD8A      ; numeric, output it
   ADC #$06      ; make hex digit
LAD8A   JMP CHROUT      ; output character to channel


Printing a number as decimal is more complex because you need to do successive divisions by ten. You can find examples of doing multibyte division in most 6502 assembly books.

If all you want is to print a 16-bit unsigned number, you can call the BASIC PRTFIX ($DDCD) subroutine: load .A with the MSB and .X with the LSB.
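
For example, a minimal sketch (VALUE/VALUE+1 stands for an assumed little-endian location of your own choosing):

Code: Select all

PRTFIX = $DDCD      ; BASIC: print unsigned 16-bit integer from .A/.X

   LDA VALUE+1      ; MSB of the value
   LDX VALUE        ; LSB of the value
   JSR PRTFIX       ; prints the decimal digits to the current output channel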

jalavera
Vic 20 Newbie
Posts: 14
Joined: Wed Aug 05, 2009 4:49 am

Re: Help converting numbers onto Char in ML

Postby jalavera » Sun Mar 19, 2017 10:12 am

Thanks. I will try!
Where can I find this BASIC PRTFIX documented in detail? I had never heard of it before. I suppose there are others that can be used.

srowe
Vic 20 Afficionado
Posts: 448
Joined: Mon Jun 16, 2014 3:19 pm

Re: Help converting numbers onto Char in ML

Postby srowe » Sun Mar 19, 2017 10:56 am

jalavera wrote:Thanks. I will try!
Where can I find this BASIC PRTFIX documented in detail? I had never heard of it before. I suppose there are others that can be used.


There are many BASIC and KERNAL routines which can be used in your own programs. I have a disassembly of the VIC-20 ROMs here

http://eden.mose.org.uk/gitweb/?p=rom-reverse.git;a=blob;f=src/basic_kernal.asm;hb=HEAD

which is commented. PRTFIX is convenient but not efficient: it converts the integer into a floating-point number and then uses the floating-point print routines.

Mike
Herr VC
Posts: 2925
Joined: Wed Dec 01, 2004 1:57 pm
Location: Munich, Germany
Occupation: electrical engineer

Re: Help converting numbers onto Char in ML

Postby Mike » Sun Mar 19, 2017 3:04 pm

srowe wrote:Printing a number as decimal is more complex because you need to do successive divisions by ten.

It is also possible to use the decimal mode of the 6502 to build up a number in BCD code from its binary representation and then use your example routine to print HEX numbers: http://6502.org/source/integers/hex2dec-more.htm.

Those routines need to be used with caution on the VIC-20, though: the standard IRQ handler in the KERNAL doesn't issue a CLD as its first instruction, and this can lead to problems with interrupt routines that don't take decimal mode into account. Unless the IRQ handler is adapted, these conversion routines must be run with interrupts disabled.
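
As an illustration, here is a minimal sketch of that idea for a single byte (BINARY, BCD and BCD+1 are assumed RAM locations; the labels are illustrative, not taken from the linked page):

Code: Select all

BINTOBCD            ; convert the byte in BINARY to three BCD digits in BCD/BCD+1
   SEI              ; per the caveat above: keep IRQs out while decimal mode is on
   SED
   LDA #0
   STA BCD          ; clear the result
   STA BCD+1
   LDX #8           ; 8 bits to process
BITLOOP
   ASL BINARY       ; shift the next binary bit into carry (BINARY is destroyed)
   LDA BCD          ; result = result * 2 + carry, computed in decimal mode
   ADC BCD
   STA BCD
   LDA BCD+1
   ADC BCD+1
   STA BCD+1
   DEX
   BNE BITLOOP
   CLD
   CLI
   RTS              ; e.g. input 255 leaves BCD=$55, BCD+1=$02

The two BCD bytes can then be fed through the hex printing routine above to get the decimal digits on screen.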

Also, the number conversion can implement the division as table-driven repeated multibyte subtractions. That works amazingly well even in the presence of mixed bases, as is the case with a clock display driven from the jiffy timer: "Display of TI$ in border (unex. or +3K)".
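
A minimal sketch of the table-driven idea for a 16-bit value (VAL/VAL+1 and DIGIT are assumed RAM locations, the labels are illustrative, and leading zeros are printed as-is):

Code: Select all

DECTAB_LO .BYTE $10, $E8, $64, $0A, $01   ; low bytes of 10000, 1000, 100, 10, 1
DECTAB_HI .BYTE $27, $03, $00, $00, $00   ; high bytes

PRINTDEC16
   LDX #0           ; table index, starting at the 10000s
NXTDIG
   LDA #'0'         ; each digit starts as "0"
   STA DIGIT
SUBLP
   SEC
   LDA VAL          ; try VAL = VAL - current power of ten
   SBC DECTAB_LO,X
   TAY              ; keep the low byte until we know it didn't underflow
   LDA VAL+1
   SBC DECTAB_HI,X
   BCC OUTDIG       ; underflow: this digit is finished
   STA VAL+1        ; commit the reduced value
   STY VAL
   INC DIGIT        ; count one more successful subtraction
   BNE SUBLP        ; always taken
OUTDIG
   LDA DIGIT
   JSR CHROUT       ; $FFD2: print the digit
   INX
   CPX #5           ; five digits, 10000 down to 1
   BNE NXTDIG
   RTS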


... P.S. see this thread, "best way to display decimal number?", for a nice déjà vu. :mrgreen:

wimoos
Vic 20 Devotee
Posts: 236
Joined: Tue Apr 14, 2009 8:15 am
Website: http://wimbasic.webs.com
Location: Netherlands
Occupation: farmer

Re: Help converting numbers onto Char in ML

Postby wimoos » Mon Mar 20, 2017 5:51 am

Printing signed 16-bit integers (where the high bit of the msb denotes the sign):

Code: Select all

   LDA msb         ; high byte of the signed value
   LDY lsb         ; low byte
   JSR $D391       ; GIVAYF: convert the signed integer in .A/.Y to floating point (FAC1)
   INY
   JMP $DDD7       ; convert FAC1 to a string and print it


Regards,

Wim.
Last edited by wimoos on Wed Mar 22, 2017 6:17 am, edited 1 time in total.
PAL, two-prong VIC20 on 65C02 with 3k RAM expansion internal, 32k NOVRAM expansion external, DS1307 RTC and S-Video mod; 64HDD in FreeDOS on a thin-client; selfwritten 65asmgen; tasm; maintainer of WimBasic

Kweepa
Vic 20 Scientist
Posts: 1082
Joined: Fri Jan 04, 2008 5:11 pm
Location: Austin, Texas
Occupation: Game maker

Re: Help converting numbers onto Char in ML

Postby Kweepa » Mon Mar 20, 2017 8:48 am

Codebase64 has a great collection of efficient 6502 snippets.
http://www.codebase64.org
For example (to answer your question):
http://www.codebase64.org/doku.php?id=b ... ii_routine

jalavera
Vic 20 Newbie
Posts: 14
Joined: Wed Aug 05, 2009 4:49 am

Re: Help converting numbers onto Char in ML

Postby jalavera » Tue Mar 21, 2017 6:29 am

After testing the routines you suggested, I think Kweepa's is what I was looking for: it converts 1 byte (0-255) to decimal using the .A (ones), .X (tens) and .Y (hundreds) registers. Printed in the order Y, X, A it appears on screen just like PRINT PEEK(1234).
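
For reference, a minimal sketch of a routine with that register interface (repeated subtraction; this is only an illustration, not the Codebase64 code):

Code: Select all

BYTE2DEC            ; input: .A = value 0-255
   LDY #0           ; hundreds
   LDX #0           ; tens
H100
   CMP #100
   BCC T10
   SBC #100         ; carry is known to be set here
   INY
   BNE H100         ; always taken
T10
   CMP #10
   BCC DONE
   SBC #10
   INX
   BNE T10          ; always taken
DONE
   RTS              ; .Y = hundreds, .X = tens, .A = ones

Add #$30 to each digit and send it to CHROUT (Y first, then X, then A) to put the number on the screen.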
Many thanks to all of you.

