How does RAL work for calculating multiplication in 8085 assembly language?

AI Thread Summary
RAL (Rotate Accumulator Left) in 8085 assembly language rotates the accumulator left through the carry flag, which effectively multiplies the accumulator's value by 2 for each rotation while the carry is clear. The RST 1 instruction is a one-byte call to a fixed memory location, part of an interrupt vector table used for handling interrupts. This table allows the CPU to branch to different routines based on the interrupt received. The discussion also touches on the relationship between the 8085 and the 8088, noting that while they share compatible bus timing and supporting chip sets, they have different pinouts and are distinct architectures. Understanding these concepts is helpful for programming and interfacing with older CPUs like the 8085.
swty todd
Hey, I read this solution for calculating the multiplication of (09)H and (04)H in Assembly Language Programming for the 8085.
I am not sure how the RAL works in this case. Help me out please.

MVI A, 09
RAL
RAL
RST 1
 
Hey, actually someone helped me out with this problem. But this thread is still here as I am unable to figure out how to delete threads.
 
For those with inquiring minds ...

RAL rotates A (the accumulator) left one bit through the carry flag. With the carry clear, it's the same as multiplying by 2.

RST 1 is a 1-byte call to location 8 in memory; it's not part of the multiplication algorithm itself.
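To see the arithmetic, here is a minimal Python sketch (a hypothetical simulation, not real 8085 code) of rotate-left-through-carry applied to 09H twice:

```python
def ral(a: int, carry: int) -> tuple[int, int]:
    """Simulate the 8085 RAL instruction: rotate the 8-bit
    accumulator left through the carry flag.
    Bit 7 moves into carry; the old carry moves into bit 0."""
    new_carry = (a >> 7) & 1
    a = ((a << 1) | carry) & 0xFF
    return a, new_carry

# MVI A, 09H followed by two RALs (carry assumed clear beforehand)
a, cy = 0x09, 0
a, cy = ral(a, cy)  # A = 0x12 (9 * 2 = 18)
a, cy = ral(a, cy)  # A = 0x24 (9 * 4 = 36)
print(hex(a))       # -> 0x24
```

Because RAL shifts the old carry into bit 0, the "multiply by 2" reading only holds while the carry flag stays clear.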
 
we were taught that RST 1 stands for Return to command mode. What's up with this location 8 thing?
 
From address 0x0000 to 0x003F the 8085 reserves a block of vector entries.
The software RST entries are spaced 8 bytes apart, with the hardware vectors interleaved between them. Some of these entries are triggered by electrical signals, some by the RST instruction. Such a table is usually called an "interrupt vector table". Each interrupt, software or hardware, is tied to one entry in this table; if such an interrupt occurs, execution is interrupted and continues at that entry. Note that on the 8085 the vector locations hold code rather than stored addresses, so RST 1 pushes the return address and continues execution directly at address 0x0008, where typically a JMP chains to the real handler.
This is the way software interrupts were done in the old days, and the approach has not changed fundamentally since.
It would help to get hold of an old programming guide for the 8085. Because of the simple architecture of these old CPUs, they are a good starting point to learn how such machines work.

Ah, I forgot.
At location 8 I would expect a jump to your command interpreter.

And:

The instruction has the following bit coding: 1 1 A A A 1 1 1, where the "A A A" field is filled with the value given after the RST mnemonic. Did you know that the commands are called mnemonics? Strange.
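That bit pattern is easy to check in a few lines of Python (an illustrative sketch; nothing beyond the 1 1 A A A 1 1 1 encoding and the "vector = n * 8" rule described above is assumed):

```python
def rst_opcode(n: int) -> int:
    """Encode RST n using the bit pattern 1 1 A A A 1 1 1,
    where AAA is the 3-bit restart number."""
    assert 0 <= n <= 7
    return 0b11000111 | (n << 3)

def rst_vector(n: int) -> int:
    """Address that RST n branches to: n * 8."""
    return n * 8

for n in range(8):
    print(f"RST {n}: opcode {rst_opcode(n):#04x}, vector {rst_vector(n):#06x}")
# RST 1 comes out as opcode 0xcf, vector 0x0008
```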
 
Yes, I knew they were mnemonics. The first time I heard the word it sounded weird.
And thanks, I get it now!
 
swty todd said:
we were taught that RST 1 stands for Return to command mode. What's up with this location 8 thing?
For CP/M, programs exited via a jump to location 0, and system (BDOS) calls were made through location 5. The 8259 interrupt controller uses call instructions, so I'm not sure which devices use the RST instructions for interrupts. The 8085 has four dedicated interrupt lines, vectored to RST 4.5, 5.5, 6.5, and 7.5; the RST 4.5 input is the non-maskable TRAP.
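The half-numbered restarts follow the same "restart number times 8" rule as the software RST instructions; a quick sketch (these vector addresses are the standard 8085 values):

```python
# The 8085's dedicated hardware interrupts use "half" restart
# numbers, but the vector address is still restart number * 8.
vectors = {name: int(n * 8) for name, n in
           [("TRAP (RST 4.5)", 4.5), ("RST 5.5", 5.5),
            ("RST 6.5", 6.5), ("RST 7.5", 7.5)]}
for name, addr in vectors.items():
    print(f"{name}: {addr:#06x}")
# TRAP lands at 0x0024, RST 5.5 at 0x002c, 6.5 at 0x0034, 7.5 at 0x003c
```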

The 8085's bus timing was compatible with the 8088's, and the two could share the same chip sets, so by replacing the chip (I think this was possible with an adapter), reprogramming the BIOS, and writing the "BIOS" layer for the hardware-to-OS interface, an 8085 system could be converted to run MSDOS or CP/M-86. Intel provided a source-converter tool, and I did a conversion once for CP/M -> CP/M-86.

The instruction set was the same as 8080, with these added, shown in "macro" form:

Code:
ARHL    MACRO           ;ARITH RIGHT SHIFT HL
        DB      10H
        ENDM
DSUB    MACRO           ;HL=HL-BC
        DB      08H
        ENDM
JNXC    MACRO   X       ;JMP IF NOT X CARRY (INX, DCX)
        DB      0DDH
        DW      X
        ENDM
JXC     MACRO   X       ;JMP IF X CARRY
        DB      0FDH
        DW      X
        ENDM
LDHI    MACRO   I       ;DE=HL+I
        DB      28H
        DB      I
        ENDM
LDSI    MACRO   I       ;DE=SP+I
        DB      38H
        DB      I
        ENDM
LHLX    MACRO           ;HL=(DE)
        DB      0EDH
        ENDM
RDEL    MACRO           ;ROTATE DE,CY LEFT
        DB      18H
        ENDM
RIM     MACRO           ;READ INT MASK
        DB      20H
        ENDM
RSTV    MACRO           ;RST TO 40H IF V SET
        DB      0C8H
        ENDM
SHLX    MACRO           ;(DE)=HL
        DB      0D9H
        ENDM
SIM     MACRO           ;SET INT MASK
        DB      30H
        ENDM
 
Jeff Reid said:
The 8085 is pin compatible with 8088,
No. I have both pinouts here, and they are definitely not the same.

The 8259 has two different modes: one where it sends a call instruction with an address, which is for 8080/8085 CPUs, and one where it sends only an index into the interrupt vector table, which is used for 8086 CPUs.

The RST instruction may be used to make different calls to different subsystems. It is also easy to replace a single byte of code with this one-byte instruction to interrupt execution and branch to a debugger; this is how you set breakpoints in an application.
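A sketch of that breakpoint trick in Python, using a hypothetical flat-memory model (the opcodes 0x17 for RAL and 0xCF for RST 1 are real 8085 encodings):

```python
RST_1 = 0xCF  # one-byte software-interrupt opcode on the 8085

def set_breakpoint(memory: bytearray, addr: int) -> int:
    """Overwrite the byte at addr with a one-byte trap instruction;
    return the saved original byte so it can be restored later."""
    saved = memory[addr]
    memory[addr] = RST_1
    return saved

def clear_breakpoint(memory: bytearray, addr: int, saved: int) -> None:
    """Restore the original opcode after the breakpoint fires."""
    memory[addr] = saved

mem = bytearray([0x3E, 0x09, 0x17, 0x17])  # MVI A,09H / RAL / RAL
old = set_breakpoint(mem, 2)               # trap on the first RAL
assert mem[2] == RST_1                     # execution would now vector to 0x0008
clear_breakpoint(mem, 2, old)
assert mem[2] == 0x17                      # original RAL restored
```

A one-byte trap matters because a multi-byte instruction could overwrite part of the following instruction, corrupting the code if execution jumped past the patch.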

As far as I know, Atari did so in their TOS. They even used the co-processor interface for that, which gave them the possibility to run an instruction in software or in hardware depending on whether the co-processor is present.

Jeff Reid said:
an 8085 system could be converted to run MSDOS or CP/M-86
If you change the processor and the BIOS and write your own hardware access layer, I would not consider it an 8085 system anymore; it is an 8086 system. It would be easier to buy an old PC to run MSDOS applications.
 
Jeff Reid said:
8085 ... 8088 pinout

simbad said:
No. I have both pinouts here. not the same.
I checked on this and you're right; I corrected my previous post. Based on a project I worked on, the timing was the same or close enough that the existing Intel peripheral chips for the 8080/8085 would also work with the 8088. I worked for Pertec Computer

http://en.wikipedia.org/wiki/Pertec_Computer

from about 1980 to 1987, and they made a CP/M system based on the 8085 called the PCC 2000. The hardware guys made an adapter board with an 8088 that plugged into the 8085 CPU socket on the PCC 2000. Intel had a program to convert assembly code from 8080 to 8088, doing about 90% of the job, with about 10% left over for cleanup. I did all the software work on this project. Although it never got released for sale, we ended up with an 8088-based PCC 2000 running CP/M-86 with 256K of memory on it.

One reason for the 8088 having the LAHF instruction was to help with the conversion of 8080 code to 8088.

The 8259 has two different modes: one where it sends a call instruction with an address, which is for 8080/8085 CPUs, and one where it sends only an index into the interrupt vector table, which is used for 8086 CPUs.
Note that in 8086 mode, the index is sent in response to a second INTA (interrupt acknowledge) from the CPU. If I remember correctly (which I'm starting to doubt now), the early 8259 in 8080/8085 mode output a CALL instruction: the first INTA from the CPU caused the 8259 to put a hex CD (the CALL opcode) on the bus, and the subsequent INTA cycles caused it to output the target address bytes. I don't remember if the "CD" was a programmable value or hard-coded into the 8259. It's gone now, or at least undocumented, but the index from the 8259 still isn't sent until a second INTA is received, so that much of the legacy handshake remains.

The RST instruction may be used to make different calls to different subsystems. It is also easy to replace a single byte of code with this one-byte instruction to interrupt execution and branch to a debugger; this is how you set breakpoints in an application.
The 8088 and later CPUs use the single-byte INT3 instruction for the same purpose.

As far as I know Atari did so in their TOS.
Atari? The Atari 400/800/65XE/130XE systems ran on a 1.79 MHz MOS Technology 6502 (nearly twice as fast as an Apple II). The Atari ST and early Mega ST systems ran on an 8 MHz Motorola 68000.

If you change the processor and the BIOS and write your own hardware access, I would not think that it is a 8085 system anymore. It is an 8086 system. It would be easier to buy an old PC to run MSDOS applications.
As mentioned above, it was done with an adapter board, allowing the rest of the hardware in that system to stay the same (except that the BIOS chip was reprogrammed with 8088 code).
 
  • #10
Hey thanks guys.
 
  • #11
The 8085 is pin compatible with 8088
After more research on this, it's just easier to make an adapter for the 8085. The layouts are different, but both the 8085 and 8088 use a single 5-volt source and multiplex the data bus and the lower 8 bits of the address on the same pins (unlike the 8080), and the timing is the same or close enough to work with the supporting chip sets. I recall that the PCC 2000 already had some hardware to bank in more than 64K of memory, and the adapter board routed A16-A19 to a second mini-adapter board that plugged into a chip socket (or sockets?) to replace the banking logic that was used with the 8085.
 