So, how do I design a microcontroller using Verilog?

In summary, the conversation is about building a microcontroller on an FPGA using Verilog. The original poster wants to describe the inputs and outputs of an 8051-style microcontroller in Verilog, has written some Verilog code for a counter, an ALU, and a decoder, and is now trying to draw up specifications for the design. They ask for help choosing the ALU width, data memory, program memory, and instruction decoder based on the data width and the number of instructions. The conversation ends with a suggestion to look at the processors available on opencores.org and a challenge to diagram the internal execution subcycles of four 8051 instructions.
  • #1
vead
Hello experts,
I want to write the inputs and outputs for a microcontroller (8051) in the Verilog language.

I have tried to write some code:

Code:
Module mcu (clk, rst, en, p_in, p_out, t_in, t_out, i_in,i_out, rx_in, tx_out ,a0,a1,a2,d0,d1,d2,d3,d4,d5,d6,d7,a,b,s0,s1,s2,f,...etc);
Rst= reset input
Clk=  clock input
En= enable input

Port
P_in=port input
P_out= port output

Timer
T_in = timer input
T_out = timer output

Interrupt
i_in = input for interrupt
i_out = output for interrupt

ALU
A = input
B = input
S0 = input
S1 = input
S2 = input
F = output

Decoder
opcode_in
opcode_out

ram
wr
wd

ram
read
write
enable


I have read about the pin diagram, but I don't understand which pins are used for the ALU, the decoder, the ROM, and the RAM.
 
  • #2
vead said:
Hello experts,
I want to write the inputs and outputs for a microcontroller (8051) in the Verilog language.

I have tried to write some code:

Code:
Module mcu (clk, rst, en, p_in, p_out, t_in, t_out, i_in,i_out, rx_in, tx_out ,a0,a1,a2,d0,d1,d2,d3,d4,d5,d6,d7,a,b,s0,s1,s2,f,...etc);
Rst= reset input
Clk=  clock input
En= enable input

Port
P_in=port input
P_out= port output

Timer
T_in = timer input
T_out = timer output

Interrupt
i_in = input for interrupt
i_out = output for interrupt

ALU
A = input
B = input
S0 = input
S1 = input
S2 = input
F = output

Decoder
opcode_in
opcode_out

ram
wr
wd

ram
read
write
enable
I have read about the pin diagram, but I don't understand which pins are used for the ALU, the decoder, the ROM, and the RAM.
We don't get many questions like this, so we're not likely to be much help to you. You might do a web search for tutorials on how to write Verilog code, such as this one: http://www.asic-world.com/verilog/verilog_one_day.html
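As a point of reference, a syntactically legal Verilog-2001 module declaration for a block like this might look roughly like the sketch below. The port names here are placeholders rather than the real 8051 pinout, and the internal units (ALU, decoder, ROM, RAM) would be instantiated as sub-modules rather than appearing as external pins.

Code:
// Hypothetical skeleton only: legal Verilog-2001 syntax for a top-level module,
// with port directions and widths declared. Names are placeholders.
module mcu (
    input  wire       clk,
    input  wire       rst,
    input  wire       en,
    input  wire [7:0] p_in,    // parallel port in
    output wire [7:0] p_out,   // parallel port out
    input  wire       rx_in,   // serial receive
    output wire       tx_out   // serial transmit
);
    // Functional blocks (ALU, decoder, RAM, ROM, timers, ...) would be
    // instantiated here as sub-modules; they do not appear as external pins.
endmodule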
 
  • #3
I don't understand exactly what you want to do with the Verilog. You can create a Verilog module for the MCU, but then, within that, you need to create Verilog modules for all the functional elements within the MCU that you want to affect the MCU outputs. If you took it to an extreme, within the MCU model you would need to decode instructions from a model of the ROM and execute them such that they affected the I/O with the proper timing.

On another level, you could build a limited behavioral model; for example, you could simply sequence the I/O in a defined way to stimulate an external device connected to the MCU model.

I'm not sure how familiar you are with Verilog (the code you supplied isn't legal Verilog). Have you ever designed a simple Verilog system? For example, an RTL up/down resettable counter module instantiated within a behavioral testbench (a rough sketch is included after this post).

Let me know what you are really trying to do, and I'll try to help.
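For reference, such a counter-plus-testbench pair might look roughly like this minimal sketch (module name, widths, and timing are arbitrary assumptions, not anything 8051-specific):

Code:
// Minimal RTL up/down counter with synchronous, active-high reset.
module updown_counter #(parameter WIDTH = 4) (
    input  wire             clk,
    input  wire             rst,
    input  wire             up,      // 1 = count up, 0 = count down
    output reg  [WIDTH-1:0] count
);
    always @(posedge clk) begin
        if (rst)
            count <= {WIDTH{1'b0}};
        else if (up)
            count <= count + 1'b1;
        else
            count <= count - 1'b1;
    end
endmodule

// Behavioral testbench that instantiates the counter and drives it.
module tb_updown_counter;
    reg clk = 0, rst = 1, up = 1;
    wire [3:0] count;

    updown_counter #(.WIDTH(4)) dut (.clk(clk), .rst(rst), .up(up), .count(count));

    always #5 clk = ~clk;          // free-running simulation clock

    initial begin
        #12  rst = 0;              // release reset
        #100 up  = 0;              // count down for a while
        #100 $finish;
    end
endmodule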
 
  • #4
meBigGuy said:
I don't understand exactly what you want to do with the Verilog. You can create a Verilog module for the MCU, but then, within that, you need to create Verilog modules for all the functional elements within the MCU that you want to affect the MCU outputs. If you took it to an extreme, within the MCU model you would need to decode instructions from a model of the ROM and execute them such that they affected the I/O with the proper timing.

On another level, you could build a limited behavioral model; for example, you could simply sequence the I/O in a defined way to stimulate an external device connected to the MCU model.

I'm not sure how familiar you are with Verilog (the code you supplied isn't legal Verilog). Have you ever designed a simple Verilog system? For example, an RTL up/down resettable counter module instantiated within a behavioral testbench.

Let me know what you are really trying to do, and I'll try to help.
Actually, I want to make a microcontroller on an FPGA using Verilog code.
I have written some Verilog code for a counter, an ALU, and a decoder.
The code that I posted above is just sample code.

Now I am trying to make some specifications,
so I did some homework.
It would be very good for me if you could check my work.

Q1. If the data is 4 bits, which ALU should I use?
A. 2-bit ALU
B. 4-bit ALU
C. 6-bit ALU
D. 8-bit ALU
My answer: (B) 4-bit ALU
Reason: the data is 4 bits, so I use a 4-bit ALU. A 4-bit ALU deals with 4-bit numbers, like 0001 + 1010.

Q2. If the data is 8 bits, which data memory should I use?
A. 8 x 256 data memory
B. 4 x 16 data memory
My answer: (A) 8 x 256 data memory
Reason: if the data is 8 bits, the ALU is 8 bits, so the data memory should be 8 x 256.

Q3. If the instruction is 4 bits, which program memory should I use?
A. 4 x 16 bit program memory
B. 8 x 256 bit program memory
C. 6 x 64 bit program memory
My answer: (A) 4 x 16 bit program memory
Reason: instructions are 4 bits, so the program memory should be 4 x 16 bits.

Q4. If we have a 4-to-16 instruction decoder, how many instructions can it decode?
A. 8 instructions
B. 16 instructions
C. 256 instructions
My answer: (B) 16 instructions
Reason: a 4-bit control word can decode 16 instructions.

Q5. Why is the program counter 4 bits?
A. A 4-bit program counter can address 16 instructions
B. An 8-bit program counter can address 256 instructions
C. A 16-bit program counter can address around 65,000 instructions
My answer: (A)
Because a 4-bit program counter can address 16 instructions (see the sketch after this post).

This is not the real specification for my design, just the basic idea. If my idea is correct, then I will make the real specification and then write the Verilog code.
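For Q3 and Q5, here is a purely illustrative sketch of a 4-bit program counter feeding a 16-word program memory in Verilog; the widths simply mirror the numbers in the questions and are assumptions, not part of any real 8051:

Code:
// Illustrative only: a 4-bit program counter addresses 2^4 = 16 locations,
// so it pairs with a 16-word program memory (here 4 bits wide, i.e. 4 x 16).
module pc_and_prog_mem (
    input  wire       clk,
    input  wire       rst,
    output wire [3:0] instr
);
    reg [3:0] pc;                 // 4-bit program counter -> 16 addresses
    reg [3:0] prog_mem [0:15];    // 16 words x 4 bits of program memory

    integer i;
    initial begin
        for (i = 0; i < 16; i = i + 1)
            prog_mem[i] = i;      // placeholder "program": each word holds its own address
    end

    always @(posedge clk) begin
        if (rst)
            pc <= 4'd0;
        else
            pc <= pc + 4'd1;      // simple sequential fetch, no jumps or branches
    end

    assign instr = prog_mem[pc];
endmodule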
 
  • #5
Why build your own? If it is just for grins, I'd pick something more useful. I'd do something with the 8051 (actually, I wouldn't use an 8051 though)

Check out the many processors available at:
http://opencores.org/projects

And for the 8051:
http://opencores.org/project,8051

If you insist on doing your own 8051, then for homework, diagram out the internal execution subcycles of 4 different 8051 instructions (on a stock 8051), including a JSR and a conditional branch. You can pick any instructions.
What is the longest (most internal clock cycles) 8051 instruction?

The ALU, program counter, memory, etc are trivial. Hooking it all together and controlling it is the hard part.
 
  • #6
meBigGuy said:
Why build your own? If it is just for grins, I'd pick something more useful. I'd do something with the 8051 (actually, I wouldn't use an 8051 though)

Check out the many processors available at:
http://opencores.org/projects

And for the 8051:
http://opencores.org/project,8051

If you insist on doing your own 8051, then for homework, diagram out the internal execution subcycles of 4 different 8051 instructions (on a stock 8051), including a JSR and a conditional branch. You can pick any instructions.
What is the longest (most internal clock cycles) 8051 instruction?

The ALU, program counter, memory, etc are trivial. Hooking it all together and controlling it is the hard part.

I want to learn some basics first, then I will do my own design.
I have one zip file for the MOV instructions.

5-bit opcode + 3-bit register specification
7-bit opcode + 1-bit register specification
8-bit opcode

5-bit opcode + 3-bit register specification
MOV A, Rn
MOV Rn, #immediate
MOV Rn, direct
MOV direct, Rn
MOV Rn, A


7-bit opcode + 1-bit register specification

MOV @Ri, A
MOV @Ri, direct
MOV A, @Ri
MOV direct, @Ri

8-bit opcode
8 + 0 = 8 bits
MOV A, #immediate
MOV A, direct
MOV direct, A
I think first I have to build the ALU, then interface the decoder with the ALU.
I am explaining my work for the ALU.

8-bit ALU
source, destination
source 1: 8 bits
source 2: 8 bits
destination: 8 bits
ALU carry
operation code from the decoder

So we know the sources for the ALU:
data from the accumulator
data from RAM
data from a register

How do I make an ALU for the 8051? (A rough sketch follows below.)
Look at the attached file for the MOV instructions.
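A very rough sketch of such an 8-bit ALU is below. The 3-bit operation encoding is made up for illustration; a real design would drive the select lines from whatever control word the instruction decoder produces.

Code:
// Sketch of an 8-bit ALU. The op encoding is invented for illustration;
// the real control word would come from the instruction decoder.
module alu8 (
    input  wire [7:0] a,       // e.g. from the accumulator
    input  wire [7:0] b,       // e.g. from RAM or a register
    input  wire [2:0] op,      // operation select from the decoder
    output reg  [7:0] f,       // result (destination)
    output reg        carry
);
    always @(*) begin
        carry = 1'b0;
        f     = 8'h00;
        case (op)
            3'b000: {carry, f} = a + b;   // ADD
            3'b001: {carry, f} = a - b;   // SUB (carry acts as borrow)
            3'b010: f = a & b;            // AND
            3'b011: f = a | b;            // OR
            3'b100: f = a ^ b;            // XOR
            3'b101: f = b;                // pass-through, useful for MOV-style transfers
            default: f = a;
        endcase
    end
endmodule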
 

Attachments
  • file2.zip (14.4 KB)
  • #7
The execution of an instruction happens over multiple clock cycles. There are many varieties of 8051 with different clock cycles per byte of instruction. Some take 1, some take many. The datapath architecture of your design depends on what you decide regarding clocks per byte of instruction.

You are defining the instruction structure, but not how each instruction is broken down and executed: what data is transferred on each clock cycle of the instruction.

The classic 8051 is 12 clock cycles per byte of instruction. Instructions are 1 to 4 bytes, meaning 12 to 48 clock cycles.

The Atmel 8051 is 1 cycle per byte of instruction, so instructions are 1 to 4 cycles.

The process of scheduling the operation of each instruction drives the architecture of the design.

Check this out:
http://archive.org/details/bitsavers_intel80518liminaryArchitecturalSpecificationMay80_6120863/

It is an old style 8051, but it illustrates what you need. I didn't find a complete document for a single cycle 8051, but this sort of addresses it:
http://www.cs.unc.edu/~vicci/comp261/project/mcu/80251_prog_man.pdf
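Returning to the scheduling point above, one common way to express it in RTL is a small fetch/decode/execute state machine. The sketch below is a bare-bones illustration with invented state names and strobes; it is not taken from any particular 8051 implementation.

Code:
// Bare-bones fetch/decode/execute sequencer, invented for illustration.
// A real 8051 spreads each instruction over a fixed number of such sub-cycles.
module sequencer (
    input  wire clk,
    input  wire rst,
    output reg  fetch_en,    // strobe the program memory read
    output reg  decode_en,   // strobe the instruction decoder
    output reg  exec_en      // strobe ALU operations / register writes
);
    localparam FETCH = 2'd0, DECODE = 2'd1, EXECUTE = 2'd2;
    reg [1:0] state;

    always @(posedge clk) begin
        if (rst)
            state <= FETCH;
        else
            case (state)
                FETCH:   state <= DECODE;
                DECODE:  state <= EXECUTE;
                EXECUTE: state <= FETCH;   // a multi-byte instruction would loop back to FETCH per byte
                default: state <= FETCH;
            endcase
    end

    always @(*) begin
        fetch_en  = (state == FETCH);
        decode_en = (state == DECODE);
        exec_en   = (state == EXECUTE);
    end
endmodule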
 
  • #8
meBigGuy said:
The execution of an instruction happens over multiple clock cycles. There are many varieties of 8051 with different clock cycles per byte of instruction. Some take 1, some take many. The datapath architecture of your design depends on what you decide regarding clocks per byte of instruction.

You are defining the instruction structure, but not how each instruction is broken down and executed: what data is transferred on each clock cycle of the instruction.

The classic 8051 is 12 clock cycles per byte of instruction. Instructions are 1 to 4 bytes, meaning 12 to 48 clock cycles.

The Atmel 8051 is 1 cycle per byte of instruction, so instructions are 1 to 4 cycles.

The process of scheduling the operation of each instruction drives the architecture of the design.

Check this out:
http://archive.org/details/bitsavers_intel80518liminaryArchitecturalSpecificationMay80_6120863/

It is an old style 8051, but it illustrates what you need. I didn't find a complete document for a single cycle 8051, but this sort of addresses it:
http://www.cs.unc.edu/%7Evicci/comp261/project/mcu/80251_prog_man.pdf

First of all, thank you for your cooperation.

I am sharing my big doubt.
8051 instruction set:
5-bit opcode + 3-bit register specification
7-bit opcode + 1-bit register specification
8-bit opcode

A 5-bit opcode allows 32 operations.
A 7-bit opcode allows 128 operations.
An 8-bit opcode allows 256 operations.

Many ALUs cannot directly access program memory, so they work through registers.

· Accumulator register
· Rn registers (R0 to R7)
· RAM memory
· Program memory
· Instruction decoder
· Program counter
· ALU (arithmetic and logic unit)
· PSW register

5-bit opcode + 3-bit register specification
Example:
MOV A, Rn
11101 n n n

7-bit opcode + 1-bit register specification
MOV @Ri, A

1111011i

8-bit opcode
MOV A, direct

11100101 direct

5-bit opcode + 3-bit register specification
7-bit opcode + 1-bit register specification
8-bit opcode

Q: I don't understand what the use of three different instruction formats is, and how they execute.

I am not sure, but look at my answer:

I think the first two formats don't use the instruction decoder; the data is stored directly to the respective register, and to execute this type of instruction they don't use the ALU.

The last format, I think, does use the instruction decoder.

How do the different instructions execute in the different formats?
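One way to see how the three formats can coexist: the decoder matches on the fixed opcode bits and treats the remaining low bits as a register field. The fragment below is a hypothetical sketch covering only the three MOV encodings quoted above; it is not the full 8051 decode.

Code:
// Hypothetical decoder fragment for the three MOV encodings quoted above.
// casez lets the register-select bits (?) vary while the opcode bits are matched.
module mov_decode (
    input  wire [7:0] opcode,
    output reg        mov_a_rn,     // MOV A, Rn      : 11101nnn
    output reg        mov_at_ri_a,  // MOV @Ri, A     : 1111011i
    output reg        mov_a_direct, // MOV A, direct  : 11100101 (operand byte follows)
    output reg  [2:0] rn,           // Rn field for the 5+3 format
    output reg        ri            // Ri field for the 7+1 format
);
    always @(*) begin
        {mov_a_rn, mov_at_ri_a, mov_a_direct} = 3'b000;
        rn = opcode[2:0];
        ri = opcode[0];
        casez (opcode)
            8'b11101???: mov_a_rn     = 1'b1;  // 5-bit opcode + 3-bit register
            8'b1111011?: mov_at_ri_a  = 1'b1;  // 7-bit opcode + 1-bit register
            8'b11100101: mov_a_direct = 1'b1;  // full 8-bit opcode
            default: ;
        endcase
    end
endmodule

In a sketch like this, all three formats still go through the same decoder; the difference is just how many bits select the operation and how many select a register.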
 
  • #9
I don't know the 8051 to that level of detail, but some processors always pass data through the ALU to get to the registers. They OR with 0, or add 0.

I'm not going to be able to tell you exactly how to decode instructions to perform the operations you require.

You are going to have to study all of the operations required by every instruction and design the logic to make it happen.

There are open-source designs out there you can learn from, but they seem to be VHDL.
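On the OR-with-0 point above, the idea is simply that the same ALU path used for logic instructions also performs a plain transfer when one operand is forced to zero, for example:

Code:
// Illustration of "move by ORing with zero": the OR datapath also performs
// a plain transfer when the second operand is tied to 0.
module move_via_or (
    input  wire [7:0] src,
    output wire [7:0] dst
);
    assign dst = src | 8'h00;   // result equals src, but it went through the ALU's OR path
endmodule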
 
