How would you convert ASCII characters to decimal numbers in assembly?

  • Thread starter twoski
In summary: The intent of (b) is to understand how a string of ASCII digits is converted into a number. In essence, the answer is to iterate through the string, multiplying the running result by 10 and adding each digit's value, which builds up the number without allocating an array for the individual digits.
  • #1
twoski

Homework Statement



In assembly, data is usually input in the form of ASCII character strings and it is then converted to the appropriate form for processing.

a) Explain how you would convert an ASCII character representing a decimal digit (i.e. '0', '1', …, '9') into an actual decimal value.

b) Extend your answer in (a) by explaining how you would convert ASCII character strings to whole integer numbers (i.e. 10s, 100s, 1000s, …).

c) Considering the fact that each ASCII character is stored on 8 bits, explain how you would fit your converted number from (b) into a 32-bit word.


The Attempt at a Solution



A. The hex values for the ASCII characters representing decimal digits range from 0x30 to 0x39. We take this hex value and subtract 0x30 from it to obtain the corresponding decimal value. For example, the ASCII character '1' has the value 0x31, so 0x31 - 0x30 = 0x01 = decimal value 1.
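The subtraction described in (a) can be sketched in C, which compiles down to the same single subtract instruction an assembler would emit (the function name is mine, for illustration):

```c
#include <assert.h>

/* Part (a): an ASCII digit minus 0x30 ('0') yields its numeric value,
 * because ASCII places '0'..'9' consecutively at 0x30..0x39. */
int ascii_digit_to_value(char c)
{
    return c - 0x30;   /* equivalent to c - '0' */
}
```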

B. These are the steps you would take to convert a string to a number.

Start with a stored final result of 0, and loop through the string starting at the most significant digit:
1. Multiply the stored result by 10.
2. Subtract 0x30 from the current ASCII digit and add the difference to the stored result.
3. Move to the next digit in the string and start over at step 1.
4. Return the final result once there are no more ASCII digits.
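The steps above can be sketched in C (a minimal version, assuming a NUL-terminated string of ASCII digits; the function name is mine):

```c
#include <stdint.h>

/* Steps 1-4 above: accumulate digits left to right,
 * shifting the running result one decimal place each iteration. */
uint32_t ascii_string_to_u32(const char *s)
{
    uint32_t result = 0;                  /* start with a stored result of 0 */
    while (*s >= '0' && *s <= '9') {
        result = result * 10;             /* step 1: multiply by 10 */
        result += (uint32_t)(*s - 0x30);  /* step 2: add the digit's value */
        s++;                              /* step 3: next character */
    }
    return result;                        /* step 4: no more digits */
}
```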

C. I don't understand this question... If you're converting, say, "1234" into a number then it should fit easily into a 32 bit word...
 
  • #2
I think your logic is the answer for part c. Part b is confusing: is it asking you to store each digit (0 through 9) in a separate memory location, so you just end up with yet another string of bytes, but in the range 0 through 9 instead of hex 30 through hex 39? For part b, you could store the digits as BCD (packed decimal), 4 bits per digit, but I doubt that is the intent of part b.
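For reference, the packed-BCD idea mentioned above stores two decimal digits per byte. A minimal C sketch (the function name is mine, and an even number of digits is assumed):

```c
#include <stdint.h>

/* Packed BCD: two decimal digits per byte, high nibble first,
 * so the digit pair '4','7' packs to the byte 0x47. */
uint8_t pack_bcd_pair(char high, char low)
{
    return (uint8_t)(((high - '0') << 4) | (low - '0'));
}
```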
 
  • #3
I was a little confused because the only way you can convert a string to a number (to my knowledge) is with the aid of a loop.

However, my solution to B does work... I just don't understand how it ties into C.
 
  • #4
I think in (c) they want you to consider that you should not store intermediate results in an 8-bit character.
 
  • #5
twoski said:
However, my solution to B does work... I just don't understand how it ties into C.
Your solution is really a solution to C. It seems that B just wants you to convert a string of ASCII digits into a string of bytes that range from 0 to 9, although you'd need to define some value as a terminator for the string of bytes. Perhaps B is just an intermediate step that you're supposed to think about without actually implementing it.
 
  • #6
Would it be possible that C is referring to overflow conditions (i.e. if you're converting a string that has over 10 digits then you are definitely going to overflow)?
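An overflow-aware variant of the conversion loop would look roughly like this in C (a sketch, not from the thread; the function name and error convention are mine):

```c
#include <stdint.h>

/* Returns 0 on success, -1 if the value would not fit in 32 bits
 * (e.g. more than 10 digits, or anything above 4294967295). */
int ascii_to_u32_checked(const char *s, uint32_t *out)
{
    uint32_t result = 0;
    for (; *s >= '0' && *s <= '9'; s++) {
        uint32_t digit = (uint32_t)(*s - '0');
        /* Fail if result * 10 + digit would exceed UINT32_MAX. */
        if (result > (UINT32_MAX - digit) / 10)
            return -1;
        result = result * 10 + digit;
    }
    *out = result;
    return 0;
}
```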
 
  • #7
twoski said:
Would it be possible that C is referring to overflow conditions (i.e. if you're converting a string that has over 10 digits then you are definitely going to overflow)?
I'm not sure, but B implies separate integer numbers for 10's, 100's, ... .
 
  • #8
That seems awfully inefficient... Having an array which stores each integer representing 10's, 100's, etc. would be rather wasteful wouldn't it?
 
  • #9
twoski said:
That seems awfully inefficient... Having an array which stores each integer representing 10's, 100's, etc. would be rather wasteful wouldn't it?
Yes, it would be inefficient. That's not the point. This is a programming exercise where the student is to build up capabilities piece by piece. The intent is to aid the student's understanding.
 
  • #10
I just don't think it's right though... We haven't discussed arrays in class very much.

The only way I can see C making sense is if I store each individual converted number in 8 bits. In that case the answer to C would be to simply add each 8-bit number together in order for it to fit into a 32-bit word. I think.
 
  • #11
You already solved C. My impression of B is that it's just asking what a string of ASCII digits represents and what you should do with the digits, such as multiply the 10's digit by 10 and the 100's digit by 100, and then C asks how to create a loop that does this while converting a string into a 32-bit binary number. I don't think B is actually asking for a code fragment that stores the digits into an array.
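That reading of B, weighting each digit by its place value explicitly, can be sketched in C (the function name is illustrative, not from the thread):

```c
#include <stdint.h>
#include <string.h>

/* Weight each digit by its place value: the 10's digit by 10,
 * the 100's digit by 100, and so on, walking from the right. */
uint32_t convert_by_place_value(const char *s)
{
    size_t n = strlen(s);
    uint32_t weight = 1, result = 0;
    for (size_t i = n; i-- > 0; ) {   /* least significant digit first */
        result += (uint32_t)(s[i] - '0') * weight;
        weight *= 10;
    }
    return result;
}
```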
 
  • #12
Ohhh, I see what you mean...
 

1. What is ASCII and how does it relate to numbers?

ASCII stands for American Standard Code for Information Interchange. It is a character encoding standard that assigns a unique number to each character, including letters, numbers, and symbols. This allows computers to represent and store text as numbers.

2. How do you convert ASCII to numbers?

To convert ASCII to numbers, you can use the ASCII table to look up the corresponding number for each character. You can also use programming languages such as Python or Java to automate the conversion process.
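In C the "table lookup" is implicit, since a char variable already holds its ASCII code; a small illustration (function names are mine):

```c
/* A char in C is stored as its character code, so reading the
 * ASCII value is just treating the char as an integer. */
int ascii_code(char c)
{
    return (int)c;          /* e.g. '7' is code 0x37 == 55 */
}

/* Converting a digit character to its numeric value. */
int digit_value(char c)
{
    return c - '0';         /* e.g. '7' -> 7 */
}
```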

3. Why is converting ASCII to numbers important?

Converting ASCII to numbers is important because it allows computers to process and manipulate text data. This is essential for tasks such as data analysis, text processing, and communication between different systems.

4. Can ASCII numbers be converted back to text?

Yes, ASCII numbers can be converted back to text using the reverse process of looking up the character associated with each number. This allows the original text to be retrieved and displayed.

5. Are there any variations of ASCII?

Yes, there are variations of ASCII such as UTF-8, which is a more comprehensive character encoding standard that supports a wider range of characters from different languages and symbols. However, ASCII is still widely used and is the basis for many other character encoding standards.
