[C] variable system in programming


Discussion Overview

The discussion revolves around the concept of variables in programming, particularly in the context of the C programming language. Participants explore how variables are defined, assigned values, and manipulated, along with the differences between numeric and string variables. The conversation includes technical explanations and examples to clarify these concepts for beginners.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant expresses confusion about variable assignment, suggesting that the assignment operation implies a misunderstanding of how variables are updated.
  • Another participant clarifies that in programming, "=" is used for assignment, meaning to store a value in a variable rather than indicating equality.
  • Examples are provided to illustrate how variables are assigned and updated, such as "A = A + B" resulting in a new value for A while B remains unchanged.
  • Discussion includes the distinction between numeric and string variables, emphasizing the importance of defining variable types before use to avoid unexpected results.
  • Some participants mention that C does not support operator overloading, which affects how arithmetic operations are interpreted compared to other languages like C++ or C#.
  • There is a mention of different programming paradigms, such as imperative and declarative languages, and how they handle variable assignment and operations differently.
  • Historical context is provided regarding programming languages like COBOL and BASIC, highlighting their unique approaches to variable assignment and syntax.

Areas of Agreement / Disagreement

Participants generally agree on the basic principles of variable assignment and the distinction between numeric and string types. However, there are varying opinions on the implications of assignment operations and the differences between programming languages, indicating that multiple views remain on these topics.

Contextual Notes

Some participants note that the understanding of variable assignment can vary significantly between programming languages, and the discussion does not resolve the nuances of how different languages handle these concepts.

Who May Find This Useful

Beginners in programming, particularly those learning C or interested in understanding variable manipulation and assignment in various programming languages.

jd12345
I just started programming and I don't understand how variables work. First the book defines a as 3 and b as 5, then defines a = a + b. For me, who hasn't done programming before, that implies that b = 0, but I think the second a is different from the first one.
Help regarding how variables work in a computer
 


In the computer language example you show, "=" means to assign a value to a variable, as opposed to meaning the variable is equal to that value. This would be a language like C, and each of those statements can be explained as:

A = 3
means to store the value 3 into A

B = 5
means to store the value 5 into B

A = A + B
means to add the values stored in A and B, then store the sum into A which will be 8.

Some languages like APL use an extended character set and use left arrow for assignment:

A ← 3
B ← 5
A ← A + B

Other languages like Pascal use :=

A := 3
B := 5
A := A + B
 


jd12345 said:
I just started programming and I don't understand how variables work. First the book defines a as 3 and b as 5, then defines a = a + b. For me, who hasn't done programming before, that implies that b = 0, but I think the second a is different from the first one.
Help regarding how variables work in a computer

In that description you're changing the value of "a" to be equal to the sum of ("a" + "b"); you're incrementing the value of "a" by the value of "b".

a=3
b=5
a=a+b

a now = 8
b still = 5

it is the same as saying a=3+5
 


By the way, there are two main types of variables, numeric and string. It's important not to get them mixed up.

Strings work like this.

a="hello "
b="world"
a=a+b
a now = "hello world"

Always define your variable types before you use them; otherwise

a=5
b=3
c=a+b

c might end up being 53

Variable types are usually declared at the start of a program and tell the computer how the variables are to be treated.

As a string

Dim A as string
Dim B as string
A=9
B=6
A=A+B
A now = 96

As a number

Dim A as integer
Dim B as integer
A=9
B=6
A=A+B
A now = 15
 


Just as a general comment... it helps to say which programming language you are using.
 


Ok thank you. I am learning C.
 


jd12345 said:
Ok thank you. I am learning C.

C doesn't have operator overloading in its standard, so typically when you see an arithmetic operation, the result will be numeric and have the same interpretation as you would find in mathematics, not the string example above (although in languages like C++ and C#, the string example is common).

The best way to think about these is that the left-hand variable is where the result gets stored, and the right-hand side (of the equals sign) is what gets evaluated. If a variable appears on both sides, then the variable may change after the line has been executed.
 


There are two types of programming language. The most common type (including C) could be described as "imperative", i.e. most of the source code contains commands telling the computer to do something. The "=" sign is just shorthand for "work out the value of what is on the right-hand side, and then assign it to the variable on the left-hand side". As you said, this isn't what "=" means in mathematics, but there is a limited set of characters available on a standard computer keyboard.

There are a few languages that are "declarative", where the source code is more like a "set of equations" that the computer "solves" to produce a result (or several results, if the solution isn't unique). These languages are usually less general-purpose, because the method for "solving the equations" has to be built into the implementation, so you can't just throw anything at it and hope it will figure out how to solve it. PROLOG is an example of that type of language.

Some early versions of the BASIC programming language carried the idea of an "imperative" language to its logical conclusion, and every statement in the language started with a "verb". Arguably it's a bit more obvious what
LET A = A + B
means compared with just A = A + B, but since the computer doesn't need the "LET" to figure out what the statement means, it didn't survive into any "modern" programming language that I know of.

The inventors of the COBOL language (now pretty much obsolete) chose not to use symbols at all, and went for long-winded alternatives like
ADD B TO A
ADD B AND C GIVING A
instead of a = a + b and a = b + c. That might not seem too bad, but typing words like MULTIPLY and DIVIDE in full instead of * and / wasn't much fun.
 


AlephZero said:
The inventors of the COBOL language (now pretty much obsolete) chose not to use symbols at all, and went for long-winded alternatives like
ADD B TO A
ADD B AND C GIVING A
instead of a = a + b and a = b + c. That might not seem too bad, but typing words like MULTIPLY and DIVIDE in full instead of * and / wasn't much fun.
For COBOL, a programmer can use just COMPUTE followed by a Fortran-like arithmetic statement. COBOL is still heavily used in financial institutions, and on IBM mainframes it sits alongside some amount of high-level assembly language (HLASM, ALC, left-over legacy code).
 
