A-level Mathematics programming


Discussion Overview

The discussion revolves around the intersection of mathematics and programming, particularly in the context of A-level Mathematics. Participants explore various programming paradigms, the limitations of current programming languages, and the potential for future computational models that align more closely with mathematical concepts.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant expresses a desire for reading materials that connect mathematics and programming, particularly with a focus on examples and accessible language.
  • Another participant suggests that mathematical problems often require multiple steps, similar to algorithms in programming, and mentions the optimization of algorithms using parallel processing.
  • There is a discussion about the notation used in programming languages, with some arguing that the expression "x = x + 1" is a limitation of character sets, while others highlight the differences in assignment operators across languages.
  • A participant critiques a blog post as being unrealistic regarding the capabilities of current technology, suggesting that it reflects a fantasy rather than feasible computing models.
  • Another participant introduces the concept of reversible computing and contrasts current computing methods with biological processes, emphasizing the bottleneck caused by the separation of processing and memory in computers.
  • There is mention of research on phase-change materials that could potentially allow for simultaneous processing and memory functions, drawing parallels to human brain function.

Areas of Agreement / Disagreement

Participants express differing views on the feasibility of certain programming concepts and the limitations of current languages. There is no consensus on the implications of the blog post or the future of programming paradigms.

Contextual Notes

Some participants note the limitations of current programming languages and the potential for future developments, but these ideas remain speculative and unresolved.

thespiritroom
Hi..

Until very recently, I had what could be called a marginal interest in computers, and even more so in math (mostly because any math I can do is limited to AS Level, and despite being able to deal with abstractions in other disciplines, I never seemed to catch on on the math front. I was also averse to the notation, which was very stupid in retrospect.)

I stumbled across the blog post below and was wondering if anybody can suggest reading material I could work through, related to the topics the post covers? Or at least a starting point? Ideally in language I can reasonably understand, or written in a way that uses a lot of examples to illustrate a point?

AS Level Math = http://tinyurl.com/6jj97c (I remember struggling most with mechanics and decision maths, less so with pure maths, but this was a good 2-3 years ago.)

(Disclaimer: I find that I mostly gain momentum/interest by stumbling across connections made between subject areas. Maybe my brain works in a weird way, but there have been plenty of instances in the past where I went from zero interest to wanting to understand the relations/overlaps between X & Y & etc.)

"Well, I was thinking of something quite a bit more different than a tricked out C processor. I know that great work was done to create the OS(s), etc. And that some tangible, lasting innovation happened in that group. However, my point is that C and friends are designed specifically to control a monolithic, serial instruction pipeline. All else is second-order fluff.

Math (not Lisp) has less relation to time. We force computation into the time dimension, which is very much against the Truth represented in our statements. The number of cycles it takes to get an answer should not be a function of the algorithm. Your brain does massive “computation” in a single cycle. That’s what I’m getting at.

We can define a bitmap representation of a cube mathematically, taking positional inputs and maybe some lighting specs. Once the inputs are specified, the answer is simultaneously defined. Reality is defined in less than one cycle. Until our hardware can be configured to do our mathematical dirty work in a single cycle, we are stuck with telling idiotic circuitry to do simplistic things.

Of course, processing lists is another matter. Counting cycles may be as difficult as count list items. Maybe processor power will be measured in terms of both algorithmic complexity and list length (handled in a single time slice, of course).

The original premise, that Math is the one true programming language, will only bear fruit when our hardware can handle the truth.

BTW, even my 8-yr-old daughter scratches her head when I write x=x+1. I resolve to belittle any language that supports such blasphemy."

Thanks very much in advance
 
I'm not quite sure of your point, but even a person will break a mathematical problem up into multiple steps, like going through the steps needed to solve a differential-equation-based problem, such as one where acceleration depends on position instead of time.
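To make that concrete, here is a minimal sketch of stepping through such a problem numerically. The spring constant, step size, and initial conditions are made-up illustrative values; the point is only that one mathematical statement, a(x) = -kx, becomes many small sequential steps when computed:

```python
# Explicit Euler integration of a spring, where acceleration depends on
# position: a(x) = -k*x. Hypothetical parameters, purely for illustration.

def simulate_spring(x0, v0, k=1.0, dt=0.001, steps=10_000):
    """Step position and velocity forward in time, one small step at a time."""
    x, v = x0, v0
    for _ in range(steps):
        a = -k * x      # acceleration from the current position
        x += v * dt     # update position using the current velocity
        v += a * dt     # update velocity using the acceleration just computed
    return x, v

x, v = simulate_spring(x0=1.0, v0=0.0)
```

Each loop iteration is one of the "multiple steps": the computer cannot jump straight to the answer, it has to walk there through time.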

Some algorithms can be optimized using parallel processors (PC, mainframe, supercomputer) or vector-based math units (some supercomputers, such as the Cray-1), or a PC can use SSE instructions to do a set of math operations in parallel.
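The structure of that kind of optimization can be sketched in a few lines: split one big arithmetic job into independent chunks, compute the chunks concurrently, then combine the partial results. (A thread pool only illustrates the shape of the idea; real SIMD/vector units do this in hardware, and the chunking scheme here is an assumption for the example.)

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_sum(data, workers=4):
    """Sum a list by farming out equal-sized chunks to a pool of workers."""
    chunk = (len(data) + workers - 1) // workers
    pieces = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(sum, pieces)   # each chunk is summed independently
    return sum(partials)                   # combine the partial results

total = parallel_sum(list(range(1000)))
```

The decomposition only works because the chunk sums are independent of each other, which is exactly the property a parallelizable algorithm needs.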

x = x + 1
This is a limitation imposed by the character set used for most computer languages. In APL, it would be:

X ← X + 1

Note that in APL, X could be a scalar, vector, matrix, ... in which case every element of X would be incremented by 1.
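That element-wise behavior can be imitated in a few lines of pure Python. This is only a toy sketch of what APL (and array languages generally) do natively: the same "+ 1" applies whether X is a scalar, a vector, or a matrix.

```python
# A toy imitation of APL's X ← X + 1: add 1 to a number, or recursively
# to every element of a nested list (vector, matrix, ...).

def inc(x):
    """Increment a scalar, or every element of an arbitrarily nested list."""
    if isinstance(x, list):
        return [inc(item) for item in x]
    return x + 1

scalar = inc(5)                   # 6
vector = inc([1, 2, 3])           # [2, 3, 4]
matrix = inc([[1, 2], [3, 4]])    # [[2, 3], [4, 5]]
```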
 
It seems like you are more interested in http://en.wikipedia.org/wiki/Declarative_programming than procedural languages. As the wiki page shows, people have already been there and done that.

But if you get sidetracked into notational trivia like "writing x = x + 1 is stupid", you aren't going to get very far IMO.
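The declarative/procedural contrast can be shown inside one language. Both functions below compute the squares of the even numbers below n; the first spells out *how* (explicit control flow), the second states *what* the result is, closer to set-builder notation. A small sketch, not a claim about any particular declarative system:

```python
def squares_of_evens_procedural(n):
    """Build the result step by step: loop, test, append."""
    out = []
    for i in range(n):
        if i % 2 == 0:
            out.append(i * i)
    return out

def squares_of_evens_declarative(n):
    """Describe the result as one expression, like {i^2 : i even, 0 <= i < n}."""
    return [i * i for i in range(n) if i % 2 == 0]
```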
 
AlephZero said:
It seems like you are more interested in http://en.wikipedia.org/wiki/Declarative_programming than procedural languages. As the wiki page shows, people have already been there and done that.

But if you get sidetracked into notational trivia like "writing x = x + 1 is stupid", you aren't going to get very far IMO.


lol, absolutely not what I meant! Sorry, I didn't realize I'd copied that far! When I said notation, I meant as in further maths, not standard algebra! Thank you!
 
rcgldr said:
In APL, it would be:
X ← X + 1
And in Pascal, Modula-2, and (I believe) Ada, this would be X := X + 1.

The point is that many languages distinguish between the assignment operator (← in APL, := in Pascal, = in C/C++/C#/etc.) and the "is equal to" operator. In no way does the expression X = X + 1 assert that X and X + 1 happen to be the same value.
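In Python terms, the distinction looks like this: `=` binds a name to a new value, while `==` tests equality. `x = x + 1` never claims x equals x + 1; the right-hand side is evaluated first, then the name is rebound to the result.

```python
x = 5
old = x
x = x + 1             # assignment: evaluate x + 1 (giving 6), then rebind x
assert x == old + 1   # equality test: x is now 6
assert x != old       # x no longer equals its previous value
```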
 
Your blog post looks like the transcript of someone daydreaming about what could be possible with computers. As stated, his fantasy is outside what is technologically feasible today. It would probably require both designing hardware in a fundamentally different manner and programming languages that could support 'reality in less than one cycle.'

Well, the above, or the ramblings of a madman.

(It somehow reminded me a bit of reversible computing, a nonstandard computing model which may be possible.)
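A toy nod to that idea: in reversible computing, no information is destroyed, so every gate can be undone. The controlled-NOT (CNOT) gate, which maps (a, b) to (a, a XOR b), is its own inverse. This sketch uses classical bits only, purely to illustrate the reversibility property:

```python
def cnot(a, b):
    """Flip b when the control bit a is 1; applying the gate twice restores (a, b)."""
    return a, a ^ b

# Reversibility check over all inputs: the gate undoes itself.
for a in (0, 1):
    for b in (0, 1):
        assert cnot(*cnot(a, b)) == (a, b)
```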
 
thespiritroom said:
"...
Math (not Lisp) has less relation to time. We force computation into the time dimension, which is very much against the Truth represented in our statements. The number of cycles it takes to get an answer should not be a function of the algorithm. Your brain does massive “computation” in a single cycle. That’s what I’m getting at.
...
Of course, processing lists is another matter. Counting cycles may be as difficult as count list items. Maybe processor power will be measured in terms of both algorithmic complexity and list length (handled in a single time slice, of course).

Computers currently deal with processing and memory separately, resulting in a speed and power 'bottleneck' caused by the need to continually move data around. This is totally unlike anything in biology, for example in human brains, where no real distinction is made between memory and computation. To perform these two functions simultaneously the University of Exeter research team used phase-change materials, a kind of semi-conductor that exhibits remarkable properties.
Their study demonstrates conclusively that phase-change materials can store and process information simultaneously. It also shows experimentally for the first time that they can perform general-purpose computing operations, such as addition, subtraction, multiplication and division. More strikingly perhaps it shows that phase-change materials can be used to make artificial neurons and synapses. This means that an artificial system made entirely from phase-change devices could potentially learn and process information in a similar way to our own brains.

http://www.sciencedaily.com/releases/2011/06/110623130736.htm
 
