Space Complexity of Number-theoretic Algorithms


Discussion Overview

The discussion revolves around the space complexity of number-theoretic algorithms, with participants seeking examples and clarifications on how to calculate and understand space complexity in this context. The conversation includes references to specific algorithms, models of computation, and theoretical frameworks.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant requests an example of calculating space complexity for a number-theoretic algorithm.
  • Another participant suggests searching for the Euclidean algorithm and its complexity, noting that many textbooks cover this topic.
  • A participant mentions finding examples that only discuss time complexity, not space complexity.
  • Links to resources discussing space complexity in relation to the Euclidean algorithm and number theory are provided by another participant.
  • A participant references Neal Koblitz's model of bit operations for estimating time complexity and poses questions about the RAM model and space complexity definitions.
  • One participant explains the differences between general unit cost RAM and log-cost RAM models, emphasizing the importance of precision in referring to RAM models.
  • Another participant expresses interest in using the logarithmic cost criterion for big numbers, seeking confirmation on their understanding of space complexity in this context.
  • There is a discussion about the implications of using unit cost RAM models versus Turing Machine models, with one participant cautioning against the limitations of the unit cost RAM model.

Areas of Agreement / Disagreement

Participants express differing views on the appropriate models for discussing space complexity, with some favoring the logarithmic cost criterion while others highlight the limitations of the unit cost RAM model. The discussion remains unresolved regarding the best approach to defining space complexity in the context of number-theoretic algorithms.

Contextual Notes

There are unresolved assumptions regarding the definitions of space complexity and the models of computation being referenced. The discussion also reflects varying levels of familiarity with the theoretical frameworks involved.

nworm:
Can you write an example where the space complexity of a number-theoretic algorithm is calculated? Thanks in advance.
 
Have you tried Google? Try 'euclidean algorithm' and 'complexity'; you'll get piles of examples where the time and space complexity are discussed. I'd expect most textbooks covering complexity to have an example from number theory as well (the Euclidean algorithm is probably common here too, being a very 'classic' algorithm).
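For concreteness, here is a minimal sketch of the kind of worked example such a search turns up; the iterative form and the space bounds stated in the comments are my own illustration, not claims made in this thread:

```python
# Iterative Euclidean algorithm.
# Time: O(log min(a, b)) division steps.
# Space: O(1) extra machine words beyond the inputs; counting bits,
# the two variables never need more than O(log max(a, b)) bits.
def gcd(a, b):
    while b:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # -> 21
```

The recursive version of the same algorithm would additionally use O(log min(a, b)) stack frames, which is one reason textbook space analyses distinguish the two forms.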
 
Yes, I have. I have only found examples where the time complexity of number-theoretic algorithms is discussed.
 
Try

http://pharos.cpsc.ucalgary.ca/Dienst/UI/2.0/Describe/ncstrl.ucalgary_cs/1989-376-38

(first hit from "space complexity" euclidean algorithm)

You might want to check Andrew Odlyzko's webpage

http://www.dtc.umn.edu/~odlyzko/doc/cnt.html

It's filled with number theory algorithms; the handful I've read included time and space complexity discussions.
 
Thank you. These are very interesting links.
 
Dear shmoe,

I have read parts of these links. Not long ago I read "A Course in Number Theory and Cryptography" by Neal Koblitz, and I have used Koblitz's model of bit operations to estimate the time complexity of my algorithm.

If you have read Koblitz's book, please tell me what you think about the following questions:
Is this model a RAM with unbounded space and word length?
Can I say that the space complexity of my algorithm is the number of bits used to store information?
Thanks in advance.
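As an illustration of the definition proposed here (space = number of bits used to store information), a short sketch follows; treating a number N as occupying Θ(log N) bits is a standard assumption of bit-complexity models in general, not a quotation from Koblitz's book:

```python
# In a bit-complexity model, storing a number N takes
# floor(log2 N) + 1 = Theta(log N) bits, so one reasonable space
# measure is the total bit length of all values an algorithm holds.
def bits(n):
    return n.bit_length()  # number of bits needed to store n

print(bits(10**6))  # -> 20
print(bits(2**64))  # -> 65
```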
 
I haven't read the book, but normally when we speak of time and space complexity, we do so in terms of the Turing machine (commonly referred to as the Turing machine model). I presume you are aware of this, so I won't delve further into it.

Now, when one speaks of the RAM model, one has to be very precise about which one is being referred to.

There are two types of RAM models:
1. General unit-cost RAM model
Unit cost incurred per instruction execution.
2. Log-cost RAM model
Cost incurred per instruction execution is proportional to the logarithm of the size of the operands.

The general unit-cost RAM model is much more powerful than the Turing machine. The Turing-machine equivalent among RAM models is the log-cost RAM model. Hence, if you are using a RAM model, you should normally adhere to the log-cost model. However, working with the unit-cost RAM model is simpler, and hence we use a workaround.

Workaround:
We make sure that all operands are polynomially bounded in the input size. For such a unit-cost RAM model there is an equivalent log-cost RAM model, and thereby an equivalent Turing-machine model.

Most algorithms (especially the number-theoretic algorithms you are interested in) are today commonly analyzed in the unit-cost RAM model in the way I described above.

-- AI
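A toy cost-accounting sketch contrasting the two criteria described above; the charging scheme (1 per instruction vs. the operand's bit length) is a deliberate simplification for illustration:

```python
# Run `steps` doublings of x, charging each addition under both
# RAM cost criteria.
def doubling_costs(steps):
    x, unit_cost, log_cost = 1, 0, 0
    for _ in range(steps):
        x = x + x
        unit_cost += 1              # uniform cost: 1 per instruction
        log_cost += x.bit_length()  # log cost: ~bit length of operand
    return unit_cost, log_cost

print(doubling_costs(10))  # -> (10, 65)
```

Once the operands stop being polynomially bounded in the input size, the two totals diverge, which is exactly why the workaround above restricts operand growth.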
 
Thank you.
Thank you.
If I understand correctly, you mean "the uniform cost criterion" and "the logarithmic cost criterion" from

http://www.cse.ohio-state.edu/~gurari/theory-bk/theory-bk-fivese1.html

I deal with big numbers, so I intend to use the logarithmic cost criterion, i.e. O(log N) units of space for a number N (or O(log N) bits). Am I right?
 
nworm said:
Thank you.
If I understand correctly, you mean "the uniform cost criterion" and "the logarithmic cost criterion" from

http://www.cse.ohio-state.edu/~gurari/theory-bk/theory-bk-fivese1.html
Yes.

I deal with big numbers, so I intend to use the logarithmic cost criterion, i.e. O(log N) units of space for a number N (or O(log N) bits). Am I right?
Sort of close, but not completely. Our computers are known to simulate Turing machines. As I said earlier, the unit-cost RAM model is far more powerful than the Turing machine; that is, a unit-cost RAM can outperform our computers at certain things. Since we are looking at things our computers are actually capable of doing, we wouldn't want to work with the unit-cost RAM model.

-- AI
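The gap between the models can be made concrete: under the uniform criterion, n squarings cost only n, yet the result has about 2^n bits, so a Turing machine (or a real computer) needs exponential time and space just to write it down. A small sketch of this blow-up (my own illustration):

```python
# Square x = 2 repeatedly: after n unit-cost steps the value is
# 2**(2**n), whose representation alone needs 2**n + 1 bits.
def repeated_squares(n):
    x = 2
    for _ in range(n):
        x = x * x
    return x.bit_length()

print(repeated_squares(5))  # -> 33  (2**32 needs 33 bits)
```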
 
