How can I learn computational complexity?

SUMMARY

The discussion focuses on learning computational complexity and algorithms, emphasizing concepts like time-complexity and big-O notation. The recommended resource is "Algorithms" (4th Edition) by Robert Sedgewick, which provides clear explanations suitable for beginners with some math background. The conversation uses bubble sort to illustrate how an inefficient algorithm becomes impractical on large datasets, and notes what algorithmic cost implies for encryption security. A solid grasp of mathematical principles is essential for mastering these topics.

PREREQUISITES
  • Basic understanding of algorithms and their functions
  • Familiarity with big-O notation for time-complexity analysis
  • Mathematical concepts such as iteration and linearity
  • Experience with programming fundamentals
NEXT STEPS
  • Study "Algorithms" (4th Edition) by Robert Sedgewick for foundational knowledge
  • Learn about time-complexity and its impact on algorithm performance
  • Explore different sorting algorithms and their efficiencies
  • Investigate encryption methods and their reliance on algorithmic complexity
USEFUL FOR

Students, aspiring computer scientists, and software developers interested in algorithms, computational complexity, and encryption security will benefit from this discussion.

barasakar
Hi all,
I was watching a lecture on the MIT OpenCourseWare website called Algorithmic Lower Bounds: Fun with Hardness Proofs. I am very interested in the content of the lecture, but I couldn't understand most of the information, such as the class P and some other specific terminology. I tried to look things up on Wikipedia, but even more unclear terms came up. I am totally new to algorithms, but I have some programming experience and am currently learning calculus in high school. Is there a way for me to start approaching algorithms and computational complexity?
 
jim mcnamara
A lot of textbooks on algorithms explain this clearly, including big-O notation. For exact examples and definitions, try Robert Sedgewick's 'Algorithms', 4th Edition. You need some math background to read it usefully.

For an offhand, non-technical explanation:

Computers operate by stepping one-by-one through each data item. This is iteration, and sorting is an important example of it. The idea is to go through the list of items as few times as possible. The number of steps your algorithm needs to produce a verified, fully sorted list is a measure of its time-complexity.
Suppose you have N items in your list. A bubble sort requires, in the worst case, roughly (N-1) times (N-1) comparisons: up to N-1 passes through the list, each making up to N-1 comparisons. If N is 11, that is 10 times 10 = 100 comparisons, which won't take long. If there are a million items, uh oh. 999,999 times 999,999 is waaaaay bigger than 10 times 10: 999,998,000,001 comparisons, about ten billion times as many as for eleven items. As the list gets longer, time gets used up faster and faster. With gigantic lists the time becomes easier to measure in geologic eras, eons, instead of seconds or minutes.
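The pass-counting argument above can be sketched in a few lines. This is a minimal bubble sort that counts comparisons so you can watch the quadratic growth directly (the function name and the early-exit-on-no-swaps refinement are just one common way to write it):

```python
def bubble_sort(items):
    """Sort a list in place, counting comparisons to show O(N^2) growth."""
    n = len(items)
    comparisons = 0
    for i in range(n - 1):             # up to N-1 passes over the list
        swapped = False
        for j in range(n - 1 - i):     # each pass compares adjacent pairs
            comparisons += 1
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:                # no swaps means the list is sorted
            break
    return comparisons

# Worst case: a reverse-sorted list forces every comparison.
print(bubble_sort(list(range(11, 0, -1))))    # N = 11   -> 55 comparisons
print(bubble_sort(list(range(1000, 0, -1))))  # N = 1000 -> 499500 comparisons
```

Going from 11 items to 1,000 items (roughly 90 times as many) multiplies the work by about 9,000, which is the quadratic blow-up the paragraph above describes.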

So a bubble sort is fine, if you only want to sort tiny lists. You need to know what resources an algorithm will consume, and time is one of them. Encryption uses this fact to its advantage.

A bad guy is not going to waste hundreds of hours on a very expensive supercomputer to break really hard encryption; that is not going to make him rich overnight. He is going to break into something easy, where he can get the password because a user was careless. That is money much faster.

Governments, which have the time and money, may want to try breaking things, or to plant back doors. Encryption depends largely on the fact that all of the known algorithms for breaking a specific scheme are extremely time-consuming. So it becomes really impractical to break good encryption without backdoors (shortcuts) or some other interesting way to steal passwords or encryption keys.
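To put rough numbers on "extremely time-consuming": here is a back-of-the-envelope sketch. The guessing rate is made up purely for the sake of the arithmetic, and this models only naive exhaustive key search, not any real attack on a real cipher:

```python
SECONDS_PER_YEAR = 365 * 24 * 3600

def brute_force_years(key_bits, keys_per_second):
    """Years needed to try every key in an exhaustive search.

    keys_per_second is an assumed attacker speed, not a real benchmark.
    """
    return 2 ** key_bits / keys_per_second / SECONDS_PER_YEAR

# Even granting the attacker a trillion guesses per second:
print(f"56-bit key:  {brute_force_years(56, 10**12):.3g} years")   # under a day
print(f"128-bit key: {brute_force_years(128, 10**12):.3g} years")  # ~10^19 years
```

Each extra key bit doubles the search space, which is why a 128-bit key is not "a bit harder" than a 56-bit one but astronomically harder: the search time dwarfs the age of the universe.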

That is the concept. Real time-complexity analysis is far more interesting, and it requires precise definitions.

Math is VERY linear. If you miss a step in learning from A to Z (G, for example), you may not really understand or be able to use most of the concepts after G. That is one great reason to go to class every day.
Really good math textbook writers know this; those are the ones to read. Too many explanations on the internet assume far too much prior knowledge. Anyone not fully conversant with the field, as you found out, gets a headache fast.
 
jim mcnamara said:
Thanks man, that's pretty helpful!
 
