How can I learn computational complexity?

AI Thread Summary
Understanding algorithms and computational complexity requires a foundational grasp of mathematical concepts, particularly as they relate to time complexity and sorting methods. The discussion highlights the importance of iterating through data efficiently, using sorting algorithms like bubble sort as an example. It explains that the running time of an algorithm such as bubble sort grows quadratically with the size of the dataset, so the number of operations balloons for large inputs. Even steeper, exponential growth underlies encryption, where breaking a secure system requires impractical amounts of time and computational resources. The conversation also stresses the importance of a solid mathematical background, as many resources assume prior knowledge that can confuse beginners. Textbooks like R. Sedgewick's "Algorithms" are recommended for clearer explanations. Overall, a structured approach to learning algorithms, starting from basic principles and building up, is essential for mastering the subject.
barasakar
Hi all,
I was watching a lecture on the MIT OpenCourseWare website called Algorithmic Lower Bounds: Fun with Hardness Proofs. I am very interested in the content of the lecture, but I couldn't understand most of it, such as the complexity class P and some other specific terminology. I tried to look these up on Wikipedia, but even more unclear things came up. I am totally new to algorithms, but I have some programming experience and am currently learning calculus in high school. Is there any way for me to start approaching algorithms and computational complexity?
 
A lot of textbooks on algorithms explain this clearly, including big-O notation. For an exact example and definition, try R. Sedgewick, "Algorithms," 4th ed. You need some math to read it usefully.
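As a rough illustration of what big-O classes mean in practice (a small Python sketch, not taken from Sedgewick's book), here is how a few common growth rates compare as the input size n increases:

```python
# Compare how fast common complexity classes grow with input size n.
# For 2^n we print the number of decimal digits, since the value itself
# quickly becomes astronomically large.
import math

for n in (10, 100, 1000, 10000):
    print(f"n={n:>6}  "
          f"n log n={int(n * math.log2(n)):>12}  "
          f"n^2={n * n:>12}  "
          f"2^n has about {int(n * math.log10(2)) + 1} decimal digits")
```

Note how n² overtakes n log n quickly, and how 2^n leaves both behind entirely: at n = 1000, 2^n already has over 300 digits.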

For an offhanded, non-technical explanation:

Computers operate by stepping one-by-one over each data item. This is iteration. Sorting is an important example of iterating. The idea is to go through the list of items as few times as possible. The number of steps your algorithm needs to produce a verified fully sorted list is a measure of time complexity.
Suppose you have N items in your list. In the worst case, a bubble sort needs N-1 passes through the list with up to N-1 comparisons each, roughly (N-1)² steps. If N is 11, that is only 10 × 10 = 100 comparisons, which won't take long. If there are a million items, uh oh: 999,999 × 999,999 = 999,998,000,001 steps, close to a trillion. The list is only about 100,000 times longer, but it needs about ten billion times as many comparisons. As the list gets longer, time gets used up faster and faster. With gigantic lists, the running time becomes easier to measure in geologic eras than in seconds or minutes.
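The pass-counting argument above can be sketched in Python. This is a minimal, instrumented bubble sort (an illustrative implementation, not Sedgewick's), counting comparisons so the roughly (N-1)² worst case is visible:

```python
# A minimal bubble sort instrumented to count comparisons, illustrating
# the (N-1) passes x (N-1) comparisons worst case described above.
def bubble_sort(items):
    items = list(items)            # work on a copy
    comparisons = 0
    n = len(items)
    for _ in range(n - 1):         # up to N-1 passes
        swapped = False
        for i in range(n - 1):     # up to N-1 comparisons per pass
            comparisons += 1
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
        if not swapped:            # no swaps means already sorted: stop early
            break
    return items, comparisons

# Worst case: a reverse-sorted list forces every pass to run.
sorted_list, steps = bubble_sort(range(100, 0, -1))
print(steps)  # 99 passes x 99 comparisons = 9801
```

On an already-sorted list the early-exit check brings this down to a single pass, which is why the worst case and the best case of the same algorithm can differ so dramatically.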

So a bubble sort is fine - if you want to sort tiny lists. You need to know what resources an algorithm will consume; time is one of them. Encryption uses this fact to its advantage.

A bad guy is not going to waste hundreds of hours on a very expensive supercomputer to break really hard encryption. That is not going to make him rich overnight. He is going to break into something easy instead - say, an account where he got the password because the user was careless. That is money much faster.

So governments, who have the time and money, may want to try breaking things - or to plant back doors. Encryption depends largely on the fact that all known algorithms for breaking a given scheme are enormously time consuming. In practice, it is impractical to break good encryption without a backdoor (a shortcut) or some other interesting way to steal passwords or encryption keys.
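The "super time consuming" point can be made concrete with some back-of-the-envelope arithmetic (the guess rate below is an illustrative assumption picked to be very generous to the attacker, not a real benchmark):

```python
# Rough sketch: how long an exhaustive key search takes as key length grows.
# GUESSES_PER_SECOND is an assumed figure for illustration only.
GUESSES_PER_SECOND = 1e12
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

for bits in (40, 64, 128):
    keyspace = 2 ** bits                 # number of possible keys
    years = keyspace / GUESSES_PER_SECOND / SECONDS_PER_YEAR
    print(f"{bits}-bit key: up to {years:.2e} years to try every key")
```

A 40-bit keyspace falls in about a second at this rate, a 64-bit one in months, but a 128-bit one takes on the order of 10¹⁹ years - the exponential growth does all the work.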

That is the concept. Real time-complexity analysis is far more interesting, and it requires precise definitions.

Math is VERY linear. If you miss a step in learning from A to Z (G, for example), you may not really understand or be able to use most of the concepts after G. That is one great reason to go to class every day.
Really good math text writers know this; those are the ones to read. Too many explanations on the internet assume way too much of the prior knowledge required to use them well. Anybody not fully conversant with the field - as you found out - gets a headache fast.
 
jim mcnamara said:
A lot of textbooks on algorithms explain this clearly, including big-O notation. For an exact example and definition, try R. Sedgewick, "Algorithms," 4th ed. [...]
Thanks man, that's pretty helpful!
 