Learning data structures and algorithms in C provides a solid foundation, but transitioning to other languages like Java or Lisp may present challenges due to differences in paradigms and memory management. While C requires manual pointer management, modern languages often abstract these complexities, allowing for a focus on core concepts. Data structures are universal across languages, though terminology and implementation details may vary. Understanding recursion and the specific features of each language, such as object-oriented programming in Java, is crucial for successful implementation. Familiarity with graph theory can aid in understanding graph data structures, but its direct application in programming may be limited.
#1
elias001
389
26
I have a quick question. I am going through a book on C programming on my own. Afterwards, I plan to go through a book on data structures and algorithms, also in C.
I also need to learn C++, Matlab and for personal interest Haskell.
For the topics of data structures and algorithms, I understand there are standard ones across all programming languages. After learning them through C, what would be the biggest issue when trying to implement the same data structures/algorithms in, say, Java, Lisp, Ada, etc.? I know it might be a very naive question. When I study data structures and algorithms, I go through a discrete math textbook and do the relevant exercises in pseudocode.
I don't know if different programming paradigms would make it hard to translate an algorithm from one language to another. The only big difference I do know of between C/C++ and other newer-generation languages is that C/C++ require pointer management. But I think that might be a separate issue altogether.
If you don't already know any general programming language (e.g. enough to write at least a minimally useful program) and you plan to learn multiple languages and environments anyway, I would recommend you start with something other than C, e.g. Java, Python, or a similar managed language in an IDE, where you can focus on the essentials of common data structures and then worry about the finer details of pointers and memory management later, once you get to C. You could possibly go C++ first if you focus on using "modern C++" with smart pointers and similar abstractions, but be aware that even C++ has some subtle details lurking just below the surface that managed languages are simply better at hiding by default. On the other hand, if you feel you absolutely need to learn C (e.g. for maintaining an existing C code base or similar), then training good memory management practices from day one in C will probably not hurt, but it might feel like a tedious and non-productive distraction.
Also, I recommend you search for "is X good as a first programming language" to get a feel for what others think for a specific language/environment X. If you already know some language or part of it you can add that to your search.
Regarding "porting" algorithms between languages, that is indeed a thing. Often reference algorithms are made in a simple (managed) language and then ported and optimized for specific type of usage in another language and environment.
Almost every language has arrays. Almost every language has structures.
Javascript, C++, and some other languages (but not C) allow you to put functions within structures (or classes).
That's a big thing because it lets your code and the data structures the code relies on share a home.
Say you code up a bubble sort for an array of structures. That basic algorithm could be implemented in Java, Javascript, C, C++, Lisp, Ada, Cobol, Matlab, etc. In some cases, the terminology will be different - "structure" may be "record". Certainly, not all of these languages will create code that would be immediately recognized as a bubble sort. Some variations of Basic and Fortran could also be used.
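The bubble sort just described can be sketched in Java; this is a minimal illustration where the `Person` class stands in for what C would call a struct and Ada or Pascal would call a record (the names are made up for the example):

```java
// Bubble sort over an array of simple records (Java classes), illustrating
// that the algorithm itself is language-neutral.
public class BubbleSortDemo {
    // A minimal "structure": in C this would be a struct, in Ada a record.
    static class Person {
        final String name;
        final int age;
        Person(String name, int age) { this.name = name; this.age = age; }
    }

    // Classic O(n^2) bubble sort, ordering people by age ascending.
    static void bubbleSort(Person[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            for (int j = 0; j < a.length - 1 - i; j++) {
                if (a[j].age > a[j + 1].age) {   // compare adjacent elements
                    Person tmp = a[j];           // swap them if out of order
                    a[j] = a[j + 1];
                    a[j + 1] = tmp;
                }
            }
        }
    }

    public static void main(String[] args) {
        Person[] people = {
            new Person("Alice", 34), new Person("Bob", 21), new Person("Carol", 28)
        };
        bubbleSort(people);
        for (Person p : people) System.out.println(p.name + " " + p.age);
        // prints: Bob 21, Carol 28, Alice 34 (one per line)
    }
}
```

Translating this to C, Ada, or Lisp changes the syntax for the record and the loop, but the pair of nested loops and the adjacent-swap step stay recognizably the same.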
At a certain point, things will get too cumbersome. For example, Bash would be pretty cryptic - and Forth is always cryptic.
Let me add this: Data structures is a topic of its own - unrelated to programming languages. If your interest is specific to data structures, you should probably venture into PHP/MySQL. Every language has its own semantics for coding structures - but the real topic is designing those structures, especially when they exceed available RAM by many factors; need to be developed in direct support of customers; have mixed security requirements; and such. If you don't know what Boyce-Codd normalization is, you haven't really scratched the surface of data structures.
C is a basic, fundamental language. You can easily mimic a C program in the other, more modern, languages. C (as opposed to C++) will not have Object Oriented Programming (OOP) concepts built into the language. If you are later working in modern languages where OOP is expected, that will be a learning process.
Data structures are universal in all those languages, including C.
One algorithmic approach that you need to pay special attention to is recursion. Some computer languages support it well and others do not. I cannot tell you in general which are which.
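As a minimal illustration of the point about recursion, here is the same algorithm (a hypothetical factorial example, not tied to any particular book) written both recursively and iteratively in Java. Languages differ mainly in how deep the call stack may grow and whether tail calls are optimized (the JVM does not optimize them):

```java
// The same computation expressed recursively and iteratively.
public class RecursionDemo {
    // Recursive form: each call waits on the result of the next.
    static long factorialRecursive(int n) {
        return (n <= 1) ? 1 : n * factorialRecursive(n - 1);
    }

    // Iterative form: a plain loop, no call-stack growth.
    static long factorialIterative(int n) {
        long result = 1;
        for (int i = 2; i <= n; i++) result *= i;
        return result;
    }

    public static void main(String[] args) {
        System.out.println(factorialRecursive(10)); // 3628800
        System.out.println(factorialIterative(10)); // 3628800
    }
}
```

In a language with weak or no recursion support, only the second form is available, which is why the choice of language can change how an algorithm is expressed.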
#6
sysprog1
20
26
The rosettacode.org site has many sample programming tasks/problems, and it shows solutions implemented for each of them in dozens of languages.
The best all-around OOP language is Java. It's verbose at times, but it's used everywhere on the web. All major IDEs support it as well.
Java is a single inheritance language, which means a class has a single parent, who may in turn have a single parent. C++ allows for multiple parents and the use of mix-in classes. The multiple inheritance of C++ has caused many programmers problems.
Consider the case of a child class C inheriting everything from parent classes PA and PB. PA and PB have been implemented to inherit from GP, the grandparent class. Inheriting means that the child class can change attributes in the grandparent class GP.
The question is, which path do we use to reference the grandfather attribute?
1) C->PA->GP
2) C->PB->GP
In C++, separate memory areas are created for GP along each path (unless virtual inheritance is used), meaning it's essential to follow a convention and always use the first approach. But if you forget and mix the two, you'll have a fun time diagnosing the problem.
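For contrast with the C++ diamond described above, here is a hedged Java sketch (the class and interface names, such as `Mixin`, are made up for illustration): a class may extend only one parent, so there is only ever one copy of the grandparent's state, and a second "parent" can contribute behavior only through an interface:

```java
// Java sidesteps the diamond: one superclass chain, so one copy of state.
public class DiamondDemo {
    static class Grandparent {
        int attribute = 0;           // exactly one copy, by construction
    }

    // A second "parent" carrying behavior only, no state of its own.
    interface Mixin {
        default String describe() { return "mixin behavior"; }
    }

    static class Child extends Grandparent implements Mixin {
        void update() { attribute = 42; }   // no ambiguity about which path
    }

    public static void main(String[] args) {
        Child c = new Child();
        c.update();
        System.out.println(c.attribute + " / " + c.describe()); // 42 / mixin behavior
    }
}
```

The trade-off is that an interface cannot contribute fields, which is exactly what prevents the two-copies-of-GP ambiguity in the first place.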
---
There's also the Processing IDE, which is great for learning while doing stuff. Its base language is Java, although not the very latest Java.
For Data Structures, there are linked lists, doubly linked lists, stacks, and queues. They are used a lot in programming. The linked list is the basic one to learn first. You'll learn to add to a linked list at either the front or the end, to delete a node from the linked list without breaking it, and to insert a node between two nodes.
Stacks and queues can be built on top of linked lists.
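For instance, Java's own LinkedList implements the Deque interface, so the same doubly linked list can play both roles; a minimal sketch:

```java
import java.util.LinkedList;

// One linked list type serving as both a stack (LIFO) and a queue (FIFO).
public class StackQueueDemo {
    public static void main(String[] args) {
        LinkedList<Integer> stack = new LinkedList<>();
        stack.push(1);   // push/pop operate on the head: last in, first out
        stack.push(2);
        stack.push(3);
        System.out.println(stack.pop()); // 3

        LinkedList<Integer> queue = new LinkedList<>();
        queue.offer(1);  // offer adds at the tail, poll removes from the head
        queue.offer(2);
        queue.offer(3);
        System.out.println(queue.poll()); // 1
    }
}
```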
Here's a Java implementation using its LinkedList datatype:
Java:
import java.util.LinkedList;

public class Example {
    public static void main(String[] args) {
        LinkedList<String> list = new LinkedList<>();
        list.add("Alice");
        list.add("Bob");
        list.add("Charlie");
        list.addFirst("Zara");      // insert at head
        list.addLast("David");      // insert at tail
        System.out.println(list);   // [Zara, Alice, Bob, Charlie, David]
        list.removeFirst();         // removes Zara
        list.removeLast();          // removes David
        System.out.println(list);   // [Alice, Bob, Charlie]
    }
}
When you study Data Structures, you will need to build these structures yourself from scratch.
In the Java case, the more common ones are provided as a convenience to the programmer. You use the examples to show you how they should work in a program.
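For study purposes, a from-scratch version might look like the following minimal sketch (a singly linked list of ints; the class and method names are illustrative, not from any standard library):

```java
// A hand-rolled singly linked list with the operations mentioned above:
// add at the front, add at the end, and delete by value without breaking
// the chain.
public class SimpleLinkedList {
    static class Node {
        int value;
        Node next;
        Node(int value) { this.value = value; }
    }

    Node head;

    void addFirst(int v) {             // O(1): new node points at old head
        Node n = new Node(v);
        n.next = head;
        head = n;
    }

    void addLast(int v) {              // O(n): walk to the tail, then link
        Node n = new Node(v);
        if (head == null) { head = n; return; }
        Node cur = head;
        while (cur.next != null) cur = cur.next;
        cur.next = n;
    }

    boolean remove(int v) {            // unlink the first node holding v
        if (head == null) return false;
        if (head.value == v) { head = head.next; return true; }
        for (Node cur = head; cur.next != null; cur = cur.next) {
            if (cur.next.value == v) {
                cur.next = cur.next.next;  // bypass the node; chain stays intact
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        SimpleLinkedList list = new SimpleLinkedList();
        list.addLast(2);
        list.addLast(3);
        list.addFirst(1);   // list is now 1 -> 2 -> 3
        list.remove(2);     // list is now 1 -> 3
        for (Node cur = list.head; cur != null; cur = cur.next)
            System.out.println(cur.value);
    }
}
```

The delete operation is the one that trips people up: the trick is that you unlink a node by making its predecessor point past it, which is why the loop looks one node ahead.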
---
The next one is the tree datatype, where again you learn how to create a tree, traverse a tree in any of several ways, add a node, and delete a node without breaking the tree structure.
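A minimal sketch of such a tree in Java, assuming a plain binary search tree with integer keys (the names are illustrative): insertion walks down to an empty spot, and an in-order traversal visits the keys in sorted order.

```java
import java.util.ArrayList;
import java.util.List;

// A minimal binary search tree: insert a node, traverse in order.
public class SimpleBst {
    static class Node {
        int key;
        Node left, right;
        Node(int key) { this.key = key; }
    }

    Node root;

    void insert(int key) { root = insert(root, key); }

    private Node insert(Node n, int key) {
        if (n == null) return new Node(key);        // found the empty spot
        if (key < n.key) n.left = insert(n.left, key);
        else if (key > n.key) n.right = insert(n.right, key);
        return n;                                    // duplicates are ignored
    }

    List<Integer> inOrder() {                        // left subtree, node, right subtree
        List<Integer> out = new ArrayList<>();
        inOrder(root, out);
        return out;
    }

    private void inOrder(Node n, List<Integer> out) {
        if (n == null) return;
        inOrder(n.left, out);
        out.add(n.key);
        inOrder(n.right, out);
    }

    public static void main(String[] args) {
        SimpleBst tree = new SimpleBst();
        for (int k : new int[] {5, 2, 8, 1, 3}) tree.insert(k);
        System.out.println(tree.inOrder()); // [1, 2, 3, 5, 8]
    }
}
```

Pre-order and post-order traversals are the same recursion with the `out.add` moved before or after the two recursive calls.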
---
The last one I remember is the graph, where a given node points to several other nodes, and yet other nodes point to the given node. Again, you learn how to create graphs, add nodes, delete nodes, and insert nodes between nodes.
There are many variations here of how nodes are connected to other nodes.
Each of these structures can support traversal in one direction to the end, or in the reverse direction so you can retrace your steps. This means the nodes may carry both a forward-pointing pointer and a backward-pointing pointer.
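A graph like the one described can be sketched as an adjacency list; this minimal Java example (class and method names are made up for illustration) uses directed edges, so one node can point at another that also points back:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// A directed graph as an adjacency list: each node maps to the list of
// nodes it points to. Other nodes may point back at it, which is what
// distinguishes a graph from a tree.
public class SimpleGraph {
    private final Map<String, List<String>> adj = new HashMap<>();

    void addNode(String name) {
        adj.putIfAbsent(name, new ArrayList<>());
    }

    void addEdge(String from, String to) {   // directed edge from -> to
        addNode(from);
        addNode(to);
        adj.get(from).add(to);
    }

    List<String> neighbors(String name) {
        return adj.getOrDefault(name, new ArrayList<>());
    }

    public static void main(String[] args) {
        SimpleGraph g = new SimpleGraph();
        g.addEdge("A", "B");
        g.addEdge("A", "C");
        g.addEdge("C", "A");   // C points back at A: a cycle, legal in graphs
        System.out.println(g.neighbors("A")); // [B, C]
    }
}
```

Undirected graphs, weighted edges, and the doubly linked back-pointers mentioned above are all variations on how this neighbor list is populated.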
#8
elias001
389
26
@Filip Larsen the first serious programming language I learned was C, and I have fond memories of it. Python and all those interpreted languages feel like I am playing a video game. I really, really dislike it. C feels like one is telling a story; it has a beginning, a middle and an end, kind of like going through all the works of Tolkien. Whereas Python is like watching one of those unending Japanese anime like the Doraemon series, except Python is not as cute nor adorable as Doraemon. It felt more like watching all the bad guys in all the myriad series of Dragon Ball rolled into one.
#9
elias001
389
26
@jedishrfu for the life of me, I never knew nor appreciated why Java is so popular. Then there is Java and coffee/bean script. Wait, was it called coffee or bean? Anyways, what was wrong with sticking to Assembly, C, C++, Fortran, Cobol? OK, I know there was Lisp, Prolog and Ada. But then new languages kept popping up. The funny part is that OOP languages came out a few decades before Java got popular. I am guessing that whenever a new language with a new programming paradigm comes out, it takes a decade or more before it gains popularity. But I never understood why a certain language becomes popular, or whether there was really an actual need for it.
Also, I plan to stay in the mathematical sciences, so I need to know C, C++, MATLAB and possibly Fortran more than I need to know Java. I would prefer to program only in assembly, if there were a general version that is not hardware-specific, the way C is.
You mentioned graphs as a data structure. Would knowing the math of graph theory make my life easier when dealing with graph data structures?
elias001 said:
@jedishrfu for the life of me, I never knew nor appreciated why Java is so popular. Then there is Java and coffee/bean script. Wait, was it called coffee or bean? Anyways, what was wrong with sticking to Assembly, C, C++, Fortran, Cobol? OK, I know there was Lisp, Prolog and Ada. But then new languages kept popping up. The funny part is that OOP languages came out a few decades before Java got popular.
A lot of the motivation of Java was to make a language that was more secure and less likely to allow mistakes than C/C++. It did not allow some of the things that C/C++ allowed which had led to bad pointers and overwriting memory. It was intended to provide a "sandbox" that a programmer could use for running his program, but which protected the rest of the computer from any bad behavior of the program. That would make the programs safer to download as part of internet web pages and run on millions of computers. CORRECTION: It's my understanding that Microsoft made their own modifications, which became popular and weakened some of those protections, but I am not an expert on any of that. These comments may be more about Javascript than about Java. (This was the result of my faulty memory. @jedishrfu gives accurate information in post #31.)
elias001 said:
I am guessing whatever new languages with new programming paradigms that comes out, it takes more than a decade or so before it gain popularity. But I never understand why certain language becomes popular nor was there really an actual need for it.
A lot of that is due to security and program safety issues that the new languages try to fix. Of course, there are a lot of other features where the language developer thinks he knows better than all other programmers.
elias001 said:
Also, i plan to stay in the mathematical sciences. So i need to know C, C++, Matlab and possibly fortran more than I need to know Java. I would prefer to learn to only program in assembly if there is a general version that is not hardware specific like C.
OH! THE HORROR!
Programming only in assembly language would be awful for a mathematician. ;-)
elias001 said:
You mentioned graphs as a data structure. Would knowing the math of graph theory make my life easier when dealing with graph data structures?
I would think that the mathematical subject of graph theory is not very helpful in computer programming of data structures, except for terminology and basic concepts. ADDED: I was too careless in this statement. @sysprog1 mentions a book in post #50 that is interesting. The author has several books on parallel algorithms and computations. I can see the connection to graph theory in that subject.
#11
elias001
389
26
@FactChecker I still remember when Java first got popular and the computer was so, so slow. I mean, I knew it was running Java because it kept saying something about updating Java... It felt like the computer was stuck ruminating about the digital meaning of its own existence. It just felt eerie.
Also, programming in Assembly is not horror if one is in numerical analysis. There are books about numerical methods in Java or Python. That might as well be taking a Toyota onto an F1 race track and trying to compete.
Fun fact: RollerCoaster Tycoon was written in assembly.
Another feature of Java's design was that the same source could run on any computer that could run Java. This is a central feature, and for many years it was only a dream.
In fact, some languages like Kotlin, Scala, Groovy and NetRexx depend on, i.e. run on, the Java JRE runtime component, which is ported to all major computer architectures and OS systems, providing a common though OS-specific GUI.
Java has one of the most comprehensive API packages. Many are embedded in the JRE runtime module.
"Write once, run anywhere" was the motto. You have to have grown up through this turmoil to truly appreciate what Sun's Java team, led by Gosling, did.
#13
elias001
389
26
@jedishrfu Can't one do the same thing with C or Fortran: write once, run anywhere? Also, to this day, I still don't understand what on earth an API is; I have never seen one.
elias001 said:
Also, programming in Assembly is not horror if one is in numerical analysis. There are books about numerical methods in Java or Python. That might as well be taking a Toyota onto an F1 race track and trying to compete.
You may be underestimating the ability of the high-level optimization in modern compilers. They are impressive. (Although Python had a reputation of being very slow. Hopefully, it is much faster now.)
Also, I would prefer using a numerical analysis routine in MATLAB to writing one in assembly language. In fact, the authors of the MATLAB routines are likely to be the authors of standard numerical analysis textbooks. That was the case 15 years ago.
And finally, the world of computer mathematics is expanding spectacularly. The things to worry about now include neural networks and other AI methods, sensor fusion (Kalman filters, etc.), etc. It's very hard to keep up, even at a superficial level.
#15
elias001
389
26
@FactChecker wait, if you have graduate-level knowledge of the following: functional analysis, operator theory, PDE (including integral transforms), generalized functions, optimization (both linear and nonlinear), variational methods, Fourier and harmonic analysis - then all the things you mentioned should not be hard to keep up with. Many engineers learn a bit of all of these topics, but they would not cover any of them in depth. I mean, if you have the math background, you only have to worry about learning the science/engineering material. Since an undergraduate mathematics degree's content applies to many areas of physical science and engineering, many areas of interest open up to you and you don't have to worry about the math.
elias001 said:
@FactChecker wait, if you have graduate-level knowledge of the following: functional analysis, operator theory, PDE (including integral transforms), generalized functions, optimization (both linear and nonlinear), variational methods, Fourier and harmonic analysis - then all the things you mentioned should not be hard to keep up with. Many engineers learn a bit of all of these topics, but they would not cover any of them in depth.
Ha! Well, there are a lot of things that I can't keep up with, and more every day.
Good luck in your computer efforts. I think you will enjoy them. These are exciting times.
#17
Filip Larsen
Gold Member
2,008
950
elias001 said:
Python and all those interpreted languages feel like I am playing a video game.
Well, if you have fond memories of C then by all means go for C/C++. In C you would mostly be doing all the pointer and memory management yourself. If you go with modern C++ (e.g. C++20) it offers templates and STL (Standard Template Library) with support for a range of data structures and algorithmic constructs useful for new structures (in C++, like many other modern languages, you'd rarely make your own custom data structure completely from scratch but build it from existing well-documented and well-performing structures and libraries).
And rest assured, trying to understand the precise semantics of various C++ language and library constructs will not make you feel like you are playing a video game (for kids); no, it will more likely make you feel like you are the detective in a sci-fi story trying to solve a mysterious death in a foreign country involving strange complicated machines and hidden devices.
Serious business, I tell you.
#18
elias001
389
26
@FactChecker I don't know if you went on Twitter like five or so years ago.
Anyways, it was the time when the media wouldn't talk about deep learning or machine learning. This whole business of machine learning and data science was still new, in the sense that more people were choosing to go into it for $$$ or whatever personal reasons. Little were they ready to face the reality of all the pure math they would have to learn. I don't mean statistics and probability; I mean functional analysis proper, including metric space topology. They were all complaining left, right, and center, wondering why they had to learn such things as "Hilbert spaces" or the theory of linear operators. I hope they did not get the impression that optimization only involves linear programming and nothing else.
elias001 said:
@jedishrfu Can't one do the same thing with C or Fortran: write once, run anywhere? Also, to this day, I still don't understand what on earth an API is; I have never seen one.
No, we may say something is a FORTRAN program and it might run on Linux, but likely won't run on Windows or MacOS because they have different FORTRAN Compilers or the system calls are different. Most people now use the GNU FORTRAN compiler, and everyone is on the same FORTRAN dialect, and there's a certain measure of conformity across platforms.
However, this wasn't always the case.
When I started out on mainframes at GE, I used Honeywell FORTRAN-Y, which was an enhanced variant of FORTRAN IV. Honeywell supported FORTRAN-Y and FORTRAN IV. However, IBM supported its own version of FORTRAN. The big difference that I recall was the CHARACTER datatype.
Companies tried to get ahead of the standards committee by implementing some or all of the recommended changes before things were formalized, both as a way to stand out from the crowd and to make programs written against their extensions difficult to port to other vendors. Honeywell was no different.
Here's a list of FORTRAN compilers:
Early Versions
• FORTRAN I (1957) – The very first version, developed at IBM for the IBM 704 computer.
• FORTRAN II (1958) – Added subroutines and functions.
• FORTRAN IV (1962) – Introduced machine-independent standardization and removed machine-dependent features.
ANSI/ISO Standards
• FORTRAN 66 (ANSI X3.9-1966) – The first standardized version by ANSI. Sometimes called FORTRAN IV standardized.
• FORTRAN 77 (ANSI X3.9-1978) – Added block IF…ENDIF, CHARACTER data type, DO…ENDDO loops, and other structured programming features.
Vendors would offer conversion programs to move a Honeywell FORTRAN application to IBM so customers could switch to an IBM mainframe and know that their programs would work there (maybe).
However, most companies tended to stay with one vendor over many years and shied away from conversions, usually IBM, as there was a saying, "Nobody ever got fired for buying IBM."
What is an API?
It's short for Application Programming Interface. FORTRAN programs mainly came with libraries of routines, and sometimes a site would add IMSL libraries to the mix. IMSL could be considered an API as it had routines for many advanced mathematical algorithms and functions. Its documentation described what each routine did, how to call it in a FORTRAN application, and what output or error to expect.
In the case of Java, APIs are defined by packages and classes. A package may contain an API for how to set up and perform matrix arithmetic by instantiating a class, loading a matrix into it, and then calling various math methods to operate on the matrix.
As an example, here is a simple API based on a quote from the Wizard of Oz. It starts by defining a Java interface that all classes of the API must implement. By implement, I mean they all must define the getName(), speak(), and habitat(), all of which return strings.
Next, the animal classes are defined: Lion, Tiger, Bear. Notice how each class implements the Animal interface and provides the relevant methods getName(), speak(), and habitat(), returning a string that the user of the API can display. Java requires the @Override annotation to indicate that the method's definition comes from an interface or parent class.
By the way, interface programming is a powerful OOP strategy for simplifying code while providing consistency across different classes. I can now write a generic program that works for all animals, and regardless of the animal instance provided, I can be sure that specific methods will be there for me to call.
Java:
// The API contract: every animal class must implement these methods.
public interface Animal {
    String getName();
    String speak();
    String habitat();
}

public class Lion implements Animal {
    @Override
    public String getName() { return "Lion"; }
    @Override
    public String speak() { return "Roar!"; }
    @Override
    public String habitat() { return "Savannah"; }
}

public class Tiger implements Animal {
    @Override
    public String getName() { return "Tiger"; }
    @Override
    public String speak() { return "Grrr!"; }
    @Override
    public String habitat() { return "Jungle"; }
}

public class Bear implements Animal {
    @Override
    public String getName() { return "Bear"; }
    @Override
    public String speak() { return "Growl!"; }
    @Override
    public String habitat() { return "Forest"; }
}
And finally, a sample program that iterates through a list of animals. Notice it's iterating through the list of animals, not knowing what each animal is, but assured that the methods getName() ... can be called.
Some years ago, there was a movement called structured programming, where goto statements were banished. When I see Java interface designs such as Animal, I am reminded that it does away with the need for the switch statement (although not really), so I call it switchless programming.
Interface programming is a powerful strategy, but it is also harder to follow as you try to debug why one class worked while another didn't, even though they implemented the same interface. It's a long story of a tool to identify unused code in a Java repository getting tripped up by interfaces.
Java:
public class WizardOfOz {
    public static void main(String[] args) {
        Animal[] animals = { new Lion(), new Tiger(), new Bear() };
        for (Animal a : animals) {
            System.out.println(a.getName() + " says \"" + a.speak() + "\" and lives in the " + a.habitat() + ".");
        }
    }
}
and the generated output:
Java:
Lion says "Roar!" and lives in the Savannah.
Tiger says "Grrr!" and lives in the Jungle.
Bear says "Growl!" and lives in the Forest.
I have not tested this code as plain Java, but I have tested it as a Processing sketch:
Java:
// 🦁🐯🐻 Wizard of Oz Animal API in Processing

// --- The API Contract ---
interface Animal {
    String getName();
    String speak();
    String habitat();
}

// --- Implementations ---
class Lion implements Animal {
    public String getName() { return "Lion"; }
    public String speak() { return "Roar!"; }
    public String habitat() { return "Savannah"; }
}

class Tiger implements Animal {
    public String getName() { return "Tiger"; }
    public String speak() { return "Grrr!"; }
    public String habitat() { return "Jungle"; }
}

class Bear implements Animal {
    public String getName() { return "Bear"; }
    public String speak() { return "Growl!"; }
    public String habitat() { return "Forest"; }
}

// --- Processing Setup (like main) ---
void setup() {
    // We don't need a window here, but you can set size if you want to display
    size(600, 200);
    background(255);
    textSize(16);
    fill(0);

    Animal[] animals = { new Lion(), new Tiger(), new Bear() };
    int y = 40;
    for (Animal a : animals) {
        String line = a.getName() + " says \"" + a.speak() +
                      "\" and lives in the " + a.habitat() + ".";
        println(line);      // print to console
        text(line, 20, y);  // draw on canvas
        y += 40;
    }
}
getting the same output on the canvas and in the output window below the code.
**The code samples in this post were generated by ChatGPT 5 and were tested in a Processing sketch environment. Testing as a Java application is left to the interested poster.
elias001 said:
@FactChecker I don't know if you went on Twitter like five or so years ago. Anyways, it was the time when the media wouldn't talk about deep learning or machine learning. This whole business of machine learning and data science was still new, in the sense that more people were choosing to go into it for $$$ or whatever personal reasons. Little were they ready to face the reality of all the pure math they would have to learn. I don't mean statistics and probability; I mean functional analysis proper, including metric space topology. They were all complaining left, right, and center, wondering why they had to learn such things as "Hilbert spaces" or the theory of linear operators. I hope they did not get the impression that optimization only involves linear programming and nothing else.
I am not an expert in machine learning, but here are my two cents:
I think we should distinguish between the theoretical research in machine learning versus its application. Neural networks have been applied in significant ways much earlier than many people realize. IMO, the current explosion of neural networks in real-time image recognition is due more to hardware/software improvements than to theoretical advances in machine learning.
If you are planning to do programming in a mathematical context, then take a look at Julia from MIT.
Julia is free and on a trajectory to replace MATLAB. Its syntax is similar to MATLAB's, it has the capability to work with GPUs, and it can interface with Python, R, and Fortran.
There's also Mathematica to consider, but it depends on what you're working on.
I would argue that Java is as good as any language, with its OOP features and robust libraries for everything from data structure collections, accurate floating-point math, and big numbers, to JDBC access to SQL databases, GUIs, zip files, and …
OSP (Open Source Physics) can add computational modeling to Java. Their site contains many examples of Java apps doing accurate modeling of physical systems. I took a course in it some years ago.
I forgot to add that OSP can work inside processing sketches too.
#22
elias001
389
26
@jedishrfu there are all these new languages. I don't even know how R and Julia are different, or, if I knew R and Matlab, why I would still need to know Julia. I do know that C and C++ have very large libraries, contributed by many people, for doing many different computational tasks in the hard sciences. I imagine the same can be said for Fortran and Matlab.
MATLAB has an entire environment that allows for advanced design work. It includes simulation, analysis, and many specialized libraries developed by experts in the relevant fields. It is relatively expensive and many of the extensions must be purchased/licensed, but I know engineering managers who will not hire anyone who does not know MATLAB.
PS. The cost may be greatly reduced for students and personal use.
#24
elias001
389
26
@FactChecker and @jedishrfu if I know R and Python, why do I need to know Julia? I understand that if I know Haskell, there are very specific reasons to learn, say, Lean, Coq, or Isabelle, since they are proof checkers. Doesn't Julia have a lot of overlap with both R and Python together?
At this point, there is no reason to bring up all the multitude of languages with their specific advantages, specializations, and disadvantages. You can learn them as needed. Any modern, general-purpose language will do. A couple with current popularity are Python, which has a large, enthusiastic, user group, and C/C++, which has a huge base of existing code and established programs.
#26
elias001
389
26
@FactChecker Does C++ still have a larger base of community-generated libraries than Python?
#27
Filip Larsen
Gold Member
2,008
950
elias001 said:
Does C++ still have a larger base of community-generated libraries than Python?
Considering that most high performing and system interfacing libraries used in Python usually are a set of C/C++ libraries with Python bindings, I would say C++ is more useful on the library side, even if Python seems to win the popularity vote over C++.
(I may be biased, since I personally find Python mostly to be a "toy" language good for quickly gluing things together in the "kitchen sink", but very limited once you want to "scale up" application/system complexity.)
#28
elias001
389
26
@Filip Larsen when I first tried out Python, I thought it must be the Space Invaders equivalent. I hope JavaScript or Java won't feel the same.
elias001 said:
@FactChecker and @jedishrfu if I know R and Python, why do I need to know Julia? I understand that if I know Haskell, there are very specific reasons to learn, say, Lean, Coq, or Isabelle, since they are proof checkers. Doesn't Julia have a lot of overlap with both R and Python together?
It's not about learning and writing in multiple languages. It's about needing a script written by someone else in another language, like R, Python, Fortran, or C, and now you want to combine them and use Julia as the glue as part of a bigger script, so you don’t have to rewrite anything.
Many research projects are like that. They make a script in bash, redirecting output to a file for an awk script to parse. The awk script generates the input files for the next stage, a program written in Python, R, FORTRAN, or C/C++. Julia is a better choice and may even have better algorithms that you can use as well.
To be clear, JavaScript was originally called LiveScript. Its focus was to create interactive webpages.
When Java and its applet code became popular, LiveScript changed its name to JavaScript to catch some of the glory of Java applets.
When applets became too large and too slow, JavaScript gained traction and surpassed Java applets.
MS killed Java applets on its Windows-based browser by not supporting the full runtime library of Java. This ticked off Sun, since they knew what MS was doing: trying to break Java's write-once-run-anywhere philosophy.
There was a battle for dominance. MS's Embrace, Extend, Extinguish philosophy, and its loss of the court battle to Sun, split the developer community once MS released .NET and C# to compete with Java on Windows, the dominant OS, cutting Sun out of the picture.
I don't know what "space invaders equivalent" means. That seems to be a foreign idea in the subject of computer languages.
I think he means code and indentation via spaces or tabs.
#33
elias001
389
26
@jedishrfu I mean that running a Python script feels like playing a video game. To add two numbers in C, you have to write a program, compile it, then run whatever it is you wrote. In Python, well, just type in 1+1. Actually, I don't even remember the proper syntax, but you folks know what I am trying to get at.
Yeah, that's just a nice convenience feature of the Python interactive shell.
Many other languages support an interactive shell. Julia is one such example.
Interactive shells may not support the full language. I recall Julia and Python have limitations on which language features work there.
#35
elias001
@jedishrfu can I ask if, once upon a time, programmers had to learn all the data structures and algorithms and write them in assembly? Also, is it worth learning to write programs that have the equivalent of a few to ten thousand lines of code in C? I have seen YouTubers where neural networks were written in assembly.
Simple answer is no. The most complex structure would usually be a simple array. We never got any formal instruction beyond reading the macro assembler reference and learning from others how they coded.
Formal data structures theory wasn't defined; rather, coders followed their instincts and the problem at hand. Fortran offered 2- or 3-dimensioned arrays, while COBOL offered simple record structures and tables, i.e., simple arrays, and these would have been the primary influence on assembler coders.
The OS was basically a set of tables of record structures to handle batch job run info. I wrote one major assembler based program that read the system tables and displayed the info on screen rather like the Linux top command of today.
It ran in supervisory mode and had access to all memory. Any mistake of mine could overwrite important job data and crash the system.
Mine did and it was discovered that a user routine I used could not run in supervisory mode.
Many coders wrote code that reused the initialization code area, meaning if your program crashed after looping through some algorithm, you could not look at the initialization code and what it did. The notion of separating code and data came later, as people realized many errors were a result of this coding decision.
Memory was limited and considered expensive and assembly language coders were frugal. Their program was data mixed with opcodes.
Bottom line: there were no stacks or queues or more complex data structures, only tabular data.
It was a simpler time.
#37
elias001
@jedishrfu wait, isn't C/C++ one of those write-once, run-anywhere languages? I mean, I have never heard of Macs having Java virtual machines installed on them.
So data structures and algorithms had already matured by the time ALGOL came on the scene. And for all the simpler times, they were able to do a lot with so little. Kind of like what Euler and his contemporaries had to work with in math for solving applied math problems.
Oh, before I forget, on the topic of pointers: there are quite a few books written on it for C, and also C++. Are there significant differences between how C and C++ handle this topic? I assume whatever I learn for C carries over to C++, but not necessarily the other way around.
Also, I found a book that covers the topic of smart pointers. Should I learn about pointers from the pre-"smart pointers" era before I go on to learning about smart pointers?
Also, I don't know if you have seen this: object-oriented C. Can I learn about pointers in an OOP setting in C before moving on to learning about them for C++? Would it make things easier in the sense of learning to write safer code?
elias001 said:
@jedishrfu can I ask if, once upon a time, programmers had to learn all the data structures and algorithms and write them in assembly? Also, is it worth learning to write programs that have the equivalent of a few to ten thousand lines of code in C? I have seen YouTubers where neural networks were written in assembly.
The series of books by Donald Knuth, "The Art of Computer Programming", were not "had to learn", but were considered standard classics to study. Knuth used a hypothetical assembly language called MIX.
Even if you do not get those books or study them, you should be aware of their existence and importance in the history of computer algorithms. IMO, they are very inexpensive for such classic books. (about $205 for 5 books on Amazon) But studying them is a very significant amount of work.
elias001 said:
@jedishrfu wait, isn't C/C++ one of those write-once, run-anywhere languages? I mean, I have never heard of Macs having Java virtual machines installed on them.
So data structures and algorithms had already matured by the time ALGOL came on the scene. And for all the simpler times, they were able to do a lot with so little. Kind of like what Euler and his contemporaries had to work with in math for solving applied math problems.
Oh, before I forget, on the topic of pointers: there are quite a few books written on it for C, and also C++. Are there significant differences between how C and C++ handle this topic? I assume whatever I learn for C carries over to C++, but not necessarily the other way around.
Also, I found a book that covers the topic of smart pointers. Should I learn about pointers from the pre-"smart pointers" era before I go on to learning about smart pointers?
Also, I don't know if you have seen this: object-oriented C. Can I learn about pointers in an OOP setting in C before moving on to learning about them for C++? Would it make things easier in the sense of learning to write safer code?
No. While it appears that C/C++ is like that, you can't just compile and run when switching from a Unix system to a Windows system. Programmers would look for cross-system libraries to solve some of this. For example, Windows uses \ for pathnames of directories and files, whereas Linux uses the / separator. Knowing that, C programmers would factor it into their code.
One problem with your question is that these concepts developed over time. Early coders didn't know the formal names of data structures but may have built them into their code, as no libraries existed at the time, at least none that I knew of.
Fortran, BASIC, and COBOL didn't have stacks until much later. This meant you couldn't do recursive programming. In FORTRAN, a recursive solution could put you in an infinite loop, until the compiler guys addressed it by warning you not to call the function you're defining. However, some assembly coders could hand-code a custom solution.
#40
elias001
@jedishrfu what about the issue of pointers that I mentioned? If I know C pointers, transitioning to C++ pointers should not be much of a problem?
You could use pointers in C++ the same way as in C, but that would miss out on some very important advantages of C++ and OOP.
Suppose you were making a data structure of a tree, where each node has an undetermined number of children. In C, you might define the structure with a pre-set maximum number of children, say 5, and have an array of up to 5 children. That would allocate a lot of space in memory that might be unused at some nodes and might not be enough at other nodes. Alternatively, you might allocate memory space each time a child is created and free it when the child is terminated. That requires careful programming and bookkeeping.
A concrete example (although not a tree structure) would be a simulation of traffic in a road network. Vehicles are entering and leaving the simulated network as simulation time progresses and each road section and intersection keeps track of vehicles on/at it. Intersections with traffic lights need to keep track of an undetermined number of vehicles.
In C, it gets messy.
In C++ using OOP, you can create (instantiate) new vehicles (objects) easily and follow them through the road network until they leave the area. The memory allocation and freeing is done by C++, with much less effort on your part.
PS. There are many other advantages of OOP that can be/are addressed in other threads.
#42
elias001
@FactChecker wait, I thought dealing with and learning about pointers has to do with memory management and dynamic memory management. I only care about that issue. Would OOP complicate the issue further or make things simpler?
elias001 said:
@FactChecker wait, I thought dealing with and learning about pointers has to do with memory management and dynamic memory management. I only care about that issue. Would OOP complicate the issue further or make things simpler?
Good question. OOP helps to facilitate memory management. For the time being, you can ignore other features of OOP if you want to.
Keep in mind that the memory allocation can get very complicated. You might be allocating memory for an entire structure and structures containing structures of variable sizes.
#44
elias001
@FactChecker later on, as I get more advanced into learning about programming: for languages like Python that are evangelized as memory safe, can one write a program in Python that causes a buffer overflow? I mean, basically, to show that its memory-safety features don't work in 100% of cases.
Go was developed for Google's server-side operations by a team that included people who had worked on C decades before.
It was designed to be general purpose and to meet Google's need for a common language for the millions of lines of code they maintain.
#46
Filip Larsen
Gold Member
elias001 said:
what about the issues of pointers that I mentioned. If I know C pointers, transitioning to C++ pointers should not be much of a problem?
While the abstract concept of pointing to or referencing data (as opposed to working directly with the "value" of the data) is similar in C and C++, and they share a bit of the same syntax, the actual patterns and mechanisms for memory/ownership management are almost completely different, even for raw pointers. E.g., in C you would use malloc() and free(), whereas in C++ you would use new and delete (on very rare occasions) and preferably smart pointers (in most cases).
I would say that while it doesn't hurt to know about pointer management in C before going to C++ it probably won't really help much either beyond the absolute basics.
#47
elias001
@jedishrfu and @FactChecker, thank you both. Your advice has helped me a lot.
elias001 said:
@FactChecker later on, as I get more advanced into learning about programming: for languages like Python that are evangelized as memory safe, can one write a program in Python that causes a buffer overflow? I mean, basically, to show that its memory-safety features don't work in 100% of cases.
I can't answer that question. I don't know enough about Python. If I were to guess, I would guess that it makes memory errors less likely but not perfect. Those errors are the type that crooks try to exploit for fraud. They will work hard to find any little vulnerability. A lot depends on the operating system that the code is running on.
#49
elias001
@FactChecker I have read some of Knuth's first volume. I would not say his books are hard; I would say they take a bit more time than other math books. He tried his best to be clear and meticulous. Too bad it is not like Feynman's lecture notes, where you can hear his lectures.
#50
sysprog1
FactChecker said:
I would think that the mathematical subject of graph theory is not very helpful in computer programming of data structures, except for terminology and basic concepts.
I found this book very helpful in my student days -- its content is still comp-sci foundational, and it clearly shows the applicability of graph theory to computer programming and data structures:
from the description at Cambridge University Press:
"This is a textbook on graph theory, especially suitable for computer scientists but also suitable for mathematicians with an interest in computational complexity. Although it introduces most of the classical concepts of pure and applied graph theory (spanning trees, connectivity, genus, colourability, flows in networks, matchings and traversals) and covers many of the major classical theorems, the emphasis is on algorithms and thier complexity: which graph problems have known efficient solutions and which are intractable. For the intractable problems a number of efficient approximation algorithms are included with known performance bounds. Informal use is made of a PASCAL-like programming language to describe the algorithms. A number of exercises and outlines of solutions are included to extend and motivate the material of the text."