How do programmers keep it all straight?

  • Thread starter: tstarling
AI Thread Summary
The discussion revolves around the challenges of learning programming, particularly for those who feel overwhelmed by the vast array of commands and languages. Many participants emphasize that it's unrealistic to expect to know every command or function in a programming language. Instead, they advocate for mastering the fundamentals and utilizing resources like online documentation, IDEs, and community forums for assistance. The conversation highlights the evolution of programming education, noting that modern learners can benefit from structured courses, online platforms, and collaborative learning. Python and Java are mentioned as popular introductory languages, with a consensus that understanding core programming concepts is crucial for success. Participants also stress the importance of practical experience, suggesting that learners should engage in hands-on projects and seek mentorship to navigate the complexities of coding. Overall, the key takeaway is that programming is a continuous learning process, and leveraging available tools and resources can significantly enhance understanding and productivity.
tstarling
You all might think it is a silly question, but I have the impression that one should be able to somehow know the entire string of commands, and so I feel overwhelmed and defeated. I never really learned the basics, just eased into computer work while in the healthcare field, and have been struggling ever since. There seems to be so much to learn, then more gets added, so I switch around and get very confused about even where all my files are.
Don't really have another question. Thanks if anyone feels like commenting.
 
tstarling said:
You all might think it is a silly question, but I have the impression that one should be able to somehow know the entire string of commands,

Are you talking about writing computer programs in a programming language such as Java? - or are you talking about using an interactive environment such as a "shell" and typing commands in a terminal window?
 
tstarling said:
You all might think it is a silly question, but I have the impression that one should be able to somehow know the entire string of commands, and so I feel overwhelmed and defeated. I never really learned the basics, just eased into computer work while in the healthcare field, and have been struggling ever since. There seems to be so much to learn, then more gets added, so I switch around and get very confused about even where all my files are.
Don't really have another question. Thanks if anyone feels like commenting.
Coursework! Long ago, in the old days when computers were for "smart" people and the technology was far less advanced, students could learn computer programming, in whatever languages were included in the course, at their local institution - whether community college, university, or high school. Students did not learn everything all at once. Students were taught to make tables of variables or data, draw and label flow diagrams, and translate the flow diagrams into computer-language code.
 
Long ago there was less to know. Fewer machine architectures, instructions, commands, languages, interfaces, etc. Learn fundamentals well, be selective about which sets of specifics you learn in depth, and be ready to read the manuals and code. Set up safe test environments. Try things out.
 
I often use the internet to find some code that matches what I'm trying to do and then extract out what I need. Some of my work involves coding in Java, then switching to Python or MATLAB and back, so it's useful to have programming examples and old code on hand to see how I did it in the past, and if not, to search the internet.

Also, modern languages like Java practically demand that programmers use an IDE like Eclipse, NetBeans or IntelliJ. These tools can look up Java APIs, prettify your code and help you refactor it as needed. Before the advent of these tools, it was quite cumbersome even to change a variable name everywhere correctly or refactor code. So if you're not using them, you should look into it for a productivity boost.

It's quite true that over the years programming languages have grown in features and in library/API support. When I learned Fortran IV, the manual was maybe 50-60 pages, with 30 or so statements and conventions, and pretty much everything was in it. Later, other programmers added third-party libraries that you could incorporate into your code, the most notable being the IMSL numerical computing library.

As newer languages are developed, they have to incorporate the popular library APIs of older languages into their code base in order to compete for programmers. We see this in Java today, which arguably has everything including the kitchen sink in its API set, making learning it fully an improbable task. Programming book authors often don't know the whole language (that's often why they wrote the book) but instead research the API, write about it and go on to something else. Later they can use their own book to remember the more arcane parts. One famous author I knew did this, and it amazed me how little he knew about what he was writing about. He made a lot of money on it too. So I guess money trumps knowing.

Anyway, there are a couple of cookbooks for Java, for Python and several other languages that are good for finding recipes for specific tasks like sorting data or date/time display and numerous other things.

https://www.amazon.com/dp/144933704X/?tag=pfamazon01-20
 
I keep notes of commands and code facts that I need often. Long ago, I had to switch often between computers with different systems (VAX, IBM, PC Windows, Unix, Unix-like) with completely different commands and tools. I had a pocket-sized notepad for each one and I would carry the one that I needed that day. Even now that I stay on one system, I have a file of notes.

PS. Also, Google helps a lot.
 
I use a ticket system called Jira, but any ticketing system should allow you to track your bugs and tasks. SourceForge has a nice free one, and I think Trac has a free version.

You should place all of your code files into a version control repository. It not only allows you to track your changes, but also lets you check out the same piece of code and edit it no matter what computer you are on.

As for languages, no, we don't know all of the functions and their parameters. I tend not to remember things unless I know that I won't have access to them later. I can google any function that I need to know about, and most of the time, the IDE will show me a function's parameters if I start typing it.
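As a sketch of that lookup outside any particular IDE, Python's standard inspect module can report a function's parameters on demand (the function here is a made-up stand-in, not a real API):

```python
import inspect

# A hypothetical function standing in for any call whose parameters
# you can never quite remember; the names are invented for illustration.
def send_request(url, timeout=30, retries=3):
    """Toy stand-in for a library function."""

# This is essentially the lookup an IDE performs as you start typing:
print(inspect.signature(send_request))  # prints (url, timeout=30, retries=3)
```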
 
newjerseyrunner said:
the IDE will show me a function's parameters if I start typing it.

Also if you don't know exactly what you need you can type in various words that match the task at hand. If you are lucky there is already something in the API to use and it will pop up for you.

So much easier than a dumb text editor and a stack of books where search means flipping pages.

BoB
 
tstarling said:
You all might think it is a silly question, but I have the impression that one should be able to somehow know the entire string of commands, and so I feel overwhelmed and defeated. I never really learned the basics, just eased into computer work while in the healthcare field, and have been struggling ever since. There seems to be so much to learn, then more gets added, so I switch around and get very confused about even where all my files are.
Don't really have another question. Thanks if anyone feels like commenting.
Learning the theory of coding is as important as having a big tool set. Start with the basics. Whenever you need a command you haven't learned, use the help/man function or a search engine online. DuckDuckGo returns Stack Overflow results whenever it suspects a programming inquiry, which is most helpful. As with anything, practice and then practice more.
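For example, in Python the built-in help function plays roughly the role of the man pages; a minimal sketch:

```python
# Ask the runtime itself for a function's documentation instead of
# trying to memorize it. help() prints what an IDE tooltip would show.
help(len)           # prints the documentation for the built-in len()
print(len.__doc__)  # the raw docstring behind that help text
```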
 
  • #10
Remember this ?
https://imgs.xkcd.com/comics/ballmer_peak.png

 
  • #11
tstarling said:
You all might think it is a silly question, but I have the impression that one should be able to somehow know the entire string of commands, and so I feel overwhelmed and defeated. I never really learned the basics, just eased into computer work while in the healthcare field, and have been struggling ever since.
Even the most experienced programmers have reference books they use. Take time to learn the basics :) Once you have the core programming concepts down you can apply them to any language and the rest is just learning syntax.
 
  • #12
Greg Bernhardt said:
Once you have the core programming concepts down you can apply them to any language and the rest is just learning syntax.

This is very true when the core is stable. Learning a new programming language is more like learning a different dialect rather than a whole new language.

But from time to time there are additions to that core. If you switch to a language that has adopted these new concepts as requirements but your preferred language has not you have two learning curves to climb.

BoB
 
  • #13
Greg Bernhardt said:
Even the most experienced programmers have reference books they use. Take time to learn the basics :) Once you have the core programming concepts down you can apply them to any language and the rest is just learning syntax.
That's what so many programmers keep saying. Not true for me.
 
  • #14
To keep in line with what the O.P. wants, how would a person today begin learning computer programming and data processing? Many decades ago, one would be able to enroll in a couple of beginning computer science courses and learn programming. This could have been done through a community college, and at the time there was no Windows operating system yet. Students would learn BASIC, COBOL, or Fortran, or any or all of them. Things changed? Right? No?
 
  • #15
symbolipoint said:
To keep in line with what the O.P. wants, how would a person today begin learning computer programming and data processing? Many decades ago, one would be able to enroll in a couple of beginning computer science courses and learn programming. This could have been done through a community college, and at the time there was no Windows operating system yet. Students would learn BASIC, COBOL, or Fortran, or any or all of them. Things changed? Right? No?
Step 1: Figure out what programming you want to do.
Step 2: Find out which is the most popular language for that task. It may not be the best language, but with more users, the better chance you have a query answered.
Step 3: Make accounts on Stack Overflow, GitHub, and some type of online education site. I personally use edX. They have a lot of free programming courses. I am not a Microsoft guy, but the Microsoft courses have been very beneficial and easy to follow.
Step 3.5: If you want to do Data Analytics, go ahead and make a Kaggle account as well.
Step 3 Alternative: Buy a programming textbook or find a free one. https://en.wikibooks.org/wiki/Main_Page and https://openlibrary.org/ have plenty of programming texts. A DuckDuckGo search will yield some books as well.
Step 4: Find/make friends that are also interested in learning programming. They will help keep you motivated and lots of troubleshooting time can be avoided by someone else intervening.
Step 5: Start writing programs. The beginning is the hardest. Don't get frustrated if your first programs have many errors. That is natural.
Step 6: Find something you want to program. Knowing how to program is no good if you have nothing to program!

Edit: Step 5.5: Learn to read man and help pages and documentation effectively! Teaching yourself is better than having to rely on forums.
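The first programs in Step 5 really can be tiny; a Python sketch of about the level to start at:

```python
# A classic first program: print a greeting, then do one small computation.
print("Hello, world!")

celsius = 20
fahrenheit = celsius * 9 / 5 + 32
print(celsius, "C is", fahrenheit, "F")  # prints: 20 C is 68.0 F
```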
 
  • #16
symbolipoint said:
To keep in line with what the O.P. wants, how would a person today begin learning computer programming and data processing? Many decades ago, one would be able to enroll in a couple of beginning computer science courses and learn programming. This could have been done through a community college, and at the time there was no Windows operating system yet. Students would learn BASIC, COBOL, or Fortran, or any or all of them. Things changed? Right? No?
I did a Google search for "cs1 programming language survey". The most recent hit on the first page is a 2014 blog post from the Communications of the ACM:

Python is Now the Most Popular Introductory Teaching Language at Top U.S. Universities

Java is a close second. Of course, this doesn't include more run-of-the-mill universities, colleges and community colleges.

As a single obsolete data point, I taught intro programming using C++ until about 2002. During the late '90s and early '00s, C++ was very common for this because it was the language used in high school AP computer science courses in the US. Then that course switched to Java in 2003. The professor who took over our intro programming course from me accordingly switched to Java also. I don't know what the prof who teaches the course now uses.
 
  • #17
jtbell said:
I did a Google search for "cs1 programming language survey". The most recent hit on the first page is a 2014 blog post from the Communications of the ACM:

Python is Now the Most Popular Introductory Teaching Language at Top U.S. Universities

Java is a close second. Of course, this doesn't include more run-of-the-mill universities, colleges and community colleges.
It is definitely beneficial to see what university classes are being taught. It may be beneficial to check out Indeed/LinkedIn and find which languages employers are looking for as well.
 
  • #18
jtbell said:
Python is Now the Most Popular Introductory Teaching Language

If you start out with Python make sure you learn the basics and simple data structures first. Python with its rich set of libraries makes it easy to get into the flashy stuff too quickly. If your goal is to complete a task easily then that is actually a good thing. If you want to learn programming well then understanding the foundation first is the way to go.

Documentation is scant on some of the Python libraries so if you have trouble wading into some of the heavier weight ones don't be discouraged. It might not be your fault.

A feature of Python that makes it easier to learn than many others is the console. You can execute commands interactively and get instant feedback.
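A sketch of that instant feedback; each of these lines can be typed one at a time at the >>> prompt:

```python
# Try small expressions and see the result immediately.
numbers = [3, 1, 2]
print(sorted(numbers))               # prints [1, 2, 3]
print(sum(numbers) / len(numbers))   # prints 2.0
```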

BoB
 
  • #20
symbolipoint said:
... Many decades ago, one would be able to enroll in a couple of beginning computer science courses and learn programming. This could have been done through a community college and at the time, no Window operating system yet. Students would learn BASIC, Cobol, or Fortran, or any/ or all of them. Things changed? Right? No?
No Windows, but you did have mainframes and minicomputers with multi-tasking operating systems. As for decades-old programming languages: RPG (Report Program Generator), somewhat of a software replacement for plugboard programming; PL/I, advanced but not that popular; and APL (A Programming Language), dating back to the early 1960s, a very high-level language whose operators work on variables with any number of dimensions.
 
  • #21
Before you ever sign up for a programming class, take a course in Logic. That is an essential fundamental that programming is built upon (and for that matter, good written and verbal communications too.)
 
  • #22
Dr_Zinj said:
Before you ever sign up for a programming class, take a course in Logic.
I agree that a small amount of logic would be helpful, but most of what would be taught in a quarter or semester of a logic course would have little or no bearing at all on programming.

Being able to categorize logical statements as modus ponens, modus tollens, etc. is not an essential skill for someone writing code, IMO, based on teaching many classes in programming over the past 30+ years. What is important is understanding simple logical expressions such as NOT p, p AND q, p OR q, and ##p \Rightarrow q## (or in more C-like form if(p) q; ). All of these can be learned by the use of a truth table, which wouldn't take an entire quarter or semester.
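Those few connectives can be checked with a truth table generated in a couple of lines of Python (writing p ⇒ q in its equivalent form (not p) or q):

```python
# Print a truth table for NOT, AND, OR, and material implication.
print("p     q     not p  p and q  p or q  p => q")
for p in (True, False):
    for q in (True, False):
        implies = (not p) or q  # false only when p is true and q is false
        print(p, q, not p, p and q, p or q, implies)
```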
 
  • #23
I once heard the claim that philosophers are notoriously good at understanding the more complex ideas of programming. For example, how the value of a pointer is able to be in two places simultaneously. I only took an ethics class at university and I doubt the effect in programming is immediately noticeable, but a metaphysics course may be beneficial as well. At the least, it would probably be enjoyable. My hypothesis is that any skill learned in a metaphysics course would still be learned in a programming course and probably take more time to develop it.
 
  • #24
Just like "A writer writes", a programmer programs. You can't learn/know everything about any technology these days, almost all of them change so quickly that you can't keep up - and it isn't necessary. You learn what you need to solve the current problem, you move on.

As to anyone suggesting Python as a learning language... let's just say I'd eat a whole High-C can full of worms first...which may explain why the learning curve looks so steep. If I were staring into the abyss that Python is, I'd be overwhelmed too.

A good programming language does what good software does: Fail-fast. With Java or C# or other modern language, you make a mistake that can be caught early, it will be because it won't let you compile something that's obviously broken. Python doesn't just take the opposite approach, it actively hides common mistakes from you and you'll spend a great deal of your learning time learning how to troubleshoot Python code rather than learning how to program. Typo'ed a property name in Python? No problem, Python will automatically create a property with that incorrect name and it's up to you to figure it out at runtime. While customers are looking over your shoulder. And your boss. And an angry swarm of lawyers, 3 human rights organizations, 2 animal rights organizations, the entire military of 4 countries you've never heard of and that damn partridge in that damn pear tree... Okay maybe I'm _slightly_ exaggerating, but you will spend time troubleshooting stuff at run time and that is *not* the time to be detecting/fixing bugs.
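A minimal sketch of the silent-typo behavior described above (the class and attribute names here are invented for illustration):

```python
class Job:
    def __init__(self):
        self.timeout = 30  # the attribute we mean to change

job = Job()
job.timout = 60        # typo: Python silently creates a NEW attribute
print(job.timeout)     # prints 30 -- the mistake only surfaces at run time
```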

Also, if you're smart, you'll learn from the tools. I was a C guru for 14 years; you know what I used to do? Run every line of code I wrote through PC-lint. Not because I didn't know the language inside and out (I even wrote my own C compiler once upon a time) but because I'd get in a hurry, I'd forget to check something, I'd succumb to old habits... always some sort of problem or another that can be caught early by good tools, which would teach me yet another lesson.

Likewise with Java. I use Eclipse with PMD and all the warnings turned up to 11, plus I run Infinitest on my projects. The instant I type in something stupid, I get a free lesson from the tools. The second or third or fourth time I get the same lesson presented, I finally remember why it's wrong and how it should have been done.

The other reason for using a modern language like Java and an IDE like Eclipse is that you can attach to a running JVM. That's simply not possible in Python, where "remote debugging" means something entirely different. If you want to learn what your software really does, you can run your application normally and then completely independently start up your IDE and attach to the JVM running your application.

That can also provide really important lessons that most beginners never think of and most old timers really should have learned: Predict where the bottlenecks in your application are, and then go try to prove your predictions correct. You won't, because your predictions are wrong. It's a fact of life, and the sooner you learn that lesson about software the sooner you begin to learn what things you _actually_ need to know about programming rather than what you think you need to know or what someone told you you need to know. If I had a nickel for every software engineer or hacker I've seen (including me, several decades ago) who wasted huge amounts of time learning obscure language tricks in order to "optimize" their code, I'd be rich.
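A tiny sketch of measuring instead of predicting, using Python's standard timeit module to compare two ways of building the same string (no claim here about which wins; that is the point of measuring):

```python
import timeit

# Time two candidate implementations instead of guessing at the bottleneck.
t_concat = timeit.timeit(
    "s = ''\nfor i in range(1000): s += str(i)", number=100)
t_join = timeit.timeit(
    "s = ''.join(str(i) for i in range(1000))", number=100)
print(f"concat: {t_concat:.4f}s  join: {t_join:.4f}s")
```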

Those are some things you learn along the way to learning what you set out to understand.

Basically, it boils down to this: choose a problem you can solve with software, pick up a tool and try to solve the problem. Repeat until you understand why one tool worked and the other didn't, or why one solution took you so long and the other was so fast. The successful tools will be the ones you go back to and learn more in depth. Repeat until, like a carpenter, you know what all the tools are for, even the odd-shaped and obscure ones.

Personally, if I were going to teach someone, it'd be Java and Eclipse. They're free, well documented, support most modern language features, and not only do your resulting apps run on any platform, your development environment runs on any platform (Eclipse is Java based and runs on Linux, Windows and Mac). There are other good tools out there, I just happen to know those well. I keep telling myself I'll put some tutorials on youtube... maybe I should get off my butt...
 
  • #25
Jamison Lahman said:
For example, how the value of a pointer is able to be in two places simultaneously.
No, that's not true.
In the context of C and C++, a pointer contains a memory address. So the value of a pointer is the address it contains. You can dereference a pointer, using the dereference operator (*) to access the memory pointed to. A pointer can't "be in two places at once." Maybe you're confusing the value of a pointer (an address) with the contents of the memory the pointer points to.

Jamison Lahman said:
I only took an ethics class at university and I doubt the effect in programming is immediately noticeable, but a metaphysics course may be beneficial as well. At the least, it would probably be enjoyable. My hypothesis is that any skill learned in a metaphysics course would still be learned in a programming course and probably take more time to develop it.
IMO, a course in metaphysics would be of no use whatsoever in learning to program. I worked for 15 years among many professional programmers. To the best of my knowledge, none of them had ever taken a metaphysics class. Certainly none of them ever touted the advantages of such a course in learning to program.
 
  • #26
None of us know everything. Keep learning and try to remember what you learned today before you go to sleep.
We all use GOD programming when we reach the limit of our own knowledge: Google Oriented Development. And if you are in a team, talk to your peers. Sometimes just by talking about something, the solution suddenly seems obvious.
 
  • #27
Jamison Lahman said:
For example, how the value of a pointer is able to be in two places simultaneously.
Mark44 said:
No, that's not true.
In the context of C and C++, a pointer contains a memory address. So the value of a pointer is the address it contains. You can dereference a pointer, using the dereference operator (*) to access the memory pointed to. A pointer can't "be in two places at once." Maybe you're confusing the value of a pointer (an address) with the contents of the memory the pointer points to.
This is semantics. The "value of a pointer" can refer to the address that is being pointed to. In that sense, here is an example of the value of a pointer being in two places at once:
Code:
{
  int n=123;
  volatile int *pPlaceA, *pPlaceB;
  pPlaceA = pPlaceB = &n;  /* both pointers now hold the address of n */
}
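For readers following along without a C compiler, the same idea can be sketched in Python, where two names can simultaneously hold the same reference:

```python
# Two variables holding the same reference -- one value, two places.
n = [123]
place_a = place_b = n
print(place_a is place_b)          # prints True: identical references
print(id(place_a) == id(place_b))  # prints True
```

Here `is` compares object identity, which is the closest Python gets to comparing addresses.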

Addressing the OP: when it comes to learning how to program, nothing beats programming. Everyone has their own ways of learning, but here are my suggestions:
1) Have a mentor - an experienced programmer.
2) Start with this exercise:
- a) Set up your development environment - this may take some research.
- b) Find an already-written "Hello World" program. Compile it, execute it, break it, fix it.
- c) Read enough about the language syntax to fully understand the "Hello World" program. Change it, break it, fix it.
- d) Play with whatever debugging methods are available to you: breakpoints, watch points, lines of "print" statements, etc.
3) Find more sample code. In each case:
- a) Work with the reference material to understand exactly what the program is doing.
- b) Change it, break it, fix it. Other than input from your mentor, this is the only evidence that you have that you understand what the code is really doing and how it is doing it.
4) Take on a very small project. This is where the mentor will really be needed. He/she can not only tell you why your program won't compile or operate as intended, they can also tell you what methods you can use to make things easier for yourself.
5) If you are having trouble with things like pointers or logic, ask your mentor and play with them. Find specific examples, break them and fix them.
6) Once you have some achievement, go to the reference manual and read through it - not for the purpose of memorizing it, but so that you know what resources are available to you. Should you need something later, the reference material will still be there. In a lot of cases, you won't even understand what the reference material is saying. That's fine, later the material you have read may give you ideas and with a case in point, you may find the reference material a lot less cryptic.
 
  • #28
Jamison Lahman said:
For example, how the value of a pointer is able to be in two places simultaneously.
Mark44 said:
No, that's not true.
In the context of C and C++, a pointer contains a memory address. So the value of a pointer is the address it contains. You can dereference a pointer, using the dereference operator (*) to access the memory pointed to. A pointer can't "be in two places at once." Maybe you're confusing the value of a pointer (an address) with the contents of the memory the pointer points to.
.Scott said:
This is semantics. The "value of a pointer" can refer to the address that is being pointed to. In that sense, here is an example of the value of a pointer being in two places at once:
Code:
{
  int n=123;
  volatile int *ptrA, *ptrB;
  ptrA = ptrB = &n;
}
I disagree that it is just a semantic difference. The value of a pointer is the address stored in the pointer variable, something that can be verified by the use of a debugger or by an output statement.

In your example, in which I changed the names of the pointer variables slightly, each pointer variable contains the same address; namely, the address of n. A pointer variable, like all other scalar variables, can contain only a single number.

If we assume that n is stored in four bytes starting at location 1000 (hypothetical), then the assignment in the third line stores this address in both pointer variables. So the value of ptrA and ptrB would be 1000, and the value of *ptrA and *ptrB would be 123, the number stored at address 1000. The difference between ptrA and *ptrA is hardly due to just semantics.

Is there some reason you made both pointer variables volatile?
 
  • #29
Mark44 said:
Is there some reason you made both pointer variables volatile?
Just to make sure that the pointer variables were really assigned to memory and that the address of n was really stored. Otherwise, the entire code block would be subject to being optimized away.

I assumed there was an issue of semantics when you posted:
A pointer can't "be in two places at once."
You seemed to be contradicting the statement that the "value of the pointer" could be in two places at once. So it sounded like semantics to me.

However, in the example you provided, I would take the "value of the pointer" to be (int *)1000. Certainly (int *)1000 can be stored in more than one place, and retained in those places simultaneously. And you also seem to take the "value of the pointer" to be (int *)1000.

So, I am now confused by exactly what you meant when you said "No, that is not true" in response to Jamison's "For example, how the value of a pointer is able to be in two places simultaneously.".

You seem to agree that the value can be stored in more than one place, as Jamison asserted.
 
  • #30
Here's what Jamison said:
Jamison Lahman said:
For example, how the value of a pointer is able to be in two places simultaneously.
.Scott said:
You seemed to be contradicting the statement that the "value of the pointer" could be in two places at once. So it sounded like semantics to me.
My comment, "A pointer can't "be in two places at once", was an attempt to understand and interpret what Jamison said, which I believe was not actually what he meant. Certainly multiple variables can all hold the same value, and the same is true for pointer variables. Given that Jamison was looking at things from the contexts of ethics, philosophy, and metaphysics, but apparently not in the context of programming, I interpreted what he said to mean that somehow we could store two different things in the same (scalar) variable. In summary, my response was germane to what I thought he meant, not the literal words he wrote.
 
  • #31
I'm an old timer, so my learning process involved Fortran, assembly, Cobol, APL, C, Java. In the olden days, the WatFor Fortran compiler was debug oriented, and could print out what line and variables were involved with an error. Fortran and C were the easiest for me to learn. APL is the most complex language I've learned. For Java, I use NetBeans for the IDE, which to me is similar to working with Visual Studio for C / C++ / C#.

TheOldFart said:
That can also provide really important lessons that most beginners never think of and most old timers really should have learned: Predict where the bottlenecks in your application are, and then go try to prove your predictions correct. You won't, because your predictions are wrong.
For most of the projects I worked on dating back to the 1970's, the teams did a pretty good job of predicting and later confirming where the key bottlenecks would be. Most of these involved multi-threaded applications and/or operating systems, some of the early ones were multiple (mini) computer applications.
 
  • #32
The answer to the original question ('How do programmers keep it all straight?') is simple: they (we) don't. We try, sure, we plan ahead (at least we should), and in the end we get it right (at least we should), but I've seen ghosts, if you get my meaning ;)
Making software is a process, and the tech is so huge, there are so many options... you can't remember everything every time. But you don't need to remember every key command, every data structure and every [type here any word]. You just need to remember the what/why/when of conditions, loops and values (variables, constants, pointers... that doesn't matter in this ethereal scenario) and understand your requirements and goals.
Dr_Zinj said:
Before you ever sign up for a programming class, take a course in Logic.
I agree. Logic is the most important thing.
Every language (Java, C#, C++, ASM, Perl, PHP, QBasic...) is basically the same if your logic mind is working. Command names may be different, a for-next loop could be optimal in one language and a bad idea in other cases... You should focus on that kind of thing for every project, and let go after that.
The more you use a technology deeper will be your understanding of it, of course, and that's a good thing, but if you try to remember everything about it when you are not using that technology... well, you will forget about your real target (to make good/useful software) and that is the worst nightmare possible for you and your peers.

TheOldFart said:
Python doesn't just take the opposite approach, it actively hides common mistakes from you
So true...
TheOldFart said:
Personally if I were going to teach someone it'd be Java and Eclipse. They're free, well documented, supports most modern language features, and not only do your resulting apps run on any platform, your development environment runs on any platform (Eclipse is java based and runs on Linux, Windows and Mac)
I'd add C#, because now you can use it on every platform (Windows, Mac, Linux, iOS, Android...), and Visual Studio Community Edition is my favorite free IDE. But Visual Studio doesn't run on Linux (although there are multiplatform IDEs for .NET languages).
 
  • Like
Likes TheOldFart
  • #33
In principle, programmers should thoroughly document every function and the meaning of the variables and constants in it.
In practice that doesn't happen much, because they are under pressure to get a job finished.
 
  • #34
tstarling said:
You all might think it is a silly question...there seems to be so much to learn then more added so I switch and get very confused about even where all my files are.

I have been a programmer for decades and used many languages. My memory isn't any better than anyone else's, but I do a few things to supplement matters:

1) Get organized with files, and keep ALL old code you ever wrote. You'd be amazed how useful it can be to look back on how you did that thing LAST time, even in a different language.
2) Get a web server, and install a personal wiki. I use PmWiki because it's drop-dead easy to install on either Linux or Windows. In your wiki, make notes on how to do particularly tricky or common things that you want to remember. For example, I have a page specifically on T-SQL (for SQL Server); I can't remember SQL code all that well, but I have marvelous records, so I can refresh my memory pretty easily.
3) Blog when you solve a sticky problem in any language. Somebody else will have the same problem, and thank you for posting the solution. And when you go for job interviews, they'll find your blog and love it. Plus, we all benefit from everybody else's blogs.

Programming gets easier because, like anything else, over time you've seen it all. So be patient with yourself. I used to live in terror because I had to use so many different languages and operating systems, so I was never the deep expert. But working across a broad swathe of languages and technologies is also an asset--if that's you, embrace it. And take good notes. Get a wiki. Save all code. Blog. And try to enjoy the endless learning curve that is programming.
 
  • Like
Likes QuantumQuest
  • #35
rootone said:
In principle, programmers should thoroughly document every function and the meaning of the variables and constants in it.
In practice that doesn't happen much, because they are under pressure to get a job finished.
No, not every variable.
The most important thing to include in the documentation in a function prologue is an answer to the question "Why did I write this program?". The most important things to document within the code are all of the special things that you are programming around. For example, if you are using a bin sort, why did you choose that method? Perhaps you want to keep the execution time within O(n); if so, say that.

Variables do need to be documented - but not all of them. If you have an array "CRabbit Rabbits[MAX_RABBITS]", you are indexing through it with "for(nRabbit=0;nRabbit<nNRabbits;nRabbit++)", you have a pointer "CRabbit *pRabbit", and the first statement in your for loop is "pRabbit=Rabbits+nRabbit", then none of those variables need further documentation. Anyone maintaining this code can go to the CRabbit module and discover everything they need to know about the CRabbit class. You may want to include something about how the objects in your array are constructed.

The purpose of the inline documentation is to help yourself or another programmer maintain the code. Your audience is someone who is a programmer capable of maintaining the code - not a layman.

I see tons and tons of code that meets a standard for documentation that is pro forma and utterly useless. The documentation should be bona fide communication of exactly the kind of information it will take for a programmer to thoroughly catch on to what is going on.

As far as there being pressure to get the job finished - software engineers get to say when they are finished. You don't have to hand your code in when it is not complete.
 
  • Like
Likes harborsparrow, QuantumQuest and rootone
  • #36
harborsparrow said:
keep ALL old code you ever wrote.
Assuming, of course, that you are allowed to.
Most of the code that I have written has been either classified, proprietary, or both.
 
  • Like
Likes harborsparrow
  • #37
.Scott said:
As far as there being pressure to get the job finished - software engineers get to say when they are finished. You don't have to hand your code in when it is not complete.
That may be the case for serious science and engineering systems, at least I sincerely hope it is.
My own experience of programming for the commercial sector is that very often the client just wants something that 99% works.
I had a chat with somebody on the marketing side of the fence, and his attitude was (nice guy, mind you) that programmers are hired to get a working result as soon as possible, as opposed to fiddling around trying to make things perfect.
 
  • Like
Likes symbolipoint
  • #38
THIS really does make sense!
rootone said:
That may be the case for serious science and engineering systems, at least I sincerely hope it is.
My own experience of programming for the commercial sector is that very often the client just wants something that 99% works.
I had a chat with somebody on the marketing side of the fence, and his attitude was (nice guy, mind you) that programmers are hired to get a working result as soon as possible, as opposed to fiddling around trying to make things perfect.

Not to specify any particular industry or field, but some companies focus on shipping products, and others focus on good, reliable development, before shipping products.
 
  • #39
rootone said:
My own experience of programming for the commercial sector is that very often the client just wants something that 99% works.
I had a chat with somebody on the marketing side of the fence, and his attitude was (nice guy, mind you) that programmers are hired to get a working result as soon as possible, as opposed to fiddling around trying to make things perfect.
"Perfect is the enemy of the good."

symbolipoint said:
Not to specify any particular industry or field, but some companies focus on shipping products, and others focus on good, reliable development, before shipping products.
Companies are generally in business to make and sell a product, so there is naturally a tension between shipping the product and making it perfect. If the balance is more on the shipping side at the expense of quality, customers are likely to be unhappy, and opt for a different product. If the balance is more on perfecting the product, the product ships later, possibly losing money for the company.
 
  • Like
Likes harborsparrow and QuantumQuest
  • #40
rootone said:
...programmers are hired to get a working result as soon as possible, as opposed to fiddling around trying to make things perfect.

The expressed attitude, unfortunate in my opinion, is common in the software industry among people who do not themselves have much experience writing code. Also unfortunately, such folks sometimes end up managing those who DO write code.

We can all agree that producing working results in a timely manner is important--but it is not the only valuable thing. Overemphasizing quick results has risks, such as: (1) inadequate performance when usage ramps up, (2) buggy code that is difficult to fix, and (3) code that is difficult to improve, add to, or enhance.

I've had my current job for ten years; before me, there were a half dozen students who stayed for one year, wrote a ton of code "quickly", and then left for a higher-paying job. I estimate that I had to rewrite about 80% of the code which I inherited from those "fast" programmers. My boss never agreed to this, but I understood what he really needed--which was for stuff to work, be stable, and be able to be modified and grown. He never thought I was fast enough, but overall he was satisfied with my efforts. I did for him what was necessary (in this case, rewriting with better designs, better coding standards, and sometimes in a different language, replacing old third-party components that, while working "quickly", were not upgradable over time).

"Fiddling around" is simply NEVER, in my opinion, the way one gets software to be good. Use of appropriate, modular designs, good coding practices, good testing, and adequately understood requirements--in other words, planning and preparation in the early stages before most "results" are available--are IMO often necessary to get decent working software.

Remember the Obamacare website fiasco? That is what one risks from working really fast. The people who replaced it with something that worked also did really fast work--BUT they had the benefit of the requirements and preparation of the first attempt.
 
  • Like
Likes symbolipoint and jim hardy
  • #41
The most important parts of computer programs are the algorithmic parts. Simply programming because you can type commands will not be very useful. The more you understand the underlying theory, the better. Every computer language you use will be based in some way upon computer theory, including its structure, syntax, semantics and functionality. If this sounds too time-consuming, consider that you cannot get far in a casual manner.
 
  • Like
Likes harborsparrow
  • #42
In my view and according to my experience, the only things required for a programmer to keep it all straight are love for the job and organization. So we are basically talking here about good, efficient programmers. This distinction is more necessary nowadays than in the past, because many people have rushed into the programming profession in the last decade or so, motivated by the pay rates and relying on mixing and matching premade things to do the job "fast" and "furiously". Of course, the ever increasing need for more complex and more efficient software - which must at the same time be modular, easily configured, modified and adapted - and the ever increasing abundance of software online in any form created the conditions for this. While it is a good thing for a society to create new job opportunities in the market, it has on the other hand tainted the programming profession and created various consequences, not the least of which are unmaintainable (or hardly maintainable at best) code, efficient programmers who are not allowed to do the job in the proper way, and, unfortunately in many cases, programmers not finding a programming job at all. Although this varies among sectors, fields and countries, I think it is the common denominator globally.

Now, for a decent, efficient programmer, the qualities and skills to keep it all straight - with "all" growing on a day by day basis - develop through application of good principles: a solid background in the basics of programming, perseverance, lots of hours spent, being broad-minded, chasing bigger projects and job challenges, and, very importantly, learning from his / her own mistakes and not repeating them. There is a whole lot of mistakes for a beginner programmer in particular to make, so he / she must, at the very least, not make the same mistake(s) over and over. Within the solid background of the basics there is a whole lot of CS material that a programmer must have under his / her belt, most importantly a good, sufficient knowledge of algorithms and data structures, among many other things. At a practical level, he / she has to be able to work even with modest tools that do the job, and to learn new IDEs, tools, frameworks etc. fast. All of the above come down to a real love for the job. The other part is organization. High-level concepts, methodologies and models - and, at the practical level, code, including templates, self-made libraries / frameworks and any other relevant thing - must be kept organized and mixed and matched appropriately. Programming may not be as much a matter of art as it was in the past, but there is still some art in it, which in my opinion is not so much a "hard-wired" talent as love for the job.
 
  • Like
Likes harborsparrow
  • #43
It's a craft, like many other endeavours of human experience.

The minutiae of keeping what you are doing at the moment straight in your head is all about layers and layers of abstraction. You focus on one small task at a time. Then you package that up in a function/module/etc. and move on to the next thing, forgetting all the details of how the last one worked - you just need to know that it does. A construction worker doesn't need to know the internals of how his electric drill works to use it properly and efficiently.

The meta part of programming - i.e. the stuff that does not involve actual typing of code - is foremost knowing the right tool for the right job, just like a carpenter should know what tool to use for each type of wood so he doesn't damage the material or his equipment. In software it's a bit worse, since our tools change so fast. This necessitates an attitude of being open to learning, plus various other habits that help with working in teams, communication skills, etc.
 
  • #44
I have a set of text files, documents, copies of web pages, and example programs to keep track of algorithms and language-specific and hardware-specific nuances. In the case of a complicated algorithm that is new to a group of programmers, companies usually acquire documentation and/or hire a consultant to teach the algorithm.

There's also the research aspect. For example, a group of companies funds UC San Diego's CMRR (Center for Memory and Recording Research), a relatively large university research group that develops new methods and algorithms related to that part of the industry.
 
  • #45
tstarling said:
You all might think it is a silly question, but I have the impression that one should be able to somehow know the entire string of commands, and so feel overwhelmed and defeated. I never really earned the basics, just eased into computer work while in the healthcare field and have been struggling ever since. There seems to be so much to learn then more added so I switch and get very confused about even where all my files are.
Don't really have another question. Thanks if anyone feels like commenting.

I generally don't focus on details like commands up front; instead, I start by identifying and understanding the problem. Afterwards, I create a plan for the solution. Finally, I worry about those commands. It's pretty much the same concept as found in mathematics.
 
Last edited by a moderator:
  • #46
Greg Bernhardt said:
Even the most experienced programmers have reference books they use. Take time to learn the basics :) Once you have the core programming concepts down you can apply them to any language and the rest is just learning syntax.

Agree - I've got nearly 40 years of programming experience, and I'm at the stage now where I can learn a new language in a few days. It comes to anyone with the right mind and a little practice.
 
  • #47
Before you ever sign up for a programming class, take a course in Logic. That is an essential fundamental that programming is built upon (and for that matter, good written and verbal communications too.)
 
  • Like
Likes symbolipoint
  • #48
Ben Gilliam said:
Before you ever sign up for a programming class, take a course in Logic. That is an essential fundamental that programming is built upon (and for that matter, good written and verbal communications too.)
Beginning programming courses do not seem to have a course on Logic as a prerequisite, but the necessary logic is introduced both in the beginning programming course and in one or more of the prerequisite courses, such as Algebra 1 and Algebra 2. Students (at least some of them) are not accustomed to using that logic.
 
  • Like
Likes QuantumQuest