Open question - How do you mentally process commands/data when coding....

In summary: people tend to think in the languages they know well. They may use pseudo code or a similar method to sketch the steps they need to take, usually working sequentially from the most basic concepts to the more complex ones. When writing code, they often start with the most basic parts and work their way up. But on a trivial project, or one where the best solution isn't the goal, the same cross-language habits can lead to mistakes.
  • #1
hs100e
I'll start off by saying that this isn't really a question about how to code, but about how you code personally. If there's a more appropriate forum for me to post in just let me know!

A little exposition: I'm an illustration/ISTA major doing a project for a class that requires interviewing people about my topic. Given my interests, I wanted to do a project about how people 'think' when coding (specifically Piet, but I'm not sure how many people use it and I need at least 10 answers by Monday).

I work as a TA for a Python class and a lot of students will try using operators or syntax that work for other languages, but not Python. When I show them the problem, they sometimes say something along the lines of 'sorry, I was thinking in <some language>', which I find interesting. I started thinking about how I think when I code, specifically in Piet. For example, I usually think of concepts in English, think of how I'd make those concepts work in Python, and then think of how I'd make those Python operations work in Piet. It's really a process... So, anyways, here's the question:
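(To make the kind of slip I mean concrete, here's an illustrative Python snippet; the specific mistakes are hypothetical examples, not quotes from actual students:)

count = 0
# count++              # SyntaxError: Python has no ++ increment operator
count += 1             # the Python way

a, b = True, False
# if (a && b) { ... }  # SyntaxError: Python has no && and no braces
if a and b:
    print("both")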

TL;DR: How do you process your data when writing in Piet (or any language, really)? Do you have a language you learned first that you 'think' in? Do you like writing down your ideas with pseudo-code or something similar first? Do you jump straight into working with the language? Similarly, how do you break down code when viewing it?

You can be as brief or as in-depth as you like; I'm not afraid to read paragraphs! You don't have to include age or gender or anything unless you think it might be relevant. Huge thanks to anyone who answers!
 
  • #2
I taught myself BASIC on a BBC Micro in about 1984 and still think of 'For...Next' loops in those terms. My work for the last 18 years has been supporting ERP systems (HR, payroll, financials, inventory, etc.) on an IBM server. The code is generally written in very old languages - RPG dating from the 1980s and COBOL from even earlier. Often the easiest way to perform an update or write a query is to use SQL, but this can rarely be integrated into the existing software, so I tend to do a 'proof of concept' in SQL, then work backwards to convert this to procedural code:
"Insert Into DateOfHireList (EmpNum, EmpName, HireDate) Select EmployeeNumber, EmployeeName, DateOfHire From EmployeeMaster" copies a full employee listing into the target file. In the older style of coding this has to be converted to a line-by-line process:
[Read first line of data]
Read EmployeeMaster
[Perform loop until end of file]
DoUntil EndOfFile EmployeeMaster
    [Move data into target fields]
    Move EmployeeMaster.EmployeeNumber into DateOfHireList.EmpNum
    Move EmployeeMaster.EmployeeName into DateOfHireList.EmpName
    Move EmployeeMaster.DateOfHire into DateOfHireList.HireDate
    [Write data into output file]
    Write DateOfHireList
    [Read next data line]
    Read EmployeeMaster
EndDo
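
For anyone who wants to try the same conversion outside an IBM shop, here is a rough Python/SQLite sketch of the two styles side by side (the table layout is improvised from the example above, purely for illustration):

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE EmployeeMaster (EmployeeNumber, EmployeeName, DateOfHire);
    CREATE TABLE DateOfHireList (EmpNum, EmpName, HireDate);
    INSERT INTO EmployeeMaster VALUES (1, 'Ada', '2001-03-14');
""")

# Set-based proof of concept: one statement does the whole job.
conn.execute("""
    INSERT INTO DateOfHireList (EmpNum, EmpName, HireDate)
    SELECT EmployeeNumber, EmployeeName, DateOfHire FROM EmployeeMaster
""")

# The procedural equivalent: read a row, move each field, write, repeat.
for number, name, hired in conn.execute(
        "SELECT EmployeeNumber, EmployeeName, DateOfHire FROM EmployeeMaster"):
    conn.execute("INSERT INTO DateOfHireList VALUES (?, ?, ?)",
                 (number, name, hired))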

I follow a similar kind of process for writing new programs: work out the key steps required, produce a mock-up, write a skeleton program with just comments, build the key logic and loops and then fill in the detail. For simple programs I normally cheat - find a program which performs a similar function, copy the code and butcher it until it works :wink:.
 
  • #3
It depends on how long you've coded in a given language. When I started out I knew GE Basic as my first language and later used Fortran and COBOL. It was easy to think in those languages, although I think I thought mostly in a mashup of Basic, Fortran and pseudo code. Pseudo code mapped easily to COBOL, as COBOL was very wordy, a language often described as self-documenting.

For a while, I thought in C/C++ and used AWK a lot for miscellaneous projects. All three are C-based at the most basic level, with AWK and C++ having some more exotic features like regexes (AWK), inheritance features and the dreaded STL (C++). For those features I used pseudo code to get by not being able to easily recall them.

Later, I migrated to Java and, after years of using it, think primarily in Java. However, there are times when I forget that some operator in one language isn't in another. My most recent confusion was over the ^ operator, which in some languages means power whereas in Java it means bitwise exclusive-OR, so when I should have written Math.pow(x,2) for x-squared I wrote x^2. Java didn't complain, but my code didn't work as expected either.
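
Python happens to share the same trap, for what it's worth; a quick sketch of the pitfall (the variable name is arbitrary):

x = 3
print(x ^ 2)    # 1 -> bitwise XOR of 3 and 2, not "x squared"
print(x ** 2)   # 9 -> the actual square in Python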

There are a few other operators that get abused too, like | vs || or & vs && for bitwise vs logical operators in Java. If you look at lots of code in a big project you might spot incorrect usage that somehow worked okay for the cases tested and never got discovered.
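
Python draws the same distinction with different spellings; a small sketch:

a, b = True, False
print(a and b)   # False - logical AND, stops at the first falsy operand
print(a & b)     # False here too, but '&' is bitwise and never short-circuits
print(5 & 3)     # 1 - bitwise AND of the integer bit patterns
print(5 and 3)   # 3 - logical 'and' returns the last operand evaluated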

The worst part about operators is that some languages allow for custom operators, where they can be redefined to mean something entirely different from the default. As a designer you are supposed to keep to the default programmer notion of the operator, but that doesn't always happen. C++, Groovy and Scala, as examples, allow this with all or some operators. In C++, >> and << for cin and cout sometimes confuse folks who expect them to be right/left shift operators.
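
Python allows the same kind of redefinition through its 'dunder' methods; a deliberately bad toy class (entirely made up) shows the hazard:

class Celsius:
    def __init__(self, degrees):
        self.degrees = degrees
    def __add__(self, other):
        # Surprise: '+' has been redefined to average instead of add.
        return Celsius((self.degrees + other.degrees) / 2)

print((Celsius(10) + Celsius(30)).degrees)  # 20.0, not the 40 a reader expects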

There are also some languages which demand a paradigm switch, a different way of thinking, where at first only thinking in pseudo code will help you get by, like Assembler, APL, Prolog, Lisp or Forth, but once experienced enough you will start to think and write directly in those languages too.

For me now, I write in a mashup of Java, Awk, Python and pseudo code (Esperanto-code?).
 
  • #4
I tend to write data-driven programs when I am not coding device drivers. Example from communications:
  1. Driver level: Acknowledge that a data packet has arrived. Check the status; if there is an error, delete the packet and increase the error counter. Otherwise assign a new buffer to the hardware, put the address of the data packet into a queue, touch the packet level semaphore and return.
  2. Data link level: Check if there really is a new packet available. If so, trip the packet level semaphore, find out what type of packet it is and dispatch it to the next level handler, touching the appropriate semaphore. Do it over until you hang on the packet level semaphore.
  3. Routing level: Trip the routing level semaphore. Check if the packet should be handled locally. If so, find out which type of transport level handler should be used and dispatch the packet to the transport level handler, touching the appropriate semaphore. Otherwise check if we are supposed to do routing. If so, dispatch the packet to the routing handler, touching the appropriate semaphore. Do it over until you hang on the routing level semaphore.
  4. And so on...
Thus, through the whole program complex, the data contains the information on what to do (except for the driver level where the data to be acted upon is in the hardware interface).
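
A minimal user-space sketch of that data-driven dispatch in Python (queue, thread and packet names are all invented for illustration; the real thing lives at driver level, not in user space):

import queue
import threading

packet_queue = queue.Queue()  # stands in for the packet-level semaphore + queue

# The data itself says what to do: dispatch on the packet's type field.
handlers = {
    "transport": lambda pkt: print("transport level:", pkt["payload"]),
    "routing":   lambda pkt: print("routing level:", pkt["payload"]),
}

def data_link_level():
    while True:
        pkt = packet_queue.get()      # hang here until a packet arrives
        if pkt is None:               # sentinel value: shut down
            break
        handlers[pkt["type"]](pkt)    # the packet selects its own handler

worker = threading.Thread(target=data_link_level)
worker.start()
packet_queue.put({"type": "transport", "payload": b"hello"})
packet_queue.put({"type": "routing", "payload": b"route me"})
packet_queue.put(None)
worker.join()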
 
  • #5
The whole thing depends on the experience one has as a coder/developer, as jedishrfu points out, and on what you write code for. At a student level, programs are relatively small and usually there is not much working experience in coding. Now, in my opinion, high-level helper tools like pseudocode, flow charts (not used much anymore), UML diagrams for OO programming and every other such tool are of great help for a student, as they let students think and write their ideas at an abstract level and form a basis for further refinement and finally implementation. Delving directly into coding a specific problem requires a lot of experience in the language(s) used, and for any serious project it is a no-go, even for seasoned programmers.

Personally, I developed the habit of using the tools I mentioned as I was learning Pascal, some decades ago. I also paid much attention to developing an algorithmic way of thinking and a good programming style. This helped me not only become better at coding but also develop a much broader sense of programming in general. Truth is that C helped me a lot more, as its concise style and its speed in many respects allowed me to write more involved code in less time.

As someone progresses to writing more and more involved code in any programming language, they get in the habit of using the available language constructs and other contributed code in that language's ecosystem extensively, and their concern becomes how to organize big chunks of code in more abstract ways. One good example is thinking in patterns, which, although heavily criticized by many programmers, are the way to go in a vast number of heavyweight projects. I've used them in Java desktop and Android programming extensively and I've never regretted it. I also use them for the web beyond the usual MVC pattern. I won't go further into agile development, TDD etc. as this is out of the scope of the OP.
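
To make "thinking in patterns" concrete, here is a minimal Strategy pattern sketch in Python (all names are invented for illustration):

# The sorting policy is a pluggable object, so the surrounding code is
# organized around an abstraction instead of a growing if-chain.
class ByPrice:
    def key(self, item):
        return item["price"]

class ByName:
    def key(self, item):
        return item["name"]

def display(items, strategy):
    for item in sorted(items, key=strategy.key):
        print(item["name"], item["price"])

catalog = [{"name": "mouse", "price": 25}, {"name": "keyboard", "price": 70}]
display(catalog, ByPrice())  # swap in ByName() without touching display()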

In breaking down code I usually start from the "main" function of the program (talking about the C, C++ and Java I use). I look at what functions (or methods on an object) are called and I examine each one and their interdependencies. Of course, I'm talking about understanding a working program, not about debugging some code, where the experience of the programmer and the right tools are in charge.

Now, handling data the right way is an integral part of the algorithmic thinking I previously mentioned, as a good algorithm can become useless without the right data structures, or not very efficient at best. And this is also why good design of a program plays a major role. Efficient algorithms plus appropriate and efficient data structures give good, maintainable programs. This is something that a good student must develop as well.

Every programmer, more or less, has a bias toward some programming language or at least toward a programming paradigm. When it comes to thinking about solving a specific problem, this bias usually has a heavy influence on the way he/she thinks about the problem. I usually tend to think in objects (Java bias). Having a broad sense of problem solving helps to balance this bias with the real needs of the problem at hand. There are cases where thinking in a functional way is much more natural and/or appropriate (the fuel of the ongoing functional vs. OO war), and vice versa.

Last but not least, not all programming languages are good for every kind and size of data. A programmer often has to take pains to seek out and learn the right one.
 
  • #6
hs100e said:
I work as a TA for a Python class and a lot of students will try using operators or syntax that work for other languages, but not python. When I show them the problem, they sometimes say something along the lines of 'sorry I was thinking in <some language>' which I find interesting.
It would help to include some examples; they don't have to be complete programs, just what the differences were with the operators. I don't know Python, but I'm able to read some simple programs. One difference I noticed is how Python handles negative indexing. In languages like assembly or C that support pointers (most commonly the stack pointer in assembly), you can set a pointer to the middle of an array and use -/+ indexes (or offsets) to access members before/after the pointer. I don't like how Python uses the term list to refer to what are called arrays or vectors in other languages. List usually implies linked list in most languages.
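
For comparison, a quick sketch of what Python's negative indexing actually does (list contents are arbitrary):

a = [10, 20, 30, 40]
print(a[-1])    # 40: negative indexes count back from the end of the list
print(a[1:-1])  # [20, 30]: slices accept them too

# In C, by contrast, a negative index is a plain pointer offset:
#   int *p = &arr[2];
#   p[-1];  /* the element just before p, i.e. arr[1] */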

hs100e said:
How do you process your data when writing in Piet (or any language, really). Do you have a language you learned first that you 'think' in?
For myself and most experienced programmers, you think about the project that you're working on, rather than the language. Even in the case of OOP, once you get to the actual methods, you're back to procedural programming, the same as in most languages. There are some language-based exceptions.

RPG and the plug board programming it was designed to somewhat emulate have an associative aspect. The classic example is a program that reads cards and generates reports (thus the name RPG, Report Program Generator). Much of this simply describes input fields and output fields, linking the input fields to output fields for formatting, but the order of operations is not specified. For the plugboards, you literally wired input fields to output fields and/or linked them to sub-totaling and totaling accumulation type fields. COBOL's move corresponding achieves the same thing, moving and formatting data by name between the equivalent of an input and an output structure. There's no ordering of these operations, and ideally, they could all be performed in parallel.
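
A rough Python analogue of that order-free, name-based field mapping (the record layouts are made up):

# Like MOVE CORRESPONDING: copy every field whose name appears in both
# records; nothing about the copies implies an ordering.
input_record = {"EmployeeNumber": 42, "EmployeeName": "Ada", "Dept": "R&D"}
output_fields = ("EmployeeNumber", "EmployeeName", "HireDate")

output_record = {f: input_record[f] for f in output_fields if f in input_record}
print(output_record)  # {'EmployeeNumber': 42, 'EmployeeName': 'Ada'}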

APL includes operators that, in addition to working with scalars, work with vectors and/or multi-dimensional arrays (called tensors). For example, to compare two vectors of the same length for equality, the syntax is A^.=B, an inner product, where the first step generates a vector of 0s and 1s based on A = B (the right operator in this case), which is then reduced with binary AND (the left ^) to produce a 1 if they're equal, 0 if not. A+.xB is how matrices are multiplied, but it works even when A and B have more than two dimensions; the requirement is that the last dimension of A has to be the same as the first dimension of B. A lot of simple APL programs don't involve loops.
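
The rough NumPy equivalents, for readers more at home in Python (a sketch, not exact APL semantics):

import numpy as np

A = np.array([1, 2, 3])
B = np.array([1, 2, 3])

# APL A^.=B, roughly: elementwise equality, then an AND-reduction.
print(np.logical_and.reduce(A == B))  # True

# APL A+.xB, roughly: elementwise multiply, then a sum-reduction.
M = np.arange(6).reshape(2, 3)
N = np.arange(12).reshape(3, 4)
print(M @ N)  # 2x4 matrix product; np.tensordot generalizes to more dimensions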

The application/environment is also a factor. High speed input data collection may require multi-threaded output to multiple devices for later post-processing. Event-driven/multi-threaded applications are different from typical programs. The application may be mathematical, like curve fitting, error correction code, servo control, differential equations, ... .
 
  • #8
I think most of us alternate between languages so often we tend to think in more abstract, general concepts than language-specific directives. I don't think of adding two strings together as using "+" in C++, or "." in PHP, or "CONCAT" in MySQL; I just think of the strings joining.
 
  • #9
Hey hs100e.

I've learned to organize information in my head by writing lots of code and working in lots of languages, solving a variety of problems in them.

It's much like learning to read and write - after a while the mind organizes things for you and it becomes intuitive.

If you can see the memory in your head [you get this after you do a lot of debugging] and you can organize the code in your head, then you get better the more you practice the art of writing code, reading code, and debugging code.
 
  • #10
Personally, I think mentally in terms of algorithms (the way the code should flow), and then try to implement them using the language I'm using (from my experience all high-level languages are close enough and have similar implementations of concepts like conditional operators). I don't write down the algorithms, but I test the code later for two things: results and efficiency. If either is wrong, then I revise the algorithm to make it run reliably and efficiently.
 
  • #11
I am a completely visual person. I do not think about the code itself until I have everything about the code defined (inputs, outputs, formulas to be used). Then I try to make it more modular. There are parts of this where the language has limitations, so an understanding of those is necessary to an extent. Then I use pseudo code. Then I write actual code.
 
  • #12
Another scheme I've used is a top-down approach where I write the main method, decide on command line arguments, property files... and then write comment lines for each task the program needs to do.

I then replace each comment line with a method call passing in parameters, and write a method to support it, repeating the comment-line scheme for each task the method should do. This allows me to have a testable program after each iteration that contains an ever more detailed outline of what it's supposed to do.

Eventually I have a completed program that has been tested to a unit-test level, and I can then begin the arduous task of trying out all conditions and options, fixing and rebuilding as needed.
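
A toy Python sketch of one iteration of that scheme (task names and stubs are invented for illustration):

import sys

def load_config(path):
    # TODO: parse the property file; stubbed out so the outline stays runnable
    return {"verbose": True}

def process(records, config):
    # TODO: the real work; for now the outline itself is the program
    return [r.upper() for r in records]

def main(argv):
    # 1. decide on command line arguments and property files
    config = load_config(argv[1] if len(argv) > 1 else "app.properties")
    # 2. read input   <- next iteration: replace this comment with read_input()
    records = ["alpha", "beta"]
    # 3. process each record
    print(process(records, config))

if __name__ == "__main__":
    main(sys.argv)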
 

1. What is the purpose of mentally processing commands/data when coding?

The purpose of mentally processing commands/data when coding is to understand and interpret the instructions and information that are needed to complete a task or solve a problem. It is an essential part of the coding process as it allows the programmer to effectively translate their ideas into functioning code.

2. How do you mentally process commands/data when coding?

Mentally processing commands/data when coding involves breaking down the problem or task into smaller, more manageable steps. This can include identifying variables, determining the logic and algorithms needed, and considering potential errors or edge cases. It also involves visualizing the code and predicting its output to ensure it aligns with the desired outcome.

3. How does mental processing of commands/data differ between programming languages?

The mental processing of commands/data may differ between programming languages due to variations in syntax, structure, and built-in functions. For example, some languages may require more complex algorithms or use different data types, which can impact how a programmer mentally processes the commands and data. However, the overall process of breaking down the problem and visualizing the code remains the same.

4. How do you handle errors or bugs while mentally processing commands/data?

Handling errors or bugs while mentally processing commands/data involves identifying potential issues and anticipating how the code may behave in different scenarios. This can include using debugging tools, testing the code with different inputs, and making adjustments to the logic to resolve any errors. It is essential to have a systematic approach to troubleshooting in order to effectively handle errors and bugs.

5. Is mental processing of commands/data a skill that can be learned or improved?

Yes, mental processing of commands/data is a skill that can be learned and improved through practice and experience. As with any skill, it takes time and effort to develop, but with consistent practice, a programmer can become more efficient and effective in mentally processing commands and data. Additionally, learning new programming languages and techniques can also improve one's mental processing abilities.
