C Programming Nightmare

  1. Mar 11, 2012 #1
    I'm completely lost. I have an exam in C programming in a month and a half based on this:
    http://www.maths.tcd.ie/~odunlain/1261/prog.pdf [Broken]
    Basically I need recommendations of books and/or online resources that follow the flow of those notes closely but offer additional insight, different perspectives, and more of an explanation of what is actually going on. My search so far has turned up nothing; I just haven't a clue.
    Really appreciate any help :cool:
     
    Last edited by a moderator: May 5, 2017
  3. Mar 11, 2012 #2

    AlephZero

    User Avatar
    Science Advisor
    Homework Helper

    Hm... I skimmed through your PDF, and I think you are going to have a hard time finding another set of notes that are as randomly (dis)organized as that.

    If you want a good book on C, get "The C programming language" by Kernighan and Ritchie. It doesn't follow the same "structure" as your course notes, but it will teach you C.
     
  4. Mar 11, 2012 #3
    Ugh. Doesn't that thing have an index somewhere so I can see what you actually have to learn? I hope you don't mind that I'm not very eager to go through the whole document before being able to help you out.
     
  6. Mar 12, 2012 #5
    Wow what a horrible document!

    I skimmed through it, and I do think you will learn what you need to know from a good C book:

    https://www.amazon.com/C-Programmin...3628/ref=sr_1_1?ie=UTF8&qid=1331546349&sr=8-1

    There are some good tutorials that start right at the beginning here too:

    http://www.cprogramming.com/

    There are a few things in your notes that might not be covered on those free websites, such as floating-point representation. There's a nice little tutorial on that here:

    http://steve.hollasch.net/cgindex/coding/ieeefloat.html
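
    If you want to poke at the representation yourself, here is a small sketch that prints the raw bits of a float. It assumes float is a 32-bit IEEE 754 type, which it is on typical PCs:

        #include <stdio.h>
        #include <string.h>
        #include <stdint.h>

        int main(void)
        {
            float f = 0.1f;     /* 0.1 is not exactly representable in binary */
            uint32_t bits;

            memcpy(&bits, &f, sizeof bits);   /* copy the raw bit pattern */

            printf("%f is stored as 0x%08X\n", f, (unsigned) bits);
            printf("sign=%u  exponent=%u  mantissa=0x%06X\n",
                   (unsigned) (bits >> 31),
                   (unsigned) ((bits >> 23) & 0xFF),
                   (unsigned) (bits & 0x7FFFFF));
            return 0;
        }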

    Post up some specific questions if you have them :)

    Have you got a compiler to use (for whatever your platform is - Linux, Windows, etc)?
     
  7. Mar 12, 2012 #6

    rcgldr

    User Avatar
    Homework Helper

    I can't find my copy of this right now, and it's been years since I read it, but I seem to recall it was more of a language reference than a tutorial (or maybe that was the first edition?).

    I have some issues with that prog.pdf file, but most of my issues are with the early part of the document that reviews the history of programming languages.

    machine language - in addition to being used on the earliest computers, in the 1970s small machine-language programs were "toggled" into memory and used to read a larger program from some device on the machine, in order to boot it up.

    assembly language - the example given was generated by a C compiler and not typical of the type of program a human would write. Assembly language is still used for small parts of operating systems that deal with processor specific instructions needed for multi-threading and interrupt handling. High level assembly language (ALC, HLASM) is still used on IBM mainframes for parts of business applications, although there is an effort to convert most of that assembly language to Cobol.

    Fortran - mostly used because there is a large amount of existing code written in Fortran that would take a long time to convert to another language. Because of that, some supercomputer makers have put more effort into optimizing their Fortran compilers, including some processor-specific language extensions. A sort of self-perpetuating cycle.

    Cobol - still used in a lot of business environments, such as banking.

    However all of the non-C language stuff is just background stuff and probably won't get used in your class.

    - - -

    "smallest piece of data in C is a byte" - C supports bit fields in structures.

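    A quick sketch of what bit fields look like (the field names here are made up for the example):

        #include <stdio.h>

        /* a struct whose members are narrower than a byte */
        struct flags {
            unsigned int ready    : 1;   /* 1 bit  */
            unsigned int error    : 1;   /* 1 bit  */
            unsigned int priority : 3;   /* 3 bits, values 0..7 */
        };

        int main(void)
        {
            struct flags f = {1, 0, 5};
            printf("ready=%u error=%u priority=%u\n", f.ready, f.error, f.priority);
            printf("sizeof(struct flags) = %d bytes\n", (int) sizeof(struct flags));
            return 0;
        }
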
    Most current PCs support 64-bit integers and 64-bit pointers if you use a 64-bit compatible operating system.

    Intel processors also support 80 bit (10 byte) floating point numbers in hardware registers, but it's not common to see this data type supported in current C compilers.

    "arrays and pointers ... starting address" - C treats the name of an array as the starting address of an array, but the starting address of an array is not stored in the array (it's stored in a symbol table during compilation, and ends up in the program code, but not in the data for an array).

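    For example (the addresses printed will differ from machine to machine):

        #include <stdio.h>

        int main(void)
        {
            int a[5] = {10, 20, 30, 40, 50};

            /* the array name converts to the address of its first element;
               that address lives in the compiled code, not in the array's data */
            printf("a        = %p\n", (void *) a);
            printf("&a[0]    = %p\n", (void *) &a[0]);   /* same address */
            printf("*(a + 2) = %d\n", *(a + 2));         /* same as a[2]: prints 30 */
            return 0;
        }
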
    "char * argv[] ... array of strings" - argv is an array of pointers to characters. Each pointer points to the first character of a string of characters.

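    A tiny sketch that just echoes its arguments back, to show the layout:

        #include <stdio.h>

        /* argv is an array of pointers to char; each pointer is the start of one
           nul-terminated string, and argv[0] is normally the program name */
        int main(int argc, char *argv[])
        {
            int i;
            for (i = 0; i < argc; i++)
                printf("argv[%d] = %s\n", i, argv[i]);
            return 0;
        }
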
    "operator precedence - &" The binary operators, & ^ | are very low precedence, lower than logical (&&, ||), or comparator (==, !=, ...) operators. Some consider this a poor design decision for the C language as it generally requires parenthesis that wouldn't be needed otherwise.
     
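    A small illustration of that pitfall (the variable is just for the example):

        #include <stdio.h>

        int main(void)
        {
            int flags = 0x04;                /* bit 0x04 is set */

            /* without parentheses this parses as  flags & (0x04 == 0x04),
               i.e. flags & 1, because == binds tighter than & */
            if (flags & 0x04 == 0x04)
                printf("without parentheses: bit looks set\n");
            else
                printf("without parentheses: test fails even though the bit IS set\n");

            /* the intended test needs parentheses */
            if ((flags & 0x04) == 0x04)
                printf("with parentheses: bit is set\n");

            return 0;
        }
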
    Last edited: Mar 12, 2012
  8. Mar 12, 2012 #7

    AlephZero

    User Avatar
    Science Advisor
    Homework Helper

    ... unless he's the sort of lecturer who tests that you have learned everything in the notes verbatim, including all the mistakes :devil:

    Incidentally, the "fortran program" in the introduction is not written in any version of fortran that I've seen or used in the past several decades. It certainly doesn't comply with any of the standards like Fortran II, IV, 77, 90, 95, etc.
     
  9. Mar 12, 2012 #8
    I tried to go through the document one more time, but it truly is one of the most terrible things I've ever read. If I were you, and I wanted to keep as close to the flow (for lack of a better term) of his notes as possible, I would simply look things up on the internet for every heading.

    For example, he first talks about 'programming languages'. I would then search for programming languages on the internet to get a general idea of what they're for, what programming languages are widely used, etc. (Although this might not be a good example, as you can most likely skip the first four or so chapters.)

    I often used this method when I had to use an unintelligible book.
     
  10. Mar 12, 2012 #9
    Thanks a lot guys; as you can imagine, most of us haven't a clue what's going on with C.
    My main problem with C is that I haven't a clue what's going on. In other words, it's like being given an integral table and being told that whenever you come across any derivative/integral/power series etc. in physics, refer to this table - no need to learn what's going on from the fundamentals...
    I think it's really a perfect analogy. I tried to learn physics only to be forced to learn mathematical methods for dealing with physical concepts; then learning this maths forces you to learn calculus, which forces you to learn real analysis, which forces you to learn axiomatic set theory (it did for me anyway).
    I hardly want to go that deep into computing, but I don't even conceptually know what's going on. What's a compiler? I would have thought that having some program like a compiler is cheating, in that programming is supposed to design those things? :uhh:
    If not, why not? If C is something that uses pre-existing programs like compilers, then what is C, and where do the other things come from? What are their limits? What are C's limits?
    The main problem here is that I can't even adequately phrase the issues I have with C. All I know is that I need to understand a computer in stages from the moment the power button is pressed, and I need to understand where C sits in this hierarchy, with say Windows/Ubuntu at the top and a computer with power and nothing else at the bottom :redface:
    Hopefully these aren't too stupid as questions, but this is how badly I understand C compared to the rest of mathematics...
     
  11. Mar 12, 2012 #10
    Wow, wow, wow. Just stop there for a moment, my friend. You're definitely overanalyzing this. What you need to learn for this course is the C programming language; compilers and linkers are the tools you will be using to convert the code you write into something your computer can understand and use. Thus, using a compiler+linker is no more 'cheating' than using a calculator is cheating when doing a physics test: the point is not to find out how calculators work, but to do physics. Likewise, the point of this course is to learn C, not how the internals of a computer work.
     
  12. Mar 12, 2012 #11

    Mark44

    Staff: Mentor

    That's my sense, as well (on lang ref vs. tutorial). I have the 2nd and 3rd editions, and they have some exercises, but K & R is a terse, bare-bones presentation of C. There are many other books out there that go into much more detail, although most cover C++ these days.

    Another good reference that isn't as well known as K & R is "C: A Reference Manual," by Harbison and Steele.
     
  13. Mar 12, 2012 #12

    rcgldr

    User Avatar
    Homework Helper

    Most people can use a calculator without understanding its inner workings. C and a computer are similar, except that you have a programmable calculator without an interactive mode. For some, it might be easier to start off with classic BASIC or (modern) Python, since these have an interactive mode - but spend only a day or so on that, to get a sense of programming rather than learning yet another language.

    You'd want to start off with the simplest of programs, which is what most tutorial books and online websites do. Using a source-level debugger will help quite a bit, as it shows what is happening step by step. If you're curious about how the computer works, source-level debuggers usually include an assembly window where you can follow the machine language step by step.

    The class document mentions gcc (a particular compiler), so are you learning this on a Unix type system as opposed to Windows?
     
  14. Mar 14, 2012 #13
    Thanks for the replies - I think we've already made progress:
    If C is just a language, then I guess you could distil my problems into the more general question of where a language fits into the overall scheme of a computer, if thought of in stages from a computer with power and a circuit board up to a computer running Windows/Ubuntu.

    If compilers and linkers are a way to convert code into something a computer can understand, what is a compiler interacting with to make things work?

    As for the gcc question, I feel it's completely useless to begin writing programs until I first know what's going on, but once I get there I can use either Windows (my laptop) or Unix (college). I've written programs in college, but it was all complete nonsense to me.

    Just as with a calculator there is a certain amount of mystery when you don't understand a Taylor series or approximation techniques, and with understanding you can go from fearful reliance on a calculator to viewing it as merely a way of speeding things up, so too do I need to sort out all the nonsense in my head about what's going on before I go off learning C - it's the background stuff I need to establish first. It really feels like trying to learn Ancient Greek without knowing what a language is.
     
  15. Mar 15, 2012 #14

    Mark44

    Staff: Mentor

    The compiler translates your code into object code - machine code that the computer can process. Another step in the process of producing an executable program is linking, in which calls to library functions (e.g., printf, scanf, malloc, pow, and so on) get matched (or linked) to the actual code in the libraries, and the library code is written into the executable. This is somewhat dated, as many programs these days don't actually include the library code; instead of using static libraries, they use dynamic link libraries (DLLs). This is certainly the way it works in Windows programming, and there might be a counterpart in Unix/Linux programming.

    When the code runs, it interacts with the computer's operating system.
    I would advise jumping in sooner rather than later. You don't really need a deep understanding of how the computer works to be able to write code - just a fair understanding of some basic ideas of input and output, and of flow control using if ... else if ... branching and for/while loops.
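
    For what it's worth, here's a minimal sketch of a program using just those basics. The gcc command in the comment is only an assumption - your course may use different options:

        /* sum.c - input/output, an if/else branch and a for loop.
           Typically built with something like:  gcc -o sum sum.c  */

        #include <stdio.h>

        int main(void)
        {
            int n, i, sum = 0;

            printf("How many numbers to add? ");
            if (scanf("%d", &n) != 1 || n <= 0) {
                printf("Please enter a positive integer.\n");
                return 1;
            }

            for (i = 1; i <= n; i++)    /* add 1 + 2 + ... + n */
                sum += i;

            printf("1 + 2 + ... + %d = %d\n", n, sum);
            return 0;
        }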
     
  16. Mar 15, 2012 #15

    AlephZero

    User Avatar
    Science Advisor
    Homework Helper

    Well, you learned your native language without knowing what "language" was (and even before you could read, write, or speak!)

    The main reason for programming computers in any high-level language is so that you don't need to know how they work. (But possibly the person who wrote the first sections of your course notes doesn't realize this.) All you need is some basic ideas like
    * there is a memory, and you can access data in memory by inventing names for parts of it and defining what sort of data each name refers to. That's what statements like "int i;" or "double xyz[3];" do. The programming language knows about some basic data types like integers, real numbers, and character strings. You can also define your own data types to represent more complicated "things" (but you don't need to know about that to get started).
    * You can do calculations on the data by writing expressions that look similar to math notation, for example "x = (-b + sqrt(b*b - 4*a*c)) / (2*a);"
    * You can control "what happens next" with constructs like "if" statements, and make sections of the program repeat (loop) with "do while(...)", "for ...", etc.
    * To store information permanently there are "files" which you can "open", "close", "read" and "write". In C (and many other languages), the computer keyboard and display screen are just special types of file (though trying to "write" to the keyboard or "read" from the screen isn't likely to do anything useful).

    That's probably enough knowledge about "how computers work" to get started on programming. The best way to learn it is by doing it, not reading about it.
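
    A minimal sketch pulling those four ideas together (the coefficients and the file name are just for illustration; on Linux you would typically add -lm when compiling because of sqrt):

        #include <stdio.h>
        #include <math.h>

        int main(void)
        {
            /* memory: invent names for the data and say what type it is */
            double a = 1.0, b = -3.0, c = 2.0;
            double xyz[2];
            double disc;
            FILE *out;
            int i;

            /* a calculation written much like the maths */
            disc = b*b - 4*a*c;

            /* control flow: only take the square root if it exists */
            if (disc >= 0) {
                xyz[0] = (-b + sqrt(disc)) / (2*a);
                xyz[1] = (-b - sqrt(disc)) / (2*a);

                /* files: open, write, close to store the result permanently */
                out = fopen("roots.txt", "w");
                if (out != NULL) {
                    for (i = 0; i < 2; i++)          /* loop over both roots */
                        fprintf(out, "root %d = %f\n", i, xyz[i]);
                    fclose(out);
                }
                printf("roots written to roots.txt\n");
            } else {
                printf("no real roots\n");
            }
            return 0;
        }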
     
  17. Mar 15, 2012 #16
    This is not entirely true. The object file created by a compiler doesn't actually contain machine code. Rather, it contains symbols (defined, undefined and local - but let's not get too deep into detail here). The linker combines all of this into a single unified executable file and resolves all the symbols. DLLs allow dynamic linking, which means there are still undefined symbols in the executable that get resolved when the program runs, instead of in the linking process itself. It's true that dynamic linking is used more and more often, but there's still linking involved in the actual creation of said executable. (If that's what you meant, then I'm sorry. Although maybe this clarifies things a bit for others. :smile:)

    And yes, there's something similar to DLLs in Linux (although how it's used is defined less sharply, and I suspect non-ELF executables might use a different format). These are called .so files, which stands, if I remember correctly, for 'shared object'.
     
  18. Mar 15, 2012 #17

    Mark44

    Staff: Mentor

    I disagree. The object file has to contain at least some machine code from statements in the source code such as assignment statements, loops, etc. You're right about the object code also containing placeholders for symbols that aren't defined in the source code (such as library functions and the like), but I too didn't want to get too deep into the explanation.
     
  19. Mar 15, 2012 #18
    I stand corrected. I thought that because the code still has to go through a linker, it would not contain any machine code yet (only optimized C code or assembly, for example). Just looked it up, and I was wrong about that. :blushing:
     
  20. May 6, 2012 #19
    Hi everyone. I'm giving C a shot again and I need to know everything about the basic arithmetic of the subject, but I've been having a lot of trouble getting to grips with it, if you could spare a few minutes.

    Here are a bunch of questions I need to be able to answer before actually getting to hello world:
    The reason I can't answer them yet is because I haven't fully understood everything.

    As I understand it, everything falls out of the following information I've gathered:

    This is the first bit of structure I've been able to pin down so hopefully I can develop the subject along these lines, however from my notes:
    He seems to be mixing everything together as a big glob of facts, and using different names too... Is there a way to make sense of everything I've written along with this quote? Better yet, am I on the right lines, and can I make sense of the questions in the opening quote by continuing down these lines? [links appreciated]

    Even better, if you have the patience, would be to think along the structured lines I'm trying to develop and to see how my approach can be used to deal with the material in pages 7 to 17 of http://www.maths.tcd.ie/~odunlain/1261/prog.pdf [Broken].

    Thanks :cool:
     
    Last edited by a moderator: May 6, 2017
  21. May 6, 2012 #20

    Borek

    User Avatar

    Staff: Mentor

    This is reversed - int is, and always has been, larger than char. But even after switching char and int in the quote, it's still not entirely correct.

    The trick is, the size of an int is architecture- and compiler-dependent. When processors were 16-bit (the days of the 8086), int was usually 16 bits and long int was 32 bits. When processors became 32-bit, int was 32 bits and long int was 64 bits (unless you were still working in some kind of 16-bit DOS box or something like that, or unless long int was assumed to be 32 bits as well by whoever designed the compiler). On today's architectures int can be 64 bits, so long int would be 128 bits - or not.

    I have a feeling I remember reading about machines with other odd sizes of int (like 12 bits), which doesn't make things easier. My suggestion is to follow the information given by your prof, as he most likely refers to the system you will be working on. Note he states "in our system it appears to be" - which is a subtle reference to the mess I described.
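
    If you want to see what your particular compiler and machine use, a quick sizeof check will tell you (the output varies from system to system):

        #include <stdio.h>
        #include <limits.h>

        int main(void)
        {
            /* sizes depend on compiler and architecture; C only guarantees
               the ordering char <= short <= int <= long <= long long */
            printf("char      : %d bytes\n", (int) sizeof(char));
            printf("short     : %d bytes\n", (int) sizeof(short));
            printf("int       : %d bytes\n", (int) sizeof(int));
            printf("long      : %d bytes\n", (int) sizeof(long));
            printf("long long : %d bytes\n", (int) sizeof(long long));
            printf("INT_MAX   = %d\n", INT_MAX);
            return 0;
        }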
     