Experts in compiling C projects often prefer minimalistic tools like vi, make, and gcc due to the perceived bloat and slowness of IDEs. For complex setups, batch files can automate the build process by managing dependencies and environment variables, allowing for efficient compilation from version control. Some users find makefiles cumbersome and may use external tools to generate them, while others appreciate the simplicity of writing their own for smaller projects. The discussion highlights a preference for command-line tools over graphical interfaces, with some users recalling the ease of earlier programming environments. Overall, the choice of tools varies based on project complexity and personal preference, emphasizing a balance between control and efficiency.
#1
Bob Busby
IDEs usually feel bloated and slow, and makefiles are confusing for me to write by hand. I was just wondering what you experts use to organize and compile your projects.
Depends on how complex the project is. Note that Visual Studio includes a DOS console window under Programs / ... Visual Studio ... / Tools, where you can use cl and link (and ml if you want to do assembly programs). If it's just a single module, I just use cl. If it's a few modules, I create a batch file.
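A batch file for the few-modules case might look something like the sketch below; the file names (main.c, util.c, app.exe) are invented, and it assumes you run it from the Visual Studio command prompt so cl and link are on the PATH.
Code:
@echo off
rem sketch of a small build batch file -- main.c and util.c are made-up names
rem run from the Visual Studio command prompt so cl and link are on the PATH
cl /c /W4 main.c
cl /c /W4 util.c
link /OUT:app.exe main.obj util.obj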
For the most complex setup I've used at work, the user can create an empty directory and run a single batch file that will get everything needed from version control and do a build. In some cases that same batch file can be used with an existing directory, only updating anything that is out of date (mostly makefile). The batch file gets the build tools (nmake, ml, cl, link, ...) via version control, sets environment variables, creates the entire directory tree, then switches to a large makefile that gets source files if they are not present or are out of date, then does the actual build. The build tools are included so that old code is built with the same versions of the tools originally used to build it.
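Roughly the shape of such a bootstrap batch file, as a sketch only; the version-control command ("vcs get") and the paths are hypothetical stand-ins for whatever is actually used.
Code:
@echo off
rem sketch only -- "vcs get" and the paths below are hypothetical stand-ins
rem 1. fetch the pinned build tools (nmake, ml, cl, link, ...) from version control
vcs get //depot/buildtools/... tools
rem 2. point the environment at those tools rather than whatever is installed locally
set PATH=%CD%\tools\bin;%PATH%
set INCLUDE=%CD%\tools\include
set LIB=%CD%\tools\lib
rem 3. create the directory tree (see the "nul" trick below), then hand off to the makefile
nmake /f project.mak all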
One trick in the batch file, for real DOS or DOS console windows, to allow it to be used with either empty or existing directories, is to test for the existence of a directory by looking for a file named "nul" in that directory:
Code:
if not exist exampledirectory\nul md exampledirectory
I don't recall being able to duplicate this with a makefile, but once the directories are created, the makefile has no problem checking for missing or out-of-date files.
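For what it's worth, once exampledirectory exists, an ordinary nmake rule of this shape handles the missing/out-of-date check (the file names here are made up):
Code:
# rebuilds the object only if it is missing or older than foo.c
exampledirectory\foo.obj: foo.c
    cl /c /Foexampledirectory\foo.obj foo.c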
In addition to Visual Studio, old copies of Visual C 4.0 (32-bit) and 2.2 (16-bit) are used to work with old projects.
IDEs usually feel bloated and slow, and makefiles are confusing for me to write by hand. I was just wondering what you experts use to organize and compile your projects.
You should understand that many IDEs call separate processes to compile the individual source files and link the resulting modules into an EXE, DLL, or other binary.
Usually the IDE just invokes those processes and pipes their output to an output window of some sort.
If you have a decent computer (not top of the line, but average), this should not be a major issue.
I use vi, make and gcc. But I am an old-timer, wouldn't recommend it to anyone else.
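For that kind of vi/make/gcc workflow, a hand-written makefile can stay very small. A minimal GNU make sketch (the file and target names are invented; recipe lines must start with a tab):
Code:
# minimal makefile for a small C project -- main.c and util.c are placeholder names
CC     := gcc
CFLAGS := -Wall -Wextra -O2
OBJS   := main.o util.o

app: $(OBJS)
	$(CC) $(CFLAGS) -o $@ $(OBJS)

%.o: %.c
	$(CC) $(CFLAGS) -c $< -o $@

.PHONY: clean
clean:
	rm -f app $(OBJS)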
Also an old-timer, so take my comments with a grain of salt.
vi/vim can be fairly magical, particularly when you have magic enabled. On the other hand, vim (who uses vi nowadays?) is a rather dimwitted editor, even with ctags (which don't work quite so well with an OO language where names intentionally overlap). I like the command-line concept in vim versus the popups that tend to pervade IDEs; I find those popups slow, distracting, and disconcerting. Some people justifiably do like the added power of an IDE, and find the modal behaviors of vi, vim, and emacs to be distracting and disconcerting.
Regarding makefiles, I sometimes have to use external tools that write makefiles. (Using an IDE in these situations can be a bit tough, when the external tool and the IDE contend over who owns the makefile.) When I am in control, if the makefile is simple I just write it myself. If the makefile will be of any complexity, I will do one of two things:
Include some master makefile I have written earlier and add the few lines needed to make that master makefile work. I find this approach to be very useful for unit tests; my unit-test makefiles tend to be very small thanks to the include capability (a small sketch follows below).
Or build a Configure script that generates a top-level makefile for the project. That top-level makefile will inevitably contain a makefiles target that invokes some secondary script to build makefiles in the applicable subdirectories, each of which has a makefiles target as well.
The underlying machinery can be a bit convoluted, but usage is simple: Just type make.
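As a sketch of the include approach for unit tests (every name here is made up): the shared rules live in one master makefile, and each test directory's makefile is only a few lines.
Code:
# make/rules.mk -- the master makefile, written once and shared
# (contents shown as comments here; recipe lines must begin with a tab)
#   CC     ?= gcc
#   CFLAGS ?= -Wall -g
#   $(TARGET): $(SRCS:.c=.o)
#   	$(CC) $(CFLAGS) -o $@ $^
#
# makefile in one unit-test directory -- just the few lines that differ
TARGET := test_parser
SRCS   := test_parser.c parser.c
include ../make/rules.mk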
I remember back when I started learning to program with QB and assembler/machine code, you had to do all the compiling and linking (the assembler stuff) by hand, and when I first used Visual Studio 6 on Windows, it was like going from a TV without a remote to one with a remote.
The only thing I've used in a *nix environment was pico. I'm not a Linux user anyway, but at least with pico you could telnet into the environment from a Windows machine and the screen display was all hunky-dory. I remember that when I used emacs over the telnet session, everything went to crap. This was for remote compilation for uni work quite a while back, though.