# How can I stop hating C#?

1. Mar 20, 2016

### SlurrerOfSpeech

Coming from a background of basic C++98 skills and trying to become proficient in C#, I am finding the language to be very tedious to learn.
• For every keyword in C++ there seem to be two related keywords in C#, with only slight differences between them. For example, C++ has const, while C# has both const and readonly, the subtle difference being that readonly can be initialized at runtime (in a constructor). As another example, C++ has &, whereas C# has ref and out, which are subtly differentiated.
• Inferior for writing generalized algorithms. For example, it is impossible to create the equivalent of C++'s std::reverse because there is no natural concept of a bidirectional iterator.
• Stupid syntax for core data structures such as multidimensional arrays. I would like to hear a justification for changing the [ ][ ] used for defining arrays in almost every other language to [ , ] in C#. That's like MATLAB-style indexing, except jagged arrays still use [ ][ ] to access the elements. Confusing!
• Can't define sizes of arrays in parameters to a function. This is downright annoying.
• So much syntactic sugar that can actually cause major errors if you don't understand that it is syntactic sugar.
• LINQ reverses the position of the select clause compared to SQL queries, even though the main reason people use LINQ is to hook it up to SQL databases. The language developers were trolling us with that one.

2. Mar 20, 2016

### FactChecker

Find a subset that is like C++ and gets the job done. Leave the rest for later. You will learn to like the language and only hate the programmers that use those other things. They are crazed heathens. ;>)

PS. I seem to remember that C# was developed after Microsoft lost a lawsuit for hijacking Java. So they made C#. There may be legal reasons that they have to do things differently.

3. Mar 20, 2016

### ChrisVer

I would like to hear a reason why someone would use the syntax [ ][ ] for multidimensional arrays in the first place... I find it confusing... not in programming but in reading it...
[ , ] is much more like what you'd write on a scrap of paper, like $M_{nm}$ or $M[n,m]$, rather than $M[n][m]$. So it is easier to read...
I started learning python with lists etc. rather than objects that come from modules (like numpy arrays)... I find it annoying to write L[i][j] for a multidimensional list, instead of using the L[i,j] of an NxN numpy array.

Last edited by a moderator: Mar 20, 2016
4. Mar 20, 2016

### Ibix

It's because in C you are hiding pointer arithmetic. The array name is actually a pointer to the head of the array and the thing in square brackets is a memory offset - so a[2] is convenient shorthand for *(a+2), which should be read as "the thing that's at the address two higher than the address stored in a", and is much less intuitive. Incidentally, that's why C array indices range from 0 to n-1. A 2d array is an array of 1d arrays, so b[2][3] is shorthand for *(*(b+2)+3). In that sense it's a brilliant notation, and follows the C Way in being extremely flexible, and reflective of what's going on underneath. And a more complex structure would require carrying around extra data about array size, as modern array classes do. But the downside is a certain clunkiness from a "principle of minimum surprise" viewpoint, yes.

5. Mar 20, 2016

### FactChecker

Many would say that the syntax A[m][n] is more systematically defined. A[m] is an array of A's and A[m][n] is an array of A[m]s. The C-based languages parse [m] very systematically.

6. Mar 20, 2016

### Hornbein

You could learn COBOL. C# will then seem heaven-sent.

8. Mar 20, 2016

### Staff: Mentor

Only if you've never programmed in COBOL would you say that. It's true it's quite wordy, but many legacy systems today still use it, and its wordiness and relative simplicity compared to C# or any OO language make it more maintainable for what it does best: processing business data and generating business reports.

9. Apr 7, 2016

### rumborak

IMHO the few quirks of C# are heavily outweighed by it being a very concise and clean language, with the best IDE I know. And that's coming from someone who writes C++ all day. C++ is great when you've finally learned how to *not* use it. Before that it can be downright treacherous. So, don't mistake familiarity for superiority.

10. Apr 7, 2016

### rootone

It is a clean language as you say, but I don't like that it's a 'Microsoft thing' dependent on a Windows environment.
I know there are open source equivalents, but then, why not just use C++?

11. Apr 8, 2016

### rumborak

Have you ever written a GUI in C++? You probably haven't, and there's a reason for that.

12. Apr 8, 2016

### rootone

Me? Well, I have written GUI interfaces based on trustworthy open source GUI libraries: OpenGL and derivatives of it.

13. Apr 15, 2016

### harborsparrow

C# is standardized by ECMA (the ECMA-334 standard) and by ISO/IEC (the ISO/IEC 23270 standard). Microsoft's C# for the .NET Framework is a conforming implementation of both of these standards. An independent implementation of the Common Language Runtime (not developed by Microsoft) is available from the open source Mono Project; it provides software to develop and run .NET applications on Linux, Solaris, Mac OS X, Windows, and Unix.

I've been writing code for over 30 years, and have used literally dozens of programming languages, several extensively. Including a lot of C. And of all of them, I like C# the best, in part also because the (free) IDE from Microsoft is so very good. And it is more platform-independent than Java these days. So I recommend that you give it a chance.

Last edited by a moderator: May 7, 2017
14. Apr 21, 2016

### Staff: Mentor

@SlurrerOfSpeech This post is meant in a positive way, and the message is: 'get out while the getting is good'.

I'll take a different stance. I've worked with UNIX since 1977; generally that meant C, maybe FORTRAN, and shell scripts in traditional systems. It now means SQL, HTML, PHP, python, perl, and environments like Docker, VirtualBox, SANs, zones, Hadoop, etc. All manner of changes.

If you cannot stand change and learning new IDEs and languages, then pick something else to do with your life. Not programming. NOW. You will be miserable otherwise. There have been unbelievable changes since 1977, and the next 40 years will be just as change-filled, if not more. The likelihood is: nobody will pay you much 40 years from now because you learned a 1992 language passably, even though it is a good one. Bail out now! Save yourself grief.

COBOL programmers will always have mid-level careers because of management's aversion to risk. But all they do is fix badly written code from 1983. Do you want to spend the next 40 years undoing what junior programmers did to C++ code in 2001? I am not implying that C++ is at fault here, so I do not want to hear it.

Also, please know, I'm not being funny or sarcastic. I worked for years with a guy who refused to learn the tools and languages we had in our system. I was directed to help him. Did so for 9 years. He balked and never learned squat. His answer was 'I do not read books and manuals. I hate C.' Okay. What anybody else could do in seconds took him many hours using an ancient editor. Well, I just retired a month ago, and I hear he is floundering on a horrible scale: the company can no longer afford to babysit him, he has to produce or leave, but he still has to pay his mortgage. Do not fall into that trap.

IMO, you seem to display the mindset of Mr. Failure above. So find something that you are truly driven to do and learn about. Or be miserable for years. Your choice.