Learn C++ with Bjarne Stroustrup's "Programming Principles and Practice" Book

  • Thread starter: Hacker Jack
AI Thread Summary
Bjarne Stroustrup's "Programming Principles and Practice" serves as a foundational text for beginners learning C++, though its complexity can be challenging due to its advanced language and writing style. Users often find that transitioning from procedural programming in C to object-oriented programming in C++ is difficult, particularly with concepts like manual memory management and the intricacies of the language. Many learners express a desire for more modern, accessible resources, with some suggesting alternatives like Python for beginners. Practical experience through coding is emphasized as essential for truly understanding C++, as theoretical knowledge from books alone may not suffice. Overall, while C++ is recognized as a powerful language, its steep learning curve can lead some to reconsider their choice of first programming language.
Hacker Jack
TL;DR Summary
How was the learning experience for you when you started out with C++? Was it hard or easy?
I'm using the book "Programming: Principles and Practice Using C++" by Bjarne Stroustrup, who is the creator of C++. It is an introduction to programming for the complete beginner. I am up to chapter 3, still learning the basics and trying to complete all the exercises.

Using this book requires better English than I have, so I'm going over sentences all the time and googling words, which I guess is part of the learning process. I also have to get used to understanding what the writer intends with his writing style... I would like it if there were more modern introductions to C++/programming out there, but I couldn't find any good ones with a Google search or by asking around. Another reason I chose a book over a YouTube tutorial or something similar is that a book is more in depth and explains more, instead of me having to keep pausing and googling. And last but not least, it is from the creator of C++, who knows the intricacies of the language.

So I'm wondering how you guys learned C++. Lots of people say it's one of the hardest languages to learn, and that other languages are easier once you've learned C++. So yeah, how was learning C++ for you, easy or hard?
 
I started with a base in C. OO became a trend in the '80s, and C++ was first implemented as a frontend (Cfront) that translated C++ to C. Later, it got its own compilers.

As a frontend, it converted C++ to C, so when certain errors occurred you could look at the generated C code and figure out what you did wrong in the C++. As a native compiler, however, it was often more difficult to understand why something failed.

It was hard to change from procedural C to OO C++ at first but you get used to it. Designing general purpose classes is harder than application coding.

I stopped using it when the standard template library was introduced and jumped to Java instead.

Multiple inheritance didn't always work as expected, and many projects I was on would insist on single inheritance with mixins. Mixins were classes that provided some useful methods for convenience; serialization and stack support come to mind.
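For illustration, here's a minimal sketch of that single-inheritance-plus-mixin convention; the class names are hypothetical, and a real mixin would carry real serialization logic.

Code:
#include <iostream>
#include <string>

// Hypothetical mixin: adds a serialization method to whatever inherits it.
class Serializable {
public:
    virtual std::string serialize() const = 0;
protected:
    ~Serializable() = default;  // not meant to be destroyed polymorphically
};

class Widget {  // the single "real" base class
public:
    virtual ~Widget() = default;
};

// Single inheritance from Widget, plus the mixin for convenience methods.
class Button : public Widget, public Serializable {
public:
    std::string serialize() const override { return "Button{}"; }
};

int main() {
    Button b;
    std::cout << b.serialize() << '\n';
}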

Bottom line: you'll be reading a lot of books to improve your coding skills. If you use it professionally, you'll be reading a lot of code in a lot of personal styles, and it can get quite tough without an IDE tool to guide you.
 
The names and terms are the biggest problem for me learning C++: objects, instances, arguments, instantiation, attributes, methods, dereferencing, streaming. It took me a while to get familiar with them. I don't find C++ itself that hard; it's the English they use that makes it hard, and reading about it online is hard. Try a YouTube video; I find that much more helpful than reading online articles. It's not that I cannot read. I have studied plenty of advanced electronics, math and physics textbooks, and English has NEVER been a problem, yet somehow those C++ books and online articles are hard to understand. They seem to have their own language or lingo.
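For anyone hitting the same wall, here's a tiny sketch that pins several of those terms to concrete code (the class and the names in it are made up purely for illustration):

Code:
#include <iostream>

class Counter {                // a class describes a kind of object
public:
    void add(int amount) {     // a method; 'amount' is its parameter
        value += amount;       // 'value' is an attribute (data member)
    }
    int value = 0;
};

int main() {
    Counter c;                 // instantiation: 'c' is an instance (object)
    c.add(5);                  // calling a method with the argument 5
    Counter* p = &c;           // a pointer to the object
    (*p).add(1);               // dereferencing the pointer; same as p->add(1)
    std::cout << c.value << '\n';   // streaming the result to output
}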

Did you look into the book by Gaddis? https://www.amazon.com/s?k=C+++Gaddis+6th+edition&ref=nb_sb_noss
I use the 6th edition, as I could buy it cheap, for like $5 plus shipping. I told you I have like 5 books on C++; only Gaddis is easy enough to read for self-study. You can go online and find the 6th edition as a PDF download. I am one chapter away from finishing the brief version of the book.

No matter what you do, do NOT get this one: https://www.amazon.com/gp/product/0789757745/?tag=pfamazon01-20 It has got to be the worst one. It LIES that you can learn C++ in one hour a day.

The more I study C++, the more I question whether it was the right choice as my first language. If I were to do it again, I might pick Python.
 
yungman said:
If I were to do it again, I might pick Python.
:biggrin:
 
I learned C++ by writing C++ and (since 2005) by only considering jobs where C++ is the primary language. You can read books all day but without actually programming in the language, it's not going to do you much good.
 
jbunniii said:
I learned C++ by writing C++ and (since 2005) by only considering jobs where C++ is the primary language. You can read books all day but without actually programming in the language, it's not going to do you much good.
But you have to study the book before you can get hired. That said, I agree that books alone are not very good for learning the language; the examples are so simple you really don't learn. That's the reason I always venture out and think of programs to write on my own, something longer, over 100 lines of code, involving more subjects. I just finished analyzing a program step by step to see how the compiler thinks and works, and I learned so much more than by just following the book. I was discouraged from doing that, but I insisted on completing those programs, and I feel I learned far more than from the few-line example programs in the book.

That's part of the reason that when people hire graduates straight from college, they are taking a big chance. Just because someone got good grades, or went to a big-name university, doesn't mean they can turn into a good worker, or translate the knowledge from school and books to real life.
 
C++ is hard for a few reasons. Compared to C, the language specification is very complex. A single person could write a pretty good C compiler as a college project. I doubt many could do that for C++.

It has manual memory management. Most modern languages abstract this away from the developer. In C++ it's easy to accidentally create bugs that a novice would have trouble understanding, let alone fixing.

Also, unlike Python or JavaScript, where you can get along pretty well using just Notepad and your browser, with C++ you need to learn a toolchain as well. IDEs and debuggers become much more important when you have to manually manage memory.

If you're really set on learning C++, I think learning C first would probably be a good idea. You'll get the experience of fixing low-level pointer and memory bugs without C++'s abstractions hiding them from you.
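Something like this is the kind of bug I mean; a minimal sketch, with the dangerous call commented out so the program itself stays well defined:

Code:
#include <memory>

int* makeValue() {
    int local = 42;
    return &local;             // BUG: returns the address of a dead stack variable
}

std::unique_ptr<int> makeValueSafe() {
    return std::make_unique<int>(42);   // ownership is explicit and automatic
}

int main() {
    // int* p = makeValue();   // dangling pointer; reading *p is undefined behavior
    auto q = makeValueSafe();  // memory is freed automatically when q goes out of scope
    return (*q == 42) ? 0 : 1;
}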
 
yungman said:
The names and terms are the biggest problem for me learning C++. All the terms like objects, instance, arguments, instantiation, attributes, methods, dereferencing, streaming. Took me a while to get familiar.
Yes. You were learning OOD at the same time as you learned a new language. That is rough. But these are standard OOD terms that will be good to know for any modern language.
yungman said:
The more I study C++, I question whether it's the right language as the first language I learn. If I were to do it again, I might pick Python.
You were not driven by the job market, so you are free to choose for your own use. Otherwise, C++ has an advantage. This Youtube video is an example of using Python in an application to call OpenCV (Open Computer Vision) library functions. Notice that the Python code is about 1% of it and almost all of the difficulty is in applying OpenCV.
 
DavidSnider said:
C++ is hard For a few reasons. Compared to C the language specification is very complex.
That's for sure. The new C++20 standard (draft version) is 1829 pages long! And available as a free legal download:

https://isocpp.org/files/papers/N4860.pdf
 
  • #10
jbunniii said:
That's for sure. The new C++20 standard (draft version) is 1829 pages long! And available as a free legal download:

I'm like @jedishrfu. I loved C++ 1.0 as learned from Stroustrup's book Rev 1.0, but I jumped ship when the STL libraries came out.

This thread could benefit if we made more distinction between learning languages and production languages. I also think that students and beginners should be told to plan on learning several programming languages. Too often we hear the question, "Which is the best language I should learn?" That presumes only one, and it excludes starting with a learning language, then moving to others for job purposes, and it presumes that there can be one choice best for all purposes.
 
  • #11
Hacker Jack said:
Summary:: How was the learning experience for you when you started out with C++? Was it hard or easy?
Well, I had been poking myself in the eye with a sharp stick and decided that I would try C++ instead. I was tempted to go back to the stick.

OOP was hard for me at first, but it is SO cool.
 
  • #12
@anorlunda makes a good point on learning multiple languages. Over the years, I've had to learn and relearn at least a dozen programming languages due to switching platforms from mainframe to mini to micro and back.

Other changes came from events such as joining a new project team and having to follow their conventions and language choices, or because one language worked better for a given project than another.

Sometimes, project requirements will dictate the best choice of a language. Other times, the principal programmer will do a pet project that becomes a production project and so we adopt their choice of language(s).

Some of my current projects required web-based servers, and so I've migrated from Java to NodeJS to Python and now to golang. Web programming in Java with Apache Tomcat is painful. I moved to NodeJS, which is simpler, but NodeJS required third-party repositories of code that are hard to get at in an air-gapped system. I had to wrap some code in Java for the netcdf files I had to return, and the Java had to call a C-based command that actually read the data.

In the next rendition, I moved to Python, which replaced the NodeJS and Java components because we had access to an internal Anaconda repo, but this created really huge docker images. And so now I'm learning golang with the hope that everything can be written in golang and the size will be more compact. (golang compiles to a binary with no external code dependencies.)

If you noticed the docker aspect, that meant I also needed to write bash and Windows scripts for launching the docker image itself, and inside docker some more scripts for setting up the environment and launching the application. I also had to write a docker build file, which is a kind of bash/makefile script with its own limited syntax.

The watchword today for me is to design for microservices, package them in docker images, use golang as the application language, bash as the glue and alpine linux for the docker image OS in order to make the most compact easily deployed applications.

One last note, there are a lot of machine learning projects today, many based on python that show great promise but when moved to production hit the performance roadblocks because python is slower than many compiled languages. The solution is to rewrite the python code as golang code and package as a docker image.

C/C++ could have been used here, but golang is a lot like C for the internet age. The designers dropped the OO aspect and replaced it with something less powerful but still quite useful. They've replaced pointer arithmetic, which can introduce bugs and security flaws if not done quite right, with slices on top of array buffers.

So that's why I'm now learning golang hoping to retire my other languages into the dustbin of CS history until the next great language comes along.

Julia, Julia where art thou, Julia?

(Julia is a new language making inroads in the computational and ML worlds of MATLAB and Python)
 
  • #13
Hacker Jack said:
How was the learning experience for you when you started out with C++? Was it hard or easy?
My experience probably isn't very relevant to yours, because C++ was the third language that I got to know well.

First was Fortran, which I learned as an undergraduate in the early 1970s, and used a lot in graduate school (experimental particle physics) through the early/mid 1980s.

Next came Pascal, which I taught in intro programming courses about 1985-1995. During that period I dabbled in programming for the classic MacOS, for which Apple used an object-oriented dialect of Pascal (Object Pascal). That was where I first learned object-oriented programming concepts, although I didn't teach them in my courses. (My textbooks used standard Pascal which wasn't object-oriented.)

In the mid 1990s, the "standard" college/university intro programming courses in the US, along with the high-school level AP computer science courses, shifted to C++, and I followed suit. It wasn't too hard for me because I already had been exposed to object-oriented programming. It was mainly a matter of becoming familiar with C++ syntax, figuring out how to present it to my students, and deciding how much to present to them. The full language (including the standard library) is too much for even a two-semester course, at least at my non-elite college.

Around 2005, someone else took over my intro programming course and moved it to Java, because that was the new trend in such courses. Since then, I've retired from teaching, and am only now becoming acquainted with "recent" developments in C++, such as the 2011 and later standards.
 
  • #14
As a beginner just learning C++, I don't think it's hard; it just has a LOT of things to learn. Every single one is easy (except pointers so far), but there are endless things to remember. Like all the little functions: cin.getline() for a C-string, getline(cin, ...) for a std::string. It used to be strcpy to copy a C-string; now no no no, you have to use strncpy_s()... Then there's when to use const... A class has a strict rule that no one else can touch the private member data... BUT you can bend the rule by using friend!
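To make the two getline forms and the string-copy business concrete, here's a little sketch (note that strncpy_s is Microsoft's "secure" variant and is not available everywhere; the portable bounded copy shown here is strncpy):

Code:
#include <cstring>
#include <iostream>
#include <string>

int main() {
    char cstr[32];
    std::cout << "Name (C-string): ";
    std::cin.getline(cstr, sizeof cstr);   // member getline, for char arrays

    std::string s;
    std::cout << "Name (std::string): ";
    std::getline(std::cin, s);             // free function, for std::string

    char copy[32];
    std::strncpy(copy, cstr, sizeof copy - 1);  // bounded C-string copy
    copy[sizeof copy - 1] = '\0';               // strncpy may not null-terminate

    std::cout << copy << " / " << s << '\n';
}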

It's frustrating to study C++. There are so many little things to learn that the book cannot go into detail on anything. If I just go through the chapters and work the example programs, I learn NOTHING, because each presents only one single case, and there are so many ways to use each topic that you never really get to know it. The only time I really learn is when I venture out, dream up a program, and write it, like the two weeks I spent writing a personal address/phone directory that really put things into action. Then I get a real feel for the stuff I use and how it works.

It is just so dry learning C++. The kicker is that I am finishing up chapter 14 of 15 in a book of almost 1000 pages, and all I can do is write something in a cmd window. When does it start to be fun? How many more of these do I have to learn? The only fun I've had recently is working on and adding to jtbell's program, stepping through the debugger and looking at how the compiler thinks.

That's another thing: you can think you know C++, but wait, it's not what you know that is important, it's what the compiler thinks you know that is important! I do not remember the book talking at all about the compiler in action. The thread by jtbell was an eye opener. You experts might think it's common sense; believe me, it's NOT.

Then of course, you people even have your own language... I am talking about the English terms and sentences. You think you explain things well, but for a beginner you really don't. I was asked why I didn't google before asking. Do you know googling really does not help when the results sound like Russian if you don't know the "language"? It took me over 3 months before it got easier. Another problem is that those cplusplus sites present beginner material with examples using struct, this, pointers and all that, which you DO NOT understand until you get to chapter 11 and up in the book. Those examples mean NOTHING to a beginner who is on chapter 5! Based on over 40 years of track record, I do NOT think I am slow or dumb, and I study hard. The most difficult part of learning C++ is NOT that it is hard; it's learning how to read your language and the terms and names.
 
  • #15
By the time I ran into C++, I had already coded in numerous machine codes and assemblers, Cobol, numerous Fortran variants, RPG, Pascal, C, and others. I had also done a fair share of database normalization work. My reaction to C++ was that object-oriented design had a lot to borrow from database normalization. Most data doesn't really form a data base, but you can push the point. And if you do, as a design exercise consider making each normalized "tuple" its own class.

Basically, I liked C++ from the start and had no problem in adopting it.
 
  • #16
Similar to others, I had a lot of experience with a variety of languages before I encountered C++. They included some discrete event simulation languages where the appeal of OOD concepts was clear (with large numbers of different types of objects entering and leaving a simulation). C++ was first introduced to me as a mechanism for Object-Oriented programming. That made the learning very pleasant. That being said, I never really became an expert at C++ because practically all of the code where I worked was in C and other languages.
 
  • #17
Hacker Jack said:
I'm using the book "Programming principles and practice using C++" by Bjarne Stroustrup who is the creator of C++. It is an introduction to programming for the complete beginner
I've never read that book myself, but I think I remember flipping through it in a bookstore once or twice, years ago. And I've read the reviews on Amazon. And I have a couple of Stroustrup's other books, including his "bible", The C++ Programming Language, which I've used as a reference. My impression is that he doesn't waste words, so you have to read slowly and carefully. I can see how the lack of redundancy, for want of a better word, might be a problem if English isn't your native language.
Hacker Jack said:
I would like it if there was more modern introduction to C++/programming books out there but there wasn't any good ones or hardly any with a Google search and asking around.
Up until maybe 10-15 years ago there were a number of introductory programming textbooks that used C++, because from about 1995-2005 C++ was the language used in the Advanced Placement computer science exam in US high schools. Therefore many high schools, colleges and universities used C++ in their intro programming courses. Then around 2005 the AP people switched from C++ to Java, which reduced the market for intro C++ books for use in schools.

The book that I used in the course that I taught is now out of print, but the author has made it freely available. I posted about it here:

https://www.physicsforums.com/threads/old-but-good-c-textbook-available-for-free.992947/

You might consider taking a look at it, at least as a supplementary textbook.
 
  • #19
Yep, I remember Scott Meyers's books. A couple of them are probably buried in my closet somewhere! I remember him having an engaging style. Definitely not for beginners, though.
 
  • #20
jtbell said:
I've never read that book myself, but I think I remember flipping through it in a bookstore once or twice, years ago. And I've read the reviews on Amazon. And I have a couple of Stroustrup's other books, including his "bible", The C++ Programming Language, which I've used as a reference. My impression is that he doesn't waste words, so you have to read slowly and carefully. I can see the lack of redundancy, for a better word, might be a problem if English isn't your native language.

Up until maybe 10-15 years ago there were a number of introductory programming textbooks that used C++, because from about 1995-2005 C++ was the language used in the Advanced Placement computer science exam in US high schools. Therefore many high schools, colleges and universities used C++ in their intro programming courses. Then around 2005 the AP people switched from C++ to Java, which reduced the market for intro C++ books for use in schools.

The book that I used in the course that I taught is now out of print, but the author has made it freely available. I posted about it here:

https://www.physicsforums.com/threads/old-but-good-c-textbook-available-for-free.992947/

You might consider taking a look at it, at least as a supplementary textbook.
Ha ha, do NOT trust the ratings on Amazon when it comes to C++ books. I bought this book because of the rating:
https://www.amazon.com/gp/product/0789757745/?tag=pfamazon01-20
This has got to be the worst book I have ever read. I actually studied 4 or 5 chapters before I switched to Gaddis and never looked back. That book is so deceiving, making it sound like you can learn C++ in a month at one hour a day. It claims to cover the most C++ of all the books I have, but you'll never learn anything from it if you are new to C++. The most sickening thing is that after I studied Gaddis, I read that stupid book again for the hell of it, thinking it must give me more. NOT! It doesn't cover anything more advanced than Gaddis; it's just very hard to read.

When you say AP people, do you mean in college? I notice with my grandson they concentrate on Java; he only took one class in C++, using Gaddis, and I am surprised the school covered so little. They covered only up to part of chapter 11 on struct, and they SKIPPED the most difficult chapter, chapter 9 on pointers. What are they studying for one semester? What kind of C++ class is that? It only starts to get difficult after struct, going into advanced files, classes and overloading. Then of course, the pointers. I put more time into the few later chapters than into the first 10.
 
  • #21
anorlunda said:
This thread could benefit if we made more distinction between learning languages and production languages.

Well, I'm OK with a thread entitled "How did you learn C++?" being about C++, but I see your point. There is this idea that learning a programming language and learning to program are the same thing, and they are not. Just like learning to play chess well is more than figuring out how the pieces that look like little horsies move.

As far as C++, it helps a lot to know C first, in particular its memory paradigm and how it interacts with functions. Otherwise you'll drown in *'s, &'s, .'s and ->'s as soon as you attempt something non-trivial. It helps to understand OO so you work with the language instead of fighting it. And while I am hearing a lot of hating on the STL, I think it forces you to think about design early: should this be a vector? Maybe a list? Perhaps a deque.
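A minimal sketch of that early design question, using nothing beyond the three containers named above:

Code:
#include <deque>
#include <iterator>
#include <list>
#include <vector>

int main() {
    // vector: contiguous storage, fast indexing, cheap push_back - the default choice
    std::vector<int> v{1, 2, 3};

    // list: cheap insert/erase in the middle, given an iterator to the spot
    std::list<int> l(v.begin(), v.end());
    l.insert(std::next(l.begin()), 42);

    // deque: cheap pushes at both ends (vector has no push_front)
    std::deque<int> d(v.begin(), v.end());
    d.push_front(0);

    return static_cast<int>(v.size() + l.size() + d.size());
}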

If you look at people here who are struggling, they often are missing one or more of these things. They don't know C and/or they don't know how to program (e.g. data structures), and/or they don't know how memory is managed, and/or they don't understand objects. Of course they struggle - their foundation makes it impossible for them not to.

jedishrfu said:
Multiple inheritance

Is Soviet plot to destroy American productivity! 💂‍♂️ (That's supposed to be an ushanka)

Seriously, multiple inheritance is one of those ideas that looks great on paper, but not so good in real life. The advantages tend to be small or speculative, but the issues - including the Diamond of Death (just like on ski slopes, the diamond should serve as a warning) - tend to be very real.
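For anyone who hasn't met it, here is the diamond in miniature; virtual inheritance is C++'s escape hatch:

Code:
#include <iostream>

struct Device           { int id = 0; };
struct Scanner : virtual Device {};   // 'virtual' makes the two paths share
struct Printer : virtual Device {};   // a single Device subobject
struct Copier  : Scanner, Printer {};

int main() {
    Copier c;
    c.id = 7;   // unambiguous only because the inheritance is virtual; without
                // it there would be two Device subobjects and this line would
                // not compile
    std::cout << c.id << '\n';
}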
 
  • #22
Vanadium 50 said:
As far as C++, it helps a lot to know C first, and in particular it's memory paradigm and how it interacts with functions. Otherwise you'll drown in *'s, &'s, .'s and ->'s as soon as you attempt something non-trivial.
That's the order in which I learned these languages, and IMO that's the best way to go. After I retired, I taught the first course of a three-quarter sequence several times at a nearby community college. Their approach was C for the first quarter, then C++ for the next two quarters, where OO stuff and data structures are covered.

IMO, the switch to Java by so many colleges is unfortunate. Joel Spolsky's blog post is a bit dated, but is very pertinent, I believe - The Perils of JavaSchools – Joel on Software. His main gripe is that Java insulates programmers from the hardware, and in particular, from pointers. Among his other hats, Spolsky is one of the founders of Stack Overflow.
 
  • #23
Why are colleges switching to Java? There is a HUGE field in firmware, and I already think C++ is a little too remote from the hardware.

I think people who do programming should have some knowledge of the hardware. Like those people who program FPGAs thinking AHDL and VHDL are just another programming language and really screw everything up. Equally, people who design hardware SHOULD do some programming, so the two sides have an appreciation of each other.

EDIT:

I think they should require CS students to take a class in digital electronics and something like microprocessor design (an 8080 in my day). At the same time, every EE student should be required to take a couple of classes in C++, and maybe C or another lower-level language. The two fields are so isolated from each other that I don't think they appreciate each other's skills. I was required to learn assembly language, and it was so useful; it literally got my career started.
 
  • #24
jedishrfu said:
I remember using Scott Meyers books to pickup best practice programming tips:

- Effective C++

https://www.amazon.com/dp/0321334876/?tag=pfamazon01-20

- Effective Modern C++

- Effective STL
Effective Modern C++ is the only one of those three that is still even slightly current, and even it is quite behind the times as it only covers up to C++14. The language has moved on substantially since then. C++20 is a huge step forward from C++17, which in turn was a significant step forward from C++14. Unfortunately, Meyers has more or less retired so it's unlikely that we'll see updates to these books.

Even Stroustrup's bible is way out of date at this point.

If you're looking for up to date C++ in book form, Josuttis's C++17: The Complete Guide is good and has no competition. But it assumes you already know pre-17 C++ and just want to learn the new stuff. And it doesn't touch C++20 at all.
 
  • #25
@yungman I once worked on a VLSI test system development team. The goal was to use gallium arsenide technology to test our fastest bipolar memory chips. The engineers designing the hardware made each channel identical so that, in concert, they would sing a perfectly timed pattern to test the chip. Each pin of the test chip would be mapped to a channel, and the programming goal was to convert human-understandable test patterns into a set of programs, one for each channel.

Basically, the program had nested loop capability and just emitted a string of 0s and 1s or compared test chip outputs to the generated string of 0s and 1s noting mistakes. Perfect tests would have no mistakes.

Programming-wise, the channel programs had a two-dimensional array of bits; each row reserved n bits for the outer loop count, m bits for the inner loop count... The loop counts were not on byte boundaries.

It was an amazing challenge to program. We had to define what the counts would be, and they had to be synchronized with all the other channels. Before loading, we had to pack the loop counts using bitwise operations and set the proper flags that controlled how many rows were part of the loop, plus the other assorted fields defined by the hardware spec.

We kept asking ourselves why they packed everything so tight, and of course the answer was to save on hardware memory costs. The programming then had to adapt to make it even remotely usable. A software-versed engineer might have placed the loop counters on byte boundaries, knowing the difficulty programmers would have coding this stuff. But not our engineers.
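For a feel of what that packing looked like, here's a rough sketch; the field widths are invented, since the real hardware spec is long gone:

Code:
#include <cassert>
#include <cstdint>

// Hypothetical row layout: outer loop count in bits 0-10, inner loop
// count in bits 11-17, control flags in bits 18-21.
std::uint32_t packRow(unsigned outer, unsigned inner, unsigned flags) {
    return (outer & 0x7FFu)
         | ((inner & 0x7Fu) << 11)
         | ((flags & 0xFu)  << 18);
}

unsigned outerOf(std::uint32_t row) { return row & 0x7FFu; }
unsigned innerOf(std::uint32_t row) { return (row >> 11) & 0x7Fu; }

int main() {
    std::uint32_t row = packRow(1000, 100, 0x5);
    assert(outerOf(row) == 1000 && innerOf(row) == 100);
    return 0;
}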

One interesting note on why a major computer company would design in-house testers: any vendor specification for a third-party test system would necessarily give away where your technology was going and how fast you would get there. For that reason we never looked for outside help, and never considered Japanese vendors, for fear of alerting them to where we were headed in future chip and computer technology.

One engineer at an IEEE conference asked our lead engineer how our company could afford to spend so much money in-house to design and build a state-of-the-art tester. Our lead engineer responded "How can we afford not to?" to thunderous applause.

One final note, the programming was done in a combination of C and C++ coding but as I recall it was more C since we were still new to the OO way of designing software.
 
  • #26
Mark44 said:
His main gripe is that Java insulates programmers from the hardware, and in particular, pointers.

I'm sort of sympathetic. Java has some nice features, but it's not the last word in programming. As you point out, pointers are not a "thing" in Java.

However, C pointers have a particular philosophy that can interact badly with the underlying hardware. It was very hard for C programmers accustomed to thinking of a single linear address space to program the 80286, which had a segment/address concept. The VAX had a similar segment/address concept but implemented it in a way that better fit the (works, but only by accident) "pointer = int" mentality.
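A small sketch of why "pointer = int" only works by accident; std::uintptr_t is the portable spelling when you genuinely need the pointer's bits:

Code:
#include <cstdint>
#include <cstdio>

int main() {
    int x = 0;
    // int bad = (int)&x;   // may truncate on 64-bit targets; non-portable
    std::uintptr_t bits = reinterpret_cast<std::uintptr_t>(&x);  // wide enough by definition
    std::printf("pointer bits: %ju\n", static_cast<std::uintmax_t>(bits));
}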
 
  • #27
jedishrfu said:
@yungman I once worked on a vlsi test system development team. The goal was to use gallium arsenide technology to test our fastest bipolar memory chips. The engineers designing the hardware made each channel identical so that in concert they would sing a perfectly timed pattern to test the chip. Each pin of the test chip would be mapped to a channel and the programming goal was to convert human understandable test patterns into a set of programs one for each channel.

Basically, the program had nested loop capability and just emitted a string of 0s and 1s or compared test chip outputs to the generated string of 0s and 1s noting mistakes. Perfect tests would have no mistakes.

Programming-wise the channel programs had a two dimensional array of bits each row reserved n bits for the outer loop, m bits for the inner loop... The loop counts were not on byte boundaries.

It was an amazing challenge to program. We had to define what the counts would be and they had to be synchronized with all the other channels and before loading we had to pack the loop counts using bitwise operations and set the proper flags that controlled how many rows were a part of the loop and other what not fields defined by the hardware spec.

We kept asking ourselves why did they pack everything so tight and of course the answer was to save on hardware memory costs. The programming then had to adapt to make it even remotely usable. A software versed engineer might have placed the loop counters on byte boundaries knowing the difficulty programmers would have in coding this stuff But not our engineers.

One interesting note on why a major computer company would design in-house testers was because any vendor specification for a third party test system would necessarily giveaway where your technology was going and how fast you would get there.For that reason we never looked for outside help and never considered Japanese vendors for fear of alerting them to where we were headed for future chip and computer technology.

One engineer, at an IEEE conference asked our lead engineer how could our company afford to spend so much money in-house to design and build a state of the art tester. Our lead engineer responded “how can we afford not to“ to thunderous applause.

One final note, the programming was done in a combination of C and C++ coding but as I recall it was more C since we were still new to the OO way of designing software.
I believe in doing everything in-house; it's a total waste of time to hire an outside vendor to do part of the job... period, whether the worry is stealing technology or efficiency. I personally worked for a consulting company that was contracted out to other companies to do part of their job. The idea is that the company doesn't have to hire permanent workers, pay for benefits, or lay people off when work is slow. But it just doesn't work; the problem is that the bean counters don't understand that. I know how screwed up a contracting company can be. I remember one time our company (Pacific Consultant LLC at the time) was contracted to do some work for another company. In midstream, the engineer quit, and I had to take over. That was OK. Then... I found a better job and I quit. Last I heard, they assigned a junior engineer to take over. I know that guy; I have mercy on the company that hired us!

You have to be in the field to understand the ins and outs of the job and its requirements. When you work for the chip company, even if you are on the software side, you see each other day in and day out, and the little talks here and there give you a lot of inside knowledge that you won't get as an outside contractor. I've been on both sides. After I started working for the new company, I hired the consulting company to do some FPGA programming and other work. Did I get screwed!:mad::mad:! That's what I was talking about with the FPGA programming. I had to go in and fix the glitches the FPGA produced, because they programmed it as if it were software, one line of code at a time. An FPGA is NOT like that; all the signals are parallel, running at the same time. They are not sequential! Race conditions cause glitches in the signals and falsely trigger the logic, like loading a register with garbage. I am talking about contracting here; don't even get me started on outsourcing to China... Before you say anything, I am Chinese, originally from Hong Kong. The quality of their product is terrible. They are too greedy; they take shortcuts and don't care about the repercussions. Look at the quality of the stuff made in China. I would much rather pay more and get good-quality stuff. I think people here quietly started moving work back to the US from China a few years ago already. Silicon Valley has been booming for a while; they know it just doesn't work to outsource high-tech stuff to China, on top of worrying about them stealing the technology.
 
  • #28
yungman said:
I think people that do programming should have knowledge with the hardware.
If they need to. There is a huge amount of programming done that should be hardware-agnostic and will run on many different hardware platforms. This is probably the majority of programming done today.
 
  • #29
FactChecker said:
If they need to. There is a huge amount of programming done that should be hardware-agnostic and will run on many different hardware platforms. This is probably the majority of programming done today.
It's not about programming for specific hardware, it's about understanding the hardware. The same way EEs should learn programming to understand how software people think.

Not to mention there is a very big field of firmware programming in Silicon Valley. Actually, all the programmers I worked with were working very closely with hardware. There are a lot of jobs like this; not all programs are PC or banking applications.

Actually, the place I worked for did a lot of scientific programming: signal processing, Fourier transforms and physics-related stuff. I am surprised I did not see any physics requirement in the CS major. In scientific programming, that can be important, and it's a very big field.

Remember, software and even hardware are ONLY tools to get a job done, a means to an end, not the end in themselves. That, I think, is very important to keep in mind. It's about getting the job done, not about elegance or style.
 
  • #30
There is a copy of that Bjarne Stroustrup C++ in my basement if anyone would like it. I can't give it away and don't want to throw it in a landfill right now.

Note that my background is computer science rather than Physics so my perspective is a little different because I was never mainly doing numerical applications (though I've had to wrestle with them on occasion).

I am enjoying reading this thread. I never quite mastered C++ though I became a thoroughly competent C programmer for a few years. C++ was so large and complex that I never had the luxury of spending the time that it was apparently going to take to be really good at it; hence, I always delegated any necessary C++ work off to a real expert. Like sometimes, when I needed a DLL in Windows. But I mostly avoided C++ when I could. It used to be needed by Windows programmers to call into the operating system for certain reasons, but when Microsoft came out with .NET, we were all rescued from DLL hell for the most part and the world rejoiced.

I didn't really learn OO concepts well until I had written a lot of OO programs in C# and Java--AND (this is key!) I don't think I could ever have become a decent "designer" (different from programmer/coder) if I had not learned an additional OO topic called design patterns. I did master most of the design patterns, and knowing them, I can use them in any object-oriented language; they are extremely helpful mental constructs.

Design patterns are from a famous 1995 book (named "Design Patterns") that is notoriously difficult to plow through but had a lot of influence on the field of programming in computer science. The book's four authors (fondly called "the gang of four") had a lot of influence on language designers, and one of their claims that I began to understand is that, at least when it first came out, inheritance was overused as a design ploy. The Design Pattern authors delve into the risks of relying too much on inheritance in designs (having to do with maintainability and flexibility of design in maintenance and growth of a program later on, as well as memory loading issues and performance). So people writing large complex programs that were expected to be used for a long time started to be more cautious in the ways they applied inheritance and started to do other things instead, such as interfaces. If you think you are an OO programmer, it is probably necessary to become familiar with at least several of the most common design patterns. They go by different names, and I'd say there are six to ten (of the twenty odd ones identified in the original book) that every single programmer ought to know--if I were interviewing them for a job, I'd probe to find out if they did.

Lastly, a couple of people here also noted that multiple inheritance proved to be disliked in C++, and that absolutely dovetails with what I experienced. There are possible ambiguities that can arise in multiple inheritance, and because of that, the OO languages that came along later (Java and C# among them) mostly disallowed it. Someone jumped on me badly in here for saying so recently; it was an ugly interaction, and they cited that Python (I think) allows multiple inheritance. But I'm glad to point out that I'm not the only person who ever said that actually USING multiple inheritance is probably a bad idea.

As for inheritance being overused, there are places where it should be used; one of those is when using a library, where it is generally considered okay to "extend" a library class by just adding something to it. But people who do a lot of OO design these days will shy away from building a skyscraper in code based solely on inheritance. They are much more likely to use interfaces and delegation instead, for lots of reasons, like having looser coupling and maybe a later binding time upon execution, or even communicating across multiple threads.
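A rough sketch of the delegation style (essentially the Strategy pattern); all the names here are invented for illustration:

Code:
#include <iostream>
#include <memory>
#include <string>

// The behavior lives behind a small interface...
struct Formatter {
    virtual ~Formatter() = default;
    virtual void write(const std::string& s) const = 0;
};

struct PlainFormatter : Formatter {
    void write(const std::string& s) const override { std::cout << s << '\n'; }
};

// ...and the object delegates to it instead of inheriting the behavior,
// so the formatting can be swapped without touching any class hierarchy.
class Report {
public:
    explicit Report(std::unique_ptr<Formatter> f) : fmt_(std::move(f)) {}
    void print() const { fmt_->write("quarterly numbers"); }   // delegation
private:
    std::unique_ptr<Formatter> fmt_;
};

int main() {
    Report r(std::make_unique<PlainFormatter>());
    r.print();
}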

And certain other features, such as operator overloading, might as well, in my experience, be academic toys. In complex real-world code, I have never seen a good reason to do anything like that, and I would in fact advise against using things like operator overloading (without a strong compelling reason, which I never encountered--one may exist though), because: think of someone coming along ten years later, trying to fix a big hunk of unknown code, and not knowing immediately that the operator has been overloaded. It's unexpected, obscure, and open to misinterpretation. Having had to work on tons of legacy code written by other people who scarcely bothered to document their OO designs, I can testify to the wisdom of the four authors of Design Patterns, who warned against overuse of and overreliance on some of OO's fancier features. They (and I) liked loose coupling, clear module boundaries, and obvious design patterns that maintainers coming along years later would easily recognize.

The point is, books (and some teachers) will spend an inordinate amount of time trying to teach you certain esoteric OO design features that you will likely never need and even perhaps ought never to use, while leaving you with an incomplete understanding of things you absolutely need to know to be a good code designer and get good performance out of code.

I would advise anyone really interested in OO design to learn it in a language other than C++, at least eventually, because my perception of C++ is that it carries a whole lot of complexity that is chaff. Which is why computer science people only use it as a last resort these days.

C++ was a very early OO language, and it got everything but the kitchen sink thrown into it.
 
  • #31
@yungman is correct. Most CS majors tend to be offered numerical programming as an option rather than a requirement, and I agree that it (as well as some assembler level programming) would be useful as a requirement instead.

But CS sees, accurately I think, that it will mainly be needed in scientific fields. Programmers can work for decades in industries such as telecom or pharma and never need to do more than very trivial numerical work. For those types it will be lots of database, web services, or UI work--very different from what physicists need.

@yungman is also correct that CS departments do not teach enough "close to the metal" real-time programming, which is a dark art requiring intimate knowledge of the hardware. However, EE departments tend to offer those courses, and they are usually available as options for CS majors.

This was true at the two CS departments I had contact with, anyway (Univ of Tenn and Univ of Penn).

P S - I learned a little of the "dark art" doing microprocessor programming in telecom many decades ago. Not only did it have to be assembler level, but RAM was so small that the code had to jump to take advantage of every free byte of RAM in order to fit it into the tiny chip. These days, memory is cheap and people think they can afford the overhead of OO and compiled languages for robotics etc. That is debatable.
 
  • #32
harborsparrow said:
And certain other features such as operator overloading might as well, in my experience, be academic toys. Because in complex real-world code, I have never seen a good reason to do anything like that

(1) Streamers.

(2) Assignment operators for objects. I am actually not a big fan of this, because of ambiguities, especially with regard to pointers. Does a pointer point to the exact same data or a copy of this data? Is the answer different if the data is internal or external to the object? But whether or not I like it, it is commonly done.

(3) Comparators. Lets you sort - and search - efficiently.
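A minimal sketch of (3), which is enough to unlock std::sort for a user-defined type:

Code:
#include <algorithm>
#include <iostream>
#include <vector>

struct Point { int x = 0, y = 0; };

// The comparator: defines an ordering, so sorting and searching work.
bool operator<(const Point& a, const Point& b) {
    return a.x < b.x || (a.x == b.x && a.y < b.y);
}

int main() {
    std::vector<Point> v{{2, 1}, {1, 3}, {1, 2}};
    std::sort(v.begin(), v.end());          // uses operator< above
    for (const auto& p : v)
        std::cout << '(' << p.x << ',' << p.y << ") ";
    std::cout << '\n';                      // prints (1,2) (1,3) (2,1)
}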
 
  • #33
LeetCode has great C++ exercises for the beginner. It's a great companion to your Stroustrup book. Also, try to learn C++11 or better.
 
  • #34
what are streamers?
  • #35
yungman said:
I believe in doing everything in-house; it's a total waste of time to hire an outside vendor to do part of the job... period, whether the worry is stealing technology or efficiency. I personally worked for a consulting company that was contracted out to other companies to do part of their job. The idea is that the company doesn't have to hire permanent workers, pay for benefits, or lay people off when work is slow. But it just doesn't work.

Ha ha @yungman you're on track! I did consulting for a few years, and I found that (besides the reason given above) there were a few other reasons they hired consultants:

1) to have the consultants do a study to rubber stamp a (possibly very bad) idea they wanted to further within the company

2) to have the consultants come in and quickly do something that urgently needed doing but that some bureaucratic part of their own company was refusing to do (like, someone running a fiefdom in the IT department and blocking all innovation--it happens more than you'd ever think)

3) to have someone come in and give an outside perspective, and possibly, tell the truth when no one inside could afford to tell the truth for fear of being (virtually) shot

As a consultant, I always needed to understand which kind of consultant the client was looking for. I was never going to be the kind to rubber stamp a bad idea, because I cost a lot of money and I felt it was robbing the hiring corporation to lie at the behest of some unscrupulous manager, even if they were going to pay me a nice bribe to do so. But plenty of consultants will eagerly grab onto that role. I didn't, because, in my view, a consultant's credibility with the client (and reputation) would go to hell if they even once got caught out in a lie or major mistake. Lying by commercial consultants, and even internal corporate employees, is unfortunately rampant in large companies.

I was often hired for reasons (2) or (3). I liked being a truly independent consultant, so I formed my own company, because whenever I tried working for a contract house, using them as middle-man, I ran into either the problem @yungman mentioned (wanting to exploit outside workers) or item (1). So I chose always to work "corp to corp": the contract house acted as my payroll from the big client company, siphoning off part of the profits, but they had no role in actually trying to manage my work. That way I could deal honestly with the person funding my work, and, officially, I didn't have to report to a twenty-something "manager" in the contract house who didn't have a clue about what kind of work I actually did.

I also wanted to set my own schedule, because I work best that way. The notion that I ought to follow some kind of structured pattern such as show up every single day for a scrum meeting in person did not work well for me, nor did sitting in a bullpen full of programmers wearing headphones and not talking to each other. I would rather work in a McDonalds serving hamburgers than live like that, which is what programming as a profession had unfortunately largely become before the pandemic.

I believe what I did was quite rare. Most people went through one of the big contract houses, or even one of the really expensive contract middle-man companies such as Booz Allen Hamilton. I know that BAH did sometimes get really good people but they also had a lot of soulless hacks who would endorse any kind of lie for bucks.
 
  • #36
@yungman many programmers never come to understand race conditions. I remember arguing, years ago at Bell Labs, with a designer who wanted to leave in a design that could result in a race condition. "But it will never happen in a million years", was his reply. I considered the guy a total idiot. Do you want to fly in an airplane with software that could even ever remotely have a race condition? Remember all those (was it Toyota?) cars that got stuck in full acceleration mode even though people were pumping the brakes? They never exactly revealed what caused it, but multiple people died and many accidents occurred because some designer pushed out a bad design in code somewhere.
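For readers who haven't met one, here's a data race in miniature; just a sketch, with the lock that makes it correct (remove the lock and the result becomes timing-dependent, exactly the "it will never happen" trap):

Code:
#include <iostream>
#include <mutex>
#include <thread>

int counter = 0;
std::mutex m;

void bump() {
    for (int i = 0; i < 100000; ++i) {
        std::lock_guard<std::mutex> lock(m);  // without this, increments can be lost
        ++counter;
    }
}

int main() {
    std::thread a(bump), b(bump);
    a.join();
    b.join();
    std::cout << counter << '\n';   // always 200000 with the lock held
}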

Anyway, that designer in MY case was a close friend of the department head, and he was thus untouchable, and his views always won out in arguments. I moved to get away from him, and years later, they finally had to fire him--not because he harassed all the women and Jews, not because he was an idiot, oh no--but finally he got fired because in his arrogance, he got drunk at a company picnic and struck a supervisor in the face with his fist. Not only a supervisor too--he struck the only African American supervisor, and at that point, there was nothing even his well-placed manager friend could do to keep him from getting fired. I was one of about two dozen people who went out for a beer to celebrate afterwards.

That guy getting fired was the first time in my life I started to think about karma being, maybe, a real thing.
 
  • #37
@Hacker Jack I apologize that we have hi-jacked your interesting thread to vent about corporate management issues. I too am always fascinated to find out why anyone bothers to spend a lot of lifetime with C++, as I personally found it so--(no words to describe but life is too short).
 
  • #38
harborsparrow said:
what are streamers?

cout << "Object contents: " << object;
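Spelled out, the overload that makes that line work looks like this (the Object type here is just a stand-in):

Code:
#include <iostream>

struct Object { int a = 1, b = 2; };

// The "streamer": teaches operator<< how to print an Object.
std::ostream& operator<<(std::ostream& os, const Object& o) {
    return os << "{a=" << o.a << ", b=" << o.b << '}';
}

int main() {
    Object object;
    std::cout << "Object contents: " << object << '\n';
}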
 
  • #39
jbunniii said:
If you're looking for up to date C++ in book form, Josuttis's C++17: The Complete Guide is good and has no competition. But it assumes you already know pre-17 C++ and just want to learn the new stuff. And it doesn't touch C++20 at all.
Lippman/Lajoie/Moo's C++ Primer (6th ed.) is apparently due Real Soon Now. At least it's available for pre-order at Amazon. Maybe this will fill the bill for an up-to-date "complete modern C++" book.
 
  • #40
harborsparrow said:
And certain other features such as operator overloading might as well, in my experience, be academic toys. Because in complex real-world code, I have never seen a good reason to do anything like that

Here's a (very simple) example of when you might choose to overload a comparison operator. You might think "well, I can always take an object, and from the contents of that object derive an integer value, and then sort or select on that. For example red is 3, green is 2 and blue is 1, so red > green and green > blue." Here is a case where that does not work:

rock > scissors
scissors > paper
paper > rock
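A sketch of that case in code; no mapping to integers can reproduce the cycle, which is exactly the point:

Code:
#include <iostream>

enum class Move { Rock, Paper, Scissors };

// A non-transitive "beats" relation; it cannot be derived from any
// single integer value per Move.
bool operator>(Move a, Move b) {
    return (a == Move::Rock     && b == Move::Scissors)
        || (a == Move::Scissors && b == Move::Paper)
        || (a == Move::Paper    && b == Move::Rock);
}

int main() {
    std::cout << std::boolalpha
              << (Move::Rock > Move::Scissors) << ' '   // true
              << (Move::Scissors > Move::Paper) << ' '  // true
              << (Move::Paper > Move::Rock) << '\n';    // true
}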
 
  • #41
Vanadium 50 said:
Here's a (very simple) example of when you might choose to overload a comparison operator. You might think "well, I can always take an object, and from the contents of that object derive an integer value, and then sort or select on that. For example red is 3, green is 2 and blue is 1, so red > green and green > blue." Here is a case where that does not work:

rock > scissors
scissors > paper
paper > rock

Interesting example. I think there are other ways that I personally would choose to deal with it though. Because the comparison going on is not really "greater than" in the arithmetic sense, I would probably just write a function to implement that logic and call it, rather than overload an operator. But then, I am a big proponent of making things extremely obvious to readers of my code, since I had the privilege of maintaining a pile of legacy code written by others, and that sharpened up my cussing vocabulary quite a bit.
 
  • #42
To be honest, I probably wouldn't code this up with an overloaded ">" either. But it's an example where the standard trick of "oh, just assign an integer to it somehow" doesn't do it.
 
  • #43
harborsparrow said:
@yungman is correct. Most CS majors tend to be offered numerical programming as an option rather than a requirement, and I agree that it (as well as some assembler level programming) would be useful as a requirement instead.

But CS sees, accurately I think, that it will only be needed mainly in scientific fields. Programmers can work for decades in industries such as telecom or pharma and never need to do more than very trivial numerical work. It will be, for those types, lots of database, web services, or UI work--very different from what physicists need.

@yungman is also correct that CS departments do not teach enough "close to the metal" real-time programming, which is a dark art requiring intimate knowledge of the hardware. However, EE departments tend to offer those courses, and they are usually available as options for CS majors.

This was true at the two CS departments I had contact with, anyway (Univ of Tenn and Univ of Penn).

P S - I learned a little of the "dark art" doing microprocessor programming in telecom many decades ago. Not only did it have to be assembler level, but RAM was so small that the code had to jump to take advantage of every free byte of RAM in order to fit it into the tiny chip. These days, memory is cheap and people think they can afford the overhead of OO and compiled languages for robotics etc. That is debatable.
The reason I keep mentioning cross-learning is that the reliability and speed of ALL the new products with computers in them are going DOWN. I am talking about everything I bought in the last 2 years: a 2-year-old new car, a new, expensive 82" Samsung TV, numerous printers, new DirecTV receivers. (I know people here think this is off topic in this forum; it got deleted before.) They are SLOW, unreliable, confused. You have to restart and unplug them all the time. Things I bought up to 5 years ago are NOTHING like that.

Obviously there is a big disconnect between hardware and software. This is going to blow up in the industry as people get fed up with it. As expected, the Epson printer I just bought for over $300 occasionally prints an old file from its cache when asked to print a new file, or prints halfway, then stops, confused. This is the 6th printer I've had in the last two or three years: Canon, HP, and now a Brother on the way. Don't even get me started on cars and TVs, or this post is going to be deleted. There's a HUGE difference between the latest and greatest and ones that are still considered quite new, only a few years old... and the difference is NOT in a good way.

Then I started learning C++ and saw all the fancy "elegant" ways of doing things, and I started to realize that software, and maybe hardware, are taking on a life of their own, forgetting they are just TOOLS to get a job done. They are no more and no less than, say, the Word program... there to make people's lives easier, and then GET OUT OF THE WAY!
 
  • #44
On the question of why colleges (and high schools) switched to Java: it has to do with the relative "safety" for the average programmer of using Java vs. C++. In Java it is difficult if not impossible to cause a segmentation fault, to crash an entire machine, or to screw up the DLLs so badly that the operating system must be reinstalled. All of this IS possible with C++.

Here's an analogous situation. By the early 2000s, there were approximately 40 major and minor versions of Windows in public use. Each version of Windows (counting not only, say, Windows 2000 but Windows 2000 Professional, and then version 1 vs. 2, etc.) had a slightly different set and configuration of DLLs (dynamic link libraries), and the ecosystem of Windows operating systems had become utterly unmanageable even for expert C++ programmers with years of experience. This situation was known as "DLL hell", as most of you may remember.

To solve it, someone at Microsoft took a look at what James Gosling at Sun did when he created the Java programming language. Java does not run on the machine's hardware directly; instead, it runs on a Java "virtual machine", and this virtual machine looks the same no matter which operating system it happens to be running on. Microsoft used this idea of a virtual machine to bootstrap itself out of DLL hell by creating something called .NET, a Microsoft virtual machine that would be implemented for every Windows operating system (and at least some Linuxes), so that all so-called application programming would be done against the .NET virtual machine instead of against the operating system directly. This saved Microsoft's bacon. Further, Microsoft went to ECMA and got both the .NET virtual machine and its premier language, C#, standardized. Languages that never get an international standard for tool builders to work against have usually failed (think: Ruby, just for one example).

So--Java became the new programming language for kids to learn because it ran on Mac OS X, it ran on Windows, and it was a decent, fully object-oriented language. Both it and C# have now been extended to include everything but the kitchen sink, but you still cannot create a segmentation fault using them to save your life. Furthermore, over the years, the .NET VM and its compilers have become so smart and so highly optimized that performance for numerical number-crunching programs is very high indeed, and it is probably well worth considering these newer, better-behaved "sweeter" languages over C++ whenever it is feasible. The .NET VM is so good, in fact, that dozens and dozens of programming languages now target it, including modern versions of COBOL, the ancient language of the business world that still exists due to legacy code written back in the 1960s.

As far as I can tell, physicists, astrophysicists, the space program, and people writing actual operating systems are pretty much the only holdouts who still actively use C++ with all its problems, and that is in large part because there are a lot of legacy number-crunching programs lying around written in C++, FORTRAN and the like, not because more modern languages would not be up to the task. At least, this is my impression based on a long career in software development that spanned literally dozens of languages and reached from short stints in physics and astrophysics projects, to biology, to telecommunications, to big pharma, to web development, and eventually to teaching.

C++, on the other hand, has all the portability issues that C always had. If you change hardware from 32-bit to 64-bit, C and C++ code often has to be doctored, because the sizes of integers and pointers shift underneath it. With a virtual machine underneath, not so much. Hence, Java. Java solved a million more problems than just automatic garbage collection, and BTW, automatic garbage collection also introduced some new problems, for really high-performance applications or large-scale websites, just for example. You don't get anything for free in computers; there is always a trade-off.
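A classic instance of that portability problem, sketched here under the assumption of a typical 64-bit target (illustrative, not from any particular codebase): old C and C++ code often stashed addresses in plain ints, which silently truncates them once pointers grow to 64 bits. The fixed-width types in <cstdint> are the portable spelling:

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    int x = 7;

    // The old habit: storing an address in a plain integer.
    // On many 32-bit systems this "worked"; on a 64-bit system
    // it would silently truncate the pointer:
    // int bad = (int)&x;

    // Portable version: uintptr_t is guaranteed wide enough
    // to hold a pointer on any conforming platform.
    std::uintptr_t addr = reinterpret_cast<std::uintptr_t>(&x);

    // These are the sizes that shift under your feet when the
    // hardware changes; a VM hides the difference instead.
    std::printf("sizeof(long) = %zu, sizeof(void*) = %zu\n",
                sizeof(long), sizeof(void*));
    std::printf("addr = %ju\n", static_cast<std::uintmax_t>(addr));
    return 0;
}
```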

By the way, since Microsoft tended to name everything differently than the UNIX/Linux world did, it took everyone about a decade to figure out what the hell .NET actually was, and that it was a good thing. Where they came up with the name .NET I don't know, but the way they name (mis-name) things doesn't do them any favors.
 
Last edited:
  • #45
@yungman, you are talking about the safety and reliability of the so-called "internet of things" (a dumb name), referring to all the devices now in common use that have microprocessors inside them, especially smarthome and automotive products. It is true that these started out low quality and have big security issues. But I think it is a well-recognized problem (at least in computer science circles) and help is on the way. A lot of attention is now being put into finding and exposing the hacking weaknesses of these products at the Black Hat conferences, and companies such as Google have entered the marketplace with a higher standard of quality than was at first being applied.

I've been watching (and using) smarthome products from the get-go. You can see just by reading reviews on Amazon (which can be pretty entertaining, by the way) how these products are slowly getting their act together and coming out with more stable and secure versions. Most of them were horribly rushed to market with little forethought because the makers were desperately trying to beat out competitors while doing things never done before--and no one was giving them lots of money to do it; they had to do it on shoestring budgets.

The marketplace has not been kind to any culture that would develop reliable software since the crash of the US telecom industry decades ago. Amazon, Google, Microsoft and the other big corporations are the only players with the resources to fund long-term development, but they were not the innovators in IoT or smarthome products--the big corporations followed along later and helped clean up the mess.

I am actually an admirer of Jeff Bezos, founder of Amazon, because he refused to promise his investors that Amazon would make a profit for, like, the first 8 years. And it took him almost the entire 8 years to make it profitable, and he did it by being smart. He's no angel, but he has been extremely innovative in software and in business, and he gets less credit than he should.
 
  • #46
Hi Harborsparrow

I am not talking about smart phones and Windows stuff; for whatever reason, I don't have issues with laptops. Between me and my wife, we buy a laptop every two years max, always in the $700 price range, which gets us flash storage and a latest-gen i5 or i7. We don't play video games, so we save on the graphics. Laptops are the one thing that has always been quite reliable. I always get HP; a few Lenovos did not work out very well. I don't put anything important on my smart phone, only my play email address, not bank info or anything. They can take my phone and I won't worry about anything. So those are not my issue.

My issue is with all the appliances (necessities). I have a 3-year-old Samsung 65" TV that works perfectly; the remote is fast and consistent. Then I bought an 82" Samsung early this year, and man, that one is bad. The remote is so, so slow: it has to "think" before doing anything as simple as navigating to the menu and through the menu. It is no more sophisticated than the 3-year-old one, just slow. And it gets confused sometimes and has to start all over again.

Then the car. I have a 2014 Mercedes ML; other than changing oil and such, it has only been to the shop once, for a warning signal that turned out to be tire pressure and got fixed by inflating the tires. It has voice, navigation and all the fancy stuff. We had so much luck with the ML and an old 2003 E-Class that when it was time to replace the E, we went and bought a 2018 E. MAN! Is it a bomb. It spent at least a month in the shop in the first half year, ALL computer problems. You can be driving and all of a sudden lose control of the radio and everything in the center console; I had to stop the car and put it in park so it would reset. The radio changes stations by itself, and then there are all the stupid touch pads and joysticks. They never got it fixed; they just say wait for the next software update. They never got the garage door opener to work. They even sent a specialist to our home and he still couldn't make it work--but the C-Class he drove over could open my garage door! Then there are the printers; I am still playing musical printers. Then, of all things, my 2-year-old washer got confused and started blinking! I had to turn it off to reset it. Since when does a clothes washer get confused? Again, it's a high-end Maytag model.

Anything that is over 4 years old never has issues like this. We buy a lot of stuff. I don't care if things break down eventually, but these don't work right from the get-go. Yes, I blame the software/firmware. Learning C++ has made me start to understand why. Too fancy, too elegant. What gets me is that the computers are getting faster and faster while the stuff is getting slower and slower. It must be all the calling from class to class, the aggregations and collaborations I just learned about. Keep pushing onto the stack (I mean via the stack pointer, when you make a call and save the existing data). All that push and pop, plus general-purpose languages that run on every platform, cause a lot of overhead. Imagine trying to make a 3-point U-turn on a busy street: you shift the transmission and it has to think, "Are you sure you want to do this?", before it switches. AND YES, I am talking about my 2018 E! It makes my blood boil just talking about all this. Yes, I blame the software; they told me as much.
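To hang a toy picture on that "calling class to class" complaint--this sketch is made up, not anything from a real TV's or car's firmware--here is how a layered design turns one button press into a chain of calls, each with its own stack frame:

```cpp
#include <cstdio>

// Each layer only forwards to the next. Every hop is another
// function call: save state, push a return address, jump.
struct Renderer { void draw()  { std::puts("redraw the menu"); } };
struct Menu     { Renderer* r; void open()  { r->draw(); } };
struct Remote   { Menu* m;     void press() { m->open(); } };

int main() {
    Renderer renderer;
    Menu     menu{&renderer};
    Remote   remote{&menu};
    remote.press();  // three hops to do one simple thing
    return 0;
}
```

In fairness, a compiler will usually inline a chain this trivial; the layering that hurts in shipped firmware tends to involve virtual dispatch, allocation, and hops across module boundaries that the optimizer cannot see through.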

I am not saying hardware is never at fault. But hardware is really dumb--only a few control bits, say a DAC and an ADC. Those either work or they fail, and the failure is not intermittent; it fails very noticeably, and you don't have to power-cycle it to reset it all the time. Hardware is dumb like a rock.
 
Last edited:
  • Love
Likes harborsparrow
  • #47
The software in automobiles now is just plain scary sometimes. And self-driving cars are coming soon. Think about that!
 
  • Like
Likes yungman
  • #48
harborsparrow said:
The software in automobiles now is just plain scary sometimes. And self-driving cars are coming soon. Think about that!
Do you know I had to order the car specially just to get the lowest level of equipment? I don't have intelligent braking, no driving assist. Still, it's like that. Funny thing is, they advertise that the new ML can "hop" out of a pothole. Imagine when that goes south: you could be driving down the road hopping like a lowrider! The 2014 is not any inferior in high tech; it's just more reliable.

No, I will NEVER buy a self-driving car. If I get too old, I'll take Uber. Just look at the Boeing 737 MAX.

I am seriously thinking about a Tesla next. I don't drive that much, particularly during this virus, and stupid gas cars die if you don't drive them or only drive short distances; I have to make up a reason to take a longer drive every other week to keep them alive. An electric car doesn't have that problem... BUT then, the Tesla?! Another gadgety one I don't need.
 
  • Haha
Likes harborsparrow
  • #49
Don't mistake me for anti-technology. I have a lot of patience with computers; that's understandable. I just don't have patience with "appliances". Cars, printers, TVs, etc. are, to me, appliances meant to make my life easier, not more challenging. I PAID for them to make my life easier.
 
  • Love
Likes harborsparrow
  • #50
Well--@yungman--you should have read the report from the 1980s when several Bell Labs engineers had to rush to testify before the U.S. Congress to try to get them NOT to build lasers that could shoot things from space. Bell Labs had officially lobbied for the project because the company stood to get a lot of money out of it, but the engineers rebelled and essentially told Congress: "Are you effing out of your MINDS?" They cited all the rockets that had gone off course and blown up in the early days of space travel due to programming errors, all the ongoing failures in software, and the absolute foolishness it would take to trust software not to shoot at the wrong thing, etc. But funding finally did that project in, not wisdom. Thank goodness it never happened.
 
  • Like
Likes yungman