One striking feature of this discussion is the widespread misconception among undergraduates that they can distinguish between good and bad teachers while taking a course. Since all they have is their own experience in the moment, they tend to base their assessment on how much they like the prof, how easy it is to reach their grade goals, how comfortable they are in the class, and how well they think they are learning at any given point.
More objective assessments of how well they learned are not usually available until after a course is completed and there has been ample opportunity for the learning objectives to be demonstrated (or not) in other settings: downstream courses, standardized tests, the working world, and so on. Performance in Calculus and Physics is a much better indicator of how much one learned in precalc than one's feelings about the prof during the precalc course itself.
I was put in charge of the precalc course at the Air Force Academy. The admins there were clear that the priority was preparing those students for the challenging STEM core rather than winning a popularity contest. Since the US taxpayer pays roughly $100k a year for each student to attend, the admins were concerned with the return on that investment and with minimizing the dropout rate among students who arrive at USAFA too weak in math to start in Calculus. All USAFA students are required to pass two semesters each of Calculus, Physics, and Chemistry and seven semesters of engineering. The faculty who taught the course before a colleague and I redesigned it received very favorable student evaluations, but success in downstream courses (the STEM core) and subsequent graduation rates for the cohort who started in precalc (rather than Calculus) were very low.
Most of this cohort were recruited athletes with weaker high school math preparation than is typical at selective schools. In some cases, nearly all of their high school math background was weak, and we had only a single semester to fix it. Since many of these students had slid by in high school because they were athletes, they brought with them unreasonable expectations of favoritism. To get them working hard quickly, we needed to do things like have them sit in the hall and finish assignments when they came to class unprepared, contact their military chain of command if they neglected homework, and contact a representative of their sports team when they performed poorly. I must say, support from the Athletic Department was outstanding, and nothing gets a student athlete's attention like his coaches. However, the "boot camp" approach to a math class did not win many popularity contests with the students. The faculty who taught the course after the revisions scored much lower on student evaluations than the faculty who taught it before. However, success rates in downstream courses, including Calculus, Physics, and Engineering Mechanics, skyrocketed, and the admins were very pleased.
In hindsight, I think the downstream success was less about the actual math they learned and more about the fact that they learned to work hard and actually do ALL the assigned homework, making use of the available resources when help was needed. They learned that they would not be given a pass on their academic work because they were special. Some of those students have even gone on to graduate school.