Purchased a Flat Screen TV: Picture Not That Great

Discussion Overview

The discussion revolves around the perceived poor picture quality of a newly purchased 46" Insignia LCD television, particularly when viewing standard definition cable content. Participants explore various factors that could affect picture quality, including signal type, connection methods, and television settings.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Conceptual clarification

Main Points Raised

  • The original poster (OP) expresses disappointment with the picture quality, noting it appears blurry while watching standard definition content.
  • Some participants suggest that the OP may not be receiving an HD signal, which could explain the lack of clarity.
  • Questions are raised about whether the cable connection is analog or digital, and how this might impact the viewing experience.
  • One participant mentions that standard definition channels often look worse on LCD or plasma TVs compared to CRTs due to the larger screen size magnifying imperfections.
  • Another participant proposes that the TV's picture settings may need adjustment, as they could be set to a mode that does not optimize for home viewing conditions.
  • Suggestions are made to compare the new TV with an older model side by side to evaluate the differences in picture quality directly.
  • Concerns are raised about the limitations of standard definition video quality, particularly in relation to the capabilities of the new TV.
  • Some participants discuss the potential impact of the aspect ratio settings on picture quality, recommending adjustments to avoid stretching the image.
  • Technical details are provided regarding how standard definition signals are processed by digital TVs, which may contribute to the perceived quality issues.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the cause of the poor picture quality. There are multiple competing views regarding the impact of signal type, television settings, and the inherent limitations of standard definition content on modern displays.

Contextual Notes

Participants note that the quality of standard definition signals can vary significantly, and that the OP's expectations may not align with what is realistically achievable given the equipment and signal type. There is also mention of potential differences in picture quality between various TV technologies (LCD vs. CRT).

Who May Find This Useful

Individuals experiencing similar issues with picture quality on new televisions, particularly when using standard definition signals, may find this discussion relevant. Additionally, those interested in optimizing their TV settings or understanding the impact of signal types on viewing experience could benefit.

Saladsamurai
Hi folks :smile:

I am thinking that there are plenty of people here that have these and was hoping for some input. I bought a 46" Insignia (Best Buy's own brand), 60 HZ, 1080p, LCD television. I got it on sale at Best Buy for $399 and it had fairly decent reviews (average of 4.3 stars out of 5 with a total of 40 reviews).

I got it hooked up last night and, to be honest, I am not too impressed with the picture quality. I don't think my expectations are too high in expecting a fairly sharp picture. The picture seems a little blurry. I am watching documentary-style TV on the Science Channel, so I do not think the refresh rate has anything to do with it.

Does my TV need to go through a 'warm-up' period? I am connected using a coaxial connection and just using it to watch standard definition cable. Is it possible that the cable coming from the wall could be sub-quality? I am not getting any 'static' type effects, so I assumed that the cable was fine.

I was thinking that I could do some tests. I think if I play a new DVD that is not in HD, this should set an upper bound on what I can ever hope to get from standard definition cable.

Any thoughts on this? Are there any flaws to that test?

Thanks y'all!
 
Are you sure you are watching HD picture, and not a standard TV picture rescaled to HD?

Edit: ah, I see - standard definition cable probably means just a low-res TV signal. You need an HD signal to see what your TV is really able to show. Even DVD will look rather lousy.
 
Is this a residential cable TV connection at home, or a connection in a dormitory where the university buys the service for you and may be modifying the signals? Do you know whether the channels are analog or digital? Standard definition can be either digital or analog.

If this is at home, most cable TV companies in the US encrypt all of their digital channels (whether HD or SD) except for the channels that you could have (at least in principle) gotten over the air with an antenna: ABC, CBS, NBC, etc. You should be able to get those channels in HD without a cable box, but they probably have different channel numbers than you're used to.

If it's a dormitory setup, the university may be passing along the standard-definition digital channels, or only the analog channels, or they may even convert digital to analog.

In any case, most people tend to be disappointed with the way analog channels look on an LCD or plasma TV. It may be simply because the larger picture magnifies the imperfections in the picture, or maybe digital displays simply don't do as well as CRT TVs with analog video.

Also, I think standard-definition digital video from cable and satellite TV providers tends to be kind of crappy to begin with. They don't provide enough bits to do the picture full justice. Analog TVs (CRTs) smooth out the crappiness a bit, but digital TVs (LCD and plasma) don't.

At home, I get my TV over the air with an antenna, but I've had some experience with cable and satellite setups in motels while traveling, and this fits with my experiences then.
 
OP said standard analog cable...sorry, but an HDTV can't fix a poor quality signal.

DVDs should look great though (no, SD cable is nowhere close to DVD quality). Or stream some internet video.
 
I think it's actually digital. Sorry if the OP was ambiguous. It's a coax connection, but I believe Comcast in Massachusetts moved to all-digital a couple of years back. Analog receivers don't work anymore. My parents' living-room TV seems to get a better-quality picture even though it's standard def in that room too. Their TV is plasma, but I'm not sure how much difference that makes when the programming is only SD.
 
If you're "stretching" those standard-definition channels horizontally so they fill the screen, try setting your TV to display them in the proper 4:3 aspect ratio, with black bars on the sides. That might improve the picture a little bit.
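The arithmetic behind this suggestion can be sketched in a few lines of Python (the 1920x1080 panel size is an assumption for illustration; any 16:9 display gives the same ratios):

```python
# How a 4:3 standard-definition picture maps onto a 16:9 panel.
# Assumes a 1920x1080 display; any 16:9 panel gives the same ratios.
panel_w, panel_h = 1920, 1080

# Proper 4:3 presentation: fill the full height, keep the aspect ratio.
pic_w = panel_h * 4 // 3              # 1440 pixels wide
bar = (panel_w - pic_w) // 2          # 240-pixel black bar on each side

# "Wide"/stretch mode instead scales the same picture to the full
# 1920 width, widening everything by the same factor.
stretch = panel_w / pic_w             # ~1.33x horizontal distortion

print(pic_w, bar, round(stretch, 2))  # prints: 1440 240 1.33
```

So in "wide" mode every object on screen is a third wider than it should be, which is why switching back to 4:3 with side bars can look noticeably sharper and more natural.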
 
jtbell said:
If you're "stretching" those standard-definition channels horizontally so they fill the screen, try setting your TV to display them in the proper 4:3 aspect ratio, with black bars on the sides. That might improve the picture a little bit.

Yeah. I have them stretched in 'wide' mode. That's the way my downstairs TV is set up. Maybe my expectations are too high. I'm setting the bar according to what I see on the living-room TV. It's definitely better down there.
 
Saladsamurai said:
Maybe my expectations are too high. I'm setting the bar according to what I see on the living room tv. It's definitely better down there.

Carry your new set downstairs, find a coax splitter, and hook both sets up side by side to the same signal. That should eliminate some questions about the connection and let you directly compare the two sets. You can also Google for information on adjusting your brand of set and see if that helps.

After all that, if you are only getting standard-def channels from Comcast and have any local over-the-air high-def channels available, get a couple of pieces of 20-foot copper wire, bare one end of each, and gently slip the bare ends into the coax RF inputs on the backs of the two sets to make two temporary HD antennas. Then scan for channels on each set and tune both to the same channel, one with satisfactory signal strength and at least the native resolution of the set.

A good-quality HD set (like the better Samsung, Sony, or LG models) with a good HD signal, compared to a ten-year-old, somewhat fuzzy CRT with an SD signal, can show how much better HD can look. Or it can show just how many artifacts and stray bits there are in a Comcast SD signal that are invisible on an old set and really obvious on a nice new Samsung.
 
Bill Simpson said:
You can also Google for information on adjusting your brand set and see if that helps.

That's a good idea. The picture settings on your new TV are probably not optimized for a good, "natural" looking picture in home viewing conditions, and are likely different from the old TV. They're often set by default so as to stand out in a store display. I've seen it called "torch mode." When I got my LCD HDTV several years ago, I spent a few days experimenting with the picture controls (brightness, contrast, etc.) until I got something that looked natural to me. This was with good HD signals, though. Colors and brightness levels tend not to be as consistent on analog or SD digital channels.
 
NTSC standard def is going to look bad on a digital TV because the 4:3 picture comes from a 720x480-pixel source with a pixel aspect ratio of 0.9 (width) to 1 (height). A 1920x1080 digital TV should display the image as a 1440x1080 image, but vertically each source line has to cover 2.25 output lines, which means most of the displayed lines of pixels are some type of averaging of adjacent source video lines. Another option would be to display a cropped 1280x960 image, but then the horizontal pixel multiplier is 1.777 (16/9), which is even more awkward.
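The scaling ratios described above can be checked with a short Python sketch (using the display sizes from the post):

```python
# Upscaling ratios for a 720x480 NTSC source on a 1920x1080 panel.
src_w, src_h = 720, 480

# Full-height 4:3 presentation at 1440x1080:
h_scale = 1440 / src_w    # 2.0  -> clean integer doubling
v_scale = 1080 / src_h    # 2.25 -> non-integer, so most output lines
                          #         are blends of adjacent source lines

# Cropped 1280x960 alternative: vertical becomes an exact 2x, but the
# horizontal multiplier is now 16/9 ~= 1.778, which is even messier.
h_scale_crop = 1280 / src_w
v_scale_crop = 960 / src_h

print(h_scale, v_scale, round(h_scale_crop, 3), v_scale_crop)
# prints: 2.0 2.25 1.778 2.0
```

The non-integer vertical factor is the key point: whenever a scale factor isn't a whole number, the scaler has to interpolate between source lines, which softens the picture.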

Even with an HD source, the cheaper LCD screens seem to have a limited color palette, so subtle shading is lost and the picture takes on a sort of washed-out look.

This is why I've kept my old analog CRT rear-projection HDTV (Mitsubishi last made these in 2006; there are three CRT projectors, red, green, and blue, which need to be calibrated occasionally to keep them in sync), since standard-def 480p is one of its "native" modes. An analog display just changes beam width and sweep rate to deal with standard def versus high def broadcasts, and analog means the color palette isn't an issue.
 