Questions about a TV Tuner Card

Discussions about anything Computer Hardware Related. Overclocking, underclocking and talk about the latest or even the oldest technology. PCA Reviews feedback
Mike89
Senior Member
Posts: 457
Joined: Sat Nov 25, 2000 3:04 am
Location: California

Questions about a TV Tuner Card

Post by Mike89 »

I thought I'd post some questions for the techies here to help me better understand about TV Tuner Cards.

I just bought and installed the Winfast TV2000XP tv tuner card.

I wanted the capability to have TV on my computer and heard this one was about the best one to get.

I have cable TV at my house and had another jack installed in my computer room.

These are my questions.

I have the option of watching TV either windowed (I can stretch it any size I want) or full screen.

Now I didn't really know what to expect, but I notice that the picture quality is not as good as on a regular TV. I was surprised at this because, with an AMD 2000+, a GeForce4 4600, and the superior quality of a monitor vs. a TV, I just expected it to look better, not worse.

Why is this exactly?

I mean it looks decent and I wouldn't take the card out, but it sure could be better. On close-up shots, it looks fine. But watching something like a baseball game, or anything with lots of stuff in the background, it looks kind of blurry. Watching a pitcher from behind, I can't even read the name on the back of his shirt (when he is not right up close), whereas with a TV, it's nice and clear. I catch myself wanting a focus knob to mess with.

Is this as far as things have progressed? I wouldn't think this would be that big a deal to get right. Maybe I'm not understanding how a vid card works as opposed to a TV. I mean play a game on a PC vs a console and the clarity of the PC just stomps on the console. I expected this same sort of thing here but sure didn't get it.

I have been talking to Leadtek tech support about some things but it appears they are learning as they go and don't know much more than I do. Heh heh.

I mean the manual says it supports 'mono', 'stereo', and 'SAP', but all I get is 'mono' (my cable carries all three). The guy at Leadtek says it doesn't support it. Funny, the manual says one thing and he says another. In the manual, the on-screen control panel shows the three indicators, and even the remote has a button to cycle through them, but only 'mono' displays on the screen when cycling. He had no answer for that and I think he is now doing some research.

Anyway, any comments I can get on this subject will be appreciated. Thanks.
TruckStuff
Golden Member
Posts: 1056
Joined: Thu Feb 07, 2002 5:17 pm
Location: Dallas, TX

Post by TruckStuff »

It's supposed to be that way. Computer monitors display images on a pixel basis and TVs display images on a line basis. As such, TV signals are broadcast only as lines rather than pixels. A DVD will do the same thing if you get a DVD-ROM in your box. The "fuzz" you see is the card converting that line-based signal from the cable into a pixel-based image for your monitor. It's not that your card is bad; it's a limitation of the technology. This will be the case for the foreseeable future, meaning that HDTV or any other new technology won't change it, because they are still broadcast on a line-by-line basis rather than a pixel-by-pixel basis.
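
A rough Python sketch, purely for illustration (the sample counts are made-up assumptions, not what the Winfast or any other card actually uses), of how resampling a line-based signal into a fixed grid of pixels softens a sharp edge:

def resample_scanline(samples, out_width):
    # Linearly interpolate one scanline's samples into out_width monitor pixels.
    n = len(samples)
    out = []
    for x in range(out_width):
        pos = x * (n - 1) / (out_width - 1)   # where this pixel falls on the source line
        i = int(pos)
        frac = pos - i
        nxt = samples[min(i + 1, n - 1)]
        out.append(samples[i] * (1 - frac) + nxt * frac)  # blend the two nearest samples
    return out

# A perfectly sharp black-to-white edge in the source line...
line = [0.0] * 320 + [1.0] * 320
# ...picks up in-between grey values once it's rescaled to 1024 pixels,
# which is the soft look compared with a TV drawing the line directly.
pixels = resample_scanline(line, 1024)
print([round(v, 2) for v in pixels[508:516]])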
Mike89
Senior Member
Posts: 457
Joined: Sat Nov 25, 2000 3:04 am
Location: California

Post by Mike89 »

Thanks for the info. That's a bummer; I had just hoped it would be better. It makes me think twice about getting a DVD player, as I don't particularly like the idea of watching a movie that is blurred out. I'll just stick to watching movies on the TV.

I got a reply from Leadtek on the stereo, SAP thing. It appears that the US version of the card does not support stereo or SAP.
FlyingPenguin
Flightless Bird
Posts: 33162
Joined: Wed Nov 22, 2000 11:13 am
Location: Central Florida

Post by FlyingPenguin »

I've yet to see ANY TV tuner card look anything but like crap. The two video standards just aren't compatible.

The main problem is that broadcast video is INTERLACED while computer monitors are not any longer (we used to use interlaced monitors in the early days).

Interlaced means that on a TV first the odd lines are drawn, then the even lines. This induces a certain amount of flicker and leads to optical distortions like the rainbow of colors in a striped tie.
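
To make the odd/even-line idea concrete, here is a tiny Python illustration (a toy example with assumed numbers, not how any real capture chip works). The two fields of one frame are grabbed about 1/60 of a second apart, so alternating lines can disagree about anything that moves:

def split_into_fields(frame_lines):
    # An interlaced "frame" is really two fields: one holding the odd-numbered
    # scanlines and one holding the even-numbered ones.
    odd_field = frame_lines[0::2]
    even_field = frame_lines[1::2]
    return odd_field, even_field

# Two snapshots of a moving object, 1/60 s apart:
odd_field, _ = split_into_fields(["ball at x=10"] * 8)   # field from time t
_, even_field = split_into_fields(["ball at x=14"] * 8)  # field from time t + 1/60 s

for line_pair in zip(odd_field, even_field):
    print(line_pair)   # ('ball at x=10', 'ball at x=14') -- neighbouring lines disagree,
                       # which shows up as flicker and fringing on moving edges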


HDTV COULD HAVE been an improvement but the IDIOTS blew it. American HDTV will also be interlaced. Frankly, in my opinion, HDTV is a waste of money if it's interlaced - there's no way it'll ever look as good as non-interlaced.

Europe and (I think) Japan went with non-interlaced. This is ideal as it's 100% compatible with a computer monitor - you just resize to fit the screen and deal with refresh issues.

Converting an interlaced video signal to non-interlaced is a major problem involving a lot more hardware and the result is never satisfactory.
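
The two cheap conversion tricks are easy to sketch in Python (again a toy illustration under my own assumptions, not the algorithm any particular card uses), and each one shows why the result disappoints:

def bob_deinterlace(field):
    # "Bob": build a full frame from a single field by doubling each line.
    # Smooth motion, but only half the vertical detail -- a soft picture.
    frame = []
    for line in field:
        frame.append(line)
        frame.append(line)   # the missing line is just a copy of its neighbour
    return frame

def weave_deinterlace(odd_field, even_field):
    # "Weave": interleave the two fields into one frame.
    # Full detail on still scenes, but comb artifacts on anything that moves,
    # because the two fields were captured at different instants.
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)
        frame.append(even_line)
    return frame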


I use the WinTV USB TV tuner. It works okay, as far as these things go.
---
“The Government of Spain will not applaud those who set the world on fire just because they show up with a bucket.” - Prime Minister of Spain, Pedro Sánchez

Mike89
Senior Member
Posts: 457
Joined: Sat Nov 25, 2000 3:04 am
Location: California

Post by Mike89 »

Very interesting.

Why would the US go with interlaced instead of non-interlaced? There must have been a reason for the choice. Was it a matter of simpler vs better?

So you're saying that if the US broadcast standard had been non-interlaced, TV tuner cards would perform much better and look as good as a TV?

What about the pixel vs lines thing that Truckstuff mentioned?
FlyingPenguin
Flightless Bird
Posts: 33162
Joined: Wed Nov 22, 2000 11:13 am
Location: Central Florida

Post by FlyingPenguin »

Mike89 wrote:
Why would the US go with interlaced instead of non-interlaced? There must have been a reason for the choice.

Frankly, it was the fact that there were TOO MANY people involved. Anything designed by committee is destined to be a joke.

There were two competing standards the US was looking at, and (if I remember) we went with one from an American company (Philips?) which was interlaced. The competing standard (and I believe the one Europe is using) was from a Japanese company. I'm sure some congressional "Soft Money" (or whatever the polite term for "Bribery" is nowadays) was involved.

Also, the film industry put its big foot in where it didn't belong. They had concerns about the image quality being TOO good. They don't want to chase people from theatres, and they have their continuing concerns about copyright infringement (they're leery of making it TOO easy to make computer and broadcast video standards compatible, in the belief that it will encourage more piracy).


I'm speculating, but from an engineer's perspective, and for the ability to integrate existing systems with the new one, it's probably more attractive to stay with interlaced.


Computer and video card manufacturers DESPERATELY wanted to see the US go with non-interlaced. It would make it EXTREMELY practical to build TV capabilities into computers without a LOT of ancillary hardware to convert the signals. Dual-head video cards would be MUCH cheaper and easier to make, for instance.

The computer people would like to eventually see a merging of consumer entertainment products into computers as an intelligent multimedia entertainment system. THEN we would finally see a 40" console TV that could also double as a computer monitor (or even have a computer system built into it) without the quality suffering (as it has in the lame attempts so far to interface a computer with a regular TV).

So once again the Japanese will have all the REALLY cool toys before we do (we're 10 years behind the Japanese in cell phone and PDA integration because, again, of stupid committees making decisions based on politics and national prestige instead of practical engineering design). Almost every teenager in Japan has a cell phone that is essentially a PDA.


The lines versus pixels thing isn't really an issue. In the end, your computer's monitor (not LCDs, though) also draws the image line by line. That's not an insurmountable problem. On all modern TVs there is a VIRTUAL horizontal resolution that can correspond to a computer screen's resolution.
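
Some back-of-the-envelope numbers (standard NTSC/BT.601 figures, nothing specific to the Winfast card) that show the lines map onto an ordinary pixel grid just fine:

TOTAL_LINES      = 525            # lines per NTSC frame (interlaced, two fields)
VISIBLE_LINES    = 480            # roughly what's left after blanking
SAMPLES_PER_LINE = 720            # common digitizing rate (ITU-R BT.601)
FRAME_RATE       = 30000 / 1001   # ~29.97 frames per second

pixels_per_frame = SAMPLES_PER_LINE * VISIBLE_LINES
print(pixels_per_frame)                      # 345600 -- i.e. a 720x480 pixel grid
print(round(pixels_per_frame * FRAME_RATE))  # ~10.4 million pixels to convert every second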

That's no problem at all - look at the SGI graphics workstations or other computers designed to interface with broadcast television, used by sports channels and news networks. At my old job we used $10,000 Matrox broadcast video cards to display computer graphics for television broadcast - the graphics shown on the TVs at horse tracks (this, by the way, to reference another thread elsewhere, is the "niche" market that everyone thinks Matrox is not making money on and thus MUST get into 3D gaming - LOL!).


Bottom line is you CAN get crystal clear TV reception on your computer, and send it from your computer to a TV - BUT the hardware needed to do it right (not the half-assed way a $150 computer video card or TV tuner card does it) is VERY EXPENSIVE because of the incompatibility of the two systems - and the biggest hassle is converting to and from interlaced video.

That hardware is used to generate computer graphics for everything from sporting events to the weather map on the news. TRUST ME, they aren't using an All-in-Wonder card to get video out from the computer that displays the weather maps on the Weather Channel. That's some pricey hardware.
---
“The Government of Spain will not applaud those who set the world on fire just because they show up with a bucket.” - Prime Minister of Spain, Pedro Sánchez
