One channel = 720p instead of 1080p

Ask fellow members about Ceton's infiniTV tuners here.
Forum rules
Ceton no longer participates in this forum. Official support may still be handled via the Ceton ticket system.
Sleerts

Posts: 31
Joined: Mon Jan 28, 2013 3:13 pm

One channel = 720p instead of 1080p

#1

Post by Sleerts » Tue Aug 27, 2013 11:43 am

Ticket ID: IXO-830-77084

Looking for some assistance as I'm stuck between Ceton and Comcast on this issue...

I've got Digital Economy with HD via Comcast, which gives me 30+ HD channels. All of them work great in Windows Media Center with the Ceton InfiniTV 4 tuner, on Windows 8 Pro with an NVIDIA 660 Ti 2GB and 16GB of RAM.
All channels come in at 1080p except FOX (234, 1005), which comes in at 720p. I never noticed it until football season returned and the picture looked bad, so I'm not sure when it started.

I've run the diagnostic tools and attached the results. I've also checked the settings for each tuner:

Tuner 1: signal level = 0.7, signal-to-noise = 35.8, temp = 47.9C
Tuner 2: signal level = -0.2, signal-to-noise = 36.6, temp = 47.0C
Tuner 3: signal level = -1.4, signal-to-noise = 36.6, temp = 48.0C
Tuner 4 (manually tuned to 234): signal level = -2.9, signal-to-noise = 35.8, temp = 47.9C
CetonInfiniTVDiagnostic_HTPC_20130827_07-36.zip
(510.26 KiB) Downloaded 49 times
HTPC Noob....Learning tho

JohnW248

Posts: 786
Joined: Fri Jul 20, 2012 7:23 pm

#2

Post by JohnW248 » Tue Aug 27, 2013 1:25 pm

Nothing you can do. Fox and ABC transmit at 720p; CBS and NBC at 1080i. Those are the standards that were available, and that's what they chose. Some of the cable-channel step-children of these networks will also be 720p.

This goes back to the original digital broadcast standards and what each broadcaster could choose. 720p allows more subchannels, and in the case of Fox, they felt it gave smoother motion on sports broadcasts.
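For the curious, here's the arithmetic behind that trade-off (my own back-of-envelope numbers, not from this thread): 720p is sent as 60 full progressive frames per second, while 1080i is 60 interlaced fields, i.e. 30 complete frames per second, so 720p trades spatial resolution for temporal smoothness at a similar overall pixel rate.

# Rough pixel-throughput comparison of the two ATSC HD formats.
pixels_720p60 = 1280 * 720 * 60      # 60 full frames/s  -> 55,296,000 pixels/s
pixels_1080i30 = 1920 * 1080 * 30    # 60 fields/s = 30 frames/s -> 62,208,000 pixels/s
print(pixels_720p60, pixels_1080i30)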

kmp14

Posts: 138
Joined: Sat Sep 08, 2012 7:23 pm

#3

Post by kmp14 » Tue Aug 27, 2013 1:31 pm

I may be over-simplifying it, and you may have a problem I don't understand, but I DO KNOW that Fox is a 720p channel. In other words, they broadcast their HD feed in 720p. ABC and ESPN are also 720p channels. It's possible that Comcast is converting some 720p channels to 1080i, but I doubt it.

Check ABC and ESPN; my guess is you'll find that they are 720p as well.

It doesn't sound like you have a problem, other than not liking the quality of Fox's signal (or Comcast's, since they may be compressing the signal).

joelkirzner

Posts: 128
Joined: Wed May 22, 2013 3:18 pm

#4

Post by joelkirzner » Tue Aug 27, 2013 2:21 pm

The previous replies are all correct.
I have Comcast as well, and Fox and some other channels have always broadcast in 720p, NOT 1080i.
Now, the quality of the actual picture depends on how it's compressed. As you've discovered, Fox sports can look pretty awful; for me it's almost unwatchable at times. That's most likely Comcast's fault, in how they compress the signal through their pipes.

This has nothing to do with Ceton InfiniTV.

Sleerts

Posts: 31
Joined: Mon Jan 28, 2013 3:13 pm

#5

Post by Sleerts » Tue Aug 27, 2013 2:26 pm

Thanks guys, I assumed it had nothing to do with the Ceton InfiniTV.
HTPC Noob....Learning tho

joelkirzner

Posts: 128
Joined: Wed May 22, 2013 3:18 pm

#6

Post by joelkirzner » Tue Aug 27, 2013 2:35 pm

One way I've confirmed it has nothing to do with the Ceton is that I also have a plain-Jane Comcast HD box.

Aside from some slight color-saturation and sharpness differences (computer-HDMI-TV vs. Comcast box-component-TV settings), the picture is identical when switching from the Ceton to the box.

richard1980

Posts: 2623
Joined: Wed Jun 08, 2011 3:15 am

#7

Post by richard1980 » Tue Aug 27, 2013 4:59 pm

The tuner doesn't do video processing... that would be the responsibility of the GPU, not the tuner.

tzr916

Posts: 445
Joined: Tue May 28, 2013 11:56 pm
Location: Stockton CA

#8

Post by tzr916 » Wed Aug 28, 2013 4:56 pm

Question:
My Echo has an option called "NATIVE" that is supposed to change the output format of the box to match whatever the channel is broadcasting. So if I'm watching a 1080i channel, my TV gets a 1080i signal from the box; change to a 720p channel and my TV gets a 720p signal, and so on.

Does WMC on my HTPC have this option?

joelkirzner

Posts: 128
Joined: Wed May 22, 2013 3:18 pm

#9

Post by joelkirzner » Wed Aug 28, 2013 9:03 pm

tzr916 wrote:Question:
My Echo has an option called "NATIVE" that is supposed to change the output format of the box to match whatever the channel is broadcasting. So if I'm watching a 1080i channel, my TV gets a 1080i signal from the box; change to a 720p channel and my TV gets a 720p signal, and so on.

Does WMC on my HTPC have this option?
I'm not entirely sure about the NATIVE option on the Echo... I installed an Echo a couple months ago and returned it soon afterward. My Media Center HTPC always outputs a 1080p signal to the TV (the highest resolution the TV has). If I hit "411 info" on my Media Center remote while watching live TV and scroll over to the last information screen, there is a video-size line and a display-size line. Most HD channels show 1920 x 1080. For Fox and some others, the video size will be 1280 x 720 and the display size will be 1920 x 1080 (I think). I may be wrong about this... I'll have to check when I get home. So WMC may be upconverting the picture to fill the screen.
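If you'd rather confirm a channel's native size outside of WMC, one option is to run FFmpeg's ffprobe against a file in your Recorded TV folder. A minimal sketch (assumes ffprobe is installed and on your PATH; the recording path is made up for illustration):

import json
import subprocess

# Hypothetical path to a WMC recording; substitute one of your own.
recording = r"C:\Users\Public\Recorded TV\Football on FOX.wtv"

# Ask ffprobe for the first video stream's dimensions and field order.
out = subprocess.check_output([
    "ffprobe", "-v", "error",
    "-select_streams", "v:0",
    "-show_entries", "stream=width,height,field_order",
    "-of", "json", recording,
])
stream = json.loads(out)["streams"][0]
print(stream["width"], "x", stream["height"], stream.get("field_order", "?"))
# A Fox recording should report 1280 x 720 (progressive), while
# CBS/NBC recordings should report 1920 x 1080 (interlaced).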

barnabas1969

Posts: 5738
Joined: Tue Jun 21, 2011 7:23 pm
Location: Titusville, Florida, USA

#10

Post by barnabas1969 » Wed Aug 28, 2013 9:45 pm

tzr916 wrote:Question:
My Echo has an option called "NATIVE" that is supposed to change the output format of the box to match whatever the channel is broadcasting. So if I'm watching a 1080i channel, my TV gets a 1080i signal from the box; change to a 720p channel and my TV gets a 720p signal, and so on.

Does WMC on my HTPC have this option?
The short answer: No.

The long answer:

All digital TVs (LCD, DLP, or plasma) have a fixed "native" display resolution; in other words, there is a fixed number of pixels on the panel. Most newer TVs have a native resolution of 1920x1080 (1080p). Some older or less-expensive TVs use 1280x720 (720p). All digital TVs are progressive displays (the "p" in 720p/1080p; the "i" means interlaced). Only analog (CRT) displays are capable of multiple resolutions and/or true interlaced display, and analog displays are becoming rare.

Everything below refers to "digital" displays.

If the video content sent to the TV doesn't match the panel's native resolution, the TV must scale and/or de-interlace it to fit the panel, using its built-in scaler/de-interlacer.

When the source is a consumer device like a DVD or Blu-ray player, you need to (subjectively) determine which device (the TV or the player) has the better scaler/de-interlacer. From now on, let's refer to them as the "display" (TV) and "playback" (DVD/Blu-ray) devices. FYI: your PC is a playback device.

For the remainder of this post, let's assume you have a 1080p display device. Obviously, if your display can only do 720p, substitute 720p below.

If you choose to let the playback device scale/de-interlace the image to the display's native resolution, set the playback device to always output 1080p. If you choose to let the display device do the scaling/de-interlacing, set the playback device to output the same resolution and/or frame rate as the source material.

The same logic applies to a cable box, satellite box, or DVD/Blu-ray player: if the display device has the better scaler/de-interlacer, set the playback device to output the content as-is so the display can do the work; if the playback device has the better one, set it to always output the display's native resolution.

Unfortunately, Windows Media Center does not have the ability to output content as-is, so you must use the playback device's (here, the PC's) scaler/de-interlacer.

But this isn't bad news. Most graphics processors (GPUs) are more powerful than the one built into your TV. If you've chosen a GPU suitable for an HTPC, your PC already contains a scaler/de-interlacer that is far more powerful than the one in your TV.
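To make that concrete, here's a rough offline approximation of the GPU's real-time work, using FFmpeg's yadif (de-interlace) and scale filters. A sketch only (filenames are made up; assumes ffmpeg is on your PATH):

import subprocess

# De-interlace 1080i into 1080p60 (one frame per field) and scale to the
# display's native 1920x1080, roughly what the HTPC's GPU does on the fly.
subprocess.run([
    "ffmpeg", "-i", "recording.wtv",
    "-vf", "yadif=mode=send_field,scale=1920:1080",
    "-c:v", "libx264", "-crf", "18",
    "deinterlaced_1080p.mp4",
], check=True)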

What does this mean for you? It means you can let your TV display exactly what it receives, without any scaling or de-interlacing. Some TVs (not all) have an option for 1:1 pixel mapping, which goes by different names on different brands; on Samsung TVs it's called "Screen Fit".

richard1980

Posts: 2623
Joined: Wed Jun 08, 2011 3:15 am

#11

Post by richard1980 » Wed Aug 28, 2013 11:34 pm

barnabas1969 wrote:Most graphics processors (GPUs) are more powerful than the one built into your TV. If you've chosen a GPU suitable for an HTPC, your PC already contains a scaler/de-interlacer that is far more powerful than the one in your TV.
Unless he's got a very crappy TV, the video-processing ability of his TV is going to be superior to even the best HTPC GPU. Case in point: the 29/59 issue.

barnabas1969

Posts: 5738
Joined: Tue Jun 21, 2011 7:23 pm
Location: Titusville, Florida, USA

#12

Post by barnabas1969 » Wed Aug 28, 2013 11:52 pm

richard1980 wrote:
barnabas1969 wrote:Most graphics processors (GPUs) are more powerful than the one built into your TV. If you've chosen a GPU suitable for an HTPC, your PC already contains a scaler/de-interlacer that is far more powerful than the one in your TV.
Unless he's got a very crappy TV, the video-processing ability of his TV is going to be superior to even the best HTPC GPU. Case in point: the 29/59 issue.
The graphics processor in a TV is designed for a single purpose. The GPU in a computer is a "Swiss Army knife" designed for many purposes. The 29/59 issue is only one aspect of the whole picture. A computer's GPU is much more powerful, overall, than the graphics processor in most TVs. The OP was asking specifically about scaling and de-interlacing, not the ability of the GPU to switch modes, as seen with the 29/59 issue.

richard1980

Posts: 2623
Joined: Wed Jun 08, 2011 3:15 am

#13

Post by richard1980 » Thu Aug 29, 2013 12:41 am

barnabas1969 wrote:The graphics processor in a TV is designed for a single purpose.
That's my point... the single purpose is video processing. Computer GPUs are certainly more powerful overall, and even more powerful at other specific tasks like gaming. However, for the specific task of processing video, you won't find a consumer-grade PC GPU that performs as well as any decent CE video device such as a TV, STB, or AVR.
barnabas1969 wrote:...not the ability of the GPU to switch modes, as seen with the 29/59 issue.
Unless a TV ignores the flags or performs its own interlace detection, the 29/59 issue will affect the TV the same way it affects the GPU in the computer. If one frame doesn't match the preceding frame, the graphics processor has to change the instruction set used to process that frame, regardless of whether it's in a TV or a computer. The only difference is that CE graphics processors are designed specifically for that kind of task, whereas most computer GPUs are not.

barnabas1969

Posts: 5738
Joined: Tue Jun 21, 2011 7:23 pm
Location: Titusville, Florida, USA

#14

Post by barnabas1969 » Thu Aug 29, 2013 1:17 am

OK, I'll grant you that a TV's scaler/de-interlacer can switch modes faster than the one in a PC's GPU. However, that doesn't mean the TV's video processor will do a better job of scaling and/or de-interlacing a video stream that doesn't have the 29/59 issue.

The difference is subjective. Every person will have his/her own opinion of which graphics processor did the better job.
