Native Resolution....1080i... Confused.

Grimezy

Prolific Poster
I'll try to keep this short. I use a 32 inch Hitachi LCD for my PC at the moment, hooked up via HDMI. When I first got my PC I had problems with the screen being cut off around the sides whenever I set the resolution to 1920x1080. I had to run it at 1766x992 to fit the whole screen on, but then it would only display at 30Hz. I discovered that the zoom settings weren't set up right, so I adjusted them and suddenly I could display properly at 1920x1080.

Whenever I flick onto my HDMI channel, a box pops up saying 1080p 60Hz, which I thought was brilliant. It was only last night, when I was rooting through the Nvidia control panel, that I noticed my native resolution is listed as 1080i, which only has a max refresh rate of 30Hz.

Now I'm utterly confused. My TV tells me it's 1080p and 60Hz. My desktop tells me I'm running 1920x1080 at 60Hz. My Nvidia control panel lets me select and use 1080p at 60Hz with no problems. My games let me select 60Hz. But it says my native resolution will only do 30Hz at 1080i... which makes me assume it's not actually doing any more than 30Hz even if it says it is?

I'm vaguely aware of interlacing, which from what I've read sounds like it runs at 30Hz but magically doubles the frames (which would imply 60Hz, surely?), but I honestly have no clue.
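
To spell out the arithmetic as I understand it, a quick Python scribble (rough numbers, so correct me if I'm wrong):

# 1080i60 sends 60 half-pictures (fields) a second, each carrying only
# the odd OR even lines, so you only get 30 complete pictures a second.
fields_per_second = 60
lines_per_field = 1080 // 2                  # 540 lines per field
full_frames = fields_per_second // 2
print(full_frames)                           # 30

# 1080p60 sends 60 complete 1080-line pictures a second, i.e. double
# the pixel rate of 1080i60:
print(1920 * 1080 * 60)                      # ~124.4 million pixels/s
print(1920 * (1080 // 2) * 60)               # ~62.2 million pixels/s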

Have I been running at 30Hz this whole time? Or have I been overriding my native resolution and magically made it display more frames than it should be able to? Any help would be much appreciated. As some of you may have seen, I'm in the market for a new monitor anyway to get rid of this horrid TV, but I'm not sure when I'll be able to have a standard desk setup due to the current space in the house, so I'm trying to make the most of what I've got at the moment!
 

Drunken Monkey

Author Level
Simple, just follow this equation:
a + b = orange.

[image: apple-orange-banana vector]



seems legit
 

Pagey

Bright Spark
If you can select 60Hz progressive output in the Nvidia control panel, go for it. Interlacing is, as you put it, a sort of scan doubler, but in reality it makes the output look like ****. Setting the output to 60Hz progressive will make a dramatic improvement to the picture.
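
If you want to check what Windows is actually outputting rather than trusting the TV's info box, a quick Python sketch like this should show it. It only uses the standard user32 call, nothing Nvidia-specific, so treat it as a rough check:

import ctypes

ENUM_CURRENT_SETTINGS = 0xFFFFFFFF   # "current mode" magic value (-1 as a DWORD)
DM_INTERLACED = 0x00000002           # set in dmDisplayFlags for interlaced modes

class DEVMODEW(ctypes.Structure):
    # Display-device layout of DEVMODEW (the printer-only fields share a
    # union with the position/orientation fields below).
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", ctypes.c_ushort),
        ("dmDriverVersion", ctypes.c_ushort),
        ("dmSize", ctypes.c_ushort),
        ("dmDriverExtra", ctypes.c_ushort),
        ("dmFields", ctypes.c_ulong),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", ctypes.c_ulong),
        ("dmDisplayFixedOutput", ctypes.c_ulong),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", ctypes.c_ushort),
        ("dmBitsPerPel", ctypes.c_ulong),
        ("dmPelsWidth", ctypes.c_ulong),
        ("dmPelsHeight", ctypes.c_ulong),
        ("dmDisplayFlags", ctypes.c_ulong),
        ("dmDisplayFrequency", ctypes.c_ulong),
        ("dmICMMethod", ctypes.c_ulong),
        ("dmICMIntent", ctypes.c_ulong),
        ("dmMediaType", ctypes.c_ulong),
        ("dmDitherType", ctypes.c_ulong),
        ("dmReserved1", ctypes.c_ulong),
        ("dmReserved2", ctypes.c_ulong),
        ("dmPanningWidth", ctypes.c_ulong),
        ("dmPanningHeight", ctypes.c_ulong),
    ]

dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)
if ctypes.windll.user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS, ctypes.byref(dm)):
    scan = "interlaced" if dm.dmDisplayFlags & DM_INTERLACED else "progressive"
    print(f"{dm.dmPelsWidth}x{dm.dmPelsHeight} @ {dm.dmDisplayFrequency}Hz, {scan}")

If that prints "progressive" at 60Hz, you really are getting 60 full frames a second, whatever the panel's mode list says about the TV's native timing.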
 

Grimezy

Prolific Poster
Thanks guys. Tom, my TV does have a game mode, although I believe it's just a basic 'let's change all your colours' mode rather than a proper game mode.

Thanks Pagey, I'll set it to 1080p in my Nvidia panel then! Still clueless as to why I can override my TV's settings though... Surely if it's 1080i it can't magically convert itself into 1080p, or is it something to do with my graphics card?
 

Pagey

Bright Spark
I'm unsure of what resolutions your TV can actually handle; most of the time it's down to the TV's specs as to what it can actually display. If you still have its manual, it should list all the resolutions it can handle on each connector.
 

Ash

Well-known member
The problem you mentioned about your screen being cut off when you first got your PC, wouldn't Overscan/Underscan in the nVidia control panel fix that?
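
For what it's worth, the odd 1766x992 figure from the first post fits an overscan compensation almost exactly. A rough check in Python (the 8% is just a typical TV overscan crop, not anything from the Hitachi's actual specs):

# Shrinking a 1920x1080 desktop by ~8% to dodge the TV's overscan:
overscan = 0.08
print(round(1920 * (1 - overscan)))   # 1766
print(round(1080 * (1 - overscan)))   # 994, within a couple of pixels of the 992 reported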
 