Grimezy
Prolific Poster
I'll try to keep this short: I use a 32-inch Hitachi LCD for my PC at the moment, hooked up via HDMI. When I first got my PC, the screen was being cut off around the sides whenever I set the resolution to 1920x1080. I had to use 1766x992 to fit the whole screen on, but then it would only display at 30Hz. I eventually discovered that the TV's zoom settings weren't set up right, so I adjusted them and suddenly I could display properly at 1920x1080.
Whenever I flick onto my HDMI channel, a box pops up saying 1080p 60Hz, which I thought was brilliant. It was only last night, when I was rooting through the Nvidia Control Panel, that I noticed my native resolution is listed as 1080i, which only has a max refresh rate of 30Hz.
Now I'm utterly confused. My TV tells me it's 1080p and 60Hz. My desktop tells me I'm running 1920x1080 at 60Hz. My Nvidia Control Panel lets me select and use 1080p at 60Hz with no problems. My games let me select 60Hz. But it says my native resolution will only do 30Hz at 1080i... which makes me assume it's not actually doing any more than 30Hz, even if it says it is?
I'm vaguely aware of interlacing, which from what I've read runs at 30Hz but somehow doubles the frames (which would imply 60Hz, surely?), but I honestly have no clue.
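From what I've read, the arithmetic behind interlacing seems to be something like this (just my understanding of the standard 1080i/1080p numbers, so take it with a pinch of salt, not anything measured from my set):

```python
# Rough arithmetic for interlaced vs progressive video, using the
# standard TV timing values (my understanding, not measured data).

lines = 1080
field_rate_hz = 60             # 1080i60 sends 60 *fields* per second...
lines_per_field = lines // 2   # ...but each field carries only half the lines

# 60 half-frames per second works out to 30 complete frames per second:
full_frames_per_second = field_rate_hz / 2
print(full_frames_per_second)  # -> 30.0

# 1080p60 sends all 1080 lines 60 times per second, so it carries
# twice the line data of 1080i60 in the same time:
progressive_lines_per_second = lines * 60
interlaced_lines_per_second = lines_per_field * field_rate_hz
print(progressive_lines_per_second, interlaced_lines_per_second)
# -> 64800 32400
```

So if that's right, "1080i at 60Hz" and "1080i at 30 frames" would be the same signal described two ways: the screen updates 60 times a second, but each update is only half a picture.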
Have I been running at 30Hz this whole time? Or have I been overriding my native resolution and somehow made it display more frames than it should be able to? Any help would be much appreciated. As some of you may have seen, I'm in the market for a new monitor anyway to get rid of this horrid TV, but I'm not sure when I'll be able to have a standard desk setup due to the current space in the house, so I'm trying to make the most of what I've got for now!