Graphic Cards :: HDMI - Getting Decent Video Quality From Typical Monitors
Nov 27, 2015
Most people I've seen get appalling (almost unreadable) quality when using an external monitor with an HDMI cable -- yet it's really quite simple to fix. Films and video are usually OK, but if you are reading email or doing standard computer work then you want the monitor to give you decent results.
First, ensure that the "Sharpness" setting on the monitor is set correctly -- oversharpening makes text and fine detail look terrible.
Secondly, use something like "Auto size" so the picture fits the screen properly rather than being slightly too large or too small, even if, say, 1920 x 1080 HD is selected.
Thirdly, use the correct scanning frequency -- in Europe this should be 60 or 59 Hz -- and use the "p" (progressive) choice, not the "i" (interlaced) one.
Finally, use the smart picture settings sensibly if you have them, or set the colour / tint / contrast / brightness controls properly.
Messing around with these controls really does turn an external monitor from, in some cases, almost unreadable into a pleasure to use -- even on cheap 22-inch TVs.
Use the monitor's own menu -- don't do it from the PC display settings.
A typical laptop's video card will be perfectly OK with these types of monitors.
I have an Asus desktop. I plugged a working HDMI cord into my older Sony Bravia TV (a Smart TV). I have a GTX 760 192-bit NVIDIA graphics card. I went to the NVIDIA Control Panel, and it won't read the TV. I tried going through Windows 10 (64-bit) to find connected devices, and the TV shows up, but only as a media streaming device. It almost seems like the PC itself doesn't know it has an HDMI slot.
When I connect a DVI or HDMI cable from my GTX 780 to my Acer H274HL monitor, everything works fine, but when I connect a DisplayPort cable to my GTX 780 with the other end going to my monitor's HDMI, there is no signal at all.
I am running Windows 10 64-bit Enterprise, and all drivers and Windows updates are up to date.
I have a Lenovo T450s as my primary machine, issued to me at work. It has Intel integrated graphics and a 1920 x 1080 screen. I have two additional monitors: a 28-inch 4K Samsung and a 24-inch 1080p Samsung. I have the following DPI settings.
Laptop monitor: 100%
4K monitor: 150% (anything lower and it is completely unusable; everything is super tiny)
24-inch 1080p monitor: 100%
If I set my laptop screen as the primary screen, the laptop screen and the 24-inch 1080p monitor look sharp, but most apps look fuzzy when moved to the 4K monitor with DPI scaling set at 150% -- sometimes only somewhat noticeably, other times very noticeably.
The opposite happens if I set the 4K monitor as the primary monitor and restart. Then the 4K monitor looks very sharp and clear, but the laptop screen and the 24-inch 1080p monitor look fuzzy and blurry.
Visual Studio 2013, SSMS 2014 and Chrome are the applications bothering me the most right now. A small number of apps, like Edge, seem to work fine regardless of the DPI settings.
I was excited to see that Windows 10 added per-monitor DPI scaling and claimed to fix the issues Windows 8 had with multiple monitors of different pixel densities. However, from what I've seen, they really haven't fixed much.
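The fuzziness described above is what happens when apps that aren't per-monitor DPI aware get rendered at the primary screen's scale factor and then bitmap-stretched on the other screens. The underlying mismatch is physical pixel density; a quick sketch (the 14-inch size of the T450s panel is my assumption, since the post only gives sizes for the two external monitors):

```python
# Diagonal pixels-per-inch (PPI) for each panel in this setup.
# The 14" laptop panel size is an assumption; the post gives
# sizes only for the two external monitors.
from math import hypot

def ppi(width_px, height_px, diagonal_inches):
    """Physical pixel density of a panel."""
    return hypot(width_px, height_px) / diagonal_inches

panels = {
    'T450s laptop, 14"':  ppi(1920, 1080, 14),
    '28" 4K Samsung':     ppi(3840, 2160, 28),
    '24" 1080p Samsung':  ppi(1920, 1080, 24),
}
for name, density in panels.items():
    print(f"{name}: {density:.0f} PPI")
# At 150% scaling, the 4K panel behaves like a ~105 PPI desktop (157 / 1.5),
# close to the 24" screen's ~92 PPI -- which is why 150% feels right there.
```

The laptop and 4K screens come out around 157 PPI and the 24-inch screen around 92 PPI, so no single scale factor suits all three, and whichever screens aren't primary get the stretched bitmaps.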
I have a Gigabyte motherboard with on-board graphics, with one VGA and one DVI output.
I have two new Asus monitors, each with one VGA input and two HDMI inputs.
The VGA-to-VGA connection to monitor 1 produces a full-screen picture at 1920x1080 (recommended).
Asus furnished a DVI-to-HDMI cable. This connection (monitor 2) leaves about 1/2 inch of blank space on either side and about 1/4 inch at the top and bottom at 1920x1080 (recommended). A 1280x1024 setting fills the screen, but the display looks stretched.
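For what it's worth, the stretching at 1280x1024 is an aspect-ratio mismatch: the panel is 16:9, while 1280x1024 is 5:4, so the monitor's scaler widens the picture to fill the screen. (The border at the native resolution is a separate issue: the display underscanning the HDMI input.) A quick check of the ratio mismatch:

```python
# Why 1280x1024 looks stretched on a 1920x1080 (16:9) panel:
# the two modes have different aspect ratios.
def aspect(width, height):
    return width / height

native = aspect(1920, 1080)    # 16:9, about 1.78
filled = aspect(1280, 1024)    # 5:4, exactly 1.25

print(f"1920x1080 aspect: {native:.2f}")
print(f"1280x1024 aspect: {filled:.2f}")
# Filling the 16:9 panel stretches the 5:4 image horizontally:
print(f"horizontal stretch factor: {native / filled:.2f}x")  # ~1.42x
```

Everything in a 5:4 image ends up about 42% wider than it should be, which matches the "stretched" look.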
It worked fine in Windows 7, but now when I choose "Duplicate display" from the side menu popup, the TV goes blank as if I've unplugged the HDMI. All 3 of the remaining display options work, including "Extend display" and "TV only", so it's not a cable issue. The laptop model is an XPS 17 L702X. When I used to do this in Win 7, sometimes it wouldn't autodetect, so I would have to right-click the desktop, open the NVIDIA panel and click "rigorous display detection" or something like that, and then the laptop screen would blink and make the beep noise like when an HDMI cable gets connected. Now the NVIDIA panel simply won't allow mirroring; nothing happens when I right-click on the display box that would normally show 1, with a smaller box saying 2. I don't get it, but I'd really like that feature back if it's something simple I could do.
Just unboxed my computer from CyberpowerPC (GTX 720 graphics card), installed/updated all drivers with Driver Easy (paid for just now), and I am still having the issue with the HDMI port not working. I have tried to plug into a monitor and a TV, and neither works; I have also tried multiple HDMI cables, and it still does not work. My monitor has an auto source-switching feature, which is turned on; when I plug my computer into the monitor with HDMI, it continually cycles, finds no signal and goes into sleep mode, then rinses and repeats until I either turn off the computer or the monitor, or just unplug the monitor from the computer. Currently I am using DVI (computer) to HDMI (TV), and it's working fine, so I know it's the port on the computer. Do I need to contact CyberpowerPC to send the computer back to get this issue resolved?
Every time I restart or boot up my PC with my monitors turned on first, I always get an issue with the monitor connected to the HDMI port: it says "Input not supported", and all I do is turn my monitor off then back on and the issue is resolved...
I'm running my audio via HDMI from my graphics card and it plays through the speakers built into my monitor.
When I first did the Win 10 update I had no audio whatsoever, and followed some suggestions on online forums that led me to disable the Realtek High Definition Audio device, which I did, and upon a reboot my HDMI audio was working.
But then I had another weird issue. My PC is set to never sleep, as I use it as a media server, but my screens turn off automatically after 10 minutes. What I'm finding is that when I wake up my monitors, the audio stops. When I view playback devices I can't run any tests because "the device is already in use by another application".
(note: I have the checkbox for allowing an application to take exclusive control of a device un-checked).
So far there are only two ways to restore audio: 1) restart the PC, or 2) go into Device Manager > Sound, video and game controllers > AMD High Definition Audio Device > Disable, wait a few moments, then Enable.
So, my old 670 burnt out in a weird way where it would do basic video, but only if NVIDIA drivers weren't installed; if they were installed, the system with that card was a brick. I was using a 550 while waiting on shipping; it works fine, but obviously I can't use good settings with it. My new Gigabyte GTX 960 Windforce 4GB just arrived. I let it get up to room temperature, since it had been outside in the cold for a little bit, then plugged it in. Through HDMI and both DVI ports the only output is a blank screen with a " _ " at the top and nothing else. It looks like a prompt but does nothing. It doesn't even reach the BIOS as far as I can tell. Even when the old 670 had a problem, it at least did basic video and showed the BIOS.
Windows 10 64-bit
Intel i7 3770 @ 3.4 GHz
Gigabyte G1.Sniper M3, BIOS F10f
Gigabyte GTX 960 Windforce 4GB
HP Pavilion 27xi 27"
Basically, after closing a game (WoT), though not every time, I get a BSOD which refers to igdkmd64 (the Intel graphics driver file). I've tried every solution I could find, but none of them worked. How can I solve this?
The monitor is connected to the computer via HDMI and we can see Windows on the screen, but Windows sprawls beyond the physical edges of the screen in every direction. That means we see only the top little bit of the taskbar; the Windows button is beyond the left edge of the screen and the clock is beyond the right edge. I've tried a few different screen resolutions, but the image ALWAYS sprawls so that all four edges are lost. What do I need to do so that this sprawling stops?
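This sprawling is classic TV overscan: the set enlarges the incoming picture by a few percent and crops the edges, a holdover from analogue broadcast. The usual fix lives in the TV's own menu (a mode often called "Just Scan", "Screen Fit" or 1:1 pixel mapping) or in the GPU driver's underscan/scaling slider, not in the Windows resolution setting. To get a feel for how much desktop overscan eats, here is a sketch; the 5% figure is an assumption for illustration, as the exact amount varies by set:

```python
# Estimate the visible portion of a 1920x1080 desktop when a TV
# overscans by 5% per axis (a typical, but model-dependent, amount).
def visible(pixels, overscan=0.05):
    return round(pixels * (1 - overscan))

width, height = 1920, 1080
vis_w, vis_h = visible(width), visible(height)
print(f"visible area: {vis_w} x {vis_h}")
print(f"cropped per side: {(width - vis_w) // 2} px horizontally, "
      f"{(height - vis_h) // 2} px vertically")
```

At 5% that is roughly 48 pixels lost on each side and 27 on top and bottom -- enough to swallow the taskbar, Start button and clock exactly as described.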
I am running Windows 10 with the Insider Program. It's the 64 bit version. I have the latest (just checked) NVIDIA driver for my GTX 560 Ti.
In the last few days I've experienced a problem in which the graphics driver crashes and then recovers after a few seconds. When it crashes, the screen goes black, as if there's no signal. Then it goes back on, but if I was playing a video when it crashed, the media player (MPC-HC) needs to be restarted, otherwise playback cannot be resumed.
It also happens during general use, i.e. web browsing and such, though it is certainly more frequent when playing back videos -- I'd say 80% of the crashes occur during video playback. And when that happens, in addition to the black screen, the GPU's fan goes to full power until the driver is restored.
Since NVIDIA hasn't released a new Windows 10 driver since July, is it safe to say that it's a hardware and not a software issue?
I just upgraded from Windows 7 to Windows 10, and now my computer only detects one display rather than two. I have a Dell desktop computer and two Dell monitors. Under Windows 7 both monitors were detected and used. How do I correct this? The second display shows a black screen with the message "Cannot Display This Video Mode".
So I recently had to reinstall Windows 10 to resolve an issue I was having, and my NVIDIA drivers were deleted in that process. When I went to turn my computer back on after I was finished, there was no display on the screen, so I took out my GPU and plugged the VGA into the integrated graphics, and it worked fine. This made me think I just had to install the NVIDIA drivers again, but then I found out that you need the GPU in your computer to install the drivers -- and my display won't work any more with the GPU inside and no drivers for it.
In NVIDIA Control Panel, when I select my settings and click Apply, everything is OK. The next time I reboot, they change. Why are they not being saved?
I upgraded my Win7 Pro to Win10 Pro last night and lost my 3840x2160 @ 30 Hz setup, being reduced back to 1920x1200. I've since done a clean install and reinstalled the latest drivers for my Samsung U28D590D and my ATI Radeon HD 5770. Nothing I have done has given me back my 4K resolution.
I am cleaning out and setting up an old-ish PC, and whenever I connect it to my TV or monitor it says either "No Input to TV" or is just black when I turn it on. I am using VGA to connect it. I tried with a GTX 650 and with just the built-in graphics, and still no luck. Any reason why?
In October I bought the MSI GTX 980 graphics card as an upgrade for my Radeon R9 280X. I presumed I would be getting much better performance in games, since the 980 is a much more powerful card, and this was the case in some games, like GTA 5, where I can run the game at much higher settings. However, I have found that in a lot of other games my performance seems to have gone down. As an example I will use Borderlands 2, as I have been playing it a lot recently.
Using my old card I was getting a pretty much constant 90+ FPS when playing Borderlands 2, but with my current card I am getting less than 60 most of the time, and during combat it can drop to the 20s. This doesn't make any sense to me, as I have not changed any hardware other than the graphics card, and somehow it is performing worse than my old card.
I understand that the CPU is one of the biggest FPS factors, and I know an upgrade to it would be needed for much higher FPS, but surely a graphics card upgrade shouldn't cause an FPS decrease.
So I read about setting the display to Extend. I've tried updating the graphics card driver; it looks OK and Device Manager says it's working. The external display, a TV, worked just fine before on Win 7!
This is:
Win 10 64-bit Home (was Win 7)
HP Pavilion dv7 laptop
AMD Radeon HD 6300M Series video card
External display: large TV via HDMI
Whenever I try to install the AMD drivers, either by running the installer or via auto-detect, in the end I get a message saying files did not install and to check the log file. I've got an Asus R9 280X. Control Panel says "Microsoft Basic Display Adapter" under Display adapters.
I have temporarily connected a 22-inch LCD TV monitor (until I get a new one) to the PC via an HDMI lead, but I cannot get the resolution set right. The recommended native resolution makes the taskbar disappear off the screen, and the desktop icons are barely visible on the left side of the screen.
I have tried various resolution settings, but all are no good apart from one, which makes the desktop look like a laptop screen.
I have reset Windows 10 and updated the Intel graphics drivers to the latest, but the resolution still cannot be set correctly for the monitor; I have tried a VGA lead as well.
Is it because you need a proper PC monitor to get the correct resolution?
My computer crashes every time I turn it on. It crashes after around 15 minutes; after it crashes once, it crashes right away, just before or after the welcome screen. Before it crashes my screen sometimes goes fuzzy, sometimes it doesn't. I have tried a lot of things, like deleting my graphics driver completely using the DDU program and reinstalling only the display driver.
So I bought a new monitor, the MG279Q, and tried plugging it into my PC via DisplayPort. The monitor comes with a DP-to-miniDP cable (from the GTX 780's DP to the monitor's miniDP), and the monitor gives no signal. However, the computer does recognize the monitor and, for example, plays the sound alert for a new device attached. I can also see the device described as a "Generic PnP monitor", and I can see it in Windows's own control panel for monitors as well as the NVIDIA Control Panel (pics at the end). But it won't allow me to set the monitor to use.
I suspected that the cable might be broken, so I tested it with my UX32LN (GeForce 840M) laptop, which has a miniDP port. I plugged it into the monitor (from the laptop's miniDP to the monitor's normal DP -- the monitor has both DP and mDP) and, wow, an image straight away. So the cable should be fine, I guess. But this is a somewhat different situation, as I'm using a different port on the monitor compared to my desktop's GTX 780, and I don't have other computers or gadgets I could use to test the monitor's miniDP port.
Next I plugged the monitor into my desktop via HDMI, and that worked just fine; it recognizes the monitor without problems. Using my old XL2411Z monitor with DVI plus the MG279Q with DP, or the MG279Q with HDMI and DP at the same time, I can see the MG279Q connected twice, but it only chooses the HDMI connection and doesn't let me change it to DP, even though it shows the connection.
Here are some pics to explain it better:
The multiple display settings window shows the MG279Q at the top (under GTX780, the one not ticked), but it won't let me tick it for use. Sometimes it lets me tick it for a second, but it instantly unticks the box. Monitor no. 2 is the MG279Q via HDMI.
The Surround + PhysX window shows that I've connected the monitor via DP, but it is greyed out. It also won't let me make a Surround set with the XL2411Z + MG279Q over DP.
The "Change resolution" settings only show the HDMI-connected monitor.
So I tried uninstalling the NVIDIA drivers and freshly installing them; that did not work at all. Neither did uninstalling the Generic PnP monitor so the computer would reinstall the display's drivers. I also did a fresh install from Win7 Pro to Win10 Pro, and no difference (I intended to do that later anyway, so I thought I might as well check; my laptop is running Win10 and its DP worked, after all). I also went into the BIOS and checked that the iGPU is disabled and that the monitor setting is set to PCIE instead of Auto -- no use. I would also test the GTX 780's DP on another desktop computer, but I can't. Anyway, I somewhat suspect a software/hardware compatibility issue, as the monitor is being recognized; I doubt that the monitor's miniDP or my graphics card's DP port is broken.
What should I do or try? I really want to run the monitor via DP so I can utilize the 144Hz refresh rate.