When I connect a DVI or HDMI cable from my GTX 780 to my Acer H274HL monitor, everything works fine. But when I connect a DisplayPort cable to my GTX 780 with the other end going to my monitor as HDMI, there is no signal at all.
I am running Windows 10 64-bit Enterprise, all drivers are up to date, and Windows updates are current as well.
So I bought a new monitor, the MG279Q, and tried plugging it into my PC via DisplayPort. The monitor comes with a DP-to-miniDP cable (from the GTX 780's DP to the monitor's miniDP), and the monitor gets no signal. However, the computer does recognize the monitor, and for example plays the new-device sound when it's attached. I can also see the device described as "Generic PnP monitor", and it appears in Windows's own display control panel as well as the NVIDIA Control Panel (pics at the end). But it won't let me set the monitor to be used.
I suspected the cable might be broken, so I tested it with my UX32LN (GeForce 840M) laptop, which has a miniDP. I plugged it into the monitor (from the laptop's miniDP to the monitor's normal DP; the monitor has both DP and miniDP) and, wow, image straight away. So the cable should be fine, I guess. But this is a somewhat different situation, as I'm using a different port on the monitor compared to my desktop's GTX 780. And I don't have any other computers or gadgets I could use to test the monitor's miniDP port.
Next I plugged the monitor into my desktop via HDMI, and that worked just fine; it recognizes the monitor without problems. Using my old XL2411Z monitor with DVI + the MG279Q with DP, or the MG279Q with HDMI + DP at the same time, I can see the MG279Q being connected twice, but it only chooses the HDMI connection and doesn't let me change it to DP, even though it shows the connection.
Here are some pics to explain it better:
The multiple panel settings window. It shows the MG279Q at the top (under GTX780, the one not ticked), but it won't let me tick it for use. Sometimes it lets me tick it for a second but instantly unticks the box. Monitor no. 2 is the MG279Q via HDMI.
The Surround + PhysX window shows that I've connected the monitor via DP, but it is greyed out. It also won't let me make a Surround set with the XL2411Z + MG279Q DP.
Changing resolution settings only displays the HDMI-connected monitor.
So I tried uninstalling the NVIDIA drivers and freshly installing them; that did not work at all. Neither did uninstalling the generic PnP monitor so the computer would reinstall the display's drivers. I also did a fresh install going from Win7 Pro to Win10 Pro, and no difference (I intended to do that later anyway, so I thought I might as well check; my laptop is running Win10 and its DP worked, after all). I also went into the BIOS and checked that the iGPU is disabled and the monitor setting is set to PCIE instead of Auto; no use. I would also test the GTX 780's DP on another desktop computer, but I can't. Anyway, I somewhat suspect a software/hardware compatibility issue, since the monitor is being recognized. I doubt that the monitor's miniDP or my graphics card's DP port is broken.
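If you want to see exactly what Windows thinks is attached while you experiment, here is a minimal sketch (assuming Python 3 on the Windows box; it calls the standard Win32 EnumDisplayDevices API through ctypes) that lists each GPU output and whether the monitor on it is actually attached to the desktop:

import ctypes
from ctypes import wintypes

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001

class DISPLAY_DEVICEW(ctypes.Structure):
    # Win32 DISPLAY_DEVICEW structure, as used by EnumDisplayDevicesW.
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

user32 = ctypes.windll.user32

def enum_devices(parent=None):
    # parent=None enumerates adapters; an adapter name enumerates its monitors.
    i = 0
    while True:
        dev = DISPLAY_DEVICEW()
        dev.cb = ctypes.sizeof(dev)
        if not user32.EnumDisplayDevicesW(parent, i, ctypes.byref(dev), 0):
            break
        yield dev
        i += 1

for adapter in enum_devices():
    print(adapter.DeviceName, "-", adapter.DeviceString)
    for monitor in enum_devices(adapter.DeviceName):
        attached = bool(monitor.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
        print("   ", monitor.DeviceString, "| attached to desktop:", attached)

If the DP-connected MG279Q shows up here but never as attached, that matches the symptom: Windows enumerates it, yet the mode never gets activated.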
What should I do or try? I really want to run the monitor via DP so I can utilize the 144 Hz refresh rate.
I just unboxed my computer from CyberPowerPC (GTX 720 graphics card) and installed/updated all drivers with Driver Easy (paid for just now), and I am still having the issue of the HDMI port not working. I have tried plugging into a monitor and a TV, and neither works; I have also tried multiple HDMI cables, and it still does not work. My monitor has auto-switch source capability, which is turned on. When I plug my computer into the monitor with HDMI, it continually cycles, finds no signal, and goes into sleep mode, then rinses and repeats until I either turn off the computer or the monitor, or just unplug the monitor from the computer. Currently I am using DVI (computer) to HDMI (TV) and it's working fine, so I know it's the port on the computer. Do I need to contact CyberPowerPC to send the computer back to get this issue resolved?
Every time I restart or boot up my PC with my monitors turned on first, I always get an issue with the monitor connected to the HDMI port: it says "Input not supported", and all I do is turn my monitor off then back on and the issue is resolved...
I have an Asus desktop. I plugged a working HDMI cord into my older Sony Bravia TV (a smart TV). I have a GTX 760 192-bit NVIDIA graphics card. I went to the NVIDIA Control Panel, and it won't read the TV. I tried going through the Windows 10 64-bit way of finding connected devices, and the TV shows up, but as a media streaming device. It almost seems like the PC itself doesn't know it has an HDMI port.
It worked fine in Windows 7, but now when I choose "duplicate display" from the side menu popup, the TV goes blank as if I've unplugged the HDMI. All three of the remaining display options work, including extend display and TV only, so it's not a cable issue. The laptop model is an XPS 17 L702X. When I used to do it in Win 7, sometimes it wouldn't autodetect, so I would have to right-click the desktop, open the NVIDIA panel, and click "rigorous display detect" or something like that, and then the laptop screen would blink and make the beep noise like when an HDMI cable gets connected. Now, in the NVIDIA panel, it simply won't allow mirroring; nothing happens when I right-click on the display box that would normally show 1, where a smaller box would say 2. I don't get it, but I'd really like that feature back if it's something simple I could do.
Most people I've seen have appalling (almost unreadable) quality when using an external monitor with an HDMI cable -- and it's really quite simple to fix. Films and video are usually OK, but if you are reading email or doing standard computer work, you want the monitor to give you decent results.
First, ensure that the "Sharpness" setting on the monitor is set correctly -- oversharpening makes text and the like look terrible.
Secondly, use something like "Auto size" so the picture size fits the screen properly, rather than being slightly too large or small even if, say, 1920 x 1080 HD is selected.
Thirdly, use the correct scanning frequency -- in Europe it should be 60 or 59 Hz -- and use the "p" (progressive) choice, not the "i" (interlaced) one.
Finally, use the smart picture settings sensibly if you have them, or set the colour / tint / contrast / brightness settings properly.
Messing around with these controls really does turn an external monitor from, in some cases, an almost unreadable screen into a pleasure to use -- even on cheap 22-inch TVs.
Use the monitor's own menu -- don't do it from the PC display settings.
A typical laptop's video card will be perfectly OK on these types of monitors.
I'm running my audio via HDMI from my graphics card and it plays through the speakers built into my monitor.
When I first did the Win 10 update I had no audio whatsoever, and followed some suggestions on online forums that led me to disable the Realtek High Definition Audio device. I did, and upon a reboot my HDMI audio was working.
But then I had another weird issue. My PC is set to never sleep, as I use it as a media server, but my screens turn off automatically after 10 minutes. What I'm finding is that when I wake up my monitors, the audio stops. When I view playback devices, I can't run any tests because "the device is already in use by another application".
(Note: I have the checkbox for allowing an application to take exclusive control of the device unchecked.)
So far there are only two ways to restore the audio: 1) Restart the PC. 2) Go into Device Manager > Sound, video and game controllers > AMD High Definition Audio Device > Disable... wait a few moments > Enable.
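If that disable/enable dance keeps working, you can script it so it's one double-click instead of a trip through Device Manager. A sketch, assuming Python 3 on Windows run from an elevated prompt; it shells out to the PowerShell PnpDevice cmdlets that ship with Windows 10 (the friendly name below is the one from this post; adjust it to match your device):

import subprocess
import time

NAME = "AMD High Definition Audio Device"  # adjust to your exact device name

def ps(command):
    # Run one PowerShell command and raise if it fails.
    subprocess.run(["powershell", "-NoProfile", "-Command", command], check=True)

# Same steps as the Device Manager workaround: disable, wait, re-enable.
ps(f"Get-PnpDevice -FriendlyName '{NAME}' | Disable-PnpDevice -Confirm:$false")
time.sleep(5)  # "wait a few moments"
ps(f"Get-PnpDevice -FriendlyName '{NAME}' | Enable-PnpDevice -Confirm:$false")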
So I recently had to reinstall Windows 10 to resolve an issue I was having, and my NVIDIA drivers were deleted in that process. When I went to turn my computer back on after I was finished, there was no display on the screen, so I took out my GPU and plugged the VGA into the integrated graphics, and it worked fine. This made me think I just have to install the NVIDIA drivers again, but then I found out that you need the GPU in your computer to install the drivers, and my display won't work anymore with the GPU inside and no drivers for it.
So I read about setting the display to Extend. I've tried updating the graphics card; it looks OK and Device Manager says it's working. The external display, a TV, worked just fine before on Win 7!
This is:
OS: Win 10 64-bit Home (was Win 7)
Machine: HP Pavilion dv7 laptop
Video card: AMD Radeon HD 6300M Series
External display: large TV via HDMI
Since updating to Windows 10, I have a display settings issue. I set my display setting to 1920x1080 (recommended), apply, and check 'keep settings'. When I shut down and restart, the display has changed to 1366x768. This happens every time I restart.
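Until whatever is resetting the mode is found, one workaround is to re-apply the resolution from a script at logon. A rough sketch, assuming Python 3 on Windows; it uses the stock ctypes module to call the Win32 ChangeDisplaySettingsW API with CDS_UPDATEREGISTRY, which also writes the mode back to the registry:

import ctypes
from ctypes import wintypes

DM_PELSWIDTH = 0x00080000
DM_PELSHEIGHT = 0x00100000
CDS_UPDATEREGISTRY = 0x00000001
DISP_CHANGE_SUCCESSFUL = 0

class DEVMODEW(ctypes.Structure):
    # Display-relevant layout of the Win32 DEVMODEW structure.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)
dm.dmPelsWidth = 1920
dm.dmPelsHeight = 1080
dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT  # only width/height are being set

result = ctypes.windll.user32.ChangeDisplaySettingsW(
    ctypes.byref(dm), CDS_UPDATEREGISTRY)
print("OK" if result == DISP_CHANGE_SUCCESSFUL else f"failed: {result}")

Dropping a shortcut to this script into the Startup folder (shell:startup) would re-assert 1920x1080 at every logon; it treats the symptom, not the cause.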
ATI Mobility Radeon HD 4200 series
AMD Radeon HD 6300M series
For one, for the ATI Mobility card there are no drivers installed. I've been on the AMD website, where I downloaded an old legacy Catalyst suite built for Windows Vista/7; after that, there are still no drivers installed for this card.
For the AMD Radeon, I'm getting "Windows has stopped this device because it has reported problems. (Code 43)".
I'm starting to believe the problems with the 6300M card are coming from the 4200 card not having any drivers.
I've uninstalled and reinstalled the drivers for the 6300 card, and can only get the code 43 to go away if I disable and then re-enable it through the Device Manager. After a restart though, the problem persists.
My display driver kept crashing my computer, so I updated it with the latest driver for Windows 10.
Now I keep getting a message saying 'Scene Selection Has Changed', which annoyingly keeps popping up on the screen at regular intervals. (How can I stop this before it drives me crazy? Too late, it already has!)
Then I get another pop-up which says 'Display driver stopped responding but has now recovered', which is something else I don't need to know (or maybe I do?).
If that message pops up a couple of times, it crashes the computer, which then reboots itself (sometimes!) or else gives me a black screen.
My NVIDIA graphics card is virtually brand new and always worked faultlessly with XP and Windows 7.
Could it be the updates we keep getting, or something else?
So the bottom of my monitor display is cut off. It is a Dell ST2010 monitor, very old but still in working condition. It has nothing to do with screen resolution, btw.
This is how it looks now; the problem started occurring about 3 months ago.
This is how it's supposed to look when I take a screenshot of it.
I'm having this frequent problem with my desktop computer. Since updating from Windows 7 to Windows 10, every time I use my PC (at random times) the whole screen will just go blank and I'll get a little popup message in the right-hand corner saying:
"Display driver stopped responding and has recovered. Display driver AMD driver stopped responding and has successfully recovered."
I don't use my PC for anything really heavy; it just randomly happens, from either scrolling up and down a website page or watching a YouTube video, etc. I've tried upgrading my graphics driver, but still no luck.
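One mitigation that gets suggested for exactly this "stopped responding and has recovered" loop is lengthening the GPU watchdog (TDR) timeout, so brief driver stalls recover instead of resetting the driver. TdrDelay is a documented Windows registry value, but treat this as a hedged workaround sketch rather than a fix for the underlying driver problem; it assumes Python 3 on Windows, an elevated prompt, and a reboot afterwards:

import winreg

# Raise the TDR (Timeout Detection and Recovery) delay from the default
# 2 seconds to 8. Run elevated; takes effect after a reboot.
# Delete the "TdrDelay" value afterwards to restore the default behaviour.
key_path = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path, 0,
                    winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "TdrDelay", 0, winreg.REG_DWORD, 8)
print("TdrDelay set to 8 seconds; reboot for it to take effect.")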
Computer type: PC/Desktop
System Manufacturer/Model Number: DELL INSPIRON ONE
OS: Windows 10
CPU: AMD Athlon II X2 240
Motherboard: Dell 0DPRF9
Memory: DDR3 4GB
Graphics Card(s): AMD Mobility Radeon HD 5000 Series
I just upgraded from Windows 7 to Windows 10, and now my computer only detects one display rather than two. I have a Dell desktop computer and two Dell monitors. Under Windows 7, both monitors were detected and used. How do I correct this? The second display shows a black screen with the message "Cannot Display This Video Mode".
Since upgrading to Windows 10 from 8.1, I've seen the following message several times.
"Display driver stopped responding and has recovered. Display driver Intel HD Graphics Drivers for Windows 8(R) stopped responding and has successfully recovered."
When it happens, my laptop screen goes black for a few seconds, except for the taskbar and toast notification. I also have an external monitor that doesn't seem affected at all.
I can reproduce the error pretty consistently if I rapidly switch a Flash video back and forth between full-screen and embedded, but that's not the only time I've seen the problem.
As far as I can tell, everything continues running. Even the full-screen video continues running after the screen isn't black any more.
The weird thing is that the driver name it shows isn't the driver I'm using. I've already upgraded my display drivers to the latest available from Intel's site.
Some things I've observed:
- The information presented by Windows in Settings -> Display -> Advanced display settings shows the driver for Windows 10 that I downloaded from Intel.
- The "Intel HD Graphics Control Center" that was installed along with the driver shows the latest driver version that I downloaded from Intel.
- In Settings -> Apps & Features there's only one "Intel(R) Processor Graphics Driver" listed.
- Event Viewer shows very little useful information:
General: Display driver igfx stopped responding and has successfully recovered.
Details (XML View):
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
  <System>
    <Provider Name="Display" />
    <EventID Qualifiers="0">4101</EventID>
[Code] ....
It's conceivable that Intel updated the driver, but forgot to change the string somewhere so the error is actually for the Windows 10 driver. I don't know how to find out exactly what failed or why. It's getting that string from somewhere.
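To pull those events with their full details without clicking through Event Viewer, you can query the System log directly. A small sketch, assuming Python 3 on Windows; it just drives the built-in wevtutil tool:

import subprocess

# Dump the most recent "Display driver stopped responding" events
# (Provider "Display", EventID 4101) from the System log, newest first.
query = "*[System[Provider[@Name='Display'] and EventID=4101]]"
subprocess.run([
    "wevtutil", "qe", "System",
    "/q:" + query,
    "/f:text",   # human-readable output instead of XML
    "/c:10",     # only the last 10 matching events
    "/rd:true",  # reverse direction: newest first
])

The text output includes the timestamp of each hang, which you can line up against what you were doing (e.g. the full-screen Flash switching) to narrow down the trigger.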
I just installed a new SSD and moved my OS onto it; however, now I can't start my computer with both of my displays plugged in (one HDMI and one DVI). The main display (HDMI) flashes on and off while the DVI receives no signal, until I unplug the DVI, and then the computer starts like normal. Both displays work just fine after it has booted.
My graphics card is an NVIDIA GeForce GTX 950, and its driver is up to date as of the time I'm posting this. The NVIDIA Control Panel recognizes both displays and has the HDMI selected as the main display without me having to touch it.
I could boot with both displays plugged in before I installed the SSD.
I did a clean install of Windows (not cloning) originally on the SSD and even tried to re-install again to see if it would fix it and it does not.
The only option in the BIOS related to display is to go back to my processor's integrated graphics instead of my NVIDIA card, but this really should not be related to the issue at all, as both the DVI and HDMI are plugged directly into the NVIDIA card's ports.
Something I found very weird when I was checking the BIOS is that if on boot I go to BIOS, exit, and proceed to boot, it works just fine. But not if I just do a normal boot.
After upgrading to the Windows 10 OS, I could not get my HDMI port to work. It does not get recognized at all, which is very frustrating. I run the OS on my laptop, which is an HP Pavilion. I really need it to be working. I love Windows 10, but this is a big problem.
Last night I updated my laptop (Alienware M17x R3) to the Win 10 tech build 10041 and updated all my drivers; now my DisplayPort is not working. When booting I can see it showing the BIOS and POST screens, but once I am in Windows it just goes dead with a "no DisplayPort detected" message. I'm running my laptop screen, a Dell UltraSharp U2913WM (DisplayPort), and a Dell S2340L (VGA). On VGA both screens work fine, no problem, but as soon as I try running on DisplayPort it's dead as a doornail. I checked Device Manager and it's not even detecting the third screen. I have installed the latest drivers from Intel and run Windows Update, but to no avail.
After installing Windows 10, most of the time when I boot, after the Windows logo with the circles below it, the signal to the display stops. If I press reset, then the computer boots, but the startup is very slow. With Windows 8, from pressing the power button I would get to the desktop in 8 seconds; here it takes at least 30 seconds or more. I would like to know if the problem is with Windows 10 or the NVIDIA drivers.
In NVIDIA Control Panel, when I select my settings and click apply, everything is OK. The next time I reboot, they change. Why are they not being saved?
I upgraded my Win7 Pro to Win10 Pro last night and lost my 3840x2160 @ 30 Hz setup, being reduced back to 1920x1200. I've since done a clean install and reinstalled the latest drivers for my Samsung U28D590D and my ATI Radeon HD 5770. Nothing I have done has given me back my 4K resolution.
I am cleaning out and setting up an old-ish PC, and whenever I connect it to my TV or monitor it either says "No Input to TV" or just stays black when I turn it on. I am using VGA to connect it. I tried with a GTX 650 and with just the built-in graphics, and still no luck. Any reason why?