Graphic Cards :: NEC External Monitor Not Detected?
Mar 5, 2016
I have a Dell XPS 13 (2016) with a Dell USB-C adapter. It will not detect my NEC LCD external monitor. I have tried the Dell USB-C adapter with a D-SUB cable to the monitor, and I have also tried a Monoprice USB-C adapter with a Monoprice HDMI cable. Neither works. The monitor works fine with my Mac. I have told the computer to extend and to duplicate the display using the Windows+P command. I have tried to update the driver for the monitor (I have NEC's driver), but it will not install because the monitor is not detected.
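For reference, the same projection switch that Windows+P performs can be driven from a script, which can help when one of the screens is unreadable. This is only a minimal sketch, assuming a stock Windows 10 install where DisplaySwitch.exe sits in System32 (and therefore on the PATH):

import subprocess

# DisplaySwitch.exe takes the same four choices Windows+P offers:
# /internal, /clone, /extend and /external.
subprocess.run(["DisplaySwitch.exe", "/extend"], check=True)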
I have an MSI GS60 Ghost Pro gaming laptop with Intel HD Graphics 4600, an NVIDIA GTX 970M GPU, and an i7-4720HQ CPU, running Windows 10. I study abroad; several months ago I went back home and plugged my laptop into the TV via HDMI for gaming as normal. Yesterday I bought an ASUS VC239H external monitor and tried to do the same thing, but there is no HDMI input signal and the laptop cannot detect the monitor (including in Display Settings, the Intel HD Graphics settings, and the NVIDIA settings). This is what I tried:
- Fn+F2 and Fn+F8 keys, and Windows+P switching to duplicate, extend, and every other option.
- Plugged the monitor into my friend's laptop running Windows 8 through that HDMI cable and it worked fine, and the same with another laptop running Windows 7, so the cable and monitor are fine.
- Plugged my laptop into my neighbor's smart TV. A "connected" symbol appeared on the TV, but there was a black screen instead of an image, and the laptop couldn't detect the TV.
- Reinstalled the NVIDIA and onboard graphics drivers and updated them to the latest versions.
- Did a clean install of Windows, and also tried Windows 8, with the same result.
- Updated my BIOS and EC.
- Bought an HDMI-to-Mini DisplayPort adapter to use the Mini DisplayPort with the HDMI cable.
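One way to narrow down cases like this is to check what Windows itself enumerates, independent of the Intel and NVIDIA panels. The sketch below (plain Python 3 on Windows, ctypes only) just lists every adapter output the OS knows about; if the HDMI output never shows up here while a cable is attached, the problem sits below the driver control panels.

import ctypes
from ctypes import wintypes

class DISPLAY_DEVICE(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

user32 = ctypes.windll.user32
i = 0
dev = DISPLAY_DEVICE()
dev.cb = ctypes.sizeof(dev)
# Walk every display output the OS enumerates.
# StateFlags 0x1 = attached to the desktop, 0x4 = primary display.
while user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
    print(i, dev.DeviceName, "-", dev.DeviceString, "- flags", hex(dev.StateFlags))
    i += 1
    dev = DISPLAY_DEVICE()
    dev.cb = ctypes.sizeof(dev)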
OK, so I recently updated to Windows 10, but now when I connect my laptop to my external monitor, the laptop's screen doesn't show up on the external monitor.
My laptop's screen is broken, so I can't see anything. The laptop is still sitting on the screen that appears immediately after updating, so I could blindly work my way through those screens, but someone would have to upload pictures of the screens that appear immediately after updating to Windows 10.
So I read about setting the display to Extend. I've tried updating the graphics card driver; it looks OK and Device Manager says it's working. The external display, a TV, worked just fine before on Windows 7!
This is:
- Windows 10 64-bit Home (was Windows 7)
- HP Pavilion dv7 laptop
- AMD Radeon HD 6300M Series video card
- External display: large TV via HDMI
I've been running a multi-monitor setup for a while now, with an ASUS VG248QE 144Hz monitor as my main, and one or more spare monitors as secondary displays. I've set the ASUS monitor to 144Hz without issue in the past, while the other monitors stay at 60Hz.
The other night, Windows wanted to update. I figured it was the usual patch session, so I let it do its thing. However, after restarting my computer, I noticed that my ASUS monitor would only operate at 144Hz if it was the only monitor plugged into the system. When I try to set it higher in the Nvidia control panel, it just reverts back to 60Hz after applying changes. I updated my graphics drivers (though I was only one version behind) and even sought a special ASUS driver for the monitor to resolve the issue, but nothing has worked.
Did Windows recently add forced refresh-rate syncing between all monitors? If so, that really sucks, because I only own one 144Hz monitor.
I installed Windows 10 and found some apps did not fit the monitor screen (too big). I went in and changed the monitor resolution to a larger size (Windows recommended 1920 x 1080), but I tried something larger, and now the monitor has a black screen with a little box moving around that says "input not supported". Not being able to use that monitor to go in and change it back, I hooked up a flat-screen TV, which worked, but Windows only showed the current monitor size (which was smaller than the original monitor's).
I tried the monitor on another computer I have, also running Windows 10, and it worked fine. It shows the resolution as 1920 x 1080. When I plug it back into the original computer I get the same black screen with "input not supported", so I can't change it there. I suspect the issue is somewhere in the registry in Windows 10 on the first computer, but I don't know what the problem is. I've been working on this for a whole day now. You can probably tell I'm not well versed in computer technology. I just want my old monitor back (Acer K272HL), and I promise I won't mess around with it again.
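If you can get to a desktop at all (for example over the flat-screen TV, or blind), a script can push the primary display back into a mode the Acer accepts. This is only a sketch, assuming Python 3 on Windows and that 1920x1080 is the mode the monitor wants; it walks the modes the driver reports for the primary display and applies the first 1920x1080 entry it finds.

import ctypes
from ctypes import wintypes

class DEVMODEW(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD), ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD), ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        # display branch of the union (same 16-byte size as the printer branch)
        ("dmPositionX", wintypes.LONG), ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD), ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short), ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short), ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32), ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD), ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD), ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD), ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD), ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD), ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD), ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

user32 = ctypes.windll.user32
DM_PELSWIDTH, DM_PELSHEIGHT = 0x00080000, 0x00100000
CDS_UPDATEREGISTRY = 0x00000001  # persist the change across reboots

dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)
i = 0
# Enumerate the modes the driver reports for the primary display and apply 1920x1080.
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
    if dm.dmPelsWidth == 1920 and dm.dmPelsHeight == 1080:
        dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT
        result = user32.ChangeDisplaySettingsW(ctypes.byref(dm), CDS_UPDATEREGISTRY)
        print("ChangeDisplaySettingsW returned", result)  # 0 means DISP_CHANGE_SUCCESSFUL
        break
    i += 1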
I just installed Windows 10. Now my laptop's monitor won't work, only the external monitor. I go to the graphics settings and click on multiple displays, but it says it doesn't see another device, only the external monitor. I tried disconnecting the external monitor and rebooting, but the laptop display still doesn't turn on.
I am cleaning out and setting up an oldish PC, and whenever I connect it to my TV or monitor it either says "No Input to TV" or the screen just stays black when I turn it on. I am using VGA to connect it. I tried with a GTX 650 and with just the built-in graphics, and still no luck. Any reason why?
I have temporarily connected a 22-inch LCD TV monitor (until I get a new one) to the PC via an HDMI lead, but I cannot get the resolution set right. The recommended native resolution makes the taskbar disappear off the screen, and the desktop icons are barely visible on the left side of the screen.
I have tried various resolution settings, but none are any good apart from one, which makes the desktop look like a laptop screen.
I have reset Windows 10 and updated the Intel graphics drivers to the latest, but the resolution still cannot be set correctly for the monitor. I have used a VGA lead as well.
Is it because you need a proper PC monitor to get the correct resolution?
Have a user with a Sony Vaio laptop, 15" screen, 1920x1080. They also plug an HDMI cable into a 23" Dell widescreen monitor for a second display. This 23" monitor is also 1920x1080. They are complaining that fonts on this second monitor are blurry. They tried to live with it for a week, but they're asking whether it can be made clear again like it was when they had Windows 7.
Are you familiar with the possibility of fonts being blurry on only one monitor when both are set to their native resolution?
I'm asking them to change the font DPI to 100%, log off and back on, and see if it's any better. For whatever reason that seemed to work on my laptop (another Sony Vaio), but my concern is they will be disappointed with legibility on the laptop screen (it makes everything tiny).
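If you want to confirm what scaling the session actually came back with after the logoff, the per-user custom DPI is usually stored under HKCU\Control Panel\Desktop as LogPixels. A small read-only sketch (Python 3 on Windows); treat the value name as an assumption to verify on your build, since it is absent on machines that have never changed scaling, in which case 96 (100%) is in effect.

import winreg

try:
    # LogPixels holds the per-user DPI (96 = 100%, 120 = 125%, 144 = 150%).
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop") as key:
        dpi, _ = winreg.QueryValueEx(key, "LogPixels")
except FileNotFoundError:
    dpi = 96  # value not set, so Windows is using the default 100% scaling
print(f"Current DPI: {dpi} ({dpi / 96:.0%} scaling)")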
It worked fine in Windows 7, but now when I choose "duplicate display" from the side menu popup, the TV goes blank as if I'd unplugged the HDMI. All three of the remaining display options work, including extend display and TV only, so it's not a cable issue. The laptop model is an XPS 17 L702X. When I used to do this in Windows 7, sometimes it wouldn't autodetect, so I would right-click the desktop, open the NVIDIA panel, and click rigorous display detection or something like that, and then the laptop screen would blink and make the beep noise like when an HDMI cable gets connected. Now the NVIDIA panel simply won't allow mirroring; nothing happens when I right-click on the display box that would normally show 1, with a smaller box saying 2. I don't get it, but I'd really like that feature back if it's something simple I could do.
My computer currently has 3 monitors connected to it: 2 through the graphics card (R9 280) and 1 through the integrated graphics (AMD HD 3000). Not sure when, why, or how, but the integrated graphics can never pick up what type of monitor is connected to it, so it always gives me a bunch of generic 4:3 resolutions.
I've tried uninstalling the drivers and letting them reinstall. I've tried swapping the cables around to see if I could get lucky with one of the other screens, but they all have the same issue. I've tried buying brand new cables across the board, because sometimes people say that's the issue. But no success.
The closest thing I have found is a custom resolution setter (PowerStrip), but it was designed with Windows 7 in mind and doesn't seem to work on Windows 10.
My question is, is there any way to fix the list so I can select the correct resolution? It's a Toshiba TV/monitor, and it seems like everything else attached to my PC (even third-party programs I tried to use to forcibly change the resolution) can spot it and hand me back the correct resolution, but I can't seem to change it. I get "Generic PnP" through the Control Panel (the only place the monitor shows up), and even then the correct driver doesn't show; it's the Microsoft generic driver.
Mobo: MSI AMD 3000 series; can't find the exact model.
I've updated my AMD drivers to the latest, and so far I can only change the resolution to stock ones that look horrid on the screen. Is there any program in Windows 10 to change the resolution of a screen?
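One quick check before fighting with custom-resolution tools: ask WMI whether the integrated adapter is actually reading the Toshiba's EDID. If its name never comes back, the generic PnP driver has no monitor identity to work from, which would explain why only generic 4:3 modes are offered. A rough sketch, assuming Python can call Windows PowerShell and that the standard WmiMonitorID WMI class is available (it is on Windows 7 and later):

import subprocess

# Decode the EDID "user friendly name" for every monitor WMI can see.
ps = ("Get-WmiObject -Namespace root\\wmi -Class WmiMonitorID | ForEach-Object { "
      "($_.UserFriendlyName | Where-Object { $_ -ne 0 } | ForEach-Object { [char]$_ }) -join '' }")
out = subprocess.run(["powershell", "-NoProfile", "-Command", ps],
                     capture_output=True, text=True)
print(out.stdout)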
Every time I restart or boot up my PC with my monitors turned on first, I get an issue with the monitor connected to the HDMI port: it always says "Input not supported". All I do is turn the monitor off and then back on, and the issue is resolved...
I'm running my audio via HDMI from my graphics card and it plays through the speakers built into my monitor.
When I first did the Win 10 update I had no audio whatsoever, and followed some suggestions on online forums that led me to disable the Realtek High Definition Audio device, which I did, and upon a reboot my HDMI audio was working.
But then I had another weird issue. My PC is set to never sleep, as I use it as a media server, but my screens turn off automatically after 10 minutes. What I'm finding is that when I wake my monitors up, the audio stops. When I view playback devices I can't run any tests because "the device is already in use by another application".
(note: I have the checkbox for allowing an application to take exclusive control of a device un-checked).
So far there are only two ways to restore the audio: 1) restart the PC, or 2) go into Device Manager > Sound, video and game controllers > AMD High Definition Audio Device > Disable, wait a few moments, then Enable.
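For what it's worth, workaround #2 can be scripted so it runs from a shortcut instead of digging through Device Manager every time. This is only a sketch: it assumes an elevated (administrator) session and that the device's friendly name on this machine really is "AMD High Definition Audio Device"; check what Get-PnpDevice lists before relying on it.

import subprocess

# Disable, pause, and re-enable the HDMI audio device via the PnpDevice cmdlets.
ps = ("$d = Get-PnpDevice -FriendlyName 'AMD High Definition Audio Device' | Select-Object -First 1; "
      "Disable-PnpDevice -InstanceId $d.InstanceId -Confirm:$false; "
      "Start-Sleep -Seconds 3; "
      "Enable-PnpDevice -InstanceId $d.InstanceId -Confirm:$false")
subprocess.run(["powershell", "-NoProfile", "-Command", ps], check=True)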
I just upgraded from Windows 7 to Windows 10 and now my computer only detects one display rather than two. I have a Dell desktop computer and two Dell monitors. Under Windows 7 both monitors were detected and used. How do I correct this? The second display shows a black screen with the message "Cannot Display This Video Mode".
I was surfing the web like normal with Photoshop in the background (not doing anything in it yet). The monitor goes dark blue/green and then returns to normal, saying Photoshop has stopped working, and saying in the bottom right corner that the display driver had failed and then recovered.
This is probably the most painful issue I have with Windows 10 right now (and likely previous versions as well, but I didn't have a multi-monitor setup back then).
The monitors I have are as follows:
- 3840x2160 (4K UHD) monitor with preferred DPI: 144 (150%)
- 1920x1080 (Full HD) monitor with preferred DPI: 96 (100%)
Whenever one of these monitors is set as primary, all desktop applications displayed on the secondary monitor (it doesn't matter which) have blurry text. The exceptions are the Windows Store apps like the Windows Store and Microsoft Edge, along with the Taskbar/Start Menu, the Taskbar/Start Menu settings screen, the Taskbar context menu, and the desktop context menu, which all pass the DPI test with flying colours, with crisp text on both monitors (occasionally a DPI switch bug gets in, but I can mostly ignore that). The problem, as you can probably guess, is that more than 99% of the applications I use aren't Windows Store apps.
Here are some screenshots. The "Taskbar and Start Menu Properties" text is what the text should look like, while the Visual Studio 2015 text is an example of the text most desktop apps get. The blurry image is what happens when the UHD monitor is not the primary monitor. Attachment 48493, Attachment 48494. Note: both of these screenshots came from the 150% DPI monitor, so they're best viewed at that (144) DPI level. The 96 DPI monitor is similarly affected.
Things I've already tried:
- Reinstalling the graphics driver. Did this multiple times, in fact, for unrelated reasons.
- Reinstalling Windows 10 (through the Reset This PC recovery option). I did this for unrelated reasons too, but it definitely doesn't fix this issue.
- Using the XP Explorer "fix". Merely worsens the problem.
- Adjusting ClearType options. Alleviates the issue a bit, but see the next point.
- Disabling ClearType on the affected monitor. The text obviously sharpens, but it's painful to read, and a close inspection of the text reveals the issue isn't solved at all, only mitigated slightly.
- Replacing the video card. I swapped in my older GTX 560 Ti, but it's obvious the problem remains. Both it and my current card are NVIDIAs, though, so it's vaguely possible the drivers or the cards themselves are the cause. I don't have an ATI/AMD card (that still works, at least) to test the setup, and every Intel iGPU I have either has only one monitor output or is incapable of handling UHD resolutions.
Things I won't try:
- Setting both monitors' DPI to 96. Text would become microscopic considering the UHD monitor's actual size.
- Using the text resizing feature instead. I'm going to take a wild guess that this is not monitor-specific and would cause everything on the HD monitor to be far too large, to the point that I'd rather unplug it.
Is anyone running a multi-resolution, multi-DPI, multi-monitor setup, with or without this issue? The text is painful to read on whichever monitor is secondary right now, and it's extremely apparent whenever the background is dark.
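The pattern described here (Store apps, taskbar, and context menus crisp, classic desktop apps blurry on the non-primary monitor) matches how Windows 10 handles mixed DPI: applications that are not per-monitor DPI aware are rendered once at the primary monitor's DPI and bitmap-stretched by DWM on the other screen. Below is a minimal sketch of the opt-in an application itself has to make, assuming Windows 8.1 or later (shcore.dll); it is something an app author does, not something you can force onto Visual Studio from outside.

import ctypes

PROCESS_PER_MONITOR_DPI_AWARE = 2  # from shellscalingapi.h

# An app that makes this call before creating windows is rendered at each
# monitor's own DPI; apps that don't are scaled as bitmaps on the secondary screen.
ctypes.windll.shcore.SetProcessDpiAwareness(PROCESS_PER_MONITOR_DPI_AWARE)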
I have a digital TV hooked up to my laptop via HDMI, and Windows 10 doesn't detect it. But when I go into AMD Catalyst, it sees it. What am I doing wrong?
So I recently had to reinstall Windows 10 to resolve an issue I was having, and my NVIDIA drivers were deleted in that process. When I went to turn my computer back on after I was finished, there was no display on the screen, so I took out my GPU and plugged the VGA into the integrated graphics, and it worked fine. This made me think I just had to install the NVIDIA drivers again, but then I found out that you need the GPU in your computer to install the drivers, and my display won't work anymore with the GPU inside and no drivers for it.
In the NVIDIA Control Panel, when I select my settings and click Apply, everything is OK. The next time I reboot, they change back. Why are they not being saved?
I upgraded my Win 7 Pro to Win 10 Pro last night and lost my 3840x2160 @ 30Hz setup, being reduced back to 1920x1200. I've since done a clean install and reinstalled the latest drivers for my Samsung U28D590D and my ATI Radeon HD 5770. Nothing I have done has given me back my 4K resolution.
In October I bought the MSI GTX 980 graphics card as an upgrade for my Radeon R9 280X. I presumed that I would be getting much better performance in games, since the 980 is a much more powerful card, and this was the case in some games like GTA 5, where I can run the game at much higher settings. However, I have found that in a lot of other games my performance seems to have gone down; as an example I will use Borderlands 2, as I have been playing this a lot recently.
Using my old card I was getting a pretty much constant 90+ FPS when playing Borderlands 2, but with my current card I am getting less than 60 most of the time, and during combat it can drop into the 20s. This doesn't make any sense to me, as I have not changed any other hardware besides the graphics card, and somehow it is performing worse than my old card.
I understand that the CPU is one of the biggest FPS factors and know that an upgrade to that would be needed for a lot higher FPS but surely an upgrade shouldn't cause an FPS decrease.
When I downloaded Win 10 yesterday, both my monitors worked fine. Now Win 10 says it can't detect the second monitor and can't save settings for a second monitor. Monitor 2 is plugged in (the Surface recognizes it with a chirp) and turned on. How do I fix this?
Whenever I try to install the AMD drivers, either by running the installer or via auto-detect, at the end I get a message saying the files did not install and to check the log file. I've got an ASUS R9 280X. The Control Panel shows "Microsoft Basic Display Adapter" under display adapters.
I have an ASUS desktop. I plugged a working HDMI cord into my older Sony Bravia TV (a smart TV). I have a GTX 760 192-bit NVIDIA graphics card. I went to the NVIDIA Control Panel, and it won't read the TV. I tried going through the Windows 10 64-bit way of finding connected devices, and the TV shows up, but only as a media streaming device. It almost seems like the PC itself doesn't know it has an HDMI port.
My computer crashes every time I turn it on. It crashes after around 15 minutes. After it crashes once, it then crashes right away, right before or after the welcome screen. Before it crashes, my screen sometimes fuzzes, and sometimes it doesn't. I've tried a lot of things, like deleting my graphics driver completely using the DDU program (or something like that) and reinstalling only the display driver.