GPU Will Only Display To TV When Another Monitor Is Also Connected To Motherboard?
Aug 9, 2015
For the 2 months since I got my new PC, I have been using my TV as its monitor. I had a few issues with resolution due to my TV overscanning the display, but I worked around them with the resize feature in the Nvidia control panel. However, yesterday when I installed Windows 10, the installation went fine, but when my computer came back up it was trying to use my TV's native resolution, which was a problem because of the overscanning. So I went to the Nvidia control panel, and when I clicked the resize button (before choosing the new resolution) the TV just said it was receiving no input. Whenever I then tried booting my PC, it would display to my TV until Windows had loaded, then the TV would say no input.
The same thing happened when I connected the TV through my motherboard rather than my GPU. I spent ages trying to get it to work and eventually connected the PC to an old monitor. However, the monitor only has a DVI input, which my GPU doesn't have, so I had to connect it through my motherboard, and it worked fine. I then uninstalled all the Nvidia software from my computer, including the GPU from Device Manager, and manually installed the new GTX 970 Windows 10 driver from the Nvidia website.
When this was done, I connected my graphics card to the TV through HDMI, keeping the computer connected to the monitor through the motherboard. Both displays then worked as a dual-display setup. However, even now, whenever I boot up my PC with only my GPU connected to the TV, the same thing happens as before, and the TV will only display through the GPU when the motherboard is also connected to the monitor. I have ordered a DVI-to-HDMI cable so I can attach the monitor directly to the GPU, which will hopefully work, and then I will stop using my TV, but I would rather fix this problem and carry on using the TV until I can buy a better monitor.
So recently I upgraded to Windows 10 at the same time I got a 4K display. When I set the display scaling to 150% on my main 4K monitor and keep the scaling at 100% on my other two 1080p monitors, the entire Chrome application is blurry when viewed on the 1080p monitors, and I cannot seem to fix this. Other applications seem to be okay.
Furthermore, after the scaling change, some icons on my 1080p monitors' desktops get larger, as if some scaling is applied to them. Is there any solution to this? At the moment I have had to set the scaling back to 100% on my main monitor in order to use Chrome on the smaller monitors.
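The only workaround I've seen suggested so far (untested on my setup, and the flag names may have changed in newer Chrome builds) is to turn off Chrome's own high-DPI handling by adding two switches to the end of the Target box in the Chrome shortcut's properties, e.g.:
"C:\Program Files (x86)\Google\Chrome\Application\chrome.exe" --high-dpi-support=1 --force-device-scale-factor=1
Reportedly this stops Chrome being rescaled and blurred on the 100% monitors, at the cost of a smaller UI on the 4K screen; the install path above is just the default, so adjust it to wherever chrome.exe actually lives.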
I got an external monitor today that does not have speakers built in. I was surprised to find that when the monitor is connected (over DisplayPort), sound will not come out of my Bluetooth speaker or my Windows 10 laptop's main speakers. How can I get around this huge issue?
I have temporarily connected a 22-inch LCD TV (until I get a new monitor) to my PC via an HDMI lead, but I cannot get the resolution set right. The recommended native resolution makes the taskbar disappear off the screen, and the desktop icons are barely visible on the left side of the screen.
I have tried various resolution settings, but none are any good apart from one, which makes the desktop look like a laptop screen.
I have reset Windows 10 and updated the Intel graphics drivers to the latest version, but the resolution still cannot be set correctly for this monitor. I have tried a VGA lead as well.
Is it because you need a proper PC monitor to get the correct resolution?
I'm running Build 10074 on my laptop, and I connected my external display to it, as well as my external keyboard and mouse, at my desk so I could sit there and do the setup etc. without moving all of that out of the way. I had the laptop lid shut and off to the side while I was doing this. Then I shut the laptop down, unplugged all the external peripherals, and turned it on again. It went through the Windows boot sequence with the spinning circles, then a black screen.
After 20 minutes of frustration, I realised that the laptop would show a login prompt only when I plug in the external monitor BEFORE I start it, and only if the laptop lid is closed while it is booting (when the lid is closed during boot, the Dell logo and spinning circles come up on the external display).
I further figured out that Control Panel only shows the external display; it's like the computer doesn't even recognise the internal display anymore. I don't know why that would be, since when I first booted it up and installed drivers, I was using the internal display and everything was working fine.
Every time I restart or boot up my PC with my monitors turned on first, I get an issue with the monitor connected to the HDMI port: it says "Input not supported", and all I do is turn the monitor off and back on and the issue is resolved...
I'm using the same settings and programs as in my 8.1 installation. Nothing funny or game-related is running in the background (I also tried disabling everything possible). The problem is that the monitor will not turn off, either from the Power/Display power-off options (nothing happens after the set time passes) or with a program like ScreenOff or Monitor Off.
Using one of these apps will start turning the monitor off, but it turns itself back on automatically almost immediately. Something is interfering, but I cannot figure out what. No security software is installed (I'm using Defender, and tried disabling it). The only way this works as intended is in Safe Mode.
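The only diagnostic I've found so far is asking Windows what is holding the display awake, from an elevated PowerShell window:
powercfg /requests
Anything listed under DISPLAY or SYSTEM (a driver, a background process, an audio stream) is what's blocking the display from turning off, though I'm not sure it catches every culprit.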
I upgraded my Windows 8.1 computer to 10 over a month ago. Although I like Windows 10 a lot, there is a problem I've been having with the monitor not turning off automatically.
The display is set to turn off after 5 minutes of inactivity, but it never does, no matter how long I set the wait time. I set up a hotkey to turn the monitor off manually. It worked in 8.1, but in Windows 10 the monitor turns off for half a second and then turns right back on.
I did not change any hardware settings when I went from 8.1 to 10. I've tried unplugging various things connected via USB but nothing so far has solved this problem.
The monitor I have is the Dell E2414Hx 24-Inch Screen LED-Lit Monitor.
I have a Toshiba laptop running Windows 10. My main display is an LG 24-inch widescreen monitor plugged into the laptop.
With Windows 7, I had no problems using Fn+F5 to set the display to only the LG (showing as "CRT" on screen). This did not change when I rebooted the laptop.
With Windows 10, it reverts back to the default setting and I get both the laptop screen and the monitor lit up. I frequently get a different resolution on the display too, until I open the laptop and reset the display with the Fn key. This is annoying, as I use an external keyboard (which doesn't have an Fn key) with the laptop lid down.
I've also tried it via Control Panel (Display), but that doesn't hold the setting either.
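One workaround I'm considering (untested, and assuming the built-in tool behaves the same on Windows 10 as it did on 7/8) is a startup shortcut that forces external-only output:
%windir%\System32\DisplaySwitch.exe /external
Putting a shortcut to that command in the Startup folder should re-apply "Second screen only" at each logon, the same as pressing Win+P and choosing it, even with the lid down.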
I have three PCs all running the same OS setup, but on one of them, no matter what I try, the display goes blank after 2 minutes; move the mouse and the desktop is right there again! I found a link, but it was for Windows 7, about a setting called "System unattended sleep timeout" that can be added to Advanced Power Settings, but I can't find anything like this in Windows 10. And I do not know whether this feature is even part of Windows 10; it could be something totally different.
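For what it's worth, the recipe I found for exposing that hidden setting uses the GUID commonly quoted for "System unattended sleep timeout" (I haven't verified it applies unchanged to Windows 10). From an elevated PowerShell window:
# Unhide "System unattended sleep timeout" so it appears under Sleep in Advanced Power Settings
powercfg -attributes SUB_SLEEP 7bc4a2f9-d8fc-4469-b07b-33eb785aaca0 -ATTRIB_HIDE
# Then raise the timeout (value in seconds, here 20 minutes) and re-apply the active plan
powercfg /setacvalueindex SCHEME_CURRENT SUB_SLEEP 7bc4a2f9-d8fc-4469-b07b-33eb785aaca0 1200
powercfg /setactive SCHEME_CURRENT
I'd still like to know whether this setting is really what's kicking in after 2 minutes on that one PC.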
Today I upgraded to Windows 10, and the first thing I noticed was that the display on my main monitor was out of bounds (it's 1920x1080, by the way). I know the taskbar and icons on the desktop are there, but they aren't within the frame of the monitor. My other display is flawless, but the main one isn't. I turned the refresh rate down from 60 to 30 on the main monitor, and that let me see everything on my desktop; however, there is a flicker when I mouse over things, and I also feel there is some input lag with the mouse.
Since upgrading from Windows 7 x64 to Windows 10, the monitor/display on my desktop will no longer go to sleep/turn off after the scheduled 5 minutes. I had no issues with this in Windows 7. I have updated the BIOS, video drivers, USB wireless mouse driver and anything else that could be related, and there is no change. All I want is for the monitor to turn off after the set time. I don't use a screensaver. There is no driver update for the monitor (ViewSonic VX2450). I've also tried uninstalling and reinstalling the monitor driver, but no change. It seems the system also will not sleep when set to sleep.
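In case it's useful, here is how I've been double-checking that the timeout really is set on the active power plan (run in a PowerShell window):
# Show the current "Turn off display after" values for the active plan
powercfg /q SCHEME_CURRENT SUB_VIDEO
# Re-apply a 5 minute timeout on AC power, just to rule out a stale setting
powercfg /change monitor-timeout-ac 5
If the values look right and the display still never turns off, powercfg /requests should show whether a driver or program is actively holding it on.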
I just upgraded from Windows 7 to Windows 10, and now my computer only detects one display rather than two. I have a Dell desktop computer and two Dell monitors. Under Windows 7 both monitors were detected and used. How do I correct this? The second display shows a black screen with the message "Cannot Display This Video Mode".
I've been running a multi-monitor setup for a while now, with an ASUS VG248QE 144Hz monitor as my main, and one or more spare monitors as secondary displays. I've set the ASUS monitor to 144Hz without issue in the past, while the other monitors stay at 60Hz.
The other night, Windows wanted to update. I figured it was the usual patch session, so I let it do its thing. However, after restarting my computer, I noticed that my ASUS monitor would only operate at 144Hz if it was the only monitor plugged into the system. When I try to set it back to 144Hz in the Nvidia control panel, it just reverts to 60Hz after I apply the change. I updated my graphics drivers (though I was only one version behind) and even sought out a special ASUS driver for the monitor to resolve the issue, but nothing has worked.
Did Windows recently add a forced refresh-rate sync between all monitors? If so, that really sucks, because I only own one 144Hz monitor.
I installed Windows 10 and found some apps did not fit on the monitor screen (too big). I went in and changed the monitor resolution (Windows recommended 1920 x 1080, but I tried something larger), and now the monitor has a black screen with a little box moving around that says "Input not supported". Not being able to use that monitor to go in and change it back, I hooked up a flat-screen TV, which worked, but Windows only showed that monitor's size (which was smaller than the original monitor's).
I tried the monitor on another computer I have, also running Windows 10, and it worked fine there; it shows the resolution as 1920 x 1080. When I plugged it back into the original computer, I got the same black screen with "Input not supported", so I can't change it there. I suspect the issue is somewhere in the registry in Windows 10 on the first computer, but I don't know what the problem is. I've been working on this for a whole day now. You can probably tell I'm not well versed in computer technology. I just want my old monitor back (Acer K272HL), and I promise I won't mess around with it again.
I recently got a new monitor, an Eizo EV2750, which Windows only recognizes as a generic PnP monitor.
I've tried connecting it through DisplayPort, HDMI and DVI without any luck. The often-mentioned solution, to unplug everything and wait 10 minutes, does not work either.
The graphics card does seem to read the EDID, however, as the Nvidia control panel displays it correctly as EV2750. The same goes for other applications, like DisplayCAL, which I use as my color management system.
Seems like it's only an issue with Windows. This issue, however, also means that I can't get full functionality from the monitor. For instance, the ambient light sensor does not work (which is typical when Windows doesn't recognize the display properly).
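For anyone curious, this is how I've been checking what name Windows itself has pulled from the EDID (a quick PowerShell sketch that just decodes the UserFriendlyName field exposed by the WmiMonitorID class):
# List the model name Windows has stored for each connected monitor
Get-WmiObject -Namespace root\wmi -Class WmiMonitorID | ForEach-Object {
    -join ($_.UserFriendlyName | Where-Object { $_ -ne 0 } | ForEach-Object { [char]$_ })
}
If that comes back as EV2750, the EDID is reaching Windows and the problem is just the Generic PnP Monitor driver binding; if it comes back blank, the EDID itself isn't making it through.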
I just installed Windows 10, and now my laptop's built-in display won't work, only the external monitor. I go to the graphics settings and click on multiple displays, but it states that it doesn't see another device, only the external monitor. I tried disconnecting the external monitor and rebooting, but the laptop display still doesn't turn on.
I've been planning on swapping out my motherboard for a new one. I've looked around, and it has come to my attention that I would need to reinstall my operating system. But the copy wasn't mine, and I no longer have contact with the friend who let me use it. My question is: do I need the disc that I used to install it, or any sort of key? And if so, are there any alternatives to get it cheaper? (My OS is Windows 10, and I had Windows 7 before I upgraded.)
So I am going to upgrade from an A8-7600 APU to possibly an AMD 6-core CPU, which will require a new motherboard since that's an AM3+ part. Since I need to change motherboards, and on the current PC I upgraded my Windows 7 to Windows 10, does that mean I cannot use this copy of Windows 10 when I change my mobo and CPU? Would I basically have to buy a new copy of Windows?
MSI R9 390 GPU
AMD Catalyst version 15.7.1
Main monitor: CrossOver QHD270 via Dual Link DVI
Second display: Vizio TV via HDMI/DisplayPort
So I was messing around with a 3-display setup for the first time today: one monitor (DVI) and two TVs (one HDMI, one DisplayPort). Everything worked okay, and I could unplug one TV, but when I unplugged the second TV, my monitor went to a black screen. As soon as I plugged the TV back in, the monitor started working again. The main monitor I use every day is now completely dependent on a TV I only occasionally use as a second display being plugged in to function.

When I turn on the computer with just the monitor plugged in, I can see the BIOS load and the first Windows 10 logo, but as soon as the password screen should appear, the screen goes black. I tried uninstalling my graphics drivers; when the drivers were completely uninstalled, the monitor functioned normally, but as soon as AMD Catalyst Control Center was reinstalled, it went black until the TV was plugged back in.

I've tried plugging the TV in through both HDMI and DisplayPort, and it makes no difference. Also, my monitor is listed as the main display in both CCC and the Windows 10 display settings. Every once in a while, when I unplug the TV, the monitor gets darker and slowly fades to black instead of going straight to black.
I'm just wondering: once I upgrade my OS to Windows 10, does that upgrade bind to my current motherboard so that it will only work with the motherboard it was upgraded on? I am asking because I am planning to upgrade one of these days, but I also have plans to upgrade my CPU during the next year, which will ultimately require a new motherboard. So I just want to know whether I can upgrade without worrying that once I change my motherboard, the OS will become invalid and I will need a new one.
As the question says, I want to upgrade my motherboard and CPU. Recently I got the Windows 10 upgrade for free, and I heard that a significant hardware change can deactivate Windows. I don't want to go back to Windows 8...
So after some recent hardware problems I had to replace my motherboard. I bought a different model than the one I had previously. I had previously installed a copy of Windows 7 and had it activated by a man on Craigslist, who I assume had a membership or something, because the key was legit. I later upgraded to Windows 10.
Now my computer is fixed and Windows 10 is installed, but it will not activate because the OS considers this motherboard a "new device." Also, I do NOT have a key for any version of Windows, and I would really rather not spend 100 dollars buying another copy.
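For my own troubleshooting I've at least been checking what the licensing state actually is, using the standard Software Licensing script commands from an elevated PowerShell window (nothing specific to my setup):
# Pop up a dialog saying whether this install is permanently activated
slmgr /xpr
# Show detailed license info, including the channel (Retail/OEM/upgrade) and the partial product key
slmgr /dlv
That at least confirms it really is an activation block from the hardware change, though it doesn't solve the missing-key problem.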
I am building my first computer with Windows 10 64-bit, and I can't find the USB 3.0 drivers for my motherboard. Do the USB 3.0 drivers come with Windows 10? Here is my motherboard and the available drivers from ASRock for my motherboard; I see everything except USB 3.0 drivers. URL...
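From what I've read, Windows 10 ships with its own xHCI (USB 3.0) driver, so a separate download usually isn't needed. This is the quick PowerShell check I was given to confirm the controller is present and working (just a sketch; the name filter is a guess at how the controller will be labelled):
# List present USB controllers and their status; the xHCI entry is the USB 3.0 controller
Get-PnpDevice -Class USB -PresentOnly |
    Where-Object { $_.FriendlyName -match 'xHCI|eXtensible' } |
    Select-Object FriendlyName, Status
If it shows Status OK, the inbox driver is handling it; if the controller instead turns up under "Other devices" in Device Manager, then the board does need a vendor driver after all.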
I've clean-installed Windows, reinstalled all my drivers, and reset the BIOS, but my computer is still performing slowly. I went from an MSI FM2-A75MA-E35 motherboard to an MSI A55M-E35, and from Corsair Vengeance 8 GB (2 x 4 GB) CMZ8GX3M2A1600C9 to Crucial Ballistix Sport 8 GB (1 x 8 GB) BLS8G3D1609DS1S00. I don't believe there should have been much of a performance change because of the hardware, so I'm thinking I probably did something wrong.