I was using a second monitor with Windows 10, then I upgraded to a better monitor. Now, when I start a program, Windows 10 still thinks the old monitor is attached and starts the program on it. Is there a way to tell a program which monitor to start on?
I have my TV connected to HDMI and a small, old monitor connected to VGA on a Radeon 4600 video card. I have Kodi (a multimedia player) set to open full screen when the default user account logs in, and it does so on the main monitor, which is my TV. I have the smaller, older monitor set up so I can still use the computer when someone is watching a movie on the TV. I have multi-monitor set to extend the desktop.
But my problem is that when I open a program, it defaults to opening on the main monitor, and when I open a program from the second monitor, it opens behind Kodi on the main monitor (so I have to click and drag it over to the second monitor). I would therefore like to make the old VGA monitor the main monitor and the TV the second.
Is there a way I can force Kodi to open and stay on the second monitor once I make my TV the second monitor?
After installing Windows 10, startup hangs during the final loading of startup programs on NEC SpectraView II, my monitor color-calibration software, which continues to loop.
I recently updated my four-year-old Acer laptop to Windows 10. I have a calendar program called "windates". I placed a shortcut to it in the Startup folder, but the program does not start when I boot up my laptop. I also put a shortcut on the desktop, which works fine. Everything else has been working fine with Windows 10.
I just upgraded to Windows 10 from Windows 8.1 two days ago and installed Microsoft Office Home & Student 2013 last night. Office seems to be working fine. However, this morning I cannot start any program, such as Adobe Reader or Google Chrome. There is no error message. I can't even uninstall Chrome from Control Panel!
Nothing else changed. The only thing I noticed is that the Microsoft Visual C++ 2005, 2008, and 2010 Redistributables have been added automatically.
I have a program that came with an IP cam I use for remote viewing; it will display the video in IE but not in Edge. The program correctly displays the camera's internal parameter screens (resolution settings, alarm settings, et al.), just not the video.
When I use Edge to view the camera directly by typing in its IP address, I get the normal sign-in screen from the camera and everything I would normally expect, but no video image, just a blank black screen within the camera software's border. When I use IE, the video is fine.
I have read the "How to Change Default Programs" article, but it only shows how to force IE for "groups" of programs, not for a specific program.
Is there a way to force a SPECIFIC program to use IE in Windows 10 rather than Edge?
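For what it's worth, the workaround I'm currently picturing is just a shortcut whose target launches IE directly with the camera's address (the path below is the standard IE location on 64-bit Windows 10; the IP address is only a placeholder for my camera's actual one):

```batch
rem Shortcut target: open the camera page in Internet Explorer instead of Edge.
rem 192.168.1.100 is a placeholder; substitute the camera's real IP address.
"C:\Program Files\Internet Explorer\iexplore.exe" http://192.168.1.100/
```

But that only covers pages I type in myself; I'd still like a way to make the camera's own program pick IE.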
I want a program to start automatically on Windows startup. Nothing needs to be done in the program; I just want it to open. I can't see anything in the program's settings suggesting there is a way.
In Task Manager, under the Startup tab, it says the program is enabled, although I have never seen it active, even in the running programs list, and I can't disable it either.
The program is Gigabyte OC Guru, and I'm running Windows 10. It's not urgent; I just want it to start so the LED on my GPU is the right colour.
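In case it matters, the workaround I've been considering is a logon scheduled task, since OC Guru presumably needs elevation to talk to the GPU (the install path below is a guess for my machine, not necessarily the real one):

```batch
rem Create a task that starts OC Guru at logon with highest privileges.
rem The path in /tr is an assumed install location; adjust to the actual one.
schtasks /create /tn "OCGuru" /sc onlogon /rl highest /tr "\"C:\Program Files\GIGABYTE\OC GURU II\OC GURU II.exe\""
```

But I'd rather understand why the normal Startup entry says enabled yet never actually runs.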
- call another batch file
- start a program and then wait for the program to close
- call another batch file
My goal with this is to work around an issue in Windows 10 where the screensaver won't work when I have my game controllers connected. I already created two batch files where one of them enables my game controllers and the other one disables them. Both these batch files work just fine.
However, I now want to create a third batch file that will do what's listed above: first call the batch file that enables my game controllers, then start my program (an executable), and then, after I exit the program, call the batch file that disables my game controllers.
Here's what I've tried, but it doesn't work: it only runs the first batch file, the one enabling my game controllers.
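From what I've read since, the shape it probably needs is something like this; the file names and the game path are placeholders for my actual files. My understanding is that without `call`, control jumps into the first batch file and never comes back, and `start /wait` is what makes the script pause until the program is closed:

```batch
@echo off
rem Enable the game controllers; CALL returns control here when the script ends.
call enable_controllers.bat

rem Launch the program and wait for it to exit before continuing.
rem The empty "" is the window title START expects before a quoted path.
start /wait "" "C:\Games\MyGame.exe"

rem Disable the controllers again once the program has closed.
call disable_controllers.bat
```

Is that roughly right, or is there something else that would stop the second and third steps from running?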
When I start the system, a few minutes after the desktop appears, I get a notification in the lower-right portion of the screen saying that my anti-virus is off and I need to tap to turn it on.
Is this the normal way Windows 10 handles Internet security programs? I'm using ESET Smart Security 9.
When I use the Start menu on the secondary monitor to open an application, the application opens on the primary monitor.
Is there a way to get the application to open on the monitor I started it from (using that monitor's Start button)?
I don't really see the point of having an extra Start menu and taskbar on the second monitor if it opens the application on the primary one instead of the current one. It defeats the purpose of having a Start button on the secondary monitors. (I mean, if it's going to open the application on the primary monitor anyway, I might as well just use the primary monitor's Start button!)
Also, I'm having a second issue, related to the first one, I guess:
Example: I open Chrome, File Explorer, or whatever on my secondary monitor and close it. When I try to open a new window on any monitor, it opens at the LAST KNOWN LOCATION where I closed it.
This is very annoying when you use virtual desktops or multiple monitors. If I open an application, I expect it to open where I launched it from (monitor, virtual desktop, etc.).
The computer goes to sleep after a period of inactivity; when the mouse or keyboard is used, the tower starts up but the monitor won't. I've gone into Device Manager and made sure the mouse and keyboard can wake the machine, but alas, no monitor. I must reboot every time.
When I click to start some programs, I get an AFS Error Info window. After clicking OK, I then get the UAC window. Except for the extra step, nothing goes wrong; it's just annoying. Some programs requiring UAC show this AFS error, and some don't. What is this, and how do I stop it?
I have installed Aome hdd manager on my Windows 10 PC and laptop. On the PC it appears in the Start menu list and also as a Start menu icon.
But on the laptop it is not showing. I have uninstalled and reinstalled it, created a shortcut from the installed .exe file, and chosen "Pin to Start", but still nothing ...
I upgraded to Windows 10 from Windows 7, and I have some minor problems with the Start menu. When I search for an installed program, most of the time it won't find it and instead suggests searching for it on the web. I believe this is connected to the fact that the "All apps" menu doesn't show all apps, only some of them.
I've been running a multi-monitor setup for a while now, with an ASUS VG248QE 144Hz monitor as my main, and one or more spare monitors as secondary displays. I've set the ASUS monitor to 144Hz without issue in the past, while the other monitors stay at 60Hz.
The other night, Windows wanted to update. I figured it was the usual patch session, so I let it do its thing. However, after restarting my computer, I noticed that my ASUS monitor would only operate at 144Hz if it was the only monitor plugged into the system. When I try to set it higher in the Nvidia control panel, it just reverts to 60Hz after I apply the changes. I updated my graphics drivers (though I was only one version behind) and even looked for a special ASUS driver for the monitor, but nothing has worked.
Did Windows recently add a forced refresh-rate sync between all monitors? If so, that really sucks, because I only own one 144Hz monitor.
I installed Windows 10 and found some apps did not fit the monitor screen (they were too big). I went in and changed the monitor resolution (Windows recommended 1920 x 1080, but I tried something larger), and now the monitor shows a black screen with a little box moving around that says "Input Not Supported". Not being able to use that monitor to change it back, I hooked up a flat-screen TV, which worked, but Windows only showed resolutions for the current monitor (which was smaller than the original monitor).
I tried the monitor on another computer I have, also running Windows 10, and it worked fine; it shows the resolution as 1920 x 1080. When I plug it back into the original computer, I get the same black screen with "Input Not Supported", so I can't change it there. I suspect the issue is in the registry somewhere in Windows 10 on the first computer, but I don't know what the problem is. I've been working on this for a whole day now. You can probably tell I'm not well versed in computer technology. I just want my old monitor (an Acer K272HL) back, and I promise I won't mess around with it again.
I recently got a new monitor, an Eizo EV2750, which Windows only recognizes as a generic PnP monitor.
I've tried connecting it through DisplayPort, HDMI, and DVI without any luck. The often-mentioned solution, to unplug everything and wait 10 minutes, does not work either.
The graphics card does seem to read the EDID, however, since the Nvidia control panel displays it correctly as EV2750. The same goes for other applications, like DisplayCAL, which I use for color management.
It seems to be an issue only with Windows. It also means I can't get full functionality from the monitor; for instance, the ambient light sensor does not work (which is typical when Windows doesn't recognize the display properly).
So recently I upgraded to Windows 10 at the same time I got a 4K display. When I set the display scaling to 150% on my main 4K monitor, while ensuring the scaling stays at 100% on my other two 1080p monitors, the entire Chrome application is blurry when viewed on the 1080p monitors, and I cannot seem to fix this. Other applications seem to be okay.
Furthermore, after the scaling change, some icons on my 1080p monitors' desktops get larger, as if some scaling were applied to them. Is there any solution to this? For now, I have had to set the scaling back to 100% on my main monitor in order to use Chrome on the smaller monitors.
I just installed Windows 10, and now my laptop monitor won't work; only the external monitor does. I go to graphics settings and click on multiple displays, but it states it doesn't see another device, only the external monitor. I tried disconnecting the external monitor and rebooting, but the laptop display still doesn't turn on.
- GPU: MSI R9 390
- Driver: AMD Catalyst 15.7.1
- Main monitor: CrossOver QHD270 via dual-link DVI
- Second display: Vizio TV via HDMI/DisplayPort
So I was messing around with a three-display setup for the first time today: one monitor (DVI) and two TVs (one HDMI, one DisplayPort). Everything worked okay, and I could unplug one TV, but when I unplugged the second TV, my monitor went to a black screen. As soon as I plugged the TV back in, the monitor started working again. The main monitor I use every day is now completely dependent on a TV I only occasionally use as a second display being plugged in.

When I turn on the computer with just the monitor plugged in, I can see the BIOS load and the first Windows 10 logo, but as soon as the password screen should appear, the screen goes black.

I tried uninstalling my graphics drivers. With the drivers completely uninstalled, the monitor functioned normally, but as soon as AMD Catalyst Control Center was reinstalled, it went black until the TV was plugged back in. I've tried connecting the TV through both HDMI and DisplayPort; it makes no difference. Also, my monitor is listed as the main display in both CCC and the Windows 10 display settings. Every once in a while, when I unplug the TV, the monitor gets darker and slowly fades to black instead of going straight to black.