Graphic Cards :: No Audio Via HDMI After Monitor Wakes From Sleep - AMD Radeon HD 7800
Aug 9, 2015
I'm running my audio via HDMI from my graphics card and it plays through the speakers built into my monitor.
When I first did the Win 10 update I had no audio whatsoever. Following some suggestions on online forums, I disabled the Realtek High Definition Audio device, and after a reboot my HDMI audio was working.
But then I had another weird issue. My PC is set to never sleep, as I use it as a media server, but my screens turn off automatically after 10 minutes. What I'm finding is that when I wake up my monitors, the audio stops. When I view playback devices I can't run any tests because "the device is already in use by another application".
(Note: I have the checkbox for allowing an application to take exclusive control of a device unchecked.)
So far I've found only two ways to restore audio: 1) restart the PC, or 2) go into Device Manager > Sound, video and game controllers > AMD High Definition Audio Device > Disable, wait a few moments, then Enable.
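If you end up doing that disable/enable dance a lot, it can be scripted instead of clicking through Device Manager each time. Below is a minimal sketch in Python that drives the PowerShell PnpDevice cmdlets (Disable-PnpDevice / Enable-PnpDevice, available on Windows 8.1 and later); the instance ID is a placeholder you would replace with the value from the device's Details tab, and the script has to run from an elevated prompt.

    # Sketch: cycle the AMD HDMI audio device off and back on, mirroring the
    # Device Manager workaround. Run as administrator.
    import subprocess
    import time

    # Placeholder: copy the real value from Device Manager ->
    # AMD High Definition Audio Device -> Details -> Device instance path.
    INSTANCE_ID = r"HDAUDIO\FUNC_01&VEN_1002&..."

    def ps(command):
        """Run one PowerShell command and raise if it fails."""
        subprocess.run(["powershell", "-NoProfile", "-Command", command], check=True)

    ps("Disable-PnpDevice -InstanceId '{}' -Confirm:$false".format(INSTANCE_ID))
    time.sleep(5)  # give the driver a few moments, as in the manual workaround
    ps("Enable-PnpDevice -InstanceId '{}' -Confirm:$false".format(INSTANCE_ID))

A shortcut to that script on the desktop is at least quicker than restarting the PC or digging through Device Manager after every wake.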
It worked fine in Windows 7, but now when I choose "Duplicate display" from the side menu popup, the TV goes blank as if I've unplugged the HDMI. All three of the remaining display options work, including extend display and TV only, so it's not a cable issue. The laptop model is an XPS 17 L702X. When I used to do this in Windows 7, sometimes it wouldn't autodetect, so I would have to right-click the desktop, open the NVIDIA panel and click "rigorous display detection" or something like that, and then the laptop screen would blink and make the beep noise like when an HDMI cable gets connected. Now the NVIDIA panel simply won't allow mirroring; nothing happens when I right-click on the display box that would normally show 1, with a smaller box that would say 2. I don't get it, but I'd really like that feature back if it's something simple I could do.
Every time I restart or boot up my PC with my monitors turned on first, I always get an issue with the monitor connected to the HDMI port: it always says "Input not supported", and all I do is turn my monitor off then back on and the issue is resolved...
I have a Dell 8500 that I recently upgraded from Microsoft Windows 8.1 Professional to Windows 10 Professional. The upgrade went smoothly, but after a few days the system would only boot with a resolution of 640x480. The AMD Radeon HD 7700 is connected via DisplayPort to a Dell U2312HM monitor.
I tried every solution that I could find on the Web and still could not resolve the issue. I did a complete uninstall of the latest AMD Windows 10 Drivers. Since the resolution was stuck at 640x480 I was unable to re-install the drivers.
Based on a suggestion from a Windows TenForums post, I was going to attempt to use another connection on the Dell U2312HM monitor. This monitor has a DisplayPort, a Mini DisplayPort and a DVI connector.
I unplugged the DisplayPort cable from the AMD Radeon HD 7700 to get a better look at the available connectors. The video card has DVI, HDMI and DisplayPort outputs. Since I did not have a long enough cable to reach the monitor, I plugged the DisplayPort cable back in.
When I restarted the computer, the resolution was back to 1920 x 1080.
I checked with two other friends who were having similar issues with a Dell and an HP computer stuck at 640 x 480, and removing and reinserting the DisplayPort cable worked for them as well. I suspect I would have had the same result if I had powered down both the computer and the monitor; in all cases, the monitors and computers had not been powered down at the same time.
I have an Asus desktop. I plugged a working HDMI cord into my older Sony Bravia TV (a smart TV). I have a GTX 760 192-bit NVIDIA graphics card. I went to the NVIDIA Control Panel, and it won't detect the TV. I tried going through the Windows 10 64-bit way of finding connected devices, and the TV shows up, but only as a media streaming device. It almost seems like the PC itself doesn't know it has an HDMI port.
When I connect a DVI or HDMI cable from my GTX 780 to my Acer H274HL monitor, everything works fine, but when I connect a DisplayPort cable to my GTX 780 with HDMI on the other end going to my monitor, there is no signal at all.
I am running Windows 10 64-bit Enterprise; all drivers are up to date, and so are Windows updates.
Just unboxed my computer from CyberpowerPC with a GTX 720 graphics card, installed/updated all drivers with Driver Easy (paid for just now), and I am still having the issue with the HDMI port not working. I have tried to plug into a monitor and a TV, and neither is working; I have also tried multiple HDMI cables, and it still does not work. My monitor has an auto-switch source feature, which is turned on; when I plug my computer into the monitor with HDMI it continually cycles, finds no signal and goes into sleep mode, then rinses/repeats until I either turn off the computer or the monitor, or just unplug the monitor from the computer. Currently I am using DVI (computer) to HDMI (TV) and it's working fine, so I know it's the port on the computer. Do I need to contact CyberpowerPC to send the computer back to get this issue resolved?
I've been running a multi-monitor setup for a while now, with an ASUS VG248QE 144Hz monitor as my main, and one or more spare monitors as secondary displays. I've set the ASUS monitor to 144Hz without issue in the past, while the other monitors stay at 60Hz.
The other night, Windows wanted to update. I figured it was the usual patch session, so I let it do its thing. However, after restarting my computer, I noticed that my ASUS monitor would only operate at 144Hz if it was the only monitor plugged into the system. When I try to set it back to 144Hz in the Nvidia control panel, it just reverts to 60Hz after I apply the change. I updated my graphics drivers (though I was only one version behind) and even sought out a special ASUS driver for the monitor to resolve the issue, but nothing has worked.
Did Windows recently add a forced refresh-rate sync between all monitors? If so, that really sucks, because I only own one 144Hz monitor.
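Not aware of any forced sync, but one way to see what refresh rate Windows is actually driving (as opposed to what the NVIDIA panel shows after it reverts) is to read it back through the Win32 GDI API. A quick Python sketch follows; note that GetDeviceCaps reports the primary display only, so set the ASUS as primary before running it.

    # Sketch: read back the refresh rate Windows is currently applying
    # to the primary display.
    import ctypes

    user32 = ctypes.windll.user32
    gdi32 = ctypes.windll.gdi32
    VREFRESH = 116                      # GetDeviceCaps index for vertical refresh (Hz)

    hdc = user32.GetDC(None)            # device context for the primary display
    print("primary display refresh:", gdi32.GetDeviceCaps(hdc, VREFRESH), "Hz")
    user32.ReleaseDC(None, hdc)

If this still reports 60 after the NVIDIA panel claims 144, the change really isn't being applied at the OS level, which points at the driver or the update rather than the monitor.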
I installed Windows 10 and found some apps did not fit the monitor screen (too big). I went in and changed the monitor resolution to a larger size (Windows recommended 1920 x 1080, but I tried something larger), and now the monitor has a black screen with a little box moving around that says "Input not supported". Not being able to use that monitor to go in and change it back, I hooked up a flat-screen TV, which worked, but Windows only showed the current monitor size (which was smaller than the original monitor).
I tried the monitor on another computer I have, also running Windows 10, and it worked fine. It shows the resolution as 1920 x 1080. When I plugged it back into the original computer I get the same black screen with "Input not supported", so I can't change it there. I suspect the issue is with the registry somewhere in Windows 10 on the first computer, but I don't know what the problem is. I've been working on this for a whole day now. You can probably tell I'm not well versed in computer technology. I just want my old monitor back (Acer K272HL), and I promise I won't mess around with it again.
Most people I've seen have appalling (almost unreadable) quality when using an external monitor with an HDMI cable -- yet it's actually quite simple to fix. Films and video are usually OK, but if you are reading email or doing standard computer work then you want the monitor to give you decent results.
First, ensure that the "Sharpness" setting on the monitor is set correctly -- oversharpening makes text look terrible.
Secondly, use something like "Auto size" so the picture fits the screen properly rather than being slightly too large or too small, even if, say, 1920 x 1080 HD is selected.
Thirdly, use the correct scanning frequency -- in Europe it should be 60 or 59 Hz -- and use the "p" (progressive) choice, not the "i" (interlaced) one.
Finally, use the smart picture settings sensibly if you have them, or set the colour / tint / contrast / brightness controls properly.
Adjusting these controls really does turn an external monitor from, in some cases, almost unreadable into a pleasure to use -- even on cheap 22-inch TVs.
Use the monitor's own menu -- don't do it from the PC display settings.
A typical laptop's video card will be perfectly OK with these types of monitors.
I am cleaning out and setting up an old-ish PC, and whenever I connect it to my TV or monitor it either says "No Input to TV" or the screen is just black when I turn it on. I am using VGA to connect it. I tried with a GTX 650 and with just the built-in graphics, and still no luck. Any reason why?
I'm having a problem with HDMI audio when switching inputs on my monitor. With my last OS I was able to switch between the HDMI inputs and both devices would have audio. Now the PC only has audio when the computer is first started up; once the inputs are switched and switched back, I no longer have PC audio.
I have temporarily connected a 22-inch LCD TV monitor (until I get a new one) to the PC via an HDMI lead, but I cannot get the resolution set right. The recommended native resolution makes the taskbar disappear off the screen, and the desktop icons are barely visible on the left side of the screen.
I have tried various resolution settings, but all are no good apart from one, which makes the desktop look like a laptop screen.
I have reset Windows 10 and updated the Intel graphics drivers to the latest, but the resolution still cannot be set correctly for this monitor; I have tried a VGA lead as well.
Is it because you need a proper PC monitor to get the correct resolution?
Have a user with a Sony Vaio laptop, 15" screen, 1920x1080. They also plug an HDMI cable into a 23" Dell widescreen monitor for a second display. This 23" monitor is also 1920x1080. They are complaining that fonts on this second monitor are blurry. They tried to live with it for a week, but they're asking: can it be made clear again like it was when they had Windows 7?
Are you familiar with the possibility of fonts being blurry on only one monitor if both are set to their native resolution?
I'm asking them to change the font DPI to 100%, log off and back on, and see if it's any better. For whatever reason that seemed to work for my laptop (another Sony Vaio), but my concern is that they will be disappointed with legibility on the laptop screen (it makes everything tiny).
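It can also help to confirm what DPI the session is actually logged on with before and after the 100% experiment, since classic desktop apps scale (and blur) against that system value: 96 is 100%, 120 is 125%, 144 is 150%. A small Python sketch, nothing more:

    # Sketch: report the system DPI the current session is using.
    import ctypes

    user32 = ctypes.windll.user32
    gdi32 = ctypes.windll.gdi32
    LOGPIXELSX = 88                      # GetDeviceCaps index for horizontal DPI

    user32.SetProcessDPIAware()          # opt out of DPI virtualization so the real value is reported
    hdc = user32.GetDC(None)
    dpi = gdi32.GetDeviceCaps(hdc, LOGPIXELSX)
    user32.ReleaseDC(None, hdc)
    print("session DPI:", dpi, "=", dpi * 100 // 96, "%")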
I have a Dell XPS 13 (2016) with a Dell USB-C adapter. It will not detect my NEC LCD external monitor. I have tried the Dell USB-C adapter with a D-SUB cable to the monitor. I have also tried a Monoprice USB-C adapter with a Monoprice HDMI cable. Neither works. The monitor works fine with my Mac. I have told the computer to extend and duplicate the display using the Windows+P command. I have tried to update the driver for the monitor (I have NEC's driver), but it will not install because the monitor is not detected.
My computer currently has 3 monitors connected to it: 2 through the graphics card (R9 280), and 1 through the integrated graphics (AMD HD 3000). Not sure when, why or how, but the integrated graphics can never pick up what type of monitor is connected to it, so it always gives me a bunch of generic 4:3 resolutions.
I've tried uninstalling the drivers and letting them reinstall. I've tried swapping the cables around to see if I could get lucky with one of the other screens, but they all have the same issue. I've tried buying brand-new cables across the board, because sometimes people say that's the issue. But no success.
The closest thing I have found is a custom resolution setter (PowerStrip), but it was created with Windows 7 in mind and doesn't seem to work on Windows 10.
My question is: is there any way to fix the list so I can select the correct resolution? It's a Toshiba TV/monitor, and it seems like everything else attached to my PC (even third-party programs I tried to use to forcibly change the resolution) can spot it and hand me back the correct resolution, but I can't seem to change it. I get Generic PnP through the Control Panel (the only place the monitor shows up), and even there the correct driver doesn't show; it's the Microsoft generic driver.
Mobo: MSI AMD 3000 series, can't find the exact model.
I've updated my AMD drivers to the latest, and so far I can only set the resolution to stock ones that look horrid on the screen. Is there any program in Windows 10 to change the resolution of a screen?
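Resolution changes can also be requested straight through the Win32 ChangeDisplaySettingsEx API, which sidesteps the Control Panel list. The sketch below (Python via ctypes) first prints every mode the driver actually reports for the display and then tries to apply 1920x1080. The device name \\.\DISPLAY3 is an assumption (check which \\.\DISPLAYn is the Toshiba), and this API can only select modes the driver already exposes; if the Generic PnP driver never lists 1920x1080, it will return DISP_CHANGE_BADMODE and an EDID/driver fix is needed instead.

    # Sketch: enumerate the modes offered for one display and try to apply 1920x1080.
    import ctypes
    from ctypes import wintypes

    user32 = ctypes.windll.user32

    class DEVMODEW(ctypes.Structure):
        _fields_ = [
            ("dmDeviceName", wintypes.WCHAR * 32),
            ("dmSpecVersion", wintypes.WORD),
            ("dmDriverVersion", wintypes.WORD),
            ("dmSize", wintypes.WORD),
            ("dmDriverExtra", wintypes.WORD),
            ("dmFields", wintypes.DWORD),
            ("dmPositionX", ctypes.c_long),
            ("dmPositionY", ctypes.c_long),
            ("dmDisplayOrientation", wintypes.DWORD),
            ("dmDisplayFixedOutput", wintypes.DWORD),
            ("dmColor", ctypes.c_short),
            ("dmDuplex", ctypes.c_short),
            ("dmYResolution", ctypes.c_short),
            ("dmTTOption", ctypes.c_short),
            ("dmCollate", ctypes.c_short),
            ("dmFormName", wintypes.WCHAR * 32),
            ("dmLogPixels", wintypes.WORD),
            ("dmBitsPerPel", wintypes.DWORD),
            ("dmPelsWidth", wintypes.DWORD),
            ("dmPelsHeight", wintypes.DWORD),
            ("dmDisplayFlags", wintypes.DWORD),
            ("dmDisplayFrequency", wintypes.DWORD),
            ("dmICMMethod", wintypes.DWORD),
            ("dmICMIntent", wintypes.DWORD),
            ("dmMediaType", wintypes.DWORD),
            ("dmDitherType", wintypes.DWORD),
            ("dmReserved1", wintypes.DWORD),
            ("dmReserved2", wintypes.DWORD),
            ("dmPanningWidth", wintypes.DWORD),
            ("dmPanningHeight", wintypes.DWORD),
        ]

    DEVICE = r"\\.\DISPLAY3"                      # assumption: the Toshiba TV
    DM_PELSWIDTH, DM_PELSHEIGHT = 0x00080000, 0x00100000
    CDS_UPDATEREGISTRY = 0x00000001

    # List every mode the driver reports for this display.
    i, mode = 0, DEVMODEW()
    mode.dmSize = ctypes.sizeof(DEVMODEW)
    while user32.EnumDisplaySettingsW(DEVICE, i, ctypes.byref(mode)):
        print(mode.dmPelsWidth, "x", mode.dmPelsHeight, "@", mode.dmDisplayFrequency, "Hz")
        i += 1

    # Try to apply 1920x1080 (only succeeds if the driver listed it above).
    mode.dmPelsWidth, mode.dmPelsHeight = 1920, 1080
    mode.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT
    ret = user32.ChangeDisplaySettingsExW(DEVICE, ctypes.byref(mode), None, CDS_UPDATEREGISTRY, None)
    print("result:", ret)  # 0 = success, -2 (DISP_CHANGE_BADMODE) = the mode isn't offered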
I have an MSI GS60 Ghost Pro gaming laptop with Intel HD Graphics 4600 and an Nvidia GTX 970M GPU, an i7-4720HQ CPU, running Windows 10. I study abroad; several months ago I went back home and plugged my laptop into the TV via HDMI for gaming as normal. Yesterday I bought an ASUS VC239H external monitor and tried to do the same thing, but there is no HDMI input signal; the laptop cannot detect the monitor (in Display Settings, Intel HD Graphics settings, or Nvidia settings). This is what I tried:
- Fn+F2 and Fn+F8 keys, and Windows+P, switching to duplicate or extend or any other option
- Plugging the monitor into my friend's laptop running Windows 8 through the same HDMI cable, and it works fine; the same with another laptop running Windows 7 ==> the cable and monitor are fine
- Plugging my laptop into my neighbor's smart TV. A "connected" symbol appeared on the TV, but the screen stayed black instead of showing an image, and the laptop couldn't detect the TV
- Reinstalling the Nvidia and onboard graphics drivers and updating to the latest
- Making a clean install of Windows; I tried Windows 8 as well, with the same result
- Updating my BIOS and EC
- Buying an HDMI-to-Mini-DisplayPort adapter to use the Mini DisplayPort with the HDMI cable
OK, so I recently updated to Windows 10, but now when I connect my laptop to my external monitor the laptop screen doesn't show up on the external monitor.
My laptop's screen is broken, so I can't see anything. The laptop is still on the screens that appear immediately after updating, so I could blindly work my way through them, but someone would have to upload pictures of the screens that appear immediately after updating to Windows 10.
I just upgraded from Windows 7 to Windows 10, and now my computer only detects one display rather than two. I have a Dell desktop computer and two Dell monitors. Under Windows 7 both monitors were detected and used. How do I correct this? The second display shows a black screen with the message "Cannot Display This Video Mode".
I was surfing the web as normal with Photoshop in the background (not doing anything in it yet). The monitor goes dark blue/green and then returns to normal, saying Photoshop has stopped working, and a message in the bottom-right corner says that the display driver failed and then recovered.
I've had a problem where I would put my laptop to sleep, and when it wakes up it doesn't save my session; it basically boots from scratch, so when I log in it has to reload all background applications as well as restore my open applications. I've posted this here because it may be related to the Intel HD Graphics 4400 driver I recently updated (which I can't roll back, as I just reset the system).
This is probably the most painful issue I have with Windows 10 right now (and likely previous versions as well, but I didn't have a multi-monitor setup back then).
The monitors I have are as follows: a 3840x2160 (4K UHD) monitor with preferred DPI 144 (150%), and a 1920x1080 (Full HD) monitor with preferred DPI 96 (100%).
Whenever one of these monitors is set as primary, all desktop applications displayed on the secondary monitor (doesn't matter which) have blurry text. The exceptions are the Windows Store apps like Windows Store and Microsoft Edge, along with the Taskbar/Start Menu, the Taskbar/Start Menu settings screen, the Taskbar context menu, and the desktop context menu, which pass the DPI test with flying colours, with crisp text on both monitors (occasionally a DPI-switch bug creeps in, but I can mostly ignore that). The problem, as you can probably guess, is that >99% of the applications I use aren't Windows Store apps.
Here are some screenshots. The "Taskbar and Start Menu Properties" text is what the text should look like, while the Visual Studio 2015 text is an example of the text most desktop apps get. The blurry image is what happens when the UHD monitor is not the primary monitor. Attachment 48493, Attachment 48494. Note: both of these screenshots came from the 150% DPI monitor, so they're best viewed at that (144) DPI level. The 96 DPI monitor is similarly affected.
Things I've already tried:
- Reinstalling the graphics driver. Did this multiple times, in fact, for unrelated reasons.
- Reinstalling Windows 10 (through the Reset This PC recovery option). I also did this for unrelated reasons, but it definitely doesn't fix this issue.
- Using the XP Explorer "fix". Merely worsens the problem.
- Adjusting ClearType options. Alleviates the issue a bit, but see the next point.
- Disabling ClearType on the affected monitor. The text obviously sharpens, but it's painful to read, and a close inspection of the text reveals the issue isn't solved at all, only mitigated slightly.
- Replacing the video card. I swapped in my older GTX 560 Ti, but it's obvious the problem remains. Both it and my current card are NVIDIAs though, so it's vaguely possible the drivers or the cards themselves are the cause. I don't have an ATI/AMD card (that still works, at least) to test the setup, and every Intel iGPU I have either has only one monitor output or is incapable of handling UHD resolutions.
Things I won't try:
- Setting both monitors' DPI to 96. Text would become microscopic considering the UHD monitor's actual size.
- Using the text resizing feature instead. I'm going to take a wild guess that this is not monitor-specific and would cause everything on the HD monitor to be far too large, to the point that I'd rather unplug it.
Is anyone else running a multi-resolution, multi-DPI, multi-monitor setup, with or without this issue? The text is painful to read on whichever monitor is secondary right now, and it's extremely apparent whenever the background is dark.
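For what it's worth, the pattern described above (Store apps, Edge and the taskbar crisp everywhere, classic desktop apps blurry on whichever monitor isn't primary) matches how Windows handles mixed DPI: a process that doesn't declare itself per-monitor DPI aware is rendered at one DPI and bitmap-stretched by the compositor on the mismatched screen. The Python sketch below only illustrates that mechanism (it's not a fix for apps you can't rebuild): it opts itself into per-monitor awareness using the Windows 8.1+ shcore APIs and prints the effective DPI Windows assigns to each monitor.

    # Sketch: declare per-monitor DPI awareness, then list the effective DPI
    # of each attached monitor. Apps that skip the first step get bitmap-scaled
    # (and therefore blurry) on any monitor whose DPI differs from their own.
    import ctypes
    from ctypes import wintypes

    shcore = ctypes.windll.shcore
    user32 = ctypes.windll.user32

    PROCESS_PER_MONITOR_DPI_AWARE = 2
    MDT_EFFECTIVE_DPI = 0

    shcore.SetProcessDpiAwareness(PROCESS_PER_MONITOR_DPI_AWARE)

    MonitorEnumProc = ctypes.WINFUNCTYPE(
        wintypes.BOOL, ctypes.c_void_p, ctypes.c_void_p,
        ctypes.POINTER(wintypes.RECT), wintypes.LPARAM)

    def report(hmonitor, hdc, rect, lparam):
        # Effective DPI is what the monitor's scaling setting resolves to.
        x, y = wintypes.UINT(), wintypes.UINT()
        shcore.GetDpiForMonitor(ctypes.c_void_p(hmonitor), MDT_EFFECTIVE_DPI,
                                ctypes.byref(x), ctypes.byref(y))
        print("monitor:", x.value, "DPI =", x.value * 100 // 96, "% scaling")
        return True

    user32.EnumDisplayMonitors(None, None, MonitorEnumProc(report), 0)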
I just installed Windows 10. Now my laptop's monitor won't work, only the external monitor. I go to the graphics settings and click on multiple displays, but it says it doesn't see another device, only the external monitor. I tried disconnecting the external monitor and rebooting, but the laptop display still doesn't turn on.
So I recently had to reinstall Windows 10 to resolve an issue I was having, and my Nvidia drivers were deleted in that process. When I went to turn my computer back on after I was finished, there was no display on the screen, so I took out my GPU, plugged the VGA into the integrated graphics, and it worked fine. This made me think I just have to install the Nvidia drivers again, but then I found out that you need the GPU in your computer to install the drivers, and my display won't work anymore with the GPU inside and no drivers for it.
In the NVIDIA Control Panel, when I select my settings and click Apply, everything is OK. The next time I reboot, they change. Why are they not being saved?
I upgraded my Win7 Pro to Win10 Pro last night and lost my 3840x2160x30 setup, being reduced back to 1920x1200. I've since done a clean install and reinstalled the latest drivers for my Samsung U28D590D and my ATI Radeon HD 5770. Nothing I have done has given me back my 4K resolution.