stupid question about default display settings when turning on the PC

Cirkustanz
So, I have two displays hooked up to my computer these days. One is a monitor, the other is a television. I only use the television when I am playing a game. The TV is in front of my couch, in a corner of the living room; the monitor is by my desk.

My card is an ASUS ROG Strix OC 4090. The monitor is connected to one of the DisplayPort outputs, and the television is connected to one of the two HDMI ports on the card.

Sometimes I go several days without playing games, and for *reasons* I really only enable the television as a display when I want to play a game on my couch. (The old MMO I play loses the ability to use windowed gamma to change the brightness whenever two displays are connected, so when I play it I specifically disable the television.)
[Screenshot: Windows display settings with the television disabled]
1 is the monitor (2560x1440), 2 is the TV (4K), and it looks this way because the television is disabled. (Whenever the television is enabled, the size comparison in Windows' display settings obviously shows display 2 as much larger than display 1.)
[Screenshot: the Multiple displays setting]

Whenever I want to play a game on the couch, I just change the Multiple displays setting to either have both enabled, or to enable just display 2 (the television) and set it as the main display.

Sounds good, right? Pretty simple?

Here's the stupid question part:

When I turn my computer off, I always turn it off as in the first screenshot, with display 1 (the monitor) as the only active display in Windows. But even if the television is powered off (both when the PC is shut down and when I turn the PC back on), Windows tries to use the television as the primary display. The UEFI shows on the monitor without any problem, but the second it gets to Windows the monitor shuts off, and it stays that way right up until I turn the television on. THEN I see Windows on the monitor as I would expect to.

But if I don't enable the television in Windows' display settings, then when my television eventually powers off from not receiving a signal, guess what happens... my monitor shuts off. I've tried changing which HDMI connector the television is plugged into, I've even tried running the television through a DP-to-HDMI adapter, and it still does it.

It's so freaking annoying! What am I doing wrong?
 
It sounds more like an issue with the TV, or the communication between the TV and Windows, or just a Windows issue. Everything sounds correct to me.
 
Have you tried always having your computer monitor as the primary and using the Windows+P shortcut to extend to the TV when you want to play on it, then Windows+P again and "PC screen only" to go back?

You could put a DisplaySwitch command in a .bat file in your Windows startup folder to run it on boot if needed (or give it a keyboard shortcut you can execute without seeing anything on your monitor).
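Something along these lines should do it as the startup .bat (rough sketch; DisplaySwitch.exe ships with Windows and its switches mirror the Win+P choices):

@echo off
rem force-pc-screen.bat - force "PC screen only" at every logon.
rem Drop this (or a shortcut to it) into the shell:startup folder.
rem DisplaySwitch.exe switches map to the Win+P menu:
rem   /internal = PC screen only    /clone    = Duplicate
rem   /extend   = Extend            /external = Second screen only
%windir%\System32\DisplaySwitch.exe /internal

A second one-liner with /extend (or /external) can sit behind a hotkey for when you move to the couch.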

A more boring solution would be, if this does not occur when you wake up from sleep, to never shut down the computer.
 

Windows+P is just a way to change which displays you want to use a little faster than manually selecting them in the Windows display settings. When I am powering off the computer, the television is intentionally disabled, because I do not want it to be a valid display when I am not gaming on the couch. Having it enabled when I am not using it introduces two undesired side effects.

#1 Applications can sometimes go to the wrong display, such as ones that are just weird and always open on the highest-resolution or furthest-to-the-right display, which is more of a problem when that display is a television on the other side of the room.

#2 As mentioned earlier, an old MMO I still play is just ancient, and whenever you have two displays running you lose the ability to use windowed gamma, which is a problem because without it the game is far too dark. Adjusting the gamma on the monitor itself while that game is being played is not a solution, because the UI becomes too bright.

But don't let that weird and buggy MMO confuse the topic... when I turn the computer off, the television is disabled as a display, and the behavior I'm experiencing is that Windows just seems to want to use all displays that are connected, even the ones I had disabled when I shut down.
 
I did once, but I have not done that lately. The monitor is the primary (and only) display when the PC gets turned off.
You might want to explicitly set the monitor as primary with the TV on and connected. It's possibly remembering previous settings.
 