I am posting this in hopes it will save someone else hours of frustration and countless trips through discussion forums over how to fix a fairly simple problem. In my home office I have two separate computer systems. One is an off-the-shelf HP computer that runs both Windows Vista and Ubuntu Linux. The other, my primary system, was custom built at the end of last year to replace my aging Vista system; it runs Windows 7. On the primary system I have an Acer H243H 24″ monitor, which has inputs for VGA, DVI and HDMI. That computer has an AMD ATI Radeon HD 4250 (integrated with the motherboard), which has outputs for VGA, DVI-D and HDMI.
Earlier this year I decided I would add Ubuntu, and maybe some other Linux distros, to my old Vista system. I figured if I borked the computer it would not be a big deal to do a recovery and start from scratch. I had thought about going the KVM switch route to run both systems, but I had read there were a lot of issues with those, and besides, I often end up using both computers at the same time. Because of the latter, I bought an HP 20-inch flat-screen monitor, which has only a standard VGA input, for the Vista/Linux system.
I had been trying to find a way to redo my office so that it would be easier to use both computers. They sat on separate desks, and I would have to roll across the room to reach the other computer. The other day I got to thinking about my main computer and monitor: both have multiple inputs/outputs. So I did a little research and determined that the AMD ATI Radeon HD 4250 will indeed support dual monitors, but only as VGA plus either HDMI or DVI. Okay, that would work, since the smaller monitor is VGA-only.
So I went to my local office-supply mega-store and found a desk that would accommodate not only both monitors but both computers as well. While I was there I also picked up an HDMI cable. Before I even unpacked the desk, much less started assembling it, I wanted to make sure I could do what I intended: run dual monitors on my main system, and both systems on the 24-inch monitor. In theory I could switch that monitor's input to VGA when working on the Vista/Linux system, while the smaller monitor kept displaying my main system. I disconnected the VGA cable from the computer and connected the HDMI cable between the monitor and the computer. I fired up the computer and the monitor. The computer booted, but the monitor just displayed 'No signal'. I wasn't sure what the problem was, but it turned out the HDMI cable was loose. Once it was reseated, the new monitor displayed my Windows 7 desktop. I then connected the smaller monitor via VGA, and Windows 7 picked it up and showed two monitors in my display settings.
Okay, so far so good. However, I noticed the 24-inch monitor connected via HDMI had a black border around the screen and the display was distorted. I knew it had something to do with Windows, because the display had been full screen while I was booting up and poking around in the BIOS. A quick Google search turned up that this is a common issue with HDMI monitors and AMD/ATI video cards: by default, HDMI scaling (underscan) is enabled, which shrinks the image so it sits inside a black border. The fix is really simple: just go into ATI Catalyst and turn off HDMI scaling.
First problem: I couldn't find ATI Catalyst anywhere on my system. I knew that a couple of months ago I had to reinstall the video driver and the .NET Framework because I was getting MOM implementation errors on boot-up. So I went to the ATI site and discovered the suite had been renamed AMD VISION Engine Control Center. I downloaded and installed it, but when I went to run the application it would act as if it were launching and then never appear. Another Google search turned up that this too is a common issue with AMD/ATI cards. I spent the next several hours uninstalling and reinstalling AMD VISION Engine Control Center, repairing the .NET Framework, and even running a registry cleaner to remove all references to AMD, ATI and CCC (Catalyst Control Center), all to no avail.
I looked around some more and came across another forum. One of the replies was from someone who said they ran into this issue after every update. Their solution was very simple, and it actually worked for almost everyone:
- Run the Express Uninstaller for AMD VISION Engine Control Center and select uninstall ALL.
- Reboot (Windows wants you to do so anyway).
- Run the Express Installer for AMD VISION Engine Control Center.
- Reboot again (Windows does NOT prompt for this reboot).
- Run AMD VISION Engine Control Center.
The problem I was having was that I had not been doing step #4, the second reboot. So I followed all the steps, and after I rebooted the second time, AMD VISION Engine Control Center launched. Now it was just a matter of figuring out where the scaling option was hidden. After some searching I found it under the My Digital Flat-Panels menu, then Scaling Options (Digital Flat-Panel). I moved the slider to 0% and had a full screen again.
One last test: making sure I could have the Vista/Linux system connected to the VGA input on my main monitor while the main computer stayed connected via the HDMI input. That worked perfectly. Such a simple solution to the initial problem, but what a pain it was to get there.