A computer’s graphics card, also called a video card or GPU, contains a specialized processor and dedicated memory designed to render complex images quickly. The most common use for high-end graphics cards is computer gaming, although even baseline computers rely on a graphics card for normal display functions. For a graphics card to function, the system must have the appropriate device drivers installed.
1. Determine the make and model of your video card. Open the Start menu, open the Control Panel, then open the Device Manager and expand “Display Adapters.” Your video card should be listed there. Alternatively, open the Start menu, click “Run,” and type “dxdiag” into the box. A diagnostic window should appear with a tab labeled “Display” that lists information about your video card.
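If you prefer the command line, the same information can be queried with Windows’ WMIC tool. The sketch below is a minimal Python wrapper around `wmic`, assuming a Windows system where the tool is on the PATH; on other platforms it simply returns an empty list. The function name `get_video_cards` is illustrative, not part of any standard API.

```python
import platform
import subprocess

def get_video_cards():
    """Return the names of installed video cards, or [] on non-Windows systems."""
    if platform.system() != "Windows":
        # wmic is Windows-only; there is nothing to query elsewhere.
        return []
    result = subprocess.run(
        ["wmic", "path", "win32_videocontroller", "get", "name"],
        capture_output=True, text=True, check=True,
    )
    # The first line of wmic output is the column header ("Name").
    return [line.strip() for line in result.stdout.splitlines()[1:] if line.strip()]

if __name__ == "__main__":
    for card in get_video_cards():
        print(card)
```

Relatedly, `dxdiag /t report.txt` dumps the same Display information to a text file without opening the dialog.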
2. Attempt to update automatically. Right-click your video card in the Device Manager, select “Properties,” and open the “Driver” tab, where there is an option to update the driver. This lets Windows search the Internet for officially released, signed drivers. It is a quick and easy way to install drivers, but it often will not find the latest drivers available directly from your video card manufacturer.
3. Go to the website of your video card manufacturer and search for drivers. Nvidia and AMD (which acquired ATI) are the two main graphics card manufacturers, and each offers regularly updated drivers and driver-search tools. Enter information about your card as prompted and download the latest drivers designed for it. You can also go to the website of your computer manufacturer, such as Dell, HP or Gateway, and download graphics drivers for your specific model.
4. Install the downloaded drivers by running the bundled installer and following its prompts.
5. Restart your computer. Your drivers will not take effect until after you restart.
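After restarting, you can confirm the new driver took effect by checking the driver version Windows now reports. Below is a minimal sketch in the same style as above, again assuming Windows with `wmic` available; `get_driver_versions` is a hypothetical helper name. It parses WMIC’s CSV output, whose columns for this query are Node, DriverVersion, Name.

```python
import platform
import subprocess

def get_driver_versions():
    """Return {card name: driver version}; an empty dict on non-Windows systems."""
    if platform.system() != "Windows":
        return {}
    result = subprocess.run(
        ["wmic", "path", "win32_videocontroller",
         "get", "name,driverversion", "/format:csv"],
        capture_output=True, text=True, check=True,
    )
    versions = {}
    for line in result.stdout.splitlines():
        parts = line.strip().split(",")
        # Skip the header row and blank lines; columns are Node,DriverVersion,Name.
        if len(parts) == 3 and parts[1] != "DriverVersion":
            versions[parts[2]] = parts[1]
    return versions
```

Comparing the reported version before and after the update is a simple way to verify the installation succeeded.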