
Video Card ( GPU ) Definition

A video card is a computer component that converts the data produced by the processor into symbols, images, and representations the end user can understand; in this sense, the video card is the device that renders the system's final output for the user. Video cards are also known as graphics cards or video accelerator cards. They can be integrated into the motherboard (as a GPU, "Graphics Processing Unit") or be a separate peripheral that gives the machine better or more specialized performance.

The first video cards appeared in early equipment that tuned television signals: they processed data received through radio antennas and displayed it on a monitor. Nowadays, computers are the main market for video cards, which offer higher resolutions and the ability to drive multiple screens from a single machine. Video cards have evolved in parallel with the image quality captured by cameras.

The arrival of HD ("High Definition") technology forced video cards to incorporate higher-bandwidth interface connectors such as HDMI ("High-Definition Multimedia Interface") and to offer much greater processing capacity, which in turn raised the bar for other essential components such as a computer's RAM, processor, and storage capacity.

Other devices that have made revolutionary use of video cards are video game consoles and smartphones. Home consoles carry much more sophisticated graphics hardware, capable of detecting player movements, handling controller input effectively, and supporting many different kinds of games. In smartphones, the GPU processes the graphics data in these small handheld devices, driving an ambitious telephony market in which higher-resolution screens sell better and deliver the best graphical user experience thanks to the combined power of the processor and the GPU.

It is worth noting that this still holds for consoles today: thanks to their fixed, unalterable hardware design, after a few years developers achieve far better results than their first experiments. This is not possible on a PC, however powerful, since a software company cannot account for every possible combination of its customers' machines. In addition, a computer's architecture has weak points precisely because its parts are interchangeable, the most notable being the distance between the memory, the graphics card, and the main processor.

In the early 1980s, IBM drew on the design of the unforgettable Apple II and popularized the interchangeable video card, although in its case it only offered the ability to display characters on screen. It was an adapter with a modest 4 KB of memory (a current card with 2 GB has over 500,000 times more) and was used with a monochrome monitor. This was the starting point, and improvements were not long in coming.
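A quick sketch shows why about 4 KB was enough for a character-only display. It assumes an 80×25 text mode with two bytes per cell (character code plus attribute byte), as on IBM's early monochrome adapters; that mode is an assumption here, not something the article states:

```python
# Rough sketch: memory needed for a character-only display.
# Assumes an 80x25 text mode with 2 bytes per cell (character + attribute),
# a common layout on early IBM adapters -- not stated in the article.
COLS, ROWS = 80, 25
BYTES_PER_CELL = 2  # one byte for the character code, one for its attributes

buffer_bytes = COLS * ROWS * BYTES_PER_CELL
print(buffer_bytes)  # 4000 bytes -- just under 4 KB

# And the ratio between a modern 2 GB card and that 4 KB adapter:
print((2 * 1024**3) // (4 * 1024))  # 524288 times as much memory
```

The 524,288× figure is simply 2 GB divided by 4 KB; even a single modern texture can dwarf the entire memory of that first adapter.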

Later, IBM standardized the term VGA, which refers to a video card technology capable of a resolution of 640 pixels wide by 480 high, as well as to the monitors that could display those images and the connector needed to use them. After the work of several companies dedicated exclusively to graphics, Super VGA (also known as SVGA) saw the light of day, increasing both the available resolution (to 1024 x 768) and the number of colors that could be displayed simultaneously (from 16 colors at 640 x 480 to 256 at 1024 x 768).
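The memory cost of those modes follows directly from resolution and color count: 16 colors need 4 bits per pixel and 256 colors need 8. A minimal sketch of that arithmetic (the helper function name is illustrative, not from any standard API):

```python
import math

def framebuffer_bytes(width, height, colors):
    """Bytes needed to store one frame at the given resolution and palette size."""
    bits_per_pixel = math.ceil(math.log2(colors))  # 16 colors -> 4 bits, 256 -> 8
    return width * height * bits_per_pixel // 8

print(framebuffer_bytes(640, 480, 16))    # VGA:  153600 bytes (150 KB)
print(framebuffer_bytes(1024, 768, 256))  # SVGA: 786432 bytes (768 KB)
```

Note that the SVGA mode needs roughly five times the frame memory of the VGA mode, which is why the jump also demanded video cards with substantially more onboard RAM.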
