THE EVOLUTION OF VIDEOWALLS
By Bruce Genricks (Managing Member)
Up until the early 1980s, multi-image displays consisted of banks of slide projectors synchronised to a sound track. While this was a very effective and artistic way of displaying multiple images on a single screen, it was expensive, cumbersome and required a lot of maintenance. "Virtual" movement or animation could be achieved using a large number of slides and clever programming, but it was not quite the same as video. However, this was the only practical and affordable way of achieving a big, dynamic image.
In the 1980s video projection was in its infancy and was based on cathode ray tube (CRT) technology. The biggest challenge with video projection at the time was brightness: a typical CRT projector could achieve only about 600 to 800 lumens. This meant that projected images were limited to about 4 metres in width and required a darkened room.
Monitors and television sets also made use of CRT technology and were limited in size to about 70 cm diagonally. In order to overcome these challenges, innovators in the industry started "lacing" projectors or monitors together to achieve bigger images.
The first multi-image video displays
In 1985 the first multi-image video displays consisted of two or more projectors (or monitors) projecting side by side onto the same screen. A typical setup would consist of two videotape recorders synchronised via time code: one for the left-hand projector and one for the right. To shoot the material for this setup, two cameras mounted side by side would be used. Although other methods of splitting the image existed, this was the favoured method as it retained the full resolution of each image. By 1987, videotape recorders had been replaced by video disc players, and by 1996 by DVD players.
The introduction of the videowall processor
In the mid-1980s the first true videowall processors were introduced to the audio-visual world. They consisted of large racks of equipment connected by metres of multicore cables. Because they could only handle standard PAL or NTSC resolutions, they had very limited functionality, typically handling only up to four simultaneous video inputs.
However, they were capable of basic video effects and could freeze an image on each display.
In 1985 purpose-built videowall monitors did not exist. Instead, the first videowall used modified CRT television sets as the displays. The TVs were modified to accept RGB video and fitted in custom designed sheet metal cabinets. The steel cabinets served three main purposes: they allowed the TVs to be stacked; they reduced the image to image gap; and, they provided electro-magnetic shielding.
Some of the early challenges faced by the technicians were matching the colours across the displays and aligning the images. Colour purity across individual screens was a problem, due to the magnetic effect of adjacent monitors.
Needless to say, the whole system was bulky, difficult to set up and very sensitive.
From 1985, after the introduction of the videowall processor in South Africa, events companies started making use of arrays of video projectors as displays. These usually consisted of CRT projectors mounted in custom frames, projecting onto rear projection material. This was followed by purpose-built projection cubes, which utilised mirrors and rigid rear projection Fresnel screens. They were also designed to stack and were specially mechanised to help with image alignment.
In the early years of video walls, between 1985 and 1990, video projection and multi-image slide projection were sometimes mixed. This was especially useful when video projectors were not freely available in large numbers.
The resolution revolution
Early video walls had two main problems: resolution, and wide area flicker.
The resolution of the early videowall was based on the PAL or NTSC standards, giving a maximum of 625 TV lines (PAL) or 525 TV lines (NTSC) for the entire image. This made it almost impossible to see any detail when the image was magnified across multiple displays.
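As a rough, hypothetical illustration (the wall sizes below are examples, not from any specific installation), simple arithmetic shows how little detail each display received when one PAL frame was spread across a wall:

```python
# Rough illustration: source TV lines reaching each display when one
# PAL frame is magnified across a square videowall (sizes hypothetical).
PAL_LINES = 625  # total PAL lines; only ~576 of these carry visible picture

for n in (2, 3, 4):  # 2x2, 3x3 and 4x4 walls
    per_display = PAL_LINES // n
    print(f"{n}x{n} wall: about {per_display} TV lines per display")
```

Each monitor in a 4x4 wall would receive only about 156 of the original lines, which is why fine detail all but disappeared.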
Wide area flicker occurred as a result of the low refresh rate of the image (50 interlaced fields, or 25 full frames, per second in PAL). While this was slightly annoying on standard television sets, it became a real problem when displayed across a large screen, and especially so in a darkened room.
The graphics processor
In 1987 the Video Graphics Array (VGA) was introduced as a computer graphics standard, and in the early 1990s graphic videowall processors followed. This new "high resolution" standard (which at that stage was only 640 x 480 pixels) opened up a whole new market for video walls, or graphic walls as they came to be known. New applications included control rooms and monitoring facilities, which typically displayed graphical representations of processing plants and computer or telephone networks.
One of the advantages of the graphic processor was the ability to display different resolutions and video standards simultaneously on the same wall.
VGA quickly developed into SVGA, XGA and so on. Today videowall processors can display images of up to 4K (four times full HD resolution).
Video projectors rapidly adopted new technologies, including liquid crystal display (LCD) and later digital light processing (DLP) technology. These allowed for brighter and eventually higher resolution images, and were soon incorporated into videowall cubes. Videowall cubes are specially designed "boxes" which house the projector, a mirror and a rear projection screen. The mirror is used to fold the light path, thereby reducing the required depth of the cube. Another important function of the cube is to exclude any extraneous light from reaching the rear of the screen, which improves the contrast of the image. A problem with these new technologies was the cost of ownership, in particular the expense of lamp replacement. In order to retain uniform brightness across all displays, it was necessary to replace all the lamps at the same time. DLP projectors also required colour wheel replacements.
When static, high-contrast images were displayed on LCD projectors for extended periods, they suffered from "image burn" or image retention. Because of this, DLP became the preferred projection technology for videowall cubes.
Projection cubes are still used today, although the preferred illumination source is now the light-emitting diode (LED). The long life of LED light sources has brought the overall cost of ownership down significantly.
Flat panel displays
In 1995 flat panel television sets based on plasma technology were launched, and the AV industry was soon using them as videowall displays. The main problem with using them in video walls was the wide frame, or bezel, surrounding the screen. Later on, "bezel-less" plasmas specifically designed for videowalls were introduced. Even these did not produce a seamless image, and they were fragile, making them more suited to fixed installations than to the rental industry.
In 2003 large LCD flat panels with 46-inch screens were launched. Although they still had large bezels, their size made them viable for videowall applications. This changed in 2006 with the introduction of thin-bezel LCD displays. The next few years saw ever thinner bezels, with the slimmest at about 5 mm image to image. The introduction of LED edge-lit and direct-lit LCD displays has made this the most popular display technology to date.
Today's videowall processor
In the second decade of the 21st century, the most sophisticated videowall processors are card-based and driven by powerful computers. They are custom configured for the number of displays and the number and type of sources. Usually the inputs are hardwired, but they can also be decoded from an IP stream using decoders built into the processor.
Most dedicated videowall displays include scalers and daisy-chain inputs. This allows a video input to be daisy-chained through the monitors, with onboard software displaying only a portion of the image on each. In this way a large image can be displayed across multiple monitors. However, this technology is limited to a single input, with little or no effects.
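The portion-of-image logic such daisy-chained displays perform can be sketched as follows; the function name and parameters are illustrative, not taken from any particular product's firmware:

```python
# Hypothetical sketch of daisy-chain cropping: each monitor knows its
# position in the wall and extracts only its own region of the shared input.
def crop_region(row, col, rows, cols, src_w, src_h):
    """Return (x, y, width, height) of the source rectangle one display shows."""
    w = src_w // cols   # width of each display's slice of the source
    h = src_h // rows   # height of each display's slice of the source
    return (col * w, row * h, w, h)

# Example: the top-right panel of a 2x2 wall fed a 1920x1080 source
print(crop_region(0, 1, 2, 2, 1920, 1080))  # → (960, 0, 960, 540)
```

Because every display computes its region independently from the same input, the chain needs no central processor, which is also why the scheme is limited to a single source.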
A software-based distributed system is also available, but it requires a computer per display, often fitted in an optional slot in the display. This setup allows multiple images to be displayed across the wall.
Videowalls remain an important medium for digital signage and control room applications. They are most commonly found in military, communications, surveillance and advertising settings.
The first videowall in South Africa
I was privileged to be involved in the installation of what I believe to be the first true videowall in South Africa, and one of the first in the world.
In the mid-1980s Electrosonic UK started developing a videowall processing system which was to become known as the Picbloc system. PIC was an abbreviation for programmable image controller. This new product line incorporated a "new generation of large scale integrated circuits".
Johann Kruger, owner of Multivisio, was the first person to invest in videowall technology in South Africa. Upon hearing of this new technology, he travelled to Electrosonic in the UK to see what all the hype was about. After the demonstration of the prototype he was so impressed that, even though the product was still in the development phase, he immediately placed an order for a system for an upcoming product launch.
Lourie Coetzee, who was the owner of Twin Imports and the exclusive distributor of Electrosonic products, arranged the importation and logistics of this equipment. I was employed by Twin Imports as a technician and was responsible for the technical aspects of the project. The equipment arrived in flight cases populated with 2U rack-mount boxes. Each video input required a digitiser, which was housed in a 19" 3U cabinet. Likewise, each video output required a similar box, called a Picbloc. Each video input required a data bus, consisting of a multicore cable linking the digitiser to the first and subsequent Picblocs.
It was a nightmare to set up, with frequent firmware updates: new EPROMs were shipped by courier and had to be physically replaced in each Picbloc. The modified TV sets were sourced locally and were prone to magnetic interference from adjacent sets. Gaffer tape was used to insulate each TV from the next to avoid eddy currents. A lot of tweaking was required to get the monitors displaying uniform colour but, after many late nights, the wall was finally ready for the product launch. It was a great success, and Multivisio went on to do some of the most memorable product launches in South Africa to date, often using videowall technology.