Interlace is a technique for improving the picture quality of a video signal, primarily on CRT devices, without consuming extra bandwidth. Interlacing causes problems on certain display devices such as LCDs. It was invented by RCA engineer Randall C. Ballard in 1932 and first demonstrated in 1934, as cathode ray tube screens became brighter, increasing the level of flicker caused by progressive (sequential) scanning. It was ubiquitous in television until the 1970s, when the needs of computer monitors resulted in the reintroduction of progressive scan. Interlace is still used for most standard-definition TVs and for the 1080i HDTV broadcast standard, but not for LCD, micromirror (DLP), or plasma displays; these displays do not use a raster scan to create an image and so cannot benefit from interlacing: in practice, they have to be driven with a progressive scan signal.
The deinterlacing circuitry needed to get progressive scan from a normal interlaced broadcast television signal can add to the cost of a television set using such displays. Currently, progressive displays dominate the HDTV market.
Interlaced scan refers to one of two common methods for "painting" a video image on an electronic display screen (the other being progressive scan) by scanning or displaying each line or row of pixels. This technique uses two fields to create a frame. One field contains all the odd lines in the image; the other contains all the even lines. A PAL-based television display, for example, scans 50 fields every second (25 odd and 25 even). The two sets of 25 fields work together to create a full frame every 1/25th of a second, resulting in a display rate of 25 frames per second.
With progressive scan, an image is captured, transmitted and displayed in a path similar to text on a page: line by line, from top to bottom.
The interlaced scan pattern in a CRT (cathode ray tube) display completes such a scan too, but only for every second line. This is carried out from the top left corner to the bottom right corner of the display. The process is then repeated, this time starting at the second row, to fill in the gaps left by the first pass over alternate rows.
Such a scan of every second line is called interlacing. A field is an image that contains only half of the lines needed to make a complete picture. The afterglow of the CRT's phosphor, in combination with the persistence of vision, results in the two fields being perceived as a continuous image. This allows the viewing of full horizontal detail with half the bandwidth that a full progressive scan would require, while maintaining the CRT refresh rate necessary to prevent flicker.
Only CRTs can display interlaced video directly – other display technologies require some form of deinterlacing.
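The relationship between fields and frames described above can be sketched in a few lines of Python (an illustrative model only, treating a frame as a list of scanlines; the function names are not from any standard):

```python
# Minimal sketch: splitting a progressive frame into its two interlaced
# fields, and weaving them back together. A frame is modeled as a list
# of scanlines.

def split_fields(frame):
    """Return (top_field, bottom_field): the even- and odd-indexed lines."""
    top = frame[0::2]     # lines 0, 2, 4, ... (the "odd" lines, 1-based)
    bottom = frame[1::2]  # lines 1, 3, 5, ...
    return top, bottom

def weave(top, bottom):
    """Reassemble a full frame from its two fields."""
    frame = [None] * (len(top) + len(bottom))
    frame[0::2] = top
    frame[1::2] = bottom
    return frame

frame = [f"line {n}" for n in range(6)]
top, bottom = split_fields(frame)
assert weave(top, bottom) == frame  # lossless round trip for a static image
```

For a static image the round trip is lossless; the artifacts discussed later arise because, in a real camera, the two fields are captured at different moments in time.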
When motion picture film was developed, it was observed that the movie screen had to be illuminated at a high rate to prevent visible flicker. The exact rate necessary varies by brightness, with 40 Hz being acceptable in dimly lit rooms, while up to 80 Hz may be necessary for bright displays that extend into peripheral vision. The film solution was to project each frame three times using a three-bladed shutter: a movie shot at 16 frames per second would thus illuminate the screen 48 times per second. Later, when sound film became available, the higher projection speed of 24 frames per second enabled a two-bladed shutter to maintain the same 48 illuminations per second, but only in projectors incapable of projecting at the lower speed.
But this solution could not be used for television: storing a full video frame and scanning it twice would require a frame buffer, a method that did not become feasible until the late 1980s. In addition, avoiding on-screen interference patterns caused by studio lighting and the limits of vacuum tube technology required that CRTs for TV be scanned at AC line frequency (60 Hz in the US, 50 Hz in Europe). In 1936, when the analog standards were being set in the UK, CRTs could only scan around 200 lines in 1/50th of a second. By using interlace, a pair of 202.5-line fields could be superimposed to become a sharper 405-line frame. The vertical scan frequency remained 50 Hz, so flicker was not a problem, but visible detail was noticeably improved. As a result, this system was able to supplant John Logie Baird's 240-line mechanical progressive scan system that was also in use at the time.
From the 1940s onward, improvements in technology allowed the US and the rest of Europe to adopt systems using progressively more bandwidth to scan higher line counts and achieve better pictures. However, the fundamentals of interlaced scanning were at the heart of all of these systems. The US adopted the 525-line system known as NTSC; Europe adopted a 625-line system, and the UK switched from its 405-line system to 625 in order to avoid having to develop a unique method of color TV. France switched from its unique 819-line system to the more common European standard of 625. Although the term PAL is often used to describe the line and frame standard of the TV system, this is in fact incorrect: it refers only to the method of superimposing the colour information on the standard 625-line broadcast. The French adopted their own SECAM system, which was also adopted by some other countries, notably Russia and its satellites. PAL colour encoding has been used on some otherwise NTSC-like 525-line broadcasts, notably in Brazil.
Interlacing is used by all the analogue TV broadcast systems in current use: NTSC (59.94 fields per second, 525 lines), PAL (50 fields per second, 625 lines) and SECAM (50 fields per second, 625 lines).
With any video system there are trade-offs. One of the most important factors is bandwidth, measured in megahertz (for analog video), or bit rate (for digital video). The greater the bandwidth, the more expensive and complex the entire system (camera, storage systems such as tape recorders or hard disks, transmission systems such as cable television systems, and displays such as television monitors).
Interlaced video reduces the signal bandwidth by a factor of two, for a given line count and refresh rate.
Alternatively, a given bandwidth can be used to provide an interlaced video signal with twice the display refresh rate for a given line count (versus progressive scan video). A higher refresh rate reduces flicker on CRT monitors. The higher refresh rate improves the portrayal of motion, because objects in motion are captured and their position is updated on the display more often. The human visual system averages the rapidly displayed still pictures into a moving picture image, and so interlace artifacts aren't usually objectionable when viewed at the intended field rate, on an interlaced video display.
For a given bandwidth and refresh rate, interlaced video can be used to provide a higher spatial resolution than progressive scan. For instance, 1920x1080 pixel resolution interlaced HDTV with a 60 Hz field rate (known as 1080i60) has a similar bandwidth to 1280x720 pixel progressive scan HDTV with a 60 Hz frame rate (720p60), but approximately 50% more spatial resolution.
Note that this assumes an analog or uncompressed digital video signal. With digital video compression, as used in all current digital TV standards, interlacing introduces some additional inefficiencies over fully progressive video, and so the bandwidth savings are significantly less than half.
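The bandwidth arithmetic above can be checked with a quick sketch (a hypothetical helper, counting uncompressed pixel rates only; real broadcast bandwidth also depends on blanking intervals and compression):

```python
# Back-of-the-envelope comparison of 1080i60 and 720p60 pixel rates.

def pixel_rate(width, height, rate, interlaced):
    """Pixels transmitted per second. For interlaced video, 'rate'
    counts fields, each carrying half the frame's lines."""
    lines_per_pass = height // 2 if interlaced else height
    return width * lines_per_pass * rate

r_1080i60 = pixel_rate(1920, 1080, 60, interlaced=True)   # 62,208,000
r_720p60 = pixel_rate(1280, 720, 60, interlaced=False)    # 55,296,000

print(r_1080i60, r_720p60)        # similar uncompressed pixel rates
print(1080 / 720, 1920 / 1280)    # 1.5, 1.5: ~50% more resolution per axis
```

This also shows what "approximately 50% more spatial resolution" means here: 50% more lines and 50% more pixels per line, at a comparable pixel rate.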
Interlaced video is designed to be captured, transmitted or stored and displayed in the same interlaced format. Because each frame of interlaced video is composed of two fields that are captured at different moments in time, interlaced video frames will exhibit motion artifacts if the recorded objects are moving fast enough to be in different positions when each individual field is captured. These artifacts may be more visible when interlaced video is displayed at a slower speed than it was captured or when still frames are presented.
Because modern computer video displays are progressive scan systems, interlaced video will have visible artifacts when displayed on them. Computer systems are frequently used to edit video, and this disparity between computer display systems and television signal formats means that the video content being edited cannot be viewed properly unless separate video display hardware is used.
To minimize the artifacts caused by displaying interlaced video on a progressive scan monitor, a process called deinterlacing can be used. This process is not perfect, and it generally results in a lower resolution, particularly in areas with objects in motion. Deinterlacing systems are integrated into progressive scan television displays in order to provide the best possible picture quality for interlaced video signals.
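As a rough illustration of what deinterlacers do, here is a minimal sketch of two classic strategies, "bob" (line doubling) and "weave" (field interleaving); real deinterlacers are motion-adaptive and far more elaborate, and these function names are illustrative only:

```python
# A field is modeled as a list of scanlines.

def bob(field):
    """'Bob' deinterlacing: line-double a single field into a full frame.
    Each missing line is approximated by repeating the line above, which
    halves the vertical resolution but avoids motion artifacts."""
    frame = []
    for line in field:
        frame.append(line)
        frame.append(line)  # duplicate stands in for the missing line
    return frame

def weave(top_field, bottom_field):
    """'Weave' deinterlacing: interleave two fields into one frame.
    Full resolution for static scenes, but moving objects 'comb'
    because the fields were captured at different instants."""
    frame = []
    for t, b in zip(top_field, bottom_field):
        frame += [t, b]
    return frame

top = ["A0", "A2", "A4"]
bottom = ["B1", "B3", "B5"]
print(bob(top))            # ['A0', 'A0', 'A2', 'A2', 'A4', 'A4']
print(weave(top, bottom))  # ['A0', 'B1', 'A2', 'B3', 'A4', 'B5']
```

The trade-off noted in the text is visible even in this toy model: bob sacrifices resolution, weave sacrifices motion fidelity, and practical deinterlacers switch between the two per region.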
Interlace introduces a potential problem called interline twitter. This aliasing effect only shows up under certain circumstances, when the subject being shot contains vertical detail that approaches the vertical resolution of the video format. For instance, a person on television wearing a shirt with fine dark and light stripes may appear on a video monitor as if the stripes on the shirt are "twittering". Television professionals are trained to avoid wearing clothing with fine striped patterns for this reason. High-end video cameras or computer-generated imagery systems apply a low-pass filter to the vertical resolution of the signal in order to prevent possible problems with interline twitter.
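The vertical low-pass filtering mentioned above can be sketched as follows (a simple [0.25, 0.5, 0.25] kernel applied down each column; real cameras use more elaborate optical or digital filters, and this helper is purely illustrative):

```python
# Averaging each line with its neighbours removes one-line detail
# that would otherwise twitter between fields.

def vertical_lowpass(frame):
    """frame: list of rows, each a list of pixel intensities."""
    out = []
    for i, row in enumerate(frame):
        above = frame[max(i - 1, 0)]              # clamp at top edge
        below = frame[min(i + 1, len(frame) - 1)]  # clamp at bottom edge
        out.append([0.25 * a + 0.5 * r + 0.25 * b
                    for a, r, b in zip(above, row, below)])
    return out

stripes = [[255], [0], [255], [0]]  # one-line stripes: worst-case twitter
print(vertical_lowpass(stripes))
# -> [[191.25], [127.5], [127.5], [63.75]]
```

The single-line stripes are smeared toward a uniform grey: exactly the "softening (or anti-aliasing) at the cost of resolution" described in the caption below.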
This animation demonstrates the interline twitter effect. The two interlaced images use half the bandwidth of the progressive one. The interlaced scan (second from left) precisely duplicates the pixels of the progressive image (far left), but interlace causes details to twitter. Real interlaced video blurs such details to prevent twitter, as seen in the third image from the left, but such softening (or anti-aliasing) comes at the cost of resolution. A line doubler could never restore the third image to the full resolution of the progressive image. Note – Because the frame rate has been slowed down, you will notice additional flicker in simulated interlaced portions of this image.
Interline twitter is the primary reason that interlacing is unacceptable for a computer display. Each scanline on a high-resolution computer monitor is typically used to display discrete pixels that do not span the scanlines above or below. When the overall interlaced framerate is 30 frames per second, a pixel that spans only one scanline is visible for 1/30th of a second followed by 1/30th of a second of darkness, reducing the per-line/per-pixel framerate to 15 frames per second.
To avoid this problem, sharp detail is typically never displayed on standard interlaced television. When computer graphics are shown on a standard television, the screen is treated as if it were half its actual resolution, or even lower. If text is displayed, it is made large enough that horizontal strokes are never just one scanline wide. Most fonts used in television programming have wide, fat strokes and do not include fine-detail serifs that would make the twittering more visible.
Despite arguments against it, and calls by prominent technology companies such as Microsoft to leave interlacing to history, interlacing continues to be supported by television standard-setting organizations and is still included in new digital video transmission formats such as DV, DVB (including its HD modifications), and ATSC.
In the 1970s, computers and home video game systems began using TV sets as display devices. At that point, a 480-line NTSC signal was well beyond the graphics abilities of low-cost computers, so these systems used a simplified video signal that caused each video field to scan directly on top of the previous one, rather than offset between the lines of the previous field. This marked the return of progressive scanning, not seen since the 1920s. Since each field became a complete frame on its own, modern terminology would call this 240p on NTSC sets and 288p on PAL. While consumer devices were permitted to create such signals, broadcast regulations prohibited TV stations from transmitting video like this. Computer monitor standards such as CGA were further simplifications of NTSC, which improved picture quality by omitting colour modulation and allowing a more direct connection between the computer's graphics system and the CRT.
By the mid-1980s, computers had outgrown these video systems and needed better displays. The Apple IIgs suffered from the use of the old scanning method: its highest display resolution was 640x200, resulting in a severely distorted, tall and narrow pixel shape that made displaying realistically proportioned images difficult. Solutions from various companies varied widely. Because PC monitor signals did not need to be broadcast, they could consume far more than the 6, 7 and 8 MHz of bandwidth to which NTSC and PAL signals were confined. Apple built a custom 342p display into the Macintosh, and EGA for IBM-compatible PCs was 350p. The Commodore Amiga instead created a true interlaced NTSC signal (as well as RGB variations). This ability resulted in the Amiga dominating the video production field until the mid-1990s, but the interlaced display mode caused flicker problems for more traditional PC applications where single-pixel detail is required.
1987 saw the introduction of VGA, on which PCs soon standardized; Apple only followed suit some years later with the Mac, when the VGA standard was improved to match Apple's proprietary 24-bit colour video standard, also introduced in 1987.
In the early 1990s, monitor and graphics card manufacturers introduced newer high resolution standards that once again included interlace. These monitors ran at very high refresh rates, intending that this would alleviate flicker problems. Such monitors proved very unpopular. While flicker was not obvious on them at first, eyestrain and lack of focus nevertheless became a serious problem. The industry quickly abandoned this practice, and for the rest of the decade all monitors included the assurance that their stated resolutions were "non-interlace". This experience is why the PC industry today remains against interlace in HDTV, and lobbied for the 720p standard.