From Razor Blade to Desktop - A History of Video Editing
Before editing software was developed, and even before there were any edit suite controllers, video tape was edited by hand: people physically sliced it with very sharp razor blades.
This was a process known as kamikaze editing. Early editors also used a microscope, a cutting block, magnetic developing fluid and degaussed (demagnetised) razor blades. For a clean edit, the tape had to be sliced at the vertical interval between video frames. This was found by painting the surface with a special developing fluid, which Ampex called Edivue. The fluid dyed the tape, exposing the magnetic scan lines to the naked eye.
The first programme edited using this method was Playhouse 90, aired on CBS in 1958. Rowan & Martin's Laugh-In also used the process extensively, until NBC added a kinescope into the mix and produced The Fred Astaire Special. This was essentially an off-line editing technique.
Throughout the 1960s, tape continued to be physically sliced and spliced, like film, until the early 1970s, when timecode and computerized editing became common.
Linear Video Editing
With the advent of computer-controlled edit suite controllers, the process became much less antiquated - and much less risky. These machines could orchestrate edits based on an Edit Decision List (EDL), which the editor programmed using in and out points.
Taking these points, the controller could time the tape rollback precisely and record the signal from the raw footage tape onto the edit master. This was achieved by the use of timecode, which was either recorded on one of the audio tracks or embedded within the video track.
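The idea of an EDL event can be sketched in a few lines of code. The sketch below is illustrative only - it assumes a PAL frame rate of 25 fps and a simplified event layout, not the format of any real controller.

```python
# A minimal sketch of how an EDL pairs in and out points expressed as
# SMPTE-style HH:MM:SS:FF timecode. The 25 fps (PAL) frame rate and the
# event fields are illustrative assumptions, not a real controller format.

FPS = 25  # assumed PAL frame rate

def tc_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert an HH:MM:SS:FF timecode string to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_tc(frames: int, fps: int = FPS) -> str:
    """Convert a frame count back to HH:MM:SS:FF."""
    ff = frames % fps
    ss = (frames // fps) % 60
    mm = (frames // (fps * 60)) % 60
    hh = frames // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# One EDL event: copy this span of the source tape onto the edit master.
event = {"source_in": "01:00:10:00", "source_out": "01:00:14:12"}

duration = tc_to_frames(event["source_out"]) - tc_to_frames(event["source_in"])
print(duration)                # 112 frames
print(frames_to_tc(duration))  # 00:00:04:12
```

Given the in and out points, the controller knows exactly how many frames to record, which is what made frame-accurate assembly possible.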
Sony and Ampex manufactured these suites, as did CMX, and their widespread use helped fuel the music video boom of the 1980s.
The downside of this technique was the degradation of the image. Editors were essentially making a second-generation copy of the raw footage, which on analogue video tape resulted in lower quality.
Non Linear Video Editing
To overcome this loss of picture quality, a way was found to edit material without modifying the source. This meant edits were made "off-line," then the EDL was used to make the programme "on-line." But as hardware and software developed further, it soon became possible to render the final product digitally, then output onto any desired medium. Any drop in video quality has now become so negligible as to be imperceptible.
The CMX 600, produced by CBS and Memorex in the 1970s, was the first truly non-linear editing machine. It had disk pack drives the size of washing machines, and it recorded and played back in black and white.
Video was stored digitally on mainframe computers, and the CMX 600 had a console with two built-in monitors. One monitor was used to preview the video, and using a light pen, the editor could make edit decisions using superimposed menus. The other monitor displayed the edited video. Once the EDL was produced, an on-line editor could then make the physical edit master tape.
This process continued through the 1980s, being refined to control multiple video cassette recorders and even Laser Disc machines, until computing power progressed enough to enable true non-linear editing.
In 1985, Quantel introduced Harry, an effects compositing system, but Harry also had some non-linear capabilities - it could record up to 80 seconds of broadcast-quality, uncompressed video.
Non-linear editing using computers was pioneered by Avid, which remains a market leader in the field. The Media Composer system, launched in 1988, was built on the Apple Macintosh II. Avid installed its editing software onto these machines, making Media Composer the first system to introduce now familiar concepts such as timeline editing and clip bins.
But non-linear editing continued to be used as an off-line option, because the digital output of Media Composer was only about VHS resolution. The video was encoded in the M-JPEG format, and compressing and decompressing it demanded processing speeds that software alone could not supply, so dedicated hardware was needed. The data rate of digital video on these systems, and the limitations of removable storage at the time, meant that work had to be stored on fixed hard discs.
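A back-of-the-envelope calculation shows why data rate was such a constraint. The figures below are illustrative general-purpose numbers (PAL standard definition with 8-bit 4:2:2 sampling), not a specification of Media Composer's actual storage format.

```python
# Rough data rate for uncompressed standard-definition video.
# Assumptions (illustrative, not Media Composer's actual format):
# PAL 720x576 frame, 8-bit 4:2:2 sampling (2 bytes per pixel), 25 fps.
width, height = 720, 576
bytes_per_pixel = 2   # 4:2:2: one luma sample plus half of each chroma sample
fps = 25

bytes_per_second = width * height * bytes_per_pixel * fps
print(bytes_per_second / 1e6)  # about 20.7 MB every second, uncompressed
```

At roughly 20 MB per second uncompressed, even heavily compressed M-JPEG streams quickly outgrew the removable media of the late 1980s, which is why fixed hard discs were the only practical option.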
The next step was taken by Eidos in 1990. Early versions of its Optima suite used new compression software that allowed lower bitrates, making it possible for software to decode the video without the need for more expensive hardware.
Another huge development came in the late 1990s, when DV tape formats became available. Because the video source was recorded digitally rather than in analogue, it could be transferred onto the computer without conversion or further generational loss. With the new FireWire connections, the transfer became simpler still. True desktop editing became possible, with the ability to output the edited video at broadcast quality.
There are currently three major providers of broadcast non-linear editing software - Avid, Apple (with Final Cut) and Adobe. These companies also publish software for the home market, alongside other developers such as Pinnacle.