There was another technique for editing video coming into play: using two video decks, you could transfer the video from one deck to the other and build up a show by assembling a series of different cuts one after another in a linear fashion - this was linear editing. But again, it sounds simpler than it was - for one, how do you make sure the signals from the two video decks matched up? Before linear editing became a popular solution, a few technological advancements were required.

The first came in 1963, when Ampex introduced the Editec - an electronic videotape recorder with a simple controller that could perform edits at in and out points marked by audible cue tones.

Helical Scan Tape

Helical scan systems, which were coming into wider use, wrapped the tape around a spinning head drum. This fit more bandwidth onto the tape, which allowed you to pause and view individual frames - making editing much easier.

And in 1967 came SMPTE timecode, developed by EECO and the Society of Motion Picture and Television Engineers. It gave a videotape player a way to locate any frame on the tape, since each frame was assigned an "address" in terms of hours, minutes, seconds, and frames...
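The addressing scheme is simple enough to sketch in a few lines of code. This is an illustrative example, not production timecode math: it assumes a 30 fps, non-drop-frame rate, while real SMPTE timecode also covers 24 and 25 fps as well as drop-frame variants.

```python
# Sketch of SMPTE-style timecode addressing (assumes 30 fps, non-drop-frame).

FPS = 30

def frame_to_timecode(frame: int, fps: int = FPS) -> str:
    """Convert an absolute frame count into an HH:MM:SS:FF address."""
    ff = frame % fps
    total_seconds = frame // fps
    ss = total_seconds % 60
    mm = (total_seconds // 60) % 60
    hh = total_seconds // 3600
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

def timecode_to_frame(tc: str, fps: int = FPS) -> int:
    """Convert an HH:MM:SS:FF address back into an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return (hh * 3600 + mm * 60 + ss) * fps + ff

print(frame_to_timecode(108000))  # exactly one hour of tape at 30 fps
```

Because every frame has a unique, computable address, a tape machine can seek to any edit point exactly - the property that made frame-accurate linear editing possible.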

The practice of timecode-accurate linear editing became commonplace in the 1970s. We went from TV networks spending thousands of dollars per hour on kinescope film and development fees to smaller-market TV affiliates having their own video editing systems to cut their own shows.

But linear editing did nothing to advance the craft creatively. Editing became an almost strictly technical profession - managing large EDLs, or edit decision lists, which marked the in and out points of the clips to be used. And because a show was assembled in a linear fashion, any change to the beginning of a show meant everything after it had to be reassembled - so there was no such thing as a rough cut.
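To make the idea of an EDL concrete, here is a sketch of what a single EDL "event" carries: which source reel a clip comes from, its in and out timecodes on that reel, and where it lands on the record (master) tape. The formatting below is loosely modeled on the classic CMX-style text layout; the field widths and reel name are illustrative assumptions, not an exact reproduction of any one system's output.

```python
# Illustrative sketch of one edit-decision-list event, loosely in the
# style of classic CMX text EDLs. Field widths are approximate.

def edl_event(num: int, reel: str, track: str,
              src_in: str, src_out: str,
              rec_in: str, rec_out: str) -> str:
    """Format one event: number, reel, track, cut ('C'), then the four
    timecodes - source in/out on the reel, record in/out on the master."""
    return (f"{num:03d}  {reel:<8s} {track:<4s} C        "
            f"{src_in} {src_out} {rec_in} {rec_out}")

# A hypothetical first event: an 8.5-second clip from reel TAPE01,
# placed at the start of the master (hour 01 on the record tape).
line = edl_event(1, "TAPE01", "V",
                 "00:05:10:00", "00:05:18:15",
                 "01:00:00:00", "01:00:08:15")
print(line)
```

The editor's job in the linear era was largely maintaining long lists of events like this one - and any change near the top rippled through every record timecode below it.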

Non-Linear Editing 

An alternative was starting to emerge - almost a rejection of the strict timecode rules of linear editing and a return to the freedom of cutting actual film - a system that would eventually be called non-linear editing. Non-linear editing was nondestructive: you could assemble a cut in whatever order you wanted, then go back and make changes without disturbing the rest of the assembly. And there was none of the generation loss of linear editing, which required copying from one tape to another. It was a much more natural way of editing.

The first NLE was the CMX 600 in 1971. It was a beast of a machine that recorded half-resolution black-and-white video onto washing-machine-sized disk packs and cost a little over $250,000 in 1971 dollars - about $1.2 million in 2013 money. Only six were produced.

EditDroid

But the idea of editing nonlinearly was too good to go to waste. Through the '80s, it was really a matter of waiting for computational power and storage capacity to catch up. There were a few experiments during this time - notably the EditDroid, which debuted at NAB in 1984 from a George Lucas spin-off company, DroidWorks. This computer pulled footage stored on LaserDiscs, which really didn't work very well, and the company shut down in 1987. Other machines tried using a bank of VCRs, but they were also slow and cumbersome.

Then in 1988, EMC2 introduced the first all-digital offline non-linear editor, with data stored on optical disks.

This was followed a year later, in 1989, by the public release of the Avid/1 - a Macintosh-based non-linear editor. Avid would go on to become the gold standard for editing in Hollywood.

Storage was still an issue, and these machines could only edit short music videos and commercials. But in 1993, engineers added more storage to an Avid system, debuting a 7-terabyte setup capable of handling a feature-length film. These films were being cut "offline" - the reverse of NBC's EGS method: editors used low-quality tape to create a working cut, then used the timecode to create an EDL, which was given to the film lab to assemble the original film prints.

The first studio film cut on an Avid was Lost in Yonkers in 1993, and just a few years later, in 1996, editor Walter Murch would accept the Oscar for Best Editing for The English Patient, which he cut offline on an Avid.

From clumsy kinescope film to bulky magnetic tape and finally to the first NLE systems - which proved to be an important creative tool just a few years after their release - the path to modern editing in this period was dominated by some amazing achievements by electronics engineers. But the story is only half finished: as it enters the last part of the 20th century, computer scientists, programmers, and mathematicians would pioneer the revolution that joined film and television as more or less interchangeable visual media - a medium now accessible to nearly everyone.