

Audio Collaboration in the Network Age

In the two decades since “PCM” entered the pro audio vocabulary, digitized sound has become integral to nearly every segment of the industry. Initially, the emphasis was on the theoretical quality advantages of digital, a promise that wasn’t always realized. As production became increasingly computerized, however, sound became data to be manipulated as freely as the computational power of a digital audio workstation would allow. When the speed of computer processors exploded, traditional approaches to production workflow were supplemented — and frequently supplanted — by computer-centric techniques. Today, you might still make a case for analog based on its sound quality, but if your business survival depends on how much an engineer can do in a day, then it’s hard to compete with the DAW.

With digitally enabled advances in individual productivity now taken for granted, the frontier in recent years has shifted to group productivity. Particularly in sound-for-picture (film and video), it takes a team effort to move a project to the finish line. And, because time is money, there’s a huge incentive to make that effort flow as efficiently as possible. But the number of people and facilities involved in a typical project — plus the incompatibilities between computer platforms, storage media and file formats — make for some pretty significant hurdles.

Everyone agrees that networks are crucial to allowing greater collaboration and more efficient use of resources in multiroom facilities. In the past, questions about reliability and speed have slowed down adoption in the audio industry, but the widespread deployment of networks throughout the economy and the resulting improvements in technology have removed such obstacles. Today, the real question for owners of most multiroom facilities isn’t whether a network is needed but what kind of network best suits their needs.


Historically, there have been two main alternatives to networks for moving digital audio around a facility. One is digital tie-lines; the other is “sneaker-net,” the physical transport of removable drives from system to system.

“Comparing sneaker-net to a network is like comparing carrier pigeons to a phone system,” says Doug Perkins, VP of sales and marketing at mSoft in Woodland Hills, Calif. “There may be scenarios where the pigeons are better, but it’s hard to think of them.” mSoft makes the ServerSound system, which gives multiple workstations access to centrally stored sound libraries through a browser-based interface.

“With sneaker-net,” Perkins continues, “someone typically asks for audio files to be copied onto some sort of physical media, which is then copied to another media, with who knows how many people touching it throughout the process. Not only is this not a good use of time for many creative and highly skilled people, but the quality of work ultimately suffers from the delivery delays.”

Ed Bacorn, Storage Area Network (SAN) specialist at Glyph Technologies in Ithaca, N.Y., adds that there is an increased risk of damage to a drive — and the data it holds — whenever it is removed for transport to another room or station. “All too many drives are dropped or get bent pins,” he says. “Any number of common disasters can happen when drives are physically moved around to several locations.”

Another problem with sneaker-net, according to Joe Rorke, VP of sales at Rorke Data in Minneapolis, is the issue of interoperability between different systems. “In many cases, the user can’t easily exchange sneaker-net media between OS platforms: Macintosh, Windows NT, etc.,” says Rorke. He also notes that a network can make the bridge between applications in heterogeneous configurations, and it offers better time-to-data speeds than sneaker-net.

mSoft’s CEO, Amnon Sarig, agrees that networks are superior to sneaker-net on almost every level. However, he says that sneaker-net cannot be pronounced dead yet. “With sneaker-net, you can move a 73GB drive from one side of town to the other faster than you can send even a small fraction of that data over a T1 line,” he explains. Within a facility, however, he says that moving files over a network “saves you time, media costs and labor.”
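The math behind Sarig's point is easy to check. A quick sketch, using the 73GB figure from the quote and the standard T1 line rate; the one-hour courier time is an illustrative assumption, not from the article:

```python
# Back-of-the-envelope comparison of sneaker-net vs. a T1 line.
DRIVE_GB = 73
T1_MBPS = 1.544  # T1 line rate in megabits per second

drive_bits = DRIVE_GB * 1e9 * 8           # total bits to move
t1_seconds = drive_bits / (T1_MBPS * 1e6)
t1_days = t1_seconds / 86400

print(f"73 GB over a T1 line: {t1_days:.1f} days")  # roughly 4.4 days
print("73 GB by courier across town: about an hour")
```

At more than four days for a single drive's worth of data, the wide-area pipe is no contest; inside a facility, where Gigabit-class links are available, the balance flips.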

As for digital tie-lines, Bacorn points out that, in most cases, patching must be manually reconfigured in each room for each specific operation. “This requires physically plugging and unplugging cables per task,” he says. “When your facility is on multiple floors or spread out among multiple departments, this becomes a major problem.”

Beyond inconvenience, tie-lines can also be technically unsatisfactory. The Village Recorder, a music and post facility in West Los Angeles, has its four rooms integrated into a single network provided by Glyph. “If we went with digital tie-lines,” says chief engineer Mitch Berger, “we would have to be working in real time. And with the tie-lines, in some cases running long distances, that would have created problems with sync.”


The decision to install a network may be easy enough, but there are a lot of variations on the network theme. Variables include the kind of cabling (copper or fiber), the switches, the type of storage and the network protocol. Perhaps most important, however, is the overall system architecture.

Bacorn says the most common configuration in general use is the Local Area Network (LAN). “This is typically Ethernet-based,” he explains, “using either 10BaseT, 100BaseT or Gigabit Ethernet. Ethernet is relatively inexpensive to install and maintain. But a LAN is designed as an interoffice communication network. Its primary use throughout the world is for e-mail, moving spreadsheets around or accessing the Internet. Though it can be used to move large files, it’s not recommended.”

One problem with Ethernet is that the actual throughput doesn’t measure up to the nominal bandwidth. “Ethernet has very high overhead,” Bacorn explains. “Rather than doing a simple file copy all at once, an Ethernet network must run algorithms, calculate checksums and constantly monitor the transaction of each bit of data. This slows down the transfer enormously. Add in the fact that others are attempting to do the same thing, and you have a bottleneck. All copies slow to a crawl.”
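For perspective, the per-frame protocol overhead can be estimated from standard Ethernet and TCP/IP frame sizes (the figures below are from the specifications, not from the article); the bulk of the real-world slowdown Bacorn describes comes from contention among users sharing the wire rather than from framing alone:

```python
# Rough best-case payload efficiency of TCP/IP over Ethernet,
# using standard frame sizes.
PREAMBLE, HEADER, FCS, GAP = 8, 14, 4, 12  # Ethernet framing bytes
IP_TCP = 20 + 20                           # IPv4 + TCP headers
MTU = 1500                                 # max Ethernet payload

wire_bytes = PREAMBLE + HEADER + MTU + FCS + GAP
efficiency = (MTU - IP_TCP) / wire_bytes
print(f"Best-case payload efficiency: {efficiency:.1%}")  # about 94.9%
```

So even before any congestion, roughly 5 percent of the nominal bandwidth goes to framing and headers, and shared-media collisions and host-side copying eat much deeper into the remainder.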

Another difficulty, Bacorn points out, is that most DAWs will not work directly with a networked file. And all storage accessed through Ethernet is considered remote rather than local. “This means you have to copy a file to your desktop or local storage in order to use it,” Bacorn says. On large projects, allowing different users to download and alter files creates the need for version management, adding another layer of complexity.

For demanding studio applications, the favored alternative to an Ethernet LAN is a Fibre-Channel SAN. “At best, Ethernet is a stop-gap measure before the implementation of a Fibre-Channel network,” says Gary Holladay, chief systems engineer at Studio Network Solutions in St. Louis, Mo. The company markets a solution called A/V SAN, which Holladay says offers 64-track playback and record capabilities from a single SAN-based drive.

“Fibre-Channel offers more bandwidth than any other topology, including FireWire,” Holladay continues, “and it gives you more throughput than any DAW can handle at this point: 200 MB per second in full duplex. If you’re going to spend money on networking your facility, spend it on a technology that already has the bandwidth to sustain your studio for years.”

“High-end Fibre-Channel storage is ideal for audio post,” agrees Rorke, whose company makes the Rorke Data SAN. “Fibre-Channel bandwidth delivers the necessary data rates. It’s not just a matter of the overall sustained MB per second per user, but also the often-sporadic burst-rate requirements of multitrack environments.”

As described by Glyph, which sells a SAN solution called Coba/SAN, a SAN is a shared, high-speed storage network allowing multiple users to access different types of storage devices through secure management software. Hard disk storage is pooled for use by the entire work group, with each workstation accessing the storage as if the drives were local. That means a file created on one workstation is immediately available — depending on access privileges — to everyone else on the network.

In a typical setup, the SAN hardware and storage devices are located in a machine room, isolating production areas from drive noise. In each of the facility’s DAWs (and nonlinear video editing systems), there is a Fibre-Channel host bus adapter (HBA) card. These are linked to the SAN hardware via fiber-optic cable. The SAN hardware is also hooked to SCSI or fiber drives, which may be RAID arrays (Redundant Array of Independent Disks), in which a number of hard disks are linked together as a single volume. The drive volume appears on the desktop as if it were local external storage.

The SAN architecture offers several advantages. “You eliminate the server,” Bacorn says. “Your system is working directly off the storage, and it sees the storage as local, which is required by most audio applications. Also, in a SAN the overhead is put directly on the Fibre-Channel adapter card. All checking and error correction is built into the hardware using FC protocol.”

Berger says The Village’s SAN makes life much easier for multiple users working on the same project. “We have had sessions that used three Pro Tools rigs in separate rooms working on the same movie. Two of them were doing offline editing and capture, and one was being used for mixing. They were able to exchange files and access the arrays very quickly. Another use is that the same client can go from room to room without having to transfer any files at all.”


As far as cabling between the component parts of a network, the options are pretty straightforward. According to Sarig, it boils down to either CAT-5 copper wire carrying 100BaseT or fiber-optic cable carrying one of two protocols: Gigabit Ethernet or Fibre-Channel.

There is a downside to fiber, but it’s financial rather than technical. “The major drawback is that each fiber ‘seat’ costs an arm and a leg,” Sarig says. “It’s much more expensive because of the need to move light. The switch usually has mechanical mirrors, which makes it very expensive — a typical switch can cost $20,000 to $25,000.”

Given the price tag, Perkins believes it’s wise to consider a facility’s requirements before committing to fiber. “For the average facility with two to 10 audio workstations, if the main thing you are trying to do is move around individual sound effects and music tracks, the money can be better spent [elsewhere],” he says. “But if you are dealing with video or trying to share Pro Tools projects, that might require Fibre-Channel.”

Perkins adds that, in addition to file size, the major consideration is how many concurrent users are expected to be moving audio files at a given time. “Even if you have 20 users,” he says, “if more than half of them are Avids that will only rarely be pulling audio, you may not need the strength of network required by a facility with seven Pro Tools and Fairlight users cranking continuously.”

Actually, the question of which approach to take is not strictly an either/or dilemma, because a Fibre-Channel SAN is generally in addition to, rather than in place of, an Ethernet LAN.

“You usually end up with a concurrent network of at least 100BaseT,” Sarig says, “because there is no good solution for TCP/IP traffic over a Fibre-Channel network. Especially in large facilities, it’s very important that you do not let the e-mail and Internet server share the same LAN as your audio server. Also, in the case of fiber drives, there is the need for communication with the RAID resource allocation server, because the drives need to be ‘locked’ for write, meaning that many users can read, but usually only one can write at the same time.”
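The locking policy Sarig describes, many simultaneous readers but at most one writer, is a classic readers-writer lock. A minimal sketch of the idea in Python follows; real SAN management software enforces this at the file-system or volume level, not in application code:

```python
import threading

class ReadersWriterLock:
    """Many concurrent readers, at most one writer.
    Illustrative only -- the policy, not the SAN implementation."""

    def __init__(self):
        self._readers = 0
        self._lock = threading.Lock()      # guards the reader count
        self._write_ok = threading.Lock()  # held while anyone holds the volume

    def acquire_read(self):
        with self._lock:
            self._readers += 1
            if self._readers == 1:
                self._write_ok.acquire()   # first reader blocks writers

    def release_read(self):
        with self._lock:
            self._readers -= 1
            if self._readers == 0:
                self._write_ok.release()   # last reader admits writers

    def acquire_write(self):
        self._write_ok.acquire()           # excludes readers and other writers

    def release_write(self):
        self._write_ok.release()
```

Under this policy, several rooms can pull the same session files at once, but a room writing new material holds the volume exclusively until it finishes.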

Sarig describes the outlines of a “combination network” that he says has worked well many times in the past. “You can have SCSI drives, which are cheap, and a SAN controller with a Fibre-Channel connector, which is fast, and you get a sustained 25 MB per second. You can then decide to use it over a fiber switch, which is expensive, or hook it to a server and distribute over Gigabit Ethernet and 100BaseT, which is really cheap compared to a fiber network.

“It’s all a question of speed vs. dollars,” Sarig continues. “We have found that allocating a fiber RAID over a fiber switch just to push an average 20-second sound effect is not very cost-effective and usually not needed. However, to push 48 channels of full-bandwidth audio plus high-resolution video over 100BaseT is just not a good move.”
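Sarig's 48-channel example is easy to quantify. Assuming 24-bit/48kHz audio (a common "full-bandwidth" format, not specified in the quote):

```python
# Sustained data rate for 48 channels of 24-bit/48kHz audio.
CHANNELS = 48
SAMPLE_RATE = 48_000   # samples per second
BYTES_PER_SAMPLE = 3   # 24-bit

audio_mbps = CHANNELS * SAMPLE_RATE * BYTES_PER_SAMPLE * 8 / 1e6
print(f"48 channels alone: {audio_mbps:.0f} Mbit/s")  # about 55 Mbit/s
```

That is more than half of 100BaseT's nominal 100 Mbit/s before adding high-resolution video or protocol overhead, which is why a shared 100BaseT LAN cannot sustain that load.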


Of course, even the fastest network plumbing can’t guarantee that a system will perform at its fullest potential. “There are several factors that can affect the operation of a network,” Bacorn says. “Many stem from poor network design or running nonsupported applications, such as Beta software or pre-released products.”

Bacorn adds, however, that the most common reason for slow network access is simply the lack of enough drives. “If you think of the network as a car,” he says, “then the drive arrays are the engine. The larger or more powerful the engine, the faster the access time. The number of channels supported can range from 32 tracks to 64 and beyond, if you use the new 15,000 rpm drives.”

In addition to speed, reliability is a key consideration for drives. “We wanted our drives to be arrays,” Berger says, “because we had heard of clients’ drives getting corrupted and files lost, both from poor SCSI cabling and from drives not being unmounted correctly. By switching to an array, we now have complete fault-tolerance — we are covered from any data loss, short of catastrophic failure.”

Berger says The Village’s storage design is completely redundant. “Every component has two power supplies, which have their own UPS backup. The drives are RAID-3 and have dynamic spares. We also back up the whole system daily to tape. We can assure the clients that all their data is both secure and safe at all times.”
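The fault-tolerance Berger relies on comes from parity: in RAID-3, a dedicated parity drive stores the XOR of the data drives, so the contents of any single failed drive can be rebuilt from the survivors. A toy sketch, with short byte strings standing in for real disk stripes:

```python
# How XOR parity survives a single drive failure.
from functools import reduce

def xor_bytes(*stripes):
    """XOR equal-length byte strings position by position."""
    return bytes(reduce(lambda a, b: a ^ b, block) for block in zip(*stripes))

drive_a = b"kick.wav"
drive_b = b"snare.wv"
parity = xor_bytes(drive_a, drive_b)  # written to the parity drive

# Drive B fails; rebuild its contents from drive A and the parity drive.
rebuilt = xor_bytes(drive_a, parity)
print(rebuilt)  # b'snare.wv'
```

Because XOR is its own inverse, the same operation that generates the parity also performs the rebuild, which is what lets an array keep running through a drive failure while a spare is rebuilt in the background.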

The RAID approach also gives The Village some additional benefits. “We no longer have to do any disk allocation,” Berger says. “And when transferring files to a backup set or a client’s drive, the files we need are not spread out over several drives.”

As far as actually installing the network, construction is usually not an issue. “Cabling is a relatively easy installation process,” Bacorn says. “And most new facilities already have fiber optics installed. The storage is all installed in one location, typically the machine room. From that point, with the exception of the software and the HBA cards, the installation is complete.”

Berger adds that the location of the networking gear needs to be well thought-out. “We had to make sure that we had ventilation to maintain an optimal temperature for computer equipment,” he says. “We also had to make sure that we were able to hear any audible alarms. Because a failure would not bring the system down, we would need to know if the system required immediate attention. Using fiber gave us flexibility. We didn’t have to be centrally located in the facility, and we didn’t have to worry about any kind of external interference when installing the cables.”

Despite the many advantages of a Fibre-Channel SAN, Bacorn acknowledges that it’s not the right networking approach for all facilities. “Most could use one,” he says, “but there is a cost involved that some may find beyond their means. If you are looking for a SAN solution, however, don’t just purchase the cheapest. There are a number of SAN vendors in the world that sell on price, not functionality. Like anything that sounds too good to be true, it probably is, and your facility is not a good place to test a SAN.”

Philip De Lancie is Mix’s new technologies editor.