I got fat pipes in my house last month. No, I didn’t redo the plumbing or buy an old church organ: I got high-speed Internet access. It’s something I’ve been considering for a while, because I’m managing about a dozen different Websites these days, and the upload and download time on my 56k connection (which was usually running at more like 40k) was killing me.
The choice of what type of high-speed access to get was, alas, easy. Even though I get about a dozen e-mail and phone pitches each month for DSL, and even though I live only seven miles from downtown Boston and two miles from a large local switch, as it turns out, the switch that services my town is more than three miles away from me. So all of these breathless offers for DSL are worthless. Fortunately, my local cable system was picked up by AT&T in its recent multi-multi-billion dollar acquisition spree, and the company has been upgrading it, slowly but surely. Recently, that process included putting Internet access on the line that runs down my street.
It’s $50 a month (it would be $10 cheaper if I had cable TV, but I’m completely uninterested in wasting any more time in front of a CRT than I already do), and installation was free. Thanks to the previous owner and his six children, cable was already running through the house, so all the technician had to do was put in a new drop from the pole on the sidewalk. He and I then tried to install AT&T’s RoadRunner software, but we couldn’t, because the central office had given me the PC version — although they knew full well that I was using a Mac (it was on the work order). But the documentation wasn’t too bad (although it was wrong in one crucial place, which, fortunately, the technician knew about), and so I managed to configure my TCP/IP settings to recognize the new Jetsons-style modem connected to my Ethernet port and be up and running in less than an hour. (“You’ve done this before,” he said.) Considering all the technological hells I regularly find myself in, I consider myself darn lucky.
When it comes to moving text and, especially, graphics around, the improvement in performance the new service gives me is nothing short of astounding. When I encounter a 1MB PDF file I want to read, I no longer have to plan downloading it around my coffee break — I just push the button, and in a few seconds I’ve got it. I no longer cringe each time I see the icon that tells me a Flash animation is about to load. And what’s really enlightening is to see how huge the differences are among Websites, which were masked by my old, slow connection: Some are as quick as my own desktop, while others are every bit as pokey as they were at 56k.
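The difference is easy to quantify with back-of-envelope arithmetic. A minimal sketch, assuming a 1MB file, my old modem's real-world 40k rate, and a nominal 1.5Mbps cable rate (the cable figure is my assumption, not a number from the service specs):

```python
# Ideal transfer times for a 1 MB file at two connection speeds.
# Rates are assumptions: a 56k modem actually syncing at ~40 kbps,
# versus a cable modem at a nominal 1.5 Mbps.
FILE_BITS = 1_000_000 * 8  # 1 MB expressed in bits

def download_seconds(rate_bps: float) -> float:
    """Best-case transfer time, ignoring protocol overhead."""
    return FILE_BITS / rate_bps

dialup = download_seconds(40_000)      # 200 seconds -- call it a coffee break
cable = download_seconds(1_500_000)    # about 5 seconds

print(f"dial-up: {dialup:.0f} s, cable: {cable:.1f} s")
```

Even ignoring overhead, that's the difference between three-plus minutes and a few seconds per file.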
It’s also allowed me to join the online music revolution. I can actually grope around Napster in real time and check to see whether any of my stuff is on there. (It isn’t.) And I can listen to Internet radio.
To take full advantage of my new pipes, I went out and bought one of those little “multimedia” speaker systems, with the cute subwoofer and the even cuter satellite speakers. It sounded very impressive with my portable CD player, so I hooked it up to my Mac’s audio output and called up the site for my favorite local public folk-music station, which has such a weird radiation pattern that the $400 Sony digital tuner in my office can’t pick it up.
Astute readers of this column may recall that four years ago, I complained mightily about the quality of Internet audio. Well, you know what? The sound still sucks. In those days — eons ago, in Internet time — it was remarkable that it worked at all. Like a talking dog whose grammar isn’t very good, it was easy to overlook the crummy sound. But Internet audio isn’t supposed to be a novelty anymore — heck, we produce a whole magazine devoted to it — it’s supposed to be the delivery system of the future.
Well, if this is the future, I’ll stick to the past. My favorite station sounds awful. It’s compressed all to hell, the bass is ridiculously loud (I guess someone thinks the kids like that), there’s no top end and I can’t even tell for sure whether it’s in stereo or not, because the separation is so weak. Worse, whenever I save something to disk (like right now, as I back this paragraph up), it stutters, and every five minutes or so it stops dead — sometimes for only 10 seconds while it rebuffers, and sometimes for 10 minutes or more because of “problems with your network connection.” And, sometimes, it crashes my computer.
Other Net radio stations I’ve sought out, whether they’re re-broadcasting over-the-air signals or are Web-only services, don’t fare any better. Half of them are already going at full capacity when I sign on and won’t accept any more connections, while half of the rest require some unnamed extra plug-in for my player, which, when I try to download it, “is not available” for my “browser’s version.” And one service that boasts 150 channels makes me listen to a goddamn 30-second commercial each time I switch channels — hardly encourages surfing, does it?
And while the Net was supposed to open up all sorts of new channels, as I write this, the selection of on-air radio stations is actually getting smaller. Some 500 commercial radio stations have recently been pulled off the Internet by their corporate overlords because of concerns over rights. This isn’t a Napster-like problem, because the rights they’re worried about are not those of the artists or songwriters, or even record companies whose material they play. No, it’s concerns over the rights of the announcers and singers on the commercials they run, whose union is demanding that they get paid extra for the re-broadcasts. We do live in strange times.
But at least re-broadcasters actually sound like radio stations. When you listen to the Web-only channels, you have no idea where they’re actually coming from. There’s nobody home — wherever home is. There are no announcers or even ads; just generic IDs, which make them kind of spooky, in a Muzak-like way. There’s supposed to be text on the screen that tells you what song is playing, but it often disappears after the first song (at least in my browser), and when it does re-appear, it’s out of sync by several minutes. Despite the heavy compression, the levels jump around from song to song — another indication that nobody’s paying attention. The classical channels are the worst: Not only are they heavily compressed (most FM stations that broadcast classical music make a point of not compressing it), but the audio levels are ridiculously low, so what’s coming out of my cute little speakers sounds less like the Amsterdam Concertgebouw and more like the deep fryer at a greasy spoon.
When it comes to a lack of imagination, these services make my local megamedia-owned-and-programmed “classic rock” station look like the Library of Congress — after three evenings of listening, I’ve heard everything in their rotation. One hundred and twenty channels, and there just ain’t much on. To add insult to injury, every site that I even try to log onto leaves a little RealPlayer dummy file on my desktop, so after a half-hour of searching for something decent to listen to, my screen looks like a flock of mutant blue pigeons flew over it.
One of analog radio’s better features is that when the signal is weak, you can still listen to it. When I teach audio theory, I point to radio as an example of how analog audio allows us to perceive sounds below the noise floor, because as a radio signal degrades, the hiss level goes up, but you can still hear the music. Listening to a Net radio station degrade is a very different experience: When traffic starts to build up, the codecs make the music sound like it’s being processed through one of those old tube frequency shifters they used to create alien voices on The Outer Limits.
The really sad part about all of this is that it’s not going to get any better any time soon. Part of it is the capacity of the Internet backbone. There will always be, as there always has been, a race between the Net’s capacity to carry traffic and the amount of traffic it’s being called upon to carry. Every year, some expert solemnly intones that, “in two years, we will have three times as much bandwidth as the nation will require,” but, inevitably, the demand manages to catch up with, and usually exceed, the supply.
But that’s not even the biggest problem. That would be — as it has always been when it comes to feeding new media to the home — the “last mile.” The solutions available now, cable and DSL, are really stop-gap measures — ways for the local telephone companies (the “baby Bells”) and cable companies to get high-speed Internet service onto existing copper without investing a huge amount of money. The long-term answer is going to be optical fiber. But getting a high-speed data signal on a fiber running underneath a main street is one thing, and getting that signal inside people’s homes is quite another.
Fiber in the home is going to be too expensive to be very common for some time. Not the cable itself, but the process of getting it inside. Streets and sidewalks and lawns have to be dug up and replaced, permits acquired, police details hired… Multiply that by 150 million or so homes and you run into some serious infrastructure expenses. And the financial returns on residential fiber just aren’t good enough, even in large apartment buildings, unless they’re being built from the ground up with high-speed access in mind.
As it is now, DSL and cable Internet are penetrating the domestic market very slowly — only about 5% of residential customers have high-speed hookups, two-thirds of them through cable and the rest through DSL. But that is all we’re going to have for a while. And even they are going to stay expensive, which is going to make it hard for a lot of people to sign on. I can afford to pay $50 a month for a high-speed connection, because it’s my business, but Joe Napsterphile — who’s already shelling out $59.95, plus $20 for the box on the TV in the kids’ room just so he can watch the WWF/XFL/NBA Battle of the Giants Ultimate Takedown Super Special — isn’t going to look at that expense the same way.
“What about the competition?” I hear you cry. The local phone companies are supposed to lease their lines to anyone who wants them, and surely there are well-funded startups doing just that and putting their own services on those lines, which eventually will force prices to drop. And cable companies are not supposed to be monopolies anymore either.
Well, it’s true that the Telecommunications Act of 1996 was supposed to ensure that competing companies have access to the baby Bells’ copper, thus (hopefully) allowing market forces to push down prices the way they did in the long-distance market 20 years ago, when AT&T was broken up. But the baby Bells have managed to hem and haw for years about the technical problems of leasing lines, and a lackadaisical FCC didn’t do anything to discourage them. More than one legal expert has categorized the Act as “toothless.”
Eventually, many companies that were trying to offer local service and/or DSL ran out of patience and money and gave up. Northpoint Communications and HarvardNet are just two of the larger DSL providers that have gone belly-up, and there are more not that far behind. One state watchdog official described the situation this way: “If the Bells can keep them on the ropes for a few months, they will be out of business.”
In Illinois, the local phone company got into a tiff with state regulators who tried to force it to open its lines to competitors, and so it simply pulled the plug on DSL expansion. As the head of the office of consumer affairs in another state has said, “Guess what, guys — this really is a monopoly.” With no competition and no incentive to either invest in a new infrastructure or to drop prices, the baby Bells see no reason to not let things stay just the way they are.
With the venture capital market in the toilet and last year’s crash in tech stocks — the ones that deserved it and the ones that didn’t — the baby Bells are sitting pretty. The new, even more regulation-averse FCC isn’t going to touch them, and we can forget about Congress: The single largest contributor to the (unopposed) re-election campaigns of Representative Billy Tauzin (R-La.), chairman of the House Commerce committee (and former head of the Telecommunications subcommittee), and John Dingell of Michigan, ranking Democrat on the committee, was Verizon (aka Bell Atlantic), with SBC (aka Southwestern Bell), BellSouth and the other baby Bells close behind. It is, without a doubt, the best investment they ever made, and a lot cheaper than upgrading.
The picture on cable Internet access is not quite as clear (pun more or less intended), but the number of municipalities that have more than one viable cable company can probably be counted on one hand. Here in my part of the world, the once well-heeled RCN is struggling, a victim of the capital markets and the huge expense of duplicating somebody else’s existing infrastructure. (What were they thinking?) With cities’ hands tied (by the same Telecommunications Act) when it comes to controlling the cable monopolies within their midst, with rates for basic service reaching new highs while the regulatory bodies look the other way, and with consolidation of the nation’s cable systems into a mere handful of companies like AT&T, Viacom/CBS/Infinity, and AOL/Time Warner/Reprise/Elektra/Atlantic/Netscape/CNN, there ain’t much motivation to innovate there either.
But what about wireless? Isn’t it going to bypass all of this limited technology and make an end run around the monopolies? I don’t know about you, but my experience with what’s supposed to be state-of-the-art cellular service makes me choke at the thought of trusting my Net connection to the people who built the network my new multimode cell phone communicates with. The thing goes into “Analog Roam” mode (at extra charge, of course) whenever I go into a building with a steel frame, and it won’t work at all if I’m in a room with more than one computer (which I am most of the time). As one recording engineer, who prides himself on his conservative approach to jumping on bandwagons, told me recently, “This has got to be the worst technology I’ve ever bought into.”
There’s one more thing the Internet radio broadcasters haven’t quite figured out. In conventional broadcasting, the more listeners you have, the better. One transmitter covers the same geographic area, whether there are 10 or 10 million people tuning in. But the more listeners a Net radio station has, the more expensive it is for them to broadcast, because each streaming server can handle only a finite number of listeners. Who’s going to pick up the tab? Good question.
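The scaling problem can be sketched in a few lines. All the capacity and cost figures below are invented for illustration; the point is the shape of the curves, not the dollar amounts:

```python
import math

# Illustrative assumptions only: per-server stream limit and monthly costs.
LISTENERS_PER_SERVER = 1_000   # assumed concurrent streams one server can feed
COST_PER_SERVER = 500          # assumed monthly cost of one streaming server
TRANSMITTER_COST = 5_000       # assumed flat monthly cost of one transmitter

def broadcast_cost(listeners: int) -> int:
    """One transmitter covers its whole area, whether 10 or 10 million tune in."""
    return TRANSMITTER_COST

def unicast_cost(listeners: int) -> int:
    """Each listener gets a separate stream, so servers are added as the audience grows."""
    servers = math.ceil(listeners / LISTENERS_PER_SERVER)
    return servers * COST_PER_SERVER

for n in (10, 10_000, 10_000_000):
    print(f"{n:>10,} listeners: broadcast ${broadcast_cost(n):,}, "
          f"unicast ${unicast_cost(n):,}")
```

The broadcast column is flat; the unicast column grows linearly with the audience. Under these made-up numbers, 10 million listeners costs the webcaster a thousand times what it costs the transmitter owner, which is exactly the "who picks up the tab" problem.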
So I’m not holding out great hopes for the Internet to soon become a great medium for delivering high-quality music to large audiences. The stuff available now, with RealAudio and MP3 technology, may well do damage to the record companies (although they seem to be quite capable of shooting off their own toes) and, somewhat more sympathetically, record stores. But it may well happen anyway, and that could end up being a rotten shame, because it’s going to give the next generation of listeners a very strange idea of what music is supposed to sound like. Ironically, as we put more bits and channels into our recordings, striving — as we always have — for that next level of realism and listener envelopment, the delivery system is getting crummier and crummier.
It reminds me, sadly, of a magazine ad that Fisher, the early hi-fi maker, used to run that showed a kid looking at a broadcast of an orchestra on a tiny black-and-white TV. The headline read, “Don’t let your child grow up thinking this is what a cello sounds like.” That battle isn’t over: Let’s not let our children think that what’s coming out of their computer speakers is all there is to music.
Paul D. Lehrman can still operate at a relatively high bandwidth.