[163547] in North American Network Operators' Group
Re: Webcasting as a replacement for traditional broadcasting (was Re:
daemon@ATHENA.MIT.EDU (Michael McConnell)
Tue Jun 11 12:13:36 2013
From: Michael McConnell <michael@winkstreaming.com>
In-Reply-To: <20371516.7332.1370620430594.JavaMail.root@benjamin.baylink.com>
Date: Tue, 11 Jun 2013 10:10:48 -0600
To: Jay Ashworth <jra@baylink.com>
Cc: NANOG <nanog@nanog.org>
Errors-To: nanog-bounces+nanog.discuss=bloom-picayune.mit.edu@nanog.org
On Jun 7, 2013, at 9:53 AM, Jay Ashworth <jra@baylink.com> wrote:
> ----- Original Message -----
>> From: "Michael Painter" <tvhawaii@shaka.com>
>
>> Anyone besides jra remember the last Super Bowl?
>> Better this year? Worse?
>> I'm sure whomever is listening in would like to know as well.
>>
>> http://www.multichannel.com/blogs/translation-please/multicast-unicast-and-super-bowl-problem
>
> Well, in fact, the most recent Massive Failure was the webcast of the
> Concert For Boston, on 5/31. They were using a vendor called LiveAlliance.tv,
> who did not appear to be farming it out to Limelight or Akamai or Youtube, as
> far as I could tell, and they apparently only figured for a scale 5 audience,
> and then got more than 500k attempts.
Such a common story.
>
> They got rescued by a vendor named Fast Hockey who are an amateur hockey
> webcast aggregator, I gather, and *are* an Akamai client.
>
> My estimation is that the reason that webcasting will never completely
> replace broadcasting is that -- because it is mostly unicast -- its
> inherent complexity factor is a) orders of magnitude higher than bcast, and
> b) *proportional to the number of viewers*. Like Linux, that doesn't scale.
This is the primary reason companies including Internap, Peer1 and XO (the list goes on and on, and includes several companies that only provide CDN services) all used to run their own CDN networks; all three have since outsourced this CDN service / sold their customers to Limelight. Edgecast even sold off all its services in Asia and now just runs a US-based CDN.
The general policy in data centres has been 30-40% utilisation to allow for bursting and unexpected temporary increases; in CDN it's more like 5-10%, especially when you are a CDN for hire and really can't make any predictions about what your customers might do. It's common for CDNs to have entire racks sitting powered off that only need to be powered up to join the cluster; our company has multiple full racks per data centre just an alert to the NOC staff or an email away from being turned on.
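A quick back-of-the-envelope sketch of why unicast cost tracks audience size and why CDN headroom targets are so conservative (the stream bitrate and viewer counts below are illustrative assumptions, not figures from this thread):

```python
# Illustrative sketch: unicast egress grows linearly with viewers,
# while a broadcast channel's cost is fixed regardless of audience.
# Bitrate and audience numbers are assumed for illustration only.

STREAM_KBPS = 3_000  # assumed ~3 Mbps per-viewer HD stream


def unicast_gbps(viewers: int, kbps: int = STREAM_KBPS) -> float:
    """Aggregate egress needed to hand every viewer a unicast copy."""
    return viewers * kbps / 1_000_000  # kbps -> Gbps


# The "figured for X, got 500k attempts" failure mode:
planned = unicast_gbps(50_000)    # what you provisioned for
actual = unicast_gbps(500_000)    # what the flash crowd demands
print(f"planned {planned:.0f} Gbps, needed {actual:.0f} Gbps")

# Why a CDN runs at 5-10% steady-state utilisation: size the plant so
# normal load is only 10% of capacity, and a 10x surprise just fits.
capacity = unicast_gbps(50_000) / 0.10
print(f"capacity {capacity:.0f} Gbps, surge fits: {actual <= capacity}")
```

The linear term in `unicast_gbps` is the whole story: double the audience, double the egress, whereas a broadcast transmitter's cost doesn't move at all.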
>
> And broadcasters are not prone to think of the world in a view where you
> have to provide technical support to people just to watch your show.
>
> "He's at the 40... the 30... the 20... this is gonna be the Super Bowl,
> folks... the 10... [buffering]"
>
> Cheers,
> -- jra
> --
> Jay R. Ashworth                  Baylink                      jra@baylink.com
> Designer                The Things I Think                           RFC 2100
> Ashworth & Associates     http://baylink.pitas.com         2000 Land Rover DII
> St Petersburg FL USA             #natog                      +1 727 647 1274
>
--
Michael McConnell
WINK Streaming;
email: michael@winkstreaming.com
phone: +1 312 281-5433 x 7400
cell: +506 8706-2389
skype: wink-michael
web: http://winkstreaming.com