The following is a Q&A intended to help OnSync users, Partners, & customers achieve the best possible experience when using OnSync in a high-quality production environment.
The team at Commonwealth Media Services is among the smartest we've seen when it comes to production-quality broadcasts using the OnSync platform. So, under pressure from our user community to deliver the down-and-dirty details for achieving incredible, yet reliable, TV-quality live event streaming over the web, our first call was to Jerry Kambic & Tug Bressler at CMS. They were kind enough to share their knowledge via this Q&A:
[Note: JM is Jonathan Maley from Digital Samba, TB is Tug Bressler from CMS]
JM: Before we even begin, I think it’s important to offer some background and context to those who might read this Q&A. Commonwealth Media Services (CMS) is a division of the Department of General Services for the State of Pennsylvania. CMS, in a nutshell, provides media-related products & services to State Agencies, including Department of Banking, Office of Administration, Department of Public Welfare, and others. Basically, CMS is the go-to authority for all things audio/visual at the State level. True?
TB: That’s correct.
JM: As it relates to OnSync, CMS has been a client & Partner of Digital Samba since 2008; if I recall, we started out small with a subscription service model, but quickly outgrew it. Today, CMS has somewhat of a custom OnSync Enterprise solution, fully administered by you, but hosted and maintained by us. Some might call it a “managed service”, I guess. You guys are very self-sufficient, so we don’t always know what you’re up to. Give us some background on how you’ve used the system so far.
TB: Sure. We’ve used it for everyday internal staff collaboration and video conferencing around town, across the state, and even as a means of connecting DMVA with a soldier on-base in Iraq. It’s perfect for that because it’s easy – there’s no software, and we use built-in laptop cameras. Simple, portable, and good quality. However, I suspect it’s our not-so-ordinary high-end production-quality OnSync broadcasts that have drawn the attention of your other Partners and certain customers.
JM: You’re right. We think you have secret knowledge for getting exceptional, television-like quality out of OnSync. So let’s get right to it. Most of your high-profile events originate in what’s called the “Capitol Media Center” over in the Capitol building, not far from the rotunda. I’ve been in there, and it’s like a studio. What kind of cameras are you using in there?
TB: Currently we are using some old composite Sony broadcast blocks with Telemetrics robotic controls and Canon glass. We are in the process of replacing them all with Vaddio HD-18 robotic 1080p cameras that are fairly inexpensive, at about $5,000 each. But, your customers and Partners should keep in mind that there are lots of different cameras out there, each geared towards different budgets and scenarios. The HD-18 cameras look great, but they need a lot of light, which is why they're so inexpensive. We can get by using those for the Capitol Media Center because it is already very well lit.
JM: Obviously, these aren’t simple USB webcams. If they’re analog cameras, how do you convert the signal to digital – something that can be fed into the PC where the OnSync session is running? We’re familiar with basic analog-to-digital converter cables, but have heard that the quality is low, certain video sizes aren’t always supported, & they’re only 480 lines.
TB: Which 480i cables or devices have you tried?
JM: I’m too embarrassed to tell you, so I won’t.
TB: A lot of the cheaper "480i" capture devices are actually 240-line devices: the signal needs to be progressive to get into most applications, so rather than putting a 480i-to-480p converter inside the device, they just sample half the fields, making it progressive. This saves a ton of money in production.
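[Editor's note: the half-field trick Tug describes is easy to see in a few lines of illustrative Python. An interlaced 480i frame is two interleaved 240-line fields; keeping one field and discarding the other yields a progressive image with half the vertical resolution, with no real deinterlacing hardware needed. This sketch is ours, not anything CMS runs.]

```python
def drop_field(interlaced_frame, keep_even=True):
    """Cheap 'progressive' conversion: keep one field, discard the other.

    interlaced_frame: a list of scan lines (e.g. 480 lines for 480i).
    Returns every other line, so a 480i frame becomes a 240-line image.
    """
    start = 0 if keep_even else 1
    return interlaced_frame[start::2]

# A 480i frame modeled as 480 numbered scan lines.
frame_480i = list(range(480))
progressive = drop_field(frame_480i)
print(len(progressive))  # 240 lines, not 480
```

The output is progressive (no field interleave to resolve), which is why the cheap devices get away with it, but half the vertical detail is simply thrown away.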
JM: Okay, excellent. Good thing to be aware of.
TB: Another thing that really helps quality is to have a camera with a higher native resolution than the 480i output. The thing that makes standard-definition broadcast cameras look really good is oversampling. For example, a Sony DSR-570 broadcast camera actually has 700 lines of resolution, even though the output of the camera is only 480 lines. The CCD chips sample everything at 700 lines and then the DSP circuitry inside down-converts to 480 lines. The result is a much sharper picture. Sorry for boring you with camera theory, but I hope this explains some of what you see.
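[Editor's note: the oversampling idea can be sketched with a toy down-converter. Sampling the scene at 700 lines and then averaging down to 480 means every output line is built from real detail the chip captured, rather than a single coarse sample. The code below is our illustration of the principle, not the DSR-570's actual DSP.]

```python
def downconvert(lines, target):
    """Box-filter a list of sampled line values down to `target` lines.

    Each output line averages the input lines that fall in its window,
    which is roughly what a camera's DSP does when it converts a
    700-line CCD image to a 480-line output.
    """
    n = len(lines)
    out = []
    for i in range(target):
        lo = i * n // target
        hi = max(lo + 1, (i + 1) * n // target)
        window = lines[lo:hi]
        out.append(sum(window) / len(window))
    return out

# Sample a fine vertical pattern at 700 lines, then down-convert to 480.
scene_700 = [(i // 7) % 2 for i in range(700)]   # detail the chip can see
broadcast_480 = downconvert(scene_700, 480)
print(len(broadcast_480))  # 480 lines out, built from 700-line samples
```

A native 480-line sensor would have to alias that fine pattern; the oversampled chain averages it down instead, which is where the extra apparent sharpness comes from.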
JM: Are you kidding? This is great stuff. So, just to be clear, despite the more sophisticated cameras and converters, you end up at 480 lines, too?
TB: Yes, what we're doing ends up only being 480 lines, too. If you're using the encoder built into Flash Player (normal OnSync with internal video), the max you can do is standard definition (480i).
JM: And you don’t use an External Video Source, such as Flash Media Encoder, to encode the live video stream being sent through OnSync?
TB: No. Flash Media Encoder is not supported on Mac; besides, we think it’s easier to simply use the standard built-in OnSync video controls.
JM: Good to know. Tell us more.
TB: So, we're using a MOTU V4HD connected to a Mac. It works well with a PC, too. Even though the device is HD, we're only using it in SD 4x3 mode. We didn't buy the MOTU specifically for doing OnSync; we happened to have it lying around from another project, and were able to repurpose it for this.
JM: Anything you’d like to change about this setup?
TB: When we rebuild the Capitol Media Center, everything will be set up to 720p60 through the entire system. The Panasonic video switcher we purchased has a down-converter card inside that can take the 720p signal (16x9), down-res to 480, interlace the signal, then side-crop to 4x3. The cool thing about doing it through the switcher, rather than direct from a camera like we do now, is that you can set up that output as an aux bus. Basically, without changing cables around, you can switch which camera or video source goes to the output. You can also set the aux-out to mirror the program (switched) output of the switcher. Most of the time we’d just want the camera that's the straight-on headshot, but it's nice to have that kind of flexibility. We've actually done some training events where we switched between the headshot and pre-recorded content to feed into OnSync.
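[Editor's note: the geometry of that down-res-and-side-crop step works out neatly, and the arithmetic below is ours, not a real switcher API. A 1280x720 (16:9) frame scaled to 480 lines comes out about 853 pixels wide; a 4:3 raster at 480 lines is 640 wide, so the card trims roughly 106 pixels from each side.]

```python
from fractions import Fraction

def downres_and_crop(src_w, src_h, out_lines, out_aspect):
    """Scale a frame to `out_lines` vertical lines, then side-crop
    to the target aspect ratio.

    Returns (scaled_w, crop_w, crop_per_side). Purely illustrative
    arithmetic; a hypothetical helper, not switcher firmware.
    """
    scaled_w = round(src_w * out_lines / src_h)        # keep source aspect
    crop_w = round(out_lines * Fraction(*out_aspect))  # width at 4:3
    crop_per_side = (scaled_w - crop_w) // 2
    return scaled_w, crop_w, crop_per_side

# 720p (1280x720, 16:9) down to 480 lines, side-cropped to 4x3:
print(downres_and_crop(1280, 720, 480, (4, 3)))  # (853, 640, 106)
```

The side-crop means anything near the left and right edges of the 16:9 frame never reaches the 4:3 OnSync feed, which is worth remembering when framing shots for a dual-purpose production.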
TB: At this point I think I'm taking the SD-SDI output from the Panasonic switcher and pushing it into one of the AJA IO interfaces we have. The only disadvantage is that they are Mac only. Now that everything we do is HD, we can't really use them for anything, so it would be a good re-purpose for them. I'm planning on keeping the MOTU for a portable rig. I actually have it rack-mounted in a small case with power conditioning and an 8-port 1gig switch (with fiber and copper so we can connect to pretty much anything).
JM: Tug, we really appreciate your time and willingness to share your knowledge. As a software company, we’re pretty focused on development of the platform. We rely pretty heavily on the user community, and especially Partners, to provide this deep-level expertise on related systems, hardware, and integration. Thanks for delivering! You guys are really talented.
TB: No problem.
JM: So what’s it like working around high-profile politicians?
JM: I have a feeling there’s more we could talk about. Are you open to doing a part 2 of this Q&A for a future post on similar topics?
JM: Meantime, can we, our customers, and our other Partners, call you?
JM: What’s your home phone number?
TB: Don’t even think about it.
Tug Bressler is a Digital Technology Specialist at Commonwealth Media Services. Again, special thanks to Tug and the good people at CMS for their time and assistance!