27 Sep 2016

Glendale Premieres Tech Week

Following in the footsteps of New York, LA, and other tech-forward cities, Glendale launched its own week-long technology forum this September. Nearly 1,000 people, representing more than 50 companies, attended the week’s events. Entertainment tech companies featured on panels included Stashimi, Mashable, and Disney.


“This was our first year hosting Glendale Tech Week, and we were delighted with the turnout, the connections and the conversations,” said Glendale Principal Economic Development Officer and event organizer Jennifer McClain. “The talk hosted at CBRE with Rick Caruso and Lewis Horne was a hit, the startup pitchfest drew more than 150 people, and the daytime panels highlighted local companies like the Walt Disney Company as well as nearby talent such as Sidebench and Flashfunders. It was great to see people talking with VCs, startup companies, students, and current businesses.”

Plans are already underway for Glendale Tech Week 2017.

27 Sep 2016

Xytech MediaPulse Propels SuperSport

SuperSport, a major provider of premium subscription television channels in Sub-Saharan Africa, has implemented Xytech’s facility management platform MediaPulse to manage its productions, crews, and crew logistics. SuperSport provides sports content across its region as part of the MultiChoice Group, South Africa’s leading entertainment and internet company.

SuperSport’s logistics are complex and time-sensitive. Its programming includes basketball and football matches and requires moving and managing staff across an array of locations. MediaPulse manages flights, hotels, car rentals, and visa and vaccination requirements, along with other logistical assignments, to help SuperSport get the games on the air.

27 Sep 2016

Imagine Products Releases Next Generation of ShotPut Pro

Imagine Products Inc., creator of software utilities for backing up, viewing, sharing, transcoding, and archiving video assets, has released ShotPut Pro 6, the next generation of its ShotPut Pro offloading application. With this update, users will see a completely new user interface, significantly faster media offloading, and a number of other new or improved features that give them much more control over their offloads – not only compared to ShotPut Pro 5, but to any other offloading application on the market.

“No other offloading application has the functionality of ShotPut Pro, which is why it is the industry’s de facto standard and why so many major movie studios and insurance companies require it in their workflows. We listened carefully to our customers’ needs and wants and channeled all of that feedback into creating ShotPut Pro 6,” said Michelle Maddox, marketing director at Imagine Products. “ShotPut Pro 6 simplifies the entire process without losing the security that the software is known for. It is the most feature-rich application we’ve ever created, and our competitors don’t even come close. With ShotPut Pro 6, we’re providing the most robust offloading application on the market at a price that’s still extremely affordable.”
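
Offload tools of this kind are generally built around checksum-verified copying: the source is hashed as it streams to the destination, and the copy is then re-read from disk and hashed again to confirm it landed intact. The sketch below illustrates only that general technique, with assumed names and MD5 chosen for illustration; it is not Imagine Products’ implementation.

    import hashlib
    from pathlib import Path

    def offload(src: Path, dst: Path, chunk_size: int = 8 * 1024 * 1024) -> str:
        """Copy src to dst, then verify the copy by re-reading and hashing it."""
        src_hash = hashlib.md5()
        with open(src, "rb") as fin, open(dst, "wb") as fout:
            while chunk := fin.read(chunk_size):
                src_hash.update(chunk)   # hash the source as it streams through
                fout.write(chunk)
        dst_hash = hashlib.md5()
        with open(dst, "rb") as fin:     # re-read the destination from disk
            while chunk := fin.read(chunk_size):
                dst_hash.update(chunk)
        if src_hash.hexdigest() != dst_hash.hexdigest():
            raise IOError(f"checksum mismatch offloading {src} -> {dst}")
        return src_hash.hexdigest()      # checksum to record in an offload report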

27 Sep 2016

Converged Storage-Network Appliance with an Integrated Compute Platform for Media-centric Applications

Synergy is the next evolution in converged storage appliances. Synergy is designed to meet the high-performance storage, compute and network requirements of the Media & Entertainment market, while remaining open to a wide range of content creation, media asset management and content distribution OEMs. Synergy delivers hardware consistency, supporting ease of installation and simplified support that maximizes user ROI.

Scale Logic engineers and deploys storage-centric solutions for broadcast, OTT, streaming, and video post production workflows. These solutions are designed specifically for customers requiring high-performance, highly reliable data storage and network infrastructure. Its 30+ years of experience in both structured and unstructured data workflows underpins its consultative approach as a trusted extension of its customers’ resources.

Scale Logic’s value to customers is demonstrated by the range of products and related services it offers, from entry-level and mid-tier storage solutions to global enterprise platforms, including Scale-Out NAS, SAN, and archive. Scale Logic offers a complete suite of lifecycle management, workflow enhancements, solution design, installation, post-warranty maintenance options, and system integration.

More information: http://scalelogicinc.com/SYNERGY/

27 Sep 2016

Cinedeck Announces Windows Support for CineXinsert While Streamlining Finishing Workflows with Additional Toolsets

CineXinsert provides true insert editing to flat files for program repair and versioning workflows, saving time and money.

Cinedeck, developer of versatile video capture and playback tools, announced that the Windows version of its cineXtools kit, featuring cineXinsert, will ship at the end of September. CineXinsert, introduced at the 2016 NAB Show, is the world’s only cross-codec, cross-wrapper, file-based insert editor. Since its launch earlier this year, the cineXtools kit for OS X has already started changing the way Hollywood works.

CineXinsert is the patent-pending, file-based insert edit technology built into cineXtools, presented through a familiar NLE-style Player/Recorder interface. True file-based insert editing allows users to frame-accurately overwrite new segments of video, audio, or closed captioning in closed flat files, without re-rendering or re-exporting entire programs. Editors can dramatically increase their efficiency by using cineXinsert to make changes to files rather than repeating the multi-hour re-export and re-QC process.
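
At the byte level, the idea can be pictured with a hypothetical essence file holding fixed-size frames: seek to the offset of the first affected frame and overwrite only that span, leaving the rest of the file untouched. Real wrappers such as MXF and QuickTime index frames through tables and atoms, so the sketch below is a conceptual illustration only, not Cinedeck’s patent-pending implementation.

    def insert_edit(path: str, new_frames: bytes, start_frame: int,
                    frame_size: int, header_size: int) -> None:
        """Overwrite whole frames in place; no re-render, no re-export."""
        if len(new_frames) % frame_size != 0:
            raise ValueError("replacement data must be whole frames")
        with open(path, "r+b") as f:                       # open for in-place update
            f.seek(header_size + start_frame * frame_size)
            f.write(new_frames)                            # only this span changes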

Accompanying the Windows version are several new features. Newly added is a complete audio versioning module for reorganizing existing audio tracks or building new layouts. Audio versioning also supports inserting up to 32 tracks of separate wave files, providing full deliverable flexibility. In addition, a complete DPP (the UK’s Digital Production Partnership) metadata editing interface has been added. Other unique capabilities include instant timecode re-striping for changing the running timecode of files, switching between drop and non-drop timecode, trimming to remove portions of the start and/or end of a file, and more.
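
Drop-frame timecode itself is a deterministic mapping: at 29.97 fps NTSC rates, frame labels ;00 and ;01 are skipped at the start of every minute except minutes divisible by ten, so the displayed clock tracks wall time. A minimal sketch of that standard mapping (the classic algorithm, not Cinedeck’s code):

    def frame_to_dropframe_tc(frame: int) -> str:
        """Map a 29.97 fps frame count to drop-frame timecode HH:MM:SS;FF."""
        frames_per_min, frames_per_10min = 1798, 17982    # 30*60-2, 30*600-2*9
        tens, rem = divmod(frame, frames_per_10min)
        # Add back the label numbers drop-frame skips (2 per dropped minute).
        if rem > 1:
            frame += 2 * 9 * tens + 2 * ((rem - 2) // frames_per_min)
        else:
            frame += 2 * 9 * tens
        ff = frame % 30
        ss = (frame // 30) % 60
        mm = (frame // 1800) % 60
        hh = frame // 108000
        return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"

    assert frame_to_dropframe_tc(1800) == "00:01:00;02"   # labels ;00 ;01 skipped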

“The ability to simply replace a few frames of video or a bit of audio in an existing file instead of re-exporting an entire program is life-changing for an editor. File-based insert edit shatters the decades-old belief that files cannot be edited,” says Charles Dautremont, Cinedeck CTO. “With cineXinsert now a cross-platform application and new features like audio versioning and built-in DPP metadata editing, cineXinsert enables everyone to streamline their deliverables process.”

The resolution-independent cineXtools can be used with 4K, HD, and SD material and supports the most common production codecs, including ProRes, DNxHD, XDCAM HD, and AVC-Intra, in the most common file wrappers, including QuickTime and MXF.

27 Sep 2016

Cinedeck at Sept. 30 Editor’s Lounge!

Discussion of post finishing file-based workflows! Demonstrations include new features developed for IBC, including the Audio Reversioning Tool for WAV file inserts.

And, we will be raffling two copies of cineXinsert ($1,495 MSRP)!

When: Friday, 09/30, 6:30 ~ 9:30pm | Where: 1612 W. Olive Ave. #200, Burbank, CA 91506

27 Sep 2016

American Society of Cinematographers Releases “Cinema Display Evaluation Plan and Test Protocol”

Document Defines Method for Visual Evaluation of Parameters for Next Generation Cinema Presentations.

The Technology Committee of the American Society of Cinematographers (ASC) is pleased to announce the publication of its “Cinema Display Evaluation Plan and Test Protocol,” which defines a method for the visual evaluation of parameters that characterize next generation cinema projection and active screens. As part of the industry’s move to high dynamic range (HDR) and wide color gamut, the document represents the first step toward the goal of identifying where value is created from the filmmaker’s point of view. It is available for download now on the ASC website.

The “Cinema Display Evaluation Plan and Test Protocol” explores capabilities of projectors and displays that go beyond those commonly found in cinema today. The focus is on deeper blacks, practical primaries for wider color gamut, effective contrast ratios, and optimal peak white levels for HDR cinema. Phase one of this work focuses on understanding how different parameter values impact the perception of image quality, establishing a baseline for further testing.

The “Test Protocol” is the work of the Next Generation Cinema Display (NGCD) subcommittee of the ASC Technology Committee. The ASC Technology Committee is chaired by Curtis Clark, ASC. The NGCD subcommittee is co-chaired by Michael Karagosian, Eric Rodli, and Steve Schklair.

Formed in 2002, the ASC Technology Committee examines emerging imaging technologies in order to understand them and advise the ASC membership and the motion picture industry on the convergence of new digital imaging technologies with traditional motion picture techniques.

For more information, visit www.theasc.com.


27 Sep 2016

Update on NASA Imagery with Rodney Grubbs

At the HPA Tech Retreat this year, NASA Imagery Experts Program Manager Rodney Grubbs gave a compelling presentation about how the space agency is using professional cameras to capture outer space images. For fans of space exploration or sci-fi, the chance to see these images is a thrill. But NASA isn’t simply showing off pretty pictures. It’s actively asking the film/TV community for its expertise to take even better pictures.

During his presentation, Grubbs enumerated the challenges in capturing high-res imagery in space: radiation, fickle and extreme temperatures, operating a camera in a vacuum, and, at the end of the line, transmitting those spectacular images to earth. All the cameras with high-resolution sensors tested thus far, says Grubbs, have been consistently damaged by ionizing radiation, losing as many as seven to 10 pixels a day.

Since then, the NASA team has begun testing its first Bayer-pattern camera, the RED EPIC, on the SpaceX Dragon, the first-ever private spacecraft to rendezvous with the Space Station. The first RED camera, housed in a standard body, has just come back and is undergoing testing to determine the level of pixel damage. Meanwhile, says Grubbs, RED president Jarred Land commissioned a carbon fiber camera body to house the second ISS RED camera’s sensor. “It’s not getting damaged as much as the first one,” says Grubbs. “It’ll be interesting when we get it back and compare it with damage to the first RED.”

Other cameras NASA has tested include two Panasonic AG-3DA1 twin-lens HD 3D camcorders. “The first one flew on the last space shuttle mission and was up for a little more than a year,” he says. “The second flew on a SpaceX Dragon and it stayed up for over 1,000 days. Both of them performed better than any CMOS camera we’ve flown before. We just got the second one back and it actually performed better than the first one, although it was up for three times as long.”

The Canon XF305 is the workhorse day-to-day camera, says Grubbs, but it exhibits enough damaged pixels that it’s replaced every eight months or so. He’s looking to replace it with a camera that can shoot 4K and has a built-in encoder, such as new models from Canon and Panasonic. “The Panasonic AJ-PX270PJ microP2 handheld camcorder is showing promise,” he says. Grubbs is eager to segue to a camera with a built-in encoder, to make it easier to stream live HD. “The Panasonic 270 features an optional built-in encoder, which would make life easier on the crew and much less cumbersome than a standalone encoder.” Currently, Grubbs’ team is testing the AJ-PX270PJ in a lab “to simulate the Internet on the Space Station, with the latency for communicating with an orbiting spacecraft.”

The SpaceX explosion on Sept. 1 was a temporary setback. “We were going to fly the next RED Epic cameras with a UHD encoder, to give us live UltraHD, the first such broadcast from space,” says Grubbs. “But the explosion put that on hold.” Since RED is on the verge of introducing its Super 35mm Helium 8K sensor, says Grubbs, NASA is in a “wait and see” mode. “We’ll probably switch in 2017 or 2018, because it gives us more aperture and would be easier for low light and time-lapse photography,” he says.

In addition to live UHD from space, virtual reality is next on NASA’s “event horizon.” “We’re looking at the Nokia OZO,” he says. “NASA has also received proposals from all kinds of startup companies saying they’re building their own VR cameras. All these are proprietary, so I can’t go into detail, but we’re reviewing these proposals for technical merit. In a year or so, we hope to fly one if not more VR camera rigs.”

One of the biggest benefits of using a VR rig, says Grubbs, is that NASA would be able to show smooth pans and tilts of the Space Station exterior, something not possible with mechanical pan/tilt units. “If I want to build a new pan/tilt unit, the cost is estimated to be millions of dollars,” he says. “But with a VR rig with virtual pan and tilt, via stitching views from cameras with no moving parts, that gives me the functionality I need.”

“If you could be inside the Space Station with any view you wanted, that would excite the public like nothing we’ve done before,” he says.

29 Aug 2016

Data Gravity in the Content Creation Industry

By Chuck Parker, HPA Board Member

The longer you work in and around the production and post production industries, the more you notice the accelerating change in the industry’s underlying workflows. In the mid-2000s, most of the cataclysmic change facing the industry was a transition from physical and analog workflows to digital workflows. A few years later, the digital business model hit the industry with full force, causing upheaval in pricing and cost models for those that did not adapt quickly. The result was a number of new players across the value chain and the end of some storied brands and companies in the industry.

After a decade of significant change, the majority of the steep process changes (physical/analog to digital) and massive price pressure have been endured. However, our industry is now largely subject to Moore’s Law as a result of that digital transition. The net effect of a “doubling” of “digital power” at price parity every 12-18 months is a phenomenon that touches many (but not all) parts of the production and post production process, and it is a driving economic force that can bear good tidings or incredible pain. Its continued disruption is often the result of “linear thinking” – falling prey to the assumption that a change which took five years to achieve in price-point capability will take just as long to duplicate. In fact, capability per price point will improve somewhere between 8x and 32x over that period.
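
The arithmetic behind that range is straightforward: five years contains roughly three to five doubling periods at a 12-18 month cadence, and 2^3 = 8 while 2^5 = 32. A quick illustration:

    # Capability-per-price improvement over 5 years at a given doubling cadence.
    for months_per_doubling in (12, 18):
        doublings = 60 / months_per_doubling
        print(f"every {months_per_doubling} months: {doublings:.1f} doublings "
              f"-> {2 ** doublings:.0f}x over 5 years")
    # every 12 months: 5.0 doublings -> 32x
    # every 18 months: 3.3 doublings -> 10x (8x counting only completed doublings)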

What does all this mean? Taken at its most basic, one might expect that a 2K feature shot in 2016 should cost significantly less than the same feature shot in 2015. However, if we take a cue from the storage industry, they would tell us that the amount of data stored by enterprises has, on average, doubled every year for as long as anyone can remember. This is largely the result of changing behaviors and processes as the cost of keeping data around gets cheaper (i.e., everything becomes less efficient). Don’t believe this? Ask yourself how many photos you have stored online with your Apple or Android phone today vs. three years ago: at some point the cost of iCloud, Dropbox or Google Photos got so “cheap” that managing your photo catalog was “more expensive” to you than the cost of the service just keeping everything.

Combine this impact of price point on behavior and process efficiency with the march of 4K presentation to the consumer, the increasing availability of 6K cameras at affordable price points, and the inevitable arrival of 8K for either virtual reality or some future Japan-like broadcast standard, and we have rapidly arrived at a point where a shoot that used to produce 1-2 TB of raw footage daily now often produces more than double that amount, and occasionally 15-25 TB of daily capture on high-end projects.
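
A rough sizing model shows why: raw bytes scale with pixel count, bit depth, frame rate, and hours of capture. With illustrative parameters (assumed here, not quoted from the article) of 12-bit raw, 24 fps, and four hours of rolling cameras per shoot day:

    def daily_raw_tb(h: int, v: int, bits: int = 12, fps: int = 24,
                     hours: float = 4.0) -> float:
        """Approximate raw capture per shoot day, in terabytes."""
        bytes_per_frame = h * v * bits / 8
        return bytes_per_frame * fps * hours * 3600 / 1e12

    for name, (h, v) in {"2K": (2048, 1080), "6K": (6144, 3240),
                         "8K": (8192, 4320)}.items():
        print(f"{name}: ~{daily_raw_tb(h, v):.0f} TB/day")
    # 2K: ~1 TB/day   6K: ~10 TB/day   8K: ~18 TB/day

Under these assumptions, the figures line up with the 1-2 TB days of the 2K era and the double-digit-terabyte days now seen on high-end projects.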

The net impact of this incredible change is that every professional in our industry needs to master a new term: data gravity.

Everyone in the production and post production value chain needs to start thinking differently about the way they impart their magic upon the content creation process. Visual effects teams have been dealing with the challenge of remote compute resources (geographically dispersed office locations) for years already, and they adapted to cloud compute rather quickly. That typically requires them to push a large amount of data to another location very quickly (and securely), work their magic on the content in that location (data gravity), and then move the end result to where it needs to be delivered. It is very likely that this kind of process change will be required of most creative processes in the very near future. For example, the convenience offered to professionals by digital dailies (think Pix/Dax for review and approvals on iPads) will become a requirement for real-time color grading, editing, and review-and-approve workflows to match this data gravity concept as content production sizes continue to grow at these rapid rates.

Managing data gravity requires both a change of process to account for some abstraction layer (i.e., visual review of a sliver of the data vs. moving all of the data for the review) and the ability to push large amounts of data from one creative collaboration partner to the next quickly and securely. Solving the data abstraction layer is about understanding your creative process and the points at which others can peer into that process to impart their magic vs. having to have all of the data to do so (i.e., review and approve vs. creation of a new visual effect). Solving the data transport issue requires understanding the problem at the network layer (layer 3), including the speed of light, contention, and latency, as well as understanding how the application layer (layer 7) can help solve transport problems. In short, it means knowing when you need dedicated private bandwidth vs. lots of cheap internet and the right software to solve your data gravity problem.
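
As a feel for the transport side, the first-order math is simply size divided by throughput, and the gap between ideal and achieved throughput is exactly where the layer-7 tooling earns its keep (illustrative numbers only):

    def transfer_hours(size_tb: float, link_gbps: float,
                       efficiency: float = 1.0) -> float:
        """Hours to move size_tb over a link_gbps path at a given efficiency."""
        return size_tb * 1e12 * 8 / (link_gbps * 1e9 * efficiency) / 3600

    for size_tb, link_gbps in ((2, 1), (20, 10)):
        print(f"{size_tb} TB over {link_gbps} Gb/s: "
              f"{transfer_hours(size_tb, link_gbps):.1f} h ideal, "
              f"{transfer_hours(size_tb, link_gbps, 0.3):.1f} h at 30% efficiency")
    # 2 TB over 1 Gb/s:  4.4 h ideal, 14.8 h at 30% efficiency
    # 20 TB over 10 Gb/s: 4.4 h ideal, 14.8 h at 30% efficiency

A single long-haul TCP stream over a high-latency path often achieves a small fraction of the raw link rate, which is why dedicated bandwidth, parallel streams, or UDP-based acceleration tools enter the picture.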

Regardless of how you decide to change your creative processes and deal with data gravity, one thing is for sure: as long as the underlying cost drivers in the digital workflow halve in price every 12-18 months (or double in capability for the same price), someone else in the chain will be innovating the creative process to deliver a better, faster, and cheaper solution to present to your customers, because the opportunity for disruption is just too large to ignore.

29 Aug 2016

MPSE/CAS Golf and Poker Tournament Tees Off 18 September

8th annual charitable event co-sponsored by the Cinema Audio Society.

The Motion Picture Sound Editors (MPSE) and the Cinema Audio Society (CAS) will host the 8th Annual MPSE Golf and Poker Tournament September 18, 2016 at the Angeles National Golf Club in Sunland, California. Proceeds from the event will benefit the MPSE’s Ethel Crutcher Scholarship Fund, which provides mentoring and support for student sound artists, as well as the organization’s work promoting the role of sound in movies, television, gaming and other entertainment media.
