
HPA Newsline
16 Feb 2017

Xytech and Phoenix7 Forge Technology Collaboration

MediaPulse and Zeus Integrate Channel Management and Facility Scheduling for the First Time

15 February 2017 (Los Angeles, CA) Xytech, creator of the industry-defining facility management platform MediaPulse, has announced a technology collaboration with Phoenix7, creator of Zeus, the industry’s leading channel management system. This integration allows a channel schedule to automatically create jobs, schedules and transactions in MediaPulse. This adapter saves broadcasters from dual entry, reducing mistakes, lowering costs and increasing throughput. Simply put, the same staff accomplishes more tasks with fewer mistakes and greater transparency.

Channel management systems and facility scheduling systems have traditionally been disconnected islands within a broadcaster’s IT landscape. This collaboration allows for real-time channel scheduling and title information from Zeus to display in MediaPulse’s Schedule Book. The connection gives facility schedulers and personnel the ability to see immediate changes to a channel’s on air timetable and react swiftly.

Daniel Lynch, Vice President of Broadcast Services for Xytech, said, “Broadcasters must organize their people and resources around their live, on-air output. This is especially challenging for sports and news broadcasters.  The events driving their schedules are by nature highly volatile.  A match can be altered on short notice due to inclement weather or a broadcaster changing its decisions based on a team’s current standing. In the fast-paced world of broadcasting, staff, studios and equipment need to be repurposed quickly and accurately.”

The ability to assimilate these two different but critically related systems is possible because of the software development expertise of Phoenix7 and Xytech. A seamless interface between the products fulfills many client requirements.

Hitesh Vekaria, Founder and Managing Director of Phoenix7, said, “Integrating the channel management power of Zeus with MediaPulse is an exciting development and one we believe will be highly efficient in our customers’ operations. Working with the team at Xytech has been a great experience because Xytech and Phoenix7 both operate with a similar approach – to provide the best service for our customers as they operate under extreme deadlines and pressure. We see this integration as a concrete example of our commitment to improve service for customers.”

For more information, visit xytech.com.

16 Feb 2017

How Low Can You Go?

Craig German

Craig German

My original intent for this article was to explore how a TV producer or film director could conceive, create, and distribute a high production value show for a fraction of the “typical” cost.  We all know about Robert Rodriguez’s critically acclaimed $7,000 film, El Mariachi; Tyler Oakley’s $6M/year on-line social issues show; and student filmmakers who make compelling movies on nothing but free labor and borrowed equipment and rooms.  And there’s no question we can learn a lot in this industry by studying these examples to test our ideas about how we make great content.  But what I was after was this: how can we change our creative workflows and apply new products and services from industry disruptors to reduce the cost profile for what we do?  With limited print space and time, I decided on two things: I’d focus my attention on production and post in scripted TV and film; and I’d avoid quantitative budget comparisons, instead aggregating the thoughts of some of our colleagues into a picture of where we’ve made progress today, where we may go in the future, and where some of the challenges will probably persist.

Historically, scripted television and film have required significant budgets and heavy infrastructure. However, progress in technological innovation and the evolution of content business models have enabled and motivated us to seek lower cost profiles, with even greater expectations that costs will continue to decrease.  The sources of cost savings have been in several areas: workflow optimization; tools commoditization; cloud services; and open technology.

When we talk about workflow optimization, we’re talking about intelligent process choices.  Are there redundant steps?  Can we remove any dependencies to shorten the timeline?  If we change the order of events, do any steps collapse?  We’ve seen workflow specialists rise to the level of key players at post houses.  And with more compute power, cheaper storage, and greater connectivity, a workflow specialist has a deeper arsenal for his attack plan.

As an example, Jesse Korosi, Director of Workflow Services at Bling, has had increasing requests to be involved in a job from capture to final conform, going beyond the typical separation of on-set and post roles.  Often, much of the knowledge about the assets leaving the set for post is lost – either it was never captured in the first place, or it is too inconvenient to track.  By involving the workflow specialist in technical decisions up front, from how color is created on set, to the software used for metadata logging and VFX Data Wrangler notes, he is able to aggregate project metadata so that downstream teams can automate more of their processes.  Aside from some of the customized tools that a workflow specialist may use, many other software companies like Colorfront have opened up their products to enable workflow specialists to customize them, as well.

Another type of workflow optimization is talent focus.  Great colorists and sound mixers still command a premium and will likely always be appreciated for their importance, as evidenced by the many award categories at the BAFTAs, Oscars, Emmys, HPAs, etc.  Even in the indie world, as noted by Randall Maxwell, an indie producer/writer, you need the pop you get from the color pro and the mixer.  Is there a way to focus these talented professionals on where they add the most value, and not where the work is more preparatory or ancillary?  Adam Stern, Founder and CEO of Artifex Studios in Vancouver, has seen a trend of doing the initial sound or color work in a home studio before submitting it for the high end finishing step.  Having said that, you won’t achieve high production value without true talent, the right tools, and the right viewing or listening experience.  The choice you have is how close you want to get to the look you want before you engage with the real deal.

The term “cloud” has been around long enough that we all have a pretty good idea of what we mean when we say it.  But how does it help us to control our costs?  When cloud was first becoming a thing, the only solution most of us had on our lips was Amazon, and we were skeptical about what could be accomplished in the cloud, how secure it was, and whether it could meet our SLAs.  Ten years later, we also have Microsoft Azure and Google Cloud, major vendors have developed vertical ecosystems in each cloud solution, the cloud providers have a solid relationship with the MPAA, and there are sophisticated tiers of storage and compute to meet our varying post and distribution needs.  Erik Weaver, who ran the Entertainment Technology Center’s Project Cloud, observed through various case studies that creative teams were able to exploit editorial resources, via BeBop and Avid, as well as HDR color correction via Colorfront, in the cloud to complete their films, while only paying for the transactional costs of the actual cloud cycles.  Similarly, Adam Stern said that he now has the luxury of reserving budget that he used to have to hold back to keep up with the latest rendering hardware and infrastructure, and he is instead able to reallocate those funds directly to VFX burst rendering in the cloud.

Post production tools themselves have become more sophisticated and multi-purpose.  The competition from Final Cut Pro and Premiere has given Avid a run for its money, and our community has benefited as a consequence.  Although there have been questions over the past several years of whether Apple was serious about the professional market, they are still an important option for many post production professionals, whether as part of a large operation, or as an independent producer.  And this has certainly driven each of these vendors to lower their price point while piling on features that enable content creators to get closer to their final vision with a one-stop shop mentality.  As an even more extreme case of commoditization, look at the success of the film Tangerine at Sundance 2015 – shot completely on an iPhone 5s (with Steadicam and anamorphic lens attachments).  While the Tangerine crew still went through a traditional post phase, the fact that they could make a splash using a camera that doesn’t have ARRI, Sony, or RED in its name should give us all pause.

We’ve discussed a few areas of disruptive change – so what’s next, and what’s not?  It seems that we will eventually see all of our tools in the cloud, with metered access according to how much we use them.  More creative choice will be accessible on set, and eventually, even raw material will be delivered electronically.  Workflows will end up fully integrated all the way from the set, through post, and into distribution.  But – the need for sheer talent is essential to the creation of compelling content (at least, until Watson understands the greatness of Martin Scorsese or Matthew Weiner), and no one wants to put out a show without experiencing it in an environment that approximates how a viewer will see it, so mix stages and DI suites will remain fixtures in the process.  As pointed out by Jacob Medjuck, an early pioneer of all-digital post, the biggest budget items in a mainstream film continue to be talent and music – since those are much harder to control, we’ll continue to focus on process and tools.  Our industry will continue to find ways to produce great content more quickly, for a lower cost, and for more platforms.  One thing that we can count on is that each of us will be asked to contribute in our own way to this on-going transformation.

I’d like to thank the following people for their input to this article: Jesse Korosi, Director of Workflow Services at Bling/SIM Digital, and Co-Head of the Young Entertainment Professionals group (YEP) at HPA; Jacob Medjuck, Film Director and Founder of Film Raiser; Erik Weaver, Global Marketing Director at HGST, and Project Cloud Lead at ETC; Randall Maxwell, Film Producer/Writer; and Adam Stern, Founder/CEO of Artifex Studios.

15 Feb 2017

SCRG Matures to NET (Networking-Education-Technology)

NET events launch in March – details to be announced shortly!

The HPA’s Sales & Career Resource Group, which was formed in 2007, has grown into NET (Networking-Education-Technology), formally launching next month with a fresh mandate for more networking, education and knowledge sharing on business and emerging technologies. Known for presenting lively networking events where HPA members shared ideas and stayed informed, SCRG evolved as the dramatic changes in our industry increased the demand for knowledge and the need for connection, particularly among the non-technical crowd.

“This gathering is an engaged, upbeat, exciting place to discuss what’s new, understand it, and get our heads around how it impacts what we are working on.  SCRG was a perfect launch pad for what NET is becoming – a place where the usual suspects and new faces gather to talk about what keeps them intrigued.  It’s a more than worthwhile way to spend your lunch hour,” says Josh Wiggins, NET chair. “Our initial event last year attracted nearly 100 people, and we’ve been deluged with requests for the next NET lunch.  If you are interested in being part of the event, please contact Alicia Rock.  We look forward to seeing you in March!”

With new speakers and a wide array of topics, NET events promise to be engaged and engaging. Details for the upcoming NET event slated for late March will be announced at the HPA Tech Retreat in Palm Springs.

15 Feb 2017

The GRAMMY® Awards Sees Avid Customers Hit the High Note

Every nominee for Record of the Year and Album of the Year relied on Avid’s comprehensive music creation solutions

Avid® congratulated its many customers recognized for their outstanding achievements in the recording arts with award wins and nominations at the 59th Annual GRAMMY® Awards, which took place on Sunday, February 12 in Los Angeles. The world’s most prestigious ceremony for music excellence honored numerous artists, producers and engineers who relied on Avid’s music creation solutions powered by the MediaCentral® Platform.

Every nominee for Record of the Year and Album of the Year used Avid’s comprehensive tools and workflow solutions, including the industry-standard digital audio software, Avid Pro Tools®.  GRAMMY Award winners used Pro Tools to create Adele’s Record of the Year and Song of the Year Hello, and her Album of the Year 25.

Record producer Noah “40” Shebib, who was nominated for Record of the Year for Work by Rihanna featuring Drake, Album of the Year for Drake’s Views and Best R&B Song for Come and See Me by PartyNextDoor featuring Drake, said: “It’s always humbling to be recognized on such a big stage like the GRAMMYS and I’m honored to have been nominated for three awards. Working with artists in different locations and time zones can be challenging, but with Pro Tools, we’re able to collaborate in real time without compromising our creativity.”

Producer, mixer and musician Mike Dean received multiple nominations for Album of the Year for Beyoncé’s Lemonade and Justin Bieber’s Purpose, and Best Rap Song for Famous by Kanye West featuring Rihanna and Ultralight Beam by Kanye West featuring Chance The Rapper, Kelly Price, Kirk Franklin and The-Dream. “To once again have several GRAMMY nominations, all of which were recorded and/or mixed with Pro Tools speaks volumes about the reliability of Avid technology and its ability to take our music to the next level. Special thanks to the whole Avid team for keeping me up and running year after year.”

Other GRAMMY Award-winning projects by artists, producers and engineers who use Avid solutions in their studios include Best Rock Performance Blackstar by David Bowie, Best Dance Recording Don’t Let Me Down by The Chainsmokers featuring Daya, Best Rap Song Hotline Bling by Drake, and Best Pop Duo/Group Performance Stressed Out by Twenty One Pilots.

“The most talented music professionals from around the world continue to choose Avid’s comprehensive audio solutions to record and produce award-winning music,” said Avid Chairman and Chief Executive Officer, Louis Hernandez, Jr. “As a company that’s passionate about world-class content creation and technical innovation, we’re proud to provide our preeminent customer community with not only the industry-leading solutions they need to deliver the highest quality music, but also a media ecosystem to help artists economically through Avid Everywhere™. We congratulate our GRAMMY Award-winning and nominated customers who inspire us with their outstanding artistic and technical achievements.”

See the full list of GRAMMY Award winners and nominees who relied on Avid Artist Suite creative tools here.

15 Feb 2017

Avid Delivers Next-Generation Newsroom Innovations that Accelerate Multiplatform Story Creation

Cloud-enabled story-centric workflow enables media and news organizations to quickly and efficiently create and deliver multiple angles of a story across more viewer outlets

Avid® today announced the availability of innovations for the next-generation newsroom. Powered by the Avid MediaCentral® Platform, the industry’s most open, tightly integrated and efficient platform for media, the next-generation newsroom is based around a complete story-centric workflow that connects production teams anywhere in the world via the cloud and takes their stories beyond traditional media outlets. It includes multiple Avid solutions and new feature enhancements—including several new panes within MediaCentral | UX—for fast, efficient, modern newsroom management and news production.

The story-centric workflow puts the story at the center of news operations, and provides the best and most comprehensive tools and workflow solutions to enable news teams to plan, gather, create, collaborate, manage and deliver news to a wider range of viewers across multiple platforms. This holistic approach allows for more dynamic and organic storytelling and greater workflow agility–both inside and outside the newsroom.

In addition to news teams, anyone who needs to plan and schedule resources, gather information and distribute content around specific topics or stories can benefit from the story-centric workflow. It enables teams to quickly find and access the media and information they need to tell multiple angles of a story and increase viewer interest. Content can be pushed across a variety of platforms as the story evolves, including on-air, online, and on mobile devices. Audiences can get up-to-the-minute information and contribute to live broadcasts through social media interaction. And news teams can move away from traditional rundown-driven workflows, accelerating their ability to react to changing information.

“Today’s media organizations are under intense pressure to deliver stories to a broader range of outlets including social media, boost ratings without necessarily increasing resources, more easily incorporate social content, and engage a broader audience,” said Alan Hoff, vice president, Market Solutions, Avid. “The new story-centric workflow enables our preeminent customer community to capitalize on these opportunities while improving efficiency across workflows and resources, and future-proofing their investments with flexible licensing and deployment options including cloud-enabled workflows.”

The products that comprise the story-centric workflow include MediaCentral® | UX, iNEWS®, Interplay® | Production, Media | Distribute, Media Composer® | Cloud, Maestro™, and Social Media Hub. MediaCentral | UX, the cloud-based web front end to MediaCentral, is the hub and catalyst in the story-centric workflow, so customers who already have iNEWS and/or Interplay | Production systems can manage every facet of a news story from a single user interface.

Several new capabilities within MediaCentral | UX facilitate the story-centric workflow and integrate with other platform-connected solutions like iNEWS and Interplay | Production. They include the ability to create and manage assignments and resources, and to gather story-related content in a collaborative workspace.  In addition, new integrated panes for Maestro and Social Media Hub bring graphics management and social media interaction into the same interface.

With the new MediaCentral | UX Assignments pane now available, the story-centric workflow enables users to:

  • Assign and manage stories with a virtual assignment desk. Users can assign teams, resources, topics, and destinations to a story, which can then be easily searched for, filtered, updated, and managed throughout its evolution
  • Build a story by gathering and associating graphics, video, text, tweets, social media posts, and other potential story-building content in the Elements area of the Assignments pane
  • Sync stories across newsrooms with richer metadata tagging in MediaCentral | UX. Users can associate categories, topics, and tags to a story to enable better content searching and management across all accessible local and remote databases
  • Aggregate Twitter and Facebook content and display social media postings (from Social Media Hub) as on-air graphics (via Maestro)
  • Remotely access online and archived footage, scripts, and information back at the station or anywhere on the network and shoot, write, edit, and deliver stories from any location in the world using MediaCentral | UX and Media Composer | Cloud
  • Easily deliver a single story across multiple formats for viewing on TV, in a web browser, on a mobile device, or through a variety of social media platforms with direct integration between MediaCentral | UX and Media | Distribute
  • Broadcast up-to-the-minute graphics with the new Maestro pane in MediaCentral | UX, which allows users to drag and drop clips and images directly into Maestro templates to create and update real-time graphics quickly for on-air versions of stories

The MediaCentral | UX Assignments pane is available at no additional charge for customers who upgrade to MediaCentral | UX version 2.9. 

14 Feb 2017

Sci-Tech Awards Honor Decades of Digital Camera Development

By Debra Kaufman

The 89th Scientific and Technical Awards, first established for the 4th Academy Awards in 1931, were held on February 11, 2017 at the Beverly Wilshire Hotel. Hosted by actors John Cho and Leslie Mann, the 2017 Sci-Tech Awards featured 18 separate awards, given out as citations and plaques to 34 individuals.

A well-written script and the comic delivery of the two hosts provoked frequent laughter in a show where the awards are described in dense scientific and engineering terminology. Cho and Mann, who got appreciative laughter for their self-deprecating befuddlement at the text they read off the prompter, hit the mark when they dubbed the Sci-Tech Awards a “secret, private awards ceremony” that turns away the many celebrities who want to attend.

The main star of the evening was digital cameras, which, as Scientific and Technical Awards committee chair Ray Feeney noted, “helped facilitate the widespread conversion to electronic image capture for motion picture production … [and] significantly expanded filmmakers’ creative choices for moving image storytelling.” Two awards – to Sony/Panavision’s Genesis and Thomson Grass Valley’s Viper FilmStream – acknowledged the pioneering work of those companies that built the cameras at a time when the idea of digital cinematography was new and, often, unwelcome. The Committee then recognized three digital cameras that are in widespread use today, featuring the latest technologies: the ARRI Alexa, RED Digital Cinema’s RED and Epic cameras, and Sony’s F65 CineAlta camera.

In the arena of digital visual effects, which has reached a high level of sophistication and maturity, the Academy honored advances in facial performance capture with awards to several responsible companies and individuals: Parag Havaldar’s work on expression-based facial performance at Sony Imageworks; Nicholas Apostoloff and Geoff Wedig for their animation rig-based system at ImageMovers Digital and Digital Domain; Kiran Bhat, Michael Koperwas, Brian Cantwell and Paige Warner for ILM’s facial performance capture system; and Luca Fascione, J.P. Lewis and Iain Matthews for creating Weta Digital’s FACETS facial animation capture system.

The trend towards production that incorporates heavy use of CGI and animatronics put Steven Rosenbluth, Joshua Barratt, Robert Nolty and Archie Te in the spotlight for their work on the Concept Overdrive motion system, which coordinates real world, virtual and animatronic imagery into a seamless workflow. Brian Whited was honored for designing and developing the Meander drawing system at Walt Disney Animation Studios.

Awards more specific to the creation of digital imagery recognized Larry Gritz for his creation of Open Shading Language, which has become “a de facto industry standard,” and several awards to those whose work has improved rendering. Carl Ludwig, Eugene Troubetzkoy and Maurice van Swaaij were awarded for an early rendering breakthrough, with CGI Studio at the now-defunct Blue Sky Studios. The popular Arnold renderer was honored, with awards going to Marcos Fajardo for the “creative vision and original implementation” as well as Chris Kulla, Alan King, Thiago Ize and Clifford Stein for “highly optimized geometry engine and novel ray-tracing algorithms” developed at Sony Imageworks and Solid Angle. Vladimir Koylazov was awarded for “the original concept, design and implementation” of V-Ray, at Chaos Group, widely used for its approach to ray-tracing and global illumination and “its support for a wide variety of workflows.”

An animatronic horse puppet, developed originally for use in “Seabiscuit,” won honors for Mark Rappaport for the concept, design and development, Scott Oshita for the motion analysis and CAD design, Jeff Cruts for developing the faux-hair finish, and Todd Minobe for character articulation and drive-train mechanisms. Awards for digital wireless microphone systems went to Glenn Sanders and Howard Stark at Zaxcom, and to David Thomas, Lawrence E. Fisher and David Bundy at Lectrosonics.

14 Feb 2017

MTI Film Expands Hollywood Facility to Support HDR

Adds new color grading, editorial and QC resources.

MTI Film has completed a major expansion of its Hollywood facility designed to accommodate growing demand for post-production services and support for HDR content. The expansion includes the addition of two new color grading suites and an editorial finishing suite, all HDR capable. The company has also improved its IT infrastructure, adding significant bandwidth and storage, and installed new monitoring systems throughout the facility. All the moves are intended to address the growing number of television productions requiring HDR delivery as well as the company’s expanded slate of HD and UHD shows.

“We’ve built a complete HDR workflow,” said MTI Film CEO Larry Chernoff. “We can support productions from the set to screen, and provide deliverables optimized to any network specification including HDR10, Dolby Vision, IMF and AS02.”

MTI Film’s post-production services division has been growing rapidly.  It currently provides post production services for shows including Outlander, The Affair, Longmire, Major Crimes, Bates Motel, Good Behavior, The Story of God and Outcast. Its colorist staff has recently grown to five, as former assistant colorists Alex Chernoff and Greg Strait have been promoted to colorist, joining Tanner Buschman and supervising colorists Steve Porter and Johnnie Kirkwood.  “MTI has always believed in developing talent from within our ranks,” explained Executive Producer Barbara Marshall. “It assures continuity for our customers and engenders an environment where young talent have confidence they will be nurtured.”

The facility’s new color grading suites are equipped with the latest generation Digital Vision Nucoda color grading systems, and the new editing suite utilizes both Avid and Autodesk Flame.  Monitoring includes LG UHD OLEDs with precise Rec. 709 calibration, and Sony BVM-X300s for HDR mastering.


14 Feb 2017

Highly Anticipated Sundance Films and Series Created Using Blackmagic Design

Blackmagic Design today announced that more than 45 films and series at the 2017 Sundance Film Festival used its digital film cameras, DaVinci Resolve Studio grading, editing and finishing solution, Fusion Studio visual effects (VFX) and motion graphics software, Video Assist monitor and recorder, and other products throughout production and post production.

Some of the festival’s most anticipated films and series were shot and completed using Blackmagic Design products, including “Carpinteros (Woodpeckers)” which was shot with an URSA Mini 4.6K, “Colossal” that used Fusion Studio for its VFX, and many films such as “A Ghost Story,” “The Big Sick,” “The Discovery,” “The Hero” and “Rebel in the Rye” that were graded using DaVinci Resolve Studio.


14 Feb 2017

Pixspan’s PixMover™ Transfers to the HPA Tech Retreat Innovation Zone

With an industry increasingly required to manage more 4K+ image files, Pixspan’s PixMover offers a coveted three-fold solution: fast transmission speeds, reduced storage needs, and 100% bit-for-bit integrity of every pixel. Pixspan specializes in full-resolution workflow management, and with PixMover™, quality, quantity and speed can all be attained.

Rapidly growing file sizes and frame rates are a fundamental part of production workflows, and PixMover ensures the ability to manage them. With simple drag-and-drop features, PixMover makes it easy to save 50-80% on storage and networking resources. Using Pixspan’s Bit Exact Round Trip technology, PixMover encodes much smaller bit-exact copies of full-resolution images and then restores them – bit for bit – to the original image. PixMover is native to Autodesk Flame, incorporated into TIXEL Network Optimization Software, and enables 4K workflows in existing infrastructures without adding expensive new equipment. It has been implemented by some of the most in-demand vendors, including Dell, ELEMENTS, EMC Isilon, and NVIDIA. Operating on both CPU and GPU platforms, Pixspan offers management of uncompressed camera raw files (EXR, DPX, Cineon, ARRIRAW, Canon Raw, etc.) and digital intermediate workflows with speed and efficiency on space and bandwidth-heavy networks.
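Pixspan’s codec is proprietary, but the general “bit-exact round trip” idea – encode a smaller lossless copy, restore it, and verify that every bit matches the source – can be sketched generically. The following minimal Python sketch uses zlib as a stand-in lossless encoder and SHA-256 as the integrity check; it illustrates the pattern only, not Pixspan’s actual algorithm or API, and the function name is hypothetical.

```python
import hashlib
import zlib


def bit_exact_round_trip(data: bytes) -> bytes:
    """Losslessly encode a frame's bytes, restore them, and verify integrity.

    Illustrative only: a production tool would use an encoder tuned for
    camera-raw image data; zlib stands in for any lossless codec here.
    """
    original_digest = hashlib.sha256(data).hexdigest()
    encoded = zlib.compress(data, level=9)   # smaller, lossless copy
    restored = zlib.decompress(encoded)      # bit-for-bit restoration
    # Round-trip check: the restored bytes must match the source exactly.
    assert hashlib.sha256(restored).hexdigest() == original_digest
    return encoded


# Stand-in for a DPX/EXR frame's pixel payload (real frames are less uniform,
# so real-world savings differ from this toy example).
frame = bytes(range(256)) * 4096
encoded = bit_exact_round_trip(frame)
print(f"original: {len(frame)} bytes, encoded: {len(encoded)} bytes")
```

The key property is that the integrity check happens on the restored bytes, so any encoder bug or transfer corruption is caught before the smaller copy is trusted.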

Visit Pixspan in the Innovation Zone: http://www.pixspan.com/hpa_2017.php

30 Jan 2017

YEP Continues Momentum

Jesse Korosi (Director of Workflow Services at Sim) and Jennifer Zeidan (Media Systems Engineer, Industrial Light & Magic) head up the YEP initiative for HPA alongside WIP leaders Laura Thommen, Kari Grubin, and Belinda Merritt; HPA Board Member Loren Nielsen; and HPA President Seth Hallen.

Since the first class of HPA YEPs was announced last September during the SMPTE Annual Technical Conference, we have been collaborating, connecting and designing the road map for our first year of this committee. The next event that will bring YEPs together will be the HPA Tech Retreat. Some of our YEPs will be presenting at the event, while others will be attending the VR seminar, as well as the Supersession. We are working on a number of additional activities over the next few months, which will soon be announced.