19 Jul 2017

Tackling the Format Explosion at the second annual HPA Tech Retreat UK

The HPA Tech Retreat UK is fast becoming a must-attend event for the cognoscenti of professional content production in Europe, as it already is for the US community who meet in Palm Springs each year.

Those who gathered at Heythrop Park Resort in rural Oxfordshire for three days in July came from locations as diverse as France, Poland, and the East and West Coasts of America.

Unlike at trade shows where most of these senior folk meet, there is a real chance at the event to relax into proper discussions and connect and re-connect with contacts.  In addition to the programming, continuous opportunities for conversation and exploration took place at breakfast roundtables, cocktail receptions, lunches, dinners and parties.

Cutting across much of the agenda, and front of mind for everyone from CEOs to engineers, is the explosion in formats and versions needed to serve the international market on ever-multiplying platforms.

Setting the scene, delegates enjoyed an invigorating keynote from Eric Pearson, Home Entertainment Supervisor at Pixar Animation. He explained how, with a team of just seven, they created a remarkable 7,482 new shots for international versions of Cars 3. Picture localisation entails catering to the nuances of culture, which Pixar takes extremely seriously, making artistic changes to frames or whole sequences. For example, it regularly substitutes culturally and linguistically appropriate text in newspaper headlines in the background to ensure a joke or plot line lands.

“We’re creating an experience for the Mandarin or Thai speaker so they can be lost in the movie as if it were made natively in their language,” said Pearson. “This dramatically increases the complexity but we think it’s worth it.”

MESA estimates the annual cost of localizing TV content for EMEA markets alone at $2.5 billion.

“For the same amount of money you spend on a transcription house you can use machine learning to deliver speech to text, localization and an almost infinite other variety of data tasks – and you end up with richer content,” said Josh Wiggins, CCO, GreyMeta.

Machine-trained automated speech-to-text may not yet be good enough for BBC One, admitted Stephen Stewart, VP, Global Content Operation, BBC Worldwide. “But if you have an opportunity to push content where it’s not economically viable at present, and you can inform people about a subject they would not otherwise have seen, then it is worth it.

“Machine Learning is getting there,” he added. “We can expect to see artificial intelligence encroaching more and more on the content creation, production and delivery ecosystem in a very short time.”

Lydia Gregory of Jukedeck demonstrated two music tracks – one composed by a human, one by a computer – illustrating that the line between art and science is already blurring.

SMPTE Fellow and BBC standards lead Andy Quested chaired a discussion of the format minefield that went into creating BBC Natural History series Planet Earth II.

“If you’re going to the ends of the world you want whatever you do to be futureproofed,” explained producer Elizabeth White. “What we didn’t know then was how it was going to be post-produced, so we recorded with no real knowledge that it would be finished as UHD, let alone an HDR product.”

This staggeringly complex show was made over four years, shot on at least 16 formats, and accumulated a 400:1 shooting ratio.

BBC R&D’s Andrew Cotton explained how, as the series was being post-produced, the broadcaster helped devise the HLG flavour of HDR in order to serve both legacy and new TV sets with high dynamic range.
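HLG achieves that backward compatibility through its transfer curve, published in ITU-R BT.2100: a square-root segment at low light levels, which looks enough like a conventional gamma for legacy SDR displays, and a logarithmic segment above the knee that extends the range for HDR highlights. A minimal Python sketch of the HLG opto-electrical transfer function:

```python
import math

# HLG OETF constants from ITU-R BT.2100 / ARIB STD-B67
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalised linear scene light e in [0, 1] to an HLG signal in [0, 1].

    Below e = 1/12 the curve is a square root (gamma-like, SDR-friendly);
    above it, a log segment compresses HDR highlights into the signal range.
    """
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C
```

The two segments meet smoothly at the knee: `hlg_oetf(1/12)` is exactly 0.5, and the constants are chosen so that full-scale scene light maps to a signal of (very nearly) 1.0.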

The event began with a series of expert reports on VR, AR and MR. While there is exciting work being done, the tenor of discussion was that the industry needs to take a reality check.

“Perhaps the biggest problem is that there is no audience for VR yet,” said Zillah Watson, a former current affairs producer, who is now editorial lead on future content and storytelling projects for BBC R&D. “We haven’t got a way of distributing VR to an audience to find out what they want from the experience.”

She said the industry has come a long way in terms of creating hard news programming in 360° since the BBC’s first news experiment from the Calais migrant camps in 2015, but it was clear that there are still challenges to overcome before VR news goes mainstream.

“360° has been justified by the broadcast news industry as a gateway to VR. It is not. I question if there is any evidence that watching 360° will make a user want to watch on a VR headset. 360° video on mobile or in browsers will not drive people to VR. If we don’t create a good content ecosystem that people want to explore and view and we don’t make headsets better, then the whole thing won’t work,” said Watson.

Evidence that VR can attract a positive response from audiences was provided by BT Sport’s Andy Beale. He shared the background to the live-streamed VR experience for the UEFA Champions League final earlier this year.

“We’re not doing VR just because we can but only if it adds value,” said Beale.  “Rather than saturate viewers every week with it we want to keep it as a tool for big occasions.”

One of the best received sessions was a call to action to extend racial and gender diversity across the industry. IBC project manager Jay Sakallioglu moderated a talk with Geoffrey Okol of ITN Productions, multi-cam operator Abigail Dankwa, and BAFTA’s Emma Perry, rejecting tokenism and calling for a pro-active stance to encourage greater range within craft and technician levels. “Diversity is not only common sense; it helps media companies adapt to the fast-paced environment, capturing ideas and delivering on innovation,” said Okol.

The HPA Tech Retreat Innovation Zone included a roster of companies whose experts were on hand to lead attendees through new technologies and products. Featured companies included AJA, Avid, BASE, Codex, Dolby, GreyMeta, Image Matters, LiveU Ltd, Motion Impossible, NexGuard, Pixelogic, Pixspan, RED Digital Cinema, Signiant, Sohonet, Sony Digital Cinema, Sundog Media Toolkit, Teledyne LeCroy, and XTFX Unlimited.

18 Jul 2017

The HPA AWARDS: Call For Judges, Creative Categories

The HPA Awards honor the best work from our industry’s finest artists and companies. Are you interested in helping us find award-winning entries? Categories include color grading, editing, sound editing and visual effects for features, television and commercials. Ideally, judges are working in the categories or have expertise in the craft. You’ll join a high-caliber cadre of HPA Awards judges, and your contribution of time and expertise helps make this wonderful show what it is.

Please send us your contact information and credits or an IMDb link, and we will make sure there’s no conflict of interest. Judging takes place at facilities in the Los Angeles area, usually entails one evening of commitment, sometimes two, and begins in mid-August.

For further information, call Alicia Rock at the HPA office at +1 213 614 0860, or email hpa@hpaonline.com.

17 Jul 2017

SMPTE® Hollywood Section to Revisit Classic Movie Sound at July Meeting

LOS ANGELES — The Hollywood Section of SMPTE®, the organization whose standards work has supported a century of advances in entertainment technology, will host a demonstration of classic movie sound technology at its monthly meeting, scheduled for Tuesday, July 25, in Hollywood.

The event will include a live performance by Joe Rinaudo, founder of the Silent Cinema Society, on an American Fotoplayer (provided by the Academy of Motion Picture Arts and Sciences). The Fotoplayer is a type of player piano used in movie theaters during the silent era to provide sound effects and music.

Motion picture archivist Bob Heiber will deliver a presentation on restoring magnetic soundtracks from the 1950s. He will also screen sequences from 70mm Todd-AO and Cinemascope 55 productions, including Oklahoma and The King and I.

“Movie sound has undergone an incredible evolution since the early days of cinema,” said Jim DeFilippis, chair of the SMPTE Hollywood Section. “Our July meeting will provide a wonderful opportunity to experience what movies sounded like before the sound era, as well as when widescreen and stereophonic sound first hit theaters.”

The event is produced by SMPTE Life Fellow, Dick May.

What:  SMPTE Hollywood Section, July Meeting

Topic:  “The Sound of Movies”

When:  Tuesday, July 25, 2017, 6:30 p.m. — Reception, 7:15 p.m. — Meeting

Where: Academy of Motion Picture Arts and Sciences (AMPAS), Linwood Dunn Theater, 1313 Vine Street, Los Angeles, CA 90028. Free parking is available behind the building.

Price: Free for SMPTE members and nonmembers

Register:  https://www.eventbrite.com/e/movie-sound-tickets-36229067115

About the SMPTE® Hollywood Section

The Hollywood Section of SMPTE® was originally organized as the West Coast Section in 1928. Today, as its own SMPTE Region, it encompasses more than 1,200 SMPTE Members with a common interest in motion-imaging technology in the Greater Los Angeles area. The Hollywood Section offers free meetings monthly that are open to SMPTE Members and non-members alike. Information about meetings is posted on the Section website at www.smpte.org/hollywood.

About SMPTE®

For more than a century, the people of SMPTE (pronounced “simp-tee”) have sorted out the details of many significant advances in media and entertainment technology, from the introduction of “talkies” and color television to HD and UHD (4K, 8K) TV. Since its founding in 1916, SMPTE has received an Oscar® and multiple Emmy® Awards for its work in advancing moving-imagery engineering across the industry. SMPTE has developed thousands of standards, recommended practices, and engineering guidelines, more than 800 of which are currently in force today. SMPTE Time Code™ and the ubiquitous SMPTE Color Bars™ are just two examples of SMPTE’s notable work. As it enters its second century, SMPTE is shaping the next generation of standards and providing education for the industry to ensure interoperability as the industry evolves further into IT- and IP-based workflows.

SMPTE’s global membership today includes more than 7,000 members: motion-imaging executives, creatives, technologists, researchers, and students who volunteer their time and expertise to SMPTE’s standards development and educational initiatives. A partnership with the Hollywood Professional Association (HPA) connects SMPTE and its membership with the businesses and individuals who support the creation and finishing of media content. Information on joining SMPTE is available at www.smpte.org/join.

All trademarks appearing herein are the properties of their respective owners.

14 Jul 2017

Avid Customer Association Vote Influences Avid’s Future Innovations and Reveals Media Companies’ Investment and Technology Priorities

Survey of more than 6,500 media professionals gives Avid’s customer community unprecedented influence over future offerings, and uncovers key investment priorities and technology trends spanning cloud, IP, 4K/UHD, multiplatform distribution and VR/AR

Burlington, MA – Avid® (Nasdaq: AVID), a leading media technology provider for the creation, distribution and monetization of media assets for media organizations and individual creative professionals, today announced the findings of the inaugural Avid Customer Association (ACA) Vote. The ACA Vote gave Avid’s preeminent customer community the unique and unprecedented opportunity to directly influence Avid’s future offerings. The findings on emerging technology and new business requirements also provide valuable insights into the media industry’s future plans and challenges in relation to cloud computing/virtualization, IP networking and content delivery, 4K/UHD in mainstream broadcast, multiplatform content delivery, and virtual/augmented reality.

The ACA Vote set a precedent for the media industry by giving ACA members the opportunity to weigh in on their most important requirements and ensure that Avid continues to deliver new or improved offerings that will positively benefit the community, demonstrating a deeper collaboration between Avid and its community. Over 6,500 unique voters from over 4,000 organizations in 109 countries participated in the vote. Spanning the areas of creative applications, workflow solutions and emerging technology, it uncovered what will most significantly impact the future performance and success of Avid’s customer community.

The ACA Vote revealed that the vast majority of media professionals (71.7%) are considering moving some part of their infrastructure or workflow to the cloud over the next two years—the most popular being remote access workflows (15.8%). Just 4.8% are considering moving their entire infrastructure and workflow to the cloud, highlighting the important role that hybrid cloud deployment models will play in the media industry’s journey to the cloud.

A hybrid approach will also be important to the industry’s transition to IP. Just over half of respondents (50.9%) are considering hybrid SDI/IP connectivity for new investments. 26.6% of media professionals are considering IP-only connectivity. Dynamic scalability is the most popular reason for considering IP video/audio (36.6%), followed by new high-bandwidth productions like UHD (28.8%) and format-agnostic workflows (16.3%).

High-resolution media formats are firmly taking hold, with the majority of media professionals (64.6%) expecting to implement 4K/UHD across their organization within the next two years. OTT or internet delivery is by far the most prevalent delivery mechanism for 4K/UHD (50.7%), followed by theatrical/venue viewing (21.6%) and satellite or cable delivery (13.6%). Just 9.9% said terrestrial broadcast is their most prevalent form of 4K/UHD delivery. The biggest challenge to adopting 4K is the burden on storage capacity (31.6%), followed by the cost of adding/upgrading 4K capabilities (30.5%), and the negative impact on the real-time performance of creative apps (24.7%).

While most media professionals (73.3%) are creating content for multiple platforms, less than a third (32.3%) use a single online video platform for social media content distribution. The majority (67.7%) use each social media service’s own online video platform, making content distribution cumbersome and inefficient. The top two drivers for investing in multi-platform content production are reaching new audiences (37.8%) and maximizing audience engagement (37.7%).

While more than half of media professionals (58.4%) said that virtual and augmented reality are important to their strategic growth plans, the vast majority (82.3%) aren’t yet sure which business models to consider, and most (63%) have no plans to implement VR/AR over the next two to three years. The most appealing applications of VR/AR are entertainment (23.1%), live events (21.2%), gaming (20%) and film (19.4%). A length of 15-30 minutes is seen as ideal for VR/AR programming (29.4%), followed by 5-10 minutes (25.5%), less than five minutes (18.7%), feature length (16%), and one hour (10.5%).

 

“The ACA Vote represents a new phase of customer participation in Avid’s future direction, building on the deep community partnership with our customers and users,” said Avid President Jeff Rosica. “I am proud of our community for reaching this exciting milestone and applaud the ACA Executive Board of Directors, who oversaw this process. The results of the ACA Vote will directly influence innovations for the MediaCentral® Platform, the industry’s most open, tightly integrated and efficient platform designed for media, and ensure that the ongoing development of our comprehensive tools and workflow solutions for media creation, distribution and optimization continue to support what is most important to our customers and their creative, technical and business requirements.”

 

14 Jul 2017

Avid Announces Availability of Avid NEXIS Delivering Unrivalled Performance, Scalability, and Pro Tools Support

The newest software release for Avid NEXIS is now available to all new and current customers, delivering greater bandwidth for fast, reliable workflows and support for Avid Pro Tools to optimize professional audio production workflows

BURLINGTON, MA – Avid® (Nasdaq: AVID), a leading global media technology provider for the creation, distribution and monetization of media assets for global media organizations, enterprise users and individual creative professionals, today announced the availability of Avid NEXIS®, the world’s first and only software-defined storage platform for media. Powered by the MediaCentral® Platform, the most open, tightly integrated, and efficient platform designed for media, Avid NEXIS and Avid NEXIS | PRO systems now provide the fastest, most efficient and reliable workflows for real-time media production, including highly intensive professional post-production and broadcast environments. With support for Avid Pro Tools®, Avid NEXIS also enables new collaborative shared storage workflows for professional audio production.

Unrivalled performance, scalability, and reliability for every production environment

For larger post and broadcast environments, Avid NEXIS | E4 and E2 enterprise-class storage systems offer greater performance for 4K/UHD, color grading and finishing workflows. New high-performance storage groups (HPSGs) deliver up to 28.8 GB/s of bandwidth in a single Avid NEXIS system, providing the throughput needed to handle full-resolution media for online editing workflows. For smaller environments, Avid NEXIS | PRO professional-class storage provides the industry’s most comprehensive collaborative capabilities while also delivering real-time 4K performance at up to 2.4 GB/s, all at an even more cost-effective price point. Avid NEXIS and Avid NEXIS | PRO also provide real-time creative team collaboration using not only Avid Media Composer® but other editorial and creative tools, including Adobe Premiere Pro CC, Apple Final Cut Pro X, DaVinci Resolve and more, as well as allowing for easy integration with third-party asset management systems.

New collaborative workflows for audio production

Avid Pro Tools, the industry-standard digital audio workstation, is now qualified on Avid NEXIS, enabling audio creative teams to leverage the industry’s most efficient and powerful media storage environment. With Pro Tools combined with Avid NEXIS, users can share projects on a centralized pool of media storage, turning work around faster by eliminating the time wasted moving files between different systems.

Major new Avid NEXIS features include:

  • High-performance storage groups, with each media pack capable of data rates up to 50% faster at 600MB/s, providing the performance needed for high-volume 4K/UHD, HD, and bandwidth-intensive media workflows.
  • Pro Tools streaming from Avid NEXIS delivering enhanced audio workflows. Customers can accelerate completion of professional audio projects while benefiting from the automatic media protection, security, and flexibility of Avid NEXIS.
  • Scalability enhancements that double the capacity of Avid NEXIS | Enterprise systems to support up to 48 media packs across a single scale-out networked system. Customers can mix and match a combination of Avid NEXIS | E5, E4, and/or E2 engines, providing nearly 3PB of storage capacity and 28GB/s of high-performance bandwidth.

 

The new version of Avid NEXIS including support for Pro Tools is now available to new customers as well as current Avid NEXIS owners with an active annual support and software maintenance plan. For more information, visit www.avid.com.

 

14 Jul 2017

AMPAS at Work on Next ACES Version

By Debra Kaufman

At NAB 2015, the Academy of Motion Picture Arts and Sciences officially launched the ACES (Academy Color Encoding System). Now, says AMPAS Science and Technology Council managing director Andy Maltz, more than two years later, ACES has been adopted by innumerable product manufacturers and used on at least 100 feature films, from Guardians of the Galaxy 2 to Woody Allen’s Café Society. Marvel, Screen Gems, NBCUniversal and Netflix are among the studios that have committed to the standard. The Academy also launched ACESCentral.com, an online forum on which 700+ users discuss and seek support on ACES questions from online mentors.

It’s about time for some changes. “We always said there wouldn’t be a next generation ACES until ACES 1.0 was widely adopted,” says Maltz. “Right around the two-year point of ACES 1.0, it became more than apparent that it was time to start moving towards enhancements and extensions.” After years of serving as co-chairs of the ACES project, Starwatcher Digital principal Jim Houston and RFX president Ray Feeney stepped down, making way for Marvel Studios vice president of technology Annie Chang as incoming ACES chair and HBO director of production R&D Rod Bogart and EFILM vice president of imaging science/technical director Joachim Zell as vice chairs.


“Annie, Rod and I are all using ACES as a tool in our day-to-day production,” says Zell. “So we will also be able to talk about what does or doesn’t work, and guide it in the right direction to make it an end-to-end system.”

The first version of ACES has largely been a success. “Everything worked the way we expected it to work,” says Maltz. But, despite the fact that nothing in ACES 1.0 is “broken,” Maltz and the ACES project team became increasingly aware that some portions of the standard hadn’t been as widely adopted as others, and that some tweaks and enhancements were required for ACES to reach its ultimate destination. “Our goal is to get all six major studios to declare publicly that all deliverables should come in as ACES deliverables,” says Maltz.

As the first step, says Zell, they identified 15 different constituent groups, including the major studios, post production houses, DITs, cinematographers, VFX professionals, colorists, manufacturers and producers. Zell reveals that they have just conducted a meeting with Disney, Fox, Paramount, Sony, Universal and Warner Bros. “We all met in one room, and although it is too early to talk about what we’ve discovered, I can say that the studios are the ones who benefit most from a common standard in terms of look and color management, so we expected they would give positive feedback,” says Zell. “The studios definitely support the mission the Academy is going for by inventing ACES and bringing it to the market.”

Maltz enumerates aspects of ACES 1.0 that need to be tweaked. First is adoption of ACES’ metadata file, dubbed ACES clip. “You can use that metadata carrier to better communicate how to reproduce the colors,” says Maltz. “We’re adamant about that; it’s required for archiving. This has to happen for people to be able to get what they want.” The ACES team is also looking at the Look Modification Transform (LMT), an easier implementation of custom looks. Third is the Common LUT. “To communicate a look you need a standardized version,” he says. “We didn’t anticipate a programmatic or algorithmic description of a look, and one new requirement is that people need to use the algorithmic description, like shader language.”
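To make the idea of an algorithmic look description concrete: an ASC CDL is one widely used, programmatically defined grade of the kind an LMT can carry. The sketch below follows the standard CDL formula; treating it as an LMT example here is my illustration, not something specified by the ACES team, and the parameter values are purely illustrative.

```python
def asc_cdl(value: float, slope: float = 1.0, offset: float = 0.0,
            power: float = 1.0) -> float:
    """Apply an ASC CDL-style per-channel grade: out = (in * slope + offset) ** power.

    Negative intermediate results are clamped to zero before the power
    function is applied, so the expression stays defined for fractional powers.
    """
    graded = value * slope + offset
    return max(graded, 0.0) ** power
```

With the default parameters the transform is an identity; a look is expressed entirely by the three numbers per channel, which is what makes it easy to exchange alongside ACES media.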

Zell adds that the process of interviewing groups impacted by ACES will provide a roadmap for going forward. “The outcome will help us understand where ACES is at the moment and where people want it to go,” he says. “What we will have learned in the discovery phase will help us get to ACES Next or ACES 2.0.”

 

14 Jul 2017

Sony Pictures Post Production Services Weaves an Intricate Web of Sound for “Spider-Man: Homecoming”

In Columbia Pictures and Marvel Studios’ Spider-Man: Homecoming, the movie’s colorful cast of characters, impressive gadgetry and thrilling action created plenty of opportunities for its veteran sound team to flex their creative muscles. Sony Pictures Post Production Services assembled an all-star lineup of sound talent for the project led by Supervising Sound Editors Eric Norris (The Amazing Spider-Man 2, Man of Steel) and Steven Ticknor (Spider-Man, The Lincoln Lawyer), Academy Award-winning Re-Recording Mixer Kevin O’Connell (Hacksaw Ridge, Transformers), Sound Designer/Re-Recording Mixer Tony Lamberti (Ghostbusters, Inglourious Basterds) and Foley Artist Gary Hecker (Justice League, Spider-Man, Spider-Man 2, Spider-Man 3).

“It was a high-octane crew,” says Ticknor. “As a team, we blended well, everyone contributed and the ideas flowed seamlessly.”

The Dolby Atmos mix was completed on Sony Pictures’ William Holden Stage.

Although Spider-Man is well known to movie fans around the globe, Homecoming introduces several new elements to the character. Notably, Spidey gets a new suit, designed by Tony Stark, that is outfitted with a variety of sophisticated tech including a detachable drone. “It was an interesting challenge to create the sound for the drone,” recalls Norris. “Steve came up with the idea of using a toy noisemaker that produces a high-pitched whistle. We used that and a palette of other effects to give the drone its personality.”

Later in the movie, Peter dons an older Spider-Man suit that he had created himself. It required a slightly different sound treatment. “Jon Watts wanted the web coming from the old suit to sound a little less modern than the web that shoots from the new Stark suit, which is hot and cool,” notes Ticknor. “We ordered a couple of 5,000-foot rolls of magnetic tape and let them unravel. They created a whooshing sound that became our old-school web.”

One of the most intricate blends of sound effects was applied to the winged suit worn by the supervillain Vulture. The massive device changes over the course of the film, acquiring new features and becoming more menacing. The team employed a mix of mechanical sounds for its metal feathers and wings, and jet turbines for the roar of its engine. “We used samplers to stack sounds together, and shape them to create a sense of movement that mirrored the action on the screen,” explains Lamberti. “It was a lot of fun!”

Hecker spent much of his time recording Spider-Man’s signature foot, body and suit movements, featured throughout the film. His task was anything but routine. “Jon Watts wanted Spider-Man to be stealthy, ninja-like,” Hecker notes. “He’s light on his feet, acrobatic. In one scene, he climbs up the side of a building, opens a window and enters a house. He’s upside down on the ceiling, crawling on his hands and feet. That’s all Foley.”

Hecker used a special neoprene shoe to recreate Spider-Man’s light footfall. “I wanted to capture his character through his footsteps and body movements,” he explains. “He’s sometimes moving very fast, sometimes fighting, sometimes sneaking. It’s important to convey the emotion of the scene through movement.”

Hecker also worked with Norris, Ticknor, O’Connell and Lamberti on custom sounds for Vulture’s winged suit.

During mix sessions, O’Connell and Lamberti blended thousands of custom sounds with dialogue and music to produce the finished soundtrack. “The sound editorial team was very well organized and that made it easy to swap things out and get it to picture,” says Lamberti. “The mix went very smoothly. Kevin, who’s a legend in the business, had a great overview of the entire soundscape and kept an eye on the big picture. The finished mix is clean and articulate. You can hear everything; nothing is overwhelming or underdone.”

O’Connell credited Norris, Ticknor, Lamberti and Hecker with providing an abundance of technical and organic sounds that help bring the world of Spider-Man to life. He also offered high praise for Composer Michael Giacchino. “He did a fantastic job with the score; it was well-balanced and right on the money,” O’Connell says. “It was a dream come true for a mixer.”

“With so much great visual material to work with, we could focus on Jon Watts’ vision in delivering an experience that audiences will remember for a long time,” O’Connell adds. “Spider-Man: Homecoming is more than an action film. There were scenes where we could have gone crazy with sound effects and music, but, instead, we did our best to stay true to the story and keep the focus on Peter with respect to the world around him.”

 

14 Jul 2017

Blackmagic Design Ships Remote Bluetooth Camera Control App for URSA Mini Pro

Blackmagic Design recently released Blackmagic Camera Control, a free iPad app that lets customers remotely control their URSA Mini Pro cameras via Bluetooth, along with Camera 4.4 Update for URSA Mini Pro cameras. The new Blackmagic Camera Control app is based on the open protocol for URSA Mini Pro cameras that Blackmagic Design demonstrated at NAB earlier this year.

Camera 4.4 Update can be downloaded free of charge from the Blackmagic Design website. Once installed, customers can download the Blackmagic Camera Control iPad app from the Apple app store.

All URSA Mini Pro cameras feature built-in Bluetooth connectivity, which until now has not been enabled. The built-in Bluetooth allows customers to send and receive commands from up to 30 feet away. Once the camera is paired with the iPad, customers can remotely power URSA Mini Pro on or off, change all major settings, adjust and add metadata using a digital slate, and trigger recording. The Blackmagic Camera Control app is perfect for customers who need to control cameras in hard-to-reach places such as on cranes, on drones, in underwater housings and more.

To make URSA Mini Pro’s Bluetooth support even more flexible, Blackmagic Design has developed a new, open protocol and is publishing a developer API, along with sample code, for customers that wish to build their own camera control solutions. This free API and sample code will be available later this summer.

In addition to the Blackmagic Camera Control app, Blackmagic Design has also released Camera 4.4 Update which enables Bluetooth functionality and adds new preset timecode options on URSA Mini Pro cameras. The update also adds compatibility with Canon 18-80mm T4.4 for iris, focus and record trigger, along with improved EF, PL and B4 support, improved digital slate functionality and improved zebra stripe overlays on URSA Mini 4K cameras.
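For readers curious what such a camera-control protocol looks like on the wire: Blackmagic's published SDI Camera Control Protocol, which the new open Bluetooth protocol is built around, frames each message as a small binary packet padded to a 32-bit boundary. The sketch below packs a "change configuration" command in that style; the exact header layout is my reading of the public SDK documentation and should be checked against the released API, and the example field values are illustrative only.

```python
def pack_camera_command(destination: int, category: int, parameter: int,
                        data_type: int, operation: int, data: bytes) -> bytes:
    """Pack a 'change configuration' (command id 0) message in the style of
    Blackmagic's SDI Camera Control Protocol.

    Layout: 4-byte header (destination device, payload length, command id,
    reserved), then the command payload (category, parameter, data type,
    operation, data), zero-padded so the whole message is a multiple of 4 bytes.
    """
    payload = bytes([category, parameter, data_type, operation]) + data
    header = bytes([destination, len(payload), 0x00, 0x00])  # command id 0
    packet = header + payload
    padding = (-len(packet)) % 4  # pad to a 32-bit boundary
    return packet + b"\x00" * padding
```

A command with one data byte therefore occupies 12 bytes on the wire: 4 bytes of header, 5 of payload, and 3 of padding.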

“URSA Mini Pro has become incredibly popular because of its amazing image quality combined with broadcast features and controls, built in ND filters and interchangeable lens mount,” said Grant Petty, Blackmagic Design CEO. “The new Blackmagic Camera Control app and open API means that the possibilities are truly endless. Customers are going to have the tools they need to build completely custom remote control solutions of their own design using the new Bluetooth support!”

20 Jun 2017

Registration Now Open for The Reel Thing

The Reel Thing is a three-day symposium addressing current thinking and practical examples of advanced progress in the field of preservation, restoration and media conservation. The upcoming Los Angeles edition, August 24-26, will explore challenges facing the evolving ecosystem of film and digital archiving. The event takes place at the Academy’s Linwood Dunn Theater in Hollywood. In addition to discussions with industry leaders, premiere screenings of several restored films will take place. Created and co-founded by Grover Crisp, executive vice president of asset management for Sony Pictures, and Michael Friend, director of digital archives and asset management at Sony Pictures, The Reel Thing supports the programs and services of the Association of Moving Image Archivists (AMIA). For more information, visit www.the-reel-thing.co.

20 Jun 2017

What’s On at the HPA Tech Retreat UK

I am delighted to report that we are putting the finishing touches on our second annual Tech Retreat in the UK. Based on the overwhelming support and positive response to last year’s inaugural event, the HPA decided to give it another go in 2017.

The Tech Retreat is a way to get away from the hustle and bustle of your day job to learn and network with people who work on the creative technology side of the entertainment business. In the true sense of the word “retreat,” you can relax and open your mind to new ways of content creation as you network with industry-leading creative technologists. What’s not to like?

By now, I hope you’ve already registered for the expanded event taking place 11-13 July at the Heythrop Park Resort in Oxfordshire, UK. If you haven’t done so, be sure to register to get in on this wonderful event.

So what’s on this year, you ask?

We start off on Tuesday, 11 July with TR-eXtra (TR-X for short), a separately ticketed half-day deep-dive seminar added to the general Tech Retreat schedule. TR-X focuses on a single topic in depth, and this year’s topic is VR/AR/MR. Chaired by Technicolor’s Nick Mitchell, we’ve put together an excellent program focusing on what creative technologists need to consider when developing a VR experience. From news to games to Coachella, this year’s TR-X is not to be missed.

The rest of the Tech Retreat runs Wednesday and Thursday. Richard Welsh, this year’s program chair, has put together a fantastic two-day session that will improve upon the excellent inaugural year’s event (hard to believe that he could pull that off!).  This year’s program includes the following topics:

  • Lots of explosions!
    • Versioning and how to deal with it
    • Supply chains and how to solve them with efficient workflows
    • Examples of predicting explosions in advance through an exploration of the BBC wildlife series, Planet Earth II
    • Exploding formats, with a case study from Woody Harrelson’s latest project, Lost in London
    • The exploding archive – how best to archive all this digital content
  • Lots of connectivity
    • State of the industry
    • Remote collaboration on successful productions
    • Cellular solutions for production pipelines
    • A couple of cloud sessions (because it’s a must in our business!)
    • An afternoon of AI and ML and IMF – if you don’t know what that means, then you must attend to find out!

Add to this excellent program an Innovation Zone of 20+ companies showing off their latest innovations, plus plenty of food and drink, and it really is an event not to miss.

Register now!

I look forward to welcoming you to the second annual Tech Retreat UK!