Sony ES Press Briefing and Studio Tour
Los Angeles - Sony invited members of the press out to discuss their ES line strategy, display new ES components, and provide a tour of Sony Pictures' 3D facilities to help us understand all that goes into the recent transition to 3D for movie theaters and the home. The largest change happening with the ES line is the shift from being a high-end line sold in a variety of places, including many online dealers, to a line sold only by specialized dealers that are better prepared to help consumers with the setup and installation of ES products.
The shift away from online availability might seem like a surprising way to go, but the ES line contains many features oriented towards custom installers, including multiple zone support, IR and 12V triggers, and other features that Sony found were not being utilized by consumers. A shift to dealers that can directly interface with the home user allows them to communicate these additional features, as well as help to install and configure the products.
Sony's updated ES line consists of three new ES receivers, the STR-DA5600ES, STR-DA4600ES, and STR-DA3600ES, ranging in price from $1,099 to $1,999, and a 3D-ready Blu-ray player, the BDP-S1700ES, for $399. All of the models will be available in August, with the exception of the 5600 receiver, which will be available in September. All of the receivers feature HDMI 1.4 with 4 inputs, 3D pass-through, and the Audio Return Channel; three zones (the 2nd zone with component video, and the 3rd with audio only); iPhone support for control of the receiver; Sony's room correction technology with added Phase Matching; a DLNA client with support for multiple services (Rhapsody, Shoutcast); and, in a first for any receiver that I know of, a 4-port Ethernet hub, so the receiver can distribute networked content to up to 3 other devices.
The 4600 model adds a 2nd HDMI output over the 3600, and the 2nd zone also moves from standard component video to component over Cat5 with OSD and scaling support, to allow for longer cable runs than component traditionally allows. The amplifier section is also upgraded from 100x7 WPC to 120x7 WPC (all amplifier ratings are done at 8 ohms with a 1 kHz signal at 0.09% THD). Moving up to the 5600 model (pictured below) adds an additional 2 HDMI inputs, including a front panel input, upgrades the DLNA client to a DLNA server, adds support for HATS over HDMI (Sony's audio rate control technology to reduce jitter over HDMI) and an enhanced GUI, and the amplifier moves up to 130x7 WPC. The 3600 model lists for around $1,100, the 4600 for around $1,500, and the 5600 for around $2,000.
The BDP-S1700ES Blu-ray player carries over all of the features of the S770 Blu-ray player, including 3D support, 1080p scaling, iPhone control, Gracenote database support for title lookup, DLNA client support, 1 GB of memory for BD-Live, and built-in WiFi. The ES model adds a rear IR input for custom installation and automation, as well as the ES line's 5-year warranty. The last thing that we got to see was a preview of a 3D projector from Sony. It was an early prototype and there won't be any sort of announcement until later this year, but it was the largest 3D demonstration I had seen to date, and I think the effect is much better when your whole field of view is occupied, much as it is at an IMAX film.
After the product introductions, we were taken to Sony Pictures Studios to learn more about the work that goes into 3D technology before it gets to consumers, and it was very enlightening to see all the work that has to go into it. We met up with Chris Cookson, the President of Sony Pictures Technologies, who gave us an overview of the move to 3D and the challenges that Sony and everyone else faces. Their main concern is that people will get their first taste of 3D, it will be bad 3D, and they will be turned off from it in the future. As someone who has not seen Avatar at all, let alone in 3D, this makes sense to me, as everyone I know who is in love with 3D got there because of Avatar, which was designed from the start to be in 3D. With some pictures now just tacking on 3D at the last minute (to take advantage of the higher ticket prices that come from being shown in 3D) and not putting much thought behind it, you run the risk of turning consumers off, and that is exactly what Sony wants to avoid.
Our first full presentation was from Buzz Hays, the Executive Stereoscopic 3D Producer for the Sony 3D Technology Center. He gave us an overview of the training classes that Sony is putting on at their studio to help teach cinematographers, directors, producers of live events (sports, and shows such as the Oscars), and game developers about the challenges of 3D, the additional things you need to consider, and everything else that goes into making a film, TV show, or game in 3D. The first interesting point that was brought up was that an estimated 8-10% of people see with monocular vision and can't discern objects in 3D. I had no idea that such a high percentage of people were unable to do so, but that is what the research shows.
Buzz went over a lot of the technical considerations I had never thought about when it came to making a film in 3D, such as the differences between putting the lenses close together and further apart, deciding how far out to bring an object from the screen or whether to push it further behind, working to isolate objects in the front from those behind (and how traditional approaches using bokeh can cause headaches for viewers), and the difference that a camera lens can make in your perspective. I found the camera lens demonstration to be the most enlightening myself.
Using an example from Beowulf with Angelina Jolie, he went between different lenses, from 8mm to 50mm, 85mm, and 200mm. At 200mm, there was almost no depth to the image, with the closest object appearing almost as distant as the background. At 8mm, Angelina Jolie's head was brought to the very front, and everything else was pushed so far back that it was horribly distorted (8mm is traditionally a fisheye lens). With 50mm and 85mm, there was a definite shift in how much depth there was to the image, and how much the person at the center of the screen was isolated from their background. Deciding which to use is certainly a hard choice, and not something that you can really go back and fix after you have made the call on set. This presentation, which was over an hour long, really showed me all the work that goes into 3D that you didn't need to think about with 2D, and how it has a learning curve that people will need to work through. Surround sound had a learning curve as well, and over the past two decades we have seen films go from barely using the surrounds, or using them for rare effects (I'm thinking of Top Gun here), to using them to really pull the viewer into the picture with the soundtrack.
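The flattening effect in that lens demonstration can be sketched with simple perspective math. If you reframe the subject to the same size at each focal length (so the camera distance scales with the focal length), the background's apparent size relative to the subject works out to subject-distance divided by subject-distance plus background offset. The specific numbers below (a full-frame sensor, a 1.8 m subject filling the frame, a background 3 m behind) are my own illustrative assumptions, not anything from the presentation.

```python
# Sketch of lens "compression": keep the subject framed identically at each
# focal length, then compare how large the background appears relative to
# the subject. All numbers are illustrative assumptions.

SENSOR_HEIGHT_MM = 24.0      # assumed full-frame sensor height
SUBJECT_HEIGHT_MM = 1800.0   # assumed 1.8 m tall subject filling the frame
BG_OFFSET_MM = 3000.0        # assumed background 3 m behind the subject

def rel_bg_size(focal_mm: float) -> float:
    """Apparent background size relative to the subject (1.0 = same size)."""
    # To fill the frame, the subject distance scales linearly with focal length:
    subject_dist = focal_mm * (SUBJECT_HEIGHT_MM / SENSOR_HEIGHT_MM)
    # Projected size falls off as 1/distance, so the relative size is:
    return subject_dist / (subject_dist + BG_OFFSET_MM)

for f in (8, 50, 85, 200):
    print(f"{f:>3} mm lens: background at {rel_bg_size(f):.0%} of subject scale")
```

Under these assumptions an 8mm lens renders the background at roughly 17% of the subject's scale while a 200mm lens renders it at about 83%, which matches the exaggerated depth at 8mm and the flattened look at 200mm described above.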
After our meeting with Buzz, we got a tour of the Sony Digital Authoring Center, including the massive arrays of servers that work to produce the movies we watch at home. The racks also held a full array of Blu-ray players from all companies, so that titles can be tested on most players out there to be sure they will work when they are released into the marketplace. The amount of processing power used to take a film master and convert it into a Blu-ray was astonishing, and there was always someone there checking the quality to make sure it looked great. Converting to a 3D Blu-ray takes almost an order of magnitude more processing power than a standard Blu-ray film does, though the process is still in its infancy and they are working to improve that. On the monitors in the background I could see them working on the last season of Lost for Blu-ray, which I can't wait for, and which one member of our tour was trying to avoid seeing, as he preferred to watch it on Blu-ray over broadcast and hadn't seen it yet.
Next we visited Colorworks, Sony's new digital intermediate facility located in the Stage 6 building. Colorworks allows Sony to scan all of their film negatives at 4K resolution daily for color correction, editing, special effects, and more. With the ability to scan at 15 frames per second, Sony is able to take the dailies from a set, have them delivered that night, and have the film scanned into the system and ready to be viewed and worked on the next morning. The servers pictured here hold over 500 Terabytes of data, which is only a fraction of the over 3.5 Petabytes of storage available in the building, all connected via fiber. Working at this high resolution ensures that the final product is ready for any display, from a Blu-ray player to a projector in a cinema. One two-hour film at this resolution takes around 12 Terabytes to store, so that storage can fill up quickly, and there is a team of people working full time just to manage the data.
Working on color correction for a film at Colorworks.
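That terabyte figure is easy to sanity-check with back-of-the-envelope math. A minimal sketch, assuming uncompressed 4096x2160 frames at 24 fps; the per-pixel size is my assumption (10-bit DPX packs three channels into about 4 bytes per pixel), and deeper formats or stereo pairs push the total toward the quoted 12 TB.

```python
# Back-of-the-envelope check on 4K film storage.
# Assumes 4096x2160 frames at 24 fps; bytes per pixel is an assumption
# (10-bit DPX packs 3 channels into ~4 bytes per pixel).

def film_size_tb(width=4096, height=2160, bytes_per_pixel=4,
                 fps=24, minutes=120):
    """Uncompressed size of one film in decimal terabytes (1 TB = 1e12 B)."""
    frame_bytes = width * height * bytes_per_pixel
    total_frames = fps * minutes * 60
    return frame_bytes * total_frames / 1e12

print(f"{film_size_tb():.1f} TB")                   # 10-bit, single eye
print(f"{film_size_tb(bytes_per_pixel=6):.1f} TB")  # 16 bits per channel
```

This gives about 6.1 TB for a 10-bit scan and 9.2 TB at 16 bits per channel, so with intermediate renders, audio, or a second eye for 3D, the roughly 12 TB per film cited above is very plausible.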
Additionally, they are working to bring catalog films that had previously been shown in 3D back to the format. Here you can see someone working on "The Mad Magician" with Vincent Price, which was broadcast on TV in 3D. Of course I can't tell you what the old version looked like in 3D, but in the work-in-progress version, the effect was used more to draw your attention to what the director wanted you to focus on, and not to annoy you with tacky effects that would grow tiresome quickly. The transfer looked very nice on the studio monitors, and I noticed that they used Genelec monitors for the audio in the rooms, which I would see later on as well.
The scoring stage, used for recording movie soundtracks, was a phenomenal place to tour, and I had no idea that soundtracks were typically recorded in just a couple of days. The musicians show up that morning, having not seen the movie or the music, are handed their sheet music, and start recording as soon as they are warmed up. We were not able to see a live recording while we were there, but composers and conductors such as James Horner (Titanic, Avatar, A Beautiful Mind) have recorded their film soundtracks in the room. I would love to see a short documentary on this whole process one day, as it's just fascinating.
Inside the control room, they have the largest soundboard I have seen in my life (72 channels, I believe), once again using Genelec monitors. I was told that certain producers will bring in B&W speakers to listen to the recordings on instead of the Genelecs, but those were the only two brands of monitor that I heard mentioned or saw on the tour. After the control room we saw the foley room, where a foley artist creates all of the incidental sounds in a movie, from a cup clinking, to shoes on a hardwood floor, to any other sound you will hear in a film. They had a huge array of props and objects to use to create any sound that might be needed, though I imagine they never stop thinking about what else to use when a new film comes their way.
Finally, we got to visit an automated dialogue replacement (ADR) stage, used to redo, edit, or add lines to a film after production. This took place in a very large, open room with lots of comfortable couches and a couple of mics in the center. Directors often have to go back and change dialogue after a test screening reveals that a scene or plot line isn't making sense, or, if they remove a scene, they need to fill in details that are no longer in the film with that scene gone. Additionally, they might need to edit a film's dialogue so it can be shown on TV or on a plane (such as the hilarious dubbed lines that appear in the Die Hard series when it's on network TV), so that it's appropriate for all audiences.
Once you have all of the score, sounds, and dialogue assembled, someone has to put it all together and mix it into multiple channels, and for that there are a variety of re-recording theaters on the Sony lot. We visited the Kim Novak Theater, where Greg Russell was currently working on the mix for Salt, with Angelina Jolie. Greg has a dozen Academy Award nominations to his credit, including the sound mixing for Transformers, Con Air, The Rock, Spider-Man 1 and 2, and Pearl Harbor (I can see that he and Michael Bay enjoy working together), and he really enjoyed talking to us about working on the mix. In addition to the 5.1 mix for theaters (with 7.1 theatrical mixes starting this year as well), they will bring in speakers that mimic a home 5.1 setup for doing the mix that might appear on the Blu-ray disc, and for a 2-channel mix they will often use a standard TV, as that's how most people will hear that mix.
I had no idea that the same team would work on multiple mixes of the same movie for different environments, or that there could be such a difference between the home and cinema mixes, but when you consider the huge differences in the audio setups of the two locations, you can see how a dedicated mix would benefit the home viewer. Additionally, they will now approach their mix differently if a film is in 3D. If something happens on the screen normally, perhaps you'll just use the front channels. However, if that action is coming out of the screen in 3D, you would want to bring the surrounds into play to fully immerse the audience and avoid a disconnect between the screen and the audio. I can't convey it well enough in these photos, but the Kim Novak Theater was just amazing to be in, and as ideal a theater for watching a film as you could imagine. I'm not sure what projector they were using, though a 4K model would be my guess since they master everything that way, and I'm certain the audio setup was as good as you could get, so their mix can be Oscar caliber once again.
I mentioned the training classes that Sony runs for cinematographers earlier, and we got to drop in on one as they were recording a scene and seeing how the 3D choices they made would affect that shot. In this case, they were working on the inter-ocular distance for the lenses (how much space is between the dual lenses of the 3D camera) and how changing it affects the shot. They would run through a scene, watch the playback (3D is almost always shot digitally) on their monitors, and adjust the settings to get their desired result. Sony is offering these classes to everyone in the cinematographers guild, as they want 3D to succeed as a whole, and teaching people the best way to use it is the only way that is going to happen.
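The knob they were adjusting maps to fairly simple geometry. For a parallel-lens rig with the convergence point set in post, the on-screen parallax of an object at distance Z is roughly f * b * (1/Zc - 1/Z), where b is the inter-ocular (interaxial) distance and Zc is the distance that lands on the screen plane. The numbers below (35mm lens, full-frame sensor, 10 m screen) are illustrative assumptions of mine, not the settings from the class.

```python
# Rough stereo-parallax model for a parallel 3D camera rig with the
# convergence plane set in post. All numbers are illustrative assumptions.

SENSOR_WIDTH_MM = 36.0     # assumed full-frame sensor
SCREEN_WIDTH_MM = 10000.0  # assumed 10 m wide cinema screen

def sensor_parallax_mm(focal_mm, interaxial_mm, converge_mm, object_mm):
    """On-sensor parallax; negative means the object appears in front of the screen."""
    return focal_mm * interaxial_mm * (1.0 / converge_mm - 1.0 / object_mm)

def screen_parallax_mm(focal_mm, interaxial_mm, converge_mm, object_mm):
    """The same parallax scaled up to the projected screen width."""
    scale = SCREEN_WIDTH_MM / SENSOR_WIDTH_MM
    return sensor_parallax_mm(focal_mm, interaxial_mm, converge_mm, object_mm) * scale

# 35 mm lens, 65 mm interaxial (about human eye spacing), screen plane at
# 5 m, object at 2 m: the object pops roughly 19 cm out of the screen.
print(f"{screen_parallax_mm(35, 65, 5000, 2000):.0f} mm")
# Halving the interaxial halves the parallax, and with it the depth effect:
print(f"{screen_parallax_mm(35, 32.5, 5000, 2000):.0f} mm")
```

This is why widening the lenses exaggerates depth while narrowing them flattens it, and why the class spent so much time iterating on that one setting against the playback monitors.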
Finally, on my way out of the Columbia Pictures lobby, all of their Academy Awards for Best Picture were on display, and I had to stop and admire the award for one of my all-time favorite films, Lawrence of Arabia. I'd really like to thank Sony for bringing me out, as the tour of their facilities was incredibly enlightening, and it made me realize how much more goes into 3D than just adding some special effects and a pair of glasses for the audience. I'm sure everyone in the home theater industry will be interested to see how this develops going forward.