a video review

Since I was renting the camera - the Sony a7s that is - I figured I might as well record my thoughts on how well it worked with the Steadicam Pilot.
a7s on a Pilot from Man Made Wilderness on Vimeo.
Here's a rough drawing that I came up with to explain the difference between Full Frame still photographs and motion picture images, which are after all a succession of still photographs.
The difference originates from the direction that film travels through the camera's film gate.
There are complications, naturally. And there are many, many aspect ratios, as can be seen here and here. But this directional orientation explains the inherited size difference between (35mm) stills and motion photography.
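The size difference in the sketch is easy to put numbers on. These figures assume a classic 135 still frame and an approximate 4-perf Academy camera aperture; the exact motion picture dimensions vary by standard, so treat them as ballpark values, not ones from the original drawing:

```python
# 35mm film is 35 mm wide in both cases, but a still camera runs it
# horizontally (8 perforations per frame) while a motion picture camera
# runs it vertically (4 perforations per frame), so the frames differ.
still_w, still_h = 36.0, 24.0    # mm, classic 135 "full frame" still
motion_w, motion_h = 22.0, 16.0  # mm, approx. 4-perf Academy aperture

print(still_w * still_h)    # 864.0 mm^2
print(motion_w * motion_h)  # 352.0 mm^2 -- well under half the still's area
```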
Not very rigorous, but I think this pretty effectively demonstrates the usefulness of the Mosaic Engineering Anti-Alias filter in the Canon 7D. Watch this fascinating video to see the difference between using the filter and NOT using the filter, with ordinary household objects.
Here's what I did for five days back in January, scurrying around in the background.
Video by Brian Wimer
It wasn't enough that my local lab should stop developing E6 film materials. Now even Kodak is withdrawing from manufacturing slide films. Thanks to TOP for the heads up. Now there is no choice but to use Fuji. But I'll be damned if I'll use Velvia again, which leaves that hideous emulsion Provia. Looks like I'll be moving to negative films sometime in the future, after my preferred supply of out-of-date Astia runs out.
Continuing my complaints about the bolted-together components of the EOL'ed Final Cut Studio 7, now I'm on to Color 1.5. Some time had been spent with this opaque interface previously, but it took me probably another 8-10 hours to finally uncover the undocumented means of zooming in on the image in the geometry window, so that the edges of a vignette being applied can be magnified. BTW, a Vignette in Color is roughly what a layer is in Photoshop or a lens in PhotoPaint, though the controls are nowhere near as diverse as in those two still-image manipulation packages.
Work proceeded on a shot that's on screen for 5 sec. 23 frames. There were five Secondary Vignettes applied, using five different shapes, four of which changed as actors move through the shot. About 15 hours was spent on this 6 second shot, learning the software, how to apply shapes, how to move shapes, trying to fine tune the edges.
Finally, after sending the sequence back to FCP so that I could watch the six seconds that had been modified, I'm deciding to start over again and abandon the 15 hours of work. Some of it was software learning curve, so I'll be able to use it again. But the edge detection/drawing around a moving object is a serious challenge. On a still image this is not that much of a problem with these simple tools. But for moving images, where occasionally the edges of an adjustment need to be redrawn every frame, those edges flicker and waver mercilessly, totally unacceptably when all the frames are viewed together. No doubt it's a combination of tools and technique. I'm lacking in both.
Which leads me to realize that if I'm going to use Color - which those who use it seem to feel is a fabulous piece of software - it's going to have to be in a more general manner. If I'm going to pick objects out of a scene for specific adjustment, either they need to be small, or they don't move, or they don't change shape.
I was thinking I'd figured out how this video was done, with some elements colored while everything else in the image is b&w. But after watching it again, I can see that they're using something way more sophisticated than the Vignettes in Color 1.5.
Turns out the zoom control in the geometry image preview in Color 1.5 is documented. I've moments ago found it in the online help, as the first topic under the Geometry Room heading. Guess I should have looked a little closer at the results when I searched the help files. Probably could have saved myself some time & agony.
Better to get up and work on something than lie in bed awake considering all the possibilities. Which leaves me sitting at a keyboard wondering which direction to take instead of reclining on a mattress. The last likely alternative before becoming vertical was a revisitation of the topic of anguish immediately below: the barrier to entry into the software known as Soundtrack Pro.
Nearly a month after my last rant on the subject, during which I've chipped away at that wall, I think it can be reported that some progress has been made. Nonetheless, the functionality still appears erratic, limited, and mostly opaque.
A month ago, the primary frustration was not being able to uncover the means to utilize envelopes a.k.a. keyframes a.k.a. automation in the File Editor module. It's buried somewhere in the Help files, but that source didn't yield the information. Merely poking around the interface finally revealed the minute button that controls the envelope graph. But then it was another 8 - 10 hours of poking and probing that uncovered the logical necessity that Effects cannot be automated without first creating points on the envelopes. It was an hour or more before it became obvious that when automating five to eight variables at a time, all to coincide with one another, all the points on those different envelopes needed to be in line with one another. And the only way to do that is to zoom way in and then turn on the snap feature.
Apply a stored user preset from one of the EQ effects? Oh yeah, it can be done. But it's going to take several hours to figure out, and I doubt I could explain it afterward.
So my complaint? That the software gets in the way all too frequently.
When was the last time you used a piece of software that had a feature that the documentation described as doing exactly what you needed to do? And then when you used the feature it did EXACTLY what you expected and wanted?
Such is the joy of the first usage of a filter in Final Cut Pro 7 called "SmoothCam." Admittedly, it took ten minutes to analyze a piece of video 3 seconds 19 frames long - apparently it has to look at the entire clip from which those ninety-one frames come. I didn't ask the software to do much - simply smooth out a camera that bounced a little from actors walking across the floor.
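As an aside, the frame count checks out, assuming the footage was shot at film-style 24 frames per second (the frame rate is my inference, not stated above):

```python
# "3 seconds 19 frames" at 24 fps comes to exactly ninety-one frames.
fps = 24                      # assumed frame rate
seconds, frames = 3, 19       # clip duration as reported
total_frames = seconds * fps + frames
print(total_frames)           # 91
```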
This is going to be my excuse for why there isn't much movement during this interior scene: if I'd had them walk around, the floor would have been bouncing the camera in a totally uncontrolled manner. The 7D should probably have been mounted on the Steadicam instead of a tripod - or some mount attached to the ceiling instead of the floor. (Which reminds me of a stereo installation from the recesses of my past where I suspended the turntable from the ceiling of the room, since I knew that walking across the room would make the tone arm bounce unacceptably. I don't think the landlord was too keen on the holes left in the ceiling when I departed.) Or maybe several thousand pounds spread around the floor to dampen out the movement. You would think that the several thousand pounds of machinery already in the shop would have done the trick, along with the massive shop bench included in the master shot.
Rarely does a filter work the way I want it to. Nice to see that software can come to the rescue of a shot that would have been eliminated without the filter.
Boston in July? Why not?
We'd never been, but it wasn't on a whim that we travelled this far into foreign territory. Months ago, after acquiring one of motion picture technology's most favored devices - a Steadicam Pilot - it was recommended by one and all that a workshop be included along with it. The workbook is a great start, but the hands-on approach is a quick way to vault up the learning curve. At the time, the Boston workshop was the closest, in schedule as well as geography.
In the few months between signing up for the workshop and actually attending in Boston, I've been able to get in a good bit of practice, find various balance combinations that do or don't work, and employ the device in the production of a short film, the self produced "Walking With Roscoe."
While the location of our hotel was less than ideal for direct access to Boston's famous tourist sites (it was chosen to be within walking distance of the workshop), public transportation in the Boston area is superb. The CharlieCard, which can be purchased in any subway station, permits access to the subway trains as well as the buses. It keeps an electronic record of the fare paid initially and the rides deducted, and can be replenished electronically as many times as desired. Something this intelligent is bound to help immeasurably with getting people to use public transportation. Which Boston area residents surely do: buses and trains are nearly always full, and they run on a frequent schedule.
It seems that by the time I arrive at any class or workshop, I already have a fairly good grasp of the material, and this one was no exception. At least I know the theory; the practice requires A LOT more practice. What our instructor, Peter Abraham, Director of Technical Services at Steadicam, was particularly emphatic about is learning to use the Steadicam to carry a camera in a manner that truly emulates how a human witnesses the present. Or at least to be aware of that manner of human presence, so as to create movement that contradicts the smoothed, rounded-corner, short-cut way we travel through life.
Day 1 was theory and basic movement.
Day 2 was practice operating three different shots designed by Peter that permitted us to branch out and work in various spaces around the Rule/Boston Camera facility. Shot 1 was with the Panasonic AG-AF100 Micro 4/3 camera on a Pilot rig, and utilized a Don Juan move in the middle of the shot. Shot 2 used a large Panasonic video camera & lens on the Zephyr rig in low mode, camera flipped upside down and the monitor up top. With a lot of gear to move around, it never felt very comfortable. Shot 3 was with a Sony PMW F3, a Zeiss 18mm CP2 prime, on a Scout rig. This was our Grand Prix shot, the only one recorded during the weekend. In the end, everyone looked pretty good.
And it was pretty nice to return home and be able to offer some Steadicam work to someone else the following day.
Several weeks later, it appears as if winter has indeed loosened its grip.
Tech note: this photo comes from a 25-year-old Nikon 24mm lens on a Canon 7D body. The Nikon glass, with an adaptor, is a lot less expensive than current models of Canon lenses. But it was purchased primarily for use as a video device, an example being the previous entry.
Maybe it's already obvious to everybody else, but the proper methodology - aka "workflow" - for getting files from flash card based video cameras or DSLRs into editing software, specifically Final Cut Pro, has eluded me. If you don't already know it, DO NOT simply drag & drop files from the cards to the hard disk. When trying to open them later in FCP, in the Log & Transfer window, the software will report an unsupported file type. The entire file structure needs to be copied off the card.
If using a DSLR such as the Canon 5D MkII or 7D, Canon has created a utility for FCP which helps with the correct settings and the use of the Log & Transfer function, and supposedly transcodes the original H.264 codec to Apple ProRes (or whatever editing codec you want) at three times the speed that Compressor will do this operation. Canon suggests using the Mac Disk Utility to first mount the card as a disk image on the hard disk.
Very preliminary use shows that another method, which seems much simpler, is to select in a Finder window the folder on the card that contains the files that need to be copied off the memory card, go to Edit/Copy, then open the folder on the hard disk where they need to be placed and go to Edit/Paste. Once again, Drag & Drop doesn't work, but copy & paste does.
In the case of the Canon camera(s), there is a directory called eos_digital with a subdirectory called dcim. When using Log & Transfer, open the dcim directory to find the copied files. Choosing any directory lower than this results in the Unsupported media message.
This way the files can be opened from the hard disk, and the memory cards can be reformatted and used again for new material. As an added benefit, cards can be copied to a portable hard disk such as the Photo Safe II, and then transferred later to a computer.
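The copy-the-whole-tree rule above can be sketched in a few lines. The card volume, folder names, and clip name below are stand-ins built in a temporary directory purely for illustration; the point is that the entire directory structure travels together:

```python
# Simulate the card layout, then copy the whole DCIM tree at once --
# preserving the folder structure that Log & Transfer expects,
# rather than dragging individual files. All paths here are made up.
import shutil, tempfile
from pathlib import Path

card = Path(tempfile.mkdtemp()) / "EOS_DIGITAL"  # stands in for the CF card
dest = Path(tempfile.mkdtemp()) / "footage"      # stands in for the hard disk

clip = card / "DCIM" / "100EOS7D" / "MVI_0001.MOV"
clip.parent.mkdir(parents=True)
clip.touch()                                     # a dummy video clip

shutil.copytree(card / "DCIM", dest / "DCIM")    # copy the tree, not loose files
print((dest / "DCIM" / "100EOS7D" / "MVI_0001.MOV").exists())  # True
```

After a copy like this, pointing Log & Transfer at the copied dcim folder (and no deeper) is what avoids the unsupported-media message.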
The Photo Safe has no display other than a small digital readout for functions, so it's really just a small portable hard disk with card readers attached. When travelling, no computer is needed to download memory cards. I've not really used this much yet, but with a summer vacation under way, it seems the perfect solution to the checked-luggage problem. The primary issue appears to be the transfer speed from card to Photo Safe: they claim a 1 GB card takes 3-1/2 minutes, so my 16 GB cards are going to take nearly an hour each. Photo Safe to computer runs at USB 2.0 speed.
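The "nearly an hour" figure follows directly from the claimed rate, assuming the transfer scales linearly with card size:

```python
# Back-of-the-envelope check of the Photo Safe transfer-time claim.
minutes_per_gb = 3.5          # manufacturer's claim: 1 GB in 3-1/2 minutes
card_gb = 16                  # one 16 GB CF card
total_minutes = minutes_per_gb * card_gb
print(total_minutes)          # 56.0 -- indeed nearly an hour per card
```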
If anyone using the Canon 7D and FCP has a simpler way of getting video files off the compact flash cards, I'd love to hear about it.
This one twenty paces from yesterday's entry. I'm ecstatic about the color from this roll of negative film. It's Portra 400, scanned and run through ColorNeg before a few minor tweaks in PhotoPaint. Maybe it's only the location and the light, because I've certainly had a hard enough time in the past getting what I considered faithful colors from negative stock. It obviously works as a medium, since so many people use(d) it, but I've had a hell of a time getting color that appeals to me. Apparently it's my scanning technique.
I'm ready to use this as a movie location. It even suggests the hint of a story.
Listen up, all you Luddites. If you're still using film, and scanning images into the computer, you may need this plug-in for PS. I read about ColorNeg a year or so ago, and tried the demo version, which is fully functional but saves files with a grid over them. For whatever reason, it didn't seem to suit me at the time. Recently I tried it again and am much more favorably impressed, to the point that I purchased a copy. This permits use of registered copies of ColorPos and GamSat as well, all of which require 48-bit scans as input. For the color 4 x 5 negatives as scanned by Vuescan at 1600 dpi x 16 bit, my files are approximately 252 MB in size. Rather massive, but in fact PhotoPaint X4 gets around them without too much trouble even using the four-year-old Dell box. The point of this name dropping is that this is finally a way to scan problem color negative film that has never scanned consistently with the Epson software for the V700. You might want to give it a try - and let me know of your experiences.
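That file size squares with the scan settings. A 48-bit scan stores 6 bytes per pixel, and the scanned area comes out a bit smaller than a nominal 4 x 5 inches; the image-area dimensions below are my assumption (roughly the sheet minus the holder's rebate), not figures from the post:

```python
# Sanity check on the ~252 MB figure for a 48-bit 4x5 scan at 1600 dpi.
dpi = 1600
width_in, height_in = 3.7, 4.7  # assumed usable image area of a 4x5 sheet, inches
bytes_per_pixel = 6             # 48-bit RGB = 3 channels x 16 bits

pixels = (width_in * dpi) * (height_in * dpi)
megabytes = pixels * bytes_per_pixel / 2**20
print(round(megabytes))         # ~255, in the same ballpark as the files reported
```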
My awe and enthusiasm have waned in the intervening hours since dinner. It's not usual for me to say much of anything about tech, but this one is so cool I've got to mention it.
My brother loaned me his D70 a month or so ago to play with to learn about flash photography. That's been coming along. I won't bore anyone with details. I've become the most cooperative model I know... The most recent work has been within the house using the flash to light bits and pieces of rooms, usually with a lamp and a window somewhere in the frame. I've used my Sekonic meter to get a flash reading, then change the percentage readings and vary the settings on the camera accordingly. This has worked fine, but the problem is learning anything from this requires downloading the images to the computer and looking at them there.
Roger suggested Camera Control. Whoa... After installing it twice, charging the battery, changing the USB port setting on the camera, damned if it didn't work! Set the camera on a tripod, connect to a laptop via USB cable, and most of the basic functions (and a lot more) can be controlled from the computer, and then view the image on screen, instead of the shitty little LCD. Great for studio work, that's for sure. Or any kind of remote photography. In fact, with the addition of another $580 wireless device, you can connect the camera to a WiFi network and go wireless. Wouldn't that be too cool? On top of that, the newer more expensive cameras (Nikons) will generate a live feed to the computer.
I can see that these are the tools I should be using for the architectural photography that I've started to do. Especially with interior lighting, it's nice to be able to get instant feedback on the settings that work the best. One more example for me that there is no reason to do any commercial photography using film. I still prefer using the view camera, but these digital tools do way more a lot faster. For the sizes that I'm likely to need for people, a current SLR is likely to have plenty enough quality.
Alas - another nail in the coffin. I exposed five pieces of 4 x 5 film in the past three days, and it was still a lot more fun. Not as gee whiz, but still ultimately more satisfying. I see a possible division of tools coming: more digital capture for other people, architecture, and that sort; continued use of the 4 x 5 for more personal work which I might want to print larger.