Use of ND filters

Good discussion. One thing that got me thinking recently was someone (respectable) saying that they prefer to add fake motion blur in Premiere/AE instead of relying on the P3's in-camera motion blur, which they described as terrible.

I typically use an ND16CP simply to avoid over-exposing and blowing out the scene. You can always lighten in post, but once the scene is over-exposed, that exposure latitude is gone for good.
 
How can you be a pro photographer when you don't even understand what an ND filter does? It's simple: a neutral density filter uniformly reduces the amount of light reaching the sensor without affecting colour saturation. This lets you use a slower shutter speed while getting the same exposure you would have had without the filter for that particular scene.

You use ND filters in still photography too, when you want to shoot wide open with a fast lens such as f/2.8, f/2.0, f/1.8, f/1.4, f/1.2 or, if you are very rich, f/0.95 (if you really are a pro, you'll know which lens that is). Some cameras are limited by their maximum shutter speed, meaning that even at the fastest setting the scene would still turn out over-exposed. This is why you need the filter.
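The naming convention hinted at above maps directly to stops of light reduction: ND8 passes 1/8 of the light, ND16 passes 1/16, and each stop halves the light. A minimal sketch of that arithmetic (function names are my own, not from the thread):

```python
from math import log2

def nd_stops(nd_factor: float) -> float:
    """Stops of light reduction for an ND filter.

    ND filters are commonly named by their light-reduction factor
    (ND8 passes 1/8 of the light, ND16 passes 1/16, and so on).
    Each stop halves the light, so stops = log2(factor).
    """
    return log2(nd_factor)

def equivalent_shutter(shutter_s: float, nd_factor: float) -> float:
    """Shutter time giving the same exposure once the ND filter is on,
    holding ISO and aperture constant."""
    return shutter_s * nd_factor

print(nd_stops(16))                      # 4.0 -> ND16 cuts four stops
print(equivalent_shutter(1/2000, 16))    # 0.008 s, i.e. 1/125
```

So an ND16 lets you slow the shutter sixteen-fold (four stops) without touching ISO or aperture.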

The filter also helps in still photography when you want to blur moving subjects such as a stream of water: ISO and aperture can remain constant while your shutter speed is slowed down.
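Picking the right filter for that long-exposure effect is just a ratio: the ND factor you need equals the target shutter time divided by the metered one. A rough sketch under those assumptions (the helper name and example values are hypothetical):

```python
def nd_factor_for_target_shutter(metered_shutter_s: float,
                                 target_shutter_s: float) -> float:
    """ND factor needed to slow the shutter from the metered value to the
    target while keeping ISO and aperture unchanged (same total exposure)."""
    return target_shutter_s / metered_shutter_s

# Scene meters at 1/250 s; we want a 1/4 s exposure to blur the water:
factor = nd_factor_for_target_shutter(1/250, 1/4)
print(factor)  # 62.5 -> an ND64 (six-stop) filter is the nearest standard choice
```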

Carrying this technique over to videography, slowing the shutter lets you hit a suitable shutter speed, whose denominator should be approximately double your frame rate. This is crucial in videography for a more natural-looking video.

I'm really surprised you even dare to say you are a pro photographer when you can't even just google the information. If I were you I would feel so embarrassed to even post that.
What a jerk.
 
Thanks everyone for your input on my question. I didn't realize there was a shutter speed with video; I thought it was just frames per second. As I said, I'm new to video, so I'm going to have to digest this part of it. I understand the aperture is fixed, so the only variables we have control of are shutter speed, FPS and ISO. From your answers, we have control of FPS and shutter speed, and the camera will not override that; the exposure is then controlled by the ISO, so the camera adjusts the ISO for the different lighting conditions while maintaining our FPS and shutter speed. Then you use the ND filter to maintain your FPS. I hope I said that correctly.
Not really. FPS is not variable in an automatic way; it is fixed to whatever rates the camera offers, and it is one of your first decisions when you start filming. Also, shooting at 30 or 60 FPS will not make a difference in exposure. The FPS rate only lets you squeeze more pictures into a second, which can change the way you perceive your film and is useful for smooth slow motion. Cinema uses 24p, video 25 or 30 depending on region, and faster frame rates are for slow-motion purposes.
You set your FPS.
As Zezrum said, in the video world the best shutter speed for a smooth sequence has a denominator twice your FPS, and he explains why. So if you shoot at 30 fps in the USA, your shutter speed should be 1/60 s (unless you want to achieve some FX, e.g. strobing the blades of a heli, just like in photo). Likewise, if you shoot at 60 fps to slow the footage down afterwards into a nice slo-mo, your shutter speed should be set at 1/120 s.
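The rule above is simple arithmetic, sketched here for the usual frame rates (a minimal illustration, not anything camera-specific):

```python
def shutter_for_fps(fps: int) -> float:
    """180-degree shutter rule: the shutter time is half the frame interval,
    i.e. the denominator is twice the frame rate."""
    return 1 / (2 * fps)

for fps in (24, 30, 60):
    print(f"{fps} fps -> 1/{2 * fps} s")
# 24 fps -> 1/48 s, 30 fps -> 1/60 s, 60 fps -> 1/120 s
```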

Once FPS and shutter speed have been set, and knowing that the P3 cam has a fixed aperture of f/2.8, the only adjustment left to get the correct exposure is the ISO. At that point, you may realise that you cannot achieve it because the sun is too bright. Your only alternative to keep the values you decided on is to add an ND filter (ND8 or ND16). Then you regain some flexibility on the ISO to achieve the perfect exposure with the suitable shutter speed.
This matters because, if we let the auto-exposure program do its job, then when you start panning or have big objects crossing your frame, your video might become choppy, which you certainly don't want.
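Choosing between ND8 and ND16 in that situation is again a ratio: how much light reduction do you need to reach the target shutter once ISO is already at its floor? A hedged sketch (helper name and the 'available' kit are my own assumptions):

```python
def choose_nd(metered_shutter_s: float, target_shutter_s: float,
              available=(2, 4, 8, 16, 32, 64)):
    """Pick the smallest stocked ND factor that reaches the target shutter
    without overexposing, with aperture fixed and ISO at its minimum.
    'available' lists the filter factors you happen to own."""
    needed = target_shutter_s / metered_shutter_s  # required light reduction
    if needed <= 1:
        return None  # no filter needed
    for factor in available:
        if factor >= needed:
            return factor
    return available[-1]  # strongest you have; may still overexpose

# Bright sun: camera meters 1/500 s at ISO 100, f/2.8, but we want 1/60 s.
# 500/60 is about 8.3x, so ND8 falls just short and ND16 is the pick:
print(choose_nd(1/500, 1/60))  # 16
```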
Hope it helps. ;)
 
It's helpful for photographers moving to video to simply think of fps as individual photos. The fps can be set for various cinematic purposes to 24/30/60. Let's say we choose 30 fps. That's simply 30 individual photos taken in a single second. Now the ISO and shutter speed get applied to those 30 photos the same way as in photography (as does aperture, but it's fixed on P3s, so no need to complicate things). On a bright day, you'll have your ISO as low as it can go (100), and your shutter speed will be the only part of the exposure triangle left to compensate for proper exposure. Because it's so bright, it will naturally move up pretty high (1/400 - 1/1000 is typical). So each of your 30 photos will be exposed for, say, 1/1000th of a second at ISO 100. That individual frame (photo) will be incredibly sharp with such a fast shutter, as will the next 29 frames (each with the same exposure settings). When you string 30 incredibly sharp photos together and play them as a video, you'll get a very unrealistic "motion-stoppy" quality that doesn't feel like natural motion.

Ideally, we would want to expose each of our 30 photos a little longer so that each frame has a little motion blur, especially on anything that's moving. Since our aperture is fixed and our ISO is already as low as it can go, slowing the shutter to, say, 1/60 would WAY overexpose our shots. So to compensate, we need some other way to bring down the amount of light entering the camera - enter ND filters... sunglasses for your camera :)
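The gap described above, from a bright-day 1/1000 s down to the cinematic 1/60 s, works out to about four stops, which is exactly what an ND16 provides. A quick sketch of that calculation (my illustration, not from the post):

```python
from math import log2

def stops_between(fast_shutter_s: float, slow_shutter_s: float) -> float:
    """Exposure difference, in stops, between two shutter speeds."""
    return log2(slow_shutter_s / fast_shutter_s)

# From 1/1000 s (sharp, "motion-stoppy") to 1/60 s (natural motion blur):
print(round(stops_between(1/1000, 1/60), 2))  # 4.06 -> ND16 (4 stops) is close
```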

There's very little difference when you realize video is just a LOT of photos taken in rapid succession with the same exposure rules applied!
 