Member
Watching TV, I see that movies have different appearances. Different ways of filming, videoing, or something else? I am aware that older movies were shot on film, but is the newly produced stuff video, or is it just better technology now? There even appears to be a difference between the newest productions. Can someone explain the differences to me? Jim
Member
So, video is video regardless of the recording medium. Any camera works, at its most basic level, by recording information about the light coming into it. If we go all the way back to the earliest cameras, you put a piece of metal coated with light-sensitive chemicals in the back of the camera. When you opened the lens, the lens focused the light from the scene onto the metal plate. Where brighter parts of the scene were projected onto the plate, the chemical changed more, and where darker parts were projected onto the plate, the chemical changed less. The first commercially available process for this was the daguerreotype, introduced in 1839.

FILM: Film works the same way. Black-and-white photographic film is just a clear plastic film with silver halide in it. When the silver halide is exposed to light, it gets darker. (This is why film gives you a negative, where the dark parts are light and vice versa.) Color film also uses silver halide, but it has multiple layers with other chemicals in them, and part of the developing process turns each layer a different color.

The smallest detail you can capture on film depends on the size of the silver halide crystals. Smaller crystals let you capture finer detail, but they also mean the film needs more light for the right amount of chemical change to occur. So, e.g., ISO 50 film has very small crystals and makes very sharp images but requires long exposures, while ISO 800 film has large crystals and makes grainier images but needs much shorter exposures. Note that if you use a larger piece of the same kind of film to capture the same image, each silver halide crystal is a smaller piece of the overall image, so you get more detail. The long and short of it is, assuming you have a great camera and lens, image quality comes down to the size of the silver halide crystals (smaller is better) and the size of the film (larger is better).

If you go back 20 or 30 years, almost all consumer photographic cameras used 35mm film for the sake of convenience: cameras could be small, film came in rolls of 24 or 36 frames so you could take a bunch of pictures without changing film, and so on. Many serious professional photographers of the time used medium format cameras (which still used rolls of film, but the rolls were roughly twice the size of 35mm film and the cameras and lenses were a lot bigger) or large format cameras (which used individual sheets of film, usually 4"x5", 5"x7", or 8"x10").

ELECTRONIC SENSORS: The other major type of camera uses an electronic sensor. Instead of inducing permanent chemical changes, you use a grid of electronic circuits that are sensitive to light (photosites), and you record information about the electrical signal produced by each circuit. More light, more signal. Unlike film, a given image sensor always has the same resolution: when you capture an image, you get information from each of the photosites.

Any electronic device has noise - some random addition to the signal that you can't control. When the signal is much larger than the noise, you don't really notice the noise. As the signal gets smaller, the noise becomes more and more apparent, until eventually it overwhelms the signal and you can't see the signal at all. This is what the signal-to-noise ratio (SNR) describes: less signal means noise is more of a problem. In the context of electronic image sensors, what makes the signal smaller? Less light.
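If it helps to see that "less light, less signal, more visible noise" idea with numbers, here is a very crude sketch in Python. The photon counts and the "read noise" figure are made up for illustration; the one real piece of physics in it is that shot noise grows roughly like the square root of the signal, so the signal-to-noise ratio falls apart as the light level drops:

import math

def photosite_reading(photons_in, read_noise=5.0):
    """Crude model of one photosite: the signal is the photon count,
    and the noise is shot noise (about sqrt(signal)) plus a fixed
    'read noise' from the electronics. All numbers are made up."""
    signal = photons_in
    shot_noise = math.sqrt(photons_in)
    total_noise = math.sqrt(shot_noise ** 2 + read_noise ** 2)
    return signal, total_noise

# Bright scene -> dark scene: watch the signal-to-noise ratio fall.
for photons in (10_000, 1_000, 100, 10):
    signal, noise = photosite_reading(photons)
    print(f"{photons:6d} photons collected -> SNR about {signal / noise:5.1f}")

The exact numbers don't matter; the point is that dim scenes cost you far more signal-to-noise ratio than you might expect.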
So when you take a picture of a darker scene, there's less signal and the noise is more apparent. When you use smaller photosites, there is physically less area for the light to hit, so there's less signal (but the noise stays about the same). If you have two sensors made with identical technology and the same number of photosites, but one is larger than the other, the larger one will have a higher signal-to-noise ratio: each photosite is larger, so more light hits it, so there's more signal.

As an aside, for almost all current image sensor technologies, an individual photosite responds to light - any light - and doesn't care what color it is. The way we make a color image sensor is basically to put little red, green, and blue filters over the photosites, so that a given photosite records the amount of red, green, or blue light hitting it (this arrangement is called a Bayer filter).

VIDEO CAMERAS: Film video cameras use a long roll of film that they feed through the camera, exposing one frame at a time. Electronic-sensor video cameras have historically worked two ways. The oldest ones (if you had a camcorder in the 90s, it almost certainly worked this way) recorded an analog signal on magnetic tape - basically you would take the image and actually put it on the tape, with brighter areas of the image being more strongly magnetized. Virtually every electronic-sensor camera of the last 10-15 years has been digital: it converts the information from the image sensor into a string of 1s and 0s and stores that representation on some digital storage medium.

CAMERAS AND LENSES: Regardless of whether you're using film or an electronic sensor, a larger film frame or sensor means that both the camera and the lenses have to be larger. If you keep everything else the same but make the film frame or sensor twice as wide and twice as tall, the camera and lenses will also be about twice the size and about four times as heavy. Also, even if you ignore the direct quality improvement from making the film or sensor larger, there's a secondary benefit: past a certain point, the sharpness of an image is limited by the lens rather than by the film or sensor, and using larger film or a larger sensor makes lens imperfections have a relatively smaller effect.

FILM VS DIGITAL: OK, finally the actual answer to your question! Film technology has not changed a lot over the last 50 years; there have been minor improvements over time, but nothing huge. Digital camera technology has made huge strides over the last 10-15 years and continues to do so. ALMOST all professional video recording in any context is done with digital cameras now. It basically comes down to image quality and convenience.

If you take a film video camera and replace the film with an image sensor the same size as a frame of the film, you get a sharper, cleaner picture with more detail in any lighting conditions, but most especially in low light.

When you use a film camera, you have to have a bunch of film - maybe several different films in different sensitivities for different lighting conditions. Film is bulky, and when the camera uses up all its film, you have to swap the film out. With a digital camera, you use a few small tapes or memory cards or hard drives, and you have to change them a lot less often. Editing film is a pain: you have to physically cut the strips of film and splice them together. Editing digital video, by contrast, is really easy: if you screw up, you can just undo it, and if you want to play around with moving things a few frames either way, it's trivial.
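To make that concrete, here is a toy Python sketch of the idea most editing software is built on. Nothing in it is specific to any real editor - it's just meant to show that a digital "edit" is a list of references to frames, not a physical rearrangement of the recording, so nothing you do ever touches the original footage:

# Toy "edit decision list": the edit is just a list of which recorded
# frames to show, in what order. Everything here is made up for
# illustration.
source_frames = list(range(1000))  # stand-in for 1000 recorded frames

# A cut is "show these source frames, in this order."
edit_v1 = source_frames[100:400] + source_frames[650:900]

# Want the second clip to start three frames earlier? Build a new list.
edit_v2 = source_frames[100:400] + source_frames[647:900]

# "Undo" is trivial: the original frames were never modified, and the
# previous version of the edit is still sitting right there in edit_v1.
print(len(source_frames), len(edit_v1), len(edit_v2))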
Copying digital files is easier than copying film, and digital files don't deteriorate the way film does. Basically, digital cameras are a lot more convenient, and they get you a lot closer to "what would this look like if the TV/movie screen were a window onto the scene?" Film noticeably affects what you record.

Generally, the difference between video recorded on 35mm cinema film and video recorded on 35mm digital cameras is huge and very obvious. Because of improvements in technology, video recorded digitally 10 years ago doesn't look as good as video recorded digitally today. Current digital recordings should look pretty much perfectly clear and sharp on a 1080p TV or a normal-size movie screen, while 35mm cinema film recordings will look noticeably "fuzzy" or "grainy," especially on a movie screen.

Do note that the picture quality argument goes out the window if you don't require the film and the image sensor to be similar sizes. If you compare a pocket-sized consumer camcorder with a sensor the size of your fingernail to an IMAX film camera (which uses HUGE film and is a HUGE, heavy camera), the IMAX film recording will look tremendously better (there's a rough back-of-the-envelope version of this comparison at the end of this post). Note, though, that most recent IMAX movies were recorded digitally rather than on film.

There are still some people out there recording things on film, both hobbyists and professional filmmakers. There are a few different reasons why, but they basically all boil down to nostalgia, either for the process of using film or for the "character" that film gives to the recording. You see it sometimes when a director is making a movie as an homage to some past genre or period of film and wants to recreate the look of those films. This is also true of still photographers.
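P.S. To put very rough numbers on that camcorder-versus-IMAX point, here is a small Python sketch. The frame and sensor dimensions are approximations I'm assuming for illustration, and "SNR scales with the square root of area" only holds in an idealized, shot-noise-limited model with the same pixel count in every format - but it shows why the comparison is so lopsided:

import math

# Very rough frame/sensor sizes in millimetres - approximations assumed
# for illustration, not exact specifications.
formats = {
    "fingernail-size camcorder sensor": (6.2, 4.6),
    "35mm cinema (Super 35) frame":     (24.9, 18.7),
    "IMAX 15/70 film frame":            (70.0, 48.5),
}

baseline = None
for name, (width, height) in formats.items():
    area = width * height  # light-gathering area in mm^2
    # Crude model: same number of pixels/grains in every format, so light
    # collected per pixel scales with area, and SNR with sqrt(area).
    relative_snr = math.sqrt(area)
    if baseline is None:
        baseline = relative_snr
    print(f"{name:34s} {area:5.0f} mm^2   roughly {relative_snr / baseline:4.1f}x the SNR")

The IMAX frame has something like a hundred times the area of the little camcorder sensor, which is why that recording can still look dramatically better despite film's other disadvantages.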
Member
Thanks, I think! Lots to digest here, thanks for the explanation! Jim