Audiovisual Syntax: Editing, Continuity, and Lighting Techniques

Audiovisual Syntax – Editing

-Metric montage is generated through the juxtaposition of shots of different lengths. Contrast emerges from the duration of the shots.

-Rhythmic montage arises when the movement within the frame impels the montage movement from frame to frame. Contrast emerges from what’s inside the frame. Whereas metric and rhythmic montage are relatively easy to grasp, the concepts of tonal, overtonal, and intellectual montage are more abstract.

-Tonal montage essentially refers to the overall atmosphere of a scene, or series of shots. Contrast emerges from the aesthetics.

-Overtonal montage consists of the interlinking of thematic elements across the whole film through associations of composition, motif, rhythm, and sound that articulate the shots in even more complex networks of interrelation. Contrast emerges from the montage of sequences. As Cubitt himself points out, this overtonal potentiality of montage has been thoroughly explored by contemporary electronic videoclips.

-Intellectual montage is probably Eisenstein’s most significant contribution, both in theoretical and practical terms, to film history. Contrast emerges from the ideological implications. This is the type of montage that best reflects the influence of the aforementioned Marxist philosophy of dialectical materialism on Eisenstein’s postulates. Intellectual montage is based on the premise that, if two shots with radically different diegetic contents are placed in conjunction with each other, the spectator can only explain their relationship by means of a concept which links the two at the level of symbolic meaning.


Audiovisual Syntax – Continuity

Linking multiple spaces through editing leads to several considerations regarding:

  • Movement between spaces and screen direction
  • The eyeline match
  • Shot/reverse shot
  • Crosscutting

Movement between spaces and screen direction: Even though in early films contiguous spaces were frequently linked by characters moving onscreen, screen direction was not consistently maintained until the late 1910s.
What is screen direction in terms of continuity?:

  • If a character is moving right to left in one shot and leaves the frame on the left, he should enter the next frame on the right.
  • The same happens if something falls downwards and leaves the frame at the bottom: the next shot should present the same object entering the frame from the top.
  • The end of the clip introduces one of the basic rules regarding continuity, the 180-degree rule, which relates to the imaginary 180-degree arc that is established once the camera shoots the first frame.
  • The 30-degree rule also applies in the name of continuity and is related to the aforementioned 180-degree rule.
  • The rule suggests that, when there is a cut to a different camera position, the camera should move at least 30 degrees from the previous position.

But what happens when the 30-degree rule is broken?: If this rule is not observed and two shots of the same person or object within the scene are cut together without the camera having moved more than thirty degrees, the effect on the spectator is a jolt, as if the camera has jumped a bit. Essentially, in terms of spatial logic there is not enough difference in angle between the two shots (and, therefore, no clearly new perspective on the object) for the transition to go unnoticed. The result is a noticeable jump, what is termed a jump cut. The 30-degree rule serves to create an undisturbed seamlessness in the film because such a shift does not draw attention to itself and is logically motivated by the narrative.
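
Because the rule is purely geometric, it can be illustrated with a short sketch. The Python snippet below is a minimal illustration rather than an on-set tool; the flat 2D model, the point-like subject, and the function names are assumptions made here. It measures the angle between two camera setups relative to the same subject and flags cuts that stay under the 30-degree threshold:

```python
import math

def angular_change(subject, cam_a, cam_b):
    """Angle (degrees) between the two camera-to-subject directions."""
    ax, ay = subject[0] - cam_a[0], subject[1] - cam_a[1]
    bx, by = subject[0] - cam_b[0], subject[1] - cam_b[1]
    dot = ax * bx + ay * by
    norm = math.hypot(ax, ay) * math.hypot(bx, by)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def risks_jump_cut(subject, cam_a, cam_b, threshold=30.0):
    """True if cutting between these two setups would break the 30-degree rule."""
    return angular_change(subject, cam_a, cam_b) < threshold

# Example: subject at the origin, two setups only about 11 degrees apart.
print(angular_change((0, 0), (5, 0), (5, 1)))   # ~11.3 degrees
print(risks_jump_cut((0, 0), (5, 0), (5, 1)))   # True -> likely reads as a jump cut
```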


The eyeline match

  • If character movement can cue contiguous spaces, so can a character’s glance.
  • In continuity editing, offscreen gazes are usually employed as a way to introduce some contiguous space, which is why they are often followed by a POV shot.
  • The POV shot can show either a portion of the space seen in the establishing shot, or it can show a contiguous space. Filmmakers who first employed the POV shot used it in both ways.
  • The second shot also shows a space that has been seen previously, but not from a character’s POV.

Shot/reverse shot: If a single eyeline provides a strong spatial cue, then a second eyeline on the other side of the cut should create an even stronger spatial anchor for the spectator. This principle is commonly used to create the shot/reverse-shot (SRS) schema, one of the most prevalent figures in the classical Hollywood cinema’s spatial system. The SRS also depends on screen direction.

Crosscutting

  • We have already defined crosscutting as the technique of alternating between two actions that happen simultaneously but in different locations (see the sketch after this list).
  • It was one of the techniques listed by Pudovkin. The technique had been previously seen at a very early stage in E.S. Porter’s The Great Train Robbery (1903) and The Kleptomaniac (1905).
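
Structurally, crosscutting is simply an alternation between two lines of action. The toy sketch below (purely illustrative; the shot descriptions and the function name are invented for the example) makes that alternating pattern explicit:

```python
from itertools import zip_longest

def crosscut(line_a, line_b):
    """Interleave shots from two simultaneous lines of action, alternating between them."""
    cut = []
    for a, b in zip_longest(line_a, line_b):
        if a is not None:
            cut.append(a)
        if b is not None:
            cut.append(b)
    return cut

# Hypothetical shot lists for two simultaneous actions in different locations.
print(crosscut(["robbers flee", "posse saddles up"], ["telegraph operator wakes"]))
```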


Which of the following filmmakers said that editing was NOT the central element of film?: Andrey Tarkovsky

Eisenstein suggested different styles of montage that were built on the notion of conflict/contrast. In which one contrast emerged from the ideological implications suggested?: Intellectual montage

The 30-degree rule states that: When cutting to a different shot, the camera should move at least 30 degrees from the original position in the previous shot.

What is the difference between crosscutting and parallel editing?: Crosscutting alternates scenes that happen at the same time; in parallel editing, scenes aren’t necessarily simultaneous

What is the most recurring shot after a character glances offscreen?: A POV shot

When shooting a conversation, if the first shot is an over-the-shoulder low angle shot, the reverse shot should be (if we do not want to introduce any distortion): An over-the-shoulder high angle shot

A sequence in which “portions of a process are rendered through emblematic images linked by dissolves or other forms of punctuation” is a… Montage sequence

Regarding continuity, the axis that goes from the camera to the object filmed is: The optical axis

By placing props between the camera and the object filmed: We increase depth

In the camera team, who is in charge of maintaining the sharpness of the shot?: The 1st assistant camera


Lighting Techniques

Natural light – It can be direct or reflected.

  • Direct natural light: Prevalent in documentaries. It is also a basic light component when shooting outdoors, although it is frequently complemented with additional lights to better control the final result.
  • Reflected natural light: We obtain it by reflecting natural light off white or pale reflectors. The resulting light is diffused and soft, easy to control and redirect.

Artificial light – It can be created with different kinds of light projectors:

  • Tungsten lamps: Similar to domestic bulbs, but with a higher voltage. The most popular incandescent tungsten lamp at a professional level is the photoflood lamp, whose color temperature is 3400 K. Its life span is quite limited, as its color temperature starts to drift very quickly.
  • Fresnel: Named after its inventor, Augustin Fresnel, it is the most widely used artificial light in films: “lights with lenses (…) [with a] stepped ring design that reduces the thickness of the lens to save on cost and also to prevent heat buildup in the glass, which can cause cracking”. The Fresnel’s beam is quite concentrated and focused. There are two basic models, one for studio shoots and a lighter, more portable version. As the power capacity of Fresnel units ranges from 100 W to 10,000 W, they can be used for multiple purposes.


Audiovisual Syntax – Continuity

What is continuity?: Thompson defined continuity as the tool that audiovisual pieces implement in order to achieve a “smoothly flowing narrative, with its technique constantly in the service of the causal chain, yet always effacing itself”. There is “a set of guidelines for cutting shots together”, but continuity goes beyond specific rules and ultimately refers to everything that works in favor of the aforementioned effort. Thompson explains why continuity became an issue in the first place by focusing on two main notions:

  • Analytical editing: continuity and editing are so linked that, before editing came along, continuity was not even a problem.
  • Multiple spaces: once we introduce editing, some more specific problems may come up, one of them being how to visually connect different spaces. Several considerations need to be taken into account in this respect: movement between spaces and screen direction, the eyeline match, shot/reverse shot, and crosscutting.

Analytical editing: The cut-in for detail, inserted between two long shots, starts to become established in the early 1910s. The motivation for the cut-in is compositional, for without a closer view we could not follow the action adequately. Cut-ins frequently introduce POV shots, with a clear aspiration to realism.

Multiple spaces: As introduced in the previous epigraphs, when filmmaking moved from stories told in very few wide shots to a more fragmented language, filmmakers had to learn how to deal with the multiple spaces that opened up in front of their (and their spectators’) eyes.


Continuity for two characters: The easiest way of framing two characters without resorting to editing is using the 2-shot. The lack of cutting when framing two characters is less dramatic, though; hence filmmakers tend to make these pieces more dynamic through editing. The main rule we have to take into consideration when editing a conversation between two characters is that the two optical axes have to keep the same angle towards the axis of action. The key, then, is the angle we choose. We can opt for a 90º angle:

  • The problem is that a completely lateral frame deprives us of the eloquence of tight shots.
  • This is why we may opt for more intermediate angles.

One of the tricks here is that the rule stands for the angle between the optical axis and the axis of action, but it does not affect the distance from the camera to the object. So, we can keep the same angle but change the shot scale. We do need, though, to take into account the camera angle: if the first shot is an over-the-shoulder high angle shot, the reverse shot has to be an OTS low angle shot.

The situation is more stimulating if the characters move. That movement changes the axis of action, but it can be used intentionally to move the camera accordingly, introducing new spaces or situations. The key here is always keeping the 180º rule in mind. To sum up, if one of the characters moves, the axis of action changes, which affects the way we edit the conversation between the characters from then on.
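
Since the 180º rule is also a geometric constraint, it can be sketched in a few lines of code. The Python snippet below is a simplified flat-ground model written for illustration; the coordinates, function names, and the lenient treatment of a camera sitting exactly on the line are assumptions made here. It checks whether two camera setups stay on the same side of the axis of action defined by the two characters:

```python
def side_of_axis(char_a, char_b, camera):
    """Sign of the cross product: which side of the A-B axis of action the camera is on."""
    ax, ay = char_a
    bx, by = char_b
    cx, cy = camera
    cross = (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)
    return 1 if cross > 0 else (-1 if cross < 0 else 0)

def respects_180_rule(char_a, char_b, cam_1, cam_2):
    """True if both setups sit on the same side of the axis (or exactly on it)."""
    s1 = side_of_axis(char_a, char_b, cam_1)
    s2 = side_of_axis(char_a, char_b, cam_2)
    return s1 == 0 or s2 == 0 or s1 == s2

# Two characters facing each other, one camera per character.
a, b = (0, 0), (4, 0)
print(respects_180_rule(a, b, (1, 3), (3, 2)))   # True  -> same side, eyelines stay consistent
print(respects_180_rule(a, b, (1, 3), (3, -2)))  # False -> crossing the line reverses screen direction
```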


“When it comes to ‘match-cutting’ two shots showing someone walking through a door, for perceptual reasons, a few frames of the action may be omitted or repeated in order that the filmed action may seem more smoothly continuous than would have been the case had the shot been picked up precisely where the previous one left off”

This notion of raccord or “match” is essential to smooth the editing process and applies to different fields within filmmaking in general and editing in particular, some of which we have already explained. “Match refers to any element having to do with the preservation of continuity between two or more shots. Props, for instance, can be ‘match’ or ‘not match’.”

‘Match’ can refer, in the end, to several aspects when juxtaposing shots:

  • Eye-line matches – raccord de mirada.
  • Matches in screen direction – raccord de dirección.
  • Matches in the position of people – raccord de posición (personas).
  • Matches in objects on screen – raccord de posición (objetos).

Continuity for one character: Imagine we want to get closer to or farther away from a character in two consecutive shots. We may want to preserve the camera-character axis.

  • If the subject is centered, when we move the camera, the axis needs to frame him or her centered.
  • If the subject is not centered, the axis of the subsequent tighter shot is usually the same, so the background remains a reference point for the spectator.
The same usually applies when there are high and low angle shots: when we get closer or farther, the camera angle is usually the same.

Continuity for three characters: Once we know how to deal with two characters on screen, we may need to introduce a third one. There are no new rules here; we only have to choose the axis of action that is most relevant in narrative terms. How can we introduce a new character?: The eyeline axis becomes extremely useful here: if one character looks at something offscreen and the subsequent POV shot reveals the presence of a new character, a new axis of action is automatically created.


Light-Related Factors and Rules

There are three factors:

  • Whether the light source is ‘hard’ or ‘soft’, determined largely by its size and distance (see the sketch after this list).
  • The angle of the source in relation to the camera position.
  • The color of the light.
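
The first factor lends itself to a quick numeric illustration. The sketch below is rough and for intuition only: the apparent-size formula is plain geometry, but the soft/hard threshold value and the function names are assumptions made here, not an industry standard. It estimates the apparent angular size of a source as seen from the subject, which is what effectively makes its light hard or soft:

```python
import math

def apparent_size_deg(source_diameter_m, distance_m):
    """Apparent angular size of a light source as seen from the subject, in degrees."""
    return math.degrees(2 * math.atan(source_diameter_m / (2 * distance_m)))

def quality(source_diameter_m, distance_m, soft_threshold_deg=15.0):
    """Classify the light as 'soft' or 'hard' from its apparent size (threshold is illustrative)."""
    return "soft" if apparent_size_deg(source_diameter_m, distance_m) >= soft_threshold_deg else "hard"

# A small lamp far away reads as hard light; a large diffused source up close reads as soft.
print(quality(0.2, 4.0))   # ~2.9 degrees -> 'hard'
print(quality(1.8, 1.5))   # ~62 degrees  -> 'soft'
```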

The angle of the throw, that is, the direction from which the light comes, has the power to suggest:

  • The mood of the scene
  • The time of the day
  • The type of location.
“It will also model the objects in the scene, bringing out their shape and texture, or perhaps intentionally not revealing shape and texture”

The color of the light: “Often the story may allow color to be used in ways that go beyond strict realism, or the situation logically justifies a colored lighting effect. In such cases filters may be used on the light sources”

To sum up: Color is an essential component of (metaphoric) storytelling; early filmmakers like Méliès were so aware of the persuasive power of color that they literally painted on their frames to enhance their images.

Color can separate space and time, as Griffith showed in Intolerance, but there are other options too, such as associating different colors with different characters and plotlines. The key is that color provokes different psychological reactions, which filmmakers have explored profusely. Why our brain assigns different psychological values to different colors is, to a certain extent, a mystery. Nuances are important, though, when it comes to presenting colors: red can mean danger and death, but also passion and love; green can convey hope, but also death and rottenness. Cinematographers can adjust the color of the light by using filters or gels.
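
When filters or gels are chosen to shift color temperature, the conversion is commonly reasoned about in mireds (micro reciprocal degrees), that is, 1,000,000 divided by the temperature in kelvin. The short sketch below is a back-of-the-envelope illustration; the example temperatures reuse the 3400 K photoflood value quoted earlier plus a typical daylight figure, and the function names are assumptions made here. It computes the mired shift a gel would have to provide:

```python
def mired(kelvin):
    """Convert a color temperature in kelvin to mireds (micro reciprocal degrees)."""
    return 1_000_000 / kelvin

def mired_shift(source_k, target_k):
    """Mired shift a filter/gel must provide to move source_k toward target_k (negative = bluish gel)."""
    return mired(target_k) - mired(source_k)

# Example: pushing a 3400 K photoflood toward roughly 5600 K daylight.
print(round(mired(3400)))              # ~294 mireds
print(round(mired(5600)))              # ~179 mireds
print(round(mired_shift(3400, 5600)))  # ~-116 -> a strong bluing (CTB-type) gel is needed
```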


Artificial Light

  • Quartz halogen: Frequently used for video recordings and by TV reporting teams, as they are easy to use. “The term quartz refers to the glass that is used to enclose the bulb. The term halogen refers to a reversible chemical reaction that occurs within an inert gas contained in the bulb”
  • Carbon arc lamps: They were first used in theatre and later moved to film, but they were soon replaced by HMI lamps, which achieve the same goals (notably, simulating the light of the sun or the hard light of the moon) while offering more advantages.
  • HMI lamps: They are used profusely in TV and also in films.

Sound Design

There are 3 main types of sound that we can identify in a movie:

  • Speech (Dialogues, Voice-over)
  • Sound effects
  • Music (Score, Source music, Soundtrack, Playback).

Sound effects are “any sound, other than music or speech, artificially reproduced to create an effect in a dramatic presentation”.

There are four types of music in a movie:

  • Score: Musical themes and textures (mostly instrumental) that are appropriate to the style of the piece. It underscores changes in scene, action, and mood.
  • Source music: It is mixed to appear as though it was emanating from an onscreen source.
  • Soundtrack: It is composed of a set of songs which are often compiled into a CD on the film’s release.
  • Playback: Scenes that might include musicians in shot miming the playing of their instruments.

There are two main functions that music plays in movies:

  • Continuity: This is related to something we have already seen: we can close our eyes, but not our ears.
  • Emotional atmosphere: Music contributes to generating emotions that are inextricably associated with the action and moods.

At a narrative level, we can distinguish 2 types of soundtracks in a movie: diegetic and non-diegetic (or extra-diegetic):

  • Diegetic sound: It is the one that is directly related to what we see on-screen, for instance, a musician playing some piece.
  • Non-diegetic sound: It is the music that emphasizes certain aspects of the action we see on-screen, although we never see the source of it, and the characters are not listening to it.

There are two main groups of non-diegetic music:

  • Pre-existing music that was not composed for the movie (e.g., classical music).
  • Ad hoc non-diegetic music, composed specifically for the film: most soundtracks nowadays are like this.