The idea of stereo180 for the underwater world might seem a bit counterintuitive. The whole discussion of 360 video for a truly immersive experience would lead the casual observer to question the logic of cutting your sphere in half; after all, isn't it the full sphere that gives you that "as close as possible to diving without getting wet" experience?
Photo courtesy of Oceans360
All of this is true, but it is not a case of one being truly more immersive than the other or one being ‘better’ than the other. They are different and work well for different use cases.
When showing 360 underwater video to the uninitiated, one of the first things they say (after "OMG, this is so cool!") is, "I thought things were more colorful underwater." To understand this we need to take a quick dive into how light is absorbed in water.
One of the first things you notice as a new scuba diver is the surprising lack of color. National Geographic covers and Planet Earth videos show a vibrant, colorful world! What happened!?
The answer is actually somewhat interesting: water absorbs light quickly as you descend beneath the surface, and depending on particulates in the water, the depth at which no visible light remains can be quite shallow at times, well within the recreational limits of modern-day scuba diving. (This is how we end up doing "night" dives during the day.)
The visible light spectrum can be broken up into the familiar "rainbow" of colors. From lowest to highest energy: red, orange, yellow, green, blue, and violet. The order is important, because it is the lowest-energy colors that are absorbed first in water.
Where does the color go, exactly? What is absorbing it? Water molecules themselves absorb light, and the microscopic particles suspended in all water scatter and absorb still more. What remains is the color you see.
The deeper you go, the more light is absorbed, until only blue (or, in our case here in Seattle, green) light remains, and eventually we reach a depth where no light penetrates at all.
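For those who like numbers, this falloff can be sketched with the Beer-Lambert law, I(d) = I0 * e^(-k*d), where k is the absorption coefficient for a given wavelength. A minimal sketch; the coefficients below are rough, illustrative stand-ins for clear water, not measurements from any particular site:

```python
import math

# Rough, illustrative absorption coefficients for clear water (per meter).
# Red light is absorbed far faster than blue.
ABSORPTION = {"red": 0.35, "green": 0.06, "blue": 0.02}

def remaining_fraction(color: str, depth_m: float) -> float:
    """Fraction of surface light of a given color left at depth (Beer-Lambert)."""
    return math.exp(-ABSORPTION[color] * depth_m)

for color in ABSORPTION:
    print(f"{color:>5} at 10 m: {remaining_fraction(color, 10.0):.0%}")
```

With these made-up coefficients, only a few percent of the red light survives to 10 m, while most of the blue does, which is exactly the washed-out blue-green scene a new diver notices.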
Back to National Geographic and the Discovery Channel: how DO they get those wonderful vibrant colors, then?
Artificial light. By bringing high-powered underwater lighting systems with us, we can restore the "natural" colors. This works because the distance of water between the subject and the lens is short, so there is minimal low-energy light absorption before the image reaches the camera sensor.
It doesn't take a lot of water to absorb color, and this becomes quite noticeable when you shoot a "diagonal" shot. Back in the editing bay you might get frustrated, thinking your fancy video lights have different colors. Well, before you start composing that nastygram to the manufacturer, take a moment to look at your shot. Is the subject matter on one side much further away than on the other? How does this end up looking "on camera"? I'll give you a hint: the closer side of the image will have more color, with minimal hue and tint shift. The side where the subject matter is further away will have a blue or green shift, because some of the reflected light is absorbed before it reaches the sensor.
The takeaway? For the cleanest image, "square up" to your subject, and for diagonal shots consider using a "close focus wide angle" technique, so the blue or green in the background feels right to the viewer and the close, well-lit area is the focus of the shot.
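The diagonal-shot effect falls out of the same exponential attenuation: with video lights, the light travels lamp to subject and back to the lens, so the water path is roughly twice the subject distance. A quick sketch, with the red absorption coefficient again an illustrative stand-in rather than a measured value:

```python
import math

RED_K = 0.35  # illustrative red absorption coefficient for clear water (1/m)

def red_surviving(subject_distance_m: float) -> float:
    """Fraction of red light left after the lamp -> subject -> lens round trip."""
    path_m = 2 * subject_distance_m  # the light crosses the water gap twice
    return math.exp(-RED_K * path_m)

near, far = red_surviving(1.0), red_surviving(3.0)
print(f"near side keeps {near:.0%} of its red, far side only {far:.0%}")
```

So if one edge of a diagonal frame sits at 1 m and the other at 3 m, the far edge keeps only about a quarter as much red as the near edge under this toy model, even though both are lit by the same lamps.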
Close focus wide angle.
Photo courtesy of Oceans360
This is one of my favorite methods of underwater capture. I love it because you can show the colors and beauty of the underwater world without losing the "big picture." I can mess with scale, making a diver seem HUGE or tiny. I can make a nudibranch look as big as a house, or turn a shipwreck into a tiny toy in the background.
Basically, stereo180 is a type of close focus wide angle, but on steroids.
So far there are not a lot of options for stereo180 underwater: the Sexton K1 Pro housing, or you can jury-rig a couple of Kodak PixPros together as a makeshift unit. With the influx of consumer and prosumer stereo180 cameras coming out, I believe we may see more options soon.
So, back to “why” stereo180.
Beyond the color (or lack thereof), standard 360 has the issue of proximity to subject matter. (Yes, there are some really clever lighting kits being made by the folks at Boxfish for their array, but those are still for moderate-distance subjects.)
Photos courtesy of Boxfish-Research
We tend to keep our distance, so to speak, because of stitch-line artifacts. Even with modern stitching methods utilizing optical flow or d-warp, I find the distortion on stitch lines bothersome. This is a bit nitpicky, yes, and I realize anecdotal evidence shows that a general viewing audience doesn't mind (or even notice?) a quick stitch-line error or the jiggly distortion you see with optical flow, as long as it isn't turning someone or something into a Picasso painting for any length of time. I suspect that as time passes our viewing audience will develop a more discerning palate, which will raise the bar, and lazy stitching and imperfectly synced footage will surely become things of the past.
Photo courtesy of Boxfish-Research
Generally speaking, stitch lines in 360 are managed by keeping a nice even distance from "stuff" all around, which is even more important for stereoscopic footage. For example: the shipwreck is 10 feet away, you are swimming 10 feet off the bottom, and the lighting team and buddies stay 10 feet from the camera diver. In that case, as long as we can sync the footage, the stitch itself should be rather straightforward, even for moving shots. As you get closer, you want slower motion and more and more thought put into where the stitch lines are placed (stitch lines are where the clips from the different cameras overlap to make up the full 360 sphere) and into subject proximity for stereo comfort. Closer still, you reach for arrays with fewer cameras (two or three), despite taking a hit on resolution, and that limits you to monoscopic footage.
The basic concept of stereo180 is two cameras side by side, at approximately human eye separation, with ultra-wide-angle lenses (often 200 or 220 degrees). This allows for binocular disparity: the difference in an object's image location as seen by the left and right eyes, resulting from the eyes' horizontal separation (parallax). The brain uses binocular disparity to extract depth information from the two-dimensional retinal images, a process called stereopsis.
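As a rough sketch of how disparity encodes depth: for an idealized pair of rectified pinhole cameras (a big simplification; 200-degree fisheyes are not pinholes), depth is baseline times focal length divided by disparity. The baseline and focal length below are assumed values for illustration only:

```python
def depth_from_disparity(baseline_m: float, focal_px: float, disparity_px: float) -> float:
    """Depth of a point from its horizontal shift between the two views
    (simple rectified pinhole stereo model)."""
    return baseline_m * focal_px / disparity_px

# Human-ish interocular baseline of ~0.065 m, hypothetical 1000 px focal length:
print(depth_from_disparity(0.065, 1000.0, 20.0))  # a 20 px disparity -> 3.25 m
```

The inverse relationship is the key point: nearby critters produce large disparities (strong 3D), while distant reefs produce tiny ones, which is why stereo pays off most at close range.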
Photo courtesy of Oceans360
Although we still must respect comfortable distances for stereo, all of a sudden this opens up a whole new world of options with regard to subject matter. There are no stitch lines to worry about. All those medium and small individual critters you have not really been able to shoot are suddenly within your reach, and not just monoscopically but stereoscopically (3D) as well. You can now add lighting arrays back into the mix without uneven light and dark patches, and without the lights being not-so-hidden in the back quadrant (because generally people don't look there much anyway).
Quite simply it means I can now shoot that baby wolf eel from just a few feet away and show off its full vibrant color as well.
The question has come up: "Doesn't your audience miss the back 180?" After a number of viewings, my answer is this: that is on me, the shooter. If I compose my subject matter properly, with items of interest well framed, the audience is much less likely to get bored and look around for what they are missing. The full 180 gives the viewer room to look around, and the stereo nature of the footage maintains the immersion.
So where does stereo180 fit in my toolbox?
I will still shoot monoscopic 360 for larger subject matter. Take, for example, Cabo Pulmo with its fish tornados. Although it has some of the oldest coral reef structure in the Northern Hemisphere, it's not the same kind of soft coral you'd see in Bonaire. The real draw at this site is the pelagics: the amazing schools of fish, rays, sharks, and sea lions in the water column. For places like Cabo Pulmo or Socorro I would absolutely choose to shoot full 360 with lighting from the likes of the Boxfish 6 SOLA array.
BUT, the second I want to share specific subjects, e.g. anemones with clownfish, a moray eel in a crack, or three blacktip sharks hanging out in a tight little cave with a turtle or two in Maui, I'll now use stereo180 almost exclusively.
Stereo180, for me, is the happy middle ground between standard flat video and 360, with the bonus of some room to look around and the added immersive nature of 3D. It also allows me a bit of a return to my roots, if you will: revisiting my love of close focus wide angle shooting, one of the best ways outside of macro to shoot in low-visibility environments such as our magical Emerald Sea.