360-degree cameras are still uncommon, and for most people shooting with one will be unfamiliar territory. What's more, many of the techniques of traditional photography, such as the rule of thirds, no longer really apply.
With 360 pics and VR, you truly feel like you're somewhere, not just experiencing it through someone else's eyes. It's not exactly like being there, but it's tremendously closer than anything I've ever been able to capture or share before, and it's this experience that really excites me.
State-of-the-art cameras that let you record the world as you see it, in 360°. This means the camera captures stills or video from all around: up, down, left, right, in front of and behind you.
Technically, yes, you can: there are apps that help, such as Google Street View. However, this requires you to rotate slowly on the spot several times, taking lots of individual pictures as you go to cover a full sphere. This usually takes five minutes or so, and a lot can move in that time, so you're often left with plenty of stitching errors. Another approach is to use a rotator and a clip-on fisheye lens. This way the phone stays stable on a tripod and takes steady, level images that can be stitched in post. If you have the patience this can yield great results; most phone cameras can achieve an 8K image this way.
The only way to shoot 360º video at the moment is using a special 360º camera.
The cameras essentially take two pictures (or videos) at the same time using fisheye lenses mounted on either side of the camera, each usually covering at least 190º. This gives an extra 10º of overlap so the software can align, match and stitch the two halves. Once you've taken your shot you have to stitch the double fisheye file together, a process that can usually be done on your phone or computer using the software that came with the camera. More on that can be found below.
While media and marketing companies often use these terms interchangeably, 360 and VR are actually two separate experiences.
360-degree cameras, while they can produce photos and videos that can be viewed in a virtual reality headset, output 2D images. In other words, it's the same old traditional flat footage. These cameras excel at creating fully surround images that users can explore on devices such as smartphones and tablets, or even cast to a living-room TV. But at the end of the day, this still isn't virtual reality.
VR cameras differentiate themselves by creating a sense of depth, recording footage in full 3D. This creates a hugely different sense of immersion, using sight and sound to trick your brain into feeling like it’s transported to another world. Unlike 360 degree cameras, which limit you to looking at captured footage, VR lets you step into that footage. It’s a completely different experience.
Yes, of course. In fact, most computers these days can view 360 content with their built-in players, but there are also many software options to download from 360 manufacturers, and you can find them here.
One rookie mistake we all make when first playing a 360 video on sites like Facebook and YouTube is relying on the default resolution, which makes the videos look terrible. Click on the settings icon and max out the resolution to 4K if it's available. On Facebook, make sure to select the HD option. It makes a huge difference.
While 360 video features a complete panoramic 360-degree horizontal view and a 180-degree vertical view, it's important to understand that on most head-mounted displays (HMDs) a user can only see about 90 degrees at a time. Viewing the entire sphere of the video involves turning and tilting your head and body.
Screen Resolution Matters
If a video is shot on a 4K 360 camera, it's not quite the video quality it sounds! 4K resolution means the image is roughly 4,000 by 2,000 pixels. Since a viewer can only see about 90 degrees of the image at any time, they're only seeing an image around 1,000 pixels wide. Imagine the difference between a PlayStation 1 and a PlayStation 4. That's a very visible difference. True 4K in a 360 video means 4K resolution needs to apply to just the 90 degrees visible in the headset being used, so to get true 4K in 360 you'd need to shoot 16K: 16,000 pixels wide by 8,000 pixels high.
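Here's the same back-of-the-envelope calculation in a few lines of Python. The 90-degree headset view and the 2:1 frame widths are rough assumptions for illustration, not the specs of any particular camera or headset:

```python
# Rough maths: how many source pixels actually land inside a 90-degree
# headset view for different 360 resolutions. Widths are common 2:1
# equirectangular frame sizes, used here only as examples.

HEADSET_FOV_DEG = 90  # horizontal degrees visible at once (assumed)

resolutions = {
    "4K  (3840 x 1920)": 3840,
    "5.7K (5760 x 2880)": 5760,
    "8K  (7680 x 3840)": 7680,
    "16K (15360 x 7680)": 15360,
}

for name, width in resolutions.items():
    visible = width * HEADSET_FOV_DEG / 360  # pixels inside the 90-degree window
    print(f"{name}: ~{visible:.0f} px across the visible view")
```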
Beginner: Point and shoot and you're ready to go. The stitching is done on the fly, so you can share your shot right away without editing. Examples include the Ricoh Theta V, Insta360 One and Xiaomi (Madventure 360).
Advanced: The Garmin Virb, GoPro Fusion and Yi 360, all roughly around £650. This middle ground of cameras will give you more post-production grading ability and slightly better quality, being able to shoot at 5K or more, though in my opinion the difference isn't all that plain to see. I think they need to produce at least 8K to warrant spending the extra.
Prosumer: From the Insta360 Pro at around £3,500 to the Samsung 360 Round at around £10,000. As you would expect, they come with all the bells and whistles, including the ability to shoot 360 in 3D.
All these and more can be found in my recommended 360 camera list.
Monoscopic: Single-lens cameras attached together in a ring formation. Monoscopic camera setups are generally the easiest and lowest-cost option. The most common setup for this style of video involves at least six cameras covering six different fields of view to create the full 360° experience.
Stereoscopic: Often filmed with two cameras per field of view, one for each eye. This not only generates the 360° image but also creates a 3D 360° view. Better stereoscopic options are becoming available all the time.
Equirectangular: Equirectangular format is one single, stitched image covering 360° horizontally and 180° vertically. An equirectangular panorama essentially takes a spherical environment and maps it onto a flat plane. The width/height ratio is usually 2:1. A normal panorama is only a horizontal sweep and doesn't take the vertical into account. You also can't convert just any panorama into an equirectangular one, because not all of them capture everything in 360 degrees, meaning there will be missing information.
Dual Fisheye: When your camera has two lenses that each capture around 180 degrees, this is what the file looks like before it has been stitched and is opened in a standard player: two circular fisheye images side by side.
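To make the geometry concrete, here's a minimal Python/OpenCV sketch that remaps one (assumed equidistant, centred) fisheye hemisphere into half of an equirectangular frame. The function name and the simple lens model are my own; real stitching software also calibrates the lens, blends the overlapping degrees and matches exposure between the two sides:

```python
import numpy as np
import cv2

def fisheye_to_equirect_half(fisheye_img, fov_deg=190):
    """Remap one circular fisheye image into the half of an equirectangular
    frame its lens covers. Teaching sketch only: assumes an ideal equidistant
    lens with the image circle centred in the frame."""
    h_in, w_in = fisheye_img.shape[:2]
    cx, cy = w_in / 2, h_in / 2              # assume the image circle is centred
    radius = min(cx, cy)                      # and fills the shorter dimension
    max_theta = np.radians(fov_deg) / 2       # half the lens field of view

    out_h = h_in                               # one half of a 2:1 equirect frame
    out_w = out_h                              # 180 degrees wide -> square
    lon = np.linspace(-np.pi / 2, np.pi / 2, out_w)   # -90..+90 degrees
    lat = np.linspace(np.pi / 2, -np.pi / 2, out_h)   # zenith at the top row
    lon, lat = np.meshgrid(lon, lat)

    # Direction vector for each output pixel (lens looks along +z).
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)

    theta = np.arccos(np.clip(z, -1, 1))       # angle away from the optical axis
    r = radius * theta / max_theta             # equidistant projection: r grows with theta
    norm = np.sqrt(x**2 + y**2) + 1e-9
    map_x = (cx + r * x / norm).astype(np.float32)
    map_y = (cy - r * y / norm).astype(np.float32)  # image y grows downwards

    return cv2.remap(fisheye_img, map_x, map_y, cv2.INTER_LINEAR)
```

Run this on each lens's half and place the two results side by side and you have a crude, unblended equirectangular image; the overlap blending is what the camera's own stitcher does better.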
Nadir: This is the lowest point of your 360 footage, directly below the camera.
Zenith: This is the highest point of your 360 footage, directly above the camera.
Optical Flow Stitching: Instead of stitching images together according to a template-based, one-size-fits-all rule (essentially programming a computer to match dotted line A with dotted line B), the optical flow approach lets the computer track the actual content of the image, down to the level of individual pixels. By letting the computer actually "see" what's happening in the image, optical-flow-based stitching helps avoid obvious stitching errors.
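As a rough illustration of the idea (not the pipeline any particular camera uses), OpenCV's Farneback optical flow will estimate a per-pixel offset between the two lenses' overlapping strips, which a stitcher could then use to warp the content into agreement before blending. The file names here are placeholders:

```python
import cv2

# Two overlapping strips taken from adjacent lenses (placeholder file names),
# loaded as greyscale and assumed to be the same size.
left_strip = cv2.imread("overlap_from_lens_A.jpg", cv2.IMREAD_GRAYSCALE)
right_strip = cv2.imread("overlap_from_lens_B.jpg", cv2.IMREAD_GRAYSCALE)

# Dense Farneback flow: for every pixel in strip A, estimate where it
# appears in strip B. Arguments are the standard OpenCV defaults-ish values:
# pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags.
flow = cv2.calcOpticalFlowFarneback(left_strip, right_strip, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)

# flow[y, x] = (dx, dy): the per-pixel offset a stitcher could follow
# so the seam tracks the content instead of a fixed template line.
print("per-pixel flow field shape:", flow.shape)
```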
Chromatic Aberration: Often seen as purple fringing. This is where you'll see purple, or sometimes blue, appearing along the edges of objects in high-contrast situations. It can be corrected in post.
6DOF
Six degrees of freedom: the number of different "directions" an object can move in 3D space, which helps you sense depth and distance. A 3DOF headset tracks only your head's rotation as you look around, on three axes: roll, yaw and pitch. A 6DOF headset tracks both orientation and position, so it knows where you are looking and also where you are in space as you move. A person with a 6DOF VR headset can move freely and naturally in the virtual environment, and everything behaves the way it does in the real world: you can look at objects from different angles, bend over or under them, or walk around to the other side. It's just... natural.
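A tiny sketch of the difference, using made-up class names rather than any real headset SDK: 3DOF is orientation only, 6DOF adds position.

```python
from dataclasses import dataclass

@dataclass
class Pose3DOF:
    roll: float   # tilting the head side to side (degrees)
    pitch: float  # looking up or down
    yaw: float    # turning left or right

@dataclass
class Pose6DOF(Pose3DOF):
    x: float  # metres moved sideways
    y: float  # metres moved up or down (e.g. crouching)
    z: float  # metres moved forwards or backwards

# A 6DOF headset can tell the difference between turning your head towards
# an object and actually leaning in or walking around it.
leaning_in = Pose6DOF(roll=0.0, pitch=-10.0, yaw=15.0, x=0.1, y=-0.2, z=0.4)
```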
Spatial Audio
This delivers a fully 360-degree soundscape that responds to your visual field. When you move your head while wearing a headset, or pan around on your desktop, the audio changes to reflect that movement, which is why content with spatial audio recommends listening with headphones. If someone is talking directly in front of you in the video, you hear them equally in both ears; turn your head to the left and the audio perspective shifts along with the visual, so the person talking is heard mostly in your right ear.
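Here's a deliberately simplified sketch of that head-tracked behaviour using plain stereo panning. Real spatial audio uses ambisonics and HRTFs, so treat this purely as an illustration of the idea; the azimuth convention is my own:

```python
import math

def stereo_gains(source_azimuth_deg, head_yaw_deg):
    """Very simplified head-tracked panning. Convention (assumed):
    0 degrees = straight ahead, positive = to your right."""
    relative = source_azimuth_deg - head_yaw_deg
    # Constant-power pan across the front 180 degrees, clamped at the ears.
    pan = max(-1.0, min(1.0, relative / 90.0))   # -1 = hard left, +1 = hard right
    left = math.cos((pan + 1) * math.pi / 4)
    right = math.sin((pan + 1) * math.pi / 4)
    return left, right

print(stereo_gains(0, 0))    # facing the speaker: equal in both ears
print(stereo_gains(0, -90))  # head turned 90 degrees left: mostly the right ear
```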
Reframing
The ability to reposition (sometimes called "overcapture") your 360 footage to frame specific points of interest, creating a more traditional 16:9 video.
Stabilization
When it comes to stabilisation and leveling, 360 video makes a lot of sense. Video shot with a standard camera can only be leveled and stabilized by cropping and rotating the footage; in more extreme cases, this can result in a significant softening of the image. In immersive video, there really is no “correct” camera orientation — anything can be up, down, left, or right. You can rotate the video in post to your heart’s content, and the number of on-screen pixels will always remain the same. Thanks to 360 cameras now having an onboard gyroscope, they can handle stabilization and leveling automatically in real time.
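For example, a yaw correction on an equirectangular frame is nothing more than a horizontal shift with wrap-around, so no pixels are cropped or lost. A quick sketch, assuming a standard 2:1 equirectangular frame (pitch and roll corrections need a full spherical remap, which is what the camera's own software does):

```python
import numpy as np

def rotate_equirect_yaw(equirect_img, yaw_deg):
    """Rotate a 360 equirectangular frame around the vertical axis by
    shifting columns with wrap-around; every pixel is kept."""
    height, width = equirect_img.shape[:2]
    shift = int(round(width * yaw_deg / 360.0))
    return np.roll(equirect_img, shift, axis=1)

# e.g. re-point "forward" by 45 degrees without losing any resolution:
# reoriented = rotate_equirect_yaw(frame, 45)
```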
3 Axis vs 6 Axis Stabilisation
The main difference between 3-axis and 6-axis stabilisation is that the latter has three accelerometers in addition to the three standard orientation sensors. The pitch, yaw and roll sensors handle rotation in your 360 footage well on their own, but the added accelerometers in a 6-axis (or higher) system make the footage more resistant to drift and displacement.
Fixed Foveated Rendering (FFR)
FFR is somewhat of a game-changer for mobile VR headsets, especially standalone units like the Oculus Go. It renders the centre of the view, where your eyes spend most of their time, at full quality, while the edges of the frame are rendered at lower quality; this frees up enough performance to play back much higher-resolution video, up to 8K, than the headset could otherwise manage.
You need to remember that the camera will capture everything surrounding it, so if you don't want to be seen you will need to hide out of view and make use of the timer feature. When scoping a shot, consider that the viewer will be able to explore the whole image, so it's best to find a location with plenty of interesting features and put the camera in the centre. Typically it's best to imagine the camera as another person: keep it around chin-to-eye level in height, and at a similar distance away as you would stand when talking to someone. You can find even more shooting tips here.
A RAW file is basically an image that preserves most of the information from the camera's sensor, without in-camera processing and compression of things like sharpness and contrast. When shooting in JPEG, image information is compressed and lost. Because nothing is thrown away with RAW, you're able to produce higher quality images, as well as rescue problem shots that would be unrecoverable if shot in JPEG.
This depends on the scene and what you’d like to achieve. Bracketing and RAW are not necessarily better than each other, they’re just tools you can use for certain specific situations.
If you’re shooting in an environment with high contrast, under and overexposed, then bracketing and HDR are perfect. You will need to use a stable tripod/monopod. And even with this, sometimes the camera can move and you can’t use the shot. With regular photography HDR software it can compensate for the movement and alignment, but shooting in 360º the software cannot.
To achieve the best quality results, do the HDR stacking with the unstitched photos, not with the stitched equirectangular image. If you stack the stitched photos you'll likely end up with a lot of artifacts and distortion.
A High Dynamic Range (HDR) image is an image that contains a larger range of color and tonality than typical files, which are known as Low Dynamic Range (LDR) images. HDR images are captured by special cameras or are created by merging multiple LDR images together. While LDR images normally contain either 8- or 16-bits per channel, HDR images contain 32-bits per channel.
Currently there are no printers, and only specialist monitors, that can display HDR images directly. Therefore, in order to print or display an HDR image, it must be compressed down to an LDR image. This process, called tone mapping or tone compression, is one of the main jobs of an HDR application.
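Here's a minimal sketch of that merge-and-tone-map step using OpenCV, assuming three bracketed exposures from a locked-off tripod. The file names and exposure times are placeholders, and as noted above you'd do this on the unstitched images before stitching:

```python
import cv2
import numpy as np

# Placeholder bracketed exposures and their shutter speeds in seconds.
files = ["bracket_-2ev.jpg", "bracket_0ev.jpg", "bracket_+2ev.jpg"]
times = np.array([1/250, 1/60, 1/15], dtype=np.float32)
images = [cv2.imread(f) for f in files]

# Merge the 8-bit exposures into a 32-bit-per-channel HDR radiance map.
merge = cv2.createMergeDebevec()
hdr = merge.process(images, times)

# Tone map the HDR result back down to an LDR image for display or print.
tonemap = cv2.createTonemap(gamma=2.2)
ldr = tonemap.process(hdr)
cv2.imwrite("merged_hdr.jpg", np.clip(ldr * 255, 0, 255).astype(np.uint8))
```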
In photography, dynamic range is the ratio between the minimum and maximum brightness values, whether of the original scene, the digital image or the final print.
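Photographers usually quote dynamic range in stops, where each stop is a doubling of brightness. A quick sketch with example numbers only:

```python
import math

def dynamic_range_stops(brightest, darkest):
    # Each stop doubles the brightness, so stops = log2 of the ratio.
    return math.log2(brightest / darkest)

print(dynamic_range_stops(1000, 1))  # a 1000:1 scene spans roughly 10 stops
```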
There is desktop software like PTGui or Hugin, and most of the 360 camera manufacturers now supply desktop software that can stitch, but I think this is unnecessary and time-consuming unless you have a whole batch of photos to stitch all at once.
Personally, I would advise using your camera manufacturer's mobile app: select the photo you want from the app's gallery, then simply download it to your phone's gallery from within the app. It's during the download process that the app stitches your photo to equirectangular, and you'll then be able to see the downloaded file in full 360. For more on 360 players click here.
If you choose to remove the memory card and copy straight from it to your computer, you will find you have the unstitched double fisheye file. This won't display in 360, and you'll need to stitch it using desktop software, or put the card back in the camera, load the camera's app and download it to your phone. For more on stitching click here.
Yes, I think all sharing sites now offer an iframe embed code. Upload your photo to one of the many sites (I listed a few here), click on Embed (which can sometimes be found in the three-dots menu in the top right), then just copy the code and put it anywhere on your website. If you use WordPress you can use a shortcode to display photos uploaded to your media library: (vr url=path-to-photo.jpg view=360) Just replace the () brackets with these [].
Yes, there are now many apps and software options available; I listed those I know of here. One thing you cannot do is crop, as the image will no longer be 360. If you use a non-360-aware editor and add too much clarity, sharpness or HDR effect, you will find the stitch line becomes very obvious. There are techniques and tutorials to help you here.
Even if you do everything right, things are bound to go wrong when shooting in 360. I learned most lessons from trial and error during shooting: avoiding the stitch line, taking the position of the sun and shadows into account, remembering spare batteries, making sure I had my SD card with me and, if I didn't want to be in the shot, where to hide.
You can find 20 more tips for shooting in 360 here.
Everyone from the camera makers to the content creators is still trying to figure out this new frontier. Improvements are happening all the time, so expect the products and the footage to improve significantly, very quickly.
Click Here For More of My Tutorials
If you found this helpful, please like and follow my social pages
Do you have any more recommendations for beginners?
Originally posted on 4 Oct 2017 @ 22:16