Take your portrait mode photos from awesome to unforgettable with Apollo!
Post-process a photo as if it was a 3D render! Illuminate the scene to your heart's content.
NOTE: Taking portrait mode photos requires an iPhone 7 Plus, iPhone 8 Plus or iPhone X.
Turning off Help Tips
You can learn how to easily turn help tips off by watching this video.
Take your portrait mode photographs from awesome to unforgettable with Apollo! Select the color, intensity, and distance of a light, and illuminate the scene to your heart's content.
Apollo is the first app to use the depth data of your portrait mode photos to calculate photorealistic lighting. Using a proprietary GPU processing algorithm, Apollo illuminates a portrait scene in real time.
Load up a photo from your gallery and start adding lights. Apollo realistically calculates the effect of each light's distance, color, and intensity.
Add up to 20 light sources. For each one, calibrate:
- position in 3D space
- light intensity
- light spread
- color
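In code, a scene with these per-light controls might be modelled along the following lines. This is a minimal illustrative sketch: the struct, field names, and value ranges are our assumptions, not Apollo's actual API; only the 20-light cap comes from the description above.

```cpp
#include <vector>

// Illustrative data model for one configurable light source.
// Field names and ranges are assumptions, not Apollo's real API.
struct LightSource {
    float x, y, z;     // position in 3D space; z may place it behind the subject
    float intensity;   // brightness, e.g. 0..1
    float spread;      // how focused or diffuse the light is
    float r, g, b;     // colour, RGB components in 0..1
};

struct Scene {
    static constexpr int kMaxLights = 20;  // Apollo's stated cap
    std::vector<LightSource> lights;

    // Returns false once the scene already holds the maximum number of lights.
    bool add(const LightSource& l) {
        if ((int)lights.size() >= kMaxLights) return false;
        lights.push_back(l);
        return true;
    }
};
```

A light with a negative `z` here would sit behind the subject, which is how the halo/silhouette effect described below would be expressed in such a model.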
You can even position a light source BEHIND an object in your photograph to produce halos and emphasize silhouettes!
For best results, keep in mind the following:
- The photo should have uniform lighting: no hard shadows or strong highlights.
- Low-light photos, photos taken under an overcast sky, selfies, and close-ups are perfect candidates.
- A crisp, steady photo results in crisp customized lighting.
- Practice makes perfect! Although Apollo is built with simplicity in mind, mastering its concepts may take some time.
Apollo is the first application to use the depth data of portrait mode photographs to realistically add light sources to a scene. Development of the app began as an experiment back in November 2017, when we first got our hands on a brand new iPhone 8 Plus. We wanted to see what could be achieved by taking advantage of the depth information of portrait photos. Our hypothesis looked simple: if depth information can be superimposed on a 2D photo, it should be possible to re-illuminate objects with custom light sources.
Of course, the first results were horrible. Our team stuck with it and tried to squeeze every last bit of information out of the depth buffer. First we needed a method for deriving more depth points from the depth (disparity) map provided by the dual camera API. We algorithmically produced a new, denser map of depth points for the existing photo. Things were looking brighter, but the visual effect of the lighting computed from the enriched depth map still looked disappointing.
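The densification step can be pictured as interpolating extra depth samples between the sparse ones the camera provides. Apollo's actual algorithm is proprietary; the sketch below only illustrates the idea using plain bilinear interpolation over a coarse disparity grid.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Hypothetical sketch: densify a coarse disparity map by bilinear
// interpolation, producing a grid `factor` times denser in each axis.
std::vector<std::vector<float>> densify(const std::vector<std::vector<float>>& coarse,
                                        int factor) {
    size_t h = coarse.size(), w = coarse[0].size();
    size_t H = (h - 1) * factor + 1, W = (w - 1) * factor + 1;
    std::vector<std::vector<float>> fine(H, std::vector<float>(W));
    for (size_t y = 0; y < H; ++y) {
        for (size_t x = 0; x < W; ++x) {
            // Fractional position of this fine sample inside the coarse grid.
            float fy = static_cast<float>(y) / factor;
            float fx = static_cast<float>(x) / factor;
            size_t y0 = static_cast<size_t>(fy), x0 = static_cast<size_t>(fx);
            size_t y1 = std::min(y0 + 1, h - 1), x1 = std::min(x0 + 1, w - 1);
            float ty = fy - y0, tx = fx - x0;
            // Blend the four surrounding coarse samples.
            float top = coarse[y0][x0] * (1 - tx) + coarse[y0][x1] * tx;
            float bot = coarse[y1][x0] * (1 - tx) + coarse[y1][x1] * tx;
            fine[y][x] = top * (1 - ty) + bot * ty;
        }
    }
    return fine;
}
```

Naive interpolation like this smears depth across object boundaries, which is exactly the disappointment described above and the motivation for the smoothing step that follows.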
It was time for smoothing. We implemented different filters with varying results. What we needed was a map of smooth contour lines that realistically follow the curves of the foreground objects. A special sauce of interpolation for enriching our map, along with some bilateral filtering to avoid edge artefacts, was the recipe that saved the day.
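The edge-preserving property of a bilateral filter is what makes it the right tool here: each sample is averaged with its neighbours, weighted both by spatial distance and by difference in depth value, so smoothing stops at depth discontinuities. A one-dimensional sketch for clarity (parameter values are illustrative, not Apollo's):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Illustrative 1D bilateral filter over a row of depth samples.
// sigmaSpace controls how far smoothing reaches; sigmaDepth controls
// how large a depth jump counts as an edge to be preserved.
std::vector<float> bilateral1D(const std::vector<float>& depth,
                               int radius, float sigmaSpace, float sigmaDepth) {
    std::vector<float> out(depth.size());
    for (int i = 0; i < (int)depth.size(); ++i) {
        float sum = 0.f, norm = 0.f;
        for (int j = std::max(0, i - radius);
             j <= std::min((int)depth.size() - 1, i + radius); ++j) {
            float ds = float(i - j);        // spatial distance
            float dd = depth[i] - depth[j]; // depth difference
            float w = std::exp(-(ds * ds) / (2 * sigmaSpace * sigmaSpace)
                               - (dd * dd) / (2 * sigmaDepth * sigmaDepth));
            sum += w * depth[j];
            norm += w;
        }
        out[i] = sum / norm; // normalised weighted average
    }
    return out;
}
```

Neighbours across a large depth jump get a near-zero weight, so the foreground subject's silhouette stays sharp while its surface is smoothed.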
Armed with a high quality depth map, we were able to deduce the normal map, which is fundamental for applying the lighting model of a 3D scene. Using a Phong-style lighting model, we had our first success!
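The two steps, deriving a normal from the depth map's local gradients and then shading with a Phong-style model, reduce to a few lines of vector math. A simplified sketch using central-difference gradients and only the Lambertian diffuse term (a full Phong model adds ambient and specular terms, omitted here):

```cpp
#include <algorithm>
#include <array>
#include <cmath>

using Vec3 = std::array<float, 3>;

Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
    return {v[0]/len, v[1]/len, v[2]/len};
}

// Surface normal from horizontal/vertical depth gradients: a flat
// surface (zero gradient) faces the camera along +z.
Vec3 normalFromDepth(float dzdx, float dzdy) {
    return normalize({-dzdx, -dzdy, 1.0f});
}

// Diffuse term of the Phong model: max(0, N.L) scaled by intensity.
float diffuse(Vec3 n, Vec3 lightDir, float intensity) {
    Vec3 l = normalize(lightDir);
    float ndotl = n[0]*l[0] + n[1]*l[1] + n[2]*l[2];
    return std::max(0.0f, ndotl) * intensity;
}
```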
At this stage, computation of depth information for a portrait photo took roughly 45 seconds, leading to very bad UX. It was time to move closer to the GPU! Our algorithm was first broken down to take advantage of multiple threads. Then all computations were rewritten for the Metal 2 SDK. Loading time dropped to around 3 seconds, a staggering improvement!
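The intermediate step, splitting the per-pixel work across CPU threads before the Metal 2 rewrite, might have looked something like the sketch below. This is hypothetical: `processRow` stands in for whatever per-row depth computation is being parallelised, and the GPU version is not shown.

```cpp
#include <algorithm>
#include <functional>
#include <thread>
#include <vector>

// Sketch: distribute `rows` units of work across hardware threads,
// each thread taking an interleaved subset of rows.
void parallelRows(int rows, const std::function<void(int)>& processRow) {
    unsigned n = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> workers;
    for (unsigned t = 0; t < n; ++t) {
        workers.emplace_back([=, &processRow] {
            for (int r = (int)t; r < rows; r += (int)n)
                processRow(r);
        });
    }
    for (auto& w : workers) w.join();
}
```

Per-pixel lighting is embarrassingly parallel, which is why the same decomposition mapped so naturally onto Metal 2 compute kernels afterwards.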
The next step was to expose all configurable parameters to the user. When our UX team started work on the project, there were dozens of parameters to tweak. That was no good: we needed a minimal set of parameters that gives the user full control without being overwhelming. After many iterations, we narrowed the list down to six parameters: two global settings and four light-source-specific parameters.
We hope that our user community will enjoy illuminating photos of themselves and their loved ones. We are open to suggestions on how to make Apollo better!