Mocktail: A Shortcut for Creating iOS Device Mockups

I last updated my shortcut for creating device-framed screenshots in December 2017 by adding support for iPhone X. I’ve been meaning to add more device frames for a while now but, since becoming a father six months ago, free time has been a precious commodity.

In the meantime, Federico Viticci released his impressive Apple Frames shortcut which does a fantastic job at framing screenshots for different devices. I still wanted to update my shortcut though it now seemed redundant to simply add more frames. Instead, I decided to start from scratch and approach the concept of screenshot framing differently.

My original shortcut, like Federico’s, makes use of Apple’s marketing product images—high-resolution images of devices for use in marketing or promotional material—that are ideal for screenshot framing. However, there’s no variety in what’s available, only flat images of devices and usually in Space Gray.

Apple’s product images for marketing aren’t particularly exciting.

Instead of framing screenshots using just these images, I wanted to create mockups using different product images that are more distinctive and, in some cases, three-dimensional. The result is Mocktail, a shortcut that creates framed iOS screenshots using various device images I’ve sourced from Apple’s website (e.g., product landing pages or the online store). Where necessary, Mocktail applies perspective distortion to screenshots using Cloudinary, an online image manipulation API.

A three-dimensional mockup created using Mocktail.
Mocktail still performs traditional screenshot framing but includes some additional images to add variety.

Supported devices

Mocktail creates mockups for the following devices:

  • iPad Pro 2018 (11″ and 12.9″)
  • iPad Pro/Air 10.5″
  • iPad (including iPad mini1)
  • iPhone XS and XS Max
  • iPhone 8 and 8 Plus
  • Apple Watch Series 4 (40mm and 44mm)
  • Apple Watch Series 3 (38mm and 42mm)

Notably absent is the iPhone XR. There aren’t yet any usable images of the device to create mockups with (the only ones I could find aren’t of a high enough resolution), nor has Apple created a product marketing image for it. I hope to support the iPhone XR sometime in the future.

Using the shortcut

Mocktail can be run as a normal shortcut, from the share sheet2, or via drag-and-drop, and performs the following steps:

  1. Checks if any images were shared to the shortcut via the share sheet (or using drag-and-drop). If not, the shortcut displays a list of recent screenshots for you to select from. Multiple images can be shared to create a batch of mockups all at once.
  2. Checks that the required base images are available in iCloud Drive. If not (i.e., the shortcut is run for the first time), they are automatically downloaded and saved3. The shortcut then continues.
  3. Calculates the pixel count of the screenshot by multiplying its width and height. This is used to determine what device the screenshot was taken on.
  4. Determines the orientation of the screenshot (either landscape or portrait).
  5. Using the device information and orientation, the shortcut displays a list of suitable base images for you to select from.
  6. Applies rounded corners if the device requires it (e.g., iPad Pro or iPhone XS). For iPhone XS and XS Max, a notch is also added to the screenshot.
  7. For flat base images, the screenshot is overlaid onto the base image. For three-dimensional base images, Mocktail uses Cloudinary to apply perspective distortion to the screenshot, then overlays it onto the base image.
  8. Certain base images have a significant amount of white space. Where necessary, the shortcut crops the completed mockup.
  9. The completed mockup is saved to iCloud Drive. Each mockup is saved in a folder corresponding to the name of the device, such as /Shortcuts/Mocktail/iPhone XS Max.
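The detection in steps 3 and 4 boils down to a lookup table of native screenshot resolutions. Here is a Python sketch of the idea; the shortcut itself does this with dictionaries and Calculate actions, and the helper function below is illustrative, not part of Mocktail:

```python
# Map total pixel counts to devices. These are the native screenshot
# resolutions of the supported devices; width x height is the same in
# either orientation, so the product alone identifies the device.
DEVICES = {
    1125 * 2436: "iPhone XS",
    1242 * 2688: "iPhone XS Max",
    750 * 1334: "iPhone 8",
    1080 * 1920: "iPhone 8 Plus",
    1668 * 2388: "iPad Pro 11-inch",
    2048 * 2732: "iPad Pro 12.9-inch",
    1668 * 2224: "iPad Pro/Air 10.5-inch",
    1536 * 2048: "iPad",
}

def identify(width: int, height: int):
    """Return (device, orientation) for a screenshot, or None if unknown."""
    device = DEVICES.get(width * height)
    if device is None:
        return None
    orientation = "landscape" if width > height else "portrait"
    return device, orientation
```

Because the pixel count is orientation-independent, a single dictionary covers both portrait and landscape screenshots.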

Distorting images with Cloudinary

Mocktail uses Cloudinary’s upload and image manipulation APIs to apply perspective distortion to screenshots. You need to create a free Cloudinary account to use Mocktail. The free pricing tier is more than sufficient; you would need to run this shortcut several thousand times a month before exceeding the free plan.

When you first launch the shortcut, it asks you to provide the following information about your Cloudinary account which can be found in the Dashboard:

  • Username (your Cloudinary “cloud name”, not email address)
  • API key
  • Upload preset

You can find your Cloudinary username (cloud name) and API key in the Cloudinary console.

By default, Cloudinary requires uploads to be signed with the account’s secret key. Mocktail doesn’t do this, so you need to enable unsigned uploads and specify the upload preset in the shortcut. This randomly generated value is used to upload images without needing to sign them4.

The upload preset can be found in your Cloudinary account’s dashboard.

Once you have provided your Cloudinary details, you can begin using Mocktail to generate mockups.
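Under the hood, an unsigned upload is just a POST to Cloudinary’s upload endpoint, which the shortcut presumably performs with a Get Contents of URL action. A rough Python sketch of the request shape (the cloud name and preset are placeholders for your own values):

```python
# Shape of Cloudinary's unsigned upload request. Unsigned uploads need
# only the upload preset; the screenshot itself is sent as the `file`
# form field (multipart data or a base64 data URI).
def build_unsigned_upload(cloud_name: str, upload_preset: str):
    """Return the endpoint URL and form fields for an unsigned upload."""
    url = f"https://api.cloudinary.com/v1_1/{cloud_name}/image/upload"
    fields = {"upload_preset": upload_preset}
    return url, fields
```

No API secret ever leaves your device this way, which is why enabling unsigned uploads in the Cloudinary dashboard is a prerequisite.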

Mocktail is one of the most complex shortcuts I’ve created and it makes extensive use of dictionaries to store information. To figure out how screenshots should be distorted, I used Affinity Photo to draw lines along the sides of the display. I then added horizontal and vertical guides at the location where these lines intersected, providing me with the necessary X,Y coordinates required by Cloudinary.

Every image that would require a screenshot to be distorted was run through Affinity Photo to get the coordinates.
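Those measured corner points end up in Cloudinary’s distort transformation, whose `e_distort` effect takes the four corners clockwise from top-left. A Python sketch of building such a delivery URL (the cloud name, public ID, and coordinates are placeholders):

```python
# Turn four measured corner points into a Cloudinary distort URL.
# Corners run clockwise from top-left, matching e_distort's syntax.
def distort_url(cloud_name: str, public_id: str, corners) -> str:
    """Build a delivery URL that perspective-distorts an uploaded image."""
    coords = ":".join(str(v) for point in corners for v in point)
    return (
        f"https://res.cloudinary.com/{cloud_name}/image/upload/"
        f"e_distort:{coords}/{public_id}"
    )
```

After uploading a screenshot, Mocktail can fetch a URL like this to get back the distorted version, ready to overlay onto the base image.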

Mocktail is available from my GitHub repository of shortcuts. I prefer not to use iCloud links when sharing shortcuts because of their one-time-use limitation. Rather than generate a new link every time I update the shortcut, I can push an update to GitHub and the existing link still works (there are also the usual benefits of using a version control system).


  1. Both iPad and iPad mini are the same resolution so iPad mini screenshots are handled as iPad screenshots—there are no iPad mini-specific base images. 
  2. At the time of writing, the current version of Shortcuts—2.1.3—has a bug that can prevent image-based action extension shortcuts from working. If Mocktail doesn’t work from the share sheet, run it from within the app. iPad users can also drag-and-drop images into the shortcut. 
  3. Federico cleverly uses Base64 encoding to store all images as text within his shortcut. I decided against a similar method because it seemed to severely impact the performance of Shortcuts. Instead, the base images are made available as a ZIP file in my repository that the shortcut downloads and extracts. 
  4. Signing uploads would have been a more significant undertaking. I don’t think it’s necessary considering the use case. 

Changelog

1.0.1
  • Fix an issue with iPhone 8 Plus screenshots.
1.0.2
  • Add "Continue Shortcut in App" action as a workaround for limitations when running from the share sheet as an action extension.

BBC’s History of Video Game Music

BBC 6 Music has a fascinating and nostalgic two-part series on the history of video game music. There’s also a bonus episode with Charlie Brooker, creator of Black Mirror and former video games journalist, talking about his love of video games and sharing some of his favorite game music.

You can listen to all three episodes in the browser or using the BBC iPlayer Radio app for iOS.

Infant Formula Preparation with Shortcuts

Our two-month-old daughter is formula fed. My wife and I prepare a batch of bottles every day using this formula mixing pitcher so they’re readily available at feeding time. Some arithmetic is needed to work out how much formula powder to add to a certain volume of water, and the amount of formula we need to prepare steadily increases as she grows. To give our sleep-deprived brains a break and avoid any miscalculations, I created some shortcuts to help out and do the math for us.

Formula Calculator works out how much formula we should prepare for the day, based on our daughter’s current weight (in pounds and ounces). The general rule of thumb for babies up to six months old is to offer 2.5 ounces of formula per pound of body weight in a 24-hour period. Her weight is entered when running the shortcut, along with how many feeding sessions to expect that day1. The amount of formula to prepare, along with how much to fill each bottle with, is then displayed.

The Formula Calculator shortcut calculates how much formula we should prepare for the day.
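The arithmetic behind Formula Calculator is simple enough to sketch in a few lines of Python (the function and its names are mine, not part of the shortcut):

```python
# 2.5 fl oz of prepared formula per pound of body weight per 24 hours,
# split evenly across the day's expected feeding sessions.
def daily_formula(pounds: int, ounces: int, feeds: int):
    """Return (total fl oz for the day, fl oz per bottle)."""
    weight_lb = pounds + ounces / 16  # weight in decimal pounds
    total = 2.5 * weight_lb
    return total, total / feeds
```

For a 10 lb baby fed five times a day, that works out to 25 fl oz total, or 5 fl oz per bottle.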

This next shortcut, Formula Prep, is the one I use the most. It calculates how much formula powder to add to a specified amount of water. Most formula powder in the US specifies one scoop (8.7g) of powder for every 2 fluid ounces of water. I specify how much water is in the pitcher and it calculates the amount of formula powder to add—both in grams and scoops. I prefer to measure by weight as it’s all too easy to lose count of the scoops being added.

The Formula Prep shortcut takes a specified amount of water and works out how much formula powder I need to use.
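Formula Prep’s calculation can be sketched in Python the same way (again, the function is illustrative):

```python
# One 8.7 g scoop of powder per 2 fl oz of water.
def powder_for(water_fl_oz: float):
    """Return (grams of powder, scoops) for a given volume of water."""
    scoops = water_fl_oz / 2
    return scoops * 8.7, scoops
```

A 24 fl oz pitcher therefore needs 12 scoops, or about 104 g of powder, which is why weighing beats counting scoops.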

Prepared formula can be stored in the refrigerator for up to 24 hours. After that, it must be discarded. Formula Reminder is a shortcut I run once I’ve prepared formula that creates a reminder with an alarm set 24 hours later.

This shortcut creates a reminder so I don't forget to dispose of any leftover formula after 24 hours.
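Formula Reminder’s logic is just a 24-hour offset; a Python sketch (the shortcut itself presumably uses an Adjust Date action feeding a Reminders action):

```python
from datetime import datetime, timedelta

# Prepared formula keeps for 24 hours in the refrigerator, so the
# reminder's alarm is set exactly 24 hours after preparation.
def discard_time(prepared_at: datetime) -> datetime:
    """Return when the prepared formula must be discarded."""
    return prepared_at + timedelta(hours=24)
```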
  1. The number of feeding sessions can vary from day to day. We’ve been tracking our daughter’s feeds since birth and she’s currently averaging about six sessions per day.

Moment 72 Hour Holiday Sale

Moment is discounting all of their lenses, cases, and accessories by 20% and offering $5 shipping worldwide for the next three days when you use the code 72HOURSALE during checkout. Moment’s lenses are a core part of my iPhone photography kit and I highly recommend them.

Here are a few photos I’ve taken with Moment lenses:

Taken with iPhone X and Moment Macro lens.
Taken with iPhone X and Moment Wide lens.
Taken with iPhone X and Moment Wide lens.
Taken with iPhone X and Moment Macro lens.

If you’re thinking about getting started with Moment lenses, I recommend picking up the Wide lens first. It’s a versatile lens you’ll get a lot of use from and the one I like to use the most. You’ll also need one of their photo cases for your phone—this is what the lenses attach to.

For iPhone X and XS users1, there’s also a battery photo case with built-in shutter button. When used with Moment’s iOS camera app, the button supports half-press to focus. If you prefer to use any other camera app, it operates the same as a volume button to trigger the shutter.

Moment’s camera app has some advanced features, such as manual controls, and can also shoot in RAW. If you’re a stickler for EXIF data, you can select the Moment lens you’re using and the app embeds the information within the photo’s metadata.

  1. The case is MFi certified for iPhone X, though iPhone XS certification is still pending. I use the battery photo case with an iPhone XS and it works fine, and the case is expected to be certified in the near future.

Importing Instagram Photos Into Jekyll

I often post photos to Instagram or Unsplash. Now that I’m using my website for microblogging, I’ve started publishing my photos here as well. I’ll have more ownership over the content and, should either of these services ever go away, my photos will still be available.

I’ve also imported into my website a copy of all the photos I’ve posted to Instagram–about 1,200 photos spanning almost eight years. To do this, I requested an archive of all my Instagram data, copied the photos to my site, and generated all of the posts using Shortcuts.

The ZIP archive provided by Instagram contains a copy of everything uploaded, along with JSON files containing data about each post, comment, like, and more. While the archive contains all of this data, I was only interested in the photos.

Extracting the photos

The archive’s photos/ directory is neatly structured, with all photos organized into subdirectories using a YYYYMM date format (e.g., 201804/). The media.json file contains a photos dictionary, where each item contains information about each photo1:

  • path: The relative path to the photo.
  • location: The location the photo was tagged with. This is blank if no location was specified.
  • taken_at: The date the photo was posted.
  • caption: The caption of that photo. Similar to location, this is blank if no caption was included.
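Reading those fields out of media.json is straightforward. A Python sketch, with a sample item that mimics the archive’s structure using made-up values:

```python
import json

# A miniature stand-in for Instagram's media.json; the real file has
# the same shape with one item per photo.
sample = json.loads("""
{
  "photos": [
    {
      "path": "photos/201804/abc123.jpg",
      "location": "Central Park",
      "taken_at": "2018-04-14T10:30:00",
      "caption": "Spring in the park"
    }
  ]
}
""")

for photo in sample["photos"]:
    path = photo["path"]
    taken_at = photo["taken_at"]
    caption = photo.get("caption", "")    # may be blank
    location = photo.get("location", "")  # may be blank
```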

The first step was to import all the photos into the media/ directory of my website. I extracted the archive using Documents on my iPad, then created a new ZIP file containing just the photos/ directory. I opened this in Working Copy and extracted this new archive into my website’s git repository. After committing and pushing the changes, all of those photos were live and available to link to.

Creating the posts

Next, I copied media.json to iCloud Drive, then used Shortcuts (née Workflow) to create this shortcut that performs the following actions:

  • Loops through every item within the media.json file’s photos dictionary.
  • Gets the value for each item’s path, location, taken_at, and caption.
  • Creates a text file for each photo with the required Jekyll front matter using the values retrieved above, and sets the category to photo. The caption, if available, is included in the post.
  • Sets an appropriate name for the text file, based on the date information from taken_at.
  • Creates a ZIP file of all the text files that have been generated.
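The filename step can be sketched in Python; the naming scheme below is illustrative rather than exactly what my shortcut produces:

```python
# Derive a Jekyll post filename from a photo's taken_at timestamp,
# e.g. "2018-09-02T14:54:40" -> "2018-09-02-145440.md".
def post_filename(taken_at: str) -> str:
    date, time = taken_at.split("T")
    return f"{date}-{time.replace(':', '')}.md"
```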

This is an example of a text file that the shortcut generates:

---
layout: microblogpost
category: photo
date: '2018-09-02T14:54:40'
title: ''
slug: '18090212145440'
mf-photo:
  - https://www.jordanmerrick.com/wp-content/uploads/2018/11/e22680154ef0b993e870789b69764673.jpg
---
I love Central Park...

The photo URL is included in mf-photo as I’m using the same template I use for microblog posts, and that field was established when I set up the Micropub to GitHub service I use. You can easily change this to whatever you need.

Once complete, I opened the archive in Working Copy and extracted it into my site’s microblog/_posts/ directory.

Making changes to my Jekyll template

With my photos imported, I made a few small tweaks to my Jekyll template files. My microblog archive page displays a 10-word excerpt of the micropost’s text and uses it as the post’s link. However, many of my photos had no caption. As a result, there was no text to create an excerpt from, so Jekyll was skipping them and they weren’t being listed.

To make sure all my photos were listed on my archive page—and distinguish between plain text and photo posts—I added an emoji icon for any microblog posts that have the photo category set. I also edited the template for individual microblog posts to display the location information (along with the emoji pushpin symbol), if available.

Finally, I created additional JSON and RSS feeds that only include microblog posts with photos.

  1. Instagram treats posts with multiple photos as separate posts in the data archive. That was fine for me, as I’ve used only that feature maybe two or three times. 

Apple Watch Apps Are Dead, Long Live Apple Watch Apps

9to5Mac reports that Instapaper has dropped Apple Watch support:

Just two weeks after announcing it was going independent, popular read-it-later service Instapaper has updated its iOS application to remove support for Apple Watch. Instapaper was one of the first applications to ever support Apple Watch, launching its client on Apple Watch release day in 2015…

On Apple Watch, Instapaper allowed users to access text-to-speech playback of saved articles. The app also supported reorganizing articles, “liking” them, deleting or archiving, and more. While those features were originally hidden behind a $2.99 per month Premium upgrade, they became free in 2016 after Instapaper’s acquisition by Pinterest.

Instapaper is just the latest iOS app to drop support for its Apple Watch client. Earlier this year, Instagram killed off its Apple Watch application, as did Slack, Whole Foods, eBay, and several others.

The reasons Instapaper had for dropping Apple Watch support are similar to those we’ve heard before. Apple deprecated WatchKit 1.0 and requires existing apps to be updated, but app usage was so low that it wasn’t worth the effort.

But why was usage so low? Some see this as a sign that Apple Watch just isn’t a viable app platform, but I disagree. I think the main reason some apps suffered from poor adoption is that they simply lacked any meaningful purpose. Apps like Instapaper weren’t solving a particular problem or serving a need. As a result, they felt forced and unnecessary.

At WWDC in 1997, Steve Jobs responded to a question from the audience with one of his most memorable quotes:

You’ve got to start with the customer experience and work backwards to the technology. You can’t start with the technology and try to figure out where you can sell it.

This is as true today as it was 21 years ago, and I’d argue it explains why some Apple Watch apps didn’t take off. It isn’t because the platform isn’t viable, it’s simply that some developers started with the technology and tried to come up with a reason to use it. I use Instapaper across my iOS devices, but I never used the Apple Watch app because organizing, deleting, and liking articles with it never made any sense to me.

As watchOS matures, apps that don’t have a compelling purpose are disappearing. This is a good thing, because it leaves us with apps that are better suited for Apple Watch. However, I am thankful that Apple Watch apps like Instapaper existed in the first place, as they paved the way by showcasing functionality or demonstrating how versatile Apple Watch can be—even if the apps themselves weren’t successful.

Apple Removes Apps From Their Affiliate Program

Members of the iTunes Affiliate Program (myself included) received an email from Apple earlier today that announced iOS and Mac apps would no longer be included:

Thank you for participating in the affiliate program for apps. With the launch of the new App Store on both iOS and macOS and their increased methods of app discovery, we will be removing apps from the affiliate program. Starting on October 1st, 2018, commissions for iOS and Mac apps and in-app content will be removed from the program. All other content types (music, movies, books, and TV) remain in the affiliate program.

This stinks, especially as it comes less than 24 hours after Apple’s earnings call that announced yet another record quarter. Was that 7% rate really eating into their bottom line? I do find it interesting that the only content being dropped from the affiliate program is that which Apple takes a sizable cut of. iTunes Store and Books content remains, so why only apps? I can’t help but think it’s because Apple pays affiliates from their own 30% take, and they just don’t want to do it anymore.

Federico Viticci describes the move as downright hostile and petty, and I completely agree with him. This decision is a shitty one on Apple’s part, and it feels like it was made only with a balance sheet as consideration.

There are many great sites within the Apple community that contribute to app sales and adoption of Apple devices through app recommendations. This decision to end affiliate links hurts the very people who have had a noticeable influence on the purchase of apps. Eli Hodapp over at Touch Arcade, one of the most popular iOS game sites, isn’t even sure how the site can continue.

I can say with absolute certainty that the majority of apps I’ve purchased and enjoyed over the years have been through reviews and recommendations that used affiliate links. That’s how I, and many others, discover new apps. I enjoy reading app reviews on MacStories or hearing recommendations in an episode of Mac Power Users. One of my worries is that we’re going to see far fewer meaningful recommendations from the community. Sites like Touch Arcade are going to find it extremely difficult to survive, and other publications may no longer publish app recommendations at all.

Apple is one of the world’s richest companies with billions of dollars in the bank. Dropping apps from the affiliate program after all these years just feels like a dick move.

Adding Webmentions to Jekyll

I’ve added some basic support for webmentions to my Jekyll-powered site using webmention.io and this Jekyll plugin. If any of my posts are mentioned elsewhere and my site receives a webmention, it’s displayed below the post content.

Since Jekyll is a static site generator, the plugin can only check for new webmentions when the site is rebuilt. Netlify uses continuous deployment to keep my site up to date, so any time I commit a change and push it to GitHub, the site is automatically rebuilt and deployed. To supplement this, I also use IFTTT Webhooks to trigger a build every 24 hours, allowing my website to check for new webmentions on a daily basis.
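The daily trigger is nothing more than an HTTP POST from IFTTT to a Netlify build hook. A Python sketch of the request shape (the hook ID is a placeholder; your own is generated in Netlify’s deploy settings):

```python
# Netlify build hooks accept an empty POST and kick off a rebuild;
# IFTTT's Webhooks action fires one of these on a daily schedule.
def build_hook_request(hook_id: str):
    """Return the (method, URL) pair IFTTT fires to rebuild the site."""
    return "POST", f"https://api.netlify.com/build_hooks/{hook_id}"
```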

Although the plugin is easy to install and use, I ran into a hiccup when trying to work on my site locally. I’d normally use the following command to serve the site as I work on it, allowing me to see changes reflected:

bundle exec jekyll serve --limit_post 50

This uses the development Jekyll environment by default, which overwrites site.url with http://localhost:4000 (instead of using https://www.jordanmerrick.com). The webmentions plugin then attempts to retrieve webmentions for posts under that URL, not my site’s actual URL. As a result, no webmentions were being retrieved, so I couldn’t test locally.

As a workaround, I discovered that I needed to set Jekyll’s environment to production. This keeps site.url intact, allowing for webmentions to be properly retrieved:

JEKYLL_ENV=production bundle exec jekyll serve --limit_post 50

The plugin also supports sending webmentions, though I need to do a little more work to set that up. Sending webmentions is a separate command and not part of the build process. I do use a DigitalOcean instance for development, so I’m considering some sort of cronjob for handling outgoing webmentions.

Some Initial Observations About Shortcuts

Apple started accepting requests to download the new Shortcuts app earlier today. I received an invite this afternoon and have spent about an hour using the app. Here are some of my initial thoughts.

  • The app may have some new functionality and a fresh coat of paint, but it’s still very much the Workflow we know and love. The interface, how shortcuts (née workflows) are created, and the actions available are basically the same.
  • I don’t like how actions are listed. Shortcuts hides the groups of actions behind a set of suggested actions at first. To view all actions, you have to tap the Search field.
  • Shortcuts are run with just a single tap, not a double-tap. To view or edit a shortcut (or run it and see each action take place), tap •••.
  • Unlike Workflow, Shortcuts doesn’t show you each action step as it takes place. It hides this out of view, so a running shortcut doesn’t have that visual distraction.
  • I didn’t have any trouble adding some shortcuts to Siri. I was able to set a spoken phrase and run each of them without issue.
  • Siri Suggestions is an interesting feature. Based on your behavior, it offers a selection of actions that you’ve done before, such as view an article in Apple News or open an email you’ve recently read. These are actions you can’t replicate in Shortcuts, but they’re a bit limited in scope for the time being. I’m sure this will improve as time goes on.
  • There are several new system actions that can be used in Shortcuts:
    • Set Low Power Mode
    • Set Do Not Disturb
    • Set Airplane Mode
    • Set Bluetooth
    • Set Cellular Data
    • Set Wi-Fi
  • There are also some new actions that provide some more functionality in iOS:
    • Run JavaScript on Safari Web Page
    • Markup
    • Show Result
    • Send and Request Payments
    • Share with iCloud Photo Sharing
  • Some third-party actions that Workflow supported seem to no longer be available:
    • Trigger IFTTT applet
    • Giphy
  • Some of my workflows no longer work, though exactly why is a bit of a mystery. Granted, these are really complex workflows, but they run fine in Workflow. I need to dig deeper into Shortcuts to see what might be causing it.

Much more functionality is expected to come in subsequent betas and, eventually, the final release of Shortcuts. I’m already impressed with this first beta, and I can’t wait to see the finished product.

Update 2018-07-06: Restarting my iPad appears to have resolved the issue of some of my shortcuts not working.

My iPhone Photography Kit

Photography has long been a hobby of mine, and for the past few years I’ve pursued this using my iPhone. I’ve owned digital SLRs and mirrorless cameras in the past, but the iPhone eventually made a separate camera redundant. Nowadays, I shoot with an iPhone X, and it’s the best camera I’ve ever owned.

Latourell Falls in Oregon. Taken with iPhone X using Halide. Edited in Darkroom.

iPhone photography is more than just the performance of a CMOS sensor though. It’s also the ecosystem of third-party apps and accessories that can be used to help produce great photos. As I’ve become a more experienced iPhone photographer, some of these have become an essential part of my hobby.

Halide

I take a lot of photos using the built-in Camera app, but I use Halide ($5.99) whenever I want more control. The app has a range of options, such as ISO and focus, and supports RAW. Halide also provides full support for switching between the 1X or 2X lens of dual-lens iPhones1.

A highly polished camera app.

You can see how deeply the developer cares about iPhone photography as Halide is one of the most highly polished apps for iOS. One of my favorite features is the way it uses the curved corners at the top of the screen on the iPhone X. Instead of leaving those corners empty, the developer puts the space to good use, displaying a histogram and exposure information.

Darkroom

I’ve used dozens of iOS photo editing apps over the years, but Darkroom (Free, $7.99 to unlock all tools and filters) has been my app of choice for some time. It’s a fully featured iPhone app with a wide range of adjustments and filters, including support for RAW photos. Edits can also be saved as custom filters to use with other photos.

Darkroom's impressive array of adjustments.

Darkroom has deep integration with the iOS photo library and there’s no “intermediate” library you have to import and export photos with. Edited photos are labeled and can be easily filtered, and edits can be reverted directly in the app.

Darkroom keeps getting better and better, and the developers just updated the app with more filters and a framing tool. The one feature I do yearn for, however, is iPad support. For now, I edit all my photos on an iPhone X, but I’d really like to edit photos using the larger screen of my iPad (and maybe using Apple Pencil, too).

The developers of Darkroom and Halide have been collaborating to make their apps work more closely together. Both apps have a shortcut button to open the other app, which makes it a seamless experience to take a photo and immediately start editing it.

Moment lenses

I’m a huge fan of Moment lenses as they add another layer of creativity to iPhone photography. I own the macro, wide, and tele portrait lenses—along with an assortment of accessories—and have taken some really great shots with them.

Tele portrait, wide, and macro lenses. Also pictured: cleaning pen and lens case.

Moment’s mounting system is built into the iPhone photo case. To attach a lens, I just place it on a mount point and turn it clockwise. Since the iPhone X has two cameras, there are two mount points that the lenses can be mounted over.

A close-up of my dog’s eye. Photo taken with Moment macro lens and iPhone X.

The mounting system makes the photo case a little thicker than other iPhone cases, but it’s hardly noticeable. There’s even a place at the bottom of the photo case to attach a wrist strap. It’s actually a solid case that I use all the time to keep my iPhone X protected.

This was the first shot I took with the Moment wide lens and iPhone X.

If you’re interested in buying a Moment lens (or two), you can use my affiliate link to get 10% off your order.

DxO One

Due to the smaller size of the iPhone’s camera sensor, there are times when it just can’t match the performance of a regular camera. The DxO One ($465) is an iPhone accessory that’s a 20MP digital camera. It has a 1″ sensor—much larger than that found in the iPhone—which is the same one found in Sony’s advanced RX100 compact camera. As a result, the DxO One can produce some exceptional photos that are simply beyond the current reach of the iPhone.

DxO One attached to my iPhone X. This photo was taken with an iPad mini 4.

The DxO One app offers as much control as any camera, with the usual PASM options and full RAW support. It doesn’t have to be attached using the Lightning connector, as the DxO One can be connected over Wi-Fi, turning the iPhone into a wireless viewfinder.

I’ve written about the DxO One before, and it’s an accessory I still use, though not as much since I upgraded to the iPhone X and invested in Moment lenses. I mostly use the DxO One nowadays for night or long exposure photography. One of the photos I’m most proud of is this night shot of Manhattan, taken with the DxO One.

Manhattan. Taken in September 2016 with DxO One. f6.3, ISO 100, 8 seconds.

Joby Micro Tripod

The Joby Micro Tripod ($23) is a handy accessory, especially for night photography. I use it with the Glif so I can stand my iPhone X on something like a table or wall. When not in use, the Micro Tripod folds into the size of a memory stick.

A portable tripod solution.

The mounting point at the center also pivots, providing some flexibility in positioning. In a pinch, this combination has even come in handy to hold my iPhone at a comfortable viewing angle while I watched a movie on a flight.

Despite the diminutive size of the tripod, it’s very stable. I’ve used the Micro Tripod and Glif to hold my iPhone X with the DxO One attached.

Glif

The Glif ($30) by Studio Neat is a deceptively smart tool that every iPhone photographer should own. It’s a portable tripod mount that works with almost any phone and case combination, thanks to the way the jaws wrap around the device and the lever locks it in place.

The Glif attached to the Joby Micro Tripod.

The three mount points allow it to work in either portrait or landscape, and you can even attach other accessories, such as microphones or lights.

Anker and Yoozon monopods

Ok, these are technically selfie sticks, but hear me out. Selfie sticks get a pretty bad rap, mostly because of the obnoxious way a lot of people use them. Fundamentally though, selfie sticks are just handheld monopods, which are an extremely useful photography tool. I own two selfie sticks, one from Anker and the other from Yoozon.

Anker and Yoozon selfie sticks.

Both of them have a rechargeable Bluetooth shutter button, making it easy to take photos one-handed. If I want to take a photo of something up-close, I can just extend the stick and move my iPhone closer; I’ve been able to take some great photos of flowers and animals by using a selfie stick to get a bit closer than I normally would have been able to.

I used the Anker selfie stick to take an up-close shot of this butterfly.

I prefer to use the Anker stick most of the time, simply because the build quality is excellent. The Yoozon feels flimsy in comparison, but it has a few features that make it more useful in some situations. The handle of the Yoozon stick can open up into a tripod, saving the need to carry a separate one with me. In addition, the Bluetooth shutter button is also removable, so I can set up my iPhone and take photos without needing to touch it.

The Yoozon selfie stick used as a tripod.

If you’d like to see more of my photos, check out my photo blog at jordanmerrick.photos, or follow me on either Instagram or Unsplash.

  1. The built-in Camera app doesn’t always use the 2X lens when selected. Instead, the app might still use the 1X lens and apply digital zoom. Halide’s option is an explicit hardware choice, so selecting 2X means the 2X lens will be used.