Our two-month-old daughter is formula fed. My wife and I prepare a batch of bottles every day using this formula mixing pitcher so they’re readily available at feeding time. Some arithmetic is needed to work out how much formula powder to add to a certain volume of water, and the amount of formula we need to prepare steadily increases as she grows. To give our sleep-deprived brains a break and avoid any miscalculations, I created some shortcuts to help out and do the math for us.
Formula Calculator works out how much formula we should prepare for the day, based on our daughter’s current weight (in pounds and ounces). The general rule of thumb for babies up to six months old is to offer 2.5 ounces of formula per pound of body weight in a 24-hour period. Her weight is entered when running the shortcut, along with how many feeding sessions to expect that day¹. The amount of formula to prepare, along with how much to fill each bottle with, is then displayed.
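The arithmetic behind the shortcut is simple enough to sketch in a few lines of Python (the function name and rounding are my own choices, not part of the shortcut itself):

```python
# Sketch of Formula Calculator's math: 2.5 fl oz of prepared formula
# per pound of body weight per 24 hours, split across feeding sessions.

def daily_formula(weight_lb: int, weight_oz: int, sessions: int) -> tuple[float, float]:
    """Return (total fl oz to prepare, fl oz per bottle)."""
    weight = weight_lb + weight_oz / 16  # 16 ounces to a pound
    total = 2.5 * weight
    per_bottle = total / sessions
    return round(total, 1), round(per_bottle, 1)

# A 9 lb 8 oz baby feeding six times a day:
print(daily_formula(9, 8, 6))  # (23.8, 4.0)
```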
This next shortcut, Formula Prep, is the one I use the most. It calculates how much formula powder to add to a specified amount of water. Most formula powder in the US specifies one scoop (8.7g) of powder for every 2 fluid ounces of water. I specify how much water is in the pitcher and it calculates the amount of formula powder to add, both in grams and scoops. I prefer to measure by weight as it’s all too easy to lose count of the scoops being added.
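Formula Prep’s math is just as easy to sketch (again, the names are mine, and the 8.7 g per scoop figure comes from the formula label, so check yours):

```python
# Sketch of Formula Prep's math: one 8.7 g scoop per 2 fl oz of water.
GRAMS_PER_SCOOP = 8.7
WATER_OZ_PER_SCOOP = 2.0

def powder_for_water(water_oz: float) -> tuple[float, float]:
    """Return (scoops, grams) of powder for a given volume of water."""
    scoops = water_oz / WATER_OZ_PER_SCOOP
    grams = scoops * GRAMS_PER_SCOOP
    return round(scoops, 1), round(grams, 1)

print(powder_for_water(24))  # 24 fl oz of water -> (12.0, 104.4)
```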
Prepared formula can be stored in the refrigerator for up to 24 hours. After that, it must be discarded. Formula Reminder is a shortcut I run once I’ve prepared formula that creates a reminder with an alarm set 24 hours later.
The number of feeding sessions can vary from day to day. We’ve been tracking our daughter’s feeds since birth and she’s currently averaging about six sessions per day. ↩
Moment is discounting all of their lenses, cases, and accessories by 20% and offering $5 shipping worldwide for the next three days when you use the code 72HOURSALE during checkout. Moment’s lenses are a core part of my iPhone photography kit and I highly recommend them.
Here are a few photos I’ve taken with Moment lenses:
If you’re thinking about getting started with Moment lenses, I recommend picking up the Wide lens first. It’s a versatile lens you’ll get a lot of use from and the one I like to use the most. You’ll also need one of their photo cases for your phone—this is what the lenses attach to.
For iPhone X and Xs users¹, there’s also a battery photo case with a built-in shutter button. When used with Moment’s iOS camera app, the button supports half-press to focus. If you prefer to use any other camera app, the button operates like a volume button to trigger the shutter.
Moment’s camera app has some advanced features, such as manual controls, and can also shoot in RAW. If you’re a stickler for EXIF data, you can select the Moment lens you’re using and the app embeds the information within the photo’s metadata.
The case is MFi certified for iPhone X, though iPhone Xs certification is still pending. I use the battery photo case with an iPhone Xs and it works fine, and the case is expected to be certified in the near future. ↩
I often post photos to Instagram or Unsplash. Now that I’m using my website for microblogging, I’ve started publishing my photos here as well. I’ll have more ownership over the content and, should either of these services ever go away, my photos will still be available.
I’ve also imported into my website a copy of all the photos I’ve posted to Instagram: about 1,200 photos spanning almost eight years. To do this, I requested an archive of all my Instagram data, copied the photos to my site, and generated all of the posts using Shortcuts.
The ZIP archive provided by Instagram contains a copy of everything uploaded, along with JSON files containing data about each post, comment, like, and more. While the archive contains all of this data, I was only interested in the photos.
Extracting the photos
The archive’s photos/ directory is neatly structured, with all photos organized into subdirectories using a YYYYMM date format (e.g., 201804/). The media.json file contains a photos dictionary, with an entry describing each photo¹:
path: The relative path to the photo.
location: The location the photo was tagged with. This is blank if no location was specified.
taken_at: The date the photo was posted.
caption: The caption of that photo. Similar to location, this is blank if no caption was included.
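Put together, an entry in media.json looks roughly like this (the field names are as described above, but the values here are illustrative, not from my actual archive):

```json
{
  "photos": [
    {
      "path": "photos/201804/abc123.jpg",
      "location": "Central Park, New York",
      "taken_at": "2018-04-15T10:30:00",
      "caption": "Spring in the park"
    }
  ]
}
```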
The first step was to import all the photos into the media/ directory of my website. I extracted the archive using Documents on my iPad, then created a new ZIP file containing just the photos/ directory. I opened this in Working Copy and extracted the new archive into my website’s git repository. After committing and pushing the changes, all of those photos were live and available to link to.
Creating the posts
Next, I copied media.json to iCloud Drive, then used Shortcuts (née Workflow) to create this shortcut that performs the following actions:
Loops through every item within the media.json file’s photos dictionary.
Gets the value for each item’s path, location, taken_at, and caption.
Creates a text file for each photo with the required Jekyll front matter using the values retrieved above, and sets the category to photo. The caption, if available, is included in the post.
Sets an appropriate name for the text file, based on the date information from taken_at.
Creates a ZIP file of all the text files that have been generated.
This is an example of a text file that the shortcut generates:
```
---
layout: microblogpost
category: photo
date: '2018-09-02T14:54:40'
title: ''
slug: '18090212145440'
mf-photo:
- https://www.jordanmerrick.com/wp-content/uploads/2018/11/e22680154ef0b993e870789b69764673.jpg
---
I love Central Park...
```
The photo URL is included in mf-photo as I’m using the same template I use for microblog posts, and that field was established when I set up the Micropub to GitHub service I use. You can easily change this to whatever you need.
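For anyone who’d rather script this than build the shortcut, here’s a rough Python equivalent of what it does. The front matter follows my setup, but the base URL, slug format, and function name are assumptions for illustration; the real shortcut builds the photo URL from wherever the images live on your site:

```python
# Rough equivalent of the shortcut: read media.json and write a Jekyll
# micropost file per photo. Adjust layout, URL, and slug for your site.
import json
from datetime import datetime
from pathlib import Path

def make_posts(media_json: str, out_dir: str) -> list[Path]:
    """Generate a Jekyll micropost file for each photo in media.json."""
    photos = json.loads(Path(media_json).read_text())["photos"]
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    created = []
    for photo in photos:
        taken = datetime.fromisoformat(photo["taken_at"])
        slug = taken.strftime("%y%m%d%H%M%S")  # assumed slug scheme
        front_matter = "\n".join([
            "---",
            "layout: microblogpost",
            "category: photo",
            f"date: '{photo['taken_at']}'",
            "title: ''",
            f"slug: '{slug}'",
            "mf-photo:",
            f"- https://example.com/{photo['path']}",  # swap in your media URL
            "---",
            photo.get("caption", ""),
        ])
        post = out / f"{taken.strftime('%Y-%m-%d')}-{slug}.md"
        post.write_text(front_matter + "\n")
        created.append(post)
    return created
```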
Once complete, I opened the archive in Working Copy and extracted it into my site’s microblog/_posts/ directory.
Making changes to my Jekyll template
With my photos imported, I made a few small tweaks to my Jekyll template files. My microblog archive page displays a 10-word excerpt of the micropost’s text and uses it as the post’s link. However, many of my photos had no caption. As a result, there was no text to create an excerpt from, so Jekyll was skipping them and they weren’t being listed.
To make sure all my photos were listed on my archive page—and distinguish between plain text and photo posts—I added an emoji icon for any microblog posts that have the photo category set. I also edited the template for individual microblog posts to display the location information (along with the emoji pushpin symbol), if available.
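In a Jekyll template, both tweaks come down to simple Liquid conditionals. This is only a sketch (my actual markup differs, and the variable names assume the front matter shown earlier):

```liquid
{% if post.categories contains "photo" %}📷{% endif %} {{ post.title }}
{% if page.location %}📍 {{ page.location }}{% endif %}
```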
Finally, I created additional JSON and RSS feeds that only include microblog posts with photos.
Instagram treats posts with multiple photos as separate posts in the data archive. That was fine for me, as I’ve used only that feature maybe two or three times. ↩
9to5Mac reports that Instapaper has dropped Apple Watch support:
Just two weeks after announcing it was going independent, popular read-it-later service Instapaper has updated its iOS application to remove support for Apple Watch. Instapaper was one of the first applications to ever support Apple Watch, launching its client on Apple Watch release day in 2015…
On Apple Watch, Instapaper allowed users to access text-to-speech playback of saved articles. The app also supported reorganizing articles, “liking” them, deleting or archiving, and more. While those features were originally hidden behind a $2.99 per month Premium upgrade, they became free in 2016 after Instapaper’s acquisition by Pinterest.
Instapaper is just the latest iOS app to drop support for its Apple Watch client. Earlier this year, Instagram killed off its Apple Watch application, as did Slack, Whole Foods, eBay, and several others.
But why was usage so low? Some see this as a sign that Apple Watch just isn’t a viable app platform, but I disagree. I think the main reason some apps suffered from poor adoption is that they simply lacked any meaningful purpose. Apps like Instapaper weren’t solving a particular problem or serving a need. As a result, they felt forced and unnecessary.
You’ve got to start with the customer experience and work backwards to the technology. You can’t start with the technology and try to figure out where you can sell it.
This is as true today as it was 21 years ago, and I’d argue it explains why some Apple Watch apps didn’t take off. It isn’t that the platform isn’t viable; it’s simply that some developers started with the technology and tried to come up with a reason to use it. I use Instapaper across my iOS devices, but I never used the Apple Watch app because organizing, deleting, and liking articles on it never made any sense to me.
As watchOS matures, apps that don’t have a compelling purpose are disappearing. This is a good thing, because it leaves us with apps that are better suited for Apple Watch. However, I am thankful that Apple Watch apps like Instapaper existed in the first place, as they paved the way by showcasing functionality or demonstrating how versatile Apple Watch can be—even if the apps themselves weren’t successful.
Members of the iTunes Affiliate Program (myself included) received an email from Apple earlier today that announced iOS and Mac apps would no longer be included:
Thank you for participating in the affiliate program for apps. With the launch of the new App Store on both iOS and macOS and their increased methods of app discovery, we will be removing apps from the affiliate program. Starting on October 1st, 2018, commissions for iOS and Mac apps and in-app content will be removed from the program. All other content types (music, movies, books, and TV) remain in the affiliate program.
This stinks, especially as it comes less than 24 hours after Apple’s earnings call that announced yet another record quarter. Was that 7% rate really eating into their bottom line? I do find it interesting that the only content being dropped from the affiliate program is that which Apple takes a sizable cut of. iTunes Store and Books content remains, so why only apps? I can’t help but think it’s because Apple pays affiliates from its own 30% take, and it just doesn’t want to do that anymore.
Federico Viticci describes the move as downright hostile and petty, and I completely agree with him. This decision is a shitty one on Apple’s part, and it feels like it was made only with a balance sheet as consideration.
There are many great sites within the Apple community that contribute to app sales and adoption of Apple devices through app recommendations. This decision to end affiliate links hurts the very people who have had a noticeable influence on the purchase of apps. Eli Hodapp over at Touch Arcade, one of the most popular iOS game sites, isn’t even sure how the site can continue.
I can say with absolute certainty that the majority of apps I’ve purchased and enjoyed over the years have been through reviews and recommendations that used affiliate links. That’s how I, and many others, discover new apps. I enjoy reading app reviews on MacStories or hearing recommendations in an episode of Mac Power Users. One of my worries is that we’re going to see far fewer meaningful recommendations from the community. Sites like Touch Arcade are going to find it extremely difficult to survive, and other publications may no longer publish app recommendations at all.
Apple is one of the world’s richest companies with billions of dollars in the bank. Dropping apps from the affiliate program after all these years just feels like a dick move.
I’ve added some basic support for webmentions to my Jekyll-powered site using webmention.io and this Jekyll plugin. If any of my posts are mentioned elsewhere and my site receives a webmention, it’s displayed below the post content.
Since Jekyll is a static site generator, the plugin can only check for new webmentions when the site is rebuilt. Netlify uses continuous deployment to keep my site up to date, so any time I commit a change and push it to GitHub, the site is automatically rebuilt and deployed. To supplement this, I also use IFTTT Webhooks to trigger a build every 24 hours, allowing my website to check for new webmentions on a daily basis.
Although the plugin is easy to install and use, I ran into a hiccup when trying to work on my site locally. I’d normally use the following command to serve the site as I work on it, allowing me to see changes reflected:
bundle exec jekyll serve --limit_posts 50
This uses the development Jekyll environment by default, which overwrites site.url with http://localhost:4000 (instead of using https://www.jordanmerrick.com). The webmentions plugin then attempts to retrieve webmentions for posts under that URL, not my site’s actual URL. As a result, no webmentions were being retrieved, so I couldn’t test locally.
As a workaround, I discovered that I needed to set Jekyll’s environment to production, which is done by running the serve command with the JEKYLL_ENV=production environment variable set. This keeps site.url intact, allowing webmentions to be properly retrieved.
The plugin also supports sending webmentions, though I need to do a little more work to set that up. Sending outgoing webmentions is a separate command and not part of the build process. I do use a DigitalOcean instance for development, so I’m considering some sort of cron job for handling outgoing webmentions.
Apple started accepting requests to download the new Shortcuts app earlier today. I received an invite this afternoon and have spent about an hour using the app. Here are some of my initial thoughts.
The app may have some new functionality and a fresh coat of paint, but it’s still very much the Workflow we know and love. The interface, how shortcuts (née workflows) are created, and the actions available are basically the same.
I don’t like how actions are listed. Shortcuts hides the groups of actions behind a set of suggested actions at first. To view all actions, you have to tap the Search field.
Shortcuts are run with just a single tap, not a double-tap. To view or edit a shortcut (or run it and see each action take place), tap •••.
Unlike Workflow, Shortcuts doesn’t show you each action step as it takes place. It hides this out of view, so a running shortcut doesn’t have that visual distraction.
I didn’t have any trouble adding some shortcuts to Siri. I was able to set a spoken phrase and run each of them without issue.
Siri Suggestions is an interesting feature. Based on your behavior, it offers a selection of actions that you’ve done before, such as viewing an article in Apple News or opening an email you’ve recently read. These are actions you can’t replicate in Shortcuts, but they’re a bit limited in scope for the time being. I’m sure this will improve as time goes on.
There are several new system actions that can be used in Shortcuts:
Set Low Power Mode
Set Do Not Disturb
Set Airplane Mode
Set Cellular Data
There are also some new actions that provide some more functionality in iOS:
Send and Request Payments
Share with iCloud Photo Sharing
Some third-party actions that Workflow supported seem to no longer be available:
Trigger IFTTT applet
Some of my workflows no longer work, though exactly why is a bit of a mystery. Granted, these are really complex workflows, but they run fine in Workflow. I need to dig deeper into Shortcuts to see what might be causing it.
Much more functionality is expected to come in subsequent betas and, eventually, the final release of Shortcuts. I’m already impressed with this first beta, and I can’t wait to see the finished product.
Update 2018-07-06: Restarting my iPad appears to have resolved the issue of some of my shortcuts not working.
Photography has long been a hobby of mine, and for the past few years I’ve pursued this using my iPhone. I’ve owned digital SLRs and mirrorless cameras in the past, but the iPhone eventually made a separate camera redundant. Nowadays, I shoot with an iPhone X, and it’s the best camera I’ve ever owned.
iPhone photography is more than just the performance of a CMOS sensor though. It’s also the ecosystem of third-party apps and accessories that can be used to help produce great photos. As I’ve become a more experienced iPhone photographer, some of these have become an essential part of my hobby.
I take a lot of photos using the built-in Camera app, but I use Halide ($5.99) whenever I want more control. The app has a range of options, such as ISO and focus, and supports RAW. Halide also provides full support for switching between the 1X and 2X lenses of dual-lens iPhones¹.
You can see how deeply the developer cares about iPhone photography as Halide is one of the most highly polished apps for iOS. One of my favorite features is the way it uses the curved corners at the top of the screen on the iPhone X. Instead of leaving those corners empty, the developer puts the space to good use, displaying a histogram and exposure information.
I’ve used dozens of iOS photo editing apps over the years, but Darkroom (Free, $7.99 to unlock all tools and filters) has been my app of choice for some time. It’s a fully featured iPhone app with a wide range of adjustments and filters, including support for RAW photos. Edits can also be saved as custom filters to use with other photos.
Darkroom has deep integration with the iOS photo library and there’s no “intermediate” library you have to import and export photos with. Edited photos are labeled and can be easily filtered, and edits can be reverted directly in the app.
Darkroom keeps getting better and better, and the developers just updated the app with more filters and a framing tool. The one feature I do yearn for, however, is iPad support. For now, I edit all my photos on an iPhone X, but I’d really like to edit photos using the larger screen of my iPad (and maybe using Apple Pencil, too).
The developers of Darkroom and Halide have been collaborating to make their apps work more closely together. Both apps have a shortcut button to open the other app, which makes it a seamless experience to take a photo and immediately start editing it.
I’m a huge fan of Moment lenses as they add another layer of creativity to iPhone photography. I own the macro, wide, and tele portrait lenses—along with an assortment of accessories—and have taken some really great shots with them.
Moment’s mounting system is built into the iPhone photo case. To attach a lens, I just place it on a mount point and turn it clockwise. Since the iPhone X has two cameras, there are two mount points, one over each lens.
The mounting system makes the photo case a little thicker than other iPhone cases, but it’s hardly noticeable. There’s even a place at the bottom of the photo case to attach a wrist strap. It’s actually a solid case that I use all the time to keep my iPhone X protected.
If you’re interested in buying a Moment lens (or two), you can use my affiliate link to get 10% off your order.
Due to the smaller size of the iPhone’s camera sensor, there are times when it just can’t match the performance of a regular camera. The DxO One ($465) is an iPhone accessory that’s a 20MP digital camera in its own right. It has a 1″ sensor, much larger than the iPhone’s, and the same one found in Sony’s advanced RX100 compact camera. As a result, the DxO One can produce some exceptional photos that are simply beyond the current reach of the iPhone.
The DxO One app offers as much control as any camera, with the usual PASM options and full RAW support. It doesn’t have to be attached using the Lightning connector, as the DxO One can be connected over Wi-Fi, turning the iPhone into a wireless viewfinder.
I’ve written about the DxO One before, and it’s an accessory I still use, though not as much since I upgraded to the iPhone X and invested in Moment lenses. I mostly use the DxO One nowadays for night or long exposure photography. One of the photos I’m most proud of is this night shot of Manhattan, taken with the DxO One.
Joby Micro Tripod
The Joby Micro Tripod ($23) is a handy accessory, especially for night photography. I use it with the Glif so I can stand my iPhone X on something like a table or wall. When not in use, the Micro Tripod folds into the size of a memory stick.
The mounting point at the center also pivots, providing some flexibility in positioning. In a pinch, this combination has even come in handy to hold my iPhone at a comfortable viewing angle while I watched a movie on a flight.
Despite the diminutive size of the tripod, it’s very stable. I’ve used the Micro Tripod and Glif to hold my iPhone X with the DxO One attached.
The Glif ($30) by Studio Neat is a deceptively smart tool that every iPhone photographer should own. It’s a portable tripod mount that works with almost any phone and case combination, thanks to the way the jaws wrap around the device and the lever locks it in place.
The three mount points allow it to work in either portrait or landscape orientation, and you can even attach other accessories, such as microphones or lights.
Anker and Yoozon monopods
OK, these are technically selfie sticks, but hear me out. Selfie sticks get a pretty bad rap, mostly because of the obnoxious way a lot of people use them. Fundamentally, though, selfie sticks are just handheld monopods, which are an extremely useful photography tool. I own two selfie sticks, one from Anker and the other from Yoozon.
Both of them have a rechargeable Bluetooth shutter button, making it easy to take photos one-handed. If I want to take a photo of something up close, I can just extend the stick and move my iPhone closer; I’ve been able to take some great photos of flowers and animals by using a selfie stick to get a bit closer than I normally would have been able to.
I prefer to use the Anker stick most of the time, simply because the build quality is excellent. The Yoozon feels flimsy in comparison, but it has a few features that make it more useful in some situations. The handle of the Yoozon stick can open up into a tripod, saving the need to carry a separate one with me. In addition, the Bluetooth shutter button is also removable, so I can set up my iPhone and take photos without needing to touch it.
The built-in Camera app doesn’t always use the 2X lens when selected. Instead, the app might still use the 1X lens and apply digital zoom. Halide’s option is an explicit hardware choice, so selecting 2X means the 2X lens will be used. ↩
Once again, Workflow to the rescue! I’ve created this workflow for publishing microposts on my Jekyll-powered blog that can also include photos. Instead of relying on the Micropub to GitHub service, it uses GitHub’s API to directly upload and commit the micropost (and attached photos) to my blog’s repository. This automatically triggers a site deploy on Netlify, making the new micropost and photos available a few moments later.
When run, the workflow does the following:
Prompts me to write a new micropost.
If the workflow was not run as an action extension, it asks if I want to add any photos (the workflow checks if any photos were shared when run). I can then select photos from my library.
Resizes all photos to 1000px wide. Photos that are already smaller than 1000px are not resized. I don’t want to share full-size photos, so reducing the size suits my needs.
Asks me to confirm or change the file name for each photo.
Uploads the photos to my GitHub repository and commits the changes, triggering a deploy with Netlify.
Creates the micropost text file with the composed text and current date. If photos have been included, Markdown-formatted photo links are added at the end of the text file. This text file is uploaded to GitHub and committed. Netlify then does another deploy, making the post available.
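The upload-and-commit steps use GitHub’s “create file” contents endpoint. Here’s a sketch in Python of the request the workflow effectively makes; the helper name is mine, while the endpoint and payload shape come from GitHub’s REST API:

```python
# Sketch of committing a file via GitHub's contents API:
# PUT /repos/{username}/{repo}/contents/{path} with a base64-encoded body.
import base64
import json
import urllib.request

def build_commit_request(token, username, repo, path, data: bytes, message):
    """Build the PUT request that creates `path` in the repository."""
    url = f"https://api.github.com/repos/{username}/{repo}/contents/{path}"
    body = json.dumps({
        "message": message,
        "content": base64.b64encode(data).decode("ascii"),
    }).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        method="PUT",
        headers={"Authorization": f"token {token}"},
    )

# urllib.request.urlopen(build_commit_request(...)) would perform the commit.
```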
Although I created this workflow primarily as a way to publish photos, I can also use it to quickly publish text-only microposts. What’s more, I can even dictate new microposts and publish them straight from my Apple Watch.
If you wish to customize the workflow, change the following parameters in the Dictionary action at the top of the workflow:
token: A GitHub token that has read/write access to your repositories.
username: Your GitHub username.
repo: Your site’s repository.
directory: The directory to save images to.
site_url: The URL to the website.
post_dir: The directory where micropost text files should be saved to.
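Filled in, the Dictionary looks something like this (all placeholder values, obviously not real credentials or my actual paths):

```json
{
  "token": "<your-github-token>",
  "username": "example-user",
  "repo": "example-site",
  "directory": "media",
  "site_url": "https://example.com",
  "post_dir": "microblog/_posts"
}
```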
The configuration options have been created based upon my implementation of microposts. Depending on how (and where) you publish microposts, you might need to make some additional changes to the workflow.
I didn’t want existing followers to start seeing microposts in their feed readers without opting in, so the current RSS and JSON feeds continue to be used for blog posts only. If you’re an existing follower and also want to receive microposts, you can either use the new feed for all posts or subscribe to the additional feed for microposts.
Thanks to this helpful guide by Fiona Voss about microblogging with Jekyll (the static site generator this site uses), I’ve also set up a Micropub to GitHub service. This creates a Micropub endpoint that converts requests into a Jekyll-compatible format and commits them directly to my blog’s GitHub repository. With this endpoint, I can use the official Micro.blog app (or any app that supports Micropub) to write and publish posts.
Once a new micropost is added and committed to the repository, a deploy is automatically triggered over at Netlify that publishes the latest changes.