Developing Cinescope v2
Process | Insights
Now that Cinescope v2 has finally been released 🎊 I thought it would be fun to reflect on the development process. If you’re interested in design-related things, check out the companion article in which I dive into my journey redesigning the app to incorporate all of its new features.
In 2014 I began collaborating with DP Rachel Morrison to build Cinescope. She was looking to create a camera app that would allow her to quickly scout locations and capture imagery using aspect ratios common to the film industry.
Version 1 of Cinescope was successful in meeting its goals but a lot has changed with Apple products in the last five years. iPhone hardware is more powerful than ever and iOS developer tools expose additional nuanced control over the camera. Our users have been asking for Cinescope to support these evolving features and for the past two years Rachel and I have been working in our spare time to redesign and rebuild Cinescope to meet those needs.
A note about migrating from Objective-C to Swift…
The decision to rewrite Cinescope in Swift was a no-brainer for me. Although it was technically unnecessary, since all of the new features and functionality could have been implemented in Objective-C, I believe the codebase is now more concise, extensible, and easier to understand. The upfront effort has a clear payoff: future me — or whoever takes over the project — will be thankful for a more approachable and maintainable codebase.
Enumerations are Elemental
Enums are so versatile that they are my go-to solution for representing the elemental building blocks of my apps. Their concise nature makes it easy to encapsulate things like app state and other bits of related information, along with the operators that manipulate that information.
UI Strings — One enum pattern I use practically everywhere is associating UI strings (like button titles or switch values) with their underlying data counterparts. Rather than dumping all of my strings into a global file or embedding them in the VCs, I add a helper method to an enum. For example, I extended AVCaptureDevice’s FlashMode enum to return label values that support Cinescope’s design of the flash setting.
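In sketch form, the pattern looks like this (the label strings here are placeholders, not Cinescope’s actual copy):

```swift
import AVFoundation

// Extending AVCaptureDevice.FlashMode to vend display strings keeps the
// UI copy next to the data it represents. (Label text is illustrative.)
extension AVCaptureDevice.FlashMode {
    var label: String {
        switch self {
        case .off:  return "Off"
        case .on:   return "On"
        case .auto: return "Auto"
        @unknown default: return "Auto"
        }
    }
}
```

A settings row can then display the current mode’s `.label` directly, without consulting a separate strings table.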
Color Themes—Here’s an example of using enums and tuples to represent a color theming solution within a single unit of information. With this structure in place my StyleService can expose a darkColor and a lightColor property to return the user’s current theme—which gets stored in UserDefaults.
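A minimal sketch of that idea, with type and color values that are my own placeholders rather than Cinescope’s real API:

```swift
import Foundation

// Each theme bundles its palette in a tuple, so a theme is a single
// unit of information. (Hex strings stand in for real color objects.)
enum Theme: String {
    case classic, noir

    var colors: (dark: String, light: String) {
        switch self {
        case .classic: return (dark: "#1C1C1E", light: "#F2F2F7")
        case .noir:    return (dark: "#000000", light: "#FFFFFF")
        }
    }
}

// A StyleService-style accessor backed by UserDefaults.
struct StyleService {
    static var current: Theme {
        get { Theme(rawValue: UserDefaults.standard.string(forKey: "theme") ?? "") ?? .classic }
        set { UserDefaults.standard.set(newValue.rawValue, forKey: "theme") }
    }
    static var darkColor: String { current.colors.dark }
    static var lightColor: String { current.colors.light }
}
```

On iOS the tuple members would be UIColor values instead of hex strings; the shape of the pattern is the same.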
Fonts—I expose my app’s fonts through an enum as well. Taking advantage of associated values allows me to create a self-documenting and concise interface. These kinds of patterns ensure a consistent usage convention throughout the app. Since my StyleService is the single source of truth for all things related to initializing the look-and-feel of the app, there’s only one place to look for bugs or to make a style change.
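Something like the following, with hypothetical case names and font choices (on iOS the enum would vend a UIFont; here it vends the name and size so the shape of the interface is visible):

```swift
import Foundation

// Associated values make each call site self-documenting:
// label.font = AppFont.title(size: 24).uiFont, and so on.
enum AppFont {
    case title(size: CGFloat)
    case body(size: CGFloat)
    case mono(size: CGFloat)

    var name: String {
        switch self {
        case .title: return "AvenirNext-Bold"
        case .body:  return "AvenirNext-Regular"
        case .mono:  return "Menlo-Regular"
        }
    }

    var size: CGFloat {
        switch self {
        case .title(let s), .body(let s), .mono(let s): return s
        }
    }

    // On iOS this would be:
    // var uiFont: UIFont { UIFont(name: name, size: size)! }
}
```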
One More Thing—Swift now has a CaseIterable protocol that you can add to your enums which will automagically expose a collection of your cases…no more writing your own iteration solution. Thanks, Swift! 👍
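For instance, with a hypothetical enum (not one of Cinescope’s):

```swift
// CaseIterable synthesizes `allCases` in declaration order, which is
// handy for building pickers or settings rows without hand-written iteration.
enum CaptureFormat: String, CaseIterable {
    case jpeg, heif, raw
}

let labels = CaptureFormat.allCases.map { $0.rawValue.uppercased() }
// labels == ["JPEG", "HEIF", "RAW"]
```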
Structs are Special
I think of structs as super-enums or mini-classes—depending on the day—because in my brain they exist half-way between the two types. The key thing to understand about structs is that they are value types: they’re copied on assignment, so mutating one instance can’t produce side effects elsewhere in the app. This design fits perfectly with a service-based architecture, which is the pattern I use for all of my projects. In fact, Apple recommends favoring structs and protocols over classes to represent information and behavior.
Aspect Ratios—An example of this in action within Cinescope is how I model aspect ratios. The new version of the app supports storing both standard and user-defined ratios. There is business logic specific to each type, so I used a protocol to establish the basic interface and then extended structs adopting that protocol to support the required functionality.
Note: The Codable protocol is a necessary addition for me to be able to store AspectRatio objects and AspectRatio arrays in UserDefaults.
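A sketch of the protocol-plus-struct approach (the type names and properties are assumptions of mine, not Cinescope’s actual model):

```swift
import Foundation

// The protocol establishes the basic interface; Codable conformance lets
// each ratio be encoded to Data for storage in UserDefaults.
protocol AspectRatio: Codable {
    var width: Double { get }
    var height: Double { get }
    var label: String { get }
}

extension AspectRatio {
    // Business logic shared by every ratio lives in the protocol extension.
    var value: Double { width / height }
}

// Standard, film-industry ratios label themselves from their dimensions.
struct StandardRatio: AspectRatio {
    let width: Double
    let height: Double
    var label: String { "\(width):\(height)" }
}

// User-defined ratios carry a custom name alongside their dimensions.
struct CustomRatio: AspectRatio {
    let width: Double
    let height: Double
    let name: String
    var label: String { name }
}
```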
Framework Quirks
I encountered a few framework quirks during development. They ranged from the perplexing to the frustrating to the downright maddening. Here are two of my favorites!
iCloud and the Camera Roll—Cinescope includes a media library that allows users to crop photos & videos, and format media for sharing in other apps, like Instagram.
I was initially using PHAssetCollection to populate the media library and this solution was working just fine on test devices. Here’s the code for grabbing a user’s assets from the Photos framework:
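In sketch form, assuming the standard fetch pattern (the options and function name are my choices, not necessarily the shipped code):

```swift
import Photos

// Fetch the "camera roll" smart album via PHAssetCollection, then fetch
// the assets it contains, newest first.
func fetchCameraRoll() -> PHFetchResult<PHAsset>? {
    let collections = PHAssetCollection.fetchAssetCollections(
        with: .smartAlbum,
        subtype: .smartAlbumUserLibrary,
        options: nil)
    guard let cameraRoll = collections.firstObject else { return nil }

    let options = PHFetchOptions()
    options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    return PHAsset.fetchAssets(in: cameraRoll, options: options)
}
```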
Once we began beta-testing, some users noticed that their camera roll was empty. Curiously, other albums were populated with assets as expected. The only common thread was the use of iCloud. Nothing in Apple’s documentation for the relevant Photos classes suggests why my code would—or would not—work with iCloud in the equation. Searching Stack Overflow led me down the path of investigating isNetworkAccessAllowed, but that turned out to be a red herring.
Ultimately, I resolved this issue by looking at my old Objective-C code and porting it over to Swift. This fixed the problem for all of our users, although I cannot say why accessing assets via PHAsset rather than PHAssetCollection makes a difference.
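The working approach looks roughly like this (again a sketch of the technique, not the exact shipped code):

```swift
import Photos

// Querying assets directly via PHAsset, rather than going through a
// PHAssetCollection, returned iCloud-backed items as expected.
func fetchAllAssets() -> PHFetchResult<PHAsset> {
    let options = PHFetchOptions()
    options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    return PHAsset.fetchAssets(with: options)
}
```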
RAW Capture + Camera Zoom Crash—This was a tricky bug to track down because it seemed to happen randomly. I couldn’t reproduce it on my end and the debug symbols didn’t provide any insight. It wasn’t until a user noticed that the crash occurred when he attempted to take a zoomed photo with the “Save RAW” setting enabled that I finally had a clue to investigate.
Here are my AVCapturePhotoCaptureDelegate methods that handle processing and saving image captures:
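In outline they look like this (a reconstruction of the described flow under my own assumptions about structure; the delegate method signatures themselves are the real AVFoundation API):

```swift
import AVFoundation
import Photos

final class CaptureHandler: NSObject, AVCapturePhotoCaptureDelegate {
    private var photoData: Data?

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // Check for an error before proceeding. The zoom + RAW crash
        // described below never surfaced here as an error.
        guard error == nil else { return }
        photoData = photo.fileDataRepresentation()
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishCaptureFor resolvedSettings: AVCaptureResolvedPhotoSettings,
                     error: Error?) {
        guard error == nil, let data = photoData else { return }
        // Write the processed capture to the user's photo library.
        PHPhotoLibrary.shared().performChanges({
            let request = PHAssetCreationRequest.forAsset()
            request.addResource(with: .photo, data: data, options: nil)
        }, completionHandler: nil)
    }
}
```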
You’ll notice that I check for an error before proceeding. I would expect that if combining zoom with RAW capture were a crash-able offense it would show up as an error here, but that wasn’t the case. The documentation doesn’t warn about the combination either. Running the app in debug mode and watching the console output was the only way for me to verify the bug.
This appears to be a non-catchable bug. None of the framework methods I call throw an error, so I can’t handle it with a do-catch. If you know of a way I could have avoided this, I’d love to be enlightened!
Interesting UX Challenges
Unlike the framework issues mentioned above, these types of problems are self-created. They come about due to design decisions, naiveté, and sometimes hubris…so I was kinda asking for them. 😃
Rendering text on a curve—One of my visual design explorations was inspired by the machining and typography of physical cameras. I represented some of the UI components as digital dials and in order to achieve this design I needed to be able to render text on a curve.
As it turns out, this isn’t easy to accomplish on iOS and it posed a significant technical hurdle. There are some potential solutions proposed on Stack Overflow, but they are fairly old and, once I took the time to adapt them to my project, didn’t meet my needs. I was deep in the Googles (2nd or 3rd page 😑) when I came across Luka Oresnik’s repository.
Rendering NSAttributedStrings along arbitrary UIBezierPaths - lvnyk/BezierString
I had to make a few modifications to get the code working for my needs but I eventually found success! Below is a gif of the camera mode and aspect ratio ring control prototypes in action.
Because the design, interaction, and states become a little complicated when considered together I implemented the solution as a UIControl composed of subcomponents, each responsible for a specific piece of functionality.
The base SelectionRingControl object is responsible for creating the view, monitoring user interaction, and routing the outcomes of those interactions.
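In skeleton form (only the component name comes from this article; the internals are assumptions of mine):

```swift
import UIKit

// The parent UIControl owns its subcomponents and routes touch outcomes;
// curved labels would be rendered along the ring's UIBezierPath.
final class SelectionRingControl: UIControl {
    private let ringLayer = CAShapeLayer()   // draws the ring itself

    override init(frame: CGRect) {
        super.init(frame: frame)
        layer.addSublayer(ringLayer)
    }
    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func continueTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
        // Translate the touch position into a ring selection,
        // then notify any targets of the change.
        sendActions(for: .valueChanged)
        return true
    }
}
```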
Unfortunately, this design direction was scrapped for a different solution. But I think it’s an interesting interface that I hope to repurpose for a future project.
Overloading the volume buttons—We had a lot of requests to allow photo capture using the iPhone volume buttons. This is something the iOS camera app supports, as well as other photo apps, but it’s not explicitly exposed by any iOS framework, which means implementations tend to be kludgey. I came across helpful code from other developers, which served as a good starting point for creating something that worked well for my needs.
The first step is to use KVO to listen for volume changes. However, if the device’s volume is either at minimum when the down-button is pressed or at maximum when the up-button is pressed then no KVO notification will be sent. This means the “capture using volume buttons” feature will appear to be broken to users. You’ll need to create an instance of MPVolumeView in order to preemptively control the volume value.
You have to explicitly add the MPVolumeView instance to your view hierarchy, otherwise iOS will still present the volume level overlay to the user when a hardware button is pressed. Additionally, because Cinescope can potentially be in many different states—first-time use, returning from the background, viewing the camera roll, etc.—I had to introduce additional logic to avoid false-positive capture notifications. The cleanest solution was to make the root VC responsible for monitoring state and deciding whether or not it wanted to respond to a delegate notification from the EffectService.
Note: Restoring the user’s initial volume information after responding to a volume button press results in a subsequent KVO notification, so you’ll need to account for that in your logic. In my code I deal with it by adding/removing delegation.
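Pieced together, the approach looks roughly like this (my reconstruction; the type name, callback, and volume values are assumptions, not the shipped code):

```swift
import AVFoundation
import MediaPlayer

// KVO on the audio session's outputVolume detects button presses; a
// hidden MPVolumeView nudges the volume away from the 0.0/1.0 limits
// so every press produces a notification.
final class VolumeButtonObserver: NSObject {
    private let session = AVAudioSession.sharedInstance()
    private var observation: NSKeyValueObservation?
    let volumeView = MPVolumeView(frame: .zero)   // add to the hierarchy, offscreen
    var onCapture: (() -> Void)?

    func start() throws {
        try session.setActive(true)
        // At min or max volume, a further press in that direction sends
        // no KVO notification, so move off the limits preemptively.
        if session.outputVolume <= 0.0 || session.outputVolume >= 1.0 {
            setSystemVolume(0.5)
        }
        observation = session.observe(\.outputVolume, options: [.new]) { [weak self] _, _ in
            self?.onCapture?()
            // Restoring the user's original volume here fires a second
            // KVO notification; suspend observation (or delegation) around it.
        }
    }

    private func setSystemVolume(_ value: Float) {
        // MPVolumeView's internal slider is the usual way to set the volume.
        let slider = volumeView.subviews.compactMap { $0 as? UISlider }.first
        slider?.value = value
    }
}
```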
Writing Code, A Sisyphean Task
My day job is Principal Product Designer at Duo. Since iOS development is done in my spare time it’s a challenge for me to stay current on the constant changes with Swift, iOS frameworks, and Apple hardware. PhotoKit and AVFoundation are so deep and complex that even though I’ve successfully released an app that leverages them, I can’t claim to have mastery…and those frameworks represent perhaps 1–2% of the iOS operating system. Sometimes I lie awake at night staring wide-eyed at the ceiling, unable to sleep because I’m filled with the existential dread that I’ll never know enough and oh my god I’m a total hack.
I don’t really have a point here other than to say I’ve been writing code since I got my Commodore 64 and damn, I know less now than I did back then.
Never Turn Learning Mode Off
Swift and iOS Fluency—Not all of my Swift code is written in a “Swift” way, and there are new language and framework features that I’m not taking advantage of. I would love to add an edge detection feature using CIFilter to assist with manual focus and also do something cool with the dual-camera depth capabilities. There are always new things to discover about the latest version of Swift. Now that Cinescope has shipped maybe it’s time for a pet project!
Unit Testing—For future projects I would like to leverage unit-testing and experiment with test-driven development. This is an area that I don’t have a lot of practice in, but I’m curious to see how it is applied to validate user interface functionality and state transitions.
You can purchase Cinescope here and support my amazing client. If you have any questions or comments you can hit me up here or on the Twitters or email or whatever. I’m always looking to learn new things so if you have advice or tips & tricks, please send them my way!