My iOS App Creation Process — Part 4
First, let me say I love SwiftUI! Why am I such a fan of SwiftUI? I prefer the declarative approach to interface development over the imperative (Interface Builder) approach. SwiftUI allows me to more rapidly iterate on design ideas and see the results even without running the iOS Simulator. With my fanboy declaration out of the way, let’s dive into my SwiftUI development process.
NOTE: This is not a SwiftUI tutorial or even a detailed introduction to SwiftUI. There are many useful resources for such information on developer.apple.com and raywenderlich.com. My article and code examples assume some basic knowledge of SwiftUI, SwiftUI data flow/state management, and the Model-View-ViewModel (MVVM) design pattern.
How Do I Start?
Remember the user experience (UX) design and wireframes from Part 3 of my series? The wireframes serve as my starting point. My basic user interface development philosophy is that each wireframe is a SwiftUI “screen” consisting of “views” — the subsections/components that compose the screen. I build out the SwiftUI implementations of the wireframes and then refactor each screen, as required, into the component views that comprise it, resulting in better-organized code.
Also, I create skeletons of the ViewModel and any JSON Data Models required to develop the necessary SwiftUI code. For example, the app will display photo streams from Flickr, so we anticipate the need for a JSON Photo model to be used to decode photos in Flickr photostreams. Further, we expect a stream of JSON Photos from Flickr. A Swift array, photos: [Photo], in the view-model represents the stream of pictures from Flickr. Again, full implementation of these models and view-models is not required to build the user interface using SwiftUI. Completing the model and view-model code is done after drafting the necessary SwiftUI interface code.
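To make this concrete, here is a minimal sketch of what such a skeleton Photo model might look like. The field names (ownerName, dateTaken) are illustrative assumptions on my part; the real model must mirror the keys in Flickr’s JSON responses:

```swift
import Foundation

// Hypothetical skeleton of the JSON Photo model. The field names are
// illustrative; the actual model would mirror Flickr's JSON keys.
struct Photo: Codable, Identifiable {
    let id: String
    let title: String
    let ownerName: String?   // photographer's screen name
    let dateTaken: String?   // date the photo was taken
}

// Decoding a sample stream of photos into the view-model's photos array:
let json = """
[{"id": "1", "title": "Sunset", "ownerName": "alice", "dateTaken": "2021-01-01"}]
""".data(using: .utf8)!

let photos: [Photo] = try! JSONDecoder().decode([Photo].self, from: json)
```

A skeleton like this is enough for SwiftUI previews to compile and render; the networking code that actually populates the array can come later.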
Screen Decomposition based on Wireframes
MainScreen.swift and its various component views form the first screen. The main screen uses tabbed view navigation, which allows the user to switch between Photos, Favorites, and Settings. Focusing on the first tab (Photos), here is an overview of the views that compose the screen:
- SearchBar.swift — this view creates a standard iOS search bar to search Flickr Photos across the various photo streams envisioned for the app, such as Interesting, Recent, and Nearby photos.
- PhotoCategoryPicker.swift — this view creates a segmented control to allow users to choose from the various photo streams (Interesting, Recent, and Nearby).
- PhotoGrid.swift — this view creates a scrollable LazyVGrid of photos from the selected photostream.
- PhotoGridCell.swift — this view displays a thumbnail of a specific image within the chosen photostream. The PhotoGridCell enables users to navigate to the PhotoScreen, which shows a larger view of the photo along with a map view (if geolocation data is available for the picture) and photo details such as the title, photographer’s screen name, and the date the photo was taken, to name just a few data elements. (Again, these data elements will most likely be part of the JSON Photo model.)
MainScreen (Flickr Photos Screen)
Here’s a screenshot of the SwiftUI screen I built using the above approach:
The SwiftUI code for MainScreen.swift is listed below:
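Since the original listing did not survive here, the following is a minimal sketch of how such a tabbed main screen could be structured; the view-model bindings (searchText, category) are my assumptions, and the Favorites and Settings tabs are reduced to placeholders:

```swift
import SwiftUI

// Hypothetical sketch of MainScreen.swift: a TabView hosting the three
// top-level tabs. SearchBar, PhotoCategoryPicker, and PhotoGrid are the
// component views described above, each defined in its own file.
struct MainScreen: View {
    @StateObject private var viewModel = PhotoListViewModel()

    var body: some View {
        TabView {
            PhotosTab(viewModel: viewModel)
                .tabItem { Label("Photos", systemImage: "photo.on.rectangle") }
            Text("Favorites")   // placeholder for the Favorites screen
                .tabItem { Label("Favorites", systemImage: "heart") }
            Text("Settings")    // placeholder for the Settings screen
                .tabItem { Label("Settings", systemImage: "gear") }
        }
    }
}

// The Photos tab composes the search bar, category picker, and photo grid.
struct PhotosTab: View {
    @ObservedObject var viewModel: PhotoListViewModel

    var body: some View {
        NavigationView {
            VStack(spacing: 0) {
                SearchBar(text: $viewModel.searchText)
                PhotoCategoryPicker(selection: $viewModel.category)
                PhotoGrid(viewModel: viewModel)
            }
            .navigationTitle("Flickr Photos")
        }
    }
}
```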
Details of the various views are provided in code listings that will appear below.
SearchBar.swift SwiftUI code listing:
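The listing itself is not reproduced here, so below is a sketch of one common way to build a standard iOS search bar at the time: wrapping UIKit’s UISearchBar via UIViewRepresentable (the placeholder text is my assumption):

```swift
import SwiftUI
import UIKit

// Hypothetical sketch of SearchBar.swift: wrapping UIKit's UISearchBar,
// a common approach before .searchable arrived in iOS 15.
struct SearchBar: UIViewRepresentable {
    @Binding var text: String

    func makeUIView(context: Context) -> UISearchBar {
        let searchBar = UISearchBar()
        searchBar.placeholder = "Search Flickr Photos"
        searchBar.delegate = context.coordinator
        return searchBar
    }

    func updateUIView(_ uiView: UISearchBar, context: Context) {
        uiView.text = text
    }

    func makeCoordinator() -> Coordinator { Coordinator(text: $text) }

    // The coordinator forwards UIKit delegate callbacks into the binding.
    final class Coordinator: NSObject, UISearchBarDelegate {
        let text: Binding<String>
        init(text: Binding<String>) { self.text = text }

        func searchBar(_ searchBar: UISearchBar, textDidChange searchText: String) {
            text.wrappedValue = searchText
        }
    }
}
```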
The SwiftUI code for PhotoCategoryPicker.swift leverages a Picker to construct a segmented control allowing the user to select a Flickr photo stream (Interesting, Recent, or Nearby photos):
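In place of the original listing, here is a minimal sketch of such a segmented Picker; the PhotoCategory enum is an assumed supporting type modeling the three streams:

```swift
import SwiftUI

// Assumed enum modeling the photo streams offered by the app.
enum PhotoCategory: String, CaseIterable, Identifiable {
    case interesting = "Interesting"
    case recent = "Recent"
    case nearby = "Nearby"
    var id: String { rawValue }
}

// Hypothetical sketch of PhotoCategoryPicker.swift: a Picker rendered as
// a segmented control for choosing a Flickr photo stream.
struct PhotoCategoryPicker: View {
    @Binding var selection: PhotoCategory

    var body: some View {
        Picker("Photo Stream", selection: $selection) {
            ForEach(PhotoCategory.allCases) { category in
                Text(category.rawValue).tag(category)
            }
        }
        .pickerStyle(SegmentedPickerStyle())
        .padding(.horizontal)
    }
}
```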
PhotoGrid.swift code listing, depicting the use of the ViewModel’s [Photo] array in a LazyVGrid to display the images from the selected photostream:
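As the listing is missing here, the following sketch shows the shape such a grid typically takes; the adaptive column sizing and the PhotoListViewModel name are my assumptions, building on the skeleton Photo model described earlier:

```swift
import SwiftUI
import Combine

// Assumed skeleton view-model publishing the stream of photos.
final class PhotoListViewModel: ObservableObject {
    @Published var photos: [Photo] = []
    @Published var searchText = ""
    @Published var category: PhotoCategory = .interesting
}

// Hypothetical sketch of PhotoGrid.swift: a scrollable LazyVGrid driven
// by the view-model's [Photo] array.
struct PhotoGrid: View {
    @ObservedObject var viewModel: PhotoListViewModel

    // Adaptive columns let the grid fill whatever width is available.
    private let columns = [GridItem(.adaptive(minimum: 100), spacing: 2)]

    var body: some View {
        ScrollView {
            LazyVGrid(columns: columns, spacing: 2) {
                ForEach(viewModel.photos) { photo in
                    PhotoGridCell(photo: photo)
                }
            }
        }
    }
}
```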
PhotoGridCell.swift code listing provides the details of how individual photo thumbnails are displayed within the photo grid and act as a navigation method to the PhotoScreen:
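With the listing absent here, this sketch shows the usual pattern: a NavigationLink wrapping the thumbnail view so that a tap pushes the photo details screen (the framing values are illustrative):

```swift
import SwiftUI

// Hypothetical sketch of PhotoGridCell.swift: a thumbnail that navigates
// to the PhotoScreen when tapped. CellPhotoView renders the image itself.
struct PhotoGridCell: View {
    let photo: Photo

    var body: some View {
        NavigationLink(destination: PhotoScreen(photo: photo)) {
            CellPhotoView(photo: photo)
                .frame(minWidth: 100, minHeight: 100)
                .clipped()
        }
        .buttonStyle(PlainButtonStyle())
    }
}
```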
CellPhotoView.swift code listing leverages the Swift Package KingfisherSwiftUI as a means of decoding photo data from a URL while also enabling image caching to prevent unneeded data transfer:
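In place of the missing listing, here is a minimal sketch built around Kingfisher’s KFImage view, which downloads, decodes, and caches the image; the thumbnailURL property is an assumed convenience on the Photo model:

```swift
import SwiftUI
import KingfisherSwiftUI  // Kingfisher's SwiftUI support (KFImage)

// Hypothetical sketch of CellPhotoView.swift: KFImage handles the
// download, decoding, and disk/memory caching of the photo.
struct CellPhotoView: View {
    let photo: Photo   // assumes a thumbnailURL convenience property

    var body: some View {
        KFImage(photo.thumbnailURL)
            .placeholder { ProgressView() }   // shown while loading
            .resizable()
            .aspectRatio(contentMode: .fill)
    }
}
```

Because Kingfisher caches by URL, scrolling back through the grid does not re-download thumbnails that were already fetched.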
PhotoScreen (Photo Details Screen)
Here’s a screenshot of the PhotoScreen that depicts how a photo and its corresponding details are displayed once a user taps on an image from the photo grid:
Here’s the SwiftUI code listing for PhotoScreen.swift. The photo screen layout adapts based on the device’s horizontal size class: if the size class is .compact, a vertical layout is shown; if it is .regular, a horizontal layout is employed to better utilize the available screen real estate.
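The size-class-driven layout described above can be sketched as follows; the reuse of CellPhotoView for the large image and the exact detail fields shown are my assumptions:

```swift
import SwiftUI

// Hypothetical sketch of PhotoScreen.swift: switches between a vertical
// and a horizontal layout based on the horizontal size class.
struct PhotoScreen: View {
    let photo: Photo
    @Environment(\.horizontalSizeClass) private var sizeClass

    var body: some View {
        Group {
            if sizeClass == .compact {
                VStack { photoView; detailsView }   // iPhone-style layout
            } else {
                HStack { photoView; detailsView }   // iPad / landscape layout
            }
        }
        .navigationTitle(photo.title)
    }

    private var photoView: some View {
        CellPhotoView(photo: photo)   // assumed reuse of the cached image view
            .aspectRatio(contentMode: .fit)
    }

    private var detailsView: some View {
        VStack(alignment: .leading, spacing: 8) {
            Text(photo.title).font(.headline)
            Text(photo.ownerName ?? "Unknown photographer")
            Text(photo.dateTaken ?? "")
            // A map view would appear here when geolocation data exists.
        }
        .padding()
    }
}
```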
Rather than posting additional individual code listings for each of the view components within the PhotoScreen.swift file, I’ll refer the reader to the full GitHub repository linked at the end of this article.
Manage Favorites Screen
I hope this article has whetted your appetite for SwiftUI and inspired you to try developing an app using Apple’s declarative interface framework. I encourage you to explore my GitHub.com repository to gain more detailed insights into my SwiftUI code.
In my next article, I will explain how I approach JSON data modeling and networking.
Until next time, stay safe and keep learning!
The full repository is available at the link below and will allow the reader to explore further the details of each screen and view included in the app: