Automating app screenshots and previews for the App Store with Snapshotting and fastlane
Table of contents
- The problem
- Solution: Automation
- Prepare screenshots
- SwiftUI views for app previews or screenshots
- Make screenshots
- Device sizes
- Languages
- Git LFS for images
- Publish to App Store with fastlane
Manually managing screenshots and app previews for your app is quite a pain. You have to provide at least one screenshot for every device size; app previews are optional. (App Radar has a nice summary of the guidelines and requirements.)
The good thing is that the number of device sizes is smaller than the number of all iPhone and iPad models (plus Watches, plus Apple TVs) - take a look at Screenshot specifications and App preview specifications - but it's still 🤯 to do manually.
And don't forget about localization - the rules apply to every language in your app (“Fun with flags”©).
So it's great that fastlane has this covered (i.e. automated) with snapshot and frameit.
Too bad it needs UI Tests. They are flaky and unreliable, they are very slow, and most importantly they are quite difficult to implement (and to cover all the possible states/paths). Imagine a screen somewhere at the end of a flow - you have to navigate there by “tapping” in code. It helps if you use accessibility identifiers that let you find the controls, but what about data? Should you talk to the backend? Staging or production? What if it's down? Use a local database? And what if the navigation changes? Easy, you say - just use launch arguments. Sure. And we've just added another layer of complexity to manage, maintain, and remember (that's why companies rely heavily on manual testing and have lots of QAs), not to mention the bad practice of polluting production code with test-only code. So much work just to get one screenshot. And you need hundreds.
My favorite thing about reactive programming is thinking about and implementing a view as a dumb object - one that has no behavior and just renders the state.
If we design UI components like this, we can easily test the rendering of any possible state. And instead of UI tests we can use snapshot tests - much faster, super flexible (just pass the state you want to render), and far easier to manage and maintain. We no longer worry about navigation, networking, persistence, etc. - state rendering is totally decoupled from all of these components.
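For illustration, such a “dumb” view could look like the sketch below. The names here (ProfileState, ProfileView) are hypothetical, not from any real project:

import SwiftUI

// A hypothetical "dumb" view: no networking, no navigation, no side
// effects - it just renders whatever state value it is handed.
struct ProfileState {
    let username: String
    let followers: Int
}

struct ProfileView: View {
    let state: ProfileState

    var body: some View {
        VStack(spacing: 8) {
            Text(state.username)
                .font(.title)
            Text("\(state.followers) followers")
                .foregroundStyle(.secondary)
        }
    }
}

Because any ProfileState can be constructed directly in a test, there is nothing to navigate to, nothing to mock, and nothing to wait for.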
With this in mind we'd create a test target ScreenshotTests and run tests (to make screenshots) only for the views (UI components) showing exactly what we'd like to publish to the App Store.
State -> View -> Snapshot -> PNG file -> fastlane
Run this flow for every device size and every language we need.
struct DeviceConfig {
    enum Orientation {
        case landscape
        case portrait
    }

    // ...

    /// Portrait is the default orientation.
    static let iPhone13Pro: Self = .iPhone13Pro(.portrait)

    static func iPhone13Pro(_ orientation: Orientation) -> Self {
        // ...
    }
}
extension Locale {
    static let enUS: Self = .init(identifier: "en_US")
}
import SwiftUI
import XCTest

extension XCTestCase {
    func snapshot(
        _ view: some View,
        config: DeviceConfig,
        locale: Locale,
        colorScheme: ColorScheme
    ) {
        // Inject the locale and color scheme so the view renders
        // exactly the variant we want to capture.
        let view = view
            .environment(\.locale, locale)
            .preferredColorScheme(colorScheme)
        // ... (render `view` to a PNG for fastlane, as sketched below)
    }
}
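The elided rendering step can be done in a few lines. Below is a minimal sketch, assuming iOS 16+ for ImageRenderer; writePNG, its explicit size/scale parameters, and the output URL are my inventions - the only real requirement is that the PNG files end up where fastlane expects them:

import SwiftUI
import XCTest

extension XCTestCase {
    // A sketch, not the actual implementation: draw the view at a
    // device's logical size and scale, then write the PNG to disk.
    @MainActor
    func writePNG(
        of view: some View,
        size: CGSize,    // logical size in points, e.g. 390x844
        scale: CGFloat,  // display scale, e.g. 3 for @3x devices
        to url: URL
    ) throws {
        let renderer = ImageRenderer(
            content: view.frame(width: size.width, height: size.height)
        )
        renderer.scale = scale
        let data = try XCTUnwrap(renderer.uiImage?.pngData())
        try data.write(to: url)
    }
}

With that in place, a screenshot “test” is just a plain unit test looping over the device/locale/color-scheme matrix, reusing the hypothetical ProfileView from above:

final class ScreenshotTests: XCTestCase {
    func testAppStoreScreenshots() {
        for config in [DeviceConfig.iPhone13Pro, .iPhone13Pro(.landscape)] {
            for locale in [Locale.enUS] {
                for scheme in [ColorScheme.light, .dark] {
                    snapshot(
                        ProfileView(state: .init(username: "appstore", followers: 1234)),
                        config: config,
                        locale: locale,
                        colorScheme: scheme
                    )
                }
            }
        }
    }
}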