Swift Gotchas

Brian Donohue
iOS App Development
Jun 10, 2014


In programming, a gotcha is a feature of a system, a program or a programming language that works in the way it is documented but is counter-intuitive and almost invites mistakes because it is both enticingly easy to invoke and completely unexpected and/or unreasonable in its outcome.

Wikipedia

When Craig Federighi announced that Apple had been working on Objective-C without the C, the whole room lit up with chatter. The announcement of Swift was the go-to topic for the rest of WWDC, and it represents a watershed moment for iOS development: a new programming language that combines the ease of use and readability of modern languages with native performance. It certainly seems like the future of iOS development.

I skipped all of the Swift sessions in favor of skimming the Swift book, downloading Xcode 6, and working on a few Swift apps. I quickly ran into a couple of Swift gotchas.

Leading Zeroes

I decided to translate one of my weekend hacks, WhereBrianAt, into Swift because the app is extremely small (250 lines of Objective-C) and simple: show my position on a map, run in the background, and upload my location to a server. I ran into my first issue when trying to center the map on my current location.
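The offending code was roughly the following (a reconstruction; mapView and location are assumed names, and the bare .08 literals are the culprit):

import MapKit

// Center the map on the current location (reconstructed sketch).
let span = MKCoordinateSpan(latitudeDelta: .08, longitudeDelta: .08)
let region = MKCoordinateRegion(center: location.coordinate, span: span)
mapView.setRegion(region, animated: true)

The compiler rejected it with: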

Expected identifier after ‘.’ expression

Wut? I thought I must’ve screwed up some Swift syntax, so I tried the following:

var x = .08

Same compiler error. I’ve written software in languages from JavaScript to Scala, and I can’t think of a single language where this isn’t valid syntax. The solution struck me after an hour or so away from the computer:

var x = 0.08

Apparently Swift will not compile floating-point literals without a digit before the decimal point, and the compilation error is cryptic to say the least. From the Swift book:

Floating-point numbers […] must always have a number (or hexadecimal number) on both sides of the decimal point.

However, these statements are acceptable to Swift:

let oneMillion = 1_000_000
let justOverOneMillion = 1_000_000.000_000_1

It’s great to be able to delineate orders of magnitude in numeric literals for better legibility, but it seems strange to force leading zeroes for floating-point literals.

Casts to Higher Precision

While Swift is a type-inferred language, it’s also type safe to the extreme. See the following example.
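It went something like this (a reconstruction; the array contents are made up):

let prices = [1.5, 2.5, 3.5]

let x = 2 * 1.5             // compiles: the literal 2 is inferred as a Double
let y = prices.count * 1.5  // rejected: count is an Int, 1.5 is a Double
// let y = Double(prices.count) * 1.5 would compile

The ‘y’ assignment fails with: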

Could not find an overload for ‘*’ that accepts the supplied arguments

You’ll notice that the ‘x’ and ‘y’ assignments, while almost equivalent, fare differently. In the ‘x’ assignment, the type of the numeric literal 2 is automatically inferred to be a Double, so no explicit conversion is required. However, a Swift array’s count property returns an Int, and that must be explicitly converted. Again, I should’ve RTFM:

Conversions between integer and floating-point numeric types must be made explicit

And the accompanying example:

let three = 3
let pointOneFourOneFiveNine = 0.14159
let pi = Double(three) + pointOneFourOneFiveNine

I generally agree that type safety is important. I don’t want an Int interpreted as a String without a cast, but I shouldn’t have to be so explicit when converting to a numeric type of higher precision (e.g. Int to Double).

Designated Initializers

In WhereBrianAt, I have a custom subclass of MKAnnotationView that puts a dropper with my avatar on a map.

In the subclass, I implemented the same initializer as the Objective-C version of the app:

init(annotation: MKAnnotation!, reuseIdentifier: String!)
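In context, the subclass looked something like this (a sketch against the Swift beta of the time; the image setup is an assumption):

import MapKit

class UserAnnotationView: MKAnnotationView {
    init(annotation: MKAnnotation!, reuseIdentifier: String!) {
        super.init(annotation: annotation, reuseIdentifier: reuseIdentifier)
        // Assumed setup: the dropper image with my avatar.
        image = UIImage(named: "avatar-dropper")
    }
}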

No compiler errors this time! But when I run the program, I get this crash:

fatal error: use of unimplemented initializer ‘init(frame:)’ for class ‘WhereBrianAtSwift.UserAnnotationView’

Hmm alright, so I guess I’ll just…

init(frame: CGRect) { super.init(frame: frame) }

And it runs fine!

Swift has a notion of a designated initializer for a class, and apparently designated initializers are only inherited from a superclass “in some cases”. See this snippet from the docs:

Every class must have at least one designated initializer. In some cases, this requirement is satisfied by inheriting one or more designated initializers from a superclass, as described in Automatic Initializer Inheritance below.

A full description of automatic initializer inheritance is outside the scope of this post, but the simplified sketch below gives the flavor, and I encourage you to read the full rules if you’re planning on writing Swift apps.
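The gist (made-up classes for illustration, not the MapKit types): once a subclass defines any designated initializer of its own, it no longer automatically inherits its superclass’s designated initializers.

class Shape {
    var name: String
    init(name: String) { self.name = name }
}

class Circle: Shape {
    var radius: Double
    // Defining this designated initializer means Circle no longer
    // automatically inherits Shape's init(name:).
    init(radius: Double) {
        self.radius = radius
        super.init(name: "circle")
    }
}

let circle = Circle(radius: 2.0)  // fine
// let bad = Circle(name: "nope") // error: Circle has no init(name:)

That’s exactly what bit me: by implementing init(annotation:reuseIdentifier:), my subclass stopped inheriting init(frame:), which MapKit calls at runtime.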

There are obviously good intentions behind designated initializers, but it feels like the implementation goes against some basic inheritance principles of object-oriented programming.

Swift

The promise of Swift is a more approachable, legible, and expressive programming language. Unfortunately, the language has enough idiosyncrasies that writing apps in Swift quickly becomes less expressive than promised, with a learning curve similar to Objective-C’s.

Back at WWDC I showed these issues to an Apple engineer and explained each gotcha. He suggested that I file a bug report for each one, and said that Apple would be listening very closely to feedback on Swift. Given that Apple has solved a ton of developer pain points this year, I’m confident that they are listening and will resolve most of these issues. Until then, I’ll be using Objective-C for all of my professional work and dabbling in Swift on the side.
