Swift, the new proprietary programming language from Apple, is the biggest news in programming languages in several years. One of Steve Jobs’ lasting legacies at Apple is the influence of NeXTStep and its Objective-C-based programming tool set. The latest generation of smartphones and pads (and watches?) are programmed against types with NeXTStep-era names such as NSString and NSObject. Apple has flirted with alternatives (the more broadly-known C++ and the flexible Ruby), but it has always pushed developers back toward Objective-C.

It is not just momentum that has kept Objective-C at the forefront. Without going too deep into programming language wonkery, Objective-C’s function-calling and much of its type-system semantics come from Smalltalk. The OS X and iOS SDKs comprise tens of thousands of functions using this model. Using those functions in C++ is fairly ugly, and while Ruby is flexible enough, its runtime overhead on phones and tablets may have disqualified it. (Personally, I believe a “performant-enough” Ruby is possible.)

Finally, it would be silly to ignore the realities of control: C++ is controlled by an international standards group, and Ruby is essentially still the work of its creator, Yukihiro Matsumoto. A proprietary language is, naturally, easier to co-evolve with an operating system, and the Apple developer community is not going to howl for openness and portability. (While Objective-C is actually broadly available across platforms, it’s only broadly used in the Apple ecosystem.)

From a wonk standpoint, Swift is a hybrid object/functional language that’s clearly influenced by “modern” thinking about type systems. It has type inference, generics, value types, enumerations, tuples, first-class functions, a built-in monadic Optional type, and algebraic data types. But one thing that impressed me about Apple’s introduction of Swift is that they never once mentioned “type systems” or “functional programming.” Instead, they emphasized its brevity and ease of use compared to Objective-C. As always with Apple, the emphasis was on how the external design benefited the user, both practically and aesthetically.
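A few of those features can be seen in just a handful of lines. This is an illustrative sketch (the names `applyTwice` and `point` are my own, not from any Apple API):

```swift
// Type inference: 'pi' is inferred to be a Double, with no annotation.
let pi = 3.14159

// Tuples: group related values without declaring a new type.
let point = (x: 2, y: 3)

// Generics plus first-class functions: 'f' is an ordinary parameter.
func applyTwice<T>(_ value: T, _ f: (T) -> T) -> T {
    return f(f(value))
}

let quadrupled = applyTwice(2) { $0 * 2 }   // f(f(2)) = 8
```

Nothing here requires knowing the words “parametric polymorphism” or “higher-order function” — which is very much the point of Apple’s presentation.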

There is no question that Swift is considerably terser than equivalent Objective-C code, as it eliminates a good amount of bookkeeping and boilerplate. To my taste, it is also clearer and easier to use than Objective-C, but one of the most fascinating aspects of the release will be the reaction of the broad community of Objective-C developers. Unlike C# and Java developers, who have moved toward functional programming approaches over the course of several language versions, Objective-C developers face in Swift a leap in terms of both programming language syntax and mental models.

On the other hand, maybe functional programming approaches don’t need to be wrapped up in all sorts of jargon and justification. Maybe you can just say “use ‘let’ for things that don’t change after assignment and ‘var’ for those that you want to change” and let developers figure out the benefits for themselves. Maybe you can just say “Optional chaining” and avoid the temptation to use the word “monad.”

Maybe you can do those things, but I can’t. In Swift, rather than returning null or, say, a string, a function may be specified as returning an “Optional&lt;String&gt;” and would either return a “None” or a “Some&lt;String&gt;”. (Wonkishly, this is an “algebraic data type” or “discriminated union,” but in object-oriented terms, “None” and “Some&lt;T&gt;” would be subclasses of the abstract class “Optional”.) Rather than check every return for null (“nil” in the Objective-C world), “Optionals” can be chained together so that a chain encountering a “None” short-circuits and itself evaluates to “None”. Swift contains syntactic “sugar” to make this trivially easy to use.
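To make the “discriminated union” view concrete, here is a toy re-creation of the idea (named `Maybe` to avoid clashing with the built-in `Optional`), followed by the real chaining sugar. The `User`/`nickname` types are hypothetical examples of mine:

```swift
// A simplified model of Optional as an algebraic data type:
// a value is either 'none' or 'some' wrapping a payload.
enum Maybe<T> {
    case none
    case some(T)
}

// The built-in Optional and its '?.' / '??' sugar in action:
struct User {
    var nickname: String?
}

let user: User? = User(nickname: nil)

// If 'user' or 'nickname' is nil, the whole chain is nil —
// no intermediate nil checks are written by hand.
let shouted = user?.nickname?.uppercased()   // nil here

// '??' supplies a default when the chain came up empty.
let display = shouted ?? "anonymous"
```

The compiler, not the programmer, threads the “is it there?” check through every step of the chain.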

One of the bolder decisions in the Swift design is that it doesn’t have exceptions. Exceptions are one of the most troublesome issues in the design of programming languages. Most popular programming languages developed in the past 20 years have included some version of “try…catch…finally” for specifying an alternative flow of control when a function fails in a non-routine manner. On the one hand, language-level exception handling seems to meet a real need in a manner that’s conceptually straightforward. On the other hand, in practice it clearly hasn’t solved the problem, as the codebases of the world are riddled with poor exception handling (and, less importantly, exceptions are a thorn in the side of compiler writers).