
The Subjective Charms of Objective-C

by Anna Avery


After inventing calculus, actuarial tables, and the mechanical calculator and coining the phrase “best of all possible worlds,” Gottfried Leibniz still felt his life’s work was incomplete. Since boyhood, the 17th-century polymath had dreamed of creating what he called a characteristica universalis—a language that perfectly represented all scientific truths and would render making new discoveries as easy as writing grammatically correct sentences. This “alphabet of human thought” would leave no room for falsehoods or ambiguity, and Leibniz would work on it until the end of his life.

A version of Leibniz’s dream lives on today in programming languages. They don’t represent the totality of the physical and philosophical universe, but instead, the next best thing—the ever-flipping ones and zeroes that make up a computer’s internal state (binary, another Leibniz invention). Computer scientists brave or crazy enough to build new languages chase their own characteristica universalis, a system that could allow developers to write code so expressive that it leaves no dark corners for bugs to hide and so self-evident that comments, documentation, and unit tests become unnecessary.

But expressiveness, of course, is as much about personal taste as it is about information theory. For me, just as listening to Countdown to Ecstasy as a teenager cemented a lifelong affinity for Steely Dan, my taste in programming languages was shaped the most by the first one I learned on my own—Objective-C.

To argue that Objective-C resembles a metaphysically divine language, or even a good language, is like saying Shakespeare is best appreciated in pig latin. Objective-C is, at best, polarizing. Ridiculed for its unrelenting verbosity and peculiar square brackets, it is used only for building Mac and iPhone apps and would have faded into obscurity in the early 1990s had it not been for an unlikely quirk of history. Nevertheless, in my time working as a software engineer in San Francisco in the early 2010s, I repeatedly found myself at dive bars in SoMa or in the comments of Hacker News defending its most cumbersome design choices.

Objective-C came to me when I needed it most. I was a rising college senior and had discovered an interest in computer science too late to major in it. As an adult old enough to drink, I watched teenagers run circles around me in entry-level software engineering classes. Smartphones were just starting to proliferate, but I realized my school didn’t offer any mobile development classes—I had found a niche. I learned Objective-C that summer from a cowboy-themed book series titled The Big Nerd Ranch. The first time I wrote code on a big screen and saw it light up pixels on the small screen in my hand, I fell hard for Objective-C. It made me feel the intoxicating power of unlimited self-expression and let me believe I could create whatever I might imagine. I had stumbled across a truly universal language and loved everything about it—until I didn’t.

Twist of Fate

Objective-C came up in the frenzied early days of the object-oriented programming era, and by all accounts, it should have never survived past it. By the 1980s, software projects had grown too large for one person, or even one team, to develop alone. To make collaboration easier, Xerox PARC computer scientist Alan Kay had created object-oriented programming—a paradigm that organized code into reusable “objects” that interact by sending each other “messages.” For instance, a programmer could build a Timer object that could receive messages like start, stop, and readTime. These objects could then be reused across different software programs. In the 1980s, excitement about object-oriented programming was so high that a new language was coming out every few months, and computer scientists argued that we were on the precipice of a “software industrial revolution.”
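As a rough sketch of what such a Timer object might look like in Objective-C (the instance variables and method bodies below are illustrative assumptions; only the start, stop, and readTime messages come from the example above):

    #import <Foundation/Foundation.h>

    // The interface declares which messages a Timer can receive.
    @interface Timer : NSObject
    - (void)start;
    - (void)stop;
    - (NSTimeInterval)readTime;
    @end

    // The implementation defines how the object responds to each message.
    @implementation Timer {
        NSDate *_startDate;       // when the timer was last started, or nil if stopped
        NSTimeInterval _elapsed;  // time accumulated across previous runs
    }

    - (void)start {
        _startDate = [NSDate date];
    }

    - (void)stop {
        if (_startDate != nil) {
            _elapsed += -[_startDate timeIntervalSinceNow];
            _startDate = nil;
        }
    }

    - (NSTimeInterval)readTime {
        if (_startDate != nil) {
            return _elapsed + -[_startDate timeIntervalSinceNow];
        }
        return _elapsed;
    }
    @end

Any other program that needed a stopwatch could reuse the object by sending it those same three messages, without ever touching the bookkeeping hidden inside the implementation.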

In 1983, Tom Love and Brad Cox, software engineers at International Telephone & Telegraph, combined object-oriented programming with the popular, readable syntax of the C programming language to create Objective-C. The pair started a short-lived company to license the language and sell libraries of objects, and before it went belly up they landed the client that would save their creation from falling into obscurity: NeXT, the computer firm Steve Jobs founded after his ouster from Apple. When Jobs triumphantly returned to Apple in 1997, he brought NeXT’s operating system—and Objective-C—with him. For the next 17 years, Cox and Love’s creation would power the products of the most influential technology company in the world.

I became acquainted with Objective-C a decade and a half later. I saw how objects and messages take on a sentence-like structure, punctuated by square brackets, like [self.timer increaseByNumberOfSeconds:60]. These were not curt, Hemingwayesque sentences, but long, floral, Proustian ones, syntactically complex and evoking vivid imagery with method names like scrollViewDidEndDragging:willDecelerate:.
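To show what those Proustian sentences look like in context, here is a hypothetical sketch: FeedViewController, the timer property, snoozeForOneMinute, and increaseByNumberOfSeconds: are made-up names, while scrollViewDidEndDragging:willDecelerate: is a real UIScrollViewDelegate callback from UIKit.

    #import <UIKit/UIKit.h>

    // Hypothetical Timer message from the example above.
    @interface Timer : NSObject
    - (void)increaseByNumberOfSeconds:(NSTimeInterval)seconds;
    @end

    @interface FeedViewController : UIViewController <UIScrollViewDelegate>
    @property (nonatomic, strong) Timer *timer;
    @end

    @implementation FeedViewController

    - (void)snoozeForOneMinute {
        // Square brackets send the message; the argument is labeled inside
        // the selector itself, so the call reads like a sentence.
        [self.timer increaseByNumberOfSeconds:60];
    }

    // A real UIKit delegate method in the same long, self-describing style:
    // the selector alone says what just happened and what is about to happen.
    - (void)scrollViewDidEndDragging:(UIScrollView *)scrollView
                      willDecelerate:(BOOL)decelerate {
        if (!decelerate) {
            NSLog(@"Dragging ended with no momentum; update the screen now.");
        }
    }

    @end

Every argument gets its own label baked into the method name, which is why even a routine callback reads like a full clause rather than a terse function call.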


