Now that we have some distance from the events themselves, it's time to look back at the loss of two icons of our industry who, on the surface, couldn't have been more different - yet each shaped the industry in ways so fundamental that we can scarcely imagine what life would be like right now without them.

Dennis Ritchie, half of the team that created Unix and the C language, passed away shortly after Steve Jobs, half of the team that founded Apple Computer and the man behind the Mac, the iPhone, the iPad and the App Store.

Both were iconoclastic: one quietly so, one loudly so. One got ridiculously, fantastically, overwhelmingly and ludicrously rich, the other… shrug. One spent his entire life laboring in (relative) obscurity at Bell Labs, while the other started a company out of his garage, got kicked out of it, then came back as its savior. One developed a language that would serve as the gold standard for all programming languages that followed it; the other sought to elevate computers out of the realm of technologists and successfully turned them into “consumer devices” that lacked many of the hallmarks of what “everybody else” thought a computing device needed to have.

Both, I think we can agree, were pretty good guys.

As churlish as it may seem, many across the Internet have made this into something of a competition: the two passed away closely enough together that the inevitable “Suck/Rock Dichotomy” (as coined by Neal Ford) kicked in, and forums began to sprout debates between geeks, each seeking to eulogize one while conceding the other was nice, but not really all that big a deal. Granted, most of the posts I saw at least tried to be respectful, but in the end, the message was still one of “Mr. X's accomplishments were greater than Mr. Y's because….”

Why we in this industry feel compelled to turn everything into a competition is something that should be addressed by a sociology or psychology study, and is clearly beyond both the length limitations of this column and the powers of its author to explore.

That both men were “great” is beyond debate. Ritchie, along with Ken Thompson, essentially invented Unix (the name being “a kind of a treacherous pun” on its immediate, and failed, predecessor, Multics). Along the way, as part of the development of Unix, Ritchie and Thompson took an existing language called BCPL (Basic Combined Programming Language, itself a simplified variant of a predecessor called CPL), squeezed it down to fit into 8 kilobytes of memory, and called it B. A later version was sufficiently new and improved to warrant a new name.

That language, of course, is C. (Ritchie later commented that the choice of the name “left open the question whether the name represented a progression through the alphabet or through the letters BCPL”.)

What most developers know about Ritchie is his seminal book on C, The C Programming Language, written with Brian Kernighan - most famous for its opening pages, in which the now-canonical “Hello World” program appeared. For many, K&R (as the book came to be called) set the gold standard for all programming books written since: not only did the book convey the concepts of the language, but it offered discussions of style and wisdom, lessons that many “old-timers” continue to espouse to this day. And all within (roughly) two hundred pages.
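For those who have never actually cracked K&R open, that program is worth reprinting. Shown here from memory, and modernized just enough (an #include line and an explicit return type, neither of which the original pre-ANSI listing required) to compile cleanly today, it ran in its entirety:

#include <stdio.h>          /* the 1978 listing could omit this */

int main(void)
{
    printf("hello, world\n");   /* lowercase, with the comma, as K&R wrote it */
    return 0;
}

Everything that made the book great is already visible in those few lines: show the smallest thing that works, then explain it.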

Ritchie's Unix has not only officially outlived its creator, but has spread far wider than I think anyone could have imagined: first through the efforts of Linus Torvalds in creating Linux, a Unix-like system that runs on a PC and is freely available for anyone to hack on, and then through the adoption of Unix by Steve Jobs, who first made it the platform of his NeXT workstations after he was ousted from Apple, then later brought it into Apple and put it in front of millions of customers (as Mac OS X).

Of course, Jobs' career and accomplishments need little explanation or retelling: the founding of Apple with Steve Wozniak, Jobs' ouster from the company, the founding of NeXT (where he apparently learned to love both Unix and Objective-C), his return to Apple and the subsequent transformation of the company from a “washed-up has-been” making machines suitable only for graphic designers and fashion geeks into the manufacturer of the hottest-selling line of consumer devices, led by the turtleneck-and-jeans “brilliant visionary.”

To borrow (and bend) the immortal words of Shakespeare's Antony, I come not to praise these gentlemen - though they are worthy of whatever praise this meager column can heap upon them - but to ask the question, “What made them great?”

It's a question that deserves asking, not just of those whom we revere, but of ourselves, as well. The other day, I heard a couple of friends talking about another friend as “a great programmer” and I was curious enough to ask, “What makes him a great programmer?” When they both looked at me like I had sprouted horns, I pressed: “Does he write his code really tight, getting good performance, or does he write it really openly, so it's easy to see what he's doing? Does he have great unit tests? Did he document the code well? What makes him great?”

Greatness

I won't pretend that I have the all-encompassing, all-definitive answer to this question. In fact, I think the answer lies in asking the question, rather than in knowing the answer. (That's the Dudeist way of saying "It's not the strike, man, but how you bowl. Or hold your beverage. Or how the rug really ties the room together.")

For both men, it is important to realize that neither achieved his greatness alone. Each was helped tremendously by a partner, and each was either a part of an organization that was geared around “doing interesting things,” such as Ritchie's job at Bell Labs, or was the head of an organization that consisted of a number of people pulling in the direction he set, such as Jobs' role at Apple.

Within that caveat, the debates will rage again: Ritchie didn't have whole teams of developers underneath him, building things like iOS and the App Store. Jobs didn't have to try to fit Objective-C or iOS into 8 kilobytes of memory. Then again, neither did Ritchie have to create an entirely new device in a market already deeply saturated with players, as Jobs did with the iPod. Nor did Ritchie have to figure out how to build a smartphone that would succeed when so many other small consumer devices - Palm, Windows Mobile, and even Apple's own Newton - had either flamed out or cemented the idea that the “personal consumer device” market was not a particularly large one.

In truth, each man faced significant obstacles in his course, and each found ways around them. Each sought to create something new - not entirely new, but different enough that nothing which already existed could be “made to work.” Each showed a mix of skill and insight, along with creative energy. Whether Jobs' “And one more thing” on stage trumps Ritchie's wry sense of humor is something best left for computer science historians to debate in the years to come.

What I know is that whether we attribute their success to them individually, or to them as the heads of the pyramids of people and ideas they brought together, they each left legacies of which I can only hope to someday approach even a small fraction. I'm pretty sure I'll never invent a programming language that becomes the definitive standard for decades to come, and I'm pretty sure I'll never take the helm of a multimillion-dollar company and turn it into a multibillion-dollar one through my strategic vision and insight. I will have to find other avenues through which to achieve greatness. But the profound changes they brought to our industry, and the way they simply did what they had to do, are powerful and inspiring.

And that, more than anything, is the power of a legacy.

I think it is best summed up thusly:

#include <stdio.h>                 /* for Ritchie's printf */
#import <Foundation/Foundation.h>  /* for Jobs' NSLog; compile as Objective-C */

int main(int argc, char* argv[]) {
    printf("Goodbye world\n");
    NSLog(@"No more things");      /* NSLog supplies its own newline */
    return 0;
}