And just like that, we’ve arrived at the last installment of our “30 Years of CODE” celebratory column. Wow. Time flies! It seems like “just the other day” that we had our 30-year anniversary celebration in Orlando, yet that was in December of 2023. It’s even wilder to think back five or six years, to “just before the pandemic.” How much has changed in those few short years! Some things look very familiar, while others are practically unrecognizable compared to just five years ago.

Operating Systems and General Tech

I usually start these columns by looking back at operating systems. Truth be told, not all that much has changed in the last five years. We were using Windows 10 and macOS. We were using iOS and Android. And of course, we were using Linux. That’s all very recognizable to us today. Things may have swung back toward the PC a bit since then, with the hype around tablets as a PC replacement having died down (although not gone away). Google’s Chromebooks didn’t take over the world, but they’re still around.

Even on the hardware side, things were relatively similar, at least for the consumer/user. Mac and PC hardware is similar to what we used five years ago, although ARM (Advanced RISC Machine) platforms have gained in importance since then, even in the Windows world. In fact, I’m typing this article on a modern Copilot+ PC, running Windows 11 on an ARM chip. Mobile platforms have steadily improved in performance and quality, but, by and large, we’ve gone through several years of “the fastest processor we’ve ever made,” “the best camera our device ever had,” and “more memory than ever.” As if it were a surprise to anyone that the latest iPhone or Samsung Galaxy isn’t actually a step back in performance. Who would’a thunk?!?

The big elephant in the room is that half a decade ago, hardly anyone cared about artificial intelligence. Only people and companies with a special interest in the matter spent any time and money on it. At our CODE Consulting division, we did implement AI and machine learning solutions, but most of our competitors did not, and even for us, it wasn’t a huge part of our business. I’d even venture to guess that we did it mostly because people like me and a few others in my organization were passionate about it. And yes, AI was already being used to predict business trends and financials. It mattered that companies like Netflix could predict what movies you might like, and that Shazam could identify the song playing in your favorite bar.

Image recognition had already come a long way back then. I was teaching classes and doing presentations at events like DEVintersection in Las Vegas, showing people how to do facial recognition, train custom image recognition models to identify marine animals in a dive-log application, or analyze medical imagery, among other examples. But the current AI hype was still a long way off.
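For flavor, here’s roughly what training one of those custom image recognition models looks like in Python with PyTorch and torchvision, using transfer learning on a pretrained network. This is a minimal sketch, not the actual code from those sessions: the dive_photos/ folder (one subfolder per marine species) is hypothetical, and details like the epoch count and learning rate are purely illustrative.

import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

# Standard preprocessing for an ImageNet-pretrained backbone
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical dive-log photos, sorted into one folder per species
dataset = datasets.ImageFolder("dive_photos/", transform=preprocess)
loader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)

# Start from a pretrained ResNet and retrain only the final layer
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, len(dataset.classes))

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()

The trick that made this practical, even back then, was reusing a network pretrained on millions of generic images and retraining only the final classification layer on a comparatively tiny custom dataset.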

We also weren’t talking about specialty AI hardware like we do today. We’d already recognized that graphics processing units (GPUs) were essentially math co-processors that were really good at floating-point math, which happened to make them great for running AI workloads. Hence, GPUs started to appear in servers, but we were only just dipping our toes into the waters that would become a sea of NPUs (neural processing units). NPUs follow the same fundamental ideas as GPUs, but with an eye on the specific needs of AI rather than graphics processing. Today, NPUs are becoming ubiquitous, just as GPUs have become ubiquitous over the last few decades. Had I written this article at the start of this one-year series of “30 Years of CODE” anniversary articles, I probably wouldn’t even have mentioned them in the context of consumer laptops. Yet here we are, in this incredibly fast-moving world of artificial intelligence.
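To make the “math co-processor” point concrete, here’s a minimal sketch in Python/PyTorch (a representative example of my choosing, not tied to any particular project): the exact same floating-point matrix multiplication, the core operation of a neural-network layer, runs unchanged on the CPU or on a GPU. The GPU just happens to get through it dramatically faster; the matrix size and tolerance below are illustrative.

import torch

# A neural-network layer boils down to a large floating-point
# matrix multiplication, exactly the kind of math GPUs excel at.
a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

# Same operation on the CPU...
c_cpu = a @ b

# ...and on a GPU, if one is available (falls back to the CPU otherwise).
device = "cuda" if torch.cuda.is_available() else "cpu"
c_gpu = a.to(device) @ b.to(device)

# Same math, same result (within floating-point tolerance).
print(device, torch.allclose(c_cpu, c_gpu.cpu(), atol=1e-2))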

You know what else we used GPUs for five years ago? Blockchain! You already forgot about that, didn’t you? I’d consider 2017/18 the height of the hype cycle for blockchain and cryptocurrencies. These concepts are still important, of course. As we approach another round of elections in the U.S.A. and other countries (which will probably be in the past by the time you read this article), I realize that blockchain technology and the concept of “non-fakable crypto tokens” could be incredibly useful for voting in elections digitally and securely around the world. Alas, the topic is not so much a technical one (surely, if we can safely perform trillions of dollars’ worth of business transactions in the digital world, we could cast a single vote safely in the digital world as well) as it is a matter of political will and conflicting interests.

Programming

Programming trends five years ago followed much the same pattern outlined above. For the most part, we used the same tech we use today; the major change since then is the emergence of AI as a platform for developers. Technologies like ML.NET and PyTorch were introduced at roughly that time. Azure AI was already a thing (although under several different names).

With that said, most developers didn’t care about AI yet. We coded in C#, Python, and JavaScript and were excited about functional languages such as F#. Not much has changed there. .NET Core and related tech, such as ASP.NET Core or Entity Framework Core, were used in the Microsoft development ecosystem. React, Angular, and Vue.js were the web frameworks of choice. Ruby on Rails had passed its peak but was still strong.

One of the things that stands out to me when thinking back to 2018 is that the cloud was already important, but it hadn’t quite reached the status it has today. Yes, a lot of companies had moved their infrastructure to the cloud, but this was pre-pandemic. Working from home (at least part-time) was not a thing for everyone. Many companies still operated from a single brick-and-mortar location. Today, we see “back to work” or, as some call it, “work-from-work” (as opposed to “work-from-home”), but we’re clearly not going back to the world of 2018. We now take well-functioning online meeting software, such as MS Teams and Zoom, for granted, and we even use it for in-office meetings.

Several key concepts in cloud development were also emerging around the 2018 timeframe. Things like Kubernetes, cloud-native development, serverless computing, edge computing, and hybrid cloud scenarios had stabilized enough to be rolled out at large scale.
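As a quick reminder of what “serverless” meant in practice: you stopped managing servers and simply deployed a function that the cloud provider runs, scales, and bills per invocation. Here’s a minimal AWS Lambda-style HTTP handler in Python as one common example. The handler signature is the real Lambda convention, but the event shape shown assumes an API Gateway trigger, and the greeting logic is, of course, made up.

import json

# Minimal serverless handler in the AWS Lambda style: no server to
# provision or patch; the platform invokes this function on demand.
def lambda_handler(event, context):
    # With an API Gateway trigger, query parameters arrive in the event.
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }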

This was also the time when augmented reality (AR) was supposed to be the next big thing. Some devices were mostly vaporware, while others are still present in the market. Microsoft was one of the first to enter the market, with a truly inspiring design in its HoloLens device (Figure 1). Ultimately, the device fell short in both capabilities and availability. Although it did some amazing things, it was plagued by a field of vision that was just too small, and the device was also too heavy to wear for extended periods of time. Add to that the fact that it was essentially unavailable in any significant quantity, and it was a non-starter for most scenarios.

Figure 1: Microsoft’s HoloLens in action (image credit: Microsoft).

Magic Leap, Microsoft’s key competitor in this niche, had a design that was much lighter than the HoloLens, and it didn’t suffer from the same field-of-vision problem. It did require the computing power to come from a secondary component worn on a belt, which many purists didn’t like. Personally, I found the device much more enjoyable to use. Nevertheless, it never gained significant market share. (Magic Leap now has a second-generation device available, aimed at enterprise markets.) To this day, other companies are trying to crack a similar market segment, including Meta (Facebook) with Oculus (now re-branded as Meta Quest), Apple with the Vision Pro, and Sony with the XR HMD. Many of those devices are very impressive and cool, but they usually suffer from high price points and are often not comfortable enough to use for extended periods of time.

It seems that when it comes to augmented reality, the simpler concepts are more successful. Although AR through the viewfinder of a mobile phone is far less cool, Pokémon GO still outsells the more sophisticated offerings.

Politics, Music, Movies, and Games

This is where I usually try to look at big trends in politics in a non-partisan and inoffensive manner. Looking at what was going on politically five years ago (and ever since), I think I’ll just sidestep this topic for this article in the interest of civility. <g> I was, however, somewhat surprised to realize that Brexit is now more than five years in the past, the UK having voted to leave the European Union in 2016.

Moving on to the world of music, things are slightly happier, although I don’t remember all that many great new musical developments. “Despacito” is a song I remember well. “Shape of You” I remember less well, even though it dominated the charts. That may just be me, having somewhat lost interest in new music at that point in time. Maybe someone could hum “When We All Fall Asleep, Where Do We Go?” for me, because I sure couldn’t, off the top of my head. Yes, I’m losing touch with modern-day pop culture, it seems.

I’m slightly more in touch with five-year-old movies: “Black Panther” is already that old? Wow! So is “Avengers: Infinity War.” There was “Frozen 2.” Netflix and other streaming-only series had also gained significant importance; some would argue that certain series were more important than movie releases. “Stranger Things” was in full swing. “The Witcher” was out and became one of Netflix’s most-watched shows in 2019. I watched a lot of “Money Heist.” I never saw “Bird Box” or “Tiger King,” even though a CODE employee had a relative who was portrayed in the latter. We shall not delve into further details there, as he wasn’t thrilled by that coincidence.

Significant for CODE Magazine: “Devs” premiered on FX on Hulu (see Figure 2). This was a mini-series about developing quantum computers. We ran a cover story on the show, because its producers had asked us whether they could use CODE Magazine issues as props in the television series. Rod Paddock, the editor-in-chief of our fine publication, traveled to the filming location in the UK and wrote an article about it all. A TV show used our magazine as a prop, and we wrote about the show in the magazine, creating a circular relationship and breaking the fourth wall in all kinds of directions.

Figure 2: CODE Mag had a man on the set for FX on Hulu’s "Devs."

Unfortunately, the pandemic hit soon thereafter, and it somewhat stole the thunder of “Devs.” Admittedly, the show probably wasn’t going to be the highest-grossing cultural phenomenon anyway. There was another show, however, that was a runaway success: “Star Wars: The Mandalorian” was released in 2019 (see Figure 3). Not only did it make me subscribe to the Disney+ streaming service when it came out, but I accidentally ran into the premiere of “The Mandalorian” in Hollywood, as I happened to be in Los Angeles at the time for unrelated reasons. A very cool experience!

Figure 3: (L-R): Grogu, Din Djarin (Pedro Pascal), and Greef Karga (Carl Weathers) in Lucasfilm's THE MANDALORIAN, season three, exclusively on Disney+. (©2023 Lucasfilm Ltd. & TM. All Rights Reserved.)

One of the themes that seems to come up time and again as I’ve been writing this series of articles looking back over 30 years is how many good games have been coming out. Half a decade ago, it was games like “Red Dead Redemption 2” (Figure 4), “Sea of Thieves,” and “PlayerUnknown’s Battlegrounds (PUBG),” plus the reboots of several series, such as “Far Cry,” “Resident Evil,” and “Assassin’s Creed.” There were also some surprise hits and deep games, such as “Disco Elysium” and “Nier: Automata,” which I really didn’t see coming. Incredible! I also remember spending way too much time in “Dishonored” (a personal favorite of mine), which was originally released in 2012 (can you believe it?), but the follow-ups “Dishonored 2” and “Dishonored: Death of the Outsider” came out just over half a decade ago.

Figure 4: Rockstar Games’ Red Dead Redemption 2 is a fan favorite. (Image credit: Rockstar Games. All Rights Reserved.)

Gaming hardware was surprisingly steady in those years (and remains so to this day). Microsoft’s Xbox One had evolved into the Xbox One X. The PlayStation 4 was now the PlayStation 4 Pro. All good devices for sure, but it was also surprising how little had changed on those platforms beyond the obvious improvements, such as support for 4K resolutions. The PC, on the other hand, kept pushing performance forward for true gaming enthusiasts. Further innovations, such as Valve’s Steam Deck (released in 2022), were still not on the horizon. Nintendo already ruled the handheld gaming market with the Nintendo Switch (Figure 5).

Figure 5: The Nintendo Switch

Google also released Stadia in 2019 as an online gaming platform. The idea was that games would run on Google’s servers with the visuals streamed to clients. It worked better than most expected, with pretty good performance for all but the twitchiest of games, allowing users with even old hardware to play modern titles. Initially released to great fanfare, the service nevertheless came to an unfortunate end: Stadia never gained enough market share, probably due in part to the lack of an extensive game library. Google announced the shutdown in 2022, and all customers were refunded.

On a slightly odd note: the “Storm Area 51” event happened in 2019 as well. At least, it was supposed to. A bunch of UFO enthusiasts finally wanted to get to the bottom of what was really going on inside Area 51 in Nevada. It ended up being one of the largest Facebook events ever organized. “They can’t stop all of us,” they said. The “us” was about 3.5 million people who responded to the event. Apparently, they were wrong. In the end, only about 150 people showed up, and they were quickly deterred by the government. Whether it was by human soldiers or aliens remains unknown.

The Conclusion to Our Journey

This concludes our journey through 30 years of CODE history. Wow! It’s hard to believe that I started the CODE Group 30 years ago (now closer to 31 years). We went from stand-alone PCs to interconnected AI-driven devices with the power of the world’s cloud computing systems in the palm of our hands.

This is where authors usually say, “and yet, it has only just begun,” a phrase that’s used way too often. In this case, though, it’s especially true. We have just entered the era of artificial intelligence for real. I’ve been doing more actual development in the last 12 months than I have in a long time, and I’m having the time of my life with AI. I’m lucky to still be young enough to have hope that I’ll be able to ride this new wave for a long time. AI is a reset of everything we’ve done in this industry. Those of you who have attended any of my recent talks about AI know how enthusiastic I am about the things we’ve been building for ourselves and our customers. What a time to be alive and to be a software developer. I couldn’t be more thrilled!

It is the way!