From my childhood onwards, I’ve had a strong inclination for building things. Fascinated by sci-fi movies, I channeled this interest into crafting spaceships from Lego bricks. Because of my height, my parents enrolled me in sports activities, and volleyball caught their attention. While I diligently attended these training sessions throughout my childhood and displayed skill, sports never truly ignited a passionate flame within me. Nonetheless, the sports experience instilled in me perseverance and inadvertently led to a healthier lifestyle down the line.
Upon completing elementary school, the time came to select a profession. Given my enduring fascination with sci-fi, computers seized my attention. My family had a computer-savvy relative, an accomplished computer engineer, who consistently brought his machine along during visits. The allure of the command line entranced me; it evoked the computer interfaces of Star Trek.
In the early days, I got my hands on the iconic Amiga 500 computer. While personal computers were gaining popularity, I had the fortune of acquiring this machine for free, propelling me into a captivating journey. After a brief phase of gaming addiction in my youth, I unexpectedly discovered programming languages like Aztec C and Himpelsoft Pascal. The transition felt like destiny. Amiga games relied on floppy disks, and the passage of time led to their degradation; Slovakia grappled with a scarcity of Amiga-formatted floppies, as the majority had already moved on to PCs or Ataris. This scarcity catalyzed my deep dive into programming, which swiftly became my primary pastime.
And I relished it! I commenced crafting my own applications, games, operating systems, UIs, and command-line utilities.
Let me take you back to a time when technology was evolving at an unprecedented pace. It was a few years down the road, and I was about to embark on a journey that would shape my path as a software enthusiast forever.
Picture this: I finally got my hands on my very own bona fide personal computer. Now, this wasn’t just any computer – it was a real PC, a marvel that held endless possibilities. And as fate would have it, a family relative of mine had become captivated by the Linux operating system, which was all the rage at the time. With a sense of intrigue and anticipation, he set out to transform my new PC into a dual-boot wonderland, housing both Linux and the omnipresent Windows 98. You remember Windows 98, right? It seemed like it was everywhere back then.
As I gazed at my newly transformed PC, a peculiar shift occurred within me. My fascination with games, which had once consumed my every waking moment, had now taken a backseat to this new and exhilarating operating system. To me, it was as if I had stepped onto the deck of a starship from Star Trek, leaving behind the archaic world of the Amiga OS. The graphical user interface (GUI) was a sleek revelation, a portal to uncharted realms.
Naturally, my first order of business was to explore every nook and cranny of the system’s settings and internal workings. But, as they say, curiosity can lead to unexpected consequences. Before long, I managed to inadvertently wreak havoc on my once-pristine PC. It was a setback, but every cloud has a silver lining. With no Windows installation CD at my disposal and only a Linux installer to my name, I found myself taking the plunge into the realm of Mandrake Linux.
Let me tell you, the Mandrake Linux installer was like nothing I had ever encountered before. It was a cosmic symphony of options and customizations, an expansive universe of choices that seemed to stretch beyond the horizon. Every installation was a voyage of discovery, an opportunity to delve into new and uncharted territories of knowledge. I remember the sheer exhilaration of installing Mandrake Linux not just once, not twice, but a total of five times – all for the sheer thrill of the experience. Each installation ensured that I had all the tools and features I desired at my fingertips.
The most enchanting aspect was the boundless customization that Mandrake Linux offered. If you could dream it, you could achieve it. And if by some chance you missed a beat, you could weave your own magic through scripting. The possibilities were truly limitless, and I found myself both awed and inspired by the power that lay in my hands.
Looking back, those were the formative years of my journey into the world of software engineering. Little did I know that this early fascination with operating systems and customization would become the cornerstone of my career. Each trial, each installation, and each script I crafted was a stepping stone, leading me toward a future I could have only imagined.
Meanwhile, I was navigating the halls of high school in Slovakia. Attending the particular type of high school I did came with a unique requirement – participation in an “internship” program that formed an integral part of our education. The intriguing aspect was that the choice of where to “work” during this period was entirely up to me.
For me, the choice was crystal clear – a local internet café nestled within the heart of our city. It’s important to understand the context of the late 90s in Slovakia: gaining access to the internet was a somewhat daunting endeavor. You either swallowed the bitter pill of exorbitant fees tied to your landline or ventured out to the nearest internet café, where you could pay for precious time online. Landing an internship there felt like a coup of sorts – an entry ticket to unlimited internet use throughout the day. It was nothing short of a digital paradise.
In those days, the internet was a portal to an expansive realm of untamed content. To me, it was a window to the world, a treasure trove of information and entertainment. One discovery stood out: the myriad Linux distributions that dotted the digital landscape. This newfound fascination ignited a spark within me. With a fervor to explore, I embarked on a quest to download these Linux distributions, one after the other, each vying for its turn on the stage of my home computer.
It was during my high school years that I stumbled upon the Linux from Scratch (LFS) project—an opportunity I couldn’t resist. The decision to craft my own distro ignited a fire of curiosity that led me deep into the labyrinthine world of Linux internals, the kernel, bash scripting, and packaging systems. My affinity for Linux had already taken root, with X and the pekwm window manager shaping my digital landscape.
As my knowledge blossomed, I took a stride into the realm of administration, tending to routers and internal servers at the internet café where I honed my skills. And as the digital dawn of the early 2000s broke, the internet landscape held a different allure. Amidst the fervor of IRC (Internet Relay Chat), I found a community that transcended social media’s grip. The protocol’s Windows and Linux desktop clients opened a portal to a captivating realm, one widely embraced in Slovakia. This wasn’t just a realm of technology—it was a haven for geeks and enthusiasts alike. My fascination with the world of hacking and programming found a home among communities like hysteria.sk. It wasn’t just the valuable resources these digital pioneers produced; it was the real-time conversations that drew me in, a lifeline that connected kindred spirits across the digital cosmos. It was a time when solitude gave way to a vibrant sense of camaraderie.
Hours melded into days, and my life followed a unique rhythm: school, an afternoon voyage with Star Trek Voyager, and nights immersed in IRC discussions. As the final year of high school approached, I was struck by a new ambition—to transform my passion into currency, to earn money and acquire the latest hardware. A family relative once again extended a helping hand, recognizing my prowess in Linux administration and budding skills in PHP programming. In an era when PHP was a functional gem rather than the heavyweight OOP language of today, I ventured into the realm of Gentoo Linux server administration. These servers weren’t just systems; they were guardians of sensitive data, their security enhanced by grsecurity, chroots, and an array of kernel-level security measures. It was a valuable experience, but a desire for fresh challenges was beginning to simmer beneath the surface.
Embarking on a captivating journey through the dynamic world of technology, I delved deeper into the realm of PHP, sparking a collaborative venture with a family relative that would shape my trajectory. Our brainchild, QWiki, emerged in the early 2000s when the concept of a wiki was still novel—Wikipedia was in its infancy, and content management systems were a luxury. Back then, the internet was primarily composed of simple HTML files or extravagant corporate systems. QWiki, however, was a potent game-changer. Distinguished by modularity and versatility, it harnessed XML and XSLT transformations, novel tools that instilled the power to convert XML data into diverse formats, igniting my excitement.
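The heart of that pipeline was rendering one structured XML source into several output formats. Real XSLT does this with declarative stylesheets and a dedicated processor; the following Python sketch (element names invented for illustration, not taken from QWiki) only mimics the idea:

```python
# Conceptual sketch of the XML-to-many-formats idea behind QWiki.
# Real XSLT uses declarative stylesheets; this merely illustrates
# transforming one XML document into different representations.
import xml.etree.ElementTree as ET

doc = ET.fromstring("<page><title>QWiki</title><body>Hello, wiki!</body></page>")

def to_html(page):
    """Render the page as an HTML fragment."""
    return f"<h1>{page.findtext('title')}</h1><p>{page.findtext('body')}</p>"

def to_text(page):
    """Render the same page as plain text."""
    return f"{page.findtext('title')}\n{page.findtext('body')}"

print(to_html(doc))
print(to_text(doc))
```

The appeal of the XSLT approach was exactly this separation: the content lives in one XML document, and each output format is just another transformation applied to it.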
As the clock of progress ticked, I was welcomed into the University of Zilina, Faculty of Computing Science—a realm of unbridled opportunity. A 100Mbit internet backbone unlocked a realm of exploration, where even my aversion to mathematics couldn’t eclipse my passion for computer science lectures. A command over SQL, PHP, bash, XSLT, XML, and an array of other technologies set me apart from fellow students. While programming was a joy to me, for many, it remained a necessary academic endeavor.
Around this time, I joined a nascent startup, where a seasoned Slovakian Java developer, an Oracle virtuoso with a wealth of experience, held sway; his influence was formative. As time progressed, my capabilities transcended the confines of a student coder. I morphed into a versatile “guy-for-all” engineer, seamlessly straddling the realms of server administration and Oracle database management. A pivotal moment arrived when I was dispatched to Vancouver to meet with a key business partner, marking my introduction to the fertile grounds of North American innovation. There I encountered Gerald Bauer, a freelancer responsible for trailblazing experimental web solutions and CMS systems. Vancouver’s dense programmer population made it a hotbed of tech evolution, and it was under Gerald’s guidance that I first encountered Ruby, specifically Ruby on Rails, circa 2005. This revolutionary framework let me achieve unprecedented progress, far outpacing the clunky Java code linked to Oracle databases. A remarkable feat followed, as Gerald and I reengineered the search engine in a mere fortnight. We made a decisive shift to PostgreSQL, liberating ourselves from the clutches of Java and Oracle. The transformation was rapid and robust, and it breathed new life into our platform.
Returning to Slovakia, I found myself immersed in an era of burgeoning SEO and the frantic race for internet visibility. While the pursuit of business dominance didn’t captivate me, my fascination with technology endured. This curiosity prompted me to leave the bulkiness of Ruby on Rails behind, embracing the lean elegance of the Sinatra framework. As I bid farewell to the company, an arduous period in my life, I left behind a finely tuned framework and software suite that resonated with efficiency. I had delved into kernel tweaks and fine-tuned PostgreSQL settings to extract every ounce of performance, crafting an impeccably fast UI. Yet, despite my technological prowess, the tumultuous business landscape and the owners’ perpetual discord precipitated my decision to walk away.
In the early days of 2010, amidst the camaraderie of online communities like hysteria.sk and kyberia.sk, I remained unwavering in my conviction that traditional corporate roles weren’t aligned with my aspirations. However, fate had a new chapter in store for me. That same year, a pivotal message from Marek Mahut transformed my trajectory. He introduced me to Red Hat, a distinguished company based in Brno, Czech Republic, renowned for its resolute commitment to open source values and Linux innovation. While Red Hat Linux and Fedora were not my initial Linux distribution preferences, the company’s substantial investments in the Linux/GNU ecosystem resonated deeply with my favored distributions, LFS and Gentoo.
Setting foot in the corridors of Red Hat on January 1, 2010, marked the start of a new chapter. The company’s offices spanned two floors of a commendable building, exuding an open and welcoming atmosphere. My interview was an intriguing encounter—a time when Ruby development was an uncharted territory within Red Hat’s Czech arm. The conversation involved a Linux Kernel team lead and a libvirt project lead engineer, a duo whose understanding of Ruby/PHP/XSLT/etc. remained uncertain. Regardless, I was offered a place within the company’s ranks.
My entrance into Red Hat saw me join the DeltaCloud project, a significant component of the larger Hybrid Cloud initiative still taking shape in 2010. The mission was to craft a REST API, an emerging concept at the time, that granted agnostic access to the burgeoning infrastructure-as-a-service landscape. In a world dominated by concerns about customer vendor lock-in, Red Hat championed the creation of a universal protocol for managing virtual machine resources in the cloud—a protocol that could be embraced by all cloud providers. This vision resonated beyond Red Hat, as other providers recognized the benefits of enabling customer migrations from rival platforms.
Initially rooted in Ruby on Rails and led by the esteemed David Lutterkort (later CTO of PuppetLabs), the DeltaCloud project underwent refinements, shifting its foundations to the more streamlined Sinatra framework. This shift not only simplified its architecture but also made installation more accessible.
Here, I was schooled in a fundamental open-source tenet: Code’s brilliance doesn’t matter if it goes unused. I embraced this lesson by extending my efforts beyond coding to become an evangelist for the project. Armed with the conviction that hybrid cloud strategies and the avoidance of vendor lock-in were essential, I embarked on a journey to share the DeltaCloud vision at open-source conferences worldwide. The results were remarkable—CERN and numerous clients incorporated DeltaCloud into their projects, including Red Hat’s own CloudForms initiative.
Five years elapsed, and Red Hat’s strategic evolution led to the acquisition of a company that paralleled CloudForms’ offerings but boasted a more extensive customer base. However, my reservations about their codebase—the weightiness of their Ruby on Rails foundation, burdened with thousands of lines of controller code—prompted a change of direction.
At that juncture, the industry’s focus was on OpenStack—a revolutionary solution. Yet, my hesitation stemmed from the fact that OpenStack’s core was rooted in Python, a language I didn’t particularly favor. This disappointment was compounded by the realization that Python was an inescapable presence in the OpenStack environment.
With time, my quest for a new endeavor led me to scrutinize Ruby-based projects, outside the scope of CloudForms. And there, like a beacon, stood OpenShift.
During this period, a notable contender in the platform-as-a-service (PaaS) domain was Heroku, a provider that wowed the world by simplifying code deployment, runtime management, and scalability. Users marveled at the ease with which their applications could be launched and handled, all without the intricacies of Linux administration. With just a Git repository and a somewhat ungainly YAML configuration file, a simple “push” unleashed a world of magic. Both individuals and enterprises reveled in this streamlined process.
Enter the OpenShift project—a Red Hat initiative prompted by Heroku’s game-changing impact. Recognizing the market’s momentum and the fervor it generated, Red Hat, a trusted source of enterprise open-source technologies like Ruby and PHP, embarked on a mission to offer an in-house solution for deploying, running, and scaling applications. This was facilitated through a product that clients could install within their data centers, enabling developers to interact with it directly.
OpenShift’s initial iteration was a fusion of technologies, a blend of complexity and architectural simplicity. Employing AMQ messaging, database functionality, SELinux, and a sprinkle of Linux sorcery, it effectively isolated applications from one another. While it didn’t fully embrace Linux containers at that time, it combined container-like isolation with SELinux in units dubbed “gears.” These gears enabled developers to deploy applications via specific “cartridges” for languages such as PHP, Ruby, Python, and Java. This approach mirrored the success of Heroku, but under the banner of a proprietary cluster.
The next one or two years saw me deeply engrossed in the development of Ruby cartridges, among others, as well as overseeing the core OpenShift systems. My tasks ranged from bug fixes to performance enhancements, all while striving to enhance the user experience.
As OpenShift evolved, customer demands for more features and options grew, inevitably leading to bloat in the system’s architecture and rising maintenance costs. It was at this juncture that Clayton Coleman, a pivotal figure in OpenShift’s development, saw an opportunity to overhaul the cumbersome Ruby codebase. Drawing inspiration from a system Google employed internally, we embarked on recreating OpenShift in a relatively new language: Go. Dubbed geard (inadvertently reminiscent of systemd), the project was a collective learning experience, as we set out to recreate a system Google had already mastered.
Coinciding with our efforts, a French company named dotCloud was crafting their own PaaS project. Within this project, they birthed an internal tool called “docker,” which leveraged Linux containers as its foundation. However, it introduced a game-changing concept: layered filesystems and “images.” Container-like isolation itself was nothing new; for years, administrators had combined “chroot” and intricate bash scripts to carve out secure containers within Linux environments. Docker’s innovation lay in its layered approach. The speed difference was drastic: running a virtual machine at the time could take minutes and consume a significant chunk of system resources, whereas Docker containers produced the illusion of instantaneous virtual machine creation, a kind of technological magic.
Yet, Docker’s true revolution lay in its “images.” Previously, software packaging relied on formats like RPM, DEB, or “ebuilds,” which varied among Linux distributions, leading to software delivery inconsistencies. Docker’s “images” revolutionized this landscape. Instead of just packaging the software, Docker images encapsulated an entire operating system, providing a preconfigured environment tailored for seamless execution. This transformative feature paved the way for stability and consistency in software distribution, solving an industry-wide challenge. Adding to its prowess, Docker’s layered filesystem only fetched the additional layers required, resulting in efficient use of resources.
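The layered model can be sketched in a few lines of Python. This is purely a conceptual illustration, with an image modeled as a stack of dictionaries mapping paths to file contents; it is not how Docker actually stores layers:

```python
# Conceptual sketch: a container image as a stack of layers.
# Each layer maps file paths to contents; upper layers shadow lower ones.

def lookup(layers, path):
    """Resolve a path by searching layers from top to bottom."""
    for layer in reversed(layers):
        if path in layer:
            return layer[path]
    raise FileNotFoundError(path)

# A shared base layer (e.g. the operating system userland) ...
base = {"/etc/os-release": "Fedora", "/usr/bin/ruby": "<ruby binary>"}
# ... plus a small application layer stacked on top of it.
app = {"/opt/app/app.rb": "puts 'hello'", "/etc/os-release": "Fedora (patched)"}

image = [base, app]
print(lookup(image, "/usr/bin/ruby"))    # served from the base layer
print(lookup(image, "/etc/os-release"))  # shadowed by the app layer
```

Because the base layer is shared, pulling a second image built on the same base only fetches the small application layer on top, which is a large part of why containers felt instantaneous next to virtual machines.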
As OpenShift’s engineering team grasped the inevitability of Docker’s ascendancy, it was clear that integrating Docker into the existing Ruby-based OpenShift was no trivial task. The divide went beyond mere programming language differences; it extended to disparate containerization technologies as well. This hurdle led the team to re-embrace the “geard” project, now geared toward running Docker containers and images in lieu of the antiquated OpenShift cartridges.
However, a critical challenge remained: how to maintain the PaaS experience that OpenShift customers cherished while sparing them the need to craft their own Docker images from scratch. The solution emerged as the s2i (source-to-image) project, a nimble Go-based image builder. This tool blended a pre-existing Docker image with a programming language/framework and the actual project code. The result was a fully operational Docker image ready to run applications seamlessly.
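The core of the s2i idea is small: overlay the application source onto a language-specific builder image and record how to start the result. Continuing the toy layered-image model from above, here is a conceptual Python sketch (dictionaries stand in for image layers; the paths and run command are illustrative assumptions, not the real s2i tool's behavior):

```python
# Toy sketch of the source-to-image concept: combine a builder image
# with application source to produce a runnable application image.
# Dictionaries stand in for image layers; names are illustrative only.

def s2i_build(builder_image, source_files, run_command):
    """Layer the source onto the builder and record a start command."""
    app_image = dict(builder_image)   # start from the builder's files
    app_image.update(source_files)    # inject the application source
    app_image["/s2i/run"] = run_command  # what the platform executes
    return app_image

# An assumed Ruby builder image and a tiny application.
builder = {"/usr/bin/ruby": "<ruby>", "/usr/bin/bundle": "<bundler>"}
source = {"/opt/app/app.rb": "puts 'hello'", "/opt/app/Gemfile": "gem 'sinatra'"}

image = s2i_build(builder, source, "bundle exec rackup")
print(image["/s2i/run"])  # the command run when the container starts
```

The developer supplies only the source; the builder image and the assemble/run logic come from the platform, which is how OpenShift kept the Heroku-style "push and run" experience.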
With the integration of Docker and s2i, geard was then tasked with running these Docker containers, expertly scaling them, and deftly balancing the incoming traffic, bringing OpenShift to the brink of completion.
A seismic shift soon reverberated through the IT industry. The same Google project that Clayton had informed us about had a major announcement—Google decided to open-source it. The project’s release would attract a fresh wave of developers, features, and enhancements.
Acknowledging the potential, Clayton recognized the futility of competing with Google in this arena. The geard project was shelved, paving the way for the birth of OpenShift v3. The plan was simple: amalgamate Kubernetes (Google’s brainchild) with s2i and introduce additional APIs for a more PaaS-like experience. This would simplify adoption for customers who merely sought to run and scale their applications without delving into the intricacies of Kubernetes.
While conceptually straightforward, the implementation proved a different beast—especially for those unacquainted with Go language vendoring intricacies. The complexity of staying in sync with Kubernetes while properly integrating it into OpenShift’s ecosystem was a formidable challenge.
Around this time, I was a part of the “build” team, entrusted with maintaining the s2i project. My focus lay in ensuring that the builder images catered to the users’ needs while optimizing the efficiency of the s2i process. My role didn’t involve deep dives into the core of OpenShift or Kubernetes, yet their essence fascinated and, admittedly, intimidated me.
A year later, I was presented with the opportunity to lead the “master” team—an irony, given that the term “master” had not yet become contentious. The “master” team’s mission was clear: keep OpenShift aligned with the latest Kubernetes release. This was a mission-critical task, and the pressure to deliver was immense.
The members of my team dwarfed me in both skill and intellect—names like Jordan Liggitt, David Eads, Stefan Schimanski and Michalis Kargakis hold great significance for those in the know about the modern Kubernetes landscape. These brilliant minds contributed core Kubernetes features like RBAC, deployments, and more. Leading a team of engineers who continually amazed me with their solutions to seemingly insurmountable problems was a rewarding challenge. Their enthusiasm for problem-solving was palpable, and their mastery of Git and Golang vendoring set them apart.
As time progressed, I transitioned from team lead to a “group lead” role, where I oversaw core OpenShift teams, including the “master team,” “etcd team,” and “workloads/CLI team.” I became increasingly aware of the teams’ impact on the product, the struggles faced by customers, and the ramifications of bugs. However, with this shift, I realized I was stepping away from coding toward management. The realms of Kubernetes and OpenShift grew more distant as my focus shifted to people management and team efficiency through automation tools.
And so, this year, I officially embraced the role of manager, bidding farewell to my path as a software engineer. My focus shifted to optimizing team dynamics and effectiveness, mentoring aspiring engineers, and fostering the creation of future-disrupting technologies.
… to be continued?