The Curse of Blue Collar IT

Tech has been one of the more unorthodox career paths for the better part of the past half century. Few other careers have offered the raw flexibility or unbridled chaos of the tech industry. The industry has ridden out the dot-com boom and bust, the rise of Silicon Valley as a cultural force, and the corporatization of IT. Things have changed for better and worse, but the majority of IT has slowly evolved into a blue-collar office job.

Tech was once the Wild West, where anyone who could code, build, or even just fix a computer could find unbeaten paths forward. The first professional programmers made magic from scratch, but their efforts pale in comparison to what we’ve built on the shoulders of giants. The unordered wildness of an industry bursting into existence has eroded into a calmer, more boring sea. Accessibility is at an all-time high, but is this really what most people who got into tech still want?

Call centers are ruled by metrics and data analytics tracking their every action. Every millisecond away from the phone is profit ticking away. The scope of what can be supported gets narrower and narrower as the requirements get higher and higher. It feels suffocating to anyone looking to actually get better. Companies shy away from risk, which has led to the gradual death of the tech dream as a neutral meritocracy (whether it ever really was one or not). Where you landed at a new company was once seen as a function of intellect and mastery; now it’s a paper-pushing Game of Drones.

My first summer job wasn’t selling lemonade; I fixed computers. I had to learn how to fix the family computer since I was the one who always broke it, and my dad got sick of having to reload the OS. Once I learned the basics, I used those skills to make some money in the neighborhood. “Standardized” computer shops killed that dream, but many other ideas have since sprung from tech’s continuous churn.

Liability and Risk Aversion

I grew up when computers were still mysterious, but I watched them become something even the local big-box store would work on. Experience meant nothing next to a company willing to replace the machine if (or more often “when”) their minimum-wage employee running a repair CD broke it. Why risk expensive hours on an experienced tech when a CD repairs 90% of common problems and reloading the OS handles the rest? As the internet got faster and remote-control utilities became practical, call center IT rose with them. Computers were getting cheaper, and so were people’s attitudes towards how much they were willing to spend on fixing them.

It began to make sense to codify a scope of support: what is covered and what isn’t. A tech company no longer had to be local or have a relationship with the end users to support them. While not every IT company did this, enough did to shape the greater industry. Clients stopped viewing you as their technical wizard who helped as much as possible and started viewing you as a service position with a fixed duty. The loopholes exploited on both sides led to a mutual contraction of responsibilities.

Reduction of Risk and Liability

The traditional technical relationship was reduced to a service exchange. Larger companies began to say “no” more and more to special requests. You don’t do “favors” anymore, or you run the risk of being fired for setting a precedent that has to be upheld. If it can’t be quantified or fit into the scope, it’s a liability. Some companies hold out longer than others, but at some point it becomes a function of growth rather than a cultural decision.

To complicate everything, tech skills are not easy to measure. Some skills are domain-specific, others are universal, and still others can be transferred with the right training or knowledge. Everything depends on the intellect, skill, knowledge, experience, and personality of a tech. How exactly do you measure all of that in a way that makes comparisons more than apples and oranges? All of the ingredients are important, but none is sufficient on its own. How do you numerically rate a tech who has a deficit in one ingredient but several substitutes on hand?

More and more of the IT industry has become risk-averse. The irony is that tech is viewed as taking bigger and bigger gambles while it gets more and more conservative as a whole. The fringe is intense, but the bulk is boring.

Certificates and Knowledge

If you look at any job website, you’ll find something akin to a request for 10 years of experience in a framework that is 5 years old. You’ll also find countless postings for full-stack developers who are fluent in 8 languages, have worked intimately with AWS and Azure, have done kernel-level debugging on Linux, Windows, and macOS, can write an OS for an embedded system, and are willing to work for the department’s budget of $40,000 USD. Behind the overly specific requirements sit a desire to avoid training the hire and a lack of understanding of what the work actually involves.

While this happens at almost every job these days, it’s especially bad in tech. The people screening the resumes may not (and tend not to) have any technical knowledge, even at tech companies. They get a list of ideas about what is desired and a budget, and they cram in keywords they’ve heard because they have no way to make sense of it all. The trend towards risk aversion has made employers more and more conservative about hiring on “potential”, especially when they can’t weigh experience they don’t understand. To most companies, a person who has done kernel debugging on embedded Linux systems has less potential as a Windows tech than someone with an A+ certificate.

You can see this mindset creep into coding when employers demand a certain number of lines of code. Tech tends to be a mix of art and technique, and it gets dangerous when reduced to just technique. What happens when you hit a real problem or something outside the box? You need someone who truly understands the system, in a way that is near impossible to quantify, to solve it. How can you put a metric on the creativity behind a solution that breaks the current process yet saves it too?
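
To see how little the metric measures, consider a toy sketch (hypothetical code, not from any real codebase): both functions below behave identically, yet a lines-of-code quota rewards the padded one.

```python
# Toy illustration: identical behavior, wildly different line counts.

def sum_of_evens_padded(numbers):
    """Sum the even numbers in a list, the long way."""
    total = 0
    for number in numbers:
        remainder = number % 2
        if remainder == 0:
            is_even = True
        else:
            is_even = False
        if is_even:
            total = total + number
    return total

def sum_of_evens(numbers):
    """Sum the even numbers in a list, the short way."""
    return sum(n for n in numbers if n % 2 == 0)

# Same result either way; the quota can't tell which is better.
assert sum_of_evens_padded([1, 2, 3, 4]) == sum_of_evens([1, 2, 3, 4]) == 6
```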

A Quantifiable Mindset

Certificates have taken over at many companies as a metric of knowledge and skill in tech. While a certificate may prove some type of knowledge, how applicable or useful that knowledge is depends on the quality of the testing. Some certificates are useful, others are completely useless, but they all provide some measurable baseline. Companies rely on this promise to hire or promote. The mindset hasn’t quite crept into coding yet, but I’m sure it will, seeing as coding boot camps are already treated as proof of knowledge by some groups.

Not all certificates are created equal. A+ was at one point a complete joke and a red flag on a resume, but now it’s considered essential for newer techs. The certificate got better and has a great reputation now, but what about people coasting on old credentials? The certificate might expire, but do people outside of tech know that? How long does someone hang onto knowledge before it fades away? Without understanding the certificate, or an easy way to measure familiarity, there isn’t a way to tell whether a cert is a piece of paper or proof of knowledge.

The danger of a quantify-everything mindset is that not everything quantifiable matters or helps. Certain metrics can be blurred by multiple factors. Two factors may look the same on paper but be completely different in practice. Knowing intimately how a switch functions helps with debugging, but so does having actually done the debugging. The skills aren’t equivalent, yet in certain situations they can look the same on paper.

Where Creativity Goes to Die

Call center IT is the worst sort there is. Every operation is reduced to a ticket, typically worked against a one-dimensional metric. This constant vigilance to reduce a ticket to a single number incentivizes technicians to play games with that number. It also makes the metric a reflection of the shortcuts a company will take. An SLA metric means you get a generic response or phone tag at 4:59 PM on a Friday. A time-to-close metric means people playing ticket dodgeball and padding tickets. These trends are slowly creeping out into other parts of tech, though.
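
A minimal sketch of the incentive, with numbers invented purely for illustration: rank two techs on average time-to-close alone, and the one who dodges hard tickets wins on paper.

```python
# Toy numbers, invented for illustration: rank two techs by the
# one-dimensional "average time to close" metric alone.

dodger = [0.5, 0.5, 1.0]          # hours per ticket; cherry-picks easy ones
workhorse = [0.5, 1.0, 6.0, 8.0]  # hours per ticket; takes the hard ones too

def average_time_to_close(hours):
    return sum(hours) / len(hours)

print(f"dodger:    {average_time_to_close(dodger):.2f} h/ticket")     # 0.67
print(f"workhorse: {average_time_to_close(workhorse):.2f} h/ticket")  # 3.88
# On paper the dodger looks roughly 6x "better", despite closing fewer
# tickets and none of the hard ones.
```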

IT has become a position in many companies where creativity goes to die. Your worth becomes a function of a metric, certificates, and how few issues you cause, rather than your actual success. Clients can love you, but in many positions you’re worth the same on paper as someone who plays games and takes tests well. A mistake by one person on the team can cut off an avenue of learning for everyone. You have to know how to get out of this situation to avoid falling prey to the reductionism.

Not every job is like this, but it feels like we’re living through the historical shift from artisan craft to factory production. Artisans and guilds survived alongside factories for a long time, but their prospects for growth were limited by wealth and lifespans. Making something for a factory means getting faster at a single step, with an error margin built in, because the output usually isn’t “better”. This shift reduces a task from something potentially creative, with variables that shape the process, into a set of basic rules and percentages. As the task gets measured more “objectively”, the subjective qualities matter less and less, whether “objective” is actually objective or not.

The Death of a Dream

We all knew the most basic parts of the tech industry would be consumed by the singularity, but we all thought we’d be immune to the changes along the way. The continued codification of IT has become the creeping nightmare eating away at this illusion. The technology our industry once enabled is coming back to haunt us in the name of the bottom line. The Wild West of Technology is long gone; all we have left is the chance to run for higher ground.

The trick to staying ahead of all of this is to learn more and find a niche or location that can’t be easily reduced. Automation will eventually catch us all, but if your job is harder to measure, it’s also harder to dehumanize. If you long for the dream the tech industry once afforded, you need to look to higher ground.

You can’t just learn the basics of coding anymore; you need to know the basics, what they imply, and how to interface with others. The hard skills blur into soft skills. The more gray area and fuzz your job involves, the longer it will defy quantification and quantization. The dream is going up in flames on one floor, but that doesn’t mean you can’t outrun it.

Every technical position I take continues to allow growth into something more; I wouldn’t take it otherwise. I always look to take what I know and use it to advance my role. Automation is useless without the ability to troubleshoot, which is useless without an understanding of the ramifications of each action. Each role requires a minimum understanding of multiple levels of technical knowledge. There isn’t a sane way to reduce an understanding of each level, and the interactions between levels, to a single question or even a series of questions.

Riding the Fringe as a Career

You can evade the singularity for now. While more traditional tech jobs are going to exist for ages to come, the difference will come in how they’re measured, quantified, and maintained. A decade or so ago, a tier-one tech at most corporate shops could easily pick up the knowledge to grow into tier two and beyond; now they have to do all of that on their own time. A junior coder needs to push out of boilerplate territory into applicable practice to really advance. The big difference is that now you won’t be forced to, unless you’re on the fringe already.

The job becomes repetitive and scoped. The move to the next level gets harder and harder without foresight. If you wait too long, learning enough to stay relevant becomes a continual cost. You have to make employers take a leap of faith rather than make an investment. The burden of proof moves from them to you.

I’ve stayed relevant by jumping to the “next big thing” whenever it comes up. Before plain tech dried up, I had already jumped to automation. When automation dried up, I was in development. I didn’t jump to the cutting edge; I jumped to the frontier.

How do you push your job into the next unknown? How do you ride the trends without having to be the guinea pig (unless you want to be)? You have to create a pathway that consolidates the developments of tech without trailblazing solo (unless you’re gunning for it). It’s one thing to hit the wild prepared, and another to hit it without the slightest preparation.

When an aspect of a job is quantifiable, or is believed to be, it usually ends up quantified. How do you get somewhere less quantifiable, where the creativity and fun stay alive? Modern tech is a blue-collar job unless you can get to the next level. Where do you see tech going, and how do you get ahead of it?

Image by Peter H from Pixabay