IT Meritocracy. Part 3: Hiring Binges — Fraud, Incompetence, or Technology Limitations?
Hope you are enjoying my series. The previous post talked about fewer programmers doing more and earning more. I don’t preach; I practice, obviously in my own company, since I was never allowed to during my corporate IT career. Staffing middlemen interests aside, I’ve heard countless bloated-headcount excuses from non-technical managers (“We are not Google”) and, I kid you not, from developers too: “But don’t you need a team?”, “Duh/Why?” (after showing one how to cut the amount of code to write by 100x), and “I am not comfortable with such extreme focus” (after showing my colleagues one of the indented bullet-point task lists I copy-pasted into the first article in this series).
What is really driving headcount explosions, and why is the Earth running out of programmers even after annexing densely populated “offshore” havens like India to the Western labor pool? As an engineer, I tend to think (inadequate and stagnant) technology is the culprit; however, let’s examine the other factors too: fraud and incompetence.
IT History Recap.
I heard the phrase “We are on a hiring binge” from one CFO during a typical life-or-death negotiation over a $10K raise to bring the salary to the upper industry average ($150K in 2013, if you are wondering). I’m sure you have plenty of similar experiences. You know what recruiter confessions of needing to “fill 15 Java reqs” mean. Why they arrogantly (or stupidly) bring it up in emails to candidates is beyond me. Perhaps they are hinting at a well-funded project, from their perspective. They need to share the joy of those “requisitions” and “job orders”, I guess. What it really means for you: the merchandise being sold to fill said “purchase orders” is cheap labor, the opposite of the fewer-people-doing-and-earning-more approach I outlined in this series.
How do those “hiring binges” happen? There are three reasons: incompetence, fraud, and inadequate technology. The first two are self-explanatory. No board wants to keep an incompetent CIO (and everyone below). No one wants to steal. It follows the same logic as 19th-century gold rushes: the local economy flourishes when mining gold (programming) becomes easier than robbing gold miners (man-hour markup). Not the case in today’s corporate IT, i.e. outside the handful of consumer tech companies like Google, is it?
I am an engineer, so my opinion is biased, focused squarely on the third variable: technology. I joined the industry in the 90s, around the time business process complexity started to outpace automation technology. Here’s a brief IT history recap.
Before the late 1970s, business software, or better said software in general, as there were no (consumer) PCs, was intellectually owned by just one company: IBM. Then Oracle took over with the relational aka SQL database idea, taken, if anyone still doesn’t know, from an IBM engineer whose work was ignored by his employer, since IBM was reluctant to cannibalize the sales of their old crap. The same thing happened to Oracle in the late 90s, when one of its execs quit and founded Salesforce, which didn’t invent anything new: it just meticulously optimized and polished Oracle’s relational technology.
All custom (Java, Ruby, Python, etc.) solution providers use the same age-old approach too. To this day, 99% of business aka enterprise software on Earth conforms to Larry Ellison’s original vision of related two-dimensional tables of data managed by structured (aka procedural) programming, the only paradigm that existed at the time. Allegedly “object-oriented” languages like C++ and Java are used predominantly procedurally.
I don’t want to start a discussion on OO purity. The majority of production code I’ve seen over 30 years was procedural. IMO the biggest reason behind it is the inherently anti-OO, dumb tabular data structure. Revolutionary compared to the proprietary flat-file formats of the 60s, it didn’t play well with mid-80s C++. One way or another, the relational/structured combo started showing its limitations in modeling increasingly complex business processes and government regulations a long time ago. That cancer grew unnoticed for a decade or so, only producing symptoms in the 90s.
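To make the “tables plus procedural code” point concrete, here is a minimal Java sketch. It is purely illustrative: the invoice example, its fields, and its rules are hypothetical, not taken from any real system. The first half mirrors a relational row manipulated by outside helper functions (the “structs with functions” shape); the second half models the same data as an object that owns its behavior.

```java
// Procedural shape: a "struct" mirroring a relational row,
// plus a bag of static functions that poke at it from the outside.
class InvoiceRow {
    long id;
    double amount;
    double taxRate;
    boolean paid;
}

class InvoiceUtils {
    // The business rule lives far away from the data it governs.
    static double totalDue(InvoiceRow row) {
        return row.paid ? 0.0 : row.amount * (1 + row.taxRate);
    }
}

// Object-oriented shape: the same data encapsulated with its behavior,
// so callers cannot bypass the rules the object enforces.
class Invoice {
    private final double amount;
    private final double taxRate;
    private boolean paid;

    Invoice(double amount, double taxRate) {
        this.amount = amount;
        this.taxRate = taxRate;
    }

    void markPaid() {
        this.paid = true;
    }

    double totalDue() {
        return paid ? 0.0 : amount * (1 + taxRate);
    }
}
```

The first shape maps one-to-one to a database table, which, as argued above, is why it dominates code written on top of relational storage; the second is what OOP was designed to encourage.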
The human mind has many limitations. E.g. our visual perception is limited to a three-dimensional coordinate system. Our written/verbal communication is constricted by verbally pronounced alphabets, a point made in the movie Arrival, which compared human linguistics to advanced alien means of communication. Programming languages have similar shortcomings. Databases too.
Speaking of the latter, luckily Oracle’s reign has come to an end, though it will take time for NoSQL to become mainstream. I don’t want to talk about the hyped AI future or speculate, in typical Forbes manner, about unorthodox hardware, e.g. analog vs. digital computers. And to give relational technology its deserved credit, the main convenience of the old tabular data format is its universal nature, similar to universal (though inferior to the advanced aliens’) human languages.
What it means is that one can do absolutely anything with classic Excel tables by connecting hundreds of thousands of spreadsheets via VBA macros. It will just be extremely expensive, to the point that the Earth will not have enough programmers to produce and maintain software like that. Same thing with horses instead of cars (trains, planes, etc.), and with generations of slaves building Egyptian pyramids compared to lean teams of modern construction specialists operating robust machinery.
That’s exactly what started happening in IT in the 90s. Oracle Forms, revolutionary ten years earlier and the template for absolutely every enterprise software development tool used today, started to show its age compared to the best technique for capturing the complexity of the outside world: object-oriented programming, invented specifically to model complex processes, organisms, and things, e.g. in molecular biology. Though groundbreaking, OOP was crude and hard to learn. It didn’t gain enough traction in the 90s. Nothing has changed today. Most C++ code looks like “structs with functions”.
Imagine a transportation system. It consists of cars/trucks and planes. The former require roads, while the latter require orders-of-magnitude more expensive infrastructure: airports. Many metropolitan areas, notably my current place of residence, Southern California, reached their motor vehicle capacity a long time ago. There is no land to build new freeways. SoCal is suffocating without new roads, while planes and helicopters are still too expensive to produce and operate.
Flying cars are a myth. No soccer mom will ever learn to fly. Spare me the talk of “self-driving” cars; they really should be buses (driverless shuttles), which brings us back to big planes. Shouldn’t all buses be airborne? Pilotless aviation may come sooner than programmerless software development (true AI), but neither is going to happen in the next few decades, is it?
Someone still needs to pilot planes and helicopters. Even if the aircraft somehow became affordable, there is still a finite number of pilots: people capable of learning how to fly. However, the population is growing, meaning more drivers on already congested roads, not more pilots in the sky.
What if the aircraft were 100 times faster and bigger inside, while more compact overall, i.e. without wings or blades? What if they could use any flat surface to take off and land vertically? In other words, what about uncompromised industrial-grade tools made better for the people skilled enough to take full advantage of them? The same number of skilled pilots would then solve most passenger and all cargo transportation problems, right?
One day the technology will be there, making freeways obsolete. It’s easier to make a complex plane faster, still operated by a trained pilot, than to make such a robust machine simpler and foolproof, operated by an inattentive soccer mom or dad who’d crash it just as they get into “innocent”, “accidental” fender-benders every year.
IT had its “planes”: OOP in the 90s. They just weren’t fast enough. Hard to learn too. IT tried to develop its “flying cars”: pseudo-DIY software development tools (ripoffs of the original Oracle Forms, from MS Access to the CMSes that followed, etc.), still foreign to and never used by non-programmers, yet awkward, inefficient, and plain intelligence-insulting for skilled programmers.
Overall, faced with more demand, the existing “ground transportation” infrastructure predictably started to fail, like LA freeways.
The Infamous 70% Failure Rate.
I first came across the infamous 70% IT project failure statistic in a 2003 IBM WebSphere book. I googled it and found one of the sources: Michael Krigsman’s ZDNet blog. I am not linking it here because he’s since removed the number, apparently declaring a truce with the Great American IT Consulting Food Chain (Deloitte, Oracle, and others) he used to stand up to. Though Mr. Krigsman never officially disproved his numbers. According to my own industry experience (tens of mission-critical projects in companies ranging from Fortune 100 IT to boutique software shops), the 70% rate first appeared in the mid-90s. Having sharply increased to 90% during the 2000s offshoring bonanza, it is not going down any time soon.
IT failures don’t mean the complete inability of, e.g., Salesforce or enterprise Java to tame business process complexity. Freeways are also (theoretically) designed to flow even at full capacity. A freeway doesn’t have stop lights, does it? If everyone kept driving, it would always move smoothly. However, circumstances and driver mistakes bring any freeway to a halt during rush hour. Does it mean you stop completely, turn off your engine, and wait? No. It just takes significantly longer to reach your destination. Similarly, the 90% IT project failure rate means that a complex project is likely to succeed on the 10th try, after succumbing to inevitable programming mistakes, “communication problems”, and thousands of logistical issues during the previous nine attempts.
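For those who want to see where “the 10th try” comes from, here is a back-of-the-envelope sketch. It assumes, purely for illustration, that every attempt fails independently with a probability of 0.9; under that simplification the number of attempts until the first success averages 1 / (1 − 0.9) = 10.

```java
import java.util.Random;

// A rough sanity check of the "succeeds on roughly the 10th try" arithmetic,
// assuming each attempt fails independently with probability 0.9.
public class FailureRateSketch {
    public static void main(String[] args) {
        double failureRate = 0.9; // assumed per-attempt failure probability

        // Analytical answer: geometric distribution, mean = 1 / (1 - failureRate)
        System.out.println("Expected attempts: " + 1 / (1 - failureRate)); // ~10.0

        // Quick simulation of 100,000 hypothetical projects
        Random rng = new Random(42);
        long totalAttempts = 0;
        int projects = 100_000;
        for (int i = 0; i < projects; i++) {
            int attempts = 1;
            while (rng.nextDouble() < failureRate) {
                attempts++;
            }
            totalAttempts += attempts;
        }
        System.out.println("Simulated average: " + (double) totalAttempts / projects);
    }
}
```

Real projects are neither independent nor identical, of course; the point is only that a 90% failure rate doesn’t mean “impossible”, it means “expensive in retries”.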
I can give you plenty of examples from my two decades in IT. I still remember a vanilla ecommerce website finally built by one of my employers after 10 years of trying. Predictably, the developer (it could only be an engineer, not a semi-technical manager) who took it upon himself to mobilize a tight team of fellow geeks to make things happen was laid off after he dared to ask for a promotion (pushing out someone else). It proves yet again that there is no reward for any achievement once you have reached your (compensation) cap.
Which One Is It Then?
Going back to the three factors behind bloated headcount (fraud, incompetence, and technology limitations), the first one, fraud, is simply making the best of an always-bad situation. If the project is expected (with 70–90% probability) to fail due to inadequate technology, the parent corporate entity can use IT as an obscure R&D tax-hedging sink, transferring (kicked-back) funds to staffing-middlemen buddies and relatives. You cannot play this game with servers and routers, since everyone knows the markups. The human cattle though… sorry, “human capital”, is ideal for muddying the waters with the appearance of work to realize funds. It doesn’t matter whether it’s an IT department budget or VC rounds of funding. Neither is financed with the decision makers’ own money.
Incompetence is also easily explained. Since, due to the same technology limitations, no one across the entire industry delivers working solutions to set the benchmark, incompetent managers all the way up to the CIO are simply forgiven. Or fired with a golden parachute after their typical five-year contract and immediately hired somewhere else… where a similarly failed CIO was just fired, duh. I’ve personally seen plenty of such circulation in two big Southern California industries: entertainment (Hollywood) and car sales/distribution (the American headquarters of Japanese and Korean car makers).
As for the technology itself, the aircraft of the future are here. General computer science didn’t hibernate, unlike corporate IT, which is still celebrating the Y2K “aversion”, i.e. extending 1970s tech’s end of life past 2000. Today’s programming landscape is vastly different from the mid-90s, when the IT woes started. IBM, Oracle, and Microsoft ruled the world back then.
Thanks to independent open-source teams, not to mention all the programming languages and tools donated to the world as open source by consumer tech leaders like Google and Facebook, one will not only survive today without a single piece of Microsoft or Oracle technology, but will most definitely thrive, compared to the prospect of adapting and integrating their old-school offerings. And no, Oracle did not invent Java, let alone the millions of its open-source libraries.
The technology is here. The only remaining issue is putting advanced programming tools and techniques together in a different way, since they were originally developed for consumer products at Google and Facebook. How easily can those hypersonic vertical-takeoff passenger planes be converted into cargo ones? It’s not rocket science, if you ask me. I’ll explain it in depth in the next article: https://medium.com/@arog/it-meritocracy-part-4-pedaling-in-a-higher-gear-is-writing-less-code-better-c9099d96dd81