Object-Oriented or Functional? Just Write Quality Code.

Alex Rogachevsky
16 min read · May 4, 2018

That’s right. Who cares? Just write good, minimalistic code. Does that make OOP and FP unimportant? No, it makes them equally important for writing quality code.

This post is for programmers. I am not going to explain OOP or FP by referring to “fundamentals” from Knuth, Dijkstra, or Stroustrup. Sorry, there won’t be a linguistic discussion comparing Haskell to Smalltalk either, or other purist pet peeves. Nor will you see “Hello, world!” level code samples. I simply want to point out how you can turn your expert programming (OOP and FP) skills into money.

Differences Between OOP and FP

FP is older than OOP, yet it was only adopted by the industry recently. Think what you want; in my opinion FP is not limited to one language, e.g. Lisp dialects like Clojure or the JVM’s Scala. Today’s programming requires using at least two languages to develop an industrial-grade project.

The main reason for FP’s comeback, again in my very private opinion, is the key programming paradigm popularized in the last decade: Inversion of Control (IoC).

99.9% of programmers use IoC every day, whether they explicitly acknowledge it (e.g. by consciously choosing Spring) or not. It’s not limited to Spring. The “inversion” means turning things inside out: you write the innermost snippets of code (lambdas, event handlers, callbacks, etc.) injected into the right places, to be called by the framework at the appropriate moment.

The concept existed long before Spring. Frameworks like Spring simply hide the main() method/function, main entry point, main event loop, or whatever “main” is called in your language of choice. And whether the pattern itself is called IoC, Observer, Pub/Sub, or something else, it is about the same thing: reacting to something without much care for the plumbing that called your “event handler”. Someone else develops that “main” plumbing along with all of its services, commonly called the “container”.
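To make the “inversion” concrete, here is a minimal sketch (hypothetical names, no real framework): a tiny home-grown “container” with a one-method handler interface. You write only the lambda; the container owns the loop that decides when to call it.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal IoC sketch. EventBus and Handler are made-up names, not a real framework:
// the point is that you only write the lambda; the "container" owns the loop that calls it.
public class IocSketch {

    interface Handler {                     // a one-method interface: any lambda fits
        void handle(String event);
    }

    static class EventBus {                 // the framework plumbing, normally hidden from you
        private final List<Handler> handlers = new ArrayList<>();

        void subscribe(Handler h) { handlers.add(h); }

        void publish(String event) {        // the "main loop" deciding when your code runs
            for (Handler h : handlers) {
                h.handle(event);
            }
        }
    }

    public static void main(String[] args) {
        EventBus bus = new EventBus();
        // Your entire contribution: a procedural snippet injected into the container.
        bus.subscribe(event -> System.out.println("Handling " + event));
        bus.publish("order placed");
    }
}
```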

Now, do the innermost snippets of code I am going to refer to as “event handlers” require OO classes? Of course not. Most event handlers and other callbacks are 100% procedural. That made many think OOP is obsolete. Not so fast…

I know, it’s natural to look for some silver bullet, especially at the beginning of your programming career, overwhelmed by the need to learn thousands of new things. Let’s not swing in one direction or the other by saying that everyone writes event handlers of different kinds, or that every lambda/closure is a one-method interface. Those interfaces are not very object-oriented, are they?

Being trendy at the moment (at least until the code monkeys discredit it, just as they discredited object-oriented Java), FP doesn’t need any justification. OOP does.

Why OOP Matters

If any of you have read my previous posts here, you know what I think of “algorithmic” interviews. Why does OOP suddenly matter more than algorithms? Someone who writes click and other handlers all day long, or worse, fixes bugs in somebody else’s handlers, may spend an entire career never being asked to design an object hierarchy.

There is a big difference between being asked to do something by your boss and actually solving the problem for your customer.

Sorry for patronizing. Fundamentally, you can expect an adequate reward, e.g. Google’s or Netflix’s $400K compensation packages, only for solving a paying customer’s problem. Not for logging your hours so your staffing middleman gets his/her commissions. I am not idealizing “big tech” like Google and Netflix. They happen to be more efficient compared to IT. Run by engineers, they align one’s work with customers’ technical needs better than MBA-managed IT does.

Only the complete solution matters, whether you are the founder of your own company or a key contributor to your current day-job project. The question is, do you want to be a bug-fixing code monkey, a narrowly specialized dev pigeonholed into some abbreviation by his/her bosses, an “integrator” of somebody else’s code of questionable quality, or do you want to build a complete product solving someone’s problem?

How many complete solutions are based on algorithms, e.g. of the machine learning or big data kind? Is everything “un-scientific” and “un-algorithmic” covered by Excel, Access, and their cloud implementations: CMSes, DMSes, etc., all the way up to Salesforce? I’ll skip the prehistoric “almost turnkey” “packages” that require months and millions to “tweak” and “integrate” — one of the biggest lies in the enterprise automation industry.

Again, it’s my private opinion. There is a vast empty space between CMSes and Salesforce, for starters. Conventional technology doesn’t offer anything there. What does it mean for you? An opportunity — a lot of work (and reward) for normal quality programming w/o a single custom algorithm or AI “science”. Why?

Because no one has applied solid programming e.g. OOP plus FP to business process automation.

Non-technical IT bosses have hated programming for several decades, trying every possible semi-DIY tool to get rid of “expensive” programmers. “Business-oriented” “languages” from ancient COBOL to the latest Apex (Salesforce) are not very robust, regardless of Salesforce calling its building blocks “objects”. They are not objects in the true OOP sense. The newest (still 20 years old) and best of breed, Salesforce, is a meticulously optimized and streamlined take on 1980s Oracle Forms, like everything else in that space today. And the custom Java spaghetti typed by millions of code monkeys at Initech-like “solution providers” is not object-oriented either.

I have spent 20+ years in American IT alone, and have yet to see any C++ or Java code that is not structs with functions. Does it render OO languages like C++ and Java obsolete? They’ve been discredited by code monkeys to the point that everyone is asking if Java is dead (vs. JavaScript). Believe me, it will be Clojure and Scala in a few years, after the same bodyshop-supplied “discount resources” get their hands on those. OOP or FP, IT has never known true programming. Especially in the 21st century, after it evicted its last talent (whoever didn’t leave for dot-coms in the late ’90s) to the Googles through wage-slashing “outsourcing”.

Going back to algorithms, I’d say not even 1% of real-world software products, i.e. the ones being sold to real customers vs. Silicon Valley “exit” vaporware, are based on complex algorithms. And even if some are, you still need the users/security model, screen flows, and other business logic modeling some business process: a 100% OOP specialty.

Science and algorithms are great, but like “event handlers”, they are parts of a bigger offering. The customer buys a finished product, not a Python library to calculate something. I have tremendous respect for narrow (ML, etc.) specialists, but there is a severe shortage of generalists delivering complete solutions, as everyone wants to learn some hot abbreviation to find a better paying job or cram “algorithms” and apply to Google.

It’s not your fault that most employers pigeonhole developers into little specializations, following the scholar’s folly of being paid for good grades: abbreviations in your resume, “years of experience”, etc. Just don’t expect pre-outsourcing compensation (currently offered only by a handful of companies like Google and Netflix) for being a narrowly specialized, “offshorable” pawn. You need to build bigger things to deserve $300–400K.

How are those bigger things structured to even work w/o crashing, let alone be extended with future functionality? As structs (DTOs) and flat “facades” of functions (DAOs, services, and other “J2EE patterns”), forming “big balls of mud”? No.

Short of AI writing 100% of software, there is no other way to comprehend complex organisms and business processes than to dissect and abstract them into meticulous object hierarchies with precise injections of snippets of “functional” code.
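To contrast the two approaches, here is a small, purely illustrative sketch (the class and the rule are made up): a domain object that owns its behavior instead of being a getter/setter struct shuttled between DAO facades, with the one genuinely variable rule injected as a “functional” snippet.

```java
import java.util.function.DoubleUnaryOperator;

// Hypothetical domain model: an Account that owns its behavior instead of being a
// getter/setter struct passed between DAO facades. The one genuinely variable rule
// (how interest is computed) is injected as a functional snippet.
class Account {
    private double balance;
    private final DoubleUnaryOperator interestRule;    // the injected "functional" piece

    Account(double openingBalance, DoubleUnaryOperator interestRule) {
        this.balance = openingBalance;
        this.interestRule = interestRule;
    }

    void applyMonthlyInterest() {                       // behavior lives with the data
        balance += interestRule.applyAsDouble(balance);
    }

    double balance() { return balance; }
}

class AccountDemo {
    public static void main(String[] args) {
        Account savings = new Account(1_000, b -> b * 0.04 / 12);  // 4% APR, applied monthly
        savings.applyMonthlyInterest();
        System.out.printf("New balance: %.2f%n", savings.balance());
    }
}
```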

Put it this way: writing the innermost functional pieces injected into some IoC framework like Spring is great. Have you ever dared to write code of Spring’s caliber?

That’s what Google and Netflix do every day. Not the “algorithms” or “science”. It’s all about growing up — out of your narrow specialization or your dumbed-down (to accommodate the cheapest code monkeys) “outsourced” project. Like I said, you’ll only be rewarded for putting effort (applying the things you’ve learned) into building a complete solution for somebody, instead of mindlessly continuing to “learn” by cramming some tutorial and job-hopping to another employer, hoping to outrun the tidal wave of “outsourcing”.

That is, unless you intend to make a living talking about programming, e.g. at job interviews, or selling your tutorial knowledge as an “enterprise architect”. Those are commodity occupations, like general “management”. No matter what they pay today, they get “outsourced”.

Stop Living in the Constant Job-Seeking Race.

Unfortunately, job-hopping is necessary at the beginning of one’s career, unless you are at Google or Netflix, which provide a practically infinite career future.

The rest of the industry is f-cked up. If life-or-death initial negotiations over $5K are any indicator, your bosses are not going to notice your professional growth and give you a raise automatically. If that wasn’t bad enough, bodyshopping and offshoring, i.e. living under the threat of being replaced by cheap code monkeys (whether they can do your job or not doesn’t matter to non-technical corporate decision makers), put everyone on the defensive.

Last, but not least, comes the beginner’s desire to please his/her teachers by showing how well he/she learned something. Relax. You’ve graduated. Google and Amazon may tell you otherwise at their “algorithmic” interviews, which resemble college finals, but in the grand scheme of things, i.e. solving real-world customer problems, teachers no longer matter. Nor do the interviewers you treat like teachers.

Stop thinking of your work as impressing the next interviewer, e.g. with OOP. Just do a quality job, which rarely looks impressive. An average enterprise project hardly requires 10% of the GoF patterns or other book stuff. It’s just OOP basics: inheritance vs. composition; clean, minimalistic, and orthogonal interfaces; polymorphism; etc. No magic to impress somebody. Just quality — leading to the minimum lines of code.

Don’t expect anything good from your boss, present or future (the one interviewing you). You are putting in quality work for your customers, gaining experience in solving their problems. One day, with a solid portfolio of working projects (vs. architectural advice or logging your hours and contributing to IT’s 70–90% failure rate), a good boss will find a rare problem solver like you (unlikely), or you will become your own boss, selling your work directly to the customers.

Current Technology: The Chicken-and-Egg Problem

Why don’t developers outside of the few Googles dare to build bigger things and solve pressing automation problems, e.g. in the enterprise software industry, stagnant for 20 years? I’ll skip the man-hour fraud and incompetence, covered in my Meritocracy series here. From the unbiased engineer’s point of view, it is a 100% technical problem — current technology limitations.

Start doing something bigger by the book, and you’ll end up crawling to investors to finance the same team of narrow specialists. Are there different books? Don’t expect “enterprise architects” to write them. They work for the Great IT Consulting Food Chain, living off man-hour revenue.

Are we doomed and need to wait a couple of centuries for the true AI to replace programmers? No. The right technology already exists, albeit limited to the B2C space of Google and Netflix. Are they going to expand into the inherently custom enterprise software development? Everyone knows the answer. They won’t touch anything unglamorous and “unscientific” like that with a 10ft pole.

Want to know what is wrong with the conventional enterprise technology exactly?

It is the tabular aka “relational” data model, incompatible with OOP. ORMs or not, flat data leads to flat DTOs (good old C structs) and DAOs (flat facades of functions). The solution? Making the “middle tier” developer a true generalist, talking to the customer and modeling everything he/she learned about the business process, from the UI to the persisted data, through OOP.

The very first question such a generalist developer will ask is why he/she needs tabular databases instead of modern document ones like Mongo, which save data in its natural polymorphic object-hierarchy form. The polymorphic part is not trivial, but I don’t want to bore you with the technical details of how I did it in our Px100 Platform.
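For illustration only (and emphatically not how Px100 does it), here is a naive sketch of persisting a small polymorphic hierarchy straight to MongoDB with the plain Java driver, using a type discriminator field. The database, collection, and field names are made up.

```java
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.bson.Document;

// Naive sketch: a polymorphic Contact hierarchy saved into one Mongo collection,
// each subclass producing its own document shape plus a "type" discriminator.
public class ContactStore {

    abstract static class Contact {
        String email;
        abstract Document toDocument();              // each subclass knows its own shape
    }

    static class PersonContact extends Contact {
        String firstName, lastName;
        Document toDocument() {
            return new Document("type", "person")
                    .append("email", email)
                    .append("firstName", firstName)
                    .append("lastName", lastName);
        }
    }

    static class CompanyContact extends Contact {
        String companyName, taxId;
        Document toDocument() {
            return new Document("type", "company")
                    .append("email", email)
                    .append("companyName", companyName)
                    .append("taxId", taxId);
        }
    }

    public static void main(String[] args) {
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            MongoCollection<Document> contacts =
                    client.getDatabase("demo").getCollection("contacts");

            PersonContact p = new PersonContact();
            p.email = "jane@example.com";
            p.firstName = "Jane";
            p.lastName = "Doe";

            contacts.insertOne(p.toDocument());      // one collection, many shapes: no flat tables
        }
    }
}
```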

Have you seen such generalists in your “IT organization” — even after replacing Oracle with Mongo? 99% of Mongo databases mimic old relational data structures, not taking any advantage of the natural schema-less NoSQL way to store data. Worse, that’s followed by age-old follies, e.g. Mongo DBA jobs, since allegedly a “middle tier” developer doesn’t understand databases. The cheapest code monkeys sure don’t, but that’s a separate discussion of dumbing down projects. It was that way with SQL too. Did no one get the memo that SQL is taught to every CS grad?

Databases like Mongo were created to get rid of DBAs along with all the “communication problems” between them and developers. Speaking of the latter, how do they typically access Mongo? Yes, through age-old DAOs and DTOs.

Other dogmas follow. Business analysts (product managers, whatever) write the “requirements”. UI designers design screens. DBAs look at those screens and translate them into Excel-ish tables. And the poor, squeezed middle-tier developer? Tasked with writing DAOs and DTOs to connect the UI with the database.

It’s 100% reasonable for him/her to question why a DTO should be a class with getters and setters instead of a dumb “struct”. I have a better question: why does such a useless, straight-passthrough “middle tier” job even exist? Save the data to Mongo directly from the front end. There is no reason for multiple application tiers that pass the data along w/o changing it.

See what happened here? The most powerful and advanced tool, Java (compared to browser scripting in JS and archaic database scripting via triggers and stored procedures), is being ignored. It’s scripting vs. programming. And Java is not a single language. It’s a linguistic ecosystem with robust patterns and millions of open-source frameworks, from the quintessential Spring (which JS with React, let alone PL/SQL, has nothing on) to clever little GitHub-hosted libraries.

I understand: everyone wants to quickly hack something together; our semi-technical bosses have always wanted to replace complex and “expensive” programming with scripting; and generally there are very few people on Earth who enjoy diligent programming, which requires keeping more than three (the human comfort level) logical chains running in one’s brain.

It still needs to be done, exactly as it is for other complex engineering products, e.g. cars. No matter how much one wishes to 3D-print a real car in his garage, cars are still built at expensive factories after months of research and meticulous engineering.

There is no way around product complexity. There is only a choice between writing a million lines of JS event handlers and stored procedures, or 10–100x fewer lines of well-structured object-oriented code in an industrial-grade language like C++ or Java, concentrating the same number of tricky business rules in a smaller codebase. Such efficiency (writing less code) requires extra effort, but pays off in the long run. I call it “pedaling in a higher gear”.

Here we go again. Do you want to be a generalist who retired both UI and database developers, along with other narrow specialists? Or do you want them to retire you, with either JS or PL/SQL?

I understand why JS is winning over Java. The current “middle tier” use of Java is pathetic: passing data to the database, while most of the business logic (screen navigation, enabling/disabling fields and buttons, etc.) has traditionally been on the client side. Why can’t all of it live there, right?

The language (JS vs. Java) doesn’t matter. It’s notably harder to write and debug OO code in JavaScript; not impossible for a true programmer, while unfortunately it is easy to type millions of lines of spaghetti code in any language. With the rising popularity of JS, it’s no longer a “script” for writing click handlers. Its massive codebases cannot remain flat and procedural. Not surprisingly, we started seeing OOP frameworks like React, which brought MVC to the browser world. MVC is an OOP pattern, if you didn’t know.

Look, whether front-end or back-end, I dare you to find a better way to organize your code than by modeling the different parts of your business process interacting with each other. OO abstractions are not as “abstract” as you think. Which brings up the next topic — how OOP was butchered by…

“Architects” and Academia

Why has OOP been so detached from reality, when its core purpose, stated in every C++ and Java, let alone explicit “OOP”, textbook, is to model the real world? Why doesn’t it get further than stupid UML diagrams of the Dog → Mammal → Animal kind?

Read above. IT has never known OOP. No one, other than the few Googles, has had a chance to apply it.

Freelancers are busy scripting something simple. Narrowly specialized corporate IT developers are busy writing their little pieces of code. Bodyshop-supplied code monkeys are busy typing spaghetti: creating five bugs by attempting to fix one. And the “architects”, whose formal responsibility is to apply OOP and FP to customers’ automation problems, are busy drawing the most simplistic diagrams of colored boxes and arrows, presented in boardrooms to sell IBM, Oracle, or whatever else the consulting shops employing them are offering.

There is no shortage of dignified IT professionals explaining OOP and FP fundamentals, along with selling their man-hours to demo mysterious (to outsiders) frameworks like Kafka. No one outside the Googles has ever tried to leverage OOP in IT to build a working product. College professors and enterprise architects surely haven’t. Does that invalidate the concept? Don’t throw the baby out with the bathwater.

I got tired of waiting. If no one develops Google-level technology in enterprise software domains, most IT jobs will be limited to primitive bug fixing and “integration” of that buggy old stuff. Is that a chicken-and-egg problem: no employers to offer you bigger and better opportunities to apply your expert programming skills? Only from inside your pigeonhole.

No one has done it for Google. Throw away the circa-2002 “J2EE blueprints” and build the next generation of B2B technology like Google did in B2C. Or STFU about OOP. Nothing will change (meaning money in your pocket) if you keep defending your small-time job: freelance work, narrow specialization, primitive bug fixing, etc., sighing that no one offered you anything better.

Freelancers

Curious about blockchain, I googled a couple of programming articles. Mind you, they were written by real developers instead of Forbes analysts. The first observation: 100% JS. I remember one guy explaining how to install one Node framework after another, narrated with “And if you don’t know this [abbreviation], you are really new”. New to what? Programming? It spans wider than Node.

What was he building? Some rudimentary smart contract. No blockchain is required to implement that concept in 20 lines of normal Java code. What could he possibly build? A wallet, nothing more.

I don’t want to start the leaders vs. followers debate. You choose what to use your programming skills for. OOP is indeed overkill for vanilla brochure websites, online stores, and crypto-wallets. I understand, everyone needs to start somewhere. But do you see your future as a cheap freelancer competing with the entire third world on Upwork? How long do you think it takes a person on the other side of the planet to read the same online documentation and go through the same tutorial?

Narrow Specialists

You are not going to do any serious programming if you chase abbreviations or are otherwise pigeonholed in some narrow specialization by your bosses. Unlike freelance projects, yours can be quite big; however, just as in the freelancer’s case, your part remains too small to ever benefit from meticulous OOP.

Why did your bosses never want you to build anything bigger? Because, provided they are technical at all, which is very rare to begin with, they come from one of the narrow specialties: UI (Photoshop pros who learned JavaScript), infrastructure (writing Perl and Python scripts to serve Web pages), or worse, the DBAs, who look for semi-DIY tools, reincarnations of 1980s Oracle Forms, to present the tabular data to the user.

There is a big difference between “full-stack” and “generalist”. The former knows both the front and back end — enough to fix bugs. The latter delivers complete solutions, doing everything from talking to the user (product management) to deploying the system (DevOps), with the front end, back end, and all “tiers” in between. How is that possible for a single person or a 100x leaner team? By abandoning conventional wisdom like the “J2EE blueprints”.

Code Monkeys

Unfortunately, in most projects outside of the handful of “big tech” employers like Google and Netflix, you will not even be tasked with real engineering. You’ll be fixing bugs and connecting (aka “integrating”) already engineered pieces. That’s not engineering. It’s a low-level technician job, a commodity occupation that all of your bosses will always be looking to pay you less for.

Does such slave labor work in our creative occupations? It may preserve the 1970s mainframe aka “legacy” tech for another decade, until it crashes and burns under the mountain of bugs, causing the real Y2K crisis.

I’m sure many dignified “enterprise architecture” books, not to mention equally detached-from-reality “methodologies” like TOGAF, have been written about the typical arrangement of one “architect” telling hundreds of code-monkey technicians what to do. It’s never worked. Nothing new has been developed in IT for almost 20 years, while the project failure rate predictably rose from 70% to 90% as a result of dumbing down projects to turn programming into a technician job comprehensible to semi-technical managers and the cheapest “offshore” coders.

OOP in Action: My Work

OOP is a guilty pleasure.

I thought I had spent more time than I needed (a week instead of a day) on a “perfect” OO model of insurance agent marketing and fulfillment automation. It takes time. And while I know I’ll be ahead in the long run, there is always some shadow of doubt: a few thoughts crawling around the back of my head. Am I overcomplicating things? Could a freelancer do it faster?

No, they cannot. The business rules are complex. There is no way around that complexity. Duplicating that code three times for the initial three lines of insurance: life, home, and auto — all different data-wise, yet sharing the common workflow? Not an option for a true programmer.
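To show what sharing that common workflow could look like, here is a deliberately simplified sketch, not the actual model from that project; all class and step names are illustrative. The base class owns the workflow once, and each line of insurance overrides only the step that genuinely differs.

```java
// Simplified, illustrative sketch of one workflow shared across life, home, and auto
// lines instead of being copy-pasted three times (a classic template method).
abstract class PolicyApplication {

    // the common workflow, written exactly once
    final void process() {
        collectApplicantData();
        underwrite();            // the only step that truly differs per line
        issuePolicy();
    }

    void collectApplicantData() { System.out.println("Collecting applicant data"); }
    void issuePolicy()          { System.out.println("Issuing policy"); }

    abstract void underwrite();
}

class AutoApplication extends PolicyApplication {
    @Override void underwrite() { System.out.println("Checking driving record"); }
}

class HomeApplication extends PolicyApplication {
    @Override void underwrite() { System.out.println("Checking property value and claims history"); }
}

class LifeApplication extends PolicyApplication {
    @Override void underwrite() { System.out.println("Checking medical history"); }
}
```

The data differences (beneficiaries, VINs, property details) would live in the corresponding subclasses the same way, so the workflow itself exists exactly once.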

Another, more recent example: universal SaaS billing.

I decided to code it once to use in all our products. It needs to be extensible. Some products would use a simple flat list of subscriptions, while others can have a complex combination of plan (basic, premium, enterprise, etc.), number of users and other resources (e.g. text and email messages per month), the pay period (paying forward means a discount), and the payment type: automatic credit card charge or invoiced/collected, to name just the most basic variables. How would you model this w/o OOP? Not just the config or data, but the behavior. There are lots of exceptions in billing. What if the credit card is not on file? What if it is declined? Automatic suspensions for non-payment, reversing charges, issuing credit, etc., etc.
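Here is a deliberately tiny sketch, nowhere near the real billing module and with purely illustrative names, of why the behavior (not just the config) wants an object model: each payment type decides for itself what happens when a charge fails, so adding a new combination means adding a class rather than another branch in a procedural billing script.

```java
// Illustrative billing sketch: the failure-handling rule lives with the payment type,
// not in a giant if/else inside a billing script. All names are made up.
interface PaymentMethod {
    boolean charge(double amount);          // returns false if the charge fails
    void onFailure(Subscription s);         // polymorphic exception handling
}

class CardOnFile implements PaymentMethod {
    public boolean charge(double amount) { return false; /* pretend the card was declined */ }
    public void onFailure(Subscription s) { s.suspend("card declined"); }
}

class Invoiced implements PaymentMethod {
    public boolean charge(double amount) { return true; /* invoice sent, collected later */ }
    public void onFailure(Subscription s) { /* invoiced accounts never suspend automatically */ }
}

class Subscription {
    private final PaymentMethod payment;
    private final double monthlyPrice;
    private String status = "active";

    Subscription(PaymentMethod payment, double monthlyPrice) {
        this.payment = payment;
        this.monthlyPrice = monthlyPrice;
    }

    void billMonth() {
        if (!payment.charge(monthlyPrice)) {
            payment.onFailure(this);        // the exception rule lives with the payment type
        }
    }

    void suspend(String reason) { status = "suspended: " + reason; }
    String status() { return status; }
}
```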

There is a reason it costs millions to automate that stuff via Oracle’s and SAP’s “almost turnkey” packages. It’s way too complex for freelancers, while it is not scientific enough for an average Googler, and definitely outside of his/her single-purpose B2C product attention span. Who would develop that software for insurance agents? Their non-existent IT departments? Outsourced “solution providers” selling SAP and Oracle? “Funded” startups busy wooing investors for new rounds of funding?

I Am Not an “Architect”

I am not selling you OOP and FP. If you think they are irrelevant because they don’t pertain to your pigeonholed IT existence or its conventional alternative, “algorithmic” Google interviews, it means more market for me. The problems I am solving, with a 100x leaner headcount, are impossible to comprehend w/o state-of-the-art OO models interlaced with smart FP code snippets.

A non-trivial problem, bigger than a vanilla e-commerce site or crypto-wallet, requires developing both the “container” (framework, platform, etc.) and the little “event handlers” injected into it. Sure, there are general-purpose containers like Spring, but no universal, ready-to-use ones automating higher-level functionality like workflows and document management, since every business process is unique.

Love it or hate it, that’s the only way to solve somebody’s problem as a whole. And get paid. Non-generalists don’t solve problems. They are technicians, assisting generalists with developing a solution.
