“Then I saw that all toil and all skill in work come from a man's envy of his neighbor…”
Ecclesiastes 4:4a
Human relationships can and will affect even the most purely technical of problems. For example, how two systems within the same company interface is shaped just as much, if not more, by how the managers of the two teams get along.
Consuming an API from another team? Your difficulty is not only based on your knowledge of the technical requirements or how well documented the API is. How well do the teams get along? How well do they communicate?
Do members eat lunch together every once in a while? Maybe a game of pickleball? Teams that get along well have important touchpoints for communication, and the contours of their respective systems and responsibilities tend to mesh.
An antagonistic relationship bubbles up in various ways. Less integration happens than is technically possible or advantageous because everyone dreads working together, and tasks keep getting pushed to the horizon. Or the team with less political clout gets stuck with all of the less exciting tasks, which affects the fundamental design of the entire system.
Another example might be when a certain framework or conceptual model is imposed from above, stifling creativity as it cascades down. All because of the whims of a single person.
Software is made by people to solve problems that other people have determined need to be solved. Social rationality and awareness are needed. Navigating these issues, with a realistic view of human nature, is just part of the job of even the most technical practitioners. The lone auteur who gets along with nobody carries a high cost, even if that cost is hidden.
There is no one tool that solves these problems. That’s like saying, “make me a program that solves marriage.” You can read all of the marriage books you want, and they will be about as useful as a battery for a candle if you don’t talk to your wife.
Dig deep enough, and this is always true. Why does a man work out? To get healthier. Why does he want to be healthier? To live longer and have a better quality of life. Why does he want to live longer and have a better quality of life? To be around for his grandchildren.
Ask “why” enough, and you’ll get to some reason that has to do with other people. No one is going to end that sequence of questions with “so I can binge Netflix all day long.”
Which brings me to AI. Again. At this point, I’m beating the residual dust of a dead horse’s skeleton, piling it up grain by grain just so I have a big enough target to hit. But it’s relevant.
While AI might (might) speed up many mundane tasks, or even speed up complex software development, there is always going to be a person at the other end. Why are you doing this task? For no one in particular? For no reason in particular?
How do you know you’re feeding it the right prompts and answering the right questions? AI will give you back whatever you ask for, but unless it thinks you’re racist, it won’t tell you that you asked for the wrong thing. AI doesn’t actually think for you, though it’s a fairly good muppet. Even before these newer AI tools emerged, we had issues with solving the wrong problems or designing the wrong solutions. Now we can just be wrong faster and harder and with even less creativity.
Those who can talk to other people, engage with them, interrogate them, discern feelings and moods, disagree without coming across like a jerk, imagine being in someone else's shoes, and inspire others will always be in demand. In other words, people who still act like humans.
Because all problems are people problems, and they require people to solve them. This side of eternity, that will never change.
AI will never come close to human abilities.