
Decrypting the youthspeak

Yesterday I got an email from someone I had never heard of at an AI company that spun off from one I had worked with on computer vision tasks. The emailer said he wanted to have a phone conversation with me about some technical issues. Many months before, I had been in contact with this person's company about the possibility of getting some augmented data for a neural network I was building. I had so little data that I eventually became rather inventive in finding more and in writing code to augment what I had (some of which I posted on my Github: https://github.com/drcandacemakedamoore ). I was really surprised by the email because the CEO of the company had blown me off twice, which is why I found the aforementioned work-arounds in the first place. Perhaps the blowing off was more of a hiding. During our conversations, I had uncovered and pointed out many inconsistencies in what he was saying. Not only was some of his information false, including but not limited to multiple statements on his company website; worse, some of his supposedly augmented or synthetic data was just de-identified real data. (Some data from the third world is remarkably cheap if you know how to get it. Heck, the same could be said for English NHS data, as Google DeepMind proved to the world in a scandal that was remarkably under-reported.) Yet this glib 27-year-old former front-end developer now has his secretary writing me, asking for a call. Only in the world of computing do 27-year-olds become leaders and 30-year-olds turn into elder statesmen. As an old programmer who is the daughter of another programmer, I must cry: I wish this would stop.
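(For readers wondering what "writing code to augment data" even means: geometric augmentation is just generating transformed copies of each image so the network sees more variety. A toy sketch of the idea, illustrative only and nothing like the actual code in the repo linked above:)

```python
def augment(image):
    """Return simple augmented variants of an image, given as a list
    of pixel rows: two mirror flips and one 90-degree rotation.
    Real augmentation pipelines add crops, noise, brightness shifts, etc."""
    h_flip = [list(reversed(row)) for row in image]        # mirror left-right
    v_flip = list(reversed([list(row) for row in image]))  # mirror top-bottom
    rot90 = [list(row) for row in zip(*reversed(image))]   # rotate 90 degrees clockwise
    return [h_flip, v_flip, rot90]

# A tiny 2x2 "image" is enough to see each transform.
img = [[1, 2],
       [3, 4]]
for variant in augment(img):
    print(variant)
```

One original image becomes four training examples, which is exactly why augmentation matters when data is scarce.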

To say I am the daughter of a programmer is somewhat incorrect. My mother is a mathematician, a statistician to be precise. In the old days there were no programmers, just people who wanted to use machines. Ada Lovelace, arguably the world's first computer programmer, was a mathematician. So was John McCarthy, the inventor of LISP, who coined the term AI. So was Alan Turing, and the list goes on and on. Really, from all anyone I know can tell, it was NASA that invented programming as a separate profession, because they were the first whose people filed tax returns listing it as a profession. NASA also employed "computers," many of them women, some of them of my ethnicity. The recent movie "Hidden Figures" gives a very Hollywood take on the story of some of the black ladies who, to paraphrase Gil Scott-Heron, put whitey on the moon by doing the math behind the scenes. (The picture included is a real one from NASA of Mary Jackson.) But even after NASA created the profession of programming, it was not considered glamorous or hip. In fact, I suspect my mother referred to herself as a statistician not only because it was her academic training, but to differentiate herself from the army of terrible coders who started out as female secretaries. The secretarial profession, like programming and nursing, has had some wild swings in its workers' predominant gender and prestige.

I sometimes have to wonder how exactly computer programming went from something my son's grandmother may have considered a mildly embarrassing task to something my son may apparently begin as a career while he still has diapers on, hopefully to cash out in a tech IPO 20 years before I retire. Computing has undergone a rapid expansion, so rapid, in fact, that as a by-product it seems to have become a field of and for the young. Zuckerberg famously quipped that 'young people are just smarter,' but I'm here to tell you they are also dumber, and I suspect they are a large part of what is to blame for the ethical, cultural and scientific decay of computer programming.

It's not just that programmer is a title nearly synonymous with being a 27-year-old hipster-douchebag arrogant tech worker- that's a micro-level issue. In a truly odd combination, on a macro level the profession is degrading before my eyes while it un-democratizes into a field reserved for a very specific kind of elite, demographically just about the opposite of some of the old "computers". The degradation can be seen not only in non-existent documentation, which pushes people onto Lord-of-the-Flies-like message boards to try and figure out their programming problems. These boards highlight a deeper problem- the participants barely speak the same language, sometimes arguing with clearly different interpretations of words, painfully unaware of the difference. Most fields develop a consolidated nomenclature. Not so with many issues around computers. Computers have existed for literally thousands of years. If we shrink technological innovation into a human lifespan of 100 years, demarcated by the birth of computing, then by comparison modern AI algorithms tend to be about three years old and MRI machines maybe two. Of course the old machines like the Antikythera mechanism and the devices Hero of Alexandria cooked up were analog, but still we are talking about a phenomenon (modern computing) that traces itself back to the abacuses of early African civilizations. You would think we could at least agree on what to call things in most cases.

The field of computing is shooting ahead towards quantum computing, as even fairly young people who program applications start feeling left behind...but to relate the two issues is to believe an illusion. Application programmers often get left behind by simple changes in the popularity of different terrible, poorly conceived languages and frameworks- Ruby on Rails, anyone? Oh wait, no, let's build in JSON; certainly that's good for another 15 minutes... To hardcore programmers these kids look like an overconfident woman of my size in a dressing room trying on different size-zero outfits, waiting for one to fit. Most languages and frameworks are, to different degrees, a mess. Even C, the go-to under-the-hood language for speed and precision, used at some point in pretty much all scientific big-data work, might be called a local minimum if we think of languages as occupying a gradient. I'm sure three or four forty-year-old Rust or Mercury programmers would cheer that statement. But their cheers would be drowned out by a bunch of 25-year-old web programmers asking their elder senior programmers, which is to say people who have programmed for maybe seven years, what on earth those languages are.

All of it springs from the situation encapsulated by the joke about the monkey who was a front-end developer. The joke goes something like this: A front-end developer convinces his company he should work remotely. He then heads off to a tropical locale to drink from coconuts and look at the beach while supposedly programming. A monkey swings down from a tree and types some garbled stuff into his computer. He finds it amusing, but has no idea what it will do, so he gives the monkey some peanuts. Soon it's an everyday event. Then the company calls. They are thrilled with all the progress, but the programmer has to admit he actually can't explain the monkey's work. Soon he is an out-of-work programmer on the beach, begging peanuts from monkeys. About a year later he gets a call from his old company offering him his old job back. "But what happened? You replaced me with that monkey..." he says. "Well, yes, he's been with us for a year now, and he's just been promoted to senior programming manager..."

I mean, seriously, I just had a conversation with a 27-year-old developer I adore about the phenomenon of unaware expert-beginners. Those are the programmers who come up through one single company, from bootcamp until management. They are considered experts because they are literally unaware that ways of programming outside their one single company exist. And thus many programmers go to work and work on THE SAME program, from sheer lack of knowledge that it has already been written a few thousand times. In fact it's probably what we used to call, but no longer call, an executable- almost the entire reason I could eke out a module back when I wrote C decades ago.

So now I'm stuck decrypting the youthspeak of the secretary of a man who was essentially an overconfident front-end developer until three seconds ago and may retire by 30. In case he is calling for advice, I'm publishing some here for anyone like him. Ring, ring- half of your endless "fast-moving", "breaking" innovation is a bunch of confused nonsense, and when the smoke clears, many of us will be back in math class. Which, I'm sure, in the razzle-dazzle nomenclature of the brave new tech world, will be a blended quantum MOOC. See you there.

