Tech, AI and the Workforce — We’ve been here before, but never quite like this

Babar M Bhatti
Apr 29, 2024

A look at the history of technology’s impact on the jobs we do, and what’s different this time with AI.

In this post I take a look at how technology has changed job roles over the span of recorded human history. Know any stenographers or typists these days?

This is about how AI will impact the work we do, and ultimately the lives we live. Jobs are expected to change over time, but now the change is happening faster than we have time to adjust. It's the first time since sapiens rose to the top of the food chain that we have been in this position.

Scribes

Literacy rates were low in the early days of human history, and scribes had an important role — copying large volumes of information by hand — to equip future generations with a record of what had preceded them. This work included the transcription of documents, books, scriptures, and other literary and sacred texts. Scribes played a vital role in preserving literature, religious texts, and legal documents, particularly in cultures and eras where few people could read or write. It was a task that required care and attention to detail — scribes took their time to complete assignments because their work had to be reliable.

Scribes and Calligraphers — Photo Credit: Flickr

First Major Technology: Printing Press

With the invention of the printing press by Johannes Gutenberg in the mid-15th century, the role of scribes began to change significantly. Where they were once considered essential to the preservation of knowledge and information, the printing press enabled mass production of books and documents, dramatically reducing the need for hand-copied texts. Some scribes were able to transition to roles that still required detailed, handcrafted precision, but many found that their skills were no longer needed.

This change, like the others we'll look at, happened gradually, and its impact varied widely by region and culture. For example, Ottoman rulers responded to the new technology by banning the printing press, which, some say, was done specifically to prevent errors in scripture.

When was the last time you saw a company looking to hire a scribe? The role became obsolete a few hundred years ago, because you can't stop progress. The printing press was the first time humans experienced ‘one-to-many’ communication on a large scale, and it spawned a new industry that continues to this day.

Photo by Melody Zimmerman on Unsplash

However painful it must have been, one can argue that scribes had time to adjust.

Typewriter

Typewriters were introduced in the late 19th century — marking the next big evolution in communication specifically, and work in general. Office and administrative tasks no longer depended on individual humans and the quality of their handwriting. The document production process became better, faster, cheaper — it was standardized and industrialized.

Photo by Sergey Zolkin on Unsplash

Typewriter technology changed not just how to do the work, but who could do the work. It brought increased employment in clerical roles and allowed more women to enter the workforce. This technology helped create the role of ‘secretary,’ which has, over time, expanded and morphed into ‘administrative assistant.’ This is a good example of technology expanding the role of a human as well. Typewriters increased productivity, which meant we had time to get more things done in a day. Think about this the next time you’re in a meeting on the phone, while driving to pick up your kids at school.

I remember my father had a nice typewriter at home and I wanted to use it the right way. So I joined a typing school — a one-room shop in a busy market with typewriters lined up along both walls. Little did I know how much typing I’d do for school and work!

But again, this shift took decades, allowing people to adjust to typewriters and to pick up new roles. I don’t have any data, but my educated guess is that some people never took to typewriting and their work suffered for it.

Digital Age — 1980s to mid-2010s

The digital age (roughly the 1980s onwards) brought profound changes in office technologies, notably through the rise of personal computers and word processing software, which significantly impacted jobs associated with typewriters and with writing and knowledge work in general. Workers had to learn new skills — operating computers and software became a distinguishing skill. Given the productivity benefits of the new technology, adoption was inevitable. More was expected from the roles that processed documents, and clerical roles started shrinking.

The workers who adapted quickly to the digital era thrived, as there was plenty of demand for their new skills. New roles emerged — for example: technical support or the “IT guy.” Communication modes and styles also changed — people started using emails instead of typing or depending on others to write their memos.

The time frame for this change was less than 50 years. The change was tough for certain segments of the workforce, who had a hard time adapting to digital technologies. Less time to adjust, more painful adjustment.

The Age of AI

Even though the term AI was first used in the 1950s, it was only after 2022 that it really became mainstream. It somehow snuck up on us. Those who were not involved in machine learning or watching this area were surprised. Even those who were considered experts were surprised!

The journeys of humanity and technology are now deeply intertwined. — Mustafa Suleyman — 2024 TED Talk

The last 18 months or so have seen an unprecedented amount of progress in the way AI has changed knowledge-based work, from coding to writing to creating speech and video. This is an inflection point, as Mustafa Suleyman pointed out.

Whether we like AI or not, whether we understand it or not, whether we control its behavior or not, AI is penetrating deeply into our individual and work lives.

Not everyone agrees about the severity of this issue. There is a school of thought that downplays AI’s negative impact on jobs: technology has disrupted work before, the argument goes, and we’ve successfully dealt with it.

I say this time is different. There are many dimensions to AI’s impact which are unique and carry high risk to our livelihoods.

The change is happening faster and at a scale unlike the previous changes. It brings to mind Moore’s Law, which states that the number of transistors on an integrated circuit roughly doubles every two years. In other words, we’re creating technology that continues to outpace itself.
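To get a feel for what two-year doubling implies, here is a rough back-of-the-envelope sketch in Python. The starting point (roughly 2,300 transistors on the Intel 4004 in 1971) and the clean two-year doubling period are simplifying assumptions for illustration, not precise figures.

# Rough, illustrative sketch of Moore's Law-style doubling (not exact figures).
# Assumptions: ~2,300 transistors in 1971 (Intel 4004), doubling every two years.
start_year, start_count = 1971, 2_300
doubling_period_years = 2

for year in (1981, 1991, 2001, 2011, 2021):
    doublings = (year - start_year) / doubling_period_years
    estimate = start_count * 2 ** doublings
    print(f"{year}: roughly {estimate:,.0f} transistors per chip")

Fifty years of that compounding takes you from a few thousand transistors to tens of billions. The concern here is that AI capability curves look more like that exponential than like the gradual improvements of the printing press or the typewriter.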

Going from cave drawings to scribes to the printing press took thousands of years. Jumping from Gutenberg’s invention to typewriters and then computers happened in the relative blink of an eye, but still at a pace that allowed people to adapt. It was nothing compared to the pace of AI- and data-fueled innovation today.

The changes brought by printing presses, typewriters, and computers were neither as fast nor exponential in their growth. AI, by contrast, is a great teacher and a better student. We can see that just by looking at the rapid increase in the performance of AI language and video models. And there is no reason to think that this rate of progress will slow down — even if reliability and trust remain below acceptable levels for business-critical use cases.

Given the recent breakthroughs in language models, it is becoming easier to see how we might create a new digital species that goes way beyond what humans are capable of.

There’s talk of similar breakthroughs in other sub-areas of AI such as robotics. The adjustment window for workers — whether they are aware or not — is alarmingly short.

The other dimension is that today’s AI has many flaws and shortcomings. We are aware of some of these — like how disinformation can be created and fueled by new AI models. Other challenges have yet to be discovered.

Remember that technology can never be neutral — since it is created by humans, we inadvertently code our own biases into it. And building technology on data shaped by historical injustices means the decisions it makes at massive scale in the future will likely repeat the same mistakes. Keep this in mind: the most popular AI models are black boxes, unable to provide explanations when things go wrong.

It’s true that every new technology has positives and negatives. But these changes are not distributed equally over space and time. With respect to AI, expect to see wide variability across countries and regions.

There will be new functions and jobs eventually, but there will also be a painful transition during which a large number of workers will see their roles evaporate virtually overnight. Some will be able to adjust; others not so easily.

The question is: are we doing enough to prepare for this new transition? We need to do more to make AI useful for all humans. A good first step is to raise awareness, and I hope this post helps with that goal.

Follow me on LinkedIn, where I share my thoughts and learnings on AI, and let me know what you think!

Many thanks to Nicole Ward for her helpful suggestions and edits to the draft of this post.
