When I first wrote this, I had recently been packaged into early retirement by the corporation I had worked for, as an IT employee, for the last 21+ years.
The actual termination of my employment was not totally unexpected – one would have had to be completely blind or totally oblivious to miss the trend of employee reductions within the corporation over the previous years. The writing on the wall was plain for all to see. Each fresh announcement of ‘work-force reductions’ came to be viewed like a lottery. The difference being – the lucky ones were those whose number was not chosen.
Yet, it was still a jarring realization that I had made the list.
Perspective being relevant, there was also a sense of relief that the soul-sapping uncertainty of not knowing when your number would bubble up in the ‘layoffs lottery’ was finally over.
So, after 44+ years of employment, I was being put out to pasture. My current skill set was, apparently, not good enough to aid the corporation’s attempts to pivot and capitalise on the quick-changing trends in technology (in particular) and society (in general).
Like those worker-drones before me who had worked on the assembly lines, in the steel mills and in other factories (before those jobs were automated, off-shored, outsourced, downsized or outright eliminated), I now joined the ranks of the dinosaurs in an era of break-neck technological (digital) innovations and breakthroughs.
Like workers of previous generations who were encouraged to upgrade their education, to specialize, to re-train or learn new skills to face emerging social realities, I too had heeded that advice and specialized in Information Technologies.
Problem is, I.T is not what it used to be when I entered the field; and today, specialization in various fields of employment has the potential to make one redundant or obsolete within the time-frame of a generation.
My interest in computers and their underlying technologies did not evolve out of the need to find employment. I grew up consuming huge doses of science-fiction writings, all the way from Jules Verne to Dick Tracy comic strips (loved the wrist-watch radio!), Robert A. Heinlein’s science fiction novel Space Cadet, T.V shows such as Doctor Who (Day of the Daleks) and Star Trek, Isaac Asimov’s The Naked Sun, Arthur C. Clarke’s 2001: A Space Odyssey, etc.
When Texas Instruments released its TI models in 1979, I was as excited as I had been when The Jetsons was being aired on T.V.
I’m still waiting for my Flying-Car.
I’m now used to seeing and using ‘moving sidewalks‘, to watching spacecraft being launched into space, and to people living on a space station in Earth’s orbit.
Ray Bradbury‘s The Martian Chronicles, and the current talk and speculation about dispatching colonisers (some day) to Mars, still interest me. And I still see the relevance of The Illustrated Man, Asimov’s Foundation series, Clarke’s Expedition to Earth and Asimov’s I, Robot.
All of that to say: by the time the Mac and the PC verged on becoming mainstream, I had long been an enthusiast for the technological/digital future that some are just now realizing is actually taking place all around us.
It is a transformation that, literally, informs us all (in the general sense that we use these technologies to define who we are and how we relate to each other in our daily interactions) – as well as informs on us (as in keeping track of our individual and collective habits and tendencies).
So, I can’t help but note the irony: I have worked in the IT field for over two decades installing, maintaining and managing the very same machines that are now being utilised to eliminate thousands (if not millions) of employees from the workplace as digital technologies and their advancements (A.I and machine learning) drive the digital economies of the 21st century.
The lesson that I, obviously, didn’t learn or pay attention to in all of this is that one can get so close to the trees that one can no longer see the forest. I got into I.T, as a career choice, after attending a trade show that demonstrated how ‘computers’ could (would, and did) replace the traditional and largely mechanical processes of typesetting, word processing and lithographic printing with a computerized process.
This, obviously, was before terms like ‘outsourcing’, desktop-publishing, social media or internet were a common part of the daily lexicon and jargon. It was a time when Bill Gates is reputed to have said “640K of memory should be enough”, and when IBM, apparently, marketed the Personal Computer as a ‘loss-leader‘ – to the point that, by the time they realized the misjudgement, it was too late for them to profitably penetrate and dominate the market as they had done in the Main-Frame (Big Iron) realm.
All of this was plain to see when IBM, unable to compete with the leaders in the PC-compatible marketplace, cut its losses by selling off its PC business to Lenovo and its x86 processor foundry business to GlobalFoundries.
That is when the die was cast; and it would, eventually, lead to my becoming obsolete.
I got started at BIG BLUE because of Microsoft. My first PC was an IBM-compatible x86 running MS-DOS 2.0. I have followed that MS path over the years, working with multi-core, multi-socketed processors and terabytes of memory and storage. I can now connect and communicate with just about anyone I choose to around the globe, (almost) near-instantaneously.
The PC, for me, has been a hobby that led to a job where I have worked with generations of Intel-based hardware architectures and Microsoft Operating Systems and Applications. The range of second- and third-party O/S and applications I have worked with over the years can be mind-boggling.
I’m now at peace with Linux and its Open-Source concepts. After all, even Microsoft has been playing nice with Linus Torvalds (after IBM made big overtures to the Open-Source community of software developers, and after the DOJ smacked Microsoft down hard for playing at being a monopoly – long before Google and Amazon came along to challenge Microsoft’s dominance of the PC consumer market).
The game’s afoot in the A.I and mobile markets. Yet, I now find mobile computers and computing to be too intrusive. Futuristic science-fiction writers – such as Arthur C. Clarke, Isaac Asimov, Robert Heinlein, Ray Bradbury, Philip K. Dick, Gene Roddenberry and George Orwell (among others) – were quite a perceptive bunch of visionaries.
Oh!
For the Apple ‘fan-boys and girls‘ out there who may come across this and read to the end … NEVER!!
I’d rather chew both arms off or be forced to learn Esperanto than become an owner/user of an Apple product! Of course, the nit-pickers of the world will look for something to nit-pick about.
31/01/2022 19:45:03 -0500