Rethink Technology to Balance Engineering and Spirituality.

Chapter One: Technology Literacy

Thomas Mahon

Carpenters, surgeons, architects, bakers, software engineers and mystics… all seek to impose mind on matter.

And when mind is calm and focused, the results can be efficient, useful, and even elegant: the bicycle, the original Ford Mustang, Chartres Cathedral, manhole covers, the Ode to Joy.

But when the mind is anxious or wobbly, or the heart is two sizes too small, the results can be disastrous: crash-prone software, nutrition-less food, bridges that collapse, weapons of mass destruction…

This great world-drama of attempting to impose mind on matter has been going on for at least two million years, ever since our ancestors discovered that their opposable thumbs let them grasp and manipulate objects, and so became Homo habilis, “handy human.”

By the Classical Age, 2,500 years ago, handy human had evolved from using sharp stones to scrape meat off bones, to designing and constructing the Parthenon. And our ancestors’ large brains let them conceive abstractions like democracy and justice. Humanity would begin to call itself Homo sapiens, “wise ones.”

About that time, Archimedes of Syracuse, a Greek mathematician, physicist, engineer, and astronomer, observed: “Give me a lever long enough, and a fulcrum strong enough, and I can lift the world.” And he could if he had a place in space on which to rest the fulcrum.

Before we can Rethink the subject, let’s consider Technology itself.

Technology, in the form of tools, leverages our limited human capabilities (muscles, senses, brains, consciousness) on the world around us: to produce food, clothing, shelter; to protect us from the elements and enemies, or to overwhelm and conquer those enemies; to provide transportation and communications; to promote the healing arts and store information for future use.

In fact, the Greek word techne (τέχνη) means any art or skill of human design, and it originally included not only hand tools but spoken and written language (poetry, drama, song), and even body language (dance, athletics, acrobatics). Over time there also evolved a huge body of tools (prayer, meditation, yoga, qigong) to leverage the inner self, called variously the soul, human spirit, anima, pneuma, atman, ruah.

The word ‘technology’ entered the English language around the year 1610, just as Shakespeare was retiring. One of the first times it appeared prominently in the U.S. was in the Boston area in the mid-19th century, in the name of the new Massachusetts Institute of Technology (chartered in 1861).

Our relation to technology is often like that of a fish to water. We are so surrounded by it we often don’t notice it. Someone may know all the specs of a next-gen gadget but be oblivious to the technical elegance of the zipper in his jacket, or the water repellent in the jacket’s fabric.

We teach basic science literacy in grade school and show there is a scientific method that underlies advances in all the natural and life sciences. But there are few programs, especially in primary schools, for developing an understanding of technology itself.

There are plenty of programs teaching digital literacy — how to use digital tools for school assignments; to be a good online citizen; and be safe in the social media universe.

But few courses at the popular level examine technology itself: what basic principles govern how we leverage our muscles, senses, brains, and imaginations for our human benefit?

The National Academy of Engineering describes a technologically literate person as one who, among other things, recognizes the pervasiveness of technology all around; understands basic concepts such as systems, constraints, and trade-offs (risk vs. rewards); and understands the limitations of the engineering design process.

I was a humanities student in college, but after decades writing about technology in Silicon Valley, in the company of engineers, I have a great appreciation for what the engineering enterprise has enabled: removing the danger and drudgery of much physical work; making food supplies safer and more reliable; and letting us see a wider world more vividly. Engineering has greatly satisfied our needs in the developed world beyond any expectation.

But now instead of addressing needs elsewhere, the global business model largely focuses its enterprise, ingenuity and invested capital on creating, and then satisfying, wants in the developed world.

The Classical world understood science as the pursuit of knowledge about natural phenomena, and technology as the way to use that knowledge to make things of beauty and do good works.

Beauty, truth, and goodness were the idealized ends of science and technology back then. But we’ve largely abandoned those goals now. If a corporate scientist or academic researcher or high-tech CEO said her mission was to seek beauty, truth, and goodness, she would soon be seeking a new job.

The primary value in much technology today is measured only by the extent to which products meet or exceed engineering (not societal) specifications, and whether stock in the enterprise will continue to rise.

But even as the engineering enterprise claims to be value-neutral, all tool use reflects and shapes values. Yet so much of our technical ingenuity is spent on solving problems that don’t exist, using marketing to create artificial needs while, unattended, the many problems that do exist become even more pronounced. Can we see any other way?

And with the digitization of everything we are increasingly drawn into systems that were originally created to serve us, but now demand that we serve them. Our way of life is leading us into a digital lobster trap: easy to enter, but difficult or impossible to escape.

Issues of meaning and values are excluded from scientific and technical literature. Scientists and engineers are very good at answering where, when, and how, but are forbidden in their professional lives from asking why. So why is that?

Why is asking Why? largely forbidden in the STEM universe? We live, after all, in one uni-verse, not a two-tiered bi-verse.

Why are empathy, kindness, mercy, and justice excluded from many discussions of technology at the professional level? Ford Motor made a point of saying its products had quality built in. Shouldn’t that go without saying?

Without overlooking the many benefits of the new digital electronic technologies (and there are many), there is also a growing list of threats posed by them as we become more immersed in a totally digital world: among others, total surveillance of our lives, even as the loss of personal security and job security drives many to acts of desperation. Perhaps worst of all, we are no longer sure what is true or authentic or meaningful anymore. The ability to propagate ‘false narratives,’ or ‘alternative realities,’ or outright lies is in the hands of virtually everyone now.

Yet there are few mechanisms for public comment, oversight, control, or remediation. And any talk of government regulation of an industry founded on libertarian principles will be resisted with every considerable lobbying dollar at the industry’s disposal.

And now consumers are urged to move past buying products that worked very well in analog form, from sneakers to refrigerators, and instead buy those same products with an Internet address that are then hooked to the Web where they can, unfortunately, be hacked to pieces. Anything with an Internet address can be hacked, and anything that can be hacked can be turned against the user, including domestic spying.

And among other concerns:

· The more invasive, pervasive, persuasive our tools become, the more closely they are controlled by fewer hands;

· The algorithms that run systems have become so complex they go beyond human understanding and control;

· There is also the rising concern that AI and smart machines will gobble up jobs much faster than new ones can be created, if they can be created at all;

· And this will leave large numbers of angry, disenfranchised, armed people deprived of a comfortable middle-class life that was once generally available, and with no access for “redress of grievances” except by taking to the street or attempting to take over the government as on Jan 6, 2021.

· And for all we know we are creating systems that may master us, as we become subsumed — identity and all — into them, completing the man-machine interface in favor of the machines;

· Who would have thought that the Information Revolution would become a tsunami of hacks and trolls and fake news to the point that now we question the very notion of “truth” itself?

· However useful many applications are, these programs are gathering information on the individual user. And even if the user voluntarily opted in, she has no idea how that information may be used against her in the future;

· And even as the energy consumption of individual electronic devices goes down, the number of such devices increases, further increasing the energy footprint that is contributing to climate change;

· Implanting ‘chips’ in the body is already becoming a requirement for certain jobs. This may likely spread, even become yet another way the haves can trump the have-nots, with implanted “performance enhancing devices.”

Moore’s Law… and Murphy’s, Too

Technical literacy should also give us a better understanding of humanity’s ongoing attempt to manipulate nature (Moore’s Law), as well as nature’s ongoing efforts to teach us humility (Murphy’s Law).

(Besides little attention to technology literacy in our education system, there is little focus on visual literacy either; that is, the ability to read, interpret and make critical sense of images and sounds. Our education system remains largely focused on verbal literacy and numeracy — reading letters and numbers — even as most of the information we get as students, consumers and voters now comes from sounds and images. But that subject deserves its own extended treatment elsewhere.)

Moore’s Law is a very insightful technical prediction made in 1965 by Intel co-founder Dr. Gordon Moore. And over time it has become the First Law of Electronics: every 12 to 18 months, twice as many transistors (devices that alternately transmit and resist the flow of electrons in a circuit) can be placed on a given sliver of silicon, doubling the processing power in the same space, at little or no increased cost once production ramps up.

Because of Moore’s Law, the size of computing devices has shrunk from room-sized mainframes in the 1960s; to desk-sized minicomputers in the ’70s; to desktop personal computers in the ’80s; to laptop computers in the ’90s, to palmtop devices in the ’00s, to our current mobile devices.
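The doubling described above is simple exponential growth, and its cumulative effect is easy to underestimate. A back-of-the-envelope sketch in Python, assuming a constant 18-month doubling period and starting from the roughly 2,300 transistors of Intel’s first microprocessor, the 4004 of 1971 (both the cadence and the starting point are illustrative assumptions, not figures from this chapter):

```python
def transistors(year, base_year=1971, base_count=2300, doubling_months=18):
    """Estimate transistors per chip after repeated 18-month doublings."""
    months_elapsed = (year - base_year) * 12
    doublings = months_elapsed / doubling_months
    return base_count * 2 ** doublings

# Print rough estimates, one per decade of the progression described above.
for y in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{y}: ~{transistors(y):,.0f} transistors")
```

Even this crude model shows growth by a factor of billions over five decades, which is why a room-sized machine’s power now fits in a pocket. (In practice the industry’s doubling cadence has drifted between roughly one and two years, so any fixed period is only an approximation.)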

But even at that reduced scale, the only reason mobile phones are as big as they are is that humans need gross input and output devices (keyboards and screens) to communicate with signals moving at nearly the speed of light through gates whose dimensions are measured in nanometers (billionths of a meter).

We are already now transitioning to voice inputs and outputs, and those will very likely give way soon after to surgically implanted processors in our brains. The technology imperative says: if it can be done, it should be done, and if there’s a market for it, it must be done.

And although the possibility of having one’s brain hacked, or the operating system of one’s mind suddenly crash, should give anyone pause, public opinion can always be shaped to align with vendors’ interests. (For example, despite all the health warnings, most of us still love junk food.)

History shows that it is not too hard to get about 15 percent of the public — ‘early adopters’ — to sign up for the next new thing, to be followed by a majority soon after.

This will be especially evident in the coming years when skillful digital marketing will be used to convince people they and their children need an implanted chip to survive in society: “You will never finish school or get a good job without having enhanced performance.” And who can argue with that? At least at first.

Moore’s Law has been an amazingly accurate projection for over 50 years. But that’s all it is — a very accurate engineering roadmap. It was not etched on a stone tablet and handed down from the mountaintop, demanding in return that we double the pace of our lives every couple of years to keep up with it.

That fiction is the work of marketeers, as I was, on a public that is far too gullible for its own good.

(There is an excellent British documentary, The Century of the Self, on how Edward Bernays used theories developed by his uncle, Sigmund Freud, to manipulate the minds of the public in the 20th Century. Bernays called the process “the engineering of consent.” Today it’s called public relations. www.youtube.com/watch?v=eJ3RzGoQC4s)

In doubling the pace of life in recent decades, many skills, techniques, traditions, customs, and habits that were time-tested over millennia were abruptly made obsolete or useless. The resulting social disruption is unprecedented and probably unsustainable.

An example of old and new thinking regarding computing: in the 20th Century, Thomas Watson built International Business Machines (IBM) into a behemoth over decades, based on a one-word philosophy: THINK. Good advice anywhere.

But by the early 21st Century, Mark Zuckerberg built Facebook into a behemoth in about six years on the opposite principle: Move Fast and Break Things.

And before a court-ordered breakup (agreed in 1982, effective in 1984), AT&T — then the one and only phone company — would never release a new feature into the Bell System until it was bullet-proof. But since then, Microsoft, Facebook, and others in the digital ecosystem routinely introduce new features that are not nearly ready for prime time. And they are correct in their expectation that enough customers will identify and complain about the bugs they find. In effect, customers do quality control for these companies. And when the vendor finally corrects these complaints and brings out a robust product, the customers often must pay again for a working Rev. 2.0.

Just as school kids could say, “The dog ate my homework,” in the 1990s their parents could tell their boss, “Windows crashed on me.” No manager ever questioned that.

With the pace of life routinely doubling, there is little time or energy left to look beyond the immediate. It also means that many consumer-facing businesses no longer produce goods and services, terms that imply a moral component. Instead we have “customer” service, and many senior managers understand that the real “customer” is a Wall Street analyst who can blast a company for being one penny off its estimate. The consumer, meanwhile, is simply trying to get a product that works as advertised.

So much time and talent now go to finding profitable solutions where there was no problem (like microprocessor-enabled sneakers producing reams of unused data), while diverting resources from real problems all around us: in education, health care, infrastructure, and social welfare.

Moore’s Law, and effective public relations, have helped to drown out Murphy’s Law, which warns that disasters befall even those whose profession is to forestall disaster. The law, named for Air Force Captain Edward Murphy, an engineer at Edwards Air Force Base in 1949, actually says, “If anything can go wrong, it will go wrong.” We don’t hear much about it anymore, but it’s still there.

The fate of RMS Titanic is a tragic example of this. The original design goal was that the ship would be unsinkable. But along the way, the goal became a given: this ship is unsinkable. And the resulting discussion probably went:

- Since it’s unsinkable, we don’t need lifeboats.

+ Well, experienced travelers will expect to see some.

- Okay, we’ll put up a few lifeboats for window dressing.

+ Yea, that should do it.

We get how hammers work, and understand their dual-use nature: they can be used to build a house, or knock a neighbor’s noggin in. With a little effort we can get how internal combustion engines work. But nobody fully gets how microprocessors work because nobody fully understands “quantum weirdness.” An irony of our time is using electrons that defy logic to power our logic machines.

How much do any of us really need to know about these complex systems on which we’re now utterly dependent, and into which we are increasingly absorbed? Aren’t they best left to experts? Yes, but the experts are increasingly the systems themselves (artificially intelligent expert systems) overseeing other systems. So who is overseeing the overseers?

We should understand the tradeoff: as systems increase in complexity, they increase in fragility and vulnerability. This point is not lost on terrorists, hackers, or other “bad actors.” On 9/11/2001, world civilization was shaken to its core on a budget of less than $500,000.

It’s time to begin reconnecting technology with human and humane values — measurement with meaning — to reclaim our humanity from systems controlled by enterprises, which are in turn controlled by a small group of ultra-wealthy individuals who must keep their enterprises growing at the same hyper-aggressive rate their track records indicate, to please the investors on “the Street.”

Why are we giving this small clique the authority to lead us deeper and deeper into their all-digital wonder-world, under their surveillance and increasingly under their control?

It is an all-digital dungeon that becomes more fragile with each iteration. If the Internet went down for several days, global civilization would seize up. The new elites may have their secure islands to go to, but the hungry and the desperate will find them, even without GPS. And black water and grey skies, as the planet heats up, will depress even the children of the elites. Perhaps more than most, they will know that grandpa profited by creating this climatic cataclysm, and that will not sit well with them.

The market roadmaps of the tech industry and its thought leaders are not divine commandments. We must get better at selecting what tools we will use for our own human benefit, and not take our direction from billionaires racing each other to be the first trillionaire.

We, the people, now have at our disposal a wealth of tools to create systems that perform as we wish, and do not constantly demand that we adapt to them for someone else’s benefit. We are not Dummies who need to be talked down to because we can’t intuitively use — or implicitly trust — poorly designed software such as that found on the Boeing 737 MAX. The confused customer is not the Dummy; often it’s the firm releasing products that are not ready for prime time. It is high time we reclaim ownership of our lives, our tools, our destinies.

The user’s intention is what separates good tech from bad tech; virtue from vice; grace from greed. But to reach that state of enlightenment, we need an involved, technologically literate public.

Leaders of the technology world are luring us into a metaverse where we will gradually lose any sense of agency in that digital world they control. But we are and will remain creatures of Nature. And nature always bats last.

© 2022 Thomas Mahon

--

Tom Mahon, author of Charged Bodies

I started writing about technology in 1974, beginning a half-century career in Silicon Valley as publicist, historian, essayist, novelist, and speaker.