We cannot predict with any precision where technology will lead us

By Joe Carvalko | 23 September 2018

The first photo of Earth from the moon was taken on August 23, 1966. (NASA)

This article was originally published under the title “Self Absorption”.

Looking back on my early experience as a young engineer, I am reminded how little my colleagues and I appreciated that what we did would change the world, for good and for bad.[1] I am also reminded how M.J.E. Golay, one of my early mentors, understood the duality of technology, and how large that duality looms in putting technology to its proper purpose.

Born in Switzerland in 1902, Golay received his engineering degree from the Federal Institute of Technology in Zurich, the same school from which Einstein graduated. He is best known for the Golay cell, an infrared detector used in optical spectroscopy; the capillary column used in gas chromatography; the Savitzky-Golay smoothing filter; the discovery of the Golay codes, which generalized the perfect binary Hamming codes to non-binary codes; complementary sequences, whose autocorrelation properties are used in WiFi and 3G standards; and the Golay hexagonal nearest-neighbor logic used in pattern recognition schemes, which I had the good fortune to be a part of.

Golay commandeered the blackboard, lighting up an expanse of slate, furiously scribbling in a language that only he and the scientists across from him understood. Everyone tried to anticipate what he was about to chart, but I knew where he was going; after all, he and I had worked in the lab the entire weekend trying to find this elusive “cancerous” configuration in the pixels that mapped a new kind of geometry.

My mind drifted in and out of the conversation that followed as I thought about the moment: a space of physical things, air, bodies, a table and chairs, and then beyond, into the space of symbols that only a few in the world understood. Here we could do anything we put our minds to: square the circle, craft Plato’s ideal triangle, overcome the impossible by chalking constructs of the real world and immersing ourselves in Golay’s abstruse theory. I wondered how I had gotten there and where it would all lead.

In 1967 I went to work for Perkin-Elmer’s Optical Signal Processing (OSP) group as a research associate, under a Harvard-trained physicist who was widely published in holography, optical computers, lensless microscopes, and a type of artificial intelligence that dealt with pattern recognition. My background in signal processing and optical systems was a good fit. The research department, which included the OSP group, was home to individuals who had distinguished themselves in lasers, holography, and even cosmology.

Charles Townes, the Nobel laureate and one of the inventors of the laser, was a frequent visitor, and Frits Zernike, whose father won a Nobel Prize in physics, was our lab neighbor, but Golay was dean among the researchers. In 1961 he received the Fisher Award in Analytical Chemistry (one of chemistry’s highest accolades). He told his audience that he was not a chemist but a communications engineer, and he asked that they remember that “. . . many . . . advances in science are due to the cross-fertilization of, at first view, separate distinct fields.” As I got older, I came to understand that this is as much the miracle of the modern world as it is its menace.

The OSP group used GLOPR (Golay Logic for Optical Pattern Recognition), a parallel computer we built before microprocessors were invented, to automate microscopy routines performed by pathologists and lab techs throughout the world. The idea involved imaging cells with a scanning microscope, feeding the image into the computer, and using Golay’s new geometry (Hexagonal Parallel Pattern Transformations) to make topological measurements of cell shape, size, texture, curves, ridges, and craters, then fitting the data to a statistical model that would come to represent a cell’s morphological classification.
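The heart of such hexagonal transformations is neighborhood logic: each pixel’s fate is decided by the pattern formed by its six hexagonal neighbors. As a rough, hypothetical sketch only (the function names and the simple all-six-neighbors erosion rule are my own illustration, not the actual GLOPR logic, which matched many specific neighbor configurations in parallel hardware), a six-neighbor operation on an offset grid might look like this:

```python
def hex_neighbors(r, c):
    """Six neighbors of cell (r, c) on an odd-row-offset hexagonal grid."""
    if r % 2 == 0:
        deltas = [(-1, -1), (-1, 0), (0, -1), (0, 1), (1, -1), (1, 0)]
    else:
        deltas = [(-1, 0), (-1, 1), (0, -1), (0, 1), (1, 0), (1, 1)]
    return [(r + dr, c + dc) for dr, dc in deltas]

def hex_erode(image):
    """Keep a foreground pixel only if all six hex neighbors are foreground.

    `image` is a list of equal-length rows of 0/1 values; pixels whose
    neighborhood extends off the grid are treated as background.
    """
    rows, cols = len(image), len(image[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if image[r][c] and all(
                0 <= nr < rows and 0 <= nc < cols and image[nr][nc]
                for nr, nc in hex_neighbors(r, c)
            ):
                out[r][c] = 1
    return out
```

Chaining operations like this one (and its dilation counterpart) is what lets such a system measure shape, texture, ridges, and craters; the serial loop here only hints at what GLOPR did for every pixel simultaneously.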

For nearly five years I worked in this introverted “cerebellum” of a laboratory, a sanctuary dimly lit by blue, gaseous, pin-like lamps that fluttered with the passing effects of the thousands of binary bits flowing through the silicon vortex of our parallel processor.

Our subjects were well lit by the yellow sodium lamp that marked the center of our cytological universe, the place where we burned images of leukocytes (the day’s cells, drawn from uranium mine workers or leukemia victims) through a pinhole onto the conductive cathode of a photomultiplier, which moved kernels of cytoplasmic data, photon by photon, to waiting digital microelectronic circuits.

From there, the electronics digested a mix of digital numerology and alchemy, collecting metadata as input to pattern recognition algorithms and breathing life into a machine capable of doing what men and women had spent a century trying to perfect. As we succeeded in scanning blood cells, we tried other things: finding malaria, looking for signs of syphilis, analyzing cancerous Papanicolaou smears, and automatically karyotyping chromosomes.

I have lived through many challenging, heart-pounding events, but in terms of adrenaline flow, few have compared to those first moments watching GLOPR miraculously report, with precision, the name of each blood cell it found: basophil, monocyte, lymphocyte, neutrophil, and eosinophil. With the advent of the new computer, we were able to penetrate the unfelt world, to explain disease through light carrying evidence, to reconcile differences in the nature of biological specimens, and to unlock the unobservable through geometry. But outside my inner sanctum, the civil rights movement and the Vietnam War pounded and twisted the country.

American cities were set afire, stores looted, homes vandalized, and churches firebombed. Civil rights activists and antiwar revolutionaries were being beaten or worse, killed, some by lone madmen and others by men in the service of a band of thugs, or by police and national guardsmen following orders. The Weather Underground and the Symbionese Liberation Army, both in opposition to the war, were self-styled, well-organized revolutionary groups.

All who participated in violence would be deemed, by today’s standards, domestic terrorists. Yet I refused to divert my attention to what was happening outside. I locked myself not in a room of spatial dimensions, but into the nervous system of computers, the hyper-geometric space of numbers, multivariate statistics, mottled and sparkling lasers beaming off specimens, the inner space of life itself, where I peered into the deadly corners of cellular mystery. What could be more mesmerizing (or escapist)?

But regardless of what was happening in the world, or whether I was mesmerized, progress moved relentlessly down unexpected paths, some light and others dark. On the lighter side, in 1969 I co-authored and published an article that included pictures I had taken of red and white blood cells under different colors of transmitted light, ranging from pure blue, to dark green, to vibrant red. Each cell, depending on the color, resembled a Martian surface, some filled with deep valleys and others with craggy mountains.

Sometime later I learned that Craig Kauffman, a California artist, had reproduced the pictures as large lithographic prints and displayed them in an abstract expressionism exhibit entitled “California Prints.” The Museum of Modern Art, New York, eventually acquired the collection. I saw something in one medium, and Kauffman saw it in another. Although I did not see it then, I now see how something done for one purpose can serve as the basis for something else, largely unrelated and yet cross-connected in some surprising way.

On the darker side, as the Vietnam War labored on, the OSP group was drawn into changing direction, moving from the high purpose of medical science to war. Since our process could recognize biological specimens, it could in theory be used to recognize military targets, or, more to the point, to interpret photographic images taken over Southeast Asia.

It was easy to turn a machine made for humanitarian purposes into a machine made for war. In short order we trained our creation to tell a river from a riverbank, a boat from a river, and a sampan from a patrol boat, but unfortunately not to tell a good guy, military or civilian, from a bad guy. In December 1971, I presented “The Evaluation of Golay Transforms as Applied to Aerial Photo Interpretation” at the Symposium on Automatic Photo Interpretation and Recognition in Washington, D.C.

How ironic that technology, so often beneficial or at least neutral, more times than not degenerates into a weapon of war. For me it was not the first time. I had been privileged to assist Emil Bolsey (also an alumnus of the Federal Institute of Technology in Zurich), who invented an electronic tracking system which, when mounted on a spacecraft and pointed at a planet, could determine the attitude of the craft (its pitch, yaw, and roll) and how fast the ground was passing beneath it.

One application was to pan a camera lens at a rate that compensated for the motion between the moving craft and the object being photographed, thus reducing blur. Between 1966 and 1967, five Lunar Orbiters circled the moon, the Bolsey scanners helping to relay the high-resolution photographs from which NASA chose the Apollo moon-landing site. I last saw Bolsey in the mid-eighties, when we met to discuss his suspicion that the government was infringing on the tracking system we had so assiduously worked on nearly 20 years before, and which I later learned guided cruise missiles in the destruction of Iraq (’91), then Kosovo, and then Iraq again.

We know that what makes us individual is that each of us travels in diverse emotional and intellectual orbits. We may spin in or out of creative control, hardly ever contacting other inspired souls who may be circling close; we may even crisscross as we move into other spaces, as Kauffman and I did. Isabel Myers, in “Gifts Differing,” said: “We often see different perspectives because each of us looks at the territory from different orientations.” Golay spoke to cross-fertilization, but he left unsaid that it can have both good and bad consequences, for in science discoveries can move toward opposite poles.

We must be vigilant to spot those instances where scientific progress serves peace and reconciliation on the one hand and war on the other, or where technology fortifies a vital national endeavor but weakens our cherished values. By the ’60s we had known for some time about the atom’s potential to supply a near-perpetual source of energy, and we also knew of its power to annihilate. In the first instance the U.S. conservatively allowed the construction of nuclear power plants, and in the second instance barred its use as a weapon.

In the ’60s we did not know whether the computer-on-a-chip, still in its infancy, or the artificial intelligence we were creating would free us to soar to new heights or tether us to a world where government would hear and see all. With 9/11 and the ensuing march toward the detection of terrorism, a picture has emerged, one characterized by revelations that the NSA has cornered the market on private information, euphemistically referred to as “metadata.”

Yet we can look to a brighter side, one I could never have imagined in the ’60s: the chromosomes we karyotyped would one day be uncoiled to lay bare the genome, an instrument for critical medical diagnoses, for setting free those erroneously convicted of crimes, and for enlightening us about Mitochondrial Eve, our common mother, and the long journey that began two hundred thousand years ago. That journey brought me into the world of physical things, air, table and chairs, and beyond, into the space of geometries and of cohorts like Golay and Bolsey, who helped me better understand my universe, the one either too small or too far away to see unless aided by the eyes of science and technology. I once wondered how I got here, and now I think I know; but I am afraid my second query, “where will it lead,” will remain an open question.

One cannot predict with any precision where technology will lead us, although it has the indisputable potential to reduce suffering, extend life, and raise living standards. And in the hands of the powerful, we witness its misuse: altering ecosystems, undermining the sustainability of organisms, killing with greater efficiency. If we were separated from modern inventions, we would remain alive no more than a few days, or weeks for survivalists. Invention does not only express our ingenuity; it expresses a societal conscience commensurate with the kind of world we collectively choose to live in.

Ingenuity itself has little control over where it leads, and I have long wondered whether one might better, in the words of Hamlet, “bear those ills we have than fly to others that we know not of.” But I say we should pursue science and technology because, like the fire of Prometheus, invention burns bright, and although we may not always know where it leads us, a world darkened by the fear of treading upon the unknown is unimaginable.

[1] This article contains material from my book A Road Once Traveled: Life from All Sides (2007).

Reprinted with permission from the author.

Joseph Carvalko is an American technologist, academic, patent lawyer, and writer. As an inventor and engineer, he has been awarded 18 patents in various fields. He has authored academic books, articles, and fiction throughout his career. Currently he is an Adjunct Professor of Law at Quinnipiac University, School of Law, teaching Law, Science and Technology; Chairman, Technology and Ethics Working Research Group, Interdisciplinary Center for Bioethics, Yale University; member, IEEE, Society on Social Implications of Technology; summer faculty member, Interdisciplinary Center for Bioethics, Yale University.


