Will ChatGPT Kill Us All?

The pace of adoption of generative AI (ChatGPT is one implementation of genAI) continues to be extraordinarily rapid. If anything, genAI’s adoption, and fears over the consequences of that adoption, are accelerating. That speed is as interesting to me as the technology itself, and it shapes how threatening genAI appears.

Our fears over genAI are nothing new. Most existed long before the neighbor’s cat caught its first rat.

Last week, technology executives gathered at the Capitol to discuss regulating genAI with lawmakers. Mark me down as a skeptic: the technology executives were jockeying for advantage in exploiting the technology. They would have passed on the meeting if they weren’t slavering to exploit genAI. They argued for restraining their competitors. But themselves? Oh no.

The lawmakers were striking poses against the evil tech execs and trying to establish genAI creds with their constituents. Little was accomplished, although, predictably, everyone promised to continue the discussion. At least while the cameras roll and the press clamors for coverage.

If the lawmakers are serious, which they very well should be, I suggest they call in genAI engineers, scientists, and academics. The discussion would take more effort to understand and generate fewer sound bites. The proceedings would look more like a classroom than a carnival. Far less idle entertainment, but not a complete waste of time and resources.

We fear AI will develop an unstoppable super virus that will kill off humankind. Moses used that threat to goad Pharaoh into listening to the brickmakers local. But what stopped the covid-19 pandemic? Vaccines, not tracking down the origin of the virus.

Maybe AI-aided hackers will take over the power grid. Plain old hackers did quite well at shutting down Iran’s uranium enrichment plant, as did the creeps who brought down Saudi oil refineries. Better and stricter cybersecurity would have stopped both those efforts. AI had nothing to do with it.

What if cheating with AI vitiates education? Cheating didn’t start with ChatGPT, nor will it end if genAI disappears. Students who know the value of learning, not the illusory advantages of arbitrary ticket punches, don’t cheat. Convince students that they will be rewarded in life for what they learn, not the grades they receive, and cheating will be gone.

Which brings us to the biggest fear of all: loss of jobs.

The disappearance of typing pools shows how genAI will change the job market, although crystal ball weather is always cloudy.

Desktop computers and copy machines obliterated typing and stenographic pools starting in the 1970s. Eliminating those pools transformed the nature of office work, and the role of women in the workplace changed dramatically. Women in the office today are far more numerous and significant than they were in the 40s, 50s, and 60s when most were typists, stenographers, and secretaries.

Did the replacement of typing and stenography with copy machines and desktop computers drag women kicking and screaming to become technicians, managers, and executives? Or did the desire of women for greater agency inspire machines that replaced typists and stenographers? You may prefer chickens or eggs but we’re still talking poultry. The workplace of Mad Men is gone.

Some kinds of desk work will change radically. The ubiquity of computer network based communications (the internet and the world wide web) has flooded us with words. Has the inundation improved or degraded written discourse? Opinions differ. There’s good writing on the internet, bad writing, and, predictably, a ton of mediocre writing, good enough to convey its intended message, but not of much merit in itself.

Starting in about 2000, some people have made a living producing mediocre network content. They write coherent and passable paragraphs about any subject, not unlike typists and stenographers who transformed anything dictated to them into words on paper. Some of these folks are being replaced by ChatGPT. Since using tools like ChatGPT is cheap and easy, more will be replaced.

GenAI has flaws. It makes things up (the current term is “hallucinates”) unpredictably. Its output is often boring and lifeless, sometimes nonsense. GenAI recipes range in quality from average to inedible. You can bet that developers are working nights and weekends to address these and other genAI issues.

In the 1980s, word processors froze periodically. It’s rare now, but they still do. Bugs still crawl through their algorithms, but only a tiny fraction compared to early times. I put in a few nights and weekends myself killing word processor bugs. They still aren’t perfect, but the easiest place to find a typewriter now is a museum, not an office.

In the 1960s and 70s, most women quit calling typing a career. They were no longer only trained fingers that operated a machine for putting words on paper in conventionally accepted spelling.

What will those who lose their jobs to genAI do and how will work change with genAI? I predict more jobs and more words. Some will move on to jobs they prefer like dog walking or nuclear physics, others will muddle on doing whatever comes next, but the flood of words will not abate.

Perhaps, faced with competition from mediocre genAI, the general quality of internet writing will improve. Yeah. Right.

The Chip Shortage

Last week, my wife Rebecca, who knows what she wants and when she wants it, decided to replace her aging and worn Android phone, which she could upgrade without paying extra. Fully vaccinated and ready for a post-pandemic treat, she set off for the cellular store. Several hours later, she returned in a sour mood, her old phone still in her purse. The tech at the store told her that she couldn’t upgrade because they had no new phones to offer. The chip shortage.

Want a new car? You may have a wait. The chip shortage.

What a time for a shortage. The pandemic is winding down and the economy is winding up.

The Fourth Industrial Age

The World Economic Forum says we are entering The Fourth Industrial Age where digital models and communications combine with physical processes for speed and efficiency. In the first industrial age, beginning in the 18th century, society began to harness energy to replace human and animal muscle; in the second age, industries were built around mass production in factories; in the third age, automated controls and computerization increased productivity.

The fourth industrial age is just in time for the global pandemic. We can be grateful that computerized gene analysis enabled development of lifesaving covid-19 vaccines in record time. Computer networks have supported productivity and commerce through quarantines and lockdowns. Distributed network management shored up shattered supply chains. As much as we complain about Zoom, social media, and video streaming, they made the lockdowns and quarantines tolerable, kept education alive, and allowed many of us to continue to be productive by working at home.

Only the future will reveal where the fourth industrial revolution will take us, but one thing is clear: previous industrial ages ran on coal, oil, hydroelectricity, and nuclear power. The fourth industrial age requires energy, but more than any other commodity, advancement in the fourth age depends on more and better computer chips.

The computer chip

The computer chip traces back to the electromechanical relay, invented in the early part of the 19th century. A relay is an electrical switch controlled by another electrical circuit. The circuit that flips the switch uses only a few volts to control another, stronger electrical current. When you start your car, the current to turn over the engine would quickly burn out the switch on your steering column or dash. This doesn’t happen because a relay isolates the driver-operated switch from the massive power surge that turns the engine over.
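The division of labor inside a relay can be sketched as a toy model. This is purely illustrative code of my own, with made-up numbers, not any real electronics library:

```python
# Toy model of a relay: a low-power coil circuit closes contacts that
# carry a much larger current, so the small switch never sees the surge.

class Relay:
    def __init__(self, coil_volts: float, contact_amps: float):
        self.coil_volts = coil_volts      # few volts that energize the coil
        self.contact_amps = contact_amps  # heavy current the contacts carry
        self.closed = False

    def energize(self) -> None:
        """Applying the small coil voltage closes the high-current contacts."""
        self.closed = True

    def de_energize(self) -> None:
        self.closed = False

    def load_current(self) -> float:
        """Current delivered to the load, e.g. a starter motor."""
        return self.contact_amps if self.closed else 0.0

# Turning the key energizes the coil; 150 A then flows to the starter,
# none of it through the key switch itself (illustrative numbers).
starter_relay = Relay(coil_volts=12.0, contact_amps=150.0)
starter_relay.energize()
print(starter_relay.load_current())  # prints 150.0
```

The point of the model: the coil path carries only a few harmless volts, while the 150-amp load path never touches the key switch.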

Vacuum tubes, invented in the early 20th century, performed many of the same switching functions as relays, but faster. With tubes came audio amplifiers, radios, and early digital computers.

Transistors, which appeared after WWII, are faster still, more compact, require far less power, and have lifespans measured in decades instead of hours, making the complex digital devices and controls of today practical.

Computer chips are tightly packed arrays of transistors on thin slices of silicon. In 1965, engineer Gordon Moore, later a co-founder of Intel, predicted that the density of transistors on computer chips would double every one to two years.

This prediction was dubbed “Moore’s Law.” Almost from the beginning, experts have predicted the impending end of Moore’s law, but after nearly sixty years of exponential growth, the law still holds. The ingenuity of chip technologists, largely from the U.S., has been startling. Advanced chips today have over a billion transistors: more than the number of grains of sand in a five-gallon bucket. Think about wiring together every grain of sand in that bucket in an exact pattern, connecting each grain with every other one, and you get an idea of just how hard chip manufacturing is. Now shrink the grains so that they all fit in a few layers the size of a postage stamp.
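Moore’s prediction is simple compound doubling, which makes back-of-the-envelope projections easy. A sketch (the 1965 baseline count and the flat two-year doubling period are my own illustrative assumptions, not Moore’s exact figures):

```python
# Back-of-the-envelope Moore's law: steady doubling of transistor
# counts from an assumed 1965 baseline of 64 transistors per chip.

def transistors(year: int, base_year: int = 1965, base_count: int = 64,
                doubling_years: float = 2.0) -> float:
    """Projected transistor count under steady doubling."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1965, 1985, 2005, 2021):
    print(year, f"{transistors(year):,.0f}")
# By 2021 the projection is about 17 billion: well past "over a billion."
```

Twenty-eight doublings turn a few dozen transistors into more than ten billion, which is why the predictions of the law’s demise keep sounding plausible and keep being wrong.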

American engineers figured out not only how to perform this staggering task; they also devised ways of using these contraptions to control cars, improve the quality of steel, and discover likely covid-19 vaccines. Digital processes power Zoom meetings, deliver pizza, anticipate storm surges in sewers, and broadcast cat photos.

Today, a cutting-edge computer chip is undoubtedly the most difficult manufacturing challenge on the planet, requiring hundreds of precision operations, so precise they are calibrated in wavelengths of light. Chips must be small because their speed is limited by the time required for a signal, moving near the speed of light, to travel from one side of the chip to the other.
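That speed-of-light limit is easy to check with arithmetic. A sketch, assuming a 5 GHz clock and a signal moving at roughly half the speed of light in on-chip wiring (both round figures are my illustrative assumptions):

```python
# How far can an on-chip signal travel in one clock tick?
C = 3.0e8                # speed of light in a vacuum, meters per second
SIGNAL_SPEED = 0.5 * C   # assumed on-chip propagation speed
CLOCK_HZ = 5.0e9         # assumed 5 GHz clock

tick_seconds = 1.0 / CLOCK_HZ                    # one period: 0.2 nanoseconds
reach_mm = SIGNAL_SPEED * tick_seconds * 1000.0  # meters -> millimeters
print(f"{reach_mm:.0f} mm per tick")             # prints 30 mm per tick
```

Thirty millimeters is only a chip-width or two, which is why signal travel time puts a hard ceiling on how large a fast chip can be.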

Outsourced chip manufacturing

As the source of chip manufacturing technology, you would expect the U.S. to be the leading computer chip manufacturer. It is, and it isn’t. Intel and a few other U.S. companies dominate the field: they design the chips and the processes to manufacture them, but they often outsource the fabrication to firms in Asia, primarily in South Korea and Taiwan. Companies in South Korea and Taiwan, in turn, have factories on the Chinese mainland where skilled workers are plentiful and wages are low.

And there you have it: U.S. chip innovators depend on manufacturing capacity in mainland China.

Why? Chip manufacturing in China is cheap and the quality is high. The managers of U.S. chip companies like Intel, Nvidia, and AMD are obliged to optimize shareholder value. In corporate America, passing up opportunities for increased profits ends careers. Executives must manufacture chips as cheaply and efficiently as possible. They are compelled by the market to outsource to the U.S.’s leading economic and social competitor, China.

The shortage

But don’t jump to the conclusion that the chip shortage is caused by China. U.S. corporations may have shortsightedly handed chip manufacturing to the Chinese, but the shortage today is not the result of secret directives from Beijing.

The shortage was caused by the rapid progress of the fourth industrial age. From the 1970s, when computer chips first came on the scene, tech companies (computer, smartphone, and networking gear manufacturers) were the primary consumers of chips. But this has changed. Automobiles have become mobile computer data centers. The Internet of Things requires millions of computer chips in home appliances and industrial sensors and controls. Industrial robots must have chips.

Consequently, the chip market has expanded far beyond the tech sector. Adding chip production lines is difficult and slow. Think of the gargantuan private and public effort put into developing vaccine production lines. Chip production lines are more difficult to build, and the demand is higher. It’s not news that a rapid increase in consumption of hard-to-manufacture commodities precedes shortages.

Covid-19 disruption

The fourth industrial revolution may have blunted the damage from the pandemic, but covid-19 entered the scene at the worst possible time for the chip industry. Let me count the ways.

Chip factories had to slow production as employees became sick with covid-19. These factories have their own supply chains for raw materials, subcomponents, and manufacturing equipment. The global pandemic disrupted these supply chains and sources as well as the factories themselves.

It gets worse. Consumers quit buying. Automobile sales plummeted and the big automakers cut back their orders, falling to the back of the queue. People quit flying. Chip manufacturers rely on cargo space on passenger flights to ship their tiny, high-value products and to receive materials and subcomponents, but passenger flights were cancelled. The alternative, container shipping, is a slow, inferior choice, and it too had its covid-19 problems.

Econ 101

Covid-19 did much more than inhibit chip production. It also increased chip demand. Kids needed computers for remote schooling. Parents needed equipment to work from home. Network usage soared, which required added network gear. In the U.S., by the end of 2020, we were using computing and computer networks the way the experts had predicted for 2030.

Increased demand and decreased supply. Sounds like an exercise in disaster prediction from Econ 101. Here we are, coming out of a pandemic with roaring fourth-industrial-age demand for chips and suppliers struggling to fill orders.

How long it will take to stabilize chip production is hard to predict. Some say by the end of 2021. Things are likely to get worse before they get better.

U.S. and China

This is painful, but not all bad, because it draws attention to a glaring problem. Even after the current shortage goes away, the U.S. is still in trouble. Outsourcing of sophisticated manufacturing makes managerial and profit sense, but it is a recipe for disaster. The U.S. rivalry with China is nothing like the U.S.-Soviet cold war. At the height of the cold war, the U.S. depended on the Soviet Union for fish eggs (caviar) and furs, but not much else, and the Soviet resource-based economy didn’t depend on the U.S. That left both sides free to exercise military strategies with little regard for economic consequences.

Today, the U.S. and China are economically intertwined in ways that the U.S. and the Soviets never approached. Don’t expect to see Xi Jinping pounding on a desk with his shoe like Nikita Khrushchev at the United Nations in 1960, but expect a series of confrontations and tense maneuvering for advantage. In the cold war with the Soviets, the contest was mainly ideological: state socialism vs. capitalism. Today, the superiority of capitalism is a foregone conclusion in China; the contest is between an authoritarian and a democratic state. Xi manipulates markets to achieve what he perceives as the best deal for the Chinese people. In the U.S., the people direct the market and hope they achieve their goals. If the free market says outsource to China and the people agree, so be it.

A solution

But not all Americans agree that the free market has the best solution to the chip shortage. Some folks, including me, think that we ought to identify the resources we depend upon and act for long-term control of our future. We see prioritizing and supporting our own chip manufacturing base as a healthy approach to continuing democracy in the fourth industrial age.

In another venue, we can argue where private enterprise and government enterprise should prevail. But for now, I hope for a government that encourages long-term investment in chip manufacturing and discourages short-sighted profit-taking on outsourcing. We need a landscape in which every offshore outsourcer has a vigorous onshore competitor. May the best contestant win, but let’s make sure that onshore contestants are on a level playing field.

We can do this.

Celebrating Christmas 2020

As everywhere, Christmas 2020 ends a year like no other for us on Waschke Road. Rebecca was scheduled for spinal surgery in March, but the pandemic lockdown postponed it. That resulted in a harrowing few weeks during which we decided that a two-story house was not for us.

Sunrise before Christmas 2020 on Waschke Road
The morning panorama on Vine Maple Farm

Though we loved our spacious Ferndale house, a smaller house on Waschke Road we built for Rebecca’s parents was a much better fit for a pair of seniors with bad backs and arthritis. All on the same floor and a ramp to the front door, just in case the surgery failed.

We gave the renters notice, which, fortunately, they were glad to receive because they had already decided to buy their own house. In Phase 1 lockdown, we started moving on the 1st of July with much needed help from the family. (Even six-year-old Dario helped.) We made it in time for Rebecca to recover from surgery on Waschke Road. The Ferndale house sold a shade below our asking price in August.

Every morning, the sun rises in a panorama over the old homestead. It’s so good to be home.

2020 on Waschke Road

The Whatcom County Library System, where I serve on the board, has been open for digital lending, curbside pickup, and a raft of online events and videos. I’ve been amazed at the skill and alacrity of the library staff’s work to move the system online. Our grandson Christopher and I are working on a pilot for an online bookstore for the Friends of the Whatcom County Library System to replace in-library used book sales, which are blocked by the pandemic. I’ve been leading weekly bookstore project standup Zoom meetings, secretly promoting agile development methodology.

Software Architects Anonymous, a miscreant gang of cynical enterprise consultants, meets on Zoom Friday evenings for a little beer and a lot of gossip.

The best news of the year came from the old homestead farmhouse. On Tuesday evening, 24 November, our son Paul, his wife Lanni, and a midwife brought Charles Theodore Arnold Waschke into the world in the very room where his great-uncle Arnold was born 100 years ago. My dad, Theodore, Charles’ great-grandfather, was born in what is now a chicken coop.

2020 the dismal

2020 is the year of the most devastating health disaster in a hundred years. The death toll is climbing rapidly, 318,000 as I write this. On September 11, 2001, 3,000 Americans died in a single day from a terrorist attack. In December 2020, we have already endured four days that each exceeded 3,000 deaths from covid-19. Looking at the climbing death rates, I am afraid we’ll exceed the number of U.S. military and civilian casualties in WWII (420,000) by the New Year. If you accept The Economist’s excess-death method of calculating the death toll, we may already have passed that milestone.

Christmas 2020 the wonderful

As bad as all this looks, in 20 years, I am convinced we will look back on 2020 as a year of successes. I’m not crazy. At least I don’t think I am.

2020 medical breakthroughs

  • We have two, possibly three, effective vaccines for covid-19 just 11 months after the virus flashed on the scene. The first flu vaccines did not appear until nearly 30 years after the 1918 flu pandemic. In June of 2020, the World Economic Forum reported that it takes 10 years to develop an effective vaccine. We got ours in 11 months.
  • Artificial intelligence has solved the problem of protein folding, potentially the most significant discovery for medicine development in a century.

Hope for arresting human caused climate change

  • In sunny places, solar electricity became cheaper than fossil fuel generation in 2020. People will start using renewable energy because it is cheap, not from altruism, which is in far shorter supply than sunlight.
  • BP, in its yearly market forecast, predicted that world oil consumption, currently suppressed by covid-19, will never return to 2019 levels. Not all oil companies agree, but the P in BP is still petroleum. Think of that. Ferndale depends on its refineries, but with the right planning and strategy, the jobs will remain and grow while the climate is preserved. A company that views the future clearly has a hand on success.
  • Car sales plummeted in 2020 but electric automobile sales went up. People buy electric now because electric is cool and practical, not because the trees need a hug.

Technology marches on

  • SpaceX now sends humans into space for $62 million. The space shuttle cost $1.5 billion per flight. The science fiction dream of visiting space is becoming practical.
  • We are learning more efficient ways to teach and learn. With all the grumbling about Zoom fatigue, it is easier and cheaper to be trained in practically anything than ever before.
  • Quantum computing is becoming real, hinting that a new level of computational power is on the horizon— a fresh set of batteries for Moore’s law.
  • Although the economy has taken a massive hit, the digital economy is surging ahead. The Organisation for Economic Co-operation and Development reports that Internet data volume and the use of online conferencing tools have been surging. And network providers have been keeping up.

Forces are lining up for the biggest economic burst in centuries.

There is hope that Christmas 2020 will bring future peace, joy, health, and prosperity to us all.