AlphaSmart

This post is about writing and computing. It only touches on the technical, so I’ve posted it here on Vine Maple Farm, rather than Marv Waschke on Computing, which I reserve for more technical subjects.

I have found AlphaSmart mode to be productive and relaxing, which is a nice addition to anyone’s work repertoire.

What’s old is new again.

I’m typing this on an AlphaSmart 3000, a product designed and built for use in elementary and high school classrooms for keyboard training, a $300 alternative to desktop and laptop computers costing thousands. Its LCD display has only four lines of forty characters each, about the equivalent of two lines of text on a letter-size page.

Largely replaced by Chromebooks, AlphaSmarts were being surplused by school systems before the pandemic began, and the lockdowns and school closures accelerated the trend. Lacking online functionality, AlphaSmarts are useless for remote learning, and they now flood eBay.

I’ve heard about distraction-free writing devices for at least a decade. I was curious but not much attracted, because I’ve never had much patience with folks who find the ocean of knowledge on the global computer network a distraction rather than a resource.

Lured by low prices and curiosity, I bought an AlphaSmart 3000 on eBay about a month ago for less than fifty bucks, and I am astounded to say that I love it.

Although I am 74 years old, I’m also a digital native. I wrote my first computer program in 1967 and started using screens in the 1980s. My solution for the last decade has been two displays, one for the job at hand, the other for fact-checking and online reference tools. I’m sticking with that configuration, but the AlphaSmart has added something new.

If you want to edit beyond the simplest changes, forget it.

I used to scribble rough outlines on a pad of paper (the backside of single-sided print docs). I still do. But now, I sprawl in a recliner with the AlphaSmart and my paper notes and type away.

The AlphaSmart is a drafting device, not an editing device. Navigating text on an AlphaSmart is difficult. You are stuck with arrow keys that move one character at a time, plus “home,” “end,” and “backspace,” and that’s it. If you want to edit beyond the simplest changes, forget it. You have to upload to a real computer.

The AlphaSmart is for laying down one sentence after another. Leave the moves, cuts, and tweaks for later. If you can’t correct it easily on the four-line display, leave it for later. If you can’t remember something, stick in TK (a signal to an editor that more is To Kome) and move on. For me, this provides two advantages. I can leave my office to give my aching neck, back, and butt a break, and it sets me free for a mode of thinking and composing that I have previously experienced only while writing in longhand, which then requires transcription to text, a chore I dislike. I have found AlphaSmart mode to be productive and relaxing, which is a nice addition to anyone’s work repertoire.

Now, I’ll get down to technology. The virtues of the AlphaSmart come from what it isn’t rather than what it is. It’s a keyboard with a simple display and a small memory, probably less than a megabyte. When the device is disconnected from a computer, the user types text into memory and it appears on the display. Although the device has a processor, it acts only as a simple controller. When an AlphaSmart communicates with a computer, it uses a simple keyboard protocol rather than a file transfer protocol. The user opens a text entry tool, like a text editor or word processor, positions the cursor, and presses “Send” on the AlphaSmart. The computer screen acts as if a fast typist is typing in the text.

That’s all the device does.
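
For the curious, here is a minimal sketch of that idea in Python: keystroke replay rather than file transfer. It uses the third-party pynput library to stand in for the AlphaSmart’s hardware keyboard emulation; the send function and its pacing are my own illustration, not the device’s actual firmware or protocol.

```python
# A sketch of keystroke replay: stored text is "typed" into whatever window
# has focus, one character at a time, as if a fast typist were at the keyboard.
# Illustration only; the real AlphaSmart emulates a hardware keyboard in firmware.
import time
from pynput.keyboard import Controller  # third-party library: pip install pynput

def send(text: str, chars_per_second: float = 50.0) -> None:
    """Replay stored text as individual keystrokes into the focused application."""
    keyboard = Controller()
    delay = 1.0 / chars_per_second
    for ch in text:
        keyboard.type(ch)   # emit the character to the app that has focus
        time.sleep(delay)   # pace it like a very fast typist

if __name__ == "__main__":
    send("Drafted on the AlphaSmart, replayed as keystrokes.\n")
```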

Because the AlphaSmart is so simple, three AA batteries seem to last forever. It does not heat up and there is no humming fan. It has no moving parts other than the keys and starts in less time than it takes me to remember where I left off. The device was designed to endure rough elementary school students. I’ve already dropped my used AlphaSmart without damage. It’s clearly not new, but it doesn’t look shabby either.

I enjoy a good rampage now and then.

The AlphaSmart is not perfect. The keyboard is the equivalent of a quality laptop keyboard, but it does not have the key throw and satisfying feel of a mechanical keyboard. The space bar has to be struck squarely. The LCD screen has no backlight, which adds to battery life but is inconvenient for adding a sentence or two during the ads while watching TV in dim light.

This morning, I went on a rampage, practically tearing the living room and my office apart because I couldn’t find my AlphaSmart. I had forgotten that I tucked it behind a chair cushion. I don’t usually get attached to gadgets. This is not normal behavior for me. Well, not everyday behavior. I enjoy a good rampage now and then.

A final note: I favor the AlphaSmart 3000. I also have a 2000. Its keyboard interface doesn’t work with Windows 10 without a somewhat hard-to-find special adapter, which is a pain. Later models, like the AlphaSmart Neo, are Palm PDAs in an AlphaSmart form factor and, in my opinion, a step beyond the 3000’s charming simplicity.

Will ChatGPT Kill Us All?

Generative AI (ChatGPT is one implementation of genAI) continues to be adopted at an extraordinarily rapid pace. If anything, both the adoption and the fears about its consequences are accelerating, which is as interesting to me as the technology itself and shapes how its threats are perceived.

“Our fears over genAI are nothing new. Most existed long ago.”

Last week, technology executives gathered at the Capitol to discuss regulating genAI with lawmakers. Mark me down as a skeptic: on the technology executive side, the participants were jockeying for advantage in exploiting the technology. They would have passed on the meeting if they weren’t slavering to exploit genAI. They argued for restraining their competitors. But themselves? Oh no.

The lawmakers struck poses against the evil tech execs and tried to establish genAI creds with their constituents. Little was accomplished, although, predictably, everyone promised to continue the discussion. At least while the cameras roll and the press clamors for coverage.

If the lawmakers are serious, which they very well should be, I suggest they call in genAI engineers, scientists, and academics. The resulting discussion would take more effort to understand and generate fewer sound bites, more like a classroom than a carnival. Far less entertaining, but not a complete waste of time and resources.

Our fears over genAI are nothing new. Most existed long ago.

We fear AI will develop an unstoppable super virus that will kill off humankind. Moses used that threat to goad Pharaoh into listening to the brickmakers’ local. But what stopped the Covid-19 pandemic? Vaccines, not tracking down the origin of the virus.

Maybe AI-aided hackers will take over the power grid. Plain old hackers did quite well at shutting down Iran’s uranium enrichment plant, as did the creeps who brought down Saudi oil refineries. Better and stricter cybersecurity would have stopped both of those efforts. AI had nothing to do with either.

What if cheating with AI vitiates education? Cheating didn’t start with ChatGPT, nor will it end if genAI disappears. Students who know the value of learning, not the illusory advantages of arbitrary ticket punches, don’t cheat. Convince students that they will be rewarded in life for what they learn, not the grades they receive, and cheating will be gone.

Which brings us to the biggest fear of all: loss of jobs.

“The disappearance of typing pools shows how genAI will change the job market, although the weather in crystal balls is always cloudy.”

Desktop computers and copy machines obliterated typing and stenographic pools starting in the 1970s. Eliminating those pools transformed the nature of office work, and the role of women in the workplace changed dramatically. Women in the office today are far more numerous and significant than they were in the ’40s, ’50s, and ’60s, when most were typists, stenographers, and secretaries.

Did the replacement of typing and stenography with copy machines and desktop computers drag women kicking and screaming to become technicians, managers, and executives? Or did the desire of women for greater agency inspire machines that replaced typists and stenographers? You may prefer chickens or eggs, but the change occurred. The workplace of Mad Men is gone.

The disappearance of typing pools suggests how genAI will change the job market, although the weather in crystal balls is always cloudy.

Some kinds of desk work will change radically. The ubiquity of computer-network-based communications (the internet, the world wide web, and social media) has flooded us with words. Has the inundation improved or degraded written discourse? Opinions differ. There’s good writing on the internet, bad writing, and, predictably, a ton of mediocre writing, which is good enough to convey its intended message but not of much merit in itself.

Starting in about 2000, some people have made a living producing mediocre network content. They write coherent and passable paragraphs about any subject, not unlike typists and stenographers who transformed anything dictated to them into words on paper. Some of these folks are being replaced by ChatGPT. Since using tools like ChatGPT is cheap and easy, more will be replaced.

GenAI has flaws. It makes things up (the current term is “hallucinates”) unpredictably. Its output is often boring and lifeless, sometimes nonsense. GenAI recipes range in quality from average to inedible. You can bet that developers are working nights and weekends to address these and other genAI issues.

In the 1980s, word processors froze periodically. It’s rare now, but they still do. Bugs still crawl through their algorithms, but only a tiny fraction compared to early times. I put in a few nights and weekends myself killing word processor bugs. They still aren’t perfect, but the easiest place to find a typewriter now is a museum, not an office. Something similar is likely to happen to genAI.

In the 1960s and 70s, most women quit calling typing a career. They were no longer only trained fingers that operated a machine for putting words on paper in conventionally accepted spelling.

What will those who lose their jobs to genAI do and how will work change with genAI? I predict more jobs and more words. Some will move on to jobs they prefer like dog walking or nuclear physics, others will muddle on doing whatever comes next, but the flood of words will not abate.

Perhaps, faced with competition from mediocre genAI, the general quality of internet writing will improve. Yeah. Right.

The Chip Shortage

Last week, my wife Rebecca, who knows what she wants and when she wants it, decided to replace her aging and worn Android phone, an upgrade she could make without paying extra. Fully vaccinated and ready for a post-pandemic treat, she set off for the cellular store. Several hours later, she returned in a sour mood, her old phone still in her purse. The tech at the store told her that she couldn’t upgrade because they had no new phones to offer. The chip shortage.

Want a new car? You may have a wait. The chip shortage.

What a time for a shortage. The pandemic is winding down and the economy is winding up.

The Fourth Industrial Age

The World Economic Forum says we are entering the Fourth Industrial Age, in which digital models and communications combine with physical processes for speed and efficiency. In the first industrial age, beginning in the 18th century, society began to harness energy to replace human and animal muscle; in the second age, industries were built around mass production in factories; in the third age, automated controls and computerization increased productivity.

The fourth industrial age is just in time for the global pandemic. We can be grateful that computerized gene analysis enabled development of lifesaving covid-19 vaccines in record time. Computer networks have supported productivity and commerce through quarantines and lockdowns. Distributed network management shored up shattered supply chains. As much as we complain about Zoom, social media, and video streaming, they made the lockdowns and quarantines tolerable, kept education alive, and allowed many of us to continue to be productive by working at home.

Only the future will reveal where the fourth industrial revolution will take us, but one thing is clear: previous industrial ages ran on coal, oil, hydroelectricity, and nuclear power. The fourth industrial age requires energy, but more than any other commodity, advancement in the fourth age depends on more and better computer chips.

The computer chip

The computer chip’s ancestor is the electromechanical relay, invented in the first half of the 19th century. A relay is an electrical switch controlled by another electrical circuit. The circuit that flips the switch uses only a few volts to control another, stronger electrical current. When you start your car, the current needed to turn over the engine would quickly burn out the switch on your steering column or dash. This doesn’t happen because a relay isolates the driver-operated switch from the massive current that turns the engine over.

Vacuum tubes, invented in the early 20th century, performed many of the same switching functions as relays, but faster. With tubes came audio amplifiers, radios, and early digital computers.

Transistors, which appeared after WWII, are faster still and more compact, require far less power, and have lifespans measured in decades instead of hours, making today’s complex digital devices and controls practical.

Computer chips are tightly packed arrays of transistors on thin slices of silicon. In 1965, Intel co-founder and engineer Gordon Moore predicted that the density of transistors on computer chips would double every one to two years.

This prediction was dubbed “Moore’s Law.” Since it was first stated, experts have predicted the impending end of Moore’s Law, but after nearly sixty years of exponential growth, the law still holds. The ingenuity of chip technologists, largely from the U.S., has been startling. Advanced chips today have over a billion transistors; that’s more than the number of grains of sand in a five-gallon bucket. Think about wiring together every grain of sand in that bucket in an exact pattern connecting each grain with every other one and you get an idea of just how hard chip manufacturing is. Now, shrink the size of each grain so that they all fit in a few layers the size of a postage stamp.
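
To get a feel for what a doubling every year or two adds up to, here is a back-of-envelope sketch in Python. The starting point, roughly 2,300 transistors on the Intel 4004 in 1971, is a well-known figure; the two-year doubling period is the slower end of Moore’s range, and the projections are illustrative, not a history of any particular product line.

```python
# Back-of-envelope Moore's Law arithmetic (illustrative, not a product history).
START_YEAR = 1971
START_TRANSISTORS = 2_300        # Intel 4004, approximately
DOUBLING_PERIOD_YEARS = 2        # slower end of Moore's one-to-two-year range

def projected_transistors(year: int) -> int:
    """Project transistors per chip in a given year under the assumptions above."""
    doublings = (year - START_YEAR) / DOUBLING_PERIOD_YEARS
    return round(START_TRANSISTORS * 2 ** doublings)

for year in (1971, 1991, 2011, 2021):
    print(year, f"{projected_transistors(year):,}")
# 1971 -> 2,300; 2021 -> roughly 77 billion, the same order of magnitude
# as the largest chips actually shipping today.
```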

American engineers not only figured out how to perform this staggering task, they also devised ways of using these contraptions to control cars, improve the quality of steel, and discover likely covid-19 vaccines. Digital processes power Zoom meetings, deliver pizza, anticipate storm surges in sewers, and broadcast cat photos.

Today, a cutting-edge computer chip is undoubtedly the most difficult manufacturing challenge on the planet, requiring hundreds of precision operations, so precise they are calibrated in wavelengths of light. Chips must also be small, because their speed is limited by the time required for a signal moving near the speed of light to travel from one side of the chip to the other.
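
Here is the rough arithmetic behind that size limit, again as a small Python sketch. The 3 GHz clock rate is a typical modern figure I’m assuming for illustration; real on-chip signals travel well below the speed of light, which makes the constraint even tighter.

```python
# Why chips must be small: a signal can only travel so far in one clock tick.
SPEED_OF_LIGHT_CM_PER_NS = 30.0   # light covers about 30 cm per nanosecond
CLOCK_GHZ = 3.0                   # assumed typical clock rate, for illustration

cycle_ns = 1.0 / CLOCK_GHZ                           # ~0.33 ns per tick
max_travel_cm = SPEED_OF_LIGHT_CM_PER_NS * cycle_ns  # ~10 cm even at light speed
print(f"A {CLOCK_GHZ:.0f} GHz cycle lasts {cycle_ns:.2f} ns; "
      f"light covers only about {max_travel_cm:.0f} cm in that time.")
# Real signals in wires move at a fraction of light speed, so in practice a
# signal can cross only a few centimeters per cycle, which is why chips stay small.
```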

Outsourced chip manufacturing

Since the U.S. is the source of chip manufacturing technology, you would expect it to be the leading computer chip manufacturer. It is and it isn’t. Intel and a few other U.S. companies dominate the field: they design the chips and the processes to manufacture them, but they often outsource the fabrication to firms in Asia, primarily in South Korea and Taiwan. Companies in South Korea and Taiwan, in turn, have factories on the Chinese mainland, where skilled workers are plentiful and wages are low.

And there you have it: U.S. chip innovators depend on manufacturing capacity in mainland China.

Why? Chip manufacturing in China is cheap and the quality is high. The managers of U.S. chip companies like Intel, Nvidia, and AMD are obliged to optimize shareholder value. In corporate America, passing up opportunities for increased profits ends careers. Executives must manufacture chips as cheaply and efficiently as possible. They are compelled by the market to outsource to the U.S.’s leading economic and social competitor, China.

The shortage

But don’t jump to the conclusion that the chip shortage is caused by China. U.S. corporations may have shortsightedly handed chip manufacturing to the Chinese, but the shortage today is not the result of secret directives from Beijing.

The shortage was caused by the rapid progress of the fourth industrial age. From the 1970s, when computer chips first came on the scene, tech companies (computer, smartphone, and networking gear manufacturers) were the primary consumers of chips. But this has changed. Automobiles have become mobile computer data centers. The Internet of Things requires millions of computer chips in home appliances and industrial sensors and controls. Industrial robots must have chips.

Consequently, the chip market has expanded far beyond the tech sector. Adding chip production lines is difficult and slow. Think of the gargantuan private and public effort put into developing vaccine production lines. Chip production lines are more difficult, and the demand is higher. It’s not news that a rapid increase in consumption of hard-to-manufacture commodities precedes shortages.

Covid-19 disruption

The fourth industrial revolution may have blunted the damage from the pandemic, but covid-19 entered the scene at the worst possible time for the chip industry. Let me count the ways.

Chip factories had to slow production as employees became sick with covid-19. These factories have their own supply chains for raw materials, subcomponents, and manufacturing equipment. The global pandemic disrupted these supply chains and sources as well as the factories themselves.

It gets worse. Consumers quit buying. Automobile sales plummeted and the big automakers cut back their orders, falling to the back of the queue. People quit flying. Chip manufacturers rely on cargo space on passenger flights to ship their tiny, high-value products and receive materials and subcomponents, but passenger flights were cancelled. The alternative, container ships, is a slow, inferior choice for shipping, and they too had their covid-19 problems.

Econ 101

Covid-19 did much more than inhibit chip production. It also increased chip demand. Kids needed computers for remote schooling. Parents needed equipment to work from home. Network usage soared, which required added network gear. In the U.S., by the end of 2020, we were using computing and computer networks at levels the experts had predicted for 2030.

Increased demand and decreased supply. Sounds like an exercise in disaster prediction from Econ 101. Here we are, coming out of a pandemic with fourth-industrial-age demand for chips roaring and suppliers struggling to fill orders.

How long it will take to stabilize chip production is hard to predict. Some say by the end of 2021. Things are likely to get worse before they get better.

U.S. and China

This is painful, but not all bad, because it draws attention to a glaring problem. Even after the current shortage goes away, the U.S. is still in trouble. Outsourcing of sophisticated manufacturing makes managerial and profit sense, but it is a recipe for disaster. The U.S. rivalry with China is nothing like the U.S.-Soviet cold war. At the height of the cold war, the U.S. depended on the Soviet Union for fish eggs (caviar) and furs, but not much else, and the Soviet resource-based economy didn’t depend on the U.S. That left both sides free to exercise military strategies with little regard for economic consequences.

Today, the U.S. and China are economically intertwined in ways that the U.S. and the Soviets never approached. Don’t expect to see Xi Jinping pounding on a desk with his shoe like Nikita Khrushchev at the United Nations in 1960, but expect a series of confrontations and tense maneuvering for advantage. In the cold war with the Soviets, the contest was mainly ideological: state socialism vs. capitalism. Today, the superiority of capitalism is a foregone conclusion in China; the contest is between an authoritarian and a democratic state. Xi manipulates markets to achieve what he perceives as the best deal for the Chinese people. In the U.S., the people direct the market and hope they achieve their goals. If the free market says outsource to China and the people agree, so be it.

A solution

But not all Americans agree that the free market has the best solution to the chip shortage. Some folks, including me, think that we ought to identify the resources we depend upon and act to keep long-term control of our future. We see prioritizing and supporting our own chip manufacturing base as a healthy approach to continuing democracy in the fourth industrial age.

In another venue, we can argue over where private enterprise and where government should prevail. But for now, I hope for a government that encourages long-term investment in chip manufacturing and discourages short-sighted profit-taking on outsourcing. We need a landscape in which every offshore supplier has a vigorous onshore competitor. May the best contestant win, but let’s make sure that onshore contestants are on an even playing field.

We can do this.