Friday, December 5, 2014

I WROTE A BOOK, AND YOU CAN BUY IT ON AMAZON!


"Don Quixote in His Library" (woodcut) by Dore

    "Fortunately, following the turbulence of winter comes the season of activity and opportunity called springtime. It is the season for entering the fertile fields of life with seed, knowledge, commitment, and a determined effort. However, the mere arrival of spring is no sign that things are going to look good in the fall. You must do something with the spring."

      — Jim Rohn, 1981

I've always wanted to write a book. Actually, I've written quite a few technical manuals during my career, on a "work for hire" basis, but I wanted to write my own book, under my own name, and sell it to the public. It's finally happened. You can order it here:

If you want to know what it's about, and why you might want to read it, see the web site:

The process has been way more arduous and time-consuming than I ever dreamed, but now that I'm through it, I thought I'd share what I've learned.

What to Write


"The Time Machine"

Since I was a boy I've had a yearning to write science fiction. I've managed to write a few short stories, mostly unpublished, but never a novel. I would love to enter the pantheon of the sci-fi greats, which includes the likes of Jules Verne, H. G. Wells, Robert Heinlein, Isaac Asimov, Arthur C. Clarke, &etc. Perhaps someday I will.


"The Hitchhiker's Guide to the Galaxy"

I was also inspired by Douglas Adams' science fiction satire, "The Hitchhiker's Guide to the Galaxy" (1979).

Not only would I be delighted to one day be as funny as he is, but his concept of a universal guide book gave me a yearning to write my own guide books someday. As I worked at a variety of "traveling techie" jobs over the decades I began to hatch the idea of a guide book for people like me.

Why to Write


"Multiple Streams of Income"

Ever since I read it in the late '70s, I have been inspired by electronic hobbyist Don Lancaster's book "The Incredible Secret Money Machine" (1978),

which recommends a strategy for creating what he calls "a steady stream of nickels" to achieve greater financial independence. Another book, Robert Allen's "Multiple Streams of Income" (1998),

gave a plan for doing this as well, including the idea of writing specialty books containing knowledge that is in demand.


"Lakich: For Light. For Love. For Life."

So I decided to write to make extra money. I was given an additional reason by my friend Lili Lakich, who is a neon artist. She told me a story of how she wrote a book about her work, "Lakich: For Light. For Love. For Life." (2007) and got it into bookstores, only to find that it didn't sell as well as she'd hoped.

But then an art patron in Japan read it and contacted her, and it resulted in the biggest commission of her career. She said it was important to do these kinds of things for more intangible reasons, because there would be good unanticipated side-effects.

How to Write


"The Writing Life"

It is important to learn to write well, and the best way I know to do that is to get a lot of practice. I once saw Ray Bradbury lecture, and he said to write every day.

But there are a few helpful books. The classic "The Elements of Style" (1920) is irreplaceable, and should be on every writer's shelf.

And Pulitzer Prize-winning author Annie Dillard's "The Writing Life" (1989)

has some hints on how to work with problems like writer's block.

My own failure mode for years was that I would start writing a short story, then have it turn into a novella, then a novel, then a trilogy, all the while still working on the first chapter. I was lacking convergence in my work. The cure for this, for me, was to write blogs, which forced me to come to an end in order to get them out the door. I worked first on "Cybernetics in the Third Millennium"

and then on this one. It worked. Now I can get projects to converge, and I can finish them.

I still remember when I made the commitment to write my book. I was sitting in a seminar in Orlando, Florida given by Robert Allen on his "multiple streams of income" ideas. I'd used frequent flier miles to get there from my home in the San Diego area, and the seminar was free (because it was mostly up-sells), but I was still spending my time. I realized that at some point you have to stop getting more advice and just start doing stuff. I pulled out a blank piece of paper and began designing the outline. Self-help pioneer Jim Rohn might say I'd moved from a Winter to a Spring of my life, by planting a seed.

How to Self-Publish


"Dan Poynter's Self-Publishing Manual"

It took me two years to write the book, and ten years to publish it. I had to keep going back and rewriting parts as they got stale. I couldn't say "recently" and mean 8 years ago. I had to change advice about using a Dictaphone to using Siri. The delaying problems were:

  • I was working day jobs, often with intensive business travel

  • I had no idea how to do what I was doing and had to learn as I went along.

I got a lot of help from two self-publishing authors. Dan Poynter started out writing books about parachute jumping as a hobby and self-publishing them, and as he learned the process he began writing books about how to self-publish. They turned out to be much wider in appeal (though he still calls his company Para Publishing). I first found his "Self-Publishing Manual: How to Write, Print and Sell Your Own Book," originally published in 1979; now it is called "Dan Poynter's Self-Publishing Manual" (2007).

The other author, Aaron Shepard, started out writing children's books and self-publishing them, and followed a similar trajectory. A friend of mine gave me a digital copy of a pre-release version of his book "Aiming at Amazon" (2007),

which explained how the rules of publishing have changed in the digital era. (I bought a paper copy when it came out.) He also wrote two more books I found very useful, about "Print On Demand" (POD) publishing:

  • "Perfect Pages" (2006)

  • "POD for Profit" (2010)

    He says both of these are obsolete, and now recommends Kindle publishing. He has several new books (on Kindle only, naturally) about how to do this. But he gives away the older books as PDF files on his web site.

    If I were starting the whole thing right now I might go with Kindle-only, but maybe not; I'm an old-school kind of guy. I want people in airport bars to see other people reading my book. I want people to be able to read it during take-off and landing.

    Aaron steered me to a company called Lightning Source as my Print On Demand supplier. So far I have been quite happy. They have a lot of useful information on their web site.


    my first box of wholesale books arrives

    In the last week I've gotten my first order fulfilled by Amazon for one book (just to see it work), and my first wholesale order of 20 books from Lightning Source.

    I've learned a lot of things the hard way on this project. For instance:

    • Obtaining permissions to use quotes and artwork is a royal pain in the neck, requiring many months of correspondence and some detective work. Some rights holders are quite unreasonable, and don't believe in "fair use." If I had it to do over I would have far fewer inspirational quotes.

    • In making decisions about layout, font, structure (does the dedication come before the Table of Contents?), just pick a book you think looks good and copy its visual style.

    • I couldn't do the conversion myself from HTML (how I chose to write the book) to PDF in the format Lightning Source wanted. I didn't have the time or the patience. I had to hire an experienced graphic artist who'd done a book with them before.

    • Good graphic artists are golden; cherish them.

    • If you're selling on Amazon, don't agonize too much about pricing and discount; they're easy to change. Just get the book out there.

    • Lightning Source says it takes 6-8 weeks for a book to appear on Amazon, but for me it happened in 3 weeks.

    • Marketing a book is probably as much work as writing and publishing it. That's what I must do next. This blog is a step in that direction. There will probably be some sort of press tour, book signings, and of course, social media.

    More to Come

    Now that I've been through the process, I am re-energized about writing, and I feel I have some more books in me, including more non-fiction:

    • "A Survival Guide for the World-Traveling Techie" — a sequel I would need help to write

    • "Garage Visualization" — data visualization on a shoestring budget

    • "A Curriculum for Cybernetics and Systems Theory" — based on my web site of the same name; possibly the best thing I've written

    • "Surviving Orlando: A Guide for Dads" — a guide-book for the poor sap who usually pays for the trip

    • "Somewhere Near Barstow: A Guide to the Drive from L.A. to Vegas" — a love letter to the East Mojave desert

    • "How to Raise a Genius in Ten Easy Years" — based on my parenting experiences

    • "How to Run a High-Tech Startup Into the Ground" — a humorous book based on my business experiences

    • "Melon Crate 442" — a memoir of my high school years

    • "Between K-Mart and Jupiter" — a memoir of my college school years

    as well as some fiction:

    • "Tales of El Ciervo Blanco" — a collection of short stories which is mostly written, inspired by Clarke's "Tales of the White Hart"

    • "Macronesia: The First Astrogators" — a science fiction saga abut polynesians colonizing the asteroid belt with solar sails

    • "SOB Trail" — a trashy novel set in central Florida about a science fiction author trying to reclaim his muse

    Stay tuned!


    Disclaimer

    I receive a commission on everything you purchase from Amazon.com after following one of my links, which helps to support my research. It does not affect the price you pay.


    This free service is brought to you by the book:

    A Survival Guide for the Traveling Techie
    travelingtechie.com

Friday, August 29, 2014

TOOLS FROM MY CHEST

from The Last Supplement to the Whole
Earth Catalog (1971); co-editor Paul Krassner's
response to Ken Kesey's 17-page screed
entitled "Tools From My Chest"

( www.ep.tc/realist/89 )

    "As the UNIX system has spread, the fraction of its users who are skilled in its application has decreased."

      — Kernighan & Pike, "The UNIX Programming Environment" (1983)

I'm a software guy, so my tools are software tools. Over the years I've picked up tools that have stayed with me; some I found, and adapted to my uses, while some I created myself. This blog entry is about some of the found tools — as well as tools for building tools (meta-tools?) — which have become my favorites.

The Search for Portability

Keeping my favorite tools around has been no easy feat, and frankly, sometimes I'm astonished that I was able to do it at all. You see, throughout my computer career (which has spanned about 60% of the history of commercially available computers), there has been an incessant struggle by vendors to "lock in" customers to proprietary systems and tools. This tended to work for them in the short term, but in the long term they almost all failed, leaving orphaned software at every turn as the proprietary systems became extinct. The only force working counter to this was the occasional heroic struggle of the users, first by banding together to demand standards, and then later by writing and giving away free standards-based software. These brave efforts have kept the overall trend toward extinction of tools partially at bay.

At first I didn't understand how important this struggle was. I began programming in the era of proprietary operating systems: OS/360 on IBM System/360 mainframes, RSTS on DEC PDP-11 minicomputers, AOS on Data General minicomputers, plus the toy Apple DOS disk operating system on the Apple II personal computer, and the equally toy DOS from Microsoft which ran on PCs. During each of these technical epochs I wrote several productivity tools that were all lost to the sands of time.

This began to bother me. I was pretty sure my dad had tools in his physical workshop dating from the 1940s, such as screwdrivers, chisels and alligator clips, that all still worked fine. Why couldn't I keep the tools I built myself from one decade to the next?

When I first used UNIX in 1983 I was elated to find an OS that was somewhat standard, and ran on multiple hardware platforms. Soon I began to appreciate its robust set of small tools that can be recombined to quickly solve problems, and its tendency to hide hardware details from programmers, and eventually I also greatly appreciated its universal presence on nearly every hardware platform. But, honestly, the big deal at the beginning was that so many people I thought were smart were gung ho for UNIX.

A Voice in the Wilderness


K&P

I think I first heard about UNIX from some of my smart, geeky friends in the late 1970s, but I do believe the first hard information I got about it came from a book I read around 1980, at the recommendation of my friend Wayne. It was "Software Tools" (1976) by Brian W. Kernighan and P. J. Plauger.

The funny thing was it hardly mentioned UNIX at all, but its co-author Brian Kernighan was tightly wound into the social network at Bell Labs which produced UNIX.

I have been re-reading this book this month, and I am amazed to find how deeply its teachings have been integrated into my thinking about software. It mentions UNIX in only two places I could locate, as an example of a well-designed operating system, while explaining how to work around the limitations of poorly-designed systems. This understated praise made UNIX seem uber-cool.

When I'm reading a book I own I usually mark interesting quotations with a note inside the front cover, giving a page number and some identifying words. In a typical book I like I'll list a handful of quotes. In this one there are over thirty. It's hard to pick just a few, but here are some I really like.

Whenever possible we will build more complicated programs up from the simpler; whenever possible we will avoid building at all, by finding new uses for existing tools, singly or in combination. Our programs work together; their cumulative effect is much greater than you could get from a similar collection of programs that you couldn't easily connect.

* * * * * *

What sorts of tools [should we build]? Computing is a broad field, and we can't begin to cover every kind of application. Instead we have concentrated on an activity which is central to programming — tools that help us develop other programs. They are programs that we use regularly, most of them every day; we used essentially all of them while we were writing this book.

* * * * * *

How should we test [a program to count the lines in a file] to make sure it really works? When bugs occur, they usually arise at the "boundaries" or extremes of program operation. These are the "exceptions that prove the rule." Here the boundaries are fairly simple: a file with no lines and a file with one line. If the code handles these cases correctly, and the general case of a file with more than one line, it is probable that it will handle all inputs correctly.

In addition to these general principles, I learned to appreciate any operating system that can string programs together, such as with the "pipes" mechanism of UNIX.
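A minimal illustration of that idea, which still works on any modern UNIX-like system: two small, unrelated programs strung together with a pipe.

    # count how many processes are running by piping the
    # process list from ps into the line-counting tool wc
    ps -e | wc -l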

So by the time I actually got to use UNIX, I already knew a little about what made it so awesome.

Joining the UNIX Tribe


UNIX license plate ( www.unix.org/license-plate.html )

Before the World Wide Web (WWW), it was hard to learn about computers. It took a rare genius to be able to simply read the manuals and figure out computer hardware and software. Most of the time a tribe would form around an installation, with oral traditions and hands-on demonstrations being passed on to those who were less adept at learning from the manuals. In many cases the field technical staff for the computer vendor would come in to do demos and training, and pass along the oral tradition that way, so there didn't have to be a "seed" person who could figure it out by reading alone. I remember asking the hypothetical question of whether it would be possible for someone in an igloo with a generator to figure out a computer all alone.

I was fortunate when I was learning to use UNIX to have just started a new job where I had access to three very knowledgeable and clever people named Dan, Phil and Jeff.

Dan was a full-time employee of the company, and the person who helped me get the job. He had been adamant that the company get UNIX, accepting no other option. (He had a "Live Free Or Die — UNIX" license plate similar to the one shown above, a variation of the New Hampshire state motto, on display in his office.) They ended up buying a VAX minicomputer from DEC, but getting the operating system from a third-party vendor, Uniq, which sold them a version of Berkeley Software Distribution (BSD) Unix. It was on this system that I began to learn.

Dan gave me the most hands-on help, setting me up with a login, teaching me about the tricky but vital $PATH and $TERM variables and the squirrely nature of backspaces, teaching me how to customize my prompt and giving me templates for shell aliases and C programming language source code (more on those in a minute).

He also taught me something I've never seen written down anywhere: for about 50% of the common UNIX commands (I counted), the two-letter abbreviation is formed by the first letter followed by the next consonant. So:

  • archive becomes ar
  • copy becomes cp
  • C compiler becomes cc
  • link becomes ln
  • list becomes ls
  • move becomes mv
  • names becomes nm
  • remove becomes rm
  • translate becomes tr
  • visual editor becomes vi
&etc. Dan also explained to me that UNIX began in the minicomputer era when most people used teletypes, which printed on paper, to communicate with timesharing systems, and each keystroke took time and energy, so the commands were made as short as possible. (We used to have a joke that you were born with only so many key presses to use throughout your life, so you had to conserve them.)

A similar impulse encouraged the liberal use of aliases, which allowed a long command to be represented by a short abbreviation, such as 'd' in place of 'ls -CF' (list files in columns, flagging file types). Dan gave me my first set of aliases; more on that below.

Through Dan's social network of UNIX experts the company found Phil and Jeff, who worked as consultants. I didn't see them often — they both frequently worked remotely from nearby UCSD — but they were also very helpful.

Phil helped me understand the history and philosophy of UNIX. He told me how most hardware manufacturers would rush their software people to finish an operating system as soon as possible, so they could start shipping hardware for revenue. At Bell Labs, UNIX was developed slowly and lovingly over a period of about a dozen years. There was no time pressure because AT&T (also known as "the phone company") was still a government-granted monopoly for voice traffic, and to keep them from having an unfair advantage in other communications they were prohibited from selling computer hardware or software. UNIX was originally for internal use only at Bell Labs.

He also explained that every program was designed to do one thing well, and to be interconnected with others. One important principle was that a program shouldn't know — or care — whether a human or another program was providing its input. This is why programmers were encouraged to make output terse, or sometimes provide none at all. The standard UNIX report from a successful operation is no output. For example, the program diff lists the differences between two files. No differences results in no output. In the same vein, on most traditional mainframe computers, if you connected to the command console and pressed the Enter key without typing a command, you got an error something like "ERROR #200232 -- ZERO LENGTH COMMAND" and then another prompt. In UNIX you just get the prompt. The assumption is you know what you're doing. Maybe you just wanted to see if the system is "up," or how responsive it is (a relic of the timesharing era which is also useful when you run large simultaneous jobs on a PC).
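Here is a quick sketch of that "no news is good news" convention in action (the file name is made up):

    $ diff notes.txt notes.txt    # identical files: no output at all
    $ echo $?                     # the exit status (0) still reports success
    0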

Jeff was the most terse. He was very busy writing original software for the "kernel" of an embedded system, which is very gnarly work, and usually had a lot on his mind. I'd ask him, "How do you do X?" and he'd usually say "I am not a manual." Once I said, "But there's six feet of manuals on the shelf; I don't have time to read them all." "Use man," he replied. "What's that?" I asked. "man man" he said cryptically. But when I typed "man man" into a command prompt, I got something like this:


man(1)                                                                  man(1)

NAME
       man - format and display the on-line manual pages

SYNOPSIS
       man  [-acdfFhkKtwW]  [--path]  [-m system] [-p string] [-C config_file]
       [-M pathlist] [-P pager] [-B browser] [-H htmlpager] [-S  section_list]
       [section] name ...

DESCRIPTION
       man formats and displays the on-line manual pages.  If you specify sec-
       tion, man only looks in that section of the manual.  name  is  normally
       the  name of the manual page, which is typically the name of a command,
       function, or file.

......

Jeff taught me to be self-sufficient. He would help me if I had already tried to help myself and hit a roadblock. When I told him I couldn't find a "search" or "find" feature, he said "man grep" and that did the trick.
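That habit stuck. A quick sketch of the same self-help routine on any UNIX-like box (the keyword is just an example):

    $ man man        # the manual page about reading manual pages
    $ man grep       # how to search for text within files
    $ man -k copy    # "apropos": search the one-line command summaries for a keyword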

Nuts and Bolts

There are some specific tools that I learned 30 years ago which I still use frequently today as I earn my daily bread. They include:

  • the vi (visual) editor

    ( en.wikipedia.org/wiki/Vi )

    The creation myth is that nearly every minicomputer with a teletype interface had a "line editor," usually called ed; in the UNIX world it evolved into ex, the "extended editor." When Cathode Ray Tube (CRT) terminals became cheap enough for widespread use, it further evolved into vi, the "visual editor."

    Much to my astonishment I have been able to use vi, with nearly exactly the same features, on every computer I've owned for the last two decades.

    It's also been part of the evolution of other UNIX tools. As the Wikipedia article on ed explains: "Aspects of ed went on to influence ex, which in turn spawned vi. The non-interactive Unix command grep was inspired by a common special use of qed and later ed, where the command g/re/p means globally search for the regular expression re and print the lines containing it. The Unix stream editor, sed, implemented many of the scripting features of qed that were not supported by ed on Unix. In turn sed influenced the design of the programming language AWK — which inspired aspects of Perl."

    For more details see "A History of UNIX before Berkeley: UNIX Evolution: 1975-1984."

    ( www.darwinsys.com/history/hist.html )

  • the C programming language

    ( en.wikipedia.org/wiki/C_language )

    The other main thing these three guys got me started with was the C programming language. C is infamous for being a language that many consider too primitive, or too close to "the metal," and of course it predates the "object oriented" revolution, but it is just about perfect for implementing UNIX. For this reason and others it probably will last a very long time.

    Dan gave me a C code template which I use to this day. It could use updating, but what the heck, the computer doesn't care, and it still works.


    IQ Tester ( www.lehmans.com/p-4542-original-iq-tester.aspx )

    This program I wrote to solve an "IQ Tester" puzzle in 2007 follows a template that evolved out of the template Dan gave me in 1983. (In other words it has improvements I've made along the way). And it runs fine on Windows, Mac and UNIX/Linux!

    The definitive book on C is by the ubiquitous Brian W. Kernighan, co-authored with Dennis Ritchie, who actually invented C: "The C Programming Language" (1978).

  • the C shell (csh) interface and scripting language


    nautilus icon for csh in old SunOS UNIX
    ( toastytech.com/guis/sv4112.html )

    ( en.wikipedia.org/wiki/C_shell )

    Phil explained to me that the "kernel" of UNIX had an application program interface (API) so application programs could "call" into the operating system. Then there were "layers" that could "wrap around" the kernel, each adding a new "shell" which was a higher-level interface. The symbol was a nautilus shell with its logarithmic chambers. The term came to mean a text-based scripting interface to the operating system. The first was Stephen Bourne's original shell, which we now call the "Bourne Shell" but which he called simply sh, in fine UNIX naming tradition.

    It was the famous Berkeley port (and re-invention) that introduced the "C Shell," csh, a fine pun in the UNIX humor tradition. This was the first shell I learned, and I stick with it if I can. These days I'm often forced to use bash, the "Bourne-Again Shell," which is based on the Bourne Shell and runs easily on Windows and Mac.

    I'm writing this using vi, and I can "bang-bang out" to a command for my local bash (precede it by two exclamation points). Here, I'll make a deliberate error:

        !!foo
    
        /bin/bash: foo: command not found
    

    That was my local bash responding, right into my text document. It seems like I can't live without these things.

    Perhaps the most mind-blowing thing Dan did for me was to teach me the alias mechanism in the C Shell. By way of example, I can type the echo command with arguments:

        echo hello there
    
    and I will get back the "hello there" from my local C Shell. Or I can type:
        alias ht echo hello there
    
    and it will appear that nothing has happened, but henceforth when I type "ht" to my shell I will get "hello there" back. I have created the alias ht and it's like my own personal custom UNIX command.

    Next, Dan typed this command for me to ponder:

        alias a alias
    
    What do you think it does?

  • shell tools (UNIX utilities)

    In preparing to write this blog I talked to family and friends about the topic. "It's going to be about UNIX shell tools!" I would gleefully state, and I kept getting that same glassy-eyed stare that I'm used to getting if I bring up any of the following topics:

    • "The Eastern Question" in nineteenth-century European politics, dealing with instabilities caused by the collapse of the Ottoman Empire,

    • the Laplace Transform in advanced calculus, economics and engineering, and

    • one's own personal medical problems

    And yet I persist. This stuff has been massively useful to me, and I'm just trying to follow the "hacker ethic" (white hat version) and share my best tips and tricks. I take it on faith that there is someone out there who wants and needs this information.

    I make frequent use of a whole set of UNIX shell tools (i.e., software that can be invoked from a UNIX shell or equivalent); a small example of how they combine appears below.

Thankfully, much of the oral tradition of the UNIX shell tools was captured in the still-useful book "The UNIX Programming Environment" (1983) by Brian W. Kernighan & Rob Pike.

If you do find yourself in that igloo with a generator it's nice to know it's there.
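To give a flavor of how these tools combine, here is a classic pipeline in the Kernighan & Pike spirit — a sketch only, and the file name is hypothetical — that lists the ten most frequent words in a text file:

    # break the text into one word per line, fold to lower case,
    # then count duplicates and rank them by frequency
    tr -cs 'A-Za-z' '\n' < chapter.txt |
        tr 'A-Z' 'a-z' |
        sort |
        uniq -c |
        sort -rn |
        head -10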

Using What I'd Learned

In 1985 I made a mistake I would never repeat in this century: I quit one job before another was firmly in hand. When the new opportunity slipped away I not only faced the inconvenience of having to find another job while earning no money, but I also found myself going through UNIX withdrawal. There had been a time when I didn't know what UNIX was; now I couldn't live without it. I recalled that my email and conferencing account at "The Well"

included a C shell login, and I dialed up just to edit my aliases with vi for old time's sake.

Then when I got another job it was on a project maintaining some crufty FORTRAN code on a clunky little Hewlett Packard minicomputer with a poorly-designed proprietary operating system whose name I don't even remember. A book came to my rescue: good old "Software Tools."

I didn't have the time to implement the "Ratfor" (Rational FORTRAN) pre-processor provided in the book, but I did manage a few pieces, including comment processing (FORTRAN comments have rigid requirements for placement in column numbers), so I could be a little sloppier in my editing and still produce working code quickly. The operating system didn't have UNIX pipes, but I hacked together a work-around using suggestions from the book. And I wrote a "find" program, which helped me make an automated index of all the function and subroutine calls in the code base, which had never been done before. This set of strategies made my life much easier and only confirmed the productivity benefits of UNIX in my mind.
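Done today with stock UNIX tools, that hand-rolled call index would be roughly a one-liner; something like this sketch (the file pattern and output name are made up):

    # list every FORTRAN CALL statement with its file name and line number,
    # then sort the result into a cross-reference of the code base
    grep -n -i 'call  *[a-z][a-z0-9_]*' *.f | sort > call_index.txt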

The Gift of LINUX


Linus Torvalds stamp
( uncyclopedia.wikia.com/wiki/Linus_Torvalds )

For a while I used tools on a variety of "workstations" from DEC, IBM, Sun, HP, SGI, and others, and my tools moved with me from one to the next. The problem was most of these systems cost in excess of $40,000, and I couldn't afford one myself. It was only at work that I had access to UNIX. Then with the arrival of Windows 95 it looked like UNIX was destined for the scrap heap, overtaken by another proprietary OS on a dirt cheap hardware platform. I worried that my favorite OS and its whole ecology were going to become extinct before my eyes.

And along came Linux, the open-source freeware UNIX work-alike that has since taken over the world. I couldn't be happier about the way this has worked out. My old tools have new life. I'd like to emphasize that I had practically nothing to do with it, except being a cheerleader and a user. I am extremely grateful for all the talented people in the open source world who have made it possible for me to keep using my favorite tools on modern computers.

Sometimes I think the greatest impediment to more widespread Linux usage is the difficulty of pronouncing it correctly. Linux creator Linus Torvalds has a name that sounds like a popular American comic strip character, Linus from "Peanuts," but the comic character has a long I (as in "like") while the Finnish programmer has a short I (as in "lick"), and so does Linux. Perhaps a re-branding is needed.

Portability Survival Skills


19th century interface adapter
( ridevintage.com/railway-bicycles/ )

It's only in the last twenty years that I've had Linux on a home computer. Meanwhile, my wife uses Windows at home (Windows 7 at this point), but I prefer Mac for my primary system. During these same two decades I have always had some form of Windows I was required to use in my work. In that world what has come to my rescue is the tool set called Cygwin from Cygnus.

One of the three founders of the company, Cygnus Solutions, is a friend of mine; imagine my surprise to spot him posed with the other two in a hot tub on the cover of "Forbes" magazine in 1994.

The Cygwin tools give you most of the tools I describe above on Windows platforms, and they're free. I always load them right away whenever I am issued a PC.

Since 1996 I have been investing my time in the technologies of HTML and Java for their portability. I create content (such as this blog) in raw HTML using vi, confident that many other programs can read it.

The appeal of Java was the "write once, run anywhere" philosophy, which mobilized Microsoft to embrace and sabotage Java, creating a "write once, test everywhere" requirement in real deployment. That battle seems to be over now, and instead we're watching Oracle, who bought Java's creator Sun Microsystems, fumble the franchise in creative new ways. Still, I have Java code that runs on Windows, Mac and Linux that I continue to maintain and use.

Vindication


finally, a book that agrees with me

Lest you think that I'm this fossil who can only write C code to process text, let me assure you that I've kept up with the changing world of software development, and I've used modern languages such as Objective-C, JavaScript and PHP, Microsoft tools like Visual Basic and Access, Integrated Development Environments (IDEs) such as JBoss and Visual Studio, Graphical User Interface (GUI) frameworks such as the X Window System, Microsoft Foundation Classes, and Java Swing, and cutting-edge development methodologies such as Software Patterns and Agile Development.

But at a point a few years back when I did some soul-searching on my "core competencies," I ran across a rule of thumb from the book "Blink: The Power of Thinking Without Thinking" (2005) by Malcolm Gladwell.

He said that to become a master at something you have to put in 10,000 hours over the course of 10 years (if I remember right). I realized the one thing I've done that long is code in C.

Amazingly, about the time I began to embrace my inner C programmer, I discovered that, at least for a few months, C was the world's most popular programming language, experiencing a new renaissance.

Eventually it seemed like history caught up with me. All the people who bet on Visual Basic had to start over with C#, as did all the people who bet on Visual C++, but the UNIX/C code base just keeps on working. I was gratified that books began to emerge by people who shared my views:

  • "In the Beginning Was the Command Line" (1998) by Neal Stephenson

    This delightful manifesto by cyberpunk sci-fi author Neal Stephenson of "Snow Crash," "Diamond Age" and "Cryptonomicon" fame — one of the few SF authors I know of who can actually program — explains why real programmers still mostly use their keyboards, and there is no "royal road" to clicking your way into software development.

    His analogy of Macs to sports cars, Windows to clunky station wagons, and Linux to urban tanks is priceless.

  • "The Pragmatic Programmer: From Journeyman to Master" (1999) by Andrew Hunt and David Thomas

    It is a rarity for me to read a book and be exclaiming "Yes!" with nearly every page, but it happened with this one. It was even more exciting when I got into the stuff I didn't know about, because by then I trusted the authors completely.

    One of the most important principles espoused here is "Don't Repeat Yourself" (DRY), which often requires code that writes other code. This has long been one of my favorite tricks, and it is almost magical in its powers. If used correctly it can prevent whole classes of errors as well as the tedium of hand-coding highly redundant source code. (A small sketch of the idea appears after this list.)

  • "Data Crunching: Solve Everyday Problems Using Java, Python, and more" (2005) by Greg Wilson

    This follow-on book by the same publisher deals with the very issues I grapple with weekly: approaching some unknown input file with a need to rationalize, "cleanse" and reformat its contents.
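As promised above, here is a tiny sketch of "code that writes other code" (Bourne-shell syntax; the field names and output file are invented): a loop that generates a block of repetitive C declarations instead of typing them by hand.

    # emit one accessor prototype per field and collect them in a header file
    for field in id name email created_at
    do
        echo "char *get_${field}(const struct record *r);"
    done > record_getters.h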

Moving Forward


from article: "Linux Now Has 'Double' the Market Share of Windows"
( www.tomshardware.com/news/linux-windows-microsoft-android-ios,20220.html )

The new year brought a new client for my consulting, and I once again found myself having to use nearly all the tools in my chest to get jobs done quickly. As Kernighan & co. pointed out more than three decades ago, a lot of what passes for programming involves processing text files in predictable ways. I keep encountering the same "patterns" in the problems I solve, and there's almost always a "sort" involved.

A lot of kibitzers tell me there are other ways to solve my problems, but they can't seem to get things done as quickly as I can with the UNIX tools.

I went "grepping" through some of the projects I've worked on in the last six months, and I found, in addition to programs I wrote in C, Java, Python and the 'R' statistics language, I used these shell tools frequently:

  • awk
  • cat
  • cp
  • echo
  • grep
  • head
  • join
  • mv
  • python
  • rm
  • sed
  • sort
  • tail
  • uniq
  • vi
  • wc

And I endeavor to continue to learn. For about four years I have been playing with Python, and I find it quite promising. It doesn't hurt that its name comes from Monty Python's Flying Circus, the British comedy troupe, and not a snake. And just this year I finally began to dig into an old UNIX favorite: the text processing program called awk.

It's named after the three people who created it: Alfred Aho, Peter Weinberger, and... wait for it... the ubiquitous Brian W. Kernighan.
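For the curious, here is the flavor of awk in a single line — a sketch that totals the second column of a whitespace-separated file (the file name is invented):

    # add up column two on every line, then print the total at the end
    awk '{ total += $2 } END { print total }' expenses.txt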

Further Reading

One of the best ways I have found to absorb a new computer language quickly is to study very short programs or scripts. So called "one-liners" pack a lot into a small space. The key is to find a tutorial web site that has a maniacal obsession with explaining every little nuance of each one-liner.
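A couple of examples of the kind of one-liners I mean, with their nuances spelled out (file names are placeholders):

    # print only lines 10 through 20: -n suppresses sed's default output,
    # and the "p" command prints just the addressed range
    sed -n '10,20p' report.txt

    # count how many lines mention "error", ignoring upper/lower case
    grep -ci 'error' build.log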

Another great resource is the "Stack Overflow" web site.

Using a clever combination of curating, moderating and crowdsourcing they maintain a highly reliable and timely question and answer site for programming. Many of my questions have been answered there. (Note to self: maybe it's time I gave something back.)

And of course there are good old books. Here are my recommendations for a well-rounded programmer:

  • "Computer Lib: You Can and Must Understand Computers Now" (1974) by Theodor H. Nelson

    I was fortunate to have this highly educational comic book and manifesto to learn the inner guts of computing, just before starting my first job in the industry.

  • "The Mythical Man-Month: Essays on Software Engineering" (1975) by Frederick P. Brooks Jr.

    This classic of software management is still as relevant as the day it was published, even though it's based on the work done for a large IBM mainframe in the 1960s. Ask anybody.

  • "Hackers: Heroes of the Computer Revolution" (1984) by Steven Levy

    I firmly believe that in order to program well one must learn to think like a programmer, which means learning about so-called "hacker culture," including the "hacker ethic" and "hacker humor." Technical journalism superstar Steven Levy packages the essential history in this book. In addition it is useful to refer to the following lexicons:

    • "The New Hacker's Dictionary" (1996) by Eric S. Raymond (editor)

      A book form of a long-lived and highly-evolved computer file, with heavy contributions from Stanford and MIT.

    • "The Devil's DP Dictionary" (1981) by Stan Kelly-Bootle

      One man's sarcastic reworking of Ambrose Bierce's venerable "Devil's Dictionary" (1911).

      Dated by its mainframe point of view, but still hilarious and educational. It has the classic definition, "recursive — see recursive."

      From these sources you will learn that the name UNIX is a joke, derived from the Multics operating system developed by MIT, General Electric and Bell Labs.

  • "The Information: A History, A Theory, A Flood" (2011) by James Gleick

    Sometimes it's difficult to see the forest for the trees. As the Information Revolution has washed over us through the last 70 years — a human lifetime — it has changed nearly every aspect of our technical civilization. Gleick, an excellent technology journalist, puts the pieces together here with a nice long view. He also provides a good overview of the pioneering work of mathematician Alan Turing, and its relevance to computing today.


Disclaimer

I receive a commission on everything you purchase from Amazon.com after following one of my links, which helps to support my research. It does not affect the price you pay.


This free service is brought to you by the book:

A Survival Guide for the Traveling Techie
travelingtechie.com

Thursday, July 31, 2014

A Traveling Techie's Tale: STEVE JOBS, THE ROCKSTAR TECH CEO

young Apple founder Steve Jobs
Tony Hadley, lead singer of Spandau Ballet

    "If you're the smartest person in the room, you're in the wrong room."

      — attributed variously to Brian Tracy and Steve Jobs

It should come as no surprise that I like to tell stories about my traveling techie experiences. If you've ever met me you know I'll talk your ear off given the chance. But I also like it better when people are interested in what I have to say, so I bother to keep track of which stories people seem to prefer, and when they request to hear more. I have found that my stories of meetings with famous, infamous and renowned individuals — "tech celebrities" if you will — are the biggest crowd-pleasers. And of all the folks I've met, I get the most requests for stories of my meetings with Steve Jobs.

Before Steve Jobs I don't think we had the concept of a "rockstar" tech CEO. I remember when I first realized Jobs had attained this status. I was an Apple employee working a trade show booth in 1997, right after Jobs came back as "interim CEO" (we called him the iCEO). The division I was in had recently been acquired by Apple, but we sold software that so far only ran on Windows. I'll spare you the technical details.

The bottom line was we were at a Windows-only trade show with a generic Apple booth, just as if we were pitching Macs at Internet World or SIGGRAPH. We were stuck with Apple's marketing people to provide our booth art and messaging. I thought we had a compelling story of why we were relevant at this trade show, but the booth didn't reflect it and few passing people were interested in hearing it.

Then in a stroke of genius someone popped in a videotape of Steve Jobs giving the keynote at MacWorld that summer — the one where he put Bill Gates up on the giant screen and announced Microsoft was investing $150 million in Apple, saving the company. Even before that big reveal, the video of Jobs drew a huge crowd around our booth. Suddenly people were interested in us, and our message. I realized right then, "This guy has attained rockstar status!"

(If you're interested, scrub to ~30:50 in this YouTube video to see Jobs introduce Gates.)

After that I began to notice that Jobs had much of the same boyish charm as the lead singer for the band Spandau Ballet, Tony Hadley, seen here in the song "True" (1983). Now that he is gone, history will also remember Steve Jobs as the only entrepreneur to revolutionize all of these industries:
  • personal computers (Apple II & Mac)
  • movies (PIXAR)
  • music (iPod/iTunes)
  • mobile phones (iPhone)
  • tablets (iPad)
and that he became the largest stockholder of the largest media company in the world, the Walt Disney Company.

But perhaps we should begin at the beginning, at least my beginning in this tale. I first set eyes on Steve Jobs and the Apple II not in a photograph but in person, in 1977 at a funky little personal computer trade show in San Francisco called the West Coast Computer Faire. Apple had rented a large booth at the very front of the hall, attached an Apple II to a big and expensive-looking video projector, and Steve was standing there with a huge grin on his face giving demos.
Steve Jobs and the Apple II, 1977
He was holding a game controller and playing a game called "Little Brickout" in lo-res but high-speed color. A large crowd was watching in fascination, and he clearly knew he had a hit on his hands.
Little Brickout game on Apple II

Video of Little Brickout game demo played on an Apple II computer:

Nobody told me his name, but he was unmistakable, clearly charismatic, and when I saw his photographs later I knew it was him I had seen.

For the next twenty years I knew of Steve Jobs from his products and his press. For me the products began with the Apple II. Perhaps some context will help explain why counter-culture guru Timothy Leary said of it, "I refer, of course, to that Fruit from the Tree of Knowledge called the Apple Computer."

A few months after I first saw Jobs and his Apple, I was working at computer manufacturer Data General in Westboro, Massachusetts, and had unrestricted access to an Eclipse minicomputer time sharing system. In college I'd had to punch cards and submit batch jobs to an IBM 360 mainframe computer, so this level of interactivity was revolutionary to me. But we initially used teletypes, with yellow paper spools, to program the minicomputer. It was a big day when the new "Dasher" terminals came — with a Cathode Ray Tube (CRT) and rapidly-changing cool blue text on a black screen.

Data General Nova Minicomputer with state-of-the-art Dasher CRT terminal

This was supposed to be the state of the art in interactive computing: it had a blinking cursor!

But still, the memory remained for me of that color computer I'd seen. Soon afterwards I discovered my cousin-in-law had bought an Apple I kit and assembled it, and I got to play with one up close for a little bit. The new capability here was "interactive graphics," though low-res and only 2D, with a limited color palette. I knew I wanted one.

A few years later I was back in Southern California, and I was asked to configure a "dream system" personal computer for a newly-rich start-up CEO, with money no object. I picked out an Apple II for him with many extras, which I recall cost about $4800 plus sales tax. A few years later he got bored with it and wanted the new IBM PC instead, so he sold it to me used for $3000 — on time payments, since I was a poor consultant.

Suddenly I had, for the first time in my life, unrestricted access to compute power 24/7. The Apple II had a built-in BASIC language in the Read Only Memory (ROM), and came on in seconds ready to program, or to run saved programs off of a 5-and-a-quarter inch floppy disk. I certainly learned some bad programming habits during those days, using what some have called a "toy language," but I learned some good things as well. The machine was powerful enough to run non-trivial numerical simulations, and it was a boon to scientists everywhere for that reason alone. I was able to perform numerical experiments which I've blogged about elsewhere.

And there were two other things about that computing environment that were revolutionary:

  1. I was able to write self-modifying code, i.e., programs that change themselves. (This was done by writing lines of BASIC out to a file and then loading them into the current program.) The earliest theories of computing allowed for self-modifying code (e.g., the Universal Turing Machine), but most vendors do not encourage the practice — except for low-level stuff like patching drivers — since it's so hard to debug and control. That experience was quite educational.

  2. I was able to write assembly-language code in the machine language of the 6502 chip under the hood. The "PEEK" and "POKE" commands allowed you to set bytes anywhere in the computer's memory, and then you could jump to the beginning of the block and the code would just run. Very dangerous, and very powerful, and also very educational.

I remember idyllic late-night sessions, alone and with tech buddies like Bob and Eric, hacking away on that machine. My favorite music was the album "Computer World" by Kraftwerk (1981), which I had on cassette. Here is the title track on YouTube.

And the Apple II environment has never fully left me. I finally gave away the machine just 3 or 4 years ago, but on my iPhone I can bring up a web application: the Apple II Applesoft Basic interpreter, emulated in Javascript!

I recently used it to help teach a friend Boolean Algebra. Cut and paste this program into the code window and press "run," and you'll see a truth table for the Boolean "AND" operator.

10 PRINT "A","B","A AND B"
20 PRINT "-","-","-------"
30 A = 0
40 B = 0
50 GOSUB 200
60 A = 0
70 B = 1
80 GOSUB 200
90 A = 1
100 B = 0
110 GOSUB 200
120 A = 1
130 B = 1
140 GOSUB 200
150 END
200 PRINT A,B,A AND B
210 RETURN

The next product from Jobs' team that changed my life was the Macintosh. I first saw it at the 1984 West Coast Computer Faire — Apple had a huge booth with a giant Mac mock-up (maybe 15 feet tall) that really worked, thanks to a rear-projection screen — but Steve was nowhere in sight. He was probably having press meetings, cashing in on his fame. The Mac was designed to be "insanely great," and it was obvious to me from the get-go that it was.

I first got to use a Mac (besides play with my friend Will's machine) when I worked at Rockwell Space Station Systems Division in 1986, and I've had one at work or at home (or both) ever since. I'm typing this on a Mac. Thank you, Steve, and all the folks that made this possible.

Meanwhile, Steve Jobs cut a very high profile in the tech press. I habitually read magazines like "Computer World," "Datamation" and "Info Week" for IT professionals, and "Byte," "Dr. Dobbs" and "Creative Computing" for hobbyists. They were all over Jobs because he was photogenic, quotable, often controversial, and yet his products were well-loved. I followed along as he brought John Sculley, the "Pepsi guy," in to Apple to provide "adult supervision" (a term I despise, which means either "professional management" or "someone to steal your company" or both). Of course shortly thereafter Jobs was "forced out" of Apple. He actually wasn't fired, just disempowered, and he quit in response, taking a hand-picked team to start his next venture, NeXT Computer. From 1985 to 1989 the "elves" labored away, with money from Ross Perot of all people, until they emerged with the legendary "black cube" that nobody could buy at first, and then later nobody would buy, because it wasn't Microsoft.

Through all this Jobs also found the time and money in 1986 to buy PIXAR from George Lucas, who needed the cash for his divorce settlement.

Jobs (and his nemesis-buddy Bill Gates) became even more legendary with the publication of "Accidental Empires" (1992) by Robert X. Cringely

followed by "Insanely Great" (1994) by Steven Levy.

I remember going on pilgrimages from Southern California up to Silicon Valley, to visit friends and get closer to the centers of technical power. I loved hanging out at bookstores such as "Computer Literacy" in Santa Clara and "A Clean Well-Lighted Place for Books" in Cupertino, reading about legends like Jobs.

I also remember a visit to the Apple campus at One Infinite Loop, and taking pictures of the giant pixels among the flowers out front with my wife and daughter.

wife and daughter visiting One Infinite Loop
Young literary talent Douglas Coupland in "Microserfs" (novel, 1995)

captures the naive longing of Microsoft employees to work at the cooler Apple; they take a road trip from Seattle to Cupertino, only to find that under Sculley, Apple sucks too.

And just when it began to look like the arrival of Windows '95 meant everything would suck from now on, the movie "Toy Story" (1995) came out.

Suddenly we knew where the coolness still was, in Jobs' reincarnation of PIXAR from a money-losing research group to a profitable computer-generated movie production house.
"The Triumph of the Nerds" (TV documentary, 1996) with Robert X. Cringley spread the legends of Silion Valley's founders to those who prefer TV to books for their biographical information. Based on "Accidental Empires" I find it to be the most accurate and least melodramatic version of this story produced.

By 1996 I knew people at NeXT, and quite suddenly was invited to join the company. I was all in a dither. First they flew me to Redwood City for interviews. The headquarters was award-winning and expensive.

Someone pointed out a matte-black ceiling tile to me, one of scores, and said they cost $5000 each. I don't know if it was true, but the place was impressive. The stairs were super low and wide; the corridors were extremely wide and filled with furniture, like a flowing expanse of conference room.

The woman who greeted me from HR gave me advice on dealing with Steve Jobs. She said he didn't like small talk, and that I probably shouldn't talk to him at all, but if I did, I should just get right to the point, with no preliminaries.

I didn't meet him right away, but when I went back to HQ for training they gave me a temporary office right across from his. I literally looked up from my laptop, through two windows into Steve's office. But he was never there. He did have a whole row of Toy Story puppets from a Burger King give-away lining the window of his office, which was pretty cool.

Toy Story puppets at Burger King

But the force-of-will that is Steve Jobs was everywhere. They pampered us, gave us expensive laptops and rooms at the Sofitel Hotel, iced lattes and Häagen-Dazs bars, free popcorn and Foosball games and $5000 ceiling tiles, but they expected us to be outliers of excellence, insanely great at what we did. The pressure and exhilaration were intense.

One of the things we all learned from chairman Steve was "Do the right thing." When making a technical decision, the question was not what's the easiest or fastest or simplest thing to do — or the most complicated, for that matter — but what's the right thing to do. Once in a programming class, we were instructed to create a new object in Objective-C, with a new method called:

    do_the_right_thing()
as a class assignment. This message permeated the corporate culture.

I remember being very impressed with the customer service department. They had every version of the NeXT software running on every applicable version of hardware, all in a row. The company had begun by selling a custom software stack (NeXTStep) on a custom operating system (Mach, a UNIX variant), running on custom hardware (the black box). Later market pressures drove them to port to the Intel architecture, which they called "white box" hardware. The Mach operating system and the NeXTStep stack ran on the vanilla PC hardware. Then even more market pressure drove them to abandon their own OS entirely and port the NeXTStep software stack to Windows NT, renaming it OpenStep. Every version from 0.9 on was present in the row of black and white boxes that ran the length of the building. That was the right thing to do: to accumulate, maintain and protect those machines.

Another right thing that was done was to architect the NeXTStep tools to use the textbook software pattern of Model-View-Controller.

When the internet became huge in the late '90s the NeXT engineers were able to adapt the tools, swapping out the View module for a desktop app with one for a web app, and voila! WebObjects was born. Legend was it took them a weekend to build the prototype. WebObjects is the grand-daddy of web apps, from which Java Server Pages (JSP) and all template-based active web pages are derived. And it was the biggest hit NeXT ever had.

One time a software developer told me that if you had a project ready to ship, you needed to build into the schedule giving a demo for Steve and then making all the changes he wanted. If you just tried to ship it, he'd show up the night before, demand a demo, give you a list of notes and force you to miss your schedule.

Still I almost never saw him, maybe for a few minutes talking on his phone before he ran out again. And then I found out why: he sold NeXT to Apple six weeks after I was hired. He'd been over at Apple HQ hammering out the deal. I was lucky enough to be at NeXT HQ when he held an all-hands meeting to answer questions. Afterwards I felt like I had to talk to him, that I might not have another chance. His new role at Apple was as a consultant to the CEO with nobody working for him — a move we used to call a lateral arabesque. I conjured up a pretext to approach him. "Steve," I said, "I just found out there are people at Apple who've developed an Apple II emulator that runs on the Mac, but they're having trouble getting it released. All those Apple II's were given to schools in the '80s, and so much educational software was developed for them — is there anything you can do to help keep it alive?"

"I'll see what I can do," he said. "I'm just a consultant now; I don't if anyone will listen to me."

The second time I talked to him was just days later. I was working late into the night in the Redwood City NeXT headquarters, in that same office that looked out over the Ham and Rex and Buzz and Woody puppets in Steve's office, and meanwhile Steve and his vice presidents were posing for head shots for a press release in the ultra-wide corridor. I paid little mind and concentrated on my work. Finally I decided to call it a night, and packed up and walked down the ultra-wide stairs to the door. There, outside of the building, was someone waving to me through the glass. It seemed to be an employee without his entry badge wanting in; it was hard to see through the reflected light from inside. I cracked the door for a better look and saw that it was Steve. "How do I know you work here?" I asked him. "I don't," he said, and pushed past me through the door, obviously not to be trifled with.

Now, one of the things I would read about Steve Jobs in the trade press over the years was that he was an a-hole. He would scream at people and insult them in public. Oddly, though, most of these complaints were from people who chose to continue to work for him. It has occurred to me that maybe sometimes people need to be screamed at. Jobs certainly had an uncanny knack for getting people to do better quality work than they thought they could (me included), and if he had to be an a-hole some of the time I suppose there are worse things. I know that he was never abusive to me, and neither was anyone in my chain of command to him.

I did find that for the rest of his life I was fiercely loyal to Steve Jobs. I told people that if he called me up and asked me to move to Alaska for a job, I would. I trusted him. That's because I felt like he was loyal to me. When he sold NeXT to Apple he protected the NeXT employees. It almost kiboshed the deal — it delayed the press release for many hours, which I got to totally miss due to travel. (I think I was the last person in the company to get the news.) But Jobs was in there playing hardball to add a clause that said if Apple laid off a former NeXT employee their stock options would vest instantly, making them too expensive to jettison in numbers.

For the next half a year or more I was an Apple employee and Gil Amelio, turn-around artist from National Semiconductor, was our CEO. It was pretty obvious to me we were doomed. We were losing money so fast we'd be gone in a few years. Finally Steve sold his own Apple shares, figuring they'd be worthless soon, and that scared the board so badly that they fired Gil and made Steve the interim CEO. (We called him the iCEO.)

Things got noticeably better almost immediately. Apple marketing was a mess of in-fighting fiefdoms, and Steve started popping into marketing meetings unannounced. I heard that once somebody tried to stop him, and got screamed at for it. He would interrupt the presentations to ask, "What's our message? Can anybody tell me what our message is?" As far as I could tell at the time, the message was something like "the new Performa is out and has 10% more disk space for the same price!"

Soon afterwards the new Apple under Steve released the "Think Different" commercials, on TV, billboards and in magazines. Now we had a message.

Jobs had a reputation for stubbornness, arrogance and ego, but I got to see him be humble and contrite. I was attending a talk he gave at the Apple Worldwide Developers Conference, where he was defending his decision to drop support for a technology called OpenDoc. A man from a small start-up based on this technology stood up to challenge him. Jobs became humble. I couldn't believe what I was seeing. Maybe the pilgrimage to the Himalayas and the Zen practice from his 20s were finally kicking in. He explained that he was sorry his decision destroyed this man's business, but that he was trying to save Apple, and some tough choices had to be made.

I believe that was the last time I laid eyes on him. His life had a few more amazing chapters, including the birth of the iPod, the iPhone and the iPad, the growth of PIXAR and its eventual sale to the Walt Disney Company, and his influence on Disney which persists to this day — ABC News (Disney-owned) just showed up as a channel on my Apple TV.

We lost Steve in 2011. Like Walt Disney he was an American original; we will not see the likes of him again. When I feel like honoring him, I endeavor to be the best I can be, and do the right thing.

This unreleased version of the famous "Think Different" ad, with Steve narrating, is as fitting an epitaph as any I can think of:


Disclaimer

I receive a commission on everything you purchase from Amazon.com after following one of my links, which helps to support my research. It does not affect the price you pay.


This free service is brought to you by the book:

A Survival Guide for the Traveling Techie
travelingtechie.com

Friday, April 25, 2014

BEWARE THE FLAVOR OF THE MONTH

    "It's like a Swiss Army Knife for propellor-heads."

    — review of the NeXT Cube (1990)

This blog entry is written from the point of view of a software guy, so I apologize in advance for the specialized examples, but I believe the lessons are universal for any technology.

One of the fun things about being a techie in the Internet Era is watching the accelerating pace of fads as they sweep across the meme-space. In my field of software development I've seen the following come (and often go):

  • ALGOL
  • Backus-Naur Form (BNF)
  • PL/1
  • structured programming
  • Nassi-Shneiderman Diagrams
  • waterfall development
  • Ratfor
  • UCSD PASCAL
  • LISP
  • SQL
  • UNIX
  • object-oriented programming
  • Smalltalk
  • C++
  • TCP/IP
  • workstations
  • client-server
  • Solaris
  • multi-tiered
  • CORBA
  • PCs
  • patterns
  • Visual Basic
  • Java
  • JavaScript
  • Java Beans
  • Objective C
  • C#
  • Linux
  • Tcl
  • Perl
  • Python
  • AJAX
  • agile development
  • open source
  • XML
  • PHP
  • Ruby on Rails
  • SaaS
  • Go
  • Node.js
and I know I could think of more examples if I took more time. Some of these have made a permanent difference in the way software is created, others not so much. But hindsight is 20/20 — foresight, of course, is harder to manage. Who knew, for example, how big JavaScript would become when it was introduced as a proprietary client-side web scripting language by Netscape in 1995?

The Devil You Know and the Devil You Don't Know

The big question we face when we evaluate new technology is "Too Soon" vs. "Too Late." If you are too early an adopter you can suffer from bugs (especially the "infant mortality" type), an undersized installed base and user community, and skepticism from many quarters, including managers, clients and investors. There is also the danger that a proprietary solution from an unprofitable startup may simply go away, making you a technical orphan. Of course popular open-source projects can also lose favor and fade away. But on the other side, a mature technology can fade away as well, if it gets too much "bit rot" to be fixed or is not adapted to its changing environment. Old software is notorious for the effect where fixing one bug introduces two more.

But timing, of course, is everything. The earliest of early adopters almost always suffer for their innovations on the "bleeding edge." As they like to say, "A pioneer ends up with an arrow in the back." But there is a "sweet spot" — for example, Apple's entry into the music player business with the iPod — that maximizes "lock in" and other early benefits. And on the other side, there are technologies like FORTRAN, COBOL, DOS, and XP that seemingly just won't die and are remarkably stable, if not feature-rich.

I have seen extreme examples of the failure modes of both "horns" of this technical dilemma.

I worked for a midwestern startup that had probably adopted a technology too early. They had a mapping suite of software written in Smalltalk, an early object-oriented language, that depended on an "object-oriented database" (OODB) to store its data, and that database had some operational bugs as well as design flaws that made the software pretty much impossible to deploy. One of the operational bugs was that entities in the database could not be completely deleted; their names remained as "ghost objects" forever and could not be re-used. One of the design flaws was that only one program could use the database at a time, unlike every "real" database I've ever used. These and other problems caused the product, and the company, to fail.

On the other extreme I once worked on an accounting application (in this century I should add) based on an obsolete database from the early 1980s that was so old it didn't use the standard Structured Query Language (SQL) of all modern databases. The vendor was no longer supporting the database for any other customers, and gave the bare minimum of support to my employers. This company was limping along in a fragile and vulnerable state, performing accounting services for scores of clients in a mode that could fail dramatically at any moment, with no short-term disaster plan and no long-term road map for migrating off of this ticking time bomb. They are still in business but I'm glad I'm not one of their investors.

Criteria

So how do you decide when it's too early or too late?

My hierarchy of criteria goes like this:

  1. stability
  2. deployability
  3. portability
  4. maintainability
  5. longevity
  6. security
  7. features
For example, I have found that up to this point the C programming language wins or draws on every point except maintainability and features, and those can be addressed with good programming practices and the right libraries. This helps explain why I'm still fond of it, even though I began learning it over thirty years ago. Here are some more details on why C remains popular:

On the other extreme I began learning PHP a few years ago and was dismayed by its lack of maintainability and difficulty of deployment. I have written precisely one PHP app, which you can see on-line at:

Deployment was easy because the web hosting company already had PHP installed. If I suddenly had to port this app to run on someone's local computer, I would allocate at least an afternoon to install and test PHP and the Apache web server it needs. I would probably end up using this manual:

Since going through this process I have learned that PHP has some security issues as well. I certainly don't need much security for my little "page-a-day" calendar app, but most uses I'm aware of are for e-commerce, where it really matters. This is why I currently take a "wait and see" attitude with PHP, and only use it when I am given code by others in the language.

Another point to consider when selecting a technology is how to deal with worst-case scenarios. I've noticed that both automobiles and software libraries tend to break down more often after 5 PM on Friday before a long holiday weekend. I have found I would rather tell my boss or client "We're going to work all weekend to find the bug in our code and meet the Tuesday deadline," rather than "The problem is in a library, we don't have time to write a replacement, so we'll have to wait until Tuesday to call tech support. There's no way we can make our deadline."

I know what you're thinking: what about open source software? Well, I have found it is both a blessing and a curse. It's a blessing if you want source code to everything and have time to find and fix other people's bugs. It's a curse when you pile on too many open source pieces under a tight deadline and then have trouble integrating and debugging them. I have learned to approach every such decision with risk management foremost in my mind.

Strategies

There are two main strategies I like to employ to mitigate risk in dealing with new technologies: pilot projects and modularity.

  • pilot projects

    In the classic treatise on software development "The Mythical Man-Month" (1975)

    Frederick P. Brooks Jr. says: "Throw one away — you will anyway." He is referring mainly to prototypes, which test technology, architecture, and even spec all at once, and result in something you can set in front of the customer or user and ask, "Is this close to what you need?" Once these issues are resolved by testing and evaluation, real development can begin, often by starting from scratch with revised plans.

    In contrast, a pilot project is a complete project, from design through development to deployment, mainly designed to test technology. A key factor is that it is a non-critical project. If you can afford to utterly fail on this one the team will be less tempted to lie to themselves or their bosses. (The idea that something "has to" work is so toxic to projects.) The solution if it does fail is to reject the technology and go back to an older, more reliable one and re-implement the project, but only if it is really needed at all.

  • modularity

    Simply put, if you break functionality into interlocking pieces and implement the pieces separately, it makes it easier to test, debug, maintain, and if necessary replace or totally overhaul each piece. I have found the advice in the book "The Pragmatic Programmer" (1999) by Andrew Hunt and David Thomas

    — as well as other books in the series — to be helpful in architecting for modularity. It is a lesson I learned originally from the classic software engineering text "Software Tools" (1976) by Brian W. Kernighan and P. J. Plauger.

    Though their examples are in FORTRAN and PL/1, I find their design principles to be timeless. I use them all the time in my work.

    In addition I've found the following abstraction useful: break things into pieces horizontally and vertically. In a famous tutorial on the UNIX shells we are shown how to print out a file listing in order of size: largest to smallest, starting at the current folder and moving away from the root. The command is:

      du -k -a | sort -nr | more
    This actually uses UNIX "pipes" in the form of the vertical bars, which connect three programs: du (disk usage), sort, and more. Each does a piece of the job. I think of this as horizontal partitioning of a task; each piece does a part of the job "on the same level" so to speak.

    In contrast, a vertical partition involves breaking things into layers, a concept which is a little more subtle. An example will probably help. I'm currently involved in a software project in which we are still debating how to have programs communicate: using shared memory, or sockets, or "RESTful" web services, or even dropping communications files in a designated folder. I suggested we add an "abstraction layer" by writing a set of communication functions/methods, and have the discipline to always use them. Then the underlying technology can be "swapped out" seamlessly.
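
    Here is a minimal sketch, in C, of the kind of abstraction layer I have in mind. The names (msg_open, msg_send, msg_recv, msg_close) are hypothetical, not from any real library; the point is simply that callers use only these four functions.

      /* msg_layer.h (hypothetical): the only communication interface
         callers are allowed to use. The transport underneath (shared
         memory, sockets, RESTful calls, or files dropped in a folder)
         can be swapped without touching any caller. */
      #include <stddef.h>

      typedef struct msg_channel msg_channel;      /* opaque handle  */

      msg_channel *msg_open(const char *endpoint); /* connect/attach */
      int          msg_send(msg_channel *ch, const void *buf, size_t len);
      int          msg_recv(msg_channel *ch, void *buf, size_t maxlen);
      void         msg_close(msg_channel *ch);

    One source file might implement these four calls with sockets today; another could implement them with shared memory or a drop folder tomorrow, and no caller would ever know the difference.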

    One of the most powerful tools in layering software is emulation, in which one technology emulates another. I'm old enough to have used a VT-100 terminal from DEC before they became obsolete in the 1980s as we switched from minicomputers to PCs. But there are still mainstream programs which emulate that old terminal, including the Linux console.

    A pervasive example of software layering is what is sometimes called the "network stack," nowadays usually described by the Open Systems Interconnection (OSI) reference model.

    In this model, the technology that makes the Internet work is broken into seven layers. Numbered from "bottom" to "top" they are:

    1. Physical Layer
    2. Data Link Layer
    3. Network Layer
    4. Transport Layer
    5. Session Layer
    6. Presentation Layer
    7. Application Layer

    The Network Layer is where the Internet Protocol (IP) is found, and the Transport Layer has the Transmission Control Protocol (TCP). Together they are called TCP/IP. Ever heard of that one? Part of the genius of the internet's design is the breaking of these functionalities into layers, which can each be improved or replaced independently.
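
    To see the layering pay off in code, here is a minimal sketch of an Application Layer program in C that talks to the Transport Layer through the standard sockets API (the host name and port are just placeholder values). Everything below TCP and IP, whether Ethernet or Wi-Fi, is invisible to this program, which is exactly the point.

      /* Minimal TCP client: the application only sees the transport layer. */
      #include <stdio.h>
      #include <string.h>
      #include <unistd.h>
      #include <sys/types.h>
      #include <sys/socket.h>
      #include <netdb.h>

      int main(void) {
          struct addrinfo hints, *res;
          memset(&hints, 0, sizeof hints);
          hints.ai_family = AF_UNSPEC;      /* IPv4 or IPv6: a Network Layer detail */
          hints.ai_socktype = SOCK_STREAM;  /* TCP: the Transport Layer */

          if (getaddrinfo("example.com", "80", &hints, &res) != 0) {
              fprintf(stderr, "lookup failed\n");
              return 1;
          }

          int fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
          if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) < 0) {
              fprintf(stderr, "connect failed\n");
              return 1;
          }

          /* Application Layer: speak HTTP over the TCP connection. */
          const char *req = "HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n";
          write(fd, req, strlen(req));

          char buf[512];
          ssize_t n = read(fd, buf, sizeof buf - 1);
          if (n > 0) {
              buf[n] = '\0';
              printf("%s", buf);
          }

          close(fd);
          freeaddrinfo(res);
          return 0;
      }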

Non-Technical Example


a railroad date nail from 1918

If this discussion of the innards of the internet has been too geeky for you, my last example is from a century ago: an obscure industrial artifact known as the railroad "date nail."

My friend Bob and I discovered this delightful collectible while visiting the Western America Rail Museum (W. A. R. M.) in Barstow, CA.

They have a whole room of the things. They're small and sturdy enough to make great collectibles, and have rich histories that can usually be documented. They are nails with dates or other codes stamped into the head, driven into the lumber used for railroad ties and trestle beams so that the wood's age can be determined when it eventually has to be replaced. The type of chemical treatment applied to the lumber was sometimes noted along with the date.

What do we learn from this practice? A surprising number of lessons can be extracted:

  • plan for change

    Understand that you can't keep using techniques forever; each will have to someday be replaced.

  • everything is an experiment

    You can never test things enough in engineering, so every field deployment is really an experiment.

  • document results

    You can't manage what you can't measure, and you should plan ahead to make measurement and record keeping easier.

  • be prepared to change course

    Every change can prove disastrous, and you need to be able to back out and return to a previous method.

  • try multiple approaches

    Test technologies against each other. Make sure you have choices informed by performance measures.

Now that's the way to run a railroad!


Disclaimer

I receive a commission on everything you purchase from Amazon.com after following one of my links, which helps to support my research. It does not affect the price you pay.


This free service is brought to you by the book:

A Survival Guide for the Traveling Techie
travelingtechie.com