London’s Post Office Tower: My First & Only Visit

Cover of my School Study, 1971

At the age of eleven, I produced the illustration above for the cover of a “London Study” that we were required to write and illustrate at school. The study was created in connection with our school visit to the capital city, which had taken place in May 1971, just before I drew the cover.

As you may expect (given my interests), my cover drawing emphasized modes of transport. Additionally, I chose as the centerpiece a striking modern building to which we had paid a surprise visit during the trip, and which had substantially impressed me. Little did I know at that time that it would probably be my only opportunity ever to visit that iconic building.

The building in my drawing was the recently built Post Office Tower (now known as the BT Tower). Even before that first visit to London, I was well aware of the existence of that structure, which was feted as a prime example of Britain’s dedication to the anticipated “White Heat of Technology”. In addition to its role as an elevated mount for microwave antennas, the Tower offered public viewing galleries providing spectacular views over Central London. There was also the famous revolving restaurant, leased to Butlin’s, the well-known operator of down-market holiday camps.

The Tower and its restaurant began to feature prominently in the pop culture of the time. An early “starring” role was in the comedy movie Smashing Time, where, during a party in the revolving restaurant, the rotation mechanism supposedly goes out of control, resulting in a power blackout all over London.

In the more mundane reality of 1971, our school class arrived in London and settled into a rather seedy hotel in Russell Square. One evening, our teacher surprised us by announcing an addition to our itinerary. We would be visiting the public viewing galleries of the Post Office Tower, to watch the sun go down over London, and the lights come on! Needless to say, we were thrilled, even though we had no inkling that it would be our only chance ever to do so.

There were actually several public viewing gallery floors, some of which featured glazing, while others were exposed to the elements, except for metal safety grilles. Fortunately, the weather during the evening that we visited was not exceptionally windy!

Concretopia

I’m currently reading the book Concretopia, by John Grindrod, which provides a fascinating history of Britain’s postwar architectural projects, both public and private.

Cover of Concretopia Book

One chapter of the book is dedicated to what was originally called the Museum Radio Tower (referring to the nearby British Museum). It provides detailed descriptions of the decisions that led to the construction of the tower, and reveals that at least one floor is still filled with the original 1960s-era communications technology.

Due to subsequent changes both in communications technology and in British government policy regarding state involvement in such industries, much of the original function for which the Tower was built has been rendered obsolete or moved elsewhere, leaving the building as something of a huge museum piece (ironically, in view of its original name).

The Once-and-Only Visit

In October 1971, a few months after my school class visit, a bomb exploded in the roof of the men’s toilets at the Top of the Tower Restaurant. Initially it was assumed that the IRA was responsible, but the attack was in fact carried out by an anarchist group.

Fortunately, nobody was hurt in the incident, but it drew attention to the security vulnerabilities created by allowing public access to the Tower. The result was that the public viewing galleries were immediately closed down, never to be reopened, and Butlin’s was informed that its Top of the Tower lease would not be renewed when it expired in 1980.

Nonetheless, the Tower continued to appear in the media as an instantly recognizable icon. At around the same time, it was supposedly attacked by a particularly unlikely monster—Kitten Kong [link plays video]—in the British TV comedy series The Goodies.

My younger brother took the same school trip to London two years after me, but it was already too late; the Tower’s public viewing galleries were closed, so he never got to see the London twilight from that unique vantage point.

The Unexpected Technologist

On that first visit to London in 1971, I had no notion that I personally would ever be a participant in the kind of exciting technological innovation signified by the Tower. In my family’s view, such advances were just something that “people like us” observed and marveled at, from a remote state of consumer ignorance.

I never anticipated, therefore, that I would return to London as an adult only ten years later, to begin my Electronics degree studies at Imperial College, University of London. I had to visit the University’s administration buildings in Bloomsbury to obtain my ID and other information, and there was that familiar building again, still looming over the area. (The University Senate House is also famous for its architectural style, but I’ll discuss that in a future post!)

My 1982 photo below, taken during my undergraduate days, offers an ancient-and-modern architectural contrast, showing the top of the Tower from a point near the Church of Christ the King, Bloomsbury.

Post Office Tower & Bloomsbury, 1982

Post Office Tower & Bloomsbury, 1982

The Museum Tower

The photo below shows the Tower again, during a visit in 2010, now with its “BT” logo prominently on display. Externally, the tower looks little different from its appearance as built, and, given that it’s now a “listed building”, that is unlikely to change much in future.

BT Tower, 2010

For me, the Post Office Tower stands as a memorial to the optimistic aspirations of Britain’s forays into the “White Heat of Technology”. It seems that, unfortunately, the country’s “Natural Luddites” (who, C P Snow claimed, were dominant in the social and political elite) won the day after all.

Mondrian’s Mistake: the Illusion of Primary Colors

You Can Call Me Piet

The image above is my own work, but was inspired by “Composition C” created in 1935 by the Dutch artist Piet Mondrian. I’ve been learning more about Mondrian’s life recently (mostly from the book Piet Mondrian: Life and Work), in connection with some design work I’m doing.

You’ll notice that the only colors in my artwork, as in Mondrian’s Composition C, are the so-called “primaries”: red, blue and yellow. Mondrian seems to have become quite obsessed with these particular colors, and he asserted that they somehow exist as special entities in the universe.

Mondrian was part of a group of artists who called themselves neoplasticists, and they published a magazine called De Stijl. As mentioned on page 194 of the book cited above, in 1917, Mondrian claimed in an article in De Stijl that:

All colors are available to our perceptions, but only true colors are susceptible to objective definition. The primary colors, which form the basis for all natural visible colors, fulfill this requirement.

The problem is that the claim is false, because the illusion of primary colors stems entirely from the quirks of the human visual system. Thus, there are no “true colors” in nature that could form the basis of other colors. Colors of light fall into a continuous electromagnetic spectrum, in which no color is more “true” or “primary” than any other.

There are no “primary colors” in nature.

Primary Colors don’t Exist

Those of us who received some type of artistic training at school probably remember being told by our teachers that there are 3 “primary colors”—red, yellow, and blue—from which all other colors may be mixed.

In fact, the illusion that there are 3 primary colors stems from the fact that there are 3 types of color receptor cell in our eyes. If instead, due to the vagaries of evolution, our eyes had 2 or 4 such types of cell, our teachers would be telling us that there are 2 or 4 “primary colors” respectively!

An entire book (claimed to be the best-selling art book ever produced) has been written on the misunderstanding of the “artist’s primaries”: Blue and Yellow Don’t Make Green by Michael Wilcox. Oddly, though, that very detailed book never makes any attempt to describe the human visual system and its light receptors. Instead, the author explains color mixing effects in paints as the results of impurities in the pigments (which is also true—the pigments are impure).

The Physiology of Human Vision

In a post on my professional blog, I explain in more detail how humans see color, and how additive and subtractive color systems work. These physiological quirks underlie many color reproduction technologies, such as television and halftone printing.

Although research continues today on the subject of vision, the fact that human eyes have several different types of light detector has been known since about the 1850s.

For the details, see my professional post, but to summarize here, the human eye has 3 types of receptors for colors (“cones”), plus one further type for monochrome vision (“rods”). Of the 3 types of cones, there is one type that is most sensitive to red light, another that is most sensitive to green light, and a third that is most sensitive to blue light. Each color of light corresponds to a wavelength in the electromagnetic spectrum.

In my diagram below, the sensitivity of the blue receptors is shown by the S (for “short”) curve, that of the green receptors by the M (for “medium”) curve, and that of the red receptors by the L (for “long”) curve. The R curve shows the sensitivity of the rod cells.

The Sensitivities of the Human Visual System

Light entering the eye may have any wavelength (i.e., any color) in the visible spectrum. Our brains determine the actual color by combining the intensities received by the three types of cone cell. For example, if yellow light enters our eyes, then the red and green cones see high intensities, while the blue cones see little intensity. The brain converts this information into the perception of yellow.

This means that our eyes can be fooled into seeing colors that are not actually present, by presenting combinations of other colors that trigger the receptors in the same way as the missing color. In fact, many display systems, such as color television, rely on this effect to create the illusion of a full range of colors from just 3 fixed primaries.
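
To make this concrete, here’s a little Python sketch of the idea. It uses made-up Gaussian sensitivity curves rather than real physiological data, so treat it purely as an illustration: it finds a mixture of “red” and “green” monochromatic light that produces almost the same cone responses as pure “yellow” light, which is exactly the trick that television screens exploit.

    import math

    # Toy Gaussian models of the three cone sensitivities (peak wavelengths in nm).
    # These are rough illustrative numbers, not real physiological data.
    PEAKS = {"S": 440.0, "M": 545.0, "L": 565.0}
    SIGMA = 35.0  # arbitrary curve width for this sketch

    def sensitivity(cone, wavelength):
        """Relative response of one cone type to monochromatic light."""
        return math.exp(-((wavelength - PEAKS[cone]) ** 2) / (2 * SIGMA ** 2))

    def response(light):
        """Total S/M/L response to a light: a list of (wavelength, intensity) pairs."""
        return {c: round(sum(i * sensitivity(c, w) for w, i in light), 3) for c in PEAKS}

    # Solve a 2x2 linear system (Cramer's rule) for red and green intensities
    # that reproduce pure yellow's M and L responses (S is near zero for all three).
    r, g, y = 630.0, 530.0, 580.0
    det = sensitivity("L", r) * sensitivity("M", g) - sensitivity("L", g) * sensitivity("M", r)
    a = (sensitivity("L", y) * sensitivity("M", g) - sensitivity("L", g) * sensitivity("M", y)) / det
    b = (sensitivity("L", r) * sensitivity("M", y) - sensitivity("L", y) * sensitivity("M", r)) / det

    print("pure yellow light:", response([(y, 1.0)]))
    print("red + green mix  :", response([(r, a), (g, b)]))  # M and L match: both "look" yellow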

Artists’ so-called primaries are in fact the “subtractive primaries”, which are the complements of the “additive primaries” discerned by our eyes. The subtractive primary colors are more accurately named magenta, yellow and cyan, respectively.
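
In normalized RGB terms, each subtractive primary is simply the complement of one additive primary: a pigment that absorbs one of red, green, or blue and reflects the other two. A few lines of Python (again, just a sketch) show the correspondence:

    # Complement of each additive primary in normalized RGB (1 = full intensity).
    additive = {"red": (1, 0, 0), "green": (0, 1, 0), "blue": (0, 0, 1)}

    def complement(rgb):
        """A subtractive primary absorbs one additive primary and reflects the rest."""
        return tuple(1 - c for c in rgb)

    for name, rgb in additive.items():
        print(name, "->", complement(rgb))
    # red -> (0, 1, 1) cyan;  green -> (1, 0, 1) magenta;  blue -> (1, 1, 0) yellow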

Primary Colors are in the Eyes of the Beholder

If you think a little about this situation, you can understand how the concept of “primary colors” arises. The fact that we see any color as being the combination of responses from 3 receptors gives the false impression that every color of light is somehow made up of proportions of 3 colors.

It may be disappointing to realize that, in the case of color vision, once again, we find that we don’t experience reality directly, but only a filtered version of it, due to the limitations of our senses.

Is it Art?

I should probably make it clear that I am not criticizing Mondrian’s artwork in this article, nor am I suggesting that he lacked artistic skills. The fact that he was misguided in his claims about primary colors does not detract from the quality of his artwork.

Personally, I was first introduced to Mondrian’s work as a teenager, during my Advanced-level Art studies. Our teacher showed us examples of his abstract work. While I don’t recall her ever explicitly saying so, I got the impression that we were supposed to conclude that it was not “real art”, but I do not agree with that conclusion.

Certainly, debates about the quality of Mondrian’s art did not prevent it from gaining popularity, even long after his death. During the 1960s, Yves Saint Laurent designed an entire fashion line using designs inspired by Mondrian’s abstract paintings.

Colors we Can’t See

One implication of the continuity of the electromagnetic spectrum is that there are many “colors” that the human eye cannot see, because they fall outside the range of the receptors in our eyes. One example of this, which caused me some consternation when taking photographs, was the rendition of some flower colors.

In spring in Britain, woodland areas are often carpeted with beautiful displays of flowers called bluebells. As the name suggests, the flowers are bright blue. However, whenever I took photos of such displays (particularly with a film camera), the color in the photo always came out purplish; not at all the color that my eyes saw in the original scene. The (poor quality) film photo below, from 2001, shows the results.

Bluebells, as captured on Film

The reason for this apparent change of color is that the bluebells actually reflect ultraviolet light, which our eyes cannot see, but to which photographic film is sensitive.

Apparently, most humans cannot see the ultraviolet in this case; it isn’t just some color-blindness on my part. (I know I’m not color-blind, because I’ve been tested for it several times, such as when I applied for my apprenticeship at Ferranti.) If most people could see the ultraviolet wavelengths, then presumably the flowers would be called “purplebells”!

Modern digital cameras tend to give a more faithful rendition of the color, although it still seems too purple, as shown below in my photo dating from 2007.

Bluebells as captured by a Digital Camera

Projecting Our Limitations onto the Universe

Mondrian’s false beliefs in this case are characteristic of much metaphysical theorizing, of a type that also occurs very frequently in religious thinking.

The error is to take some limitation or evolutionary quirk that applies only to the human condition, and then extrapolate that by claiming that it is a “universal truth”.

As the saying goes, a little knowledge can be a dangerous thing!

Demise of the Typewriter

My Pencil Drawing of our Typewriter, 1977

I produced the pencil drawing above in March 1977, while studying for my Advanced-Level Art qualification at Scarborough Sixth Form College. Back when I produced it, I could never have imagined that, some 40 years later, I’d be using exactly that image to illustrate an article about the demise of the typewriter!

As weekly homework, our teacher (Miss Mingay) required us to draw some object or scene in pencil, in a sketchbook. I considered the task very boring and tiresome at the time, but, fortunately, my mother hung on to the sketchbook, so some interesting drawings have survived (albeit now very smudged).

On that particular occasion, my chosen subject was a typewriter, which had originally been used mostly by my mother. (This was our second typewriter, and I think that it was an Olivetti). By that time, however, I was getting ready to use it myself, to type out the content of my A-level Art study in Architecture.

(The following year, Miss Mingay retired, and the onerous weekly homework requirement disappeared with her! That confirmed my suspicion that it was not a requirement of the A-level course.)

My Mother’s Career Plans

As I’ve mentioned in previous posts, my father was a teacher, but suffered his first stroke when I was about two years old. Given that he was the family’s sole breadwinner, my parents began to fear for their future financial security, and considered alternative plans for generating sufficient income.

One idea, which my father seemed to favor, was to buy a Guest House or Hotel, then generate income by letting out rooms. Given Scarborough’s status as a seaside resort, this was a reasonable idea, although the sheer number of such businesses in the town meant that it was highly competitive.

The other idea was for my mother to learn typing and shorthand, with a view to becoming a secretary. In those days, that was still one of the few career paths open to women without specialized qualifications.

My mother did start taking secretarial classes at Scarborough Technical College, and that was what initially prompted their purchase of a typewriter. She also decided that, to be effective in her new career, she would need to learn to drive, which she also achieved. My father’s concession on that count was that he sold his large Humber Super Snipe, and bought a smaller Austin 1100 (shown below, with me in the back seat), which my mother was more comfortable driving.

Our Austin 1100, c.1968

I was particularly excited about that car, because it was the first time that my father had bought a brand new car rather than a used model.

Change of Plan

Eventually, though, the Guest House plan won out, and we all moved to a suitable building on West Street in 1970. My mother seems to have abandoned her secretarial aspirations at that point, but she did continue her studies with some Open University courses, and the typewriter was useful for those.

From Typewriter to Computer

While an undergraduate student at Imperial College in the early 1980s, I decided to invest in an electric typewriter, since I had noticed that typewritten papers were better received by our tutors than handwritten ones.

That typewriter saw much use for a few years, but it was the last one that I ever bought. I brought it with me to California in 1987, but never used it again. Why bother, when a computer and printer were so much easier, more productive, and more powerful?

We Don’t Get Much Call for Those Now

I was by no means the only person who realized that the typewriter had been superseded by computer technology. In fact, should you wish to buy a typewriter now, you’ll have to find a used example, because the last new machines were manufactured in 2011, in India.

Just as digital camera technology swept away film cameras, so computers and printers have swept away typewriters. I sometimes find it sobering to reflect on how different the world is now from that of only 30-40 years ago.

Reinventing Myself: From Hardware to Software

OCVS Booth, Windows Solutions Conference 1993

The 1993 photo above shows me effectively embarking on a new career, and not quite sure what I’d started! I was at my business’s own booth, during the first trade show where I was promoting my own product.

Of course, I’d attended, and even worked at, many trade shows prior to that, but I’d always been there as a representative of someone else’s company or organization.

Short-Sighted Employers

The series of events that led to my first attempt to develop and sell my own software provided a thought-provoking lesson in the tragic short-sightedness of many employers and businesses. Until then, I had implicitly but naively assumed that, as technology changed, my employers would “keep their eyes on the ball” and change their products (and my role in the organization) accordingly.

Far from it, in reality! Most employers seemed to think of their employees as fitting into neat, predefined boxes, and their view was that the box (and the employee within it) should stay the same forever. Their attitude seemed to be that, if they had once hired an oil-lamp lighter, then that person should continue to light the oil lamps for ever more, even if oil lamps had in the meantime become obsolete!

As a result of my education and industry experience, I felt that I could discern something about the way computer technology would evolve, and it seemed obvious that I should attempt to evolve in the same direction. Unfortunately, as explained below, not only did my employer fail to support my attempts to redefine my role, they actively resisted my efforts to change!

Going with the Flow (or Trying to)

As I’ve mentioned in previous posts, my goal in obtaining an electronics degree had been to get a job working “in video”. I’d come to consider that as a desirable career as a result of one day’s teenage experience, when my friend Graham Roberts took me along with him to his shift as a Continuity Announcer for Yorkshire Television.

I really hadn’t considered electronics for any other reason. Unlike some other boys, I was not an electronics hobbyist, and I didn’t even have a “microcomputer” to tinker with.

When I started my video engineering career, the reality was that real-time digital video processing required special hardware. General-purpose computers simply weren’t fast enough to stream video in real time, let alone modify the pixels.

However, as processing speeds increased, computers became able to handle digital video in real time. As a result, it became possible to write software to process video in ways that would previously have required specialized hardware.

I wanted to move over to some type of software development, but my employer at the time (Media Vision) seemed to be trying to restrict me to hardware development only. My manager apparently decided (without consulting me) that I should become an integrated circuit design engineer, and bought development equipment for me to do that!

Frustrated by their short-sightedness, I quit my job and started my own business, initially with the intention of producing video in some form.

(As things turned out, Media Vision collapsed quite spectacularly some time after I left, so my decision to quit seemed very smart in retrospect!)

No Video Available

Oddly enough, despite my prior programming experience, when I started my own business I did not set out to develop a software product! My initial project was to develop an instructional video, which would be distributed on standard VHS tapes.

I’d created a “treatment” for my video, but I did not myself possess video cameras and editing equipment. It seemed fortunate that a friend of mine had simultaneously started his own video editing business, so we agreed to co-operate on the production. Unfortunately, as the months went by, it seemed that he was never quite ready to begin shooting, and I reluctantly realized that I was going to have to find another way to deliver my product.

My job at Media Vision had had me designing PC hardware for the new “multimedia” technology (which basically involved adding audio and video capabilities to PCs). It struck me, therefore, that perhaps I could create some kind of “multimedia computer tutorial” as a substitute for the planned video.

I had learned to program while at college, and as I related in a previous post, even before that, I had undergone an aptitude test that indicated that I would make a good programmer. Nonetheless, the only complete programs I’d written at that point were small utilities for my own use, or that of my colleagues, when processing data as part of our hardware design jobs. I had also written “embedded” software for custom hardware, but I had never tried to create what is called a “shrink-wrap” software application. Shrink-wrap software is a standalone product that can be sold to consumers, who then install it on their own computers and expect it to run with little or no further involvement from me.

Creating a shrink-wrap software application seemed like a significant challenge, and I wasn’t sure that I could actually do it. Nonetheless, there seemed to be little alternative, so I sat down to learn a multimedia software creation tool called Asymetrix Toolbook.

My First App

The eventual result was “Dave Hodgson’s PC Secrets”, which was a software application for Windows computers (what would now be called an “app”). The initial screen looked like this:

PC Secrets Software Title

Unfortunately, sales of the product were not great, which led me to seek consulting work. Although I did accept a couple of hardware design consulting projects, it was obvious that much more work was available for software consultants.

Fortunately, I discovered that the fact that I’d just created my first software “app” qualified me for consideration as a Windows software consultant! That led to many years of work for me as a consulting software developer.

Do Anything You Want to Do, But Don’t Expect Our Support!

That was how I learned that I couldn’t rely on my employer to have my best interests at heart, nor even to be concerned about my career development. It had been clear to me that the future of video (for me, at least) lay in software, but my employer would not support my ambitions.

While I think that most self-help advice along the lines of “do what you want” is simply naïve, I did find that, in order to achieve my goals, I had to define those goals myself, then actually invest considerable time and effort of my own to achieve the results I desired.

Complexity from Simplicity

Commodore 776M Calculator: Our Family’s First Computer!

This flashback is slightly unusual, in that, instead of discussing an old photograph, I’m thinking about something that, back in the 1970s, seemed to me to be a technological miracle. Learning how it worked, and how to design even more complex devices, provided a valuable lesson in how amazingly complex systems can be built from simple components. The image above is my drawing of the first “digital computer” that my family ever owned: a Commodore 776M calculator.

This article explains how, in the space of less than 10 years, I went from regarding computers as mysterious marvels, to learning not only how they work, but also how to design and build them. I even obtained patents for my own new digital circuitry inventions. I’ve tried to keep the technical discussion as basic as possible, while still trying to show how complex systems are built up from simple components.

As I mentioned in a previous post, when I was at school I studied Advanced-Level Math and Physics, but much of what we were taught, even in Physics, was very theoretical, and it wasn’t at all clear how the principles applied to real-world technology, or indeed how real devices worked. To learn how real systems worked, I often had to resort to teaching myself.

Computers as Black Boxes

The same was true for understanding how computers worked. I was very excited when I was told that, as part of the Physics syllabus, we were going to learn to understand computers, but I was quite disappointed by what we were actually taught.

The teacher explained to us that digital computers use binary arithmetic (the value of every digit can only be 0 or 1), and that computers are built from simple circuits such as so-called “AND” and “OR” gates. The binary 0 and 1 values are represented in the computer by “low” and “high” voltages respectively.

We were able to play around with pluggable electronic “black box” modules that implemented these functions, and we used Boolean logic to predict, and then confirm, the results of combining them.

But I still thought, “How do you get from that to a digital calculator?”

The answer (as I was to learn later on) is by combining thousands or millions of those basic gates together to make devices of increasing complexity.

Even after a career of designing digital electronic systems and the software associated with them, I still find it really astonishing how extremely complex devices can be created from such simple basic blocks.

Building Everything from NAND Gates

There are three basic types of digital computer logic circuit or “gate”:

  1. NOT gate. The output level is the opposite of the single input level.
  2. OR gate. The output level is high if any of the input levels is high. A NOR gate is the same but with the output inverted (i.e., an OR gate plus a NOT gate).
  3. AND gate. The output level is high only if all the input levels are high. A NAND gate is the same but with the output inverted (i.e., an AND gate plus a NOT gate).

It turns out that all three basic types of digital computer circuit can be built from combinations of just one of them: the NAND gate.
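
It’s easy to verify this claim in software. Here is a minimal Python sketch (my own simulation, not any particular chip) that models a NAND gate as a function, builds the other gates from it, and prints their truth tables:

    def NAND(a, b):
        """The single building block: output is 0 only when both inputs are 1."""
        return 0 if (a and b) else 1

    # All the other basic gates, built from NAND alone:
    def NOT(a):    return NAND(a, a)            # tie both inputs together
    def AND(a, b): return NOT(NAND(a, b))       # a NAND plus a NOT
    def OR(a, b):  return NAND(NOT(a), NOT(b))  # De Morgan: a OR b = NOT(NOT a AND NOT b)
    def NOR(a, b): return NOT(OR(a, b))

    # Print the truth tables to confirm the behavior described above.
    for a in (0, 1):
        for b in (0, 1):
            print(f"{a} {b} | NOT a: {NOT(a)}  AND: {AND(a, b)}  OR: {OR(a, b)}  NOR: {NOR(a, b)}")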

A Real NAND Gate

My Advanced-Level Physics studies did not include electronic circuit design, of course, so it would have been unreasonable to expect to be taught exactly how these gates were implemented. I learned the details later on while studying for my Electronics degree.

The actual circuit for a real NAND gate, implemented in Transistor-Transistor Logic (TTL), is shown below. This is the diagram for one quarter of the Texas Instruments 7400 (the actual chip contained 4 such gates).

Circuit Diagram of Texas Instruments 7400 NAND Gate

At the time that I began designing hardware, during the 1980s, TTL logic such as this was still the standard way of implementing many designs. I used these gates myself for many designs, starting with my undergraduate final-year project at Imperial College.

To avoid all the circuit details, the entire NAND gate can be represented with a symbol, as below.

NAND Gate in Symbolic Form

The diagram below shows how the connections on the symbol correspond to those in the actual circuit.

TI 7400 NAND Gate: Symbol & Circuit

Memory from NAND Gates

To create a useful computer, you need to be able to store numbers in some type of memory.

It turns out that, by combining a few NAND gates, you can create a simple memory for one bit of information. The combination is called a bistable circuit (aka a flip-flop) because, while the power is on, it remains in one of two stable states until an input causes it to change state. This allows you to store the output of a logic circuit: each bistable holds 1 bit of binary data.

Here is a diagram of a bistable 1-bit memory circuit, constructed entirely from NAND gates.

Data Flip-Flop (1-bit Memory) built from NAND Gates

The “Clock” input in this circuit can be obtained from another simple circuit constructed from NAND gates: the astable circuit, whose output continually oscillates between low and high states.

By lining up 8 bistables in parallel, you can store one byte of data.
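
You can simulate this in software too. The sketch below models a gated D latch (a level-triggered cousin of the flip-flop in the diagram) using nothing but the NAND function from earlier; because the real circuit contains a feedback loop, the simulation iterates the gate equations a few times until the cross-coupled gates settle:

    def NAND(a, b):
        return 0 if (a and b) else 1

    class DLatch:
        """A gated D latch built from four cross-coupled NAND gates.
        While the clock is high, Q follows D; when the clock is low, Q holds."""
        def __init__(self):
            self.q, self.q_bar = 0, 1

        def update(self, d, clock):
            # Iterate the gate equations until the feedback loop settles.
            for _ in range(4):
                s = NAND(d, clock)   # "set" signal (active low)
                r = NAND(s, clock)   # "reset" signal (active low)
                self.q, self.q_bar = NAND(s, self.q_bar), NAND(r, self.q)
            return self.q

    latch = DLatch()
    print(latch.update(d=1, clock=1))  # clock high: Q follows D -> 1
    print(latch.update(d=0, clock=0))  # clock low: D is ignored, Q still 1
    print(latch.update(d=0, clock=1))  # clock high again: Q -> 0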

From Gates to Functions

Well, that seemed simple enough, but I still didn’t understand how to get from that to a digital calculator.

Building a set of flip-flops gives you a way to store a number, but how do you combine numbers together? After all, the device is called a “computer” so how does it actually “compute”?

Well it turns out that you can also construct arithmetic circuits from—you’ve guessed it!—NAND gates. For example, you can build an adding circuit (called a Full Adder) to add together two 1-bit numbers, as shown below.

1-Bit Full Adder Circuit built from NAND Gates

The circuit adds two 1-bit numbers, A and B, and accepts a carry-in bit (Cin) from another adder. It generates the sum of A, B, and Cin at S, and a carry-out at Cout. By chaining any number of these adder stages together, connecting each Cout to the next stage’s Cin, you can build an adding circuit for numbers of any size.
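
Here’s what that looks like in the same style of Python simulation. The full adder below uses a common nine-NAND arrangement (one standard way of wiring it, not necessarily the exact circuit in my diagram), and ripple_add chains the stages exactly as described, carry to carry:

    def NAND(a, b):
        return 0 if (a and b) else 1

    def full_adder(a, b, cin):
        """A 1-bit full adder built from nine NAND gates."""
        n1 = NAND(a, b)
        x = NAND(NAND(a, n1), NAND(b, n1))    # x = a XOR b
        n5 = NAND(x, cin)
        s = NAND(NAND(x, n5), NAND(cin, n5))  # sum = a XOR b XOR cin
        cout = NAND(n1, n5)                   # carry = (a AND b) OR (cin AND x)
        return s, cout

    def ripple_add(a_bits, b_bits):
        """Chain full adders, each Cout feeding the next stage's Cin.
        Bit lists are least-significant-bit first."""
        carry, result = 0, []
        for a, b in zip(a_bits, b_bits):
            s, carry = full_adder(a, b, carry)
            result.append(s)
        return result + [carry]

    # 5 + 3 = 8: LSB-first, 101 -> [1, 0, 1] and 011 -> [1, 1, 0]
    print(ripple_add([1, 0, 1], [1, 1, 0]))  # [0, 0, 0, 1] = binary 1000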

Displaying Numbers

When you’ve constructed all the circuitry to allow users to type in numbers and compute the results, you still need a way to display the result to the user, because your calculator will be fairly useless without that.

Those early calculators used “seven segment” displays, which are sets of light-emitting diodes arranged so that, by switching segments on and off, any digit between 0 and 9 can be displayed in a human-readable form.

Seven-Segment Digital Display

So, how do you make the segments light up to display a particular number? Well, as you may have guessed by now, the answer is another logic circuit, called a Seven-Segment Display Driver. Texas Instruments also produced an integrated circuit to provide this function: the 7447.
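
In software terms, the driver boils down to a truth table mapping each digit to the set of segments (conventionally labeled a to g) that should light up. The little Python sketch below renders the result in the console; the real 7447 implements the equivalent truth table as a network of gates rather than a lookup:

    # Which segments light for each digit (a = top bar, g = middle bar,
    # using the standard seven-segment labeling convention).
    DIGIT_TO_SEGMENTS = {
        0: "abcdef", 1: "bc",     2: "abged", 3: "abgcd",   4: "fgbc",
        5: "afgcd",  6: "afgedc", 7: "abc",   8: "abcdefg", 9: "abfgcd",
    }

    def render(digit):
        """Print one digit the way a seven-segment display would show it."""
        on = DIGIT_TO_SEGMENTS[digit]
        print(" _ " if "a" in on else "   ")
        print(("|" if "f" in on else " ") + ("_" if "g" in on else " ") + ("|" if "b" in on else " "))
        print(("|" if "e" in on else " ") + ("_" if "d" in on else " ") + ("|" if "c" in on else " "))

    for d in (0, 4, 7, 8):
        render(d)
        print()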

Complexity in Biology

Learning how complex computers (such as the device on which you’re reading this article) can be built from large numbers of very simple components made it easier for me to understand how the same principle could apply in other fields.

For example, in biology, evolution has created incredibly complex organisms (such as humans) from huge numbers of very basic cellular components. It’s much easier to understand such processes when you know how other complex systems are created, even though the results remain astonishing in all cases.

Monochrome Illustration Techniques

Young H G Wells (Incomplete)

The image above was intended as a portrait of the young H G Wells. I started work on it while I was an undergraduate student, but, unfortunately in retrospect, I never finished it.

Even though the drawing is incomplete, I’ve scanned and posted it now because it is one of the few surviving examples of my artwork that uses a monochrome stippling technique.

Over on my professional blog, I just published a new post—The Age of Monochrome Illustration—reflecting on the significant changes in commercial illustration practices that have taken place during the past quarter century or so. On looking back at the work that inspired me in those days, I was astonished to realize just how much things have changed since then. It was a “given” back then that most illustration work would be printed in monochrome, and it had been that way since the dawn of mass printing.

Looking through my own remaining artwork to find images for the post, I had no difficulty finding examples of artwork that used cross-hatching, but an example of the stippling technique was less readily available.

I can think of only one area of electronic publishing where monochrome artwork may still be preferred, and that is for eBooks that are to be read on eInk devices such as the Kindle Paperwhite. However, that’s not necessarily much of a restriction, because the same publications can, in general, be read on compatible readers with full-color screens. I’m not aware of any cases where eBook artwork was deliberately created in monochrome for that reason.

The Truth About US Visas (In My Experience)

H-1 Visa Passport Stamp

Today’s “flashback” relates to my early experiences in the USA. The image above shows my H-1 visa stamp, in my UK passport, which was obtained for me by Sony so I could start working for them in 1989.

I moved to the USA to work about 30 years ago, initially on an E-2 (Treaty Trader) visa (for a different employer). The H-1 visa shown above was my second and final US visa (because I became a legal permanent resident in 1991).

Ever since I first began working in the US, I’ve heard controversial claims about the working visa scheme. The H-1 visa type was replaced by the H-1B visa in 1990, but many of the controversies surrounding its use have remained.

  • On the negative side, there are complaints that employers use visas to hire foreigners and undercut American workers, or that some employers prefer workers who require visas because such people can be treated as “indentured servants”.
  • Conversely, defenders of the system claim that employers have no choice but to hire visa workers, because the USA simply doesn’t produce anyone with the required skills. Is that really true, and, if so, why?

Some Criticisms are Justified

As someone who has benefited from the availability of US work visas, I may surprise you by saying that I agree with some of the criticisms. I’ve seen personally that some employers do seem to abuse the visa scheme, do use it to undercut American workers, and do treat visa employees as “captive workers”.

On the other hand, not all employers abuse the system. In my case, I have a clear conscience, because I really was hired due to having skills that my employer could not find in any available American workers, as I explain below.

Sony did not treat me as a “captive”; in fact they treated me quite generously, and even agreed to help me obtain Permanent US Residency (which became moot about a year later, when I married Mary, who is a US citizen).

After I began working for Sony, my manager explained to me that they had gone to considerable lengths to hire me because I really did have experience that no other available candidate possessed. Ironically, that experience came from an earlier job that I had, until then, regarded as a “wasted year”!

When Life hands you Lemons…

In 1985, while still living in the UK, I obtained work with Link Electronics Ltd. Link was a manufacturer of television cameras for the BBC and many other worldwide broadcasting organizations. At the time, this seemed like a positive move, given my background in video production and training at the BBC, so I moved away from London to Andover, where Link was located.

Unfortunately, Link succumbed to a pattern that seemed all too common in British engineering companies. There was no doubt that Link’s products were technically brilliant, but it was not a well-managed company, and, unknown to me, was in fact already in severe trouble by the time that I started there. As a result, despite making recognized contributions to their hardware and software, I was laid off from Link after only one year, leaving me feeling that my move there had been a very bad decision. (To this day, it remains the only occasion on which I’ve been made redundant by an employer, as opposed to leaving voluntarily.)

It was, therefore, very gratifying when, about 4 years later and 5500 miles away, I discovered that my one year’s experience at Link had opened the door to a great job at Sony. By the end of the 1980s, there were no remaining television camera manufacturers in the US, so Sony really couldn’t find any available Americans with that experience.

(The reason why Sony were so eager to hire someone with experience of television camera design was because they wanted to develop a film scanner that could convert high-resolution film into HDTV video. The video could then be used instead of the film for editing and compositing cinematic movies, which made the process far more efficient.)

Visa from Tokyo

It may seem odd that my H-1 visa states that it was issued at “Tokyo”, rather than London or San Francisco. That is correct, and occurred because of the oddities of the visa issuing process. In order to get the visa stamped in my passport, I had to visit a US embassy outside the United States. My first opportunity to do that, after being hired by Sony, was when I visited their plant in Atsugi for a project meeting. We stayed at the Keio Plaza Hotel in Tokyo, and, one morning, I went along to the US Embassy in Tokyo to get the visa stamped into my passport.

Finally, in 1991, Mary and I got married, as a result of which I no longer needed a visa to work in the US. Of course, there are also many stories of immigrants who marry US citizens simply to obtain residency, but the Immigration Service is well aware of that and conducts extensive checks to prevent that kind of fraud. Now that Mary and I have been married for over 26 years, I think we have adequate proof that there was nothing dishonest about the motivation for our marriage!

The few photos remaining from my 1990 Tokyo visit include a couple of portraits that Mary took of me in the hotel. One of these is shown below.

David Hodgson at the Keio Plaza, Tokyo, 1990

Which David Hodgson is Which?

Mystery Escapologist

When it came time for me to select a URL for my personal blog, I had to choose the name I would use from whatever was still available. There are, of course, many people in the world named “David Hodgson”, so it would be naïve to assume that I’d be the only David Hodgson with an internet presence.

I searched through web sites mentioning “David Hodgson”, and I was enlightened, amused and even appalled by what I found. I found several artists, a vicar, and even a murderer. Some of these sites are blogs, but some are not. I’m listing these URLs here in the hope that it will help you to avoid confusing me with all these other David Hodgsons!

All the URLs listed here are real web sites, and I am not in any way responsible for their content. So, if you want to complain, then please don’t “shoot the messenger”!

The following David Hodgsons are definitely not me

The fact that it’s now quite easy to set up a blog, and that it can be done free of charge, means that inevitably you will encounter some bad blogs. A similar situation occurred back in the 1980s when “desktop publishing” was the new fad; at that time I saw flyers that used twenty or more different typefaces on the same page! Nonetheless, the bad examples eventually fell by the wayside, and desktop publishing is now well-established. In the end, desktop publishing didn’t fail just because some of its early adopters didn’t know what they were doing. I suspect that the same will be true for blogging, and indeed for ePublishing in general.

One of the features of blogs that you notice when you start examining them is that some people seem to start a blog with unbounded enthusiasm, then realize only later that they don’t really have much to say. My advice is to be realistic; if you’re not already someone who posts regularly on social media and elsewhere, then you almost certainly don’t need a blog, because you’ll never actually post to it.

Then there are those sites that seem to be “their own worst enemy”. I received a communication from one blogger whose site is titled “Five Experts”, and which seems to offer advice to fellow bloggers as to how to attract more followers. Unfortunately, though, I was not inspired by the content. For example, here’s a description of the site:

“We are five profitionels, we creat a small business. a parte of owr work is to help you creat your’s”

Erm… no thanks. Anyway, I already creat mine’s…

David Hodgson: the Previous Incarnation

https://artuk.org/discover/artists/hodgson-david-17981864

You really do learn something new every day! I had never previously realized that there was a British artist named David Hodgson, who lived in East Anglia from 1798 to 1864. He seems to be best known for his paintings of the Norwich area.

David Hodgson: the Vicar of Wokingham

www.davidhodgson.com (aliases to: https://dphodgson.wordpress.com/)

This is the blog of a vicar in Wokingham, UK. Coincidentally, Wokingham is where I worked for about a year before emigrating to California, but I assure you that there’s no other connection between us.

Most of the posts seem to be just links to other church business postings. However, towards the end of last year, the Reverend apparently received inspiration to start posting for Advent. He explained, “My blog for each day in Advent will celebrate examples of action in the world inspired by hope and the desire to bring closer God’s kingdom of love, peace and justice.”

Unfortunately, he stopped posting after 7 days (on 3rd December), without offering any reason. It seems that the Holy Spirit gave up the ghost at that point.

David Hodgson: the Invisible Insolvency Expert

http://www.davidhodgson.net/

This site does not seem to be stable, and is currently not accessible. When it was accessible, it claimed to be the web site of “David G Hodgson, Insolvency and credit management consultant, Leeds, UK”.

Perhaps insolvency expert David G Hodgson didn’t pay his web hosting bill?

David Hodgson: the Word-Salad Maestro

https://www.linkedin.com/in/davidhodgson/

I must admit that this David Hodgson is the one that perhaps could be most easily confused with me, given that we apparently both graduated from the same university, and both worked for Sony at some time.

I was trying to find a word or phrase to summarize what David does. Based on his own descriptions, I can’t, although I must admit that I’m impressed with his “word salad”. If he wants to add more of the same, then perhaps the Deepak Chopra Quote Generator would be helpful?

David is the CEO of Hummingbird Labs, but the web site of that business consists of nothing but a nice picture of a tree in a field: http://www.hummingbirdlabs.co/. The page used to include a logo of a hummingbird with its beak missing, but that’s gone now.

David Hodgson: the Model

http://www.modelmayhem.com/DavidHodgson

I’ve drawn many models over the years, but I was never a model myself.

David Hodgson: the US Graphic Designer

https://www.behance.net/davidhodgson

No problems here; it’s simply a competently-presented gallery of his graphics work.

David Hodgson: the Blogger Who’s Cutting through the Crap

http://dhodgson.com/

In October 2013, this David Hodgson felt called upon to start a blog to tell the world his opinions of “Magnetic Water Softeners”.

He ended his first post by describing an experiment he was conducting, stating, “I will post with the results in about 30 days and let you know what I have found out.”

But he never did post those results, and apparently that first post was all he had to say. About anything. Ever.

The subtitle of his blog is “Cutting Through the Crap”. Apparently, that David Hodgson cut through so much crap that he left himself with nothing more to say.

David Hodgson: the Other Artist from Leeds

http://artofdavidhodgson.blogspot.com/

David Hodgson: the Video Game Commentator

http://www.penguinrandomhouse.com/authors/34539/david-hodgson

David Hodgson: the Murderer

I repeat that this is definitely not me!

http://www.mirror.co.uk/news/uk-news/police-ask-murderer-david-hodgson-433240

David Hodgson: the New One-Shot Blogger

http://www.davidkhodgson.com/new-blog/

Apparently this blog is so new that most of its pages have been removed… This David Hodgson also states that he’s a graphic artist, but the web site displays only one example of his work (or at least I assume that it’s his, because it’s not signed).

Hopefully the information above, covering many of the David Hodgsons who are not me, will help you identify the real me in future.

Revelation in Aylesbury

The Bell Hotel, Aylesbury, with the Buckinghamshire County Offices beyond, in 1980

There have been some occasions in my life when I’ve attempted something that, at the time, seemed to be a failure, but eventually it became apparent that I had gained some unanticipated new insight or skill. One such instance happened at a job interview in 1980, in Aylesbury, Buckinghamshire.

I described in a previous post how, having commenced my first full-time “permanent” job at Swifts of Scarborough, I soon came to feel that I was capable of something better (and also hopefully better-paying), and began to look around for more suitable employment.

One field that was recognized as being the “white heat of technology”* in those days was “Computing”, and I wondered whether my math skills would make me a good fit for that daunting new field. Incredible though it may seem now, I’d never actually used a computer (except for a digital calculator) until I went to university. Even though I was working as an Accounts Clerk at an engineering company, there wasn’t a single computer of any kind in the business. Swifts’ accounts department used nothing more sophisticated than electric adding machines.

(* Ironically, Harold Wilson made his “white heat” speech in, of all places, my home town—Scarborough!)

My first brush with computer programming was very disheartening. During the first-year engineering course at Warwick, we were required to write a single computer program. However, we received no instruction in how to do this; apparently we were expected to know already, or to teach ourselves! I found this very difficult and frustrating, and I didn’t seem to achieve good results. I concluded that I probably wasn’t “cut out” for computing, and shouldn’t attempt to pursue it further.

Later, after I’d left Warwick, I discovered that quite a few of my fellow students there had actually obtained A-levels in Computing before starting at university; a subject that wasn’t even on offer in Scarborough! Thus it wasn’t at all surprising that I hadn’t been able to compete well against such students.

Insurance Building on Gatehouse Road, Aylesbury, in 1980

One potential job, for which I happened to notice an advertisement, was as a “Computer Data Entry Clerk” at an insurance company in Aylesbury. I was warned that my interview would include a “Computer Programming Aptitude” test.

When it came time to take the test, the interviewer explained that there was no time limit. Accuracy was more important than speed. I could take as long as I needed to finish, but typically completion took about 4 hours.

I found that I’d finished and checked my work after about 3 hours, so I went over and handed my paper to the interviewer. He asked me if I was sure that I’d finished everything and done as much as I could, and repeated that there was no time limit for the test. I confirmed that I had completed everything. He gave me a skeptical look and accepted the paper, then I walked out.

I heard nothing further until weeks later, when I received a letter from the company, informing me that they were not offering me the Data Entry Clerk job. The letter went on to explain that I had done so well on the test that I “clearly” had great computer programming aptitude, and that my skills would be wasted in so lowly a position! As had happened at other interviews, I received the advice that I should instead return to university and try to obtain a technical degree.

So I didn’t get the job, but I did get some extremely valuable feedback that bolstered my self-confidence and caused me to renew my interest in a field that I had been ready to abandon.

I eventually followed the advice that I’d been given at those interviews, although I chose Electronic Engineering rather than Computer Science. In retrospect, CS may have been a better fit for my unusual skill mix, but, at the time, I hadn’t forgotten the difficulty of trying to compete with other students who had A-levels in Computing, when I had no formal qualifications in that field at all.

Old and New, in 1980. Aylesbury Canal Wharf, with the County Offices building beyond