
There is a perception that user experience (UX) is something new, something that has only been around since personal computing, or even something that came about after the web. These assumptions are both correct and very, very wrong. The term itself is new, and the processes it uses were only formalised in the last 30 years or so, but the idea has been around for as long as humans have used tools.

Get a warm drink and settle down, this may take a while…

How long ago did it all begin?

Human-Centred Design (HCD) is, at heart, simply improving tools and services to make life easier for humans. Dragging heavy objects was made easier if you rolled them on logs; easier still if you replaced the logs with wheels on a cart; far easier if you got an animal to pull the cart; even easier if you pulled the cart with a steam engine; and easier again if you replaced steam with a petrol engine and then… well, you see where this is going.

At some point, somebody understood that people were having a problem and came up with a solution. We’ve been doing it as a species since we became a species. It was just natural; it could even be argued that in evolutionary terms, problem-solving was what made us a distinct species.

The first people to really seek to control it, to create rules around human-shaped problem solving, were the ancient Greeks. Now there’s a bunch of over-achievers if there ever was one! The ergonomic patterns they plotted and documented millennia ago are still evident today. The artefacts they left behind are very familiar to modern eyes because so much modern design stems from them.

Seating sizes, road sizes, the shape and weight of hand tools, the shape of doorways, and the positioning of windows and rooms for distinct uses. If you were to go back in time, you wouldn’t be able to speak the language, but you could certainly understand and use the culture’s artefacts, because we largely follow their forms now.

The concept of putting human physical needs at the heart of the design is even alluded to in their stories.

The Tale of Procrustes

Procrustes was the son of Poseidon and behaved just as you would expect a demi-god of ancient Greece to behave. In short, he was a bad ‘un.

Now, Procrustes lived on Mount Korydallos, a place which was unavoidable if travelling the ‘sacred way’ between Athens and Eleusis. Procrustes made it known that he had a magical bed that would fit any traveller, and many weary strangers visited to test out this remarkable offering. The problem was that the way he made the bed fit was by strapping unwary guests to it and then removing any hands, feet, legs, or other extremities that overhung the edges or, when it came to smaller visitors, by breaking bones and stretching them to fit.

Told you he was a bad ‘un!

For those who like a happy ending, he was eventually killed by Theseus who, having escaped Procrustes’ trap, fitted him to his own bed.

From this story we get the term Procrustean Design: the idea that people can be fitted to a solution, rather than a solution fitting people. If you’ve ever had any in-depth training in Agile, then you’ve probably come across this term when discussing the Minimum Viable Product (MVP). For a lot of organisations, an MVP is defined by the least amount of effort they can put into producing a solution that functions, with the user expected to accommodate it. A real MVP is defined by the minimum the end-user needs for the functions of a product to be viable. Put the user first to put your business first.

I can totally sympathise with Theseus when it comes to how people who utilise Procrustean Design should be treated.

(Sorry. Bad MVPs and [FR]Agile are pet peeves. We shall speak of them no more.)

Specialised Design

The Greeks turned what was instinctive into a considered methodology, but it was still about meeting mass, generic needs. Now, ask any anthropologist about needs within a culture (the territory of ethnology) and you’ll get a very different answer. The moment you start to look at segments of humanity, it becomes clear that specialist tools are required within specialist environments.

It’s hard to pinpoint when the first specialist tools started to appear, but it is widely understood that, as with a lot of human innovation, conflict may have been the driver. After all, if your enemy is coming at you with a sword and shield and the best thing you can face them with is a pointy stick and an arm with your own skin on it… it’s going to be a very short battle and your best bet is to run away and think about how to craft a better weapon than your enemy.

It’s only natural, then, that the same model should persist when it comes to everyday use. Think about something as simple as a hammer. All hammers are basically made up of a handle for leverage and a heavy bit for hitting things with force, and all hammers have the same basic function. Yet you don’t drive a nail home with a sledgehammer, and you don’t take a geology hammer to work if you’re breaking concrete. A surgeon may occasionally need a bone mallet, which is scary in itself, but if, as you were being given the anaesthetic, you looked over and saw an 1,800 W demolition jackhammer being removed from the steriliser, you’d be off and headed for home before the surgeon had their gloves on. The hammer may be basic, but there are many specialised versions, each tailored to a very fine use.

For centuries this pattern persisted: tools were developed for specialist functions, and people had to be trained to use them. The function was the driver, not the needs of the person operating the tools.

What changed? Business. Or, more specifically, the move from farming to manufacturing and the advent of the Industrial Revolution is what really changed everything.

Industrial Revolution

The Industrial Revolution marks a major turning point in history; almost every aspect of daily life was influenced in some way. Most historians look at the human cost (horrific), economic and population growth (good for Britain’s expansion and then the rest of the world’s), the dissolution of rural life, and the growth of city life. Ask an industrial historian though and watch their eyes light up as they describe the move from functional design to operational design.

Commercially, for all the increase in manufacturing, what made the machines and businesses succeed was their ease of use. Any competent engineer of the time would have been able to design a machine that could spin cotton a thousand times faster than a human, but if no one could operate it then it was a complete waste of money, and you can bet that a competitor who had thought about ease of use already had their machine ready and waiting for the same customer base.

As the Industrial Revolution gathered pace, so did the demand for people who could work these machines better than anyone else. The earliest machines were quickly superseded by ones that did the same job but were easier, and therefore quicker, to use. They still needed specialist operators, but the easier the operation, the easier it was to find or train those specialists.

This is the first clear time that User-Centred Design (rather than the wider idea of Human-Centred Design) can be seen as a commercial need, and in a form that we’d recognise today. The machines needed to fit the user’s needs, or they would fail and so would the business.

Economic Freedom and Emotional Design

It continued in this fashion unabated for many decades, as the idea of easy-to-operate machines formed the core rationale behind business decisions. Even a couple of world wars couldn’t stop it; in fact, they increased the speed of innovation. The start of World War One was dominated by cannons and horses; the end of World War Two was dominated by air power and rockets, all made possible by making weapons that were easier to use.

All that innovation, driven by commercial or military use, was soon to face a new challenge, though one driven not by industry or the need to destroy, but by personal preference.

By the 1950s the commercial challenge was selling to consumers, not to other businesses or governments; consumerism and marketing were slowly changing the world’s economies. Consumers had choices, and if your product was better, or just new and improved, or simply had the best marketing, then you survived and your competitors didn’t. Customer relations, knowing customers’ wants and needs, made products and services virtually the same thing.

Brands that survive today embraced those notions and made them part of their identity; the ones that didn’t are long gone. This was the time when Emotional Design came to the fore: it wasn’t enough that your product was good, people needed to feel good about owning and using it.

By the 1960s, what we’d now call ‘pop culture’ was the biggest differentiator in business, fashion, cars, gadgets, and leisure activities. It wasn’t just utilitarianism; it was all about desirability and how individuals felt about their purchases, alongside ease of use.

By the 1970s it had gone one step further, with automation and microprocessors replacing conveyor belts and vacuum tubes. We started making machines that mimicked humans, designed to be simply and easily controlled by humans. Market forces dictated that ease of use was the key to success; word of mouth, magazine and TV articles, and keeping up with the Joneses all played a big part in the experience, the acceptance, and the success of the products on offer.

The big plan was to free people to do something more interesting and fun, like watching movies on those new VCRs or listening to music cassettes on those Walkman things, or what about those new home computers that everyone seemed to be talking about?

All of that was recognisable as emotional design, but it hadn’t yet been recognised as UX. However, all that was about to change.

HCI and the Emergence of User Experience Design

The 1980s, for most people, is where UX principles really became recognisable. This is where it gets interesting, just don’t look at the clothes! A new kind of product had hit the shelves, with screens and keyboards and functions designed for fun as well as productivity. Computers didn’t just have to be useful and usable, they needed to be engaging too.

Finally, Computers! Isn’t UX Digital?

Firstly, interrupting like that? Rude!

Secondly, no. UX is about humans, not just computers and digital, but since you asked…

UX growth and computer growth have a mirrored evolution, especially since the end of World War Two.

Hold on, I need to put my geek head on for this next bit.

Let’s start with John Mauchly and J. Presper Eckert, who created ENIAC in the late 1940s, probably the first machine we’d recognise as a modern computer rather than a logic engine. Around the same time, the idea of hypertext emerged, created by Vannevar Bush, though he called it the Memex, and all of this happened just as consumer design was emerging from industrial design.

The JOHNNIAC computer was created by RAND researchers in 1954 and lasted for 13 years, making it one of the first truly long-lived computer systems. Imagine that: in an age where your phone is obsolete in two or three years, 13 years of life for a computer before it was overtaken by competing technology. And it happened at just about the same time that consumerism was established as a long-term socio-economic system, taking over from industrial design.

In 1962 the Stanford Research Lab started work on a word processor with automatic word wrap, search and replace, user-definable macros, scrolling text, and commands to move, copy, and delete characters, words, or blocks of text. In 1965 it was demonstrated on their own TVEdit system, which built on 1963’s Sketchpad system, in what we’d recognise today as a Windows, Icons, Mouse and Pointer (WIMP) interface. The sophistication of computers and their interfaces sits alongside the sophistication of precisely targeted, consumer-driven features and functions.

The Xerox Alto was created in the 1970s as a personal computer that could print documents and send emails. It comes as a surprise to a lot of people that not only is the idea of email decades old, but so is its use. What was most notable about the Alto was its compact desktop design, which included a mouse, keyboard, and screen, something that hasn’t really changed in concept since. The Alto computers were also designed to be child-friendly, so that everyone could operate a personal computer.

Modern UX and modern computers have grown up together but travelled different roads, and it wasn’t until the 1980s that they really worked together.

Where was I? Right, HCI and the Emergence of User Experience Design

Human-Computer Interaction (HCI), for the unaware, is the blending of cognitive psychology with industrial design into what is loosely referred to as Cognitive Engineering. It seeks to put humans at the very heart of decision-making, the most obvious example of which was the retention of the QWERTY keyboard. The layout works well for typewriters because it stops the print arms jamming as they strike the ribbon (look, just ask your grandparents, okay), but it isn’t the most logical layout or even the easiest to learn and use, so why keep it? Simple: people already knew how to use it, and muscle memory in the fingers meant that you could look at the screen and not the keys when typing.

We also retained real-world modelling in the terminology for operating a computer: cut and paste in the print industry meant using a scalpel and some glue to move elements on a page. Typefaces? That’s a printing term too. These concepts were familiar, so why change the terminology? The whole thing was focussed on familiarity and therefore usability, and it even generated its own meme-like soundbite: “Easy To Learn, Easy To Use.”

Most of the Graphical User Interface (GUI) patterns we use today, and the reasons behind those patterns, stem from HCI. I say GUI rather than UI deliberately. Today, a user interface might be spoken, it might be touch, it might be automated, or it might even have an AI predicting your needs. But when it comes down to GUI design, the basic elements of WIMP design and Information Architecture (IA), with navigation at the top and side and a hierarchy of content that puts the most important information first and in as concise a form as possible, are decades old.
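
To put my geek head back on for a second: that IA pattern is, at its core, just a data structure. Here’s a minimal, purely illustrative sketch in TypeScript; the names (`NavItem`, `ContentBlock`, `prioritise`) are mine, not any real library’s, and this is just one way the idea could be expressed.

```typescript
// A minimal, illustrative model of classic Information Architecture (IA):
// navigation as a hierarchy, and content ordered so that the most
// important information appears first. All names here are hypothetical.

interface NavItem {
  label: string;
  children?: NavItem[]; // nested navigation mirrors the content hierarchy
}

interface ContentBlock {
  heading: string;
  summary: string;    // the concise form, shown first
  importance: number; // higher = more important to the end-user
}

// Classic IA: the most important information appears first.
function prioritise(blocks: ContentBlock[]): ContentBlock[] {
  return [...blocks].sort((a, b) => b.importance - a.importance);
}

const nav: NavItem[] = [
  { label: "Home" },
  { label: "Services", children: [{ label: "UX" }, { label: "Content" }] },
  { label: "Contact" },
];

const page = prioritise([
  { heading: "Company history", summary: "Founded decades ago.", importance: 3 },
  { heading: "What we do", summary: "We design for end-users.", importance: 9 },
]);

console.log(nav, page); // "What we do" comes out on top
```

Decades of GUI convention boiled down to a tree and a sort. The hard part, of course, is knowing what those importance numbers should be, and that’s what user research is for.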

The 1980s was when computers really took off as essentials for the home. Gaming in particular, where ease of use and engagement were paramount, needed to know exactly who its users were and what they wanted. Software was tested for bugs before release, but testing usage with real users became an industry model that we still follow today. It wasn’t just computers, though. Desirability and individuality were pushed to their limits, be it terrible fashions, bad genre music, or the latest gadgets pushed by lifestyle ‘gurus’ (yup, influencers are nothing new; they just do it through social media now rather than TV and magazines). Understanding and catering for specific user needs was just as essential to any product and service as it was to the home computer.

The creation of the UX industry

We’re about to get to the 1990s, where UX, as you’d recognise it, was born. All the history I’ve described here, be it cognitive science, marketing, service design, industrial design, branding, ergonomics, emotional design, or usability, was drawn together, defined, and named by Don Norman. The first-ever User Experience Architect, and a man widely credited with helping to make Apple a household name, he defined UX like this:

“User experience encompasses all aspects of the end-user’s interaction with the company, its services, and its products.”

It was simple, it was right, and it gave those of us who had been doing it for years a hook on which we could all hang our coats.

Commercially, we’d been inching towards this definition for 50-odd years, and now an entire industry was given new purpose, value, and direction. If that wasn’t enough, Don Norman and Jakob Nielsen revealed their proven ways of working and gave us a new, practical design methodology.

Here was a structure, based on sound psychological foundations, that allowed us to build a design process with more in common with the scientific method than with the generalisations of ergonomics or opinions on aesthetics. We not only had a goal, human satisfaction, but we also had a way of measuring it, understanding it, and delivering it.

A new industry was born, but after the longest gestation period imaginable.

And UX now?

Firstly, it is obviously a massive business driver. Secondly, it is something everybody should get involved in: creatives, marketing, content, data analytics, technologists, and Product and Service all work together to bring a coherent, joined-up user experience. The people who do that might not call themselves user experience designers, but make no mistake about it: if their focus is understanding the end-user in order to make life easier for that end-user, they’re sitting at a table called UX and they’re producing UX.

Those of us who do call ourselves ‘UXers’ are the ones who enable all those others to do their best work by removing any uncertainty about who the end-user is: we measure and test decisions, we eliminate guesswork, and we replace opinions with facts.

UX, as it stands today, is the result of centuries of development, but it is also still very new.

If you want to know more about the UX services at Elixirr Digital, visit: https://www.elixirrdigital.com/services/user-experience-ux-services/

If you want to know more about the author, visit: https://www.elixirrdigital.com/2022/02/07/meet-the-team-bob-powell/
