1. Beautiful, Smart and Easy-to-Use
My dream of becoming a designer started only a couple of years after I got on the Internet. I already wanted to be a designer before I knew what a designer’s job was, because all the designers on television looked really cool and smart. It was not a coincidence. The late 90s to the late 2000s was a decade of celebrity design culture across the world. From graphic and industrial design to architecture and urbanism, design was the thing that made our lives imaginative, experimental and stylish. Back then, designers were leading, mainstream figures who inspired the future, until another group of college drop-outs took over the hype with their groundbreaking digital ideas, social networks and platforms.
One may argue that the dot-com boom was also an important strand of future aspiration in the 90s, but that was true only within a popular subculture. “Micro-chip”-based gadgets were the new cool shit; both designers and consumers in this niche saw them as geeky, experimental playthings rather than as everyday objects for use in daily life. The outcome was a wildly diverse range of appearances for phones, MP3 players and other gadgets: glamorous, funky, futuristic or cute, but never uniform.
Similar to the design of hardware, software (whether the operating systems of these devices or services on the Internet) was just as much a part of the sub-culture. Whether something was “user-friendly” was not a concern. In fact, as the threshold of communicative labour tested one's commitment to the community, making something easy to use was generally de-prioritised.
Apple picked up on this rupture between an increasingly popularised market and its initial sub-culture flair. Steve Jobs’s reputation for “design taste” came at the critical juncture where design lifestyle and the democratisation of digital technology converged. After Jobs’s return to the company in the late 90s, he and Chief Design Officer Jonathan Ive launched the iconic G3 and G4 iMacs, which rounded off the “Think Different” campaign. Soon after, as the iPod and iPhone took off in the market, Jobs and Ive said farewell to niche fashion and took inspiration from Dieter Rams, the iconic German industrial designer famous for a minimalistic, modernist style of consumer products that strictly obeyed the “form follows function” principle.
In the documentary Objectified (2009), Rams cites Apple as an exemplary company after introducing his ten principles for good design. Following the motto, “Good design is as little design as possible”, Jonathan Ive proved the point by presenting his work for Apple. He used the iPod, iPhone and MacBook Pro to demonstrate his “less is more” philosophy, illustrating how much he wanted to weld materials, form and function tightly together in a single package, thus “getting design out of the way”. He wanted this pursuit of form to feel “almost inevitable” and “undesigned”, so much so that people could not imagine their tools designed any other way, and would not worry about the operationality of the product, immersing themselves fully in the digital realm instead. His solution was an iPod with only a click wheel, an iPhone with only a screen and a home button, and an iMac with only a screen and a stand. Such design responded well to Jobs’ insistence on making “beautiful, smart and easy-to-use” products, aimed precisely at this transitional moment between design and technology.
The “form follows function” idea was hardly new to design history. Modernism was the dominant philosophy in art and design in the first half of the 20th century. In terms of software, Don Norman's theory of affordance and user-centred design had already been around since the 80s. But Apple’s adaptation was still groundbreaking for the industry. Their minimalistic design approach and well-crafted products resonated with the rising desire for a “less is more” lifestyle in the midst of the information explosion. Apple’s fixation on “easy-to-use design” successfully made digital gadgets approachable for people who were not at all interested in the digital apparatus but were being pushed into the realm of communicative capitalism, as the Internet and digitalisation were about to become ubiquitous.
Apple was also the pioneer of a closed, proprietary platform ecology, which followed from its modernist design philosophy. From macOS (iMac and MacBook), to iTunes (iPod), to iOS (iPhone and iPad), to the App Store and the promotion of native apps, their approach was notoriously controlled and closed across software and hardware alike. From a design perspective, a highly controlled environment makes it much easier to create a streamlined experience for users, as there is less protocol friction, and therefore fewer bugs and unexpected outcomes. User behaviour can also be predicted better, given that Apple products tend to leave fewer choices for users to play around with. This way, designers can make sure users will use their products exactly as intended, and not get lost in possibilities that are not immediately self-evident. Such an approach was later called intuitive design, wherein users need not explore much before applying themselves to the desired function, as the design is already contoured to human intuition.
Apple’s approach was notorious because it went completely against early Internet ideology, at a time when people were still clinging to the promises and ethics of an open, de-centralised Internet. From software to hardware, the company has, to this day, no interest in collaborating with other existing infrastructure, making its consumers invest in a full range of products for reasons that go beyond brand loyalty. Other companies followed suit after witnessing Apple’s continuous waves of commercial success, enlightened by the many points of profit in a closed ecology, as well as by how “easy-to-use” products help to expand their consumer market share.
Just like that, Apple opened up a new era in which the entire tech industry slipped into an unexpressive, tightly controlled process of design — which I call a kind of digital minimalism — making software and hardware uniformly “beautiful, smart and easy-to-use”. The industry moved from a niche market for Y2K geeks to a more general consumer market for personal devices, setting off a new craze of software-hardware platform ecologies from the mid-2000s until today. Although not many companies have managed to stay in the competition, the shift successfully pushed forward the democratisation of the Internet, providing the infrastructure for the transition towards a persona-driven Internet.
2. Minimalist Design Accelerates
Not many people would associate Facebook with Apple in terms of their design. Even today, Facebook’s visual identity and user interface have never been regarded as something thoughtful or aestheticised; at best they are considered bland and practical. But I would argue that these qualities are perfectly in line with Apple’s design strategy.
Until Facebook became the dominant social media platform, MySpace was looked upon as the future. The platform was a hybrid of personal blogs and social media with a slant towards music communities. Besides avatars, theme templates, photo albums, background music, videos and playlists, MySpace also provided view counts, message counts, lists of recent visitors, special friends and much more, to showcase a blogger’s popularity. Users could choose different templates, or even write their own code to design their blogs. These features encouraged users to put more effort into decorating their homepages and assigning friendship ranks, which made them more emotionally attached to their online avatars and to the platforms that accommodated them. MySpace furthered the idea of the persona in the existing blogging environment, while moving a step closer to the platform economy by providing a web-based community where all infrastructures were offered universally on the same servers.
The problem with MySpace was quite similar to RSS. The platform incentivised people to design their online persona, yet its software environment had a long learning curve. Compared to the communicative labour the platform absorbed, it created extensive new labour for unprepared users, who had to take time to learn the platform’s features, logic and code in order to enter the self-design process. This was not necessarily a bad thing, given that learning tools and self-design processes are important for one to explore one’s digital self. In return, the platform had a quality pool of users with accumulating reputations and stable communicative production, because the laborious process naturally deterred people from inventing more personas. However, the late 2000s brought an exploding market for personal devices and an accelerating persona-driven Internet. The effort needed to craft a MySpace persona became excessively painful compared to people’s immediate desire to join the blooming communicative economy.
Facebook was created in 2004 and was an instant hit, but only surpassed MySpace as the most visited social media site in 2009. Facebook’s resume-like, guided user interface was the reason for its popularity and became a remedy to the difficulties of self-design. From the very beginning, the platform asks for as little visual material as possible from users. In fact, Facebook’s real-name policy even helps bypass the entire labour of designing a representation of one’s alternate self, implying that there shouldn’t be any invented selves on the platform. These limitations, compared to a blog or a MySpace account, were in fact liberating for users unfamiliar with code, or those who were not conscious of the possibilities for alternative selves. One neither needed to confront one’s lack of imagination with regard to alternate selves, nor be frustrated by long learning curves deterring one from the instantaneous rewards of a persona-driven Internet.
Consciously or not, Facebook followed the design minimalism movement spearheaded by Apple. In creating Facebook, Zuckerberg identified the key motivation for connecting with one another: to feel appreciated and be noticed. The entire logic of a community-centred Internet, which prioritised introspection within the self and community, became a nuisance in light of this purpose. That process seemed too arduous and unnecessary to fulfil the mere desire for attention. Facebook provided a shortcut to reproducing offline social life online because, after all, everyone already has an identity that they are born with, and an IRL (in real life) network that already pays attention to them. People no longer had to put in the effort to appear interesting, when a random selfie or fragments of opinions would be enough to draw attention.
Another important innovation from Facebook was to scatter elements of a person’s profile over a timeline, as opposed to the information architecture of a personal website, blog, or MySpace page. Facebook offers a social network that prioritises the context of strong social ties, which makes this approach plausible: people no longer have to rely on a meticulous visual representation to become interested in a persona. All of these design decisions followed the same “easy-to-use” principle. With its bland blue resume interface, Facebook not only managed to surpass MySpace within five years of launching, but also incorporated billions of new users from all over the world into the age of communicative capitalism.
Little did we know that, in this process of making things “easy-to-use”, the “form” had quietly changed the “function”: a digital social space transitioned from being a venue of meaningful exploration in relation to both the self and community, to being a space that rewarded superficial self-presentation and encouraged competition for attention. Marked by Facebook and Apple, more digital innovation narrowed into the same approach. Twitter’s innovation was limiting character length, so that people could share their opinions much faster than with full-length blogs; Medium standardised blogging layouts, so that no one needs to worry about the design in which their opinions are presented. Instagram entered the realm of image-making by offering filters that make bland photos look universally attractive, making it even easier for users to craft their persona and lifestyle. TikTok did the same for video-editing.
Through this we see the progression of design solutions: from removing distracting elements, to creating a tightly guided experience, to standardising and automating forms of content creation. From Apple to Facebook, from Twitter to TikTok, it has been a two-fold process: accelerating communicative production and lowering barriers of participation so that user markets can constantly grow. This is a process which continuously absorbs communicative labour.
As these companies’ experience accumulated, the industry converged on a series of standardised models for apps and platforms: namely, templates. More and more companies craft their products from these templates without a second thought. A template condenses years of research that has already led to successful products, so applying one is a cheap and effective solution. Hardly anyone wants to invest in researching alternative models of design, as that is now a costly and risky option; neither do people feel a need to re-examine existing design logic, since the market has already proved it favourable. What happens then is that designers are shoved to the outskirts of the creative process. Structurally, designers have lost access to all the meaningful research that shapes products: the purposes of a tool, its intended functions, the forms and affordances that follow, and the wider personal and societal impacts of such designs.
This is how we have become surrounded by monolithic digital products, all shaped by the same kind of logic. Eventually the minimalistic approach became universal, part of our intuition. The learning curves of software, apps and platforms have been shortened, not necessarily because they are better designed, but because we have been trained daily towards seamless assimilation, while our everyday environments are increasingly transformed by them.
3. When Tools Become a System
It’s important for us to realise that intuitive design in the digital realm is mostly a constructed illusion. Not much about our relationship with a screen is innate to human nature. We expect Gen Z and Gen Alpha to know how to use smartphones and computers not because they have some genetic inclination towards digital apparatuses which their predecessors lack, but because they were born into a society where devices, software, and methods of organising life are all built on a similar logic.
Boomers don’t have the same kind of intuition, as they grew up in an analogue environment where “easy-to-use” meant tools one could get a grip on, tools with a single purpose or a handful of purposes suited to certain contexts. The forms of these analogue tools suggested what they could be used for. And if one could not correctly guess a product’s use from its design, one could still misuse it in such a way that it achieved one’s intended purposes. Digital tools, by contrast, all hope to be all-encompassing, working across various contexts. Their tightly controlled process of use makes them impossible to misuse or reinvent. They would simply not work.
We know why digital apparatuses are designed that way: they are far from being tools for a single purpose, or a handful of purposes. Every company wants to create products that everyone needs; as a result, every company wants the Swiss Army Knife of products, absorbing as much labour as possible from people’s daily lives. But the more functions one attempts to offer, the more complicated the interface may look, which creates more extra-communicative labour for users. As Don Norman wrote in the 80s: “The real problem with the interface is that it is an interface. Interfaces get in the way. I want to focus on the job… I don’t want to think of myself as using a computer, I want to think of myself as doing my job.” The solution, then, is a fully black-boxed, closely guided user experience.
Yet today’s interface problem is that we no longer have access to understanding what kind of jobs we are doing with our digital tools. The language or code is not even the main issue. It’s the enveloped communicative labour, in which we participate without being aware of it. We have moved from simple designs which allowed people to focus on the task at hand, to a new generation of digital design that keeps people out of the fundamental decisions of a job.
But when a tool tightly controls and packages how a person does a task, to the extent that its design disallows people from doing anything else, it ceases to be a tool. User experience has become a narrative powered by certain ideologies: the beliefs that users are stupid, that they lack patience, and that they don’t need to know anything about how their products function behind the interface. It has become a system that decides on users’ behalf, in which functions and aesthetics are secondary and users are relegated to mere operators who drive along given tracks towards a pre-decided destination. On the surface, design minimalism keeps things simple and prioritises what’s important; in essence, it pre-determines what is redundant and what is important before one ever gets one’s hands on the product. It prevents users from deciding how tools can be used. Indeed, the term “user” is much more passive in the age of digital systems than it was in the age of analogue tools.
Even today, people still assume that Apple has found the best and only answer for the industry: what kind of design is good, desirable and suitable for the market, and consequently, what the relationship between “tools” and human beings should be. Yet even as we grow increasingly bored by this solution and suffocated by an all-enclosed digital life, none of it should be “inevitable”. As we unpack the ideologies and logic that have led digital design to its current state, we should question every point to find an alternative. It has been 20 years since Apple turned its wheel with the release of the iPod. We should have moved on. We really need to move on.
Editor’s Note on the artworks by Zhong Wei:
Zhong Wei’s artworks and exhibition images accompany Yin Aiwen’s retrospective analysis of minimalism as an ideology in technology design, to surface the irony of converging approaches and aesthetics in a domain highly regarded for its potential for multitudes of expression. Zhong Wei’s digitally composed paintings feature an effect he terms “coupling”: the melding and layering of disparate references across generations, environments and subcultures. As we begin to imagine what could be, his works provide a tenable vantage point: a technology landscape that is heterogeneous, revealing and evocative, no matter how frenetic its contents might be.
The dot-com boom (or bubble) was a stock-market bubble caused by intense speculation around Internet companies in the 1990s, the period in which Internet growth first began to flourish.
“Think Different” was an advertising slogan used by Apple between 1997 and 2002. It was considered a provocative response to “Think”, the slogan of the then IT magnate IBM.
Direct quotes from the film Objectified (2009), directed by Gary Hustwit.
These three adjectives were often used by Steve Jobs to describe Apple products at product launch events. The phrase later became an unofficial design motto for consumer-oriented IT companies.
Don Norman is an American researcher, professor, and author who advocated for “user-centred design” with his book The Design of Everyday Things. He also coined the term “user experience design” when he joined Apple as its first User Experience Architect. He is widely recognised for his expertise in the fields of design, usability engineering, and cognitive science.
B. Schwartz, The Paradox of Choice: Why More is Less, Harper Perennial, 2004
Y2K refers to the year 2000, and often refers to popular culture during the late 90s and early 2000s. Named after the Y2K Bug, the period’s culture is characterised by a distinct aesthetic spanning fashion, hardware design, music, and furnishings gleaming with tech optimism.
More on this in Chapter 5
According to sociologist Mark Granovetter, strong ties exist between close-knit people with frequent interactions, such as family and close friends. By contrast, weak ties are typified by distant social relationships and infrequent interactions, as commonly observed between acquaintances or strangers. Strong and weak ties have distinctive effects on the spread of information in social networks. For more information, see the author’s paper, “The Strength of Weak Ties” (1973).
Twitter, Medium, Instagram and TikTok are all relatively recent social media/blogging platforms, launching from the mid-2000s onward.
Generation Z (also known as “zoomers”) refers to the generation succeeding Millennials and preceding Generation Alpha. They are generally understood to have been born after the mid-1990s, with the cut-off for the generation’s birth years falling in the early 2010s.
The first generation to be born entirely in the 21st Century.
The generational term “boomers” references the baby boom, the period immediately following the Second World War. The range of birth years for boomers generally runs from 1946 to 1964.
Brenda Laurel; S Joy Mountford, The Art of human-computer interface design, Addison-Wesley, 1990