Leave Me Alone

Rebecca Edwards

Software For Less mimics the aesthetics of a corporate tech trade show, borrowing the architectural motifs of that arena of selling, beta-testing, and branding. Pop-up truss structures have been configured to evoke large-scale exhibition stands, showing promotional videos, live programming, and other visuals. A central stage is set up for the promotion of a new social platform, and one-time-use banners display logos for software created to offer alternatives to the current prescribed use (and misuse) of social media platforms. The exhibition exposes processes via live interaction and generative works, leaving an interesting liminal space between the finished product and the beta. To think of Software For Less as a space for “product launching” allows it to speculate on the future of the user/creator relationship. It also makes space for the artist to exist as product-designer-cum-startup-founder, entrepreneur-cum-inventor. There is nothing fictional about the works in the exhibition, however - in fact, most already exist and have been used by thousands of users who share Ben Grosser’s vision of developing a critical position towards platform use and of adopting an analytical stance from which a user can reflexively understand the ulterior motives of big tech and make more informed decisions about how platforms infiltrate their routine.

Platforms surround our lives more than ever as we constantly switch and move between them twenty-four hours a day. From the moment we wake up, we’re tuned in: checking our emails and reading the latest news headlines; counting the various metrics afforded to us by friends and unknown digital acquaintances on Instagram, Facebook, or TikTok; ordering breakfast, lunch, and dinner from dark kitchens; and streaming whatever is pushed to us by Netflix, Prime Video or Hulu. Platforms are at the core of our digital engagement, and they want our attention.

To understand why these platforms want our attention, it might help to understand exactly what a platform is. In his 2017 book Platform Capitalism, Nick Srnicek developed a typology of platforms, distinguishing four main types: the advertising platform (Google, Facebook, Instagram and Snapchat, whose business model is based on selling data to advertisers); the cloud platform (Amazon Web Services or Google Cloud, which rent out the software and hardware necessary to run a modern business); the product platform (which rents out goods as services, like car or clothes rental); and the lean platform (Uber or Airbnb, which seek to connect buyers and sellers of a service while maintaining a minimum of assets) [1].

Judging by the most prominent articles returned by a quick web search, it would seem that our understanding of the way platforms behave and fight for our attention has left us in a state of crisis. Headlines like “There’s a war for your attention. And you’re probably losing it” [2], “Our Minds Have Been Hijacked by Our Phones” [3], or even “Your attention is the hottest currency on the Internet” [4] are provocative, relying on strong verbs and quippy phrasing to instil a sense of fear, urgency, and distrust between ourselves and our smart devices. Whilst the articles may be true (to a degree), there’s often a heightened sense of fear evoked in opinion pieces that maintains the idea that tech = bad, human = good; the joining factor between the two is data, which one side (the human) generates and the other (the tech) transforms into capital and power. The need for people of all kinds to come together and defeat rogue algorithms seems pertinent in the era of post-truth, where reclaiming what is rightfully yours (data, race, sexuality, gender, the reduction of pay gaps…) is actively encouraged.

The way that social media, the internet, video platforms and other forms of communication vie for our attention is often framed through the theory of the attention economy. The attention economy is built on the premise of creating a marketplace where consumers are happy because they are shown relevant information, or information they want to see. Newsfeeds, images, status updates, and anything else posted to platforms are fed in an endless, seemingly random loop to suspecting and unsuspecting viewers day and night. The algorithm’s job is to make sure you see more of what you like, or think you like, and less of what you don’t, in the hope of keeping you within the platform and within reach of marketing and direct advertising - in other words, money.

However, this is not new territory. The concept of attention economics was first theorised by the psychologist and economist Herbert A. Simon, who wrote about the scarcity of attention in an information-rich world in the early 1970s: “[I]n an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention…” [5]. He noted that many designers of information systems incorrectly represented their design problem as information scarcity rather than attention scarcity, and as a result they built systems that excelled at providing more and more information, when what was actually needed were systems that excelled at filtering out the unimportant or irrelevant.

There’s been a recent shift in the way we understand the attention economy. In September 2020 alone, viewers watched 1.6 billion hours of Twitch streams, there were 4 billion views of YouTube content about just one video game – Among Us – and TikTok users spent on average 45 minutes a day on the platform. Research from Accenture shows a 22% increase in the consumption of streaming and gaming content compared with 2019. Of course, this was primarily due to worldwide lockdowns forcing entertainment indoors, but these numbers have been rising steadily for the last decade, with annual increases in the number of people with access to smartphones and gaming devices [6].

For newer platforms like TikTok and older ones like YouTube, success is driven by the size of audiences. Those who can persistently attract huge numbers of users with an audience-first mindset will continue to thrive [7]. In a whole other realm of attention-vying are the multi-purpose messaging, social media, and mobile payment apps, like the Chinese app WeChat. These apps measure success primarily on conversion rate - they keep users inside by providing everything they might need to communicate, shop, play, and watch, acting as a one-stop shop for all a user’s entertainment needs. But what does an audience-first mindset look like in real terms? Putting the audience first may sound philanthropic, but the agendas of big tech companies are only surface-deep and not so philanthropic after all. The user is being used; if everything is free, you are the product. Buzzwords like tactics, feedback, and strategy are bandied about in the world of e-commerce, media marketing, and platform capitalism, as best-practice mantras of “know your consumer” and “monitor your users” are repeated without question. Yet there is no genuine strategy of care for the wellbeing of the consumer, for the way someone might feel about being “targeted”, or for the fact that their behaviour is being monitored for monetary gain. The real person behind the screen is regularly neglected, replaced with metrics and analytics.

According to Charles Arthur in his 2021 book Social Warming, all social networks live by three rules. Rule number one is to get as many users as you can. Rule number two is to keep those users’ attention. Rule number three is to monetise that attention as much as you can. He says, “if you do any one without the other two, you will have minimal success. Execute two well, and you might prosper. Do all three at once and you can own the world.” [8] In the case of social platforms, these rules are predominantly executed by algorithms, and all three work in tandem to generate the most users, the most time spent on the platform, and the most money.

One fundamental change in how social networks keep our attention within the platform was the development of a system that revolutionised the way we interact with posts. In 2006, a small but not insignificant piece of software called EdgeRank was developed. It labelled each piece of content on Facebook (post, video, group, update) as an “object”. These objects were ranked and given a score depending on values such as your relationship with the poster, the type of content, how old the object was, how you had engaged with similar objects in the past, who else had interacted with the object, and so on. The outcome was consumption that was less chronological and more aligned with what you - and the peer groups around you - were interested in. EdgeRank has since evolved, but it paved the way for how users of all kinds of social media platforms now consume content. In early 2021 Facebook introduced machine learning to help power the News Feed ranking algorithm, helping to create, in their words, “a valuable experience for people at previously unimaginable scale and speed.” [9]
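As a rough illustration of that logic - a hedged sketch only, using invented names and the publicly reported affinity/weight/time-decay formulation rather than Facebook’s actual code - an EdgeRank-style score might look something like this:

```typescript
// Illustrative EdgeRank-style scoring sketch (not Facebook's code).
// Each "object" (post, video, update) accumulates a score from its "edges":
// likes, comments, shares and other interactions around it.

interface Edge {
  affinity: number;  // how close the viewer is to whoever created the edge (0–1)
  weight: number;    // how heavily this edge type counts (e.g. comment > like)
  ageHours: number;  // how long ago the interaction happened
}

// Newer interactions count for more: a simple decay over time.
const timeDecay = (ageHours: number): number => 1 / (1 + ageHours / 24);

// An object's rank is the sum of its weighted, decayed edges.
function edgeRankScore(edges: Edge[]): number {
  return edges.reduce(
    (score, e) => score + e.affinity * e.weight * timeDecay(e.ageHours),
    0,
  );
}

// A post a close friend commented on an hour ago outranks one a stranger
// liked two days ago - the feed is ordered by score, not by the clock.
const friendly = edgeRankScore([{ affinity: 0.9, weight: 2, ageHours: 1 }]);
const distant = edgeRankScore([{ affinity: 0.2, weight: 1, ageHours: 48 }]);
console.log(friendly > distant); // true
```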

One problem with having what you think you want to see placed right in front of you is how little labour is required to satisfy the desire. This in turn makes us lazy consumers, doomscrollers, receptacles for information and content no matter how relevant or irrelevant it might be, no matter how important or unimportant it is to our day. The addictive nature of this cycle makes it difficult to break, keeping us within the platforms exactly as they intended. Boycotting Facebook or Instagram will only partly solve this problem, and even then it will only help at an individual level. With smart objects and the Internet of Things forever looming over us, the question of how to decrease consumption and eliminate addiction is less about algorithms changing, big tech admitting its faults, or better regulation of content and its effects on mental health, and more about how we learn to live without in a world where we’re constantly told to live within.

To bring this back to Software For Less, Ben Grosser invites us to rethink our relationship with the platforms we engage with and the way they engage with us. Questioning why software is the way it is are the works ORDER OF MAGNITUDE and Deficit of Less. These twin works, synced for the exhibition, present every instance in which Facebook founder Mark Zuckerberg has ever said the word “more” - conveying Silicon Valley’s obsession with growth - and every instance in which he has uttered “less”. The website artworks Get More and Get Less prompt users to reload the page, increasing or decreasing the number on the screen by one each time. All four works play on Silicon Valley’s “desire for more” and ask visitors to think about their inclinations towards addition versus subtraction, and where (and why) those inclinations arise.

Endless Doomscroller, Platform Sweet Talk and Creative Just Like Me examine how software platforms work to manipulate users. Endless Doomscroller focuses on platforms’ use of the infinite scroll and how it plays on our curiosity and fear of missing out. Platform Sweet Talk addresses how platforms barrage us with personalised notifications and intriguing updates on our relationships, often concealed until you click through the notification pop-up. Creative Just Like Me (coming later to the exhibition) playfully engages with the way platforms like TikTok craft content and encourage users to produce videos that “duet” with other users, ironically producing homogeneity and conformity despite the platform’s mission statement to “inspire creativity” [10].

Recognising the ways in which users might want to reclaim some agency, some artworks assist users in their fight against platform manipulation algorithms. These include Go Rando and Not For You, which both use obfuscation techniques to disguise a user’s true emotions and confuse sentiment analysis.

At the centre of the exhibition is a new work, Minus, part of a set of works created as an antithesis to the platforms we currently engage with, all of which foreground less rather than more. Minus gives users only 100 posts for life. With a pared-down design, minimal interactive features, no ads, no colour, and no metrics - except a single visible count of how many posts a user has left, a metric that counts down - the emphasis is on quality over quantity, on genuine engagement and focused attention rather than a rollercoaster of enforced behaviour.

Other works in the exhibition include Safebook, which is Facebook without the content: a browser extension that hides all images, text, video, and audio on the site. There’s also Tokenize This, which generates a unique digital object that can only be viewed once, a direct proposal for resistance against the NFT boom. And then there are the Facebook Demetricator and Twitter Demetricator, browser plug-ins that hide all the metric data usually displayed to users [11].
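To give a sense of the mechanics - and only as a hypothetical sketch with invented selector names, not Grosser’s actual implementation - a demetricating plug-in can be imagined as a content script that blanks whichever page elements display counts:

```typescript
// Hypothetical content-script sketch of demetrication (not Grosser's code).
// Hide, rather than delete, the elements that display counts, so the page
// layout stays intact while the numbers disappear. Selectors are invented.
const METRIC_SELECTORS = [
  '[data-testid="like-count"]',
  '[data-testid="view-count"]',
  '[data-testid="follower-count"]',
];

function hideMetrics(root: ParentNode = document): void {
  for (const selector of METRIC_SELECTORS) {
    root.querySelectorAll<HTMLElement>(selector).forEach((el) => {
      el.style.visibility = 'hidden'; // the post remains; the metric does not
    });
  }
}

// Feeds are rebuilt constantly, so re-apply whenever new posts are injected.
new MutationObserver(() => hideMetrics()).observe(document.body, {
  childList: true,
  subtree: true,
});
hideMetrics();
```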

These kinds of tweaks facilitated by Ben Grosser’s work might not seem so significant, but their disruptions of prescribed sociality and its associated metrics create a crucial opening. Simply put, Grosser’s works (or tools?) allow users to better question and understand why they’ve become so dependent on quantification, and to ask: “who benefits most from a system that incessantly quantifies our public interactions online?” [12]


Footnotes:

1. Srnicek, N. (2017). Platform Capitalism. Cambridge; Malden: Polity.

2. Illing, S. (2016). There’s a war for your attention. And you’re probably losing it. [online] Vox. Available at: https://www.vox.com/conversations/2016/11/17/13477142/facebook-twitter-social-media-attention-merchants [Accessed 1 Aug. 2021].

3. Thompson, N. (2017). Our Minds Have Been Hijacked by Our Phones. Tristan Harris Wants to Rescue Them. [online] Wired. Available at: https://www.wired.com/story/our-minds-have-been-hijacked-by-our-phones-tristan-harris-wants-to-rescue-them/.

4. Kessler, D. (2018). Your attention is the hottest currency on the Internet | The Mozilla Blog. [online] blog.mozilla.org. Available at: https://blog.mozilla.org/en/products/firefox/katharina-nocun-social-media-networks/ [Accessed 1 Aug. 2021].

5. Greenberger, M. (ed.) (1971). Computers, Communications, and the Public Interest. Baltimore: Johns Hopkins University Press, p.40.

6. (2021 figures show 82.9 percent of the population in the UK has a smartphone, and 85 percent of people in the U.S.)

7. Murdoch, R. (2021). World! Can I have your attention please? [online] www.accenture.com. Available at: https://www.accenture.com/gb-en/insights/software-platforms/winning-the-new-attention-economy [Accessed 1 Aug. 2021].

8. Arthur, C. (2021). Social Warming: The Dangerous and Polarising Effects of Social Media. London: Oneworld Publications, pp.43–46.

9. Facebook Engineering. (2021). News Feed ranking, powered by machine learning. [online] Available at: https://engineering.fb.com/2021/01/26/ml-applications/news-feed-ranking/.

10. TikTok (2018). About | TikTok - Real Short Videos. [online] Tiktok.com. Available at: https://www.tiktok.com/about?lang=en.

11. Interestingly, earlier this year Instagram added a setting that allows users to hide the number of likes and views their posts receive: instead of “Liked by @person and 12 others”, if a user decides to hide public counts, posts simply read “Liked by @person and others.” Even more interesting is the fact that Ben was 10 years early to the demetricator party, and that Facebook had come after him for his efforts back in 2012.

12. Inspired by many conversations with Ben, and paraphrased from texts written by Ben Grosser.

Rebecca Edwards is the curator at arebyte Gallery.