Out with the Old
Introductory Questions (Xavier)
Written by: Xavier Dickason, Shaurya Chandravanshi, and Jutin Rellin
Proofread by: Avan Fata
- What makes something new?
- Welcome to the Social Studies curriculum! Of course, as it seems to be tradition now, we will be starting with a dictionary definition of what is “new.” According to the Oxford English Dictionary, something new is something that has been “produced, introduced, or discovered recently or now for the first time.” Basically, something that has been recently produced for the first time, or discovered by you for the first time. Immediately, we come across an issue: what is recent? In terms of technology, something that has been recently produced may have been made yesterday, a month ago or a year ago, depending on what it is. Would you still consider an iPhone X new despite the fact it isn’t the newest iPhone? I believe, in the end, what we consider new in terms of being produced for the first time is highly dependent on the industry that we are discussing. In fashion, new may mean every month or every season, whilst in technology it might mean every year, around a big release. In the end, the newness of an object is determined by how recently it came out, and how that compares to other things in the same industry.
However, things will always appear new if you are discovering them for the first time. If you have never seen a smartphone and you are given an iPhone 4S, that would be new to you. Even experiences can be new, despite the fact they’ve been around for a long time.
- Who decides when something new becomes old?
- There are really two groups of people that decide: the public and the “manufacturers” of the object / experience. Firstly, members of the public may collectively decide that something is no longer new or trendy, either because it has been out a long time or because a new product has come out to replace it. This is not a conscious or organised decision; it simply happens. The second scenario, where a new product comes out, is also heavily influenced by the manufacturer of a particular item. Apple get to decide when the iPhone 11 will be considered “old” by releasing a new iPhone. Competitors can also make a new product old by making its features obsolete through the release of an updated and better product.
- Do products age in the same way that people do? Do ideas?
- Products have a much shorter lifespan than people. Most products are not manufactured with the purpose of lasting a lifetime, as companies want consumers to keep coming back and buying more things. Because of this, some products will age at a rapid rate in comparison to humans. Yes, it is true that well-made products will last longer than a few years, but even these will not last a human lifetime.
Ideas are a much more interesting thing to discuss. Firstly, how do we define an idea ageing? Do we think about how it eventually becomes obsolete? Can an idea even die? Let’s try and answer these questions. Firstly, an ageing idea can be thought about from the time of its conception. Most major ideas can be traced back to an individual, or to a time period if the idea is much older. Ideas can easily become obsolete, such as how Isaac Newton’s theory of gravity has been superseded by Einstein’s theory of general relativity. Because of this, I would say that ideas do not die; they are instead replaced with newer ideas, with the older ones becoming obsolete. Through this, they remain around, simply as obsolete ideas. The same could really be said about any product, but it relates most directly to ideas.
- When you were younger, did you have any hobbies or interests that you no longer have now? Why did you grow out of them?
- Of course. Everyone has interests that they grow out of, and I am no different. Two which immediately come to mind are Yu-Gi-Oh cards and Rubik’s Cubes. Despite the fact that these two hobbies were many years apart (one when I was 6–7 and one when I was 12–14), they both finished for the same reason: money. It turns out that collectible card games require a surprising amount of monetary investment if you wish to become good, as you need to buy good cards. A similar thing is true with Rubik’s Cubes, as the best cubes can cost up to US$80! This monetary investment, combined with the time investment of actually practising these hobbies, resulted in me eventually losing interest and moving on to other things. Interests can be slightly different as, thanks to the internet, if you’re interested in something, you have all the information that you need at your fingertips. However, most of my interests eventually disappeared because I grew out of them (dinosaurs can only be cool for so long), though some did remain. And often we grow out of them not because we lose interest, but because we are embarrassed by having that interest.
- Did you ever become interested in something only to hear your parents say, “it’s just a phase”? Was it just a phase?
- I am lucky to have escaped this accursed phrase, but I can understand why it would easily discourage people from pursuing their hobbies. However, I would say that many of my hobbies truly were only phases. Looking at the above examples, I only really used Rubik’s Cubes for two years and now hardly touch them. You could almost define a life up to 18 as “phases of hobbies.” The issue is not the fact that these were phases; the issue is the stigma surrounding the word, the implication that the hobby was useless. That’s the sort of thing that we need to eliminate.
- How many times do you have to listen to a song you like before you get tired of it? What about for rereading a good book, or watching a favorite movie?
- Songs, movies and books are three very different things when you consume them over and over again. Let’s start with music. Dr. Michael Bonshor (an expert in the psychology of music) says that how many times you are willing to listen to a song before you tire of it is dependent on two factors: how many times it is played and how complex the song is. More complex songs, which may be longer or contain more musical elements and variation throughout, will still be exciting to our brain on the 17th listen, because we’re still picking up interesting information about the song. I’d say that if I find a song that I really like, I can easily play it at least 5 times in a row, and the really good songs can go up to 20. However, I know that I do not represent all scholars here, so consider this yourself next time you put a song on repeat on Spotify.
Rereading books is quite different, as we generally reread them because we’ve developed an attachment to a character. We consume books in a very different way to songs because we space out the rereadings. It takes a significant amount of effort to read a book, and generally we can’t get through them in a few minutes. Because of this, we reread our favourite books out of nostalgia, and to ensure that we know EVERYTHING about that book.
Finally, we come to movies, which are in a very similar boat to books. We rewatch movies to pick out the fine details within them and to truly understand everything about each character. One fan even watched Avengers: Endgame more than 100 times whilst it was still in theatres! In short, we can reread books and rewatch films many more times than we can listen to songs because they are much longer and much more complex, allowing us to analyse something different every time we revisit them.
- Can you think of something you used to say a lot that you don’t say anymore?
- This took a little bit of thinking, but I eventually realised that I no longer quote Ben 10 and Harry Potter at every available opportunity. Our vocabularies and regular mannerisms definitely change as we grow up, as we learn new words and realise that some of the phrases we say are a little bit embarrassing. Another way to interpret this question is in terms of repeating facts. When we were younger, we might have regularly repeated something that wasn’t true, simply because we heard it and immediately believed it. The final way that I think this question could be interpreted is words that we used when we were younger. I can say that words such as “loser” would fly around the classroom much more regularly during primary school than in high school, and I think we also outgrew (most of) our toilet humour.
- Can you think of a word or phrase that used to mean something other than what it means now?
- Words are constantly evolving in meaning, and it only takes one person to use a word in a particular way before it goes viral and completely changes in meaning. However, it can actually be slightly hard to think of these “changed words,” as they’ve become such a normal part of our lives that we don’t consider them to have had a different meaning. One word that comes to mind is “toxic,” which can describe either something poisonous when ingested or a very negative, hateful person; both meanings are still in use. After more digging, you can easily find lots of words which have different meanings now than in the past, such as nice (which used to mean foolish), myriad (which represented exactly 10,000), clue (a ball of yarn) and meat (which used to simply mean any food!). These words have all changed over much longer periods of time than “toxic,” but they do show how the English language is constantly evolving, as are other languages!
- Have you ever stopped (or started) doing something because everyone else stopped (or started) doing it too?
- Time to talk about peer pressure, one of the most overdone topics at school yet one that is particularly relevant to our lives. Of course I’ve begun to do things or stopped doing things because other people were / weren’t doing them. You simply have to look at some of the fads which we’ve had throughout the past decade to easily see what you’ve got caught up in and then dumped when everyone else agreed it was past its peak (cough Pokemon Go cough).
- Do your parents like to listen to the same music as you do?
- Yes and no. Often, I’ve listened to music that my parents have introduced to me, but for the most part, I listen to different music from my parents. This is partially a generational gap, as we generally listen to music that’s popular whilst we are young and then continue listening to the same music later in our lives (as we will find out later in the curriculum!). I, as a teenager, am still trying to find the music that I enjoy, but my mum and dad have both found the music that they enjoy and are listening to it. Also, do we really want our parents to listen to the same music that we do? If this happened, it would feel like our parents were simply listening to this music in an attempt to be “hip” and “cool,” and not because they truly enjoyed the music.
- How often do you buy new clothes? How about a new phone?
- I do not buy clothes very often, but that’s because I dread shopping with a burning passion. Besides adding to my T-shirt collection whilst I’m at WSC Global Rounds, I probably only buy new clothes two or three times a year, apart from when I realise I need certain things for drama performances and then run around to get them. The stereotype is that teenagers are always buying clothes, but that’s definitely changing. Teenagers are becoming increasingly aware of climate change and the environmental impact of buying clothes, and are therefore buying fewer clothes or shopping at thrift shops (renewing the clothes, some would say).
Phones are not bought quite as often, despite what many believe. Collected data shows that people wait an average of 2.87 years before upgrading their phone. This means that if you bought an iPhone 7, you’d probably be upgrading to an iPhone X. However, I only ended up replacing my phone once it regularly crashed upon opening Chrome (after 5 years). I can completely understand why some people replace their phones more often than I do, whether to keep up the image that they’re wealthy or to ensure that they have all of the latest features on their phone.
- Some people are drawn to new experiences—whether that means skydiving, visiting Krasnodar, or joining an academic competition with overstuffed outlines and understuffed alpacas. Consider recent research into the benefits (and sources) of novelty-seeking behavior. Discuss with your team: is our world today so full of novelties that it can be difficult for novelty-seekers to ever feel at peace—or is this a uniquely privileged time in human history that novelty-seekers should treasure?
- Unfortunately, thanks to my lack of a New York Times subscription, I’m unable to read the above article. However, I can certainly discuss this. Novelty-seeking, also known as neophilia, is where someone is constantly craving something new. It’s believed that neophiliacs, as they are called, are fuelled by the larger-than-normal dopamine rush that they receive when they do something new. Neophilia is also affected by age, with older people being less likely to be novelty-seekers. Generally, there are four scales you can use to determine how much of a novelty-seeker you are: Exploratory Excitability, Impulsiveness, Extravagance and Disorderliness. Though I’ve been unable to locate too many benefits of novelty-seeking (curse you, paid subscription services!), one agreed-upon benefit is that novelty-seekers are easily able to adapt to change, as their lives are constantly changing.
Novelty-seekers don’t necessarily feel a need to do absolutely everything in the world; they simply crave new experiences. It is less a question of feeling ‘at peace’: most novelty-seekers recognise that they can’t do everything in their lifetime. Because of this, I would say this is an incredibly privileged time in human history, as the wide breadth of experiences on offer, and the greater ease than ever of having them, makes this a novelty-seeker’s dream.
- When a person buys and starts to wear a new shirt, she might then start shopping for a matching scarf and a nice motorcycle to go with it. The Diderot Effect refers to this common pattern of behavior, in which one new purchase leads to a series of related purchases—or even into an obsessive interest in diving into something new, such as sneakers or smoothies. Discuss with your team: if you were a business, could you take advantage of this effect? In the long run, does it help people, or is it something that should be discouraged?
- The Diderot Effect: something which most scholars will never have heard of, but will immediately understand. The Diderot Effect is where you obtain a new item or possession and suddenly feel an obsession to buy things that are very similar, such as buying a new pair of nice running shoes and suddenly feeling as if you should buy a treadmill, a gym membership, a healthy eating guide and a set of workout clothes, despite the fact that you don’t actually go running that much. Basically, it results in our purchases spiralling out of control, down a rabbit hole of needing new stuff. One common cause is that once you have one new thing that’s so much nicer than your old things, it feels as if you need to replace everything so that it is all nice and new.
As a business, utilising the Diderot Effect is crucial. The more stuff you have people buying in one visit, the more money you make. One of the most effective and easiest-to-implement strategies that utilises the Diderot Effect is something we’re already seeing on websites such as Amazon. Of course, I am talking about “frequently bought together” and “other shoppers also bought.” This advertising makes us think “if other people bought this, then maybe I should as well,” and before we know it, we’ve ordered an extra nine avocado savers “just in case.” Another way to utilise the effect is through targeted emails: analysing customers’ purchases with an AI programme and then sending out an email recommending similar items. In the long run, the Diderot Effect disadvantages the individual, as it results in you spending money on things that you do not need right now, when you could be spending that money on something much more useful.
- Suppose you noticed all the other scholars at your round drinking unsweetened Japanese green tea, so you bought some too. Your choice might be explained by the bandwagon effect—the theory that we are driven to do what we see others doing. Discuss with your team: does the bandwagon effect help create social harmony—or should people try to resist it?
- The bandwagon effect is a cognitive bias where we’re more likely to adopt a particular behaviour if we feel that many other people are also exhibiting that behaviour. For example, if lots of people that we know begin listening to a particular musician and really enjoy their music, we feel obligated to listen to that artist and enjoy their music as well. The bandwagon effect is based around conformity, as we all want to be “right” and to fit in. Because of this, it can have many negative effects, such as those who are exposed to the anti-vaccination (anti-vax) movement being less likely to have their kids vaccinated. However, we must remember all of the positive impacts of the bandwagon effect, such as how it discourages behaviours that are considered “bad” by the majority of society, such as smoking, and encourages “good” behaviours, such as eating healthily. At the end of the day, we should be aware of the bandwagon effect, but not go against it in most situations. Awareness of its existence is what is truly crucial.
- In a 2014 working paper, Dr. Erkan Goren hypothesized that groups of people with a gene linked to novelty-seeking would tend to move around more—making them less likely to settle down and form strong states. His hypothesis is controversial, as research in “biogeography” is only a step away from sweeping generalizations about race and culture. Discuss with your team: if certain populations were more inclined to go looking for new things, would businesses want to market products differently (or to market different products) to them? And, if a gene controls some aspect of novelty-seeking behavior, should we find ways to reduce that gene’s impact—or to increase it?
- Though I have not fully read this paper, the gist of it is that a novelty-seeking gene exists which makes some groups more likely to keep moving around and less likely to settle down. For example, a nomadic group would be more likely to have this novelty-seeking gene, as they wish to continue experiencing different things every day rather than settling down. It’s certainly true that it would be useful for a business to know which populations had this novelty-seeking gene, as they would be able to market to them in a different manner. For example, items could easily be advertised to those with the gene as “New and Improved!”, or with any other term which makes them seem new and worth using. If this gene does control novelty-seeking, then I do not see why we need to reduce the gene’s impact. Once again, awareness is the crucial thing. If you are aware that you have this gene, then it will not drastically affect you.
- When people first join a new community, for a while they are happy to learn all its special lingo—but research has shown that eventually they stop adopting new words, and the community evolves past them. Someone who first joined the World Scholar’s Cup in 2013 might have happily learned the word “pwaa” but might be reluctant to start using the term “lollipop”—which entered the lexicon in 2016—for not winning a debate. Because their study shows that people are open to new words for the first third of their time in a new community, the researchers imply that you could predict how long someone will stay in a community by measuring how soon they stop using new words. Discuss with your team: would it be helpful to know how long someone will be part of something—and, to keep them around, should groups make a special effort to keep them learning new words for longer?
- The main use of this research is, of course, predicting how long someone is expected to remain within a community. It would certainly be helpful to know how long someone will be in a community, as you can start figuring out when to recruit new individuals as others leave (WSC can start planning ToC sizes much earlier!). However, there is little point in putting extra effort into keeping people learning new words. Not learning new words isn’t what causes a person to leave; it is merely a symptom, and treating it doesn’t cure the main problem: ageing out of WSC.
- In mid-2017, fidget spinners were suddenly everywhere; entire classrooms looked like they were trying to generate wind power. Then, just as suddenly, they disappeared, and the world went on to talk about other things. Similar spikes in popularity occurred with Pokémon Go in 2016, the Dress Debate in 2014, “Gangnam Style” dancing in 2013… the list goes on. Discuss with your team: are people aware when they are part of a fad, or do they only realize it afterwards? What causes fads to fade? Consider this list of fads from the 2010s and count how many of them you were part of, from flossing to the ice bucket challenge, as well as how many you weren’t aware existed.
- Fads: what a wonderful thing. The question of whether people are aware of their own involvement in a fad is an interesting one, and I think we can answer it by thinking about the definition of a fad: “an intense and widely shared enthusiasm for something, especially one that is short-lived.” An individual is generally unable to tell whether a certain activity will be short-lived or not, and therefore whether it is a fad. For example, when Pokemon Go came out, it was hitting 28.5 million daily players within the U.S. Six months later, that was down to 5 million. At the time, no one could have known whether Pokemon Go would drastically decrease in popularity, and most would not have been able to determine that it was a fad. The main cause of fading fads starts with how fads begin in the first place. When it feels like everyone else is doing something, you wish to do it as well. However, when the novelty fades away and you realise that you don’t truly enjoy that specific activity, you might stop doing it. From the above list of fads, I had participated in a whole six. Some of these I am not so proud of, but I am proud to forever stand by avocado toast.
- How do you do, fellow kids? At a summer 2016 campaign rally, Hillary Clinton beseeched younger voters to “Pokémon Go to the polls”. Discuss with your team: why did so many people consider her call to action “cringey”—and did their reaction suggest that the Pokémon Go fad was past its peak? At what point does referencing a fad become cringey—or is it less about when it is used and more about who uses it? Is it cringey of us to reference memes in this outline?
- One of the most “yikes” moments of Hillary Clinton’s campaign (second only to her attempt at using Snapchat) was “Pokemon Go to the polls.” What made this so cringey is that the Pokemon Go fad was so mainstream that a presidential candidate, who likely had no knowledge of Pokemon, had picked up an incredibly limited amount about it. That limited amount was enough to capitalise upon and make the joke, but the joke made no sense within the context of Pokemon Go, making it cringey to all those who played it. This does suggest that Pokemon Go was at its peak, given how mainstream it had become. It only had one way to go from there (hint: down).
It’s not so much when the reference to a fad is made that makes it cringey, but who is making it. If someone references something they have never personally used and know little about, purely because of how big a fad it was, it comes across as cringey. The main reason why WSC using memes is not considered cringey is that many of their staff, I would say most, have genuine knowledge of memes, and can therefore use them in a non-cringey way.
- The word on the street is that memes, like fads, are spreading and fading more quickly than in the past—in fact, an informal study in 2018 put the average lifespan of a meme at 4.017 months. Discuss with your team: if this trend is real, what do you think could be the causes of reduced meme longevity? Are there ways to keep memes alive for longer, and if so, should we pursue them?
- It is a common joke within the meme community that every meme “dies” very quickly, creating a dead meme. If, however, the average lifespan of a meme is roughly 4 months, that’s actually much higher than expected! The main cause of reduced meme longevity is that memes are a continually evolving creative outlet. Those who create memes are always wondering “what else could I do? What else would be funny?” Because of this, we end up with a cycle of memes being created, then edited, modified and altered in ways that people find interesting and entertaining, leading to the next meme. Eventually, the previous meme becomes old news, and simply isn’t popular enough anymore. One reason why memes stay alive for 4 months rather than 1 month is how memes ‘migrate’ across platforms. A meme might be created by someone on Reddit, and when it starts to die down in popularity, might be picked up on Instagram, and then on Facebook. There really isn’t any way in particular to keep a meme alive for longer, and even if there were, it would not be worth pursuing, as part of the fun of memes is how they continually evolve and change, even if some of the changes feel as if they came out of nowhere.
- “If this trend is real…”—take a moment to consider the differences between a trend, a fad, and a meme. What causes each of them to come to an end—and does something new always replace them?
- Let us begin with the classic case of dictionary definitions. A trend is “a current style or preference,” a fad is “an intense and widely shared enthusiasm for something, especially one that is short-lived,” and a meme is “an element of a culture or a system of behaviour passed from one individual to another by imitation or other non-genetic means.” Though these definitions may seem complicated, let’s break them down one at a time.
Trends are simply something which is currently becoming popular. For example, a trend may be that wide-brimmed hats are in fashion at the moment. This trend may have increased rather slowly, and may remain trendy for a while before it stops being trendy and becomes less mainstream. The end of a trend is entirely swayed by public opinion, though people in high up positions within the particular industry may try and eliminate a trend by creating a new one. Trends are most commonly used in the areas of fashion and entertainment, and therefore are replaced when something stops being trendy with a new trend.
Fads are more specific than trends, and though a fad may become popular, it will likely fade away into obscurity almost immediately. In contrast to trends, a fad will skyrocket in popularity (like the Gangnam Style dance) and then fall away very soon after. Fads are powered into the mainstream by the energy that surrounds a particular thing, and so when this energy begins to fade away, because the fad is no longer emotionally resonant, it simply disappears. Fads are not quite as common as trends, and do not appear as often in specific industries. Because of this, though a new fad may appear, it is unlikely to be related to the old fad in any way whatsoever.
Finally, memes. The main difference between a meme and fads / trends is how they are consumed by those who are using them. Whilst people may simply follow a fad / trend, a meme will be modified and expanded upon, thanks to its adaptability. Now, there are many templates for memes online, and the best memes are generally those that can be applied to many, many situations. Memes seem to die due to a lack of perceived interest and due to the fact that on the internet ‘old’ things are considered not worth pursuing. Because of this, memes can have a much shorter lifespan than fads / trends. Memes are continually replaced, thanks to the factories that are meme makers. When one meme dies off, a new one will pop up to take its place.
- “Impossibly long hair has become the look for 2019,” claims one of a hundred websites that show the evolution of (mostly Western) hair styles over the years. Has it? Work with your team to research what causes hair styles to come and go. Were they more stable in the past, and are there regions of the world where they change less often? Do such lists—and the history that underpins them—inevitably demonstrate cultural bias?
- From my very limited knowledge of hair styles and research, it seems as if the main cause of hair styles changing within our modern society is trend chasing, particularly in an attempt to look like famous individuals. This is further emphasised by the fact that almost every one of the hair styles shown in these lists is accompanied by an incredibly famous individual from that year who had that particular hairstyle. Even when I looked at other websites, I found that though there was disagreement over what the 2019 look was, there was agreement that it was inspired by a famous musician or actor. When we look to the past, we have to split the idea of hairstyles into a few different areas. Within Europe, popular hairstyles were mainly dictated by the nobility and upper class, particularly kings and queens. For example, King Louis XIII of France’s decision to wear a powdered wig made it suddenly popular across Europe. It’s actually rather difficult to research other regions, as Wikipedia, and the internet as a whole, is rather Eurocentric. On the Wikipedia page discussing the history of hair, nearly everything was about Europe. I wasn’t actually able to find a good answer as to which regions of the world see hairstyles change less often. In the past, I believe that areas such as the Middle East, where hair has held deep religious significance, have had hair styles change less, but again, the incredibly Eurocentric media that I found made it difficult to find a perfect example. These lists definitely demonstrate cultural bias, as we would expect. The look for 2019 within the United States will be completely different from the look within China, for example. Yes, there is definitely a level of cultural bias within these forms of media, but we are unable to get around that. What we should try to do is remove the historical bias, so that only the geographical bias remains.
- Much like hairstyles (and fingernail styles), fashion trends and fads come and go—and sometimes people are glad to see them gone. Explore theories on why fashion trends tend to repeat over time. Discuss with your team: to what degree can the designers of clothing decide what people will want to buy? Is fashion a first-world problem—and is the term “first-world problem” itself a fad?
- In the above article, there are three main theories discussed surrounding fashion repeating itself. The first, and most common one, is the 20 year rule. This, pretty simply, is the idea that what is popular right now will be popular again in 20 years’ time (also known as “go raid your parents’ closet”). This is a widely popularised idea, and is generally based around T.V. shows like Stranger Things and That 70s Show influencing fashion by showcasing the past.
The second theory is the 50 year rule. This rule, created by fashion theorist and historian James Laver, describes an entire lifecycle, which can be found in the infographic in the slideshow below. If, however, you are unable to access the photo, the general idea is that one year before something is popular, it’s considered “daring,” one year after it’s considered “unstylish,” ten years later it’s considered “hideous,” but 50 years later, it will be considered quaint, and become fashionable once again.
Finally, we have the entertaining link between fashion and the global economy. In the 1920s, George Taylor developed the “Hemline Theory” after he observed that women generally wore shorter skirts during periods of economic prosperity to show off expensive items, such as silk stockings. In fact, when you graph the height of women’s skirts against the success of the global economy, you find a really strong correlation! The general idea behind this is that during times of recession, people are less likely to spend money on trendy items of clothing that may only be worn for a year, and instead prefer to buy items that have been reliably fashionable.
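If you're curious what "graphing skirt height against the economy" actually measures, it's the Pearson correlation coefficient between the two series. Here is a minimal sketch; note that the numbers below are entirely invented for illustration, not real hemline or economic data.

```python
# Sketch of the Hemline Theory's "strong correlation" claim.
# All data here is HYPOTHETICAL -- purely to show how the correlation is computed.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

# Hypothetical series: hemline height above the knee (cm) and an economic index,
# rising together into a boom and falling together into a recession.
hemline_cm = [2, 5, 8, 10, 7, 3, 1]
econ_index = [95, 102, 110, 118, 108, 97, 92]

r = pearson_r(hemline_cm, econ_index)
print(f"correlation: {r:.2f}")  # a value near +1 means the series rise and fall together
```

A coefficient near +1 is what "really strong correlation" means here; a real test of the theory would of course need historical hemline and GDP data.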
Of course designers do get to “decide” what people will buy by manipulating the fashion market and attempting to make certain colours or styles “in.” However, many people are not particularly involved in high fashion as a whole, and instead either buy whatever is currently the latest fad / trend (more commonly driven by celebrities rather than fashion designers) or simply ignore fashion altogether. Fashion does appear to be a first-world problem, but you can rather easily argue that it’s not. Fashion has been around in many societies and exists at multiple levels; it is not simply high fashion, and includes many other forms of clothing that the average person may wear (I myself am rather partial to a hoodie). Fashion has generally served the purpose of showing that you were in an economically stable situation, as you were able to afford the particular items of clothing that were considered fashionable. Yes, there are some forms of fashion that would be considered a first-world problem (how sad that you can’t afford the latest Gucci clothing), but generally, fashion serves more as an advertisement of your personality and of your current situation.
- Fashion has arguably become more democratic in the age of social media. Where many people used to look toward the upper class for styling cues—the concept of haute couture—now so-called fashion bloggers and influencers have a much more direct connection to millions of followers. Discuss with your team: have platforms like Pinterest and Instagram made it harder or easier to be fashionable?
- The main difference between now and the past regarding fashion is the diversity of what is considered “fashionable.” For example, in Victorian England, when Prince Albert died and Queen Victoria went into mourning, a sudden craze of wearing black swept across the nation and continued for a decent time, making it fashionable to wear black during that period simply due to that one person. Because of the diversity of individuals whom we now follow in pursuit of fashion, thanks to platforms such as Instagram, and the easy availability of the types of fashion that we like, thanks to Pinterest, it has become both easier and harder to be fashionable. If we wish to follow a specific type of fashion, such as Business Casual or Normcore, it’s easier than ever. However, if you wished to follow something similar to high fashion, or wished to buy super expensive brands (looking at you, Supreme, and your brick), then it’s more challenging, due to limited availability and incredibly high prices. Remaining relatively fashionable is easier than ever, and we always know what is in fashion thanks to Instagram and other apps.
- Consider the difference between the original release and the retro release of a sneaker model. Should retro releases of a mass-produced product be considered equivalent to the original product—and how might designers, collectors, and consumers answer this question differently?
- Retro releases of sneakers are re-releases of the same model, with subtly different changes. Nike, for example, replaced the logo on their Air Jordan III retro releases with a “Jumpman” logo. These retro releases should not be considered equal, as they often use different materials, are manufactured in different places and don’t have the rarity of the original. However, let’s see how each group of people might look at it. The above opinion is that of a collector, who sees the differences in each type of shoe. A designer would probably try and convince the public that it was equivalent to the original in an attempt to drive up sales, whilst a consumer would probably try and convince themselves that it was equivalent, in an attempt to avoid FOMO.
- Entire brands can phase in and out of popularity. Consider Champion and Fila: after years out of the limelight, they’re starting to make a comeback. What causes certain brands to come in and out of fashion? Is there a reason that their comeback is happening now instead of ten years ago or ten years in the future?
- Brands coming back into fashion is generally a combination of nostalgia, as we always want to relive those “glory days,” and the good old 20 year rule. These companies have also been planning their comebacks through collaborations with major brands (Supreme, Undefeated), advertising with famous individuals, and partnerships with companies such as Uniqlo. These partnerships take a long time to plan, and are a major reason why the comeback isn’t happening ten years ago or ten years in the future. There is also the matter of nostalgia, shown by the fact that Fila have recently launched their “retro” line. This line of previously released clothing capitalises on the nostalgia that people have for these brands, as well as drawing new individuals to those particular products.
- Popular in the 1990s and often referred to as “dad shoes”, sneakers that are chunky and outsized have recently seen a resurgence in popularity as part of the “ugly fashion” trend. This article suggests that one appeal of ugly fashion is that it offers consumers a break from keeping up with what “fashionable” is. Is ugly fashion a long-term antidote to an unsustainable problem in the fashion industry, or is it just another fad or trend?
- Ugly fashion means wearing things that stick out and would conventionally be considered ugly, with the main appeal being that you don’t have to be fashionable, yet you still make a statement (it’s pretty hard not to notice the person wearing huge, chunky shoes). The issue this attempts to solve within the fashion industry is how fast fashion is moving these days, and how the industry is constantly trying to grab our attention. This results in outlandish designs being created by those in the fashion industry in an attempt to make people notice a particular item. Ugly fashion, however, allows us to step back and ignore all of this crazy fashion whilst also defying a norm.
Personally, ugly fashion appears to be a trend rather than a long-term antidote. As it says in the above article, “The fashion world is moving faster than ever and, rather than seeing new styles emerging every decade or year, we're now seeing different trends pop up every month.” This fast fashion (as we will be seeing very soon!) cannot be cured simply with ugly fashion, as fashion will continue to accelerate into the future. It may stop the problem temporarily, but it will not act as a solution.
- “Out with the old” may sound good to those who do keep up with fashion, until you realize that the old is taking up a lot of space in landfills. Unsold clothing, which lives in a strange space between old and new, is sometimes even burned. On average, Americans throw away about 30 kg of their wardrobe per year. For a sense of what has made it possible for people to own more clothes and wear them less, read this eloquent prologue to the 2012 book Overdressed: the Shockingly High Cost of Cheap Fashion. In the almost decade since its publication, has online shopping changed how much clothing people buy, or how frequently? What are the “fast fashion” stores most popular in your community? Do you agree with the author’s claim that “fashion is obsolescence”?
- Online shopping has come in many forms, and we should probably go through them first. Of course, we now have websites such as Amazon and AliExpress which are entirely built around selling new things online, with clothes being just one part of it (6.3% of Amazon’s sales are either clothing or accessories). In addition, we have websites which allow us to sell and buy second-hand clothes, such as eBay. Finally, we have the largest section of all: stores which were previously just physical stores but have since expanded online. All of these forms of online shopping, and this easy availability of clothes, are definitely a factor in increased purchases of clothing. The most popular “fast fashion” store in my area is likely H&M, though Kmart is also incredibly popular.
The fashion industry as a whole definitely involves a process of obsolescence, as it is ever-changing, with products that were fashionable when you bought them suddenly no longer being fashionable. We have both the issue of what was fashionable one month no longer being fashionable the next, as well as the issue that many “fast fashion” products are only made to be worn a few times. In short, obsolescence is definitely present within the fashion industry, though those who work in it might disagree.
- More than ever, companies are trying to design sustainable clothing so that people don’t have to replace items as often. However, a recent study at the London College of Fashion suggests people don’t hold onto the clothes that last the longest—they hold onto those that mean the most to them. Discuss with your team: should companies invest in making more durable clothing, if consumers are likely to move on to new outfits anyway? Is the path toward more sustainable fashion encouraging people to have fewer clothes in their closet—perhaps in the form of so-called capsule wardrobes? Or should the focus be on more sustainable manufacturing practices?
- Companies should invest in durable clothing that is still fashionable, so that people don’t buy it simply because it’s durable, only to throw it out again when it goes out of fashion. If you are going to create durable items, you should create durable items that remain relatively fashionable and are constantly worn, such as blue jeans.
So-called capsule wardrobes are collections of clothing, generally with around 30 items. This includes everything from shoes to clothing to jewellery, and is relatively uncommon in our fashion-rich world. The other option is to create more sustainable manufacturing practices for clothing, decreasing CO2 emissions and using materials with a lesser impact on the environment. Obviously, we must have more sustainable business practices in the future, and it’s important that we steadily improve those practices, but with the ever looming threat of climate change, discouraging people from buying more clothing (and encouraging them to buy from sustainable companies) is likely the better solution right now. Both solutions will need to be combined in the future, however, for the fashion industry to become sustainable.
- Is rental clothing a way for people to experience novelty without overspending—or over-consuming? Discuss with your team: do you agree with its arguments for why fewer men than women rent clothing, or are there better explanations? How different is renting clothing from leasing a new phone for two years or borrowing a bicycle from a rideshare service for the afternoon?
- Rental clothing companies allow people to wear clothes that they like once or twice, for example at special events, without having to spend too much money on buying them. Generally, these clothes are rented for 2–3 days, though some companies offer extended rental programmes, allowing you to rent them for longer. One interesting quirk of the rental clothing industry is that almost no men rent clothing, except when they need a tuxedo. The NYT article above gathers many reasons why men don’t rent clothes from men with an interest in fashion, including a fear of herd mentality, the fact that men swap clothes less with their friends, the slower growth rate of the menswear market, less pressure and scrutiny around wearing the same thing twice, and the fact that the process of acquiring clothing is itself enjoyable to some. I think that all of these are great reasons, and though there are many more, I think the main one is that men (generally) have less interest in what would be considered “fashionable” than women. Though men always want to look their best, fewer are invested in the fashion industry as a whole. Hopefully that will change in the future.
The main difference between renting clothing and leasing a new phone or borrowing a bicycle is the idea of personal style. Whilst a phone or a bicycle is borrowed or rented for a simple purpose, when you rent clothing, you are often saying that you find it fashionable and would pay money towards it, but are unable to buy it yourself. It is less a question of “I need this” and more a question of “I want this due to societal pressures.”
- Notes from Jut: A prime example of this would be prom culture. Formal, Semi-Formal, Gala, or whatever your high school calls it, prom is but a mere excuse for high schoolers to dress up, because on what other occasion can it be socially acceptable to wear something so fancy? The boys often rent a well-fitting tux by default when they could just as easily wear something they’re more used to. The girls, on the other hand, take it a leap further by actually buying their gowns, which makes it more expensive. It’s strange to show up to events like these and not dress accordingly. The same sentiments are seen at weddings, not only for the groom and bride but also for their guests.
- “Out of sight” and “out of style” are often coupled with “out of mind”, but the past trends that influenced the clothing of today are indelibly woven into the fabrics in your closet; this clip from The Devil Wears Prada offers an interesting insight into the impact of those hidden histories. Is it worth understanding the trends behind an item of clothing before you buy it, or is it better to just shop more efficiently based on aesthetics alone? What were the trends that led to the clothes you are wearing right now?
- Okay, this question is something that I personally have quite strong opinions on. I loathe shopping most of the time, and am the type of person who likes to get in, find something that looks nice, and get out again, so I would be an advocate of efficiency in shopping. I can obviously see, however, why many would want to see the trends behind an item of clothing, just as I would be interested in all of the trends in Literature or a particular Video Game before I bought it.
Alright, let’s dissect some trends. Right now, the main items of clothing that I am wearing are a WSC T-shirt and a pair of track pants. The track pants can be traced back to sport, where they were first created in the 1920s by Emile Causset. These pants, knitted at the time, allowed athletes to stretch and run comfortably. I’m wearing a very casual pair of track pants, but there are many variations, including fashion pants, wind pants and tear-away pants. The T-shirt first began as an undergarment used in the U.S. Navy. As the Great Depression hit, it became a very useful piece of clothing, using little fabric but providing protection from the sun. The WSC logo on the T-shirt comes from the long trend of printed T-shirts, which have been common since the 1960s as a form of personal expression. WSC would likely have followed on from the trend of concert T-shirts, which individuals could buy to show they had attended a specific concert. Of course, my taste in fashion is very uninformed, and I’d recommend that you have a look at the trends that inspired some of the items of clothing that you are wearing!
- In Livermore, California, there is a light bulb that has been on for over a century (except for ten panicky hours in 2013). This so-called Centennial Bulb even has its own website. Someday, they will write songs about it. But they will also ask a question: if a lightbulb from 1901 could still be shining in 2020, why weren’t all 20th century lightbulbs built to last in the same way? Were lightbulb manufacturers conspiring to sell lightbulbs designed to burn out sooner? In fact, they were. Working with your team, investigate the Phoebus Cartel and the idea of planned obsolescence: products made to break down on purpose, to require people to buy new ones. Discuss with your team: do manufacturers have the right to make products that will allow them to sell more products later? After all, if every bulb could light up forever, the potential market for new light bulbs would shrink dramatically. Is it possible that consumers prefer to buy products that don’t last as long, if that means they cost less and can provide instant gratification more often?
- Xavier here. Let’s start by defining planned obsolescence before we move onto how it originated and the specifics. Planned obsolescence is basically the practice of designing a product with an artificial lifetime, so that it is no longer usable after a certain point, whether a specific date or a number of hours of use. The idea is believed to have originated with the Phoebus Cartel, a group of lightbulb companies facing the problem that, as they continued to develop better technologies, lightbulbs would need to be bought less often, and they would therefore make less profit. Because of this, all of the companies in the Phoebus Cartel agreed to limit the life of a lightbulb to approximately 1,000 hours. Now, we have energy-efficient LED bulbs that can last up to 25,000 hours, but at the time, 1,000 hours was the most consumers could get, despite the fact that technology was available that could have produced much more efficient and long-lasting lightbulbs. One of those lightbulbs is still running today: the above-mentioned Livermore lightbulb, which has been running for over 1,000,000 hours. Other products which commonly have planned obsolescence built into their design include smartphones, with the average phone only lasting 2.7 years.
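The "over 1,000,000 hours" figure is easy to sanity-check with some quick arithmetic: a bulb lit since 1901 has been burning for roughly 119 years. A small sketch (the function name and the ten-hour downtime figure are taken from the question above):

```python
# Sanity check of the "over 1,000,000 hours" figure for the Centennial Bulb,
# burning almost continuously since 1901.

HOURS_PER_YEAR = 24 * 365.25  # average, accounting for leap years

def hours_alight(start_year, end_year, downtime_hours=0):
    """Rough total hours a continuously lit bulb has run between two years."""
    return (end_year - start_year) * HOURS_PER_YEAR - downtime_hours

# 1901 to 2020, minus the ten "panicky" hours of downtime in 2013
total = hours_alight(1901, 2020, downtime_hours=10)
print(f"{total:,.0f} hours")  # roughly 1,043,000 hours -- comfortably over a million
```

Compare that to the cartel's 1,000-hour cap: the Livermore bulb has outlasted it by a factor of more than a thousand.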
Manufacturers definitely do have the right to make products that will only last a limited period of time, but those lifespans certainly shouldn’t be as short as 2.7 years in the case of phones. Instead, we should move towards higher quality products made by skilled labourers (resulting in higher employment), which cost more and last longer, still driving profit for the companies that create them. However, many economic overhauls would need to occur for this to happen.
One explanation of planned obsolescence found in the article above is that it’s driven by consumer demand. Consumers want many different qualities in a product, including a low price (which results in corners being cut in an attempt to offer a cheaper product than competitors, decreasing how long it can function), small size (smaller phones are easier to carry but are much harder to fix) and minimal need for consumer training (often, by making components impossible to replace, user complaints are greatly reduced). To fix this, we need to give feedback to manufacturers on what we want in new products. However, such products definitely will not provide the instant gratification we receive from buying cheap products over and over again. For example, you always feel great when you get a new phone, and often people would prefer to buy multiple cheap phones rather than a single expensive one. There really isn’t an easy solution to planned obsolescence, but hopefully one will arise in the future.
- Every January, the world gathers in Las Vegas for the Consumer Electronics Show (CES)—to see how the latest technologies that we don’t know we need will bring happiness to our lives. There are wall-sized LEDs and robots that make tea, and over in the corner someone probably has a foldable time machine. But while these experimental products earn a lot of the media coverage, there is also another time-honored tradition at CES: companies like Dell, Samsung, and LG show up with annual upgrades to their laptop lineup. No matter how minor those upgrades might be, they market them fiercely; any company that offered no new version would be left behind. The hope: not just to attract new buyers, but to persuade owners of existing laptops to buy new ones (2020). Discuss with your team: should companies be required to give consumers the chance to upgrade to the latest version of their products at minimal cost from year to year? If so, what should happen to the older but still functional products?
- Shaurya here, so I’ll be giving a debate-focused perspective. One obvious argument for requiring cheap upgrades is that, since companies would no longer have significant revenue to earn from customers repurchasing their product, they’d probably need to make each upgrade significant enough to attract new customers and actually make a profit. Innovation would be paramount. The principle of innovation, and the question of how design and promotion clash, would define the affirmative side of this debate.
On the negative, there's an economic, practical case to make against such a requirement. It's arguable that the choice to upgrade cheaply would improve customer loyalty, but the loss in revenue would be huge. Companies could not afford this while maintaining current prices, and a direct point of contention could be that it's hard to fund research into new technology when your profit margin is being slashed.
There's also the question of large companies that use this model dominating the market, of whether older people would be incentivised (or discouraged) to keep up with technology, of the role of consumerism in society (particularly through advertising), and a huge host of other perspectives.
My view on what should happen to the older products is recycling. E-waste is a gigantic problem, but also an extremely lucrative industry. Instead of throwing away old phones, companies could recycle them to make newer ones and reduce their costs in the process. This could move us towards a circular economy and help with resource management.
- Apple has made an art form of advertising small upgrades as revolutionary; each iPhone model was more or less the same for two years, with the iPhone 4S a mildly souped-up iPhone 4, the iPhone 5S a mildly souped-up iPhone 5… the pattern has broken lately, but only somewhat. For many Apple fans, always having the newest model took on great importance, even if the changes from year to year were not vast. Other companies followed Apple's lead (as they did in so many ways): OnePlus entered the same sort of cycle, just twice as often, and Samsung went at the same pace but never bothered with letters. Discuss with your team: should companies be required to publicize how little their products are changing from year to year?
- On one hand, it would definitely force companies to innovate, and customers would be better informed to make purchasing decisions. A point to consider though: should we buy newer phones because we believe our old ones are obsolete? If there’s no significant difference, would we still buy a new phone?
I’d like to drop a link to an article that started me down the path of paranoia towards firms a few years ago (just like Nehru!): https://www.cnet.com/news/apple-and-samsung-fined-for-slowing-down-phones-with-updates/ - ~Shau
- Long before computers and phones, it was General Motors that first introduced annual model updates, for its cars in the 1920s. Even in the midst of the Great Depression, the idea was to excite consumers about buying a new car before their existing car broke down—and it worked. The approach soon spread to all car manufacturers and endures to this day. People take for granted that cars should have an updated design every so often or they risk looking old. “Age doesn’t automatically translate into awfulness,” notes one reviewer of the 2020 Nissan Rogue. That this needs to be said speaks to how much society has internalized planned obsolescence in the automobile industry. Discuss with your team: what other industries depend on a similar approach to maintain sales—and should manufacturers be required to keep their products looking the same unless there are major functional upgrades?
- The first article mentions how Ford's Model T sold extremely well and saturated the market, after which there was no need to buy any other car. So Alfred Sloan (CEO of GM) decided he'd release new cars every year to fabricate demand, with slight improvements to incentivise purchases. This caught on with the entire economy, and by the 1960s, GM dominated the vehicle market.
Then an energy crisis hit in the 1970s, and emissions regulations let compact Asian cars break into the market. Later, the consumer electronics industry picked up the idea. GM said it would try the same tactic after its 2009 bankruptcy and bailout. (Stop bailing them out, for the love of Cthulhu.)
The second article is an absurdly long review of a car, and I don't care enough to write much about it. The gist is that the Nissan Rogue is a pretty good car even though it's an old model; its features are constantly compared to newer cars, with the focus on how it "still holds up" in most ways.
The last question is complicated, but I don’t think that a product’s value only comes from design. Aesthetic changes matter quite a bit, and slight changes in functionality can matter to people. For example, I think a slimmer or smaller laptop actually makes a significant difference to me. I think consumers should be given a choice about what they prefer. As long as promotion and design are transparent and non-deceptive, I don’t see any issue with ‘insignificant’ changes.
~Shau
- You are probably reading this outline on a phone you can’t open, at least not easily—even just to change the battery would require professional assistance. Many products, from toaster ovens to tractors, are now designed in a way such that their buyers can’t repair them. Sometimes even professionals struggle. Discuss with your team: do manufacturers have a responsibility to create products that everyday people can fix—or is it worth sacrificing that kind of accessibility for thinner and more elegant designs?
- The first (also annoyingly long) article talks about how devices are no longer repair-friendly and have become much thinner and more delicate (a trend it calls 'design anorexia'), and how the annual e-waste of about 55 million tons could be slashed. Repair reduces the need to produce more.
“Sandra Goldmark of Pop Up Repair says that for every pound of waste diverted from landfill, you can reduce as much as 40 times that amount by not manufacturing new stuff upstream.”
There's also a mention of repair cafes, where community members help others fix their broken devices. Startups were mentioned too: 'Fairphone' released a durable phone with separate, easily repairable parts, and 'Remade' offers repair services.
~Shau - The second article highlights the arguments for and against what is called the "right to repair". It shines a light on the views of those in the agriculture sector regarding the exclusive repair practices manufacturers are pushing with their products. It has become increasingly difficult to manually repair tractors, let alone diagnose them, because firms require farmers to have specialised access and tools to fix their own equipment. This handicaps farmers' ability to mend equipment efficiently and only prolongs the process, as it forces them to go to an authorised repair shop even for a minor fix.
To counter this concern, firms suggest instead providing the equipment their customers need to repair their products. Manufacturers worry that publishing the source code governing how their tractors operate would force them to "turn over their intellectual property." However, farmers disagree that this is a viable solution, as specialised equipment would only mean buying and relying more on the manufacturer's goods. It would increase the overall price of repair, perpetuate the difficulty of fixing one's own equipment, and further disadvantage consumers and independent repair shops.
The article also covers a bill in Minnesota that tried to establish a consumer right to repair; it did not pass. 22 other US states have tried to pass similar bills, to no avail.
~Jutin - I strongly agree that items should be easier to repair. I think the extreme focus on slim products is weird, and a fairly slim product could still be easy to repair. Beyond that, being able to repair and customise electronics can actually boost sales and demand, as well as give consumers greater power. The best example of this is the custom gaming PC market. A more global perspective could be microfinance, where jury-rigging drives a lot of entrepreneurship in places like India. Look up the UN's CEMG HIO for more details on similar examples.
~Shau - I too believe that, overall, products should be easier for their consumers to repair. Not only does it give buyers more choice and power over the products they buy, it also makes the entire repair process more efficient and self-sustaining. If manufacturers began imposing specialised tools for different products from different brands, it would only convolute the process, disproportionately benefit those placing the price tags on these products, and make the economy less efficient.
~Jutin
- Apple has been found using software limiting the performance of its phones to prevent sporadic shutdowns as batteries weaken over time. Critics see this as Apple nudging users to purchase new batteries or even new phones; in France, Apple has been sued for the practice, as it (allegedly) violates the country’s law against planned obsolescence. Discuss with your team: would a company be justified in reducing the performance of your device to make it last longer—or should this kind of behavior be against the law?
- The first article mentions that Apple throttled performance to stop random shutdowns caused by ageing batteries. People were enraged, so Apple added an option to disable the throttling and made replacement batteries cheaper ($79 down to $29). But in the new iPhones, this will probably still be a feature that needs to be manually changed in the settings.
The second article is more complex; it mentions a huge number of businesses, statistics, and studies, and is generally an exceptional resource for debates on planned obsolescence.
Otherwise, it talks about how France is moving towards becoming a less disposable society, and how French politicians are trying to push the EU (and the French government) to regulate planned obsolescence. The practice is already illegal in France through a general law on energy, but the ban isn't explicit and is hard to prove.
- As for the discussion, I think that companies should give the choice of performance throttling to the consumer. What throttling does, in essence, is let companies say things like "We have a 50-hour battery life", when that only holds with the brightness turned low and nothing running. It creates a huge information asymmetry between the consumer and the firm.
~Shau
- France has also been looking to force companies to publish how long their products will last—a product durability index. The idea would be for consumers to have a reason to pick longer-lasting products—and thus generate less waste for the world and fewer expenses for themselves down the line. Discuss with your team: do you think this law would work as planned? Should products in other industries—such as clothing, cars, and pillows—also be required to advertise their expected durability?
- The French government wants to use coloured stickers denoting durability on a scale of 1 to 10 to encourage longer product lifespans.
- I personally think this might work if properly publicised, but I don't see how governments would verify these durability ratings. (How do you check the effect of use over time? Play Subway Surfers for 8 years?) My other question is what would happen if it did work. People might start obsessing over durability, and that could reduce consumption. It's arguable this could hurt sales, as well as economic growth and prosperity. Yet it's also possible that a structural change (like a shift to a customisation and modification market!) could offset the economic slowdown.
- In some cases, new products really are better than old ones. Phones might have larger screens and better cameras; cars might drive more safely or have more efficient engines. Discuss with your team: is it possible that what critics describe as planned obsolescence is just consumers preferring actual technological improvements? Is planned obsolescence more of an issue in less dynamic industries, such as refrigerators and toaster ovens, where technologies are not changing very much, but manufacturers still need to sell more products?
- Honestly, this is a debate topic you should address with your friends. But here’s the best article I could find on the topic: https://www.perc.org/2012/07/18/planned-obsolescence-the-good-and-the-bad/ It’s very short, and analyses the question really well.
- “Next time we’ll qualify less scholars.” A grammatical traditionalist would be irate at the use of the word “less” instead of “fewer”; their head might literally explode. Except it wouldn’t really explode: literally is a word which has lost its literal meaning and now mainly offers emphasis. The same traditionalist would be upset at the mention of “their” head—the phrasing should be “his or her”—but today many woke people favor “they” as a gender-neutral pronoun. As for “fewer” versus “less”—fewer should apply to quantities and less to amounts, but the terms are now used so interchangeably that fewer and less people care about it every year. And the word “woke” no longer means “someone made me stop sleeping”—at least, not literally.
- Old words change. New words emerge. A thousand years ago, the English word “meat” referred not just to things like beef but also to fruits and vegetables; today, it only means food that used to be part of an animal (though with a few exceptions—one can still eat the meat of a coconut). If someone named Jim had led a failed revolt in 17th century London, we might be debating whether the term “you jims” is sexist—but, instead, the ringleader was a jim named Guy, and now we use the term “guys” to refer to people in general, or sometimes to Billie Eilish.
- That's a reference to Guy Fawkes, a member of the Gunpowder Plot. Members of the plot were called "Guys" after its failure. (Honestly, this is genuinely just kind of weird.)
~Shau - Apparently, the etymology of the slang term "guys" (the gender-neutral term for a group of people) comes from Fawkes. The word initially meant "creepy people", then changed to mean "men", and now has the more widely known meaning.
The article further talks about whether or not “guys” is appropriate in the workplace because of how informal it is (not because it can be perceived as an insult because of its history). ~Jutin
- If enough people in a community start misusing (or repurposing) a word in the same way, eventually the dictionary catches up: the word develops a new accepted meaning. The same goes if enough people choose to consistently ignore a grammatical rule—such as the proscription against split infinitives. It’s okay now to boldly go where no guy has gone before. In these cases, language is said to be experiencing semantic change; its critics tend to call it semantic drift. Discuss with your team: does semantic drift do more to keep language fresh or to weaken our ability to communicate?
- Shaurya here! This article was so painfully pedantic, and I have a strong opinion on this! Semantic change is literally how language works. The idea that an objective grammatical 'literary standard' exists is painfully stupid. Linguists know that there is never a fixed set of rules for how language works. For example, the Hindi we speak today in India is actually a mixture of Urdu, Pashto, Persian, Sanskrit, and god knows what else. This confusing mess was called "Hindustani", and literally no one cared that words and syntax changed. But some people did care. Those were politicians in independent India who wanted to preserve Hindi's 'purity', and assigned religious value to a set of phonetics. They wanted to force South Indians and the rest of the country to speak Hindi as well. But the problem was, the Hindi they promoted was so convoluted that my mother (a PhD in Hindi!) has no idea what they said or wrote. They were trying to preserve something no one could understand and impose it on hundreds of millions.
This obsession with the “right” grammar in a language is completely counterproductive to human society. Language changes because its purpose is to facilitate communication. Grammatical rules exist to serve that purpose! The notion that changing the way we use grammar hurts us is absolutely stupid.
For the sake of argument, I can mention that using specific vocabulary does enhance communication. Instead of saying something was 'lit', it's much more descriptive to say it was 'exhilarating'. Writers use specific words to paint stories too, and Orwell's Newspeak from 1984 is a great example of this.
- Comments from Jut: Speaking of formulating new words to paint stories, Anthony Burgess' Nadsat from A Clockwork Orange is another 😩👌example of how authors creatively communicate narratives. Language is not and should almost never be stagnant, as it lives, grows, and transforms alongside the people that speak it. Pedantry or arbitrary rules should not bar the natural shifts in language; doing so is only a waste of time.
- Another reference on this topic is Noam Chomsky’s lectures on linguistics. But honestly, while the writer of this article was flexing his convoluted vocabulary, one word came to my mind: rodomontade. In other words, he’s a pretentious prick.
~Shau
- Consider the following words and phrases that have evolved over time. What do they mean today? What did they mean before? Discuss with your team: what words do you think are in the process of changing in our world today? ~Shau and Jut
- nonplussed | To be nonplussed is to be (1) at a loss of what to think, or (2) bewildered. These are the traditional definitions, anyway. The word comes from the old but now little-used noun nonplus, which refers to a state in which nothing more can be said or done, so to be nonplussed is essentially to be at a standstill or an impasse.
- disinterested | Basically, disinterested means being impartial (having no interest, such as property, in the issue), while uninterested means bored. But they are switched so often that no one knows anymore.
- aggravated | To make something worse, or (more informally) to annoy. There are no lexical issues around this word, but if you break down the Latin roots of "aggravate", it literally means to "add weight". That original sense is obsolete and no longer used.
- Extra | Basically, other than the normal meaning, people use it to describe someone who just goes beyond the normal expectations you have of them. Like someone ranting about semantic change for like 4 paragraphs. :P
- could care less | Supposed to be couldn’t care less, but obviously saying you could care less means that you do care. Here’s a detailed article on this: https://www.quickanddirtytips.com/education/grammar/could-care-less-versus-couldnt-care-less
- basic | if you’re a boomer, this only means surface-level or elementary but, if you’re an intellectual teen, you’d also know that this is an informal adjective that means typical, generic or mainstream ~Jutin onwards
- bald-faced | It initially meant literally having nothing covering your face, such as a beard or mask, but that was the 16th century; now it more figuratively describes something shamelessly open and undisguised, as in a "bald-faced lie".
- Irregardless | it's a non-standard synonym for regardless. Both words mean the same thing, but the double negative in irregardless makes it confusing and clunky.
- plethora | apparently this started off as a medical term meaning "excess bodily fluids" in the 1500s; it then figuratively became "too much or too full" in the 1700s. Today, it still means excess or abundance.
- awful | the suffix "-ful" often means "full of" its base word. Colourful is full of colour, painful is full of pain, but awful isn't full of awe. Instead of reverence, awful means unpleasant, appalling, or terrible. It used to describe something to be respected and looked up to, but that sense is no longer used.
- incredible | "in-" is the prefix for "not" and "credible" means believable. Based on those two parts, "incredible" should mean unbelievable or untrustworthy, but it actually means extraordinary. The more negative definition was used in the 1400s, but we almost always imply positive connotations with this word today.
- fortuitous | this means happening by accident or chance, but is often confused with fortunate. Fortuitous describes a happy coincidence, similar to serendipity, while fortunate implies being given or blessed by luck.
- Super | this prefix means "above," "beyond," or "over" in Latin but is now more commonly understood as "very much". With the modern definition, words such as "supersede" and "supercomputer" make little sense once analysed. "Sede" comes from the Latin "sedere", meaning "to sit", so supersede would mean "extremely sit" by the modern connotations of super. If we use the original "beyond", we can interpret it as "transcending a seat", or replacing a person of power.
- Words can also drift different directions in different communities. In Singapore, students study “maths” and the word “students” is composed of eight alphabets—in the United States, students wish “math” were a WSC subject and the word “alphabet” is composed of eight letters. As you investigate the following terms as they relate to semantic change, consider the social and cultural forces at play, and how they might vary from place to place. ~Shau
- etymology - This refers to the origin of a word. Pro Tip: The answer is almost always just Latin.
- metaphor - Something that isn’t literal, but represents a concept. Like “This is a walk in the park” being used to describe something easy.
- synecdoche - When a part is used to represent the whole. For example, “Avan published the 2020 notes for SST”, where “Avan” means “The Avansalpacaresources website”
- Metonymy - When something closely associated with a concept is used to represent it. Like "Dubai" being used to refer to the UAE. (GUYS THERE ARE OTHER EMIRATES IN THE COUNTRY I PROMISE)
- Comments from Jut: I've met too many people who think Dubai is a country and don't even know what the UAE is. Also, synecdoche is a type of metonymy, and metonymy falls into the category of metaphors, as it involves replacing a word with another word associated with it. (e.g. "the crown" doesn't literally mean the ruling king or queen, but figuratively conveys that through conceptual association.)
- generalization and specialization - In semantic change, generalisation (or broadening) is when a word's meaning widens over time to cover more things. Specialisation (or narrowing) is the opposite, where a word's meaning shrinks to a much narrower sense, like "meat" going from all food to animal flesh.
- Analogy - A comparison used to explain a concept. “The WSC staff works like an engine, where the staff sacrificing their sleep is the fuel”
- Hyperbole - An extreme exaggeration.
- Word reappropriation - When words are reclaimed and used in different ways. “Baekjeong” in Korea is an example, it went from an insult for butchers to the name of BBQ places.
- Amelioration - The opposite of pejoration (and much less common). "Noble" went from meaning a blueblood to meaning a good person.
- Pejoration - Words that deteriorate in meaning. Like “awful” going from “awe-inspiring” to… awful.
- People might double-take nowadays when they hear someone pronounce “ask” as “aks”, but this was commonplace for hundreds of years. Investigate the Great Vowel Shift of the English language and consider: have there been any similar changes in other languages you know?
- The "ask" / "ax" article doesn't really talk about vowel changes, but highlights how the two pronunciations of "ask" are equally legitimate. It's just that "ax" has fallen out of use and become non-standard, for generally unclear reasons.
- Beginning in late 14th-century England and spanning about two centuries, the Great Vowel Shift marked the transition from Middle English to Modern English via changes in how the vowels were pronounced. The video provided generalises the change as a shift in where in the mouth the long vowels were pronounced, but while the shift affected all long vowels, what changed is more complex than mouth position, depending on the vowel observed. The "ea" in "sea" used to be pronounced as two separate vowels, but in the shift it changed into /iː/, a single vowel or monophthong. On the other hand, words such as "house", previously pronounced /hoose/, developed diphthongs, with the vowels "o" and "u" combining to make an /ow/ sound. Here's a very informative video on how different Middle English sounded from Modern English, to give a better idea of what English was like before the shift: https://www.youtube.com/watch?v=WeW1eV7Oc5A
- Another example of a phonetic shift is the High German consonant shift, somewhere between the third and fifth centuries. The voiced sounds /b/, /d/, and /g/ (the initial sounds in the English words bog, dog and gate respectively) became unvoiced after the shift. This basically means that when you said these sounds, your throat wouldn't vibrate, so /b/ turned into /p/, /d/ turned into /t/, and /g/ turned into /k/. If you say all of these with your hand on your throat, you should feel no vibration on the unvoiced consonants. Other changes also occurred during this time, but they involve too much linguistic jargon to explain here. This shift resulted in Old High German, as opposed to other West Germanic languages like Old English, which didn't undergo the change.
~Jutin
- You’re using your phone when someone messages you that New Zealand just won the Quidditch World Cup! How to respond? On a laptop keyboard, you might bang out an excited keysmash: asdfafasffsa. But that would be much harder to do on a smartphone, where you’d have to tap out different letters with your thumbs—and then backspace to prevent them from being auto-corrected into something like “avocado”. This interview with linguist Gretchen McCulloch covers some of the ways in which how we use language might be evolving. Explore with your team: how have new methods of communication (such as Morse Code and touchscreens) changed the words or terms we use?
- When I was about 6 years old, in 2008, I vividly remember my grand-auntie asking me if I understood the text "how r u?" on her Motorola phone. Being a very dumb and confused child, I was stumped on what exactly "r" and "u" meant in that sentence, seeing that they clearly carried more meaning than those letters usually do. This was back in the days when phones were novel and not yet ubiquitous. Today, texting like that is commonplace, but it wasn't always. Before touchscreens were invented, creative spellings were adopted to make typing more efficient. Early phones had a number pad instead of your standard QWERTY keyboard; each number key had three or four associated letters, and you'd press a key once, twice, or three times to select the desired character. This gave birth to spelling "later" as "l8r", "see" as "c", "today" as "2day", and many more. Abbreviations like "brb", "ttyl", "lol", and "btw" also came from this technological change.
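That multi-tap scheme is simple enough to sketch in code. Here's a toy Python illustration of my own (an assumption-free model of the general idea, not any real phone's firmware) that decodes repeated key presses into letters:

```python
# Toy illustration of multi-tap text entry on a classic phone keypad.
# A sketch of the general idea only -- not any real phone's firmware.

KEYPAD = {
    "2": "abc", "3": "def", "4": "ghi", "5": "jkl",
    "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz",
}

def multitap(presses: str) -> str:
    """Decode space-separated groups of repeated key presses.

    Each group is one key pressed repeatedly; the number of presses
    picks the letter on that key (wrapping around if over-pressed).
    e.g. "44 33 555 555 666" -> "hello"
    """
    out = []
    for group in presses.split():
        letters = KEYPAD[group[0]]
        out.append(letters[(len(group) - 1) % len(letters)])
    return "".join(out)

# "see" costs 8 presses spelled out (7777 33 33) but only 3 as "c" (222),
# which is exactly why spellings like "c" and "l8r" caught on.
print(multitap("44 33 555 555 666"))  # prints "hello"
```

Counting the presses this way makes the economics of "2day" and "l8r" obvious: every letter skipped saved real thumb work.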
- Morse code also experienced similar shortening to make communications easier. Instead of painstakingly spelling words and phrases, abbreviations such as “FWD” for “forward”, “MSG” for “message”, and “PLS” or “PSE” for “please” became standard. “88” was also used to mean “love and kisses” and “73” for “best regards”.
- For a more personal example, I'm actually experiencing a communication change because of the method I use to communicate. I have fat thumbs and long nails, so texting on my phone is terrible. I much prefer working on a laptop, but I've had the same computer since 2012 and my keyboard is slowly falling apart. Those in the Sleeping Alpaca discord server know what I'm talking about. My "o" and, more recently, "d" keys are starting to fail, so I've been forced to get creative when I type. I normally cut out vowels when I text to make it easier for me: "front" becomes "frnt", "know" becomes "knw", "probably" becomes "prbbrly", and many others. For academic papers and the AAR, I copy and paste my "o"s and "d"s :'))) S0metimes i use 0 t0 replace my 0s but that ruined my t0uchtyping abilities s0 im trying t0 und0 that damage.
~Jutin
- Old people have always criticized the habits of the younger generation, including how they behave and how they speak (and the youth have always said “OK” in return). This second Gretchen McCulloch podcast offers an interesting look at how certain trends that some people see as harmful to the English language are actually just, like, language doin’ its natural thang.
- The first article criticises the older generations for being quick to judge and undermine the youth out of fear of being replaced. The author condemns the sentiment that previous generations already accomplished feats the youth could never match. Fiefer states, "We send children off into the future, telling them the greatest moments have already passed."
He continues to expound on this belief, suggesting that the reason elders are quick to look down on the youth is to establish their dominance. Once they accept that newer generations are their equals or betters, older people feel obsolete, like an outdated model that is simply not as useful. Our predecessors condescend to cope with the fact that the world is, and will keep, moving forward with or without them.
Finally, the article ends on an optimistic note where the author calls for everyone to judge people NOT by the year they were born but by their character. There are always good and bad people in any generation.
- Like the article, the podcast chronicles how changes, transformations, and new aspects of things, specifically language, have always been criticised. The podcast lists examples from when English was first being put into written form, to Latin spellings, and eventually to modern lexical “errors”. Socrates denounced writing because it encouraged people not to memorise as well. When Latin wasn’t yet a dead language and newer speakers made spelling mistakes, they were condemned by those who had spoken the language longer. Joke’s on those older speakers: the Latin “errors” made it into, and persisted in, almost every Romance language. They are now the norm! The hosts then go on to discuss more modern grievances like the quotative “like”, texting culture, and using “literally” as a figurative adverb.
~Jutin
- Legend has it a man in 1700s Dublin was once challenged to invent a new word that would enter into the public lexicon in less than two days. He decided to write that new word—“QUIZ”—on every door in the town overnight. Research has suggested that this story may be apocryphal, but it does warrant investigating the sources of new words. (Shakespeare is credited as having added 1,700 new words to the English language without writing them on doors.) Are there any words (or shibboleths) that are local to your friend groups or communities, and how did they come to be? Who makes new words official?
- Xavier here, let’s dissect this rather entertaining story. Supposedly, in 1791, Richard Daly bet his friends that he could make a nonsense word the talk of the town throughout Dublin. He then sent out his employees (he was a theatre manager) to write “Quiz” not just on every door, but on as many walls and windows as possible. Of course, this became a topic of discussion, and he won the bet. However, there is little proof that this story is actually true, as it was first written about in 1835, 44 years after the event had (supposedly) taken place. Also, there were uses of the word “quiz” before 1791, though with a completely different meaning: calling someone a quiz around that time was roughly equivalent to calling them a nerd, or an outsider. However, the word wasn’t particularly well known by the upper class of Dublin, and so there may be some truth within this story, with young theatre employees running around writing it on windows for a laugh, whilst also insulting the people whose windows they were writing on.
Let’s now look at the ways that new words are actually formed. In the article above, 13 different ways are described:
- Derivation: Add a Prefix / Suffix (Democratic -> Democratise)
- Back Formation: Remove a Prefix / Suffix (Easy -> Ease)
- Compounding: Two common words being added together (day + dream = daydream)
- Repurposing: Giving one word new meaning (mouse, animal -> mouse, computer)
- Conversion: Changing the word class of a word (giant, noun -> giant, adjective)
- Eponyms: Words named after a place / person (Atlas, Titan -> Atlas, book)
- Abbreviations: Shorten a word (caravan -> van)
- Loanwords: Borrow from another language (tattoo, from Tahitian)
- Onomatopoeia: A word that imitates a real life sound (bark)
- Reduplication: The repetition or near repetition of a word (hip hop)
- Nonce Words: Words with no apparent relation to any other (quark)
- Error: A misspelling that creates a new word (scrabble -> scramble)
- Portmanteaus: Compounding, but only using part of one or both words (smoke + fog = smog)
- Though this may seem like a rather large number of ways for words to be created, you have to remember that most of these methods aren’t commonly used now; for example, we don’t borrow words from other languages as often as we used to.
Most words that my friends have created are portmanteaus of someone’s name and something relating to an in-joke about them, though I can’t think of any off the top of my head. The much more important question here is: who makes new words official? Generally, well-known dictionaries will look for words that have been used often throughout the year but are yet to be added to the dictionary, and will choose among them! Though it seems like a very simple process, I would assume there are many arguments over which words finally end up making it into the dictionary.
- Learn about the backlash when the Associated Press announced a small rule change involving hyphens and chocolate chip cookies. Meanwhile, Kazakhstan is switching from the Cyrillic to the Latin alphabet, much as Turkey did from the Arabic alphabet in the 1920s. Textbooks will need to be changed, signage replaced, and passports reissued. Discuss with your team: do you think you would be able to adjust to a new alphabet in your country? Can you imagine your government making any other changes to the language?
- Xavier here once again, and I think that the backlash surrounding the rule change that the Associated Press announced is ridiculous. Basically, some words are used in conjunction with one another to describe something, and often take a hyphen. If two words act together on the same word, they are known as a compound modifier. In the past, compound modifiers have often been hyphenated not for clarity but simply out of personal preference. So of course, when the Associated Press Stylebook said that it would no longer use hyphens in compound modifiers where the meaning is clear, and recommended that others do the same, there was widespread outrage in the linguistic world. What we have to remember is that this is simply a style recommendation, not a definite rule, and many other style guides still hyphenate compound modifiers.
Kazakhstan’s decision to switch to the Latin alphabet is also a very interesting one, as the country has changed its alphabet several times in its history: first from Arabic to Latin, then from Latin to Cyrillic in the 1940s, and now back to Latin once again. It’s believed that the reason behind this change is tied to the languages spoken within Kazakhstan, as only 74% of the population speak Kazakh whilst 94% speak Russian. In an attempt to move away from its Communist Russian past, Kazakhstan is changing its alphabet to Latin in the hope of decreasing the dominance of Russian and increasing the use of Kazakh, more accurately mirroring the population. There are also obvious practical advantages, such as the fact that around 90% of all information online is in the Latin alphabet.
Within New Zealand (my country) this change would be unprecedented. All of our official languages (English, Maori, NZSL) either use the Latin alphabet or are sign language. Therefore, a major change in the alphabet would be incredibly drastic, and the change would be difficult for nearly all of the population. I can’t see the New Zealand government making any other changes to the language in the future, as we’ve very much settled into a highly Westernised culture, with the Latin alphabet to go with it.
- Given how hard it is to predict how language will change in the future, it can be a tall order to write messages today that will still make sense thousands of years from now. This is the problem nuclear semioticians face as they try to communicate the danger of nuclear waste to people in the far future. What approach do you think they should take—or should we assume that, thanks to the Internet, today’s languages will endure intact into the future?
- Nuclear waste is one issue that not only we but many generations in the future will have to deal with. Generally, the most common solution is to bury it, but how do we stop people in the future from digging it up? This is the problem nuclear semioticians tackle. You might think that we could simply write warnings in every major language that currently exists and post them outside the site of the nuclear waste, but an issue remains: we do not know how language will evolve in the future, and we therefore cannot predict whether anyone will even be able to read the warnings. Other suggestions to warn people away from nuclear waste include pictograms (which may be misinterpreted), symbols (whose meanings can change over time), large and threatening structures (which could actually make people curious enough to explore the area) and even cats bred to change colour near radiation! There is no certainty that the internet will still be around in the future, and therefore we cannot assume that today’s languages will endure for the next 250,000 years, after which the nuclear waste would finally be considered “safe.” That leaves basically two options for solving this communication problem. Either we try all of these solutions, combining them to dissuade people from entering the site, or we do what nations such as Finland are doing and simply forget about it. Finland’s solution is to bury the nuclear waste far underground and then aim to forget about it, relying on passive safety: the chance of the waste ever being dug up is incredibly low.
- One informal study by a Spotify employee supposedly found that people stop listening to new music at age 33. The original blog post doesn’t draw that broad a conclusion, but it does offer interesting insights—for instance, men on Spotify seem to give up on new music sooner. Discuss with your team: what makes some individuals and groups more open to new music later in life? Do you think “taste freeze” in music would happen to a person around the same time as taste freeze in fashion and hairstyle? And in what ways (if any) would the population listening to music on Spotify not be representative of all music listeners?
- There are many, many reasons as to why someone would continue listening to new music later in life. Firstly, if you are part of the music industry (duh) or any other creative industry, you are constantly looking for inspiration, and are therefore more likely to keep listening to new music. Also, individuals who are novelty seekers, or generally look for new stimuli, will keep branching out and listening to new music.
I believe the “taste freeze” described above, where at a certain age you settle into listening to music of roughly the same popularity, doesn’t happen at the same time as the equivalent freeze in fashion and hairstyle. We are under much more pressure to keep up with current fashion and hairstyle trends because they are externally visible and apparent, unlike one’s own internal music tastes. At a simple glance, you can immediately tell if a person is up to date in terms of fashion, but you cannot visually deduce whether someone is musically “hip” in the same way.
Spotify is primarily used by younger generations because (shock horror!) it’s on the internet. That is not to say that no older individuals use Spotify, just that they are in the minority. Therefore, this data likely can’t tell us what happens at 55+, due to a lack of data on that age group.
- Out with the old, in with the older. Sports teams sometimes wear (and sell) throwback uniforms; airlines paint some of their planes with retired livery. Would you be as excited to see something old back in action as you would be to see something new? Does the same kind of reasoning explain the hipster appeal of certain old technologies—such as record players?
- We all seem to have an obsession with things that are old, whether it be an old T.V. show (original Doctor Who) or “old” music (The Beatles, anyone?). Of course, people will be excited to see something that they love come back, due to the nostalgia they have for it, but that doesn’t explain the excitement that many feel despite not being there for the original release. That excitement comes from finally getting to experience something old that they wouldn’t otherwise be able to experience. In short, I would be as excited to see something old come back as something new if it either generated strong nostalgia or provided me with an experience that I had previously missed out on. The “hipster appeal” of older technologies, such as record players, arises because they let us consume a specific product (in this case music) in the medium it was originally intended to be presented in. This gives us that feeling of nostalgia and allows an experience as close to the original as possible.
- Is it possible for a fad not to die out but instead become an enduring part of culture? Or is such a fad not a fad in the first place?
- Whilst most fads completely die out, there are some that remain prevalent within our culture. Granted, not as prevalent as they once were, but still prevalent enough that you notice them relatively regularly. Fads such as avocado toast come to mind immediately, as it was a craze for a while and remains relatively popular now. Other examples I can think of include Pokemon Go and memes such as Doge that have become embedded within our culture; though less popular than at their peak, they are still used and referenced. These are still fads, as each had a period of intense enthusiasm that eventually died out. However, instead of disappearing almost entirely, they remain present.
- How often should a hotel renovate? How about schools?
- In an ideal world, hotels would only renovate when necessary, in an attempt to save resources. However, as we can see from the above article, hotels renovate much more frequently, hoping to find something new that will stick and increase the revenue they generate. Those who run the hotels are there solely to make a profit, and often do not understand the stress that renovating every three years, or even every year, places on the hotel staff as well as on those doing the renovation work. Therefore, renovations should only occur when absolutely necessary, or when they can provide some substantial improvement for the guests at the hotel.
The same logic applies to schools. Schools ideally should not renovate too often, as it disrupts students and creates issues around classroom availability, but when a substantial improvement can be made, it definitely should be. For example, my school renovated two previously unused classrooms next to the library and turned them into a digital technology centre, allowing students to learn coding and programming much more easily than before.
- How long does it take before something old needs to be out-ed? Instagram stories expire after 24 hours and Snapchat messages are a single tap away from being lost forever (unless, in either case, you use a save function that was added long after the initial feature was implemented in the app). How do we socialize differently in a world that is renewing more and more quickly? Is this kind of social transience something we should try to avoid or embrace?
- Thanks to the wonders of modern-day technology, we’re able to socialise constantly with practically anyone simply by sending them a message! Of course, this comes with many differences. Without face-to-face interaction, we’re unable to pick up on social cues, and can easily misinterpret phrases. Also, discussions often have much less substance (in my experience) than those you have in person. Deep, meaningful chats are the ones which seem to occur when you’re with someone, not over text. That is not to say that all technology is bad for social interaction and communication, as it allows us to keep in touch with all of the friends that we make at Global Rounds and ToC! What is crucial is that we ensure the conversations we are having contain substance, and aren’t simply us repeatedly saying “streaks” over and over again.
- The final scene of Mad Men is a testament to the seductive power of television advertising. Are there other products and services being marketed today in a similar way? Does it make you believe Coke can renew the world?
- I have very limited knowledge of most T.V. shows, so bear with me here. Mad Men is a T.V. show set in 1960s New York City, focusing on the advertising business and the lives of the people who work in it. The very end of the series focuses on Don, one of the main characters throughout the show, and in this shot we see him meditating in the lotus position, during which he smiles. After this, we cut to the Coca-Cola ad, implying that Don went on to create it. Products are certainly still marketed in this way all the time, by having them present but never directly referenced. For example, Dunkin’ Donuts cups are placed in front of the judges on America’s Got Talent, but it’s never said directly during the show that it is sponsored by Dunkin’ Donuts. Products are constantly being indirectly marketed in media, as placing something in the background without directly acknowledging the sponsorship draws the attention of some, but not all, and generates revenue without making you appear to be a “sellout.” However, just because these ads don’t announce themselves as advertisements doesn’t mean they are successful. The Coca-Cola advertisement above certainly is a well-made ad, but it doesn’t make me believe Coke will renew the world (though that may be because it’s a rather old ad and no longer has the same impact on younger generations).
- Are World Scholar’s Cup themes an example of forced obsolescence? Should they be used for more than one year?
- Short answer: yes, World Scholar’s Cup themes definitely are an example of forced obsolescence. We only need to look to the definition: “a policy of planning or designing a product with an artificially limited useful life, so that it becomes obsolete after a certain period of time.” A WSC theme is designed, has an artificially limited “useful” life, and becomes obsolete after one year. However, the intent behind this forced obsolescence is completely different from that of large technology companies. Whilst they use forced obsolescence to make more money, because you have to keep buying their products to get the newest features, World Scholar’s Cup uses it to ensure a level playing field for all competitors. Imagine if the theme didn’t change every year, and there were scholars who had been competing for five years and knew every part of the curriculum. Wouldn’t that seem daunting to new scholars and dissuade them from entering the programme? The forced obsolescence that WSC uses allows new scholars to enter the programme, and overall benefits the Scholar’s Cup as a whole. Plus, wouldn’t it get a little bit boring if we kept learning the same stuff every year? Things do have to be renewed every now and again : )