Enabling technologies
Guiding Questions
- Make two lists.
- First, list ten technologies you use every day before school starts. These technologies can be gadgets—such as toasters and foldable Samsung phones—or larger works of infrastructure, such as the pipes that bring hot water to your shower.
- Huawei Phone
- Internet
- Water pipes
- Night Lamp
- Earphones
- Microwave
- Fridge
- Macbook
- Computer mouse
- Charger
- Second, list ten technologies that make you better at things, from calculators to running shoes.
- Calculator
- Badminton racket
- Macbook
- Phone
- Glasses
- Clock
- Playstation
- Headphones
- Running shoes
- Mouse
- Now, consider the lists you have made and ask yourself the following questions about the technologies you listed.
- Which ones depend on other technologies? A fitness tracker might be less useful without access to GPS, but a watch could keep time on its own—at least, until its battery runs out.
- Practically all the electronics listed above rely on a source of electric power, which in turn relies on some sort of generator of said power. A computer mouse relies on a computer to connect and function, whilst earphones rely on an input source (such as a phone or Macbook) in order to transmit the audio they’re meant to amplify. The water pipes rely on pumps and water heaters stored somewhere around the building.
- How do they work? Arthur C. Clarke once wrote that “Any sufficiently advanced technology is indistinguishable from magic”—but, in some ways, so is any insufficiently understood technology. Take some time to consider technologies you may have taken for granted. Is a quartz crystal vibrating on your wrist 32,768 times per second? Is a heated ceramic plate breaking the hydrogen bonds in your hair—on purpose? Most technologies require energy; if yours do, where does that energy come from?
- I won't bother to explain the inner machinations of a phone or computer here, since that explanation would require a separate page (and because there are many other websites far more fluent in communicating this information to you!). I will, however, simply note that glasses work by bending light so that it focuses properly on the retina, correcting the eye's focus on objects nearby or in the distance (a rough lens-power calculation appears a little further down). The water pipes work with a series of pumps and valves that hold back or release hot water to the shower head when the corresponding tap is turned.
- I’m not going to continue explaining my own technologies, since I suspect you would benefit more from trying to explain the “magic” behind your own technologies that see use before school and help you improve your skills.
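- Since I mentioned glasses above, here is a minimal back-of-the-envelope sketch of what a prescription number actually means. The -2.50 D value is just an assumed example, not anyone's real prescription.
```python
# Rough sketch: lens power in diopters is the reciprocal of focal length in meters.
# A negative power means a diverging lens, used to correct nearsightedness.

def focal_length_m(power_diopters: float) -> float:
    """Focal length (m) = 1 / power (D)."""
    return 1.0 / power_diopters

prescription_d = -2.50  # assumed example prescription, in diopters
print(f"A {prescription_d} D lens has a focal length of {focal_length_m(prescription_d):.2f} m")
# -> about -0.40 m: a diverging lens that shifts where light converges so
#    that it lands on the retina instead of in front of it.
```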
- How different would your life be without each of the technologies on your list? Your answer and your teammates’ may vary. Someone with diabetes may struggle without a blood sugar monitor, whereas even without access to a rowing machine the captain of the rowing team might still be able to row on a nearby lake.
- Life would be pretty different without the phones and laptops that our society relies on. I would probably resort to reading books or showering with kettle-heated water (as my grandparents do) instead of checking social media or simply turning a tap. I could probably live without a computer mouse or badminton racket (I shall simply rely on the trackpad of my MacBook and the stock rackets my school provides). Without glasses, however, I probably wouldn't be able to see the world very well and would likely sustain several injuries due to poor vision. Calculators are something I can live without; I see little need to use them in the future given the prevalence of online calculating tools and spreadsheet manipulation. In short, there are many inventions we take for granted that would change the way our entire society functioned if they were to suddenly disappear.
- Where did they come from, and how long ago? Your parents may have used calculators in school, but their parents probably kept slide rules in their desks.
- Ah, now we get into the history of the technologies on which most of our world relies. Let's begin in chronological order.
- Glasses: Originally referred to as “reading stones” (basically a magnifying glass that people placed on texts to help them read better), the basic building blocks of glasses had existed since around 1000 AD, yet the actual invention of eyeglasses came sometime in the 13th century. We don't actually know exactly when and where glasses were invented, though works by the Italian painter Tommaso da Modena in 1352 depict them, and eyeglasses were likely the work of Italian (specifically Venetian) glass-blowers, who formed the lenses and perched them on the nose with a frame. These were likely used by Italian monks, and the first prescriptions for eyeglasses appeared during this time.
- Clock: Ever since humanity realised it needed some way to keep track of the day, people have relied on different timekeeping devices. The first sundials were likely utilised by the Ancient Sumerian, Egyptian and Babylonian societies some 3,500 years ago. After the fall of the Roman Empire, the innovations that had led to these timekeeping pieces were neglected as Europe slipped into the dark ages. It was only in the 12th and 13th centuries, as Italian traders and travellers plied their trade routes, that news of complex water clocks by Chinese inventors and Islamic sundial innovations reached the European continent on the wings of commerce. It was the Swiss and Italians who then took up the pursuit of making mechanical clocks. Prior to the 1500s, most clocks used weights as power sources, which made them ungainly and extremely bulky. It was only with the invention of the mainspring and its integration into clocks by the German locksmith Peter Henlein that they finally began to catch on around the world. After that, clocks went through different “ages” as their popularity spread and people adapted them to new needs. Feel free to investigate these time periods at this website.
- Water pipes: While the concept of metallic and concrete pipes has been around for a while (there are examples of such devices being used in Ancient Roman baths), the idea of indoor plumbing on a massive urban scale didn't emerge until the 1800s, when American urban planners began to realise the need to move huge amounts of water around a town to every possible tap, shower and bathtub that might need it.
- I'll skip the remainder of the gadgets since these explanations are taking a while, but my parents did work with many of these devices' predecessors before my generation came along and got the upgraded versions. My dad remembers coding commands onto a bulky desktop in university, whilst my mother remembers her basic calculator that couldn't handle something as complex as graphing a linear function (it couldn't even display such a thing).
- How widespread is access to them? Some enabling technologies are far from life-or-death yet make a daily difference in your routines and experiences. Are there people who would want to use hair dryers to dry their hair quickly, but are forced to wait for time and sunlight to take their course? Are there those who would benefit from prosthetics but must make do without anything to replace their missing limbs? Are there schools where children might be able to study more effectively if they had access to air conditioning—or electricity?
- Widespread access to things like mobile phones and laptops is pretty evident in our world. According to recent data from GSMA, five billion people possess a small device they can use to connect themselves to others and the world, whilst roughly three billion people are now connected to the internet through computers. Plumbing is certainly one of the lesser privileges: billions of people still do not have access to clean running water, and universal access has become a priority of the UN and countless other NGOs worldwide.
- As you explore this subject area, ask yourself: is there a difference between an enabling technology and a technology of convenience? And, where do we cross the line from empowerment to dependency?
- There is indeed a difference between an enabling technology and a technology of convenience. A mobile phone enables us to connect with the world and keep in contact with those who no longer live in our neighborhood, yet Snapchat is an app of convenience in that it allows us to fulfill that same function in a different (and somewhat trivial) way. As for the line between empowerment and dependency, consider the following example. A person who was born deaf receives a hearing aid. That aid allows them to hear people clearly and eases conversation for them, yet if the hearing aid were to suddenly cease functioning, they could still use sign language, writing and even body signals to interact with others. Oftentimes dependency comes when we cannot “live” without an object. The Western world has trouble imagining life without a mobile phone in its pocket, yet millions of people across Africa, South America and Asia who can't afford such a device have lived without one for decades.
- Some technologies require no power source, but many, if not most, do. In general, the more portable a technology, the more likely it is to use a battery—although there are exceptions, such as some vacuum cleaners, which must be plugged in wherever you want to use them.
- With your team, conduct some basic research into the science of batteries. How do they work? What makes a rechargeable battery different than a standard battery? Be sure to consider key terms such as: anode | cathode | electrolyte | capacity | discharge
- Ah, batteries, those cylindrical (sometimes rectangular, sometimes coin-shaped) objects of near-magical power that provide electricity to our devices. Without the invention of the battery by Italian scientist Alessandro Volta in 1800, it's hard to imagine our society of digital information looking the way it does today. So let's take a look at these often overlooked objects that we take for granted.
- The standard battery is made up of three parts: electrodes, an electrolyte and a separator. The electrodes are made of conductive material, and two are present in every cell, each serving a different purpose. One electrode, the cathode, connects to the positive end of the battery; this is where the electrical current leaves (and electrons enter) when the battery is being used. The other electrode, the anode, connects to the minus side of the battery, which is where (yep, you guessed it!) the electrical current enters and electrons leave during discharge. Common materials for the anode and cathode are zinc and manganese dioxide respectively. Between these two parts, and soaked into them, is the electrolyte, a paste- or gel-like substance containing electrically charged particles (called ions). These ions react with the electrode materials, driving the chemical reactions that push electrons around the circuit. The final bit is the separator, usually just a porous material that keeps the anode and cathode apart, preventing a short circuit caused by the two electrodes coming into contact. When you plug a battery into a device (say a flashlight or radio), you complete the circuit and allow the electrochemical reactions that produce electrons to proceed. These electrons, attracted to the positive charge of the cathode (yet with no way to get there directly, thanks to the separator), travel through the electric device before returning to the cathode.
- The main difference between rechargeable and non-rechargeable batteries is their chemistry. Rechargeable batteries function by reversing the flow of electrons with an external current, causing the anode and cathode to return to their original state. These batteries are made from different materials, such as lithium-ion (wherein the anode is made of carbon and the cathode of lithium cobalt oxide) and nickel-metal hydride, the latter of which is used in many hybrid electric vehicles.
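- Since the prompt highlights capacity and discharge, here is a minimal worked example of what those numbers mean in practice. The 2,000 mAh capacity and 300 mA draw below are assumed figures for illustration, not the specs of any real battery.
```python
# Back-of-the-envelope battery runtime: capacity (mAh) divided by current draw (mA).
# A 2000 mAh cell can ideally supply 2000 mA for one hour, or 200 mA for ten hours.

capacity_mah = 2000   # assumed cell capacity
draw_ma = 300         # assumed average current drawn by the device
voltage_v = 3.7       # nominal voltage of a typical lithium-ion cell

runtime_hours = capacity_mah / draw_ma
energy_wh = (capacity_mah / 1000) * voltage_v  # stored energy in watt-hours

print(f"Runtime: about {runtime_hours:.1f} hours")
print(f"Stored energy: about {energy_wh:.1f} Wh")
# Real cells fall short of this ideal: capacity drops with age, temperature,
# and how quickly you discharge them.
```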
- How does wireless charging work? Discuss with your team: is there anything to fear from wireless power?
- You've probably seen commercials for phones that can charge without wires. The latest iPhones and Androids possess that feature (starting with the iPhone 8 and the Samsung Galaxy S8), and it works off a concept related to conventional wired charging. Wireless chargers often come as small pads or stands and make for a somewhat cleaner charging setup (albeit tests have shown it to be slightly slower than conventional cable charging, though I'm sure time will change that). In terms of competing standards, Qi (pronounced “chee”) is the most popular wireless charging system, alongside others such as Powermat and the AirFuel Alliance. As for how it works, wireless charging utilises electromagnetic induction: when the back of the phone rests on the charger, an induction coil in the pad creates an alternating electromagnetic field, which the receiver coil in the phone converts back into electricity to feed into the battery (a rough charging-time estimate follows below).
- The main thing to fear from wireless power is that we may soon live in a world without cables, reliant on all sorts of wireless devices that may one day fail us. On the technical side, the main hurdle facing tech innovators and huge companies like Apple and Samsung is where to go after wireless charging.
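- As promised above, here is a rough charging-time estimate. The 5 W pad power, 75% transfer efficiency and 11 Wh battery are all assumptions for illustration; real Qi chargers and phones will differ.
```python
# Rough estimate of wireless charging time. The pad's induction coil only
# transfers part of its power to the phone's receiver coil; the rest is lost as heat.

pad_output_w = 5.0     # assumed power the charging pad drives into its coil
efficiency = 0.75      # assumed fraction that actually reaches the battery
battery_wh = 11.0      # assumed battery capacity (~3000 mAh at 3.7 V)

power_into_battery_w = pad_output_w * efficiency
charge_time_h = battery_wh / power_into_battery_w

print(f"Power reaching the battery: {power_into_battery_w:.1f} W")
print(f"Full charge from empty: roughly {charge_time_h:.1f} hours")
# A 5 W cable delivering its full power would take about 2.2 hours instead,
# which is one reason wired charging still tends to be a bit quicker.
```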
- Speaking of fear: not long ago, the batteries in certain phones gained notoriety for spontaneously exploding. A few years earlier, MacBooks were blowing up. Why do batteries explode? Is it ever safe to depend on a technology that can sometimes be dangerous?
- If you've never heard of batteries exploding, I suggest you keep a lookout for your devices (especially if you possess an older MacBook or a Samsung Galaxy Note 7). Exploding batteries are kind of like Dylan's tendency to unexpectedly behead alpacas: we know it might happen at one point or another, but when it does happen we are often shocked at why. In fact, when certain MacBook batteries began failing around 2008, it caused a huge consumer outcry, and some accused Apple of deliberately designing short-lived batteries so that you had to buy a new machine almost every year (it's called planned obsolescence, and it's a dirty business practice that continues to pollute industries). The main cause of exploding batteries lies in a breached separator. Lithium-ion batteries are by nature fairly unstable (they kind of have to be in order to store huge amounts of power in such a small form factor), and the main line of defense against a violent and extremely hot short circuit is a thin slip of polypropylene. So a battery whose separator is breached and whose electrolyte is flammable will provide a nice fireworks show (or a surprising explosion) when the two electrodes come into contact with one another.
- We have always depended on dangerous technologies and seen no problem in doing so. The first lightbulbs and their wiring were capable of killing people, the alternating current we use nowadays was once used to execute criminals (via the electric chair), and even our watches were once painted with radium-226, a radioactive element that, if not contained properly, could easily wreak havoc on our bodies. We can never expect a product to be released with no safety hazards whatsoever; granted, there are processes to ensure that products launch with as few safety hazards as possible, but even those cannot guarantee a product is free of danger. Innovation and danger walk hand in hand; we simply need to find a way to separate the two.
- Explore the future of batteries, then discuss with your team: how would better power sources affect our everyday lives? Would they mostly just make our phones last longer between charges, or would they have impacts on access to technology for, say, underprivileged populations?
- First, let's get this out of the way: batteries are here to stay. It's just not possible in the foreseeable future for humans to bypass the chemical processes that power their electrical devices, or the commercial testing required to mass-produce devices that don't rely on batteries. There are, however, researchers working on making the battery as small and as safe as possible. One such solution lies in silicon as a substitute material for the anode. According to Sila Nanotechnologies, “an atom of silicon can store 20 times more lithium than atoms of carbon”. Another avenue for safety lies in the electrolyte, where researchers are testing new materials. One company called Ionic Materials (how fitting) is experimenting with an ionically conductive plastic that is also fire-retardant (finally, a solution to those nasty explosions of phone batteries on planes). Of course, there are a multitude of other battery-related projects and experiments going on, each trying to improve one of the three parts in its own way.
- Many technological prognosticators have forecast an “Internet of Things”—in which everything around us is connected to the Internet, from the clothing we wear to the chairs on which we sit. Yet these devices would probably require batteries. Consider this potential battery technology, which uses freeze-dried bacteria. Discuss with your team: is it acceptable to use living creatures to generate electricity, or is this a form of exploitation? Would it be different if the bacteria were horses, or tardigrades? Does the calculation change if the creatures are dead?
- Despite my ability to explain the electrical mechanisms contained within a conventional battery, I struggle slightly when it comes to the experimental biological mechanisms in paper batteries. As such, please direct your attention to the website linked in the prompt in order to better grasp the concept. As for the ethical side of things, I really question why the WSC asks these questions now, when we've been exploiting animals for their power for centuries. We literally measure the power of our cars in horsepower (a unit originally defined by the load a single horse could haul), we kill thousands of endangered animals every year to fashion designer bags and clothes, and we've even managed to extract drinks from the venom of snakes (not something for the faint of heart). Humans have this sort of natural belief that their greater intelligence gives them the right to exploit and use every other species on this planet. This battery using bacteria is no different, albeit the bacteria don't resist our use of them, since they lack any such sense. Using other animals wouldn't change the situation at all.
- The calculations would probably be different if these were deceased animals. There is the excellent argument that we’re using their remains to benefit the rest of intelligent life (cue “Circle of Life” Lion King soundtrack).
- Not all devices use batteries. Some are autonomously powered, through sunlight and other means, and others we still plug in to the electric grid (a.k.a. mains electricity) through outlets in our walls. But not all plugs are the same. Explore with your team: why does a “British” plug look different than a “European” plug—where did these differences originate, and does it mean the electricity behind them is different, too? Be sure to learn the difference between alternating and direct current, and between adapters and converters.
- Perhaps there will be a future where our entire society and the products on which we rely, from our clothes to our microwaves to our cars, will run on autonomously powered devices (ones that harvest energy from changes in temperature or even ambient radio signals).
- Sometimes large groups of people behave in predictable ways; we all laugh when an alpaca falls off a surfboard. But we may be moving into a Bandersnatch era in which technologies enable a more fragmented social experience—and the consequences may even include changes in how we consume electricity. Discuss with your team: should we find ways to use technology to spread out different kinds of social burdens, from when people commute to work to when they eat their meals?
- This is rather interesting: the Telegraph explores the dramatic drop in electricity surges after conventional TV programmes (say a climactic cliffhanger episode or a particularly nail-biting football match). The rise of online streaming services such as Netflix and BBC iPlayer means the nation no longer gets off its sofas all at once to put the kettle on (ah, the British). This is just one example of how technology can fragment social experiences, and it may soon cause some concern. In my own opinion, I don't think we need to find ways to spread out these social burdens. The way we commute and the times at which we eat are already ingrained in our biological clocks; sure, applications can help us stick to those clocks (e.g. a meal-scheduling app or a reminder to start commuting), but they won't change them instantly or on such a wide scale.
- Companies such as Amazon and Google are competing to develop (and sell products for) the “smart home”—but the truth is that homes have been getting “smarter” for generations, and more comfortable for even longer than that. Discuss with your team: is there a limit to how smart and comfortable we should want our homes to be?
- Home is where the heart is. That much is true. Home is probably where we feel like the world outside can no longer hurt, exhaust or drain us of our will to live. To that end, the smart technologies making home life more effortless and seamless are a boon for our generation. As for a limit on how smart and comfortable our homes should be, it shouldn’t be to the point where we are entirely dependent on the smart technology to run our home.
- Consider the temperature control technologies that enable many of us to live even in climates that may be outside our comfort zones. How do homes stay warm when it is cold out? How do they get cold when it is hot out? Explore how indoor heating and air conditioning work. Is it worse for the environment when we warm a cold place or when we cool a warm one? Discuss with your team: to save resources, should governments regulate how much people can control the climate in their private spaces?
- Ah yes, the remote Siberian town of Oymyakon, the coldest inhabited place on earth (at least by humans). Temperatures here can drop as low as -55°C (-67°F). Toilet trips are an adventure in and of themselves, vegetables are practically unheard of, and school is almost always in session. Homes stay warm through indoor heating, whether hot-water pipes and radiators or simply insulation and smart design that keep the cold out. As for how they stay cool when it is hot outdoors, a marvelous invention called air conditioning comes in handy (of course, there is also the more conventional and environmentally friendly fan). It is probably worse for the environment when we cool a place down, considering that air conditioners can leak harmful refrigerant chemicals and their electricity use adds millions of tonnes of carbon dioxide to the atmosphere every year. Governments probably won't be able to control the climate in people's private spaces, since that infringes on a lot of basic human rights when it comes to home comforts. Perhaps they could offer incentives to those who contribute less to global warming by leaving their air conditioners off, or set up some sort of tracking system to monitor air-conditioner pollution.
- Some homes are found in areas with significant pollution. Here, people who can afford it may rely on air purifiers to try to breathe cleaner air indoors, whether at home, at school, or even in hotels and other public spaces. What are the different ways in which these filters work? If you found yourself in Beijing on a smoggy day in June, would you rather have an ionizing filter or one based on activated carbon? Discuss with your team: will increasing access to indoor air purification cause people of means to disregard increasing pollution outdoors? Should rich people be forced to breathe the same air as poor people?
- Air purifiers work pretty simply: they drag in air and move it through a series of filters that remove allergens, dust and even microorganisms that can harm the human respiratory system. Basic and HEPA filters (the most common type of air purifier) work off this system and require a filter change every couple of months or so. Ionizing filters work by producing a steady stream of negatively charged ions that charge the particles in the air, causing them to stick to positively charged collection plates inside the machine. An activated carbon filter is much more effective at removing odors, gases and even volatile organic compounds from the air in your household. (A rough purifier-sizing sketch follows after these answers.)
- If I found myself in Beijing on a smoggy June day (which I very well might when WSC globals swing around!), I would probably seek out the activated carbon filter, given that an ionizing filter doesn't so much remove the particulates as make them settle onto nearby surfaces (and people around me who are sensitive to ozone would suffer from that purifier type's emissions).
- There is obviously the risk that we won't care so much about the outdoor air quality so long as our indoor air quality is perfectly normal. Yet governments, corporations and individuals also have the responsibility to prevent that from happening, since a high level of air pollution often goes hand in hand with a rising global mean temperature (a frightening indicator of how close we are to global climate catastrophe). All people, regardless of income or disposition, should be able to breathe the same air quality outdoors as well as indoors.
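- Here is the purifier-sizing sketch promised above: a quick “air changes per hour” estimate. The CADR figure and room dimensions are assumptions for illustration, not the specs of any particular purifier or room.
```python
# Air changes per hour (ACH) = clean air delivery rate / room volume.
# It tells you how many times per hour the purifier can cycle the room's air.

cadr_m3_per_h = 300     # assumed clean air delivery rate of the purifier
room_area_m2 = 20       # assumed floor area of the room
ceiling_height_m = 2.7  # assumed ceiling height

room_volume_m3 = room_area_m2 * ceiling_height_m
ach = cadr_m3_per_h / room_volume_m3

print(f"Room volume: {room_volume_m3:.0f} m^3")
print(f"Air changes per hour: about {ach:.1f}")
# Common guidance for smoggy days suggests roughly 4-5 ACH, so a purifier
# with this CADR would be adequate for a room of this size.
```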
- Consider this argument that air conditioning created the modern city. Pay special attention to the idea that air-conditioned cities separate those who can afford to inhabit air-conditioned spaces from those left on the inhospitable street. Discuss with your team: is inequality of comfort an issue for governments to resolve? To what degree does access to enabling technology lead to a loop in which those who lack access are less able to gain it?
- Inequality is obviously an issue for governments to resolve. An air conditioner is nowadays no longer a mere comfort but a sign of societal growth and economic development (the very fact that a city has many social spaces with air conditioning is a sign of how developed that area is). Those who still have to face nature's harsh temperatures deserve the chance to enjoy the comforts of artificially cooled air.
- There is a pretty high degree to which access to enabling technology results in a sort of feedback loop for those who can't access it. When the mobile phone revolutionised the way we conduct business and connect with one another, those who couldn't afford one watched as our society excluded them more and more (without social media or the stream of information readily available at everyone else's fingertips).
- Of course, there is more to a home than being able to wear sweatpants and a t-shirt in your living room. There is also the need (or, for some, preference) to keep it clean and orderly. From washing machines to the FoldiMate, inventors have sought ways to lighten the load of housework. Look into the following devices and consider them in the context of the emerging field of home automation, or domotics. Be sure to consider not just how they function but their impact:
- vacuum cleaners | dishwashers | washing machines | home robots
- Vacuum cleaners: Nowadays vacuum cleaners range in size and application, from small handheld devices that work off batteries to reach tight places, to huge truck-sized cleaners that suck in contaminated soil or clean up after a serious spill on the roads. Yet the basic concept of a vacuum cleaner has remained practically unchanged for a century: suck in dust and other unwanted particles by creating a partial vacuum of air. The origins of this device are as messy as the homes it's called upon to clean. Many credit the concept of such a contraption to Daniel Hess, who in 1860 patented a sort of carpet sweeper with a bellows attached to create a draft of air. In 1867, Ives McGaffey of Chicago improved on the design by making the device upright in structure and implementing fans to move the air. Yet the roughly $450 price point (adjusted to today's standards) and the awkward operating mechanism (it required a hand crank) meant that the “Whirlwind” never saw mass production or major popularity. The adoption of gasoline as a fuel source in the 1890s added another step in the creation of the vacuum cleaner: John S. Thurman of St. Louis created his gasoline-powered “pneumatic carpet renovator” in 1898, making house calls for $110 (fitting, considering this vacuum cleaner was the size of a horse-drawn carriage). After the turn of the 20th century, the British had their crack (and a rather good one it was!) at innovating on American designs. British Navy engineer Hubert Cecil Booth created “Puffing Billy”, an improvement on Thurman's renovator that was often called upon by high society (in 1902 Booth was apparently commissioned to clean out Westminster Abbey for the coronation of King Edward VII and Queen Alexandra, who were later so impressed with the machine that they purchased two for Buckingham Palace and Windsor Castle). But in 1907, a humble asthmatic janitor in an Ohio department store would create the concept which formed the foundation of every subsequent vacuum cleaner. James Murray Spangler found his tedious job of sweeping the entire store extremely tiring (his asthma probably made it much worse). Yet, like a 1900s Dr. Frankenstein, he tinkered with a broom, a pillowcase and an electric motor to create the basic design of the modern vacuum cleaner. It used a ceiling-fan motor and paddle blades to create a sucking air flow, blowing the dust back out into the attached pillowcase. The next year, near-bankruptcy forced Spangler to sell the patent rights to one William Hoover, who poured millions of dollars into research and marketing, making his company a literal household name.
- The main impact that the vacuum cleaner had on our lives was how it revolutionised home cleaning. For those who could afford this enabling technology, the days of sweeping and dusting were all but gone. No longer did it take hours to completely clean a house, meaning people could do better things with their time and contribute to social welfare or the economy.
- Dishwashers: Alright then, let's turn back the clock ten years before Daniel Hess invented his carpet sweeper (that's 1850, for those keeping track out there). Joel Houghton tried his hand at inventing a machine that would automate the washing of dishes, a tedious job that often left many with broken crockery and glass cuts. His design was patented but was also a flop: it basically consisted of a hand-cranked splashing device attached to a wooden wheel. It was not until Josephine Cochrane, granddaughter of John Fitch (the inventor of the steamboat), applied her engineering talents to the job in the 1880s and '90s that the dishwasher was truly born. Josephine's invention consisted of a wooden wheel lying flat in a copper boiler, where wire-framed compartments could secure dishes and prevent broken plates. When she unveiled her design at the 1893 Chicago World's Fair, hotel and restaurant owners were stunned by this marvelous contraption. In the 1920s Miele introduced the first electric top-loading dishwasher for mass production. Yet the stock market crash and subsequent Great Depression meant that few would be able to afford such an appliance, which at that time retailed for as much as a housekeeper's annual salary. It was William Howard Livens who, in the interwar years, created the modern front-loading dishwasher, with a drying element added in 1940. Dishwashers would go on to become a common household appliance and save hours of painstaking dishwashing for housewives and maids alike.
- Washing machines: Unlike most of the other devices on this list, the concept of a washing machine has existed for quite a while, and we've been washing our clothes using the same principles for centuries. Well before the 1800s, humans used metal or wooden washboards on which they scraped clothes near a riverbank or other source of water to wash the dirt off. The Romans are credited with making a crude lye-based soap for their communal washing houses (basically huge public tubs of water where women could go to wash clothes). The main concept of a washing machine, a sort of wooden drum with a metal agitator, was developed (or at least patented) by two Americans, James King and Hamilton Smith, in 1851 and 1858 respectively. In 1908, the Thor, created by the Hurley Machine Company of Chicago, became the first electric washing machine (boy, that part of the US sure does love its household appliances, doesn't it?). After that, other companies in the US and Europe would continue to improve on the design, making the motor more efficient and the drum larger to accommodate more clothes. The first “smart” washing machines came about in the 1990s with the introduction of the Fisher & Paykel SmartDrive, which sensed load size and chose a corresponding wash cycle.
- Washing machines are pretty normal in our lives nowadays. Yet there are still a large number of people who prefer the traditional washboard-and-clothesline method (or who simply cannot afford such an enabling technology). Sometimes we take these machines for granted and forget how to wash clothes manually (my grandparents are adept at both the new and old ways of getting dirt and other grime out of their outfits).
- Home robots: Ever since the early 2000s, with the advent of cheap microprocessors and increasingly intelligent machines, companies have constantly been trying to find ways to integrate artificial intelligence into the household. The advertisements of the 1980s and '90s imagined a futuristic 2000s complete with automatic robot companions that looked after the household (somewhat disappointing, then, that such concepts never materialised). But these devices are becoming more popular in today's society. From Amazon's Alexa to Google's Home system, our households are slowly being invaded by a new sort of life, one which thinks in 1s and 0s. Though they have yet to become widespread (a challenge companies are already tackling), there may come a day when all of a household's tasks can be handled simply by speaking a command to a home robot.
- Sometimes, you want to eat something delicious: maybe a seafood dinner, maybe eggs for breakfast. You could order (it is hard to ever say no to Foodpanda) but, if you choose to cook, you’ll discover that enabling technologies are all over the modern kitchen. Explore how the following devices work, and discuss with your team: do they have drawbacks? Does the rise of such devices make it harder for people without them to cope—and does it impact typical household roles and employment opportunities?
- microwave | toaster | convection oven | rice cooker | coffee maker | juicer
- Microwave: Perhaps rather interestingly, the microwave was the result of a happy accident involving a radiation tube and some chocolate. Percy LeBaron Spencer was a self-taught American engineer who, during WW2, was working at the Raytheon Corporation, a producer of magnetrons (basically vacuum tubes that produce microwave radiation and were used in radar systems). According to witnesses, Spencer had been testing some magnetrons when he realised that the chocolate bar in his pocket had melted. A few more tests with popcorn kernels and eggs revealed that microwave radiation could cook foods faster than conventional ovens did. In 1945 his patent was filed, and the end of the war saw the introduction of the first microwave oven, named the Radarange 1161 (not the most creative of names, I'll be honest, but hey, the guy was a physicist at heart, not a showman). Initially microwaves were only accessible to businesses such as hotels and restaurants, considering the $5,000 price tag and 1.7-meter-tall design. It was only in 1967, after Raytheon acquired Amana Refrigerators, that the first countertop microwave appeared (retailing for about $500), known as the Amana Radarange (guess they stuck with Spencer's somewhat lackluster name). Today more than 90% of United States households possess a microwave, with the device ranked the “number 1 most useful household appliance” in that country (via some surveys taken in the early 2000s). As for how it works, the concept remains practically unchanged since Spencer first debuted his brainchild. A microwave oven is normally made of metal, since the material does not allow the radio waves being used to escape the device (protecting you from radiation-related harm). Magnetrons within the microwave (essentially two magnets on either side of a vacuum tube) use magnetic and electric fields acting on a flow of electrons to generate microwaves (radio waves at a frequency of 2,450 megahertz), which then interact with the water molecules in the food, making them vibrate and producing thermal energy (a quick wavelength check follows after the next answer).
- There probably are some problems with microwaves: they draw a large amount of power per use, and they can prove a health hazard if poorly constructed. If anything, though, some jobs have been made more convenient by the introduction of microwaves. Maids and nannies no longer need to spend a large amount of their time cooking food for the younglings; they can simply microwave pre-prepared food.
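- And here is the quick wavelength check mentioned above: a short piece of arithmetic showing what the 2,450 MHz figure corresponds to physically.
```python
# Wavelength = speed of light / frequency.

speed_of_light_m_s = 3.0e8  # approximate speed of light in a vacuum
frequency_hz = 2.45e9       # 2,450 MHz, the standard microwave-oven band

wavelength_m = speed_of_light_m_s / frequency_hz
print(f"Wavelength: about {wavelength_m * 100:.1f} cm")
# -> about 12.2 cm, which is why oven cavities are tens of centimetres across
#    and why a rotating turntable helps even out hot and cold spots.
```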
- Toaster: The concept of toasting has been around for quite a while. Roman soldiers apparently took pieces of bread with them on their military campaigns, taking the time at night (or in the morning, if there was no danger nearby) to brown the bread over an open flame. The actual word for toast comes from the Latin “tostum”, meaning “to scorch/to burn”. The British took a liking to this understated meal and carried it with them to the Americas when the 13 colonies were set up. In 1893, the advent of electricity allowed the British Crompton Company to market the first electric toaster, known as the “Eclipse”, though it wasn't that great of a device. The iron wiring used in the filament meant that a fire risk could often develop (so the cartoon cliché of burnt toast could also translate to a burnt house in that day). In 1905, Albert Marsh began to apply his metallurgical know-how to the situation. He developed an alloy of chromium and nickel that was later named “Nichrome”; this new heat-resistant material was implemented in toasters by George Schneider to make Dew toasters. In 1909 Frank Shailor, working at General Electric (yes, that company which Thomas Alva Edison had a part in creating), created the “D-12” toaster. While this toaster was better at toasting bread, it could only toast one side at a time and required the operator to open it and turn the bread over when that side was done (lest said operator return to find their kitchen in flames). After several more editions of the toaster, Charles Strite added the final components that make up the modern toaster: timers and springs. By attaching a timer and spring to the mechanism, people no longer needed to worry about their toast being overdone or having to turn the toaster upside down to get the bread out (always a pain thanks to those crumbs). Over the years toasters would continue to develop, increasing in capability while decreasing in cost (by the 1950s, toasters knew when the bread was fully toasted and ejected it themselves).
- The main drawback of a toaster is that even modern ones can be annoying to operate. The slew of electronic buttons and cramped spaces complicates the inner workings of such a device. I see no reason why the toaster makes the lives of those without it harder. Sure, they might not be able to eat their bread toasted by the wonders of electricity and technological wizardry, but the handy “open-fire” method serves the purpose just about as well.
- Convection oven: Much like the toaster, the convection oven technically had its roots well before the advent of electricity or automation. Ancient civilisations as far back as the Egyptians, Greeks and Romans (basically the “big three” of ancient civilisations) cooked their food in ovens powered by wood-fed flames (those still exist today; just visit any traditional pizzeria or old restaurant). The convection oven itself was only introduced in 1945 (gee, a lot of these inventions started breaking new ground just after a war, didn't they? Then again, “war is the mother of invention”). The Maxson Whirlwind Oven was the first model introduced with a convection system, allowing for the equal distribution of heat around the food (so think of it as a Communist oven). Essentially, a convection oven contains a fan and exhaust system at the back of the module, blowing hot air over the food and then venting it back out (as opposed to a conventional oven, where the heat just sort of lingers around and heats up food as it wishes). Convection ovens actually let us cook food faster and save energy in terms of electricity consumption. While those who don't possess a convection oven might not be able to cook or bake their foods quite as quickly as those who do, it isn't necessarily an enabling technology whose absence makes it harder for them to cope.
- Rice cooker: Ah, the rice cooker, the humble and often overlooked kitchen appliance that has made billions of lives much easier around the world. Prior to the invention of the electric rice cooker, people mainly relied on stoves to cook rice in pots (in many places of the world, this is still a common technique). It was, fittingly, an Asian country that began innovating on the rice cooker (your guess as to which one). If you guessed Japan, you're correct (really a no-brainer, though China would've been another logical guess). In 1945, just after the end of WW2 and two nuclear bombs (gee, we sure do love to make household things after a war, don't we?), the first rice cooker prototype emerged in the country. It was essentially a wooden tub connected to a few electrodes, into which one poured a few cups of rice and water; the boiling of the water by heat from the electrodes allowed the rice to cook. Initially these rice cookers were a major health risk, since cooks could be electrocuted if they touched the wrong parts. Mitsubishi, adept at creating cars, then applied its know-how to this field. Replacing the wooden pot with an aluminium one increased the conductivity of the material (allowing for faster cooking times); after that it was merely a matter of improving on the design through minor additions. Modern rice cookers use electric heating elements to boil the water and cook the rice, and LCD screens and touch buttons are recent additions, allowing for different cooking settings (“fluffy” rice is apparently a thing).
- The main drawback of a rice cooker is that it can often take a while for the rice to cook, anywhere from 45 to 90 minutes depending on the model. Furthermore, actually scooping rice out of one can still be a bit hazardous: the aluminium (or other metal) pot is extremely hot after cooking, and I myself have suffered some burns while getting rice.
- Coffee maker: Yet another device which our parents and most adults can't seem to live without, the coffee maker is as rich in history as the flavour of the drinks it brews. Coffee itself has an interesting history (which I won't go into in depth), but essentially the Turks were drinking coffee centuries ago, and the drink spread to Europe, Asia and later the Americas (given its African birthplace, that is quite an expansion!). For centuries, people made coffee with practically the same process: roasted and ground coffee beans were placed into a pot with some water and boiled until the coffee was fully brewed. It wasn't until the Archbishop of Paris came along in 1780 that the concept of drip-brewing was invented (trust the French to add some style to the way we brew our drinks). I'll be honest: the history of the coffee maker is rather complex, since there were so many variations in the ways coffee was made (how the Turks made coffee differed distinctly from how an Italian did, which differed even more from how the Americans did it). The basic concept remains the same, however: pass hot water through ground coffee beans to create the caffeinated drink that is the third most consumed beverage in the world (after water and tea). It probably isn't so disadvantageous for people who can't afford their own coffee maker, seeing as plenty of cafes and restaurants will easily provide that service at a (generally) reasonable cost.
- Juicer: After a tiring workout or a long day, a nice cold glass of juice (be it orange, apple, guava or another fruit or vegetable) is sometimes all we need to cool down. Humans have long loved extracting juices from natural produce, and archaeologists have found evidence of juice-making processes dating as far back as 150 BC, when humans pounded pomegranates and figs until the juices came out. It was not until 1936 (rather late compared to the other devices) that one Norman W. Walker (a British businessman and pioneer of nutritional books about juices) invented the first true juicer: the Norwalk juicer. Interestingly, one can still buy a Norwalk juicer model to this day. Since then, many different companies have tried to improve on the Norwalk, with methods ranging from fast-spinning blades to the traditional mortar-and-pestle pounding action. To this day, the juicer is really still primitive in its design; we don't see many models with different settings and LCD screens (some, in fact, are still manually operated!).
- pressure cooker | magnetic stove
- Pressure cooker: One of the less conventional (and certainly less common) inventions around the kitchen, a pressure cooker is nonetheless rather useful when it comes to cooking foods quickly and with less fuel consumption. A pressure cooker works in a similar fashion to a rice cooker (in fact the latter is merely a more specific extension of the former). It traps steam produced by the boiling of a cooking liquid (usually water) inside the pot, causing the internal pressure to rise along with the temperature. However, despite being less popular than other kitchen appliances, the pressure cooker actually began its developmental stage much earlier than the others. In 1679, Frenchman Denis Papin invented the “steam digester”, a simple airtight cooker that combined pressure and steam to lower cooking times. About a hundred years later, Nicolas Appert (a confectioner) developed a canning process by sealing foods in jars and cooking them in boiling water, contributing to the concept of pressure cooking as we know it today. Over the next century, another bunch of individuals would have a crack at this invention (some would succeed, others would crack under the pressure). It wasn't until 1939 that German Alfred Vischer presented his “Flex-Seal Speed Cooker” at the New York World's Fair. The US and UK would then lead the “pressure cooker revolution” as companies from both countries invested millions of dollars (or pounds sterling) into making the device larger yet more efficient. Modern pressure cookers often have electronic settings for the type of food being made, with the most basic models having a rubber seal and a whistle to tell the user when the cooking is done.
- The pressure cooker does have some drawbacks. Firstly, it can be dangerous to use: you cannot simply leave it and go do something else (explosions have occurred due to pressure-cooker-related recklessness). Furthermore, you can easily burn yourself if you don't wait a while before scooping out the cooked food (even though much of the steam has been released gradually, the metal pot can still be extremely hot).
- Now, pressure cookers aren't exactly so enabling in their nature, and people who can't afford them can simply rely on heat from a stove (or a wood fire) instead. A pressure cooker really serves just to minimise the cooking time, not to revolutionise the way in which we cook.
- Magnetic stove: Alright then, one of the newest items on this list ironically has an explanation so complex and confusing that I'll leave it to this website to explain the whole thing. Basically, a magnetic stove (or induction stove/hob, as it's more properly called) heats up pots and pans directly instead of relying on indirect thermal radiation like conventional stoves do. An easy way to tell magnetic stoves apart from others is the lack of any gas outlets or metallic pot-rests above the module. Now, magnetic stoves are certainly advantageous in that they can boil water and cook foods somewhat faster than a conventional stove can, though they do require specialised pots and pans (those labelled “induction-compatible”) and can be a notch more expensive to install (anywhere between $1,000 and $10,000 depending on the size and model).
- It doesn't really matter much whether you have a magnetic stove or not (I don't; my family still relies on old-fashioned metal stoves). While it may be faster, a normal stove will suffice for anyone wishing to cook.
- microwave | toaster | convection oven | rice cooker | coffee maker | juicer
- Consider the technologies of personal grooming: hair dryers and straighteners, shavers, waxing, cosmetics, even toothbrushes. When were they invented, and how have they changed over time? Discuss with your team: is a device still an enabling technology if it helps us achieve an aesthetic goal but does not technically make us more “able”?
- Apologies scholar, I am exhausted from having to write so many explanations about the inner workings and history of the devices which we rely on to create food. I refuse to go in depth with the science and history of personal grooming inventions, so I strongly suggest that you do so yourself (especially on those inventions which you’re interested in or use a lot).
- A device is not an enabling technology if it only helps us accomplish an aesthetic goal. Without hair-dryers, straighteners, shavers and cosmetics we are still capable of looking rather human, and we can always find more natural ways to spice up our looks (natural pigments were and still are used in many cosmetic items, with some toothbrush bristles still being made from animal hair).
- We often glorify high-profile technologies and those who develop them—but we spend less time considering the less glamorous technologies and those who need to work with them. For every computer programmer, there is an electrician who makes it possible to plug that computer into a wall. Discuss with your team: what are some of the technologies we think the least about but rely upon the most?
- The technologies that we think the least about yet rely on the most are the ones we never see in our everyday lives. From the power stations and substations that distribute electricity to our houses (not to mention make it safe for humans) to the invisible water pipes that heat and carry our water to where it's needed, the hidden enabling technologies are the ones that'll never get noticed until they break down.
- For instance, consider waste disposal and treatment. Every time you walk out of a bathroom, you leave something behind. Where does it go, and where does the water you use come from? Explore with your team: how does plumbing work, and what are some of the latest innovations in the industry? Be sure to spend some time looking into the world of high-tech toilets, trenchless sewers, and even paper towel dispensers, including new models in China that use facial recognition to limit how much you take. But also consider how the legacy of older technologies can still affect us today—in this case, leaving millions of Moscow residents without running hot water every summer.
- Plumbing basically works on the principle of pressure. Two different systems are hidden within the walls, floors and ceilings of your home: one carries freshwater from the building's water supply into the home, while the other carries wastewater into the sewage system. That's basically how the two work, and if you want more specifics kindly click on the link (as I will be asking questions about those specifics on the practice challenge!). As for the latest innovations in this industry, they're already pretty evident around us. Any visitor to a 5-star hotel, expensive shopping mall or other high-end building will likely encounter toilets that have their own buttons (usually for flushing, spraying or even keeping the seat warm!). Paper towel dispensers still have traditional aspects, though the Chinese approach of limiting how much we take is rather ingenious. As for the legacy of older technologies, that is a rather interesting case. Every year in Moscow, the Russian capital's 12 million or so inhabitants experience a dreaded ritual in which most of the city's hot water supply is shut off. The ritual dates back to Soviet times, when the supply of hot water was a centralised government service (meaning that instead of each house having its own heating unit, the municipal government controlled the water heating plant for one large area). During the winter months this supply saw heavy use (considering the Russian winter, this is a necessity), and in the summer the city would stop providing the service, allowing the municipal administration to conduct repairs on the hot water system (as with most Soviet-built things, it was easy to repair whenever it inevitably broke down).
- Sometimes bugs and vermin invade our homes and workplaces, from Argentine ants to weasels and rats. Is there a way to build a wall to keep them out? Consider the pest control industry and the technologies that enable it. Discuss with your team: do we unfairly criticize traditional chemical pesticides for their impact on human health, or do they create an artificial world order centered on unhealthy and unsustainable food production?
- The pest control industry is one of those areas in which we reluctantly need to keep innovating. For our one true enemy when it comes to feeding ourselves is Mother Nature itself (somewhat ironic that the force which gave us land and water to cultivate crops can also breed animals which destroy and kill those crops). Ever since humanity has been able to find ways to grow crops, we've also had to find ways to eliminate the critters who threaten our food supply: from simply beating away insects, to the ever-stereotypical scarecrow, to the use of chemical engineering. The article in question explores a new technology: the Photonic Fence, a structure based around a targeting system that identifies pests hostile to a certain crop species and fires a concentrated laser to neutralise the threat (kind of like a Star Wars fence!). We don't unfairly criticize traditional chemical pesticides for their health hazards and the temporary world order they've created; it is the sad byproduct of relying on nature to kill nature (fighting fire with fire).
- You are probably reading this outline on a device built on an assembly line. Consider the technologies that make mass production possible, from stainless steel to the conveyor belt. In the context of industry, what is a “prime mover”? Discuss with your team: do newer technologies make manufacturing jobs less “dirty” but also less valued? Or does modern society’s interest in manufactured products mean we respect those who manufacture them more than ever?
- In the context of industry, the “prime mover” is an initial mechanical or natural source of motive power (literally, what causes movement in any mechanism): the steam in locomotives, the oil in our cars, the electricity in our electronics. Newer technologies can make manufacturing jobs less “dirty” and also less valued; one of the greatest fears many people have is the rise of artificial intelligence and the number of “dirty” jobs it will take over and devalue when it does come online en masse (robot apocalypse, anyone?). As for the assembly line, we must thank one Henry Ford for revolutionising the way our products are made. Initially, many products were made by hand, with an individual assigned to a certain product overseeing its production from start to finish. As a result, production costs were high (it did take quite a bit of education and craftsmanship to create anything commercial back then) and the efficiency of factories was limited. Keep in mind, though, that while Henry Ford did popularise the assembly line, it was Ransom Eli Olds who actually implemented it first in his car factories (Ford later added the conveyor belt, letting him take the credit for the moving assembly line). We give barely any respect to the people who create our products: we would rather respect the businessman (or woman!) who designed the product than the thousands of employees who work day in, day out to assemble those products in packed, noisy and fairly tedious conditions.
- We have come a long way from Captain Hook. Take some time to consider each of the following “assistive” technologies and how it functions, whether by bending light or by limiting range of motion. Which has been around the longest, and which are evolving the most quickly?
- prosthetics | hearing aids | walkers | wheelchairs (including racing wheelchairs)
- Prosthetics: You'd probably think that prosthetics were invented rather recently and that the ability to replace our appendages with a mechanical substitute is something that only came about in the past 200 years. Well, you'd be wrong. Archaeologists and historians have found evidence of the Ancient Egyptians (yes, that's right, 3,000 years before today) creating prosthetics, namely in the discovery of a wooden toe next to the mummy of a woman. From that era, basic prosthetics made from wood, bronze or even straps of leather have been found. The most famous example of European historical prosthetics is the Roman Capua Leg, a bronze casing for a fake leg, likely owned by a Roman soldier around 300 BC. In the Middle Ages, prosthetic development hadn't gone so far (understandable given that those times were also called the Dark Ages), yet a key introduction was the use of hinges on arms and legs. Over the next 200 years, different countries, companies, governments and individuals would get involved in the development of prosthetics. Unlike most other devices, the prosthetic wasn't standardized or popularised; you've got to remember that this wasn't some sort of mass-produced good you could easily walk into a pharmacy and obtain, so you often had to get one specially made for your case. It was during the American Civil War that prosthetics development began to skyrocket (given the injuries sustained, this seems appropriate). World War 1 would also see entire governments set up prosthetic limb factories for soldiers missing an arm (or leg). While early prosthetics simply consisted of cranks, straps and gears, they were later modified to be controlled by the person's other limb. Nowadays we have expensive prosthetics that can respond to our thoughts, and soon these may become commonplace and affordable for the masses.
- Hearing aids: Keep in mind that hearing aids are not some sort of “miracle healer” tool that can suddenly relieve a deaf person of their inability to hear noises. Hearing aids differ from those devices (known as cochlear implants) quite significantly. Before we delve into the function of one, however, let's take a look at the history of these devices. In the 17th century, hearing aids came in the form of “ear trumpets”: literally a hand-held tube of sheet metal or another resonant material that amplified any incoming soundwaves and funneled them directly to the user's ear. Yet during this time, there were no mass-production concepts for such a device (there didn't need to be; not everyone was hard of hearing). These devices didn't work too well and were very tiring to use (imagine having to lift one to your ear and keep it there during an entire conversation!), but that didn't prevent them from being the only available option until the invention of the telephone. Once electricity became a mainstay in the cities of the West, people began to realise that telephone receivers were actually better at amplifying sounds for partially deaf people than those hearing aids were (score for Alexander Graham Bell!). It was another famous (or rather, infamous) electricity mogul named Thomas Edison who took the technology to its newest interpretation: carbon transmitters. Designed to be used with a telephone, the carbon transmitter amplified the sound within an electrical signal by 15 decibels (about half of the amount necessary for those hard of hearing to fully understand what's being said). These were in use from the turn of the 20th century to the 1920s, when vacuum tube hearing aids were produced. The use of vacuum tubes allowed the sounds to be amplified by about 70 dB (more than enough for the user), yet these tubes were also extremely bulky and not practical for everyday wearing. A small wooden box often had to be hung around the user's neck, with a connected receiver that they also had to hold up to their ear whenever sounds they wanted to hear were being amplified. It was only after the Second World War that the hearing aid became more compact, with pocket-sized units being produced (albeit the mess of wires that connected the amplifier to the earpiece was slightly less than cosmetically appealing). It was in 1948 that the invention of the transistor (a switch controlling the movement of electricity by changing the flow of electrons) finally allowed Raytheon Corporation (remember them?) to create hearing aids that could slip behind the ear and fit within it. Since then, digital technology has allowed them to pick up more sounds and become even smaller (with the invention of silicon transistors shrinking the form immensely).
- Hearing aids function in three parts: a microphone picks up sounds and sends them to an amplifier, which makes them louder; a receiver (essentially a tiny speaker) then delivers the amplified sound into the ear for easier hearing. (A toy version of this chain is sketched below.)
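- Here's a toy illustration of that microphone-amplifier-receiver chain (my own sketch, not how any real hearing aid is coded); real devices apply frequency-dependent, compressive gain rather than one flat boost.
```python
# Apply a fixed gain (in decibels) to some "microphone" samples and clip them
# to the receiver's output range -- a crude stand-in for the amplifier stage.
def amplify(samples, gain_db=30.0, limit=1.0):
    gain = 10 ** (gain_db / 20.0)                   # decibels -> linear factor
    boosted = [s * gain for s in samples]           # "amplifier" stage
    return [max(-limit, min(limit, s)) for s in boosted]  # crude output limiter

quiet_speech = [0.001, -0.002, 0.0015, -0.0005]     # pretend microphone samples
print(amplify(quiet_speech))                        # louder samples for the "receiver"
```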
- Walkers: Otherwise known as a walking frame, a walker is a tool that provides elderly or mobility-impaired people with additional support and balance for movement (most commonly walking). You might also know this sort of enabling technology as a Zimmer frame, after the namesake UK company that mass-manufactured them. The basic design of a walker is pretty straightforward and hasn't evolved much since its introduction: a waist-high lightweight frame that's slightly wider than the user (hence the need for pediatric and bariatric walkers for children and larger users respectively), usually with wheels or tennis balls on the front two legs depending on the strength of the person. Modern walkers are usually height-adjustable, since they're made for whoever might wish to use them (the key is to maintain a slight bend in the arms for comfort and control). In the early 1950s, a man by the name of William Cribbes Robb received a patent for his “walking aid”, whilst later inventors such as Elmer F. Ries and Alfred A. Smith would receive patents for non-wheeled walkers and modern frame walkers respectively.
- Wheelchairs: One of the oldest technologies on this list, the wheelchair predates many other enabling technologies for those with physical impairments. Friezes from Ancient Greece and stone engravings from Ancient China depict wheeled seats being used to transport humans. Yet Europeans would not catch on to such a technology until 1595, when an unknown inventor created an “invalid's chair” for King Philip II of Spain, whilst in 1655 a paraplegic (someone unable to fully control their lower extremities) watchmaker by the name of Stephan Farffler created a self-propelling chair with three wheels (not the most popular design at the time, apparently). In 1783, John Dawson would create the Bath wheelchair (so named for the town in Britain where it was created). While not the most comfortable of chairs, it did somehow manage to win over customers with its three-wheel design (two at the front, one at the back). In 1869 a patent emerged for a wheelchair with rear push wheels and front caster wheels (caster wheels are the ones you find on supermarket trolleys, able to spin and face whatever direction they're travelling in). Over the next 45 years inventors would add more innovations to the wheelchair, from hollow rubber wheels to push rims (those small bars on the outside of the wheel for self-propulsion) and even spoked wheels for lightweight construction. It was in 1932 that engineer Harry Jennings created a folding steel wheelchair (similar to most we see nowadays) for his paraplegic friend Herbert Everest (who had broken his back in a mining accident). When the two founded Everest & Jennings, they would come to monopolise the US wheelchair market for many years. While the two were mass-producing their tubular frames, Canadian inventor George Klein (the same person who invented the surgical staple gun) was leading a team of engineers at the National Research Council of Canada in a program meant to assist wounded veterans after WW2. It was there that the first electric wheelchair, with its own motor for easier propulsion, was created. In 1965 Everest & Jennings would mass-produce these electric wheelchairs. Even now wheelchairs are still being revolutionised, with research ongoing to see if they can somehow be controlled by their users' minds.
- Racing wheelchairs: If any of you have ever watched the Paralympics before, you've probably been astounded at the lightning-fast speeds and incredible stamina of the wheelchair racers: paraplegic athletes who compete in races using specially designed racing wheelchairs. Credit for the concept of the Paralympics (as well as the main impetus behind the invention of the racing wheelchair) goes to one Sir Ludwig Guttmann, a German-born neurologist who worked at the Spinal Injuries Centre within Stoke Mandeville Hospital in Aylesbury, England. He introduced competitive sports as a rehabilitative activity that injured veterans of WW2 could take part in. Over the next couple of years these games would spread across Europe and later America, with the first Paralympic Games occurring alongside the 1960 Rome Olympics. Racing wheelchairs differ from normal ones in that they're built for speed and as such possess only three wheels (two large tires at the back and one at the front). The athletes manually propel themselves by pushing huge push rims on each tire, leaning into the wind as they jockey for a lead position. Truly one of the greatest ways in which our society has been able to connect those who might not be able to perform the same daily activities we do: sports.
- speech recognition | eyeglasses | sign language to speech conversion
- Speech recognition: Picture this: a “couch potato” reclined on (what else?) a couch watching some television show (or more fittingly, a Netflix series), when the brilliant idea of ordering some food (be it a pizza or a healthy salad) enters their mind. They simply say “Hey Siri!” and the iPhone in their vicinity wakes up and asks what they'd like to know. This is the magic of voice recognition, making technology more accessible for those who might not possess the ability to type on a keyboard or use a mouse. Yet this technology, though it seems fairly new, actually has its (albeit very limited) origins in the 1950s. Before full speech recognition could occur, there was an era known as “baby talk”, where only spoken digits could be understood. In 1952, “Audrey” was invented at Bell Laboratories (yes, they had the somewhat sentimental tradition of naming their computers back then). Later improvements in computers and the technology meant that more words and numbers could be understood. It was only in the 2000s, after the popularisation of personal devices, that Google (and later Apple) would include speech recognition software in their devices, allowing one to connect to the cloud of data and pull up results for their queries. As for how these softwares work, I'll let this website explain that for you (though there's a tiny usage sketch below).
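- If you'd like to poke at cloud-style speech recognition yourself, here's a hedged sketch using the third-party Python package SpeechRecognition (pip install SpeechRecognition); the audio file name is made up, and the actual transcription happens on Google's servers, much like “Hey Siri” hands your request to Apple's cloud.
```python
# Transcribe a short recording by sending it to a cloud recognizer.
# Requires an internet connection; "order_a_pizza.wav" is a hypothetical file.
import speech_recognition as sr

recognizer = sr.Recognizer()
with sr.AudioFile("order_a_pizza.wav") as source:   # hypothetical recording
    audio = recognizer.record(source)               # read the whole clip

try:
    print(recognizer.recognize_google(audio))       # cloud does the heavy lifting
except sr.UnknownValueError:
    print("Sorry, I didn't catch that.")
```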
- Eyeglasses: Ah, finally, a technology on this list that I use constantly on a daily basis (and admittedly, something I depend on in order to see properly). Eyeglasses consist of lenses, which bend light and focus it on the retina in specific ways depending on the power of the design (this is similar to what you read at the beginning). Originally referred to as “reading stones” (basically a magnifying glass that people placed on texts to help them read better), the basic building blocks of glasses have existed since around 1000 AD, yet the actual invention of glasses came sometime in the 13th century. We don't actually know exactly when and where glasses were invented, though works by the Italian painter Tommaso da Modena in 1352 suggest that eyeglasses were likely the work of Italian (specifically Venetian) glass-blowers, who formed the lenses and perched them with a frame on the nose. These were likely used by Italian monks, and the first prescriptions for eyeglasses appeared during this time. Over the following centuries, more individuals would add more variations to eyeglasses. Benjamin Franklin was noted for his invention of bifocal glasses (where each lens combines two different powers, one for distance and one for reading).
- Sign language to speech conversion: This is one of the more experimental devices on our list. The basic concept is pretty self-explanatory: some sort of motion sensor (or camera) records sign language, and then (through some sort of artificial intelligence) the gestures are instantaneously matched against a database and relayed as spoken words or sentences (a purely hypothetical pipeline is sketched below). Perhaps in the near future this'll be a hand-held appliance (or even something integrated into an earpiece!) which would bridge the gap between those who cannot hear and those who can.
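- Since the device itself is still experimental, here's a purely hypothetical pipeline sketch of my own; every function and the tiny gesture table below are placeholders invented to show the flow (capture, classify, look up, speak), not a real API.
```python
# Hypothetical sign-to-speech flow: classify each "frame", look up the word,
# hand it to a text-to-speech stand-in. All names here are made up.
GESTURE_TO_WORD = {"open_palm": "hello", "thumbs_up": "yes"}  # toy lookup table

def classify_gesture(frame):
    # Placeholder for a trained model; here we pretend the frame IS the label.
    return frame

def speak(word):
    print(f"(speaker says) {word}")   # stand-in for a text-to-speech engine

for frame in ["open_palm", "thumbs_up", "unknown_sign"]:      # pretend camera feed
    speak(GESTURE_TO_WORD.get(classify_gesture(frame), "..."))
```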
- adaptive eating devices
- Adaptive eating devices: These range from the basic utensils in our kitchen to high-tech robotic assistants for those with difficulty controlling their hands or feeding themselves. Adaptive eating devices are everywhere you look (even straws somehow fit into this category); the most basic ones consist of spoons, forks, plates, mugs and bowls that have easier-to-hold grips or specialised designs for ease of use. The higher-tech devices often rely on robots or even automated systems in order to provide assistance to those unable to feed themselves. Perhaps in the future these features can be integrated into a robotic servant in order to provide a more seamless home comfort experience.
- Look at the origins of the “optophone” in the early 1900s—an optical character recognition technology that could “sound out” letters and numbers for the blind. If a similar device were developed for music, would it be more useful for entertainment or for education? What new applications might devices of this kind make possible?
- One of the earliest attempts to help blind people understand written texts better, the optophone was invented in 1913 by Dr. Edmund Fournier d'Albe of Birmingham University and relied on selenium photosensors to detect black print and convert it into an audible output. The audible output was a series of time-varying chords of tones that the blind reader could then interpret. Only a few units were built, however, and the limitations of the time meant that it was extremely impractical and very inefficient (a demonstration at the 1918 Exhibition showed a snail's pace of one word per minute). Later models could achieve 60 words per minute, but even these were complex to operate and required training in order to decode the tonal chords.
- If such a device existed for music, it would probably find uses in both entertainment and education, with teachers using it to teach chords to blind students and friends using it to include them in musical activities. With the technology we have today, there would probably be plenty of other applications for the blind as well.
- Consider technologies that allow us to track and improve our own health—from fitness bands and glucose monitors to stationary bicycles. Discuss with your team: is it possible for these health-enabling technologies to be too helpful—and, if so, in what ways?
- Walk into any gym and you'll find yourself surrounded by a myriad of these technologies, ones that track our heart rate, monitor our progress and allow us to gauge how “fit” we are. While it may be possible for these health-enabling technologies to be too helpful, the consequences aren't that severe. Granted, we may stop working out as much as we should because “according to the fitness band” we've already burned enough calories for the day. Actually, this may soon become one way for the robots to take over the world: make humans feel as though they're “fit” when in reality they're extremely unfit, and then exploit that weakness to conquer us (just a theory).
- Enabling technologies do more than help people overcome physical impediments; they can also address social and resource limitations. Could 3D printing improve the living conditions of people without adequate access to housing in their communities, or does it face obstacles that this article overlooks? Could we one day unpack portable classrooms from our car trunks? Discuss with your team: what other applications can you imagine for 3D printing that might help those in need? Be sure to look at its use to create prosthetic limbs and even more comfortable helmets.
- Ah, 3D printing, the true “magic” of our world, whether it's printing entire houses for a fraction of the price in a fraction of the time, enabling those with amputated limbs to participate in daily activities again, or even giving the Vatican Swiss Guard some stylish new helmets to show off on parade. 3D printing could indeed improve the living conditions of people without adequate access to housing, but it'll have to tackle a few more obstacles along the way, namely how to set up such machines and where to build these houses (I doubt a municipal government will be OK with clearing hundreds of slums just to make a few dozen 3D-printed houses). Maybe one day we can take school with us (literally!) by 3D printing foldable tables, chairs and blackboards. Perhaps the schools of the future will take learning to the great outdoors even more so than the ones today, with students changing their school location every day. Other than the applications listed in the prompt, I can imagine 3D printing giving those in poverty printed foods, affordable furniture, works of art to culturally enrich themselves and maybe even phones with which they can finally become one with the age of information.
- Sometimes we choose ways in which to limit our own lifestyles, out of concern for health, religion, or the environment; sometimes those choices are made for us. Either way, a person might want something that would ordinarily be uncomfortable or out of bounds. A vegan might crave a burger; a left-handed person might benefit from a mouse in tune with their intuition. Explore the science and design of vegan meat substitutes and of “left-handed” products, then discuss with your team: should technology allow people to bypass limits that they choose for themselves? Why would it be controversial whether genetic engineering could produce kosher pork? Should the government mandate that left-handed products cost the same as their right-handed counterparts?
- The science and design of vegan meat substitutes is still fairly natural: vegans still rely on tofu, tempeh and other vegan-friendly substitutes to get their protein (some of those substitutes are extremely delicious, I might add). However, perhaps genetic engineering will allow vegans to finally join us and eat meat without fretting over the impact it had on some animal elsewhere (an understandable argument, by the way). Scientists are still trying to make these “lab meat” burgers (grown from the cells of living animals under laboratory conditions) cost-friendly and more like real meat (since the uncanny valley comes into play here as well). Until then, we'll simply have to wait it out and deal with the dietary restrictions of some cultures and religions. Technology can allow people to bypass limits that they set for themselves, especially if those people look at the fine print when it comes to those limits (granted, there will still be those fanatical faithful who refuse to eat genetically engineered meat, or those vegans who won't eat anything made remotely from animals).
- If there ain't no such thing as a free lunch, it might as well be easy to pay for. Recent years have seen the rise of new technologies for buying things, and for merchants to keep track of what they are selling. Consider technologies that “smooth out” financial transactions. How do contactless credit cards work? What about Apple Pay? In some parts of the world, people even pay for things with QR codes. Where is this practice most common, and why?
- Contrary to popular belief (or simply as a result of poor naming), contactless credit cards aren't exactly “contactless”; they still require the user to wave or tap the card on a reader. Basically, these cards have a chip inside them (plus an antenna embedded in the plastic) that responds over radio waves whenever a payment terminal's field reaches it. Also known as an RFID (radio frequency identification) system, the card's radio response is picked up by the payment terminal and the transaction is (usually) successfully carried out. Contactless credit cards are most common in developed regions of the world, mainly Europe, North America and some countries in Asia (Singapore, China, South Korea, Japan).
- Apple Pay works off the principles of near field communication (NFC, something we'll delve into later on). Basically, the system (first launched with the iPhone 6 in 2014 and initially only available in the US, UK and a handful of other countries) is a rehash of contactless credit cards: when your phone detects a payment terminal, Apple Pay wakes up and asks for a fingerprint in order to verify the transaction.
- QR codes are something that other parts of the world (particularly East Asia) have embraced even faster. Basically, QR code payments work through smartphone apps linked to a mobile wallet (or, in some cases, a bank account). Instead of tapping your phone on a reader, you display a unique QR code to the cashier, who then scans it and takes the necessary amount of money for the transaction. Google, Apple and many other companies (both technological and banking-related) are adapting their own payment systems to be compatible with QR codes.
- Be sure to explore how each of the following works:
- near field communication | payment terminals | captchas
- Near field communication: Otherwise known as NFC (and often indicated by a special blue and white symbol), near field communication is a rising payment method in smartphones (popularised by Android phones from makers like Samsung and Sony). The basic principle of this technology is similar to Wi-Fi or Bluetooth: transmit information over radio waves to be received by another device. As mentioned previously, RFID technology powered by electromagnetic induction (the same phenomenon used in wireless charging and magnetic stoves) is used in near field communication. Basically, the radio waves carry commands or verify transactions for payment terminals (something to be discussed next) to read and then act upon. I'll leave this website here for any further investigation, and a toy simulation below.
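- Here's a toy simulation of my own of the NFC idea: a passive tag only answers once the reader's field has “powered” it, at which point it hands back a small data record.
```python
# A heavily simplified model of reader-powers-tag, tag-replies-with-a-record.
class PassiveTag:
    def __init__(self, record):
        self.record = record          # e.g. a payment token or a web link
        self.powered = False

    def energise(self):               # stand-in for electromagnetic induction
        self.powered = True

    def respond(self):
        return self.record if self.powered else None

tag = PassiveTag({"type": "payment-token", "token": "tok_demo_1234"})
print(tag.respond())   # None: no field, no power, no answer
tag.energise()         # the reader brings the tag within a few centimetres
print(tag.respond())   # now the record comes back over the radio link
```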
- Payment terminals: If you've ever submitted to the cruel machinations of our capitalist society and handed over your credit card (or seen your parents hand over theirs), then you've probably seen the cashier insert it into a payment terminal. The terminal is often made up of a PIN pad (where you or your parents discreetly enter the PIN to verify the transaction), a screen (to display any relevant prompts or information), a strip on the bottom or side (where the credit card can be swiped or inserted) and a network connection (for authorization of payments and communication with the bank). Whenever a credit card is inserted into the machine, it contacts the merchant services provider or bank for data transmission as well as authorization of the payment. Prior to all this, “zip-zap” machines were the main form of payment terminal, where one had to manually imprint the card and fill in all the information relevant to the transaction.
- Captchas: You've certainly encountered these before. Take a survey, fill out an account sign-up sheet or visit any restricted-access website and your screen will display a CAPTCHA test. Usually this means typing in a series of letters and numbers which have been distorted in an image. It seems relatively simple, but that's exactly the point: humans can solve CAPTCHAs with no sweat, but bots and machines will find it almost impossible (at least for now) to answer such a question correctly. CAPTCHA stands for Completely Automated Public Turing test to tell Computers and Humans Apart (and I thought the Radarange 1161 was a horrible name). Some people refer to them as Human Interaction Proof (HIP), but the concept is the same: verify to a piece of software that a real human is trying to use it rather than a potentially exploitative bot or malware. Ironically, CAPTCHA-generating algorithms can create hundreds of thousands of tests that they themselves would not be able to solve if they didn't already know the answer. For more information on different types of CAPTCHAs and their common usage, please visit this website (don't worry, it won't ask you to prove you have a pulse).
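- For a feel of the mechanism, here's a bare-bones text CAPTCHA sketch of my own (no image distortion); a real CAPTCHA would render the challenge string as a warped, noisy image so that OCR software struggles to read it.
```python
# Generate a random challenge string and verify the visitor's answer.
import random
import string

def new_captcha(length=6):
    return "".join(random.choices(string.ascii_uppercase + string.digits, k=length))

def check_captcha(expected: str, answer: str) -> bool:
    return answer.strip().upper() == expected       # forgiving about case/spaces

challenge = new_captcha()
print(f"Type the characters you see: {challenge}")  # imagine this drawn all wobbly
print(check_captcha(challenge, challenge.lower()))  # a human copying it -> True
```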
- EFTPOS | QR codes | chip and pin | magstripe | contactless payments
- EFTPOS: Electronic Funds Transfer at Point of Sale is another electronic payment system that relies on terminals to receive and verify electronic funds in the form of credit or debit cards. Originating in the United States in 1981, these systems are usually country-specific and cannot interconnect (hence why some cards don't work in foreign shops).
- QR codes: A newer form of contactless payment that is increasing in popularity, QR (or quick-response) codes are made up of a grid of black and white squares of varying configuration. The pattern of squares encodes information, meaning these codes can be used to access websites, easily add people on social media (Snapchat in particular) and verify transactions to a mobile wallet. QR codes can be used in three ways when it comes to payment: QR scanners (whereby a QR code is displayed and scanned to deduct money from an online wallet), manual QR code payment (where the retailer displays a QR code for you to manually conduct the transaction) and individual QR code scanning (whereby you pay another person who has the same app by scanning their QR code, as used by services like Uber or Zapper). A small generation example follows below.
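- As a quick hands-on example, here's a hedged sketch using the third-party qrcode Python package (pip install qrcode[pil]); the payment URL is made up, but a vendor could print the resulting image and let customers scan it with their wallet app.
```python
# Encode a (hypothetical) payment link into a QR image and save it to disk.
import qrcode

payload = "https://pay.example.com/merchant/123?amount=4.50"  # made-up URL
img = qrcode.make(payload)        # turns the text into the black/white grid
img.save("pay_here.png")          # print this and stick it by the till
```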
- Chip and PIN: A fairly new technology, chip and PIN relied on government backing as well as corporate banking initiatives to get off the ground. First unveiled in the UK back in 2003, the chip-and-PIN system, otherwise known as an EMV (Europay, Mastercard and Visa) card, has a microchip embedded within the payment card. It's often considered more secure than a magstripe (or magnetic stripe, the next topic of explanation) because of the secure technology in the card and the need for an externally entered PIN. You can tell the two kinds of card apart by the presence of a square microchip (often about a centimeter across) embedded within the card. Every time this chip makes contact with a payment terminal (usually through insertion), a non-reusable transaction code unique to that purchase is generated, and the user must type in their PIN to verify that the transaction has their consent (a simplified sketch of this follows below).
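- Here's a heavily simplified sketch of my own (not the real EMV protocol) showing why a per-transaction code plus a PIN resists cloning: the secret never leaves the chip, and a copied code is useless for the next purchase.
```python
# One-time transaction code derived from a chip-held secret, plus a PIN check.
import hashlib
import hmac

CHIP_SECRET = b"secret-embedded-at-the-factory"   # known only to chip and bank
CARD_PIN = "4921"                                  # hypothetical PIN

def chip_cryptogram(amount_cents: int, counter: int) -> str:
    """What the card's chip would send for one specific purchase."""
    msg = f"{amount_cents}:{counter}".encode()
    return hmac.new(CHIP_SECRET, msg, hashlib.sha256).hexdigest()[:16]

def bank_authorise(amount_cents, counter, code, entered_pin) -> bool:
    expected = chip_cryptogram(amount_cents, counter)     # bank recomputes it
    return entered_pin == CARD_PIN and hmac.compare_digest(code, expected)

code = chip_cryptogram(1250, counter=41)                  # card -> terminal -> bank
print(bank_authorise(1250, 41, code, "4921"))             # True: genuine purchase
print(bank_authorise(1250, 42, code, "4921"))             # False: replayed code fails
```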
- Magstripe: Invented by Ron Klein and first implemented in 1960s credit cards, magnetic stripes (abbreviated as magstripes) were the first mechanism used in credit cards when they were given out to the public. Magnetic stripes are visible on the back of credit cards, where they are swiped through payment terminals in order to perform a payment. They function off the principle of data retention on “tracks”: basically three strips of iron particles suspended in plastic film which contain information about the credit card (account number, name, expiration date, service code and card verification code). When swiped, the tracks provide data to the payment terminal (or other reader, since these stripes are also widely used in security situations such as hotel rooms, building access or identification cards). However, this technology is vulnerable to fraud, as thieves can use devices to skim and copy the data that the stripes contain, creating duplicate copies of the card for transactions not authorized by the rightful owner. Magstripes are slowly being replaced by EMV chip-and-PIN cards, though much of the developing world (as well as institutions other than banks in the developed world) still relies on this technology.
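- To show what actually sits on those tracks, here's a loose parsing sketch of my own following the general Track 1 layout (start sentinel, card number, name, expiry stored as YYMM, service code); the sample data is entirely fabricated.
```python
# Parse a Track-1-style magstripe string into its named fields.
def parse_track1(track: str) -> dict:
    body = track.strip("%?")                     # drop start/end sentinels
    pan, name, rest = body.lstrip("B").split("^")
    return {
        "card_number": pan,
        "holder": name.strip(),
        "expires": f"{rest[2:4]}/{rest[0:2]}",   # stored as YYMM on the stripe
        "service_code": rest[4:7],
    }

sample = "%B4000001234567899^DOE/JANE^29071011234567890?"   # fabricated data
print(parse_track1(sample))
```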
- Contactless payments: Please see the explanation for this mechanism at the beginning of this section. WSC likes to repeat itself (or do I simply go over resources prematurely?).
- Some argue that lack of access to the technologies of the modern financial system is one obstacle to people finding their way out of poor communities. Discuss with your team: should the government provide every vendor with a credit card reader? Should all cash be made “smart” so that it knows who owns it, reducing the value of theft in hopes of reducing street crime in struggling communities?
- There are indeed some drawbacks to digitizing the way we make payments, not just with contactless credit cards but also with the rise of cryptocurrencies (which, despite their volatile exchange rates, are maintaining an upwards trend in popularity). Perhaps one day the poor will be unable to find their way out of their income problems simply because they don't possess the credentials to obtain the credit cards the world depends so heavily upon. Perhaps governments will one day find a need to pass laws regarding the mass distribution and standardization of “smart” cash and payment terminals to ensure their economy can catch up with the rest of the world. It would certainly help keep street crime from remaining at its current level (or indeed increasing) if cash were made to recognise its owner and devalue itself immediately (or transfer to a secondary backup account!) when a theft is detected.
- The television show Revolution posited a world in which electricity stopped working. Discuss with your team: how much would this dramatic development change your life?
- I think I speak for not just all scholars, but also many people when I say that this development would have a massive impact on our lives. Our society is so dependent on the electricity that flows around our houses and into our outlets that we would essentially have to halt all technological development, long-distance communication and information publishing if electricity were to suddenly stop. I’m sure we’d find other ways of generating power (it’d be pretty cool to see a sort of steampunk society in which everything was powered by an alternative form of energy). In terms of the actual impact on my personal life, well this website wouldn’t be able to exist for one thing and the WSC itself would never have achieved the international community it has now (and expanding!) if no one could view the website or hear about it on social media.
- Is easy access to Google making us worse at remembering things? Is Google Maps making us less able to get around on our own? Discuss with your team: when does the application of technology become dependence on technology—and is dependence necessarily bad?
- Easy access to the internet can result in humanity being slightly worse at remembering certain things. Indeed, “cognitive offloading”, as it's referred to in research terms (basically the idea that we hand the responsibility for our memory over to the technological powers of the internet), can impact our ability to recall many things (from the year of a historical event, to the routes we take to work, to even the names of our colleagues). The application of technology becomes dependence when technology itself becomes the dominant factor in our ability to do something. Without Google Maps, we could always rely on street signs and verbal communication to ask for directions, yet without the laptop, we wouldn't be able to communicate with others thousands of kilometers away.
- Dependence isn’t necessarily a bad thing. We humans have been depending on our ability to draw power from mother nature for a while, and only now is it becoming more apparent how bad it is. The same can be said for our enabling technologies, a person with disabilities might have some difficulty without access to a cochlear implant or wheelchair, but they can still manage to get around and live life their own way.
- Consider enabling technologies first introduced in works of fiction, such as medical tricorders, Babel fish, robot servants, and the hoverboard. Is any explanation given for how they work? Are we developing anything like them in the real world, and, if so, what is the science behind them? Should fans of their imagined versions brace themselves for disappointment?
- Ah, the world of movies and books. You as a scholar probably remember reading or watching a piece of fiction and seeing some sort of magical technology so casually implanted in that universe (for me it was the laser bolts, lightsabers and light-speed engines in Star Wars). Even in works of commercialisation (or at least those of past decades), we can find evidence of some sort of technological wizardry of the future (too bad those automated homes and flying cars never came true, at least not yet!). Oftentimes, if the device is central to the plot or lore of a story, it'll be explained in some detail (I still possess a book explaining how a lightsaber works). Other times, we're simply left with no explanation as to how the device works. Yet some of the examples mentioned in the prompt actually might become a reality very soon:
- Medical tricorder: In the 1960s Star Trek TV series (ugh, how I detest having to write about the rivals of my fandom), Dr. Leonard McCoy uses a medical tricorder to almost instantaneously diagnose any patient's condition and the treatments for it. While the concept of a “catch-all” electronic device that will diagnose patients immediately is still catching on, there already exist some experimental alternatives. Scientists and app developers are joining forces to work on hospital- or even government-sponsored applications that can take health measurements (such as heart rate, blood pressure and temperature) in a non-invasive way. Other features of these applications include electronic assistants where one can input their symptoms, which the intelligence will then reference against a database of medical literature to find the most likely diagnosis. If implemented en masse, this technology could save thousands of lives and allow the human race to live ever longer, seeing as doctors often make mistakes with their diagnoses and therefore their treatments.
- Babel fish: A species in the Hitchhiker's Guide to the Galaxy series, the Babel fish is a lifeform that, when held to one's ear, translates any and all incoming speech regardless of whatever language it was originally spoken in. The book actually credits some sort of complex biological system that decodes the brainwave matrices and nerve signals from the speech centres. Basically, this is the equivalent of a universal translator in our world. While some services like Google Translate or WordReference do exist, these are only capable of decoding speech after it has been written or spoken, not during conversation. Now there are several high-end devices and prototypes currently in the works that function like earpieces: using a microphone in the device, any spoken words are translated immediately using a database of the language (again, AI at work) before a small speaker relays the translated message within milliseconds of the original words being spoken.
- Robot servants: Practically every sci-fi film or book includes mention of a robot servant in its story. Now, granted, this may be the technology we have made the most progress in, assuming you're willing to stretch the definition of “robot” in some way. By extension, Amazon's Alexa and Google's Home services are stationary devices that control aspects of the house (through speech commands and Bluetooth connections). However, these devices don't possess their own consciousness and require our input every time we need them to do something. Perhaps the rise of AI will allow us to integrate these technologies together in order to create a true robot servant.
- Hoverboards: For some odd reason, humans have disfigured this term quite badly. If you search up any science fiction show, it's likely that there will be a hoverboard included among a character's possessions. However, the actual development of a real hoverboard has stalled, mostly because we're concerned about something else: fake hoverboards. Somehow, self-stabilising scooters (those two-wheeled platforms that everyone seems to think are cool) have taken over as the predominant type of “hoverboard” (ironic, considering they don't actually hover off the ground). Researchers are attempting to find a way to make a true hoverboard, probably through the use of air currents or water jets, but the difficulty of mass-producing these complex devices (along with the risks of accidents, the need for government approval and hoverboard-related infrastructure) means they won't arrive anytime soon.
- Today, someone writing a screenplay can use screenwriting software to ensure they follow the right template and approach. Such technologies have seemingly transformed the creative process; it was not long ago that a writer working on the second draft of a novel would need to retype it from scratch, instead of opening a DOCX file and moving words around. It was not long before that that there were no typewriters with which to type that novel in the first place. Discuss with your team: have technologies allowed us to be more creative? How do you foresee their impact in the future—will every artist use a drawing pad, or will computers take over the creation of art altogether?
- Technologies haven’t necessarily allowed us to be more creative, they’ve simply allowed us to express that creativity in more ways than we were originally used to. We’ve still had science fiction novels, fantasy films and basic videogames for some time, but the rise in screenwriting software has allowed us to create these stories in a much shorter amount of time. In the future, technologies will allow us to simply say our story ideas while some AI compiles and formats it into a book for our review and publishing. There is the obvious fear that computers will take over the creation of art, but then again, humans seem to have some sort of devaluing factor whenever they hear that a piece of art was painted by a robot. Humans will still possess their creative potential (it may be the only thing we’re good at in the future).
- Consider internationalization (referred to as i18n) and localization in software development. What kinds of factors do they need to take into account? What other steps can developers take to ensure their tools are accessible to a wider range of people? Discuss with your team: should all websites and online services be required to exist in multiple languages and with modifications to account for different cultural norms in different societies?
- As you may have guessed, internationalization and localization in software development refer to how open-minded the design of a program has been with regard to its user base. An internationalised piece of software has been built so it can be adapted to various languages, cultural nuances and regional factors without changing the goal of the program (e.g. phones can often be set up for whichever region they're going to be used in). A localised piece of software is the complement of this: it has been adapted for a specific group of people, language, region or culture by adding locale-specific components (e.g. YouTube videos which have pre-uploaded subtitles in another language). The main factors developers need to take into account are the purpose of the software and the likelihood that a diverse range of people will use it. There's no point in adding foreign language support to an app that only people of a certain group are supposed to use; likewise, there's no point in limiting the languages offered by an app to a single one if more than one group is going to use it (a tiny string-catalogue example follows below). The government can't really mandate that all websites and services exist in multiple languages, since it's the consumers' responsibility to help influence the developers in deciding whether or not that would be appropriate. I doubt Weibo would find much footing in the US, or KakaoTalk in Egypt, or even Google in North Korea.
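- Here's a minimal sketch of my own of the string-catalogue idea that underpins i18n/l10n: the code asks for a message key, and the active locale (with an English fallback) decides the wording.
```python
# Look up user-facing strings by key and locale, falling back to English.
CATALOGUE = {
    "en": {"greeting": "Welcome back!", "cart": "Your basket has {n} items."},
    "es": {"greeting": "¡Bienvenido de nuevo!", "cart": "Tu cesta tiene {n} artículos."},
}

def translate(key: str, locale: str, **params) -> str:
    table = CATALOGUE.get(locale, CATALOGUE["en"])      # unknown locale -> English
    return table.get(key, CATALOGUE["en"][key]).format(**params)

print(translate("greeting", "es"))    # Spanish greeting
print(translate("cart", "fr", n=3))   # no French catalogue yet -> English fallback
```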
- Spend some time learning about the technologies that enable learning—from the archaic abacus and slide rule to newer devices that some schools are phasing out while others struggle to obtain them at all, such as overhead projectors and whiteboards. How do automated response systems (“clickers”) work, and how can they be used in non-competitive classroom settings? Of course, the earliest “automated answers” were in paper form, through scannable sheets popularized by companies such as Scantron beginning in the 1970s. Discuss with your team: in what ways are tests that are easier to score good for students and teachers? Do such systems have any downsides for teachers or students—and in what ways are they vulnerable to exploitation?
- Ever since we've been able to set up educational institutions and offer disciplines other than theology or sports, educators have constantly found ways to integrate technology into the classroom. The abacus (still in use today, and rather mesmerizing to watch) and the slide rule are just two examples of the myriad of seemingly archaic tools students of the past once used. Now we have classrooms filled with technology, from the personal research devices students bring to school, to overhead projectors and sound systems allowing teachers to display their teaching material in full view of everyone (rather than relying on the fairly old-school white/black board). Now let's talk about that dreaded test which we scholars face at every round we go to: the Scantron test:
- The scholar's challenge is actually taken in the form of a Scantron test: a multiple-choice bubble sheet that corresponds to question numbers in a separate booklet. When you sit a Scantron test, there's a high chance it'll be a machine that mercilessly and calculatingly reads your answers and decides your mark (much to the fear of the loophole-seeking personas among us). Made popular in the 1970s yet having existed since the 1930s, Optical Mark Recognition (OMR) technology works off a basic principle: the scanner detects the amount of light passing through the sheet, with neatly darkened answer ovals blocking that light and therefore indicating an answer to the reader (in the form of a signal transmission). A toy scoring sketch follows below.
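- And here's a toy scoring sketch of my own: pretend the scanner reports one “darkness” reading per bubble (0 = blank, 1 = fully shaded), and the darkest bubble above a threshold counts as the answer.
```python
# Score a pretend answer sheet from per-bubble darkness readings.
ANSWER_KEY = ["B", "D", "A"]
CHOICES = ["A", "B", "C", "D"]

def read_answer(darkness, threshold=0.5):
    best = max(range(len(darkness)), key=lambda i: darkness[i])
    return CHOICES[best] if darkness[best] >= threshold else None   # blank row

sheet = [
    [0.05, 0.92, 0.10, 0.08],   # question 1: bubble B shaded
    [0.04, 0.06, 0.07, 0.88],   # question 2: bubble D shaded
    [0.30, 0.28, 0.25, 0.27],   # question 3: smudged, nothing clearly marked
]
score = sum(read_answer(row) == key for row, key in zip(sheet, ANSWER_KEY))
print(f"Score: {score}/{len(ANSWER_KEY)}")
```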
- The Scantron test has revolutionized the way in which we conduct tests. No longer does a teacher need to worry about their class of 20 students taking an exam later or earlier than every other class; now an entire group of 200 students can take the exam at the same time. The lack of any human error in grading also contributes to the accuracy (and at times brutality) of the test scores generated, meaning any scuffles over partial-credit answers or poorly constructed diagrams are avoided. Of course there are some negatives: a suspiciously perfect Scantron score will raise questions amongst teaching faculty, and any human error in feeding the sheets into the Scantron machine may result in serious panic over a wrongly reported test score.
- Scantron tests are vulnerable to exploitation in many ways. Since it's a machine doing the scoring, students have tried tricks like smudging a #2 pencil over the entire bubble area, drawing lines between the answers, and even putting chapstick over the wrong ones (please view this website for more, as talking about cheating is extremely dishonorable on this site).
- Can the same technology that enables one group disable another?
- Of course it can; just look at the internet. The ability to access the internet enables our society to gain insight into a treasure trove of information, connect with those on other landmasses and even voice our opinions where they might not otherwise be heard. Yet on the flip side, the millions of people who do not have access to this enabling technology are barred from public discussions and unable to know more about the world around them, which can often lead to their marginalisation. In short, the power that enables one group often comes at the price of disabling another. It is our duty as humans to ensure that, no matter what pitfalls technology brings to those who cannot gain access to it, we support them in their efforts to connect with the rest of the world.