7 Technology Myths Busted

We uncover some of the biggest falsehoods in technology


The idea that absorbing violent content through media encourages our own violent thoughts and actions isn't new. In fact, it's been around since violent scenes first appeared on the silver screen in the 1970s, with parents and conservative groups fearful of the negative impact viewing such things could have. The swift transformation of video games in the decades that followed, from family-friendly titles such as Super Mario to the R-rated Grand Theft Auto series, did nothing to allay their concerns. Suddenly young adults, rather than just watching a person harm another in gruesome ways on screen, could take control of an avatar and commit such virtual crimes themselves. In Grand Theft Auto, a famous example of such a game, players could even shoot or simply run down innocent bystanders. While these games were designed purely for entertainment, gamers' appetites for on-screen violence kept growing, so scientists decided to step in and investigate their potential impact. Several scientific findings have been published on the topic, and at first glance the news seems bad for gamers: in laboratory settings, numerous studies reached the same conclusion, that exposure to violence could provoke such behaviour in the viewer. However, a more recent comprehensive survey, released in 2014, used crime statistics to debunk this view. The researchers compared rates of youth violence against consumption of violent video games and discovered the two were inversely related: youths were becoming less inclined to commit criminal violence even as violent video games grew in popularity.


Lots of us long for a Mac of our own, with their sleek design, sophisticated hardware and intuitive software catapulting them to the top of many wish lists. Add to that the common notion that they're immune to viruses, and they almost sound like the perfect machine. Only, as more users are discovering, Macs are susceptible to viruses, spyware and other types of malware just like PCs. However, this myth hasn't arisen from nowhere. Macs do encounter much less malicious software (often abbreviated to 'malware') than Windows PCs, which has inflated their reputation. A primary reason for this is simply that there are far more people using Windows PCs, making them the obvious target for opportunistic hackers. Today, with a growing number of Mac users around, hackers have more incentive to design viruses for Macs. That said, by their very design Macs are well equipped to deal with possible threats, with built-in security measures capable of restricting unknown applications from installing on the system. But no computer is completely secure.


Shuffle playlists are great when we're in an indecisive mood. Not sure what music to listen to? No problem. Just click 'shuffle' and the device will randomly choose songs from a playlist or library for you. Or will it? At least in the case of the music streaming service Spotify, the answer is no; it's not quite as random as you might expect. Instead, Spotify have designed an algorithm to make your shuffled playlist seem more random than a truly random one would be. As bizarre as that sounds, it makes sense when we consider that humans are very good at seeing patterns, even where there aren't any. The algorithm attempts to circumvent a cognitive bias known as the 'gambler's fallacy': our tendency to think that if a coin has landed on heads five times in a row, it's likely to land on tails on the next toss. In reality, every toss is independent, with equal odds of heads or tails. When we hear the same artist appear twice in quick succession on shuffle, we instinctively wonder how the playlist can be random if that artist has cropped up twice so soon. So Spotify have introduced an algorithm that spaces out an artist's songs in order to cater to what we perceive to be random.
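The idea of spacing out an artist's songs can be sketched in a few lines of code. This is a minimal illustration of the general technique, not Spotify's actual implementation; the function name, the jitter amount and the data format are all invented for the example. Each artist's tracks are spread roughly evenly across the playlist, with a little random jitter so the spacing isn't suspiciously regular.

```python
import random
from collections import defaultdict

def spread_shuffle(songs):
    """Shuffle a list of (artist, title) tuples so that each artist's
    songs are spaced roughly evenly, making the result *feel* more
    random to a human listener than a uniform shuffle would."""
    by_artist = defaultdict(list)
    for song in songs:
        by_artist[song[0]].append(song)

    positions = []
    for artist, tracks in by_artist.items():
        random.shuffle(tracks)           # vary the order within one artist
        n = len(tracks)
        offset = random.random() / n     # random starting point per artist
        for i, track in enumerate(tracks):
            # Place the i-th track near fraction i/n of the playlist,
            # with a little jitter so the gaps aren't perfectly regular.
            pos = offset + i / n + random.uniform(-0.2, 0.2) / n
            positions.append((pos, track))

    positions.sort(key=lambda p: p[0])
    return [track for _, track in positions]
```

With a playlist containing three songs by one artist and three by others, `spread_shuffle` will rarely place two of the same artist's tracks back to back, whereas a plain `random.shuffle` does so surprisingly often, which is exactly the pattern listeners complain about.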


Like many tech-related myths, the idea that megapixels are the sole determinant of image quality is the result of misleading marketing campaigns. And unfortunately for consumers, all the big phone and camera manufacturers have hopped on board with this advertising strategy. But more doesn't necessarily mean better, and in some cases more megapixels can even make your photographs worse! Unlike their predecessors, which captured images on light-sensitive film, digital cameras build images from pixels, each of which records a small fraction of the light falling on the camera's sensor. More pixels means more units to capture incoming light, increasing the camera's resolution and providing images with more detail. This can be helpful when making large prints or zooming in on images, but otherwise you'll notice little difference between a seven- and a ten-megapixel camera, for example. It's also important to note that many more factors are at play than just megapixels, with the camera's lens, sensor, flash and software all being important elements. Plus, with more megapixels comes the requirement for more light to capture the image accurately, so a higher-megapixel camera can produce lower-quality images than one with fewer megapixels when the other components are not up to scratch.
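The gap between a seven- and a ten-megapixel camera is smaller than the numbers suggest because perceived detail tracks the image's width and height in pixels, which grow only with the square root of the pixel count. A quick calculation makes the point; the function below is an illustrative sketch assuming a 4:3 sensor, not a formula from any camera maker.

```python
import math

def linear_resolution(megapixels, aspect=(4, 3)):
    """Approximate image width and height in pixels for a given
    megapixel count and aspect ratio (width:height)."""
    w_ratio, h_ratio = aspect
    pixels = megapixels * 1_000_000
    # width * height = pixels, and width / height = w_ratio / h_ratio
    height = math.sqrt(pixels * h_ratio / w_ratio)
    width = pixels / height
    return round(width), round(height)

print(linear_resolution(7))   # (3055, 2291)
print(linear_resolution(10))  # (3651, 2739)
```

Going from seven to ten megapixels is a 43 per cent jump in pixel count, yet the image is only about 20 per cent wider and taller, which is why the difference is hard to spot unless you print large or crop heavily.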


The myth that you should fully drain your device's battery before recharging it, which supposedly helps to extend its lifespan, is a notorious example of a piece of advice that endures even after it has become outdated. And, if we're able to admit it, most of us have probably shared this 'helpful' tip with others, unaware that our advice will actually harm their product's battery life rather than help it. Most modern devices, including our precious Apple iPhones and MacBooks, use lithium-ion batteries. Compared with traditional battery technologies, these are claimed to charge faster, last longer and, most importantly for addressing this myth, charge best in short, 'topping-up' bursts. Apple measures battery lifespan in cycles, with one cycle equal to discharging 100 per cent of the battery's capacity, but that doesn't mean you should completely drain your battery before plugging in your device. Instead, it's best to split a charge cycle across multiple charges. In fact, most tech advisors suggest never letting your phone battery get too low, nor too high. Not that a full charge will be overly damaging, but consistently leaving your device plugged in until it has stored every last drop of energy can reduce its lifespan in the long term. Instead, take advantage of your device's inbuilt charging design, which will likely 'quick-charge' to 80 per cent and 'trickle-charge' from 80 to 100 per cent. This design ensures that you can get power back quickly but stops your device from overcharging. So discard this common myth and stop waiting for your battery to empty before filling it up. Instead, keep it in the green, and charge from around 40 per cent to 80 per cent for the most efficient battery life.
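The cycle accounting described above is simple enough to show directly: partial discharges add up, and only their total counts towards a cycle. This is a minimal sketch of that bookkeeping with made-up numbers, not Apple's actual battery-management code.

```python
def cycles_used(discharges):
    """One charge cycle = discharging 100 per cent of capacity in
    total, however many partial discharges that takes.
    `discharges` is a list of percentages drained between charges."""
    return sum(discharges) / 100

# Draining 60% one day and 40% the next adds up to one full cycle,
# exactly as a single 100% drain would, so nothing is gained by
# running the battery flat before plugging in.
print(cycles_used([60, 40]))      # 1.0
print(cycles_used([30, 30, 30]))  # 0.9
```

In other words, topping up frequently doesn't "waste" cycles; it just spreads the same cycle across several charges while avoiding the deep discharges that lithium-ion cells dislike.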


You may have seen a piece of movie sabotage involving the use of a magnet to erase the contents of a hard drive, or you may simply have been told to keep your devices well clear of them, but this danger is largely mythical. Solid-state drives and other forms of flash memory are unaffected by magnetism, so your laptop, smartphone and USB stick are probably perfectly safe. For hard disk drives, however, the danger is partially real. These devices store binary data as the magnetic polarity of tiny regions on their platters, so a strong enough magnet could flip that polarity and ruin the data. Myth confirmed? Not quite, as the magnet would have to be as strong as an MRI machine to have any impact. So unless your devices are going to be exposed to a super-magnet, they'll be safe.


Keyboards that begin from the top left with the characters Q-W-E-R-T-Y have become ubiquitous on modern computers. And as many of us find this layout easy to use, it seems natural to assume the letters are arranged this way simply because it's the most efficient arrangement. However, the QWERTY layout is actually a relic of the typewriter era. Originally, typewriter keys were arranged in alphabetical order, but because commonly used letters sat next to each other, the machine would jam if those letters were struck in close succession, as the type bars that pressed against the paper would collide. QWERTY was the answer to this issue: commonly paired keys were placed further apart from one another. Newer arrangements such as 'Dvorak' and 'Colemak' are arguably more efficient, as commonly used characters are placed where they can be reached most easily. But given that you would have to retrain your brain and fingers, most of us will probably continue to stick with QWERTY.


This article was originally published in How It Works issue 108, written by James Horton 

For more science and technology articles, pick up the latest copy of How It Works from all good retailers or from our website now. If you have a tablet or smartphone, you can also download the digital version onto your iOS or Android device. To make sure you never miss an issue of How It Works magazine, subscribe today!