Billboards, Porn, And Why Everything Is So Easy To Hack
If you haven’t seen Goatse, and don’t know what it is, preserve your virgin eyeballs. Suffice to say, Goatse is one of the oldest and foulest images from one of the oldest and foulest corners of the internet.
This is why, when an electronic billboard in the middle of downtown Atlanta was hacked to display Goatse, it was a Very Big Deal. The police, the FBI, and the Department of Homeland Security are all looking into the incident.
This is where I get to the point of this article, which is – how does stuff like this happen? The answer is going to be the opposite of what you think. Most media articles will report the billboard story as a “hacking” incident, which puts the onus of the story on hackers.
These people, with malicious intent, used lines of code, energy drinks, and probably some EDM, to ruin the lunches of Atlanta commuters. This is a false narrative. When the Titanic sank, was it the iceberg’s fault?
Here’s a question: When you push the button to close the doors of an elevator, did you just “hack” the elevator? No you didn’t – you pushed a button. Hacking that billboard in Atlanta required approximately the same amount of effort.
The exact details are unclear, but it looks like the billboard's operator, YESCO, failed to change the default password on its internet-facing administration portal.
(Protip: If you ever use or purchase an internet-connected device, change the password immediately. Otherwise, breaking into that device will be as easy as Googling the owner’s manual.) Prior to the incident, security researcher Dan Tentler even warned YESCO what might happen. Their seven-word reply was, “Not Interested but thank-you for the follow-up”.
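That protip amounts to a one-line audit check. Here is a minimal sketch; the credential list below is a hypothetical stand-in for a real default-credentials database (the actual defaults are the ones printed in the owner's manual):

```python
# Illustrative only: a tiny stand-in for a published list of
# vendor default credentials. Real devices ship with defaults
# like these, documented in the manual anyone can Google.
COMMON_DEFAULTS = {
    "admin",
    "password",
    "1234",
    "default",
    "",  # some devices ship with no password at all
}

def is_default_password(password: str) -> bool:
    """Return True if the password still matches a known default."""
    return password.strip().lower() in COMMON_DEFAULTS

# Audit the credentials actually configured on your own gear.
for pw in ("admin", "c0rrect-horse-battery-staple"):
    status = "CHANGE ME" if is_default_password(pw) else "ok"
    print(f"{pw!r}: {status}")
```

The point is how low the bar is: "hacking" a device left in this state requires nothing more than a lookup against a public list.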
The primary reason information security is broken today is not that hackers are getting better at hacking. It is that most companies, in most industries, don't care about protecting their customers. Banks are required by law to take information security very seriously. Hospitals and healthcare companies are required to take similar measures under HIPAA.
Once you get outside those industries, however, the requirements fall away very quickly. If you handle a large volume of credit card information, for example, there's a standard known as PCI DSS, which is supposed to help companies protect their customers' payment information. PCI DSS is enforced by industry contract rather than by law, however, and the standard might not work so well.
Once you get outside well-delineated industries such as financial services and healthcare, mandated information security protections more or less evaporate. Operate an airline? Feel free not to worry about infosec. Manufacture medical devices? Spend your R&D dollars elsewhere. Make a smart, internet-connected lock? Who would ever think of hacking that?
These are all incidents, taken from the last two months, where security researchers discovered very large, very exploitable flaws in infrastructure that we might consider critical – which were subsequently ignored. In the cases of the lock and the airplane, the respective companies threatened legal action against the researchers involved.
Chris Roberts, the man at the center of the airplane controversy, occupies a moral gray area; his actions probably deserve an article's worth of discussion on their own. But step back for a moment: for doing nothing more than pointing out security flaws in devices whose failure could mean massive theft and property loss at the very least, and mass death at the very worst, two people have received legal threats, and one will almost certainly go to prison.
Do you still think hackers are the most serious problem here?
It makes more sense to think of hackers as the canary in the coal mine. I’m not saying that there aren’t supervillains out there – serious breaches do occur, and result in the loss of money and property.
When those breaches happen, however, you’re not likely to hear about it. Even when you have a breach on the scale of last year’s Target hack, companies will wait weeks or months before notifying consumers.
Again, companies don’t focus on protecting information. They want to deny, mitigate, and cover up. Fixing a vulnerability means admitting to it, which means embarrassment, which means no.
The highly-visible actions of pranksters and dedicated security researchers are sometimes the only way that these companies get held to account. We should applaud their efforts – unless we want planes to start falling out of the sky.