If something can connect to a network, it can be hacked. Computers and phones are still popular targets, but increasingly so are cars, home security systems, TVs and even oil refineries.

That was the message at this year’s Black Hat and DefCon computer security conferences, which took place last week in Las Vegas. The annual conferences draw a mix of computer researchers and hackers who present the latest bugs and vulnerabilities they’ve discovered. It’s a combination of public service, business and sport.

These are some of the more popular targets covered at this year’s conferences. By drawing attention to them, the “white-hat” hackers hope to encourage greater security from the various manufacturers and industries, and more vigilance from consumers. Typically, the presenters inform manufacturers of bugs ahead of their talks so the companies can fix the issues before criminals exploit them.

Remote-controlled cars

Someone hacking your computer can be an inconvenience. Someone hacking your car can be deadly.

A pair of presentations on hacking cars kicked off the DefCon conference on Friday. Australian hacker Zoz outlined the security issues fully autonomous cars will face and said car hacking is inevitable. Autonomous vehicles like cars and drones are essentially robots, and they rely on sensors to operate. He said a hacker could theoretically take complete control of a car over wireless networks, or trick its various sensors into feeding a motorist false information about location, speed and the proximity of other cars or objects.

Fully driverless cars are still a few years away, but computerized systems are common in vehicles on the road today. Electronic control units manage a range of car functions, including braking, accelerating and steering, as well as security features, in-car displays and even seat belts.

Researchers Charlie Miller and Chris Valasek, funded by a grant from the U.S.
military’s DARPA, looked into what kind of damage hackers could do to a car by taking control of a Toyota Prius and a Ford Escape. To access the systems, they had to physically connect a computer to the cars through a diagnostics port. They wrote custom software that let them hijack the cars’ systems.

Once in control, they disabled the brakes, changed the display to show incorrect speed or gas levels, and tampered with the steering and seat belts. They were able to kill the engine and toy with less consequential features like the car’s horn and lights. Toyota played down the wired demonstration and said it is focusing on security measures to prevent wireless attacks.

Compromising smartphones

Attacks on personal computers used to be the bread and butter of cybercriminals, spawning a lucrative black market for malware and the anti-virus programs that fight it. The next big target is smartphones. Mobile devices are not impervious to attacks, even though walled-off app stores have kept much of the malware at bay.

Kevin McNamee demonstrated how a piece of malware could turn an Android smartphone into a “spy phone” that remotely monitors its owner, sending information on the phone’s location, communications and content, such as photos, back to a third party. The hack isn’t new, but McNamee managed to inject the malicious code into popular apps like “Angry Birds.” Once such an app was installed, the user would have no idea the phone was acting as a remote surveillance device.

Verizon “femtocells” – small boxes used to extend cell service – were hacked by security researchers at iSEC Partners to intercept calls and any other data sent over the cellular network, such as texts, images and browsing history. The wireless carrier issued a fix for all its femtocells, but the researchers say other carriers’ networks could still have the same issue.
With $45 in hardware, researchers Billy Lau, Yeongjin Jang and Chengyu Song turned an innocent-looking iPhone charger into a tool for gathering information directly from the smartphone, including passcodes, e-mails and other communications, and location data. Apple thanked the researchers and said it is deploying a fix for the bug in its iOS 7 software update, which comes out this year.

The too-smart home

Thanks to cheap, low-power sensors, anything in your house can become a “smart” device, helpfully connecting to the Internet so you can control it from a computer or smartphone. Smart home security devices have the potential to cause the most damage if hacked, and two separate demonstrations showed how to break in by opening “smart” front-door locks.

Another unsettling trend at the conferences was spying on unwitting people through their own cameras. Home security cameras could be disabled by someone who wanted to break in, or they could be turned into remote surveillance devices. One researcher showed how she easily took over the camera stream on a child’s toy from a computer.

Researchers Aaron Grattafiori and Josh Yavor found bugs in the 2012 model of the Samsung Smart TV that allowed them to turn on and watch video from the set’s camera. Samsung said it had released a software update to fix the issue. (Many security experts suggest placing a piece of tape over any cameras you don’t want surreptitiously watching you, just to be safe.)

Hackers get personal

Even in the wake of this year’s NSA revelations, a homemade surveillance device that sniffs out pieces of data from your various computing devices, even when they’re not online, is disturbing. Brendan O’Connor, who runs a security firm and is finishing a law degree, has created such a device, dubbed CreepyDOL (DOL stands for Distributed Object Locator; “Creepy” is self-explanatory).
The device cost $57 to make and consists of a Raspberry Pi computer, a USB hub, two WiFi connections, an SD card and USB power inside a nondescript black case. Computers and phones act as tracking devices and leak information constantly, according to O’Connor. When plugged in, CreepyDOL detects nearby phones and computers and uses them to track people’s locations and patterns, figuring out who they are, where they go and what they do online.

To demonstrate the device without breaking any laws, O’Connor showed his own information as sniffed out by one of the devices. Using a gaming engine and Open Street Maps, he hovered over his dot on a map. It brought up his name, e-mail address, a photo, the dating website he used, details about his devices and the locations he visited in town.

In a worst-case scenario, as imagined by O’Connor, a miscreant could plug in one of the devices at any Starbucks near a capitol building to pick up the scent of a state senator and wait for them to do something compromising. “You find somebody with power and exploit them,” said O’Connor.

The creation is remarkable for how simple it is. It’s likely others have similar knowledge and setups that exploit the same security flaws in applications, websites, devices and networks.

Industrial facilities

The most frightening targets highlighted at the conferences were the opposite of personal. Critical infrastructure, such as oil and gas pipelines or water treatment plants, is a potential target for hackers. Many industrial facilities are run by supervisory control and data acquisition, or SCADA, systems. These systems are older, installed at a time when people weren’t concerned about cyberattacks, and they connect to the Internet over unsecured network protocols. The systems are online in the first place so that they’re easier to monitor; some facilities, like oil pipelines, are in remote locations. Multiple demonstrations at the conferences showed just how simple it is to hack energy systems.
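To see what “unsecured” means here, consider Modbus/TCP, one of the most widely deployed industrial protocols (the presenters did not name a specific protocol, so this is an illustrative assumption). A complete, valid command that writes a new value into a controller’s register is just a dozen bytes, with no credentials anywhere in the frame. A minimal Python sketch:

```python
import struct

def modbus_write_single_register(transaction_id, unit_id, register, value):
    """Build a Modbus/TCP 'write single register' request (function 0x06).

    The frame carries no authentication of any kind: any client that can
    reach the device on its Modbus port can issue writes.
    """
    function_code = 0x06
    # PDU: function code, register address, new value (all big-endian)
    pdu = struct.pack(">BHH", function_code, register, value)
    # MBAP header: transaction id, protocol id (always 0),
    # remaining byte count (unit id + PDU), unit id
    header = struct.pack(">HHHB", transaction_id, 0, len(pdu) + 1, unit_id)
    return header + pdu

# Hypothetical command: set register 1 on unit 0x11 to 500
frame = modbus_write_single_register(1, 0x11, 0x0001, 500)
# 12 bytes total: 7-byte MBAP header + 5-byte PDU
```

The device on the receiving end has no way to distinguish an operator’s command from an attacker’s; whoever can deliver the packet controls the equipment.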
Researchers Brian Meixell and Eric Forner staged a mock hack of an oil well using pumps and containers filled with teal liquid. They got into the system, turned the pumps on and off, and overflowed the containers by feeding the system false data. If it happened on an actual oil well, the hack could result in an environmental catastrophe, according to the researchers.

It’s possible to shut down an entire industrial facility from 40 miles away using a radio transmitter, according to researchers Carlos Penagos and Lucas Apa. They demonstrated injecting fake measurements, causing the device that received them to behave differently. For example, someone could trigger a water tank to overflow by faking an abnormally high temperature.

The industries and the U.S. government are aware that industrial systems are vulnerable, but the systems’ remoteness and age make upgrading them difficult and expensive. There is no built-in mechanism for releasing software patches, as there is with personal computers.
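The water-tank scenario comes down to a control loop that trusts its inputs. As a toy illustration (hypothetical thresholds and values; the real systems’ logic was not disclosed), a controller that opens a cooling-water valve whenever the reported temperature is high will keep filling the tank for as long as an attacker radios in fake overheating readings, because nothing in the link identifies the sender:

```python
def fill_valve_open(reported_temp_c):
    # Open the cooling-water valve whenever the process reports overheating.
    # The reading is trusted as-is: nothing authenticates the transmitter.
    return reported_temp_c > 80.0

def run_cycles(readings, level_pct=50.0, fill_rate_pct=10.0):
    # Advance the toy control loop once per reported reading; the valve
    # adds fill_rate_pct of tank capacity each cycle it stays open.
    for temp in readings:
        if fill_valve_open(temp):
            level_pct += fill_rate_pct
    return level_pct

# Genuine readings around 60 C leave the tank at 50% ...
print(run_cycles([60.0] * 8))   # 50.0
# ... spoofed 95 C readings push it past 100%: an overflow.
print(run_cycles([95.0] * 8))   # 130.0
```

The flaw isn’t in the control logic itself; it is that a measurement arriving over an unauthenticated radio link is indistinguishable from a real one.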