Fall 2013

Unmasking hacking

Software detects activity that may disguise an attack

Jonathan Cohen
Victor Skormin and his team use a bank of multiple screens to get a closer look at patterns of computer-system activity. Each dot represents one system call. System calls are basic operations that allow computer programs to talk to each other or interact with the digital environment.

When Binghamton University Professor Victor Skormin describes his work, he references the 1996 blockbuster Independence Day, in which a computer virus is used to save Earth from alien attack.

But Skormin is quick to note that the attacks he and his research team want to fight are of the real-world variety. Nuclear power plant operations, military systems, flight controls and cell phones are among the potential targets, and the software they developed is the defense.

Cybernetworks across the globe are increasingly coming under attack. Not only are hacking attempts occurring more frequently, but the attacks are also becoming more sophisticated, with some having the potential to cause significant destruction and casualties. With hackers growing more adept at avoiding detection, organizations will need to do more to protect their networks. That’s where Skormin — a distinguished service professor of electrical and computer engineering — and his team of four doctoral students and a research scientist come in.

With $2.3 million in funding from the Air Force, Skormin and his team have been upgrading and testing software they developed that detects abnormal computer-system activity. These anomalies could mean hackers have accessed security vulnerabilities they can use to penetrate or take over a system. Or they could mean that something innocuous has occurred, such as the installation of a legitimate program or a routine system upgrade. It’s up to the network administrators or security personnel to determine what has happened — and fast.

“The war is raging in cyberspace,” Skormin says. “Initially, a targeted attack doesn’t do anything malicious so it cannot be detected by off-the-shelf antivirus programs.”

How it works

Hackers write attack programs that get into computer systems via download from the Internet — disguised as harmless software — or upload from a USB flash drive, Skormin says.

A real, targeted attack deployed by professionals funded by a foreign government would first explore the environment, basically making itself at home in the computer system and conducting some initial surveillance, he says. Once inside, the attackers can control what the computer does.

Perhaps a computer is programmed to maintain a system temperature of 500 degrees, Skormin explains. The malware could affect the computer’s temperature monitoring system so it erroneously reports the temperature as being much lower than it really is. This would cause the computer to turn up the heat and lead, literally, to a system meltdown.
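The meltdown scenario Skormin describes amounts to spoofing a sensor inside a feedback loop. Here is a minimal Python sketch with a hypothetical thermostat and invented numbers; it illustrates the idea only, not any real control software:

```python
SETPOINT = 500.0  # target temperature in degrees, from the article's example

def controller_step(reported_temp, heater_power):
    """Naive thermostat: raise heater power when the reported temperature is low."""
    if reported_temp < SETPOINT:
        return heater_power + 1  # turn up the heat
    return max(heater_power - 1, 0)

def spoofed_reading(true_temp, offset=200.0):
    """Malware in the monitoring path under-reports the temperature."""
    return true_temp - offset

# With the spoofed sensor, the controller keeps heating even when the
# true temperature is already at the setpoint:
true_temp, power = 500.0, 10
reported = spoofed_reading(true_temp)     # reports 300.0 instead of 500.0
power = controller_step(reported, power)  # power rises, pushing temp higher
```

Because the controller only sees the falsified reading, it behaves correctly by its own logic while driving the physical system past its limits.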

Skormin provides another example: Hackers potentially could use malware to take over a flight-control system and divert the aircraft to another location or cause it to crash.

“If you attack a nuclear power plant,” Skormin says, “you can cause explosions and meltdowns. Civilians will be dying. You can easily create a second Chernobyl. Such an attack is originated in cyberspace, but it attacks physical space.”

When computer programs run, there are expected behaviors. The key is to track what’s normal and look for misbehavior or typical malicious behavior. “It is important for us to understand what the computer is doing,” Skormin says.
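One classic way to model “normal” program behavior, which may or may not resemble the team’s approach, is to record the short system-call sequences (n-grams) seen during clean runs and flag traces containing sequences never seen before. A sketch with invented call names:

```python
# Build a baseline of system-call n-grams from normal runs, then score
# new traces by how many of their n-grams fall outside that baseline.
# All call names here are invented for illustration.

def ngrams(trace, n=3):
    """Return the set of length-n call sequences in a trace."""
    return {tuple(trace[i:i + n]) for i in range(len(trace) - n + 1)}

# Hypothetical traces recorded during normal operation
normal_runs = [
    ["open", "read", "read", "close"],
    ["open", "read", "write", "close"],
]
baseline = set()
for run in normal_runs:
    baseline |= ngrams(run)

def anomaly_score(trace, n=3):
    """Fraction of the trace's n-grams never seen during training."""
    grams = ngrams(trace, n)
    if not grams:
        return 0.0
    return len(grams - baseline) / len(grams)
```

A trace matching a training run scores 0.0, while a trace full of unfamiliar call sequences scores near 1.0, giving administrators a simple threshold to act on.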

As a computer operates, it issues a gigantic flow of system calls, data that describe exactly what the computer is doing at all times, he says. Skormin compares system calls to letters of the alphabet. His software extracts and monitors the system calls, or “letters,” aggregates them into application programming interface (API) functions, or “words,” and then aggregates those words into functionalities, or “sentences.”

“Only when you have the sentence can you understand whether it is benign or malicious,” he says. “If we find indications that we’re dealing with a malicious sentence, we would terminate the process.”
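The letters-words-sentences hierarchy can be illustrated with a toy matcher. Every call name, API grouping and verdict below is invented for illustration; the team’s actual aggregation rules are not described in the article:

```python
# Toy illustration of aggregating system calls ("letters") into API
# functions ("words") and then into functionalities ("sentences").

API_PATTERNS = {
    # "word": the sequence of "letters" (system calls) that forms it
    ("open", "read", "close"): "ReadFile",
    ("open", "write", "close"): "WriteFile",
    ("socket", "connect", "send"): "NetSend",
}

FUNCTIONALITY_PATTERNS = {
    # "sentence": a sequence of "words" with an assigned verdict
    ("ReadFile", "NetSend"): ("exfiltrate-file", "malicious"),
    ("ReadFile", "WriteFile"): ("copy-file", "benign"),
}

def to_words(calls):
    """Greedily match runs of system calls against known API patterns."""
    words, i = [], 0
    while i < len(calls):
        for pattern, word in API_PATTERNS.items():
            if tuple(calls[i:i + len(pattern)]) == pattern:
                words.append(word)
                i += len(pattern)
                break
        else:
            i += 1  # unmatched call: skip it
    return words

def classify(calls):
    """Map a call trace to a functionality and verdict, if one matches."""
    words = tuple(to_words(calls))
    return FUNCTIONALITY_PATTERNS.get(words, ("unknown", "benign"))
```

Reading a file and then sending data over the network, each benign in isolation, only reveals itself as a suspicious “sentence” once the whole sequence is assembled, which is the point of Skormin’s analogy.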

At that point, he says, the FBI or other law enforcement and national security agencies could be contacted, depending on what systems are being attacked and how.

“We’ve been lucky so far that one of these sophisticated attacks has not occurred already,” says Leonard Popyack Jr., MS ’89, PhD ’98, a former student of Skormin’s and now an associate professor of cybersecurity at Utica College. He’s also owner of Anjolen, a company in Utica, N.Y., that conducts computer security research and development for defense contractors, government entities and other clients.

“Years ago, we lost power in the entire Northeast and the southern part of Canada,” Popyack says. “That could have easily been attributed to a cyberattack. It didn’t turn out to be that, but this is the world we’re in.”

Mistakes inevitably occur in large bodies of code, he says, and flaws creep into system designs.

“There have been many, many attempts to increase security on code and hardware and software, but they never seem to go far enough,” Popyack says. “There’s always a balance between security and usability, and in general, the public wants usability. That usability requires less security.”

Bringing the program to market

Skormin and his team test their software extensively in a computer lab in the University’s new Engineering and Science Building.

It’s important to have this controlled testing environment because “a lot of stuff can’t be tested without infecting a real set of network computers,” Skormin says. Instead, the team uses a well-secured, experimental network test bed they built on campus. The team attacks and hacks into things such as a flight simulator they purchased for testing purposes.

For the students involved, this work is their passion. Zachary Birnbaum ’12, one of Skormin’s students, spent the summer in the lab testing and retesting the software and demonstrating what it can do. He’s entering his second year of doctoral studies in electrical engineering at Binghamton and looks forward to a long career in cybersecurity.

“I’ve always liked being on the forefront of science,” Birnbaum says.

Meanwhile, Skormin is pounding the pavement to bring the software to market. In May, he made a presentation at Brookhaven National Laboratory on Long Island, N.Y., to solicit support. He was later invited to begin testing the software there in a simulation environment. In July, he had meetings with a defense contractor and the U.S. Navy.

“There is interest in this work,” he says.

But as Popyack notes, “People don’t want to invest in security because they don’t feel the need to,” until a system is broken or a catastrophe occurs — and then it can be too late.

Skormin says his biggest challenge is securing more funding for research and development, including additional pay to recruit and retain top-notch doctoral students like Birnbaum. Another barrier is gaining acceptance from potential customers, who can be slow to act and adopt new technology.

The military, for instance, faces its own budget cuts and constraints, Skormin notes. He estimates his software’s implementation has a multimillion-dollar price tag. And the division of the Air Force that funds his project is different from the division that could purchase the software for implementation.

“It is not that easy to penetrate the market,” he says. “It’s the inertia of the system.”