User:Colin / Wikimedia Commons / CC BY-SA 4.0 / Via commons.wikimedia.org
Like outgunned sheriffs in the Wild West, organizations from tech giants to government agencies have turned in recent years to bounty hunters to keep themselves safe. These mercenaries are hackers and security researchers whom companies pay to find and disclose flaws in their software and devices. The increasingly accepted practice is called the “bug bounty” system, and it gives hackers a legitimate way to reap rewards for making tech safer without going rogue. Still, the process can be daunting — how can companies strike the right balance between throwing their products open to hacking and keeping tight control over their security practices?
Traditional bug bounty programs, like those run by Microsoft, Twitter, and dozens of other organizations, are open to the public, meaning that anyone can warn about security flaws they think they've found. But these public bug bounties often incentivize too much, generating scores of redundant reports that may or may not point to genuinely harmful vulnerabilities. These can overwhelm companies that aren't set up to handle them. And when companies aren't ready to handle an influx of bug reports, they can overlook serious security vulnerabilities or respond to them too slowly.
To deal with the problem, some organizations have decided to make their bug bounty programs private, meaning only certain hackers and researchers can submit bugs. This helps organizations build up to a public program over time by controlling the quality and frequency of submissions. Apple signaled the perks of private bounties when it announced its first-ever bug bounty program, a private one, last week at the hacker conference Def Con. LinkedIn, Tor, and a host of other entities are also keeping their programs closed, at least for now. According to Bugcrowd, a company that runs bug bounties for clients, 63% of all its programs have started private, a proportion that is growing. HackerOne, a competitor, recommends private programs to all its customers.
“Inviting the world to submit can be an overwhelming and scary process,” said Jonathan Cran, VP of operations at BugCrowd. “It makes sense for companies to start with trusted folks.”
And trust is a big issue. One of the reasons bug bounty programs took so long to catch on after Netscape ran the first one in 1995 was the perception that these programs attracted the attention of malicious hackers. So, not surprisingly, most organizations start their private bug bounties with a group of security researchers they already have a relationship with. That's how Apple's program, which starts in September, will work. According to Cran, the starting pool in a private program is generally between 50 and 100 researchers, though he has seen programs launch with as few as two. In addition to ensuring a manageable stream of germane reports, starting small helps companies get an overall picture of where potential exploits are. It's a way for corporations doing bounties for the first time to dip their toes in the water before going public.
“You're going to find out you're more secure in some areas than you thought and less secure in others,” said Alex Rice, CTO and cofounder of HackerOne. “There may be things you're completely unaware of, like vulnerabilities in unmaintained code.” Determining where these problems are and how prevalent they are, according to Rice, helps companies set competitive prices and standardize how quickly they deal with bug reports (which is sometimes a source of tension in big bug bounty programs).
The Complications of Closing Off Your Bounties
Some have argued, though, that such programs signal to hackers that they have a limited amount of time to find and sell exploits on the lucrative private market. In other words, that they encourage malicious hackers to find all the exploits they can before the program is opened up to the bug-hunting public.
“Every one you’re fixing, you’re erasing the value of one in the black market,” said Rice. Less than a week after Apple's announcement, a private security firm offered $500,000 — twice the size of the biggest bounty in Apple's program — for iOS zero-day exploits. (Though enormous sums for iOS zero-days are nothing new.)
Another complication for private programs is that they have the potential to alienate researchers. One of the main benefits of bug bounty programs is incentivizing people with the skill to hack corporations and governments to use those talents for good. Though many companies, including Apple, would likely accept a valid vulnerability report from a hacker outside their private bug bounty, such a hacker may not think to submit it to a private program in the first place.
However, most private bounty programs plan to eventually go public, as Apple says it will do. Rice said HackerOne programs have stayed private for anywhere from three days to three years, though they typically last around three months. Cran says Bugcrowd recommends six months for most clients. Indeed, in an ideal world, the announcement of a high-profile private program such as Apple's signals to hackers that a company is, according to Cran, “eventually going to pay for things,” and is a cue to “rip open an iOS device and test,” even if the program is at first closed.
“Everyone assumes private programs have hard restrictions,” said Katie Moussouris, a security consultant who created Microsoft's first bug bounty and more recently advised the Department of Defense on its Hack the Pentagon program. “But it's more of a perception problem than an access problem. One of the biggest issues is simply confusion over how to get invited to an initially private program.”
And getting in early matters. Sean “meals” Melia is the top-ranked hacker on HackerOne's all-time leaderboard by its proprietary “reputation” metric. He makes more money on bounties than he does at his day job at a security firm — “And I make good money at my regular job,” he told BuzzFeed News. But even Melia, the very picture of a trusted hacker, wasn't invited to a recent, major private bounty until nearly a year after it launched.
By the time Melia gained access, “people had already gone through and picked off a lot of the low-hanging fruit,” he said. “I was pretty bummed. It's disheartening to see that people with low reputation, or who are new to the platform, were invited before me.” It's easy to imagine a disheartened hacker, left out of a bounty program like Apple's, turning to the private market.
Still, as private bug bounty advocates are quick to point out, companies have always had private bug testing that left out the vast majority of hackers. Even if many of them are private, the growing number of bug bounties is a sign that even the most cautious organizations, from Apple to the American government, have realized that they — that everyone — needs the participation of the greater cybersecurity community to make their systems and products as secure as possible.
“Traditionally people didn’t talk about the fact they had a private program at all,” said Moussouris, who consulted with Apple prior to the announcement of its program. “This is a shift in thinking. It's also saying to the world, we're open to this concept, but we are also going to learn as we go through this process.”
Source: “Why Silicon Valley Is Turning To An Exclusive Group Of Hackers To Fix Its Code,” BuzzFeed