Slashdot: News for Nerds



PunkSPIDER Project Puts Vulnerabilities On (Searchable) Display

timothy posted about a year and a half ago | from the anarchist's-cookbook dept.

Security 85

First time accepted submitter punk2176 writes "Recently I started a free and open source project known as the PunkSPIDER project and presented it at ShmooCon 2013. If you haven't heard of it, it's, at heart, a project with the goal of pushing for improved global website security. In order to do this we built a Hadoop distributed computing cluster along with a website vulnerability scanner that can use the cluster. Once we finished that, we open sourced the code to our scanner and unleashed it on the Internet. The results of our scans are provided to the public for free in an easy-to-use search engine. The results so far aren't pretty." The Register has an informative article, too.



How do you interpret the results? (0)

Press2ToContinue (2424598) | about a year and a half ago | (#42993309)

Guess I'm a 'tard 'cause I can't find the explanation for what the vulnerabilities mean.

But I know what dusty rose on gray means. (It means "Welcome to the 70's.")

Re:How do you interpret the results? (0)

Anonymous Coward | about a year and a half ago | (#42993429)

Guess I'm a 'tard 'cause I can't find the explanation for what the vulnerabilities mean.

I would have to agree.

Re:How do you interpret the results? (0)

Anonymous Coward | about a year and a half ago | (#42993485)

WOW, you can use google!

Now we should throw out all teachers ...

I think I got punked (-1)

Anonymous Coward | about a year and a half ago | (#42993427)

Most of the sites I try to check give no results, and all the sites I can find show zero vulnerabilities. Does this really have any value, or did you just try to punk Slashdot?

Re:I think I got punked (0)

Anonymous Coward | about a year and a half ago | (#42993447)

Try 'Google'

uncheck some boxes (1)

Bananatree3 (872975) | about a year and a half ago | (#42993531)

try "The" or "and" or other basic title searches - and also uncheck some of the vulnerability boxes, and you'll see examples.

Next - SE for houses without security systems (1)

raymorris (2726007) | about a year and a half ago | (#42993467)

Will his next project be a search engine showing which houses don't have good security systems, or showing the weaknesses in each home's security? What an awful way to attention-whore: giving criminals a list of defenseless people.

Re:Next - SE for houses without security systems (2)

Anonymous Coward | about a year and a half ago | (#42993477)

That's not a perfect metaphor and you know it. If your house is insecure, it puts you in danger; if your website is insecure, it puts your users in danger. If your house has no security system, you're personally aware of it. If your website is insecure, you likely aren't.

I'm not making an argument as to whether or not this is a good idea, but you're oversimplifying things on purpose.

Re:Next - SE for houses without security systems (1)

NF6X (725054) | about a year and a half ago | (#42993523)

I don't see how this is different than publishing a searchable database of unlocked doors that I found in my neighborhood, with the claim that my purpose is to improve my neighborhood's security. I do not see this as oversimplification. A group (gaggle? herd?) of tweakers could use the database to find an unlocked house whose owners are on vacation and then squat there, using it as a base to burgle other houses in my neighborhood, just as malicious hackers could host malware on a vulnerable site. It's still a dick move to publish the list of unlocked doors for all to see.

Re:Next - SE for houses without security systems (3, Insightful)

Jah-Wren Ryel (80510) | about a year and a half ago | (#42993803)

Well, at least one difference is that when a website gets hacked, it is almost always the people visiting the website who are the target: the hacker's goal is either to grab information about those users from the hacked system or to use the hacked system to distribute exploits to anyone who browses there.

When a house is broken into, by contrast, it is basically a problem for the owners of the house and not really anyone else.

So publishing a list of vulnerabilities on websites serves the purpose of shaming the website operators into better protecting their users.

Re:Next - SE for houses without security systems (1)

BemoanAndMoan (1008829) | about a year ago | (#42994529)

So publishing a list of vulnerabilities on websites serves the purpose of shaming the website operators into better protecting their users.

So by that logic, I assume you rape every woman you pass on a dark street, mug the elderly who don't go out in groups, and commit every other crime of opportunity to shame people into what *you* consider proper, minimum safe behavior. How brave and noble of you.

I'm so tired of people dressing up shitty behavior under the guise of protecting others when really all they're doing is being selfish, self-satisfying little asshats.

If this guy wasn't such a douche, he'd be emailing the websites a notice letting them know of the vulnerabilities, not making the list available for everybody. This would have been a good example of how decent behavior could have helped protect both visitors and the site owners, instead of what at best will become a life lesson taught through severe litigation and (if we are lucky) state prosecution.

Re:Next - SE for houses without security systems (1)

Kiwikwi (2734467) | about a year ago | (#42994607)

So by that logic, I assume you rape every woman you pass on a dark street, mug the elderly who don't go out in groups, and commit every other crime of opportunity to shame people into what *you* consider proper, minimum safe behavior. How brave and noble of you.

Phew. For a moment I was worried this thread would descend into hyperbole and strawman arguments.

Re:Next - SE for houses without security systems (0)

Jah-Wren Ryel (80510) | about a year ago | (#42996465)

So by that logic, I assume you rape every woman you pass on a dark street, mug the elderly who don't go out in groups, and commit every other crime of opportunity to shame people into what *you* consider proper, minimum safe behavior. How brave and noble of you.

Yes, in fact I make sure to rape and mug every chance I get!!

The fact that you have to make such an absurd argument ought to be a clue that you have misunderstood the original point.

Re:Next - SE for houses without security systems (0)

Anonymous Coward | about a year ago | (#43004475)

Or, it's like posting a map that shows where all of the registered gun owners live in N.Y. It's an invitation, and sites like this are a conduit for crime -- a dashboard/intelligence service for criminals.

Re:Next - SE for houses without security systems (2)

tibman (623933) | about a year and a half ago | (#42993847)

I view it as a list of dark alleys you shouldn't walk down.

Re:Next - SE for houses without security systems (0)

Anonymous Coward | about a year ago | (#42999579)

I'm not really sure what the point of your response is. You're claiming you don't see any reason for it to be different, yet you don't refute any of the reasons I listed in my post.

Re:Next - SE for houses without security systems (1)

ArsonSmith (13997) | about a year and a half ago | (#42993735)

Yea, let's release a list of known non-gun owners.

Ethics (3, Insightful)

Adam Gignac (2834761) | about a year and a half ago | (#42993503)

Funny; my professor just told a networking class recently when discussing vulnerability scanners that it was seriously unethical to scan a system without permission - it would be like walking through a parking lot and checking which cars are unlocked. I think most people would agree with him. This project might have good intentions, trying to encourage the sysadmins to tighten up their security, but I think there's a better way to do it than public shaming.

Re:Ethics (1)

DCisforBoners (1880920) | about a year and a half ago | (#42993551)

Some of these sites are likely entrusted with sensitive user information. The car analogy is only apt if you borrowed $100 from a couple of your closest friends for rent and left that in the car you forgot to lock while you were getting a taco. As I see it, the benefit of this type of public shaming is it reinforces in end users the idea that you should be careful who you trust with your data. For admins, if the majority of listed sites use web technology "x", maybe if you're designing a new site you look for an alternative.

Maybe. 99% are not (1)

raymorris (2726007) | about a year and a half ago | (#42993683)

Some of these sites are likely entrusted

And 99.97% are some guy trying to make ends meet by offering online chemistry lessons or showing you how to hook up your home theatre. IF there were any sites found that held personal information, the right thing to do would be to contact those sites, not encourage people to hack the personal information.

Certainly it does no good whatsoever to give script kiddies a list of sites to deface. The most popular host is GoDaddy, with their $10/month hosting account. (35% of sites are GoDaddy sites.) Sites with hosting budgets of around $10-$50 a month make up 95% of all sites. So that's mostly who is affected: some elementary school art teacher selling used computer parts online in his spare time.

Re:Maybe. 99% are not (0)

Anonymous Coward | about a year and a half ago | (#42993749)

But website owners can be jerks and have been in the past, often either ignoring people personally informing them of vulnerabilities or even threatening them with lawsuits or whatever.

Re:Maybe. 99% are not (1)

DCisforBoners (1880920) | about a year and a half ago | (#42993789)

That's a fine point, but if you were that consumer shopping to implement a prefab site, wouldn't you like to know if the technical foundations are sound? If 1000 GoDaddy sites are hacked in a day maybe that prompts a response from the host.

90% of everything is crud. (1)

Tenebrousedge (1226584) | about a year and a half ago | (#42994121)

The most popular hosting is Godaddy, with their $10 / month hosting account. (35% of sites are Godaddy sites.) The sites with hosting budgets of around $10-$50 month make up 95% of all sites.

As a web developer I may say categorically, fuck them. If you put a site on the web, it is your responsibility to make sure that it is secure. If you are not able to do that to a professional standard, you should not do it. In point of fact, there is a need for a licensing organization to prevent amateurs from practicing web development. The problem isn't that this website is exposing poor security practices, it's that it's not promoting professionalism.

We have passed the point where it's okay for the layman to host a site. Even if you're not collecting information about your users, you can still be attacked or used as an attack vector. The era of democratization of the web is over.

Re:90% of everything is crud. (1)

Anonymous Coward | about a year ago | (#42994601)

Could you please post a list of your client's websites? :)

Re:90% of everything is crud. (0)

Anonymous Coward | about a year ago | (#42996537)

Lol, Web Developer speaking from authority. You take life as seriously as a locksmith. AOL and Apple have tried walled gardens before. People have spoken and they are willing to accept risk in exchange for the diversity of choices produced. Good luck with your campaign to censor TCP/IP. Silk Road and the Pirate Bay say you'll make TONS of headway on that battle.

Re:90% of everything is crud. (-1)

Anonymous Coward | about a year ago | (#42997223)

You have no idea what is even being discussed. We're talking about HTTP/HTML, and having quality standards. Mostly trying to fix the impression that J. Bob Smallbusinessman has that it's okay to cheap out on a web dev.

Go play with your bits and let the grownups talk.

Re:90% of everything is crud. (1)

Zmobie (2478450) | about a year ago | (#42996803)

Actually I do somewhat agree with the spirit of this post here. Software Engineering is a discipline that can affect large amounts of people and where not many people actually understand it. This is very similar to any other type of engineering (civil, nuclear, electrical, etc.) and to practice those other disciplines generally you have to have a P.E. or at least a P.E. signs off on the work done. Under current models, this is not in any way required for software and while most of your real software engineers don't really need to have this, for every one of them there are 10 or more idiots that picked up a "Complete Idiot's Guide to PHP" and started throwing websites up. The entire point of the license model is to ensure quality of work because engineers are working on things that affect the public tremendously.

Now, playing devil's advocate to my own point, you can also argue that the no license required is how the web and software grew like it did. There have been plenty of great projects and ideas put out there that would not have been if a P.E. were required to sign off on the work. Indie development would all but die under this kind of model or at best the cost would increase significantly because of needing a P.E. to review everything after the fact and point out the real problems.

Kind of a double edged sword honestly, but there is a valid point in Tenebrousedge's post.

Re:90% of everything is crud. (1)

fast turtle (1118037) | about a year ago | (#43000389)

and the goddamn license isn't worth the paper it's printed on. What makes the difference is that an Engineer is Legally Liable for any screwups. In other words, they've got a bit more at risk than Joe Sixpack, who's throwing shit at the web to see what in hell is going to stick. Until the liability issue changes for web developers, nothing is going to change.

Re:90% of everything is crud. (1)

Zmobie (2478450) | about a year ago | (#43010767)

I'm a little confused here, your post kind of contradicts itself. You say the license isn't worth the paper it is printed on, but then say

an Engineer is Legally Liable for any screwups

The license is what allows someone to legally be an engineer in most disciplines. We went over this when I took my engineering ethics course back in college, and there have been numerous (some very frivolous, in fact) lawsuits to keep people from using the term or actually practicing any form of licensed engineering. The only current exception is that software engineers can legally use the term without a valid license, because one doesn't exist.

The entire point of what I said in the first half of my post is that a P.E. would in fact make those software engineers legally liable for their work (within reason; even in other engineering disciplines things happen, you just have to show that reasonable steps and practices were taken to try to prevent them), therefore doing precisely what you said and shifting the liability onto these web developers.

Re:90% of everything is crud. (0)

Anonymous Coward | about a year ago | (#43009411)

lol. You're a funny guy.

There are very few disciplines that require the strict enforcement you are talking about, web dev is not one of them all that often.

When it is, it's not people who react like you to these issues who end up in the hot seat. It's the open thinkers with a head on their shoulders and enough knowledge to know shit goes wrong sometimes, and that it's out of most people's hands. Granted, they do their fucking damnedest to prevent anything they do from being attacked, but they don't shoot the person whose stuff they have to fix, and fix now, for making the mistake.

Good try, though. I'm sure the MPAA and their allies would love to have a soul like you onboard. It's the same mentality.

Re:Ethics (1)

kermidge (2221646) | about a year and a half ago | (#42993609)

"....I think there's a better way to do it than public shaming."

Ok, such as.... what?

If someone puts up a web site, I have to figure that it's meant for people to visit. If that site has vulnerabilities, I have to give the owner the benefit of the doubt that they'd likely want to know about them, as I also have to figure that they want the site to be safe from attack: to prevent defacement, hijacking for attack-app insertion, theft of private info, etc.
Therefore I'd hazard a guess that, since everyone's means to do their own testing are limited by their own knowledge and skill, time available, or budget to hire it done, whatever agency can easily and quickly point out a few of the more common vulns (the same ones used by many of the crims to make money) would be welcomed, so the owners might fix their sites.
PunkSPIDER seems to fit that bill.

So, unethical. How so? Is it somehow more ethical to not test and leave the site open to attack, or is it both more moral and more practical to get information that'll help protect the site?

Re:Ethics (2)

GrumpySteen (1250194) | about a year and a half ago | (#42993701)

They're scanning large numbers of websites and putting vulnerability information up on a public website for anyone to view without notifying the website owners, much less giving them a chance to fix the problems before sending hackers their way. There's nothing ethical about that.

Re:Ethics (0)

Anonymous Coward | about a year and a half ago | (#42993773)

Why do I feel that most of the people claiming this is unethical are all the same person?

In the past hackers used to notify the owners and give them a chance to fix problems and the owners would either do nothing or even threaten to sue. Any slashdot reader should know this.

And chances are the hackers already have a system that does this, and have had it for a long time. In fact, this is probably far behind what the hackers already have. Website owners who store user data have a responsibility to keep up with vulnerabilities and fix them ASAP.

Re:Ethics (5, Informative)

raymorris (2726007) | about a year and a half ago | (#42993875)

In the past hackers used to notify the owners and give them a chance to fix problems and the owners would either do nothing or even threaten to sue. Any slashdot reader should know this.

Anyone claiming to know anything about the topic of web security should know the procedure used to remedy ~90% of all vulnerabilities. Those security updates you get each week don't appear out of nowhere. Someone like myself files a security ticket with the vendor or affected party. The vulnerability is confirmed and analyzed, then other vendors who are likely to have similar vulnerabilities are notified. A patch is pushed, THEN a CVE is issued. After that, more mainstream sites like Slashdot pick up on, and link to, the CVE, which explains what the vulnerability was and links to the update that fixes it. That's typically about a week after the vulnerability is reported and 2-3 days after the fix is available. That's how security issues are normally handled; they aren't ignored. (If they were ignored we wouldn't average 100 security notices per week, would we?)

When I found the PowerDNS vulnerability I could have come straight to Slashdot with "how to take down wikipedia and millions of other sites". If I were a scumball attention whore I would have done so. Instead, I reported it through proper channels. Wikipedia was patched within 36 hours, then other sites. The next day, the CVE went out, THEN you heard about it. I still get to brag - I just do it AFTER a) wikipedia and other responsive sites are safe and b) I have something worth bragging about, having protected wikipedia from being exploited.
This jackass is merely attention whoring at other people's expense. He hasn't done anything special - just ran Nessus - but is advertising himself via the results rather than handling them responsibly.

Suppose for a moment that some of the sites could leak sensitive information. Suppose also that sites which leak sensitive information should be slapped. Well, the slashvertised site, the cracker's search engine, is most certainly leaking sensitive information, ergo he should be slapped!

Re:Ethics (1)

kermidge (2221646) | about a year and a half ago | (#42993901)

Thank you; I learned something. Several somethings, in fact.

Re:Ethics (0)

Anonymous Coward | about a year and a half ago | (#42994175)

I'm not defending what this person is doing; I agree he is going about it wrong. But some site owners have also been known in the past to be jerks.

Re:Ethics (2, Informative)

punk2176 (2840475) | about a year ago | (#42994233)

Hmm, a few issues with this...

1) The statement that we "just run Nessus" is incorrect. We wrote our own scanner that works on a Hadoop cluster. Why is this important? It means that we can handle a lot more scans than anyone else (several thousand per day with a small cluster) and it's also specifically made for mass scans. This is important in point 2 below.

2) The process you're describing is for finding a vulnerability in a piece of software in general (e.g. a common CMS), not a specific vulnerability in an implementation of a piece of software (e.g. a specific website). That's a huge difference. You wouldn't put a CVE up for a SQL injection bug in a specific implementation of a site (you would only if it was common to an entire CMS for example). Anyway, what we hope is to build a community of like-minded security folks that can help those website owners fix their *specific issues* first and if applicable go through the process you describe when needed. We also want to provide this for free.

3) What if the vulnerability is in a custom built site that no one cares enough about to do security research on. Who's letting them know their issues? We hope to provide a view of this to the website owner and yes, push them a little to get their security ducks in a row.

4) We're not attention whores or jackasses. Calling people names isn't nice and makes us sad.
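For what it's worth, the "scanner on a Hadoop cluster" design in point 1 boils down to a map step over a list of target URLs. A minimal stand-in sketch in Python (a thread pool in place of Hadoop; `scan_url` and its check are placeholders for illustration, not PunkSPIDER's actual logic):

```python
from concurrent.futures import ThreadPoolExecutor

def scan_url(url):
    # Placeholder "check": a real scanner would send crafted requests
    # (SQL injection and XSS probes, etc.) and inspect the responses.
    # Here we just flag plain-http URLs as a stand-in finding.
    return {"url": url, "findings": [] if url.startswith("https://") else ["no-tls"]}

def mass_scan(urls, workers=4):
    # Map phase: fan the URL list out across workers, the same shape
    # as a Hadoop job distributing input splits to task nodes.
    # Executor.map preserves the input order in its results.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(scan_url, urls))
```

The point of the design is that `scan_url` is embarrassingly parallel, so throughput scales with the number of workers (or cluster nodes) rather than with any single machine.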

Re:Ethics (0)

Anonymous Coward | about a year ago | (#42994403)

I'm sorry you are only on this site to push your warez.

Re:Ethics (0)

Anonymous Coward | about a year ago | (#42994715)

Calling people names isn't nice and makes us sad.

Unauthorized port scans aren't nice and make us pissed. Jackass.

Re:Ethics (4, Insightful)

BemoanAndMoan (1008829) | about a year ago | (#42994883)

We hope to provide a view of this to the website owner and yes, push them a little to get their security ducks in a row.

No, you don't. If you did you'd have built your system to make *them* aware first, instead of posting a "don't blame the messenger" shame tool that exposes their vulnerabilities.

The hacking-promotes-security argument is weak sauce, even more so in your case. The vast majority of the people you've exposed (i.e. not anonymous mega-corps, but small mom-and-pops set up and left unmanaged by unskilled sysadmins, innocuous self-hosting newbies, etc.) will likely never encounter your list, even after it hands script kiddies an easily digestible list of opportunities. The kiddies will wipe their servers and turn them into warez hubs, rinse and repeat, because the owners will *never* know any better.

You are merely a new vector for the disease, selling itself as a cure. Where in this is your moment to feel proud?

Re:Ethics (1)

Thruen (753567) | about a year ago | (#42995005)

I'll help resolve these issues for you.

1) The software used is a very minor part of the point, and as far as the ethics argument goes means literally nothing.

2) The start of the process he's describing - reporting the bug to the people who can deal with it - is the important step, and it doesn't change. Yes, it is different from dealing directly with software developers. It also means they probably aren't capable of fixing it as quickly; the software developers have a huge edge in that area. It does need to be dealt with differently, but public shaming without giving them a chance to fix it is not dealing with the problem, it's exacerbating it.

3) If it's a custom site that no one cares to do security research on, chances are nobody's looking to attack that site anyway, until it's posted on a search engine calling it out as a target. As for who's letting them know, adding them to a search engine is NOT letting them know. Odds are these smaller sites aren't out there looking to see if anyone's found a vulnerability in their site. I didn't see anything that states you warn the site owner and even attempt to give them any time to fix it. If you want someone to get their security ducks in a row, step one is telling them everything you know about their problem, and step two is giving them time to fix it. If putting them in a searchable database is any step, it's much further along.

4) I can't say much about this point as it truly depends on your intentions. You may have good intentions and think this is a great thing to do, but you are most certainly going about it all wrong. You shouldn't be advertising this on Slashdot; you should be emailing webmasters everywhere to tell them about their vulnerabilities. And while I'm here, a feature suggestion: an option to search by software used rather than by specific address, so people can search the software they're using and find out if any specific implementation is vulnerable without relying on their site already being in this database. And before you say it, yes, web developers could and should run these scans on their own - I'm not defending sloppy security at all - but the developers who aren't running the scans also aren't going to go search for their site on yours without being told they have a problem.

Re:Ethics (1)

WoOS (28173) | about a year ago | (#42995335)

Now let's not get too harsh.

How 'good' or 'bad' this database is depends IMHO partially on its query interface. If one can only ask for single (FQDN) URLs (with a query rate limit) and gets the vulnerabilities of that specific URL as an answer (plus maybe some pointers on what the vulnerable software likely is), it might actually be useful for the somewhat technically-inclined web owner. The proposed list of vulnerable software would probably not help them as they would have to remember which SW their site runs.

If on the other hand one can simply search the database for "give me all sites suffering from vulnerability X", that is not helping a web site owner at all but a cracker very much.

The OP's site seems to be slashdotted (seems Hadoop is not applied in the frontend) so I cannot check but from the comments here it seems it implements the second option. That is definitely something that should be changed.

The other thing any web site owner should take from this article is that mass vulnerability spidering has gone mainstream. Much better to announce that on Slashdot (and as many other news sites, magazines, and periodicals as possible) than have people discover it in a year on their own.

Now the next useful topic on Slashdot would be a discussion on: "Should I host my own site, blog, shop, ... or better use a service (doing all the security stuff)?" But I have to check on my WordPress software version ;-)
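The single-URL, rate-limited query interface proposed above is easy to picture. A hypothetical in-memory sketch (class name, limits, and the vulnerability data are all invented for illustration; a real service would persist state and authenticate clients):

```python
import time

class RateLimitedLookup:
    """Per-client, per-FQDN lookup: callers may ask about one site at a
    time, and can never query 'all sites with vulnerability X'."""

    def __init__(self, db, max_queries=5, window=60.0):
        self._db = db          # {fqdn: [vulnerability names]}
        self._max = max_queries
        self._window = window  # seconds per fixed window
        self._hits = {}        # client id -> recent query timestamps

    def query(self, client, fqdn, now=None):
        now = time.monotonic() if now is None else now
        # Keep only timestamps still inside the rate-limit window.
        recent = [t for t in self._hits.get(client, []) if now - t < self._window]
        if len(recent) >= self._max:
            raise RuntimeError("rate limit exceeded")
        recent.append(now)
        self._hits[client] = recent
        return list(self._db.get(fqdn, []))
```

The design choice is that the interface only answers "what is known about this one FQDN", which helps an owner checking their own site while making bulk harvesting by attackers slower and noisier.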

Re:Ethics (1)

Thruen (753567) | about a year ago | (#42995819)

I'm really not trying to be harsh, sorry for coming off that way. I'm probably a little biased because I have experience being a small business IT guy by default (as in no real training, just better with computers than the other people there), so that's really who I relate to the most. In my position I understood sometimes you need to seek help elsewhere, and I did, but I also learned that problems aren't as easy to fix as people think, nor are they always cheap, and money can be a big issue for a small business. I would've appreciated the help if PunkSPIDER sent me an email describing a vulnerability, and I would've tried to fix it quickly, but if I couldn't figure it out myself I'd have to call someone else in, which gets expensive and might take time to budget it. Instead, PunkSPIDER just puts it in their search engine, unbeknownst to me the lowly inexperienced IT guy, so instead of me being able to fix it before it becomes a problem I still don't find out about it until it's a bigger issue. I know, the heart of the problem is having an IT guy who doesn't have the proper training, but if you've ever worked at a small business during hard times you know spending more isn't always an option, sometimes you need to work with what you have, and that's what they did, I was there so they used me.

On to everything else, it is actually closer to the first implementation you describe, although I don't know about the rate limit as I didn't test it when the site was still loading quickly earlier.

To clarify my suggestion of searching by software, I didn't intend for it to list addresses, only vulnerabilities related to that software. As for the website owner not being able to remember what software they used, without getting into how one sets up and runs a web site without even knowing what software they're using, if they're that technically deficient they almost definitely won't be able to fix it themselves (if they even understand the information they're looking at) and should already be looking for someone to handle the technical end of things.

And as for what they should take from the article, it is mainstream and people should know about it. However, to use a more extreme example, the same can be said for copyright infringement, but would advertising The Pirate Bay or isoHunt really be the right way to alert people to that fact?

As for the discussion about whether you should host your own site or blog, it seems pretty straightforward to me. If you have the technical know-how and understand what you're doing as well as the costs involved, go ahead and run it yourself. But if you lack that knowledge, it's a silly question to ask. It's like anything else, just because you can make something work doesn't mean it's a good idea to do it yourself. I've been in that position, trying to fix something that's over my head, and while I could generally make it work, it was never as good as having the professionals fix it. If you want an analogy (because those are popular here) you can really swap out web development with any other skill in the world. Just because you can figure out how to (fix your car, plumb/wire your house, build any sort of structure, sew your own clothes, stitch your own cut, tow a car) doesn't mean you should do it yourself instead of leaving it to the professionals. And yes, I know I pointed out why that's not always easy or in the budget, it may not always be an option for people, but if it is an option it's the right one.

I drifted a bit off the real topic in the end here, my bad.

Re:Ethics (1)

Dextrously (1086289) | about a year ago | (#43018651)

I see a note on the punkspider site to opt out of having your site scanned. Is there a specific way to opt in as well? I would be interested in seeing what results could come out of scanning a few of my sites. I've tried using Skipfish in the past, and a few other scanning utilities, and got a lot of false positives, and also a lot of missed positives: things I knew were vulnerable, where I just wanted to see if the scanner would pick up on them.

Thanks for the great work! I look forward to seeing the results, even if some people don't like it. Perhaps sending a notice to "webmaster@domain.tld" would suffice? Possibly even a month or so in advance. Just something along the lines of:

Hey, found some vulnerabilities on your site, this is what they are......
DOMAIN.TLD is currently in queue to be listed in our database on 00/00/2013: click here to request a time extension (or possibly a removal from the list completely) before listing, or click here to queue up another scan of your site when the vulnerabilities are patched.
If you would like assistance in fixing these vulnerabilities, feel welcome to come to our forums or read [link to OWASP info] more information.

I know I would personally appreciate such an automated approach.

Personally, I don't think there should be a removal-from-the-list option if adequate time is given. That's just my opinion, though. I feel that if you've done your due diligence to notify the maintainers, then after a month or two the public has a right to know, so that those sites can be avoided.

How many times have researchers found things (0)

Anonymous Coward | about a year ago | (#42994773)

Only to be "blown off" & nothing gets fixed? Answer = Plenty of times.

(This forces those sites into action, especially since it proves those sites are @ risk & guilty of negligence putting their viewers @ risk of infestation by malware/malicious script injections etc./et al).


P.S.=> I understand YOUR point on "responsible disclosure" though - I really do: IF you approach a site with an issue & they don't fix it though (and yes, that happens too)?

However/Then, a responsible website would "brush up" on things like binding variables to query strings and using stored procedures for database access - for security's sake, on the "flip side" of things: its own, and that of its viewing public!
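
A minimal sketch of that variable binding, using Python's sqlite3 purely as a stand-in for whatever database a site actually runs (the table and values here are made up for illustration):

```python
import sqlite3

# Throwaway in-memory database standing in for a site's backend.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

user_input = "alice' OR '1'='1"  # a classic injection probe

# Vulnerable: string concatenation lets the input rewrite the query,
# so the WHERE clause matches every row.
vulnerable = f"SELECT secret FROM users WHERE name = '{user_input}'"
leaked = conn.execute(vulnerable).fetchall()  # every secret in the table

# Bound: the driver treats the input strictly as data, never as SQL.
bound = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (user_input,)
).fetchall()  # no row matches the literal string
```

The bound form is the whole point: the probe string can never change the shape of the statement, only fail to match.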

Otherwise - it IS blatant negligence & essentially refusing to "patch" (after all, OS vendors have to do it, or should - why not websites also?)...

(All of the above should be done, as well as inspecting the state of security of the ad banners they float, too, since this report from Cisco blows the lid off that: it is more dangerous to click on an online advertisement than on an adult content site these days, Cisco said -> [] )

... apk

You found that: Was fixed - this wasn't (0)

Anonymous Coward | about a year ago | (#42995009)

Continuing my last post's premise -> [] : I found an issue with hosts files after the 12/09/2008 MS "Patch Tuesday," in VISTA onwards (Windows 7 &/or Server 2008 R2 + beyond), where hosts files could no longer use 0 as the blocking "ip address" - pretty much an analog to a DROP rule in a firewall. The 0 form yields smaller files that load into memory and parse line-by-line faster (the tcpip.sys built-in DNS resolver loads the hosts file & references it FIRST, before anything else, by default -> [] ).

Fact is, I reported this to Microsoft on their "Engineering Windows 7" blog, here -> []

In addition to THAT?

Here on /., I literally also got their VP of the "Windows Client Performance Division" to concede my point: using 0 as a blocking "IP address" is superior (faster/more efficient) vs., which is 6 characters per line larger - &, worse yet, vs. the loopback adapter address, larger by 2-8 characters per line to parse. The larger custom hosts file that results is slower to LOAD & PARSE, ala his words quoted next below:



"Of course, larger files take longer to load." - by Foredecker (161844) * on Wednesday December 09, @10:34PM (#30384666) Homepage

FROM -> []


Quite a bit faster results come with the smaller blocking addresses noted below, due to the smaller filesize for the IP stack's looped programmatic reads of the hosts file - & NOTICEABLY so!

(It's linearly related to the difference between the filesizes being read in, and the size differentials here are large):



0 = ~42 MB = ~53 MB = ~58 MB


(Each measured using a custom hosts file of 1,934,453+ entries, largely composed of KNOWN malicious sites to be blocked - what I use now.)


* See my point? It NEVER got fixed... & ought to be!

(Linux doesn't have this issue & it's one thing I will DEFINITELY credit it with over Windows, in fact - hearing that from ME, the "poster child for Windows fanboys on /.", is a rarity, mind you...)

Lastly: 0 STILL WORKS, oddly enough, on Windows 2000 SP2 onwards, into XP, & right into Windows Server 2003 - whereas, by comparison, it doesn't on Windows VISTA, 7, Server 2008 R2 & beyond!

("Will wonders NEVER cease"?)


P.S.=> You got lucky, & yours was DIRECTLY "security-related" + got fixed...

Needed it, since DNS has issues - worst of all being largely unpatched vs. the Kaminsky redirect-poisoning bug for half a decade++ now, mostly worldwide &, worst of all, at the ISP level:


5 years after major DNS flaw is discovered, few US companies have deployed long-term fix (vs. Kaminsky Bug above...): []


Custom hosts files actually SECURE against it by using hardcoded favorites, reverse-DNS-resolved vs. the in-addr.arpa 'tld' that houses that information, from ICANN & VERISIGN servers that use DNSSEC (proofed vs. the Kaminsky bug, no less)... which allows you to avoid the vulnerable, recursively-set-up remainder of DNS servers worldwide (with faster resolutions, too, out of hosts entries cached locally in RAM).


However - My point was mostly on efficiencies & speed by comparison, for the reasons noted above! It still remains unpatched, years later!

So, my point from my last post stands, made by this example from FIRST-HAND EXPERIENCE, as is yours:

It's a mixed bag of results!

(E.G./I.E.-> You can tell them they have 'issues' all day long, but will a vendor of ANY ware, OS, or even website DO ANYTHING ABOUT IT?)

In your case, yes... in mine, no!

Thus - this "name & shame," especially after you *try* to do a "responsible disclosure" as I did above, does them and their users a favor (by hopefully getting them to FIX what needs fixing, especially for security online - which WOULD be, & IS, the "right thing to do" once they're aware of it).

... apk

Re:You found that: Was fixed - this wasn't (1)

Thruen (753567) | about a year ago | (#42995051)

So are you saying your argument in favor of the name & shame strategy is pointing out times when companies were named and shamed and still didn't fix it? I see a flaw in your argument... Which is sad, because the point you're trying to make is valid: sometimes the name & shame strategy does work.

But that's not really what this search engine does anyway. It's not as if they're posting on their front page that a site has vulnerabilities; you still need to go out of your way to check a specific site to find them, which means the general public isn't likely to hear about the problem. It's hard to call it the name & shame strategy when they're not doing much to make it publicly known.

Beyond that, you point out that the first thing to do is to alert the developers and give them some time to fix it, and while I have looked, I haven't found anything suggesting this site does either of those. You have a very valid point that in many cases just alerting the developer gets nothing done, but it holds little meaning in regard to this search engine, as it doesn't really do any of the things you think should be done.

In favor of it IF you reported it 1st to adversely (0)

Anonymous Coward | about a year ago | (#42995547)

Affected vendor, as I did (& the parent poster to my post too) -> []

"Which is sad because it's a valid point that you're trying to make, sometimes the name & shame strategy does work." - by Thruen (753567) on Sunday February 24, @09:19AM (#42995051)

It does & CAN, especially if you do it FAIRLY (as I stated in my subject-line above - confront the adversely affected vendor, first... only right & fair to do, imo @ least!).

THAT, is truly "responsible disclosure"

So, yes - I agree, 110%: It IS or can be, something that works...

NOW - what I noted on MS' IP stack & custom hosts files?

Hey, on MY part??

That's NOT the only "fix" I've helped make with vendors over time!

(Even giving them code to do it, as with UltraDefrag64 more recently -> (a 64-bit FREE defragger for Windows): I showed them code for how to do Process Priority Control at the GUI usermode/ring 3/rpl 3 level in their program (a good one, too), & was credited for it by their lead dev & his team... see here -> [] or here [] . I could've posted BETTER code too, lol, but it works in concept, porting from Delphi Object-Pascal to C/C++ easily enough.)

Which ended up fixing a "bug" for them later, here -> [] via its implementation (partially, NOT fully yet as I outline it & use in my applications such as this one -> [] )


ALSO, as I did with the FireFox/Mozilla folks years beforehand, who came right in & worked with us, & helped fix it with the site's owner/webmaster, Philip.

Thus: BOTH issues were patched with my suggestions & notifying them...

From smaller vendors too!


BOTH companies, FF & UltraDefrag64, were better about it by FAR - unlike MS, whom I notified YEARS AGO, & right to the head of the division concerned with PERFORMANCE, no less!

(& that IS performance gains I proved & he even conceded... nothing was done!).


P.S.=> So, do I agree with "name & shame" tactics? Yes, but... ONLY if you report it validly to the concerned software maker 'oem', first! Only fair...

... apk

Re:Ethics (0)

Anonymous Coward | about a year ago | (#42995329)

I'm really sorry, but there really are several points of view regarding "ethics." Being a security researcher doesn't mean you have the final word on this.

Some people consider that 0-days should be released in the wild as early as possible and do as much damage as possible, because only then will decision-makers start to realize that security is a real concern, and only then will people start to conceive systems that are, from the start, more resilient to exploits.

As an example: as far as I know, I've only had to patch in an emergency *two* server-side Java bugs allowing remote attacks on Java webapp servers in the last two years. One was the DoS from the infinite loop when parsing certain floating-point numbers, where I applied the unofficial patch that came out immediately, not Oracle's (thanks to people releasing the exploit in the wild ASAP, which also allowed a third-party patch to come out *before* Oracle moved / MS Patch Tuesday / whatever). The other was the DoS where query-parameter handling degenerates into O(n) instead of O(1).

I know Java on the client-side has been pretty miserable lately, but Java on the server-side is quite robust.

So, sure, I buy that the one procedure that remedies 90% of all vulnerabilities consists of patching. Patching is giga-important.

But before that, the most important thing is to pick a technology which DOES NOT require constant patching (hint: companies relying on Patch Tuesday don't exactly qualify, and shall keep having Vupen and all the others discover remote root exploits).

Accepting as inevitable that inferior systems need constant patching is probably the number one security concern.

I don't disagree that you should always patch. But when you have to patch every week, then at some point some introspection is needed, and you have to wonder whether you shouldn't be using a stack that's more robust in the face of security exploits...

Re:Ethics (0)

Anonymous Coward | about a year ago | (#43000593)

Attention whoting much?

Why do we care about some vulnerability you found?

Re:Ethics (0)

Anonymous Coward | about a year ago | (#43009511)

He can whote if he eants to

he can reave your rends behind

Because yore rends don't whote

and if they don't whote

Well their no rends of his.

Re:Ethics (1)

GrumpySteen (1250194) | about a year ago | (#42996679)

> Why do I feel that most of the people claiming this is unethical are all the same person.

Because dismissing all of the people who are telling you that you're wrong as one nutcase with multiple accounts is far easier than acknowledging that you're actually wrong and everyone knows it.

Re:Ethics (0)

Anonymous Coward | about a year and a half ago | (#42993827)

There's nothing ethical about that.

This makes it seem as though you believe there is one universal set of "ethics" that applies to everybody. Back in the real world, what's ethical for each person depends on what that person believes.

If you want to say 'you find this behavior unethical', that's fine. Please though, lose the preposterous notion that your ethics are everybody's.

Re:Ethics (1)

blackest_k (761565) | about a year ago | (#42994429)

On the other hand, you could just check the sites you manage and design with this tool and see if it finds any problems. It is important that your website is standards-compliant, and it is just as important that your site is secure. If you ever get hacked, you will soon find your site blacklisted, facing a pile of work to rebuild it and, more importantly, to restore your reputation.

Even without this tool, you will eventually be hacked if your site has vulnerabilities, so it has to be a good thing for you to know beforehand, so you can secure your site.

It would be even better if, after scanning a site, it sent an email to the site ("webmaster@" the domain is a fairly safe bet). Even if you haven't used the tool, at least you'd know someone else has, and that you need to act - hopefully before your site gets hacked.

This tool might lower the bar for script kiddies but there are plenty of people who can hack your site without using this particular tool. You only need one to be successful.

Re:Ethics (0)

Anonymous Coward | about a year ago | (#43009431)

Ain't got ethics in that!

Re:Ethics (1)

del_diablo (1747634) | about a year and a half ago | (#42993651)

The analogy falls apart if the parking lot is a guarded parking lot that guarantees the cars' safety.

Re:Ethics (2)

GrumpySteen (1250194) | about a year and a half ago | (#42993707)

I'm fairly sure those don't exist. Even the guarded parking lots have disclaimers that say they aren't responsible for theft and damage.

Re:Ethics (1)

ArsonSmith (13997) | about a year and a half ago | (#42993787)

I think a better example may be someone going through stores and seeing which ones post people's credit card info on a large board behind the cashier for all to see, versus the ones actually trying to keep it hidden. That is seriously about how stupid many of these websites are, and if this were happening in meatspace, this list would be uncontested as a supreme public service.

Re:Ethics, over zealous (0)

Anonymous Coward | about a year and a half ago | (#42993947)

What if PunkSPIDER had notified the sites of vulnerabilities and, as is typical with site owners/hosts, they simply did not care unless it grabbed the attention of the media/press, with users who actually cared pissed/worried about it?

I do not know if PunkSPIDER reported vulnerabilities to sites; if they did, and saw no move to patch them, then this project of theirs will surely grab some attention by shaming. There are other orgs out there that you could ask to handle this (legally), but they are more concerned with larger impending matters (and rightfully so).

Re:Ethics (2)

Jafafa Hots (580169) | about a year and a half ago | (#42994199)

Your professor is an idiot.

This is more like going to a public parking lot and testing to find out whether the security cameras are real and working, not working, or fakes, and then telling people they shouldn't park their cars there if they want to park where there are security cameras.

Re:Ethics (2)

Thruen (753567) | about a year ago | (#42994867)

Actually, it's less like telling people they shouldn't park there, and more like creating a searchable database of areas with no security cameras. It doesn't take much thought to realize that the people looking for a place to park aren't going to search this database; it'll be used by people looking for safer areas to steal those cars. Or, to drop all this stupid analogy crap that never seems to have a positive effect on discussions: this is a searchable database that's only going to be used by people who are looking for vulnerabilities. Is the average user even going to know about this? Nope, but hackers everywhere will definitely know about it. Odds are, site owners aren't going to search it either, which may be a little irresponsible, but is nowhere near as bad as finding the vulnerabilities and trying to make sure everyone knows about them. They claim good intentions, and maybe they really have good intentions, but there's nothing good about this.

Re:Ethics (1)

Zapotek (1032314) | about a year ago | (#42995085)

The prof gave the wrong simile; you are an idiot. WebAppSec scanners can inject harmful payloads (emptying-whole-DB-tables harmful: a simple string like "or 1=1" in the wrong place can cause loads of trouble) and should never be run against live/production websites.
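
To make the "emptying whole DB tables" point concrete, here is a sketch (Python/sqlite3, with a hypothetical table, purely for illustration) of why an automated "or 1=1" probe against a production site is not a harmless read:

```python
import sqlite3

# Hypothetical backend for a "delete my session" endpoint.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sessions (id INTEGER, token TEXT)")
conn.executemany("INSERT INTO sessions VALUES (?, ?)",
                 [(1, "a"), (2, "b"), (3, "c")])

payload = "0 OR 1=1"  # a typical scanner probe for a numeric parameter

# If the endpoint splices the parameter straight into the statement,
# the probe deletes every row, not just the one requested.
conn.execute(f"DELETE FROM sessions WHERE id = {payload}")
remaining = conn.execute("SELECT COUNT(*) FROM sessions").fetchone()[0]
print(remaining)  # 0: the whole table is gone
```

On a read-only SELECT the same probe merely leaks rows; landed in a DELETE or UPDATE built by concatenation, it is destructive - which is exactly why vendors tell you not to point scanners at production.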

Also, those guys are overly excited about their own work, to the point of arrogance, but give them time. They'll either come to appreciate all the complexities of these types of systems and power on, or just give up after a while.
They got the attention they wanted now, anyway...

Re:Ethics (1)

Reemi (142518) | about a year ago | (#42994285)

Suppose you lent your expensive camera to a friend, and you noticed he left it visible inside his car in the parking lot - a parking lot known for many burglaries. In other words, he was inviting somebody to steal your camera.

Would it be ethical to check whether he locked his car, so you can protect YOUR OWN belongings?

I agree with your statement, but looking at my log files I wonder why the good guys should not be allowed to perform one scan while the bad ones are performing hundreds a day. Why should the bad guys have all the information?

Re:Ethics (1)

martin-boundary (547041) | about a year ago | (#42994371)

The basic car analogy fails to capture that vulnerabilities in computer systems are often used as stepping stones for further attacks on other computer systems. In the car analogy proper, the only person affected by a break-in is the unlocked car's owner, while the other car owners are safe provided their car doors are locked.

But say the criminal is a joyrider. He picks an unlocked car, and then drives around the parking lot smashing into other locked cars for fun, and then runs away. Now the question: is it wrong to check if some other car in the lot is unlocked and shame its owner? The chance that your car, even if locked, will be damaged due to some other car being unlocked and used as an attack vector is now non negligible.

Re:Ethics (0)

Anonymous Coward | about a year ago | (#42995179)

I agree. YOU have no right to appoint yourself as guardian and watchdog. How much time do admins like myself spend hunting down dogs like this, who fill my log files with their 'self-fulfilling need'? Nobody wants them. You are truly UNWANTED AND UNLOVED!

Re:Ethics (1)

gawdric (2849539) | about a year ago | (#42996113)

Public shaming has its place. Think back, if you were involved then, to the mid-to-late 90s. Smurf attacks were the in thing. Places like and others began listing networks that had smurfable broadcast addresses. For a while this did the legwork for the script kiddies, but eventually the networks made it a point to remove themselves from the database, and now the problem is nearly gone. I can see this database having a similar impact.

Can you tell me... (0)

Anonymous Coward | about a year and a half ago | (#42993577)

Which banks are vulnerable, and how to hack them? I want to know this for the lulz.


Law suit (0)

Anonymous Coward | about a year and a half ago | (#42993615)

Are you trying to get sued? Even if you're doing nothing illegal, you are going to get some people in hot water or get their systems exploited. Either could lead to you getting sued into oblivion, or to some not-so-nice police taking all your stuff in the middle of the night and putting you in handcuffs.

While the act is intended to be noble, you may not have thought through how this works in the real world and what the social reaction will be.

Re:Law suit (-1)

Anonymous Coward | about a year and a half ago | (#42993671)

How is this not as noble as what Anonymous has done in the past?

They hacked Los Zetas when one of their own was kidnapped. They took down a bunch of GoDaddy servers. They used DDoS against evil capitalist groups such as VISA and Mastercard. They hacked Sony and released personal information for the world to see. They kidnapped a member of 4chan, and then he was raped by 17 members of Anonymous because he said he voted for Romney.

These are the most honorable people to have ever existed. Anonymous is the last defense against the tyranny of Obama. If it wasn't for President Obama, Anonymous would not exist. This proves that Obama is pure evil.

The truth is... (1)

Anonymous Coward | about a year and a half ago | (#42993747)

It's a tool. Tools can be used for good and evil; it just depends on whose hands the tool is in. Take Metasploit, for example - it's used widely by both whitehat security researchers and blackhat criminals.

As a security researcher, I'll add that PunkSPIDER doesn't shine light on anything that the bad guys don't already know. I'm glad to see another tool that helps enable those who are charged with defending web applications.

Couldn't find any - the results so far ARE pretty (1)

G3ckoG33k (647276) | about a year and a half ago | (#42993783)

Tried two dozen sites that I visit regularly. No issues. Most are top 100,000 on alexa but a few below 1,000,000.

Re:Couldn't find any - the results so far ARE pret (0)

Anonymous Coward | about a year and a half ago | (#42993819)

Tried two dozen sites that I visit regularly. No issues. Most are top 100,000 on alexa but a few below 1,000,000.

Just type partials, like ".org" and check the boxes, you will get some results.

:) Law & Society Trust (0)

Anonymous Coward | about a year and a half ago | (#42993845)

Law & Society Trust
Timestamp: Fri Aug 10 09:47:28 GMT 2012
BSQLI:0 | SQLI:52 | XSS:0

Law & Society Trust? LOL

Re::) Law & Society Trust (0)

Anonymous Coward | about a year ago | (#42994723)

You're doing better than me, I don't think the URL search really works. I tried a URL search for [] and got nothing at all for that domain but pages of not-really-related results that mention MyCompanyThatIWontMentionHere somewhere else within their URL.

Re:Couldn't find any - the results so far ARE pret (2)

Sqr(twg) (2126054) | about a year ago | (#42994241)

Typing * in the search box gets you everything, it seems.

  762 pages (times 10 sites per page) for "bsqli"
  77 pages for "sqli"
  421 pages for "xss"

Re:Couldn't find any - the results so far ARE pret (3, Informative)

punk2176 (2840475) | about a year and a half ago | (#42994109)

So one thing that we've been trying to make clear is that the project is *on track* to scan the entire Internet, we haven't scanned everything yet. We have scanned about 70k sites and have under 4 million indexed. Our next version is going to be clearer on what is and is not scanned - currently we just say 0 vulnerabilities if we haven't scanned it, indicating that we have not found vulnerabilities in it yet - not necessarily that it doesn't have any. This was all part of our ShmooCon presentation which just hasn't been released to everyone yet! The system is self-sustaining at this point so these numbers are constantly going up. The "not pretty" comes from the fact that we have over 100,000 vulnerabilities from just scanning about 70,000 sites (some sites have multiple vulnerabilities).

Re:Couldn't find any - the results so far ARE pret (1)

Zapotek (1032314) | about a year ago | (#42995421)

There are a few assumptions being made here that should be addressed for people unfamiliar with the field:
  • It would be impossible for results of that magnitude to be manually verified in order to weed out false-positives, which are a real problem.
  • Just because that scanner hasn't found any vulns it doesn't mean there aren't any.
  • As others have pointed out, this is highly unethical. Scanning a site can be disruptive (and even devastating under some circumstances) which is why every such vendor discourages use of their software against live/production websites.

I imagine you saw HD Moore's nmap scan of the Internet and thought to yourself, "Wow, we've got to get us some of that!" But this is a really bad idea, and I imagine you already know that. The only way to have gone forward with this is after weighing the bad (ethical issues, fallout from site owners, possible legal troubles, etc.) against the good (getting attention), and here we are.

Re:Couldn't find any - the results so far ARE pret (1)

Anonymous Coward | about a year ago | (#42996601)

Please publish your scanning IP so it can be blocked by people that wish to opt out of this

Lawsuits and ethics. (1)

sdsucks (1161899) | about a year and a half ago | (#42994037)

I hope you've got a good lawyer and money to keep him or her happy. The first exploit you publish about a large (organization|government|important person) is going to give you a really, really, really big headache - at best.

Also... ethics - you have none. For this, as someone who has spent past lives working in IS, I hope you rot in a miserable existence.

Fame grasping by a very amateur security "expert".

Wellcome to the gray area. (0)

Anonymous Coward | about a year ago | (#42994535)

Broadcasting yourself playing a game is something that could be viewed as broadcasting somebody's copyrighted material, similar to broadcasting parts of a movie.

Game companies allow it because it's free advertising; more people buy these games because of the videos. But, like modding, it's something that may one day change, which would put a creative activity on shaky terrain.

Consider yourself warned. I will say "I told you so".

Re:Wellcome to the gray area. (0)

Anonymous Coward | about a year ago | (#42994731)

Commenting on the wrong story again, Mr. AC?

SF? (0)

Anonymous Coward | about a year ago | (#42994543)

So where is the connection to San Francisco? More specifically, to the viewers of this site who are in the Bay Area for the next week?

Why would they put this up now?

TL;DR (1)

thejynxed (831517) | about a year ago | (#42994673)

Most web sites aren't written with security in mind, but pageviews, rankings, and advertising revenue. News at 11.

How about that NASCAR race?

Java vs .NET (1)

sproketboy (608031) | about a year ago | (#42998659)

Java 96, .NET 20247. LOL.

Perhaps a Suggestion (1)

utkonos (2104836) | about a year ago | (#43000363)

I was at your talk at ShmooCon and was quite impressed. What if, for any domains you discovered vulnerabilities on, you were to automatically pull whois data (if the TLD has whois servers, or web-based whois without a captcha) and send a quick email about your findings to any addresses listed? A shameless plug: ruby whois [] is the best programmatic whois client and parser out there, IMHO. It would make the above suggestion quite simple.
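
A rough sketch of that whois-then-notify idea (in Python rather than Ruby, shelling out to a local whois(1) client rather than any particular library; the regex and function names are made up for illustration):

```python
import re
import subprocess

# Hypothetical helpers for an automated "we found vulnerabilities on
# your domain" notifier. Assumes a whois command-line client is installed.

def extract_emails(whois_text: str) -> set[str]:
    """Pull anything that looks like a contact address out of whois output."""
    return set(re.findall(r"[\w.+-]+@[\w-]+\.[\w.-]+", whois_text))

def contact_emails(domain: str) -> set[str]:
    out = subprocess.run(["whois", domain], capture_output=True,
                         text=True, timeout=30).stdout
    return extract_emails(out)

# Sending the actual notice (smtplib, a ticketing system, ...) is left
# out; the point is only that harvesting contacts per flagged domain is
# easy to automate, captchas and whois rate limits permitting.
```
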