
2M New Websites a Year Compromised To Serve Malware

kdawson posted more than 6 years ago | from the wave-upon-wave dept.

Security 72

SkiifGeek writes "Sophos claims that they are detecting 6,000 new sites daily that have been compromised to serve malware to unsuspecting site visitors, with 80% of site owners not aware that they have been compromised — though this figure is probably on the low side. With increasingly vocal arguments being put forward by security experts criticizing the performance and capability of site validation tools (though many of these experts offer their own tools and services for similar capabilities), and rising levels of blended attacks, perhaps it is time you reviewed the security of your site and what might be hiding in infrequently used directories."


72 comments


How to Check a LAMP Server? (3, Interesting)

MankyD (567984) | more than 6 years ago | (#22181118)

Every time I read about a new form of server malware, I try to check a LAMP server that I run. So far I've come up clean, but I've hardly done a full inspection. Does anyone know of a good way to scan a setup? Sophos says they are detecting thousands of new sites - how are they scanning them?

Format every single day (1)

MonsterOfTheLake (880659) | more than 6 years ago | (#22181176)

Just to be sure. You can thank me later (in cash).

hit it with a hammer (1)

rs232 (849320) | more than 6 years ago | (#22181274)

hit it with a hammer


Re:hit it with a hammer (-1, Offtopic)

jank1887 (815982) | more than 6 years ago | (#22182264)

no, you just have to show it the hammer...

Re:How to Check a LAMP Server? (0)

rs232 (849320) | more than 6 years ago | (#22181200)

"Every time I read about a new form of server malware, I try to check a LAMP server that I run. So far I've come up clean, but I've hardly done a full inspection."

The only way to be sure is to run it from a CD image and reboot nightly ..

"Does anyone know of a good way to scan a setup? Sophos says they are detecting thousands of new sites - how are they scanning them?"

They're not, they just need a little boost to the stock price, please buy our PRODUC~1 .. :)

Re:How to Check a LAMP Server? (1)

ThePilgrim (456341) | more than 6 years ago | (#22182662)

Sophos is a private company. They don't have a stock price that needs raising.

Re:How to Check a LAMP Server? (1)

MarkGriz (520778) | more than 6 years ago | (#22185080)

The only way to be sure is to run it from a CD image and reboot nightly ..
I thought you were supposed to nuke it from orbit?

Re:How to Check a LAMP Server? (1)

electrostatic (1185487) | more than 6 years ago | (#22188886)

Is it practical to use an HD in write-protect mode? (Some hard drives have a write-protect jumper: Google "ide hd write protect jumper".) Put the system software on the write-protected drive and use a second HD for the rest. This topic raises another question: I wonder how Google is able to crawl the web while protecting its own systems from malware.

Re:How to Check a LAMP Server? (1)

dotancohen (1015143) | more than 6 years ago | (#22191760)

Google is not running every ActiveX control and .exe that it comes across. I should imagine that they only need to protect against SQL injection.


Re:KDAWSON--Please, please read this and respond (2, Informative)

flydpnkrtn (114575) | more than 6 years ago | (#22181698)

OK I know I'm feeding the trolls but you know you can choose to NOT see certain authors' stories under Preferences->Homepage, right?

Re:How to Check a LAMP Server? (5, Informative)

Smidge204 (605297) | more than 6 years ago | (#22181260)

I thought about this myself. One possible solution that I considered would be to maintain a local list of files on your server and their CRC/hash values. A script on the server would scan all the files and output a similar list that you could then check against your local copy, quickly identifying any new or changed files. This could be set up as a cron job for periodic scans, or you could just initiate a manual scan whenever.

Might not be the best solution but it should be easy to implement. Larger sites can do incremental scans. It would be harder to detect corruption of databases, though, unless you know what to look for or have a concrete way of validating the contents.
=Smidge=
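A minimal Python sketch of the scheme described above (the function and file names are my own; SHA-256 stands in for the CRC/hash step):

```python
import hashlib
import os

def hash_tree(root):
    """Walk `root` and return {relative_path: sha256_hex} for every file."""
    digests = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            rel = os.path.relpath(path, root)
            h = hashlib.sha256()
            with open(path, "rb") as f:
                # Hash in chunks so large files don't blow up memory.
                for chunk in iter(lambda: f.read(65536), b""):
                    h.update(chunk)
            digests[rel] = h.hexdigest()
    return digests

def compare(baseline, current):
    """Report files added, removed, or changed since the baseline."""
    added = sorted(set(current) - set(baseline))
    removed = sorted(set(baseline) - set(current))
    changed = sorted(p for p in baseline
                     if p in current and baseline[p] != current[p])
    return added, removed, changed
```

Run `hash_tree` once to record a baseline (e.g. serialize it to a file kept off the server), then have a cron job recompute and compare, exactly as the comment suggests.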

Re:How to Check a LAMP Server? (2, Informative)

Bongfish (545460) | more than 6 years ago | (#22181510)

For this, you'd want to use something like Tripwire or AIDE. They've been around for years, and will detect changes to files.

You're right that it won't help you detect that somebody has managed to insert a chunk of javascript or PHP in your insecure mySQL/PHP web app, though. Perhaps a combination of Snort, Ntop (if it wasn't shit), a "hardened" PHP binary and config, and log monitoring would alert you in the case of an attack.

The problem is that there's a lot of badly written or out of date software out there that can be exploited, even without discovering new holes. If you're running this sort of thing and making it publicly accessible over the net, somebody is going to take advantage of it.

Re:How to Check a LAMP Server? (1)

Bongfish (545460) | more than 6 years ago | (#22181636)

> insecure mySQL/PHP web app

I meant "vulnerable", but feel free to insert jokes about neurotic software here:

Re:How to Check a LAMP Server? (1, Informative)

Anonymous Coward | more than 6 years ago | (#22181706)

That's exactly what Radmind does:

http://radmind.org/ [radmind.org]

Re:How to Check a LAMP Server? (1, Informative)

the_olo (160789) | more than 6 years ago | (#22181764)

I thought about this myself. One possible solution that I considered would be to maintain a local list of files on your server and their CRC/hash values. A script on the server would scan all the files and output a similar list that you could then check against your local copy, quickly identifying any new or changed files. This could be set up as a cron job for periodic scans, or you could just initiate a manual scan whenever.

Congratulations! You have just described Tripwire [sourceforge.net] .

Re:How to Check a LAMP Server? (1)

leet (1202001) | more than 6 years ago | (#22187458)

Tripwire is excellent and I've used it for years. Another good practice is to e-mail yourself the results of a Tripwire check on a daily basis, or more often if you prefer. That way you get the server reports in your inbox on a regular basis.

Tripwire is good for systems that are well understood. If you don't understand the changes that can happen on a system then it won't do you much good. But running Tripwire gives the administrator an understanding of system changes over time. So initial understanding should not be a barrier. It's a good thing.
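The check-then-mail loop described above can be glued together with just the Python standard library; a sketch in which the sender, recipient, and host are placeholders, and the integrity check itself (Tripwire or otherwise) is assumed to have already produced `report_text`:

```python
import smtplib
from email.message import EmailMessage

def build_report_mail(report_text, sender, recipient):
    """Wrap a plain-text integrity report in an e-mail message."""
    msg = EmailMessage()
    msg["Subject"] = "Nightly integrity report"
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content(report_text)
    return msg

def send_report(report_text, sender, recipient, host="localhost"):
    """Hand the message to an MTA; run this from the same cron job
    that runs the integrity check itself."""
    with smtplib.SMTP(host) as smtp:
        smtp.send_message(build_report_mail(report_text, sender, recipient))
```

A daily crontab entry pointing at a script that calls `send_report` with the checker's output gets you the "reports in your inbox" setup the comment describes.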

Re:How to Check a LAMP Server? (1)

spinfan (893209) | more than 6 years ago | (#22182544)

A quick and easy way to check all your files is to use md5deep, which will scan directories recursively and generate all the MD5s into a single file which you could compare against a baseline:

http://md5deep.sourceforge.net/ [sourceforge.net]

Re:How to Check a LAMP Server? (1)

loncarevic (102877) | more than 6 years ago | (#22182654)

For file integrity, you can always use something like:

AIDE - Advanced Intrusion Detection Environment

AIDE is a file integrity checker that supports regular expressions. Licensed under the GPL.

www.cs.tut.fi/~rammer/aide.html

Re:How to Check a LAMP Server? (1)

budgenator (254554) | more than 6 years ago | (#22183410)

What you would have to do is set up a script that keeps a local mirror of the website, containing every authorized file in its authorized form with the appropriate hashes, and compares it to what is on the website via FTP, reporting any additions, deletions, or changed files. This would tell you that the site proper is what you uploaded, unchanged. Another problem seems to be that some big server farms have been rooted, which is probably out of your control; that's why I wouldn't trust a script running on the website's server.

Radmind (2, Informative)

fitterhappier (468051) | more than 6 years ago | (#22183582)

Radmind: http://radmind.org/ [radmind.org] . Radmind is designed for exactly this purpose. It's a Tripwire with the ability to roll back changes, or to capture them and store them for deployment to other systems.

Re:How to Check a LAMP Server? (1)

michaelwigle (822387) | more than 6 years ago | (#22181540)

I don't have a great answer but I may have found a step in the right direction. Check out Tiger at http://www.nongnu.org/tiger/ [nongnu.org] . It's a Debian scanner that checks for common signs that someone has pwned your system. I'll warn you now that I haven't tried it yet but might do so in the next few weeks to see how it operates. It doesn't check specifically for any malware but does check for signs that someone has altered your system for remote control. Like I said, not a great solution, but another tool in the toolbox.

AFICK. (1)

Penguinisto (415985) | more than 6 years ago | (#22181552)

You first need a file integrity checker. AFICK (my favorite) or something similar will do a run on whatever schedule you set in its cron job and conf file, then e-mail you a list of the files that have changed over that period.

Also, keep an eye on how big your maillog files are - if they suddenly grow by an order of magnitude, you've been turned into a spam server (or a newsletter went out - five seconds of peeking at the live output should tell you which).

Also, you can keep an eye on the http access logs - with a bit of scripting and uniq, you can tell if one site is constantly connecting (e.g. most PHP hijacks pull files in from another, far more compromised site). If you get a ton of HTTP connections from a site you don't expect (e.g. a shitload of calls come in from some site ending in .cn, the IP addy resolves to somewhere halfway across the planet, things like that), then you may want to get your suspicions up.
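A rough Python sketch of that access-log check (the threshold is arbitrary, and the only assumption about log format is that the client address is the first whitespace-separated field, as in Apache's common/combined formats):

```python
from collections import Counter

def top_clients(log_lines, threshold=100):
    """Count requests per client address (first field of each access-log
    line) and return the addresses at or above `threshold`, busiest first."""
    counts = Counter(line.split(None, 1)[0]
                     for line in log_lines if line.strip())
    return [(addr, n) for addr, n in counts.most_common() if n >= threshold]
```

Feed it `open("/var/log/apache2/access.log")` (path varies by distribution) and then reverse-DNS or whois anything unexpected near the top of the list.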

I suspect Sophos is crawling the pages and scanning what they get for common munge-ups and mis-directions. It wouldn't surprise me if they tend to overstate the case somewhat, but I don't know exactly what they're looking for, so it'd be hard to tell for sure.

/P

Additional malware detection tools (1)

DrYak (748999) | more than 6 years ago | (#22185064)

In addition to the other tools mentioned by /.ers, there are two rootkit-checking tools worth mentioning:
- chkrootkit [chkrootkit.org]
- rkhunter [rootkit.nl]

They are scripts that scan the system for known rootkits, weird behaviour, and hidden files in unusual places. They can be used to scan either an offline system (booted from a live CD, with the system mounted under some directory) or a live online system (they check for suspicious behaviour that may reveal a rootkit trying to hide itself - for example, the "ps" command not showing the same processes as the "/proc" directory could mean a rootkitted "ps").

They are available in a lot of distributions (Debian Etch has them in the repository - the corresponding Ubuntu probably has them too), and the packages usually come with "cron" entries that can automatically scan the system and e-mail a report to the administrator. They can also be downloaded and installed from their websites, and ship with configuration files that cover the most common distributions.

You should install them, run an initial check (editing the configuration if necessary to remove false positives, i.e. hidden files the script complains about that are a normal part of the system), and add crontab entries to do daily checks and e-mail you any positive results. This will help keep your server from being rootkited without your knowledge.

-----------------------

Another tool worth mentioning is ClamAV [clamav.net] . It's an open-source signature-based virus scanner whose makers have been praised for their very fast response time to newly emerging threats. You could set it up to periodically check files in the directories that are served (/srv/www, /srv/ftp, etc.). The scanner is not very fast, but it supports some specialized hardware acceleration [clamav.net] (worth considering if the server is rather important and gets significant mail traffic too). Some teams are also working on GPGPU acceleration (mentioned in nVidia's book "GPU Gems 3").

This will give you some protection against websites you're hosting that may have been hacked into (through bugs in PHP pages, for example) and are now serving malware.

-----------------------

Because of the way malware evolves, you may have to upgrade the above software to versions later than those shipped with your OS. Some distributions offer this through their security updates. For Debian, keep in mind that this kind of "later version required" package goes in the "volatile [debian.org] " repository and not the "security" one, so modify your sources accordingly:
- "security": we keep the exact same version for stability reasons and only patch critical errors.
- "volatile": for security reasons, some packages (mostly various scan engines) may require updating to a later version.
- "volatile-sloppy": warning, the packages are really different; b0rkage of config files may ensue (mostly software like gaim/pidgin).

This is a page [sectools.org] with a top 100 of various security tools that may also inspire you (for example, they mention a web-server scanner called Nikto).

Also, always keep in mind that a compromised machine is not a machine you can trust. So in addition to creating new entries in your crontab, you should also test your machine offline as part of the security checks.
For example, occasionally, when you have to take your server offline for planned updates (rebooting into a newer kernel version or non-hot-pluggable hardware upgrades), you may want to scan the system while booted from a live CD, in case the rootkits are efficient enough to go undetected once they are active.
(That is, if conditions allow you to perform such a scan: the machine is physically accessible, you can plan sufficiently long downtime into the website's schedule, or there's a backup server to ensure availability while the main server is scanned.)

A last, different, and harder-to-implement idea (somewhat related to XKCD's aquarium [xkcd.com] ) would be to use virtual machines. Big hosting companies often offer their LAMP servers as guest Linux systems running inside virtual machines scattered across their actual servers.
For some extra security paranoia, you could virtualize your LAMP stack inside a VM and have the host system outside the VM perform the scans.
(You reduce the risk of being unable to trust the machine after a compromise, because the attacker would have to not only rootkit the LAMP stack but also break out of the VM to make the host unreliable.)
As a bonus, for even bigger paranoia and faster detection of newly emerging rootkits, you could scan critical files on your virtual LAMP server both over the network and by reading the disk file from the host OS outside the VM, and check whether the checksums differ (a difference could reveal a rootkit at work, trying to hide itself and masquerading as non-infected files).

Re:How to Check a LAMP Server? (1)

ratboy666 (104074) | more than 6 years ago | (#22185542)

I use tripwire to build hashes for all files, and store them "off-machine". The hashes are compared against a baseline, and any differences are highlighted. Disk use is monitored, and anomalies (none found yet) are investigated. Logs are examined, and ssh dictionary attacks are dropped.

The server does NOT run database -- only (pure static) apache. Scripts are NOT run on this machine. Certain things are directed off to https, on another machine, with user/password authentication. That, in turn, actually talks to the db. If there IS a breakin, it almost certainly would have to be an "inside job" and I would be able to find the perp (but only 12-ish people use this system).

The entire disk is also backed up to tape, and the size of the backup is checked (it has never been larger than the data backed up). The idea is that a non-OS method is used to determine what the size should be, and this can be compared to the OS method. If the OS method "lies" (that is, is root-kitted), the other method will give a different result.

If the contents/length of the files change, the hash would be different, and the hashes are under the control of a DIFFERENT system. Basically, the attacker would need to modify tape firmware, and TWO machines to "invisibly" corrupt the front-end.

So either the hacker has been REALLY REALLY good, or it hasn't happened.

So, I don't know if I have been "exploited", but I really don't think so... Tripwire with offlining logs is probably ok for you.

2 million a year? (0)

Anonymous Coward | more than 6 years ago | (#22181134)

2 million websites a year? Are these all IIS websites on PCs that the PC owners don't know they are hosting? How can the number be this high?

Re:2 million a year? (1)

MacarooMac (1222684) | more than 6 years ago | (#22181250)

According to the performance and capability [beskerming.com] link

"The figure of 2 million new site compromises per year seems to be quite significant, but could be explained by virtual hosting servers with many sites on the one physical server being compromised, leading to the same vector affecting multiple sites (in some cases thousands of sites)."


validated certified indemnified .. (1)

rs232 (849320) | more than 6 years ago | (#22181152)

But I thought all these sites were validated and certified and IP indemnified, else what was the point of paying huge wads of dosh to all the lawyers, oh wait, now I get it .. :)

Hmm, time to improve the common tools (4, Interesting)

davidwr (791652) | more than 6 years ago | (#22181160)

Perhaps the time has come to harden the "common stacks" so certain switches are off.

For example, once you set up your web site, "lock it" so if there are any changes to files or directories that shouldn't change, the site will break in a non-harmful way rather than be compromised.

If and when these files need updating, the "unlock" process should be done using a tool independent of the main web-server process, perhaps by using a different web-server process running on a different port or even a process on a different computer that validates the request then passes it on to the main web server.

Re:Hmm, time to improve the common tools (1, Funny)

Anonymous Coward | more than 6 years ago | (#22181208)

Even better. When a server is compromised, it will burst into flames, burning down the entire data center it's hosted in. You know, just in case the virus spread.

Re:Hmm, time to improve the common tools (1)

morgan_greywolf (835522) | more than 6 years ago | (#22181482)

Easier. For a LAMP stack, here's what's needed:

Everything on the Web server should be mounted read-only, preferably from a machine behind a firewall. Another firewall sits between that machine and your inside network. The only way to write to the file system should be from behind the firewall. Any temporary files that need to be created for download or parsing or whatever, where read/write is necessary, should only be kept on a RAM disk. Reboot the server nightly.

The database server should also sit on the inside network, with a proxy running between the two firewalls, along with the read-only file server.

All connections should be secure and encrypted.

Or, just let someone else host it. Get a managed dedicated server, pay them to maintain it, and sue the pants off 'em if something happens to it.

too complicated for "simple" setups (1)

davidwr (791652) | more than 6 years ago | (#22182548)

Actual different machines with actual different firewalls are good for hosted solutions and IT departments that know what they are doing, but they are too complicated for non-geek do-it-yourself mom-and-pop-businessman/home-user solutions.

However, a stack that uses a virtual or otherwise hardened subsystem to hide the non-read-only files and databases, packaged in an easy-to-use form, should be doable.

Re:too complicated for "simple" setups (1)

morgan_greywolf (835522) | more than 6 years ago | (#22184308)

Actual different machines with actual different firewalls are good for hosted solutions and IT departments that know what they are doing, but they are too complicated for non-geek do-it-yourself mom-and-pop-businessman/home-user solutions.
Should the non-geek do-it-yourself mom-and-pop-businessman/home-user REALLY be putting a live box out on the public Intartubes with exposed services? Wouldn't they be MUCH better off with a hosted solution, especially given that shared hosting can cost as little as $5 a month?

Re:Hmm, time to improve the common tools (1)

Penguinisto (415985) | more than 6 years ago | (#22181610)

For example, once you set up your web site, "lock it" so if there are any changes to files or directories that shouldn't change, the site will break in a non-harmful way rather than be compromised.

If it's not supposed to change at all, just issue chattr +i on it to make it immutable. Then it won't change, even with system root permissions. Just remember to unset the flag any time you do want to change something ("chattr -i").

/P

Re:Hmm, time to improve the common tools (0)

Anonymous Coward | more than 6 years ago | (#22182136)

Umm, that's great and all until you have a web site that you need to update on a daily basis or more frequently (like 99% of the web). Then you're spending all day changing attributes on files. New idea, please.

Re:Hmm, time to improve the common tools (1)

Penguinisto (415985) | more than 6 years ago | (#22183790)

Err, you do realize I wrote "If it's not supposed to change" up there, right?

chattr is not meant to be any sort of cure-all - far from it. But for files (and even directories) in the chroot jail that don't change very often (if at all), like logo images, certain site-structural files and scripts, etc? It works just fine.

For instance, a typical Wiki or any sort of CMS rig-up has lots of CGI and/or PHP files that you wouldn't expect to see modified until you either apply a patch or customize them yourself. What's wrong with locking 'em up until you're ready to change them? As long as you keep track of what you make immutable (and what you don't), I don't see a problem here.

This doesn't mean you have to lock everything - just the bits you know won't change, but can be a potential hazard.

/P

Re:Hmm, time to improve the common tools (1)

argiedot (1035754) | more than 6 years ago | (#22185606)

I'm no expert, but why not just use a database and render all your content out of that? Say, like Wordpress (I know, bad example). That way you only have to secure one part, the part interacting with the database, and you could make all your files immutable :)

Just check your access logs (1, Informative)

Anonymous Coward | more than 6 years ago | (#22181220)

I would just run a perl script that does a regex on the access logs for anything that does not match the files that should be delivered to clients. Put the perl script in a cron job and let it run. Also do an MD5 hash on those files regularly and check for any changes to static files. And use very strong root passwords, don't let root account login remotely, and use ssh keys with no interactive logins.

my 2 cents...

Re:Just check your access logs (1)

JavaRob (28971) | more than 6 years ago | (#22188460)

I would just run a perl script that does a regex on the access logs for anything that does not match the files that should be delivered to clients. Put the perl script in a cron job and let it run.
Sounds like LogCheck. I'm using that; it provides a decent organized daily summary of all access to the server.

Also do an MD5 hash on those files regularly and check for any changes to static files.
Know any pre-written scripts/software that handle this? The trouble is, you'd have to secure the hashes as well, which gets tricky, plus it doesn't offer any help for regularly changing content or database contents (where a content hack like secretly added JS would probably be inserted). But still, better security for non-changing files seems like a good idea; the chattr +i suggestion above seemed useful, assuming you are keeping your root account safe.
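One way to "secure the hashes" is to authenticate the manifest itself with a keyed HMAC, keeping the key off the server so whoever altered the files can't silently rewrite the hash list too. A sketch with deliberately simplified key handling (the function names are illustrative, and this does nothing for database contents, as noted above):

```python
import hashlib
import hmac
import json

def sign_manifest(hashes, key):
    """Serialize a {path: digest} manifest and append an HMAC-SHA256 tag."""
    body = json.dumps(hashes, sort_keys=True).encode()
    tag = hmac.new(key, body, hashlib.sha256).hexdigest()
    return body + b"\n" + tag.encode()

def verify_manifest(blob, key):
    """Return the manifest dict if the tag checks out, else raise ValueError."""
    body, _, tag = blob.rpartition(b"\n")
    expected = hmac.new(key, body, hashlib.sha256).hexdigest().encode()
    # compare_digest avoids leaking where the mismatch is via timing.
    if not hmac.compare_digest(tag, expected):
        raise ValueError("manifest has been tampered with")
    return json.loads(body)
```

Verification would then run from a machine you still trust, pulling the signed manifest off the possibly compromised server.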

And use very strong root passwords, don't let root account login remotely, and use ssh keys with no interactive logins.
Strongly agreed WRT SSH keys vs. passwords. Another one: if you use cPanel, or phpMyAdmin, or Tomcat's management console, anything like that -- ONLY let it bind to 127.0.0.1 and use SSH tunneling to connect. There's simply no reason to let these listen to public IP if you don't want the "public" accessing & exploiting them.

SSH tunneling is easy to set up; if the PuTTY instructions are too complex, try the BitVise Tunnelier [bitvise.com] (free as in beer ONLY), it's super-simple. Map local port 10000 to 127.0.0.1:10000 on the host (cPanel example) and you're all set.

Re:Just check your access logs (1)

JavaRob (28971) | more than 6 years ago | (#22188530)

Whoops -- I mean LogWatch, not LogCheck (though I think that's another, similar package).

virtualized rootkits (2, Interesting)

Speare (84249) | more than 6 years ago | (#22181252)

Okay, say someone's site is served by an ISP. The ISP gives the site owner a shell account and manages the LAMP infrastructure. The shell account is likely a virtualized instance, meant to limit the damage that each little site can do to the hosted infrastructure, not to limit the damage that the host does to little sites or their visitors. How can the site owner "check their own site" in such a case? Virtualization itself is a sort of rootkit conceptually, so how can the virtualized account check for malicious rootkits in its own instance or in the greater infrastructure?

Re:virtualized rootkits (0)

Anonymous Coward | more than 6 years ago | (#22182180)

How can we know that the universe we live in isn't really a digital simulation on a massive scale?

Re:virtualized rootkits (1)

BlackSnake112 (912158) | more than 6 years ago | (#22182412)

We don't until we find the really big reset button.

Re:virtualized rootkits (0)

Anonymous Coward | more than 6 years ago | (#22184968)

how can the virtualized account check for malicious rootkits in its own instance or in the greater infrastructure?

This problem is as old as philosophy itself. How do I know I'm in the real world, and not The Matrix?

Descartes offered a solution [wikipedia.org] : "I think, therefore I am." He apparently forgot that "I am thinking" presupposes existence, so he only ended up making an argument of this form: "X, therefore X", like "The Green Bay Packers are the best football team ever, therefore the Green Bay Packers are the best football team ever. Go Favre!"

The truth is: Nobody Can. It's impossible on the order of a perpetual motion machine, or traveling faster than the speed of light. If you don't control the hardware that provides the observations, you only get whatever observations are given to you.

This is the same reason that "cloud computing" is only a passing fad. Businesses don't want to store "customerlist.txt" on hardware they don't own. Now you know why.

Completely useless. (3, Insightful)

Lumpy (12016) | more than 6 years ago | (#22181276)

Until they release the fricking list of IP addresses or Domain names.

I would love to put that list in my squid blocking file to protect my users.

Re:Completely useless. (1)

Bryansix (761547) | more than 6 years ago | (#22184672)

That would be a dumb move. If it was IP addresses, then once an IP address was re-assigned to a good host you still wouldn't see their website. You would have no way of removing IP addresses from your list.

If it was domain names, the same problem would apply even after the domains cleaned up the infected files.

I say all of this because I was a victim of stupid block lists when I got a new IP and tried to send email out on it. It was blocked because of the previous owner, and getting removed from most lists was non-obvious.

Re:Completely useless. (0)

Anonymous Coward | more than 6 years ago | (#22187106)

Umm yeah, they'll give you that list. It's called a product, and it's what they sell.

6000 sites? (1, Interesting)

Anonymous Coward | more than 6 years ago | (#22181288)

TFA says 6,000 infected webpages. That could be a big difference, but TFA doesn't elaborate.

what does this look like from the client? (3, Interesting)

oni (41625) | more than 6 years ago | (#22181410)

If I run FF and keep it patched, am I safe? If I did get compromised, what would the symptoms be?

I tend to think that keeping my OS patched keeps me pretty safe, but there's always a delay after a new vulnerability is discovered before the patches come out (the zero day) and what concerns me is that if someone has a very large network of compromised web servers, they can roll out a zero day vulnerability to all of them and do a lot of damage.

As to symptoms, I think spyware used to be the big problem, and infected computers would have popups and such. But now I think that infected machines will be used primarily to send spam. Is that correct?

Re:what does this look like from the client? (1)

dotancohen (1015143) | more than 6 years ago | (#22187544)

So long as you are not using an operating system that is named after the most easily broken part of the house, you should be safe.

Yes... (0, Troll)

darkvizier (703808) | more than 6 years ago | (#22181774)

But how many of those websites were compromised to serve Satan?

What I wanna know is ... (2, Interesting)

jc42 (318812) | more than 6 years ago | (#22181798)

When do we get a FOSS runtime library for using this valuable public resource?

Imagine all the useful things we could do for the world if we all had access to this distributed computing power.

Re:What I wanna know is ... (1)

KublaiKhan (522918) | more than 6 years ago | (#22182380)

Shush, I'm trying to put together a business model based on that idea. Don't go blabbing it everywhere! ;-p

Re:What I wanna know is ... (1)

jc42 (318812) | more than 6 years ago | (#22184416)

I'm trying to put together a business model based on that idea.

Well, I think you might be a bit late with that. ;-)

But think of the good things that could be done with a free and open implementation.

OTOH, it's been more than 25 years since the first true distributed OS was announced, and the idea hasn't exactly taken the world by storm.

Re:What I wanna know is ... (0)

Anonymous Coward | more than 6 years ago | (#22186360)

Imagine all the useful things we could do for the world if we all had access to this distributed computing power.

What are you waiting for? Imagine Useful Things, then take the best five and ASK people to donate computing power. If they are really Useful Things, don't you think you should be able to get at least a few thousand machines?

80% (1, Interesting)

Anonymous Coward | more than 6 years ago | (#22182062)

with 80% of site owners not aware that they have been compromised

Wait. So 20% of site owners know their site has been compromised, haven't done anything about it, and are still serving up malware? Sounds to me like someone's making up statistics.

Yes... (3, Interesting)

SigmundFloyd (994648) | more than 6 years ago | (#22182126)

Sophos claims that they are detecting 6,000 new sites daily that have been compromised to serve malware
...but do they run Linux?

Re:Yes... (1)

i*rod (1021795) | more than 6 years ago | (#22197416)

Per Netcraft, Sophos.com is running Linux and Apache (checked 27-Jan-2008, 213.31.172.77, netblock SOPHOS).

Re:Yes... (1)

Jake96 (69645) | more than 6 years ago | (#22197546)

Yes, some do - and it's not a knock on Linux, either.

I work at a small webhost. We're 100% Linux, and have somewhere in the low hundreds of thousands of sites on about forty servers. I come across a compromised site about every other day, and those are the ones that are making themselves obvious - malicious javascript, form abuse, SQL injections, etc. Being on Linux servers has nothing to do with how secure the sites are. The users pick their own passwords and manage their own content, and the sites that get compromised usually do a poor job of one or both.

Bad passwords are the number-one problem. I check regularly and always find accounts using "password". I'm at a loss, frankly. Someone *will* guess your password if you're just going to be blindingly obvious with it. We've had to resort to blocking FTP for half of Africa, most of Asia, and a few spots in eastern Europe.
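That regular check can be scripted in a crude way; this is a hypothetical sketch (the weak-password rules and the example accounts are made up for illustration, and a real host would test against the hashed credential store rather than plaintext pairs):

```python
# Hypothetical audit sketch: flag accounts whose password is
# trivially guessable. The rules below are illustrative, not a
# real password policy.
COMMON = {"password", "123456", "qwerty", "letmein", "abc123"}

def weak_reasons(username, password):
    """Return a list of reasons a password is considered weak."""
    reasons = []
    p = password.lower()
    u = username.lower()
    if p in COMMON:
        reasons.append("common password")
    if len(password) < 8:
        reasons.append("shorter than 8 characters")
    if u in p or p in u:
        reasons.append("derived from username")
    return reasons

def audit(accounts):
    """accounts: iterable of (username, password) pairs.
    Returns {username: [reasons]} for every flagged account."""
    return {u: r for u, p in accounts if (r := weak_reasons(u, p))}

if __name__ == "__main__":
    flagged = audit([("alice", "password"), ("bob", "N0t-Obv1ous-42")])
    for user, reasons in flagged.items():
        print(user, "->", ", ".join(reasons))
```

Run from cron against new signups and you at least catch the "password" crowd before the script kiddies do.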

Second-ranked problem is poor code maintenance. Many hacked accounts are running PHP-based applications that haven't been patched or updated in three years. Yes, your b2evolution install from 2005 is vulnerable now. Back up your data and upgrade. You'll feel much better afterward, I promise =). Most of the PHP vulnerabilities can be (and are) prevented with mod_security, but the occasional user still figures out that they can add a SecFilterEngine Off line in .htaccess and the next thing you know they're sending their 50,000 closest friends information on a great pharmacy in Canada that ships ci4-L1ss right to your door.
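That particular bypass can at least be watched for; here's a hypothetical monitoring sketch (the directive spelling assumes mod_security 1.x, and the docroot layout is made up) that walks user directories and flags .htaccess files switching the filter off:

```python
import os
import re

# Hypothetical sketch: find .htaccess files that try to disable
# mod_security 1.x via a "SecFilterEngine Off" line.
DIRECTIVE = re.compile(r"^\s*SecFilterEngine\s+Off\b",
                       re.IGNORECASE | re.MULTILINE)

def find_bypasses(root):
    """Yield paths of .htaccess files under root containing the bypass."""
    for dirpath, _dirs, files in os.walk(root):
        if ".htaccess" not in files:
            continue
        path = os.path.join(dirpath, ".htaccess")
        try:
            with open(path, encoding="utf-8", errors="ignore") as fh:
                text = fh.read()
        except OSError:
            continue  # unreadable file; skip rather than crash the scan
        if DIRECTIVE.search(text):
            yield path
```

Run it nightly over the docroots and mail yourself the hits; it won't stop the edit, but you'll know about it before the pharmacy spam does.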

What Linux does for us is keep these problems isolated to the user accounts. We've had exactly one box rooted in the last five years, which ain't perfect but ain't bad either, considering the number of attempts we attract. So, yes, run Linux and Apache, and keep them patched and locked down. Watch your logs. Keep good backups and know how to roll them back onto a server. But don't think that's going to keep your website safe from Turkish script kiddies.

Somebody should warn... (2, Funny)

gzerphey (1006177) | more than 6 years ago | (#22182250)

Somebody should warn 3M that they are next. I'm sure they would want to prepare. Ok, sorry I'll get my coat.

Vendor FUD or Real? (4, Interesting)

a-zarkon! (1030790) | more than 6 years ago | (#22183860)

I for one would like some description of how they're detecting these 6000 new sites per day. Also, what are they considering a website? Do they include bot systems that are configured to listen on port 80 as part of the worm propagation and command/control? That's not really a website in my opinion, but it may be in theirs. It would be great if they published a list of the 42,000 new websites they have discovered over the past 7 days, you know, just to back up their claim. Wouldn't hurt to notify the owners of those sites that they've got a problem.

Absent more detail, I am calling shenanigans on this statistic, Sophos, and the Register. I am soooo sick of the FUD.

Harumph!

Re:Vendor FUD or Real? (1)

sn0wghOst (609454) | more than 6 years ago | (#22209848)

It is always wise to be wary of potential FUD, but you also have to realize that Sophos is a for-profit company, and it's highly unlikely they would publish for free a list they can sell instead. It would be like asking Coke to publish their secret recipe...

Time for a white hat virus? (0)

skintigh2 (456496) | more than 6 years ago | (#22187776)

Why can't a bunch of white hats get together in some country with lax or missing Internet laws and make a virus that SLOWLY propagates throughout the Internet looking for VERY OLD vulnerabilities? It would infect those machines, download the patches, and turn on auto-update, maybe scan the local network and then outside for a week or so, then alert the owner with a polite pop-up and background change telling them they've been infected by at least one virus and should get some AV and some patches, maybe even listing some sources.

Dick move? Yes. Illegal in the US? Yes. Nightmare for admin-type-folks? Yes.

But if it's attacking vulns that are YEARS old it would only affect forgotten or luddite-owned boxes.

If it's slow, like a packet a second or slower, it won't clog any networks.

And it will prevent a lot of the stupid viruses that slam the net every now and then.

Re:Time for a white hat virus? (1)

skintigh2 (456496) | more than 6 years ago | (#22187816)

Oh, and it would probably kill most/all of the bot nets.

Re:Time for a white hat virus? (1)

JavaRob (28971) | more than 6 years ago | (#22188108)

Oh, and it would probably kill most/all of the bot nets.
No -- many of the bots nowadays lock down the PC themselves (once they're in...) to keep it "safe" from competing bots. They even actively remove other bots when they can manage it.

As for the idea, though... I think about that as well. Even just getting onto computers that haven't yet been compromised by a really effective bot (as I mentioned above) would be a big step.

Alas, most of the people talented enough to write such a thing are probably either:
* well-employed enough that they don't want to risk the jailtime
* working on building botnets of their own