
BitTorrent For Enterprise File Distribution?

Soulskill posted more than 5 years ago | from the make-it-so dept.

Software 291

HotTuna writes "I'm responsible for a closed, private network of retail stores connected to our corporate office (and to each other) with IPsec over DSL, and no access to the public internet. We have about 4GB of disaster recovery files that need to be replicated at each site, and updated monthly. The challenge is that all the enterprise file replication tools out there seem to be client/server and not peer-to-peer. This crushes our bandwidth at the corporate office and leaves hundreds of 7Mb DSL connections (at the stores) virtually idle. I am dreaming of a tool which can 'seed' different parts of a file to different peers, and then have those peers exchange those parts, rapidly replicating the file across the entire network. Sounds like BitTorrent you say? Sure, except I would need to 'push' the files out, and not rely on users to click a torrent file at each site. I could imagine a homebrew tracker, with uTorrent and an RSS feed at each site, but that sounds a little too patchwork to fly by the CIO. What do you think? Is BitTorrent an appropriate protocol for file distribution in the business sector? If not, why not? If so, how would you implement it?"


291 comments


Sneakernet (5, Insightful)

91degrees (207121) | more than 5 years ago | (#26111385)

The bandwidth of a DVD in the postal service isn't great but it's reasonable and quite cost effective.

Re:Sneakernet (4, Insightful)

tepples (727027) | more than 5 years ago | (#26111451)

The bandwidth of a DVD in the postal service isn't great but it's reasonable and quite cost effective.

From the summary: "I would need to 'push' the files out, and not rely on users to click a torrent file at each site." I imagine that the following is also true: "I would need to 'push' the files out, and not rely on users to insert a disc and run setup.exe at each site."


Re:Sneakernet (1)

91degrees (207121) | more than 5 years ago | (#26111659)

Since these are disaster recovery files, I'm assuming they only need to be used in case of a disaster.

If so, the data is just as good on a DVD as on a hard disk.

If not, my idea's rubbish, but at least it was inexpensive to suggest.

Re:Sneakernet (1)

Tokerat (150341) | more than 5 years ago | (#26111681)

Well, it's also not described what TYPE of disaster recovery files these are. Some of that info might be "What to do if there is a fire on store property.pdf", but some of it might also be "CustomerTransactionDataRecovery.exe".

If that's the case, you would need a store to run setup.exe every time...

Re:Sneakernet (1)

the_B0fh (208483) | more than 5 years ago | (#26111749)

What has that got to do with anything? If that's the case, the files that are currently pushed out to DR still have to be executed manually or automagically.

The OP is not asking for that - the OP wants the files to be transferred automagically. A DVD works perfectly fine, just has high latency.

Re:Sneakernet (1)

neomunk (913773) | more than 5 years ago | (#26111867)

I don't think that putting a DVD into a (hopefully) physically secured computer is as automagical as doing absolutely nothing on the client end while a script/daemon takes care of all the work.

Re:Sneakernet (4, Insightful)

maxume (22995) | more than 5 years ago | (#26111941)

Also, burning (and packaging and mailing...) a bunch of DVDs isn't necessarily cheap/quick/easy, so it breaks down pretty quickly as the number of stores increases.

Re:Sneakernet (1)

fxkr (1343139) | more than 5 years ago | (#26111617)

Problem is, both latency and packet loss are quite high...

Re:Sneakernet (0)

Anonymous Coward | more than 5 years ago | (#26111697)

First: You mean latency, not bandwidth.

Second: Things change so you can't just repeat this line year after year, you have to revisit the question regularly. What does the math look like in 2008? What would be an estimate of disc cost + mailing cost + recurring employee time cost vs file transfer cost?

Re:Sneakernet (1)

91degrees (207121) | more than 5 years ago | (#26111923)

First: You mean latency, not bandwidth.

No. I mean bandwidth. Latency is meaningless here because you're just sending a disc out. The data transfer rate is 4GB over a couple of days, which works out to somewhere on the order of 100-200 Kbits/s.

Re:Sneakernet (0)

Anonymous Coward | more than 5 years ago | (#26111983)

No, bandwidth is meaningless because it can scale trivially (just send more DVDs).

Oh and I guess you did the math and it didn't work out in your favor. I thought so too.



Different torrent client ? (5, Informative)

drsmithy (35869) | more than 5 years ago | (#26111399)

No need to get fancy with an "RSS feed". rTorrent, at least, can be configured to monitor a directory for .torrent files and automatically start downloading when one appears. You could set this up, then simply push out your .torrent file to each site with something like scp or rsync.
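The watch-directory pattern is easy to prototype yourself if your client of choice lacks it. A minimal sketch (directory paths and the handler are placeholders; the handler would shell out to whatever BitTorrent client you run):

```python
import os
import time

def scan_for_torrents(watch_dir, seen):
    """Return paths of .torrent files in watch_dir we haven't handled yet."""
    new = []
    for name in sorted(os.listdir(watch_dir)):
        if name.endswith(".torrent") and name not in seen:
            seen.add(name)
            new.append(os.path.join(watch_dir, name))
    return new

def watch(watch_dir, handler, interval=10):
    """Poll the watch directory and hand each new .torrent to the client."""
    seen = set()
    while True:
        for path in scan_for_torrents(watch_dir, seen):
            handler(path)  # e.g. tell the BitTorrent client to start this download
        time.sleep(interval)
```

rTorrent does this natively, so a script like this is only a fallback for clients without the feature.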

Re:Different torrent client ? (5, Interesting)

Anonymous Coward | more than 5 years ago | (#26111471)

rtorrent [rakshasa.no] watching a directory for .torrent would be the way to go. And then use unison [upenn.edu] to keep the .torrent directory in-sync.

Re:Different torrent client ? (0)

Anonymous Coward | more than 5 years ago | (#26111569)

I think even uTorrent can monitor a directory =]

Re:Different torrent client ? (1)

rusl (1255318) | more than 5 years ago | (#26111831)

ya, that's a pretty standard BT client feature.

Re:Different torrent client ? (0)

Anonymous Coward | more than 5 years ago | (#26111803)

uTorrent and Azureus both do this.

Storm or some other botnet (1, Funny)

Anonymous Coward | more than 5 years ago | (#26111403)

Ask a warez site.

Dedicated Server (0, Flamebait)

Szentigrade (790685) | more than 5 years ago | (#26111405)

Wouldn't a dedicated server provide what you need? Upload your recovery files once and then have the server transfer them to each client at high speed. Simple and cost effective.

technologies working together isn't patchwork (1)

Yonkeltron (720465) | more than 5 years ago | (#26111413)

These are technologies that have been proven effective when working together by people everywhere. If you put it together, test it, and build in fail-safes etc., you should be fine!

ask us (4, Informative)

TheSHAD0W (258774) | more than 5 years ago | (#26111417)

Next time you should ask at the official BitTorrent IRC channel [irc] .

The Python BitTorrent client [bittorrent.com] , which runs on Unix, has a version called "launchmany" which is easily controlled via script. It should fit your needs very nicely.

Snail-mail USB sticks (-1, Redundant)

timeOday (582209) | more than 5 years ago | (#26111421)

4GB of files once per month, why bother using the network?

Re:Snail-mail USB sticks (5, Insightful)

SirLurksAlot (1169039) | more than 5 years ago | (#26111477)

Why would they want to pay for those USB sticks (and any shipping fees that might be involved) when they have a perfectly good network already in place to send the data in a secure manner? There are too many variables involved in using USB sticks as a means of transferring back-up data. Sticks could get damaged, lost, stolen, etc, not to mention that the server at each store would need to allow USB access which could potentially open them up to other security risks. Just imagine if someone at a store decided to plug in their own USB stick and swipe a few files. Nice idea, but there are too many risks involved with a physical transfer of data.

Re:Snail-mail USB sticks (2, Insightful)

hedwards (940851) | more than 5 years ago | (#26111851)

Because depending upon the actual files that might be overkill. For recovery files there's probably a lot of similar or same files in each batch. Something like Jigdo, rsync or distributing diffs might be a lot more efficient.

With those the main concern is having an appropriate client to automatically handle the updating on that end.

Most of those options would also be capable of checking the integrity of previous updates and could be run more frequently just to verify that the data is uncorrupted. I think that bittorrent has similar capabilities.

Works great (5, Insightful)

Anonymous Coward | more than 5 years ago | (#26111427)

BitTorrent is an excellent intranet content-distribution tool; we used it for years to push software and content releases to 600+ Solaris servers inside Microsoft (WebTV).

-j

Sure, why not? (5, Insightful)

sexybomber (740588) | more than 5 years ago | (#26111431)

Is BitTorrent an appropriate protocol for file distribution in the business sector?

Sure! BitTorrent, remember, is only a protocol, it's just become demonized due to the types of files being shared using it. But if you're sharing perfectly legitimate data, then what's wrong with using a protocol that's already been extensively tested and developed?

Just because it's been used to pirate everything under the sun doesn't make it inappropriate in other arenas.

Re:Sure, why not? (0)

tylerni7 (944579) | more than 5 years ago | (#26111827)

I don't know what the poster meant exactly when he said appropriate, but I figured it was something like "is it inappropriate to use the client's bandwidth to push our software?"
If that wasn't what he meant, well, maybe it should be inappropriate. There is certainly nothing wrong with bittorrent, but I'm not sure how the clients would react if they knew that they had to use their bandwidth to push your software.

They might be fine with it, especially if it's a closed network, and that bandwidth can't be put to any good use. However, unless it was already written in a contract somewhere that they agree to let you use their connection for anything, I don't think it would be appropriate to silently push bittorrent clients on their machines that they don't know about.

Re:Sure, why not? (2, Insightful)

Bert64 (520050) | more than 5 years ago | (#26111835)

Pirates still prefer FTP, it seems all of the big warez groups are still pushing files around using FTP...

Re:Sure, why not? (2, Interesting)

hedwards (940851) | more than 5 years ago | (#26111871)

The main problem is that it introduces an extra vulnerability: with it comes the capability of spreading malware and viruses around very efficiently. Depending upon how locked down things are, it might not be a problem, but it's definitely something to worry about.

And yes, I am assuming that somebody's going to get their machine infected or that somebody's going to break into the VPN traffic. Not necessarily likely, but still has to be considered.

rsync (5, Informative)

timeOday (582209) | more than 5 years ago | (#26111439)

How much do these disaster recovery files change every month? If they stay mostly the same, using rsync (or some other binary-diff capable tool) may let you keep your simple client/server model while bringing bandwidth under control.
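The idea behind delta transfer can be sketched in a few lines: hash fixed-size blocks on each side and ship only the blocks whose hashes differ. (Real rsync uses a rolling checksum so it also catches shifted data; this fixed-block version is a deliberate simplification.)

```python
import hashlib

BLOCK = 4096

def block_hashes(data, block=BLOCK):
    """Hash the data in fixed-size blocks."""
    return [hashlib.sha256(data[i:i + block]).hexdigest()
            for i in range(0, len(data), block)]

def delta(old, new, block=BLOCK):
    """Return (index, bytes) pairs for blocks of `new` that differ from `old`."""
    old_h = block_hashes(old, block)
    changed = []
    for i, h in enumerate(block_hashes(new, block)):
        if i >= len(old_h) or old_h[i] != h:
            changed.append((i, new[i * block:(i + 1) * block]))
    return changed

def apply_delta(old, changed, new_len, block=BLOCK):
    """Rebuild the new file from the old copy plus the changed blocks."""
    out = bytearray(old[:new_len].ljust(new_len, b"\0"))
    for i, chunk in changed:
        out[i * block:i * block + len(chunk)] = chunk
    return bytes(out)
```

If only a small fraction of the monthly 4GB actually changes, each store receives only those blocks instead of the whole archive.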

In a word, Yes (4, Informative)

cullenfluffyjennings (138377) | more than 5 years ago | (#26111443)

I've seen BitTorrent used for several business-critical functions. One example is World of Warcraft, which distributes its updates using it.

Re:In a word, Yes (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#26111801)

christ, you just defined WoW updates as a business critical function? kids these days...

Re:In a word, Yes (1)

hedwards (940851) | more than 5 years ago | (#26111875)

I suspect that the GP is either referring to professional farmers or to Blizzard's own staff.

Re:In a word, Yes (5, Insightful)

nabsltd (1313397) | more than 5 years ago | (#26111927)

For Blizzard, updates to World of Warcraft are very much a "business critical function".

World of Warcraft uses it. (0)

gblackwo (1087063) | more than 5 years ago | (#26111449)

Must be good enough for the rest of us.

Cisco already makes a product to do this - WAAS (5, Informative)

colinmcnamara (1152427) | more than 5 years ago | (#26111453)

It is like rsync on steroids. Cisco's WAN optimization and application acceleration product allows you to "seed" your remote locations with files. It also uses a technology called Data Redundancy Elimination (DRE) that replaces large data segments that would be sent over your WAN with small signatures.

What this means in a functional sense is that you would push that 4 Gig file over the WAN one time. Any subsequent pushes you would only sync the bit level changes. Effectively transferring only the 10 megabytes that actually changed.

While it is nice to get the propeller spinning, there is no sense reinventing the wheel.

Cisco WAAS - http://www.cisco.com/en/US/products/ps5680/Products_Sub_Category_Home.html [cisco.com]

Re:Cisco already makes a product to do this - WAAS (2)

CaymanIslandCarpedie (868408) | more than 5 years ago | (#26111721)

I'm a huge fan of WAN accelerators (though I prefer the products from Riverbed), but I'm not sure of the fit here (and it certainly isn't anything like what the OP is asking about). First, these devices aren't cheap, especially when you need to communicate between tons of locations, as seems to be the case here: each location will require a unit, and even the lower-end products in the category easily run $10k. Second, we don't know how similar each month's files are to the previous set. If they aren't mostly identical, this product wouldn't really provide any benefit; if they are basically identical, you can do the same thing with rsync or similar for free. And even then, the functionality you're talking about is based on local caching. Since the files are only moved once a month, that cache (depending on other inter-site traffic and the size of the unit's disk cache) could well have been cleared by the time the next month's data moves, again rendering this expensive solution useless.

Again, I love WAN acceleration, and properly used in the right situations it's some of the most useful and worthwhile kit in any datacenter. For the use mentioned, though, it doesn't seem the right fit to me.

Re:Cisco already makes a product to do this - WAAS (4, Interesting)

Bert64 (520050) | more than 5 years ago | (#26111863)

BitTorrent will transfer the differences too: if you make a new torrent whose files overwrite the old ones, the client will re-download only the pieces whose hashes have changed.

Re:Cisco already makes a product to do this - WAAS (3, Informative)

Anpheus (908711) | more than 5 years ago | (#26111967)

BitTorrent is not very flexible in this regard and so if you have bits -added- to the middle, then everything after the first added bit will need to be updated.

The worst case is, of course, if you have new material at the beginning and everything is shifted. BitTorrent is not designed for that.
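That limitation is easy to demonstrate by hashing a file into fixed-size pieces, the way a .torrent does: an in-place change invalidates one piece, while a single byte inserted at the front shifts everything and invalidates them all (the piece size here is a toy value):

```python
import hashlib

def piece_hashes(data, piece_size=16):
    """SHA-1 of each fixed-size piece, as in a .torrent's piece list."""
    return [hashlib.sha1(data[i:i + piece_size]).digest()
            for i in range(0, len(data), piece_size)]

def reusable_pieces(old, new, piece_size=16):
    """Count pieces of `new` that still match the old data at the same index."""
    old_h = piece_hashes(old, piece_size)
    new_h = piece_hashes(new, piece_size)
    return sum(1 for a, b in zip(old_h, new_h) if a == b)
```

With a 160-byte file and 16-byte pieces, overwriting one byte in place leaves 9 of 10 pieces reusable; prepending one byte leaves 0.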

Re:Cisco already makes a product to do this - WAAS (3, Insightful)

jamiebecker (160558) | more than 5 years ago | (#26111977)

Presumably "on steroids" means "with a fancy GUI".

rsync does this too. rsync can push or pull.

Besides, there are plenty of rsync GUIs, too.

However, BitTorrent is almost certainly the best solution for this purpose. The real question is coherency: you always know that eventually you'll have a complete and perfect copy at each location, but how do you know WHEN that copy is complete so you can work on it? If this is strictly a backup system it may not matter, but it's probably not a good thing to be using files while they're being written.

Some scripting around rsync or btdownload would fix this: download into a temporary directory, copy the files to a working location when the update is complete, and work from there while the next update is restarted on the temp dir.
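A sketch of that hand-off (directory names are hypothetical): download into a staging directory, and only once the transfer reports complete, swap it into the working location so readers never see a half-written set:

```python
import os
import shutil

def publish(staging_dir, live_dir):
    """Promote a fully-downloaded staging dir to the live location.

    Call this only after the torrent/rsync job reports 100% complete.
    """
    previous = live_dir + ".old"
    if os.path.exists(previous):
        shutil.rmtree(previous)          # drop the copy from two runs ago
    if os.path.exists(live_dir):
        os.rename(live_dir, previous)    # keep the last good copy around
    os.rename(staging_dir, live_dir)     # atomic on the same filesystem
```

Because `os.rename` is atomic on a single filesystem, anything reading from the live directory sees either the old complete set or the new complete set, never a mix.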

Azureus (0)

Anonymous Coward | more than 5 years ago | (#26111459)

Azureus, for instance, will happily check a directory regularly for torrents and just start downloading those. It should be trivial to apply some sort of external mechanism to PUTting such torrents in place on needed computers.

Bittorrent is not secure (1)

Hal_Porter (817932) | more than 5 years ago | (#26111461)

DHT or the like might seed your files outside the company. OK, I'm too lazy to work out whether that really is a threat, but I'm not sure BitTorrent is appropriate for data that you don't want ending up in the public domain.

You could probably rig up a system where scripts check secure FTP servers for updates, and download them. Cascade the SFTP servers so that each one feeds out to two more, geographically close ones and you'll be ok. If possible only download diffs, not the whole thing. And find an SFTP client which will pull several files at a time since that gives better throughput on high latency connections which are window size limited.

Re:Bittorrent is not secure (5, Informative)

jd142 (129673) | more than 5 years ago | (#26111537)

While security is always something to be considered, this from the question:

"private network of retail stores connected to our corporate office (and to each other) with IPsec over DSL, and no access to the public internet"

Private network? Check.
No access to public internet? Check.

So pretty much no way for the files to be seeded outside the company.

And even if there were a way to seed on the internet when they don't have access to it, password protect the file so only a client with the password can download it. That's not unbreakable, but if a competitor wanted the information there are easier ways to get it.

Re:Bittorrent is not secure (1, Offtopic)

Hal_Porter (817932) | more than 5 years ago | (#26111677)

I've worked at places that use IPsec or VPN. A common problem is that the server is loaded down, so the secure connection is rather slow. So people will use the VPN to pick up email or access the intranet when they need to, and plain IP for internet access. Or people will bring laptops home and use plain IP on their unsecured home wireless network. The problem with this is that if one of the machines holding the files is abused like this, you could potentially have a leak. And if the files end up on The Pirate Bay, you'll probably get fired no matter how encrypted they are.

Now in an ideal world everyone would understand things enough to not connect a secure machine to the public internet and also that it doesn't matter if files leak if they are sufficiently encrypted. Or that as you put it "if a competitor wanted the information there are easier ways to get it".

But we don't live in that world.

Mind you I'd define sufficiently encrypted as something much more secure than a passworded ZIP file.

Re:Bittorrent is not secure (1)

rusl (1255318) | more than 5 years ago | (#26111893)

Well, you could easily turn off DHT function (not all clients have that ability anyway). Then you could maybe have an IP whitelist similar to the functionality of the azureus plugin that blacklists certain RIAA/MPAA peers. And then you could even define certain ports only - those ports could be encrypted tunnels or something? I wouldn't know how to do all that stuff myself (well, turning off DHT is simple) but it seems that all these kinds of features could be tweaked if someone really was serious about it.

Re:Bittorrent is not secure (1)

SanityInAnarchy (655584) | more than 5 years ago | (#26112101)

password protect the file so only a client with the password can download it.

I don't know of a good way to do that with BitTorrent. Simpler to just encrypt the whole file, so anyone who downloads it is just helping seed, and can't read the file.

That's not unbreakable

With a large enough key, and properly applied crypto, it can be unbreakable until quantum computers become feasible.

As for DHT, I don't see where that's a problem -- trivial to simply disable it, or use a client which doesn't support it.

Re:Bittorrent is not secure (1)

nabsltd (1313397) | more than 5 years ago | (#26111947)

DHT or the like might seed your files outside the company. Ok, I'm too lazy to work out if that really is a threat, but I'm not sure that bitorrent is appropriate for data that you don't want to end up in the public domain.

Every BitTorrent client that supports DHT also has the ability to disable it.

In addition, since this is a VPN network, the client IP addresses are likely to be non-routable, so even if you did leak the torrent through DHT, it's pretty unlikely that anyone outside the company would be able to connect to a client running at 192.168.1.1.

If the CIO expects "official" support... (5, Informative)

aktzin (882293) | more than 5 years ago | (#26111473)

Personally I like the portable media shipment suggestions. But if your CIO/company requires enterprise software from a large vendor with good support, have a look at IBM's Tivoli Provisioning Manager for Software:

http://www-01.ibm.com/software/tivoli/products/prov-mgrproductline/ [ibm.com]

Besides the usual software distribution, this package has a peer-to-peer function. It also senses bandwidth. If there's other traffic it slows down temporarily so it won't saturate the link. Once the other traffic is done (like during your off-hours or maintenance windows) it'll go as fast as it can to finish distributing files.

CIO's want pre-built software (1)

obstalesgone (1231810) | more than 5 years ago | (#26111491)

Get it pre-built and externally supported. It'll be a lot easier to fly by your CIO.

The solution you suggested makes sense.

1. RSA keys are shared across the network.

2. A new file becomes available on your "central" server and is placed into a directory automatically shared by a bt client on the central server.

3. A simple script on the central server checks a list of servers it needs to update, and tells each of them to initiate a transfer using the bittorrent protocol.

4. ???

5. Profit.
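Step 3 can stay a one-page script. A rough sketch (the store hostnames and the remote watch directory are placeholders): it just builds the scp commands that drop the new .torrent into each store's watch directory, where the client picks it up.

```python
def build_push_commands(stores, torrent_path,
                        remote_watch_dir="/var/lib/bt/watch"):
    """One scp command per store; the watching BT client takes it from there.

    `remote_watch_dir` is an assumed path -- point it at whatever directory
    your client is configured to monitor.
    """
    return [
        ["scp", torrent_path, "%s:%s/" % (store, remote_watch_dir)]
        for store in stores
    ]
```

Each command list can be handed to `subprocess.call()`; with RSA keys in place (step 1) the whole push runs unattended from cron.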

Re:CIO's want pre-built software (2, Insightful)

obstalesgone (1231810) | more than 5 years ago | (#26111505)

Better yet, tack on:

6. Give the script that handles this a name, build deployment tools, and release them under GPL.

No, you fool! (5, Funny)

bistromath007 (1253428) | more than 5 years ago | (#26111493)

Haven't you been reading the warnings around here about how bad it is for the Internet? If big business starts using BT we'll microwave the baby!

Re:No, you fool! (2, Interesting)

Mad-Bassist (944409) | more than 5 years ago | (#26111991)

Oooooh... I can see the whole issue of throttling suddenly becoming very amusing as the corporate behemoths start slugging it out.

WAFS from GlobalScape (1, Informative)

Anonymous Coward | more than 5 years ago | (#26111509)

We do something similar using WAFS by GlobalScape (previously Availl).

http://www.globalscape.com/wafs/

It provides bit-level updates to data either on a schedule or continuously, and can keep a specified file version archive too. The continuous update to HQ should keep DSL utilisation low.

Chained client/server (4, Insightful)

Manfre (631065) | more than 5 years ago | (#26111515)

Have you thought about building up a distribution tree for your sites?

Group all of your stores based upon geographic location. State, region, country, etc. Pick one or two stores in each group and they are the only ones that interact with the parent group.

E.g. Corporate will distribute the files to two locations in each country. Then two stores from each region will see that the country store has the files and download them. Repeat down the chain until all stores have the files.
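The tiering above is easy to express as data. A rough sketch, with invented store names: pick the first couple of stores in each region as relays that pull from corporate, and point everyone else at their regional relay:

```python
def build_tree(stores_by_region, relays_per_region=2):
    """Map each store to the host it should pull the DR files from."""
    plan = {}
    for region, stores in stores_by_region.items():
        relays = stores[:relays_per_region]
        for store in stores:
            if store in relays:
                plan[store] = "corporate"   # relays pull straight from HQ
            else:
                plan[store] = relays[0]     # everyone else pulls from a relay
    return plan

plan = build_tree({"east": ["e1", "e2", "e3"], "west": ["w1", "w2"]})
```

Corporate then only ever serves a handful of relays per cycle, and the store-side DSL uplinks do the rest of the fan-out.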

load balancing? (1)

abigsmurf (919188) | more than 5 years ago | (#26111527)

BitTorrent is incredibly wasteful for the initial seeding and is pretty intense on network equipment. You have to be careful configuring all of the network settings; the last thing you want is all of the stores either crashing their routers or maxing out their connections.

Why not spread out the backups instead? Limit their bandwidth to leave room for regular traffic, and have different stores send their backups on different days.

Re:load balancing? (1)

rusl (1255318) | more than 5 years ago | (#26111921)

It's not rocket science to configure the bandwidth limits in just about any BT client. If I can seed things fast enough on my tiny little home ADSL connection with only 30 kB/s up (and not overload things, because I set the bandwidth caps below my maximum), I don't see why a much faster, fancier network would fail.

Re:load balancing? (4, Interesting)

abigsmurf (919188) | more than 5 years ago | (#26112125)

It's not usually the bandwidth that kills networks for BT (although it can if you're not careful); it's the hundreds of temporary and half-open connections it makes. Lots of routers weren't designed for this and give up the ghost. You can configure things to lessen it (port forwarding, connection limits in the client), but some routers just can't take it. There are also plenty of routers that degrade over time under heavy BT usage and need occasional reboots.

SurgePlus Offsite File Synchronization. (-1, Offtopic)

Blowit (415131) | more than 5 years ago | (#26111541)

I think you should check out http://www.netwinsite.com for a product called SurgeMail. It has a built-in client called SurgePlus that synchronizes any folders you want. You could create a script on each machine to do daily archival changes and have SurgePlus automatically upload these files to SurgeMail.

I know this is a mail server; however, it is free to use for up to 5 accounts. You can log into the same account and have each site hold a backup of all the files in a distributed manner.

so the storage folder you want to backup to would be like this:
c:\Offsite\Store001 -- current store backup is stored in
c:\Offsite\StoreXXX -- Other stores will sync into these folders automatically (will automatically create it from the server)

On the server you would have the folder you want to store it in as:
private/Backup/Store001
private/Backup/Store002
private/Backup/Store003 ...

BEST OF ALL, THIS IS FREE for under 5 users. Since you only need to log in as the same user, it will be synchronized across all of your remote stores, giving you full offsite secure backup. No need to pay for offsite backup services.

This is also a full fledged mail/calendar server so if you want to use that portion too, it is the least expensive mail server to use internally.

Check it out and give it a try. My clients love it!

Re:SurgePlus Offsite File Synchronization. (1)

Blowit (415131) | more than 5 years ago | (#26111577)

Here are the direct links for the product:
http://www.netwinsite.com/surgemail/index.htm

http://www.netwinsite.com/surgeplus/index.htm

BITSAdmin (0)

Anonymous Coward | more than 5 years ago | (#26111545)

If you're using Windows XP or above, take a look at the built in tool "BitsAdmin."

WTF? (-1, Flamebait)

rerunn (181278) | more than 5 years ago | (#26111551)

Is BitTorrent an appropriate protocol for file distribution in the business sector?

WTF does the business sector have ANYTHING to do with what protocol you use?? The business wont give a crap if you used smoke signals, as long as they get their shit when they need it.

I could imagine a homebrew tracker, with uTorrent and an RSS feed at each site, but that sounds a little too patchwork to fly by the CIO

Go to the kitchen, get the dullest butter knife you can find, and then try your hardest to slit your left wrist. Grind as hard as you can. If you dont see bone, keep trying. Try harder. Did ya really need to submit an article to figure this out??

Re:WTF? (1)

the_B0fh (208483) | more than 5 years ago | (#26111815)

Please, think of the PFYs. His DR fileset is only 4Gigs. My pr0n is bigger than that. ASCII/text pr0n!

Others have already given him the best solution for his case - DVDs. Overnight them, and he is done. Latency may be a bit much, but not that much more than doing it over DSL or dialup.

Now, let's go back to discussing OT stuff.

Captain disillusion (4, Informative)

jonaskoelker (922170) | more than 5 years ago | (#26111559)

with IPsec over DSL, and no access to the public internet.

Unless you have very long wires, some box is going to route them. Are those your own?

Otherwise, your ISP's router, diligent in separating traffic though it may be, can get hacked.

Why am I saying this? Not to make you don your tinfoil hat, certainly, but just to point out that if the scenario is as I describe, you're not 100% GUARANTEED to be invulnerable. Maybe a few tinfoil strips in your hair would look nice... ;)

About the actual question: BitTorrent would probably be fine, but if most of the data is unchanged between updates, you may want to compute the diff and then BT-share that. How do you store the data? If it's just a big tar(.gz|.bz2) archive, bsdiff might be your friend.
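The "share only the diff" idea can be shown with a crude block-hash sketch: hash fixed-size blocks of the old and new archive and distribute only the blocks that changed. (bsdiff does far better than this on real binaries; this only illustrates the principle, and the tiny block size is purely for demonstration.)

```python
import hashlib

BLOCK = 4  # tiny block size for the demo; think 1 MiB in practice

def changed_blocks(old: bytes, new: bytes):
    """Return {offset: new_block} for every block that differs."""
    delta = {}
    for i in range(0, len(new), BLOCK):
        new_blk = new[i:i + BLOCK]
        old_blk = old[i:i + BLOCK]
        if hashlib.sha1(new_blk).digest() != hashlib.sha1(old_blk).digest():
            delta[i] = new_blk  # only these offsets need distributing
    return delta

def apply_delta(old: bytes, delta, new_len):
    """Rebuild the new archive from the old copy plus the delta."""
    buf = bytearray(old[:new_len].ljust(new_len, b"\0"))
    for offset, blk in delta.items():
        buf[offset:offset + len(blk)] = blk
    return bytes(buf)
```

Each store already holds last month's 4 GB, so the torrent only needs to carry the delta dict, which could be a fraction of that.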

If you push from a single seeder to many clients, maybe multicast would be a good solution. But that's in the early design phase I think, which is not what you need :)

Best of luck!

Re:Captain disillusion (0)

Anonymous Coward | more than 5 years ago | (#26111949)

Otherwise, your ISP's router, diligent in separating traffic though it may be, can get hacked.

It is clear you missed the bit about IPsec... or perhaps don't know what it does. The whole idea of VPNs, or encryption in general, is that they assume every point between you and your destination is hostile and protect your data anyway. So yeah, you're right, it's not 100% guaranteed to be invulnerable, but to break it you'd have to guess an unreasonably hard-to-guess number. More like 99.9999% guaranteed (assuming you do your initial key distribution properly and don't let people do dumb things on the machines at each end of the VPN tunnel).

Re:Captain disillusion (0)

Anonymous Coward | more than 5 years ago | (#26111953)

Yes the ISP's routers could get compromised, but that's the whole point of IPsec - so that someone in the middle can't read your traffic.

Re:Captain disillusion (0)

Anonymous Coward | more than 5 years ago | (#26112111)

Unless you have very long wires, some box is going to route them. Are those your own?

Otherwise, your ISP's router, diligent in separating traffic though it may be, can get hacked.

Unless the traffic is encrypted with IPsec before it gets to the ISP's router.

Many large ISPs offer a hosted VPN service, where the ISP manages everything for you. From the article summary it isn't clear if they use a hosted VPN service or they just purchase regular DSL and have a VPN router that is configured to null-route non-VPN traffic.

And it doesn't matter who owns the router, but who pwns it :)

Re:Captain disillusion (1)

Seth Kriticos (1227934) | more than 5 years ago | (#26112133)

As far as I know there is only one thing in life that is somewhat 100% GUARANTEED, and that is that it ends. Everything else is just a question of probability.

To go back to the topic: if you don't trust your ISP (a legitimate concern), then you should encrypt the data before sending.

If you use a strong encryption algorithm (like AES or Serpent), anyone listening will have a hard time reading the data. You could probably even use public BT in that case, with rsync'ed .torrent files.

To come back to my original point, the smartest thing to do is to lower the probability of data infiltration. So use an IPsec VPN to send out an encrypted form of the archives via rtorrent.

I know, eats resources, but it would be fun, no?

How I would do it... (5, Interesting)

LuckyStarr (12445) | more than 5 years ago | (#26111605)

...is quite straight forward in fact.

  1. Create a "Master" GnuPG/PGP Key for yourself. This key is used to sign all your data as well as your RSS feed (see below).
  2. Set up an RSS feed to announce your new files. Sign every entry in it using your "Master-Key".
    • All the stores check the validity of your RSS feed via your public key.
    • All the stores have one (or the same) GnuPG/PGP key to decrypt your files. The beauty of GnuPG/PGP is that given many destinations you can encrypt your data so that every recipient (each with their own key) can decrypt them. Nice, eh?
  3. Set up a standard BitTorrent server to distribute your files.
  4. Announce all your new files via your RSS feed.

This has many advantages:

The beauty of this system is that it relies heavily on existing technology (BitTorrent, RSS, GnuPG, etc), so you can just throw together a bunch of libraries in your favourite programming language (I would use Python for myself), and you are done. Saves you time, money and a lot of work!

Furthermore you do not need to have a VPN set up to every destination as your files are already encrypted and properly signed.

Another advantage is: As this is a custom-built system for your use-case it should be easy to integrate it into your already existing one.
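A minimal sketch of the signed-feed part (steps 1-2 and 4 above). A real deployment would sign with GnuPG; HMAC over a shared secret stands in here only so the sketch is self-contained, and the feed fields and secret are invented:

```python
import hashlib
import hmac
import json

SECRET = b"replace-with-real-key-material"  # stand-in for the GnuPG key

def make_entry(title, torrent_url, payload_digest):
    """Build one signed announcement entry for the feed."""
    entry = {"title": title, "enclosure": torrent_url,
             "digest": payload_digest}
    body = json.dumps(entry, sort_keys=True).encode()
    entry["sig"] = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return entry

def verify_entry(entry):
    """Each store runs this before touching the announced torrent."""
    entry = dict(entry)
    sig = entry.pop("sig")
    body = json.dumps(entry, sort_keys=True).encode()
    expected = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)
```

A store's fetcher polls the feed, verifies each entry, and only then hands the enclosure to its BT client; a tampered URL or digest fails verification.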

How is the VPN setup (5, Informative)

eagle486 (553102) | more than 5 years ago | (#26111637)

If the VPN is set up in a standard hub-and-spoke configuration, then BitTorrent would not help, since all traffic between sites has to go via the central site.

Your best bet is multicast, there are programs for software distribution that use multicast.

Re:How is the VPN setup (0)

Anonymous Coward | more than 5 years ago | (#26111857)

If the VPN is setup in a standard hub and spoke configuration then bittorrent would not help since all traffic between sites has to go via the central site.

Did you read the article summary? (emphasis added)

"private network of retail stores connected to our corporate office (and to each other) with IPsec over DSL"

it's called dsync (5, Interesting)

slashdotmsiriv (922939) | more than 5 years ago | (#26111639)

and you can find documentation for it here:
http://www.cs.cmu.edu/~dga/papers/dsync-usenix2008-abstract.html [cmu.edu]

It's rsync on steroids: it uses a BitTorrent-like P2P protocol that is even more efficient because it exploits file similarity.

You may have to contact the author of the paper to get the latest version of dsync, but I am sure they would be more than happy to help you with that.

Re:it's called dsync (4, Informative)

slashdotmsiriv (922939) | more than 5 years ago | (#26111657)

I hate to reply to my posts, but this link has an even shorter description of the tool:

conferences.sigcomm.org/sigcomm/2008/papers/p505-puchaA.pdf

Call me old fashioned (1)

Hognoxious (631665) | more than 5 years ago | (#26111649)

I'd get a station wagon and fill it with tapes. Go on, mod me "-1 old fashioned"

Cleversafe? (1, Informative)

Anonymous Coward | more than 5 years ago | (#26111655)

You should take a look at cleversafe.org - it's an opensource 'dispersed storage' infrastructure which allows you to slice up files and distribute them across a network of storage servers. Not sure if this would get you what you want, but it's worth looking into.

Foldershare? (1, Informative)

MunkieLife (898054) | more than 5 years ago | (#26111699)

I like the bittorrent idea more... but if you're looking for something simple and free - Foldershare. Not sure if this works for you, but I use Foldershare to sync files between several of my offices. It is peer to peer, with a central server to initiate the connection. If you have a 4GB file, perhaps you could rar it into smaller pieces, then this could work for you. If you don't have an internet connection though, this totally won't work for you. Heh.

Rsync or DFS sound like good choices (0)

Anonymous Coward | more than 5 years ago | (#26111715)

You don't say if the files are changed at the remote sites, or just at head office.

Rsync is an option - have 10 remote sites replicate from the master, then have other stores replicate from the submasters.

You don't say if you're running windows, but the distributed file system [wikipedia.org] works pretty well. Supports remote differential compression.

uhhhhh multicast? (1)

branto (811992) | more than 5 years ago | (#26111745)

Sounds like a problem that multicast-based file transfer is designed to solve. http://www.tcnj.edu/~bush/uftp.html [tcnj.edu] You said IPsec VPNs, but is it just IPsec, or is it GRE inside IPsec? If there's no GRE, then forget what I said.
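The core of what a multicast pusher like UFTP does can be sketched in a few lines: split the file into sequence-numbered datagrams and send them once to a group every store has joined. Group address, port, and chunk size here are arbitrary, and real tools add retransmission of missing sequence numbers on top:

```python
import socket
import struct

GROUP, PORT, CHUNK = "239.1.2.3", 5007, 1400  # CHUNK fits one UDP packet

def make_datagrams(data: bytes):
    # Prefix each chunk with its sequence number so receivers can detect
    # (and later re-request) missing pieces.
    return [struct.pack("!I", seq) + data[i:i + CHUNK]
            for seq, i in enumerate(range(0, len(data), CHUNK))]

def send_all(data: bytes):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # TTL > 1 so the packets can cross the routed hops inside the VPN
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 4)
    for dgram in make_datagrams(data):
        sock.sendto(dgram, (GROUP, PORT))
```

The catch the parent hints at: plain IPsec tunnels won't carry multicast, which is why GRE-inside-IPsec matters here.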

Use existing technology (5, Funny)

Mostly a lurker (634878) | more than 5 years ago | (#26111747)

CIOs are notoriously conservative. Any solution you suggest that involves building a solution from scratch will scare them. The solution is to use existing proven technology. In the MS Windows world, at least, root kits have been distributing updates successfully for years. You should be looking at simply modifying an existing root kit to your requirements.

IPSec over DSL (1)

kabloom (755503) | more than 5 years ago | (#26111755)

Are you using IPSec in Tunnel mode or Transport mode? If you're using it in tunnel mode, then you're not going to fix your bandwidth problem, because all data has to go through corporate HQ anyway because that's where the tunnels end.

Downtime? (0)

Anonymous Coward | more than 5 years ago | (#26111789)

Um, OK: the data center goes down, and you have the data everywhere except the main site, which now has to download it again at a capped rate. How do you get it back to the hosting site rapidly enough to be useful?

An encrypted USB memory key and a stamp go a long way.

Seriously, it's a good idea and nice in practice, but have you ever tried sitting on your hands while a boss with a whip watches you download the company files at 150 kB/s? If this is to back up your branch office sites and restore them remotely, that's fine; I just wouldn't want a 3-hour downtime on my record while you retransmit data.
I actually have backups go to other sites across the nation now, in case of hurricane damage, or in case the world ends and a future civilization's life hangs in the balance of our spreadsheet data.

It is more my last line of defense.

Depending on VPN topology (1)

Razron (12415) | more than 5 years ago | (#26111823)

Most VPN setups like this are hub and spoke, with the central office being the hub. So connections from one remote site to another still have to pass through the central office, and you still have a bandwidth problem there. If your VPN is set up as a mesh, with connections between multiple sites, you might be able to get this to work. The problem you run into then is that most inexpensive VPN solutions can only handle so many VPN tunnels before they run out of CPU. Not knowing what you used as a VPN concentrator for your remote offices, this may or may not be a problem.

Hadoop DFS sounds more appropriate (1)

kevinodotnet (122092) | more than 5 years ago | (#26111837)

Why not use the Hadoop distributed file system [apache.org] ? It offers automatic replication and you can treat each "store" as a "rack" to guarantee multiple remote backups.

You also get the immediate advantage of having a single file namespace and instant streaming access to all of the files from any single location.

The only advantage to BitTorrent that I can see is faster recovery time, since a single store can source the backup from N other stores (instead of 2, or whatever number of replications you have decided on).

Sub.TV already does this (0)

Anonymous Coward | more than 5 years ago | (#26111873)

Sub.tv uses BitTorrent to distribute large video files to plasma screens in student unions - they auto-download. IIRC it's an older Azureus client, presumably extended with a plug-in, that ran on an always-on Windows box.
It seems an entirely appropriate mechanism for it, and they're already doing what you seem to want!

rsync (1)

ratsbane (1363433) | more than 5 years ago | (#26111881)

I've set up something similar to this. You almost certainly don't need to transfer ALL of the 4GB every month - you just need to update a copy in the corporate office with all of the changes from the locations. Rsync is the answer. It figures out what's changed and only transfers the changed stuff, which is typically a trivial amount. Rsync is a brilliant piece of work; it's made for exactly the sort of thing you're trying to do. It will work so well you'll think there's some kind of quantum voodoo going on. Also, check out rdiff-backup. There's a version for Windows, and you can rsync easily between Windows and *nix. If security is an issue (and it sounds as if it isn't) you can rsync over SSH, too.
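A hedged sketch of the rsync suggestion, as each store might run it from cron. The source and destination paths are placeholders; `--partial` lets a dropped DSL link resume mid-file, and SSH rides inside the existing IPsec VPN:

```python
import subprocess

def rsync_cmd(src="corporate:/dr/archive/", dst="/dr/archive/"):
    """Build the rsync invocation each store runs against HQ."""
    return ["rsync", "-az", "--partial", "--delete",
            "-e", "ssh",   # tunnel auth/transport over SSH inside the VPN
            src, dst]

def pull_update():
    # Run on each store, e.g. nightly from cron, staggered across sites.
    return subprocess.run(rsync_cmd(), check=True)
```

Combine this with the tiered-distribution idea elsewhere in the thread (stores pulling from regional submasters rather than HQ) and the head-office link barely notices the monthly update.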

Use Existing Tape Backup Software Features (0)

Anonymous Coward | more than 5 years ago | (#26111899)

I have used commercial packages like the Enterprise Backup Solution we already use to backup data to tape to mirror files. Even across a SLOW AS CHRISTMAS T1 connection it works VERY well to only copy the files that change on a daily basis. So, unless you are modifying GIGS of data at-a-time, keep it simple.

Lotus Domino! (0)

Anonymous Coward | more than 5 years ago | (#26111903)

Lotus Domino! Its replication keeps databases/websites, and the documents/files contained within them, in sync across multiple servers. You can control how the data is distributed across the network with connection documents.

Integrate a datacenter server (1)

Korkman (704276) | more than 5 years ago | (#26111915)

Set up a cheap file server in a datacenter, hook it into your VPN, and store all backups there. Use rsync - very fast, and it uses SSH nowadays for auth and encryption. Encrypt the whole backup partition (dm-crypt, TrueCrypt, etc.) and keep the key private: manual mount and key entry after rebooting, so datacenter operators can't (easily) gain access to the files. Or transfer already-encrypted files, though that will destroy rsync's delta performance.

Set SSH and all the other services to listen on the VPN IP only, making the machine invisible to the common internet.

Not as fancy as Peer-To-Peer distribution, but very reliable and fast. Also you get less administrative headaches, I think.

Windows DFS -- Dont use FRS (5, Informative)

anexkahn (935249) | more than 5 years ago | (#26111943)

In Windows Server 2003 R2 / Windows Server 2008 they really improved DFS. It lets you set up throttling in 15-minute increments, and with full-mesh replication it decentralizes your replication... kind of like BitTorrent. However, make sure you don't accidentally use FRS, because it sucks.

Where I work we have 5 branches that pull data from our data center. I have DFS replication set up so I can have all our software distribution at the local site. I need to keep the install points at all the sites the same, so I use DFS to replicate all the data; then to get to it I type \\mydomain.com\DFSSharename. Active Directory determines what site I am in and points me to the local share. If the local share is not available, it points me to the remote share, or to a secondary share in the same site... so it also gives you failover for your file servers.

If you don't have any Windows boxes this won't work, and it really locks you into Microsoft, but it won't cost you anything more than what you have already paid. Here is a link to Microsoft's page with more information, including how to set it up: http://www.microsoft.com/windowsserver2003/technologies/storage/dfs/default.mspx [microsoft.com]

Re:Windows DFS -- Dont use FRS (0)

Anonymous Coward | more than 5 years ago | (#26112005)

mod parent up, this is how to do it. As much as I hate windows, I hate to say that R2 replication works and it works well.

Rely on someone else's bandwidth (0)

Anonymous Coward | more than 5 years ago | (#26111979)

Use an existing service to provide it: http://bitsrepublic.com/ [bitsrepublic.com]

NFS with DFS (1)

flyingfsck (986395) | more than 5 years ago | (#26112127)

You could set up an NFS-based distributed file system. That may be more amenable to your boss and will have other advantages too.

Try looking into CleverSafe (0)

Anonymous Coward | more than 5 years ago | (#26112131)

www.cleversafe.com

Do not use BitTorrent (0)

Anonymous Coward | more than 5 years ago | (#26112153)

Take a look at your company's network topology. If it is a typical branch setup, like hub-and-spoke where your branches are all connected through the central head office, then BitTorrent will waste bandwidth. Why have a peer-to-peer application like BitTorrent routing traffic from a branch, up to the head office, and back down to another branch? You do not want to impact other applications running across the WAN at a remote branch.

Unless your WAN topology is fully meshed, peer-to-peer apps are probably not that efficient; it's better to use a direct-push strategy. Take a look at Microsoft DFS (Distributed File System) - you can control replication links and times - or use a protocol like FTP and put QoS restrictions on it. Schedule pushes for off-peak hours where possible, and stagger updates to each branch if necessary. My company uses an IBM product called Tivoli to push updates to branches, because it has bandwidth-control capabilities. There are other apps like this out there (probably cheaper as well).

BitTorrent is better suited to Internet downloads, and since bandwidth is controlled autonomously in each client, what's to prevent clients at different sites from hogging all the bandwidth in any given branch?
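The staggering idea above is trivial to script: spread branches across off-peak slots so the head-office link never serves everyone at once. Branch names and the maintenance window here are illustrative:

```python
OFF_PEAK_HOURS = [1, 2, 3, 4]  # e.g. 01:00-04:59 local time

def schedule(branches):
    """Round-robin branches into off-peak hourly update slots."""
    slots = {h: [] for h in OFF_PEAK_HOURS}
    for i, branch in enumerate(branches):
        slots[OFF_PEAK_HOURS[i % len(OFF_PEAK_HOURS)]].append(branch)
    return slots

# Eight hypothetical stores spread over four hourly slots
plan = schedule([f"store{n:03d}" for n in range(1, 9)])
```

Each branch's cron job then fires at its assigned hour, so at most a quarter of the stores hit the central link in any given slot.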

More commercial solutions... (1)

Strawberry (97160) | more than 5 years ago | (#26112159)

Both Kontiki and Ignite sell enterprise-type (supported, maintained etc.) P2P systems that can be deployed internally if you need something off-the-shelf.
