
Transforming the Web Into a Transparent 'HTTPA' Database

timothy posted about 3 months ago | from the security-still-needed-note dept.

Security 69

An anonymous reader writes: MIT researchers believe the solution to misuse and leakage of private data is more transparency and auditability, not adding new layers of security. Traditional approaches make it hard, if not impossible, to share data for useful purposes, such as in healthcare. Enter HTTPA: HTTP with accountability.
From the article: "With HTTPA, each item of private data would be assigned its own uniform resource identifier (URI), a component of the Semantic Web that, researchers say, would convert the Web from a collection of searchable text files into a giant database. Every time the server transmitted a piece of sensitive data, it would also send a description of the restrictions on the data’s use. And it would also log the transaction, using the URI, in a network of encrypted servers."
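
As a rough sketch of what that could look like in practice, here is a toy server handler that returns a sensitive item together with machine-readable usage terms and records the transfer in an audit log. The `Usage-Restrictions` header, the log format, and all names below are invented for illustration; the actual HTTPA design (URIs on the Semantic Web, a network of encrypted log servers) differs in detail.

```python
import hashlib
import json
from datetime import datetime, timezone

# Stand-in for the paper's network of encrypted audit-log servers.
AUDIT_LOG = []

def serve_sensitive(uri, payload, restrictions, requester):
    """Return the data plus its usage terms, and record the transfer by URI."""
    AUDIT_LOG.append({
        "uri": uri,
        "requester": requester,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "payload_sha256": hashlib.sha256(payload.encode()).hexdigest(),
    })
    return {
        "body": payload,
        "headers": {
            "Content-Type": "application/json",
            # Hypothetical header: a description of restrictions on the data's use,
            # sent along with every transmission of the sensitive item.
            "Usage-Restrictions": json.dumps(restrictions),
        },
    }

response = serve_sensitive(
    uri="https://hospital.example/patient/123/labs",
    payload='{"ldl": 160}',
    restrictions={"purpose": ["treatment"], "retention_days": 30},
    requester="dr.alice@hospital.example",
)
```

Note that nothing here *enforces* the restrictions; as several commenters point out below, the design is advisory plus auditable, not preventive.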


Encrypt the encrypted data and then give everyone (2)

BitZtream (692029) | about 3 months ago | (#47238351)

the key.

All of these sorts of silly ideas depend on no exploits and everyone being a 'good guy'.

If those two things were the case, there would be little to no reason to implement something in the first place.

Re:Encrypt the encrypted data and then give everyone (1)

QRDeNameland (873957) | about 3 months ago | (#47238471)

Yep, sounds like just another variation on the evil bit [ietf.org] .

Re:Encrypt the encrypted data and then give everyone (4, Informative)

fuzzyfuzzyfungus (1223518) | about 3 months ago | (#47238643)

Even older than that. The 'send the data, along with a description of restrictions on its use' approach is (along with a dash of 'semantic web' nonsense, which appears to be a new addition) classic "Trusted Computing".

Mark Stefik, and his group at Xerox PARC, were talking about 'Digital Property Rights Language' back in 1994 or so, and by 1998, if not earlier, it had metastasized into a giant chunk of XML [coverpages.org] . That later mutated into "XrML", the 'Extensible Rights Management Language', which eventually burrowed into MPEG-21/ISO/IEC 21000 as the standard's 'rights expression language'.

Some terrible plans just never entirely die.

More like labeling a paper copy "confidential" (1)

raymorris (2726007) | about 3 months ago | (#47239075)

From the beginning of the article:

> HTTPA, designed to fight the "inadvertent misuse" of data by people authorized to access it.

It sounds to me like it's more similar to labeling a paper file "confidential, for xyz use only". By attaching the confidentiality information directly to the data, you seek to keep someone from absent-mindedly emailing the information to a vendor without thinking about the fact that the information is supposed to be kept confidential.

missed the point. "inadvertent misuse". reminder (3, Informative)

raymorris (2726007) | about 3 months ago | (#47239065)

I think you've missed the point. Quoting the beginning of the article:

> HTTPA, designed to fight the "inadvertent misuse" of data by people authorized to access it.

I've had this conversation more than once:

Bob - Why did you tell people about ___. That was supposed to be a secret.
Sally - Oh, I'm sorry, I didn't realize that was supposed to be kept confidential.

Also this thought "oops, what I just said was supposed to be kept confidential. I messed up."

Those are the situations the protocol is supposed to address: the INADVERTENT release of confidential data. It's the digital equivalent of stamping a paper "confidential, for abc use only". Any time the system accesses the data, it is also reminded of the confidentiality rules attached to that data. This is so they can, through processes and software, avoid mistakes. For example, a client could be set up so that an attempt to copy confidential data to the clipboard instead copies the reminder "this is confidential information", so someone copying it into an email without thinking gets reminded.
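
The clipboard example might look something like this minimal sketch, where `Tagged` and `copy_to_clipboard` are invented stand-ins for a real client's data model and clipboard hook:

```python
from dataclasses import dataclass

@dataclass
class Tagged:
    """A value carrying the confidentiality tag attached to the data."""
    value: str
    confidential: bool = False

def copy_to_clipboard(item: Tagged) -> str:
    """Return what actually lands on the (simulated) clipboard.

    Confidential items are replaced by a reminder, so pasting them
    into an email surfaces the restriction instead of the data.
    """
    if item.confidential:
        return "[REMINDER: this is confidential information]"
    return item.value

# Non-confidential data passes through unchanged; tagged data does not.
copy_to_clipboard(Tagged("public memo"))
copy_to_clipboard(Tagged("patient record #123", confidential=True))
```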

Re:missed the point. "inadvertent misuse". reminde (0)

Anonymous Coward | about 3 months ago | (#47241539)

I think you've missed the point. Quoting the beginning of the article:

> HTTPA, designed to fight the "inadvertent misuse" of data by people authorized to access it.

I've had this conversation more than once:

Bob - Why did you tell people about ___. That was supposed to be a secret.
Sally - Oh, I'm sorry, I didn't realize that was supposed to be kept confidential.

Also this thought "oops, what I just said was supposed to be kept confidential. I messed up."

Those are the situations the protocol is supposed to address: the INADVERTENT release of confidential data. It's the digital equivalent of stamping a paper "confidential, for abc use only". Any time the system accesses the data, it is also reminded of the confidentiality rules attached to that data. This is so they can, through processes and software, avoid mistakes. For example, a client could be set up so that an attempt to copy confidential data to the clipboard instead copies the reminder "this is confidential information", so someone copying it into an email without thinking gets reminded.

Why is it being proposed as a general protocol then?

I think you may see what is intended for you to see, but nothing else.

Re:missed the point. "inadvertent misuse". reminde (0)

Anonymous Coward | about 3 months ago | (#47242709)

you can't fix stupid with technology

fold-down shelves, ATMs suggest otherwise. (1)

raymorris (2726007) | about 3 months ago | (#47242831)

In shopping malls, guests frequently leave bags in the restroom inadvertently. In some malls, the stalls have a little spring-loaded shelf that folds down to set your bag on. The thing is, the shelf blocks the door. You CAN'T leave the stall without picking up your bag to raise the shelf out of the way. Nobody has ever accidentally left a bag on one of those shelves.

Another common "oops" used to be leaving one's ATM card in the ATM. You'd insert your card, get the money you came for, then leave without the card. Some machines were reprogrammed so that they wouldn't give you your money until you took the card out. The money is the whole reason you're standing there, so people rarely left without their money. By requiring them to take the card back to get the money, technology fixed stupid. Later ATMs have you SWIPE the card, so it never leaves your fingers. You can't leave the card in the machine if you never put it in the machine. Another, better, technological solution for an act of stupid.

Other examples include cases where people who weren't paying attention would press the wrong button. Making the wrong button look and feel completely different from the right button largely fixes that. Remember when forms on web pages used to have a "reset" button, which cleared everything you'd just entered in the form? People frequently made the stupid mistake of clicking "reset" rather than "submit". Removing the "reset" button fixed that stupid mistake.

See "The Design of Everyday Things" for more technologies that prevent or reduce stupid mistakes, as well as a good analysis of the psychology of stupid. It covers what types of stupid mistakes there are (such as doing the right thing with the wrong object, i.e. tossing "it" in the trash, then realizing your left hand is still holding the empty can while your right hand is no longer holding your keys).

Might have a place (1)

dcooper_db9 (1044858) | about 3 months ago | (#47243141)

Years ago I was working as a subcontractor to a major defense contractor. I had a conversation with IT that went something like this:

IT to all personnel: Anyone with a computer must review each file on their drive and label any that might contain confidential information. Please insert our company logo and the following text into any confidential files.
Me to IT: To clarify, I have approximately X files on my hard drive. Do I really need to review ALL of my files?
IT to me: Yes
Me to IT: Do you have any tools I can use to automate this?
IT to me: No. You need to open each file, review it and determine if it contains confidential information. Then insert the logo and message into any files that do.
Me to IT: I just want to make sure I'm understanding your instruction. The vast majority of my files are operating system files. Some files, like the Outlook PST file might contain confidential information. They're not documents, spreadsheets or anything like that. Modifying those files might affect the performance of my computer. Also, I have several Microsoft Access databases containing thousands of records of sensitive information. I can insert the confidentiality message into the database but it might be more useful to add the message to the reports.
IT to me: No, you must insert the confidentiality message into any files containing confidential information.
Forward to my supervisor: Can you take a look at this? This is going to take a lot of work.
Supervisor to me: I looked into it. You're going to have to do this.
Me to IT: Which department do we bill this to?
IT to me: Your department.
Me to IT: Procurement?
IT to me: Yes.
Forward to procurement: I ran the numbers. It's going to take me a year of working full time to get this done. Can you authorize this?
IT to me: You don't need to review your files.
Me to IT: Okay, thanks.

Privacy Ultimately Loses (1)

TrollstonButterbeans (2914995) | about 3 months ago | (#47239491)

Privacy, sadly, is a losing proposition.

1) Google and advertisers track you + accumulate data.
2) The government does the same
3) Credit reporting agencies and banks selling your debt/credit card transaction data.
4) Employers
5) Insurance companies + on and on

Facebook and Google and LinkedIn are just 3 companies built on invading your privacy and there are tons more.

Short version: You are losing your privacy. "Not liking it", "Angry posts" and the like won't change this.

On the plus side: They really aren't interested in "you". Not in the slightest, they just want to monetize "you".

If you don't like it, create fake awesome information about yourself and spread it all over the internet! Control your brand with utter bullshit!!

Re:Privacy Ultimately Loses (1)

Jane Q. Public (1010737) | about 3 months ago | (#47239865)

Short version: You are losing your privacy. "Not liking it", "Angry posts" and the like won't change this.

So, what do you think WILL change it? Moving along with the other sheep to the other side of the field, hoping the wolf won't get you next?

Angry posts are more likely to change it than what you seem to be advocating. Enjoy your cage.

Re:Privacy Ultimately Loses (1)

TrollstonButterbeans (2914995) | about 3 months ago | (#47241065)

>So, what do you think WILL change it?

Government action and privacy laws are the only solution, which I can't see the government being interested in because they are one of the main perpetrators (NSA spying, etc.).

"Angry posts" didn't stop email spam or telemarketing abuse -- someone complaining on the internet is of no concern to a company that is trying to generate revenue. Both of those were dealt severe blows by laws.

Re:Encrypt the encrypted data and then give everyone (1)

Jane Q. Public (1010737) | about 3 months ago | (#47239803)

In essence, this is the same silly-assed idea that was mentioned here the other day, which some famous-for-something-else computer scientist has been working on for all of 50 years, or some damned thing.

They want to take the Web and make it into some kind of holy f*king persistent interconnected-data mess, which would be broken all the time, because data that is supposed to be persistent seldom is after a few years.

I do not want the Internet to be an "interconnected database". I think if we tried to do it today it would fail miserably, AND that it would be an extremely, horribly bad idea all around.

That's literally the worst idea I ever heard (4, Insightful)

NoNonAlphaCharsHere (2201864) | about 3 months ago | (#47238381)

So we have a stateless database with built-in DRM on every record and user tracking. Brilliant.

Re:That's literally the worst idea I ever heard (-1)

Anonymous Coward | about 3 months ago | (#47238395)

So we have a stateless database with built-in DRM on every record and user tracking. Brilliant.

Why does that worry you? What do you have to hide?

Re:That's literally the worst idea I ever heard (1)

HalAtWork (926717) | about 3 months ago | (#47238981)

Are there still people asking this question? Seriously? Please don't be so shallow. [slashdot.org]

Re:That's literally the worst idea I ever heard (1)

Opportunist (166417) | about 3 months ago | (#47239151)

Only my anger over people asking me this.

Re:That's literally the worst idea I ever heard (1)

kwark (512736) | about 3 months ago | (#47239799)

I have nothing to hide, but nobody needs to know that.

Re:That's literally the worst idea I ever heard (1)

ganjadude (952775) | about 3 months ago | (#47239987)

i want to hide the fact that I have nothing to hide

Re:That's literally the worst idea I ever heard (4, Informative)

NotInHere (3654617) | about 3 months ago | (#47238473)

The original paper [mit.edu] has examples where such a DRM-based system has some legitimate uses. One was for patient data. If you want to eliminate special client software there, you can have this system and run everything in the browser. The system abstracts and standardizes the access control, which is hopefully already present, and helps to close holes in the implementation. For intranets the model makes perfect sense; deployment onto the wild wide web, however, is of course extremely harmful.

Media people didn't alter the story, as the paper already contained discussion about www deployment; they just picked out the bullshit non-intranet-web part.

Re:That's literally the worst idea I ever heard (1)

NoNonAlphaCharsHere (2201864) | about 3 months ago | (#47238515)

Or you could just airgap your intranet. Even moving sensitive pages to a different port and firewalling outside access on that port should do it.

Re:That's literally the worst idea I ever heard (1)

postbigbang (761081) | about 3 months ago | (#47238779)

But no one ever really does that. Although you can state-freeze an OS, none of the OS makers have useful constructions that allow vetted air-gap updates via media transfer.

The entire scheme looks like a paradise for someone that wants to crack it like an egg. This, too, shall pass.

Re:That's literally the worst idea I ever heard (0)

Anonymous Coward | about 3 months ago | (#47238853)

What the heck do you call setting up your own apt repo to serve the airgapped network? You download on one, burn to write-once media, verify on a dedicated (also isolated) machine, and mount the now read-only media (CD/DVD) on the server acting as repo for the airgapped network. Do I really need to link you to the Debian handbook?

Re:That's literally the worst idea I ever heard (1)

postbigbang (761081) | about 3 months ago | (#47238871)

You're missing a bunch of steps.

You need to diff it all, make sure it MD5s (or better). Other dependencies have to be checked. While many of the Deb repos are fine, there's then the rest of the stuff you're using-- whose dependencies might not be in a cute and highly watched (if we're lucky) spot.

So you can apply this technique with other OS families and come up with similar questions, and no good airgap answers. You update only a core set of stuff, yes, the OS, but only after a lot of steps. And we hope you don't use a flash drive or other media that doesn't have/get an infected bootsector. Rootkits are ugly.
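
The verification step of that workflow can be sketched as a checksum pass over the burned media, assuming a simple name-to-digest manifest produced on the connected side (SHA-256 rather than MD5, per the "or better"; the manifest format and file layout are assumptions for illustration):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file incrementally so large packages don't load into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_media(root: Path, manifest: dict) -> list:
    """Return the names of manifest entries that are missing or fail the checksum.

    `root` is the mount point of the (now read-only) media; `manifest`
    maps relative file names to the SHA-256 digests recorded when the
    packages were downloaded on the internet-connected machine.
    """
    bad = []
    for name, expected in manifest.items():
        p = root / name
        if not p.is_file() or sha256_of(p) != expected:
            bad.append(name)
    return bad
```

An empty return value means every file on the media matches what was downloaded; anything else means the media should not be mounted as a repo for the air-gapped network.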

Re:That's literally the worst idea I ever heard (0)

Anonymous Coward | about 3 months ago | (#47239079)

I spend part of my day managing systems on an air-gapped network, and it's not too bad to keep RHEL with a local satellite server up-to-date. But I agree that it definitely is a pain, and there are always corner cases that are a lot easier to do on the internet-connected systems.

Re:That's literally the worst idea I ever heard (1)

Anonymous Coward | about 3 months ago | (#47238903)

What? No one does that? Maybe not in the US, but over here in Europe we certainly do have air-gapped networks where security is necessary (military). We were able to push Windows security updates just fine.

Re:That's literally the worst idea I ever heard (1)

fuzzyfuzzyfungus (1223518) | about 3 months ago | (#47238667)

"If you want to eliminate special client software there, you can have this system, and run everything on the browser."

Being able to reuse more browser code might well improve the existing proprietary client; but (as with any DRM system) you can't eliminate special client software, because you would otherwise be incapable of distinguishing between a web browser that happily accepts your 'HTTPA' and then ignores your restrictions, and one that accepts and obeys.

As ever, you'll either need some suitably dense obfuscated blob, or to go all-out-TPM and do secure remote attestation and 'trusted' everything from preboot on up.

Re:That's literally the worst idea I ever heard (1)

NotInHere (3654617) | about 3 months ago | (#47241079)

Being able to reuse more browser code might well improve the existing proprietary client; but (as with any DRM system) you can't eliminate special client software

I meant that the developers don't have to reinvent the wheel. Of course the software will be closed source, but the developers of the medical application won't need to invent their own access control; they can use this component. It's a good idea when it makes hospitals more secure.

Re:That's literally the worst idea I ever heard (0)

Anonymous Coward | about 3 months ago | (#47239807)

" If you want to eliminate special client software there, you can have this system, and run everything on the browser."

Nope, that still requires a special client, but now in your browser.

Re:That's literally the worst idea I ever heard (0)

Anonymous Coward | about 3 months ago | (#47238769)

Re:That's literally the worst idea I ever heard (1)

Opportunist (166417) | about 3 months ago | (#47239157)

I find it somewhat hilarious that the Wikipedia article about the Evil Bit has "do not track" in its "see also" section.

'send a description of the restrictions' (2)

Anonymous Coward | about 3 months ago | (#47238403)

Privacy's on the honor system now!

Re:'send a description of the restrictions' (1)

jhantin (252660) | about 3 months ago | (#47238437)

And this is any different from status quo how?

Re:'send a description of the restrictions' (3, Insightful)

SecurityGuy (217807) | about 3 months ago | (#47238513)

We aren't taking the status quo, building a whole new layer on it, and pretending it'll work. That's how it's different.

Re:'send a description of the restrictions' (1)

Opportunist (166417) | about 3 months ago | (#47239159)

Umm... overhead?

More trusted third party foolishness (4, Informative)

jhantin (252660) | about 3 months ago | (#47238427)

All I see here is a bunch of stuff that all depends on trusted third parties... and in security circles, "trusted" means "can screw you over if they act against your interests". In this case it relies on trusted identity providers, labeled 'Verification Agent' in the paper.

It all breaks down if a verification agent is compromised, and the breach of even a single identity can have severe consequences that the accountability system cannot trace once information is in the hands of bad actors.

The authors effectively admit that this entire mechanism relies on the honor system; it explicitly cannot strictly enforce any access control, because, in the context of medical data, access control may stand between life and death.

Finally, the deliberate gathering of all this information-flow metadata would add another layer to the panopticon the net is turning into.

Re:More trusted third party foolishness (1)

fuzzyfuzzyfungus (1223518) | about 3 months ago | (#47238675)

Silly consumer, the risk of 3rd parties undermining the system is why we need to make Trusted Computing mandatory! Secure remote attestation for all, only rights-management-compliant systems allowed on the network!

Re:More trusted third party foolishness (1)

jhantin (252660) | about 3 months ago | (#47270303)

This is oddly close to what I think DRM ought to be: advisory, not enforcing. Remove the accountability aspect, not least because it's a farce that leaves the most recent honest party holding the bag, and you have my concept of an ideal DRM engine: provenance meta-tags that let you know what color your bits are [sooke.bc.ca] , which you can use if it affects you or ignore if it doesn't, leaving no rights-holder the wiser no matter what course you take.

Accountability-oriented DRM, which prevents no action but forces your use of certain combinations of certain colors of bits into the public record, would be prone to false positives. Pulled in some GPL code to a local build of 7-Zip? Chances are the other code doesn't have a GPL exception to allow linking against the non-Open unrar, so the resulting combined work may well not be conveyable (in the GPLv3 sense) at all; but creating or using it won't infringe anyone's rights, and shouldn't require you to post public notice that you have created such a combined work if you have no intent to convey it.
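
A minimal sketch of such advisory provenance tags: combining works merges their tags and may surface a warning, but nothing is ever blocked and no one else is notified. The tag names, the `combine` helper, and the one-entry compatibility table are all invented for illustration; this is a toy model of "what color are your bits", not legal advice.

```python
# Pairs of provenance tags whose combination may not be conveyable.
# A real table would be much larger; this toy one models the 7-Zip/unrar case.
INCOMPATIBLE = {frozenset({"GPLv3", "proprietary-unrar"})}

def combine(*tag_sets):
    """Merge the provenance tags of several works.

    Returns the merged tag set plus advisory warnings. The combination
    always succeeds: the user is informed, never prevented.
    """
    merged = frozenset().union(*tag_sets)
    warnings = [sorted(pair) for pair in INCOMPATIBLE if pair <= merged]
    return merged, warnings

# Local combined build: allowed, but the user learns it may not be conveyable.
tags, warnings = combine({"GPLv3"}, {"proprietary-unrar"})
```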

Re:More trusted third party foolishness (1)

technology_dude (1676610) | about 3 months ago | (#47239863)

This may be out there in left field, but what if: it is, I think, a fact of life that everybody is going to have an internet-connected device of some sort, and bandwidth will continue to be mostly sufficient. What if a protocol could be developed so that the contents of packets were encrypted by default, and the location of the encryption key, or some other permission approver, was part of the packet? If the sender had to be verified (tracked), the location of the key or approver would be checked. Let's say we had whitelists on our mobile devices, and the mobile device's IP address was in the packet somewhere. I log in to my healthcare website; the website contacts my mobile device, and since they are whitelisted, they can decrypt the packet contents. Basically, your data would be your data, and you would only release it to those you wanted to have it. If someone wasn't in your whitelist and it was important to you, maybe they could, gasp, call you? I'm not sure the analogy would hold up, but it would be similar to a door entry system where you have total control over the database of people who can enter.
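
A toy model of that scheme, with XOR standing in for real encryption purely to keep the sketch self-contained (a real design would need authenticated encryption and verified identities; `OwnerDevice` and its whitelist are invented names):

```python
from itertools import cycle

def xor(data: bytes, key: bytes) -> bytes:
    """Toy 'encryption': XOR against a repeating key. NOT real cryptography."""
    return bytes(a ^ b for a, b in zip(data, cycle(key)))

class OwnerDevice:
    """The data owner's device: holds the key, releases it only to whitelisted peers."""
    def __init__(self, key: bytes, whitelist: set):
        self._key = key
        self._whitelist = whitelist

    def request_key(self, requester: str):
        return self._key if requester in self._whitelist else None

# Packets are encrypted by default; the key lives on the owner's device.
device = OwnerDevice(b"secret-key", {"healthcare.example"})
packet = xor(b"lab results", b"secret-key")

# A whitelisted party gets the key and can decrypt; anyone else gets nothing.
key = device.request_key("healthcare.example")
assert key is not None and xor(packet, key) == b"lab results"
assert device.request_key("adtracker.example") is None
```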

What? (1)

iCEBaLM (34905) | about 3 months ago | (#47238439)

Web browsers with DRM built in? Terrible.

Re:What? (1)

Opportunist (166417) | about 3 months ago | (#47239163)

Ah, so anyone but the legitimate user can easily access the content? Just like with the other forms of DRM?

another solution (2)

strstr (539330) | about 3 months ago | (#47238443)

just remove all restrictions altogether and get rid of censorship and copyright and trademarks and privacy rules and classification and secret law.

"don't create data that shouldn't be made available to everyone" becomes the new rule.

and don't worry about that content being made of you because it's just capturing reality as it was at one point.

make the rules so that nothing can be hidden or kept secret.

http://www.obamasweapon.com/ [obamasweapon.com]

Re:another solution (1)

strstr (539330) | about 3 months ago | (#47238461)

my rules also say that patents are now invalid and no one owns ideas or content or information any more.

likewise even child porn is ok as long as the kids didn't feel they were raped or forced into participating when they made the content. >:)

obviously if a video of a rape was made it would be public because it fits under the rule that it documented one point in time as it was really happening.

and censorship is bad, mkay.

but it might lead to prosecuting the perps who did the assault :P

Re:another solution (1)

Opportunist (166417) | about 3 months ago | (#47239167)

But ... but 4 out of 5 people involved enjoy gang rape!

(now mod me down, I got karma to burn!)

Re:another solution (1)

SecurityGuy (217807) | about 3 months ago | (#47238529)

Right, so when you get a disease that people are irrationally afraid of and no one will hire you, then what?

The whole throw away privacy argument relies on everyone being more or less rational. Even if everyone is, maybe you get diagnosed with a disease that's going to kill you in a few years, but you'll be functional up until the end. Plenty of people won't hire you just because they won't hire someone who is only going to be there for a couple years regardless of the reason.

Re:another solution (1)

tlambert (566799) | about 3 months ago | (#47238617)

Right, so when you get a disease that people are irrationally afraid of and no one will hire you, then what?

You sue them, and they give you money for the rest of your life, just like you worked for them, but you don't have to come in and actually work?

What's the downside here?

Re:another solution (1)

viperidaenz (2515578) | about 3 months ago | (#47239273)

They appeal and delay the law suits and keep it dragging along in court until you're dead?

Re:another solution (1)

tlambert (566799) | about 3 months ago | (#47239791)

They appeal and delay the law suits and keep it dragging along in court until you're dead?

Just be cheaper to buy off than to fight.

Re:another solution (1)

SecurityGuy (217807) | about 3 months ago | (#47240567)

The downside is having to prove that they didn't hire you for that reason. Who the best candidate is is subjective. Did I not hire tlambert because he has $DISEASE, or because the other guy was a better candidate?

And as the other reply points out, lawsuits take time. I don't know about you, but I actually need money regularly. Sitting on my butt while a lawsuit wends its way through court is not something I can afford to do.

another solution (0)

Anonymous Coward | about 3 months ago | (#47239069)

no, get rid of copyright and patents.

Trademarks are there to prevent fraud. I'm not allowed to make ice cream, sell it to you, and tell you it's Ben&Jerrys.

Copyright enables the music, movie, and videogame industries. I will not shed a single tear to see those industries collapse, but, others might.

Patents, otoh, do nothing for nobody. Trade secrets are bullshit, because patents were supposed to be the answer to trade secrets.

Privacy laws, protecting things like medical records or naked pictures of your wife, can be good.

Stallman wrote about the push by various business school types to put all of these rules together as something called "IP". By tying your rejection of copyrights to a rejection of laws against me getting a naked picture of your wife and publishing it on the Internet, you do their job for them.

Not impressed by this (0)

Anonymous Coward | about 3 months ago | (#47238451)

A few issues come to mind. Slowing down the traffic, database integrity, invasion of privacy, denial of anonymity..... To mention only a few. Security of the background "database", administration of it, and much more.

Track everything and ignore the restrictions anywa (1)

kheldan (1460303) | about 3 months ago | (#47238457)

Sounds to me like something the NSA would come up with: a universal database tracking everyone's access to every little thing on the internet, where the so-called 'restrictions' are as meaningless as the 'do not track' flag in a web browser. It only works when everyone is playing by the same rules.

Do not track (0)

fustakrakich (1673220) | about 3 months ago | (#47238539)

Part deux

firsT (-1)

Anonymous Coward | about 3 months ago | (#47238561)

Bad summary or stupid idea? (1)

manu0601 (2221348) | about 3 months ago | (#47238583)

Is it a bad summary or a stupid idea?

As explained, it seems the system does not cover the case where someone gets the data and leaks it.

Re:Bad summary or stupid idea? (5, Informative)

tlambert (566799) | about 3 months ago | (#47238633)

Is it a bad summary or a stupid idea?

Yes.

As explained, it seems the system does not cover the case where someone gets the data and leaks it.

It's advisory access controls with voluntary indications of use, with transaction metadata logging.

(1) The rights you could be granted are based on the object, not the actor and the object
(2) You obtain the exported rights list
(3) You voluntarily provide a purpose in line with the rights which are granted
(4) Your voluntary compliance with the rights list is logged as metadata, because collection of metadata isn't controversial at all
(5) You retrieve the data
(6) You use it however the hell you want, because you're a bad actor
(7) If you are a good actor, you enforce use restrictions in the client

and...

(8) You try to sell the idea as somehow secure, even though it's less secure than NFSv3, since NFSv3 at least requires the client to forge their ID

So "Yes" - a bad summary, and a stupid idea.
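
The eight steps above can be condensed into a toy exchange that makes the "advisory" point concrete (the resource names, rights table, and logging format are invented for illustration):

```python
# Rights are attached to the object, not to (actor, object) pairs (step 1).
RIGHTS = {"/patient/123": ["treatment", "billing"]}
LOG = []  # transaction metadata log (step 4)

def fetch(resource: str, claimed_purpose: str) -> str:
    """Serve a resource after logging the client's *claimed* purpose.

    The purpose is voluntary (step 3) and only logged (step 4); the data
    is handed over either way (steps 5-6). Only a well-behaved client
    would go on to enforce the restrictions locally (step 7).
    """
    allowed = RIGHTS.get(resource, [])
    LOG.append((resource, claimed_purpose, claimed_purpose in allowed))
    return f"data:{resource}"

good = fetch("/patient/123", "treatment")  # compliant client
bad = fetch("/patient/123", "resale")      # bad actor: logged, not stopped
assert good == bad  # the server cannot tell the difference at serve time
```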

Do not want (0)

Anonymous Coward | about 3 months ago | (#47238601)

Do not want

Dumb idea and here why (1)

bl968 (190792) | about 3 months ago | (#47238623)

This is a dumb idea that sounds like a good concept. Like any other good thing on the internet, it requires that no one be malicious. SMTP didn't use to be restricted, until spammers abused it. All it takes to defeat HTTPA is a client written to ignore the A part.

Many issues (1)

symbolset (646467) | about 3 months ago | (#47238671)

The problem is distribution of trust. This is solvable.

Re:Many issues (1)

BitZtream (692029) | about 3 months ago | (#47238705)

Really? No one in the known universe has figured out how to enforce 'trust' on others.

Please, enlighten me: how do you force me to honor your request not to tell your secrets to someone else? Kill me before I have the chance to do anything with the information? That's the only known method, and it still depends on no exploits; such as me finding a way to tell the guy standing next to me before you manage to kill me.

Re:Many issues (0)

Anonymous Coward | about 3 months ago | (#47239835)

100%. You solve it by retaining trust internally.

Buzzword bingo (1)

Lehk228 (705449) | about 3 months ago | (#47238677)

BINGO!

Lost me at Semantic Web (1)

Gothmolly (148874) | about 3 months ago | (#47238725)

More hoopla, with bandwidth and CPU intensive DRM and user activity tracking on top. What problem is this even trying to solve?

Re:Lost me at Semantic Web (0)

Anonymous Coward | about 3 months ago | (#47239203)

"Lost me at Semantic Web"

rofl.

"What problem is this even trying to solve?"

The problem of how we can earn you more internet lolpoints. Mod parent +1 funny!

Re:Lost me at Semantic Web (0)

Anonymous Coward | about 3 months ago | (#47239847)

Industry momentum. HTTP, HTML, XML, SOAP etc aren't complicated enough yet.

We need to continue making needless complexity a religion; after all, we need to keep on selling hardware, sell more books, and pitch conferences with $latest_buzzword_methodology that repackages the old. Even the assholes who run "learn to program in a day" companies need to earn money, so until dumb history grads who found "internet companies" with their banker boyfriends can edit a config file and declare they've developed an app, we're not done.

how is this not toad's freenet (1)

rewindustry (3401253) | about 3 months ago | (#47238755)

under a different protocol?

how diff from google analytics? (1)

xxxJonBoyxxx (565205) | about 3 months ago | (#47239255)

...which already logs unique URIs and often classifies using server-config'ed tags?

Really want to share a secret Bob? Alice? (1)

INT_QRK (1043164) | about 3 months ago | (#47239999)

Maintain a physically secure, access-controlled, TEMPEST-hardened room in a secret, protected location. Verify through periodic repeated inspection and testing that all production media in the room are physically isolated from all untrusted communications networks (ideally, all networks). When you absolutely must share secret information with Alice, invite Alice to your room. Verify her identity, physically hand her the information to read, monitor her while she reads it, then physically retrieve the information and escort Alice out of the room when she's done. Any and all discussions regarding the information remain in the room and are allowed nowhere else. Alternately, and less desirably, convey the information to Alice's corresponding secure room via trusted courier. In agreement with Alice, monitor her with proven, effective investigation and surveillance techniques for the duration of your trusted relationship, for any behavior or conditions that weigh against continuing a relationship of trust. This is a proven system with high, but imperfect, reliability. Nothing is perfect, but anything, and I mean anything, on the Internet? Not as much.

what about lazy users? (0)

Anonymous Coward | about 3 months ago | (#47244201)

You have to make security EASY for a user. Any technology can be taken down by laziness ))
