
Comments


Heartbleed Coder: Bug In OpenSSL Was an Honest Mistake

lucag Re:It's not just the implementation (447 comments)

Still, I see no reason why the payload must be arbitrary rather than a fixed string.
This looks like a hidden control channel waiting to happen.

about 8 months ago

Wolfram Language Demo Impresses

lucag Re:Not so sure about the language... (216 comments)

Indeed, and it appears that this is actually the goal of the project, per the original announcement
  http://blog.stephenwolfram.com...
The scary bit is that many of the "novelties" announced there (e.g. the homogeneous treatment of input, output and data) are actually quite old ideas in the arena of functional programming (Lisp and Scheme are built upon these foundations)... sometimes they work nicely; often you risk ending up with academic exercises.
I am myself not too keen on "revolutionary technologies" which should rather be considered "evolutionary developments" (even when the evolution actually provides something new and useful)!

What is new here should be the integration with a massive database of `facts' and the possibility of performing elaborate queries, relying on `ready-made' algorithms.
This is very convenient and potentially useful but
  a) it has little to do with `programming' per se; it is a programmatic interface to a knowledge-based system (where the knowledge itself also includes the algorithms being requested)
  b) it is opaque, in the sense that there is little control over which code is acting on which data: many of the functions actually behave as black boxes and it is not straightforward to see how to take control of the system and/or understand what is actually being done to produce an answer.

A further remark: the kind of control described in (b) is most of the time not required at all (we just want to get a rough picture of something), but it is essential, e.g., for scientific applications.

about 9 months ago

Wolfram Language Demo Impresses

lucag Not so sure about the language... (216 comments)

As much as I would like to be impressed, what I see is quite underwhelming: a functional application language with some interface to "facts" and "databases", plus a pattern-matching engine, might make some analyses easier, but... the principles of the language are mostly what you come to expect if you have seen Lisp once, or any modern functional language, e.g. Haskell.

I can see it as being useful, but as another commenter pointed out, "FindShortestTour" is a library function (which might be handy), but definitely not an example of how concise the language might be; the same could be said about "EdgeDetect" or the like. The power of the language should be measured by how easily it can be extended and how easily non-trivial algorithms can be implemented... not by how many functions are offered (even if that is more convenient nonetheless).
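
To make that point concrete, here is a minimal sketch (in Python, with made-up city coordinates) of what a brute-force "shortest tour" looks like when there is no one-call library function; it has nothing to do with Wolfram's actual implementation, the point is simply that the interesting conciseness lies in expressing the algorithm, not in the name of a built-in.

from itertools import permutations
from math import dist

# Naive brute-force "shortest tour" over hypothetical cities, for contrast with
# a single call to a library function such as FindShortestTour.
cities = [(0, 0), (2, 1), (1, 4), (5, 3)]          # made-up coordinates

def tour_length(order):
    # length of the closed tour visiting the cities in this order
    return sum(dist(cities[a], cities[b])
               for a, b in zip(order, order[1:] + order[:1]))

best = min(permutations(range(len(cities))), key=tour_length)
print(best, tour_length(best))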

about 9 months ago

A MathML Progress Report: More Light Than Shadow

lucag Re:LaTex plugin (84 comments)

Actually, LaTeX was not written by Donald Knuth: it is a macro language built upon TeX by Leslie Lamport (who is also a very remarkable computer scientist, but I suppose he is not who you were thinking about).
Yes, the grammar is horrible, but once the basics of the language are mastered it feels like quite a natural setting in which to write text; if you want to implement a program in it, on the other hand, things are not so "easy". Look at http://stackoverflow.com/questions/2968411/ive-heard-that-latex-is-turing-complete-are-there-any-programs-written-in-late
This being said, the people at http://www.luatex.org/ are doing a really good job integrating the TeX engine with Lua and exposing its internals, so as to offer a "reasonable" programming language both for tuning the typesetting and for actually implementing algorithms within documents.

1 year, 24 days

A MathML Progress Report: More Light Than Shadow

lucag Re:LaTex plugin (84 comments)

Not really; if you are a professional mathematician, then you almost certainly have a good and fluent command of LaTeX; as such, the issue of a GUI is hardly relevant (and it feels "natural" to write equations in a certain way).

If, on the other hand, you just want to write some formulas on your web page, then I concur that LaTeX might be the wrong technology, and it is also a technology which is not so easy to integrate with GUI tools as soon as what you want to do is non-trivial (speaking of which, I would point out that most CAS support one form or another of TeX output for their results; the code they produce is "interesting", to say the least, and I always find it more convenient to retype the formula than to copy and paste what is produced).

1 year, 25 days

A MathML Progress Report: More Light Than Shadow

lucag Re:LaTex plugin (84 comments)

For a human?
  Absolutely nothing: I actually sort of like the example, even if you would need at least

\catcode`/=0 /def/implies{/Leftrightarrow}

somewhere in your file for the "/implies" to work ;))
This is exactly one of the problems I was pointing out: too much flexibility is not so good in this case.

For a computer?
Well, a computer can also do a good job of printing that out; as I said, MathJax does render such an expression (provided you do not use external macros, etc.) within a web page. Actually, this is what I use on my own page when I put the abstract of a paper online; ScienceDirect (a service by Elsevier) also makes full texts of papers available using the same trick. Yet it is fairly clear that this looks more like a stopgap measure than a solution to the problem: the standard for mathematics on the web should not be designed for humans, but rather for ease of parsing and processing (within a DOM) by machines, with "sort of" standard techniques and tools.

1 year, 25 days

A MathML Progress Report: More Light Than Shadow

lucag Re:LaTex plugin (84 comments)

Well, several reasons ...
First of all, LaTeX is a macro language whose goal is to typeset documents, not to render web pages: it is based upon the TeX engine and several of its extensions (nowadays I feel very comfortable with LuaTeX and I really enjoy the extra flexibility it provides). Its goal is to compose pages (encapsulating some of the typographic best practices in algorithmic form), not just to write math.
The syntax of TeX math mode is a handy way to write formulas and feels very natural (at least once you get used to it) and comfortable, but it is by no means perfect.
In particular, as soon as you need a non-trivial layout (e.g. for commutative diagrams), it requires you to understand how things will be rendered on the page and/or to write your own custom macros in order to get a decent presentation (nowadays usually with a mixture of the TikZ/PGF packages; in olden times it was a matter of building up vboxes and hboxes by hand). Clearly, all of this is not acceptable if you are thinking about a standard for web pages like MathML.

I have found the MathJax project to do almost what you want: render mathematical formulas on a page in an "almost portable" way, starting from a description of them in a standard restricted subset of TeX (the subset almost everybody is using, unless there are `special needs'); it solves a real problem and it does its job in a commendable way, but it is by no means a proper solution.

In summary, there are at least these problems to consider:
  1) LaTeX is a document description language (among other things), not just a language for typesetting formulas; here we are dealing with a subset of the TeX math mode operators (e.g. we do not want custom macros)
  2) the typesetting algorithm of LaTeX does not play nicely with the model used by HTML5 and is based on different assumptions
  3) parsing a LaTeX document is actually very different from parsing an XML one: as long as the document is "trivial", the conversion is sort of straightforward (\begin ... \end become opening and closing tags, etc.), but as soon as you want to exploit some of the flexibility afforded by the system, things get ugly very quickly.
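
To make point 3 concrete, here is a minimal sketch (in Python, on a made-up input) of the "trivial" direction only; anything beyond plain \begin/\end environments (custom macros, catcode changes, conditionals) breaks this kind of approach immediately.

import re

# Illustration only: convert plain \begin{env}...\end{env} pairs into XML-style tags.
# It handles nothing beyond this trivial pattern (no macros, no catcode tricks).
def latex_envs_to_xml(src: str) -> str:
    src = re.sub(r'\\begin\{(\w+)\}', r'<\1>', src)
    src = re.sub(r'\\end\{(\w+)\}', r'</\1>', src)
    return src

print(latex_envs_to_xml(r"\begin{theorem}Let $x>0$.\end{theorem}"))
# prints: <theorem>Let $x>0$.</theorem>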

lg

1 year, 25 days

Lavabit Case Unsealed: FBI Demands Companies Secretly Turn Over Crypto Keys

lucag Re:Certificate Authorities compromised? (527 comments)

So what?
An SSL certificate is used just to provide end-to-end encryption, not to protect the storage.
As such, it is sort of pointless to wonder whether the root certificate used by any major provider is, or has been, known to some federal agency... it is much easier to ask the owner of the server for its contents than to intercept the communication.

This being said, it appears that Lavabit used encrypted storage as well, but there is something amiss in the way the protocol was implemented, I fear.
(I have never used their service, so it might be that I am grossly misreading things: corrections would be very welcome!)

Let me explain: as long as encryption and decryption are performed by a remote server, there is no guarantee that data might not be captured (ok... homomorphic encryption might change part of the scenario; unfortunately it is far from practical today, and it will remain so for the next 5-10 years).
There are basically three approaches I can think of:
  1. perform decryption with a custom program on the client: the key is never sent "in clear" and the server just owns a public key to encrypt data as soon as they are received; however there is a window in which the server knows the plaintext (i.e. before writing it down to permanent storage) and might copy it.
[the sensible option is to ask people to use gpg and then rely on public servers, trusting the cryptography]
  2. perform decryption locally in a JavaScript client in the browser. This might actually work, and with the proper setup it is also possible to use public key algorithms
  (basically the user uploads to the server a copy of her private key encrypted with a symmetric algorithm, together with the public key; upon a decryption request the JavaScript app downloads that packet and decrypts it locally; then, once the private key is recovered, it moves on to locally decrypt every single datum as stored remotely). There is the same disadvantage as in 1 here, in the sense that the server can copy the data while they are "in clear", but no special client is required. I point out, however, that in this scenario it is possible for the server to serve a compromised JavaScript page which also uploads the secret key as soon as decryption is requested; as such, the attack surface is larger.
  3. perform decryption remotely by providing a symmetric (and/or private) key. Here it is just a matter of trust between the user and the server that the administrators are not going to either clone the data (though they could have done that in scenarios 1 and 2 as well) or keep a copy of the key as provided. This is the simplest solution, but also the least safe of them all.

In summary: do not trust anybody to do cryptography on your behalf (unless you work on homomorphic encryption, of course ;-) ), and least of all to decrypt any data; if you need secure (in the sense of 'secret') mail, require all parties to use client applications that perform the encryption on their own machines and not to delegate it to any third party (third parties might be used to store encrypted data, though).
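
A minimal sketch of scenario 1 in Python, assuming the PyNaCl library and made-up names (this is not how Lavabit actually worked): the provider only ever holds the public key and the sealed ciphertexts, while the private key never leaves the client. The window mentioned in 1 is still there: the server sees the incoming message in clear for the instant before it encrypts it.

from nacl.public import PrivateKey, SealedBox   # assumes PyNaCl (pip install pynacl)

# --- on the client, once: generate a keypair; only the public key is uploaded ---
client_key = PrivateKey.generate()
uploaded_public_key = client_key.public_key

# --- on the server, when a message arrives in clear: encrypt before storing ---
incoming = b"mail body as received over SMTP"
stored_ciphertext = SealedBox(uploaded_public_key).encrypt(incoming)

# --- back on the client: only the holder of the private key can read the store ---
plaintext = SealedBox(client_key).decrypt(stored_ciphertext)
assert plaintext == incoming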

about a year ago

Silent Circle Moving Away From NIST Cipher Suites After NSA Revelations

lucag Re: Mixing the signals (168 comments)

The point is not so much that the combined cipher would be weaker, as that it would be no stronger than using either of them alone, and there are some cases where it could actually be as weak as the weaker of the two. For instance, you do not gain anything under a "known plaintext" scenario.

Consider this case: you have an enciphering machine (say E) and you want to recover the keys being used by probing its behaviour with a series of
texts (which are either `random' or suitably chosen by you).
If E(m,k1|k2)=B(A(m,k1),k2)
where A,B are your original systems and k1,k2 the respective keys, we might try to mount an attack by intercepting the stream between A and B.
There is a slight security advantage, as a chosen plaintext attack for E becomes a known plaintext attack for B (the chosen plaintext is m; the known one is A(m,k1)), but if B is vulnerable the attacker can recover k2 and strip the second layer of encryption. Now we are left with attacks against A under a known plaintext model (which may or may not work). This is a variant of the usual "meet in the middle" approach used against 2DES; if you want a direct parallel, just consider
having to look for collisions (x,y) to
B^(-1)(E(m,?),x)=A(m,y)
where "?" denotes an unknown key.
A particular case is when x=y *as a design decision*. If this turns out to be the case (argued as "256 bits should be enough for anybody!" or the like), then it is actually the weakest cipher which matters (and not the strongest one).
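
If you want to see the 2DES-style version of that in code, here is a toy sketch in Python with hypothetical 16-bit ciphers A and B (nothing resembling real ciphers): with two known plaintext/ciphertext pairs, roughly 2 x 2^16 work recovers both keys, instead of the 2^32 trials a naive count of the key bits would suggest.

# Toy meet-in-the-middle against a cascade E(m, k1|k2) = B(A(m, k1), k2).
MASK = 0xFFFF

def A(m, k):
    return ((m ^ k) * 40503) & MASK              # 40503 is odd, hence invertible mod 2^16

def B(m, k):
    x = m ^ k
    return ((x << 3) | (x >> 13)) & MASK         # xor with the key, then rotate left by 3

def B_inv(c, k):
    return (((c >> 3) | (c << 13)) & MASK) ^ k

k1, k2 = 0x1234, 0xBEEF                          # the secret keys
pairs = [(m, B(A(m, k1), k2)) for m in (0x0042, 0x7E57)]         # two known pt/ct pairs

m0, c0 = pairs[0]
middle = {A(m0, g1): g1 for g1 in range(1 << 16)}                # tabulate the first stage
hits = [(middle[B_inv(c0, g2)], g2) for g2 in range(1 << 16)]    # peel off the second stage
m1, c1 = pairs[1]
keys = [(g1, g2) for g1, g2 in hits if B(A(m1, g1), g2) == c1]   # filter with the second pair
print((k1, k2) in keys)        # True; 'keys' may also contain a stray false positive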

Furthermore, it may well be that there are distinguishers for the first cipher under consideration; if that turns out to be the case, an attacker can infer strong statistical properties of the input stream to the second cipher, which could be exploited.

about a year ago

Silent Circle Moving Away From NIST Cipher Suites After NSA Revelations

lucag Re: Mixing the signals (168 comments)

Nope. Two weak ciphers do not make a strong one, just a mess.
This is not to say that a cryptosystem should not be designed from basic (and rather insecure) primitives suitably chained and iterated: this is actually the case for all modern block ciphers from Feistel-style networks to the AES. The point is that it is not sensible to rely for security on the rather unpredictable interactions between different encryptions and the actual risk is indeed a false sense of security.

A different problem is whether it makes sense to consider "replaceable" encryption algorithms as suggested. In the case of public key systems this would not be a good idea, as the properties, security parameters and behavior might be widely different (even in comparable usage scenarios) and unexpected weaknesses might appear. As for block ciphers, they are sort of supposed to be interchangeable (for a given block and key length); however, it has to be considered that a negotiation protocol might always be fooled by an attacker into selecting the "weakest" (in some sense) algorithm.

In short: in cryptography, flexibility can be (and usually is) a liability rather than an advantage. The best course of action is to be able to fully audit a "simple" implementation (and be somehow able to guarantee some security) rather than leave too much room for unsuspected attacks.

about a year ago

Silent Circle Moving Away From NIST Cipher Suites After NSA Revelations

lucag Madness (168 comments)

The last thing I would have expected from the documents about the extensive spying done by the NSA was a generalized weakening of cryptography.
While it is true that some algorithms might have been deliberately weakened by the NSA, I doubt this could have been systematic, especially for those which have been most thoroughly scrutinized by the cryptological community at large.
  In particular, the NIST-mandated cipher suites, while definitely amenable to some theoretical attacks in some cases, have been independently investigated and, as of today, no effective practical attack against AES is known. I would never trust a 'homemade' algorithm for anything, nor waste time trying to analyse one (cryptography is actually part of my job) unless there were some really compelling reasons for doing so (e.g. interesting mathematics, peer review requests or unusual attack models being considered).
Skein and Twofish are definitely interesting algorithms, and they were well regarded in the competitions leading to SHA-3 and AES; they are definitely not a bad choice, but to choose them because whatever has been selected by NIST is "tainted" by the NSA (and not for other architectural or practical considerations) resembles a form of superstition more than anything else.

about a year ago

Btcd - a Bitcoind Alternative Written In Go!

lucag What programming language? "Go" or "Go!"??? (150 comments)

Please observe that "Go!" and "Go" are two quite different programming languages.
The client appears to be written in "Go", which is the language by Google, but the headline would suggest "Go!" by McCabe & Clark.
I find the same ambiguity in the text... still, I am looking forward to the time when we shall use the full Unicode range in order to have
similar-looking, yet entirely different, names. That shall be fun!

about a year and a half ago

Ask Slashdot: Web Site Editing Software For the Long Haul?

lucag Re:Emacs (545 comments)

I find the combo of
Emacs with nxhtml and CEDET (for PHP)
very nice and powerful (the only inconvenience is the startup time with CEDET, but that is what you get if you want semantic analysis performed by the editor).
I seem to remember there should also be something similar running under Eclipse, though.

regards,
  lg

more than 3 years ago

Writing Linux Kernel Functions In CUDA With KGPU

lucag Re:ECB Mode is totally insecure (101 comments)

Writing parallel code is difficult. Writing parallel code which makes sense is even harder. Actually, if you have a quad-core CPU and do ECB instead of CBC, then you can manage a 4x increase in performance... no need to use a GPU!
(The reason is that ECB encryptions can be done in parallel, as each block is independent; for CBC you need to know the ciphertext of block n-1 in order to produce that of block n.)
A counter mode (CTR) might make sense for ecryptfs, but the security analysis is definitely non-trivial.
Actually, it is amateurish at best to say that this implementation of ecryptfs "is not a toy" ...
(per http://code.google.com/p/kgpu/wiki/IozoneBenchmarkResults )
it is, in fact, something which seriously compromises security.
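
To make the data dependency explicit, here is a small Python sketch with a placeholder "block cipher" (it is not even invertible; only the data flow matters here): the CBC loop cannot start on block n before block n-1 is done, while every CTR (or ECB) block can be handed to a different core.

from hashlib import sha256

def toy_block_cipher(block: bytes, key: bytes) -> bytes:
    # stand-in for a real 16-byte block cipher such as AES; NOT invertible, NOT secure
    return sha256(key + block).digest()[:16]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def cbc_encrypt(blocks, key, iv):
    out, prev = [], iv
    for b in blocks:                             # sequential: block n needs ciphertext n-1
        prev = toy_block_cipher(xor(b, prev), key)
        out.append(prev)
    return out

def ctr_encrypt(blocks, key, nonce):
    # each block depends only on its own counter value: trivially parallelisable
    return [xor(b, toy_block_cipher(nonce + i.to_bytes(8, "big"), key))
            for i, b in enumerate(blocks)]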

more than 3 years ago

No P = NP Proof After All

lucag Re:This will NO break any encryption algorithms... (318 comments)

Actually, to build a cryptosystem on top of an NP-complete problem would be a very bad idea indeed.
It is worth observing that while NP-complete problems are believed to be hard in the worst case, most of the `average' instances can be solved quite easily. There are several papers on the topic (Levin84 was the first, but see also http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.39.8775 ).
A further remark: people seem to assume that "NP-complete" means "as hard as conceivable". This is utterly false. A solution to a problem in NP can, by definition, be verified by a Turing machine in polynomial time; this is not the case for more general classes of problems (for example those in the classes PR or R).
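
As a concrete illustration of that last point, here is a minimal Python sketch on a made-up subset-sum instance (an NP-complete problem): checking a claimed certificate is an obvious polynomial-time test, while finding one may require searching exponentially many subsets.

from itertools import combinations

numbers = [3, 34, 4, 12, 5, 2]                   # made-up instance
target = 9

def verify(certificate) -> bool:
    # polynomial-time check of a claimed solution (assumed to list distinct elements)
    return all(x in numbers for x in certificate) and sum(certificate) == target

def solve():
    # brute force: up to 2^len(numbers) subsets in the worst case
    for r in range(len(numbers) + 1):
        for subset in combinations(numbers, r):
            if sum(subset) == target:
                return subset
    return None

cert = solve()
print(cert, verify(cert))                        # (4, 5) True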

more than 3 years ago

The More Popular the Browser, the Slower It Is

lucag So what? (367 comments)

The linked article seems to be quite devoid of proper content... after a test of some browsers on just one computer (and, I guess, just one OS), they conclude that there is an inverse correlation between popularity among the people visiting their site and performance.
Not quite what I would call an accurate and scientific approach!
This being said, there might be a grain of truth in the fact that the more popular the browser, the more "corner cases" are exercised (and thus have to be implemented). By corner cases, I do not mean what the standard dictates, but what you find (ab)used on way too many pages.

more than 5 years ago

SGI Lives On, In Name At Least

lucag If only ... (107 comments)

Compaq had changed its name to Digital when it still had time ...

[just being nostalgic and wondering who had the "bright" idea to dump development of the Alpha line in favour of ia64!]

more than 5 years ago

Intel Cache Poisoning Is Dangerously Easy On Linux

lucag Re:First you need root on the box (393 comments)

The point of this exploit is not to install a rootkit, but to do it without altering the kernel or the executables at all; this is clever & nice.
Yet, there is the "negligible detail" that you have to become root first; this seems to be lost on the author of the second piece in the summary.

As the poster says, once you are root you can do anything you want (including, but not limited to, reflashing the BIOS in many cases) and hide all your tracks; getting a rootkit hidden without messing with the system, that is definitely more challenging.

more than 5 years ago

Ancient Books Go Online

lucag Re:Copyright on Ancient texts is nothing new (198 comments)

Preparing a critical edition requires a non-trivial amount of effort and work, and it makes sense that it counts as a creative activity: it is not just about "recovering what is there", but also about proposing a model in which a text might fit. Actually, the original text itself may not be subject to copyright (nor may anybody claim it), but the actual compilation is. So, while copyright on the texts of Homer has definitely expired (and it cannot be claimed by, for example, the Greek government as "rightful heir"), a critical edition of the Iliad is protected.

Actually, if one just wants to read an ancient work the point might have limited relevance
(since -usually- it might be possible to find late XIX century critical editions which are "good enough"). However, for scholarly study it is of the utmost importance to determine which lectio is being followed (and why).

For a paradoxical example of "what a commentator might do", I point to the novel "Pale Fire" by Nabokov [a short description is here: http://en.wikipedia.org/wiki/Pale_Fire ]
This is not supposed to happen in real life, though [even if some "comments" on sacred texts might have had even more radical effects]

I would like to add a further remark: most libraries and galleries control reproduction rights for their possessions (e.g. by forbidding photography); this is something quite different from the copyright of the author.

  For example, consider this page on the National Gallery web site:

http://www.nationalgallery.org.uk/home/copyright.htm

The National Gallery has copyright FOR ALL THE PHOTOGRAPHS OF THE PAINTINGS [...] ON THE WEBSITE
and then they note that
  "For some more recent works in the collection the work itself will also be in copyright."

more than 5 years ago

Ancient Books Go Online

lucag Nice collection, and with pdf download as well (198 comments)

There are already several projects to scan and/or make available ancient texts [see, for example,
http://gallica.bnf.fr/ or http://www.archive.org/ , not to mention more specialist sites like http://www.etana.org/ (for ancient Near East history) or the impressive Posner Collection at
http://posner.library.cmu.edu/Posner/ ]
However, most of these (with the remarkable exceptions of Gallica and CMU) present late XIX / early XX century editions of the texts. This is good, but I feel it is definitely interesting to also get some "primary texts" online, which is what this project is doing [I don't quite like that the "Description de l'Egypte" is filed under 8000 BC - 499 AD, rather than 1800 AD - 1849 AD: the books are ABOUT Egyptian antiquities, yet they were written after the Napoleonic expedition!]

I was going to complain about the need to use wget to get the books for offline browsing, yet I have just seen that there actually is an option to download the texts as PDF files (alas, not DjVu); this is really a nice surprise; actually, I was expecting the donating libraries to try their utmost to prevent this [not that it would ever work].

I would say that this is really a worthy project.

P.S.
  There is a small editorial here as well, but I don't know if it requires subscription to be read:

http://www.nature.com/news/2009/090420/full/news.2009.377.html

more than 5 years ago

Submissions


Encrypted traffic being intercepted

lucag lucag writes  |  about a year ago

lucag (24231) writes "It appears that most of the encrypted traffic over the net is currently being intercepted and in some way decoded by the NSA.
Here are some references, unfortunately scant on the details that matter; actually, it is not clear whether it is a matter of broken algorithms, compromised protocols, MITM attacks or just plain old "enforced" cooperation by the providers.

http://www.theguardian.com/world/2013/sep/05/nsa-gchq-encryption-codes-security

http://www.nytimes.com/2013/09/06/us/nsa-foils-much-internet-encryption.html?smid=tw-nytimes&_r=0"

Journals

lucag has no journal entries.
