garyebickford (222422) writes "Space Finance Group (in which I'm a partner) has launched a Kickstarter to fund updating the famous Integrated Space Plan, created by Ron Jones at Rockwell International in the late 1980s and early 1990s, which can still be found on walls in the industry today. The new Plan will be a poster, but will also provide the initial core data for a new website; the permanent link will be thespaceplan.com. As additional resources become available, the website will grow to contain much more information, eventually with advanced data management (possibly drawing on sources like Linked Data) and visualization tools, becoming a resource for education, research, entertainment, and business analytics. The group also hopes to support curated crowdsourcing of some data, and is talking to space development companies about providing data about themselves. They hope to construct new timelines and show the relations between events and entities — companies, agencies, people, etc."
How 'private' is TOR, really? ('hidden network', indeed!)
garyebickford (222422) writes "I have a suspicion that TOR is nowhere near as private as is generally assumed. We can assume that some fraction of all the nodes out there are run by what I'll term 'spies' — entities who want to know things about whoever's using TOR. The question is, what fraction is sufficient to be able to reconstruct missing pieces, and figure out with a high degree of reliability what the 'real' source and destination are, assuming those 'spy' nodes can all talk to each other? There is some good math for doing such reconstructions of networks where most of the nodes are unknown. I suspect that the necessary fraction is somewhere near 10%. It's quite possible that your friendly neighborhood 3-letter spook shop knows a lot more about what's going through the TOR network than any of us, the great unwashed, realize. So, how much of the TOR network needs to be 'cooperating' to significantly compromise privacy?"
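As a sanity check on that 10% guess: if an adversary controls a fraction f of relays, and deanonymizing a circuit requires compromising both its entry and exit, the naive success rate is roughly f squared. Here is a toy Monte Carlo sketch of that intuition. It assumes uniform relay selection; real Tor weights relays by bandwidth and uses entry guards, so this is only illustrative, not a model of Tor's actual path-selection algorithm.

```python
import random

def fraction_deanonymized(f, n_relays=1000, n_circuits=100_000, seed=42):
    """Estimate the share of 3-hop circuits whose entry AND exit relay
    are both adversary-controlled, when a fraction f of all relays is
    compromised and relays are chosen uniformly at random."""
    rng = random.Random(seed)
    compromised = set(rng.sample(range(n_relays), int(f * n_relays)))
    hits = 0
    for _ in range(n_circuits):
        entry, middle, exit_ = rng.sample(range(n_relays), 3)
        if entry in compromised and exit_ in compromised:
            hits += 1
    return hits / n_circuits

# With 10% of relays compromised, roughly 1% of circuits expose both ends.
```

At f = 0.1 this lands near 1% of circuits, which suggests the quantity to worry about is endpoint traffic correlation rather than full path reconstruction.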
Would you support National Space Society videos on the importance of space?
garyebickford writes "I didn't know where to put this (it's not really science, where most space stuff goes), but I think it's important, so I'm asking Slashdot: how important is space development, in your opinion? How would you tell people about it? The National Space Society wants to produce several videos about the importance, and the potential, of commercial space development.
I think space development could be the most important subject today. The potential of space industry for relieving various resource issues on Earth (including even possible 'climate change' and other ecological concerns) could make many of the contentious issues of the day moot. Just as the 'discovery' of the New World and the sudden availability of large amounts of various resources disrupted the economic and political systems of the entire world, so also space could completely change the game on Earth again. I think this is not only a good thing overall; I think it is inevitable, and we should be planning for it."
garyebickford writes "I'm not a big Facebook user (and I keep the privacy settings tight). I also don't like the fundamental model, where storage of one's life history is dependent on a single for-profit company whose interests do not coincide with mine. It's as if my shoebox of old pictures had a Facebook gateway on it.
So, what would be the complications involved in making a peer-to-peer, server agnostic tool with some of the features of Facebook and a possibility of achieving the necessary network effect? Obviously there are costs and security questions, so I don't think it means pure peer-to-peer as in BitTorrent. I'm thinking more like a network of Jabber servers run by many vendors. I can choose a company that I trust to retain my data, at a cost that I am willing to pay (whether in advertising or cash or whatever). Since there would be no technical barriers, the vendor market could remain competitive and vendors would tend to provide better service and support.
The key is probably the protocol for finding 'friends' and transferring the proper amount of data. Could it be based on Jabber's networking model? I would think that for privacy, all the data would have to be transferred directly from each vendor's server to the browser, unless it is passed through intermediary vendors in encrypted form. IMHO this could be a very cool project."
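A minimal sketch of the federated model described above, with hypothetical vendor names throughout. It shows only Jabber-style addressing (user@domain) and cross-vendor profile lookup; encryption and the real XMPP wire protocol are out of scope.

```python
class VendorServer:
    """One vendor's server in a hypothetical federated social network.
    Each server stores only its own users' data, Jabber-style."""
    def __init__(self, domain):
        self.domain = domain
        self.profiles = {}   # local user -> profile data
        self.friends = {}    # local user -> set of remote addresses

    def register(self, user, profile):
        self.profiles[user] = profile
        self.friends[user] = set()

    def add_friend(self, user, friend_addr):
        self.friends[user].add(friend_addr)

    def fetch_profile(self, addr, registry):
        """Resolve user@domain and fetch the profile from whichever
        vendor hosts it -- no central silo involved."""
        user, _, domain = addr.partition("@")
        return registry[domain].profiles[user]

# Two competing vendors, one social graph:
registry = {}
a = VendorServer("vendorA.example"); registry[a.domain] = a
b = VendorServer("vendorB.example"); registry[b.domain] = b
a.register("alice", {"name": "Alice"})
b.register("bob", {"name": "Bob"})
a.add_friend("alice", "bob@vendorB.example")
print(a.fetch_profile("bob@vendorB.example", registry))  # {'name': 'Bob'}
```

The registry dict stands in for DNS: in a real system the domain part of the address would be resolved over the network, which is exactly what lets users pick any vendor they trust.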
Multilingual programming languages - why or why not?
garyebickford writes "I've thought about this a few times. With all the multilingual work on web pages, maybe it is time to make programming languages multilingual. I think it would be relatively easy for the core language — just as with web pages, provide a set of translation files for each 'human' language, and at the top of the source file include a directive naming the default human language. Editors could then pick up that directive and display the source code in that language or any other.
Handling source code libraries could be done in a similar manner, although it would be significantly more complicated. And comments — well, those might be problematic. It might also be necessary for programmers to include translation files for variable names. But using this method, someone programming in French and someone programming in English, for example, could both work on the same code base in the language they are most comfortable with, and the code would make sense to both."
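As a rough sketch of the directive-plus-translation-file idea: the `#lang:` directive and the keyword tables below are invented for illustration, but they show how an editor or compiler front end could map localized keywords back to a canonical set while leaving identifiers alone.

```python
import re

# Hypothetical per-language keyword tables, as the post suggests.
KEYWORDS = {
    "en": {"if": "if", "else": "else", "while": "while", "return": "return"},
    "fr": {"si": "if", "sinon": "else", "tantque": "while", "retourner": "return"},
}

def to_canonical(source):
    """Read the '#lang:' directive on the first line and translate the
    localized keywords to the canonical (English) set; any word not in
    the table (identifiers, numbers) passes through unchanged."""
    first, _, body = source.partition("\n")
    lang = first.split(":", 1)[1].strip()
    table = KEYWORDS[lang]
    return re.sub(r"\b\w+\b", lambda m: table.get(m.group(0), m.group(0)), body)

french = "#lang: fr\nsi x > 0: retourner x\nsinon: retourner -x"
print(to_canonical(french))
# if x > 0: return x
# else: return -x
```

Displaying the code in some other human language is just the same substitution run with an inverted table, which is why round-tripping keywords is the easy part; comments and variable names are where it gets hard, as noted above.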
The proliferation of debris orbiting the Earth – primarily jettisoned rocket and satellite components – is an increasingly pressing problem for spacecraft, and it can generate huge costs. To combat this scourge, the Swiss Space Center at EPFL is announcing today the launch of CleanSpace One, a project to develop and build the first installment of a family of satellites specially designed to clean up space debris.
This looks like a reasonable method, although I think that at some future point it might be useful to just put at least the smaller stuff in a higher 'parking orbit' for later destruction or recycling. This way you wouldn't lose one vacuum cleaner for each satellite retrieved. And much later down the road, it might be useful to collect bigger units — expended boosters, for example — as raw materials and/or containers. The cost of getting the mass into space has already been spent.
I optimistically foresee a future where much of the stuff sent into orbital space has a recycling function built into the design."
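For scale, boosting debris to a higher parking orbit is not free. A standard Hohmann transfer calculation (the altitudes below are illustrative, not from the article) puts the move from an 800 km orbit to a 2,000 km orbit at roughly 550 m/s of delta-v per object:

```python
import math

MU = 3.986e14        # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000  # mean Earth radius, m

def hohmann_dv(alt1_km, alt2_km):
    """Total delta-v (m/s) for a Hohmann transfer between circular
    orbits at the given altitudes."""
    r1 = R_EARTH + alt1_km * 1000
    r2 = R_EARTH + alt2_km * 1000
    v1 = math.sqrt(MU / r1)
    v2 = math.sqrt(MU / r2)
    a = (r1 + r2) / 2                            # transfer-ellipse semi-major axis
    dv1 = math.sqrt(MU * (2 / r1 - 1 / a)) - v1  # burn to enter the transfer orbit
    dv2 = v2 - math.sqrt(MU * (2 / r2 - 1 / a))  # burn to circularize at the top
    return dv1 + dv2
```

That per-object cost is the argument for a reusable tug that re-orbits many pieces, rather than losing one vacuum cleaner for each satellite retrieved.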
garyebickford writes "I'm just wondering — is there any real use for a fondleslab for a programmer or sysadmin? I just don't see what the use of these things is, even for the great unwashed. I have on occasion used my cell phone to log in to a server and do minor maintenance, but I would not use the phone for programming or editing beyond some triviality like a hosts file, except in extreme circumstances. Touchpads are too big to fit in a pocket, their resolution is too low to do any useful work (for biz types maybe a small spreadsheet or something, but that's a different group), and without a cover they seem to me too vulnerable to scratching and breakage.
So, please enlighten me with all your collective wisdom, and warm me with your flames."
garyebickford writes "My company looks like it's going to adopt the dev environment I've been using for a while, which is based on integrating Mercurial, Trac and dotProject. So I'm thinking they are soon going to want to know how to do things. I can't be the first one in this position, so I'm wondering if someone else has made up some slides etc. that would be useful in training my fellow geeks (and that they'd be willing to share)."
garyebickford writes "Today is Ockham's birthday. Time to read about what he actually said.
Although he is commonly known for Occam's razor, the methodological principle that bears his name, William of Ockham also produced significant works on logic, physics, and theology.
He is also considered the father of epistemology. One notable aspect of his life is that he apparently did not fit into the normal schooling process: he left before achieving his master's degree (equivalent to a high school diploma) and got in trouble with the Church over his original thought, yet was widely admired for his creative and far-reaching ideas. Sounds like me! :) (and a lot of /.ers)
I've realized for years now that it would not be difficult for a motivated group to get malware code built into any of dozens of open source projects. For security reasons I won't go into methods here, but it's true. And the fact is that fewer and fewer *nix users or administrators ever see, much less review, and much, MUCH less really understand, the code. Most of us don't have time or motivation to do so — we just install binaries using yum, apt-get, ports, or whatever.
While the total number of *nix servers is tiny compared to the number of Microsoft boxes, their potential impact is much greater, and vulnerabilities are discovered quite often. So it seems to me that these are high-value targets of opportunity. Why, then, are we not seeing such attacks? Are the bad guys (whoever they are) saving their attacks for when they need them, such as the start of an international conflict?"
garyebickford writes "I construct fairly complex applications. I'd like to know others' approaches to making installable packages for local installation in a FreeBSD environment. Should I build a port?
Our applications generally use a mix of software including shell scripts that call programs that are themselves in the BSD ports system, including PHP with some important non-core modules (also in ports), Apache, plus very rarely non-ported third party software that has to be compiled and installed. The systems commonly require creation of directory trees and crontab entries.
In order to make life simpler for the production IT folks, I'd like to have an automated installer. I've built installers using shell scripts. I haven't used make for much in the past (so there is a learning curve), but I wonder if it would be appropriate here. If so, then perhaps I should go all the way and build a BSD port package. However, it doesn't appear that there is a 'local ports' section that would be appropriate for this usage. I've glanced at the Porter's Handbook but not read it thoroughly.
We use Mercurial for version control, and I'd like to finally end up with a package that can be built automatically from the repository."
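For reference, a local port's Makefile can be quite small. The sketch below is illustrative only (every name is made up), and the Porter's Handbook documents the real macros and dependency syntax for your ports tree's vintage; a private category such as 'local' can be added to a ports tree for exactly this in-house use.

```make
# Hypothetical local port skeleton; all names here are invented.
PORTNAME=	acme-webapp
PORTVERSION=	1.2.0
CATEGORIES=	local www	# a private 'local' category added to the tree
MASTER_SITES=	# distfile exported from the internal Mercurial repository
MAINTAINER=	you@example.com
COMMENT=	In-house web application bundle

RUN_DEPENDS=	${LOCALBASE}/bin/php:lang/php5 \
		${LOCALBASE}/sbin/httpd:www/apache22

do-install:
	${MKDIR} ${PREFIX}/www/acme
	(cd ${WRKSRC} && ${COPYTREE_SHARE} . ${PREFIX}/www/acme)

.include <bsd.port.mk>
```

Directory trees go in pkg-plist, crontab entries can be handled in a post-install target or pkg-install script, and `hg archive` can produce the distfile, which gets you a package built automatically from the repository.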
garyebickford writes "(This article is from March 2008. I searched Slashdot for polarized LED and did not find anything, so I'm submitting now.)
This article is about "Ultra-efficient LED, Developed By Student, Will Vastly Improve LCD Screens, Conserve Energy," but I think they missed the boat. The most cost-effective and easy-to-use 3D systems at this point use polarized glasses and polarized screens.
There are several other systems in the works, including some that don't require any glasses — see Philips WOWvx, etc. However, all those without glasses are only 3D in the horizontal axis, requiring the viewer to maintain something like an upright position — not good for watching in bed.
This student noted that the light emitting part of an LED is already polarized, but the packaging doesn't maintain that. So he worked out how to preserve the natural polarization. This leads me to think that this could be used by OLED displays to provide an excellent 3D display experience, regardless of one's orientation (as long as the glasses stay on).
[Rensselaer Polytechnic Institute student Martin] Schubert's polarized LED advances current LED technology in its ability to better control the direction and polarization of the light being emitted. With better control over the light, less energy is wasted producing scattered light, allowing more light to reach its desired location. This makes the polarized LED perfectly suited as a backlighting unit for any kind of LCD, according to Schubert. Its focused light will produce images on the display that are more colorful, vibrant, and lifelike, with no motion artifacts.
Duke University and United States Army scientists have found that a cheap and nontoxic sunburn and diaper rash preventative can be made to produce brilliant light best suited to the human eye.
Duke adjunct physics professor Henry Everitt, chemistry professor Jie Liu and their graduate student John Foreman have discovered that adding sulfur to ultra-fine powders of commonplace zinc oxide at about 1,000 degrees centigrade allows the preparation to convert invisible ultraviolet light into a remarkably bright and natural form of white light.
I can see the comic books now... "Super Baby defends the planet against the aliens! Shooting intense beams of light from his diaper, he attacks! The aliens put up their defensive shields in vain! Hot beams of sulfur-zinc-light fry them in their flying saucers!"
The researchers said they are producing white light centered in the green part of the spectrum by forming the sulfur-doped preparation into a material called a phosphor. The phosphor converts the excited frequencies from an ultraviolet light emitting diode (LED) into glowing white light.
Zinc oxide would be both a less-toxic and cheaper light source than the combinations used in today's commercial LEDs — gallium nitride and cerium-doped yttrium oxide, they said. Cerium-doped yttrium oxide is also used in today's mercury-containing fluorescent bulbs, Everitt added.
Liu's lab originally stumbled on to the light emitting potential of sulfur-doped zinc oxide while studying its electronic conductivity. "We just lit it up with an ultraviolet laser and — whammo — there was a lot of white light coming out," Everitt said.
garyebickford writes "Does an actually useful antivirus software package for Linux exist — either FOSS or commercial?
Today's article cited in Slashdot's Apple section discusses Apple's recommendation for its users to install antivirus software. I think that Linux users who believe viruses won't attack their machines are increasingly whistling in the dark. Relying on Linux's lack of market penetration will eventually be a failing strategy, as Linux becomes more popular in servers, appliances and especially desktops and laptops.
Considering the widely distributed development process and vast number of applications developed by (largely altruistic) independent teams all over the world, preventing viruses permanently is an intractable problem. I have personally come up with at least a few motives and methods for evil baddies to incorporate evil software into essential Linux applications in such a way that the exploit might not become known until triggered some time in the future. If you think about it, it is not a substantially different problem than a 'mole' infiltrating a high tech company or government body.
I confess that I sometimes do not exercise sufficient care when installing software. I suspect I am not alone, and even if I do take care, it would be highly difficult for me or anyone, no matter how sophisticated, to catch all possible exploits by reading the code. What if a user is offered a downloaded software package? What if it's my hypothetical grandmother who I have converted to Linux? They might accept the installation, and even provide the root password. No, they shouldn't — but some absolutely will. Or what if the software exploits a hidden flaw in some other software to achieve root access?
Once an exploit is executed and discovered, the community will no doubt be very quick to respond, but that is closing the barn door after the horse has been stolen."
The technical politico-economic solution to Amtrak
garyebickford writes "This article discusses the weird problem of the US passenger train company, Amtrak.
Amtrak is a "two-headed beast," said David Laney, a Texas lawyer who chaired Amtrak's board from 2003 to 2007. "On the one hand, it's a private company... that's told to run itself efficiently to make money. On the other hand, you have the concept of Amtrak as a public service provider."
Congress created Amtrak in 1970 from the wreckage of the nation's unprofitable private passenger rail service. Lawmakers structured it as a private corporation — with most of the stock held by the federal government — with the hope it would one day become self-supporting.
That goal has largely faded. Amtrak officials point out that passenger rail is subsidized throughout the world, and it gets about $1.3 billion in federal funding a year.
Amtrak is the quintessential political solution to a problem. It seems almost purposely designed to make passenger rail as difficult as possible, and to let it neither die nor survive. It should be renamed "Zombie Rail". It is the worst of all possible worlds, encouraging inefficient, outdated monopoly rail companies to keep doing the same things, badly. It combines what government does badly with what business does badly.
I have a better idea. It combines what government does reasonably well with what business does better. Government is fairly good at building and maintaining infrastructure, and for a variety of reasons is the natural 'vendor' of infrastructure. The Interstate Highway System is a good example.
Business is generally better at customer service, finding and making opportunities, and reducing operating costs. In a truly competitive environment, that is... In a monopoly environment, business acts just like government.
So, the solution: the government should buy all the rails from the railroad companies (nationalize the infrastructure) and run them as, essentially, toll roads. The existing rail companies could continue to run their trains, paying per-ton-mile rates equivalent to their present costs (or less), but their monopolies would go away — probably phased out over time. In return, they could run their trains anywhere. These companies could be smaller, leaner, more efficient and probably more profitable than they are now. At present their costs of rail and property maintenance greatly reduce their viability as businesses, and they have little or no ability to expand rail into new locations, lacking a simple eminent domain process.
This is the key: any entrepreneur who believes they can make money running a passenger train from A to B could do so (subject to various qualifiers, analogous to those the airlines must meet for airport access). The entrepreneurs would own their trains, just as trucking companies own their trucks.
The government could then upgrade the infrastructure according to need per a defined national policy — such as priority for passenger trains, high speed rail, or whatever. Rail could compete in the government policy marketplace on an equal basis with highways.
The Interstate Highway System demonstrates the success of this approach. I will note that the Intracoastal Waterway, another national transportation system that works this way, is not doing so well. Funding has been cut to the point where dredging in some areas is more than ten years behind schedule and no barges can get through — which means less and less usage, which means less and less funding."
garyebickford writes "We have long known that there is no gravity — the Universe sucks. More recently we have learned that at larger scales there is no dark energy — the Universe blows. This article from National Geographic proposes some silliness about the entire Universe being pulled by "something" from outside. Well-trained observers will realize this is just proof that the Universe is going to hell! :D"
garyebickford writes "Enquiring minds want to know: Where can I learn functional programming, and oh-by-the-way catch up on some serious holes in my mathematical training? A BS in math? An MS in CS with a bunch of math behind it? If so, where can I do it part time and/or online? What are the best schools for my situation? I think that just reading the MIT course materials is no longer enough for my leetle brain.
I think I'm lacking some fundamental theoretical bases for the modern computing direction. I'm thinking of going back to school, and looking for suggestions. I think I'm not the only one with these kinds of issues, so I'm posting this in hopes for some benefit from the Wisdom of Slashdot — whatever that may be!
Background — I started out in school a long time ago, doing imperative programming in the days when Hoare, Brinch Hansen and Wirth were the Word on campus. Structured Programming was big when I was in my first job. (Can you draw your program on a piece of paper with no crossing lines?) I dabbled a bit in APL (more recently A+ and J), and never really tried LISP (never got a Round Tuit). My career has been entirely based on imperative languages.
Now I am interested in Haskell, Scheme (for different but related reasons), OCaml, and of course LISP. I've also been a Neural Nets fanboy for a long time. And, of course, these days computing is going to require ever more serious, effective asynchronous parallel programming solutions.
Until I understand the underlying framework, I won't feel comfortable with the unfamiliar terminology, such as Monads. Will 'pure' functional programming (one input, one output, no side effects) work for a given problem? I don't know enough to answer the question. How do I choose between Haskell and Scheme, for starters? Either one would apparently work (better than PHP?) for the next phase of the project I'm working on.
I think now that I need to retake Advanced Linear Algebra, Real Algebra, Differential Equations and a bunch of other stuff. It seems that I need this material, and I need to learn it in a coherent, connected way. In addition to the computing aspects, I need the equivalent of a BS in math. So far I have not been able to build a mental model on which to hang this branch of computer science."
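On the 'one input, one output, no side effects' point raised above, the distinction is easy to see even in an imperative language. Python is used here only because the post names no single language; Haskell enforces the second style, Python merely permits it.

```python
# Impure: the result depends on hidden mutable state, so two
# identical calls can return different answers.
_history = []
def record_and_sum(x):
    _history.append(x)
    return sum(_history)

# Pure: everything the function needs comes in as arguments, and
# everything it produces goes out as the return value. Same
# arguments, same answer, always -- which is what makes such code
# safe to run in parallel.
def running_sum(history, x):
    new_history = history + [x]
    return new_history, sum(new_history)

h, total = running_sum([], 3)
h, total = running_sum(h, 4)
print(total)  # 7
```

The pure version threads its state explicitly; monads are, roughly, Haskell's machinery for doing that threading without the clutter.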
garyebickford writes "How do you make yourself a better programmer? Do you study methodologies, or code, or expose yourself to more languages? Or do you just write more stuff? Do you review and edit FOSS code to 'help the community' and simultaneously improve your awareness of methods?
DLowe has some interesting and IMHO powerful thoughts about how to become a better programmer, starting from a chess analogy. He found that studying tactics was the best way to become a better chess player, and he argues that studying coding problems is the best way to become a better programmer. It's not whether you use "Agile" or "Extreme" or "Object-oriented" or "method X"; it's studying and practicing on short, isolated example code. This makes a lot of sense to me. I found a similar pattern in studying mathematics — reading the book often let me think I knew the stuff, but it was doing the problems, over and over, where I really learned. The 'book learning' may have taught me the concepts, but not in a way that I could really use the tools.
I still lose more games than I win, but as long as I've slept and eaten recently, I don't lose because of head-slapping tactical errors. Better still, I am recognized in the club as someone who will punish my opponents' tactical mistakes brutally.
Chess players study tactics by means of puzzles. A typical tactical puzzle has a picture of a chessboard with a few pieces, and a single sentence problem description: "white to play and win," or something similar. Tactical puzzles are not generally taken directly from actual games. Instead, they are simplified or idealized from real games — pieces which aren't relevant to the problem are removed, for example, and the real game elements of time pressure and personality and so on are completely removed.
He extends this to programming, saying:
"Yes. I believe the equivalent is reviewing code on paper. Not necessarily literally on paper, but in isolation from the compiler, the schedule, the politics and everything else that comes with a professional software project. Just the code. Unlike chess, the problem description is always the same: "how can this code be improved." Ignore the big picture, the product, and high-level software objectives such as performance, portability or reusability. Just think literally: how can the *CODE* be improved, from the perspective of a human reader.
And just like the chess player solving a tactical puzzle, arm yourself! In chess, there are names for all of the common tactical patterns. Fork, pin, skewer. Discovered check, driving off, piling on. The list goes on a bit, but it's finite. In the same way, there is a jargon for describing code quality in much more concrete terms than the all-too-common (and uselessly vague) "bad smell". There are the basics: use meaningful names, keep it simple stupid (KISS), don't repeat yourself (DRY), be consistent, and so on. Then there are more complex quality issues like cohesion, coupling, information hiding, referential integrity and separation of concerns. Challenge yourself to use the correct terminology, both in describing problems and in suggesting improvements."
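In that spirit, here is a made-up 'tactical puzzle' (my own illustration, not one of DLowe's) with its solution, applying the named tactics of meaningful names and intent-revealing structure:

```python
# Puzzle: "how can this code be improved?"
def f(l):
    r = []
    for i in range(len(l)):
        if l[i] % 2 == 0:
            r.append(l[i] * l[i])
    return r

# Solution: meaningful names, no index juggling, intent stated
# directly in one expression. Same behavior, clearer to a human reader.
def squares_of_evens(numbers):
    return [n * n for n in numbers if n % 2 == 0]

print(squares_of_evens([1, 2, 3, 4]))  # [4, 16]
```

As with chess puzzles, the point is not that the original is broken (it works), but that drilling the transformation makes you see it instantly in real code.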
garyebickford writes "Palm has just sent out a letter to its newsletter list, announcing the cancellation of the Foleo. The notice is also posted to their Blog.
"As many of you are aware, we are in the process of building our next generation software platform. We are very excited about how this is coming together. It has a modern, flexible UI, instant performance, and an incredibly simple and elegant development environment. We are working hard on this platform and on the first smartphones that will take advantage of it.
In the course of the past several months, it has become clear that the right path for Palm is to offer a single, consistent user experience around this new platform design and a single focus for our platform development efforts. To that end, and after careful deliberation, I have decided to cancel the Foleo mobile companion product in its current configuration and focus all our energies on delivering our next generation platform and the first smartphones that will bring this platform to market. We will, of course, continue to deliver products in partnership with Microsoft on the Windows Mobile platform, but from our internal platform development perspective, we will focus on only one."
According to the email, 'Foleo II' will be based on the new platform (which I surmise is the Linux-based PDA platform).
They also say that this cancellation will cause a $10 million charge-off — 'a lot of money, but less than the costs of supporting two platforms' — and offer both thanks and apologies to developers and potential customers."
PatchGuard, a Microsoft technology to protect key parts of Windows, will be hacked sooner rather than later, a security expert said Thursday. Hackers will break through the protection mechanism soon after Microsoft releases Windows Vista..."
Is Linux, and every other operating system, potentially just as vulnerable to hacking of the kernel as Windows Vista? I'm a userspace geek, and I don't know much about Linux kernel protection, but I am concerned. What if Linux were subjected to the same level of maleficence that MS Windows has to deal with? How can the headline "Linux server compromised at Acme National Bank; 10 million customers' data stolen" be prevented?
I worry that many in the Linux community, and perhaps the entire non-MS community, may be too complacent about security. It appears to me that there is a lot of 'whistling past the graveyard' going on. If nothing else, I would think that the constant barrage of attacks focused on MS Windows will eventually, if it hasn't already, result in the development of a product that is 'safer' than any other. Sooner or later Linux, MacOSX and the others will become attractive targets.
The common answers I hear about this are:
"Hackers aren't interested in Linux (or the other *nixes), because it's too hard to hack them."
"There aren't enough Linux boxes to interest crackers."
"The Linux user model is inherently more secure. You have to be root to get to the kernel."
While only using 'trusted sites' helps, that is a very thin and porous barrier, even when it's possible. Many applications I use are not part of any distribution, and we commonly download new 'toys' from various web sites. A hidden rootkit could even be surreptitiously inserted into a packaged or commercially-supported application such as Flash, RealPlayer, JavaVM, OpenOffice.org or Firefox by a member of the development team — or spread among several applications and only become operational when all are installed and one or more of them is actually run. Ken Thompson's 'Reflections on Trusting Trust' shows how hard this is to prevent.
Perhaps the 'kernel system' needs the capability to discover, identify and correct compromises to its own health and security, some means of dynamic protection from attacks, even those that may come from a supposedly 'legit' software installation. While security in toto depends on protections at all levels from social engineering and user interfaces to the kernel, (how) can the kernel be protected?"
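One user-space piece of that self-discovery capability is an integrity baseline, which is what tools like Tripwire and AIDE provide. A minimal sketch follows, with the caveat the post itself raises: a kernel-level compromise can lie to any user-space checker, so this detects tampering rather than prevents it.

```python
import hashlib

def _digest(path):
    """SHA-256 of a file's current contents."""
    with open(path, "rb") as fh:
        return hashlib.sha256(fh.read()).hexdigest()

def snapshot(paths):
    """Record a hash baseline for a set of files, e.g. right after a
    trusted install, ideally stored off-host."""
    return {p: _digest(p) for p in paths}

def changed(paths, baseline):
    """Return the files whose current hash no longer matches the
    recorded baseline."""
    return [p for p in paths if _digest(p) != baseline[p]]
```

A baseline taken at install time and checked periodically at least turns 'triggered some time in the future' into something discoverable after the fact.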