
Comments


Bash To Require Further Patching, As More Shellshock Holes Found

TemporalBeing Re:Nothing to do with language (307 comments)

So brilliant programming languages do not permit eval($ENV["FOO"])?

Correct, because good programming languages don't have anything like eval().

Normally, in a decent programming language, if you're convinced you really need to execute unknown-until-runtime code, the first step is to get over your misconception. If you're unable to get over the misconception, then you do something like fopen('tmp.c'); fwrite... fclose, system('cc variousflags tmp.c'); system('tmp'); and then you spend the next few weeks worrying about how awful what you did was, and rethink your unnecessary "need" to run generated code.

eval() is not C or C++. It's Bourne Shell, and it's a necessary evil for a Shell language to have. That said:

So what again was your point?
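As for eval() being a necessary evil in the shell: one legitimate (if perilous) use is variable indirection, which plain Bourne shell has no other syntax for (bash's ${!name} came later). A sketch, with illustrative variable names:

```shell
#!/bin/sh
# Look up a variable whose name is held in another variable.
name="greeting"
greeting="hello"
eval "value=\$$name"   # expands to: value=$greeting
echo "$value"          # prints: hello
```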

12 hours ago

Bash To Require Further Patching, As More Shellshock Holes Found

TemporalBeing Re:Soon to be patched (307 comments)

What makes you think Windows doesn't have problems like this?

They did. But it has been a long time since the last vulnerability on this scale. Following the embarrassing Nimda and Code Red worms (and many vulnerabilities in IIS), Microsoft started its "security push". The central part of that is the Security Development Lifecycle (SDL), which is a collection of processes, methodologies, tooling, mandatory education, guidance, and mandatory threat modelling, reviews, and auditing.

OSS has an SDL too, primarily through the very extensive use by many projects of code testing (e.g. autotest), Coverity scanning (provided free to OSS projects), use of Valgrind, and more. OSS projects generally have far more testing involved than closed projects, in part because people report bugs and those bugs become tests in the project's test suite.

And with companies like Red Hat and Coverity providing secure-coding analysis, and distributions (Red Hat, Debian, etc.) contributing fixes, it probably gets done more often.

But let's also not forget that in OSS, patches tend to stay in the code; whereas Microsoft has a long history of applying a fix in one patch, undoing it in the next, and having to repatch over and over again (see the WMF bugs, repatched from Windows NT4 through Windows 7, if not Windows 8, i.e. after SDL was implemented).

The difference is that being open source third parties can review the code and find problems. There is no way to keep them secret and from the public.

That's all fine and dandy. Only, these bugs (the original Shellshock and the later ones) have existed for 22+ years! During all that time, nobody (we hope) "reviewed the code and found problems". So, if there were any third parties looking at the source, they failed miserably (or sold exploit information on the black market).

Look, there have been bugs found in old MS code as well. A few years back there was a vulnerability in the old DOS emulation code.

It is time to let the myth of the many eyes die. The community is not going to help you by reviewing code unless you *pay* them to do so. It is the most boring discipline of developing code, and nobody does it out of interest.

A company like Microsoft can *pay* people to review and audit code. A big part of SDL is exactly those supporting roles and checks/gates. The open source community must wake up and set up foundations OpenSSL style and start asking those who reap the biggest benefits for some funding.

Companies like Coverity provide it for free to OSS projects, and companies like Red Hat and Canonical pay for it as well. The OSS community has the practices and reviews in place on many projects; there are a few, like OpenSSL, that have slipped through due to the structure of the teams surrounding them. With OpenSSL, many probably assumed that since it was FIPS certified, things were being done properly in the OSS branch too; Heartbleed revealed that assumption to be false, and now we have LibreSSL as a result, where the usual norms for OSS projects are being applied.

Also, fixes were pushed out within hours of notification.

Do you really want to go there, given the incomplete patches and host of related problems which could have been found had the maintainers taken more time?

Part of SDL at Microsoft is exactly a process where, when a vulnerability has been reported, they must take time to analyze whether there are related or similar vulnerabilities and what impact a patch could have. On top of that, they have a gigantic test farm where they test for compatibility with a huge number of popular software applications.

Essentially, what Microsoft does *internally*, prior to releasing information on a bug, is what for bash now takes place *externally* (by outside security researchers) and *after* the vulnerability info was released.

Look at it this way - projects release the fixes very quickly, vendors verify the fixes are sufficient and collaborate with the projects to make sure they are. In this case, Bash fixed the original ShellShock bug very quickly; and then another, similar bug was found. You can expect it to be patched quickly as well. This happens regardless of whether there are active exploits, theoretical or real.

Now look at the Microsoft path: they won't even start on fixing it until it has been verified as being actively exploited. Until then, they don't disclose that an issue was reported, nor do they fix it. It just gets listed in their long list of "yeah, we know about it, but we're not going to tell anyone or do anything about it because no one is using it against anyone yet".

Look at it this way. BASH has had this problem evidently for years and there haven't been any exploits. It was discovered by researchers analyzing the code. In an MSoft world, where nobody has access to the code but MSoft, the public finds out about security holes after they have been exploited.

No no no no. This bash problem was discovered by someone trying to see if you could pass a lambda (an anonymous function) from a bash shell instance to a subshell. He then noticed some weirdness and investigated.

After the bug has become known, security "researchers" homed in on the bash interpreter. Still from *the outside* (i.e. NOT looking at the source code), more vulnerabilities were found (see Tavis Ormandy's tweets).

The easiest way to find these bugs remains to just play around with bash and try to throw it off with weird syntax. And that is how these bugs are being found.
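The "lambda to a subshell" mechanism that led to the discovery can be demonstrated directly: bash's export -f serializes a function body into the environment, and a child bash re-parses it on startup - the parsing step Shellshock exploited. A sketch (the function name is illustrative; requires bash):

```shell
#!/bin/sh
# Export a function from one bash instance to a child bash instance.
bash -c '
  greet() { echo "hello from an exported function"; }
  export -f greet
  bash -c greet     # the child re-creates greet from the environment
'
```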

There is absolutely no evidence that having open source code makes the product more or less secure. To be honest, only the most obvious bugs are ever found by inspecting the code - which tend to be the same class of bugs that would be found with just some cursory testing.

No, the quality of the code is impacted by the quality assurance processes that surround the development process, such as testing, threat modelling, security audits, tooling, guidance etc.

Having the source code allows projects like Coverity to do security testing. It does help in many ways; but as with anything there are still limits, as you can only test for things that you know about, and 20 years ago lambda functions in C/C++ were unheard of - for C++ they didn't exist until C++11.

Yes, there is still a lot of stuff like you describe; but often when people find those things they can also determine why they are happening like that and file more helpful bug reports or even include patches with the filing up front. It's all a matter of how much effort the finder wants to put in, and whether they can read the code for that particular project sufficiently to determine what they are finding (not all do).

12 hours ago

Bash To Require Further Patching, As More Shellshock Holes Found

TemporalBeing Re:Soon to be patched (307 comments)

The Shellshock bug is from 1992.

Please quote the bug report and date filed.

12 hours ago

Bash To Require Further Patching, As More Shellshock Holes Found

TemporalBeing Re:Soon to be patched (307 comments)

Not to mention RedHat is not any kind of savings vs. Microsoft..

Have you included hardware costs in your equation?
Windows Server more often than not needs beefier hardware to provide the same level of service.

Certainly does.

Maybe that has changed with Windows Core. But I haven't used it yet, and using it instead of the GUI costs you the ease of administration that Windows seems to offer over Linux.

MinWin could certainly run on nearly anything that runs Linux where the two are available on the same processor; but you can't get MinWin as it was simply an internal project for Microsoft that they never made a public release of.

Server Core, though, was meant to compete with Linux systems. However (last I knew), they severely limited it by allowing only one "Server Role" per Server Core system. That is, a Windows Server Core system could have the "File Server" role installed, but then it cannot also take on other roles - Active Directory, Exchange, etc. And they did the dumbest thing of putting the command line in the GUI - well, not so dumb, since they have really horrific command-line tools, so it's nearly impossible to be effective and efficient on the command line.

(Yes, you can do a lot on the command line, but only after a lot of development effort to do so. Good luck troubleshooting on just the command line.)

12 hours ago

Bash To Require Further Patching, As More Shellshock Holes Found

TemporalBeing Re:Soon to be patched (307 comments)

I think the GP is referring to the business precept that it isn't the amount of shit rolling downhill that matters, but rather where it stops.

That's why you see companies doing seemingly silly things like purchasing manufacturers' support contracts for obscene amounts of money when the same support could be provided in-house for less than a tenth the cost. The reason is that if all hell breaks loose, if you have a support contract(*) you can shift the blame onto the manufacturer, but if you're doing it in-house and it breaks, you only have yourself to blame.

So if you have a Microsoft machine, and it has a severe remote vulnerability, you can say "Hey, it's not our fault! We're using software provided for this purpose by one of the biggest software companies in the world. If they didn't know it was broken, what chance did we have?" Whereas if you're using a system which was put together in-house, especially if such a system was billed as a "cost-cutting option" in opposition to a CYA Microsoft option, the blame stops at the in-house people who put it together. Offloading the blame to a nebulous group of people on the Internet to whom you paid absolutely no money tends not to work.

*) I know Microsoft typically doesn't have a support contract with similar monetary disaster penalties, but the same general pass-the-blame approach applies, even if no money changes hands because of it.

Which is where Canonical, Red Hat, and SuSE step in - to provide those CYA agreements.

12 hours ago

Bash To Require Further Patching, As More Shellshock Holes Found

TemporalBeing Re:Soon to be patched (307 comments)

Lol.

You pay for Software Assurance and yet their release cycles seem to push past the edge of the 3 year SA agreements in so many cases, requiring ongoing renewal of SA in order to not lose it.

They took notes from Cisco's playbook on how to long term fuck their clients.

They started that policy when Windows XP (and Server 2003) were released, and lost a third of their customers when they didn't deliver a new release within the 3-year window. Yes, some of those customers got new policies later, but that policy change also drove many to evaluate Linux as an alternative (well documented at the time in the news), and some to switch when they found out they didn't really need Windows or Microsoft.

12 hours ago

Ask Slashdot: Is Reporting Still Relevant?

TemporalBeing Re:static versus dynamic, access & post proces (178 comments)

Even the manager above you wants a report they can easily edit, merge, and send up the chain; it makes reporting on multiple projects a lot easier to do.

Just open the region selection tool, draw a rectangle around the parts of the data you want, Control + X, and then Control +V the picture into your report... . easy as pie ^_^

Or print it out, and they can grab scissors and cut out the data and paste it onto their report.

There are a lot of legit ways to handle this one.

While those may work, they are neither desirable nor sufficient for the kinds of reports that management uses.

13 hours ago

Ask Slashdot: Is Reporting Still Relevant?

TemporalBeing Re:static versus dynamic, access & post proces (178 comments)

It's not like they're going to turn to page 36 of some report, and start asking us what's the deal about latency increasing from 5 milliseconds to 10 milliseconds.

The CEO may not, but your boss's boss might; or your boss might condense that 100-page report to 50 pages for his boss, who might make it 5 pages for their boss, which might become an excerpt in the report that the CEO does read.

And that 5 ms jump to 10 ms might very well make it into a report to the CEO if it caused a major issue for the organization, especially an organization that is solely centered around providing IT services to other organizations.

13 hours ago

Ask Slashdot: Is Reporting Still Relevant?

TemporalBeing Re:Dashboards are not Reports (178 comments)

Exactly; not to mention that the various levels of management in an organization will summarize reports from those under them for those above them.

4 days ago

Ask Slashdot: Is Reporting Still Relevant?

TemporalBeing Re:static versus dynamic, access & post proces (178 comments)

A screenshot of a report is a poor substitute for an Excel or PDF report where you can copy and paste the data.

This is where picatext or other OCR software comes in handy.

Also... in principle, you could make or use screenshot software which also captures the text from the window shown.

Neither of which your CEO, President, VP(s), Sr. VPs, or Directors want to do or learn, nor should they have to. That's what underlings are for, and they don't want their secretaries doing it either (should they have one). Their time is too valuable to the corporation for that, and yes, they will all pull data from that report to make another report.

Even the manager above you wants a report they can easily edit, merge, and send up the chain; it makes reporting on multiple projects a lot easier to do.

4 days ago

Microsoft On US Immigration: It's Our Way Or the Canadian Highway

TemporalBeing Re:Fine! (363 comments)

Implying that "doing good" and "making money" are mutually exclusive... I believe this to be a false assumption.

The problem is you cannot use a charitable non-profit to promote a for-profit business, and that is exactly what the Bill and Melinda Gates Foundation does, tying grants to using Microsoft products, etc.

4 days ago

Ask Slashdot: How To Keep Students' Passwords Secure?

TemporalBeing Re:password manager (191 comments)

If they are using iPads with the latest version of iOS 8, they can just save the passwords using the keychain in safari with autofill (only works if a site is HTTPS, however)

So long as it can be backed up, that is fine. But you need to have a backup for safety in case something happens to that particular iPad or Chromebook. That will in part depend on the web browser being used - whether it uses its own credential store or the system's, and, if its own, whether that store gets included in the backups.

But yes, I would highly recommend using a password manager and teaching the kid how to use it properly, possibly even setting up a master password for it that only you (and those you authorize) and the kid know, with the kid instructed not to provide it to anyone without your permission, even their teacher.

4 days ago

Ask Slashdot: Who Should Pay Costs To Attend Conferences?

TemporalBeing It's negotiable... (182 comments)

Yes, the employer should provide training; but they also have to prioritize what training is important to them, and that might not line up with what is important to you. So it's a negotiation. It seems like they've been considerate for local conferences, and that's great. But you can't expect more without negotiating and proving to them the benefit of it.

In the end, I hold to one rule: if my employer is paying for it, then I'm representing them at the conference. Their name is on the registration, etc. If I am paying for it, then I'm a free agent during the conference and I'm representing myself and myself alone. If it's somewhere in between, well... best to play it safe and act like you're representing your employer, because they'll probably think of it that way.

So, if you really want to go, then go on your own dime, as a free agent. And make it clear to them that that will be the case unless they want to pony up at least some of the money for the conference. Who knows, maybe there is some other opportunity they could use as well while you're in the area, to make it more than just a conference trip. But you'll have to negotiate all of that.

about a week ago

Fork of Systemd Leads To Lightweight Uselessd

TemporalBeing Re:Let's rephrase that (468 comments)

Let's rephrase that - I don't understand what point you are making with that number unless it's about hibernate being better than it used to be on most hardware or something else I've missed.

Hibernate? What's that? My computers are either powered on, or powered off entirely. They don't sleep (S3) or hibernate (S4).

about a week ago

Fork of Systemd Leads To Lightweight Uselessd

TemporalBeing Re:kill -1 (468 comments)

On my work laptop, namely because I had to take it home for something and it would otherwise overheat in my bag:

up 10 days, 21:38, 6 users, load average: 0.33, 0.29, 0.25

On the system I actually work on:

up 71 days, 25 min, 2 users, load average: 0.00, 0.01, 0.05

I have several different systems (servers) like the above and I pretty much reboot them all at the same time when upgrades require it and I feel like rebooting the system. I probably have a few that are longer than that even.

about a week ago

Outlining Thin Linux

TemporalBeing Re:min install (221 comments)

If you want a real thin install, pick something like Gentoo and Slackware. You can build minimal installs from the kernel up. In ye olden days when I was working on pretty minimal hardware (low RAM, slow CPUs, small drives), I used to install minimum base on top of a very small kernel (only the hardware found on the machine, plus a few generic IDE drivers just in case I had to move the HD and fire it up on another computer). It's a pain in the rear, and with even low-end hardware having huge amounts of RAM and storage space, I don't bother.

The whole point of the net install version of Debian is that it installs a very base version of Linux; and then you build on top of it. If you really need some sort of unique kernel variant, most fine tuning can be done in /boot or /proc.

Debian, and Slackware - agreed.

Gentoo? Not really - unless you set up a separate build server for yourself - namely because Gentoo will bloat a bit due to build dependencies, and there's not much in the Gentoo Portage repositories that is merely binaries, since it is a source-based distribution.

That said, replace Gentoo with Arch and you're exactly right.
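On the parent's point that most kernel fine-tuning can be done via /boot or /proc rather than a rebuild: many runtime knobs live under /proc/sys (persisted via /etc/sysctl.conf). A sketch using one standard Linux tunable; values are system-dependent:

```shell
#!/bin/sh
# Read a kernel tunable at runtime; writing requires root, so it is
# shown commented out.
cat /proc/sys/vm/swappiness
# echo 10 > /proc/sys/vm/swappiness
```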

about a week ago

How Our Botched Understanding of "Science" Ruins Everything

TemporalBeing Re:TFS BS detector alert (793 comments)

Science works without even the existence of ultimate causes and absolute truth.

Well, yes - it doesn't require the existence of ultimate causes, but by nature it is very much about finding "absolute truth".
To deny that is to misunderstand what science is and what it seeks.

about a week ago

How Our Botched Understanding of "Science" Ruins Everything

TemporalBeing Re:In lost the will to live ... (793 comments)

That was the novelty of Christianity 2000 years ago-

It was already centuries old by the time Christianity started.

When Christianity (the Followers of the Way) showed up around A.D. 50, Judaism was extremely focused on (i) racism - you had to be of Jewish descent - and (ii) works - you had to keep the law, which was also reflected in social strata (e.g. Pharisees, rich vs. poor, etc.) in numerous ways (the kinds of sacrifices one could use to fulfill the law, etc.). Christianity did away with both of those, pointing only to faith in Christ as a requirement, making all equal.

Christianity also did away with the "mysteries" of religion, which is one of the reasons it spread through the rest of the Roman Empire as it did. All the religions had things that only the high priests knew; you only got to know them by climbing the "corporate ladder" (for lack of a better term) of the religion, and this is still reflected in many modern religions (Islam, Mormonism, Buddhism, etc.). Christianity, by contrast, told everything about itself to everyone who wanted to listen; there was essentially no differentiation between priest and worshipper.

And yes, Western Orthodoxy did do a lot of things to elevate those in the church hierarchy (e.g. bishops, priests, etc.) to non-human levels - wrongly so. (I'm not familiar enough with Eastern Orthodoxy to say anything there; they may have as well, but I can't say one way or another.) In part, that is human nature as people try to control others. But that is not a tenet of Christianity in any way.

about a week ago

Kickstarter Lays Down New Rules For When a Project Fails

TemporalBeing Re:All this because Clang went Clunk? (203 comments)

Regular finance account reporting of how the money is being used should be required. If you can't handle it, don't ask for money.

Such production and auditing of reports has costs and could consume a significant amount of project funds.

Nonsense. If it's a serious project, they should already have an accountant or at least some form of accounting software - and once you have that, it's pretty simple to produce a basic cash-flow report. Regardless of what your business is, tracking the financials is basic to it - if only to know whether or not you can afford that widget or software package, and because come the end of the year you have to let the IRS know. If the project doesn't have financial tracking, it's a sign to run - far and fast.

If it's a small project (not aiming to go big or anything, and I'm specifically thinking about tinkering projects that are looking for under $10k total), then yes the money for the requisite software could be substantial relative to the project costs.

If it's anything bigger, then yes, they should have to do more. Even so, the cost of a CPA to audit and maintain their books could still make up a substantial portion of the costs for projects under $50k.

So as with anything, it needs to be graduated - or maybe Kickstarter provides some of those services, e.g. a Quickbooks account and a team of CPAs to review the books and help the projects out, unless a project certifies that it can do it on its own with a reference to its hired CPA (which companies would have no problem with), in exchange for a slightly smaller pinch from Kickstarter.

about a week ago

Native Netflix Support Is Coming To Linux

TemporalBeing Re:Finally! (178 comments)

It almost seems like an accident, though. They need to move to HTML5 because Microsoft supports its technologies like high school students support their relationships.

12 years for Win XP.

However, Silverlight is already out of support; it didn't even make 3 years. I think the big thing that did it in was the same thing that MS tried to show it off with - the Olympics online broadcasting in the US. Too many restrictions, and it didn't go anywhere. NBC left it behind shortly after, and there have been zero large deployments of it since (at least anywhere near that scale).

about two weeks ago

Submissions


Linux-based GPS Units?

TemporalBeing writes  |  more than 3 years ago

TemporalBeing (803363) writes "I'm looking to get a GPS unit, in-car windshield mount, for my wife. I know there are some units on the market already that run Linux, and I'd like to lend them my support over their non-Linux brethren. However, I am quite new to looking at them, and looking over TomTom's and Garmin's websites does not provide any info on what OS they run. Android or another custom Linux is okay, and I need maps for the U.S.A. So, what do you recommend?"

The evils of CVSNT

TemporalBeing writes  |  about 4 years ago

TemporalBeing (803363) writes "CVSNT was originally a port of CVS to Windows, with some enhancements for the Windows environment. It is the backbone of projects like TortoiseCVS. This last spring, officially announced in late June, the current maintainers of CVSNT decided to make the project for-pay only as they migrate from plain CVS to their EVS system, which purports to integrate Subversion and other systems as well. In the process they have (i) closed down all mailing lists, even those advertised on their website, (ii) stopped providing binaries except through back-end channels to open source projects like TortoiseCVS so those projects can continue, (iii) cut off access to their source repositories, and (iv) done all this while saying they were advised to do so by the FSF, pointing only to a few web pages on the FSF's site. While the FSF does affirm that GPL, LGPL, and open source projects can charge for the software, I find it highly suspicious that the FSF would endorse such a move by an open source project - one that essentially makes it proprietary. What makes matters worse is that there is no tool available to move from CVSNT to standard CVS or to any other revision control system, as there are numerous "enhancements" to the RCS data backend that are specific to CVSNT, which tools like cvs2svn don't understand - and without access to the source, won't be able to understand. Additionally, since CVSNT became a more active project than the CVS project it was derived from, it has essentially become the de facto CVS version used, stranding many in CVS and subject to the whims of March Hare. Hopefully by bringing this to the attention of Slashdot, the situation can be rectified."

OpenMoko Freerunner dead?

TemporalBeing writes  |  more than 4 years ago

TemporalBeing (803363) writes "I've been looking to get an OpenMoko FreeRunner for a few months now; however, I wanted to get the A7 model as it has the buzz fix already applied. Sadly, the A7 model isn't available from OpenMoko with the 850MHz radio. I recently e-mailed OpenMoko through their contact/support address about this, asking when the 850MHz model will be available, only to get the following response:

There will not have A7 for GSM850 because we had stopped the phone development. Now we are focusing on our new product called WikiReader.

This after last September's announcements of No More OpenMoko Phone and Openmoko Phone Not Dead After All. Looks like they are really just trying to clear out the stock.

Submitter's note: Original Source is an e-mail I have. Please be kind with the original source I quote — it's the best I could do with slashdot's story submission form."



Scientists report others fake data...

TemporalBeing writes  |  more than 5 years ago

TemporalBeing (803363) writes "Scientists, at least according to the Times of London, are doing science a great injustice, as One in Seven Scientists Say Colleagues Fake Data, stating:

Around 46 percent say that they have observed fellow scientists engage in "questionable practices", such as presenting data selectively or changing the conclusions of a study in response to pressure from a funding source.

And people wonder why science is so fought over nowadays. It's interesting that only 2 percent reported having engaged in such practices themselves, though... but then, is the study author trying to justify their study? Or are they presenting the facts?"


Science and Religion...

TemporalBeing writes  |  more than 5 years ago

TemporalBeing (803363) writes "In a recent essay published by The New Republic, Jerry A. Coyne provides some insight into the funding of science and the public groups that provide it - even those professing to be of "Christian" leaning. Regis Nicoll writes a summary for Breakpoint (an online and radio broadcast originally led by Charles Colson), in which we find:

Contrary to modern criticism, the scientist who approaches the world as a product of intelligence, rather than of matter and motion, is less likely to stop short of discovery. Instead of dismissing a feature that, at first glance, appears inert, unnecessary or just plain mystifying, he is more inclined to push the envelope of investigation to unravel its function and purpose.

Comments by Breakpoint readers can be found here. (Please be kind if posting comments there; they are moderated and they don't get the volume Slashdot normally does.)"


LVM Disk Mirroring - to USB or not to USB

TemporalBeing writes  |  more than 5 years ago

TemporalBeing writes "I recently had a hard drive fail on me, and am now working my way through the recovery process. Fortunately I didn't lose much data, as the drive mostly had stuff I didn't care too much about on it... things easily recoverable by re-install. Thankfully, it was a Linux system using LVM2.

As I work through this process I am also thinking about how to keep from losing data in the future, and have decided to set up a basic mirror RAID on the system, which is relatively new - e.g. circa 2005 - and supports USB 2.0 without a problem. I am also thinking of doing the same on my home server - circa 1997/1998 - which only has USB 1.1 and is in a fully operational state, though it doesn't have LVM installed yet.

So I looked in the ads, and noticed a Western Digital MyBook Essential 500GB drive on sale this week for $89, which leads me to my question for Slashdot:

I know USB is slower than internal drives for performance. But is it slow enough that it would not be good for mirroring the internal drives as part of a software mirror implemented via LVM2? Or should I go with internal hard drives for the task?

My goal is to keep the budget down and get a mirror in place now, so that next time a hard disk fails there won't even be a question of whether data is lost - I just pop in a new disk to mirror to."

CowboyNeal for President!

TemporalBeing writes  |  more than 6 years ago

TemporalBeing writes "Given the poor choice of candidates for the President of the USA this election season, we here at Slashdot should organize our own campaign and put forth one of our own as a Candidate. I propose CowboyNeal for President. (Think we can get him to run?)

Let's have some fun and really enjoy the season."

Visualizing the Body...

TemporalBeing writes  |  more than 6 years ago

TemporalBeing (803363) writes "IEEE provides a pretty nice article on how IBM is applying technology like Google's GoogleEarth to medical electronic health records instead. From the article:

"The 3-D coordinates in the model are mapped to anatomical concepts, which serve as an index onto the electronic health record. This means that you can retrieve the information by just clicking on the relevant anatomical part. It's both 3-D navigation and a 3-D indexed map," explains Elisseeff..."You can think of it as being like Google Earth for the body," is how Elisseeff frames the mapper engine. "We see this as a way to manage the increasing complexity that will come in using computers in medicine.""


Journals


Microsoft's Real Plan?

TemporalBeing writes  |  more than 7 years ago What's Microsoft's real plan? With the advent of .Net, the Microsoft/Novell deal, the splitting of Microsoft into three major groups internally, and the impossibility of the next generation of Windows being developed the same way that Vista was, it becomes quite possible that Windows as we know it - with an NT kernel and all - is no longer the future of Windows. Just how might Microsoft survive? Check out my full blog describing Microsoft's Real Plan.

From the blog:

It has been my speculation that .Net was the start of Microsoft's plan for how they will survive in a post Windows world.

...

Imagine (for a moment) Microsoft releasing a new version of Windows - Windows NG (for Next Generation) - that does not provide any backwards compatibility whatsoever. If Microsoft did this, they would need to be able to quickly push a lot of people to support their new system; or they could ride on the shoulders of giants - existing OS's that are already out there that have a lot of software

...

then how could Microsoft use an existing OS? What would there be for them to use? Well, there are always the BSDs, but then Microsoft would have to fork and support their own - kind of like Apple did - which could be costly. Or Microsoft could choose a Linux distribution (Novell's SuSE?) and make it its primary back end, add on the extra tools to move their infrastructure over (Vista's user-mode sound and video drivers, and .Net), and a user interface to make it look like Windows

A possibility? Sure. Likely? Only time will tell.


Programming vs. Software Engineering & Why Software Is Hard

TemporalBeing writes  |  more than 7 years ago I noticed the Slashdot article on Why Software Is Hard and wrote a response in my blog. It should be a good read for any techie. The blog entry primarily talks about Software Engineering vs. Programming. Needless to say, these go hand-in-hand with why software is hard. To quote from the blog:

The key difference, however, is that the software engineer realizes that the "programming process" is just the implementation phase of creating software, and that there is a lot more to be done before the implementation phase can even begin. Comparatively, the programmer wants to just jump in and start writing code as soon as they have been handed a task, skipping the rest of the process, and possibly even ignoring any part of that process if anything from it was handed to them.

And FYI - the blog is more than just a link to the Slashdot article and its related article. It also includes links to a few postings on OSNews and its sister article, as well as some responses to a couple of the comments on that article. Needless to say, Slashdot (from what I could see) was a lot more forgiving of the original article.
