top Back To Faxes: Doctors Can't Exchange Digital Medical Records
My experience from studying Medical Informatics is that the field had no idea how to create an ecosystem. Firstly, it was wrongly insistent on the need for everything to be coded. Take a look at things like SNOMED and LOINC as examples.
HL7 is a completely over-engineered mess, and its standards process is driven by too many doctors and other health professionals and way too few computer scientists. It tries to capture the process of health care as a protocol. Completely wrongheaded. By the way, I worked on the UML 2.0 standards committee, which I think is reasonable by comparison to HL7, a major user of UML. Let that sink in.
HIPAA also has completely outdated and overly complex requirements. It was well intended, but it needs replacement. The law standardized technology rather than requirements, and that's a mistake.
Epic is a total mess. A local hospital system in my state adopted it and (surprise) it was horribly over budget, and there are still issues. And it's legacy code out of the box: it's all based on MUMPS with bits and pieces hacked on top.
Overall, the main problem is insisting that the problem be solved all at once rather than step by step. Step one: establish a system of identification for health providers and patients. This includes a way to establish a patient's identity from known data while providing a high level of confidence that the requester of the information is a health provider. Solve this, and then you can start talking about interchange. And start simple. Forget highly coded documents. Exchange vital history, procedure history, the problem list, and notes. That's it. Then move forward based on actual user demand.
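To make that concrete, a first pass at an interchange payload could be as dumb as this. A sketch of my own with made-up field names, not HL7, FHIR, or any existing standard:

    # A minimal, uncoded interchange record. My own sketch with
    # illustrative field names, not HL7/FHIR or any real standard.
    import json

    record = {
        "patient_id": "resolved-patient-id",      # from step one above
        "provider_id": "verified-provider-id",    # likewise
        "vitals": [
            {"date": "2014-09-20", "type": "blood_pressure", "value": "120/80"},
        ],
        "procedures": [
            {"date": "2013-05-02", "description": "appendectomy"},
        ],
        "problem_list": ["hypertension"],
        "notes": ["No complications reported since surgery."],
    }

    print(json.dumps(record, indent=2))  # free text where free text is natural

Free text where free text is natural, structure only where it pays for itself. The coding can come later, driven by demand.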
Frankly, Clinton had the right idea with the national health ID. If we could create an ID that everybody had and that was used only for medical identification, that would be great. But I doubt that'll happen, so we'll be stuck with a huge data deduplication problem.
It's not easy, but it's more doable than people think. And heck, open source as a means of standardization is a fine part of this equation that is completely ignored.
top HP Introduces Sub-$100 Windows Tablet
A fair point. I was thinking the Android price point was more around $69-79. Clearly I haven't been shopping extensively.
top Microsoft's Asimov System To Monitor Users' Machines In Real Time
Sure, it's fine to be skeptical, but it's easy to verify (or not). You don't think Windows has a big enough market that people will analyze every bit of traffic that comes out of the next OS?
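For instance, anyone who cares could put a preview build behind a box they control and just watch the wire. A rough sketch with scapy, assuming a gateway or mirror port whose traffic you're allowed to capture, plus root privileges:

    # Rough sketch: log every host a test machine talks to.
    # Assumes scapy is installed, root privileges, and a capture
    # point (gateway/mirror port) you are allowed to monitor.
    from scapy.all import sniff, IP

    def log_destination(pkt):
        if IP in pkt:
            print(pkt[IP].dst)

    # Opt out of the "customer experience" program on the test box
    # first, then see what still phones home.
    sniff(filter="tcp", prn=log_destination, count=500)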
Plenty of programs have had that customer experience improvement program opt-in for a while. I haven't seen anything suggesting that you can't really opt out of it, that is, that data gets sent anyway. I'm sure that if somebody found evidence of that, we'd hear about it instantly.
Sure, it may be required as part of installing the technical previews (though even that's not clear). How it will work in the release, who knows. I agree the best move would be to not have it at all in the RC or RTM builds, and that's not impossible or even unlikely.
top Adobe Photoshop Is Coming To Linux, Through Chromebooks
This isn't a port. It's streaming the application. It is actually running on their cloud, so you could do the same on Linux, Windows, whatever.
This is just another part of them moving to a cloud-based model. No big deal.
top HP Introduces Sub-$100 Windows Tablet
I guess Microsoft's plan to charge nothing for small-screen form factors is having a bit of an effect. Even 20 bucks would make a significant difference at this price point. At that price, there'd be enough buyers that someone will see if you can get a Linux distro on it, and it's close enough to cheap Android levels.
For me, it's cool because I'm more versed in Windows development, and since it's full Windows, I can easily install whatever the heck I want on it (no developer unlock, etc., etc.). Save up, get a few, and just have them around the house.
top New Research Casts Doubt On the "10,000 Hour Rule" of Expertise
The way the rule is stated and repeated in modern culture is a vast oversimplification, so a critique is fine. As some have noted, the original argument was also about the "ability and drive" to put in the 10,000 hours. Certainly, individual factors play a role. The only reason this is controversial is that people try to apply it to certain populations, for which there is no evidence at all (in fact, plenty to the contrary). The article itself notes this.
But it does raise a question: are there skills that require innate abilities to truly master, and if so, what are they and how do they differ from those that don't? There is evidence to suggest the former is true.
This rule is often linked to how to be successful, but the studies have all been on skills with no direct link to financial success. Brilliant musicians don't get paid well by default. Chess players aren't sports stars. Artists struggle.
I am curious whether programming is a skill that requires an innate mindset to truly master (I do believe such skills exist), or if it is just a skill that demands disciplined practice. I've seen no evidence either way, so anything would be speculation on my part.
top PostgreSQL Outperforms MongoDB In New Round of Tests
Yeah, that was great, but Redis actually did turn into a useful tool compared to Memcached.
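The difference is that Memcached is a cache, full stop, while Redis gives you actual data structures over the wire. A quick sketch with the redis-py client, assuming a Redis server on localhost:

    # Quick sketch, assuming a local Redis server and the redis-py client.
    import redis

    r = redis.Redis()
    r.set("page:/home", "<html>...</html>", ex=60)     # Memcached-style caching
    r.zadd("leaderboard", {"alice": 310, "bob": 250})  # a real sorted set
    print(r.zrange("leaderboard", 0, -1, desc=True, withscores=True))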
top Ask Slashdot: Finding a Job After Completing Computer Science Ph.D?
There's a sad lack of proper work for PhDs in our field. I'm in the same boat, though I'm working now as a contractor.
Sure, people say that there is a glut on the market, but nobody notes that this is due to drastic cuts in research funding at all levels. Maybe that'll change and there will be more research and academic positions.
As a practical matter, I disagree with leaving your PhD off your resume. You'll have a large gap to explain (what did you do all those years?), and it's not hard to find out that you have a doctorate.
The best thing to do is explain that a PhD is one of the best demonstrations that you are self-motivated and able to work on a problem diligently and independently, and that is valuable to any employer. Then get out there and try to find an employer that gets that (in other words, one worth working for). That's hard, but that's what it'll have to be.
I'm seriously considering taking a hefty pay cut to get a postdoc, because I do miss working on actually interesting problems. Don't discount that option either.
top The Site That Teaches You To Code Well Enough To Get a Job
Sure, it looks like it, and there are plenty of people with jobs out there who can lash something together. I worked with somebody at a startup who was struggling to get a web page working. After a few minutes, I realized the problem: she had no idea that you could loop through an array backwards.
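For the record, the entire "trick" is a couple of lines in any language. In Python, for brevity:

    # The whole missing concept: iterating an array in reverse.
    items = ["a", "b", "c"]
    for item in reversed(items):              # idiomatic
        print(item)
    for i in range(len(items) - 1, -1, -1):   # index-based, works anywhere
        print(items[i])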
We don't need more "coders". We need more software engineers and computer scientists.
Actually, maybe not. Maybe we need a workforce that is organized and that would stand against employers who insist on completely devaluing our field in a search for easy money, tossing aside qualified people in search of exploitable labor. That's the problem. I think we should be defending our industry and those who have the proper skills to do it well. Just because the latest, most visible trend is hacking together a mobile application or web site for a quick buck doesn't change the need for fundamentals.
Things like data structures, algorithms, discrete mathematics, computer architecture, etc. do matter. Not having a basic understanding of computers and computation leads to an astonishing number of bugs, security holes, and wasted effort. Some people have just accepted this as the cost of doing business. I say it's past time we stood up and said: no, things should be better. But since we can't collectively bargain, we're stuck.
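A trivial illustration of the kind of thing I mean (my own example, not from the article):

    # Knowing your data structures turns an O(n) scan per membership
    # test into an O(1) average hash lookup.
    emails = [f"user{i}@example.com" for i in range(1_000_000)]
    as_set = set(emails)

    print("user999999@example.com" in emails)   # O(n): walks the whole list
    print("user999999@example.com" in as_set)   # O(1) average: one hash

Multiply that by a request loop and it's the difference between a service that scales and one that falls over.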
I know, who cares, the money is awesome. It'll be like that forever, right? What does it matter that nobody can count on having a career after ten years, because by then they're seen as too old, with an outdated skill set?
This isn't about schooling, although I think a proper CS education is still the best way to learn this stuff; you can also get it through diligent self-study and experience. In the end, real programmers have the conceptual understanding to adapt and excel over the long term. That's what we need more of. Real careers, not just jobs.
top Researchers Propose a Revocable Identity-Based Encryption Scheme
To address the summary: the difficulty is in proving certain security properties, as current models don't fit the assumptions that RIBE schemes use. In practice, it could be fine.
The article seems to propose a step forward in a scheme for managing the keys, combining two previously proposed methods in a novel way. I can't judge whether this is indeed an advance, as I am not familiar with this domain. The main claimed advance is that the public parameters stay constant in size. This suggests that in other schemes the public information kept growing with the number of users and issued keys, a scaling problem that limited practical, widespread application. Again, I can't judge whether this is correct.
But, as noted, this does require a trusted third party to ultimately decide if a key is valid. Also, a lot of the work seems to be temporal: the identity is combined with a timespan to create a key that is only usable for a given period of time.
It's an interesting idea overall. It sidesteps the public key distribution problem by making the key the very address you communicate through (for example, sending an encrypted email in which the key is the email address itself).
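Roughly, the moving parts look like this. This is a toy model of my own, not the paper's scheme and not real cryptography; in an actual scheme (e.g. Boneh-Franklin) the sender encrypts with only the public parameters plus the identity string, via pairing-based math that the authority object below merely stands in for:

    # Toy model of identity + time-period keys. NOT real cryptography.
    import hashlib, hmac

    class Authority:                    # the trusted third party
        def __init__(self, master_secret: bytes):
            self._msk = master_secret

        def derive_key(self, identity: str, period: str) -> bytes:
            # identity + time period -> key; revocation = simply stop
            # issuing keys for that identity in future periods.
            msg = f"{identity}|{period}".encode()
            return hmac.new(self._msk, msg, hashlib.sha256).digest()

    def xor_crypt(key: bytes, data: bytes) -> bytes:
        # Placeholder cipher: XOR keystream. Do not use for anything real.
        stream = (key * (len(data) // len(key) + 1))[: len(data)]
        return bytes(a ^ b for a, b in zip(data, stream))

    pkg = Authority(b"master secret known only to the authority")
    k = pkg.derive_key("alice@example.com", "2014-09")  # valid this month only
    ct = xor_crypt(k, b"see you thursday")
    print(xor_crypt(k, ct))  # b'see you thursday'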
top What To Expect With Windows 9
I don't mind the start screen too much, but a proper start menu is a good step, and bringing Metro apps to the desktop is another. The library for Metro applications actually has a lot of good ideas in it, so expanding it beyond touch applications is a good idea.
The toughest part to admit is that Windows 8/8.1 came with some really noticeable kernel and userland performance improvements, and switching between Metro and the desktop is pretty smooth on all the hardware I've used. If they bring back the power-user desktop functionality, it's a good start back.
top Willow Garage Founder Scott Hassan Aims To Build a Startup Village
While there are many things about startups that are attractive, in the end it's just a job, not a lifestyle. It's best to work to live, not live to work. These efforts to create all-inclusive environments for programmers will just lead to burnout when the bubble pops. And yes, it is a bubble. We don't need yet another mobile, social-enabled whatsit pieced together quickly.
If this were an environment for creating new formal verification tools or other revolutionary software tooling, then I'd be interested. Right now, it seems we are going a bit backwards. It's harder to create a nice UI on the web than it was on the desktop more than ten years ago. These last few years are the first time my job has been getting harder: for the longest time, editors got better, debuggers got better, frameworks got better, and there were more tools for the job than before. Now there are no real commercial breakthroughs in static analysis, security, formal verification, or domain-specific languages. It's all just mobile apps with no depth. Sure, this has driven some useful new stuff (say, Hadoop), but when big data is just for marketing and ads, what's the point?
top Web Trolls Winning As Incivility Increases
"There is no free speech without anonymity"
I can't disagree more. All anonymity does is protect people from consequences and from having to stand by their opinions. In certain cases this is useful (a protected source for a news story), but in most cases, if you can't say something without the cloak of anonymity, you shouldn't say it. In a lot of cases, it's just cowardly.
The First Amendment established the right to free speech, not the right to be free of the consequences of that speech. And if you are worried about being monetized, create your own forum.
top seL4 Verified Microkernel Now Open Source
I feel an explanation may help. This project is based on a formal specification (in Isabelle/HOL) of what should be true of the microkernel. The specification rules out things like buffer overflows, null pointer dereferences, and other such defects by recasting them in terms of higher-order logic, and theorem proving tools check the proofs that the implementation matches the specification.
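To give the flavor in miniature, here's a toy of my own in Lean 4 rather than Isabelle/HOL, and nothing like the scale of the actual seL4 proofs: you state a property, and the file does not compile unless the proof actually holds.

    -- Toy flavor only (Lean 4; seL4's proofs are in Isabelle/HOL):
    -- a machine-checked guarantee that an operation can never exceed
    -- its cap, the way the seL4 spec rules out classes of behavior.
    def addSat (a b cap : Nat) : Nat :=
      if a + b ≤ cap then a + b else cap

    theorem addSat_le (a b cap : Nat) : addSat a b cap ≤ cap := by
      unfold addSat
      split
      next h => exact h
      next   => exact Nat.le_refl cap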
There's even a binary-verified version for ARM, so you don't even have to trust that your compiler works (and there is progress in verified compilers, so hopefully an x86_64 version is on the way). The value here is in using the tool chain to create new, formally specified abstractions, implementing them in a verified manner, and building more secure, robust programs on top of this kernel. Of course, the microkernel makes assumptions about the hardware and boot loader, but formal verification is used far more often in hardware anyway, and you have to trust something at some point.
This opens a whole set of possibilities to the community. As a random example, you could formalize the Arduino language (or a kernel of that language) and create a verified version of that system running on this microkernel. It would be a big effort, but you could do it.
Overall, this is a positive step toward lowering the cost of verified co-designed systems, and I hope it attracts more interest in formal verification of software.
top Ode To Sound Blaster: Are Discrete Audio Cards Still Worth the Investment?
There are plenty of external boxes at that price range that allow for more options for recording and output. There are good 2x2 boxes out there for even less.
If you are working in audio, you are using different kit. If you are an audiophile, you are probably just using the digital output into an amp anyway.
top HP Unveils 'The Machine,' a New Computer Architecture
While HP Labs may not be what it was, it is good to see that HP finally has a CEO who will give it the funding it needs to go after big ideas. We need more research and development funding, period. The government needs to increase funding for the NSF and other organizations. And yes, big companies need to start making long-term investments. Microsoft Research is growing, and it seems HP Labs is growing again.
Let's hope other big players step up too. I'm tired of money being thrown at yet another mobile application and having that held up as a paragon of innovation. People are being critical of HP investing in this while Facebook throws $19B of assets at a messaging application? What's wrong with this picture?
top Xanadu Software Released After 54 Years In the Making
The more I read about Ted Nelson and the story behind Xanadu, the more there is to learn. Firstly, what an extreme example of becoming too enmeshed with ideas (worse, with ideas about ideas). His drive to index everything seems driven by his extreme case of ADD. But not every thread of thought needs to be catalogued and indexed, something that is harder to remember in the age of social media.
But mercilessly tracing connections between ideas can truly be a madman's folly. The crux of scholarship is not obsessively tracking down references and sources, but steadfastly ignoring side roads and making your point. It is not jumping from source to source endlessly in search of absolute truth.
While these ideas sound awesome to the ADD side of me, in the end they are a distraction. Attention is a necessity, because it allows us to selectively ignore things instead of slavishly following every random whim in our heads.
Seriously, this story seems like something straight out of an Umberto Eco novel. And it's sad, because it is really way too late for this to matter.
top Apple Announces New Programming Language Called Swift
Agreed completely. What wasn't clear from my comment is that I don't think Apple really thought some of these decisions through to the level needed, and they ended up with weirdness that doesn't need to be there.
top Apple Announces New Programming Language Called Swift
Proving again that language design is just plain hard. It's fine to make decisions and compromises if they are really well thought out ones.
top Don't Be a Server Hugger! (Video)
I've seen two reactions to cloud-based systems among people I've worked with.
The first is always talking about the negatives: downtime, how awful vendors are to work with, how much better they themselves are, and how nobody really understands the business like they do.
The second just notes that some of the cloud offerings help them update existing infrastructure to be more useful and flexible, and that they are happy to work on making the existing datacenters better and to use off-premise vendors when it makes sense. If you call that a private or hybrid cloud, that's fine by them.
Oddly, the people in the first group who dismiss it outright are the same ones who seem to think it needs to take a month to provision a new server, who insist they must have absolute control over every one of those servers and every service, and who treat giving anybody else the ability to spin one up (even for development or testing) as insanity.
The first group also seems to take down a server for maintenance every weekend or so for a few hours, can't add an account to anything in less than four days, and doesn't know half the services they actually run, the other half being half-baked home-grown monsters. Mail and the network just go down, they're still running a six-year-old web proxy, and they ban Pandora and Spotify "because security."
The second group I barely notice, because stuff just works. When I ask about self-service VMs, they point me to the project they are testing right now while waiting on management approval. They're okay with me streaming some tunes because they've got that traffic prioritized correctly, and they're okay with me installing stuff because they are proactive about detecting threats and don't blindly trust a machine just because it is in the building.
TL;DR: I agree. The sysadmins who don't care where the machines are (only about what they do) don't need to worry. Those who do, do need to worry.