Apple To Distribute OS X Lion via the Mac App Store
Physical media as a delivery mechanism is *dying*. Apple's got a vehicle they're promoting for unified app and OS management, they're ahead of the game on this, and folks criticize them for playing catch-up? Seriously? I call B.S. on the "walled garden" noise as well. First, it's a proprietary OS, for cryin' out loud! It's reasonable to expect that the OS update will work like any other App Store app: the Lion upgrade will be installable on every Mac you own. This is *better* than the current approach, since it effectively nixes the single-user vs. family-license EULA distinction. If anything, Apple has been very aggressive about App Store pricing, so this might even be cheaper than prior DVD-based updates.
The only concern I have is how they'll handle the recovery case. But this use case is bloody obvious and the backlash for failing to handle it would be huge. I expect there's something in the works. If it's not up to snuff, then OS X users can and should complain vociferously on the 'net and to apple.com/feedback.
Apple To Distribute OS X Lion via the Mac App Store
Except that a lot of the people who make up the Apple "grass roots" are power users who are likely to balk at such a setup.
Unless Apple just uses a process similar to (and hopefully better than) the one Xcode uses in the App Store: it drops an "Install Xcode" bundle in your /Applications folder, which can be dumped off onto other volumes/media as needed. If they're clever, they'll just have an option in that installer/app to create bootable install media. Also, no one's going to want to keep a useless N GB installer on their hard drive all the time. Apple will have to deal with this somehow.
But this is the 1.0 trial of this approach. The power users will mostly just wait until the DVDs come out (which have been announced), buy those, and see how things play out. FWIW, I'd much rather have the bootable USB stick like the recent MacBook Airs come with. We've all been through the Update Wars too many times to want to be on the front lines. (Real OS X power users don't update immediately anyways -- they wait until they've shown that all of their mission-critical apps work on the new release.)
Is Canonical the Next Apple?
Umm, nonsense? Results for "snow leopard virtualbox" and "lion virtualbox" look pretty reasonable. I didn't even have to add "os x" to disambiguate.
The Tablet Debate: 3G Or Wi-Fi?
For my part, I went the WiFi-only and tethered smartphone route. I use my 3G service via a tethered device regularly, but that situation may change. If it does, I can easily discontinue the tethering service option with my mobile provider, no fuss, no penalties, no fiddling about with contracts. Also, no paying extra for 3G hardware I might not actually use. This works swimmingly for my needs.
That said, a colleague uses her tablet very heavily for daily work and requires reliable connectivity. She opted for a 3G-capable device because the near-100% connectivity (vs. WiFi-only) was worth it, and the workflow interruption of setting up tethering would have been too disruptive.
The End of Content Ownership
Joel Spolsky had a Joel on Software post pertinent to this subject back in 2002, except then it was applied to understanding just why numerous large companies were jumping on the open source bandwagon. (Hint: it's not due to a sudden shift to a Stallman-esque viewpoint.) Joel talks a bit of economics, lays out the details, and provides a number of examples, like the one below.
Headline: IBM Spends Millions to Develop Open Source Software.
Myth: They're doing this because Lou Gerstner read the GNU Manifesto and decided he doesn't actually like capitalism.
Reality: They're doing this because IBM is becoming an IT consulting company. IT consulting is a complement of enterprise software. Thus IBM needs to commoditize enterprise software, and the best way to do this is by supporting open source. Lo and behold, their consulting division is winning big with this strategy.
Seen in this context, these providers are rapidly commoditizing an entire marketplace as a complement to increase the demand for their products and services.
Closely related reading: This timely post on Facebook's Open Computing Project.
P.S.: I certainly don't think that Joel's ideas capture the whole of the open source movement, but it's one valuable perspective. At minimum, there are also big wins for many parties, whether individuals or companies, who can cooperate to share the burden of a cost center (such as an operating system, a web server, etc.).
Twitter Discards Client UI Community
While Twitter was both vague and threatening in this missive (really, wtf?), I'm also picking up a definite theme of them wanting to shut down client apps that are primarily used by spammers. I've noted a short list of apps/sites that generate follows that are virtually always spam -- to the point where I want to auto-block any follow that occurs via those apps.
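That auto-block idea is easy to sketch as a filter, assuming follow events carry a "source" field naming the client app that generated them (the record shape and the app names below are entirely hypothetical, not Twitter's actual API):

```python
# Client apps whose follows are virtually always spam -- made-up names for illustration.
SPAM_SOURCES = {"SpamBlaster 3000", "AutoFollowerPro"}

def follows_to_block(follow_events: list[dict]) -> list[str]:
    """Return the user IDs of new followers whose follow came via a blacklisted client."""
    return [
        event["user_id"]
        for event in follow_events
        if event.get("source") in SPAM_SOURCES
    ]
```

The key design point is filtering on the *originating client* rather than on follower behavior, which catches the spam at the moment of the follow.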
How the PC Is Making Consoles Look Out of Date
The article includes three videos that give a fantastic insight into where PC graphics are headed, including a version of Epic's Unreal engine, Crytek's Cryengine 3, and DICE's Frostbite 2 engine. Considering that these leaps in eye candy are only possible with the current state of PC graphics, we wonder how long consoles will be the target platform for development of blockbuster games.
This is nonsense. Engine developers have a vested interest in staying ahead of the curve and showing off their wares. The engines developed for PC gaming systems now can be viewed as advance R&D effort on the engines they'll license for future generation consoles and mobile devices. If the PC gaming market helps to cover that initial R&D investment, all the better.
Game developers, on the other hand, will go where the money is. Unless PC gaming takes an unprecedented upswing in market share, game devs will tend to stick with console and mobile platform development.
Gmail Accidentally Resets 150,000 Accounts
What gets me is when folks think that storing data remotely (aka "in the cloud") is anything more than a single copy for data survival purposes. It's just another kind of storage with a different set of benefits and risks than the typical "hard drive under the mattress" backup approaches. It still pains me to think back to folks I knew in the 90's who trusted their web hosting providers to have robust redundancy and backup systems. When those providers had severe data loss issues... these users had no local backups. Major hobbyist sites with tons of work put into them ==> straight into the great bit-bucket in the sky.
As a baseline for important data, you want three copies of it, and you want to verify your recovery process. Ideally, two of those copies are never in the same location at the same time (e.g., backup #1 is secured offsite and occasionally swapped with backup #2). Ye Olde Cloude counts as (at most) one extra copy in this sort of scheme.
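The "verify" step is the part people skip. A minimal sketch of it, assuming your copies are plain files, is to checksum each copy against the original rather than trusting that the backup job succeeded:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 so large backups don't have to fit in RAM."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_copies(original: Path, copies: list[Path]) -> dict[Path, bool]:
    """Map each backup copy to whether its checksum matches the original's."""
    expected = sha256_of(original)
    return {copy: sha256_of(copy) == expected for copy in copies}
```

Run something like this on a schedule; a silently corrupted backup is discovered when you verify it, not when disaster strikes and you need it.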
Japan's Elderly Nix Robot Helpers
"Putting the patient in control" doesn't apply to elderly people who are losing cognitive ability due to old age or outright dementia -- and there are going to be A LOT of them. For many people who succumb to Alzheimer's, the body is hale but the mind goes first. Today, this necessitates human assistance (family, eldercare staff, etc.) to get this person through their day.
While mobility is certainly an issue for some patients, it's *far* easier to keep the elderly mobile and functional than the appalling state of U.S. nursing homes would suggest. Having dealt with this within my own family, I've seen how it works: the nursing home and a physician enter a sleazy "you scratch our backs, we scratch yours" relationship. Specifically, the physician may do little to no actual work, gets paid by the nursing home, and produces evaluations/prescriptions on demand and fully to the desires of the facility. In one such example, the physician had no office and no phone number except the home's answering service, and was essentially unreachable. Via such arrangements, patients who would otherwise be quite able to stay mobile and manage well are labelled "aggressive", strapped into wheelchairs, and drugged insensate. It doesn't take much artificially reduced mobility to essentially kill an elderly person.
Japan's Elderly Nix Robot Helpers
Hiroshi Yamamoto covers this exact topic in one of the short stories in The Stories of Ibis. The angle there is the introduction of the first past-the-uncanny-valley android for nursing the elderly. Yamamoto takes on many of the particular challenges of working with the elderly (and with an aging population). The stories generally have a lovely classic sci-fi feel, using fiction to simultaneously explore new worlds and topical subject matter. It's also pretty darn near Clarke's definition of 'hard' sci-fi (and comp-sci-fi!) while remaining thoroughly enjoyable.
Virgin Mobile To Start Throttling Broadband2Go
What would you have Virgin Mobile do instead? So far, this is among the most reasonable takes on the problem I've seen to date. Let's lay down some assumptions:
1) The cost-per-user for the service has turned out to be too high at the current pricing.
2) Analysis shows that a small percentage of users are super-heavy traffic users gobbling up many, many GB per month.
Virgin Mobile could then:
1) Up the plan pricing for everyone to accommodate the upper-end of the bell curve's massive usage. This will penalize the overwhelming majority of users for a few users' overuse. (Isn't that just the same crap that everyone complains about with stupid no-differentiation rulemaking in schools? "It has to be the same for everyone.")
2) Keep the same pricing, but impose a throttle that penalizes only those users who are breaking the pricing model. They still get service, but at a degraded level.
3) .... ??? (Your Answer Here.)
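Option 2 amounts to a trivially simple policy check on each billing cycle. A sketch, where the cap and the rates are hypothetical numbers for illustration, not Virgin Mobile's actual plan terms:

```python
FULL_SPEED_KBPS = 1400              # hypothetical normal rate
THROTTLED_KBPS = 256                # hypothetical degraded-but-usable rate
MONTHLY_CAP_BYTES = 2_500_000_000   # hypothetical cap (~2.5 GB per cycle)

def allowed_rate_kbps(bytes_used_this_cycle: int) -> int:
    """Everyone keeps service; only users past the cap drop to the slow lane."""
    if bytes_used_this_cycle > MONTHLY_CAP_BYTES:
        return THROTTLED_KBPS
    return FULL_SPEED_KBPS
```

The point of the design is that the overwhelming majority of users never hit the branch, so they're unaffected, while the heavy users still get *some* service rather than an overage bill or a cutoff.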
Are 10-11 Hour Programming Days Feasible?
Go read the book Slack by Tom DeMarco. The first part of it explains quite clearly why this is a stupid idea, and what's required for an organization to work smarter instead of just harder. Some highlights:
- DeMarco contrasts "efficiency" (the speed of movement towards some goal) with "effectiveness" (are you moving towards the right goal?). The knee-jerk bad-management response is to trade for efficiency, and this comes at a huge cost in effectiveness. It does you no good to move at breakneck speed towards the wrong goal. The overwork approach is self-defeating because it fundamentally misunderstands 1) how all knowledge workers (not just programmers) function and 2) how slack is vital to maintaining a team's effectiveness.
- The book illustrates in a number of ways just how misguided overwork is. You think you're putting in 60-100 hours a week, so the naive assumption is that this equals 1.5x to 2.5x a normal work week. But each of those "hours" is rapidly devalued to be worth much less than an hour from a normal 35-40 hour week. The idiot manager who allows this rapidly ends up with less shippable work delivered per week, despite burning out employees left and right. The team members are stressed, can't think straight, inhibit vital collaboration because of "crash mode", and their neglected life tasks start to seep in as "undertime" during working hours -- because they have no other choice!
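A toy model makes the devaluation concrete. The geometric falloff rate below is purely illustrative (not an empirical figure from DeMarco or anyone else); the point is only that marginal hours past a sane baseline are worth less than a full hour each:

```python
def effective_hours(worked: float, baseline: float = 40.0, falloff: float = 0.04) -> float:
    """Toy model: hours up to the baseline count in full; each hour past it
    contributes geometrically less than the one before. The falloff rate is
    an illustrative assumption, not measured data."""
    if worked <= baseline:
        return worked
    effective = baseline
    marginal_value = 1.0
    for _ in range(int(worked - baseline)):
        marginal_value *= (1.0 - falloff)
        effective += marginal_value
    return effective
```

Under these made-up numbers, a "60-hour" week yields only fifty-odd effective hours, nowhere near the naive 1.5x -- and that's before counting the compounding damage of burnout across weeks.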
I've definitely been there and done that with the mismanaged-startup thing myself. The more time goes on, the more I see just how ineffective that team was. Sure, it felt macho to pull enormous hours and ship something anyway... but nowadays I could get a sizable multiple of work out of that same team and have them going home on time every day. Even IF you win the options lottery, the earnings-per-hour equation for startup overwork *sucks*. Note that "mismanaged" is the operative word here, not "startup". I've worked for a very well-run startup that didn't subject its employees to insane workloads.
The opportunity cost of the overwork lifestyle is immense as well. That you're even asking this question says that you still greatly undervalue your time -- it's not just worth what some hiring manager or HR staffer tells you. It's your life, dammit! As a very wise co-worker once told me, "Work won't love you back."
iBook Store Features Leave Indie Publishers Behind
If that were the case, once Apple were confident in their tech specs, surely they'd allow users of other operating systems to create apps for iOS?
There's no "allow" here, as in Apple acting as the bully keeping you out of its tree fort. Apple would have to significantly increase its development investment in the iOS development tool chain to maintain and QA ports for other desktop platforms. That's money directly diverted from enhancements to the toolchain and to iOS itself. The return on that investment is doubtful at best, and the lost opportunity cost is damning. Personally, I can't foresee any market for this that would justify the ongoing costs.
NSA Considers Its Networks Compromised
So to me this raises a fundamental philosophical question: why keep secrets at all, as a government?
Because nations have adversaries. This adversarial relationship can be as benign as economic competitors, instead of full-blown hot/cold war enemies. At the level of governments, control of information flow is a form of power.
For example, consider the game of chess. In chess, the entire state of the game is visible to all players at all times. There are no secrets. But there's no way to enforce anything like that in the complexity of the real world. Imagine how a game of chess would go if just one of the competitors could choose to hide the locations of their pieces, what moves they've made, and even when they've made moves. No high-stakes human organization would unilaterally submit to being the "out in the open" player, nor would it refuse the leverage that information control provides. To do so would essentially be organizational (if not literal) suicide.
This does pose a dilemma: if a government must resort to information control, what kinds of "process controls" are needed in a democratic society to maintain a sufficiently informed electorate? Note: "sufficiently informed" isn't just information about the government, but information about the entire world the society must interact with. Even more importantly: how might we measure the health of information flow and knowledge within a society?
IBM Discovery May Lead To Exascale Supercomputers
...that the metal connections between individual components would not be fast enough.
If you bothered to RTFA (emphasis mine):
Multiple photonics modules could be integrated onto a single substrate or on a motherboard, Green said.
I.e., they're not talking about hooking up individual gates or even basic logic units with optical communications. Anyone who's actually dealt with chip design in the past several decades realizes that off-chip communication is a sucky, slow, power-hungry, and die-space-hungry affair. Much of the die area and a huge share (30-50% or more) of the power consumption of modern CPUs is gobbled up by the pad drivers -- i.e., off-chip communications. Even "long distance" on-chip communication runs into a lot of engineering challenges, which impacts larger die-area chips and multi-chip modules.
Order of importance if disaster strikes?
If the "data" is a full, bootable USB-powered backup drive, you can worry about getting at the contents later. Keep a disk in your backup rotation in an emergency pack prepared with other grab-n-go essentials. Search the web for "emergency preparedness" and similar for other ideas for such a pack.
If you'll need access to the data in an emergency then either store that as printouts/photocopies in your emergency pack (passport copies, etc.) and/or encrypted on your mobile device of choice.
Order of importance if disaster strikes?
I'm a computer geek any data of importance I created and can create again if it is of any value to do so.
Real computer geeks fear the pointless loss of time that simple and cheap preventative measures could have avoided.
Red Hat Releases RHEL 6
... that Duke Nukem Forever would ship before RHEL 6! ;-)
But seriously, congratulations are due to all the Fedora and Red Hat folks who made this happen. This opens the door to a modern package set for many, many organizations.
Ubuntu Dumps X For Unity On Wayland
The Wayland site has a good exposition of how Wayland and X differ at the architectural level. This also clearly explains (and diagrams) what happens when X runs as a Wayland client.
Looks like a breath of fresh air in the Linux rendering space, and someone with enough momentum behind them to drive it.
Linux 2.6.36 Released
Whoa. Not enough caffeine yet.... momentarily misread that as "Fanboynotify is disabled in this version." Which then had me wondering why it was posted here... O_o