Some Guy: We had 3 switchers!
Bill: Really? That's more than we have ever had before.
We could have traded some productivity for leisure time. Europe does it more than the English-speaking world. I mean, how hard have you tried to make it happen?
I walk to work: 40 minutes both ways, a 6.6 km round trip. Decent exercise for the day, and I get some reflection time and fresh air as a bonus.
You think it was good in your day! Ha! In my day it was a shining beacon of concentrated awesomeness. We used to have a commander who was also a taco and there was this cowboy fella who used to do all the polls. It all went downhill after they came up with the concept of adding login credentials.
I believe this music has an intelligent designer. Such complex and wonderful music couldn't possibly have arisen by chance. It would be on the same order of magnitude as a tornado blowing through a junkyard assembling a tight little jazz quartet.
I'll take one of those awards too. Win2000 and ten years.. not a problem.
Personally I love Starcraft II because they've taken network play to a new level of refinement while feeling free to build a slightly different game for the campaign. They gave up on trying to synchronize the units in the campaign and the ladder play. Getting 3 balanced races for head to head play is hard. The fact they can do it to a point where an eSport can develop is absolutely amazing (for all the reasons they mention in the article). Then they don't confine themselves to that design space when they build the campaign. Obvious and yet brilliant.
All this makes for the most enjoyable campaign of any Blizzard RTS I've played. I've played them all, and this is the first campaign I've actually enjoyed. It's the best ladder play as well. I'd have to expose myself to potentially fatal radiation in order to grow enough thumbs to show how much I like this game.
As someone who has watched just about every Doctor Who episode made, it really doesn't matter where you start in terms of backstory. It's not like the show has been particularly self-consistent. The character of the Doctor is supposed to be a big enigma, so as far as learning about the character or the overall story arc goes, it doesn't matter.
If you're looking to watch a bunch of classic Doctor Who, then I'd go for the 4th Doctor series. These are the classics. There's a batch in the middle with Douglas Adams as the editor (and sometimes writer) that are very funny; "City of Death" typifies the style. The Third Doctor isn't bad either. I'd also recommend watching the very first episode of Doctor Who ever made. It's insanely good for a show that old.
The cynic in me suspects this is the attitude of many UI designers.
That said, you can do a quick mental checksum on the idea:
The classic MacOS, for instance, never had a minimize button. It did eventually gain a window shade capability, and later a window shade button, but I used System 7 for ages without minimize and suffered no ill effects for the experience.
I never use MacOS X's minimize feature since it's redundant with that OS's "hide application" capability. For example, if you hold down the option key while clicking on a window of another application in the background, it will hide all the windows of the application you're switching from. If you hold down option and command, it will hide all applications except the one you're switching to. I find this feature much more useful in MacOS X. I sometimes click MacOS X's minimize button by accident, and it makes the window harder to find later. Personally, I'd remove the buttons if I could.
I do use the minimize button in gnome and windows 7 however. I'm not sure if I'll miss it if it's gone.
On the topic of maximize:
I use Windows 7 fairly often and have basically stopped maximizing windows. The reason is that I have three 1080p monitors. On these monitors I usually don't want to maximize windows because this almost always results in a window that is too wide. Instead, I double-click the top or bottom edge of a window to "maximize" its height automatically. This is the greatest new feature in Windows 7, IMHO. The only downside is that when you try to move a window horizontally, there's only a small distance you can move the mouse cursor vertically before the window "unsnaps" from its vertical maximization. I often want to move these windows between monitors, so I'd like the system to keep them vertically maximized while I drag them around.
The classic MacOS didn't have a maximize button either. It did have a button that was supposed to resize the window to the minimum size required to show the window's contents. Much of the time this effectively maximized a window. There was a MacOS convention to leave a small border between the edge of the screen and the window when you did this. You could click this edge with an option-command-click to rapidly jump to the Finder and hide all other windows. If I had a choice, I would bring this smart-resize button back, along with the option-click and option-command-click features. The smart-resize button is still there in MacOS X but is often mistaken for a maximize button. Maybe because it's green and has a "+" on it... or maybe because it's often implemented wrong... or "wrong". This makes me cry into my grey beard.
So life is indeed possible without minimize or maximize buttons. To me it seems somewhat arbitrary to remove them, though. I mean, why? You can add new features that make using windows on multiple large monitors easier without removing the buttons. If the new features make the two buttons redundant, *then* you can remove them. Anyway, I'm not ready to grab a pitchfork over it.
He's not saying they have to pay. He's saying that it's not cool to stop supporting a codec and claim the reason is that the development model for that codec is not open enough. Basically, dropping support for a codec based on its development model is a lame reason. He then spends the rest of the article guessing at what their real motive is and not finding those reasons very likely. One of the reasons he guesses might be behind their decision is that they don't want to pay the licensing costs. He finds this unlikely for the rationale you quoted:
"It's not as if Google can't afford the $6.5 million a year, and by paying that money the company would enable web users to view open, standards-compliant, H.264 video."
The next step is to put a little box on the wall with speech recognition. Then it could print out a fine automatically..
".. you have been fined 1 credit for a violation of the verbal morality..."
I'm sick of you dang 4-digit newbies complaining about change. In my day everything was static. And then the next day you guys showed up and things started changing. I liked it when Slashdot didn't have user accounts... or users. We had better discussions in the forums! I want my lunch now, dammit. I can't believe nothing costs a nickel.
Different, updated and personalized version:
No full MacOS X, too big, lame...
It's not just developers. This is why we have unions and labor regulations. They can always find replacements: even in good times, one person in twenty is unemployed at any given time, a figure that the Federal Reserve works very hard to maintain lest it create upward pressure on wages. And most people prefer shitty working conditions to the uncertainty of finding another job, never mind actual unemployment.
This has to be balanced with the fact that the unemployment rate for programmers tends to be *much* lower than the general unemployment rate. Also, the skill difference between an average programmer and a great programmer is a factor of 10 in productivity, which means that programmers are not commodities. If you're good at programming and well connected, you can do very well.
By the way, 5% unemployment is basically full employment. It gets disproportionately harder to do better the closer you get to 0% unemployment. Programmers (especially good ones) are extremely rare and hard to hire.
> Instead, test it formally, with double blinds, hoping that it works (so you don't subconsciously suppress data).
The entire point of a double-blind study is that you can't subconsciously disclose something...
> As a science type, I encourage you to not turn off your brain to astrology, Feng Shui, crystal power, and other crap.
It has been investigated and found lacking. If any of this were real it would point to new mechanisms that science would be very interested in.
Keeping an open mind is a great idea, but astrology, crystal power, etc. have failed to demonstrate their purported abilities time and time again. This is why they have not found a home within the scientific establishment.
Money isn't everything -- but it's a long way ahead of what comes next. -- Sir Edmund Stockdale