If I could change what's "typical" about typical laptops ...
I must second this.
In July 2012, when I started searching for (and eventually purchased) my new laptop, resolution was my number one criterion. It was not an easy criterion to meet: once you insist on a high enough resolution, the number of machines you have to choose from drops dramatically.
For me, personally, my other major hardware issue was the keyboard layout and (to a slightly lesser extent) feel. My hands are not so large that I need to worry about tiny keys, but I often cannot understand the logic of which keys are accessed via a "Function" key and which are not. Though I eventually had to compromise, I really wanted page up/down to be first level keys. In the end, though, my new laptop's keyboard is similar to my old netbook's keyboard in that page up/down (and home/end) are function-level keys attached to the arrow keys. Thankfully, at least, the arrow keys are slightly offset onto their own "island" making them far easier to find and use with just your fingers.
In the end, I chose a Samsung Series 9 Ultrabook to replace my old Dell E1405 laptop. It is a bit smaller in physical dimensions and weighs much less, and I am generally very happy with it. As I just wrote, page up/down are not top-level keys, but, on the other hand, the keyboard is backlit. I have found this feature far more useful than I expected. The display's resolution is 1600x900, a small increase from the Dell's 1440x900 and a big jump up from my netbook's 1024x800 (I think). More vertical space would have been very nice, but finding something in my budget that met my other requirements essentially ruled that out...
Also, this time, I made very sure the screen was matte. I made the mistake with the Dell laptop of getting a glossy display and it was supremely annoying. Unfortunately, with the E1405 laptops the only choice for the higher resolution screen was glossy. Ugh.
Digg Hints Its Replacement For Google Reader Will Include Social Media Content
Personally, I started using gReader Pro on my Android device approximately 15 seconds before Google made the announcement to get rid of Google Reader.
Fortunately, gReader had already made itself separate from Google Reader (or so it seems). It supports syncing what you do and what you've read with your Google Reader account, but this is optional. I've already disabled the Google Reader-related features and so far gReader is still working just fine. It has a lot of extra bells and whistles that I don't need, but the basic RSS reading functionality is very nice and is the main reason I switched to it (and paid for it). Best of all, no social media junk.
I'm still not sure if there is more going on underneath than I know about. Maybe it is more closely linked to Google Reader than I am aware? I guess I'll find out when Google Reader finally turns off.
My only complaint right now, and it is a very minor one, concerns the display of Slashdot comments at the bottom of each Slashdot RSS feed article. gReader still displays only five comments (picked seemingly at random, yet somehow never including troll/spam junk) and I can tap on the titles to expand the comment, just as in a browser. However, until recently I was still using the stock Android 2.3.4 on my Droid 3 and now I am using CyanogenMod 10.1 with Android 4.2.2. The comment box that gReader shows no longer seems to grow vertically as I open each of the five comments. A rather bizarre change, but not exactly a deal breaker...
Ask Slashdot: Is the Bar Being Lowered At Universities?
A big problem with the standardized tests has grown out of their political uses. I live in Tucson, Arizona, so I can only speak to the school district here (the second largest in the state) and not elsewhere.
One of the driving forces behind so-called "teaching to the test" comes directly from budgetary issues. Arizona has seen fit to divvy up school funding based not just on how a school performs, but on how much those scores improve. Consequently, when a school's budget is on the line, teachers are under a lot of pressure to have their students do well.
You can probably imagine some of the immediate faults with a system that relies so heavily on improvement of scores. What happens to a school that is already in very poor shape? Anything the district might do to improve the situation will take time to have an effect. If a school's scores do not improve quickly enough, that school may be forced to abandon any new improvement process for lack of funding. Similarly, at the opposite end of the spectrum, what happens to a well performing school in a good neighborhood with an active community? They can most likely improve scores somewhat in the beginning, but eventually the returns will diminish. That school is already doing very well on the tests and there is little, if any, room left for the school's average to improve.
At least this state, as far as I am aware, has not tied student test performance directly to teacher pay.
I attended high school here in Tucson at University High School, a public college-preparatory magnet school (number three high school in the nation in Newsweek's latest list, and the only public school in the top five). I graduated in 1997, so, thankfully, this rash of testing hadn't yet started. As graduation neared, we became aware of a situation in some ways similar to this testing mess. The University of Arizona and the state offer (or at least used to offer) a full-tuition scholarship to any Arizona student in the top 5% (I think) of their class. The argument was made that since University High, by its very nature, attracted the top students from the other local high schools, all of its students should receive the scholarship: if those students returned to their regular local high schools, they would easily be in that 5% bracket. The argument didn't quite work, though the limit was raised quite a lot, from 5% to 25% (if I remember correctly).
By and large, I had a very good experience throughout my 13 years of public school in Tucson. Slashdot will very quickly inform you that, obviously, not everybody had such an experience. I wonder what that same trip would be like today. Would I be bored out of my skull as the teacher focused on what the state tests require? Very difficult to say.
Swedish School Makes Minecraft Lessons Compulsory
Are you sure about that? I seem to recall that garbage would, eventually, disappear from a landfill. If anything, it seems that SC4 actually modelled that rather accurately in that trash in a landfill takes a long time to biodegrade. If you never stop using a landfill then it will never begin to clear up. Of course, the problem is that there is no way to control garbage dispersal/destination in SC4 at a fine enough level. The only way you might notice a landfill shrinking would be to export all of your garbage.
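SC4's internal garbage model is unknown to me, so the numbers below are invented, but a toy decay simulation shows the arithmetic of why a landfill in constant use never appears to shrink: with steady inflow and slow decay, the pile approaches a plateau instead of emptying.

```python
def landfill(years, inflow, decay_rate):
    """Tons in the landfill after `years` of constant dumping.

    `inflow` is tons added per year; `decay_rate` is the fraction of
    the existing pile that biodegrades each year. Both numbers are
    made up for illustration -- SC4's real model is unknown to me.
    """
    level = 0.0
    for _ in range(years):
        level = level * (1.0 - decay_rate) + inflow
    return level

# Dump 100 tons/year with 2% annual decay: the pile plateaus near
# inflow / decay_rate = 5000 tons rather than clearing up.
steady = landfill(500, 100, 0.02)
assert 4900 < steady < 5001

# Stop dumping ("export all your garbage") and it finally drains:
level = steady
for _ in range(200):
    level *= 0.98
assert level < 0.05 * steady
```

The plateau at inflow / decay_rate is just the steady state where yearly decay exactly cancels yearly dumping, which is why you would only ever see a landfill shrink after cutting off its input.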
Apple Kills a Kickstarter Project - Updated
It may be technically better, and considering it is 16 years newer than the original 1996 USB spec, it certainly ought to be. I think this is missing the point, however. Random company X could "invent" a number of new product Y's which are technically better than an existing standard, but without some sort of backing and/or lax (or completely lacking) licensing fees and rules, nobody is going to use it.
If I wanted to put a USB port on some device of mine, or even build an entire USB host or slave system, how much would it cost me, and whom would I pay? The answer is nothing and nobody. Unless I need a new USB vendor ID reserved, or I want to use the official USB logo, I don't need to pay any fees to anybody. *This* is the reason you find USB ports on everything under the sun. Any company can add USB to anything it likes without paying another company and without needing to get permission. The only moderating factor is the need, for many devices, to have a unique vendor ID, which keeps the landscape from being a chaotic free-for-all.
Apple, meanwhile, gets to play gatekeeper in yet another area of technology related to their phones and pads. And for what gain? Look in most stores and catalogs and it already seems that they will give the okay to just about any random piece of junk that plugs into an iPhone. They don't seem particularly picky most of the time. As for why they chose to give these people the runaround over their charger? Who knows... politics, a knee-jerk reaction to anything possibly Android-related, stupidity, or maybe even the left hand not knowing what the right hand is doing. Take your pick.
If I could print 1 replacement organ ...
Right now, I am missing my thyroid, 3/4 of my parathyroid, a bunch of lymph nodes, most of my colon, gall bladder, appendix, a chunk of my liver, and both adrenal glands.
Surprisingly, I both feel and look pretty good. I must take some replacement hormones each day for three of those missing organs, but it's not too bad. It seems, with the right selection, you can toss out quite a few things without destroying yourself. I'm not looking forward to the apocalypse, though, as I imagine prescription drugs, even inexpensive ones, will be very hard to come by. That would be bad.
If I was going to replace anything, it would be one or both of my adrenal glands. I rather miss those. :(
Meebo Discontinuing All Services Except for Meebo Bar
I'll second that. I've been using Imo for quite a while now. When I first got an Android device I tried a number of IM clients and eventually settled on Imo. I tried eBuddy for a short time, but it requires that you create an eBuddy account and then add all of your other IM accounts to it. Imo, on the other hand, acts like a normal multi-account client: you manage your accounts locally and it logs into each of them directly from your phone.
I can see the benefit of the eBuddy method for a device whose network connection changes occasionally; if you really don't want to be caught offline, it might be better. But I would much rather do things locally, and I haven't had any issues with my network connection changing. When it does, Imo seems quite quick about reconnecting.
Imo has a few minor annoyances, such as wasting a tremendous amount of screen area on bars/labels/nothing when in landscape mode, but nothing that keeps me from using it. My biggest complaint has nothing to do with Imo, but rather with AIM. Every time I turn on my PC or laptop, Pidgin will attempt to connect (as it should) and AIM will send a message to both clients complaining that you are logged in twice. There is a link to follow, but I did not find anything there that would let me get rid of this.
Imo did have a rather serious bug that I seemed to hit with regularity, where it would start forgetting account details. I normally have five accounts, and suddenly there would be only three or four listed. I submitted a bug report and they asked for more info, but I never heard anything more. Fortunately, I found a workaround: press the logoff button, then log into one account. This causes the list to refresh and all accounts reappear. I haven't had this happen in a while, though, so perhaps it has been fixed.
Inside the Death of Palm and WebOS
As somebody who formerly wrote Palm programs (Weasel Reader), I don't really agree with your hardware assessment. As with most small systems that offer both an API and a method of direct hardware access, portability depends almost entirely on how well you stick to the provided API.
Up through Palm OS 4.x, the hardware all ran on m68k series processors, but there was nothing in the API specific to this hardware. Then, with Palm OS 5.0, Palm began using ARM hardware and provided a translation/emulation layer so that the new devices could still run all the old Palm OS programs. If you wrote your software according to the API guidelines then the emulation layer would run your old programs perfectly fine. In fact, because the new ARM hardware was so much faster the old Palm programs ran better than they ever did on native m68k hardware.
Of course, if you did direct hardware access then things were rather different. Most likely your program wouldn't work at all. Even then, though, the OS provided a method for checking for OS capabilities and underlying hardware. If you wrote your program properly, and checked for these option bits, then you could gracefully turn off direct hardware access if you weren't sure it would run correctly. Most likely, if you really needed that sort of access, you would add new hardware specific code for the ARM hardware.
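The real check went through Palm OS's FtrGet() feature interface; I won't try to reproduce the exact constants from memory, so this Python sketch only shows the shape of the pattern, with an invented feature dictionary standing in for the OS call:

```python
# The `features` dict below is an invented stand-in for what the real
# Palm OS FtrGet() call reported; keys and values are illustrative only.

def direct_hw_supported(features):
    """Only the original m68k devices allowed the old direct access."""
    return features.get("processor") == "m68k"

def draw_screen(features):
    """Gate direct hardware access behind the capability check."""
    if direct_hw_supported(features):
        return "direct framebuffer poke"   # fast path on old hardware
    return "API blit"                      # graceful fallback elsewhere

assert draw_screen({"processor": "m68k"}) == "direct framebuffer poke"
assert draw_screen({"processor": "arm"}) == "API blit"
```

A program written this way degraded gracefully on the ARM devices instead of crashing, which is exactly the distinction between apps that survived the OS 5 transition and apps that didn't.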
The move to WebOS need not have killed off the old application ecosystem. There was no reason they couldn't have written another translation/emulation layer so that existing Palm OS programs could be run. Keep in mind that, even with OS 5.x, most of these apps were not that complex and most users would never have noticed a speed decrease, if there even was one. And in the worst case, they could have axed support for OS 5.x programs and provided support to run anything pre-5.x (m68k binaries), knowing that the WebOS hardware would be able to run those programs at a fast speed.
I don't know why they chose to completely ditch existing apps. If they had kept support, WebOS could have launched with the ability to run the many thousands of existing programs and that would have been a big plus, especially for businesses which might have company-specific Palm programs (inventory, point of sale, etc.) and would then have had an upgrade path.
But, as this article and numerous others have made clear, the history of Palm is overflowing with bad choices...
Free Desktop Software Development Dead In Windows 8
I use emacs for 99% of my stuff, and I have to say, while it's a great editor, I wish I had IDE-level code browsing abilities (and to a lesser extent, intellisense-style stuff). I'd kill someone for good "go to definition" support. Ctags-style stuff is a shitty substitute, at least on our code base, and I've never really been able to get the fancier stuff to work well. VS isn't perfect there either, but it's still a lot better...
Could you explain this a little more? It seems to me that "go to definition" is a rather basic thing for any IDE, and since ctags' primary job is exactly that, I don't understand why it would not work well on your particular code. I mean, all it has to do is tell the difference between a definition and a non-definition (i.e. it doesn't need to fully understand the code), so if it is having trouble doing that job, it certainly reflects poorly on the tool.
I guess I'm just curious what sort of code or code layout would cause it problems.
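For what it's worth, here is a toy illustration of one classic failure mode. The scanner below is a deliberate oversimplification in Python (no real ctags implementation works this crudely), but the underlying problem is the same: a definition generated by a preprocessor macro never appears textually in the source, so no purely textual scan can tag it.

```python
import re

def naive_find_definition(source: str, name: str):
    """Line number of a textual definition of `name`, or None.

    Crude stand-in for a regex-based tagger: look for the identifier
    at the start of a line (after optional type words), followed by '('.
    """
    pattern = re.compile(r"^\s*(?:\w+\s+)*" + re.escape(name) + r"\s*\(")
    for lineno, line in enumerate(source.splitlines(), start=1):
        if pattern.match(line):
            return lineno
    return None

plain = "int add_one(int x)\n{ return x + 1; }\n"

generated = (
    "#define DEFINE_HANDLER(n) int n##_handler(void) { return 0; }\n"
    "DEFINE_HANDLER(frob)\n"
)

# The ordinary definition is found...
assert naive_find_definition(plain, "add_one") == 1
# ...but `frob_handler` only exists after macro expansion, so a
# textual scan has nothing to point "go to definition" at:
assert naive_find_definition(generated, "frob_handler") is None
```

Heavy use of macro-generated symbols, token pasting, or code generation is exactly the kind of code base that would make tag-based navigation feel broken.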
Free Desktop Software Development Dead In Windows 8
You know what this story actually tells? That even FOSS users don't like their IDE's. They want to use Visual Studio from Microsoft because frankly, it is much better than the open source alternatives.
No, no, and again, no...
This story only serves as flamebait and the only real thing it demonstrates is that the editor (timothy in this case) shouldn't have bothered to post it. The vast majority of FOSS developers and FOSS users (those would be people who primarily use FOSS) use the free IDEs. Why? Because most FOSS developers actually run a FOSS operating system and those, surprisingly, do not run Visual Studio.
Yes, there are some FOSS devs who do their work under Windows, and they may be slightly impacted by this (as you said, VS 2010 is still free), but it is by no means a majority.
When I need a robust business solution, I prefer it ...
No... I must completely disagree. If you actually read the poll, there is absolutely no way you could accidentally mistake this for a serious marketing poll.
A legitimate avenue for conversation is whether or not the poll is actually funny. Personally, I think it is. It's a pretty good example of buzzword bingo, but the real humor here is that this poll only exists because of all of the "Slashdot has gone uber-commercial!!!1!!!" complaints and shows that the editors are aware of the prevailing opinion.
I understand that there are a lot of humor-impaired people out there, but I suspect that a lot of the whining is from people who merely glanced at the poll, saw a buzzword, and then assumed it was a marketing poll. And, to be fair to /., there were like two or three stupid marketing polls. That's it. The haters, along with not being able to detect humor, are now defaulting to assuming that everything /. does has a sinister PHB motive unless there is some sort of mountain of evidence to the contrary.
That said... SlashBI is a pretty terrible idea...
SlashTV started out really badly, but it certainly looks like they listened to the complaints and it has improved markedly, though I still pay little attention to it.
Star City and the Baikonur Cosmodrome
My gallery on my university/work machine has a great collection of albums documenting a trip to Baikonur and the Cosmodrome. They were taken by Chuck, a friend of mine and retired engineer, during his trip there for the launch of ECHO, an AmSat (amateur radio) relay satellite. He took a huge number of photos covering the flights, the locations, the integration and launch of the satellite, and some other interesting places in Baikonur.
ECHO Launch Campaign
I also had a satellite launched from the Cosmodrome. I worked on the University of Arizona's Cubesat Project and wrote all of the onboard code controlling the satellite. In the end we built four satellites: RinconSat 1 and 2, AlcatelSat, and an engineering model; three of them were completely functional. The cubesats are small 10 cm cubic satellites with a control/computer board, a power board, a radio board, an array of 24 sensors, and solar panels on the outside frame.
The hardware was quite simple, but we didn't need anything super fancy. The computer board had a PIC microcontroller which, over the I2C bus, could communicate with two 32 kB FRAM (ferroelectric RAM) storage chips, a clock chip (which kept time in binary-coded decimal), and the sensors. Unfortunately, at the time there were no FOSS PIC compilers, so we had to use a Windows/DOS command-line compiler which was really lousy, but we managed to work around the bugs as we found them.
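Packed BCD stores one decimal digit per nibble, so every timestamp read from the clock chip needed a conversion. This is not our original PIC code, just a sketch of the arithmetic in Python:

```python
def bcd_to_int(b: int) -> int:
    """Decode one packed-BCD byte (two decimal digits) to an int."""
    return (b >> 4) * 10 + (b & 0x0F)

def int_to_bcd(n: int) -> int:
    """Encode 0-99 as one packed-BCD byte."""
    return ((n // 10) << 4) | (n % 10)

# A clock chip reporting 12:34:56 hands back the bytes 0x12 0x34 0x56:
assert [bcd_to_int(b) for b in (0x12, 0x34, 0x56)] == [12, 34, 56]
assert int_to_bcd(56) == 0x56
```

The nice side effect of BCD is that a raw hex dump of the clock registers is directly human-readable, which made debugging over the radio link a little less painful.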
I was very happy with our final results. We did a great deal of testing on the ground and did radio testing by taking the satellite up to the top of a nearby mountain and then communicating with it from our groundstation. The onboard code supported one- and two-way communication and had several modes of operation. It had a default mode in case communication could not be established, a real-time mode that would broadcast a constant stream of sensor readings for a period of time while the satellite was overhead, and a regular mode that would collect readings based on a schedule and store them in the FRAM storage which you could then later command the satellite to transmit to you.
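As a rough sketch only (invented names, not the flight code), the mode handling boiled down to a small state machine like this:

```python
SAFE, REALTIME, SCHEDULED = "safe", "realtime", "scheduled"

class Satellite:
    def __init__(self):
        self.mode = SAFE     # default mode: assume no ground contact
        self.storage = []    # stands in for the FRAM log

    def command(self, cmd):
        """A ground command switches modes; anything else is ignored."""
        if cmd in (SAFE, REALTIME, SCHEDULED):
            self.mode = cmd

    def tick(self, reading):
        """One pass of the main loop with a fresh sensor reading."""
        if self.mode == REALTIME:
            return reading                 # broadcast immediately
        if self.mode == SCHEDULED:
            self.storage.append(reading)   # log for later downlink
        return None                        # SAFE: transmit nothing

sat = Satellite()
assert sat.tick(42) is None    # default mode: nothing goes out
sat.command(SCHEDULED)
sat.tick(42)
sat.tick(43)
assert sat.storage == [42, 43]
sat.command(REALTIME)
assert sat.tick(44) == 44
```

The important design point was the default mode: if communication was never established, the satellite had to keep doing something sensible on its own rather than waiting forever for a command.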
After many delays, we finally got a launch opportunity. We sent RinconSat 2 and AlcatelSat to CalPoly, where they were integrated with the other cubesats into the launch mechanism and then shipped to the Baikonur Cosmodrome. At first, everything seemed to be going well, but we soon found out that it was far from it. The first stage of the rocket failed to separate and the rocket crashed 70 km downrange, leaving a flaming crater and destroying all of the cubesats along with the far more expensive primary payload (some sort of communications satellite). Sigh...
We don't have any sort of web site, sadly, but one of these days I need to gather up all the photos, documents, source code, and other random stuff I still have access to and make a nice web page for our late satellites.
New Firefox For Android Beta Released
I'm typing this on a Droid 3 and it meets many of those requirements. It is most notably lacking out-of-the-box root, but I fixed that soon after getting the phone. I am still rather pissed that the bootloader is locked, so even when I change the ROM I can't change the kernel. And, with it rooted, wifi tethering is open to me if I wish. The body is also fairly rectangular. Except for the rounded corners, it's quite straight.
It would be nice to have Debian beneath Android, though.
Hobbit Film Underwhelms At 48 Frames Per Second
This has always bugged me a lot. For most games, I personally think it looks better with motion blur turned off. You almost always get that option with games on a PC, but rarely can it be changed with console games.
On consoles, I think one of the reasons it is used so frequently is to help mask low or dipping frame rates. The 3D on consoles seems to be designed such that games can enable motion blur without hurting the rest of the rendering performance. Most PC video cards, however, seem to take a hit when it is enabled. But perhaps that is no longer true with newer cards? Or maybe it is only noticeable on a PC because the resolution is much higher?
I've read that most console games only render internally at a size close to 800x600 and then scale to "HD" sizes... which I suppose makes sense when you consider how many years old the PS3 and XBox360 3D tech really is.
Ubuntu 12.04 LTS Out; Unity Gets a Second Chance
I'm in this boat, too.
I never stopped using Debian on my PC or at work, but somewhere around 7 or 8, I put Ubuntu on my laptop and later on my netbook. At the time, Ubuntu did a much better job of setting up some laptop specific things than Debian did. I had done a lot of work getting Debian to do some of this as well, but it was just easier to use Ubuntu and the overall experience was very nice.
Now, however, Debian has entirely caught up in this area. My old reason for running Ubuntu no longer exists. So, I've gone back. I now have Debian running on my netbook (I'm typing this on it right now) and soon on my laptop. It helps that I've been using Debian for a very long time, of course.
And, to loufoque, who was wondering below about proprietary drivers with Debian: it's not *usually* a big deal. Debian prides itself on providing a completely free OS, so out of the box and on the installer ISO you will not find any non-free software. After the install, however, you can add the extra repositories yourself and get access to most of what you might be missing. Specifically, once your install is done, edit /etc/apt/sources.list. You will see a line for each repository, ending with "main contrib" or maybe just "main". Add "non-free" to the end of those lines and you will get access to some of the non-free packages that are available. To get the rest, go to Debian-multimedia and follow the directions there to add that repository.
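Concretely, the change is a one-word addition per line (mirror and release name here are only examples; yours will differ):

```
# /etc/apt/sources.list -- before:
deb http://ftp.us.debian.org/debian/ squeeze main contrib

# after adding non-free:
deb http://ftp.us.debian.org/debian/ squeeze main contrib non-free
```

After saving, run apt-get update so apt picks up the new package lists.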
With those two changes you should have access to almost all of the same non-free stuff that you would have with Ubuntu. The big downside to this after-install method is if you need any of that software for the install itself to work. Normally this isn't the case, since you can always use the text-based installer or let the graphical installer fall back to a more generic video access method, but I suppose you might not have wifi available without some of the non-free packages. Hopefully you can just use a wired connection in that case. I try to do that anyway if I'm at home, simply because it makes the install process a little smoother.
Ubuntu 12.04 LTS Out; Unity Gets a Second Chance
I don't think he's saying that it's your fault, but rather that the fault lies with Nvidia and you should be placing blame where it's actually due.
It is not in *any* way Linux's or X's fault that your Nvidia card/chipset doesn't work properly. It's Nvidia's fault, and they're not exactly doing much to correct the situation. If there existed a good Nvidia driver for 3D, and the distro and everybody online told you to use it but it didn't work, then yes, go ahead and blame Linux for providing junk. But as long as Nvidia keeps all that information to themselves, there's not much any Linux or FOSS developer can do to rectify the situation.
Right now, the netbook I'm typing on is using an Intel chip for video. My Dell laptop also has an Intel integrated video chip. X runs great on both of these machines and I don't have to do anything to make that happen. That's because the information is open. If the driver were a broken piece of junk, then perhaps the blame could be put on Linux or Debian. My desktop PC has a fairly decent AMD/ATI Radeon card in it. Currently I'm using the free/open radeonhd driver in X and it's working very well. If I try to use the proprietary ATI driver, everything falls apart. But that's not the fault of Linux or Debian; their developers did not write it.
I wouldn't suggest you replace the video card you currently have since, with some work, you might be able to get it working. But if you're hell-bent on blaming Linux for this particular problem, at least use some hardware the developers have actually had a chance to work with.
Iran's Oil Industry Hit By Cyber Attacks
You're definitely far overstating the issue here...
A real civilian nuclear program simply cannot be used to create a bomb as-is. All of the complex and expensive technology needed to enrich uranium to weapons grade is not needed for regular reactor fuel. And, beyond that, actually building the bomb once you have the materials is definitely not "so simple" a task as you seem to think. The theory of how a "gun-type" bomb works might be, relatively speaking, simple, but implementing that theory is far from it. It takes a lot of knowledge and a lot of skill. If you actually want your bomb to detonate instead of just blowing up like a pipe bomb, you need to engineer the thing carefully, with very tight tolerances.
The real trouble and the real danger is that you can convert a civilian nuclear program or build upon it and create the tools and facilities needed for a military nuclear program. I really don't know what sort of program Iran might have or how far along it could be. Certainly, the Israelis seem to think it is real and very active. Proper monitoring could, conceivably, keep the civilian program in check and make sure it doesn't get used improperly. But, if Iran is hell bent on creating a bomb, I suppose there are a lot of ways they could hide it. I've read reports and rumors in the paper that Iran is building underground facilities to hold the bomb making gear.
Open Source Project Licenses Trending Toward Open Rather than Free
Your example is from 2009, yet you make it sound like the GPL is a virus infecting most code out there. Sure, it happens, but I don't think it's nearly as big a problem as you seem to think. Like you, I am not overflowing with facts; however, there are a number of diligent and skilled people out there who constantly monitor for this sort of problem. And it usually gets reported here, too.
You didn't say otherwise, but I wanted to make clear that it's not the GPL (or BSD or whatever) license at fault here. The blame lies with lazy and/or dishonest programmers who take code they shouldn't.
Most Game Console Power Draw Comes From Time Spent Idling
Thirdly, even when connected to a powered USB port - such as a mains USB adapter or a powered USB hub, the accessories will not charge unless the PS3 is on. It's not just the current, these devices were actually designed to make charging unnecessarily difficult without leaving the PS3 on or paying extra for an unnecessary charging device.
Yes, design like that is shocking.
Oh, good, it's not just me then. :)
This little unadvertised fact certainly shocked me. It's bad enough that the controller won't charge from a PS3 in standby, but ignoring any regular USB charging cable... very annoying. Until I spilled milk on one, I had two controllers, so at least I could swap when necessary.
What about those charging stations one can buy? How do they work with PS3 controllers?
Zimmerman Charged With 2nd-Degree Murder
Which is why I did not put the "and" in quotes in the original statement, because it is not there. My intention was to point out that the Slashdot motto includes both quoted phrases, not to imply that a story must satisfy both conditions.