Royalty-Free MPEG Video Proposals Announced
[quote]Support: here is a performance comparison of the latest iteration of the WebM encoder hardware, showing also previous versions and an H.264 encoder for comparison.
I hope you realize that the comparison you linked to compares ENCODER quality between two encoders (H.264 and WebM) made by the same company? It says nothing about the abilities of WebM as a codec.
Try this one instead:
Intel, Google Team To Optimize Android For Smartphones
Name 1 useful, headless, x86, binary-only Linux application that does not have a viable alternative that can be easily ported to ARM...
You say 'plenty of software' and 'large amounts of software that can now be easily ported', but I'm genuinely hard-pressed for even a single example that proves you right.
Steve Jobs Resigns As Apple CEO
[quote]Come up with some studies showing people's reasons for buying an iPad and I'd pay attention. But just saying, "They sold more of them, that means that they love X 'feature'" isn't necessarily true.[/quote]
Fine, then go lookup customer satisfaction numbers and return rates for iPads/iPhones vs Android tablets/smartphones. Or are these figures also somehow not indicative of the fact that people buy Apple products because they happen to like them, in various ways?
Smartphones: the New Home of Crapware
So high iphone sales means the iphone is good, high android sales means android is bad...gotcha.
You are posting this as if there's some kind of paradox or contradiction in that... There isn't...
Just look around you and you'll find that different people buy different stuff for different reasons. Many bad products and services are successful, even though they are competing with good products and services. Price is often (but not always) the explanation.
In the case of Android, it's pretty obvious why Android is outselling iPhones: there are literally hundreds of different Android phone models for every iOS phone, and the vast majority of them are bought by people who either don't know about smartphone OSes, don't care about them, or simply don't want to or cannot spend the money on an expensive high-end phone.
If you sincerely think 'Android outsells iPhones' implies 'Android is better than iOS', you are delusional. That would mean each and every crap Android phone you get for free in a BOGO deal is better than the iPhone, and that's why people get them. 90% of Android phones sold = utter crap, 10% = good phones. Just go ahead and look up some customer satisfaction rates and return statistics.
Smartphones: the New Home of Crapware
Because on iOS you have no choice in hardware...duh! If the iphone4 wasn't the most popular phone then iOS wouldn't be even on the smartphone radar in competition terms.
The original point was that people buy more Android phones because they are better. But given the fact that the iPhone 4 and the 3GS are the two top-selling smartphones on the market, your conclusion is that this is because with iOS you only have 2 hardware options? If Android is 'higher quality' (even though customer satisfaction numbers say otherwise, and sales of individual handsets don't seem to support that assumption), how come only 2 phones are outselling a few hundred different ones? Nice logic you have there; you are basically stating that -1 + -1 = 3, or something similar.
Also, since when does the typical Slashdot reader think 'sells more' implies 'is of higher quality'?
It's fascinating to see how flexible typical geek thinking is, when it comes to defending minority opinions.
Smartphones: the New Home of Crapware
So that's why people are leaving their Iphone 3GS's for new Android handsets. The fastest selling OS is Android, people are buying it because it's better.
Actually, in the US the fastest selling handset is the iPhone 4. The number 2 fastest selling handset is the iPhone 3GS. The former is 14 months old, the latter 26.
Nice try though...
PS Vita Specs Announced
Nice try, but iPhone batteries don't actually noticeably degrade in 12-24 months, and even after 2 years battery life is still much better than that of the vast majority of other smartphones. If you really insist on keeping an iPhone for longer than 2 or 3 years, you can always replace the battery, or have it replaced cheaply; the warranty would have expired by then anyway.
My 2-year-old 3GS still gets around 48 hours on a single charge if I don't use it a lot, and even with the heaviest of usage it never runs out of charge over the course of a single day. Many people I know with spanking new Android phones carry around a cable or charger all the time because they sometimes don't even make it through a 10-hour work day, with their 'replaceable batteries'.
MPEG LA Says 12 Parties Have Essential WebM Patents
Objection. The correct sentence here is "who is surprised that MPEG-LA claims that WebM steps all over patents controlled by MPEG-LA".
Others have already predicted VP8 would run into patent issues.
Anyone who knows even the least bit about video codec technology can tell you that it's virtually impossible to design an advanced video codec that does not infringe on any video coding patents, which isn't surprising if you consider that the technology involved is decidedly non-trivial and took over 3 decades to develop. H.264 is currently the most advanced standard for video coding, but it didn't materialize out of thin air either: it simply builds on the concepts you'll see in MPEG-1, 2, 4 and so forth.
The only codec I can imagine that does not violate any H.264 patents would either be incredibly crappy, or downright revolutionary. I know for a fact that VP8 is not the latter.
'The Code Has Already Been Written'
They perpetually want a set of requirements. And they get upset if a new requirement is added later. I see software as a way to explore a space. Model it. Determine what more modeling is needed. You are constantly trying to do something that usually is beyond what is computationally possible, so you have to figure out what approximation is going to work.
Which is exactly why you should leave the science part of the work to the scientists: have them prototype a solution, working it out and improving it along the way, until it meets its scientific requirements. Then, at set points, you should leave it to the software engineers to productize the prototype based on the set of scientific requirements currently implemented, and keep those fixed as long as possible. If the target is moving, settle on a release schedule for the productized code that allows planned functional updates. This way the science work and the software engineering work can be done independently, and both sides can be happy.
In reality, this often means you will end up with 2 implementations, which implies you need capable software engineers who know how to do regression testing and qualification of the software properly.
Where I work, we have physicists, mechanical engineers and construction engineers prototype solutions in Matlab. They love that tool: they can try out new ways of solving their problems quickly, prototype the hell out of them, graph and plot the results, etc. It's a great tool for scientists, really. When it comes to productizing the resulting solutions (simulation models, mainly) and integrating them into the (extremely expensive) hardware they will be used in, the software engineers (me, amongst others) get a functional specification, a test specification and a test report, all based on the QA'ed Matlab code, and we re-implement it in the target language best suited for the application (C++ or Python, mostly). This usually involves isolating the 'meat' of the algorithm and designing a maintainable, flexible software architecture around it that allows deployment for other applications, adding scripting interfaces on top of the models, better regression testing, etc.
It may sound like a stupid idea to keep 2 implementations of the same idea around, but it's not: trying to productize a moving target, with both sides of the problem implemented in one piece of code, takes more time, carries more risks, and causes more friction between the involved parties.
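As a minimal sketch of what that kind of regression testing can look like, here is a hypothetical Python example. The filter, the reference values and all names are made up for illustration; in a real setup the reference cases would be exported from the QA'ed Matlab prototype rather than typed in by hand.

```python
import math

# Hypothetical reference data: input/output pairs for a simple
# 3-point moving-average filter, standing in for values exported
# from the QA'ed prototype.
REFERENCE_CASES = [
    ([1.0, 2.0, 3.0, 4.0], [1.5, 2.0, 3.0, 3.5]),
    ([0.0, 0.0, 9.0, 0.0, 0.0], [0.0, 3.0, 3.0, 3.0, 0.0]),
]

def moving_average(samples, window=3):
    """Productized reimplementation of the prototype's filter.

    Edge samples are averaged over the part of the window that
    fits inside the signal.
    """
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo, hi = max(0, i - half), min(len(samples), i + half + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out

def regression_test(tol=1e-9):
    """Compare the reimplementation against every reference case."""
    for inputs, expected in REFERENCE_CASES:
        got = moving_average(inputs)
        assert all(math.isclose(g, e, abs_tol=tol)
                   for g, e in zip(got, expected)), (inputs, got, expected)
    return True
```

Keeping the reference cases in version control next to the productized code means every change to the reimplementation is automatically checked against the prototype's behavior, without the scientists and engineers having to coordinate on every commit.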
The way we solved it at my job works great for everyone. The only annoying thing here is middle management, who like to think they can do a better job at both the science and the software engineering work, and sometimes decide we 'have to productize the Matlab code', we 'have to have 1 implementation only', or that 'we don't need to allocate a lot of time for productizing, because if the functionality is done, translating it to a usable software component is easy'. :-/
Google Boots Transdroid From Android Market
So what's the big difference between side-loading on Android and side-loading on a jailbroken iPhone, that you would want to 'not count jailbreaking the iPhone'?
Both involve messing with the device and software in ways that are outside the comfort zone of typical smartphone users, but neither is difficult or invasive enough to scare off people who don't mind doing it. I don't see the difference here, but I guess it's fair game to use a sliding scale for comparisons that involve Apple these days.
Synaptic Dropped From Ubuntu 11.10
Right... I'm a long-time Ubuntu user and I love it, but it doesn't best either Windows or OS X in the majority of cases people care about. A few compiz desktop effects don't change any of that.
Dutch To Introduce Net Neutrality By Law
Well, I have no idea how many murders you've got annually, but in a country of 16.7 million inhabitants, we had the following numbers of murders from 2010 back to 2005: 170 / 178 / 161 / 143 / 149 / 201
And about half of these have been vendettas, criminals killing other criminals. I think you can safely say we have a pretty safe country, crime-wise (don't leave your bike unattended for too long, though ;-)
Android Honeycomb Will Not Be Open Sourced
"Google refuses to release embarrassing code to a world of incompetents who could potentially ruin Android's reputation by shoehorning Honeycomb into devices it was never meant to be shoehorned into". Sometimes openness just needs to take a backseat in order to protect reputation.
Seems like Google doesn't have any problem providing the Motorolas, Samsungs and LGs of the world with this 'embarrassing code' and letting them sell half-baked, buggy devices running an OS that nobody can modify or improve. Apparently 'protecting their reputation' means a lot more to them than the user experience of their customers, or being 'open'.
I really don't care the least bit about what Google does with the Honeycomb source code; they are probably right to hold it back, because it was a rush job and not pretty to look at. That said, I think we can all safely put the hollow 'Android open, Android free!' nonsense behind us. Android is only open to the manufacturers and carriers, and Google's priorities lie with them, not with you, who were suckered into buying a tablet running beta software.
I'm still amazed that so many people put up with this. If I pay $500 for a device that is not explicitly marketed as beta, as a curiosity for the adventurous, I expect it to work as advertised, including the software. If the software is so messy that even Google doesn't want you to see it, then ffing clean it up and make it better before selling products based on it.
Tasmanian Dept. of Education Wants Anti-Virus for Linux, OS X
There's 90% of Windows malware wiped out. The user is, always has been and will always be the biggest source of infection. Even in the Windows world and especially today when a patched Win 7 and Office suite aren't vulnerable to drive by infections.
What does Windows have to do with anything? The statement was that there's "more OS X and Linux malware around than you might expect", which (at least to me) implies that this amount of malware is substantial enough to care about.
I love how Mac fanboys need to move the goal posts to justify their positions. But here you go anyway
Great, ram your point across by throwing stereotypes around, that's really going to help your argument /s
No doubt you have some wonderfully convenient excuse to ignore this.
No wonderfully convenient "excuse" is necessary here, because your 'list of OS X threats' is laughable and does nothing but disprove your own argument. In 10 years of OS X history, apparently only 43 pieces of malware have been identified, most of which are Trojans, which -in your own words- depend on the user as 'the biggest source of infection', and for which antivirus software is completely unnecessary. If anything, that list proves that OS X is more or less immune to viruses and malware, and that a fully patched OS X install does not need antivirus, just common sense.
From your own signature:
Calling someone a "hater" only means you can not rationally rebut their argument.
And what does calling someone a 'Mac fanboy' make you?
Tasmanian Dept. of Education Wants Anti-Virus for Linux, OS X
Wow, no fewer than *FORTY-EIGHT* OS X 'threats', some of which are 'proof of concept' malware and almost all others are simply Trojans or scripts that do absolutely nothing unless you start and authorize them yourself.
I guess I can still sleep at night without a virus scanner...
Are Graphical Calculators Pointless?
Also arguably, this was more useful to me than rote-learning the proof of the quadratic formula.
I would like to hear that argument.
Like you already hinted yourself, for many people it really is much more useful to devise and implement a clever 'cheat' than to just blindly repeat some mathematical trick the way the teacher wants it repeated. There are all kinds of real-world skills involved in cheating your way through the common math exercises taught in schools, which usually teach students no more than how to jump through hoops.
As an example: myself, I have never been a big star at maths in school, because I found deriving stuff by hand or simplifying trigonometric expressions extremely boring and pointless. I did (and do) understand the fundamental concepts behind them, but I didn't (and don't) know off the top of my head what the derivative of x^2*log(2x+1) is, and if I ever need to know, I'll just fire up Mathematica. Because I consistently slacked on my math exercises and homework, I got low grades and no satisfaction from mathematics at all.
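For what it's worth, even without Mathematica you can sanity-check a hand-derived derivative numerically; a small sketch in Python (the analytic formula below is my own working via the product and chain rules, so treat it as the thing being verified, not as a given):

```python
import math

def f(x):
    # The example from the post: f(x) = x^2 * log(2x + 1)
    return x * x * math.log(2 * x + 1)

def f_prime(x):
    # Hand-derived via product + chain rule:
    # f'(x) = 2x*log(2x+1) + 2x^2 / (2x+1)
    return 2 * x * math.log(2 * x + 1) + 2 * x * x / (2 * x + 1)

def numeric_derivative(func, x, h=1e-6):
    # Central-difference approximation of the derivative at x
    return (func(x + h) - func(x - h)) / (2 * h)

# Verify the hand-derived formula at a few sample points
for x in (0.5, 1.0, 2.0):
    assert math.isclose(f_prime(x), numeric_derivative(f, x), rel_tol=1e-5)
```

The central difference has O(h^2) error, so agreement to five significant digits is a reasonable check that the symbolic derivation is right.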
Now, eventually I chose to go to college and study Computer Science at a university famous for its extremely theoretical, formal CS curriculum. I got my MSc without any major problems, and I got a job writing very complex and very specific simulation models full of FFTs, (non-)linear regression, computational geometry, curve fitting, etc. It turned out that skills like being able to find and understand relevant literature, decomposing problems, thinking in abstractions, and most importantly communicating with people who are experts on the theory, are much more important than learning a few tricks so you can show the teacher how to 'prove the quadratic formula' (read: repeat some symbolic gibberish you might not even understand).
If everyone were destined to be a mathematician or a teacher, learning all these tricks and details might be the most useful way to teach mathematics. In reality, most people will only ever need to understand the basic theory behind math concepts, and asking them to prove all kinds of random stuff, derive expressions by hand, and rewrite and simplify expressions does not help them at all; it is more likely to scare them away from mathematics completely.
Apple's Secret Weapon To Win the Tablet Wars
You are fooling yourself if you still believe in that 'Apple cult' crap by now. Of all the people I know who use Apple products, maybe one or two could be described as having a 'cult-like dedication to Apple', and then I would be including myself just so I can come up with more than one example. Most of them just buy stuff because they like the way it looks and works, and because they hear their friends say good things about them. With over 90% 'high or very high' customer satisfaction, Apple doesn't need any 'cult-like dedication', the word-of-mouth marketing from their users already does half the PR for them, and the main driver behind that is customer satisfaction.
Now, of course you could argue that the customer satisfaction numbers Apple scores are inflated, and that this 90% of satisfied users all have Stockholm syndrome, or are simply bragging to each other because they like to pimp their gadgets, but that simply doesn't make any sense at all. No company can keep selling polished turds for over 10 years and still have the whole world think their products are great while they aren't. You might be able to pull that off once, using sufficient hype and a big marketing push that distracts from the downsides of your product, but if it actually sucks and is not worth its money, you'll be out of business within 1 or 2 generations of your product. Nobody buys a polished turd twice.
In my perception it's more often Apple's competitors that try to create hype and distraction to sell inferior products to a loyal following of customers who don't want to buy anything made by Apple, out of principle. I'm not implying that includes you yourself, but when people can't seem to get past the 'Apple cult', 'sheep', 'buy everything with an Apple logo' or 'believe everything their god Steve Jobs tells them' rhetoric, it usually means someone is trying too hard to justify their own personal preferences, without having to acknowledge that they are 'different' from what most 'normal' people enjoy in their tech purchases.
Why Mac OS X Is Unsuitable For Web Development
I was writing a C/curses application with pthreads on OSX that compiled with no modifications on Linux. Ran fine, too, after I changed a stupid assumption about select() that worked on *BSD but not Linux. And that was my fault for not following POSIX.
My last job was writing software that was tested on and deployed to Solaris/SunOS (yes, the really old version) on SPARC and x86-64, FreeBSD 4 and 5, 3 different Linux distros (Red Hat, Fedora Core and Ubuntu on the development workstations) on 3 different architectures (x86, x86-64 and Itanium), and HP-UX on PA-RISC. You could say a pretty large subset of common Unix systems and platforms.
To extend the scope of our regression testing at one point we included OS X on PPC (since we didn't have any other PPC machines) just to see if it would be easy and if it would catch some issues masked on other OS's. It took 1 day to get all the build scripts adapted for OS X (which needed customizations for *every* OS we supported), and the complete source base except for some ancient Tcl/Tk GUI worked right away. Over 10 million lines of C and C++ code.
Over the years we've had that OS X machine, it was one of, if not *the*, system that needed the least tweaking or working around non-standard C/C++ standard libraries, Unix userland differences, compiler problems, etc. The HP-UX box was the worst with its ancient compilers; the Solaris machines came in second with their masochistic userland and compatibility libraries installed in a million different locations on every system; the Ubuntu and Red Hat systems were easiest, but OS X and BSD were not far off.
So yes, I can attest that OS X is a perfectly capable OS for Unix development, even if you go much lower-level than Python. If you manage to write code that runs perfectly fine on BSD or Linux but breaks on OS X, 99 out of 100 times it's because your code is dodgy and you are simply 'lucky' it didn't break on other Unixes yet.
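To illustrate the kind of per-OS build-script customization mentioned above, here is a tiny hypothetical sketch in Python; the flag names and OS-to-flag mapping are invented for the example and not taken from any real build system.

```python
import platform

# Hypothetical mapping from the OS name (as reported by
# platform.system()) to extra compiler flags, in the spirit of the
# per-OS build-script tweaks described above.
OS_FLAGS = {
    "Darwin":  ["-DOS_MACOSX"],
    "Linux":   ["-DOS_LINUX", "-D_GNU_SOURCE"],
    "FreeBSD": ["-DOS_FREEBSD"],
    "SunOS":   ["-DOS_SOLARIS", "-D__EXTENSIONS__"],
    "HP-UX":   ["-DOS_HPUX"],
}

def flags_for(system=None):
    """Return the extra compiler flags for the given (or current) OS."""
    system = system or platform.system()
    try:
        return OS_FLAGS[system]
    except KeyError:
        raise RuntimeError("unsupported platform: %s" % system)
```

Centralizing the per-OS differences in one table like this is what makes adding a new platform a one-day job instead of a hunt through every build script.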
Google Delays General Release of Honeycomb Source
[..] but the iPad2 is very much "me too" compared to the Xoom [..]
Wow, this must be one of the funniest things I've read on here in a long time. The iPad basically defined the whole market for consumer tablets, and the iPad 2 simply widens that gap even more. It was released publicly, and in great numbers, before any of the Honeycomb tablets; it trounces all of them in terms of specs (except for some very minor details) AND price; it's a product that actually works and is finished, not a half-baked rush job; and it has a library of applications somewhere in the neighborhood of 1000 times larger than Honeycomb's. Of all the other competing tablets running other OS's, none have materialized beyond vaporware. Yet here you are, claiming that Apple is playing catch-up, and that the iPad 2 is very much 'me too' compared to the Xoom, which has no applications, weaker hardware, a higher price, and runs half-baked, almost-beta software that doesn't even support all of its hardware features (the ones Motorola actually got around to adding, instead of just advertising with them, such as 4G and Flash). And then they say 'Apple fanboys' are living in a reality distortion field... :-S
[..] and the iPhone 4 is behind the times compared to the Android 4G, multi-core, and now 3D offerings[..]
The iPhone 4 is almost a year old and the iPhone 5 will be around in a month or 3. Yet, even without dual cores, it's still snappier than any of the very few Android handsets on the market that have a dual-core CPU (I think there are only 2 or 3 available right now; I know of just the Atrix 4G and the LG x2). Virtually no Android applications make any use of the dual cores at all, by the way. Which leaves your argument with just a stupid gimmick like 3D, and a network technology that's more marketing fluff than an actual performance specification. Over here, Vodafone will be upgrading their 3G network to support 28 Mbps speeds, while that '4G' thing you are talking about (which in fact is just a name for a handful of different technologies, some of which are not even '4G' by definition) is hardly faster than 7.2 Mbps 3G in many areas.
DirectX 'Getting In the Way' of PC Game Graphics, Says AMD
The GUS Sound Blaster support (it was emulated, by the way) was terrible and didn't work properly in many games. When games started moving to Windows the GUS was basically dead, because Gravis never managed to write decent drivers for it, which had something to do with the fact that the GUS did not support DMA-streamed audio or something. Its wavetable/sequencer-based design was almost fundamentally incompatible with the way Windows sound drivers were supposed to work. I never managed to get decent performance out of the GUS under Windows with the experimental drivers Gravis released but stopped developing before they actually worked.
Which was a shame, because I always loved my GUS, back then it was without any doubt the best soundcard for wavetable sequencer music, and years ahead of the competition. I did buy a Gravis Ultrasound MAX after that, which had a secondary SB16 compatible chip (an ESS Maestro something) for Windows sound.