Finding More Than One Worm In the Apple
This is clearly the automatic resolution of a merge conflict by the version control software. These bugs are a nightmare to debug and happen all the time, because developers rarely review their entire change visually after a merge. Some of them can be caught by static analysis that enforces coding standards (such as requiring braces, or indentation that matches the lexical scope), but bugs from automatic conflict resolution will only really go away through better version control software. These are without question the worst and most frustrating bugs.
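As a sketch of the kind of bug this produces (the function name and values are hypothetical), here is a brace-less `if` whose second line was left outside the guard by an automatic merge. The indentation still looks right, which is exactly why a visual post-merge check misses it, and why a standard that forces braces, or a warning like GCC's `-Wmisleading-indentation`, catches it:

```cpp
#include <cassert>

// Hypothetical post-merge code: the braces around the if-body were lost
// when the conflict was auto-resolved. The indentation claims both lines
// are guarded, but the lexical scope guards only the first.
int clamped(int v) {
    if (v > 100)
        v = 100;
        v += 1;   // looks conditional, but always executes
    return v;
}
```

Compiled with `g++ -Wall` (which enables `-Wmisleading-indentation`), this draws a warning; a coding standard that mandates braces would have kept the merge from producing it silently.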
Linus Chews Up Kernel Maintainer For Introducing Userspace Bug
I would never speak to someone like that. However, I would love it if Linus talked like that to me; I would learn so much from that man. Also, we don't know anything about their personal relationship. I have friends who treat each other like this every day and are still very good friends and colleagues, but they know not to speak this way to me. Linus can talk to me like that any time; my friends and co-workers can't. You establish these limits when you start any friendship or professional association.
Mauro did make a major fcukup. He accepted a patch that returned an obviously invalid error value for an ioctl request. The worst part is that the error code was changed depending on its value, as if to sweep the problem under a rug. The beauty of the LKML is that there is no rug. The reason Mauro got burned is that he is a seasoned maintainer.
Mauro did not deserve to be shamed on the Slashdot front page. Yes, he did blame userspace for the problem he created as the maintainer of the media subsystem, and any new kernel developer should make sure they know the coding style and inherent laws of the subsystem they are focusing on. I am sure he understands the USB video device class very well, but he submitted code that goes against basic pragmatics (changing an error value).
We all enjoy drama, and this conversation is gold for any new or old developer. I learned something from this argument: it taught me to refocus on the meticulous details of my code and to rehash the basics of all the programming paradigms (which should be obvious anyway). I gather this was Linus' intent; he just took a very direct approach to filling our minds with some valuable knowledge.
Why not take something positive from this mailing list instead of contributing something negative? Stop bashing Mauro. He has a family, and stresses in life that will be exaggerated by this blunder. Linus' comments were well deserved, and Mauro took the harsh criticism fairly well after foolishly brushing off the initial call to his attention, though he ended his follow-up with a regrettable remark. I am sure he is now trying to ignore the superfluous flames outside the LKML and refocus his attention on learning through reading the code in his subsystem. The most important first step is to read the kernel source. If you don't understand some aspect, you need to look deeper. It is never enough just to code; you have to keep reading and learning throughout your entire career as a developer.
When I read a book, I look up every single word I don't know. This has become habit, and as a result, the more I read, the less I actually need to look up words. Apply the same approach to your coding: when you are reading code, don't skip over any function or operation you *think* you understand.
Inside the Raspberry Pi Factory
Wow. You are really stretching my words to make that speculation. In no way did I refer to RIM; just because I am Canadian, you assume that I am a die-hard RIM supporter. I am talking about the UK, not Canada. I will not go off topic on my own post, but responses like yours make me slowly lose hope for the Slashdot community.
To reiterate, my comment was that if such stickers existed, ARM would be the most popular company in the world, since their chips are used in everything; unfortunately, their chips wouldn't be used in everything if they had such a requirement. ARM definitely does not have a bad rap, so you are definitely way off on the point of my topic.
If you think there is a lack of dominance from the UK in the industry, then you really know nothing about the industry. QED
Inside the Raspberry Pi Factory
This is slightly off topic, but still very related to the OP.
The UK computer industry enjoyed a mini-renaissance in 2012 thanks to the popularity of the $40 Raspberry Pi
Are they serious? Do they even know where the ARM SoC is designed?
It amazes me that Arm Holdings stock was only around $20 a few months ago, when they are without question the most dominant, stable, and secure tech company in the world. Both Apple and Google are completely dependent on the licenses they have acquired from ARM to use its RISC-based, ultra-low-power CPUs in their devices, and to allow the manufacturers (Samsung, TI, etc.) to build those chips, and yet in some cases their stocks are worth twenty times more.
This amazes me, but at least ARM's stock has doubled in the past few months. There is NO bigger player in the computer industry than the UK. I make this claim upon the fact that mobile is now the dominant platform, and ARM is the only real player in that game (as of yet). Anyone can license and manufacture these chips cheaply and give us crappy hardware as a result, but the ingenuity is in ARM's reduced, low-complexity instruction set, which allows for their ultra-low-power designs and is why almost everybody uses their SoC designs.
The only reason nobody realizes this, and their stock has been stagnant in the past, is that there is no "ARM inside" sticker on every ARM-based device made. If there were such a sticker, they would be beyond any doubt the most popular company in the world.
Disclaimer: I am Canadian (and live there at the moment), but I am also a UK citizen. I don't hold any ARM stock, though I am kicking myself that I have yet to acquire any, since it would have almost doubled in value over the past year.
Apple Hides Samsung Apology So It Can't Be Seen Without Scrolling
For a tech-savvy company, they don't seem to understand online social dynamics at all. They should have thought about the Streisand effect before making another such massive marketing slip. If they had just used Gestalt principles, they could have hidden the notice in plain sight.
Linus Torvalds Will Answer Your Questions
Linus, aside from the GNOME 3 fiasco, which you have been clear about, do you think that R&D into tablet interfaces is infecting the desktop experience in general, especially with regard to user productivity? I mean this with respect to the new Metro interface in Windows 8, the Unity interface in Ubuntu, and the transformation of Mac OS into a walled garden (hypocritical to be sure, especially for an OS based on a BSD variant). I want the freedom to work and play the way I want, without giants forcing me to play the game their way.
WTFM: Write the Freaking Manual
I second this post. K&R is the evolution of the C specification, and some C compilers were written entirely from the book itself. C++ was developed in 1979 but not released to the public until 1983 (the year I was born; coincidence? I think not). Bjarne Stroustrup worked for Bell Labs, which kept the official documentation, "The C++ Programming Language", under wraps until 1985. Wow, two whole years (though I'm sure too long for some of you old wizards). It is without question the best meta/template programming manual ever written.
inability to communicate good ideas is more an indicator the ideas aren't that good
If you find his book too complicated, then check out Stroustrup's other book, "Programming Principles and Practice Using C++", which is more like a high school textbook. Or check out his amazing and simple site for the aggregation of information crucial to C++.
He may not be the greatest writer, or the most congenial, but his ideas were great, and no one can argue against that. No language is as dominant or as crucial to the world's infrastructure, and his books (and his online material) are a great companion in your time of need.
Ask Slashdot: Best Book For 11-Year-Old Who Wants To Teach Himself To Program?
I recommend he learn some well-established API. He should get straight into it: something graphical, with very intense visual feedback. He should start by just getting the demos to work. They are *really* simple to set up, and give great insight into the capabilities of the frameworks. Then, as he progresses, he will learn other things he can accomplish and add to them.
When I was a kid, I would only read a book if I knew it contained the solution to a problem. I would read the book cover to cover if I had to, but in general, I was always more interested in building rather than reading.
My only intent was to play video games, and I would do anything to get them to work. I would save every penny just to buy another MB of RAM. I was constantly changing motherboards and processors as well. When I was 11, I was writing batch scripts; at that time, I was on the 80386 architecture and DOS. I moved to Windows to play games like SkiFree, and to play around with Winsock and TCP. I got into IRC for videos and music, which also introduced me to the concept of a newsgroup. At that point everything changed: any problem I had could be solved with ease. Generally, somebody else had already faced the same problem, and they had solutions!
I got into web development pretty early, but I got bored with it quickly, since at that time the specification was pretty limited, and I was always more interested in playing video games and sports. I'm 29 now, with a bachelor's in computer science and a master's in engineering. I do numerical programming with C++ and OpenCL.
I only got into hardcore programming in university, but my general experience with computing placed me highly among my peers. My university was arguably the best in the country for that particular undergraduate program. Lots of my friends had fathers who were programmers and taught them a lot at a young age, but none of them progressed faster than I did. I easily caught up to them.
My point is that you do not need to push him into programming; he may lose interest very quickly. My interest in things has always been very volatile, since there are so many other things to be interested in as well (sports, music, novels, etc.). If his passion is genuine, he can catch up easily.

The one corollary is that each generation is exponentially more intelligent than the previous one. My generation was the first to have access to an overabundance of information. Before, there was a deficit of information and a surplus of attention; now there is a surplus of information and a deficit of attention. Access to such an infinite pool of information has made me much more ingenious than my father's generation, and our kids will most likely be exponentially smarter than we are. Of course, previous generations more easily focused on one particular field, which had its merits, as they made incredible discoveries. The argument that there are fewer things to discover now is bulls$%*. We still understand very little in the scope of things. He does not need to become the next teenage billionaire; his passion(s) just need to be nurtured properly. LET HIM PLAY! He has to work for the rest of his life.
Scientist Who Oversaw OPERA's Faster-Than-Light Neutrino Study Resigns
Einstein's theory was only accepted after the empirical work done by Eddington. Einstein's theories were completely outrageous at the time.
I did not imply that neutrinos travel faster than light; it has clearly been proven otherwise. I meant for you to infer that there may exist some unknown particle that could still travel faster than the speed of light.
when you get your mistake plastered all over the media you do look a bit silly.
So it is the scientist's fault that his ideas were sensationalized by the media? He said that he doubted his own results and just wanted other scientists to verify his work. People love making other people look stupid just to make themselves feel smart.
Scientist Who Oversaw OPERA's Faster-Than-Light Neutrino Study Resigns
What if Einstein had resigned from his post just because he challenged Newton's laws? What if Ereditato is actually right, but this specific experiment was wrong? There are too many old, jaded, stubborn researchers without an open mind, who are holding back progress in every field. There is nothing wrong with questioning something and being proven wrong.
Thick Dust Alters NASA Mars Rover Plans
If it ever found water
Ask Slashdot: Tools For Teaching High School Kids How To Make Games?
Ogre is a 3D rendering engine with a very large community around it. We used it for a proof of concept for a real-time simulator, and we faced few limitations in using it. It might not be as clean as Unity, but it has more flexibility in licensing, as long as you don't mind copyleft, which in your situation shouldn't be a problem. Actually, I just looked, and it seems they have switched to the MIT license.
The proof of concept was actually a major improvement over the production simulator, but of course business politics always wins in the end.
Ask Slashdot: Good, Relevant Usability Book?
The Design of Everyday Things and The Invisible Computer are enlightening reads. A good text is "Contextual Design" by Beyer and Holtzblatt.
Rare Earth Restrictions To Raise Hard Drive Cost
Our rare earth mines are just getting started, but the deposits are huge. It may be a couple of years before they relax the market, but by no means do they control the rare earth market; they want you to think that to inflate demand. There are lots of other sources being surveyed all over Canada (and all over the United States, for that matter).
AMD Betting Future On the GPGPU
AMD is far from the only company making this bet. For one, the bet is backed by Apple, the creators of OpenCL. Nvidia has a GPU computing SDK with full support for OpenCL on all major platforms. Even Intel has recently provided Linux drivers for OpenCL, having supported Windows for a while. ARM will soon have an implementation for its Mali GPU architecture.
I use OpenCL for nonlinear wave equations. There may only be a few OpenCL developers at the moment, but with articles like this, the community will only grow larger. Anybody else out there? What do you use it for?
10-Year Study Reveals Electron Shape
No matter how high an order you use for an approximation, there will always be a truncation error. That is the problem with using infinite series to represent physical models.
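A minimal sketch of the point, using the Taylor series for exp(x) (the function name is my own): however many terms you keep, the discarded tail of the infinite series is the truncation error, which shrinks as the order grows but never reaches zero.

```cpp
#include <cmath>

// Partial sum of the Taylor series exp(x) = sum over k of x^k / k!,
// truncated after the term of order n.
double exp_truncated(double x, int n) {
    double term = 1.0, sum = 1.0;
    for (int k = 1; k <= n; ++k) {
        term *= x / k;   // builds x^k / k! incrementally
        sum += term;
    }
    return sum;
}
```

The error `std::exp(1.0) - exp_truncated(1.0, n)` drops rapidly with n, but for any finite n it stays nonzero, until it eventually vanishes below the floating-point precision of the representation, which is itself another truncation.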
Ask Slashdot: Alternatives To Tor Browser Bundle For Windows?
Firefox in private browsing mode with NoScript, HTTPS Everywhere, BetterPrivacy, and the Tor Button extension (with Tor and Privoxy); an IP filter (Linux: ipblock, Windows: PeerBlock); the FlagFox extension for a constant visual cue of where the server is located; a system firewall rejecting everything but the specific ports you use (22, 80, 443, other program-specific ports, etc.); and an intrusion detection and prevention system like Snort with a log viewer. If I were really worried about privacy, I would also connect across a free online VPN (preferably one in your own first-world country). It may be possible to daisy-chain a few VPNs (and a few proxy servers, for that matter), but I'm not really that interested in security.
Revolution of the Science Fiction Authors
Literature critics seem to be as uptight as art critics. These people think they can define what is considered "good" fiction, but they really have no idea what constitutes quality fiction, so they can only compare works to other works to quantify quality.
The great part is that these people really have no say in the matter. If it were not for artists like Jackson Pollock, who went against the established fine grain, art would still be in the dark ages. The same is true for fiction.
My definition of contemporary epic science fiction is the Ender's Game series by Orson Scott Card.
My definition of contemporary epic fantasy is Song of Ice and Fire by George R. R. Martin.
The origin of both is the epic Odyssey and Iliad of Homer. Science fiction is just fantasy plus science; there is no dichotomy between the genres, as some people may think.
Some people may put Dostoevsky's (and others') work in a different league from the work mentioned above, but that is incredible (as in not credible) in my opinion. I am by no means saying that their work is better than Dostoevsky's, but their works are certainly derivatives of such fiction. If purists question the quality of these novels, then I question their ability to read. Subjectivity is a sharp tool that is better left in the shed.
Apple Logging Locations of All iPhone Users
"The most immediate problem is that this data is stored in an easily-readable form on your machine. Any other program you run or user with access to your machine can look through it."
Apple may not upload it while syncing or via a scheduled cron job, but any single app can read it. Also, as others have said, prove to me that no proprietary Apple application ever accesses the file. The location data resolution is set to one-second intervals; that is insane. They could easily know when I take a piss just from how often I frequent that specific location for short periods.
The Hobbit Filming at 48fps
The 2D-to-3D conversion process has really destroyed the reputation of stereoscopy. It is a poor attempt at achieving the real thing, which can only be done with two separate cameras side by side filming the same scene (or two digitally rendered scenes with separate eye points). We have two eyes, so capturing a scene requires two lenses. Replicating it on a video screen is another challenge.
I'm not sure that most people understand that you are supposed to look into the screen, as opposed to having things come out of the screen.
I love perceiving depth in movies, and once head tracking is used to modify the view frustum dynamically and is combined with stereoscopy, the 3D experience is going to be incredible. Unfortunately, this is impossible for more than one viewer at the moment, unless they can somehow achieve a high enough frame rate for multiple viewers (with shutter glasses set to different time intervals). By the time that becomes practical, I'd say light fields will have taken over. But even with a light field, we will still want infinite backgrounds (and side views) with depth perception. I'm not sure I really like the idea of the parallax barrier used in the 3DS, but I imagine it produces satisfactory results.
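As a sketch of what "head tracking modifying the view frustum" means in practice (the names and numbers are my own), the left/right near-plane bounds of an off-axis frustum follow directly from the eye's position relative to the fixed physical screen; stereo just evaluates this twice, with each eye offset by half the interpupillary distance:

```cpp
// Left/right bounds of the near plane for an off-axis (asymmetric)
// perspective frustum aimed at a fixed physical screen.
struct FrustumLR { double left, right; };

// screen_half_w: half the physical screen width
// eye_x:         lateral eye position relative to the screen centre
// eye_dist:      eye-to-screen distance
// near_z:        near-plane distance (all units consistent, e.g. metres)
FrustumLR off_axis(double screen_half_w, double eye_x,
                   double eye_dist, double near_z) {
    double s = near_z / eye_dist;  // projects the screen edges onto the near plane
    return { (-screen_half_w - eye_x) * s,
             ( screen_half_w - eye_x) * s };
}
```

Head tracking just updates `eye_x` (and the other axes) every frame; for stereoscopy, render once with `eye_x = head_x - ipd / 2` and once with `eye_x = head_x + ipd / 2`.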
I imagine that people actually have trouble with convergence. I'd say it is the same when they try to view a stereogram. I have a lot of trouble viewing a stereogram, but I have no trouble viewing 3D video. I think people incorrectly squint their eyes too hard, as if they are trying to see a far-off or blurry scene, which would give anyone a headache. It is a natural reaction to squint when trying to perceive something, especially for anyone who wears glasses.
Binocular cues are incredibly important for a true viewing experience. Tennis has already become a huge adopter of the technology; it lets officials tell whether a ball landed inside the line. All the other sports will soon follow. I can't wait to look back and laugh at all the haters/flamers.