Non-Coders As the Face of the Learn-to-Code Movements
Because being able to use logic to write instructions that are correct and unambiguous is a skill that everyone should learn. And that's basically what coding is.
It's like literacy or numeracy or basic understanding of science. You have a problem as a culture if it is culturally acceptable to say "I can't do math" or "I can't understand written language" or "I have no idea about the universe around me or how people go about understanding it" or "I can't read or write logical directions."
Do you expect everyone to be a best-selling novelist (or a writer enjoyed throughout history)? No.
Do you expect everyone to be the next Ramanujan? No.
Do you expect everyone to be the next Knuth? No.
But it is expected that everyone have basic skills in these kinds of things. It's just necessary to understand the world. If you don't understand these kinds of things -- if you don't have basic skills in language or mathematics or logic -- then you are at a disadvantage in modern society.
I group computer science / logic here separately from mathematics. Perhaps it shouldn't be. But having a population that doesn't understand things like this should be considered as problematic as a population that cannot read and write.
Who Is Liable When a Self-Driving Car Crashes?
If you are driving, you are responsible.
A car that drives itself is responsible for itself.
The one who pays in the event of an accident is the driver -- in this case, the car. So the manufacturer would probably be liable.
Manufacturers will probably get insurance for the car when driven autonomously. If self-driving cars are safer, this should be a lower insurance rate than you pay now. Additionally, self-driving cars will probably have sensor input that will prove/disprove fault.
Intel's Haswell Chips Pushing Windows RT Into Oblivion
Woah. Woah. Woah. Woah. Woah.
I will let people crap all over a post that's basically regurgitating Intel Developer Forum drivel, and I'm certainly not going to say that WinRT has a future.
But I will NOT let you trash talk Alpha.
The Alpha was simply a much better processor than anything from Intel at the time. It was pretty much the fastest out there, though you might argue for some high-end POWER or MIPS R10000 chip or something.
Maybe you were running Windows and x86 programs on the Alpha? Those weren't blazing. But native Alpha programs were fast fast fast. And the architecture is clean and beautiful. Just beautiful.
So you can say that ARM has not much advantage over x86 today. That's probably true. You can say that ARM sucks, has too much complexity, and the system architecture is an abomination. That's probably true also. But you leave the Alpha out of your talk unless you know what the hell you're talking about.
The Grasshopper Can Fly Sideways
Your "1ST rule of Rocket Engineering" can also be stated: You always develop sub-optimal rockets.
Seems like a stupid rule to me.
If an engine goes out, or there is some other problem, you need extra fuel to accomplish the mission (increased gravity drag). So you have some extra fuel and extra delta v, and that's a good thing.
But if those events are rare -- and, eventually, they should be -- then you often have extra fuel. If you can use that fuel to return the craft intact to reuse and make more money, then I think that's a damn good idea. If you must burn the extra fuel, then you will lose the stage. It will cost the company more, but "less profit" is maybe an OK choice.
The goal is to optimize cost while maintaining very high reliability. For very high reliability, you need to understand worst case behavior. For optimizing cost, you need to make the common case cost efficient. Having extra delta v for anomalies and using that delta v to lower launch cost (via reuse) when no problems arise seems like smart engineering to me.
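The fuel-margin argument above can be sketched with the Tsiolkovsky rocket equation. The Isp and stage masses below are made-up illustrative numbers, not figures for any real rocket:

```python
import math

def delta_v(isp_s, m_wet, m_dry):
    """Tsiolkovsky rocket equation: delta-v = Isp * g0 * ln(m_wet / m_dry)."""
    g0 = 9.80665  # standard gravity, m/s^2
    return isp_s * g0 * math.log(m_wet / m_dry)

# Illustrative first-stage numbers: 300 s Isp, 25 t dry mass,
# 375 t of propellant, flying with a 5% propellant reserve.
isp = 300.0
m_dry = 25_000.0
m_prop = 375_000.0
reserve = 0.05 * m_prop

# Delta-v of the mission burn, stopping with the reserve still in the tanks,
# versus the extra delta-v the reserve alone buys if nothing goes wrong.
nominal = delta_v(isp, m_dry + m_prop, m_dry + reserve)
margin = delta_v(isp, m_dry + reserve, m_dry)

print(f"delta-v to reserve cutoff: {nominal:.0f} m/s")
print(f"delta-v left in the reserve: {margin:.0f} m/s")
```

When the flight is nominal, that leftover delta-v is exactly what a boost-back and landing burn can spend; when an engine goes out, it covers the increased gravity drag instead.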
Ask Slashdot: Exploiting 'Engineering And ...' On a Resume?
You probably have a clearance. This is very valuable to many employers. Make sure you have that at the top of your resume.
Seriously, though, Clearance + EE is quite valuable. If you're worried about seeming "rusty" on the engineering side, get an MSEE from some university... a lot of very good universities have distance programs where you might be able to get started early.
Will Donglegate Affect Your Decision To Attend PyCon?
There seem to be two groups of people here:
The first group of people is not offended by jokes, including jokes influenced by sexuality.
The second group of people is offended by jokes, especially jokes influenced by sexuality. A subset of this group is offended by such jokes when spoken by members of a certain gender. Of course, this is discriminatory so we will ignore that aspect and categorize them as offended in general.
I think there is a desire to be respectful of the second group while avoiding strict censorship of the [majority] first group.
I suggest a clearly visible sign that someone is offended by jokes influenced by sexuality (or, perhaps broadening this to include all jokes?). Perhaps a yellow hat or something like that. People within earshot of such people should refrain from telling such jokes. People wearing the sensitivity marker who hear things offensive to them can raise the issue to convention staff who will attempt to deal with the issue. People wearing the "sensitivity" marker who make such jokes will permanently lose the right to wear them.
People not wearing the sensitivity marker who hear something offensive to them should either (A) indicate to the offensive person directly that their conduct is perhaps inappropriate, or (B) move away from the offensive person so that they are no longer offended. If (A) is ineffective and (B) is ineffective or impossible the convention staff can be notified and they may or may not choose to act; anyone not wearing a sensitivity marker who is upset is free to go put on a sensitivity marker.
People may wish to have activities which include things that others find offensive; they are free to ban sensitivity markers at those activities. Additionally, "sensitivity-marker free zones" or "automatic sensitivity marker" zones could be created. Or even entire conventions where no sensitivity markers are allowed -- one would expect a crude joke convention to probably not cater to overly sensitive people.
Of course, in an ideal world, everyone would be adult enough to know to watch their language a little bit, and to not overreact a lot. But given that certain people are especially sensitive for various reasons, we should find a way to allow them to coexist with the rest of society.
Defcad.com Wants To Be the Google of 3D-Printable Guns
It's usually hard to copyright a "thing". If you make a thing -- a new type of shelving or gun or glass or pen or chair or whatever -- you can't get a copyright on it, you can maybe get a patent on it.
So for a CAD file of a gun, the CAD file could be copyrighted... but it would be copyrighted by the author, not by the manufacturer of the gun it was a clone of (unless they were the author, of course). Now, printing out the gun might be manufacturing something covered by patents... but copying the file wouldn't be creating the gun.
3D printing will surely be interesting from a legal standpoint: it potentially brings copyright and patent law together for just about everything. I would hope that we could establish that CAD files for 3D printers are equal to recipes for the purposes of copyright: a series of steps to create something. But that's certainly not what happened for source code.
Form1 3D Printer and Kickstarter Get Sued For Patent Infringement
And IANAL, but it looks like the independent claims have a part that can be worked around; for example, copying or shifting data from one layer to the next.
Form1 3D Printer and Kickstarter Get Sued For Patent Infringement
The complaint clearly states they are filing suit based on the '520 patent.
Intel CEO Paul Otellini Retiring
Sanjay Jha is out of MOT. He'd be my pick for a replacement. You heard it here first!
AMD Licenses 64-bit Processor Design From ARM
What other company could make a processor that does both x86 and ARM? Windows 8 that runs both ARM and legacy x86 apps? I could see that as being pretty differentiated. Their GPUs are on par with Nvidia's, and they have better processor microarchitecture.
FCC Chief: 300MHz More Spectrum By 2015
It's the law!
And you're getting very close to the Shannon limit with turbo codes. LTE isn't much more spectrally efficient than HSPA+, but it has wider frequency bands and so can get more peak speed to customers.
So you can increase the amount of spectrum you have, with the current infrastructure, to get more capacity. That will buy you a few years of network traffic increase.
But eventually you have to figure out how to get less capacity demand and more SNR. There's really only one way to do that: change the infrastructure topology. And that has lots of problems.
It's kind of like we're near "Peak Bandwidth".
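The bandwidth-vs-SNR tradeoff above falls straight out of the Shannon-Hartley formula. The 20 MHz channel and SNR figures below are illustrative assumptions, not measurements of any real network:

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_linear):
    """Shannon-Hartley limit: C = B * log2(1 + SNR)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

base = shannon_capacity_bps(20e6, 100)        # 20 MHz channel at 20 dB SNR
more_band = shannon_capacity_bps(40e6, 100)   # double the spectrum
more_snr = shannon_capacity_bps(20e6, 200)    # double the SNR (+3 dB)

print(f"baseline:     {base / 1e6:.0f} Mb/s")
print(f"2x bandwidth: {more_band / 1e6:.0f} Mb/s")  # capacity doubles
print(f"2x SNR:       {more_snr / 1e6:.0f} Mb/s")   # only a modest gain
```

Capacity grows linearly in bandwidth but only logarithmically in SNR -- which is why more spectrum (or smaller cells, i.e. a new topology) buys so much more than squeezing the coding any harder.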
Slashdot Turns 15, What Are You Doing Later?
In a lot of ways, things are more or less the same. The SNR is lower, but that's true for the internet as a whole. :-(
Patent Troll Sues X-Plane
Drew Curtis of fark.com recently had to defend against one of these patent trolls. His advice in a TED talk was (if I recall): fight the infringement, not the patent. His response, with a lawyer, was actually pretty cheap, and asked the plaintiff to circle the infringement and to disclose various things about the shell company. The other side doesn't want a battle in court, either. Drew settled for nothing.
What's To Love About C?
- It is much more portable than assembly
- The performance overhead compared to assembly is reasonable
- Most people find it simpler to develop in than Forth
- It's not the horrible monster that C++ is
This makes it very nice for all kinds of embedded environments.
Efficiency matters. Python is great, but you don't want to use it for embedded work.
Intel Dismisses 'x86 Tax', Sees No Future For ARM
All those scalar processors look the same. You can trade energy efficiency for performance and end up with a lower power processor that's a lot slower. When you push the performance, the architecture doesn't matter as much, because most of the energy is spent figuring out what to run and when to run it.
Compounding this fact, ARM isn't that great of an architecture. It's got variable length instructions, not enough registers, microcoded instructions, and a horrible, horrible virtual memory architecture.
The big thing that ARM has is the licensing model. ARM will give you just about everything you need for a decent applications SOC. Processor, bus, and now even things like GPU and memory controllers. Sprinkle in your own company's special sauce, and you have a great product. All they ask is for a little bit of royalty money for every chip you sell. And since everyone is using pretty much the same ARM core, the tools and "ecosystem" are pretty good.
But there's not much of an advantage to the architecture... the advantage is all in the business model, where everyone can license it on the cheap and make a unique product out of it.
And nowadays, the CPU is becoming less important. Everything around it -- graphics, video, audio, imaging, telecommunications -- is what makes the difference.
AMD and ARM Team Up
10-15 square mm? Do you even have any idea what you're talking about? Those A5s are more like 1-2 square mm, even with an L2 cache. They're tiny. And they perform very poorly.
Google Files Antitrust Complaint Against Microsoft, Nokia
Google is at least trying to say "Hey, this whole patent troll environment sucks. You should really do something about this problem!"
Hopefully someone will listen to their complaint before they are forced to take matters into their own hands.
And I think everyone also sees the next step, which is retaliation. Google just bought all those Motorola patents, and having them shut down Nokia and Apple with all those 17-year-old cell phone patents would really be a step up in the Mutually-Assured-Destruction conflict, and everyone would suffer for it.
Taking this approach with the nukes in your back pocket seems much more civil than the approach taken by the others.
Ask MIT Researchers About Fusion Power
ITER is a hugely expensive project, and won't produce a commercially viable power generation system.
In a lot of areas where research is done on things which don't work yet -- rockets, bridges, transmission systems, etc -- there's a general idea of how things might be able to "scale up" to meet the goals.
Is tokamak fusion really in sight of being a commercially viable source of energy? If we need unobtanium to make a commercially viable reactor, wouldn't it make sense to wait until the materials are viable before making even larger tokamaks? What do we learn from making these new, bigger, more expensive reactors?
Or are we trying to build ever-bigger spark gap transmitters as a way to make radio better? Maybe we should look at other schemes?
Or, alternatively: we know of a nice, large, gravity-fed fusion reactor fairly nearby. Is the engineering simpler to harness energy from that on a large scale?
Faster-Than-Fast Fourier Transform
Sure. Most DSPs don't have floating point. Looks like the AVR has an 8x8 multiply, so that really helps. Most DSPs use 16x16 multiplies, accumulating with a 32-bit or larger accumulator. That's going to take several instructions on an 8-bit micro, but it's certainly feasible for things like audio with low data rates and small FFTs. On the other hand, if you're using an Arduino to do an audio FFT for something like a spectrum analyzer, this technique won't help, since you're not interested in picking out a few signals but in all the frequency bands.
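The 16x16 multiply-accumulate pattern described above can be sketched as follows. This is a rough Python emulation of what a DSP does in a single MAC instruction, with made-up Q15 sample values; on an 8-bit micro each step would expand into several multiply and add instructions:

```python
def mac_q15(acc, a, b):
    """One DSP-style MAC step: a 16x16 multiply whose 32-bit product is
    added into a 32-bit accumulator, wrapping around like a C int32."""
    prod = a * b                       # 16x16 product always fits in 32 bits
    acc = (acc + prod) & 0xFFFFFFFF    # keep 32 bits, as the accumulator would
    if acc >= 0x80000000:              # reinterpret the top bit as the sign
        acc -= 0x100000000
    return acc

# Dot product of two short Q15 vectors (int16 values in [-32768, 32767]).
x = [16384, -8192, 4096]   # 0.5, -0.25, 0.125 in Q15
h = [16384, 16384, 16384]  # 0.5 each
acc = 0
for xi, hi in zip(x, h):
    acc = mac_q15(acc, xi, hi)

result_q15 = acc >> 15     # rescale the Q30 accumulator back to Q15
print(result_q15)          # prints 6144, i.e. 0.1875 in Q15
```

The wide accumulator is the whole trick: intermediate sums can grow well past 16 bits without overflowing, and you only round back down to Q15 once at the end.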