
Comments


The Oatmeal Convinces Elon Musk To Donate $1 Million To Tesla Museum

crgrace Re:conflict (78 comments)

Indeed. I also find it strange that Matt is so adamant that Tesla was shafted by modern memory, when the very SI unit of magnetic flux density is the tesla! How many people get units of measurement named after them? Why did Musk name his car company Tesla if nobody had ever heard of him? Why did a heavy metal band name themselves Tesla and use the electricity metaphor in their marketing? There are researchers, such as Steinmetz, Heaviside, and Shannon, who probably contributed even more to the development of the modern world, yet are far more obscure to the general public than Tesla.

about 2 months ago

The Pentagon's $399 Billion Plane To Nowhere

crgrace Re:What difference now does it make? :) Sunk costs (364 comments)

You seem to misunderstand what "sunk cost" means. You're using the phrase as an argument to keep funding the project because "we can't reverse time and get the money back". In fact, standard sunk-cost reasoning is the opposite of your usage: precisely because the money is unrecoverable, only future costs and benefits should be relevant to an investment decision; otherwise you run into the danger of "throwing good money after bad". There is a lot of evidence that continued funding of the F-35 is exactly that.

You also present a false dichotomy. One alternative to spending upwards of a trillion dollars on the F-35 is to manufacture more of the smaller, cheaper, proven fighters such as the F-18 or indeed the F-15. Keeping our current squadrons operable is less of an issue if we build more aircraft at lower cost.
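
To make the sunk-cost point concrete, here is a toy comparison in C; every dollar figure is invented for illustration. The already-spent amount appears identically in both options, so it can never change which one is cheaper going forward.

    /* Toy sunk-cost arithmetic: all dollar figures are invented.
     * The sunk term shifts both options equally, so it drops out. */
    #include <stdio.h>

    int main(void) {
        double sunk       = 100.0; /* $B already spent, unrecoverable either way  */
        double finish_f35 =  50.0; /* hypothetical future cost to finish the F-35 */
        double buy_proven =  30.0; /* hypothetical future cost of more F-18s/F-15s */

        printf("future costs only:  finish = %.0f, proven = %.0f\n",
               finish_f35, buy_proven);
        printf("with sunk included: finish = %.0f, proven = %.0f\n",
               sunk + finish_f35, sunk + buy_proven);
        return 0;
    }

Either way the comparison comes out the same; the only rational inputs are the future costs.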

about 2 months ago

1958 Integrated Circuit Prototypes From Jack Kilby's TI Lab Up For Sale

crgrace Re:Microchip (76 comments)

Pretty much no one says "monolithic" any more because hybrids have gone the way of the buffalo.

In my experience they are usually called "chips" or "ICs" by people in the industry.

about 2 months ago

1958 Integrated Circuit Prototypes From Jack Kilby's TI Lab Up For Sale

crgrace Kilby & Noyce (76 comments)

While Kilby's chip, with its bondwire interconnect, was first, it's interesting that Noyce's concept at Fairchild, which used Hoerni's planar technology with all interconnect fabricated in the same photolithographic process as the devices, is pretty much how we do it today. Kilby's concept was a technological dead end.

about 2 months ago

1958 Integrated Circuit Prototypes From Jack Kilby's TI Lab Up For Sale

crgrace Re:Microchip (76 comments)

What's wrong with microchip? I've always preferred it to "computer chip" because so many chips aren't entirely digital.

about 2 months ago

The World's Worst Planes: Aircraft Designs That Failed

crgrace Re:Where's the Goblin (209 comments)

Indiana Jones piloted a parasite fighter in Indiana Jones and the Last Crusade. I never knew those things were real!

about 3 months ago

Why Not Every New "Like the Brain" System Will Prove Important

crgrace Re:It's the fundamentally wrong approach (47 comments)

"Like the brain" is a fundamentally wrong-headed approach in my opinion. Biological systems are notoriously inefficient in many ways. Rather than modelling AI systems after the way "the brain" works, I think they should be spending a lot more time talking to philosophers and meditation specialists about how we *think* about things.

What you're suggesting has been the dominant paradigm in AI research for most of the 60-70 odd years there has been AI research. Some people have always thought we should model "thinking" processes, and others though we should model neural networks. At various points one or the other model is dominant.

To me it makes no sense to structure a memory system as inefficiently as the brain's, for example, with all it's tendancy to forgetfulness, omission, and random irrelevant "correlations". It makes far more sense to structure purely synthetic "memories" using database technologies of various kinds.

I have to disagree on it making no sense to structure a memory system "inefficiently" as the brain's, because inefficiency can mean different things. The brain is extraordinarily power efficient and that is an important consideration.

It's most likely, in my opinion, that we will eventually find a happy medium between things that computers do well, like compute and store information exactly, and what humans do well, process efficiently and make associations and correlations quickly.

Sure, biologicial systems employ some interesting short cuts to their processing, but always at a sacrifice in their accuracy. We should be striving for systems that are *better* than the biological, not just similar, but in silicon.

While I don't doubt silicon will be important for the foreseeable future, it does have limitations you know.

about 3 months ago

Ask Slashdot: Beginner To Intermediate Programming Projects?

crgrace Simulate a microprocessor. (172 comments)

When I was in graduate school I had to write a C program to simulate the operation of a small custom microprocessor. It was a truly fascinating experience (and not terribly difficult). You can start with something really simple like a MIPS variant and go from there. I actually had to write several simulators at different levels of abstraction (one only simulated the instruction set, another simulated down to the microcode, etc.). Just simulating a small instruction set is a great way to get started.

The cool part of this kind of project is that it gets you learning so many different things out of necessity. To run assembly code on my C-based microprocessor simulation I had to learn to write assembly language programs. Then I had to learn how to write an assembler (I did it in C, but if I were doing it today I would use Perl or Python) to generate object code for my microprocessor simulation. Then to debug the microprocessor I needed to write a disassembler, and so on.

The microprocessor was microcoded, so I also got to learn how to write microcode to verify fine details of the design. I got some great insight into computer arithmetic and really enjoyed it.

I can't tell you what a cool experience it is to see simple assembly code you wrote run on a microprocessor simulation you also wrote. This can lead into emulation work, but I didn't go that way. I'm in the chip design business now, so I write simulations and models of all kinds of analog and digital circuits, and it is a blast.
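
To give a flavor of instruction-set-level simulation, here is a minimal sketch in C. The five-instruction ISA (LOADI, ADD, ADDI, JNZ, HALT) is invented purely for illustration; for a real project you'd start from a published ISA such as a MIPS subset.

    /* A minimal instruction-set simulator: fetch, decode, execute.
     * The 5-instruction ISA here is hypothetical, for illustration only. */
    #include <stdio.h>
    #include <stdint.h>

    enum { LOADI, ADD, ADDI, JNZ, HALT };          /* opcodes */

    typedef struct { uint8_t op, rd, rs; int8_t imm; } insn_t;

    int main(void) {
        int32_t reg[4] = {0};
        insn_t prog[] = {                /* sum 5+4+3+2+1 into r0 */
            { LOADI, 0, 0, 0  },         /* r0 = 0                */
            { LOADI, 1, 0, 5  },         /* r1 = 5                */
            { ADD,   0, 1, 0  },         /* r0 += r1              */
            { ADDI,  1, 0, -1 },         /* r1 -= 1               */
            { JNZ,   0, 1, 2  },         /* if (r1 != 0) goto 2   */
            { HALT,  0, 0, 0  },
        };
        for (int pc = 0; ; ) {           /* the fetch-decode-execute loop */
            insn_t i = prog[pc++];
            switch (i.op) {
            case LOADI: reg[i.rd]  = i.imm;          break;
            case ADD:   reg[i.rd] += reg[i.rs];      break;
            case ADDI:  reg[i.rd] += i.imm;          break;
            case JNZ:   if (reg[i.rs]) pc = i.imm;   break;
            case HALT:  printf("r0 = %d\n", reg[0]); return 0;
            }
        }
    }

Compile it with any C compiler and it prints r0 = 15. From there, a two-pass assembler that emits the insn_t encodings, and a disassembler that inverts them, fall out naturally, just as described above.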

about 4 months ago

Nat Geo Writer: Science Is Running Out of "Great" Things To Discover

crgrace Re:No mysteries solvable within a lifetime (292 comments)

I think you can demolish his argument that Nobel lag is indicative of science slowing down much more easily than that.

Think of the Nobel Prize as an asynchronous FIFO. Every time a Nobel-worthy discovery is made, it gets pushed into the FIFO. Each year the Nobel committee awards a prize, popping one discovery off the FIFO.

What if science is speeding up? Then discoveries are pushed into the FIFO faster than prizes can pop them off, so the FIFO gets longer and the time between discovery and prize grows.

What if science is slowing down? Then the consumption rate exceeds the generation rate and the FIFO empties. Eventually a scientist would win the prize the same year the discovery is made.

I don't understand this guy's logic. It seems more parsimonious to me that there are so many great discoveries for the Nobel committee to choose from that they are starting to queue up.

So, I think his data indicate science is speeding up.
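
As a toy illustration of that queueing argument, here is a sketch in C; the 1.5-discoveries-per-year rate is an invented number, not real Nobel data.

    /* Toy "Nobel FIFO": discoveries enter at a fixed rate, the committee
     * removes one per year. The production rate is invented. */
    #include <stdio.h>

    int main(void) {
        double rate    = 1.5;  /* assumed Nobel-worthy discoveries per year */
        double backlog = 0.0;  /* discoveries waiting in the FIFO */

        for (int year = 1; year <= 50; year++) {
            backlog += rate;                    /* producer: new discoveries  */
            if (backlog >= 1.0) backlog -= 1.0; /* consumer: one prize a year */
            if (year % 10 == 0)                 /* at 1 prize/yr, lag=backlog */
                printf("year %2d: backlog %4.1f -> lag ~%2.0f years\n",
                       year, backlog, backlog);
        }
        return 0;
    }

Any production rate above one discovery per year makes the backlog, and hence the discovery-to-prize lag, grow without bound, which is exactly the lengthening Nobel lag being discussed.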

about 5 months ago

Nat Geo Writer: Science Is Running Out of "Great" Things To Discover

crgrace Re:Punctuated upheaval (292 comments)

In my opinion this is a bit like sitting in your backyard with a telescope, opining that there are no new planets left to discover in the solar system, while other people are out paving the way to actually visit them.

We don't yet understand whether there are simple underlying principles in biology as there are in physics. Biology is so much more complex than physics, and we are still in the 19th century...

At some point someone is going to discover the biological equivalent of quantum mechanics and then the world will change again.

This discovery could be a way to harness computation to really get a handle on complexity, or it could be the discovery of the underlying principles themselves.

I can't wait to find out.

about 5 months ago

Nat Geo Writer: Science Is Running Out of "Great" Things To Discover

crgrace what about Neuroscience and structural biology? (292 comments)

I looked at the article and the author is focused on advances in physics, where he may actually have a point.

He doesn't seem to be aware of some of the stuff being done in neuroscience, nanotechnology, and structural biology, to name a few.

We've come so far in gaining insight into the biological and electrical nature of the brain in just a few years, and the idea of a connectome (one that we can, in principle, actually map) is a huge breakthrough that will lead to fantastic new technologies.

When one field plateaus, another explodes. Look up epigenetics and CRISPR and prepare to have your mind blown. To say we are near the end of science is crazy.

Also, this author doesn't seem to know about Occam's Razor. There are explanations for why Nobel Prizes are taking longer to be awarded that are much simpler than science slowing down.

about 5 months ago

Nat Geo Writer: Science Is Running Out of "Great" Things To Discover

crgrace Re:Level of public funding ? (292 comments)

> That's not necessarily a bad thing. Science is worthless if we don't use it in practical applications. But if we're looking for reasons why less basic research is getting done, this could play a role.

I think it's a bad thing. Most of our great advancements in consumer electronics, medicine, and computing come from mining basic research (which was mostly publicly funded). When that mine is played out, where will the raw material for new advances come from?

about 5 months ago

eBay Japan Passwords Revealed As Username+123456

crgrace Re:wait a minute... (80 comments)

When I was an undergrad, our Unix labs had every computer named after a cartoon character, all Hanna-Barbera characters too. I liked to use dino because it was fewer characters to type.

about 5 months ago

Slashdot PT Cruiser Spotted In the Wild

crgrace I see it now and then (94 comments)

I grew up close to Walnut Creek so I'm in the area periodically and I've seen the PT Cruiser now and then. I'm pretty sure it's green though. The picture makes it look blue!

about 7 months ago

Programmer Debunks Source Code Shown In Movies and TV Shows

crgrace Re:Terminator was Apple ][ ROM (301 comments)

In the same scene there were some plots on the display of the frequency response of an under-damped second-order system... most likely the performance of some internal clock. Presumably there was frequency hopping involved, because the Terminator was obviously optimizing his frequency acquisition time at the expense of clock-period stability.

The Terminator had some analog circuits in there! Oh yeah!

about 8 months ago

Republican Proposal Puts 'National Interest' Requirement On US Science Agency

crgrace Re:Silicon Valley driven by military requirements (382 comments)

Since the earlier poster brought up silicon: why do you suppose Bell Labs was researching the electrical properties of silicon? To produce better diode rectifiers. Actually making a three-terminal switch was a long-term goal that they reached much sooner than they expected.

Bell Labs was in fact funneling a lot of military money into developing better tubes, and they were also funneling a lot of military money into investigating silicon devices. They had more than enough money to do both. The military (and Bell Labs management, who could see the future in solid-state phone switches) funded the basic research that put them in a position where the applications were on the horizon.

A really great book documenting this amazing time is Crystal Fire. http://www.amazon.com/Crystal-Fire-Transistor-Information-Technology/dp/0393318516

about 10 months ago

The Mile Markers of Moore's Law Are Meaningless

crgrace Re:Transistor packing has probably reached its limits (156 comments)

All sub-65nm and most 65nm processes are lithographically exposed through a layer of water (immersion lithography) for the reason you stated. The next step is extreme UV or even e-beam lithography, but those are expensive and very, very difficult.

You're quite right that this is an economic/mass-market issue more than a purely technical issue.

about 10 months ago

The Mile Markers of Moore's Law Are Meaningless

crgrace Re:question: did you *only* use Moore's Law? (156 comments)

> hey thanks for the response
>
> > We were confident because the availability of that process was predicted by Moore's Law and any number of foundries were spending billions to make it happen.
>
> Right, so did you just use Moore's Law or did you look at other factors as well?
>
> What I mean by other factors:
>
> - Trends in the capacity of other recent products? Did you look at the speeds of CMOS processes from that company over the last 10 years and extrapolate?
> - Did you talk to a sales rep or engineer or product development manager at the CMOS process company and **ASK THEM** how fast their upcoming models would be (approximately)?
> - Did you do a literature review of what academic research groups and possibly FOSS (I don't know if it applies for you) were doing in that kind of CMOS wireless transceiver tech? My former university, Ball State University, did research on WiMax coverage and speed for Cisco (before WiMax was ditched)... did you look at any of that to predict the CMOS process capability you needed?
>
> I'm trying to be polite, but I call BS.
>
> If you claim your company made that decision based **solely** on math from Moore's Law... well, I have a hard time believing that claim's veracity. You are either fabricating or that company is not very wise. And if your company **did** use other factors, then that kind of invalidates your point and parenthetically supports my point... I won't deny that using it **might** have added value, but only IF you also followed common practices like those mentioned above...
>
> Seriously... did you use other factors besides Moore's Law?
>
> Like asking the vendor? (or any of the others mentioned above)

Of course we used all kinds of inputs into our planning process. We would have been fools not to.

I feel like you're doing a bit of "moving the goal posts" here. First you very emphatically stated that "[Moore's Law] is NOT and HAS NEVER BEEN fit to predict anything involving money or resources".

I gave you a reply from experience showing that this is not true: companies do (or at least used to) use Moore's Law in their planning process, where money and resources are very much involved.

Now you're saying I'm claiming my company invested millions blindly because we had some faith in Moore's Law. Of course we didn't, and I don't think I implied that.

First off, looking at the speed improvements from the foundry over the last 10 years and extrapolating is pretty much the same thing as following Moore's Law.

Second, as I'm sure you know, sales reps will say "YES" to anything, so Moore's Law helps put their claims in context. If they are promising something way beyond Moore's Law, you have to be skeptical.

Basically, I disagree that using a variety of factors (as virtually any company will for any decision) invalidates my point. You said that Moore's Law isn't fit for predicting things. I disagree.

If you had said "Moore's Law isn't fit for making significant investments in the absence of other factors or critical thinking", then I would agree with you.
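
To make the planning arithmetic concrete, here is a minimal sketch of the kind of Moore's-Law extrapolation being discussed, in C. The baseline density and the two-year doubling period are assumptions for illustration, not figures from any real foundry roadmap.

    /* Minimal Moore's-Law extrapolation: project transistor density
     * several years out from a baseline, assuming a fixed doubling
     * period. All numbers are illustrative assumptions. */
    #include <stdio.h>
    #include <math.h>

    int main(void) {
        double base_density   = 1.0e6;  /* transistors/mm^2 today (assumed) */
        double doubling_years = 2.0;    /* classic Moore's Law cadence      */

        for (int year = 0; year <= 10; year += 2) {
            double density = base_density * pow(2.0, year / doubling_years);
            printf("year %2d: ~%.1e transistors/mm^2\n", year, density);
        }
        return 0;
    }

A vendor promising something far above that curve is exactly the case where, as noted above, skepticism is warranted.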

about 10 months ago

