
Comments


Ford To Offer Fuel-Saving 'Start-Stop' System

norton_I Re:Buy a Ford! (572 comments)

US != the rest of the world and vice versa

Since the article is about US models of cars, the US numbers are appropriate. Since the article claims that this feature is already common in European diesels, I would guess that it is still a win in those cars as well.

more than 3 years ago

Was Flight Ban Over Ash an Overreaction?

norton_I Re:From what I've heard, it really is that bad... (673 comments)

These conditions apparently don't show up enough to justify the cost of determining safe operating parameters. Therefore, no flying. It isn't really complicated -- if a bunch of airlines want to get together and pay for the testing, they can fly. Otherwise, they stay on the ground.

more than 4 years ago

Does Cheap Tech Undermine Legal Privacy Protections?

norton_I Re:This is completely different (282 comments)

I think that if the average person walking around had a 10% chance of carrying a camera that could do thermal imaging, it would be hard to argue that you had a reasonable expectation of privacy, and the police would probably be allowed to use one without a warrant.

If LIDAR cost $100k, I think law enforcement would still be entitled to use it, but if FLIR cost $100, that would make a difference.

more than 4 years ago

Does Cheap Tech Undermine Legal Privacy Protections?

norton_I Re:"Thermal imaging devices" are not $50-150. (282 comments)

Or, learn what you are talking about.

IR cameras and film detect NIR in the 800 nm - 1.3 micron range. Your stove's heating element glowing a dim red will light up brightly in such a device, but that is completely useless for this type of application. IR thermometers and thermal imaging systems for the 0-100 F range use much longer wavelengths, around 10 microns.

Note that you can't even make IR film that is any good at thermal wavelengths, because it would get exposed just sitting in a box. The film would have to be prepared, stored, used, and developed in a cryogenic environment. This may have been done (perhaps for IR astronomy), but you obviously can't just buy a roll of 35mm "thermal" film and pop it in a Nikon.
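
As a rough sanity check on those wavelengths, Wien's displacement law (peak wavelength = b / T, with b around 2898 micron-kelvin) puts the numbers in perspective; this little sketch just evaluates it for a room-temperature scene and a dull-red stove element:

<ecode>
#include <cstdio>

int main() {
    // Wien's displacement law: peak emission wavelength = b / T
    const double b = 2898.0;      // Wien's constant, micron * kelvin
    const double body = 310.0;    // ~98.6 F scene, in kelvin
    const double stove = 1000.0;  // dull-red heating element, in kelvin
    std::printf("body peak:  %.1f microns\n", b / body);   // ~9.3 microns
    std::printf("stove peak: %.1f microns\n", b / stove);  // ~2.9 microns
    return 0;
}
</ecode>

A ~300 K scene peaks near 10 microns, far beyond anything NIR film can see, while a glowing element is hot enough that the short-wavelength tail of its emission reaches the NIR band (and the visible, hence the dim red glow).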

more than 4 years ago

FreeNAS Switching From FreeBSD To Debian Linux

norton_I Re:openfiler (206 comments)

Openfiler's web GUI is buggy as hell, its local LDAP server option is poorly documented and gives terrible diagnostic messages when misconfigured, and it has no official support for installing to or booting from flash. Never trust a product that wants to charge money for the admin guide.

I only tried FreeNAS briefly and ended up using Openfiler anyway, but I would love to see anything beat it.

more than 4 years ago

"Accidental" Download Sending 22-Year-Old Man To Prison

norton_I Re:Never volunteer anything to the cops (1127 comments)

He is represented by a public defender, which means he can't afford a new lawyer, and his current lawyer can't afford to put together a respectable case.

more than 4 years ago

Apple's Grand Central Dispatch Ported To FreeBSD

norton_I Re:No because they are different (205 comments)

GCD is a mechanism to let one central authority dispatch threads across multiple cores, for all running applications (including the OS).

This is what most people talk about, and what is most obvious from the name, but it is not the interesting part of GCD.

The interesting part of GCD is blocks and tasks, and it is useful to the extent to which it makes expressing parallelism more convenient for the programmer.

The "central management of OS threads" is marketing speak for a N-M scheduler with an OS wide limit on the number of heavyweight threads. This is only useful because OS X has horrendous per-thread overhead. On Linux, for instance, the correct answer is usually to create as many threads as you have parallel tasks and let the OS scheduler sort it out. Other operating systems (Solaris, Windows) have caught up to Linux on this front, but apparently not OS X. If you can get the overhead of OS threads down to an acceptable level, it is always better to avoid multiple layers of scheduling.

more than 4 years ago

Apple Kicks HDD Marketing Debate Into High Gear

norton_I Re:Its been done for years already (711 comments)

[quote]So we've had a defined standard that was, arguably, not the easiest to understand. THEN harddrive manufacturers started their fraud. And THEN people started complaining. So what, and please think about this, would be the right decision here?[/quote]

This is revisionist at best and really just wrong. Despite all "wisdom" to the contrary, there has never been universal acceptance of 1 MB = 2^20 bytes on computers. For instance, all of IBM's mainframe hard drives from the 60s and 70s were sold using base-10 prefixes. Early desktop hard drives from the 80s used both; I think the ST506 used base-2, but some other models used base-10. All networking and communications standards (Ethernet, modems, PCI, SATA...) use base-10 prefixes for MB/s and Mbit/s. 3.5" floppy disks used NASA-style units where 1 MB = 10^3 * 2^10 bytes. And even though RAM is still almost always measured in base-2 units (manufacturing makes power-of-2 sizes much easier to produce -- something that is not true for hard drives), the speed of the memory bus on your CPU is measured in base-10 units.

It is a *good* idea to have K and M mean the same thing everywhere. A system where a 1 GB/s link transfers 0.93 GB every second is stupid. This is especially important as computers are being used in more and more environments. Should a 1 megapixel camera mean 2^20 pixels? What about CDs with a 44.1 kHz sampling rate?
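
To put numbers on that, a quick sketch of the mismatch:

<ecode>
#include <cstdio>

int main() {
    const double GB  = 1e9;                       // SI gigabyte, 10^9 bytes
    const double GiB = 1024.0 * 1024.0 * 1024.0;  // binary unit, 2^30 bytes
    // A "1 GB/s" link measured in binary units:
    std::printf("1 GB/s = %.3f GiB/s\n", GB / GiB);            // ~0.931
    // The floppy's mixed "NASA-style" megabyte: 10^3 * 2^10 bytes
    std::printf("1.44 floppy MB = %.0f bytes\n", 1.44 * 1000 * 1024);
    return 0;
}
</ecode>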

about 5 years ago

VHDL or Verilog For Learning FPGAs?

norton_I Re:Verilog is a programming language (301 comments)

Why would you say something as silly as claiming that the SPICE netlist format is not a programming language?

more than 5 years ago

Debian Gets FreeBSD Kernel Support

norton_I Re:/usr/bin/pride, /usr/bin/ego, /etc (425 comments)

[quote]Truth is, nothing irreplaceable was provided by the GNU project.[/quote]

I can't see how you can possibly think that is relevant. Whether there were other options that *could* have been used doesn't change the fact that a circa-1992 "linux" system was largely a GNU system.

I certainly agree that a modern Linux based desktop is not a GNU OS, but I think it was a perfectly reasonable request in the early 90s. I still call it Linux, mostly because the name is shorter, and I am not about to call it GNU/X11/Gnome/Linux. And the reasons I choose to run Linux over (say) FreeBSD are mostly to do with the kernel and the kernel specific system tools, and not with the userland.

more than 5 years ago

Debian Gets FreeBSD Kernel Support

norton_I Re:/usr/bin/pride, /usr/bin/ego, /etc (425 comments)

Compiler and toolchain, and all the 'standard' UNIX tools: the shell, the text utils like cat, grep, awk, etc.

Basically, back in the 80s the FSF reimplemented what was at that time nearly the entirety of what was called UNIX, except the kernel (the kernel itself being what the HURD project was/is). It was to be the GNU OS. While the kernel was in development, the userspace tools were developed and ported to other UNIX systems like SunOS as replacements for the often deficient historical versions supplied by the UNIX vendors.

So when Linus came along and wrote a UNIX-like kernel using gcc, he could load all those programs onto it and have a mostly functioning UNIX environment. This was the reason RMS objected to calling it just Linux: at that time, the majority of the code running on the system was GNU. It was probably a legitimate point at the time. And even if there had been a different compiler, without a set of userspace tools that people could freely get and use, it is unlikely Linux would have been able to take off.

Now, of course, a huge part of the user experience is provided by X11, the desktop environments, and various graphical applications. GNOME is part of the GNU project, but X.org, KDE, and most of the applications are not. So it isn't really true that GNU software is still the majority of the OS. Of course, the kernel is even less important in terms of the user environment, and despite all the other software around it, the GNU utilities are what make it (not) UNIX.

more than 5 years ago

Null References, the Billion Dollar Mistake

norton_I Re:Pass by reference (612 comments)

That is a pretty bad example. C++ references hardly count as references since you can't reassign them. They are really just syntactic sugar to make operator overloading look nice and reduce the number of -> operators. They cannot be used alone to create complex data structures.
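
A tiny sketch in standard C++ makes the point concrete -- the reference must be bound at initialization, and assignment goes through it rather than rebinding it:

<ecode>
#include <iostream>

int main() {
    int a = 1, b = 2;
    int &r = a;   // must be bound at initialization; no null reference
    r = b;        // assigns b's value THROUGH r into a; does not rebind r
    std::cout << a << '\n';  // prints 2 -- r still names a

    int *p = &a;  // a pointer, by contrast, can be reseated...
    p = &b;       // ...or set to nullptr, which is where the trouble starts
    return 0;
}
</ecode>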

A better example would be references in Lisp, Perl, or Java. They solve many of the problems of C/C++ pointers: they can't be out of bounds, they must respect the type safety of the language, and they can't point to an invalid or destroyed object, thanks to garbage collection. However, they all support a null reference.

Maybe there is a better way to do this where you don't ever need null references, but I know two things for certain: 1) SQL is not it, and 2) people will still make errors where data they expect to be there is not.

more than 5 years ago

A Quantum Linear Equation Solver

norton_I Re:not able to be used == not useful (171 comments)

You think that quantum computers are not able to justify grants or PhDs? What world are you living in?

[quote]until these quantum computers exist and are cheap enough to fill datacentres[/quote]

Yeah, because classical computers were never useful to anyone (or anyone important) until datacenters existed.

[quote]No, to be really useful, quantum computing has to be as easy to afford and deploy as current computing technology.[/quote]

And until then, developments that bring us closer are irrelevant? Applications that could give us more reason to develop the technology are pointless?

What exactly is your point here?

more than 5 years ago

Submissions

norton_I hasn't submitted any stories.

Journals

norton_I has no journal entries.
