Comments


Why Women Have No Time For Wikipedia

Theovon Re:Discrimination (545 comments)

It's SUPPOSED to be objective. But it's impossible for it to be entirely objective. Having more diverse viewpoints would likely improve its level of objectivity, which is as much as we can hope for.

13 hours ago

US Government Fights To Not Explain No-Fly List Selection Process

Theovon Can the executive branch be held in contempt? (242 comments)

What would happen if the executive branch (which is supposed to enforce the law) simply refused to comply with a judicial order? Can someone be held in contempt? Who would take on the role of enforcing the judicial order (in terms of compelling the action or executing punishment)?

yesterday

Why Women Have No Time For Wikipedia

Theovon Re:Discrimination (545 comments)

You may be right. If we eliminated the barriers, then women might still not be interested. Either way, it's still a problem that Wikipedia can't claim to be completely neutral when it doesn't represent a large share of human perspectives.

2 days ago

Why Women Have No Time For Wikipedia

Theovon Re:why the focus on gender balance? (545 comments)

No, it's a real problem here. Wikipedia is all about (1) information about the world, and (2) a neutral perspective on that information. Women do have a slightly different perspective, focusing on different information and different aspects of it. Including those additional perspectives will make Wikipedia content more complete and also more accessible to female readers.

2 days ago

Research Shows RISC vs. CISC Doesn't Matter

Theovon CISC instruction sets are now abstractions (155 comments)

And actually so is RISC to a degree on POWER processors.

Back in the '80s, going RISC was a big deal. It simplified decode logic (which was then a more appreciable portion of the circuit area), reduced the number of cycles and logic area necessary to execute an instruction, and was more amenable (by design) to pipelining. But this was back in the days when CISC processors actually directly executed their ISAs.

Today, CISC processors come with translation front-ends that convert their external ISA into a RISC-like internal representation. It's on-line dynamic binary translation. Now, instructions are broken down into simpler steps that are more amenable to pipelining and out-of-order scheduling. CISC processors don't execute CISC ISAs and therefore don't suffer from their drawbacks.
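To make that concrete, here's a toy sketch of the kind of "cracking" I mean (my own illustration with made-up instruction tuples and micro-op names, not Intel's actual decoder): a register-memory add becomes a load, a 3-operand add, and a store.

# Toy illustration of CISC-to-micro-op cracking (hypothetical encoding,
# not a real x86 decoder): a register-memory add becomes load/add/store.

def crack(insn):
    """Translate one pseudo-CISC instruction into RISC-like micro-ops."""
    op, dst, src = insn
    if op == "ADD" and dst.startswith("["):        # ADD [mem], reg
        addr = dst.strip("[]")
        return [("LOAD",  "t0", addr),             # t0 <- mem[addr]
                ("ADD",   "t0", "t0", src),        # t0 <- t0 + src (3-operand internally)
                ("STORE", addr, "t0")]             # mem[addr] <- t0
    return [insn]                                  # register-only ops pass through unchanged

print(crack(("ADD", "[rbp+8]", "rax")))
# [('LOAD', 't0', 'rbp+8'), ('ADD', 't0', 't0', 'rax'), ('STORE', 'rbp+8', 't0')]

Once the instruction is in that form, the individual micro-ops can be renamed and scheduled out of order just like any RISC instructions.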

It has occurred to me that this could be taken to its logical extreme. ISAs could be made entirely abstract, designed from the start to be used that way and to translate reasonably efficiently. You get the benefits of micro-ops and the benefits of a CISC ISA (more compact code). Abstract ISAs make it easier to extend functionality in a backward-compatible way too. And unlike x86, we could shed some of the deadweight and also go to all 3-operand instructions, which have some benefits. By decoupling the ISA from the execution engine, we could get even more performance and energy efficiency than Intel does.

With a processor like Haswell, the logic area dedicated to translation is very small, which is why it doesn't matter much. On the other hand, with something like Atom, it occupies a more substantial portion of the total, making the translation (basically, elaborate decode logic) a burden on die area and therefore power consumption.

So it's not really appropriate to say it doesn't matter. It MOSTLY doesn't matter, because most of the drawbacks of CISC have been overcome. The fact that we're using an outdated CISC ISA for x86, however, still has drawbacks: supporting rare and excessively complex instructions, a plethora of addressing modes, and only having two operands per instruction.

2 days ago

Fermilab Begins Testing Holographic Universe Theory

Theovon Re:Particle state stored in fixed total # of bits? (246 comments)

IIRC, this isn't actually a paradox. One twin underwent acceleration, which leads to a temporal discontinuity. The other twin stayed in place. I guess if you have two clocks, and you accelerate one away from the other, you should be able to tell which one accelerated and which didn't.

3 days ago

Limiting the Teaching of the Scientific Process In Ohio

Theovon Science as religion (519 comments)

Not teaching the scientific process may just make things worse. Doubt is a fundamental tenet of science, but many religious people (e.g. Kirk Cameron and his ilk) feel that they were "just told to believe this stuff" when they were in school. Without knowledge of the process that led to this knowledge, students will just start to treat science as an alternate bad religion or something.

Now, many kids handle uncertainty poorly, so this has to be handled carefully, but I think it's critical that science be taught as "this is the best explanation we have." Basically everything taught in high school is so well established (misrepresentations notwithstanding) that we can explain that what they're being taught is consistent with mountains of evidence. But the key point is that this stuff, at one time in the past, was cutting-edge knowledge and did deserve to be taken with a big grain of salt. This can be expressed in terms of the history and evolution of particular sciences. We understood A at this time, and then someone discovered something, and views shifted accordingly to B. See how new evidence led to a BETTER understanding through the scientific process??? What we're learning now has pretty well been beaten into submission, but understand that questioning assumptions is an important thing for people to learn.

3 days ago

Time Warner Cable Experiences Nationwide Internet Outage

Theovon This is a non-story (133 comments)

If we didn't hate TWC for other reasons, this would be dismissed as a quickly-corrected outage resulting from human error during some maintenance operation. But since we hate TWC, people make a big deal out of it and declare conspiracy and yell about bad customer service.

Nothing to see here, people.

3 days ago

Fermilab Begins Testing Holographic Universe Theory

Theovon Particle state stored in fixed total # of bits? (246 comments)

In special relativity, we find out that our velocity through spacetime is actually constant. If you move through space faster, you necessarily move through time more slowly.
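For reference, the textbook identity behind that statement (just a sketch of standard special relativity, not part of the holographic-universe argument): start from the invariant interval \(c^2\,d\tau^2 = c^2\,dt^2 - dx^2\) and divide by \(dt^2\):

\[ \left(c\,\frac{d\tau}{dt}\right)^{2} + \left(\frac{dx}{dt}\right)^{2} = c^{2} \]

so the "speed through time" and the speed through space always combine to the same constant c.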

So I'm wondering if the state of a particle is somehow limited to a specific amount of information. If you have more bits of precision for one property, then the certainty about some other property is necessarily weaker because it gets fewer of the total bits available. Can we work out the number of bits? We need bits for position, bits for momentum, bits for other quantum mechanical properties, etc.

I'm wondering if perhaps superposition is a result of the number of bits for a given property (like spin) going to zero because they were required to increase the precision of something else. For that matter, I wonder if particles can share/trade bits, so that sometimes particles have no bits (like when they get absorbed). And maybe a body made up of particles has bits shared kinda like how a metal's conduction band is shared among all the atoms. Maybe that is the way force carriers act... trading bits. MAYBE the whole universe simply has a total number of bits, which are divided up as necessary among the particles. And really particle interactions are just bits (and their values) being traded around within a vast amorphous ocean of bits. In that case, particles are an illusion; they're an emergent property (from our perspective) of the varying association among bits.

3 days ago

NVIDIAs 64-bit Tegra K1: The Ghost of Transmeta Rides Again, Out of Order

Theovon Re:Static scheduling always performs poorly (125 comments)

Peer-reviewed venues don't reject things on principle just for being too novel. They reject them on the basis of poor experimental evidence. I think someone's BS'ing you about the lack of novelty claim, but the lack of hard numbers makes sense.

Perhaps the best thing to do would be to synthesize Mill and some other processor (e.g. OpenRISC) for FPGA and then run a bunch of benchmarks. Along with logic area and energy usage, that would be more than good enough to get into ISCA, MICRO, or HPCA.

I see nothing about Mill that should make it unpublishable except for the developers refusing to fit into the scientific culture, present things in expected manners, write using conventional language, and do very well-controlled experiments.

One of my most-cited works was first rejected because it wasn't clear what the overhead was going to be. I had developed a novel forward error correction method, but I wasn't rigorous about the latencies or logic area. Once I actually coded it up in Verilog and got area and power measurements, along with tightly bounded latency statistics, then getting the paper accepted was a breeze.

Maybe I should look into contacting them about this.

about two weeks ago

NVIDIAs 64-bit Tegra K1: The Ghost of Transmeta Rides Again, Out of Order

Theovon Re:Static scheduling always performs poorly (125 comments)

I looked at the Mill memory system. The main clever bit is to be able to issue loads in advance, but have the data returned correspond to the time the instruction retires, not when it's issued. This avoids aliasing problems. Still, you can't always know your address way far in advance, and Mill still has challenges with hoisting loads over flow control.
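Here's a toy model of that deferral idea as I understand it (my own sketch, NOT actual Mill semantics or syntax): the load is issued early but its value is sampled at retire time, so a store that lands in between is observed rather than aliased against.

# Toy model of a "deferred" load (illustrative only).

class ToyMachine:
    def __init__(self):
        self.mem = {}
        self.pending = []          # (retire_cycle, dest, addr)
        self.regs = {}
        self.cycle = 0

    def store(self, addr, value):
        self.mem[addr] = value

    def load_deferred(self, dest, addr, delay):
        # Issue now, but deliver the value 'delay' cycles later.
        self.pending.append((self.cycle + delay, dest, addr))

    def tick(self):
        self.cycle += 1
        for retire, dest, addr in list(self.pending):
            if retire == self.cycle:              # value sampled at retire time
                self.regs[dest] = self.mem.get(addr, 0)
                self.pending.remove((retire, dest, addr))

m = ToyMachine()
m.store(0x100, 1)
m.load_deferred("r1", 0x100, delay=3)  # hoisted well before the value is needed
m.tick(); m.store(0x100, 2)            # a store to the same address in between
m.tick(); m.tick()
print(m.regs["r1"])                    # 2: the retire-time value, not the stale 1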

about two weeks ago

NVIDIAs 64-bit Tegra K1: The Ghost of Transmeta Rides Again, Out of Order

Theovon Re:Static scheduling always performs poorly (125 comments)

I've heard of Mill. I also tried reading about it and got bored part way through. I wonder why Mill hasn't gotten much traction. It also bugs me that it comes up on regular Google but not Google Scholar. If they want to get traction with this architecture, they're going to have to start publishing in peer-reviewed venues.

about two weeks ago

NVIDIAs 64-bit Tegra K1: The Ghost of Transmeta Rides Again, Out of Order

Theovon Re:Static scheduling always performs poorly (125 comments)

Prefetches issued hundreds of cycles ahead of time have to be highly speculative and are therefore likely to pull in data you don't need while missing some data you do need. If you can improve the cache statistics this way, you can improve performance, and if you avoid a lot of LLC misses, then you can massively improve performance. But cache pollution is as big a problem as misses, because it causes conflict and capacity misses that you'd otherwise like to avoid.

Anyhow, I see your point. If you can avoid 90% of your LLC misses by prefetching just into a massive last-level cache, then you can seriously boost your performance.
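Rough numbers, just to show the scale of the effect (everything below is invented for illustration, not measured):

# Back-of-the-envelope: what avoiding LLC misses is worth (made-up numbers).
instructions = 1_000_000
base_cpi     = 1.0
miss_rate    = 0.005            # LLC misses per instruction
miss_penalty = 200              # cycles per miss

def cycles(rate):
    return instructions * (base_cpi + rate * miss_penalty)

before = cycles(miss_rate)
after  = cycles(miss_rate * 0.1)    # prefetching hides 90% of the misses
print(before / after)               # ~1.8x speedup for these made-up numbers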

about two weeks ago

NVIDIAs 64-bit Tegra K1: The Ghost of Transmeta Rides Again, Out of Order

Theovon Static scheduling always performs poorly (125 comments)

I'm an expert on CPU architecture. (I have a PhD in this area.)

The idea of offloading instruction scheduling to the compiler is not new. This was particularly in mind when Intel designed Itanium, although it was a very important concept for in-order processors long before that. For most instruction sequences, latencies are predictable, so you can order instructions to improve throughput (reduce stalls). So it seems like a good idea to let the compiler do the work once and save on hardware. Except for one major monkey wrench:

Memory load instructions

Cache misses and therefore access latencies are effectively unpredictable. Sure, if you have a workload with a high cache hit rate, you can make assumptions about the L1D load latency and schedule instructions accordingly. That works okay. Until you have a workload with a lot of cache misses. Then in-order designs fall on their faces. Why? Because a load miss is often followed by many instructions that are not dependent on the load, but only an out-of-order processor can continue on ahead and actually execute some instructions while the load is being serviced. Moreover, OOO designs can queue up multiple load misses, overlapping their stall time, and they can get many more instructions already decoded and waiting in instruction queues, shortening their effective latency when they finally do start executing. Also, OOO processors can schedule dynamically around dynamic instruction sequences (i.e. flow control making the exact sequence of instructions unknown at compile time).
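A crude way to see the effect (a toy cycle-count model with made-up latencies, not a real pipeline simulation): one load misses all the way to memory, followed by work that doesn't depend on it.

# Toy cycle-count comparison (invented numbers, not a real pipeline model).
MISS_LATENCY = 200               # cycles for a last-level cache miss
independent_after_load = 50      # single-cycle instructions that don't need the load

# In-order: everything behind the missing load waits for it.
in_order = MISS_LATENCY + independent_after_load

# Out-of-order (idealized): the independent work executes under the miss,
# so only the part of the miss it can't cover is exposed.
out_of_order = max(MISS_LATENCY, independent_after_load)

print(in_order, out_of_order)    # 250 vs 200 cycles for this toy sequence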

One Sun engineer talking about Rock described modern software workloads as races between long memory stalls. Depending on the memory footprint, a workload could spend more than half its time waiting on what is otherwise a low-probability event. The processors blast through hundreds of instructions where the code has a high cache hit rate, and then they encounter a last-level cache miss and stall out completely for hundreds of cycles (generally not on the load itself but the first instruction dependent on the load, which always comes up pretty soon after). This pattern repeats over and over again, and the only way to deal with that is to hide as much of that stall as possible.

With an OOO design, an L1 miss/L2 hit can be effectively and dynamically hidden by the instruction window. L2 (or in any case the last level) misses are hundreds of cycles, but an OOO design can continue to fetch and execute instructions during that memory stall, hiding a lot of (although not all of) that stall. Although it's good for optimizing poorly-ordered sequences of predictable instructions, OOO is more than anything else a solution to the variable memory latency problem. In modern systems, memory latencies are variable and very high, making OOO a massive win on throughput.

Now, think about idle power and its impact on energy usage. When an in-order CPU stalls on memory, it's still burning power while waiting, while an OOO processor is still getting work done. As the idle proportion of total power increases, the usefulness of the extra die area for OOO increases, because, especially for interactive workloads, there is more frequent opportunity for the CPU to get its job done a lot sooner and then go into a low-power low-leakage state.

So, back to the topic at hand: What they propose is basically static scheduling (by the compiler), except JIT. Very little information useful to instruction scheduling is going to be available JUST BEFORE execution that is not available much earlier. What you'll basically get is some weak statistical information about which loads are more likely to stall than others, so that you can resequence instructions dependent on loads that are expected to stall. As a result, you may get a small improvement in throughput. What you don't get is the ability to handle unexpected stalls, overlapped stalls, or the ability to run ahead and execute only SOME of the instructions that follow the load. Those things are really what gives OOO its advantages.

I'm not sure where to mention this, but in OOO processors, the hardware to roll back mispredicted branches (the reorder buffer) does double-duty. It's used for dependency tracking, precise exceptions, and speculative execution. In a complex in-order processor (say, one with a vector ALU), rolling back speculative execution (which you have to do on mispredicted branches) needs hardware that is only for that purpose, so it's not as well utilized.

about three weeks ago

Oracle Hasn't Killed Java -- But There's Still Time

Theovon Needed innovation: SLIM JAVA DOWN (371 comments)

Right now, if I want to ship an app that uses Java 8 features, I have to bundle an extra 40 megs of runtime. This is because Java 8 isn't yet the default. An extra 40 megs is stupid for simple apps. The runtime is an order of magnitude larger than the application. That's stupid.

If Java wants to innovate, they can find a way to maintain all the existing features and backward compatibility while using less space. That would be a worthy and worthwhile project for Java 9. They can make things smaller and perhaps even faster by rewriting things that are overly bloated.

about three weeks ago

Jesse Jackson: Tech Diversity Is Next Civil Rights Step

Theovon The root of the problem is culture & social cl (514 comments)

For some reason, Americans have developed a stereotype of "white" and "black" that is related far more to social class than anything else. When you say "white," we imagine someone from the middle class. When you say "black," we imagine someone from lower socioeconomic status. How many blacks are in the middle class, I'm not sure, but as for whites in lower classes, we have them coming out our ears. While we may have millions of blacks who live in ghettos, we have 10 times as many whites living in trailer parks.

Because of our confusion between ethnicity and social class, we end up with things like Dave Chappelle's "Racial Draft": http://www.thecomedynetwork.ca/blogs/2013/06/chappelles-show-june5-racial-draft
While amusing, it highlights the real problem, and this false stereotype is widespread throughout American culture.

I recall an interview with Bill Cosby, talking about educational advancement among black children. Peers discourage each other from studying because it's "acting white." When in fact it is "acting middle class," because this same kind of discouragement occurs among lower class whites as well. As long as education is not valued within any group, that group will have difficulty being equally represented in white collar industries.

What we have to work out to explain the disparity between population demographics and white collar job demographics is the proportions of the underrepresented groups who discourage education. People like Jesse Jackson want to make this all out to be the result of prejudice on the basis of genetics or skin color. Honestly, I think we're long past that. There are still plenty of racist bastards out there, but in general, we do not have pink people acting intentionally or unconsciously to undermine the advancement of brown people when it comes to getting college degrees.

It's not PC to talk about genetic differences, but genetics is interesting. Geneticists have identified differences between different ethnic groups, and they have correlated them with some minor differences in physical and cognitive adaptations. Things like muscle tone, susceptibility to certain diseases, social ability, and other things have been correlated to a limited degree with variation in human DNA. But the average differences between genetic groups are miniscule compared to their overlap (statistically, we have very small mu / sigma for basically any meaningful measurable characteristic).

Thus I can only conclude that correcting any disparities must come from within. Regulating businesses won't do any good, because unqualified minorities will end up getting unfairly hired and promoted. We have to start with the children and get them to develop an interest in science and math. If Jesse Jackson wants to fix this problem, he needs to learn science and math and start teaching it. I assure you, even at his age, he has that capability, if he just cared enough to do it. Unfortunately for him, if he were to corrupt himself with this knowledge, he would find himself taking a wholly different approach than the "we're victims" schtick he's played most of his life. Personally, I prefer the "the universe is awesome" philosophy held by Neil deGrasse Tyson. He's one of my biggest heroes, and that has nothing to do with his skin tone.

One last thought: I'm sure someone will find something racist in what I have said. Either that or I'm being too anti-racist and appear like I'm overcompensating. There are also aspects of these social issues I know nothing about. I'm just writing a comment on Slashdot that is about as well-informed as any other comment. One thing people should think about in general is whether or not they have hidden prejudices. It's not their fault, having been brought up in a culture that takes certain things for granted. Instead of burying our heads in the sand, we should be willing to admit that we probably do have subconscious prejudices. That's okay, as long as we consciously behave in a way that is fair to other human beings, regardless of race, gender, sexual orientation, autism, or any other thing they didn't choose to be born with (and plenty of things they have chosen, because it's people's right to choose).

about 1 month ago

German NSA Committee May Turn To Typewriters To Stop Leaks

Theovon Listening to keystrokes + HMM = Profit! (244 comments)

Passwords have been stolen just by listening to keyboard click noises. Why would a typewriter be any different? A relatively straightforward codebook analysis of keypress noises, plus a hidden Markov model, plus the Viterbi algorithm will allow you to calculate the highest-probability sequence of letters for a given sequence of sounds and the timings between them, even in German!
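For the skeptical: the decoding step is nothing exotic. A bare-bones Viterbi decoder over keypress "sound classes" looks something like this (the letters and all the probabilities below are made up, just to show the shape of the computation):

import math

# Bare-bones Viterbi: given a sequence of acoustic "sound classes" (one per
# keypress), recover the most probable letter sequence. Toy numbers only.
states = ["e", "n", "i"]                       # candidate letters
start  = {"e": 0.4, "n": 0.3, "i": 0.3}        # letter frequencies
trans  = {                                     # letter-to-letter (language model)
    "e": {"e": 0.2, "n": 0.5, "i": 0.3},
    "n": {"e": 0.4, "n": 0.2, "i": 0.4},
    "i": {"e": 0.3, "n": 0.5, "i": 0.2},
}
emit   = {                                     # P(sound class | letter), from a codebook
    "e": {"s1": 0.7, "s2": 0.2, "s3": 0.1},
    "n": {"s1": 0.1, "s2": 0.7, "s3": 0.2},
    "i": {"s1": 0.2, "s2": 0.1, "s3": 0.7},
}

def viterbi(observations):
    # score[s] = best log-probability of any path ending in letter s
    score = {s: math.log(start[s]) + math.log(emit[s][observations[0]]) for s in states}
    back = []
    for obs in observations[1:]:
        prev = score
        score, ptr = {}, {}
        for s in states:
            best = max(states, key=lambda p: prev[p] + math.log(trans[p][s]))
            score[s] = prev[best] + math.log(trans[best][s]) + math.log(emit[s][obs])
            ptr[s] = best
        back.append(ptr)
    last = max(states, key=lambda s: score[s])  # trace back the best path
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return "".join(reversed(path))

print(viterbi(["s1", "s2", "s1"]))             # "ene" for this toy model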

Mind you, they have to be able to get a sound bug in there, but that might be accomplished with malware-infected computers near the typewriters.

Anyhow, basically, the technology used to do automatic speech recognition would make short work of tapping typewriters, so they’re fooling themselves if they think this’ll make much difference.

BTW, I have a strong suspicion that the Germans’ outrage is all a big charade. Every major country has big spy operations. The NSA is neither unique nor the first of its kind. The Germans could not have been ignorant of at least the general nature of the NSA’s dealings before Snowden, so while they openly object, secretly, this is business as usual. By doing this, they fool their people into thinking they’re not being spied on by their own government and, using the US as a scapegoat, they also generate a degree of solidarity. Russian spy operations, of course, are way worse, so their objections are the same bullshit. And the Chinese government is all about lying to, well, basically everyone while they use both capitalism and cyberwarfare to take over the world and control everyone, so their recent statement about the iPhone is also a crock of shit.

This reminds me of Andrew Cuomo’s push to restore trust in government. The whole idea is disingenuous. Governments, like any large organization, are only going to do what the people need when there are checks & balances and transparency.

And as a final note, I believe that the stated purpose of the NSA is a good one: Mine publicly available data to identify terrorist activity. That sounds like a good thing to do. It’s the illegal violations of privacy that are wrong. They violate our rights because it’s inconvenient to get the info they need some other way. It’s also inconvenient for me to work a regular job instead of selling drugs. There are much more convenient ways to achieve my goals that I avoid because they are wrong. To do their job, the NSA needs to find clever ways to acquire the information they need WITHIN THE LAW.

about a month and a half ago

Economist: File Sharing's Impact On Movies Is Modest At Most

Theovon Anti-piracy campaigns are highly effective (214 comments)

But not for the reason you think.

A question we should be asking ourselves is what impact would piracy have on movie revenue if we’d had higher speed Internet in the days of Napster and Kazaa. We currently live in a culture where even non-technical people know that piracy is a copyright violation. There’s also the looming threat of being sucked up in a dredging operation or having your ISP (or the NSA) volunteer information to the MPAA on your metadata. People don’t avoid filesharing because it’s unethical or illegal. They avoid it because it’s relatively inconvenient (requiring technical knowledge), and they fear excessive penalties if they’re caught.

If pirating movies were as simple as downloading an app and searching people’s libraries, the amount of piracy would be far greater, and the impact on revenue would be more significant.

What’s really curious to me, however, is the amount of time and effort some people spend on this. Personally, I’d rather optimize to reduce how much time I spend on it than try to see how cleverly rebellious I can be. If I want to watch a movie that’s currently out on DVD, I have four classes of options:
(1) I could spend about half an hour figuring out which of the numerous available torrents is in a playable format and not a fake and then maybe a couple hours downloading it. If I’m really lucky, I can burn it to a DVD that my player will understand so I don’t have to take the time to connect my laptop to the TV.
(2) I could run down to the nearest RedBox, about 15 minutes round trip, and spend the rest of the time doing some consulting work. Not only would I have a legal copy, but I’d come out ahead financially.
(3) If I have some patience to wait a day, I can order my own copy to keep from Amazon Prime, and I’ll STILL come out ahead financially.
(4) If I’m dead-set on a lengthy download, services like iTunes offer up a wide variety of downloadable media.

I suspect most of us clever enough to avoid getting caught pirating think this way. The legal options are just easier, less costly (time==money), and less risky. Those with the skills are already in a minority, so the only people doing any significant amount of piracy are those with both the skills and nothing better to do than to see how clever they can be at unnecessarily breaking the law.

I encounter that attitude a surprising amount, though, among students. There are people who will spend more time and effort trying to BS their way through an assignment and/or find a way to avoid the need to do it than would be necessary to actually just do the assignment. Doing the assignment requires learning something new, while all this “clever” avoidance relies on established skills. But I don’t know why these people bothered to go to college if they have no interest in learning the material. I guess they feel pressure culturally or parentally, but I don’t like it when they make it my problem.

about a month and a half ago

Normal Humans Effectively Excluded From Developing Software

Theovon Elites in any field must have some OCD (608 comments)

Those people really far out on the cutting edge of new sciences are successful only because they have some major obsessive qualities. They are driven to learn, understand, and create. They understand things so abstract and esoteric that it would be all but impossible to explain some of these ideas to the lay person. And each of us has some secret weapon too. Mine, for instance, is that I can code rings around most other CS professors. I’m not actually smarter than them. Indeed, most of them seem to be better at coming up with good ideas on the first shot. My advantage is that I can filter out more bad ideas faster.

A key aspect of the areas that we are experts in is that there are underlying and unifying principles. Subatomic particles fit into categories and have expected behaviors that fit a well-tested model. CPU architectures, even exotic ones, share fundamentals of data flow and computation. CS is one of those fields that is invented more than it’s discovered, and as we go along, scientists develop progressively more coherent approaches to solving problems. Out-of-order instruction scheduling algorithms of the past (Tomasulo’s algorithm and the CDC 6600, for instance) have given way to more elegant approaches that solve multiple problems using common mechanisms (e.g. register renaming and reorder buffers). You may think the x86 ISA is a mess, but that’s just a façade over a RISC-like back-end that is finely tuned based on the latest breakthroughs in computer architecture.

Then there’s web programming. Web programming is nothing short of a disaster. There’s a million and one incompatible Javascript toolkits. HTML, CSS, and Javascript are designed by committee so they have gaps and inconsistencies. To write the simplest web site (with any kind of interactivity) requires that one work with 5 different languages at one time (HTML, CSS, Javascript, SQL, and at least one back-end language like PHP), and they’re not even separable; one type of code gets embedded in the other. People develop toolkits to try to hide some of these complexities, but few approach feature-completeness, and it’s basically impossible to combine some of them. Then there’s security. In web programming, the straightforward, intuitive approach is always the wrong approach because it’s full of holes. This is because these tools were not originally developed with security in mind, so you have to jump through hoops to make sure to manually plug them all with a ton of extraneous code. In terms of lines of code, your actual business logic will be small in comparison to all the other cruft you need to make things work properly.

When I work on a hard problem, I strip away the side issues and focus on the core fundamental problem that needs to be solved. With most software, it is possible to break systems down into core components that solve coherent problems well and then combine them into a bigger system. This is not the case with web programming. And this is what makes it inaccessible to “normal people.” The elites in software engineering are the sorts of people who extract the grand unifying theories behind problems, solve the esoteric problems, and provide the solutions as tools to other people. Then “normal people” use those tools to get work done. With the current state of the web, this is basically impossible.

about 2 months ago

Submissions


Ask Slashdot: Any really good texts on evolutionary details?

Theovon Theovon writes  |  about 2 years ago

Theovon (109752) writes "To me, that we evolved from earlier life forms is a straightforward conclusion. We have mountains of evidence, and current theories are sound given that data. But I'm not a biologist, so I find it a challenge to get access to much of that data. I'm looking for a single coherent tome (or maybe multivolume set) of biological data used to develop specific theories of evolution of many ancient and modern family trees. I don't want mere drawings of fossils in sequence like in a high school textbook. I want to see photographs of the original fossils, along with data about geologic strata, measurements of numerous morphological features, and explanations of the lines of reasoning that lead to particular conclusions. Sections on DNA analysis would be great too, along with any other interesting lines of evidence. The conclusions that scientists draw are fascinating, and I'd like to dig deeper into the data they started from. Would the slashdot crowd be able to help me find a top example of such a resource?"

'Something is deeply broken in OS X memory management'

Theovon Theovon writes  |  more than 2 years ago

Theovon (109752) writes "Ever since Apple released Lion, countless users have been complaining about performance problems, even on top-of-the-line hardware. OSX point releases have been coming out, but this issue has remained completely unaddressed by Apple, as per their usual "it's not our fault" approach to their mistakes. Well, some researchers have been investigating this. Perhaps Apple will finally take notice. The original article is here, and the OSNews article is here."

Pampers Dry Max diapers, chemical burns

Theovon Theovon writes  |  more than 4 years ago

Theovon (109752) writes "Despite the self-deprecating jokes, many of us slashdotters do indeed have the social skills to find mates and have children. This is why articles like the recent one on delayed umbilical cord cutting are of interest to us. Well, here's another one for us parents, something my week-old daughter is experiencing first-hand. Procter and Gamble is putting their heads in the sand and denying all responsibility in response to a spate of reports that the most recent version of their "Dry Max" diapers are causing severe rashes that appear to be chemical burns. There are articles all over the place, with questions and blogs and even P&G's lame response trying to suggest that it's a mere coincidence that rashes are increasing at the same time that their new diapers came out. The feds are investigating, and hopefully, there will be a recall soon. My little girl's rash isn't quite as severe as what I've been reading about, since we caught it early and are using liberal amounts of Desitin. We're accustomed to seeing corporate greed stand in the way of product quality, every one of us who is forced to use Microsoft products. But it's one thing to lose some work. It's entirely another when helpless babies are physically injured by a product that we're supposed to trust, and even worse when the manufacturer tries to tell us that we're the ones at fault."

Linux Not Quite Ready for New 4K-sector Drives

Theovon Theovon writes  |  more than 4 years ago

Theovon (109752) writes "We've seen a few stories recently about the new Western Digital Green drives, including this one on slashdot. According to WD, their new 4096-byte sector drives are problematic for Windows XP users but not Linux or most other OS's. There's an article on OS News that suggests that Linux users should not be complacent about this, because not all the Linux tools like fdisk have caught up. The result is a reduction in write throughput by a factor of 3.3 across the board (a 230% overhead) when 4096-byte clusters are misaligned to 4096-byte physical sectors by one or more 512-byte logical sectors. The author does some benchmarks to demonstrate this. Also, from the comments on the article, it appears that even parted is not ready since by default, it aligns to "cylinder" boundaries, which are not physical cylinder boundaries and are multiples of 63."
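The arithmetic behind the misalignment is easy to check yourself (a quick sketch; the sector numbers below are illustrative, not from the article):

# Quick check of 4K alignment. A drive with 4096-byte physical sectors still
# exposes 512-byte logical sectors, so a partition is aligned only if its
# starting LBA is a multiple of 8; otherwise every 4K cluster straddles two
# physical sectors and writes become read-modify-write cycles.

def aligned_4k(start_lba, logical_sector=512, physical_sector=4096):
    return (start_lba * logical_sector) % physical_sector == 0

print(aligned_4k(63))    # False: the classic DOS/"cylinder" start, misaligned
print(aligned_4k(2048))  # True: 1 MiB alignment, the now-common safe default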

OGP releases video of VGA emulator booting

Theovon Theovon writes  |  more than 5 years ago

Theovon writes "Slashdot hasn't seen much news about the Open Graphics Project for some time now, but the OGP has been quite busy, especially recently. As you may recall, the OGP's goal is to develop a fully open-source graphics card. All specs, designs, and source code are released under Free licenses. Right now, they're using FPGAs (large-scale reprogrammable chips) to build a development platform that they call OGD1. And they've just completed an alpha version of legacy VGA emulation, apparently not an easy feat. They have posted a Youtube video of OGD1 driving a monitor, showing Gentoo booting up in a PC. This completes a major step, allowing OGD1 to act as the primary display in an x86 PC. The announcement can be seen on the OGP home page, and there's an OSNews.com article. Finally, the Free Software Foundation has taken notice of this and is asking for volunteers to help with the OGP wiki."

Dedicated compute box: Persistent terminals?

Theovon Theovon writes  |  more than 6 years ago

Theovon (109752) writes "I just built an expensive high-end quad-core Linux PC, dedicated for number-crunching. Its job is to sit in the corner with no keyboard, mouse, or monitor and do nothing but compute (genetic algorithms, neural nets, and other research). My issue is that I would like to have something like persistent terminal sessions.

I've considered using Xvnc in a completely headless configuration (some useful documentation here, here, here, and here). However, for most of my uses, this is overkill. Total waste of memory and compute time. But if I decide to run FPGA synthesis software under WINE, it will become necessary. Unfortunately, I can't quite figure out how to get a persistent X11 session where I'm automatically logged in (or can stay logged in), while maintaining enough security that I don't mind opening the VNC port on my firewall (with a changed port number, of course). I'm also going to check out Xpra, but I've only just heard about it and have no idea how to use it.

For the short term, the main need is just terminals. I'd like to be able to connect and see how something is going. One option is to just run things with nohup and then login and "tail -f" to watch the log file. I've also heard of screen, but I'm also unfamiliar with that.

Have other slashdot users encountered this situation? What did you use? What's hard, what's easy, and what works well?"

Theovon Theovon writes  |  more than 7 years ago

Theovon (109752) writes "It's only been two days since the announcement of the official release of Ubuntu 6.10 (Edgy Eft), and the fallout has been very interesting to watch. By and large, fresh installs of Edgy tend to go well. A few problems here and there, especially with installation of closed-source ATI and nVidia drivers, but for the most part things have been smooth. Many people report improved performance over Dapper, improved stability, better device support, etc. A good showing. But what I find really interesting is the debacle that it has been for people who wanted to do an "upgrade" from Dapper (6.06). Installing OS upgrades has historically been fraught with problems, but previous Ubuntu releases, many other Linux distros, and MacOS X have done surprisingly well in the recent past. But not Edgy. Reports are flooding into Ubuntu's Installation & Upgrades forum from people having myriad problems with their upgrades. One user described it as a nightmare. Users are producing detailed descriptions of problems but getting little help. This thread has mixed reports and is possibly the most interesting read. Many people report that straight-forward upgrades of relatively mundane systems go well, but anything the least bit interesting seems not to have been accounted for, like software RAID, custom kernels, and Opera. Even the official upgrade method doesn't work for everyone, including crashes of the upgrade tool in the middle of installing, leaving systems unbootable, no longer recognizing devices (like the console keyboard!), reduced performance, X server crashes, wireless networking problems, the user password no longer working, numerous broken applications, and many even stranger things. Some of this is fairly subjective, with Kubuntu being a bit more problematic than Ubuntu, with reports that Xubuntu seems to have the worst problems, and remote upgrades are something you don't even want to try. Failed upgrades invariably require a complete reinstall. The conclusion from the street, about upgrading to Edgy, is a warning: If you're going to try to take the plunge, be sure to make a backup image of your boot partition before starting the upgrade. Your chances of having the upgrade be a total failure are high. If you're really dead-set on upgrading, you'll save yourself a lot of time and headache by backing up all of your personal files manually and doing a fresh install (don't forget to save your bookmarks!)."

Theovon Theovon writes  |  more than 7 years ago

Theovon writes "Back in the 1920's a blight all but completely wiped out Chestnut trees in the United States. As such, my uncle's 1100-tree chestnut farm is a rare sight indeed. From the article, "... someone found a single tree in Ohio that the blight did not kill ..., crossed it with the Chinese chestnut, resulting in a nut with the characteristics of the Chinese variety but with the larger nut of the American tree." The article goes on to describe some interesting things about chestnuts themselves, such as the spiny burr that they grow in on the tree."

Journals

Theovon has no journal entries.
