Comments


GNOME 3.14 Released

Theovon Window decorations don't suck! (248 comments)

I've been complaining for years that the default KDE window manager not only looks ugly but also clashes with the rest of the theme. If windows were styled like Plasma widgets, they would look sleek and would clearly belong with the rest of the theme. But the KDE devs seem to have no idea what I'm talking about. How can things go so right in so many ways and then fall apart in such a conspicuous area?

At first glance, the new GNOME window decorations actually look pretty good. Maybe I'll change my mind later, but it looks like someone has developed a sense of style.

5 days ago

Rosetta Code Study Weighs In On the Programming Language Debate

Theovon Re:Who cares about succinctness .... (165 comments)

Verilog vs. VHDL. I find that the verbosity of VHDL (which requires roughly three times as much typing) actually impedes readability. Sure, there are situations where VHDL can catch a bug at synthesis time that Verilog can't, but the rest of the time, the extra verbosity just makes VHDL unwieldy.

about a week ago

Ask Slashdot: Finding a Job After Completing Computer Science Ph.D?

Theovon I got a PhD in computer engineering too (471 comments)

I was maybe 80% sure that I wanted to go into academia, so it's not so strange that I got a PhD. But I interviewed for both industry and academic jobs. In my case, I already had extensive industry experience. For some academic employers, the industry experience seemed to be a negative, and for some industry employers, the PhD seemed to be a negative. Very few employers saw the combination as a bonus, although the list of those that did value my background included Intel, AMD, and my current employer (a research university -- I went into tenure track).

I recently interviewed at AMD (because they called me, and I figured it couldn't hurt to see what my alternatives are), and they grilled me hard on programming questions. They asked me things like what the 'volatile' and 'static' keywords in C mean (I was able to quickly rattle off more than the interviewer needed to know about them), and when I went on-site, they gave me some programming problems. The key reason they like me (and are writing up an offer) is that I knew a lot about programming, had done a lot of programming (despite having been in academia for 2 years, they referred to me as a veteran from industry), and I knew a good deal about each of the topics they talked with me about (CPU architecture, GPU architecture, the 3D graphics rendering pipeline, compilers, etc.).
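In case it helps anyone brushing up for a similar interview, here's a minimal sketch of what those two keywords mean. It's my own illustration (not AMD's actual questions), written in the common C/C++ subset:

#include <stdio.h>

/* 'static' at file scope: internal linkage -- the symbol is not visible
   to other translation units. */
static int file_counter = 0;

/* 'volatile': the object may change outside the compiler's view (a hardware
   register, a signal handler), so every read and write must actually happen;
   the compiler may not cache the value or optimize accesses away. */
volatile int hw_status = 0;

static void bump(void) {            /* 'static' on a function: internal linkage too. */
    /* 'static' inside a function: a single instance with program lifetime,
       initialized once, value preserved across calls. */
    static int calls = 0;
    ++calls;
    ++file_counter;
    printf("bump() call #%d, file_counter = %d\n", calls, file_counter);
}

int main(void) {
    bump();
    bump();
    /* A real driver would poll hw_status in a loop; here it's read once
       so the sketch terminates. */
    printf("hw_status = %d\n", hw_status);
    return 0;
}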

Key ways in which this went well for me included (a) I proved that I was a very strong software engineer with practical knowledge, skill, and efficiency, and (b) I was able to show how, for me, the PhD augmented (rather than hurt) my engineering skills. That last bit is key. For instance, I showed that I could approach a problem with creative solutions, apply a scientific approach to determine the viability of an idea, and (most importantly) explain how I could fit it into the context of a BUSINESS that wants to make money from it. Coming from academia, I also know how to search for existing solutions, so I can avoid reinventing the wheel: I can look up what people have done before and incorporate some of those ideas into a new practical solution.

So, bottom line, if you want to go into industry (and not necessarily into some big company's research wing), then you have to show that you're a real engineer who can design complex solutions to complex problems and do it efficiently. You have to know a LOT about programming. On top of that, you have to know a lot of theory (algorithms, data structures, computational complexity, etc.). And you have to show that you can think in business and product terms. You're working there to make products that will sell and make money, and you have to convince them that you're unconsciously competent at doing this very well. You need to break the stereotype that PhDs arrogantly have their heads in the clouds, can't think about practical matters, and get too easily distracted by things tangential to the job at hand.

about a week ago

Ask Slashdot: Finding a Job After Completing Computer Science Ph.D?

Theovon Re:List the STL? Seriously? (471 comments)

Yes and no. It depends on how specific they want to get. Can you list general types of containers? They're fairly universal: linked list, vector, double-ended queue, set, map, multimap, etc. This kind of question just probes for familiarity with the kinds of facilities the language and its standard library offer. Bonus points for remembering the names of a few of the methods (like size() and push_back()). All it means is that you understand data structures and that you've managed to retain some of the facts about that language's take on them.
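For illustration, here's a minimal C++ sketch (my own example, not something any particular interviewer would require) that touches each of the containers and methods mentioned above:

#include <cstdio>
#include <deque>
#include <list>
#include <map>
#include <set>
#include <string>
#include <vector>

int main() {
    std::vector<int> v;                  // contiguous storage, amortized O(1) push_back
    v.push_back(42);

    std::list<int> l{1, 2, 3};           // doubly linked list
    std::deque<int> d;                   // double-ended queue
    d.push_back(7);
    d.push_front(6);

    std::set<int> s{3, 1, 2};            // ordered set of unique keys
    std::map<std::string, int> m;        // ordered key -> value
    m["answer"] = 42;
    std::multimap<std::string, int> mm;  // like map, but duplicate keys allowed
    mm.insert({"dup", 1});
    mm.insert({"dup", 2});

    std::printf("sizes: vector=%zu list=%zu deque=%zu set=%zu map=%zu multimap=%zu\n",
                v.size(), l.size(), d.size(), s.size(), m.size(), mm.size());
    return 0;
}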

about a week ago

Data Archiving Standards Need To Be Future-Proofed

Theovon Re:Too bad your DNA is useless to most MDs (113 comments)

We seriously considered chronic Lyme as a possibility and even got testing. The test came back negative, although there can be false negatives. We ultimately ruled it out on the basis of certain key symptoms being absent. Basically, we considered a LOT of things and did our best to rank the chances of each illness that might explain the symptoms. We were open to the idea of more than one cause but considered it a remote possibility; fortunately, we were right.

Anyhow, homozygous MTHFR C677T can be serious, especially if there are other complicating mutations. Compared to some people, my wife has a moderate case. She had chronic fatigue (not necessarily to be confused with CFIDS), brain fog, autoimmune disease, gluten intolerance, weight gain, pale skin, hair loss, and many more symptoms. But she never lost feeling in her limbs; some people do. When you mess up the methylation cycle, all sorts of things can go wrong.

I'm not sure why you (an anonymous coward, so why am I feeding the trolls?) think that this mutation is of "dubious clinical significance." It's one of the more serious mutations, and the appropriate treatments have worked. Taking methylfolate, a few different forms of B12, and several other supplements has caused massive improvement in energy, return of proper skin tone, hair regrowth, appropriate weight loss, and so on. In other words THE TREATMENT WORKED.

This is one of those fortunate cases where a hard-to-find single cause has been identified. It explains ALL of the symptoms (many of which are secondary, stemming from deficiencies created by the underlying problem), and the treatment has worked very well. It's a little hard to get the exact dosages of vitamins right, because as soon as you get enough of one thing, the body starts repairing things, which requires other chemicals and causes a deficiency in something else, and so on. So the fix isn't an overnight sort of thing, but the progress is rapid.

And my biggest complaint is not that the MDs didn't know how to diagnose this. My complaint is that they EXPLICITLY REFUSED to help us when we were trying to track down the cause. Seriously. Most doctors just didn't have a clue and were unwilling to "do a lot of speculative testing," while some outright said they refused to help us. Even if we came in with a list of tests to run to try to narrow down a range of possibilities (like a decision tree), they wouldn't do it. We had to figure this out completely on our own.

I don't expect MDs to know everything or be super-human. But I do expect them to listen and take patients seriously.

about two weeks ago

Data Archiving Standards Need To Be Future-Proofed

Theovon Too bad your DNA is useless to most MDs (113 comments)

... or for that matter any of your medical history. MDs do spot-diagnosis in 5 minutes or less based exclusively on what they've memorized or else they do no diagnosis at all.

My wife has a major genetic defect (MTHFR C677T), which causes severe nutritional problems. We haven't yet met an MD who has a clue about nutrition. Moreover, we had to diagnose this problem ourselves through genetic testing, with no doctors involved. We've shown the results to doctors, and they don't entirely disbelieve us, but they also have no clue what to do about it and are still dubious of the symptoms. (Who has symptoms of beriberi these days? Someone whose general ability to absorb nutrients is severely compromised.)

What makes anyone think that this will change if your doctor has access to your DNA, even with detailed analysis? They won't take the time to actually read any of it. In fact, a lot of what we know about genetic defects pertains to problems in generating certain kinds of enzymes, many of which participate in nutrient absorption. (So obviously RESEARCHERS know something about nutrition.) These nutritional problems require supplementation that MDs don't know about. Do you think the typical MD knows that folic acid is poison to those with C677T? Nope. They don't know the differences between folic acid, folinic acid, and methylfolate and still push folic acid on all pregnant women (they should be pushing methylfolate). They also don't know the differences between the various forms of B12 and always prescribe cyanocobalamin, even for people who need the methyl and hydroxy forms.

Another way in which MDs are useless comes from their training. Basically, they're trained to be skeptical and dismissive. Many nutritional and autoimmune disorders manifest with a constellation of symptoms, along with severe brain fog. Someone with one of these problems will generally want to write down the symptoms when talking to a doctor, because they can't think clearly. The thing is, in med school, doctors are specifically trained to look out for patients with constellations of symptoms and written lists, and they are told to recognize this as a condition that is entirely in the mind of the patient. Of course, a lot of doctors, even if not trained to dismiss things as "all in their head," are terrible at diagnosis anyway. They'll have no clue where to start and won't have the patience to do extensive testing. It's too INCONVENIENT and time-consuming. They won't make enough money off patients like this, so they get them out the door as fast as possible.

I've had some good experiences with surgeons. But for any other kind of medical treatment, MDs have been mostly useless to me and my family. In general, if we go to one NOW, we've already diagnosed the problem (correctly) and possibly need advice on exactly which medicine is required, although when it comes to antibiotics, it's easy enough to find out which ones to use. (Medical diagnosis based on stuff you look up on the internet is really hard and requires a very well-trained bullshit filter, and you also have to know how to use the more authoritative sources properly. However, it's not impossible for people with training in things like law, information science, and biology. It just requires really good critical thinking skills. BTW, most MDs don't have that.)

MDs are technicians. Most of them are like those B-average CS grads from low-ranked schools who can barely manage to write Java applications. If you know how to deal with a low-level technician, guide them properly, and stroke their ego in the right way, you can deal with an MD.

about two weeks ago

Schizophrenia Is Not a Single Disease

Theovon Nutritional deficiencies! (222 comments)

Probably at least a few of those sub-disorders are actually nutritional deficiencies. We have this myth (perpetuated by MDs who have ZERO training in nutrition) that we don't have nutritional deficiencies in America. In fact, the American diet is horrible, and we all know it. B12 deficiencies are common (which is one of the reasons shots are often prescribed), as are deficiencies in magnesium, along with numerous other vitamins and minerals. Since the mid-90's, the FDA has mandated "enrichment" of foods, but the forms of the additives are NOT the biologically active forms, so some people have trouble processing them. For instance, MTHFR gene defects are common (my wife and I have different ones, and she has the really bad C677T defect), making folic acid (which is artificial) range from useless to poisonous for people who need to take methylfolate instead. In fact, since the mid-90's, a lot of people have reported declines in their health, which may be correlated with that FDA mandate (although without a more complete study, we have to assume this is anecdotal and COULD be correlated more strongly with something else).

Anyhow, my point is that many psychological disorders, such as bipolar disorder, are associated with vitamin deficiencies. If you look at the symptom lists of various B vitamin deficiencies, for instance, you'll see that it is already established what kinds of psychological effects can occur in cases of "extreme" deficiencies. If we can get past the idea that nobody in America can have extreme vitamin deficiencies (you can have plenty of some vitamin but still be anemic if you can't USE it in that form), then we can start treating mental disorders using carefully controlled diets and supplement schedules. I'm sure it won't work on everyone, but it would be insane not to try it in place of loading people up on antipsychotics because the "doctors" at mental hospitals have no fucking clue about nutrition.

And just to reinforce: folic acid is basically poison to about 10% of humans. Different vegetables contain methylfolate and/or folinic acid, NOT folic acid. Defects in the genes that code for the enzymes that convert folinic acid and folic acid into methylfolate are more common than most food allergies, making this a serious problem!

One interesting side-effect of this is the proliferation of the bad genes. People with a homozygous C677T mutation have about a 30% conversion rate from folic acid to methylfolate. (Meanwhile, the unconverted folic acid itself interferes with the methylation cycle.) If a woman gets pregnant and takes folic acid in large quantities (which is what doctors instruct), the fetus will take all of the methylfolate, and the mother will get very sick. Meanwhile, the fetus will be allowed to develop when otherwise it would have naturally aborted due to its own inability to convert the folinic acid you get from food. As a result, we have more people born with this defect, while people in the FDA and the medical profession are too ignorant of the consequences to deal with them properly. Mind you, if the mothers were to take more methylfolate, the viability of this defect would still increase, but at least they wouldn't get as sick.

about two weeks ago

Link Between Salt and High Blood Pressure 'Overstated'

Theovon Personalized medicine... and nutrition (291 comments)

Yeah, much of what we know is being overturned. Some of the disinformation was probably created by food companies that wanted to make cheaper food. Back in the 70's, we were told that fat was bad, so all these processed foods got lots of extra sugar instead. Now we find out that sugar is bad and you need to consume more of the right fats. We're also starting to see that the "food pyramid" they taught us about should basically be inverted. The food pyramid has more to do with cost (grains are cheap) than with nutrition.

Today, we know a hell of a lot about the impact of genetics, microbiotic flora, and many other things that affect individuals differently. For instance, many people have mild sensitivities to various food proteins, although not always enough to notice more than some unexplained lethargy at unpredictable times after eating certain foods. Of course, for some people it's bad, like those with celiac disease.

Here's an interesting one: Apparently, about 10% of the population (US or world, I'm not sure) has a homozygous MTHFR C677T mutation. These people cannot convert folic acid (which is artificial anyway) or folinic acid (found in lots of vegetables) into methylfolate. As a result, they suffer from massive B9 deficiencies (which indirectly cause others, like trouble absorbing B12). Moreover, it's not just that folic acid and folinic acid are not useful to them; they're functionally poison, interfering with the normal function of the methylation cycle. So these people need to take large quantities of methylfolate and cut out certain "healthy" vegetables. They also have to cut out "enriched" foods. We're starting to see a correlation between health problems increasing in these people and the mid-90's FDA mandate to enrich certain foods with folic acid. Lovely.

about three weeks ago

Apple Announces Smartwatch, Bigger iPhones, Mobile Payments

Theovon iWatch is a pretty cool prototype (730 comments)

If I had disposable income, I might get an iWatch for my wife (who actually uses a watch). This is a great little toy for people with some disposable income and an itch to collect expensive gadgets. It looks cool and probably has some great functionality.

Those complaining about it are expecting too much from a first-generation product. Give it a few years, and the features, battery life, and price will improve to a point where more of us would consider buying one. Meanwhile, Apple gets to test out ideas that will improve its later products, and some of what is learned from this will also positively impact other Apple products.

Some time next year, I'll go check one out in an Apple store for about a minute. (Which is about how long I can stand to be in an Apple store since I already know everything about the products before I go there, so I get bored quickly.)

about three weeks ago

Unpopular Programming Languages That Are Still Lucrative

Theovon Supply and demand (387 comments)

In general, having a niche skill pays more if you can find a job in that niche. Companies that need that skill know it's a niche skill and one way or another end up realizing they have to offer higher salaries to attract the right talent. On the other hand, if you are an expert in COBOL and go to work doing web programming, they're not going to care, and your COBOL knowledge will have no impact on your income.

That all being said, I don't understand the general fear people have of learning new languages. Sure, it takes time and effort, but it should be considered part of the craft. At Ohio State, the CSE department used to teach a language called Resolve in its beginning courses. It was DESIGNED to make things like data structures and algorithms easier to code, avoiding a lot of the cruft of other languages that is unrelated to the code that matters. But students complained and complained about learning a language they'd never use again. My thinking is that if they're going to be successful, engineers need to be willing and able to learn new languages on their own time.

At OSU, they eventually caved to student pressure and switched to Java. Why they chose Java is beyond me. Where I teach now, they use Python, and that makes a hell of a lot more sense to me. If you're entirely new to programming, Java makes you write a lot of boilerplate (from the perspective of the uninitiated) just to do simple things, boilerplate that's unnecessary in Python.

about three weeks ago

IT Job Hiring Slumps

Theovon Re:There is no slump in open positions (250 comments)

Minimum wage is relative. Here, it could mean "the minimum amount of money offered to anyone for this kind of position."

about three weeks ago

IT Job Hiring Slumps

Theovon Re:There is no slump in open positions (250 comments)

People who weren't learning the material were still getting good grades (B's at least), and employers were preoccupied with GPA numbers.

about three weeks ago

IT Job Hiring Slumps

Theovon Re:There is no slump in open positions (250 comments)

Either that or they don't want to come to the DC area. Considering the massive cost of living there and the fact that "6 figures" can mean "just barely over 100K", maybe you're just not offering enough. I turned down a really nice offer in Arlington, VA, because $150K was worth substantially less there than $85K where I work now.

Probably the only place worse than DC would be NYC.

about three weeks ago

Does Learning To Code Outweigh a Degree In Computer Science?

Theovon Re:Me too. I teach CS part time. (546 comments)

In my university, we accept huge numbers of international students because they pay higher tuition. If we didn't do that, we wouldn't survive financially, because the state's economy is poor. So in terms of keeping the institution alive, this is the best thing to do. And in any case, this never seems to negatively impact our very good reputation in the northeast. (Besides, it's not like we're giving good grades to bad students anyhow.)

As long as those poor students aren't TOO distracting, the revenue they bring in is good for everyone else. (And some of them come around to actually finding the subject interesting.) Our domestic students are almost all very good, and there is always a nontrivial portion of the international students who are also very, very good. I like to think about the cases where a student who had trouble getting into other schools was given an opportunity to unexpectedly shine in ours. This happens plenty.

about three weeks ago

IT Job Hiring Slumps

Theovon There is no slump in open positions (250 comments)

The companies say there aren't enough IT workers. The IT workers say there aren't enough jobs. It really comes down to there being huge numbers of IT workers but very few good ones.

As someone who educates CS students, I see the whole spectrum. There are lots of students who seriously have no interest in learning the material. All they care about is getting a diploma. Where I teach, those students don't make it all the way through the program, due to a combination of poor grades and being caught cheating. But when I was getting my undergrad degree, I was always angry about the fact that employers couldn't distinguish my A's from those of people who didn't actually learn the material.

Not surprisingly, supply and demand is a factor here. With low numbers of CS students, standards have to be lowered to keep the tuition revenue going. As the student population grows beyond capacity, schools are able to be more selective based on SAT scores, high school GPAs, and weed-out courses.

about three weeks ago

Does Learning To Code Outweigh a Degree In Computer Science?

Theovon A "degree" is useless to those who don't care (546 comments)

As someone who teaches computer science for a living, I can tell you that if you're only majoring in computer science because you think you need to get a degree, then the degree will be useless to you. You'll do the minimum work to pass (if that) and not retain anything you learned. Then you'll have a hell of a time trying to find a job. Employers have become jaded and assume that although you have to be a college graduate to apply, almost all college graduates are morons. This is because most of them are there just to get a degree, and employers have to go through gargantuan efforts to find those few who are actually good.

On the other hand, if you're the kind of person who is good at learning to code and you actually find computer science interesting, then getting a degree will help you immensely. If you're really smart, you would learn most of this stuff on your own anyway, but classes help you organize the knowledge, and professors can help you with the difficult questions. If you go to a good school, you'll learn more than you would on your own, because a degree program directs you along and forces you to practice as you go. Finally, when you graduate with good grades, you have verifiable evidence that you've been exposed to this knowledge. If you just learned it yourself, you'd have to ask employers to take your word for it, and they're not going to do that.

Also, let's not forget that finishing a degree is proof that you are able to start and finish a long-term project. It means you have an attention span and can be dependable. Being able to finish things is another rare trait that employers put a lot of effort into looking for.

about a month ago

Apple Reveals the Most Common Reasons That It Rejects Apps

Theovon Re:All about the brand (132 comments)

I don't know what your professors were like, but I instruct my graders to (a) do the assignment themselves to get an objective set of answers (to later compare to mine) and (b) look for common "wrong" answers and evaluate them carefully. For (b), there are three reasons why we might mark a "wrong" answer correct. One is that I just screwed up my own work. Another is that I screwed up the question. And another is that I may have given a misleading explanation that led students to commonly produce a wrong answer. We also consider carefully how many people got it "right." A few times I have just dropped a question from the grading and given extra credit to the few who got a good answer.

I suspect I put more time into grading than I "should" given tenure requirements, but I can't bring myself to do a shit job at teaching.

At least not intentionally. :)

about a month ago

DNA Reveals History of Vanished "Paleo-Eskimos"

Theovon Re:Paleo ? (57 comments)

From the summary, it appears to say that they SURVIVED that long, but we probably have to read the article to find out when they arrived there, which is surely a REALLY long time ago. Like more than 50,000 years.

about a month ago

Why Women Have No Time For Wikipedia

Theovon Re:Discrimination (579 comments)

It's SUPPOSED to be objective. But it's impossible for it to be entirely objective. Having more diverse viewpoints would likely improve its level of objectivity, which is as much as we can hope for.

about 1 month ago

US Government Fights To Not Explain No-Fly List Selection Process

Theovon Can the executive branch be held in contempt? (248 comments)

What would happen if the executive branch (which is supposed to enforce the law) simply refused to comply with a judicial order? Can someone be held in contempt? Who would take on the role of enforcing the judicial order (in terms of compelling the action or executing punishment)?

about a month ago

Submissions


Ask Slashdot: Any really good texts on evolutionary details?

Theovon writes  |  about 2 years ago

Theovon (109752) writes "To me, that we evolved from earlier life forms is a straightforward conclusion. We have mountains of evidence, and current theories are sound given that data. But I'm not a biologist, so I find it a challenge to get access to much of that data. I'm looking for a single coherent tome (or maybe multivolume set) of biological data used to develop specific theories of evolution of many ancient and modern family trees. I don't want mere drawings of fossils in sequence like in a high school textbook. I want to see photographs of the original fossils, along with data about geologic strata, measurements of numerous morphological features, and explanations of the lines of reasoning that lead to particular conclusions. Sections on DNA analysis would be great too, along with any other interesting lines of evidence. The conclusions that scientists draw are fascinating, and I'd like to dig deeper into the data they started from. Would the slashdot crowd be able to help me find a top example of such a resource?"

'Something is deeply broken in OS X memory management'

Theovon writes  |  more than 2 years ago

Theovon (109752) writes "Ever since Apple released Lion, countless users have been complaining about performance problems, even on top-of-the-line hardware. OSX point releases have been coming out, but this issue has remained completely unaddressed by Apple, as per their usual "it's not our fault" approach to their mistakes. Well, some researchers have been investigating this. Perhaps Apple will finally take notice. The original article is here, and the OSNews article is here."
Link to Original Source

Pampers Dry Max diapers, chemical burns

Theovon writes  |  more than 4 years ago

Theovon (109752) writes "Despite the self-deprecating jokes, many of us slashdotters do indeed have the social skills to find mates and have children. This is why articles like the recent one on delayed umbilical cord cutting are of interest to us. Well, here's another one for us parents, something my week-old daughter is experiencing first-hand. Procter and Gamble is putting their heads in the sand and denying all responsibility in response to a spate of reports that the most recent version of their "Dry Max" diapers are causing severe rashes that appear to be chemical burns. There are articles all over the place, with questions and blogs and even P&G's lame response trying to suggest that it's a mere coincidence that rashes are increasing at the same time that their new diapers came out. The feds are investigating, and hopefully, there will be a recall soon. My little girl's rash isn't quite as severe as what I've been reading about, since we caught it early and are using liberal amounts of Desitin. We're accustomed to seeing corporate greed stand in the way of product quality, every one of us who is forced to use Microsoft products. But it's one thing to lose some work. It's entirely another when helpless babies are physically injured by a product that we're supposed to trust, and even worse when the manufacturer tries to tell us that we're the ones at fault."

Linux Not Quite Ready for New 4K-sector Drives

Theovon writes  |  more than 4 years ago

Theovon (109752) writes "We've seen a few stories recently about the new Western Digital Green drives, including this one on slashdot. According to WD, their new 4096-byte sector drives are problematic for Windows XP users but not Linux or most other OS's. There's an article on OS News that suggests that Linux users should not be complacent about this, because not all the Linux tools like fdisk have caught up. The result is a reduction in write throughput by a factor of 3.3 across the board (a 230% overhead) when 4096-byte clusters are misaligned to 4096-byte physical sectors by one or more 512-byte logical sectors. The author does some benchmarks to demonstrate this. Also, from the comments on the article, it appears that even parted is not ready since by default, it aligns to "cylinder" boundaries, which are not physical cylinder boundaries and are multiples of 63."
Link to Original Source

OGP releases video of VGA emulator booting

Theovon writes  |  more than 5 years ago

Theovon writes "Slashdot hasn't seen much news about the Open Graphics Project for some time now, but the OGP has been quite busy, especially recently. As you may recall, the OGP's goal is to develop a fully open-source graphics card. All specs, designs, and source code are released under Free licenses. Right now, they're using FPGAs (large-scale reprogrammable chips) to build a development platform that they call OGD1. And they've just completed an alpha version of legacy VGA emulation, apparently not an easy feat. They have posted a Youtube video of OGD1 driving a monitor, showing Gentoo booting up in a PC. This completes a major step, allowing OGD1 to act as the primary display in an x86 PC. The announcement can be seen on the OGP home page, and there's an OSNews.com article. Finally, the Free Software Foundation has taken notice of this and is asking for volunteers to help with the OGP wiki."
Link to Original Source

Dedicated compute box: Persistent terminals?

Theovon writes  |  more than 6 years ago

Theovon (109752) writes "I just built an expensive high-end quad-core Linux PC, dedicated for number-crunching. Its job is to sit in the corner with no keyboard, mouse, or monitor and do nothing but compute (genetic algorithms, neural nets, and other research). My issue is that I would like to have something like persistent terminal sessions.

I've considered using Xvnc in a completely headless configuration (some useful documentation here, here, here, and here). However, for most of my uses, this is overkill: a total waste of memory and compute time. That said, if I decide to run FPGA synthesis software under WINE, it will become necessary. Unfortunately, I can't quite figure out how to get a persistent X11 session where I'm automatically logged in (or can stay logged in) while maintaining enough security that I don't mind opening the VNC port on my firewall (with a changed port number, of course). I'm also going to check out Xpra, but I've only just heard about it and have no idea how to use it.

For the short term, the main need is just terminals. I'd like to be able to connect and see how something is going. One option is to just run things with nohup and then log in and "tail -f" to watch the log file. I've also heard of screen, but I'm unfamiliar with that as well.

Have other slashdot users encountered this situation? What did you use? What's hard, what's easy, and what works well?"

Theovon writes  |  more than 7 years ago

Theovon (109752) writes "It's only been two days since the announcement of the official release of Ubuntu 6.10 (Edgy Eft), and the fallout has been very interesting to watch. By and large, fresh installs of Edgy tend to go well. A few problems here and there, especially with installation of closed-source ATI and nVidia drivers, but for the most part things have been smooth. Many people report improved performance over Dapper, improved stability, better device support, etc. A good showing. But what I find really interesting is the debacle that it has been for people who wanted to do an "upgrade" from Dapper (6.06). Installing OS upgrades has historically been fraught with problems, but previous Ubuntu releases, many other Linux distros, and MacOS X have done surprisingly well in the recent past. But not Edgy. Reports are flooding into Ubuntu's Installation & Upgrades forum from people having myriad problems with their upgrades. One user described it as a nightmare. Users are producing detailed descriptions of problems but getting little help. This thread has mixed reports and is possibly the most interesting read. Many people report that straight-forward upgrades of relatively mundane systems go well, but anything the least bit interesting seems not to have been accounted for, like software RAID, custom kernels, and Opera. Even the official upgrade method doesn't work for everyone, including crashes of the upgrade tool in the middle of installing, leaving systems unbootable, no longer recognizing devices (like the console keyboard!), reduced performance, X server crashes, wireless networking problems, the user password no longer working, numerous broken applications, and many even stranger things. Some of this is fairly subjective, with Kubuntu being a bit more problematic than Ubuntu, with reports that Xubuntu seems to have the worst problems, and remote upgrades are something you don't even want to try. Failed upgrades invariably require a complete reinstall. The conclusion from the street, about upgrading to Edgy, is a warning: If you're going to try to take the plunge, be sure to make a backup image of your boot partition before starting the upgrade. Your chances of having the upgrade be a total failure are high. If you're really dead-set on upgrading, you'll save yourself a lot of time and headache by backing up all of your personal files manually and doing a fresh install (don't forget to save your bookmarks!)."

Theovon writes  |  more than 7 years ago

Theovon writes "Back in the 1920's a blight all but completely wiped out Chestnut trees in the United States. As such, my uncle's 1100-tree chestnut farm is a rare sight indeed. From the article, "... someone found a single tree in Ohio that the blight did not kill ..., crossed it with the Chinese chestnut, resulting in a nut with the characteristics of the Chinese variety but with the larger nut of the American tree." The article goes on to describe some interesting things about chestnuts themselves, such as the spiny burr that they grow in on the tree."

Journals

Theovon has no journal entries.
