GNOME 3.14 Released
I've been complaining for years that the default KDE window manager not only looks ugly but also clashes with the rest of the theme. If they made windows look like Plasma widgets, they would look sleek, and they would look like they were designed to fit with the rest of the theme. But the KDE devs seem to have no idea what I'm talking about. How can things go so right in so many ways and then fall apart in one so conspicuous an area?
At first glance, the new GNOME window decorations actually look pretty good. Maybe I'll change my mind later, but it looks like someone developed a sense of style.
Rosetta Code Study Weighs In On the Programming Language Debate
Verilog vs. VHDL. I find that the verbosity of VHDL (which requires something like three times as much typing) actually impedes readability. Sure, there are situations where VHDL can catch a bug at synthesis time that Verilog can't, but the rest of the time, it just makes VHDL unwieldy.
Ask Slashdot: Finding a Job After Completing Computer Science Ph.D?
I was maybe 80% sure that I wanted to go into academia, so it's not so strange that I got a PhD. But I interviewed for both industry and academic jobs. In my case, I had extensive industry experience previously. For some academics, the industry experience seemed to be a negative, and for some industry employers, the PhD seemed to be a negative. Very few companies saw the combo as a bonus, although the list of companies that did think my background was good included Intel, AMD, and my current employer (a research university -- I went into tenure track).
I recently interviewed at AMD (because they called me, and I figured it couldn't hurt to see what my alternatives are), and they grilled me hard on programming questions. They asked me things like what the 'volatile' and 'static' keywords in C mean (I was able to quickly rattle off more than the interviewer needed to know about them), and when I went on-site, they gave me some programming problems. The key reason they like me (and are writing up an offer) is that I knew a lot about programming, had done a lot of programming (despite having been in academia for 2 years, they referred to me as a veteran from industry), and I knew a good deal about each of the topics they talked with me about (CPU architecture, GPU architecture, the 3D graphics rendering pipeline, compilers, etc.).
Key ways in which this went well for me included (a) I proved that I was a very strong software engineer with practical knowledge, skill, and efficiency, and (b) I was able to show how, for me, the PhD augmented (rather than hurt) my engineering skills. That last bit is key. For instance, I showed that I could approach a problem with creative solutions, apply a scientific approach to determine the viability of the idea, and (most importantly) explain how I can fit it into the context of a BUSINESS that wants to make money from it. Coming from academia, I also know how to search for existing solutions, so I can avoid reinventing the wheel. I can look up what people have done before and incorporate some of those ideas into a new practical solution.
So, bottom line, if you want to go into industry (and not necessarily into some big company's research wing), then you have to show that you're a real engineer who can design complex solutions to complex problems and do it efficiently. You have to know a LOT about programming. On top of that, you have to know a lot of theory (algorithms, data structures, computational complexity, etc.). And you have to show that you can think in business and product terms. You're working there to make products that will sell and make money, and you have to convince them that you're unconsciously competent at doing this very well. You need to break the stereotype that PhDs arrogantly have their heads in the clouds, can't think about practical matters, and get too easily distracted by things tangential to the job at hand.
Ask Slashdot: Finding a Job After Completing Computer Science Ph.D?
Yes and no. Depends on how specific they want to get. Can you list general types of containers? They're fairly universal: linked list, vector, double-ended queue, set, map, multimap, etc. This kind of question just probes for familiarity with the kinds of facilities the language and its standard libraries offer. Bonus for remembering the names of a few of the methods (like size() and push_back()). All this means is that you understand data structures and that you've managed to retain some of the facts about that language's take on them.
Data Archiving Standards Need To Be Future-Proofed
We seriously considered chronic lyme as a possibility and even got testing. The test came back negative, although there can be false negatives. We ultimately ruled it out on the basis of certain key symptoms being absent. Basically, we considered a LOT of things and did our best to rank the chances of each illness that might explain the symptoms. We were open to the idea of more than one cause but considered it a remote possibility; fortunately we were right.
Anyhow, homozygous MTHFR C677T can be serious, especially if there are other complicating mutations. Compared to some people, my wife has a moderate problem. She had chronic fatigue (not necessarily to be confused with CFIDS), brain fog, autoimmune disease, gluten intolerance, weight gain, pale skin, hair loss, and many more symptoms. But she never lost feeling in her limbs; some people do. When you mess up the methylation cycle, all sorts of things can go wrong.
I'm not sure why you (an anonymous coward, so why am I feeding the trolls?) think that this mutation is of "dubious clinical significance." It's one of the more serious mutations, and the appropriate treatments have worked. Taking methylfolate, a few different forms of B12, and several other supplements has caused massive improvement in energy, return of proper skin tone, hair regrowth, appropriate weight loss, and so on. In other words THE TREATMENT WORKED.
This is one of those fortunate cases where a hard-to-find single cause has been identified. It explains ALL of the symptoms (many of which are secondary, caused by a deficiency caused by the underlying problem), and the treatment has worked very well. It's a little hard to get the exact dosages of vitamins right, because as soon as you get enough of one thing, the body will start repairing things, which requires other chemicals, causing a deficiency in another thing, etc. So the fix isn't an overnight sort of thing, but progress is rapid.
And my biggest complaint is not that the MDs didn't know how to diagnose this. My complaint is that they EXPLICITLY REFUSED to help us when we were trying to track down the cause. Seriously. Most doctors just didn't have a clue and were unwilling to "do a lot of speculative testing," while some outright said they refused to help us. Even if we came in with a list of tests to do to try to narrow down a range of possibilities (like a decision tree), they wouldn't do it. We had to figure this out completely on our own.
I don't expect MDs to know everything or be super-human. But I do expect them to listen and take patients seriously.
Data Archiving Standards Need To Be Future-Proofed
... or for that matter any of your medical history. MDs do spot-diagnosis in 5 minutes or less based exclusively on what they've memorized or else they do no diagnosis at all.
My wife has a major genetic defect (MTHFR C677T), which causes severe nutritional problems. We haven't yet met an MD who has a clue about nutrition. Moreover, we had to diagnose this problem ourselves through genetic testing, with no doctors involved. We've shown the results to doctors, and they don't entirely disbelieve us, but they also have no clue what to do about it and are still dubious about the symptoms. (Who has symptoms of beriberi these days? Someone whose general ability to absorb nutrients is severely compromised.)
What makes anyone think that this will change if your doctor has access to your DNA, even with detailed analysis? They won't take the time to actually read any of it. In fact a lot of what we know about genetic defects pertains to problems in generating certain kinds of enzymes, a lot of which participate in nutrient absorption. (So obviously RESEARCHERS know something about nutrition.) These nutritional problems require supplementation that MDs don't know about. Do you think the typical MD knows that Folic Acid is poison to those with C677T? Nope. They don't know the differences between folic acid, folinic acid, and methylfolate and still push folic acid on all pregnant women (they should be pushing methylfolate). They also don't know the differences between the various forms of B12 and always prescribe cyanocobalamin even for people who need the methyl and hydroxy forms.
Another way in which MDs are useless is caused by their training. Basically, they're trained to be skeptical and dismissive. Many nutritional and autoimmune disorders manifest with a constellation of symptoms, along with severe brain fog. Someone with one of these problems will generally want to write down the symptoms when talking to a doctor, because they can't think clearly. The thing is, in med school, doctors are specifically trained to look out for patients with constellations of symptoms and written lists, and they are told to recognize this as a condition that is entirely within the mind of the patient. Of course, a lot of doctors, even if not trained to dismiss things as "all in their head," are terrible at diagnosis anyway. They'll have no clue where to start and won't have the patience to do extensive testing. It's too INCONVENIENT and time-consuming. They won't make enough money off patients like this, so they get patients like this out the door as fast as possible.
I've had some good experiences with surgeons. But for any other kind of medical treatment, MDs have been mostly useless to me and my family. In general, if we go to one NOW, we've already diagnosed the problem (correctly) and possibly need advice on exactly which medicine is required, although when it comes to antibiotics, it's easy enough to find out which ones to use. (Medical diagnosis based on stuff you look up on the internet is really hard and requires a very well-trained bullshit filter, and you also have to know how to use the more authoritative sources properly. However, it's not impossible for people with training in things like law, information science, and biology. It just requires really good critical thinking skills. BTW, most MDs don't have that.)
MDs are technicians. Most of them are like those B-average CS grads from low-ranked schools who can barely manage to write Java applications. If you know how to deal with a low-level technician, guide them properly, and stroke their ego in the right way, you can deal with an MD.
Schizophrenia Is Not a Single Disease
Probably at least a few of those sub-disorders are actually nutritional deficiencies. We have this myth (perpetuated by MDs who have ZERO training in nutrition) that we don't have nutritional deficiencies in America. In fact, the American diet is horrible, and we all know it. B12 deficiencies are common (which is one of the reasons shots are often prescribed), as are deficiencies in magnesium, along with numerous other vitamins and minerals. Since the mid 90's, the FDA has mandated "enrichment" of foods, but the forms of the additives are NOT the biologically active forms, so some people have trouble processing them. For instance, MTHFR gene defects are common (my wife and I have different ones, and she has the really bad C677T defect), making folic acid (which is artificial) range from useless to poisonous to some people who need to take methylfolate instead. In fact, since the mid 90's a lot of people have reported declines in their health, which may be correlated with that FDA mandate (although without a more complete study, we have to assume this is anecdotal and COULD be correlated more strongly with something else).
Anyhow, my point is that many psychological disorders, such as bipolar disorder, are associated with vitamin deficiencies. If you look at the symptom lists of various B vitamin deficiencies, for instance, you'll see that it is already established what kinds of psychological effects can occur in cases of "extreme" deficiencies. If we can get past the idea that nobody in America can have extreme vitamin deficiencies (you can have plenty of some vitamin but still be anemic if you can't USE it in that form), then we can start treating mental disorders using carefully controlled diets and supplement schedules. I'm sure it won't work on everyone, but it would be insane not to try it in place of loading people up on antipsychotics because the "doctors" at mental hospitals have no fucking clue about nutrition.
And just to reinforce, folic acid is basically poison to about 10% of humans. Different vegetables contain methylfolate and/or folinic acid, NOT folic acid. Defects in the genes that code for the enzymes that convert folinic acid and folic acid into usable forms are more common than most food allergies, making this a serious problem!
One interesting side-effect of this is the proliferation of the bad genes. People with homozygous C677T mutation have about a 30% conversion rate from folic acid to methylfolate. (Meanwhile the unconverted folic acid itself interferes with the methylation cycle.) If a woman gets pregnant and takes folic acid in large quantities (which is what doctors instruct), the fetus will take all of the methylfolate, and the mother will get very sick. Meanwhile the fetus will be allowed to develop when otherwise it would have naturally aborted due to an inability on its own to convert folinic acid that you get from food. As a result, we have more people born with this defect, while people in the FDA and the medical profession are too ignorant of the consequences to deal with them properly. Mind you, if they were to take more methylfolate, the viability of this defect would increase, but at least the mothers wouldn't get as sick.
Link Between Salt and High Blood Pressure 'Overstated'
Yeah, much of what we know is being overturned. Some of the disinformation was probably created by the food companies that wanted to make cheaper food. Back in the 70's we were told that fat was bad, and so all these processed foods got lots of extra sugar instead. Now we find out that sugar is bad and you need to consume more of the right fats. We're also starting to see that this "food pyramid" they taught us about should be basically inverted. The rationale for the food pyramid has more to do with cost (grains are cheap) than with nutrition.
Today, we know a hell of a lot about the impact of genetics, microbiotic flora, and many other things that affect individuals differently. For instance, many people have some mild sensitivities to various food proteins, though not always enough to notice more than some unexplained lethargy at unpredictable times after eating certain foods. Of course, for some people, it's bad, like those with celiac disease.
Here's an interesting one: Apparently, about 10% of the population (US or world, I'm not sure) has a homozygous MTHFR C677T mutation. These people cannot convert folic acid (which is artificial anyway) or folinic acid (found in lots of vegetables) into methylfolate. As a result, these people suffer from massive B9 deficiencies (which indirectly causes others, like trouble absorbing B12). Moreover, it's not just that folic acid and folinic acid are not useful to them; they're functionally poison, interfering with the normal function of the methylation cycle. So these people need to take large quantities of methylfolate and cut out certain "healthy" vegetables. They also have to cut out "enriched" foods. We're starting to see a correlation between health problems increasing in these people and the mid-90's FDA mandate to enrich certain foods with Folic Acid. Lovely.
Apple Announces Smartwatch, Bigger iPhones, Mobile Payments
If I had disposable income, I might get an iWatch for my wife (who actually uses a watch). This is a great little toy for people with some disposable income and an itch to collect expensive gadgets. It looks cool and probably has some great functionality.
For those complaining about it, they're expecting too much from a first-generation product. Give it a few years, and the features, battery life, and price will improve to a point where more of us would consider buying one. Meanwhile, Apple gets to test out ideas that will improve its later products, and some of what is learned from this will also positively impact other Apple products.
Some time next year, I'll go check one out in an Apple store for about a minute. (Which is about how long I can stand to be in an Apple store since I already know everything about the products before I go there, so I get bored quickly.)
Unpopular Programming Languages That Are Still Lucrative
In general, having a niche skill pays more if you can find a job in that niche. Companies that need that skill know it's a niche skill and one way or another end up realizing they have to offer higher salaries to attract the right talent. On the other hand, if you are an expert in COBOL and go to work doing web programming, they're not going to care, and your COBOL knowledge will have no impact on your income.
That all being said, I don't understand the general fear people have of learning new languages. Sure, it takes time and effort, but it should be considered part of the craft. At Ohio State, the CSE department used to teach a language called Resolve for its beginning courses. It was DESIGNED to make things like data structures and algorithms easier to code, avoiding a lot of the cruft of other languages that is unrelated to the code that matters. But students complained and complained about learning a language they'd never use again. My thinking is that if they're going to be successful, engineers need to be willing and able to learn new languages on their own time.
At OSU, they eventually caved to student pressure and switched to Java. Why they chose Java is beyond me. Where I teach now, they use Python, and that makes a hell of a lot more sense to me. If you're entirely new to programming, there is a lot of boilerplate Java code (from the perspective of the uninitiated) you have to write just to do simple things, boilerplate that is unnecessary when writing Python.
IT Job Hiring Slumps
Minimum wage is relative. Here, it could mean "the minimum amount of money offered to anyone for this kind of position."
IT Job Hiring Slumps
People who weren't learning the material were still getting good grades (B's at least), and employers were preoccupied with GPA numbers.
IT Job Hiring Slumps
Either that or they don't want to come to the DC area. Considering the massive cost of living there and the fact that "6 figures" can mean "just barely over 100K", maybe you're just not offering enough. I turned down a really nice offer in Arlington, VA, because $150K was worth substantially less there than $85K where I work now.
Probably the only place worse than DC would be NYC.
Does Learning To Code Outweigh a Degree In Computer Science?
In my university, we accept huge numbers of international students because they pay higher tuition. If we didn't do that, we wouldn't survive financially, because the state's economy is poor. So in terms of keeping the institution alive, this is the best thing to do. And in any case, this never seems to negatively impact our very good reputation in the northeast. (Besides, it's not like we're giving good grades to bad students anyhow.)
As long as those poor students aren't TOO distracting, the revenue they bring in is good for everyone else. (And some of them get brought around to actually find the subject interesting.) Our domestic students are almost all very good. And there is always a nontrivial portion of the international students who are also very, very good. I like to think about the cases where a student who had trouble getting into other schools was given an opportunity to unexpectedly shine in ours. This happens plenty.
IT Job Hiring Slumps
The companies say there aren't enough IT workers. The IT workers say there aren't enough jobs. It really comes down to there being huge numbers of IT workers but very few good ones.
As someone who educates CS students, I see the whole spectrum. There are lots of students who seriously have no interest in learning the material. All they care about is getting a diploma. Where I teach, those students don't make it all the way through the program, due to a combination of poor grades and being caught cheating. But when I was getting my undergrad degree, I was always angry about the fact that employers couldn't distinguish my A's from those of people who didn't actually learn the material.
Not surprisingly, supply and demand is a factor here. With low numbers of CS students, standards have to be lowered to keep the tuition revenue going. As the student population grows beyond capacity, schools are able to be more selective based on SAT scores, high school GPAs, and weed-out courses.
Does Learning To Code Outweigh a Degree In Computer Science?
As someone who teaches computer science for a living, I can tell you that if you're only majoring in computer science because you think you need to get a degree, then the degree will be useless to you. You'll do the minimum work to pass (if that) and not retain anything you learned. Then you'll have a hell of a time trying to find a job. Employers have become jaded and assume that although you have to be a college graduate to apply, almost all college graduates are morons. This is because most of them are there just to get a degree, and employers have to go through gargantuan efforts to find those few who are actually good.
On the other hand, if you're the kind of person who is good at learning to code and you actually find computer science interesting, then getting a degree will help you immensely. If you're really smart, you would learn most of this stuff on your own anyway, but classes help you organize the knowledge, and professors can help you with the difficult questions. If you go to a good school, you'll learn more than you would if you did it alone, because college degree programs help direct you along and force you to practice as you go. Finally, when you're done and graduate with good grades, you have verifiable evidence that you've been exposed to this knowledge. If you just learned it yourself, you'd have to ask employers to take your word for it, and they're not going to do that.
Also, let's not forget that finishing a degree is also proof that you are able to start and finish a long-term project. It means you have an attention span and can be dependable. Being able to finish things is another rare trait that employers put a lot of effort into looking for.
Apple Reveals the Most Common Reasons That It Rejects Apps
I don't know what your professors were like, but I instruct my graders to (a) do the assignment themselves for an objective set of answers (to later compare to mine) and (b) look for common "wrong" answers and evaluate them carefully. For (b), there are three reasons why we might mark a "wrong" answer correct. One is that I just screwed up my work. Another is that I screwed up the question. And another is that I may have given a misleading explanation that led students to commonly produce a wrong answer. We also consider carefully how many people got it "right." A few times I have just dropped a question out of the grading and given extra credit to those few who got a good answer.
I suspect I put more time into grading than I "should" given tenure requirements, but I can't bring myself to do a shit job at teaching.
At least not intentionally. :)
DNA Reveals History of Vanished "Paleo-Eskimos"
From the summary, it appears to say that they SURVIVED that long, but we probably have to read the article to find out when they arrived there, which is surely a REALLY long time ago. Like more than 50,000 years ago.
Why Women Have No Time For Wikipedia
It's SUPPOSED to be objective. But it's impossible for it to be entirely objective. Having more diverse viewpoints would likely improve its level of objectivity, which is as much as we can hope for.
US Government Fights To Not Explain No-Fly List Selection Process
What would happen if the executive branch (which is supposed to enforce the law) simply refused to comply with a judicial order? Can someone be held in contempt? Who would take on the role of enforcing the judicial order (in terms of compelling the action or executing punishment)?