Linus Torvalds Tries KDE, Likes It So Far
They recently forced us at work to upgrade from Ubuntu 10.04 to 12.04. We have our choice of desktop environments (Gnome 3, KDE 4, XFCE 4, Cinnamon, Unity).
I spend my day in a combination of Chrome, the terminal, and Eclipse.
I have determined that KDE is the least bad of all of these alternatives. There are actually things that I like about it, too.
- Konsole has a lot of nice features, such as activity notification on tabs in the background
- The volume buttons on my headset actually work (this was not the case for XFCE)
- Bluetooth actually works
- I can have a traditional taskbar
- You can turn off the more flashy desktop effects
- Built-in dark color scheme for the Oxygen theme (large areas of white on my 30" monitor are distracting)
- I can apply the dark Qt/KDE theme to GTK+ applications, including Eclipse
Now, the things I don't like:
- General lack of polish. This is my #1 complaint about KDE, and it's everywhere. The text on my window list buttons is too low, and on the clock it's too high. The "AM" or "PM" on the clock is cut off. Text on buttons has virtually zero top/bottom padding, which looks bad. UI elements are inconsistently aligned. UI strings are often awkwardly phrased.
- Verbosity. I don't need to be notified every time I plug in a USB device, every time the power state of my machine changes, every time the network status changes, every time a file operation completes, every time a daemon crashes, or every time the desktop indexer is done. You can disable pretty much all of these notifications, but to some degree it's like playing whac-a-mole.
- Crashiness. Sometimes, daemons decide to crash randomly. Occasionally, the compositor goes crazy and locks up the entire desktop.
- Insane defaults. Preferences are nice, but they need to be set to reasonable values by default. For example, there are *way* too many global key bindings by default, the eye candy is set to an annoyingly high level by default, single-click select in file dialogs contradicts every other desktop, the default panel is huge, and a whole ton of other things.
- No good system monitor widget. GNOME 2.x had an awesome panel widget that would display CPU, network, and memory; it even displayed I/O wait CPU time in a different color, which was awesome.
- The cashew. It makes no sense, and you can't get rid of it.
If I could have GNOME 2.x back, I would. But KDE 4.x is the best of the current bunch.
Ubuntu Delays Wayland Plans, System Compositor
X works very well over a LAN, and, as bandwidth becomes cheaper, problems running over a WAN will go away.
No, it won't. The problem with X over a WAN is latency, and no amount of technology is going to change the fact that light can only go so fast.
The company I work for has a *very* fast WAN between offices, and X over the WAN is still a dog. The problem is that X is to a large degree synchronous, and operations involve multiple round trips. So no matter how much bandwidth you have, you get killed by latency.
The solution to this is to either use a framebuffer-based protocol (VNC and friends) or an asynchronous, compressing X proxy (NX), neither of which really takes advantage of the network features of X.
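To put rough numbers on the latency point, here's a back-of-the-envelope sketch; the RTT and round-trip count are made-up but plausible values, not measurements from our WAN:

    # Total wait time is round trips x RTT, no matter how fat the pipe is
    rtt_ms = 40          # assumed WAN round-trip time in milliseconds
    round_trips = 200    # assumed synchronous X round trips for one operation
    wait_seconds = round_trips * rtt_ms / 1000.0
    print(f"{wait_seconds:.1f} s spent just waiting on round trips")  # 8.0 s, at any bandwidth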
Supreme Court: Affordable Care Act Is Constitutional
Another bullshit hit piece on CFLs.
A penny per month at a normal rate (10 cents per kWh) is 100 Wh per month. A typical incandescent bulb is 60 W; a similar CFL is 15 W. That saves 45 W. So if you replace a single 60 W light bulb with a CFL and use it for just 3 hours per month, you've already saved more than a penny per month.
I'm guessing you have more than a single 60 W light bulb, and that you use it for more than 3 hours per month.
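If it helps, here is the same arithmetic written out as a quick sketch (the wattages, rate, and hours are the figures from the example above):

    # One 60 W incandescent replaced by a 15 W CFL, at $0.10 per kWh
    rate_per_kwh = 0.10
    watts_saved = 60 - 15                               # 45 W
    hours_per_month = 3
    kwh_saved = watts_saved * hours_per_month / 1000.0  # 0.135 kWh
    print(f"${kwh_saved * rate_per_kwh:.4f} saved per month")  # $0.0135, more than a penny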
I could talk about how actual tests show that CFLs last way longer than incandescent bulbs, or that most CFLs are crushed and recycled in the USA, or that shipping things "25000 miles from China" (it's closer to 7500 miles; no point is "25000" miles away) is actually not all that energy intensive.
But I don't think your rant is based on facts. It's based on a need to be contrarian, to be seen as anything but "green", and to oppose environmental regulations.
We can have a legitimate discussion about whether the government has the right to enact environmental regulations, about whether they are effective, and about whether they are necessary. But if you start with information that is wrong, we can't really discuss anything.
Mandatory Brake-Override Proposed For All Cars
In our '06 Prius, at moderate/high speeds the car simply won't let you shift from D to N, and I really doubt the computer would pay any attention at all if the driver were to try holding the power button down. But I'll try that out when I get a chance.
This is incorrect. You can shift to neutral from D at any time in the Prius, from any speed.
At anything above a couple MPH, if you push "Park", you end up in neutral. If you try to go in reverse, you end up in neutral as well.
Also, holding the power button down works fine, at any speed. It even works if the computer gets screwed up and can't detect the speed. You do lose power steering if you do this (you keep power brakes).
Also, the Prius already has a brake override system.
Stroustrup Reveals What's New In C++ 11
Whether the performance of a managed language (most commonly Java or .NET) is inferior to C++ is almost entirely dependent on your workload.
Java running on the Oracle VM (HotSpot) is actually faster than C++ in many ways (object allocation, virtual function/method calls, the built-in data structures). "Modern" C++ code that uses STL and smart pointers tends to waste a lot of time copying memory around and/or tracking reference counts. Of course, the GC in Java also spends a lot of time copying memory around, but it can be done concurrently and for most programs it is quite efficient.
C++ wins big in two main areas: latency and loopy scientific code. The fact that you have GC in Java makes latency hard to predict, and if you need to meet latency targets a high percentage of the time, C++ is probably the right choice. As for scientific code, Java VMs typically make very poor use of vector instructions and have nowhere near the loop/memory-ordering optimizations you find in something like ICC.
iOS Vs. Android: Which Has the Crashiest Apps?
People often get it confused with garbage collection and while the end results are similar, ARC only occurs at compile time so there is no runtime performance hit.
What are you talking about? Reference counting is absolutely a form of garbage collection, and it's not a particularly good one at that. And it most certainly does have a performance impact; indeed, it's significantly slower than tracing GC in most cases.
There are advantages to reference counting. It's simpler, releases memory sooner, and has more predictable performance characteristics than trace-based GC. But it is a fallacy to pretend that it has "no runtime performance hit".
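To make the "runtime work" point concrete, here's a small sketch in Python, which happens to use reference counting in CPython. The counts below are maintained while the program runs, and that bookkeeping is exactly what ARC-compiled code also executes at runtime (the class and loop here are purely illustrative):

    import sys

    class Node:
        pass

    obj = Node()
    print(sys.getrefcount(obj))             # baseline (includes the call's temporary reference)

    aliases = [obj for _ in range(1000)]    # every new reference bumps the count at runtime
    print(sys.getrefcount(obj))             # roughly 1000 higher

    del aliases                             # dropping references decrements the count again
    print(sys.getrefcount(obj))             # back near the baseline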
Ask Slashdot: Mirrorless, Interchangeable Lens Camera Advice?
Ignore the people telling you to get a DSLR because it has better picture quality.
There are a lot of factors that determine the quality of your images, but the most substantial is sensor size. The sort of DSLRs that you would buy (that is, the ones under $2000) use APS-C sized sensors.
Guess what the Sony NEX-5N (a MILC) uses? An APS-C sensor. And it's arguably the best APS-C sized sensor on the market.
The NEX-5N takes pictures that rival any APS-C DSLR, and it does so for a considerably lower price than many DSLRs.
There are still a lot of good reasons to buy an APS-C DSLR over the NEX-5N:
- Lenses. The NEX series uses E-mount lenses, and there aren't a lot of choices. This is improving, but we're still talking about the difference between thousands of lenses for EOS (Canon) or F-mount (Nikon) and fewer than 20 for E-mount. You can get adapters for A-mount (Sony DSLR) lenses, but they bulk up the camera. You can also get adapters for virtually any other format (including EOS and F-mount) but you lose auto-aperture and autofocus.
- Speed. The NEX-5N is not a slow camera by any means, but there are many DSLRs that are faster.
- Battery life. DSLRs can keep the screen off, plus they generally have larger batteries. The NEX-5N lasts ~350 shots on a charge; expect 2x that from a DSLR.
- Manual controls. I find the controls on the NEX-5N to be fine, especially since you can customize the buttons and create a custom quick menu. Still, DSLRs typically have more physical buttons, which means quicker access to settings.
- Viewfinder. If you want a viewfinder, optical is tough to beat (though the NEX-7 has an OLED viewfinder that is excellent).
And there are a lot of good reasons to buy an NEX-5N over an APS-C DSLR:
- Lens adapters. You can mount basically any 35mm lens on the NEX-5N with an adapter because the flange-back distance is lower than on any other format. This includes Canon and Nikon lenses, classic and modern rangefinder lenses like Leica lenses, and a lot more. Yes, you have to use manual focus and aperture. But it's still a very cool capability.
- Size. The NEX-5N is way smaller and lighter than any DSLR. Even with the Sony 18-55mm lens it fits in a large pocket or small camera bag, and it's even smaller with the Sony 16mm pancake lens.
- It doesn't look like a DSLR. This may be a big factor if you don't want to look like a professional photographer (for example, at concerts or while doing covert journalism).
- Video. The NEX-5N takes 1080p60 video in H.264 at 28Mbps, and 1080p24 at 24Mbps. Most DSLRs in the same price range (and even many that are more expensive) are limited to 1080p24 or 720p60, both of which are inferior if you want to record fast action (like sporting events) or just hate low-frame-rate video.
- Value. The NEX-5N has better high-ISO performance, better dynamic range, and more resolution than basically any camera under $1000.
I love my NEX-5N. It is not perfect for everyone, or for every purpose. But if you aren't interested in buying a ton of lenses, you don't like using a viewfinder, and you prefer a compact camera without crappy picture quality, the 5N is a really good choice.
Inside the World's Largest LAN Party
You can get that kind of Internet connection in the US, and it's done every year for the ACM/IEEE SC trade show.
How Android Phone Makers Are Missing the Marketing Boat
So complaining about Android having a "more standard" connector totally misses the fact that from the standpoint of people buying the phone, the Android connector is simply not as standard.
What the hell are you talking about? Every Android phone that you can buy today uses a USB micro-B connector. Every recent BlackBerry device, every Windows Phone device, every Kindle, every Nook, and a significant fraction of non-smartphone devices use micro-B as well.
My USB wireless headset uses micro-B. My portable hard drive uses micro-B.
To claim that the 30-pin dock connector is more common than micro-B is flat-out wrong. Micro-B is quite literally the *only* connector I have on portable electronics that I own. Being able to bring a *single* cable on a trip that will charge my phone, charge my wireless headset, and connect to my portable hard drive is invaluable. Try doing that with a dock connector.
Ask Slashdot: What To Tell High-Schoolers About Computer Science?
I'm a new software engineer for Google. My job is low-stress, my workload is reasonable, and there are many different options for the advancement of my career, regardless of whether I want to write code on a day-to-day basis long term. The pay and benefits are also good, and I get to travel quite a bit.
One of my friends is also a new software engineer, but he started out at a small company that made medical software for smartphones and the web. He decided that he didn't like the company, and now he's at another startup working on software to help consumers monitor and reduce their energy consumption.
I have another friend at Microsoft, and one at Amazon. They are also paid well, enjoy their jobs, and feel that they have many, many options.
Maybe my peer group is not representative of the software world as a whole. I am well aware that there are crappy software companies out there, but the reality is that you are still much better off going into CS from a versatility and marketability standpoint than most other degrees. Nearly every product or service involves software, and someone has to write it.
US Government Seizes Email of WikiLeaks Volunteer
2011 US House reauthorization of the PATRIOT Act (HR 514):
Republicans: Yea - 210, Nay - 26, No vote - 5
Democrats: Yea - 67, Nay - 122, No vote - 4
In 2011, 35% of Democrats voted to reauthorize the PATRIOT Act, while 87% of Republicans did. The sooner you stop thinking that there are no differences between the parties, the sooner you can realize that your vote actually does make a difference.
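Those percentages come straight from the tallies above; the arithmetic, as a quick sketch:

    # Share of each party's House members who voted Yea on HR 514
    republican_yea = 210 / (210 + 26 + 5)   # ~0.87 -> 87%
    democrat_yea = 67 / (67 + 122 + 4)      # ~0.35 -> 35%
    print(f"Republicans: {republican_yea:.0%}, Democrats: {democrat_yea:.0%}")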
I know that it's popular to trumpet the 'there is no difference' line on Slashdot. But instead of doing that, why not do some actual research into the positions and voting records of your candidates? Maybe then you will figure out that there *are* real differences and that the reality of a complex representative political system means that you are going to disagree with your representatives on a good number of issues.
C++0x Finally Becomes a Standard
Garbage collection is more efficient than reference counting most of the time. There are a number of tests that demonstrate that Java programs spend less time allocating and deallocating memory than C++ programs.
Calling BS On Unpaid Internships
My parents are well-off but they are hardly 'rich' in the traditional sense. They most certainly do not have 'connections' to anyone in a powerful position. I went to public school, got good grades, and did well on the ACT. Based on my grades, residence, and ACT scores, admission to my university (University of Colorado) was guaranteed by law.
I have nothing but respect for the CS department at the University of Colorado. All of my professors were excellent - they were both experts in their field and cared deeply about seeing their students succeed. But CU is not particularly well known in CS, nor do we typically rank in the top 30 or so programs on any of the BS 'top school' lists. My chances of getting into a company like Google with a degree from a lesser-known school and no experience would be damn near zero.
In 2007 I got a (paid) internship with Agilent. I ended up designing, implementing, documenting, and deploying a CRM application that is used by 300+ call-center workers to direct sales inquiries. My internship was originally 3 months, but I ended up working part-time for another 8 months during the school year to finish the project.
When Microsoft did interviews at my university, my Agilent experience was part of why I was able to get a pre-screen interview, and it was a major part of why they decided to fly me to Redmond for a day of interviews. Less than a week later I was offered another (paid) internship at Microsoft. I was the only person that year from my university to intern at Microsoft.
In 2009 and 2010, I worked (paid) internships at Google. This year I started there as a full-time employee. There is no doubt that my internship experience was instrumental to me being able to get a job at Google.
In a world without internships, I would just have been another student from a lesser-known CS program. There would be no reason for a company like Google or Microsoft to take a chance on me.
In the technology world I would not recommend accepting an unpaid internship. Google and Microsoft pay their interns extremely well, and Agilent paid me an excellent salary too. There are too many companies out there that are willing to treat you as a real employee, give you real work, and pay you. If you have talent, do not take a $9-per-hour job doing unskilled work. It won't look good as work experience, and there are far better opportunities.
Cheap GPUs Rendering Strong Passwords Useless
Single point of failure.
Essentially, you will need to carry a copy of your password bank with you AND the application which opens it at all times to function.
This means that if it gets compromised (your memory stick gets stolen/your dropbox account gets compromised/ etc...) an attacker will only need to guess/bruteforce/dictionary attack/social engineer/look over your shoulder one password and gain access to everything in your wallet.
No one is going to do that. Seriously. No attacker that I am worried about is going to go to the trouble and risk of physically stealing my property to get into my accounts.
And, by the way, even if they do get my vault they need to crack my salted, 10 character, lowercase/uppercase/number/symbol password, which has been run through a million iterations of SHA-256 to generate the key for my password vault. Good luck with that one.
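For illustration only, here's a minimal sketch of that style of key derivation; the password, salt, and function name are placeholders, not what my vault actually uses (real vaults typically use PBKDF2 or something similar):

    import hashlib

    def derive_vault_key(password: str, salt: bytes, iterations: int = 1_000_000) -> bytes:
        # Stretch the password: hash salt+password once, then re-hash the digest repeatedly
        digest = hashlib.sha256(salt + password.encode("utf-8")).digest()
        for _ in range(iterations - 1):
            digest = hashlib.sha256(digest).digest()
        return digest

    key = derive_vault_key("correct-horse-9!X", b"example-salt")  # placeholder inputs
    print(key.hex())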
I am not immune to attack. But I am a hell of a lot harder to attack than the typical user. That alone means that the chances of me being a target are very low.
Has the Console Arms Race Stalled?
I now sit with a nearly *5 year old* dual core 2.4GHz CPU (overclocked to 3.3GHz mind you) and I can't find even a $1000 CPU that will give me anywhere near a worthwhile performance bump for anything other than super specific parallelizable applications like scientific computations or workstation-style 3D rendering.
You're not looking hard enough.
I have a laptop with a 2.53GHz Core 2 Duo Penryn (which is actually a better architecture than your Conroe or Athlon 64). It's a fine machine.
I also have a desktop with an i7-2600, a $300 CPU.
It's night and day when you push the machine. Even in single-threaded code the i7 is about twice as fast, and in multi-threaded code it's 3x, 4x, or even more in many cases.
Clock for clock, Sandy Bridge chews up the first-generation Core 2 CPUs and spits them out. And then my i7 is clocked higher - much higher.
HDMI Brands Don't Matter
120 and 240 FPS are invisible to the human eye. More importantly, the source material is either at 20, 24, 29.97, or 60 FPS, so either you have the extra frames showing the same frames again (thus being useless), or you generate extra frames which didn't previously exist and which look a bit plasticky and odd. In test after test, the "Motion Plus" and other BS upframing is rated as adding noise, because that's all it does to the signal.
Two issues with that theory.
First of all, 60 is not evenly divisible by 24 (actually, 23.976 because of historical NTSC reasons). Since basically no consumer 60Hz televisions actually drive the panel at 24 (23.976) frames per second (even though they may accept 24p input), they generate the additional frames using a 3:2 cadence (one frame is displayed for 3 refresh periods, then the next for 2). The fact that frames are displayed for uneven periods creates judder.
On a 120Hz set each frame is simply displayed for 5 refresh periods. On a 240Hz set each frame is displayed for 10 refresh periods.
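The cadence arithmetic, as a quick sketch (a 24 fps source on 60/120/240 Hz panels):

    # Refreshes per 24 fps film frame; a non-integer result forces uneven 3:2 pulldown
    for refresh_hz in (60, 120, 240):
        per_frame = refresh_hz / 24
        cadence = "even" if per_frame.is_integer() else "uneven (3:2 pulldown, judder)"
        print(f"{refresh_hz} Hz panel: {per_frame} refreshes per frame, {cadence}")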
Moreover, I used to agree with you about motion interpolation as used on 120Hz displays, but since actually buying one my opinion has completely reversed.
Yes, the frames are faked. Yes, it looks a little weird. But I am frankly tired of the fact that films (and many television programs) are produced at an absurdly low frame rate (24Hz) that makes motion jerky and hard to follow. I like the fact that motion interpolation makes things look smoother, even if it does occasionally add artifacts.
Motion estimation technologies have gotten very good and the better TVs (like the Sony XBR9 I have) do a very good job of making film look like smooth video. It may not look 'cinematic', but I like the way it looks and many others do as well.
Google Talk Enables Video Chat On Android Phones
Your headphones are broken, or your device is. Skype works fine with the TRRS headset that came with my Nexus One, and it works fine with the PSP headset I picked up for $4 before Ultimate Electronics went out of business.
Is Canonical the Next Apple?
I was a Linux desktop user for 10 years and just switched to Mac - not because of some nebulous "experience" (I still run fvwm over gnome or kde when given the choice), but I was sick of waiting for my laptop to reboot all the time, and the MacBook is the first computer I've ever used where power management actually, really works.
Maybe you should have been using Windows. I have been using sleep on my desktop and laptop for the past 6 years or so, and despite a wide variety of hardware (two ThinkPad models, an Acer budget laptop, a generic "Whitebook" laptop, an Asus EEE PC netbook, and self-built systems with Intel and AMD CPUs, NVIDIA/AMD/Intel chipsets, and motherboards from Asus/Gigabyte and even Jetway) I have never had a problem with sleep working properly.
To be honest, I have had decent success with sleep in Linux, too. Not 100%, but in the last 5 years things have gotten considerably better, especially if you have Intel hardware.
RIM BlackBerry PlayBook: Unfinished, Unusable
For 90% of Americans you have Verizon or AT&T; Sprint is only useful inside of city limits.
AT&T + Verizon is actually closer to 60% of the market, not 90%.
And I can't comment about Sprint, but T-Mobile works fine for me all over Colorado, including in rural areas.
Outside of cities all coverage quickly drops to voice calls only. (I know one parking spot where my signal goes from 3G to EDGE depending on which way the wind is blowing.)
Maybe you should stop using AT&T's shit network. Verizon and Sprint's networks are 100% CDMA2000 EV-DO.
And EDGE is not "voice calls only". It's not fast but it does work.
Google, Microsoft In Epic Hiring War
As someone who has been offered full-time by both Microsoft (turned down) and Google (accepted), I can tell you that you're dead wrong.
Do you already have a creative reputation or prominent contacts in the field? If so, stop here and come and work for us - though your talents will probably go to waste.
I had neither. I have worked in small business and I spent two years as a graduate researcher. I don't want to do either anymore. What I want out of Google is good pay and benefits, work-life balance, and to work with smart and motivated peers. My intern experiences at both Microsoft and Google told me that both companies offer that. The decision to work at Google primarily came down to the fact that they were willing to hire me where I want to work (Boulder, CO).
Did you go to a top school, regardless of your background?
I respect my university (University of Colorado Boulder) but it is most certainly not of the same reputation as MIT, Berkeley, Caltech, CMU, Stanford, or one of the other top universities.
Straight out of college as you are, can you answer some inane questions on undergraduate computer science?
I had a mixture of algorithm-type questions, OO design and programming questions (C++ and Java), and some system design questions. All of these topics are applicable to a software engineering position.
What about some silly puzzles to prove your geekiness?
Now I know that you haven't interviewed with either company recently. Neither Microsoft nor Google has used puzzle questions since 2007; in fact, puzzle questions are now expressly forbidden at both companies.
How about Unicode? Do you know some obscure facts about Unicode? What about HTTP? Have you memorised enough of the RFC?
No one at Microsoft or Google gives a shit if you can memorize standards. Hell, you don't even need to know the standard library. On one of my questions I wrote pseudo-Java that used collection methods that don't exist (with names taken from the C++ STL and Python).
How would you improve ______?
I have never been asked subjective questions like that by either Google or Microsoft.
Do you have the same attitude as us?
No one at Google or Microsoft is going to ask you questions about your attitude or motivations, except perhaps your recruiter (who doesn't have any say in whether you get hired).