Does Learning To Code Outweigh a Degree In Computer Science?
Computer Science is largely very specific applied math and theory. It includes algorithms, algorithm efficiency, a bunch of math, data structures from a theoretical design standpoint, and computer architecture. It tends to be very academic.
University programs vary widely in focus, but generally Comp Sci is about the math and theory, and programming is something you do on the side to get the assignments done to illustrate the theory you are learning. Computer Engineering and Software Engineering programs tend to be more hands-on and focused more on doing than on theory.
Programming, as desired by business, is NOT computer science. Business wants the simplest designs (i.e. always use linked lists instead of more appropriate data structures), and above all, they want you to code whatever it is FAST FAST FAST so you can SHIP SHIP SHIP. Generally, most businesses are not software businesses, and they don't value developers or software beyond getting the minimum quick-and-dirty solution out of them as fast and as cheap as possible. Also, most businesses are not doing anything remotely resembling state of the art, and they value the ability to hire a newbie to replace you.
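As an illustration of the kind of tradeoff CS actually teaches (a toy sketch with made-up data, not a benchmark), compare how many probes a linked-list-style linear scan needs against binary search on a sorted array:

```python
def linear_probes(items, target):
    """Linked-list-style scan: visit elements one by one."""
    probes = 0
    for item in items:
        probes += 1
        if item == target:
            return probes
    return probes

def binary_probes(sorted_items, target):
    """Sorted-array binary search: halve the range each probe."""
    probes, lo, hi = 0, 0, len(sorted_items)
    while lo < hi:
        probes += 1
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return probes
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    return probes

data = list(range(100_000))
worst = data[-1]  # worst case for the linear scan
print(linear_probes(data, worst), binary_probes(data, worst))
```

100,000 probes versus about 17. That factor is exactly what "always use linked lists, just ship it" throws away.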
CS grads have it rough. They know too much theory to be satisfied with basic programmer jobs, but they don't know enough about efficiently slapping out code day in and day out to have an easy time in a basic programmer job. The degree can get you in the door though. A lot of places filter out folks with no degree.
Not that there aren't some grads who still can't code their way out of a wet paper bag.
There's all sorts of stuff about programming that you will never learn in a CS program, such as when to select designs based on implementation risk and ease of maintenance rather than algorithm efficiency. It sucks, but the people who pay for you to write the software could not give two shits about how well the code is designed as long as it mostly works and ships on time. That CS theory mostly becomes directly relevant later in your programming career, when you actually have some autonomy to "do it right" versus "do it yesterday", or if you strike out on your own.
Comcast Customer Service Rep Just Won't Take No For an Answer
They bounced me around through escalation teams for 45 minutes before letting me cancel. I bought a house in an area that isn't serviced by them and they tried to get me to agree to paying $200,000 to run cable to my new house. Bastards!
Ask Slashdot: How Is Online Engineering Coursework Viewed By Employers?
If they require a master's or PhD, it's not an entry level position.
They either a) are trying to change the world with new or hard stuff and want a theory guy to guide things or b) don't know what they are doing or c) don't want to mess around with kids straight out of school who haven't figured out the corporate metagames and "git'er done" culture yet.
There's the optimal implementation on paper, given infinite time for implementation, and there's the "We have two weeks, do what you can pull off" implementation that business is usually looking for. Business values programmer time more than academia does. I know my CS degree didn't prep me for that very well.
Actual raw engineering is a bit less wild-wild-west than software... there are legal definitions of what a certified engineer is responsible for; i.e. if people die as a result of your engineering mistakes, it's your fault, not just some edge-case bug. But the same corporate BS is still driving it, so the same stuff applies... HR is still about risk avoidance, it's just that a guy with a master's or PhD had to jump through more hoops to get to the table, and thus the wheat is separated from the chaff, so to speak.
Business doesn't care about getting the best candidate; they care about getting the guy who looks good enough for the money they are willing to spend on him and won't end up as a disaster. Also, some of those job postings may require a master's or PhD so they can legally justify hiring an H-1B after no one "qualified" can be found.
Ask Slashdot: How Is Online Engineering Coursework Viewed By Employers?
Business (HR specifically) doesn't give a shit about your degree. They care about a) that you have the checkbox, b) who you worked for previously and are not lying about it, and c) whether it looks like you aren't a total fuckup who will cost them. It's about risk avoidance.
The actual team you interview with (if it wasn't an HR drone) cares that you look like you know your shit and can carry your weight.
Engineering and especially computer degrees are such a total crapshoot on the skills you get in a candidate, that they don't know how to weigh your degree. Even degrees from badass schools sometimes come with folks who still can't code their way out of a wet paper bag. Besides, most of that senior level theory stuff in the degree won't help you much in a real world job until the late stages of your career, and will piss off your peers who don't have the same background, and definitely piss off management, who barely understands what a linked list is.
The quality of in person versus remote will depend on your learning style, and whether you actually would make use of those in-person office hours anyway.
Is Overclocking Over?
Look, digital electronics are still subject to analog limitations. When you overclock, you squeeze the hysteresis curve, increasing the probability that your chip incorrectly interprets the state of a particular bit as the opposite value, i.e. you get random data corruption. This is why you eventually start crashing randomly the more you overclock.
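A toy Monte Carlo sketch of that effect (the jitter model and all numbers are invented for illustration, not real silicon physics): shrink the timing margin between signal settling and the clock edge, and the per-sample error rate explodes.

```python
import random

random.seed(42)

def bit_error_prob(clock_margin_ns, noise_ns=0.1, trials=100_000):
    """Fraction of samples where settling jitter exceeds the margin.

    clock_margin_ns: slack between worst-case settling and the clock edge.
    noise_ns: assumed std-dev of settling-time jitter (made-up figure).
    """
    errors = sum(
        1 for _ in range(trials)
        if random.gauss(0, noise_ns) > clock_margin_ns
    )
    return errors / trials

stock = bit_error_prob(clock_margin_ns=0.5)         # comfortable margin
overclocked = bit_error_prob(clock_margin_ns=0.05)  # margin squeezed away
print(stock, overclocked)
```

With the comfortable margin the error rate is effectively zero; with the squeezed margin roughly a third of samples misread. At billions of latch events per second, even rates far smaller than that mean constant corruption.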
While it's reasonable to overclock a chip that has been conservatively binned simply to reduce manufacturing costs but is actually stable at higher clock rates, trying to overclock past the design limits is pretty insane if you care at all about data integrity. Also, you tend to burn out the electronics before their expected life due to increased heat stress.
I never overclock.
Hot Multi-OS Switching — Why Isn't It Everywhere?
Most devices barely work under one operating system, let alone being initialized and controlled by multiple driver models and hot-switched back and forth between them.
They are simply not designed for that scenario. Hence, the hypervisor, and virtualized devices under it.
RadioShack Trying To Return To Its DIY Roots
Radio Shack has been a ripoff for years. Why the hell would anyone who knows enough to DIY pay $4 for a 5-cent part? Sure, it might take a few days for parts to come from Mouser, but when you're designing a circuit you need a lot of components, you generally plan out what you need in detail, and a retail place just isn't going to stock whatever exotic parts your project needs anyway.
Since there are far more folks who aren't with it enough to DIY, Radio Shack is far better off overcharging the masses for extension cords, sub par computers, and low grade RC cars in the mall. They just want the masses to THINK that smart people shop there.
Western Washington Univ. Considers Cutting Computer Science
Western's CS program is one of the ones that grew out of a math base. It's pretty hardcore on the theory, but you're sort of on your own for learning the stuff that business wants. Which is fine... even if the program focused on exactly whatever buzzwords corps want these days, corps don't generally hire CS grads straight out of school. The stuff you learn in the 400-level classes is great for senior developers to know... but you're not going to start out as one. It wasn't till my 3rd job out of college (which I'm still at) that I actually got to touch source code at work. For long-term personal growth, I'm really glad that I had my ass kicked with the theory; I find that the rigorous methods that were drilled into me really help me tackle the hard problems I work on every day (debugging nasty kernel-mode race conditions in code written by others, for example). Besides, if you can handle the proofs and algorithm stuff, you can handle anything else, though you sure as hell won't enjoy writing silly business apps over and over.
You know what the job placement folks at Western tell you about finding a job once you graduate? They tell you to forget about finding anything remotely in your field. The real difficulty in getting hired after college has less to do with your skills and what you're taught and more to do with employers' risk aversion... they don't like hiring green kids who don't understand corporate politics yet. You have to persevere in order to get to do what you love.
Computer science is supposed to be hardcore... unfortunately there is huge variation in what different universities consider to be computer science, let alone what the business world thinks. For some, any old programming is CS; others focus on software engineering methods; some hardly touch theory and math at all; others still consider web page design to be CS. CS is about understanding the extreme limits of what computers and software are capable of and pushing the limits of what's possible... it's not supposed to train you for "IT" (which most businesses consider to be the guys who fix their computers).
You really should not be doing a computer science degree unless you are going to be some kind of developer and you get off on things that require in-depth knowledge of how to design and compare the performance of different algorithms: you want to fix bugs no one else can, write really hardcore software (such as speech recognition, computer vision, or 3D rendering) at the bleeding edge, and be able to prove why your design is better than someone else's. The industry is already full of very experienced, very competent people who don't have CS degrees. In fact, many of them started before such degree programs even existed. They know how to code, but they generally don't have any exposure to the more advanced theory and are therefore not inspired by it, nor do they generally value it. The degree is MUCH more a long-term investment in your career than a credential to get your foot in the door, as you'll eventually get to apply the theory and start doing things that wow. After you've taken your lumps, that is.
Ask Slashdot: Do I Give IT a Login On Our Dept. Server?
You're doing work for the hospital on the system; therefore they need access to it.
Not only that, but there are all sorts of legal requirements around any data on the damn thing. Technically, your calendar, which includes appointment data and scheduling for when you worked on which patient's stuff probably falls under the domain of medical records....
There's a reason that bureaucracy isn't really compatible with you throwing up a server for whatever... there are legal requirements that make it so every little thing needs to have enterprise-grade BS and management behind it. At least on paper, anyway.
Not only that, but once you've used it for that, who's going to sanitize the data off it when you're done with it? I'm surprised the IT guys didn't show up with crowbars demanding admin accounts, followed shortly by dismantling the thing.
That said, I'm sure it's a sweet iPhone calendar thingy or whatever.
Does Syfy Really Love Sci-Fi?
I hate wrestling, and I hate Ghost Hunters. It's all they show now. Neither one is science fiction or epic fantasy. Those idiots who took over Syfy don't understand that the people who used to watch SciFi don't watch anymore, because of their stupidity. They have killed off every show that was even moderately interesting to watch.
The whole point is that Scifi was a place where stuff that wasn't mainstream could flourish. The audience doesn't want the bland stuff that's dumbed down for people with a 50 IQ. Now the morons who own it have turned it into another version of TBS.
With Scifi dead, I have no reason to bother keeping cable other than the History channel, which is also starting to go downhill with stupid reality shows. (Pawn Stars is great though..it's actually genuine.)
Is an Internet Kill Switch Feasible In the US?
It's a stupid idea.
Besides, the economic impact alone from breaking the internet in the US for any period of time makes "pushing the kill switch" political suicide anyway.
Also, it's exactly the same power as "we want to shut down the phone system so you can't communicate or call 911 during a revolt, or whenever, you know, some politician feels like it".
British ISPs Embracing Two-Tier Internet
Also, what people don't realize is that the internet is already a loose confederation of networks owned by only a few corporations who have peering deals with each other, and they already throttle each other under the table.
There have already been incidents where the Internet experiences massive failures when these companies get into pissing contests with each other and shut off each other's access to influence negotiations.
British ISPs Embracing Two-Tier Internet
Akamai is very different from a "two tier strategy".
Akamai is all about having local data centers near high-traffic population centers. This has the side effect of relieving congestion on the main internet backbones by essentially doing local caching. You want the data, and it happens to be located on a server closer to you, which means it does not have to bottleneck through the backbone as much, so you get better scaling and performance. This strategy is a net positive: the internet as a whole benefits from reduced waste, and hosts can deliver content more efficiently with a better user experience.
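A toy simulation of that caching effect (the latency numbers are made-up assumptions): repeat requests get absorbed at the edge, so the backbone carries each object once per edge instead of once per request.

```python
EDGE_MS, BACKBONE_MS = 10, 120  # assumed round-trip costs in milliseconds

def serve_all(requests, use_edge_cache):
    """Return (total latency ms, backbone fetches) for a request stream."""
    cache, latency, backbone_fetches = set(), 0, 0
    for obj in requests:
        if use_edge_cache and obj in cache:
            latency += EDGE_MS        # cache hit: request stays local
        else:
            latency += BACKBONE_MS    # miss: traverse the backbone
            backbone_fetches += 1
            cache.add(obj)
    return latency, backbone_fetches

# 1000 requests for 10 popular objects
stream = [f"video{i % 10}" for i in range(1000)]
cached_ms, cached_fetches = serve_all(stream, use_edge_cache=True)
direct_ms, direct_fetches = serve_all(stream, use_edge_cache=False)
print(cached_ms, direct_ms)            # 11100 vs 120000
print(cached_fetches, direct_fetches)  # 10 vs 1000
```

The edge cuts total latency by an order of magnitude and reduces backbone fetches from 1000 to 10, which is the "reduced waste" part: nobody's traffic got deprioritized to achieve it.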
A two-tier internet is something *very* different. That's taking the same pipe and allocating priority to the rich and powerful at the expense of those who don't pay the premium; there is still the same overall amount of bandwidth available, but they want to allocate less of it to you and more of it to companies that pay. How that will actually work is that those who pay more get internet hosting that works, and everyone else gets screwed with a broken, high-latency, congested network. Oh, and the price for everyone else will also go up while the service goes down.
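The zero-sum nature of prioritization can be sketched with a toy weighted-share model (the flow names and weights are invented for illustration): the pipe's capacity is fixed, so weighting "premium" traffic up necessarily weights everyone else down.

```python
def allocate(capacity_mbps, flows):
    """Weighted fair share: each flow gets capacity * weight / total weight."""
    total = sum(weight for _, weight in flows)
    return {name: capacity_mbps * weight / total for name, weight in flows}

pipe = 1000  # Mbps, fixed: prioritization adds no capacity

neutral = allocate(pipe, [("premium_video", 1)] + [(f"site{i}", 1) for i in range(9)])
tiered  = allocate(pipe, [("premium_video", 10)] + [(f"site{i}", 1) for i in range(9)])

print(neutral["site0"], tiered["site0"])  # 100.0 vs ~52.6: the non-payers' cut
print(sum(tiered.values()))               # still ~1000: zero-sum
```

Every megabit the premium flow gains comes straight out of the unpaid flows' shares; the total never changes.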
Everyone else should get really pissed off about this crap, once they figure out how bad the deal is for them.
Let me put it this way: if this sort of thing is allowed, more advanced internet services developed over the next few years will only be possible when they are run by huge corporations with deep pockets, and all other innovators will be shut out in the cold. And that means you get to pay more for those services, because there won't be any competition.
How Do You Store Your Personal Photos?
Does your buddy know the encoding for the data?
Does his method work on "known good" flash memory?
I'd make sure that you understand the data encoding in the flash memory, and which type it is and how it maintains the data before drawing too many conclusions.
If the flash is failed due to wear, then that's expected. (if you run disk stress against an early generation flash key, you can wear it out pretty fast)
How Do You Store Your Personal Photos?
Most of the time, the need for physical drive recovery is due to one of the following cases:
The controller board on the drive went bad. Replaceable with minimal effort, and the right part.
A moving part failed (e.g. the reader arm or whatever). Replaceable with some effort, and the right part.
Somebody hosed the partition table. Usually possible to fix with a hex editor if you can manually reconstruct the table and/or use backup copies elsewhere on the disk. There is (expensive) software which will do this sort of stuff for you. Not for the faint of heart.
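For a concrete taste of what that hex-level reconstruction involves, here's a sketch that decodes the classic MBR partition table, which is just four 16-byte entries at offset 446 of the first sector, from a synthetic disk image (CHS fields are left zeroed for simplicity):

```python
import struct

def parse_mbr(sector):
    """Decode the four primary partition entries from a 512-byte MBR."""
    assert len(sector) == 512 and sector[510:512] == b"\x55\xaa", "bad MBR signature"
    partitions = []
    for i in range(4):
        entry = sector[446 + 16 * i : 446 + 16 * (i + 1)]
        boot_flag, ptype = entry[0], entry[4]
        start_lba, num_sectors = struct.unpack("<II", entry[8:16])
        if ptype != 0:  # partition type 0x00 marks an unused slot
            partitions.append({"type": ptype,
                               "start_lba": start_lba,
                               "sectors": num_sectors,
                               "bootable": boot_flag == 0x80})
    return partitions

# Synthetic image: one bootable Linux partition (type 0x83) at LBA 2048.
mbr = bytearray(512)
mbr[446:462] = bytes([0x80, 0, 0, 0, 0x83, 0, 0, 0]) + struct.pack("<II", 2048, 1_000_000)
mbr[510:512] = b"\x55\xaa"

parts = parse_mbr(bytes(mbr))
print(parts)
```

Rebuilding a hosed table is the reverse: figure out where the filesystems actually start on the raw disk, then write entries like these back by hand.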
Filesystem corruption. Good luck, unless you have filesystem internals knowledge. Bits that aren't corrupt might be saved, with much effort, by a filesystem developer. (i.e. mere mortals are screwed).
RAID failure causes inaccessibility of some stripes (e.g. two simultaneous disk failures in a RAID 5, or one disk failure in a RAID 0). You might get some data off the remaining stripes, but unless your file happens to be smaller than the stripe size and happens not to cross a stripe boundary, you will likely have lost significant portions of the data. It takes an expert to reconstruct what little can be saved.
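The parity math behind RAID 5 is plain XOR, which also shows exactly why one lost stripe is recoverable and two are not. A minimal sketch:

```python
def xor_blocks(*blocks):
    """Byte-wise XOR of equal-length blocks."""
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            out[i] ^= b
    return bytes(out)

d0, d1, d2 = b"AAAA", b"BBBB", b"CCCC"  # data stripes on disks 0-2
parity = xor_blocks(d0, d1, d2)          # parity stripe on disk 3

# Disk 1 dies: its stripe is the XOR of everything that survived.
rebuilt = xor_blocks(d0, d2, parity)
print(rebuilt)  # b'BBBB'
# If disks 1 AND 2 both die, there is one XOR equation and two
# unknowns; the lost stripes cannot be reconstructed.
```

Any single missing stripe falls out of the surviving stripes plus parity; a second simultaneous failure leaves more unknowns than equations, which is why that case lands in expert-recovery territory.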
Physical damage to the platter (e.g. a scratch). If you are lucky you might be able to read some bits off the parts of the disk that aren't damaged. Depends on the nature of the physical damage, though.
Failure due to hairline fractures can sometimes be worked around by freezing the drive long enough to get the data off.
I'm discounting elaborate theoretical scenarios where you use some kind of external reading equipment on the drive. In the real world, recovery companies try the above techniques and give up if they fail.
With solid state drives, you have the same story for partition table and filesystem issues, and for controller boards, assuming the controller is a separate piece. I suppose the equivalent of the physical platter scenarios would be desoldering the flash chips and moving them to another identical drive, which is more difficult and far more expensive in labor. Plus, only some of the chips may still be good, and because of the way wear-leveling works, your data will be scattered across the chips noncontiguously for the most part, so you're likely to get only partial recovery anyway.
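A toy sketch of that scattering (the mapping below is a deterministic stand-in; a real FTL's logical-to-physical map is effectively arbitrary): losing one chip punches noncontiguous holes through every file rather than taking out one contiguous region.

```python
NUM_CHIPS = 4
logical_blocks = list(range(32))  # one 32-block "file"

# Stand-in for the drive's opaque wear-leveling map: stripe logical
# blocks across the chips.
physical_chip = {lb: lb % NUM_CHIPS for lb in logical_blocks}

dead_chip = 2
lost = [lb for lb in logical_blocks if physical_chip[lb] == dead_chip]
surviving = [lb for lb in logical_blocks if physical_chip[lb] != dead_chip]

print(lost)  # every 4th logical block: noncontiguous holes in the file
```

Three quarters of the blocks survive, but the gaps land every few blocks, so no contiguous run of the file comes back intact; that is the "partial recovery" you actually get.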
Pot Grower's Privacy Challenged
Require a security system? Really? In Montana most people don't even bother to lock their cars.
While it's a good idea to have one as a business, it really should not be a requirement. Now you have to staff your police to respond to the constant stream of false alarms that come from said security systems. There's a reason most police jurisdictions bill you for their time when responding to a security system incident after the first false one.
Plus, now you are requiring businesses in your small town to pay service fees to an external security monitoring company. That's a drain on your local economy. Good job, because that's $30-$100 per business per month that isn't staying in your community.
Should Colleges Ban Classroom Laptop Use?
You really should be communicating with your profs during class, not dinking around on your laptop. Unless you type quite a bit faster than most hunt-and-peck typists, you're going to take better notes with pen and paper anyway, and there are relatively few situations where you really need a computer as a tool during class.
What really should be banned is the PowerPoint lecture. You know the ones... where your prof essentially scanned all the relevant sections of the textbook and then cruises through 200 screens of unreadable, shrunken slides at light speed while staring at his laptop. Most folks use PowerPoint as a crutch rather than as a visual aid for getting a handful of important points across, or for providing a persistent framework for discussion. It's irritating. I'd much rather see the chalkboard in use, screeching and all... at least then the instructor is forced to cover the material at the speed it takes him/her to think through it, giving you enough time to grok it during the lecture.
Net Neutrality Supporters Hammered In Elections
Lame. Not enforcing net neutrality allows, and encourages, service providers and media companies to selectively censor whatever they want; without violating any legal requirements, they can simply make it too expensive for anyone else to voice their opinion publicly or provide competing content. Heck, you don't even have to actively censor things... you just make the performance and cost of everything that isn't your content slightly worse, and economics will make sure that your message is seen more than the second-class-citizen content.
There is too much danger of these huge corporations manipulating free speech and culture to allow anyone to do that. It will quite rapidly devolve into the same environment as TV... only the rich and big companies can afford to publish.
The internet is a communications medium. You don't let the phone company tell you what you can talk about on your phone, so why should you let your ISP tell you what you can see on the internet? It's the same garbage where the TV corporations want to control your internet the same way they control what shows get made. Hint: the shows that get made aren't the shows people want to see; they are the shows corporate executives think will sell best to advertisers.
Are Desktop Firewalls Overkill?
Layer your firewalls like the design of a medieval keep. Exterior curtain wall, plus defensible keep.
You don't know whether threats come from inside or outside; therefore when in doubt firewall everywhere.
Intel Wants To Charge $50 To Unlock Your CPU's Full Capabilities
For most applications where CPU power matters, hyperthreading actually hurts performance, because the scheduler assumes that the logical hyperthread cores are real cores when they are not. A hyperthreaded core is not a full core; it shares the physical core's execution resources with its sibling, so a thread on one logical core will bottleneck waiting for units the sibling is using. I'd prefer not having hyperthreaded cores at all unless the OS and applications are aware that some cores are gimpy hyperthread cores and take that into account, i.e. don't schedule on them, or only schedule tasks there that aren't actually crippled by sharing the real core.
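A toy model of that scheduling effect (the throughput numbers are assumed for illustration; real hyperthreading penalties vary by workload): two CPU-bound tasks packed onto sibling logical cores finish later than the same tasks spread across distinct physical cores.

```python
# Assumed per-thread throughput: alone on a physical core vs. sharing
# it with a hyperthread sibling. These are illustrative numbers only.
FULL_CORE_RATE = 1.0   # work units/sec, thread alone on a physical core
HT_SHARED_RATE = 0.6   # assumed rate per thread when two siblings compete

def makespan(threads_per_physical_core, work_units=100):
    """Seconds for each thread to finish at the given packing."""
    rate = FULL_CORE_RATE if threads_per_physical_core == 1 else HT_SHARED_RATE
    return work_units / rate

# Two CPU-bound tasks on a 2-physical-core, 4-logical-core machine:
naive = makespan(2)           # scheduler co-locates them on HT siblings
topology_aware = makespan(1)  # one task per distinct physical core
print(naive, topology_aware)  # ~166.7 vs 100.0 seconds
```

A scheduler that sees four interchangeable "cores" can land both tasks on siblings and eat the slowdown; a topology-aware one fills physical cores first and only spills onto siblings when it must.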
Try running a database on hyperthreaded cores... it blows chunks. You're better off just disabling them in the BIOS.