
Software-Defined Data Centers: Seeing Through the Hype

Soulskill posted about a year ago | from the mashing-up-jargon dept.

Businesses 39

Nerval's Lobster writes "In case you didn't catch it yesterday, AllThingsD ran a piece endorsing the idea of the software-defined data center. That's a venue where hordes of non-technical mid- and upper-level managers will see it and (because of the credibility of AllThingsD) will believe software-defined data centers are not only possible, but that they exist and that your company is somehow falling behind because you personally have not sketched up a topology on a napkin or brought a package of it to install. If mid-level managers in your datacenter or extended IT department have not been pinged at least once today by business-unit managers offering to tip them off to the benefits of software-defined data centers—or demand that they buy one—then someone should go check the internal phone system because not all the calls are coming through. Why was AllThingsD's piece problematic? First, because it's a good enough publication to explain all the relevant technology terms in ways that even a non-technical audience can understand. Second, it's also a credible source, owned by Dow Jones & Co. and spun off by The Wall Street Journal. Third, software-defined data centers are genuinely happening—but they're in the very early stages. The true benefits of the platform won't arrive for quite some time—and there's too much to do in the meantime to talk about potential endpoints. Fortunately, there are a number of resources online to help tell hype from reality."




You know you want to... (0)

White Flame (1074973) | about a year ago | (#44016451)

Just call it Software Defined Cloud 2.0 and be done with it already.

Re:You know you want to... (1)

mwvdlee (775178) | about a year ago | (#44016553)

Would it be out-of-date to call it a Virtual Cloud?

Re:You know you want to... (2)

khasim (1285) | about a year ago | (#44016705)

I think they already did. From TFA:

Sometimes the hype tends to pan out and concepts such as "e-commerce" become a normal way to shop.

60% of the time, it works every time.

Either way, the term "software defined" is with us to stay, and there is real meaning and value behind it if you look past the hype.

Except that the term "software defined" is not itself defined except by whatever marketing department wants to make it fit their product.
And the term will eventually be replaced with another marketing term.
Just as SaaS replaced ASP.
Just as ASP replaced thin-client.

What all these "software-defined" concepts really boil down to is: virtualization of the underlying component and accessibility through some documented API to provision, operate and manage the low-level component.

Which means that you'll only have the access and granularity that the API gives you.
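To make that point concrete, here's a minimal toy sketch (all class and method names invented, not any real provider's API) of the pattern: the tenant sees only the operations the provider chooses to expose, and anything outside that surface simply isn't available.

```python
# Toy stand-in for a provider's provisioning API. The point: the caller
# never touches real hardware, only whatever operations the API exposes.
class ProvisioningAPI:
    def __init__(self):
        self._instances = {}
        self._next_id = 0

    def provision(self, cpus, ram_gb):
        # The provider decides placement; the caller never learns
        # which physical host the instance landed on.
        self._next_id += 1
        inst_id = f"vm-{self._next_id}"
        self._instances[inst_id] = {"cpus": cpus, "ram_gb": ram_gb,
                                    "state": "running"}
        return inst_id

    def operate(self, inst_id, action):
        if action not in ("stop", "start"):
            # Granularity limit: anything the API doesn't expose, you can't do.
            raise NotImplementedError(f"API does not expose '{action}'")
        self._instances[inst_id]["state"] = \
            "stopped" if action == "stop" else "running"

    def status(self, inst_id):
        return self._instances[inst_id]["state"]

api = ProvisioningAPI()
vm = api.provision(cpus=2, ram_gb=4)
api.operate(vm, "stop")       # exposed: works
# api.operate(vm, "migrate")  # not exposed: raises NotImplementedError
```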

And somewhere, someone will have to deal with the real servers and switches and such. And to him, you'll just be another account in a bunch of accounts. Sure, they'll deduct the cost of your downtime from your next bill. Well, the cost of what you pay them per business hour per business day.

Don't like it? Just try to get your data from them so that you can move it to a different provider.

And each provider will be under the same pressures to reduce costs as much as possible in order to maximize their profits.

Re:You know you want to... (1)

visualight (468005) | about a year ago | (#44019071)


There are already a lot of people who know puppet or chef but not much else (and think they know something). I think the trend is going to continue, ultimately spreading to the distros themselves as they become less malleable, less flexible, with fewer differences between them (Hi Lennart). We'll have frameworks for frameworks. Linux will be Windows.

And then the trend will reverse. The current crop of 20-somethings will be in their 30's and "reinventing Unix".

wat (-1)

Anonymous Coward | about a year ago | (#44016467)

bla bla blaaa blaaa words wat is this crap

End of the "UNIX/Linux guru" (5, Insightful)

Animats (122034) | about a year ago | (#44016505)

We're no longer constrained by the need to have deep specialized knowledge in the low-level components to get basic access to this technology.

That's what it is really about. The unit of computational resource is a standardized, empty server. It's not "maintained", it's wiped and reloaded. If something goes wrong with it, its load is sent elsewhere, and eventually the unit will be replaced by someone who unplugs it and plugs in another one. Nobody in the data center really has to have much of an idea of what's going on with the computers. Their concerns are power, cooling, cabling, and physical security.

Most of them will be paid at security-guard levels.

Re:End of the "UNIX/Linux guru" (1)

DavidClarkeHR (2769805) | about a year ago | (#44016687)

Most of them will be paid at security-guard levels.

Wait a minute ... are we talking about USA-Rent-a-cop security guards, or sultan of dubai security guards?

USA-Rent-a-cop min wage and BYOG (1)

Joe_Dragon (2206452) | about a year ago | (#44017003)

USA-Rent-a-cop min wage and BYOG.

Yes, there was a data center that posted a job for security guards after they got robbed; they were paying min wage, and having your own gun was a big plus.

Re:End of the "UNIX/Linux guru" (-1)

Anonymous Coward | about a year ago | (#44017843)


Mistaken.. (0)

Anonymous Coward | about a year ago | (#44016727)

The stacks constructed atop these solutions are actually a lot more complicated than those that continue to be at the core of many enterprise workloads. The need for 'gurus' continues, and in fact grows, but they won't be focused as much on the continued health of any particular instance as on the resiliency of the workload on top of a set of unreliable instances. (That assumes AWS and similar take over the market entirely; I actually believe that going to ecosystems like AWS is currently overhyped, and IT will stop doing large moves to it in the near future. There might even be a backfire as most companies realize that they do not have the skills required to provide an adequate service under those conditions.)

There is a lot to learn from the model, but I envision ultimately a compromise: AWS and similar inspiring IT orgs to cut out some of the truly meaningless crap they have, while preserving some of the instance reliability that their less-than-genius application developers require to deliver remotely acceptable behavior. I'll agree that you need to be able to tolerate failure *anyway*, but the truth is the run-of-the-mill application developer isn't good enough to pull it off. Enterprises will continue to have a requirement to persist instances whatever it takes, and failures will just continue to be a nightmare.
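The "resilient workload on unreliable instances" pattern the parent describes can be sketched in a few lines (a toy illustration with invented names, not anyone's production code): instead of trusting any single host, a request is retried against a pool of replicas until one answers.

```python
import random

def call_with_failover(replicas, request, attempts=3):
    """Try replicas in random order; return the first success, else raise."""
    last_err = None
    for replica in random.sample(replicas, min(attempts, len(replicas))):
        try:
            return replica(request)
        except ConnectionError as err:
            last_err = err  # instance presumed dead; move on to the next one
    raise RuntimeError("all replicas failed") from last_err

# Two fake replicas standing in for instances: one dead, one healthy.
def dead(_req):
    raise ConnectionError("instance terminated")

def healthy(req):
    return f"ok:{req}"
```

With both replicas tried, the call succeeds even though one instance is gone, which is exactly the skill shift described above: the guru's job becomes designing this layer, not nursing the individual instance back to health.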

Hell, look at Netflix. They are among the best at providing a decent experience on top of a complicated, unreliable infrastructure. Nevertheless, they've suffered 2-3 multi-hour outages a year for their entire service. It's common for a request to play a video to fail and require a retry by the user (or for a video in flight to stop working). Despite all their ability and all their effort, they still provide an experience that falls short compared to many enterprise implementations.

Re:Mistaken.. (1)

tnk1 (899206) | about a year ago | (#44018343)

That is because stuff like AWS has been missing important pieces. You would not believe the effort I have had to go through to find some sort of active-active database solution in AWS so that I do not have to worry about a node failing and having to have someone mess around with failover to get it running again.

However, as I speak, solutions are popping up for the things that are missing. It's my impression that right now, with the right third party vendors who are direct connected with AWS, you can run just about anything you want. Enterprise or not.

There are a lot of potholes you still have to worry about, but right now, unless you think you need your own and/or specialized hardware for some reason, you can run just about anything you want in AWS. The last thing I need to get operating in AWS is Oracle RAC, which is a gigantic pain in the ass. A year or two ago, it just wouldn't work, because I needed to rely on EBS and you couldn't even configure second interfaces on your Linux hosts. Now you can get iSCSI or NFSv4 storage from a third-party vendor, and you can create extra interfaces to do your heartbeat stuff.

I don't think datacenters are going away, and Linux skills certainly aren't, because we still have to admin Linux in EC2, but I think co-location of small to mid-sized companies might take a dive. This stuff is ten times better than having to run a datacenter yourself unless you have a specific need.

Re:Mistaken.. (1)

rioki (1328185) | about a year ago | (#44027001)

You can, but do you want to? My experience is that cloud computing, especially EC2, is quite expensive. The only benefit you get is elasticity (the E in EC2), that is, you can react to radical load changes. But most enterprise applications have very predictable use patterns and are mostly flat (9-5, Mo-Fr). In almost all cases, having your own servers is the cheaper solution and WAY simpler to deploy. (If you are paranoid you may still use virtual machines as servers for quick failover.)

Re:End of the "UNIX/Linux guru" (2, Interesting)

Anonymous Coward | about a year ago | (#44016747)

I'm tempted to do a snarky rewrite of your post, like this:

Much the same could be said for the future of programming: soon, one will use these newfangled "libraries" and "macros" for every task, never stopping to consider hand-optimizing their assembly code, because there will be no assembly-code gurus left. Nobody engaging in this sort of plug-and-chug "programming" will really have to have much of an idea of what's going on with the computers. Their concerns are stringing together pre-packaged libraries with just enough logic to solve real-world problems. Most of them will be paid at security-guard levels.

I'm afraid, though, that the point would be lost in the sarcasm. When datacenter jobs are basically custodial/janitorial/security in nature, progress will have been made. Minds that are able and willing to tackle difficult systems should be applied to life's real problems, not to the OCD maintenance of computer systems in a data warehouse. This is the modern analogy of the industrial revolution: what artists (gurus) once handled with skills finely honed by experience and the wisdom of generations of artisans, shifts of line-workers armed with machinery soon churned out in quantities orders of magnitude greater and at far lower prices, and the truly skilled went on to work on other problems, and the less-skilled despite being less-skilled still got the job done, and progress was made, and the world is a better place for it. Sure, your local Guru is going to be replaced; he'll go to work on something else, and downtimes will be lower, and costs to access computing will be lower, and applications more powerful, and progress will be made.

Re:End of the "UNIX/Linux guru" (0)

Anonymous Coward | about a year ago | (#44017715)

Jokes on you. The smart Linux guys have become smart DevOps and Automation guys. Have configuration management, will travel.

Re:End of the "UNIX/Linux guru" (1)

tnk1 (899206) | about a year ago | (#44018377)

Yes, as soon as I discovered the DevOps buzzword, it went on my resume. The number of unsolicited requests I get for DevOps placement is ridiculous. And what is better, DevOps isn't really even as well defined as something like Agile. While I am definitely qualified to say I do DevOps work, by some definitions you could probably just look up what it means, teach yourself Puppet or Chef, and now you are "DevOps".

This will not eliminate diagnosing issues... (1)

t'mbert (301531) | about a year ago | (#44018721)

As systems become more inter-connected and more dependent on standard components, they also become more difficult to diagnose. Problems in one seemingly benign part of the system can affect it and render it unusable, and now those parts may be spread around virtual datacenters and servers. We need OS gurus now more than ever, but it's also expected that those gurus know many different technologies (hence, DevOps and other automation-oriented skills). It's the corollary to what's happening in development: one developer can build more software faster than ever before, but they must also be knowledgeable in a wider range of technologies.

Re:End of the "UNIX/Linux guru" (0)

Anonymous Coward | about a year ago | (#44027681)

You mean their concerns will be to acquire power, cooling the bodies, cabling the females and physical security from public forced? Will require a lot of security guards indeed.

Wut. (4, Insightful)

bmo (77928) | about a year ago | (#44016509)

It's as if there's something genetic in MBA types that makes them abuse English as awfully as this summary exemplifies.

It's a good thing that tomorrow is Bloomsday.


Re:Wut. (1)

PolygamousRanchKid (1290638) | about a year ago | (#44017023)

It's as if there's something genetic in MBA types that makes them abuse English as awfully as this summary exemplifies.

Never fear! See this "software-defined" craving as an opportunity . . . replace your MBAs with "software-defined management!"

love slashdot summaries (4, Insightful)

Osgeld (1900440) | about a year ago | (#44016531)

the other day I saw a summary that was not even a complete sentence; now today I see one that could have had all the words above the third point removed and it would not have made any difference, because it's just some asshat getting on the whine train about management.

somewhere there is a middle; maybe one decade Slashdot can consistently hit it!

Re:love slashdot summaries (1)

kiwimate (458274) | about a year ago | (#44017283)

Yep. I am wondering if I read the same article. Far from being a ringing endorsement of some buzz phrase (which I'd never heard of), it starts off with a warning about falling for hype and then continues to describe what a software defined data center actually is and finishes with some prognostications about what will be.

I think the main point missing is one that any IT manager will see. (Mind you, I labor under the impression that IT managers, who usually do have some kind of business sense, understand that you can't just buy endless quantities of hard disk and servers on the premise that you'll need it some day. Perhaps I've been lucky, but all the managers I've worked with over the past 20 years certainly got that.) Namely, that unless you are using Amazon's services, you don't have endless arrays of hard disk behind the door and the idea of someone idly clicking a button to format TB of disk at a whim comes crashing back to reality when the next business unit who tries the same thing comes knocking to ask where all the free disk space has gone. I.e. same as always, capacity isn't free, performance isn't automatic, and quotas are a necessary evil.

Summary sounds to me like a typical hatchet job aimed at registering lots of page views by giving a deliberately inflammatory starting point that's designed to get /. readers' blood boiling. And clever them (and dumb me), I've fallen for it as soon as I click "submit".

So what's the problem (1)

NemoinSpace (1118137) | about a year ago | (#44016549)

The article on robot-controlled data centers received a warm reception. This article seems more critical. It simply explains what Amazon is already doing: automation through programmed check boxes.
I'm not talking about some goofy VB-controlled hack, either. When you automate something, you'd better have a good understanding of the process. Pushing your labor off to users with credit cards willing to pay for the privilege has been the business model for quite some time.

Our phone system isn't broken... (0)

Anonymous Coward | about a year ago | (#44016763)

...'s software-defined.

Kevin Fogarty wrote the summary (0)

Anonymous Coward | about a year ago | (#44016825)

Yes, one could discover that by clicking on one of the links, but it's not the main link - it's the one at the bottom.

When I first read the summary (here on Slashdot), I thought it was an endorsement of software defined data centers combined with an unusual plug for AllThingsD (not just another online publication...). In fact, it's just the opposite, Fogarty is saying it might be a load of hype after all. That's the kind of nuance you lose by cutting and pasting paragraphs from a column to a newsfeed-type blog.

Re:Kevin Fogarty wrote the summary (0)

Anonymous Coward | about a year ago | (#44017213)

Well, unlike many people, I do give his views a lot of credence.

Re:Kevin Fogarty wrote the summary (0)

Anonymous Coward | about a year ago | (#44017961)

You and I must have read different summaries. The TFS I read pretty clearly warns that the PHBs are about to be inundated with convincing but misleading information about an immature technology. Maybe they changed it.

Re:Kevin Fogarty wrote the summary (1)

rioki (1328185) | about a year ago | (#44027127)


The TFS I read, pretty clearly warns that the PHBs are about to be inundated with convincing but misleading information about an old technology.

The technology is nothing new, as mentioned (and disputed) in the AllThingsD article: "We've been doing this since the 80s." The new thing is the business model of operating a data center: renting out the hardware and combining it with virtualization and automation software. But if you already have a data center you are doing this, and if you don't, you should be using some form of automated provisioning.

The only real change I have seen in the space is that (virtualized) hardware is now rented by the hour instead of on a 12-month contract. But that is not what either article is about... Nothing new under the sun.

If anyone *actually* cared... (1)

Anonymous Coward | about a year ago | (#44016833)

...would point them at something like this, instead of a Wired piece.

"Software-defined" (1)

oldhack (1037484) | about a year ago | (#44016859)

Another senseless buzz-defined buzz phrase.

Acronym abuse. (1)

msauve (701917) | about a year ago | (#44016863)

Really, pointing to an article which (repeatedly) says the acronym for Software Defined Data Center is "SDDN"? That refers to Software Defined Data Network. The latter isn't limited to data centers, although that's where the clearest benefit lies. SDDN is just a single part of SDDC, which also makes use of virtualization of compute and storage resources.

Reality (0)

Anonymous Coward | about a year ago | (#44017225)

Posting as AC since I am not authorized to talk for my company. Sir, you really don't know what you are talking about. There are companies, ours being one of them, doing it in reality. What it took was starting from scratch, with a blank slate, and re-envisioning how you would do infrastructure today if you had all the resources you needed available (which we did) and didn't have to deal with legacy.

Attempt at making sense of it. (0)

Anonymous Coward | about a year ago | (#44017415)

For certain uses (datacenter and supercomputers) there is a demand for a special LAN that is large, fast, and cheap.

IP Routers can handle large networks, but are neither fast nor cheap. This is because IP is designed for the internet, and is overkill for a single-owner single-building LAN.

Ethernet switches are individually fast and cheap, but since they really only handle tree topologies properly, they lose efficiency when the network gets large.

The solution is to add a routed network layer that's not IP and is local to the LAN. This is called a fabric protocol, or something to that effect.
However, all fabric protocols are proprietary nightmares of vendor lock-in. SDN is an attempt to break the lock-in by letting you write your own fabric protocol using a set of standardized primitives provided by all SDN-compliant fabric switches.

However the set of standardized primitives is itself a new proprietary protocol...

Or I could be completely wrong, but at least this is the impression I got.
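The "standardized primitives" the parent mentions are, in SDN systems like OpenFlow, essentially prioritized match-action rules in a flow table. Here's a toy sketch of that idea (field names simplified and invented for illustration, not the actual OpenFlow wire format):

```python
# Toy match-action flow table in the spirit of SDN primitives: a controller
# installs rules, and the switch picks the highest-priority matching rule.
class FlowTable:
    def __init__(self):
        self.rules = []  # list of (priority, match_dict, action)

    def add_rule(self, priority, match, action):
        self.rules.append((priority, match, action))
        self.rules.sort(key=lambda r: -r[0])  # highest priority first

    def forward(self, packet):
        for _prio, match, action in self.rules:
            # A rule matches if every specified field equals the packet's.
            if all(packet.get(k) == v for k, v in match.items()):
                return action
        return "drop"  # table-miss default

table = FlowTable()
table.add_rule(10, {"dst_mac": "aa:bb:cc:dd:ee:ff"}, "output:port3")
table.add_rule(1, {}, "flood")  # low-priority wildcard: act like a dumb switch
```

The appeal is that the same primitives can express a learning switch, a custom fabric routing scheme, or anything in between, which is the lock-in escape hatch described above.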

frisT ps0t (-1)

Anonymous Coward | about a year ago | (#44017535)

What color should you get? (0)

Anonymous Coward | about a year ago | (#44018037)

I think mauve has the most RAM.

Software defined... (1)

thegarbz (1787294) | about a year ago | (#44018967)

I'm going to make me a software-defined sandwich, get in my software-defined car, and drive myself to my software-defined work, where I fix software-defined problems (well, the last part is at least technically true).

But seriously, is this the new synergy? The latest buzzword craze? When did doing some fancy programming suddenly mean we had to slap the word "software-defined" on everything?

I should pre-emptively go out and patent the process of doing anything "in software", because "on a computer" was sooo 2012.

Hype cycle (1)

nickovs (115935) | about a year ago | (#44021629)

"More than an actual technology, SDDN is the culmination of many other efforts at abstracting, consolidating, managing, provisioning, load balancing and distributing datacenter assets." Which is a fancy way of saying it's a bunch of commodity PCs running Xen, attached to some f***ing big Juniper QFabric switch with some PHP scripts to let middle-managers bring up servers without knowing where they are. It's just that right now our stage in the hype cycle is the Peak of Inflated Expectations.

Stock photos are funny (0)

Anonymous Coward | about a year ago | (#44023085)

I like that, for a piece about software defined data centers, the stock photo that gets in the way of the content is of hardware that you'd find on a desktop.

Really? A closeup shot of a row of what appears to be Dell(tm) desktop systems? C'mon /. editors, you can do better. Oh wait, they're Dice editors now...well shit.

"the credibility of AllThingsD" (0)

Anonymous Coward | about a year ago | (#44023473)

I never thought I'd see "the credibility of AllThingsD" in a sentence on Slashdot. Katie Boehret is cute, but the combined candle power of Walt Mossberg, Boehret, and the others couldn't light a shoe box.

How about "(because AllThingsD is the fluff consumed by people who don't know anything about technology)" instead?

Wow... talk about your broken analogies... (1)

rnturn (11092) | about a year ago | (#44030115)

``I imagine the software defined data center to be a Fantasia-like world where Mickey is the IT staff and the brooms are networking, storage, compute and security.''

Has the author actually seen Fantasia? Mickey's attempt to control the brooms turned out to be a disaster. I imagine the software-defined data center as being a boon for consultants (the Sorcerer?) who come in and clean up the ensuing mess that the Corporate Mickeys create.

One suspects that the software defined data center is going to have so many knobs and levers that nobody will be able to effectively control it all. Maybe not even the Sorcerers. Touch one knob and affect -- quite probably adversely -- a half dozen other knobs and levers.
