
The Quiet Before the Next IT Revolution

Soulskill posted about a month ago | from the before-the-AIs-violently-revolt dept.


snydeq writes: Now that the technologies behind our servers and networks have stabilized, IT can look forward to a different kind of constant change, writes Paul Venezia. "In IT, we are actually seeing a bit of stasis. I don't mean that the IT world isn't moving at the speed of light — it is — but the technologies we use in our corporate data centers have progressed to the point where we can leave them be for the foreseeable future without worry that they will cause blocking problems in other areas of the infrastructure. What all this means for IT is not that we can finally sit back and take a break after decades of turbulence, but that we can now focus less on the foundational elements of IT and more on the refinements. ... In essence, we have finally built the transcontinental railroad, and now we can use it to completely transform our Wild West."




Horseshit (0)

Anonymous Coward | about a month ago | (#47660795)

Horseshit, your company ran out of money to buy more tech. That's called 'stasis'.

Re:Horseshit (2)

Z00L00K (682162) | about a month ago | (#47661019)

Now - just because one company goes belly-up doesn't mean that another can't take over and be successful.

What you have is not by far a successful IT platform yet; you have the foundation. What is limiting is the ISPs and their customer agreements, which effectively limit users to being consumers of bandwidth and services. When the ISPs realize that their models of bandwidth throttling and agreements prohibiting customers from setting up services at home slow down the development of new companies and services, then you will see new creations. Not everything will be successful, but enough will be to build the next big corporation.

Re:Horseshit (4, Insightful)

some old guy (674482) | about a month ago | (#47661491)

I envy your optimism and agree that ISPs are the problem, but I don't see how new companies and services will force change upon ISPs.

New ISPs? Not in the state-sanctioned monopolist USA.

Loss of customers? See above.

The ISP and backbone provider bridge trolls sleep soundly, knowing that no one has the money or statutory permission to build competing bridges.

Only the FCC and Congress could do that, and the oligarchs are quite happy with the current bridge trolls.

Re:Horseshit (1)

tsa (15680) | about a month ago | (#47661531)

That's only in the US. Here in the EU we solved that problem long ago, paving the way for the development of new companies and services. You will be left behind if you keep letting the market decide these things.

Re:Horseshit (1)

Jawnn (445279) | about a month ago | (#47662113)

Yeah? But, but... All those EU countries and their policies are socialist. How could it be possible that socialist policies lead to anything that is better, faster, and more readily available? The free market ensures that where there's a demand, those things are always available to anyone who wants to buy them. Right. RIGHT?

Re:Horseshit (2, Interesting)

Anonymous Coward | about a month ago | (#47662489)

I know that was sarcasm, but for the sarcasm-impaired (or the ignorant), I recommend reading Greenspan's testimony to a congressional oversight committee in 2008, where he was forced to admit that the objectivist-based idiocy about free markets and rationality always winning out, which underpinned the Reagan Revolution and the subsequent deregulation and freeing of the "free market", does not work in the real world.

Amazing, to see someone who gazed admiringly on Ayn Rand as he sat at her feet forced to admit his entire philosophical base is a fallacy.

  = = =

REP. HENRY WAXMAN: The question I have for you is, you had an ideology, you had a belief that free, competitive — and this is your statement — “I do have an ideology. My judgment is that free, competitive markets are by far the unrivaled way to organize economies. We’ve tried regulation. None meaningfully worked.” That was your quote.

You had the authority to prevent irresponsible lending practices that led to the subprime mortgage crisis. You were advised to do so by many others. And now our whole economy is paying its price.

Do you feel that your ideology pushed you to make decisions that you wish you had not made?

ALAN GREENSPAN: Well, remember that what an ideology is, is a conceptual framework with the way people deal with reality. Everyone has one. You have to — to exist, you need an ideology. The question is whether it is accurate or not.

And what I’m saying to you is, yes, I found a flaw. I don’t know how significant or permanent it is, but I’ve been very distressed by that fact.

REP. HENRY WAXMAN: You found a flaw in the reality

ALAN GREENSPAN: Flaw in the model that I perceived is the critical functioning structure that defines how the world works, so to speak.

REP. HENRY WAXMAN: In other words, you found that your view of the world, your ideology, was not right, it was not working?

ALAN GREENSPAN: That is — precisely. No, that’s precisely the reason I was shocked, because I had been going for 40 years or more with very considerable evidence that it was working exceptionally well.

Re:Horseshit (5, Insightful)

davester666 (731373) | about a month ago | (#47661039)

alternately, it will soon be time for the pendulum to swing back to "we've got to have everything in-house, these security breaches are killing us" and "dumb terminals and having everything in the 'cloud' is killing productivity when the cloud is down, we need real apps so users can work even when the cloud doesn't"

I will add to the list.. (0)

Anonymous Coward | about a month ago | (#47661809)

'Our vendors' "virtual appliances" are horrible to update and are rife with security problems. We need them to give us an application, and we'll manage the OS so we can update the libraries.'

Re:Horseshit (1)

K. S. Kyosuke (729550) | about a month ago | (#47662399)

Or, in the future, you could just have apps that can run their parts either on the client side or on the server side depending on what's more advantageous in the given situation (based on network bandwidth, network latency, intra-app communication patterns, current server load, client CPU performance, client storage options etc.) That seems like the most flexible option I've ever heard of, and it subsumes having an offline mode. (Plan 9 already did something vaguely similar on the OS level.)
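A back-of-the-envelope version of that placement decision could look like the following sketch. The cost model and all the parameter names here are hypothetical illustrations, not from any real framework:

```python
# Toy cost model: run a unit of work wherever it is estimated to finish sooner.
# All parameters and the model itself are made-up illustrations.
def choose_side(payload_bytes, bandwidth_bps, latency_s,
                client_ops_per_s, server_ops_per_s, server_load, work_ops):
    client_time = work_ops / client_ops_per_s
    transfer_time = latency_s + payload_bytes / bandwidth_bps
    server_time = transfer_time + work_ops / (server_ops_per_s * (1.0 - server_load))
    return "client" if client_time <= server_time else "server"

# Big payload, light work: shipping the data costs more than computing locally.
print(choose_side(10_000_000, 1_000_000, 0.05, 1e6, 5e7, 0.2, 1e6))   # client
# Tiny payload, heavy work: the faster server wins despite the round trip.
print(choose_side(1_000, 1_000_000, 0.05, 1e6, 5e7, 0.2, 1e9))        # server
```

A real implementation would measure these inputs at run time rather than hard-coding them, but the shape of the decision is the same.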

Re: Horseshit (0)

Anonymous Coward | about a month ago | (#47661633)

Yep. Mostly BS.

We're in the beginnings of our next DC forklift upgrade--fabric interconnects, 40G pipes (debates over 100G & 10G), new server architectures, SAN changes, and a WiFi overhaul outside the DC.

Things are much more stable, but we are still seeing considerable changes in the underlying layers.


A rather simplistic hardware-centric view (5, Informative)

msobkow (48369) | about a month ago | (#47660809)

The article is a rather simplistic hardware-centric viewpoint. It doesn't even begin to touch on the areas where IT has always struggled: design, coding, debugging, and deployment. It completely ignores the issue of software development, bleating instead about how we can "roll back" servers with the click of a button in a virtual environment.

Which, of course, conveniently ignores the fact that someone has to write the code that runs in those virtual servers, debug it, test it, integrate it, package it, and ship it. Should it be an upgrade to an existing service/server, add in the overhead of designing, coding, and testing the database migration scripts for it, and coordinating the deployments of application virtual servers with the database servers.

Are things easier than they used to be? Perhaps for the basic system administration tasks.

But those have never been where the bulk of time and budget go.

Re:A rather simplistic hardware-centric view (3, Interesting)

jemmyw (624065) | about a month ago | (#47660825)

Indeed, and virtualization is a rapidly evolving part of infrastructure right now. We may no longer be upgrading the hardware as rapidly (although I'm not certain about that either), but the virtual layer and tools are changing, and upgrading those requires just as much upheaval.

Re:A rather simplistic hardware-centric view (1)

LordWabbit2 (2440804) | about a month ago | (#47661823)

Virtualization is a pain in the ass. Want a new prod server? *click*
Want a new dev environment? *click*
Want a new db server? *click*
Need an FTP server? *click*
Need an HTTP server? *click*

Before you know it, deploying a small software change becomes a big deal because you have a billion bloody servers to update.
Before virtualization (or at least the ease of virtualization) you took your time and planned, checked available resources, etc. Resources were scarce: RAM wasn't so abundant, disk space was metered out. Now prices have dropped (or capacity has increased, or both) so much that it doesn't really matter. At least in dev; in prod, SSD space is the bottleneck now.
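The sprawl tax is easy to picture: every one-click VM becomes one more target for the next deployment. A hypothetical sketch (the hostnames and the deploy step are placeholders, not a real tool):

```python
# With VM sprawl, even a one-line change fans out to every server running it.
# Hostnames and the deploy step are hypothetical placeholders.
servers = [f"app{i:02d}.example.com" for i in range(1, 13)]  # twelve "*click*" VMs

def deploy(host, version):
    # Stand-in for the real push (ssh, config management, image rebuild, ...)
    return f"{host}: updated to {version}"

results = [deploy(host, "1.0.4") for host in servers]
print(len(results), "servers touched for one small change")
```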

Hardware still matters (1)

mu51c10rd (187182) | about a month ago | (#47662873)

It seems too many forget that all this virtualization still runs on physical servers. Those physical servers still need hardware upgrades, monitoring, and resource management (especially when one starts oversubscribing). I don't get why people keep thinking hardware went away. Instead of lots of 1U servers, now you have big iron running lots of virtual servers.

If you didn't know what you were doing ... (4, Informative)

khasim (1285) | about a month ago | (#47660949)

Are things easier than they used to be? Perhaps for the basic system administration tasks.

But those have never been where the bulk of time and budget go.

They could be if you did not know what you were doing. Like I suspect the author of TFA did not know.

From TFA:

Where we once walked on tightropes every day doing basic server maintenance, we are now afforded nearly instant undo buttons, as snapshots of virtual servers allow us to roll back server updates and changes with a click.

If he's talking about a production system then he's an idiot.

If he's talking about a test system then what does it matter? The time spent running the tests was a lot longer than the time spent restoring a system if any of those tests failed.

And finally:

Within the course of a decade or so, we saw networking technology progress from 10Base-2 to 10Base-T, to 100Base-T to Gigabit Ethernet. Each leap required systemic changes in the data center and in the corporate network.

WTF is 10Base-2 doing there? I haven't seen that since the mid-90's. Meanwhile, every PC that I've seen in the last 10 years has had built-in gigabit Ethernet.

If he wants to talk about hardware then he needs to talk about things like Cisco Nexus. And even that is not "new".

And, as you pointed out, the PROGRAMMING aspects always lag way behind the physical aspects. And writing good code is as difficult today as it has ever been.

Re:If you didn't know what you were doing ... (2)

kevingolding2001 (590321) | about a month ago | (#47661497)

WTF is 10Base-2 doing there? I haven't seen that since the mid-90's.

That was probably the "or so" that came after the word 'decade'.

A "decade or so" could be taken to mean 2 of them, which puts it back in the mid 1990's.

I think you're just being Captain Pedantic, when all the GP was really trying to say was that things move pretty quickly in IT.

Re: If you didn't know what you were doing ... (0)

Anonymous Coward | about a month ago | (#47661643)

You mean you don't have infinite storage space for all those VM snapshots? And I thought it was just us....

Re: If you didn't know what you were doing ... (0)

Anonymous Coward | about a month ago | (#47662221)

And besides, rolling back a patch on a production system has for many years been as easy as booting from the mirror, so it's really nothing new. You did break the mirror before you patched, right?

Re:If you didn't know what you were doing ... (5, Insightful)

Iamthecheese (1264298) | about a month ago | (#47661697)

Writing good code isn't hard at all. What's hard is programming well when you're on the fifth "all hands on deck" rush job this year, you have two years of experience and no training because your company was too cheap to pay a decent wage or train you, a humiliating and useless performance review is just 'round the corner, and you doubt anything you type will end up in the final product. The problem is a widespread cultural one. When IT companies are willing to spend the time and money for consistent quality, that's when they'll start to put out quality products.

Re:If you didn't know what you were doing ... (3, Insightful)

See Attached (1269764) | about a month ago | (#47662335)

Agreed 100%. There is never time to do it right, but there is always time to do it over. Reviews that admit success but celebrate weakness are not positive experiences. There is another trend of third parties marketing infrastructure solutions to high-level management, skipping the local subject-matter experts. This triples the work we have to do. Change is fine, and embraced, but we are paid for something: provide stability and compliance in a rapidly evolving globalized environment.

Re:If you didn't know what you were doing ... (0)

Anonymous Coward | about a month ago | (#47662275)

WTF is 10Base-2 doing there? I haven't seen that since the mid-90's. Meanwhile, every PC that I've seen in the last 10 years has had built-in gigabit Ethernet.

That's his point. A lot of change over a decade or so (the '90s and early '00s) going through those standards. Now things are pretty much stable with Gigabit Ethernet.

Re:If you didn't know what you were doing ... (1)

nine-times (778537) | about a month ago | (#47662549)

Where we once walked on tightropes every day doing basic server maintenance, we are now afforded nearly instant undo buttons, as snapshots of virtual servers allow us to roll back server updates and changes with a click.

If he's talking about a production system then he's an idiot.

Why? Is it your contention that the work of sysadmins and support personnel has just been trouble-free for decades, and all the problems were caused by a sysadmin "not knowing what they were doing"?

Re:A rather simplistic hardware-centric view (1)

dbIII (701233) | about a month ago | (#47661253)

and instead bleats about how we can "roll back" servers with the click of a button in a virtual environment.

Meanwhile I'd like to be able to turn clusters into a virtual server instead of having to code specifically for clusters. Something like OpenMosix was starting to do before it imploded. Make several machines look like one big machine to applications designed to only run on single machines.

Re:A rather simplistic hardware-centric view (2)

Junta (36770) | about a month ago | (#47661855)

ScaleMP does what you request.

It's not exactly all warm and fuzzy. Things are much improved from the Mosix days in terms of having the right available data and kernel scheduling behaviors (largely thanks to the rise of NUMA architecture as the usual system design). However, there is a simple reality that the server-to-server interconnect is still massively higher latency and lower bandwidth than QPI or HyperTransport. So if a 'single system' application designed around assumptions of no-worse-than-QPI inter-process connectivity is executed, it still won't be that nice, and an application managing the messaging more explicitly will fare better.

But if you have to use an application that can do multi core but not multi node and force it to scale *somewhat*, ScaleMP can help things out significantly.

Re:A rather simplistic hardware-centric view (1)

dbIII (701233) | about a month ago | (#47662151)

Cool, I'll definitely look into that. I thought that line of thought had been abandoned.

Re:A rather simplistic hardware-centric view (4, Insightful)

SuricouRaven (1897204) | about a month ago | (#47661335)

Even software is slowing down, though. A lot of the commodity software reached the point of 'good enough' years ago - look how long it's taken to get away from XP, and still many organisations continue to use it. The same is true of office suites: most people don't use any feature not present in Office 95. Updating software has gone from an essential part of the life cycle to something that only needs to be done every five years, sometimes longer.

Around 2025 we will probably see a repeat of the XP situation as Microsoft tries desperately to get rid of the vast installed base of Windows 7, and organisations point out that what they have been using for the last decade works fine so they have no reason to upgrade.

Re:A rather simplistic hardware-centric view (1)

mysidia (191772) | about a month ago | (#47661383)

Updating software has gone from an essential part of the life cycle to something that only needs to be done every five years, sometimes longer.

Not if you actually care about security. Older software and operating systems prior to Windows 8 don't support newer, more effective attack-mitigation approaches such as ASLR.

Re: A rather simplistic hardware-centric view (1)

Anonymous Coward | about a month ago | (#47661523)

By then there'll be virus scanners in switches

Re:A rather simplistic hardware-centric view (3, Insightful)

SuricouRaven (1897204) | about a month ago | (#47661535)

I'm talking about new product major versions, not just patches.

The only reason many organisations are ditching XP right now is that MS stopped supplying updates. That isn't "Getting new software to further advance the organisation." That's more "Reluctantly going through the testing and training nightmare of a major deployment because Microsoft want to obsolete our otherwise-satisfactory existing software."

Re:A rather simplistic hardware-centric view (2)

Warbothong (905464) | about a month ago | (#47662395)

A lot of the commodity software reached the point of 'good enough' years ago - look how long it's taken to get away from XP, and still many organisations continue to use it.

I find it hard to believe that operating systems became "good enough" with Windows XP. Rather, Vista took so long to come out that it disrupted the established upgrade cycle. If the previous 2-to-3-year cycle had continued, Vista would have come out in 2003 (without as many changes, obviously), Windows 7 in 2005 and Windows 8 in 2007. We'd be on something like Windows 12 by now.

It's good that consumers are more aware and critical of forced obsolescence, but I don't agree with the "XP is good enough" crowd. It makes sense to want the latest (e.g. Windows 8); it makes sense to use something until it's no longer supported (e.g. Vista); it makes sense to use something that's "good enough" (e.g. Windows 95 for features, or 2000 for compatibility). XP is none of those: it's out of date, unsupported and a bloated resource hog.

Re:A rather simplistic hardware-centric view (0)

Anonymous Coward | about a month ago | (#47661507)

To be fair, it sounds like they're looking at the sysadmin/technician side of putting things together and running them, rather than the actual development side. It doesn't matter whether it's hardware or software; both can be divided that way. They're under the same umbrella but really two different areas entirely. It's like saying that a taxicab company and a car designer are in the same area. At some very high level, sure they are, but they're still in very different niches.

Re:A rather simplistic hardware-centric view (0)

Anonymous Coward | about a month ago | (#47661677)

So next up in the taming of the 'IT' West: the great turf grab by the robber-baron corporations, and the extermination of the indigenous populations (IT workers) by starvation and poverty. Then build a few libraries, with heavy DRM of course, to soothe your conscience.

It's the American way Baby!

Re:A rather simplistic hardware-centric view (0)

jellomizer (103300) | about a month ago | (#47661979)

Well beyond hardware, software reliability over the past few decades has shot right up.

Even Windows is very stable and secure. Over the past decade, I have actually seen more kernel panics from Linux than BSODs. We can keep servers running for months or years without a reboot. Our desktops, laptops, and even mobile devices now perform without crashing all the time, and we work without feeling the need to save to the hard drive and then back up to a floppy or other removable media every time.

What changes have happened since then on the software level?
1. Server-grade OSes for desktop/mobile devices. Windows XP onward uses the NT kernel, Macs and iOS use a Unix derivative, and Android and GNU/Linux are Linux-based systems. All of these OSes were designed for server use, with proper memory management and multitasking as well as SMP support, making a lot of those silly crashes of yesterday a thing of the past.

2. Understanding and prevention of buffer overflows. We have known about buffer overflows for a long time, but they were found to be a security issue in the late 1990s. So newer languages and updates to existing compilers are designed to try to prevent them. Plus, the OS now randomizes memory segments to help reduce the risk.

3. The try/catch (or try/except) construct. It is nearly impossible to break a complex program in every way during testing. The try/catch idea in modern languages, while many old-schoolers claim it leads to sloppy code, does attempt to deal with the fact that the world isn't perfect and people make mistakes, and it allows a clean way to exit your program or procedure on error conditions. That means your program will still run when things are not perfect, and once it quits, your data is in a clean state, preventing further corruption.

4. The rise of server-based programming. We go through cycles of who should do the work, the server or the end-user device. How often do you actually need to download a program any more? I bet most of you don't remember software stores back in the 1990s, where you had to buy a program to do everything. They were cheap $10 programs, or you could get a collection of shareware. But in general, everything you needed to do on your computer needed a program: if you wanted an electronic encyclopedia, you needed to get one and store it on your computer; if you wanted a program that had some forms and did some calculations, you needed a program. This created a situation where you had a lot of programs on your computer, with conflicting versions of DLLs/shared libraries. Today a lot of these small programs are done over the Web, on the server, with JavaScript to make the UI clean. That means there is less stuff running on your desktop that could conflict.

5. The rise of virtualized systems. If you had a server, you needed to put all your stuff on it, on the same OS, with a complex set of settings, which made it more prone to mistakes. Virtualization allows you to have a bunch of systems, each with custom settings, designed to do one job and do it well, versus one server doing many things.

6. The rise of interpreted and virtual-machine languages. Most stuff done today isn't compiled straight to machine code; it compiles to virtual machine code (Java/.NET) or is interpreted at run time (Python/Ruby/PHP). That gives the developer separation from the hardware. Yes, it slows things down, but it also prevents a lot of the oddities that happen when you build a program on a 32-bit vs. a 64-bit OS, or even the minor differences between a Core i5 and a Core i7.

7. The fall of easy-to-use languages. Back in the old days, there were a lot of languages like FoxPro and Visual Basic (not the .NET one) which were designed to be used by non-programmers. These types of languages are not being used as much any more, which means the programming is happening with people who know how to program, not just people who know how to use a computer. So the programs written in these harder languages (by hard, meaning there is some design and thought behind them) are designed better and not as cobbled together.

8. Inexpensive databases. Back in the old days, a relational database cost thousands of dollars or more, too much for most people's uses. Now with MySQL, Microsoft SQL Server, PostgreSQL, and the others, adding a relational database to your program isn't going to break the bank, and you have a tool that allows for better collection, organization, and retrieval of data.

9. Google search and widely available broadband internet. Back in the old days of coding, you had to use your reference book or the help built into the software. However, most programmers write programs that do something different from what the compiler makers had in mind, so you needed to figure out a workaround. Today, if you hit something that seems overly complex, you do a Google search and can usually find a better way to do it. The old way, you just did it the bad way; it worked, so you kept it.

10. Fast processors, lots of RAM. In the old days, everything needed to be optimized. That added time to your programming and opened your code to a lot of bugs, because you needed to strip out as many checks as you could get away with. You also needed to keep an eye on the RAM you were using and write methods to store and retrieve data over and over again. Today you wouldn't think twice about storing a few megs of data in RAM, processing it, then just saving the output to disk. Easier code, fewer errors.
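Point 3 can be made concrete with a small Python sketch (the record store and its validation rule are hypothetical): the handler restores a clean state instead of leaving half-applied data behind.

```python
def transform(value):
    # Hypothetical validation step: only ints are acceptable input.
    if not isinstance(value, int):
        raise TypeError("expected an int")
    return value * 2

def update_record(store, key, value):
    """Apply an update; on failure, roll back so the data stays clean."""
    backup = dict(store)
    try:
        store[key] = transform(value)   # may raise on bad input
    except TypeError:
        store.clear()
        store.update(backup)            # restore the pre-update state
        raise

records = {"a": 1}
try:
    update_record(records, "b", "oops")
except TypeError:
    pass
print(records)  # {'a': 1} -- the failed update left no trace
```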
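Point 6, about 32-bit vs. 64-bit oddities, can be illustrated in Python, whose integers are arbitrary-precision regardless of platform; `ctypes` shows what a fixed-width C int would have done with the same arithmetic:

```python
import ctypes

# A C int32 silently wraps past 2**31 - 1 (ctypes does no overflow checking),
# while a Python int just keeps growing, so the same arithmetic behaves
# identically on 32- and 64-bit builds.
wrapped = ctypes.c_int32(2**31).value   # wraps to -2147483648
exact = 2**31                           # stays 2147483648

print(wrapped, exact)
```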
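Point 8 is even cheaper today than the post suggests: SQLite ships in Python's standard library, so a relational store costs one import (the schema here is a made-up example):

```python
import sqlite3

# An in-memory relational database: no server process, no license fee.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO users (name) VALUES (?)", [("grace",), ("ada",)])

rows = conn.execute("SELECT name FROM users ORDER BY name").fetchall()
print(rows)  # [('ada',), ('grace',)]
```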
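Point 10 in practice: a deduplication job that older machines would have pushed through temp files on disk now fits comfortably in RAM (the data here is synthetic):

```python
# 100,000 synthetic records held entirely in memory; no temp files,
# no hand-rolled spill-to-disk machinery.
records = [f"user{i % 1000}" for i in range(100_000)]
unique = sorted(set(records))   # dedupe in RAM in one pass

print(len(records), "records,", len(unique), "unique")
```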

Re:A rather simplistic hardware-centric view (1)

Junta (36770) | about a month ago | (#47662233)

Software reliability over the past few decades has shot right up.

I think this is a questionable premise.

1. Accurate, though it has been accurate for over a decade now.
2. Things have improved security-wise, but reliability, I think, could be another matter. When things go off the rails, it's now less likely to let an adversary take advantage of that circumstance.

3. Try/catch is a potent tool (depending on the implementation it can come at a cost), but the same things that caused segmentation faults with a serviceable stack trace in a core file now cause uncaught exceptions with a serviceable stack trace. It does make it easier to write code that tolerates some unexpected circumstances, but ultimately you still have to plan application state carefully or else be unable to meaningfully continue after the code has bombed. This is something that continues to elude a lot of development.
4. Actually, the pendulum has swung back again in the handheld space to 'apps'. In the browser world, you've traded 'DLL hell' for browser hell. DLL hell is a sin of Microsoft's, for not having a reasonable packaging infrastructure to help manage that circumstance better. In any event, now a server application crash, a client crash, *or* a communication interruption can ruin the application experience, instead of just one.

5. Virtualized systems I don't think have improved software reliability much. They have in some ways made certain administration tasks easier and enabled better hardware consolidation, but that comes at a cost. I've seen more and more application vendors get lazy and just furnish a 'virtual appliance' rather than an application. When the bundled OS requires security updates, the update process is frequently hellish or outright forbidden. You need to update OpenSSL in their Linux image, but other than that things are good? Tough: you need to go to version N+1 of their application and deal with API breakage and such, just because you dared to want a security update for a relatively tiny portion of their platform.

6. I think there's some truth in it, but 32 vs. 64 bit does still rear its head in these languages, particularly since there are a lot of performance-related libraries written in C for many of those runtimes.

7. This seems to contradict the point above. Python pretty well fits that description.

8. This has also had a downside: people jumping to SQL when it doesn't make much sense. Things with extraordinarily simple data to manage jump to 'put it in SQL' pretty quickly. Some of the 'NoSQL' sensibilities have brought some sanity in some cases, but in others they have replaced one overused tool with another equally high-maintenance beast.

9. True enough. There is some signal/noise issue but better than nothing at all.

I think a big issue is that at the application layer there has been more and more pressure for rapid delivery and iteration, with a false sense of security from unit tests (which are good, but not *as* good as some people feel). Stable branches that take bugfixes only are rarer now, and more and more users are expected to ride the wave of interface and functional changes if they want bugs fixed at all. 'Good enough' is the mantra of a lot of application development; if a user has to restart or delete all configuration before restarting, oh well, they can cope.

Re:A rather simplistic hardware-centric view (0)

Anonymous Coward | about a month ago | (#47663097)

You can patch Windows without a reboot? TLDR the rest.

Re:A rather simplistic hardware-centric view (1)

nine-times (778537) | about a month ago | (#47662483)

The article is a rather simplistic hardware-centric viewpoint. It doesn't even begin to touch on the areas where IT has always struggled: design, coding, debugging, and deployment. Instead it completely ignores the issue of software development, and instead bleats about how we can "roll back" servers with the click of a button in a virtual environment.

And now is when we have a long and stupid debate as to whether the term "IT" signifies a grouping of all computer-related work including development, or whether it's limited to workstation/server/network design, deployment, and support. And we go on with this debate for a long time, becoming increasingly irate, arguing about whether developers or sysadmins do more of the 'real' work, and...

Let's just skip to the end and agree that, regardless of whether IT 'really' includes software development, it's pretty clear that the author didn't have software development in mind when he wrote the article.

Re:A rather simplistic hardware-centric view (2)

jon3k (691256) | about a month ago | (#47663057)

I kind of agree with TFA here -- hear me out. We went through a pretty fundamental shift in the datacenter over the last 10 years or so, and it's finally settling down. Of course there will be constant evolutionary progressions, updates, patches, etc., but we're basically done totally reinventing the datacenter. 10GbE, virtualization, the rise of SANs and converged data/storage, along with public/private/hybrid clouds - these huge transformative shifts have mostly happened already and we're settling into this new architecture. That's not to say there won't be patches and upgrades, but fundamentally, from an architectural design perspective, things have settled down in the datacenter. The major pieces and components are pretty much set at this point, and now we'll just continue to innovate on top of this infrastructure.

So say the useless corporate desk jockey (0)

Anonymous Coward | about a month ago | (#47660827)

So say we all!

Time for privacy? (0)

Anonymous Coward | about a month ago | (#47660829)

We have built the darknets and their hidden services. We have the cryptocurrencies and the second/third generation ones are scaling much better than bitcoin. Maybe we can settle into the modern world a bit and fix some of the totally broken/stupid shit now? Credit cards, server side password auth (At least use SRP if you still want passwords), dozens (or more) certificate authorities fully trusted (if you thought a single point of failure was bad, how about a hundred points of failure?), email, IPv4, using third parties to host unencrypted private group communication (facebook and others) etc.

There is so much stuff that is clearly shitty from a security/privacy perspective, and not very hard to fix. Maybe we will have time to fix some of this now?

Not paying attention? (5, Interesting)

felixrising (1135205) | about a month ago | (#47660847)

I assume you are talking about the hardware... because once you have a "private cloud", the next step is moving away from setting up servers and configuring applications manually, and getting into full-on DevOps-style dynamically scaling virtual workloads that are completely stood up and torn down (the VMs and their applications, the network configuration including "micro networks", and the firewall rules) according to the demands of the customers accessing the systems... those same workloads can move anywhere from your own infrastructure to leased private infrastructure to public infrastructure without any input from you... of course, none of this is new... but it's certainly a paradigm shift in the way we manage and view our infrastructure... hardly something static or settled. Really this is a fast-moving area that is hard to keep up with.

8k video? (1)

hooiberg (1789158) | about a month ago | (#47660889)

As soon as we have 8k video commonly available, which could be as soon as 2020, if Japan gets to host the Olympic Games, we will run out of storage, out of bandwidth, and there is not even a standard for an optical disc that can hold the data, at the moment. So our period of rest will not be too long.
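For a sense of scale, here's a minimal back-of-envelope sketch of those storage and bandwidth numbers; the format parameters (8K at 60 fps, 10-bit 4:2:0) and the ~200:1 codec ratio are illustrative assumptions, not figures from the comment or any broadcast plan:

```python
# Back-of-envelope numbers for 8K video. Format parameters and the
# compression ratio are assumptions chosen for illustration.
W, H, FPS, BIT_DEPTH = 7680, 4320, 60, 10
CHROMA_FACTOR = 1.5            # 4:2:0 subsampling: 1 luma + 0.5 chroma samples/pixel

raw_bps = W * H * CHROMA_FACTOR * BIT_DEPTH * FPS
movie_raw_tb = raw_bps * 2 * 3600 / 8 / 1e12       # 2-hour film, uncompressed
movie_comp_gb = movie_raw_tb * 1e12 / 200 / 1e9    # assume ~200:1 compression

print(f"raw stream:      {raw_bps / 1e9:.1f} Gbit/s")   # ~29.9 Gbit/s
print(f"2h uncompressed: {movie_raw_tb:.1f} TB")         # ~26.9 TB
print(f"2h compressed:   {movie_comp_gb:.0f} GB")        # ~134 GB
```

Even under generous compression assumptions, a 2-hour film lands north of a 100 GB BDXL disc, which is the commenter's point about the missing optical format.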

Re:8k video? (1)

Anonymous Coward | about a month ago | (#47661015)

By then we will have the infamous Sloot Digital Coding System that will encode an entire movie down to 8KB so what are you so worried about?

Re:8k video? (1)

SuricouRaven (1897204) | about a month ago | (#47661353)

The only problem there is that it is, for most purposes, pointless. Most people would be hard pressed to tell 720p from 1080p on their large TV under normal viewing conditions. What we see really is a placebo effect, similar to the one that plagues audiophile judgement: When you've paid a heap of cash for something, it's going to sound subjectively better.

Re:8k video? (1)

hooiberg (1789158) | about a month ago | (#47661485)

Personally, when reading PDF articles, the difference between 1080p and 4K makes a world of difference. The aliasing is almost invisible in 4K. Also, thin diagonal lines in plots are much more clearly defined. Because this difference is so large, I have little doubt that 8K will also make a difference, although maybe not as much as going from 2K to 4K. I have not yet had the pleasure of comparing video on 1080p and 4K.

Re:8k video? (1)

Bing Tsher E (943915) | about a month ago | (#47662103)

It's easy to tell the guys from IT. Everybody else reads the PDF files for their content. The IT guy looks for artifacts in the font rendering.

Re:8k video? (1)

Lumpy (12016) | about a month ago | (#47661831)

Exactly, yet you will have a TON of people claiming they can tell the difference. In reality they cannot.

99% of the people out there sit too far away from their TV to see 1080p, unless they have a 60" or larger set and sit within 8 feet of it. The people that have the TV above the fireplace and sit 12 feet away might as well have a standard-def set.

But the same people that CLAIM they can see 1080p from 10 feet away also claim that their $120 HDMI cables give them a clearer picture.
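The viewing-distance claim is easy to sanity-check; here's a rough sketch, assuming 20/20 vision resolves detail down to about 1 arcminute (the acuity figure is the usual rule of thumb, not something from the comment):

```python
import math

# Rough sanity check of the viewing-distance claim above.
# Assumption: 20/20 vision resolves detail down to ~1 arcminute.
ARCMIN = math.radians(1 / 60)

def max_useful_distance_ft(diagonal_in, horizontal_px=1920, aspect=(16, 9)):
    """Farthest distance (feet) at which one pixel still spans ~1 arcminute."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # screen width from diagonal
    pixel_in = width_in / horizontal_px
    return pixel_in / math.tan(ARCMIN) / 12         # small angle; inches -> feet

for size in (40, 60, 80):
    print(f'{size}" 1080p: pixels resolvable out to ~{max_useful_distance_ft(size):.1f} ft')
```

A 60-inch 1080p set comes out to roughly 8 feet, which matches the parent's number; from 12 feet away you'd need a screen north of 90 inches before the extra pixels are visible at all.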

INFOWORLD - the E-RAG that tells you how to get (-1)

Anonymous Coward | about a month ago | (#47660891)

a man in 3 easy steps. Please, SAY NO! to re-ragging infoworld. If we want its drivel we can pick it up at the checkout stand.

"10 things you need to know about Microsoft's Surface Pro 3 and the ways it will make you better"

"The 3 ways the Internet of things will unfold and how you will make it happen"

"Passwords aren't the problem -- we are"

"Apple's Swift falls back to earth after initial surge but will it matter to you"

"Caution: Cloud brokers may not deliver what you expect but will make you want it even more"

"Quick guide: Which freaking database should I use before I die?"

"Get real: Oracle is strengthening -- not killing -- Java (Get it now and live your life)"

So much titillation, but sans Stern's tits. It cannot be taken seriously. Seriously.

Good lord (0)

ruir (2709173) | about a month ago | (#47660905)

BS has arrived on Slashdot.

Re:Good lord (0, Offtopic)

Anonymous Coward | about a month ago | (#47661021)

You must be new here.

Absolutely (1)

schreiend (1092383) | about a month ago | (#47660953)

In recent decades we've been eyewitnesses to revolutionary breakthroughs in fields such as energy, transportation, healthcare, and the space industry, to name a few. The technologies that emerged are nowadays pretty much ubiquitous and impossible to go without in day-to-day life. Yet the hardware side of the IT industry is stuck with Moore's law and silicon, and there's even an embarrassing retreat to functional programming on the software side.

Is this guy an ostrich? (0)

Anonymous Coward | about a month ago | (#47661023)

Because his head is in the sand! DEEP!

Good (1)

penguinoid (724646) | about a month ago | (#47661079)

Now standardize all your password requirements to a strength-based system without arbitrary restrictions or requirements, and standardize your forms' metadata so that they can be auto-completed or intelligently-suggested based on information entered previously on a different website. Trust me, this sort of refinement will be greatly appreciated.
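A minimal sketch of the strength-based idea: score passwords by estimated entropy rather than arbitrary composition rules. The pool sizes and the 60-bit threshold below are illustrative assumptions, not any standard:

```python
import math

# Entropy-based password check: no "must contain one digit and one symbol"
# rules, just an estimate of the search space. Pool sizes and the 60-bit
# acceptance threshold are illustrative assumptions.
def estimated_entropy_bits(password: str) -> float:
    pool = 0
    if any(c.islower() for c in password): pool += 26
    if any(c.isupper() for c in password): pool += 26
    if any(c.isdigit() for c in password): pool += 10
    if any(not c.isalnum() for c in password): pool += 33  # symbols, spaces, etc.
    return len(password) * math.log2(pool) if pool else 0.0

def acceptable(password: str, min_bits: float = 60.0) -> bool:
    return estimated_entropy_bits(password) >= min_bits

# A long all-lowercase passphrase passes; a short "complex" one does not.
print(acceptable("correct horse battery staple"))  # True
print(acceptable("Tr0ub4d&3"))                     # False
```

The nice property is exactly the one the parent asks for: a long, memorable passphrase sails through, while a short string that technically satisfies every checkbox rule is rejected.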

Re:Good (0)

Anonymous Coward | about a month ago | (#47661345)

standardize your forms' metadata so that they can be auto-completed or intelligently-suggested based on information entered

I dig it! Regards, A Spammer

What about security (1)

andrew71 (134546) | about a month ago | (#47661091)

I don't subscribe to this rose-tinted point of view, especially if you look at all this beautiful tech from the security standpoint.
Most of the tech we deal with today was originally designed without security concerns. In most cases, security is an afterthought.
So much for sitting back and taking a break.

Talk for yourself. (0)

Anonymous Coward | about a month ago | (#47661093)

>Now that the technologies behind our servers and networks have stabilized,

Talk for yourself. I know since kinder garden that the bugs of today are the work of tomorrow !

Re:Talk for yourself. (0)

Anonymous Coward | about a month ago | (#47661351)

kinder garden

Your spelling has a bug (two bugs, actually) which I doubt will be fixed by a future job.

So much for your theory, dear French AC <whitespace padding> !

Ah yes (1)

rrohbeck (944847) | about a month ago | (#47661117)

Moore's law has run out of steam. Yay!

Re:Ah yes (1)

coofercat (719737) | about a month ago | (#47661763)

I think it's more about the end of the MHz wars. Nowadays, to get more power, you add more cores. If you can't do that, you add more boxes.

If you've got a single threaded million instruction blob of code, it's not executing very much faster today than it was a few years ago. If you're able to break it into a dozen pieces, then you can execute it faster and cheaper now than you could a few years ago, though.
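That split can be sketched in a few lines; the work function and chunk layout here are made up purely for illustration:

```python
# Toy illustration of the point above: one big serial job vs. the same
# work carved into independent pieces, one per core.
from concurrent.futures import ProcessPoolExecutor

def crunch(chunk):
    # stand-in for one piece of a CPU-bound "blob of code"
    return sum(i * i for i in chunk)

def serial(n):
    return crunch(range(n))

def parallel(n, workers=4):
    step = n // workers
    chunks = [range(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = range((workers - 1) * step, n)  # last chunk absorbs the remainder
    with ProcessPoolExecutor(max_workers=workers) as ex:
        return sum(ex.map(crunch, chunks))

if __name__ == "__main__":
    assert serial(2_000_000) == parallel(2_000_000)
```

Same answer either way; the parallel version only wins once each chunk is big enough to amortize process startup and pickling overhead, which is exactly the "if you're able to break it into pieces" caveat.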

Moore's law hasn't really run out of steam; it's more that its rules have changed a bit - the raw power may be going up and getting cheaper, but the way to use it all has changed.

Back on topic, I'd say TFA is roughly right - the data centre isn't going through mainframe/big iron/commodity hardware changes any longer. Things are getting refined and improved, but the major shifts in approach seem to be coming to an end.

As others above have mentioned, there's still plenty going on in the world of coding/testing/deploying. In some sense, stabilising the physical kit gives us room to think about those things in more detail.

Re:Ah yes (1)

Lumpy (12016) | about a month ago | (#47661837)

You can have 256 cores at 500MHz and a 4-core 5GHz system will still be a lot snappier and faster, because most of what is used for computing does not scale to multi-core easily.

I will take a 2-core 5GHz system over a 4-core 3GHz system any day.

Wrong (2, Insightful)

idji (984038) | about a month ago | (#47661217)

No, you IT people are no longer the great revolutionists - your time is gone. You are now just plumbers, who need to fix the infrastructure when it are broken. Other than that, we don't want to hear from you, and we certainly don't want your veto on our business decisions - that is why a lot of us business people use the cloud, because the cloud doesn't say "can't work, takes X months, and I need X M$ to set it up", but is running tomorrow out of operational budget.

Re:Wrong (2)

ruir (2709173) | about a month ago | (#47661315)

Sure, the cloud runs on gremlins, fuck yeah. I guess you also don't care what your mechanic says and use the "garage", and don't care what your dentist says and go there once every 5 years. If you do not care what professionals advise you, you are an idiot and do not deserve competent people working for/with you. Douche bag.

Re:Wrong (1)

Bing Tsher E (943915) | about a month ago | (#47661991)

With the increased reliability of modern cars, people do make fewer trips to the garage. So it's quite possible a car won't see the garage more than once every 5 years.

I guess the same fact being true for IT really bugs you. The IT drones where I work are right now in a tizzy because the corporate IT people in Mexico are taking over. Because they can, and it saves a lot of $$, and also because the local fucks just aren't needed much anymore. There's no need for a guy to clean the lint out; all the mice are optical now. Telecommuting so you can spend your days feeding your horses and only come into the office when you've got a full truckload of ag waste to pitch into the office dumpster is over. Enough of the stuff is automated.

We can feel your anger, but ya better figure out how you're going to provide value. The days when Admin=God are over. Go ahead and dig nostalgically through your archives of BOFH stories, but it's ovuh.

Re:Wrong (1)

ruir (2709173) | about a month ago | (#47662039)

Give me a car that is in the garage only once in 5 years, and I won't mind paying the price tag. There are IT drones/fucks and then there are IT people, and guess what, I do not belong to the former. There are fucks in every other job too, yours included: people that get all cozy, did not have the vision to get ahead, keep up with the times, and get away from the menial tasks. But guess what, the stuff does not automate itself and does not run alone. And when it seems to run alone, it is because people are doing their job. And yep, I am one of the guys that runs the "cloud" and "automates" things. I am neither angered nor worried, but I don't have the patience to answer clowns like you who think things run by themselves and do not respect a whole profession, out of envy because they do a real job while you probably only produce reports and do your yes-man thing to climb the corporate ladder. And yes, I can work anywhere in the world telecommuting, and I am so wanted in the market that when I run into jerks like the posters above, I have the luxury of finding a better job in a short time.

Re:Wrong (0)

Bing Tsher E (943915) | about a month ago | (#47662091)

You are right to be so nervous and defensive.

Re:Wrong (1)

ruir (2709173) | about a month ago | (#47662341)

I am no more defensive than you are a big idiot. This has been an ongoing trend for ages... People think they can get by without the people who keep things working, and then, depending on the quality of the previous team's work, things go down after one year, two at most, and they spend much more on external firms/consultants fixing the "malfunction". I have already fixed a couple of places and earned a lot of money thanks to idiots with prejudice and a lack of vision like you. And no, I don't even touch hardware or "toners", I have minions for that. Double idiot.

Re:Wrong (0)

Anonymous Coward | about a month ago | (#47663231)

sour grapes???

Re:Wrong (0)

Anonymous Coward | about a month ago | (#47661375)

Fucking plumbers... they say it will take 3 months to run the pipes, and they won't let me use cardboard pipes and run them wherever I want along the bottom of the wall. They think they are special; I just need them when my house has more water inside it than the local river. The role of IT is not to fix things when they are broken, idiot.

Re:Wrong (0)

Bing Tsher E (943915) | about a month ago | (#47662025)

You're the data janitors. Get used to it. We don't need shiny new handles on the file cabinets every year.

Re:Wrong (2)

fisted (2295862) | about a month ago | (#47661387)

So essentially you're saying that you, as a technically illiterate person, don't give a crap about the opinion of your sysadmin in technical questions.
Oh, wait, you've already mentioned you're a business person. Enjoy your Dunning-Kruger while it lasts.

need to fix the infrastructure when it are broken.

Shall we fix your understanding of the English language while we're at it? Or would that be too mission-critical a business decision?

Re:Wrong (1)

ruir (2709173) | about a month ago | (#47661425)

No problem with that; we also think he is an even bigger and more useless idiot, and will move to better places ASAP.

Re:Wrong (0)

Bing Tsher E (943915) | about a month ago | (#47662007)

Quit fucking around and go change the toner in the LJ5 in finance. Chop-chop, IT boy.

Re:Wrong (1)

Anonymous Coward | about a month ago | (#47662261)

*notes down username*
Sure thing, massa! Nice private Emails and interesting browsing behaviour over there.
How much to keep my mouth shut? Looks like I might earn more $$ than you, after all, dear BA boy.

Oh BTW, the toner is replaced -- oh, look, it's already printing. Wait, why do you...?
Oh, neat. Looks like you're about to be fired.

Re:Wrong (0)

Anonymous Coward | about a month ago | (#47662405)

Quick, I need to print the photo of me sucking the cock of the CFO because it is the only way I know of getting promoted.

Re: Wrong (0)

Anonymous Coward | about a month ago | (#47661703)

Yeah, some days I feel like a plumber or electrician. As network geeks, we run required infrastructure that will shut down the business if it's not done right.

If we do it well, nobody notices us until a pipe breaks, and then they debate our cost to the organization. At the same time, we're constantly replacing old stuff and enlarging it to meet increased demands.

Parts of IT might be considered skilled trades, but ignore solid advice at your own peril. We have made great inroads in our organization, when they pull their heads out of *ahem* the sand and actually listen. We are not denying the business needs; we're pointing out the substantial tension between business goals--reliability, cost (TCO), and whiz-bang features. We (usually) are trying to give a competent risk analysis, not protect little fiefdoms or opinions.

It's all happy savings and fun until the remote cloud goes down.

Re:Wrong (2)

Lumpy (12016) | about a month ago | (#47661841)

Awesome, we need to join the plumbers union and start getting $125 an hour then. Thanks for your support!

Re:Wrong (0)

Bing Tsher E (943915) | about a month ago | (#47662021)

Right. You also need to start wearing a shirt with your name embroidered on it. Welcome to the working week.

Re:Wrong (0)

Anonymous Coward | about a month ago | (#47661877)

No, you mechanics people are no longer the great revolutionists - your time is gone. You are now just plumbers, who need to fix the car when it are broken. Other than that, we don't want to hear from you, and we certainly don't want your veto on our car - that is why a lot of us pilot racers use the garage, because the garage doesn't say "can't work, takes X months, and I need X M$ to set it up", but is running tomorrow out of operational budget.

Re:Wrong (-1)

Anonymous Coward | about a month ago | (#47661923)

Hey dipshit, I am an IT guy. I was a manager in several organizations. I put together two ISPs. I probably have more money in the bank than you will ever dream of. I have a hot wife, whereas you probably suck dicks as a pastime to advance your career. I advanced my career on my own merit, and not because of family or acquaintances (which I suspect is far more than you can say). And finally, if you sorry excuse for a person can manage the courage to say to my face that my 15 years of experience in IT doesn't count for how to run your IT operations, that you know better and we don't know shit, it will be a pleasure being fired after beating your stupid arse.

Re:Wrong (0)

Bing Tsher E (943915) | about a month ago | (#47662059)

Let me guess: you've not taken a writing class since junior high school. You haven't had to, because you got a job in 8th grade changing out floppy drives (a phillips screwdriver is STILL the only physical tool you really know how to use) and it's been all uphill since then (graduating from HS was for the dumbfucks who never figured it out!)

Your 'hot wife' gives blow jobs to anybody with enough bling. Didn't you know that about her?

Oh lookie... (4, Informative)

roger10-4 (3654435) | about a month ago | (#47661227)

Another submission of a superficial article from snydeq to drum up traffic for InfoWorld.

Re:Oh lookie... (1)

ruir (2709173) | about a month ago | (#47661317)

Why was this modded down? for being the truth?

These are the sorts of things stupid people say... (1)

Anonymous Coward | about a month ago | (#47661275)

...right before the next, undreamed-of computing revolution knocks everyone on their ass.

Re:These are the sorts of things stupid people say (0)

Anonymous Coward | about a month ago | (#47662017)

I think that was the point of the OP... that kind of "big thing in IT" hasn't happened since... well... the mid-'90s, when everyone and their pet went onto the Internet. And pure technology hasn't had a HUGE leap in a generation (perhaps "everyone getting a video card" was a big one... but I doubt it).

And you know what, even *then* there were more personal websites and more development than there is today. How many people do you know who set up their own website these days? Back then, it was just about every techy... now it's... Facebook profiles... some progress!

Re:These are the sorts of things stupid people say (1)

NemoinSpace (1118137) | about a month ago | (#47662303)

Don't be too hard on the guy. This is why evolution invokes death. Just about the time you get it all figured out and decide it's not worth going in and punching your time clock, nature punches yours.

Looking at the past... (4, Insightful)

CaptainOfSpray (1229754) | about a month ago | (#47661319)

...a dinosaur in the last days before the meteor. The future is over there in the Makerspaces, where 3D printing, embedded stuff, robotics, CNC machines, and homebrew PCBs at dirt-cheap prices are happening. It's all growing like weeds, crossing the boundaries between all disciplines including art, and is an essential precursor to the next Industrial Revolution, in which you and your giant installations will be completely bypassed.

You, sir, are a buggy-whip manufacturer (as well as a dinosaur).

Re:Looking at the past... (2)

Bite The Pillow (3087109) | about a month ago | (#47661695)

Seems like you are talking about general computing and related applications of computing. This guy is talking about business IT, which is a tiny subset of computing.

It would not be inappropriate to mod parent off topic for posting a generalist reply to something written for a specific audience.

Re:Looking at the past... (2)

CaptainOfSpray (1229754) | about a month ago | (#47661721)

Not off-topic... TFA is claiming to know where "the next IT revolution" is coming from, and I'm saying he is looking in exactly the wrong direction.

"the technologies...have stabilized" (1)

Anonymous Coward | about a month ago | (#47661667)

What the -- ?? "the technologies behind our servers and networks have stabilized" -- when did this happen? I'm not a datacenter person, but isn't the world filled with competing cloud providers with different APIs, and things like OpenStack? Did all this stuff settle down while I wasn't paying attention?

Another layer has solidified (2)

seoras (147590) | about a month ago | (#47661717)

I think that would be a better way of looking at what this article is on about.
Back in the late 80s/early 90s, when I graduated and started my career in the networking industry, the OSI 7-layer model was often referred to. You don't hear it mentioned much these days.
If you apply IT history and economics to it, you'll find that each of those layers saw a period of fantastic growth and innovation (a few short years) before becoming an IT commodity with little value left to reap, but at the same time becoming stable and allowing growth and innovation in the next layer up.
Cisco, the once darling of Wall Street, benefited from the growth and innovation in layers 3 to 5.
All 7 layers are now stable and "complete"; there's no growth value left in them, and Cisco, to continue the example, struggles where it once printed money.
I'd like to see someone attempt to define layers 8 through 12, with an attempt at extrapolating into the future with layers 13 and above.
On a related topic, I've been reading a lot of articles about the hardships of making money as an independent app developer.
It occurs to me, taking this layered view of the economics of IT, that perhaps software itself has its best days behind it.
That, in fact, to find value as a lone developer, or even as a company, software is just a commodity now which should be free, with the money coming from the services you sell on top of it, or a few layers above.
How long until machines program themselves after a short interview with their human "client" as to their requirements (layer 13)?

Another layer has solidified (0)

Anonymous Coward | about a month ago | (#47662575)

Why stop at defining layers 8 through 12? We need at least 42 layers for a new OSI model to truly be complete.

um (2)

Charliemopps (1157495) | about a month ago | (#47661769)

Now that the technologies behind our servers and networks have stabilized, IT can look forward to a different kind of constant change, writes Paul Venezia.

I don't think Paul Venezia works in IT.

I for one am enjoying our new quiet. (2)

nimbius (983462) | about a month ago | (#47661795)

As a senior engineer, I'm glad to get some downtime before the "next revolution." I certainly haven't had to patch any hacks or bugs related to our transcontinental wonkavator. This week I've done nothing but drink pina coladas and enjoy a long vacation instead of worrying about vendor lock-in and incompatibility, which as we all know was solved during the IT Revolution(c). Thanks to the IT revolution (and especially the cloud) I've had plenty of time to spend with friends playing my favourite games, which in no way were encumbered by a lack of reliable infrastructure to play them on (thanks again, IT Revolution!). Technologies used in the corporate data center like DRAC and EFI PXE have worked so well that I don't even have to worry about security vulnerabilities or bugs. Gone are the days when disk and RAM shortages were commonplace, as are the days when disks were specifically coded to certain vendors and controllers.

Like the railroad, its about control (1)

Gothmolly (148874) | about a month ago | (#47661799)

Translation: Bandwidth and ubiquitous connectivity, along with a generation trained to have no privacy are in place. Let the police state begin.

If you think things like rural electrification are about helping people, you have your head in the sand.

Transform the Wild West (0)

Anonymous Coward | about a month ago | (#47661865)

So, IT can commit genocide? Awesome.

IT Revolution? (1)

Anonymous Coward | about a month ago | (#47662023)

At the risk of pissing off some folks, I must say I've worked in IT since before it was called IT, and I can honestly say no revolutions will come from that area. After all, IT isn't known for its innovative, R&D atmosphere. IT is the result of cramming middle management, contractors, and novice-to-mediocre developers together in cubicles. Sure, it's a steady paying job, which is why most of us do it. The revolutionary stuff will continue to come from those who have the luxury of choosing not to be part of the corporate IT scene. It will then be discussed and consumed by "IT Professionals" around the country. The positive outcomes of such consumption will be published ad nauseam in various tech rags in such a way as to make the choice to implement a revolutionary idea the same as creating it.

I know such a rant makes me sound like some bitter developer who feels he didn't get enough credit for something he developed, but that's not really the case. I've just been in IT a long time. Why not call a fish a fish?

IT revolution is really just a general IT adoption of a (hopefully) revolutionary idea. But yeah, IT revolution is easier to say.

Security (0)

Anonymous Coward | about a month ago | (#47662139)

That is going to be the next big problem for IT. Especially with iPhones and Android phones on corporate networks.

Quiet != stasis (1)

Livius (318358) | about a month ago | (#47662417)

The ongoing technology churn we've seen in the last decade is *not* a feature of a revolution in progress that may be coming to an end; it's a reflection of stagnation in technology, without the ideal data centre technology (at least in terms of software) having achieved any kind of dominance. There's been an endless parade of new web technologies, none of which is more than an ugly hack on HTML. Websites are better than they were twenty years ago, but certainly not 20 years' worth of progress better.

Nothing has stopped or stablized (3, Interesting)

labradort (220776) | about a month ago | (#47663267)

The concept is false. Things have changed in how they break and what we are concerned about on a daily basis. 10 years ago I didn't have compromised accounts to worry about every day. But I did spend more time dealing with hard drive failure and recovery. We are still busy with new problems and can't just walk off and let the systems handle it.

If you believe IT is like running your Android device, then yes, there is little to be done other than pick your apps and click away. If you have some security awareness, you know there is much going on to be concerned about. When the maker of a leading anti-virus product declares AV detection dead, it is time to be proactive about looking at the problem. Too many IT folk believe that if there is malware, it will announce itself. Good luck with that assumption.
