Comments


You Are Not Mark Zuckerberg, So Stay In School

FallLine Re:Ideals and reality (438 comments)

What you are talking about in terms of running a small business is risky. It is risky because when Best Buy's Geek Squad, or any large local operator, sees you as a threat, it won't be long before you are eliminated one way or another. Sometimes it's possible to operate "under the radar", but the fact that anyone would need to do so is a clear indication that one already knows there are risks of being destroyed by larger, predatory businesses.

I beg to differ. No doubt being an entrepreneur is risky and requires a lot of hard work (in most cases), but the overwhelming majority that fail do so for reasons that have little to do with any kind of direct response from the competition (let alone anything unethical), e.g., insufficient capitalization, poor financial management, ill-conceived products/services, bad execution/implementation, etc.

Big companies have a lot more capital and resources to throw at problems, but the reality is that they are often slow to respond to anything and, when they do "respond", their response is based on group-think and a fundamental lack of understanding of the marketplaces in which they operate. These big companies remain successful because they have inertia and a lot of capital to acquire younger companies, not because they are effective at killing the competition or finding new areas for growth.

I have seen this time and time again, first hand as an entrepreneur and coming from a family of entrepreneurs, many of whom have gone head to head against some of the largest companies in this country and become the dominant player in their chosen market, or at least carved out a strong niche for themselves, allowing their many stakeholders (including employees) to profit handsomely, without any of the sorts of shady behavior you imply.

I have known at least a few instances where the founders actually cheered when their primary competition was acquired by a multi-billion-dollar corporation, out of a strong conviction that these big companies, even though known to play a little dirty at times and able to spend more on things like marketing, lack the discipline to be as effective a competitor as the smaller organization they absorbed.

To reiterate, most companies fail for reasons of their own making. To the extent competition is an issue at all, it's more that their product or service fails to give customers a sufficiently compelling reason to switch from existing products/services, or fails to offer something different enough to attract wholly new customers.

Finally, I'll point out that it's not a zero-sum game. Many markets have a lot of potential room for growth. Sometimes new competition is the best thing that can happen for all (or at least most) market players, because it brings in new ideas and vigor and creates vital competition where before little existed -- which spurs all the companies to invest in R&D -- which leads to growth for all as the products become substantially more attractive.

about 4 years ago

The "Scientific Impotence" Excuse

FallLine Re:The problem is politics (892 comments)

One huge problem with these sorts of debates is that people seem to think that actionable policy descends automatically from scientific conclusions, as if it were simply a matter of logic. This is simply not the case, because personal values, risk preferences, economic principles, and more play a necessary and important role in decision making.

For instance, although the science is fairly certain that the earth has warmed over the last 150 years and that CO2 acts as a greenhouse gas in and of itself, it does not automatically follow that we should abandon everything that produces CO2 or other GHGs. Even if we were absolutely certain that X tons of additional CO2 today result in actual damages Y tomorrow (which we are nowhere close to), rational people can still choose other courses of action because they believe that the costs of those damages are less than the costs of the proposed solutions today. Unfortunately many scientists have unnecessarily politicized this area by conflating their scientific findings with their personal preferences on policy. They are NOT the same thing.

Both science and policy outcomes are likely to be better when society appreciates the difference and proceeds with its debates accordingly.

more than 4 years ago

Microsoft Office 2010, Dissected

FallLine Re:I don't quite agree (291 comments)

To be very clear about this: I have argued from the start that most small to medium sized businesses would be better off outsourcing typically generic services like email, because outsource companies enjoy systematic advantages (scale, specialization, expertise, etc). I was also specifically challenging your apparently sweeping arguments that they should be categorically rejected "simply because of liability and privacy" or because they are "not tightly bound by laws regarding retention and usage". You may have intended this to refer only to Google Mail/Apps specifically, but I think it was fair to interpret them as a broader attack on outsourcing IT services generally. Regardless, that is the point I took issue with.

However, all of this is academic and is entirely dependent upon process, policy, hardware, and software. It's like stating that it's less risky to let someone else drive than it is to drive your own car. In some cases yes, in my case, I would argue no, it all depends on the details - and again, I wasn't making generalizations, I was stating my opinion about why I don't use it and the opinion of those in similar positions to myself whose opinion I am aware of.

I agree there can be unique circumstances that make outsourcing a poor choice, but I think you overstate your case. A typical outsourced email operation enjoys substantial structural advantages that an in-house operation rarely has, at least for typical requirements. A better analogy would be asking whether you're better off flying in a single-engine plane with a private weekend pilot or flying commercial airlines in the US on a wide-body jet. OK, that's perhaps an overstatement with respect to comparative risks, but nevertheless....

But why would a 50 person company need 24-7 support for office applications and e-mail?

Some medical device businesses, for instance, require this kind of support (my last company did) since they are supporting patients in a clinical capacity nationwide 24-7. Likewise, some of my current clients require support at 10PM or later since they're sending multi-million dollar proposals at the last minute (a lot of money at stake). In any event, my point is that a 24-7 operation has an easier time doing maintenance (since they can schedule it after hours easily) and is more likely to be able to attack a problem as soon as it is detected (which may well be earlier too, since their operations are often more professional/proactive). Perhaps you can have your admins stay till 5AM to fix a problem or do a routine upgrade, but they're probably going to make more mistakes because they're tired, and they will be of little use the following morning when/if something blows up.

I'm sorry but we're an ISV and we have some pretty sharp people here, but none of our 'non techies' would easily be able to fill the role of a squashed Google Apps admin or manage mail difficulties, or be able to convey those issues intelligently to whatever support mechanism Google has in place.

I bet they can handle most of the routine stuff (e.g., add/drop/change accounts) long enough to comfortably locate a replacement (which, btw, probably is NOT a full-time IT person) -- certainly far better than they could rebuild a RAID array or deal with a complex AD replication issue.

What downtime? Are you envisioning some scenario where there's a private company with an understaffed, overworked IT group and someone burns down a file server someplace right before a big sales presentation? While certainly what I would term 'uncommon' it is a very possible scenario; however, it is simple to argue that a small company could at least remedy the situation themselves whereas Google losing your mail (as has happened), or Google Apps not being reachable (happened several times that I'm aware of for long periods of time) is something you can do absolutely nothing about. That's ignoring the obvious problems with "Hey Bob, where's that spreadsheet you had ready for the board meeting today?" "Uh, well, I can't show it from my laptop because when I went to synch it this morning, we were having ISP difficulties, so I couldn't get it off the cloud..." In all instances I can think of, the worst case scenarios are all better on the 'no Google Apps' side of the river.

(Again, I do NOT think Google Docs is a replacement for MS Office generally speaking.) Where I think you go wrong is in confusing the availability of Google Mail, practically the lowest possible cost option for an actual company, with the significantly higher cost hosted Exchange options using more stable software and established technology. I would challenge you to try to offer a better email/contacts/calendar experience for all of your 50 users for a mere $2,500 a year (including server acquisition, IT overhead, data center, bandwidth, licensing, etc), which is what Google is charging. I bet you spend at least twice this much in IT manpower alone to deal with backup/maintenance/issues/extra time necessary to handle it as-is (without more exotic/expensive systems to ensure higher levels of availability).
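To make the arithmetic concrete, here's a rough back-of-the-envelope comparison in Python. Only the $2,500/yr hosted figure comes from the discussion above; every other number is a placeholder assumption you'd swap for your own:

    # Back-of-the-envelope TCO comparison for 50-user email, hosted vs. in-house.
    # Only the $2,500/yr hosted figure comes from the post; everything else
    # is an assumed placeholder.

    USERS = 50
    hosted_annual = 2500        # Google's quoted price for 50 users (from the post)

    # Hypothetical in-house estimate (all numbers assumed):
    server_cost = 4000          # mail server hardware, amortized over 3 years
    server_annual = server_cost / 3
    licensing_annual = 1500     # OS + mail server licensing (assumed)
    admin_hours_per_week = 4    # backup/maintenance/issues (assumed)
    admin_hourly_rate = 50      # loaded cost of IT labor (assumed)
    admin_annual = admin_hours_per_week * 52 * admin_hourly_rate

    inhouse_annual = server_annual + licensing_annual + admin_annual
    print(f"Hosted:   ${hosted_annual:>8,.0f}/yr (${hosted_annual / USERS:.0f}/user)")
    print(f"In-house: ${inhouse_annual:>8,.0f}/yr (${inhouse_annual / USERS:.0f}/user)")

Even with conservative labor assumptions, the in-house line is dominated by admin hours, which is exactly the "IT manpower alone" point above.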

Why? Adding 50 people to your staff is much more a problem relating to hardware, imaging, OS infrastructure than it is to e-mail, file server space, or office productivity software. Adding 50 people to your staff and giving them an approach to office collaboration they've likely never used, seen, or heard of, would be a much bigger problem. Again, I can imagine scenarios where what you're saying is absolutely correct, but I certainly don't think it applies in the general case and certainly not in my case(s)

(Again, not talking about Google Docs!) Consider the costs to step up servers, licensing, IT staff, data center size, bandwidth, etc as you grow or add services. If you buy a server hoping it will last for 2 years as a rapidly growing company, you will probably be forced to buy a lot more capacity than you really need if you think you will reach, say, 100 users in the next year and/or as mailbox sizes expand (if you allow this). What about your data center when your company needs to move to a new office space? It's kind of hard to hire 0.2 sysadmins. OK, maybe you can find a decent part-timer where you are, but you lose something in the process (imho). These are just a few examples of stepping issues.

I don't understand why using Google Apps or Google Mail would keep a company, who needed data centers to accomplish their goals, from needing data centers anyhow? Are you saying that these data centers exist simply to support something like Microsoft Office and Exchange?

Actually, for this $60M (250+ FTE and rapidly growing) client of mine, I've kept them out of needing to invest in a data center entirely with judicious use of outsourced services, and I am running their entire IT operation with around 5-10 hours a week from my company (a bit pricey on an hourly basis, but they get a lot more value too). I don't need to be on-site to deal with any critical servers. With the servers that I do directly manage at Rackspace, I can be confident that if a hard drive or power supply fails, it will be replaced in less than an hour -- even if it happens at 2AM and I'm under some bus tire :-). This frees me up to focus on the tasks that really offer value and means that I can bring on a relatively junior full-time IT guy to manage the day-to-day issues when I go off-premises in a few months, after I've delivered a major piece of software -- and that will be more than enough even with increased demand. Likewise, I know that when this client outgrows their current suite in 10-12 months, I will be able to manage the move painlessly and with minimal investment of time.

I think it would be very simple to argue the exact opposite. Google is quite obviously a giant, virtually irresistible target to hackers. Given the hacking of their e-mail systems most recently, I would think this would actually be an argument against using Google mail for things you want to be private. The more and more people who use Google Apps, the more likely Google Apps will be penetrated (I know, that's obvious.)

My point here was that if you are a tenant in some kind of shared facility and you're a relatively obscure 50 person company in a relatively unsexy area, your data is not going to be of much interest to most would-be hackers or snooping employees. Google itself is obviously a huge name target for hackers. On the other hand, Exchange as an application is at least as large a target, with highly exposed code (albeit in machine code), and it's difficult for Microsoft/customers to test and deploy patches across the many different configurations and millions of installations.

Having also been acquired by large companies (both Microsoft and then at another company by SIEMENS) and 'integrated' (love that term which should be worded 'ugh, you change now, do things same way we do'),

Agreed :-) The acquirer of my last company sent in a "SWAT" team from IBM, directed by their usual middling mid-level yes-men (whom they treat like dirt), plus some other consultants, to do the remaining integration that I had resisted, after I gave notice (they didn't understand wtf they were talking about), and it was an epic fail by all accounts. Sigh. They're still trying to shoehorn in their ERP system, several years later, at the cost of several million in customization (and that's just the estimate!), and it's going to be a massive clusterf*ck if they pull the trigger. I am half tempted to just deliver a clean, functional (fully buzzword compliant) re-write and offer to sell it to them for half the cost (a mere $2M), but they'd never go for it if it doesn't come from SAP or one of their other established business partners. Then again, I think I'd probably commit seppuku if I had to deal with their C-level operating company people (or, worse, the wanna-bes) again for any extended period of time.

more than 4 years ago

Microsoft Office 2010, Dissected

FallLine Re:I don't quite agree (291 comments)

I am simply making a general case for the pros and cons of outsourcing email and related services (albeit a bit more pro in the general sense). I am not trying to argue for Google Apps specifically and, in fact, would generally not recommend the product today to most businesses unless their needs were very minimal and/or cost was the overriding concern.

I don't believe that I'm giving short shrift to any "more likely risks."

I only meant this insofar as your actual argument on /. goes, not that you are making an inappropriate decision for your environment.

You may consider the failure rate of industry standard backup hardware to be riskier than storing all of your contracts, designs, patent documents, and corporate e-mail on a publicly traded U.S. company's servers, but I do not.

In my experience and in the experience of many others, there is a high failure rate when actual restores are required -- particularly in smaller organizations that do not systematically test their backups on a regular basis -- and it has less to do with hardware failure than with misconfiguration, small backup windows, occasional media failure, poorly understood expectations/training (e.g., time, granularity, etc), improper storage of tapes, etc. Furthermore, the shortcomings of doing it in-house extend beyond just the ability to do successful, timely restores. In the organizations I have managed, for instance, a few hours of downtime can easily cost a million dollars in lost sales, not to mention potential hassles/penalties from regulatory agencies, customer relations issues, etc.
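For what it's worth, the systematic restore testing I'm describing doesn't have to be elaborate. Here's a minimal sketch of an automated spot-check; the paths and the restore command are hypothetical stand-ins for whatever backup tool is actually in use, and a naive check like this assumes the file hasn't changed since the backup ran:

    # Minimal nightly restore spot-check: restore one randomly chosen file from
    # last night's backup and compare checksums against the live copy.
    # "my-backup-tool" and the paths are hypothetical placeholders.

    import hashlib
    import random
    import subprocess
    from pathlib import Path

    LIVE_ROOT = Path("/srv/files")           # assumed live data location
    RESTORE_DIR = Path("/tmp/restore-test")  # scratch area for the test restore

    def sha256(path: Path) -> str:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def spot_check() -> bool:
        candidates = [p for p in LIVE_ROOT.rglob("*") if p.is_file()]
        target = random.choice(candidates)
        relative = target.relative_to(LIVE_ROOT)
        RESTORE_DIR.mkdir(parents=True, exist_ok=True)
        # Placeholder: invoke your real backup tool's restore command here.
        subprocess.run(
            ["my-backup-tool", "restore", str(relative), "--to", str(RESTORE_DIR)],
            check=True,
        )
        # Naive comparison: assumes the live file is unchanged since the backup.
        ok = sha256(target) == sha256(RESTORE_DIR / relative)
        print(f"{relative}: {'OK' if ok else 'MISMATCH'}")
        return ok

    if __name__ == "__main__":
        spot_check()

Run something like this from cron and page someone on a mismatch; it catches misconfiguration and media failure long before a real restore is on the line.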

I do not see what bearing the outsource company's publicly traded status has on this debate. Regardless, it is difficult for me to imagine a situation at the various reputable service providers today wherein senior management would decide to systematically rifle through their clients' files -- particularly not if they are contractually obligated not to do things like this and will likely lose customers (at the very least) if they are ever caught. However, I am willing to acknowledge it as a potential risk (however remote). I might be a bit more sympathetic to this argument if said company were a potential competitor of yours or might want to acquire you at some point. The more realistic concern, imho, is whether or not they have good security procedures in place and whether they actually execute on them to prevent misdeeds by their own employees or some outside hacker (likewise for in-house operations!).

Furthermore, if you really have that much truly sensitive material online that could threaten the very survival of your organization if leaked/stolen (i.e., you have competitors that could actually steal your IP wholesale and get away with it), I would argue that putting it all online in a readily accessible manner is probably a mistake no matter where you host it, and email is generally not treated this way. If you're not behind a secure firewall/VPN, with mandatory two-factor authentication, minimal granular access levels, auditing, segregation of duties, etc in place... then the concern of the provider's management stealing secrets seems a comparatively remote risk.

How quickly can we respond to a problem after hours? What does this have to do with Google Apps? Or is this now about all 3rd party services?

My point here is that most good service providers make strong commitments to high availability and have the resources in place to actually deliver on them. Most 50 person companies simply cannot afford to staff even one highly qualified sysadmin 24-7 (round-the-clock coverage alone works out to roughly 168/40, or about four full-time staff, before vacations and sick time), never mind all the other resources necessary to deliver. Without this kind of staffing, it's harder to do necessary maintenance and respond quickly to problems as they happen, i.e., before they impact primary business hours.

What happens if the admin gets hit by a bus (I use this analogy at the office a lot as well ;)) ? The same thing that happens if the person administering our usage of Google Apps gets hit by a bus.

A company can easily share credentials among a number of reasonably intelligent people to allow them to fill this role, because it doesn't require much technical know-how or knowledge of the environment. The situation is very different with even a moderately complex in-house IT environment, because that requires a lot more skill to operate reliably, not to mention specific knowledge of the environment as it is configured.

If you need me to make a generalization I would be happy to state that I find it difficult (but not impossible) to imagine scenarios where private companies should use Google Apps.

I would argue pretty much exactly the opposite (albeit not specific to Google Apps).

A private company likely:

1) has fewer resources to survive lost sales or opportunity because of downtime;
2) has less scale to make high availability cost effective;
3) is far more cash-flow sensitive (survival);
4) faces larger stepping problems with rapid growth/moves/strategic repositioning;
5) can generally use extra cash flow far more productively elsewhere in the organization, i.e., it is better to hire an engineer or a good salesman than to buy IT hardware or hire an additional IT employee;
6) is more likely to need to relocate and thus re-invest in its data centers (been in that situation several times);
7) is probably a bit less likely to be the target of some hacker.

Having managed IT very much in-house from 10 to 800+ employees (very IT intensive), from private to publicly traded, to being subsequently acquired and having to integrate into a large (Fortune 50) company, and having consulted for a variety of SMBs (especially start-ups) outside of this... I can say with confidence that most would be better advised to outsource their email and other similar services unless they have very clear reasons for not doing so or already have a large investment in IT for whatever reason that cannot be more efficiently utilized elsewhere.

more than 4 years ago

Microsoft Office 2010, Dissected

FallLine Re:I don't quite agree (291 comments)

I am not claiming that all organizations that reject it lack valid reasons. There are valid reasons for companies not to use it today -- even for some smaller companies. However, many IT organizations make these sorts of decisions in an ignorant and/or self-serving fashion.

I have known many people in IT who think they will promote their own careers by maximizing their budget, the head count under them, their experience with the latest/sexiest technology, etc, without really considering the company's needs. I have known others who simply lack the ability to think critically about the issues (e.g., they use an arbitrary requirement to rule out alternatives without considering cost vs benefit). For instance, they look at the per-mailbox cost but fail to really take into consideration just how much overhead managing said technology in-house incurs, or the cost of the risk they take by using shortcuts. Others are simply very conservative by nature (in the sense of personal career risk), and until outsourcing becomes the norm in their line of business or management essentially insists, they will not go with the program.

I do not know your organization. However, I wonder whether your preoccupation with trust is informed and rational. Yes, it is possible that the outsource company might do something in a top-down fashion that is not aligned with your company's interests. However, those businesses that profit substantially and directly from their users (e.g., not gmail) have a strong incentive not to do something that might cause their clients to lose confidence in them. A company that abuses its clients will ultimately hurt itself.

In the meantime, I think you are giving short shrift to other, more likely risks that correlate strongly with how these services are actually provisioned and managed: data loss, extended downtime, poor security, malfeasance of IT/operations folk, etc. An outsource company that serves one thousand times as many users with similar needs is far more likely to be able to do the job better and more cost effectively, because it specializes in it and has the scale to do so (though, as I said, the business has only come into maturity in the past few years, and I believe it will take more time to be able to accommodate a wider range of requirements).

Consider:

How often do you test your backups?
How quickly can you respond to problems after hours?
What happens if you or your key admin(s) get hit by the proverbial bus?
How much expertise do your admins really have with your mail servers?
How many people have admin level access?
How many vendors or non-IT people have access to your data center(s) for various reasons?
Do you have offsite DR -- how good is it really?
What controls do you have against internal malfeasance?
How good are your data centers really?

My only point here is that you can't just look at "trust" in an outsource company in isolation from the other risks that hinge on decisions you make in this context. Your environment might be tighter and more cost effective given the stuff you really need, but I haven't seen you articulate why this might be the case. Just some food for thought.

more than 4 years ago

Microsoft Office 2010, Dissected

FallLine I don't quite agree (291 comments)

As a former CIO, I disagree with your diagnosis of the issues. Many companies, both large and small, outsource services to companies with access to all manner of sensitive materials (e.g., document destruction, electronic reading rooms, business continuity services, AR, etc). The difference is in how those services are implemented and the trust in the organizations, not so much the laws that specifically regulate their offerings or even the ability to sue them.

In my opinion, the problem with Google Apps is that they:

1) don't make many important explicit commitments (e.g., availability, security, retention policies, restoration times, etc);
2) provide very little visibility into their implementation;
3) run a low-cost service model that provides little room for day-to-day customer service (e.g., mailbox restores) or the confidence that you can rapidly escalate a problem should one arise (not to mention offline backup).

I frame it this way because it implies the issue is not inherent to outsourcing email in principle. The outsource service model is the future for generally commoditized services like email. There are several offerings today that I believe are generally superior to in-house for most SMBs that want Exchange functionality and need good availability. I have recommended Rackspace's Hosted Exchange to a $60M (revenues) client of mine and a few others. I am generally quite pleased with it, though there are a few shortcomings that will prevent others from adopting it today (especially larger organizations).

The biggest issues with the various Hosted Exchange offerings (those I'm familiar with at least):

#1: Authentication cannot be readily shared with other services, i.e., the employees need to juggle yet one more set of credentials.
#2: Limited ability to use 3rd party software (e.g., VM, Fax, two-factor authentication systems, etc) unless it exclusively uses exposed interfaces (RPC/HTTP, IMAP, etc) -- see the sketch after this list.
#3: Won't scale well with large companies (with multiple subsidiaries/operating companies) that need/want to use more advanced AD features.
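To illustrate what integrating through an exposed interface (issue #2) looks like in practice, here's a minimal IMAP connectivity check. The host and credentials are placeholders; a real hosted provider publishes its own endpoints:

    # Quick check that a hosted Exchange mailbox is reachable over one of the
    # exposed interfaces (IMAP here). Host and credentials are placeholders.

    import imaplib

    HOST = "imap.example-hosted-exchange.com"  # placeholder endpoint
    USER = "user@example.com"                  # placeholder credentials
    PASSWORD = "app-password"

    conn = imaplib.IMAP4_SSL(HOST)
    try:
        conn.login(USER, PASSWORD)
        status, (count,) = conn.select("INBOX", readonly=True)
        print(f"INBOX reachable, {count.decode()} messages")
    finally:
        conn.logout()

Anything that can speak IMAP (or RPC/HTTP) can integrate this way; the limitation is everything that can't.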

That said, these companies will figure most of this stuff out gradually, until all but the most conservative big companies concede that they are better off outsourcing it, i.e., that an outside company has the scale and expertise to do a better job at less cost and in a more capital-friendly way. When real customization is required, in-house makes sense, but the reality is that many of these issues are fairly widely felt and can be addressed with more generalized solutions.

more than 4 years ago

Engaging With Climate Skeptics

FallLine Re:Extraordinary claims... (822 comments)

If I can do anything for the cause of science, it's to repeat this: Scientists get famous by ripping the shit out of other scientists' work. The famous scientists you've heard of got famous by demolishing the work of others. As scientists, we know that. And we're always looking for some schmuck to use as a stepping stone. I know if I do bad science, I'll be a stepping stone. I know if I find bad science, I can use IT as a stepping stone. That keeps most scientists pretty damn honest.

These CRU emails pretty much prove that you put way too much stock in this. There were several emails in which these so-called scientists expressed major reservations about the quality of research being conducted by Mann, Briffa, and other researchers, and yet none of this made its way into the literature, and these same researchers continued to support each other's work in public.

What you fail to acknowledge is:

1) The conclusions are not binary. Relative warming can be exaggerated in a climate reconstruction, while another substantially different conclusion will generally not be acknowledged as falsifying it.

2) The incestuous nature of the research community, combined with the fact that they rely on fundamentally similar and weak evidence, means that it is not in their interest to try to directly discredit their peers' research.

3) The vast majority of this research is NOT reproducible because complete data, code, and methods are generally NOT shared, even within the so-called community. This makes it extremely difficult to prove bad methods were used or otherwise falsify the conclusions. When material is shared, it is often incomplete and shared only with friendly peers, which creates both an incentive not to criticize (for fear of getting cut off in the future) and a feeling of debt.

4) There are probably no silver bullets that can easily overturn these models or reconstructions empirically. Unless someone has something that undeniably falsifies all prior recent work in the field, all of a researcher's incentives point toward the party line: dissenters are unlikely to get published and will have scorn heaped upon them.

more than 4 years ago

Engaging With Climate Skeptics

FallLine Are you kidding me? (822 comments)

What possible motivation would the climate scientists have to do so? What do they gain from over hyping the possible scenarios? To promote renewable energy? Again, what do they gain from this?

Here are just a few reasons:

1) Further their own careers. Big (positive) claims about AGW are important if you want to get published in the high impact journals.

2) To get grant money to keep publishing and stay employed.

3) Face time with the media

4) Genuine belief in AGW -- even if not well supported by the actual evidence.

5) Insider politics -- why criticize a peer's research that largely agrees with your own? The incentives are reversed.

6) Other environmental motives, e.g., "even if AGW is wrong, reducing pollution, sprawl, cars, oil dependency, etc is good" (I have heard this argument a lot)

7) (Mistaken) belief in the precautionary principle, i.e., treating AGW as a risk while refusing to see it in cost-vs-benefit terms.

more than 4 years ago

Engaging With Climate Skeptics

FallLine Re:Scientists are not Politicians (822 comments)

Scientists have always been egotistical, with their own pet theories and human idiosyncrasies. The saving grace of science has never been the scientists, but the method in which science is conducted. Peer review, vigorous debate, and cat-fights. What we believe scientists should be and what scientists are are two very different things. The problem here is the outside influences. You and me.

I think you have managed to miss the point and (sort of) contradict yourself. We will never remove the human element -- inside OR outside. These negative influences have always existed in science and will never be banished.

The method is what matters most in ultimately arriving at the best possible science. Although we can never systematically remove all bias, bad faith, bad incentives, or what have you, we can sure as hell demand integrity and openness in science. The more politicized and the more important science is, the more important it is that science stick to its core principle of openness. All of this empirical research MUST be fully replicable. In other words, all data, methods, and code should be reasonably obtainable by any qualified individual. Whether research is replicable by a qualified individual is truly an objective fact that can be agreed upon by reasonable people on any side of an issue.

There is no excuse, in this day and age of the internet and cheap storage, why nearly all of this cannot be archived as a condition for publication. We certainly should not be making multi-trillion dollar decisions based on science without this critical step. Without replicability it's extremely difficult to falsify research, and very easy to publish false or bad research, when the prominent journals are all managed by like-minded individuals who have little incentive to find flaws in your research and a lot of incentive to keep you friendly.
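Mechanically, the archiving step is trivial. Here's a sketch of generating a checksum manifest for a replication archive, so anyone can verify they hold exactly the data and code a paper used; the directory layout is hypothetical:

    # Sketch of a replication-archive manifest: fingerprint every data and code
    # file so reviewers can verify they hold exactly the inputs the paper used.
    # The archive layout is a hypothetical example.

    import hashlib
    import json
    from pathlib import Path

    ARCHIVE = Path("replication_archive")  # assumed layout: data/, code/, methods/

    def build_manifest(root: Path) -> dict:
        return {
            str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
            for p in sorted(root.rglob("*"))
            if p.is_file()
        }

    if __name__ == "__main__":
        manifest = build_manifest(ARCHIVE)
        Path("MANIFEST.json").write_text(json.dumps(manifest, indent=2))
        print(f"{len(manifest)} files fingerprinted")

Publish the manifest alongside the paper and the "which version of the data did you actually use?" argument disappears.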

What the CRU emails have demonstrated, amongst other things, is a conspiracy to refuse reasonable requests (for essential information necessary for replication) for its own sake, despite: the long tradition of such sharing in science, the policies of several journals (albeit poorly enforced), the charters of these government agencies, and FOI requests. This behavior is fundamentally abhorrent to good science.

These CRU emails have given some people a view into the naked politics (against science), bias, attempts to manipulate publication, suppression of legitimate internal criticism (for political reasons), etc. Taken together, they make a compelling case for the corruptibility of science and the need for outside scrutiny. You do not need to be a climate scientist to point out fundamental statistical errors in so-called climate reconstructions or to observe that data is being misrepresented on graphs ("hide the decline"). The knowledge that outsiders can actually check research for errors at any point would do a lot to keep research honest -- before and after it is published. We may never be able to ensure that good skeptical research receives a fair chance to be published, but by preventing bad research from entering or remaining in the literature, science can correct itself far more quickly.

more than 4 years ago

New Research Forecasts Global 6C Increase By End of Century

FallLine Re:Nonsense (746 comments)

Yes, clearly this is something that is better handled by lawyers, business people and their assorted paid off goons.

I said nothing of the sort. I merely suggest that the academic process is wholly inadequate given the cost of the measures that are being proposed today and the alleged risks. We should not be making multi-trillion dollar decisions based on the output of a handful of academics following the usual insular peer review process which was never designed to do this sort of thing.

We can fund and construct a far more open and rigorous process if the science is truly "settled".

Want more reliable proxy records? Gather a team of experts to determine the best proxies to examine, clearly state the assumptions and reasons for the choices, then fund this research directly so that it can be used freely by all. This data should be as complete as possible, i.e., do not do what Briffa did with his Yamal paper by including just 13 trees to represent the last 100 years while using many more in prior years.

Want to produce reconstructions that will convince a lot more people? Surely these scientists can collectively decide on the best, most defensible way to produce a limited number of reconstructions; then they can document their exact methods and explain their rationale for outside review and commentary. Demand thorough independent analysis by one or more 3rd parties; don't just wave your hand at it and say "it roughly agrees with my beliefs about AGW and looks OK on paper." Allow people with expertise in relevant fields to produce relevant criticisms, e.g., statisticians, dendros, etc. Do not allow artificial hurdles to be constructed like those which exist in academic journals (e.g., commentary can only be made X days after publication, can only be Y words long, etc).

Want to produce a climate model that attempts to predict future temperatures? Share the f'n code. Document it well. Explicitly state your assumptions and defend them. Allow people to see how it fares against the instrumental record in detail, both past and future. It's not as if the only meaningful output is what happens with average temperature. These models make all kinds of assumptions about various phenomena, particularly with respect to positive and negative feedback effects (esp. cloud cover); the outputs can be documented to show how well those assumptions hold up. If CO2 output deviates substantially from projections, fine, then run the model with the actuals and see how it changes...
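To be clear about what "state your assumptions" means in practice, here's a deliberately toy zero-dimensional calculation (not any of the models under discussion) using the standard logarithmic forcing approximation, with the climate sensitivity called out explicitly as the assumption that drives the headline number:

    # Toy zero-dimensional CO2 warming calculation -- an illustration of stating
    # assumptions explicitly, not a stand-in for the models under discussion.

    import math

    F_2X = 5.35 * math.log(2)  # forcing per CO2 doubling, W/m^2 (standard approximation)

    def equilibrium_warming(co2_ppm: float,
                            baseline_ppm: float = 280.0,
                            sensitivity_per_doubling: float = 3.0) -> float:
        """Equilibrium warming in deg C. `sensitivity_per_doubling` is the key
        ASSUMPTION -- published estimates span very roughly 1.5 to 4.5 C."""
        forcing = 5.35 * math.log(co2_ppm / baseline_ppm)  # W/m^2
        return sensitivity_per_doubling * forcing / F_2X

    # The headline number moves a lot with the assumed sensitivity:
    for s in (1.5, 3.0, 4.5):
        t = equilibrium_warming(560, sensitivity_per_doubling=s)
        print(f"560 ppm, sensitivity {s} C/doubling -> {t:.1f} C")

Even this fifteen-line toy makes its one load-bearing assumption impossible to miss; that's the standard real model code should be held to.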

All of this data can be archived in one place for all to see at the click of a mouse. The White House spent $18,000,000 to re-design recovery.gov. Certainly we can spend money to have one definitive source for OPEN research on the settled science of AGW. I would gladly see my tax dollars used for the government to set aside, say, 10 billion dollars to assemble a panel of leading scientists from a variety of areas of expertise to manage this process, direct research funds, etc. If the science is truly settled, it shouldn't take very long to make a compelling case for AGW and dispel mainstream concerns of bias and manipulation.

more than 4 years ago

New Research Forecasts Global 6C Increase By End of Century

FallLine Re:Nonsense (746 comments)

That is BS. If you bothered to read the refutations, the divergences are themselves a subject of many publications, and this has been out in the open forever.

My point is that this study and several others have deliberately used this 'trick' in their presentations, and that this behavior is dishonest and anti-scientific. There is no excuse for it, period. Your excuse in the prior post -- that the actual temperature record reflects an increase -- is misleading and amounts to sophistry.

I do agree that access to the raw data could be better, and even that some of the statistical methods etc have been applied poorly (or even incorrectly). You might even find, somewhere in the stack of tens of thousands of climate science publications, some that misrepresent the data, perhaps even deliberately. Not all scientists are as expert as they should be in statistics, and scientists are human and have human frailties (although that doesn't excuse anything). But this does not appear to be one of those cases. You are reading far too much into one email, and you clearly are not aware of the context.

There are only a handful of published reconstructions that show 'hockey stick' shapes stretching over ~2K years; most of these rely on a small set of similar raw data and similar statistical methods, and are authored by a small circle of scientists at even fewer institutions. Many of these studies have already been largely discredited. Furthermore, expecting every bad study to be immediately shot down is unrealistic given the biases of the climate science community and their complete lack of openness on critical issues.

You need to have a great deal of faith in their particular peer review process to believe this issue is anywhere near settled, given how hard they have worked to suppress any sort of dissent by barring access to data and methods, black-balling people from journals, etc. Many reasonable people look at the facts and believe that there is a need for skepticism.

Given:

1) Emails like this (clear conspiracy to mislead, hide embarrassing facts, hide & delete data, etc).

2) The small size and incentives within the climate science community

3) Numerous fundamental documented errors in many of these studies (bad statistics, poor quality data collection, etc)

I simply have little faith in the quality of the peer review going on.

If the climate science community wants to convince more people of the need to act, to spend many many trillions of dollars in the near term, then they MUST be far more open and honest at the very least. There is no excuse in this day and age not to share data and methods freely given the fact that almost all of these people work for publicly funded institutions and are collectively asking society to turn itself upside down to fix the alleged problem.

If all of the science of global climate change depended on a single set of proxy data, then you would have a point. But it doesn't, and you don't.

I never claimed that the entire science of global climate change relies on a single set of proxy data. Even if the science is correct and the model forecasts are 100% accurate, this behavior is still bad and inexcusable, i.e., my point still stands. That said, even though I acknowledge that the earth has warmed over the last century and that CO2 does act as a greenhouse gas, there is a great deal of unsettled science in: exactly how unprecedented today's temperatures are; exactly how sensitive the climate is to CO2; the behavior and extent of feedback effects; what will happen in the future given certain CO2 levels; not to mention the 2nd, 3rd, 4th, and further order impacts of warming (if it happens as predicted).

This issue is too politicized and too important to leave to the typical academic process.

more than 4 years ago

New Research Forecasts Global 6C Increase By End of Century

FallLine Nonsense (746 comments)

Try reading that again: "adding in the real temps [...] to hide the decline."

So, it is some kind of proxy for measuring the historical temperatures (in this case, tree rings), and this proxy data, for some completely different reason (pollution affecting the tree growth, for example??), shows a decline in the last couple of decades.

The real temperatures (ie, the ones that are actually measured, like with a thermometer) show an increase, so use the real measurements for the final 20 years of the data.

There would be more of a problem if this wasn't disclosed somewhere. But even then, it is an argument about how the proxy data is presented. The real temperature data doesn't show a decline.

Virtually everyone admits that temperatures have increased substantially over the last ~100 years. The entire point of these reconstructions is to demonstrate that this rise is unprecedented over the past ~2K years and follows a certain pattern. If the same methods, on the same species of tree, in the same area, in the same study, not only fail to accurately replicate the thermometer record over the last several decades but actually diverge substantially from it, this calls into question the entire pursuit.

In other words, if your methodology suggests that it couldn't have been warmer from 0 BC to 1900 because tree rings were not statistically larger, but the rings actually fail to increase as predicted in recent history when we know it has warmed, then this strongly indicates that we cannot rely on warmer past temperatures being accurately reflected in increased tree ring size either. Of course you can speculate that pollution may be playing a role, but it is still just speculation, and there are better documented conclusions one could draw from this, e.g., that tree rings do not correlate linearly with temperature, or that changes in moisture content, sunshine, CO2, etc play an equally large role.

Good non-politicized science should: pick a methodology; show how it correlates with the actual thermometer record; then document it clearly, for better or worse, over the entire course, i.e., actually show the divergence (and make the data and methods available to all for review). These so-called "scientists" went to the other extreme, trying to hide the divergence and present a view that was not supported by their actual research. Many of these same scientists have gone further still by refusing reasonable requests for the raw data and further information on their methods.
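Here's a sketch of that calibrate-then-verify discipline, run on purely synthetic data (no real proxy or temperature series), just to show what honestly reporting a divergence looks like:

    # Calibrate a proxy against the instrumental record on one window, then
    # verify on a held-out window. All data here is synthetic and illustrative.

    import numpy as np

    rng = np.random.default_rng(0)
    years = np.arange(1900, 2000)
    temp = 0.01 * (years - 1900) + rng.normal(0, 0.1, years.size)  # synthetic "instrumental" record
    proxy = temp + rng.normal(0, 0.1, years.size)
    proxy[years >= 1960] -= 0.012 * (years[years >= 1960] - 1960)  # inject a post-1960 divergence

    cal = years < 1960   # calibration window
    ver = ~cal           # verification window

    # Fit proxy -> temperature on the calibration window only.
    slope, intercept = np.polyfit(proxy[cal], temp[cal], 1)
    recon = slope * proxy + intercept

    for name, mask in (("calibration", cal), ("verification", ver)):
        r = np.corrcoef(recon[mask], temp[mask])[0, 1]
        print(f"{name}: r = {r:.2f}")

    # A reconstruction that only correlates inside its own calibration window
    # is exactly the red flag the divergence problem raises.

Reporting both numbers, rather than splicing in the thermometer record where the proxy fails, is the whole point.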

This is politicized "science" at its very worst.

more than 4 years ago

Should a New Technology Change the Patent System?

FallLine Re:Fundamentally ignorant of the business (159 comments)

Assuming this is true, which has to be assumed to give any credence because you give no backing material or references here whatsoever.

I have multiple and varied connections to the pharmaceutical, biotech, and medical devices industries (inside & outside, academic, research, financial, executive, etc), so I have a feel for the numbers. However, you can find confirmation of this in academic literature if you look. DiMasi asserts in a 2003 paper that the fully capitalized cost of clinical trials is 70% of R&D (and this number has certainly grown according to all trends), see: "The price of innovation: new estimates of drug development costs".

then I think such government regulation should be funded.

You are not presenting any concrete proposal nor any support for your vague notions, but it's a bad idea in general. Government generally does a poor job of allocating capital (see: central planning) and has a serious problem with efficiency (see: motivating its employees, agency costs, interest group politics, etc). Furthermore, if you mean that industry would still retain a role, such an arrangement could create a massive agency problem whereby industry could submit multiple compounds for clinical studies without having to pay the costs, thereby making the system far more expensive for society as a whole. There is also real expertise needed to design and manage the trials, which, again, government tends not to do well.

In example I think "No Child Left Behind" was a dismal failure because it tested for poor school environments but did nothing to fund those requirements, or to catch schools up to standards.

I know it is deeply unpopular with the teachers' unions, but this does not prove anything. Virtually every test score has risen or stayed the same. Do you have any objective facts to prove that on the whole it was a net loss? Furthermore, with respect to the unfunded mandate stuff, that is a weak argument. It is not mandating an act -- it's demanding baseline performance (and generally weakly at that). Given the dismal performance of US public school education vs most of the developed world, and its comparatively high cost, it's difficult to argue persuasively that waste could not be removed from the system by refocusing it on a more effective curriculum and reducing the vast amounts of administrative overhead. If anything, it has failed to the extent that it allowed the states to control the tests and generally weaken the standards.

what's the alternative -- beta testing?

Fewer, smaller, and shorter clinical trials (recognizing that RCTs have huge limitations), combined with increased post-marketing (approval) surveillance. Namely, the cost per patient is so high and the availability of volunteers so limited (this is a HUGE problem now) that it is impossible to conduct these trials with sufficiently large numbers of patients (sample size) to detect relatively rare and/or delayed-onset problems. With current phase III trials (averaging 3-4K patients), only adverse drug reactions (ADRs) that occur more frequently than 1:1,000 can be detected. To have a 95% chance of detecting ADRs that occur in 1 or 2 in 10,000, you would need to enroll 600,000 patients. This is a number that is completely impractical, yet would still expose half those patients to risk while the other half would just be getting placebos (a problem in and of itself). Furthermore (besides raw statistics), these patients do not tend to represent real world conditions very well (far higher compliance, generally healthier, fewer co-morbidities, increased observation, etc).
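The binomial arithmetic behind those numbers is easy to sketch (under the simple assumption that every patient is independently at risk). Note that merely observing a case or two is far from statistically attributing the ADR to the drug against the background rate, which is why real detection thresholds demand enrollments like the 600,000 above:

    # Probability that a trial of n patients observes at least k cases of an
    # ADR with true incidence p, under a simple binomial model. Illustrative
    # only -- not a reproduction of any regulatory calculation.

    from math import comb

    def p_at_least(n: int, k: int, p: float) -> float:
        """P(X >= k) for X ~ Binomial(n, p)."""
        return 1.0 - sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k))

    # A typical ~3,500-patient phase III arm vs. ADRs of various incidence:
    for p in (1e-3, 1.5e-4):
        print(f"p = {p}: P(>= 1 case in 3,500 patients) = {p_at_least(3500, 1, p):.2f}")

    # Output: ~0.97 for a 1-in-1,000 ADR, but only ~0.41 for 1.5-in-10,000 --
    # i.e., a standard trial more likely than not sees ZERO cases of the rarer
    # event, and seeing a case at all is only the first hurdle to detection.

That asymmetry is the whole argument for post-marketing surveillance: the real-world patient population is the only "trial" large enough.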

In other words, there is no substitute for actual real world use. In the meantime the FDA, which has some understanding of this problem, has every incentive to delay and restrict products from market and almost none to allow market entry (it's CYA). This problem has gotten increasingly worse over the past several decades and shows no sign of changing.

Nobody should do any such thing. I actually don't think software is dependent at all on such monopoly grants. Software patents, which are essentially patents on Boolean Math, are inherently counter-productive, and arguably a limitation on free speech rights. I think currently proprietary software isn't even all that dependent on copyright. Proprietary software is dependent on Trade Secret, since they only release binaries and hide the source. Even when they release the source it tends to be under NDA style contract provisions. Open Source also has better business models than Copyright entails.

The software industry is almost entirely dependent on various forms of IP and related forms of IP support. Copyright is most critical, but so too are trademark, patents, trade secret law, and various other protections (even government support for contracts can be a form of this). There are almost no software sales (not to mention development) in countries that have weak support for IP, even in relatively industrialized ones like China and India that are starting to move in this direction. The vast majority of paying customers just care about being able to use the actual object code as it exists, not about understanding how it works or being able to make their own improvements. It is trivial to copy software and break any protection schemes (see: the internet, China, etc). An IP-free regime gives software companies very little ability to recoup their R&D, and is no guarantee of open source code either (not to mention that most open source business models would be equally dead).

Furthermore, you are entirely mistaken: software patents are typically not on math or even basic algorithms as you imagine them (I actually am a named inventor on at least one). You might argue that all software is ultimately built on math, but that is normally not contained in the claims of most patents, and the argument is equivalent to claiming that a patent on, say, an engine design is a patent on atoms (or that copyrighting a book is copyrighting individual words). In other words, it's a nonsensical argument. That is not to say there are no problems here, but your argument is very bad.

Why would I pose the idea of Democratic government taking over an industry if I thought it was inherently bad? You're making that claim, not me. And you have the burden of proof here -- the need for big pharma has never been proven, in that it has never been proven that similar or better treatments wouldn't arise without giant drug companies and monopolistic patents.

Actually you have the burden of proof, since you are arguing against the status quo (not to mention against common sense, decades of basic economic research, empirical comparisons with various socialist regimes, etc). Furthermore, you need to present an actual proposal, not just half-baked ideas in order to even begin to offer an argument. In addition, I have presented many arguments against your vague ideas and demonstrated flaws in your thinking.

I do not accept your arguments, and the burden of proof is with those that say patents are needed, or that the same or better results cannot be achieved through less exclusionary means. That is what has not been proven.

Again, the burden is on you, since you are arguing against the status quo. However, if you are arguing that shorter patent lengths would work better, then you need to address the many problems I presented at the start of this thread (e.g., the actual effective patent life today (post-marketing approval), the R&D cost, risk, costs of capital, S&M, etc).

more than 4 years ago

Should a New Technology Change the Patent System?

FallLine Re:Fundamentally ignorant of the business (159 comments)

I thank you for enumerating all the ways that big pharma would fail without government protection. Any business that depends on so much government protection should simply be made part of the government, and thus subject to Democratic decision making instead of private profiteering. Short of that, they should be allowed to fail just like any similarly poor business model would on the free market.

Funny you should say that. The overwhelming majority of the R&D costs are imposed by government demanding very expensive and lengthy randomized controlled trials (RCTs), combined with the FDA's excessive conservatism, to an extent that is generally not socially optimal. These increased costs and delays affect not just the cost of the approved drugs, but the labeling of them, i.e., the manner in which companies can market them, and the ability to sell some good medications at all (some good drugs are barred from the market due to the limitations of RCTs).

In other words, if you are arguing this from a libertarian perspective, you should at least recognize the government's role on the other side of the equation. You should also extend this same argument to software and most other technology oriented businesses, since they too depend on government-granted monopolies.

If you are not arguing that government action is inherently bad, then you should at least present a coherent argument for why I am wrong.

If you accept my arguments, then you really need to explain why strong patent protection is problematic since those drugs would not be developed without it and the "problem" would be entirely academic.

more than 4 years ago

Should a New Technology Change the Patent System?

FallLine Fundamentally ignorant of the business (159 comments)

There are several MAJOR flaws in your argument that, essentially, companies should only be able to retain exclusivity until they recoup the strict costs of their R&D on their successful projects.

First, the real costs of R&D extend well beyond just the one successful project to the several failed ones that generally accompany it. In order to break even on R&D overall, companies need to recoup all of it on just a few of their successes. In the pharmaceutical industry, this ratio tends to be less than 1 success for every 10 drugs (and this only accounts for drugs that gain approval... not market success).

Second, the R&D costs of a product are generally just a small fraction of the costs to actually create a successful product. Companies need to spend considerable sums promoting a product before they can break even, not to mention legal, manufacturing capacity, various other overhead, etc. All of these monies are put at risk for an uncertain outcome and generally take several years to pay off after market entry.

Third, you absolutely MUST provide strong incentives to take risk. The higher the risk, the higher the reward needs to be. This is well established by many years of economic and financial research. Who would invest in a drug company that produces novel drugs when you can essentially make the same profit by investing in a generic manufacturer with far less risk and with a far shorter timeline? Why invest in drugs at all when there are safer businesses still?

Fourth, you need to account for the time-value of money. Even if your investment were essentially risk free, you would demand an appropriate rate of return because you could invest that money elsewhere. A savings account would be far more liquid and likely have less risk in practical terms (not to mention other preferable investments).
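Points three and four combine neatly in a risk-adjusted NPV calculation. Here's a toy example; every number in it is a made-up placeholder, purely to show how discounting and failure rates interact:

    # Toy risk-adjusted NPV of a drug program -- all numbers are hypothetical
    # placeholders, purely to show how risk and the time value of money interact.

    def npv(cashflows, rate):
        """Discount a list of (year, amount) pairs back to year 0."""
        return sum(amount / (1 + rate) ** year for year, amount in cashflows)

    P_SUCCESS = 0.10   # ~1 in 10 drugs entering development succeeds (assumed)
    DISCOUNT = 0.12    # investors' required rate of return (assumed)

    # Assumed program: $100M/yr R&D for 8 years, then, on success, $400M/yr
    # profit for ~10 effective patent years after approval.
    rd_costs = [(y, -100e6) for y in range(8)]
    profits = [(y, 400e6) for y in range(8, 18)]

    expected = npv(rd_costs, DISCOUNT) + P_SUCCESS * npv(profits, DISCOUNT)
    print(f"Expected NPV: ${expected / 1e6:,.0f}M")

    # Even with $4B of nominal profit against $800M of nominal spend, the
    # 1-in-10 success rate plus discounting leaves the expected value deeply
    # negative -- which is why the rewards on the winners must be large.

Shorten the patent tail or cap the winners' profits in a model like this and the expected value only gets worse; that is the mechanism behind the points above.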

Fifth, accounting for costs is not this simple. What is the cost for me, an entrepreneur, of dropping out of the job market for several years when I can earn a 100K+/yr salary in a risk-free job? Founders would simply never take that risk if the best case were MAYBE breaking even if everything pans out. What about the engineers and developers I hire who accept less salary in return for an equity interest in the business (i.e., potentially large rewards in exchange for a riskier job and less salary today)? You would essentially need to dramatically increase the compensation of all such employees in excess of what it is today.

Sixth, this would give a huge edge to larger and better funded companies that could swoop into a business with an expired patent and reduce the profit potential to near zero under your regime (presuming you accounted for a bunch of the above problems).

I could name several other critical issues, but suffice it to say that your idea is a total non-starter and demonstrates an ignorance of why the current system generally works far better than you imagine (like most people on slashdot).

more than 4 years ago

MySQL Cofounder Says Oracle Should Sell Database To a Neutral 3d Party

FallLine The REAL news is RMS' admissions (207 comments)

In the letter he co-authored:

1) They all but admit that the dual-licensing is critical to the survival of not just MySQL in its current form, but also any fork derived from it.

2) They completely fail to mention any sort of alternative business model for MySQL or any of its derivatives, i.e., no mention of the mythical "support" business model in particular, or of anything else.

3) They neglect the potential for the code to be forked and successfully managed by unpaid volunteers.

4) They ignore the possibility that its users will donate money or that large companies will make substantial contributions to cover development.

5) They also acknowledge the fragility of the open source ecosystem due to conflicting licenses.

@@@@@@

Although I believe their analysis is largely correct as far as MySQL's survival is concerned, it demonstrates very little faith in the ability of GPL-licensed projects to grow and be maintained in the absence of proprietary rights, and it directly contradicts the overall message of RMS and company. If they had merely posited that Oracle could delay the development of MySQL for, say, a year, it would not necessarily be contradictory; but to propose that this is an earth-shaking event, and to do so in the manner they did, is simply inconsistent with the message of RMS and his followers.

more than 4 years ago
top

Mickos Urges EU To Approve Oracle's MySQL Takeover

FallLine Re:Not so fast (67 comments)

That's not what I was trying to say. Any BSD software could potentially be used in a closed-source project, and lots of people contribute to BSD projects.

But I would not contribute to a project where I had to sign over my copyright unless I had a very clear reason for doing so.

Why not? The reasons for the company wanting this are pretty clear. They need the copyright to do any kind of transaction not compliant with the GPL, and their current business model depends on it. I suppose they might try to carve out a specific license for themselves from each contributor ahead of time, but with a large number of small contributors that would be hard to manage and would present a lot of potential legal issues for them and their clients (especially if they need to change their strategy in unforeseen ways).

Worst case, the company decides it wants to become a proprietary software company at some point in the future and makes a private fork from the GPLd codebase (or even offers it under BSD terms instead). That possibility seems unlikely to me, and your code will still be available under the GPL terms (they cannot remove what's already out there). That aspect is not so different from contributing to a BSD-style licensed product, except that only THEY, or someone to whom they've assigned rights, can do that instead of everyone (including yourself). If you're a GPL-leaning person, this seems a lesser "problem" than contributing to a BSD-style licensed project in terms of its practical effects.

Do you have another specific reason, or is it the principle of it that bothers you? How is it any different in practical terms from contributing to a BSD-style project (and one that will in all probability behave as if it's a GPL or at least LGPL project)?

about 5 years ago
top

Mickos Urges EU To Approve Oracle's MySQL Takeover

FallLine Re:Not so fast (67 comments)

They aren't just "exceptions", you have hand-waved away a majority of the free software people use.

I was talking about those really vibrant projects with large and sustained development efforts. You know, as in, those that have a bunch of full-time developers, QA teams, technical writers, and so on. I admit there are a large number of smaller projects, and some of those have significant and unique value in this world. However, there is a big difference between, say, DRH working part-time on sqlite (never mind that it's a BSD-style license) and TrollTech (now Nokia) paying a bunch of developers to make sure that not only does their core code work, but that it actually works on all the key platforms, that it's well documented, that it can be easily installed and plugged into Visual Studio and other environments, and so on. I see a difference in those sorts of projects. I believe that MySQL, as a critical application that is widely used, and used by more average users, needs to fall more into the latter camp than the former. In other words, it's not enough to say that a handful of hackers will make cool changes on their own time. Someone needs to ensure that it all fits together, that it's safe, that it can be readily installed, that it's clearly documented, and so on. Almost as importantly, users need to be convinced of this for mainstream acceptance.

The way I see it, there are really just a handful of large, vibrant projects like this in the open source world, and they are almost entirely either funded by the dual-licensing model (for-profit) or run as non-profit organizations that receive large and sustained donations from companies with a vested interest in seeing the project succeed.

To be fair, I should explicitly mention a third model, which is basically a company like RedHat that primarily makes its money by providing a certified, packaged product and updates (which, I think, is very different from the "support services" envisioned by idealists) for businesses that value reliability over the bleeding edge. This model fundamentally depends on users needing some entity to organize and bless the release, because there is substantial complexity and uncertainty in the open source world. I do not think this model is that applicable to MySQL, given that they are essentially dealing with just one package and that it does not constantly need to be adapted to accommodate the latest and greatest hardware. It definitely limits what can be charged, and it is potentially marginalized by third parties that perform the same function. That said, it can be a mission-critical application for some businesses, and some might be willing to pay for that sense of security. I would not want to build such a business, though, or rely on it for my livelihood.

For one thing, any dual licensed software project requires that you sign over the rights if you are an outside contributor. Not many open source projects do that, because it generally eliminates outside development except from some special cases.

You are confusing cause and effect. Those dual-licensing projects are dominated by the inside team because their scale and scope require that kind of organization for most efforts, and it's not worth a lot of time trying to involve the community in major efforts (not so different from the subsidized non-profit model); it is not so much that people are philosophically opposed to making contributions that might get used in a closed-source project. If you contribute code to a dual-licensed project, your code will still be available under the GPL to everyone that wants to do an open source project. It simply gives a handful of closed-source developers an opportunity to build a downstream product with it (almost never any kind of competing product), and their financial contribution strengthens the open source efforts that you care about.

I don't think you should marginalize this as a model for a successful project.

Linux is a special case because it is a true platform for hardware vendors and device makers to sell their own products. Even there, there are limits to how much they can justify contributing to extend the platform to modern general-purpose desktops (to compete with, e.g., Windows and Mac OS) or business networks (Active Directory, file sharing, and so on). I know you can and probably will point to LDAP, Samba, whatever the latest and greatest window manager is, and so on, but I see a huge difference between them all. This is basically the 80/20 rule at work, i.e., ~20% of the effort has been put in for maybe ~80% of the results. Unfortunately, that's not enough to compete against mature products when the overwhelming majority of users have better things to do with their time than work through all the issues (presuming they even have the requisite skills).

We can argue about this back and forth till we both turn blue in the face, but I've said my piece. Thanks

about 5 years ago
top

Mickos Urges EU To Approve Oracle's MySQL Takeover

FallLine Re:Alternatives (67 comments)

From the perspective of the programs using it, nothing has changed

Except MySQL's users generally are not just installing a static set of code that will never need changing. They are implicitly wedding themselves to the product, depending on its backers to supply patches as needed and on the promise of future enhancements to support new hardware, better design, etc. In other words, the quality of the group behind the fork and its ability to attract funding for continued development is a major question for any serious user.

Even if all the original MySQL developers were to actually join some forker's company (provided they do not have non-competes in place), they would need to abandon the dual-license model that provided MySQL with much of its revenue, because Oracle would retain copyright ownership. That would call into question the company's ability to survive and fund continued development.

I, for one, would place low odds on that organization's ability to find a viable business model or to attract enough charitable contributions to operate as a non-profit. Even if it _could_, the uncertainty alone could prevent it from acquiring the critical mass it needs.

about 5 years ago
top

Mickos Urges EU To Approve Oracle's MySQL Takeover

FallLine Re:Not so fast (67 comments)

Linux doesn't, either.

Much of the Linux kernel's development is subsidized by several large companies with an interest in seeing it grow as a platform, plus a handful of embedded-device companies. Then you have distros like Redhat that make money essentially by selling update services. It's an unusual model for funding development and not one that is applicable to most other software markets. Linux's success as a product is almost entirely in the server space, and this is in part an accident of history (in my opinion), thanks to the web and the emergence of commodity hardware.

It's fine for what it does (web servers, devices, non-mission-critical servers, some scientific computing, etc). That said, I do not see it making any real inroads on the desktop, replacing Microsoft's whole network infrastructure (Active Directory, Exchange, etc), and so on, and a big part of the reason is that the licensing across the entire Linux ecosystem precludes building a successful for-profit COMPANY to drive that kind of end-to-end development effort.

PostgreSQL doesn't use hybrid licensing.

PostgreSQL is not a company, and it has had many years to acquire commercial supporters (like Redhat, EnterpriseDB, etc) and make its products more user-friendly. I actually like PostgreSQL these days, and I'm fairly impressed with its tools.

I suspect that EnterpriseDB has played a substantial role in PostgreSQL's recent enhancements (it is certainly one of the project's biggest financial contributors), and it enjoys the ability to sell proprietary enhancements to PostgreSQL. Regardless, I do not see this as a model that can be reliably replicated to maintain MySQL (never mind to create whole new products and services).

I think your argument is too focused on money being sent to some central authority, which distributes the money to various people to make a database system and provide support. However, that's not the only way that money can move, and money is not the only incentive for people to accomplish things.

Money is the primary reason substantial groups of people organize to work on projects with focus and direction, and to a schedule they generally would not otherwise follow. Some software projects just need a handful of devs working on them at their whim, but there is a whole world of software out there that requires much more: more developer hours, more focus, more insight, more varieties of contributors (technical writers, etc).

Granted, there are some exceptions, but that's just what they are: EXCEPTIONS. I do not think you can reasonably extrapolate from a handful of barely comparable situations and claim that the mere possibility of a support-based or no-business development model means there is no cause for concern.

I do not think it is an accident that the few vibrant open source products have been developed by companies that made money almost entirely by selling proprietary rights (dual licensing) or that have been subsidized heavily by some other business for its own reasons.

At the very least, if Oracle drops MySQL, it will impose a substantial delay and create great uncertainty until some kind of organization can coalesce, acquire support, convince users of its viability, attract developers, etc. The very fact that any couple hundred people can fork it can actually harm this effort.

In short, I do not see GPLv3 as a magic bullet for the survival of products just because anyone can fork them.

about 5 years ago

Submissions

FallLine hasn't submitted any stories.

Journals

FallLine has no journal entries.
