Brain Stimulation For Entertainment?

Paul Fernhout Re:Supernormal Stimuli & The Pleasure Trap (88 comments)

"Neuroadaptation" is the key issue of what stronger stuff does not taste better in the long term. We just can't always have the rush of the first taste of the potato chip (salt, fat, crunch) if we start eating them all the time. Our tastes just start to expect that level regularly and if we go back to food with less, we feel bad for a time until our tastes readjust again. The same thing might be true of direct brain stimulation?

From the Pleasure Trap article: "Like our other sensory nerves, our taste buds also will "get used to" a given level of stimulation -- and this can have dangerous consequences. The taste buds of the vast majority of people in industrialized societies are currently neuroadapted to artificially high-fat, high-sugar, and high-salt animal and processed foods. These foods are ultimately no more enjoyable than more healthful fare, but few people will ever see that this is true. This is because they consistently consume highly stimulating foods, and have "gotten used to" them. If they were to eat a less stimulating, health-promoting diet, they soon would enjoy such fare every bit as much. Unfortunately, very few people will ever realize this critically important fact. Instead, nearly all of these people will die prematurely of strokes, heart attacks, congestive heart failure, diabetes, and cancer as a result of self-destructive dietary choices."

Still, you said your experience differed. So I wonder what else might have been different. You used the word "almost". One issue is how frequently people eat junk food. Even a couple of times a week might be a problem?

Also, there is a certain style to cooking good healthy foods so they taste good. For example, vegetables should not be overcooked... Dr. Fuhrman and his wife have some good cooking tips in various videos.

Medically-supervised fasting is another way to reset taste buds (in about a week). That may be why most religions include fasting as part of their traditions (watered down these days). When I fasted for more than a week, afterwards stuff with salt and sugar tasted offensively strong. Simple soups and plain vegetables tasted great, with various flavor nuances. Sadly, over the last few years I've become readapted to stronger less-healthy stuff (living in a family with other people eating other stuff).

In general, it is a good question which aspects of modern technology have, overall, made us happier or less happy over the long term. Aspects of today's fancy computers (including 24x7 social media) may in some ways be increasing stress for people more than they make us happier? Too many choices can also be stressful. Anyway, a complex topic.


Economists Say Newest AI Technology Destroys More Jobs Than It Creates

Paul Fernhout Rethinking economics for AI and post-scarcity (638 comments)

LOL. :-)

My own comments on that: http://www.pdfernhout.net/post...
"In general, economists need to look at what are major sources of *real* cost as opposed to *fiat* cost in producing anything. Only then can one make a complete control system to manage resources within those real limits, perhaps using arbitrary fiat dollars as part of a rationing process to keep within the real limits and meet social objectives (or perhaps not, if the cost of enforcing rationing for some things like, say, home energy use or internet bandwidth exceeds the benefits).
    Here is a sample meta-theoretical framework PU economists no doubt could vastly improve on if they turned their minds to it. Consider three levels of nested perspectives on the same economic reality -- physical items, decision makers, and emergent properties of decision maker interactions. (Three levels of being or consciousness is a common theme in philosophical writings, usually rock, plant, and animal, or plant, animal, and human.)
    At a first level of perspective, the world we live in at any point in time can be considered to have physical content like land or tools or fusion reactors like the sun, energy flows like photons from the sun or electrons from lightning or in circuits, informational patterns like web page content or distributed language knowledge, and active regulating processes (including triggers, amplifiers, and feedback loops) built on the previous three types of things (physicality, energy flow, and informational patterns) embodied in living creatures, bi-metallic strip thermostats, or computer programs running on computer hardware.
    One can think of a second perspective on the first comprehensive one by picking out only the decision makers like bi-metallic strips in thermostats, computer programs running on computers, and personalities embodied in people and maybe someday robots or supercomputers, and looking at their characteristics as individual decision makers.
    One can then think of a third level of perspective on the second where decision makers may invent theories about how to control each other using various approaches like internet communication standards, ration unit tokens like fiat dollars, physical kanban tokens, narratives in emails, and so on. What the most useful theories are for controlling groups of decision makers is an interesting question, but I will not explore it in depth. But I will point out that complex system dynamics at this third level of perspective can emerge whether control involves fiat dollars, "kanban" tokens, centralized or distributed optimization based on perceived or predicted demand patterns, human-to-human discussions, something else entirely, or a diverse collection of all these things. And I will also point out that one should never confuse the reality of the physical system being controlled for the control signals (money, spoken words, kanban cards, internet packet contents, etc.) being passed around in the control system."


Economists Say Newest AI Technology Destroys More Jobs Than It Creates

Paul Fernhout Star Trek "waiters" like Guinan likely do more... (638 comments)

http://en.memory-alpha.org/wik... "Guinan was the mysterious bartender in Ten Forward, the lounge aboard the USS Enterprise-D. She was well known for her wise counsel, which had proven invaluable many times. She was an El-Aurian, a race of "listeners" who were scattered by the Borg. Q, however, once suggested that there is far more to her than could be imagined. "

Or consider Vincent's sometimes influential role in Eureka's Cafe Diem:
"Cafe Diem is the cafe of Vincent, on the main street of Eureka. It's the place where everybody meets to eat one of Vincent's extraordinary meals or have a cup of his signature "Vinspresso". "

James P. Hogan in "Voyage From Yesteryear" provides other examples of why some people wait tables in a gift economy -- even when robots could easily do it.

Also, in a post-scarcity future many undesirable aspects of any task can be engineered out. Tables might be built of materials that are easy to clean. Cleaning cloths might be super-absorbent. You might wear technology that made taking orders easy. Your boosted immune system would make catching a disease from a diner unlikely. And so on...

See Bob Black on this:
"Liberals say we should end employment discrimination. I say we should end employment. Conservatives support right-to-work laws. Following Karl Marx's wayward son-in-law Paul Lafargue, I support the right to be lazy. Leftists favor full employment. Like the surrealists -- except that I'm not kidding -- I favor full unemployment. Trotskyists agitate for permanent revolution. I agitate for permanent revelry. But if all the ideologues (as they do) advocate work -- and not only because they plan to make other people do theirs -- they are strangely reluctant to say so. They will carry on endlessly about wages, hours, working conditions, exploitation, productivity, profitability. They'll gladly talk about anything but work itself. These experts who offer to do our thinking for us rarely share their conclusions about work, for all its saliency in the lives of all of us. Among themselves they quibble over the details. Unions and management agree that we ought to sell the time of our lives in exchange for survival, although they haggle over the price. Marxists think we should be bossed by bureaucrats. Libertarians think we should be bossed by businessmen. Feminists don't care which form bossing takes, so long as the bosses are women. Clearly these ideology-mongers have serious differences over how to divvy up the spoils of power. Just as clearly, none of them have any objection to power as such and all of them want to keep us working. "

Or listen to or read "The Skills of Xanadu" by Theodore Sturgeon.

Why do people host dinner parties for friends when they involve "work"?

Why do people knit when they can buy machine-woven cloth for less than the cost of the raw yarn?

In some ways, waiting tables and preparing food are far more important jobs than most of what most people do for "paid" work these days... As Bob Black wrote in the above-linked essay:
    "I don't suggest that most work is salvageable in this way. But then most work isn't worth trying to save. Only a small and diminishing fraction of work serves any useful purpose independent of the defense and reproduction of the work-system and its political and legal appendages. Twenty years ago, Paul and Percival Goodman estimated that just five percent of the work then being done -- presumably the figure, if accurate, is lower now -- would satisfy our minimal needs for food, clothing and shelter. Theirs was only an educated guess but the main point is quite clear: directly or indirectly, most work serves the unproductive purposes of commerce or social control. Right off the bat we can liberate tens of millions of salesmen, soldiers, managers, cops, stockbrokers, clergymen, bankers, lawyers, teachers, landlords, security guards, ad-men and everyone who works for them. There is a snowball effect since every time you idle some bigshot you liberate his flunkies and underlings also. Thus the economy implodes.
    Forty percent of the workforce are white-collar workers, most of whom have some of the most tedious and idiotic jobs ever concocted. Entire industries, insurance and banking and real estate for instance, consist of nothing but useless paper-shuffling. It is no accident that the "tertiary sector," the service sector, is growing while the "secondary sector" (industry) stagnates and the "primary sector" (agriculture) nearly disappears. Because work is unnecessary except to those whose power it secures, workers are shifted from relatively useful to relatively useless occupations as a measure to ensure public order. Anything is better than nothing. That's why you can't go home just because you finish early. They want your time, enough of it to make you theirs, even if they have no use for most of it. Otherwise why hasn't the average work week gone down by more than a few minutes in the last fifty years?
    Next we can take a meat-cleaver to production work itself. No more war production, nuclear power, junk food, feminine hygiene deodorant -- and above all, no more auto industry to speak of. An occasional Stanley Steamer or Model T might be all right, but the auto-eroticism on which such pestholes as Detroit and Los Angeles depend is out of the question. Already, without even trying, we've virtually solved the energy crisis, the environmental crisis and assorted other insoluble social problems.
    Finally, we must do away with far and away the largest occupation, the one with the longest hours, the lowest pay and some of the most tedious tasks. I refer to housewives doing housework and child-rearing. By abolishing wage- labor and achieving full unemployment we undermine the sexual division of labor. The nuclear family as we know it is an inevitable adaptation to the division of labor imposed by modern wage-work. Like it or not, as things have been for the last century or two, it is economically rational for the man to bring home the bacon, for the woman to do the shitwork and provide him with a haven in a heartless world, and for the children to be marched off to youth concentration camps called "schools," primarily to keep them out of Mom's hair but still under control, and incidentally to acquire the habits of obedience and punctuality so necessary for workers. If you would be rid of patriarchy, get rid of the nuclear family whose unpaid "shadow work," as Ivan Illich says, makes possible the work-system that makes it necessary. Bound up with this no-nukes strategy is the abolition of childhood and the closing of the schools. There are more full-time students than full-time workers in this country. We need children as teachers, not students. They have a lot to contribute to the ludic revolution because they're better at playing than grown-ups are. Adults and children are not identical but they will become equal through interdependence. Only play can bridge the generation gap.
    I haven't as yet even mentioned the possibility of cutting way down on the little work that remains by automating and cybernizing it. All the scientists and engineers and technicians freed from bothering with war research and planned obsolescence should have a good time devising means to eliminate fatigue and tedium and danger from activities like mining. Undoubtedly they'll find other projects to amuse themselves with. Perhaps they'll set up world-wide all-inclusive multi-media communications systems or found space colonies. Perhaps. I myself am no gadget freak. I wouldn't care to live in a push button paradise. I don't want robot slaves to do everything; I want to do things myself. There is, I think, a place for labor-saving technology, but a modest place. The historical and pre-historical record is not encouraging. When productive technology went from hunting-gathering to agriculture and on to industry, work increased while skills and self-determination diminished. The further evolution of industrialism has accentuated what Harry Braverman called the degradation of work. Intelligent observers have always been aware of this. John Stuart Mill wrote that all the labor-saving inventions ever devised haven't saved a moment's labor. The enthusiastic technophiles -- Saint-Simon, Comte, Lenin, B.F. Skinner -- have always been unabashed authoritarians also; which is to say, technocrats. We should be more than sceptical about the promises of the computer mystics. They work like dogs; chances are, if they have their way, so will the rest of us. But if they have any particularized contributions more readily subordinated to human purposes than the run of high tech, let's give them a hearing.
    What I really want to see is work turned into play. A first step is to discard the notions of a "job" and an "occupation." Even activities that already have some ludic content lose most of it by being reduced to jobs which certain people, and only those people, are forced to do to the exclusion of all else. Is it not odd that farm workers toil painfully in the fields while their air-conditioned masters go home every weekend and putter about in their gardens? Under a system of permanent revelry, we will witness the Golden Age of the dilettante which will put the Renaissance to shame. There won't be any more jobs, just things to do and people to do them. "

So, are there any people who might find waiting tables in the right context to be fun or otherwise worthwhile? Given that many fun or worthwhile things can be hard or have some unpleasant parts?


Economists Say Newest AI Technology Destroys More Jobs Than It Creates

Paul Fernhout Being a good parent takes a lot of time... (638 comments)

As does being an informed citizen, a good neighbor, a good friend, a good sibling, a good storyteller tailored for local needs, and so on. So, always lots of important things to do even when we don't need to "work" for someone else for a wage...

Check out: http://www.primitivism.com/ori...
"When Herskovits (13) was writing his Economic Anthropology (1958), it was common anthropological practice to take the Bushmen or the native Australians as "a classic illustration; of a people whose economic resources are of the scantiest", so precariously situated that "only the most intense application makes survival possible". Today the "classic" understanding can be fairly reversed- on evidence largely from these two groups. A good case can be made that hunters and gatherers work less than we do; and, rather than a continuous travail, the food quest is intermittent, leisure abundant, and there is a greater amount of sleep in the daytime per capita per year than in any other condition of society.
    The most obvious, immediate conclusion is that the people do not work hard. The average length of time per person per day put into the appropriation and preparation of food was four or five hours. Moreover, they do not work continuously. The subsistence quest was highly intermittent. It would stop for the time being when the people had procured enough for the time being. which left them plenty of time to spare. Clearly in subsistence as in other sectors of production, we have to do with an economy of specific, limited objectives. By hunting and gathering these objectives are apt to be irregularly accomplished, so the work pattern becomes correspondingly erratic."

See also my essay: "Basic income from a millionaire's perspective? "

Or more general on post-scarcity: http://www.pdfernhout.net/post...


Brain Stimulation For Entertainment?

Paul Fernhout Supernormal Stimuli & The Pleasure Trap (88 comments)

"Harvard psychologist Deirdre Barrett argues that supernormal stimulation govern the behavior of humans as powerfully as that of animals. In her 2010 book, Supernormal Stimuli: How Primal Urges Overran Their Evolutionary Purpose,[9] she examines the impact of supernormal stimuli on the diversion of impulses for nurturing, sexuality, romance, territoriality, defense, and the entertainment industry's hijacking of our social instincts. In the earlier book, Waistland,[2] she explains junk food as an exaggerated stimulus to cravings for salt, sugar, and fats and television as an exaggeration of social cues of laughter, smiling faces and attention-grabbing action. Modern artifacts may activate instinctive responses which evolved in a world without magazine centerfolds or double cheeseburgers, where breast development was a sign of health and fertility in a prospective mate, and fat was a rare and vital nutrient. ..."

"An abundance of food, by itself, is not a cause of health problems. But modern technology has done more than to simply make food perpetually abundant. Food also has been made artificially tastier. Food is often more stimulating than ever before--as the particular chemicals in foods that cause pleasure reactions have been isolated--and artificially concentrated. These chemicals include fats (including oils), refined carbohydrates (such as refined sugar and flour), and salt. Meats were once consumed mostly in the form of wild game--typically about 15% fat. Today's meat is a much different product. Chemically and hormonally engineered, it can be as high as 50% fat or more. Ice cream is an extraordinary invention for intensifying taste pleasure--an artificial concoction of pure fat and refined sugar. Once an expensive delicacy, it is now a daily ritual for many people. French fries and potato chips, laden with artificially-concentrated fats, are currently the most commonly consumed "vegetable" in our society. As Dr. Fuhrman reports in his excellent volume Eat to Live, these artificial products, and others like them, comprise a whopping 93% American diet. Our teenage population, for example, consumes up to 25% of their calories in the form of soda pop!
    Most of our citizenry can't imagine how it could be any other way. To remove (or dramatically reduce) such products from America's daily diet seems intolerable--even absurd. Most people believe that if they were to do so, they would enjoy their food--and their lives--much less. Indeed, most people believe that they would literally suffer if they consumed a health-promoting diet devoid of such indulgences. But, it is here that their perception is greatly in error. The reality is that humans are well designed to fully enjoy the subtler tastes of whole natural foods, but are poorly equipped to realize this fact. And like a frog sitting in dangerously hot water, most people are being slowly destroyed by the limitations of their awareness. ..."


BitTorrent Launches Project Maelstrom, the First Torrent-Based Browser

Paul Fernhout Subsistence, Gift, Exchange, Planned & More (67 comments)

On alternatives to profit-making websites that emphasize types of transactions other than exchange, see my comment: "1. Outdoor Holiday Lights 2. ??? 3. Profit!" http://slashdot.org/comments.p...

As I mention there, I've been working on-and-off towards software for supporting a social semantic desktop. Many others have, of course (like with NEPOMUK); I'm just one more. Maelstrom sounds like it may be heading in that direction too.

I have some later stuff I have not released yet, but it is pretty similar to this:
"A step towards a social semantic desktop in JavaScript using a NodeJS or PHP backend "

A key idea there is to write applications that spread their content state across a set of files, where you change the content state by adding a new file rather than changing an existing file.

So, for a simple example, imagine you have a document you can find by some UUID. When you make a new version of it, you write out a new file that references the same UUID but has a later timestamp. When you want to display the content, you search through all the versions of the document you have and display the one with the latest timestamp. Every actual file can be referenced by its SHA256 hash value and its length.

Now, things can rapidly get more complex than that, for example by having hyperdocuments where only part of the document is in each file, and so on. That requires a somewhat different style of writing applications than is typical today.

In that version, you can have log files you add to, which can be generated by the system as it accepts new files and sees if they have special indexing tags. You can also have git-like variables that represent a pointer to a specific file and which can only be changed if you present the current version of the variable.
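A minimal sketch of such a git-like variable, assuming the update rule is a simple compare-and-swap (the names and the in-memory Map are illustrative, not the actual system):

```javascript
// A named pointer to a specific file (represented here by a hash
// string) that can only be updated by presenting the value you
// believe is current -- like updating a git branch ref.
class PointerStore {
  constructor() {
    this.pointers = new Map();
  }

  // Update succeeds only if `expected` matches the current value;
  // otherwise the caller must re-read the pointer and retry.
  compareAndSet(name, expected, next) {
    const current = this.pointers.get(name);
    if (current !== expected) return false;
    this.pointers.set(name, next);
    return true;
  }

  get(name) {
    return this.pointers.get(name);
  }
}
```

The point of the compare-and-swap rule is that two writers racing to move the same pointer cannot silently overwrite each other: one of them will fail and be forced to reconcile first.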

That older version is a bit more complicated than the one I'm working on now, which has been progressing mostly by subtraction. :-) In the new version (not yet on GitHub, but I plan to put it there at some point), I got rid of the logs and variables and replaced them with in-memory indexes of all content, which is always a JSON document. Standard indexing of the files is simple and mainly just enough to find related ones, which you can process or index further locally. Indexing in the server is based mainly on files having an optional ID (representing a document, potentially with versions under the same ID) and optional tags (to provide context about hyperdocuments), as well as a SHA256 hash and length for direct retrieval. You can also query a server for files that match those IDs. Eventually, I see those queries as being like "magnet URIs".

I've been writing a Single Page Application in JavaScript that uses that new backend to support "Participative Narrative Inquiry" (implementing ideas outlined in my wife's book "Working with Stories").

I think there is great potential for such tools for community dialog, community planning, and community design. I have a video related to that on the front page of the site that is currently running the Pointrel20130202 software:

Of course *many* people have been working towards a social semantic desktop (like NEPOMUK). And there are many document-oriented databases (CouchDB, MongoDB, etc.) and a variety of other databases of different sorts. These are just my own experiments and I don't know if they will succeed in being generally useful. I remain hopeful that someone will develop a general purpose system for this and it will be useful for communications, planning, and design. Maybe Maelstrom (or Maelstrom plus some new apps written in the way described above) will be it.

The Theodore Sturgeon short sci-fi story "The Skills of Xanadu" is part of my own inspiration. Both of these links are, ironically, down at the moment (background info and an audio version of the story):

This is still up, with the text of the story:

Other ideas and inspirations (from 2006 and earlier):
" Hyperscope is a browsing tool that enables most of the viewing and navigating features called for in Doug Engelbart's open hyperdocument system framework (OHS) to support dynamic knowledge repositories (DKRs) and rising Collective IQ. HyperScope works with the Mozilla Firefox version 2.0."

And "Memex" is another inspiration, as essentially a distributed system where people are making copies of information to share (photographically in that case).

Of course, none of that solves the problem you raised of good content being costly to produce because it takes a lot of time. That remains true. But, using Google, reading lots of blogs and sites like Slashdot, and participating on various mailing lists, I've seen that there are many economic alternatives, which I discuss on my site. It is probably only because of cheap and easy access to all that information that I was able to educate myself on these topics as much as I have (there is always more to learn and self-correct).

There is a lot of truth to this comment by C Mattix:
"Sid Meier is a time traveler"
"I get to break this out again:
                As the Americans learned so painfully in Earth's final century, free flow of information is the only safeguard against tyranny. The once-chained people whose leaders at last lose their grip on information flow will soon burst with freedom and vitality, but the free nation gradually constricting its grip on public discourse has begun its rapid slide into despotism. Beware of he who would deny you access to information, for in his heart he dreams himself your master.
                                Commissioner Pravin Lal, "U.N. Declaration of Rights"
                                Accompanies the Secret Project "The Planetary Datalinks"

Of course, here is a whole dictionary of alternatives someone told me about, so it is not as if that knowledge is hard to find if you want to find it:
"The Dictionary of Alternatives: Utopianism and Organization" by Martin Parker, Valerie Fournier, Patrick Reedy
"This dictionary provides ammunition for those who disagree with the early twentieth-first century orthodoxy that 'There is no alternative to free market liberalism and managerialism'. Using hundreds of entries and cross-references, it proves that there are many alternatives to the way that we currently organize ourselves. These alternatives could be expressed as fictional utopias, they could be excavated from the past, or they could be described in terms of the contemporary politics of anti-corporate protest, environmentalism, feminism and localism. Part reference work, part source book, and part polemic, this dictionary provides a rich understanding of the ways in which fiction, history and today's politics provide different ways of thinking about how we can and should organize for the coming century"

Although putting healthy alternatives into practice is generally far harder than just learning that they exist in theory. There is a history behind why we have the current systems we have. For example, see Howard Zinn's chapter "Columbus, The Indians, and Human Progress" on Columbus and the genocide of the Arawaks in Haiti for gold. And the system does not change easily even when almost everyone agrees change makes sense (the continued existence of JavaScript's warts, like default globals, is a prime example, even though pretty much every JavaScript developer, including Brendan Eich, would like the default to be otherwise).

To quote Zinn, on what Haiti used to be like before Western Imperialism arrived:
" Arawak men and women, naked, tawny, and full of wonder, emerged from their villages onto the island's beaches and swam out to get a closer look at the strange big boat. When Columbus and his sailors came ashore, carrying swords, speaking oddly, the Arawaks ran to greet them, brought them food, water, gifts. He later wrote of this in his log:
        "They ... brought us parrots and balls of cotton and spears and many other things, which they exchanged for the glass beads and hawks' bells. They willingly traded everything they owned... . They were well-built, with good bodies and handsome features.... They do not bear arms, and do not know them, for I showed them a sword, they took it by the edge and cut themselves out of ignorance. They have no iron. Their spears are made of cane... . They would make fine servants.... With fifty men we could subjugate them all and make them do whatever we want."
    These Arawaks of the Bahama Islands were much like Indians on the mainland, who were remarkable (European observers were to say again and again) for their hospitality, their belief in sharing. These traits did not stand out in the Europe of the Renaissance, dominated as it was by the religion of popes, the government of kings, the frenzy for money that marked Western civilization and its first messenger to the Americas, Christopher Columbus. ...
      The Indians, Las Casas says, have no religion, at least no temples. They live in "large communal bell-shaped buildings, housing up to 600 people at one time ... made of very strong wood and roofed with palm leaves.... They prize bird feathers of various colors, beads made of fishbones, and green and white stones with which they adorn their ears and lips, but they put no value on gold and other precious things. They lack all manner of commerce, neither buying nor selling, and rely exclusively on their natural environment for maintenance. They are extremely generous with their possessions and by the same token covet the possessions of their friends and expect the same degree of liberality. ...""

When the world wide web first got going substantially in the late 1990s, there had been a hope the web would be a different sort of society -- perhaps one more like the (better parts) of what the Arawaks had. Do we all really need to ignore the social and cultural wealth from the web staring us in the face like Columbus did and instead focus on shiny yellow stuff we can't eat and which can't keep us warm and which can't tell us interesting stories?

As Philip Greenspun wrote more than a decade ago:
"One of the beauties of Web publishing is that it can be free or nearly free. It can be done in such a way that it need not make money. And indeed if your site is destined to lose money it is much less humiliating when you can say that making money wasn't the idea. Nonetheless there are plenty of folks who've forgotten that greed is one of the seven deadly sins. This chapter, therefore, is about how to make money on the Internet. ...
        The Web is a powerful medium for personal expression, for sharing knowledge, and for teaching. It has also made a lot of people very wealthy, but that doesn't mean you can get rich by adding banner ads and referral links to what started out as a beautiful non-commercial site.
        Aside from those who started out with a decades-old centralized computerized database of some sort, the real money in the Internet business has been made by those who operate online communities."

The promise of Maelstrom, or a Social Semantic Desktop, or even pre-spam email for that matter, is to support a distributed community without the need for a business model that makes money to host a few heavily loaded servers. If that community is large enough, there will always be useful and interesting content created on it for whatever reasons. The biggest problem these days is more the other way around -- there is too much content of low quality, and someone needs to weed through it. Again, that is a place the community can help. As explained here:
"The Internet, electronic mail, and the Web have revolutionized the way we communicate and collaborate - their mass adoption is one of the major technological success stories of the 20th century. We all are now much more connected, and in turn face new resulting problems: information overload caused by insufficient support for information organization and collaboration. For example, sending a single file to a mailing list multiplies the cognitive processing effort of filtering and organizing this file times the number of recipients - leading to more and more of peoples' time going into information filtering and information management activities. There is a need for smarter and more fine-grained computer support for personal and networked information that has to blend the boundaries between personal and group data, while simultaneously safeguarding privacy and establishing and deploying trust among collaborators."

Those are the sorts of tools needed on top of Maelstrom or whatever other distributed systems we use.

My own suggestions on that:
"This suggestion is about how civilians could benefit by having access to the sorts of "sensemaking" tools the intelligence community (as well as corporations) aspire to have, in order to design more joyful, secure, and healthy civilian communities (including through creating a more sustainable and resilient open manufacturing infrastructure for such communities). It outlines (including at a linked elaboration) why the intelligence community should consider funding the creation of such free and open source software (FOSS) "dual use" intelligence applications as a way to reduce global tensions through increased local prosperity, health, and intrinsic mutual security.
    I feel open source tools for collaborative structured arguments, multiple perspective analysis, agent-based simulation, and so on, used together for making sense of what is going on in the world, are important to our democracy, security, and prosperity. Imagine if, instead of blog posts and comments on topics, we had searchable structured arguments about simulations and their results, all with assumptions defined from different perspectives, where one could see at a glance how different subsets of the community felt about the progress or completeness of different arguments or action plans (somewhat like a debate flow diagram), and where even a year or two later one could go back to an existing debate and expand on it with new ideas. As good as, say, Slashdot is, such a comprehensive open source sensemaking system would be to Slashdot as Slashdot is to a static webpage. It might help prevent so much rehashing of the same old arguments, because one could easily find and build on previous ones. ..."

Do I know how to do that? Not really -- I'm learning as I go, as are so many other people out there. As Van Gogh wrote: "I am always doing what I can't do yet in order to learn how to do it". :-)

As far as software goes, imagine a system where, for example, you could download this essay and, if you wanted, all the related links; keep a local copy of all the content (including this Slashdot page); still follow all the links in an easy way; continue to annotate, comment on, and summarize the content locally; and share such changes with others when you wanted to. We don't yet have such a seamless system for doing that, but someday we might. And overall, I think that would be a good thing.
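To make one piece of that concrete, here is a minimal sketch of the kind of local annotation store such a tool might build on. All the names here (AnnotationStore, annotate, exportNotes) are invented for illustration; no existing system is implied.

```javascript
// Hypothetical sketch of a local annotation store for archived pages.
// All names here are invented for illustration; no existing tool is implied.
function AnnotationStore() {
  this.notes = {}; // annotations keyed by page URL
}

// Attach a comment to some part of a locally archived page.
AnnotationStore.prototype.annotate = function (url, selector, comment) {
  if (!this.notes[url]) { this.notes[url] = []; }
  this.notes[url].push({ selector: selector, comment: comment });
};

// Produce a JSON snapshot that could be shared with others when desired.
AnnotationStore.prototype.exportNotes = function () {
  return JSON.stringify(this.notes);
};

// Usage: annotate a local copy of a page, then export for sharing.
var store = new AnnotationStore();
store.annotate("http://example.org/essay", "p:nth-of-type(2)", "Key claim here");
console.log(store.exportNotes());
```

The storage part is the easy bit; the interesting (and unsolved) parts are keeping archived copies in sync, following links offline, and merging other people's annotation snapshots back in.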

about a week ago

Fraud Bots Cost Advertisers $6 Billion

Paul Fernhout 1. Outdoor Holiday Lights 2. ??? 3. Profit! (190 comments)

The biggest problem here is ignoring that there are different types of transactions in a community, including subsistence, gift, exchange, planned, and theft (as discussed on my own website). Selling eyeballs to advertisers to fund a website is primarily an exchange-economy transaction. But, as with putting up holiday lights just to make the darkness cheery, there can be gift giving involved in an action (even with a substantial power bill for the lights). You put up lights this year in one place, someone else puts up lights some other year somewhere else, and we all (in theory) enjoy the spectacle. Or, just as many towns have tax-funded street lights for safety and convenience, government agencies like NOAA can put up useful websites about the weather with hazardous weather alerts, or NASA can put up useful websites about space science. People can also put up personal websites with journals or "How To" documents just because they are useful or interesting to themselves and their families (subsistence), and accept that it is OK if others look at them.

About a dozen years ago, I read somewhere on Philip Greenspun's website (on making websites), a comment to the effect that, if people announce they are getting a cat, or learning to play the piano, or taking vegetarian cooking lessons, people very rarely ask, how are you going to make money at that? But when people start a website, that seems to be the first question other people ask.

Of course, things have changed a bit now that so many people use Facebook or similar instead of hosting their own websites. It's ironic, since it is so cheap to host your own content now on a paid website (US$5 per month for a cheap one?) or even free on GitHub Pages and similar. Or you can get a FreedomBox-like "wall wart" server that just serves content through your ISP (in theory, since many ISPs prohibit servers on personal accounts).

I plan another comment related to the Pointrel software ideas I've been working on (including a social semantic desktop) and how they overlap with the ideas discussed in the BitTorrent Project Maelstrom for distributed content. My work is still in flux (and may never succeed), as are other works in progress like FreedomBox or Maelstrom. But the point is, more options are emerging for creating and distributing content, and we may, at some point, get away from centralized servers and back to the older model where people kept local copies of books and papers, or went to local libraries for copies of such. The model of the web right now is like expecting that every time someone wants to read something, they must visit the office of the person who wrote it. And if that person's office door is closed, you can't read it. Yes, people can make copies of, say, Wikipedia pages, but the context is lost and the copies are hard to manage. We can do better as a society.

However, it is fair to ask how people can survive physically and financially in the 21st century. I feel a basic income for everyone in the USA (not just people over 65 on "Social Security") and in other countries too could be part of the answer, and that such a world would overall be a better place, with more creativity, more subsistence production, more gift giving, and healthier participation by citizens in government planning -- and with less theft by "clickfraud" or other means. However, even without a basic income, the "gift economy" aspect of the internet has saved me a lot of money and trouble, from people generally freely sharing advice (including links to free software) on personal blogs (or on an advertising-supported site like Slashdot). I hope my own contributions as part of that informational gift economy will prove worthwhile and useful at least to some people here and there.

about a week ago

Neglecting the Lessons of Cypherpunk History

Paul Fernhout Encryption is conceptually broken because... (103 comments)

... you can't organize a mass political movement or broad cultural change by hiding what you are doing. You need to convince people to believe in a cause and be willing to commit resources to support it. And overall that requires broad mass communications and engaging more and more people, any one of whom could report you to "authorities". Successful broad change in a democracy is going to be focused on legal & non-violent means to change public opinion. Encryption is generally about hiding communications and their contents, which is the opposite of what you need to be doing to make large scale social change.

The argument for encryption as security is much like the argument for personal handgun ownership. While you can make a case for either as an individual solution for personal protection, neither does much by itself to change the societal culture (including spending policies and laws) in ways that make the community healthier and safer. An emphasis on such things shows a fundamental misunderstanding of the core social problems that confront us in building healthier communities around shared values.

Encrypted communications also don't help much when the person you are communicating with forwards everything to someone you don't know. And, as an XKCD comic shows, a $5 wrench can defeat most encryption fairly straightforwardly. Encrypted communications can also be compromised in practice in any number of ways, which then leaves you with a false sense of security, depending on something you should not be trusting. So, not only is a focus on encryption misleading, it is dangerous.

Sure, encryption may enhance privacy, and in that sense affect the balance of power between individual and state, and it is useful for protecting commercial transactions against criminals. It has its place. But that place is not at the heart of making social change of the kind we need for the 21st century -- which I feel relates more to making the most of abundant modern technology despite a culture and habits of mind adapted for scarcity.

Or as I've said elsewhere:
"As I see it, there is a race going on. The race is between two trends. On the one hand, the internet can be used to profile and round up dissenters to the scarcity-based economic status quo (thus legitimate worries about privacy and something like TIA). On the other hand, the internet can be used to change the status quo in various ways (better designs, better science, stronger social networks advocating for things like a basic income, all supported by better structured arguments like with the Genoa II approach) to the point where there is abundance for all and rounding up dissenters to mainstream economics is a non-issue because material abundance is everywhere. So, as Bucky Fuller said, whether it will be Utopia or Oblivion will be a touch-and-go relay race to the very end. While I can't guarantee success at the second option of using the internet for abundance for all, I can guarantee that if we do nothing, the first option of using the internet to round up dissenters (or really, anybody who is different, like was done using IBM [punched card tabulators] in WWII Germany) will probably prevail. So, I feel the global public really needs access to these sorts of sensemaking tools in an open source way, and the way to use them is not so much to "fight back" as to "transform and/or transcend the system". As Bucky Fuller said, you never change things by fighting the old paradigm directly; you change things by inventing a new way that makes the old paradigm obsolete. ...
        As with that notion of "mutual security", the US intelligence community needs to look beyond seeing an intelligence tool as just something proprietary that gives a "friendly" analyst some advantage over an "unfriendly" analyst. Instead, the intelligence community could begin to see the potential for a free and open source intelligence tool as a way to promote "friendship" across the planet by dispelling some of the gloom of "want and ignorance" (see the scene in "A Christmas Carol" with Scrooge and a Christmas Spirit) that we still have all too much of around the planet. So, beyond supporting legitimate US intelligence needs (useful with their own closed sources of data), supporting a free and open source intelligence tool (and related open datasets) could become a strategic part of US (or other nation's) "diplomacy" and constructive outreach."

And also:
"Our biggest advantage is that no one takes us seriously. :-)
        And our second biggest advantage is that our communications are monitored, which provides a channel by which we can turn enemies into friends. :-)
        And our third biggest advantage is we have no assets, and so are not a profitable target and have nothing serious to fight over amongst ourselves. :-)"
Let's hope those advantages all hold true for a long time. :-)

Other ideas:
"Fresh Start For the Left: What Activists Would Do If They Took the Social Sciences Seriously"

about two weeks ago

Should IT Professionals Be Exempt From Overtime Regulations?

Paul Fernhout Lochner v. New York (1905) to Parrish etc. (1937) (545 comments)

That is a great historical link, thanks! And that leads to this other one on West Coast Hotel Co. v. Parrish, for two Supreme Court decisions that led up to the Great Depression and then its resolution:
"West Coast Hotel Co. v. Parrish, 300 U.S. 379 (1937), was a decision by the United States Supreme Court upholding the constitutionality of minimum wage legislation enacted by the State of Washington, overturning an earlier decision in Adkins v. Children's Hospital, 261 U.S. 525 (1923). The decision is usually regarded as having ended the Lochner era, a period in American legal history during which the Supreme Court tended to invalidate legislation aimed at regulating business.[1]"

about two weeks ago

Should IT Professionals Be Exempt From Overtime Regulations?

Paul Fernhout Studies show hours worked past 40/wk unproductive (545 comments)

So, ultimately, the whole thing is self-defeating. Crunch times may be one thing, but on a regular basis, productivity declines even as people look busy.

One example:
"The most essential thing to know about the 40-hour work-week is that, while it was the unions that pushed it, business leaders ultimately went along with it because their own data convinced them this was a solid, hard-nosed business decision....
        Evan Robinson, a software engineer with a long interest in programmer productivity (full disclosure: our shared last name is not a coincidence) summarized this history in a white paper he wrote for the International Game Developers' Association in 2005. The original paper contains a wealth of links to studies conducted by businesses, universities, industry associations and the military that supported early-20th-century leaders as they embraced the short week. 'Throughout the '30s, '40s and '50s, these studies were apparently conducted by the hundreds,' writes Robinson; 'and by the 1960s, the benefits of the 40-hour week were accepted almost beyond question in corporate America. In 1962, the Chamber of Commerce even published a pamphlet extolling the productivity gains of reduced hours.'
        What these studies showed, over and over, was that industrial workers have eight good, reliable hours a day in them. On average, you get no more widgets out of a 10-hour day than you do out of an eight-hour day."

With software, it is so easy to introduce a bug when you are tired or distracted (one reason team programming often saves money). A bug (especially a conceptual one) can be very expensive to track down later, especially if it makes its way into production. How many times have programmers spent days chasing a bug that was a one-line fix? So, it may well be the case that hours worked past 40 per week mean *negative* productivity and higher costs, even when the employee is not paid for those hours.

There is another complicating factor. Big companies in the 1970s such as HP or IBM invested in actually training employees, creating the pool of workers that Silicon Valley drew from initially. Investing in employee training is now rare, due in part to little loyalty on either side of the employee/employer relationship in many companies. So, given that the tech industry moves so fast, where does the training time come from (including time to read Slashdot :-)? Ideally, training should happen during those 40 hours. But in practice, many people working in IT have to keep current on their own time.

Yet training produces many benefits:
"A new study from a team of European researchers found that job training may also be a good strategy for companies looking to hire and retain top talent. When workers felt like they had received better job training options, they were also more likely to report a greater sense of commitment to their employer.
    For the study, psychological scientists Rita Fontinha, Maria Jose Chambel, and Nele De Cuyper looked at IT outsourcers in Portugal, who must constantly update their skills in order to keep up with the fast pace of new technology. The researchers hypothesized that when people were happy with the training opportunities their employer provided, they would be more motivated to reciprocate with an enhanced sense of loyalty to the company.
    This kind of informal balance of expectations between employees and management is known as a "psychological contract." When workers feel that their employer has fulfilled their obligations under the psychological contract, they're more motivated to uphold their end of the perceived bargain by working hard and staying with the company."

As you point out, this culture of (needless) overwork does discriminate against people with families. Likely it contributes to the frequent focus on hiring young programmers who can put in 60-80 hour weeks? The same programmers who often ignore customer service and will leave at a moment's notice, chasing the next shiny new thing that comes along, increasing workload through the cost of turnover?

So, there is a big IT cultural issue here... It is not clear how to fix it, but regulations like those of the 1970s that had companies paying for overtime could be a start. As with the original 40-hour work week, if IT companies accept these overtime rules, it will probably be because they realize it is ultimately in their own financial interest and a way to level the playing field for everyone...

Soul Skill added the link to the DSLE page with overtime regulations, which say: "The exemption described above does not apply to an employee if any of the following apply: ... The employee is an engineer, drafter, machinist, or other professional whose work is highly dependent upon or facilitated by the use of computers and computer software programs and who is skilled in computer-aided design software, including CAD/CAM, but who is not in a computer systems analysis or programming occupation."

I can only think that engineers and machinists had better union representation back then? :-) I can wonder, with the successful and amazing Orion launch today by NASA, were most of the engineers and machinists working on that Orion project working 80-hour weeks with no overtime pay? Would the launch have gone better with sleepy machinists milling the rocket exhaust nozzles? Would the launch have gone better if the ground crew fueling it had been up half the night? It seems crazy to consider such things as even plausible -- no good engineering manager would allow that -- yet those are exactly the conditions so many software projects labor under. Is it any wonder there are so many bugs?

That said, engineering managers have gotten good at estimating projects, whereas "Software is Hard". So, mis-estimates will continually put pressure on everyone to deliver by superhuman effort. Here is an insightful idea, if also an ironic challenge:
"Scott Rosenberg coins this as Rosenberg's Law: Software is easy to make, except when you want it to do something new. The corollary is, The only software that's worth making is software that does something new."

about two weeks ago

Should IT Professionals Be Exempt From Overtime Regulations?

Paul Fernhout "Working hours: Get a life" at economist.com (545 comments)

Thanks for the link, AC: http://www.economist.com/blogs...
"Working hours: Get a life ... The Greeks are some of the most hardworking in the OECD, putting in over 2,000 hours a year on average. Germans, on the other hand, are comparative slackers, working about 1,400 hours each year. But German productivity is about 70% higher. ... So maybe we should be more self-critical about how much we work. Working less may make us more productive. And, as Russell argued, working less will guarantee "happiness and joy of life, instead of frayed nerves, weariness, and dyspepsia"."

Interesting comments there like on work culture in South Korea, and I've just read the first couple comments of hundreds...

about two weeks ago

Node.js Forked By Top Contributors

Paul Fernhout Felt similar about the "firing" bit as extreme (254 comments)

I especially liked the link to "empathy is a core engineering value" though: http://www.listbox.com/member/...

Linked from: https://www.joyent.com/blog/th...

And if so, should not empathy extend throughout all levels of a learning organization, including between managers and subordinates? Everyone is learning all the time, including about cultural changes. Firing someone, rather than first trying to understand the situation, the individual's motives, and whether change is needed or possible, does not seem "empathic". Perhaps that is the kind of thing you tend to learn after many years of experience as a parent or other long-term caregiver (including a long-term manager or mentor), when you see someone learn and grow and change over a long time?

Plus, as other comments here suggest, the blog post may ignore the possibility that the issue was about consolidating minor changes rather than making them as individual commits. If this issue was deemed important by enough of the community, maybe a more systematic patch would indeed be in order? One tiny change is not much work, but it may set a bad precedent?

Also, it is not empathic to the coworkers, company, and community depending on someone to fire that person without notice, and without reasonable review or attempts at remediation, for a less-than-egregious offense (contrast with, say, someone accused of physically assaulting a coworker). The issue is one of proportion and risk/harm assessment.

So, the response of "we would have fired him" seems too extreme in multiple ways.

I am all for meaningful diversity in workgroups, like discussed in this book:
"The Difference: How the Power of Diversity Creates Better Groups, Firms, Schools, and Societies"

However, the problem with some of these "politically correct" initiatives or statements, which on the surface seem helpful for promoting "diversity", is that they can actually make workplaces more stressful for *everyone*. Someone can bully with the rules (or their interpretation) just as much as, or more than, with a fist... Here is a website by psychologist Izzy Kalman that explores some issues related to bullying and truly creating happy, productive workplaces by *really* emphasizing empathy, forgiveness, growth, and free speech:

Just think about it -- does everyone at Joyent now need to be afraid of getting fired if they check the word "he" into the codebase, even by accident? Or by accidentally saying "he" at a meeting? There are potential unintended consequences of creating a different sort of hostile workplace climate, as many US schools are finding out these days as a result of "zero tolerance" policies (where biting a cracker at lunch into the shape of a gun can get you in deep, deep trouble).

For reference, here is what makes for happy productive creative workplaces in general (Autonomy, Mastery, Purpose):
"RSA Animate - Drive: The surprising truth about what motivates people"

Anyway, these are all complex issues about language, sex, management, control, gender roles, cultural change, recruitment, productivity, norms, and more. They are tricky to talk about or write about without seeming uncaring or inept because of various assumptions people make about the context or the people involved -- and the fact that none of us are "perfect" (and that perfection can be in the eye of the beholder based on priorities). It is sad to see such great software get mired in them. But I guess they are present in some form wherever we go or whatever we do.

BTW, since this whole furor is supposedly at root about making women feel more accepted and happy in technology, here is a NY Times story on what has made women happier than average in at least one Western country:
"Why Dutch women don't get depressed"

A tangential quote from there: :-)
    "Once married, however, sex often took a back seat; for some early Calvinists even sex within marriage was sinful, de Bruin says, and Dutch women sublimated their sexual energy into domestic bullying.
    "They ordered the men around - there are many stories of bossy women and subordinate men," she said. "We know this from the literature of the 16th century, and it hasn't changed."
    Modern Dutch men are expected to share the chores at home, "without being told, or when told," de Bruin said. The Dutch woman "wants the man to do housework to help her feel equal, but he has to do it her way."
    Which perhaps raises the question, do Dutch men get depressed?
    Not much, according to de Bruin, who says that the behavior of the sexes evolved simultaneously, that Dutch men like their women bossy while Dutch women are not keen on macho men. Still, she sympathizes with men who have to negotiate a jungle of rules that they never understand and that are always set by women.
    "Luckily," she said, "most men have enough Tarzan in them to like a bit of a jungle.""

That is an example of how happiness relates, in part, to cultural expectations and their acceptance. Reading that, I can perhaps understand my own parents' marriage a bit better (both being immigrants to the USA from the Netherlands), including in contrast to the family lives of the mostly Catholic Italian and German families I grew up around -- as well as the stereotyped TV families...

Also on that theme of women in the Netherlands:
"IN AMERICA, as pretty much everywhere in the world, the happy narrative of development and freedom has involved more women working in the cash economy, achieving financial independence and thus greater autonomy. It's interesting when you find a country that seems to buck these sorts of universal narratives, and as Jessica Olien points out in Slate, the Netherlands bucks the women's-development narrative in a pretty odd fashion: it has extremely high indicators for gender equality in every way (education, political participation, little violence against women, ultra-low rates of teen conception and abortion) except that women don't work. Or not full-time, anyway, at anything like the rates at which women work in most OECD countries. ... "When I talk to women who spend half the week doing what they want -- playing sports, planting gardens, doing art projects, hanging out with their children, volunteering, and meeting their family friends -- I think, yes, that sounds wonderful. I can look around at the busy midweek, midday markets and town squares and picture myself leisurely buying produce or having coffee with friends.""

Personally, I feel the movement of women into the paid workforce (and out of roles in the subsistence, gift, and planned economies; see my website on that) has weakened the USA, given that men have not moved to take up the slack. As a consequence, in the USA, we see our home life suffering for want of real cooked food from private gardens, our non-profits suffering for lack of volunteers, and our politics failing for lack of anyone to pay attention or volunteer in political campaigns or go to town meetings. Plus, women in the USA are a lot more unhappy now than before (according to studies), probably because many of them gave up the potential for a role with a lot of autonomy, creativity, and mastery (raising kids and running a household) for low-status, low-wage jobs with a lot of supervision. My "gender neutral" approach to the factoid in the nested quote is to want a "basic income" for everyone in the USA.... :-)

Related study:
"What's Happening To Women's Happiness?"
"First, since 1972, [in the USA] women's overall level of happiness has dropped, both relative to where they were forty years ago, and relative to men. You find this drop in happiness in women regardless of whether they have kids, how many kids they have, how much money they make, how healthy they are, what job they hold, whether they are married, single or divorced, how old they are, or what race they are. (The one and only exception: African-American women are now slightly happier than they were back in 1972, although they remain less happy than African American men.) ... The second discovery is, this: though women begin their lives more fulfilled than men, as they age, they gradually become less happy. Men, in contrast, get happier as they get older. ..."

That is the cultural backdrop behind so many cultural trends and changes in the USA. So, people are still tinkering with trying to improve that without really asking deep questions about a fundamental shift in most women's lives to ones with less autonomy, mastery, and purpose as they took paying jobs in an economy that has seen stagnant wages (and unpaid overtime) and lots of competition between workers for limited promotions? Sure, working as a software developer at a company like Joyent might seem purposeful (changing the landscape of FOSS web apps we all rely on), but really, most jobs in the USA don't have that level of obvious purpose, in part because work has often become so specialized and compartmentalized.

Obviously expectations about gendered pronouns are in flux in the USA (related to other social changes) and that causes various sorts of stress. Norms are definitely shifting. I pretty much always reword sentences to avoid choosing he/she for individuals or I use "they" incorrectly on purpose. A sentence with "he" for an arbitrary individual person does sound more archaic, although, to me, a sentence with "she" using the alternating she/he style also sounds forced. It's a bit astounding to think that issue has now bubbled up to the point where entire communities are getting forked over it.
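As a trivial illustration of that rewording habit, here is a hedged sketch (the function name is invented for illustration) of flagging gendered pronouns in prose so a writer can decide whether to reword around them:

```javascript
// Hypothetical sketch: flag gendered pronouns in prose so the writer can
// reword the sentence (or deliberately use singular "they") instead.
// The function name is invented for illustration.
function flagGenderedPronouns(text) {
  var pattern = /\b(he|she|him|her|his|hers)\b/gi;
  return text.match(pattern) || [];
}

// Usage: a sentence that might be flagged for rewording.
console.log(flagGenderedPronouns("When a user logs in, he sees his dashboard."));
// → [ 'he', 'his' ]
```

Of course, a mechanical check like this can only flag candidates; deciding whether a pronoun refers to an arbitrary individual (and whether to reword at all) remains a human judgment.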

about two weeks ago

Node.js Forked By Top Contributors

Paul Fernhout Re:Effort dilution (vs. Stigmergy) (254 comments)

"The scourge of Open Source disguised as choice.."

All too true, too often. The failure of the Linux desktop to gain traction is a prime example of that (until finally, in essence, via the Chromebook).

That said, "Stigmergy" is a way that large structures (like the FOSS landscape?) can get built by entities following relatively simple local rules. For example, termites build big complex mounds by getting excited when they see other termites having accomplished something small but interesting (creating an arch).

See especially:
"Collaboration in small groups (roughly 2-25) relies upon social negotiation to evolve and guide its process and creative output. ... Collaboration in large groups (roughly 25-n) is dependent upon stigmergy."

Although in the termite case, the insects are increasingly joining their separate actions together, versus splitting the community apart in the NodeJS case. Maybe this is more akin to how a new generation of termite "queens" and their consorts takes to the air and finds a new place to create a new mound?

In any case, as a software developer moving into NodeJS (and JavaScript in general) for new projects, this is not the sort of news I really want to hear, because in the short term it seems to increase risk (including from dilution of effort and community). In the long term I can, of course, be cautiously hopeful that the social and organizational issues will get worked through one way or another.

Fortunately, and this is why I like the JavaScript ecosystem even as I find JavaScript the language awkward to work with, there are many possible JavaScript containers to run stuff in. Here are a couple more for the server:

"Nodyn is a Node.js compatible framework, running on the JVM powered by the DynJS Javascript runtime"

"Ringo is a CommonJS-based JavaScript runtime written in Java and based on the Mozilla Rhino JavaScript engine. It takes a pragmatical and non-dogmatic stance on things like I/O paradigms. Blocking and asynchronous I/O both have their strengths and weaknesses in different areas."

So, in the stigmergic sense, the idea of JavaScript everywhere (including on the server) is taking off as all us little FOSS termites get excited about the idea and work together on various arches. And with ways to compile C to code that can run efficiently on a JavaScript runtime, I wonder if we will see more and more adoption of JavaScript containers and further improvements in them.

While divisions like this look painful, when you step back and look at a landscape of millions of software developers who like to develop software (sometimes in different styles or with different emphases), this kind of forking is inevitable.

Reflecting on this, though, I started shifting away from Python around the time the "Benevolent Dictator for Life" Guido van Rossum created a (somewhat) backwardly-incompatible version of Python (Python 3) while the community kept supporting the old one. Perl faced a similar issue with the move to Perl 6. I'm sympathetic to that dilemma for the original authors, but those are, to a lesser extent and maybe with less drama, other examples of these tensions between individual and community control over priorities and future directions.

We probably need to develop much better understanding of what makes a FOSS project a success (in terms of community dynamics) and how it can stay a success despite trying to fix up early design choices.

Sometimes workarounds can keep things together for a time, like how JSLint/JSHint and "use strict" work around the problematic fact that JavaScript variables are global by default when not declared with var. That seems to be the right answer for the community, but the right technical answer for the programmer would really have been to change the language so that variables are scoped differently.

For JavaScript, it seems the influence of the original developer was lost very early on and a de facto community process happened very quickly. However, the evolution of JavaScript as a population of capabilities has been a painful process for anyone who has programmed in it over that time. Again, though, libraries and frameworks like jQuery or Dojo or others (NodeJS and CommonJS) tried to patch a community solution as a workaround on top of fundamental gaps in the initial language spec (no modules, no Deferred, no basic string manipulation like startsWith, etc.).
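For instance, before ES6 standardized it, even something as basic as startsWith had to be patched in by each library or app. A minimal shim (a hedged sketch, not any particular library's exact polyfill) looks like:

```javascript
// Install String.prototype.startsWith only if the runtime lacks it --
// the kind of gap-filling pre-ES6 libraries had to do routinely.
if (!String.prototype.startsWith) {
  String.prototype.startsWith = function (search, position) {
    var pos = position || 0;
    return this.substring(pos, pos + search.length) === search;
  };
}
```

Multiply that by modules, promises/Deferreds, and the rest, and you get the patchwork of competing community workarounds described above.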

Sadly, much of this new development still has not reached the ease of use or consistency of, say, VisualWorks Smalltalk from the early 1990s (which was killed by licensing costs, not being FOSS, even as it limps along as a proprietary product after missing its chance to be adopted by Sun instead of Java, because the then "owner" of VisualWorks wanted runtime fees for set-top boxes).

Anyway, back to NodeJS coding...

about two weeks ago

NASA's Orion Capsule Reaches Orbit

Paul Fernhout ROI for Innovation vs. Conquest (140 comments)

I was reading "Pandora's Seed: The Unforeseen Cost of Civilization" by Spencer Wells this morning. He makes the point that hunter-gatherers tend to walk away from social conflicts, whereas people in large militaristic agricultural hierarchies tend to end up fighting wars for resources because they see no other alternatives. I had a lot of youthful optimism in the 1970s, stemming in part from the US space program and many space-related TV shows (Thunderbirds, Star Trek, Space: 1999, Lost In Space).

To be potentially capable of the military conquest of the planet Earth, a country probably has to be of the scale of WWII Germany or the USA -- having about 5% of the planet's population and land. That means, ignoring moral aspects and such, the maximum return on military investment in empire is at most about 20 to 1 relative to the total resources (including people) you start with and are essentially gambling. By contrast, investments in research and development, such as the space program with Orion, or new energy sources like hot or cold fusion or dirt-cheap solar PV, have the potential to return much more than 20:1. Imagine if the USA had poured the cost of the Iraq war (three or more trillion US dollars at this point) into fusion research. We might have 1000X as much cheap, less-polluting energy to use (including for space launches) than we have now. Increasing human capability to get into space and live there in self-replicating space habitats could produce another 1000X or more return in land area to live in. Even at 100 trillion dollars for the first such self-replicating space habitat, the ROI is much higher than that of preparing to fight a global war of empire-building.
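The 20-to-1 ceiling is just arithmetic on the starting share; a throwaway sketch (the function name and the 5% figure are my own framing of the estimate above):

```javascript
// If a would-be empire already controls `shareHeld` of the world's
// resources, conquering everything multiplies its holdings by at most
// 1/shareHeld (ignoring war losses, which only lower the real return).
function maxConquestROI(shareHeld) {
  return 1 / shareHeld;
}
// A power holding ~5% of the planet (the WWII-Germany/USA scale) tops
// out around 20x, while R&D returns have no such hard ceiling.
```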

Maybe we can see a return to other ideas, like those from back when NASA overall was more optimistic under Carter?
"Advanced Automation for Space Missions"
"This document is the final report of a study on the feasability of using machine intelligence, including automation and robotics, in future space missions. The 10-week study was conducted during the summer of 1980 by 18 educators from universities throughout the United States who worked with 15 NASA program engineers. The specific study objectives were to identify and analyze several representative missions that would require extensive applications of machine intelligence, and then to identify technologies that must be developed to accomplish these types of missions. This study was sponsored jointly by NASA, through the Office of Aeronautics and Space Technology and the Office of University Affairs, and by the American Society for Engineering Education as part of their continuing program of summer study faculty fellowships. Co-hosts for the study were the NASA Ames Research Center and the University of Santa Clara, where the study was carried out. Project co-directors were James E. Long of the Jet Propulsion Laboratory and Timothy J. Healy of the University of Santa Clara."

There are probably nuances here regarding how much of a country is at risk in such a military gamble, as well as the value of military investments for deterrence (how much is enough?), but that is the broad-brush picture I've always seen from that early optimism. And given that a supervolcano like Toba (mentioned by Spencer Wells as killing off most humans about 70,000 years ago) or a pandemic (like Ebola) could wipe out most people (via a decade-long winter and a new ice age), investments in cooperation to develop productive innovations, including space habitats, have a much better risk/reward ratio than most military investments, which ultimately still don't secure you against supervolcanoes, plagues, and similar things.

While it has sometimes been called "The Conquest of Space", that is a very different thing from the conquest of lands that already have people on them, just using them differently -- as exemplified by the Sand Creek Massacre against Native Americans, which just passed its 150-year remembrance:

Technology is an amplifier. What human emotions and aspirations do we want it most to amplify?

about two weeks ago

NASA's Orion Capsule Reaches Orbit

Paul Fernhout Congrats to NASA on a great launch! (140 comments)

What great news to wake up to! Hoping for many more optimism-promoting successes like this on the road to humans living in space habitats that can duplicate themselves from sunlight and asteroidal or lunar ores.

Here is a PBS NewsHour video with launch footage:

BTW, that PBS NewsHour Orion article led me to another PBS NewsHour article which formed the basis of my most recent "optimistic" Slashdot story submission on how restoring 1970s overtime regulations could boost the US economy:

With a stronger economy, maybe there would be even more demand for space-related ventures of all sorts?

about two weeks ago

Interviews: Malcolm Gladwell Answers Your Questions

Paul Fernhout Thanks for response on lead & crime & Tipp (48 comments)

I wonder if the reply counts as a good enough "reputable source" to update the Wikipedia article on the Tipping Point?

I was also glad to see two questions mentioning automation issues (one referencing a basic income). Maybe we'll see a new book on that as Malcolm Gladwell explores those issues more in depth?

about two weeks ago

Pantry Pests Harbor Plastic-Chomping Bacteria

Paul Fernhout Re:Mutant 59: The plastic eaters (45 comments)

Yes, I was surprised this article was collapsed on the main page. The potential for problems if bacteria start eating through common plastics is enormous.

about two weeks ago

Bad Lockup Bug Plagues Linux

Paul Fernhout Re:What's happening to Linux? (257 comments)

I went through this around 2007-2008 when, after running Debian as a desktop for about five years on two machines, my wife and I got tired of the breakage with every major update. While I was willing to put up with more, my wife got tired of me spending a few hours trying to sort things out on her desktop with every update -- often basic things like graphics drivers in a multi-monitor setup. Power saving never worked (I gave up on it).

What often drove updates was wanting to use the latest version of Eclipse or Firefox or other applications. My wife went first, going to a Mac Pro, and I followed about a year later. We're still using that hardware, although upgraded in various ways (memory, drives, graphics cards and monitors).

That said, Linux is everywhere, and those years of working with it all the time have been very useful in maintaining servers (including in VirtualBox) and embedded hardware (NAS, routers, media, other), which generally face fewer updates than desktops. I feel Linux settled down to stability a couple of years after that (driven in part by Ubuntu's widespread adoption), although it sounds like instability has picked up again. I feel that way in general about FOSS -- maybe the old guard is getting bored or old or tired or busy or burned out, and new people are moving to web stuff?

Of course now, my wife's Mac Pro from 2007 is not supported for an upgrade past Snow Leopard. Mine is, but I'm not sure it is worth it yet. More and more, though, new software requires a later minimum OS version, and there are no more Snow Leopard updates. My wife's machine has a sporadic kernel panic or something once every few weeks, and mine has also had some lockups, although not recently, after resetting the PRAM.

There were some big disappointments in leaving Debian. I liked cut-and-paste under Linux, where selecting something put it in the copy buffer; the Mac is harder, including the weirdness of having to menu-click within the selected text to pull up a copy menu. apt-get was great (when stuff was compatible) and a sad loss. Also, the Mac's GUI design with a single global menu bar is just *terrible* on a multi-monitor setup, especially if the monitors are different heights; having a menu per application window like Linux makes much more sense. I also don't like that I could easily (without copyright concerns) virtualize old Linux setups, but you can't really do that with Mac OS X -- in that sense, all my work feels "contaminated" by copyright issues. That said, Apple Time Machine "just works" as a backup solution (ignoring the risks of keeping a backup hard drive plugged in, in the worst case).

about three weeks ago

Google Should Be Broken Up, Say European MPs

Paul Fernhout Or Google could be made into a public utility... (237 comments)

Just saying, there are other options; whether we pursue them is a different story. Google's non-search activities (like Google Apps, Chromium, other Google Lab stuff) generally only make significant financial sense to the company in the context of their search business, so breaking up Google means those spinoff businesses would probably immediately go bankrupt.

What was really wrong with an AT&T that funded Bell Labs and created UNIX, with a government-mandated 5% or so of revenue spent on (free and open source) R&D? As someone once said, Bell Labs was funded by people dropping dimes into boxes across the country. Telephone costs have changed in the USA since the breakup, *but* it is not really clear how much of that had to do with the "Baby Bells" and competition, and how much had to do with Moore's law and an exponential reduction in computing cost per MIPS that made packet switching (even in the home) so much cheaper.

"The End of AT&T: Ma Bell may be gone, but its innovations are everywhere"
"It's 1974. Platform shoes are the height of urban fashion. Disco is just getting into full stride. The Watergate scandal has paralyzed the U.S. government. The new Porsche 911 Turbo helps car lovers at the Paris motor show briefly forget the recent Arab oil embargo. And the American Telephone & Telegraph Co. is far and away the largest corporation in the world.
    AT&T's US $26 billion in revenues--the equivalent of $82 billion today--represents 1.4 percent of the U.S. gross domestic product. The next-largest enterprise, sprawling General Motors Corp., is a third its size, dwarfed by AT&T's $75 billion in assets, more than 100 million customers, and nearly a million employees.
    AT&T was a corporate Goliath that seemed as immutable as Gibraltar. And yet now, only 30 years later, the colossus is no more. Of the many events that contributed to the company's long decline, a crucial one took place in the autumn of that year. On 20 November 1974, the U.S. Department of Justice filed the antitrust suit that would end a decade later with the breakup of AT&T and its network, the Bell System, into seven regional carriers, the Baby Bells. AT&T retained its long-distance service, along with Bell Telephone Laboratories Inc., its legendary research arm, and the Western Electric Co., its manufacturing subsidiary. From that point on, the company had plenty of ups and downs. It started new businesses, spun off divisions, and acquired and sold companies. But in the end it succumbed. Now AT&T is gone. ...
    Should we mourn the loss? The easy answer is no. Telephone providers abound nowadays. AT&T's services continue to exist and could be easily replaced if they didn't.
    But that easy answer ignores AT&T's unparalleled history of research and innovation. During the company's heyday, from 1925 to the mid-1980s, Bell Labs brought us inventions and discoveries that changed the way we live and broadened our understanding of the universe. How many companies can make such a claim?
    The oft-repeated list of Bell Labs innovations features many of the milestone developments of the 20th century, including the transistor, the laser, the solar cell, fiber optics, and satellite communications. Few doubt that AT&T's R&D machine was among the greatest ever. But few realize that its innovations, paradoxically, contributed to the downfall of its parent. And now, through a series of events during the past three decades, this remarkable R&D engine has run out of steam. ...
    The funding came in large part from what was essentially a built-in "R&D tax" on telephone service. Every time we picked up the phone to place a long-distance call half a century ago, a few pennies of every dollar--a dollar worth far more than it is today--went to Bell Labs and Western Electric, much of it for long-term R&D on telecommunications improvements.
    In 1974, for example, Bell Labs spent over $500 million on nonmilitary R&D, or about 2 percent of AT&T's gross revenues. Western Electric spent even more on its internal engineering and development operations. Thus, more than 4 cents of every dollar received by AT&T that year went to R&D at Bell Labs and Western Electric.
    And it was worth every penny. This was mission-oriented R&D in an industrial context, with an eye toward practical applications and their eventual impact on the bottom line. ..."

In this context, "search" (and a related constellation of applications) has become a public utility. So, just treat it like one. Facebook likewise could be treated that way. As could Microsoft.

In general, these sorts of market failures (given that rich market leaders tend to get richer and more market-leading) show a fundamental problem with free-market ideology in practice in the 21st century. For the social and political consequences, it does not matter whether Google might someday be replaced in our attention by the next huge monopoly spanning markets. The point is that this keeps happening, with significant effects on our social and political fabric; only the company names change.

In any case, if Moore's law continues for another couple decades, today's Google server farm's computational capability might fit on a laptop of the 2040s, which could also store all the surface internet content of today. At that point, with all the possible human cultural content you might want stored and searchable just inches from your brain, what would Google's business model be? So, in that sense, this political power issue may be self-limiting, although we will see new issues, as "the right to be forgotten" will take on new complexities on the order of asking the populace to forget about what it previously learned about someone...
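That laptop-of-the-2040s guess is just a compounding exercise; a quick sketch (the 18-month doubling period and 25-year horizon are assumptions for illustration, not a prediction):

```javascript
// Compound growth under a Moore's-law-style doubling schedule.
function mooreFactor(years, doublingPeriodYears) {
  return Math.pow(2, years / doublingPeriodYears);
}
// Three years at an 18-month doubling is 2^2 = 4x; about 25 years is
// 2^(25/1.5), on the order of 100,000x -- roughly the gap implied
// between a 2014 server farm and a single 2040s machine, if it held.
```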

about three weeks ago



Should IT professionals be exempt from overtime?

Paul Fernhout Paul Fernhout writes  |  about two weeks ago

Paul Fernhout (109597) writes "Nick Hanauer's a billionaire who made his fortune as one of the original investors in Amazon. He suggests President Obama should restore US overtime regulations to the 1970s to boost the economy (quoted by PBS NewsHour):
"In 1975, more than 65 percent of salaried American workers earned time-and-a-half pay for every hour worked over 40 hours a week. Not because capitalists back then were more generous, but because it was the law. It still is the law, except that the value of the threshold for overtime pay--the salary level at which employers are required to pay overtime--has been allowed to erode to less than the poverty line for a family of four today. Only workers earning an annual income of under $23,660 qualify for mandatory overtime. You know many people like that? Probably not. By 2013, just 11 percent of salaried workers qualified for overtime pay, according to a report published by the Economic Policy Institute. And so business owners like me have been able to make the other 89 percent of you work unlimited overtime hours for no additional pay at all.
    The Obama administration could, on its own, go even further. Many millions of Americans are currently exempt from the overtime rules--teachers, federal employees, doctors, computer professionals, etc.--and corporate leaders are lobbying hard to expand "computer professional" to mean just about anybody who uses a computer. Which is almost everybody. But were the Labor Department instead to narrow these exemptions, millions more Americans would receive the overtime pay they deserve. Why, you might ask, are so many workers exempted from overtime? That's a fair question. To be truthful, I have no earthly idea why. What I can tell you is that these exemptions work out very well for your employers. ...
    In the information economy of the 21st century, it is not capital accumulation that creates growth and prosperity, but, rather, the virtuous cycle of innovation and demand. The more innovators and entrepreneurs we have converting ideas into products and services, the higher our standard of living, and the more people who can afford to consume these products and services, the greater the incentive to innovate. Thus, the key to growth and prosperity is to fully include as many Americans as possible in our economy, both as innovators and consumers.
    In plain English, the real economy is you: Raise wages, and one increases demand. Increase demand and one increases jobs, wages and innovation. The real economy is simply the interplay between consumers and businesses. On the other hand, as we've learned from the past 40 years of slow growth and record stock buybacks, not even an infinite supply of capital can persuade a CEO to hire more workers absent demand for the products and services they produce.
    The twisted irony is, when you work more hours for less pay, you hurt not only yourself, you hurt the real economy by depressing wages, increasing unemployment and reducing demand and innovation. Ironically, when you earn less, and unemployment is high, it even hurts capitalists like me. ..."

If overtime pay is generally good for the economy, should most IT professionals really be exempt from overtime regulations?"

Link to Original Source

Safety first, privacy second in use of cameras in schools

Paul Fernhout Paul Fernhout writes  |  about a month and a half ago

Paul Fernhout (109597) writes "Caroline Murray reports for the Sacandaga Express: "Just this year, the Broadalbin-Perth Central School District completed Phase 1 of a plan to install high-tech security cameras in every school across the district. For the first time, high school and middle school students started off the school year with security cameras pointed at them from every direction, including hallways, staircases, and public rooms, such as the cafeteria and gymnasium. For some veteran students, the cameras feel a bit invasive. "It is like '1984' with big brother," senior Hunter Horne said while walking down the hallway. ... Superintendent Stephen Tomlinson said safety is the driving force behind the technology, however, admitted student behavior also plays a role in utilizing the equipment. Tomlinson said students have rights, and he wants to respect their privacy, but their rights change when students step foot on school grounds. ... Tomlinson said he already notices the culture has changed in the high school. He believes the amount of bullying and vandalism in the hallway is greatly reduced already. Gennett said faculty and teachers have peace of mind now, knowing the entire school is under surveillance. "It would be very difficult to find a location in our buildings where you can hide, or you can go, and intentionally do something that is not acceptable in our buildings," Tomlinson said. Some of the administrators view the security cameras as entertaining. Seniors Smith and Horne said certain staff members will call-out students over the loud speaker, and tell them to take off their hats."

One question not addressed in the article is whether forcing a child to submit to total one-way surveillance is a form of bullying or in some other way a vandalism of privacy or democracy? See also David Brin's "The Transparent Society" for another take on surveillance, where all the watchers are also watched."

Link to Original Source

US Millennials: The Cheapest Generation

Paul Fernhout Paul Fernhout writes  |  about a month and a half ago

Paul Fernhout (109597) writes "Just noticed this two-year old Atlantic article on how US Millennials (aka Generation Y) are not buying houses or cars as much as previous generations, but are buying smartphones instead and using those phones to get on-demand access to things like Zipcars or other "sharing economy" services. It says: "In 2010, adults between the ages of 21 and 34 bought just 27 percent of all new vehicles sold in America, down from the peak of 38 percent in 1985. Miles driven are down, too. Even the proportion of teenagers with a license fell, by 28 percent, between 1998 and 2008. ... Just as car sales have plummeted among their age cohort, the share of young people getting their first mortgage between 2009 and 2011 is half what it was just 10 years ago, according to a Federal Reserve study. ... Smartphones compete against cars for young people's big-ticket dollars, since the cost of a good phone and data plan can exceed $1,000 a year. But they also provide some of the same psychic benefits — opening new vistas and carrying us far from the physical space in which we reside. ... If the Millennials are not quite a post-driving and post-owning generation, they'll almost certainly be a less-driving and less-owning generation. That could mean some tough adjustments for the economy over the next several years. ... Education is the "obvious outlet for the money Millennials can spend," Perry Wong, the director of research at the Milken Institute, told us, noting that if young people invest less in physical things like houses, they'll have more to invest in themselves. "In the past, housing was the main vehicle for investment, but education is also a vehicle." In an ideas economy, up-to-date knowledge could be a more nimble and valuable asset than a house."

Of course, education via the internet or through FOSS educational simulations may not cost that much either. We are also seeing the bubble in student-loan borrowing nearing its bursting point, where more and more young people are deciding to bow out of the entire academic credentialing arms race given the uncertainty of a financial return on such an investment (as much as education via schools or other venues may have other non-financial benefits)."

Link to Original Source

10 things that scare the bejeezus out of IT pros

Paul Fernhout Paul Fernhout writes  |  about a month and a half ago

Paul Fernhout (109597) writes "ITWorld has a slideshow that begins with: "As Halloween approaches, some may be creeped out by vampires and zombies and other minor evils. But IT workers know that just a few words can carry more horror than most ordinary souls can imagine — with nightmarish results ranging from wasted IT resources to botched rollouts to failed projects. Presented for your approval: 10 short sentences that will truly make your blood run cold this Halloween.""
Link to Original Source

Cold fusion reactor verified by third-party researchers

Paul Fernhout Paul Fernhout writes  |  about 2 months ago

Paul Fernhout (109597) writes "ExtremeTech reports that "Andrea Rossi's E-Cat — the device that purports to use cold fusion to generate massive amounts of cheap, green energy — has been verified by third-party researchers, according to a new 54-page report. The researchers observed a small E-Cat over 32 days, where it produced net energy of 1.5 megawatt-hours, or "far more than can be obtained from any known chemical sources in the small reactor volume." The researchers were also allowed to analyze the fuel before and after the 32-day run, noting that the isotopes in the spent fuel could only have been obtained by "nuclear reactions"...""

Hidden Obstacles for Google's Self-Driving Cars

Paul Fernhout Paul Fernhout writes  |  about 4 months ago

Paul Fernhout (109597) writes "Lee Gomes at Technology Review wrote an article on the current limits of Google self-driving car technology: "Would you buy a self-driving car that couldn't drive itself in 99 percent of the country? Or that knew nearly nothing about parking, couldn't be taken out in snow or heavy rain, and would drive straight over a gaping pothole? If your answer is yes, then check out the Google Self-Driving Car, model year 2014. Google often leaves the impression that, as a Google executive once wrote, the cars can "drive anywhere a car can legally drive." However, that's true only if intricate preparations have been made beforehand, with the car's exact route, including driveways, extensively mapped. Data from multiple passes by a special sensor vehicle must later be pored over, meter by meter, by both computers and humans. It's vastly more effort than what's needed for Google Maps. ... Maps have so far been prepared for only a few thousand miles of roadway, but achieving Google's vision will require maintaining a constantly updating map of the nation's millions of miles of roads and driveways. Urmson says Google's researchers "don't see any particular roadblocks" to accomplishing that. When a Google car sees a new permanent structure such as a light pole or sign that it wasn't expecting it sends an alert and some data to a team at Google in charge of maintaining the map. ... Among other unsolved problems, Google has yet to drive in snow, and Urmson says safety concerns preclude testing during heavy rains. Nor has it tackled big, open parking lots or multilevel garages. ... Pedestrians are detected simply as moving, column-shaped blurs of pixels — meaning, Urmson agrees, that the car wouldn't be able to spot a police officer at the side of the road frantically waving for traffic to stop. ..."

A deeper issue I wrote about in 2001 is whether such software and data will be FOSS or proprietary? As I wrote there: "We are about to see the emergence of companies licensing that publicly funded software and selling modified versions of such software as proprietary products. There will eventually be hundreds or thousands of paid automotive software engineers working on such software no matter how it is funded, because there will be great value in having such self-driving vehicles given the result of America's horrendous urban planning policies leaving the car as generally the most efficient means of transport in the suburb. The question is, will the results of the work be open for inspection and contribution by the public? Essentially, will those engineers and their employers be "owners" of the software, or will they instead be "stewards" of a larger free and open community development process?""

Link to Original Source

Humans Need Not Apply: A video about the robot revolution and jobs

Paul Fernhout Paul Fernhout writes  |  about 4 months ago

Paul Fernhout (109597) writes "This explanatory compilation video by CGP Grey called "Humans Need Not Apply" on structural unemployment caused by robotics and AI (and other automation) is like the imagery playing in my mind when I think about the topic based on previous videos and charts I've seen.

I saw it first on the econfuture site by Martin Ford, author of "The Lights in the Tunnel". It is being discussed on Reddit, and people there have started mentioning a "basic income" as one possible response.

While I like the basic income idea, I also collect other approaches in an essay called Beyond A Jobless Recovery: A heterodox perspective on 21st century economics. Beyond a basic income for the exchange economy, those possible approaches include gift economy, subsistence production, planned economy, and more — including many unpleasant alternatives like expanding prisons or fighting wars as we are currently doing. Marshall Brain's writings like Robotic Nation and Manna have inspired my own work.

I made my own video version of the concept around 2010, as a parable called "The Richest Man in the World: A parable about structural unemployment and a basic income". At 1:02 in the video I made, there is a picture of a robot near a sign "Humans Need Not Apply". The text there is: "Soon everyone was out of work. The politicians and their supporters said the solution was to lower taxes and cut social benefits to promote business investment. They tried that, but the robots still got all the jobs."

Here is a p2presearch post I made in 2009 pulling together a lot of links to robot videos: "[p2p-research] Robot videos and P2P implications (was Re: A thirty year future...)". It's great to see more informative videos on this topic. CGP Grey's video is awesome in the way he puts it all together. Makes me wish I had done one like that with all those snippets of stuff I've seen over the years."

On MetaFilter Being Penalized By Google

Paul Fernhout Paul Fernhout writes  |  about 7 months ago

Paul Fernhout (109597) writes "MetaFilter recently announced layoffs due to a decline in ad revenue that started with a mysterious 40% drop in traffic from Google on November 17, 2012, and which never recovered. Danny Sullivan at SearchEngineLand explores in detail how MetaFilter "serves as a poster child of problems with Google’s penalty process, despite all the advances Google has made over the years." Caitlin Dewey at the Washington Post puts it more bluntly: "That may be the most striking, prescient takeaway from the whole MetaFilter episode: the extent to which the modern Web does not incentivize quality.""
Link to Original Source

To Wash It All Away by James Mickens

Paul Fernhout Paul Fernhout writes  |  about 7 months ago

Paul Fernhout (109597) writes "James Mickens of Microsoft Research writes his last column for USENIX's ;login: magazine humorously about everything that is wrong with HTML, CSS, JavaScript and the modern Web page and why we should "wash it all away". An example from his column: "Describing why the Web is horrible is like describing why it's horrible to drown in an ocean composed of pufferfish that are pregnant with tiny Freddy Kruegers--each detail is horrendous in isolation, but the aggregate sum is delightfully arranged into a hate flower that blooms all year." He makes many excellent points about problems with all these technologies, but do these points matter much given the Web's momentum? And could we expect anything better in the near future (like a Social Semantic Desktop or other new standards for exchanging information)? In my opinion, the Web wins because we are reaching the point where if something does not have a URI, it is broken. And JavaScript is, all things considered, better than we deserved."
Link to Original Source

LATimes to discard all previous user comments

Paul Fernhout Paul Fernhout writes  |  about 10 months ago

Paul Fernhout (109597) writes "I just received an email from the LATimes saying they will apparently discard all previous user comments tomorrow as they transition to a new commenting system. They are giving about one day to "save your work". What does this example mean about trusting our content to third parties, even ones that one might otherwise presume to be a "Newspaper of Public Record"?

The main text of the email: "Thank you for being a part of the latimes.com community. We're committed to providing a forum for meaningful discussion about the topics we cover and have upgraded our commenting system. As of Thursday, February 27, we are giving our readers better ways to connect and communicate, using improved tools to help keep debates spirited, but not mean-spirited. More details will be available at launch on latimes.com/readers. As we bid goodbye to our old system, past comments will not be carried over. If you'd like to save your work, we encourage you to do so before February 27. We look forward to hearing from you at latimes.com. Jimmy Orr, Managing Editor, Digital""

Gates Spends Entire First Day Back in Office Trying to Install Windows 8.1

Paul Fernhout Paul Fernhout writes  |  about 9 months ago

Paul Fernhout (109597) writes "According to Andy Borowitz: "Bill Gates's first day at work in the newly created role of technology adviser got off to a rocky start yesterday as the Microsoft founder struggled for hours to install the Windows 8.1 upgrade. ... After failing to install the upgrade by lunchtime, Mr. Gates summoned the new Microsoft C.E.O. Satya Nadella, who attempted to help him with the installation, but with no success."

I've read before on Slashdot that Vista took the hate for buggy drivers after big changes from XP. After that all got sorted out, lots of people praised Windows 7. Might we see the same thing here with a more stable Windows 9?"

Link to Original Source

Kickstarter hacked, user data stolen

Paul Fernhout Paul Fernhout writes  |  about 10 months ago

Paul Fernhout (109597) writes "CNet wrote: "Hackers hit crowd-funding site Kickstarter and made off with user information, the site said Saturday. Though no credit card info was taken, the site said, attackers made off with usernames, e-mail addresses, mailing addresses, phone numbers, and encrypted passwords.""
Link to Original Source

MIT Scientists Report Cold Fusion Success with "NANOR" Device

Paul Fernhout Paul Fernhout writes  |  about 10 months ago

Paul Fernhout (109597) writes "E-Cat World reports: "[A video] has been posted on Youtube by someone called ‘AlienScientist’ who attended (and filmed) the recent MIT Cold Fusion seminar and reports about what he has learned. He does a very nice job of summarizing the key points from the seminar, pointing out that Peter Hagelstein and Mitchell Swartz mention such things as how the cold fusion reactions can be enhanced by subjecting the cold fusion cell to an external magnetic field and shining a laser on the cathodes. He also mentions that they say cracking in the metal and rapid gas loading can cause the deuterium to leak out, thus negatively affecting the amount of excess heat produced. The video also includes pointed criticism of the way the scientific community dealt with Pons and Fleischmann 25 years ago, and laments the lost opportunities that could have been realized if more care had been taken in trying to replicate the effect back then. The takeaway quote from the video (I think) is: “This is quite possibly the beginning of the largest technological breakthrough that our generation will witness.”""
Link to Original Source

Start-up purchases controversial cold fusion E-cat technology

Paul Fernhout Paul Fernhout writes  |  about a year ago

Paul Fernhout (109597) writes "A North Carolina based company called Industrial Heat LLC has come out and admitted that it now owns Andrea Rossi’s E-Cat low energy nuclear reaction (LENR) technology (also sometimes called "cold fusion"). Industrial Heat has put out a press release which seems to confirm rumors that it had spent $11 million to purchase Rossi’s device. The press release also confirmed speculation that Tom Darden of Cherokee Investment Partners, a North Carolina equity fund, is a principal investor in Industrial Heat."
Link to Original Source

Willis Ware, 93, Engineer at Dawn of Computer Age, Dies

Paul Fernhout Paul Fernhout writes  |  1 year, 10 days ago

Paul Fernhout (109597) writes "The NYTimes reports: "Willis H. Ware, an electrical engineer who in the late 1940s helped build a machine that would become a blueprint for computer design in the 20th century, and who later played an important role in defining the importance of personal privacy in the information age, died on Nov. 22 at his home in Santa Monica, Calif. He was 93.""
Link to Original Source

New surveillance tool to track posts about vaccines

Paul Fernhout Paul Fernhout writes  |  about a year and a half ago

Paul Fernhout (109597) writes "Michael Smith at MedPage Today writes: "A new surveillance tool might help immunize communities against vaccine scares, researchers reported. An international pilot project has demonstrated that it's possible to trawl through the Internet and quickly identify places where public fear about vaccines is on the rise, according to Heidi Larson, PhD, of the London School of Hygiene and Tropical Medicine in England, and colleagues. ... The researchers cautioned that the system has not been running long enough to demonstrate "long-term predictive value," but added it will let observers characterize, in real time, vaccine opinions by "topic, negative or positive content, location, time, and risk level.""

The work is funded in part by the Gates Foundation. It is discussed in positive terms at the Daily Telegraph as "Monitoring system to globally track false social media claims on dangers of vaccines" and in negative terms at Natural News as "Internet monitoring system to stalk social media users who question safety of vaccines"."

European Commission to criminalize unregistered seeds and plants?

Paul Fernhout Paul Fernhout writes  |  about a year and a half ago

Paul Fernhout (109597) writes "Mike Adams at Natural News writes: "A new law proposed by the European Commission would make it illegal to "grow, reproduce or trade" any vegetable seeds that have not been "tested, approved and accepted" by a new EU bureaucracy named the "EU Plant Variety Agency." It's called the Plant Reproductive Material Law, and it attempts to put the government in charge of virtually all plants and seeds. Home gardeners who grow their own plants from non-regulated seeds would be considered criminals under this law.""
Link to Original Source

Knight Foundation launches News Challenge on topic of open government

Paul Fernhout Paul Fernhout writes  |  about 2 years ago

Paul Fernhout writes "The Knight Foundation opened on Tuesday its first Knight News Challenge of the year on the topic of Open Government under the guiding question: "How might we improve the way citizens and governments interact?""
Link to Original Source


Paul Fernhout has no journal entries.
