


Submission + - SPAM: Obama used a pseudonym in emails with Clinton, FBI documents reveal

schwit1 writes: President Barack Obama used a pseudonym in email communications with Hillary Clinton and others, according to FBI records made public Friday. The disclosure came as the FBI released its second batch of documents from its investigation into Clinton’s private email server during her tenure as secretary of state.

The 189 pages the bureau released include interviews with some of Clinton’s closest aides, such as Huma Abedin and Cheryl Mills; senior State Department officials; and even Marcel Lazar, better known as the Romanian hacker “Guccifer.”

In an April 5, 2016 interview with the FBI, Abedin was shown an email exchange between Clinton and Obama, but the longtime Clinton aide did not recognize the name of the sender.

"Once informed that the sender's name is believed to be pseudonym used by the president, Abedin exclaimed: 'How is this not classified?'" the report says. "Abedin then expressed her amazement at the president's use of a pseudonym and asked if she could have a copy of the email."

Link to Original Source

Comment Re:Zuckerman suppresses evidence? (Score 1) 346

Who is misinformed? The GOP and conservatives are vocally against the Fairness Doctrine; you must have missed it over on PBS radio. If the stations that carry conservatives like Beck, Rush and Mark had to offer equal time, there would not be enough time in the day. It is slanted, it is biased, but conservative talk has kept AM and FM talk stations alive in small markets across the United States.

Submission + - How One Dev Broke Node, and Thousands of Projects in 11 Lines of JavaScript

An anonymous reader writes: Programmers were left staring at broken builds and failed installations on Tuesday after someone toppled the Jenga tower of JavaScript. A couple of hours ago, Azer Koçulu unpublished more than 250 of his modules from NPM, which is a popular package manager used by JavaScript projects to install dependencies. Koçulu yanked his source code because, we're told, one of the modules was called Kik and that apparently attracted the attention of lawyers representing the instant-messaging app of the same name. According to Koçulu, Kik's briefs told him to take down the module; he refused, so the lawyers went to NPM's admins claiming brand infringement. When NPM took Kik away from the developer, he was furious and unpublished all of his NPM-managed modules. "This situation made me realize that NPM is someone’s private land where corporate is more powerful than the people, and I do open source because Power To The People," Koçulu blogged. Unfortunately, one of those dependencies was left-pad. It pads out the left-hand side of strings with zeroes or spaces. And thousands of projects including Node and Babel relied on it. With left-pad removed from NPM, these applications and widely used bits of open-source infrastructure were unable to obtain the dependency, and thus fell over.
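To give a sense of how small the removed dependency was, a left-pad style helper really is only a few lines. The sketch below is my own TypeScript illustration of what such a function does, not the original module's source:

```typescript
// Minimal left-pad style helper: prepend a fill character (space by default)
// until the string reaches the requested length.
function leftPad(value: string | number, length: number, fill: string = " "): string {
  let s = String(value);
  while (s.length < length) {
    s = fill + s;
  }
  return s;
}

console.log(leftPad(7, 3, "0")); // "007"
console.log(leftPad("abc", 5));  // "  abc"
```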

Submission + - How to work on source code without having the source code?

occamboy writes: Perhaps the ultimate conundrum!

I've taken over a software project in an extremely specialized area that needs remediation in months, so it'll be tough to build an internal team quickly enough. The good news is that there are outside software engineering groups that have exactly the right experience and good reputations. The bad news is that my management is worried about letting source code out of the building. Seems to me that unless I convince the suits otherwise, my options are to:

1) have all contractors work on our premises — a pain for everyone, and they might not want to do it at all

2) have them remote in to virtual desktops running on our premises — much of our software is sub-millisecond-response real-time systems on headless hardware, so they'll need to at least run executables locally, and giving access to executables but not sources seems like it will have challenges. And if the desktop environment goes down, more than a dozen people are frozen waiting for a fix. Also, I'd imagine that if a remote person really wanted the sources, they could video the sources as they scroll by.

I'll bet there are n better ways to do this, and I'm hoping that there are some smart Slashdotters who'll let me know what they are; please help!

Comment GE Corporate is an IT Provider (Score 1) 123

I worked at GE on the same team as the author a decade ago at the Toaster Division... err, Consumer and Industrial. Back then the company had distinct silos for its business units, and it was a major pain just to move sites from one silo to another; we were directed to re-IP and rename everything on an annual basis.

Teams in each division spent months changing the IP schemas of whole divisions because of an accounting change. We brought in new companies and spun off new companies on a yearly basis. I think we spun the same plant out to another GE division and brought it back in 3 times in 4 years. We had ITIL, we had Six Sigma, we had 3+ ticketing systems, and we had some real TPS reports.

GE the IT provider now seems to provide data center services for the unregulated parts of the enterprise. I am sure there are some holdouts at GE Healthcare and GE Turbine/Energy/Aviation running something on local iron. Also, no manufacturing plant is going to release its process control to the other side of the network. Putting everything on a VM is nothing new for GE. Putting disparate systems in the same data center, no big deal. Putting two divisions under one roof, each with their own data centers, their own IT, their own data, also no big deal. If you look at their history, GE built their own brand of mainframes just to account for their own enterprise. They have finally divested themselves of owning their own corporate jewels; the information is everywhere, the information is nowhere.

The data that runs the company is in other people's hands. Not my cup of tea, but GE has long since gone past sensible and risk averse. Seeing the IT and network infrastructure treated as both a commodity and an infrastructure redefines IT's role. At least they are no longer playing with IT as a profit center.

Comment Re:Database Scaleability. (Score 1) 272

If you have hit limits before (MySQL?), use a very mature platform that operates effectively when the DB does get larger than the memory available for indexes or the dataset, such as DB2, Oracle, or MSSQL. Strongly type your inputs, normalize your data set in the first place, and use the language's native data connector APIs, pointers, and record locking. Just about anything important to the application needs to go through a stored procedure, because the database should not trust anything coming directly from the application to be suitable for a query at the data layer. As always, set yourself up on a platform with service accounts that are RO and RW. In other words, don't set yourself up outside the big-data mainstream.

If your product is an unexpected hit, you will have what seems like zero time to fix the database and program design without business impact. Start with the DB and the application/web layer as load-balanced VMs from the beginning, and the database in some sort of cluster. Have the entire app platform packaged to move to an outsourced data center or Amazon on day one, so that if you see 1000x user growth in a day you have a plan. This is sort of Web Application Architecture and Design 101, as any code monkey can put data into NoSQL and hope to get it back through just the right query.
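As a rough illustration of the stored-procedure and RO/RW service-account advice above, here is a minimal TypeScript sketch. It assumes a PostgreSQL database accessed through node-postgres (pg) purely for convenience; the connection strings and the get_order / record_payment routines are hypothetical names, and the same pattern applies to DB2, Oracle, or MSSQL.

```typescript
import { Pool } from "pg";

// Separate service accounts: one read-only, one read-write.
const roPool = new Pool({ connectionString: process.env.APP_DB_RO_URL });
const rwPool = new Pool({ connectionString: process.env.APP_DB_RW_URL });

// Reads go through a stored function with a parameterized, typed input;
// the application never concatenates SQL itself.
async function getOrder(orderId: number) {
  const result = await roPool.query("SELECT * FROM get_order($1)", [orderId]);
  return result.rows[0];
}

// Writes go through a stored procedure on the read-write account.
async function recordPayment(orderId: number, amountCents: number): Promise<void> {
  await rwPool.query("CALL record_payment($1, $2)", [orderId, amountCents]);
}
```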

Comment Re:Projections (Score 1) 987

Scientists like astronomers and physicists laugh at the models of climatologists as having so many mystery factors and constants that are never described and just pushed around to get a chart that goes to an extreme a few years out, so that, surprise, funding is assured. Data and stats guys have issues with the models used to produce global temperatures up to 1950. Computer code guys have spaghetti-code problems with the models, because it is all written by interns. We all have problems with the error bars assigned to all these predictions for the past 20 years. We are sitting below the error bars from just 5 years ago; we are crying foul and suspect the whole AGW effect is being confused with the long-term warming or cooling trend that is built into Earth's climate.

I am so unsure of the data, and of who might have last molested it, that I can't tell you if we are in a natural temperature rise or fall since the time of Napoleon. Astronomers can't even tell you if the sun is releasing less or more energy than it was 200 years ago.

Comment Re:What about the hill next to that one? (Score 1) 230

There is a 1:100 risk that a house will be destroyed in the next year by something so bad that the occupants may die. Most US citizens think one in a hundred is a long shot... it is too common on the timescale of residential housing (see the back-of-the-envelope arithmetic after the list below). I don't want my federal tax dollars spent rebuilding, as I know a 1:100 event will happen again in 10 years with the darndest of regularity. Limiting our risk via building codes is the way out, or go ahead and privatize the role of FEMA, let the insurance companies hire geologists, and let the red lines begin.

LA Foothills, New Orleans, Boulder, Martha's Vineyard, SFO 1:100

Any Salt water beach, Sandy Point, Omaha, Houston 1:1000

Chicago, St Paul, Denver, 1:10000

Phoenix Az. 1:100000
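A back-of-the-envelope sketch of how a 1:100 annual risk compounds over the life of a house (my own illustration, not from the comment above):

```typescript
// Probability of at least one occurrence of an event with annual
// probability p over n independent years: 1 - (1 - p)^n.
function chanceOverYears(annualRisk: number, years: number): number {
  return 1 - Math.pow(1 - annualRisk, years);
}

console.log(chanceOverYears(0.01, 10).toFixed(2)); // ~0.10 over a decade
console.log(chanceOverYears(0.01, 30).toFixed(2)); // ~0.26 over a 30-year mortgage
console.log(chanceOverYears(0.01, 75).toFixed(2)); // ~0.53 over the life of the house
```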
