
Comments


Miss a Payment? Your Car Stops Running

wispoftow Re:end the market (907 comments)

Everyone who takes out a loan for any purpose (home, car, student loan) runs the same risk.

Lenders usually do not make money off of repossession; it is a mitigation of loss. The more difficult it is to mitigate loss, the less likely a lender is to make the loan in the first place.

And there goes any chance whatsoever of anyone but the rich ever entering ownership unless they have all the cash upfront.

about three weeks ago

Miss a Payment? Your Car Stops Running

wispoftow Re:end the market (907 comments)

Why do you call it "predatory lending?" That implies that there is some sort of fraud or misrepresentation.

I'm pretty sure that the terms and conditions are stated in full: "if you (subprime borrower who wants a car) miss a payment, then your car will shut off. These are the conditions, take it or leave it."

I am opposed to usury, but what rates/terms constitute usury is determined by the government, which is elected by the people. And let's not forget that these banks are not owned solely by billionaires -- the largest stakeholders include depositors (like middle-class you and me). If the lender failed to enforce its debts aggressively, the system would collapse. If the lender failed to expand into new markets (including subprime lending), then we depositors would take our business elsewhere.

It is "cool" in some circles to promote Robin Hood thinking -- steal from the rich, to give to the poor. But jeez, take the bus. If this is so onerous as to be oppressive, there is always the option to do what my ancestors did -- move to lands of better opportunity.

Responsibility is a two-way street, and if we are going to scrutinize the lenders, we ought to also scrutinize the decisions being made by the consumer.

about three weeks ago

Ask Slashdot: What Old Technology Can't You Give Up?

wispoftow Re:35mm film (635 comments)

Beat me to it. I love my Leica M3.

about 1 month ago

Islamic State "Laptop of Doom" Hints At Plots Including Bubonic Plague

wispoftow Self-Inflicted Damage (369 comments)

NB: I can't believe that I am responding to this submission, but here I go.

I can guarantee that whatever disease was launched would make its way back to the population that ISIS purports to represent. I predict that its consequences would eventually be more devastating in the generally poorer populations.

It seems chemical/biological warfare didn't make it much farther than the First World War for a simple reason: the "wind" changes direction.

I think Shylock said it best: "If you prick us, do we not bleed?" We're all human, and we need to cut this crap out.

about 1 month ago

Statistics Losing Ground To CS, Losing Image Among Students

wispoftow Statistical Practitioners need to Modernize (115 comments)

I am a researcher in medical informatics, and statistics is a huge part of my job, though I am not a classically-trained statistician.

First, I would like to offer a stark contrast between two types of statistician: 1) statisticians of the old mold, who are wedded to SAS and related tools, and 2) research statisticians, who employ modern methods such as Bayesian statistics and rather advanced calculus. The former tend to force every problem into what is available in the canon of SAS routines, while the latter are capable of creating custom models that suit the problem at hand.

Then there is a new breed of scientist -- the data scientist -- who tends to use black-box machine learning methods alongside the classical techniques, as programs such as SAS and R have "democratized" the field. I agree with the common gripe of traditionally-trained statisticians that these "data scientists" tend not to understand the statistical background of these computer codes. In fact, it is easy to download R onto one's computer and start firing data through it, with little regard for the merits of the model or its results. (Not all data scientists are like this; I'm simply stating a general observation.)

Another problem with statistics is that it can be very confusing to understand just what things like p-values actually mean. A first course in statistics leaves many people with a bad taste -- it is either terribly confusing or rather boring. In my opinion, this is because of traditional (frequentist) statistics, which has its origins with luminaries such as Fisher and Pearson.

The "action" today is in Bayesian statistics. This formulation allows for statistical concepts to be expressed is ways that (I believe) most people can understand. But executing Bayesian statistics mandates that one understand the underlying formulation of models; in general, they are not black-box methods. Furthermore, they can be quite computationally-expensive for large data.

Statistics suffers from the perception of being a button-pushing, boring profession. As has happened in many other fields (e.g., computational chemistry and CFD), computer programs have democratized the field so that those without years of dedicated study and training can execute statistical models. In my experience, this can be a good thing or a very bad thing. Another issue is the significant build-up of half a century of code and protocols in both industry (think big business analytics) and government agencies (think FDA).

But modern statistics is actually a hot field. Provided that one understands the background, and is willing to go the extra mile to write custom code, the rewards are endless.

about 2 months ago

If Java Wasn't Cool 10 Years Ago, What About Now?

wispoftow Re:What's the point? (511 comments)

Because *some **people ***are &sick and *(&tired) **of *all &of ***the ****bullshit &that **goes &with writing C and C++ in order to get an order of magnitude performance increase over those dynamic languages that you allude to.

about 2 months ago

Ask Slashdot: Life Beyond the WRT54G Series?

wispoftow Netgear R7000 (427 comments)

I recently replaced my third-generation Airport Extreme with a new Netgear R7000 "Nighthawk." I loaded the Tomato "Shibby" branch and was able to replace my firewall, web server, OpenVPN, and a few other services with this bad boy. Also, I get QoS.

Two weeks later, everything is fine and I am satisfied. It is interesting to me that the range of the Airport Extreme, despite being some seven years old, is comparable to this new wireless router's. However, I am happy to invest in a repeater unit running this free software, rather than sinking more into the good (but infinitely proprietary and less feature-ful) Apple hardware.

about 2 months ago

Data Mining Shows How Down-Voting Leads To Vicious Circle of Negative Feedback

wispoftow Technical Subjects need Correct Answers (293 comments)

I'm afraid that this article touches on what I perceive as a growing problem: the notion that "everyone's answers and opinions are right and have value."

This might be fine in some areas where many things are subjective, in which case the axiom "there's no disputing taste" is appropriate. In those cases, I agree that one should probably hold one's criticism.

But especially in technical areas, such as computer programming and the physical sciences, the laws of physics and logic oftentimes point to a more correct answer. In my own work, I find that I am constantly wading through massive amounts of literature and wondering: what the hell happened to the peer review that used to weed much of the crap out? Eventually, wrong answers and half-baked opinions stack up and warp reality, such that it is difficult to find or promote the few that are rigorous and correct.

I think it's a similar situation on peer-reviewed sites like Stack Exchange. Oftentimes, the posted solutions to a problem run the freaking gamut. I am glad that a lot of the good answers (based on sound reasoning and experience) are boosted up, while the dreck (based on fuzzy thinking, old wives' tales, and "antipatterns") is ranked downward, thus giving some help to an interested third party (such as me) who really doesn't have time to be patient and P.C.

Disclaimer: the right answer can be the minority opinion -- which may have been knocked hard by other reviewers. Here I am speaking about the 99% of the time that the best answer is the most highly rated.

about 5 months ago

New PostgreSQL Guns For NoSQL Market

wispoftow How about Parallel Query Execution? (162 comments)

NB: I love PostgreSQL with all my heart. I always upgrade to the most recent version, because they implement features that I really need. Added to the existing features of Postgres, it's totally awesome.

But as I have moved toward "Big Data" and the market segment that these newfangled (non-relational) databases target, I find myself wishing that Postgres could run my vanilla query (*singular*) using all processors. As it is now, I have to either write some awful functions that query manually-partitioned subtables, or simply wait while it plods through all billion or so rows.

about 5 months ago

Why Scientists Are Still Using FORTRAN in 2014

wispoftow Re:Ten Reasons to use Modern Fortran (634 comments)

This is a very interesting example. Would you believe that gfortran 4.2.1 gives z=2 (two!) and gfortran 4.8 won't even compile, due to a bizarre REAL(4) vs REAL(8) error? There's something very wrong with this, and your point is taken. (The intent(inout) attribute of f90 would not have helped here, either.)

I would point out that, since this has side effects, I would probably have done this as a SUBROUTINE instead of a FUNCTION. Then, things would have materialized in a predictable way. I try to write all of my functions as "pure" functions that have no side effects.
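For concreteness, a minimal sketch of what I mean by a pure function (hypothetical names, free-form Fortran; PURE makes the compiler reject side effects outright):

pure function scale2(x) result(y)
    real(8), intent(in) :: x
    real(8) :: y
    y = 2.0d0 * x
    ! x = 0.0d0      ! would not compile: x is intent(in)
    ! print *, x     ! would not compile: no I/O in a PURE procedure
end function scale2

Had the function in your example been declared PURE, the z=2 surprise could not have happened.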

about 5 months ago

Why Scientists Are Still Using FORTRAN in 2014

wispoftow Re: We're Not (634 comments)

You keep trying to convince me that Fortran is not a panacea, as though I might actually believe that it is. I can only conclude that you are trolling.

If one sets a random seed from a reproducible generator and then starts a swarm of trajectories sampled from a Maxwell distribution of velocities, then one should be able to get exactly the same computed trajectories on any computer that implements IEEE arithmetic. These are computationally deterministic simulations. This is reproducible research.
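For what it's worth, the seeding step is nearly a one-liner with the standard intrinsics; a minimal sketch (with the caveat that the intrinsic generator's stream is processor-dependent, so truly portable reproducibility means shipping your own generator):

program seed_demo
    implicit none
    integer :: n
    integer, allocatable :: seed(:)
    real(8) :: v(3)
    call random_seed(size=n)   ! how many integers this generator's state needs
    allocate(seed(n))
    seed = 20140905            ! fixed seed: identical stream on every rerun (same compiler/runtime)
    call random_seed(put=seed)
    call random_number(v)      ! raw uniforms, to be transformed into Maxwell velocities
    print *, v
end program seed_demo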

Or maybe you are invoking quantum mechanical uncertainty for a classical mechanics simulation. Even quantum mechanical simulations that start with the same wave packet (x and p) can be reproduced faithfully if the initial conditions are known and the order of operations is the same. This discussion has nothing to do with ensemble averages or quantum mechanical uncertainty.

Compilers most certainly can dictate the order of operations of a CPU -- that's the whole point. Whether this is most efficient for a given architecture is another matter, and one reason why performance can drop if the CPU is ordered to perform its instructions in a suboptimal manner.

Finally, I agree with you that MPI has nothing to do with IEEE arithmetic, and I really don't know why you have brought this up.

(I regret that I must refrain from further discussion on this topic. Ummm... you win.)

about 5 months ago

Why Scientists Are Still Using FORTRAN in 2014

wispoftow Re:Ten Reasons to use Modern Fortran (634 comments)

Also, I think that my quick post was fairly incomplete; it's a pretty big topic. People generally don't want to make copies every time data enters a function, for performance reasons.

Modern Fortran has "intent" attributes that prevent overwriting by procedures, which go quite a way toward improving safety and guaranteeing immutability for parallelization purposes. For example:

subroutine dosomething( A )
    integer, intent(in) :: A(:,:,:)   ! read-only: the compiler rejects any assignment to A
end subroutine dosomething

Altering A inside this subroutine generates a compile-time error. In Fortran, if you wish to "pass something by value," thus keeping the original data intact, you can always make the copy explicitly; the copy itself will still be passed by reference. (To the best of my knowledge -- I hope a Fortran maven will correct me if I am wrong.)
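For illustration, a minimal sketch of that explicit-copy idiom (the names here are hypothetical, and "mutate" is just a stand-in for any routine that writes to its argument):

module copy_demo
    implicit none
contains
    subroutine mutate(a)
        integer, intent(inout) :: a(:,:,:)
        a = a + 1                 ! writes to its argument
    end subroutine mutate

    subroutine caller(big)
        integer, intent(in)  :: big(:,:,:)
        integer, allocatable :: scratch(:,:,:)
        scratch = big             ! explicit copy (automatic allocation, Fortran 2003+)
        call mutate(scratch)      ! the copy itself is still passed by reference; big is untouched
    end subroutine caller
end module copy_demo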

about 5 months ago

Why Scientists Are Still Using FORTRAN in 2014

wispoftow Re:Ten Reasons to use Modern Fortran (634 comments)

In my mind, I equate "passing by reference" with passing the memory location of the first element to a function, and "passing by value" with making a copy and then passing that to the function/subroutine.

If the array being passed consumes almost all of the memory of the machine (very common in scientific computing), then making a copy first would leave you dead in the water.

about 5 months ago

Why Scientists Are Still Using FORTRAN in 2014

wispoftow Re:Ten Reasons to use Modern Fortran (634 comments)

I agree. I posted elsewhere on this thread saying essentially the same thing, especially about dynamic memory allocation and free-form input.

about 5 months ago

Why Scientists Are Still Using FORTRAN in 2014

wispoftow Re: We're Not (634 comments)

Please go back and look at my previous comments. I said "consistent," not "exact." Now you have called "bullshit" on me, and so it's time to go to a source to back up what I have said. I choose "Modern Fortran Explained" by Metcalf, Reid, and Cohen, specifically Chapter 11, which covers the Fortran implementation of IEEE 754-1985 and IEC 559 (ISO). These guys are associated with the likes of CERN and NAG.

IEEE arithmetic has ~nothing~ to do with Fortran per se (see my comment above) -- the Fortran standard simply demands its implementation. Accusing Fortran users of "being stuck in their ways" on this count is blatantly stupid, as any language that implements IEEE is guilty of the same crime. A "type" is more than its storage -- a type is the union of its storage with its operators. In Fortran (to the best of my knowledge), using IEEE arithmetic alters these operators, including essentials like +-*/ and sqrt. (Aside: note that GROMACS started life as a fast-enough, precise-enough sqrt project, owing to sqrt's role in computing Euclidean distances. In my experience, it did not work out so well unless a thermostat kept bleeding out the aberrant velocity build-up.)

Since the FP types use finite width, no doubt there are still FP errors, as this is not exact arithmetic. Fortran does not fix this, and I never said it did. Again, I said IEEE, which Fortran implements, makes it "consistent."
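For the curious, the hooks for all of this live in the standard intrinsic modules (Fortran 2003); a minimal sketch of querying and demanding IEEE behavior:

program ieee_demo
    use, intrinsic :: ieee_arithmetic
    implicit none
    real(8) :: x
    x = 1.0d0
    if (ieee_support_datatype(x)) then
        print *, 'IEEE kind; IEEE sqrt supported:', ieee_support_sqrt(x)
        call ieee_set_rounding_mode(ieee_nearest)   ! demand round-to-nearest
    end if
end program ieee_demo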

Different CPUs have different performance characteristics for various operations. Where there is a preferred/faster order of operations, then the compiler can reorganize so that things run faster. This can be good and/or bad. From Metcalf, "Some computers perform division by inverting the denominator and then multiplying by the numerator. The additional round-off that this involves means that such an implementation does not conform with the IEEE standard."

My thoughts above were about ~different types~ of machines doing billions of operations in whichever way they please, thus leading to different results in long-running simulations. It is possible to demand IEEE arithmetic and exception handling, in which case "Execution may be slowed on some processors by the support of some features."

Now, let's talk about your fuzzy thinking. You said "There is no physics based reason for insisting that a truncation error should always be the same": do you expect bitwise arithmetic to differ, given the same sequence of instructions and the same starting values? You also think (or thought) that scientists don't post their trajectories or snapshots. I'm actually starting to wonder about you.

This subthread had little to do with Fortran, except as an innocent bystander that happens to have IEEE support. It had to do with the suggestion that a consistent order of operations can lead to consistent results across different types of machines.

about 5 months ago

Why Scientists Are Still Using FORTRAN in 2014

wispoftow Re: We're Not (634 comments)

What you say is inconsistent with decades of my colleagues' work on the subject, and with my own observations. (I have a few dozen papers in JACS and J Phys Chem.)

a) The problem is that seemingly "stable" systems develop instability over time due to integration and floating point error. You mention chaos, and this is it -- sensitivity to minute changes in initial conditions and to accumulated tiny effects. But it's not randomness -- this chaos could be reproduced using well-defined arithmetic.
b) People do publish trajectories in supporting information or on their web sites, perhaps only snapshots. But how would you ever get from snapshot to snapshot to prove that you had implemented their methodology correctly, or to reproduce a phenomenon -- perhaps an interesting one, perhaps evolution to a corner case exposing a theoretical/methodological error? If you are working on different processors/compilers, it's almost impossible to reproduce.

I agree that people tend not to do IEEE arithmetic with classical MD. Please go back a comment or two and read it this time to confirm it for yourself. I was trying to give an example of a case where exactly reproducible results could be useful in the field of MD, particularly in development.

I remain convinced that IEEE arithmetic is useful in many important (but perhaps comparatively rare) circumstances, and that FP error will always be a lingering issue that rears its ugly head. Fortran implements ways of dealing with it consistently.

about 5 months ago

Why Scientists Are Still Using FORTRAN in 2014

wispoftow Re: We're Not (634 comments)

First of all, please note that I said ~exact~ reproduction; you keep going back to some ensemble average as being "good enough." Secondly, what you are saying about the scientific topic is inconsistent with numerical analysis of MD simulations using, e.g., the velocity Verlet algorithm.
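For reference, one velocity Verlet step is only a few lines (a generic sketch with a harmonic stand-in force, not any production force field); every rounding difference in these updates feeds back into the next step, which is exactly how single vs. double precision trajectories diverge over time:

module vverlet
    implicit none
contains
    subroutine step(x, v, a, dt, k)
        real(8), intent(inout) :: x(:,:), v(:,:), a(:,:)   ! positions, velocities, accelerations (3 x N)
        real(8), intent(in)    :: dt, k                    ! timestep; stand-in force constant
        real(8) :: anew(size(a,1), size(a,2))
        x = x + v*dt + 0.5d0*a*dt*dt   ! position update
        anew = -k*x                    ! new accelerations (harmonic stand-in for the force field)
        v = v + 0.5d0*(a + anew)*dt    ! velocity update with averaged acceleration
        a = anew
    end subroutine step
end module vverlet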

Please see http://chris.sweet.name/docs/s... for an example of floating point error in action (pay special attention to the single vs. double precision differences that appear once the simulation has run for a long time). One of the early criticisms of GROMACS (the fastest MD!) was that it ran really great in single precision, while everyone else (AMBER, CHARMM, and homegrown codes) criticized it for floating point round-off error leading to trajectories that flew apart because they developed too much momentum.

These don't "appear" if you have slapped a thermostat on the simulation (NVT ensemble). This is why you should always run an equilibration run and then switch to constant energy (NVE ensemble) when you are generating results that you wish to report. Otherwise, you are sweeping all sorts of problems due to floating point arithmetic and integration error under the rug.

But I suppose that you are doing classical MD -- trying to draw some inference on e.g. protein dynamics based on interactions between van der Waals spheres. These simulations tend to be "right" half of the time -- they are wrong until you've cooked the force field enough to match the definitive experiment. Constraints go a long way in keeping the system together.

So, in this environment, where the primary interest is generating "ooh's and ahh's" from movies of dancing proteins as quickly as possible, you are probably safe using single precision arithmetic and hiding all of your sins with a nice, strongly coupled thermostat and barostat.

about 5 months ago

Why Scientists Are Still Using FORTRAN in 2014

wispoftow Re:IF (X/Y*Z) 100,300,50 (634 comments)

Just to correct my earlier comment: computed GOTO was ~usually~ done with integer arithmetic, whereas your example is an arithmetic IF, which branches on the sign of a numeric expression. I made my comment based on your use of X, Y, and Z, which I am sure were declared via IMPLICIT DOUBLE PRECISION (A-H,O-Z) :-)
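For anyone who never had the pleasure, the two constructs side by side (fixed-form FORTRAN; note that X, Y, Z, and I are all implicitly typed, just as in the old days):

      PROGRAM OLDIF
C     Arithmetic IF: branches on the SIGN of a (here REAL) expression:
C     negative -> 10, zero -> 20, positive -> 30
      X = 1.0
      Y = 2.0
      Z = 3.0
      IF (X/Y*Z) 10, 20, 30
   10 PRINT *, 'NEGATIVE'
      GO TO 40
   20 PRINT *, 'ZERO'
      GO TO 40
   30 PRINT *, 'POSITIVE'
C     Computed GO TO: branches on an INTEGER index (1st, 2nd, 3rd label)
   40 I = 2
      GO TO (50, 60, 70), I
   50 PRINT *, 'BRANCH 1'
      GO TO 80
   60 PRINT *, 'BRANCH 2'
      GO TO 80
   70 PRINT *, 'BRANCH 3'
   80 CONTINUE
      END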

about 5 months ago

Why Scientists Are Still Using FORTRAN in 2014

wispoftow Re:Q: Why Are Scientists Still Using FORTRAN in 20 (634 comments)

Most people learned Fortran in a class intended to teach scientific programming. I have never, ever, ever seen a course catalog that lists "CS 201: FORTRAN PROGRAMMING." It has always been about getting the scientific result -- it just so happens that Fortran has been pretty good at that, and actually pretty simple, so it has oftentimes been used.

That being said, C almost killed Fortran (77) because the committee waited so damned long to bring out Fortran 90. People were sick and tired of waiting for dynamic memory allocation and free-form input, and so many people who were past entry-level programming jumped to C (which I would never relish teaching to a beginner who has no solid interest in computer programming).

about 5 months ago
