
Comments

Microsoft Promises Not To Snoop Through Email

swan5566 Re:Liable suit (144 comments)

You can only sue for actual harm that was caused. This would imply they would have to convince a jury that people took that campaign seriously.

about 4 months ago
NSA General Counsel Insists US Companies Assisted In Data Collection

swan5566 Re:Meanwhile if they openly admitted it... (103 comments)

But the NSA would have been free to mention that fact and thus explain away their denial. And yet they didn't.

about 4 months ago
Microsoft Kills Stack Ranking

swan5566 Re:Encountered this kind of thing ... (204 comments)

It could, but the question is should it.

Your complaint about C++ is like a CGA not knowing how to use Simply Accounting. You got a degree in computing science, not C++ programming. Yes, C++ programming is useful, and yes, you might have wanted to take a course in that, but it's not related to the degree.

And who decided that a CS degree should be defined this way? My point is that if we redefine it to include more on-the-job elements, the overall value of the education will go up. Any time an organization (not just higher education) is the primary evaluator of itself, over time you will get policies and groupthink that are more and more out of touch with reality.

True, but industry wants the other extreme, and nothing will satisfy them. Bottom line: if they want an employee who can do X, they can train that employee. That's how it works in other industries.

I think there's an easy middle ground to shoot for here. And note that industry, not the school, is the one providing the jobs, so it should intrinsically be in the driver's seat more than the school is.

In the upper-level compsci courses I took, the languages were treated as a tool, not a subject. We were given a one-day primer and the manuals, but it was a course on an "advanced computing topic", not a "language". You could have taken it on yourself to learn C++; if you can get a compsci degree you can learn C++. Take a correspondence course, write and release a freeware program. Done.

I'm surprised it took you until the interviews to catch this. It should have come up earlier -- surely the career counselors would have caught it, or the work experience and internship placement programs, or even just talking with other students, graduates, and profs.

My university had a couple of C++ electives I could take, including one by correspondence, usually worth one credit. Or you could take C++ separately from university. I took a C++ Windows programming course that looked at MFC, COM, and the Windows event model from an academic point of view. You didn't need them to graduate, but everyone was encouraged to take them since they were "relevant" in the job market. The profs were all well aware of C++'s place in the world; they just thought it was a terrible teaching language for most subjects.

I'll be the first to admit I wish I had known more about what was coming my way after graduation, but I don't think that should let them off the hook. I paid them to educate me so I could get a job. Suppose I buy a car from a dealership, but they don't put any oil in the engine (after all, they are selling you a car, not oil). You would call a person naive about cars if they ignored the engine light and tried to drive it as is, but that doesn't change the fact that the dealership dropped the ball.

And having them "suggest" that you take additional or supplemental courses to prepare for the job market is just an admission that their curriculum is sub-par. This again comes down to expectations. They don't better prepare you for post-graduation in their curriculum because they don't have to, and that detracts from the very point of higher education existing in the first place. My prediction is that if and when someone changes this system so that schools are held to a better standard of preparation, the overall quality of higher education will suddenly be much more valued in society, and that person will go down as the genius who revolutionized the outdated Western higher educational system.

about 9 months ago
Microsoft Kills Stack Ranking

swan5566 Re:Encountered this kind of thing ... (204 comments)

Work pays you because you understand calculus. You pay school to develop that understanding. School and Work are not the same.

You miss my point - school could do a better job preparing you for work.

Because my boss expects me to already understand what I'm doing. The entire reason you go to school is because you don't know, and you want to learn.

There are often times when I don't understand what I'm doing when I receive an assignment, and I have to go teach myself something new to get it done.

You are expected to make mistakes and learn from them.

At my job? No -- I'm expected to learn whatever I need to get it right the first time. At school? I had several courses where the professor told us straight up front that he or she thought tests were stupid and out of touch with the real world, and that the majority of our grade would be homework and projects. I have to say I think I learned the most in those courses, and being expected to get it right on the homework was not a hard thing at all -- it made you take it more seriously than assuming you would just bump up your grade at the final.

University is not grade school, my friend; "homework" is your "in-class quiz" and "practice problems" -- it's just not done in class, where it would be a complete waste of the limited time you have with the prof. I don't know about you, but I got 3 hours of class time a week. Depending on the course, I got another few hours of TA or supervised lab time. That class time was for the prof to explain the material and answer questions. It would have been idiotic to use that time to do practice problems on our own.

Well I had both in-class ones and ones where you submitted electronically before class. It made sure the students read the material before class and the lecture (which focused on the tougher parts of the material) wasn't wasted on the easy stuff. Seemed to work pretty well from my standpoint.

Why? In the real world your boss expects you to do Y, and assumes you already know the prerequisite A,B,C. School is where you learn your A,B,Cs.

Yes, and they can do a better job with teaching you about Y as well.

University isn't a trade school.

And this is precisely the problem. It should be, in part. And if the professors don't want/can't do it, then find better ones that will.

I didn't get a degree in computer science so that I could learn how to deploy Active Directory, how to properly configure Apache for security, how to program against whatever library/API is popular "in industry", or how to write R. You graduate with a degree in comp sci and you should know how computers work, how compilers work, how networks work, how programming languages work, and what procedural, functional, and object-oriented mean; you'll know about recursion, concurrency and resource locking, semaphores and critical sections, atomic transactions; you'll know about AI and SQL; you'll know how algorithms work, how stacks work, how fundamental data structures work, how to compute performance characteristics, etc.

I use a lot of this knowledge all the time, and even the stuff I don't use I'm aware of and can recognize its applicability when it comes up. So when I get asked, given an R program that runs in 5 minutes to an hour on inputs of n=3 or 4, how big can they go... I have the tools to analyze it, say it's O(21^n), and conclude you can get up to about n=6 on a PC, maybe 9 or 10 on a cloud platform for some $$$ (a real example, by the way, from a couple weeks ago). The algorithm was about as good as it could be, so we analyzed the problem itself and came up with a new way of defining the solution space that could be evaluated and searched in O(n^3) and give us the results we needed. And now we can go to n=20 and beyond on a laptop. That's what I went to university for.
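The back-of-the-envelope extrapolation described above can be sketched in a few lines. The timings and time budget here are hypothetical (the original post only gives rough figures); the point is just that for an O(21^n) algorithm you can fit the constant factor from one measurement and solve for the largest feasible input size:

```python
import math

def max_feasible_n(t_measured, n_measured, budget_seconds, base=21):
    """Given one measured runtime of an O(base^n) algorithm at a known
    input size, estimate the largest n that finishes within the budget."""
    # Model t(n) = c * base^n and solve for the constant c.
    c = t_measured / (base ** n_measured)
    # Largest integer n satisfying c * base^n <= budget_seconds.
    return math.floor(math.log(budget_seconds / c, base))

# Hypothetical numbers: 5 minutes at n=3, with a one-week budget on a PC.
print(max_feasible_n(300, 3, 7 * 24 * 3600))
```

With these made-up inputs the estimate lands around n=5 or 6, which matches the spirit of the anecdote: exponential growth caps n in the single digits no matter how much hardware you buy, while the O(n^3) reformulation mentioned above makes n=20 trivial.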

The stuff industry wants -- best practices for coding Java interfaces, deep API knowledge in whatever API they happen to use that week (.NET, Java, Qt, whatever), how to use Team Foundation Server or git or whatever they happen to use that week, and so on (to use programming as a base example) -- is what you get from doing the job.

Perhaps there should be programming apprenticeships and IT apprenticeships in much the same way there are for plumbers and electricians, but plumbers and carpenters get PAID ON THE JOB to learn that stuff, and so should IT staff and software developers.

Industry wants everything for free. Frankly, if industry had its way, university graduates would have studied their employee orientation & policy, ISO procedures, and would already know where to park too.

Well, I got a degree in CS to get a job related to CS, period. I really didn't have any precise expectation as to what I was going to learn -- I trusted the school to know what I needed to know. I found out after I got out of school that there were a lot of things related to the computer industry that they didn't even mention once, let alone teach. And there were other things that they mentioned but gave you the distinct impression it wouldn't matter if they glossed over them to get on with "more interesting" topics.

Well, guess what -- interview after interview, I kept running into the basic qualification questions. An example is "have you programmed in C++ before?" Sad to say, but somehow in all my coursework (undergrad and grad school), no. Other imperative programming languages, yes, and mostly functional languages like Scheme that no one in industry uses. I knew lots of other really cool, advanced CS stuff that their company could use, but unfortunately that's not a substitute for C++ experience (coursework would have been acceptable). Could I have taught myself? Of course, but at that point it's too late; I need to get a job, not learn more. I'm not even talking about "flavor of the month" technology, I'm talking about basic industry knowledge that a) many companies want and b) doesn't change much or too fast. It's not beyond the school to include or emphasize it more; it's just that they don't want to, and they won't change because currently they aren't held accountable for the type of education they provide. Sure, it's great by their standards, but their standards aren't the ones people care about at the end of the day.

about 9 months ago
Microsoft Kills Stack Ranking

swan5566 Re:Encountered this kind of thing ... (204 comments)

I disagree.

Thanks to Wolfram Alpha, Stack Overflow, and the internet in general, homework is as meaningful or as meaningless as the student chooses to make it. Most professors assigned 5-10% of the grade to homework, and that was usually just for completion, not correctness.

Homework is an opportunity to practice and develop an understanding of the material being taught.

These statements go against what I have experienced in the real world. The internet and the like are my primary source of information, even with my textbooks sitting next to me on my desk. The ability to look things up and self-teach on the fly is much more useful than remembering every last thing that was taught to me in school. That's not to say there's no value in the latter -- being able to do calculus and such by hand is necessary because you need to understand what's going on -- but no one is going to pay me to sit there and do calculus problems by hand all day. I use a calculator, even though I know how to do long division; it's faster and less error-prone. I'm not saying do away with testing altogether, but rather use the real world as the arbiter of where the emphasis on knowledge and competency should really be put, rather than some professor's pie-in-the-sky notions.

Also, you usually don't get "redos" on an assignment from your boss -- it's your responsibility to identify when you don't understand something and to ask for help right away, not after he's "graded your work". Once it's on paper (or checked into a repository), it's expected that you've done what you can to get it right the first time. The professor can always hand out practice problems or in-class quizzes as a competency check, but the major evaluation should mirror what is desired in the real world.

"Projects" are also rife with problems, with much of the grade reducing to grading the student's skill at project management itself. That's an important skill too, but not the one that should be getting tested. (To be fair, taking tests is also a skill unto itself, but test taking is easier to learn to do well than project management.)

And group projects have all kinds of other issues. In my experience, the only people who like group projects are the ones who treat them as free marks that some of their classmates will earn for them.

They don't have to be group projects, they can be individual ones.

I guess my overall comment here is that because professors have a Ph.D. in their discipline, they think they know what's best about how to teach. That's not true. My opinion is that everyone who wants to teach at a college or university should have some sort of education training (you know, from the Education Department) to dispel a lot of the silly, off-in-the-clouds notions that many professors consider good teaching practice, notions that people in the Education Department would be quick to squash. Professors also think that because they are the "higher education pinnacle of society", they automatically know what the real world needs, and that the real world conforms to their ideas of what a person of that discipline should know to be useful and functional. Also not true. It's shocking how untrue this is. There needs to be more of a vocational emphasis in higher education, and professors need to teach more of the things they currently turn their noses up at -- the very things that industry constantly complains about with fresh graduates and ends up having to teach on the job.

about 9 months ago
Microsoft Kills Stack Ranking

swan5566 Re:Encountered this kind of thing ... (204 comments)

I understand the difficulty, but in the case of advanced math courses, many of them don't have a lot of students, which deflates the argument that those who would receive an F under a curve probably would have received an F anyway. IMO, the more advanced the course (especially in grad school), the less sense tests make for evaluation at all. It's better to have projects and homework be the primary evaluators, since that is more in line with handling "real world" scenarios -- which in theory is the point of giving out grades to begin with.

about 9 months ago
Microsoft Kills Stack Ranking

swan5566 Re:Encountered this kind of thing ... (204 comments)

I see your reasoning on how it factors out the variability among professors (which is a good thing), but an even better solution is to change the professors themselves -- either by forcing a review of their teaching and evaluation methods, or by removing the professor altogether. The job market for academia has been hyper-saturated for quite some time now, so it should be rather easy to cycle through them until you find good ones.

about 9 months ago
Microsoft Kills Stack Ranking

swan5566 Re:Encountered this kind of thing ... (204 comments)

Bell curves "work" in academic settings because there's hardly any accountability imposed upon tenured professors for how they evaluate students. It has been shown repeatedly that grades (as they stand) are a poor predictor of success in the outside world, yet academia continues to ignore this in any practical sense.

about 9 months ago
Anti-Poaching Lawsuit Against Apple, Google and Others Given the Green Light

swan5566 Re:I'm confused (172 comments)

You miss the point. You can make your case for how competent you are during your interview, but that still won't hold a candle to having X years experience at company Y. Again, I'm talking about getting your foot in the door, not about your performance once you're in.

about 9 months ago
Anti-Poaching Lawsuit Against Apple, Google and Others Given the Green Light

swan5566 Re:I'm confused (172 comments)

For fresh graduates, it's not about the salary, it's about even getting your foot in the door.

about 9 months ago
Anti-Poaching Lawsuit Against Apple, Google and Others Given the Green Light

swan5566 Re:I'm confused (172 comments)

One problem is that poaching encourages a dichotomous working class system. Poaching is good for employees who have experience, since their wage will go up because companies fight over them, but it's bad for potential employees fresh out of school. No employer wants to be the one to front the capital to train them. They would just rather poach someone that will be effective on day 1.

about 9 months ago
Digital Revolution Will Kill Jobs, Inflame Social Unrest, Says Gartner

swan5566 Re:...And? (754 comments)

Granted, but that doesn't prevent him from suggesting "non-magical" solutions. My guess is that he has some solutions in mind, but he would rather someone else propose and defend them rather than being the unpopular voice.

about 10 months ago
Digital Revolution Will Kill Jobs, Inflame Social Unrest, Says Gartner

swan5566 ...And? (754 comments)

Gartner fails to include any sort of actionable response to this phenomenon, or even any argument that anything can be done about it. The article seems half-baked.

about 10 months ago
Why Are Some Hell-Bent On Teaching Intelligent Design?

swan5566 Re:Young Earth Creationism Considered Harmful (1293 comments)

I would say the fact that your morality has been honed over millions of years by natural selection is a pretty good reason to listen to it, just like using your lungs to breathe is a pretty good idea. And just like the conscious decision to use your biologically evolved lungs to breathe, using your morality to make decisions is not really much of a choice. The guilt of acting against your own moral code is a very compelling force.

Yes, but the fact is that many people act according to opposing belief sets. Who's to say what's really right, then, if it's based purely on subjective belief? Very compelling, yes, but that does not mean it couldn't just be done away with, as some people do. And then who's to say they are wrong for doing so?

I don't think morality needs to be immutable to be morality. Furthermore, what could be less immutable than a morality controlled by an omnipotent dictator who could change his mind about what is moral any time he wants? It seems like new revelations change morality pretty frequently.

In fact, I think the naturalistic evolutionary explanation of morality is far more objective than the "it comes from God" view. For one thing, it actually has reasons for being the way it is, rather than the non-reason of "it's the way God wanted it".

How is that a non-reason? That either assumes that God doesn't really exist or that God really isn't God (i.e., something else takes higher priority). I'm sorry, but a changing morality is inherently self-contradictory. Otherwise, you have no business calling the worst crimes in humanity wrong as long as their perpetrators say "well, it's right according to me".

It does not follow the scientific method. It is the starting point of the scientific method. Denying the axiom is unscientific because it denies an axiom of science. There is no point in believing in scientific method if you don't believe what it is based on.

The fruit gained from the scientific method does not require atheism. It still provides insight and the same results within the context of natural phenomena with that assumption in place. I see no compelling reason to think otherwise barring just accepting an authoritarian stance that atheism must come along for the ride.

My gripe here is that the labeling is an authoritative attempt to avoid scrutiny. It's the flip side of what atheists often say against Christians when they inappropriately take for granted that everyone agrees upon the authority of the Bible in a debate (which obviously is not the case).

This is no different than any axiom. If you could scrutinize an axiom and prove it to be true or false, it would not be an axiom. Believing in the Bible is an axiom like believing in a universe governed by physical laws is an axiom. They are no different qualitatively, so I agree with you there. My point is not to show that science is right and religion is wrong. My point is to show that these are incompatible viewpoints.

Disagree. You don't buy into an axiom without scrutinizing it very thoroughly. You just don't scrutinize it deductively. Otherwise it really is FSM, or whatever you like.

Why is my definition of a spirit any different? What difference does it make if the spirit is made out of atoms if the results are functionally the same? Even if you believed in God, why is it so hard to imagine that the physical universe God created is incapable of housing a soul made out of atoms?

My point is the difference between using the word "spirit" as a euphemism and it being a real entity apart from the physical body.

There are many different definitions and conceptions of what free will is. Some are completely incoherent in any reality. Some are perfectly consistent with determinism (i.e., compatibilism).

You are right to say that if I assume a purely physical universe, I don't need an experiment to confirm this assumption. But these experiments were still very shocking to people. I suspect many of them were trying to hold a dualist view of mind and body, and this experiment challenged that. Not being a dualist myself, I was not all that shocked by the result.

Well, I would say to them to examine the assumptions baked into the conclusions, and to separate those from the evidence. I see no reason to be shocked personally -- it would seem obvious that the decision-making process itself involves, well, the brain.

Where do assumptions ever come from? If they are based on evidence, then they are not assumptions; they are tentative logical conclusions.

I don't think we get to decide what assumptions we believe in. I think we naturally find some assumptions compelling enough to believe on faith and others not, and it's as simple as that. Sometimes these assumptions can change over time like when people lose their faith or have a religious experience that gives them faith.

I also think your assumptions depend a lot on how you were raised. For example, people in the United States are more likely to believe in the Bible than the Quran or Hindu manuscripts. This is clearly not a coincidence.

Indeed not, but that's getting into the dynamics of human decision-making and influence, which is a whole separate discussion. I guess overall I agree with what you said, except that I do believe there is a free-will element of "choice" to it -- different people decide all sorts of things regarding religion, including that it is nothing worth considering.

about 10 months ago
Why Are Some Hell-Bent On Teaching Intelligent Design?

swan5566 Re:Young Earth Creationism Considered Harmful (1293 comments)

Imagine if you asked me why I didn't like the idea of morals from the Bible, and I said "This reduces my sacred sense of morality to just God. It makes the whole concept of morality meaningless, so why should I listen to it? Just because I don't want to go to hell?"

You would probably protest that my vision of what God is must be different than yours, or I would not have said that. By the same token, I think your concept of "just the physical world" is different than my view of the physical world, which encompasses everything in the universe. I feel like morality being the product of an amazing process like natural selection is rather inspiring in its elegance. Some people do not like this idea, but I find it very beautiful.

To assume that morality "evolves" is to go against the whole reason to listen to it. In short, it just becomes a candy-coated way of saying "I'm going to do whatever I want to do", or "we're going to do whatever we want to do", while inappropriately using objective language to avoid scrutiny. If it's not immutable and objective (and independent of our state of evolution), then it's not what it makes itself out to be, which is a red flag that something's wrong with this definition.

Things may "go back to normal", but now we can no longer look at the present state and backtrack to learn about the past. Imagine God changes the direction of a single photon. When we detect this photon, we can in theory calculate where it must have been 1 million years ago (1 million light years away in some direction). But now all of that is called into question.

But as I argued before, just because something can't be figured out through deductive reasoning is in no way grounds to assume that it can't exist.

No it doesn't. It is an assumption. It is an axiom. There is no proof for this. There *can* be no proof for this. This is why believing in miracles is such a problem. It assumes this axiom is false, and hence starts you from a position that is unscientific.

So it appears that this is just a war of semantics. You label it "unscientific", yet you also admit that it does not follow from the scientific method. My gripe here is that the labeling is an authoritative attempt to avoid scrutiny. It's the flip side of what atheists often say against Christians when they inappropriately take for granted that everyone agrees upon the authority of the Bible in a debate (which obviously is not the case).

I would rephrase it as... This argument assumes the human spirit is physical in nature and our consciousness is an emergent property of underlying physical laws. Isn't that amazing?

But that negates the whole concept of a "spirit" to begin with; it's rather just a convenient way to summarize the human organism with a term that has a misleading connotation (if this definition were indeed the case).

Which study are you referring to?

There have been many.

http://en.wikipedia.org/wiki/Neuroscience_of_free_will

This all boils down to what you assume about the existence of the supernatural a priori (in this case, a human spirit). If you assume not, then obviously there's no free will, and you don't need any neurological studies to conclude that, because that would logically follow from a purely physical existence and cause-and-effect. However, if you assume the supernatural exists, then you cannot hope to fully understand everything about how the human brain works, because other forces beside physical ones are acting upon it.

What if I said that this requirement for evidence to believe something is inherently scientific, and it falsely grants science authority over the supernatural. Science works for everything except the supernatural. When it comes to FSM you must just have faith.

That's to make a prior assumption about what should qualify for evidence, and what shouldn't. That begs the question of where did that prior assumption came from.

about 10 months ago
Why Are Some Hell-Bent On Teaching Intelligent Design?

swan5566 Re:Young Earth Creationism Considered Harmful (1293 comments)

This is where I would disagree. But I would also ask why it is important that morality transcends humanity or the physical world?

Because that indicates that morality points to something more than just the physical world. When I try to reduce my sense of right and wrong and tell it "you're just a product of evolution", it doesn't fit. It makes the whole concept meaningless, so why should I listen to it?

Except that science treats claims of string theory (or as I like to call it "string hypothesis") to be testable in principle as a phenomenon of nature, even if tests have not yet been devised. I think most physicists will agree that string theory is the unfortunate name of a hypothesis that has certainly not risen to the level of confidence of a theory.

This is true, and especially since the discovery of the Higgs boson, it makes many exotic flavors of string theory less likely to be true.

No, we haven't observed every particle in our science lab. But most of the atoms that existed at the time the events of the Bible took place still exist. In fact, if Jesus was a real person, it is almost statistically inevitable that some of the atoms that comprised him now comprise you. The time and place where all the miracles of the Bible supposedly took place are not so distant. The universe is 13 billion years old and almost unfathomably large. If miracles were happening as recently as 2000 years ago, on Earth no less, we are basically living in miracle-land in basically the same time period, in cosmological terms.

The matter involved with a supposed miracle would be constrained in time, not just space. Things would go back to normal once it's done.

The inability to disprove the supernatural claims of the Bible is not good evidence for their truth.

That is true, but the reverse argument is also true.

But this is a fundamental assumption of science that the universe (including the one that God might inhabit) is governed by physical laws.

This "fundamental assumption" as far as it dictating God is what I disagree with, and I claim does not stem from any scientific result.

Humans, according to Christianity, are the causes of their own decisions (i.e., free will), which is why we have moral responsibility. But science did not accept this assumption and has since gained a lot of ground in figuring out how the brain works. In a laboratory, we can do experiments that analyze people's brains and determine when people are going to decide to do things before they are even consciously aware that they are going to do them.

Even some famous philosophers and scientists believed the human mind was immune to science (Descartes). This turned out to be a big mistake. I think the lesson to learn is never to take the "unmoved mover" argument for granted, whether for human minds or God.

But this argument already assumes the non-existence of a human spirit and its interaction with our physical bodies, and that our consciousness is only a part of our mind. Which study are you referring to?

There's a difference between questions whose answers are currently unknown, and questions whose answers are unknowable in principle.

If I said that the God of Abraham was real, but he too had a creator, and it was the flying spaghetti monster, would this argument persuade you? I could say that the reason God never mentioned FSM in the Bible was because God is still a stubborn atheist, and is unaware of his own creator. We will never discover FSM because he is outside of both physics and the supernatural reality of the Christian heavens (Which FSM created). Sure you can claim that it is Yahweh who is at the top, but it is actually FSM who is at the top. Also anyone else who claims that FSM has an even higher God is wrong, because I define FSM as the topmost creator, the prime mover.

This argument is likely unconvincing to you for the same reason Christianity is unconvincing to me.

But I do not believe in the existence of the FSM because there is no evidence for it. To put God and the FSM on the same level of consideration is to ignore the mountain of historical scholarship associated with the Bible, not to mention a lot of present-day testimonies of the miraculous happening, as well as my personal experiences.

about 10 months ago
top

Why Are Some Hell-Bent On Teaching Intelligent Design?

swan5566 Re:Young Earth Creationism Considered Harmful (1293 comments)

You're right, I thought I was talking to the original poster this whole time. Sorry about that.

No problem. Honest mistake. It happens.

In that same vein I am describing the idea of assuming these same "axioms", plus the axiom that scientific evidence and the scientific method are the only valid ways of determining objective truth. Maybe this is not true, but such is the nature of axioms.

Ahh, so this is the heart of it. That last addition is what I take issue with. Please understand, though, that I do understand the desire to consider that as an axiom, because the wonderful thing about science is how it conveys truth in a very concrete, reliable fashion. Moreover, it naturally repels attempts by people or groups of people who would try to manipulate truth for their personal political purposes. And I'll be the first to admit that the history of religion (even Christianity) is not without its ugly, ugly blemishes, and I won't defend those things at all. They were/are wrong.

Nevertheless, my observation has been that there is more to the human condition than just scientific reasoning. "Reasoning" seems more general than that. In particular, you get the understanding of how the scientific method works, and how it reliably determines things that consistently line up with observation. But you also get things like morality and intuitive awareness. These things are less quantitative, but nonetheless real. I've heard it argued that morality is just a product of human evolution's way of sustaining individuals and society as a whole, and I agree that it helps serve that function. But morality says there is more to itself than simply self-preservation. There's "meaning" to it, rather than it just being a means to an end, and its validity does not depend on the existence of myself or humanity. This phenomenon is independent of what the scientific method says of the physical world.

I should point out that I don't think religion and science are incompatible in principle. There are forms of religion that do not require breaking the barrier between the natural and the supernatural. These would be religions that are deist or pantheist in nature. But I do think science is mutually exclusive with any sort of religion which accepts miracles as reality.

So here it sounds like you could just treat the supernatural in the way that some hypothesize different dimensions or universes (i.e. string theory).

I agree with this statement, and that's what the problem is. This is also why science assumes that they do not happen. Rejecting the premise that "miracles do not happen" is tantamount to rejecting science as a method of determining objective truth.

So here is why calling this "science" is problematic. When people hear the word "science" they think of its reliability, due to the reproducible, verifiable results that time has shown square up with how physical reality works; science has become a trustworthy source of information for that reason, and that reason alone. To put things that cannot be verified in this way under the same term is to try to borrow from this trustworthiness in an inappropriate way. In other words, I put trust in science only because of its methodology, not for its title. If something does not or cannot follow the methodology, then I do not deem it worthy to be called "science", because it lacks the very thing that makes science worthwhile to begin with.

In science the entire universe is a science lab.

But we have not somehow observed every single physical interaction on every scale, including those cases in which supposed miracles have occurred. What we have done is taken a tiny amount of data and extrapolated it to give a base understanding of how the universe works. We can't do any better than that as far as science is concerned.

The way you phrase it makes it seem that physics acknowledges the existence of the supernatural and willingly concedes this domain to some other discipline (like theology). How I would describe it is that physics assumes a physical reality. Any non-physical claims can never be proven and are thus incompatible with physics. If it could somehow be shown that God was real, the view of physics would be that because God is real, he is now under the umbrella of physics and it is the job of physics to understand the physical laws that govern the behavior of God.

I don't mean to imply their intentions, I'm just stating a verifiable fact. That's what physicists do. And it's out of necessity, not because they necessarily want to. And even if we could build a supercomputer that could simulate everything that we know of in physics, they would still have to assume this because you never know what's beyond your ability to observe or measure.

I am well aware of the halting problem, and yes I agree there are problems for which we not only have incomplete knowledge, but can be reasonably sure that we will never have complete knowledge.

As an analogy, the strong agnostic position is that we can never know if a God exists. This does not change the fact that there is an answer to this question, assuming you have a coherent definition of God.

I will reiterate that I made a mistake in assuming you were the original poster, and I apologize.

I do, however, still maintain the position that if you treat the Bible as being an authority over science, this is a fundamentally different position than one where science is the authority.

Chris believes that miracles can occur, but that science works most of the time (i.e. that physical laws are not universal).

Albert believes that miracles have never occurred, and therefore that science is in the business of discovering universal physical laws of nature.

Both Chris and Albert claim to believe in science, but I am saying they believe in two different things that seem similar on the surface.

If it becomes known that God changed the gravitational constant for one day to cause a miracle to happen, Chris will accept this and have no problem keeping the equation for gravity the same.

Albert will insist on changing the equation for gravity to be something like F = Gm1m2/r^2 + x, where x is the miracle term. He will then pose the question of how "God's" decisions can be described by physical laws. Maybe we won't ever know the answer to this. But if Albert ever succeeds in predicting God's behavior with a set of equations, he will be satisfied, and many Christians will decide that any God reduced to an equation is not really God.
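As a side note, the standard Newtonian term in that hypothetical equation is easy to sketch numerically. A minimal check of the Earth–Moon attraction, using textbook constants, with the "miracle term" x defaulting to zero:

```python
# Newtonian gravity F = G*m1*m2 / r^2, plus Albert's hypothetical
# miracle term x (zero when no intervention is occurring).
G = 6.674e-11        # gravitational constant, N*m^2/kg^2
M_EARTH = 5.972e24   # mass of Earth, kg
M_MOON = 7.348e22    # mass of Moon, kg
R = 3.844e8          # mean Earth-Moon distance, m

def gravity(m1, m2, r, x=0.0):
    """Force in newtons between two masses; x is the miracle term."""
    return G * m1 * m2 / r**2 + x

force = gravity(M_EARTH, M_MOON, R)
print(f"{force:.3e} N")  # roughly 2e20 N
```

The point of Chris and Albert's disagreement is not the arithmetic, of course, but whether x is ever nonzero and whether a law could ever describe it.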

The view of science is that if it's "real" then it's in the domain of science, and that supernatural things are not real. If you do succeed in proving a supernatural thing to be real, what you have actually done is proven that a phenomenon previously thought to be supernatural is actually natural and therefore now in the realm of science.

As I said before, while I do personally espouse Albert's worldview, what I am arguing in this thread is not that Albert's view is true, but that Albert's view is the view consistent with the ideology of science and that Chris' view is not.

I realize there are many scientists who are believing Christians and who actively contribute to the field of science. But for me this would be like an atheist pastor spreading Christianity because he thinks it will make the world a better place, while still believing it is actually false. And in fact there are many atheists who are angry with other atheists for trying to convert Christians to atheism, because they believe Christianity is a force for good even if it isn't true. But I think you would agree that this atheist pastor does not possess a true Christian worldview.

The overall comment here is that God made physics but is not subject to it, including its regularity. He also made our ability to reason and observe. You could never hope to predict God's movements, because He's an intelligent being. He's the cause, not an effect of something more fundamental. I agree with what you said about proving a supernatural thing to be natural, although it's probably more appropriate to label that as something that's "unknown", since to call it supernatural is to assume something about it beforehand. And again, the issue comes down to what "science" means, why it carries weight, and whether or not the things under it really belong there.

about 10 months ago
top

Why Are Some Hell-Bent On Teaching Intelligent Design?

swan5566 Re:Young Earth Creationism Considered Harmful (1293 comments)

Check who posted what there, friend. You are quoting someone else. I'll assume those comments referring to the other poster are just misdirected, unless you say otherwise.

As far as other things you said: science and logic are not axioms — sorry, they are deductive and inductive methods (depending on the discipline) applied to your axioms. In short, there is no science experiment that can disprove anything miraculous. That's no fault in the methodology of science; it is what it is. You asserted that miracles "destroy the ability of nature to be repeatable". Keep in mind that Christians don't just assert the supernatural, but intelligence behind the supernatural. It's not just some "noise" that randomly messes with nature; there's intent and a purpose behind it. So you would not expect it to show up as some statistical anomaly in a lab.

You will note that in physics experiments, a "closed" model is assumed, where only the physical ("natural") phenomena are assumed to operate (and usually further assumptions are made about the simplicity of the interactions to make the math easier). And it makes sense to assume this for their intents and purposes, because physicists are interested in how the physical (natural) world operates. But that should not be mistaken for "complete" or "open" knowledge of reality. And scientists admit as much each time something new is discovered and they update their models and theories, and we wouldn't say that there's anything incoherent about that.

Another example is in theoretical computer science. We know from the Halting Problem that there is no way to logically deduce whether any given program will halt or not. However, it is still the case that every program will either halt or run forever, even though logical deduction cannot determine which. Perfectly coherent, yet admittedly incomplete logical knowledge (and logic, here, is the one saying that it's incomplete).
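To make that halting-problem point concrete, here is a small illustrative sketch (my own example, not from the thread) using the Collatz conjecture: any individual starting number can be checked to reach 1, yet no one has proven that every number does. Each instance is verifiable, the general question remains open — coherent but incomplete knowledge.

```python
def collatz_steps(n, max_steps=100_000):
    """Count steps for n to reach 1 under the 3n+1 map.

    Returns None if the step bound is exceeded; we can verify
    individual cases, but cannot prove termination in general.
    """
    steps = 0
    while n != 1:
        if steps >= max_steps:
            return None
        n = 3 * n + 1 if n % 2 else n // 2
        steps += 1
    return steps

print(collatz_steps(27))  # 111 — verifiable for this one instance
```

Every input we have ever tried halts, yet "does this halt for all n?" is exactly the kind of question logic tells us it may never settle.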

about 10 months ago

Submissions

top

Greenland Ice Melt Is Actually An Expected, Routine Event

swan5566 swan5566 writes  |  about 2 years ago

swan5566 (1771176) writes "After NASA issued a report that Greenland is experiencing “unprecedented” melting this year, glaciologists have now come back saying that this phenomenon was expected, and not due to man-made global warming. The last such occurrence was in 1889, and the phenomenon recurs on roughly a 150-year cycle. Yet another log on the fire for the climate change debate."
Link to Original Source

