
Justified: Visual Basic Over Python For an Intro To Programming

quietwalker Re:How about No Language (630 comments)

My first CS course in college was like this. The teacher was high and mighty about the theory. Pure academic.

"Computer Science," he said, "does not require computers." "When I wrote my first program, it was with pen and paper, and that's all you'll ever need."

So we ended up learning the same way: an introductory computer course where the first half was lambda calculus and Church encoding, and the second half was functional programming in APL, using an on-screen keyboard emulator since half the keys don't exist on a normal keyboard.
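For anyone who never sat through that course: Church encoding represents numbers as pure functions, with no numerals at all. A minimal sketch in Python (names like `to_int` are mine, just for decoding):

```python
# Church numerals: the number n is "apply f to x, n times".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

def to_int(n):
    """Decode a Church numeral by counting how many times f is applied."""
    return n(lambda k: k + 1)(0)

one = succ(zero)
two = succ(one)

# Addition: m + n means "apply f m times, then n more times".
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

print(to_int(add(two)(two)))  # 4
```

The point the professor was making, for better or worse: all of arithmetic falls out of function application alone, no computer required.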

It was a crappy way to learn, and I'm sure it did its job of weeding out half of the 300 or so students who had to take it their freshman year. In fact, the only thing I really learned is that a huge body of knowledge in no way qualifies someone to be a teacher, much less a competent one.

5 days ago

Justified: Visual Basic Over Python For an Intro To Programming

quietwalker Re:Pascal (630 comments)

GOTO is not immediately bad. It's simply a warning that the code might not be well organized, as language constructs already provide that functionality in more encapsulated forms.

You've been coding for 30 years, so you've seen the OO folks pop up and screech that without an explicit (sometimes they say 'pure') OO language, coding can't be done at all. How could you possibly organize anything with only dumb structs and functions? Java ruined a lot of people in that respect, making design patterns a required part of certification without explaining that they're just communication shorthand rather than, say, required, explicit class names. I occasionally have to fight someone with "pattern prejudice" like this, and it's always an uphill battle against ignorance.

GOTO is just like that, only with folks using the slightly higher level languages screeching at those coming from assembly, or those who didn't have to worry about reserving/jmping to high memory.

But what do they know? If a program can't rewrite its own code, what good is it?

5 days ago

Justified: Visual Basic Over Python For an Intro To Programming

quietwalker Re:Javascript (630 comments)

The two top languages you should not learn by example are Javascript and PHP. In both cases, the low barrier to entry means that there's a wealth of code, but the average quality is extremely low.

Not only that, Javascript is a poorly understood language, even by those who spend a lot of time with it. You'll likely end up learning to code by ritual instead of understanding why something works - and more importantly, why it doesn't when you do it. It's like trying to learn Ruby by starting with an RoR app coded by people who think the framework is the only architectural decision that needs to be made. ... which is most of them, apparently.

5 days ago

Justified: Visual Basic Over Python For an Intro To Programming

quietwalker Re:Javascript (630 comments)

Javascript might be the next BASIC. "If you teach your students Javascript first, it will ruin them."

That being said, with a very rigid, predefined set of exercises and test environments, it could be one of the best for learning. Within a web browser you have an incredibly simple, easy-to-manipulate UI, putting it far ahead of any other language with graphical controls, and the immediate feedback from evaluating scripts means experimentation and discovery are built into the platform.

The problem, as I see it, is the functional aspects of the language. It's too big a jump to have a would-be programmer learning functional programming before they've got a handle on writing programs that mimic the procedural way they already think.

So just avoid those aspects, and the other 98% of it that was badly done, and you've got a great choice.

5 days ago

Justified: Visual Basic Over Python For an Intro To Programming

quietwalker Re:This guy hasn't done his research. (630 comments)

In my experience, for 'serious' mathematical crunching people tend to use Fortran. Its array-based nature also makes parallelizing fairly straightforward. There are assorted popular libraries and frameworks that do some of what folks have had in Fortran for decades, but they usually fall flat when it comes to the heavy lifting, like SciPy, or have licensing or cost issues, like MATLAB.

On the other hand, I agree with you on prototyping in a scripting language. I've used that technique many times myself, and it's great for proof of concept and initial architecture design. The same goes for scripting languages for general-purpose code. Heck, I don't think I've updated my 'bundle of libraries I carry around to every job' in almost a decade. All of that is built into a scripting language already, or installable with a single command-line directive.

5 days ago

Justified: Visual Basic Over Python For an Intro To Programming

quietwalker A random aside (630 comments)

I didn't realize I was unconsciously squinting to read the #888 font on the #FFF background until I used the page tools to add a p { color: black; } to the page.

Who makes whole paragraphs look greyed out? Seriously. May as well make it all caps.

5 days ago

Why Some Teams Are Smarter Than Others

quietwalker Re:could be fems average better at groups, men one (219 comments)

I wonder.

I read a study (that I can't find online) that compared sole owners of small businesses by gender, showing the standard male advantage in terms of profits and such, and then subdivided by motivation. One of the differences - cultural in origin or not - was that men valued financial success more than women, who preferred things such as flexible work hours, family priorities, and so on.

The surprise was that within these subdivisions, women generally outperformed men. Among those most motivated by happiness, the women were happier. Among those who wanted flexibility, the women had more of it. Most strikingly, among those motivated by financial success, the women out-earned the men.

In a culture with a gender wage gap, it's hard to claim that women are culturally selected for financial success, yet they beat the men. It could be that cultural norms produce better communication, and thus better success in any endeavor relying on interpersonal interaction. But against the backdrop of higher success rates in college, higher rates of advancement in management, and mirrored trends in other non-repressive countries, it forms part of a larger pattern that seems to strongly suggest these are genetic differences, and that they have a significant impact.

Besides, we know from studying other primates that there are behavioral differences between the genders; there's no reason to expect that we wouldn't have the same proclivities.

What's more interesting to me is that we're well into the transition from societies where stereotypically male traits (aggression, etc.) are advantageous to one where traits we could call stereotypically female (such as the ability to communicate or empathize) are becoming key. Perhaps we'll see Western society flip these norms in the next 100 years or so? Let's just hope we don't end up wearing silly outfits like they did on Angel One (ST:TNG reference).

about a week ago

Why Some Teams Are Smarter Than Others

quietwalker Re:I have grown skeptical of these experiments. (219 comments)

I immediately considered the impact of this study on what we know about Agile development.

While I genuinely like the agile development concepts, I foresaw some fundamental problems with them. One of the major keystone values is a reliance on person-to-person and person-to-group communication - something that, stereotypically, those in the software industry are not very keen on, and are rarely hired for.

If we believe this study, and deep interpersonal skills(*) are required to effectively leverage a team, it sure puts agile in a precarious position.

Perhaps this is why current studies show that the most reliably high-performing methods appear to be a combination of waterfall and agile, providing a more concrete framework within which important interactions are explicitly structured rather than simply directed at a high level. When success relies on a tertiary set of skills, it makes sense to compensate for them as well as you can.

As a side note, though, specialization is compatible with agile methodology. The dev team as a whole determines work effort, estimates, and how to complete a task. In general, though, you'll likely have someone who's better at databases than the others, or QA, or code, or is the domain knowledge expert. Tasks are organically sorted within the group, usually along the lines of efficiency. Not always, just most of the time. So you can have a low-performing sprint if a task is only suitable for a single individual. The group will still finish it as quickly as possible, and not any sooner.

(*) - I like how Neal Stephenson put it regarding face-reading: "I don’t know how my face conveyed that information, or what kind of internal wiring in my grandmother’s mind enabled her to accomplish this incredible feat. To condense fact from the vapor of nuance."

about a week ago

Shanghai Company 3D Prints 6-Story Apartment Building and Villa

quietwalker Re:No one 3D printed a house (97 comments)

What caught me was the claim that they printed it all in a day. Concrete of any quality quick-drying fast enough to sustain the weight of a whole building, vertically? Really?

No, of course not.

They're only talking about the walls, and they're making them off site - the same process used for any prefab concrete structure. What upholds their claim of 3-D printing is that they can make any combination of shapes quickly and easily, without the need for a custom mold or form. Instead of setting all that up, it's just computer controlled - thus the actual gain, an 80% reduction in labor.

As others have noted though, we only have the company's word that it's safe. It doesn't seem like it has an internal rebar framework, or anything to sufficiently replace it ...

about a week ago

'Silk Road Reloaded' Launches On a Network More Secret Than Tor

quietwalker Re:Security by obscurity? (155 comments)

I was going to post this. It's not some secret kept hidden from folks; it's simply neither popular nor well known.

about two weeks ago

New AP Course, "Computer Science Principles," Aims To Make CS More Accessible

quietwalker Re:Hmmm ... (208 comments)

Education policy is not the domain of those who understand education - it's decided by politicians at nearly every level. Everything from what will be taught, to the books we use, to the structure of classes and rating systems meant to produce specific results, without any real understanding of how those results are achieved or their real impact. They also can't make radical leaps - anything that might fail would cost them their position - so they stick to minor modifications of existing systems, and no disruptive change is possible, as per Ken Robinson[1].

All that while fighting through often biased or partisan processes that result in, for example, including religion and denouncing evolution in Texas schoolbooks. In fact, you can say that government run institutions are process-driven more than anything else.

On the other hand, businesses are results-driven. A business that does not produce product will shortly cease to be a business. That mechanism lends itself well to tackling any problem, even if it often discards moral or ethical considerations as outside the problem scope. So while their primary focus is, of course, profits rather than education, when education is a requirement for profits, they're both well situated and well motivated to provide it.

They can even take risks, with the knowledge that success will reward them many times over. So new styles of education are realistically evaluated and considered.

That's the nice part about capitalism. We can rely on human greed and ingenuity to produce almost any result, so long as we can figure out how to make it a requirement for fiscal success, whereas the political system is motivated not to take chances and not to rock the boat, while at the same time claiming to be a boat-rocker.

So yeah, there's some PR gain in there for those companies, but that's just icing on the cake compared to their main benefit from supporting or redefining education.

[1] - See http://www.ted.com/talks/ken_r... ,
                          http://www.ted.com/talks/sir_k... ,
                          http://www.ted.com/talks/ken_r... ,
              for some interesting thoughts on disrupting the existing educational systems.

about a month ago

New AP Course, "Computer Science Principles," Aims To Make CS More Accessible

quietwalker Re:what is this crap (208 comments)

I don't believe that lowering average programmer salary is either the sole or primary motivator for this trend, even for businesses alone, much less other groups.

Businesses need more developers, and they haven't got them. It's as simple as that. The focus on women is simply the most efficient way to do it, since they're vastly underrepresented in the field - every dollar spent encouraging women nets more potential hires than one spent on men. It's just good ROI. That it's also social currency is icing on the cake.

Educators can see that it takes about half as much effort to acquire the skills that will land an entry-level job at about twice the pay of similar white-collar jobs; again, good ROI. Not only that, there's a wealth of freely available training material - literally thousands of hours of tutorials from simplistic to horrifically complex - and free online courses making it available across cultural and social lines. There are people living in war zones who are learning to program!

Programming education is good political currency for politicians too. Businesses and constituents appreciate more jobs and skilled workers. Minority groups appreciate the inclusive nature and extra focus. The boost to the economy & the lowering of unemployment together make for a better tax base, and so on.

Last, the workers themselves get great benefits: a low-stress white-collar job with good security, reasonable hours, decent benefits, high pay, and preferential treatment for minorities, all for very little actual training.

Really, there's almost no downside in the current social, political, or economic climate. What confuses me, rather, is why everyone isn't already learning to program. I don't know anyone who wants a career in a consistently low-paying job, much less a blue-collar one involving physical labor, yet so few appear to take advantage of the opportunities in software development.

about a month ago

Virtual Reality Experiment Wants To Put White People In Black Bodies

quietwalker Re:heh like Skyrim? (448 comments)

Not specifically. Dark Elves in the Elder Scrolls universe are just another race of 'mer' with no innate evil or goodness. Technically even the dwarves and orcs are mer types - what you'd consider 'elves': http://elderscrolls.wikia.com/...

At the point of the Skyrim game, dark elves are basically Haitian refugees: their entire country has gone to hell and is covered in yards of black ash from a volcano. People hate them because they're penniless, jobless, homeless beggars who often resort to thievery. They don't even burn as well as other races, because of some innate fire resistance. ... but they do eventually burn.

about a month ago

Small Bank In Kansas Creates the Bank Account of the Future

quietwalker Re:Misleading article - you must use ACH (156 comments)

I think the biggest obstacle is actually the bankers. They do not like adopting new functionality, especially functionality that changes their processes. They have no problem with tech that lets them do exactly what they were doing before - say, mobile apps - but new = scary. For example, the ACH file description defines two file format types: one called 'DISK' and one called 'LINE', for dial-up. They just send the disk format now, over the net.

In short, they do not trust technology to get it right, and so will only accept a process that's modeled after their actual pen-and-paper model, so they can manually validate the results and understand exactly how it works. Then, once it works, they won't change it.

You also have the additional barrier of existing legal structures in the US that force a 'float' time to all transactions, but that would change if the bankers (and I guess, the market) demanded it.

It's all moot, though. There's not a big need for minute-to-minute liquidity except among those who are very bad at managing finances, and they do not make for good customers. I honestly don't see a real market need for that sort of feature, nor people who'd pay for it, and so no real ROI in implementing it.

about a month ago

Small Bank In Kansas Creates the Bank Account of the Future

quietwalker Re:Welcome to the 21st century guys (156 comments)

We already use 'risk scoring' all the time; it's a fundamental part of our ATM software, and nowhere near as magical as it sounds. It's usually just fixed rules, like "no more than $500 a day, or $200 per transaction", etc.
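To show just how unmagical it is, here's a toy sketch of that kind of rule-based check (the limits and names are illustrative, not any real bank's):

```python
# Fixed-rule "risk scoring" of the sort ATM software uses:
# a per-transaction cap and a rolling daily cap. Values are made up.
DAILY_LIMIT = 500
PER_TXN_LIMIT = 200

def authorize(amount, spent_today):
    """Return True if a withdrawal passes the fixed rules."""
    if amount > PER_TXN_LIMIT:
        return False  # single transaction too large
    if spent_today + amount > DAILY_LIMIT:
        return False  # would blow the daily cap
    return True

print(authorize(150, 300))  # True
print(authorize(250, 0))    # False: over the per-transaction limit
print(authorize(150, 400))  # False: would exceed the daily limit
```

Real deployments layer more rules on top (velocity, geography), but they're still mostly hand-written thresholds like these, not magic.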

As for availability for business transactions, I didn't see that specified. If this is a replacement for ACH transactions, it likely works fine for people too. You know, most banks implement billpay via ACH transactions from a person (when they don't have to print and mail a check). It's just that things like payroll, issuing collections (like invoices), or transferring money to an account at a different bank are not standard end-consumer needs.

about a month ago

Small Bank In Kansas Creates the Bank Account of the Future

quietwalker Re:Misleading article - you must use ACH (156 comments)

Side note: there are many other notes here about realtime money transfers in other countries. In most cases those are, again, time-delayed at some point and subject to reversal; it's just hidden from the customer. In fact, even wire transfers and money orders are reversible! The countries involved simply have laws pushing the risk somewhere other than the customer - usually onto the FI, I'd bet. This is especially true of large international exchanges, like SWIFT. You might even be able to pay such an exchange more to expedite your transfers, or even cover the risk - for a good customer.

Though, that said, there's no reason a bank in country A might not have an agreement with a bank in country B to automatically honor requests, assuming both countries have lax or nonexistent financial regulation. In reality, though, all countries have those laws, and that's why we have international exchanges.

This is not just a semantic difference either; there appears to be no difference to the customer in most, but not all circumstances.

about a month ago

Small Bank In Kansas Creates the Bank Account of the Future

quietwalker Misleading article - you must use ACH (156 comments)

Disclaimer: I used to write banking software for a living, including implementing ACH management on both the customer-facing and backend processing systems.

The article is blatantly misleading regarding realtime transfer of funds, but it takes some knowledge to understand why. Let's talk about ACH transactions.

ACH, or Automated Clearing House, is the network that the majority of electronic funds transfers in the US use. As the article points out, it's ancient and horrible - basically a 1:1 translation of paper funds reconciliation into electronic format. In essence, a customer creates an ACH transaction, which is sent to two endpoints: the Federal Reserve (a.k.a. the Fed) and an ACH operator. Just like a credit processor, the ACH operator is responsible for delivering the funds to the destination financial institution (FI), and they make their money by charging the originating FI. The transfer only goes through once both the Fed and the operator finalize the transaction, which can take a day or more, and most transactions are held for additional days to provide for reversals (effectively, cancellations).

Here are some important takeaways:
    - To perform bank-to-bank transfers, you must either engage a third-party processor, or you must have an agreement (and process) with each individual bank you wish to transfer to.
    - These transfers are subject to some very specific banking regulations, some of it relating to reporting to the Fed, who can block the transactions.
    - Laws provide for effective reversal (issuing a reciprocal transaction, not necessarily a true reversal) for 2 days for corp-to-corp transactions (CCD) and up to 60 days for transactions involving people (PPD).
    - Just like most retailers, these are batch processed, not in real time, though the banks will reserve funds and adjust your balance accordingly. No one minds because legal protections result in at least a 2-day processing window anyway.
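The reversal-window point is worth making concrete; a toy sketch using the SEC codes and day counts from the takeaways above (everything else - function names, dates - is illustrative):

```python
from datetime import date, timedelta

# Reversal windows per Standard Entry Class code, per the takeaways above:
# 2 days for corp-to-corp (CCD), up to 60 days for consumer (PPD).
REVERSAL_WINDOW_DAYS = {"CCD": 2, "PPD": 60}

def reversible(sec_code, settled_on, today):
    """Can a settled entry still be effectively reversed?"""
    window = REVERSAL_WINDOW_DAYS[sec_code]
    return today <= settled_on + timedelta(days=window)

settled = date(2015, 1, 5)
print(reversible("CCD", settled, date(2015, 1, 6)))   # True
print(reversible("CCD", settled, date(2015, 1, 10)))  # False: window closed
print(reversible("PPD", settled, date(2015, 2, 20)))  # True: 60-day window
```

This is why no FI can honestly promise "final" realtime settlement over ACH: for a consumer transaction, the money isn't truly yours for up to two months.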

Okay, so what would we need to perform this transfer in realtime? First, you'd have to get every bank in the US and the Federal Reserve to switch to a new system that actually supports realtime transfers, instead of ACH. Then you'd have to completely overhaul 40+ years of laws that were written with a batch-based system in mind, including removing many of the funds-reservation activities (and the legal protections that require them) in favor of a realtime system.

So how does this bank do it?

Based on the info in the article, it sounds like the bank is managing two accounts per individual account: the customer-facing one, which provides the 'realtime' aspect, and the actual one used for the ACH transaction. The risk comes in when the bank accepts a credit or debit before it's authorized and completed - thus the need for 'risk management' software, identical to the sort ATMs use, especially when configured as a local authorizer (for branches too far from the main branch, and others).

They just don't show the end user the reservation of funds like most FIs do, and they assume the risk directly, so there are no odd 'processing' credits or debits on the statement.
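The two-account trick could be sketched like this (a toy model based on my reading of the article; the class and field names are mine, not the bank's):

```python
class ShadowAccount:
    """Customer sees 'available' move instantly; the real ledger
    only moves when the underlying ACH batch settles."""

    def __init__(self, balance):
        self.ledger = balance      # the real, settled balance (ACH side)
        self.available = balance   # what the customer is shown
        self.pending = []          # debits awaiting batch settlement

    def debit(self, amount):
        # The bank assumes the risk: show the money gone right away,
        # even though the ACH entry hasn't cleared.
        self.available -= amount
        self.pending.append(amount)

    def settle(self):
        # Nightly batch: the ACH completes and the ledger catches up.
        for amount in self.pending:
            self.ledger -= amount
        self.pending.clear()

acct = ShadowAccount(1000)
acct.debit(250)
print(acct.available, acct.ledger)  # 750 1000 - "realtime" to the customer
acct.settle()
print(acct.available, acct.ledger)  # 750 750 - reality catches up overnight
```

The gap between `available` and `ledger` is exactly the risk the bank is eating, and exactly what their 'risk management' software has to bound.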

So, it's just smoke and mirrors. They have to use ACH if they want to talk to other banks, and they're not doing manual wire transfers. They just aren't telling their customers. Though if they hit the anti-terrorist watchlist check (I wrote the software that matches against the government list too, at one time), their customer is going to find out really quickly that it's really just ACH after all, and they ~don't~ have those funds - it's illegal for the bank to provide them!

about a month ago

What Canada Can Teach the US About Net Neutrality

quietwalker The US doesn't need to be taught (80 comments)

This isn't a matter of lack of knowledge or understanding. The US doesn't need to be taught, or led.

The US is currently on the divide between protecting consumers from potentially abusive practices and letting businesses run roughshod over them. It's a debate about priorities among business, consumers, the economy, and social welfare, and despite my strong feelings on the subject, at a national level there's no silver-bullet answer that 'fixes it' - especially since Canada hasn't actually done anything either but commission a study.

In fact, studies of the sort being done in Canada have already been done in the US, at several different points in time, and the recommendation each time was non-interference. With Congress unable to act in any way other than to block action, that's likely how it's going to go.

What we could use is a surefire way to light all the Democrats and Republicans on fire and replace them with politicians who actually care more about the people they represent than about their next election, their party, or party politics. If you've got one of those, let us know, because THAT's what we're in dire need of.

about 2 months ago

Which Programming Language Pays the Best? Probably Python

quietwalker Not in the Austin job market (277 comments)

I'm pretty savvy with all the listed languages except Objective-C (only maintenance on existing apps), and have used them all at one time or another on the job. My linked profile garners around 3-4 recruiter contacts a week, and in my own little silo I can say that while there may be six-figure salaries out there for Python and RoR, they are few and far between. The salaries I'm seeing at the top end for those development jobs rarely crest $70k.

On the other hand, there are bigger salaries for Java or C#. It's not too hard to find a $100k-$110k senior Java or C# developer position.

Anecdotal evidence is not scientific data, but their results just don't match my personal experience in 2 decades of doing this.

However, I think I can see how they got the numbers.

According to the article, the data was retrieved by searching job ads - as opposed to surveying people actually working those jobs - and then permuting and filtering it. Given that:
    - Development job availability, especially with new technologies, is heavily skewed towards the west coast, where the cost of living is higher. From Austin to San Jose, the cost-of-living increase is between 50 and 75 percent - a 100k job here is at least 150k there.

We can reasonably assume there will be more positions open, and that more of them will be higher paying relative to the entire US job market - likely breaking the 100k cap, since 100k is low relative to the cost of living there.

    - Established development languages already have a majority of their positions filled, as opposed to emergent technologies which have more open positions

This will naturally result in a higher number compared to a language with less open positions, if the bar (100k) is low relative to the cost of living.

    - Emerging technologies lack experts simply because they haven't been around long enough to develop as many

So positions will be open longer and be more aggressively marketed by recruiters, meaning they're more likely to double- or triple-count postings that are unknowingly for the same job.

    - Employers using recruiters often prefer to use a limited number of them, who themselves maintain a pool of direct-contact individuals with experience in a given field, meaning those jobs are less likely to be publicly posted, whereas the new technologies require public announcements and investigation.

So in summary: I don't doubt the statistics they used, but I think their methodology is affected by a heavy bias, and the conclusion is therefore invalid.

about 2 months ago

Hawking Warns Strong AI Could Threaten Humanity

quietwalker I'm betting on 60%+ of what we ask it to do (574 comments)

Let's say it exceeds our own intelligence, that's fine - but you have to ask what purpose it has.

Take a human. What they do is based on what they've defined as their purpose - their goals, both second-to-second and over their whole life. A whole series of organic processes results in the determination of purpose, and it's pretty random, in part because we don't have explicit control over our environment or our thoughts.

However, (important) AIs won't be like that. We'll have control over their entire environment, and they'll be purpose-built. You'll say "We need an AI to manage traffic," and then build that purpose into it. You won't take a randomly wired mechanism and plug it into a major public utility control panel. You won't worry that it was exposed to, and then became enamored with, violence on TV and decided to be an action movie star, and so is going to spend its day watching Rambo reruns rather than optimizing traffic lights. The core of its essence will be a 'desire' - a purpose - to manage traffic.

The end result is that AIs won't act destructively, threaten humanity, etc. - unless we tell them to. In this light, the thing to watch out for is military usage. Maybe don't put an AI in charge of the nukes. You'd also need to - among other things - allow AIs the freedom NOT to fire on an enemy, because of the very mutable definition of the term 'enemy'.

about 2 months ago


