
Tools for Automated Grading?

Cliff posted more than 8 years ago | from the tech-lightens-the-workload dept.

Education | 100 comments

Dont tempt me asks: "As all teachers and students are well aware, it is back to school time. As a math/computer teacher, I am constantly looking for ways to automate repetitive tasks. The one that seems to take up most of my time is grading. As is typical for us nerds, I find myself looking at handwritten tests and thinking 'there's gotta be a better way...' Since I can't find any related open-source projects, I have been thinking about creating one. I have been toying with the idea of using OMR (Optical Mark Recognition) to make my own scannable multiple choice tests. Is anyone doing this? If not, where would be a good place to start? In addition to teachers, this could be a useful technology for questionnaires, or other processes that require manual data entry."


Scantron (3, Interesting)

justanyone (308934) | more than 8 years ago | (#13512888)

I taught astronomy at KU as a discussion section leader in 1991. We used Scantron machines. The answer sheets were #2-pencil forms about the size of an IBM punch card (~3 inches wide by ~8 inches tall).

The machines could NOT have been expensive. Using them was dead simple. We (the section leaders) wrote several tests and rearranged each test to have different orderings for the choices. Thus, on test version A-1 I had answers (a) Sun, (b) Moon, (c) Earth, then on A-2 I had (a) Moon, (b) Sun, (c) Earth, etc. Then we looked at each student's version of the test and graded it with the matching key.

This kept cheating to a minimum; at the least they had to memorize the answers instead of the answer key. And, memorizing the answers was kind of okay in a sense since they at least paid attention to the subject material.

Re:Scantron (1)

CDMA_Demo (841347) | more than 8 years ago | (#13513010)

IMHO, in both computer science and mathematics the process of finding a solution is more important than the answer itself. If a person has the correct answer but got it only by mistake, then what good is it? I am not sure that grading students so objectively will help them later on when they face advanced problems in math/CS. Teachers like you should keep this aspect of education in mind.

Re:Scantron (1)

Nos. (179609) | more than 8 years ago | (#13513056)

Most of my math classes were 50% for the answer and 50% for the work. I remember one test where I couldn't figure out one problem, so I wrote down x=4 as the answer without showing any work (I hadn't read HHGTTG yet). The answer was 4; I got 1/2 marks for the question. I've also done questions where I did all the work correctly but for some reason wrote the incorrect answer at the end, something like:
....
x=4/2
x=4
Yes, I got the answer wrong, but up till that point everything was correct, I got 1/2 marks again. Seemed fair to me in both cases.

Or better, a VK (1)

Glonoinha (587375) | more than 8 years ago | (#13522480)

Just ask the kids this :

1.
You're in a desert, walking along in the sand when all of a sudden you look down and see a tortoise.
It's crawling toward you.
You reach down and flip the tortoise over on its back.
The tortoise lays on its back, its belly baking in the hot sun, beating its legs trying to turn itself over.
But it can't.
Not without your help.
But you're not helping.

Why is that?

2.
Describe in single words, only the good things that come into your mind - about your mother.

Re:Scantron (1)

99BottlesOfBeerInMyF (813746) | more than 8 years ago | (#13513440)

If a person has the correct answer but got it only by mistake, then what good is it?

Well, it is still the right answer. Sometimes it is better to be lucky than good. I guess the answer to your question gets into some muddy philosophical area. Do you want to be on the plane with the expert pilot or the luckiest man on earth?

Enough with that mumbo jumbo though... I agree that multiple choice is not a very good method for determining how well a given student is doing at math or in CS.

Re:Scantron (1)

CDMA_Demo (841347) | more than 8 years ago | (#13514040)

Your rhetoric only works if you consider the student to be the pilot and the teacher to be the passenger. Sadly, this again shows how poorly understood the relationship between students and teachers is.

Re:Scantron (3, Funny)

Usquebaugh (230216) | more than 8 years ago | (#13514078)

In my neighbourhood teachers and students are not allowed to have 'relations'.

Re:Scantron (1)

CDMA_Demo (841347) | more than 8 years ago | (#13515223)

If you live in America, 'relationship' has only one meaning. In other countries databases are more advanced...

Re:Scantron (1)

Uber Banker (655221) | more than 8 years ago | (#13513810)

I totally agree. Yeah, I may sound a fool, but to me the definition of a teacher is someone who teaches: not someone who orates from a textbook, not someone who performs intellectual masturbation on the naive, but someone who uses their knowledge and skill to help others achieve their goal.

The final exam that a course of teaching culminates in is usually the benchmark of the student's achievement, but in what way is this useful to a teacher, given that it does not help someone teach? No doubt tests are important for benchmarking students' achievements over the run of a course, but they are a means to an end, and that end is to teach; and to teach is not only to be knowledgeable but to be skillful and sensitive enough to translate that knowledge from the teacher's mind to the student's mind. Using an automated scoring system does nothing but tell black from white, and it ignores the huge range of abilities and difficulties students face.

Hell, I know many who give more respect to their personally devised neural networks than this guy seems to give his students.

Apologies for the inevitably poor spelling; I would not usually post in haste in my present state of inebriation, but this story struck a nerve.

Re:Scantron (2, Interesting)

clifyt (11768) | more than 8 years ago | (#13515183)

"If a person has correct answer but got it only by mistake then what good is it?"

Because knowing how a test works tells you a LOT about the student. There is a lot of science behind item response theory...for instance, a right answer on a specific item may tell you that the student has guessed. That's right...the reliability of an item may be such that over 70% of the people that passed the test got it wrong, while 70% of the people that failed got it right. Quite a bit can be figured out about the student from what they got right or wrong.

In fact, in the classic three-parameter model, guessing is a BIG part of the calibration:

http://www.rasch.org/rmt/rmt181b.htm [rasch.org]

(difficulty and discrimination being the other two parameters)

Of course, this comes more into play when you get into computer adaptive testing, but it's still important on the standard multiple choice test (at least if it's standardized).
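
For the curious, the three-parameter logistic formula itself is tiny. Here's a minimal sketch in Python -- just the textbook formula, not code from any particular package, and the example numbers are made up:

    import math

    def p_correct(theta, a, b, c):
        """Probability an examinee of ability theta answers the item correctly
        under the 3PL model: a = discrimination, b = difficulty, c = guessing."""
        return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

    # A hypothetical item with some guessing (c=0.2), average difficulty (b=0.0),
    # decent discrimination (a=1.5), given to a below-average examinee.
    print(p_correct(theta=-1.0, a=1.5, b=0.0, c=0.2))  # roughly 0.35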

As for math and computer science...I've never designed a CS test using multiple choice, but for math it's very useful. Again, adaptive testing with multiple choice is better, but it's not like math is rocket science...as an instructor mentioned recently, math is not about memorizing formulas; it's about learning how to think creatively while breaking down a problem and coming up with a solution. If you can rule out the answers that don't belong and figure out the rest without having to work the whole problem, you've demonstrated knowledge of the ideas behind the problem in front of you. It's just like the complaints we got about an essay rater we were working on several years back (it graded structure, not content)...people were submitting brilliantly written nonsense and then complaining that we were scoring it high -- but the fact that someone was able to figure out ways around the writing rubric to game the system meant they pretty much deserved the high scores they received, because they were good writers, regardless of the nonsense (unlike my own writing...this is not a good example of such writing).

All in all, this depends on what you are after -- demonstration of mastery or a diagnostic (or maybe proof that your students know as much as the students down the hall or the school across the state). Luckily, most educators are taught testing methods before they leave school and probably have a bit more of a clue about what they are looking for than students who think a test may just be multiple choice bullshit (though that could be considered negative face value :-)

clif (speaking for myself and not my office)

Scantron Exploit (2, Interesting)

Noksagt (69097) | more than 8 years ago | (#13513185)

So many of the Scantron-style tests had heavy black marks on one or both sides of the sheet. The machine uses these marks to locate each question (a big square corresponds to the start/stop of the test and a line corresponds to an individual question, or similar). This beats having to guesstimate where the questions are from the feed-through rate, and it also allows high-speed machine grading.

Well, a clever student decided to draw his own marks on the side of the exam. This tricked the machine into reading every question one off, so his test, and all the tests fed through after his, got a low grade; the machine checked whether question 1 had the answer expected for question 2, and so on.

Re:Scantron Exploit (3, Interesting)

bluGill (862) | more than 8 years ago | (#13513500)

Back when I was in school, 15 years ago, my teachers were onto such tricks. It isn't hard to look at scores as you write them into your book. The teacher already knows 'Suzie' is smart and often gets a perfect score, so if he[1] misses most of the questions it is time to re-examine things by hand.

The most popular way to cheat was to mark the little box at the top that flagged the sheet as the master, which would re-program the machine to take your test as the correct answers.

None of these tricks were hard for a teacher to catch (if you knew about them it was easier, and the principal made sure they knew). Once you catch a student doing this you just write zero in for his score and re-run the tests.

[1]we miss you Johnny Cash

Re:Scantron (1)

moosesocks (264553) | more than 8 years ago | (#13515032)

Erm.....

What sort of teacher/student in the past 20 years hasn't used a Scantron at some point or another?

The machines are dirt-cheap, and accurate enough for all intents and purposes.

Of course, it does make the course a good deal less personal.

How about this? (2, Informative)

iseth (258694) | more than 8 years ago | (#13512898)

Autodata ExpertScan [autodata.com] does this already. I've used it for surveys and forms; it should work for a test too, though admittedly I've not tried it for that.

It was a dark and stormy night... (1)

ion_ (176174) | more than 8 years ago | (#13512920)

There's a legend from the time when OMR was a new technology: a student marked every single checkbox. The software gave him 100%, because it only checked whether the correct answers were marked. :-)

Re:It was a dark and stormy night... (1)

jpsowin (325530) | more than 8 years ago | (#13512954)

Well, it didn't work for me...

Not that I tried it, of course.

Re:It was a dark and stormy night... (3, Interesting)

ragnarok (6947) | more than 8 years ago | (#13513024)

I had a teacher in high school who graded OMR-like sheets manually with an overlay that had holes cut only for the correct answers.

Several times, if I didn't know the answer, I would mark more than one (you don't want to mark all of them; that stands out too much), and I always got credit.

The teacher seemed like a smart guy too, I wonder if he was doing it intentionally to see if people would figure it out.

Re:It was a dark and stormy night... (1)

Overzeetop (214511) | more than 8 years ago | (#13514965)

When my wife was in grade school (before she was my wife, obviously), she would grade the papers for her grandfather's econ classes at Virginia Tech. Same system: a key with holes to check the properly marked bubbles. I would think that sparing use of the mark-two-answers trick would probably have fooled her.

Re:It was a dark and stormy night... (1)

Lehk228 (705449) | more than 8 years ago | (#13521617)

I never had such luck. The only teacher I had in HS who used a punched sheet also used an inverse punched sheet to check for double marking.

Hole Punch (1)

crow (16139) | more than 8 years ago | (#13512942)

Use the low-tech solution. The answer key is created by using a hole punch on each correct blank. Then hold it up and mark in red any circle that isn't marked on the student's answer sheet.

Of course, you have to also look for students entering multiple answers (especially if they know how you're grading).

Re:Hole Punch (1)

Noksagt (69097) | more than 8 years ago | (#13513042)

Of course, you have to also look for students entering multiple answers (especially if they know how you're grading).
And, of course, a good way to do this would be to have a second key which has all of the wrong holes punched.

Re:Hole Punch (1)

moorley (69393) | more than 8 years ago | (#13513099)

Or reverse it. Create your own bubble sheet, but use a template, whether holes cut in cardboard or a transparent printout, as the grading key.

Re:Hole Punch (1)

FLEB (312391) | more than 8 years ago | (#13514728)

Just punch out all the holes, and rim the correct answer in colored marker, or suchlike.

Don't bother with the OCR. (0)

(H)elix1 (231155) | more than 8 years ago | (#13512963)

Once you scan the tests, you will have an image. You should be able to look at areas within that image and 'see' whether they are dark. Take a look at imagemagick.org - an open source set of graphics manipulation tools that make this bone-head easy in most of the popular languages. Make your life easier and place registration marks on the form so you can tell whether the test was fed in properly.
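
Something along these lines, here using Python and the Python Imaging Library (PIL/Pillow) rather than ImageMagick (an untested sketch; the file name, coordinates, and threshold are made up and would need calibrating against your own form):

    from PIL import Image, ImageStat

    FILL_THRESHOLD = 128  # mean gray level below this counts as "filled in"

    def bubble_is_filled(page, box):
        """box = (left, upper, right, lower) pixel coordinates of one answer bubble."""
        region = page.crop(box).convert("L")          # grayscale crop of the bubble
        return ImageStat.Stat(region).mean[0] < FILL_THRESHOLD

    page = Image.open("scanned_test.png")
    # Hypothetical layout: question 1, choices a-d, 40x40 pixel bubbles in a row.
    choices = {"a": (100, 200, 140, 240), "b": (160, 200, 200, 240),
               "c": (220, 200, 260, 240), "d": (280, 200, 320, 240)}
    marked = [letter for letter, box in choices.items() if bubble_is_filled(page, box)]
    print("Question 1 marked:", marked)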

Ask Slashdot: Automated Homework (4, Funny)

mr_rattles (303158) | more than 8 years ago | (#13512986)

Hi, I'm a CS student and always looking for ways to automate repetitive tasks. Is there any project out there that can do my work for me so that I don't have to?

So... (1)

Datoyminaytah (550912) | more than 8 years ago | (#13513005)

> I have been toying with the idea of using
> OMR (Optical Mark Recognition) to make my
> own scannable multiple choice tests.

Ooh! Ooh! Does this mean I can use a calculator when I take your test?

No? Hey, no fair! ;)

Not just grading.. (2, Interesting)

thefirelane (586885) | more than 8 years ago | (#13513016)

Why not do automatic test creation as well? If you're going to do multiple choice, you can have a 'pool' of questions. Each question has a score, usually based on the percentage of students who get it wrong.

You could then automatically create tests with a certain percentage of 'A level' questions and so on. This would also let you more-or-less predict the curve... 10% will get an A, and so on.

Since the grading is automated as well, it would feed back into each question's score automatically.
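
A rough sketch of the pool-and-assemble idea in Python (the question format, difficulty buckets, and example data are invented purely for illustration):

    import random

    # Each pool entry: (question_text, correct_answer, fraction_of_students_who_miss_it)
    pool = [
        ("2 + 2 = ?", "4", 0.05),
        ("Solve x^2 - 5x + 6 = 0", "x = 2 or x = 3", 0.40),
        ("Differentiate x * e^x", "(x + 1) * e^x", 0.75),
        # ...the rest of the question bank
    ]

    def bucket(miss_rate):
        """Label a question by how often students miss it."""
        return "hard" if miss_rate > 0.6 else "medium" if miss_rate > 0.3 else "easy"

    def build_test(pool, want={"easy": 1, "medium": 1, "hard": 1}):
        by_bucket = {}
        for q in pool:
            by_bucket.setdefault(bucket(q[2]), []).append(q)
        picked = []
        for level, count in want.items():
            picked += random.sample(by_bucket.get(level, []), count)
        random.shuffle(picked)
        return picked

    for question, answer, miss_rate in build_test(pool):
        print(question)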

This may sound disturbing, but it's what the SATs do... those small sections at the end are just next year's questions being tested.

I also had a professor in college who did this, but through mental calibration over the years. Yes, this does mean you cannot give out the tests afterward for students to review... but the test was surprisingly fair.

Re:Not just grading.. (1)

Peter La Casse (3992) | more than 8 years ago | (#13515191)

I also had a professor in college who did this, but through mental calibration over the years. Yes, this does mean you cannot give out the tests afterward for students to review... but the test was surprisingly fair.

I had a professor in college who did this, and he regularly gave out past exams for students to study from. He said before each test that even though he often reuses test questions, the result had always been a standard grade curve.

Re:Not just grading.. (0)

Anonymous Coward | more than 8 years ago | (#13515279)

You just described item response theory.

Nothing disturbing about this...it's one of the fastest-growing fields in psychology right now, and it makes perfect sense (or at least better sense than an educator just making up questions with no eye toward standards).

Re:Not just grading.. (2, Interesting)

big ben bullet (771673) | more than 8 years ago | (#13517545)

This is actually an excellent idea.

But what really matters is the content. My wife teaches primary education (6- to 12-year-olds). She has a hard time making up questions for each course in each grade. If there were an openly licensed 'question base' (maybe even a wiki), it would help her a lot. She would be happy to contribute her existing material ;-)

You might want to start such an initiative along with the actual open source application.

Oh... and *please* mind the localisation so there's potential for every language community in the world, not only the English-speaking one.

The problem with multiple-guess tests (1)

Marxist Hacker 42 (638312) | more than 8 years ago | (#13513059)

Is that most human beings are WAY too predictable in creating them: 75% of the answers will be B, unless it's a true/false question, in which case 60% will be A.

My suggestion: get together with a programmer, and save class time by giving your tests on the web. You can code the tests in XML; ASP or Java or Python or PHP or whatever can randomize them for you so that no two students get the test answers in the same order. Students can log in from the computer lab to take the tests after school, or from home. Grading? All stored under their user name in your database, likely both as an already-totalled value for the gradebook and as a detailed view so that you can zoom in on what individual students are having trouble understanding.
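
The shuffling half of that is only a few lines of Python, for example (a sketch only; the XML layout and file name here are invented, not any standard format):

    import random
    import xml.etree.ElementTree as ET

    def shuffled_test(xml_path, student_id):
        """Load questions from XML and shuffle them (and their choices) per student.
        Seeding with the student id makes the ordering reproducible for re-grading."""
        rng = random.Random(student_id)
        # Assumed layout: <test><question text="..."><choice correct="yes">...</choice>...</question></test>
        root = ET.parse(xml_path).getroot()
        questions = list(root.findall("question"))
        rng.shuffle(questions)
        test = []
        for q in questions:
            choices = [(c.text, c.get("correct") == "yes") for c in q.findall("choice")]
            rng.shuffle(choices)
            test.append((q.get("text"), choices))
        return test

    for text, choices in shuffled_test("quiz.xml", student_id="jsmith"):
        print(text, [c for c, _ in choices])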

I'm willing to bet there are not a few unemployed programmers out there willing to take on such a simple project as open source just for the resume fodder. I personally don't have the time, plus I'm not real good at web programming.

Re:The problem with multiple-guess tests (1)

blahtree (55190) | more than 8 years ago | (#13513447)

The best practice for multiple choice tests is to put the answers in alphabetical order. Bingo, no predictability.

Re:The problem with multiple-guess tests (1)

Marxist Hacker 42 (638312) | more than 8 years ago | (#13513565)

Not really. I had a teacher who did this, and I could consistently score 80% on his tests. His subconscious got the better of him: his answers were usually (not always, but usually) 2nd or 3rd in alphabetical order, and because he was concentrating on the alphabet, the wrong answers were always ridiculous enough that skimming the material at 1500 WPM was enough to make a good guess.

Re:The problem with multiple-guess tests (1)

blahtree (55190) | more than 8 years ago | (#13513636)

You're supposed to make the answers first, THEN alphabetize them. It's not my fault your teacher was a tool.

Re:The problem with multiple-guess tests (1)

Marxist Hacker 42 (638312) | more than 8 years ago | (#13513690)

You're supposed to make the answers first, THEN alphabetize them. It's not my fault your teacher was a tool.

Still better to randomize them, though that would do a pretty good job. Of course, the vast majority of answers would still be EATONRIDSH-ordered if they were in English, making the most likely answer probably the first one instead of the 2nd.

Re:The problem with multiple-guess tests (1)

Tango42 (662363) | more than 8 years ago | (#13513785)

The wrong answers would be in English too... how does the distribution of letters make any difference?

Re:The problem with multiple-guess tests (1)

Marxist Hacker 42 (638312) | more than 8 years ago | (#13513809)

The distribution of the letters would dictate the order of the alpha sort; it's not truly random. I'm not sure what the effects would be, but it's an interesting problem that deserves more research. As a random number generator, though, it's only a very weak pseudorandom one. Even seeding with the system clock on the server would be better than that.

Re:The problem with multiple-guess tests (1)

Tango42 (662363) | more than 8 years ago | (#13517463)

The sorting method is completely separate from the correct answer. The distribution of letters will skew the sort, but the correct answer won't land in any particular place in that sort. The wrong answers follow the same letter distribution as the correct ones.

Re:The problem with multiple-guess tests (1)

alienw (585907) | more than 8 years ago | (#13515171)

Um... have you ever taken a multiple choice test? Doesn't sound like it, that's for sure.

Re:The problem with multiple-guess tests (1)

YrWrstNtmr (564987) | more than 8 years ago | (#13519420)

save class time by giving your tests on the web.

And this will be an open book test? Or 3 people taking the test at once?

Re:The problem with multiple-guess tests (1)

Marxist Hacker 42 (638312) | more than 8 years ago | (#13519561)

Or 3 people taking the test at once?

If so, they'll have to take it 3 times, and with the randomization factor thrown in, they aren't likely to get the same questions twice. Three times as much work.

And this will be an open book test?

All modern testing should be open book and timed. After all, those are the skills people actually need to survive now: the ability to research solutions and discern among several potential ones while applying event-specific variables correctly.

Re:The problem with multiple-guess tests (1)

YrWrstNtmr (564987) | more than 8 years ago | (#13521275)

If so, they'll have to take it 3 times, and with the randomization factor thrown in, they aren't likely to get the same questions twice. Three times as much work.

With the one smarter kid supplying the answer to the other two all 3 times.

All modern testing should be open book and timed. After all, those are the skills people actually need to survive now: the ability to research solutions and discern among several potential ones while applying event-specific variables correctly.

No. Some stuff you just need to 'know'. Especially at the high school level. If you merely look up the answer each time, you never internalize it. It never becomes a part of you.
"Quick...where's France?"
"I dunno...let me look it up."

You never really 'know' anything, beyond how to ask the omnipotent Google.

Re:The problem with multiple-guess tests (1)

Marxist Hacker 42 (638312) | more than 8 years ago | (#13521419)

With the one smarter kid supplying the answer to the other two all 3 times.

Thus teaching the other two the most valuable lesson anybody can possibly learn: to listen to experts.

No. Some stuff you just need to 'know'.

And how much of that have you used in your career?

Especially at the high school level. If you merely look up the answer each time, you never internalize it. It never becomes a part of you.

But what you do internalize is where the fact lives, which is far more valuable than the fact itself. How many times have you had to know where France is in your lifetime? But if you know where Google Earth is, you can very quickly get to the location not only of France, but also of Germany...or anyplace else.

You never really 'know' anything, beyond how to ask the omnipotent Google.

Which is all you ever really need anymore.

Re:The problem with multiple-guess tests (1)

YrWrstNtmr (564987) | more than 8 years ago | (#13521562)

Thus teaching the other two the most valuable lesson anybody can possibly learn: to listen to experts.

Or.."You're too dumb to ever actually know anything. Just ask the other guy."

And how much of that have you used in your career?

'Career' isn't everything.
Imagine sitting around with some friends, discussing events in the Middle East, or WWII. Without having a picture in your mind of where countries sit in relation to others...you miss a lot of the implications of why, who, and where.

- You never really 'know' anything, beyond how to ask the omnipotent Google.

Which is all you ever really need anymore.

And you never really learn to think for yourself.

You and I will continue to disagree.

Re:The problem with multiple-guess tests (1)

Marxist Hacker 42 (638312) | more than 8 years ago | (#13521732)

Or.."You're too dumb to ever actually know anything. Just ask the other guy."

That remark resembles more than 55% of the general population- and it's the reason computers and databases were invented to begin with.

'Career' isn't everything.

True, but it's one of the two main reasons society is sending you to school in the first place. Sitting around with your friends answering trivia questions isn't among those reasons.

Imagine sitting around with some friends, discussing events in the Middle East, or WWII. Without having a picture in your mind of where countries sit in relation to others...you miss a lot of the implications of why, who, and where.

Yes, but that's not the reason our society pays between $2000 and $10,000 per student per year for school. And thus, it's beside the point. The real point is to learn what you need to be a productive citizen. That means career and enough of an understanding of history to vote correctly. No more, no less.

And you never really learn to think for yourself.

The only "thinking for yourself" that society as a whole is interested in is the kind that is profitable or pragmatically useful. School is not the place to learn to think; it never has been.

Multitask (1)

Noksagt (69097) | more than 8 years ago | (#13513098)

Depending on your testing needs/abilities, you should easily be able to do optical scoring of multiple choice or true/false tests (or have students submit tests via computer). But a lot of the more valuable tests require reading and thinking to grade. That still isn't fun.

But to make it less of a time sink, just multitask; grade during your commute or tv or laundry or workout or whatever.

Stairs method (2, Funny)

BortQ (468164) | more than 8 years ago | (#13513102)

It works better for 'fluffy' subjects involving opinionated essay writing, but here's the method:

- Take all the submitted assignments and collect them in a big pile.
- Throw the pile down a flight of stairs.
- Everything that makes it to the bottom gets an A. For each step above the bottom take off 5% of the grade.
- That's it.

I've thought (heavily) about being a teacher.. (1)

Morgalyn (605015) | more than 8 years ago | (#13513113)

This is something I've spent some time mulling over, for no apparent reason, since I'm not even in the education industry at the moment.

If you are relying on OMR, you are relying on a scanner. If you have access to a scanner with a bulk feeder, this might be OK; otherwise you're going to spend as much time scanning the pages as you would just grading them. If you have the resources in your classroom, setting up a randomized computer test for your students would eliminate paper usage (and save on your copying fees!) and provide automated correction, etc., but it does not allow the students to 'show work' (but neither do multiple choice tests).

One method my teachers used in the past (the long, distant past, come to think of it) is using the students as graders. One of my teachers would have her next period's class grade the previous period's examinations. She had a ticket at the top of the tests, which we ripped off when we turned them in, that matched the test document to our name, and which she kept. That cut down on people letting their friends slip by with wrong answers (or marking things wrong on foes' tests that weren't). She audited a few tests each round.

Re:I've thought (heavily) about being a teacher.. (1)

gcatullus (810326) | more than 8 years ago | (#13513304)

In junior high school my best friend and I sat next to each other. We were given weekly spelling tests, and for grading we were supposed to hand them to the person next to us. We worked out a system where, if both of us got something wrong, we marked it correct, but if one of us made more errors, he got graded down. We thought we were so slick - until we realized that we never got a spelling grade. They counted for nothing.

If you can automate, should you be grading? (5, Interesting)

gozar (39392) | more than 8 years ago | (#13513133)

My best math teacher assigned homework every night. She would flip a coin the next day on whether it would be for a grade or not. So 50% of the time she wouldn't have to grade anything.

Assessment should be about the students knowing the material. Stuff like showing your work goes a long way. Math is the easiest to automate, but that would only show you that the student got the correct answer, not where the answer came from (like from a friend!).

To lower your workload, flip a coin on whether the students will hand in the work. If they aren't handing it in, have them trade with another student and grade it in class. Scantron only sends your students the message that you are too lazy to look at their work, so why should they put any effort into it?

Re:If you can automate, should you be grading? (2, Insightful)

fm6 (162816) | more than 8 years ago | (#13513230)

Your teacher was a smart person. But she was lucky the bureaucracy let her get away with it. And that none of the students went whining to their parents about being made to do homework that wasn't graded. With school more bureaucratic and political than ever, I doubt if that would be allowed today.

Re:If you can automate, should you be grading? (1)

iibagod (887140) | more than 8 years ago | (#13514302)

Amen to that....how else are we going to convince the students/parents that homework isn't merely busy work meant to keep the kids out of the parents'/teachers' hair?

Gotta justify that teaching credential somehow.

Re:If you can automate, should you be grading? (1)

RackinFrackin (152232) | more than 8 years ago | (#13517561)

That's allowed all the time today, and with good reason. The primary reason that math homework is given isn't so that it can be graded. It's to give the students some guided practice using the concepts that they'll be required to know.

Re:If you can automate, should you be grading? (1)

AvitarX (172628) | more than 8 years ago | (#13513957)

Daily homework should be gone over during class anyway (math-problem-type stuff, anyway). My teachers would glance at everyone's notebook, and if you did it you got a check. Then we would go over it. Made sense to me.

I had another teacher who, twice a month, would give a quiz that was just page and problem numbers; if we did the homework we could answer them. But it didn't make sense to do homework and then not go over it.

Re:If you can automate, should you be grading? (1)

teknomage1 (854522) | more than 8 years ago | (#13514508)

Math is one of the subjects where it is most important to examine the process behind a particular answer. Reducing everything to a series of multiple choice questions and Scantron bubbles would be disastrous. At that point you may as well quit teaching.

Why not remove paper altogether? (1)

pokka (557695) | more than 8 years ago | (#13513177)

When I was in high school, one of my teachers placed a small keypad/LCD on every desktop in the class (the seats were arranged as long booths, so wires weren't an issue). The LCD just prompted: "Answer to Question #X" and you typed in A, B, C, or D.

You could navigate back and forward, review your answers, and confirm completion. When you were done, the grade was instantly recorded and displayed to the student.

That was over 10 years ago! I see no reason why this shouldn't be possible (and inexpensive) in 2005. Furthermore, you could extend the idea to publish the grades for online viewing, e-mail the grade to students with a detailed answer key explaining why the selected answer was wrong, etc..

Re:Why not remove paper altogether? (1)

fm6 (162816) | more than 8 years ago | (#13513269)

Nowadays, it would be more cost effective just to give every student a laptop, with a secure network application for test-taking. And a lot of schools are doing just that. Of course, every time a story about that appears on Slashdot, we hear a lot of noise about technology not replacing teachers.

Well, it's true -- technology can't replace teachers. But it sure can give them more time to actually teach.

Re:Why not remove paper altogether? (1)

Lehk228 (705449) | more than 8 years ago | (#13521710)

How exactly is a fragile and expensive laptop more cost effective than a $20 (have to assume the school will be fleeced by manufacturers spending $2 to make each unit) block of plastic with an LED display and a few switches?

Re:Why not remove paper altogether? (1)

fm6 (162816) | more than 8 years ago | (#13523577)

Typical geek solution: give a teacher a few electronic parts and say, "Here's your new grading system!" Who installs the system? Who fixes it when it breaks? How do you run the wires so they aren't a hazard?

Re:Why not remove paper altogether? (1)

Lehk228 (705449) | more than 8 years ago | (#13524113)

As opposed to laptops, because laptops are so simple to maintain and repair over time?

Seriously, a custom-designed system built to be rugged and safe for a classroom is much better than throwing a pile of money at the problem and hoping it goes away.

Re:Why not remove paper altogether? (1)

fm6 (162816) | more than 8 years ago | (#13526451)

And if laptops were only used to take tests, you'd have a point. But they have many other classroom uses: writing (students do more writing with less prodding if they have access to a laptop), research (accessing primary sources online for free instead of expensive, watered-down textbooks), educational software, etc.

I'm not sure what your point is with that "throwing money" remark, unless it's that we can't solve problems by buying technology without knowing what we're going to do with it. That's certainly true, but so what? Are you claiming that nobody who wants classroom computers knows what they plan to do with them? I see a lot of evidence to the contrary.

This LCD scheme, by contrast, is poorly thought out. It worked for a single gizmo-loving teacher, but you haven't offered any serious evidence that it can be deployed cheaply, beyond the fact that some of the basic parts are cheap. Others are pretty expensive: wiring, a gadget to tabulate the results, etc.

Re:Why not remove paper altogether? (1)

KTorak (860467) | more than 8 years ago | (#13513382)

My school has eliminated paper from the grading system. A teacher gives out an assignment, grades it, then puts the grade right into their PC. From the program (Grade Quick), they can also make attendance sheets, seating charts, and a vast amount of other stuff. They can even make, say, homework ONLY 15% of your final grade, and the program does all the math, saving a lot of time. If you want to know your grade, most teachers will just let you ask and will show it to you. With the help of a 3rd party (Edline), they can publish the grades to the web so you can view them from anywhere at any time (whenever the teacher updates it).

As for grading assignments, my math tests have almost always been "show your work, and write the answer on the sheet." I've never gotten credit for only doing the work but having the wrong answer. And the only multiple choice tests I have had were exams in 7th grade math and Honors Algebra II.

Personally, I like the idea of inputting it all into a computer on the desk like that. I'm sure it could be adapted to have you put in the EXACT answer you got rather than multiple choice...but it'd be difficult because so many problems in math can have varying answers based on rounding and other factors.

Student Grading (1)

Noksagt (69097) | more than 8 years ago | (#13513233)

Collect all of the tests, photocopy them, and allow students to grade themselves (walking through answers in class should take less than one class session & will help students understand what they missed). Cheaters can be caught by comparison to the photocopies.

If privacy isn't a concern, you can chop off the corner with the student's name, write a number on top, and allow them to grade someone else's exam.

Multiple Guess (3, Interesting)

99BottlesOfBeerInMyF (813746) | more than 8 years ago | (#13513290)

I've always had a strong dislike for multiple choice and true/false testing. Taking those tests is often more of an exercise in test taking than it is in the subject matter. A good test taker can eliminate a good portion of the answers right away and use fairly intuitive psychology to improve the odds of guessing correctly.

What ever happened to demonstrating competence in a field? Forget multiple choice and true/false. Ask your students to actually solve applied math problems or actually write some code (or pseudo code). Maybe you can't do as much testing that way and maybe you can't shorten the time it takes to grade the papers but at least you will be testing something worthwhile.

Sorry for the rant, but after having survived more than a decade of "education" that consisted primarily of memorize foo and then regurgitate, I'm fairly traumatized by the horror that is the educational system. I learned orders of magnitude more useful information by simply reading everything and anything and trying to apply what I learned to my pet projects. I took one too many tests where I knew several multiple choice answers were justifiable and "right" depending upon unspecified information not contained in the question, and I had to guess what the test author thought the correct answer was. Multiple choice, true/false, and automated testing are big indicators of a "fast food" mentality and I'm firmly against that sort of foolishness. Grumble, grumble, etc.

Re:Multiple Guess (3, Insightful)

bluGill (862) | more than 8 years ago | (#13513582)

There are times for memorization, and there are times when you need to go farther. In math you always need to go farther and understand the concepts. In shop you must get 100% (no misses allowed!) on the safety memorization test before you are allowed to take the test on real tools. Of course knowing that the margin of safety on some saw is 10 inches doesn't mean you won't put your fingers closer to the blade, but if you don't know that number it means you will.

Memorization is important. Do not overlook the value of memorizing some things (even if you will never need to know that poem once you pass the class, it is still useful to do it). Though overall I agree that there is too much focus on memorizing (mostly on the wrong thing!) in school, that doesn't mean you can get rid of memorizing.

Re:Multiple Guess (1)

jpostel (114922) | more than 8 years ago | (#13514909)

I don't have mod points to mod you up, so I will respond instead. The original poster did not indicate what level of mathematics they teach. I think that is important for the discussion.

My first thought when I started reading everyones comments was that they seemed to be missing the point that the first several years of mathematics are almost entirely memorization. Addition and subtraction are conceptual when showing children a bunch of apples and then taking two away. After that, it's memorization. Multiplication is also conceptual in the beginning, but memorization kicks in soon enough. Hell, even trigonometry and geometry have a good bit of memorization, and they are taught in high school.

Free response tests, for seeing which students are not understanding which steps in the mathematical procedures, are useful for helping individuals learn and address specific shortcomings and should not be ignored, but that should not eliminate the memorization process or, even worse, replace standardized mathematical testing. There is still a place for right and wrong answers in mathematics.

I used to annoy many of my math teachers because I did most of my work in my head. The first time I was called on it was when we were learning long division. I could do it in my head, but the teacher said to show our work. I would scribble the equation and then write the answer. I used to get the same from my trig and calc teachers. They think you are copying the answers from someone if you don't show how you arrived at those answers.

I was not always perfect, but I rarely took even half the time to take a test. I got yelled at once for finishing a test before the teacher finished handing it out to the rest of the class. I raised my hand and asked what we were supposed to do when we were done with the test, and she told me we would not be done for a while, so I should just be quiet and get to work. The look on her face when I told her I was done was great. She offered to grade my test in front of the class to embarrass me, so I called her on it and got 95% correct. Strange what we remember from childhood. And I was always at the desk right next to the math teacher for some reason.

Factual knowledge is a commodity (1)

chocolatetrumpet (73058) | more than 8 years ago | (#13533594)

Memorization is stupid. Factual knowledge is a commodity. I can look anything up in seconds.

Anything worth knowing will be "naturally" memorized through use. Anything not worth knowing will be forgotten. Anything worth knowing that has accidentally been forgotten can be looked up.

It's a beautiful system, and it's all natural!

Learn to read (1)

bluGill (862) | more than 8 years ago | (#13537889)

You need to learn to read. Or maybe reading comprehension (though I'll admit it would help if I were a better writer). You can look some stuff up in a book, but some things you should memorize before you need them.

When your clothes are burning off your back there is no time to find a book and look up "Stop, Drop and Roll", so you have to memorize it in advance. In fact most people will never need to know what to do in that situation, but everyone should memorize what to do anyway!

There is also value in memorization because it is brain exercise. You are unlikely to need to recite Poe's "the Raven", but that doesn't mean you shouldn't memorize a poem from time to time just to exercise your mind.

In fact your entire argument fails. You cannot look anything up in a book until you learn to read, and you cannot learn to read until you memorize your letters, phonics, and whatever else. True, someday you will remember the rules of reading through use, but you need to memorize those rules before you can use them.

You're too insightful... (1)

mosel-saar-ruwer (732341) | more than 8 years ago | (#13513603)


What ever happened to demonstrating competence in a field? Forget multiple choice and true/false. Ask your students to actually solve applied math problems or actually write some code (or pseudo code). Maybe you can't do as much testing that way and maybe you can't shorten the time it takes to grade the papers but at least you will be testing something worthwhile.

Sorry for the rant, but after having survived more than a decade of "education" that consisted primarily of memorize foo and then regurgitate, I'm fairly traumatized by the horror that is the educational system...

Look, obviously you're too insightful to be allowed to take part in this conversation.

And it's equally obvious that the original poster, "Dont tempt me" - who is too damned lazy to perform the most important aspect of his job, which, in this case, is to grade his students' math exams - has found for himself the perfect career path, that of public school educrat.

In twenty years, or thereabouts, he'll be raking in $250,000 per annum as some poor town's Superintendent of Schools.

So be happy for the guy.

And in the meantime, go take your Ritalin, like a good little plebe...

Re:Multiple Guess (1)

Bastian (66383) | more than 8 years ago | (#13513774)

So true. Having been a grader as an undergrad, I understand the attraction of Scantron-type systems. BUT I can also say that I don't remember nearly as much from the classes that used multiple-choice tests as I do from the classes that required you to actually show some command of the material.

It's a lot harder to grade, I realize, but the best tests in my opinion are the ones that have four essay questions, all of which require a great deal of thought on the part of the student.

Memorizing facts is nothing. If you can't use what you know, you haven't learned a thing.

Re:Multiple Guess (2, Insightful)

SlamMan (221834) | more than 8 years ago | (#13517595)

At some level you have to think about what's logistically possible for a teacher. Say you teach 8th grade science. Your school has 8 mods (or periods, whatever they call 'em these days): 6 of which you spend teaching, 1 for your lunch, and 1 for "planning", which is usually spent dealing with school bureaucracy, possibly calling parents, or once in a while doing actual lesson plans for the next day.

Each of your 6 classes has 30-35 students.

Every time you give an assignment to your students, you get 180-210 papers back to grade. That's about 210 papers back every day. How do you find the time to grade 210 papers every day?

Now imagine each of those papers is a free response, one that asks the student to think and to show the ability to use what they've learned.

Now add in how much time it would take to come up with those questions, for 180 school days.

suggestions (2, Informative)

Darth_Burrito (227272) | more than 8 years ago | (#13513307)

Everyone I know uses scantron for multiple choice tests. If you're looking for some slightly more tech oriented solutions, here are some suggestions:

For multiple choice tests you could use off-the-shelf survey software like phpsurveyor or phpesp. Keep in mind these wouldn't necessarily be great at grading, but they would let you easily analyze the test results question by question.

If you are grading programming assignments, you could develop your own test suites using the xUnit family of testing frameworks: NUnit (.NET), JUnit (Java), PHPUnit (PHP), and I'm sure there are others. I think there are even tools designed to evaluate test coverage, like jcoverage (never used it). Maybe you could have advanced students write test suites for the novice students' assignments and evaluate/fix them with jcoverage... then use the test suites to automate testing of the novice students' work.

Of course, there are only so many things that are easy to test in an automated fashion. You may have to give students exact specifications for interfaces, and that may not always be desirable.
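
As a concrete example of the shape of the idea, here's a tiny grading suite using Python's unittest (the module name solution and the function median are hypothetical stand-ins for whatever the assignment asks for):

    import unittest

    # Hypothetical: each student submits a file named solution.py defining median().
    from solution import median

    class TestMedian(unittest.TestCase):
        def test_odd_length(self):
            self.assertEqual(median([3, 1, 2]), 2)

        def test_even_length(self):
            self.assertEqual(median([1, 2, 3, 4]), 2.5)

        def test_single_element(self):
            self.assertEqual(median([7]), 7)

    if __name__ == "__main__":
        # Run the same suite against every submission; the pass/fail counts become
        # the automated part of the grade, with style still graded by hand.
        unittest.main()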

Re:suggestions (ok, automate THIS) (0)

Anonymous Coward | more than 8 years ago | (#13514651)

Suppose you have:

Submission 1:

print "hello world\n"

Submission 2:

print "hello "
print "w", "o", "r", "l", "d\n"

Suppose both submissions do the same thing when executed. An automated grader will note that, and score the submissions similarly. (Oops.) As a teacher, I've experimented with automated grading systems (including writing my own). For programming assignments, it seems to me that automation doesn't help either (a) me or (b) the students.

A lot of programs will transform the same inputs into the same outputs. Most of the possible programs are heinously bad, in terms of what I want the students to learn: good style. Getting the right answer (as output) is important, but how you get there is equally important.

Re:suggestions (ok, automate THIS) (1)

Darth_Burrito (227272) | more than 8 years ago | (#13515482)

Getting the right answer (as output) is important, but how you get there is equally important.

It sounds like correctness, as evaluated by a test suite working from a specification, would be at least as important as code cleanliness/efficiency/style. Using regression test suites would not solve all problems, but it seems like they would solve some. If you could work regression testing into the curriculum somehow, there would be added value in using the technology.

Perhaps the true value add is that you can see exactly which test cases fail while you are looking at a student's solution.

Another idea might be to let students run teacher-supplied regression tests on their homework as they work on it. That way, they know when their solution is correct and when something is wrong. Maybe then you can focus more on evaluating style and maintainability and spend less time on correctness (because you will expect correct solutions).

Use an algorithm as the solution... (1)

moorley (69393) | more than 8 years ago | (#13513356)

And make the students do the work...

Create three variations of the test, with each question yielding a slightly different numerical answer. You could have it be as simple as adding all the answers so they match a certain sum, or have an algebra equation at the end that they plug the values into.

All you have to do is check the variation number of the test and the result of the algorithm.

To shake it up and prevent cheating you can use variations on the test (either variables or questions), you can use variations on the algorithm at the bottom of the test.

A simple algorithm would be the sum or the product of primes. Each question's answer is mathematically related to a distinct prime. You could have a computer program break it back out and tell you which questions they got wrong. Or maybe a derivative of the double-entry bookkeeping system, with questions matched in pairs to cancel each other out.

All in all, whatever method you use, you are putting the weight on the student instead of grading the exams individually by hand. Any human error they make will still have to be compensated for with a human's eye (and compassion), but it will get you 80-90% of the way there.

Overkill would be to write a program with a set of rules to generate the questions on the test, the primes, and the algorithm. Then you put in the unique identifier for the test and the single result, and it spits out which answers they got right and wrong, though you'll have to work them out yourself if the student contests the grade. How much do you trust your programming skills? ;-)
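
A toy sketch of the prime version in Python, assuming each question is rigged so its correct answer works out to a distinct prime and the student reports only the product of their answers (it misfires if a wrong answer happens to share a factor with the key, which is one more reason to eyeball anything contested):

    # Correct answer for each question, rigged to be a distinct prime.
    answer_key = {1: 2, 2: 3, 3: 5, 4: 7, 5: 11}

    def grade_from_product(reported_product):
        """Divide out each expected prime; a question whose prime divides the
        reported product is presumed correct, the rest get flagged for review."""
        remaining = reported_product
        results = {}
        for question, prime in answer_key.items():
            if remaining % prime == 0:
                results[question] = "right"
                remaining //= prime
            else:
                results[question] = "check by hand"
        return results

    # A student who got questions 1, 2, and 4 right reports 2 * 3 * 7 = 42.
    print(grade_from_product(42))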

Re:Use an algorithm as the solution... (1)

dj e-rock (700351) | more than 8 years ago | (#13516610)

There are LaTeX packages that'll do all that for you. All you do is input the questions and the answers, and they'll randomize the answers, make the different versions, etc. Never used them, since I'm not in a teaching position, but if I were, I'd use 'em.

A hybrid approach (1)

clutch110 (528473) | more than 8 years ago | (#13513379)

In math, I think showing work is one of the most valuable aspects of the test; it demonstrates a deeper understanding of the problem at hand. Why not have final answers on a Scantron-like sheet and, once an answer is marked wrong, review just the incorrect problems on the worksheet they turn in along with the Scantron? Grading will still take time, but you will get straight to the incorrect problems and help the people who need it most.

Strange questions for Slashdot (1)

MerlynEmrys67 (583469) | more than 8 years ago | (#13513540)

Hmmm... Let's see: I started college in 1985. The technology for multiple-guess tests back THEN was Scantron...

Not sure if they have better or faster scantron machines today - but I would bet your school has something like it around somewhere.

Ask Slashdot - reinventing the wheel since the late 20th Century

Re:Strange questions for Slashdot (1)

Sigma 7 (266129) | more than 8 years ago | (#13528594)

Not sure if they have better or faster scantron machines today - but I would bet your school has something like it around somewhere.


They're still the same. The latest Scantron is still reported to have trouble distinguishing the correct answer if there is even a trace amount of pencil on another cell.

If you make a mistake, you either need to erase heavily (potentially damaging the paper) or get another sheet.

Do less grading!! (4, Insightful)

MagicDude (727944) | more than 8 years ago | (#13513625)

Not really a technological answer to what you were looking for, but I think it's worth mentioning. You say that you spend most of your time grading. I don't know how many students you teach, but I'm wondering if it's just a matter of you giving your students too much busywork that you in turn have to grade. In my high school calculus class, my teacher assigned homework but he never graded it. At the beginning of every class, he made a quick pass around to see who had done it and marked you off if you hadn't. He would then pass out an answer key for the assignment and go over any questions people had. His policy on homework was that it helps some people and is just a pain in the butt for others. So the deal was that if your semester average was over 90%, there was no penalty for not doing homework. If your average was between 80 and 90, you lost 0.5% for each homework not done, and so forth. As such, the only things he graded for our class were the exams he gave every 3-4 weeks. So I'm just saying that perhaps the answer isn't to find faster ways to grade, but to eliminate some of the pointless grunt work for you and for your students.

been there, done that (1)

Teach (29386) | more than 8 years ago | (#13513790)

Maybe I've written some software that can help you. I'm a computer science teacher at the high school level, and I take my grading very seriously. But, I hate "busy work" and am equally serious about using technology to streamline my grading workflow.

Now, I assume you only have one PC in your classroom, whereas every student in my room has their own machine in front of them. And that's a huge advantage; if the work is in digital form to begin with, it grades much more quickly.

Anyway, I've done some things to improve my workflow:

  • My skills quizzer [grahammitchell.com] that gives me infinite self-grading quizzes for life.
  • Scripting to make manual grading and subsequent data entry [grahammitchell.com] faster.

It's not my stuff, but Jeff Burgard's Gateway to Mastery [jjburgard.com] concepts are very, very good and encourage real accountability for learning in ways that keep the grading load manageable. (And he's got a lot for math teachers and doesn't assume you have access to more than one teacher PC.)

I don't have time to write more in this comment, but I do have more to say. Reply or email me if you're interested in discussing more.

Pimping LMS... (1)

EvilMagnus (32878) | more than 8 years ago | (#13513859)

I'm working on an open-source Learning Management System called Sakai [sakaiproject.org] that has a tool for online test taking - you set up the test, tell the system what the correct answers are, then students sign in and take the test whenever.

There are obvious limitations to online test taking, but it *does* provide automated grading, and with the right institutional commitment it can have a positive impact on student learning.

 

Just say no (1)

msuarezalvarez (667058) | more than 8 years ago | (#13513909)

I'd simply suggest keeping away from automated grading, as it is a very bad way of grading.

I have taught math and CS for quite a few years, and I have never ever given an exercise in an examination for which the point was the final result. Most of the time, I do not even look at the final result. Of course, it is a lot more work to actually read what students write, how they get to the results they get, how they argue if they argue, &c, but that is what I am teaching them to do, so that is what I am grading.

Re:Just say no (0)

Anonymous Coward | more than 8 years ago | (#13515757)

Automated grading doesn't have to mean you remove the human component. Sometimes it means you give the human component more time to focus on what it can best evaluate.

For example, with programming assignments, you could use automated regression tests to evaluate the correctness of solutions with respect to the specification. Then, while you were doing the manual portion of the grading, you would have a list of every test passed or failed for every problem. With this list, both you and the student would be able to more efficiently identify problem areas. In other words, an automated grading program could help you perform better manual grading.

It's just a tool, like a breadmaker or an automatic transmission. The right tool in the right hands is good. Any other combination is generally not so good.

Try Moodle and such. (3, Informative)

Paladin2ez (619723) | more than 8 years ago | (#13514041)

I'm a high school math/science teacher, and for a while I've been playing around with Moodle (http://moodle.org/ [moodle.org]). Though it may take a little while to set up (PHP and MySQL are needed), it is a good system. Just make sure you have the resources to run it.

All in all, it will allow you to make quizzes and lessons online that students can access. Questions can be shuffled automatically, and it even handles short-answer questions with multiple acceptable answers. It's a beautiful system; the only drawback is that it requires a computer for each student. (I'm in an independent school, so our kids have laptops at the ready, something not everyone has.)
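One time-saver if you do adopt it: Moodle can import question banks from a plain-text format called GIFT, so questions can be generated by script rather than clicked through web forms (worth verifying the exact syntax against the docs for your Moodle version). A small sketch in Python, with made-up questions and a placeholder output file name:

# Sketch: emit multiple-choice questions in Moodle's GIFT import format.
# The questions and output file name are placeholders; check the GIFT
# syntax against your Moodle version's documentation before importing.
QUESTIONS = [
    # (prompt, correct answer, wrong answers)
    ("Slope of y = 3x + 1?", "3", ["1", "-3", "1/3"]),
    ("Value of 2^5?", "32", ["25", "10", "64"]),
]

def to_gift(prompt, correct, wrong):
    """Render one multiple-choice question as a GIFT block."""
    distractors = " ".join(f"~{w}" for w in wrong)
    return f"{prompt} {{={correct} {distractors}}}"

with open("quiz_import.txt", "w") as f:
    f.write("\n\n".join(to_gift(*q) for q in QUESTIONS))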

The only other geek-oriented possibility would be scantrons or small LCD-based devices, but from what I've seen nothing fits the bill. Possibly the best move is to change how you grade and what work your students do (i.e. projects instead of tests and the like). It works with a little imagination, and there's a lot less grading!

Make them grade each other... (1)

tktk (540564) | more than 8 years ago | (#13514217)

  • Collect the tests
  • Semi-randomly distribute them back to the students.
  • Call out the answers, and have each student mark the wrong answers.
  • Get immediate feedback about problems that lots of students missed.
  • Make bored-looking students work out the problem on the board.
  • Return test to original students.
  • If mistakes in grading are found, take points from mistaker and give to mistakee.

Works especially well if you don't happen to have a lesson plan that day.

This happened all the time in my high school. And despite that, I was still able to complete BC Calculus. Or whatever it's called now.

Re:Make them grade each other... (2, Informative)

greenplato (23083) | more than 8 years ago | (#13515040)

Besides concerns about cheating, that practice is likely a violation of the student's right to privacy. The Family Educational Rights and Privacy Act [ed.gov] prohibits the release of any information from a student's education record. Because test scores make up a large part of any final grade, sharing these results with other students is probably a no-no.

This happened all the time in my high school.

Yeah, high school is a good place to stomp all over kids' rights. Somebody has to put them in their place...

Re:Make them grade each other... (1)

epsalon (518482) | more than 8 years ago | (#13517011)

Just remove the students' names before redistributing the exams. Keep only an identifier on each answer sheet and put the students' names in a separate list.
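A sketch of that bookkeeping in Python, assuming the roster is just a list of names in a text file (the file names and the four-digit ID scheme are placeholders):

# Sketch: assign each student a random anonymous ID, label papers with the
# ID only, and keep the ID-to-name list to yourself.
import csv
import random

with open("roster.txt") as f:
    names = [line.strip() for line in f if line.strip()]

ids = random.sample(range(1000, 10000), k=len(names))   # unique IDs

with open("id_map_private.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["anon_id", "name"])
    for anon_id, name in zip(ids, names):
        writer.writerow([anon_id, name])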

Re:Make them grade each other... (1)

jackbird (721605) | more than 8 years ago | (#13517677)

That's just as much busywork as grading the tests, if not more.

However, I don't think the GP is correct - nobody objects to students working in groups, etc., and I fail to see how peer-graded homework that isn't handed in constitutes part of a student's "record." Nobody would do that for a major test or final exam.

Re:Make them grade each other... (1)

greenplato (23083) | more than 8 years ago | (#13519359)

Did you read the original question or comments?

The article specifically asks about grading tests, with no mention of homework:
I find myself looking at handwritten tests and thinking 'there's gotta be a better way...'

Then tktk writes:
Collect the tests
...

And I respond:
Because test scores make up a large part of any final grade, sharing these results with other students is probably a no-no.

Then you write:
I fail to see how peer-graded homework that isn't handed in constitutes part of a student's "record."

That's super; I never suggested that there could be a problem with homework. I do think that peer-grading of exams is questionable; specifically, I mentioned that it could be a violation of the students' rights.

Re:Make them grade each other... (1)

sdedeo (683762) | more than 8 years ago | (#13523937)

Jeez, chill. It's a simple matter to make things anonymous, but I doubt the process would be in violation of the act anyway (the teacher can record the grades later based on each student's percentage).

We used to do this as well, and it was a very educational experience. I can imagine it now... "Jimmy, what is the answer to question six?" "I'm sorry, the Family Education Rights and Privacy Act prohibits the release of any information from my education record. Since my knowledge of the answer constitutes part of my record, I respectfully refuse to answer."

What a dick. (1)

Seumas (6865) | more than 8 years ago | (#13515004)

What kind of teacher expects kids to do work that the teacher won't see and can't be bothered to deal with? If you want some fancy computer gizmo to do your homework grading for you, do your students get to have the same attitude and just copy and paste wikipedia articles?

Re:What a dick. (0)

Anonymous Coward | more than 8 years ago | (#13519343)

*soapbox on* No, *you're* the dick. Unfortunately, I encounter this sort of attitude all the time from those who expect "A" marks without having to do the work required. The culture of instant gratification remains alive and well. Pity.

Besides, the analogy given is poor. One is making a near-impossible to impossible workload *slightly* easier to manage, the other is using somebody else's thoughts and representing them as your own. Which they aren't. Even if you agree with said wikipedia articles, couldn't you at least *attempt* to use your own brain to do your work? I define using your brain as being more than an automaton. Or are you the kind of person who looks at the answer the calculator gives him to a measured-quantity problem and says, "Ah! All of those digits look great! Let's just write them all down!" and then gets pissed because the professor marks it as completely wrong?

If you're upset about the use of tools to make grading easier, why don't you try being a teacher who has 400+ students a semester? I happen to think that every one of my students' papers is worth looking at, but sadly, I don't have the time to do so. And even more sadly, it seems that few people are interested in going into the field of education so that we could *have* those nice, small classes where we could give a lot of individual attention to each student and *have* those detailed-feedback assignments. Do you think that perhaps that's being driven by the fact that it's so difficult to make a living on an educator's salary? Even just a little?

Me, I'm happy to do my work for peanuts (and have done) because, by God, I love what I do. But, you'll never get small classes and homework assignments where the student is asked to do some detailed work-out of problems until there are enough competent and passionate professionals in the field. And for that, one thing you need to do is pay them what they're worth. And for *that*, we need a sea-change in the mindset of what people think is important. Sorry for the rant. This one struck a little too close to home.

Re:What a dick. (1)

Seumas (6865) | more than 8 years ago | (#13524813)

I define using your brain as being more than an automaton

Exactly. So you and I both agree. Teachers should use their brains, rather than having some crappy piece of code automating the process.

And don't whine to me about the hard life of a teacher and the low salary. It's not like it's a fucking new situation. If you've gone into education as a profession at any point in the last 30 years, you knew what a shit job it was and didn't have a problem taking it - supposedly because you enjoy the actual job itself.

Re:What a dick. (0)

Anonymous Coward | more than 8 years ago | (#13554292)

I'm sorry, we don't agree, and it's too bad that we don't. You've de-contextualized a phrase that needs to have its proper context. First, teachers aren't being automatons by using the machines to grade. When I give a multiple-choice examination, it's a well-constructed one. And none of them have had a percentage above 50 (that I can recall) of the answers being B, because I'm aware of the phenomenon of "taking the easy way out" distractors *whacks colleagues that do that*. Many of the answers aren't single-response (multiple-choice) but are multiple-selection, meaning you might have to mark three or more bubbles in a line to answer the question correctly. That requires knowing the subject material, not lucky guesses. And a Scan-Tron machine grades those types of exams just fine, thank you, and the exams are great measures of how the students are doing. I have yet to give an A or A- grade to anyone that didn't have to actually *know* what they were doing.

Secondly, I have every right to complain about the educational system going the way it's going versus pointless amounts of dollars being flung at those who don't deserve a tenth of what they're getting. You *bet* that there are poor, some extremely poor, teachers out there who've gotten their jobs by means that I've no idea about. Some of that is due to certain institutions of higher learning not really *training* teachers appropriately (I know of a University that turns out copious quantities of education graduates that haven't even had a course in Classroom Management!). Some of it is due to a tenure system that in many cases fails to continue evaluations of teachers past their tenure year. I have no idea of how to fix either of these problems, because my own energies are engaged in *being engaged*. Loving what you do doesn't mean you love everything about it, and if a teacher has taught you something, at least tell them so, if you aren't going to help them receive what's due them. Without good educators, society doesn't make much progress. I seriously doubt *any* other field can make that claim. For those that say, "oh, I don't owe them anything -- I'm self-taught!" then thank your parents, your sibling, your counselor, your confidante. Or any others that took care of your needs so you could tend to educating yourself.

I will say this, though. The best thing I've ever heard from any one of my students, I heard just over a year ago. This gal struggled with Chemistry when I had her. She was originally a history major. I mean, she cried. Stayed up late. Really got in there and hammered away at it. Was at office hours at 7:30 AM. And 4 PM. Had two kids. A husband. Worked 30 hours a week. Etc. Got a note from her inviting me to her graduation for her Master's degree in Chemistry. She had written, simply, "You changed my life." Then "Thank You," and her signature. I want that note to be buried with me.

So don't *ever* tell me I don't deserve, and quality educators don't deserve, a lot more than they're getting. And listen up -- education is never a "shit job" for serious educators. That's the attitude that has gotten us into the situation we're in in the first damn place. Do you value education? Perhaps not. Education needs to stop being thought of as a low profession, and be restored to its proper place in society. Start with students coming to class on time, not walking in 10 minutes late. Constantly. My favorite way to handle that is to give quizzes the first 10 minutes.
Start with qualified professionals teaching classes, not giving people 180-day emergency certifications at the drop of a hat. I won't go on and on, because I have notes to prepare for Astronomy tomorrow, but consider -- enjoying the job doesn't equate to being able to pay the bills when they come due. And, sad to say, greenbacks are what the country runs on, not whether you're happy in your job. Someone will doubtless pipe up and say, "then why did you get into the field?!" My response is, "When did we stop being able to make a living doing what we love?" Cheers, and thanks for your response! I was glad to see you weren't offended too much by my tone, which was, quite honestly, out-of-line. Let's continue the discussion!

The traditional way (1)

holy zarquon's singi (640532) | more than 8 years ago | (#13515662)

I thought that the traditional way was to give them the test, collect them up, and throw all the papers down the stairs. Then the ones that fall furthest get the highest mark. Easy, and works for all non-practical subjects and some practical ones.

Re:The traditional way (1)

CFMLSpecialist (900388) | more than 8 years ago | (#13516138)

When the teachers get to use automated grading systems, I think the students should be allowed to use automatic homework systems.

No shortage of Open-Source solutions (1)

Sometimes_Rational (866083) | more than 8 years ago | (#13516226)

On SourceForge, look up Webwork, Moodle, AIM (Assessment In Mathematics), and LON-CAPA, or look at STACK here. [york.ac.uk] Webwork, AIM, and STACK are primarily geared towards delivering math homework, and I know that Webwork in particular can deliver fairly sophisticated problems. Webwork is entirely free and has several large free problem libraries; AIM is costly in that it uses Maple as one of its components; STACK is similar but replaces Maple with Maxima or Octave (I think).

Moodle and LON-CAPA are more general. Moodle looks manageable, but LON-CAPA looks huge and bulky to me and is probably very much a group project.

In the proprietary realm, I like the ALEKS [aleks.com] online system for teaching basic mathematics, and their statistics offering looks good, too. No, I don't work for them or own their stock. I don't care at all for any of the other proprietary systems I've come into contact with, and I've seen many of them.

Fundamental question (0)

Anonymous Coward | more than 8 years ago | (#13519751)

IMHO, your question is flawed. I would ask, "do multiple-choice tests actually test students' knowledge and problem-solving ability, or merely challenge their powers of deduction and 'gamesmanship'?" I used to ace standardized (SAT, ACT, ASVAB, and the like) and teacher-made MC tests all through my school career. Of course, once I got to University, where I had to demonstrate actual learning, I got hammered - it took me years to learn 'how to learn', which is what is required to be truly educated, and also what is required to get the most out of a University education. Right now, there are probably few tools out there to help a public school teacher measure 'real learning', but I suggest you look for some or develop them - your students will be better served, IMHO. Good luck!