What Does a Software Tester's Job Constitute?

timothy posted more than 2 years ago | from the darts-glue-and-ping-pong dept.

Software 228

First time accepted submitter rolakyng writes "I got a call from a recruiter looking for a software test engineer. I'm a software engineer, and my job is development and testing. I know I mentioned testing, but I'm pretty sure it's totally different from professional testing practice. Can anyone shed light on what a software test engineer's day-to-day responsibilities are? They said they'll call me back for a screening and I want to be ready for it. Any tips?"


Ummmm (0, Redundant)

eternaldoctorwho (2563923) | more than 2 years ago | (#39000297)

Just a guess, but: Testing software?

Re:Ummmm (2, Informative)

tripleevenfall (1990004) | more than 2 years ago | (#39000381)

Taking bulk testing responsibilities off developers so they can work on more important stuff.

The epitome of a tester (3, Funny)

AliasMarlowe (1042386) | more than 2 years ago | (#39000501)

Any tips?

Study the techniques of Bob the Bastard in testing hardware. Apply analogous techniques to software. Software developers will fear and respect you, and tremble at the mention of your name.

Developers often make poor testers (5, Informative)

perpenso (1613749) | more than 2 years ago | (#39000659)

Taking bulk testing responsibilities off developers so they can work on more important stuff.

Not quite. Developers often make poor testers. Software tends to get debugged and tuned for the way developers use the software, which is not necessarily how others (in particular customers) will use the software. How many developers have written a piece of code, tested it conscientiously themselves, presented it to others expecting no problems, and watched these other folks find serious bugs within minutes?

Having dedicated testers between developers and customers yields better products, even when the developers take testing seriously.

Re:Developers often make poor testers (4, Informative)

Firehed (942385) | more than 2 years ago | (#39000847)

Indeed. Some bugs come from weird things users try that developers simply wouldn't think to test. Case in point, a bug I found today: someone bought a six-month subscription by entering 0.5 as the quantity on a 1-year subscription. Our code wasn't expecting non-integer quantities, but happily did the math to get the line-item subtotal. When the data was stored, the 0.5 quantity went into an integer column, which the database cast to an integer by rounding down, and suddenly qty * price != subtotal. It was caught and quickly fixed by a data integrity check, but a QA/dedicated testing person who thinks of weird user interactions like that could have prevented it from going out to production in the first place.

Now we have one more thing added into unit tests so it won't happen in the future, but there you go. The code was not untested, it was just used in an unpredictable way.
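The guard that would have caught this is small. A sketch in Python (all names here are invented for illustration, not taken from the actual codebase):

```python
def line_item_subtotal(quantity, unit_price_cents):
    """Compute a line-item subtotal, rejecting quantities that the
    rest of the pipeline (e.g. an integer database column) cannot
    represent without silently rounding."""
    # bool is a subclass of int in Python, so exclude it explicitly.
    if not isinstance(quantity, int) or isinstance(quantity, bool):
        raise ValueError("quantity must be a whole number, got %r" % (quantity,))
    if quantity < 1:
        raise ValueError("quantity must be at least 1")
    return quantity * unit_price_cents
```

With a check like this at the boundary, the 0.5 order fails loudly at purchase time instead of quietly disagreeing with the database later.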

Re:Developers often make poor testers (4, Informative)

networkBoy (774728) | more than 2 years ago | (#39000885)

I write test software, specifically software that acts on firmware. The majority of our devs do unit tests, but once the whole thing is assembled and used as a system, interactions show up that are not ideal.
You need some form of QA department between Devs and Customers.
As someone who is already familiar with software engineering you likely would be an ideal candidate for test engineering, because you know how software generally works, and can write meaningful bug reports.
For an interview, if they ask you for strengths, I would focus on your development background and ability to write meaningful bug reports.

Re:Developers often make poor testers (1)

Anonymous Coward | more than 2 years ago | (#39001009)

It's the same reason authors need proofreaders. When we create something, our natural inclination is to "hope for the best". That means we will unconsciously ignore small issues because we are concentrating on the "happy path". A software tester's mindset is completely the opposite (or should be). They are not there to create, but to destroy. It's helpful to have that kind of mindset when testing software. You have to analyze the software and generate test cases designed specifically to cause problems. Things like stress testing, edge-case testing, and user acceptance testing are not something the typical programmer performs day to day, and may be completely missing from her tool set.

Re:Developers often make poor testers (5, Informative)

asliarun (636603) | more than 2 years ago | (#39001051)

Taking bulk testing responsibilities off developers so they can work on more important stuff.

Not quite. Developers often make poor testers. Software tends to get debugged and tuned for the way developers use the software, which is not necessarily how others (in particular customers) will use the software. How many developers have written a piece of code, tested it conscientiously themselves, presented it to others expecting no problems, and watched these other folks find serious bugs within minutes?

Having dedicated testers between developers and customers yields better products, even when the developers take testing seriously.

Actually, that is not necessarily true. I get what you are trying to say, but you seem to gloss over the differences between QA, manual tester, and what the OP was referring to: Software Test Engineer.

To highlight some of the differences:

QA is responsible for "assuring quality". This is different from QC which is "checking quality". More often than not, a good QA is a process expert, with the assumption being that good processes ensure good quality. Their goal is to avoid the problem, not to detect the problem or fix the problem. Where the line gets blurred is the fact that a QA often performs the role of a manual tester. This usually depends on the size of the team.

Manual testing is usually QC - understanding what to test, how to test, and going ahead and testing it. They start off by translating the requirement specification (or user stories if you are agile) into a suite of test cases, add other test cases that might be non-functional or regression related, and finally test the system manually every time before it is released to customers.

Generally (although not always true), a "test engineer" is more of a developer than a tester. They are usually tasked to develop test frameworks using third party tools or even creating their own framework. The former usually involves scripting and lightweight coding and the latter can involve full blown coding. They can be developing a test framework for executing and managing unit tests and functional tests (often white box), and integration tests, regression tests, and performance tests (often black box). While many project teams skimp on devoting this much engineering to testing, it can give huge returns, perhaps even better returns than development can after a certain point.

To be fair, the OP hasn't mentioned anything beyond "software test engineer", so the role might very well be manual testing. However, the word "engineer" leads me to believe it is more of an automation role. Having said this, companies often embellish their titles with "engineer" to make them sound weightier.
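As a rough illustration of the "lightweight coding" end of test engineering described above, here is a minimal sketch using Python's standard unittest; the function under test and every name in it are invented for illustration:

```python
import unittest

def apply_discount(price, percent):
    # Hypothetical function under test.
    if not 0 <= percent <= 100:
        raise ValueError("percent out of range")
    return round(price * (100 - percent) / 100, 2)

class DiscountTests(unittest.TestCase):
    # White-box unit tests: exercise the boundaries the code is
    # known to have, plus the input it is supposed to reject.
    def test_boundaries(self):
        self.assertEqual(apply_discount(100.0, 0), 100.0)
        self.assertEqual(apply_discount(100.0, 100), 0.0)

    def test_rejects_bad_input(self):
        self.assertRaises(ValueError, apply_discount, 100.0, 101)
```

Run it with `python -m unittest`; a real test framework mostly differs from this in scale, fixtures, and reporting, not in kind.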

Re:Ummmm (1)

stanlyb (1839382) | more than 2 years ago | (#39000389)

Try again. Repeat.

Re:Ummmm (-1)

masternerdguy (2468142) | more than 2 years ago | (#39000403)

for herpes?

Re:Ummmm (-1)

Anonymous Coward | more than 2 years ago | (#39000743)

I'm gonna give your dad ass herpes.

Re:Ummmm (5, Insightful)

fooslacker (961470) | more than 2 years ago | (#39000711)

Reading and understanding requirements....Writing testing strategies, test cases, and low level testing scripts that can be traced back to the individual requirements that they test....Understanding which test cases map to which functional blocks of a system....Identifying which test cases should be part of a regression pack and keeping that pack fresh through various versions of the software....running that regression pack when requested during future development cycles...performing change impact analysis to select subsets of the regression pack to test various changes...etc.

....and if it is a manager position then add in all the people stuff on top.

...and of course executing test cases and tracking the results.

Re:Ummmm (1)

Akratau (2429796) | more than 2 years ago | (#39000973)

Definitely! You are checking a list of features; however, you don't get the feature list solely from the software - you get it from the Business Requirements Document. You are testing that the developers have delivered what was actually asked for by the people who are doomed, sorry, destined to use the software.

Re:Ummmm (0)

Anonymous Coward | more than 2 years ago | (#39001319)

3 test cases? Our company has about 1200 test cases for our software, written in PowerShell, VBScript, Ruby, and JavaScript. Our software is a backbone for a lot of corporations, allowing them to automate their repetitive tasks. Therefore we have to QA every possible SP and version of Windows, and most major Unix/OpenVMS systems.

The key is versatility and dependencies. Not only do you need to look at your software, you have to look at the underlying components and basically expect failure at every angle.

it means (1, Flamebait)

gangien (151940) | more than 2 years ago | (#39000301)

telling the developer he's right about everything and that the product is never broken, or the tester will get a tongue-lashing.

Re:it means (1)

tthomas48 (180798) | more than 2 years ago | (#39000319)

Sounds like someone's got a case of wanting to be a product manager instead of a tester.

Some developers appreciate their QA people (5, Insightful)

perpenso (1613749) | more than 2 years ago | (#39000453)

telling the developer he's right about everything and the product is never broken, or the tester will get a tongue lashing.

As a developer who was fortunate enough to have an internal QA department, I can say that your experience is not universal. Hell, my fellow developers and I were annoyed when our QA people were moved to an adjacent building. We preferred having them one floor away in the same building so that we could more easily walk over to their cubes to see what they see. Of course, our QA people were trained, part of the company rather than contractors, etc.

Re:Some developers appreciate their QA people (2)

stanlyb (1839382) | more than 2 years ago | (#39000507)

Seconded. If I had to choose between having a manager and having a QA team... I wouldn't even have to think.

Re:Some developers appreciate their QA people (4, Insightful)

Dastardly (4204) | more than 2 years ago | (#39000573)

I am a developer and I tell my testers to consider me to be evil, lazy, and malicious. They must assume I am actively trying to fool them into thinking the application is working even if it is not with the minimum amount of work possible. That generally gets them to find the defects.

Re:Some developers appreciate their QA people (5, Insightful)

dotrobert (923547) | more than 2 years ago | (#39000763)

I'm a QA engineer and I *do* consider the app developers to be evil, lazy and malicious. Well, not really, I just agree that I should assume so for testing purposes since they are humans. Also, in response to the parent comment, I'm REALLY glad that we work in parallel and directly with app developers. It gives them *and* us tremendous advantage.

Re:Some developers appreciate their QA people (1)

rickb928 (945187) | more than 2 years ago | (#39000823)

My QA tester was a contractor, and very good indeed. Among other things, he was totally invested in the process of delivering a great product. Contractor or not.

And he was let go, as we are making QA and UAT the developers' in-house responsibility. Don't bother telling me how lousy an idea this is; management has their reasons, arrived at without consultation. We would normally predict failure, but it turns out they are releasing a beta of an entirely new product to replace this failed product within 2 months of the expected UAT completion. That's clearly a nasty ploy to escape the steaming pile they have been making for us for 7 years now.

Don't ask. I can't tell you more anyways. It's just pus.

Re:it means (2, Funny)

Lunix Nutcase (1092239) | more than 2 years ago | (#39000499)

Sounds like someone is butthurt because he got chewed out for filing shitty bug reports that were vague and useless.

Re:it means (0)

Ambiguous Coward (205751) | more than 2 years ago | (#39001019)

Oh, come on, "It's broken" is perfectly accurate! Not terribly precise, but definitely accurate!

Re:it means (1)

Anonymous Coward | more than 2 years ago | (#39001189)

The skills you learned on the playground when you were eight years old help, as in this sample dialog:

QA: The build is broken. I'm opening up 11 new defect reports.
Eng: Wait! Did you get the latest fixes from last night?
QA: I installed it this morning. None of my test cases work.
Eng: You must have a configuration problem. Or you did something wrong.
QA: No, I did everything right. The build is broken.
Eng: No you messed something up.
QA: Did not.
Eng: Did too.
QA: Did not.
Eng: Did too. Anyway I'm preparing another checkin for tonight which will fix most of the remaining problems, but what you have should already basically work.
QA: Well it doesn't.
Eng: Well you're not testing it the way you're supposed to. These are probably mostly user errors.
QA: No, they're not.
Eng: Yes, they are.
QA: No.
Eng: Yes.

Boring test cases (1)

Anonymous Coward | more than 2 years ago | (#39000307)

It consists of repeatedly running slight variations on predefined use-cases. Basically following scripts and recording your experiences.

i.e. boooooring.

Re:Boring test cases (5, Informative)

Anonymous Coward | more than 2 years ago | (#39000861)

Posting as AC but I've been a tester for over 10 years at different companies, many of them contract work. I very much enjoy the work.

Let me clarify many of the things about being a software tester (which can also include embedded software/firmware). From my perspective:

Following the test script as written is only a small amount of the big picture.

Issue characterization. It's not good enough just to report the issue. How often is it reproducible? Is it device specific? Configuration specific? Timing specific? Provide line-by-line steps to reproduce, the observed behavior, and the expected behavior. Is it only on first-time launch? Does it reproduce on a variant? For localization form and fit--even if the language is not fully understood--is there any truncation or overlap, and does text go outside the button areas? Does it always reproduce, or how many times out of how many attempts?
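That "how many times out of how many attempts" figure is easy to produce mechanically. A toy sketch, where the flaky bug is simulated purely for illustration:

```python
def repro_rate(check, attempts):
    # Re-run a reproduction attempt and report the "X out of Y
    # attempts" figure that a well-characterized report includes.
    hits = sum(1 for _ in range(attempts) if check())
    return "%d/%d" % (hits, attempts)

calls = {"n": 0}

def flaky_bug():
    # Simulated intermittent defect: fires on every third attempt.
    calls["n"] += 1
    return calls["n"] % 3 == 0
```

Here `repro_rate(flaky_bug, 9)` reports "3/9", the kind of number that turns "sometimes it breaks" into an actionable report.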

Severity determination--not everything is going to be a showstopper but properly rate the ones that are showstoppers. Also, a low severity defect can still have a higher priority if more than 50% or so of users will see the issue.

Exploratory testing skills are equally essential. Even after running the script line by line in order, what else wasn't covered? What if a different file is used, a purposely corrupted file--is the software still robust?

Quick turnaround on resolved issues. Verify the issue is fixed and close it, or reopen it with additional information on why it still occurs. However, if the original issue was fixed and a new issue is a side effect of the fix, most testing teams close the resolved issue and submit the new one separately (noting it in a comment at most), so that closing one issue does not lose the other.

Teamwork--collaborating with other testers, developers, managers, maybe even conference calls with outsource vendors.

Test case authoring. Even in a pure manual testing environment, existing test cases need to be updated (or removed if a function no longer exists), new test cases need to be added for new functionality, best coverage in the least amount of test cases is the goal.

Automated testing--developing test scripts in the automated testing environment. This is programming for testing purposes in many cases.

It all varies from company to company, project to project--but in a lot of cases being a software tester is not about just go to the next test case and mark it pass, fail, or blocked/not possible for each working day, every day.

I hope this helps.

What Does a Software Tester's Job Constitute? (5, Funny)

Anonymous Coward | more than 2 years ago | (#39000327)

A: Suffering

scripts (1)

X0563511 (793323) | more than 2 years ago | (#39000331)

To paraphrase an AC, you basically follow an established-by-development script over and over and over and over and over and over...

You do not think, you do not troubleshoot. You perform prescribed actions and check marks on your... checklist.

Re:scripts (2, Insightful)

Dastardly (4204) | more than 2 years ago | (#39000605)

Which as a developer I hate. What I really want testers to do is write automated tests. The only hand test a tester should run is the one that shows what the automated test should do. Yes, reality ends up being a mix, but reality should be informed by the ideal. It irritates the hell out of me that testers do the same thing over and over; that is what computers are for.

Re:scripts (5, Informative)

chuckfirment (197857) | more than 2 years ago | (#39000619)

I have to disagree. I've never worked somewhere like this, and I've been in SQA for many years at many jobs.

If you want to be a low-paid button pusher, yes. Do the same thing over and over, all day long with no deviation. If you want to enjoy your job testing, test the software. Try to break it. Troubleshoot. Do things the developers wouldn't expect. (After all, who expects an apostrophe in a name field? "We only expect regular text, right Mr. O'Hanlon?")

The job of a software tester (tester, not button pusher) is to try to find all the defects in the software and report them to development so they can be fixed.
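The O'Hanlon case above is also cheap to guard against once a tester has flagged it. A sketch using Python's built-in sqlite3, where parameter binding replaces string concatenation (the table and names are invented):

```python
import sqlite3

def save_customer(conn, name):
    # Parameter binding handles the apostrophe safely; pasting the
    # name directly into the SQL string is exactly the bug the
    # "O'Hanlon" test case exists to catch.
    conn.execute("INSERT INTO customers (name) VALUES (?)", (name,))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT)")
save_customer(conn, "O'Hanlon")
stored = conn.execute("SELECT name FROM customers").fetchone()[0]
```

The name round-trips intact; the naive concatenated version would have died on the quote character (or worse).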

BREAK IT (4, Insightful)

phrostie (121428) | more than 2 years ago | (#39000339)

Go down a list of features.
make sure they work one at a time.
then try to break them anyway.

Automation or manual testing? (5, Informative)

chuckfirment (197857) | more than 2 years ago | (#39000583)

OP is correct - the job of a software tester is to try to break the software. I've enjoyed working in software quality assurance (SQA) for over a dozen years now. I get paid to break things all day, and when I do break it - I don't have to fix it.

SQA is very different depending on where you go and what you're testing.

Web Applications - you'll want programming experience so you can write flexible automated scripts. You can test manually for every supported browser/OS combination, but it's tedious.
Desktop Applications - Sometimes manual testing is enough. If the software is large, you'll likely want to automate.

Large companies that move fast will want automation. Small companies that move fast will want automation, but might not realize it. You can get away with manual testing at small slow companies.

You don't need automation skills to be a software tester. You do if you want to become a software tester with a high income.

Testing scripts. Lots o' em. (2)

Kenja (541830) | more than 2 years ago | (#39000345)

There are several industry software testing platforms that allow for the creation of complex automated testing scripts. These scripts can be very complex, and constitute a programming language in their own right. The idea is to provide a script that the developer can run to recreate a set of error conditions so they can be fixed. Likewise, you end up with a series of scripts that, when run (often over the course of a day or more), test all aspects of the software and check for new errors or confirm functionality. These scripts have to be updated as the application changes, so if done right it's a full-time job.
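A data-driven suite in that spirit can start tiny. A sketch where each case doubles as a reproduction recipe for the developer (the function under test and the cases are invented for illustration):

```python
def normalize_phone(raw):
    # Hypothetical function under test: keep digits, take the last
    # ten when a country code or separators are present.
    digits = "".join(ch for ch in raw if ch.isdigit())
    return digits[-10:] if len(digits) >= 10 else digits

# Each case pairs an input with its expected output; a failing row
# is, by itself, the script that recreates the error condition.
CASES = [
    ("(555) 867-5309", "5558675309"),
    ("1-555-867-5309", "5558675309"),
    ("ext. 42", "42"),
]

def run_suite(cases):
    # Collect (input, expected, actual) for every mismatch; an
    # empty list means every case passed.
    return [(raw, expected, normalize_phone(raw))
            for raw, expected in cases
            if normalize_phone(raw) != expected]
```

Adding a regression case is just appending a row, which is what keeps a suite like this maintainable as the application changes.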

Re:Testing scripts. Lots o' em. (2)

afabbro (33948) | more than 2 years ago | (#39000463)

The job is probably somewhere between these extremes:

  1. As Kenja describes, managing a complex software testing platform: installing the software, licensing it, configuring it, writing the scripts for it, working with the dev team to make sure your testing is in sync with what they're doing, etc. Simply managing a complex product and all of its organizational interconnections could easily be a full-time job. Besides the product, you'd probably have to know whatever programming language(s) are being used (Java, C#, whatever), plus the app's own scripting language, plus possibly some ancillary languages - Perl, SQL, etc.
  2. Having a Word doc with a bunch of tests you are supposed to run every time they are ready to release a new version. You manually go through all these tests and send emails or enter a problem in a bug-tracking system once something breaks.

If they didn't say "we're looking for someone with GronkTest 3 experience" it's more likely to be nearer the latter than the former.

Software testing (0)

Anonymous Coward | more than 2 years ago | (#39000369)

It means:

a) Planning software testing (Which is the hard bit)

b) Testing the software (Which is the boring bit)

day to day (-1)

Anonymous Coward | more than 2 years ago | (#39000371)

lots of youtube and facebook, tell them that

Just say no (2, Insightful)

Anonymous Coward | more than 2 years ago | (#39000391)

If you are a software developer, do not take the job. Development is usually considered more skilled.

If you want to try, just ask for your current salary, and then you will have no problem, since they will say no.

In short they are looking at you as a sucker who will accept less pay with more skills.

Re:Just say no (2, Interesting)

Anonymous Coward | more than 2 years ago | (#39000557)

It depends. Often if you are moving to a better or bigger company they will start you in test. A lot of people move on, but test can be great.

You typically work fewer hours.
Test automation can often be more difficult than the engineering positions.
The pay can be just the same.


Anonymous Coward | more than 2 years ago | (#39000405)

Of ... What Does a Software Tester's Job Constitute?


Depends (0)

Anonymous Coward | more than 2 years ago | (#39000421)

If you're a contractor, expect to be abused. You might not be just responsible for testing.

You are going to be the one who knows the software (4, Insightful)

roman_mir (125474) | more than 2 years ago | (#39000431)

The truth is that if one wants to find out what software does under different conditions, he shouldn't go to the designer or the developer; he should go straight to the tester.

The job of a tester is to put together a meaningful plan - understand how the software is going to correspond to the business needs and test the main logical paths as well as some optional and failure paths and find out what the software really does as opposed to what people think it should do.

If the difference between what the software does and what is required is such, that the business will suffer because of it, this should be fixed, so this goes back to the developers.

Testers prepare test plans and test the software.

Good testers prepare the data and seed it, so that it is the same at the start of similar tests in each iteration.

Intelligent testers use various tools so that they don't do this by hand.

Excellent testers figure out what the business needs and actually provide good user-like (but better) feedback to the development.
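Seeding data so each iteration starts identically can be as simple as fixing the RNG seed. A sketch (the order records and field names are invented):

```python
import random

def seed_orders(seed, count=5):
    # Build fake orders from a seeded RNG: the same seed always
    # yields the same batch, so every test iteration starts from
    # identical data and results stay comparable across runs.
    rng = random.Random(seed)
    return [{"id": i, "qty": rng.randint(1, 9)} for i in range(count)]

baseline = seed_orders(42)
```

Calling `seed_orders(42)` again, in this run or the next, reproduces exactly the same records, which is the property the parent comment is describing.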

Re:You are going to be the one who knows the softw (1)

LesFerg (452838) | more than 2 years ago | (#39000781)

The job of a tester is to put together a meaningful plan - understand how the software is going to correspond to the business needs and test the main logical paths as well as some optional and failure paths and find out what the software really does as opposed to what people think it should do.

Yeah. What he said. Sure, misconceptions abound, but when the role really is "software test engineer", the job is a lot more important than just running scripts and trying to make the software break. I think this varies vastly between employers, and in some (or most?) situations management themselves probably don't understand what they should be asking for from their testers. The reality is that destructive testing covers only a portion of the job. To do it well takes a lot of knowledge and imagination: grasping the entire scope and complexity of the software, knowing what goes on in the back-end data, what users see and believe is going on in the particular parts of the software they use, and how an action in one person's role affects what is seen by a user in another role. The tester needs to see the entire view, which individual users may never be involved in.

I have great respect for the testers I work with (not least because they can handle a grumpy developer without hitting me) and often prefer to go to the tester instead of the B/A for answers. Development environments may differ, but when I have a nasty bug to hunt down that requires data to be set up in multiple integrated software solutions just to make the error occur between them, I can't imagine what I would do without somebody there with the knowledge and patience to go through the entire scenario to reach the point where he can say "now press that button and watch what happens".

Sure, it's still annoying when the tester comes back and tells me that little Bobby Tables just broke my database, but yeah, I guess they do need to do the simple basics first as well.

Re:You are going to be the one who knows the softw (1)

no-body (127863) | more than 2 years ago | (#39001317)


The job of a tester is to put together a meaningful plan - understand how the software is going to correspond to the business needs and test the main logical paths as well as some optional and failure paths and find out what the software really does as opposed to what people think it should do....

That's incorrect - a tester follows a test plan and performs the test steps. The test can be a program, and the tester evaluates whether the program's output conforms to spec.

A test plan is part of the design process and should run parallel to development.

What you describe is ad-hoc testing - finding additional defects by going outside a structured, predefined test plan.
There is no way that more complex systems - banking, Internet, manufacturing etc. would function without a highly structured approach including testing.

The term "Engineer" is undefined - can mean anything.

I used to be in FVT &SVT at IBM (0)

Anonymous Coward | more than 2 years ago | (#39000437)

I tested OS/2 Warp (That was when they took OS/2 1.3 and recompiled it under Visual Age for 32 bit) and OS/2 Power PC (IBM Mach kernel).

FVT - Function Verification Test - basically an app programmer who would create apps to make sure APIs do what they're supposed to do. SVT - System Verification Test - follow a script, day in and day out. Click on buttons, click on menus - down to every single one.

The FVT job was kind of fun. I learned things about app writing and OS internals that you'd never learn in school. I had to debug the OS many times.

SVT was boring as all Hell. Mindnumbingly monotonous.

For both departments, we had a script. In FVT you'd write a program that performed the script; in SVT you ran the script manually. That's what it comes down to.

makes COBOL look like a paradise (1)

spiffmastercow (1001386) | more than 2 years ago | (#39000439)

I pity our poor testers. I have to assume they must develop some sort of mental detachment to be able to sit there and test the same thing over and over and over. Go back to your cube. Spending hours trying to get the damn login screen to work in every browser might seem painful, but it's nothing compared to the hell that is testing.

Re:makes COBOL look like a paradise (1)

gestalt_n_pepper (991155) | more than 2 years ago | (#39000505)

That's why the good lord invented GUI scripting systems.

Re:makes COBOL look like a paradise (1)

Nerdfest (867930) | more than 2 years ago | (#39001125)

If you plan on testing something more than once, you're probably wasting your time if you're not automating it. Manual repetition is wasteful and unreliable, and it makes releasing new versions, or even patches for old ones, very expensive in time and money.

Re:makes COBOL look like a paradise (1)

Brian Feldman (350) | more than 2 years ago | (#39001245)

I'm sure there are also testers that think development is like this, a big inevitable slog. You are both wrong.

Are you Sure you are a Software Engineer? (3, Informative)

wwbbs (60205) | more than 2 years ago | (#39000441)

I've never heard of a Software Engineer who did not know what a Software Test Engineer does. Perhaps you mean User Acceptance Testing? You're the middleman who takes requirements from the client and makes sure that what the developers produce works within the framework the client provided. Generally you create mock-ups and run through the data until the results and the interface are what the client expects.

Re:Are you Sure you are a Software Engineer? (1)

arkane1234 (457605) | more than 2 years ago | (#39000577)

I was thinking the same thing.. it's like an auto mechanic taking the job of a salesman, and wondering if someone could give him tips on how to show someone whats under the hood...

Pretty wide berth of possibilities here. (4, Informative)

Lashat (1041424) | more than 2 years ago | (#39000443)

Anything from glorified mouse-clicker and result recorder on up to programming test cases and developing an automation framework.

The line blurs depending on WHO you work for and WHAT you work on.

My best suggestion is to ask the person offering the job what they have in mind for someone of your skillset.

Re:Pretty wide berth of possibilities here. (1)

jtara (133429) | more than 2 years ago | (#39000725)

Best answer so far!

I once did a contract job for Conexant, developing software to acquire test results and stuff them into a database, format reports, etc. (for DOCSIS cable modem qualification). Now I wish I'd never put that on my resume, because I get calls for "software test" all the time.

At Sony I got to see game testing. Now THAT was a three-ring circus! Signs posted about no sleeping in the halls, and no hats or sunglasses... Some of that testing actually is just playing unreleased games and recording your reactions to the game play.

I'm pretty good at "monkey testing" (pounding on a keyboard to try to make the software break), and there are cases where it's appropriate, but I find writing test plans and test cases tedious.

Rails programmers, of course, all write their own tests. (Riiiiiiiiiight!)

There really is no way to know just what the term means, because it applies to such a wide variety of activities.

QA (1)

khellendros1984 (792761) | more than 2 years ago | (#39000455)

We call them "Quality Assurance Engineers" (QA). Ours talk to the software engineers to build applicable test cases, and then continually run the tests on the platforms they've been assigned to work on.

Most of the job is fairly mindless, just reporting errors to the main developers as bugs so the devs can work on fixing them.

a lot recruiters talk out of there ass / have no i (2, Insightful)

Joe_Dragon (2206452) | more than 2 years ago | (#39000461)

A lot of recruiters talk out of their ass / have no idea about the job; sometimes there isn't even a real job there.

It is not well defined. (1)

stanlyb (1839382) | more than 2 years ago | (#39000469)

The real tester should have some developer skills and:
1. Run predefined scripts.
2. Follow predefined scenarios (in case it is not possible to use scripts).
3. Try to "crash" the program, and document it, i.e. prove that it is repeatable.
4. Be able to read the source code and even correct it, if the changes are insignificant.
5. Actually, you could say that the tester is a "junior software developer" who for one reason or another cannot move to the next level.
The reality is that nowadays you only have to do "1" and "2" to be considered a tester. Good luck.
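Items 1-3 can be sketched as a tiny scenario runner: run predefined checks, and rerun any failure to prove it's repeatable before reporting it. Everything below (the scenario names, the FakeApp class) is hypothetical, just to show the shape of the job:

```python
def run_suite(scenarios, app_factory, reruns=5):
    """Run predefined scenarios (items 1-2); rerun any failure to
    confirm it is repeatable before it goes into a bug report (item 3)."""
    report = {}
    for name, scenario in scenarios.items():
        if scenario(app_factory()):
            report[name] = "PASS"
        else:
            # A failure is only worth reporting if it reproduces consistently.
            repeatable = all(not scenario(app_factory()) for _ in range(reruns))
            report[name] = "FAIL (repeatable)" if repeatable else "FAIL (flaky)"
    return report

class FakeApp:
    """Stand-in for the system under test."""
    def login(self, user, password):
        return bool(user) and bool(password)  # rejects empty credentials

scenarios = {
    "login_accepts_valid": lambda app: app.login("alice", "s3cret") is True,
    "login_rejects_empty_password": lambda app: app.login("alice", "") is False,
    "login_rejects_empty_user": lambda app: app.login("", "s3cret") is False,
}
print(run_suite(scenarios, FakeApp))
```

On this toy build every scenario passes; in real life the interesting output is the "FAIL (repeatable)" entries.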

Testing is applied epistemology. (1)

gestalt_n_pepper (991155) | more than 2 years ago | (#39000477)

You have software with expected behaviors. First you verify that those behaviors occur. If they don't, you have a fail. You formulate a theory as to what might cause the fail and you test to see if you can reproduce it. If you can consistently create the fail, you report it to development, who may fix it if it's severe enough. It's somewhat analogous to the scientific method, albeit writ small.

That's the box top explanation. In practice, of course, it's *much* more complicated.

YMMV (1)

alesplin (1376141) | more than 2 years ago | (#39000483)

Wow. Sounds like some test engineers out there with sucky jobs.

My last job was as a development engineer. We tested our own code while in development, then turned it over (along with any relevant and useful test cases) to the QA team.

I recently took a position as a Software Test Engineer at a different company. Our philosophy here is that we own the quality of the product. Granted, it takes a little more creativity to write tests to break an OS than an application, but my job here is quite challenging, with very little of the "run script, make checkmark" drudgery so many others seem to endure.

To the OP: ask lots of questions about what you'll be expected to test and how. Nobody likes drudgery, and being given a certain amount of leeway and ownership makes the job much better.

you are the person to blame (1)

cod3r_ (2031620) | more than 2 years ago | (#39000485)

Gotta blame someone... it's easy to get a tester and just blame them for not catching something.

I am a software tester. (4, Informative)

CannonballHead (842625) | more than 2 years ago | (#39000491)

I'm a software tester for data moving products.

While a lot of testing is repetitive, the repetitive stuff can often be automated. For example, there's functionality that exists in every release ... so automating those testcases such that they are easy to run hands-off is good. This automation is often something the tester will be doing.

For new features, what typically happens in my group is that the developers will explain how they implemented a given feature and how it should work. We are responsible for testing this feature - with any tips that dev gives us - as well as trying to put it through various scenarios that cause it to break.

In my product's line, for example, we do clean reboot and power cycle/crash testing. What happens to our product when the power goes out? What happens to the data that was being moved? Does it recover? That sort of thing. This requires thought - and, contrary to some comments here, since we all want our business to SUCCEED and make money, which means customers need to be happy with the product, development is happy when we find errors or scenarios that they did not plan for in their coding. The earlier we find them, the better.
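The power-cycle/crash scenario above can be toy-modeled in a few lines: does a write that gets interrupted mid-flight corrupt the data, or does the old state survive? This is only a sketch of the idea (the file names and the "crash" are simulated, not a real power cut), using the temp-file-plus-atomic-rename pattern:

```python
import os
import pathlib
import tempfile

def atomic_write(path, data):
    """Write via a temp file plus os.replace, so a crash mid-write can
    never leave a half-written destination file behind."""
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    try:
        with os.fdopen(fd, "w") as f:
            f.write(data)
        os.replace(tmp, path)  # atomic rename on POSIX and Windows
    except BaseException:
        os.unlink(tmp)
        raise

def crashy_write(path, data):
    """Simulate losing power after the temp file is written but before
    the rename: the destination must be left untouched."""
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    with os.fdopen(fd, "w") as f:
        f.write(data)
    # ... power goes out here: os.replace never runs ...
    os.unlink(tmp)  # tidy up the orphaned temp file for the demo

target = str(pathlib.Path(tempfile.mkdtemp()) / "state.txt")
atomic_write(target, "v1")
crashy_write(target, "v2-partial")
print(open(target).read())  # prints "v1": old data survived the "crash"
```

A real crash-consistency test would pull the plug (or kill the process) for real and inspect the disk afterwards; this just shows the property being tested.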

Day to day activities? Well, I'd break it into two major sections.


The test group, in my case, is responsible for reading through the planned product's features and changes and coming up with a test plan accordingly. This is then reviewed by developers, the test group, etc. Usually, this time period is when a lot of work on automating previously-manual testcases can be done, in preparation for the next release. Also, planning for what testing environments will need to be set up, and starting to set them up... it depends how big your group is, I guess. Since mine is relatively small, all the testers help out with setting up the various machines for testing, too.


During testing, the test plan is executed. Day to day activities include test environment setup, manual testing, automated testing, discussing potential issues with developers and opening work requests (WRs) if it's decided that it is really an issue and not a weird environmental problem etc.

Need more input (1)

K. S. Kyosuke (729550) | more than 2 years ago | (#39000495)

"Can anyone shed light on what a software test engineer's day to day responsibilities are?"

Making yourself very unpopular among your fellow developers! :]

Seriously, that really depends on what exactly is being tested and how the testing is organized. I've seen people with the description of "tester" sitting in a single room doing totally different things. Have some more input data?

(But one virtually bulletproof hint is: If you don't know Python, learn it. It might save your life if you ever decide to test anything.)

Variety Pack! (4, Informative)

Isarian (929683) | more than 2 years ago | (#39000511)

As a software tester at my job, my work includes:

- Building test scripts (I use Google Docs spreadsheets) for each application that we develop
- Performing feature-specific or fix-specific regular testing of applications during the development cycle
- Arguing with developers over the severity of bugs
- Coordinating full-scale software testing before each release
- Updating documentation when developers fail to do so
- Arguing with developers over the importance of different features in terms of development time

A big part of what makes or breaks you as a software tester is the willingness to go off the beaten path. For example, when I test, this is what I consider:

- Hmm, that's an interesting text field, and it's meant for an IP address. I wonder what happens if I type "abc::1234**!!whymeeee" into it (input validation)
- This is a resizeable dialog - if I resize it absurdly in vertical/horizontal, do elements in the dialog scale correctly?
- Here's a text area that's meant for a paragraph or two of text. If I put the Iliad into it, does the text run off the page? (bounds checking, text limit checking)
- Here's a dialog that has to validate text - what are all the possible errors it could encounter, and are the error dialogs properly implemented for each? (check all error condition handling possibilities)
- This dialog is localized into 15 languages - is the page sized/formatted correctly in all languages?
- This program is meant to be installed to C:\Program Files\Blahcompany\Product - what if I install it to a nonstandard location?

This will ultimately put you at odds with a lot of developers because your job, every day, is to make the assumption that they have made mistakes that you will find. I enjoy it, and find it to be a rewarding experience, but that's because I work at a company that highly values its software testers and takes QA as a serious priority. Try to get a feel for how this company treats QA, because if all they're doing is using you as the fall guy for bugs you made them aware of before a release, it'll be no good.
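The first bullet above (the IP-address field) turns into an automated check very naturally. A minimal sketch, using Python's standard `ipaddress` module as the validator under test (the function name is made up for the demo):

```python
import ipaddress

def is_valid_ip(text):
    """Input validation for an IP-address field: accept only strings the
    standard library can parse as IPv4 or IPv6."""
    try:
        ipaddress.ip_address(text)
        return True
    except ValueError:
        return False

# The hostile inputs a tester feeds the field, alongside the happy path:
cases = {
    "192.168.0.1": True,
    "::1": True,
    "abc::1234**!!whymeeee": False,  # the garbage string from the list above
    "999.999.999.999": False,
    "": False,
}
for text, expected in cases.items():
    assert is_valid_ip(text) is expected, text
print("all IP-field cases behaved as expected")
```

The point is the table of cases, not the validator: a tester's value is in thinking up the rows the developer didn't.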

Re:Variety Pack! (1)

Ralph Spoilsport (673134) | more than 2 years ago | (#39000733)

GOOD LIST! I would add that there are other aspects as well-

There's a "horizontal" distinction in testing, by colour: Black Box, grey box, and white box.

Black box is what the post above is doing - everything "under the hood" is not to be looked at, only the results. This is the easiest form of testing. It can be automated using scripts, if the UI's field order is stable. I like black box - you just get to "fuck shit up". My fave was to copy and paste the Unabomber manifesto into everything and watch the fireworks. Also, in the early stages, black box can have a lot of say in the design, because if the UI is crap, the black box tester will notice immediately - the workflow is awkward, or there are too many clicks to get a result, or similar issues. Also, black box usually entails running the daily build server, and doing a daily test of the build to see if it meets minimum criteria. If it doesn't, then the programmers are informed not to use that build and to stick with the old one until it passes muster.

Grey Box is in between - you're looking at the code when you can, and you're developing test scenarios for scripting engines of other systems. For example: you need to test load balancing - so you need to know where in the code stuff goes in and out and how the server takes the data, processes it, and sends it out - that takes a bit of programming - the programmer can write that for you - and then it's your job to suss out what the results mean. Also, when something breaks, you need to find where in the code it's broken. How to fix it is not your job or problem - but you need to be able to say "this is where it divides by zero and causes the computer to puke blood".

White box is where you're managing low-level testing of each software component (after the programmer runs it to see if it works at all), and if it doesn't work, you need to be able to flag it properly and send it back saying "it broke here because..." kind of stuff. I find white box even more stressful and difficult than actual programming. You have to be a "programmer's programmer". Pain in the ass, and rarely worth the money or hassle.

I like doing black box testing. It's a blast. I love to just fuck shit up. It's a trip. It's a bit like doing detective work. Grey box is more intense, but can also be fun, depending on the app you're testing and the team you're with. If they hired by the "No Asshole Rule" then grey box can be rewarding - it pays better than black box and it often includes a lot of black box.

That's how it was when I left software dev in 2005. From what I gather it hasn't changed that much - it's more automated than before, but only after the app is mature enough to withstand it.

Re:Variety Pack! (1)

Anonymous Coward | more than 2 years ago | (#39000873)

This is pretty much spot on. I'd include:

  • Automated testing suites and analysis of functional test results.
  • Planning, executing, and reporting of performance and load tests.
  • Regulatory compliance testing and documentation. Tell them you are a Verification and Validation Engineer and your pay will be double :)
  • Bug investigation and reporting on categorization/repeatability/severity.
  • Risk/hazard analysis and mitigation for legal compliance and regulatory compliance.

Re:Variety Pack! (1)

imgunby (705676) | more than 2 years ago | (#39001055)

I'd like to add a few more examples of what we think when looking at a feature/web page:

- Here's a required field, if I post the form and omit the name-value pairs, does the application reject the call?
- Here's a date field, does it reject dates too far in the past or future (01/01/10000) and reject/convert dates that don't exist (02/29/2000 or 03/54/2012)
- Here's an input field that holds text that will be shown on other pages, what happens if I put HTML in there?
- Here's an input field, will it correctly handle new name-value pairs (&this=that&)?
- Here's a signin form, if I enter a bad username or password, will it provide hints that can be used to mine usernames?
- Here's a forgot password form, can it be used to mine valid email addresses?

But people also need to keep in mind that we're *assurance* and not *insurance*. While we can catch a lot of bugs before the end-users see them, it really is up to the developers to produce quality code in the first place. There are few things harder, more expensive, and time consuming than trying to test quality into code.
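The date-field bullet above is another check that's easy to automate. A minimal sketch - the format, the bounds, and the function name are all assumptions for the demo - using strict `datetime.strptime` parsing to kill the nonexistent dates:

```python
from datetime import datetime

def is_valid_date(text, earliest=1900, latest=2100):
    """Reject dates that don't exist and dates implausibly far in the
    past or future. Format (MM/DD/YYYY) and bounds are demo assumptions."""
    try:
        parsed = datetime.strptime(text, "%m/%d/%Y")  # strict: rejects 02/29/2001
    except ValueError:
        return False
    return earliest <= parsed.year <= latest

cases = {
    "02/29/2000": True,    # leap day that really existed
    "02/29/2001": False,   # nonexistent leap day
    "03/54/2012": False,   # day out of range
    "01/01/10000": False,  # five-digit year, rejected by the 4-digit %Y
    "07/04/1976": True,
}
for text, expected in cases.items():
    assert is_valid_date(text) is expected, text
print("all date-field cases behaved as expected")
```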

Testing != button pushing (0)

Anonymous Coward | more than 2 years ago | (#39000517)

It can vary wildly. I work for a very large software org, and our test team is so varied in skill set and role that it would be hard to describe it succinctly. Everything from scalability testers, who manage the scaling of our products to 1000s of instances and servers and 100,000s of users, to interop testers who create realistic wide-scale deployments based on customer feedback and test product interop. We have formalised functional testing to break/test/fix software, and we have automation engineers who create repeatable, structured tests which can run many times per day as changes are made to the codebase. Consequently, our test teams include some of the best-paid engineers going - and can easily progress into senior management and director roles within the wider development team.

Suicide (1)

19thNervousBreakdown (768619) | more than 2 years ago | (#39000525)


Re:Suicide (0)

Anonymous Coward | more than 2 years ago | (#39000663)

You are in the wrong forum for this [] type of advice [] .

Re:Suicide (0)

Anonymous Coward | more than 2 years ago | (#39000771)

the stupid translation there is all screwed up.

The names are Elizabeth Petsilia and Anastasia Koroleva.

Testing is as testing does. (0)

Anonymous Coward | more than 2 years ago | (#39000531)

Testing software can mean different things. It depends on the sponsor of the testing. If you are testing on behalf of a customer, it is more evaluation oriented: is it good enough for our needs? If you are testing on behalf of a development sponsor whose intent is to sell it, they may want a more rigorous process that can include automated testing (writing code to exercise the application and demonstrate fitness), bug hunting (including corner cases, user simulations, etc.), and stress and load testing (spin up a bunch of worker threads and pound on something, be it a library (e.g. a .dll) or a service (like a db or web site)). You may also get involved in improving the code's testability, which means examining the source and finding ways to structure, configure, and expose it so that you can do more specific testing.
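The "bunch of worker threads pounding on something" idea is the skeleton of every load test. A minimal sketch (the target here is a trivial stand-in, and the worker counts are arbitrary):

```python
import threading
import time

def pound(target, workers=8, iterations=200):
    """Hammer a callable from many threads at once, collecting latencies
    and errors - the skeleton of a stress/load test."""
    latencies, errors = [], []
    lock = threading.Lock()

    def worker():
        for _ in range(iterations):
            start = time.perf_counter()
            try:
                target()
            except Exception as exc:
                with lock:
                    errors.append(exc)
            else:
                with lock:
                    latencies.append(time.perf_counter() - start)

    threads = [threading.Thread(target=worker) for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return latencies, errors

# Stand-in for the library call or service request being pounded on:
hits = []
latencies, errors = pound(lambda: hits.append(1))
print(len(hits), len(errors))  # 1600 calls made, 0 errors seen
```

In a real load test the target would be an HTTP request or a .dll entry point, and the interesting numbers are the latency percentiles and the error count under concurrency.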

You might be testing in a small office, doing whatever the program manager wants. That is often the case. You may write test plans and test cases. You almost certainly will discover, write, and resolve bug reports and report that to your sponsor.

Simple. (1)

dalmor (231338) | more than 2 years ago | (#39000533)

Obvious statements are obvious... come on, guys, sure a software tester can simply follow a checklist. Can't say what a screener is looking for (i.e. scripted responses), but as an interviewer I would look for...

1) Passion to understand things and constantly dig deeper
2) A natural distrust of the quality of code
3) A passion for security as well (it is another form of bugs)
4) Someone who is as willing to stick to their guns (can believe 100% they are right) as to back down (not 100% ego)

Following scripts or going down feature lists... anyone can do that, but if you don't have at least some of the drive listed above, you will burn out fast.

It could be something good ... (1)

perpenso (1613749) | more than 2 years ago | (#39000551)

It could be something good. At a past employer we had software engineers who developed the product (six-figure telecommunications boxes) and software engineers who were part of the QA team. Those on the QA team wrote code to test subassemblies during manufacturing, to test fully assembled boxes before delivery to customers, and regression tests to exercise the development team's software in simulated environments (i.e. testing development versions of code before approving the code for release), etc.

It's all about Agile! (0)

Anonymous Coward | more than 2 years ago | (#39000571)

There is a book called Agile Testing - read that. Also, read a free PDF called Scrum and XP From the Trenches. After that, learn about the paradigms of different kinds of testing: A/B testing, automation, and the standards around how test cases are devised.

Depends on the company and the software package. (2)

Vandilzer (122962) | more than 2 years ago | (#39000607)

If they have a good regression system set up, you will write a test case once and it will run automatically for you; if you are good, you will automate all of this if it is not already done. You will only go back to look at these if something breaks.

You will also likely get the specifications for a new component and a script, and be expected to try all possibilities; the better ones know how to program and thus do things that they know developers would likely miss. Bad testers are a dime a dozen, good testers are golden.

Generally, think about it like being the first customer and having to learn new components all the time. A person whose job it is to figure out all the ways to use something within the rules given (not that much different than hacking; actually, if you think about it, hackers are just really good testers - they win when they find a bug).

Testers are also often involved in customer deployments and trials here where I work, because they have the most experience using new software. Developers know how to write things, testers know how to use them (big difference).

In the end, you will make what you will of it. Go talk with them, all you have to lose is a bit of time.

Best of luck,

I'm a programmer, not an English major; if you have a problem with my writing, write a better grammar and spell checker, then we can talk.

Re:Depends on the company and the software package (0)

Anonymous Coward | more than 2 years ago | (#39001005)

This. It's incredibly variable.

I work for a company that produces an app that uses a MS SQL backend and a Visual FoxPro frontend (yes in 2012, we still use FoxPro). We have *no* automated testing. Testing is done by reading a bug report / feature request and making sure that the compiled version meets the specification. Of course, because we have no automated testing this means that we manually have to use the feature how we think a customer might use it.

Because our codebase is so old and filled with spaghetti (and because we have no formal test cases -- and sometimes some behavior is simply undefined due to oversights in the specification) the QA people have to constantly try to keep the program in working order via manual regression testing. Manual regression testing is mind-numbingly boring. You really want to kill yourself after running the same test over and over again, month in month out.

Being dumb (1)

rsilvergun (571051) | more than 2 years ago | (#39000631)

Seriously. The job of most software testers is to try the software and then complain it's too hard to use so the UI guys can tweak it down to a 3rd-grade level or so. I guess there's regression testing too.

Golden Rule of Testing. (2)

Modern (252880) | more than 2 years ago | (#39000649)

Those who can, do. Those that can't, teach or QA.

While not always true, you have probably heard the above statement many times. Testers in many cases are responsible for bringing the whole project together. Other departments might make portions - a graphics/animation department, software programmers, maybe math groups or database - but you are the one who has to test all the components together. For all this responsibility you get the least amount of pay, and are expected to know everything about all the parts and pieces. "Blame QA" is the common cry heard in companies when something buggy gets out. The job itself could be ungodly script-running repetition, or pretty cool new-project stuff that has not yet been automated, on the bleeding edge of the company's new products (with ungodly script repetition to follow later).

It tends to be a pretty thankless job, but in some cases it could be a good way to get into a company and use the job as a stepping stone to a better-paid, less responsible software developer (or other) job at that company.

Be very carefull (1)

nrasch (303043) | more than 2 years ago | (#39000671)

I think you ought to be very careful about going into QA. At many jobs that simply means pushing buttons and running the software. You certainly won't be adding any development/code-writing skills to your resume at a job like that.

Also, the more you do something, the more companies will only want to hire you for that one thing. So, if you want to be a developer or remain a software engineer, I'd think twice before filling your resume with QA positions. You'll likely only be considered for that position going forward.

My $0.02, and good luck with whatever direction you end up going. :)

Re:Be very carefull (0)

Anonymous Coward | more than 2 years ago | (#39000739)

You are right, nrasch

Are you out of work? (0)

Anonymous Coward | more than 2 years ago | (#39000707)

Going into QA is career suicide for a software engineer. Seriously this is a big step down.

Huh? (0)

Anonymous Coward | more than 2 years ago | (#39000709)

Wait. You're a software engineer and you don't know what a software tester does??

Re:Huh? (0)

Anonymous Coward | more than 2 years ago | (#39000871)

Which means he's perfect material for being a tester! 99% of them don't know what their job is either. Most of them think their job is just to point-and-drool.

Test Engineer (2)

chrismcb (983081) | more than 2 years ago | (#39000713)

A tester tests the code. A test engineer typically will write code to test the code. These aren't unit tests they are writing, but test scripts. Testers tend to test the finished product; they are the ones doing integration testing. If you are currently a programmer, you probably will not want to be a tester.

from Project Management perspective... (2)

charlieo88 (658362) | more than 2 years ago | (#39000757)

What Does a Software Tester's Job Constitute? Well, from the perspective of Project Management for my off shore outsourced projects... NOT A DAMN THING!

comic (1)

MagicM (85041) | more than 2 years ago | (#39000767)

Everything I know about testing, I learned from The Trenches [] .

Oh, and doing it myself, of course. Of course I test my code. Why? Who's asking?

One part of the job... (0)

Anonymous Coward | more than 2 years ago | (#39000787)

I worked testing software for phones (many different Android phones, mostly) for a while, and often the test instructions were "Play with it for 30 minutes and see if it breaks." And fairly often it did.


DontBlameCanada (1325547) | more than 2 years ago | (#39000789)

Once you have a "testing" job on your resume, it's pretty dang hard to get anyone looking for a coder to consider your application seriously. I've run into at least a dozen hiring managers who discard resumes for dev positions solely because the last job someone held was as a tester.

more thorough (0)

Anonymous Coward | more than 2 years ago | (#39000793)

Having dedicated testers doesn't relieve programmers of their own testing duties. A programmer writes tests for the data they expect, and programs for that data. A tester writes tests involving all kinds of data. He's expected to be pretty well trained on security issues, and will get to know how the program reacts to different circumstances as well as or better than the programmers. He works closely with software architects and others in translating storyboards into automated tests too.

It's just what it says it is.... (2)

NotSanguine (1917456) | more than 2 years ago | (#39000807)

Software test engineer was my first technology job, so I was pretty junior. Your responsibilities might be more extensive if you have more experience. That said, my responsibilities included:
Overseeing the build process
- Maintain build scripts/makefiles, manage software repositories
- Grab the latest code from all modules and build from source
- If the software didn't build properly, identify the source of the problem and (if I could) fix it, otherwise provide feedback to the responsible developer

Integration testing
- Create and maintain test environments and scripts
- Once the software was built, test major functionality to ensure that new code didn't break anything
- Test whatever functions had been modified (new features, bug fixes, etc.)
- Debug any issues and (if I could) fix them, otherwise provide feedback to the responsible developer(s)

Reproduce bugs that had been escalated to the dev team
- Debug the issue and identify the source
- Fix the bug (if I could), otherwise provide feedback to the responsible developer(s)

Software releases
- Build release versions (beta and production) for distribution to customers and manufacturing
- Regression test release versions

Other stuff
- Create test reports
- Manage bug tracking
- Set up dev environments for the developers
- Design and set up test beds for testing and troubleshooting

Developers were responsible for unit testing their code before releasing it to me for Integration. That may or may not be part of the job you're looking at.


A QA Engineers testing tools and skills (2)

dotrobert (923547) | more than 2 years ago | (#39000917)

What software/application you are working on can really change the equation, just as it does for the application developer. Here is a short list of software and skills I use in testing a publicly available web application; I'm sure I'm missing things, it's just a quick list...
  • Tests are written in Java.
  • TestNG for managing what gets tested and defining the data providers
  • Jenkins for continuous integration and scheduled testing
  • Selenium to drive the web app (a proxy that gives us pretty darned complete access to a web page)
  • SQL, most often from the Java code, but sometimes manually in a SQL query client to verify data.
  • Linux - general knowledge to help track down the source of problems on occasion. We don't like to throw "dumb" reports back to the devs.
  • Apache/Tomcat - we write log parsers to verify some activities, and sometimes just tail/grep them in narrowing down the cause of a bug.

And of course we manually test, too. But, damn, is that boring... :)
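The log-parser item above is a classic small QA tool. A toy sketch that pulls the 5xx responses out of Apache-style access-log lines (the log format here is simplified, and the sample lines are invented):

```python
import re

# Toy matcher for Apache-style access log lines (format simplified).
LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<request>[^"]*)" (?P<status>\d{3}) \S+'
)

def server_errors(lines):
    """Pull out the 5xx responses that would go into a bug report."""
    findings = []
    for line in lines:
        match = LINE.match(line)
        if match and match.group("status").startswith("5"):
            findings.append((match.group("ip"), match.group("request"),
                             match.group("status")))
    return findings

log = [
    '10.0.0.5 - - [01/Feb/2012:10:00:01 +0000] "GET /login HTTP/1.1" 200 512',
    '10.0.0.5 - - [01/Feb/2012:10:00:02 +0000] "POST /checkout HTTP/1.1" 500 74',
    '10.0.0.9 - - [01/Feb/2012:10:00:03 +0000] "GET /health HTTP/1.1" 503 0',
]
print(server_errors(log))  # two 5xx hits worth investigating
```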

Run! (3, Insightful)

turgid (580780) | more than 2 years ago | (#39000929)

I was looking for work recently, too, and got lots of calls from recruiters looking for Software Test Engineers.

I've worked very hard to get to being a developer, so I resisted and eventually got a better paying and infinitely more stimulating development job.

Software Test Engineers are sort-of developers, but the emphasis is on understanding requirements, to be able to implement a "test matrix" that will (perhaps exhaustively) exercise a system (hardware and software as a whole) through all of the "use cases" a user might be expected to perform, i.e. how J. Random Luser will use the product.

Practically, this means implementing hundreds (or maybe thousands) of automated tests driven from something like FitNesse. If you're lucky, you'll get to implement your test cases in something like Ruby. If you're unlucky, it might be C#...

It's quite a skilled job. You need to know a bit of statistics (statistical significance, confidence levels, variance and all that), about Combinatorial Testing (test coverage) and a bit about scripting and good software design. You would also need to understand the difference between white- grey- and black-box tests and when they are appropriate.
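The "test matrix" and combinatorial-coverage idea is easy to make concrete: given a few configuration dimensions, exhaustive coverage is their full cross product (which is why the matrix explodes, and why pairwise coverage exists as a compromise). A sketch with made-up dimensions:

```python
import itertools

# Configuration dimensions for the system under test (invented for the demo).
dimensions = {
    "os":      ["win7", "ubuntu", "osx"],
    "browser": ["firefox", "chrome"],
    "locale":  ["en_US", "de_DE", "ja_JP"],
}

# Exhaustive coverage: the full cross product of every dimension.
names = list(dimensions)
matrix = [dict(zip(names, combo))
          for combo in itertools.product(*dimensions.values())]

print(len(matrix))   # 3 * 2 * 3 = 18 configurations to run
print(matrix[0])     # {'os': 'win7', 'browser': 'firefox', 'locale': 'en_US'}
```

Add a fourth dimension with five values and the matrix jumps to 90 runs; a pairwise (2-way) selection would cover every value pair with far fewer configurations, which is the statistics-flavored part of the job.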

There are two ways in: from being a "tester" upwards, or sideways from being a developer. (Note I didn't say "down." It's a skilled job, but I've done it for a few weeks for the experience and I got bored quickly.)

In the spirit of cost-cutting, most Western companies are offshoring their testing. For example, Xerox just handed their manual and automated testing over to HCL in India. McAfee have done the same.

Stick to development unless you're starving/about to have your house repossessed or want a little extra experience on the side (which is a good thing for your CV as long as you don't make it your whole life).

One last thing: clueless recruiters see "Test-Driven Development" on your CV and think, "Aha! A software tester!" I had hundreds of phone calls under that misapprehension. (They also don't know what a kernel is (Windows and Linux both have one so they must be the same, right?) or the difference between C, C++ and C#...)

Make sure the position is crystal clear (2)

Keith111 (1862190) | more than 2 years ago | (#39000941)

At Microsoft, for example, there are 2 test positions which sound the same but are VASTLY different. "Software Development Engineer in Test" and "Software Test Engineer" are the two positions, but SDET is given a lot more responsibility and gets to do a lot of development and works very closely with the developers and others on the team. STE generally gets little development and is all about making sure the test environment that actually execute tests is working and making sure the infrastructure is happy and the network is working. So my advice would be to make sure that you ask about other testing positions and discuss the differences and make sure you are getting the right thing. Testing is awesome and fun and very important and usually more difficult to do right than development but not all testing positions are created equally.

At our company... (1)

Anonymous Coward | more than 2 years ago | (#39000971)

Our company uses Agile as our development methodology. A task for our testers, from start to end, usually looks like this:

1. A new bug fix or feature request gets created in Jira (task management system). If it's a bug, the tester tries to reproduce it on their system. If they can't, they discuss it with the submitter until it's reproducible, or determined to be user error.

2. In the backlog grooming (thinking about and estimating the time cost of new feature and bug fix requests), the tester asks questions about what the final system will look like and do when finished. In the sprint planning meeting (assigning tasks to developers) they try to get a list of requirements that are, well, testable.

3. When the tester leaves the meeting, they start writing some test cases. "User clicks here, this happens. User enters this value, this happens." This includes lots of corner cases, like opening files stored in locations where the directory has a space in the name (i.e. "C:\Program Files"), or inputting negative values for things that should only be positive, or clicking outside the dialog window. They then code up some test scripts with Marathon, using Jython.

4. When the developer marks the feature as resolved, the tester gets the latest updates from subversion, runs it with Maven, and starts going through the test cases they wrote. "When I clicked here, did this happen? When I entered this value, did that happen?" They also run the Marathon test scripts.

5. If the feature or bug fix passed all tests, the issue gets marked as being closed in Jira. If it doesn't pass all tests, then it gets kicked back to the developer who is responsible for the changes. Sometimes the requirements need to be renegotiated with the product owner.

This involves a lot of collaboration with the developers, product owners, and product support personnel. Qualities possessed by successful testers are: an attention to detail, perfectionism, creatively thinking outside the box, imagining what a user might do, imagining what a stupid user might do, communicating observations, comprehending vague requirements, writing documentation, and being able to politely goad developers into finishing their tasks well before the deadline so you have enough time to test. Notice that coding is not part of this? Also note that unit tests are the developer's job.

In case you're wondering, I'm a developer. My team's tester has a background in writing technical documentation, and has nearly completed her Masters degree in Project Management, but has no background as a coder. She's extremely good at what she does.

Good Test Engineer == Dev/QA Toolsmith Automator (1)

PureFiction (10256) | more than 2 years ago | (#39000989)

Your development background will be very useful in a QA / Test Engineer role, assuming you are considering joining a technically competent organization.

I say this because many companies have an antiquated view of "testers" as low-skilled keyboard jockeys able to bang keys and input fields like monkeys on Ritalin. Avoid these places like the plague...

A premium QA/Test Engineer will apply development and other solid technical skills to:

- Provision test systems spanning a wide variety of operating systems, network configurations, applications, and settings; in short, be able to build everything you need to test the systems tasked to you.

- Obtain a deeper understanding of the system under test; able to dig into code to discern logical errors and oversights, triage down to root cause and even suggest a fix/patch.

- Integrate test automation technologies into the software process so regression and performance testing is part of a continuous integration & test lifecycle. Manual testing should only be a part of your efforts, as software systems continually expand in scope and a manual-only test process will eventually be overwhelmed by progress.

- Extend and apply third party tools, ranging from code performance analyzers to network traffic capture/replay, code coverage analysis and unit test frameworks, fuzzers and chaos monkeys, etc.

- Understand security risks and defensive coding techniques to identify deficiencies in a code base or implementation/design which introduce vulnerabilities. Catching these defects before a product goes live is very rewarding and can be exceptionally cost effective.

- Develop internal tools or customize existing software using shell, Perl, Python, Ruby, Java, C/C++, and other languages as required or appropriate for the task at hand.

- Communicate effectively with multiple stakeholders in an organization: development, product support, marketing, administration, operations. All of these will interface with you, and the ability to tailor the technical depth and nomenclature of your written and oral communications to each group is critical to being an effective QA/Test Engineer.

And many other skills and capabilities I've not listed, depending on the context of your role in the group and the domain of the organization you work for.
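As one concrete illustration of the "fuzzers and chaos monkeys" point above, here is a toy fuzzer sketch. parse() is a hypothetical stand-in for a function under test, not any real library call; the fuzzer's job is to flag crashes outside the documented failure modes.

```python
import random

def parse(data):
    # Hypothetical stand-in parser: expects ASCII "key=value" input.
    text = data.decode("ascii")          # may raise UnicodeDecodeError
    key, _, value = text.partition("=")
    if not key:
        raise ValueError("missing key")
    return key, value

def fuzz(iterations=200, seed=42):
    """Throw random byte blobs at parse(); collect any unexpected crashes."""
    rng = random.Random(seed)            # seeded, so failures are reproducible
    failures = []
    for _ in range(iterations):
        blob = bytes(rng.randrange(256) for _ in range(rng.randrange(1, 16)))
        try:
            parse(blob)
        except (UnicodeDecodeError, ValueError):
            pass                          # documented, anticipated failures
        except Exception as exc:          # anything else is a bug to report
            failures.append((blob, exc))
    return failures
```

Seeding the generator is the important design choice: a failure you cannot reproduce is a failure you cannot file a useful bug report about.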

Many people still consider QA a less important or prestigious occupation compared to other technical professions, like software development. While the prestige may be lacking, the job satisfaction of a competent QA/Test Engineer who applies development, operations, and security analysis skills to improve a product is significant.

The varied resources you should incorporate into your tester toolbox are too numerous to list here. Many sites are devoted to the QA toolsmith / test automation / security analysis roles, and you're going to want some skills and tools from all of these specialties at your disposal.

Good luck! I hope you consider the switch; the world needs more competent QA/Test Engineers.

QA Engineer or QC tester? (0)

Anonymous Coward | more than 2 years ago | (#39001075)

Being a QC tester or QA engineer can involve anything from monkey grunt work (clicking through menus and following test cases while writing up bug reports) to establishing test cases, standing up testing environments, and developing scripts and metrics to configure those environments and take benchmarks. To me, QA engineer leans more toward the latter and QC tester more toward the former.

The qualifications the recruiter is asking for are probably a good indicator of where on that scale you'd fall. A QA engineer will sometimes do some of the grunt work of a QC tester if the department is small enough, but a QC tester rarely does what a QA engineer does, because they tend not to be qualified for it. A lot of software companies will hire new developers in-house from among their QA engineers, because they are far more familiar with the dev team's coding standards and practices than more qualified people from outside the company.

Our department's QA engineer has basically been working two full-time jobs, filling the roles of both a QA engineer (establishing test cases and testing processes) and a QC tester (doing the grunt work of going through hundreds of tests multiple times across various environments). We're taking on a small team of QC testers in the coming quarter, after which he'll be able to take on the QA engineer role full time. There is definitely a difference between the two, and recruiters and employers often muddle the terms.

What my friends have told me ... (1)

Skapare (16644) | more than 2 years ago | (#39001097)

... is that I should have been a software tester, because it seems that with every program I touch I find something wrong or manage to break it somehow.

QA responsibilities (1)

Anonymous Coward | more than 2 years ago | (#39001195)

Day to day responsibilities for a Software Test Engineer or QA can involve several things:

Design test cases (think of ways to test the requirements of the software and document that knowledge),
Submit bug reports and feedback (find an exception, crash, incorrect data or calculation, usability, "it's ugly", performance),
Design/write automated scripts,
Design test harnesses (write utility programs and interfaces that hook into the software being tested to get at data or features that are otherwise unavailable or not directly testable by a person),
Verify code changes for new feature implementations or bug fixes,
Manual scripted testing,
Exploratory testing (manual, non-scripted testing).

These are still just generic descriptions. For instance, my day-to-day testing tasks can include packet sniffing, manipulating a UI, messing with web page query strings, querying a database, crafting an overload scenario for a feature, cloning a lot of client software to bombard a server, writing code, manipulating hardware configurations, testing network connectivity, using a stopwatch to track page load times, and managing an IIS server.

It will depend heavily on the maturity of the team doing the work and the software being developed, as well as the skill of the person doing the work, of course. You may find that it is more efficient to manually test certain features or functionality when first starting out, but that as you progress you are able to automate your tests. Or maybe what you are testing has no UI at all, in which case you'll have to write test harnesses or automated scripts from the start. Some features don't lend themselves to easy automation, and vice versa for manual testing.
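The stopwatch item above is an easy one to automate. A generic sketch, not tied to any particular page or tool: wrap whatever you are measuring in a callable and time it with a monotonic clock.

```python
import time

def stopwatch(action, repeats=3):
    """Run action() several times and return the fastest wall-clock time.

    Taking the best of several runs filters out one-off scheduling noise,
    which matters when you are comparing page-load times across builds.
    """
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()      # monotonic, high-resolution clock
        action()
        elapsed = time.perf_counter() - start
        best = min(best, elapsed)
    return best

# Usage: in practice the lambda would be a real page fetch or API call.
sample = stopwatch(lambda: sum(range(100000)))
```

time.perf_counter() is used rather than time.time() because wall-clock time can jump (NTP adjustments, DST) mid-measurement.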

As Far As I've Been Able To Tell... (2)

SageMusings (463344) | more than 2 years ago | (#39001303)

A tester needs to be prepared to take home less pay and expect high turnover in his/her dept (if he/she doesn't leave first).

We have a QA dept and they don't stick around more than a year, tops. By the time they really get into the product, they're either fed up with the pay or the hours, or they get switched to another product. QA catches few important bugs because we (a) treat them as second-class citizens and (b) don't involve them at the beginning of the design cycle.

I've also seen some pretty brutish egos among fellow devs wear out QA staff. Do you want to subject yourself to that?

Constitute? (1)

dmomo (256005) | more than 2 years ago | (#39001305)

Among other things, a software tester's job CONSTITUTES the software development process. I think you mean "consist of"?

Don't try it (0)

Anonymous Coward | more than 2 years ago | (#39001349)

Having been in quality for over 20 years, both in manufacturing and software, I will tell you this--if you are a developer, and you actually have to ask what a tester does, don't bother trying it--you'll never get it, because you are either too immature, too stupid, or just haven't been paying attention.

In the real world, great developers and great testers don't play Kindergarten games, and call each other useless or unprofessional. They understand they each have unique skill sets, and work together to produce awesome stuff. I've worked with great developers and great testers. We did some great shit.

For the most part, good luck finding such developers and testers. Most are content with pissing on one another, I guess because their mothers didn't love them enough. Who knows.

Different everywhere (1)

Anonymous Coward | more than 2 years ago | (#39001389)

Tester roles vary widely with the company. Here at MS, testers are supposed to do 30% development: either writing their own test tools, or researching (if not fixing) bugs for the regular developers. One I know said he went into testing rather than developing because he got to write more code with less process. Being a MS SDET requires pretty much the same skillset as a developer, and people switch between the two occasionally.

At another large Seattle company, the testers knew little to nothing about the internals of the product; they basically ran the software periodically and tried to break stuff. Automated test runs (a major secret weapon for MS, BTW) were pretty much unknown: BVTs (build verification tests) were a manual script. (Of course, this was just one group, and the products were not all that complicated; still, a BVT run took hours, and therefore didn't happen every day.) Most of the testers were young, and I suspect the company wanted (cheap) quantity instead of quality.

The main characteristic of testers is the desire to break stuff; I can't do it because I want stuff to work. There is also some admin work (like test plans) that you might not have seen much of before. (MS teams usually have elaborate templates that make it pretty easy; elsewhere it can be anything from a bullet list in an email to a formal document.)

The takeaway is that you need to find out exactly what your job will entail, from the team, before signing on. Don't assume anything.
