
7 Ways to Get More Out of Beta Testing

The weird and wonderful bugs that get thrown up when real users first start using your code never cease to amaze. There’s always some odd edge case that was overlooked, despite your having thought about little else for several weeks. We’ve been through this many times and concluded that beta testing is the solution to our problems.

Here are 7 things you can do to get the most out of your beta tests:

  1. Ask for a commitment to provide feedback:

Response rates will be higher if you ask your beta testers upfront to commit to providing feedback. This doesn’t have to be formal – it could simply be part of an application form. But having agreed to it, people are more likely to follow through.

  2. Do not release with known bugs:

Most beta testers will only provide feedback once, so you don’t want to burn a tester’s one report on an issue you already know about.

  3. Allow enough time:

Use the following as a rough guide. For a major development effort, say about a year’s work, you’d want to set aside 10–12 weeks for beta testing. Scale down as necessary – so if it took a month to develop, then around a week will suffice. In both cases, that’s roughly a fifth of the development time.

  4. Be feature complete:

Only beta test when you’re feature complete. If you add features as you go, the new code and its impact on existing functionality won’t be as well tested as the rest – something you’ll regret later.

  5. Make it easy to get in touch:

You want to make it as easy as possible for your beta testers to provide feedback. Give them a direct email address and offer to jump on a Hangout or Skype call if they’d prefer.

  6. Follow up but don’t annoy:

While your product might be front and center for you, it won’t be for your beta testers, so you’ll want to remind them along the way. Don’t overdo it, though – they’re helping you out, and too many emails will annoy them.

  7. Don’t forget to provide feedback:

Make sure to send them updates during and after the tests about how you are putting their feedback to use. People like to know that their time wasn’t wasted. And don’t be tight with the swag – a free t-shirt can do wonders!


9 Effective Code Review Tips

For everyone:

  • Review the right things, let tools do the rest

You don’t need to argue over code style and formatting issues; there are plenty of tools which can consistently highlight those matters. What’s important is ensuring that the code is correct, understandable and maintainable. Sure, style and formatting form part of that, but you should let the tools be the ones to point those things out.

  • Everyone should code review

Some people are better at it than others. The more experienced may well spot more bugs, and that’s important. But what’s more crucial is maintaining a positive attitude to code review in general – and that means avoiding any ‘Us vs. Them’ mentality and not making code review a burden for anyone.

  • Review all code

No code is too short or too simple. If you review everything, nothing gets missed. What’s more, reviewing becomes part of the process – a habit, not an afterthought.

  • Adopt a positive attitude

This is just as important for reviewers as it is for submitters. Code reviews are not the time to get all alpha and exert your coding prowess, nor do you need to get defensive. Go in with an attitude of constructive criticism and you can build trust around the process.

For reviewers:

  • Code review often and for short sessions

The effectiveness of your reviews decreases after about an hour, so putting reviews off and doing them in one almighty session doesn’t help anybody. Instead, set aside short periods throughout the day, around your breaks, so reviewing doesn’t disrupt your own flow and becomes a habit. Your colleagues will thank you for it: waiting on a review is frustrating, and they can resolve issues quicker whilst the code is still fresh in their heads.

  • It’s OK to say “It’s all good”

Don’t get picky – you don’t have to find an issue in every review.

  • Use a checklist

Code review checklists ensure consistency – they make sure everyone is covering what’s important and help avoid common mistakes.

For submitters:

  • Keep the code short

Beyond 200 lines, the effectiveness of a review drops significantly. By the time you’re past 400, it’s almost pointless.

  • Provide context

Link to any related tickets or the spec. There are code review tools that can help with that. Provide short but useful commit messages and plenty of comments throughout your code. It’ll help the reviewer, and you’ll get fewer issues coming back.

9 Integration Testing Do’s and Don’ts

Integration tests check whether your application works and presents itself properly to a customer. They seek to verify your performance, reliability and, of course, functional requirements. Integration tests should run against any of your development, staging and production environments at any time.

Writing good tests that prove your solution works can be challenging. Ensuring these tests perform the intended actions and exhibit the expected outcomes requires careful thinking. You should consider what you are testing and how to prove it works – both now and in the future. To help you create tests that work and are maintainable, here are 9 Do’s and 9 Don’ts to contemplate:

When Creating Integration Tests Do…

1. Consider the cost vs. benefit of each test

Should this be a unit test? How much time will writing this test save over running it manually? How often will it run? If a test takes 30 seconds to run manually every few weeks, taking 12 hours to automate it may not be the best use of resources.
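
To make that arithmetic concrete, here’s the break-even calculation for the illustrative numbers above, sketched in Python (the run frequency is an assumption):

```python
automation_cost_hours = 12
manual_run_minutes = 0.5        # 30 seconds per manual run
runs_per_year = 26              # "every few weeks" ~ fortnightly (assumed)

manual_hours_per_year = manual_run_minutes * runs_per_year / 60
years_to_break_even = automation_cost_hours / manual_hours_per_year

print(f"Manual effort: {manual_hours_per_year:.2f} hours/year")      # ~0.22
print(f"Break-even on automation: {years_to_break_even:.0f} years")  # ~55
```

Over fifty years to recoup the effort – clearly a candidate for staying manual.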

2. Use intention revealing test names

You should be able to figure out or at least get an idea of what a test is doing from the name.
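
For example, in a Python test suite (names and scenarios invented for illustration):

```python
# Vague: says nothing about the scenario or the expected outcome.
def test_login():
    ...

# Intention revealing: the name states the action, the condition and the result.
def test_login_with_expired_password_redirects_to_reset_page():
    ...

def test_checkout_rejects_order_when_item_is_out_of_stock():
    ...
```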

3. Use your public API as much as possible

Otherwise, it’s just more endpoints and calls to maintain whenever the application changes.
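
As a minimal sketch of what testing through the public API can look like – assuming a hypothetical REST endpoint and Python’s requests library:

```python
import requests

BASE_URL = "https://api.example.com/v1"  # hypothetical public API

def test_created_project_is_retrievable_via_public_api():
    # Create through the same endpoint customers use...
    created = requests.post(f"{BASE_URL}/projects",
                            json={"name": "Beta"}, timeout=10)
    assert created.status_code == 201
    project_id = created.json()["id"]

    # ...and read it back the same way, not by querying the database.
    fetched = requests.get(f"{BASE_URL}/projects/{project_id}", timeout=10)
    assert fetched.status_code == 200
    assert fetched.json()["name"] == "Beta"
```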

4. Create a new API when one isn’t available

Rather than resorting to one of the Don’ts below, such as accessing part of the system directly.

5. Use the same UI as your customers

Otherwise you might miss visual issues that your customers certainly won’t.
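
One way to achieve this is to drive the real UI with a browser automation tool such as Selenium. A rough sketch, with a hypothetical login page and element IDs:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

def test_customer_can_log_in_through_the_real_ui():
    driver = webdriver.Chrome()
    try:
        driver.get("https://app.example.com/login")  # hypothetical URL
        driver.find_element(By.ID, "email").send_keys("tester@example.com")
        driver.find_element(By.ID, "password").send_keys("not-a-real-password")
        driver.find_element(By.ID, "submit").click()
        # Assert against what the customer actually sees.
        assert "Dashboard" in driver.title
    finally:
        driver.quit()
```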

6. Use command line parameters for values that will change when tests are re-run

Examples include the site name, username, password and so on.
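
With pytest, for instance, these can be exposed as command line options in a conftest.py (the option names here are just examples):

```python
# conftest.py
import pytest

def pytest_addoption(parser):
    # Values that differ per environment or per run come in from the CLI.
    parser.addoption("--site-url", default="https://staging.example.com")
    parser.addoption("--username", default="beta_tester")

@pytest.fixture
def site_url(request):
    return request.config.getoption("--site-url")

@pytest.fixture
def username(request):
    return request.config.getoption("--username")
```

The same suite can then be pointed at any environment, e.g. pytest --site-url=https://app.example.com --username=alice.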

7. Test using all the same steps your customers will perform

The closer your tests are to the real thing, the more valuable they’ll become.

8. Switch your system under test back to the original state

Or at least as close to it as you can. If you create a lot of things, try to delete them all.
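
In pytest, this pattern is naturally expressed as a fixture that tears down after the yield – again assuming the hypothetical API from earlier:

```python
import pytest
import requests

BASE_URL = "https://api.example.com/v1"  # hypothetical

@pytest.fixture
def temporary_project():
    # Set up: create the object the test needs.
    project = requests.post(f"{BASE_URL}/projects",
                            json={"name": "Temp"}, timeout=10).json()
    yield project
    # Tear down: runs even when the test fails, restoring the original state.
    requests.delete(f"{BASE_URL}/projects/{project['id']}", timeout=10)

def test_project_appears_in_listing(temporary_project):
    listing = requests.get(f"{BASE_URL}/projects", timeout=10).json()
    assert any(p["id"] == temporary_project["id"] for p in listing)
```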

9. Listen to your customers and support team

They will find ways to use your systems that you would never expect. Use this to your advantage in creating real-world beta tests.

When Creating Integration Tests Don’t…

1. Write an integration test when a unit test suffices

It’ll be extra effort for no benefit.
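
For example, pure logic such as a price calculation needs no running server, browser or database – a plain unit test covers it completely:

```python
def apply_discount(price: float, percent: float) -> float:
    """Pure function: no server, database or UI involved."""
    return round(price * (1 - percent / 100), 2)

def test_apply_discount_takes_ten_percent_off():
    # A unit test exercises this fully; an integration test would add nothing.
    assert apply_discount(199.90, 10) == 179.91
```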

2. Use anything that a customer cannot use

Databases, web servers and system configurations are all off limits. If your customer can’t touch it, your tests have no business touching it either.

3. Access any part of the system directly

Shortcuts just reduce the quality of your tests.

4. Use constants in the body of your tests

If you must use constants, put them in a block at the top of your test file or a configuration file. There is nothing worse than having to search through all your source files because you changed a price from $199.95 to $199.99.
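
A sketch of the difference (the URL is hypothetical):

```python
import requests

# Every tunable value lives in one block at the top of the test module,
# so changing a price is a one-line edit rather than a hunt through tests.
STANDARD_PRICE = "199.99"
CHECKOUT_URL = "https://app.example.com/checkout"  # hypothetical

def test_checkout_shows_standard_price():
    page = requests.get(CHECKOUT_URL, timeout=10)
    assert f"${STANDARD_PRICE}" in page.text
```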

5. Create an internal-only API

Unless necessary for security or administration.

6. Create an internal-only UI

You’re supposed to test what the customer will see after all.

7. Make your test too complex

No matter how brilliant your test is, keep it simple. Complexity just breaks later. If you are finding it hard to write, it will be hard to maintain too.

8. Test more than one thing

Stick to what you need to test. If you try to do too much in one test, it will just get more complex and more fragile.
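
A rough sketch of the shape to aim for (scenarios invented for illustration):

```python
# One behaviour per test: a failure immediately tells you what broke.
def test_login_succeeds_with_valid_credentials():
    ...

def test_login_fails_with_wrong_password():
    ...

# Avoid one mega-test that logs in, edits the profile, places an order
# and checks the invoice: any failure is ambiguous, and the whole chain
# breaks whenever any single step changes.
```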

9. Leave the test system in a bad/unknown state

This means a broken or unusable site, database or UI.


How Low Should You Go? Level of Detail in Test Cases

It can be difficult to know just how much detail you should include in your test documentation and particularly in test cases.

Each case has a different set of requirements in terms of purpose, usage, frequency and administrative needs.

If it’s written at too high a level, you leave it open to too much interpretation and risk the accuracy of the testing. If it’s at too low a level, you’re wasting your own time: maintenance becomes more difficult, and there’s an opportunity cost to the other projects with demands on your time.

In this post, we break down some of the factors you should consider to help you find the right level.

Understand the Wider Context

Each of your project’s stakeholders will have concerns that affect the amount of detail you need to provide – from your organization’s internal politics and appetite for risk to the extent to which the product is relied upon. This provides the wider context for your test cases and starts to frame your thinking. The documentation expectations at a lean startup will differ greatly from those at a financial institution.

Test Requirements and Resources

You need to provide enough information to describe the intent of the test case. This should make clear all the elements that need to be tested. Special consideration should be given to any specific input values or any particular sequence of actions.

The amount of time you have to invest in the test case, and the human or IT resources available to run the tests, is obviously another key factor.

Know Your Audience

Also consider the audience for each case. How technical are they? How much product knowledge do they have, and how experienced at testing are they? Experienced testers who are familiar with the product will need less detail – but is the team likely to change in the foreseeable future? If so, you might want to head off re-writes later by providing extra detail now for those with less experience.

Some organizations also have specific requirements to provide evidence of test coverage – usually to demonstrate compliance with a standard or certification, or for other legal reasons.

Test and Product Considerations

Each test is different, from how important it is to how long it will remain in use. If it’s likely to be converted into an automated test script in the future, including more detail now might make that conversion easier. There are similar considerations around the product you’re testing. Will the application be used long-term, and where is it in its lifecycle? The amount of change you can expect in a recently built agile application is far greater than in some old system you’re maintaining. Unless it’s a wild, testless code beast, that is.

There’s a Balance to be Found

These factors don’t automatically mean you should include more detail, but crucial and long-lasting tests can justify the extra time. However, there’s a balance to be struck. If you create highly specific tests, even minor design or functionality changes may force you to re-write the cases. Highly specific tests also lead testers to raise bugs that turn out to be problems with the test documentation rather than anything affecting customers. And they have a knock-on effect: they encourage the tester to consider only the specific paths through the application detailed in the case, meaning the functionality may never be examined from a broader perspective.

There’s no silver bullet for coming to a conclusion – each organization’s requirements differ, and they change with the project, the product and the individual test. But by considering the factors above, you can find a level that works for you and your team.