London software testing news UK


Acceptance test RFP

Posted in Acceptance testing, Software testing by testing in London on October 31, 2011

This is from a public RFP issued in October. It gives an idea of how important acceptance testing is to some contracts:

“…requires four types of acceptance tests: functional, performance, reliability and availability. The Proposed Acceptance Test Plan (ATP) should address all four types of acceptance tests.

Acceptance tests will be conducted first on each System Component (e.g., CAD, Mobile, LRMS, Field Reporting and FRMS) independently. Upon acceptance of all System Components, a final set of Functional, Performance and Reliability Acceptance Tests will be performed on the integrated System to ensure that all Components work together as intended and at the contracted performance levels. The County will notify the Proposer of the successful completion of each test in accordance with task completion requirements in the Statement of Work.

In the event a Level 1 Error is corrected by the Proposer, then subsequently fails on two (2) additional occasions within the test period, the County has the right to be refunded all previous payments under the Contract.”

Open Source Automated Testing Tool

Posted in Automated testing, testing tool by testing in London on October 28, 2011

From PRNewswire

Gorilla Logic, a leader in enterprise application development services and creator of open source test tools for mobile and rich Internet applications, today announced the availability of new independent research from Ovum Technology, an objective analyst firm that enables organizations to make better business and technology decisions. The report, written by principal analyst Michael Azoff, emphasizes: “For any mobile application developers targeting Apple iOS and the Flash platform, FoneMonkey should be considered for testing and QA purposes.”

London Olympic Games

Posted in Software testing by testing in London on October 27, 2011

A recent article in the Telegraph had a list of 10 things about the technology of the London Olympic Games. The list was attributed to Atos, one of the major technology providers for the London 2012 Olympic Games. There are some quite interesting facts in there, such as that four billion people are expected to watch the London games on TV, and that there will be 8.5 billion intelligent terminals (that’s smartphones such as Android or iPhone, tablets such as the iPad, and PCs) accessing the internet by 2012. Staggering numbers.

But from a testing perspective the key fact was:

200,000 hours – or 23.5 years’ – worth of testing will be carried out on the IT systems before the games start, to simulate and prepare for every possible scenario. To put that in perspective, that’s the equivalent of 8,333 days’ work.

That is a fair bit of testing by any standard.
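Out of curiosity, here is a quick back-of-the-envelope check of those figures: a minimal Python sketch assuming a flat 24-hour day (the quoted 23.5-year figure presumably rests on a slightly different basis).

    # Sanity check on the quoted figures, assuming a flat 24-hour day.
    # (The quoted 23.5 years appears to use a different year basis.)
    hours_of_testing = 200_000
    days = hours_of_testing / 24   # about 8,333 days, matching the quote
    years = days / 365             # roughly 22.8 calendar years
    print(f"{days:,.0f} days of testing, roughly {years:.1f} years")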

London Testing

IEEE Acceptance testing

Posted in Acceptance testing, Software testing by testing in London on October 26, 2011

I found this beta version of the IEEE Technology Navigator yesterday. The page I found was on Acceptance Testing and it identified 1,785 resources for acceptance testing. That seems like a lot of resources. That is, until you compare it with the 284K resources related to the more general topic of testing.

Amongst the resources are the following standards:

  • IEEE Guide: Test Procedures for Synchronous Machines Part I - Acceptance and Performance Testing; Part II - Test Procedures and Parameter Determination for Dynamic Analysis
  • IEEE Recommended Practice for Field Testing Electric Submersible Pump Cable
  • IEEE Recommended Practice for Testing Electronics Transformers and Inductors
  • IEEE Standard Conformance Test Procedures for Equipment Interconnecting Distributed Resources with Electric Power Systems
  • IEEE Standard for Software and System Test Documentation

Overview of IEEE 829-2008

Acceptance testing

Posted in Acceptance testing by testing in London on October 25, 2011

I recently saw this statement about acceptance testing: “Acceptance testing is not about bug hunting or ‘breaking’ the software; it is about testing your requirements in real life conditions.” It came from a large company that supplies testing and IT services.

Agree or disagree? What if we made it more general: “testing is not about bug hunting or ‘breaking’ the software; it is about testing your requirements in real life conditions.” Now what do you think?

You’ve probably heard this statement many times in various forms. Does shifting emphasis from a negative concept (finding mistakes) to a positive concept (showing that the requirements have been met) make testing more acceptable? Possibly. What it definitely does is move the focus from detecting things that aren’t working to declaring that things work. And this will affect how the acceptance testing team behaves. There are few better ways to discourage bug detection than stating that this is definitely not what testing is about.

Testing should always be about finding bugs and as soon as we change that motivation we lower the effectiveness of the testing we perform.

Software testing with Visual Studio

Posted in Acceptance testing by testing in London on October 21, 2011

There is a book review here on software testing with Visual Studio. The reviewer’s view is that the book provides a good overview of testing with Visual Studio, but that a number of topics are omitted. For example, he mentions that it doesn’t cover stress testing, fuzzing or load testing. However, on the whole the review, which outlines what is in each chapter of the book, is positive.

If you’re interested in getting a copy, it’s Software Testing with Visual Studio 2010 by Jeff Levinson & Steven Borg; published by Addison-Wesley, ISBN: 978-0321734488.

Visual Studio Load Testing

Software testing puzzles

Posted in Software testing by testing in London on October 20, 2011

Testing puzzles can be interesting and challenging. For example, check out these software testing puzzles.

It was with interest that I noticed that there is a talk on software testing puzzles at next week’s QA and Test conference (the international conference on software testing and quality assurance for embedded systems) in Spain. Looking at the abstract, the talk ranges from puzzles such as the dice game puzzle (which is mentioned, along with other puzzles to sharpen your testing skills, at this blog) to challenges more akin to day-to-day testing activities (such as the Weekend Testing project).