London software testing news UK


Tablet computer for Pay TV

Posted in Software testing by testing in London on November 8, 2011

According to Fierce Cable, Motorola Mobility could be testing a tablet computer with cable operators that would allow their customers to manage pay TV services.

It seems that cable technology vendors are busy developing and testing devices to deliver multi-platform content to cable customers.

“Pay TV subscribers with Apple’s iPhone and iPad and smartphones and tablets running on Android are already using apps that allow them to view pay TV programming and use mobile devices as a remote control.”

Selecting testing tools

Posted in Software testing, testing tool by testing in London on November 7, 2011

I was sent an email about a template you could use for issuing an RFP when selecting a testing tool. It stated that the functions and features in such a test tool template can be categorised as:

  • Used in Design
  • Used While Coding
  • Used While Testing
  • Test Support Tools
  • General Functionality
  • Integration with Other Tools
  • Testable Platform
  • Ease of Use and Customizable UI
  • Architecture
  • Industry
  • Tool Characteristics

Is this the set of test tool features you would choose if you were evaluating testing tools?
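One way to put such a template to work is a weighted scoring sheet over those categories. Here is a minimal sketch, assuming hypothetical category weights and 1-to-5 ratings (none of the numbers come from the template itself):

    # Hypothetical weighted-score comparison of candidate testing tools.
    # Weights and ratings are illustrative only, not from the RFP template.

    WEIGHTS = {
        "Used in Design": 2,
        "Used While Coding": 2,
        "Used While Testing": 5,
        "Test Support Tools": 3,
        "General Functionality": 4,
        "Integration with Other Tools": 4,
        "Testable Platform": 5,
        "Ease of Use and Customizable UI": 3,
        "Architecture": 2,
        "Industry": 1,
        "Tool Characteristics": 2,
    }

    def weighted_score(ratings: dict) -> int:
        """Combine per-category ratings (1-5) into one weighted total."""
        return sum(WEIGHTS[cat] * ratings.get(cat, 0) for cat in WEIGHTS)

    # Example: two imaginary tools rated against a few categories.
    tool_a = {"Used While Testing": 5, "Testable Platform": 4, "Architecture": 3}
    tool_b = {"Used While Testing": 3, "Testable Platform": 5, "Architecture": 4}

    print("Tool A:", weighted_score(tool_a))
    print("Tool B:", weighted_score(tool_b))

Whether those are the right weights is exactly the question the RFP process is meant to answer.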

Load testing and disasters

Posted in Load testing by testing in London on November 3, 2011

From TMC

“…the harsh weather that “never seemed to go away” held the transportation infrastructure and its operators in a stranglehold. And then that Icelandic volcano thing happened.

Transportation industry websites were overwhelmed with the massive traffic surge and some went down. But Swedavia, the country’s largest airport operator, had thought ahead and implemented a cloud-based offload service. So the Web traffic was redirected, within minutes, to a cloud-based high performance delivery network…”
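Stories like this are the standard argument for load testing before the surge arrives. As a minimal sketch of the idea (the URL, request count and thread count are all hypothetical, using only the Python standard library), a crude load test simply fires many concurrent requests and reports the response times:

    # Crude concurrent load test; all figures here are placeholders.
    import time
    from concurrent.futures import ThreadPoolExecutor
    from urllib.request import urlopen

    URL = "http://example.com/"   # stand-in for the site under test
    REQUESTS = 50                 # a real test would use far more

    def timed_request(_):
        start = time.time()
        try:
            urlopen(URL, timeout=10).read()
            return time.time() - start
        except Exception:
            return None  # treat failures separately from slow responses

    with ThreadPoolExecutor(max_workers=10) as pool:
        results = list(pool.map(timed_request, range(REQUESTS)))

    ok = [r for r in results if r is not None]
    print(f"{len(ok)}/{REQUESTS} requests succeeded")
    if ok:
        print(f"mean response: {sum(ok)/len(ok):.2f}s, worst: {max(ok):.2f}s")

A volcanic ash cloud is a better test of capacity planning than any script, but it is cheaper to find the breaking point this way first.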

London Games Testing Events

Posted in Acceptance testing by testing in London on November 2, 2011

The London 2012 sports testing programme, the London Prepares series, continues next year.

The London Prepares series enables LOCOG to test key aspects of their operations ahead of next year’s Games – such as results, scoring and timing systems, the fields of play, the venues and the people who will be working to ensure the Games run smoothly.

Tickets for the next events will go on sale at 10am on 17 November 2011 via Ticketmaster – giving you a great chance to be among the first to experience world-class sporting action at a London 2012 venue.

Next Generation Testing Conference

Posted in Events and improvement, Software testing by testing in London on November 1, 2011

There is a testing conference in London for the next two days at the Novotel London West. It’s called Next Generation Testing and you can find out more information about it here.

Key themes are:

  • Agile Testing
  • Behaviour Driven Development
  • Automating Requirements
  • User Acceptance to Avoid IT Disasters
  • Risk Based Testing

Other topics covered include:

  • Assuring quality in a virtualised operating environment
  • Cloud Testing
  • Expressing Testing in terms of Business Outcomes
  • Improved Test Design
  • Test Automation Strategies

Cloud testing

Acceptance test RFP

Posted in Acceptance testing, Software testing by testing in London on October 31, 2011

This is from a public RFP issued in October. It gives an idea of how important acceptance testing is to some contracts:

“…requires four types of acceptance tests: functional, performance, reliability and availability. The Proposed Acceptance Test Plan (ATP) should address all four types of acceptance tests.

Acceptance tests will be conducted first on each System Component (e.g., CAD, Mobile, LRMS, Field Reporting and FRMS) independently. Upon acceptance of all System Components, a final set of Functional, Performance and Reliability Acceptance Tests will be performed on the integrated System to ensure that all Components work together as intended and at the contracted performance levels. The County will notify the Proposer of the successful completion of each test in accordance with task completion requirements in the Statement of Work.

In the event a Level 1 Error is corrected by the Proposer, then subsequently fails on two (2) additional occasions within the test period, the County has the right to be refunded all previous payments under the Contract.”
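Clauses like that last one reward acceptance tests that are automated and repeatable, since every retest carries contractual weight. As a minimal sketch of a repeatable functional and performance check, written as pytest-style tests (the dispatch_call() helper, the component behaviour and the 2.0-second limit are all hypothetical, not taken from the RFP):

    # Hypothetical, repeatable acceptance checks for one System Component.
    import time

    def dispatch_call(incident):
        """Stand-in for the component under test (e.g. a CAD dispatch API)."""
        time.sleep(0.01)
        return {"status": "dispatched", "incident": incident}

    def test_functional_dispatch():
        # Functional acceptance: the component does what the requirement says.
        result = dispatch_call("traffic collision")
        assert result["status"] == "dispatched"

    def test_performance_dispatch():
        # Performance acceptance: response within the contracted limit.
        start = time.time()
        dispatch_call("traffic collision")
        assert time.time() - start < 2.0  # hypothetical contracted level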

Open Source Automated Testing Tool

Posted in Automated testing, testing tool by testing in London on October 28, 2011

From PRNewswire

Gorilla Logic, a leader in enterprise application development services and creator of open source test tools for mobile and rich Internet applications, today announced the availability of new independent research from Ovum Technology, an objective analyst firm that enables organizations to make better business and technology decisions. The report, written by principal analyst Michael Azoff, emphasizes: “For any mobile application developers targeting Apple iOS and the Flash platform, FoneMonkey should be considered for testing and QA purposes.”

London Olympic Games

Posted in Software testing by testing in London on October 27, 2011

A recent article in the Telegraph had a list of 10 things about the technology of the London Olympic Games. The list was attributed to Atos, one of the major technology providers for the London 2012 Olympic Games. There are some quite interesting facts in there, such as the four billion people expected to watch the London Games on TV, and the prediction that there will be 8.5 billion intelligent terminals (smartphones, such as Android or iPhone; tablets, such as the iPad; and PCs) accessing the internet by 2012. Staggering numbers.

But from a testing perspective the key fact was:

200,000 hours – or 23.5 years’ – worth of testing will be carried out on the IT systems before the games start, to simulate and prepare for every possible scenario. To put that in perspective, that’s the equivalent of 8,333 days’ work.
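The day figure is a straight unit conversion (a quick sketch; the quoted 23.5 years presumably uses a different year length, since 8,333 days works out closer to 22.8 calendar years):

    # Quick check on the quoted conversion.
    hours = 200_000
    days = hours / 24        # 8,333 days, as the article says
    years = days / 365.25    # roughly 22.8 calendar years
    print(f"{days:,.0f} days, {years:.1f} years")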

That is a fair bit of testing by any standard.

London Testing

IEEE Acceptance testing

Posted in Acceptance testing, Software testing by testing in London on October 26, 2011

I found this beta version of the IEEE Technology Navigator yesterday. The page I found was on Acceptance Testing, and it identified 1,785 resources for acceptance testing. That seems like a lot of resources, until you compare it with the 284K resources related to the more general topic of testing.

Amongst the resources are the following standards:

  • IEEE Guide: Test Procedures for Synchronous Machines Part I - Acceptance and Performance Testing, Part II - Test Procedures and Parameter Determination for Dynamic Analysis
  • IEEE Recommended Practice for Field Testing Electric Submersible Pump Cable
  • IEEE Recommended Practice for Testing Electronics Transformers and Inductors
  • IEEE Standard Conformance Test Procedures for Equipment Interconnecting Distributed Resources with Electric Power Systems
  • IEEE Standard for Software and System Test Documentation

Overview of IEEE 829-2008

Acceptance testing

Posted in Acceptance testing by testing in London on October 25, 2011

I recently saw this statement about acceptance testing: “Acceptance testing is not about bug hunting or ‘breaking’ the software; it is about testing your requirements in real life conditions.” It came from a large company that supplies testing and IT services.

Agree or disagree? What if we made it more general: “testing is not about bug hunting or ‘breaking’ the software; it is about testing your requirements in real life conditions.” Now what do you think?

You’ve probably heard this statement many times in various forms. Does shifting emphasis from a negative concept (finding mistakes) to a positive concept (showing that the requirements have been met) make testing more acceptable? Possibly. What it definitely does is move the focus from detecting things that aren’t working to declaring that things work. And this will affect how the acceptance testing team behaves. There are few better ways to discourage bug detection than stating that this is definitely not what testing is about.

Testing should always be about finding bugs, and as soon as we change that motivation we lower the effectiveness of the testing we perform.
