London software testing news UK


Trouble with DR Testing

Posted in security testing, Software testing by testing in London on August 31, 2008

From Internet News

DR testing is a mess. A whopping 30 percent of respondents said their DR tests failed. That’s better than the 50 percent failure rate in 2007, but it’s still pretty scary.

For 35 percent of the respondents, the tests failed because “people didn’t do what they were supposed to do,” Lamorena said. This means that much of recovery is still a manual process, and companies must begin looking at automation, he added.

Another cause is that tests are not run frequently enough. That’s because “when you run a test, it disrupts employees and customers,” Lamorena said. He added that 20 percent of the respondents said their revenue is impacted by DR tests, so “the tests cause the same pain to their customers as if they had a real disaster.”

DR testing

A sort of testing story about the London 2012 Olympics

Posted in General by testing in London on August 30, 2008

From Reuters

Olympics minister Tessa Jowell has commissioned a viability test on the proposed equestrian venue at Greenwich Park, the shooting facilities at the Royal Artillery Barracks, Woolwich and the basketball venue on the main Olympic Park.

“We have commissioned KPMG to do a report on the equestrian, shooting and basketball venues, looking at whether the Olympic experience and the legacy they will provide represents value for money,” Jowell said.

“When you take the costs for these venues, it seems like a lot of money to a lot of people. It is a sort of testing-to-destruction to see whether that spending can be justified.”

London independent testing company

Testing blamed for election computer glitch

Posted in Software testing by testing in London on August 29, 2008

From Herald Tribune

A software glitch in Sarasota County’s new $3 million voting system slowed the vote tally and left elections officials adding votes with pencils and paper as they pushed to meet a state elections reporting deadline late Tuesday.

The problem did not affect how votes were counted, but kept two new $80,000 machines from being able to upload all 10,700 absentee votes into the elections computer.

Elections Supervisor Kathy Dent said she now does not feel comfortable using the two machines in the November presidential election, and plans to use other machines.

Elections watchers said it was another example of how Florida’s elections officials in Tallahassee have not thoroughly vetted voting systems.

“I would put the problem squarely on the shoulders of the testing program,” said W. “Skip” Parish, a computer expert.

Customer experience testing

QA and testing and code

Posted in Acceptance testing by testing in London on August 28, 2008

From ZDnet Asia

Industry watchers have urged companies to build quality into the entire application development process, instead of concentrating on it only from the software debugging stage.

According to Partha Iyengar, the amount of time spent on testing or debugging applications ranges between 25 percent and 50 percent of the software development lifecycle. Companies with tried-and-tested processes, methodologies and tools are likely to spend less time, he said in an e-mail interview.

But while software testing is a necessary component of application development, companies need to think beyond using testing as a means to ensure quality in the product, noted Iyengar, who is also Gartner’s regional research director for India.

Software quality assurance testing

Software testing in Vietnam

Posted in Software testing by testing in London on August 27, 2008

From Vietnam.net

The software testing market is a niche segment within the IT outsourcing market, with global market size estimated at US$6.1 billion in 2005, according to the US-based IT research and advisory firm Gartner Group.

Software testing is an empirical investigation conducted to provide stakeholders with information about the quality of a software product or service in the context in which it is intended to operate.

India’s young software testing market, which started around 2001, has an estimated total market size of $1 billion (Meta Group). The industry had created an estimated 40,000 software testing jobs by 2008.

The intent of software testing is finding software bugs or errors to ensure optimum software performance for various industries.

Company launches Test Automation Service

Posted in Automated testing, Software testing by testing in London on August 26, 2008

From Market Watch

Foliage, a technology consulting and product development company, today announced its Test Automation Services program, a package of services designed to address the business challenges faced by companies regarding the verification phase of the software product development life cycle.

R&D costs continue to increase as a percentage of revenue, and for many companies the cost of testing approaches 50 percent of the total R&D budget. In addition, the testing process has significant time and resource implications that can negatively affect the overall product development cycle. In the regulated aerospace industry, if testing is not comprehensive and completed correctly, the product risks failing the FAA’s SOI audit, resulting in delayed product schedules, revenue loss or deferment, and the cost of reworking and re-running the tests.

To address these challenges, Foliage launched a suite of Test Automation Services that complements its existing product development services. Foliage has focused on developing an effective automated software test approach that, when integrated into a continuous development process, has proven to significantly reduce the time and effort spent testing software-based products, directly lowering the costs associated with product testing and verification.
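As a rough illustration of the kind of automated check that can run unattended in a continuous build (a generic sketch, not Foliage’s actual framework; the function under test and its behaviour are hypothetical):

```python
# Minimal automated regression test using Python's built-in unittest,
# suitable for running unattended as part of a continuous build.
import unittest


def apply_discount(price, percent):
    """Hypothetical function under test: apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)


class DiscountTests(unittest.TestCase):
    def test_typical_discount(self):
        # 25% off 100.0 should yield 75.0
        self.assertEqual(apply_discount(100.0, 25), 75.0)

    def test_rejects_invalid_percent(self):
        # Out-of-range discounts must be rejected, not silently applied
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)
```

A continuous build would typically run such a suite on every commit, e.g. with `python -m unittest`, so regressions surface immediately rather than in a manual verification phase.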

Test automation services

Theory of risk-based testing

Posted in Software testing by testing in London on August 25, 2008

If you’re looking for a structured paper on risk-based testing, then try this one on the theory and practice of risk-based testing.

Risk is a complex subject, but may be considered to be a function of two components: the probability of occurrence of a defined undesirable event and the severity of the event’s potential consequences. Risk analysis is summarised in an Australian and New Zealand standard (AS/NZS 1999) and is carried out in four stages:

  • Scope definition, in which the context and terms of reference of the analysis are defined;
  • Hazard identification, in which methodical exploration is carried out to identify the things that could go wrong (the ‘hazards’);
  • Hazard analysis, in which the identified hazards are analysed to estimate their potential consequences and probabilities of occurrence, and thus the risks that they pose;
  • Risk assessment, in which the risks are assessed against tolerability criteria.
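As a concrete illustration of the probability-times-severity view of risk, the stages above might feed into a simple scoring step that ranks features so the riskiest are tested first (a minimal sketch; the feature names and numbers are hypothetical, not from the paper):

```python
# Risk-based test prioritisation sketch:
# risk = probability of failure x severity of consequences.

# Hypothetical hazards with estimated probability (0-1) and severity (1-5).
features = [
    {"name": "payment processing", "probability": 0.3, "severity": 5},
    {"name": "report export", "probability": 0.6, "severity": 2},
    {"name": "user login", "probability": 0.2, "severity": 4},
]

# Score each feature.
for f in features:
    f["risk"] = f["probability"] * f["severity"]

# Rank so the highest-risk features are tested first.
ranked = sorted(features, key=lambda f: f["risk"], reverse=True)
for f in ranked:
    print(f"{f['name']}: risk={f['risk']:.2f}")
```

In practice the probabilities and severities would come from the hazard analysis stage, and the tolerability criteria from the risk assessment stage would decide where on the ranked list testing effort may safely stop.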

Load and performance testing in the cloud

Posted in Software testing by testing in London on August 24, 2008

From Thomas Net

SOASTA, the leading provider of cloud-based testing solutions, today announced SOASTA CloudTest Lab. A Web testing solution built on the cloud to enable application testing in the cloud, SOASTA CloudTest Lab allows developers to easily and affordably turn any cloud computing environment into a Virtual Test Lab for load and performance testing of their applications.

SOASTA CloudTest Lab provides engineers with a 24x7x365 Web application test lab at their fingertips. The combination of a scalable hardware architecture with leading-edge Web testing technology enables the testing of every layer of a Web application or service. It delivers immediate value, scales seamlessly up or down with testing needs, and does not require investment in a cost-prohibitive testing infrastructure. SOASTA CloudTest Lab natively supports every testing type including Load, Performance, Functional, and Web UI/Ajax.
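For readers unfamiliar with what a basic load test measures, here is a minimal sketch of firing concurrent requests and reporting latency percentiles (illustrative only, and unrelated to SOASTA’s actual product; the target is a stand-in function rather than a real HTTP call):

```python
# Minimal load-test sketch: run N concurrent "requests" and report
# latency percentiles for the system under test.
import time
from concurrent.futures import ThreadPoolExecutor


def target_request():
    # Stand-in for a real HTTP request to the application under test.
    time.sleep(0.01)


def run_load_test(n_requests=50, concurrency=10):
    latencies = []

    def timed():
        start = time.perf_counter()
        target_request()
        latencies.append(time.perf_counter() - start)

    # Fire requests from a pool of workers to simulate concurrent users.
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        for _ in range(n_requests):
            pool.submit(timed)

    latencies.sort()
    return {
        "requests": len(latencies),
        "p50": latencies[len(latencies) // 2],
        "p95": latencies[int(len(latencies) * 0.95)],
    }


stats = run_load_test()
```

A cloud-based lab of the kind described above scales the same idea out: many load generators run such workers in parallel against the application, so the concurrency can far exceed what one machine could simulate.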

American Megatrends Unveils Testing Framework

Posted in Software testing by testing in London on August 23, 2008

From Thomas Net

American Megatrends, a leader in storage and computing innovations, is proud to introduce its new TUFMAN(TM) testing software for validation of management software, which complies with major industry standards such as CIM, WS-Management and SMASH/CLP.

MegaRAC TUFMAN, or Test Utility Framework for Management, is a rich web-based framework for testing newly developed management software compliant with the standards established by the Distributed Management Task Force (DMTF), the leading industry organization in the field of system management.

DMTF standards such as the Common Information Model (CIM), Systems Management Architecture for Server Hardware (SMASH) and the new Web Services for Management, or WS-Management, aim to establish widespread interoperability in the administration of servers characterized by different operating environments and vendors.

Solution software testing

Performance testing and stress testing SAP systems

Posted in Load testing, Software testing, testing tool by testing in London on August 21, 2008

There is an interesting book available from the HP Professional series: mySAP Tool Bag for Performance Tuning and Stress Testing

“In this book, a leading expert on SAP performance walks through every facet of tuning and optimizing mySAP Solutions, and the technology layers underpinning these solutions, to maximize performance and value. George W. Anderson covers the entire testing and tuning process: planning, staffing, developing, testing, executing, validating, evaluating…and acting on what you’ve learned.

The software testing book includes:

  • Quantifying concrete performance requirements—even for complex, cross-application business processes
  • Testing and monitoring daily system loads, month-end or seasonal business peaks, key transactions, and complex multi-system business processes
  • Conducting comprehensive server, SAN/disk subsystem, and database testing
  • Managing the testing process, leveraging proven best practices and techniques
  • Analyzing, verifying, and quantifying SAP availability, scalability, and TCO

SAP performance testing services

SAP LoadRunner load testing tool
