London software testing news UK


Testing solutions that support SAP applications

Posted in Software testing by testing in London on June 30, 2007

From MCADCafe

Worksoft, a provider of upgrade and testing solutions for use with SAP applications that validate end-to-end business processes, today announced the availability of Worksoft Certify 7.2. This solution, built on Worksoft’s patented scriptless technology, manages the impact of software upgrades, reduces the complexity of building automated test cases, and minimises the impact on critical business users.

Worksoft Certify provides a single integrated platform for testing business processes across the enterprise without relying on time-consuming and inefficient scripting processes. Processes that span multiple technologies and platforms, such as .Net, HTML, JAVA, and even the mainframe, can be effectively tested in a single, seamless test process. This end-to-end view of testing business processes, combined with Certify’s unique “scriptless” approach to creating and maintaining test cases, makes Certify the ideal testing solution for business analysts, end users, and other business-process-related subject matter experts. Certify 7.2 delivers significantly enhanced usability, performance, and scalability capabilities to more effectively create and manage these large cross-enterprise tests.
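The “scriptless” approach described above typically stores test steps as data rather than code, in the style of keyword-driven testing. Here is a minimal sketch of that general technique; the keywords, runner, and sample process are illustrative assumptions, not Worksoft’s actual format:

```python
# Minimal keyword-driven test runner: test cases are rows of data,
# not scripts. Keywords and the sample process are purely illustrative.

ACTIONS = {}

def action(name):
    """Register a function as a named test keyword."""
    def register(fn):
        ACTIONS[name] = fn
        return fn
    return register

state = {}

@action("enter")
def enter(field, value):
    # Simulate typing a value into a named field.
    state[field] = value

@action("verify")
def verify(field, expected):
    # Check that the field holds the expected value.
    assert state.get(field) == expected, (
        f"{field}: {state.get(field)!r} != {expected!r}")

# A business process expressed as data rows -- the "scriptless" test case.
test_case = [
    ("enter", "order_total", "100"),
    ("verify", "order_total", "100"),
]

def run(steps):
    for keyword, *args in steps:
        ACTIONS[keyword](*args)   # dispatch each row to its keyword handler
    return "PASS"

print(run(test_case))  # PASS
```

Because the test case is plain data, a business analyst can rearrange or copy rows without writing code, which is the appeal of the scriptless style.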

Worksoft enables customers using SAP applications to more securely manage upgrades, deploy new software and save millions of dollars annually by empowering them to maximize the return on their original enterprise investment and minimize the risk of upgrades and installs. The Certify Interface and Worksoft’s Business Process Solutions extend Certify’s support for SAP solutions by providing predefined repositories of SAP solution-based best practices for all core end-to-end business processes. Since all process flows and objects within Certify are stored as individual assets with familiar names and business narratives, the best practices content can quickly be adapted to each company’s specific configuration through a simple drag-and-drop interface. Additionally, Certify 7.2 automatically provides easy-to-read test documentation that serves as a reusable resource for business-process training and reporting. The detailed audit trail of validation results helps ensure documented compliance with regulatory requirements.

BT completes 21CN testing trials

Posted in Software testing by testing in London on June 29, 2007

From Light Reading

LONDON — Following the successful migration of the first live customers onto BT’s 21st Century Network (21CN) in South Wales, BT is to close the pioneering 21CN voice trial network which made communications history in 2005 when it carried the world’s first phone calls over an all IP next generation infrastructure. The trial network, linking BT exchanges in central London, Woolwich and Cambridge, has carried over 160 million calls and provided valuable testing and learning that is now being deployed with live customers.

As BT gears up to migrate some 350,000 customers in Cardiff and the surrounding area onto 21CN over the course of this year, the completion of the trial sends a further signal that 21CN has entered a new phase, with BT focusing on the mass implementation of the new network.

The start of the 21CN voice testing in 2005 marked the beginning of BT’s multi-billion pound investment in a national next generation network for the UK. The trial formed a critical part of BT’s 21CN testing plans to ensure that customers continue to enjoy the same high standards of service quality on 21CN as they do today on the existing PSTN network, and has been instrumental in shaping BT’s plans for implementing 21CN across the country, and the rest of the globe.

In 2006, the second phase of the trial saw the integration of equipment from BT’s strategic 21CN vendors. This provided BT with invaluable early learning experience of the 21CN technology and the operation of the network. With 21CN equipment already installed at hundreds of sites across the UK, and the first stage of the process of migrating customers well underway, the technology is fully proven and this component of BT’s 21CN testing programme is complete. 600 BT employees were involved in the trial which will close fully in September.

The closure of the 21CN voice trials marks the end of one critical phase of BT’s comprehensive testing programme for 21CN. BT continues to enhance its 21CN testing capabilities with the installation of world class testing facilities at Adastral Park and in Swansea. BT is working closely with communications providers, manufacturers and industry bodies to test systems, services and customer equipment to ensure full interoperability with 21CN and service excellence for customers.


Reasons for bad software testing

Posted in Software testing by testing in London on June 28, 2007

From Comp Lanc ICSE 2007

Despite advances in formal and automated fault discovery and their increasing adoption in industry, it appears that testing, whereby software is ‘shown to be good enough’, will continue as the principal approach for software verification and validation. The strengths and limitations of testing are well known and there is healthy debate over automation. Case studies have proved valuable, and following in this programme of ‘empirical studies of testing’ we seek to better describe the practical issues in testing for a small software company.

Best practice in testing has been largely uncontroversial: adopt a phase-based approach. The earlier phases in these models have increasingly been automated, whereas innovations focused on the later stages have been more human-centric (for example, risk-based testing). Agile methods, such as extreme programming (XP), disrupt such models with test-driven development and often a rejection of any testing that cannot be fully automated. The agile approach has been successful, but there remains a lack of empirical evidence about such testing, and we are concerned as to whether it solves or merely displaces certain issues. Our experience is also that many companies that have adopted XP practices do not, in fact, automate all tests.
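Test-driven development, as practised in XP, means writing an executable test before the code it exercises and then writing only enough code to make the test pass. A minimal Python sketch of that workflow; the discount function is a made-up example, not taken from the paper:

```python
# Test-driven development in miniature: the test is written first and
# specifies the behaviour, then the implementation is written to pass it.
# apply_discount is a hypothetical example function.

def test_apply_discount():
    # These assertions existed (and failed) before the code below was written.
    assert apply_discount(200.0, 10) == 180.0   # 10% off 200 is 180
    assert apply_discount(99.99, 0) == 99.99    # zero discount is a no-op

def apply_discount(price, percent):
    """Return price reduced by percent, rounded to whole pence/cents."""
    return round(price * (1 - percent / 100), 2)

test_apply_discount()
print("all tests pass")
```

Tests like these are cheap to automate and run on every change, which is why XP teams lean so heavily on them; the paper’s point is that integration and acceptance testing rarely reduce to this form.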

Alongside the ‘best practice’ approaches there continue to be more pragmatic guides to testing. For example, Whittaker [26] argues that “there is enough on testing theory” and looks at “how good testers actually do software testing”. Kaner provides wider “lessons” based upon his experiences in testing. From such guides it seems that drawing and learning from ‘experience’ is somehow as important as following a rational approach to testing.

The empirical study in this paper confirms what Whittaker calls for elsewhere: for theory-based and practice-based approaches to communicate and converge. In this paper we discuss the pragmatics of software testing for a small software company. The company, which we shall refer to as W1REsys, follows a programme of automated unit testing and a semi-automated programme of integration and acceptance testing. We focus on systems integration and acceptance testing and find that the notion of ‘rigorous’ testing is defined organisationally rather than in accordance with some technical criteria. We discuss why it is important for software engineering researchers to understand that testing is a socio-technical rather than a purely technical process and that, for product companies, there will inevitably be ambiguity related to integration and acceptance testing.

Model based testing

Posted in Software testing by testing in London on June 27, 2007

From Sys-con

Model-based testing has stirred up a significant amount of interest over the past couple of years. For some development and testing teams it is certainly a test method worth exploring, but be aware that it is a supplement to, not a replacement for, the standard automated testing you already do.

A properly constructed automated software quality assurance test aims to discover all of the ways that every feature in a piece of software can be used, and how these features will interact with all other features, exposing the flaws and bugs that lurk in the programming code. Standard tests accomplish this by running a pre-determined series of code examinations. Model-based testing, in contrast, uses algorithms to determine all of the usage paths for an application, pares down that number for maximum coverage and minimal testing, and then generates various test cases to try the application against.
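The path-generation idea can be sketched concretely: model the application as a finite state machine and enumerate action sequences, each of which becomes a test case to run against the real application. The login model below is an illustrative assumption, not any particular tool’s notation:

```python
# Model-based testing sketch: a login dialog modelled as a finite state
# machine. Test cases are generated from the model rather than hand-written.
# The states and actions are illustrative, not from a real application.

MODEL = {
    "logged_out": {"enter_valid_password": "logged_in",
                   "enter_bad_password": "locked_warning"},
    "locked_warning": {"retry": "logged_out"},
    "logged_in": {"log_out": "logged_out"},
}

def generate_paths(state, depth, path=()):
    """Enumerate every action sequence up to `depth` steps from `state`."""
    if depth == 0 or not MODEL.get(state):
        return [path]
    paths = []
    for action, nxt in MODEL[state].items():
        paths.extend(generate_paths(nxt, depth - 1, path + (action,)))
    return paths

# Each generated path is one candidate test case; real tools then prune
# this set for maximum coverage with minimal tests.
for case in generate_paths("logged_out", 3):
    print(" -> ".join(case))
```

Even this toy model yields paths a hand-written suite might miss, such as failing a login, retrying, and then failing again, which is the coverage argument for the technique.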

Skilled software testers labour long hours to figure out all of the possible ways a user might interact with an application. But users are almost guaranteed to surprise programmers by figuring out some innovative and exciting way to use a program, one that causes results that may not have been predictable to a logical mind.


Clarus in VITAL partnership to test and verify Cisco IP comms

Posted in Software testing by testing in London on June 25, 2007

From Business Wire

Clarus Systems, a leading provider of integrated management and testing solutions for IP Communication deployments, upgrades and transformations, today announced a strategic alliance with VITAL Network Services, Inc., an independent, multi-vendor provider of network lifecycle services. The partnership was developed in response to demands for certification solutions to test and verify proper functionality of Cisco IP Communications deployments.

“We are pleased to add VITAL Network Services to our growing managed service provider alliance program,” said Brendan F. Reidy, President and CEO of Clarus Systems. “Through these strategic relationships, Clarus Systems allows providers like VITAL to create valuable solutions that power telephony transformations using ClarusIPC®.”

Under the new offering, VITAL’s network engineers will utilize ClarusIPC®, the leading automated testing and documentation application that maximizes system availability while optimizing system performance, to remotely test, document and certify that all aspects of the network environment meet user feature and functionality requirements at the time of deployment. This turnkey solution will enable customers to validate 100% of their deployment without the time and expense typically associated with this type of comprehensive feature, functionality and performance testing.

Verify software testing conference

Posted in Events and improvement,Software testing by testing in London on June 24, 2007

VERIFY 2007 is an independent conference held from October 29th to 30th, 2007 in Arlington, Virginia. There are three presentation tracks: security testing, test automation, and general testing.

Presentations are from industry leaders and practitioners and cover their techniques for succeeding in real-world software development projects and overcoming difficult implementation constraints. The conference is aimed at software testers, developers, test leads, development leads, and managers of test and development.

The keynote speeches are:

  • Software Security Testing — Gary McGraw, Ph.D
  • The Business Case for Test Automation — Bernie Gauf, President and CTO, IDT
  • Improving Your Development Process by Automating Infrastructure — Dr. Adam Kolawa, CEO, Parasoft
  • Continuously Aware: Evolving the Business-and-Tech-Savvy QA Organization — John Michelsen, Founder and Chief Architect, iTKO LISA

The conference focuses on:

  • Security testing
  • Java or .NET testing
  • Web Services testing
  • SAP testing
  • Model-based testing
  • Database testing
  • Data Migration testing
  • Agile testing, SCRUM, test driven development
  • Automated testing
  • Code Coverage Analysis
  • Risk-based testing
  • Test program management
  • Real world test solutions

For more information visit http://verifyconference.com

Software testing and hiccups

Posted in Mistakes,Software testing by testing in London on June 23, 2007

Software hiccups from The Indy Star

Some problems that could have been avoided with proper software testing:

  • June 10: HBO’s Web site crashes after “Sopranos” fans log on to register complaints about the show’s abrupt ending.
  • May 16: Indianapolis Public Schools discovers a security glitch in its software that allows the Social Security numbers of as many as 75,000 students and 3,000 teachers to be accessible via Google.
  • April 17: The BlackBerry wireless e-mail service suffers outages across North America that delay the sending and receiving of messages.
  • Jan. 17: TJX Cos., which runs T.J. Maxx and Marshall’s stores, reports an intrusion into its computer systems that exposed 45.7 million credit- and debit-card account numbers.
  • Jan. 3: A hacker breaks into the Indiana.gov Web site and accesses personal information for 71,000 health-care workers certified by the state, as well as 5,600 credit-card numbers from Indiana residents.


Error During Testing Grounds Airline

Posted in Mistakes,Software testing by testing in London on June 22, 2007

From PC World

An operational error during routine system testing of United Air Lines Inc.’s computer system was the preliminary root cause of a computer failure that forced the airline to cancel and delay U.S. flights yesterday, a United spokeswoman said. The failure was in the computers used to dispatch flights.

Urbanski, a spokeswoman for United, said the airline wouldn’t have a final count of affected flights until the close of business Thursday. She added that United plans to run a close-to-on-time schedule Thursday. However, there may be unrelated delays in the Chicago area this afternoon because of potential thunderstorms, she said.

United said it canceled 24 domestic flights Wednesday because of the computer glitch.

This is the second time this month that a computer malfunction caused significant delays in the U.S. On June 8, one of the two systems used by the U.S. Federal Aviation Administration to manage flight plans failed, causing flight delays and cancellations across the country.

HP announces new quality management tools

Posted in Software testing,testing tool by testing in London on June 21, 2007

From CBR

Following its acquisition of Mercury Interactive, Hewlett-Packard has announced a range of new business optimisation products and services. It has updated its Quality Center and LoadRunner product lines, providing integrated business requirements and software quality testing capabilities. HP said the integration will offer a more “agile testing” approach to software quality management that translates to quicker releases.

LoadRunner, an application performance testing tool, has been upgraded to include support for SOA architectures and rich internet applications.
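The kind of performance testing a tool like LoadRunner automates can be sketched in miniature: drive a target operation with many concurrent virtual users, then report latency percentiles and error counts. This is a generic illustration, not LoadRunner’s actual API; `fake_service` stands in for a real HTTP or SOA request:

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Generic load-test sketch (not LoadRunner): fire concurrent requests at a
# target operation and summarise latency. fake_service is a stand-in for a
# real service call and is purely illustrative.

def fake_service():
    time.sleep(0.01)          # simulate ~10 ms of server work
    return 200                # HTTP-style success status

def timed_call():
    start = time.perf_counter()
    status = fake_service()
    return status, time.perf_counter() - start

def run_load_test(users=20, requests_per_user=5):
    # Each worker thread plays the role of one concurrent virtual user.
    with ThreadPoolExecutor(max_workers=users) as pool:
        results = list(pool.map(lambda _: timed_call(),
                                range(users * requests_per_user)))
    latencies = sorted(t for _, t in results)
    p95 = latencies[int(0.95 * (len(latencies) - 1))]
    errors = sum(1 for status, _ in results if status != 200)
    return {"requests": len(results), "errors": errors, "p95_seconds": p95}

print(run_load_test())
```

Commercial tools add the parts this sketch omits: protocol recording, realistic pacing and think time, and correlation of dynamic session data.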

HP has also added a new set of quality, performance and process best practices, called “Quality Factory Services”, to its HP Services Application Lifecycle offering and has beefed up its Service Management portfolio with new offerings for change control management and configuration management.

HP also now offers a new framework for consolidating its service management tools and methodologies, based on standards like ITIL version 3 and COBIT.

Finally, HP has also tightened up integration across its quality management and testing tools and processes with a new configuration management database that provides dependency mapping and links to help desk and change management tools, and with software that aligns change request, approval and deployment cycles.

Compuware Recognised for Software Testing

Posted in Events and improvement by testing in London on June 20, 2007

From CMcrossroads

Compuware announced that it has been named to BZ Media’s fifth-annual SD Times 100 list for the fifth consecutive year. Compuware was recognised in the “Test and QA” and “Security” categories. The SD Times 100 recognizes the leaders and innovators of the software development industry.

“The winners of this year’s SD Times 100 awards have demonstrated their leadership in shaping the software development industry,” said David Rubinstein, Editor-in-Chief of SD Times. “We took into account each nominee’s products and services, its reputation among development managers, and the new ideas it brought out. These select individuals and organizations are the ones we’ve identified as helping to move the art of development forward.”

In the “Test and QA” category, SD Times Magazine editors wrote: “With perhaps the most diverse set of test tools offered, Compuware continues to excel equally in all areas, including unit, functional, load, performance and risk-based testing, and in requirements and even test-data management.”

Regarding Compuware in the “Security” category, SD Times Magazine editors wrote: “An acknowledged leader in software testing moves into the security arena with a white- and black-box testing tool upgraded to scan potential security flaws from inside Microsoft Visual Studio.”
