London software testing news UK


Mercury acquisition has made HP the leader in automated software testing

Posted in General,Software testing by testing in London on February 28, 2007

From Globes online

A report by research and consulting company IDC says that the acquisition of Mercury Interactive has made Hewlett-Packard the unchallenged leader in the automated software quality tools market. HP acquired Mercury in 2006 for US$4.5 billion.

HP-Mercury has a strong lead over other companies in the sector, with a 62% market share in 2005, almost four times that of its second-placed competitor, IBM. Other companies in the sector are Compuware, Intel, and Empirix Inc. RadView has less than a 1% share of this market.

IDC says that HP-Mercury had 27.4% growth in the automated software quality tools market in 2005, driving the $950 million market forward. The market is believed to have exceeded $1 billion in 2006. IDC does not think that the market is saturated, and says that there is room for further growth. Software testing is still mostly done manually at many companies, leaving plenty of room for automated solutions to penetrate further. IDC predicts that HP will continue to dominate the market and that it will expand its market share.

 Automated software testing services

Performance tests show improvements in blade switches

Posted in Load testing,Software testing by testing in London on February 27, 2007

From CRN 

Maximizing throughput has been the mantra of the data centre ever since servers were first plugged into switches. New applications and services, such as on-demand video and VoIP, along with SANs, have stretched infrastructure to its limits. But there is help on the horizon, in the form of 10-Gigabit Ethernet.

Ten-Gigabit Ethernet technology could not have arrived on the scene at a better time, since it aims to put the speed back into SANs and the power back into network applications. Blade Network Technologies, a company formed about a year ago when Nortel Networks sold off its blade server switch business unit, has entered the fray with its Nortel-based 10Gb Uplink Ethernet Switch Module, a component that brings Layer 2/3 switching into IBM and Hewlett-Packard blade housings. So far, the only 10-Gigabit blade switches available for IBM and HP systems are based on the Nortel chipset.

Third-party performance testing by the Tolly Group was performed on Blade Network Technologies’ Nortel 10Gb Uplink Ethernet Switch Module and Nortel 10Gb Ethernet Switch Module installed in an IBM BladeCenter. The results showed impressive performance for the switch modules as well as a favorable price/performance ratio. Both of these positive results can be traced back to the integration of the 10-Gigabit modules into a server blade, as opposed to operating in a separate, stand-alone chassis. The integrated configuration proves to be much more efficient, especially when high-performance server blades are housed in the same blade chassis.

Performance testing showed that the Nortel-based modules attained full line speeds and achieved 100 percent zero-loss throughput (<=0.001 percent acceptable frame loss) for all standard frame sizes (64 to 1,518 bytes). Store-and-forward latency ranged from 1.86 to 1.93 microseconds (millionths of a second) for Layer 2 operations. Tolly also showed that competing chassis-based 10-Gigabit solutions could not match those speeds.
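
To put those figures in context, here is a quick back-of-the-envelope check of what full line speed means on 10-Gigabit Ethernet. This is not the Tolly Group’s methodology, just standard Ethernet framing arithmetic, assuming the usual 8-byte preamble and 12-byte inter-frame gap per frame.

    # Theoretical maximum frame rates on 10GbE for the standard test frame sizes.
    # Per-frame overhead on the wire: 8-byte preamble/SFD + 12-byte inter-frame gap.
    LINE_RATE_BPS = 10_000_000_000
    OVERHEAD_BYTES = 8 + 12

    def max_frames_per_second(frame_size_bytes: int) -> float:
        """Theoretical maximum frame rate at full line speed."""
        bits_on_wire = (frame_size_bytes + OVERHEAD_BYTES) * 8
        return LINE_RATE_BPS / bits_on_wire

    for size in (64, 512, 1518):   # standard RFC 2544-style frame sizes
        print(f"{size:>5}-byte frames: {max_frames_per_second(size):>13,.0f} frames/s")

    # 64-byte frames   -> ~14.88 million frames/s
    # 1,518-byte frames -> ~0.81 million frames/s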

performance software testing 

So when does it make sense to use an automated penetration test?

Posted in security testing,Software testing by testing in London on February 26, 2007

From FCW.com

Advocates of the new tools say the applications give in-house security professionals more control, including the ability to perform penetration tests as often as they want. Critics of automated tools say they are a poor substitute for a thorough and nuanced manual test that a skilled practitioner performs. Most experts agree, however, that an automated penetration test in the hands of an untrained novice could do more harm than good.

“A fool with a tool is still a fool,” said Bill Harrod, a security management consultant at CA, formerly Computer Associates.

Penetration testing and other types of IT security assessment form a growing industry. In a world made increasingly unsafe by identity theft, online rip-offs and other cybercrimes, IT security pros are under pressure to fortify systems and networks and protect information assets. A growing number of federal regulations require public- and private-sector CIOs to harden their systems and networks against external and internal threats.

Vulnerabilities are increasing exponentially. The number reached 5,990 in 2005, according to Carnegie Mellon University’s Computer Emergency Response Team Coordination Center, an increase from 171 vulnerabilities reported a decade earlier.

assurance quality software testing

Testing Ajax applications

Posted in security testing,Software testing by testing in London on February 25, 2007

From CBR Online

Watchfire Corp, which takes an ethical hacking approach to uncovering website vulnerabilities, is releasing a new version of its enterprise offering that extends coverage to testing loosely scripted, highly interactive Ajax-style applications.

The newly released version, AppScan Enterprise 5.0, adds the ability to automatically test Ajax-style applications, simulating end-user interactions, including pauses for steps such as filling out forms.

It also adds new features aimed at getting QA specialists, and even developers, over their reluctance to do security testing. The hang-up, of course, is that security issues such as cross-site scripting or buffer overflows are Greek to most developers and testers; it’s the sort of thing only security specialists know well.

So Watchfire has added simplified screens that hide the complex configuration settings, which security teams can preset ahead of time. It also displays results in software testing and development terms, showing the vulnerable piece of code rather than just reporting a “cross-site scripting error.”
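
As a rough illustration of the kind of check an automated scanner performs, here is a minimal sketch of a reflected cross-site scripting probe in Python. It is not how AppScan works internally, and the target URL and parameter name are hypothetical placeholders.

    # Minimal reflected-XSS probe: send a marker payload and see whether it
    # comes back unescaped. Not AppScan's implementation; the URL and the
    # parameter name are hypothetical.
    import requests  # third-party: pip install requests

    TARGET = "http://test.example.com/search"   # hypothetical form handler
    PAYLOAD = "<script>alert('xss-probe-8431')</script>"

    def looks_reflected(url: str, param: str) -> bool:
        """Return True if the payload is echoed back unescaped."""
        resp = requests.get(url, params={param: PAYLOAD}, timeout=10)
        return PAYLOAD in resp.text

    if __name__ == "__main__":
        if looks_reflected(TARGET, "q"):
            print("Parameter 'q' reflects input unescaped - investigate further.")
        else:
            print("No naive reflection found (deeper testing still needed).")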

Separate from this announcement, Watchfire disclosed a hole in Google Desktop that an intruder could exploit to peer into a victim’s hard drive.

Google Desktop is one of those hybrid applications that combine a web-based interface with a locally installed desktop client. It therefore has the kind of exposure characteristic of a web app but, through its functionality, also reaches into the desktop.

user acceptance test UAT

Testing and Debugging DSP Systems

Posted in Software testing by testing in London on February 24, 2007

From Dr Dobbs

In software development, perhaps the most critical, yet least predictable, stage in the process is debugging. Many factors come into play when debugging software applications. Among these factors, time is of the utmost importance. The time required to set up, test and debug a software application can have a significant impact on time-to-market, on meeting customer expectations, and on the financial return of a well-developed product that succeeds in the market. The integration of an application follows a model of multiple spirals through the stages of build, load, debug/tune, and change.

Debugging embedded real-time systems is part art and part science. The tools and techniques used in debugging and integrating these systems have a significant impact on the amount of time spent in the debug, integration, and test phase. The more visibility we gain into the running system, the faster we are able to detect and fix bugs.

Software testing risk based services UK

The impact of inadequate software testing is nearly $60 billion annually

Posted in Software testing book by testing in London on February 23, 2007

From Sys-con

Customers have high expectations: that their software solutions have been stress-tested thoroughly in advance for every conceivable combination of events that might occur in production, and that vendors who put out buggy products will be exposed quickly.

Unfortunately, inadequate infrastructure for software testing is said to cost approximately $59.5 billion annually, according to a 2002 study, “The Economic Impacts of Inadequate Infrastructure for Software Testing,” conducted by the National Institute of Standards and Technology. This cost reflects, in part, the extra resources expended, due to inadequate testing tools and methods, to spot and correct errors found in the testing process. This figure represents the long and arduous road developers must travel to provide products that meet the expectations of their customers.

Because of the varied and disparate nature of Linux, the software testing process in the Linux environment is inherently complicated. In fact, it is a practical nightmare for most development teams, a time- and labor-intensive endeavour requiring manual installation of many Linux distributions and combinations on limited physical resources. Easily avoidable code problems and other potential issues are often missed simply because time and budget limits preclude the proper testing of every possible variable.
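
A toy example makes the combinatorial problem concrete; the distribution, architecture and database values below are illustrative, not taken from the study.

    # Why "testing every possible variable" on Linux explodes: even a modest
    # matrix of distribution x architecture x database quickly outgrows a
    # small lab. The values below are illustrative placeholders.
    from itertools import product

    distros = ["RHEL 4", "SLES 9", "Debian 3.1", "Fedora Core 6", "Ubuntu 6.06"]
    arches = ["x86", "x86_64"]
    databases = ["MySQL", "PostgreSQL", "Oracle"]

    matrix = list(product(distros, arches, databases))
    print(f"Full matrix: {len(matrix)} installations to build and test")

    # With limited hardware and time, teams typically sample the matrix,
    # e.g. keep only the configurations their largest customers actually run.
    sampled = [combo for combo in matrix if combo[0] in ("RHEL 4", "SLES 9")]
    print(f"Sampled matrix: {len(sampled)} installations")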

Stress testing software solutions

Wembley in final testing phase

Posted in General by testing in London on February 22, 2007

From Slam canoe

LONDON (AP) – Wembley officials say the stadium will be ready to host the FA Cup final on May 19 despite the postponement of a warmup event.  A “community day” scheduled for March 3 was put off Tuesday until March 17. The event will allow more than 60,000 local residents to visit the 80,000-seat venue.

“We still remain on track to host the 2007 FA Cup final but the FA (Football Association) will only announce the 2007 Cup final at Wembley once the stadium has been granted its general safety certificate,” Alex Horne, Wembley managing director, said. “We are on course to go through the necessary testing procedures in order to get the certificate.”

Watchfire releases AppScan Enterprise 5 testing tool

Posted in Acceptance testing by testing in London on February 21, 2007

From Net Security

Watchfire announced AppScan Enterprise 5. Based on next-generation technology, the new version further strengthens the industry’s only web-based application security solution for security professionals, and extends its utility with a new point-and-shoot testing tool called QuickScan and integrated Computer Based Training to accelerate the adoption of security testing by QA and development teams.

Highlights of AppScan Enterprise 5’s Next Generation Architecture:

  • Advanced scanning capabilities that find vulnerabilities associated with the latest Web 2.0 technologies such as AJAX, as well as advanced JavaScript and Flash
  • Manual Explore and Recorded Login features to ensure successful site navigation and complete crawling (see the sketch after this list)
  • More flexible reporting framework with enhanced searching, grouping and filtering
  • More granular controls to lock down scanning and report access so sensitive security data is only available to those who truly need it
  •  Complete technology refresh with cleaner architecture and improved customization capabilities
  • Brand new graphical user interface, providing ease of use for developers
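
To give a rough idea of what a recorded login followed by a complete crawl involves, here is a minimal sketch in Python. It is not AppScan’s engine, and the URLs, form fields and credentials are hypothetical placeholders.

    # Sketch of a "recorded login" replay followed by a same-site crawl.
    # Not AppScan's engine; endpoints and form fields are hypothetical.
    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    import requests  # third-party: pip install requests

    class LinkCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(base_url: str, login_path: str, credentials: dict, limit: int = 50):
        session = requests.Session()
        # Replay the captured credential POST so protected pages are reachable.
        session.post(urljoin(base_url, login_path), data=credentials, timeout=10)

        seen, queue = set(), deque([base_url])
        while queue and len(seen) < limit:
            url = queue.popleft()
            if url in seen or urlparse(url).netloc != urlparse(base_url).netloc:
                continue
            seen.add(url)
            resp = session.get(url, timeout=10)
            parser = LinkCollector()
            parser.feed(resp.text)
            queue.extend(urljoin(url, href) for href in parser.links)
        return seen

    # pages = crawl("http://app.example.com/", "/login",
    #               {"user": "qa", "password": "secret"})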

Software security testing and penetration testing services

Testing and the waterfall model

Posted in Software testing by testing in London on February 20, 2007

There is an article on the pros and cons of the waterfall model for software development at Builder AU. On the testing phase, it states:

“Testing: In this stage, both individual components and the integrated whole are methodically verified to ensure that they are error-free and fully meet the requirements outlined in the first step. An independent quality assurance team defines “test cases” to evaluate whether the product fully or partially satisfies the requirements outlined in the first step. Three types of testing typically take place: unit testing of individual code modules; system testing of the integrated product; and acceptance testing, formally conducted by or on behalf of the customer. Defects, if found, are logged and feedback provided to the implementation team to enable correction. This is also the stage at which product documentation, such as a user manual, is prepared, reviewed and published.”
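
To make the first of those three testing types concrete, here is a minimal example of a unit test for an individual code module, written with Python’s unittest; the module and business rule are invented for illustration and are not from the article.

    # One test case exercising a single function in isolation - the "unit
    # testing of individual code modules" step. The discount rule is invented.
    import unittest

    def discount(price: float, customer_years: int) -> float:
        """Apply a 10% loyalty discount to customers of 3+ years."""
        if price < 0:
            raise ValueError("price must be non-negative")
        return round(price * 0.9, 2) if customer_years >= 3 else price

    class DiscountTests(unittest.TestCase):
        def test_loyal_customer_gets_discount(self):
            self.assertEqual(discount(100.0, 5), 90.0)

        def test_new_customer_pays_full_price(self):
            self.assertEqual(discount(100.0, 1), 100.0)

        def test_negative_price_rejected(self):
            with self.assertRaises(ValueError):
                discount(-1.0, 5)

    if __name__ == "__main__":
        unittest.main()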

It describes the pros as:

“First, the staged development cycle enforces discipline: every phase has a defined start and end point, and progress can be conclusively identified (through the use of milestones) by both vendor and client. The emphasis on requirements and design before writing a single line of code ensures minimal wastage of time and effort and reduces the risk of schedule slippage, or of customer expectations not being met.

Getting the requirements and design out of the way first also improves quality; it’s much easier to catch and correct possible flaws at the design stage than at the testing stage, after all the components have been integrated and tracking down specific errors is more complex. Finally, because the first two phases end in the production of a formal specification, the waterfall model can aid efficient knowledge transfer when team members are dispersed in different locations.”

For the cons, see the article.

Business Acceptance Testing

UCAS chooses e-Load for testing its website

Posted in Software testing,testing tool by testing in London on February 19, 2007

From Public Technology

UCAS is a name that any A-Level student and higher education institution will be very familiar with. It is the central organisation that processes applications for full-time undergraduate courses at UK universities and colleges and it is the key link between the students and universities and colleges.

UCAS currently supports around 330 higher education institutions offering over 55,000 courses, and in total processes 2.5 million applications from 520,000 applicants.

One of the major challenges for UCAS is that the process of applying to higher education institutions is deadline focused. Applicants, schools and universities have numerous deadlines to meet during the application process. This means that the IT infrastructure and systems at UCAS have to be able to cope with huge spikes in the number of users accessing the site for information, completing applications, monitoring and updating application information.

By May 2006, applicants looking to enter higher education in October 2006 had made over 13 million logins to the UCAS website to review the progress of their applications. Throughout the application cycle UCAS’ IT infrastructure is truly tested on a number of mission-critical days; on one such day in 2006 the website supported over 5 million page impressions, with the tracking service handling a peak load of 47 new logins per second.

UCAS recently used the e-Load solution to ensure that its Track service would handle the huge spike in the number of new logins, predicted to be over 20 logins per second. The e-Load tool allowed UCAS to locate and remove any bottlenecks in the existing infrastructure, to ensure the service could deliver between 40 and 60 logins per second. This work proved vital, as the application sustained an average login rate of 25 logins per second for the first day of operation.
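
For readers wondering what measuring logins per second involves, here is a bare-bones sketch in Python. It is not the e-Load product, just an illustration against a hypothetical login endpoint.

    # Measure sustained logins per second with a thread pool. Not e-Load;
    # the endpoint and credentials are hypothetical.
    import time
    from concurrent.futures import ThreadPoolExecutor
    import requests  # third-party: pip install requests

    LOGIN_URL = "https://track.example.com/login"   # hypothetical
    ATTEMPTS = 500
    CONCURRENCY = 50

    def one_login(i: int) -> bool:
        resp = requests.post(LOGIN_URL,
                             data={"user": f"load{i}", "password": "x"},
                             timeout=30)
        return resp.status_code == 200

    def run() -> None:
        start = time.perf_counter()
        with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
            results = list(pool.map(one_login, range(ATTEMPTS)))
        elapsed = time.perf_counter() - start
        print(f"{sum(results)}/{ATTEMPTS} logins succeeded, "
              f"{ATTEMPTS / elapsed:.1f} logins/second sustained")

    if __name__ == "__main__":
        run()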

OneSight provides a comprehensive view of an organisation’s entire web infrastructure to ensure the optimal performance of its web-based applications and deliver the best possible quality to its end users. It combines effective measurement of the customer experience with metrics on the performance and availability of supporting applications and infrastructure.

It also tracks the performance of the web applications, monitoring user transactions and the operation of components such as servers, databases and network devices, and detecting changes that could jeopardize application performance. This allows UCAS to identify, prioritize and address emerging problems before customers or users are affected.
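
As a rough idea of the kind of synthetic check such monitoring performs, here is a toy availability-and-latency probe. It is not OneSight itself; the URLs and latency budget are hypothetical.

    # Toy synthetic-monitoring probe: fetch key pages, time them, and flag
    # failures or slow responses. Not OneSight; values are hypothetical.
    import time
    import requests  # third-party: pip install requests

    PAGES = ["https://www.example.ac.uk/", "https://track.example.ac.uk/login"]
    LATENCY_BUDGET_S = 2.0   # alert if a page takes longer than this

    def check(url: str) -> None:
        start = time.perf_counter()
        try:
            resp = requests.get(url, timeout=10)
            elapsed = time.perf_counter() - start
            if resp.status_code != 200 or elapsed > LATENCY_BUDGET_S:
                print(f"ALERT {url}: status={resp.status_code}, {elapsed:.2f}s")
            else:
                print(f"OK    {url}: {elapsed:.2f}s")
        except requests.RequestException as exc:
            print(f"ALERT {url}: unreachable ({exc})")

    for page in PAGES:
        check(page)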

application load testing
