London software testing news UK


VMware offers performance testing tool

Posted in Load testing, Software testing, testing tool by testing in London on July 31, 2007

From ZDNet

VMware has launched a new tool to test the performance of virtualised systems, something that has been difficult to pin down despite the benefits of virtualisation.

VMmark measures the performance of applications running in virtualised environments. The company admits, however, that it was no easy job to devise a standard benchmark that accurately represents the enormous variety of customer environments running on virtualised systems. It took “two years of engineering design, collaboration with partners and review of extensive customer survey data” to develop the benchmark, the company said.

To produce the benchmark, VMware avoided the relatively easy approach of measuring the performance of virtualised application software running on specific machines. Instead, the benchmark measures the scalability of heterogeneous virtualised workloads. According to VMware: “It provides a consistent methodology so benchmark results can be compared across different virtualisation platforms.”

The result is that, by using the benchmark, companies should be able to “make appropriate hardware choices, and compare the performance and scalability of different virtualisation platforms”, VMware said.
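For a rough sense of how a tile-based benchmark of this kind might be scored, here is a minimal Python sketch: it normalises each workload’s throughput against a reference system, takes a geometric mean per tile, and sums across tiles so that scaling to more tiles raises the score. The workload mix, reference figures and aggregation formula are illustrative assumptions, not VMmark’s published methodology.

```python
import math

# Hypothetical reference throughputs for each workload in a tile
# (illustrative numbers only, not VMmark's reference platform).
REFERENCE = {"mail": 120.0, "web": 950.0, "database": 310.0, "file": 210.0}

def tile_score(measured: dict) -> float:
    """Geometric mean of each workload's throughput normalised to the reference."""
    ratios = [measured[w] / REFERENCE[w] for w in REFERENCE]
    return math.exp(sum(math.log(r) for r in ratios) / len(ratios))

def benchmark_score(tiles: list) -> float:
    """Overall score: sum of per-tile scores, so adding tiles rewards scalability."""
    return sum(tile_score(t) for t in tiles)

if __name__ == "__main__":
    tiles = [
        {"mail": 130.0, "web": 900.0, "database": 330.0, "file": 205.0},
        {"mail": 125.0, "web": 880.0, "database": 300.0, "file": 200.0},
    ]
    print(f"Score over {len(tiles)} tiles: {benchmark_score(tiles):.2f}")
```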

Smoke testing at a testing station?

Posted in General by testing in London on July 30, 2007

From Road Transport

A Vosa testing station in Thurrock says it was told nothing about LGV smoke tests for London’s Low Emission Zone until an operator contacted CM to complain about the confusion. David Newman says he downloaded information from Transport for London’s website about the engines that will need tests to prove they comply with the emission standard, but when he rang Purfleet testing station he claims no one had heard anything about it.

Newman says he was also told the station didn’t have any Low Emission Certificates to give operators who run Euro-1 and Euro-2 trucks that nevertheless reach the Euro-3 standard on particulate matter.

He also claims that when he rang a Vosa information line, no one there knew anything about the forthcoming smoke testing either, due to start on 1 August. Newman says: “TfL is saying one thing and Vosa is saying another. They haven’t been informed. It’s just a sham.”

However, Purfleet station manager Kelly Freeman denies that the station was unaware of the smoke tests; she says it has been kept up to date with information from both TfL and Vosa.

Consumer testing for interoperability

Posted in General by testing in London on July 29, 2007

From EE Times

With interoperability issues for digital entertainment systems still dogging the consumer electronics industry, a subsidiary of the chip vendor that invented the High-Definition Multimedia Interface (HDMI) has seized the opportunity to turn the industry-wide problem into a profit centre. Now its year-old test program, touted at introduction as a much-needed de facto standard regime for CE interoperability, is under fire from some who say it is little more than a costly feel-good seal of approval.

Silicon Image Inc. unit Simplay Labs LLC created the SimplayHD Compatibility Test Specification to test High-bandwidth Digital Content Protection (HDCP) functionality in conjunction with HDMI for consumer electronics OEMs. The program also offers compatibility testing among devices from different vendors. The company offers OEMs that pass the test a SimplayHD certification logo. While SimplayHD is an elective service for OEMs, Simplay Labs has told OEMs that big-box retailers such as Best Buy will recommend products with the SimplayHD logo over competing offerings.

Some CE vendors immediately embraced the SimplayHD testing program, but in recent months a number of companies have begun to question the lab’s testing practices. The concerns cited include a lack of publicly available documented information on SimplayHD testing procedures and pass/fail criteria; long waiting periods for getting a system tested; and “extortionate” pricing for each test: $15,000 for each HDMI-equipped digital entertainment “source” system, such as a high-definition TV or DVD player.

“Several customers in Taiwan, China and Korea are having difficulty completing SimplayHD testing,” said Doug Bartow, strategic marketing manager for the advanced-TV segment at Analog Devices Inc. Without full disclosure of how the SimplayHD tests are conducted and what the pass/fail criteria are, “our customers are unable to correct the problem, let alone understand the nature of the problem,” Bartow said.

A year ago, the absence of coordinated HDCP/HDMI testing posed a huge problem for the industry. In some cases, even HDMI products already certified by HDMI’s authorised test centres would not function correctly together, because HDCP, an Intel-developed digital content protection technology that controls content as it crosses HDMI connections, had not been properly implemented.
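For readers wondering what HDCP actually checks when two devices connect, here is a toy Python model of the first step of HDCP 1.x authentication: each device holds a key selection vector (KSV) and a set of device keys, and both ends derive the same shared secret by summing the keys selected by the other side’s KSV bits. The symmetric key matrix and the small bit widths are simplifying assumptions standing in for the confidential HDCP key facility.

```python
import random

KSV_BITS = 8          # real HDCP KSVs are 40 bits with exactly 20 set
KEY_MOD = 2 ** 16     # real device keys are 56-bit values

# A symmetric secret matrix stands in for the confidential HDCP key facility:
# symmetry is what makes both sides derive the same shared secret.
random.seed(1)
MATRIX = [[0] * KSV_BITS for _ in range(KSV_BITS)]
for i in range(KSV_BITS):
    for j in range(i, KSV_BITS):
        MATRIX[i][j] = MATRIX[j][i] = random.randrange(KEY_MOD)

def device_keys(ksv: int) -> list:
    """Derive a device's key vector from the secret matrix and its KSV bits."""
    rows = [i for i in range(KSV_BITS) if ksv >> i & 1]
    return [sum(MATRIX[i][j] for i in rows) % KEY_MOD for j in range(KSV_BITS)]

def shared_secret(own_keys: list, peer_ksv: int) -> int:
    """Sum own keys selected by the peer's KSV bits (the HDCP 1.x 'Km' step)."""
    return sum(own_keys[j] for j in range(KSV_BITS) if peer_ksv >> j & 1) % KEY_MOD

if __name__ == "__main__":
    source_ksv, sink_ksv = 0b10110100, 0b01101101
    km_source = shared_secret(device_keys(source_ksv), sink_ksv)
    km_sink = shared_secret(device_keys(sink_ksv), source_ksv)
    print(km_source == km_sink)  # True: both ends derive the same secret
```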

Testing and quality control: the only certification needed?

Posted in Software testing by testing in London on July 28, 2007

From InfoQ

A new certification for software developers should not be about OOP, metaprogramming, macros, design patterns or any in-depth knowledge of programming languages. Reginald Braithwaite believes that only one subject must be on the examination list: testing and quality control.

Braithwaite stresses that this is not a debate about “whether to have separate testers or whether programmers should test themselves.” He simply asserts that, judging from his experience, safety is of crucial importance for software development in commercial environments. Hence, developers’ ability to ensure that software does what it is expected to do should be a prerequisite for any software development job:

My point here is that it’s tough to call yourself a “professional software developer” if you aren’t intimately familiar with the processes by which we evaluate the result of your work.

Imagine walking up to a Structural Engineer and talking about stress-testing materials. Wouldn’t you be worried if they shrugged and said, “that’s testing, I just design stuff”?
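To make the point concrete, here is a minimal sketch of the kind of evaluation Braithwaite has in mind: an executable test that pins down what a piece of code is expected to do. The apply_discount function and its expected behaviour are invented for illustration.

```python
import unittest

def apply_discount(price: float, percent: float) -> float:
    """Return the price after a percentage discount (hypothetical example)."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    def test_typical_discount(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_zero_discount_leaves_price_unchanged(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

    def test_invalid_percent_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

if __name__ == "__main__":
    unittest.main()
```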

Testing disaster recovery is critical

Posted in General, security testing by testing in London on July 27, 2007

From Wired  

And anyone who does this kind of thing knows that disaster recovery planning isn’t enough: Testing your disaster plan is critical. Far too often the backup software fails when it has to do an actual restore, or the diesel-powered emergency generator fails to kick in. That’s also the flaw with the emergency kit; if you don’t know how to use a compass or first-aid kit, having one in your car won’t do you much good.
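One way to act on that advice is to automate the restore test itself: restore the backup to a scratch location and verify every file against the original. The Python sketch below shows the general shape; the backup command and paths are placeholders for whatever tool and layout are actually in use.

```python
import hashlib
import pathlib
import subprocess

SOURCE = pathlib.Path("/data/critical")             # directory covered by backups
RESTORE_TARGET = pathlib.Path("/tmp/restore_test")  # scratch restore location
RESTORE_CMD = ["my-backup-tool", "restore", "--to", str(RESTORE_TARGET)]  # placeholder

def digest_tree(root: pathlib.Path) -> dict:
    """Map each file's relative path to a SHA-256 digest of its contents."""
    return {
        p.relative_to(root): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*")) if p.is_file()
    }

def verify_restore() -> bool:
    subprocess.run(RESTORE_CMD, check=True)  # fail loudly if the restore fails
    original, restored = digest_tree(SOURCE), digest_tree(RESTORE_TARGET)
    missing = original.keys() - restored.keys()
    corrupt = {p for p in original.keys() & restored.keys()
               if original[p] != restored[p]}
    for p in sorted(missing):
        print(f"MISSING: {p}")
    for p in sorted(corrupt):
        print(f"MISMATCH: {p}")
    return not missing and not corrupt

if __name__ == "__main__":
    print("restore verified" if verify_restore() else "restore FAILED verification")
```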

But testing isn’t just valuable because it reveals practical problems with a plan. It also has enormous ancillary benefits for your company in terms of communication and team building. There’s nothing like a good crisis to get people to rely on each other. Sometimes I think companies should forget about those team-building exercises that involve climbing trees and building fires, and instead pretend that a flood has taken out the primary data centre.

It really doesn’t matter what disaster scenario you’re testing. The real disaster won’t be like the disaster recovery testing, regardless of what you do, so just pick one and go. Whether you’re an individual trying to recover from a simulated virus attack, or an organization testing its response to a hypothetical shooter in the building, you’ll learn a lot about yourselves and your organization, as well as your plan.

There is a sweet spot, though, in disaster preparedness…

Software developers keep testing onshore

Posted in Software testing by testing in London on July 26, 2007

From ITwire

Eighty-six percent of Australian and New Zealand software development organisations do all their testing onshore, according to a survey.

The companies are a little more open to outsourced testing, with 77 percent saying all such work is done in-house. The survey was conducted by Compuware, whose products include testing tools.

Most testing remains unautomated: only seven percent of respondents automate more than 60 percent of their software testing, and 32 percent do no automated testing at all.

“Organisations that can successfully automate a high proportion of their software testing will probably continue to test software in-house,” said Franco Flore, Compuware’s subject-matter expert for application delivery management.

“Those that fall behind best practice, however, will be increasingly tempted to outsource or offshore at least some proportion of their software testing to achieve improved systems quality, quicker time to market and lower cost.”

To paraphrase Mandy Rice-Davies, he would say that, wouldn’t he? Still, vested interest and the truth can be compatible, and this writer would far rather see Australian software tested efficiently within the country than have the task sent offshore.

HP buys data centre automation software company

Posted in General by testing in London on July 25, 2007

From Red Herring

In a move aimed at strengthening its software business, computing giant Hewlett-Packard on Monday agreed to acquire Opsware for $1.6 billion in cash.

IDC analyst Stephen Elliot said HP is paying a premium because demand for Opsware’s data centre automation software has increased considerably in the past few years, driven by its ability to help businesses cut costs. This type of software automates the data centre, performing functions such as patching, provisioning, configuration, compliance, and deployment.
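As a rough sketch of what automating the data centre means in practice, the fragment below shows the desired-state pattern such tools are generally built around: compare each server’s observed state with a declared target and apply only the missing changes. The inventory, checks and remediation steps are invented placeholders, not Opsware’s actual interface.

```python
# A minimal desired-state loop (illustrative placeholders, not Opsware's API).
DESIRED_STATE = {
    "web-01": {"packages": {"nginx", "openssl-patch-2007-07"}, "ntp": True},
    "db-01": {"packages": {"postgres", "openssl-patch-2007-07"}, "ntp": True},
}

def observe(host: str) -> dict:
    """Stub: in a real tool this would query the host's agent."""
    return {"web-01": {"packages": {"nginx"}, "ntp": False},
            "db-01": {"packages": {"postgres", "openssl-patch-2007-07"}, "ntp": True}}[host]

def remediate(host: str, desired: dict) -> None:
    actual = observe(host)
    for pkg in desired["packages"] - actual["packages"]:
        print(f"{host}: installing {pkg}")  # would invoke the deployment engine
    if desired["ntp"] and not actual["ntp"]:
        print(f"{host}: enabling ntp")      # configuration drift repaired

for host, desired in DESIRED_STATE.items():
    remediate(host, desired)
```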

Mr. Elliot also said Opsware will make Palo Alto, California-based HP the leading seller of data centre automation software and allow HP to better compete against IBM and others.

HP has taken several steps in the last two years to grow its software business. In September 2005, HP bought IT management software vendor Peregrine Systems for $425 million. A year ago, HP bought testing software maker Mercury Interactive for $4.5 billion.

The Opsware transaction is expected to close before October 31, and the company will become part of HP’s software business. Opsware CEO Ben Horowitz will lead the business technology optimization division, reporting to Thomas E. Hogan, senior vice president of HP Software.

ACORD expands testing and certification facility

Posted in Software testing by testing in London on July 24, 2007

From Insurance Networking News

Building on the success of its initial rollout, the insurance standards body ACORD announced the expansion of its testing and certification facility to include test capabilities for ACORD Reinsurance & Large Commercial (RLC) Placing message implementations.

The ACORD testing and certification facility supports developers implementing ACORD standards by enabling them to test their messaging systems using a “virtual” business partner. Developers can send ACORD messages to it, have the system validate them, and then have it send back responses indicating acceptance or errors. This live application streamlines the entire process, thereby increasing certifications and improving communication between business partners.
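In practice, a round trip against a facility like this amounts to posting an XML message and inspecting the acknowledgement. The Python sketch below shows the general shape; the endpoint URL, payload and acknowledgement layout are invented placeholders, since the real facility’s interface is not described here.

```python
import urllib.request
import xml.etree.ElementTree as ET

ENDPOINT = "https://testfacility.example.com/acord/placing"  # hypothetical URL

PLACING_MESSAGE = b"""<?xml version="1.0"?>
<PlacingMessage><Broker>EXAMPLE-BROKER</Broker><Risk>EXAMPLE-RISK</Risk></PlacingMessage>
"""  # stand-in payload, not a real ACORD RLC Placing message

def submit_and_check(message: bytes) -> bool:
    """Post the message and return True if the facility acknowledges it."""
    request = urllib.request.Request(
        ENDPOINT, data=message, headers={"Content-Type": "text/xml"})
    with urllib.request.urlopen(request) as response:
        ack = ET.fromstring(response.read())
    status = ack.findtext("Status")          # assumed acknowledgement layout
    for error in ack.iterfind("Error"):
        print("validation error:", error.text)
    return status == "Accepted"

if __name__ == "__main__":
    print("accepted" if submit_and_check(PLACING_MESSAGE) else "rejected")
```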

The system originally went live in February 2007 for ACORD Document Repository Interface (DRI) messages to support the London Electronic Claims File initiative. Now, with the incorporation of the Placing messages, the facility can also support the growing number of brokers and underwriters developing placing systems within the London market.

“This is an important new step for our testing facility, which will be followed by incorporating additional support for the RLC Accounting, Settlement and Claims messages later in 2007,” said Lloyd Chumbley, assistant vice president, Standards, ACORD, Pearl River, N.Y.

The extension of the ACORD testing and certification facility’s capabilities is fully supported by implementers within the London market and their market associations.

“The LMA is fully supportive of the use of the ACORD RLC message testing facility for placement. This will be a significant aid to any broker or syndicate who is developing an electronic placing capability, greatly reducing time required to test the message structure,” said David Gittings, CEO of the Lloyd’s Market Association.

An important aspect of the ACORD facility is its ability to provide formal certification that a developer’s system complies with the agreed ACORD standards.

“Organizations are starting to insist that new partners are certified by our system and this has led to strong and increasing usage since going live in February,” said Roy Laker, assistant vice president, London office, ACORD.

The ACORD testing and certification facility was developed in partnership with Trisystems Infobahn.

Platform-independent numerical library IMSL

Posted in General by testing in London on July 23, 2007

From Dr. Dobb’s Portal

A platform-independent numerical library such as IMSL benefits customers in three ways:

  • Application developers typically focus their energy on the unique requirements of their application and use existing core routines for basic linear algebra (a rough analogy: carpenters build the building but don’t make their own hammers). Open-source routines are available, but they are generally not as capable or robust. Those who require more advanced routines, for FFTs, interpolation, differential equations, regressions and the like, face limited choices: develop their own or use a commercial library such as IMSL (see the sketch after this list). Thousands of applications use IMSL routines as their underlying libraries.
  • Because the same IMSL routines, operating in the same manner, are available on multiple platforms, porting applications to different environments is much easier: the underlying maths routines are already there.
  • The time and resources saved by using IMSL, rather than developing, testing, debugging and deploying a basic routine, usually pay for the cost of an IMSL licence.
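To illustrate the first point, the sketch below contrasts a single call into a well-tested library routine with hand-rolling the same mathematics, using NumPy’s FFT as an open-source stand-in (quoting IMSL’s own Fortran interface here would mean guessing at its call signatures).

```python
import numpy as np

signal = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 64, endpoint=False))

# Library route: one call into a routine that has been tested on every platform.
spectrum = np.fft.fft(signal)

# Hand-rolled route: a naive O(n^2) DFT, the kind of code teams end up
# developing, testing and debugging themselves when they skip the library.
n = len(signal)
k = np.arange(n).reshape(-1, 1)
naive = (signal * np.exp(-2j * np.pi * k * np.arange(n) / n)).sum(axis=1)

print(np.allclose(spectrum, naive))  # True, but one line versus a numerics project
```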

Absoft has been porting the IMSL library to various platforms for over 17 years. The library consists of almost 8,500 FORTRAN 77 and Fortran 90 source files. Porting is simplified with an extensive makefile system developed for the GNU toolset (make, as, ar, ld, ranlib, etc.).

The first step is to build the library at the highest optimization level. The library is then tested with an extensive test suite. Failures are identified and optimization is reduced until the failure is eliminated.
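That build-and-back-off process is easy to picture as a loop: compile at the highest optimisation level, run the test suite, and step the optimisation flag down until the failures disappear. The compiler flags and make targets below are generic placeholders rather than Absoft’s actual build system.

```python
import subprocess

OPT_LEVELS = ["-O3", "-O2", "-O1", "-O0"]  # highest first, backing off on failure

def build_and_test(opt_flag: str) -> bool:
    """Rebuild the library at one optimisation level and run its test suite."""
    build = subprocess.run(["make", "clean", "all", f"FFLAGS={opt_flag}"])
    if build.returncode != 0:
        return False
    tests = subprocess.run(["make", "test"])
    return tests.returncode == 0

for flag in OPT_LEVELS:
    if build_and_test(flag):
        print(f"test suite passes at {flag}")
        break
    print(f"failures at {flag}, reducing optimisation")
else:
    print("failures persist even at -O0; a real porting bug, not an optimiser issue")
```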

There are four library configurations: basic, SMP (with ATLAS BLAS), MPI, and MPI/SMP.

Once all test cases pass for all library configurations, the library is submitted to VNI for final acceptance testing and packaging.

Neither the 32-bit nor the 64-bit port of the library posed any significant problems on the Macintosh.

Automate testing for data and report integrity

Posted in Software testing, testing tool by testing in London on July 22, 2007

From ThomasNet

MicroStrategy, a leading worldwide provider of business intelligence (BI) software, today announced its plans to launch MicroStrategy Integrity Manager™, a new product that tests the accuracy of BI applications by validating data and report integrity. MicroStrategy Integrity Manager automatically compares and verifies the consistency of reports as changes are made to the BI ecosystem, then highlights issues and discrepancies to monitor the overall reliability of the BI content used by business decision makers. The product will be introduced tomorrow at the MicroStrategy Symposium in London.

Companies typically monitor the integrity of their BI data through manual data validation. However, despite the increasing importance of BI applications, BI support teams often lack the necessary resources to ensure that data or software integrity problems have not been unexpectedly introduced during data loads or software upgrades. Additionally, in self-service BI systems where end users create their own reports, many companies fail to perform validation testing on the large number of user-generated reports. MicroStrategy Integrity Manager automatically reviews the integrity of data across an organisation’s BI system, verifying that BI reports and analyses are based on reliable data.

MicroStrategy Integrity Manager reduces the need for resource-intensive manual testing by comparing versions of reports after data updates and throughout the BI development cycle, thereby automating report regression testing. Data inconsistencies can be captured much sooner in the development cycle, saving time in report testing, end user support, and issue resolution. With MicroStrategy Integrity Manager, companies can increase the scope, effectiveness, and accuracy of their data validation efforts, and free developers to focus on content creation rather than manual testing.
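The core of automated report regression testing is a diff between the baseline and post-change versions of the same report. The Python sketch below compares two CSV exports cell by cell within a numeric tolerance; the file names and tolerance are illustrative, and this is a generic approach rather than MicroStrategy Integrity Manager’s actual mechanism.

```python
import csv

TOLERANCE = 1e-6  # allowable numeric drift between report versions

def load_report(path: str) -> list:
    with open(path, newline="") as f:
        return list(csv.reader(f))

def cells_match(a: str, b: str) -> bool:
    try:
        return abs(float(a) - float(b)) <= TOLERANCE
    except ValueError:
        return a == b  # non-numeric cells must match exactly

def diff_reports(baseline: str, candidate: str) -> list:
    """Return (row, column, old, new) for every cell that changed."""
    base, cand = load_report(baseline), load_report(candidate)
    issues = []
    if len(base) != len(cand):
        issues.append(("row count", "", len(base), len(cand)))
    for r, (brow, crow) in enumerate(zip(base, cand)):
        for c, (b, v) in enumerate(zip(brow, crow)):
            if not cells_match(b, v):
                issues.append((r, c, b, v))
    return issues

if __name__ == "__main__":
    # hypothetical exports of the same report before and after a data load
    for issue in diff_reports("sales_baseline.csv", "sales_after_etl.csv"):
        print("discrepancy:", issue)
```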

Even minor changes in the BI ecosystem may impact data accuracy and lead to inconsistencies across versions of BI reports. Common types of BI ecosystem changes that require report and data validation include:

  • Software upgrades
  • BI project migrations
  • Data warehouse ETL processes
  • User profile modifications
  • Database optimisations