London software testing news UK


Software Planner CAB

Posted in Software testing, testing tool by testing in London on December 31, 2009

From Softwareplanner

For a company to participate in the Software Planner Client Advisory Board, it must meet these criteria:

  • Must be a current Software Planner client in good standing.
  • Must be a champion for the product who is willing to offer advice on improvements, help us prioritize features for future releases and discuss Software Planner with other peers inside your own organisation.
  • Must be willing to attend periodic CAB webinars (sometimes monthly, sometimes quarterly) to discuss suggestions for improvement, vote on the priority of suggestions and help us flesh out feature designs (these meetings will be kept to 1 to 2 hours).
  • Must be willing to become a reference for Software Planner (we promise not to abuse this privilege).
  • Must be willing to participate in our Software Planner discussion forum to offer advice to other Software Planner clients and to learn from others.

Software testing combination at Google

Posted in Software testing by testing in London on December 30, 2009

From IT Inform

However, simply looking at the collective whole does not find a lot of problems. At least it doesn’t find a lot of important problems.

Test driving a car is looking at a car holistically. But how often does a test drive find a real bug? A test drive is about look and feel; it’s too high level to be a good bug finding exercise. To find a bug, you hire a mechanic to look at the car, component by component and subsystem by subsystem. Mechanics don’t test by driving the car on a date or by taking a Sunday drive. They test by monitoring specific subsystems and by looking for specific types of problems that eventually work their way out as a bug on a Sunday drive. A mechanic finds flaws quickly in the fuel system, the exhaust system, the electrical system. Sunday drivers must be patient for such flaws to work their way into their line of sight.

Proper software testing requires a combination of Sunday driving and a mechanic’s analysis. It is about looking at the big picture and analysing individual components and capabilities and how they contribute to the collective whole. The way we now do exploratory testing at Google treats it as such.
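To make the analogy concrete, here is a minimal sketch of pairing the two views using Python's unittest. The `apply_discount` and `checkout` functions and both test classes are invented for illustration; this is not Google's test suite, just one way of writing a component-level "mechanic" check alongside a holistic "Sunday drive" check.

```python
import unittest

# Hypothetical component under test: a small pricing helper.
def apply_discount(price: float, percent: float) -> float:
    """Return price reduced by the given percentage."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def checkout(cart: list) -> float:
    """End-to-end flow: total a cart of (price, discount) pairs."""
    return round(sum(apply_discount(p, d) for p, d in cart), 2)

class MechanicTests(unittest.TestCase):
    """Component-level checks: probe one subsystem for specific faults."""
    def test_zero_discount(self):
        self.assertEqual(apply_discount(10.0, 0), 10.0)

    def test_invalid_discount_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(10.0, 150)

class SundayDriveTest(unittest.TestCase):
    """Holistic check: exercise the whole flow the way a user would."""
    def test_checkout_total(self):
        self.assertEqual(checkout([(10.0, 10), (20.0, 0)]), 29.0)

if __name__ == "__main__":
    unittest.main()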

Back Testing Software

Posted in Acceptance testing by testing in London on December 29, 2009

From Artsgrantfinder

Back testing software is an integral cog in the process of analysing trading systems. Back testing is the process of testing a trading strategy using historical data rather than testing it in real time with real money. The metrics obtained from testing via back testing software can be used as an indication of how well the strategy would have performed had it been applied to past trades. Interpreting these results then provides the trader with sufficient metrics to assess the potential of the trading system.
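As a rough illustration of that process, the sketch below replays historical prices through a simple moving-average crossover strategy and reports how it would have fared. It is a toy example, not the back testing software the article describes; the strategy, the synthetic price series and the reported metrics are all invented for illustration.

```python
def moving_average(prices, window):
    """Trailing average of the last `window` prices (None until enough data)."""
    if len(prices) < window:
        return None
    return sum(prices[-window:]) / window

def backtest(prices, short_window=3, long_window=5, capital=10_000.0):
    """Go long when the short MA crosses above the long MA, exit when it drops below."""
    cash, units = capital, 0.0
    history = []
    for price in prices:
        history.append(price)
        short_ma = moving_average(history, short_window)
        long_ma = moving_average(history, long_window)
        if short_ma is None or long_ma is None:
            continue
        if short_ma > long_ma and units == 0:      # entry signal
            units, cash = cash / price, 0.0
        elif short_ma < long_ma and units > 0:     # exit signal
            cash, units = units * price, 0.0
    final_value = cash + units * prices[-1]
    return {"final_value": round(final_value, 2),
            "return_pct": round((final_value / capital - 1) * 100, 2)}

# Historical data would normally come from a market data feed; this is synthetic.
sample_prices = [100, 101, 99, 102, 105, 107, 106, 104, 108, 110, 109, 111]
print(backtest(sample_prices))
```

The output gives the trader the kind of metrics the article mentions: how the strategy would have performed had it been applied to past trades.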

Google testing

Posted in Acceptance testing by testing in London on December 24, 2009

From CNN

OK, so it’s a little early in the game to call this one a total fail. But after the breathless anticipation that greeted Google Wave and the hot rush to get an invitation for its beta testing, lots of users found themselves asking, “OK … now what?”

Google, for its part, released an 80-minute tutorial video — leading some observers to argue that if you need an hour and 20 minutes to explain what your product does, you might be in trouble.

It’s designed as a platform to allow users to communicate and collaborate in real time — a tool some predict will be used effectively by developers in the future.

Scalability testing

Posted in Load testing, Software testing by testing in London on December 23, 2009

From Photo Hot

Systems that work well during development and in small-scale deployments can fail to meet performance goals when the deployment is scaled up to support real levels of use.

An apposite example of this comes from a major blue-chip company that recently outsourced the development of an innovative high-technology platform. Though development was behind schedule, this was deemed acceptable. The system gradually passed the functional elements of user acceptance testing and eventually it looked like a deployment date could be set. But then the supplier started load testing and scalability testing. There followed a prolonged and costly period of architectural changes and changes to the system requirements. The supplier battled heroically to provide an acceptable system until finally the project was mothballed.
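The underlying exercise can be sketched in a few lines: ramp up concurrency against the system under test and watch whether latency stays flat as throughput grows. The snippet below is a self-contained illustration only; `handle_request` is a simulated stand-in for a real call to the deployed system, and the request counts and concurrency levels are arbitrary.

```python
import time
import statistics
from concurrent.futures import ThreadPoolExecutor

def handle_request() -> float:
    """Simulated unit of work; replace with a real request to the system under test."""
    start = time.perf_counter()
    time.sleep(0.01)  # pretend the server takes ~10 ms
    return time.perf_counter() - start

def run_load(concurrency: int, total_requests: int = 200) -> dict:
    """Fire total_requests at the target with the given number of concurrent workers."""
    started = time.perf_counter()
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = list(pool.map(lambda _: handle_request(), range(total_requests)))
    elapsed = time.perf_counter() - started
    return {"concurrency": concurrency,
            "p95_latency_ms": round(sorted(latencies)[int(0.95 * len(latencies))] * 1000, 1),
            "mean_latency_ms": round(statistics.mean(latencies) * 1000, 1),
            "throughput_rps": round(total_requests / elapsed, 1)}

# Ramp up the load and compare results: a scalable system keeps latency flat
# while throughput grows; a non-scalable one shows latency climbing instead.
for level in (1, 5, 25, 100):
    print(run_load(level))
```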


Eurostar begins testing

Posted in Acceptance testing by testing in London on December 22, 2009

From RTT news

Eurostar has begun test runs of trains in the Channel Tunnel as rail engineers work frantically to solve the mystery of multiple breakdowns between London and Paris ahead of the peak Christmas and New Year holiday season.

The breakdowns, apparently caused by technical faults blamed on the unprecedented winter conditions of snow and sub-zero temperatures, have forced tens of thousands of passengers to cancel their journeys.

Call for papers: Compsac 2010

Posted in conference by testing in London on December 21, 2009

From Compsac

The 4th IEEE International Workshop on Quality Oriented Reuse of Software (QUORS 2010) has issued its call for papers. Topics of interest include, but are not limited to:

  • Reuse in Service Oriented Systems (SOA)
  • Service and service-based system engineering
  • Internet-based applications
  • Cloud computing
  • High quality software reuse methods
  • Dependable Component-Based Systems
  • Quality aspects of design patterns
  • Software quality metrics and testing
  • Component and service repository
  • Quality-oriented COTS product reuse
  • Resilient (adaptive) reuse
  • Agile methods and software reuse

Invited participants include researchers and practitioners in software reuse, software evolution, emerging software systems such as SOA, pervasive computing and embedded software, testing and quality assurance, and all other relevant areas.

SAP testing upgrades

Posted in Acceptance testing by testing in London on December 20, 2009

From Search SAP

Poor test management. Many companies do a poor job of testing their SAP upgrades before implementing them, he said. That includes both those that don’t perform adequate integration, user acceptance and performance testing, and those that test too much.

“They don’t take the time to do the analysis, to figure out what needs to be tested,” Shepherd said. “In most cases, there isn’t a good reason to test everything, so they spend way too much time and money on the testing.”
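One way to picture the analysis Shepherd describes is the hedged sketch below: score each candidate area by business criticality and by how much the upgrade actually changes it, then spend testing effort only above a threshold. The area names, scores and cutoff are invented for illustration and are not part of any SAP tool or methodology.

```python
# Hypothetical candidate areas: (business_criticality 1-5, change_impact 1-5)
CANDIDATES = {
    "order_to_cash": (5, 4),
    "payroll_run": (5, 2),
    "vendor_master_maintenance": (2, 1),
    "warehouse_picking": (4, 5),
    "custom_pricing_exit": (3, 5),
}

def score(criticality: int, impact: int) -> int:
    """Simple risk score: criticality weighted by how much the upgrade changes the area."""
    return criticality * impact

# Rank candidates and only give integration/UAT/performance coverage to the top of the list.
ranked = sorted(CANDIDATES.items(), key=lambda kv: score(*kv[1]), reverse=True)
cutoff = 10  # arbitrary threshold for this illustration
for name, (crit, impact) in ranked:
    decision = "test" if score(crit, impact) >= cutoff else "skip"
    print(f"{name:28s} score={score(crit, impact):2d} -> {decision}")
```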

SAP load testing

Conference on Distributed Computing Techniques

Posted in conference by testing in London on December 19, 2009

From DisCoTec 10

Topics of interest include but are not limited to:

  • Languages and Models: new language and modeling concepts for distribution and concurrency including object-oriented, aspect-oriented, reflection, and meta-programming technologies; integration of language and modeling paradigms
  • Semantic Foundations: semantics for different types of languages, including programming languages, modeling languages, and domain specific languages; real-time and probability aspects; type systems and behavioural typing
  • Formal Methods and Techniques: design, specification, analysis, verification, validation and testing of various types of distributed systems including communications and network protocols, service-oriented systems, and adaptive distributed systems. Advances in tool-based formal analyses such as static analysis, model checking, theorem proving, and deductive verification for realistic programming and modeling languages are especially encouraged.
  • Applications of Formal Methods: applying the existing methods and techniques to distributed systems, particularly web services, multimedia systems, and telecommunications
  • Practical Experience with Formal Methods: industrial applications, case studies and software tools for applying formal methods and description techniques to the development and analysis of real distributed systems

Testing web applications

Posted in Load testing, Software testing by testing in London on December 18, 2009

From PR Log

Phil Lew, CEO of XBOSoft, recently spoke at the Applied Sciences College of Beijing Union University in Beijing, China. Students at the university pursuing associate and bachelor degrees in computer science and software engineering were among the attendees. Lew’s presentation addressed the topic of testing web applications and, in particular, the special considerations required for performance testing web applications when there is increased user group variation. “Testing web-based applications requires a different approach and mindset when compared to the conventional software applications of yesterday. User diversity, behavior and the platforms they use are all magnified,” explained Lew. “Performance deserves special emphasis, as even the most prolific feature set and functionality will be termed a failure if users walk away due to slow performance. And now with SaaS business models, they can leave and sign up with another software vendor in minutes,” added Lew.
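A rough sketch of what folding user diversity into a performance test can look like (the profiles, weights and session shapes below are hypothetical, not XBOSoft’s methodology): weight the simulated traffic by user profile so the load mix reflects different behaviours and platforms rather than a single uniform user.

```python
import random

# Hypothetical user profiles: (traffic weight, think time in seconds, pages per session)
PROFILES = {
    "mobile_browser": (0.5, 8.0, 3),
    "desktop_power_user": (0.3, 2.0, 12),
    "api_integration": (0.2, 0.1, 50),
}

def pick_profile() -> str:
    """Choose a profile for the next virtual user according to the traffic mix."""
    names = list(PROFILES)
    weights = [PROFILES[n][0] for n in names]
    return random.choices(names, weights=weights, k=1)[0]

def session_plan(profile: str) -> dict:
    """Describe the workload one virtual user of this profile will generate."""
    _, think_time, pages = PROFILES[profile]
    return {"profile": profile, "requests": pages, "think_time_s": think_time}

# Build a 1,000-user test plan and check that the mix matches the intended weights.
plan = [session_plan(pick_profile()) for _ in range(1000)]
mix = {name: sum(p["profile"] == name for p in plan) for name in PROFILES}
print(mix)
```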

Load testing web applications
