Sunday, May 2, 2010

Open Source Software Testing Tools

Grinder
Source: http://grinder.sourceforge.net/
The Grinder is a load testing framework written in Java.
Key features:
• Generic Approach: load test anything that has a Java API. This includes common cases such as HTTP web servers, SOAP and REST web services, and application servers (CORBA, RMI, JMS, EJBs), as well as custom protocols.
• Flexible Scripting: tests are written in the powerful Jython scripting language.
• Distributed Framework: a graphical console allows multiple load injectors to be monitored and controlled, and provides centralised script editing and distribution.
• Mature HTTP Support: automatic management of client connections and cookies; SSL; proxy aware; connection throttling; sophisticated record and replay of the interaction between a browser and a web site.
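A real Grinder test is a Jython script run inside the Grinder framework, which requires the Grinder jars. As a rough sketch of the same load-injection idea in plain Python, the snippet below spins up a throwaway local HTTP server, fires concurrent GET requests from several worker threads, and records per-request latencies (the server, URL, and worker counts here are invented for illustration, not part of Grinder):

```python
import http.server
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

class Handler(http.server.BaseHTTPRequestHandler):
    """Trivial target endpoint standing in for the application under test."""
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", "2")
        self.end_headers()
        self.wfile.write(b"ok")
    def log_message(self, *args):
        pass  # silence per-request logging

def run_load_test(url, workers=5, requests_per_worker=4):
    """Fire concurrent GETs at url and collect per-request latencies."""
    latencies = []
    lock = threading.Lock()
    def worker():
        for _ in range(requests_per_worker):
            start = time.perf_counter()
            with urllib.request.urlopen(url) as resp:
                resp.read()
            with lock:
                latencies.append(time.perf_counter() - start)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for _ in range(workers):
            pool.submit(worker)
    return latencies

# Start the stand-in server on an ephemeral port, run the test, shut down.
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
latencies = run_load_test(f"http://127.0.0.1:{port}/")
print(len(latencies))  # 20 completed requests
server.shutdown()
```

In the Grinder itself, the worker function would instead be a Jython `TestRunner` class, and the console would distribute the script to multiple injector machines.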


DBMonster
Source: http://dbmonster.kernelpanic.pl/
dbMonster is a tool that helps database application developers tune the structure of a database, tune the usage of indexes, and test application performance under heavy database load. dbMonster generates as much random test data as you wish and inserts it into an SQL database. It provides a pluggable interface and is trivial to use. dbMonster is written in Java.
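The core idea — generating bulk random rows to stress a schema — can be sketched in a few lines of Python against an in-memory SQLite database (the `users` table and its columns are a hypothetical schema chosen for the example, not anything dbMonster prescribes):

```python
import random
import sqlite3
import string

def random_row(rng):
    """One row of random test data for the hypothetical users table."""
    name = "".join(rng.choices(string.ascii_lowercase, k=8))
    return (name, rng.randint(18, 90), rng.random() * 1000)

def fill_table(conn, n_rows, seed=42):
    """Create the table if needed and bulk-insert n_rows random rows."""
    rng = random.Random(seed)  # seeded so runs are reproducible
    conn.execute(
        "CREATE TABLE IF NOT EXISTS users (name TEXT, age INTEGER, balance REAL)")
    conn.executemany(
        "INSERT INTO users VALUES (?, ?, ?)",
        (random_row(rng) for _ in range(n_rows)))
    conn.commit()

conn = sqlite3.connect(":memory:")
fill_table(conn, 1000)
count = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
print(count)  # 1000
```

dbMonster's pluggable generators play the role of `random_row` here, with per-column generators configured rather than hard-coded.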

WWW::Mechanize
Source: http://search.cpan.org/dist/WWW-Mechanize/lib/WWW/Mechanize.pm

WWW::Mechanize, or Mech for short, helps you automate interaction with a website. It supports performing a sequence of page fetches, including following links and submitting forms. Each fetched page is parsed and its links and forms are extracted. A link or a form can be selected, form fields can be filled in, and the next page can be fetched. Mech also stores a history of the URLs you have visited, which can be queried and revisited.
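Mech itself is a Perl module, but its core fetch-then-parse loop is easy to illustrate. The sketch below is a small Python analogue that extracts links and form actions from an already-fetched page with the standard-library `html.parser` (the page markup and the `example.test` URL are made up; a real run would fetch the HTML over HTTP first):

```python
from html.parser import HTMLParser

class LinkAndFormExtractor(HTMLParser):
    """Collect link hrefs and form actions from a page,
    mirroring the parse step Mech performs after every fetch."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.forms = []
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self.links.append(attrs["href"])
        elif tag == "form":
            self.forms.append(attrs.get("action", ""))

page = """
<html><body>
  <a href="/about">About</a>
  <a href="/contact">Contact</a>
  <form action="/search" method="get"><input name="q"></form>
</body></html>
"""

history = ["http://example.test/"]  # Mech-style history of visited URLs

parser = LinkAndFormExtractor()
parser.feed(page)
print(parser.links)  # ['/about', '/contact']
print(parser.forms)  # ['/search']
```

From here, selecting a link means picking an entry from `parser.links` and fetching it, appending the URL to `history` — the loop Mech automates for you.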
HTTPerf

httperf is a command-line tool for measuring web server performance, originally developed at HP Labs. It generates sustained HTTP workloads at configurable request rates and reports statistics such as connection rates, reply times, and error counts.
JCrawler
Source: http://jcrawler.sourceforge.net/

JCrawler is an open-source (under the CPL) stress-testing tool for web applications. It comes with a crawling/exploratory feature: you can give JCrawler a set of starting URLs and it will crawl from that point onwards, following any URLs it finds along the way and generating load on the web application. The load parameters (hits/sec) are configurable.
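The crawl-and-load idea can be sketched as a breadth-first traversal that counts each page fetch as one hit. To keep the sketch self-contained, the "site" below is an in-memory dictionary standing in for HTTP responses (JCrawler itself is Java and fetches over the network; the URLs and page bodies here are invented):

```python
import re
from collections import deque

# Hypothetical in-memory "site": URL -> HTML body, standing in for HTTP GETs.
SITE = {
    "/": '<a href="/a">a</a> <a href="/b">b</a>',
    "/a": '<a href="/b">b</a>',
    "/b": '<a href="/">home</a>',
}

def crawl(start_urls, max_hits=100):
    """Breadth-first crawl from the seed URLs, following every link found.
    Each fetch counts as one hit of load on the application."""
    seen = set()
    queue = deque(start_urls)
    hits = 0
    while queue and hits < max_hits:
        url = queue.popleft()
        if url in seen or url not in SITE:
            continue
        seen.add(url)
        body = SITE[url]  # a real crawler would issue an HTTP GET here
        hits += 1
        for link in re.findall(r'href="([^"]+)"', body):
            if link not in seen:
                queue.append(link)
    return hits, seen

hits, seen = crawl(["/"])
print(hits, sorted(seen))  # 3 ['/', '/a', '/b']
```

JCrawler adds the piece this sketch omits: a scheduler that paces the fetches to hold a configured hits/sec rate rather than crawling as fast as possible.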
