Load Testing

Load tests are end-to-end performance tests under anticipated production load. The objective of such tests is to determine the response times for various time-critical transactions and business processes and to ensure that they are within documented expectations (or Service Level Agreements - SLAs). Load tests also measure the capability of an application to function correctly under load, by measuring transaction pass/fail/error rates. An important variation of the load test is the Network Sensitivity Test, which incorporates WAN segments into a load test, as most applications are deployed beyond a single LAN.
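The core of the measurement described above - timing each transaction, comparing it against an SLA threshold, and tallying pass/fail/error rates - can be sketched as follows. This is a minimal illustration in Python, not a real load-testing tool; the names (`run_transaction`, `SLA_SECONDS`) and the simulated workload are hypothetical.

```python
import random
import time

# Hypothetical SLA: each transaction must complete within 2.0 seconds.
SLA_SECONDS = 2.0

def run_transaction():
    """Stand-in for a real time-critical business transaction
    (in practice this would be an HTTP request, a database call, etc.)."""
    time.sleep(random.uniform(0.0, 0.01))  # simulated work

def load_test(iterations):
    """Run the transaction repeatedly, recording pass/fail/error rates."""
    results = {"pass": 0, "fail": 0, "error": 0}
    for _ in range(iterations):
        start = time.perf_counter()
        try:
            run_transaction()
        except Exception:
            results["error"] += 1   # transaction did not function correctly
            continue
        elapsed = time.perf_counter() - start
        if elapsed <= SLA_SECONDS:
            results["pass"] += 1    # response time within the SLA
        else:
            results["fail"] += 1    # functioned, but breached the SLA
    return results

results = load_test(50)
print(results)
```

A real load test would run many such loops concurrently from multiple machines (and, for a Network Sensitivity Test, across WAN links), but the pass/fail/error accounting is the same.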

Load test needs database - Microsoft Technical Forums

Automating Internet Explorer to Find All Links on a Web Page

David Wang : IIS

Data Access and Storage Developer Center: Creating Dynamic Data Entry User Interfaces

Rainbow - Rainbow Portal - An Open Source C# Portal Built By The Community

The Rainbow project is an open source initiative to build a comprehensive content management system using Microsoft's ASP.NET and C# technologies.

Rainbow, available today in 29 languages, allows content authoring to be safely delegated to role-based team members who need little or no knowledge of HTML. Rainbow optionally supports a two-step approval-publish process. 75 plug-in modules are now included in the standard release, including support for an e-store, XML news feeds, Flash, Maps, Newsletter, Surveys, Forums, Document Management, Custom Lists, and more. It's also fairly easy to build your own custom modules using the guidelines provided on the Developer Documentation page.

Rainbow has received more than 63,000 downloads to date and is already in production at many commercial internet and intranet sites. Learn more about Rainbow.

Team Development Guide

Download details: Team Development Guide

Overview
This document provides development and procedural guidance for project teams building .NET applications with Visual Studio .NET and Visual SourceSafe. It discusses the processes, disciplines, and .NET development techniques that team members must adopt in a team development environment. It also describes how to create the necessary development infrastructure, which includes source control databases, development workstations, and build servers.

UpdateVersion User Guide

Navigating SharePoint

WindowsDevCenter.com: Navigating SharePoint

Watching Ports with Port Reporter

WindowsDevCenter.com: Watching Ports with Port Reporter

Sending SMS Messages Using Windows XP

O'Reilly Network: Sending SMS Messages Using Windows XP

Making Internet Phone Calls Using Skype

WindowsDevCenter.com: Making Internet Phone Calls Using Skype

Ruby: GCAndMemoryManagement

Working with Visual Studio Team Test Documentation on MSDN

Working with Visual Studio Team Test

Visual Studio Team Edition for Software Testers

The Testers edition of VSTS is designed for use by the testers on your development team, although some of the tools are also available in the Developers edition. Ian Murphy tests them out. (Copy now revised for Beta 2!)

Rounded Corners Server Control

LearnAsp.com - Free Lessons by Charles Carroll

Nadeem's Download

FREE ASP.net Book by Charles Carroll

http://www.learnasp.com/freebook/learn/: "FREE ASP.net Book by Charles Carroll"

Mozilla ActiveX control

ASP.NET : Control Gallery

James Geurts' Blog : Auto Increment Build Numbers for C# Projects in VS.NET 2003

Auto-Incrementing Build Numbers (C# for VS.Net)

Good C# related tools and downloads- Build Version #

downloads

Versioning ASP.NET Applications

.NET Magazine - Manage Web Sites With ASP.NET: "Versioning ASP.NET Applications
The .NET Framework and ASP.NET enable you to run multiple versions of ASP.NET on the same server to preserve compatibility with applications built against a previous version of the .NET Framework SDK. When the .NET Framework is installed on a server with an existing version installed already, the ASP.NET applications on the machine are updated automatically, by default, to use the .NET Framework's newest version. There are exceptions for applications that are bound to a later version of the runtime or an incompatible version. Although the .NET Framework is designed with backward compatibility as a goal, you might need to configure an ASP.NET application to use an earlier version in some cases. Or, you might need to run two applications on the same server targeted to different versions of the framework. Note that when you run IIS 6.0 and Windows Server 2003, you can enable ASP.NET in the IIS management console after installing ASP.NET. IIS 6.0 provides a new Web Service Extensions folder, enabling you to disable and enable IIS functionality selectively.

The ASP.NET ISAPI version associated with an application determines the runtime version the application uses. If you configure the application to use the earlier runtime version, the component is redirected automatically at run time to use this version. If the runtime version used to build the component is installed on the server, you can reconfigure the application to use the later runtime version. Microsoft designed ASP.NET to support different runtime versions, so when an application uses assemblies built against different versions of the .NET Framework, the runtime version associated with the application determines which version of the .NET Framework assemblies the application and all its components use.

For example, if"

Reliable Software Win32 Tutorial

Search the mail archives for any listing that is here; test with wtr-general

wtr-general

Securing Your SQL Server 2005 Express Edition Server

Welcome to the MSDN Library

Introduction

Microsoft® SQL Server™ 2005 Express Manager is a new, lightweight database management tool built on top of the Microsoft Windows® .NET Framework 2.0. Express Manager is a free download and can be used to manage SQL Server 2000, SQL Server 2000 SP4, SQL Server 2000 Desktop Engine (MSDE 2000), and SQL Server 2005 Developer and Express Edition databases on local and remote computers.

Express Manager can be used in conjunction with the other powerful applications in the SQL Server Management Tools family to provide another option for managing SQL Server databases.

Microsoft SQL Server 2005 Express Manager Technical Preview

Intro to DRb Ruby

Intro to DRb: "So, what is this DRb thing and why should you be interested? DRb literally stands for 'Distributed Ruby'. It is a library that allows you to send and receive messages from remote Ruby objects via TCP/IP. Sound kind of like RPC, CORBA or Java's RMI? Probably so. This is Ruby's simple as dirt answer to all of the above. "

Installing Perl & AWStats under IIS 6

DOM/Scripting - Articles and Bookmarks

Walkthrough: Accessing the DHTML DOM from C#

Web Application Articles

SpySmith from Quality Forge

SpySmith is a simple and powerful diagnostic tool, especially useful when testing a Windows GUI, a Web site, or a Web-based application. It allows you to peek inside Internet Explorer browser-based documents and window objects to extract precise information about the HTML source and/or the Windows hierarchy.

SpySmith runs invisibly in the background and can be called on as needed using a special key/click combination.

ObjectMother Website.pdf (application/pdf Object)

SendKeys.Send() Methods

Methods

Mozilla ActiveX Plug-in

INFO: Hosting ActiveX Controls in a Netscape Plug-in

Creating a Plug-In Framework

ASP.NET


Summary: Shows how to add plug-in support to your .NET applications, and provides a framework you can use to add this functionality. (9 printed pages)

Scott Hanselman's Weblog - WatirMaker Version 0.01 Source

Google Talk

Get Google talk, google's IM.

Build a Configurable Web-Based Bug Management Tool Using ADO.NET, XML, and XSLT

SUMMARY One of the most significant features of ADO.NET is its integration with XML. Developers can either use an ADO-like API to access the data or work directly with an XML representation of the data. This article demonstrates how both of these techniques can be used together to create Web applications that take advantage of XML standards such as XSLT. The example presented here is a bug tracking application built using C# and the .NET Framework. The development of the application covers several topics, including data access using ADO.NET, the presentation of data using XSLT stylesheets, and the integration of ADO.NET with the .NET XML Framework.

Globalization Step-by-Step: Introduction

Adding Visual Studio .NET 2003 Support for NAnt

Python for .NET

Test first, by intention

Test First, by Intention
A code and culture translation from the original Smalltalk to Ruby
Original by Ronald Jeffries, translation by Aleksi Niemela and Dave Thomas

In this document we show you the Ruby version of the Smalltalk code published in the Pink book. There's also an online version of the original (PDF 0.5MB, and zipped 1.0MB).
Table of contents
1. Introduction
2. In the Beginning a Test Should Fail
3. Then it should pass
4. The second test
5. Coming up with an Algorithm
6. Refactoring
7. Refactoring II

SUNAN ABU-DAWUD, BOOK 37: Battles (Kitab Al-Malahim)

SUNAN ABU-DAWUD, BOOK 37: Battles (Kitab Al-Malahim): "Book 37, Number 4329:

Narrated Abdullah ibn Amr ibn al-'As:

When we were around the Apostle of Allah (peace_be_upon_him), he mentioned the period of commotion (fitnah) saying: When you see the people that their covenants have been impaired, (the fulfilling of) the guarantees becomes rare, and they become thus (intertwining his fingers). I then got up and said: What should I do at that time, may Allah make me ransom for you? He replied: Keep to your house, control your tongue, accept what you approve, abandon what you disapprove, attend to your own affairs, and leave alone the affairs of the generality."

Born Geek: Firefox Toolbar Tutorial

How to write Firefox extensions using BugMeNot as an example

ONDotnet.com: Building Mono on Windows

XML Schema: Understanding Structures

Guide to Linux Filesystem Mastery

GtkSharpBeginnersGuide - Mono

Monoppix - Basic XSP (ASP.NET) and Monodevelop Walkthrough

Test Focus SA

Best Practices in Software Test Automation

July/August 2005 Feature Article
Best Practices in Software Test Automation
What are Best Practices?

Best practices are guidelines and advice on the best way to do something, collected over time and based on experience with previous projects. Best practices in an organisation should come from the bottom of the organisation up, rather than being mandated by management. These best practices might even differ from organisation to organisation, but certain key factors are present in successful organisations and projects. Discussed in this article are twelve of the most important practices that can assist you in ensuring successful implementation of functional test automation on your projects and in your organisation.
Best Practice 1: Know your objective

There are many good reasons for doing test automation.

Test automation can save time, make testing easier, and improve the testing coverage. It can also help keep testers motivated; I can list a page of other benefits that can be derived from doing functional test automation. However, it is not likely that your organisation will need to do all these things at the same time. Different groups typically have different hopes and ideas of what they want test automation to offer them. These need to be stated, or else disappointment is likely.

A clear objective to work towards during test automation is highly important. It provides us with a compass heading towards which planning and implementation will be directed. Eventually, the results of the test automation implementation will be compared against the original objective to establish whether we did reach our goal or not.

Mistakes made…

* "Test automation is the answer to our testing problems."

Test automation assists in the testing process, and can therefore provide value to a test team. Certain benefits can be derived, but we need to know and understand exactly what they are without having false expectations or hopes. Automation is not a replacement for an inadequate test process or an inefficient test team. Address the testing problems prior to embarking on a costly test automation implementation. Automation's objective is definitely not to solve all testing problems in existence.
* "Since we have limited time to test, let's use test automation."

Test automation takes 3 to 128 times longer to implement than manual testing, and therefore it should be properly planned to achieve the expected benefits. If time is of the essence, consider adding more resources to the test process (this also has limited benefits, and might not always be a solution) rather than embarking on the lengthy process of implementing test automation. Functional test automation is definitely not a quick solution to a test project in distress. Multiple other solutions are better suited to address this need.

Best Practice 2: Test Automation requires a manual test process

"Success in test automation requires careful planning and design work, and it's not a universal solution. ... automated testing should not be considered a replacement for hand testing, but rather as an enhancement." (James Bach [pg. vii] in his foreword: Software Testing with Visual Test 4.0, Thomas Arnold II, IDG Books Worldwide Inc. Foster City, CA.: 1996. ISBN 0-7645-8000-0.)

The words of James Bach and many others are quite clear on this subject. Test automation will not give you a correct testing process or methodology. It will not enforce a methodology or process. Test automation merely supports the test process; it will not replace the manual test process.

Mistakes made…

* "We need to implement testing in our projects. Let's buy a tool and automate the testing."

You can script to your heart's content, but if you do not have a proper manual test process in place, the benefits of test automation will never be seen. It will only assist you in executing the incorrect process a lot faster. Test automation merely assists and enhances the test process itself. Ensure that a proper test methodology is implemented before attempting to add test automation.
* "We can just add the test automation to our current test process and start automating our current test scripts."

Test Automation requires test cases to support the automation process. If my test cases do not incorporate the all-important logon to the SUT (system under test), the automation script will need a lot of manual intervention, and would therefore be more manual than automated. The current test process might have to be changed, or adapted, to be implemented for the tool suite selected. Review and adapt the test process to ensure maximum benefit is realised from the tool suite to be used in your test process.

Best Practice 3: The earlier the involvement in the Software Development Life Cycle (SDLC) the greater the benefits

We all know how testing has its own phases aligned with the phases of the SDLC. The earlier testing gets involved in the life cycle of the project the better. Many articles have been written on this subject, and it holds true. Since test automation supports the testing process, the test team can get involved very early in the life cycle of a project. If the test team uses a test automation tool suite, they will most likely have a requirements management tool or test management tool included in the suite. These tools can assist the test team in various ways from an early stage in the development life cycle, even before a line of code is developed.

The more mature techniques and methods of test automation are closely tied to the testing methodology being used. An early start to the test life cycle, at the start of the development project life cycle, is vital to achieving the best results, and ensuring maximum return on investment.

Mistakes made…

* "The test automators cannot get involved in the project before the developers release the first build."

A software development project does not start with code development. Since functional test automation is also a development project, as we will discuss in Best Practice 9, it follows the same phases as a software development project and does not start at the coding phase. Proper planning and design are essential to the success of the functional test automation implementation, and need to be conducted hand-in-hand with the test life cycle.
* "Testing is only a single phase in the SDLC and therefore testers and test automation only need to be part of the project for the testing phase."

This 'prehistoric' misconception in the IT industry has been proven to be one of the major reasons for poor software quality.

Best Practice 4: Test Automation is a fulltime effort

When people are allowed to work on test automation on their own time, or as a back burner project whenever the test schedule allows, test automation becomes a sideline effort. Test Automation does not get the time and focus it needs.

Getting into automated testing requires preparing not only yourself, but also your company and environment. This requires focus and dedicated resources.

Mistakes made…

* "We have a test automation tool. Can the test team see if they can use it?"

It is clear that the expectation and understanding of test automation are lacking. I can guarantee that the tool implementation will not be successful, and will result in almost no benefit to the test team. The test team is usually tied up in manual test work for development projects that do not meet their code delivery dates for testing. The test automation implementation would probably be of the capture-record-and-playback form, and therefore the benefit would be low.

Test Automation will require training for the staff involved with it. Time is required to design and implement the architecture. Referring to the development nature of functional test automation implementation, the team will have to learn a new skill set. Test automation implementation requires dedication and focus to ensure success. In this instance, it is better to not implement test automation than to allocate ad hoc time for experimentation.

Best Practice 5: Select the correct product or suite of products

Buying the wrong tool is listed as one of the major challenges in test automation, because - no matter what kind of process, methodology, or organisation you have - if the tool is not a good technical and business fit, it will not be used.

We know that a good process and organisation are also essential for test automation. However, if the tool will not function at a basic level, the people who should use the tool, and thereby gain the benefit from the tool, will not use it.

Unfortunately, too few people do adequate research before buying a test tool. Adequate research includes defining a set of tool requirements based on what the intended users of the tool need to accomplish, developing a set of evaluation criteria by which candidate tools will be judged, and taking the experience of other people who have used the tools into consideration.

Select a tool that will allow you to implement automated testing in a way that conforms to your long-term testing strategy and test methodology.

Mistakes made…

* "A vendor demonstrated a tool to us and it seems as if the tool will add value. I think we should buy it."

Take time to define the tool requirements in terms of technology, process, applications, people skills, and organisation. Then look at multiple tool vendors and select the right fit for your circumstances. A good idea would be to do a proof of concept project with the tool that best fits your criteria.
* "We purchased the whole suite of tools from the vendor, but are only using the functional test automation tool."

Test Automation tools are expensive, and to justify the Return On Investment (ROI) it is normally not effective and efficient to use only the functional test automation tool in the product suite. Other benefits can be derived from using the full suite of products; for example, proper defect management, metric reporting on test status, etc. Purchase what you need, or use what you have fully.
* "The yearly license fees are so expensive we cannot afford to use the tool suite any more."

Too often test automation tools end up as 'shelf-ware' because of the cost of the licenses compared to the benefit derived. Refer to mistake two above for one of the reasons why the ROI is not being realised. Before signing the purchase order, be aware of the impact of annual licensing fees.

Best Practice 6: Get executive management commitment

Test automation can yield very substantial results, but requires a substantial commitment to be successful. Do not start without commitments in all areas, the most important being executive management.
Thomas R. Arnold II, VP at ST Labs, Inc., sums it up fairly well:
"I would like to say up front ... that no test automation tool is a silver bullet. Automation takes time, effort, and commitment from all involved, including an understanding from management about the realities of what automation can and cannot do."

Management support is needed in designing and deploying test processes that will support the effective use of test tools, reinforce the role and use of automated test tools in the organisation, and allow time for tools to be integrated in the testing process.

Without management support, the entire test automation effort is at risk. If management does not - clearly and consistently - show their support for test automation, people will be less inclined to show interest in using the tools. This is a major concern, especially considering that overcoming the learning curve of some tools requires dedication.

Perhaps the greatest challenge seen in management support is balancing the high expectations of tool benefits against the time, effort, and discipline it takes to implement the tool. Management may become impatient about the lack of tool progress and shift their support to other initiatives.

Mistakes made…

* "We thought these tools were going to solve your testing problems."

When making the case to management for acquiring test tools, present the challenges as well as the benefits. Make sure that management understand their influence on how others will accept automated test tools.
* "We purchased a tool for you 3 months ago, and yet we cannot see any benefits in the project."

Communicate to management from the start that it takes time and planning to build a firm foundation of people, processes, and the right tools. Keep management informed of tool progress, and any issues that arise.

Best Practice 7: Select the appropriate technique/s for the project

As maturity levels improve in the functional test automation field, we see a decrease in cost for test automation implementations, and an increase in benefits derived for the project or organisation.

The increase in maturity levels has provided us with more mature techniques for implementing test automation. These techniques range from the simplest, record-and-playback, to the most mature, namely keyword- or action-based automation. Each technique has its own benefits and drawbacks, appropriate to certain specific circumstances.

During a small data conversion project, the contents of a few lookup data tables in the old system need to be compared to the contents of the new lookup data tables after the batch conversion. If this can be done via the enquiry functions of both systems, the fastest method might be record-and-playback. This is a one-off test of the conversion, and writing specific scripts to do this might take longer than the record-and-playback method. In terms of cost and timing, the record-and-playback technique would be the most appropriate for this project. This is one of the few instances where I would suggest using the record-and-playback technique; most of the time the benefits of this technique are few compared to more mature techniques like data-driven and keyword test automation.
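The lookup-table comparison described above - checking that every row survived the batch conversion - can be sketched as a simple diff. This is an illustrative Python sketch; the table contents and the function name `compare_lookup_tables` are hypothetical, and a real check would fetch the rows through each system's enquiry functions or database access.

```python
# Hypothetical lookup-table contents fetched from the old and new systems.
old_table = {1: "Active", 2: "Suspended", 3: "Closed"}
new_table = {1: "Active", 2: "Suspended", 3: "Closed", 4: "Pending"}

def compare_lookup_tables(old, new):
    """Report rows missing, added, or changed after the batch conversion."""
    missing = {k: old[k] for k in old if k not in new}
    added = {k: new[k] for k in new if k not in old}
    changed = {k: (old[k], new[k])
               for k in old if k in new and old[k] != new[k]}
    return {"missing": missing, "added": added, "changed": changed}

diff = compare_lookup_tables(old_table, new_table)
print(diff)  # the added row 4 is flagged for the tester to verify
```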

It is important to know, and be able to use, multiple techniques appropriately. If we do not use the various techniques when appropriate, our test automation will not provide us with the benefits we are aiming for.
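One of the more mature techniques mentioned above, data-driven automation, separates the test logic from the test data so that one script exercises many cases. A minimal Python sketch, under the assumption of a hypothetical `login` stand-in for driving the real system under test:

```python
# Hypothetical stand-in for driving the application under test;
# a real data-driven script would call the SUT through a test tool's API.
def login(username, password):
    valid = {"alice": "secret1", "bob": "secret2"}
    return valid.get(username) == password

# The data rows live apart from the script: testers add cases without coding.
test_data = [
    ("alice", "secret1", True),   # valid credentials
    ("alice", "wrong",   False),  # bad password
    ("carol", "secret1", False),  # unknown user
]

results = []
for username, password, expected in test_data:
    outcome = login(username, password)
    results.append("pass" if outcome == expected else "fail")

print(results)
```

Adding a new test case is then a one-line data change rather than a new script, which is where the time saving over record-and-playback comes from.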

Mistakes made…

* "Record-and-playback is sufficient for our needs."

The benefits of this simplistic technique are few, and its application is widely misused. More mature techniques are available with exponentially more benefits in terms of ease of use, cost saving, and time saving if implemented correctly.
* "We have created our test cases for the project and are approaching the release of the first build. Can we implement keyword test automation before the end of the project?"

Keyword test automation is a very advanced test automation technique, and is closely tied to the testing methodology being used for the project. The preparation work for implementing keyword test automation requires more time than some other techniques. The framework needs to be established, and test design needs to support the method. I would not suggest implementing the method from scratch at this late stage of the project.
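To make the keyword technique concrete: test analysts author rows of keywords and arguments, and a framework dispatches each row to an action implementation. The following is a minimal Python sketch of that dispatch idea; the keywords, the URL, and the action functions are all hypothetical, and real actions would drive the system under test.

```python
# Hypothetical action implementations; in a real framework these would
# drive the application under test through a tool's API.
def open_page(url):
    return f"opened {url}"

def enter_text(field, value):
    return f"{field}={value}"

def click(button):
    return f"clicked {button}"

# The keyword vocabulary the test analysts are allowed to use.
KEYWORDS = {"open_page": open_page, "enter_text": enter_text, "click": click}

# A test case authored as keyword rows, not code.
test_case = [
    ("open_page", ["http://example.test/login"]),
    ("enter_text", ["username", "alice"]),
    ("enter_text", ["password", "secret1"]),
    ("click", ["Login"]),
]

def run(test_case):
    """Dispatch each keyword row to its action and collect an execution log."""
    log = []
    for keyword, args in test_case:
        log.append(KEYWORDS[keyword](*args))
    return log

log = run(test_case)
print(log)
```

The framework and vocabulary are the up-front investment the article warns about; once they exist, non-programmers can author tests, which is why this technique repays early planning.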

Best Practice 8: Do not attempt to automate all tests

Test Automation does not work well for all situations or tests; some tests are better left for manual execution. If a test will not be repeated more than three times, it is probably not a good candidate for automation, since it takes a lot longer to automate a test than to execute it manually. The benefit of an automated test is linked to how many times it will be repeated: for example, across multiple builds, on multiple platforms, or with multiple data sets.

Software that is tested manually will be tested with a randomness that helps find bugs in more varied situations. Since a software program usually will not vary, automated testing may not find some bugs that manual testing will. Automated software testing is never a complete substitute for manual testing.

Proper analysis needs to be done to identify the correct test cases to automate to ensure ROI for the effort. How to select tests for automation is a subject on its own, but the following suggestions can be helpful:

1. Perform a proof-of-concept (POC) on a cross section of test cases for the system to determine technical applicability and return on investment (ROI) metrics.
2. Perform an ROI analysis for each testing regimen to select automation candidates.
3. Perform a risk analysis on the automation candidates to prioritise what should be automated first.

If the tests selected for functional test automation are identified and prioritised, the test cases with the best potential to provide benefit will be automated first. If time constraints affect the test automation delivery, at least we know the largest possible benefit would be derived.
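The ROI analysis in step 2 can be reduced to simple break-even arithmetic: how many runs before the automation investment pays for itself? The figures below are hypothetical hours, and the function name is illustrative, not from any tool.

```python
import math

def break_even_runs(automation_cost, manual_run_cost, automated_run_cost):
    """Number of test runs after which automating pays for itself.
    All costs are in the same unit (e.g. hours). Returns None when
    automation never pays back (no saving per run)."""
    saving_per_run = manual_run_cost - automated_run_cost
    if saving_per_run <= 0:
        return None
    return math.ceil(automation_cost / saving_per_run)

# A test that costs 8h to automate, 1h to run manually, 0.1h automated:
print(break_even_runs(8.0, 1.0, 0.1))  # -> 9 runs to break even
```

A test expected to run fewer times than its break-even count is a poor automation candidate, which is the quantitative form of the "repeated more than three times" rule of thumb above.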

Mistakes made…

* "It was so much fun that we automated all the tests."

It might have been fun, but the ROI was definitely not optimal. Time spent on automation of certain tests would never be recovered. A general rule of thumb would be never to automate more than 60% of your test cases to ensure maximum benefit with the lowest investment cost.

Best Practice 9: Manage the automation as a development project

Test Automation development is a software development process, although it is seldom treated that way. Following software development practices can make the difference between the success and failure of an automated testing project.

We need to run test automation projects just as we do our other software development projects. The following points pertain to test automation projects, just as they do to development projects:

* Projects need developers dedicated to developing test automation.
* Test automation automates a task in which the programmer is probably not an expert. Therefore, expert testers should be consulted and provide the requirements.
* Test automation benefits can be seen if we design our approach before we start coding.
* Test automation code needs to be tracked and safeguarded. Therefore, we need to use source code management.
* Test automation will have bugs. Therefore, we need to plan to track them, and test for them.
* Users will need to know how to use it. Therefore, we need user documentation.

The type of work done during test automation is software code development, and since these are programs, they must be managed in the same way that application code is managed.

Mistakes made…

* "When the code is complete, give it to the testers so they can implement test automation."

Why would we follow a phased approach to software development, but not with test automation script development? We should enforce the same stringent processes and phases on test automation scripters as on developers… and the scripts need to be tested.
* "You don't need to follow all the phases of the software development life cycle with test automation."

The fact that we are writing software to test software, and not to perform some business scenario, does not mean the method is different. What is important for the one is important for the other; both are software development projects.

Best Practice 10: Use the correct resources

One area consistently missed by organisations desiring to automate testing is the staffing issue.

Automated testing is different from other types of testing:

1. Most tools use procedural languages to create their scripts.
2. These tools are very similar to IDEs (Integrated Development Environments).
3. Debugging scripts is the same as debugging traditional software.

…and therefore, resources are going to need:

* academic-level knowledge of procedural coding languages,
* basic knowledge of coding practices and procedures, and
* in-depth knowledge and practical experience with the tool being used for the test automation.

Because of this, and because 'record-and-playback' (which generally does not require learning the back-end scripting) is not a valid way to automate most tests, the use of these tools becomes a development effort in its own right. Given that this is the case, there must be support within the organisation to recognise this and hire staff accordingly, as well as to allow the necessary time for the development.

To accomplish this, a Test Automation Tool Specialist, or similar, position must be created, and staffed with at least one senior-level programmer. It does not really matter in what languages the programmer is proficient; what is important is that this person must be capable of designing, developing, testing, debugging, and documenting code.

Test Automation is a combination of testing and development. To mentor and teach your team and resources, use consultants as appropriate to bootstrap your effort. Learn as much as possible from these consultants so that you can avoid the stumbling blocks, and proceed successfully with your automation effort after they depart.

Mistakes made…

* "We sent our testers on a test automation course, and they still can't implement proper test automation scripts."

Learning the test tool Integrated Development Environment (IDE) does not make you a good scripter. A 3-day course will not teach you programming principles. Mentoring by a developer can accelerate the learning process. Add a person to the test team who is a test scripter. This person should be comfortable working with code, and be able to take the basic test designed by a test analyst and convert it into an automated script.
* "The testers did attend the vendor training."

The vendor training will train your staff in the product. Teaching someone Microsoft's development studio will not make them a code writer or programmer. The person will merely know how to use the IDE to write code. It is going to require more than just vendor training to turn your staff into test automators. I suggest hiring a test automation consultant to assist with getting the test team out of the starting block. The consultant can train and mentor your staff, and teach them how to avoid the most common pitfalls. Start the team with simple basic scripting concepts, and add complexity later.

Best Practice 11: Develop for maintenance

Without proper automation architecture planning and design, the test team will soon find itself with hundreds or thousands of test scripts, thousands of separate run-result files and logs, and the combined burden of maintaining existing scripts across versions of the system while creating new scripts for new enhancements. Soon they will find themselves in a maintenance nightmare.

To avoid the maintenance problems that so many organisations succumb to, the following pointers can help:

* Create a reusable architecture.
* Develop generic, cross-application initialisation and configuration parameters.
* Create reusable, cross-application functions.
* Design test scripts by assembling components.

Avoid the creation of disposable automation through proper planning and design of your test automation implementation.
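These pointers can be made concrete with a small sketch. The following hypothetical Java illustration (class, method, and configuration names are invented, not taken from any real framework) keeps configuration, reusable actions, and the assembled test in separate layers:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Hypothetical sketch of a maintainable automation layout:
// configuration, reusable cross-application actions, and tests are
// kept separate, so a change is fixed in one place, not in every script.
public class ComponentBasedTest {

    // Cross-application configuration, defined once and shared.
    static final Map<String, String> CONFIG = Map.of(
            "baseUrl", "https://example.test",
            "user", "demo");

    // Reusable actions: each returns a log entry so the sketch is
    // self-checking without driving a real application.
    static String openApplication(Map<String, String> cfg) {
        return "open " + cfg.get("baseUrl");
    }

    static String login(Map<String, String> cfg) {
        return "login as " + cfg.get("user");
    }

    static String verifyTitle(String expected) {
        return "verify title '" + expected + "'";
    }

    // A test script is just an assembly of the components above.
    static List<String> smokeTest() {
        List<String> steps = new ArrayList<>();
        steps.add(openApplication(CONFIG));
        steps.add(login(CONFIG));
        steps.add(verifyTitle("Home"));
        return steps;
    }

    public static void main(String[] args) {
        smokeTest().forEach(System.out::println);
    }
}
```

Because the application-facing details live in the reusable actions and the shared configuration, a changed URL or login flow is corrected once rather than in every script.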

Mistakes made…

* "We did not design with maintenance in mind."

If you did not start your test automation with a design for maintenance, you have probably already become a victim of the maintenance nightmare. Review your current scripts and determine what amount of script/code snippets can be reused if you implement a different approach (maintenance focused). If the amount of rework is large, it might be better to restart your effort with the correct approach in mind.
* "No standards were followed in script creation."

This will also cause difficulty during the maintenance phase, especially if the resource who created the scripts leaves for greener pastures. Coding and scripting standards allow for ease of maintenance for the whole team. Different people can understand each other's scripts with ease, and when someone leaves the team it is not the end of the world. When test automators return to scripts that they have not worked on for a while, it is also easier to understand what they did the first time round. This leads to a reduction in script maintenance time.

Start to implement standards and good coding practices for the test automators or scripters in your project/organisation. It is never too late to start with this practice.

Best Practice 12: Review and improve implementation process after each project

Development practices suggest having 'post-mortems' at the end of a project. This holds true for the test automation project as well. The following information will surface during a 'post-mortem':

* New ways to deal with specific issues that were discovered during the project may be suggested.
* Experiences gained will provide insight into changes required for the current process or architecture in use.
* Ideas may emerge that enhance the current process and make it more efficient.
* Metrics to be reviewed may provide valuable information for the enhancement of processes.
* Further training or mentoring needs may emerge.

Use the information from the 'post-mortem' to learn and adapt your processes and best practices to gain maximum benefit from your test automation implementation.
Mistakes made…

* "We are still implementing test automation in the same manner as we did on our first test automation project in the organisation."

It is clear that the team/organisation is not learning from its experiences. Learning from our experiences allows us to change our ways for the better, adapt processes for maximum benefit, and make life easier for ourselves. The test team needs to review what works for them and what does not, and adapt their methods to improve for future projects.
* "What benefits did we derive from this automation effort?"

This should be easy to answer if we followed Best Practice 1: Know your objective. We set out with a goal in mind and, while we implemented the test automation, we worked towards our goal. At the end of the project, we should know if we achieved what we set out to do. If we cannot answer, it is time to implement Best Practice no. 1.

Conclusion

Once you determine your company profile, perfect your processes, establish test specialists, give the team members appropriate testing tools, and follow the best practices laid out in this article, your company can realise the benefits of automated software testing. When compared to manual testing, properly applied automation will result in higher-quality products, lower risk to your company, faster approval, and decreased time to market.

The higher level you reach on the automated software testing maturity ladder, the more benefits you will realise. Whatever level you choose, however, keep in mind a major lesson of the past few decades: No matter what tools you buy, your largest investment by far will be in the processes and people you put in place to use those tools.

Use the best practices as specified by those who have travelled the path to successful test automation implementation, and adapt these practices to fit your organisation's individual needs by learning from your test automation implementation.

Henk Coetzee
References:

1. http://web.archive.org/web/20040304002324/wwww.globaltester.com/sp9/removing.html

PHPAdvisory.com - Overview

How to Check Your Website with Multiple Browsers on a Single Machine (Cross-Browser Compatibility Checking) (thesitewizard.com)

We all know the importance of checking our web pages with multiple browsers, especially when we are designing a new layout for a website. The number of extant browsers we need to check with is enormous: Internet Explorer (IE) 6, IE 5.5, IE 5.0, Netscape 7.X (i.e. Mozilla 1.0.X), Netscape 6.X (or Mozilla 0.9.X), Mozilla 1.3.X (and above), Opera 7, Opera 6/5, Netscape 4.X, IE 4.X and so on. And then there are the different platforms: Windows, Macintosh (Mac), Linux, etc. The problem for most people is that multiple versions of certain browsers cannot co-exist with each other; the most notable example of this is IE for Windows. Unless you are privileged to have multiple computers, this presents a certain difficulty for the average webmaster. This article suggests some ways for you to run multiple versions of multiple browsers on one computer.

Note that this article is written primarily from the point of view of a person using Windows (the majority of people reading this article), although it does address the issue of Mac browsers and Linux browsers as well.

Ruby/.NET Bridge

Ruby 1.6 Notice

Excel Pages

RUBY: ScriptingExcel

Busy Developers' Guide to HSSF Features

Reading and Rewriting Workbooks

import java.io.FileInputStream;
import java.io.FileOutputStream;
import org.apache.poi.poifs.filesystem.POIFSFileSystem;
import org.apache.poi.hssf.usermodel.*;

// Open an existing workbook, change one cell, and write it back out.
POIFSFileSystem fs =
    new POIFSFileSystem(new FileInputStream("workbook.xls"));
HSSFWorkbook wb = new HSSFWorkbook(fs);
HSSFSheet sheet = wb.getSheetAt(0);       // first sheet
HSSFRow row = sheet.getRow(2);            // third row
HSSFCell cell = row.getCell((short)3);    // fourth cell, may not exist yet
if (cell == null)
    cell = row.createCell((short)3);
cell.setCellType(HSSFCell.CELL_TYPE_STRING);
cell.setCellValue("a test");

// Write the output to a file
FileOutputStream fileOut = new FileOutputStream("workbook.xls");
wb.write(fileOut);
fileOut.close();

ASPNetTestcase: sample test code referencing the "Microsoft.Web.Testing" namespace

Testing ASP.NET 2.0 and Visual Web Developer

NUnitForms : An NUnit extension for testing Windows Forms applications.

NAntRunner - Visual Studio AddIn for NAnt

NAnt as a Continuous Integration Tool

Internet Explorer (IE) 7 beta crashes: here is how to reproduce it!

Bah!

So I downloaded and installed IE 7 beta, and it does crash!

1) Go to www.abc.com with the 'Display a notification about every script error' option enabled (Internet Options > Advanced).
2) You will be notified about a JS error.
3) Keep the window open, have a coffee, and you will get a crash message!

Check the following screenshots: [screenshots not preserved]
Disappointed?

Oh well, don't be. Even Firefox's latest version crashes on that site too ;-)

IETest

Web hosting plans starting at $1: PHP, CGI, Perl, MySQL, email, unlimited subdomains etc : ExtraServers

Microsoft Office Assistance: Methods and properties used to programmatically replicate a database (MDB)

Compact and Repair an Access Database (add references to ADODB and JRO)



<?xml version="1.0" encoding="utf-8" ?>
<!-- App.config: the keys below are the ones read by the code that follows; values elided -->
<configuration>
  <appSettings>
    <add key="PathOriginal" value="..." />
    <add key="PathCompacted" value="..." />
    <add key="CnnStrOriginal" value="..." />
    <add key="CnnStrCompacted" value="..." />
  </appSettings>
</configuration>

using ADODB;
using JRO;
using System.Configuration;
using System.Data.OleDb;
using System.IO;

public class CompactAndRepairAccessDb : System.Windows.Forms.Form
{
private System.ComponentModel.Container components = null;
private JRO.JetEngine jro;
private System.Windows.Forms.Button btnConfirm;
private System.Windows.Forms.TextBox tbxOriginalDbSize;
private System.Windows.Forms.TextBox tbxCompactedDbSize;
private OleDbConnection cnn;

public CompactAndRepairAccessDb() {
InitializeComponent();

FileInfo fi = new FileInfo( ConfigurationSettings.AppSettings["PathOriginal"] );
int s = Convert.ToInt32( fi.Length/1000 );
this.tbxOriginalDbSize.Text = s.ToString() + " kb";
}

private void btnConfirm_Click(object sender, System.EventArgs e) {
// First close all instances of the database
// MUST HAVE EXCLUSIVE use of the DB by the app
//
cnn = new OleDbConnection( ConfigurationSettings.AppSettings["CnnStrOriginal"] );
cnn.Open();
cnn.Close();

// Compact DB
//
if( cnn.State.ToString() == "Closed" ) {
jro = new JRO.JetEngine();
jro.CompactDatabase( ConfigurationSettings.AppSettings["CnnStrOriginal"],ConfigurationSettings.AppSettings["CnnStrCompacted"] );

FileInfo fi = new FileInfo( ConfigurationSettings.AppSettings["PathCompacted"] );
int s = Convert.ToInt32( fi.Length/1000 );
this.tbxCompactedDbSize.Text = s.ToString() + " kb";

// Delete OldDb | Rename the Compacted DB
//
File.Delete( ConfigurationSettings.AppSettings["PathOriginal"] );
File.Copy( ConfigurationSettings.AppSettings["PathCompacted"], ConfigurationSettings.AppSettings["PathOriginal"] );
File.Delete( ConfigurationSettings.AppSettings["PathCompacted"] );
}
}
}

James Bach's Blog: How to Investigate Intermittent Problems

Project Management: Article info : Achieving CMMI Compliance

Common Software Project Management Mistakes

Project Management: Article info : Tools for Project Management

Project Management: Article info : Measuring the Risk Factor

SourceForge.net: Project Info - Programmers Notepad

SourceForge.net: Project Info - Track + Task Tracker

PERT/CPM for Development Project Scheduling & Management

Function Point FAQ

Fundamentals of Project Management

Resource Library

PMPeerPublishing

The CRM Resource Library is your one-stop shop for information about the business benefits of customer relationship management (CRM) and how you can achieve similar results using Siebel products and services.

Software Cost Estimation

Risk Management in a Software Development Life Cycle

Longhorn Team RSS Blog : Windows RSS Publisher's Guide (work-in-progress)

Software Sizing

STSC CrossTalk - Inside SEER-SEM - Apr 2005: "The System Evaluation and Estimation of Resources - Software Estimating Model (SEER-SEM) is a commercially available software project estimation model used within defense, government, and commercial enterprises. Introduced over a decade ago and now in its seventh release, it offers a case study in the history and future of such models. SEER-SEM and its brethren are built upon a mix of mathematics and statistics; this article provides insight into its inner workings and basis of estimation."

Business Process Management for Software Development

CRACK THE INTERVIEW

Parosproxy.org - Web Application Security

Applications for Windows SharePoint Services

General IDE testing cycle status points

Five Testers From VC: "VC IDE Testing: cycle check points
[Rob]
General IDE testing cycle status points (iterations on/between any two points…)
Specification review
Test Plan Design
Test Plan Review
Developer Implementation design
Developer Implementation design review
Test case design [exploratory, user feedback, regression]
Automation Design
Automation Implementation and review
Manual Test Pass
Automation run
Project status evaluation and reporting
Since the organization is broken up between disciplines (Program management, Development, and Test), cross discipline communication helps deal with the implied dependencies in the eleven points above.
A typical cycle contains zero or more of the following obstacles based on the above points:
PM (program manager group) delivers spec(s) late, preventing review by schedule
test plans delayed by changing specs/parallel product cycle effort lowering priority
test plan reviews are cursory, resulting in missed holes found later which cost more to correct
the design is the code
well, we can review the code… or just use the feature…
test cases ended up being designed without complete specifications… requires revisions regularly at added cost
automation gets pushed back due to UI/feature churn making tests obsolete an hour before they are checked in
Automation harness issues; changing requirements; dogfooding delays (a good thing, but causes progress trade-off) [dogfooding: using VC drops to develop automation for testing VC drops…]
Manual Test Pass: can always count on them, but they take too long if automation is not strategic or complete (temptation is to delay automation to get complete manual coverage, creating an ever-deepening hole… bite the bullet and do the au…"

Steve Rowe's Blog : Three Reasons To Consider Being a Test Developer

What is it like to be a Tester at MS: It's not sexy. But it sure is fun.

the jackol's den » Blog Archive » Reverse a String in C# - Mikhail Esteves: "If you need a function to reverse a string in C#, here it is!

private string ReverseString(string InputStr)
{
char[] Chars = InputStr.ToCharArray();
int Length = InputStr.Length-1;

for (int x = 0; x < Length; x++, Length--)
{
Chars[x] ^= Chars[Length];
Chars[Length] ^= Chars[x];
Chars[x] ^= Chars[Length];
}

return new string(Chars);
}"
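The XOR swap in the quoted snippet works only because the loop guard keeps x strictly below Length; XOR-ing a character with itself would zero it out. For comparison, here is a sketch of the same in-place reversal using a plain temporary variable (written in Java rather than C#, with names of my own choosing), alongside the one-line standard-library alternative:

```java
public class ReverseDemo {

    // In-place reverse using a temporary variable: simpler and safer
    // than the XOR trick, and correct even for odd-length strings.
    static String reverse(String input) {
        char[] chars = input.toCharArray();
        for (int x = 0, y = chars.length - 1; x < y; x++, y--) {
            char tmp = chars[x];
            chars[x] = chars[y];
            chars[y] = tmp;
        }
        return new String(chars);
    }

    public static void main(String[] args) {
        System.out.println(reverse("hello"));                      // olleh
        // The standard library does the same in one call:
        System.out.println(new StringBuilder("hello").reverse());  // olleh
    }
}
```

Unless the character-level trick is the point of the exercise, the library call is the idiomatic choice.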

How to improve verification planning

In-Process Metrics for Software Testing

Although there are numerous metrics for software testing, and new ones being proposed frequently, relatively few are supported by sufficient experiences of industry implementation to demonstrate their usefulness. In this chapter from his book, Stephen Kan provides a detailed discussion of some tried and true in-process metrics from the testing perspective.
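As a hedged illustration of the kind of in-process metric this chapter discusses, a simple pass-rate and backlog calculation over a test-cycle snapshot might look like the following (the counts and names are invented for the example, not drawn from the book):

```java
public class TestProgressMetrics {

    // Percentage of attempted test cases that passed in this cycle.
    static double passRate(int attempted, int passed) {
        if (attempted == 0) return 0.0;   // nothing run yet
        return 100.0 * passed / attempted;
    }

    // Backlog: planned cases not yet attempted in this cycle.
    static int backlog(int planned, int attempted) {
        return planned - attempted;
    }

    public static void main(String[] args) {
        int planned = 200, attempted = 150, passed = 135;
        System.out.printf("pass rate: %.1f%%%n", passRate(attempted, passed));
        System.out.println("backlog: " + backlog(planned, attempted));
    }
}
```

Tracked at regular intervals, even ratios this simple show whether test execution is on schedule and whether quality is trending up or down across the cycle.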
