The more I read, the more I acquire, the more certain I am that I know nothing. -Voltaire
Double Choco Latte
Double Choco Latte provides basic project management, work orders, and call center tickets. It supports the following features and concepts:
* Work Orders - Use for tracking history of almost anything: bugs, requests, maintenance, project tasks.
* Projects - Include hierarchical support so sub-projects can be created. A parent project's statistics include all child-project statistics, providing an overall status across projects.
* Call Tickets - Use for contact with clients. Self-timing.
* Time Cards - Time taken on a work order to accomplish an action. Multiple time cards can be input for a single work order.
* Attribute Sets
o An attribute set consists of: Actions, Priorities, Severities, Statuses.
o Each product can support its own attribute set for work orders and tickets.
* Accounts - Client based tracking
* Personnel - People involved in the projects, including hierarchical support
* Departments - To "categorize" personnel
* Severities - List a bug's severity level
* Priorities - Ranks bugs by priority
* Statuses - Tracks bugs by status
* Sequences - Related Work Orders carry the same Job Number and incremental sequence numbers. Can be used for projects.
* Products - Obvious, but also can be assigned a person to "report to". Useful for product managers to track what they're responsible for.
* E-Mail notification via watches.
o Notification for statuses: Open, Close, Change
o Optional notification for any activity
o Can watch a product or project and receive E-Mails for their respective work orders
o Can watch a specific work order.
o Can also watch tickets in the same way, with the exception of projects.
Building Projects with NAnt
Introduction
This document provides an introduction to NAnt and describes a set of guidelines for building projects with it.
NAnt is a free build tool based on .NET. NAnt has many advantages over existing build tools that make it the build tool of choice. First, NAnt is platform independent: it can be installed and executed on any system for which a .NET implementation exists. Second, instead of processing configuration files containing shell-based commands, which are system dependent, NAnt processes build files in which targets and tasks are described in XML, making it easier to move projects across systems.
Of course, NAnt does not implement all the functionality available through shell-based commands, but if you absolutely need a feature that is not implemented, you can either extend NAnt by writing your own task in your preferred .NET programming language, or fall back on the general-purpose <exec> task, which allows the execution of any program installed on your system.
In order to take full advantage of NAnt's capabilities, how projects are structured and how build files are written makes all the difference. After explaining how to get, install, and run NAnt, this document presents a set of guidelines that will help you create better projects.
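To give a flavor of the XML build files described above, here is a minimal hypothetical NAnt build file; the project name, output path, and source layout are illustrative, not from the article:

```xml
<?xml version="1.0"?>
<project name="HelloWorld" default="build">
    <!-- Compile all C# sources under src/ into a console executable -->
    <target name="build">
        <csc target="exe" output="bin/HelloWorld.exe">
            <sources>
                <include name="src/**/*.cs"/>
            </sources>
        </csc>
    </target>
    <!-- Remove build output -->
    <target name="clean">
        <delete dir="bin" failonerror="false"/>
    </target>
</project>
```

Running `nant` in the directory containing this file executes the default `build` target; `nant clean` removes the output directory.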
Build Process for .NET in windows Script
More information: with this build process, .NET developers can easily build their projects without opening the solution in Visual Studio. If there are any errors while building, the script creates a file called status.txt and stores the errors in it.
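A minimal sketch of such a build script, assuming devenv.exe is on the PATH and using an illustrative solution name:

```bat
@echo off
rem Build the solution from the command line, without opening the IDE.
rem The /out switch writes the build log (including any errors) to status.txt.
devenv.exe MySolution.sln /build Release /out status.txt
if errorlevel 1 echo Build failed - see status.txt for details.
```

The `/build` and `/out` switches are standard devenv command-line options; the Release configuration name is whatever your solution defines.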
sitecopy
sitecopy is for easily maintaining remote web sites. The program will upload files to the server which have changed locally, and delete files from the server which have been removed locally, to keep the remote site synchronized with the local site with a single command.
Scripting ASP.NET Builds and Deployments
Building and deploying ASP.NET applications with Visual Studio .NET is a simple exercise. But to accomplish this feat of magic, VS.NET makes some assumptions that may not apply in the real world of production web sites. One of the biggest of these assumptions is that software developers do not mind regularly mouse-clicking through all of the wizards required to realize their handiwork. This article demonstrates how to automate building and deploying ASP.NET applications via scripting.
Team Development with Visual Studio .NET and Visual SourceSafe
Summary: BuildIt is a Microsoft® .NET console application that automates the build process outlined in the patterns & practices article "Team Development with Visual Studio .NET and Visual SourceSafe" in the MSDN® Library. BuildIt was designed, developed, and tested by Sapient Corporation and reviewed by Microsoft, including team members of the Microsoft patterns & practices and Visual Studio® .NET teams. (28 printed pages)
Using BuildIt:
* Eliminates the time required to create, test, and maintain a custom build script.
* Makes a team's build process more repeatable and consistent.
BuildIt is designed to jump-start the build process used for development of .NET distributed applications. The downloadable program provides full source code and comprehensive documentation for Microsoft Visual C#® development tool and Microsoft Visual Basic® .NET development system.
Note BuildIt currently supports building solutions that contain setup projects developed with Visual Basic .NET, Visual C#, and Visual Studio .NET. It has not been tested with projects written in other .NET languages or with other setup projects (for example, setup projects from Wise or InstallShield).
Nullsoft Scriptable Install System
An installer is a user's first experience with your application. Slow or unsuccessful software installations are among the most irritating computer problems. A quick and user-friendly installer is therefore an essential part of your software product.
NSIS (Nullsoft Scriptable Install System) is a tool that allows programmers to create such installers for Windows. It is released under an open source license and is completely free for any use.
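To show what the scripting language looks like, here is a minimal hypothetical NSIS script; the application name and file names are illustrative:

```nsis
; Minimal installer: copies one file and registers an uninstaller.
Name "MyApp"
OutFile "MyAppSetup.exe"
InstallDir "$PROGRAMFILES\MyApp"

Section "Install"
  SetOutPath "$INSTDIR"
  File "MyApp.exe"
  WriteUninstaller "$INSTDIR\Uninstall.exe"
SectionEnd

Section "Uninstall"
  Delete "$INSTDIR\MyApp.exe"
  Delete "$INSTDIR\Uninstall.exe"
  RMDir "$INSTDIR"
SectionEnd
```

Compiling this script with the NSIS compiler (makensis) produces a self-contained MyAppSetup.exe.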
Phorum: PHP Message Board
Phorum is a web based message board written in PHP. Phorum is designed with high-availability and visitor ease of use in mind. Features such as mailing list integration, easy customization and simple installation make Phorum a powerful add-in to any website.
VB.NET ProperCase to TitleCase in C#
VBForums.com - FYI - You can use VB-specific functions in any C# project: ".NET does have a way to convert to proper case (now called title case) in C#."
I wrote a quick example for you. Be sure to add:
using System.Globalization;
CultureInfo ci = new CultureInfo("en"); // Create a CultureInfo for the English language
TextInfo ti = ci.TextInfo;              // Get the TextInfo object from the CultureInfo
MessageBox.Show(ti.ToTitleCase("helLo joP hOw ArE yOU?"));
IEEE Standard 829 Test Summary Report - Google Search
Acceptance criteria: The criteria that a system or component must satisfy in order to be accepted by a user, customer, or other authorized entity.
Acceptance testing: (1) Formal testing conducted to determine whether a system satisfies its acceptance criteria and enables the customer to determine whether to accept the system. (2) Formal testing conducted to enable a user, customer, or other authorized entity to determine whether to accept a system or component.
Computer Software Configuration Item (CSCI): An aggregation of software that is designated for configuration management and treated as a single entity in the configuration management process. Contrast with: Hardware configuration item. See also: Configuration item.
Configuration item (CI): An aggregation of hardware, software, or both, that is designated for configuration management and treated as a single entity in the configuration management process. See also: Hardware configuration item; Computer software configuration item.
Development testing: Formal or informal testing conducted during the development of a system or component, usually in the development environment by the developer.
Functional testing: (1) Testing that ignores the internal mechanism of a system or component and focuses solely on the outputs generated in response to selected inputs and execution conditions. Contrast with: Structural testing. (2) Testing conducted to evaluate the compliance of a system or component with specified functional requirements. See also: Performance testing.
Hardware Configuration Item: Hardware items that include disks, disk drives, display screens, keyboards, printers, boards, and chips.
Independent Verification and Validation (IV&V): Verification and validation performed by an organization that is technically, managerially, and financially independent of the development organization.
Installation and checkout phase: The period of time in the software life cycle during which a software product is integrated into its operational environment and tested in this environment to ensure that it performs as required.
Integration testing: Testing in which software components, hardware components, or both are combined and tested to evaluate the interaction between them. See also: System testing; Unit testing.
Load testing: Testing that studies the behavior of the program when it is working at its limits. See also: Stress Testing.
Operational testing: Testing conducted to evaluate a system or component in its operational environment.
Path testing (coverage): Testing that is designed to execute all or selected paths through a computer program.
Pass/Fail criteria: Decision rules used to determine whether a software item or software feature passes or fails a test.
Performance testing: Testing conducted to evaluate the compliance of a system or component with specified performance requirements. See also: Functional testing.
Quality Assurance (QA): (1) The process of evaluating overall project performance on a regular basis to provide confidence that the project will satisfy the relevant quality standards. (2) The organizational unit that is assigned responsibility for quality assurance. [A Guide to the Project Management Body of Knowledge (PMBOK Guide), 2000 Edition]
Quality Control (QC): (1) The process of monitoring specific project results to determine if they comply with relevant quality standards and identifying ways to eliminate causes of unsatisfactory performance. (2) The organizational unit that is assigned responsibility for quality control. [A Guide to the Project Management Body of Knowledge (PMBOK Guide), 2000 Edition]
Quality Management (QM): The processes required to ensure that the project would satisfy the needs for which it was undertaken.
Regression testing: Selective retesting of a system or component to verify that modifications have not caused unintended effects and that the system or component still complies with its specified requirements.
Scenario: (1) A description of a series of events that could be expected to occur simultaneously or sequentially. (2) An account or synopsis of a projected course of events or actions. [IEEE Std 1362-1998, Guide for Information Technology – System Definition – Concept of Operations (ConOps) Document]
Software item: Source code, object code, job control code, control data, or a collection of items.
Stress testing: Testing conducted to evaluate a system or component at or beyond the limits of its specified requirements. See also: Load testing.
Structural testing: Testing that takes into account the internal mechanism of a system or component. Types include branch testing, path testing, statement testing. Contrast with: Functional testing.
System testing: Testing conducted on a complete, integrated system to evaluate the system’s compliance with its specified requirements. See also: Integration testing; Unit testing.
Test: An activity in which a system or component is executed under specified conditions, the results are observed or recorded, and an evaluation is made of some aspect of the system or component.
Test case specification: A document specifying inputs, predicted results, and a set of execution conditions for a test item.
Test design specification: Documentation specifying the details of the test approach for a software feature or combination of software features and identifying the associated tests.
Test Incident Report (TIR): A document reporting on any event that occurs during the testing process that requires investigation.
Test item: A software item that is an object of testing.
Test log: A chronological record of relevant details about the execution of tests.
Test phase: The period of time in the life cycle during which components of a system are integrated, and the product is evaluated to determine whether or not requirements have been satisfied.
Test plan: A document describing the scope, approach, resources, and schedule of intended testing activities. It identifies test items, the features to be tested, the testing tasks, who will do each task, and any risks requiring contingency planning.
Test procedure: (1) Detailed instructions for the set-up, execution, and evaluation of results for a given test case. (2) A document containing a set of associated instructions as in (1). (3) Documentation specifying a sequence of actions for the execution of a test.
Test Readiness Review (TRR): A review conducted to evaluate preliminary test results for one or more configuration items and verify that the test procedures for each configuration item are complete, comply with test plans and descriptions, and satisfy test requirements. Verify that a project is prepared to proceed to formal testing of the configuration item.
Test summary report: A document summarizing testing activities and results. It also contains an evaluation of the corresponding test items.
Testability: (1) The degree to which a system or component facilitates the establishment of test criteria and the performance of tests to determine whether those criteria have been met. (2) The degree to which a requirement is stated in terms that permit establishment of test criteria and performance of tests to determine whether those criteria have been met.
Testing: (1) The process of operating a system or component under specified conditions, observing or recording the results, and making an evaluation of some aspect of the system or component. (2) The process of analyzing a software item to detect the differences between existing and required conditions (i.e., bugs) and to evaluate the features of the software items. See also: Acceptance testing; Development testing; Integration testing; Operational testing; Performance testing; Regression testing; System testing; Unit testing.
Unit Testing: The testing of individual hardware or software units or groups of related units (i.e., components, modules). See also: Integration testing; System testing.
IEEE Standards Description: 829-1983
IEEE Standards Description: 829-1983: "ANSI/IEEE 829-1983 IEEE Standard for Software Test Documentation - Description
Content
* 1. Scope
* 2. Definitions
* 3. Test Plan
  o 3.1 Purpose.
  o 3.2 Outline.
    + 3.2.1 Test-Plan Identifier.
    + 3.2.2 Introduction.
    + 3.2.3 Test Items.
    + 3.2.4 Features to be Tested.
    + 3.2.5 Features Not to be Tested.
    + 3.2.6 Approach.
    + 3.2.7 Item Pass/Fail Criteria.
    + 3.2.8 Suspension Criteria and Resumption Requirements.
    + 3.2.9 Test Deliverables.
    + 3.2.10 Testing Tasks.
    + 3.2.11 Environmental Needs.
    + 3.2.12 Responsibilities.
    + 3.2.13 Staffing and Training Needs.
    + 3.2.14 Schedule.
    + 3.2.15 Risks and Contingencies.
    + 3.2.16 Approvals.
* 4. Test-Design Specification
  o 4.1 Purpose.
  o 4.2 Outline.
    + 4.2.1 Test-Design-Specification Identifier.
    + 4.2.2 Features to be Tested.
    + 4.2.3 Approach Refinements.
    + 4.2.4 Test Identification.
    + 4.2.5 Feature Pass/Fail Criteria.
* 5. Test-Case Specification
  o 5.1 Purpose.
  o 5.2 Outline.
    + 5.2.1 Test-Case-Specification Identifier.
    + 5.2.2 Test Items.
    + 5.2.3 Input Specifications.
    + 5.2.4 Output Specifications.
    + 5.2.5 Environmental Needs.
    + 5.2.6 Special Procedural Requirements.
    + 5.2.7 Intercase Dependencies.
* 6. Test-Procedure Specification
  o 6.1 Purpose.
  o 6.2 Outline.
    + 6.2.1 Test-Procedure-Specification Identifier."
Agitar Software, Enterprise Developer Testing for Java
Sriram Sankar: Developer Testing at Google
Introduction
The Google Environment
XP at Google
Other Quality Initiatives at Google
Q&A
Adding and retrieving Images from a SQL Server Table
This article describes the process of adding and retrieving images from a SQL Server table using ADO.NET. You could have an entry form that lets the user choose which operation to perform: add or view images. Depending on the option chosen, you can display the relevant form.
To add images to the form, the following procedure can be used. A textbox can be displayed on the form to accept the desired image filename from the user. This file must exist on the local machine and can be chosen using an OpenFileDialog instance.
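The insert step described above can be sketched roughly as follows; the connection string, the Pictures table, and its column names are assumptions for illustration, not from the article:

```csharp
// Sketch: insert a local image file into a SQL Server table via ADO.NET.
// Assumes fileName came from an OpenFileDialog and connectionString is configured.
using System.Data.SqlClient;
using System.IO;

byte[] imageData = File.ReadAllBytes(fileName);          // read the image as raw bytes
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(
    "INSERT INTO Pictures (FileName, Data) VALUES (@name, @data)", conn))
{
    cmd.Parameters.AddWithValue("@name", Path.GetFileName(fileName));
    cmd.Parameters.AddWithValue("@data", imageData);     // image column (e.g. varbinary)
    conn.Open();
    cmd.ExecuteNonQuery();
}
```

Retrieving an image reverses the process: read the byte array back with a SELECT and load it into an Image via a MemoryStream.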
Launch Condition Management in Deployment
The Launch Conditions Editor allows you to specify conditions that must be met in order to successfully run an installation. For example, you might want to check for a specific version of an operating system — if a user attempts to install on a system that does not meet the condition, the installation will not occur.
Searches can be performed on a target computer to determine if a particular file, registry key, or Microsoft Windows Installer component exists.
Predefined launch conditions allow you to add both a search and a launch condition in a single step. The Property property for the search is automatically referenced in the Condition property of the launch condition.
Note To learn more about condition syntax, see Deployment Conditions.
Searches and conditional evaluations are performed at the beginning of an installation and are performed in the order that they are shown in the Launch Conditions Editor.
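For example, a launch condition that blocks installation on operating systems older than Windows XP could put the following expression in its Condition property, using the standard Windows Installer VersionNT property (501 corresponds to version 5.01, i.e. Windows XP):

```
VersionNT >= 501
```

If the expression evaluates to false at the start of the installation, setup stops and displays the message configured for that launch condition.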
Terminal Services Keyboard Shortcuts RDP REmote Desktop
TABLE 1: Terminal Services Keyboard Shortcuts
Keyboard Shortcut - Function
Alt+Del - Displays the active application's Control menu.
Alt+Home - Opens the Windows Start menu within the client session.
Alt+Page Down - Cycles (from right to left) through the current taskbar programs.
Alt+Page Up - Cycles (from left to right) through the current taskbar programs.
Ctrl+Alt+Break - Toggles the client session between windowed and full-screen modes.
Ctrl+Alt+End - Opens the Windows Security dialog box, similar to pressing Ctrl+Alt+Del on the local workstation's keyboard.
Ctrl+Alt+Ins - Cycles through the current taskbar programs in the order in which the user started them.
Ctrl+Alt+Minus (-) - Copies a snapshot of the active client window to the clipboard, similar to pressing Print Screen on the local workstation's keyboard.
Ctrl+Alt+Plus (+) - Copies a snapshot of the entire client-session desktop area to the clipboard, similar to pressing Alt+Print Screen on the local workstation's keyboard.
The Art of Project Management: How to Make Things Happen
Master the many ways to say no
Sometimes, you will need to say no in direct response to a feature request. Other times, you'll need to interject yourself into a conversation or meeting, identify the conflict with priorities you've overheard, and effectively say no to whatever was being discussed. To prepare yourself for this, you need to know all of the different flavors that the word no comes in:
* No, this doesn't fit our priorities. If it is early in the project, you should make the argument for why the current priorities are good, but hear people out on why other priorities might make more sense. They might have good ideas or need clarity on the goals. But do force the discussion to be relative to the project priorities, and not the abstract value of a feature or bug fix request. If it is late in the project, you can tell them they missed the boat. Even if the priorities suck, they're not going to change on the basis of one feature idea. The later you are, the more severe the strategy failure needs to be to justify goal adjustments.
* No, only if we have time. If you keep your priorities lean, there will always be many very good ideas that didn't make the cut. Express this as a relative decision: the idea in question might be good, but not good enough relative to the other work and the project priorities. If the item is on the priority 2 list, convey that it's possible it will be done, but that no one should bet the farm assuming it will happen.
* No, only if you make it happen. Sometimes, you can redirect a request back onto the person who made it. If your VP asks you to add support for a new feature, tell him you can do it only if he cuts one of his other current priority 1 requests. This shifts the point of contention away from you, and toward a tangible, though probably unattainable, situation. This can also be done for political or approval issues: "If you can convince Sally that this is a good idea, I'll consider it." However, this can backfire. (What if he does convince Sally? Or worse, realizes you're sending him on a wild goose chase?)
* No. Next release. Assuming you are working on a web site or software project that will have more updates, offer to reconsider the request for the next release. This should probably happen anyway for all priority 2 items. This is often called postponement or punting.
* No. Never. Ever. Really. Some requests are so fundamentally out of line with the long-term goals that the hammer has to come down. Cut the cord now and save yourself the time of answering the same request again later. Sometimes it's worth the effort to explain why (so that they'll be more informed next time). Example: "No, Fred. The web site search engine will never support the Esperanto language. Never. Ever."
Tessitura Software Announces New Features at Worldwide User Conference
Tessitura Network 2005 User Conference Attended by 500+ Users
DALLAS--(BUSINESS WIRE)--Aug. 22, 2005--The annual worldwide Tessitura Network, Inc. user conference was held in Boston this year with record attendance of over 500 users from approximately 100 arts organizations who have licensed Tessitura Software(R) from Impresario, L.L.C. The attendees represented licensees in Australia, the United Kingdom, Canada and the United States.
While at the conference, attendees could choose from over 40 sessions presented by 70+ session leaders. General sessions announced the launch of the following major enhancements and interfaces to Tessitura:
* An optional Select Your Own Seat online buying path for organizations who elect to put that power in the hands of their online ticket buyers
* A proven Access Control system utilizing handheld scanners for real time admission authorization
* A kiosk system to handle automated 'will calls' and remote ticket purchases
The Tessitura Network, Inc. structure allows licensees of the software to vote on priorities for its future development. With version 5.5 released in July, the users voted at the meeting on development priorities for versions 6.0 and 6.5. Tessitura releases new versions twice a year and updates the included web interface at the same time. Since 2002, Tessitura licensees have been provided with a web interface that covers online ticketing, contributions, subscriptions and many other functions. No transaction fees are due to Impresario or the Tessitura Network, Inc., so arts organizations are able to maximize the revenue and contributions they generate. Panel discussions at the conference highlighted average web sales patterns for single tickets of up to 70%. The nonprofit Network organization is governed by a board of senior arts administrators whose interests are aligned with the ne…
How to set up Windows 2003 Web Edition without Plesk?
"5.g) To make IIS recognize c:\php\sapi\php4isapi.dll, go into IIS, select the 'Web Service Extensions' node, and add an extension named 'php' with the required DLL c:\php\sapi\php4isapi.dll, specifying it as 'Allowed'."
How to Report Bugs Effectively
How to Report Bugs Effectively: "1. The first aim of a bug report is to let the programmer see the failure with their own eyes. If you can't be with them to make it fail in front of them, give them detailed instructions so that they can make it fail for themselves.
2. In case the first aim doesn't succeed, and the programmer can't see it failing themselves, the second aim of a bug report is to describe what went wrong. Describe everything in detail. State what you saw, and also state what you expected to see. Write down the error messages, especially if they have numbers in.
3. When your computer does something unexpected, freeze. Do nothing until you're calm, and don't do anything that you think might be dangerous.
4. By all means try to diagnose the fault yourself if you think you can, but if you do, you should still report the symptoms as well.
5. Be ready to provide extra information if the programmer needs it. If they didn't need it, they wouldn't be asking for it. They aren't being deliberately awkward. Have version numbers at your fingertips, because they will probably be needed.
6. Write clearly. Say what you mean, and make sure it can't be misinterpreted.
7. Above all, be precise. Programmers like precision."
How Do I in Team Edition for Testers
Visual Studio Team System
This page is a gateway to task-based topics that help you find answers to your questions. The general categories of this information are listed below. These links point to pertinent information for Microsoft Visual Studio 2005 Team Edition for Software Testers.
How Do I
* Work with test projects.
* Configure my tests.
* Manage my tests.
* Use a test rig.
* Run a test.
* Get test results.
* Use a work item.
* Perform a unit test.
* Perform a Web test.
* Perform a load test.
* Perform a manual test.
* Perform a generic test.
* Create an ordered list.
How to: Use the Web Test API
You can write code for your Web tests. The Web test API is used to create coded Web tests, Web test plug-ins, request plug-ins, requests, extraction rules, and validation rules. The classes for these types (WebTest, WebTestPlugin, WebTestRequestPlugin, WebTestRequest, ExtractionRule, and ValidationRule) are the core of the API; the remaining types support creating those objects. Use the Microsoft.VisualStudio.TestTools.WebTesting namespace to create customized Web tests, and use the Object Browser to examine it. Both the C# and Visual Basic editors offer IntelliSense support for coding with the classes in the namespace.
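As a rough sketch of what a coded Web test in that namespace looks like, the minimal class below derives from WebTest and yields a single request; the class name and URL are placeholders of my own, not taken from the documentation, and the code must run inside the Visual Studio Team Test framework rather than as a standalone program:

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Hypothetical coded Web test: issues one GET request and
// checks that the response comes back with HTTP status 200.
public class HomePageWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        // Placeholder URL; substitute your own application's address.
        WebTestRequest request = new WebTestRequest("http://localhost/MyApp/Default.aspx");
        request.ExpectedHttpStatusCode = 200;
        yield return request;
    }
}
```

Because GetRequestEnumerator is an iterator, the test engine executes each yielded request in turn, which is how multi-step coded Web tests are composed.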
The Scrum Software Development Process for Small Teams, IEEE
http://members.cox.net/risingl1/articles/IEEEScrum.pdf
Testing .NET Application Blocks, Version 1.0
Summary
Testing .NET Application Blocks covers the testing areas used during testing and verification of the various application blocks provided by Microsoft's patterns & practices group, including functional, globalization, performance, integration, and security testing. The guide uses code examples, sample test cases, and checklists to demonstrate how to plan and implement each type of test; it also recommends tools for running the tests. It covers test considerations to take into account when customizing these application blocks or integrating them with your own applications.
Because this guide focuses on testing .NET application blocks, its scope does not include user interface (UI) testing or database testing. Although the guidance is developed within the context of Microsoft patterns & practices application blocks, it still applies to testing .NET code in general.
Tester Blog: Dinesh
Testing Web applications
As a tester at Microsoft, I have gained experience in testing APIs, command-line tools, and rich-client apps. However, testing web applications or commercial web sites is something I want to learn. For example: what does it take to test a shopping web site? Some of the things I know in this domain are UI testing, security, load, stress, and performance testing, testing with multiple browsers, etc. I have also learned that WinRunner and LoadRunner are commonly used tools. I would like to know details on how to get a web app to ship quality, commonly used testing strategies and trade-offs, tools, and process. If you can point to docs and tools, that would be great.
Update 2/22: Since the last post, I found a couple of good books on web testing:
1. The Web Testing Companion by Lydia Ash
2. Performance Testing Microsoft .NET Web Applications
Getting Started With Carbon
http://developer.apple.com/referencelibrary/GettingStarted/GS_Carbon/index.html