Posted by Albert Gareev on Jun 08, 2009
Original date: 28 Apr 2009, 1:00pm Quality Assurance Functional/Regression Automated Testing and Test Automation Requirements: Usability, Maintainability, Scalability, and Robustness. Robustness Requirements Matrix. Manned / Unmanned Test Execution – what level of manual effort in automated testing would be acceptable for you? Baby-sitting: the test script may stop or break execution at any moment. Manual action […] ...
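The "unmanned" end of that spectrum is easiest to see in code. A minimal Python sketch (step names and the failing step are hypothetical, not from the post): every step is guarded so a failure is logged and the run continues, instead of halting and waiting for a human.

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")

def broken_step():
    raise RuntimeError("GUI object not found")

def run_unattended(steps):
    """Run every step; log failures and keep going instead of
    halting the run and waiting for a human (no baby-sitting)."""
    failures = 0
    for name, action in steps:
        try:
            action()
            logging.info("PASS  %s", name)
        except Exception as exc:
            failures += 1
            logging.error("FAIL  %s: %s", name, exc)
    return failures

steps = [
    ("open login screen", lambda: None),
    ("enter credentials", broken_step),
    ("submit form", lambda: None),
]
print("failed steps:", run_unattended(steps))
```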
Posted by Albert Gareev on Jun 07, 2009
Original date: 21 Apr 2009, 12:30pm Quality Assurance Functional/Regression Automated Testing and Test Automation Requirements: Usability, Maintainability, Scalability, and Robustness. Usability Requirements Matrix 2. Transparency – do you want Automated Test execution to be another “black box”, or do you want a fully reproduced picture with documented test results? Execution Log: built-in execution log […] ...
Posted by Albert Gareev on Jun 06, 2009
Original date: 14 Apr 2009, 1:30pm Quality Assurance Functional/Regression Automated Testing and Test Automation Requirements: Usability, Maintainability, Scalability, and Robustness. Usability Requirements Matrix 1. Applicability – how would you want to use an Automated Testing solution? Basic Regression Testing: implemented as a sequence of data entry steps going through the screens. No or simplified verification based on hard-coded ...
Posted by Albert Gareev on Jun 05, 2009
QA Automation – GUI Function Wrapping. Some of the statements below will be covered in detail in my subsequent posts; for now I use them to get straight to the Test Step requirements. Please read the details below the picture. Quality Assurance Functional/Regression Automated Testing and Test Automation Requirements: Usability, Maintainability, Scalability, and Robustness. Let’s break them down […] ...
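Since the post is about wrapping raw GUI functions, a minimal Python illustration of the idea (the bare tool calls and object names here are stand-ins, not a real tool API): the wrapper adds the existence check, logging, and graceful failure that a bare call lacks, so every Test Step gets them uniformly.

```python
def raw_click(obj_name):
    """Stand-in for a testing tool's bare GUI call (hypothetical)."""
    print(f"clicked {obj_name}")

def exists(obj_name):
    """Stand-in for the tool's object-existence check (hypothetical)."""
    return obj_name != "Missing"

def click(obj_name, log):
    """Wrapped GUI function: verify the object first, log the outcome,
    and report failure instead of crashing mid-script."""
    if not exists(obj_name):
        log.append(f"FAIL: {obj_name} not found")
        return False
    raw_click(obj_name)
    log.append(f"PASS: clicked {obj_name}")
    return True

log = []
click("OK", log)
click("Missing", log)
print("\n".join(log))
```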
Posted by Albert Gareev on Jun 04, 2009
Original date: 30 Mar 2009, 1:30pm Front-End Test Automation Practices – Model-based Hybrid Keyword/Data Driven Framework 1. Description • Framework creation – purely programmatic • Test creation – scriptless, visual design • Internal Data Model, capable of importing/exporting data from various sources • GUI/Database checkpoints, parameterized/transitioned • Business Verification Rules • Framework-based structure; ...
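To make the "hybrid" part concrete, a minimal Python sketch (all names illustrative, not the framework described in the post): the test is a keyword table, but step data lives in a separate data model that could be imported from any source, so neither flow nor data is hard-coded.

```python
# Data model, separated from the test: in a real framework this
# could be imported from Excel, a database, or XML.
data_model = {
    "valid_user": {"name": "alice", "password": "secret"},
}

def login(data):
    print(f"logging in as {data['name']}")

def verify_title(data):
    print(f"checking window title = {data['title']}")

keywords = {"login": login, "verify_title": verify_title}

# The test itself: keywords plus references into the data model.
test = [
    ("login", data_model["valid_user"]),
    ("verify_title", {"title": "Main Menu"}),
]

for keyword, data in test:
    keywords[keyword](data)
```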
Posted by Albert Gareev on Jun 02, 2009
Original date: 19 Mar 2009, 1:30pm Front-End Test Automation Practices – Keyword-Driven Framework 1. Description • Framework and keyword creation – programmatic • Test creation – table editor (common practice – MS Excel) • Parameterized, capable of importing spreadsheets • GUI/Database checkpoints, hard-coded and/or parameterized • Framework-based structure • Limited error handling • Keyword-based flow ...
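A rough Python illustration of the mechanics (a sketch, not the post's implementation): programmers write the keywords in the framework, while the test is just a table of rows – exactly what a non-programmer would edit in MS Excel.

```python
# Keyword implementations live in the framework, written by programmers.
def open_screen(name):
    print(f"open screen: {name}")

def enter_text(field, value):
    print(f"enter '{value}' into {field}")

def press(button):
    print(f"press {button}")

KEYWORDS = {"open_screen": open_screen, "enter_text": enter_text, "press": press}

# The test lives in a table (in practice, imported from a spreadsheet).
test_table = [
    ["open_screen", "Login"],
    ["enter_text", "Username", "alice"],
    ["enter_text", "Password", "secret"],
    ["press", "OK"],
]

for keyword, *args in test_table:
    KEYWORDS[keyword](*args)
```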
Posted by Albert Gareev on Jun 01, 2009
Original date: 11 Mar 2009, 1:30pm Front-End Test Automation Practices – Data-Driven Framework 1. Description • Programmatically created • Parameterized, capable of importing spreadsheets • GUI/Database checkpoints, hard-coded and/or parameterized • Library-based structure • Possible error handling • Hard-coded yet data-driven flow (input and logic) • Standard reporting • Verification is limited to the Testing Tool’s […] ...
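A minimal Python sketch of the "hard-coded yet data-driven" pattern (names illustrative): the flow is a fixed script, but it repeats for every data row, which in practice would be imported from a spreadsheet.

```python
import csv, io

# In practice the rows would come from an imported spreadsheet;
# an in-memory CSV keeps the sketch self-contained.
sheet = io.StringIO(
    "username,password,expected\n"
    "alice,secret,Welcome\n"
    "bob,wrong,Error\n"
)

def login_flow(username, password):
    """Hard-coded flow: the same steps run for every data row."""
    print(f"type {username} / {password}; press OK")
    return "Welcome" if password == "secret" else "Error"

for row in csv.DictReader(sheet):
    actual = login_flow(row["username"], row["password"])
    verdict = "PASS" if actual == row["expected"] else "FAIL"
    print(verdict, row["username"])
```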
Posted by Albert Gareev on May 31, 2009
Original date: 3 Mar 2009, 1:00pm Front-End Test Automation Practices – Record/Playback Enhanced 1. Description • Parameterized data • The dataset is part of the script, but not in the code • Verification via GUI checkpoints (hard-coded) • Hard-coded flow • No error handling • No or limited reporting • No structure 2. Advantages • […] ...
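In code terms the "enhanced" variant looks roughly like this Python sketch (the calls stand in for recorded tool actions and are hypothetical): the recorded flow is untouched, but the literals have been pulled out into a dataset that sits next to the script rather than inside its logic.

```python
# Dataset: part of the script file, but no longer buried in the code.
dataset = {"username": "alice", "password": "secret"}

def type_into(field, value):
    print(f"type '{value}' into {field}")   # stand-in for a recorded call

def checkpoint(expected_title):
    print(f"checkpoint: window title == '{expected_title}'")  # hard-coded check

# The recorded flow itself is still hard-coded, step by step.
type_into("Username", dataset["username"])
type_into("Password", dataset["password"])
print("press OK")
checkpoint("Main Menu")
```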
Posted by Albert Gareev on May 30, 2009
Original date: 26 Feb 2009, 1:00pm Front-End Test Automation Practices – Record/Playback 1. Description • Hard-coded data • Hard-coded flow • No error handling • No or limited reporting • No structure • No verification • No validation 2. Advantages • Easy to create • Quick to create • No programming required 3. Disadvantages • […] ...
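For contrast with the enhanced variant above, a plain-Python caricature of a plain recorded script (illustrative only): every value and step is a literal, nothing is verified, and the only way to change the data is to edit the script itself.

```python
# A recorded script: hard-coded data, hard-coded flow.
print("type 'alice' into Username")
print("type 'secret' into Password")
print("press OK")
# ...and that is all: no checkpoint, no error handling, no report.
```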
Posted by Albert Gareev on May 22, 2009
Original date: 4 Dec 2008, 1:49pm ...
Posted by Albert Gareev on Dec 10, 2008
A company hired another “automated tester”… Process description. In a nutshell, the actual job description is: Bring up the application-under-test. Start the “testing” script. Babysit the “testing” script, i.e. manually click/type on the GUI when the script gets stuck, then resume the script. After execution is done, review the “test logs”. Reviewing the “test logs” includes: Go through […] ...
Posted by Albert Gareev on Oct 10, 2008
For Test Automation, Functional Decomposition is a test flow analysis and test design technique, and yet it is often confused with structural programming. Decomposition of Test Flow. Applying the Functional Decomposition technique in Test Flow Analysis means splitting (decomposing) a testing scenario into constituent parts that are logically complete and data-wise isolated. Some of these parts […] ...
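As an illustration of the decomposition idea (my sketch in Python; the scenario and names are hypothetical): the scenario splits into logically complete, data-wise isolated parts, each of which could be reused or reordered in other test flows.

```python
# Each part is logically complete and isolated data-wise:
# it receives what it needs and returns what the next part needs.
def log_in(credentials):
    print(f"log in as {credentials['user']}")
    return {"session": "s-001"}          # hypothetical session handle

def create_order(session, item):
    print(f"[{session['session']}] order {item}")
    return {"order_id": 42}

def verify_order(session, order):
    print(f"[{session['session']}] verify order {order['order_id']}")

# The test scenario is a composition of the parts, not one monolith.
session = log_in({"user": "alice", "password": "secret"})
order = create_order(session, "widget")
verify_order(session, order)
```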
Posted by Albert Gareev on Aug 10, 2008
What is it? IDEF0 is an abbreviation for Integration Definition for Function Modeling [of Systems]. The zero denotes the type of modeling (IDEF0 – function modeling, IDEF1 – information modeling, IDEF2 – dynamics modeling). How does it look? [Image courtesy: Wikipedia] How does it work? [Image courtesy: Berry College] ...
Posted by Albert Gareev on May 30, 2008
Test Flow – the testing process in its dynamics, as a sequence of operations: interactions with an application under test, observations, and evaluations. A Complex Test Flow is represented by a tree of possible test flows for sub-functionalities, including branches and loops. Basic Test Flows. On-Screen Data Entry Flow – the simplest test flow. It performs a basic, straight “happy […] ...
Posted by Albert Gareev on May 24, 2008
Parent page: Basic Test Flows. Back-End Verification Test Flow (Database / Data File / Service). Entry/Exit Points: Entry Point – single; Exit Point – single, same as Entry. Operations: GUI interaction – none; GUI observation – none; Evaluation – assessment of Pass/Fail criteria for comparison. Pass/Fail Criteria – Fail Criteria: failed to find the data record; assessment criteria failure. Stop Criteria: DB […] ...
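A minimal Python sketch of such a flow against an in-memory database (sqlite3 stands in for whatever back end the real test would query; table and data are made up): no GUI interaction at all, just finding the record and assessing the pass/fail criteria listed above.

```python
import sqlite3

# In-memory DB stands in for the real back end under test.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER, status TEXT)")
db.execute("INSERT INTO orders VALUES (42, 'SHIPPED')")

def verify_back_end(order_id, expected_status):
    row = db.execute(
        "SELECT status FROM orders WHERE id = ?", (order_id,)
    ).fetchone()
    if row is None:
        return "FAIL: data record not found"              # fail criterion 1
    if row[0] != expected_status:
        return f"FAIL: status {row[0]!r} != {expected_status!r}"  # criterion 2
    return "PASS"

print(verify_back_end(42, "SHIPPED"))
print(verify_back_end(99, "SHIPPED"))
```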
Posted by Albert Gareev on May 23, 2008
Parent page: Basic Test Flows. Simple Query Verification Test Flow. Entry/Exit Points: Entry Point – single, a GUI screen (window, web page, etc.); Exit Point – single, the same GUI screen. Operations: GUI interaction – typing text, selecting items, clicking buttons, etc.; GUI observation – checking the screen (window, web page, etc.) in whose context the GUI objects are checked; checking GUI objects […] ...
Posted by Albert Gareev on May 22, 2008
Parent page: Basic Test Flows. GUI Screen Verification Test Flow. Entry/Exit Points: Entry Point – single, a GUI screen (window, web page, etc.); Exit Point – single, the same GUI screen. Operations: GUI interaction – none; GUI observation – checking the screen (window, web page, etc.) in whose context the GUI objects are checked; checking GUI objects (exist, enabled, etc.); checking GUI […] ...
Posted by Albert Gareev on May 21, 2008
Parent page: Basic Test Flows. On-Screen Data Entry Flow with Confirmation. Entry/Exit Points: Entry Point – single, a GUI screen (window, web page, etc.); Exit Point – single, the confirmation GUI screen. Operations: GUI interaction – typing text, selecting items, clicking buttons, etc.; GUI observation – checking the screen (window, web page, etc.) in whose context the GUI operations are performed; checking […] ...
Posted by Albert Gareev on May 20, 2008
Parent page: Basic Test Flows. On-Screen Data Entry Flow. Entry/Exit Points: Entry Point – single, a GUI screen (window, web page, etc.); Exit Point – single, at the last operation, the same GUI screen. Operations: GUI interaction – typing text, selecting items, clicking buttons, etc.; GUI observation – checking the screens (windows, web pages, etc.) in whose context the GUI operations are […] ...
Posted by Albert Gareev on Sep 27, 2007
Automated Test Report Requirements. Every executed Test Scenario must be reported. For each Test Case, the report must include a description in both brief and detailed views. The report format must be user-friendly and easily transferable to formats like MS Word and Excel. Test Case Report Requirements. List all executed Test Steps in order. Define the Test Data source […] ...
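A rough Python sketch of a report structure that satisfies these requirements (the structure and field names are my assumption, not from the post): each test case carries both a brief and a detailed view plus its data source, and the flat step rows export cleanly to Excel-friendly formats.

```python
from dataclasses import dataclass, field

@dataclass
class TestCaseReport:
    name: str
    brief: str                                   # one-line summary view
    steps: list = field(default_factory=list)    # ordered, detailed view
    data_source: str = ""                        # where the test data came from

    def add_step(self, n, description, verdict):
        self.steps.append((n, description, verdict))

report = TestCaseReport(
    name="Login with valid user",
    brief="PASS: user reached Main Menu",
    data_source="users.xlsx, row 2",
)
report.add_step(1, "type username", "PASS")
report.add_step(2, "type password", "PASS")
report.add_step(3, "press OK", "PASS")

# Brief view first, then the detailed view; both export as flat rows.
print(report.name, "-", report.brief, f"({report.data_source})")
for n, desc, verdict in report.steps:
    print(f"  step {n}: {desc} -> {verdict}")
```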
Posted by Albert Gareev on Jun 23, 2007
2-tier Data-Driven Test Automation Architecture (WinRunner) – Advanced Reporting ...
Posted by Albert Gareev on Jun 20, 2007
2-tier Data-Driven Test Automation Architecture (WinRunner) – Functional Diagram ...
Posted by Albert Gareev on Jun 15, 2007
2-tier Data-Driven Test Automation Architecture (WinRunner) Business Components. Business Components – functions that implement certain business-case-specific testing functionality used by Test Scripts. Typically, Business Components are contained within compiled modules, which allows reusing the implemented logic in similar Test Scripts. Business Components may incorporate standard WinRunner TSL functions and Service […] ...
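To show the layering rather than WinRunner's TSL itself, a Python analogue (all function names and the payment scenario are hypothetical): a Business Component packages tool-level calls and a service function into one reusable business operation that test scripts invoke directly.

```python
# Tool-level calls (in WinRunner these would be standard TSL functions).
def set_window(name): print(f"focus window {name}")
def edit_set(field, value): print(f"set {field} = {value}")
def button_press(name): print(f"press {name}")

# Service function (utility layer shared by components).
def log_step(msg): print(f"LOG: {msg}")

# Business Component: one business-case operation, reusable across scripts.
def submit_payment(account, amount):
    set_window("Payments")
    edit_set("Account", account)
    edit_set("Amount", str(amount))
    button_press("Submit")
    log_step(f"payment of {amount} submitted for {account}")

# A test script just calls the component.
submit_payment("ACC-1001", 250.00)
```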
Posted by Albert Gareev on May 15, 2007
What’s it about? Automated tests represent a certain test logic as a sequence of steps, manipulating data and interacting with a GUI. Test data, as in manual testing, could be data we feed into the application-under-test (input data), data we retrieve from the application-under-test (actual data, or actual result), and data we compare against (expected result). These data have to be […] ...
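In code the three kinds of data line up like this minimal Python sketch (the application stand-in and values are illustrative): input data is fed in, actual data is read back, and the expected result is what the comparison runs against.

```python
def application_under_test(amount, rate):
    """Stand-in for the system being tested."""
    return round(amount * rate, 2)

input_data = {"amount": 100.0, "rate": 0.13}    # fed into the application
expected_result = 13.0                           # what we compare against

actual_result = application_under_test(**input_data)  # retrieved back

print("PASS" if actual_result == expected_result else
      f"FAIL: expected {expected_result}, got {actual_result}")
```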
Posted by Albert Gareev on Jan 22, 2007
…On hard-coding of test data, continued. Parameterization is a structured programming approach. Let’s start with an example. Such a function is useless when you need to sum numbers other than 2 and 3. Now let’s apply mapping. It is a little better, but we still have to change the source code and restart the script if […] ...
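The post's code listing did not survive into this excerpt; here is a reconstruction of the kind of example it describes (my sketch in Python, not the original listing): a hard-coded sum, then the parameterized version that no longer needs a source-code change per data set.

```python
# Hard-coded: useless for any numbers other than 2 and 3.
def sum_hardcoded():
    return 2 + 3

# Parameterized: the data moves out of the function body...
def sum_parameterized(a, b):
    return a + b

# ...and can then come from outside the script entirely,
# so changing data no longer means changing and restarting code.
for a, b in [(2, 3), (10, 32), (-1, 1)]:
    print(a, "+", b, "=", sum_parameterized(a, b))
```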
Posted by Albert Gareev on Jan 20, 2007
Hard-coding – storing data within the code. A printed document is called a “hard copy”, while its electronic version is “soft”. We can easily apply a change to a soft copy. And yet code, compiled and assembled into an application, while stored electronically, is a “hard” copy too. Applying changes to it is almost an […] ...