Chennai Tester's Meet
STARWEST 2011
Exploratory Testing and Session Based Test Management
Testing for the Cloud
Steve Jobs spoke about iCloud a few weeks back. Infosys CEO Kris Gopalakrishnan says "Take Advantage of Cloud Computing". About a year back Larry Ellison asked "What the hell is cloud computing?", but he seems to be eating his own words now. Today, Oracle offers two sets of cloud-enabling products and technologies: some of its core technologies like grid computing and middleware, still bearing their traditional names, and a second, newer group of tools to which Oracle has attached the term "cloud computing" itself.
So cloud computing seems to be the next big thing, and I look forward to people calling themselves cloud testing experts in the near future. IV&V companies might come up with test automation frameworks designed exclusively for the cloud, and there could be a lot of focus on performance and security testing, since data sharing in cloud environments increases both concerns. Let's wait and see what the cloud has to offer the testing community.
What's happening to Estimation?
Some of the recent customers we have signed up are not interested in test estimation at all. They want us to start with a Proof of Concept, evaluate based on the results of the PoC, and if the results are rosy, go ahead and start the engagement. There are no clear milestones and deliverables defined, no visibility on resource addition/deletion, no test plan or test strategy identified for the application, etc. Terms like Function Points, Test Case Points, and COCOMO are no longer heard in the vicinity.
This trend is definitely not good. Test Managers and Leaders must emphasize the need for estimation and planning to prospects/customers.
Point to Ponder
This is a list of approaches, styles, and philosophies in software development. It also contains software development processes, software development methodologies, and individual practices, principles, and laws.
Agent-oriented programming
Agile software development
Agile Unified Process (AUP)
Aspect-oriented Programming
Behavior Driven Development (BDD)
Big Design Up Front (BDUF)
Blind Men And Elephant Approach (BMAEA)
Brooks's law
Cathedral and the Bazaar (see also Release early, release often)
Code and fix
Cone of Uncertainty
Constructionist design methodology (CDM)
Continuous integration
Control tables
Conway's Law
Cowboy coding
Crystal Clear
Dependency injection
Design-driven development (D3)
Design Driven Testing (DDT)
Domain-Driven Design (DDD)
Don't Make Me Think (book by Steve Krug about human computer interaction and web usability)
Don't repeat yourself (DRY) or Duplication is Evil (DIE) or Once and Only Once (OAOO), Single Point of Truth (SPoT), Single Source Of Truth (SSOT)
Dynamic Systems Development Method (DSDM)
Evolutionary prototyping
Extreme Programming (XP)
Feature Driven Development
Good Enough For Now (GEFN)
Hollywood Principle
Inversion of control
Iterative and incremental development
Joint application design, aka JAD or "Joint Application Development"
Kaizen
Kanban
KISS principle: original (Keep It Simple and Stupid), derogatory (Keep It Simple, Stupid!)
Lean software development
Literate Programming
Microsoft Solutions Framework (MSF)
Model-driven architecture (MDA)
MoSCoW Method
Open source
Open Unified Process
Parkinson's Law
Quick-and-dirty
Rapid application development (RAD)
Rational Unified Process (RUP)
Release early, release often (see also The Cathedral and the Bazaar)
Responsibility-driven design (RDD)
Scrum
Separation of concerns (SoC)
Service-oriented modeling
Software Craftsmanship
Software System Safety
SOLID (object-oriented design)
Spiral model
Structured Systems Analysis and Design Method (SSADM)
SUMMIT Ascendant (now IBM Rational SUMMIT Ascendant)
Team Software Process (TSP)
Test-driven development (TDD)
Two Tracks Unified Process (2TUP)
Unified Process (UP)
Unix philosophy
V-Model
Waterfall model
Wheel and spoke model
When it's ready
Win-Win Model
Worse is better (New Jersey style, as contrasted with the MIT approach)
You Ain't Gonna Need It (YAGNI)
Source: Wikipedia
Wonder how testing happens in each of these?
Test(Data,Case,Environment)<==>Bug
Test Case <==> Test Data <==> Test Environment <==> Bug
Maintaining this mapping acts as a great tool for any tester / test team.
In other words, every bug should have its own test case/test data mapped.
In the same way, every test case (or test run) should have its test environment / test data / bug (if applicable) mapped.
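As a rough illustration, here is a minimal sketch of what one record of this mapping could look like if kept as structured data; the field names (test_case_id, test_data, environment, bug_id) and the values are hypothetical, not taken from any particular tool:

```python
# A hypothetical record for one test run, keeping the
# Test Case <==> Test Data <==> Test Environment <==> Bug mapping in one place.
test_run = {
    "test_case_id": "TC-1043",                        # which test case was executed
    "test_data": "orders_q3.csv",                     # exact data set used
    "environment": "Win2003 / IE7 / build 5.2.17",    # where it was run
    "bug_id": "BUG-2211",                             # None if the run found no bug
}

def is_reproducible(run):
    """A bug is easy to reproduce only when the whole mapping is recorded."""
    return all(run.get(key) for key in ("test_case_id", "test_data", "environment"))

print(is_reproducible(test_run))  # True
```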
It will help the tester / test team with:
1. A high rate of reproducible bugs
2. Making the test process resource-independent
3. Transparency in testing
4. Getting a view on test coverage
5. Getting the test case coverage ratio (number of test cases vs. total number of bugs)
Although it seems naive to read, this is one of the most challenging tasks in any test team (inversely proportional to the testing team's size).
The problem starts when the test team enters the hectic testing schedule of the project. During and after this concentrated testing, the team gets too exhausted to maintain the mapping and concentrates more on 'after live' issues.
Even when the testing team gets a little time after two or three short releases, it is very difficult to carry out this mapping exercise unless the team is highly disciplined. A Test Manager can easily use this mapping as a yardstick to measure the 'discipline' of the testing team!
Power of Ticketing
We are all aware of filing bugs against the AUT (Application Under Test). Not all issues filed in the bug tracking system are bugs: some may turn out to be Enhancements (nice-to-have features), and some are simply 'Task' PRs assigned against developers.
A Task PR (meaning that particular task has to be done by the assignee of the issue) can be used very efficiently by the Test Manager to keep track of the tasks to be done by testers.
Imagine a testing module where both manual and automation testers are working. At the beginning of the testing life cycle, many things are discussed as 'to be done', and they are tracked through emails, task requests in Outlook, and simply by "Managers". We can use the bug tracking system for this purpose: the Manager can raise 'Task' tickets against testers for all the tasks he wants done, or testers can even create tickets and assign them to themselves.
Examples of such task tickets are:
(i) Create new test cases for the new features (and get sign-off from product management and developers). This issue is considered 'Completed' only when the peer review, product management review, developer review, and second-level review are complete. We can create the workflow in our bug tracking system accordingly.
(ii) Select test cases from the manual test case repository for automation and get sign-off from the automation engineer.
(iii) Automate all the selected manual test cases in a feature and get peer / client sign-off.
(iv) Finish the self-review for performance appraisals.
(v) Verify all the bugs for this release.
(vi) Publish the performance numbers comparing the last release and this release.
(vii) Clean up the test cases (delete all the obsolete test cases). Criteria: test cases written in the past 3 years.
(viii) Complete the knowledge transfer session (this task is considered completed only when the person receiving the KT has given a reverse presentation and signed off the documents).
(ix) Do 5 interviews before 30-Feb.
and many more.
All these are treated as open bugs and considered important criteria in the testing sign-off of that particular release.
We can create separate areas in the bug tracking system, such as "Manual Testing Work", "Automation Testing Work", etc. This is analogous to an IT help desk ticket, but internal to the testing team. How diligently we follow this reflects how successful it will be.
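As a rough sketch of the idea, assuming the tickets can be exported as simple records (the field names and the 'Task' type below are illustrative, not any specific bug tracker's API), checking that no task tickets remain open before sign-off could look like this:

```python
# Hypothetical ticket records exported from the bug tracking system;
# field names, areas, and statuses are illustrative only.
tickets = [
    {"id": "T-101", "type": "Task", "area": "Manual Testing Work",
     "summary": "Create test cases for new features", "status": "Completed"},
    {"id": "T-102", "type": "Task", "area": "Automation Testing Work",
     "summary": "Automate selected manual test cases", "status": "Open"},
    {"id": "B-310", "type": "Bug", "area": "Billing",
     "summary": "Incorrect rounding of totals", "status": "Open"},
]

def open_tasks(tickets):
    """Open 'Task' tickets that block the testing sign-off for this release."""
    return [t for t in tickets if t["type"] == "Task" and t["status"] != "Completed"]

blockers = open_tasks(tickets)
if blockers:
    print("Sign-off blocked by:", [t["id"] for t in blockers])
```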
My 'Building' Experience
The point here is that it would take me at least one to two hours to take the latest build, and if I found any showstopper, I stopped testing and waited for the fix; after the fix, I started 'building' again.
There was no other tester, no change in this process, and no comment from the PM (since I didn't have a test manager), and I was surrounded by developers.
After a while, I was finding only a few bugs, and those only showstoppers, and was concentrating on improving Anthill's build.xml to make quicker builds, which was lauded as a good solution by a few developers. In all this, around 50% of my time went into running Anthill, waiting for builds, or improving the Anthill process.
All went fine until one morning my client found a 'Critical' bug (not a showstopper) in one of the previous builds (a few days old). Obviously, the developers started asking him to check it in the latest build, which he refused, and he filed the bug. His answer was: although there may be a fix in the coming builds, this bug has been found in the 'xyz' build, and that is true, so I am filing the bug; later we can change its status accordingly.
After this incident, I took a back seat, STOPPED continuously building the code, and set up a build cycle. I made it clear that I, as the tester, would take the build every Wednesday and file bugs based on that build (irrespective of whether they were fixed in later builds). I then realized there were many serious Critical/Major bugs in the product which should have been caught long, long back.
But my actions came only at the end of the project; as expected, our estimation went terribly wrong, and the blame came to me as the tester as well (in fact, I got the lion's share... :-( ).
Moral: (i) Decide / negotiate the build cycle and the interval between builds at an early stage of testing, and follow the rule religiously (unless it's a very critical fix).
(ii) If you as a tester take on the additional responsibility of build engineer, document writer, whistle-blower in the CMMi process, or whatever, be clear about giving priority to testing first and only then going for the others.
OLE in Gmail
I am not able to copy some Excel fields and paste them into my Gmail. In other words, OLE is not supported in Gmail. I am not very sure whether OLE is possible outside Microsoft components.
If anybody has a workaround for this (embedding an Excel sheet, or part of one, in Gmail), I would be thankful :-)