Chennai Tester's Meet

Finally, we managed to host the Chennai Tester's Meet at Aspire on October 29th, 2011. I had been in touch with some of the RIA-RUI (Rich Internet Application - Rich User Interface) members since June to conduct the event at the Aspire premises, but unfortunately my travel schedule during July, August, and September forced us to postpone the event to October.

Though 140 testing professionals from companies across Chennai had registered for the event, the Diwali weekend combined with heavy rains in Chennai came as a dampener. Overall, 50 members attended the event, of which close to 15 were from Aspire. The event started with my CTO, Mr. Shankar Krishnamoorthy, addressing the participants. As part of his speech, Mr. Shankar welcomed the gathering and stressed the importance of testing in today's context.

My presentation, "Knowledge Transfer in New Assignments - Challenges and some tips to overcome them", followed next. I emphasized the need to extract tacit knowledge through various means, the most important of them being socializing. The audience connected well with the topic, and some of them shared their experiences and challenges in acquiring tacit knowledge.

We had Mr. Murali from Testpro as the guest speaker at the event. Mr. Murali gave a motivational speech about the external influences an employee encounters in an organization and how to handle them. The forum was left open for debate after Mr. Murali's speech. The topic for the debate was "Documentation - Yes or No" in QA assignments, and the participants came up with some interesting points. The debate ended with most of the attendees agreeing that "required and relevant documentation" adds value.

Lunch was organized for the attendees, who used the opportunity to network with each other.

STARWEST 2011

I reached Anaheim around 2 a.m. on October 2nd from Philadelphia. Downtown Anaheim was lively and bustling with activity: the Microsoft SharePoint conference was happening at the Anaheim Convention Center, and STARWEST 2011 at the Disneyland Hotel. I went to the SQE booth the next morning to collect the booth staff badges for me and my colleagues. I was surprised by the level of professionalism at the conference venue. Clearly written directions helped us contact the right people for our needs, and an SQE staff member was assigned to us for the duration of the event to help us network with others, set up our booth, and so on.

On the morning of the 3rd, we went to set up our stall and finished by about 10 a.m. James Whittaker and several other leading QA consultants (whose names I had seen only in forums prior to the event) were giving speeches as part of the industry technical presentations at STARWEST 2011. A Test Lab was set up near the entrance, where visitors were asked to solve testing puzzles and the like. James and Jonathan Bach, along with Michael Bolton, were present close to the Test Lab, showcasing Rapid and Exploratory Testing approaches.

One of the eye-catching attractions for visitors was the book stall, strategically located in the visitors' area. Each day a different author sat there, signing books purchased by visitors; I happened to see Erik van Veenendaal on one of the days. As a strategic move in line with our corporate theme, we had a violinist from LA play at our stall for two days, which attracted a lot of visitors. Dorothy Graham was one of the prominent QA personalities to visit our stall and appreciate the music. I connected with Dorothy during her visit and mentioned that I am a big fan of her thought-provoking articles on StickyMinds and other forums.

The organizers (SQE) had planned so thoroughly that everything went flawlessly. Breakfast, lunch, and tea were provided in the booth area, and the booth staff had priority access to everything. A bar counter was opened in the evenings and was available to visitors as well. Wipro, Infosys, Mindtree, AppLabs, and Aspire were some of the companies from India; Ranorex, SOASTA, HP, and Microsoft (showcasing Visual Studio Test Edition) were some of the renowned global companies at the event. Though I had been part of conferences like STC in India before, STARWEST 2011 was my first international conference, and I should say it helped me broaden my QA perspective. My session on "Test Maturity - Tweens, Teens and Twenties" attracted about 23 attendees, most of whom had come for the Amazon Kindle and iPad raffle we offered. :-)) I will try to post some snaps soon.

Exploratory Testing and Session-Based Test Management

One of the customers we recently signed had a very tight release timeframe and was relying on our team to help deliver the product with good quality. There was not much time for knowledge transfer, and our team was expected to pitch in from day one. I suggested Exploratory Testing to the customer and, to measure the effectiveness of ET, recommended SBTM (Session-Based Test Management).

Since ours was an onsite-offshore model, I was not sure how SBTM would work; I had a preconceived image that SBTM works only when our team is sitting with the customer's team at their premises. I prepared an SBTM template to share with the customer, covering the Charter, Test Notes, Bug Investigation and Reporting, Issues, Opportunity Testing, Charter vs. Opportunity, and so on.
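For illustration, here is a minimal sketch of how such a session sheet could be modelled in Python. The field names mirror the template fields listed above, but the structure and the metric are my own illustration, not the actual template we shared with the customer.

from dataclasses import dataclass, field
from typing import List

@dataclass
class SessionSheet:
    """One time-boxed exploratory testing session (SBTM)."""
    charter: str                  # the mission for this session
    tester: str
    duration_minutes: int         # sessions are typically 60-120 minutes
    test_notes: List[str] = field(default_factory=list)
    bugs: List[str] = field(default_factory=list)     # bug investigation and reporting
    issues: List[str] = field(default_factory=list)   # questions, blockers, risks
    opportunity_minutes: int = 0  # time spent off-charter (opportunity testing)

    def charter_vs_opportunity(self) -> str:
        """Rough on-charter vs. opportunity split, e.g. '80/20'."""
        on_charter = self.duration_minutes - self.opportunity_minutes
        pct = 100 * on_charter // self.duration_minutes
        return f"{pct}/{100 - pct}"

For example, a 90-minute session with 18 minutes of opportunity testing reports an "80/20" split.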

I shared the template with my team for review before presenting it to the customer. The team felt that such a template was too detailed and that the customer might not be interested in that level of detail. We reworked the template to include details more relevant to the customer, such as bugs filed during the day, clarifications raised, and so on. Except for the Charter, most of the other fields were changed. What started as an SBTM report has now become a daily status report. :-))

My questions to other visitors to this blog are:

Is Session-Based Test Management possible in an onsite-offshore model or a pure offshore model? If yes, how do you do it effectively?

Can we convince customers about Charter vs. Opportunity Testing?

I understand that ET (Exploratory Testing) can be performed effectively by skilled testers, but unfortunately in many cases you don't get that luxury. How do you perform ET with a team that is not very skilled?

How do you show the customer the product knowledge acquired as part of ET?

Testing for the Cloud

The buzzword seems to be "Cloud Computing" now.

Steve Jobs spoke about iCloud a few weeks back. Infosys CEO Kris Gopalakrishnan says "Take advantage of cloud computing". About a year back, Larry Ellison asked "What the hell is cloud computing?", but he seems to be eating his own words now. Today, Oracle offers two sets of cloud-enabling products and technologies: some of its core technologies, like grid computing and middleware, still bearing their traditional names, and a second, newer group of tools to which Oracle has attached the term "cloud computing" itself.

So cloud computing seems to be the next big thing, and I look forward to people calling themselves cloud testing experts in the near future. IV&V companies might come up with test automation frameworks designed exclusively for the cloud, and there could be a lot of focus on performance and security testing, since increased data sharing in cloud environments will demand both. Let's wait and see what the cloud has to offer the testing community.

What's happening to Estimation?

There used to be a time when every customer wanted the vendor to give detailed estimates for testing their application. I have seen estimates planned for 2-3 years from the engagement start date, including resource ramp-up/ramp-down plans, cost planning based on resource plans, and so on. All this seems to be changing now.

Some of the customers we have signed up recently are not interested in test estimation at all. They want us to start with a Proof of Concept, evaluate the results of the PoC, and if the results are rosy, go ahead and start the engagement. No clear milestones and deliverables defined, no visibility on resource addition/deletion, no test plan or test strategy identified for the application. Terms like Function Points, Test Case Points, and COCOMO are no longer heard in the vicinity.
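For readers who haven't come across these techniques, here is a toy sketch of the arithmetic behind a Test Case Point style estimate. The weights, counts, and productivity figure are assumptions I made up for the illustration, not standard values.

# Toy Test Case Point estimate; all numbers below are illustrative assumptions.
test_cases = {"simple": 120, "medium": 60, "complex": 20}   # counts by complexity
weights    = {"simple": 1,   "medium": 2,  "complex": 4}    # points per test case

points = sum(test_cases[c] * weights[c] for c in test_cases)  # 120 + 120 + 80 = 320
points_per_person_day = 16                                    # assumed productivity
effort = points / points_per_person_day                       # 20 person-days

print(f"{points} test case points ~= {effort:.0f} person-days of test effort")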

This trend is definitely not good. Test Managers and Leaders must emphasize the need for estimation and planning to prospects/customers.

Point to Ponder

This is a list of approaches, styles, and philosophies in software development. It also contains software development processes, software development methodologies, and individual practices, principles, and laws.
Agent-oriented programming
Agile software development
Agile Unified Process (AUP)
Aspect-oriented Programming
Behavior Driven Development (BDD)
Big Design Up Front (BDUF)
Blind Men And Elephant Approach (BMAEA)
Brooks's law
Cathedral and the Bazaar (see also Release early, release often)
Code and fix
Cone of Uncertainty
Constructionist design methodology (CDM)
Continuous integration
Control tables
Conway's Law
Cowboy coding
Crystal Clear
Dependency injection
Design-driven development (D3)
Design Driven Testing (DDT)
Domain-Driven Design (DDD)
Don't Make Me Think (book by Steve Krug about human computer interaction and web usability)
Don't repeat yourself (DRY) or Duplication is Evil (DIE) or Once and Only Once (OAOO), Single Point of Truth (SPoT), Single Source Of Truth (SSOT)
Dynamic Systems Development Method (DSDM)
Evolutionary prototyping
Extreme Programming (XP)
Feature Driven Development
Good Enough For Now (GEFN)
Hollywood Principle
Inversion of control
Iterative and incremental development
Joint application design, aka JAD or "Joint Application Development"
Kaizen
Kanban
KISS principle original (Keep It Simple and Stupid), derogatory (Keep It Simple, Stupid!)
Lean software development
Literate Programming
Microsoft Solutions Framework (MSF)
Model-driven architecture (MDA)
MoSCoW Method
Open source
Open Unified Process
Parkinson's Law
Quick-and-dirty
Rapid application development (RAD)
Rational Unified Process (RUP)
Release early, release often (see also The Cathedral and the Bazaar)
Responsibility-driven design (RDD)
Scrum
Separation of concerns (SoC)
Service-oriented modeling
Software Craftsmanship
Software System Safety
Solid (object-oriented design)
Spiral model
Structured Systems Analysis and Design Method (SSADM)
SUMMIT Ascendant (now IBM Rational SUMMIT Ascendant)
Team Software Process (TSP)
Test-driven development (TDD)
Two Tracks Unified Process (2TUP)
Unified Process (UP)
Unix philosophy
V-Model
Waterfall model
Wheel and spoke model
When it's ready
Win-Win Model
Worse is better (New Jersey style, as contrasted with the MIT approach)
You Ain't Gonna Need It (YAGNI)

Source: Wikipedia

Wonder how testing happens in each of these?

PDF attachments in Hotmail

I hadn't noticed this issue for a long time: though I am a long-term Hotmail user, I seldom use it for 'serious' purposes (like job mail, bills, etc.). Of late, I tried Hotmail for some of my service billings (broadband, bank, and so on).

Quite interestingly, I am NOT able to open any PDF attachment received in my Hotmail account. I tried Firefox 3.6 and IE 7. I have to forward the mail to my Gmail account to open it.
 
Incredulous, I googled, and yes, that is expected: you cannot open PDF attachments in Hotmail, for security purposes! Guys, this is too bad. It may be okay from a technical perspective, but not from the user's perspective. Across the globe, I strongly believe, bills and other important documents arrive in PDF format (with or without encryption), and if those cannot be opened from Hotmail...

The height of security and irrationality!

Test (Data, Case, Environment) <==> Bug

In my experience, I repeatedly find that a proper mapping between

Test Case <==> Test Data <==> Test Environment <==> Bug

acts as a great tool for any tester or test team.

In other words, every bug should have its own test case and test data mapped. In the same way, every test case (or test run) should have its test environment, test data, and bug (if applicable) mapped.
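As a minimal sketch, one record of this mapping could look like the following; the field names and types are mine, purely for illustration.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TestRun:
    test_case_id: str        # e.g. "TC-101"
    test_data_ref: str       # e.g. "customers_dataset_v3.csv"
    environment: str         # e.g. "Win7 / IE7 / build 2.4.1"
    bug_ids: List[str] = field(default_factory=list)  # bugs found, if any

def coverage_ratio(runs: List[TestRun]) -> Optional[float]:
    """Benefit (5) below: number of test cases vs. total number of bugs."""
    total_bugs = sum(len(run.bug_ids) for run in runs)
    return len(runs) / total_bugs if total_bugs else None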

It helps the tester / test team with:

1. A high rate of reproducible bugs
2. Making the test process resource-independent
3. Transparency in testing
4. Getting a view of test coverage
5. Getting the test case coverage ratio (number of test cases vs. total number of bugs)

Though it seems naive when you read it, this is one of the most challenging tasks in any test team (and the challenge is inversely proportional to the testing team's size).

The problem starts when the test team enters the hectic testing schedule of the project. During and after this highly concentrated testing, the team is too exhausted to maintain the mapping and concentrates more on 'after go-live' issues.


Even when, after two or three short releases, the testing team gets a little time for this exercise, it is very difficult to carry out the mapping unless the team is highly disciplined. A Test Manager can easily use this mapping as a yardstick to measure the 'discipline' of the testing team!

Power of Ticketing

I am not sure whether this practice is followed across companies in the industry; I am sure at least a few projects are not following it.

We are all aware of filing bugs against the AUT (Application Under Test). But not all issues filed in the bug tracking system are bugs: some turn out to be enhancements (nice-to-have features), and some are simply 'Task' PRs assigned to developers.

This Task PR (meaning that a particular task has to be done by the assignee of the ticket) can be used very efficiently by the Test Manager to keep track of the tasks testers need to do.

Imagine a testing module where both manual and automation testers are working. At the beginning of the testing life cycle, many things are discussed as 'to be done', and they are tracked through emails, task requests in Outlook, or simply by "managers". We can use the bug tracking system for this purpose: the manager can raise 'Task' tickets against testers for all the tasks he wants them to do, or testers can even create tickets and assign them to themselves.

Examples of such task tickets are:

(i) Creating new test cases for new features (and getting sign-off from product management and developers). This issue is considered 'Completed' only when the peer review, product management review, developer review, and second-level review are all done. We can configure the workflow in our bug tracking system accordingly (a rough sketch of such a workflow appears at the end of this post).

(ii) Selecting test cases from the manual test case repository for automation and getting sign-off from the automation engineer.

(iii) Automating all the selected manual test cases in a feature and getting peer/client sign-off.

(iv) Finishing self-reviews for performance appraisals.

(v) Verifying all the bugs for this release.

(vi) Publishing the performance numbers comparing the last release and this one.

(vii) Cleaning up the test cases (deleting all the obsolete ones); criterion => test cases written in the past 3 years.

(viii) Completing the knowledge transfer session (this task is considered complete only when the person receiving the KT has given a reverse presentation and signed off the documents).

(ix) Doing 5 interviews before 30-Feb.

...and many more.

All of these are treated as open bugs and considered important criteria for the testing sign-off of the particular release.

We can create separate areas in the bug tracking system, such as "Manual Testing Work", "Automation Testing Work", etc. This is analogous to an IT help desk ticket, but internal to the testing team. How diligently we follow this determines how successful it is.
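As promised under example (i), here is a rough sketch of what such a task-ticket workflow could look like. The state names are my own illustration; most bug trackers let you configure something equivalent.

# Illustrative workflow for the task tickets in example (i). State names
# are assumptions; configure the equivalents in your own bug tracker.
TASK_WORKFLOW = {
    "Open":                      ["Peer Review"],
    "Peer Review":               ["Product Management Review"],
    "Product Management Review": ["Developer Review"],
    "Developer Review":          ["Second Level Review"],
    "Second Level Review":       ["Completed"],
    "Completed":                 [],  # only now does the ticket stop blocking sign-off
}

def can_transition(current: str, target: str) -> bool:
    """Return True if a ticket may move from `current` to `target`."""
    return target in TASK_WORKFLOW.get(current, [])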

My 'Building' Experience

A few years back, I was working on a project as the lone tester. The project was developed from scratch, entirely out of India. One of the additional responsibilities that came to me was to 'build' the project using a tool called "Anthill". Since I was relatively new to the industry, I was excited and began to play with it. There was no proper planning, so there were major fixes every day, and I, as the tester (with the additional responsibility of 'building' with the latest fix), started taking builds on a daily basis. In addition, for every bug I filed, I started getting calls from programmers to 'check in the latest build... just to make sure...'. I was able to file bugs only after getting the latest code.

The point here is that it took at least one to two hours for me to take the latest build, and if I found a showstopper, I stopped testing, waited for the fix, and then started 'building' all over again.

There was no other tester, there was no change to this process, and there was no comment from the PM (since I didn't have a test manager); I was surrounded by developers.

After a while, I was finding only a few bugs, and only showstoppers at that, while concentrating on improving Anthill's build.xml to make builds quicker, which was lauded as a good solution by a few developers. By this point, around 50% of my time went into running Anthill, waiting for builds, or improving the Anthill process.

All went fine until one morning my client found a 'Critical' bug (not a 'showstopper') in one of the previous builds (a few days old). Naturally, the developers started telling him to check it in the latest build, which he refused to do; he filed the bug instead. His answer: though there may be a fix in the coming builds, this bug was found in build 'xyz', that is a fact, and so I am filing the bug; later we can change its status accordingly.

After this incident, I took a step back, STOPPED building the code on demand, and established a build cycle. I made it clear that I, as the tester, would take the build every Wednesday and file bugs based on that build (irrespective of whether they were fixed in later builds). I then realized there were many Critical/Major bugs in the product which should have been caught long, long back.

But these actions came only at the end of the project; as expected, our estimation went terribly wrong, and the blame came to me as well, as the tester (in fact, I got the lion's share... :-( ).

Moral: (i) Decide/negotiate the build cycle and the interval between builds at an early stage of testing, and follow the rule religiously (unless a fix is very critical).
(ii) If you as a tester take on the additional responsibility of build engineer, document writer, whistle-blower in the CMMi process, or whatever, be clear about giving priority to testing first, and only then go for the others.

OLE in Gmail

I am not sure whether this facility is available in Gmail or not; if it is not, it would be great to have.

I am not able to copy some Excel fields and paste them into my Gmail. In other words, OLE is not supported in Gmail. I am not very sure whether OLE is even possible outside Microsoft components.

If anybody has a workaround for this (embedding an Excel sheet, or part of one, in Gmail), I would be thankful. :-)

Interesting Observation

Today I came across an interesting issue in my Yahoo Mail inbox. I had some Birthday Alarm reminders which I deleted directly without reading. Thereafter, I opened a mail from one of my friends, but the Birthday Alarm mail got opened instead. Surprised, I went back using the browser's back button and clicked my friend's mail again; once again, the Birthday Alarm content was displayed. I refreshed and tried once more, and was finally able to see my friend's mail. Not sure what the issue could have been. Have any readers seen something similar?