Microsoft Interview Question

Interview Question

Principal Software Test Lead Interview, Reading, England (UK)

(2nd colleague): [scenario-based session on the whiteboard, to see how you think when put on the spot/under pressure]

1. First question: on one of my first projects here at MS, I'm told I've got 2 hours before I have to present a genius test solution for our client's million-pound project, given the following details:
 - migration project where an identity management software was used to populate SAP (using Active Directory)
 - 2 environment setups (test & PROD)
 - like-live test data is available
 - personnel resources: an architect, one third-party test resource unfamiliar with the technology, the client's 8-person developer team offsite, you, systems support, and a network SME

2. Second question: you're given a WPF application with a single text box that receives 2-digit numbers and processes them into a database on clicking a button. How would you test it (giving sample test data)?

Interview Answer



A mantra to remember is: "fall in love with your client, not the technology"; that way, your primary aim is to assist your client through their challenges.

1. Response to question 1:
A question-asking feast. Sample questions:

 - who owns the product?
 - is there a definition of done (aka a test policy consisting of quality gates, appetite for risk, defect profile)?
 - what's each member's availability?
 - how much time have we got?
 - can we get more project resource?
 - what's our cadence?
 - when do we have to deliver what by?
 - is there a backlog?
 - is there a risk catalogue? do we have a lessons-learnt log?
 - are there any oracles?
 - who authored them?
 - is there a defect mgmt process?
 - what technology are the dev team building with?
 - what tools are available on the Microsoft side?
 - what tools are the partner devs using?
 - is there a defects log?
 - are there any tools used for test case management?
 - are the clients accessible to us?
 - at what stage in development are we?

2. Response to question 2:

 1. Equivalence partitioning:
    - valid numbers
    - invalid numbers
    - text
    - (copy & paste) images, executable scripts
    - blank/empty input
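The partitions above can be sketched as a test-data table. A minimal sketch, assuming the text box's rule is "exactly two digit characters"; `is_valid_two_digit` is a hypothetical stand-in, since the real WPF app's validation logic isn't given:

```python
# Hypothetical validator mirroring the assumed rule: exactly two digit characters.
def is_valid_two_digit(value: str) -> bool:
    return len(value) == 2 and value.isdigit()

# One representative sample per equivalence partition from the list above.
partitions = {
    "valid number": ("42", True),
    "invalid number (too long)": ("123", False),
    "text": ("ab", False),
    "pasted non-numeric payload": ("<script>alert(1)</script>", False),
    "blank/empty input": ("", False),
}

for name, (sample, expected) in partitions.items():
    actual = is_valid_two_digit(sample)
    assert actual == expected, f"{name}: {sample!r} -> {actual}"
```

One representative per partition is usually enough; any other member of the same partition should behave the same way.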

 2. Boundary value analysis:
    - upper boundary 99 (UB)
    - lower boundary 00 (LB)
    - 100 (UB+1)
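The boundary values above can be checked the same way. A sketch under the same assumed "exactly two digits" rule; the "9" (one digit short) case is my addition, not from the original list:

```python
# Same hypothetical validator as in the equivalence-partitioning sketch.
def is_valid_two_digit(value: str) -> bool:
    return len(value) == 2 and value.isdigit()

# Boundary cases from the analysis above: 99 (UB), 00 (LB), 100 (UB+1).
boundary_cases = [
    ("99", True),    # upper boundary, accepted
    ("00", True),    # lower boundary, accepted
    ("100", False),  # UB+1: three characters, rejected
    ("9", False),    # assumed extra case: one digit short, rejected
]

for value, expected in boundary_cases:
    assert is_valid_two_digit(value) == expected, value
```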

3. State transition testing:
   - button behaviour (onclick/rollover/normal state)

4. Greybox test:
   - verify the data 'Save' actually persists to the db
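The greybox check means going behind the UI and querying the database directly after clicking Save. A minimal sketch using an in-memory SQLite database; the table/column names and the `save` handler are hypothetical, since the app's real schema and code aren't given:

```python
import sqlite3

# Hypothetical backing store; the real app's schema isn't given.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE entries (value TEXT)")

def save(value: str) -> None:
    # Stand-in for what the WPF button's click handler is assumed to do.
    conn.execute("INSERT INTO entries (value) VALUES (?)", (value,))
    conn.commit()

# Drive the "UI" action, then verify directly against the database.
save("42")
row = conn.execute("SELECT value FROM entries").fetchone()
assert row == ("42",)
```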

5. Messages displayed:
   - success confirmation
   - invalid-input error

6. Whitebox test:
   - validation of the regular expressions applied, e.g. a bug was found where the number 6 wasn't being accepted, i.e. 5, 7, 15, 17, 18 etc. were accepted but not 6, 16, 26 etc.
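A pattern like the following would reproduce that bug: the character class `[0-57-9]` means "0-5 or 7-9", silently skipping 6, so any input containing a 6 is rejected. The app's actual regex isn't given; this is a hypothetical reconstruction from the symptom described:

```python
import re

# Hypothetical buggy pattern: [0-57-9] omits the digit 6.
buggy = re.compile(r"[0-57-9]{1,2}")
# Corrected pattern covering all digits.
fixed = re.compile(r"[0-9]{1,2}")

assert buggy.fullmatch("57")          # accepted, as reported
assert buggy.fullmatch("18")          # accepted, as reported
assert not buggy.fullmatch("16")      # wrongly rejected: contains a 6
assert fixed.fullmatch("16")          # corrected pattern accepts it
```

This is exactly the kind of off-by-one in a character class that whitebox review of the validation code catches quickly, while blackbox testing only finds it if a sample value happens to contain a 6.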

7. Exploratory tests:
   - click frenzy

PS: I didn't remember all these on the day, so the interviewer helped with the responses about the grey & whitebox techniques. It's okay to miss some out.

A response to the follow-up question "how/why did we miss those bugs?" could be: you won't find every bug in reality. A creative resolution is to implement a defect management process where the procedures for finding such bugs are incorporated back into existing test cases and/or new test coverage is planned.

Interview Candidate on 21-Jan-2016
