Google Interview Question: Given two files that has list... | Glassdoor.co.in

Interview Question

Java Software Engineer Interview Mountain View, CA (US)

Given two files that each contain a list of words (one per line), write a program to show their intersection.

6 Answers

9

Read each file into a list and sort it. Then iterate over both sorted lists with two indices, comparing the current word in each: advance the index pointing at the smaller word, and on a match record the word and advance both.

The problem itself is not that hard, but what made it difficult for me was doing it on the whiteboard. My advice: do some practice runs yourself on a whiteboard with fundamental algorithm questions, the kind you would find on a CS 101 mid-term.
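A sketch of this sort-then-merge approach in Java (the sample word lists are made up for illustration; in the real problem they would be read from the two files):

```java
import java.util.*;

public class SortedIntersection {
    // Walks two sorted lists with two indices, advancing whichever
    // index points at the smaller word; equal words go into the result.
    static List<String> intersect(List<String> a, List<String> b) {
        List<String> result = new ArrayList<>();
        int i = 0, j = 0;
        while (i < a.size() && j < b.size()) {
            int cmp = a.get(i).compareTo(b.get(j));
            if (cmp < 0) {
                i++;                  // a's word is smaller: advance a
            } else if (cmp > 0) {
                j++;                  // b's word is smaller: advance b
            } else {
                result.add(a.get(i)); // match: record and advance both
                i++;
                j++;
            }
        }
        return result;
    }

    public static void main(String[] args) {
        List<String> a = new ArrayList<>(List.of("cat", "ant", "dog"));
        List<String> b = new ArrayList<>(List.of("dog", "emu", "cat"));
        Collections.sort(a);
        Collections.sort(b);
        System.out.println(SortedIntersection.intersect(a, b)); // prints [cat, dog]
    }
}
```

Sorting costs O(n log n), after which the merge itself is a single O(n) pass.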

Interview Candidate on 15-Jun-2009
6

Better still: read the first file into a HashMap, and then as you read the second list, look each word up in the HashMap. If you find the entry, put it into a List containing the intersection. For any reasonable hash function the whole operation is O(n).

Of course, since we're talking about reading from disk files, it will all probably be I/O bound.
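A sketch of that HashMap idea; the boolean value marks a word as already reported, so each common word appears in the result once:

```java
import java.util.*;

public class HashMapIntersection {
    // Load the first list into a HashMap, then look up each word of
    // the second list. The value flips from false to true the first
    // time a common word is reported, preventing duplicate output.
    static List<String> intersect(List<String> first, List<String> second) {
        Map<String, Boolean> seen = new HashMap<>();
        for (String w : first) seen.put(w, false);

        List<String> result = new ArrayList<>();
        for (String w : second) {
            Boolean reported = seen.get(w);
            if (reported != null && !reported) {
                result.add(w);
                seen.put(w, true);
            }
        }
        return result;
    }
}
```

Build and lookup are each expected O(1) per word, giving O(n) overall for n words.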

Some Guy on 04-Nov-2009
7

A HashMap is good, but how about a HashSet? Duplicate elements within the same file are handled automatically; the "contains" check while building the intersection list also comes for free; and it has the same efficiency as a HashMap.
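With HashSet the whole thing collapses to a couple of standard-library calls, since `retainAll` keeps only the elements present in both sets:

```java
import java.util.*;

public class HashSetIntersection {
    // Building a HashSet deduplicates each file's words; retainAll
    // then keeps only the elements that also appear in the second set.
    static Set<String> intersect(Collection<String> a, Collection<String> b) {
        Set<String> result = new HashSet<>(a);
        result.retainAll(new HashSet<>(b));
        return result;
    }
}
```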

ksm on 24-Feb-2010
5

The best solution, in my opinion, is a trie. Put the words of the first file into a trie, then go through the words of the second file and check whether each is in the trie. Inserting and checking are both O(L), where L is the length of the word. Memory in the worst case can be on the order of 26^L nodes, assuming the English alphabet.
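A minimal trie along these lines, assuming (as the answer does) lowercase English words only:

```java
public class WordTrie {
    // Each node has up to 26 children, one per lowercase letter;
    // isWord marks nodes where an inserted word ends.
    private final WordTrie[] next = new WordTrie[26];
    private boolean isWord;

    void insert(String w) {
        WordTrie node = this;
        for (char c : w.toCharArray()) {
            int i = c - 'a';
            if (node.next[i] == null) node.next[i] = new WordTrie();
            node = node.next[i];
        }
        node.isWord = true;
    }

    boolean contains(String w) {
        WordTrie node = this;
        for (char c : w.toCharArray()) {
            int i = c - 'a';
            if (node.next[i] == null) return false;
            node = node.next[i];
        }
        return node.isWord;
    }
}
```

Insert the first file's words, then call `contains` for each word of the second file; matches form the intersection. Note `contains("ca")` is false even after inserting "cat", because only complete words are marked.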

Anonymous on 24-Oct-2018
0

This problem is a textbook example for a Bloom filter (because of the large data and the likelihood of disk access). This is probably the insight they were hoping for, but they would probably have settled for using a common hash set as well.
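A toy Bloom filter sketch to show the mechanics (the size and hash count are arbitrary here; real code would size them from the expected word count and target false-positive rate). It can report false positives but never false negatives, so candidate matches from the second file still need a verification pass:

```java
import java.util.BitSet;

public class BloomFilter {
    // k hash probes into a fixed-size bit array.
    private final BitSet bits;
    private final int size;
    private final int hashes;

    BloomFilter(int size, int hashes) {
        this.bits = new BitSet(size);
        this.size = size;
        this.hashes = hashes;
    }

    // Derive the i-th probe position from two base hashes
    // (the Kirsch-Mitzenmacher double-hashing trick).
    private int probe(String s, int i) {
        int h1 = s.hashCode();
        int h2 = (h1 >>> 16) | (h1 << 16); // cheap second hash: rotate h1
        return Math.floorMod(h1 + i * h2, size);
    }

    void add(String s) {
        for (int i = 0; i < hashes; i++) bits.set(probe(s, i));
    }

    boolean mightContain(String s) {
        for (int i = 0; i < hashes; i++)
            if (!bits.get(probe(s, i))) return false;
        return true;
    }
}
```

Add the first file's words, then test each word of the second file with `mightContain`; the filter fits in memory even when the word lists do not, which is the point of the approach.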

Anonymous on 12-Jan-2020
