
Big O of Finding the Intersection in two Unsorted Arrays Using filter() in JavaScript - Stack Overflow


I just started learning Big O notation and I'm trying to understand the Big O of different functions to see which one is better.

I'm struggling to calculate the time and space complexity for the following code.

function findCommonElem(arr1, arr2) {
  let result = arr1.filter(x => arr2.includes(x));
  console.log(result);
}

findCommonElem(arr1, arr2);

From what I understand, common array methods like filter() usually have a big O of O(n), so in this case it'd be O(m+n) depending on the length of each array. However, I could be super wrong.

Can someone explain, please? Thanks so much!

Bonus question: Compared to sorting the arrays then using a while loop for the same function, which one would be considered as "better"?


asked Sep 28, 2020 at 10:52 by supvicky
  • Why m + n? findCommonElem compares (in the worst case) every element of arr1 with every element of arr2 -> m * n – Andreas Commented Sep 28, 2020 at 10:55
  • 1 Isn't it O(m * n)? – radulle Commented Sep 28, 2020 at 10:55
  • As the Array filter function has to pass through all the entries, so does Array includes. Complexity will be O(m*n) – Ahmad Suddle Commented Sep 28, 2020 at 11:00
  • Thank you so much for your fast reply @Andreas and @radulle! That does make sense. As I mentioned, I could be super wrong LOL so thanks for correcting! :) – supvicky Commented Sep 28, 2020 at 11:02
  • Doesn't it also make a difference which way around you have the arrays? If arr2 is bigger than arr1, you should be applying the filter to arr2 instead and using arr1 for the includes() test? – ATD Commented Sep 28, 2020 at 11:05

3 Answers


The above function has a time complexity of O(M * N).

But can you make this solution more efficient?

Yes. You can reduce it to O(M + N).

TLDR: Use a hash table to achieve linear time complexity O(M + N).


Approach 1

Check every element of array 1 against every element of array 2. (This is the approach you're using.)

function findCommonElem(arr1, arr2) {
  return arr1.filter(x => arr2.includes(x));
}

const arr1 = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];
const arr2 = [2, 4, 6, 8, 10, 12, 14, 16, 20];

console.log(findCommonElem(arr1, arr2)); // [2, 4, 6, 8, 10];

  • Time complexity = O(M * N)

    • In the worst case, every element of array 1 is checked against every element of array 2, giving M * N comparisons.
  • Space complexity = O(M) or O(N)

    • In the worst case, all of the elements of one of the arrays could be in the intersection.
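To see the quadratic growth concretely, here is a small instrumented sketch of the same idea (the `comparisons` counter and `countingIntersect` name are added for illustration and are not part of the question's code):

```javascript
// Counts element comparisons to illustrate the O(M * N) worst case.
// arr2.some() with a counting callback stands in for arr2.includes().
function countingIntersect(arr1, arr2) {
  let comparisons = 0;
  const result = arr1.filter(x =>
    arr2.some(y => { comparisons++; return y === x; })
  );
  return { result, comparisons };
}

// Worst case: the arrays are disjoint, so every pair gets compared once.
const m = 100, n = 200;
const a = Array.from({ length: m }, (_, i) => i);      // 0..99
const b = Array.from({ length: n }, (_, i) => m + i);  // 100..299
const { result, comparisons } = countingIntersect(a, b);

console.log(result.length); // 0 — no common elements
console.log(comparisons);   // 20000 = m * n
```

Doubling either array doubles the comparison count, which is exactly the M * N behaviour described above.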

Approach 2

Use a hash map to linearize the nested loop. First, populate the hash map with the elements of array 1. Then filter array 2, using the map to find the intersection.

function findCommonElem(arr1, arr2) {
  const map = new Map();

  arr1.forEach(item => map.set(item, true));

  return arr2.filter(item => map.has(item));
}

const arr1 = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];
const arr2 = [2, 4, 6, 8, 10, 12, 14, 16, 20];

console.log(findCommonElem(arr1, arr2)); // [2, 4, 6, 8, 10];

The above function returns the same output. But here's the difference - the nested loop is reduced to two linear loops for the arrays. This means both arrays are traversed only once.

  • Time complexity = O(M + N)

    • Array 1 is traversed once (M elements).
    • Array 2 is traversed once (N elements).
    • Checking whether the map contains an element with map.has() takes constant time, O(1).
    • Total run time = M + N
  • Space complexity = O(M) or O(N)

    • Space complexity is still the same, because the hash map holds the M elements of array 1, and the result array holds at most O(M) or O(N) elements.


Approach 3

Use a Set instead of a Map. The Set data structure is a better fit for this use case, because you don't actually need the mapped value (the `true` from approach 2).

function findCommonElem(arr1, arr2) {
  const set = new Set(arr1);

  return arr2.filter(item => set.has(item));
}

const arr1 = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];
const arr2 = [2, 4, 6, 8, 10, 12, 14, 16, 20];

console.log(findCommonElem(arr1, arr2)); // [2, 4, 6, 8, 10];

This uses slightly less memory, but the asymptotic time and space complexity are the same.

  • Time complexity = O(M + N)
  • Space complexity = O(M) or O(N)

Thanks to Nick Parsons for pointing this out.
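On the bonus question: sorting both arrays and then walking them with two pointers in a single while loop is another common approach. A minimal sketch, assuming the arrays hold numbers (the function name `findCommonElemSorted` is made up for this example):

```javascript
// Sort + two-pointer intersection from the bonus question.
// Sorting costs O(M log M + N log N); the merge-style scan is O(M + N).
function findCommonElemSorted(arr1, arr2) {
  const a = [...arr1].sort((x, y) => x - y);
  const b = [...arr2].sort((x, y) => x - y);
  const result = [];
  let i = 0, j = 0;

  while (i < a.length && j < b.length) {
    if (a[i] === b[j]) {       // common element found
      result.push(a[i]);
      i++; j++;
    } else if (a[i] < b[j]) {  // advance whichever pointer is behind
      i++;
    } else {
      j++;
    }
  }
  return result;
}

console.log(findCommonElemSorted(
  [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
  [2, 4, 6, 8, 10, 12, 14, 16, 20]
)); // [2, 4, 6, 8, 10]
```

For unsorted inputs the hash-based approaches above are asymptotically better, since sorting dominates at O(M log M + N log N); the two-pointer version shines when the arrays arrive already sorted, because it then needs no extra hash structure.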

Let's say that arr1.length is n and arr2.length is m. The filter function runs the lambda you wrote once for each item in arr1. The lambda checks whether the item is in arr2, and in the worst case, where the item isn't found, includes scans the whole array, i.e. m steps. Therefore arr1.filter(x => arr2.includes(x)) runs in O(n*m) in the worst case.

As for the space complexity: the filter function creates a new array, and in the worst case that array is as big as the original, so the space complexity is O(n).

As correctly mentioned, big O here would be equal to O(n*m). Here is an explanation:

  1. arr1.filter complexity is O(n)
  2. arr2.includes complexity is O(m)

However, for each iteration of .filter you execute arr2.includes, which leads to O(n) * O(m) = O(n * m).

You may increase performance if you replace arr2 with a JS Set. Set.has complexity is O(1), so using a Set instead of arr2 should help you achieve O(n + m) complexity.

I cannot answer the sorting question, because I haven't clearly understood what you mean.
