Error when removing duplicates from a JavaScript array

Posted 2020-05-10 06:56

UPDATE

Here is a fiddle of the problem: https://jsfiddle.net/q9c5fku3/. When I run that code and look at the console, I see it logging a different number in the array.

Thanks for your replies. Sorry I'm getting downvotes, but this is really confusing me.

I've tried it again using different numbers. Could you test these numbers on your end and see if you get a different result?

    var myArray = [621608617992776, 621608617992776, 10156938936550295, 621608617992776, 10156938936550295, 10156938936550295, 621608617992776, 10156938936550295];
    console.log(myArray);

    var myArrayTrimmed = [];

    for(var i in myArray){
        if(myArrayTrimmed.indexOf(myArray[i]) === -1){
            myArrayTrimmed.push(myArray[i]);
        }
    }
    console.log(myArrayTrimmed);

This is giving me the following array in the console:

[621608617992776, 10156938936550296]

For some reason the second number has increased by 1.
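
Is it possible these numbers are simply too big for JavaScript to store exactly? A quick console check (in any ES6-capable browser) seems to point that way:

    // 2^53 - 1 is the largest integer a JS number can represent exactly
    console.log(Number.MAX_SAFE_INTEGER);                 // 9007199254740991
    console.log(Number.isSafeInteger(10156938936550295)); // false
    console.log(10156938936550295);                       // logs 10156938936550296 -- rounded as soon as the literal is parsed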

====================

Original Question:

I have this array:

var myArray = [100, 200, 100, 200, 100, 100, 200, 200, 200, 200];

I'm trying to create a new array named myArrayTrimmed that will be the same as the above array, except it will have duplicates removed. This should result in:

var myArrayTrimmed = [100, 200];

This is the code I'm using to try to achieve this:

var myArray = [100, 200, 100, 200, 100, 100, 200, 200, 200, 200];
var myArrayTrimmed = [];

for(var i in myArray){
    if(myArrayTrimmed.indexOf(myArray[i]) === -1){
        myArrayTrimmed.push(myArray[i]);
    }
}
console.log(myArrayTrimmed);

This isn't working correctly. While it does remove duplicates, for some reason it seems to subtract 1 from 200, so the output in the console is:

[100, 199]

I thought this might be due to the -1 in the code, but I don't know how else to remove the duplicates.
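
For reference, my understanding is that the -1 here is just indexOf's "not found" return value; it is only ever compared against, never subtracted from anything:

var list = [100, 200];
console.log(list.indexOf(200)); // 1  (found at index 1)
console.log(list.indexOf(300)); // -1 (not found; no stored value is changed)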

3 Answers
够拽才男人
#2 · 2020-05-10 07:12

I believe this is the best way to do it:

var myArray = [100, 200, 100, 200, 100, 100, 200, 200, 200, 200],
    // build a lookup table keyed by value, then take its keys
    // (note: Object.keys returns strings, so this yields ["100", "200"])
    reduced = Object.keys(myArray.reduce((p,c) => (p[c] = true,p),{}));
console.log(reduced);

OK... even though this one is O(n) while the others are O(n²), I was curious to see a benchmark comparison between this reduce / lookup table approach and the filter/indexOf combo (I chose Jeetendra's very nice implementation: https://stackoverflow.com/a/37441144/4543207). I prepared a 100K-item array filled with random positive integers in the range 0-9999 and removed the duplicates. I repeated the test 10 times, and the average of the results shows that it's no contest (a sketch of the harness appears after the results):

  • In Firefox v47: reduce & lut 14.85ms vs filter & indexOf 2836ms
  • In Chrome v51: reduce & lut 23.90ms vs filter & indexOf 1066ms
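
The exact harness for this first test wasn't included above; a sketch of what it looks like, following the same pattern as the later tests (with the filter/indexOf combo written as the standard one-liner as a stand-in for the linked implementation), would be:

var ranar = [],
     red1 = a => Object.keys(a.reduce((p,c) => (p[c] = true,p),{})), // reduce & lut
     red2 = a => a.filter((e,i,arr) => arr.indexOf(e) === i),        // filter & indexOf
     avg1 = [],
     avg2 = [],
       ts = 0,
       te = 0,
     count= 10;
for (var i = 0; i<count; i++){
  // 100K random positive integers in the range 0..9999
  ranar = (new Array(100000).fill(true)).map(e => Math.floor(Math.random()*10000));
  ts = performance.now();
  red1(ranar);
  te = performance.now();
  avg1.push(te-ts);
  ts = performance.now();
  red2(ranar);
  te = performance.now();
  avg2.push(te-ts);
}

avg1 = avg1.reduce((p,c) => p+c)/count;
avg2 = avg2.reduce((p,c) => p+c)/count;

console.log("reduce & lut took: " + avg1 + "msec");
console.log("filter & indexOf took: " + avg2 + "msec");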

Well, OK, so far so good. But let's do it properly this time, in ES6 style. It looks so cool! How it will perform against the powerful lut solution, though, is a mystery to me as of now. Let's first see the code and then benchmark it.

var myArray = [100, 200, 100, 200, 100, 100, 200, 200, 200, 200],
    // Map keys are unique, so collecting values as keys and spreading them back dedupes
    reduced = [...myArray.reduce((p,c) => p.set(c,true),new Map()).keys()];
console.log(reduced);

Wow, that was short! But how about the performance? It's beautiful. With the heavy weight of filter/indexOf lifted off our shoulders, I can now test an array of 1M random items, positive integers in the range 0..99999, and take an average from 10 consecutive tests. I can say this time it's a real match. See the result for yourself :)

var ranar = [],
     red1 = a => Object.keys(a.reduce((p,c) => (p[c] = true,p),{})),     // reduce & lut
     red2 = a => [...a.reduce((p,c) => p.set(c,true),new Map()).keys()], // map & spread
     avg1 = [],
     avg2 = [],
       ts = 0,
       te = 0,
     res1 = [],
     res2 = [],
     count= 10;
for (var i = 0; i<count; i++){
  ranar = (new Array(1000000).fill(true)).map(e => Math.floor(Math.random()*100000));
  ts = performance.now();
  res1 = red1(ranar);
  te = performance.now();
  avg1.push(te-ts);
  ts = performance.now();
  res2 = red2(ranar);
  te = performance.now();
  avg2.push(te-ts);
}

avg1 = avg1.reduce((p,c) => p+c)/count;
avg2 = avg2.reduce((p,c) => p+c)/count;

console.log("reduce & lut took: " + avg1 + "msec");
console.log("map & spread took: " + avg2 + "msec");

Which one would you use? Well, not so fast! Don't be deceived: Map was playing away. Look, in all of the above cases we fill an array of size n with numbers from a range smaller than n; that is, we take an array of size 100 and fill it with random numbers 0..9, so there are definite duplicates and almost definitely each number has a duplicate. But what if we fill an array of size 100 with random numbers 0..9999? Let's now see Map playing at home. This time it's an array of 100K items, but the random number range is 0..100M. We will do 100 consecutive tests to average the results. OK, let's see the bets..! (no typo)

var ranar = [],
     red1 = a => Object.keys(a.reduce((p,c) => (p[c] = true,p),{})),     // reduce & lut
     red2 = a => [...a.reduce((p,c) => p.set(c,true),new Map()).keys()], // map & spread
     avg1 = [],
     avg2 = [],
       ts = 0,
       te = 0,
     res1 = [],
     res2 = [],
     count= 100;
for (var i = 0; i<count; i++){
  ranar = (new Array(100000).fill(true)).map(e => Math.floor(Math.random()*100000000));
  ts = performance.now();
  res1 = red1(ranar);
  te = performance.now();
  avg1.push(te-ts);
  ts = performance.now();
  res2 = red2(ranar);
  te = performance.now();
  avg2.push(te-ts);
}

avg1 = avg1.reduce((p,c) => p+c)/count;
avg2 = avg2.reduce((p,c) => p+c)/count;

console.log("reduce & lut took: " + avg1 + "msec");
console.log("map & spread took: " + avg2 + "msec");

Now this is the spectacular comeback of Map()! Maybe now you can make a better decision when you want to remove the dupes.

Well, OK, we are all happy now. But the lead role always comes last, to some applause. I am sure some of you wonder what the Set object would do. Now that we are open to ES6, and we know Map is the winner of the previous games, let us compare Map with Set in a final. A typical Real Madrid vs Barcelona game this time... or is it? Let's see who wins el clásico :)

var ranar = [],
     red1 = a => [...a.reduce((p,c) => p.set(c,true),new Map()).keys()], // map & spread
     red2 = a => Array.from(new Set(a)),                                 // set & Array.from
     avg1 = [],
     avg2 = [],
       ts = 0,
       te = 0,
     res1 = [],
     res2 = [],
     count= 100;
for (var i = 0; i<count; i++){
  ranar = (new Array(100000).fill(true)).map(e => Math.floor(Math.random()*10000000));
  ts = performance.now();
  res1 = red1(ranar);
  te = performance.now();
  avg1.push(te-ts);
  ts = performance.now();
  res2 = red2(ranar);
  te = performance.now();
  avg2.push(te-ts);
}

avg1 = avg1.reduce((p,c) => p+c)/count;
avg2 = avg2.reduce((p,c) => p+c)/count;

console.log("map & spread took: " + avg1 + "msec");
console.log("set & A.from took: " + avg2 + "msec");

Wow, man! Unexpectedly, it didn't turn out to be an el clásico at all. More like FC Barcelona against CA Osasuna :))

We Are One
#3 · 2020-05-10 07:24

Hope this will be useful:

var myArray = [100, 200, 100, 200, 100, 100, 200, 200, 200, 200];
var myArrayTrimmed = [];
myArray.forEach(function(item){
    if(myArrayTrimmed.indexOf(item) === -1){
        myArrayTrimmed.push(item);
    }
});
console.log(myArrayTrimmed);

Check this jsfiddle

贼婆χ
#4 · 2020-05-10 07:30

You can use the code below to remove the duplicate elements:

var myArrayTrimmed = myArray.filter(function(elem, pos) {
    return myArray.indexOf(elem) == pos;
});

Your myArrayTrimmed will then contain only the unique values.
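
For example, with the array from the question:

var myArray = [100, 200, 100, 200, 100, 100, 200, 200, 200, 200];
var myArrayTrimmed = myArray.filter(function(elem, pos) {
    // keep an element only at the index of its first occurrence
    return myArray.indexOf(elem) == pos;
});
console.log(myArrayTrimmed); // [100, 200]

Note that, like the other indexOf-based approaches in this thread, this is O(n²); for large arrays the Map/Set approaches benchmarked above will be much faster.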
