2538

I have a very simple JavaScript array that may or may not contain duplicates.

var names = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"];

I need to remove the duplicates and put the unique values in a new array.

I could post all the code that I've tried, but I think it's useless because none of it works. I accept jQuery solutions too.

asked Feb 10, 2012 at 14:53 by kramden88

51 Answers

6288

TL;DR

Using the Set constructor and the spread syntax:

uniq = [...new Set(array)];

(Note that uniq will be an array; new Set() turns the input into a set, but [...] turns it back into an array again.)
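Applied to the array from the question, a quick check (assuming Node or a browser console):

```javascript
// Dedupe the question's array with Set + spread (ES6).
var names = ["Mike", "Matt", "Nancy", "Adam", "Jenny", "Nancy", "Carl"];
var uniq = [...new Set(names)];
console.log(uniq); // ["Mike", "Matt", "Nancy", "Adam", "Jenny", "Carl"]
```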


"Smart" but naïve way

uniqueArray = a.filter(function(item, pos) {
    return a.indexOf(item) == pos;
})

Basically, we iterate over the array and, for each element, check if the first position of this element in the array is equal to the current position. Obviously, these two positions are different for duplicate elements.

Using the 3rd ("this array") parameter of the filter callback we can avoid a closure of the array variable:

uniqueArray = a.filter(function(item, pos, self) {
    return self.indexOf(item) == pos;
})

Although concise, this algorithm is not particularly efficient for large arrays (quadratic time).

Hashtables to the rescue

function uniq(a) {
    var seen = {};
    return a.filter(function(item) {
        return seen.hasOwnProperty(item) ? false : (seen[item] = true);
    });
}

This is how it's usually done. The idea is to place each element in a hashtable and then check for its presence instantly. This gives us linear time, but has at least two drawbacks:

  • since hash keys can only be strings or symbols in JavaScript, this code doesn't distinguish numbers and "numeric strings". That is, uniq([1,"1"]) will return just [1]
  • for the same reason, all objects will be considered equal: uniq([{foo:1},{foo:2}]) will return just [{foo:1}].

That said, if your arrays contain only primitives and you don't care about types (e.g. it's always numbers), this solution is optimal.
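To make those drawbacks concrete, here is the behavior on mixed types (a sketch using the uniq above):

```javascript
// The object-as-hashtable version: keys are coerced to strings.
function uniq(a) {
    var seen = {};
    return a.filter(function (item) {
        return seen.hasOwnProperty(item) ? false : (seen[item] = true);
    });
}

console.log(uniq([1, "1", 1]));          // [1] - the string "1" is lost
console.log(uniq([{foo: 1}, {foo: 2}])); // [{foo: 1}] - both stringify to "[object Object]"
```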

The best of both worlds

A universal solution combines both approaches: it uses hash lookups for primitives and linear search for objects.

function uniq(a) {
    var prims = {"boolean":{}, "number":{}, "string":{}}, objs = [];
    return a.filter(function(item) {
        var type = typeof item;
        if(type in prims)
            return prims[type].hasOwnProperty(item) ? false : (prims[type][item] = true);
        else
            return objs.indexOf(item) >= 0 ? false : objs.push(item);
    });
}

sort | uniq

Another option is to sort the array first, and then remove each element equal to the preceding one:

function uniq(a) {
    return a.sort().filter(function(item, pos, ary) {
        return !pos || item != ary[pos - 1];
    });
}

Again, this doesn't work with objects (because all objects are equal for sort). Additionally, we silently change the original array as a side effect - not good! However, if your input is already sorted, this is the way to go (just remove sort from the above).
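If the mutation is a concern, a copy can be taken first (a minimal sketch; slice() clones the array before sorting):

```javascript
// Non-mutating variant: sort a copy, then drop items equal to their predecessor.
function uniqSorted(a) {
    return a.slice().sort().filter(function (item, pos, ary) {
        return !pos || item != ary[pos - 1];
    });
}

var input = [3, 1, 2, 3, 1];
console.log(uniqSorted(input)); // [1, 2, 3]
console.log(input);             // [3, 1, 2, 3, 1] - original untouched
```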

Unique by...

Sometimes it's desired to uniquify a list based on some criteria other than just equality, for example, to filter out objects that are different, but share some property. This can be done elegantly by passing a callback. This "key" callback is applied to each element, and elements with equal "keys" are removed. Since key is expected to return a primitive, a hash table will work fine here:

function uniqBy(a, key) {
    var seen = {};
    return a.filter(function(item) {
        var k = key(item);
        return seen.hasOwnProperty(k) ? false : (seen[k] = true);
    })
}

A particularly useful key() is JSON.stringify, which will remove objects that are physically different, but "look" the same:

a = [[1,2,3], [4,5,6], [1,2,3]]
b = uniqBy(a, JSON.stringify)
console.log(b) // [[1,2,3], [4,5,6]]

If the key is not primitive, you have to resort to the linear search:

function uniqBy(a, key) {
    var index = [];
    return a.filter(function (item) {
        var k = key(item);
        return index.indexOf(k) >= 0 ? false : index.push(k);
    });
}

In ES6 you can use a Set:

function uniqBy(a, key) {
    const seen = new Set();
    return a.filter(item => {
        const k = key(item);
        return seen.has(k) ? false : seen.add(k);
    });
}

or a Map:

function uniqBy(a, key) {
    return [
        ...new Map(
            a.map(x => [key(x), x])
        ).values()
    ]
}

which both also work with non-primitive keys.

First or last?

When removing objects by a key, you might want to keep the first of "equal" objects or the last one.

Use the Set variant above to keep the first, and the Map to keep the last:

function uniqByKeepFirst(a, key) {
    let seen = new Set();
    return a.filter(item => {
        let k = key(item);
        return seen.has(k) ? false : seen.add(k);
    });
}

function uniqByKeepLast(a, key) {
    return [
        ...new Map(
            a.map(x => [key(x), x])
        ).values()
    ]
}

//

data = [
    {a:1, u:1},
    {a:2, u:2},
    {a:3, u:3},
    {a:4, u:1},
    {a:5, u:2},
    {a:6, u:3},
];

console.log(uniqByKeepFirst(data, it => it.u))
console.log(uniqByKeepLast(data, it => it.u))

Libraries

Both underscore and Lo-Dash provide uniq methods. Their algorithms are basically similar to the first snippet above and boil down to this:

var result = [];
a.forEach(function(item) {
     if(result.indexOf(item) < 0) {
         result.push(item);
     }
});

This is quadratic, but there are nice additional goodies, like wrapping native indexOf, the ability to uniqify by a key (iteratee in their parlance), and optimizations for already sorted arrays.

If you're using jQuery and can't stand anything without a dollar before it, it goes like this:

$.uniqArray = function(a) {
    return $.grep(a, function(item, pos) {
        return $.inArray(item, a) === pos;
    });
}

which is, again, a variation of the first snippet.

Performance

Function calls are expensive in JavaScript, therefore the above solutions, as concise as they are, are not particularly efficient. For maximal performance, replace filter with a loop and get rid of other function calls:

function uniq_fast(a) {
    var seen = {};
    var out = [];
    var len = a.length;
    var j = 0;
    for(var i = 0; i < len; i++) {
         var item = a[i];
         if(seen[item] !== 1) {
               seen[item] = 1;
               out[j++] = item;
         }
    }
    return out;
}

This chunk of ugly code does the same as snippet #3 above, but an order of magnitude faster (as of 2017 it's only twice as fast - JS core folks are doing a great job!)

function uniq(a) {
    var seen = {};
    return a.filter(function(item) {
        return seen.hasOwnProperty(item) ? false : (seen[item] = true);
    });
}

function uniq_fast(a) {
    var seen = {};
    var out = [];
    var len = a.length;
    var j = 0;
    for(var i = 0; i < len; i++) {
         var item = a[i];
         if(seen[item] !== 1) {
               seen[item] = 1;
               out[j++] = item;
         }
    }
    return out;
}

/////

var r = [0,1,2,3,4,5,6,7,8,9],
    a = [],
    LEN = 1000,
    LOOPS = 1000;

while(LEN--)
    a = a.concat(r);

var d = new Date();
for(var i = 0; i < LOOPS; i++)
    uniq(a);
document.write('<br>uniq, ms/loop: ' + (new Date() - d)/LOOPS)

var d = new Date();
for(var i = 0; i < LOOPS; i++)
    uniq_fast(a);
document.write('<br>uniq_fast, ms/loop: ' + (new Date() - d)/LOOPS)

ES6

ES6 provides the Set object, which makes things a whole lot easier:

function uniq(a) {
   return Array.from(new Set(a));
}

or

let uniq = a => [...new Set(a)];

Note that, unlike in Python, ES6 sets are iterated in insertion order, so this code preserves the order of the original array.
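For example, the original ordering survives the round trip through a Set (a quick sketch):

```javascript
// Sets remember insertion order, so the first occurrence of each value keeps its place.
let uniqInOrder = a => [...new Set(a)];
console.log(uniqInOrder([3, 1, 3, 2, 1])); // [3, 1, 2]
```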

However, if you need an array with unique elements, why not use sets right from the beginning?

Generators

A "lazy", generator-based version of uniq can be built on the same basis:

  • take the next value from the argument
  • if it's been seen already, skip it
  • otherwise, yield it and add it to the set of already seen values

function* uniqIter(a) {
    let seen = new Set();
    for (let x of a) {
        if (!seen.has(x)) {
            seen.add(x);
            yield x;
        }
    }
}

// example:

function* randomsBelow(limit) {
    while (1)
        yield Math.floor(Math.random() * limit);
}

// note that randomsBelow is endless

count = 20;
limit = 30;

for (let r of uniqIter(randomsBelow(limit))) {
    console.log(r);
    if (--count === 0)
        break
}

// exercise for the reader: what happens if we set `limit` less than `count` and why


43 Comments

filter and indexOf have been introduced in ECMAScript 5, so this will not work in old IE versions (<9). If you care about those browsers, you will have to use libraries with similar functions (jQuery, underscore.js etc.)
@RoderickObrist you might if you want your page to work in older browsers
This is an O(n^2) solution, which can run very slow on large arrays...
Try this array: ["toString", "valueOf", "failed"]. toString and valueOf are stripped completely. Use Object.create(null) instead of {}.
Anyone know how fast the Set conversion solution is, compared to the others?
518

Quick and dirty using jQuery:

var names = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"];
var uniqueNames = [];
$.each(names, function(i, el){
    if($.inArray(el, uniqueNames) === -1) uniqueNames.push(el);
});
answered Feb 10, 2012 at 15:13 by Roman Bataev

9 Comments

As this was reverted back to the original inArray solution by a reputable person, I am going to again mention: this solution is O(n^2), making it inefficient.
I really wish in 2020 we could start deprecating jQuery and other even-more dated answers... Stackoverflow is starting to show some age here...
I agree @NickSteele but I find it does happen naturally over time if you look at votes and not the accepted answer. The best answer will gravitate towards the top as older deprecated answers get downvoted
If you're using jquery, there is $.unique, though that will also sort the items in the result. The best answer is below (creating a set from the array), this answer is inefficient, and out-of-date.
JQuery was and is still made to provide document traversal (now built into browsers), animation (now much faster and cleaner alternatives exist), event handling (now much better implementations exist that also work in Node), and Ajax (Ajax was replaced by WebSocket almost a decade ago). Since all 4 corners of JQuery are comparatively outdated, the only reason to use JQuery is if you already know it and don't have the bandwidth to learn something better. Everything that JQuery does today is done better by other libraries, or has been totally replaced.
398

I got tired of seeing all bad examples with for loops or jQuery. JavaScript has the perfect tools for this nowadays: sort, map and reduce.

Uniq reduce while keeping the existing order

var names = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"];

var uniq = names.reduce(function(a,b){
    if (a.indexOf(b) < 0 ) a.push(b);
    return a;
  },[]);

console.log(uniq, names) // [ 'Mike', 'Matt', 'Nancy', 'Adam', 'Jenny', 'Carl' ]

// One-liner
return names.reduce(function(a,b){if(a.indexOf(b)<0)a.push(b);return a;},[]);

Faster uniq with sorting

There are probably faster ways, but this one is pretty decent.

var uniq = names.slice() // 'slice' makes a copy of the array before sorting it
  .sort(function(a,b){
    return a > b;
  })
  .reduce(function(a,b){
    if (a.slice(-1)[0] !== b) a.push(b); // slice(-1)[0] means last item in array without removing it (like .pop())
    return a;
  },[]); // This empty array becomes the starting value for a

// One-liner
return names.slice().sort(function(a,b){return a > b}).reduce(function(a,b){if (a.slice(-1)[0] !== b) a.push(b);return a;},[]);

Update 2015: ES6 version:

In ES6, you have Sets and Spread which makes it very easy and performant to remove all duplicates:

var uniq = [ ...new Set(names) ]; // [ 'Mike', 'Matt', 'Nancy', 'Adam', 'Jenny', 'Carl' ]

Sort based on occurrence:

Someone asked about ordering the results based on how many unique names there are:

var names = ['Mike', 'Matt', 'Nancy', 'Adam', 'Jenny', 'Nancy', 'Carl']

var uniq = names
  .map((name) => {
    return {count: 1, name: name}
  })
  .reduce((a, b) => {
    a[b.name] = (a[b.name] || 0) + b.count
    return a
  }, {})

var sorted = Object.keys(uniq).sort((a, b) => uniq[a] < uniq[b])

console.log(sorted)
answered Apr 7, 2013 at 22:42 by Christian Landgren

12 Comments

Nice! Would it be possible to sort the array based on the frequency of duplicate objects? So that "Nancy" in the above example is moved to the front (or back) of the modified array?
@ALx - I updated with an example for sorting based on occurrence.
sort() appears to be called incorrectly in your second example: if a is < b then it returns the same value as if a == b, which can lead to unsorted results. Unless you're doing something clever here that I'm missing, it should be .sort(function(a,b){ return a > b ? 1 : a < b ? -1 : 0; })
If the data is just an array of names with no requirement other than to eliminate duplicates, why bother with sort, map, and reduce? Just use a set - job done in O(n) time. -- msdn.microsoft.com/en-us/library/dn251547
@Dave yes - see my example on [...new Set(names)] above
174

A single-line version using array.filter and the .indexOf function:

arr = arr.filter(function (value, index, array) {
   return array.indexOf(value) === index;
});

ES6 approach

arr = arr.filter((value, index, array) =>
   array.indexOf(value) === index)
answered Feb 11, 2013 at 21:18 by HBP

4 Comments

care to explain how it eliminates dupes?
@web_dev: it doesn't !! I have corrected a previous edit which broke the code. Hope it makes more sense now. Thanks for asking!
This unfortunately has poor performance if this is a large array -- arr.indexOf is O(n), which makes this algorithm O(n^2)
This solution in fact is extremely slow as @CaseyKuball suggests - see stackoverflow.com/questions/67424599/…
164

Vanilla JavaScript: use a plain object as a hash of seen values, then take its keys.
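The answer body did not survive extraction; a minimal sketch of the object-as-hash approach that the comments below describe (the function name here is illustrative, not from the original):

```javascript
// Collect each value as an object key (keys are unique), then read the keys back.
// Works for arrays of strings; all keys come back as strings.
function uniqueStrings(arr) {
    var obj = {};
    for (var i = 0; i < arr.length; i++) {
        obj[arr[i]] = true;
    }
    return Object.keys(obj);
}

console.log(uniqueStrings(["Mike", "Matt", "Nancy", "Adam", "Jenny", "Nancy", "Carl"]));
// ["Mike", "Matt", "Nancy", "Adam", "Jenny", "Carl"]
```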

answered Feb 10, 2012 at 15:03 by Casey Kuball

7 Comments

In more recent browsers, you could even do var c = Object.keys(b). It should be noted that this approach will only work for strings, but it's alright, that's what the original question was asking for.
It should also be noted that you may lose the order of the array because objects don't keep their properties in order.
@JuanMendes I have created an order-safe version, which simply copies to the new array if the value has not been seen before.
What is happening on this line obj[arr[i]] = true; ??
@kittu, that is getting the ith element of the array, and putting it into the object (being used as a set). The key is the element, and the value is true, which is entirely arbitrary, as we only care about the keys of the object.
76

Use Underscore.js

It's a library with a host of functions for manipulating arrays.

It's the tie to go along with jQuery's tux, and Backbone.js's suspenders.

_.uniq

_.uniq(array, [isSorted], [iterator]) Alias: unique
Produces a duplicate-free version of the array, using === to test object equality. If you know in advance that the array is sorted, passing true for isSorted will run a much faster algorithm. If you want to compute unique items based on a transformation, pass an iterator function.

Example

var names = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"];
alert(_.uniq(names, false));

Note: Lo-Dash (an underscore competitor) also offers a comparable _.uniq implementation.

answered Jun 30, 2012 at 3:07 by Brandon J. Boone

1 Comment

unfortunately underscore does not provide the ability to define a custom equality function. The callback they do allow is for an 'iteratee' function e.g. with args (item, value, array).
75

One line:

let names = ['Mike','Matt','Nancy','Adam','Jenny','Nancy','Carl', 'Nancy'];
let dup = [...new Set(names)];
console.log(dup);
answered Aug 1, 2017 at 1:39 by Jonca33

3 Comments

Best answer, if you're using ES6
what do these 3 dots mean?
@Vitalicus, that's the spread operator in ES6. Read more here
67

You can simply do it in JavaScript, with the help of the second - index - parameter of the filter method:

var a = [2,3,4,5,5,4];
a.filter(function(value, index){ return a.indexOf(value) == index });

or in shorthand

a.filter((v,i) => a.indexOf(v) == i)
answered Jun 15, 2017 at 11:05 by Ashutosh Jha

3 Comments

this only works for an array containing primitives?
Works also without requiring the a variable, as the array is the 3rd parameter of filter: [1/0, 2,1/0,2,3].filter((v,i,a) => a.indexOf(v) === i) (note that it also works nice with Infinity ☺ )
You can also do.filter((v,i, array) => array.indexOf(v) == i) if you are using this after a map, reduce etc.
43

Use Array.filter() like this:

var actualArr = ['Apple', 'Apple', 'Banana', 'Mango', 'Strawberry', 'Banana'];
console.log('Actual Array: ' + actualArr);

var filteredArr = actualArr.filter(function(item, index) {
  // return a boolean rather than the item itself, so falsy items aren't dropped by mistake
  return actualArr.indexOf(item) == index;
});

console.log('Filtered Array: ' + filteredArr);

This can be made shorter in ES6 to:

actualArr.filter((item,index,self) => self.indexOf(item)==index);

Here is a nice explanation of Array.filter().

answered Sep 14, 2017 at 6:46 by Sumit Joshi

1 Comment

doesn't work when the array is an array of arrays
40

The top answers have complexity of O(n²), but this can be done with just O(n) by using an object as a hash:

function getDistinctArray(arr) {
    var dups = {};
    return arr.filter(function(el) {
        var hash = el.valueOf();
        var isDup = dups[hash];
        dups[hash] = true;
        return !isDup;
    });
}

This will work for strings, numbers, and dates. If your array contains objects, the above solution won't work, because when coerced to a string, they will all have a value of "[object Object]" (or something similar) and that isn't suitable as a lookup value. You can get an O(n) implementation for objects by setting a flag on the object itself:

function getDistinctObjArray(arr) {
    var distinctArr = arr.filter(function(el) {
        var isDup = el.inArray;
        el.inArray = true;
        return !isDup;
    });
    distinctArr.forEach(function(el) {
        delete el.inArray;
    });
    return distinctArr;
}

Modern versions of JavaScript make this a much easier problem to solve. Using Set will work, regardless of whether your array contains objects, strings, numbers, or any other type.

function getDistinctArray(arr) {
    return [...new Set(arr)];
}

The implementation is so simple that defining a function is no longer warranted.
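Unlike the object-as-hash approach above, Set compares values with SameValueZero rather than string coercion, so it keeps the number 1 and the string "1" distinct (a quick check):

```javascript
// Set does not coerce its members to strings, so types are preserved.
var mixed = [...new Set([1, "1", 1, "1"])];
console.log(mixed); // [1, "1"]
```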

answered Feb 6, 2013 at 22:32 by gilly3

10 Comments

@Tushar - Your gist gives a 404. No sorting algorithm has O(n) complexity. Sorting would not be faster.
@Tushar - there are no actual duplicates in that array. If you want to remove objects from an array that have exactly the same properties and values as other objects in the array, you would need to write a custom equality checking function to support it.
@Tushar - None of the answers on this page would remove any duplicates from such an array as is in your gist.
Consider using Object.create(null) instead of {}.
just note that IE is late to the party for Set
38

The most concise way to remove duplicates from an array using native JavaScript functions is to use a sequence like below:

vals.sort().reduce(function(a, b){ if (b != a[0]) a.unshift(b); return a }, [])

There's no need for slice nor indexOf within the reduce function, like I've seen in other examples! It makes sense to use it along with a filter function though:

vals.filter(function(v, i, a){ return i == a.indexOf(v) })

Yet another ES6 (2015) way of doing this that already works on a few browsers is:

Array.from(new Set(vals))

Or even using the spread operator:

[...new Set(vals)]
answered Sep 11, 2015 at 23:44 by ivoputzer

4 Comments

Set is great and very intuitive for those used to python. Too bad they do not have those great (union, intersect, difference) methods.
I went with the simplistic one line of code that utilizes the set mechanic. This was for a custom automation task so I was not leery of using it in the latest version of Chrome (within jsfiddle). However, I would still like to know the shortest all-browser-compliant way to de-dupe an array.
sets are part of the new specification, you should use the sort/reduce combo to assure cross-browser compatibility @AlexanderDixon
.reduce() is not cross-browser compatible as I would have to apply a poly-fill. I appreciate your response though. developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/…
30

Simplest one I've run into so far, in ES6:

var names = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl", "Mike", "Nancy"]
var noDupe = Array.from(new Set(names))

https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Set

answered Dec 25, 2016 at 5:18 by Deke

1 Comment

For Mac users, even though this is an ES6 function, it works in macOS 10.11.6 El Capitan, using the Script Editor.
26

Solution 1

Array.prototype.unique = function() {
    var a = [];
    for (var i = 0; i < this.length; i++) { // declare i with var to avoid leaking a global
        var current = this[i];
        if (a.indexOf(current) < 0) a.push(current);
    }
    return a;
}

Solution 2 (using Set)

Array.prototype.unique = function() {
    return Array.from(new Set(this));
}

Test

var x = [1,2,3,3,2,1];
x.unique() // [1,2,3]

Performance

When I tested both implementations (with and without Set) for performance in Chrome, I found that the one with Set is much, much faster!

Array.prototype.unique1 = function() {
    var a = [];
    for (var i = 0; i < this.length; i++) {
        var current = this[i];
        if (a.indexOf(current) < 0) a.push(current);
    }
    return a;
}

Array.prototype.unique2 = function() {
    return Array.from(new Set(this));
}

var x = [];
for(var i = 0; i < 10000; i++){
    x.push("x" + i);
    x.push("x" + (i + 1));
}

console.time("unique1");
console.log(x.unique1());
console.timeEnd("unique1");

console.time("unique2");
console.log(x.unique2());
console.timeEnd("unique2");

answered Jun 5, 2017 at 19:56 by ShAkKiR

6 Comments

Upvote for the use of Set. I don't know the performance comparison though
I have read somewhere that an Array is faster than a Set (overall performance), But when I tested in chrome, the implementation with Set was much much faster! see the edited answer :)
better practice is to use Object.defineProperty(Array.prototype, "unique", ...) instead of Array.prototype.unique = ... See more info here: stackoverflow.com/questions/10105824/…
the Set approach doesn't seem to work for me in Node. new Set([5,5]) seems to return [5,5] in some cases. I'm as baffled as you are. Edit: I found out what's happening. new Set([new Number(5), new Number(5)]) returns [5,5]. Apparently Node thinks the two number 5s are different if I instantiate them with new... which is honestly the stupidest thing I've ever seen.
@Demonblack This is a valid concern. x=new Number(5) and another y=new Number(5) will be two different Objects, as oppose to just var x=5 and var y=5. new keyword will create a new object. I know this explanation is obvious but that's all I know :)
25

In ECMAScript 6 (aka ECMAScript 2015), Set can be used to filter out duplicates. Then it can be converted back to an array using the spread operator.

var names = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"],
    unique = [...new Set(names)];
answered Dec 10, 2014 at 15:57 by Oriol

3 Comments

the constructor of Set actually requires the new keyword
@Ivo Thanks. Previously Firefox's implementation didn't require new, I wonder if the ES6 draft changed about this behavior.
some constructors might be indeed called as functions though this kind of behaviour depends on the browser's implementation of the spec ;)
22

Go for this one:

var uniqueArray = duplicateArray.filter(function(elem, pos) {
    return duplicateArray.indexOf(elem) == pos;
});

Now uniqueArray contains no duplicates.

answered Feb 27, 2015 at 9:53 by Juhan

Comments

22

The following is more than 80% faster than the jQuery method listed (see the tests below).

It is an answer from a similar question a few years ago. If I come across the person who originally proposed it, I will post credit. It is pure JavaScript.

var temp = {};
for (var i = 0; i < array.length; i++)
  temp[array[i]] = true;

var r = [];
for (var k in temp)
  r.push(k);

return r;
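One caveat worth noting with this approach: object keys are always strings, so numeric input comes back stringified (a quick check of that behavior):

```javascript
// Object keys are strings: numbers are coerced on the way in and stay strings.
var seen = {};
[1, 2, 2, 3].forEach(function (n) { seen[n] = true; });
console.log(Object.keys(seen)); // ["1", "2", "3"] - strings, not numbers
```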

My test case comparison: http://jsperf.com/remove-duplicate-array-tests

answered Jan 25, 2013 at 17:52 by Levi

3 Comments

I add a more fast version in revision 4. Please, review!
the test didn't seem to be using arrays??? i've added (yet another) one that seems to be consistently fast over different browsers (see jsperf.com/remove-duplicate-array-tests/10): for (var n = array.length, result = [array[n--]], i; n--;) { i = array[n]; if (!(i in result)) result.push(i); } return result;
The link is broken: "This Deployment has been disabled. 429: TOO_MANY_REQUESTS"
19

I had done a detailed comparison of dupes removal at some other question, but having noticed that this is the real place, I just wanted to share it here as well.

I believe this is the best way to do this

var myArray = [100, 200, 100, 200, 100, 100, 200, 200, 200, 200],
    reduced = Object.keys(myArray.reduce((p,c) => (p[c] = true,p),{}));

console.log(reduced);

OK.. even though this one is O(n) and the others are O(n^2), I was curious to see a benchmark comparison between this reduce / lookup table and the filter/indexOf combo (I chose Jeetendra's very nice implementation https://stackoverflow.com/a/37441144/4543207). I prepare a 100K item array filled with random positive integers in range 0-9999 and it removes the duplicates. I repeat the test 10 times and the average of the results shows that they are no match in performance.

  • In firefox v47 reduce & lut : 14.85ms vs filter & indexOf : 2836ms
  • In chrome v51 reduce & lut : 23.90ms vs filter & indexOf : 1066ms

Well ok, so far so good. But let's do it properly this time in the ES6 style. It looks so cool..! But as of now how it will perform against the powerful lut solution is a mystery to me. Let's first see the code and then benchmark it.

var myArray = [100, 200, 100, 200, 100, 100, 200, 200, 200, 200],
    reduced = [...myArray.reduce((p,c) => p.set(c,true),new Map()).keys()];

console.log(reduced);

Wow that was short..! But how about the performance..? It's beautiful... Since the heavy weight of the filter / indexOf lifted over our shoulders now i can test an array 1M random items of positive integers in range 0..99999 to get an average from 10 consecutive tests. I can say this time it's a real match. See the result for yourself :)

var ranar = [],
    red1 = a => Object.keys(a.reduce((p,c) => (p[c] = true,p),{})),
    red2 = a => reduced = [...a.reduce((p,c) => p.set(c,true),new Map()).keys()],
    avg1 = [],
    avg2 = [],
      ts = 0,
      te = 0,
    res1 = [],
    res2 = [],
    count= 10;

for (var i = 0; i<count; i++){
  ranar = (new Array(1000000).fill(true)).map(e => Math.floor(Math.random()*100000));
  ts = performance.now();
  res1 = red1(ranar);
  te = performance.now();
  avg1.push(te-ts);
  ts = performance.now();
  res2 = red2(ranar);
  te = performance.now();
  avg2.push(te-ts);
}

avg1 = avg1.reduce((p,c) => p+c)/count;
avg2 = avg2.reduce((p,c) => p+c)/count;

console.log("reduce & lut took: " + avg1 + "msec");
console.log("map & spread took: " + avg2 + "msec");

Which one would you use..? Well not so fast...! Don't be deceived. Map is at displacement. Now look... in all of the above cases we fill an array of size n with numbers of range < n. I mean we have an array of size 100 and we fill with random numbers 0..9 so there are definite duplicates and "almost" definitely each number has a duplicate. How about if we fill the array in size 100 with random numbers 0..9999. Let's now see Map playing at home. This time an Array of 100K items but random number range is 0..100M. We will do 100 consecutive tests to average the results. OK let's see the bets..! <- no typo

var ranar = [],
    red1 = a => Object.keys(a.reduce((p,c) => (p[c] = true,p),{})),
    red2 = a => reduced = [...a.reduce((p,c) => p.set(c,true),new Map()).keys()],
    avg1 = [],
    avg2 = [],
      ts = 0,
      te = 0,
    res1 = [],
    res2 = [],
    count= 100;

for (var i = 0; i<count; i++){
  ranar = (new Array(100000).fill(true)).map(e => Math.floor(Math.random()*100000000));
  ts = performance.now();
  res1 = red1(ranar);
  te = performance.now();
  avg1.push(te-ts);
  ts = performance.now();
  res2 = red2(ranar);
  te = performance.now();
  avg2.push(te-ts);
}

avg1 = avg1.reduce((p,c) => p+c)/count;
avg2 = avg2.reduce((p,c) => p+c)/count;

console.log("reduce & lut took: " + avg1 + "msec");
console.log("map & spread took: " + avg2 + "msec");

Now this is the spectacular comeback of Map()..! May be now you can make a better decision when you want to remove the dupes.

Well ok we are all happy now. But the lead role always comes last with some applause. I am sure some of you wonder what Set object would do. Now that since we are open to ES6 and we know Map is the winner of the previous games let us compare Map with Set as a final. A typical Real Madrid vs Barcelona game this time... or is it? Let's see who will win the el classico :)

var ranar = [],
    red1 = a => reduced = [...a.reduce((p,c) => p.set(c,true),new Map()).keys()],
    red2 = a => Array.from(new Set(a)),
    avg1 = [],
    avg2 = [],
      ts = 0,
      te = 0,
    res1 = [],
    res2 = [],
    count= 100;

for (var i = 0; i<count; i++){
  ranar = (new Array(100000).fill(true)).map(e => Math.floor(Math.random()*10000000));
  ts = performance.now();
  res1 = red1(ranar);
  te = performance.now();
  avg1.push(te-ts);
  ts = performance.now();
  res2 = red2(ranar);
  te = performance.now();
  avg2.push(te-ts);
}

avg1 = avg1.reduce((p,c) => p+c)/count;
avg2 = avg2.reduce((p,c) => p+c)/count;

console.log("map & spread took: " + avg1 + "msec");
console.log("set & A.from took: " + avg2 + "msec");

Wow.. man..! Well unexpectedly it didn't turn out to be an el classico at all. More like Barcelona FC against CA Osasuna :))

answered May 26, 2016 at 13:57 by Redu

1 Comment

Just btw I get arr.reduce(...).keys(...).slice is not a function in Typescript trying to use your ES6 method
15

Here is a simple answer to the question.

var names = ["Alex","Tony","James","Suzane", "Marie", "Laurence", "Alex", "Suzane", "Marie", "Marie", "James", "Tony", "Alex"];
var uniqueNames = [];

for(var i in names){
    if(uniqueNames.indexOf(names[i]) === -1){
        uniqueNames.push(names[i]);
    }
}
answered Dec 3, 2014 at 20:53 by drew7721

2 Comments

+1 for ===. It won't work for arrays with mixed types if we don't check for types. Simple but effective answer
should be the main answer !
10

Here is code that is very simple to understand and works anywhere (even in PhotoshopScript). Check it!

var peoplenames = new Array("Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl");
peoplenames = unique(peoplenames);
alert(peoplenames);

function unique(array){
    var len = array.length;
    for(var i = 0; i < len; i++)
        for(var j = i + 1; j < len; j++)
            if(array[j] == array[i]){
                array.splice(j,1);
                j--;
                len--;
            }
    return array;
}

//*result* peoplenames == ["Mike","Matt","Nancy","Adam","Jenny","Carl"]
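One thing worth flagging (an editorial note, not part of the original answer): unique() mutates its argument in place via splice, so the caller's array is modified too; the return value is the very same array object:

```javascript
// (copy of unique() from the answer above, so the demo runs standalone)
function unique(array){
    var len = array.length;
    for(var i = 0; i < len; i++)
        for(var j = i + 1; j < len; j++)
            if(array[j] == array[i]){
                array.splice(j--, 1);
                len--;
            }
    return array;
}

var input = ["Nancy", "Nancy", "Carl"];
var result = unique(input);

console.log(result);           // ["Nancy", "Carl"]
console.log(input === result); // true - the same, mutated array
```

If you need to keep the original array intact, pass in a copy, e.g. unique(input.slice()).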
answeredNov 11, 2016 at 17:35
bodich's user avatar

Comments

10

A simple, but effective, technique is to use the filter method in combination with the filter function(value, index){ return this.indexOf(value) == index }.

Code example:

var data = [2,3,4,5,5,4];
var filter = function(value, index){ return this.indexOf(value) == index };
var filteredData = data.filter(filter, data);

document.body.innerHTML = '<pre>' + JSON.stringify(filteredData, null, '\t') + '</pre>';

See also this Fiddle.

Peter Mortensen's user avatar
Peter Mortensen
31.4k22 gold badges110 silver badges134 bronze badges
answeredMar 29, 2017 at 9:31
John Slegers's user avatar

5 Comments

Genius! And, for instance, if you want to keep the repeated ones (instead of removing them), all you have to do is replace this.indexOf(value) == index with this.indexOf(value, index+1) > 0. Thanks!
You could even reduce it to a single "filter" line: filterData = data.filter((v, i) => (data.indexOf(v) == i));
Last time I bother! Sorry... picking up my 1st answer, in 2 lines you could get a JSON var JSON_dupCounter = {}; with the repeated ones and how many times they were repeated: data.filter((testItem, index) => (data.indexOf(testItem, index + 1) > 0)).forEach((found_duplicated) => (JSON_dupCounter[found_duplicated] = (JSON_dupCounter[found_duplicated] || 1) + 1));
this only works for arrays of primitives?
@frozen: It works with everything where == can be used to determine equality. So, if you're dealing with e.g. arrays, objects, or functions, the filter will work only for entries that are references to the same array, object, or function (see demo). If you want to determine equality based on different criteria, you'll need to include those criteria in your filter.
7

Apart from being a simpler, more terse solution than the current answers (minus the future-looking ES6 ones), I performance tested this, and it was much faster as well:

var uniqueArray = dupeArray.filter(function(item, i, self){
  return self.lastIndexOf(item) == i;
});

One caveat: Array.lastIndexOf() was added in Internet Explorer 9, so if you need to go lower than that, you'll need to look elsewhere.
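A second behavioural difference worth noting (an editorial observation, not from the answer): because lastIndexOf matches the final occurrence, this variant keeps the last copy of each duplicate, so the output order can differ from the indexOf-based version, which keeps the first copy:

```javascript
var dupeArray = ["a", "b", "a", "c", "b"];

// keeps the *last* occurrence of each value
var byLast = dupeArray.filter(function(item, i, self){
  return self.lastIndexOf(item) == i;
});

// keeps the *first* occurrence of each value
var byFirst = dupeArray.filter(function(item, i, self){
  return self.indexOf(item) == i;
});

console.log(byLast);  // ["a", "c", "b"]
console.log(byFirst); // ["a", "b", "c"]
```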

Peter Mortensen's user avatar
Peter Mortensen
31.4k22 gold badges110 silver badges134 bronze badges
answeredDec 19, 2015 at 7:25
csuwldcat's user avatar

Comments

6

Generic Functional Approach

Here is a generic and strictly functional approach with ES2015:

// small, reusable auxiliary functions
const apply = f => a => f(a);
const flip = f => b => a => f(a) (b);
const uncurry = f => (a, b) => f(a) (b);
const push = x => xs => (xs.push(x), xs);
const foldl = f => acc => xs => xs.reduce(uncurry(f), acc);
const some = f => xs => xs.some(apply(f));

// the actual de-duplicate function
const uniqueBy = f => foldl(
    acc => x => some(f(x)) (acc)
     ? acc
     : push(x) (acc)
  ) ([]);

// comparators
const eq = y => x => x === y;

// string equality case insensitive :D
const seqCI = y => x => x.toLowerCase() === y.toLowerCase();

// mock data
const xs = [1,2,3,1,2,3,4];
const ys = ["a", "b", "c", "A", "B", "C", "D"];

console.log( uniqueBy(eq) (xs) );
console.log( uniqueBy(seqCI) (ys) );

We can easily derive unique from uniqueBy, or use the faster implementation utilizing Sets:

const unique = uniqueBy(eq);
// const unique = xs => Array.from(new Set(xs));

Benefits of this approach:

  • generic solution by using a separate comparator function
  • declarative and succinct implementation
  • reuse of other small, generic functions

Performance Considerations

uniqueBy isn't as fast as an imperative implementation with loops, but it is way more expressive due to its genericity.

If you identify uniqueBy as the cause of a concrete performance penalty in your app, replace it with optimized code. That is, write your code first in a functional, declarative way. Afterwards, provided that you encounter performance issues, try to optimize the code at the locations that cause the problem.

Memory Consumption and Garbage Collection

uniqueBy utilizes mutations (push(x) (acc)) hidden inside its body. It reuses the accumulator instead of throwing it away after each iteration. This reduces memory consumption and GC pressure. Since this side effect is wrapped inside the function, everything outside remains pure.
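To illustrate that point (an editorial sketch, not part of the original answer), compare a fold that reuses one accumulator with one that allocates a fresh array on every step; both produce the same result, but the second creates O(n) short-lived intermediate arrays for the garbage collector:

```javascript
// reuses a single accumulator array: one allocation for the whole fold
const dedupReuse = xs => xs.reduce(
  (acc, x) => acc.includes(x) ? acc : (acc.push(x), acc),
  []
);

// allocates a fresh array on every step: O(n) intermediate arrays
const dedupCopy = xs => xs.reduce(
  (acc, x) => acc.includes(x) ? acc : acc.concat([x]),
  []
);

console.log(dedupReuse([1, 2, 1, 3, 2])); // [1, 2, 3]
console.log(dedupCopy([1, 2, 1, 3, 2]));  // [1, 2, 3]
```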

answeredSep 4, 2016 at 12:41

Comments

5

If, by any chance, you are using

D3.js

You could do

d3.set(["foo", "bar", "foo", "baz"]).values() ==> ["foo", "bar", "baz"]

https://github.com/mbostock/d3/wiki/Arrays#set_values

answeredMay 31, 2015 at 22:38
Shankar ARUL's user avatar

1 Comment

Beautiful, but loading a full-fledged visualization library only to filter duplicates seems overkill. Luckily I need the lib for another purpose anyway, so I will be using this. Thanks very much.
5
var originalArray = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl"];
var newArray = [];

// keep only the first occurrence of each value
for (var i = 0; i < originalArray.length; i++) {
    if (!newArray.includes(originalArray[i])) {
        newArray.push(originalArray[i]);
    }
}
answeredOct 25, 2017 at 13:54
MBJH's user avatar

1 Comment

Could you elaborate on the code?
4

The following script returns a new array containing only unique values. It works on strings and numbers. No additional libraries are required, only vanilla JS.

Browser support:

Feature          Chrome   Firefox (Gecko)   Internet Explorer   Opera   Safari
Basic support    (Yes)    1.5 (1.8)         9                   (Yes)   (Yes)

https://jsfiddle.net/fzmcgcxv/3/

var duplicates = ["Mike","Matt","Nancy","Adam","Jenny","Nancy","Carl","Mike","Mike","Nancy","Carl"];
var unique = duplicates.filter(function(elem, pos) {
    return duplicates.indexOf(elem) == pos;
});
alert(unique);
answeredMar 10, 2015 at 8:19
GibboK's user avatar

Comments

4

Use:

https://jsfiddle.net/2w0k5tz8/

function remove_duplicates(array_){
    var ret_array = new Array();
    for (var a = array_.length - 1; a >= 0; a--) {
        for (var b = array_.length - 1; b >= 0; b--) {
            if(array_[a] == array_[b] && a != b){
                delete array_[b];
            }
        }
        if(array_[a] != undefined)
            ret_array.push(array_[a]);
    }
    return ret_array;
}

console.log(remove_duplicates(Array(1,1,1,2,2,2,3,3,3)));

Loop through, remove duplicates, and push the survivors into a clone placeholder array, because the array indices will not be updated.

Loop backward for better performance (your loop won’t need to keep checking the length of your array).

Peter Mortensen's user avatar
Peter Mortensen
31.4k22 gold badges110 silver badges134 bronze badges
answeredAug 18, 2015 at 14:45
THE AMAZING's user avatar

Comments

4

A slight modification of georg's excellent answer to use a custom comparator:

function contains(array, obj) {
    for (var i = 0; i < array.length; i++) {
        if (isEqual(array[i], obj)) return true;
    }
    return false;
}

// comparator
function isEqual(obj1, obj2) {
    if (obj1.name == obj2.name) return true;
    return false;
}

function removeDuplicates(ary) {
    var arr = [];
    return ary.filter(function(x) {
        return !contains(arr, x) && arr.push(x);
    });
}
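For instance (a usage sketch of my own, not from the answer), deduplicating objects by their name property with the comparator above:

```javascript
// (definitions from the answer above, repeated so the demo runs standalone)
function isEqual(obj1, obj2) {
    return obj1.name == obj2.name;
}

function contains(array, obj) {
    for (var i = 0; i < array.length; i++) {
        if (isEqual(array[i], obj)) return true;
    }
    return false;
}

function removeDuplicates(ary) {
    var arr = [];
    return ary.filter(function(x) {
        return !contains(arr, x) && arr.push(x);
    });
}

var people = [{name: "Nancy"}, {name: "Carl"}, {name: "Nancy"}];
var uniquePeople = removeDuplicates(people);

// only the first object with each name survives
console.log(uniquePeople.map(function(p){ return p.name; })); // ["Nancy", "Carl"]
```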
Nimantha's user avatar
Nimantha
6,5376 gold badges32 silver badges78 bronze badges
answeredApr 9, 2013 at 14:33
vin_schumi's user avatar

1 Comment

OK, the OP has left the building:"Last seen more than 11 years ago". Perhaps somebody else can chime in?
4

Although the ES6 solution is the best, I'm baffled as to how nobody has shown the following solution:

function removeDuplicates(arr){
    var o = {};
    arr.forEach((e) => (o[e] = true));
    return Object.keys(o);
}

The thing to remember here is that objects must have unique keys. We are exploiting this to remove all the duplicates. I would have thought this would be the fastest solution (before ES6).

Bear in mind, though, that this coerces every value to a string, and that Object.keys returns integer-like keys first, in ascending order, so the result comes back effectively sorted.
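A quick demonstration of both side effects (an editorial sketch, repeating the function so the snippet stands alone):

```javascript
function removeDuplicates(arr){
    var o = {};
    arr.forEach((e) => (o[e] = true));
    return Object.keys(o);
}

// numbers come back as strings, and integer-like keys are
// reordered ascending, ahead of the other keys
console.log(removeDuplicates([10, 2, "b", "a", 2])); // ["2", "10", "b", "a"]
```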

answeredJul 6, 2017 at 21:01
Sancarn's user avatar

Comments

3
$(document).ready(function() {
    var arr1 = ["dog","dog","fish","cat","cat","fish","apple","orange"];
    var arr2 = ["cat","fish","mango","apple"];
    var uniquevalue = [];
    var seconduniquevalue = [];
    var finalarray = [];

    $.each(arr1, function(key, value){
       if($.inArray(value, uniquevalue) === -1)
       {
           uniquevalue.push(value);
       }
    });

    $.each(arr2, function(key, value){
       if($.inArray(value, seconduniquevalue) === -1)
       {
           seconduniquevalue.push(value);
       }
    });

    $.each(uniquevalue, function(ikey, ivalue){
        $.each(seconduniquevalue, function(ukey, uvalue){
            if( ivalue == uvalue )
            {
                finalarray.push(ivalue);
            }
        });
    });

    alert(finalarray);
});
Gwenc37's user avatar
Gwenc37
2,0467 gold badges19 silver badges22 bronze badges
answeredJul 15, 2014 at 9:15
user3840178's user avatar

Comments

3

Another method of doing this without writing much code is using the ES5 Object.keys method:

var arrayWithDuplicates = ['a','b','c','d','a','c'],
    deduper = {};

arrayWithDuplicates.forEach(function (item) {
    deduper[item] = null;
});

var dedupedArray = Object.keys(deduper); // ["a", "b", "c", "d"]

Extracted in a function

function removeDuplicates (arr) {
    var deduper = {};
    arr.forEach(function (item) {
        deduper[item] = null;
    });
    return Object.keys(deduper);
}
answeredNov 6, 2014 at 13:34
Willem de Wit's user avatar

2 Comments

This doesn't work. You aren't using arrayWithDuplicates anywhere.
@Oriol Sorry about that, I forgot one line. I edited the example.