I am trying to implement simple validation of credit card numbers. I read about the Luhn algorithm on Wikipedia:
- Counting from the check digit, which is the rightmost, and moving left, double the value of every second digit.
- Sum the digits of the products (e.g., 10: 1 + 0 = 1, 14: 1 + 4 = 5) together with the undoubled digits from the original number.
- If the total modulo 10 is equal to 0 (if the total ends in zero) then the number is valid according to the Luhn formula; else it is not valid.
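A literal, step-by-step translation of that description into JavaScript (my own sketch, not taken from any of the implementations I found; `luhnValid` is just a name I picked) would look something like this:

```js
// Literal translation of the Wikipedia steps.
// Expects the card number as a string of digits.
function luhnValid(cardNumber) {
  var digits = String(cardNumber).split('').map(Number);
  var total = 0;

  // Walk from the check digit (rightmost, position 1) to the left.
  for (var i = digits.length - 1, position = 1; i >= 0; i--, position++) {
    var d = digits[i];
    if (position % 2 === 0) {
      // Double the value of every second digit...
      d = d * 2;
      // ...and sum the digits of the product (e.g. 14 -> 1 + 4 = 5).
      d = Math.floor(d / 10) + (d % 10);
    }
    total += d;
  }

  // Valid when the total ends in zero.
  return total % 10 === 0;
}

luhnValid('79927398713'); // true (the example number from the Wikipedia article)
```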
Wikipedia's description of the Luhn algorithm is easy to understand, and a direct translation like the sketch above seems to work. However, I have also seen other implementations of the Luhn algorithm on Rosetta Code and elsewhere.
Those implementations work very well, but I am confused about how they can use an array to do the work. The array they use seems to have no relation to the Luhn algorithm, and I can't see how it achieves the steps described on Wikipedia.
Why are they using arrays? What is the significance of them, and how are they used to implement the algorithm as described by Wikipedia?
Here is a very fast and elegant implementation of the Luhn algorithm.
You can grab it from my dedicated git repository, where you will also find more info (such as a link to benchmarks and full unit tests covering ~50 browsers and several Node.js versions).
Or you can simply install it via Bower or npm. It works both in browsers and in Node.js.
The code is the following:
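(A sketch of the array-based approach; the exact code published in the package may differ in details, and `luhnChk` here is just an illustrative name.)

```js
// Lookup table: luhnArr[d] is the digit sum of d * 2
// (e.g. luhnArr[6] === 3 because 6 * 2 = 12 and 1 + 2 = 3).
var luhnArr = [0, 2, 4, 6, 8, 1, 3, 5, 7, 9];

// Expects the card number as a string of digits.
function luhnChk(luhn) {
  var len = luhn.length;
  var mul = 0;      // 0 -> take the digit as-is, 1 -> use the doubled lookup
  var counter = 0;

  // Walk the string from right (check digit) to left.
  while (len--) {
    var digit = parseInt(luhn.charAt(len), 10);
    counter += mul ? luhnArr[digit] : digit;
    mul ^= 1;       // toggle on every digit
  }

  return counter > 0 && counter % 10 === 0;
}
```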
The variable `counter` is the sum of all the digits in odd positions, plus the doubled value of the digits in even positions; when the doubled value has two digits, we add those digits together (e.g. 6 * 2 -> 12 -> 1 + 2 = 3).

The array you are asking about is the precomputed result of all the possible doubles:
var luhnArr = [0, 2, 4, 6, 8, 1, 3, 5, 7, 9];
So, for example, luhnArr[6] is 3 because 6 * 2 = 12 and 1 + 2 = 3, and luhnArr[8] is 7 because 8 * 2 = 16 and 1 + 6 = 7.
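For instance, running the `luhnChk` sketch above against the example number used in the Wikipedia article:

```js
luhnChk('79927398713'); // true  -> the weighted digit sum is 70
luhnChk('79927398714'); // false -> changing the check digit gives 71
```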
Another alternative:
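One common alternative shape (an illustrative sketch, not any specific library's code) avoids the lookup array entirely: reverse the digit string and subtract 9 from any two-digit product, which gives the same result as summing its digits.

```js
// Alternative sketch: no lookup array. After reversing, the digits at odd
// indexes (1, 3, 5, ...) are exactly the ones that must be doubled.
function luhnCheck(num) {
  var sum = String(num)
    .split('')
    .reverse()
    .reduce(function (acc, ch, idx) {
      var d = parseInt(ch, 10);
      if (idx % 2 === 1) {
        d *= 2;
        if (d > 9) d -= 9; // same as adding the two digits of the product
      }
      return acc + d;
    }, 0);

  return sum % 10 === 0;
}

luhnCheck('79927398713'); // true
```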
Unfortunately, none of the code above worked for me, but I found a working solution on GitHub.