I was looking at some of the solutions in Google Code Jam and some people used things that I had never seen before. For example,
2LL*r+1LL
What do 2LL and 1LL mean?
Their includes look like this:
#include <math.h>
#include <algorithm>
#define _USE_MATH_DEFINES
or
#include <cmath>
The `LL` suffix makes the integer literal of type `long long`. So `2LL` is a 2 of type `long long`. Without the `LL`, the literal would only be of type `int`. This matters when you're doing stuff like this:
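(A minimal sketch, assuming a 32-bit `int` and a shift count of 40.)

```cpp
#include <iostream>

int main() {
    // The literal 1 has type int; with a 32-bit int, shifting by 40 is
    // undefined behavior because the shift count exceeds the type's width.
    // long long x = 1 << 40;   // undefined behavior

    // The literal 1LL has type long long (at least 64 bits), so the shift
    // is well defined and yields 2^40.
    long long y = 1LL << 40;
    std::cout << y << '\n';     // prints 1099511627776
}
```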
With just the literal `1` (assuming `int` to be 32 bits), you shift beyond the size of the integer type, which is undefined behavior. With `1LL`, you set the type to `long long` beforehand, and the shift properly returns 2^40.
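The same reasoning applies to the expression from the question: `2LL*r+1LL` forces the whole computation into `long long`, so it doesn't overflow even when `r` is a plain `int`. A sketch (the type and value of `r` here are assumptions):

```cpp
#include <iostream>

int main() {
    int r = 2000000000;  // assumption: a large int value

    // 2 * r + 1 is evaluated entirely in int, so the multiplication
    // overflows before any conversion to long long can happen.
    // long long bad = 2 * r + 1;   // signed overflow: undefined behavior

    // 2LL * r + 1LL promotes r to long long, so the whole expression is
    // computed in (at least) 64 bits and the result fits.
    long long ok = 2LL * r + 1LL;
    std::cout << ok << '\n';  // prints 4000000001
}
```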