I somehow get the following timing output for my program. I understand that if there's I/O involved, real time can be larger than the sum of user time and system time, but how do you explain this when user time alone is larger than real time?
```
real    0m8.512s
user    0m8.737s
sys     0m1.956s
```
Your original post had a user time that was not larger than the real time. There, user and sys time together were larger than real time, but that is possible, as explained in this entry.
The program is probably using multiple cores at some point. User time is summed over all the cores that were used, so, for example, running at 100% on two cores for one second accumulates two seconds of user time against one second of real time.
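As an illustration (this is not the asker's program), here is a minimal C sketch that busy-spins on two threads for about one wall-clock second each. Run under `time` on a machine with at least two free cores, it should report roughly 2s of user time against roughly 1s of real time:

```c
/* spin.c - busy-spin on two threads for ~1 wall-clock second each.
 * Build (assumed): cc -pthread spin.c -o spin
 * Run:             time ./spin
 * Expected on >=2 free cores: real ~1s, user ~2s.
 */
#include <pthread.h>
#include <time.h>

static void *spin(void *arg) {
    (void)arg;
    struct timespec start, now;
    clock_gettime(CLOCK_MONOTONIC, &start);
    for (;;) {
        clock_gettime(CLOCK_MONOTONIC, &now);
        /* elapsed wall-clock seconds since this thread started */
        double elapsed = (now.tv_sec - start.tv_sec)
                       + (now.tv_nsec - start.tv_nsec) / 1e9;
        if (elapsed >= 1.0)
            break;  /* stop after ~1 second of pure CPU burn */
    }
    return NULL;
}

int main(void) {
    pthread_t t1, t2;
    /* two threads spinning concurrently: each burns ~1s of CPU,
     * so the kernel charges ~2s of user time in ~1s of real time */
    pthread_create(&t1, NULL, spin, NULL);
    pthread_create(&t2, NULL, spin, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    return 0;
}
```

The same accounting explains the numbers above: a program that is multi-threaded for even part of its run can accumulate more user time than elapsed real time.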