From the definitions I've been reading, threads are basically pieces of code that run concurrently (at the same time).
However, how can they be running concurrently if there is a thread scheduler?
I read that the thread scheduler basically picks, more or less at random, one thread to run at any given moment from the pool of Runnable threads. From that I gathered that at a precise point in time only one runnable thread is truly in the running state. (All of this is from the SCJP Sun Certified Programmer study guide.) Can anyone clarify this?
Are these threads truly running concurrently?
However, how can they be running concurrently if there is a thread scheduler?
They are not always running concurrently; the scheduler's job is to swap the running threads around so quickly that they appear to be running concurrently, i.e. too fast for you to see.
The scheduler uses a time slice of around 0.1 ms. You can only perceive a flicker of about 10 - 25 ms, so this is far too fast for you to see, but the threads are being swapped quickly enough that it appears there is concurrency.
e.g. you don't see movies jumping from one frame to the next. Each frame changes roughly every 1/24th of a second, so you think you see movement, when to a high-speed camera the screen would actually look jumpy.
If you have one logical CPU, all the threads are swapped onto that one CPU. If you have multiple logical CPUs, a small set of threads can be running at once and the rest have to wait.
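To see this in action, here's a minimal sketch (my own example, not from the original answer, plain Java with no extra libraries): it prints how many logical CPUs the JVM sees and then deliberately starts more threads than that, so at least some of them have to take turns on the same core.

    public class MoreThreadsThanCores {
        public static void main(String[] args) throws InterruptedException {
            int logicalCpus = Runtime.getRuntime().availableProcessors();
            System.out.println("Logical CPUs: " + logicalCpus);

            int threadCount = logicalCpus * 2;          // deliberately oversubscribe
            Thread[] threads = new Thread[threadCount];
            for (int i = 0; i < threadCount; i++) {
                final int id = i;
                threads[i] = new Thread(() -> {
                    long sum = 0;
                    for (long j = 0; j < 100_000_000L; j++) {
                        sum += j;                       // CPU-bound busy work
                    }
                    System.out.println("Thread " + id + " done, sum=" + sum);
                });
                threads[i].start();
            }
            for (Thread t : threads) {
                t.join();                               // wait for every worker
            }
        }
    }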
Practically speaking they are running concurrently, but technically they are not (in the case of a single-core CPU). The scheduler chooses a thread to run, executes some number of instructions for that thread, and then context-switches to execute instructions for another thread. All of this happens so fast that it appears the threads are running concurrently. With a multi-core CPU the same concept applies, but the machine is able to execute two or more threads at literally the same time.
While this is technically true, in programming it is useful to pretend you don't know that. Treat threads in Java as though they are actually running concurrently. The Java concurrency constructs are an abstraction that lets you effectively run tasks in parallel without having to know what is going on at the operating system level and below.
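As a rough sketch of that abstraction (my own example, not from the SCJP guide): you hand tasks to an ExecutorService and let the library and the OS decide when and on which core each one actually runs.

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    public class SubmitAndForget {
        public static void main(String[] args) throws InterruptedException {
            // A fixed pool of 4 worker threads; the pool size is an arbitrary choice here.
            ExecutorService pool = Executors.newFixedThreadPool(4);
            for (int i = 0; i < 10; i++) {
                final int taskId = i;
                pool.submit(() -> System.out.println(
                        "Task " + taskId + " ran on " + Thread.currentThread().getName()));
            }
            pool.shutdown();                             // accept no new tasks
            pool.awaitTermination(1, TimeUnit.MINUTES);  // let the queued ones finish
        }
    }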
On multicore processors yes, on single cores no.
This is because a multicore processor actually has multiple cores that can execute different logic regardless of what is happening on the other cores. This is not possible on a single-core processor.

To give the illusion of multithreading on a single-core processor, the JVM (and the OS beneath it) switches between the execution of the different threads very frequently and at unpredictable points, so it seems as if many things are happening at once. In reality, only one thing is happening at a time: the CPU makes a small amount of progress on one task, then switches to another task, and repeats this process very often. As an example, let's say I have something like this:
Thread 1
1-2-3-4-5
Thread 2
A-B-C-D-E
The JVM would switch between the execution of the two threads maybe giving something like this:
1-A-B-2-3-C-4-D-E-5
Both threads end at (or around) the same time, as if they were running at the same time, but that is not actually what happens.
Note that I just made up this particular order; the real interleaving is unpredictable and may differ between JVMs and machines, but I hope you can see what I mean. Generally the JVM is good enough at faking multithreading that it makes no practical difference to the programmer.
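If you want to watch that interleaving yourself, here is a small sketch of the 1-2-3-4-5 / A-B-C-D-E example above; the printed order will differ from run to run because the scheduler decides when each thread gets the CPU.

    public class InterleavingDemo {
        public static void main(String[] args) throws InterruptedException {
            Thread numbers = new Thread(() -> {
                for (int i = 1; i <= 5; i++) {
                    System.out.print(i + "-");
                }
            });
            Thread letters = new Thread(() -> {
                for (char c = 'A'; c <= 'E'; c++) {
                    System.out.print(c + "-");
                }
            });
            numbers.start();
            letters.start();
            numbers.join();   // wait for both threads before exiting
            letters.join();
            System.out.println();
        }
    }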
See, many threads spend time waiting for some kind of interaction with I/O devices. In that case, the scheduler suspends that thread, picks another thread from the queue of ready threads, and starts executing it. The thread that was suspended can later come back to the ready state (it is placed in the ready queue again), at which point it is once more eligible for scheduling and execution.
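A small sketch of that idea (the I/O wait is simulated here with Thread.sleep): while one thread is blocked, the scheduler happily runs the other, CPU-bound thread.

    public class BlockedVersusRunnable {
        public static void main(String[] args) throws InterruptedException {
            Thread waiting = new Thread(() -> {
                try {
                    System.out.println("waiting thread: blocking (pretend I/O)...");
                    Thread.sleep(1000);                 // simulated I/O wait
                    System.out.println("waiting thread: back in the ready state");
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
            Thread working = new Thread(() -> {
                long sum = 0;
                for (long i = 0; i < 200_000_000L; i++) {
                    sum += i;                           // keeps the CPU busy meanwhile
                }
                System.out.println("working thread: finished, sum=" + sum);
            });
            waiting.start();
            working.start();
            waiting.join();
            working.join();
        }
    }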
Since this question and the related answers seem likely to add to the confusion about the topic through inaccurate use of terminology, I'll add to the discussion.
Are these threads truly running concurrently?
Yes. This is true regardless of whether the machine has multiple cores or not. What "concurrent" means changes a bit based on how many cores the machine has. Specifically, machines with multiple cores are capable of parallel execution, while single-core machines are not, so it's possible to have concurrency without parallelism.
Since before multi-core machines existed, computers have appeared to run many threads simultaneously even though that was not literally the case. This is achieved via thread scheduling algorithms of various types, but the end result is threads sharing the processor over a period of time. Many threads make progress over the same time span; they are running concurrently because it is typically the case that a thread will not complete within one time quantum. A visualization of that, based on a completely fair round-robin scheduler, would be something like the following (there is a timing sketch right after this diagram):
T1-T2-T3-...-TN-T1-T2-T3...TN-...
-------------Time------------->
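To make "concurrency without parallelism" measurable, here is a hedged sketch: it times one CPU-bound task running alone and then two running at once. On a single logical CPU the two-thread run takes roughly twice as long (the threads only share time); with spare cores it takes about the same (the threads really do run in parallel).

    public class ConcurrencyVsParallelism {
        static void burnCpu() {
            long sum = 0;
            for (long i = 0; i < 500_000_000L; i++) {
                sum += i;
            }
            if (sum == 42) System.out.println(sum);  // keep the JIT from removing the loop
        }

        static long timeThreads(int n) throws InterruptedException {
            Thread[] threads = new Thread[n];
            long start = System.nanoTime();
            for (int i = 0; i < n; i++) {
                threads[i] = new Thread(ConcurrencyVsParallelism::burnCpu);
                threads[i].start();
            }
            for (Thread t : threads) t.join();
            return (System.nanoTime() - start) / 1_000_000;   // elapsed milliseconds
        }

        public static void main(String[] args) throws InterruptedException {
            System.out.println("1 thread : " + timeThreads(1) + " ms");
            System.out.println("2 threads: " + timeThreads(2) + " ms");
        }
    }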
When multiple cores, multiple processors, or even hardware-threaded (a.k.a. HyperThreaded) processors are present, the machine is capable of parallel execution of threads. This means the threads are indeed running simultaneously. A simplified visualization of that, based on a system that seemingly no longer needs a thread scheduler as you suggest, would look like:
T1-T1-T1-T1-T1...
-----Time----->
T2-T2-T2-T2-T2...
But there is an obvious issue here that must still be fixed with a thread scheduler, even in the presence of multiple cores. Can you guess how many threads are running at one time even on a computer used mainly to play around on the internet? Lots! (There is a small sketch after the next diagram that counts them inside a single JVM.) There currently isn't technology available to provide a dedicated core, logical or otherwise, for every thread on a system. So what we have today is concurrent parallelism. A simplified example of that, based on a completely fair round-robin scheduler, would look like:
T1-T2-T3-...-TN-T1-T2-T3...TN-...
-------------Time------------->
T4-T5-T6-...-TM-T4-T5-T6...TM-...
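You can get a feel for the "lots of threads" point without leaving the JVM. This sketch uses the standard ThreadMXBean to list the live threads in even a trivial program (GC, JIT, and other housekeeping threads show up alongside main), and the operating system as a whole is running far more.

    import java.lang.management.ManagementFactory;
    import java.lang.management.ThreadInfo;
    import java.lang.management.ThreadMXBean;

    public class CountJvmThreads {
        public static void main(String[] args) {
            ThreadMXBean threads = ManagementFactory.getThreadMXBean();
            System.out.println("Live threads in this JVM: " + threads.getThreadCount());
            for (long id : threads.getAllThreadIds()) {
                ThreadInfo info = threads.getThreadInfo(id);
                if (info != null) {                       // a thread may have died in the meantime
                    System.out.println("  " + info.getThreadName());
                }
            }
        }
    }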
You can't avoid a thread or process scheduler, and not just because of the number of threads running: threads have different priorities, and less important threads have to be preempted in order to provide the best experience for the user. For example, a thread handling keyboard input cannot just be tossed into a queue and allowed to run whenever it gets its turn; doing that would result in my machine being smashed into a million pieces while trying to write this post.
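As a minimal sketch of priorities as a hint to the scheduler (note that Java thread priorities map only loosely, or on some platforms not at all, to OS priorities, so this is a suggestion rather than a guarantee of preemption):

    public class PriorityHint {
        public static void main(String[] args) {
            Thread background = new Thread(() -> doWork("background"));
            Thread interactive = new Thread(() -> doWork("interactive"));

            background.setPriority(Thread.MIN_PRIORITY);   // 1: fine to be preempted
            interactive.setPriority(Thread.MAX_PRIORITY);  // 10: hint that this work matters more

            background.start();
            interactive.start();
        }

        static void doWork(String name) {
            long sum = 0;
            for (long i = 0; i < 100_000_000L; i++) {
                sum += i;
            }
            System.out.println(name + " finished (sum=" + sum + ")");
        }
    }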
For further reading, I'd suggest picking up whatever the current go-to OS textbook is; it will deal with many concurrency/multiprogramming topics in depth.