What is the difference between concurrent programming and parallel programming?

Posted 2019-01-01 11:32

What is the difference between concurrent programming and parallel programming? I asked Google but didn't find anything that helped me understand the difference. Could you give me an example of each?

For now I found this explanation: http://www.linux-mag.com/id/7411 - but "concurrency is a property of the program" vs "parallel execution is a property of the machine" isn't enough for me - I still can't tell which is which.

14 answers
何处买醉
Answer #2 · 2019-01-01 12:02

If you program using threads (concurrent programming), it's not necessarily going to be executed as such (parallel execution), since it depends on whether the machine can handle several threads.

Here's a visual example. Threads on a non-threaded machine:

        --  --  --
     /              \
>---- --  --  --  -- ---->>

Threads on a threaded machine:

     ------
    /      \
>-------------->>

The dashes represent executed code. As you can see, they both split up and execute separately, but the threaded machine can execute several separate pieces at once.
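A minimal Python sketch of the same idea, assuming CPython (the names `worker` and `results` are mine): the program is written with threads (concurrent programming), but because of CPython's global interpreter lock the threads' bytecode is interleaved on one core rather than executed in parallel, much like the first diagram.

```python
import threading

# Two threads are *concurrent*: both are in progress at once.
# On CPython, the GIL means they interleave rather than run in parallel.
results = []
lock = threading.Lock()

def worker(name, count):
    for i in range(count):
        with lock:  # protect the shared list from interleaved appends
            results.append((name, i))

t1 = threading.Thread(target=worker, args=("a", 3))
t2 = threading.Thread(target=worker, args=("b", 3))
t1.start(); t2.start()
t1.join(); t2.join()

# Both threads finished; the interleaving order is up to the scheduler.
print(len(results))  # 6
```

The same source code would run genuinely in parallel on a runtime without a global lock, which is exactly the "property of the machine, not the program" distinction from the question.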

谁念西风独自凉
Answer #3 · 2019-01-01 12:03

Interpreting the original question as being about parallel/concurrent computation rather than programming.

In concurrent computation, two computations both advance independently of each other. The second computation doesn't have to wait for the first to finish before it can advance. This doesn't, however, specify the mechanism by which that is achieved. In a single-core setup, suspending and alternating between threads is required (also called pre-emptive multithreading).

In parallel computation, two computations advance simultaneously - that is, literally at the same time. This is not possible with a single CPU core; it requires a multi-core setup instead.

[Figure: suspending and taking turns versus parallel computing]

Source: "Parallel vs Concurrent in Node.js".
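The "suspending and alternating" mechanism on a single core can be sketched with Python's `asyncio` (a cooperative rather than pre-emptive variant, and the names `compute` and `order` are mine): two computations advance without either waiting for the other to finish, on one thread.

```python
import asyncio

# Two computations advance independently on a single thread by
# suspending at each await and letting the other one run.
order = []

async def compute(name, steps):
    for _ in range(steps):
        order.append(name)
        await asyncio.sleep(0)  # suspend; the event loop switches tasks

async def main():
    # Neither computation waits for the other to finish.
    await asyncio.gather(compute("x", 3), compute("y", 3))

asyncio.run(main())
print(order)  # the steps of "x" and "y" are interleaved
```

Both computations are concurrent here, yet nothing ever runs simultaneously: that would require the multi-core setup described above.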

有味是清欢
Answer #4 · 2019-01-01 12:04
  • Concurrent programming, in a general sense, refers to environments in which the tasks we define can occur in any order. One task can occur before or after another, and some or all tasks can be performed at the same time.

  • Parallel programming refers specifically to the simultaneous execution of concurrent tasks on different processors. Thus, all parallel programming is concurrent, but not all concurrent programming is parallel.

Source: PThreads Programming - A POSIX Standard for Better Multiprocessing, Buttlar, Farrell, Nichols
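A small illustration of this definition, as a sketch (the function `square` is my own example): the tasks below are written concurrently, so they may be scheduled in any order, but whether they also run simultaneously depends on the executor and the hardware, not on this code.

```python
from concurrent.futures import ThreadPoolExecutor

# Concurrent tasks: independent, may be scheduled in any order.
# Whether they run in *parallel* depends on the machine, not the code.
def square(n):
    return n * n

with ThreadPoolExecutor(max_workers=4) as pool:
    # Executor.map returns results in input order even if the
    # tasks themselves complete out of order.
    results = list(pool.map(square, [1, 2, 3, 4]))

print(results)  # [1, 4, 9, 16]
```

Swapping `ThreadPoolExecutor` for `ProcessPoolExecutor` would let the same concurrent tasks execute on different processors, matching the book's definition of parallel programming.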

只靠听说
Answer #5 · 2019-01-01 12:05

In programming, concurrency is the composition of independently executing processes, while parallelism is the simultaneous execution of (possibly related) computations.
- Andrew Gerrand -

And

Concurrency is the composition of independently executing computations. Concurrency is a way to structure software, particularly as a way to write clean code that interacts well with the real world. It is not parallelism.

Concurrency is not parallelism, although it enables parallelism. If you have only one processor, your program can still be concurrent but it cannot be parallel. On the other hand, a well-written concurrent program might run efficiently in parallel on a multiprocessor. That property could be important...
- Rob Pike -

To understand the difference, I strongly recommend watching this video by Rob Pike (one of the creators of Go): Concurrency Is Not Parallelism.
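Pike's point that concurrency is a way to *structure* software can be sketched as a producer/consumer pipeline (the names `producer`, `consumer`, and the sentinel convention are mine): two independently executing components composed through a queue. On one processor they interleave; on several they could run in parallel, with no change to the code.

```python
import queue
import threading

# Concurrency as structure: independent components composed via a queue.
q = queue.Queue()
seen = []

def producer():
    for i in range(5):
        q.put(i)
    q.put(None)  # sentinel: signals "no more items"

def consumer():
    while True:
        item = q.get()
        if item is None:
            break
        seen.append(item * 2)

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()

print(seen)  # [0, 2, 4, 6, 8] - Queue is FIFO, so order is preserved
```

The program is concurrent by design and therefore *enables* parallelism, which is exactly the property Pike calls important.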

与风俱净
Answer #6 · 2019-01-01 12:06

Although there isn’t complete agreement on the distinction between the terms parallel and concurrent, many authors make the following distinctions:

  • In concurrent computing, a program is one in which multiple tasks can be in progress at any instant.
  • In parallel computing, a program is one in which multiple tasks cooperate closely to solve a problem.

So parallel programs are concurrent, but a program such as a multitasking operating system is also concurrent, even when it is run on a machine with only one core, since multiple tasks can be in progress at any instant.

Source: An introduction to parallel programming, Peter Pacheco

泛滥B
Answer #7 · 2019-01-01 12:11

1. Definitions:

Classic scheduling of tasks can be SERIAL, PARALLEL or CONCURRENT

SERIAL: Analysis shows that tasks MUST BE executed one after the other, in a known strict order, OR it will not work.

I.e.: Easy enough, we can live with this

PARALLEL: Analysis shows that tasks MUST BE executed at the same time OR it will not work.

  • Any failure of any of the tasks -- functionally or in time -- will result in total system failure.
  • All tasks must have a common reliable sense of time.

I.e.: Try to avoid this or we will have tears by tea time.

CONCURRENT: Analysis shows that we NEED NOT CARE. We are not careless; we have analysed it and it does not matter. We can therefore execute any task using any available facility at any time.

I.e.: HAPPY DAYS


Often the scheduling available changes at known events, which I call a state change.
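The taxonomy above can be sketched in a few lines of Python (the functions `serial` and the `tasks` list are my own illustration): SERIAL work depends on a strict order, while CONCURRENT work gives the same answer under any scheduling, which is why it can be run on any available facility at any time.

```python
import random

# SERIAL: each step consumes the previous result; order is mandatory.
def serial(x):
    x = x + 3   # step 1 must precede step 2
    x = x * 2   # uses the result of step 1
    return x

# CONCURRENT: independent tasks; any execution order gives the same answer.
tasks = [lambda: 1, lambda: 2, lambda: 3, lambda: 4]
shuffled = tasks[:]
random.shuffle(shuffled)  # simulate an arbitrary schedule

assert sum(t() for t in tasks) == sum(t() for t in shuffled)  # order-free
print(serial(5))  # (5 + 3) * 2 = 16
```

PARALLEL (in this answer's strict sense) would additionally require the tasks to execute at the same instant with a common sense of time, which no amount of reordering in this sketch captures.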


2. This is not a { Software | Programming } Feature but a Systems Design approach:

People often think this is about software, but it is in fact a systems design concept that pre-dates computers.

Software systems were a little slow on the uptake; very few software languages even attempt to address the problem.

You might try looking up the TRANSPUTER language occam if you are interested in a good try.

( occam has many innovative, if not second-to-none, features, including explicit language support for PAR and SEQ execution constructs. Most other languages lack these, and in the coming era of Massively Parallel Processor Arrays they are re-inventing the wheel that InMOS Transputers used more than 35 years ago (!!!) )


3. What a good Systems Design takes care to cover:

Succinctly, systems design addresses the following:

THE VERB - What are you doing. ( operation or algorithm )

THE NOUN - What are you doing it to. ( Data or interface )

WHEN - Initiation, schedule, state changes, SERIAL, PARALLEL, CONCURRENT

WHERE - Once you know when things happen then you can say where they can happen and not before.

WHY - Is this the way to do it? Are there other ways? Is there a best way?

.. and last but not least .. WHAT HAPPENS IF YOU DO NOT DO IT ?


4. Visual examples of PARALLEL vs. SERIAL approaches:

Recent parallel architectures, available in 2014, in action on arrays of 16, 64, or 1024 parallel RISC microprocessors.

A quarter of a century back: a piece of true parallel history, with an InMOS Transputer CPU demo video from the early 1990s.

Good luck
