How to dispatch code blocks to the same thread in iOS?

Published 2019-03-10 00:26

Question:

Main aspect of the question: This is about iOS. Can I somehow dispatch code blocks so that they all (a) run in the background and (b) run on the same thread? I want to run some time-consuming operations in the background, but these have to run on the same thread, because they involve resources that must not be shared among threads.

Further technical details, if required: It's about implementing an SQLite plugin for Apache Cordova, a framework for HTML5 apps on mobile platforms. The plugin should be an implementation of WebSQL in terms of Cordova's plugin API. (That means it's not possible to wrap entire transactions in single blocks, which would make everything easier.)

Here is some code from Cordova's Docs:

- (void)myPluginMethod:(CDVInvokedUrlCommand*)command
{
    // Check command.arguments here.
    [self.commandDelegate runInBackground:^{
        NSString* payload = nil;
        // Some blocking logic...
        CDVPluginResult* pluginResult = [CDVPluginResult resultWithStatus:CDVCommandStatus_OK messageAsString:payload];
        // The sendPluginResult method is thread-safe.
        [self.commandDelegate sendPluginResult:pluginResult callbackId:command.callbackId];
    }];
}

But as far as I know, there is no guarantee that those dispatched code blocks (see runInBackground) will run on the same thread.

Answer 1:

GCD makes no guarantee that two blocks run on the same thread, even if they belong to the same queue (with the exception of the main queue, of course). However, if you're using a serial queue (DISPATCH_QUEUE_SERIAL) this isn't a problem as you then know that there is no concurrent access to your data.

The man page for dispatch_queue_create says:

Queues are not bound to any specific thread of execution and blocks submitted to independent queues may execute concurrently.

I'm not aware of any way to bind a queue to a specific thread (after all, not needing to care about threads is a major point of GCD). The reason why you can use a serial queue without worrying about the actual thread is this promise:

All memory writes performed by a block dispatched to a serial queue are guaranteed to be visible to subsequent blocks dispatched to the same queue.

That is, a memory barrier seems to be used.

When dealing with threading issues, your main concern is usually to prevent two threads from accessing something concurrently. If you're using a serial queue you do not have this problem. It usually doesn't really matter which thread is accessing your resources. For example, we're using serial queues to manage Core Data access without a problem.
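
For illustration, here is a minimal sketch of that pattern (the queue label and the comments are made up); the mutual exclusion comes from the serial queue itself, not from any particular thread:

// All access to the shared resource funnels through one serial queue,
// so no two blocks touch it at the same time, whichever thread GCD picks.
dispatch_queue_t resourceQueue = dispatch_queue_create("com.example.resource", DISPATCH_QUEUE_SERIAL);

dispatch_async(resourceQueue, ^{
    // First operation on the shared resource.
});

dispatch_async(resourceQueue, ^{
    // Runs only after the first block has finished
    // and sees all of its memory writes.
});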

Edit:

It seems you really found a rare case where you need to be working on the same thread. You could implement your own worker thread:

  • Prerequisites:
    • An NSMutableArray (let's call it blockQueue).
    • An NSCondition (let's call it queueCondition).
  • Create a new NSThread.
    • The thread's method has an endless loop in which it locks the condition, waits on it while the queue is empty (and the thread has not been cancelled), dequeues a block and executes it.
  • A method that locks the condition, enqueues the block and signals the condition.

Due to the condition, the thread will simply sleep while there's no work to do.

So, roughly (untested, assuming ARC):

- (void)startWorkerThread
{
    workerThread = [[NSThread alloc]
        initWithTarget:self
        selector:@selector(threadMain)
        object:nil
    ];
    [workerThread start];
}

- (void)threadMain
{
    void (^block)(void);
    NSThread *currentThread;

    currentThread = [NSThread currentThread];

    while (1) {
        [queueCondition lock];
        {
            while ([blockQueue count] == 0 && ![currentThread isCancelled]) {
                [queueCondition wait];
            }

            if ([currentThread isCancelled]) {
                [queueCondition unlock];
                return;
            }

            block = [blockQueue objectAtIndex:0];
            [blockQueue removeObjectAtIndex:0];
        }
        [queueCondition unlock];

        // Execute block outside the condition, since it's also a lock!
        // We want to give other threads the possibility to enqueue
        // a new block while we're executing a block.
        block();
    }
}

- (void)enqueue:(void (^)(void))block
{
    [queueCondition lock];
    {
        // Copy the block! IIRC you'll get strange things or
        // even crashes if you don't.
        [blockQueue addObject:[block copy]];
        [queueCondition signal];
    }
    [queueCondition unlock];
}

- (void)stopThread
{
    [queueCondition lock];
    {
        [workerThread cancel];
        [queueCondition signal];
    }
    [queueCondition unlock];
}
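
A short usage sketch of the above (assuming blockQueue and queueCondition have been created beforehand, e.g. in init):

// Hypothetical usage; the enqueued blocks all run on the single worker thread,
// in the order in which they were enqueued.
[self startWorkerThread];

[self enqueue:^{
    // e.g. open the database handle here
}];

[self enqueue:^{
    // runs later, on the very same thread
}];

// Once the thread-bound resource is no longer needed:
[self stopThread];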


Answer 2:

In GCD: no, that's not possible with the current libdispatch.

Blocks can be executed by libdispatch on whatever thread is available, no matter which queue they were dispatched to.

One exception is the main queue, which always executes its blocks on the main thread.
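
A small sketch to illustrate that exception (the behavior noted in the comments is only indicative):

dispatch_async(dispatch_get_main_queue(), ^{
    // Always executed on the main thread.
    NSLog(@"main queue, isMainThread = %d", [NSThread isMainThread]);
});

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // Executed on whatever worker thread GCD happens to pick.
    NSLog(@"global queue, thread = %@", [NSThread currentThread]);
});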

Please file a feature request with Apple, since it seems justified and sound. But I fear it's not feasible; otherwise it would probably already exist ;)



Answer 3:

You can use NSOperationQueue. You can make it run only one operation at a time by using the method - (void)setMaxConcurrentOperationCount:(NSInteger)count and setting the count to 1.
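
A minimal sketch of that setup (variable names are illustrative); the operations run one at a time, in the order they were added:

NSOperationQueue *queue = [[NSOperationQueue alloc] init];
queue.maxConcurrentOperationCount = 1; // serial: one operation at a time

[queue addOperationWithBlock:^{
    // first unit of work
}];

[queue addOperationWithBlock:^{
    // starts only after the first block has finished
}];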



Answer 4:

You can either use an NSOperationQueue with maxConcurrentOperationCount set to 1 or go the manual way by using the NSThread model (rather than Grand Central Dispatch). With the latter I would recommend implementing a worker method that runs in a thread and pulls work packages (or commands) from a queue or pool that is fed from outside the thread, as sketched below. Just make sure you use locks / mutexes / synchronization.
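
One common way to implement that manual NSThread approach is sketched below (the class and method names are made up, not from the answer): keep a run loop alive on the worker thread and hand it work with performSelector:onThread:withObject:waitUntilDone:, so every work package executes on that one thread.

#import <Foundation/Foundation.h>

@interface Worker : NSObject
@property (strong, nonatomic) NSThread *workerThread;
@end

@implementation Worker

- (instancetype)init
{
    if ((self = [super init])) {
        _workerThread = [[NSThread alloc] initWithTarget:self
                                                selector:@selector(threadMain)
                                                  object:nil];
        [_workerThread start];
    }
    return self;
}

- (void)threadMain
{
    // Add a dummy port so the run loop has at least one input source
    // and keeps running instead of returning immediately.
    [[NSRunLoop currentRunLoop] addPort:[NSMachPort port]
                                forMode:NSDefaultRunLoopMode];
    while (![[NSThread currentThread] isCancelled]) {
        @autoreleasepool {
            [[NSRunLoop currentRunLoop] runMode:NSDefaultRunLoopMode
                                     beforeDate:[NSDate distantFuture]];
        }
    }
}

- (void)handleWorkPackage:(id)package
{
    // This always executes on workerThread.
}

- (void)submitWorkPackage:(id)package
{
    [self performSelector:@selector(handleWorkPackage:)
                 onThread:self.workerThread
               withObject:package
            waitUntilDone:NO];
}

@end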



Answer 5:

Never tried this, but it might do the trick: use a separate atomic property holding a dispatch queue for each operation.

    @property (strong, atomic) dispatch_queue_t downloadQueue;

Queue/Thread 1 for first operation

    downloadQueue = dispatch_queue_create("operation1", NULL);

etc.

Since atomic properties are thread-safe, downloadQueue should not be accessed by other threads. That is supposed to ensure that there is only a single thread per operation and that other threads will not access it.



Answer 6:

Create a serial dispatch queue, and dispatch all the calls to that serial dispatch queue. All the calls will be performed in the background and strictly one at a time, in the order they were submitted (although, as noted in the other answers, GCD does not guarantee that a serial queue always uses the same thread).
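
For example, a minimal sketch (the queue label is made up):

dispatch_queue_t sqlQueue = dispatch_queue_create("com.example.sqlqueue", DISPATCH_QUEUE_SERIAL);

dispatch_async(sqlQueue, ^{
    // statement 1 -- runs off the main thread
});

dispatch_async(sqlQueue, ^{
    // statement 2 -- starts only after statement 1 has finished
});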



Answer 7:

If you want to perform a selector on the main thread, you can use

- (void)performSelectorOnMainThread:(SEL)aSelector withObject:(id)arg waitUntilDone:(BOOL)wait

and if you want to perform it on a background thread

- (void)performSelectorInBackground:(SEL)aSelector withObject:(id)object

and if you want to run it on any other queue, use GCD (Grand Central Dispatch), for example:

double delayInSeconds = 2.0;
dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, (int64_t)(delayInSeconds * NSEC_PER_SEC));
dispatch_after(popTime, dispatch_get_main_queue(), ^(void){
    // code to be executed on the main queue after the delay
});


Answer 8:

Just like this (note that dispatch_get_current_queue() is deprecated):

dispatch_async(dispatch_get_current_queue(), ^{
    // work to run on the current queue
});