
Ultimate Grand Central Dispatch Tutorial in Swift



Learn the principles of multi-threading with the GCD framework in Swift. Queues, tasks, groups, everything you'll ever need, I promise.


GCD concurrency tutorial for beginners

The Grand Central Dispatch framework (GCD, or simply Dispatch) is built on top of the thread pool design pattern. This means that there is a fixed number of threads spawned by the system – based on some factors like the number of CPU cores – and they are always available, waiting for tasks to be executed concurrently. 🚦

Creating threads on the fly is an expensive task, so GCD organizes tasks into specific queues, and the tasks waiting in these queues are later executed on an appropriate, available thread from the pool. This approach leads to good performance and low execution latency. We can say that the Dispatch framework is a very fast and efficient concurrency framework designed for modern multi-core hardware and its needs.

Concurrency, multi-tasking, CPU cores, parallelism and threads

A processor can run tasks written by you programmatically; this is usually called coding, developing or programming. The code executed by a CPU core is a thread, so your app is going to create a process made up of multiple threads. 🤓

In the early days, a processor had one single core and it could only handle one task at a time. Later on, time-slicing was introduced, allowing CPUs to execute threads concurrently using context switching. As processors gained more horsepower and cores, they became capable of real multi-tasking using parallelism. ⏱

Nowadays a CPU is a very powerful device, capable of executing billions of tasks (cycles) per second. Because of this high availability, Intel introduced a technology called hyper-threading. It divides CPU time between (usually two) processes running at the same time, so the number of available threads essentially doubles. 📈

As you can see, concurrent execution can be achieved with various techniques, but you don't need to worry much about that. It's up to the CPU architecture how it handles concurrency, and it's the operating system's task to decide how many threads get spawned for the underlying thread pool. The GCD framework hides all this complexity, but it's always good to understand the basic principles. 👍


Synchronous and asynchronous execution

Each work item can be executed either synchronously or asynchronously.

Have you ever heard about blocking and non-blocking code? This is the same situation here. With synchronous tasks you'll block the execution queue, but with async tasks your call will return instantly and the queue can continue to execute the remaining tasks (or work items, as Apple calls them). 🚧

Synchronous Execution

When a work item is executed synchronously with the sync method, the program waits until the execution finishes before the method call returns.

Your function is most likely synchronous if it has a return value, so func load() -> String is going to block whatever is running it until the resource is completely loaded and returned.
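
A minimal sketch of what such a blocking call might look like; the load function and its simulated work below are placeholders, not the article's original code:

```swift
import Foundation

// Hypothetical synchronous loader: the caller is blocked until the value is ready.
func load() -> String {
    Thread.sleep(forTimeInterval: 1) // simulate slow, blocking work (e.g. disk IO)
    return "contents of the resource"
}

let contents = load() // execution waits here until load() returns
print(contents)
```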

Asynchronous Execution

When a work item is executed asynchronously with the async method, the method call returns immediately.

Completion blocks are a good sign of async methods. For example, if you look at this method func load(completion: (String) -> Void), you can see it has no return type, but the result of the function is passed back to the caller later on, inside a completion block.
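
A hedged sketch of the same idea with a completion handler; again, the function body is just a stand-in for real work:

```swift
import Foundation

// Hypothetical asynchronous loader: the call returns immediately,
// the result arrives later through the completion block.
func load(completion: @escaping (String) -> Void) {
    DispatchQueue.global().async {
        Thread.sleep(forTimeInterval: 1) // simulate slow work
        completion("contents of the resource")
    }
}

load { result in
    print(result) // runs later, when the background work is done
}
```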

This is a typical use case: if you have to wait for something inside the method, like reading the contents of a huge file from disk, you don't want to block your CPU just because of a slow IO operation. There can be other tasks that are not IO-heavy at all (math operations, etc.) and they can run while the system reads the file from the physical hard drive. 💾

With dispatch queues you can execute your code synchronously or asynchronously. With synchronous execution the queue waits for the work; with async execution the code returns immediately without waiting for the task to complete. ⚡️


Dispatch queues

As mentioned before, GCD organizes tasks into queues; these are just like queues at the mall. On every dispatch queue, tasks will be executed in the same order as you add them to the queue – FIFO: the first task in the line will be executed first – but you should note that the order of completion is not guaranteed. Tasks are completed according to their complexity, so if you add a slow task to the queue first and a quick one later, the quick one can finish before the slow one. ⌛️

Serial and concurrent queues

There are two types of dispatch queues. Serial queues can execute one task at a time; these queues can be used to synchronize access to a specific resource. Concurrent queues, on the other hand, can execute one or more tasks in parallel at the same time. A serial queue is just like one line at the mall with one cashier; a concurrent queue is like a single line that splits up to two or more cashiers.

Main, global and custom queues

The main queue is a serial one; every task on the main queue runs on the main thread.

Global queues are system-provided concurrent queues shared through the operating system. There are exactly four of them, organized by priority: high, default and low, plus an IO throttled background queue.

Custom queues can be created by the user. Custom concurrent queues are always mapped onto one of the global queues by specifying a Quality of Service (QoS) property. In most cases, if you want to run tasks in parallel, it is recommended to use one of the global concurrent queues; you should only create custom serial queues.

System queues

  • Serial main queue
  • Concurrent global queues
    • High priority global queue
    • Default priority global queue
    • Low priority global queue
    • Global background queue (IO throttled)

Custom Queues by Quality of Service (QoS)

  • userInteractive (main thread) -> serial main queue
  • userInitiated (async UI-related tasks) -> high priority global queue
  • default -> default priority global queue
  • utility -> low priority global queue
  • background -> global background queue
  • unspecified (lowest) -> low priority global queue

Enough of the theory, let's see how to use the Dispatch framework in action! 🎬


How to use the DispatchQueue class in Swift?

Here is how to get all of the queues from above using the new GCD syntax available from Swift 3. Keep in mind that you should always use a global concurrent queue instead of creating your own one, except if you want to use the concurrent queue for locking with barriers to achieve thread safety – more about that later.
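
The original code listing is not included in this scrape, so the following is a rough reconstruction of that syntax; the queue labels are made-up examples:

```swift
import Foundation

// Serial main queue
let mainQueue = DispatchQueue.main

// Global concurrent queues, selected by quality of service class
let userInteractiveQueue = DispatchQueue.global(qos: .userInteractive)
let userInitiatedQueue = DispatchQueue.global(qos: .userInitiated)
let utilityQueue = DispatchQueue.global(qos: .utility)
let backgroundQueue = DispatchQueue.global(qos: .background)
let defaultQueue = DispatchQueue.global() // default QoS

// Custom serial queue (serial is the default attribute)
let customSerialQueue = DispatchQueue(label: "com.example.serial")

// Custom concurrent queue with an explicit QoS class
let customConcurrentQueue = DispatchQueue(label: "com.example.concurrent",
                                          qos: .userInitiated,
                                          attributes: .concurrent)
```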

How to run tasks on a queue?

If you want to perform a task on a background queue and update the user interface on the main queue after the task has finished, using dispatch queues is easy.
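
A small illustrative sketch, assuming the heavy work is a simple placeholder and the "UI update" is just a print statement:

```swift
import Foundation

// Do the heavy lifting on a background queue...
DispatchQueue.global(qos: .background).async {
    let result = "expensive result" // placeholder for some long running work

    // ...then hop back to the main queue to touch the UI.
    DispatchQueue.main.async {
        print("update the UI with \(result)")
    }
}
```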

Synchronous and asynchronous calls on queues

There is not much difference between the sync and async methods on a queue. Sync is just an async call with a semaphore (explained later) that waits for the return value. A sync call will block; an async call, on the other hand, will return immediately. 🎉

Basically, if you need a return value use sync, but in every other case just go with async. DEADLOCK WARNING: you should never call sync on the main queue, because it will cause a deadlock and a crash. You can use a small helper like the sketch below if you are looking for a safe way to do sync calls on the main queue / thread. 👌

Don't call sync on a serial queue from the serial queue's own thread!
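
The snippet referenced above is not reproduced here; a common approach looks roughly like this, with the helper name syncOnMainSafely made up for illustration:

```swift
import Foundation

// Hypothetical helper: run a block synchronously on the main queue,
// but avoid the classic deadlock when we're already on the main thread.
func syncOnMainSafely(_ block: () -> Void) {
    if Thread.isMainThread {
        block() // already on main, just run it
    } else {
        DispatchQueue.main.sync(execute: block)
    }
}

syncOnMainSafely {
    print("safe to call from any thread")
}
```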

Delayed execution

You can simply delay code execution using the Dispatch framework.
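
For example, something along these lines (the 2 second delay is arbitrary):

```swift
import Foundation

// Run a block on the main queue roughly 2 seconds from now.
DispatchQueue.main.asyncAfter(deadline: .now() + 2) {
    print("about 2 seconds later")
}

dispatchMain() // keep a command line sample alive long enough for the block to fire
```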

Performing concurrent loops

Dispatch queues simply allow you to perform iterations concurrently.
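
A quick sketch using DispatchQueue.concurrentPerform; the iteration count is arbitrary:

```swift
import Foundation

// Execute 10 iterations of the loop body concurrently and wait until all of them finish.
DispatchQueue.concurrentPerform(iterations: 10) { index in
    print("iteration \(index)")
}
print("all iterations finished")
```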

Debugging

Oh, by the way, this is just for debugging purposes, but you can get the name of the current queue by using this little extension. Do not use it in production code!!! ⚠️
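
The extension itself is missing from this scrape; a commonly used variant, for debugging only, looks roughly like this:

```swift
import Foundation

extension DispatchQueue {
    // Debugging only: the label of the queue the caller is currently running on.
    static var currentLabel: String {
        return String(cString: __dispatch_queue_get_label(nil))
    }
}

DispatchQueue.global(qos: .utility).async {
    print(DispatchQueue.currentLabel) // e.g. "com.apple.root.utility-qos"
}
```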


Using DispatchWorkItem in Swift

DispatchWorkItem encapsulates work that can be performed. A work item can be dispatched onto a DispatchQueue and within a DispatchGroup. A DispatchWorkItem can also be set as a DispatchSource event, registration, or cancel handler.

So, just like with operations, by using a work item you can cancel a running task. Also, work items can notify a queue when their task is completed.
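
A small sketch of the work item API; the queue choice and the printed messages are arbitrary:

```swift
import Foundation

let queue = DispatchQueue.global(qos: .utility)

// Wrap the task in a work item so it can be cancelled or observed.
let workItem = DispatchWorkItem {
    print("work item is running")
}

// Get notified (here on the main queue) when the work item has finished.
workItem.notify(queue: .main) {
    print("work item finished")
}

queue.async(execute: workItem)

// A work item that has not started yet can be cancelled:
// workItem.cancel()
```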


Concurrent Tasks with DispatchGroups

So you need to perform multiple network calls in order to construct the data required by a view controller? This is where DispatchGroup can help you. All of your long-running background tasks can be executed concurrently, and when every one of them has finished you'll get notified. Just be careful: you have to use thread-safe data structures, so always modify arrays, for example, on the same thread! 😅

Note that you always have to balance the enter and leave calls on the group, as shown in the sketch below. The dispatch group also allows us to track the completion of different work items, even if they run on different queues.
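
A minimal sketch of both styles of group membership; the tasks are just placeholders:

```swift
import Foundation

let group = DispatchGroup()
let queue = DispatchQueue.global()

// Variant 1: let the queue manage group membership for you.
queue.async(group: group) {
    print("first task")
}

// Variant 2: balance enter() and leave() manually, e.g. around another async API.
group.enter()
queue.async {
    print("second task")
    group.leave()
}

// Called once, on the main queue, when every member of the group has finished.
group.notify(queue: .main) {
    print("all tasks are done")
}
```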

One more thing you can use dispatch groups for: imagine that you're displaying a nicely animated loading indicator while you do some actual work. The work might finish sooner than you'd expect, so the indicator animation could not complete. To solve this situation you can add a small delay task, so the group will wait until both tasks have finished. 😎


Semaphores

A semaphore is simply a variable used to control access to a shared resource in a concurrent system. It is a really powerful object; here are a few important examples in Swift.

How to make an asynchronous task synchronous?

The answer is simple: you can use a semaphore (bonus points for adding a timeout)!
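
A hedged sketch, reusing the hypothetical load(completion:) function from earlier and a 5 second timeout picked arbitrarily:

```swift
import Foundation

// The hypothetical async API from earlier.
func load(completion: @escaping (String) -> Void) {
    DispatchQueue.global().async {
        completion("contents of the resource")
    }
}

// Block the caller until the result arrives, or give up after 5 seconds.
func loadSynchronously() -> String? {
    let semaphore = DispatchSemaphore(value: 0)
    var result: String?
    load { value in
        result = value
        semaphore.signal()
    }
    _ = semaphore.wait(timeout: .now() + 5) // bonus point: the timeout
    return result
}

print(loadSynchronously() ?? "timed out")
```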

Lock / protect access to a resource

If you want to avoid race conditions, you are probably going to use mutual exclusion. This can be achieved using a semaphore object, but if your object needs heavy reading capability you should consider a dispatch barrier based solution instead. 😜
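
A small illustrative example, using a semaphore with an initial value of 1 as a mutex; the Counter type is made up:

```swift
import Foundation

final class Counter {
    private let semaphore = DispatchSemaphore(value: 1) // acts as a mutex
    private var count = 0

    func increment() {
        semaphore.wait()   // lock
        count += 1
        semaphore.signal() // unlock
    }

    var value: Int {
        semaphore.wait()
        defer { semaphore.signal() }
        return count
    }
}

let counter = Counter()
DispatchQueue.concurrentPerform(iterations: 100) { _ in
    counter.increment()
}
print(counter.value) // always 100
```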

Waiting for multiple tasks to complete

Just like with dispatch groups, you can also use a semaphore object to get notified when multiple tasks have finished. You just have to wait for it…
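
A sketch of that pattern; note that the waiting loop blocks the calling thread, so don't run it on the main thread of a real app:

```swift
import Foundation

let taskCount = 3
let semaphore = DispatchSemaphore(value: 0)
let queue = DispatchQueue.global()

for index in 0..<taskCount {
    queue.async {
        print("task \(index) finished")
        semaphore.signal()
    }
}

// Wait once per task; this blocks the current thread until every task has signalled.
for _ in 0..<taskCount {
    semaphore.wait()
}
print("all tasks completed")
```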

Batch execution using a semaphore

You can achieve a thread-pool-like behavior to simulate limited resources using a dispatch semaphore. So for example, if you want to download lots of images from a server, you can run a batch of x at a time. Quite handy. 🖐
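
A rough sketch of the batching idea: the URLs are fake, the "download" is simulated with a sleep, and the submitting thread is intentionally blocked by the semaphore to throttle the batch size.

```swift
import Foundation

let urls = (1...10).map { "https://example.com/image/\($0)" } // fake URLs
let semaphore = DispatchSemaphore(value: 3) // at most 3 "downloads" in flight
let queue = DispatchQueue.global(qos: .utility)
let group = DispatchGroup()

for url in urls {
    semaphore.wait() // blocks the submitting thread when 3 tasks are already running
    queue.async(group: group) {
        print("downloading \(url)")
        Thread.sleep(forTimeInterval: 1) // simulate the download
        semaphore.signal()
    }
}

group.wait()
print("all downloads finished")
```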


DispatchSource object

A dispatch source is a fundamental data type that coordinates the processing of specific low-level system events.

Signals, descriptors, processes, ports, timers and many more. Everything is handled through dispatch sources. I really don't want to go into the details, it's quite low-level stuff. You can monitor files, ports and signals with dispatch sources. Please just read the official Apple docs.

I'd like to show just one example here, using a dispatch source timer.
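
The original listing is missing, so here is a comparable sketch of a dispatch source based timer:

```swift
import Foundation

// A dispatch source based repeating timer, firing every second on a global queue.
let timer = DispatchSource.makeTimerSource(queue: .global())
timer.schedule(deadline: .now() + 1, repeating: 1)
timer.setEventHandler {
    print("timer fired at \(Date())")
}
timer.resume()

dispatchMain() // keep a command line sample alive so the timer can keep firing
```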


Thread safety using dispatch queues

Thread safety is an inevitable topic when it comes to multi-threaded code. In the beginning I mentioned that there is a thread pool under the hood of GCD. Every thread has a run loop object associated with it; you can even run them by hand. If you create a thread manually, a run loop is added to that thread automatically.

⚠️ You should not do this, it's just for demo purposes – always use GCD queues!

Queue != Thread

A GCD queue is not a thread: if you run multiple async operations on a queue, your code can be executed on any available thread that fits the needs.

Thread Safety is about avoiding corrupted states

Think about a mutable array in Swift. It can be modified from any thread. That's not good, because eventually the values inside it are going to get corrupted if the array is not thread-safe. For example, imagine multiple threads trying to insert values into the array. What happens? If they run in parallel, which element is going to be added first? That's why you sometimes need to create thread-safe resources.

Serial queues

You can use a serial queue to enforce mutual exclusivity. All tasks on the queue run serially (in a FIFO order); only one process runs at a time and tasks have to wait for each other. One big downside of this solution is speed. 🐌
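
A small sketch of the serial queue approach; the SafeStore type and its label are made up for illustration:

```swift
import Foundation

final class SafeStore {
    private let queue = DispatchQueue(label: "com.example.safe-store") // serial queue
    private var values: [Int] = []

    func append(_ value: Int) {
        queue.async { self.values.append(value) } // writes are serialized, FIFO
    }

    func snapshot() -> [Int] {
        return queue.sync { values } // reads wait for all previously queued writes
    }
}

let store = SafeStore()
DispatchQueue.concurrentPerform(iterations: 100) { index in
    store.append(index)
}
print(store.snapshot().count) // 100
```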

Concurrent queues using barriers

You can send a barrier task to a queue if you provide an extra flag to the async method. If a task like this arrives at the queue, it ensures that nothing else will be executed until the barrier task has finished. To sum this up: barrier tasks are sync points for concurrent queues. Use async barriers for writes and sync blocks for reads. 😎

This method will result in extremely fast reads in a thread-safe environment. You can also use serial queues, semaphores or locks, it all depends on your current situation, but it's good to know all the available options, isn't it?
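
A hedged sketch of a barrier-protected collection; the ThreadSafeArray type is made up, not taken from the original article:

```swift
import Foundation

final class ThreadSafeArray<Element> {
    private let queue = DispatchQueue(label: "com.example.barrier-array",
                                      attributes: .concurrent)
    private var storage: [Element] = []

    // Writes use a barrier: nothing else runs on the queue while they execute.
    func append(_ element: Element) {
        queue.async(flags: .barrier) {
            self.storage.append(element)
        }
    }

    // Reads can run concurrently with each other, but never overlap a write.
    var elements: [Element] {
        return queue.sync { storage }
    }
}

let array = ThreadSafeArray<Int>()
DispatchQueue.concurrentPerform(iterations: 100) { index in
    array.append(index)
}
print(array.elements.count) // 100
```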


Some anti-patterns

You have to be very careful with deadlocks, race conditions and the readers-writers problem. Usually, calling the sync method on a serial queue will cause you most of the trouble. Another issue is thread safety, but we've already covered that part. 😉

The Dispatch framework (aka. GCD) is amazing, it has so much potential, and it really takes some time to master it. The real question is: what direction is Apple going to take in order to bring concurrent programming to a whole new level? Async/await, or maybe something entirely new – let's hope we'll see something in Swift 6.

