GIL global interpreter lock, deadlock, recursive lock, semaphore

Contents

  • GIL global interpreter lock
  • Note:
  • The role of multithreading:
  • Deadlock phenomenon and recursive lock
  • Note
  • Semaphore
  • Thread queue
    • FIFO queue:
    • LIFO queue:
    • Priority queue

GIL global interpreter lock

GIL (global interpreter lock): the discussion below is based on the CPython interpreter

  1. GIL is essentially a mutex lock

  2. The purpose of the GIL is to prevent multiple threads in the same process from executing at the same time (in parallel)

    Multiple threads in a single process cannot achieve parallelism, but can achieve concurrency

  3. This lock exists mainly because CPython's memory management is not thread-safe

    1. Memory management:

      Garbage collection mechanism

The GIL exists to keep the interpreter's own memory management thread-safe

Note:

When multiple threads are running, as soon as the current thread hits an IO operation it releases the GIL immediately and hands it over to the next waiting thread.
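
A minimal sketch of this behaviour (added here for illustration; the function name io_task is not from the original post): two threads that only sleep finish in roughly one second in total, because each thread releases the GIL while it is blocked.

from threading import Thread
import time

def io_task():
    # time.sleep() blocks on "IO", so the thread releases the GIL here
    time.sleep(1)

if __name__ == '__main__':
    start_time = time.time()
    thread_list = []
    for line in range(2):
        t = Thread(target=io_task)
        thread_list.append(t)
        t.start()
    for t in thread_list:
        t.join()
    print(f'elapsed: {time.time() - start_time}')  # roughly 1s, not 2s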

The role of multithreading:

Look at the problem from two angles: compute-intensive tasks and IO-intensive tasks, each on a single core and on multiple cores.

'''
Four tasks, compute-intensive, each task takes 10s:
    Single core:
        Open processes: consumes too many resources
            4 processes: 40s
        Open threads: consumes far fewer resources than processes
            4 threads: 40s
    Multi-core:
        Open processes: parallel execution, higher efficiency
            4 processes: 10s
        Open threads: concurrent execution, lower efficiency
            4 threads: 40s

Four tasks, IO-intensive, each task takes 10s:
    Single core:
        Open processes: consumes too many resources
            4 processes: 40s
        Open threads: consumes far fewer resources than processes
            4 threads: 40s
    Multi-core:
        Open processes: parallel execution, less efficient than multithreading,
                        because the CPU switches away as soon as IO is encountered
            4 processes: 40s + the extra time spent starting the processes
        Open threads: concurrent execution, more efficient than multiprocessing
            4 threads: 40s
'''
from threading import Thread
from multiprocessing import Process
import os
import time

# Compute-intensive
def work1():
    number = 0
    for line in range(100000000):
        number += 1

# IO-intensive
def work2():
    time.sleep(1)

if __name__ == '__main__':
    # Test compute-intensive
    # print(os.cpu_count())  # 6
    # start_time = time.time()
    # list1 = []
    # for line in range(6):
    #     p = Process(target=work1)   # Program execution time 5.300818920135498
    #     # p = Thread(target=work1)  # Program execution time 24.000795602798462
    #     list1.append(p)
    #     p.start()

    # Test IO-intensive
    print(os.cpu_count())  # 6
    start_time = time.time()
    list1 = []
    for line in range(40):
        # p = Process(target=work2)  # Program execution time 4.445072174072266
        p = Thread(target=work2)     # Program execution time 1.009237289428711
        list1.append(p)
        p.start()

    for p in list1:
        p.join()

    end_time = time.time()
    print(f'Program execution time {end_time - start_time}')

# Compute-intensive: use multiprocessing
# IO-intensive: use multithreading
# To run many compute-intensive and IO-intensive programs efficiently: use multiprocessing + multithreading

Deadlock phenomenon and recursive lock

from threading import Lock, Thread, current_thread, RLock
import time

mutex_a = Lock()
mutex_b = Lock()
# print(id(mutex_a))
# print(id(mutex_b))

'''
Recursive lock (RLock): used to solve the deadlock problem.
It is like a master key that several people can use, but the first user
puts a reference count on the lock; only when the count drops back to 0
is the lock truly released so that the next person can use it.
'''
mutex_a = mutex_b = RLock()

class MyThread(Thread):
    # The task the thread performs
    def run(self):
        self.func1()
        self.func2()

    def func1(self):
        mutex_a.acquire()
        # print(f'User {current_thread().name} grabbed lock a')
        print(f'User {self.name} grabbed lock a')
        mutex_b.acquire()
        print(f'User {self.name} grabbed lock b')
        mutex_b.release()
        print(f'User {self.name} released lock b')
        mutex_a.release()
        print(f'User {self.name} released lock a')

    def func2(self):
        mutex_b.acquire()
        print(f'User {self.name} grabbed lock b')
        # IO operation
        time.sleep(1)
        mutex_a.acquire()
        print(f'User {self.name} grabbed lock a')
        mutex_a.release()
        print(f'User {self.name} released lock a')
        mutex_b.release()
        print(f'User {self.name} released lock b')

for line in range(10):
    t = MyThread()
    t.start()

Note

Locks must not be used casually: acquiring several locks in a nested or inconsistent order, as in the example above, easily produces a deadlock.

Semaphore

Mutex lock: like a household toilet, only one person can use it at a time.

Semaphore: like the stalls in a public toilet, several people can use it at the same time.

from threading import Semaphore, Lock, Thread, current_thread
import time

sm = Semaphore(5)   # at most 5 threads can hold the semaphore at the same time
mutex = Lock()

def task():
    sm.acquire()
    print(f'{current_thread().name} is executing the task')
    time.sleep(1)
    sm.release()

for line in range(20):
    t = Thread(target=task)
    t.start()

Thread queue

FIFO queue:

First in, first out

LIFO queue:

Last in, first out

Priority Queue

Items are ordered by comparing their values: the smaller the value, the higher the priority.

import queue

# Ordinary thread queue: first in, first out
# q = queue.Queue()
# q.put(1)
# q.put(2)
# q.put(3)
# print(q.get())  # 1

# LIFO queue: last in, first out
# q = queue.LifoQueue()
# q.put(1)
# q.put(2)
# q.put(3)
# print(q.get())  # 3

# Priority queue
q = queue.PriorityQueue()
# If the item is a tuple, the comparison starts from the first element of the tuple
q.put(('a优', '先', 'doll head', 4))
q.put(('a先', '优', 'doll head', 3))
q.put(('a级', '级', 'doll head', 2))
'''
1. First compare the first elements by character code (all three start with 'a' == 97)
2. Then compare the Chinese characters that follow, by their code points
3. Then move on to the next element: numbers -> digit strings -> Chinese
4. And so on
'''
print(q.get())
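
For a simpler illustration of the ordering rule (an added sketch, not part of the original snippet), plain (priority, data) tuples make it obvious that the entry with the smallest number comes out first:

import queue

q = queue.PriorityQueue()
q.put((3, 'low priority'))
q.put((1, 'high priority'))
q.put((2, 'medium priority'))

# Entries come out in ascending order of the priority number
while not q.empty():
    print(q.get())
# (1, 'high priority')
# (2, 'medium priority')
# (3, 'low priority')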
