Python Threads - Critical Section
What is the "critical section" of a thread (in Python)?
A thread enters the critical section by calling the acquire() method, which can be either blocking or non-blocking. A thread exits the critical section by calling the release() method.
Also, what is the purpose of a lock?
Asked by: Aldus589 | Posted: 05-10-2021
Other people have given very nice definitions. Here's the classic example:
import threading

account_balance = 0  # The "resource" that zenazn mentions.
account_balance_lock = threading.Lock()

def change_account_balance(delta):
    global account_balance
    with account_balance_lock:
        # Critical section is within this block.
        account_balance += delta
Let's say that the += operator consists of three subcomponents:
- Read the current value
- Add the RHS to that value
- Write the accumulated value back to the LHS (technically bind it in Python terms)
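One way to see these subcomponents for yourself (a sketch; the exact bytecode names vary across CPython versions) is to disassemble the function with the standard-library dis module:

```python
import dis

account_balance = 0

def change_account_balance(delta):
    global account_balance
    # One line of Python, but several bytecode steps:
    # load the global, add delta, store the result back.
    account_balance += delta

dis.dis(change_account_balance)
```

A thread can be preempted between any two of those bytecode steps, which is exactly where the hazard below comes from.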
If you don't have the with account_balance_lock statement and you execute two change_account_balance calls in parallel, you can end up interleaving the three subcomponent operations in a hazardous manner. Let's say you simultaneously call change_account_balance(100) (AKA pos) and change_account_balance(-100) (AKA neg). This could happen:
pos = threading.Thread(target=change_account_balance, args=[100])
neg = threading.Thread(target=change_account_balance, args=[-100])
pos.start()
neg.start()
- pos: read current value -> 0
- neg: read current value -> 0
- pos: add 100 to the read value -> 100
- neg: add -100 to the read value -> -100
- pos: write current value -> account_balance = 100
- neg: write current value -> account_balance = -100
Because you didn't force the operations to happen in discrete chunks, you can end up with any of three possible outcomes (-100, 0, or 100).
The with [lock] statement is a single, indivisible operation that says, "Let me be the only thread executing this block of code. If something else is executing, it's cool -- I'll wait." This ensures that the updates to account_balance are "thread-safe" (parallelism-safe).
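To make that guarantee concrete, here's a runnable sketch (the repeat count is mine, added so the threads genuinely overlap) where two threads hammer the balance through the lock and the result still comes out right:

```python
import threading

account_balance = 0
account_balance_lock = threading.Lock()

def change_account_balance(delta, repeat=100_000):
    global account_balance
    for _ in range(repeat):
        with account_balance_lock:  # serializes read -> add -> write
            account_balance += delta

pos = threading.Thread(target=change_account_balance, args=[100])
neg = threading.Thread(target=change_account_balance, args=[-100])
pos.start()
neg.start()
pos.join()
neg.join()
print(account_balance)  # 0 -- every +100 is matched by a -100
```

Delete the with statement and rerun it a few times, and the final balance will drift away from zero on most interpreters.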
Note: There is a caveat to this scheme: you have to remember to acquire the lock (via with) every time you want to manipulate account_balance for the code to remain thread-safe. There are ways to make this less fragile, but that's the answer to a whole other question.
Edit: In retrospect, it's probably important to mention that the with statement implicitly calls a blocking acquire on the lock -- this is the "I'll wait" part of the above thread dialog. In contrast, a non-blocking acquire says, "If I can't acquire the lock right away, let me know," and then relies on you to check whether you got the lock or not.
import logging  # This module is thread safe.
import threading

LOCK = threading.Lock()

def run():
    if LOCK.acquire(False):  # Non-blocking -- returns whether we got it
        logging.info('Got the lock!')
        LOCK.release()
    else:
        logging.info("Couldn't get the lock. Maybe next time")

logging.basicConfig(level=logging.INFO)

threads = [threading.Thread(target=run) for i in range(100)]
for thread in threads:
    thread.start()
I also want to add that the lock's primary purpose is to guarantee the atomicity of acquisition (the indivisibility of the acquire across threads), which a simple boolean flag will not guarantee. The semantics of atomic operations are probably also the content of another question.
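To illustrate why a plain boolean flag fails, here is a contrived sketch (the threading.Barrier is mine, used only to force both threads into the gap between the check and the set, which in real code a preemption would do unpredictably):

```python
import threading

busy = False                    # naive "lock" flag
barrier = threading.Barrier(2)  # makes the race deterministic for the demo
holders = []

def flag_acquire(name):
    global busy
    if not busy:          # step 1: check the flag
        barrier.wait()    # both threads pass the check before either sets it
        busy = True       # step 2: set the flag -- too late
        holders.append(name)

t1 = threading.Thread(target=flag_acquire, args=['t1'])
t2 = threading.Thread(target=flag_acquire, args=['t2'])
t1.start()
t2.start()
t1.join()
t2.join()
print(len(holders))  # 2 -- both threads think they hold the "lock"
```

A real Lock makes the check-and-set a single atomic step, so only one thread can ever win it.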
A critical section of code is one that can only be executed by one thread at a time. Take a chat server for instance. If you have a thread for each connection (i.e., each end user), one "critical section" is the spooling code (sending an incoming message to all the clients). If more than one thread tries to spool a message at once, you'll get BfrIToS mANtwD PIoEmesCEsaSges intertwined, which is obviously no good at all.
A lock is something that can be used to synchronize access to a critical section (or resources in general). In our chat server example, the lock is like a locked room with a typewriter in it. If one thread is in there (to type a message out), no other thread can get into the room. Once the first thread is done, he unlocks the room and leaves. Then another thread can go in the room (locking it). "Acquiring" the lock just means "I get the room."
Answered by: Rafael876 | Posted: 06-11-2021
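The typewriter room maps directly onto a Lock. A hypothetical spooling sketch (the client buffers and message names are invented for illustration, not from the answer above):

```python
import threading

clients = [[], []]             # two connected clients' message buffers
spool_lock = threading.Lock()  # the "locked room with a typewriter"

def spool(message):
    with spool_lock:  # only one thread broadcasts at a time
        for inbox in clients:
            inbox.append(message)

threads = [threading.Thread(target=spool, args=[f"msg {i}"]) for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(clients[0])  # all five messages, each one delivered intact
```

Each broadcast runs to completion before the next begins, so every client sees whole messages in the same order.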
A "critical section" is a chunk of code in which, for correctness, it is necessary to ensure that only one thread of control can be in that section at a time. In general, you need a critical section to contain references that write values into memory that can be shared among more than one concurrent process.
Answered by: Maya634 | Posted: 06-11-2021