How can I do synchronous RPC calls?
I'm building a program that has a class used locally, but I want the same class to be usable in the same way over the network. This means I need to be able to make synchronous calls to any of its public methods. The class reads and writes files, so I think XML-RPC is too much overhead. I created a basic RPC client/server using the examples from Twisted, but I'm having trouble with the client.
c = ClientCreator(reactor, Greeter)
c.connectTCP(self.host, self.port).addCallback(request)
reactor.run()
This works for a single call: when the data is received I call reactor.stop(), but if I make any more calls the reactor won't restart. Is there something else I should be using for this? Maybe a different Twisted module, or another framework?
(I'm not including the details of how the protocol works, because the main point is that I only get one call out of this.)
Addendum & Clarification:
I shared a google doc with notes on what I'm doing. http://docs.google.com/Doc?id=ddv9rsfd_37ftshgpgz
I have a version written that uses FUSE and can combine multiple local folders into the FUSE mount point. The file access is already handled within a class, so I want to have servers that give me network access to the same class. After continuing to search, I suspect Pyro (http://pyro.sourceforge.net/) might be what I'm really looking for (based only on reading their home page just now), but I'm open to any suggestions.
I could achieve similar results by using an NFS mount and combining it with my local folder, but I want all of the peers to have access to the same combined filesystem, so that would require every computer to be an NFS server with a number of NFS mounts equal to the number of computers in the network.
Conclusion: I have decided to use RPyC, as it gave me exactly what I was looking for: a server that keeps an instance of a class that I can manipulate as if it were local. If anyone is interested, I put my project up on Launchpad (http://launchpad.net/dstorage).
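For reference, a minimal sketch of that pattern with RPyC (the service and method names here are illustrative, not taken from dstorage):
import rpyc
from rpyc.utils.server import ThreadedServer

class StorageService(rpyc.Service):
    # The server keeps the file-handling logic; clients call it as if it were local.
    def exposed_read(self, path):
        with open(path, "rb") as f:
            return f.read()

    def exposed_write(self, path, data):
        with open(path, "wb") as f:
            f.write(data)

if __name__ == "__main__":
    ThreadedServer(StorageService, port=18861).start()
On the client side, conn = rpyc.connect(host, 18861) gives you conn.root, and conn.root.read(path) / conn.root.write(path, data) block until the server returns, which is the synchronous behaviour described above.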
Asked by: Anna362 | Posted: 28-01-2022
Answer 1
If you're even considering Pyro, check out RPyC first, and re-consider XML-RPC.
Regarding Twisted: try leaving the reactor up instead of stopping it, and just call ClientCreator(...).connectTCP(...) each time. If you call self.transport.loseConnection() in your Protocol, you won't be leaving open connections.
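A minimal sketch of that pattern (this Greeter is a stand-in for the protocol from the Twisted examples, and the framing is assumed to be one request and one reply per connection):
from twisted.internet import reactor, defer
from twisted.internet.protocol import ClientCreator, Protocol

class Greeter(Protocol):
    # Stand-in for the Greeter protocol from the Twisted examples.
    def connectionMade(self):
        self.response = defer.Deferred()
        self.transport.write(b"hello\n")   # send the request

    def dataReceived(self, data):
        self.transport.loseConnection()    # close the connection, not the reactor
        self.response.callback(data)       # hand the reply to whoever is waiting

def make_call(host, port):
    # A fresh, short-lived connection per call; the reactor keeps running throughout.
    d = ClientCreator(reactor, Greeter).connectTCP(host, port)
    return d.addCallback(lambda proto: proto.response)

# make_call("localhost", 8007).addCallback(print)
# reactor.run()   # started once, never stopped until the program exits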
Answer 2
For a synchronous client, Twisted probably isn't the right option. Instead, you might want to use the socket module directly.
import socket
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((self.host, self.port))
s.send(output)         # output: the serialized request bytes
data = s.recv(size)    # size: maximum number of bytes to read in one call
s.close()
The recv() call might need to be repeated until you get an empty string, but this shows the basics.
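A sketch of that loop, assuming the server closes the connection when the response is complete (at which point recv() returns an empty bytes object):
import socket

def call(host, port, request, bufsize=4096):
    # One blocking, synchronous round trip per call.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.connect((host, port))
        s.sendall(request)        # sendall() retries until everything is written
        chunks = []
        while True:
            chunk = s.recv(bufsize)
            if not chunk:         # empty bytes: the peer closed the connection
                break
            chunks.append(chunk)
        return b"".join(chunks)
    finally:
        s.close()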
Alternatively, you can rearrange your entire program to support asynchronous calls...
Answered by: Elian652 | Posted: 01-03-2022
Answer 3
Why do you feel that it needs to be synchronous?
If you want to ensure that only one of these is happening at a time, invoke all of the calls through a DeferredSemaphore so you can rate limit the actual invocations (to any arbitrary value).
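A minimal sketch of that approach, assuming make_call() is any function of yours that returns a Deferred (for example, the connect-per-call helper sketched above):
from twisted.internet import defer

sem = defer.DeferredSemaphore(tokens=1)   # tokens=1: at most one call in flight

def limited_call(*args, **kwargs):
    # run() waits for a token, invokes make_call, and releases the token when the
    # returned Deferred fires, so queued calls proceed one after another.
    return sem.run(make_call, *args, **kwargs)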
If you want to be able to run multiple streams of these at different times, but don't care about concurrency limits, then you should at least separate reactor startup and teardown from the invocations (the reactor should run throughout the entire lifetime of the process).
If you just can't figure out how to express your application's logic in a reactor pattern, you can use deferToThread and write a chunk of purely synchronous code -- although I would guess this would not be necessary.
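A sketch of the deferToThread route: the blocking function below (sync_read(), a hypothetical stand-in for the asker's file-handling method) runs in the reactor's thread pool, so the reactor itself is never blocked:
from twisted.internet import reactor
from twisted.internet.threads import deferToThread

def sync_read(path):
    # Ordinary blocking, synchronous code.
    with open(path, "rb") as f:
        return f.read()

def read_in_thread(path):
    # Returns a Deferred that fires with sync_read's result (or failure).
    return deferToThread(sync_read, path)

# read_in_thread("/etc/hostname").addCallback(lambda data: print(len(data)))
# reactor.run()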
Answered by: Patrick233 | Posted: 01-03-2022
Answer 4
If you are using Twisted you should probably know that:
- You will not be making synchronous calls to any network service
- The reactor can only ever be run once, so do not stop it (by calling reactor.stop()) until your application is ready to exit.
I hope this answers your question. I personally believe that Twisted is exactly the correct solution for your use case, but that you need to work around your synchronicity issue.
Addendum & Clarification:
Part of what I don't understand is that when I call reactor.run() it seems to go into a loop that just watches for network activity. How do I continue running the rest of my program while it uses the network? If I can get past that, then I can probably work through the synchronicity issue.
That is exactly what reactor.run() does. It runs a main loop, the event reactor, which not only waits for network events but also runs anything else you have scheduled to happen. With Twisted you will need to structure the rest of your application to deal with its asynchronous nature. Perhaps if we knew what kind of application it is, we could advise.
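For example (a minimal sketch), non-network work can be scheduled on the same loop with reactor.callLater() or task.LoopingCall, so the rest of the program runs inside the reactor rather than beside it:
from twisted.internet import reactor, task

def housekeeping():
    # Ordinary application work; keep each call short so the loop stays responsive.
    print("scanning local folders...")

task.LoopingCall(housekeeping).start(5.0)        # runs every 5 seconds
reactor.callLater(10, print, "10 seconds in")    # one-off, 10 seconds after start
reactor.run()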
Answered by: Brad487 | Posted: 01-03-2022