Problem Description
Suppose I have 2 servers.
The first is a service that provides some computations, which can take a long time (minutes to hours).
The second server will use this service to have some data computed.
I'm trying to design a REST API for the first server, and so far so good. But I'd like to hear some opinions on how to model notifications when the long-running task is finished.
I considered 2 approaches so far:
- Polling - the second server will ask every now and then about the result.
- Callback - the second server will set up a URI for the first one to call after it is done. But this smells a bit in a REST API.
What do you think?
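To make the two options concrete, here is a sketch of what the job-creation request might look like under each approach. The endpoint path and field names are assumptions for illustration, not part of any real API:

```python
import json

# Hypothetical request shapes for the two approaches; "/jobs", "input",
# and "callback" are illustrative names, not a prescribed schema.

def make_polling_job_request(input_data):
    # Polling: just create the job; the client later GETs a status URL.
    return {"method": "POST", "path": "/jobs",
            "body": {"input": input_data}}

def make_callback_job_request(input_data, callback_url):
    # Callback: also register a URI for the first server to call when done.
    return {"method": "POST", "path": "/jobs",
            "body": {"input": input_data, "callback": callback_url}}

req = make_callback_job_request({"n": 42}, "https://second.example/results")
print(json.dumps(req["body"], sort_keys=True))
```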
Recommended Answer
In addition to what I've already answered in this similar question, I'd suggest using the Atom Publishing Protocol for the notification (you could publish to your second server).
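As a minimal sketch of what such a notification could look like, here is an Atom entry announcing job completion, built with the standard library. The entry fields and URLs are assumptions; the real schema would depend on your AtomPub collection:

```python
# Build an Atom entry announcing job completion, as it might be POSTed
# to an AtomPub collection on the second server. Field values and URLs
# are illustrative assumptions.
import xml.etree.ElementTree as ET

ATOM_NS = "http://www.w3.org/2005/Atom"

def build_completion_entry(job_id, status_url):
    ET.register_namespace("", ATOM_NS)
    entry = ET.Element("{%s}entry" % ATOM_NS)
    ET.SubElement(entry, "{%s}title" % ATOM_NS).text = "Job %s finished" % job_id
    ET.SubElement(entry, "{%s}id" % ATOM_NS).text = "urn:job:%s" % job_id
    link = ET.SubElement(entry, "{%s}link" % ATOM_NS)
    link.set("rel", "alternate")
    link.set("href", status_url)
    return ET.tostring(entry, encoding="unicode")

xml_body = build_completion_entry("1234", "https://first.example/jobs/1234")
print(xml_body)
```

This body would then be POSTed with `Content-Type: application/atom+xml` to the collection URI (the HTTP call itself is omitted here).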
Other Recommended Answer
For your situation I would choose polling. When the second server makes the initial request to create the job on the first server, it should get a response that has the url of the eventual status page. The second server then polls that url every 5-15 minutes to check the status of the job. If the first server makes that url an RSS or Atom feed, then users could also point their RSS readers at the same url and find out themselves if the job is done. It's a real win when both people and machines can get information out of a single source.
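A minimal sketch of that polling loop, assuming the status page returns JSON with a `state` field (the URL and JSON shape are assumptions; the fetch function is injected so the loop can be exercised without a live server):

```python
# Poll the status URL until the job reports completion. interval_seconds
# would be 5-15 minutes in practice; fetch_status stands in for an HTTP GET.
import time

def wait_for_job(status_url, fetch_status, interval_seconds=300, max_polls=100):
    """Poll the status page until the job reports completion."""
    for _ in range(max_polls):
        status = fetch_status(status_url)     # e.g. GET returning parsed JSON
        if status.get("state") == "done":
            return status
        time.sleep(interval_seconds)
    raise TimeoutError("job did not finish within the polling budget")

# Demo with a fake server that finishes on the third poll:
responses = iter([{"state": "running"}, {"state": "running"},
                  {"state": "done", "result": 42}])
result = wait_for_job("https://first.example/jobs/1234/status",
                      lambda url: next(responses), interval_seconds=0)
print(result["result"])  # prints 42
```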
Other Recommended Answer
If you use Python, you can take advantage of RabbitMQ and Celery to do the job. Celery lets you create an item in a queue and then continue running whatever you're running it through (e.g. Django), consuming the output of the queue processor as it becomes available. No need for polling OR callbacks.
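The pattern Celery and RabbitMQ implement can be sketched with just the standard library: a worker thread consumes jobs from a queue and posts results back as they complete. In a real deployment, Celery's task decorator and a RabbitMQ broker replace this hand-rolled queue; all names here are illustrative:

```python
# Queue-worker pattern: enqueue work without blocking, consume results
# as they become available. A stdlib stand-in for Celery + RabbitMQ.
import queue
import threading

jobs = queue.Queue()
results = queue.Queue()

def worker():
    while True:
        job = jobs.get()
        if job is None:                       # sentinel: shut the worker down
            jobs.task_done()
            break
        results.put(("done", job * job))      # stand-in for the long computation
        jobs.task_done()

t = threading.Thread(target=worker, daemon=True)
t.start()

jobs.put(7)                                   # enqueue work without blocking
state, value = results.get()                  # block only when we want the result
jobs.put(None)
t.join()
print(state, value)  # prints: done 49
```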